- cross-posted to:
- technology@lemmy.ml
That’s troublesome right there; it should be an outside commission that gets to see the footage and decide which officers are flagged.
> Truleo’s software allows supervisors to select from a set of specific behaviors to flag,
We’re probably talking hundreds of thousands of hours of new recordings each week. There’s zero chance a human is going to review all of that.
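Back-of-envelope, with made-up but plausible numbers (none of these figures are from the article):

```python
# Rough estimate of weekly body-cam footage volume. All inputs are assumptions.
officers_with_cameras = 20_000        # e.g. a few large departments combined
recorded_hours_per_shift = 4          # only part of a shift gets recorded
shifts_per_week = 5

hours_per_week = officers_with_cameras * recorded_hours_per_shift * shifts_per_week
print(f"{hours_per_week:,} hours of new footage per week")   # -> 400,000 hours
```

Even with conservative inputs you land in the hundreds of thousands of hours, which no review team can watch end to end.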
The third party can still use the same kinds of automation tools.
Yes, that’s what I meant. The third party would get to see all of the info, and they would decide which flags to look for. IMO the third party, the police union, and a civilian commission should then vote on whether the officer stays or not. The unions have way too much power.
Why select people?
I would be curious what the article means by AI. For example, this might just be some transcription and sentiment analysis. I didn’t see anything too complicated in their description of what the software does.
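For what it’s worth, here’s a minimal sketch of what “transcription plus sentiment analysis” could look like with off-the-shelf tools. Whisper, the Hugging Face default sentiment model, the file name, and the threshold are my assumptions, not anything the article or Truleo names:

```python
# Sketch: transcribe a clip, then flag strongly negative sentences.
import whisper                      # openai-whisper for speech-to-text
from transformers import pipeline   # Hugging Face sentiment classifier

# Transcribe the body-cam audio to text.
stt = whisper.load_model("base")
transcript = stt.transcribe("bodycam_clip.wav")["text"]

# Score the transcript sentence by sentence and surface the negative ones.
classifier = pipeline("sentiment-analysis")
for sentence in transcript.split("."):
    if not sentence.strip():
        continue
    result = classifier(sentence)[0]   # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    if result["label"] == "NEGATIVE" and result["score"] > 0.9:
        print(f"Flag for review: {sentence.strip()!r} ({result['score']:.2f})")
```

Nothing exotic; the hard part is doing it at scale and deciding what counts as a flaggable behavior.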
Video surveillance has also had “violent behavior” models for a while. I’m guessing that in 99.9% of the videos, nothing worth noticing happens. If the software can flag the remaining 0.1% for human review, that’s already a huge boost.
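To make the triage point concrete, a toy example (the clip names and scores are invented; in practice they would come from whatever behavior model is in use):

```python
# Toy triage: score every clip, send only the small fraction above a
# threshold to a human reviewer.
clips = {
    "clip_0001.mp4": 0.02,
    "clip_0002.mp4": 0.01,
    "clip_0003.mp4": 0.97,   # the rare clip that actually needs eyes on it
    "clip_0004.mp4": 0.05,
}

THRESHOLD = 0.9
for_review = [name for name, score in clips.items() if score >= THRESHOLD]

print(f"{len(for_review)} of {len(clips)} clips queued for human review")
# -> 1 of 4 clips queued for human review
```

The model doesn’t have to be perfect for this to help; it just has to cut the haystack down to something a person can actually look at.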