Citizen app fingers wrong citizen


Here’s what happens when you let computers make the “wanted” posters…


A report in The Guardian today details a law-enforcement debacle in which Citizen, a privately developed app, publicly accused a homeless man of allegedly starting a fire in the forests around Los Angeles.


According to reporting by Kari Paul, the app pushed a notification offering a $30,000 reward to more than 860,000 users, naming a suspect who had been briefly detained by law enforcement and then released for lack of evidence tying him to the crime. No one told the app!


The errant notice was taken down a day after it was posted, and a different man has since been arrested in connection with the crime.


Citizen assembles these announcements from police scanner traffic and other data sources, showing how far we’ve come from the old Wild West, when wanted posters were hand-printed, hand-drawn and, presumably, reviewed by actual human beings before going up.


In the wake of new crime-tech systems like Citizen, advocates for the wrongly accused worry that automated crime-identification tools can become scarlet letters pinned to the wrong chests. That’s to say nothing of the controversy around facial recognition technology, which even some police departments have declined to use.


“A false accusation is almost like a conviction now, because of the way people are so quickly publicly shamed,” said Sarah Esther Lageson, an assistant professor at Rutgers School of Criminal Justice, as quoted in Paul’s report. “With their image and name online, even if it is the wrong person – that notoriety is for ever.”


Criticism and concern are not limited to ACLU types, either: in related comments, a sheriff involved in the matter called Citizen’s mistake “potentially disastrous.”


So while computers are good at, say, doing math, they probably shouldn’t be assessing human culpability without a substantial human-in-the-loop (HITL) component.
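
To make that concrete, here is a minimal, purely hypothetical sketch (in Python, with invented names; this is not how Citizen actually works) of what a HITL gate in front of an automated alert pipeline could look like: no accusation goes out to subscribers unless a human reviewer explicitly signs off.

```python
# Hypothetical sketch of a human-in-the-loop (HITL) gate for auto-generated
# crime alerts. All names and structure are illustrative assumptions, not a
# description of any real system.

from dataclasses import dataclass


@dataclass
class DraftAlert:
    suspect_name: str   # name surfaced by the automated system
    claim: str          # e.g. "suspected of starting a brush fire"
    source: str         # e.g. "police scanner (unverified)"
    reward_usd: int


def human_review(alert: DraftAlert) -> bool:
    """Ask a trained reviewer to approve or reject the draft alert.

    In a real system this would route to a review queue; here it simply
    prompts on the console.
    """
    print(f"DRAFT ALERT: {alert.suspect_name} -- {alert.claim}")
    print(f"Source: {alert.source} | Reward: ${alert.reward_usd:,}")
    return input("Approve for broadcast? [y/N] ").strip().lower() == "y"


def publish(alert: DraftAlert) -> None:
    # Placeholder for the actual push-notification fan-out.
    print(f"Broadcasting alert about {alert.suspect_name} to subscribers.")


if __name__ == "__main__":
    draft = DraftAlert(
        suspect_name="John Doe",
        claim="suspected of starting a brush fire",
        source="police scanner (unverified)",
        reward_usd=30_000,
    )
    # The alert is broadcast only if a human signs off; otherwise it is dropped.
    if human_review(draft):
        publish(draft)
    else:
        print("Alert rejected; nothing was broadcast.")
```

The point is the ordering: the automated system can draft all it likes, but a person decides whether a name actually gets broadcast.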

