Scan the Interwebs today and you’ll see the latest news on facial recognition technologies loudly trumpeted, with the tools maligned as unfair uberveillance.
The scoop today is that Amazon has halted police use of its Rekognition facial recognition technology for one year, as protests over police brutality continue to simmer in the wake of the murders of George Floyd and others.
What you don’t see in many of these headlines or articles is how this whole thing got started.
In 2016, Amazon rolled out the Rekognition platform, which uses algorithms to put a name to a face (to put it mildly).
“With Amazon Rekognition, you can identify objects, people, text, scenes, and activities in images and videos, as well as detect any inappropriate content,” spokespersons write of the firm’s groundbreaking technology. “Amazon Rekognition also provides highly accurate facial analysis and facial search capabilities that you can use to detect, analyze, and compare faces for a wide variety of user verification, people counting, and public safety use cases.”
Key elements of Rekognition include celebrity recognition tools, facial attribute detection and something called “People Pathing,” which can ostensibly be used for neat, harmless things like tracking sports players during a game, but could just as easily be used to track you everywhere you go.
Some county governments in the state of Oregon were early adopters, and more and more law enforcement agencies and departments went on to use Rekognition for things like suspect identification, though most of us didn’t really know about it until now.
“Face recognition technology gives governments the unprecedented power to spy on us wherever we go,” Nicole Ozer, technology and civil liberties director with the American Civil Liberties Union of Northern California, wrote today. “It fuels police abuse. This surveillance technology must be stopped.”
The rash of headlines will put Amazon Rekognition under a spotlight, and maybe by the time that year is up, we’ll be a little smarter about how to walk the fine line between useful technology tools and unfair monitoring.