The ACLU says Amazon's 1-year suspension of facial recognition sales to law enforcement falls short, and it wants a longer ban
- Amazon announced on Wednesday that it is suspending sales of its facial recognition software Rekognition to law enforcement for one year.
- AI experts and civil rights activists have been campaigning for years for Amazon to halt the sale of Rekognition to law enforcement.
- The ACLU said a one-year suspension is not enough, and an AI ethics expert told Business Insider that Amazon's statement does not address whether police forces that already have Rekognition will be able to keep using it.
Amazon took an unusual step on Wednesday, temporarily pulling back from part of the facial recognition market.
The tech giant announced it would suspend the sale of its facial recognition software Rekognition to law enforcement for one year.
"We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested," Amazon said in its statement.
The announcement came amid the Black Lives Matter protests and renewed concern about the use of facial recognition by law enforcement and government agencies in the US. There is evidence showing the technology is biased against those with darker skin.
For the American Civil Liberties Union, the one-year moratorium is not enough.
"This surveillance technology's threat to our civil rights and civil liberties will not disappear in a year," Nicole Ozer, technology and civil liberties director for the ACLU, said in a press statement.
"Amazon must fully commit to a blanket moratorium on law enforcement use of face recognition until the dangers can be fully addressed, and it must press Congress and legislatures across the country to do the same."
Why facial recognition is particularly dangerous to people of color
Civil rights organizations and AI experts have spent years urging Amazon and others to stop selling facial recognition tech, partly because as a policing tool it would be disproportionately used to surveil people of color.
This activism has had a resurgence amid the Black Lives Matter protests taking place across the world in the wake of George Floyd's death. Amazon's own employees accused the company of hypocrisy for voicing support for the protests while continuing to sell Rekognition to police.
"Face recognition technology gives governments the unprecedented power to spy on us wherever we go. It fuels police abuse. This surveillance technology must be stopped," Ozer said.
But on top of the risk that it could exacerbate institutional racism, the technology itself is flawed when it comes to identifying people of color.
When building AI-powered technologies like facial recognition, the data used to train the underlying algorithms can ingrain existing biases. If a training dataset is composed mainly of white faces, the resulting software will be good at identifying white people and less good at identifying people with darker skin tones.
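To make that dynamic concrete, here is a minimal, purely illustrative simulation, not based on Rekognition or any real facial recognition system: a single classifier is trained on data that is 95% "group A," then scored separately on each group. The groups, features, and numbers are all hypothetical.

```python
# Purely illustrative: how an imbalanced training set skews per-group
# accuracy. Synthetic 2-D features stand in for face data; everything
# here (groups, sizes, offsets) is hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group's true decision boundary sits at a different offset,
    # so a single linear model cannot fit both groups equally well.
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.5, n) > 2 * shift).astype(int)
    return X, y

# Training data: 95% group A, 5% group B, mimicking a dataset
# composed mainly of one demographic.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=2.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Score on balanced held-out data: the under-represented group
# comes out markedly less accurate.
for name, shift in [("A", 0.0), ("B", 2.0)]:
    Xt, yt = make_group(1000, shift)
    print(f"group {name} accuracy: {accuracy_score(yt, model.predict(Xt)):.2f}")
```

Real systems are vastly more complex, but the mechanism researchers point to is the same: a model optimizes for the data it sees most.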
A study led by AI researcher Joy Buolamwini, published in January 2019, found that Amazon's Rekognition was far less accurate at recognizing women and people of color. The study, which tasked the program with identifying people's gender, found the software made no errors in identifying the gender of white men but misidentified darker-skinned women as men 31% of the time.
The ACLU also ran a test of Amazon's technology in July 2018 on pictures of members of Congress. Rekognition wrongly matched 28 members of Congress with mugshots of people who had been arrested, and a disproportionate share of those false matches, nearly 40%, were people of color.
Amazon has consistently dismissed criticism of its software by saying researchers need to pay more attention to its "confidence threshold," a percentage the software outputs with every match indicating how sure it is that it has correctly identified someone.
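For context, this threshold is an explicit parameter in Rekognition's API. The sketch below uses the real compare_faces call from AWS's boto3 SDK; the image files are hypothetical placeholders. Amazon has said law enforcement should only act on matches at a 99% threshold, while the ACLU said its congressional test used the service's default setting of 80%.

```python
# Sketch: Rekognition's confidence threshold in practice, via boto3's
# compare_faces API. "suspect.jpg" and "crowd.jpg" are hypothetical
# placeholder files, not real data. Requires AWS credentials.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("suspect.jpg", "rb") as src, open("crowd.jpg", "rb") as tgt:
    response = client.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        # Matches below this similarity are dropped from the response.
        # The service defaults to 80; Amazon has said police should
        # require 99 before acting on a match.
        SimilarityThreshold=99.0,
    )

# Each returned match carries its own similarity score, the
# "how sure it is" percentage Amazon points critics toward.
for match in response["FaceMatches"]:
    print(f"match at {match['Similarity']:.1f}% similarity")
```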
As Rekognition has been sold to police departments across the US, however, multiple reports have indicated that in practice law enforcement officers don't pay attention to this confidence threshold either, and are not well trained to recognize the limitations of the technology. A report published in May 2019 by Georgetown's Center on Privacy and Technology found NYPD officers were even running pictures of celebrities through their facial recognition systems to try to identify suspects.
What will happen to the police departments that already use Rekognition?
AI and privacy policy expert Dr. Nakeema Stefflbauer told Business Insider that while it is "great news" that Amazon has halted its Rekognition sales to law enforcement, it raises the question of what will happen to the police departments that already use it.
"Who will guide or monitor their use of the tool they've licensed? Far better than suspending further sales would be recalling the software altogether, as is commonly done with other faulty or unreliable products," she said.
Mozilla fellow and privacy expert Frederike Kaltheuner also welcomed Amazon's decision, but said praise should not go to Amazon.
"Instead of praising Amazon for its overdue decision, the real story is how activists like Deb Raji and Joy Buolamwini have warned for years that the technology is immature and clearly demonstrates biased performance," she said. "Amazon tried to discredit the authors, and deny their results. Today we should celebrate the hard work that led to Amazon's decision."
Buolamwini welcomed the news in a statement on Twitter.
Amazon's one-year suspension followed an announcement by IBM that it would halt sales of "general purpose" facial recognition. To Kaltheuner, big companies turning away from the tech under activist pressure is a good sign.
"The tide is turning against law enforcement use of face recognition and this is a good thing. There are many other, lesser known companies that sell face recognition (and other invasive and problematic technology) to law enforcement around the world. These companies operate completely under the radar. That's why a moratorium on inaccurate and risky tech is the right move," she said.