20 federal agencies use facial recognition technologies that store billions of photos
- Multiple federal agencies use unregulated facial recognition technology to aid criminal investigations.
- Facial recognition technology is more likely to misidentify Black, Asian, and Indigenous persons.
- 13 agencies did not have up-to-date information on what non-federal systems are used by their employees.
The House Judiciary Subcommittee on Crime, Terrorism, and Homeland Security met Tuesday to discuss the unfettered use of facial recognition technology by federal law enforcement.
Experts and committee members of both parties expressed concerns about the current use of facial recognition technology, including its invasion of individual and community privacy, procurement of images without proper consent, and high misidentification rates for nonwhite individuals.
From its survey of 42 federal agencies that employ law enforcement officers, the Government Accountability Office (GAO) found that 20 federal agencies own systems with facial recognition technology or use systems owned by other entities. GAO Director Gretta Goodwin testified that these systems can contain millions or billions of photos.
Of the 14 agencies that reported using facial recognition technology during criminal investigations, 13 did not have fully up-to-date information on the non-federal systems their employees use. Because these agencies may rely on systems owned by other entities to support their operations, they do not regularly track this information and lack mechanisms to do so.
"Facial recognition technology's capabilities are only going to grow stronger. If law enforcement agencies do not know if or how their employees are using the technology, then they cannot ensure that the appropriate protections are in place," Goodwin said.
The FBI has access to over 640 million photos, in some cases through the use of private companies that scrape social media sites, according to Kara Frederick, a tech policy research fellow for the Heritage Foundation.
The United States also nearly matches China in surveillance coverage, with one camera for every 4.6 people compared with China's one for every 4.1, Frederick added.
"Authoritarian powers like China are at the bleeding edge of using facial recognition for internal control," Frederick said. "The demonstrated inclination by governments to expand these powers in democratic nations renders the slope a slippery one. And we know that once these powers expand, they almost never contract."
Bertram Lee, media and tech counsel for the Leadership Conference on Civil and Human Rights, cited research showing how facial recognition further entrenches racial disparities in the legal system. Algorithmic Justice League founder Joy Buolamwini and Black in AI cofounder Timnit Gebru found that some facial analysis algorithms misclassified Black women nearly 35% of the time, while white men were almost always correctly identified.
"Even if the accuracy of facial recognition technology was improved, the fundamental issue remains: Facial recognition technology dangerously expands the scope and power of law enforcement," Lee said. "When combined with existing networks of surveillance cameras dotting our urban and suburban landscapes, facial recognition algorithms could enable governments to track the public's movements, habits, and associations of all people at all times merely with the push of a button."
Robert Williams, a Black man who was wrongfully arrested on a felony larceny charge due to facial recognition technology, urged politicians to put safeguards in place.
Williams was taken into custody by the Detroit Police Department and later pleaded not guilty. During a court recess, Williams said he met with detectives and viewed photos of the suspect at the scene of the robbery.
"I held that piece of paper up to my face and said, 'I hope you don't think all Black people look alike.' [A detective] turned over another piece of paper and said, 'So, I guess the computer got it wrong.'"
The GAO report included recommendations such as implementing mechanisms to track what non-federal systems with facial recognition technology are used by employees to support investigative activities and assessing the privacy- and accuracy-related risks of such systems.