Facial recognition is almost perfectly accurate - here's why that could be a problem
- The accuracy rate of facial recognition depends on the data it's fed. With enough good data, the accuracy rate could be almost perfect.
- Facial recognition is already being implemented in US airports for security and efficiency. But there are concerns that the US government is creating a digital ID library of millions of Americans without consent.
- US Immigration and Customs Enforcement is also using facial recognition for security reasons. But without clear protections, it could be used as a tool to violate human rights.
- China is also implementing facial recognition on a bigger scale, giving citizens "social scores" in hopes of improving society.
- Facial recognition can be used to make tasks automated, convenient, and efficient. But there needs to be regulation and protections in place.
- Visit BusinessInsider.com for more stories.
Michelle Yan: How does my iPhone know that's me? And how does Facebook know that's me? And why is Facebook always asking if I want to tag myself in these photos? Well, both are using facial recognition technology. So what's going on?
Facial recognition is not a new technology. Early on, it was pretty simple: people would use a ruler to take measurements of your facial features, like how long your eyebrows are, the position of your eyes, the curve of your lips, and so forth. Today, the process is much faster.
WonSook Lee: Now, we use more deep learning-based methods, which raised recognition accuracy even higher than what humans can do. So almost perfect.
Michelle Yan: That's WonSook Lee, a professor in the School of Electrical Engineering and Computer Science at the University of Ottawa. She has 15 years of expertise in facial recognition, facial modeling, and computer animation. Wait, did she just say the accuracy rate of facial recognition is almost perfect?
WonSook Lee: So almost perfect.
Michelle Yan: Almost perfect? There are all these headlines about facial recognition being racist or having preprogrammed biases. Why was Amazon's "Rekognition" misidentifying dark-skinned women while correctly identifying white men?
WonSook Lee: That's basically because they don't have a large database for African people. If we recognize the gender, the age, and also the ethnic group along with skin tone, we can make the recognition system better.
Michelle Yan: Gotcha. So if Amazon's "Rekognition" was having trouble identifying dark-skinned women, programmers needed to feed it more images of dark-skinned women in various scenarios: different lighting, various angles, different outfits, hats and sunglasses, all things that would increase the accuracy rate.
So theoretically, if the facial recognition software has enough variety of images of me, it should be able to recognize me a hundred percent of the time, even if I fed it an image of me in a darker scenario with glasses on and short blue hair. Now I don't care if Facebook misidentifies me as someone else, but what about when facial recognition misidentifies people at the airport? Or when law enforcement uses it to make an arrest?
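The idea described above, showing a recognizer the same face under many conditions, is what machine-learning practitioners call data augmentation. Here's a toy sketch of that concept in Python, where a "photo" is just a small grid of grayscale pixel values; this is a generic illustration of the technique, not Amazon's or anyone else's actual training pipeline, which is not public.

```python
# Toy data augmentation: one photo in, several training variants out.
# Real systems do this with actual images (lighting shifts, rotations,
# crops, mirroring) so under-represented faces appear in many conditions.

def adjust_brightness(img, factor):
    """Scale every pixel by `factor`, clamped to 0..255 (simulates lighting)."""
    return [[min(255, int(p * factor)) for p in row] for row in img]

def mirror(img):
    """Flip each row left-to-right (simulates a mirrored pose)."""
    return [list(reversed(row)) for row in img]

def augment(img):
    """Generate extra training examples from a single photo."""
    return [
        adjust_brightness(img, 0.5),              # darker scene
        adjust_brightness(img, 1.5),              # brighter scene
        mirror(img),                              # mirrored pose
        mirror(adjust_brightness(img, 0.5)),      # dark + mirrored
    ]

face = [[100, 150], [200, 50]]   # 2x2 stand-in for a face photo
variants = augment(face)
print(len(variants))             # -> 4 extra examples from one photo
```

With variants like these added to the training set, the model sees each person under more conditions, which is exactly why feeding the system a wider variety of images raises its accuracy for groups it previously got wrong.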
WonSook Lee: Caucasian people recognize Caucasians very well, but they don't recognize Asian people very well. They don't recognize African people very well either. People depend on familiarity, that kind of thing. Machines can do more objective work if they are trained with enough data from a variety of people.
Michelle Yan: Luckily, it's not left up to just the machines. And if facial recognition gets it wrong, there are other methods in place like fingerprint, iris, ear, and palm recognition. But this still doesn't address the security and privacy concerns that come with this new technology.
Let's start with facial recognition in airports.
According to a recent report from BuzzFeed News, the US Customs and Border Protection stated that they're using this technology to, one: identify non-US citizens who use fraudulent travel documents, and two: provide a quicker check-in process. But there are concerns that the US government is using this technology to create a digital identification library of millions of Americans without their consent.
CBP says it's not storing the photos, but that's hard to verify. The agency claims it deletes photos after 12 hours and that it gives US citizens the option to opt out of checking in with facial recognition and check in manually. The reality is that not many travelers know they can opt out of the technology.
Another concern is the US Immigration and Customs Enforcement using facial recognition.
ICE says it's using the technology to protect the US from cross-border crimes and undocumented migrants they say threaten national security and public safety. But there are concerns that this could be used as a tool that violates human rights if there aren't clear protections and restrictions in place.
It's unclear how this will shake out in the US. But in China, we're already seeing how the overuse of facial recognition can get out of hand. The country is using facial recognition for mass surveillance and to give its citizens social scores.
A citizen's social score is based on their economic and social reputation and can be affected by bad driving, jaywalking, posting fake news online, or just buying too many video games.
WonSook Lee: So when people pass by and your face is captured, if they have a database of the person, the system can recognize the person. And if there are CCTVs everywhere in the city, let's say there is a targeted person the government wants to track, they can actually follow the trail and find out everything about that person.
Michelle Yan: Their face is linked to their government records, social networks, and tracked behavior through CCTV cameras. Good and honest behavior can lead to discounted airline tickets. Committing wrong deeds can lead to problems like being banned from flights and trains or having your dog taken away. The Chinese government believes this social credit system can improve society by rewarding good behavior and denouncing bad behavior. But it does feel like Orwell's "1984." And citizens have reported concerns about their privacy and the lack of checks and balances on this system.
Facial recognition is a powerful tool, and with great power comes great responsibility, right? It could wind up being great for automation, finding missing people, or just checking you in at the airport. But should companies and governments even have access to all this data? What sorts of guidelines should they follow? Will the government use it in limited ways like just for law enforcement? Or will they use it publicly to embarrass citizens like China's social credit system? How can they guarantee that the data they collect will be deleted? What are they going to do in case of a data breach? Or how will they prevent one? These are questions that are still in discussion, and it doesn't sound like there are going to be any simple answers anytime soon.
Facial recognition is here to stay. But there need to be clearer terms of service, guidelines, and regulations to help protect the rights, security, safety, and privacy of the people affected by the technology. Otherwise, its use might become widespread faster than it can be regulated. And if the US adopts the same social credit system as China, I don't wanna be banned from taking flights just because I jaywalk all the time. Come on, it's New York.