The pedestrian detection systems in self-driving cars are less likely to detect children and people of color, study suggests

Aug 26, 2023, 17:39 IST
Business Insider
Self-driving cars, like the Cruise model from General Motors pictured above, are already on streets throughout the United States. Heather Somerville/Reuters
  • Pedestrian detectors in self-driving cars are less likely to detect kids and people of color, study shows.
  • This is due to bias in open-source AI, on which self-driving cars rely, researchers say.

As the artificial intelligence revolution ramps up, one trend is clear: Bias in the training of AI systems is resulting in real-world discriminatory practices.

AI recruitment tools have been shown to discriminate against women. ChatGPT has demonstrated racist and discriminatory biases. In every reported case of police misidentifying a suspect because of facial recognition technology, that person has been Black.

And now, new research suggests even the pedestrian detection software in self-driving cars may be less effective in detecting people of color — as well as children, generally — as a result of AI bias, putting them at greater safety risk as more carmakers use the technology.

A team of researchers in the UK and China tested how well eight popular pedestrian detectors worked depending on a person's race, gender, and age. While gender produced only a small discrepancy in accuracy, the researchers found the detection systems were notably less accurate at detecting pedestrians with darker skin tones.

"Before, minority individuals may have been denied vital services. Now they might face severe injury," Jie Zhang, a computer scientist at King's College London and a member of the research team, said in a statement.


The detection systems were 19.67% more likely to detect adults than children, and 7.52% more likely to detect people with lighter skin tones than people with darker skin tones, according to the study.

"Overall, this study sheds light on the fairness issues faced by existing pedestrian detectors, emphasizing the importance of addressing bias related to age and skin tone," the study reads. "The insights gained can pave the way for more fair and unbiased autonomous driving systems in the future."

This trend stems from biases already present in the open-source AI systems that many companies use to build their detectors. Because the software that companies like Tesla use to power self-driving cars is confidential, the study could not test it directly; however, the detection systems it examined are built on the same open-source AI those companies rely on, according to Zhang.

The research team called on lawmakers to regulate self-driving car software to prevent bias in their detection systems.

"It is essential for policymakers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately," the study reads.
