
Twitter is making changes to its photo software after people online found it was automatically cropping out Black faces and focusing on white ones

Katie Canales   

  • Twitter said it is limiting its reliance on machine learning that helps it decide which part of a photo to crop on its platform.
  • Online users have reported racial bias in the social media firm's image cropping tool, which automatically focuses on the part of a photo it thinks viewers will find most interesting.
  • One Twitter user recently highlighted how the face of Senate Majority Leader Mitch McConnell, who is white, was routinely centered in automatic image crops, while that of former President Barack Obama was cut out.

Twitter is making changes to its photo cropping function after an investigation into racial bias in the software, the company said on Thursday.

The announcement comes after users on the platform repeatedly showed that the tool — which uses machine learning to choose which part of an image to crop based on what it thinks is the most interesting — cuts out Black people from photos and centers on white faces instead.

Tony Arcieri, a cryptography engineer, posted a series of tweets in mid-September showing how the platform's algorithm routinely chose to highlight the face of Senate Majority Leader Mitch McConnell, who is white, over that of former President Barack Obama in multiple photos of the two. His thread prompted others to run similar tests with the same results and led the company to launch an investigation into its systems shortly after.

The social media company implemented its machine-learning-powered image cropping system in 2018. The system "relies on saliency, which predicts where people might look first," Twitter's chief design officer, Dantley Davis, and its chief technology officer, Parag Agrawal, wrote in a company blog post on Thursday.
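
Twitter has not published its cropping code, but the mechanism the two executives describe is straightforward to sketch. In the hypothetical Python illustration below, saliency_map stands in for the output of whatever upstream model predicts "where people might look first"; the crop is simply the window that captures the most of that predicted attention:

```python
import numpy as np

def saliency_crop(image: np.ndarray, saliency_map: np.ndarray,
                  crop_h: int, crop_w: int, stride: int = 16) -> np.ndarray:
    """Return the crop window that captures the most total saliency.

    `saliency_map` is a hypothetical H x W array of per-pixel scores
    from some model of where a viewer is likely to look first.
    """
    h, w = saliency_map.shape
    best_score, best_xy = -1.0, (0, 0)
    # Slide the window over the image on a coarse grid and keep the
    # position whose summed saliency is highest.
    for y in range(0, max(h - crop_h, 0) + 1, stride):
        for x in range(0, max(w - crop_w, 0) + 1, stride):
            score = saliency_map[y:y + crop_h, x:x + crop_w].sum()
            if score > best_score:
                best_score, best_xy = float(score), (x, y)
    x, y = best_xy
    return image[y:y + crop_h, x:x + crop_w]
```

The fairness question lives entirely inside that map: whichever face the upstream model scores as more salient is the one the crop keeps.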

They said in the post that Twitter will now limit its reliance on machine learning and will instead roll out a "what you see is what you get" feature — when people post photos to the site, they will appear as they did in the tweet composer. Exceptions may include photos that are abnormally wide or long.
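
The post did not spell out how those exceptions are detected; one plausible, purely hypothetical rule keys off the aspect ratio:

```python
def show_uncropped(width: int, height: int, max_ratio: float = 2.0) -> bool:
    """Hypothetical rule: display the photo as posted unless it is
    abnormally wide or tall (the 2.0 threshold is an assumption)."""
    ratio = max(width / height, height / width)
    return ratio <= max_ratio
```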

A Twitter spokesperson declined to comment further and said the company planned to share more details in the coming weeks.

Experts have raised the alarm in recent years about how algorithms reinforce racial bias. As Vox explains at length, algorithms are not objective: they are guided by the subtle biases of the data and of the people who build them, regardless of intent. Machine-learning and artificial-intelligence systems are trained on mounds of information, sometimes millions of data points, and the photos and other examples that teach an algorithm determine how it acts, so it can reproduce and amplify the racial biases present in its training data.
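
The dynamic is easy to caricature in code. The toy Python sketch below uses invented numbers and a deliberately crude "model" that does nothing but count, yet it shows how a skewed training set flows straight through to skewed scores:

```python
import random

random.seed(0)

# Invented training set: each example is (skin_tone_bucket, face_present).
# 95% of the photos come from lighter buckets (6-9) and 5% from darker
# ones (0-3) -- a skew of the kind real datasets have been shown to have.
train = [(random.randint(6, 9), 1) for _ in range(950)]
train += [(random.randint(0, 3), 1) for _ in range(50)]

# A deliberately crude "model": its confidence for a bucket is simply
# how often that bucket appeared during training.
counts = {bucket: 0 for bucket in range(10)}
for bucket, _ in train:
    counts[bucket] += 1
confidence = {bucket: counts[bucket] / len(train) for bucket in counts}

print(f"lighter bucket 8: {confidence[8]:.3f}")  # ~0.24
print(f"darker bucket 2:  {confidence[2]:.3f}")  # ~0.01
```

Real systems are vastly more complicated, but the failure mode is the same: a model cannot score well what it has rarely seen.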

This is far from the first time machine-learning systems have been found to have issues with racial bias. Google's image-recognition algorithm for years labeled Black people as "gorillas," and the company fixed the issue only by removing the "gorilla" category from its image library, Wired reported in 2018. More recently, a program intended to unblur pixelated photos turned Obama into a white man.

Racial bias coded in AI has had severe consequences. In January, a Black man in Detroit was wrongfully arrested after being misidentified by facial-recognition technology used by the Michigan State Police. Shortly after, Detroit Police Chief James Craig said the facial-recognition software fails to correctly identify people "96% of the time."

A 2016 ProPublica investigation found that risk-assessment software used in court proceedings across the US categorized Black defendants as high risk of committing future crimes more often than white defendants. The software was also unreliable at predicting who would actually go on to commit crimes.
