
An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white, with lighter skin and blue eyes.

Sawdah Bhaimiya   

Tech · 2 min read
  • An Asian MIT student was shocked when an AI tool turned her white for a professional headshot.
  • Rona Wang said she had been put off using AI-image tools because they didn't create usable results.

An MIT graduate was caught by surprise when she prompted an artificial intelligence image generator to create a professional headshot for her LinkedIn profile, and it instead changed her race.

Rona Wang, a 24-year-old Asian American student who studied math and computer science and is starting a graduate program at MIT in the fall, had been experimenting with the AI image creator Playground AI. Her identity was verified by Insider, and The Boston Globe first reported the story.

Wang tweeted images of the results on July 14, saying: "Was trying to get a linkedin profile photo with AI editing & this is what it gave me."

In the first image, which Wang uploaded into the image generator, she appears to be wearing a red MIT sweatshirt. Her prompt was: "Give the girl from the original photo a professional linkedin profile photo."

The second image showed that the AI tool had altered her features to appear more Caucasian, with lighter skin and blue eyes.

"My initial reaction upon seeing the result was amusement," Wang told Insider. "However, I'm glad to see that this has catalyzed a larger conversation around AI bias and who is or isn't included in this new wave of technology."

She added that "racial bias is a recurring issue in AI tools" and that the results had put her off them. "I haven't gotten any usable results from AI photo generators or editors yet, so I'll have to go without a new LinkedIn profile photo for now!"

Wang told The Boston Globe that she was worried about the consequences in more serious situations, such as if a company used AI to select the most "professional" candidate for a job and the tool favored white-looking people.

"I definitely think it's a problem," Wang said. "I hope people who are making software are aware of these biases and thinking about ways to mitigate them."

Suhail Doshi, the founder of Playground AI, responded to Wang's post: "The models aren't instructable like that so it'll pick any generic thing based on the prompt. Unfortunately, they're not smart enough."

He added, "Fwiw, we're quite displeased with this and hope to solve it."

A recent study by researchers from the AI firm Hugging Face found that AI image generators like DALL-E 2 exhibited gender and racial bias.

The study found that 97% of the images DALL-E 2 produced when prompted to generate images of positions of power, like "director" or "CEO," were of white men.

The researchers said this was because the AI tool was trained on biased data that could amplify stereotypes.

Playground AI and its founder didn't immediately respond to a request for comment.
