AI may be able to predict your political views based on how attractive you are, a recent study found

Jun 21, 2023, 21:49 IST
Business Insider
A study found that deep learning accurately predicted the political affiliations of pictured people 61% of the time. Getty Images
  • AI tools may be able to predict your political views, according to a new study.
  • Researchers found that AI links right-wing views to people who look happy and women deemed "attractive."

AI may be able to predict your political views based on how you look — and that could cause issues down the line, new research suggests.

A team of researchers based in Denmark and Sweden conducted a study to see whether "deep learning techniques," such as facial recognition technology and predictive analytics, could be applied to photographs of faces to predict a person's political views.

The purpose of the March study, researchers wrote, "was to demonstrate the significant privacy threat posed by the intersection of deep learning techniques and readily-available photographs."

To do this, the researchers used a public dataset of 3,233 images of Danish political candidates who ran for local office and cropped them to show only their faces. They then applied automated facial-expression analysis to each face and used a facial beauty database to assign each person a "beauty score."

Using these data points, the scientists predicted whether the figures pictured were left-wing or right-wing. The study found that the tech accurately predicted the candidates' political affiliations 61% of the time.
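
In broad strokes, that pipeline — extract per-face features, fit a classifier on them, and report held-out accuracy — looks something like the minimal sketch below. The feature names, synthetic data, and logistic-regression model here are assumptions for illustration only, not the researchers' actual code, dataset, or model.

```python
# Hypothetical sketch of the general approach: per-face features (e.g. a
# smile/happiness score and a "beauty score") train a binary left/right
# classifier, whose accuracy is then measured on held-out data.
# The synthetic data and feature names are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_candidates = 3233  # matches the size of the public dataset cited above

# Stand-in features: in the study these would come from facial-expression
# analysis and a facial-beauty model applied to each cropped face photo.
happiness = rng.uniform(0, 1, n_candidates)
beauty = rng.uniform(0, 1, n_candidates)
X = np.column_stack([happiness, beauty])

# Synthetic labels (1 = right-wing, 0 = left-wing), loosely following the
# reported association between smiling/attractiveness and right-wing labels.
logits = 1.5 * (happiness - 0.5) + 1.0 * (beauty - 0.5)
y = (rng.uniform(0, 1, n_candidates) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
clf = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```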


Variations in facial expressions were linked to a candidate's political views, the study found: The model predicted that conservative candidates "appeared happier than their left-wing counterparts" because of their smiles, whereas liberal candidates were more neutral. Women who expressed contempt — a facial expression characterized by neutral eyes and one corner of the lips lifted — were linked to more liberal politics by the model.

The researchers also found that the model correlated a candidate's level of attractiveness with their politics. Women deemed attractive by their beauty scores were predicted to have conservative views, though there was not a similar correlation between men's level of attractiveness — measured by how masculine they look — and right-wing leanings.

While links between attractiveness and political ideology are nothing new, the study's findings reveal how powerful AI can be at deducing that information — and, the study's writers say, "confirmed the threat to privacy posed by deep learning approaches."

Preconceived notions about beauty standards and gender that are often built into the data used to train AI models can reinforce stereotypes, which can in turn shape outcomes in, for example, hiring decisions.

"Facial photographs are commonly available to potential employers, and those involved in hiring decisions self-declare a willingness to discriminate based on ideology," the researchers wrote. "Members of the public may thus be aided by recognizing what elements of their photographs could affect their chances of employment."


Concerns about how cutting-edge tech may reinforce stereotypes about certain demographics have become more prevalent now that powerful AI tools like OpenAI's ChatGPT are taking the world by storm.

A separate study published in March found that DALL-E 2, an AI-image generator, produced images that linked white men with "CEO" or "director" 97% of the time — biased outputs the researchers warned can perpetuate racial stereotypes.

The researchers of the deep learning study didn't respond to Insider's request for comment before publication.
