
Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias

Feb 20, 2020, 18:05 IST

Google CEO Sundar Pichai. (Getty)
  • A Google AI tool that can recognize objects in pictures will no longer attach gender labels like "woman" or "man" to images of people.
  • Google's Cloud Vision API is used by developers to analyze what's in an image, identifying anything from brand logos to faces to landmarks.
  • Google emailed Cloud Vision API customers on Thursday morning stating that it will no longer use gendered labels, because a person's gender cannot be deduced from appearance alone.
  • An expert in AI bias told Business Insider that the change was a big positive.

A Google AI tool that can recognize and label what's in an image will no longer attach gender tags like "woman" or "man" to photos of people.

Google's Cloud Vision API is a service for developers that allows them to, among other things, attach labels to photos identifying the contents.

The tool can detect faces, landmarks, brand logos, and even explicit content, and has a host of uses from retailers using visual search to researchers identifying animal species.

In an email to developers on Thursday morning, seen by Business Insider, Google said it would no longer use "gendered labels" for its image tags. Instead, it will tag any images of people with "non-gendered" labels such as "person."
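
For developers, the change shows up in the labels the API returns rather than in how it is called. Below is a minimal sketch of a label-detection request, assuming Google's Python client library (google-cloud-vision) and already-configured credentials; the file name is a placeholder, and the labels noted in the comments reflect the behavior described in Google's email, not output we verified.

from google.cloud import vision

# Minimal sketch: request descriptive labels for a local image
# using the google-cloud-vision client library.
client = vision.ImageAnnotatorClient()

# "photo.jpg" is a placeholder path for an image of a person.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for label in response.label_annotations:
    # Per Google's email, an image of a person should now yield
    # non-gendered labels such as "Person" rather than "Woman" or "Man".
    print(f"{label.description}: {label.score:.2f}")

Because the relabeling happens on Google's side, existing integrations like this one need no code changes; they will simply begin receiving the non-gendered labels.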


Google said it had made the change because it was not possible to infer someone's gender solely from their appearance. It also cited its own ethical rules on AI, stating that gendering photos could exacerbate unfair bias.

Per the email: "Given that a person's gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias."

We tested out the API, and here's what we found:

Google's API no longer uses gendered labels for photos. (Shona Ghosh/Business Insider)

Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.

Frederike Kaltheuner, a tech policy fellow at Mozilla with expertise on AI bias, told Business Insider that the update was "very positive."

She said in an email: "Anytime you automatically classify people, whether that's their gender, or their sexual orientation, you need to decide on which categories you use in the first place - and this comes with lots of assumptions.

"Classifying people as male or female assumes that gender is binary. Anyone who doesn't fit it will automatically be misclassified and misgendered. So this is about more than just bias - a person's gender cannot be inferred by appearance. Any AI system that tried to do that will inevitably misgender people."

Google notes in its own AI principles that algorithms and datasets can reinforce bias: "We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief."


Google invited affected developers to comment on its discussion forums. At the time of writing, only one developer had responded, complaining that the change was down to "political correctness."

"I don't think political correctness has room in APIs," the person wrote. "If I can 99% of the times identify if someone is a man or woman, then so can the algorithm. You don't want to do it? Companies will go to other services."

Business Insider has approached Google for comment.

