
Amazon built an AI to hire people, but reportedly had to shut it down because it was discriminating against women

Isobel Asher Hamilton


Amazon CEO Jeff Bezos. David Ryder/Getty Images

  • Amazon tried building an AI tool to help with recruiting, but it showed a bias against women, Reuters reports.
  • Engineers found the AI was unfavourable towards female candidates because it had been trained on resumes that came mostly from men.
  • Amazon reportedly abandoned the project at the beginning of 2017.

Amazon worked on building an AI to help with hiring, but the plan backfired when the company discovered the system discriminated against women, Reuters reports.

Citing five sources, Reuters said Amazon set up an engineering team in Edinburgh, Scotland, in 2014 to find a way to automate its recruitment.

They created 500 computer models to trawl through past candidates' resumes and pick up on around 50,000 key terms. The system would crawl the web to recommend candidates.

"They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those," one source told Reuters.

A year later, however, the engineers noticed something troubling about their engine: it didn't like women. This was apparently because the AI had been trained on resumes submitted to Amazon over a 10-year period, and those resumes came predominantly from men.

Consequently, the AI concluded that men were preferable. It downgraded resumes containing the word "women's" and filtered out candidates who had attended two women's-only colleges.
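To see how that kind of association can arise without anyone programming it in, here is a self-contained toy example, with invented training data, of a model that learns word weights from historical hiring outcomes. Because the invented "hired" examples skew male, a term like "women's" ends up with a negative weight. This is a sketch of the general mechanism, not a reconstruction of Amazon's system.

```python
# Toy illustration of how bias enters a hiring model trained on historical
# outcomes. The data below is invented; it only mimics the imbalance Reuters
# describes (past hires overwhelmingly male).
import math
from collections import defaultdict

# (resume text, 1 = hired in the past, 0 = not hired)
history = [
    ("java distributed systems backend", 1),
    ("aws java infrastructure", 1),
    ("captain women's chess club java", 0),
    ("women's college distributed systems", 0),
]

def token_log_odds(data):
    """Log-odds of being hired given a token appears, with add-one smoothing."""
    hired, rejected = defaultdict(int), defaultdict(int)
    for text, label in data:
        for tok in set(text.split()):
            (hired if label else rejected)[tok] += 1
    vocab = set(hired) | set(rejected)
    return {t: math.log((hired[t] + 1) / (rejected[t] + 1)) for t in vocab}

weights = token_log_odds(history)
print(weights["women's"])  # negative: the token correlates with past rejections
print(weights["java"])     # positive: it appears mostly in past hires
```

Stripping out the explicit term does not remove the underlying skew, because other tokens correlated with gender can pick up similar weights, which is exactly the difficulty described next.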

Amazon's engineers tweaked the system to remedy these particular forms of bias, but couldn't be sure the AI wouldn't find new ways to unfairly discriminate against candidates.

Gender bias was not the only problem, Reuters' sources said. The computer programs also recommended candidates who were unqualified for the position.

Remedying algorithmic bias is a thorny issue, because algorithms can pick up on unconscious human bias. In 2016, ProPublica found that risk-assessment software used to forecast which criminals were most likely to reoffend exhibited racial bias against black people. Over-reliance on AI for tasks like recruitment, credit-scoring, and parole judgements has also created issues in the past.

Amazon reportedly abandoned the AI recruitment project by the beginning of 2017, after executives lost faith in it. Reuters' sources said that Amazon recruiters looked at the recommendations generated by the AI but never relied solely on its judgement.

Business Insider contacted Amazon for comment. The company declined to comment on the project when approached by Reuters, but said it was committed to workplace diversity and equality.
