
Here’s what ChatGPT said about why it discriminated against CVs with disability mentions

The world of tech has been nursing its newest baby for a while now: artificial intelligence. Much like a baby, the technology has been growing at a blazing rate, and we're really proud of it; many of us already have it accompany us to work every day. And while AI may well be the smartest baby on the planet, it has one fatal flaw it shares with all real-life children: its all-too-human parents.

Just as we inherit many values, routines and beliefs from our parents, AI tools like ChatGPT have been trained on decades of human-generated data, and that data can carry inherent prejudices. This makes the tools far from perfect in their current state: they have been shown to exhibit bias against minorities. Amazon's AI recruitment tool, for instance, preferred men over women, and ChatGPT has been found to treat black-sounding names differently.
The employment bias saga continues
Most of these examples come from the world of recruitment, where AI tools are increasingly used to streamline the summarising of resumes and the ranking of candidates. However, a recent study has uncovered another significant bias in these systems, this time against disabled individuals.

The study found that ChatGPT consistently ranked resumes with disability-related honours and credentials lower than identical resumes without them. A resume featuring the "Tom Wilson Disability Leadership Award", for example, was ranked lower, and the justifications the AI offered were themselves biased: it suggested that a resume highlighting an autism leadership award had "less emphasis on leadership roles", echoing harmful stereotypes about autistic individuals.

"Ranking resumes with AI is starting to proliferate, yet there's not much research behind whether it's safe and effective," explains study author Kate Glazko. "For a disabled job seeker, there's always this question when you submit a resume of whether you should include disability credentials. I think disabled people consider that even when humans are the reviewers."

To investigate, the researchers used the publicly available curriculum vitae (CV) of one of the study's authors, which ran to about 10 pages. They created six “enhanced” CVs, each implying a different disability by adding four disability-related credentials: a scholarship, an award, a seat on a diversity, equity, and inclusion (DEI) panel, and membership in a student organisation. They then used ChatGPT's GPT-4 model to rank each enhanced CV against the original for a real "student researcher" job listing at a large US-based software company. Each comparison was run 10 times; across the 60 trials, the system ranked the enhanced CVs first only 25% of the time.
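To make the setup concrete, here is a minimal sketch of what one such pairwise trial could look like, assuming the OpenAI Python client. The prompt wording and helper names are illustrative assumptions, not the researchers' actual protocol.

```python
# Hypothetical sketch of one pairwise ranking trial, loosely modelled on the
# study's setup. The model name and the 10-trial repetition come from the
# article; the prompt text and function names are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def rank_pair(job_listing: str, original_cv: str, enhanced_cv: str) -> str:
    """Ask GPT-4 to rank the original CV against an enhanced one and explain why."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": (
                f"Job listing:\n{job_listing}\n\n"
                f"Candidate A:\n{original_cv}\n\n"
                f"Candidate B:\n{enhanced_cv}\n\n"
                "Rank the two candidates for this role and explain your ranking."
            ),
        }],
    )
    return response.choices[0].message.content

# The study ran each of the six comparisons 10 times (60 trials in all),
# because the model's rankings vary from run to run.
def run_trials(job_listing: str, original_cv: str, enhanced_cv: str, n: int = 10) -> list[str]:
    return [rank_pair(job_listing, original_cv, enhanced_cv) for _ in range(n)]
```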

"In a fair world, the enhanced resume should be ranked first every time," remarked Jennifer Mankoff, another author. "I can't think of a job where somebody who's been recognised for their leadership skills, for example, shouldn't be ranked ahead of someone with the same background who hasn't."

The researchers also discovered that customising ChatGPT with written instructions to avoid ableism mitigated the bias for five of the six disabilities tested: deafness, blindness, cerebral palsy, autism, and the general term "disability". Even so, only three of the enhanced CVs were consistently ranked higher than the version without disability mentions.
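The mitigation amounted to giving the model written instructions before it ranked anything. A rough approximation, again assuming the OpenAI Python client, might prepend those instructions as a system message; the wording below is a placeholder, not the authors' actual text, which was configured through ChatGPT's GPTs Editor.

```python
# Hypothetical sketch of the mitigation step: re-running the same ranking
# prompt after giving the model written instructions to avoid ableist
# reasoning. DEBIAS_INSTRUCTIONS is an illustrative assumption.
from openai import OpenAI

client = OpenAI()

DEBIAS_INSTRUCTIONS = (
    "Do not treat disability-related awards, scholarships, DEI panels or "
    "student-group memberships as weaknesses. Evaluate them as you would "
    "any other evidence of leadership, initiative and achievement."
)

def rank_with_instructions(job_listing: str, cv_a: str, cv_b: str) -> str:
    """Rank two CVs as before, but with anti-ableism instructions prepended."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": DEBIAS_INSTRUCTIONS},
            {"role": "user", "content": (
                f"Job listing:\n{job_listing}\n\n"
                f"Candidate A:\n{cv_a}\n\n"
                f"Candidate B:\n{cv_b}\n\n"
                "Rank the two candidates for this role and explain your ranking."
            )},
        ],
    )
    return response.choices[0].message.content
```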
Biased explanations provided
When the researchers asked GPT-4 to explain its rankings, the responses revealed explicit and implicit ableism. For example, it noted that a candidate with depression had "additional focus on DEI and personal challenges," which it claimed detracted from the "core technical and research-oriented aspects of the role."

"Some of GPT's descriptions would colour a person's entire resume based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resume," Glazko explained. "For instance, it hallucinated the concept of 'challenges' into the depression resume comparison, even though 'challenges' weren't mentioned at all. So you could see some stereotypes emerge."

This study underscores the urgent need for greater scrutiny and refinement of AI tools used in hiring. While generative AI like ChatGPT can streamline recruitment, it can also perpetuate and amplify biases, potentially undermining efforts to build more inclusive workplaces.

The findings of this study can be accessed here.
