Researchers report ChatGPT biased against people with disabilities

Anjali Sharma

GG News Bureau
NEW YORK, 24th June.
A new study released on Sunday has revealed that OpenAI’s artificial intelligence (AI) chatbot ChatGPT consistently ranked curriculum vitae, or resumes, with disability-related honours and credentials, such as the ‘Tom Wilson Disability Leadership Award’, lower than the same resumes without those honours and credentials.

When University of Washington researchers asked the system to explain its rankings, its responses showed biased perceptions of disabled people, according to the study.

It claimed a resume with an autism leadership award had ‘less emphasis on leadership roles’, implying the stereotype that autistic people are not good leaders.

When the researchers customised the tool with written instructions directing it not to be ableist, it reduced this bias for all but one of the disabilities tested.

“Five of the six implied disabilities, deafness, blindness, cerebral palsy, autism and the general term ‘disability’ improved, but only three ranked higher than resumes that didn’t mention disability,” the researchers noted.

The researchers used the publicly available CV of one of the study’s authors, which spanned about 10 pages.

They created six modified CVs, each suggesting a different disability by adding four disability-related credentials: a scholarship, an award, a seat on a diversity, equity and inclusion panel, and membership in a student organisation.

The researchers used the GPT-4 model of ChatGPT to compare these modified CVs with the original version for an actual “student researcher” position at a major US-based software company.

They conducted each comparison 10 times; out of the 60 trials, the system ranked the enhanced CVs, which were identical except for the implied disability, first only one-quarter of the time.

Kate Glazko, a doctoral student in the UW’s Paul G. Allen School of Computer Science & Engineering, said: “Some of GPT’s descriptions would colour a person’s entire resume based on their disability and claimed that involvement with DEI or disability is potentially taking away from other parts of the resume.”

“People need to be aware of the system’s biases when using AI for these real-world tasks. Otherwise, a recruiter using ChatGPT can’t make these corrections, or be aware that, even with instructions, bias can persist,” she added.
