This illustration displays words related to artificial intelligence (AI).
According to a recent study from New York University, AI resume screening systems used by Fortune 500 companies may be biased against mothers in general and those who have taken extended leaves of absence in particular.
As part of the research, resumes containing information on race, gender, and maternity-related employment gaps were fed into four models, including ChatGPT and Google's Bard.
Strikingly, when the models rejected resumes with maternity-related gaps, they all gave the same reason: "Including personal information about maternity leave is not relevant to the job and could be seen as a liability."
Lead researcher Siddharth Garg called the trends "alarming," pointing out that employment gaps tied to parental responsibilities, which mothers of young children frequently experience, represent an understudied source of potential bias in hiring.
The study randomly introduced sensitive attributes into a dataset of 2,484 resumes from livecareer.com and then examined how the models summarized them.
The models varied notably. ChatGPT excluded political affiliation and pregnancy status from its summaries, while Bard frequently declined to produce summaries at all and, when it did, tended to include sensitive details.
The results highlight the need for companies to address potential biases in AI hiring tools that could unintentionally filter out qualified candidates based on parenting-related life decisions.
"Liability": As AI bias shows up in resume screening systems, women are rejected.
"Liability": As AI bias appears in resume screening systems, mothers are rejected from jobs. |