New study investigates bias in AI resume screening

Soundside - A podcast by KUOW News and Information

Getting an interview for a new job has always been tough. For candidates, the challenge is crafting a cover letter and resume that make you stand out. For employers, the challenge is often volume: finding the right candidate with the right qualifications in a stack of applications.

Now, inject AI into the mix. Large language models (LLMs for short) are being used to help automate some of the tedium of sorting through candidate resumes. But according to a new study from the University of Washington, even the most cutting-edge AI shows bias.

Soundside was joined by Kyra Wilson, a doctoral student at the University of Washington Information School. Wilson co-authored the study with Aylin Caliskan, an assistant professor in the iSchool.

Guests:

Kyra Wilson, doctoral student at the University of Washington Information School

Related Links:

AI tools show biases in ranking job applicants' names according to perceived race and gender | UW News

Gender, Race, and Intersectional Bias in Resume Screening via Language Model Retrieval