Flawed AI Job Screening Tools Perpetuate Biases, Need Increased Scrutiny
- AI is increasingly used to screen job applicants, analyzing video, audio, and text to decide whether candidates should advance. But the science behind these tools is questionable.
- HR software vendors promise that AI can make better hiring decisions than humans, but in practice the tools often replicate existing biases or introduce new ones.
- Author Hilke Schellmann investigated various AI hiring tools and found major flaws, such as a transcription tool awarding high English-proficiency scores to speech delivered in German.
- The algorithms are trained on human-generated data and so reflect existing biases; audits have uncovered problematic variables tied to race, names, and disabilities.
- Job seekers can push back by optimizing resumes for machine readability, using generative AI for cover letters, and advocating for greater transparency and regulation of hiring algorithms.