Investigative Reporter Goes Undercover, Finds AI Hiring Tools Failing to Curb Bias and Identify Top Candidates
- Investigative reporter Hilke Schellmann went undercover to test AI hiring tools and found they often fail at removing bias and identifying the best candidates.
- Schellmann gives examples of tools producing nonsensical results, such as one rating her highly for a job even though she answered its questions by speaking nonsense in German.
- HR tech conferences show that these tools form a massive, rapidly growing market, with employers quickly adopting them to screen candidates.
- Schellmann argues that employers and vendors avoid scrutinizing these tools: they have little incentive to look closely and fear legal liability if bias is found.
- Possible solutions include government mandates for greater algorithmic transparency and external audits, as well as job seekers using AI such as ChatGPT to optimize their applications.