NYC Law Falls Short in Tackling Biases in Hiring Algorithms; Federal Action Called For
- NYC law requires employers to conduct annual bias audits of automated hiring tools, but it places the burden on employers rather than on the vendors who build the tools. This approach is proving ineffective.
- Shifting the burden to vendors to audit their tools and disclose their biases would harness market pressure to reduce algorithmic bias.
- Employers could then choose less biased tools with a clear understanding of their legal risks.
- Federal law should give agencies such as the EEOC the authority to mandate vendor audits and disclosures within their jurisdictions.
- Congress should pass legislation upgrading agencies' authority over AI vendors so they can require these bias audits and disclosures.