Police Bias in Rape Cases Exposed by AI Analysis of Reports
-
Cleveland State University professor Rachel Lovell used AI to analyze thousands of police reports on rape and measure officer bias. The analysis found that subjective, detail-rich reports were associated with more prosecutions, while strictly factual reports failed to convey the brutality of the assaults.
-
AI and machine learning let criminal justice researchers analyze vast amounts of text data and surface patterns that would otherwise stay hidden. These methods hold considerable promise for policing research.
-
Lovell found that many officers expressed doubt about victims in their reports, using vague language and invoking rape myths. She wondered whether this skepticism affected how cases were handled.
-
Lovell fed over 4 million words from 5,000 reports into a text-mining program to quantify each report's subjectivity and word patterns.
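A minimal sketch of this kind of analysis, assuming TextBlob for subjectivity scoring and a simple frequency count for word patterns (the article does not name the specific tools Lovell used):

```python
from collections import Counter
from textblob import TextBlob  # pip install textblob

def analyze_report(text: str) -> dict:
    """Score one report for subjectivity and surface its word patterns."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    return {
        "word_count": len(words),
        # TextBlob subjectivity: 0.0 = fully objective, 1.0 = fully subjective
        "subjectivity": TextBlob(text).sentiment.subjectivity,
        "top_words": Counter(words).most_common(10),
    }

report = (
    "The victim stated she was terrified and repeatedly told the "
    "suspect to stop. Officers observed visible bruising on her arms."
)
print(analyze_report(report))
```

Run over thousands of reports, scores like these can then be compared against case outcomes.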
-
Subjective reports containing more victim-focused detail were associated with more prosecutions. Strictly factual reports tended to be shorter and to use case-closing language such as "insufficient evidence."
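To show how such a comparison might be quantified, here is a hedged sketch that flags case-closing phrases and summarizes report length by outcome; the phrase list and the (report_text, was_prosecuted) data format are illustrative assumptions, not Lovell's actual coding scheme:

```python
from statistics import mean

# Hypothetical closing-language markers, not Lovell's published coding scheme
CLOSING_PHRASES = ["insufficient evidence", "victim uncooperative", "unfounded"]

def has_closing_language(text: str) -> bool:
    """Return True if the report contains any assumed case-closing phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in CLOSING_PHRASES)

def summarize_by_outcome(reports: list[tuple[str, bool]]) -> None:
    """reports: (report_text, was_prosecuted) pairs -- an assumed data format."""
    for prosecuted in (True, False):
        group = [text for text, outcome in reports if outcome == prosecuted]
        if not group:
            continue
        avg_len = mean(len(text.split()) for text in group)
        closed = sum(has_closing_language(text) for text in group)
        label = "prosecuted" if prosecuted else "not prosecuted"
        print(f"{label}: avg {avg_len:.0f} words, "
              f"{closed}/{len(group)} with closing language")
```

A summary like this would make the reported pattern visible: shorter average length and more closing language in the non-prosecuted group.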