Investigative Platform Seeks Algorithm Accountability Amid Growing Government Automation
- Daniel Howden founded Lighthouse Reports to investigate how governments use algorithms to make decisions affecting people's rights. The platform partners with media outlets to publish investigations.
- Governments are widely implementing automated decision systems without public consultation or oversight, in areas such as criminal justice and welfare benefits.
- These systems often reproduce societal biases because their training data encodes them. Addressing that bias requires accountability and an understanding of how the systems work.
- AI systems are already affecting human and civil rights in areas such as welfare fraud detection, criminal sentencing, and employment and loan applications.
- Howden argues governments should regulate AI as they do other industries, with transparency and auditing requirements, rather than letting tech companies self-regulate under "light touch" rules.