Posted 4/5/2024, 6:52:48 PM
Israeli Military's AI Bombing System Linked to High Civilian Casualties in Gaza
- Israeli military used AI system "Lavender" to select bombing targets in Gaza with minimal human oversight
- System had a roughly 10% error rate in identifying militants, yet its recommendations were approved with only cursory human review
- Resulted in thousands of civilian casualties, including women and children
- Operating rules permitted killing up to 20 civilians for each junior Hamas operative targeted
- Use of unguided "dumb bombs" against low-value targets selected by the AI led to high collateral damage