Posted 4/7/2024, 4:27:20 PM
Concerns Over Israel's Use of AI in Gaza Airstrikes
- Israel's use of the AI software "Lavender" to identify airstrike targets in Gaza, with concerns over a lack of human oversight and a high civilian death toll
- Lavender has a 10% error rate in targeting, and Israeli commanders deemed it acceptable to kill as many as 15-20 civilians per low-level Hamas target
- Lavender excels at tracking targets to their homes, where wives, children, and relatives would be killed even if the target himself was not home
- The devolution from "eye-for-an-eye" justice to a 20-to-1 civilian kill ratio under AI targeting is morally questionable
- The ultimate issue is the lack of human values programmed into the AI; it reflects our own lack of empathy and our failure to cherish all human life equally