Questions Raised Over Israel's Use of AI to Target Gaza
- Israel reportedly used an AI-assisted system called Lavender to identify targets in Gaza, leading to civilian deaths.
- Lavender had a 10% error rate but was still used to fast-track the bombing of Hamas operatives.
- Legal experts say the AI targeting likely violates international humanitarian law.
- Israel approved killing up to 15-20 civilians per junior Hamas operative targeted; over 100 civilians could be killed to assassinate one commander.
- Israel is trying to sell these AI tools to foreign governments who admire rather than condemn its actions in Gaza.