Israel's AI-powered targeting system under scrutiny over civilian deaths in Gaza
- Israel used an AI system called Lavender to identify as many as 37,000 potential Hamas targets in Gaza by analysing data on suspected links to militant groups. Intelligence sources claim its use enabled large numbers of civilian deaths.
- Lavender was developed by Israel's elite intelligence division, Unit 8200, to rapidly process data and identify potential junior Hamas and Palestinian Islamic Jihad (PIJ) operatives as targets.
- Sources said Israel permitted up to 20 civilian deaths per junior militant targeted, using unguided "dumb bombs" that destroyed entire homes.
- At its peak, Lavender marked 37,000 people as possible targets; after its algorithm was refined, officers claimed a 90% accuracy rate, and thousands remained marked for potential strikes.
- Sources question the morality of the approach, citing indiscriminate bombing and mass civilian deaths. One said: "no one thought about what to do afterward, when the war is over."