Report Alleges Israeli Use of Controversial Database in Gaza Strikes
- Israel allegedly used an AI-powered system called "Lavender" to select assassination targets in Gaza, amid a bombing campaign in which more than 33,000 Palestinians have been killed.
- Lavender reportedly analyzed mass surveillance data to mark some 37,000 Palestinians as suspects, identifying them and their homes as potential airstrike targets.
- Human oversight of Lavender's selections was minimal: officers reportedly spent only about 20 seconds reviewing each target before authorizing a strike.
- Israel reportedly dropped unguided "dumb" bombs on many Lavender-selected targets to avoid "wasting" precision munitions.
- Israel maintains that Lavender is merely a database for cross-referencing intelligence sources, not an AI targeting system, while human rights advocates argue its use enabled grave violations of international law.