Questions Raised Over Israel's Use of AI in Gaza Bombing Campaign
• Israeli military reportedly using an AI tool called "Lavender" to help identify bombing targets in Gaza, though the IDF denies using AI to identify terrorists specifically
• Tool reportedly had a 10% error rate; human oversight of suggested targets was minimal, at about 20 seconds per target
• Reporting cites six Israeli intelligence officials involved in the program who say human sign-off was essentially a "rubber stamp"
• When targeting alleged junior militants, the Israeli army reportedly preferred unguided "dumb bombs," which cause more indiscriminate damage
• IDF statement denies causing excessive civilian casualties, saying it reviews targets carefully before strikes and chooses munitions based on both operational and humanitarian considerations