Concerns Raised Over Israel's Use of AI to Generate Airstrike Targets in Gaza
- Israel allegedly used an AI system called Lavender to generate tens of thousands of bombing targets in Gaza, leading to high civilian casualties.
- Lavender and other AI systems like Habsora automate parts of the targeting process, with limited human oversight.
- These systems reportedly make errors in around 10% of cases and lead to strikes on homes and civilians.
- There are currently no clear international laws or rules governing the use of military AI systems like those reportedly used in Gaza.
- How the world responds to uses of military AI now will likely set a precedent for responsible development of these technologies in the future.