According to intelligence sources involved in the conflict, the Israeli military's bombing campaign in Gaza deployed a previously undisclosed AI-powered database that, at one point, identified 37,000 possible targets based on their apparent links to Hamas.
Apart from discussing their employment of the artificial intelligence system named Lavender, the intelligence sources assert that Israeli military leaders approved the killing of several Palestinian civilians, especially in the initial weeks and months of the conflict.
Their candid testimony offers a rare first-hand account from Israeli intelligence officers who have used machine-learning systems to identify targets during the six-month conflict.
Israel's use of powerful AI systems in its war against Hamas has taken it into uncharted territory in advanced warfare, raising a host of legal and ethical questions and transforming the relationship between military personnel and machines.
One intelligence officer who used Lavender remarked, "This is unparalleled, in my memory," adding that they trusted a "statistical mechanism" more than a grieving soldier. "On October 7, everyone there, including me, lost loved ones. The machine did it coldly. And that made it easier."
"Is it really important for people to be involved in the selection process?" asked another Lavender user. "At this point, I would spend 20 seconds on each target and complete dozens of them each day. Apart from being an official seal of approval, I had no additional value as a human. It saved a great deal of time."
Their accounts were shared exclusively with the Guardian before publication. All six said Lavender played a central role in the war effort, sifting through vast amounts of data to rapidly identify possible "junior" operatives for targeted strikes. Four of the sources said that, at one point early in the conflict, Lavender listed as many as 37,000 Palestinian men whom the AI system had linked to Hamas or Palestinian Islamic Jihad (PIJ).
The Israel Defense Forces’ Unit 8200, an elite intelligence division akin to the US National Security Agency or the UK’s GCHQ, created Lavender.
Several of the sources described how, for certain categories of target, the IDF applied pre-authorized allowances for the number of civilians expected to be killed before a strike was carried out.