What is ‘Lavender’, the AI program that Israel ‘used’ to create kill lists in Gaza?

FP Explainers April 4, 2024, 14:11:43 IST

A joint investigation by +972 Magazine and Local Call has revealed that the Israel Defense Forces used an AI program called Lavender to identify potential Hamas militants in the aftermath of last year’s attacks. The report claims the software marked over 37,000 people in Gaza as potential bombing targets

A man pushes a bicycle along as he walks amid building rubble in the devastated area in Gaza, amid the ongoing conflict between Israel and the Palestinian Hamas militant group. AFP

It’s been six months since Israel began its war against Hamas following its dastardly attacks on 7 October 2023. In the days following the surprise attack, the Israel Defense Forces (IDF) carried out a bombing campaign in the Gaza Strip, aiming to take out the Palestinian militants. The campaign has claimed the lives of over 30,000 people and reduced much of the Gaza Strip to debris.

And it is now alleged that these strikes were carried out with the help of a secret artificial intelligence system — dubbed Lavender — which generated kill lists in Gaza.


Israel has rejected the claims, stating that it does not use AI to designate people as targets for military strikes. “Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF said in a statement. “Information systems are merely tools for analysts in the target identification process.”

As the issue continues to gain traction, we take a closer look at what this so-called AI system is and how Israel allegedly deployed it in its ongoing war against Hamas.

AI program Lavender, explained

Israel’s alleged use of the AI program called Lavender emerged after Israeli-Palestinian publication +972 Magazine and Hebrew-language outlet Local Call carried out a joint investigation, citing six Israeli intelligence officials involved in the program’s development and use.

According to the investigation, Lavender used broad parameters to identify potential targets, designating about 37,000 people for potential air strikes. It reportedly used machine learning to identify characteristics of militants and, according to military sources, assigned people a score of 1-100 based on factors including association with suspected militants and how frequently they changed their phones.
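To make the mechanism described above concrete, here is a minimal, purely hypothetical sketch of how a feature-based 1-100 scoring classifier of this general kind can be built with machine learning. It is written in Python with scikit-learn; the feature names, synthetic data and model choice are illustrative assumptions and are not taken from the investigation or from any real system.

```python
# Illustrative only: a generic feature-based scoring classifier.
# The features, data and model below are hypothetical and do not describe
# Lavender; they merely show how a 1-100 "rating" can come out of an
# ordinary machine-learning model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical binary features per person (e.g. "appears in a flagged group
# chat", "frequent phone changes", "frequent address changes") -- made up here.
X_train = rng.integers(0, 2, size=(500, 3))
# Synthetic training labels for the sketch (1 = flagged in training data).
y_train = (X_train.sum(axis=1) + rng.normal(0, 0.5, 500) > 2).astype(int)

model = LogisticRegression().fit(X_train, y_train)

def score_1_to_100(features: np.ndarray) -> int:
    """Rescale the model's predicted probability into a 1-100 rating."""
    p = model.predict_proba(features.reshape(1, -1))[0, 1]
    return int(round(1 + 99 * p))

print(score_1_to_100(np.array([1, 1, 0])))  # prints a mid-to-high score
```

The point of the sketch is that such a rating is a statistical estimate derived from whatever features the model was trained on, not a verified identification, which is why the degree of human review applied to each score matters.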

A Palestinian boy reacts near the site of an Israeli strike on a house, amid the ongoing conflict between Israel and the Palestinian Islamist group Hamas, in Rafah, in the southern Gaza Strip. An investigation has revealed that the IDF used an AI tool, Lavender, to generate kill lists. File image/Reuters

The +972 Magazine-Local Call probe further revealed that the Lavender program worked alongside another AI system, The Gospel, which has previously been reported on. As per their findings, the fundamental difference between the two systems lies in the definition of the target: whereas ‘The Gospel’ marks buildings and structures that the army claims militants operate from, Lavender marks people and puts them on a kill list.


The program was developed, as per the joint investigation, by the Israel Defense Forces’ elite intelligence division, Unit 8200, which is comparable to America’s National Security Agency or the UK’s GCHQ.

The need for such a program arose, according to the officials, in the immediate aftermath of the 7 October 2023 attacks. Quoting the intelligence officers, the investigation said that commanders demanded a continuous stream of targets to attack.

“We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us,” said one intelligence officer to The Guardian. “We were told: now we have to f**k up Hamas, no matter what the cost. Whatever you can, you bomb.”

And in an attempt to meet their commanders’ demands, they came to rely on Lavender, which generated a database of individuals believed to have ‘militant’ characteristics.

A picture shows smoke billowing after Israeli bombardment in the vicinity of the Al-Shifa hospital in Gaza City. The investigation found that the software designated over 37,000 people in Gaza as Hamas militants. File image/AFP

How Israel used Lavender

The first-hand testimonies of Israel’s intelligence officers who worked with Lavender have revealed how the Benjamin Netanyahu-led forces used this technology in the ongoing war against Hamas.


According to them, Lavender played a significant role in the war, processing huge troves of data to quickly identify “junior” operatives to target. In fact, four officials, as per a report in The Guardian, said that at one stage of the war, Lavender had listed as many as 37,000 Palestinians as linked to either Hamas or Palestinian Islamic Jihad.

Yuval Abraham, the author of the investigation, wrote, “During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based.

“One source stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorising a bombing.”

Israeli soldiers operate in the Shajaiya district of Gaza City amid the ongoing conflict between Israel and the Palestinian Islamist group Hamas. The IDF used Lavender with little human oversight, claims the investigation. File image/Reuters

The magazine also reported the Israeli army “systematically attacked” targets in their homes, usually at night when entire families were present. “The result being that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions,” it wrote.


Moreover, the Israeli army, known for possessing high-value precision weapons, “preferred” to use dumb bombs, unguided munitions that can cause large-scale damage, while targeting these low-ranking Hamas militants.

The IDF also reportedly permitted that, for every junior Hamas operative identified by Lavender, it was acceptable to kill 15-20 civilians; for every senior operative, the number climbed to 100. “We had a calculation for how many [civilians could be killed] for the brigade commander, how many [civilians] for a battalion commander, and so on,” one source was quoted as telling The Guardian.

A boy reacts next to the bodies of Palestinian men killed in an Israeli strike in Gaza City. The investigation claims the IDF permitted the killing of 15-20 civilians for every junior Hamas operative identified by Lavender, and up to 100 for a senior operative. File image/Reuters

IDF rebuts claims

Israel has, however, rejected the findings of the +972 Magazine-Local Call investigation. The IDF denied using AI to generate kill lists, though notably, it did not dispute the existence of the Lavender program. In a lengthy statement, it said: “The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process.”


It further argued that Lavender was simply a database used to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organisations. “This is not a list of confirmed military operatives eligible to attack,” it said.

Lt Col Peter Lerner, an IDF spokesperson, slammed the reportage, calling it false. On the claims that the IDF permitted the deaths of 15-20 civilians for a junior Hamas operative, he wrote on X: “No Hamas individual was targeted with an expected 100 civilian casualties. No Hamas individual was automatically approved for attack with an expected 15-20 casualties.”


Implications for the war

The allegations come at a very critical time for Israel. The country is under immense pressure from even its supporters over the rising number of civilian casualties. In fact, Israel is facing a new round of criticism following the death of seven World Central Kitchen aid workers in an Israeli airstrike in Gaza on Monday.

Also read: How the death of World Central Kitchen workers in Israel airstrike imperils Gaza aid

Experts speaking to The Guardian said that it was immensely worrying that the IDF had accepted and pre-authorised collateral damage ratios as high as 20 civilians.

One such expert is Sarah Harrison, a former lawyer at the US Department of Defense, and now an analyst at Crisis Group. She told The Guardian: “While there may be certain occasions where 15 collateral civilian deaths could be proportionate, there are other times where it definitely wouldn’t be. You can’t just set a tolerable number for a category of targets and say that it’ll be lawfully proportionate in each case.”

It remains to be seen how Israel responds further, but as the war passes the six-month mark, it is unknown when it will end.

With inputs from agencies
