Lavender: Israel offers glimpse into terrifying world of military AI


It’s hard to concoct a more airy sobriquet than this one. A new report published by +972 magazine and Local Call alleges that Israel has used an AI-powered database to select suspected Hamas and other militant targets in the besieged Gaza Strip. According to the report, the tool, trained by Israeli military data scientists, sifted through a huge trove of surveillance data and other information to generate targets for assassination. It may have played a major role particularly in the early stages of the current war, as Israel conducted relentless waves of airstrikes on the territory, flattening homes and whole neighborhoods. At last count, according to the Gaza Health Ministry, more than 33,000 Palestinians, the majority of them women and children, have been killed in the territory.

The AI tool’s name? “Lavender.”

This week, Israeli journalist and filmmaker Yuval Abraham published a lengthy exposé on the existence of the Lavender program and its implementation in the Israeli campaign in Gaza that followed Hamas’s deadly Oct. 7 terrorist strike on southern Israel. Abraham’s reporting — which appeared in +972 magazine, a left-leaning Israeli English-language website, and Local Call, its sister Hebrew-language publication — drew on the testimony of six anonymous Israeli intelligence officers, all of whom served during the war and had “first-hand involvement” with the use of AI to select targets for elimination. According to Abraham, Lavender identified as many as 37,000 Palestinians — and their homes — for assassination. (The Israel Defense Forces denied to the reporter that such a “kill list” exists, and characterized the program as merely a database meant for cross-referencing intelligence sources.) White House national security spokesperson John Kirby told CNN on Thursday that the United States was looking into the media reports on the apparent AI tool.


“During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based,” Abraham wrote.

“One source stated that human personnel often served only as a ‘rubber stamp’ for the machine’s decisions, adding that, normally, they would personally devote only about ‘20 seconds’ to each target before authorizing a bombing — just to make sure the Lavender-marked target is male,” he added. “This was despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.”

This may help explain the scale of destruction unleashed by Israel across Gaza as it seeks to punish Hamas, as well as the high casualty count. In earlier rounds of Israel-Hamas conflict, the IDF pursued a more protracted, human-driven process of selecting targets based on intelligence and other data. At a moment of profound Israeli anger and trauma in the wake of Hamas’s Oct. 7 attack, Lavender could have helped Israeli commanders come up with a rapid, sweeping program of retribution.

“We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us,” said one intelligence officer, in testimony published by Britain’s Guardian newspaper, which obtained access to the accounts first surfaced by +972.

Many of the munitions Israel dropped on targets allegedly selected by Lavender were “dumb” bombs — heavy, unguided weapons that inflicted significant damage and loss of civilian life. According to Abraham’s reporting, Israeli officials didn’t want to “waste” more expensive precision-guided munitions on the many junior-level Hamas “operatives” identified by the program. They also showed little squeamishness about dropping those bombs on the buildings where the targets’ families slept, he wrote.


“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” A, an intelligence officer, told +972 and Local Call. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Concerns about Israel’s targeting strategies and methods have been voiced widely throughout the war. “It is challenging in the best of circumstances to differentiate between valid military targets and civilians” there, Brian Castner, senior crisis adviser and weapons investigator at Amnesty International, told my colleagues in December. “And so just under basic rules of discretion, the Israeli military should be using the most precise weapons that it can that it has available and be using the smallest weapon appropriate for the target.”

In reaction to the Lavender revelations, the IDF said in a statement that some of Abraham’s reporting was “baseless” and disputed the characterization of the AI program. It is “not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations,” the IDF wrote in a response published in the Guardian.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”

This week, an Israeli drone strike on a convoy of vehicles belonging to World Central Kitchen, a prominent food aid group, killed seven of its workers, sharpening the spotlight on Israel’s conduct of the war. In a phone call with Israeli Prime Minister Benjamin Netanyahu on Thursday, President Biden reportedly called on Israel to change course and take demonstrable steps to better preserve civilian life and enable the flow of aid.


Separately, hundreds of prominent British lawyers and judges submitted a letter to their government, urging a suspension of arms sales to Israel to avert “complicity in grave breaches of international law.”

The use of AI technology is still only a small part of what has troubled human rights activists about Israel’s conduct in Gaza. But it points to a darker future. Lavender, observed Adil Haque, an expert on international law at Rutgers University, is “the nightmare of every international humanitarian lawyer come to life.”
