4 Apr, 2024 16:54

Israel using AI to pick targets in Gaza – report

The program is reportedly designed to detect Hamas operatives, but Israeli military sources say it often marks innocents for death

The Israeli military is using artificial intelligence to mark suspected Palestinian militants for assassination with little human oversight or regard for civilian casualties, the Israeli-Palestinian +972 Magazine reported on Wednesday.

The AI system, known as ‘Lavender’, is designed to comb through the personal data of Gaza’s two million residents to draw up lists of those suspected of serving in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), six Israeli intelligence officers told the magazine.

The IDF has never publicly acknowledged the existence of the system, but has been known to use similar software during previous operations in Gaza.

At the outset of Israel’s ongoing war on Hamas, Lavender marked 37,000 Palestinians as militants and placed them on kill lists, the sources claimed. Whereas Israel Defense Forces (IDF) personnel initially pored over these lists and manually verified each name, humans soon came to serve as rubber stamps for the machine’s lists, one source said.

“I would invest 20 seconds for each target at this stage, and do dozens of them every day,” the officer said. “I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time.”

Lavender works by studying the phone records, social media activity, photographs, and movements of known Palestinian militants, identifying common characteristics, and then searching for those characteristics among the wider population of Gaza. The system gives each Gazan a score between 0 and 100, with those ranked near 100 deemed to be terrorists and therefore legitimate targets. 
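The report does not describe Lavender’s internals, so the following is only a minimal, purely illustrative sketch of the general idea outlined above: building a reference profile from known examples and scoring everyone else by similarity on a 0–100 scale. Every feature name, number, and function here is invented for illustration and does not reflect the actual system.

```python
# Purely illustrative sketch of similarity-based scoring on a 0-100 scale.
# All feature names and values are hypothetical placeholders.
from dataclasses import dataclass
from math import sqrt


@dataclass
class Profile:
    # Hypothetical normalized features (e.g. contact counts, posting
    # activity, movement frequency). Placeholder names only.
    contacts: float
    posts: float
    movements: float

    def as_vector(self) -> list[float]:
        return [self.contacts, self.posts, self.movements]


def average_profile(examples: list[Profile]) -> list[float]:
    """Average the feature vectors of known examples into one reference vector."""
    vectors = [p.as_vector() for p in examples]
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity; lies in [0, 1] for non-negative feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)


def score(person: Profile, reference: list[float]) -> int:
    """Map similarity to the reference profile onto a 0-100 score."""
    return round(100 * cosine_similarity(person.as_vector(), reference))


if __name__ == "__main__":
    known = [Profile(0.9, 0.8, 0.7), Profile(0.8, 0.9, 0.6)]
    reference = average_profile(known)
    print(score(Profile(0.85, 0.8, 0.65), reference))  # close match -> high score
    print(score(Profile(0.0, 0.05, 0.9), reference))   # weaker match -> lower score
```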

Within weeks of the war breaking out, however, IDF commanders were allegedly instructing their subordinates to relax the selection criteria and approve strikes on targets only tangentially linked with Hamas. 

“We were told: now we have to f**k up Hamas, no matter what the cost. Whatever you can, you bomb,” one source recalled.

Once marked for assassination, low-level targets would be taken out in their homes – identified using a different AI system called ‘Gospel’ – with unguided bombs, while more precise munitions would be used on higher-ranking militants.

“At 5 am, [the air force] would come and bomb all the houses that we had marked,” a source said. “We took out thousands of people. We didn’t go through them one by one – we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”

Lavender reportedly ranks targets by their perceived importance, with one source alleging that lower priority names often include policemen, civil servants, and others who “help the Hamas government, but they don’t really endanger [Israeli] soldiers.”

When a suspect is chosen and an assassination order given, IDF commanders decide how many civilian casualties they deem acceptable to take out the target. According to a source, this number “went up and down” over time, with “20 uninvolved civilians” deemed an acceptable sacrifice at the beginning of the war, and up to 100 considered acceptable in strikes on top-ranking Hamas officials. 

“It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law,” they said. “But they directly tell you: ‘You are allowed to kill them along with many civilians.’”

According to the latest figures from Gaza’s health ministry, Israeli forces have killed more than 33,000 people in nearly six months of fighting in the enclave, most of them women and children. Responding to +972 Magazine’s claims, the IDF said on Wednesday that it “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” and that it “outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”
