Is artificial intelligence being used by Israel to select targets for bombing in Gaza?

An investigation by +972 Magazine and Local Call, citing six Israeli intelligence officials involved in the alleged programme, claims that the Israeli military has been using artificial intelligence to assist in identifying bombing targets in Gaza. The officials also claim that human review of the suggested targets was at best cursory.

According to the officials, who were quoted at length in the investigation by the joint Israeli-Palestinian online publication, the AI-based tool was known as “Lavender” and had a roughly 10% error rate.

The investigation comes amid intensifying international scrutiny of Israel’s military campaign, after targeted air strikes killed several foreign aid workers delivering food in the Palestinian enclave. Israel’s siege of Gaza has killed at least 32,916 people, according to the Gaza Ministry of Health, and has led to a spiraling humanitarian crisis in which nearly three-quarters of the population of northern Gaza is suffering from catastrophic levels of hunger, according to a United Nations-backed report.

The investigation’s author, Yuval Abraham, previously told CNN in January about his reporting on how the Israeli military has been ”heavily relying on artificial intelligence to generate targets for such assassinations with very little human supervision.”

The Israeli military “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF said in a statement on Wednesday. But its analysts use a “database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations.”

Human officers are then responsible for verifying “that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” according to the IDF statement, a process also described by +972. The magazine also reported that the Israeli army “systematically attacked” targets in their homes, usually at night when entire families were present.

“The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions,” it wrote. The report, citing sources, said that when alleged junior militants were targeted, “the army preferred” to use so-called dumb bombs, unguided munitions that can cause large-scale damage.

