
Opinion | What War by A.I. Actually Looks Like


Like the invasion of Ukraine, the ferocious offensive in Gaza has looked at times like a throwback, in some ways more closely resembling a 20th-century total war than the counterinsurgencies and smart campaigns to which Americans have grown more accustomed. By December, nearly 70 percent of Gaza’s homes and more than half its buildings had been damaged or destroyed. Today fewer than one-third of its hospitals remain functioning, and 1.1 million Gazans are facing “catastrophic” food insecurity, according to the United Nations. It may look like an old-fashioned conflict, but the Israel Defense Forces’ offensive is also an ominous hint of the military future — both enacted and surveilled by technologies arising only since the war on terrorism began.

Last week +972 and Local Call published a follow-up investigation by Abraham, which is very much worth reading in full. (The Guardian also published a piece drawing from the same reporting, under the headline “The Machine Did It Coldly.” The reporting has been brought to the attention of John Kirby, the U.S. national security spokesman, and has been discussed by Aida Touma-Sliman, an Israeli Arab member of the Knesset, and by the United Nations secretary general, António Guterres, who said he was “deeply troubled” by it.) The November report describes a system called Habsora (the Gospel), which, according to the current and former Israeli intelligence officers interviewed by Abraham, identifies “buildings and structures that the army claims militants operate from.” The new investigation, which has been contested by the Israel Defense Forces, documents another system, known as Lavender, used to compile a “kill list” of suspected combatants. The Lavender system, he writes, “has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.”

Functionally, Abraham suggests, the destruction of Gaza — the killing of more than 30,000 Palestinians, a majority of them civilians, including more than 13,000 children — offers a vision of war waged by A.I. “According to the sources,” he writes, “its influence on the military’s operations was such that they essentially treated the outputs of the A.I. machine ‘as if it were a human decision,’” though the algorithm had an acknowledged 10 percent error rate. One source told Abraham that humans would normally review each recommendation for just 20 seconds, “just to make sure the Lavender-marked target is male,” before giving the recommendation a “rubber stamp.”

The more abstract questions raised by the prospect of A.I. warfare are unsettling, concerning not just machine error but also ultimate responsibility: Who is accountable for an attack or a campaign conducted with little or no human input or oversight? But while one nightmare about military A.I. is that it is handed control of decision making, another is that it simply helps armies become more efficient at the decisions they are already making. And as Abraham describes it, Lavender is not wreaking havoc in Gaza on its own misfiring accord. Instead it is being used to weigh likely military value against collateral damage in very particular ways — less like a black-box oracle of military judgment or a black hole of moral responsibility and more like the revealed design of the war aims of the Israel Defense Forces.

At one point in October, Abraham reports, the Israel Defense Forces targeted junior combatants identified by Lavender only if the likely collateral damage could be limited to 15 or 20 civilian deaths — a shockingly large number, given that no collateral damage had previously been considered acceptable for low-level combatants. More senior commanders, Abraham reports, would be targeted even if it meant killing more than 100 civilians. A second program, called Where’s Daddy?, was used to track the combatants to their homes before targeting them there, Abraham writes, because striking them at those locations, along with their families, was “easier” than tracking them to military outposts. And increasingly, to avoid wasting smart bombs on the homes of suspected junior operatives, the Israel Defense Forces chose to use much less precise dumb bombs instead.


