Commentary: War by algorithm raises new moral dangers
The IDF has contested several aspects of this report. “Contrary to Hamas, the IDF is committed to international laws and acts accordingly,” the IDF said in a statement. The “system”, it said, was a database that human analysts used to verify identified targets. “For each target, IDF procedures require conducting an individual assessment of the anticipated military advantage and collateral damage expected.”
What is indisputable is the horrifying loss of civilian life in Gaza in the first weeks of the war that followed the murderous Hamas attack on Israel on Oct 7, 2023. According to Palestinian authorities, 14,800 people, including about 6,000 children and 4,000 women, were killed in Gaza before the temporary ceasefire of Nov 24, the period when Lavender was most heavily used.
DEBATE ABOUT MILITARY USES OF AI
Many aspects of the Israel-Gaza tragedy are unique, born of the region’s tangled history, demography and geography. But Israel is also one of the world’s most technologically advanced nations, and the way it wages war feeds into the global debate about the military uses of AI. That debate pits so-called realists against moral absolutists.
Realists argue that AI is a dual-use technology that can be deployed in myriad ways, for good and ill. Few, for example, would contest its use in defensive weapons such as Israel’s Iron Dome, which has intercepted multiple rocket attacks from Gaza.
But the Lavender system appears to have contributed to a “dramatically excessive” rate of collateral damage that is morally indefensible, says Tom Simpson, a former Royal Marine who is now a philosophy professor at the University of Oxford.
Source: CNA