Lavender Haze: AI and the bombing of civilians in Gaza

Stop Killer Robots
8 min read · Apr 11, 2024


Erin Hunt is the Executive Director at Mines Action Canada, a member organisation of the Campaign to Stop Killer Robots. Her expertise includes the humanitarian impact of indiscriminate weapons, victim assistance, gender in disarmament and Canadian disarmament policy.

Photo from the current war in Gaza by Mohammed Ibrahim.

Last week +972 Magazine and Local Call reported on a new artificial intelligence system being used by Israel to identify targets in Gaza. Unlike the previously reported Habsora/Gospel system, which identified buildings and infrastructure, this system, called Lavender, uses artificial intelligence to identify individual people as targets. In the six months since Hamas’ massacre of Israelis on October 7, tens of thousands of Palestinians have been killed and injured by bombing and shelling, and these reports indicate that many of them were targeted based on information provided by AI systems.

These new systems are raising serious concerns and questions about the role of artificial intelligence in weapons. As a co-founder of Stop Killer Robots, Mines Action Canada (MAC) has long advocated against the growing use of AI in warfare, calling for new international law to prohibit autonomous weapons and to ensure meaningful human control over the use of force. The Lavender system is very concerning for several reasons, but many of those concerns are obscured by a haze of techno-confusion about what the new technology is and is not.

First, is Lavender an autonomous weapon?

No, it is not. Lavender uses artificial intelligence to identify and list people who fit a certain profile so that they can be targeted, but the weapons used against them are traditional conventional weapons. The weapons themselves are not autonomous, and humans are still required to choose which target from the list to attack and to oversee that attack. Lavender is doing the work of a human intelligence officer, not the work of a targeting or artillery officer, and not the work of the guidance system in a weapon.

Lavender, like Habsora, is not an autonomous weapon system.

If it’s not an autonomous weapon, what are you worried about?

The reporting on this system has raised several very serious concerns. To cut through the fog, let’s break the concerns down into those related to technology, those related to International Humanitarian Law, and those related to our work on the use of explosive weapons in populated areas.

Technology

MAC has serious concerns about the digital dehumanization that such a system is based on. The reporting indicates that Lavender lists targets based on analysis of data obtained through mass surveillance — boiling down a person to just their data is a problem that Stop Killer Robots has been discussing for a while.

Turning people into data dehumanizes them and makes it much easier to kill them, or even to kill the wrong person. Statements from Stop Killer Robots on Lavender and Habsora, as well as analysis from Dr. Lucy Suchman and Dr. Branka Marijan, highlight the serious technological concerns about these AI-driven target identification programs.

The use of AI to create targeting lists has concerned campaigners for years. From the very beginning of talks at the Convention on Certain Conventional Weapons (CCW), campaigners have warned about the risk of systems in which a human has only seconds to confirm a target selection. Since 2013, Stop Killer Robots has been telling states that advancing technology would lead to a situation where the AI’s output is not questioned. That appears to be happening with Lavender, with disastrous results for Palestinian civilians.

Taking humans out of the loop of targeting decisions is a major problem morally, ethically and for International Humanitarian Law.

Photo from the current war in Gaza by Emad El Byed.

International Humanitarian Law

Artificial intelligence has been the headline in the reporting about Lavender, but once you dig deeper into the article, some serious concerns emerge about International Humanitarian Law (IHL), particularly around accountability and two key requirements: proportionality and precaution in attack.

The use of artificial intelligence in systems like Lavender has serious implications for accountability under IHL, because some decision-making is delegated to an algorithm. As we can see from the reporting, there are questions about who is responsible for the decision-making process when Lavender is used to create the list of potential targets. This ambiguity is problematic: AI should not be used to shield individuals from accountability under IHL. When mistakes are made, who will be held accountable?

Accountability is not the only IHL principle that is challenged by Lavender and similar technology.

Proportionality is one of the more difficult concepts in IHL to understand. The International Committee of the Red Cross says, “The principle of proportionality prohibits attacks against military objectives which are expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated”. In other words, IHL recognizes that even attacks on military targets that follow all the rules can result in harm to civilians, so military commanders have to make sure the expected civilian harm is not excessive in relation to the anticipated military advantage.

There is no prescribed acceptable ratio between military benefit and civilian harm in IHL, so each country comes up with its own guidelines to help military personnel make this decision. With every military operating under its own proportionality math, proportionality can be difficult to assess. The guidelines are often kept confidential, which blurs things further.

However, the reporting on Lavender indicates that proportionality has been seriously skewed in the conflict in Gaza. The +972 article quotes two Israeli sources and reports that:

“for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.”

A ratio of one combatant to 15 civilians is exceptionally high, while a ratio of one commander to over 100 civilians is unheard of among responsible militaries. Civilians must be protected in times of armed conflict, and such ratios stretch the concept of proportionality beyond recognition.

The report on Lavender also raises questions about the requirement under IHL referred to as precaution in attack. Precaution in attack is exactly what it sounds like: the ICRC summarizes it as “even when an attack directed at a military objective is not expected to have excessive effects on the civilian population, all feasible precautionary measures must be taken to minimize those effects”.

The reported practice of intentionally targeting individuals identified by Lavender in their homes, without confirming the number of civilians present, appears to disregard the principle of precaution in attack. Precaution in attack and proportionality are key principles of International Humanitarian Law designed to protect civilians trapped in armed conflict. These challenges to IHL need to be thoroughly investigated by experts in international humanitarian law to ensure accountability.

Photo from the current war in Gaza by Emad El Byed.

Use of Explosive Weapons in Populated Areas

Beyond discussions of autonomy in weapons and IHL, the coverage of Lavender raised a number of major concerns regarding the use of explosive weapons in populated areas. As a co-founder of the International Network on Explosive Weapons (INEW), Mines Action Canada has been working for over a decade to limit the civilian harm caused by bombing and shelling in cities.

The report indicates that many of the targets identified by Lavender were attacked with so-called “dumb” bombs, unguided munitions whose wide-area effects are known to frequently cause immediate harm to civilians and reverberating harm to civilian infrastructure like housing, hospitals, shops, markets and schools.

These dumb bombs are then used on residences identified as the homes of Lavender’s targets by another AI system, called “Where’s Daddy?”. Where’s Daddy? uses surveillance data to determine, in real time, when individuals on the target list have returned to their homes. The combination of Lavender, Where’s Daddy? and dumb bombs has resulted in a huge number of civilian deaths and injuries. Bombing or shelling a home full of civilians to strike a single identified target is already problematic under the requirement to take precautions against civilian harm in attack and the principle of proportionality, and the use of large dumb bombs, as the reporting put it, “meant literally destroying the whole house on top of its occupants”.

The very large explosive weapons used to target Hamas officials have huge impact areas, leading to the destruction of large numbers of civilian buildings and hundreds of deaths.

This type of death and destruction is not an inevitable consequence of warfare, nor necessarily a result of the use of systems like Lavender.

Military personnel have choices: about when to target the people on the lists compiled by AI, about how much civilian harm to accept, about when to fire, and about what weapons to use. The choices made by the Israeli military have been responsible for the deaths and injuries of thousands of civilians in Gaza.

Lavender is an immensely problematic system. It is not an autonomous weapon, but it dehumanizes people, it reduces meaningful human control over the use of force, and it is known to make mistakes. It is possible that states will decide to include such AI-driven target identification systems in international discussions on autonomous weapons.

In the meantime, Mines Action Canada calls on all parties in Gaza to protect civilians. INEW’s call to stop bombing civilians has never been more needed. The international community must pressure Israel to meet its obligations under IHL, especially precaution in attacks. Aid needs to be able to reach those who are suffering.

Above all, when it comes to new technology in warfare, we all need to be able to see through the hazy techno-confusion and recognize that systems like Lavender dehumanize people, obscure accountability and weaken international humanitarian law, but that civilians are dying because of human decisions. While the world focuses on the high-tech aspects of this conflict, unimaginable harm to civilians continues.

We cannot let the Lavender haze distract us from the human decisions that lead to massive civilian casualties.

Originally published on 10 April on Mines Action Canada’s website.

Please note: While the Lavender system is not an autonomous weapon, its reported use by Israel in the Gaza Strip is deeply concerning from a legal, moral and humanitarian perspective. For Stop Killer Robots’ full comment on the Lavender system visit: https://www.stopkillerrobots.org/news/use-of-lavender-data-processing-system-in-gaza/
