News: RAW researchers review how Israeli AI system ‘Lavender’ is directing airstrikes in Gaza

Last week, the Israeli-Palestinian +972 Magazine revealed that the Israeli army has been using an AI-enabled decision-support system, ‘Lavender’, to identify targets in its post-October 7 airstrikes in Gaza, which have thus far killed more than 33,000 people. Through the testimonies of six intelligence officers, the article reveals deeply concerning realities of minimal human oversight and permissive policies regarding civilian harm in conjunction with the use of AI. The report relays that the system, which at its peak had identified 37,000 potential human targets, continuously marks Palestinians as suspects for assassination. The intelligence officers reflect on how limited their oversight of the system’s targeting decisions was: “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”

To make sense of the implications of this technology, researchers from the Realities of Algorithmic Warfare (RAW) programme appeared on multiple international media platforms to analyse how this recent development sets new precedents for the role of algorithmic technologies in warfare.

Jessica Dorsey, who was recently appointed as Expert Advisor to the Global Commission on Responsible Artificial Intelligence in the Military Domain (REAIM), co-authored an article for OpinioJuris and was interviewed by the UK’s Times Radio. In her appearances, Dorsey critically reflects on how AI-driven systems such as Lavender enable war and destruction to take place at increasing speed and scale. She highlights the risks that these unregulated systems pose with regard to civilian harm and international humanitarian law, and underscores the urgent need for regulation of AI-enabled decision-support systems.

Similarly, RAW’s Marijn Hoijtink points to diminished levels of human oversight in Israel’s application of the Lavender system in an interview with the Belgian radio channel Radio1. She voices concern about Lavender’s accuracy: the Israeli intelligence services claim the system was tested once in the early weeks of the campaign and found to be 90 percent accurate, but this cannot be independently verified, which is deeply problematic for a system determining matters of life and death. According to Hoijtink, these figures raise questions about proportionality in the context of war. She emphasises that Israel, like every other nation, has an obligation to thoroughly verify targets in order to protect civilians and non-military infrastructure.

Follow the references below to explore Hoijtink and Dorsey’s contributions to the international debate, or listen to Dorsey’s radio interview through the audio player.

Bo, M., and J. Dorsey. 2024. ‘Symposium on Military AI and the Law of Armed Conflict: The “Need” for Speed – The Cost of Unregulated AI-Decision Support Systems to Civilians’. OpinioJuris.

‘“Lavendel”, “Het Evangelie” en “Where’s Daddy?”: Hoe Israël AI gebruikt om Palestijnse doelwitten te bombarderen’ [‘“Lavender”, “The Gospel” and “Where’s Daddy?”: How Israel Uses AI to Bomb Palestinian Targets’]. 2024. Radio1.

Read also:
Hijink, M. ‘AI is de killer-app in Gaza en Oekraïne’ [‘AI Is the Killer App in Gaza and Ukraine’]. NRC, 19 April 2024.