The Realities of Algorithmic Warfare (RAW) project takes an inter- and transdisciplinary approach to the realities of increasing autonomy in warfare through artificial intelligence. Drawing on Conflict Studies, Media and Cultural Studies, and Law, we explore how integrating algorithms into existing military technology paves the way to greater ludification, remoteness, and autonomy in war. This brings opportunities as well as serious risks, both on the battlefield and to fundamental building blocks of democratic societies such as transparency, accountability, and the rule of law.
As AI and other emerging disruptive technologies are increasingly integrated into all aspects of human life, advanced militaries worldwide have found themselves in what some call an AI arms race, feeding into the third revolution in warfare. In this context, militaries such as those of the US, Russia, China, Turkey, and several EU countries, including the Netherlands, have been developing, experimenting with, and deploying technologies with various levels of autonomy across numerous battlefields in Libya, Syria, Mali, Ukraine, and beyond.
While developers and armed forces promise more effective military engagement through increased speed and precision, academic and policy debates tend to focus on the threat of fully autonomous weapons making life-and-death decisions.
These debates are fed by a growing number of weapon systems capable of autonomously identifying targets through machine learning algorithms, including the Agile Condor pods on MQ-9 UAVs, loitering munitions, Hivemind AI on VBATs, uncrewed ships, and drone swarms.
To enable an informed debate amongst all stakeholders involved in developing, analysing, and regulating autonomous weapon systems, it is crucial to reflect on questions such as: By whom, and how, are these weapon systems being developed? What problems are these technologies seen as a solution for? How are their operators trained? How autonomous are these systems in reality? How and where are they being experimented with and deployed, and to what effect? What impact do they have on the legitimacy, transparency, and accountability of operations? How and why should they be regulated? And how are they changing the character of warfare?
The Realities of Algorithmic Warfare project will engage in in-depth research and debate amongst developers, investigators, and regulators to reflect on how this affects security, human rights, and democratic principles. By engaging with a variety of stakeholders across disciplinary boundaries, we will collectively advance the academic and policy debates on the lived realities of increasing autonomy in war.
This project is run by IRW's Dr. Lauren Gould, Dennis Jansen (PhD), and Jessica Dorsey (PhD).