This round-up looks at the military-tech complex, including the exploration of robots on the battlefield, the acquisition of armed drones, and the consequences of drone warfare for the pilots operating them. This round-up is written by Aoife Keogh.
French military exploring the use of robots on the battlefield
A recent report by the French Ministry of Defence has stated that the French military should be authorised to use autonomous weapons, including so-called killer robots, under specific conditions.
The report, which was commissioned by the French Minister of Defence, Florence Parly, comes from the Defence Ethics Committee, a joint military and civilian body. Independent experts were critical of the report and warned that the line between partial and full autonomy may become blurred in combat situations.
According to The Times, the report “set out one of the most elaborate moral cases by a western state for the use of artificial intelligence systems that are expected to transform warfare.” The committee argues that the French military could benefit from partially autonomous weapons capable of identifying and engaging targets, while keeping a human operator informed who retains the option to stop the device.
The committee’s justification for this argument was the need to keep up with the evolution of contemporary warfare. Militaries around the world have become increasingly interested in the capabilities of artificial intelligence on the battlefield. In an article for De Correspondent, Lennart Hofman, a valued member of the IRW network, discussed the race to bring AI onto the battlefield and the reasons why militaries are increasingly looking to artificial intelligence for the latest developments in weaponry. The article quotes Dr. Kenneth Payne of King’s College London, who researches the influence of artificial intelligence on warfare. Payne sums up the reasoning behind this new arms race:
“The idea that opponents will get hold of a very powerful weapon that you don’t know exactly what it can do and how effective it is is the main driver of investment in AI.”
The French report is the result of a debate which was sparked by the French military’s use of the robotic dog ‘Spot’ in training simulations at Saint-Cyr Military College last month. The robotic dog was being trialed in training simulations for urban warfare.
Spot was created by the US company Boston Dynamics. The robotic dog is equipped with cameras which can be remotely operated. Michael Perry, Boston Dynamics Vice President, told The Verge that Spot had been acquired by the French military through European distributor Shark Robotics and that Boston Dynamics had not been notified of the robot’s possible redeployment to military operations. Boston Dynamics stress that, as part of their sales agreements, the robotic dog may never be armed, and that they do not condone using Spot on the battlefield.
It’s worth noting that Boston Dynamics have sold the robotic dog to police departments. According to The New York Times, Spot had been acquired by the New York Police Department for the purpose of entering environments too dangerous for humans, to detect and alert officers to potential danger. However, the N.Y.P.D. are no longer using the robotic dog after public backlash over Spot’s deployment during a home invasion.
The N.Y.P.D. are not the only police force to acquire Boston Dynamics’ robotic dog: last month Reuters reported that the Dutch police were now using Spot. The Dutch police force claim the robotic dog will be used primarily for drug lab seizures. In the article, Marjolein Smit, head of the Division for Special Operations, is quoted saying: “A drug lab is always risky for us because there are always dangerous substances involved, but also possibly a criminal with a firearm.” The Dutch are the first police force in Europe to use the robotic dog.
The use of Spot by police departments and now in military training illustrates something the IRW is conducting new research on: the empirical observation of how technologies move back and forth across localities of engagement, both across the North-South divide and between the military and police services.
Innovations in the field of artificial intelligence, and machine learning in particular, were often developed in the civilian domain and applied in industries such as healthcare (e.g. medical diagnosis), transportation (autonomous driving), and finance (e.g. stock price prediction). They are increasingly crossing over to law enforcement (e.g. predictive policing) and the military domain (e.g. automated drone footage analysis), too. This is the military-tech complex, in which technologies developed for commercial ends are increasingly financed by or sold to the military to be used on the battlefield without the consent or knowledge of those who developed them.
This raises the question: will Spot or other robots be deployed on the battlefield in the future?
The Guardian addresses this question in its article “Machines set loose to slaughter: the dangerous rise of military AI”. The article describes the use of machine learning in other sectors, such as credit card fraud detection, where the massive datasets available have allowed credit card companies to detect patterns of fraud through constant analysis of millions of transactions. However, it stresses that transferring this machine learning to the battlefield is much more difficult: the data needed for machines to understand complex battlefield situations is simply not available, and the contextual differences between combat zones are vast.
Armed drones are already monitoring the skies of many official and unofficial war zones and civilians have already experienced the devastating realities of these new technologies, illustrating that the military-tech industrial complex is not a new phenomenon. Killer robots may seem like a distant futuristic concept but advancements in AI are constantly being incorporated into military arsenals, and this French report reveals the willingness to engage with these technological developments on the battlefield.
Armed Drones to be acquired by Canadian Military
Royal Canadian Air Force commander Lt. Gen. Al Meinzinger told The Canadian Press that the Canadian military will put in a request for bids later this year to purchase armed drones worth $5 billion.
After nearly twenty years of deliberation, the Canadian military are now preparing to acquire armed drones for operations. This includes plans for a center in Ottawa, where drone operations will take place.
As part of the drone force, there will be 300 service members, including technicians, pilots, and other members drawn from the air force and other parts of the military.
The Canadian military plan to use these vehicles for surveillance, reconnaissance and airstrikes.
It was not until 2017 that the Canadian government approved the use of drones within military operations— armed or unarmed. Thus far it has only used a small number of unarmed drones in Afghanistan.
Little is known about how the Canadian military plan to use the armed drones. Branka Marijan, a senior researcher at an arms-control group, commented on the acquisition:
“This is not sufficient. Clarity on when, where, how, and for what purpose the armed drones would be used are needed,” she said.
The lack of information on the scenarios in which these weapons will be employed has raised questions about whether they could be used for assassinations. In a vague rebuttal, Canadian officials have stated that the armed drones will be used in the same way as conventional weapons such as fighter jets and artillery.
Critics are also concerned about the government’s decision to move ahead with acquiring these armed drones without a political debate. Jagmeet Singh, the leader of the NDP, Canada’s third-largest political party, indicated that he was not in favour of the decision, stating:
“it’s not in line with the vision that I have for a Canada that is providing … peace and building a safer world.”
According to its latest update, Drone Wars UK, a UK-based monitoring organisation, lists over twenty states that have acquired and operated armed drones, while 18 more are in the process of acquiring these weapons. This list does not count non-state actors.
In 2018, the Netherlands purchased four MQ-9 Reaper drones from the US; these weapons are due to arrive this year. The Dutch government officially announced the procurement of the armed drones for intelligence, surveillance, and reconnaissance missions.
Both the Netherlands and Canada signed the Joint Declaration for the Export and Subsequent Use of Armed or Strike-Enabled Unmanned Aerial Vehicles in 2016. All signing parties agree to regulate the production, exportation and use of armed unmanned aerial vehicles. However, the declaration has been criticized for its ambiguous language on how these armed unmanned vehicles should be interpreted under international humanitarian and human rights law. Furthermore, the declaration is not politically or legally binding, leaving few repercussions for the misuse or misappropriation of these dangerous weapons.
The acquisition of armed drones by the Canadian and Dutch militaries is part of a wider proliferation of armed drones among Western and non-western states. Part of the appeal of armed drones on the battlefield is the distance from which they can be operated – keeping military personnel out of harm’s way – and the intelligence drones can collect on potential security threats. However, as described in the op-ed “More drones, more war” by IRW members Dr. Lauren Gould and Isa Zoetbrood, the type of surveillance drones compile and the ways this information is wielded can have dangerous consequences.
As stated in the article, “from the sky, drones can therefore see everything and understand nothing”: the information collected does not contextualize the underlying issues which have led to conflict. Furthermore, as the article highlights, the abundance of information drones collect still leads to airstrikes which result in vast amounts of civilian harm. Despite the prevalence of civilian casualties, drones are still employed under the guise of ‘precision’. When airstrikes hit the wrong target or generate civilian harm, it is argued that there wasn’t enough surveillance, an argument then used to justify the acquisition and deployment of more drones.
The more countries acquire armed drones for military operations, under the pretence that the surveillance drones collect will ensure threats are identified and neutralized, the more likely we are to witness an increase in airstrikes and, sadly, civilian harm.
One of IRW’s current projects is tracking the discourses surrounding the use of drones in contemporary warfare. For more information, click here.
Trump’s Secret Rules for Drone Strikes
On Friday the 30th of April, the Biden administration released a partially redacted document created by the Trump administration on the Principles, Standards and Procedures for US Direct Action Against Terrorism (PSP). The document outlined the rules for the use of drone strikes in counter-terrorism operations.
The 11-page document was made public through Freedom of Information Act lawsuits filed by The New York Times and the American Civil Liberties Union. The lawsuits were inherited by the Biden administration, which initially sought a delay but has now complied and handed over the document, albeit with parts redacted.
The document laid out some concerning developments in US policy on the use of lethal force against terrorist suspects abroad, including the loosening of policy safeguards that President Trump’s predecessor, President Obama, had placed on the US use of lethal force against terrorist threats.
The Obama administration created the bureaucratic framework in 2013; it comprised a list of rules on when, where, and how US forces could conduct operations using lethal force against terrorism abroad.
The Trump administration adapted many of these rules, loosening many restraints on the use of lethal force. The main criticisms of Trump’s PSP rules include:
The Obama administration had formulated these rules for countries where the US was not engaged in active hostilities, which most interpreted to mean countries outside of Afghanistan, Iraq and Syria. In Trump’s adaptations, however, there was no mention of where these actions were permissible or prohibited, essentially permitting the US to conduct lethal strikes both inside and outside areas of active hostilities. The lack of specificity on where strikes can take place has been criticised by legal scholars, who argue that there is no guidance on which body of international law should be used to interpret these rules, thereby undermining international legal frameworks.
Under the Obama administration, the rules were specific about which terrorist groups were considered a threat to US national security, whereas under Trump’s rules there was no specification of which terrorist groups were considered a threat to the US, leaving it open to interpretation.
In addition, vague language was used in designating what was considered a threat. The Obama administration was heavily criticized for using ‘imminent threat’ to justify lethal force against terrorists abroad. The Trump administration dropped ‘imminent’, preferring to authorize lethal action against those considered simply a ‘threat’.
Furthermore, Trump’s rules have been criticized for authorizing commanders to make decisions about attacks in counter-terrorism operations, as long as they fit within the operating principles, and for allowing flexibility around these rules, stating that “variations” could be made “where necessary”.
A section of Trump’s rules which has drawn particular criticism is the presence of regular exceptions to the “near certainty” rule, which stipulates that strikes would only be given the go-ahead if there was near certainty that there would be no civilian casualties. Under the Trump administration, this rule remained in place where there was doubt about the presence of women and children, but the threshold was lowered considerably where there was a risk of harming civilian males.
It’s important to note that the rules outlined in the document have been suspended since President Biden’s first day in office, as he imposed an interim policy requiring White House approval on all potential airstrikes outside of Syria, Iraq and Afghanistan. The Biden administration are in the process of creating their own rules and regulations regarding this matter.
Speaking on the Trump rules, Brett Max Kaufman, a senior staff lawyer with the A.C.L.U.’s Center for Democracy, argued the rules were “stripped down” and amounted to a “secretive and unaccountable use of lethal force.”
As argued by Hina Shamsi for Just Security, the US lethal strikes program began under President Bush, was escalated under the Obama administration and further extended under President Trump to allow for “open-ended authorization for the United States to kill virtually anyone it designates as a terrorist threat, anywhere in the world.”
The changes the Trump administration made to the rules concerning lethal strikes illustrate how any sitting president can adapt the framework to use lethal strikes in whatever way they see fit. As mentioned earlier, President Biden will now adapt these rules to his administration’s liking, exposing a clear issue with the regulation of lethal strikes – regulations can be altered, stretched, and re-interpreted.
The Effect of Airstrikes on Drone Operators
As Demmers & Gould argue in their Remote Warfare Paradox article, the turn to remote warfare is attractive to Western democracies because it decreases the number of body bags returning from the battlefield. However, as early as 2000, Michael Ignatieff warned: ‘if war becomes unreal to the citizens of modern democracies, will they care enough to restrain and control the violence exercised in their name?’ Seemingly, one of the only ways Western democracies now feel the consequences of remote warfare is through the psychological trauma of, and stories told by, their drone pilots.
In an article in The Times, a former RAF pilot who operated an MQ-9 Reaper armed drone has revealed the impact his former job had on his mental health. The officer served in 13 Squadron at the Waddington base in Lincolnshire during the battle against ISIS in Syria and Iraq. The UK was an active member of the US-led coalition against ISIS, engaging in airstrikes in both countries.
The officer compared operating an armed drone to firing a crossbow, due to the visible detail of the bodies harmed by the weapon. He said he was “sickened” by what he called the “morally questionable” deaths and injuries inflicted by the airstrikes.
He admitted to suffering from post-traumatic stress disorder after witnessing the devastating impact of these strikes. The officer’s squadron, which was responsible for operating the armed drones, was nicknamed ‘the asylum’ because of the mental health issues associated with this form of work.
The officer is quoted saying “When you are killing the enemy, you see it in so much detail because you are watching them sometimes for hours or even days on end, then lingering afterward, watching the impact of what you have just done … your brain can’t tell the difference between 3,000 miles and 3ft.”
When the officer expressed concerns about the toll his work was taking on his mental health, he was transferred to Scotland and reduced in rank, before being medically discharged from the military. In his attempt to receive compensation for the psychological harm inflicted through his work, he was told there was “no evidence he had been exposed to actual or threatened death”.
His account is one of eight cases submitted in a dossier by Justice4Troops, a not-for-profit organisation which advocates for service members of the British military. The organisation released the dossier in the hope of pressuring the MoD to initiate an inquiry into the Armed Forces Compensation Scheme.
In response, the MoD has stated that they closely monitor the squadron for psychological harm and have trained trauma risk management providers embedded within the squadron.
As contemporary warfare continues to evolve, incorporating more technological advances that allow distance from the battlefield, it is important to understand the new forms of harm these weapons inflict: not only the devastating casualties, injuries and destruction in war zones, but also the psychological toll on their operators.