Public attention peaked when Google employees protested their company's involvement in the Pentagon-initiated Project Maven, which aims to improve drone targeting through artificial intelligence (AI). Back in 2018, around 4,000 employees and researchers signed a petition calling for a halt to the development of AI technology and software for warfare purposes. While Silicon Valley's role in advancing military technology was hardly a surprise at the time, many felt that this partnership had to stop short of producing 'killer robots'. After numerous media outlets picked up the demands and reported on the internal uprising, Google announced that it would not renew its contract with the Pentagon and would adjust its own AI guidelines in order to live up to its former in-house motto, 'don't be evil'.
Around four years later, in September 2022, the tech giant signed a $400,000 contract with the Pentagon to develop 'Artificial Intelligence / Machine Learning for processing aerial imagery'. This joins a series of other contracts between Google and the US military and surveillance apparatus: in the second half of the previous year, the company was awarded a $3 million Pentagon contract for 'air logistics optimization'. This was shortly followed by a partial award under the DoD's 'Joint Warfighting Cloud Capability', and preceded by awards for the CIA's 'Commercial Cloud Enterprise' and the Israeli Ministry of Defense's 'Project Nimbus'. Taken together, these contracts point to an unsteady and inconsistent implementation of Google's renewed AI policy.
The company, which started as a web search engine in 1997 and has since grown into one of the most powerful corporations in the world, is surely not the only expression of the complicated marriage of big tech and warfare. It is, however, a prominent one, and its ambiguous direction and volatile course illustrate an important point: when it comes to limiting AI technology on the battlefield, we should not rely on the self-regulation of profit-seeking companies in a competitive market. What we need instead is a multilateral and binding treaty to mitigate the negative consequences of increasingly autonomous warfare.
This post was written by IRW LAB researcher Laszlo Steinwärder