Stop killer robots while we still can

February 26, 2014

On the 26th of February, PAX will present the report ‘Deadly Decisions – 8 objections to killer robots’ at the annual NGO conference of the Campaign to Stop Killer Robots in London. The report highlights eight objections to killer robots and explains why a comprehensive and pre-emptive ban on these weapons is necessary.

Killer robots are fully autonomous weapon systems that can select and engage targets without meaningful human intervention. Although killer robots do not exist yet, the rapid advances in military technology show that the capability to deploy fully autonomous weapons is on its way. PAX calls on states to not only address legal, policy and strategic concerns, but to consider the crucial ethical question: do we want to delegate the power to make life and death decisions to a machine?

Eight objections
PAX is part of the steering committee of the Campaign to Stop Killer Robots. The report describes the eight most pressing concerns about the development, production and use of fully autonomous weapon systems. In short, the objections are:

  • Killer robots are by nature unethical, because there is no longer a human in the loop when an attack takes place. War is about human suffering, the loss of human lives, and consequences for human beings. Killing with machines is the ultimate dehumanization of war.
  • Killer robots will lower the threshold of going to war. By shifting risks away from a nation’s own soldiers, they also distance the public from experiencing the war, giving politicians more space in deciding when and how to go to war.
  • A killer robot cannot be programmed to make sound decisions about who is a combatant and who is a civilian. Its mechanical form of intelligence makes it impossible to apply the rule of distinction or to respect human rights.
  • A killer robot cannot apply the rule of proportionality. A machine cannot weigh military gain and human suffering in war situations.
  • Killer robots complicate the chain of responsibility. This leads to an accountability vacuum that makes it impossible to hold anyone sufficiently accountable for possible violations of international law by a robot: is it the robot, the programmer, the manufacturer or the commander?
  • The development and use of killer robots could decrease transparency as they can be easily used in anonymous and clandestine operations. Given the lack of transparency and consistency of information in drone warfare, it is hard to imagine that governments would be transparent about the use of killer robots.
  • Killer robots will terrify local populations and could lead to increased animosity and possible retaliation. At the same time, citizens of states that deploy killer robots could be more vulnerable to counterattacks.
  • Killer robots are relatively cheap and easy to copy. If they are produced and used, they would proliferate widely to states and non-state actors.

Urgency
One thing becomes exceedingly clear: these weapons must be banned before it is too late. Co-author of the report Miriam Struyk: “It is great that so many states, including the Netherlands, have agreed to convene an informal meeting at the United Nations in Geneva from 13 to 16 May to discuss the legal, ethical, military and policy concerns of fully autonomous weapons. We hope this report will contribute to that discussion and emphasize the urgency of a ban. Technological developments towards fully autonomous weapons are moving at an increasingly rapid pace. It’s time we ask ourselves whether such a development is desirable, before it is too late.”

Watch our video about Killer Robots.

Download the report Deadly Decisions – 8 objections to killer robots written by Merel Ekelhof and Miriam Struyk.

Read more about our programme Stop Killer Robots.
See also stopkillerrobots.org

Get involved with our peace work.
Subscribe to the PAX Action Alert.