
https://www.paxforpeace.nl/our-work/programmes/killer-robots

Killer Robots

What are killer robots?
Killer robots are autonomous weapon systems that can select and attack targets without meaningful human control. This means the weapon system can use lethal force without a direct instruction from a human operator. Such a function could be applied to various weapon systems, for instance a battle tank, a fighter jet or a ship.

A common misunderstanding is that killer robots are the same as drones. With drones, however, there is still a human operator who selects and attacks targets from a distance. Another misunderstanding is that killer robots are the same as the Terminator or RoboCop. These are science fiction concepts, unlikely to become reality in the coming decades, if ever.

Weapon systems that can autonomously select and attack targets raise many legal, ethical and security concerns. This is why PAX aims for a ban on the development, production and deployment of autonomous weapons. PAX warned about this development as early as 2011, and in April 2013 it co-founded the international Campaign to Stop Killer Robots.


Do killer robots exist?
Killer robots most likely do not yet exist, but there are precursors that clearly show the trend towards increasing autonomy. An example is the Harpy, which can loiter for hours searching for enemy radar signals. Once it detects a signal, it attacks and destroys the enemy radar by self-destructing onto it. Another example is the SGR-1, an armed robot on the border between North and South Korea. It has a machine gun and a grenade launcher and can detect human beings with infrared sensors. The technology required to produce these weapons is developing incredibly quickly. Countries such as China, Russia, Israel, the United States and the United Kingdom have already expressed an interest in developing such weapon systems.

What are the concerns?
For PAX the most important concern is an ethical one: a machine should never be allowed to make the decision over life and death. That decision cannot be reduced to an algorithm. Then there are the legal concerns. Autonomous weapons are unlikely to be able to comply with International Humanitarian Law, as they will probably not be able to properly distinguish between civilians and combatants or to make a proportionality assessment. Autonomous weapons also create an accountability vacuum: who would be responsible for an unlawful act, the robot, the developer or the military commander? Finally, there are security concerns. Autonomous weapons could lower the threshold for using force and reduce the incentive to find political solutions to end conflicts. This new technology could lead to a new international arms race, which would have destabilising effects and threaten international peace and security. And what happens when dictators or terrorists get their hands on these weapons? Read more about these concerns in the short PAX publication: Ten reasons to ban killer robots

What does PAX want?
PAX wants states to adopt a pre-emptive ban on the development, production and use of killer robots. In other words, PAX wants an international, legally binding instrument safeguarding meaningful human control over the critical functions of selecting and attacking targets. To this end PAX co-founded the Campaign to Stop Killer Robots in 2013. This international coalition aims for a ban on these weapons and advocates for it at the United Nations and in other forums. Technology continues to develop at a rapid pace, which is why it is crucial to decide as soon as possible where to draw the line between what is acceptable and what is not.

There is growing support for a ban: 22 countries have so far called for one.

In 2015 over 3,000 artificial intelligence experts, and in 2017 116 CEOs of robotics companies, warned against these weapons and called on the UN to take action. The European Parliament, twenty Nobel Peace Laureates and over 160 religious leaders have also called for a ban on autonomous weapons.

What is the Dutch position?
The Dutch national position on autonomous weapons is based on a 2015 report by two advisory councils, 'Autonomous weapon systems: the need for meaningful human control'. PAX takes a critical stance towards this report. PAX's greatest concern is that the report states that human control in the programming phase, before deployment, is sufficient and that human control over the selection and attack of targets is not necessary. The report also lacks a sense of urgency and pays little attention to the international context and to the ethical concerns. PAX continues to work towards a shift in Dutch policy.

International debate

The issue was first discussed at the Human Rights Council in 2013, following a report by UN Special Rapporteur Christof Heyns. The issue was then taken up by the UN Convention on Certain Conventional Weapons (CCW), where informal expert meetings took place in 2014, 2015 and 2016. In 2017 the first Group of Governmental Experts (GGE), which has a more formal mandate, met at the CCW. In 2018 there will be two GGE meetings, one week in April and one in August. PAX takes part in all the meetings at the CCW, where we meet with diplomats and speak at side events.


Background information and links

Information on the process within the UN Convention on Certain Conventional Weapons

Information on the international Campaign to Stop Killer Robots

Other relevant publications


Contact information

Miriam Struyk, Programme Director Security & Disarmament
struyk[at]paxforpeace.nl
 
