Killer Robots
KILLER ROBOTS: WHAT ARE THEY AND WHAT ARE THE CONCERNS?

WHAT ARE KILLER ROBOTS?

KILLER ROBOTS ARE WEAPON SYSTEMS THAT WOULD SELECT AND ATTACK TARGETS WITHOUT MEANINGFUL HUMAN CONTROL. This means the decision to deploy lethal force would be delegated to a machine. This far-reaching development would fundamentally change the way war is conducted and has been called the third revolution in warfare, after gunpowder and the atomic bomb. The function of autonomously selecting and attacking targets could be applied to various platforms, for instance a battle tank, a fighter jet or a ship. Another term used to describe these weapons is lethal autonomous weapon systems (LAWS).

A COMMON MISUNDERSTANDING IS THAT KILLER ROBOTS ARE DRONES OR THE TERMINATOR. Today's armed drones still have a human operator controlling the weapon system from a distance, who is responsible for selecting and identifying targets as well as pulling the trigger. Nor is the issue about the Terminator: this science-fiction concept is unlikely to become a reality in the coming decades, if ever. The issue is the removal of meaningful human control from the critical functions of selecting and attacking targets; some of these systems may currently be under development and could be deployed in the coming years.

THERE SHOULD ALWAYS BE MEANINGFUL HUMAN CONTROL OVER THE SELECTION AND ATTACK OF INDIVIDUAL TARGETS. The human operator must be able to make carefully considered legal and ethical assessments, with sufficient information about the situation on the ground and enough time to make a well-considered, informed decision. The desire to retain some form of human control lies at the heart of the debate over lethal autonomous weapon systems. What level and nature of human control is necessary to make a weapon system legally and ethically acceptable? How can we ensure that this control is effective, appropriate and meaningful?
DO KILLER ROBOTS EXIST?

Unless constraints are put in place, lethal autonomous weapons will be deployed in the coming years rather than decades. Several precursors clearly demonstrate the trend towards increasingly autonomous weapon systems.

SGR-A1
Made by: Hanwha (South Korea). Sold to: South Korea.
This stationary robot, armed with a machine gun and a grenade launcher, operated along the border between North and South Korea. It can detect human beings using infra-red sensors and pattern-recognition software. The robot has both a supervised and an unsupervised mode available. It can identify and track intruders, with the possibility of firing at them.

SEA HUNTER
Made by: Pentagon's DARPA (United States). Sold to: under development.
This 40m long self-navigating warship is designed to hunt for enemy submarines and can operate without contact with a human operator for 2-3 months at a time. It is currently unarmed. US representatives have said the goal is to arm the Sea Hunters and to build unmanned flotillas within a few years. However, it has been said that any decision to use offensive lethal force would be made by humans.

HARPY
Made by: Israel Aerospace Industries (Israel). Sold to: China, India, Israel, South Korea and Turkey.
This 2.1m long 'loitering' missile is launched from a ground vehicle. It is armed with a 15kg explosive warhead. The Harpy can loiter for up to 9 hours at a time, searching for enemy radar signals. It automatically detects, attacks and destroys enemy radar emitters by flying into the target and detonating.

NEURON
Made by: Dassault Aviation (France). Sold to: under development.
This 10m long stealth unmanned combat aircraft can fly autonomously for over 3 hours, performing autonomous detection, localization and reconnaissance of ground targets. The Neuron has fully automated attack capabilities, target adjustment, and communication between systems.
CONCERNS

Due to serious legal, security and ethical concerns, the Campaign to Stop Killer Robots calls for a ban on the development, production and use of killer robots.

1. ETHICS
A machine should never be allowed to make decisions over life and death. Such decisions should not be reduced to an algorithm; this goes against the principles of human dignity and the right to life. A robot does not understand or respect the value of human life, which means it cannot make a 'kill decision' that takes into account, implicitly or explicitly, human dignity. It is simply completing the task it was programmed to do. This devalues and dehumanizes the decision, and does not respect the value we place on human life.

2. PROLIFERATION
Once developed, lethal autonomous weapons may be relatively cheap to produce and simple to copy. This increases the likelihood of their proliferation to a wide variety of actors, including dictators and non-state actors. Proponents often focus on the short-term advantages of employing lethal autonomous weapons, but overlook the longer-term prospect that these weapons may be used against their own military and civilian population.

3. LEGALITY
Killer robots are unlikely to be able to adhere to fundamental principles of International Humanitarian Law (IHL), such as distinguishing between a civilian and a soldier. A soldier cannot simply be defined as a human with a weapon: in some countries a civilian can carry a weapon for ceremonial reasons at a wedding, and shepherds may be armed to protect their livestock and themselves. Even harder is the proportionality assessment, which weighs civilian harm against military advantage. It is impossible to simply program international law, as it always depends on interpretation of the context.

4. LOWERING THE THRESHOLD FOR WAR
Some speculate that lethal autonomous weapons could lead to fewer casualties among the attacking forces. However, this could also lead to an increase in conflicts by lowering the threshold for going to war. If there are fewer risks to a soldier's safety, it may be easier to employ lethal force, and a perception of risk-free war may lead to a preference for military rather than political solutions.

5. ACCOUNTABILITY
Lethal autonomous weapons create an accountability gap regarding who would be responsible for an unlawful act. Who would be responsible: the manufacturer, the developer, the military commander or the robot itself?

6. ARMS RACE
Rapid developments in robotics and artificial intelligence applied to military technology could lead to an international arms race, which would have destabilising effects and threaten international peace and security.

7. UNPREDICTABILITY
The deployment of lethal autonomous weapons could lead to accidental wars and rapid escalation of conflicts, as well as other unintended but dangerous consequences. It is unclear how lethal autonomous weapons designed and deployed by opposing forces would react and interact with each other. These weapons could be highly unpredictable, especially in their interactions with other autonomous systems and if they are capable of self-learning.

GLOBAL CONCERN

PUBLIC OPINION
An IPSOS poll in 26 countries shows that 61% of respondents oppose killer robots. The poll also asked what concerned them the most: two-thirds (66%) answered that lethal autonomous weapon systems would "cross a moral line because machines should not be allowed to kill", and over half (54%) said these weapons would be "unaccountable".

STATES
More than 80 states have spoken on the matter of killer robots since 2013, and 28 states have called for a ban. The majority of states have expressed their desire to retain some form of human control over weapon systems and the use of force.

EUROPEAN PARLIAMENT
An overwhelming majority of members of the European Parliament have called for the start of negotiations to prohibit lethal autonomous weapons.

ROBOTICS & AI EXPERTS
Over 3,000 artificial intelligence experts have called for a ban, including prominent scientists such as Stephen Hawking, Anca Dragan, Yoshua Bengio and Stuart Russell. The founders and directors of more than 200 technology companies have pledged not to develop killer robots, including Nnaisense and Clearpath Robotics. Google has committed not to design or deploy AI for use in weapons.

ICRC
The International Committee of the Red Cross has called on states to establish internationally agreed limits on autonomy in weapon systems that address legal, ethical and humanitarian concerns.

UNITED NATIONS
Secretary-General António Guterres has called lethal autonomous weapons "morally repugnant and politically unacceptable" and has urged states to negotiate a ban on these weapons. UN Special Rapporteur Heyns has called for a moratorium on these weapons.

TIMELINE

2009: Founding of the International Committee for Robot Arms Control (ICRAC)
2012: Founding of the Campaign to Stop Killer Robots; autonomous weapons discussed at the UN Human Rights Council
2014: First informal meeting on lethal autonomous weapons at the UN in Geneva; Clearpath Robotics becomes the first company to pledge not to develop killer robots
2015: Open letter by over 3,000 artificial intelligence experts and scientists warning against killer robots
2017: Letter by 116 tech companies calling on the UN to ban lethal autonomous weapons
2018: Pledge by tech companies and individuals not to develop or produce lethal autonomous weapons; the European Parliament calls for the start of negotiations on a ban on lethal autonomous weapon systems; Austria, Brazil and Chile call for the start of negotiations on a treaty to retain meaningful human control over the critical functions in lethal autonomous weapon systems

"Machines that have the power and the discretion to take human lives are politically unacceptable, are morally repugnant and should be banned by international law."
- SECRETARY-GENERAL ANTÓNIO GUTERRES

"… the actual concept of autonomous weapons, that decisions of life and death are left up to machines, is in principle and intrinsically a problem."
- COUNCIL ON ETHICS, NORWEGIAN GOVERNMENT PENSION FUND

"These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.

"… the proliferation of lethal autonomous weapon systems remains a clear and present