This electronic thesis or dissertation has been downloaded from the King’s Research Portal at https://kclpure.kcl.ac.uk/portal/

Autonomous weapons and stability
Scharre, Paul
Awarding institution: King's College London

The copyright of this thesis rests with the author and no quotation from it or information derived from it may be published without proper acknowledgement.

END USER LICENCE AGREEMENT

Unless another licence is stated on the immediately following page this work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International licence.
https://creativecommons.org/licenses/by-nc-nd/4.0/

You are free to copy, distribute and transmit the work under the following conditions:
Attribution: You must attribute the work in the manner specified by the author (but not in any way that suggests that they endorse you or your use of the work).
Non Commercial: You may not use this work for commercial purposes.
No Derivative Works: You may not alter, transform, or build upon this work.

Any of these conditions can be waived if you receive permission from the author. Your fair dealings and other rights are in no way affected by the above.

Take down policy
If you believe that this document breaches copyright please contact [email protected] providing details, and we will remove access to the work immediately and investigate your claim.

Download date: 27. Sep. 2021

Autonomous Weapons and Stability
Paul Scharre
March 2020

Thesis Submitted in Fulfillment of the Requirements for the Degree of Doctor of Philosophy
Department of War Studies
Faculty of Social Science & Public Policy
King's College London

Copyright © 2019 by Paul Scharre
The copyright of this thesis rests with the author and no quotation from it or information derived from it may be published without proper acknowledgement.

ABSTRACT

Autonomous Weapons and Stability
Paul Scharre

Militaries around the globe are developing increasingly autonomous robotic weapons, raising the prospect of future fully autonomous weapons that could search for, select, and engage targets on their own. Autonomous weapons have been used in narrow circumstances to date, such as defending ships and vehicles under human supervision. By examining over 50 case studies of autonomous military and cyber systems in use or currently in development around the globe, this thesis concludes that widespread use of fully autonomous weapons would be a novel development in war. Autonomous weapons raise important considerations for crisis stability, escalation control, and war termination that have largely been underexplored to date.

Fully autonomous weapons could affect stability in a number of ways. Autonomous weapons could increase stability by replacing human decision-making with automation, linking political leaders more directly to tactical actions. Accidents with autonomous weapons could have the opposite effect, however, causing unintended lethal engagements that escalate a crisis or make it more difficult to terminate conflict. Nine mechanisms are explored by which fully autonomous weapons could affect stability, four of which would enhance stability and five of which would undermine it. The significance of these mechanisms is assessed; the most prominent factor is the extent to which autonomous weapons increase or decrease human control.
If autonomous weapons were likely to operate consistent with human intent in realistic military operational environments, then widespread autonomous weapons would improve crisis stability, escalation control, and war termination by increasing political leaders’ control over their military forces and reducing the risk of accidents or unauthorized escalatory actions. If, however, unanticipated action by autonomous weapons is likely in realistic environments, then their use would undermine crisis stability, escalation control, and war termination because of the potential for unintended lethal engagements that could escalate conflicts or make it more difficult to end wars. The competing frameworks of normal accident theory and high-reliability organizations make different predictions about the likelihood of accidents with complex systems.

Over 25 case studies of military and non-military complex systems in real-world environments are used to test these competing theories. These include extensive U.S. experience with the Navy Aegis and Army Patriot air and missile defense systems and their associated safety track records. Experience with military and civilian autonomous systems suggests that three key conditions will undermine the reliability of fully autonomous weapons in wartime: (1) the absence of routine day-to-day experience under realistic conditions, since testing and peacetime environments will not perfectly replicate wartime conditions; (2) the presence of adversarial actors; and (3) the lack of human judgment to flexibly respond to novel situations. The thesis concludes that normal accident theory best describes the situation of fully autonomous weapons and that militaries are highly unlikely to be able to achieve high-reliability operations with fully autonomous weapons. Unanticipated lethal actions are likely to be normal consequences of using fully autonomous weapons, and militaries can reduce but not entirely eliminate these risks. Recent and potential future advances in artificial intelligence and machine learning are likely to exacerbate these risks, even as they enable more capable systems. This stems from the challenge of accurately predicting the behavior of machine learning systems in complex environments, their poor performance in response to novel situations, and their vulnerability to various forms of spoofing and manipulation.

The widespread deployment of fully autonomous weapons is therefore likely to undermine stability because of the risk of unintended lethal engagements. Four policy and regulatory options are explored to mitigate this risk. Their likelihood of success is assessed based on an analysis of over 40 historical cases of attempts to control weapons and the specific features of autonomous weapons. The thesis concludes with three potentially viable regulatory approaches to mitigate risks from fully autonomous weapons.

TABLE OF CONTENTS

Abstract ..... 3
List of Tables, Images, and Illustrations ..... 7
Overview and Key Research Questions ..... 8
    Introduction ..... 8
    Topic Overview ..... 14
    Background and Existing Literature ..... 17
    Research Questions ..... 41
    Methodology ..... 43
    Outline of the Report and Preview of Conclusions ..... 55
Part I: Autonomy and Autonomous Weapons ..... 57
    The AI and Robotics Revolution ..... 57
    The Nature of Autonomy ..... 66
    A History of Automation and Autonomy in Weapons ..... 71
Part II: Autonomy in Weapons Today and Future Trends ..... 91
    Autonomy in Robotic Platforms ..... 91
    Autonomy in Missiles and Sensors ..... 101
    Commercially-Available Components for Autonomous Weapons ..... 112
    Cyberspace and Autonomous Weapons ..... 113
Part III: Stability and Autonomous Weapons ..... 126
    The Concept of Stability ..... 127
    Autonomous Weapons and Stability ..... 132
Part IV: Controlling Autonomous Weapons: Risk, Predictability, and Hazard ..... 156
    Understanding Accident Risk: Normal Accident Theory vs. High-Reliability Organizations ..... 156
    Case Studies of Non-Military Autonomous Systems in Competitive and Uncontrolled Environments ..... 165
    Case Studies in Complex Automated Military Weapon Systems: The Patriot and Aegis ..... 180
    Automation as a Source of Both Reliability and Failure ..... 200
    Nuclear Weapons as a Case Study in Risk Management ..... 213
    Summary: The Inevitability of
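The abstract argues that unintended lethal engagements would be normal outcomes of fielding fully autonomous weapons at scale, and that militaries can reduce but not eliminate this risk. A minimal sketch of the compounding-risk arithmetic behind that intuition follows; the per-engagement failure probabilities and engagement counts are hypothetical assumptions chosen for illustration, not figures from the thesis, and the independence assumption is itself a simplification of real operational behavior.

```python
# Illustrative sketch only: hypothetical numbers, not figures from the thesis.
# Shows how a small per-engagement chance of unintended behavior compounds
# across many engagements, the intuition behind the "normal accidents" claim.

def prob_at_least_one_failure(p_failure: float, n_engagements: int) -> float:
    """P(at least one unintended engagement), assuming independent engagements
    with a constant per-engagement failure probability."""
    return 1.0 - (1.0 - p_failure) ** n_engagements

if __name__ == "__main__":
    # Hypothetical per-engagement failure rates and engagement counts,
    # chosen only to illustrate how quickly small risks compound at scale.
    for p in (1e-3, 1e-4):
        for n in (100, 1_000, 10_000):
            risk = prob_at_least_one_failure(p, n)
            print(f"p={p:g}  n={n:>6}  P(>=1 unintended engagement) = {risk:.1%}")
```

Under these toy assumptions, a system that behaves as intended 99.9 percent of the time per engagement still produces at least one unintended engagement with near certainty over ten thousand engagements, which is why the argument treats risk reduction rather than risk elimination as the realistic goal.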
Recommended publications
  • Killer Robots
    KILLER ROBOTS: What are they and what are the concerns?

    KILLER ROBOTS ARE WEAPON SYSTEMS THAT WOULD SELECT AND ATTACK TARGETS WITHOUT MEANINGFUL HUMAN CONTROL. This means the decision to deploy lethal force would be delegated to a machine. This far-reaching development would fundamentally change the way war is conducted and has been called the third revolution in warfare, after gunpowder and the atomic bomb. The function of autonomously selecting and attacking targets could be applied to various platforms, for instance a battle tank, a fighter jet or a ship. Another term used to describe these weapons is lethal autonomous weapon systems (LAWS).

    A COMMON MISUNDERSTANDING IS THAT KILLER ROBOTS ARE DRONES OR THE TERMINATOR. Today’s armed drones still have a human operator controlling the weapon system from a distance who is responsible for selecting and identifying targets as well as pulling the trigger. The issue is also not about the Terminator. This science fiction concept is unlikely to become a reality in the coming decades, if ever at all. The issue is about the removal of meaningful human control from the critical functions of selecting and attacking targets; some of these systems may currently be under development and could be deployed in the coming years.

    THERE SHOULD ALWAYS BE MEANINGFUL HUMAN CONTROL OVER THE SELECTION AND ATTACK OF INDIVIDUAL TARGETS. The human operator must be able to make carefully considered legal and ethical assessments, with sufficient information about the situation on the ground and enough time to make a well-considered, informed decision.
  • AI, Robots, and Swarms: Issues, Questions, and Recommended Studies
    AI, Robots, and Swarms: Issues, Questions, and Recommended Studies
    Andrew Ilachinski
    January 2017
    Approved for Public Release; Distribution Unlimited.
    This document contains the best opinion of CNA at the time of issue. It does not necessarily represent the opinion of the sponsor. Specific authority: N00014-11-D-0323. Copies of this document can be obtained through the Defense Technical Information Center at www.dtic.mil or contact CNA Document Control and Distribution Section at 703-824-2123.
    Photography Credits: http://www.darpa.mil/DDM_Gallery/Small_Gremlins_Web.jpg; http://4810-presscdn-0-38.pagely.netdna-cdn.com/wp-content/uploads/2015/01/Robotics.jpg; http://i.kinja-img.com/gawker-edia/image/upload/18kxb5jw3e01ujpg.jpg
    Approved by: Dr. David A. Broyles, Special Activities and Innovation, Operations Evaluation Group, January 2017
    Copyright © 2017 CNA

    Abstract
    The military is on the cusp of a major technological revolution, in which warfare is conducted by unmanned and increasingly autonomous weapon systems. However, unlike the last “sea change,” during the Cold War, when advanced technologies were developed primarily by the Department of Defense (DoD), the key technology enablers today are being developed mostly in the commercial world. This study looks at the state of the art of AI, machine-learning, and robot technologies, and their potential future military implications for autonomous (and semi-autonomous) weapon systems. While no one can predict how AI will evolve or its impact on the development of military autonomous systems, it is possible to anticipate many of the conceptual, technical, and operational challenges that DoD will face as it increasingly turns to AI-based technologies.
  • U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress
    U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress
    Updated November 20, 2018
    Congressional Research Service, R45392, https://crsreports.congress.gov

    Summary
    The nexus of robotics and autonomous systems (RAS) and artificial intelligence (AI) has the potential to change the nature of warfare. RAS offers the possibility of a wide range of platforms—not just weapon systems—that can perform “dull, dangerous, and dirty” tasks—potentially reducing the risks to soldiers and Marines and possibly resulting in a generation of less expensive ground systems. Other nations, notably peer competitors Russia and China, are aggressively pursuing RAS and AI for a variety of military uses, raising considerations about the U.S. military’s response—to include lethal autonomous weapons systems (LAWS)—that could be used against U.S. forces.
    The adoption of RAS and AI by U.S. ground forces carries with it a number of possible implications, including potentially improved performance and reduced risk to soldiers and Marines; potential new force designs; better institutional support to combat forces; potential new operational concepts; and possible new models for recruiting and retaining soldiers and Marines. The Army and Marines have developed and are executing RAS and AI strategies that articulate near-, mid-, and long-term priorities. Both services have a number of RAS and AI efforts underway and are cooperating in a number of areas.
    A fully manned, capable, and well-trained workforce is a key component of military readiness. The integration of RAS and AI into military units raises a number of personnel-related issues that may be of interest to Congress, including unit manning changes, recruiting and retention of those with advanced technical skills, training, and career paths.
  • Department of Defense Program Acquisition Cost by Weapons System
    The estimated cost of this report or study for the Department of Defense is approximately $36,000 for the 2019 Fiscal Year. This includes $11,000 in expenses and $25,000 in DoD labor. Generated on 2019FEB14, RefID: B-1240A2B

    FY 2020 Program Acquisition Costs by Weapon System

    Major Weapon Systems Overview
    The performance of United States (U.S.) weapon systems is unmatched, ensuring that U.S. military forces have a tactical combat advantage over any adversary in any environmental situation. The Fiscal Year (FY) 2020 acquisition (Procurement and Research, Development, Test, and Evaluation (RDT&E)) funding requested by the Department of Defense (DoD) totals $247.3 billion, which includes funding in the Base budget and the Overseas Contingency Operations (OCO) fund, totaling $143.1 billion for Procurement and $104.3 billion for RDT&E. The funding in the budget request represents a balanced portfolio approach to implement the military force objective established by the National Defense Strategy. Of the $247.3 billion in the request, $83.9 billion finances Major Defense Acquisition Programs (MDAPs), which are acquisition programs that exceed a cost threshold established by the Under Secretary of Defense for Acquisition and Sustainment. To simplify the display of the various weapon systems, this book is organized by the following mission area categories:
    • Aircraft and Related Systems
    • Command, Control, Communications, Computers, and Intelligence (C4I) Systems
    • Ground Systems
    • Missile Defeat and Defense Programs
    • Missiles and Munitions
    • Shipbuilding and Maritime Systems
    • Space Based Systems
    • Science and Technology
    • Mission Support Activities
    [Charts: “FY 2020 Investment Total: $247.3 Billion” and “The Distribution of Funding in FY 2020 for Procurement and RDT&E by Component and Category” ($ in billions; numbers may not add due to rounding; funding for Mission Support activities is not represented in the displays).]
  • The Challenge of Lethal Autonomous Weapons Systems (LAWS)
    ODUMUNC 2021 Issue Brief for the General Assembly First Committee: Disarmament and International Security
    The Challenge of Lethal Autonomous Weapons Systems (LAWS)
    by William Bunn, Graduate Program in International Studies, Old Dominion University

    Introduction
    On a cool desert morning, two aircraft appeared on the horizon, though their small size and low-observational (or stealth) characteristics made them difficult to pick out against the blue sky. With their trapezoidal shape, stub wings, and lack of tails, the craft looked more like something out of a science fiction movie than a modern battlefield. As “unmanned combat air vehicles” (UCAVs), these platforms were less than half the size of a typical manned fighter aircraft. More importantly, as the aircraft approached enemy territory, they began to “work as a team to accomplish their mission via leveraging a set of algorithms, onboard sensors and communications data links...(utilizing) autonomous control and decision making largely their own.” Suddenly, a…threat radar activated. The pair of networked UCAVs immediately classified the threat and executed a plan to destroy it based on the position of each (aircraft) in relation to the site…prosecuted with great prejudice by the flying robotic duo.

    [Image caption: X-45A experimental U.S. unmanned combat air vehicles made by Boeing. Each X-45A is over 26 feet long, with a wingspan of 34 feet, and weighs 8,000 lbs empty.]

    The attack described above did not take place over a desert in the Middle East, but in a desert in California, near Edwards Air Force Base. The aircraft were Boeing X-45A technology demonstrators targeting simulated enemy missile sites.
  • The Arms Industry and Increasingly Autonomous Weapons
    Slippery Slope: The arms industry and increasingly autonomous weapons
    www.paxforpeace.nl

    Reprogramming War
    This report is part of a PAX research project on the development of lethal autonomous weapons. These weapons, which would be able to kill people without any direct human involvement, are highly controversial. Many experts warn that they would violate fundamental legal and ethical principles and would be a destabilising threat to international peace and security. In a series of four reports, PAX analyses the actors that could potentially be involved in the development of these weapons. Each report looks at a different group of actors, namely states, the tech sector, the arms industry, and universities and research institutes. The present report focuses on the arms industry. Its goal is to inform the ongoing debate with facts about current developments within the defence sector. It is the responsibility of companies to be mindful of the potential applications of certain new technologies and the possible negative effects when applied to weapon systems. They must also clearly articulate where they draw the line to ensure that humans keep control over the use of force by weapon systems. If you have any questions regarding this project, please contact Daan Kayser ([email protected]).

    Colophon
    November 2019
    ISBN: 978-94-92487-46-9
    NUR: 689
    PAX/2019/14
    Author: Frank Slijper
    Thanks to: Alice Beck, Maaike Beenes and Daan Kayser
    Cover illustration: Kran Kanthawong
    Graphic design: Het IJzeren Gordijn
    © PAX
    This work is available under the Creative Commons Attribution 4.0 license (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/deed.en We encourage people to share this information widely and ask that it be correctly cited when shared.
  • Lethal Autonomous Weapons Systems: Artificial Intelligence and Autonomy
    LETHAL AUTONOMOUS WEAPONS SYSTEMS: ARTIFICIAL INTELLIGENCE AND AUTONOMY
    Raine Sagramsingh
    IEEE // Washington Internships for Students in Engineering

    About the Author
    Raine Sagramsingh is a recent graduate of the FAMU-FSU College of Engineering. She graduated summa cum laude with a Bachelor of Science in mechanical engineering. She was an active member of the Florida State University Center for Leadership and Social Change’s Service Scholar Program. Raine completed internships with the Air Force Research Lab and Northrop Grumman Aerospace Systems. Raine will begin a full-time position as an electronics engineer with the 412th Test Wing at Edwards Air Force Base after completion of the WISE program.

    About IEEE
    IEEE and its members inspire a global community to innovate for a better tomorrow through its more than 423,000 members in over 160 countries, and its highly cited publications, conferences, technology standards, and professional and educational activities. IEEE is the trusted “voice” for engineering, computing, and technology information around the globe. IEEE-USA is an organizational unit of IEEE, created to support the career and public policy interests of IEEE’s U.S. members. Through its Government Relations programs, IEEE-USA works with all three branches of the federal government to help shape the workforce and technology policy to the benefit of members, the profession and the American public.

    About the WISE Program
    Founded in 1980 through collaborative efforts of several professional engineering societies, the Washington Internships for Students of Engineering (WISE) program has become one of the premier Washington internship programs. The WISE goal is to prepare future leaders of the engineering profession in the United States who are aware of, and who can contribute to, the increasingly important issues at the intersection of science, technology, and public policy.
  • Mapping the Development of Autonomy in Weapon Systems, Vincent Boulanin and Maaike Verbruggen
    MAPPING THE DEVELOPMENT OF AUTONOMY IN WEAPON SYSTEMS
    Vincent Boulanin and Maaike Verbruggen
    November 2017

    STOCKHOLM INTERNATIONAL PEACE RESEARCH INSTITUTE
    SIPRI is an independent international institute dedicated to research into conflict, armaments, arms control and disarmament. Established in 1966, SIPRI provides data, analysis and recommendations, based on open sources, to policymakers, researchers, media and the interested public. The Governing Board is not responsible for the views expressed in the publications of the Institute.

    GOVERNING BOARD
    Ambassador Jan Eliasson, Chair (Sweden); Dr Dewi Fortuna Anwar (Indonesia); Dr Vladimir Baranovsky (Russia); Ambassador Lakhdar Brahimi (Algeria); Espen Barth Eide (Norway); Ambassador Wolfgang Ischinger (Germany); Dr Radha Kumar (India); The Director

    DIRECTOR
    Dan Smith (United Kingdom)

    Signalistgatan 9, SE-169 72 Solna, Sweden
    Telephone: +46 8 655 97 00
    Email: [email protected]
    Internet: www.sipri.org
    © SIPRI 2017

    Contents
    Acknowledgements v
    About the authors v
    Executive summary vii
    Abbreviations x
    1. Introduction 1
        I. Background and objective 1
        II. Approach and methodology 1
        III. Outline 2
        Figure 1.1. A comprehensive approach to mapping the development of autonomy in weapon systems 2
    2. What are the technological foundations of autonomy? 5
        I. Introduction 5
        II. Searching for a definition: what is autonomy? 5
        III. Unravelling the machinery 7
        IV. Creating autonomy 12
        V. Conclusions 18
        Box 2.1. Existing definitions of autonomous weapon systems 8
        Box 2.2. Machine-learning methods 16
        Box 2.3. Deep learning 17
        Figure 2.1. Anatomy of autonomy: reactive and deliberative systems 10
        Figure 2.2.
  • Killer Robots Are Lethal Autonomous Weapon Systems That Can Operate Without “Meaningful Human Control” — That Is, Without a Human Remotely Powering Them
    Factsheet: Killer Robots and South Korea

    What are killer robots?
    Killer robots are lethal autonomous weapon systems that can operate without “meaningful human control” — that is, without a human remotely powering them. Unlike armed drones, killer robots can select and fire upon targets on their own, using preprogrammed algorithms and data analysis.

    Why should we be concerned about killer robots?
    There are many reasons why we should be concerned about killer robots. Fully autonomous weapons remove human control from the complex decision to use force, thus eliminating the possibility for empathy, conscience, emotion, judgment or care for the value of human life. Autonomous weapon systems cannot be relied upon to comply with international laws regarding war or human rights. If autonomous weapons accidentally kill civilians, it is unclear who, if anyone, would be held liable or accountable. This would make it difficult to ensure justice for the victims. By replacing troops with machines, killer robots risk lowering the threshold for war, making it easier for countries to engage in conflicts. There is also the potential for killer robots to be used in circumstances outside of war, such as policing or border control. Finally, autonomous weapons are more susceptible to cyberattacks and hacking, making them less predictable and more uncontrollable, with potentially deadly results.

    What does race or gender have to do with killer robots?
    People can program killer robots with biases about race or gender — for example, by targeting someone based on their skin color or the way they are dressed. Recent studies on facial and vocal recognition software demonstrate that racism is already built into our technology.