This electronic thesis or dissertation has been downloaded from the King's Research Portal at https://kclpure.kcl.ac.uk/portal/

Autonomous weapons and stability
Scharre, Paul
Awarding institution: King's College London

The copyright of this thesis rests with the author and no quotation from it or information derived from it may be published without proper acknowledgement.

END USER LICENCE AGREEMENT

Unless another licence is stated on the immediately following page this work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International licence.
https://creativecommons.org/licenses/by-nc-nd/4.0/

You are free to copy, distribute and transmit the work under the following conditions:

Attribution: You must attribute the work in the manner specified by the author (but not in any way that suggests that they endorse you or your use of the work).
Non Commercial: You may not use this work for commercial purposes.
No Derivative Works: You may not alter, transform, or build upon this work.

Any of these conditions can be waived if you receive permission from the author. Your fair dealings and other rights are in no way affected by the above.

Take down policy

If you believe that this document breaches copyright please contact [email protected] providing details, and we will remove access to the work immediately and investigate your claim.

Download date: 01. Oct. 2021

Autonomous Weapons and Stability

Paul Scharre

March 2020

Thesis Submitted in Fulfillment of the Requirements for the Degree of Doctor of Philosophy in the Department of War Studies
Faculty of Social Science & Public Policy
King's College London

Copyright © 2019 by Paul Scharre

The copyright of this thesis rests with the author and no quotation from it or information derived from it may be published without proper acknowledgement.
ABSTRACT

Autonomous Weapons and Stability
Paul Scharre

Militaries around the globe are developing increasingly autonomous robotic weapons, raising the prospect of future fully autonomous weapons that could search for, select, and engage targets on their own. Autonomous weapons have been used in narrow circumstances to date, such as defending ships and vehicles under human supervision. By examining over 50 case studies of autonomous military and cyber systems in use or currently in development around the globe, this thesis concludes that widespread use of fully autonomous weapons would be a novel development in war. Autonomous weapons raise important considerations for crisis stability, escalation control, and war termination that have largely been underexplored to date.

Fully autonomous weapons could affect stability in a number of ways. Autonomous weapons could increase stability by replacing human decision-making with automation, linking political leaders more directly to tactical actions. Accidents with autonomous weapons could have the opposite effect, however, causing unintended lethal engagements that escalate a crisis or make it more difficult to terminate conflict. Nine mechanisms are explored by which fully autonomous weapons could affect stability, four of which would enhance stability and five of which would undermine it. The significance of these mechanisms is assessed, and the most prominent factor is the extent to which autonomous weapons increase or decrease human control. If autonomous weapons were likely to operate consistent with human intent in realistic military operational environments, then widespread autonomous weapons would improve crisis stability, escalation control, and war termination by increasing political leaders' control over their military forces and reducing the risk of accidents or unauthorized escalatory actions.

If, however, unanticipated action by autonomous weapons is likely in realistic environments, then their use would undermine crisis stability, escalation control, and war termination because of the potential for unintended lethal engagements that could escalate conflicts or make it more difficult to end wars. The competing frameworks of normal accident theory and high-reliability organization theory make different predictions about the likelihood of accidents with complex systems. Over 25 case studies of military and non-military complex systems in real-world environments are used to test these competing theories. These include extensive U.S. experience with the Navy Aegis and Army Patriot air and missile defense systems and their associated safety track records. Experience with military and civilian autonomous systems suggests that three key conditions will undermine reliability for fully autonomous weapons in wartime: (1) the absence of routine day-to-day experience under realistic conditions, since testing and peacetime environments will not perfectly replicate wartime conditions; (2) the presence of adversarial actors; and (3) the lack of human judgment to flexibly respond to novel situations. The thesis concludes that normal accident theory best describes the situation of fully autonomous weapons and that militaries are highly unlikely to be able to achieve high-reliability operations with fully autonomous weapons. Unanticipated lethal actions are likely to be normal consequences of using fully autonomous weapons, and militaries can reduce but not entirely eliminate these risks. Recent and potential future advances in artificial intelligence and machine learning are likely to exacerbate these risks, even as they enable more capable systems.
This stems from the challenge of accurately predicting the behavior of machine learning systems in complex environments, their poor performance in response to novel situations, and their vulnerability to various forms of spoofing and manipulation. The widespread deployment of fully autonomous weapons is therefore likely to undermine stability because of the risk of unintended lethal engagements. Four policy and regulatory options are explored to mitigate this risk. Their likelihood of success is assessed based on analysis of over 40 historical cases of attempts to control weapons and the specific features of autonomous weapons. The thesis concludes with three potentially viable regulatory approaches to mitigate risks from fully autonomous weapons.

TABLE OF CONTENTS

Abstract .......................................................................... 3
List of Tables, Images, and Illustrations ........................................ 7
Overview and Key Research Questions .............................................. 8
    Introduction ................................................................. 8
    Topic Overview .............................................................. 14
    Background and Existing Literature .......................................... 17
    Research Questions .......................................................... 41
    Methodology ................................................................. 43
    Outline of the Report and Preview of Conclusions ............................ 55
Part I: Autonomy and Autonomous Weapons ......................................... 57
    The AI and Robotics Revolution .............................................. 57
    The Nature of Autonomy ...................................................... 66
    A History of Automation and Autonomy in Weapons ............................. 71
Part II: Autonomy in Weapons Today and Future Trends ............................ 91
    Autonomy in Robotic Platforms ............................................... 91
    Autonomy in Missiles and Sensors ........................................... 101
    Commercially-Available Components for Autonomous Weapons ................... 112
    Cyberspace and Autonomous Weapons .......................................... 113
Part III: Stability and Autonomous Weapons ..................................... 126
    The Concept of Stability ................................................... 127
    Autonomous Weapons and Stability ........................................... 132
Part IV: Controlling Autonomous Weapons: Risk, Predictability, and Hazard ...... 156
    Understanding Accident Risk: Normal Accident Theory vs. High-Reliability Organizations ... 156
    Case Studies of Non-Military Autonomous Systems in Competitive and Uncontrolled Environments ... 165
    Case Studies in Complex Automated Military Weapon Systems: The Patriot and Aegis ... 180
    Automation as a Source of Both Reliability and Failure ..................... 200
    Nuclear Weapons as a Case Study in Risk Management ......................... 213
    Summary: The Inevitability of