Missing Man: Contextualising Legal Reviews for Autonomous Weapon Systems

DISSERTATION of the University of St. Gallen, School of Management, Economics, Law, Social Sciences and International Affairs to obtain the title of Doctor of Philosophy in Law

submitted by

Fredrik von Bothmer

from

Germany

Approved on the application of

Prof. Dr. Thomas Burri

and

Prof. Dirk Lehmkuhl, PhD

Dissertation no. 4804

Difo-Druck GmbH, Bamberg 2018

The University of St. Gallen, School of Management, Economics, Law, Social Sciences and International Affairs hereby consents to the printing of the present dissertation, without hereby expressing any opinion on the views herein expressed.

St. Gallen, May 22, 2018

The President:

Prof. Dr. Thomas Bieger

Table of Contents

I. Introduction: Key Categories and Terminology ...... 7
   A. Just another New Emerging Weapon Technology? ...... 12
   B. Terminology and Definition ...... 18
II. Legal Challenges posed by AWS: IHL and Responsibility ...... 27
   A. AWS as Means and Methods of Warfare ...... 33
   B. Illegality of AWS per se ...... 34
      1. Basic rules for inherent illegality of weapon systems ...... 34
         a) Indiscriminate weapon by nature ...... 34
         b) Unnecessary suffering or superfluous injury by nature ...... 35
         c) Uncontrollable harmful effects ...... 36
      2. Martens Clause: Dictates of the Public Conscience ...... 36
      3. Preliminary Conclusion: No Illegality of AWS per se ...... 39
   C. Lawful use of AWS ...... 39
      1. Distinction ...... 41
      2. Proportionality ...... 49
      3. Precautions in attacks ...... 51
      4. Implementation of suggested standards ...... 54
      5. Preliminary conclusion: Compliance with IHL principles possible ...... 55
   D. Challenges to the Responsibility for AWS ...... 57
      1. Responsibility framework ...... 59
         a) Articles on State Responsibility: a framework for AWS? ...... 60
         b) Private military contractors: an example of unclear attribution ...... 61
         c) Suggestions to include AWS in the ASR framework ...... 63
            (1) Attribution of AWS conduct to the state? ...... 63
            (2) Responsibility for the acts of state organs operating AWS ...... 65
            (3) “Due diligence” obligations to prevent violations ...... 65
            (4) Article 36 AP I new weapons review: a “link” for state responsibility? ...... 67
      2. Individual Criminal Responsibility ...... 69
         a) Potential addressees of responsibility ...... 70
         b) Command responsibility ...... 71
      3. Preliminary conclusion ...... 76
III. Multilateral Discussions in the UN CCW ...... 77
   A. Structure of the Convention on Certain Conventional Weapons ...... 77
   B. Discussions leading to Informal Meetings of Experts on AWS ...... 78
   C. AWS on the CCW agenda ...... 82
      1. Raising awareness of AWS ...... 83
      2. 2014 Meeting of Experts on LAWS ...... 83
      3. 2015 Meeting of Experts on AWS ...... 84
      4. April 2016 CCW Meeting of Experts on AWS ...... 124
   D. 2016 CCW Review Conference: GGE on AWS ...... 133
   E. CCW GGE Proposals for Regulating AWS ...... 135
      1. Pre-emptive Ban on AWS ...... 137
         a) Characteristics ...... 138
         b) Enforceability ...... 144
         c) Feasibility ...... 146
         d) Poor Precedent ...... 149
      2. Drone Accountability Regime ...... 155
         a) Characteristics ...... 156
         b) Enforceability ...... 161
         c) Feasibility ...... 164
      3. Article 36 AP I as a means to regulate and restrict AWS ...... 169
         a) Characteristics ...... 172
         b) Enforceability ...... 175
         c) Feasibility ...... 176
IV. Legal Review of New Weapons as a Means to Regulate AWS ...... 177
   A. Scope of Article 36 AP I ...... 178
   B. Application to AWS ...... 179
   C. National Weapon Reviews during Procurement and Development ...... 181
      1. The German Legal Review Process ...... 181
         a) Review Authority ...... 182
         b) Character of the Review Decision ...... 182
         c) Framework ...... 183
         d) Review Methodology ...... 184
      2. The Swedish Legal Review Process ...... 185
         a) Review Authority ...... 185
         b) Character of the Review Decision ...... 185
         c) Framework ...... 186
         d) Review Methodology ...... 186
      3. The Swiss Legal Review Process ...... 187
         a) Review Authority ...... 187
         b) Character of the Review Decision ...... 188
         c) Framework ...... 188
         d) Review Methodology ...... 189
      4. The United Kingdom’s Legal Review Process ...... 189
         a) Review Authority ...... 189
         b) Character of the Review Decision ...... 190
         c) Framework ...... 190
         d) Review Methodology ...... 192
      5. The US Legal Review ...... 194
         a) Review Authority ...... 195
            (1) US Defence Acquisition Policies ...... 196
            (2) Law of War Manual: Compliance Test ...... 196
            (3) US Navy Implementing Instructions ...... 196
         b) Character of the Review Decision ...... 197
         c) Framework ...... 198
         d) Review Methodology ...... 199
            (1) Phase Zero: Coordination ...... 200
            (2) Phase One: The Request ...... 200
            (3) Phase Two: The Legal Review ...... 201
            (4) Phase Three: Coordination ...... 204
            (5) Phase Four: Dissemination and Retention ...... 205
         e) The US Department of Defence Directive on Autonomy ...... 205
      6. Shortcomings in Existing Legal Review Systems and Lessons Learned ...... 207
V. Proposal: Contextualising the Legal Review to include AWS ...... 211
   A. Context related Incentives for Autonomy and Technical Difficulties ...... 217
      1. DARPA: The US Technology Innovation Accelerator ...... 222
      2. DARPA Robotics Challenge: “May the Best Robot Win” ...... 226
   B. Categorising Systems According to the Context ...... 230
      1. Defence Close-in Weapon Systems ...... 233
      2. Autonomous Targeting and Attacking Missiles ...... 235
      3. Unmanned Ground Systems (UGS) ...... 238
      4. Unmanned Undersea Systems (UUS) ...... 240
      5. Unmanned Surface Systems (USS) ...... 241
      6. Unmanned Aerial Systems (UAS) ...... 243
      7. Swarm Technology ...... 245
   C. Lawfulness Depends on Operating Environment ...... 249
   D. No Drone Zone: Lessons Learned from Geo-Fencing ...... 254
   E. Autonomous Driving Systems ...... 255
   F. Limitations upon operational assignments for AWS ...... 255
      1. Three-Step-Approach for AWS’ Introduction ...... 258
      2. Deliberate targeting ...... 259
      3. Dynamic targeting ...... 261
      4. IHL compliance of AWS ...... 263
   G. Preliminary Conclusion: Restrictions relevant to the Context ...... 263
   H. AWS Update for the Legal Review ...... 265
      1. Goal: Maintaining Meaningful Human Control ...... 267
      2. First Steps: Domestic Legislation ...... 269
      3. Long-Term: Regulatory Treaty ...... 271
VI. Conclusion ...... 272
VII. Bibliography ...... 276
VIII. Appendix: Curriculum Vitae ...... 329

To my wife Dorothee.


Preface

I am grateful for the support of the Hanns Seidel Foundation and the members of the Contours of a New World Order PhD Programme that allowed me to conduct research at Ludwig-Maximilians-University’s Chair of Public International Law, at The Fletcher School of Law and Diplomacy, and at Harvard University. I am very grateful to Professor Philip Alston for his hospitality at New York University School of Law’s Centre for Human Rights and Global Justice, where I stayed as visiting scholar in residence. I would like to thank Professor Dirk Lehmkuhl, PhD, for his valuable input during my research proposal colloquium. I would like to thank my fellow LL.M. colleague Elizabeth Schéré for her feedback and proofreading of the initial draft. Last but not least, I would like to thank Professor Dr. Thomas Burri, LL.M., for his great mentorship and guidance, which allowed me to focus on such an emerging topic. The latter even led to travelling with him to the Defence Advanced Research Projects Agency’s Robotics Challenge in Los Angeles – looking back, an infinite jest.

Brooklyn Heights, December 30, 2017


The United States has established [a] position on the potential future development of lethal autonomous weapon systems – it neither encourages nor prohibits the development of such future systems.1

Abstract

Emerging autonomous weapon systems (AWS) will soon be able to select targets and kill humans without human control. If the international community does not act quickly to implement effective checks and balances on this issue, limitations or restrictions on certain functions within AWS will no longer be possible, risking a point of no return. Technology is outpacing diplomatic deliberations, as states are currently divided on the question of the legality of these weapons.

Some states hold that a pre-emptive ban is necessary to prohibit offensive AWS, arguing that the human factors that have historically intervened to enforce international norms of conduct will no longer exist, or will survive only in fundamentally limited form. Other states see no need for immediate action, asserting that the risks are exaggerated given that only a few countries, such as the US and Israel, have the means to realistically pursue autonomous systems; and although China and Russia are trying to catch up with the technological development, they are still far behind. This diplomatic divide long prevented action with regard to AWS. The situation is now moving forward, however, thanks to a number of State parties having successfully advocated for the creation of a formal “Group of Governmental Experts”, a forum to discuss potential action. At present, analytical papers are fuelling misunderstanding by relying heavily on hypotheticals informed by science-fiction-like scenarios such as the “Terminator”. This study proposes, in contrast, a regulatory scheme situated between the two extremes and based on existing technology. Toward that end, it focuses on existing weapon systems and prototypes under development to evaluate whether the current legal review of new weapon systems is an effective compatibility check for the laws of war. Building upon this legal analysis, this study further presents an updated proposal for the legal review enshrined in Article 36 of the Protocol Additional to the Geneva Conventions (AP I) that takes into account the “context” of deployment and environment on a case-by-case basis for specific weapons’ capabilities, in order to effectively govern future robotic weapon inventions.

1 MICHAEL W. MEIER, 'U.S. Delegation Opening Statement', (13 April 2015) The Convention on Certain Conventional Weapons (CCW) Informal Meeting of Experts on Lethal Autonomous Weapons Systems, www.unog.ch/80256EDD006B8954/(httpAssets)/8B33A1CDBE80EC60C1257E2800275E56/$file/2015_LAWS_MX_USA+bis.pdf, last visited December 15, 2017.


Kurzfassung

Neuartige autonome Waffensysteme (AWS) werden bald in der Lage sein, auch menschliche Ziele auszuwählen und zu töten. Wenn die internationale Gemeinschaft nicht schnell handelt, sind Verbote oder Beschränkungen bestimmter Funktionen nicht mehr möglich. Die diplomatischen Diskussionen können im Moment mit den technischen Entwicklungen nicht Schritt halten, da die Staatengemeinschaft derzeit in der Frage der Rechtmäßigkeit dieser Waffen uneinig ist. Einige Staaten sind der Ansicht, dass ein Präventivverbot erforderlich ist, um offensive AWS zu verbieten. Andere Staaten sind der Ansicht, dass keine unmittelbaren Maßnahmen erforderlich sind, da ohnehin nur wenige Länder autonome Systeme entwickelten. Kürzlich wurde jedoch eine Gruppe von Regierungsexperten eingesetzt, um mögliche Maßnahmen zu diskutieren. Diese Arbeit schlägt ein Regulierungsschema vor, das sich auf bestehende Waffensysteme und Prototypen in der Entwicklung konzentriert, um zu bewerten, ob die aktuelle rechtliche Überprüfung neuer Waffensysteme eine wirksame Kompatibilitätsprüfung für Kriegsgesetze darstellt. Aufbauend auf dieser rechtlichen Analyse präsentiert diese Arbeit einen Vorschlag zur Aktualisierung der rechtlichen Überprüfung in Artikel 36 des Zusatzprotokolls zu den Genfer Konventionen, welcher den Kontext und die Umgebung spezifischer Waffen in einer fallspezifischen Prüfung berücksichtigt, um künftige Entwicklungen von Roboterwaffen effektiv steuern zu können.


Version abrégée

De nouveaux systèmes d'armes autonomes (AWS) seront bientôt en mesure de choisir des cibles et de tuer des êtres humains sans contrôle humain. Si la communauté internationale n'agit pas rapidement, des interdictions ou des restrictions sur certaines fonctionnalités ne seront plus possibles. Les discussions diplomatiques ne peuvent actuellement pas suivre le rythme des développements techniques, car la communauté internationale est divisée sur la question de la légalité de ces armes. Certains États estiment qu'une interdiction préventive est nécessaire pour interdire les AWS offensifs. D'autres États sont d'avis qu'aucune action immédiate n'est nécessaire, puisque, dans tous les cas, seuls quelques pays ont développé des systèmes autonomes. Récemment, cependant, un groupe d'experts gouvernementaux a été mis en place pour discuter des mesures possibles. Ce travail propose un système de réglementation qui met l'accent sur les systèmes d'armes existants et les prototypes en développement, afin d'évaluer si l'examen juridique actuel des nouveaux systèmes d'armes constitue une vérification efficace de la compatibilité avec les lois de la guerre. Fort de cette analyse juridique, ce travail présente une proposition visant à mettre à jour l'examen juridique prévu à l'article 36 du Protocole additionnel aux Conventions de Genève, qui tient compte du contexte et de l'environnement d'utilisation des armes spécifiques dans un examen au cas par cas, afin de pouvoir encadrer efficacement le développement futur des armes robotiques.


I. Introduction: Key Categories and Terminology

In 1927, Fritz Lang envisioned a half-man, half-machine in his film Metropolis. Ninety years later, we stand at a crossroads: in 2017, more than 70 High Contracting Parties to the United Nations Convention on Certain Conventional Weapons (CCW) convened in the format of a “Group of Governmental Experts” (GGE) to discuss the way forward with Autonomous Weapon Systems (AWS).2 After years of informal discussions, the transition to the GGE process could pave the way to establishing a regulatory framework for this new technology. Meanwhile, the US Department of Defence’s (DoD) Directive “Autonomy in Weapon Systems” initially contained a provision that it would need to be reissued, cancelled, or certified by November 2017 according to DoD regulations.3 The directive was cleared again for public release on 8 May 2017 without substantive changes.4

2 See for details regarding discussions held in the format of “Group of Governmental Experts of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects” 13 to 17 November 2017: THE UNITED NATIONS OFFICE AT GENEVA, '2017 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS)', (2017) www.unog.ch/80256EE600585943/(httpPages)/F027DAA4966EB9C7C12580CD0039D7B5?OpenDocument, last visited December 15, 2017; RAY ACHESON, 'Losing Control: The Challenge of Autonomous Weapons for LAWS, Ethics, and Humanity', (15 November 2017) 5 CCW Report, www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2017/gge/reports/CCWR5.3.pdf, last visited December 15, 2017; JESSICA CUSSINS, 'AI Researchers Create Video to Call for Autonomous Weapons Ban at UN', (14 November 2017) https://futureoflife.org/2017/11/14/ai-researchers-create-video-call-autonomous-weapons-ban-un/, last visited December 15, 2017.
3 UNITED STATES DEPARTMENT OF DEFENSE, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012) www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, para. 7 (b), last visited December 15, 2017.
4 Id. at 4.

Today, driverless cars already appear on the autobahn. It is not necessarily far-fetched to imagine the emergence of a weaponized fighter jet without either a pilot on board or a human remotely controlling the jet. Such AWS would be systems that, once activated, could select and engage targets without further intervention by a human operator.

This might soon be possible. Currently, however, two groups of actors are talking at cross-purposes regarding the adoption and use of such fully autonomous systems. One group advocates a pre-emptive ban, believing AWS can never lawfully be deployed. The other camp considers AWS inevitable and supports the further development of the technology before taking any action towards regulation; this group holds that existing law is sufficient to cover current and future emerging technologies. These arguments are being exchanged in expert meetings under the United Nations (UN) CCW.

Critically, however, almost none of the experts in this fairly new field are engaging with the technology that actually exists. Most actors involved work in the policy-making and law-making fields and lack the technical background necessary to understand the mechanics of these weapon systems. Hence, discussions have, as of yet, not been informed by an examination of the existing technology or prototypes in testing, but rather by highly implausible projected-future scenarios à la Terminator. This failure to grapple with reality is problematic in terms of preparing for the future of artificial intelligence (AI) and for regulating military systems currently in the making.5

This study positions itself in the middle of the “total ban” versus “pro legal status quo” debate. On the one hand, opponents of AWS must accept that a proposed ban is not feasible: the main state parties that would be in a technical and financial position to develop AWS, such as the US, the UK, and Israel, have already openly declared that they do not support a ban.6 On the other hand, without an effective case-by-case analysis of specific AWS, they may be used in a manner that ultimately prevents them from abiding by the fundamental rules of international humanitarian law (IHL). Sensor technology, for example, currently makes it impossible for machines to distinguish between combatants and civilians. In addition, machines cannot recognize whether or not combatants surrender. Hence, a deployment in an environment where both are present, in a scenario in which AWS would have to distinguish between them, would not be lawful. Geographical, temporal, and functional boundaries are therefore crucial for the lawful use of AWS.

5 See THOMAS BURRI, 'The Politics of Robot Autonomy', (2016) 7 European Journal of Risk Regulation, 342 for an additional challenge, namely that “interdisciplinary dialog and understanding are mostly absent” in the current discussions.
6 RAY ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', (11 April 2016) available at: www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/reports/CCWR3.2.pdf, 4, last visited December 15, 2017.

This study proposes that the obligation to conduct legal reviews of new weapons and means of warfare, enshrined in Article 36 of the First Additional Protocol to the Geneva Conventions (AP I), needs to include the context of use when determining the legality of emerging technology. So far, the few existing national review processes of new weapon systems do not consider context. This study posits that this omission is an oversight: not only does the language of Article 36 AP I require it (“in some or all circumstances”), but the legality of a weapon system may also shift depending on the context or environment in which it is used. For example, an urban setting, which requires distinguishing between combatants and civilians and applying the principle of proportionality in attack, may not (yet) be a lawful context for the use of AWS. That being said, deploying AWS underwater against military objects may very well be considered lawful by a legal review. Hence, updating Article 36 AP I legal reviews to include the context of use may close the so-called loophole that AWS create.

The following legal study proposes a realistic look into the future by focusing on the current state of technology and aims to contribute to the discussion by:

• Providing an overview of the reasons for different views on the legality of AWS,
• Defining AWS, and describing the current state of AWS while focusing on prototypes under development,
• Explaining potential policy approaches that the GGE could follow towards AWS,
• Presenting the international obligation to review new weapon systems,
• Explaining existing national legal review processes and their limitations,
• Describing the factors making a ban impractical,
• Concluding with policy advice on how to strengthen the context-of-use consideration in the process of domestic legal reviews, and
• Illustrating how the following key recommendations may lead to international standards: the international exchange of best practices, information, and lessons learned, as well as a thorough domestic implementation of the international obligation to conduct legal reviews of new weapon systems.

Chapter I clarifies the terminology and introduces the reader to the main categories of autonomous systems. Chapter II discusses the legal challenges posed by AWS, in particular under international humanitarian law (IHL), including a potential responsibility gap. Chapter III studies in detail the discussions in the UN Convention on Certain Conventional Weapons forum in Geneva. Chapter IV explains the existing national legal reviews of new weapon systems. Chapter V draws upon the limitations, lessons learned, and best practices of the legal reviews described and presents the proposal that context-related legal reviews make it possible to include AWS. Chapter VI concludes this study.


A. Just another New Emerging Weapon Technology?

At present, the US and Israel are the most advanced states when it comes to unmanned military systems. And though European countries have fewer military capabilities,7 they are on their way to catching up with their aforementioned counterparts. To date, maritime AWS do not seem to be on the agenda of policy makers, even though several experts agree that the likelihood of an actual deployment of AWS is highest in the relatively “uncluttered” maritime environment.8 Since only very few experts9 have looked into legal questions concerning maritime autonomous systems, this study addresses them too.

In the past, societies considered new weapons such as the crossbow, projectiles, firearms, gunpowder, bayonets, airplanes, and submarines to be unlawful at the time of their respective inventions. Today, they are widely used and accepted,10 in particular unmanned aerial combat vehicles (UACV), such as the Predator or the Reaper, which are used in conflicts in Afghanistan or Pakistan.11 Their use, however, remains controversial. The UN Special Rapporteur on targeted killings, Christof Heyns, has warned the international community that the current use of drones poses a risk of global war without borders, and has called on it to discuss how to deal with this emerging and dangerous trend.12 There is a consensus, however, that just because a weapon is novel or employs new technology does not mean that the weapon is automatically illegal.13 New technologies put more distance between the human operator and the actual target on the ground. Attacking from a distance, however, is not a new trend; Homer already describes this phenomenon in the Iliad.14 UAS, as well as more autonomous systems, increase this distance even further. International Humanitarian Law governs the use of UAS in armed conflict. However, this body of law is also applied to their use in situations that do not amount to an armed conflict, thereby providing a holistic framework for how UAS may be used legally.15 As mentioned above, when it comes to military applications of UAS, the US and Israel are still the strongest players. However, China is catching up by selling UAS technology abroad.16

7 KLAUS SCHARIOTH, 'Making ESDP strong will strengthen NATO and the Transatlantic Partnership', in Esther Brimmer (ed.), The EU’s Search for a Strategic Role - ESDP and Its Implications for Transatlantic Relations (Center for Transatlantic Relations, 2002), 166.
8 GEORGE LUCAS, 'Automated Warfare', (2011) 25 Stanford Law & Policy Review, 317-340; Jeffrey Thurnher, Unmanned (Maritime) Systems and Precautions in Attack, International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (13-14 February 2015); Wolff Heintschel von Heinegg, Unmanned Maritime Systems: The Challenges, International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (13-14 February 2015); Wolff Heintschel von Heinegg/Gian Luca Beruto (eds.), International Humanitarian Law and Weapons Technologies (Milan 2012); Wolff Heintschel von Heinegg/Volker Epping (eds.), International Humanitarian Law. Facing New Challenges (Springer, 2007).
9 ROBERT SPARROW, 'Killer Robots', (2007) 24 Journal of Applied Philosophy, 62-77; ANDREW NORRIS, 'Legal Issues Relating to Unmanned Maritime Systems', (2013) US Naval War College Monograph, www.hsdl.org/?view&did=731705, 1-87; LUCAS, 'Automated Warfare', (2011) 25 Stanford Law & Policy Review, 317-340; ANDREW H. HENDERSON, 'Murky Waters: The Legal Status of Unmanned Undersea Vehicles', (2006) 53 Naval Law Review, 55-72; WILLIAM MATTHEWS, 'Murky Waters: Seagoing Drones Swim Into New Legal and Ethical Territory', (9 April 2013) Defense News, http://auvac.org/newsitems/view/2015#sthash.LDZvtqPG.dpuf, 1-2; ROB MCLAUGHLIN, 'Unmanned Naval Vehicles at Sea: USVs, UUVs, and the Adequacy of the Law', (2012) 21 Journal of Law, Information and Science, 100-117; BRUCE BERKOWITZ, 'Sea Power in the Robotic Age', (2014) XXX Issues in Science and Technology (available at: http://issues.org/30-2/bruce-2/), 1-10; OLAF SEIRING, 'Der Einsatz unbemannter Flugsysteme in nicht internationalen bewaffneten Konflikten', (Berliner Wissenschafts-Verlag, 2016).
10 OFFICE OF THE GENERAL COUNSEL DEPARTMENT OF DEFENSE, 'Department of Defense Law of War Manual', (12 June 2015), 314.
11 TONY GILLESPIE & ROBIN WEST, 'Requirements for Autonomous Unmanned Air Systems set by Legal Issues', (2010) 4 The International C2 Journal, 2.
12 UNITED NATIONS NEWS CENTRE, 'UN human rights expert questions targeted killings and use of lethal force', (20 October 2011) www.un.org/apps/news/story.asp?NewsID=40136#.WhH9A7HMy2x, last visited December 15, 2017.
13 DOD, 'Department of Defense Law of War Manual', (12 June 2015), § 6.1.
14 HOMER, The Iliad - Translated by Robert Fagles (Penguin, 1998).
15 NATHALIE WEIZMANN, 'Remotely Piloted Aircraft and International Law', in Michael Aaronson & Adrian Johnson (eds.), Hitting the Target? - How New Capabilities Are Shaping International Intervention (Stephen Austin & Sons, 2013).

Although autonomous technology is developing at high speed, there is no immediate cause for alarm. At the moment, about thirty people operate a Predator or Reaper drone, and another eighty people analyse the incoming information.17 These two systems represent the most sophisticated currently in use. Hence, it is fair to assume that the military will not be replacing humans completely with machines on the battlefield any time soon. However, some experts argue that states have an obligation to develop more precise technology (causing less collateral damage) that would spare their own soldiers’ lives. This argument is also used to back up the decision to develop the armed UAS MALE RPAS (Medium Altitude Long Endurance Remotely Piloted Aircraft System) in a European consortium; the European MALE Programme, configured by Germany, Italy, Spain, and France, kicked off in September 2016.18 Another driving force for more autonomy is the potential cost savings associated with going unmanned in warfare: one of the US Navy’s next-generation air fighters might be the Unmanned Carrier-Launched Airborne Surveillance and Strike (UCLASS) drone.19

16 DUNCAN HUNTER, 'Why is Obama Letting China Beat the US at Global Drone Sales?', (4 December 2015) Defense One (available at: www.defenseone.com/ideas/2015/12/why-obama-letting-china-beat-us-global-drone-sales/124213), last visited December 15, 2017.
17 OREN GROSS, 'The New Way of War: Is There a Duty to Use Drones?', (2015) 67 Florida Law Review, 1.
18 AIRBUS, 'European MALE RPAS (Medium Altitude Long Endurance Remotely Piloted Aircraft System) Programme takes off', (28 September 2016) www.airbus.com/newsroom/press-releases/en/2016/09/european-male-rpas-medium-altitude-long-endurance-remotely-piloted-aircraft-system-programme-takes-off.html, last visited December 15, 2017.

Many perceive the trend towards coding and algorithms in warfare as “killing without a heart”, in particular when critical functions such as targeting and attacking are performed without human control.20 Meanwhile, more offensive systems are being developed. For example, the Office of Naval Research (ONR) revealed the so-called Low-Cost Unmanned Aerial Vehicle Swarming Technology (LOCUST) prototype.21 LOCUST will be ready to demonstrate a ship-based launch of 30 swarming UAVs to autonomously overwhelm a potential adversary.22 Micro robots will increasingly be deployed as swarms of thousands that imitate nature by following a lead agent.23 ONR states that information-sharing technology enables defensive and offensive missions and that the human operator’s role will be to monitor the situation, stepping in and taking control as need be.24

19 PAUL SCHARRE & DANIEL BURG, 'To Save Money, Go Unmanned', (22 October 2014) War on the Rocks (available at: http://warontherocks.com/2014/10/to-save-money-go-unmanned/), last visited December 15, 2017.
20 M. SHANE RIZA, Killing without a Heart. Limits on Robotic Warfare in an Age of Persistent Conflict (Potomac Books, 2013).
21 DAVID SMALLEY, 'LOCUST: Autonomous, swarming UAVs fly into the future', (14 April 2015) www.onr.navy.mil/Media-Center/Press-Releases/2015/LOCUST-low-cost-UAV-swarm-ONR.aspx, last visited December 15, 2017.
22 Id. at 1.
23 MICHAEL RUBENSTEIN, ALEJANDRO CORNEJO & RADHIKA NAGPAL, 'Programmable self-assembly in a thousand-robot swarm', (2014) 345 Science, 795-799.
24 SMALLEY, 'LOCUST: Autonomous, swarming UAVs fly into the future', (14 April 2015) www.onr.navy.mil/Media-Center/Press-Releases/2015/LOCUST-low-cost-UAV-swarm-ONR.aspx, last visited December 15, 2017.

Another example of very sophisticated swarm technology is the Strategic Capabilities Office’s Perdix drone. Perdix is close to becoming operational: in one of its final exams, an autonomous flying swarm of 100 Perdix drones performed tasks together.25 It is very likely that multiple systems or swarms will frequently carry out future offensive operations. Acting as force multipliers in scenarios where speed is the motivation to deploy such systems, the available time for human intervention will most likely be restricted.26 This leads to the assumption that autonomous swarms would be inherently unpredictable due to their configuration.27 Even if the human decision-making process were enhanced through machine-generated data in a so-called “intelligent partnership”, the human would be susceptible to becoming the weaker link.28 In the case of the aforementioned Perdix swarm of 100 drones, this could mean that operators would have too little time to override machine decisions or would overly trust machines. In fact, top military officials are supporting the rise of unmanned combat air vehicles (UCAVs). In April 2015, Secretary of the Navy Ray Mabus stated that the Lockheed Martin F-35 Joint Strike Fighter (JSF) would almost certainly be the last manned strike fighter aircraft of the US Navy and that “unmanned systems, particularly autonomous ones, have to be the new normal in ever-increasing areas”.29 Those “ever-increasing areas” will be investigated further below in this study. In addition, the US Pentagon is spending $3 billion a year on unmanned and autonomous systems.30 However, for the time being, both the US and the UK officially state that they have no intention to develop systems that operate without human intervention in the weapon command and control chain.31 As previously mentioned, there is no guarantee that this policy will not change over time.

25 DAVID MARTIN, 'The Coming Swarm - New generation of drones set to revolutionize warfare', (8 January 2017) CBS News: http://www.cbsnews.com/news/60-minutes-autonomous-drones-set-to-revolutionize-military-technology/, last visited December 15, 2017.
26 MICHAEL BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', (11 to 15 April 2016) www.unog.ch/80256EDD006B8954/(httpAssets)/DDC13B243BA863E6C1257FDB00380A88/$file/ReportLAWS_2016_AdvancedVersion.pdf, para. 68, last visited December 15, 2017.
27 Id. at para. 40.
28 Id. at para. 61.

Other prototypes, such as the British stealth plane TARANIS, can find pre-determined objects with little human intervention. Samsung’s border-control robot SGR-A1 can target individual humans in very narrow circumstances with the help of visual recognition software.32 This demonstrates the time-sensitive nature of an analysis of the impact of emerging technology on the law of armed conflict: the increasing autonomy of military prototypes may potentially lead to irreversible consequences. To really grasp the debate and to be able to differentiate between military and civilian applications, it is crucial to understand what the object under review, AWS, exactly are. The question then

29 SAM LAGRONE, 'Mabus: F-35 Will Be ‘Last Manned Strike Fighter’ the Navy, Marines ‘Will Ever Buy or Fly’', (15 April 2015) https://news.usni.org/2015/04/15/mabus-f-35c-will-be-last-manned-strike-fighter-the-navy-marines-will-ever-buy-or-fly, last visited December 15, 2017. 30 UNITED STATES DEPARTMENT OF DEFENSE, 'Unmanned Systems Integrated Roadmap FY2013-2038, Reference Number: 14-S-0553', (2013) www.defense.gov/pubs/DOD-USRM-2013.pdf, 3, last visited December 15, 2017. 31 UNITED KINGDOM MINISTRY OF DEFENCE, 'The UK Approach to Unmanned Aircraft Systems (UAS)', (30 March 2011) www.gov.uk/government/publications/jdn-2-11-the-uk-approach-to-unmanned-aircraft-systems, para. 508, last visited December 15, 2017; DOD, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012) www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, para. 4, last visited December 15, 2017. 32 RONALD CRAIG ARKIN, Governing lethal behavior in autonomous robots (CRC Press, 2009), 169. arises: are AWS just another new invention, such as the submarine, or do they pose novel challenges that are ethically or legally almost impossible to overcome? This study aims to address the question of whether the law allows the use of AWS.

B. Terminology and Definition

This section explains what the specific characteristics of AWS are, especially in comparison to currently used remote-controlled weapons. So far, there is no internationally agreed-upon definition of AWS. Nevertheless, certain characteristics can be found in almost all of the proposed attempts to define AWS. For example, the International Committee of the Red Cross (ICRC) builds upon the notion that a weapon system can independently select and attack targets. The ICRC explicitly proposes that,

“autonomous weapon systems is an umbrella term that would encompass any type of weapon systems, whether operating in the air, on land or at sea, with autonomy in its ‘critical functions’, meaning a weapon that can select (i.e. search for or detect, identify, track, select) and attack (i.e. use force against, neutralize, damage or destroy) targets without human intervention.”33

33 INTERNATIONAL COMMITTEE OF THE RED CROSS, 'International humanitarian law and the challenges of contemporary armed conflicts', (8-10 December 2015) 32nd International Conference of the Red Cross and Red Crescent, 32IC/15/11, Geneva (available at: www.icrc.org/en/document/international-humanitarian-law-and-challenges-contemporary-armed-conflicts), 44.

The ICRC definition also includes that, “after initial activation, it is the weapon system itself – using its sensors, programming and weapon(s) – that takes on the targeting processes and actions that are ordinarily controlled directly by humans.”34

In the context of maritime autonomous systems, the terminology used for AWS is manifold, ranging from Remotely Operated Underwater Vehicles (ROV), Unmanned Sea Surface Vessels (USSV) and Unmanned Underwater Vehicles (UUV) to Autonomous Underwater Vehicles (AUV). However, two categories must be differentiated: Unmanned Surface Vessels (USV) and Unmanned Underwater Vessels (UUV). This is crucial because the underwater environment presents special obstacles for technology, such as communication challenges.

The US Department of Defence differentiates amongst the following levels of AWS:35

• Semi-AWS, once activated, may only engage individual targets or specific target groups that have been selected by a human operator, or provide terminal guidance to home in on selected targets, provided that human control is retained over the decision to select individual targets and specific target groups for engagement;

• Human-supervised AWS retain the ability for human operators to intervene and terminate engagements, including in the event of a weapon system failure, before unacceptable levels of damage occur;

• Fully AWS, once activated, can select and engage targets without further intervention by a human operator.

34 Id. at 44. 35 DOD, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012) www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, 13-14, last visited December 15, 2017.
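Read together, the three levels differ only in where human input is required. This can be rendered as a minimal Python sketch; the enum values and the decision rule below are purely illustrative paraphrases of the Directive's categories, not language from the Directive itself:

```python
from enum import Enum

class AutonomyLevel(Enum):
    """Illustrative rendering of the levels distinguished by DoD Directive 3000.09."""
    SEMI_AUTONOMOUS = "engages only targets pre-selected by a human operator"
    HUMAN_SUPERVISED = "human operators can intervene and terminate engagements"
    FULLY_AUTONOMOUS = "selects and engages targets without further human input"

def may_engage(level: AutonomyLevel, target_selected_by_human: bool,
               human_supervision_active: bool) -> bool:
    """Sketch of a decision rule: when may the system engage a target?"""
    if level is AutonomyLevel.SEMI_AUTONOMOUS:
        # Engagement is conditional on prior human target selection.
        return target_selected_by_human
    if level is AutonomyLevel.HUMAN_SUPERVISED:
        # Engagement proceeds only while a human retains the ability to abort.
        return human_supervision_active
    # Fully autonomous: once activated, no further human input is required.
    return True
```

The sketch makes visible that the legally salient variable is not the weapon's sophistication but the placement of human control in the engagement decision.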

Although an internationally agreed-upon definition is still missing, certain categories are common among the initial attempts at defining AWS. In succinct terms, AWS are systems that search for and engage targets without human intervention. The Israeli Harpy, for example, is an autonomous weapon system designed to detect, attack and destroy radar emitters according to the manufacturer’s description.36 Northrop Grumman demonstrated full autonomy in taking off from and landing on aircraft carriers, autonomous navigation and aerial refuelling with its X-47B prototype for the US Navy.37 Human Rights Watch38 calls these human “out of the loop” systems, while human “in the loop” systems are semi-autonomous systems. These include so-called “fire and forget” missiles that have been in use by militaries around the world for decades. These missiles have an on-board guidance system that takes over after the operator has selected a specific target. The last category introduced by Human Rights Watch is human “on the loop” systems. Here, a human operator has veto power: he/she can override the specific system’s engagement while monitoring the actions of the system in real time. An example of this category is the Israeli Iron Dome.39 This study will now examine some of the most recent, and publicly available, technological developments in AWS.

36 ISRAEL AEROSPACE INDUSTRIES LTD., 'Harpy Loitering Weapon', (2014) www.iai.co.il/2013/16143-16153-en/IAI.aspx, 1; CHRISTOF HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013) Human Rights Council, 23rd Sess., UN Doc. A/HRC/23/47, para. 45. 37 NORTHROP GRUMMAN CORPORATION, 'X-47B UCAS', (2014) www.northropgrumman.com/Capabilities/X47BUCAS/Pages/default.aspx, last visited December 15, 2017. 38 BONNIE DOCHERTY, 'Losing Humanity. The Case against Killer Robots', (2012) Human Rights Watch/Harvard International Human Rights Clinic, 2; BONNIE DOCHERTY, 'Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban', (2016) Human Rights Watch/Harvard Law School International Human Rights Clinic, 1-56.

So far, with remote-controlled weapons such as armed drones, human operators perform the critical functions of targeting and attacking in real time. In contrast, fully AWS would perform these critical functions on their own, without direct human control. Certain weapon systems in use have automatic or autonomous functions that can be activated for fixed periods and in highly constrained tasks (mostly defensive only), against certain types of targets (objects rather than humans), and in particular environments or circumstances (simple, predictable, constrained, and with a low probability that civilians will be encountered).40

Although human operators control deployed weapons in real time, the trend is to automate the critical functions. At present, at least thirty nations have autonomous weapons. These are mostly defensive types, with human supervision and autonomous features to defend against short-warning attacks.41

39 ZVIKA HAIMOVICH, 'An Israeli Perspective on Ballistic Missile Defense', (5 April 2016) Fletcher School of Law and Diplomacy - ISSP Lecture Series, personal discussion (notes on file with author). 40 ICRC, 'International humanitarian law and the challenges of contemporary armed conflicts', (8-10 December 2015) 32nd International Conference of the Red Cross and Red Crescent, 32IC/15/11, Geneva (available at: www.icrc.org/en/document/international-humanitarian-law-and-challenges-contemporary-armed-conflicts), 45. 41 PAUL SCHARRE, 'Presentation at the UN CCW - Technical Issues I', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3, last visited December 15, 2017.

The systems themselves, and especially their levels of automation, differ.42 There are automatic munitions with simple, less complex mechanical functions, which are used in simple environments as statically deployed systems. In addition, there are precision-guided munitions (PGM), which operate in more complex environments, homing in on predefined target classes or aiming at long-distance locations. Automatic target identification and tracking systems work beyond visual range and prior to weapon release on predefined target signatures or target locations.43 The more sophisticated systems include highly automated or autonomous systems. They operate in difficult but unambiguous environments and have complex functions for air or missile defence and counter-artillery. The human operators on the loop are often under high time pressure while using these weapon systems. Lastly, so-called loitering munitions already have highly complex functions. The environment varies due to different situational contexts.

The task often encompasses searching for and destroying specific target classes in large areas. Here again, humans are “on the loop”, though in practice sometimes already “out of the loop”.44 The ICRC holds that future systems will potentially have “more freedom of action to determine their targets, to operate

42 WOLFGANG RICHTER, 'Military Rationale for Autonomous Functions in Weapons Systems (AWS)', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), 2. 43 Id. at 2. 44 Id. at 3. outside tightly constrained spatial and temporal limits, and to react to rapidly changing circumstances”45.

A common approach to categorizing unmanned systems is to rely on their levels of automation. Experts generally distinguish between three types of systems depending on the degree of human control in their operation: remotely controlled, automatic and autonomous functioning.46

The first and most common of the three categories consists of remotely controlled systems, which a human operator can control at all times in real time.47 These systems can also be described as “human in the loop” systems.48

The second category consists of automated systems. They carry out pre-programmed tasks without further human involvement after activation. These can be described as “human on the loop” systems. Automatic systems are directly

45 ICRC, 'International humanitarian law and the challenges of contemporary armed conflicts', (8-10 December 2015) 32nd International Conference of the Red Cross and Red Crescent, 32IC/15/11, Geneva (available at: www.icrc.org/en/document/international-humanitarian-law-and-challenges-contemporary-armed-conflicts), 45. 46 See for details: PETER ASARO, 'On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making', (2012) 94 International Review of the Red Cross, 690; WILLIAM C. MARRA & SONIA K. MCNEIL, 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1144; HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013) Human Rights Council, 23rd Sess., UN Doc. A/HRC/23/47, para. 42; ALAN BACKSTROM & IAN HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 488; PHILIP ALSTON, 'Lethal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 40. 47 ALSTON, 'Lethal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 40. 48 For an overview of the different levels of human intervention relying on the loop see: DOCHERTY, 'Losing Humanity. The Case against Killer Robots', (2012) Human Rights Watch/Harvard International Human Rights Clinic, 2. controlled by either a human or quantified input parameters with no interpretation by the machine.49 Within the category of automatic systems two sub-categories exist: semi-automatic and fully automatic systems. Semi-automatic systems are those that are able to operate some of their functions in automatic mode.
Other functions remain in remote-control mode (i.e. controlled by a human operator). To differentiate between the different modes and to explain machine decision-making, the so-called OODA Loop (Observe, Orient, Decide, Act) is an effective tool.50 Air Force pilot John Boyd originally developed the OODA Loop to describe human decision-making as a four-step process. According to

Boyd, the advantage in a so-called “dogfight” lies with the fighter pilot who is able to make faster, more accurate decisions.51 If a machine itself carries out all four steps, it can be described as fully automatic. At present, defensive systems such as the Phalanx Close-In Weapons System (CIWS) rely on “autonomously performing its own search, detect, evaluation, track, engage and kill assessment functions”52 because fast reactions are crucial for successful self-defence. Such near-autonomous air and missile defence systems, like the

49 GILLESPIE & WEST, 'Requirements for Autonomous Unmanned Air Systems set by Legal Issues', (2010) 4 The International C2 Journal, 2. 50 MARRA & MCNEIL, 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1144; MARKUS WAGNER, 'Taking Humans out of the Loop: Implications for International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 1-14. 51 MARRA & MCNEIL, 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1145. 52 DEPARTMENT OF THE NAVY, 'U.S. Navy Fact Sheet: MK 15 - Phalanx Close-In Weapons System (CIWS)', (15 November 2013) www.navy.mil/navydata/fact_print.asp?cid=2100&tid=487&ct=2&page=1, last visited December 15, 2017.

Patriot air and missile defence system, have been in service since the eighties.53

Automatic systems, however, only operate well within structured and predictable environments.54
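The OODA-based distinction between remote-controlled, semi-automatic and fully automatic systems can be condensed into a toy classification rule. The following sketch is illustrative only; the mapping of steps to categories follows the discussion above, while the function itself is a construct of this study:

```python
# The four steps of Boyd's OODA Loop.
OODA_STEPS = {"observe", "orient", "decide", "act"}

def classify(machine_steps: set) -> str:
    """Categorise a system by which OODA steps the machine itself performs.

    Rule of thumb sketched from the text: a machine carrying out all four
    steps is fully automatic; one carrying out only some, with a human
    retaining the rest, is semi-automatic; a machine carrying out none
    is remotely controlled.
    """
    performed = machine_steps & OODA_STEPS
    if performed == OODA_STEPS:
        return "fully automatic"
    if performed:
        return "semi-automatic"
    return "remotely controlled"
```

For example, a Phalanx-type system performing its own search, evaluation, engagement and kill assessment would fall into the "fully automatic" branch, whereas an armed drone whose operator decides and acts would not.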

The third category consists of autonomous systems, or systems with humans “out of the loop”. Autonomy in machines requires the ability to operate in a real-world environment without external control after initial activation, at least in some areas of operation, for extended periods of time.55 In the context of AWS, autonomy means that once activated, such systems can select and engage targets without further intervention by a human operator.56 The machine itself makes an autonomous decision regarding the selection of targets and the use of lethal force.57 The real difference is that automatic systems are predictable in their outcome. Autonomous systems, however, have the ability to

53 JOHN K. HAWLEY, 'PATRIOT WARS - Automation and the Patriot Air and Missile Defense System', (January 2017) Center for a New American Security - www.cnas.org/publications/reports/patriot-wars, last visited December 15, 2017. 54 HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013) Human Rights Council, 23rd Sess., UN Doc. A/HRC/23/47, para. 42. 55 PATRICK LINGEORGE BEKEY & KEITH ABNEY, 'Autonomous Military Robotics. Risk, Ethics, Design', (2008) US Department of Navy, Office of Naval Research, http://ethics.calpoly.edu/ONR_report.pdf, 4, last visited December 15, 2017; MARRA & MCNEIL, 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1152. 56 HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013) Human Rights Council, 23rd Sess., UN Doc. A/HRC/23/47, para. 38; WILLIAM H. BOOTHBY, Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors (T.M.C. Asser Press, 2014), 107; MICHAEL N. SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) 19 Harvard National Security Journal, http://harvardnsj.org/2013/02/autonomous-weapon- systems-and-international-humanitarian-law-a-reply-to-the-critics/, 4, last visited December 15, 2017. 57 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) 19 Harvard National Security Journal, http://harvardnsj.org/2013/02/autonomous-weapon-systems-and- international-humanitarian-law-a-reply-to-the-critics/, 4; WILLIAM H. BOOTHBY, 'How Far Will the Law Allow Unmanned Targeting to Go?', in Dan Saxon (ed.), International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff, 2013), 48. 
independently choose between different means to achieve human-designed ends, including the authority to alter such ends, without human intervention.58

Differentiating between high automation and autonomy is crucial. Automated systems are not self-directed and lack decision-making capability.59 Autonomy, on the other hand, requires independent decision-making, going so far as to

“learn, meaning they can draw conclusions based on past experience and incorporate these lessons into future actions”.60 Troy Jones and Mitch Leammukda have proposed a three-tiered approach to determining whether a machine is highly automated or autonomous: the ability to function successfully in uncertain environments, the frequency of operator interaction, and the machine’s level of assertiveness across a variety of possible decisions.61
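The three-tiered approach can be caricatured as a simple checklist. In the sketch below, the numeric thresholds are invented purely for illustration; Jones and Leammukda state qualitative criteria, not numbers:

```python
def autonomy_indicators(handles_uncertain_environments: bool,
                        operator_interactions_per_hour: float,
                        assertiveness: float) -> int:
    """Count how many of the three indicators point towards autonomy
    rather than high automation (illustrative thresholds only)."""
    indicators = 0
    if handles_uncertain_environments:          # criterion 1: uncertain environments
        indicators += 1
    if operator_interactions_per_hour < 1.0:    # criterion 2: rare operator interaction
        indicators += 1
    if assertiveness > 0.5:                     # criterion 3: assertive decision-making
        indicators += 1
    return indicators
```

The point of the checklist is that autonomy is a matter of degree across several dimensions, not a binary property of a system.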

The real challenges arise when systems carry out functions such as detecting and firing at targets without direct human intervention. These autonomous systems are different from the existing pre-programmed systems described above,62 and for that reason, are thoroughly examined in this study.

58 MARRA & MCNEIL, 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1154; BOOTHBY, Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors (T.M.C. Asser Press, 2014), 104. 59 MARRA & MCNEIL, 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1150. 60 PETER W. SINGER, Wired for War - The Robotics Revolution and Conflict in the Twenty-first Century (Penguin Books, 2009), 75-77. 61 TROY JONES & MITCH LEAMMUKDA, 'Requirements-Driven Autonomous System Test Design: Building Trusting Relationships', (2011) www.researchgate.net/publication/228598990, last visited December 15, 2017. 62 WAGNER, 'Taking Humans out of the Loop: Implications for International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 8.

II. Legal Challenges posed by AWS: IHL and Responsibility

Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.

Ian Malcolm, Jurassic Park

The governance of the use of weapon systems by the law of armed conflict (LOAC), which is largely based on the Geneva Conventions of 1949 and the 1977 Additional Protocol I, is twofold. On the one hand, weapons law indicates whether a weapon in itself, and its common uses, are lawful. On the other hand, targeting law determines whether the use of a particular weapon is lawful based on its specific mission or in specific circumstances.63

Throughout history, emerging weapon systems have challenged existing law.64 Legal research has so far focused mainly on remotely controlled systems and targeted killings, for both of which the legal assessment mostly follows that of conventional warfare.65 Although there are contributions that deal with AWS,

63 UK MINISTRY OF DEFENCE, 'The UK Approach to Unmanned Aircraft Systems (UAS)', (30 March 2011) www.gov.uk/government/ publications/jdn-2-11-the-uk-approach-to-unmanned-aircraft-systems, para. 502, last visited December 15, 2017. 64 JOHN A. BURGESS, 'Emerging Technologies and the Security of Western Europe', in Stephen J. Flanagan & Fen Osler Hampson (eds.), Securing Europe’s Future: Changing elements of European security (Dover, 1986), 64-84. 65 See NEW YORK CITY BAR ASSOCIATION - COMMITTEE ON INTERNATIONAL LAW, 'The Legality under International Law of Targeted Killings by Drones launched by the United States', (June 2014) http://bit.ly/1lKPEuV, last visited December 15, 2017; MEDEA BENJAMIN, Drone Warfare. Killing by Remote Control (Verso, 2012); NICK TURSE & TOM ENGELHARDT, Terminator Planet: The First History of Drone Warfare, 2001-2050 (Dispatch Books, 2012); ARMIN KRISHNAN, Gezielte Tötung: Die Zukunft des Krieges (Matthes & Seitz, 2012); MARTIN FELIX HÖFER, Gezielte Tötungen - Terrorismusbekämpfung und die neuen Feinde der Menschheit (Mohr Siebeck, 2013), 96-99; ELISABETH STRÜWER, Zum Zusammenspiel von humanitärem Völkerrecht und Menschenrechten am Beispiel des Targeted Killing (Peter Lang, 2009), 51-54; 27 they are often concerned with specific aspects, such as the morals and ethics surrounding the use of increased autonomy in UAS.66 To date, there is no comprehensive study of ground, sea, and underwater systems. Although experts67 agree on the fact that AWS pose challenges that need to be addressed quickly, neither the UN nor individual states have a proper policy or specific legal provisions in place regarding AWS. Only very few states, such as the

United States68 and the United Kingdom69, have adopted a preliminary public policy on AWS. Moreover, the former has initiated a military development program for air, ground and underwater systems.70 Opponents of AWS lobby for a pre-emptive ban. In particular, civil society organizations hold that the principles of international humanitarian law (i.e. the principles of

JORDAN J. PAUST, 'Self-Defense, Targeting of Non-State Actors and Permissibility of U.S. Force of Drones in Pakistan', (2009) 19 Journal of Transnational Law and Policy, 237-280, 241-242; JORDAN J. PAUST, 'Use of Armed Force Against Terrorists in Afghanistan, Iraq, and Beyond', (2002) 35 Cornell International Law Journal, 533-557, 538; WILLIAM BOOTHBY, The Law of Targeting (Oxford University Press, 2012); NILS MELZER, Targeted Killing in International Law (Oxford University Press, 2008); ROLAND OTTO, Targeted Killings and International Law – With Special Regard to Human Rights and International Humanitarian Law (Springer, 2012); STEVEN J. ZALOGA, Unmanned Aerial Vehicles: Robotic Aerial Vehicles, 1917-2007 (Osprey Publishing, 2008). 66 ARMIN KRISHNAN, Killer Robots - Legality and Ethicality of Autonomous Weapons (Farnham, 2009); ROBIN BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts. Anforderungen an das Konstruktionsdesign und Einsatzbeschränkungen (Duncker & Humblot, 2014); NEHAL BHUTA, SUSANNE BECK, ROBIN GEISS, HIN-YAN LIU & CLAUS KRESS, Autonomous Weapons Systems – Law, Ethics, Policy (Cambridge University Press, 2016). 67 REBECCA CROOTOF, 'The Killer Robots Are Here: Legal and Policy Implications', (2015) 36 Cardozo Law Revue, 1837-1915; BONNIE DOCHERTY, 'The Trouble with Killer Robots. Why we need to ban fully autonomous weapons systems, before it's too late', http://foreignpolicy.com/2012/11/19/the-trouble-with-killer-robots/, last visited December 15, 2017; DOCHERTY, 'Losing Humanity. The Case against Killer Robots', (2012) Human Rights Watch/Harvard International Human Rights Clinic, 1-55; NOEL E. 
SHARKEY, 'The evitability of autonomous robot warfare', (2012) 94 International Review of the Red Cross, 787-799; ARTICLE 36, 'Memorandum from Article 36 for Delegates to the Convention on Certain Conventional Weapons (CCW), Structuring Debate on Autonomous Weapon Systems', (14 November 2013) www.article36.org/wp- content/uploads/2013/11/Autono-mous-weapons-memo-for-CCW.pdf, last visited December 15, 2017. 68 DOD, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012) www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, 1-15, last visited December 15, 2017. 69 UNITED KINGDOM MINISTRY OF DEFENCE, 'Unmanned aircraft systems (Joint Doctrine Publication 0-30.2)', (12 September 2017) www.gov.uk/government/publications/unmanned-aircraft-systems-jdp-0-302, 1-84, last visited December 15, 2017. 70 UNITED STATES AIR FORCE, 'Unmanned Aircraft Systems Flight Plan 2009-2047 (Washington D.C.)', (2009) fas.org/irp/program/collect/uas_2009.pdf, 41, last visited December 15, 2017. 28 proportionality, distinction, necessity, and the dictates of humanity), pose such intricate challenges that AWS cannot comply with them fully.71 Critics of AWS often argue that the use of force will only increase once machines replace human soldiers on the battlefields.72 Governments will presumably resort to the use of force more often if their own human forces are not endangered. Recently, more than 8,000 artificial intelligence and robotics experts, including Stephen

Hawking, Elon Musk, Noam Chomsky, and Steve Wozniak, signed an Artificial Intelligence open letter initiated by the Future of Life Institute calling for a ban on AWS.73 The letter’s signatories concluded that “starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”74 The initiative builds upon the first of a series of papers by Human Rights Watch and the International Human Rights Clinic at Harvard Law School addressing challenges posed by AWS.75 Joining the chorus lobbying for a pre-emptive ban are the Campaign to Stop Killer Robots, a global coalition of 61 international, regional, and national non-governmental organizations (NGOs) in 26 countries that calls for a pre-emptive ban on fully AWS,76 and the International Committee on Robot Arms Control (ICRAC). In addition, these opponents believe that AWS need a responsible

71 DOCHERTY, 'Losing Humanity. The Case against Killer Robots', (2012) Human Rights Watch/Harvard International Human Rights Clinic, 1-55. 72 NOEL SHARKEY, 'Killing Made Easy: From Joysticks to Politics', in Patrick Lin, Keith Abney & George A. Bekey (eds.), Robot Ethics - The Ethical and Social Implications of Robotics (MIT Press, 2012), 111-128. 73 FUTURE OF LIFE INSTITUTE, 'An Open Letter - Research Priorities for Robust and Beneficial Artificial Intelligence', https://futureoflife.org/ai-open-letter, last visited December 15, 2017. 74 FUTURE OF LIFE INSTITUTE, 'Autonomous Weapons: An Open Letter from AI & Robotics Researchers', https://futureoflife.org/open-letter-autonomous-weapons, last visited December 15, 2017. 75 DOCHERTY, 'Losing Humanity. The Case against Killer Robots', 1-55. 76 See www.stopkillerrobots.org, last visited December 15, 2017. agent that can be held accountable in case of errors.77 Others do not explicitly state that AWS would be unlawful under existing law, but call for a moratorium at least until discussions lead to an international regulatory agreement on AWS.78 In contrast, several experts hold that no action is required because existing law sufficiently covers AWS.

Challengers of a pre-emptive ban hold that, given the lack of knowledge about AI, it would be premature to ban a whole category of weapons that may have the potential to be more precise or to save the lives of operating soldiers.79 In addition, given the lack of support from the relevant state parties, a ban does not seem feasible for the foreseeable future; indeed, the likelihood of an outright pre-emptive ban on AWS in the immediate future is extremely low. Supporters of AWS also stress that potential advantages cannot be explored or pursued if the international community of states rushes to vote for a prohibition.80 Moreover, these supporters hold that AWS could be programmed without the instinct of self-preservation, so that they could sacrifice themselves if necessary. In addition, enhanced loitering time allows for better battlefield

77 PETER ASARO, 'A Body to Kick, but Still No Soul to Damn: Legal Perspectives on Robotics', in Patrick Lin, Keith Abney & George A. Bekey (eds.), Robot ethics: the ethical and social implications of robotics (MIT Press, 2012), 169-186. 78 HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013) Human Rights Council, 23rd Sess., UN Doc. A/HRC/23/47, 20-21. 79 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) 19 Harvard National Security Journal, http://harvardnsj.org/2013/02/autonomous-weapon-systems-and-international-humanitarian-law-a-reply-to-the-critics/, last visited December 15, 2017; MICHAEL N. SCHMITT, 'Unmanned Combat Aircraft Systems and International Humanitarian Law: Simplifying the oft benighted Debate', (2012) 30 Boston University International Law Journal, 595-619; MICHAEL N. SCHMITT & JEFFREY S. THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', (2013) 4 Harvard National Security Journal, 231-281; KENNETH ANDERSON, DANIEL REISNER & MATTHEW WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 386-411; KENNETH ANDERSON & MATTHEW C. WAXMAN, 'Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can', (2013) Columbia Public Law Research Paper, 1-33. 80 ARKIN, Governing lethal behavior in autonomous robots (CRC Press, 2009).
observation, and the lack of emotions could result in fewer human rights violations stemming from anger, revenge or the pleasure of killing.81 Military experts have emphasized the advantages of increased autonomy: multiplying combat power, reducing the risks facing their own forces, freeing personnel to conduct more complex tasks, and lowering costs.82 However, compliance with international humanitarian law principles heavily depends on technological capabilities like sensor recognition and data processing. Supporters of AWS even argue that they may be able to comply with the law of armed conflict, if

“Moore’s law”, which predicts that computing capabilities approximately double every two years, holds true, and artificial intelligence becomes possible in the future.83
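The arithmetic behind that projection is simple compound growth; a brief sketch:

```python
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor if capability doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Over a decade, a two-year doubling period implies a 32-fold increase:
assert moores_law_factor(10) == 32.0
```

Whether such exponential growth in raw computing power translates into the sensing and judgment capabilities that IHL compliance requires is, of course, the contested question.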

At present, AWS face technical difficulties such as poor sensor technology and/or unpredictable environments.84 These weapons also face ethical concerns. Should machines be allowed to make life-or-death decisions

81 Id. at 29-33; RONALD CRAIG ARKIN, 'Lethal Autonomous Weapons Systems and the Plight of the Noncombatant (Presentation at the CCW informal expert meeting on 13 May 2014)', www.unog.ch/80256EDD006B8954/(httpAssets)/ FD01CB0025020DDFC1257CD70060EA38/$file/Arkin_LAWS_technical_2014.pdf, last visited December 15, 2017. 82 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 393. 83 GORDON E. MOORE, 'Cramming more components onto integrated circuits', (1965) reprinted in: 86 Proceedings of the IEEE, No. 1 (1998), 82-85; on the development of artificial intelligence see: AMIR HUSAIN, The Sentient Machine: The Coming Age of Artificial Intelligence (Scribner, 2017); JAMES BARRAT, Our Final Invention: Artificial Intelligence and the End of the Human Era (St. Martin's Griffin, 2015); MAX TEGMARK, Life 3.0: Being Human in the Age of Artificial Intelligence (Knopf, 2017); PEDRO DOMINGOS, The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World (Basic Books, 2015); TIME, Artificial Intelligence: The Future of Humankind (TIME-Life, 2017). 84 JONATHAN DAVID HERBACH, 'Into the Caves of Steel: Precaution, Cognition and Robotic Weapon Systems under the International Law of Armed Conflict', (2012) 4 Amsterdam Law Forum, 18; NOEL SHARKEY, 'Automating Warfare: Lessons Learned from the Drones', (2011) 21 Journal of Law, Information and Science, 143. 
instead of humans?85 Could AWS comply with Just War Theory?86 Beyond these ethical and philosophical concerns, several experts have raised international humanitarian law issues posed by AWS.87 This study focuses on the compliance of AWS with the latter,88 analysing whether the development or use of AWS would violate international humanitarian law.89 If these weapons do not comply, this study proposes the technical and legal requirements they would need to fulfil in order to be lawfully deployed and used. Technical capabilities, as a prerequisite to abiding by the laws of war, are also covered where necessary. Most importantly, however, at the core of these discussions lies the applicability of international humanitarian law principles.

Firstly, however, this study examines whether AWS can be categorized within established standards.

85 PAUL SCHARRE, Army of None: Autonomous Weapons and the Future of War (W. W. Norton & Company forthcoming 2018). 86 PATRICK LIN, GEORGE BEKEY & KEITH ABNEY, 'Robots in War: Issues of Risk and Ethics', in R. Capurro and M. Nagenborg (eds.), Ethics and Robotics (IOS Press, 2009), 49; SPARROW, 'Killer Robots', (2007) 24 Journal of Applied Philosophy, 74; ASARO, 'On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making', (2012) 94 International Review of the Red Cross, 695; SHARKEY, 'The evitability of autonomous robot warfare', (2012) 94 International Review of the Red Cross, 708. 87 DOCHERTY, 'Losing Humanity. The Case against Killer Robots', 32; SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', (2013) 4 Harvard National Security Journal, 231; ALSTON, 'Lethal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 54; MARKUS WAGNER, 'Autonomy in the Battlespace: Independently Operating Weapon Systems and the Law of Armed Conflict', (2013) International Humanitarian Law and the Changing Technology of War, 99. 88 This study makes reference to international humanitarian law as codified for international armed conflicts in Protocol Additional 1 to the Geneva Conventions or, if states are not party, the respective customary international law will be applied. 89 Existing law must always be the starting point because "the key to integrating automation, robotics and AI into our lives is to identify the potential legal issues even if the underlying law is uncertain or developing", see: THOMAS BURRI & ISABELLE WILDHABER, 'Introduction to the Special Issue of the European Journal of Risk Regulation: The Man and the Machine – When Systems Take Decisions Autonomously', (2016) 7 European Journal of Risk Regulation, 296.

A. AWS as Means and Methods of Warfare

Categorizing AWS as combatants or “virtual combatants”90 is not convincing.

Firstly, existing law does not support treating machines like humans.91 Secondly, classifying machines as combatants runs counter to the object and purpose of the status of combatants, which is to safeguard persons.92

Classifying AWS as combatants is also not necessary because they can be categorized as weapons or as means or methods of warfare. Since weapons are not specifically defined within IHL, the question is whether an item has an offensive capability that can be applied to a military object or an enemy combatant.93

Weapons are commonly referred to as instruments used in fighting.94 However, the broader term "means and methods of warfare" is better suited to cover AWS. If this is indeed the case, IHL already provides a framework for differentiating between lawful and unlawful AWS. This is so because AWS are systems consisting of a platform that can be mounted with various kinds of weapons and an on-board guidance system. The question remains whether AWS should be considered means or methods of warfare. While means of warfare encompass weapons or the respective weapon systems, methods refer to the tactics, techniques and procedures (TTP) of armed conflict.95 This is why AWS should be considered a means of warfare.96 However, deploying a swarm of micro AWS to target a certain vehicle in a specific area constitutes a method of warfare.97 Firstly, the possibility of unlawful AWS per se is analysed. Secondly, the potential for a lawful use of AWS is addressed.

90 DAVID AKERSON, 'The Illegality of Offensive Lethal Autonomy', in Dan Saxon (ed.), International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff, 2013), 88. 91 RIEKE ARENDT, Völkerrechtliche Probleme beim Einsatz autonomer Waffensysteme (Berliner Wissenschafts Verlag, 2016), 47. 92 HIN-YAN LIU, 'Categorization and legality of autonomous and remote weapons systems', (2012) 94 International Review of the Red Cross, 629. 93 JUSTIN MCCLELLAND, 'The Review of Weapons in Accordance with Article 36 of Additional Protocol I', (2003) 85 International Review of the Red Cross, 404. 94 See http://thelawdictionary.org/weapon, last visited December 15, 2017.

B. Illegality of AWS per se

1. Basic rules for inherent illegality of weapon systems

Extremely restrictive conditions must be met for weapon systems to be inherently illegal, and so far, few weapons have fulfilled them.98

a) Indiscriminate weapon by nature

Article 51 IV b AP I refers to means and methods of combat which cannot be directed at a specific military objective, i.e. weapons that are indiscriminate by their very nature. This rule is concerned with the nature of the weapon itself, its design purpose, or its intended "normal uses".99 A narrow reading of such clauses is supported by the example of the prohibition of blinding lasers in Article 1 of Protocol IV to the Convention on Certain Conventional Weapons, which is expressly limited to "laser weapons specifically designed, as their sole combat function or as one of their combat functions, to cause permanent blindness to unenhanced vision".

95 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', (2013) 4 Harvard National Security Journal, 271. 96 BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 493; BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 41. 97 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', (2013) 4 Harvard National Security Journal, 271. 98 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', 399, making reference to Judge Advocate General, U.S. Air Force, AFI51-402, Legal Reviews of Weapons and Cyber Capabilities 3.1.1, 3.1.2 (2011). 99 WILLIAM H. BOOTHBY, Weapons and the Law of Armed Conflict (Oxford University Press, 2009), 69-85.

A weapon's nature must be distinguished from particular operational circumstances in which AWS are unable to distinguish between civilians and combatants. The crux of the issue is whether these weapons, by their nature or design, are incapable of abiding by the principle of distinction in any possible circumstance.100 Very few weapons meet this high standard. For AWS specifically, the focus must be on the potentially indiscriminate use of the actual weapon system; the mere fact of being autonomous does not suffice to meet this very restrictive condition.

b) Unnecessary suffering or superfluous injury by nature

This rule of Article 35 II AP I is applicable to combatants only and does not aim at protecting civilians. The threshold for prohibited suffering of combatants in armed conflict is relatively high. Examples of prohibited inhumane suffering include needlessly aggravated wounds, such as those caused by shells filled with glass shards that cannot be detected even by x-rays.101 That being said, AWS as such would not fall under this provision. At most, it may restrict certain weapons mounted onto AWS.

100 YORAM DINSTEIN, The conduct of hostilities under the law of armed conflict (Cambridge University Press, 2010), 62. 101 YVES SANDOZ, CHRISTOPHE SWINARSKI & BRUNO ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I) (Martinus Nijhoff Publishers, 1987), 1419.

c) Uncontrollable harmful effects

Article 51 IV c AP I requires weapons not to have uncontrollable harmful effects. Typical examples of such uncontrollable effects are biological or chemical weapons, namely a virus that cannot be controlled once released.102 Although autonomy, per definitionem, requires self-control of machines, its "effects" are not uncontrollable under this provision.103

2. Martens Clause: Dictates of the Public Conscience

The Martens clause has been proposed as justifying a prohibition of AWS.104 The argument is that AWS should be banned because they go against considerations of humanity and the dictates of public conscience.105

Invoking the Martens clause as a supplementary or residual protection, in the absence of specific treaty or customary law, to outlaw a specific weapon is controversial.106 Proposed by the Russian publicist Fyodor Fyodorovich Martens, the clause was inserted in the preamble of the 1899 Hague Convention II regulating the laws and customs of war on land and was restated in the 1907 Hague Convention IV. The clause originally read:

102 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 400. 103 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 250. 104 DOCHERTY, 'Losing Humanity. The Case against Killer Robots', 35. 105 AKERSON, 'The Illegality of Offensive Lethal Autonomy', in Saxon (ed.), International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff, 2013), 90-96. 106 THEODOR MERON, 'The Martens Clause, Principles of Humanity, and Dictates of Public Conscience', (2000) 94 American Journal of International Law, 79.

"Until a more complete code of the laws of war is issued, the High Contracting Parties think it right to declare that in cases not included in the Regulations adopted by them, populations and belligerents remain under the protection and empire of the principles of international law, as they result from the usages established between civilized nations, from the laws of humanity, and the requirements of the public conscience."

Today, the Martens clause can be found in Article 1 II of Protocol Additional I to the Geneva Conventions of 1949, where it reads: "In cases not covered by this Protocol or by other international agreements, civilians and combatants remain under the protection and authority of the principles of international law derived from established custom, from the principles of humanity and from dictates of public conscience." That being said, to be successfully invoked to prohibit AWS, the clause would need to impose binding prohibitions of its own, which it does not. In addition, the attempt to establish humanity and public conscience as new sources of international law has had no effect.107 In fact, although the general applicability of the Martens clause is almost unanimously accepted, the legal substance of the clause is open to a number of interpretations.108

107 ANTONIO CASSESE, 'The Martens Clause: Half a Loaf or Simply Pie in the Sky?', (2000) 11 European Journal of International Law, 193. 108 EMILY CRAWFORD, 'The Modern Relevance of the Martens Clause', (2006) 6 ISIL Yearbook of International Humanitarian and Refugee Law, 19.

William Boothby holds that the Martens clause should not be interpreted as providing legal protection in its own right, but should serve as a basis for the relationship between customary and treaty law. In this way, in particular with a view to new technologies or methods of warfare, the Martens clause stipulates that warfare cannot be conducted in a total absence of law.109 The Commentary on the Additional Protocols to the Geneva Conventions states that the Martens clause should be interpreted as preventing the assumption that anything not explicitly prohibited is permitted (a contrario).110 Emily Crawford holds that the Martens clause alone cannot outlaw certain methods or means of warfare; it should rather be used as an interpretative tool in conjunction with other general principles of humanitarian law.111 All of the above interpretations require additional customary or treaty law provisions in order to give meaning to considerations of humanity and public conscience. The Martens clause alone cannot prohibit the development and deployment of AWS,112 especially since treaty law governs the legality of weapon systems, with some treaties bearing directly on the development of AWS.113 Marco Sassòli adds that the clause should not be mistaken for a condition that only humans may operate weapons; instead, it should be interpreted as favouring the means and methods of warfare which are most effective in protecting humanity, regardless of whether humans, machines or a combination of the two pursue this goal.114

109 BOOTHBY, Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors (T.M.C. Asser Press, 2014). 110 SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, para. 55. 111 CRAWFORD, 'The Modern Relevance of the Martens Clause', (2006) 6 ISIL Yearbook of International Humanitarian and Refugee Law, 20; MERON, 'The Martens Clause, Principles of Humanity, and Dictates of Public Conscience', (2000) 94 American Journal of International Law, 88 ("pushing the Martens clause beyond reasonable limits"). 112 TYLER D. EVANS, 'At War with the Robots: Autonomous Weapon Systems and the Martens Clause', (2013) 41 Hofstra Law Review, 726. 113 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) 19 Harvard National Security Journal, http://harvardnsj.org/2013/02/autonomous-weapon-systems-and-international-humanitarian-law-a-reply-to-the-critics/, 32, last visited December 15, 2017.

3. Preliminary Conclusion: No Illegality of AWS per se

For an illegality per se, there may not be any possible kind of lawful use. The ICRC acknowledges that the question is not whether "new technologies are good or bad in themselves, but instead what are the circumstances for their use."115 Just like other long-range or over-the-horizon weapons, AWS should not be considered illegal per se. Almost every weapon can be misused in a way that is illegal; hence, the key question is under which conditions, according to customary and treaty provisions, the use of AWS is lawful.

C. Lawful use of AWS

There is almost universal consensus that international humanitarian law applies to AWS.116 A legal assessment of weapon systems must include compliance by operating personnel with the laws of armed conflict when activating or deploying weapon systems. AWS may not be deployed in environments for which they are not approved due to their operational design or technological standard. This will most likely be the case when civilians are present. However, the correlation between weapon systems and human operators remains essential even in the case of increased autonomy.117 The unpredictability of AWS decision-making, and the accompanying loss of human comprehension of how the systems "think", means that individuals cannot predict their conduct in advance.118 Here, the focus is on machine learning and artificial intelligence technologies. Today, in both civil and military life, humans use systems they do not fully understand. This does not mean that AWS cannot lawfully be used within the existing legal framework. Operators, however, need to have a general understanding of operational processes as well as the system's functions and limitations.

114 MARCO SASSÒLI, 'Is increasing dehumanization leading to less humanity in warfare?', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017; ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', 410. 115 INTERNATIONAL COMMITTEE OF THE RED CROSS, 'IHL and the challenges of contemporary armed conflict: Report prepared for the 31st conference', (2011) www.icrc.org/eng/resources/documents/report/31-international-conference-ihl-challenges-report-2011-10-31.htm, 40, last visited December 15, 2017. 116 Id. at 36.

In the absence of a new treaty banning AWS, weapon systems must comply with the law of armed conflict. States may not resort to a means or method of warfare just because it is available and practical. The "cardinal principles" laid out in the International Court of Justice's Nuclear Weapons Advisory Opinion include the principle of distinction and the prohibition of methods of warfare causing unnecessary suffering, which are binding as customary international law upon all parties to a conflict.119 Since AWS do not yet exist, the following section presents which of the aforementioned provisions can be programmed into AWS and which cannot. This should serve to determine where to draw the line between acceptable and unacceptable autonomy in AWS.

117 WILLIAM H. BOOTHBY, 'Dehumanization of warfare – Is there a legal problem, in particular concerning Art. 36 AP I?', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017. 118 RYAN CALO, 'Robotics and the New Cyberlaw', (2015) 103 California Law Review, 125-126.

The question at hand is whether it will be possible to develop a computer program that enables AWS to distinguish legitimate military targets (military objectives, combatants, and civilians directly participating in hostilities) from protected persons and objects. If this is possible, AWS may be lawfully used in combat. Hence, this study will analyse the core IHL principles to uncover the feasibility of including them in AWS design and programming.

1. Distinction

One of the hardest tasks for AWS is distinguishing between combatants and civilians. The principle of distinction is codified in Articles 48, 51 II and 52 II of AP I but is also considered to be customary international law.120 Correspondingly, Rule 1 of the ICRC's Customary International Humanitarian Law study stipulates that parties to a conflict must at all times distinguish between civilians and combatants, and that attacks may only be directed against combatants and not against civilians.121

119 INTERNATIONAL COURT OF JUSTICE, 'Legality of the Threat or Use of Nuclear Weapons, (Advisory Opinion of 8 July 1996)', (1996) I.C.J. Reports, 226-227. 120 JEAN-MARIE HENCKAERTS & LOUISE DOSWALD-BECK, Customary International Humanitarian Law - Volume I (Cambridge University Press, 2009), 3-4. 121 Id. at 3.

The distinction principle can be broken down into the following considerations: distinguishing between military and non-military objects, distinguishing between combatants and civilians, recognising civilians directly participating in hostilities, and, lastly, being able to spare those hors de combat.

Firstly, AWS would need to be able to differentiate between military and civilian objects. Article 52 II AP I offers a definition:

"In so far as objects are concerned, military objectives are limited to those objects which by their nature, location, purpose or use make an effective contribution to military action and whose total or partial destruction, capture or neutralization, in the circumstances ruling at the time, offers a definite military advantage."

Article 52 III AP I, on the other hand, provides a non-exhaustive list of examples of objects presumed to be civilian, such as "a place of worship, a house or other dwelling or a school".

As mentioned above, the definition of military objects contains two criteria: (1) the object has to make an effective contribution to military action, and (2) its total or partial destruction, capture or neutralization needs to offer a definite military advantage. Firstly, AWS would need to be programmed in such a way that they could effectively differentiate between civilian and military objects. This seems to be technically possible insofar as targets fall under such categories as fighter jet, tank or military barracks. In addition, AWS would also have to be able to decide whether objects make an effective contribution to the military action of the enemy forces. This does not seem easily amenable to programming. The categories of "nature, location, purpose or use" by which military objects are defined are already described in detail in the literature.122 The "nature" category, for example, comprises all objects directly used by armed forces, such as weapons, equipment, depots, buildings occupied by armed forces or staff headquarters.123 An object qualifying under the criterion of "location" has by nature no military function but contributes to military action nonetheless, e.g. bridges or other constructions.124 While "use" represents a present function, "purpose" is concerned with intended future uses of objects. These encompass civilian objects that can become useful military objects, such as schools or hotels that are used to accommodate troops or headquarters staff.125 It should be noted that the military-objects-by-nature category seems to be programmable into AWS because it consists of objects directly used by military forces.126 The second category, location, also seems to be programmable into AWS because it represents a fixed point that could be geographically defined and prioritised in advance by a military commander. The last two categories, present use and future purpose, however, require a case-by-case judgement that cannot be pre-programmed into AWS. In addition, AWS would need to decide whether those objects mentioned above could actually make an effective contribution to military action. To conclude, a target recognition system in AWS should be programmed to target only objects that are military objectives by nature or by location, because these can be pre-programmed and may guarantee proper identification.127

122 See for example BOOTHBY, The Law of Targeting (Oxford University Press, 2012). 123 SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, para. 2020. 124 Id. at 2021. 125 Id. at 2022. 126 BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 492.
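The restriction suggested above (autonomous engagement only of objects that are military objectives by nature or by location, and only upon reliable sensor identification) can be sketched as a simple whitelist rule. The sketch below is purely illustrative: the category names, the confidence values and the 0.95 threshold are assumptions introduced here for illustration, not part of any actual AWS design.

```python
from enum import Enum

class Category(Enum):
    MILITARY_BY_NATURE = "nature"      # e.g. tank, fighter jet, military barracks
    MILITARY_BY_LOCATION = "location"  # e.g. a bridge pre-designated by a commander
    MILITARY_BY_USE = "use"            # present function: requires case-by-case judgement
    MILITARY_BY_PURPOSE = "purpose"    # intended future use: requires case-by-case judgement
    UNKNOWN = "unknown"

# Only the first two categories can be reliably pre-programmed (see text);
# "use" and "purpose" are deliberately excluded from autonomous engagement.
PROGRAMMABLE = {Category.MILITARY_BY_NATURE, Category.MILITARY_BY_LOCATION}

def may_engage(category: Category, recognition_confidence: float,
               min_confidence: float = 0.95) -> bool:
    """Permit engagement only for objects that are military objectives by
    nature or by location AND that the sensors identified with high confidence."""
    return category in PROGRAMMABLE and recognition_confidence >= min_confidence

# Illustrative use: a tank recognised with high confidence may be engaged;
# a school used as barracks ("use") may not, regardless of confidence.
assert may_engage(Category.MILITARY_BY_NATURE, 0.99) is True
assert may_engage(Category.MILITARY_BY_USE, 0.99) is False
assert may_engage(Category.MILITARY_BY_NATURE, 0.60) is False
```

Excluding the "use" and "purpose" categories from the whitelist mirrors the conclusion in the text that present use and intended future purpose require case-by-case human judgement.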

Secondly, AWS would need to decide whether the total or partial destruction, capture or neutralization of the targeted object, in the circumstances ruling at the time, offers a definite military advantage. Since a military advantage consists in ground gained and in weakening the enemy forces,128 these are objectifiable concepts, comparatively amenable to programming. Next, this military advantage must be concrete and perceptible rather than hypothetical and speculative.129 Only if the military advantage is definite (in contrast to a potential or indeterminate advantage)130 is it legitimate to launch an attack. A required degree of likelihood that an anticipated attack will gain terrain can be programmed into AWS, because computer programs already support military commanders in anticipating military advantage and in damage assessments.131 For example, the military commander could set a specific degree of military advantage before launching an attack.

127 Id. at 496, advising "to only look for high priority targets such as mobile air defence systems and surface-to-surface rocket launchers – objects that are military objectives by nature". 128 SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 2218. 129 YORAM DINSTEIN, 'Legitimate Military Objectives Under The Current Jus in Bello', (2002) 78 International Law Studies, 143-144. 130 SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 2024.
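The idea that a commander could pre-set a required degree of military advantage can likewise be sketched as a threshold gate. The 0-to-1 scale, the function name and the example values are hypothetical assumptions for illustration only; real estimates, as the text notes, would come from decision-support software.

```python
def advantage_is_definite(anticipated_advantage: float,
                          commander_threshold: float) -> bool:
    """A hypothetical gate: an attack may only proceed if the anticipated
    military advantage (e.g. a 0-1 likelihood of gaining terrain, as estimated
    by decision-support software) meets the level the commander fixed in
    advance. Below the threshold the advantage counts as merely potential or
    indeterminate, and no attack is launched."""
    return anticipated_advantage >= commander_threshold

# The commander demands a high likelihood (0.9) before any engagement:
assert advantage_is_definite(0.95, commander_threshold=0.9)
assert not advantage_is_definite(0.40, commander_threshold=0.9)
```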

Lastly, when in doubt about the legitimacy of an anticipated target, the safety of the civilian population always prevails and the attack must be ceased. This is explicitly enshrined in Article 52 III AP I, which stipulates that:

"In case of doubt whether an object which is normally dedicated to civilian purposes, such as a place of worship, a house or other dwelling or a school, is being used to make an effective contribution to military action, it shall be presumed not to be so used."

For AWS, this translates into: if objects cannot clearly be identified as military objects but may be civilian or dual-use objects, the system must refrain from an attack.132 Implementing a high identification threshold in AWS can satisfy this rule. This will most likely narrow down lawful operational contexts but leave a variety of potential scenarios, which are addressed in more detail below. In any event, as suggested by this author, for the foreseeable future lawful targets will remain military objects defined by nature or location, for which the rule of doubt will have little relevance.

131 See for contemporary developments the UK Royal Navy's exploration of STARTLE, artificial intelligence situational awareness software designed to continuously monitor and evaluate potential threats and to help humans assess them: AMBROSE MCNEVIN, 'Royal Navy to explore artificial intelligence use in battle threat assessment', (17 October 2016) www.cbronline.com/4th-revolution/royal-navy-explore-artificial-intelligence-use-battle-threat-assessment/, last visited December 15, 2017. 132 WAGNER, 'Autonomy in the Battlespace: Independently Operating Weapon Systems and the Law of Armed Conflict', (2013) International Humanitarian Law and the Changing Technology of War, 113; BOOTHBY, Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors (T.M.C. Asser Press, 2014), 108-109.
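The rule of doubt in Article 52 III AP I translates into a default-deny decision rule: unless an object is positively identified as a military objective, it is presumed civilian and the attack must be aborted. A minimal sketch, with hypothetical classification labels:

```python
def targeting_decision(classification: str) -> str:
    """Implements the presumption of Article 52 III AP I as default-deny:
    anything not positively identified as military (including dual-use or
    ambiguous objects) is presumed civilian and must not be attacked."""
    if classification == "military":
        return "engage permitted"
    # "civilian", "dual-use", "unknown" and every other label fall through
    # to the protective presumption.
    return "refrain from attack"

assert targeting_decision("military") == "engage permitted"
assert targeting_decision("dual-use") == "refrain from attack"
assert targeting_decision("unknown") == "refrain from attack"
```

The design choice here is that refusal is the default branch: a programming error or an unexpected sensor label can only produce a refusal to attack, never an unintended engagement.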

Even more complex for AWS is the requirement to also distinguish between combatants and civilians, a customary international law rule explicitly enshrined in Article 48 AP I.133 Only combatants and those civilians directly participating in hostilities may be lawfully targeted. The above rule in case of doubt also applies here. The question is how distinction capabilities can be programmed into AWS. Combatants are military objectives if they meet the following four conditions in combination: 1) subordination to a party to the conflict as a collective entity and a subject of international law; 2) an organization of a military character; 3) a responsible command exercising effective control over the members of the organization; 4) respect for the rules of international law applicable in armed conflict.134 AWS sensors could effectively identify only those criteria that are visible. Relying on uniforms will no longer be an effective means given current asymmetric forms of warfare. Linking status to the carrying of guns is not sufficient either. As Markus Wagner aptly points out, individuals carrying a rifle could do so for hunting or for their own protection.135 William Boothby discusses identification by means of iris recognition techniques,136 and Michael Schmitt proposes visual identification of individuals by software.137 Armed forces do not generally have a database with information about enemy soldiers available, nor do the suggested methods seem to present a reliable approach satisfying the rule in question. Currently, AWS cannot satisfy the requirement to distinguish between combatants and civilians, especially due to poor sensor technology.138

133 SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 1863. 134 Id. at 1681.

With regard to civilians becoming lawful military objectives by directly participating in hostilities, it will be incredibly challenging "for the programmer who has to translate the ICRC guidance on direct participation in hostilities into a computer program".139 Firstly, the criteria for direct participation in hostilities are still disputed.140 As above, only those criteria visible to sensors could be recognized by AWS. AWS could, however, be programmed to wait to be shot at first in case of doubt as to the status of the opponent. In any event, in case of doubt the above rule applies: the target must be considered a civilian, thus precluding the attack.

135 WAGNER, 'Autonomy in the Battlespace: Independently Operating Weapon Systems and the Law of Armed Conflict', (2013) International Humanitarian Law and the Changing Technology of War, 115. 136 BOOTHBY, Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors (T.M.C. Asser Press, 2014), 108. 137 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', 11. 138 SHARKEY, 'Automating Warfare: Lessons Learned from the Drones', (2011) 21 Journal of Law, Information and Science, 143. 139 MARCO SASSÒLI, 'Autonomous Weapons – Potential advantages for the respect of international humanitarian law', (2013) Professionals in Humanitarian Assistance and Protection (PHAP), 4. 140 For an overview see: NILS MELZER, 'Interpretative Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law', (2009) www.icrc.org/eng/assets/files/other/icrc-002-0990.pdf, 1-90, last visited December 15, 2017.

Due to poor sensor and recognition technology, algorithms will have difficulty complying with the aforementioned provision. However, the technological capacities of the system within a particular deployment environment should be taken into account.141 An operation in an urban warfare environment with civilians and combatants intermingled would be unlawful unless sensor and recognition technology were capable of ensuring compliance with the provision. In operational circumstances without civilians, however, even AWS with low-level distinction capabilities could satisfy this rule.142
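The environment-dependent lawfulness described here can be summed up as a pre-deployment check matching system capability against the demands of the operational environment. The capability scale and the environment labels below are illustrative assumptions, not established standards.

```python
# Hypothetical capability level an environment demands before deployment is
# lawful: higher numbers mean the system must distinguish in harder conditions.
REQUIRED_CAPABILITY = {
    "open_sea": 1,           # essentially no civilians expected
    "desert_battlefield": 2,
    "urban_area": 5,         # civilians and combatants intermingled
}

def deployment_lawful(environment: str, system_capability: int) -> bool:
    """True only if the system's distinction capability meets or exceeds what
    the chosen environment requires; unapproved environments are denied."""
    required = REQUIRED_CAPABILITY.get(environment)
    return required is not None and system_capability >= required

# A low-capability system (level 2) may operate at sea but not in a city:
assert deployment_lawful("open_sea", 2)
assert not deployment_lawful("urban_area", 2)
assert not deployment_lawful("jungle", 9)  # unapproved environment: deny
```

As with the doubt rule, unknown environments are denied by default, reflecting the text's point that AWS may not be deployed in environments for which they are not approved.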

Fire-and-forget cruise missiles already autonomously target pre-programmed objects. Such object targeting must be restricted to easily identifiable and distinguishable targets, such as tanks, and not be applied to objects that can be put to either military or civilian use, such as transport vehicles. Compliance becomes even harder when it comes to human targets. At the moment, it appears unlikely that algorithms will be able to replace human judgment. The "ethical governor"143 proposed by Ronald Arkin is not convincing. Persons hors de combat, for example, are impossible to distinguish.144 To summarize, there are operational circumstances in which the use of AWS would be lawful with respect to the distinction principle and others in which it would be unlawful, pending technology that satisfies the standards laid out by Article 48 AP I. To conclude, the principle of distinction is open to quantitative programming. However, due to the lack of sufficiently sophisticated sensor and targeting software technology, only military objects should be potential targets; humans should not be targeted. As far as they can, human operators and military commanders should ensure proper supervision and effective functioning of AWS.

141 JEFFREY S. THURNHER, 'The Law That Applies to Autonomous Weapon Systems', (2013) 17 American Society of International Law Insights, www.asil.org/insights/volume/17/issue/4/law-applies-autonomous-weapon-systems, 3, last visited December 15, 2017. 142 Id. at 3. 143 ARKIN, Governing lethal behavior in autonomous robots, 127-137.

2. Proportionality

Although some suggest that translating proportionality principles into computer programs for AWS “may be an opportunity to enhance the observance of IHL,”145 others reply that the proportionality assessment requires subjective value judgments.146 The customary law rule codified in Articles 51 V (b) and 57 II (a) (iii) AP I requires that the reasonably anticipated military advantage of an operation be weighed against the anticipated civilian harm. Civilian casualties may not be excessive in relation to the expected military advantage. This test, however, is so ambiguous that military lawyers question whether even human soldiers can properly apply it.147 Those who claim that human judgment is essential state that “warfare is an art and not a science reducible to the sterile algorithm of ones and zeros”.148 Proportionality goes beyond programming an algorithm to use its sensors properly. Only for clearly excessive cases is there a common understanding that the assessment can easily be made. Otherwise, such determinations remain open endeavours with differing outcomes on a case-by-case basis, in which military lawyers compare prior situations.149 Michael Schmitt, however, suggests building upon existing procedures such as the collateral damage estimate methodology (CDEM), which determines the likelihood of collateral damage to objects or persons near a target by considering factors like “the precision of a weapon, its blast effect, attack tactics, the probability of civilian presence in structures near the target, and the composition of structures to estimate the number of civilian casualties likely to be caused during an attack.”150 Such a CDEM could be installed in AWS as well. Schmitt even concludes that this could be a possibility, given that the likelihood of harm to civilians in the target area relies on objective data and scientific algorithms.151

144 HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), 13. 145 SASSÒLI, 'Autonomous Weapons – Potential advantages for the respect of international humanitarian law', 4. 146 WAGNER, 'Autonomy in the Battlespace: Independently Operating Weapon Systems and the Law of Armed Conflict', (2013) International Humanitarian Law and the Changing Technology of War, 120.

147 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', 402. 148 MICHAEL A. GUETELEIN, 'Lethal Autonomous Weapons - Ethical and Doctrinal Implications', (2005) Naval War College Newport, RI, http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA464896, 16, last visited December 15, 2017. 149 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) 19 Harvard National Security Journal, http://harvardnsj.org/2013/02/autonomous-weapon-systems-and-international-humanitarian-law-a-reply-to-the-critics/, last visited December 15, 2017. 150 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 254-255. 151 Id. at 255.
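Schmitt's observation that a CDEM "relies on objective data and scientific algorithms" can be made concrete with a toy sketch. The factor names, the combination formula, and every number below are invented assumptions for illustration only (the actual methodology is not public); the sketch shows merely that the factors quoted above — weapon precision, blast effect, probability of civilian presence, and structural composition — are quantifiable inputs that can be combined into a repeatable casualty estimate.

```python
from dataclasses import dataclass

# Toy illustration of a CDEM-style estimate. All factors and the formula are
# hypothetical assumptions; the real methodology is classified. The point is
# only that the factors named in the text are objective, quantifiable inputs.

@dataclass
class StrikeParameters:
    blast_radius_m: float           # weapon blast effect
    weapon_accuracy_cep_m: float    # circular error probable (precision)
    p_civilian_presence: float      # probability civilians occupy nearby structures
    occupants_per_structure: float  # expected people per affected structure
    structures_in_radius: int       # structures within the effects radius
    structure_protection: float     # 0..1, how much the building shields occupants

def expected_civilian_casualties(p: StrikeParameters) -> float:
    """Hypothetical combination of the CDEM factors into one numeric estimate."""
    # Less precise weapons enlarge the effective danger area.
    effective_radius = p.blast_radius_m + p.weapon_accuracy_cep_m
    area_scale = (effective_radius / p.blast_radius_m) ** 2
    exposed = p.structures_in_radius * p.occupants_per_structure
    return exposed * p.p_civilian_presence * (1 - p.structure_protection) * area_scale
```

A precise weapon against an unoccupied structure yields a low estimate; the same weapon near occupied buildings yields a high one. This is the kind of objective, repeatable output the text describes, while the subsequent weighing of that estimate against military advantage is precisely the step that, as argued next, resists formalisation.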

That being said, the next step, namely the comparison of military advantage with civilian losses, seems impossible for AWS due to the unquantifiable nature of that balancing. There is no accepted formula to program into a machine.152 Looking to future developments, some suggest that military experts may develop indicators and criteria for evaluating proportionality, in order to make the subjective judgments involved slightly more objective.153 As long as this is not feasible, however, AWS will not be able to perform such a comparison test.

Again, proportionality must be assessed in conjunction with potential operational settings. Here too, AWS need to comply with judgments of proportionality, which seem impossible in urban warfare environments but may be feasible in uninhabited deserts, under the sea, or in machine-to-machine operations.154 To conclude, although AWS cannot yet properly perform the proportionality test, military objectives may lawfully be attacked by AWS as long as the attack cannot lead to civilian casualties.

3. Precautions in attacks

The precautions in attack principle enshrined in Articles 57 and 58 AP I requires all feasible measures to be taken to spare the civilian population from casualties.155 This discretionary principle has so far been an obligation required of human military commanders. In practice, reasonableness and good faith must inform the decision to attack, taken on the basis of all the information available at the time.156 In most cases, this judgment takes place in the planning phase of an overall operation rather than in a particular strike. While military commanders decide how to deploy military resources in an operation, the precautionary principle binds them personally. There might also be technical components that need to be programmed into AWS, but their role will be to assist in estimating the amount of collateral damage or in the choice of weapon prior to an attack.157 The final say, however, lies with the military commander.

152 BOOTHBY, The Law of Targeting (Oxford University Press, 2012), 96-97. 153 SASSÒLI, 'Autonomous Weapons – Potential advantages for the respect of international humanitarian law', 4. 154 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 231-281.

Moreover, precautionary considerations taken in the planning of an operation involve anticipated conditions that are usually above the level of a specific weapon. This is true not only for human soldiers but also for AWS. The actual attack is often only a small piece of the overall operation plan, and situational awareness is often low for those directly carrying out the attack.158 In the case of a fighter jet pilot, the time frame for the actual attack is very short, and visibility and identification of the target are not always easy. The trigger standard is the same, regardless of whether a human or a machine executes a particular strike.

155 DINSTEIN, The conduct of hostilities under the law of armed conflict (Cambridge University Press, 2010), 138-140; IAN HENDERSON, The Contemporary Law of Targeting: Military Objectives, Proportionality and Precautions in Attack under Additional Protocol I (Martinus Nijhoff Publishers, 2009), 157-196. 156 W. HAYS PARKS, 'The Protection of Civilians from Air Warfare', (1997) 27 Israel Yearbook on Human Rights, 110. 157 GREGORY S. MCNEAL, 'Targeted Killing and Accountability', (2014) 102 Georgetown Law Journal, 739-750. 158 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', 405.

Policy might even urge more caution in the case of AWS, counselling restraint from attack in doubtful circumstances.159 In addition, the procedure before any weapon system is used in a particular operation remains the same: military commanders and operating soldiers weigh considerations such as the likelihood of civilian presence in a particular environment, the military advantage gained by a certain attack, and the limitations and safety mechanisms of a particular choice of weapon.160 This will most likely not change even in the case of AWS’ use. Whether a certain conduct was lawful will depend not only on AWS’ programming and technical abilities but also on human judgments.

To clarify this point, it is important to distinguish between two steps involved in the use of AWS. The first step is the military commander’s decision to deploy an AWS rather than any other available weapon system. Here, the human personnel involved must take the aforementioned points into consideration in deciding whether to use AWS in a particular situation. The second step is the actual decision-making by the AWS after activation. Although these two stages interact, the more closely the humans involved in the first stage follow the rules provided, the fewer errors will occur in the actual operation of the AWS. Since anticipating different conditions in pre-planning is challenging, these difficulties may be overcome by properly integrating humans in the planning of the attack, so that only the execution remains to be carried out by the AWS itself.161

159 Id. at 405. 160 Id. at 405.

4. Implementation of suggested standards

With a view to Article 36 AP I, military lawyers need to ensure that the system’s design abides by the rules set out above, through the formulation of intended uses and lawful deployment circumstances. As long as machines are not able (due to poor sensor or targeting software) to fully abide by all applicable IHL principles, it will be necessary to keep humans on the loop, so that the performance of the AWS can be properly observed and attack decisions can be aborted if necessary.162 Formal legal advice must be implemented into the engineering process of AWS in a form that can be tested, quantified and measured.163 For this, several disciplines and ministries must be involved to translate legal requirements into effective testing for AWS.164 Finally, the actual operating soldier’s capabilities and limits, as well as effective training methods, must be formulated. Dialogue and cooperation among states should lead to the exchange of nationally generated legal standards and testing procedures, ultimately creating international rules.165 This process is described in detail further below.

161 WILLIAM BOOTHBY, 'Some legal challenges posed by remote attack', (2012) 94 International Review of the Red Cross, 579. 162 BOOTHBY, Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors (T.M.C. Asser Press, 2014), 148; BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 496. 163 BACKSTROM & HENDERSON, 401; US DEPARTMENT OF DEFENSE, 'Report of the Defense Science Board Task Force on Developmental Test and Evaluation', (2008) www.acq.osd.mil/dsb/reports/ADA482504.pdf, 38, last visited December 15, 2017. 164 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', 410.
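The demand that formal legal advice be implemented in a form that "can be tested, quantified and measured" can be illustrated by a hypothetical acceptance-test sketch: legal rules are rephrased as machine-checkable requirements that a weapons review could run against a simulated system. The simulator, requirement identifiers, scenarios, and pass criteria below are invented assumptions, not an actual Article 36 procedure.

```python
# Hypothetical sketch: legal requirements rephrased as automated acceptance
# tests against a simulated AWS. All names and criteria are assumptions for
# illustration only, not a real review procedure.

def simulated_aws_decision(scenario: dict) -> str:
    """Stand-in for the system under review: returns 'engage' or 'abort'."""
    if scenario["civilians_present"] or scenario["target_type"] != "military_object":
        return "abort"
    return "engage"

LEGAL_REQUIREMENTS = [
    # (requirement id, test scenario, outcome the legal rule demands)
    ("distinction-01", {"civilians_present": True,  "target_type": "military_object"}, "abort"),
    ("distinction-02", {"civilians_present": False, "target_type": "dual_use_object"}, "abort"),
    ("distinction-03", {"civilians_present": False, "target_type": "military_object"}, "engage"),
]

def run_legal_review() -> dict:
    """Measurable review output: which quantified requirements the system passed."""
    return {req_id: simulated_aws_decision(scenario) == required
            for req_id, scenario, required in LEGAL_REQUIREMENTS}
```

Each requirement is an individually verifiable, repeatable check, which is the sense in which legal advice can enter the engineering process as something testable rather than as narrative guidance.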

5. Preliminary conclusion: Compliance with IHL principles possible

The question is whether AWS may be introduced into armed conflict even if they cannot fully comply with all prerequisites set out by IHL. On the one hand, NGOs call for a ban as long as technology cannot ensure full compliance with international humanitarian principles such as proportionality. Others, however, support a gradual introduction in specific environments, according to the level of autonomy that can comply with the applicable legal conditions. They suggest that deployment would be lawful in an environment in which civilians are not present, even if AWS cannot properly distinguish or assess proportionality within the legal rules. The reasoning, according to some,166 is that a weighing of military advantage against civilian harm is not even required in such a scenario. This study agrees with this assessment. These scenarios, however, need thorough screening as to whether AWS deployment would be lawful, as suggested in the legal review chapter below. While the legality of UAS, even in the form of AWS, has been at the centre of discussion for a while now, their legal assessment follows the logic of remote control systems.167 The public interest in autonomous UUS and USS has not matched that of so-called drones.

165 Id. at 410. 166 ARKIN, Governing lethal behavior in autonomous robots (CRC Press, 2009). 167 For an in-depth analysis of the legality of autonomous UAS see: BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts.

This is problematic given that the level of autonomy is increasing rapidly in the maritime environment, since many tasks there do not require humans at all. In addition, the specific law of naval warfare is not as well prepared for the robotic revolution as other disciplines of law. Definitional and status issues reveal that naval warfare is not yet prepared for AWS. While even modern generations of naval mines and torpedoes are still regulated by the 1907 Hague Convention VIII and the customary international law of naval warfare, the classification of UUS and USS has yet to be resolved. One option would be to consider naval AWS as warships. Article 29 of the United Nations Convention on the Law of the Sea (UNCLOS) defines a warship as a ship

“belonging to the armed forces of a State bearing the external marks distinguishing such ships of its nationality, under the command of an officer duly commissioned by the government of the State and whose name appears in the appropriate service list or its equivalent, and manned by a crew which is under regular armed forces discipline.”168

Both the prerequisite of command and that of a manned crew speak against such a classification as warships. Classification as auxiliary vessels or state vessels would generally be possible but would preclude AWS from participating in maritime warfare.169 Since UUS and USS are often launched and controlled from another military platform, some suggest they should share the latter’s status.170 However, this would not cover naval AWS deployed from land rather than from warships. Hence, naval AWS would need to be classified as warships in order for the law of naval warfare to apply properly: this would include both exercising belligerent rights and being a lawful target itself.171 The negotiations within the UN described further below address this issue. The following section discusses whether AWS would create a responsibility gap, since no individual could be held accountable in case AWS make mistakes.172

168 See also LOUISE DOSWALD-BECK, San Remo Manual on International Law Applicable to Armed Conflicts at Sea (Cambridge University Press, 1994), Section V (g). 169 Id. at Section V (h) (i).

D. Challenges to the Responsibility for AWS

Someone needs to be held accountable for violations perpetrated by AWS.

Looking into the cause173 of a wrongful act is a good starting point.174 Causes include programming errors, malfunctions, accidents, or even intentional misuse. This study will first look at the potential issues associated with the use of AWS that could result in a responsibility gap. Secondly, with the help of an attribution framework, it will explore opportunities to link AWS violations to responsible entities.

170 HENDERSON, 'Murky Waters: The Legal Status of Unmanned Undersea Vehicles', (2006) 53 Naval Law Review, 65. 171 ROBERT FRAU, 'Regulatory Approaches to Unmanned Naval Systems in International Law of Peace and War', (2012) 2 Journal of International Law of Peace and Armed Conflict, 88. 172 Parts of this chapter have already been published in: FREDRIK VON BOTHMER, 'Robots in Court. Responsibility for Lethal Autonomous Weapons Systems (LAWS)', in Sandra Brändli, Rehana Harasgama, Roman Schister & Aurelia Tamò (eds.), Mensch und Maschine. Symbiose oder Parasitismus? (Schriften der Assistierenden der Universität St. Gallen, 2015), 102-124; See for an overview also: ARENDT, Völkerrechtliche Probleme beim Einsatz autonomer Waffensysteme (Berliner Wissenschafts Verlag, 2016), 121-156. 173 Causation is an especially disputed element with regard to state responsibility: HELMUT PHILIPP AUST, Complicity and the Law of State Responsibility (Cambridge University Press, 2011), 210. 174 See LIN, BEKEY & ABNEY, 'Autonomous Military Robotics. Risk, Ethics, Design', (2008) US Department of Navy, Office of Naval Research, http://ethics.calpoly.edu/ONR_report.pdf, last visited December 15, 2017.

The concern has been raised that states may attempt to avoid their responsibilities under international law – for war crimes or human rights abuses – by delegating combat to AWS. Chief among those who perceive an accountability gap is Andreas Matthias, who describes it as the result of creating learning machines.175 He illustrates his argument with the example of traditional software engineers, who were in control of machine behaviour: when an error occurred, the programmer was at fault. Autonomous, artificially intelligent systems, however, act outside the observation horizon of the programmer, so that he is unable to intervene. Matthias concludes that the designers’ role changes from coders to creators.176

Michael Schmitt, on the other hand, holds that the “mere fact that a human might not be in control of a particular engagement does not mean that no human is responsible for the actions of the autonomous weapon system”.177 Thilo Marauhn confirms that “LAWS do not escape the law of state responsibility nor international criminal law as long as there is a link to some legal personality for the establishment of accountability”.178 Humans decide how to program the system and when to launch it, Schmitt agrees.179 This study wonders whether there is not, rather than a “gap”, simply a lack of comprehensive scientific analysis. Overall, scholarship has not yet offered a consolidated response; perhaps there is even confusion. Such lack of visibility is often the case with new developments, especially technological ones.

175 See ANDREAS MATTHIAS, 'The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata', (2004) 6 Ethics and Information Technology, 175; HEATHER ROFF, 'Killing in War: Responsibility, Liability and Lethal Autonomous Robots', (2013) www.academia.edu/2606840/Killing_in_War_Responsibility_Liability_and_Lethal_Autonomous_Robots, 9, last visited December 15, 2017; HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), 77; GEORGE R. LUCAS, 'Legal and Ethical Precepts Governing Emerging Military Technologies. Research and Use', (2014) 6 Amsterdam Law Forum, 23, 31. 176 See MATTHIAS, 'The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata', (2004) 6 Ethics and Information Technology, 177, 181-182. 177 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', 33; SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 231, 277.

1. Responsibility framework

States may be held accountable for their wrongdoings under the international legal principle of state responsibility, and individuals through international criminal responsibility. For the purpose of this study, the assumption is made that international humanitarian law and human rights law are applicable.180 The next section starts with the Articles on State Responsibility (ASR) codified by the International Law Commission (ILC) in 2001.181

178 See THILO MARAUHN, 'An Analysis of the Potential Impact of Lethal Autonomous Weapons Systems on Responsibility and Accountability for Violations of International Law', (13 May 2014) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EDD006B8954/(httpAssets)/35FEA015C2466A57C1257CE4004BCA51/$file/Marauhn_MX_Laws_SpeakingNotes_2014.pdf, 3, last visited December 15, 2017. 179 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 277. 180 Many authors engage with the question of the (territorial) application of the law: NOAM LUBELL & NATHAN DEREJKO, 'A Global Battlefield? Drones and the Geographical Scope of Armed Conflict', (2013) Journal of International Criminal Justice, 65; NOAM LUBELL, 'Challenges in Applying Human Rights Law to Armed Conflict', (2005) 87 International Review of the Red Cross, 737; MARCO SASSÒLI, 'Le DIH, une lex specialis par rapport aux droits humains?', in Andreas Auer, Alexandre Flückiger & Michel Hottelier (eds.), Les droits de l'homme et la constitution, Etudes en l'honneur du Professeur Giorgio Malinverni (Schulthess, 2007), 375; ROBERT FRAU, 'Unbemannte Luftfahrzeuge im internationalen bewaffneten Konflikt', (2011) 24 Journal of International Law of Peace and Armed Conflict, 60. 181 See JAMES CRAWFORD, The International Law Commission’s Articles on State Responsibility. Introduction, Text and Commentaries (Cambridge University Press, 2002).

a) Articles on State Responsibility: a framework for AWS?

The prerequisites for state responsibility are twofold: Article 1 ASR requires an internationally wrongful act of the state, while Article 2 ASR demands conduct attributable to the state that constitutes a breach of the state’s international obligations.182 In every case, the starting point is the violation of an international obligation through acts or omissions. It is important to note that states are capable of committing offences as legal persons.183 For the sake of argument, this study will consider what would happen were AWS responsible for an internationally wrongful act.

According to Article 4 ASR, the conduct of a state’s organs (i.e. national troops) is considered an act of the state.184 Article 4 II ASR defines organs as “any person or entity which has that status in accordance with the internal law of the state”.

In addition, according to Article 5 ASR, the actions of persons or entities exercising elements of governmental authority are acts of the state. Furthermore, Article 7 ASR provides that violations committed by human soldiers are attributed to the state even if they occur ultra vires. This is no theoretical matter: as the pictures of Abu Ghraib alarmingly revealed, there are many examples of soldiers no longer controlled by their commanders.185

182 See MARCO SASSÒLI, 'State responsibility for violations of international humanitarian law', (2002) 84 International Review of the Red Cross, 401, especially with regard to international humanitarian law violations. 183 See IAN BROWNLIE, Principles of Public International Law (Oxford University Press, 2008), 445; INTERNATIONAL COURT OF JUSTICE, 'Advisory Opinion of 11.4.1949 - Reparations for Injuries suffered in the Service of the United Nations', (1949) I.C.J. Reports, 174, 179. 184 CRAWFORD, The International Law Commission’s Articles on State Responsibility, Art. 4, para. 3; IAN BROWNLIE, State Responsibility (Oxford University Press, 1983), 450.

b) Private military contractors: an example of unclear attribution

In the last ten years, a potential responsibility gap has emerged with regard to private military contractors (PMC) and violations committed during their deployment. With the wars in Afghanistan and Iraq, a new trend developed: the privatization of tasks formerly performed only by soldiers, including participation in armed conflict.186 Companies such as ACADEMI (previously Blackwater)187 represent this privatization of war.188

Attribution works through contractual delegation to these corporate warriors, a phenomenon Peter Singer calls “military outsourcing and the loss of control”.189 In the chain of command with traditional soldiers, the state has disciplinary oversight and control over enforcement mechanisms. With PMCs, by contrast, the hiring state holds only a contractual relationship.190 The acts committed by private military personnel are attributed to these private persons rather than to the state, even if they include carrying weapons or engaging in combat.191

Compared to the responsibility they would bear for their own soldiers, states might bear only reduced responsibility when relying on private personnel. Questions that arise include the following: do private military personnel form part of the armed forces? Who is responsible if abuses are committed by PMC? The main issue is whether the contracting state can be held accountable for having failed to prevent violations committed by PMC.

185 See GEORGE R. FAY, 'Investigation of the Abu Ghraib Detention Facility and the 205th Military Intelligence Brigade', (2004) www.washingtonpost.com/wp-srv/nationi/documents/fay_report_8-25-04.pdf, last visited December 15, 2017. 186 See FRANCESCO FRANCIONI, 'Private Military Contractors and International Law: An Introduction', (2008) 19 European Journal of International Law, 961; AMANDA TARZWELL, 'In Search of Accountability: Attributing the Conduct of Private Security Contractors to the United States Under the Doctrine of State Responsibility', (2009) Oregon Review of International Law, 179; CHRISTOPHER H. LYTTON, 'Blood for Hire: How the War in Iraq Has Reinvented the World's Second Oldest Profession', (2006) see id. at 307. 187 See http://academi.com, last visited December 15, 2017; for details on alleged atrocities see: JAMES GLANZ & ANDREW W. LEHREN, 'Use of Contractors Added to War’s Chaos in Iraq', (23 October 2010) www.nytimes.com/2010/10/24/world/middleeast/24contractors.html?_r=2&hp&, last visited December 15, 2017. 188 For details see: PETER W. SINGER, Corporate Warriors - The Rise of the Privatized Military Industry (Cornell University Press, 2003). 189 Id. at 158. 190 FRANCIONI, 'Private Military Contractors and International Law: An Introduction', (2008) 19 European Journal of International Law, 962.

Possible solutions192 to these problems include attributing wrongful acts committed by PMC to the organization that has authority over the operation.193 Carsten Hoppe wants to close the gap with positive obligations for the hiring state and argues that customary international law may allow for the assimilation of PMC to “members of the armed forces”.194 He finds state responsibility to be applicable and maintains that a general “due diligence obligation” should be placed upon the hiring state to prevent and suppress violations of human rights and humanitarian law.195 Hannah Tonkin supports this view and stresses positive due diligence obligations in case direct attribution fails.196

191 CARSTEN HOPPE, 'Passing the Buck: State Responsibility for Private Military Companies', (2008) 19 The European Journal of International Law, 989. 192 The most comprehensive overview of possible solutions is currently offered by HANNAH TONKIN, State Control over Private Military and Security Companies in Armed Conflict (Cambridge University Press, 2013); see also CHIA LEHNARDT, Private Militärfirmen und völkerrechtliche Verantwortlichkeit. Eine Untersuchung aus humanitär-völkerrechtlicher und menschenrechtlicher Perspektive (Mohr Siebeck, 2011). 193 NIGEL D. WHITE & SORCHA MACLEOD, 'EU Operations and Private Military Contractors: Issues of Corporate and Institutional Responsibility', (2008) 19 European Journal of International Law, 966. 194 HOPPE, 'Passing the Buck: State Responsibility for Private Military Companies', (2008) 19 The European Journal of International Law, 1009, relying on the responsibility for the armed forces according to Arts. 3 HC IV and 91 AP I. 195 Id. at 1013.

c) Suggestions to include AWS in the ASR framework

How can the responsibility gap be successfully bridged? In order to do so, ways to link AWS’ conduct to responsible entities must be found. What can be transferred to AWS from the solutions suggested for PMC?

(1) Attribution of AWS conduct to the state?

If AWS could be included in the category of state organs in the sense of Article 4 ASR, the attribution of responsibility would follow relatively easily. A state organ’s responsibility is not limited, as long as the conduct in question is performed in an official capacity.197 AWS, as opposed to human soldiers, would always be “on duty” and never act in a private capacity. However, as it stands, the ASR do not include machines as organs.198 Therefore, AWS cannot be considered state organs. To interpret a machine as an organ would distort the ASR.

196 TONKIN, State Control over Private Military and Security Companies in Armed Conflict, 59; LEHNARDT, Private Militärfirmen und völkerrechtliche Verantwortlichkeit, 216. 197 See CRAWFORD, The International Law Commission’s Articles on State Responsibility, Art. 4, para. 5. 198 For a definition of state organ: JAMES CRAWFORD, State Responsibility – The General Part (Cambridge University Press, 2013), 118; see RONALD CRAIG ARKIN, 'The Robot didn’t do it', (2013) Position Paper for the Workshop on Anticipatory Ethics, Responsibility and Artificial Agents, 1; for suggestions to assign responsibility in advance, see GARY E. MARCHANT, BRADEN ALLENBY, RONALD ARKIN & EDWARD T. BARRETT, 'International Governance of Autonomous Military Robots', (2011) 12 Columbia Science and Technology Law Review, 277, for the obligation to integrate a “black box” that would allow for better ex post control.

Yet the misconduct of AWS may be attributable to the state on another basis, namely if an AWS has been authorized to exercise elements of governmental authority according to Article 5 ASR. Offensive combat entails the exercise of governmental authority. This norm, however, explicitly speaks of a “person or entity” and is thus intended to cover quasi-state entities.199 According to the ASR Commentary, it includes any “natural or legal person”.200 Although some discuss the question of personhood for robots,201 the idea does not receive much support at the international level. Hence, applying Article 5 to AWS would also stretch its wording.

The principle of de facto attribution for acts carried out under the direction or control of the state is laid down in Article 8 ASR. In this case too, however, the wording explicitly mentions a “person or group of persons”. This study wonders whether AWS could be defined as “members of the armed forces” in the sense of Article 43 AP I.202 The same issue persists, however: the term “members of the armed forces” does not include machines, as illustrated by the fact that reference is made to “individuals or persons”.203 It is important to note that it is a human state organ that operates or commands the use of AWS. This, therefore, is where the attribution link must be sought.

199 See CRAWFORD, The International Law Commission’s Articles on State Responsibility, Art. 5, para. 1. 200 See id. at Art. 4, para. 12. 201 At the forefront: ANDREAS MATTHIAS, Automaten als Träger von Rechten (Logos Verlag, 2010). 202 In conjunction with Art. 91 AP I it provides a rule of attribution to the state. 203 See PETER ROWE, 'Members of the Armed Forces and Human Rights Law', in Andrew Clapham & Paola Gaeta (eds.), The Oxford Handbook of International Law in Armed Conflict (Oxford University Press, 2014), 522.

(2) Responsibility for the acts of state organs operating AWS

The conduct of individual soldiers and officers of the armed forces is considered conduct of state organs.204 It is hence attributable to the state pursuant to Article 4 ASR.205 The question is whether the use of AWS alters this rule. It is currently unclear whether soldiers or military commanders actually operate or order the operation of AWS. The relevant conduct is therefore either that of the soldier operating the AWS or that of the military commander deciding to deploy it. It is this conduct that may result in state responsibility.

(3) “Due diligence” obligations to prevent violations

Generally, in the case of unlawful conduct that is not directly attributable to the state, the state is under no obligation to prevent that behaviour from occurring. States do, however, have an obligation of due diligence to control AWS, in addition to their obligation to refrain from committing wrongful acts. The due diligence obligation does not require a certain outcome but rather demands best efforts: states must do what is reasonably possible within their powers.206

204 See INTERNATIONAL COURT OF JUSTICE, 'Case Concerning Armed Activities on the Territory of the Congo (DRC v. Uganda)', (2005) I.C.J. Reports, 213; see on its customary international law character: INTERNATIONAL COURT OF JUSTICE, 'Advisory Opinion - Difference Relating to Immunity from Legal Process of a Special Rapporteur to the Commission of Human Rights', (1999) I.C.J. Reports, 62. 205 BROWNLIE, Principles of Public International Law, 450; NICLAS VON WOEDTKE, Die Verantwortlichkeit Deutschlands für seine Streitkräfte im Auslandseinsatz und die sich daraus ergebenden Schadensersatzansprüche von Einzelpersonen als Opfer deutscher Militärhandlungen (Duncker & Humblot, 2010), 132. 206 PIERRE-MARIE DUPUY, 'Reviewing the Difficulties of Codification: on Ago’s Classification of Obligations of Means and Obligations of Result in Relation to State Responsibility', (1999) European Journal of International Law, 371; JAMES CRAWFORD, 'Revising the Draft Articles on State Responsibility', 436, 439; RICCARDO 65

The International Court of Justice (ICJ), in its Genocide case, held that the obligation to prevent genocide does not require success; states must, however, employ all means reasonably available to them to prevent such a disastrous event.207

According to the ASR Commentary, such obligations “are usually construed as best efforts obligations, requiring states to take all reasonable or necessary measures to prevent a given event from occurring, but without warranting that the event will not occur”.208 If the state fails to take such positive steps of prevention and a violation occurs, this failure may constitute a wrongful act; it is then the state’s own failure that gives rise to state responsibility.209 A due diligence obligation must be considered in respect of a particular primary obligation.210 Under Common Article 1 to the Geneva Conventions (GC), states are under an obligation to ensure respect for the law of armed conflict.211 It may be concluded that states must take diligent steps to ensure that the use of AWS complies with international humanitarian law. Inaction with regard to such a preventive obligation may lead to state responsibility. The Genocide case goes further and applies a “psychological element”: states have an obligation to prevent unlawful conduct when the “state is aware, or should normally be aware”.212 The burden of proof lies on the state using AWS, not on the victims, since the latter do not have access to the corresponding technology, owing to lack of transparency and for security reasons. A state then needs to take two affirmative steps to discharge its obligations: it must provide a system which allows for the prevention of violations, and it must use this system to prevent them in the specific case.

PISILLO-MAZZESCHI, 'The Due Diligence Rule and the Nature of the International Responsibility of States', (1992) German Yearbook of International Law, 47. 207 INTERNATIONAL COURT OF JUSTICE, 'Case Concerning the Application of the Convention of the Prevention and Punishment of the Crime of Genocide (Bosnia and Herzegovina v. Serbia and Montenegro) (Merits)', (2007) I.C.J. Reports, 430; further see ICJ, 'Case Concerning Armed Activities on the Territory of the Congo (DRC v. Uganda)', (2005) I.C.J. Reports, 178, applying a standard of vigilance; ICJ, 'Corfu Channel (UK v. Albania)', (1949) I.C.J. Reports, 22, for a due diligence assessment based on knowledge. 208 See CRAWFORD, The International Law Commission’s Articles on State Responsibility, Art. 14, para. 14. 209 See BROWNLIE, State Responsibility, 150. 210 Obligations differ for states and circumstances: see PIERRE-MARIE DUPUY, Due Diligence in the International Law of State Responsibility, OECD: Legal Aspects of Transfrontier Pollution (OECD, 1977); PISILLO-MAZZESCHI, 'The Due Diligence Rule and the Nature of the International Responsibility of States', (1992) German Yearbook of International Law, 30; JAN HESSBRUEGGE, 'The Historical Development of the Doctrines of Attribution and Due Diligence in International Law', (2004) New York University Journal of International Law and Politics, 265; ROBERT BARNIDGE, Non-State Actors and Terrorism: Applying the Law of State Responsibility and the Due Diligence Principle (T.M.C. Asser Press, 2007); for a positive obligation from Art. 86 AP I towards new weapons (Art. 36 AP I), see SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, para. 36, 43.
211 See SANDOZ, SWINARSKI & ZIMMERMANN, 3536; in support of a positive general due diligence nature: HENCKAERTS & DOSWALD-BECK, Customary International Humanitarian Law - Volume I (Cambridge University Press, 2009), 509, stating that states must exert their influence, to the degree possible, to stop violations of international humanitarian law; BIRGIT KESSLER, 'The Duty to “Ensure Respect” under Common Article 1 of the Geneva Conventions', (2001) German Yearbook of International Law, 498; for the lack of

(4) Article 36 AP I new weapons review: a “link” for state responsibility?

Article 36 AP I obliges state parties to verify, in the study, development, acquisition or adoption of new weapons, means or methods of warfare, whether their employment would be prohibited. This involves a proper legal review prior to deployment.213 What this study suggests is that states could be held responsible if a legal review prior to the deployment of AWS was insufficient or inappropriate.

territorial limitation with regard to Common Art. 1 GC: INTERNATIONAL COURT OF JUSTICE, 'Military and Paramilitary Activities In and Against Nicaragua (Nicaragua v. USA)', (1986) I.C.J. Reports, 220, 255; Art. 3 HC IV and Art. 91 AP I reflect as customary law the responsibility of states, see ICJ, 'Case Concerning Armed Activities on the Territory of the Congo (DRC v. Uganda)', (2005) I.C.J. Reports, 214. 212 ICJ, 'Case Concerning the Application of the Convention of the Prevention and Punishment of the Crime of Genocide (Bosnia and Herzegovina v. Serbia and Montenegro) (Merits)', (2007) I.C.J. Reports, 431; INTERNATIONAL LAW COMMISSION, Draft Articles on the Prevention of Transboundary Harm from Hazardous Activities with Commentaries (A/56/10, 2001), Art. 3, para. 18. 213 Since AWS do not yet exist, they are to be considered as “new” and should be subsumed as a “means” of warfare: BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 226; ISABELLE DAOUST, ROBIN COUPLAND & RIKKE ISHOEY, 'New wars, new weapons? The obligation of States to assess the legality of means and methods of warfare', (2002) 84 International Review

There is very little state practice, as few states have implemented a national review process,214 although they are obliged to do so.215 Prior review becomes more crucial as the autonomy level increases and the control of human operators decreases. The Commentary on Article 36 AP I predicts that, “if man does not master technology, but allows it to master him, he will be destroyed by technology”.216 With the increasing use of AWS, the legal assessment of an attack must come earlier.217 The review duty incorporates, inter alia, the “cardinal principles”218 and is not limited to intended use but must extend to possible misuse.219

of the Red Cross, 345; JUSTIN MCCLELLAND, 'The Review of Weapons in Accordance with Article 36 of Additional Protocol I', (2003) 85 see id. at, 397; JAMES D. FRY, 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', (2006) 44 Columbia Journal of Transnational Law, 453; W. HAYS PARKS, 'Means and Methods of Warfare', (2006) 38 George Washington International Law Review, 511. 214 See BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 219; KATHLEEN LAWAND, 'Reviewing the legality of new weapons, means and methods of warfare', (2006) 88 International Committee of the Red Cross Review, 925, 926; KATHLEEN LAWAND, ROBIN COUPLAND & PETER HERBY, 'A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977', (2006) International Committee of the Red Cross, 5; the USA, however, are among them: see WOLFF HEINTSCHEL VON HEINEGG, 'Verteidigungsausschuss, Anhörung: Völker, verfassungsrechtliche sowie sicherheitspolitische und ethische Fragen im Zusammenhang mit unbemannten Luftfahrzeugen', (30 June 2014) Protokoll: 18/16, 36, www.bundestag.de/bundestag/ausschuesse18/a12/oeffentliche_anhoerung, last visited December 15, 2017. 215 This follows from pacta sunt servanda, UNITED NATIONS, 'Vienna Convention on the Law of Treaties', (1969) 1155 UNTS 331, https://treaties.un.org/doc/Publication/UNTS/Volume%201155/volume-1155-I-18232- English.pdf, Article 26, last visited December 15, 2017. 216 See SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 1476, with regard to automation on the battlefield. 217 See TIM MCFARLAND & TIM MCCORMACK, 'Mind the Gap: Can Developers of Autonomous Weapons Systems be Liable for War Crimes?', (2014) 90 International Law Studies, 362. 
218 ICJ, 'Legality of the Threat or Use of Nuclear Weapons, (Advisory Opinion of 8 July 1996)', (1996) I.C.J. Reports, 226, 257, including Art. 35 (2), Art. 48, Art. 51 (4) (b) (c) AP I. 219 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 231, arguing for a broad scope of a review obligation; in contrast, for a review of only

This study posits that states should be held responsible if better testing and legal review could have prevented violations committed by AWS.220 Since it is, in general, state agents that conduct legal reviews in their official capacity,221 attribution would naturally follow from the ASR. The Article 36 legal review process, its implementation and the sharing of best practices as well as realistic standardized testing and transparency are considered to be among the most urgent tasks. All stakeholders are called upon to contribute their share: developers need to implement international humanitarian law principles, lawyers involved in the review process need to understand how weapons systems operate and efficient operational guidelines must be formulated.222 The next section deals with the question of who can be held individually criminally responsible.

2. Individual Criminal Responsibility

Actions caused by AWS may lead to serious breaches of international criminal law. For the purpose of this study, the focus is on international criminal law as it is codified in the International Criminal Court (ICC) Statute223.

“normal uses” see SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 1480. 220 See LAWAND, COUPLAND & HERBY, 'A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977', (2006) International Committee of the Red Cross, 15. 221 See BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 238; see for reference to review in Germany with regard to drones: THILO MARAUHN, 'Der Einsatz unbemannter bewaffneter Drohnen im Lichte des geltenden Völkerrechts', (2013) 9 Arbeitspapiere Deutsche Stiftung Friedensforschung, 26, 33. 222 See BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 483.

Criminal liability requires the fulfilment of a factual element (actus reus) and a mental element (mens rea).224

a) Potential addressees of responsibility

First, this study considered whether robots themselves can be held criminally responsible; based on the foregoing legal analysis, this must be answered in the negative. As with state responsibility, individual criminal responsibility is inseparably linked to legal personality.225 Andreas Matthias discusses quasi-legal personality for robots;226 others view AWS “as not simply weapons but a class of combatants”227. However, robots themselves will not yet, and may never, be included in international criminal law.228 The wording of Article 25 I ICC Statute explicitly limits jurisdiction to “natural persons”.229 This excludes criminal responsibility of AWS a priori. Some experts hold that “legal responsibility for the actions of a robot falls on the individual who grants the robot permission to act on their behalf”.230 This leaves individual criminal responsibility with humans: the program designer or manufacturer,231 the military commander, subordinate soldiers, civilian supervisors or political leaders232. When it comes to the potential use of AWS on the battlefield, the responsibility of the military commander is the most likely scenario. The next section therefore looks further into this aspect.

223 See UNITED NATIONS, 'Rome Statute of the International Criminal Court', (1 July 2002) 2187 UNTS 90, http://untreaty.un.org/cod/icc/statute/romefra.htm, last visited on December 15, 2017. 224 See for details on these terms: WILLIAM WILSON, Criminal Law – Doctrine and Theory (Longman, 2008), 173; see for mens rea: JOHAN D. VAN DER VYVER, 'International Criminal Court and the Concept of Mens Rea in International Criminal Law', (2004) 12 University of Miami International and Comparative Law Review, 57; for questions regarding legal personality pursuant to the law of the European Union (AI entity), see THOMAS BURRI, 'Free Movement of Algorithms: Artificially Intelligent Persons Conquer the European Union's Internal Market', (2017) in: Woodrow Barfield and Ugo Pagallo (eds), Research Handbook on the Law of Artificial Intelligence, Edward Elgar, (forthcoming), available at: https://ssrn.com/abstract=3010233, 10-11, last visited on December 15, 2017. 225 See ROBERT FRAU, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', in Robert Frau (ed.), Drohnen und das Recht. Völker- und verfassungsrechtliche Fragen automatisierter und autonomer Kriegsführung (Mohr Siebeck, 2014), 235, 243. 226 See MATTHIAS, Automaten als Träger von Rechten, 85. 227 ROFF, 'Killing in War: Responsibility, Liability and Lethal Autonomous Robots', (2013) www.academia.edu/2606840/Killing_in_War_Responsibility_Liability_and_Lethal_Autonomous_Robots, 3, last visited on December 15, 2017. 228 Possibilities of punishment (destruction or program restriction) seem far-fetched since this would not satisfy the purpose of deterrence or retribution; see FRAU, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', in Frau (ed.), Drohnen und das Recht, 243; HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), para. 76.

b) Command responsibility

Traditionally, military commanders can be held accountable for the actions of subordinate human soldiers by way of command responsibility.233 According to Article 87 AP I, military commanders shall prevent breaches by members of the armed forces under their command and other persons under their control. This supervisory obligation applies not only to subordinate humans but also to weapon systems.234 Article 86 II AP I provides that responsibility for breaches by subordinates rests with superiors if they knew, or should have known, that a breach would be committed and did not take all feasible measures within their power to prevent or repress it.

229 See KAI AMBOS, Internationales Strafrecht (C. H. Beck, 2011), § 7, para. 10; FRAU, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', 238, holds that partial legal personality for the purpose of international criminal law does not exist for machines. 230 See PATRICK LIN, KEITH ABNEY & GEORGE A. BEKEY, Robot ethics: the ethical and social implications of robotics (MIT Press, 2012), 59. 231 The developer’s responsibility will not be covered; it is highly disputed: for arguments in favour of responsibility, see INTERNATIONAL COMMITTEE OF THE RED CROSS, 'Report of the ICRC Expert Meeting on Autonomous weapon systems: technical, military, legal and humanitarian aspects in Geneva', (26-28 March 2014) www.icrc.org/eng/assets/files/2014/expert-meeting-autonomous-weapons-icrc-report-2014-05-09.pdf, 15, last visited on December 15, 2017; others see a responsibility for the intentional production for war crimes: DOCHERTY, 'Losing Humanity. The Case against Killer Robots', 12; SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 277-278; MARAUHN, 'An Analysis of the Potential Impact of Lethal Autonomous Weapons Systems on Responsibility and Accountability for Violations of International Law', 4, finds that designers and manufacturers can be held responsible for aiding and abetting, Art. 25 (3) (c) ICC Statute; see MCFARLAND & MCCORMACK, 'Mind the Gap: Can Developers of Autonomous Weapons Systems be Liable for War Crimes?', (2014) 90 International Law Studies, 361, for the question of application of international humanitarian law if the programming takes place before an armed conflict. 232 See ROFF, 'Killing in War: Responsibility, Liability and Lethal Autonomous Robots', (2013) www.academia.edu/2606840/Killing_in_War_Responsibility_Liability_and_Lethal_Autonomous_Robots, 11, last visited on December 15, 2017, for an overview of potential addressees of responsibility.
233 See GUÉNAËL METTRAUX, The Law of Command Responsibility (Oxford University Press, 2009); KAI AMBOS, 'Superior Responsibility', in Antonio Cassese, Paola Gaeta & John Jones (eds.), The Rome Statute of the International Criminal Court: A Commentary, (Oxford University Press, 2002), 823; ILIAS BANTEKAS, 'The

According to Article 28 (a) (i) (ii) ICC Statute, military commanders need to have had actual or potential knowledge (“should have known”) that subordinates over whom they had effective command and control, or effective authority and control, were committing or about to commit a crime, and to have failed to prevent it.235

The question remains, however, of how command responsibility can be established for the use of AWS. In other words: is a commander responsible for delegating the decision to attack to AWS rather than to subordinate soldiers? For this, the actus reus requires command or control over subordinates, a failure to exercise it properly, and a causal link. For the fulfilment of mens rea, the commander needs to have actual or potential knowledge. While causation is fulfilled if the commander’s omission increased the risk of the commission of the crime,236 case law also reflects that causation is not always required as a separate element237.

Contemporary Law of Superior Responsibility', (1999) 93 American Journal of International Law, 573; many experts discuss this with regard to AWS: HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), para. 78; ROFF, 'Killing in War: Responsibility, Liability and Lethal Autonomous Robots', 14; analogies are drawn to minor children, child soldiers or animals together with the notion of “diminished responsibility”, see LIN, ABNEY & BEKEY, Robot ethics: the ethical and social implications of robotics (MIT Press, 2012), 58 and to the responsibility of intelligent artifacts, see LEON E. WEIN, 'The Responsibility of Intelligent Artifacts: Towards an Automation Jurisprudence', (1992) 6 Harvard Journal of Law and Technology, 103, 143; see also JOHN W. SNAPPER, 'Responsibility for Computer Based Errors', (1985) 16 Metaphilosophy, 289, 295; see for the responsibility for drones: FREDERIK ROSEN, 'Extremely Stealthy and Incredibly Close: Drones, Control and Legal Responsibility', (2014) Journal of Conflict and Security Law, 113; GEERT-JAN KNOOPS, 'Drones at Trial. State and Individual (Criminal) Liabilities for Drone Attacks', (2014) 14 International Criminal Law Review, 42; a responsibility for subordinates is also considered part of customary international law, see HENCKAERTS & DOSWALD-BECK, Customary International Humanitarian Law - Volume I (Cambridge University Press, 2009), 559. 234 See FRAU, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', 242; HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), para. 78, demands that military commanders need to sufficiently understand the programming of LAWS for being held liable. 235 The same rule applies for Article 7 (3) Statute of the International Criminal Tribunal for the former Yugoslavia, 32 ILM 1159 (1993).

Actual knowledge suffices for mens rea. This is the case when soldiers or military commanders intentionally misuse AWS in areas in which they could not abide by the laws of war (outside manual restrictions), or if they knew or should have known that the autonomous weapon system had been programmed to commit war crimes and did nothing to stop its use.238 This would be the case if an operator deployed, or a commander knowingly ordered the use of, AWS that cannot distinguish properly between civilians and combatants in an environment in which both are likely to be present.239 Mens rea would then be fulfilled. Such intentional misuse, however, would occur only rarely. More difficult to prove are cases in which AWS would commit wrongful acts unexpectedly, without the knowledge of the humans involved.240 Robots will probably one day be able to learn from their amassed experience, and their behaviour will adapt to events outside the user’s control.241 That being said, according to the ICC, it is sufficient to fulfil the mens rea (“should have known”) requirement that the commander was merely “negligent” in failing to obtain knowledge of the criminal conduct of subordinates.242 To prove mens rea for the use of AWS, the error rate243 must be the focus. This rate varies with the state of technology. For the moment, it seems that only if an AWS has a very low error rate and is used in a structured environment is the assumption warranted that mens rea is not fulfilled.

236 See INTERNATIONAL CRIMINAL COURT, 'Pre-Trial Chamber II, Situation in the Central African Republic in the case of the Prosecutor v. Jean-Pierre Bemba Gombo', (2009) No.: ICC-01/05-01/08, 425; KAI AMBOS, Treatise on International Criminal Law (Oxford University Press, 2013), 215. 237 See INTERNATIONAL CRIMINAL TRIBUNAL FOR THE FORMER YUGOSLAVIA, 'Trial Chamber, Prosecutor v. Zejnil Delalić et al.', (1998) Case: IT-96-21-T, 398, 400; INTERNATIONAL CRIMINAL TRIBUNAL FOR THE FORMER YUGOSLAVIA, 'Appeals Chamber, Prosecutor v. Tihomir Blaškic', (2004) Case: IT-95-14-A, 76. 238 See SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 277. 239 Id. at 278.

However, especially for AWS of the first generation, which do not yet exist, it must be assumed that the mens rea criterion would be met. This is because technology generally develops over time and realistic scenarios for testing prior to deployment are hard to imagine. Accordingly, a commander cannot safely rely on AWS.

240 HUMAN RIGHTS WATCH/HARVARD LAW SCHOOL'S INTERNATIONAL HUMAN RIGHTS CLINIC, 'Advancing the debate on Killer Robots: 12 Key Arguments for a preemptive ban on Fully Autonomous Weapons', (2014) www.hrw.org/sites/default/files/related_material/Advancing%20the%20Debate_final.pdf, 12, last visited December 15, 2017; see SPARROW, 'Killer Robots', (2007) 24 Journal of Applied Philosophy, 66. 241 See THOMAS HELLSTRÖM, 'On the moral responsibility of military robots', (2013) 15 Ethics and Information Technology, 105. 242 ICC, 'Pre-Trial Chamber II, Situation in the Central African Republic in the case of the Prosecutor v. Jean-Pierre Bemba Gombo', (2009) No.: ICC-01/05-01/08, 429; contra INTERNATIONAL CRIMINAL TRIBUNAL FOR RWANDA, 'Appeals Chamber - Prosecutor v. Ignace Bagilishema', (2002) Case: ICTR-95-1A-A, 35. 243 See arguments for better testing prior to deployment and that subordinates must be aware of their obligations: AMBOS, Treatise on International Criminal Law, 199; WILLIAM J. FENRICK, 'Art. 28', in Otto Triffterer (ed.), Commentary on the Rome Statute (Beck/Hart, 1999), 9; SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 3545; FRAU, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', 244, concludes that Art. 86 (2) and 87 AP I entail a duty to control a potentially dangerous system and make sure that it is used according to the law.

For the question of the effectiveness of command and control, it is decisive to determine the point in time of such an assessment. In principle, the launch of an attack, in the sense of Article 49 AP I, is the decisive point in time for the establishment of criminal responsibility.244 With the use of AWS, this moment shifts to the point where the last human delegates the decision to attack to the AWS.245 When delegating the decision to an AWS, the last human is still able to “control” the system. A commander would knowingly decide to delegate the decision to attack to a machine rather than to a human. His supervision obligation enshrined in the concept of command responsibility would, in this case, intensify.246

In sum, in cases of intentional misuse, criminal responsibility can be established. Establishing a duty upon the commander to prevent crimes committed via AWS would, by contrast, always require proof of mens rea, which remains difficult.

244 See SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 1877. 245 See MARAUHN, 'An Analysis of the Potential Impact of Lethal Autonomous Weapons Systems on Responsibility and Accountability for Violations of International Law', 5, last visited December 15, 2017; ICRC, 'Report of the ICRC Expert Meeting on Autonomous weapon systems: technical, military, legal and humanitarian aspects in Geneva', (26-28 March 2014) www.icrc.org/eng/assets/files/2014/expert-meeting-autonomous-weapons-icrc-report-2014-05-09.pdf, 15, last visited December 15, 2017; DOD, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012) www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, 3, last visited December 15, 2017, stressing that persons authorizing “the use of, direct the use of, or operate autonomous and semi-autonomous weapon systems must do so (…) in accordance with the law of war”. 246 See FRAU, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', 244, 247, who bases command responsibility on Art. 86-87 AP I since Art. 28 ICC Statute shall only be owed towards subordinate humans; HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), 81, argues for amending rules with regard to command responsibility to cover LAWS, which this author sees as virtually impossible due to lack of international support.

3. Preliminary conclusion

Many argue that existing law is not sufficient to cover responsibility for AWS and that all possible candidates for responsibility are ultimately inappropriate. They deduce that AWS as such would be unlawful.247 However, this argument holds true only if it is impossible to find someone who can be held responsible. Amendments to existing rules are difficult to achieve at the international level but should nevertheless be attempted in the on-going discussions.

This section argued that AWS conduct performed during armed conflict can be attributed to the state. Holding states accountable is possible by means of positive due diligence obligations and a proper Article 36 AP I legal review.

Further, this chapter demonstrated, with regard to responsibility for AWS, that Heyns’ assumption was correct: responsibility should be placed first and foremost upon the states using AWS.248 It is the states that introduce these systems into battle; if something goes wrong, the responsibility should therefore rest on them. Additional or accessory individual criminal responsibility is possible in cases of intentional misuse or where actual or potential knowledge can be proved. There may be legitimate reasons against AWS as such; a responsibility gap is not among them.

247 See DOCHERTY, 'Losing Humanity. The Case against Killer Robots', 1-55; SPARROW, 'Killer Robots', (2007) 24 Journal of Applied Philosophy, 62-79; HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), 80. 248 See HEYNS, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013), 81.

III. Multilateral Discussions in the UN CCW

This chapter contains a dense description of the on-going Convention on Certain Conventional Weapons (CCW) discussions that focus on AWS. The process of finding a middle ground between those arguing for a ban and those supporting no action is a very recent development. The State Parties to the CCW started a regulatory process in May 2014 as a result of political and social pressure by civil society organizations. In November 2014, they put AWS onto the official agenda of the CCW.249 Discussions from 2015 through 2017 covered regulatory options ranging from an outright pre-emptive ban, to an additional protocol to the Geneva Conventions, and even an interpretative manual250 similar to the Tallinn Manual on the International Law Applicable to Cyber Warfare, intended to enable the exchange of best practices and codes of conduct.251

A. Structure of the Convention on Certain Conventional Weapons

The Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects aims to ban or restrict the use of specific types of weapons that are considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.252 The CCW’s structure allows flexibility in order to adapt to future developments: it consists of a chapeau Convention and annexed Protocols. While the Convention itself contains general provisions, prohibitions or restrictions relating to specific weapons and weapon systems are regulated in the annexed Protocols.253 All of this illustrates that the CCW is the appropriate forum in which to discuss regulating AWS.

249 CHAIRPERSON'S REPORT, 'Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), (Advanced version of 16 May 2014)', www.unog.ch/80256EDD006B8954/%28httpAssets%29/350D9ABED1AFA515C1257CF30047A8C7/$file/Report_AdvancedVersion_10June.pdf, 1-28, last visited December 15, 2017. 250 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 386-411, 406-410. 251 MICHAEL BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', (13 to 17 April 2015) www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 1-28, last visited December 15, 2017.

To date, 121 States are parties to the Convention, which entered into force in 1983.254

B. Discussions leading to Informal Meetings of Experts on AWS

There was a breakthrough in the discussions in the UN CCW during the fifth CCW Review Conference in December 2016: State Parties agreed to move on to the next step, the establishment of a Group of Governmental Experts (GGE). A GGE, as an expert subsidiary body of the CCW, is a practical prerequisite for formal discussions, which may ultimately lead to a binding agreement.255 Open-ended GGEs have been the established method of work for CCW deliberations on concerns such as landmines and cluster munitions.256 Based on that long-standing precedent, the GGE on AWS is now open to all interested states. States like Switzerland, Canada and Austria lobbied for the establishment of a GGE at the 2016 CCW Review Conference; China, the US and Russia did not.257 What does this tell us about the possibility of a potential AWS regulation?

252 See for details on The Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects as amended on 21 December 2001: www.unog.ch/80256EE600585943/(httpPages)/4F0DEF093B4860B4C1257180004B1B30?OpenDocument, last visited December 15, 2017. 253 See www.unog.ch/80256EE600585943/(httpPages)/4F0DEF093B4860B4C1257180004B1B30?OpenDocument, last visited December 15, 2017. 254 See ibid. 255 See CONVENTION ON CERTAIN CONVENTIONAL WEAPONS, 'Additional Protocol to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be deemed to be Excessively Injurious or to have Indiscriminate Effects (Protocol IV, entitled Protocol on Blinding Laser Weapons)', (1995) United Nations, Treaty Series, vol. 1380 (Doc. CCW/CONF.I/16 Part I), 370; MINES ACTION

As previously mentioned, experts are still deeply divided on the topic of a ban on AWS. Explicitly arguing against a ban, the US delegation declared:

“We believe that it is important to focus on increasing our understanding versus trying to decide possible outcomes. It remains our view that it is premature to try and determine where these discussions might or should lead.”258

CANADA, 'Lessons from Protocol IV on Blinding Laser Weapons for the Current Discussions about Autonomous Weapons - A Memorandum to CCW Delegates', (2014) https://bankillerrobotscanada.files.wordpress.com/2014/05/international-piv-memo-final.pdf, 1-4, last visited December 15, 2017; BONNIE DOCHERTY, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', (2015) Human Rights Watch/Harvard International Human Rights Clinic, 1-22; See www.unog.ch/80256EE600585943/(httpPages)/49166481C076C45AC12571C0003A51C0?OpenDocument, last visited December 15, 2017. 256 CAMPAIGN TO STOP KILLER ROBOTS, 'Step up the CCW mandate', (18 June 2015) www.stopkillerrobots.org/2015/06/mandateccw/, last visited December 15, 2017. 257 ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', (11 April 2016) available at: www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/reports/CCWR3.2.pdf, 4, last visited December 15, 2017. 258 MEIER, 'U.S. Delegation Opening Statement', (13 April 2015) The Convention on Certain Conventional Weapons (CCW) Informal Meeting of Experts on Lethal Autonomous Weapons Systems, available at: http://www.unog.ch/80256EDD006B8954/(httpAssets)/8B33A1CDBE80EC60C1257E2800275E56/$file/2015_LAWS_MX_USA+bis.pdf, last visited December 15, 2017.

Similarly, France declared, “any preventive prohibition of the development of any potential Lethal Autonomous Weapon System would therefore appear premature”.259

In addition, commercial actors, as drivers of technological development, are lobbying states on the domestic front against establishing any international regulation that would limit their freedom in the field of autonomous systems. Autonomous vehicles, for example, share the basic technology of AWS and are of high interest to the stakeholders involved (Google, Apple), because autonomous robots have the potential to revolutionize how we live, learn, and travel – as well as how we conduct warfare.260 Opponents of a ban bring forward the following arguments:261

• The technology is far off (Israel, the United Kingdom, and Spain argue that full AWS are a possibility of the distant future and may never exist at all);

• States have “no plans” to develop AWS;

• Existing law applies to AWS and is adequate to regulate their development and use.

259 FRENCH DELEGATION, 'Non paper: Legal framework for any potential development and operational use of a future LAWS', (11 April 2016) Third CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/37D51189AC4FB6E1C1257F4D004CAFB2?OpenDocument, 3, last visited December 15, 2017; France, Australia, and Germany have reiterated this view in the GGE meetings 2017: RAY ACHESON, 'Confronting Reality: We can build Autonomous Weapons but we can't make them smart', (14 November 2017) 5 CCW Report, www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2017/gge/reports/CCWR5.2.pdf, 2, last visited December 15, 2017. 260 WORLD ECONOMIC FORUM, 'What If: Robots Go to War?', (21 January 2016) www.weforum.org/events/world-economic-forum-annual-meeting-2016/sessions/what-if-robots-go-to-war, last visited December 15, 2017; The Future of Warfare: Race with the Machines: www.securityconference.de/en/activities/munich-security-conference/msc-2016/agenda-and-participants, last visited December 15, 2017. 261 ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', (11 April 2016) www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/reports/CCWR3.2.pdf, 1-6, last visited December 15, 2017.

Canada is against a ban and encourages proper testing under the national legal review obligation stemming from Article 36 AP I.262 The Netherlands supports neither the deployment of weapons without human control nor a moratorium. Similarly, Turkey wants to ensure human control over weapons, but does not support a pre-emptive ban.263

Supporters of a ban on AWS, such as Pakistan, bring forward the following arguments:264

• Threat to global peace and security, proliferation, and lowering of the threshold for warfare;

• Ethics and morality, accountability, transparency, and compliance with IHL and human rights;

• Possible diversion to non-state actors;

• Most weapon prohibitions come after mass devastation caused by that weapon.

This illustrates the dilemma: the most advanced states in the field of autonomous technologies, the US and Israel, together with Russia, oppose a ban, while China and Germany, with others, lobby for one. Almost all countries, however, want to maintain human control over the use of weapons – although states differ on the question of how meaningful this human control needs to be.265 What exactly are the arguments and standpoints put forward in the CCW meetings so far? This chapter critically analyses the states’ positions, coalitions, and expert contributions in the past CCW discussions. This study then focuses on the way ahead in the CCW with the newly established GGE format.266

262 Id. at 2. 263 Id. at 2. 264 FRANK SAUER, 'Autonomous Weapons Systems. Humanising or Dehumanising Warfare?', (2014) 4 Global Governance Spotlight, 4; ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', 2; FEDERAL FOREIGN OFFICE OF GERMANY, Lethal Autonomous Weapons Systems - Technology, Definition, Ethics, Law and Security (Zarbock GmbH & Co. KG, 2016).

C. AWS on the CCW agenda

As mentioned above, the CCW is the right forum in which to discuss AWS. This study takes this view not least because the CCW preamble recognizes “the need to continue the codification and progressive development of the rules of international law applicable in armed conflict”.267 This is probably also why the CCW state parties decided to put AWS on the official agenda in the first place. Most prominently, the civil society organization Campaign to Stop Killer Robots brought the topic of AWS to the attention of wider civil society and of the CCW.

265 ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', 4; for an overview of country positions as of 14 November 2017, see: THE CAMPAIGN TO STOP KILLER ROBOTS, 'Country Views on Killer Robots', (14 November 2017) www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CountryViews_14Nov2017.pdf, 1-4, last visited December 15, 2017. 266 PAUL SCHARRE, 'The Lethal Autonomous Weapons Governmental Meeting (Part I: Coping with Rapid Technological Change)', (9 November 2017) www.justsecurity.org/46889/lethal-autonomous-weapons-governmental-meeting-part-i-coping-rapid-technological-change/, 1-4, last visited December 15, 2017; PAUL SCHARRE, 'Lethal Autonomous Weapons and Policy-Making Amid Disruptive Technological Change', (14 November 2017) www.justsecurity.org/47082/lethal-autonomous-weapons-policy-making-disruptive-technological-change/, 1-4, last visited December 15, 2017. 267 CONVENTION ON PROHIBITIONS OR RESTRICTIONS ON THE USE OF CERTAIN CONVENTIONAL WEAPONS WHICH MAY BE DEEMED TO BE EXCESSIVELY INJURIOUS OR TO HAVE INDISCRIMINATE EFFECTS CONCLUDED AT GENEVA ON 10 OCTOBER 1980, (1992) vol. 1342 United Nations Treaty Series, 163.

1. Raising awareness of AWS

The traditional NGOs that regularly raise awareness of emerging topics that potentially challenge existing norms are the ICRC, Human Rights Watch and Amnesty International. Especially active with regard to AWS are ICRAC and the Campaign to Stop Killer Robots. The latter is arguably the primary source of pressure put on governments and the international community to address the topic of AWS before a point of no return is reached. The Campaign to Stop Killer Robots was founded in April 2013 and consists of 51 NGOs from 24 countries.268 The Campaign’s aim is to achieve a protocol pre-emptively banning AWS.269 The ban could follow the model of Protocol IV to the CCW, prohibiting blinding laser weapons.

2. 2014 Meeting of Experts on LAWS

In total, 87 countries participated in the first informal meeting of experts on AWS from 13 to 16 May 2014, and 30 country delegations presented statements at the opening of the conference. Cuba, Ecuador, Egypt, Pakistan and the Holy See supported the ban proposed by the NGO community right from the beginning. Although no country openly declared that it would be developing AWS in the foreseeable future, the Czech Republic as well as Israel and the US delegation stressed the potential benefits of more autonomy in weapon systems. However, many countries spoke in support of the concept of meaningful human control (MHC) that was introduced into the debate by the NGO Article 36. Countries including Germany, Switzerland, France, Austria, Norway, the United Kingdom and the Netherlands pursued a line of argument according to which MHC should be maintained whenever lethal force is used. Overall, there was near-general consensus among delegations that the ethical dimension may pose greater obstacles than the existing legal framework governing AWS.270

268 SAUER, 'Autonomous Weapons Systems. Humanising or Dehumanising Warfare?', (2014) 4 Global Governance Spotlight, 2. 269 CAMPAIGN TO STOP KILLER ROBOTS, 'The Solution', (2016) www.stopkillerrobots.org/the-solution/, last visited December 15, 2017.

3. 2015 Meeting of Experts on AWS

The second meeting of experts discussed emerging technologies (i.e. AWS) and their challenges.271 In total, 91 nations participated in this meeting, which took place in Geneva from 13 to 17 April 2015.

The UN High Representative for Disarmament Affairs, Angela Kane, addressed the CCW delegates in a video message at the beginning of the meeting by stating that

“over the past two years, this issue (AWS) has evolved rapidly from the realm of science-fiction to a serious international concern. This is fitting given prevailing trends – which include the rapid pace of technological development and the increasing automation of military systems. It is now well established that the advent of autonomous weapons raises wider concerns pertaining to international humanitarian law, ethics, human rights, as well as to broader questions of international peace and security.”272

270 JEAN-HUGUES SIMON-MICHEL, 'Report of the 2014 informal Meeting of Experts on LAWS', (10 June 2014) www.unog.ch/80256EDD006B8954/(httpAssets)/350D9ABED1AFA515C1257CF30047A8C7/$file/Report_AdvancedVersion_10June.pdf, 1-10, last visited December 15, 2017. 271 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', (13 to 17 April 2015), last visited December 15, 2017.

The international non-governmental-organization community addressed the CCW state parties in advance with the call to maintain MHC, in order to ensure that humans understand the process of targeting individual objects and the context in space and time in which an attack will take place.273 Explicit recommendations included that states commit themselves to MHC, prohibit weapons without MHC, and explain how control is applied in existing automatic or autonomous systems.274

Areas of common understanding emerged in the general debate among state parties. Absolute adherence to, and respect for, IHL and international human rights law in the examination of AWS was met with the mutual consent of the participating parties. Several delegations pointed out that systems such as AWS did not yet exist and denied any intention of acquiring them.275 Some state representatives held that AWS that do not allow for human intervention would breach IHL. Delegations underlined the importance of a fact-based discussion, while others added that although the CCW was the appropriate forum for IHL, human rights matters should equally be addressed within the framework of the Human Rights Council.

272 ANGELA KANE, 'Video Message to the CCW Meeting of Experts on LAWS on 13 April 2015', https://s3.amazonaws.com/unoda-web/wp-content/uploads/2015/04/LAWS-Meeting-April-2015-HR-Video-Message.pdf, last visited December 15, 2017. 273 For many: ARTICLE 36, 'Killing by Machine - Key Issues for Understanding Meaningful Human Control', (2015) www.article36.org/wp-content/uploads/2013/06/KILLING_BY_MACHINE_6.4.15.pdf, 5, last visited December 15, 2017. 274 Id. at 5. 275 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 15.

State parties claimed that no matter the type of weapon, they would always abide by the respective principles of IHL with regard to AWS, especially the principles of distinction and proportionality and a proper accountability chain.276 With regard to those IHL principles, some delegations argued that AWS could not comply and “called for an immediate, legally binding instrument providing for a ban”.277

According to some delegations, AWS would lead to a fundamental change in the nature of warfare. Moreover, it was expressed that AWS would lack judgment and compassion, and could increase the risk of waging war and of asymmetric warfare. Ultimately, some states fear that the lack of clear attribution would lead to impunity. Concerns also included proliferation and an arms race. The latter could challenge global stability, diminish achievements on disarmament in general, and ultimately serve terrorism if non-state actors were to possess these AWS.278 The majority of states, however, put forward that it is too early to draw such conclusions. The notion of meaningful human control was once again in the limelight. Some delegations spoke in support of this notion, proposing that it could help shape the understanding of AWS in general and serve as a means to distinguish different systems. Others, such as the USA, questioned whether MHC could be helpful at this stage. Concepts for defining AWS were also introduced, with some focusing on “critical functions” while others favoured “autonomy”. Still others concluded that it would help to make a distinction between automated and autonomous systems. Some delegations made clear that existing systems were not part of the on-going discussions. In this way, states that already possess stationary and defensive systems seek to keep them in operation. The dual-use character of military and civilian functions and the potential benefits of autonomy as such were also highlighted as being important for the civilian sector.279 Unsurprisingly, the parties concluded that discussions should continue and even be intensified. Proposals included holding another Informal Meeting of Experts, creating a mandate for a more ambitious formal Group of Governmental Experts, or an immediate ban. Many delegations support enhanced transparency by means of a proper Article 36 AP I review process and the sharing of best practices to foster trust building.280 According to the ICRC, existing systems should be on the table to enhance the understanding of the potential risks. While some delegations stressed the need for definitions at this early stage to understand AWS better, others called for a framework for regulations.281

276 Id. at para. 17. 277 Id. at para. 18. 278 Id. at para. 18. 279 Id. at para. 20. 280 Id. at para. 21.

In the session focusing on technical issues, Stuart Russell introduced potential implications of Artificial Intelligence (AI) for AWS. In an initial overview, he described AI as an intelligent connection of perception to action, in which intelligence means doing the right thing given the available information. In concreto, the right thing would be an action that is expected to achieve its goal or at least to maximize its utility.282

Russell first outlined the rapid progress of the past few years, accelerated by big-data processing, enhanced industrial research, and an overarching framework combining different theoretical approaches. According to Russell, autonomous systems and computers already perform tasks such as face recognition, flight navigation or tactical video games better than humans. In Russell’s mind, it is then only a matter of time until machines outperform humans. He goes so far as to say that humans will soon be more or less defenceless against artificially intelligent machines. In his analysis, the impeding factor in the development of AWS will be physical limitations, such as energy, speed, range or payload, rather than deficiencies of the computer systems. The biggest challenge is the uncertainty of outcomes and unpredictability of actions, which Russell calls the “unknown unknowns”.283 This is because objects, behaviours and potential circumstances are unknown at design time.284 However, with more research and work put into system integration and testing, the capabilities of AWS may come to include distributed situation awareness or integrated strategic and tactical planning and execution for extended tasks. Tasks they would be able to perform include clearing an underground complex or preventing air or ground infiltration over a large area. With the recent rapid progress in computers that can process big data, sense-and-act decision cycles in AWS could possibly be performed in milliseconds.285

281 Id. at para. 22. 282 STUART RUSSELL, 'Artificial Intelligence: Implications for Autonomous Weapons', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3.

In the presentation entitled “The Distributed Autonomy: Software Abstractions and Technologies for Autonomous Systems”, Andrea Omicini outlined that software systems would always have to be the focus when considering AWS. He described computer systems as “multi-agent systems (MAS)”286 and argued that the decisive factor will always be “autonomy”, regardless of who or what is performing the task.287 Since every agent strives to fulfil its own goals, autonomy is a “distributed property of socio-technical systems”, according to Omicini.288 In other words, autonomy is distributed between several agents or even agent communities. If AWS were created with self-organizing MAS, the system goals may no longer be visible when observing the system.289 At the same time, decision-making, responsibility and liability will be distributed between these entities.290

283 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 24, a, iii. 284 RUSSELL, 'Artificial Intelligence: Implications for Autonomous Weapons', 6. 285 Id. at 13. 286 ANDREA OMICINI, 'The Distributed Autonomy - Software Abstractions and Technologies for Autonomous Systems', see id. at, 3. 287 Id. at 4. 288 Id. at 7.

As a result, Omicini posits that challenges to attribution and responsibility must be addressed with norms that need to be implemented into agents.291

Paul Scharre stresses the need for a thorough understanding of the technical issues surrounding AWS by outlining the state of play and the expectations that accompany AWS.292 Scharre introduced three dimensions of autonomy. The first is the relationship between human and machine. Here, Scharre references the degree of human control by differentiating “in, on or out of the loop”.293 The second dimension is the degree of intelligence of the machine. The spectrum of complexity covered includes automatic, automated, autonomous and intelligent machines.294 The last dimension encompasses the nature of the task.295 This is the dimension of autonomy where it is crucial to know whether a person is performing the task, supervising a machine doing it, or whether the machine is simply doing it without the possibility of human intervention.296 Scharre reflects upon what people fear most about AWS: the uncertainty of delegating such pernicious tasks to a machine. However, Scharre points out that AWS would still be operated by humans and thus according to programming instructions and rules of engagement generated by humans.297 In the same vein, Scharre wants the debate to focus on likely developments rather than on science-fiction-based fears. The first AWS would probably come in the form of loitering missiles, adaptive cyber weapons, or undersea sub-hunting AWS. He continues by stating that he is aware that discrimination between combatants and civilians remains the biggest challenge. Therefore, IHL challenges can be addressed by using AWS solely in environments where they can abide by the rules (i.e. underwater or in outer space).298 Scharre summarizes the reasons for states to strive for AWS, which include speed, communication interruption, self-defence, and the technology race.299 Lastly, Scharre warns of excesses by making reference to the 2010 New York Stock Exchange crash, where an automated stock-trading system caused a loss of ten per cent in only a few minutes. With AWS, Scharre explicitly cautions against unanticipated outcomes and advises against using them in uncontrolled environments. The author also sees risks for AWS, such as cyber hacking, human errors, malfunctions or even the tricking of AWS with false data.300 He suggests that states come up with “rules of the road” on how to incorporate autonomy in weapons on the one hand, and share information on algorithms to avoid “flash wars” (security considerations permitting) on the other.301

289 Id. at 9. 290 Id. at 8. 291 Id. at 10. 292 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 24, b. 293 SCHARRE, 'Presentation at the UN CCW - Technical Issues I', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017. 294 Id. at 1. 295 Id. at 1. 296 Id. at 1. 297 Id. at 2. 298 Id. at 2. 299 Id. at 3. 300 Id. at 4.

The following discussion focuses on how to accurately define “autonomy” in order to distinguish between systems with different levels of autonomy. As a result of artificially intelligent software, according to the presiding panel, actions become less controllable. Military security concerns were identified as one of the biggest obstacles to engineering discipline and to implementing transparency in AWS. Nevertheless, many delegations see legitimate security concerns that need to be weighed against transparency requirements. Some fear the lack of predictability in complex environments. According to NGOs, focusing the debate on environments without IHL challenges (i.e. underwater) may lead to premature AWS deployment.302

Elizabeth Quintana outlined operational considerations for AWS, focusing on the changing security environment. According to her, militaries around the world consider AWS as fulfilling their need for speed, lower costs for military personnel and access to areas that humans cannot easily penetrate.303 However, the global defence industry needs to focus on resilience against cyber-attacks, according to Quintana. Moreover, there is lingering doubt about how to assess the accountability of military commanders.304 Quintana raises the following counter-arguments: AWS falling into the hands of enemies, a potential arms race, and a lower threshold for waging war if human soldiers are no longer put at risk.305 Lastly, Quintana advocates maintaining the existing thorough political oversight of the military even in the case of more autonomy, and argues that MHC should encompass the strategic, operational and tactical dimensions.306

301 Id. at 5. 302 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 24, iv. 303 ELIZABETH QUINTANA, 'Operational Considerations for LAWS', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS),

Strategic doctrines and their application to AWS were the topic of Heather Roff’s kick-off presentation. Roff gave a brief historical overview of the reasons behind the increased autonomy in weapon systems. While in the past autonomy was meant to compensate for fewer human soldiers and to enable better penetration underwater or in the air, dangerous or labour-intensive functions are now shifting to machines.307 Roff made clear that the reasons for more autonomy in weapon systems are not the same for all environments. For aerial autonomous systems, the focus is on enhanced endurance, increased intelligence collection, reconnaissance, surveillance and the trend towards swarm technology.308 However, technical obstacles such as expensive development and operations as well as interoperability drawbacks are accompanied by political scrutiny due to public distrust. The distance between operator and actual battlefield, which carries a potential for disproportionate use of force, is often criticized. According to Roff, military rationales for adopting naval AWS include obstacles to communication over long distances or underwater, and the difficulty of surveying vast areas.309 Here, Roff states, the lack of proper testing in complex environments and insufficient research into maritime law create non-negligible challenges. Currently, AWS for land use focus mainly on tasks like minesweeping, air defence, and the provision of logistics and support.310 Their biggest challenge in this environment is asymmetric warfare and the near-indistinguishability of the actors involved (i.e. combatants and civilians). Military rationale calls for small, easily distributable, stealthy, and durable systems.311 Roff warns, however, that this may lead to an arms race, and contribute to the proliferation and acquisition of AWS by non-state actors.312

www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 7-8, last visited December 15, 2017. 304 Id. at 9. 305 Id. at 9. 306 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 25; QUINTANA, 'Operational Considerations for LAWS', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 10, 13, last visited December 15, 2017. 307 HEATHER ROFF, 'Strategic, Operational and Tactical Considerations for LAWS', see id. at, 3.

308 Id. at 5. 309 Id. at 6. 310 Id. at 7. 311 Id. at 8. 312 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 28, a; ROFF, 'Strategic, Operational and Tactical Considerations for LAWS', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 9.

Darren Ansell introduced the topic of “Reliability and Vulnerability of Autonomous Systems” to the expert meeting. Ansell first laid out potential risks, like incorrect algorithms or incorrect use of the software accompanying the AWS, which could lead to more disastrous outcomes than with conventional technology.313 Mechanisms he proposes to reduce those risks include adhering to existing certified industrial standards, like the DO-178C airborne system certification, and building upon them for AWS’ regulatory requirements.314 Finally, Ansell proposes implementing the highest level of accuracy in programming and intensive and proper testing before deployment. The testing should encompass virtual (synthetic) environments as well as active research areas including formal methods and model checking.315 In this way, Ansell assured, programming errors could be reduced, although any guarantee would only concern the AWS’ decisions given certain inputs, not its overall effects.316

Wolfgang Richter focuses on the military perspective in his presentation, “Military Rationale for Autonomous Functions in Weapons Systems”. Richter names the supposed advantages of AWS: the reduction in loss of soldiers’ lives, more precision, a force-multiplier effect, and increased protection of civilians.317 He is not of the opinion that AWS could become a game changer, believing that the autonomy trend has been present for decades. Richter elaborates on this view with examples of automatic munitions such as landmines, tracking and target identification, and existing defensive systems that engage incoming rockets, as well as automatic target selection processes.318 However, Richter clarifies that there is a difference between autonomy in weapon systems and tactical autonomy. The latter will – as states and military commanders have argued for some time now – always require human supervision.319 Most current battlefield environments require the combined use of force and high degrees of coordination. In addition, prolonged reconnaissance, extensive movements of troops, timely support of weapons, as well as foreseeing different battlefield situations, will require human command and control in the future.320 Instead, Richter states, AWS will only carry out very specific tasks. The rest, including specific target selection, the choice of target categories, and the potential area and time of deployment, will always require human commanders applying their understanding of IHL.321

313 DARREN ANSELL, 'The Reliability and Vulnerability of Autonomous Systems', see id. at, 4. 314 Id. at 8. 315 Id. at 6-7. 316 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 28, b; ANSELL, 'The Reliability and Vulnerability of Autonomous Systems', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 7, last visited December 15, 2017. 317 WOLFGANG RICHTER, 'Military Rationale for Autonomous Functions in Weapons Systems (AWS)', see id. at, 4. 318 Id. at 2-3. 319 Id. at 5. 320 Id. at 5. 321 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 28, d; RICHTER, 'Military Rationale for Autonomous Functions in Weapons Systems (AWS)', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 11, last visited December 15, 2017.

Frédéric Vanderhaegen asks in his presentation whether dissonance could affect the resilience of autonomous systems. Focusing on resilience, he first outlines what this term means in the context of AWS: the capability of the system to control the instability of its autonomy. The outcome would either be success, in the form of resilience, or failure, which would result in vulnerability of the system.322 Dissonance, in turn, is the element that may disrupt the stability of autonomous systems.323 Finally, Vanderhaegen presents possible methods to manage dissonance.324

The following discussion concerns itself with the challenges facing to global stability and the arms race. While some fear a prolonged warfare, others stress that the arms race is not a specific feature of AWS, and that transparency is a possible countermeasure to these risks. Specific proposals to enhance transparency include publishing national Article 36 AP I weapons review processes. Other proposals include a code of conduct, new arms control mechanisms that take into account new technologies, and the sharing of best practices and national standards for testing and operational deployment. Again, it was stressed that humans will remain to make tactical, operational or strategic decisions and that AWS will rather serve as supplementary means to assist on the technical level of specific operations. Concerns raised include

322 FRÉDÉRIC VANDERHAEGEN, 'Les dissonance peuvent-elles affecter la résilience des systèmes autonomes', see id. at, 5. 323 Id. at 6. 324 Id. at 7. unpredictability due to self-learning machines in unknown environments and the difficulty of testing complex systems. Suggested safeguards to address such concerns were limitations of deployment in time and space, as well as self-destruct mechanisms as a last resort in case of malfunction or hijacking of AWS. As mentioned above, the potential benefits of AWS are minesweeping, search and rescue, and a potential reduction of civilian casualties due to more precise systems.

The most important point raised concerns the legal analysis of AWS. Life-long evaluation, testing and verification of proper adaptation to changing environments is stressed as a means to comply with IHL.325 This point will also be addressed in more detail further below.

Maya Brehm focuses on the notion of meaningful human control (MHC), one of the focus areas of the CCW meeting. Brehm addresses four different facets of the concept of MHC that could assist policy makers.326 It is generally agreed upon, Brehm states, that any weapon system using armed force must avoid being “uncontrollable” or “out of control”.327 Firstly, Brehm introduces a hypothetical case in which the degree of human control would no longer be acceptable. The

325 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 28, c. 326 MAYA BREHM, 'Meaningful Human Control', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017. 327 Id. at 1. bottom line is described as systems that would operate over long periods of time without human supervision or any possibility of intervention.328 She points out that a certain kind of human control is essential for AWS to be generally accepted. For Brehm, gathering information on operations is crucial to allow those involved to anticipate and reasonably foresee consequences for a proper legal assessment.329

The second facet Brehm outlines is that of controllability or predictability. To avoid system failures, robust and reliable systems must be developed, her argument goes.330 Thirdly, Brehm ponders whether AWS, based on their probabilistic matching algorithms, could decide to attack non-legitimate targets with lethal force.331 Lastly, Brehm suggests ensuring responsibility both in the process of applying force and for the outcome.332 Brehm sums up the contribution of MHC to the policy debate by stating that this mechanism allows for the drawing of normative boundaries.333

As illustrated, MHC and the notion of “autonomy” are identified as means to understand the specific characteristics of AWS as long as a generally accepted definition is still lacking. These criteria will hopefully enhance the

328 Id. at 2. 329 Id. at 2. 330 Id. at 3. 331 Id. at 3. 332 Id. at 4. 333 Id. at 5; BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 29-31. understanding of AWS and assist in finally limiting their use where necessary to ensure legality.334

The ICRC presents its definition of AWS: systems selecting and attacking targets without human intervention, that is, with autonomy in their IHL-critical functions. In addition, attention should be paid to systems already in existence or at a prototype stage rather than to hypothetical future scenarios. Research should be directed towards analysing critical functions in existing semi-autonomous systems. With regard to the emerging notion of MHC, the ICRC wants to differentiate between the various types of control that have to be applied at specific stages of action.

Overall human control over the process of target selection and the use of force, however, will remain crucial. The process would be slowed down should parties concentrate solely on “definition exercises”. And although transparency is important, the transparency of algorithmic regulation is not the same as transparency to, and observability by, a human operator.335

Marcel Dickow holds that although MHC is supposed to restrain “autonomy”, neither concept is fully defined yet. 336 However, to grasp the notion of

334 Id. at para. 34. 335 INTERNATIONAL COMMITTEE OF THE RED CROSS, 'Statement at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (17 April 2015) www.unog.ch/80256EDD006B8954/(httpAssets)/E2917CC32952137FC1257E2F004CED22/$file/CCW+Meeting+of+Experts+ICRC+closing+statement+17+Apr+2015+final.pdf, last visited December 15, 2017. 336 MARCEL DICKOW, 'A Multidimensional Definition of Robotic Autonomy - Possibilities for Definitions and Regulation', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3, last visited December 15, 2017. autonomy, he suggests defining several single vectors such as time, space, sensors or dealing with errors.337 Dickow proposes a new definition of robotic autonomy which reads “a system that exceeds a (technically and politically defined) threshold of several accumulated qualitative criteria” and thus encompasses technical and political dimensions.338 This should allow for calculating and benchmarking different layers that could play a role in robotic autonomy. Dickow’s multidimensional definition of robotic autonomy favours a techno-centric approach. He identifies as benefits of this approach that it is calculable, reproducible, negotiable, verifiable and transparent.339 Groups of criteria for vectors that should be taken into account in his Multidimensional Autonomy Definition (MAD) proposal are: physical (time, space, energy), sensors (quality, quantity, capabilities), weapons (quality, quantity, impact), human control (steering, veto), and machines (errors, fault tolerance, self-preservation).340

Another proposal would study AWS based on their critical functions to select and engage a target. In contrast to an approach focusing on the level of technological sophistication, the focus here is on the level of autonomy in the use of force. Supporters of this approach claim that this is precisely what raises ethical and legal questions. Further, what plays an important role in this study is the level of

337 Id. at 3-4. 338 Id. at 5. 339 Id. at 7. 340 Id. at 6. human supervision, whether the target is an object or a human, the nature of the environment, and the level of reliability and predictability of the machines. In a nutshell, this proposal asks how critical functions and human supervision can be affected by a certain deployment context.

With regard to policy implications, the existing legal framework of IHL must be sufficiently prepared to address the legality of AWS. MHC is a standard-setting concept for both the technical and normative aspects of these machines. One proposal goes further by suggesting that MHC become an explicit condition of IHL, with further elaboration and standard setting to evolve through interpretation and practice. In this regard, Article 36 AP I was mentioned as a tool to guarantee specification and transparency through standard setting.341

Neil Davison lays out the fundamental characteristics of AWS and contextual factors for human control in his presentation. He suggests focusing on the “critical functions” of AWS in general.342 When it comes to the use of force, however, Davison stresses that the concept of “autonomy”, and not the technical sophistication of a weapon system, must be under scrutiny.343 As a way forward, Davison wants to simplify the debate by excluding non-critical functions as well

341 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 33, b. 342 NEIL DAVISON, 'Characteristics of autonomous weapon systems', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 2, last visited December 15, 2017. 343 Id. at 2. as non-weaponized systems.344 According to Davison, the key factors for maintaining human control are human supervision, the freedom of action of the weapon and its technical capabilities.345

Nehal Bhuta presents the Collaborative Operations in Denied Environment (CODE) program framework.346 Discussing this DARPA program, Bhuta elaborates further that the goal would be to enable the supervision of a large group of UAVs while maintaining an appropriate level of “human judgment” over the use of force.347 To this end, collaborative autonomy networks and software must enable robust and dynamic performance of teams of systems.348

The debate among state parties and experts revolves around the emerging norm of MHC. While some delegations stress that it would be helpful to further elaborate this concept, others hold that it is too imprecise and subjective.

According to some, the notion of “autonomy” should be the decisive technical characteristic of AWS. In support of this opinion, some have criticized that focusing on human control as a limiting factor while trying to define autonomy would be incongruous. However, there is broad support for considering that MHC could help in assessing ethical rather than legal matters. Another proposal analyses the

344 Id. at 2. 345 Id. at 3. 346 NEHAL BHUTA, see id. at, 2. 347 Id. at 3. 348 Id. at 4. predictability of AWS with regard to self-learning machines to determine the potential lawful development and use of AWS. To properly address accountability challenges, it was upheld that, in the end, states would remain the responsible entities. Finally, the focus should be on an effective Article 36 AP I weapons review mechanism, the sharing of national best practices, or even a body that could, on an international level, assure application of those standards.349

Pekka Appelqvist describes how a systems approach to AWS would be preferable to a platform-oriented approach. It would take into account characteristics, considerations and implications. With her broader, so-called system of systems (SOS) approach, Appelqvist addresses issues such as safety and risk assessment, fault tolerance and reliability testing.350 Appelqvist wants to use on-board capabilities as well as the infrastructure in the operational environment.351 In her proposal, the Command, Control, Communication, Computer, Intelligence, Surveillance and Reconnaissance (C4ISR) system provides the intelligence infrastructure, as well as the tight boundaries of operation for AWS.352 Explicitly addressing the call for transparency and review

349 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 34 c, d. 350 PEKKA APPELQVIST, 'Systems approach to LAWS - characteristics, considerations and implications', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3, last visited December 15, 2017. 351 Id. at 4. 352 Id. at 4. process of AWS, Appelqvist states that following the SOS approach would allow for observation of AWS on an operational, technological and organizational level.353 According to her, the essence of cognitive features is that AWS will always begin to operate with incomplete information; in a second step, however, the intelligence acquired will become a transferable quality of systems for various platforms.354 Doubtful about the enforcement possibilities of restrictions upon AWS and their easy circumvention, Appelqvist suggests agreeing on principles that allow for their development.355

Giovanni Sartor outlines the legal regulation framework for civilian systems in order to explain challenges for AWS’ liability.356 National law characterizes the civilian autonomous system’s legal domain through civil and criminal liability.357 When it comes to criminal law (i.e. intentional misuse, recklessness or negligence), the approach would most likely be the same regardless of the identity of the source of the violation.358 In case it is impossible to hold someone accountable for criminal offences in the civil sphere, social benefits, such as fewer accidents with autonomous cars, may outweigh social costs and thus be acceptable. Concepts like strict liability, compulsory insurance or producer

353 Id. at 4. 354 Id. at 5. 355 Id. at 6. 356 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 41, a. 357 GIOVANNI SARTOR, 'Liabilities for autonomous systems in the civil domain', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3, last visited December 15, 2017. 358 Id. at 4-7. compensation may make up for liability gaps due to a lack of fault.359 In the case of military offences committed through AWS, however, there is no justification for balancing social advantages and disadvantages, although a proportionate military goal may suffice.360

Jason Millar took on MHC to explain the difficulties that may arise from dual-use technologies. Millar questions whether relying on the human factor helps the discussion of AWS, since human decision-making is often inconsistent.361 However, this should not lead to simply delegating functions to AWS either. Rather, Millar calls for extreme care in designing AWS. He recommends a combination of human and machine decision-making processes within a MHC framework, utilizing the lessons learned from existing systems.362

Caitríona McLeish posits that existing export control regimes for biological (BTWC) and chemical (CWC) dual-use products could serve as examples for AWS. The special task, according to McLeish, is to regulate military development in such a way that civil trade and development are not affected.363 Both chemical and biological weapons regimes make reference to purpose of

359 Id. at 14. 360 Id. at 15. 361 JASON MILLAR, 'Meaningful Human Control and Dual-Use Technologies', see id. at, 4. 362 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 41, b; MILLAR, 'Meaningful Human Control and Dual-Use Technologies', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 7, last visited December 15, 2017. 363 CAITRÍONA MCLEISH, 'Experiences from the CBW regime in dealing with the problem of dual use', see id. at, 3. use criteria rather than to specific technologies.364 What McLeish calls the “web of prevention” is an inter-sessional process in which lessons learned and best practices can be shared amongst governments.365 An effective inspection and verification system, however, requires (sometimes forced) cooperation from industry as well.366

The Israeli delegation affirms that humans would always be involved in the development, programming, testing, legal reviews, and the decision to use lethal force. Humans will continue to set constraints on time, area, targets and tasks for operations. Moreover, responsibility for IHL compliance in the use of AWS lies with the state.367

The US delegation doubts that a narrow MHC definition is the right way forward but calls for safeguards to prevent unintended consequences. Human control, especially under battlefield pressure, is vulnerable to external factors.

As an alternative to the emerging concept of MHC, the US delegation proposes to focus on “human judgment” as the decisive element in defining AWS. The US affirms that AWS will have to be used in a manner and timeframe consistent

364 Id. at 4-5. 365 Id. at 9. 366 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 41, c; MCLEISH, 'Experiences from the CBW regime in dealing with the problem of dual use', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 6. 367 EITAN LEVON, 'Statement by Israel at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (13 April 2015) www.unog.ch/80256EDD006B8954/(httpAssets)/1B879A0827FBA307C1257E29004778B8/$file/2015_LAWS_MX_Israel_GS+bis.pdf, 2, last visited December 15, 2017. with the intentions of the military commander. If this cannot be guaranteed, there must be a way to avoid or abort the mission. The comparison to existing control regimes was raised, but was rebutted by pointing out that there is not even a definition of AWS yet and that regulating would be premature. The UK delegation stresses that IHL has previously accommodated evolutions in technology; existing IHL would be sufficient to cope with AWS as well.368

In the following session, focusing on international humanitarian law, William Boothby posits that the Article 36 AP I weapons review process can effectively address the IHL challenges posed by AWS. The review of new weapons technologies is mandatory, both for state parties to AP I and by virtue of the customary law character of the provision.369 The author hypothesises the following scenarios: what is the accuracy in targeting performed by a machine or a human? Could machines anticipate military advantage or assess collateral damage prior to an attack? Could they properly distinguish between combatants and civilians or persons hors de combat?370 Currently, according to Boothby, technology is not able to provide offensive AWS with sufficient

368 UNITED KINGDOM OF GREAT BRITAIN AND NORTHERN IRELAND, 'Statement at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (13 April 2015) www.unog.ch/80256EDD006B8954/(httpAssets)/1CBF996AF7AD10E2C1257E260060318A/$file/2015_LAWS_MX_United+Kingdom.pdf, 1-2, last visited December 15, 2017. 369 WILLIAM H. BOOTHBY, 'Article 36, Weapons Reviews and autonomous weapons', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 2, last visited December 15, 2017. 370 Id. at 3-4. sensor technology to pass a weapons review.371 However, existing laws apply and do not necessarily preclude future systems from being lawful.372 He also points out that MHC is rather an interim policy approach that may preclude the application of existing targeting law.373 Hence, Boothby favours a proper application and implementation of the weapons review instead of relying on such non-legal concepts as MHC, which would call existing systems into question.374

Kathleen Lawand’s presentation focuses on whether the development of AWS would call for additional regulation.375 Lawand explains that AWS have unique functional characteristics that should be assessed with a view to intended and expected use, although that “use” may not always be foreseeable.376 According to the author, AWS is an “umbrella term” that covers many kinds of systems. Autonomy in this regard is a characteristic of a technology that attaches to functions of a weapon system, not to the weapon system itself.377 For each specific AWS to pass a legality test, it must predictably and reliably respect IHL in its expected uses. In detail, factors like the type of

371 Id. at 5. 372 Id. at 5. 373 Id. at 6. 374 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 42-44; BOOTHBY, 'Article 36, Weapons Reviews and autonomous weapons', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 6, last visited December 15, 2017. 375 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 46, a. 376 KATHLEEN LAWAND, 'Presentation on the Panel on possible challenges to IHL due to increasing degrees of autonomy', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017. 377 Id. at 1. task assigned (offensive or defensive), context (air, ground, sea, urban or uninhabited environment), the type of target (material or personnel), the force used, freedom of movement (fixed or mobile) and the timeframe are all relevant to the ability of a specific AWS to respect IHL.378 The foreseeability of effects decreases the more autonomous and complex AWS become. Safeguards on the operational level would have to ensure that commanders understand the specific abilities of systems for a comprehensive risk assessment in each case.379

Lawand calls for the implementation of Article 36 AP I reviews, especially with a focus on proper testing to enhance the predictability of AWS’ effects.380

When considering emerging technologies in general, Eric Talbot Jensen observes that throughout history, new technologies have been met with mistrust.381 He gives a historic overview of previous systems that had given rise to attempts at prohibition, such as balloons, submarines or aircraft, none of which succeeded.382

Therefore, Jensen suggests concentrating all efforts on compliance with existing law rather than on suggested prohibitions.383 Potential benefits, such as higher precision in warfare, could not otherwise be achieved and should not be abandoned just because of current technological limitations that could well be

378 Id. at 2. 379 Id. at 4. 380 Id. at 3. 381 ERIC TALBOT JENSEN, 'Statement on Lethal Autonomous Weapons', see id. at, 1. 382 Id. at 2. 383 Id. at 3. overcome in the future.384 However, Jensen acknowledges that at the point in time when AWS are equipped with AI, the legal situation must be reconsidered.385

Some delegations emphasize that humans should be the ones deciding on missions involving life and death because the targeting rules are best assessed by humans. However, this is an ethical rather than a legal assessment.386 The proposals include good-faith national implementation of the intended use of AWS and transparency measures such as the sharing of best practices and lessons learned. For some delegations, the Article 36 AP I obligation is an efficient and sufficient procedure to overcome concerns over AWS.387 However, others claim that the lack of technical and scientific experience, as well as the testing challenges at the national level, would not allow for compliance with the review obligation.

Jensen, however, replied that the US has chosen not to field certain weapon systems as a result of past legal weapon reviews. Moreover, concerns included a lack of international supervision to guarantee common standards; without such provisions, trust-building measures cannot be pursued.

384 Id. at 3. 385 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 46, b; JENSEN, 'Statement on Lethal Autonomous Weapons', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3. 386 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 46, c. 387 Id. at para. 47.

With regard to a potential accountability gap, doubts are expressed that the existing framework of command and control could sufficiently cope with AWS actions. For example, AWS in law enforcement (i.e. policing) would not be subject to the Article 36 weapons review. Furthermore, the increasing autonomy in weapon systems might risk expanding the notion of what constitutes an “attack” according to the UN Charter. However, special training and instruction in rules of engagement could strengthen guarantees for responsibility and ensure that misuse would effectively be prosecuted.388

Christof Heyns focuses on the human rights implications of AWS in the subsequent session, entitled “Overarching Issues”. He stresses that while the CCW mandate focuses on IHL, a comprehensive approach should not exclude human rights.389 He posits that this is necessary because increased autonomy in the lethal or non-lethal use of force would be employed not only in armed conflicts but also in law enforcement situations.390 Heyns outlines the provisions that would most probably be at stake in this regard: the right to life and dignity, but also humane treatment, bodily integrity, and the right to remedy.391 Especially the latter would be infringed due to a lack of individual accountability.392 Even if AWS could comply with IHL provisions, not only the dignity of the targeted person but also that of the person ordering the attack would be

388 Id. at para. 49. 389 Id. at para. 54, a, i. 390 Id. at para. 54, a, ii. 391 Id. at para. 54, a, iii. 392 Id. at para. 54, a, vi. infringed, says Heyns.393 Finally, the force used in law enforcement operations always needs to be a proportionate last resort, a requirement that AWS would not be able to meet.394

Patrick Lin offers an analysis of the right to life and the Martens Clause in view of AWS. Lin frames the right to life as a right to human dignity, which would be infringed when life-and-death decisions lack human judgment.395 Some critics of AWS, according to Lin, require dignity to include accountability, remedy and respect.396 To clarify the concept of the Martens Clause, Lin references two cases that elaborate on the constitutional respect of human dignity in Germany, “a nation with one of the most developed and thoughtful conceptions of human dignity in law”.397 However, moral reflection could be included in the programming and design of AWS.398 The Martens Clause would require a weapon system to meet the very high threshold of being mala in se to amount to a prohibition.399 Although the concept of the Martens Clause is

393 Id. at para. 54, a, iv. 394 Id. at para. 54, a, v. 395 PATRICK LIN, 'The right to life and the Martens Clause', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 2, last visited December 15, 2017. 396 Id. at 2-3. 397 Id. at 1. 398 Id. at 3. 399 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 54, a; LIN, 'The right to life and the Martens Clause', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 5, last visited December 15, 2017. relevant to the debate, it needs further elaboration if it is to support a ban of what Lin calls “an important dual-use technology”.400

Bonnie Docherty outlines the application of human rights and a potential accountability gap associated with AWS. She challenges the assumption that only IHL should be considered. Docherty holds that human rights apply both in peace (law enforcement) and during armed conflict.401 In general, the right to remedy should promote personal accountability.402 In the case of AWS, however, Docherty argues that the accountability gap under existing law results, both in armed conflict and in law enforcement, in all parties escaping liability.403 According to the author, accountability is relevant across the disciplines of IHL, international criminal law and domestic civil law.404

In sum, the debate crystallizes support for the idea that there should not be any limitations on the use of AWS other than IHL and IHR considerations. Again, an Article 36 AP I national review is mentioned as a good starting point, in that it could ultimately lead to international standards that take into account national sovereignty.405

400 LIN, 'The right to life and the Martens Clause', 1. 401 BONNIE DOCHERTY, 'Human Rights Implications of Fully Autonomous Weapons', see id. at, 3. 402 Id. at 11. 403 Id. at 11. 404 Id. at 15. 405 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 54, c.

With regard to the very nature of AWS, delegations differ in their positions. While some go so far as to assume that AWS would become independent intelligent agents some day, the majority stress that they should rather be considered somewhat sophisticated tools. At the same time, some delegations oppose the idea that AWS should decide over the life and death of humans. Rather, human commanders, the operating personnel or even the people involved in the programming should remain the responsible entities when it comes to the lethal use of force. It was also suggested that the next generation will more easily accept new technologies, including AWS.406

The applicability and relevance of the Martens Clause for the legality of AWS were met with inconsistent points of view among the participants. Some considered the Martens Clause to be relevant since no explicit provision was applicable; others stressed that there was no regulatory gap because IHL provides a framework for the potential use of AWS. However, there was almost unanimous support for the assumption that the right to life and human dignity should be main considerations.407 To this end, the CCW and the Human Rights Council should work together more closely.408

406 Id. at para. 57. 407 Id. at para. 59. 408 Id. at para. 60.

Michael Horowitz points out the correlation between public opinion, international security issues, and AWS. The Martens Clause paves the way for considerations of public conscience; however, public opinion is only one of many factors of public conscience, says Horowitz.409 According to him, the bar is set high to claim that a weapon system violates the dictates of public conscience.410 Horowitz bases his claims on a US survey in which the majority of voters preferred the use of AWS in cases where they could reduce the risk to US forces, and most of the participants would support AWS for defensive purposes.411 It is still too early to determine whether AWS would go against the public conscience, since only nine per cent in India and thirteen per cent in the US oppose defensive use of AWS.412 Horowitz sums up his research by stating that the outcome of public opinion depends on the context in which AWS could be used.413

Jean-Marc Rickli elaborates on the impact AWS would have on international security. He focuses on three pillars: strategic stability, non-state actors and future prospects. Firstly, Rickli defines strategic stability as “the condition that exists when two potential adversaries recognize that neither would

409 MICHAEL C. HOROWITZ, 'Autonomous Weapon Systems: Public Opinion and Security Issues', (16 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3, last visited December 15, 2017. 410 Id. at 4. 411 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 62; HOROWITZ, 'Autonomous Weapon Systems: Public Opinion and Security Issues', 8, 12. 412 HOROWITZ, 'Autonomous Weapon Systems: Public Opinion and Security Issues', 12, 14. 413 Id. at 15. gain an advantage if it were to begin a conflict with the other”.414 In his opinion, the offensive stakeholder would have an advantage if it possessed AWS.415 As a result, a pre-emptive strike would constitute the best strategy, thereby lowering the barrier to the use of force and ultimately resulting in an arms race.416 For non-state actors, AWS would be especially attractive since they could challenge states’ monopoly on the use of force, serve as force multipliers and support asymmetric warfare through swarming tactics; terrorists could even try to hack into AWS.417

Moreover, AWS could have side effects for the civilian population, and defensive AWS could be re-programmed for offensive use. It was also mentioned that many states would not be open to transparency due to commercial or military security concerns. However, some delegations put forward explicit proposals for more transparency. These would include the implementation and publication of national Article 36 AP I weapons reviews, the introduction of internal safeguards preventing proliferation and misuse, or the sharing of information and best practices through national points of contact.418

414 JEAN-MARC RICKLI, 'Some Considerations of the Impact of LAWS on International Security: Strategic Stability, Non-State Actors and Future Prospects', see id. at, 3. 415 Id. at 4. 416 Id. at 4. 417 Id. at 5-6. 418 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 64, c.

Sarah Knuckey elaborates on an international dialogue about transparency: governments should share information on AWS development, and this sharing must be monitored. Knuckey suggests building upon transparency in existing weapon systems and rethinking existing institutions and norms in light of emerging technologies.419 While some delegations favour continuing with a more informal meeting of experts in the future, others support a more robust mandate that could include questions such as the content and implementation of Article 36 AP I obligations, the application of the Martens Clause, and elaborations on concepts such as “MHC”, “critical functions”, “autonomy”, “command and control” or “system-human interaction”. This more formal framework could convene as a “Group of Governmental Experts (GGE)”.420

This formal approach was supported by the delegations of Germany, Finland, the Netherlands, Austria, Pakistan, Croatia, Ireland and Sierra Leone. Continuing discussions in an informal setting was supported by the US, United Kingdom, Australia, and Canada.421

The US delegation strongly opposed a ban on AWS. According to the US, the only practical solution is to legalise AWS, since AWS are preferable in certain contexts. Ethics, although crucial, should not be conflated with legal issues, and ethics are not decisive for the legality of AWS. In the same vein, the Martens Clause, with its principles of humanity and dictates of the public conscience, is important but does not in itself justify a ban. Since states disagree on the IHL legality of AWS, this supports the view that there is no customary IHL prohibition. In particular, the US suggested working on an interim document for a comprehensive review process that would outline best practices and apply explicitly to AWS. The UK delegation supported this interim document. According to the US delegation, humans will remain involved in the development, programming, testing, legal review and the decision to use AWS.

419 Id. at para. 72. 420 SARAH KNUCKEY, 'Towards an international dialogue about transparency: What kinds of AWS information should governments share, with whom, and on what basis?', United Nations Institute for Disarmament Research's side event 'Transparency and LAWS' on 16 April 2015, www.unog.ch/80256EDD006B8954/(httpAssets)/A079BA4521FCB9F2C1257E270044A4EB/$file/2015_LAWS_SideEvent_UNIDIR.pdf, last visited December 15, 2017. 421 CAMPAIGN TO STOP KILLER ROBOTS, 'Country Policy Positions', (25 March 2015) www.stopkillerrobots.org/wp-content/uploads/2015/03/KRC_CCWexperts_Countries_25Mar2015.pdf, 1-15, last visited December 15, 2017.

The Canadian delegation posits that it is less dignified to be killed by a machine than to be beaten to death by a human. Canada outlines the interaction between AWS and humans, and finds that the particular context of use, and transparency with regard to Article 36 AP I, are the key elements to take into consideration in future discussions. For Canada, IHL is sufficient to deal with emerging technologies such as AWS. France proposed a periodic review of issues related to AWS.422

422 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 2.

Jeroen van den Hoven speaks about moral responsibility and the engineering of AWS. He suggests taking a proactive, so-called “Value Sensitive Design” approach to consider ethical constraints before the point of no return has been reached.423 By following this design-oriented concept, van den Hoven assumes that competing values could be accommodated jointly.424 This way, critical technical functions may serve overall morally legitimate goals.425 Designing responsibility into AWS right from the start should be possible since, according to him, this is a “non-functional requirement”.426 Engineers should design systems in a way that accommodates responsible human agents, which van den Hoven calls “meta-task responsibility”.427 More precisely, van den Hoven suggests that MHC consist of ownership of a decisional mechanism that responds to the owner's influence, which he calls “responsibility sensitive design”.428

Ian Anthony addresses two measures that, for many of the delegates, constitute the way forward in the AWS debate. Transparency and information-sharing measures should strengthen mutual trust amongst the state parties.429 By making policies, stated intentions and capabilities openly available, common understanding may be fostered and areas of disagreement identified.430

423 JEROEN VAN DEN HOVEN, 'Why the Future needs us today - Moral Responsibility and Engineering Autonomous Weapon Systems', (17 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017. 424 Id. at 1. 425 Id. at 2. 426 Id. at 2. 427 Id. at 3. 428 Id. at 3-4. 429 IAN ANTHONY, 'LAWS at the CCW: Transparency and information sharing measures', see id. at, 1.

So far, little data on countries’ approaches to AWS is available, apart from the explicit policies of the USA and the UK and official statements made within the CCW framework.431 Anthony concretely recommends systematically collecting official information describing the state of play through a harvesting system and making that information widely available.432 The information-sharing mechanism Anthony proposes consists of several elements. Firstly, states should appoint a focal point for AWS within their national governments to combine expertise from various ministries.433 Secondly, this focal point should also serve as a point of contact for other states.434 Lastly, regular and structured meetings should be used to present, explain and share national positions.435 This information exchange should work on a voluntary basis and include how states comply with the Article 36 AP I obligation with regard to existing systems with automatic or autonomous functions.436 Elaborating on national weapons review procedures would help identify best practices, build trust through transparency and contribute to defining standards.437 However, Anthony acknowledges that this can only be done within the limits of legitimate national security concerns.438

430 Id. at 1. 431 Id. at 2. 432 Id. at 2. 433 Id. at 2. 434 Id. at 3. 435 Id. at 3. 436 Id. at 3. 437 Id. at 4.

In the following paragraphs, this study identifies the key take-aways from the expert discussions and national delegate statements of the two CCW expert meetings. It is important to note that these official discussions influenced the fifth CCW Review Conference in December 2016. Many of the experts as well as national delegates stressed the importance of proper national implementation of the Article 36 weapons review obligation. In addition, transparency was defined in the political realm as a condition in which “information about governmental preferences, intention and capabilities is made available”.439 Information sharing about best practices was identified as a way to build trust, by establishing national focal points to gather national expertise and share that information with other CCW state parties.440 Further, states wondered whether there existed an emerging international norm that could best be used to specify autonomy. This matters because, without a proper definition and delimitation of the issue, AWS cannot be regulated properly. On another note, MHC was introduced into the debate on AWS by the non-governmental civil society organization “Article 36”. Maya Brehm441 and Thilo Marauhn442 have spearheaded this concept, and almost every government has made explicit reference to this emerging notion since then. MHC is supposed to counter the concept of ‘appropriate human involvement’, which the US used to describe its approach towards more autonomy in weapon systems.443 The NGOs, however, claimed that this concept would allow for loopholes, such as interpreting it to mean no human involvement at all.444 The US delegation then stated that it wanted to “ensure appropriate levels of human judgment over the use of force”445 as an alternative terminology, thereby rejecting the notion of MHC. Another suggestion was to focus on the “critical functions” of target selection and use of lethal force. Here, the first two of four stages, programming and deployment, would rest with human operators. The following two stages, the selection and attacking of targets, however, raise questions as to what kind of human involvement would be required in these so-called critical functions.446 These issues were the starting point for the discussions in 2016/2017, to which we now turn.

438 Id. at 4. 439 BERNARD I. FINEL & KRISTIN M. LORD, Power and Conflict in the Age of Transparency (Palgrave Macmillan, 2002), 3. 440 ANTHONY, 'LAWS at the CCW: Transparency and information sharing measures', (17 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 3-4, last visited December 15, 2017.

441 MAYA BREHM, 'Meaningful Human Control', (14 April 2015) see id. at, 1. 442 THILO MARAUHN, 'The Notion of „Meaningful Human Control“ in Light of the Law of Armed Conflict', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017. 443 SAUER, 'Autonomous Weapons Systems. Humanising or Dehumanising Warfare?', (2014) 4 Global Governance Spotlight, 3. 444 Id. at 3. 445 MICHAEL W. MEIER, 'U.S. Delegation Opening Statement', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, 2, last visited December 15, 2017. 446 KATHLEEN LAWAND, 'Presentation on the Panel on possible challenges to IHL due to increasing degrees of autonomy', (15 April 2015) see id. at, 4.

4. April 2016 CCW Meeting of Experts on AWS

The third informal meeting of experts on AWS took place from 11 to 15 April 2016 in Geneva under the German presidency.447 The prospect of moving from informal discussions to a more formal mechanism attracted the participation of more state parties and civil society organisations than in previous years. In total, 95 states, 34 international experts, and several civil society organisations took part in this round of discussions at the Palais des Nations.448

Although the key issues concerning the challenges posed by AWS had not changed over the three years of discussions within the CCW, the German presidency secured the unanimous adoption of general recommendations for the fifth Review Conference in December 2016.449 In line with the programme of work, the following topics were discussed in an interactive exchange between experts on the panel, High Contracting Parties, and non-governmental organisations.

During the general debate, there was a common understanding that fully autonomous weapon systems do not yet exist. Views differed concerning the question whether states would be able to develop fully autonomous AWS in the near future, in the long term, or not at all. Here, several delegations stressed their intention not to develop fully autonomous AWS at all.450 Views also diverged when it came to a working definition. While China made this a prerequisite for understanding AWS,451 others stressed that a definition would be problematic at this stage given that AWS do not yet exist.452 Further, some delegations emphasized that elements of a working definition need to be discussed. Building upon discussions in earlier meetings, some delegations proposed “meaningful human control” as a framework to consider AWS in relation to human involvement. Through this framework, those in favour of the concept want to assess the legal and ethical challenges posed by AWS.453 Other delegations, however, pointed to the difficulties that would arise in defining the scope of this rather broad concept, and to its subjective nature. Hence, alternative paths proposed included using “appropriate human judgement” or considering the topic at a different stage, namely “weapon selection, deployment, target selection and attack.”454

447 For an overview of panel presentations see the compilation edited under the academic oversight of Robin Geiss: GERMANY, Lethal Autonomous Weapons Systems - Technology, Definition, Ethics, Law and Security (Zarbock GmbH & Co. KG, 2016). 448 See for an overview of participants: www.unog.ch/80256EE600585943/(httpPages)/37D51189AC4FB6E1C1257F4D004CAFB2?OpenDocument, last visited December 15, 2017. 449 MICHAEL BIONTINO, 'Recommendations to the 2016 Review Conference', (15 April 2016) http://www.unog.ch/80256EDD006B8954/(httpAssets)/6BB8A498B0A12A03C1257FDB00382863/$file/Recommendations_LAWS_2016_AdvancedVersion+(4+paras)+.pdf, 2, last visited December 15, 2017.

450 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 13. 451 PEOPLE'S REPUBLIC OF CHINA, 'Statement by China to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016) www.unog.ch/80256EE600585943/(httpPages)/37D51189AC4FB6E1C1257F4D004CAFB2, 2, last visited December 15, 2017. 452 MAYA YARON, 'Statement by Israel to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016) www.unog.ch/80256EDD006B8954/(httpAssets)/A02C15B2E5B49AA1C1257F9B0029C454/$file/2016_LAWS_MX_GeneralDebate_Statements_Israel.pdf, 2, last visited December 15, 2017. 453 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 15. 454 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 15.

Delegations agreed that international law, in particular international humanitarian law and human rights law, should guide the discussions and be applied to AWS. Views differed, however, when it came to the question of compliance with the fundamental principles of distinction, proportionality, and precautions in attacks. While some delegations held that the current IHL framework sufficiently covers AWS as well, others disagreed. In response, the legal review of new weapon systems was brought forward as a means to restrict AWS if necessary. Stressing the lack of implementation of this obligation and the challenges posed by AWS, others held that the review mechanism is not capable of fully addressing AWS.455

With regard to potential challenges to responsibility and accountability in the use of AWS, there was an almost common understanding that responsibility for the development, production, and deployment of AWS should rest with the state operating them. An unequivocal accountability chain in the deployment of AWS would be crucial to establishing a potential link to individual responsibility.456

455 MINES ACTION CANADA, 'Statement to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (13 April 2016) www.unog.ch/80256EDD006B8954/(httpAssets)/B07EB66CCB57B877C1257F9B004FF6C9/$file/2016_LAWS+MX+ChallengestoIHL_Statement_MineActionCanada.pdf, last visited December 15, 2017. 456 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 17.

As a way ahead, several delegations reiterated their commitment to confidence-building measures and transparency, in particular information sharing in the area of legal weapons reviews, which could develop best practices and benchmarks.457 The specific dual-use character of autonomous technology, as well as its benefits to and overlaps with civilian developments, was highlighted. Here, several technologically highly developed states stressed that potential measures should not hamper legitimate research and development of civilian applications.458

Protocol IV of the CCW banning blinding laser weapons was brought forward as an example of a regulation in which a weapon category was successfully banned without hampering research and development in its respective civilian field.459

While there was one proposal to continue the discussions in the informal process, the recommendations to the fifth CCW Review Conference were adopted unanimously. In addition, several delegations sponsored the transition to an open-ended group of governmental experts, to be considered by the Review Conference. The proposed GGE mandate comprised working on a definition, considering an instrument for more transparency and confidence-building measures, and building upon the laws applicable to AWS.460

In the first thematic session, focusing on mapping autonomy, experts examined the differing levels of autonomy in existing weapon systems as well as future trends.

457 Id. at para. 20. 458 Id. at para. 22. 459 Id. at para. 22. 460 Id. at para. 24.

This included an overview of progress and limitations of artificial intelligence.

The description of existing systems included UAS, missiles, maritime systems as well as land vehicles. Experts pointed out that the respective systems are used in “certain operational contexts”.461 Some considered these kinds of systems autonomous in order to highlight the progression of technology, while others stressed that they should rather be classified as automatic, i.e. featuring automatic target recognition. This resulted in a distinction between teleoperated, automated, and autonomous systems. Experts on the panel emphasized that existing systems so far rely on human supervision, “particularly in view of their technical limitations.”462 According to these experts, the limitations are twofold: on the one hand, technological limitations include the systems' current inability to process and deal with complex and unfamiliar situations due to a lack of situational awareness; on the other hand, human hesitance, such as the reluctance of military commanders to lose control of a deployed system, limits the prospects for full autonomy.463 Currently, research relevant to the field of AWS focuses on cooperation and interaction (swarms), mobility, and situational awareness. The latter describes the ability to collect and analyse data as a basis for decision-making by the machine itself.464 Experts agreed that an overly broad approach to autonomy, covering inter alia machine learning and self-learning, self-determination and artificial intelligence, might not be helpful in moving forward. Instead, they proposed focusing on the functions specific to AWS – especially those involving targeting and attacking – stating that this could provide a better understanding of these machines.465

461 Id. at para. 28. 462 Id. at para. 28. 463 Id. at para. 29. 464 Id. at para. 30-31.

The next thematic session466 attempted to establish a working definition for AWS. The concept of critical functions of AWS was introduced by Chris Jenks to provide more clarity. Lucy Suchman proposed autonomy as self-directed action, while Wendell Wallach concentrated on the concept of predictability of weapon systems. Alternative approaches presented focused more on the human-machine relationship and the level of control over AWS. Here, Anja Dahlmann proposed a so-called multi-dimensional risk assessment as a means to better understand the level of human control. Richard Moyes described the concept of meaningful human control, while Merel Ekelhoff presented an overview of current targeting processes and proposed building upon existing checks and balances. The notion of human judgement, initially proposed by the US delegation, was picked up by Dan Saxon. A working definition was considered very helpful to frame the discussions and overcome abstract debates. However, some delegations pointed out that an internationally agreed definition should not be a prerequisite at this stage and should not slow down substantive work.467 In general, several delegations proposed that a working definition would need to be sufficiently broad to cover existing systems and semi-autonomous weapons as well as future developments. Overall, discussions revealed that the central element was the human-machine interface and the level of human control over the critical function of the use of force by AWS.468

465 Id. at para. 33. 466 See for details: id. at para. 35. 467 Id. at para. 36.

As a means to specify what autonomy in AWS means, some delegations suggested focusing on specific characteristics such as operating “without human supervision from the moment of their activation.”469 Those underlining the CCW as a forum stressed that only a focus on critical functions, such as target selection and engagement, would fall within its mandate. Another challenge for a working definition was considered to be the concept of predictability, including risk and reliability – with delegations also stressing the differences in accepting human error versus machine malfunction. The session concluded with agreement that the topic of a working definition should be put on the GGE's agenda.470

The session dealing with challenges to international humanitarian law focused primarily on the review process of new weapons as well as on attributing responsibility for the use of AWS.471 Gilles Giacca of the ICRC presented an overview of the requirements of the legal review of weapons.472 Kimberley N. Trapp added the notion of “due diligence” as a way to address possible challenges to IHL.473 Overall, there was general support for the applicability of IHL to AWS.

468 Id. at para. 38. 469 Id. at para. 39. 470 Id. at para. 40-42. 471 Id. at para. 43-52.

Several delegations, however, doubted that AWS would be able to comply with the fundamental principles of distinction, precautions in attacks and proportionality when selecting and attacking targets fully autonomously. This developed into a discussion in which many delegations held that human judgement would be necessary in order to comply with the fundamental principles just mentioned. This led to the assessment that a human operator should be involved in the application of force because pre-programming a legal assessment, taking into consideration changing circumstances such as surrender or conduct of hostilities, would most probably be impossible given the current state of technology. In addition, predictability and risk in the context of complex and rapidly evolving conflict scenarios were among concerns mentioned by delegations. There was almost unanimous support for legal reviews as a means to ensure that new weapon technology will be used in conformity with IHL.474

The next session considered human rights and ethical issues related to AWS.475

The United Nations Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, raised the question of whether machines should be allowed to make life and death decisions.476 With respect to the CCW's mandate, international human rights law should, where applicable, also be considered in armed conflicts alongside IHL. According to delegates, this should not preclude discussions in other fora such as the Human Rights Council.

472 GERMANY, Lethal Autonomous Weapons Systems - Technology, Definition, Ethics, Law and Security, 119-127. 473 Id. at 285-286. 474 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 48; see below for more details on the legal review of new weapons. 475 Id. at para. 53-64.

Among the provisions of concern mentioned were human dignity, the right to life, the right to physical integrity and the right to a fair trial. Potential benefits of AWS, however, were also mentioned during discussions: use in hazardous environments or for dull, dirty or dangerous work, as well as the potential enhancement of IHL compliance through the filtering of large amounts of data in support of human operators, might improve human capabilities and even increase precision in the use of force.477 Ethical considerations may inform the determination of normative principles such as the Martens Clause, in particular by giving meaning to the principles of humanity and the dictates of the public conscience.

In the last session, experts and member states focused on security issues related to AWS, such as potential regional and global destabilization. The availability of AWS to non-state actors that do not abide by international law was mentioned as a destabilising factor.478 Resilience to cyber-attacks was also mentioned as a potential vulnerability of AWS, which would need to be factored into their design. As a result, underlying computer programmes are kept secret in order to conceal the respective cyber-attack vulnerabilities.479

476 GERMANY, Lethal Autonomous Weapons Systems - Technology, Definition, Ethics, Law and Security, 159. 477 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 57.

D. 2016 CCW Review Conference: GGE on AWS

The Fifth CCW Review Conference was held from 12 to 16 December 2016 in Geneva. During the CCW informal meeting of experts in April 2016, member states had agreed by consensus on recommendations for further work on AWS to be considered by the Fifth Review Conference.

The real turning point was the unanimously adopted recommendation that the 2016 Fifth Review Conference may decide to establish an open-ended Group of Governmental Experts of the High Contracting Parties to the Convention on Certain Conventional Weapons.480 The GGE is an expert subsidiary body of the Convention whose mandate is agreed upon by the annual meetings of the High Contracting Parties to the CCW.481 According to the recommendations to the Fifth Review Conference, the suggested GGE of the High Contracting Parties on emerging technologies in the area of lethal autonomous weapons systems (LAWS) could “explore and agree on possible recommendations on options related to emerging technologies in the area of LAWS, in the context of the objectives and purposes of the Convention, taking into account all proposals – past, present and future.”482

478 Id. at para. 70. 479 Id. at para. 65, 67. 480 BIONTINO, 'Recommendations to the 2016 Review Conference', (15 April 2016) www.unog.ch/80256EDD006B8954/(httpAssets)/6BB8A498B0A12A03C1257FDB00382863/$file/Recommendations_LAWS_2016_AdvancedVersion+(4+paras)+.pdf, para. 3, last visited December 15, 2017. 481 For meetings of the GGE see: www.unog.ch/80256EE600585943/(httpPages)/49166481C076C45AC12571C0003A51C0?OpenDocument, last visited December 15, 2017.

It was suggested in the informal meeting of experts in April 2016483 that the GGE should further consider the topics discussed above. Added to these topics were, in particular, a potential working definition of AWS and international humanitarian law in the context of AWS, as well as human rights law, if deemed applicable. Furthermore, legal and political responsibility and accountability as well as questions of ethics and moral considerations were among the recommended topics to be discussed in the GGE. In addition, the recommendations submitted by the German chair of the informal meeting of experts included the potential effects on regional and global security and stability. The recommendations suggested that the GGE may consider the potential influence of AWS on the threshold for armed conflicts and the risk of an arms race. Lastly, potential military value and risks (e.g. proliferation to non-state actors and the risks posed by cyber operations against AWS) were suggested to the Fifth Review Conference.

482 MICHAEL BIONTINO, 'Statement by the Permanent Representative of Germany to the Conference on Disarmament on Lethal Autonomous Weapons Systems (LAWS) to the Fifth CCW Review Conference', (12 December 2016) http://www.unog.ch/80256EDD006B8954/(httpAssets)/97043403171925A7C125808B00368CEF/$file/2016+Bericht+Vorsitzender+LAWS.pdf, 5, last visited December 15, 2017. 483 BIONTINO, 'Recommendations to the 2016 Review Conference', para. 4.

The Fifth Review Conference agreed to these points of discussion.484 The Conference decided that the GGE should meet in 2017 to prepare and submit a report to the 2017 Meeting of the High Contracting Parties to the Convention, consistent with the recommendations contained in document CCW/CONF.V/2 of the April 2016 informal meeting of experts on AWS.

E. CCW GGE Proposals for Regulating AWS

This chapter describes and analyses proposals that have already been put forward and could be built upon by the GGE.485 The GGE on AWS that met from 13 to 17 November 2017 was attended by 86 countries as well as UNIDIR, the ICRC, and the Campaign to Stop Killer Robots.486 The proposals discussed below aim to clarify the legal landscape by regulating emerging weapons technology.

The first proposal envisions a complete pre-emptive ban on AWS, analogous to Protocol IV of the CCW prohibiting blinding laser weapons. The second proposal under review is the Drone Accountability Regime (DAR), which would set up a whole new international regime to restrict the development and use of lethal drones. Both proposals, however, are flawed. A pre-emptive ban is too ambitious: it lacks state support,487 which is crucial. The DAR, on the other hand, is well situated in the real world and feasible. However, its authors envision a whole new regime and overlook that existing law already offers a mechanism.

484 CONFERENCE OF THE HIGH CONTRACTING PARTIES TO THE CONVENTION ON PROHIBITIONS OR RESTRICTIONS ON THE USE OF CERTAIN CONVENTIONAL WEAPONS, 'Final Document of the Fifth Review Conference CCW/CONF.V/10', (23 December 2016) http://www.unog.ch/80256EE600585943/(httpPages)/9F975E1E06869679C1257F50004F7E8C?OpenDocument, para. 30, last visited December 15, 2017. 485 For discussions on military effects and the legal and ethical dimensions of AWS in the 2017 GGE see: ACHESON, 'Losing Control: The Challenge of Autonomous Weapons for LAWS, Ethics, and Humanity', (15 November 2017) 5 CCW Report, www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2017/gge/reports/CCWR5.3.pdf, 1-6, last visited December 15, 2017; for the International Panel on the Regulation of Autonomous Weapons (iPRAW), an independent group of experts from different nation states supported by the German Federal Foreign Office, see: INTERNATIONAL PANEL ON THE REGULATION OF AUTONOMOUS WEAPONS, 'Focus on Technology and Application of Autonomous Weapons', (2017) www.ipraw.org/state-of-technology/, 1-26, last visited December 15, 2017. 486 THE CAMPAIGN TO STOP KILLER ROBOTS, 'Support builds for new international law on killer robots', (2017) www.stopkillerrobots.org/2017/11/gge/, last visited December 15, 2017.

Article 36 of AP I offers a regime to review new weapon systems. The provision demands that, “in the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party”.

The legal review of new weapon systems is not without flaws either. The number of states that actually implement this obligation is rather low. Although the US is not a state party to AP I, it has a legal review scheme in place that mirrors the obligation enshrined in Article 36 AP I. The most recent Department of Defense Law of War Manual of June 2015 explicitly addresses AWS by stating that “autonomy in weapons” is lawful.488 However, this seems to oversimplify the complex challenges posed by AWS, particularly considering that many experts and civil society organizations hold that AWS are unlawful and should be pre-emptively banned.489 Conducting a proper legal review of new weapon systems could ensure that restrictions are applied to AWS. These restrictions can be formulated according to the specific technological abilities of each system, which would then be made to abide by the rules of international humanitarian law. An on-going, case-by-case re-review of specific models could ensure that functional, geographical, or target-specific restrictions are adjusted where technological development makes it possible to expand the functions that are considered lawful. Hence, both the ban and the DAR proposals should have focused on strengthening the existing system rather than calling for novel legislation. The following chapter argues that, with proper domestic implementation, the Article 36 AP I legal review mechanism should be the method of choice to address the challenges of emerging autonomous weapon systems.

487 Several states, among them such powerful countries as Russia and the United States, claim it is too soon to begin negotiating new international law or politically-binding measures, see: id. at 1. 488 DOD, 'Department of Defense Law of War Manual', (12 June 2015), 329.

1. Pre-emptive Ban on AWS

Several civil society organizations call for a pre-emptive ban on autonomous weapon systems as the best way to regulate emerging weapon systems. The pre-emptive ban on blinding laser weapons enshrined in Protocol IV might serve as a model for tailoring a prohibition of AWS. Recently, Human Rights Watch (HRW) and the International Human Rights Clinic at Harvard Law School offered a proposal entitled “Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition.”490 This section analyses the argument that a pre-emptive ban would be the best way to address autonomous weapon systems. The paper points to similarities between the pre-emptive ban on blinding lasers and autonomous weapon systems. The authors correctly draw the analogy to the First CCW Review Conference in 1995, which adopted Protocol IV prohibiting blinding laser weapons. It is important to note that, like AWS, these weapons were still in development at the time.491 This is the strongest similarity to the weapons in our study. Beyond this point, the similarities are not as convincing.

489 SHARKEY, 'The evitability of autonomous robot warfare', (2012) 94 International Review of the Red Cross, 787-799; RICHARD STONE, 'Scientists Campaign Against Killer Robots', (2013) 342 Science, 1428-1429.

a) Characteristics

The report elaborates on the history of the blinding laser protocol to support its claim. The authors argue that there are five areas of concern that both blinding lasers and AWS share.492 These are:

• Concerns under the Martens Clause,

• Threats to civilians,

• Risks of proliferation,

• The need to clarify the legal landscape,

• Protection of legitimate technology.

490 DOCHERTY, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', (2015) Human Rights Watch/Harvard International Human Rights Clinic, 1-18. 491 Id. at 1. 492 Id. at 1.

Human Rights Watch (HRW) holds that the Martens Clause and its enshrined principles of humanity and public conscience led to the legally binding prohibition on blinding lasers.493 Relying on the substitute character of the Martens Clause (meaning it applies only when explicit provisions governing a certain conduct or situation are not available), Human Rights Watch considers it particularly relevant: especially in cases of newly emerging weapons, there is often a lack of specific law governing such innovations. HRW also cites the International Court of Justice’s Nuclear Weapons Advisory Opinion, which states that the Martens Clause has “proved to be an effective means of addressing the rapid evolution of military technology”.494 The organization elaborates on the different interpretations of the Martens Clause itself: either as an independent legal standard or as a guiding principle495 for existing law.496 As HRW points out, International Humanitarian Law is “written for and applied by human beings,”497 and this shall hold true for fully autonomous weapon systems.498

HRW is correct in holding that the Martens Clause is relevant for an assessment of AWS. However, HRW’s claim that a ban is the only possible solution to protect the Martens Clause is misguided. At first glance, this claim, made by this type of organization, makes sense. However, an exhaustive analysis of the pros and cons of other possibilities is missing. The paper does not discuss any alternative to a ban, such as limitations on certain functions. HRW does mention that the killing of humans by machines is the quintessential example of a lack of compassion or a sense of morality, and that this would most likely violate the principle of humanity or the dictates of public conscience.499 Is the same true, however, if machines are restricted to defensive functions? By not differentiating between the various possibilities, the proposal lacks credibility and foresight: states open to restricting offensive uses but not defensive functions will not consider it. HRW itself mentions that the Martens Clause could inform the legal review of new weapons enshrined in Article 36 AP I.500 Here, the authors could have elaborated further on how the legal review mechanism could be used to differentiate between lawful and unlawful functions in AWS. In addition, when the paper describes the developments in the CCW that led to Protocol IV prohibiting blinding lasers, the influence of the Martens Clause on this decision is hard to assess. In the paper’s final statement, that the Martens Clause “can help drive the adoption of a pre-emptive ban,”501 the Clause appears as a soft policy measure rather than a clear legal provision.

493 Id. at 3. 494 ICJ, 'Legality of the Threat or Use of Nuclear Weapons, (Advisory Opinion of 8 July 1996)', (1996) I.C.J. Reports, para. 78. 495 CASSESE, 'The Martens Clause: Half a Loaf or Simply Pie in the Sky?', (2000) 11 European Journal of International Law, 212. 496 DOCHERTY, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', (2015) Human Rights Watch/Harvard International Human Rights Clinic, 4. 497 Id. at 4. 498 Id. at 4.

499 Id. at 4. 500 Id. at 3. 501 Id. at 7.

HRW appropriately specifies that the principles of international humanitarian law, namely distinction and proportionality, are often assumed to pose the biggest obstacles for AWS. Especially in contemporary conflicts, a lack of comprehension of human behaviour and a lack of visibility in weighing military advantage against civilian harm pose insurmountable (technical) challenges to AWS. The concern about proliferation is understandable. It is, however, not specific to this kind of weapon but applies to almost all weapons; some are considered so dangerous that they are regulated in specific arms trade treaties. The argument often put forward is that repressive regimes could turn “emotionless” machines against their own people.502 It is important to note that there exists no precedent for banning a weapon simply because it could be misused. For the foreseeable future, only very few, highly developed states, especially the US and Israel, will be able to develop such systems. Moreover, a system of mutual control, such as the one outlined in the DAR, with a provision that only member states may trade the technology, could achieve the same objectives with less restrictive measures. HRW argues that a ban would be essential since some parties would probably ignore “less restrictive regulations.”503 The paper does not elaborate on this point. The question that arises is: why would the same actors then abide by a ban? It seems to be rather a question of enforcement.

502 Id. at 10. 503 Id. at 12.

HRW holds that an additional protocol banning AWS would help clarify the legal landscape. This is certainly true. But again, the paper does not address less restrictive alternatives, such as existing interpretative manuals like the Tallinn Manual on Cyber Warfare. HRW mentions the “scientific uncertainty” concerning the future of autonomous technology.504 A ban, however, does not naturally come to mind as the only possible answer. HRW presents both positions currently discussed at the global level. There are those who favour the “wait and see” approach, holding that the status quo and existing law are sufficient to regulate AWS; this school of thought wishes to learn more about the technology before moving forward. On the other hand, the paper describes the view of advocates, like HRW itself, who call for a pre-emptive ban in the form of a legally binding instrument, such as an international treaty. Their argument is that only such an instrument could ensure that the legality of specific weapons and modes of attack is clarified.505 Moreover, a ban would facilitate enforcement and finally lead to the stigmatization of the weapons that HRW ultimately intends to achieve.506 The paper then goes into more detail, citing the views of states parties at the First Review Conference to the CCW that supported a ban on laser weapons. The paper accurately points out that the prohibition of blinding laser weapons in Protocol IV led to an unambiguous regulation that made case-by-case determinations by an international legal standard possible.507 However, the envisioned usefulness of laser weapons on battlefields was very low at the time of the prohibition. The same is not true for AWS.

504 Id. at 12. 505 HUMAN RIGHTS WATCH/HARVARD LAW SCHOOL'S INTERNATIONAL HUMAN RIGHTS CLINIC, 'The Need for New Law to Ban Fully Autonomous Weapons: Memorandum to Convention on Conventional Weapons Delegates', (2013), 14-15. 506 DOCHERTY, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', (2015) Human Rights Watch/Harvard International Human Rights Clinic, 12.

The paper then aptly identifies legitimate state concerns for the protection of civilian technology. This point is salient and probably one of the biggest obstacles for advocates of a pre-emptive ban. A ban, however, is not the best way to address this legitimate concern. HRW holds that technology in general does not suffer when a specific weaponized subset of that technology is prohibited.508 HRW makes a rather practical proposal: to secure legitimate civilian and military technology, such as autonomous driving, it wants to draw the line at weaponization. The prohibition, according to HRW, would also exclude semi-autonomous weapons and existing remotely controlled unmanned robots, such as drones.509 To leave room for the legitimate development of autonomy in machines, HRW suggests prohibiting only technological development that can be used exclusively for fully autonomous weapons or that is explicitly intended for use in such weapons.510 To further counter the arguments of critics of a ban, HRW holds that over the 20 years of the laser ban the prohibition has not spilled over to lawful civilian and military applications, such as guidance, measurement, and targeting functions of lasers.511

507 Id. at 13. 508 Id. at 14; The International Committee for Robot Arms Control (ICRAC) also holds that the peaceful use of robotics should be at the centre of discussions on AWS: http://icrac.net/2015/11/new-icrac-video-on-peaceful-uses-of-robotics-and-banning-laws/, last visited December 15, 2017. 509 Id. at 14. 510 Id. at 14.

HRW concludes by stating that “a pre-emptive ban can keep problematic weapons out of arsenals while protecting related lawful technology”.512 However, HRW has not convincingly demonstrated why the regulation needs to take the strongest form possible, a ban. The objective of prohibiting only certain very narrow functions can also be achieved by a more flexible regulatory system that takes into account developments in the technological sophistication of autonomous systems. For example, the Organization for the Prohibition of Chemical Weapons (OPCW) oversees adherence to the Chemical Weapons Convention, which prohibits the use of certain chemical weapons, including those with dual-use character.

b) Enforceability

When it comes to the possible applications of AWS, the similarities with blinding lasers end. In addition, the potential advantages of AWS in battlefield situations are much higher than they were for blinding laser weapons. HRW acknowledges some of the differences, in particular the specific legal problems posed by each of the systems and their respective technological character.513 However, says HRW, despite these divergences, a ban should still be supported.

511 Id. at 14-15. 512 Id. at 16. 513 Id. at 16.

While both systems challenge international humanitarian law, AWS threaten the dictates of public conscience enshrined in the Martens Clause. The organization supports its argument by pointing out that machines taking human lives without meaningful human control could undermine this principle far more than blinding lasers ever could.514 Moreover, HRW holds that the principle of distinction creates tougher issues for AWS, since their programming would have to distinguish between civilians and combatants, as well as the wounded hors de combat.515 Another problem HRW points out is the higher scale of damage AWS would cause compared to laser weapons. This, however, is not convincing: it would require understanding the technology better, and it would be wise to consider that lasers, too, could become as big a threat as AWS, depending on the weapon carrier platform.

This study finds it telling that HRW acknowledges that blinding lasers refer to a certain type of weapon, while fully AWS represent a whole class of potential weapons,516 but does not analyse this stark difference any further.

HRW could have looked into the discrepancies between defensive and offensive uses of weapons, especially since this is a major point of contention for many states. HRW concludes by stating that, despite these differing characteristics, the parallels are salient enough to make Protocol IV a valuable precedent. As mentioned, this study disagrees. The pre-emptive ban report should have differentiated further between specific systems or considered alternatives.

514 Id. at 17. 515 Id. at 17. 516 Id. at 17.

c) Feasibility

While the proposal reflects the view of some experts and states,517 it takes a sledgehammer to crack a nut. A pre-emptive ban is not the right way forward. Firstly, the legal disparities are stronger than the paper admits. Secondly, a pre-emptive ban is impractical: states like Israel and the US would never support such an endeavour. This study does, however, agree that Human Rights Watch has correctly identified the CCW as the appropriate forum for discussing potential regulations, since experts have already convened there. HRW has made explicit recommendations to CCW member states, including a new protocol that would ban fully AWS within one or two years.518 This plan is very ambitious, especially since the CCW member states have only just decided to move discussions to the more formal format of a GGE.

HRW articulates comprehensible concerns associated with AWS. However, two counter-arguments make the paper less convincing. Firstly, a ban is not the only solution to the five main challenges identified in its analogy analysis. A legal review of weapons could be the ideal restrictive measure to ensure that these five concerns are addressed and controlled. According to HRW, blinding lasers “would not have been a ground breaking addition to warfare” whereas AWS “have the potential to revolutionize it.”519 The revolution HRW refers to, however, is viewed only negatively: “the unprecedented lack of human control.”520 That being said, many experts agree that AWS have the potential to generate ground-breaking advantages in warfare: a lack of negative emotions, such as fear and revenge, and the potential to spare soldiers’ lives. By focusing only on the negative and not the positive, the paper puts its own credibility into question.

517 According to the Campaign to Stop Killer Robots, the following states support a ban: Algeria, Argentina, Bolivia, Chile, Costa Rica, Cuba, Ecuador, Egypt, Ghana, Guatemala, Holy See, Mexico, Nicaragua, Pakistan, Panama, Peru, State of Palestine, Venezuela, and Zimbabwe, see: CAMPAIGN TO STOP KILLER ROBOTS, 'Who Supports the Call to Ban Killer Robots?', (27 June 2017) www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ListBanEndorsers_27June2017.pdf, last visited December 15, 2017. 518 DOCHERTY, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', 3.

HRW should have mentioned that some states have already stated their opposition to an outright ban. As mentioned, the US, UK, and Israel are at the forefront of this position.521 According to supporters of this view, the situation is still unclear, and neither possible risks nor possible benefits can be assessed on a solid basis. In addition, they assume that autonomy per se entails no legal limitations. They further argue that the way forward should be to focus on existing international humanitarian law and human rights law to ensure compliance.522 The strongest argument against a pre-emptive ban is that it took states three expert meetings within the framework of the CCW just to move towards the GGE process.523 This expert format preceded the successful ban on blinding lasers and is currently used for addressing cyber technology. Even civil society organizations do not see much potential for state support for a ban, neither at the CCW expert meeting on AWS that took place from 11-15 April 2016 nor at the Fifth Review Conference held from 12-16 December 2016. The Campaign to Stop Killer Robots, after years of informal meetings, commented that “the decision lacks ambition, shows no sense of urgency, and reflects the CCW’s usual ‘go slow and aim low’ approach.”524 Most commentators agreed with that statement. Under these circumstances, the reasons for HRW’s ambitious position on a successful pre-emptive ban are not convincing.

519 Id. at 17. 520 Id. at 17. 521 OWEN BOWCOTT, 'UK opposes international ban on developing "Killer Robots"', (13 April 2015) The Guardian (available at: www.theguardian.com/politics/2015/apr/13/uk-opposes-international-ban-on-developing-killer-robots), last visited December 15, 2017.

As outlined, state parties within the CCW hold profoundly different views. As long as advantages and disadvantages are not further explored, states that already rely heavily on different levels of autonomy, such as the US and Israel amongst others, will reject the prospect of new regulations restricting research and development. In general, statements made by states reveal that there is no consensus and that interests do not align. Michael Glennon could add another interesting perspective with regard to cyber capabilities: if states are relatively equal in their capabilities, “the imposition of legal limits freezes in no advantage or disadvantage.”525 When it comes to AWS, however, states are relatively unequal, with a high technological advantage for the US. Moreover, as long as the potential effects of autonomous technology, artificial intelligence, and machine learning on warfare are not fully grasped, states cannot evaluate the advantages they might lose by not pursuing this technology. So far, no state has expressed a desire to propose any new law. Hence, the chances for new legislation are relatively low, supporting this study’s position that the focus should be on existing law instead.

522 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 61. 523 Id. at para. 78. 524 CAMPAIGN TO STOP KILLER ROBOTS, 'More talks in 2016 but little ambition', (13 November 2015) www.stopkillerrobots.org/2015/11/noambition/, last visited December 15, 2017.

d) Poor Precedent

Rebecca Crootof has already commented on the recent HRW proposal. She too holds that the report fails to address the decisive differences between the successful blinding laser ban and the attempt to ban AWS pre-emptively.

Crootof mentions the stigmatization of certain classes of weapons that was achieved through bans on chemical weapons, biological weapons, and anti-personnel landmines. She especially acknowledges the temptations of the analogy Protocol IV offers: it has been the most successful pre-emptive ban of new weaponry, it has never been violated to date, and no state has deployed blinding lasers in armed conflict since.526 Having said this, however, she determines, as does this study, that the two weapon systems do not have much in common and that the analogy hence does not offer much guidance for AWS.

525 MICHAEL J. GLENNON, 'The Road Ahead: Gaps, Leaks and Drips', (2013) 89 International Law Studies, 379.

In a historical analysis, Crootof identifies the factors that lead to successful weapons bans. Taken cumulatively, the following conditions suggest that an envisioned ban could succeed:527

• If the weapon is not effective in the first place,

• If there are other means to achieve similar military objectives,

• If the weapon is not novel, in other words, if it can easily be analogized to existing weapons, and if the usages and effects of the weapon are well understood,

• If similar weapons have already been regulated,

• If the weapon is unlikely to create social or military disruption,

• If the weapon is not yet part of states’ armed forces,

• If the weapon is likely to cause superfluous injury or suffering,

• If the weapon is considered to be inherently indiscriminate,

• If the weapon is perceived to create public concern and spur civil society activism,

526 REBECCA CROOTOF, 'Why the Prohibition on Permanently Blinding Lasers is Poor Precedent for a Ban on Autonomous Weapon Systems', (24 November 2015) Lawfare Blog (available at: www.lawfareblog.com/why-prohibition-permanently-blinding-lasers-poor-precedent-ban-autonomous-weapon-systems#), 2, last visited December 15, 2017. 527 CROOTOF, 'The Killer Robots Are Here: Legal and Policy Implications', (2015) 36 Cardozo Law Review, 1883-1891; A similar study found 7 characteristics: SEAN WATTS, 'Regulation-Tolerant Weapons, Regulation-Resistant Weapons and the Law of War', (2015) 91 International Law Studies, 608-618.

• If there is sufficient state commitment in enacting regulations,

• If the scope of the ban is clear and narrow,

• If violations are potentially identifiable.

If applied to AWS, what would these conditions suggest for the likelihood of a ban? Crootof comments briefly that the only conditions that apply to AWS are civil society activism and public concern.528 All other factors speak against the likelihood of a successful ban. This statement merits closer examination. Although many assume that fully AWS do not yet exist, projects and prototypes of highly sophisticated weapon systems do. This study has mentioned the Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV) currently under development by DARPA. Some weapon systems with defensive functions, such as the Iron Dome, are already in use today. Many experts, including military officials, perceive that AWS have many functions that would allow militaries to surpass their capabilities, whether in speed, time, or distance (height and depth). AWS would enable militaries to penetrate regions that are so far not reachable by human soldiers. Autonomous machines would also be much faster at processing data than humans. Hence, the assumption of ineffectiveness would be difficult to prove. Their novelty stems from their ability to process information and make decisions without human input, owing to their artificial intelligence and capability for machine learning. That is why an analogy falls short of historic precedent. Another important factor that speaks against a ban is that AWS would primarily be weapon carrier platforms that can be equipped with a wide variety of weapons.529 From this it follows that AWS do not as such inflict superfluous injury. Crootof even holds that, due to these specifics, AWS can be used in a discriminate manner.530 Again, determining a clear scope and narrow tailoring for a ban seems difficult because this kind of new technology has not yet been fully understood. Although a definition has been identified as a crucial condition for future regulation, states have so far not agreed on a common proposal. This is also due to the fact that states cannot find a consensus on what exactly “full” autonomy means and how this could be differentiated from existing systems.531 Without such a distinction, however, it will be impossible to convince the states already regularly deploying weapon systems with autonomous functions to support a ban.

528 CROOTOF, 'Why the Prohibition on Permanently Blinding Lasers is Poor Precedent for a Ban on Autonomous Weapon Systems', 4.

Crootof continues by stating that the analogy proposed by HRW can be rebutted with a close analysis of how different the situation was for blinding laser weapons compared to AWS. The outlined preconditions, with which the potential for a pre-emptive ban can be assessed, produce a very different outlook for this kind of weapon. Many of the factors spoke in favour of the likelihood of a ban. The scope of the ban is clear and narrow. Article 1 of Protocol IV on Blinding Laser Weapons532 provides that “it is prohibited to employ laser weapons specifically designed, as their sole combat function or as one of their combat functions, to cause permanent blindness”, and hence state parties “shall not transfer such weapons to any State or non-State entity.” When states authored the text of the ban, they “knew precisely which capabilities they were foregoing, as the usages and effects of permanently blinding lasers were well-understood, predictable, and limited.”533 In addition, there was rather widespread support from states because this specific function of lasers was, from a military perspective, not very effective or essential. In fact, although lasers would have been helpful in putting soldiers out of action, these weapons were not uniquely effective, since other means to accomplish the same military objective existed.534 Especially considering the more humane alternatives, the potential for superfluous injury and civil society resistance was also high.

529 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 242. 530 CROOTOF, 'Why the Prohibition on Permanently Blinding Lasers is Poor Precedent for a Ban on Autonomous Weapon Systems', 4. 531 Id. at 5.

In conclusion, these distinctions between lasers and AWS make the prospect of success for a pre-emptive ban on AWS very low. The report itself points out too many discrepancies, for example that blinding lasers would not have been “ground-breaking” and that AWS are a “broad class” rather than specific systems or functions.535 The next consequential step, however, the assessment of what these differences mean for the likelihood of state support, is unfortunately not discussed. Instead, although written together with the International Human Rights Clinic at Harvard Law School, the report can first and foremost be understood as a call for action, which represents the core mission of civil society organizations such as HRW.

532 CCW, 'Additional Protocol to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be deemed to be Excessively Injurious or to have Indiscriminate Effects (Protocol IV, entitled Protocol on Blinding Laser Weapons)', (1995) United Nations, Treaty Series, vol. 1380, (Doc. CCW/CONF.I/16 Part I), 370. 533 CROOTOF, 'Why the Prohibition on Permanently Blinding Lasers is Poor Precedent for a Ban on Autonomous Weapon Systems', 5. 534 Id. at 5.

The latter groups state that the differences “make a pre-emptive prohibition even more pressing.”536 This study concedes that their goal is admirable: the report is a call for action directed at state parties to the CCW, hoping that at least some states may take up the proposal and gradually come to support it. However, this does not make the blinding laser ban a better precedent for AWS. In fact, the distinctions that HRW did not address are exactly what illustrate that a ban will not work. In the end, an attempt to ban AWS would most probably misfire, like the attempts to ban other weapons, from the crossbow to the submarine. The following proposal, the Drone Accountability Regime, is much more grounded in the real world; in particular, it places a special focus on feasibility.

535 DOCHERTY, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', 17. 536 Id. at 18.

2. Drone Accountability Regime

Allen Buchanan and Robert Keohane’s Drone Accountability Regime is a proposal for an international regime to regulate lethal drones.537 Their basic argument is that the use of remotely controlled weapons currently lacks transparency and accountability.538 Although Buchanan and Keohane posit that the risk of drone misuse is real, they hold that the advantages of drones outweigh the serious risks characteristic of other lethal means of warfare.539 For lethal drones, the authors discard the feasibility of a ban right from the beginning.

Their argument is correct in that a pre-emptive ban would not be possible, since drones already exist. Moreover, the authors are right in their assumption that states would not be willing to cease using drones. According to their assessment, however, “current international regulation is inadequate, because actors who control lethal drone use are not held accountable by any existing international body for acting in conformity with relatively uncontroversial moral and legal norms that clearly apply to their behaviour.”540 The potential applicability of this proposal to AWS regulation is addressed in the following part.

537 ALLEN BUCHANAN & ROBERT O. KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 15. 538 Id. at 15. 539 Id. at 23. 540 Id. at 23.

a) Characteristics

According to the authors, the use of lethal drones carries such special risks that it warrants new regulation.541 Firstly, the authors present a regulatory regime that has the flexibility to evolve over time so that it can adapt to new challenges and opportunities. Secondly, they favour an international regime over domestic regulation. Thirdly, Buchanan and Keohane do not want their DAR to be an exception to the laws of war; rather, they want to strengthen the existing laws of war by ensuring better compliance through their proposal.

Buchanan and Keohane consider the Missile Technology Control Regime (MTCR) the best fit for an analogy to their DAR proposal. The MTCR, however, only seeks to restrain missile proliferation but does not cover deployment and use.542 This is problematic since the latter two would be the most intricate and salient to investigate. The authors also point out one more difference: the MTCR has no enforcement mechanism but relies exclusively on sanctions issued bilaterally by the US.543 With regard to this regime, they found that states joined it to please the US, expecting some kind of benefit from the US. The authors expect that the same would be true with regard to the DAR.544

541 Id. at 15. 542 Id. at 25. 543 Id. at 26. 544 Id. at 33.

The rules of the DAR would not only govern the export of drone technology but also the use of drones themselves. Instead of legally binding enforcement, Buchanan and Keohane’s proposal envisions an imposition of reputational costs, such as naming and shaming for frequent violations.545 Another main characteristic of the DAR is its informal character. This is an explicit choice, in particular for the first phase. The authors choose this model because they expect it to yield more acceptance, support, and endorsement from states.546 They are, however, well aware of the fact that the ideal system would consist of a treaty with enforcement provisions and a permanent secretariat.547 Buchanan and Keohane nevertheless envision their DAR framework as consisting of an Assembly of States, represented by Permanent Representatives like at the UN, which would meet regularly. In addition, a Transnational Council should ensure the representation of non-state organizations. Next, an Ombudsperson should act as an agent of accountability, investigating complaints brought before him or her. The Assembly of States should ensure that the provisions are implemented through state authority, although this, again, would not be binding. Rather, it would serve as a “forum for negotiation and bargaining.”548 Following the concept of naming and shaming, this would be the arena in which actual or potential violations of the DAR rules could be made public. The Transnational Council of non-state actors, consisting of civil society organizations such as the Red Cross, would also ensure a high level of transparency and publicity.549 Both bodies, however, would follow the informal approach and would not have mandatory power over states.550

545 Id. at 26. 546 Id. at 25. 547 Id. at 25. 548 Id. at 27.

Accountability would be ensured mainly through the Ombudsperson. Buchanan and Keohane envision this “key actor” based on the model of the Ombudsperson created by UN Security Council Resolution 1267, who represents people who claim to have been wrongly listed as terrorists.551 The Assembly of States, in consultation with the Transnational Council, would elect this position. Four different parties would have direct access to the Ombudsperson: individuals who claim to have been wrongly targeted, NGOs, drone operators themselves, and anyone else in the chain of command involved in a drone operation.552 Lastly, the Ombudsperson would have the right to make abuses public, to assign responsibility, and to initiate a process that would lead to compensation for individuals wrongfully targeted.553

The DAR would have two levels of accountability. On the international level, individual member states could be held accountable to interstate and transnational institutions within the DAR.554 On the domestic level, states themselves would hold their drone operators accountable.555 The DAR requires each member state to establish a national supervisory body that would conduct periodic reviews. In addition, this body would need to keep records of every drone strike and would be accountable to the Assembly of States and monitored by the Transnational Council.556 To enhance transparency, Buchanan and Keohane envision that the DAR would include both a passive and an active form of transparency. The passive mechanism would allow every state to see which DAR template each member state has implemented at the national level. The active mechanism would be tailored according to the model of the Human Rights Council, in which states would have to actively defend their national templates before other member states. These review sessions, which would include NGOs selected by the Transnational Council, would be conducted regularly, at least every five years.557

549 Id. at 27. 550 Id. at 27. 551 Id. at 29. 552 Id. at 28. 553 Id. at 28. 554 Id. at 16.

One of the key problems with the current system, according to Buchanan and Keohane, is that the criteria for “signature strikes” are kept secret while arguably violating the principle of distinction by attacking persons whose identities are not confirmed.558 This lack of transparency is what inspired the authors to propose the DAR in the first place, and it is the reason why the DAR has a holistic accountability mechanism. The ex ante accountability would require states to explicitly specify written procedures for their decision-making processes for drone strikes.559 Here, Buchanan and Keohane make great demands of potential DAR member states. Drone-using states must publicize their targeting criteria, they must explain how they adhere to the principles of distinction and proportionality, and they must explain how they address the requirements of military necessity.560 The DAR also requires that states issue a policy statement saying that drone operators may refuse commands to conduct drone strikes.561

555 Id. at 28. 556 Id. at 29. 557 Id. at 30. 558 Id. at 22.

The ex post accountability mechanism would then require public justification for each specific drone strike after it was conducted.562 The ex post accountability mechanism is, according to Buchanan and Keohane, “the crucial safeguard against excessive or improper use of lethal drones.”563 In other words, this version of accountability would oblige each state to explain its targeting criteria. States would have to keep written records of each strike, and the national supervisory body would have to conduct periodic reviews of all records and forward those summaries to the Ombudsperson.564

559 Id. at 17, 30. 560 Id. at 31. 561 Id. at 31. 562 Id. at 17. 563 ALLEN BUCHANAN & ROBERT O. KEOHANE, 'Toward a Drone Accountability Regime: A Rejoinder', (2015) 29 Ethics and International Affairs, 68. 564 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 32.

b) Enforceability

The DAR aims for practical application. Buchanan and Keohane are aware of the fact that it will be difficult to reach an international agreement among states even for an informal and non-legalized regime.565 However, they see incentives for states to join the DAR. These are especially strong for drone-using states – just as has been the case for other arms control regimes: the limitation of proliferation and of unregulated use by other actors.566 The idea of a regulatory regime flexible enough to evolve with the technology it aims to regulate shows a thorough understanding of the specific challenges posed by that technology. In this respect, the authors want to address new challenges and opportunities.567 The regime allows for dynamic accountability, consisting of an on-going reassessment of the regulatory environment and modifications to it.568 One of the main features of the DAR is that it should be an international regulation.

The authors first argue that even in a liberal democracy with a robust civil society like the US, transparency is lacking with regard to potential abuses.569 Secondly, as a result, less democratic states would be even less willing to set up an effective national drone use policy. Moreover, according to Buchanan and Keohane, political leaders tend to give more weight to minimizing harm to their own citizens than to risks of harming foreigners.570

565 Id. at 17. 566 Id. at 17. 567 Id. at 17. 568 Id. at 17. 569 Id. at 24.

Neta Crawford, however, criticizes the DAR. In her opinion, it focuses too much on international regulation, and accountability is not the right approach. According to Crawford, a focus on the domestic level would be easier to implement since only the US uses lethal drones for targeted killings.571 Buchanan and Keohane, however, dismiss her critique by pointing to the DAR’s advantages over a purely domestic accountability regime. First, they hold that an international regime would provide incentives for the US to assure institutionalized transparency and accountability.572 The US could, as first mover, shape the rules of an international regime while still having the leverage to do so as the lead nation in drone technology.573

Second, by taking an international leadership role, the US could gain reputational benefits and even encourage other states to join the DAR before they are capable of developing drone technology of their own.574 Third, an international commitment would make it harder for future US administrations to back out of transparency and accountability because they could lose international reputation.575 Finally, Buchanan and Keohane believe that over time, even an informal regime could gain credibility, offer positive incentives for compliance, and increase the costs of noncompliance.576 In sum, the DAR would include all of the benefits of a purely domestic regime while combining these with an additional incentive for other states to increase the accountability and transparency of their drone use by joining the international DAR.577 Buchanan and Keohane cite President Barack Obama’s speech at the National Defense University on 23 May 2013, in which he declared with regard to drone use oversight: “not only did Congress authorize the use of force, it is briefed on every strike that America takes.”578 Although Buchanan and Keohane agree with Obama’s defence of the drone program and with the statement that civilian casualties were less severe than if the US had used other means of military force, they still hold that the US drone policy is insufficient.579 The question then arises whether new legislation is needed to address emerging weapons technologies, including AWS.

570 Id. at 24. 571 NETA C. CRAWFORD, 'Accountability for Targeted Drone Strikes Against Terrorists?', see id. at 47. 572 ALLEN BUCHANAN & ROBERT O. KEOHANE, 'Toward a Drone Accountability Regime: A Rejoinder', see id. at 69. 573 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 33. 574 Id. at 33.

575 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime: A Rejoinder', (2015) 29 Ethics and International Affairs, 69. 576 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 34. 577 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime: A Rejoinder', (2015) 29 Ethics and International Affairs, 69. 578 BARACK OBAMA, 'Obama's Speech on Drone Policy', (23 May 2013) International New York Times (http://nyti.ms/10OWqpi), 2, last visited December 15, 2017. 579 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 20.

c) Feasibility

Buchanan and Keohane chose an informal system for their proposal. In this study’s view, this is a very practical approach and seems right for what they envision. They consider a strong formal interstate drone regime to be infeasible.580 This approach is well thought out. Their goal is first to come up with a proposal to initiate a discussion. They consider the DAR not as ideal, but want to “create an environment for learning.”581 Here, best practices can be shared within this “experimentalist regime.”582 The authors are well aware of the potential criticism of not demanding more from states. For this reason, in their chapter entitled “feasibility,” they admit that their project is rather ambitious. At the same time, they expect opposition from the leading country in drone technology and development, the US, and from other global players that might also be sceptical, like China, Russia, and Israel.583

Critics who call for a legally binding regime the authors call “utopian,” suggesting that they look at the basic facts.584 Here, they reference existing arms control mechanisms. The authors draw an analogy with the resistance to legally binding constraints on landmines – which, as they rightly point out, were a “much less valuable technology”.585 Buchanan and Keohane add that the major actual or potential users, the US, China, Russia, and India, have not yet agreed upon a landmine regime.586 The authors hold that those critics who strive for the ideal legal regime sacrifice the good for the unattainable.587

580 Id. at 25. 581 Id. at 28. 582 Id. at 29. 583 Id. at 32-33. 584 Id. at 33.

Another possibility, according to Buchanan and Keohane, would be to encourage states other than the US to join the regime before they use drones themselves.588 That being said, the strongest incentive in the framework is that it allows for the transfer of drones and drone technology between member states. This, indeed, seems like a very strong incentive. Even if many states decide not to join, the framework can grow as long as the states actually developing drone technology are members. When it comes to drone technology, the US and Israel are so far advanced compared to others that the sharing of their technology could entice those whose access is otherwise restricted. To set the threshold for joining the DAR even lower, the authors suggest including provisions for withdrawal so that committing to the DAR does not come with a high burden of risk. This, again, is an experimental feature of the DAR. After the initial phase, and once enough states have signed up, Buchanan and Keohane see the potential for giving the regime more teeth. They want to achieve this over time, in particular through strong involvement by civil society organizations, reputational incentives, and pressure on states so that the latter finally underpin the system’s effectiveness with actual sanctions.589 For these types of enforcement measures, Buchanan and Keohane rely on “state leadership.”590

585 Id. at 33. 586 Id. at 25. 587 Id. at 33. 588 Id. at 33.

From a purely international law standpoint, this study wonders whether new legislation is necessary to specifically regulate the use of lethal drones. As Buchanan and Keohane have stated, international law applies fully to drone use.591 They correctly identify the three bodies of law that may apply as international humanitarian law, international human rights law, and the law enshrined in the UN Charter.592 The law that applies to the use of lethal drones is precisely the same that governs the use of conventional combat aircraft.

Janina Dill, however, is worried that the DAR would make drone use more legitimate. Her argument goes that even though the DAR would be informal and tailored towards drones, it could erode existing legal standards for the use of force by states who rely on a specific set of standards once they resort to using drones.593 She is concerned that “the DAR risks reducing the reputational costs of unlawful resorts to force by offering procedures that legitimize actions that may not ultimately fulfil the legal criteria for self-defence.”594 The authors of the DAR, however, reply that the risk of “false legitimacy” would in the end be smaller than the current risk of a lack of transparency and accountability without any regulation.595 According to Buchanan and Keohane, their regime would regulate “problematic, nontransparent activity without bestowing inappropriate legitimacy on drone use.”596 The authors describe that the lawful recourse to war is regulated in the UN Charter and in the humanitarian law of war, which addresses permissible weaponry and legitimate targets, as well as in international human rights law, which governs the use of force especially outside the context of war.597 Why not propose to strengthen the existing law of war? Why is a new regime needed? Unfortunately, they rule out AWS right from the beginning.

589 Id. at 35. 590 Id. at 25. 591 Id. at 18. 592 Id. at 18. 593 JANINA DILL, 'The Informal Regulation of Drones and the Formal Legal Regulation of War', see id. at 58. 594 Id. at 58.

Although Buchanan and Keohane are aware of the advantages of a treaty-based international regime, they view it as infeasible under present political circumstances, which is why they propose an informal international regime.598 This could accomplish some of the goals of a formal regime and could lead to the acceptance of a formal regime at a later stage.599 With the Ombudsperson, however, Buchanan and Keohane have set themselves quite a task. It is hard to imagine finding state support for making abuses public, assigning responsibility, and initiating a process that would lead to compensation for individuals wrongfully targeted. Buchanan and Keohane want to hold actors, primarily states, responsible to a certain set of standards and to judge whether they fulfilled those standards. They want their regime to be able to “hold policymakers accountable.”600 Taking into consideration the Draft Articles on the Responsibility of States for Internationally Wrongful Acts (ASR), which the International Law Commission (ILC) produced only after almost 45 years of deliberation, this is a rather ambitious plan; against this backdrop, this part of the DAR seems overambitious. The authors use such terms as “standards”, “goals”, “procedures”, “heuristics”, and “proxies.”601 Unfortunately, where the reader expects the authors to fill these concepts with content, they merely note that “different standards may be suitable for different states,”602 which leaves concrete examples up to the imagination of the reader.

595 ALLEN BUCHANAN & ROBERT O. KEOHANE, 'Toward a Drone Accountability Regime: A Rejoinder', see id. at 68. 596 Id. at 68. 597 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 18. 598 Id. at 28. 599 Id. at 16.

The role of the existing law on state responsibility is not to directly prevent any particular conduct, but to possibly establish that a specific conduct was unlawful.603 A critical study of the existing system of state responsibility by Buchanan and Keohane would have transformed their theoretical approach into actual norms that guide states’ behaviour. This way, they would also have been able to address the question of whether their proposal is overly ambitious. At the least, the ASR would have offered a starting point. In a strictly legal sense, according to existing law, states that deploy armed drones in violation of international law are responsible for the consequences of their conduct. Under certain circumstances, even third states can be held responsible for assisting in drone operations according to Article 16 ASR: in concreto, by consenting to the launch of drone operations from their territory while being aware of the unlawfulness of the operation, or by sharing information on individuals who later become unlawful targets.604 Taking into account the principles of state responsibility would have been crucial for the DAR proposal. The existing law, however, already offers a framework that covers the possibility to legally review emerging weapon systems, which the authors overlooked. Let us now turn to the legal review enshrined in Article 36 AP I.

600 Id. at 24. 601 Id. at 25. 602 Id. at 25. 603 JELENA PEJIC, 'Extraterritorial targeting by means of armed drones: Some legal implications', (2015) International Review of the Red Cross (www.icrc.org/en/document/jelena-pejic-extraterritorial-targeting-means-armed-drones-some-legal-implications), 38, last visited December 15, 2017.

3. Article 36 AP I as a means to regulate and restrict AWS

This section proposes that the legal review should focus on the “burden of proof” of whether a specific AWS can comply with the applicable provisions. AWS need to prove their reliability in different circumstances to get clearance for certain specific tasks. Those who want to introduce AWS onto the battlefield must demonstrate accurately that the machines can perform the necessary tasks. From this point on, one could imagine that current technology may already allow for machine-on-machine warfare in remote areas. This review could be conducted on a regular basis to ensure that systems are up-to-date or can even be trusted with more functions.

604 Article 16, Aid or assistance in the commission of an internationally wrongful act, reads: A State which aids or assists another State in the commission of an internationally wrongful act by the latter is internationally responsible for doing so if: (a) that State does so with knowledge of the circumstances of the internationally wrongful act; and (b) the act would be internationally wrongful if committed by that State.

Human Rights Watch discussed the possibility of a ban on AWS, overlooking Article 36 AP I as a better regulatory regime. Although Buchanan and Keohane mention fully autonomous lethal drones, they exclude them from their analysis. Their argument for not including AWS is that their capabilities have not yet been properly explored.605 This is a mistake for several reasons. Firstly, they claim that they want to present a regulation proposal that is flexible enough to “meet new challenges and opportunities.”606 However, by excluding fully autonomous systems, they fail to acknowledge that we are already experiencing a technological development towards greater autonomy. Instead, they focus on lethal drones that involve “continuous human control.”607 By ruling out AWS, they unnecessarily restrict themselves. Buchanan and Keohane justified their proposal of an international regime by pointing out the “special risks of lethal drone use,” which would call for new regulation.608 The same, however, holds true for autonomous weapon systems. Therefore, this study criticizes this omission by the authors. In addition, by not covering AWS, their regime will soon be outdated, because prototypes with autonomous functions already exist.

605 BUCHANAN & KEOHANE, 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 18. 606 Id. at 15. 607 Id. at 18. 608 Id. at 15.

As shown above, states are already discussing how to regulate AWS in the CCW framework. Buchanan and Keohane thus overlooked an existing system that would serve their goal: the regulation of new weapon systems. This is even more striking since the authors have claimed that they want to “ensure better compliance” with the laws of war and not “carve out an exception.”609 The legal review of new weapon systems is not only a customary international law principle but is also explicitly enshrined in Article 36 AP I. It can easily be found in the very set of laws the authors strive to ensure better compliance with. Unfortunately, the proposal does not even mention the existing legal review of new weapons.

States may not be willing to prohibit what they do not yet fully understand, but they may agree that regulation is necessary. In the case of nuclear weapons, states are willing to regulate them while not completely prohibiting them.610 This speaks for the existing legal review system. The legality of AWS depends on the individual capabilities of each weapon system. The existing provisions of the law of armed conflict can be applied to new weapons technologies. However, offensive AWS currently cannot pass the weapons review test due to poor sensor technology. The existing law needs to be uniformly applied, and whenever needed, the international community must develop interpretations of and modifications to the existing law.

609 Id. at 15. 610 JAMES D. FRY, Legal Resolution of Nuclear Non-Proliferation Disputes (Cambridge University Press, 2013).

Compared to a ban, Article 36 AP I is a better-suited framework because it allows for a sliding scale. Moreover, considering implementation and support from state parties, the proposed ban does not appear comprehensive enough. Some states will most likely not be willing to accept a ban that includes even fully autonomous defensive weapons.

a) Characteristics

Article 36 AP I obliges state parties, in the study, development, acquisition or adoption of new weapons, means or methods of warfare, to determine whether their employment could be prohibited. Since AWS do not exist yet, they should be considered “new” and should be categorised as a “means” of warfare.611 This entails a proper legal review duty prior to deployment. There is very little state practice, as few states have implemented a national review process, although they are obligated to under the pacta sunt servanda principle enshrined in Article 26 of the 1969 Vienna Convention on the Law of Treaties. Two of the most technologically advanced states, at least when it comes to automation in warfare, the United States and Israel, are not party to AP I. Nevertheless, both states have a long history of formally reviewing the legality of new weapons in a process that is considered to be similar to that of Article 36 AP I.612 Prior review becomes more decisive as the level of autonomy increases and as less effective control is left to human operators. With the increasing use of AWS, the point in time of the legal assessment of an attack must shift earlier.613 The review duty incorporates, inter alia, the cardinal principles, such as the distinction between combatants and non-combatants laid down in Article 48 AP I, the prohibition on causing unnecessary suffering to combatants according to Article 35 II AP I, the prohibition of indiscriminate attacks in Article 51 IV (b) (c) AP I, and the Martens Clause.614 The review obligation is not limited to intended use but must extend to possible misuse.615 The Article 36 legal review process, its implementation and the sharing of best practices are considered to be among the most urgent tasks when it comes to AWS.616 One should add realistic standardized testing and some degree of transparency to the list. All stakeholders are called upon to contribute their share: developers need to implement international humanitarian law principles, lawyers involved in the review process need to understand how weapons systems operate, and meaningful operational guidelines need to be formulated.617

611 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 223-227. 612 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 398.

613 TIM MCFARLAND & TIM MCCORMACK, 'Mind the Gap: Can Developers of Autonomous Weapons Systems be Liable for War Crimes?', 362. 614 ICJ, 'Legality of the Threat or Use of Nuclear Weapons, (Advisory Opinion of 8 July 1996)', (1996) I.C.J. Reports, para. 257. 615 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 219; contrary: SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, para. 1480. 616 SIMON-MICHEL, 'Report of the 2014 informal Meeting of Experts on LAWS', para. 4. 617 BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 495-499.

So far, only ten out of 158 State parties have implemented an institutional review procedure.618 Article 36 AP I can help solve the problem of self-learning machines and artificial intelligence. In the scenario where weapon systems will be able to detect and attack targets without direct human control, the challenges are:

∗ The programming of algorithms to match pre-programmed target parameters,

∗ The weapon’s sensor mechanism,

∗ The environment and time of deployment.

States like Germany are currently adjusting their Article 36 review process, especially with a view towards more autonomy in warfare.619 As weapon carrier platforms, AWS can be armed with different weapons.620 Owing to their status as customary international law, the fundamental provisions on the distinction between combatants and civilians, proportionality, and precautions in attacks, all laid out in AP I, are binding upon all states in armed conflict.621 In addition to this review process, the law of targeting demands a legal evaluation on a case-by-case basis with regard to every single attack.

618 FRY, 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', (2006) 44 Columbia Journal of Transnational Law, 474; DAOUST, COUPLAND & ISHOEY, 'New wars, new weapons? The obligation of States to assess the legality of means and methods of warfare', (2002) 84 International Review of the Red Cross, 354. 619 DIETER WEINGÄRTNER, 'Dehumanization: Legal aspects from a practitioner’s perspective', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015) www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017. 620 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 242. 621 BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 23-41.

b) Enforceability

The scope of Article 36 AP I covers the applicable law, in particular international humanitarian law, which the two proposals identified as decisive. Among the relevant principles are: proportionality in Article 51 V, discrimination in Article 51 IV, feasibility in Article 57, and the maximum extent feasible in Article 58 AP I. Due to their relative character, these provisions are capable of adapting in interpretation as technology evolves. This is especially interesting with regard to AWS and the precision of targeting objects.622 At the heart of this study is the question of how precisely a system can target objects. Poor sensor technology poses particular challenges with regard to the distinction and proportionality principles. The question then arises whether those challenges can be overcome with restrictions to potentially lawful scenarios. Targeting requires an evaluation that current systems cannot achieve. In remote-controlled systems, the human operator is responsible for ensuring the lawfulness of the use of the system. Limited defensive systems, such as Israel’s Iron Dome, may be lawful when used in certain restricted scenarios and environments. An autonomous offensive targeting and attacking system, by contrast, would currently fail a legal review due to its poor technological development and its inability to abide by the rules of international humanitarian law. Especially important for a legal review is to look at the precision of the weapon system under review, the level of autonomy used, and the human control exercised in the system. In particular, it is important to understand the level of (meaningful) human control over the critical function of selecting and attacking targets. Several states have already identified the legal review as the right way forward with regard to AWS. This momentum must be used to learn from those countries that already have a mechanism in place and to share best practices.

622 WILLIAM H. BOOTHBY, 'The Legal Challenge of New Technologies: An Overview', in Hitoshi Nasu & Robert McLaughlin (eds.), New Technologies and the Law of Armed Conflict (TMS Asser Press, 2014), 25.

c) Feasibility

The main objectives of the DAR are transparency, accountability, and better compliance with the laws of war. The pre-emptive ban aims to clarify the legal landscape, restrict proliferation, reduce threats to civilians, and address concerns under the Martens Clause. The legal review system of Article 36 AP I already entails all of these features and, in addition, is even more feasible because it constitutes existing law. Moreover, in contrast to the pre-emptive ban proposed by HRW, Article 36 AP I allows for a differentiated approach, which could make it easier for states to support a regulation that builds upon this existing mechanism. In the expert meetings on AWS at the CCW, several states expressed the view that enhancing transparency as a trust-building measure could be achieved through the procedures of a legal weapons review in accordance with Article 36 AP I.623 As a concrete measure, the publication of implementing procedures for legal weapons reviews was proposed.624 The prospect of retaining leeway in the actual conduct of specific reviews could convince states to agree to better implement their existing Article 36 AP I obligations.

623 BIONTINO, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 22.

Some weapons law experts believe that the law is destined always to lag behind technological development.625 However, as William Boothby points out, even where no specific treaty law exists, customary international law, encompassing the principles of humanity and the dictates of public conscience, provides protection.626 Initial restrictions implemented in AWS can serve as a means to ensure compliance with international humanitarian law. An environment in which no civilians are expected to be present (e.g. the high seas) seems the most likely first deployment scenario. A regular review of emerging technology could then extend or restrict deployment possibilities as appropriate. First and foremost, the strengthening of the existing system of weapons review enshrined in Article 36 AP I should be the focus of states for now.

IV. Legal Review of New Weapons as a Means to Regulate AWS

The prototypes cited earlier demonstrate the speed at which technology develops. Soon, weapon systems with even more functions and capabilities, including the targeting and attacking of humans, will be possible. How can an institutionalized process effectively ensure that new technological inventions stay within the realm of the applicable law? Weapon inventors do not necessarily feel compelled to consider the legal or ethical implications of their inventions. During a lecture at the Fletcher School of Law and Diplomacy entitled “Technologies to Bend the Arc of the Future,” Arati Prabhakar, the director of DARPA, affirmed the agency’s primary goal: “Preventing technological surprise to the US while creating technological surprise for the enemy”.627 Prabhakar stressed that DARPA’s role is to pursue and understand all aspects of new technologies and not to back away merely because another technology might be safer. The decision whether to pursue certain technologies that may have unintended consequences is a question beyond the tasks assigned to DARPA.628 This study therefore asks whether international law and policy currently provide sufficient checks and balances given the high pace of technological development in emerging AWS.

624 Id. at 31. 625 TIMOTHY L.H. MCCORMACK, 'A non liquet on nuclear weapons - The ICJ avoids the application of general principles of international humanitarian law', (1997) 316 International Review of the Red Cross, 90; BOOTHBY, 'The Legal Challenge of New Technologies: An Overview', 26. 626 Id. at 27.

A. Scope of Article 36 AP I

Article 36 of the Protocol Additional I to the Geneva Conventions of 1949, the principal source of law potentially governing AWS, stipulates that states should conduct a legal review of new means and methods of warfare to ensure compliance with international law:

627 ARATI PRABHAKAR, 'IBGC Speaker Series and the Tufts Institute for Innovation: “Technologies to Bend the Arc of the Future”', (31 March 2016) available at: http://fletcher.tufts.edu/Calendar/2016/03/31/Technologies-to-Bend-the-Arc-of-the-Future-with-Dr-Arati-Prabhakar-of-DARPA.aspx, last visited December 15, 2017. 628 Id. at 1.

In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.

Lawyers have to ensure that weapon systems are used in accordance with the law applicable to the state conducting the review. These are first and foremost the core IHL principles mentioned above, as well as the treaty provisions to which the reviewing state is party.629 To this end, we have to understand which laws apply to AWS and whether a mechanism is in place ensuring that the legal reviewer not only understands the technology of the individual weapon system under review but can also set limits to its functions or deployment scenarios.

B. Application to AWS

According to Article 36 AP I, the legal review of new weapon systems must ensure that they can be employed in accordance with international humanitarian law principles. As stated above, fully autonomous weapon systems do not yet exist; therefore, they must be considered new. As a means of warfare, AWS trigger a legal review, and states broadly agree that they fall into the category of new weapons, means or methods of warfare.630 The provisions posing the hardest challenges for IHL compliance of AWS are those governing the use of force in armed conflicts:

629 LAWAND, COUPLAND & HERBY, 'A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977', (2006) International Committee of the Red Cross, 11.

1. The Distinction Principle, codified in Article 51 III AP I, requires that belligerents must distinguish between combatants and civilians when using force;

2. The Proportionality Principle, codified in Article 51 V (b) AP I, requires that harm caused to civilians or civilian property must be proportional and not excessive in relation to the concrete and direct military advantage anticipated by an attack on a military objective;

3. The Prohibition on using indiscriminate weapons (Article 51 IV (b) (c) AP I) or weapons that cause superfluous injury or unnecessary suffering (Article 35 II AP I) to combatants; and

4. Military Necessity, codified in Article 52 AP I, permits only measures that are necessary to accomplish a legitimate military purpose.

A violation of any of these principles is unlawful. AWS, however, do not necessarily violate IHL per se;631 compliance depends on the specific sensors within the system, the weapons employed, and the environment of use.632

630 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts; FRENCH DELEGATION, 'Non paper: Legal framework for any potential development and operational use of a future LAWS', 1. 631 See above at II B; Agreeing: SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics'; SCHMITT, 'Unmanned Combat Aircraft Systems and International Humanitarian Law: Simplifying the oft benighted Debate'; SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 231-281. 632 MICHAEL SCHMITT, 'Regulating Autonomous Weapons Might be Smarter Than Banning Them', (10 August 2015) Just Security, available at: www.justsecurity.org/25333/regulating-autonomous-weapons-smarter-banning/, 3, last visited December 15, 2017.

To determine which functions can be delegated relative to the level of autonomy, every single AWS needs a legal assessment in conjunction with its potential deployment scenario or environment. In the following section, this study discusses the various national legal review processes currently in place.

C. National Weapon Reviews during Procurement and Development

The obligation to legally review new weapons discussed above does not prescribe a specific method or format for weapons reviews: how to regulate this was left to the states to determine. States have integrated reviews into their established procurement and development processes in their own ways. This study posits that examining these various methods may uncover best practices, which can then be shared to make the review process better equipped to accommodate AWS. These best practices are presented at the end of this chapter, followed by a proposal to update the existing review processes to include AWS.

1. The German Legal Review Process

The Stockholm International Peace Research Institute (SIPRI) convened an expert seminar in Stockholm in September 2015 to allow interested states to discuss their respective legal review processes. Vincent Boulanin compiled details of the different legal review processes as well as the challenges discussed by the participating states: France, Germany, Sweden, Switzerland, the United Kingdom and the United States.633 The following parts describe the respective review processes based on Boulanin’s findings.

a) Review Authority

Germany signed the Protocol Additional I in 1977 and ratified it in 1991. Since 2005, the German Federal Ministry of Defence (MOD) has had a Permanent Steering Group under the title Review of New Weapons and Methods of Warfare.634 The Directorate-General for Legal Affairs’ International and Operational Law Branch is responsible for the Steering Group. To pool expertise within the German MOD, the Directorates-General for Security and Defence Policy, for Equipment, for Information Technology and In-Service Support, for Planning, for Forces Policy, and for Strategy and Operations form part of the Steering Group.635 The Steering Group is a permanent body, though its composition may differ depending on the subject matter under review.

b) Character of the Review Decision

The Steering Group’s findings take the form of recommendations within the development and procurement process and are recorded on file. They do not constitute a binding decision as to whether a new weapon may be introduced into Germany’s MOD.636 The decision cannot be appealed. Access to review decisions is decided pursuant to domestic law on a case-by-case basis.

633 VINCENT BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', (2015) 2015/1 SIPRI Insights on Peace and Security, 1-28. 634 Id. at 19. 635 Id. at 19.

c) Framework

Upon ratification in 1991, Germany made a reservation stating that:

“It is the understanding of the Federal Republic of Germany that the rules relating to the use of weapons introduced by Additional Protocol I were intended to apply exclusively to conventional weapons without prejudice to any other rules of international law applicable to other types of weapons.” (emphasis added by author)

This interpretative declaration excludes dual-use systems from the scope of the Article 36 AP I review obligation, except in cases where the “intended use clearly contributes to the conduct of warfare.”637 Whether a system or device has to undergo the legal review process is decided on a case-by-case basis. Germany takes its own international humanitarian law obligations into account in conducting this legal review. Only new weapons, means or methods of warfare that have a “sufficiently broad range of meaningful operational scenarios”638 for lawful use will be considered for introduction into the German MOD’s arsenal.

636 Id. at 19. 637 Id. at 19. 638 Id. at 20.

d) Review Methodology

The Steering Group conducts reviews at the earliest possible stage. Very complex systems may make it necessary for the reviewing entity to phase the review process according to the development cycle. The Steering Group even has a formal right of intervention in the procurement process.639

The Steering Group does not rely solely on information provided by the manufacturer but may also consult MOD or external experts in the process.

The Integrated Project Team (IPT) supervises tests and evaluations conducted throughout the procurement process and, most importantly, is maintained for the entire life cycle of the respective weapon system.640 Once a new weapon system has passed the legal review process and is cleared for procurement, the Directorate-General for Equipment, Information Technology and In-Service Support assigns a specific project manager to head the IPT. This project manager ensures the continuous cooperation of all parties involved and is responsible for producing reports on the product to, e.g., the Federal Audit Office, the Defence Committee and the Budget Committee of the German Parliament (Bundestag).641 In addition, the project manager is responsible for operational testing and other integrated compliance demonstrations.

639 Id. at 20. 640 Id. at 20. 641 Id. at 20.

2. The Swedish Legal Review Process

Sweden was among the first states to establish a formal legal review of new weapons, in 1974, even before Protocol Additional I was opened for signature in 1977.

a) Review Authority

The Swedish Delegation for International Law Monitoring of Arms Projects (the Delegation) is the entity responsible for the legal review of new weapons, means or methods of warfare.642 The Swedish Ordinance on International Law Reviews of Arms Projects is the applicable regulation requiring the Swedish Armed Forces, the Swedish Defence Materiel Administration and the Swedish Defence Research Agency to report weapons projects to the Delegation.643 The Delegation is an independent body whose members are elected by the Swedish Government; it produces a report to the latter once a year based on the two to three weapons reviews it conducts annually.

b) Character of the Review Decision

Although the Swedish Delegation issues approval or non-approval decisions, these are not legally binding.644 The Delegation may, however, recommend to the requesting authority modifications in the design or suggest limitations of use to bring the system into accordance with international law requirements. The requesting authority may appeal the decision to the Swedish Government.645 With an exception for classified information, a record of the review decision may be requested. This is a direct result of the Swedish principle of public access to official documents.

642 MARIE JACOBSSON, 'Modern Weaponry and Warfare: The Application of Article 36 of Additional Protocol I by Governments', in Anthony M. Helm (ed.), The Law of War in the 21st Century: Weaponry and the Use of Force (U.S. Naval War College International Law Studies, 2006). 643 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', (2015) 2015/1 SIPRI Insights on Peace and Security, 21. 644 Id. at 21.

c) Framework

The Swedish Delegation for International Law Monitoring of Arms Projects reviews new means and methods of warfare. In addition, the Delegation monitors planned purchases and modifications of all types of weapon systems, covering not only the Swedish Armed Forces and Coast Guard but also the Swedish Police.646 The Delegation primarily takes international humanitarian law into account but, where applicable, also human rights law and disarmament law.

d) Review Methodology

The review is triggered by a request from an entity planning to develop, acquire or modify a weapon system falling within the scope of the review authority. The Swedish Delegation encourages reviews as early as possible and relies on documentation provided by the requesting entity, which has the responsibility of ensuring that appropriate tests have been conducted.647 In case the tests seem insufficient, the test results are difficult to interpret, or scientific criteria do not seem to have been followed properly, the Swedish Delegation may order additional information or tests.

645 Id. at 21. 646 Id. at 21. 647 Id. at 21.

The Delegation follows a multidisciplinary approach. The Swedish Government elects its eight members, who have expertise in international and national law, the military, medicine and arms technology.648 The members of the Delegation belong to the Ministry of Defence, the Ministry of Foreign Affairs, the Swedish Armed Forces, the Defence Materiel Administration, the Swedish Defence Research Agency and the Swedish Defence University.

3. The Swiss Legal Review Process

Switzerland ratified the Additional Protocol I to the Geneva Conventions in 1982,649 and has had a formal national legal review process in place since 2007.650

a) Review Authority

The requirement for a legal review of new means and methods of warfare before acquisition is enshrined in a Swiss Ministry of Defence ordinance.651 A directive of the Chief of Defence regulates the review process and tasks the Law of Armed Conflict Section within the Ministry of Defence with conducting the reviews.652

648 PERMANENT MISSION OF SWEDEN IN GENEVA, 'Challenges to International Humanitarian Law', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/0B1EA0C5D19CEE25C1257F9B0051BB4E/$file/2016_LAWS+MX+ChallengestoIHL_Statement_Sweden.pdf, last visited December 15, 2017. 649 See for Treaties, States Parties and Commentaries of Switzerland compiled by ICRC: https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/vwTreatiesByCountrySelected.xsp?xp_countrySelected=CH, last visited December 15, 2017. 650 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', (2015) 2015/1 SIPRI Insights on Peace and Security, 22.

b) Character of the Review Decision

Beginning with the drafting of the system specifications during project planning, legal reviews are performed continuously throughout the entire procurement process. Once a decision for a particular manufacturer or model has been taken, the legal review may be repeated. The final procurement decision even “requires a positive confirmation of compliance with international law.”653

c) Framework

The Law of Armed Conflict Section primarily assesses whether a new weapon violates any treaty obligation, in particular whether it may violate the prohibition of indiscriminate weapons or may cause superfluous injury or unnecessary suffering. In addition, the Law of Armed Conflict Section takes into consideration relevant regulations and doctrine in its assessment to ensure that means and methods of warfare comply with international humanitarian law. So far, there is no formal definition of the types of weapons that trigger the legal review process, beyond the requirement that a weapon be new or, in the case of a modification of an existing weapon, that the modification change its intended use or performance.654

651 DAS EIDGENÖSSISCHE DEPARTEMENT FÜR VERTEIDIGUNG, BEVÖLKERUNGSSCHUTZ UND SPORT (VBS), 'Verordnung des VBS über das Armeematerial (Armeematerialverordnung, VAMAT) - 514.20', (6 December 2007) www.admin.ch/opc/de/classified-compilation/20071623/index.html, last visited December 15, 2017. 652 MICHAEL SIEGRIST, 'A purpose-oriented working definition for autonomous weapons systems', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/558F0762F97E8064C1257F9B0051970A/$file/2016.04.13+LAWS+Legal+Session+(as+read).pdf, 3, last visited December 15, 2017. 653 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 22.

d) Review Methodology

While manufacturers provide documentation, the Law of Armed Conflict Section may also, if deemed necessary, conduct its own tests and evaluations.655 During the legal review process, the Law of Armed Conflict Section may involve subject-matter experts (e.g. in physics, medicine or chemistry). Switzerland is currently reviewing its legal review process, focusing in particular on including a definition of ‘weapon’ in order to clearly identify the subject of review.656

4. The United Kingdom’s Legal Review Process

The United Kingdom of Great Britain and Northern Ireland ratified Additional Protocol I to the Geneva Conventions and established a national legal review of new weapon systems in 1998.

a) Review Authority

Initially, the British Ministry of Defence conducted the legal review of new weapons. Today, the Development, Concepts and Doctrine Centre (DCDC) carries out reviews on behalf of the MOD Central Legal Services. This satellite office consists of lawyers from all three branches of the British military: the Navy, the Air Force and the Army. The DCDC is located in the UK Defence Academy in Shrivenham.657 DCDC is the MOD’s think tank focusing on “global future developments and possible future operating concepts for the UK Armed Forces.”658 DCDC’s lawyers are qualified barristers or solicitors who serve three-year terms, while the senior lawyer position rotates between the Army, Navy and Air Force. All lawyers need operational experience in order to serve in the DCDC’s legal review process.659

654 Id. at 22. 655 Id. at 22. 656 SIEGRIST, 'A purpose-oriented working definition for autonomous weapons systems', 3.

b) Character of the Review Decision

The DCDC’s review is approved jointly by the lawyers, who compile a formal written legal advice, and the consulted experts, who focus on the accuracy of the scientific content of the final text. After the report is signed off, it is peer-reviewed by another lawyer from DCDC.660

c) Framework

The subject of review (means and methods of warfare) is defined in a broad sense, including the way in which weapons are used and the conditions under which warfare is conducted. According to the United Kingdom MOD’s UK Weapon Reviews manual, the following equipment may trigger the legal review mechanism: laser designators, sighting and target acquisition equipment, data links and target data processing software.661 The DCDC may also conduct a legal review in case existing capabilities or equipment are used in novel ways. As a result, the legal review “may record limitations on the use of a weapon or method of or means of warfare in order to ensure compliance.”662

657 UNITED KINGDOM OF GREAT BRITAIN AND NORTHERN IRELAND MINISTRY OF DEFENCE, 'UK Weapon Reviews', (8 March 2016) Development, Concepts and Doctrine Centre, www.gov.uk/government/uploads/system/uploads/attachment_data/file/507319/20160308-UK_weapon_reviews.pdf, 3, last visited December 15, 2017. 658 Id. at 3. 659 Id. at 3. 660 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 23.

The classified nature of weapon systems and procurement processes entails confidentiality measures that preclude full disclosure of the process. Nevertheless, the DCDC discloses the following five main areas examined in its legal reviews:663

1. A potential prohibition of the weapon, or a restriction on its use, deriving from specific treaty provisions or any other applicable provision of international law;

2. Whether the nature of the weapon is such as to cause superfluous injury or unnecessary suffering;

3. The potential of the weapon to be used discriminately;

4. The potential of the weapon to cause widespread, long-term and severe damage to the environment;

5. The likelihood of the weapon being affected by on-going or future trends in the development of international humanitarian law.

661 UK, 'UK Weapon Reviews', 4. 662 Id. at 4. 663 See for details: UNITED KINGDOM OF GREAT BRITAIN AND NORTHERN IRELAND, 'Possible Challenges to IHL due to Increasing Degrees of Autonomy', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/37B0481990BC31DAC1257F940053D2AE/$file/2016_LAWS+MX_ChallengestoIHL_Statements_United+Kingdom.pdf, 3.

According to the DCDC, a system needs to satisfy all five areas in order to pass the legal review, regardless of whether the weapon system has any level of autonomy. However, the UK entered a national reservation upon ratifying Additional Protocol I stating that AP I shall not apply to nuclear weapons.664 In addition, secondary sources serve as resources for the British review, including inter alia commentaries from the ICRC, academic papers, and UN or NGO reports.665 The UK not only maintains links with lawyers working in these respective divisions but has even established positions within sister services abroad.666

d) Review Methodology

The military lawyers from DCDC work closely with the procurement team. Throughout the whole procurement cycle, lawyers engage in the process by attending demonstration days, discussing issues with the producing companies and attending technical meetings.667 If necessary, the reviewing lawyers consult equipment and procurement teams, scientists, doctors, environmental and armed forces experts as well as the companies designing and building the weapon systems.668

664 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 23. 665 UK, 'UK Weapon Reviews', 5. 666 Id. at 5. 667 UK, 'Possible Challenges to IHL due to Increasing Degrees of Autonomy', 3, last visited December 15, 2017. 668 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 23.

The formal legal review follows three stages, which aim to ensure that the legal review is conducted before decisions on the allocation and spending of funds for new weapons are made.669 The three stages of review within the procurement process of new equipment are:670

1. “Initial Gate”: the decision to commit funds to developing a specific capability;

2. “Main Gate”: the decision to commit fully to the procurement of a particular weapon;

3. “Entering of Service”: when the finalised equipment or weapon enters service.

One exception to the standard operating procedures applies in the case of Urgent Capability Requirements (UCR), which may modify the procurement process. A UCR is a means to quickly alter or develop weapons according to urgent battlefield needs. A legal review is still conducted, in some cases orally, but only once there has been an initial clearance.671

DCDC writes a so-called “non-review” in case the legal analysis concludes that a weapons review is not necessary (i.e. the equipment falls outside the scope of Article 36 AP I, like a new radio). However, DCDC keeps a record of these decisions as a precautionary measure, in case new technology or a platform change later triggers a new legal review.672

669 UK, 'Possible Challenges to IHL due to Increasing Degrees of Autonomy', 3. 670 See for details of the process: UK MINISTRY OF DEFENCE, 'UK Weapon Reviews', (8 March 2016) Development, Concepts and Doctrine Centre, www.gov.uk/government/uploads/system/uploads/attachment_data/file/507319/20160308-UK_weapon_reviews.pdf, 4. 671 Id. at 4.

The United Kingdom is committed to transparency insofar as the classification of material relating to the performance and use of weapons, as well as concerns of national security, permits.673 To this end, the DCDC published a short report in 2016 entitled “UK Weapon Reviews”, in which the MOD gives an overview of the process adopted by the UK for conducting weapon reviews in accordance with Article 36 AP I.674

5. The US Legal Review

The Department of Defence (DoD) policy and practice of conducting legal reviews of weapons predates the obligations enshrined in Article 36 AP I.675 Although the US is not a state party to AP I (for political reasons), it has a sophisticated legal review mechanism in place and considers the review obligation to be customary international law.676

672 Id. at 5. 673 UK, 'Possible Challenges to IHL due to Increasing Degrees of Autonomy', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/37B0481990BC31DAC1257F940053D2AE/$file/2016_LAWS+MX_ChallengestoIHL_Statements_United+Kingdom.pdf, 2. 674 UK, 'UK Weapon Reviews', 1-5. 675 DOD, 'Department of Defense Law of War Manual', (12 June 2015), 6.2.3, 315; Details regarding the US legal review of new weapons primarily build upon a personal interview conducted by the author with lawyers in the Code 10 Division in the Pentagon on 29 February 2016 (account on file with the author). Disclaimer: Any faults in this chapter are mine, of course. 676 DAOUST, COUPLAND & ISHOEY, 'New wars, new weapons? The obligation of States to assess the legality of means and methods of warfare', (2002) 84 International Review of the Red Cross, 355, 357.

a) Review Authority

The policy to conduct legal reviews of new weapons in the US dates back to 1974.677 All three military services have regulations in place implementing the legal review policy.678 Although the overall responsibility for conducting legal reviews lies with the Department of Defence, each of the three military services has a department responsible for conducting legal reviews. The Navy, however, is an exception, because the Department of the Navy consists of the Marine Corps and the Navy. Weapons to be developed or acquired by the Department of the Navy are legally reviewed by the Judge Advocate General of the Navy in close collaboration with the Staff Judge Advocate to the Commandant of the Marine Corps. The US Navy’s legal review is currently conducted in Code 10, the International and Operational Law Division of the US Navy Judge Advocate General’s Office in the Pentagon.

To avoid potential conflicts of legal interpretation and to ensure consistency in legal reviews across the US military services, the military lawyers may work with the Department of Defence’s Law of War Working Group.679 The DoD follows a multidisciplinary approach in its legal reviews. Hence, the reviewing authority actively coordinates with the weapon producers, the DoD procurement division as well as weapon operators. This is why reviews may include not only lawyers but also safety officers, engineers, and software developers.680

677 DOD, 'Department of Defense Law of War Manual', 6.2.3, 315. 678 This section is mainly based on a personal interview conducted by the author with US Navy JAG attorneys of Code 10 on 29 March 2016 in the Pentagon, interview record on file with the author; For the Army see: ‘Review of legality of weapons under international law’, Army Regulation 27-53, 1 Jan. 1979; For the Navy see: Secretary of the Navy, ‘Department of the Navy implementation and operation of the defence acquisition system and the joint capabilities integration and development system’, Instruction 5000.2E, 1 Sep. 2011; For the Air Force see: ‘Legal reviews of weapons and cyber capabilities’, Instruction 51-402, 27 July 2011. 679 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 24.

(1) US Defence Acquisition Policies

The Defence Acquisition System sets mandatory policies in DoD Directive 5000.01 governing all acquisitions carried out by the Department of Defence. This includes the requirement that all weapon systems undergo a legal review by an attorney in the relevant department to ensure compliance with applicable domestic and international law.

(2) Law of War Manual: Compliance Test

The DoD Law of War Manual states that all weapons employed by the DoD undergo weapons reviews to ensure compliance with the laws of armed conflict.681 In addition, the Manual stipulates that the weapons review process is governed by individual Department instructions. In the case of the US Navy, this is the instruction SECNAVINST 5000.2E.

(3) US Navy Implementing Instructions

For the Department of the Navy, SECNAVINST 5000.2E is the implementing instruction for Directive 5000.01. It requires Program Managers to submit a review request to Code 10 for all weapon systems and delineates the contents of such requests. Instruction 5000.2E stipulates that “all potential weapons or weapon systems are reviewed by JAG before the award of the EMD (Engineering and Manufacturing Development) contract and again before the award of the initial production contract.”682 In other words, the legal review serves as a check so that taxpayer money is not invested in the research and development of a weapon or weapon system that may not lawfully be deployed.

680 Id. at 25.
681 DOD, 'Department of Defense Law of War Manual', (12 June 2015), 315.

The second crucial provision is that the proposed weapon must also be reviewed for compliance with arms control agreements.683 Instead of Code 10, the Director of Strategic Systems Programs (DIRSSP) conducts this legal review via the Naval Treaty Implementation Program (NTIP) Office (NT00), with the advice of the Navy Office of General Counsel (OGC).

b) Character of the Review Decision

The legal review of weapons process may lead to one of three results: approve, approve with conditions, or disapprove.684 If a new weapon does not meet all criteria and is not approved, the review report may state corrective actions identified during the process; if followed, these should allow the envisioned equipment to pass a subsequent legal review.685 During this stage, the reviewing lawyer considers whether legal restrictions on the use of the weapon system apply, or whether practical training or rules of engagement measures specific to a particular weapon system are necessary.686 While the review process itself is transparent, the specific records of individual reviews are usually classified and cannot be accessed. However, if a weapon system is exported to another country, certain information from the review process may be provided.687

682 SECRETARY OF THE US NAVY, 'Department of the Navy implementation and operation of the Defense Aquisition System and the Joint Capabilities Integration and Development System - SECNAV INSTRUCTION 5000.2E', (1 September 2011) available at: www.public.navy.mil/cotf/OTD/SECNAVINST%205000.2E.pdf, para. 1.6.1.a.(1).
683 Id. at para. 1.6.2.
684 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 24.
685 W. HAYS PARKS, 'Conventional Weapons and Weapons Reviews', in Tim McCormack & Avril McDonald (eds.), Yearbook of International Humanitarian Law (T.M.C. Asser Press, 2007), 133.

c) Framework

According to the Judge Advocate General of the US Navy, the purpose of the legal review of weapons is to ensure that all weapon systems comply with IHL and domestic legal requirements prior to acquisition or fielding. The requirement is laid out in Directive 5000.01 and SECNAVINST 5000.2E, which require that the Judge Advocate General of the Navy conduct a legal review of all weapons and weapon systems intended to meet a military requirement of the Department of the Navy. This review shall ensure that the intended use of proposed systems is consistent with the obligations of the US under both international and domestic law. For the purpose of triggering a legal review, the US Navy defines weapons as “all arms, munitions, material, instruments, mechanisms, devices, and those components required for their operation, that are intended to have an effect of injuring, damaging, destroying, or disabling personnel or property, to include nonlethal weapons”.688 The term “weapon” does not include launch or delivery platforms, but does include the weapons and weapon systems contained on those platforms. While “means of warfare” refers to the intended effects of weapons in their normal and expected use, “methods of warfare” is to be understood more broadly as the manner in which weapons are employed.689

d) Review Methodology

The actual review depends on the complexity of the weapon system under review. In case there is already a precedent on file or a review of a similar weapon available, the review lawyer in charge may conduct the review on his or her own within a relatively short timeframe.690 The DoD’s weapons review follows a holistic approach: lawyers conduct the review throughout requirement identification, development, testing and evaluation, as well as fielding and employment. The review may even accept feedback from operators in the field after deployment – which may lead to modifications that will ultimately require an additional legal review.691 In general, however, the review is structured in the following phases.

688 Personal interview conducted by the author with US Navy JAG attorneys of Code 10 on 29 March 2016 in the Pentagon, interview record on file with the author.
689 DOD, 'Department of Defense Law of War Manual', (12 June 2015), para. 5.1.1.
690 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 24.
691 Id. at 25.

(1) Phase Zero: Coordination

Coordination begins at the earliest stages: this early involvement ensures that attorneys participate in all aspects of weapons development and acquisition. The role of Code 10 in conducting law of armed conflict reviews, for example, is only one facet of that process. Code 10 OPLAW maintains relations with attorneys, in particular at NAVSEA, Naval Strategic Systems, and MARCORSYSCOM.

(2) Phase One: The Request

The instructions governing the Department of the Navy’s Acquisition System, SECNAVINST 5000.2E, require all Program Managers (PM) overseeing the development of a weapon system to submit a request for legal review to the Judge Advocate General (JAG) of the Navy Code 10 Division. Code 10 has promulgated a sample request692 to help guide the PM in generating the request and providing all of the required information. The request includes, firstly, a detailed description of the weapon system, including all of its parts, how it functions, what it does, and whether it is self-propelled, portable, mounted, or attached to a platform. Secondly, it contains a detailed description of the concept or method of employment, explaining how the weapon system will be used. Thirdly, information is requested regarding the ability of the weapon system to be directed at a particular target (i.e. its accuracy), as well as a comparison to similar weapon systems already in use. In a next step, information is addressed regarding the impact of the weapon system on the human body and on material objects. Lastly, any additional pertinent testing results are incorporated. Recent examples of requests for review include those submitted by the Naval Surface Warfare Centre Crane Division, the Special Warfare Development Centre, and Research, Development and Acquisition. According to the Navy JAG in Code 10, the latter requested the review of the Long Range Anti-Ship Missile (LRASM) system described above.

692 A sample request form that informed this chapter is on record with the author.

(3) Phase Two: The Legal Review

In the next phase, Code 10 produces a written report on the actual legal review. In doing so, it may request additional information and expertise from other branches of service and from DoD or US government testing facilities and agencies, such as NAVSEA, the Naval Surface Warfare Centre (NSWC) Crane Division, the NSWC Dahlgren Division, the Walter Reed National Military Medical Centre, or the FBI. The report starts off with a description of the weapon system itself and its purpose. It includes its physical description (e.g. material composition, size, weight etc.) and its intended uses (i.e. the concept of employment). This can be straightforward and short when the weapon’s functioning is very basic (e.g. the riot baton) or when the weapon has been around for a while and the adjustment is small (e.g. the PHALANX anti-missile system update). In other cases, it can be quite lengthy and complex, for instance where it must address Rules of Engagement (RoE), required operator training, or collateral effect mitigation measures inherent to the weapon system (e.g. the Directed Energy Weapon – Navy Laser Weapon System, LaWS, from Raytheon).

In a next step, Code 10 analyses whether the weapon under review could cause unnecessary suffering manifestly disproportionate to the military advantage reasonably expected from its use. It looks into whether the weapon may be controlled in such a manner that it is capable of being directed against a lawful target. Lastly, the Code 10 review attorney analyses whether there are specific rules of law or treaties prohibiting or restricting the use of the weapon. The analysis of whether a weapon might cause unnecessary suffering relies heavily upon the quality of the information submitted by the requester. In this phase, the military advantage cited above is taken into consideration. According to the attorney conducting the review in Code 10, this will most likely mean an inquiry into the degree to which the new weapon is better than its predecessors. The military advantage is then compared to the suffering likely to be caused to the target or to civilians who may be collaterally affected. This includes an assessment of the kinds of injuries to be expected if the weapon impacts the human body. The analysis then explicitly determines whether the weapon may cause unnecessary suffering that is manifestly disproportionate to the military advantage that could be reasonably expected from its use. In addition, the review includes an assessment of whether the weapon can be used in a discriminate manner. For that, Code 10 reviews the technical data regarding the accuracy of the weapon and its ability to be directed against a lawful target, with an understanding of the expected frequency and severity of any collateral effects.

The complete analysis is conducted with the understanding that the weapon system will be employed in accordance with its intended use. This implies that any weapon system can be used in violation of the law of armed conflict principles if modified or used in a manner other than its intended employment.

The purpose of the weapons review is to ensure that, if used as intended, the weapon will comply with these principles. Lastly, the review attorney will consider other relevant treaties. For this, Code 10 takes a broad look at existing laws and treaties, as well as Department of Defence or Department of the Navy directives or regulations, to determine whether there is any specific prohibition or restriction upon the anticipated use of the proposed weapon. For instance, if the review involves Directed Energy Weapons (DEW), Code 10 will analyse the weapon system in light of the Convention on Certain Conventional Weapons Protocol on Blinding Lasers, to which the US is a state party, the US DoD policy memo on blinding lasers,693 and other applicable protocols.

(4) Phase three: Coordination

Cross-service coordination follows the law of armed conflict review. Once Code 10 is satisfied that the weapon system meets the requisite legal obligations, the attorney conducting the review will share the review with the respective operational law sections of the other services in order to obtain a concurring opinion. Their approval is not required under applicable law or guidelines, but is considered useful in ensuring cross-service consistency in the general DoD approach to weapon reviews. In addition, this process enhances mutual understanding of each service’s analysis of weapon systems. Lastly, SECNAVINST 5710.23C on the “Implementation of and Compliance with Arms Control Agreements” requires that the Director of Strategic Systems Programs review all weapon systems for compliance with arms control agreements. This review is conducted for the US Navy by the Naval Treaty Implementation Program (NTIP) office and is separate from the Code 10 weapons review. Nonetheless, Code 10 notes this requirement in every review and works with the requesting organization to ensure that the requester understands and abides by it. The arms control review can occur prior to, in parallel with, or after the LOAC review conducted by Code 10.

693 U.S. SECRETARY OF DEFENSE, 'Department of Defense Policy on Blinding Lasers, SECDEF Memo U00888/97, 17 January 1997', (1997) reprinted in: Annotated Supplement to the Commander's Handbook on the Law of Naval Operations, Naval War College, § 9.8.

(5) Phase four: Dissemination and Retention

Once a review is completed, a copy is sent to the requesting organization and any intermediate endorsers. Code 10 maintains an archive of all completed weapons reviews.

e) The US Department of Defence Directive on Autonomy

Applicable guidelines must be able to address the rapidly growing technological developments of autonomous functions in machine prototypes. This study examines whether the US DoD has put in place policies and legal regulations to guide this expansion effectively. “When technology becomes too complex or decisions have to be taken that are above my pay grade, I have to look into policy guidelines,” is what the US Navy JAG charged with the legal review in the Pentagon stated about the Navy’s legal review of new weapon systems. In other words, when the current law does not prohibit a specific kind of weapon system, but the reviewing attorney’s instinct suggests that there are conflicting interests involved, he or she must look into broader US Department of Defence or Department of the Navy policies regarding that specific topic.

Department of Defence Directive 3000.09, Autonomy in Weapon Systems, is the first, and so far only, service-encompassing statement of existing policy on AWS. Back in 2012, then Deputy Secretary of Defence Ashton Carter outlined the key points to follow in the development of AWS. The directive applies to semi-autonomous and autonomous weapon systems using lethal or non-kinetic force, including guided munitions that can independently select and discriminate targets,694 such as the LRASM mentioned above.

At first glance, the DoD Directive on AWS is unambiguous about the use of lethal force by AWS:695

• AWS must permit commanders and operators to exercise appropriate levels of human judgment over the use of force;
• Persons who authorize the use of, direct the use of, or operate AWS must do so in accordance with the law of war;
• Individual targets or specific target groups need to be previously selected by an authorized human operator; and
• Human-supervised AWS may be used to select and engage targets (except human targets).

The directive makes clear that in the selection of human targets and the use of lethal force, a human operator needs to be involved.696 According to Directive 3000.09, LRASM would be governed by the Directive and fall under the category of “fire and forget” or “lock-on-after-launch homing munitions.” The US Navy itself classifies LRASM as a “semi-autonomous weapon system” in light of this Directive.697

694 DOD, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012) www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, para. 2 (2), (3), last visited December 15, 2017.
695 Id. at para. 4.
696 However, the directive itself contains two explicit exceptions, including AWS that can use lethal force and a Deputy Secretary of Defence waiver in cases of urgent military operational need – except the obligation for a legal review, id. at para. 4 (d); id. at Enclosure 3, para. 2.

6. Shortcomings in Existing Legal Review Systems and Lessons Learned

According to this study, Code 10 is logistically and analytically not prepared for the robotic revolution. Currently, only one Navy JAG processes all of the incoming requests (approximately 10 per year). Superiors then sign off on those review decisions. An individual JAG reviewing a highly complex new weapon system with increasing autonomous functions will find himself or herself overwhelmed. The dramatic and accelerating changes in technology mean that simply going through the paperwork or the archives, or getting back to the producer, is not feasible anymore. The review system needs to be strengthened. With ever more complex systems (robotics, unmanned systems, artificial intelligence: decisions taken by weapon systems autonomously), more external and internal expertise will be needed. Without a better understanding of how autonomous technology affects legal compliance, the attorneys in Code 10 have to either take the weapon manufacturer’s word for granted or block the individual system entirely. The shortcomings in the present process are:

• Formalized understaffing and lack of resources;
• Extremely heavy reliance on defence contractors, affecting the independence of the review;
• Inadequate understanding of complex technological developments, especially if insights are given by industry.

697 JOHN MARKOFF, 'Fearing Bombs That Can Pick Whom to Kill', International New York Times (available at: http://nyti.ms/1pNYvFi), 11 November 2014, last visited December 15, 2017.

Decisions regarding the acceptable level of autonomy must be made at the leadership level. The focus should be on operational testing, in particular on the implementation of the so-called Test, Evaluation, Validation, and Verification (TEVV) mechanism. In addition, novel developmental and operational test and evaluation (T&E) techniques would focus on the unique challenges of autonomy, including unpredictable environments, emerging behaviours, and human-machine communication.698 Tools and methods to reduce risks and enhance the predictability of AWS are:699

• Research, Development, Test, & Evaluation (RDT&E), Developmental Testing (DT), and Operational Testing (OT);
• Run-time behaviour prediction and recovery.

In particular, the Directive on Autonomy from 2012 is outdated: it itself states that it needs to be “reissued, cancelled, or certified” by November 2017 at the latest.700 In practice, this means that the individual attorney reviewing specific weapon systems has to rely on outdated and vague policies. In a field that is changing at high speed, this is inappropriate. Other US services, such as the US Air Force, seem better prepared for developing policies guiding technological development.701 As for the US Navy, its current publicly available policies covering unmanned systems are rather broad and do not contain specific guidance for legal reviews.702

698 JON BORNSTEIN, 'Autonomy Roadmap Autonomy Community of Interest', (24 March 2015) available at: www.defenseinnovationmarketplace.mil/resources/AutonomyCOI_NDIA_Briefing20150319.pdf, 8, last visited December 15, 2017.
699 Id. at 9.
700 DOD, 'Directive Number 3000.09: Autonomy in Weapons Systems', para. 7 (b).

That is why the US Delegation in the UN CCW in 2015 reaffirmed that the directive:703

• Does not establish a US position on the potential future development of lethal AWS – it neither encourages nor prohibits the development of such future systems;
• Establishes a deliberative approval process by senior officials and sets out the technical criteria that would need to be satisfied in order to develop AWS; and
• Imposes additional requirements beyond what is normally required during the weapon acquisition process.

701 UNITED STATES AIR FORCE OFFICE OF THE CHIEF SCIENTIST, 'Autonomous Horizons: System Autonomy in the Air Force - A Path to the Future (AF/ST TR 15-01)', (June 2015) I: Human-Autonomy Teaming, available at: www.af.mil/Portals/1/documents/SECAF/AutonomousHorizons.pdf?timestamp=1435068339702, 1-33, last visited December 15, 2017; GREG L. ZACHARIAS, 'Presentation to the House of Representatives Armed Services Committee Subcommittee on Emerging Threats and Capabilities - Advancing the Science and Acceptance of Autonomy for Future Defense Systems', (November 2015) available at: http://docs.house.gov/meetings/AS/AS26/20151119/104186/HHRG-114-AS26-Wstate-ZachariasG-20151119.pdf, 1-14, last visited December 15, 2017.
702 DOD, 'Unmanned Systems Integrated Roadmap FY2013-2038, Reference Number: 14-S-0553', (2013) www.defense.gov/pubs/DOD-USRM-2013.pdf; US DEPARTMENT OF THE NAVY, 'The Navy Unmanned Undersea Vehicle (UUV) Master Plan', (9 November 2004) www.navy.mil/navydata/technology/uuvmp.pdf, last visited December 15, 2017.
703 MEIER, 'U.S. Delegation Opening Statement', (13 April 2015), 2.

These additional requirements are designed to minimize the probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements, and to ensure appropriate levels of human judgment over the use of force.

The DoD needs to develop new test and evaluation methods, especially tailored towards autonomous functions, to enable the fielding of assured (reliable and trustworthy) autonomous systems.704 The high complexity of the systems, and underlying questions that lie more in ethics than in law, require that the highest-ranking officials in the military, together with policy makers, have a proper and up-to-date policy towards autonomy. As discussed above, the current approach lacks the resources, flexibility, and broader analytical framework to respond effectively to AWS. With these problems in mind, this study presents a system that addresses exactly these deficiencies. The main goal of the proposed framework is to be effective, but also – and more importantly – flexible enough to constantly adapt to new inventions. An informational policy brief about AWS needs to address the following key challenges:

• Uncertainty: How can policy makers make the best-informed decisions about how to govern the future?
• Benefit: Which system best captures the benefits while minimizing the disadvantages?
• Dystopian Exponential: Without regulation, we risk losing the chance to guide the technological development.

704 BORNSTEIN, 'Autonomy Roadmap Autonomy Community of Interest', (24 March 2015) available at: www.defenseinnovationmarketplace.mil/resources/AutonomyCOI_NDIA_Briefing20150319.pdf, 22, last visited December 15, 2017.

The next chapter reviews the potential for domestic or informal proposals for improvement, as well as a possible international instrument addressing these issues.

V. Proposal: Contextualising the Legal Review to include AWS

Indeed, context-based legal reviews of methods and means of warfare can help close loopholes and ensure that the spirit of these laws prevails.705

Already, most legal weapon review processes tailor each of their evaluations to the specific equipment or weapon system under assessment. However, they do not generally consider use issues due to their contextual nature.706 To date, reviewers fail to consider the context in which a weapon or method is expected to be used.707

Considering the context of deployment and use helps ensure that AWS will be used in a manner consistent with applicable international law. With regard to the CCW, conducting legal reviews was considered essential in the Final Declarations of the Second, Third, Fourth and Fifth CCW Review Conferences, in which states declared their determination to undertake such reviews, if they had not already done so.708

705 FRY, 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', (2006) 44 Columbia Journal of Transnational Law, 455.
706 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 273.
707 FRY, 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', (2006) 44 Columbia Journal of Transnational Law, 480, who speaks of “terrain” instead of context.

This study will now delve into the common features of existing legal reviews.

To date, those states conducting legal reviews rely first and foremost on work presented by representatives of the ministries of defence or foreign affairs, in addition to members of the armed forces.709 Most legal reviews are conducted in an inter-disciplinary process that takes into account military, legal, technological, political, and medical assessments. Most of these assessments take place as early as possible, either during the development or the procurement process. Processes differ as to whether multi-disciplinary teams conduct the review itself or whether experts are consulted when deemed necessary by the respective review authority. Only very few review processes provide for an independent review authority. The same is true for the outcomes of the review processes: most have only advisory status. However, some are directly linked to the procurement process of the weapon under review. Key to credible review processes are the actions states take as a result of these individual reviews. The legal reviews analysed above require the following modifications and adaptations which, hopefully, will also serve as a starting point for key recommendations for future legal reviews that take AWS into account.

708 See Second Review Conference, Final Declaration, 5; Third Review Conference, Final Declaration, 4; Fourth Review Conference, Final Declaration, 4; CCW, 'Final Document of the Fifth Review Conference CCW/CONF.V/10', 9-10.
709 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 48.

Weapons reviews may result in:710

• Modifications of system requirements;

• Formulation of new operational directives;

• Prescription of restrictions on how a weapon may be used;

• Introduction of specific training taking into account new capabilities.

One obvious limitation of present review processes is that only very few states have implemented national review processes despite their international obligation. In addition, only little information about these national reviews is accessible so far. Some states fear that other states could misuse the national process not to distinguish between lawful and unlawful weapons but purely to legitimize their new technologies.711 These deficiencies identified within the CCW have already led to initial proposals to develop common standards at the international level. A guide on legal weapons reviews clarifying the variety of legal reviews could achieve this; the ICRC is currently updating its manual on legal reviews to take this call for action into account. A compilation of best practices, for example, could help establish comprehensive and transparent standards and increase trust between states during this process.

710 Id. at para. 48.
711 Id. at para. 50.

Now that we have analysed the most advanced legal review processes, the following key factors should be considered so that an update of these processes continues to ensure that emerging weapon technologies are used in a lawful manner:712

1. The review authority could place restrictions on the environment or the time frame in which each specific weapon system may be used in accordance with applicable law;
2. Recommendations of restrictions could also inform training courses or rules of engagement;
3. The respective review methodologies need to be regularly reviewed and adapted if necessary to keep up with the technological development of emerging autonomous systems;
4. A license and clearance system could certify specific AWS for certain scenarios in which the systems perform in accordance with applicable law and abide by IHL rules with regard to their individual technological (sensors etc.) capabilities;
5. A licensing system on a case-by-case or model-by-model basis would entail the assurance of expected weapons effects, intended use, types of targets and – most importantly – the context of use, i.e. the environment of deployment. This would be easier for states to implement than a new legal framework.

712 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 16; ICRC, 'Report of the ICRC Expert Meeting on Autonomous weapon systems: technical, military, legal and humanitarian aspects in Geneva', 62.

Building upon existing legal reviews has the advantage that the application of the law can be adapted to the technological development of weapon systems.

The key challenges to updating the respective national legal reviews to cover AWS are:713

• Reliability of communication with human operators;

• Degree of risk of interference and detectability;

• Ability to calculate algorithms for complex situations;

• Fail-safe function for system or machine failures;

• Level of mobility in a complex and unfamiliar environment.

To stay abreast of technological developments, national review authorities need to chart the progress of autonomy in the fields of weapons they plan to develop or acquire. Potential characteristics could be identified either through a functional approach or through one focusing on system capabilities, i.e. target identification, target prioritisation or mobility.714 Michael Siegrist, legal officer at the International Humanitarian Law and International Criminal Justice Section within the Swiss Directorate of International Law, supports the view that the Article 36 AP I legal review of new weapons process can ensure that AWS are used lawfully, stating that

“the process of national legal reviews may require procedural and technical adaptations to fully capture the complexity of autonomous weapons systems, if rigorously implemented, it holds the potential of ensuring that all new weapons, means and methods of warfare are developed and acquired in compliance with international law.”715

713 BIONTINO, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', para. 28.
714 Id. at para. 30.

Now that we have established that the legal review of new means and methods of warfare enshrined in Article 36 AP I is best suited to control and restrict AWS, the question arises as to how many states have a proper review process in place. Of the 174 States Parties to AP I, only around 15 have implemented this obligation through a national review mechanism. Hence, the focus should be on enhancing compliance with the protocol through national implementation of the legal review obligation.

The dilemma of having a theoretical framework available that could control the development of AWS within the applicable rules, while lacking proper implementation, has already been identified within the CCW expert meetings.

An information-sharing proposal to enhance compliance should focus on the following key recommendations:716

715 SIEGRIST, 'A purpose-oriented working definition for autonomous weapons systems', 4.
716 BOULANIN, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', 18.

• Increased transparency can lead to the formulation of guiding principles

and strengthen international confidence in the review process;

• The sharing of lessons learned and best practices could inform the

development of international standards with regard to more autonomy.

A. Context-related Incentives for Autonomy and Technical Difficulties

While technology enables AWS, military applications drive their adoption. The US DoD Autonomy Roadmap issued in 2015 presents the main drivers:717

• Manpower efficiencies (reduce human footprint and personnel cost);

• Rapid response and 24/7 presence (timely, persistent, enduring);

• Harsh environments (day, night, hot, cold, bad weather, rubble, barriers);

• Logistical support (reduce logistics burden: hold, transport, carry, watch).

In 2015, the United Nations Institute for Disarmament Research (UNIDIR), an autonomous institute within the United Nations that conducts research on disarmament and security, issued the report “The Weaponization of Increasingly Autonomous Technologies in the Maritime Environment: Testing the Waters.” UNIDIR has assembled its own list of the main drivers for maritime systems for commercial, scientific, and military purposes.718 These resemble the broader DoD drivers but, as the title indicates, focus specifically on the maritime environment. Firstly, environmental factors influence the push for greater autonomy: bad weather, ocean currents, temperature, and underwater pressure are all taken into consideration. From a military perspective, one of the most important factors is the vast area that needs to be monitored and controlled. Secondly, maritime environments are often physically and mentally stressful; long submarine deployments in particular regularly bring sailors to their psychological limits. Thirdly, the cost of deploying and maintaining human crews at sea represents an economic driving force. Lastly, when it comes to underwater operations, communications become increasingly difficult to maintain. This is especially true when autonomous systems penetrate communication-denied areas in covert operations.719 Several experts agree that the specifics of the maritime environment, such as the lack of civilian encounters underwater or on the high seas, make the deployment of AWS in this environment very likely.720 This study also looked into whether actual robot makers are thinking about AWS.

717 BORNSTEIN, 'Autonomy Roadmap Autonomy Community of Interest', (24 March 2015) available at: www.defenseinnovationmarketplace.mil/resources/AutonomyCOI_NDIA_Briefing20150319.pdf, 7, last visited December 15, 2017. 718 UNITED NATIONS INSTITUTE FOR DISARMAMENT RESEARCH, 'The Weaponization of Increasingly Autonomous Technologies in the Maritime Environment: Testing the Waters', (2015) UNIDIR Resources No. 4 (available at: www.unidir.org/files/publications/pdfs/testing-the-waters-en-634.pdf), 1-14, last visited December 15, 2017.

719 Id. at 2. 720 WOLFF HEINTSCHEL VON HEINEGG, 'Unmanned Maritime Systems: The Challenges', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (13-14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017; JEFFREY THURNHER, 'Unmanned (Maritime) Systems and Precautions in Attack', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (13-14 February 2015), at www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

A survey this author conducted from 5 to 15 June 2015 among robotic experts and team members of the DRC Finals found that AWS are considered inevitable and probably not too far away. 64 % of those interviewed did not question that AWS will be developed and used. Opinion is far more divided on when AWS will technically be ready to be deployed onto the battlefield. A plurality (27 %) assumes that fully autonomous weapon systems could technically be deployed in five to ten years. However, 24 % believe that it will take at least 10-15 years, while 21 % think that AWS are 25-50 years away.

The survey found significant agreement on what robot makers perceive as the main obstacles: 95 % believe that technical limitations rather than legal regulations currently hinder robotic development.

When it came to the question of whether governments have a moral obligation to do everything they can to save their soldiers’ lives – including developing and using AWS – the largest group agreed. The margin is modest, however: 38 % supported the notion of a moral obligation, 33 % considered such a measure too extreme, and roughly another third remained unsure.

While 64 % say that AWS are inevitable, 33 % hold that we would still be better off without them, but almost as many (31 %) think that mankind would be better off with AWS. In the same vein, opinions diverge about whether AWS’ advantages outweigh their disadvantages. More than 54 % (35 % strongly, 19 % moderately) support the claim of opposing AWS because decisions over life and death should remain with humans and someone must be held responsible for inflicting harm. However, 38 % (32 % moderately, 6 % strongly) support the claim that AWS would save soldiers from the physical (death or injury) and psychological (post-traumatic stress disorder) harms of war and that they would be able to make ethical decisions, since they could not feel emotions such as hate or revenge and would lack the instinct for self-defence.

On the other hand, a majority (60 %) agrees that AWS should be developed, although 41 % want to restrict them strictly to the defensive use of force, while 19 % approve of offensive use of force too. Just over a quarter (27 %) disapprove of the prospect of AWS development and support a pre-emptive ban. In individual responses, experts specified that they would favour development only in order to understand the technology and the problems of autonomy, so as to be able, one day, to counteract the use of such systems if need be. Others hold that there is no choice, because someone will always start developing such systems once they are technologically possible, political ban or not. Meanwhile, some support development for non-lethal purposes only.


This study found that robot experts made the following arguments for regulation (more than one choice available): AWS would not be reliable enough due to sensor or other technological limitations (50 %); 47 % state that only humans should make the decision over life and death; 39 % of participants fear a potential accountability gap; 25 % say that there would be a risk of AWS falling into the hands of enemies; and just 3 % do not see valid reasons against AWS.

More than half (53 %) of the experts interviewed insist that remote control, meaning having a human operator in the loop, is the only possibility to maintain human control. However, 39 % of robotics specialists express that human control can also be maintained by way of programming safeguards into the design of AWS.

Asked directly which concept (more than one choice available) they would find helpful to maintain human control over AWS, 70 % approve of human control over “critical functions,” meaning action-based command over targeting and attacking decisions. 27 % support the concept of meaningful human control in the sense of strong direction and always having significant human involvement.

Almost as many experts (24 %) favour the notion of human judgment, implying weak control or even no direct human involvement. This survey illustrates that even among robotic experts there is no consensus. That being said, their first and foremost concern is technical difficulties, since solving them is a prerequisite for the actual development of AWS.

Hence, before key decisions can be safely delegated to fully autonomous weapon systems, specific issues have to be solved:721

• Unpredictable Environments: Autonomous systems must operate in unknown, untested environmental conditions;

• Unintended Behaviour: It is difficult to assure correct behaviour in a countless number of different environmental conditions;

• Human-Machine System: Handoff, communication, and interplay between operator and autonomy are key enablers for the trust and effectiveness of an autonomous system.

In the following section, current prototypes under development will be reviewed to demonstrate how highly advanced the current level of autonomy already is. In the long run, the US Department of Defence even wants to test “fully autonomous systems.”722 The overall trend of more autonomy in the US military holds true for other countries as well. This study will now look at how the US has achieved its technological edge when it comes to unmanned systems.

1. DARPA: The US Technology Innovation Accelerator

The US has a state-owned innovation laboratory that is tasked with advancing the US’s technological superiority. As a result, the US is among the most advanced states when it comes to autonomous systems. The Defence Advanced Research Projects Agency (DARPA) is a research agency established as part of the US Department of Defence in 1958. Among the projects financed by DARPA is a prize competition called the DARPA Robotics Challenge (DRC).

721 BORNSTEIN, 'Autonomy Roadmap Autonomy Community of Interest', 24. 722 Id. at 23.

DARPA has a straightforward mission: to prevent technological surprise for the US and to create technological surprise for its enemies.723 DARPA provides key initial funding and investment for breakthrough technology. It is unique among its kind in holding competitions as a means to drive innovation and in promoting collaboration among engineers and scientists. DARPA’s record in this regard is impressive. Some of its technological achievements are precision guidance, stealth technology, unmanned aerial vehicles, and driverless cars.724 DARPA itself describes its role as the Department of Defence’s innovation engine that “undertakes projects that are finite in duration but that create lasting revolutionary change.”725

The US Congress had set a high goal: to have one-third of military ground vehicles autonomous by 2015.726 Military contractors, however, had only managed to present the occasional new prototype, none of which was very reliable.727 The original idea of the DARPA Grand Challenge (DGC) was to accelerate the development of driverless vehicles to carry military supply and cargo into battlefields. The impetus that originated from these challenges can be observed in the civilian sector as well: several prototypes of driverless cars are now on the roads. In a way, therefore, DARPA achieved its goal: it increased access to the technologies and lowered their cost. However, autonomous cars still have a driver for security and legal reasons. Google is very proud of how few accidents its autonomous cars have had – and the driverless vehicles caused none.728 How could this be achieved with the help of the DARPA Grand Challenges? DARPA changed its strategy from funding traditional research by military contractors to holding a “race” featuring a prize of one million dollars for the winner. In the first DGC, held in 2004, none of the 15 competing robotic self-driving cars made it to the final round. However, one year later, five teams managed to successfully complete the course. The winning vehicle was Stanford University’s Stanley, a diesel-powered Volkswagen Touareg R5. Stanley’s biggest advantage was its software system, which relied “on state-of-the-art artificial intelligence technologies, such as machine learning and probabilistic reasoning.”729 Its developer, Sebastian Thrun, moved on to spearhead the development of the Google self-driving car and is the founder of Google’s X Lab.730 The third, and last, competition of this kind took place in 2007 and was called the “Urban Challenge.” The tasks this time were much more challenging: the driverless vehicles had to obey all traffic regulations on a 60-mile course while also adapting to traffic.731

723 For an overview of DARPA’s activities see: www.darpa.mil/about.aspx, last visited December 15, 2017. 724 DEFENSE ADVANCED RESEARCH PROJECTS AGENCY, 'Breakthrough Technologies for National Security', (2015) www.darpa.mil/about.aspx, 18, last visited December 15, 2017. 725 See for details: www.darpa.mil/our_work, last visited December 15, 2017. 726 KRISHNAN, Killer Robots - Legality and Ethicality of Autonomous Weapons (Farnham, 2009), 11. 727 BURKHARD BILGER, 'Auto Correct - Has the self-driving car at last arrived?', (25 November 2013) The New Yorker, www.newyorker.com/magazine/2013/11/25/auto-correct, 4, last visited December 15, 2017. 728 ANDREW J. HAWKINS, 'Google’s ‘worst’ self-driving accident was still a human’s fault', (26 September 2016) www.theverge.com/2016/9/26/13062214/google-self-driving-car-crash-accident-fault, last visited December 15, 2017. 729 SEBASTIAN THRUN, MIKE MONTEMERLO, HENDRIK DAHLKAMP, DAVID STAVENS, ANDREI ARON, JAMES DIEBEL, PHILIP FONG, JOHN GALE, MORGAN HALPENNY, GABRIEL HOFFMANN, KENNY LAU, CELIA OAKLEY, MARK PALATUCCI, VAUGHAN PRATT & PASCAL STANG, 'Stanley: The Robot that Won the DARPA Grand Challenge', (2006) 23 Journal of Field Robotics, 662.

The maturity of state-of-the-art prototypes of driverless cars is a very good indicator of how successful DARPA is at achieving its objectives. Military ground vehicles such as Lockheed Martin’s Squad Mission Support System (SMSS) have been navigating autonomously using robotic technologies since 2011.732 So far, twenty-one states (Alabama, Arkansas, California, Colorado, Connecticut, Florida, Georgia, Illinois, Louisiana, Michigan, New York, Nevada, North Carolina, North Dakota, Pennsylvania, South Carolina, Tennessee, Texas, Utah, Virginia and Vermont), as well as Washington, D.C., have passed legislation related to autonomous vehicles.733 The UK followed in 2013 and Switzerland in 2015.734 Daimler’s Freightliner Inspiration in Nevada is the first driverless truck to obtain a license.735 Plans discussed at the moment even include a “NAFTA superhighway for driverless trucks,” a special corridor that runs from Mexico up to Canada.736 Google officials say that its driverless car will be ready for the market by 2020.737 Many of these success stories are built upon initial ideas funded by DARPA.

730 BILGER, 'Auto Correct - Has the self-driving car at last arrived?', 4, 10. 731 See: http://archive.darpa.mil/grandchallenge, last visited December 15, 2017. 732 LOCKHEED MARTIN, 'Lockheed Martin Demonstrates Autonomous Systems that Advance Unmanned Technology on Land, Air, and Sea', (9 May 2017) http://news.lockheedmartin.com/2017-05-09-Lockheed-Martin-Demonstrates-Autonomous-Systems-that-Advance-Unmanned-Technology-on-Land-Air-and-Sea?_ga=2.117835721.1502180965.1512361255-581737331.1408534907, last visited December 15, 2017. 733 NATIONAL CONFERENCE OF STATE LEGISLATURES, 'Autonomous Vehicles Enacted Legislation', (23 October 2017) www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx, last visited December 15, 2017. 734 SEBASTIAN HÖRL, FRANCESCO CIARI & KAY W. AXHAUSEN, 'Recent perspectives on the impact of autonomous vehicles', (September 2016) Working paper - Institute for Transport Planning and Systems, available at: www.ethz.ch/content/dam/ethz/special-interest/baug/ivt/ivt-dam/vpl/reports/2016/ab1216.pdf, 1-37, last visited December 15, 2017.

2. DARPA Robotics Challenge: “May the Best Robot Win”

Currently, DARPA is focused on cyber security and robotics as the “next big thing” in national security. It is calling for a “revolution in capability.”738 During the DRC, teams competed against each other in a challenge that consisted of guiding robots through physical tasks, which tested mobility, manipulation, dexterity, perception and operator control mechanisms.739 In the trials that led to the final DRC, teams had to demonstrate the performance of critical real-world disaster-response skills with their prototype robots.740 In addition, in the Virtual Robotics Challenge (VRC), teams had to show their software’s ability by guiding a robot through sample tasks in a simulated suburban obstacle course. The six best-performing teams received an ATLAS robot from Boston Dynamics and funding from DARPA to take part in the following DRC Trials later that year.741

735 DANIEL THOMAS, 'Driverless convoy: Will truckers lose out to software?', (26 May 2015) BBC News, www.bbc.com/news/business-32837071, last visited December 15, 2017. 736 LEE MATHEWS, 'NAFTA superhighway for driverless trucks could become a reality', (26 May 2015) www.geek.com/news/nafta-superhighway-for-driverless-trucks-could-become-a-reality-1623558/, last visited December 15, 2017. 737 PAUL LIENERT & JOE WHITE, 'Google partners with auto suppliers on self-driving car', (14 January 2015) www.reuters.com/article/us-autoshow-google-urmson/google-partners-with-auto-suppliers-on-self-driving-car-idUSKBN0KN29820150114, last visited December 15, 2017. 738 See www.darpa.mil/about.aspx, last visited December 15, 2017. 739 See http://archive.darpa.mil/roboticschallenge/, last visited December 15, 2017. 740 Id.

DARPA’s Robotics Challenge Trials were the second of three DRC challenges, but the first physical contest. They took place in December 2013.742 Sixteen participating teams had to conquer eight tasks. Among them were teams from MIT, NASA, Carnegie Mellon University, Lockheed Martin, and even from Japan, like the winning team “SCHAFT.”743 The teams were tasked with simulating what a robot would have to perform in a disaster zone. The robots had to safely enter and effectively work inside such a disaster zone, while humans could operate them from a safe distance.744 The tasks consisted of walking across rough terrain, removing debris from doorways, opening doors, climbing industrial ladders, connecting fire hoses, locating and closing leaking valves, and driving a utility vehicle.745 As a result of the trials, DARPA as the funding agency, as well as the technicians involved, got a good first-hand impression of the current state of robotic technology. According to Brad Tousley, Director of DARPA’s Tactical Technology Office, the disaster-relief trials exemplified how technology can save lives.746

741 Id. 742 Id. 743 Id. 744 Id. 745 Id. 746 CHERYL PELLERIN, 'Eight teams earn DARPA funds for 2014 robotics finals', (27 December 2013) www.army.mil/article/117612/eight_teams_earn_darpa_funds_for_2014_robotics_finals, 2, last visited December 15, 2017.

The DARPA Robotics Challenge Finals took place in Pomona, California, in June 2015. The contest consisted of a natural or man-made disaster environment in which the robots and their teams were tested on their assistance capabilities for disaster response.747 In total, 25 teams from, inter alia, Germany, Italy, Japan, China, and the US participated in the competition. The DRC program manager, Gill Pratt, said that the broad international participation showed that many governments placed priority on furthering robotic technology.748 This assumption is supported by the fact that some of the competing teams were funded by the European Union, Japan or South Korea.749

At the Finals, 15 different commercial or custom-built robotic forms were present. Most teams (seven) used the upgraded ATLAS robot from Boston Dynamics. Each team had its own software and user interface.750 One major focal point of the DRC Finals was communication. In contrast to earlier competitions, the robots had to be operated completely without wires such as power cords or wired communication tethers. All communication with the robots during the DRC Finals was done through wireless radio provided by DARPA.751 The DRC Finals Operations Book explicitly stated that it was forbidden to communicate with the team operator during a run as a means of increasing his situational awareness.752 DARPA calls this “operator supervision”: humans remotely observing the robot’s activity through its sensors.753

747 http://archive.darpa.mil/roboticschallenge/, last visited December 15, 2017. 748 PELLERIN, 'Eight teams earn DARPA funds for 2014 robotics finals', (27 December 2013) www.army.mil/article/117612/eight_teams_earn_darpa_funds_for_2014_robotics_finals, last visited December 15, 2017. 749 FUTURE TIMELINE, 'ATLAS humanoid robot gets an upgrade', (23 January 2015) www.futuretimeline.net/blog/2015/01/23.htm, last visited December 15, 2017. 750 http://archive.darpa.mil/roboticschallenge/, last visited December 15, 2017. 751 DARPA ROBOTICS CHALLENGE, 'DRC Finals Operations Book (DISTAR Case 24508)', (30 April 2015) www.theroboticschallenge.org/news/ops-manual, 21, last visited December 15, 2017.

In addition, to increase the need for stable supervised autonomy, DARPA staff intentionally simulated communications blackouts between the robots and their human operators working from a distance. During those times, the robots had to proceed on their own. Most of these changes were intended to realistically reflect the conditions robots would face in a real-world disaster zone.754
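The interplay tested here – remote supervision that degrades gracefully into on-board autonomy during a blackout – can be illustrated with a minimal control-loop sketch. The code below is purely this author’s illustration, not DARPA’s or any team’s actual software; the `link_up` flag, the task queue, and all function names are assumptions made for the example.

```python
# Illustrative sketch (not DARPA code): a robot control loop that accepts
# operator commands while the radio link is up and falls back to executing
# its pre-planned task queue autonomously during a communications blackout.

def control_step(link_up, operator_command, task_queue):
    """Return the action the robot performs in one control cycle."""
    if link_up and operator_command is not None:
        return operator_command          # supervised mode: operator steers
    if task_queue:
        return task_queue.pop(0)         # blackout: proceed with planned tasks
    return "hold_position"               # nothing planned: wait for contact

# Simulated run: the link drops after the first cycle.
tasks = ["open_door", "close_valve"]
actions = [
    control_step(True, "turn_left", tasks),   # operator in control
    control_step(False, None, tasks),         # blackout: autonomous task 1
    control_step(False, None, tasks),         # blackout: autonomous task 2
    control_step(False, None, tasks),         # blackout, queue empty: hold
]
print(actions)
```

The point the sketch makes is the one DARPA scored at the Finals: the robot remains useful during the blackout only to the extent that tasks were planned or can be completed without the operator.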

For this reason, the ATLAS robots for the DRC Finals were 75 per cent new, including wireless communication and on-board power.755 One of the key aspects of the DRC Finals was the resilience and reliability of the robots. This included robustness against falls, battery power management, and sufficient partial autonomy to complete tasks despite deliberate communication interruptions.756 The new ATLAS “unplugged” had a wireless emergency stop to increase the safety of operations as well as three on-board computers for increased perception. In addition, a wireless router in the robot’s head allowed for untethered communication.757

752 Id. at 26. 753 http://archive.darpa.mil/roboticschallenge/, last visited December 15, 2017. 754 Id. 755 Id. 756 Id. 757 Id.

Going even further in combining emerging technologies and warfare, the Department of Defence created an “Algorithmic Warfare Cross-Functional Team” in April 2017, dedicated to maintaining a qualitative edge in war by harnessing algorithmic systems with artificial intelligence and machine learning.758

B. Categorising Systems According to the Context

Emerging weapons, such as submarines and airplanes, have always been considered major leaps in military capabilities. Yet times are changing, because a whole new category is emerging in which humans are no longer involved in selecting targets and killing other humans. Development and use are no longer far away: the maritime environment would arguably not require the same high level of sensor technology to differentiate between civilians and combatants as urban warfare does. Systems are already able to distinguish between military and civilian vessels,759 which could make their use against military objects very likely from a military and policy perspective. A review of current military systems applied in maritime contexts will serve as a valuable opportunity to study the current adoption of specific AWS and the legal issues surrounding their use.

758 DUSTIN LEWIS, NAZ MODIRZADEH & GABRIELLA BLUM, 'Artificial Intelligence - The Pentagon’s New Algorithmic-Warfare Team', (26 June 2017) www.lawfareblog.com/pentagons-new-algorithmic-warfare-team, last visited December 15, 2017. 759 T. X. HAMMES, 'The Future of Warfare: Small, many, smart vs. few & exquisite?', (16 July 2014) War on the Rocks (available at: http://warontherocks.com/2014/07/the-future-of-warfare-small-many-smart-vs-few-exquisite/), last visited December 15, 2017.

The demand for more autonomy is very high in the context of the high seas and underwater operations because they represent the most challenging environments for humans. At the same time, technological developments permit machines to penetrate depths and distances like never before. The systems addressed in this chapter show how well developed the respective level of autonomy is in the maritime environment. Both the private and military sectors drive the development of autonomous systems as a result of two crucial factors: the physical limitations of humans and cost reduction. In addition, disputes concerning the right of passage in the South China Sea and the protection of sensitive coastal facilities have become sources of military tension. Among the latter are underwater cables that carry not only $10 trillion a day worth of global business but also 95 per cent of daily communications; these cables are hard to monitor, and breaks are hard to find and repair.760 The Russians are reportedly operating submarines near these vital undersea cables and are said to be developing the so-called “Kanyon,” apparently envisioned as an autonomous submarine with a nuclear warhead.761 Admiral James Stavridis commented that Russia is “reaching backwards for the tools of the Cold War, albeit with a high degree of technical improvement”.762

760 DAVID E. SANGER & ERIC SCHMITT, 'Russian Ships Near Data Cables Are Too Close for U.S. Comfort', (25 October 2015) International New York Times (available at: http://nyti.ms/1kFmzsA), last visited December 15, 2017. 761 FRANZ-STEFAN GADY, 'Is Russia Building a Top-Secret Nuclear-Armed Underwater Drone?', (17 September 2015) The Diplomat (available at: http://thediplomat.com/2015/09/is-russia-building-a-top-secret-nuclear-armed-underwater-drone/), last visited December 15, 2017.

How does the US Navy react to these kinds of new threats? Which functions and missions are currently envisioned and with which projected systems? Here, the focus is on researching perception-based control and decision-making for exploration and exploitation of naval environments.763

The following section gives an overview of the different kinds of systems that already exist or are currently under development, to showcase the existing levels of autonomy with respect to the context of envisaged deployment. Landmines, by contrast, have only a very basic level of target recognition, such as exploding when triggered by a certain weight on a pressure plate or trip wire. Such non-command-detonated landmines can be classified as weapons rather than systems and require no human weapon-release authorization.764 This is why they are excluded from the following detailed review. One approach to classifying unmanned systems is according to the environment in which they are used. The systems presented below include:

1. Defence close-in weapon systems;

762 SANGER & SCHMITT, 'Russian Ships Near Data Cables Are Too Close for U.S. Comfort', (25 October 2015) International New York Times (available at: http://nyti.ms/1kFmzsA), last visited December 15, 2017. 763 BORNSTEIN, 'Autonomy Roadmap Autonomy Community of Interest', 21. 764 BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 487. 232

2. Autonomous Targeting and Attacking Missiles;

3. Unmanned Ground Systems (UGS);

4. Unmanned Undersea Vehicles (UUV);

5. Unmanned Surface Systems (USS);

6. Unmanned Aerial Systems (UAS); and

7. Micro-drone swarm systems.

1. Defence Close-in Weapon Systems

Source: www.raytheon.com/capabilities/rtnwcm/groups/gallery/documents/image/iron_dome_hero_img.jpg

The Phalanx CIWS,765 the MIM-104 Patriot,766 a surface-to-air missile (SAM) system, and the AEGIS Weapon System (AWS),767 a centralized, automated, command-and-control (C2) weapons control system, have automatic detect-and-track capabilities and have been in use for several decades. The Israeli Iron Dome Weapon System is a relatively new interceptor solution for countering rockets, artillery and mortars (CRAM). The Iron Dome system can assess a threat based on its trajectory to determine the expected target area and only intercept if the attack poses danger to an occupied area, while allowing off-target threats to detonate in unoccupied areas.768 For short-range rockets, the decision to intercept or not is a split-second one.769 The Israeli Rafael Advanced Defence Systems, the producer of the Iron Dome, partners with Raytheon USA to produce David’s Sling, a system countering long-range artillery rockets (LRAR), short-range ballistic missiles (SRBM), cruise missiles (CM) and traditional air defence threats.770 Especially impressive is the Iron Dome’s ability to discriminate between threats headed towards populated areas and those headed towards the open sea or open fields.771 It can choose either to attack or not to attack an incoming rocket. Hence, Iron Dome can be classified as already having human-supervised autonomous functions.772

768 www.rafael.co.il/5619-689-en/Marketing.aspx, last visited December 15, 2017; www.raytheon.com/capabilities/products/irondome/, last visited December 15, 2017. 769 MARK TRAN, 'Iron Dome: Israel's 'game-changing' missile shield', (9 July 2014) www.theguardian.com/world/2014/jul/09/iron-dome-gaza-israel-air-defence-missile, last visited December 15, 2017. 770 www.rafael.co.il/5618-694-EN/Marketing.aspx, last visited December 15, 2017. 771 www.rafael.co.il/5619-689-en/Marketing.aspx, last visited December 15, 2017. 772 SCHMITT, 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) 19 Harvard National Security Journal, http://harvardnsj.org/2013/02/autonomous-weapon-systems-and-international-humanitarian-law-a-reply-to-the-critics/, 4; AKERSON, 'The Illegality of Offensive Lethal Autonomy', in Saxon (ed.), International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff, 2013), 68; Arguing instead for classifying them as automatic systems: BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 488; EVANS, 'At War with the Robots: Autonomous Weapon Systems and the Martens Clause', (2013) 41 Hofstra Law Review, 705.
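The discrimination logic attributed to Iron Dome above – intercept only if the predicted impact point threatens an occupied area – can be paraphrased in a few lines of code. This is a deliberately simplified sketch by this author: the real battle-management software and its parameters are classified, and the names used here (`predicted_impact`, `occupied_areas`) are assumptions made for illustration.

```python
# Illustrative sketch (not Rafael's software): a human-supervised intercept
# decision of the kind attributed to Iron Dome. A threat is engaged only
# when its predicted impact point lies inside an occupied area; off-target
# rockets are allowed to land in open fields or the sea.

def predicted_impact(x0, vx, y0, vy):
    """Very crude trajectory extrapolation: ground position where the
    rocket reaches altitude zero (no drag, constant velocities)."""
    t_impact = y0 / -vy if vy < 0 else float("inf")
    return x0 + vx * t_impact

def should_intercept(impact_x, occupied_areas):
    """Engage only if the impact point falls inside an occupied area."""
    return any(lo <= impact_x <= hi for (lo, hi) in occupied_areas)

# One populated zone between kilometres 10 and 20 along the x-axis.
occupied = [(10.0, 20.0)]

threat_to_town = predicted_impact(x0=0.0, vx=3.0, y0=5.0, vy=-1.0)   # lands at 15.0
threat_to_field = predicted_impact(x0=0.0, vx=8.0, y0=5.0, vy=-1.0)  # lands at 40.0

print(should_intercept(threat_to_town, occupied))   # True: intercept
print(should_intercept(threat_to_field, occupied))  # False: let it fall
```

Even in this toy form, the split-second character of the decision is visible: the whole judgment reduces to a trajectory prediction and a containment test, with the human role limited to supervising and overriding the system.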

2. Autonomous Targeting and Attacking Missiles

Picture of LRASM: www.baesystems.com/en-us/lrasm-long-range-sensor

The Autonomous Long Range Anti-Ship Missile (LRASM) is a particularly interesting system since it is still in the planning stages but has already successfully passed the legal review of new weapons and weapon systems of the Navy Judge Advocate General’s (JAG) Code 10 Division.773 LRASM is a joint project of DARPA and the Office of Naval Research, an “autonomous, precision-guided anti-ship stand-off missile” that the US Navy and US Air Force are planning to introduce into their respective arsenals by 2019.774 Lockheed Martin started the development of LRASM in 2009 and conducted the maiden flight test in July 2012.

773 Personal interview conducted by the author with US Navy JAG attorneys of Code 10 on 29 March 2016 in the Pentagon, Washington, D.C. – interview record on file with the author. 774 NAVAL TECHNOLOGY, 'Long Range Anti-Ship Missile (LRASM), United States of America', available at: www.naval-technology.com/projects/long-range-anti-ship-missile/, last visited December 15, 2017.

Lockheed Martin describes LRASM’s main functions as:775

• Reducing dependence on intelligence, surveillance and reconnaissance (ISR) platforms, network links, and GPS navigation in aggressive electronic warfare environments;

• Drawing on its routing and guidance capabilities, to safely navigate to the enemy area, where the weapon can use gross target cueing data to find and destroy its pre-determined target in denied environments (italics by author).

LRASM can be air-launched from conventional fighter jets such as the F/A-18E with an LRASM load-out, or ship-launched. A satellite transmits hostile Surface Action Group (SAG) locations before the launching of LRASM. The line-of-sight weapons data link is then established. When leaving the line-of-sight radius, LRASM switches to the non-line-of-sight weapons satellite data link. The crucial point here is the denied environment, which does not allow for human interaction. LRASM follows a planned routing to several waypoints. Closer to the target, however, LRASM switches to autonomous routing. This autonomous navigation draws on autonomous sensor modes for targeting:776

• Organic area of uncertainty (AOU) reduction,

• Target classification with criteria matching, and

• Terminal routing.

775 www.lockheedmartin.com/us/products/LRASM.html, last visited December 15, 2017. 776 Id.

When LRASM positively identifies a target matching its criteria, it will descend to terminal altitude as an enhanced survivability technique. LRASM will then identify the desired mean point of impact (DMPI) on the ship, which will most often be the central control unit. The program's intent is to develop a next-generation missile capable of autonomously detecting and identifying targets.

Functioning as the eyes and ears of the LRASM missile, British Aerospace (BAE) Systems' advanced long-range sensor is designed to enable the attack of targets within a group of enemy ships protected by sophisticated enemy air defence systems.777 When it comes to targeting capability, LRASM brings autonomy to a degree that does not yet exist; as a result, the exact guidance and seeker technology is kept secret.778 The question, for the purpose of this study, is then: what level of autonomy is used, and how much meaningful human control is retained by an operator? When it comes to LRASM, humans program the target criteria and launch the system at individual target areas, but the computer decides whether what the sensors are seeing is factually a legitimate target, and whether it should be attacked without further human intervention.779
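The division of labour just described, human-programmed criteria before launch and machine-made match decisions after it, can be illustrated with a deliberately simple sketch. Since LRASM's actual seeker logic is classified, every field name, threshold and function below is hypothetical; the point is only that "criteria matching" reduces to checking sensor-derived attributes against pre-launch parameters, with any failed check defaulting to no engagement.

```python
# Illustrative sketch only: LRASM's real seeker logic is classified.
# All criteria names and thresholds here are invented for this example.
from dataclasses import dataclass

@dataclass
class SensorContact:
    radar_length_m: float      # estimated hull length from the radar return
    emitter_match: bool        # do its emissions match the briefed ship class?
    inside_target_area: bool   # within the human-designated target area?

@dataclass
class TargetCriteria:
    min_length_m: float
    max_length_m: float
    require_emitter_match: bool

def matches_criteria(contact: SensorContact, criteria: TargetCriteria) -> bool:
    """Return True only if every human-programmed criterion is satisfied.
    A single failed criterion aborts the engagement (fail-safe default)."""
    if not contact.inside_target_area:
        return False
    if not (criteria.min_length_m <= contact.radar_length_m <= criteria.max_length_m):
        return False
    if criteria.require_emitter_match and not contact.emitter_match:
        return False
    return True

# Humans set the criteria before launch; the machine applies them in flight.
criteria = TargetCriteria(min_length_m=120.0, max_length_m=200.0,
                          require_emitter_match=True)
contact = SensorContact(radar_length_m=154.0, emitter_match=True,
                        inside_target_area=True)
print(matches_criteria(contact, criteria))  # True: all criteria satisfied
```

The design choice worth noting is the fail-safe default: uncertainty at any step yields "no attack", which is the direction a legal review would presumably require.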

777 See www.baesystems.com/en-us/lrasm-long-range-sensor, last visited December 15, 2017. 778 KRIS OSBORN, 'Navy LRASM Missile Destroys Enemy Targets Semi-Autonomously; Lockheed Tests Ship-Fired Variant', (19 January 2016) www.scout.com/military/warrior/story/1634058-navy-weapon-destroys-target-semi-autonomously, last visited December 15, 2017. 779 MARK GUBRUD, 'Killer Robots and Laser-Guided Bombs: A reply to Horowitz & Scharre', (4 December 2014) available at: http://gubrud.net/?p=398, last visited December 15, 2017.

3. Unmanned Ground Systems (UGS)

Source: www.unmannedsystemstechnology.com/2016/06/northrop-grumman- and-u-s-navy-complete-cdr-for-eod-robotic-system/

Unmanned Ground Systems need to manoeuvre in a broad variety of environments. To successfully master different environments, military applications rely more and more on the parallel development of sensors and avoidance algorithms created in the civilian sphere. Both collision avoidance systems tailored for UAS and innovation in autonomous driving represent a driving force for dual-use capabilities being leveraged for UGS.780 The auto industry has a much greater ability to apply research and development funding, with an initial focus on improving safety, and many of these technological developments will help make UGS safer as well. The Advanced Explosive Ordnance Disposal Robotic System (AEODRS) program developed by the Northrop Grumman Corporation for the US Navy is a system designed for EOD reconnaissance and threat assessment.781

780 DOD, 'Unmanned Systems Integrated Roadmap FY2013-2038, Reference Number: 14-S-0553', (2013) www.defense.gov/pubs/DOD-USRM-2013.pdf, 88, last visited December 15, 2017.

The Israeli Defence Forces (IDF) use the AvantGuard unmanned ground combat vehicle (UGCV), developed by G-NIUS Unmanned Ground Systems in Israel in 2010. AvantGuard is designed to manoeuvre in harsh environments and counter improvised explosive devices (IEDs). The system can be deployed semi-autonomously, in remote-control mode, and autonomously. It can also follow a soldier or a vehicle in follow-me mode.782

Boston Dynamics develops systems for DARPA, the US Army, Navy and Marine Corps, such as the quadrupedal BigDog robot, the bipedal humanoid robot Atlas, the four-footed robot Cheetah, and Handle, a robot combining wheels and legs.783 Handle's sophistication has been described as "superior to anything you'll find in nature" and called an "evolutionary marvel."784 However, Boston Dynamics' robots are not weaponized yet.

781 SUDI BRUNI, 'Northrop Grumman, US Navy Complete Critical Design Review for Bomb Disposal Robot Program', (7 June 2016) http://investor.northropgrumman.com/phoenix.zhtml?c=112386&p=irol-newsArticle_pf&ID=2175739, last visited December 15, 2017. 782 www.army-technology.com/projects/avantguardunmannedgr/, last visited December 15, 2017. 783 www.bostondynamics.com, last visited December 15, 2017. 784 MATT SIMON, 'Boston Dynamics’ New Rolling, Leaping Robot Is an Evolutionary Marvel', (3 January 2017) www.wired.com/2017/03/boston-dynamics-new-rolling-leaping-robot-evolutionary-marvel/, last visited December 15, 2017.

4. Unmanned Undersea Systems (UUS)

Source: www.navaldrones.com/LDUUV-INP.html

Unmanned undersea systems are a quickly growing field too. The US alone will invest "$600 million over the next five years in variable size and variable payload unmanned undersea vehicles – a new capability you'll be seeing a lot more of", declared Secretary of Defense Ashton Carter in February 2016.785 One of the systems currently under development under the auspices of the Navy's Unmanned Maritime Systems program office and the Naval Undersea Warfare Centre (NUWC) is the Large Displacement Unmanned Undersea Vehicle (LDUUV). This UUV could be used for acoustic surveillance and mine countermeasures, but also for offensive operations as well as anti-submarine warfare.786

785 MARK POMERLEAU, 'DOD plans to invest $600M in unmanned underwater vehicles', (4 February 2016) https://defensesystems.com/articles/2016/02/04/dod-navy-uuv-investments.aspx, last visited December 15, 2017.

NUWC Division Newport is spearheading the LDUUV and is tasked with research, development, test and evaluation of submarines, autonomous underwater systems, and offensive and defensive undersea weapon systems.787

The Navy’s Unmanned Undersea Vehicle (UUV) Master Plan released in 2004 described the following as potential mission categories: intelligence, surveillance and reconnaissance, barrier patrol for defence and force protection, mine countermeasures, anti-submarine warfare, payload delivery and time critical strike.788

5. Unmanned Surface Systems (USS)

786 US NAVY PROGRAM EXECUTIVE OFFICE LITTORAL COMBAT SHIPS, 'Large Displacement Unmanned Underwater Vehicle Program Achieves Acquisition Milestone', (3 September 2015) NNS150903-22 www.navy.mil/submit/display.asp?story_id=90932, last visited December 15, 2017. 787 US NAVAL SEA SYSTEMS COMMAND OFFICE OF CORPORATE COMMUNICATIONS, 'Navy Holds Industry Day on Large Displacement Unmanned Undersea Vehicle Program', (6 October 2016) NNS161006-09 http://www.navy.mil/submit/display.asp?story_id=97063, last visited December 15, 2017. 788 US NAVY, 'The Navy Unmanned Undersea Vehicle (UUV) Master Plan', (9 November 2004) www.navy.mil/navydata/technology/uuvmp.pdf, 3, last visited December 15, 2017.

ACTUV: www.darpa.mil/program/anti-submarine-warfare-continuous-trail- unmanned-vessel

Anti-Submarine Warfare (ASW) will soon be conducted by unmanned systems like the so-called Sea Hunter, also known as the Anti-Submarine Warfare Continuous Trail Unmanned Vessel (ACTUV).

The ACTUV program has three primary goals:789

1. Operate without a human on board at any point in the operating cycle;

2. Enable independently deployed systems, capable of concluding missions spanning thousands of kilometres of range and months of endurance, under a sparse remote supervisory control model. This includes autonomous compliance with maritime laws and conventions for safe navigation, autonomous system management for operational reliability, and autonomous interactions with an intelligent adversary;

3. Employ non-conventional sensor technologies that achieve robust continuous track of the quietest submarine targets over their entire operating envelope.
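The "sparse remote supervisory control model" named in the second goal can be sketched in a few lines: the vessel never waits for an operator, but applies an operator command if one happens to have arrived, and otherwise defaults to autonomous behaviour. The mission-leg structure and all names below are invented for illustration, not DARPA's actual control architecture.

```python
# Hypothetical sketch of sparse remote supervisory control:
# autonomy is the default; human commands are applied opportunistically.
from collections import deque

def run_mission(legs, operator_messages):
    """legs: planned mission legs in order.
    operator_messages: queue of (leg, command) pairs sent by a remote supervisor."""
    log = []
    pending = deque(operator_messages)
    for leg in legs:
        # Sparse supervision: check whether a command has arrived for this
        # leg, but never block waiting for one.
        if pending and pending[0][0] == leg:
            _, command = pending.popleft()
            log.append((leg, f"operator:{command}"))
        else:
            # Default: autonomous operation, e.g. collision-regulation-
            # compliant navigation and self-managed system health.
            log.append((leg, "autonomous"))
    return log

log = run_mission(["transit", "search", "trail", "return"],
                  [("trail", "break-off")])
print(log)
```

The point of the sketch is the asymmetry: human input can redirect the mission, but its absence never halts it, which is exactly what months-long endurance with intermittent communications requires.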

DARPA signed a Memorandum of Agreement with the US Navy's ONR to jointly fund an extended test phase of an ACTUV prototype in 2014. Already in April 2016 the vessel was transitioned from a DARPA-led design and construction project to open-water testing conducted jointly with ONR. DARPA and ONR currently test the capabilities of the vessel and several innovative payloads, focusing on potential missions including submarine tracking and countermining activities. If these tests are successfully completed, ACTUV could transition to the US Navy by 2018.790

789 SCOTT LITTLEFIELD, 'DARPA Program Information: Anti-Submarine Warfare (ASW) Continuous Trail Unmanned Vessel (ACTUV)', (2016) www.darpa.mil/program/anti-submarine-warfare-continuous-trail-unmanned-vessel, last visited December 15, 2017.

However, the core platform and autonomy technologies are broadly extendable to underpin a wide range of missions and configurations for future unmanned naval vessels.791 This could, in the long run, also include attacking submarines autonomously.

6. Unmanned Aerial Systems (UAS)

X-47B: www.northropgrumman.com/Capabilities/X47BUCAS/Slideshow/slide3.jpg

The X-47B is an Unmanned Combat Air System (UCAS). The name itself reveals that it was designed as a strike fighter. Northrop Grumman has conducted flight testing with two X-47B aircraft. In 2013, these aircraft were used to demonstrate the first ever carrier-based launches and recoveries by an autonomous, low-observable relevant unmanned aircraft. In April 2015, the X-47B once again made aviation history by successfully conducting the first ever Autonomous Aerial Refuelling (AAR) of an unmanned aircraft.792 Experts hold that "an on-board computer [that] can pull off such a feat speaks volumes about the future of unmanned naval aviation."793

790 DEFENSE ADVANCED RESEARCH PROJECTS AGENCY, 'ACTUV Unmanned Vessel Helps TALONS Take Flight in Successful Joint Test', (20 October 2016) www.darpa.mil/news-events/2016-10-24, last visited December 15, 2017. 791 LITTLEFIELD, 'DARPA Program Information: Anti-Submarine Warfare (ASW) Continuous Trail Unmanned Vessel (ACTUV)', (2016) www.darpa.mil/program/anti-submarine-warfare-continuous-trail-unmanned-vessel, last visited December 15, 2017.

The X-47B, however, will not be pursued,794 but rather transitioned into the Unmanned Carrier Launched Airborne Surveillance and Strike (UCLASS) system, primarily a carrier-based unmanned aerial refuelling platform.795 Opponents of a weaponization of UCLASS in the US Congress have prevailed so far.796 However, congressional proponents of unmanned air power, such as Representative Randy Forbes, want UCLASS to become a true unmanned strike aircraft.797 This study can safely infer from the X-47B that the US Navy is already capable of operating an AWS with full autonomy modes. However, a fully weaponized AWS is not currently being pursued, although the system has made aviation history twice. The combination of the use of force and AWS remains a controversial topic. US President Donald Trump has declared that he wants to broaden CIA powers by granting the authority to conduct targeted killing operations by UAS not only to the Pentagon but also to the CIA.798 Although almost technically feasible, political backing for autonomous strike functions in the X-47B seems to be lacking.

792 www.northropgrumman.com/Capabilities/x47bucas/Pages/default.aspx, last visited December 15, 2017. 793 JAMES HOLMES, 'The Mighty X-47B: Is It Really Time for Retirement?', (6 May 2015) http://nationalinterest.org/feature/the-mighty-x-47b-it-really-time-retirement-12818, last visited December 15, 2017. 794 Id. at 1. 795 SAM LAGRONE, 'Pentagon to Navy: Convert UCLASS Program Into Unmanned Aerial Tanker, Accelerate F-35 Development, Buy More Super Hornets', (1 February 2016) https://news.usni.org/2016/02/01/pentagon-to-navy-convert-uclass-program-into-unmanned-aerial-tanker-accelerate-f-35-development-buy-more-super-hornets, last visited December 15, 2017. 796 DAVE MAJUMDAR, 'The War over UCLASS (And Future of Naval Power Projection) Continues', (30 September 2015) http://nationalinterest.org/blog/the-buzz/the-war-over-uclass-future-naval-power-projection-continues-13973, last visited December 15, 2017. 797 HOLMES, 'The Mighty X-47B: Is It Really Time for Retirement?', (6 May 2015) http://nationalinterest.org/feature/the-mighty-x-47b-it-really-time-retirement-12818, last visited December 15, 2017.

7. Swarm Technology

Source: https://defensesystems.com/pages/topics/~/media/GIG/Defense%20Systems/perdixinflight.jpg

Recent tests have shown that miniaturisation of warfare in the form of swarming UAS will soon enter the US Navy's arsenal. Both the Low-Cost UAS Swarming Technology (LOCUST) program and Perdix represent swarming UAVs that are designed to autonomously overwhelm adversaries.799 LOCUST enables completely autonomous collaborative synchronization and formation flight through information sharing between the individual UAS "in either defensive or offensive missions".800 ONR officials note that while LOCUST autonomy is cutting edge compared to remote-controlled UAS, "there will always be a human monitoring the mission, able to step in and take control as desired".801 Putting this assurance into practice, however, seems challenging.802

798 GORDON LUBOLD & SHANE HARRIS, 'Trump Broadens CIA Powers, Allows Deadly Drone Strikes', (13 March 2017) www.wsj.com/articles/trump-gave-cia-power-to-launch-drone-strikes-1489444374, last visited December 15, 2017.

The US Department of Defense's Strategic Capabilities Office (SCO) partnered with the Naval Air Systems Command, where they successfully conducted tests with the world's largest micro-drone swarms in California in October 2016. 103 Perdix micro-drones were launched from F/A-18 Super Hornets and demonstrated advanced swarm behaviours such as collective decision-making, adaptive formation flying, and self-healing.803

799 US OFFICE OF NAVAL RESEARCH, 'LOCUST: Autonomous, swarming UAVs fly into the future', (14 April 2015) www.onr.navy.mil/Media-Center/Press-Releases/2015/LOCUST-low-cost-UAV-swarm-ONR.aspx, last visited December 15, 2017. 800 Id. at 1. 801 Id. at 1. 802 Id. at 1. 803 US DEPARTMENT OF DEFENSE, 'Successful Micro-Drone Demonstration - Press Release No: NR-008-17', (9 January 2017) www.defense.gov/News/News-Releases/News-Release-View/Article/1044811/department-of-defense-announces-successful-micro-drone-demonstration, last visited December 15, 2017.

According to SCO Director William Roper, the complex nature of combat requires that Perdix not be pre-programmed synchronized individuals but that they become "a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature."804 To achieve this, the swarm has no leader and can adapt to changing circumstances by communicating and collaborating with every other Perdix. The military rationale behind this development is the following: the Pentagon demonstrates that it can rely on teams of small, inexpensive, autonomous systems to conduct the same missions that once required large and expensive material and personnel. SCO Director Roper, however, stresses that the Navy's future battle network will use machines and autonomous systems to empower humans to make better decisions faster while still being "in the loop".805 How operators will remain "in the loop" when swarms of more than 100 individual drones perform tasks at speeds of Mach 0.6 remains to be proven.
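The leaderless, "distributed brain" behaviour Roper describes is, at its core, a consensus protocol: each agent repeatedly moves toward the average of its peers, and a common decision emerges with no agent in charge. The toy sketch below, which is emphatically not Perdix's actual (unpublished) algorithm, illustrates the principle on a simple scalar state.

```python
# Toy sketch of leaderless consensus: no agent is special, yet a common
# value emerges. Real swarm algorithms are far more elaborate (and would
# average angles circularly); this only illustrates the principle.
def consensus_step(values):
    """Each agent moves halfway toward the current swarm mean."""
    mean = sum(values) / len(values)
    return [v + 0.5 * (mean - v) for v in values]

headings = [0.0, 90.0, 180.0, 270.0]  # initially scattered scalar headings
for _ in range(20):
    headings = consensus_step(headings)
print(headings)  # all four values are now within ~1e-4 of the mean, 135.0
```

Note that the mean itself never changes; every agent simply converges on it. This is also why the supervision question in the text is hard: there is no single "leader" node at which a human could intervene.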

In addition, the Office of Naval Research successfully demonstrated the Control Architecture for Robotic Agent Command and Sensing (CARACaS) system in August 2014, when 13 unmanned patrol boats equipped with the technology worked together as a unit in both autonomous and remote mode.806 These boats can be outfitted with payloads ranging from non-lethal to lethal. Such swarm boats could patrol harbours, escort ships, or overwhelm adversaries.

804 Id. at 1. 805 Id. at 1.

Each of the above systems triggers the threshold of a new weapon system. Hence, they need to undergo a legal review before deployment, which will be analysed in detail further below. Looking at examples of existing systems provides crucial insight. Weapon systems are not robots like in Hollywood movies, but they do enjoy significant, and varying, independence from human control. Their ability to make autonomous targeting decisions poses real challenges for those charged with understanding those capabilities and evaluating their compliance with international law. The resources available to the lawyers responsible for reviews are limited in both number and analytical depth. They include DoD Directive 3000.09, the legal review of new weapons under customary international law, and consultative opportunities within the US government.

For this study, only weaponized systems trigger the threshold of creating challenges for international humanitarian law through their autonomous targeting and attacking potential. Both offensive and defensive systems are considered because there is no established right of returning fire to protect equipment under the paradigm of individual self-defence.807

806 KEVIN MCCANEY, 'Navy puts autonomous 'swarmboats' into action', (5 October 2014) https://defensesystems.com/articles/2014/10/05/onr-navy-autonomous-swarm-boats.aspx, last visited December 15, 2017.

C. Lawfulness Depends on Operating Environment

The operational environment is particularly crucial when it comes to AWS. As described above, reviews should be linked to technical capabilities, which are critical for determining system performance flexibilities, such as the appropriate levels of automation, manoeuvrability and communication options needed to accomplish the mission.808 Hence, specific aspects of the envisaged physical operating environments should be included in the review process. Testing and certification will be crucial for the proposed review system in order to effectively restrict AWS.

AWS programmed to identify and attack humans in urban areas are currently hazardous because they cannot be directed against combatants only. Deploying AWS to attack submarines or warships in areas where no civilian vessels will be encountered, by contrast, could be lawful. The technology to reliably identify certain key categories of military objectives, such as submarines, warships, or combat aircraft, already exists (see above IV). The same is true for technology that helps ensure compliance with the proportionality principle, Article 51 V (b) of AP I,809 by identifying that humans might be present in certain target areas of an anti-radar AWS and, as a result, refraining from engaging.810

807 BACKSTROM & HENDERSON, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 498. 808 DOD, 'Unmanned Systems Integrated Roadmap FY2013-2038, Reference Number: 14-S-0553', (2013) www.defense.gov/pubs/DOD-USRM-2013.pdf, vi, last visited December 15, 2017.

Some even assume that, in the long run, AWS can enhance the military's ability to comply with IHL. Humans, for example, may be less able to deal with the fog, friction, and fear of combat than advanced sensors.811 In contrast, AWS might be able to outperform humans in accurately identifying targets, assessing potential collateral damage, and suspending or cancelling an attack at the last minute.812

This supports an individual assessment of each system, taking the particular situation into consideration, to decide whether AWS are more or less capable of complying with IHL than humans. The use of AWS would be unlawful if another weapon system exists that can feasibly cause less harm to civilians while yielding a similar military advantage. The proportionality principle prohibits an attack if the expected collateral damage would be excessive relative to the anticipated military advantage. With regard to AWS, it is important to note that the military advantage of any particular strike is situational. The situation on the battlefield can change rapidly and dramatically. This speaks for very narrow deployment scenarios, such as programming AWS to engage only at no or low levels of collateral damage and within a relatively short time frame. Otherwise, the more time elapses after deployment and the greater the distance, the less AWS will be able to meet the proportionality principle, because the situation might have changed since the launch by the human operator.813 It should be possible to model the effects of AWS in different environments in order to determine lawful contexts and scenarios.814 Lastly, potential proliferation is a legitimate concern. However, almost every weapon can be used in an unlawful manner. Whether this should prevent states from planning to use AWS in accordance with their legal obligations is questionable. The US will most probably not restrict itself, since other states, and even more so non-state actors like terrorists,815 would most probably not abide by IHL.

809 www.icrc.org/customary-ihl/eng/docs/v1_rul_rule14, last visited December 15, 2017. 810 SCHMITT, 'Regulating Autonomous Weapons Might be Smarter Than Banning Them', 3. 811 Id. at 3. 812 Id. at 3.

The Israeli delegation agrees with the assumption that context is crucial in the determination of the lawfulness of a potential AWS deployment when stating in the CCW that:

“In this regard, the context – referring to the specific system and the specific scenario of use – is of utmost importance. The characteristics and capabilities of each system must be adapted to the complexity of its intended environment of use. Where deemed necessary, the system's operation would be limited by, for example, restricting the system's operation to a specific perimeter, during a limited timeframe, against specific types of targets, to conduct specific kinds of tasks, or other such limitations which are all set by a human. Likewise, for example, if necessary, a system could be programmed to refrain from action when facing complexities it cannot resolve.”816

813 Id. at 4. 814 FRY, 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', (2006) 44 Columbia Journal of Transnational Law, 480. 815 THE WHITE HOUSE - OFFICE OF THE PRESS SECRETARY, 'Fact Sheet: The 2015 National Security Strategy', (6 February 2015) www.whitehouse.gov/the-press-office/2015/02/06/fact-sheet-2015-national-security-strategy, last visited December 15, 2017.

Tasks that have an objectively correct outcome and occur in controlled or predictable situations may be delegated to AWS.817 In addition, object identification and recognition are fields that may lawfully be delegated to AWS. On the other hand, operations that depend heavily on context, or occur in uncontrolled and unpredictable surroundings, do not seem suited to AWS. For example, value-based decisions, like whether a person is in fact a combatant, may depend on the context or even on actions (hors de combat).818 The same is true for proportionality considerations, where ethical and moral judgment might be necessary.819

816 ISRAEL, 'Statement at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (17 April 2015) www.unog.ch/80256EDD006B8954/(httpAssets)/AB30BF0E02AA39EAC1257E29004769F3/$file/2015_LAWS_MX_Israel_characteristics.pdf, last visited December 15, 2017. 817 See THOMAS BURRI, 'Machine Learning and the Law: 5 Theses', (3 January 2017) https://ssrn.com/abstract=2927625, 4, who holds that "law- and policymakers should keep an eye out for highly structured environments. This is where machine learning will likely be applied in the near future", last visited December 15, 2017. 818 www.icrc.org/customary-ihl/eng/docs/v1_rul_rule47, last visited December 15, 2017. 819 CENTER FOR NEW AMERICAN SECURITY, 'Autonomous Weapons and Human Control', (April 2016) Ethical Autonomy Project, www.cnas.org/autonomous-weapons-and-human-control#.VwzvdcchuOp, 7, last visited December 15, 2017.

This leads this study to infer that the following contexts will remain tied to human judgment and that, hence, the use of AWS would be less appropriate or indeed arguably illegal:820

• Civilians are present that could suffer unintended damages;

• Military necessity is outweighed by the expected harm caused to civilians;

• Unclear characteristics and conditions of the environment;

• Unsatisfactory weapon safety characteristics, capabilities, and limitations.
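These four criteria can be read as a machine-checkable pre-deployment gate. The sketch below restates them as a single boolean test; the field names and, in particular, the idea of scoring military necessity and expected harm on one commensurable scale are simplifications introduced here purely for illustration, not part of the cited non-paper.

```python
# Hedged sketch: the four review criteria restated as a deployment gate.
# All field names and the scoring approach are illustrative inventions.
from dataclasses import dataclass

@dataclass
class DeploymentContext:
    civilians_possibly_present: bool
    expected_civilian_harm: float    # crude scalar stand-in for harm
    military_necessity: float        # crude scalar stand-in for necessity
    environment_characterised: bool  # are the area's conditions understood?
    safety_verified: bool            # weapon safety characteristics validated?

def requires_human_judgment(ctx: DeploymentContext) -> bool:
    """True if any of the four listed contexts applies, i.e. AWS use
    would be less appropriate or arguably illegal."""
    return (ctx.civilians_possibly_present
            or ctx.expected_civilian_harm > ctx.military_necessity
            or not ctx.environment_characterised
            or not ctx.safety_verified)

underwater = DeploymentContext(False, 0.0, 1.0, True, True)
urban = DeploymentContext(True, 3.0, 1.0, True, True)
print(requires_human_judgment(underwater))  # False: AWS use conceivable
print(requires_human_judgment(urban))       # True: tied to human judgment
```

The deliberate design choice is the disjunction: a single criterion suffices to pull the decision back to a human, mirroring the fail-safe logic a legal review would demand.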

Hence, legal reviews need to determine the sophistication of the on-board control unit working with the specific weapon mounted, as well as the sensor technology, data processing and programming. To conclude, certain deployment scenarios can be cleared or licensed during the legal review, such as an environment (underwater) where civilians may not be present, so that a proportionality assessment is unlikely to be triggered. In addition, at present, states have declared that they will keep humans at least on the loop, supervising the correct functioning of AWS. Should legally reviewed AWS later be used in a human-out-of-the-loop mode, this would trigger another legal review.821

820 FRENCH DELEGATION, 'Non paper: Legal framework for any potential development and operational use of a future LAWS', 2. 821 LAWAND, COUPLAND & HERBY, 'A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977', (2006) International Committee of the Red Cross, 24.

D. No Drone Zone: Lessons Learned from Geo-Fencing

The US Federal Aviation Administration already regulates private drones.822 Geo-fencing is a technology that defines a virtual boundary around a real-world geographical area.823 This technology keeps drones out of, or within the bounds of, a pre-defined area, or allows a device to perform on-board mapping. The contexts relevant to the analysis of AWS are geo-fencing technologies used with localized firearms, which restrict firing to locations where it is permitted. In addition, some drone manufacturers build geo-fencing constraints into UAS navigation systems. Actions resulting from a violation of the geo-fenced area may come in the form of a notification to a pre-programmed entity, a warning to the user or operator, or preventing a device from entering a protected space, thereby making the device unusable other than in the pre-defined area. In the context of AWS, the geo-fence could be pre-determined during the legal review of a new weapon. Lawful scenarios could then include underwater anti-submarine warfare. However, depending on the sophistication of systems, urban areas where civilians may be encountered could be labelled as “no go areas” for AWS.
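Technically, the core of such a geo-fence is a point-in-polygon test. A standard ray-casting implementation, with purely illustrative coordinates and names, might look as follows; an AWS could run such a check before every weapon release, refusing to fire outside the area cleared during the legal review.

```python
# Sketch of a geo-fence gate: a classic ray-casting point-in-polygon test.
# Coordinates and the engagement-box scenario are illustrative only.
def inside_geofence(point, polygon):
    """Ray casting: a point is inside iff a horizontal ray from it
    crosses the polygon boundary an odd number of times."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            # x-coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A rectangular cleared area, e.g. an underwater engagement box.
fence = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)]
print(inside_geofence((4.0, 2.0), fence))   # True: release permitted
print(inside_geofence((12.0, 2.0), fence))  # False: outside the fence
```

In the review scheme proposed here, the polygon itself, not just the algorithm, would be part of what the Article 36 review clears.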

822 CECILIA KANG, 'Drone Registration Rules Are Announced by F.A.A.', (14 December 2015) https://mobile.nytimes.com/2015/12/15/technology/drone-registration-rules-are-announced-by-faa.html?smid=fb-nytimes&smtyp=cur&_r=0&referer=, last visited December 15, 2017; THOMAS BURRI, SHAWN BAYERN, THOMAS D. GRANT, DANIEL M. HÄUSERMANN, FLORIAN MÖSLEIN & RICHARD WILLIAMS, 'Company Law and Autonomous Systems: A Blueprint for Lawyers, Entrepreneurs, and Regulators', (2017) 9 Hastings Science and Technology Law Journal, 1-22, have recently suggested private company law as a means to regulate and provide functional and adaptive legal "housing" for autonomous systems. 823 See www.techopedia.com/definition/14937/geofencing, last visited December 15, 2017.

E. Autonomous Driving Systems

The California Department of Motor Vehicles frequently grants permits to test self-driving vehicle technology on California's public roads. Pursuant to California Vehicle Code Section 38750, a manufacturer of autonomous technology may apply to the Department of Motor Vehicles and be approved to test a vehicle in autonomous technology mode on public roads.824 Among the companies testing autonomous vehicles in California are Google, Volkswagen Group of America, Mercedes-Benz, Tesla, Ford, and GM.825

This study asks which steps need to be taken in order to receive a test license for an autonomous vehicle on public roads.826 Car manufacturers usually start by building test cars and training them in parking lots and on private roads prior to receiving approval from the California DMV.827 The same safe testing environments should be ensured for AWS before they can be released on any battlefield.

F. Limitations upon operational assignments for AWS

The question arises as to whether executive human oversight or veto power is necessary to comply with IHL until technical limitations are overcome. If restricting AWS' use could ensure its legality, one potential lawful situation would suffice to permit a lawful use.828 These particular environments would probably be underwater or in the air – that is, anti-object defensive systems with full pre-programming or restrictions to self-defence.829

824 See www.dmv.ca.gov/portal/dmv/detail/vehindustry/ol/auton_veh_tester, last visited December 15, 2017. 825 See www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testing, last visited December 15, 2017. 826 For an overview of liability issues concerning self-driving vehicles see: MELINDA FLORINA LOHMANN, Automatisierte Fahrzeuge im Lichte des Schweizer Zulassungs- und Haftungsrechts (Nomos Verlag, 2016). 827 SEAN O'KANE, 'California gives Nvidia the go-ahead to test self-driving cars on public roads', (9 December 2016) www.theverge.com/2016/12/9/13902704/california-dmv-permit-nvidia-autonomous-car-testing, last visited December 15, 2017.

A proposed kill-box,830 in which systems could attack without distinction, is most probably neither practical nor lawful. AWS, however, could monitor no-fly zones or no-sail zones, especially in times of conflict. Human operators could ensure that no civilian airplane or ship would find its way into such zones by monitoring their outer bounds. This could be similar to test areas for submarines, which ensure that civilians will not be harmed. Defensive contexts where no civilians are encountered appear lawful. Such environments are naval encounters on the high seas against other machines. Today's systems in which human operators activate, monitor, and override the system include the US Patriot and Phalanx anti-missile systems and Israel's Iron Dome, in which incoming missiles are destroyed in the air.831 Identification in very narrow circumstances is possible with the help of visual recognition software, as in the Samsung SGR-A1, where potential enemies can raise their hands or wave a white flag to surrender.832

828 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 265. 829 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 257; BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 259; BENJAMIN KASTAN, 'Autonomous Weapons Systems: A Coming Legal Singularity?', (2013) 45 University of Illinois Journal of Law, Technology & Policy, 60. 830 ARKIN, Governing lethal behavior in autonomous robots, 147. 831 BOB SIMON, 'Will Israel’s “Iron Dome” help bring peace?', (17 February 2013) www.cbsnews.com/news/will-israels-iron-dome-help-bring-peace/, last visited December 15, 2017.

Many consider air-to-air warfare over open water, performed by autonomous systems, to be the most likely future scenario.833

With the ever-increasing speed of warfare, fighter jets might become too fast for humans to operate. Tank warfare in uninhabited deserts is a likely operation, as is combat on the high seas between battleships and submarines, or ship-based anti-missile defence applications. Again, in these scenarios, civilians are probably not present. Machine-to-machine warfare is also a possible scenario in which civilian harm can be limited. Already today, human targeting officers use sophisticated software prior to an attack to determine the estimated collateral damage. Although this is no substitute for the actual weighing process, it could probably be programmed into AWS. With this, an AWS could proceed in a manner similar to that currently used by the US military in Afghanistan, where the so-called “Collateral Damage Estimation Methodology” (CDEM) is used to conduct a proportionality analysis; strikes with potential civilian casualties must be approved by a higher authority.834

Most of the environments in which AWS could lawfully operate do not involve civilian objects or persons, because AWS cannot yet properly abide by the rules of

832 ARKIN, Governing lethal behavior in autonomous robots, 169. 833 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 401. 834 GEOFFREY S. CORN, VICTOR HANSEN, RICHARD JACKSON, CHRISTOPHER JENKS, ERIC TALBOT JENSEN & JAMES A. SCHOETTLER, The Law of Armed Conflict: An Operational Approach (Wolters Kluwer, 2012), 194. distinction, proportionality or precaution. Nevertheless, progress in automation, sensor and recognition technology may allow for a broader variety of scenarios in the future, under the strict condition that AWS can comply with the applicable obligations mentioned above. Legal reviews need to keep up with technological advancement by considering different levels of autonomy and deployment possibilities, with a potential follow-up mechanism after the initial appraisal.

Limitations on autonomy should include that AWS not be operated over long periods of time, not be used across wide areas, and only rely on narrow proxy indicators in uncluttered environments.

The law applicable to AWS has been laid out, as have different lawful scenarios.

In the following chapter, a potential introduction cycle and different targeting models shall be applied to those scenarios.

1. Three-Step-Approach for AWS’ Introduction

The proposed three-step approach to introducing AWS835 first presents a pre-programmed robot looking for signatures, such as enemy weapons, before alerting human operators. The latter would then decide how to proceed. In a second step, humans could veto a machine-initiated attack. Lastly, AWS could be designed to

835 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 389. target and fire autonomously. They would hold the attack and ask for human authorization in case they expect potential collateral damage above a certain pre-programmed level. However, the authors acknowledge that there is not yet – nor will there soon be – a technological possibility for AWS to assess civilian status or estimate harm as part of their own independent targeting decisions. AWS software would need to translate quantitative and qualitative decision-making criteria for distinction and proportionality into machine-readable code. Hence, technological limitations do not yet allow for this kind of proposal.
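The escalating levels of human involvement in the three-step approach can be illustrated schematically. The following sketch is purely conceptual: the mode names, the `decide` function, and the collateral-damage inputs are hypothetical stand-ins (the text itself notes that machine assessment of collateral damage is not yet technologically possible), not a description of any fielded system.

```python
from enum import Enum

class Mode(Enum):
    """Hypothetical stages of the proposed three-step introduction."""
    ALERT_ONLY = 1   # step 1: detect signatures, alert a human operator
    HUMAN_VETO = 2   # step 2: machine initiates an attack, a human may veto
    AUTONOMOUS = 3   # step 3: autonomous, bounded by a collateral-damage level

def decide(mode, expected_collateral=0.0, threshold=0.0, human_input=None):
    """Return 'alert', 'hold' or 'fire' under the three-step scheme.

    'expected_collateral' stands in for an assessment the text notes
    is not yet technologically feasible; all names are illustrative.
    """
    if mode is Mode.ALERT_ONLY:
        return "alert"  # the human operator decides how to proceed
    if mode is Mode.HUMAN_VETO:
        # machine-initiated attack proceeds unless a human vetoes it
        return "hold" if human_input == "veto" else "fire"
    # step 3: hold and ask for authorization above the pre-programmed level
    if expected_collateral > threshold:
        return "fire" if human_input == "authorize" else "hold"
    return "fire"
```

The sketch makes the shift in default behaviour visible: in step 1 the machine never acts, in step 2 it acts unless stopped, and in step 3 it acts within a pre-programmed envelope and escalates to a human only at the boundary.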

2. Deliberate targeting

In this proposal, target and attack modalities would be pre-programmed by human operators, and AWS would act solely according to the mission plan drawn up in advance by human military commanders.836

Distinction requires the ability to identify targets even after activation. However, target objects and target persons must be approached differently. Static or immobile objects could be identified solely by objective criteria such as GPS positioning.

Dynamic or mobile targets would need unique criteria or signatures such as shape, size, speed, heat, radar or infrared signals.837 Due to a potential change of status through direct participation in hostilities or becoming hors de combat, persons cannot be pre-programmed as targets in compliance with Articles 51 II 1 and 52 I 1

836 KASTAN, 'Autonomous Weapons Systems: A Coming Legal Singularity?', (2013) 45 University of Illinois Journal of Law, Technology & Policy, 57-58. 837 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 271.

AP I. Articles 50 I 2 and 52 III AP I require that technical applications (sensors) ensure that AWS only open fire if the target can undoubtedly be identified as military. Proportionality contains an “obligation of conduct” with an ex-ante analysis of the moment in which the decision to attack was actually taken. This can be satisfied by an assessment prior to the actual attack, since there should be no difference between human soldiers and AWS.838 The precautions in attack enshrined in Article 57 II a (ii) AP I contain a feasibility test. This means that AWS must be replaced by another choice of weapon, one that is less harmful to civilians but still as effective.839 This provision will most likely minimize the possibility of using AWS overall, especially if civilians are around.

Again, in environments without civilian presence, AWS can be the weapon of choice.840 The provision on terminating attacks put forward in Article 57 II b AP I, the so-called “apparent” test, demands an assessment of potential collateral damage by a military commander. The duty to warn civilians of an attack, enshrined in

Article 57 II (e) AP I, must be discharged in the planning stage and is technologically feasible for AWS.841 Article 57 III AP I requires that precaution be applied to the choice of targets. For deliberate targeting, however, this is performed by human military commanders in the planning phase before the actual attack. Deliberate targeting allows for AWS because the system does not

838 Id. at 274. 839 HERBACH, 'Into the Caves of Steel: Precaution, Cognition and Robotic Weapon Systems under the International Law of Armed Conflict', (2012) 4 Amsterdam Law Forum, 5-8. 840 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 246. 841 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 277. choose its targets in this alternative. Critics of deliberate targeting, however, claim that this holds true only for target objects that are identified solely by objective parameters, such as tanks. In fact, this alternative lacks any situational awareness of a changing environment.842

3. Dynamic targeting

In this alternative, AWS conduct targeting and attack decisions autonomously.843

Human input or control takes place in the programming phase prior to attacks.

The distinction principle demands restricting potential targets in the programming. Due to a lack of qualitative analytical competence, military objects must be identifiable solely by objective criteria. As long as AWS cannot distinguish between combatants and civilians, AWS must wait to be shot at

(“conduct-based targeting”)844 or fire only where valid identification as a military target is possible. As proportionality requires weighing potential civilian casualties against military advantage, a safeguard could be to treat every human present in a potential attack area, and every object that does not fall within a pre-programmed military category, as “civilian” so that the AWS holds its fire. This safeguard is possible because technology already allows for the identification of humans.845 Exceptions could be made for active attacks against AWS or friendly forces. An assessment conducted prior to a specific

842 Id. at 278. 843 KASTAN, 'Autonomous Weapons Systems: A Coming Legal Singularity?', (2013) 45 University of Illinois Journal of Law, Technology & Policy, 57. 844 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 266. 845 PETER W. SINGER, 'Der ferngesteuerte Krieg', (2010) Spektrum der Wissenschaft, 79. attack by a military commander could meet the standard of military necessity enshrined in Article 49 AP I, because the rule encompasses a notion of “attack” as a deployment of AWS rather than every single strike.846 This way, AWS do not have to assess military advantage autonomously. Pre-programming parameters for a proportionality assessment could include setting a disproportionately high maximum limit prior to attacks, thereby ensuring compliance by over-fulfilling IHL requirements. Moreover, resort to AWS could be restricted to cases with a potentially high military advantage, in which military commanders have a great margin of appreciation of what is considered “excessive.” The precautionary principle does not require the pre-planning of every detail, but it must be ensured that only military targets will be attacked.847 The feasibility principle of Article 57 II a (ii) AP I demands resorting to another choice of weapon in case of civilian presence. Here, the probability of human presence and the margin of error known to the military commander must be considered prior to AWS’ activation. Article 57 III AP I demands choosing the “lesser of two evils” among weapon systems.848 Potential civilian presence still remains the crux of the assessment. In case the mission planning requires destroying all military

846 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 282. 847 Id. at 275. 848 SANDOZ, SWINARSKI & ZIMMERMANN, Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, 686. targets, the norm is practically unenforceable. However, “dynamic targeting” generally aims at destroying all military targets at the same time.849
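The default-to-civilian safeguard for dynamic targeting, together with the exception for active attacks against AWS or friendly forces, can be sketched as follows. Every category name, dictionary key, and the `engage` function itself are hypothetical illustrations of the logic described above, not elements of any real system.

```python
def engage(target, under_attack=False):
    """Illustrative default-to-civilian safeguard for 'dynamic targeting'.

    Anything not positively matching a pre-programmed military category,
    and any detected human presence, results in holding fire. The proxy
    categories and keys below are assumptions made for illustration only.
    """
    military_signatures = {"tank", "radar_station", "warship"}  # assumed proxies
    if target.get("human_detected") and not under_attack:
        return "hold"  # humans present are treated as civilian by default
    if target.get("category") in military_signatures:
        return "fire"  # positively identified by objective criteria
    return "hold"      # default: civilian unless identification succeeds
```

The design choice the text describes is visible in the final line: the system over-fulfils IHL requirements by treating identification failure as civilian presence rather than as permission to engage.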

4. IHL compliance of AWS

To sum up, AWS can comply with the legal requirements set out by international humanitarian law, at least in some situations.850 The use of AWS does not differ from the use of conventional weapons insofar as questions of legality have to be considered with a view to actual deployment in specific environments. The possibility of IHL compliance for the use of AWS, in cases of deliberate and dynamic targeting, is a question of technological abilities in different circumstances relative to the legal standard rather than one of principle.851

Restrictions upon AWS’ use may result in very narrow deployment scenarios.

Proper testing and regular Article 36 AP I legal reviews could help shape those scenarios.

G. Preliminary Conclusion: Restrictions relevant to the Context

The use of AWS, with regard to IHL, is not per se unlawful. Restrictions on their use, however, apply. Technological limitations can be overcome by strict restrictions. This results in lawful deployment only in battlefield environments in which AWS are able to identify lawful targets. Due to the lack of technology capable of abiding by the principle of distinction,

849 BORRMANN, Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts, 285. 850 SCHMITT & THURNHER, 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', 279-280. 851 ANDERSON, REISNER & WAXMAN, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', 406. these scenarios will almost never include human presence in potential attack areas. If operational personnel accept these provisions, an additional implementation of human intervention after activation is not necessary. The biggest challenge remains the formulation of robust standards under Article 36

AP I that could be applied to future systems. Regulations should be implemented on a case-by-case basis, which would allow for periodic reviews of new military technologies. This should result in a “standards of care” principle that specifies and delimits the lawful contexts and purposes for AWS deployment. By doing so, unconventional uses (outside established legal paradigms) could be avoided. A clear assignment of responsibility for deployment and supervision could also be attached to these standards of care.

The delegation of the United Kingdom agreed, stating at the 2016 CCW

Expert Meeting that the legal weapons review was “a process which has been developed exactly for circumstances such as where the legality of new and novel weapons technologies needs to be thoroughly assessed and understood”.852 The following statement made during the CCW meetings aptly summarizes the AWS debate, positing that “there was a broad consensus among the experts…that the

852 UNITED KINGDOM OF GREAT BRITAIN AND NORTHERN IRELAND, 'Statement to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016) www.unog.ch/80256EDD006B8954/(httpAssets)/49456EB7B5AC3769C1257F920057D1FE/$file/2016_LAWS+MX_GeneralExchange_Statements_United+Kingdom.pdf, para. 5, last visited December 15, 2017; Israel agreeing in general by stating that “LAWS should undergo legal review before they are deployed”: YARON, 'Statement by Israel to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016) www.unog.ch/80256EDD006B8954/(httpAssets)/A02C15B2E5B49AA1C1257F9B0029C454/$file/2016_LAWS_MX_GeneralDebate_Statements_Israel.pdf, 2, last visited December 15, 2017. international law of war does not necessarily pose an obstacle” but it is rather

“the ethical dimension, which may well pose the greatest problem for

LAWS”.853

H. AWS Update for the Legal Review

Current US military policies require that human operators make the decision of whether or not to launch an attack. The study of current prototypes has shown that AWS must retain sufficient human judgment and control when targeting or attacking functions are involved (LRASM). In addition, AWS with full autonomy are not yet – for policy rather than for technical reasons – empowered to target and attack objects or even humans (ACTUV, X-47B). Hence, it is suggested that they will be deployed in a manner and in an environment in which they can lawfully operate.

Further, the US has openly declared that it will pursue full autonomy. This would include selecting and attacking targets. For this to be conducted in a lawful manner, the existing review regulatory approach needs an update that would allow the reviewing team to fully comprehend the sophistication and capabilities of the sensor technology of the AWS under review. As stated in the introduction, papers in the field of autonomous systems have so far been based more on speculation than on informed assessments of the state of technology. That is why this proposal builds upon the findings of both the state of the art of current

853 SAUER, 'Autonomous Weapons Systems. Humanising or Dehumanising Warfare?', 3. technology and the legal analysis of these systems. The following conditions need to be taken into consideration for an AWS legal review to function effectively:

• Incorporate in its definition the findings and limitations from the legal analysis;

• Avoid technologically uninformed framework proposals;

• Advocate inclusive panels to add scientific expertise;

• Focus on a flexible process (guidelines) rather than an ad-hoc “solution,” into which later innovations can be channelled.

These factors are critical to ensure that ever more autonomous weapon systems will be covered and used within the realm of applicable law of armed conflict provisions.

As we have seen above, many countries’ militaries have already invested heavily in autonomy in warfare, resulting in costly but very effective unmanned prototypes. Limiting this development will be extremely difficult due to the investments made, but also given the likely advantages of more unmanned and autonomous systems. Unmanned systems have already taken over key functions on the battlefield. Lastly, the authors of any AWS regulatory proposal must be sensitive to the vast potential for civilian applications of autonomous systems, such as in the automobile industry or health care. At the same time, this is the main reason for a coordinated international plan regarding emerging AWS. There is still a window of opportunity, since fully autonomous AWS do not yet exist.

1. Goal: Maintaining Meaningful Human Control

A dialogue within the UN CCW community of states, in the form of the agreed-upon GGE on norms, is a first step towards overcoming the current deadlock. There is almost universal support for maintaining meaningful human control when using weapon systems for targeting or attacking functions.854 MHC would arguably be maintained when AWS are:855

• Easily recallable;

• Pre-programmed in a manner that narrowly limits engagement parameters; and

• Designed carefully to control the method by which they “learn.”

The main actors might accept mechanisms that can be flexible and responsive to a certain degree of human control. Key countries such as Israel, the UK, France,

Japan, and Germany support this notion of “meaningful human control.”856 The

US fundamentally agrees while favouring the term “appropriate levels of human judgment,” 857 and Russia proposes “human responsibility.” 858 More

854 ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', 4; Differences in terminology, such as “appropriate levels of human judgment” or “appropriate human involvement”, do still exist: SCHARRE, 'Lethal Autonomous Weapons and Policy-Making Amid Disruptive Technological Change', (14 November 2017) www.justsecurity.org/47082/lethal-autonomous-weapons-policy-making-disruptive-technological-change/, 4, last visited December 15, 2017. 855 SCHMITT, 'Regulating Autonomous Weapons Might be Smarter Than Banning Them', 6. 856 ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', 4. 857 MICHAEL W. MEIER, 'U.S. Delegation Opening Statement', (11 April 2016), 2. significantly, this study reveals the opportunity for establishing broader norms or regulatory guidance, given the involvement of key actors in the area. As a way forward in the discussions on AWS, strengthening the domestic weapons review process could be an interim step states could agree to. This would include the sharing of best legal, policy, operational, and technical practices.

If the maintenance of MHC is the ultimate goal, regulation – and not a ban – seems to be the right approach. This is illustrated by the different approaches taken towards anti-personnel landmines. While the Ottawa Convention859 outlaws their use, Protocol II to the CCW860 sets limits on the period mines remain active. By way of analogy, state parties could agree to impose limitations on AWS that reflect the concerns mentioned during the UN CCW expert meetings. Another analogy can be found in Protocol III to the CCW,861 which restricts the use of incendiaries against a military objective located within a concentration of civilians. The likelihood that states are willing to prohibit the use of AWS in urban areas, where they pose considerable danger to civilians, seems much higher than that of an outright ban.

858 ACHESON, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', 4. 859 See Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction, 18 September 1997, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=B587BB399470269441256585003BA277&action=openDocument, last visited December 15, 2017. 860 See Protocol (II) on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices. Geneva, 10 October 1980, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=6258BAB1CD31AD0EC12563CD002D6DC9&action=openDocument, last visited December 15, 2017. 861 See Protocol on Prohibitions or Restrictions on the Use of Incendiary Weapons (Protocol III). Geneva, 10 October 1980, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=1E37E38A51A1941DC12563CD002D6DEA&action=openDocument, last visited December 15, 2017.

Since proliferation of AWS is indeed a realistic concern, their sale or transfer could be limited, just as international trade in conventional weapons was recently regulated by the Arms Trade Treaty (ATT).862 Regulating by policy, through sharing lessons learned and best practices with regard to the Article 36

AP I legal reviews of new weapons systems is an urgent initial step. In the long run states should focus on common objectives, such as maintaining human involvement in the use of force.

2. First Steps: Domestic Legislation

The discussions in the UN CCW outlined above exemplify the current state of the debate. It remains up to state parties to decide on the best way forward. For the moment, legal reviews should be the guiding principle employed by all states in order to respect and implement this existing universal obligation most effectively. One way for the US administration to demonstrate its commitment to adhering to international rules could be to pass domestic legislation or to strengthen its national legal review mechanisms with regard to more autonomy in weapon systems. After having secured congressional approval, US representatives could coordinate with the main stakeholders from China, Russia, and Europe to streamline domestic policies towards AWS. This could become a forum for sharing lessons learned and best practices. For example, executive

862 UNITED NATIONS, 'Arms Trade Treaty', (24 December 2014) C.N.630.2014.TREATIES-XXVI.8, www.un.org/disarmament/convarms/att/, last visited December 15, 2017. governmental agreements do not need to be ratified. In addition, coordination between the US and Europe could convince Russia and China to cooperate as well in establishing guidelines for standardized legal reviews of AWS.

The standards applied could be:

• AWS have to retain meaningful human control when using lethal force;

• For any weapon system to be considered legal, it would have to demonstrate that it operates within the realm of the applicable law, in particular consistent with IHL;

• Certain critical functions (selecting and attacking targets) should be under meaningful human control;

• Consideration of the context (offensive or defensive) in which AWS will be deployed.

Furthermore, standard methods and protocols for testing AWS should be developed to ensure compliance across domestic legal review mechanisms for new weapons. AWS would then need to have an IHL and security clearance

(license) before they can lawfully be deployed. In the absence of legislation, these measures could be adopted as national policy, thus providing a stronger framework for legal reviews and establishing these structures as a broader de facto review process.


3. Long-Term: Regulatory Treaty

Multilateral treaties might have significant advantages. Feasibility, on the other hand, requires a balancing of treaty consensus and enforceability as well as compromises and limitations of multilateral treaties. 863 In fact, the more countries sign up to a treaty, the more individual states will be pushed to agree to a compromise. The most important fact that needs to be considered in this discussion is the differing national attitude and approach towards international treaties. While the US Senate’s stance towards international treaty regimes, such as the Anti-Ballistic Missile Treaty, the Comprehensive Test Ban Treaty, and the Biological Weapons Convention, is almost hostile (a specific form of

American exceptionalism), European countries consider international treaties crucial. In addition, Europeans do not usually consider the domestic implementation of international treaty negotiations an issue. In contrast, the US considers ratification an almost insurmountable obstacle. With regard to a regulatory treaty, the US delegation at the Fifth CCW Review Conference of 2016 declared that it aims for

“a non-legally binding outcome document that describes a comprehensive weapons review process, including the policy, technical, legal, and operational best practices that States could consider using if they decide to develop LAWS or any other weapon system that uses advanced technology”.864

863 For an overview of how the “legalization” of international regimes transforms world politics, including binding regimes and their implementation and enforcement see: CHRISTIAN BRUETSCH & DIRK LEHMKUHL, Law and Legalization in Transnational Relations (Routledge, 2007).

European countries must understand that the US Senate is not likely to ratify an international treaty covering AWS in the near future. In the long run, aligning all stakeholders at home to agree upon an international treaty regulating AWS by way of maintaining meaningful human control could be the ultimate goal. This can be achieved if states decide to pursue this road in order to clarify the use of

AWS in all circumstances in a binding instrument. An international instrument would at least guarantee that all actors apply the same standards. For example, this treaty could explicitly contain standard methods and protocols for testing

AWS.

VI. Conclusion

This study discussed a potential regulation of AWS. The proposed pre-emptive ban is unrealistic due to a lack of state support given the current state of technology.

The existing domestic legal review explicitly enshrined in Article 36 AP I offers a realistic means to act immediately: it is broad enough to cover emerging AWS, it relies on existing law, it does not need a whole new regulatory regime, and, lastly, it is not simply based on voluntary commitments. Based upon domestic reviews, lessons learned, and the sharing of best practices, states can ensure a broader distribution of established standards.

864 MEIER, 'U.S. Delegation Opening Statement', (11 April 2016), 3.

States that are not willing to pre-emptively ban what they do not yet fully comprehend may be willing to agree upon strengthening the existing system of legal reviews that they have agreed to or already implemented. They should focus on implementing context-based legal reviews of methods and means of warfare because Article

36 AP I offers the possibility to effectively restrict new weapon systems in order to ensure compliance with the applicable laws of war.

It is suggested that robots will soon be amongst us. In the course of this study, it has become clear that they already come in many forms and even exhibit high levels of autonomy. In addition, it should be possible to distinguish between those that can be used in a beneficial and lawful manner (the good), those that we should not deploy (the bad), and those that might even risk extinguishing humanity (the ugly). Current prototypes can lawfully be used because they allow for more precision. To keep up with the complexity of future systems, the current review systems need to include the context of use. When systems are upgraded after initial clearance, AWS will have to be re-reviewed.

To overcome the deadlocked situation in the UN CCW, state parties have agreed to establish a Group of Governmental Experts to come up with a proposal of how to regulate AWS. The international streamlining of national legal reviews would be the most feasible proposal.


In sum, the purely domestic legal review is unsatisfactory due to its relative lack of transparency, while a treaty is also unlikely. Instead, interim measures, such as the sharing of best practices from national reviews, are more feasible and could build greater confidence among the main actors. These measures might even have the potential to establish common standards, ultimately leading towards opportunities to create an international instrument. In case the legal review becomes part of a treaty-based AWS Review Regime, the focus should be on the context of the use of a weapon system, since its legality may shift depending on the setting in which it is used.865 Following this approach, a legal review can and must ensure that AWS may only be operated in an environment in which they can lawfully aim at military targets.

To conclude, while there is no principle of IHL that prohibits the use of AWS per se, it depends on how (technologically) well-equipped AWS are in order to respect the core principles of IHL. As long as sensor and on-board programs are not sophisticated enough, the context of lawful use should be restricted to environments in which no proportionality assessment is required.866

The responsibility to ensure that weapons comply with the applicable law rests with the state developing, buying, or using these weapons. The international

865 FRY, 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', 453-519; FRY, 'The XM25 Individual Semi-Automatic Airburst Weapon System and International Law: Landing on the wrong planet?', (2013) 36 University of New South Wales Law Journal, 683. 866 SASSÒLI, 'Autonomous Weapons – Potential advantages for the respect of international humanitarian law', 5. community of states and civil society organizations will closely follow developments with regard to autonomous weapons technology (including state practice). Self-driving cars may soon be able and allowed to take over the steering on highways with rather predictable obstacles. At the same time, it might take longer until cars can manage steering through rush-hour traffic in a city. The same is true for military applications. Minesweeping robots or defensive systems, such as counter-submarine warfare, might pass legal reviews under existing or evolving structures with little difficulty. On the other hand, it may take a long time (if ever) until AWS can lawfully be used in urban warfare, where encountering civilians is highly probable. Countries can still take the lead and become first movers in developing transparent international standards for contextualised legal reviews. It is more a question of will than of possibility. We are now standing at a crossroads where states will decide what the coexistence of man and machine will look like. The international community of states should ensure that robots only take over in contexts where they can do so according to the law.


VII. Bibliography

A. Treaties and International Documents

Conference of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons, 'Final Document of the Fifth Review Conference CCW/CONF.V/10', (23 December 2016), www.unog.ch/80256EE600585943/(httpPages)/9F975E1E06869679C1257F50004F7E8C?OpenDocument, last visited December 15, 2017.

Convention on Certain Conventional Weapons, 'Additional Protocol to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be deemed to be Excessively Injurious or to have Indiscriminate Effects (Protocol IV, entitled Protocol on Blinding Laser Weapons)', Doc. CCW/CONF.I/16 Part I, (1995) United Nations Treaty Series, vol. 1380, 370.

Convention on Certain Conventional Weapons Chairperson’s Report, 'Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), (Advanced version of 16 May 2014)', www.unog.ch/80256EDD006B8954/%28httpAssets%29/350D9ABED1AFA515C1257CF30047A8C7/$file/Report_AdvancedVersion_10June.pdf, last visited December 15, 2017.

Convention (I) for the Amelioration of the Condition of the Wounded and Sick in Armed Forces in the Field. Geneva, 12 August 1949, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/7c4d08d9b287a42141256739003e636b/fe20c3d903ce27e3c125641e004a92f3?OpenDocument, last visited December 15, 2017.


Convention (II) for the Amelioration of the Condition of Wounded, Sick and Shipwrecked Members of Armed Forces at Sea. Geneva, 12 August 1949, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/7c4d08d9b287a42141256739003e636b/44072487ec4c2131c125641e004a9977?OpenDocument, last visited December 15, 2017.

Convention (IV) respecting the Laws and Customs of War on Land and its annex: Regulations concerning the Laws and Customs of War on Land. The Hague, 18 October 1907, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/385ec082b509e76c41256739003e636d/1d1726425f6955aec125641e0038bfd6?OpenDocument, last visited December 15, 2017.

Convention (VIII) relative to the Laying of Automatic Submarine Contact Mines. The Hague, 18 October 1907, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/52d68d14de6160e0c12563da005fdb1b/30b2a952e0f3feb9c125641e0039cbe2?OpenDocument, last visited December 15, 2017.

Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be deemed to be Excessively Injurious or to have Indiscriminate Effects, concluded at Geneva on 10 October 1980, vol. 1342 United Nations Treaty Series.

Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on their Destruction, 18 September 1997, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=B587BB399470269441256585003BA277&action=openDocument, last visited December 15, 2017.

International Law Commission, 'Draft Articles on the Prevention of Transboundary Harm from Hazardous Activities with Commentaries', (2001) Vol. II (2).

Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (AP I), 8 June 1977, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/7c4d08d9b287a42141256739003e636b/f6c8b9fee14a77fdc125641e0052b079?OpenDocument, last visited December 15, 2017.

Protocol (II) on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices. Geneva, 10 October 1980, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=6258BAB1CD31AD0EC12563CD002D6DC9&action=openDocument, last visited December 15, 2017.

Protocol on Prohibitions or Restrictions on the Use of Incendiary Weapons (Protocol III). Geneva, 10 October 1980, https://ihl-databases.icrc.org/applic/ihl/ihl.nsf/Treaty.xsp?documentId=1E37E38A51A1941DC12563CD002D6DEA&action=openDocument, last visited December 15, 2017.

United Nations, 'Arms Trade Treaty', (24 December 2014) C.N.630.2014.TREATIES-XXVI.8, www.un.org/disarmament/convarms/att/, last visited December 15, 2017.

United Nations, 'Rome Statute of the International Criminal Court', (1 July 2002) 2187 UNTS 90, http://untreaty.un.org/cod/icc/statute/romefra.htm, last visited December 15, 2017.


United Nations, 'Vienna Convention on the Law of Treaties', (1969) 1155 UNTS 331, https://treaties.un.org/doc/Publication/UNTS/Volume%201155/volume-1155-I-18232-English.pdf, last visited December 15, 2017.

B. Cases

International Criminal Court, 'Pre-Trial Chamber II, Situation in the Central African Republic in the case of the Prosecutor v. Jean-Pierre Bemba Gombo', (2009) No.: ICC-01/05-01/08.

International Court of Justice, 'Advisory Opinion of 11.4.1949 - Reparations for Injuries suffered in the Service of the United Nations', (1949) I.C.J. Reports, 174.

International Court of Justice, 'Corfu Channel (United Kingdom v. Albania)', (1949) I.C.J. Reports, 4.

International Court of Justice, 'Military and Paramilitary Activities In and Against Nicaragua (Nicaragua v. USA)', (1986) I.C.J. Reports, 14.

International Court of Justice, 'Legality of the Threat or Use of Nuclear Weapons, (Advisory Opinion of 8 July 1996)', (1996) I.C.J. Reports, 226.

International Court of Justice, 'Advisory Opinion - Difference Relating to Immunity from Legal Process of a Special Rapporteur to the Commission of Human Rights', (1999) I.C.J. Reports, 62.

International Court of Justice, 'Case Concerning Armed Activities on the Territory of the Congo (Democratic Republic of the Congo v. Uganda)', (2005) I.C.J. Reports, 168.

International Court of Justice, 'Case Concerning the Application of the Convention on the Prevention and Punishment of the Crime of Genocide (Bosnia and Herzegovina v. Serbia and Montenegro) (Merits)', (2007) I.C.J. Reports, 43.

International Criminal Tribunal for Rwanda, 'Appeals Chamber - Prosecutor v. Ignace Bagilishema', (2002) Case: ICTR-95-1A-A.

International Criminal Tribunal for the Former Yugoslavia, 'Trial Chamber, Prosecutor v. Zejnil Delalić et al.', (1998) Case: IT-96-21-T.

International Criminal Tribunal for the Former Yugoslavia, 'Appeals Chamber, Prosecutor v. Tihomir Blaškić', (2004) Case: IT-95-14-A.

C. Books, Articles, and Presentations

Acheson, Ray, 'Civil society perspectives on the CCW Meeting of Experts - Editorial: Seeking Action on Autonomous Weapons', (11 April 2016), www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2016/meeting-experts-laws/reports/CCWR3.2.pdf, last visited December 15, 2017.

Acheson, Ray, 'Confronting Reality: We can build Autonomous Weapons but we can't make them smart', (14 November 2017) 5 CCW Report, www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2017/gge/reports/CCWR5.2.pdf, last visited December 15, 2017.

Acheson, Ray, 'Losing Control: The Challenge of Autonomous Weapons for LAWS, Ethics, and Humanity', (15 November 2017) 5 CCW Report, www.reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2017/gge/reports/CCWR5.3.pdf, last visited December 15, 2017.

Airbus, 'European MALE RPAS (Medium Altitude Long Endurance Remotely Piloted Aircraft System) Programme takes off', (28 September 2016), www.airbus.com/newsroom/press-releases/en/2016/09/european-male-rpas-medium-altitude-long-endurance-remotely-piloted-aircraft-system-programme-takes-off.html, last visited December 15, 2017.

Akerson, David, 'The Illegality of Offensive Lethal Autonomy', in D. Saxon (ed.), International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff, 2013), 65-98.

Alston, Philip, 'Lethal Robotic Technologies: The Implications for Human Rights and International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 35-60.

Ambos, Kai, 'Superior Responsibility', in A. Cassese, P. Gaeta and J. Jones (eds.), The Rome Statute of the International Criminal Court: A Commentary (Oxford University Press, 2002), 235-250.

Ambos, Kai, 'Internationales Strafrecht', (C.H. Beck, 2011).

Ambos, Kai, 'Treatise on International Criminal Law: Foundations and General Part – Volume I', (Oxford University Press, 2013).

Anderson, Kenneth, Reisner, Daniel and Waxman, Matthew, 'Adapting the Law of Armed Conflict to Autonomous Weapon Systems', (2014) 90 International Law Studies, 386-411.

Anderson, Kenneth and Waxman, Matthew C., 'Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can', (2013) Columbia Public Law Research Paper, 1-33.

Ansell, Darren, 'The Reliability and Vulnerability of Autonomous Systems', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Anthony, Ian, 'LAWS at the CCW: Transparency and information sharing measures', (17 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Appelqvist, Pekka, 'Systems approach to LAWS - characteristics, considerations and implications', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Arendt, Rieke, 'Völkerrechtliche Probleme beim Einsatz autonomer Waffensysteme', (Berliner Wissenschafts-Verlag, 2016).

Arkin, Ronald Craig, 'Lethal Autonomous Weapons Systems and the Plight of the Noncombatant (Presentation at the CCW informal expert meeting on 13 May 2014)', www.unog.ch/80256EDD006B8954/(httpAssets)/FD01CB0025020DDFC1257CD70060EA38/$file/Arkin_LAWS_technical_2014.pdf, last visited December 15, 2017.

Arkin, Ronald Craig, 'Governing lethal behavior in autonomous robots', (CRC Press, 2009).

Arkin, Ronald Craig, 'The Robot didn’t do it', (2013) Position Paper for the Workshop on Anticipatory Ethics, Responsibility and Artificial Agents, 1-2.

Article 36, 'Memorandum from Article 36 for Delegates to the Convention on Certain Conventional Weapons (CCW), Structuring Debate on Autonomous Weapon Systems', (14 November 2013), www.article36.org/wp-content/uploads/2013/11/Autonomous-weapons-memo-for-CCW.pdf, last visited December 15, 2017.

Article 36, 'Killing by Machine - Key Issues for Understanding Meaningful Human Control', (2015), www.article36.org/wp-content/uploads/2013/06/KILLING_BY_MACHINE_6.4.15.pdf, last visited December 15, 2017.

Asaro, Peter, 'A Body to Kick, But Still No Soul to Damn. Legal Perspectives on Robotics', in P. Lin, K. Abney and G. A. Bekey (eds.), Robot ethics: the ethical and social implications of robotics (MIT Press, 2012).

Asaro, Peter, 'On banning autonomous weapon systems: human rights, automation, and the dehumanization of lethal decision-making', (2012) 94 International Review of the Red Cross, 687-709.

Aust, Helmut Philipp, 'Complicity and the Law of State Responsibility', (Cambridge University Press, 2011).

Backstrom, Alan and Henderson, Ian, 'New Capabilities in Warfare: An Overview of Contemporary Technological Developments and the Associated Legal and Engineering Issues in Article 36 Weapons Reviews', (2012) 94 International Review of the Red Cross, 483-514.

Bantekas, Ilias, 'The Contemporary Law of Superior Responsibility', (1999) 93 American Journal of International Law, 573-595.


Barnidge, Robert, 'Non-State Actors and Terrorism: Applying the Law of State Responsibility and the Due Diligence Principle', (T.M.C. Asser Press, 2007).

Barrat, James, 'Our Final Invention: Artificial Intelligence and the End of the Human Era', (Thomas Dunne Books, 2015).

Benjamin, Medea, 'Drone Warfare. Killing by Remote Control', (Verso, 2012).

Berkowitz, Bruce, 'Sea Power in the Robotic Age', (2014) XXX Issues in Science and Technology, http://issues.org/30-2/bruce-2/, last visited December 15, 2017.

Bhuta, Nehal, 'Collaborative Operations in Denied Environment Program (CODE) framework', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Bhuta, Nehal; Beck, Susanne; Geiß, Robin; Liu, Hin-Yan and Kreß, Claus, 'Autonomous Weapons Systems – Law, Ethics, Policy', (Cambridge University Press, 2016).

Bilger, Burkhard, 'Auto Correct - Has the self-driving car at last arrived?', (25 November 2013) The New Yorker, www.newyorker.com/magazine/2013/11/25/auto-correct, last visited December 15, 2017.

Biontino, Michael, 'Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', (11 to 15 April 2016), www.unog.ch/80256EDD006B8954/(httpAssets)/DDC13B243BA863E6C1257FDB00380A88/$file/ReportLAWS_2016_AdvancedVersion.pdf, last visited December 15, 2017.

Biontino, Michael, 'Statement by the Permanent Representative of Germany to the Conference on Disarmament on Lethal Autonomous Weapons Systems (LAWS) to the Fifth CCW Review Conference', (12 December 2016), www.unog.ch/80256EDD006B8954/(httpAssets)/97043403171925A7C125808B00368CEF/$file/2016+Bericht+Vorsitzender+LAWS.pdf, last visited December 15, 2017.

Biontino, Michael, 'Chairperson's Report of the 2015 CCW Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)', (13 to 17 April 2015), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Biontino, Michael, 'Recommendations to the 2016 Review Conference', (15 April 2016), www.unog.ch/80256EDD006B8954/(httpAssets)/6BB8A498B0A12A03C1257FDB00382863/$file/Recommendations_LAWS_2016_AdvancedVersion+(4+paras)+.pdf, last visited December 15, 2017.

Boothby, William H., 'The Law of Targeting', (Oxford University Press, 2012).

Boothby, William H., 'Some legal challenges posed by remote attack', (2012) 94 International Review of the Red Cross, 579-595.

Boothby, William H., 'Dehumanization of warfare – Is there a legal problem, in particular concerning Art. 36 AP I?', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

Boothby, William H., 'Article 36, Weapons Reviews and autonomous weapons', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Boothby, William H., 'Weapons and the Law of Armed Conflict', (Oxford University Press, 2009).

Boothby, William H., 'How Far Will the Law Allow Unmanned Targeting to Go?', in D. Saxon (ed.), International Humanitarian Law and the Changing Technology of War (Martinus Nijhoff, 2013).

Boothby, William H., 'Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors', (T.M.C. Asser Press, 2014).

Boothby, William H., 'The Legal Challenge of New Technologies: An Overview', in H. Nasu and R. McLaughlin (eds.), New Technologies and the Law of Armed Conflict (T.M.C. Asser Press, 2014).

Bornstein, Jon, 'Autonomy Roadmap Autonomy Community of Interest', (24 March 2015), www.defenseinnovationmarketplace.mil/resources/AutonomyCOI_NDIA_Briefing20150319.pdf, last visited December 15, 2017.

Borrmann, Robin, 'Autonome unbemannte bewaffnete Luftsysteme im Lichte des Rechts des internationalen bewaffneten Konflikts. Anforderungen an das Konstruktionsdesign und Einsatzbeschränkungen', (Duncker & Humblot, 2014).

Bothmer, Fredrik von, 'Robots in Court. Responsibility for Lethal Autonomous Weapons Systems (LAWS)', in S. Brändli, R. Harasgama, R. Schister and A. Tamò (eds.), Mensch und Maschine. Symbiose oder Parasitismus? (Schriften der Assistierenden der Universität St. Gallen, 2015), 101-124.

Boulanin, Vincent, 'Implementing Article 36 Weapon Reviews in the Light of Increasing Autonomy in Weapon Systems', (2015) 2015/1 SIPRI Insights on Peace and Security, 1-28.

Bowcott, Owen, 'UK opposes international ban on developing "Killer Robots"', (13 April 2015) The Guardian, www.theguardian.com/politics/2015/apr/13/uk-opposes-international-ban-on-developing-killer-robots, last visited December 15, 2017.

Brehm, Maya, 'Meaningful Human Control', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Brownlie, Ian, 'State Responsibility Part I: System of Law of Nations', (Oxford University Press, 1983).

Brownlie, Ian, 'Principles of Public International Law', 7th ed. (Oxford University Press, 2008).

Bruni, Sudi, 'Northrop Grumman, US Navy Complete Critical Design Review for Bomb Disposal Robot Program', (7 June 2016), http://investor.northropgrumman.com/phoenix.zhtml?c=112386&p=irol-newsArticle_pf&ID=2175739, last visited December 15, 2017.

Brütsch, Christian and Lehmkuhl, Dirk, 'Law and Legalization in Transnational Relations', (Routledge, 2007).

Buchanan, Allen and Keohane, Robert O., 'Toward a Drone Accountability Regime', (2015) 29 Ethics and International Affairs, 15-37.

Buchanan, Allen and Keohane, Robert O., 'Toward a Drone Accountability Regime: A Rejoinder', (2015) 29 Ethics and International Affairs, 67-70.

Burgess, John A., 'Emerging Technologies and the Security of Western Europe', in S. J. Flanagan and F. O. Hampson (eds.), Securing Europe’s Future: Changing elements of European security (Dover, 1986).

Burri, Thomas, 'Machine Learning and the Law: 5 Theses', (3 January 2017), https://ssrn.com/abstract=2927625, last visited December 15, 2017.

Burri, Thomas, 'The Politics of Robot Autonomy', (2016) 7 European Journal of Risk Regulation, 341-360.

Burri, Thomas, 'Free Movement of Algorithms: Artificially Intelligent Persons Conquer the European Union's Internal Market', (2017) in W. Barfield and U. Pagallo (eds.), Research Handbook on the Law of Artificial Intelligence (Edward Elgar, forthcoming), https://ssrn.com/abstract=3010233, last visited December 15, 2017.

Burri, Thomas; Bayern, Shawn; Grant, Thomas D.; Häusermann, Daniel M.; Möslein, Florian and Williams, Richard, 'Company Law and Autonomous Systems: A Blueprint for Lawyers, Entrepreneurs, and Regulators', (2017) 9 Hastings Science and Technology Law Journal, 135-162.


Burri, Thomas and Wildhaber, Isabelle, 'Introduction to the Special Issue of the European Journal of Risk Regulation: The Man and the Machine – When Systems Take Decisions Autonomously', (2016) 7 European Journal of Risk Regulation, 295-296.

Calo, Ryan, 'Robotics and the New Cyberlaw', (2015) 103 California Law Review, 101-146.

Campaign to Stop Killer Robots, 'More talks in 2016 but little ambition', (13 November 2015), www.stopkillerrobots.org/2015/11/noambition/, last visited December 15, 2017.

Campaign to Stop Killer Robots, 'Step up the CCW mandate', (18 June 2015), www.stopkillerrobots.org/2015/06/mandateccw/, last visited December 15, 2017.

Campaign to Stop Killer Robots, 'Country Policy Positions', (25 March 2015), www.stopkillerrobots.org/wp-content/uploads/2015/03/KRC_CCWexperts_Countries_25Mar2015.pdf, last visited December 15, 2017.

Campaign to Stop Killer Robots, 'Who Supports the Call to Ban Killer Robots?', (27 June 2017), www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_ListBanEndorsers_27June2017.pdf, last visited December 15, 2017.

Campaign to Stop Killer Robots, 'The Solution', (2016) www.stopkillerrobots.org/the-solution/, last visited December 15, 2017.

Campaign to Stop Killer Robots, 'Country Views on Killer Robots', (14 November 2017), www.stopkillerrobots.org/wp-content/uploads/2013/03/KRC_CountryViews_14Nov2017.pdf, last visited December 15, 2017.

Campaign to Stop Killer Robots, 'Support builds for new international law on killer robots', (2017), www.stopkillerrobots.org/2017/11/gge/, last visited December 15, 2017.

Cassese, Antonio, 'The Martens Clause: Half a Loaf or Simply Pie in the Sky?', (2000) 11 European Journal of International Law, 187-216.

Center for New American Security, 'Autonomous Weapons and Human Control', (April 2016) Ethical Autonomy Project, www.cnas.org/autonomous-weapons-and-human-control-.VwzvdcchuOp, last visited December 15, 2017.

Corn, Geoffrey S.; Hansen, Victor; Jackson, Richard; Jenks, Christopher; Jensen, Eric Talbot and Schoettler, James A., 'The Law of Armed Conflict: An Operational Approach', (Aspen Casebook, 2012).

Crawford, Emily, 'The Modern Relevance of the Martens Clause', (2006) 6 ISIL Yearbook of International Humanitarian and Refugee Law, 1-18.

Crawford, James, 'Revising the Draft Articles on State Responsibility', (1999) 10 European Journal of International Law, 436-460.

Crawford, James, 'The International Law Commission’s Articles on State Responsibility. Introduction, Text and Commentaries', (Cambridge University Press, 2002).

Crawford, James, 'State Responsibility – The General Part', (Cambridge University Press, 2013).


Crawford, Neta C., 'Accountability for Targeted Drone Strikes Against Terrorists?', (2015) 29 Ethics and International Affairs, 39-49.

Crootof, Rebecca, 'Why the Prohibition on Permanently Blinding Lasers is Poor Precedent for a Ban on Autonomous Weapon Systems', (24 November 2015) Lawfare Blog, www.lawfareblog.com/why-prohibition-permanently-blinding-lasers-poor-precedent-ban-autonomous-weapon-systems, last visited December 15, 2017.

Crootof, Rebecca, 'The Killer Robots Are Here: Legal and Policy Implications', (2015) 36 Cardozo Law Review, 1837-1915.

Cussins, Jessica, 'AI Researchers Create Video to Call for Autonomous Weapons Ban at UN', (14 November 2017), https://futureoflife.org/2017/11/14/ai-researchers-create-video-call-autonomous-weapons-ban-un/, last visited December 15, 2017.

Daoust, Isabelle; Coupland, Robin and Ishoey, Rikke, 'New wars, new weapons? The obligation of States to assess the legality of means and methods of warfare', (2002) 84 International Review of the Red Cross, 345-363.

Das Eidgenössische Departement für Verteidigung, Bevölkerungsschutz und Sport, 'Verordnung des VBS über das Armeematerial (Armeematerialverordnung, VAMAT) - 514.20', (6 December 2007), www.admin.ch/opc/de/classified-compilation/20071623/index.html, last visited December 15, 2017.

Davison, Neil, 'Characteristics of autonomous weapon systems', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Defense Advanced Research Projects Agency, 'ACTUV Unmanned Vessel Helps TALONS Take Flight in Successful Joint Test', (20 October 2016), www.darpa.mil/news-events/2016-10-24, last visited December 15, 2017.

Defense Advanced Research Projects Agency, 'Breakthrough Technologies for National Security', (2015), www.darpa.mil/about.aspx, last visited December 15, 2017.

Defense Advanced Research Projects Agency, 'DRC Finals Operations Book (DISTAR Case 24508)', (30 April 2015), www.theroboticschallenge.org/news/ops-manual, last visited December 15, 2017.

Dickow, Marcel, 'A Multidimensional Definition of Robotic Autonomy - Possibilities for Definitions and Regulation', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Dill, Janina, 'The Informal Regulation of Drones and the Formal Legal Regulation of War', (2015) 29 Ethics and International Affairs, 51-58.

Dinstein, Yoram, 'Legitimate Military Objectives Under The Current Jus in Bello', (2002) 78 International Law Studies, 139-172.

Dinstein, Yoram, 'The conduct of hostilities under the law of armed conflict', 2nd edition (Cambridge University Press, 2010).

Docherty, Bonnie, 'Human Rights Implications of Fully Autonomous Weapons', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Docherty, Bonnie, 'Losing Humanity. The Case against Killer Robots', (2012) Human Rights Watch/Harvard International Human Rights Clinic, 1-55.

Docherty, Bonnie, 'The Trouble with Killer Robots. Why we need to ban fully autonomous weapons systems, before it's too late' (2012), http://foreignpolicy.com/2012/11/19/the-trouble-with-killer-robots/, last visited December 15, 2017.

Docherty, Bonnie, 'Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition', (2015) Human Rights Watch/Harvard International Human Rights Clinic, 1-18.

Docherty, Bonnie, 'Making the Case: The Dangers of Killer Robots and the Need for a Preemptive Ban', (2016) Human Rights Watch/Harvard Law School International Human Rights Clinic, 1-50.

Domingos, Pedro, 'The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World', (Basic Books, 2015).

Doswald-Beck, Louise, 'San Remo Manual on International Law Applicable to Armed Conflicts at Sea', (Cambridge University Press, 1994).

Dupuy, Pierre-Marie, 'Due Diligence in the International Law of State Responsibility, OECD: Legal Aspects of Transfrontier Pollution', (OECD Press, 1977).

Dupuy, Pierre-Marie, 'Reviewing the Difficulties of Codification: on Ago’s Classification of Obligations of Means and Obligations of Result in Relation to State Responsibility', (1999) European Journal of International Law, 371-385.

Evans, Tyler D., 'At War with the Robots: Autonomous Weapon Systems and the Martens Clause', (2013) 41 Hofstra Law Review, 697-733.

Fay, George R., 'Investigation of the Abu Ghraib Detention Facility and the 205th Military Intelligence Brigade', (2004), www.washingtonpost.com/wp-srv/nationi/documents/fay_report_8-25-04.pdf, last visited December 15, 2017.

Finel, Bernard I. and Lord, Kristin M., 'Power and Conflict in the Age of Transparency', (Palgrave Macmillan, 2002).

Federal Foreign Office of Germany, 'Lethal Autonomous Weapons Systems - Technology, Definition, Ethics, Law and Security', (Zarbock GmbH & Co. KG, 2016).

Francioni, Francesco, 'Private Military Contractors and International Law: An Introduction', (2008) 19 European Journal of International Law, 961-964.

Frau, Robert, 'Unbemannte Luftfahrzeuge im internationalen bewaffneten Konflikt', (2011) 24 Journal of International Law of Peace and Armed Conflict, 60-72.

Frau, Robert, 'Regulatory Approaches to Unmanned Naval Systems in International Law of Peace and War', (2012) 2 Journal of International Law of Peace and Armed Conflict, 84-91.

Frau, Robert, 'Völkerstrafrechtliche Aspekte automatisierter und autonomer Kriegsführung', in R. Frau (ed.), Drohnen und das Recht. Völker- und verfassungsrechtliche Fragen automatisierter und autonomer Kriegsführung (Mohr Siebeck, 2014), 235-250.

French Republic, 'Non paper: Legal framework for any potential development and operational use of a future LAWS', (11 April 2016) Third CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/37D51189AC4FB6E1C1257F4D004CAFB2?OpenDocument, last visited December 15, 2017.

Fry, James D., 'Contextualized Legal Reviews for the Methods and Means of Warfare: Cave Combat and International Humanitarian Law', (2006) 44 Columbia Journal of Transnational Law, 453-519.

Fry, James D., 'Legal Resolution of Nuclear Non-Proliferation Disputes', (Cambridge University Press, 2013).

Fry, James D., 'The XM25 Individual Semi-Automatic Airburst Weapon System and International Law: Landing on the wrong planet?', (2013) 36 University of New South Wales Law Journal, 682-710.

Future of Life Institute, 'An Open Letter - Research Priorities for Robust and Beneficial Artificial Intelligence', https://futureoflife.org/ai-open-letter, last visited December 15, 2017.

Future Timeline, 'ATLAS humanoid robot gets an upgrade', (23 January 2015), www.futuretimeline.net/blog/2015/01/23.htm, last visited December 15, 2017.

Gady, Franz-Stefan, 'Is Russia Building a Top-Secret Nuclear-Armed Underwater Drone?', (17 September 2015) The Diplomat, http://thediplomat.com/2015/09/is-russia-building-a-top-secret-nuclear-armed-underwater-drone/, last visited December 15, 2017.

Gillespie, Tony and West, Robin, 'Requirements for Autonomous Unmanned Air Systems set by Legal Issues', (2010) 4 The International C2 Journal, 1-32.


Glanz, James and Lehren, Andrew W., 'Use of Contractors Added to War’s Chaos in Iraq', (23 October 2010), www.nytimes.com/2010/10/24/world/middleeast/24contractors.html?_r=2&hp&, last visited December 15, 2017.

Glennon, Michael J., 'The Road Ahead: Gaps, Leaks and Drips', (2013) 89 International Law Studies, 362-386.

Gross, Oren, 'The New Way of War: Is There a Duty to Use Drones?', (2015) 67 Florida Law Review, 1-72.

Gubrud, Mark, 'Killer Robots and Laser-Guided Bombs: A reply to Horowitz & Scharre', (4 December 2014), http://gubrud.net/?p=398, last visited December 15, 2017.

Guetelein, Michael A., 'Lethal Autonomous Weapons - Ethical and Doctrinal Implications', (2005) Naval War College Newport, RI, http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA464896, last visited December 15, 2017.

Haimovich, Zvika, 'An Israeli Perspective on Ballistic Missile Defense', (5 April 2016) Fletcher School of Law and Diplomacy - ISSP Lecture Series, http://fletcher.tufts.edu/Calendar/2016/04/05/An-Israeli-Perspective-on-Ballistic-Missile-Defense-Brigadier-General-Zvika-Haimovich-ISSP-Luncheon-Lecture-Series.aspx, last visited December 15, 2017.

Hammes, T. X., 'The Future of Warfare: Small, many, smart vs. few & exquisite?', (16 July 2014) War on the Rocks, http://warontherocks.com/2014/07/the-future-of-warfare-small-many-smart-vs-few-exquisite/, last visited December 15, 2017.


Hawkins, Andrew J., 'Google’s ‘worst’ self-driving accident was still a human’s fault', (26 September 2016), www.theverge.com/2016/9/26/13062214/google-self-driving-car-crash-accident-fault, last visited December 15, 2017.

Hawley, John K., 'PATRIOT WARS - Automation and the Patriot Air and Missile Defense System', (January 2017) Center for a New American Security, www.cnas.org/publications/reports/patriot-wars, last visited December 15, 2017.

Heinegg, Wolff Heintschel von, 'Unmanned Maritime Systems: The Challenges', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (13-14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

Heinegg, Wolff Heintschel von, 'Verteidigungsausschuss, Anhörung: Völker-, verfassungsrechtliche sowie sicherheitspolitische und ethische Fragen im Zusammenhang mit unbemannten Luftfahrzeugen', (30 June 2014) Protokoll: 18/16, 36, www.bundestag.de/bundestag/ausschuesse18/a12/oeffentliche_anhoerung, last visited December 15, 2017.

Hellström, Thomas, 'On the moral responsibility of military robots', (2013) 15 Ethics and Information Technology, 99-107.

Henckaerts, Jean-Marie and Doswald-Beck, Louise, 'Customary International Humanitarian Law - Volume I', (Cambridge University Press, 2009).

Henderson, Andrew H., 'Murky Waters: The Legal Status of Unmanned Undersea Vehicles', (2006) 53 Naval Law Review, 55-72.


Henderson, Ian, 'The Contemporary Law of Targeting: Military Objectives, Proportionality and Precautions in Attack under Additional Protocol I', (Brill, 2009).

Herbach, Jonathan David, 'Into the Caves of Steel: Precaution, Cognition and Robotic Weapon Systems under the International Law of Armed Conflict', (2012) 4 Amsterdam Law Forum, 3-20.

Hessbruegge, Jan, 'The Historical Development of the Doctrines of Attribution and Due Diligence in International Law', (2004) New York University Journal of International Law and Politics, 265-306.

Heyns, Christof, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', (9 April 2013) Human Rights Council, 23rd Sess., UN Doc. A/HRC/23/47, 1-22.

Höfer, Martin Felix, 'Gezielte Tötungen - Terrorismusbekämpfung und die neuen Feinde der Menschheit', (Mohr Siebeck, 2013).

Holmes, James, 'The Mighty X-47B: Is It Really Time for Retirement?', (6 May 2015), http://nationalinterest.org/feature/the-mighty-x-47b-it-really-time- retirement-12818, last visited December 15, 2017.

Homer, 'The Iliad - Translated by Robert Fagles', (Penguin Classics, 1998).

Hoppe, Carsten, 'Passing the Buck: State Responsibility for Private Military Companies', (2008) 19 The European Journal of International Law, 989-1014.

Hörl, Sebastian, Ciari, Francesco and Axhausen, Kay W., 'Recent perspectives on the impact of autonomous vehicles', (September 2016) Working paper - Institute for Transport Planning and Systems, www.ethz.ch/content/dam/ethz/special-interest/baug/ivt/ivt-dam/vpl/reports/2016/ab1216.pdf, last visited December 15, 2017.

Horowitz, Michael C., 'Autonomous Weapon Systems: Public Opinion and Security Issues', (16 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Hoven, Jeroen van den, 'Why the Future needs us today - Moral Responsibility and Engineering Autonomous Weapon Systems', (17 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Human Rights Watch and Harvard Law School's International Human Rights Clinic, 'The Need for New Law to Ban Fully Autonomous Weapons: Memorandum to Convention on Conventional Weapons Delegates', (2013), www.hrw.org/sites/default/files/supporting_resources/11.2013_memo_to_ccw_delegates_fully_autonomous_weapons.pdf, last visited December 15, 2017.

Human Rights Watch and Harvard Law School's International Human Rights Clinic, 'Advancing the debate on Killer Robots: 12 Key Arguments for a preemptive ban on Fully Autonomous Weapons', (2014), www.hrw.org/sites/default/files/related_material/Advancing the Debate_final.pdf, last visited December 15, 2017.

Hunter, Duncan, 'Why is Obama Letting China Beat the US at Global Drone Sales?', (4 December 2015) Defense One, www.defenseone.com/ideas/2015/12/why-obama-letting-china-beat-us-global-drone-sales/124213, last visited December 15, 2017.

Husain, Amir, 'The Sentient Machine: The Coming Age of Artificial Intelligence', (Scribner, 2017).

International Committee of the Red Cross, 'International humanitarian law and the challenges of contemporary armed conflicts', (8-10 December 2015) 32nd International Conference of the Red Cross and Red Crescent, 32IC/15/11, Geneva, www.icrc.org/en/document/international-humanitarian-law-and-challenges-contemporary-armed-conflicts, last visited December 15, 2017.

International Committee of the Red Cross, 'Statement at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (17 April 2015), www.unog.ch/80256EDD006B8954/(httpAssets)/E2917CC32952137FC1257E2F004CED22/$file/CCW+Meeting+of+Experts+ICRC+closing+statement+17+Apr+2015+final.pdf, last visited December 15, 2017.

International Committee of the Red Cross, 'Report of the ICRC Expert Meeting on Autonomous weapon systems: technical, military, legal and humanitarian aspects in Geneva', (26-28 March 2014), www.icrc.org/eng/assets/files/2014/expert-meeting-autonomous-weapons-icrc-report-2014-05-09.pdf, last visited December 15, 2017.

International Committee of the Red Cross, 'IHL and the challenges of contemporary armed conflict: Report prepared for the 31st conference', (2011), www.icrc.org/eng/resources/documents/report/31-international-conference-ihl-challenges-report-2011-10-31.htm, last visited December 15, 2017.

International Panel on the Regulation of Autonomous Weapons, 'Focus on Technology and Application of Autonomous Weapons', (2017), www.ipraw.org/state-of-technology/, last visited December 15, 2017.

Israel, 'Statement at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (17 April 2015), www.unog.ch/80256EDD006B8954/(httpAssets)/AB30BF0E02AA39EAC1257E29004769F3/$file/2015_LAWS_MX_Israel_characteristics.pdf, last visited December 15, 2017.

Jacobsson, Marie, 'Modern Weaponry and Warfare: The Application of Article 36 of Additional Protocol I by Governments', in A. M. Helm (ed.), The Law of War in the 21st Century: Weaponry and the Use of Force (U.S. Naval War College International Law Studies, 2006), 183-191.

Jensen, Eric Talbot, 'Statement on Lethal Autonomous Weapons', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Jones, Troy and Leammukda, Mitch, 'Requirements-Driven Autonomous System Test Design: Building Trusting Relationships', (2011), www.researchgate.net/publication/228598990, last visited December 15, 2017.

Kane, Angela, 'Video Message to the CCW Meeting of Experts on LAWS on 13 April 2015', https://s3.amazonaws.com/unoda-web/wp-content/uploads/2015/04/LAWS-Meeting-April-2015-HR-Video-Message.pdf, last visited December 15, 2017.

Kang, Cecilia, 'Drone Registration Rules Are Announced by F.A.A.', (14 December 2015), https://mobile.nytimes.com/2015/12/15/technology/drone-registration-rules-are-announced-by-faa.html?smid=fb-nytimes&smtyp=cur&_r=0&referer=, last visited December 15, 2017.

Kastan, Benjamin, 'Autonomous Weapons Systems: A Coming Legal Singularity?', (2013) 45 University of Illinois Journal of Law, Technology & Policy, 45-82.

Kessler, Birgit, 'The Duty to “Ensure Respect” under Common Article 1 of the Geneva Conventions', (2001) German Yearbook of International Law, 498-516.

Knoops, Geert-Jan, 'Drones at Trial. State and Individual (Criminal) Liabilities for Drone Attacks', (2014) 14 International Criminal Law Review, 42-81.

Knuckey, Sarah, 'Towards an international dialogue about transparency: What kinds of AWS information should governments share, with whom, and on what basis?', United Nations Institute for Disarmament Research's side event 'Transparency and LAWS' on 16 April 2015, www.unog.ch/80256EDD006B8954/(httpAssets)/A079BA4521FCB9F2C1257E270044A4EB/$file/2015_LAWS_SideEvent_UNIDIR.pdf, last visited December 15, 2017.

Krishnan, Armin, 'Killer Robots - Legality and Ethicality of Autonomous Weapons', (Routledge, 2009).

Krishnan, Armin, 'Gezielte Tötung: Die Zukunft des Krieges', (Matthes & Seitz, 2012).

LaGrone, Sam, 'Pentagon to Navy: Convert UCLASS Program Into Unmanned Aerial Tanker, Accelerate F-35 Development, Buy More Super Hornets', (1 February 2016), https://news.usni.org/2016/02/01/pentagon-to-navy-convert-uclass-program-into-unmanned-aerial-tanker-accelerate-f-35-development-buy-more-super-hornets, last visited December 15, 2017.

LaGrone, Sam, 'Mabus: F-35 Will Be ‘Last Manned Strike Fighter’ the Navy, Marines ‘Will Ever Buy or Fly’', (15 April 2015), https://news.usni.org/2015/04/15/mabus-f-35c-will-be-last-manned-strike-fighter-the-navy-marines-will-ever-buy-or-fly, last visited December 15, 2017.

Lawand, Kathleen, 'Presentation on the Panel on possible challenges to IHL due to increasing degrees of autonomy', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Lawand, Kathleen, 'Reviewing the legality of new weapons, means and methods of warfare', (2006) 88 International Committee of the Red Cross Review, 925-930.

Lawand, Kathleen, Coupland, Robin and Herby, Peter, 'A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977', (2006) International Committee of the Red Cross, 1-35.

Lehnardt, Chia, 'Private Militärfirmen und völkerrechtliche Verantwortlichkeit. Eine Untersuchung aus humanitär-völkerrechtlicher und menschenrechtlicher Perspektive', (Mohr Siebeck, 2011).

Levon, Eitan, 'Statement by Israel at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (13 April 2015), www.unog.ch/80256EDD006B8954/(httpAssets)/1B879A0827FBA307C1257E29004778B8/$file/2015_LAWS_MX_Israel_GS+bis.pdf, last visited December 15, 2017.

Lewis, Dustin, Modirzadeh, Naz and Blum, Gabriella, 'Artificial Intelligence - The Pentagon’s New Algorithmic-Warfare Team', (26 June 2017), www.lawfareblog.com/pentagons-new-algorithmic-warfare-team?utm_source=HLS+PILAC&utm_campaign=c5299c361b-EMAIL_CAMPAIGN_2017_06_19&utm_medium=email&utm_term=0_fbdb5dc78b-c5299c361b-262381653, last visited December 15, 2017.

Lienert, Paul and White, Joe, 'Google partners with auto suppliers on self-driving car', (14 January 2015), www.reuters.com/article/us-autoshow-google-urmson/google-partners-with-auto-suppliers-on-self-driving-car-idUSKBN0KN29820150114?feedType=RSS&feedName=technologyNews; http://time.com/3693637/driverless-cars-prediction/, last visited December 15, 2017.

Lin, Patrick, 'The right to life and the Martens Clause', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Lin, Patrick; Abney, Keith and Bekey, George A., 'Robot ethics: the ethical and social implications of robotics', (MIT Press, 2012).

Lin, Patrick; Bekey, George and Abney, Keith, 'Autonomous Military Robotics. Risk, Ethics, Design', (2008) US Department of Navy, Office of Naval Research, http://ethics.calpoly.edu/ONR_report.pdf, last visited December 15, 2017.

Lin, Patrick; Bekey, George and Abney, Keith, 'Robots in War: Issues of Risk and Ethics', in R. Capurro and M. Nagenborg (eds.), Ethics and Robotics (IOS Press, 2009) 49-67.

Littlefield, Scott, 'DARPA Program Information: Anti-Submarine Warfare (ASW) Continuous Trail Unmanned Vessel (ACTUV)', (2016), www.darpa.mil/program/anti-submarine-warfare-continuous-trail-unmanned-vessel, last visited December 15, 2017.

Liu, Hin-Yan, 'Categorization and legality of autonomous and remote weapons systems', (2012) 94 International Review of the Red Cross, 627-652.

Lohmann, Melinda Florina, 'Automatisierte Fahrzeuge im Lichte des Schweizer Zulassungs- und Haftungsrechts', (Nomos, 2016).

Israel Aerospace Industries, 'Harpy Loitering Weapon', (2014), www.iai.co.il/2013/16143-16153-en/IAI.aspx, last visited December 15, 2017.

Lockheed Martin, 'Lockheed Martin Demonstrates Autonomous Systems that Advance Unmanned Technology on Land, Air, and Sea', (9 May 2017), http://news.lockheedmartin.com/2017-05-09-Lockheed-Martin-Demonstrates-Autonomous-Systems-that-Advance-Unmanned-Technology-on-Land-Air-and-Sea?_ga=2.117835721.1502180965.1512361255-581737331.1408534907, last visited December 15, 2017.

Lubell, Noam, 'Challenges in Applying Human Rights Law to Armed Conflict', (2005) 87 International Review of the Red Cross, 737-754.

Lubell, Noam and Derejko, Nathan, 'A Global Battlefield? Drones and the Geographical Scope of Armed Conflict', (2013) Journal of International Criminal Justice, 65-88.

Lubold, Gordon and Harris, Shane, 'Trump Broadens CIA Powers, Allows Deadly Drone Strikes', (13 March 2017), www.wsj.com/articles/trump-gave-cia-power-to-launch-drone-strikes-1489444374, last visited December 15, 2017.

Lucas, George, 'Automated Warfare', (2011) 25 Stanford Law & Policy Review, 317-340.

Lucas, George R., 'Legal and Ethical Precepts Governing Emerging Military Technologies. Research and Use', (2014) 6 Amsterdam Law Forum, 23-34.

Lytton, Christopher H., 'Blood for Hire: How the War in Iraq Has Reinvented the World's Second Oldest Profession', (2006) Oregon Review of International Law, 307-335.

Majumdar, Dave, 'The War over UCLASS (And Future of Naval Power Projection) Continues', (30 September 2015), http://nationalinterest.org/blog/the-buzz/the-war-over-uclass-future-naval-power-projection-continues-13973, last visited December 15, 2017.

Marauhn, Thilo, 'The Notion of „Meaningful Human Control“ in Light of the Law of Armed Conflict', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

Marauhn, Thilo, 'An Analysis of the Potential Impact of Lethal Autonomous Weapons Systems on Responsibility and Accountability for Violations of International Law', (13 May 2014) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EDD006B8954/(httpAssets)/35FEA015C2466A57C1257CE4004BCA51/$file/Marauhn_MX_Laws_SpeakingNotes_2014.pdf, last visited December 15, 2017.

Marauhn, Thilo, 'Der Einsatz unbemannter bewaffneter Drohnen im Lichte des geltenden Völkerrechts', (2013) 9 Arbeitspapiere Deutsche Stiftung Friedensforschung, 26-51.

Marchant, Gary E.; Allenby, Braden; Arkin, Ronald and Barrett, Edward T., 'International Governance of Autonomous Military Robots', (2011) 12 Columbia Science and Technology Law Review, 272-316.

Markoff, John, 'Fearing Bombs That Can Pick Whom to Kill', (11 November 2014), www.nytimes.com/2014/11/12/science/weapons-directed-by-robots-not-humans-raise-ethical-questions.html, last visited December 15, 2017.

Marra, William C. and McNeil, Sonia K., 'Understanding the Loop: Regulating the Next Generation of War Machines', (2013) 36 Harvard Journal of Law & Public Policy, 1139-1185.

Martin, David, 'The Coming Swarm - New generation of drones set to revolutionize warfare', (8 January 2017) CBS News, http://www.cbsnews.com/news/60-minutes-autonomous-drones-set-to-revolutionize-military-technology/, last visited December 15, 2017.

Mathews, Lee, 'NAFTA superhighway for driverless trucks could become a reality', (26 May 2015), www.geek.com/news/nafta-superhighway-for-driverless-trucks-could-become-a-reality-1623558/, last visited December 15, 2017.

Matthews, William, 'Murky Waters: Seagoing Drones Swim Into New Legal and Ethical Territory', (9 April 2013) Defense News, http://auvac.org/newsitems/view/2015-sthash.LDZvtqPG.dpuf, last visited December 15, 2017.

Matthias, Andreas, 'The Responsibility Gap: Ascribing Responsibility for the Actions of Learning Automata', (2004) 6 Ethics and Information Technology, 175-183.

Matthias, Andreas, 'Automaten als Träger von Rechten', 2nd edition (Logos Verlag, 2010).

McCaney, Kevin, 'Navy puts autonomous 'swarmboats' into action', (5 October 2014), https://defensesystems.com/articles/2014/10/05/onr-navy-autonomous-swarm-boats.aspx, last visited December 15, 2017.

McClelland, Justin, 'The Review of Weapons in Accordance with Article 36 of Additional Protocol I', (2003) 85 International Review of the Red Cross, 397-420.

McCormack, Timothy L.H., 'A non liquet on nuclear weapons - The ICJ avoids the application of general principles of international humanitarian law', (1997) 316 International Review of the Red Cross, 76-91.

McFarland, Tim and McCormack, Tim, 'Mind the Gap: Can Developers of Autonomous Weapons Systems be Liable for War Crimes?', (2014) 90 International Law Studies, 361-385.

McLaughlin, Rob, 'Unmanned Naval Vehicles at Sea: USVs, UUVs, and the Adequacy of the Law', (2012) 21 Journal of Law, Information and Science, 100-117.

McLeish, Caitríona, 'Experiences from the CBW regime in dealing with the problem of dual use', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

McNeal, Gregory S., 'Targeted Killing and Accountability', (2014) 102 Georgetown Law Journal, 681-794.

McNevin, Ambrose, 'Royal Navy to explore artificial intelligence use in battle threat assessment', (17 October 2016), www.cbronline.com/4th-revolution/royal-navy-explore-artificial-intelligence-use-battle-threat-assessment/, last visited December 15, 2017.

Meier, Michael W., 'U.S. Delegation Opening Statement', (11 April 2016) The Convention on Certain Conventional Weapons (CCW) Third Informal Meeting of Experts on Lethal Autonomous Weapons Systems, www.unog.ch/80256EDD006B8954/(httpAssets)/EFF7036380934E5EC1257F920057989A/$file/2016_LAWS+MX_GeneralExchange_Statements_United+States.pdf, last visited December 15, 2017.

Meier, Michael W., 'U.S. Delegation Opening Statement', (13 April 2015) The Convention on Certain Conventional Weapons (CCW) Informal Meeting of Experts on Lethal Autonomous Weapons Systems, www.unog.ch/80256EDD006B8954/(httpAssets)/8B33A1CDBE80EC60C1257E2800275E56/$file/2015_LAWS_MX_USA+bis.pdf, last visited December 15, 2017.

Melzer, Nils, 'Targeted Killing in International Law', (Oxford University Press, 2008).

Melzer, Nils, 'Interpretative Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law', (2009), www.icrc.org/eng/assets/files/other/icrc-002-0990.pdf, last visited December 15, 2017.

Meron, Theodor, 'The Martens Clause, Principles of Humanity, and Dictates of Public Conscience', (2000) 94 The American Journal of International Law, 78-89.

Mettraux, Guénaël, 'The Law of Command Responsibility', (Oxford University Press, 2009).

Millar, Jason, 'Meaningful Human Control and Dual-Use Technologies', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Mines Action Canada, 'Statement to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (13 April 2016), www.unog.ch/80256EDD006B8954/(httpAssets)/B07EB66CCB57B877C1257F9B004FF6C9/$file/2016_LAWS+MX+ChallengestoIHL_Statement_MineActionCanada.pdf, last visited December 15, 2017.

Mines Action Canada, 'Lessons from Protocol IV on Blinding Laser Weapons for the Current Discussions about Autonomous Weapons - A Memorandum to CCW Delegates', (2014), https://bankillerrobotscanada.files.wordpress.com/2014/05/international-piv-memo-final.pdf, last visited December 15, 2017.

Moore, Gordon E., 'Cramming more components onto integrated circuits', (1965) reprinted in: 86 Proceedings of the IEEE, No. 1 (1998), 114-117.

National Conference of State Legislatures, 'Autonomous Vehicles Enacted Legislation', (23 October 2017), www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx, last visited December 15, 2017.

Naval Technology, 'Long Range Anti-Ship Missile (LRASM), United States of America', www.naval-technology.com/projects/long-range-anti-ship-missile/, last visited December 15, 2017.

New York City Bar Association - Committee on International Law, 'The Legality under International Law of Targeted Killings by Drones launched by the United States', (June 2014), http://bit.ly/1lKPEuV, last visited December 15, 2017.

Norris, Andrew, 'Legal Issues Relating to Unmanned Maritime Systems', (2013) US Naval War College Monograph, www.hsdl.org/?view&did=731705, last visited December 15, 2017.

Northrop Grumman Corporation, 'X-47B UCAS', (2014), www.northropgrumman.com/Capabilities/X47BUCAS/Pages/default.aspx, last visited December 15, 2017.

O'Kane, Sean, 'California gives Nvidia the go-ahead to test self-driving cars on public roads', (9 December 2016), www.theverge.com/2016/12/9/13902704/california-dmv-permit-nvidia-autonomous-car-testing, last visited December 15, 2017.

Obama, Barack, 'Obama's Speech on Drone Policy', (23 May 2013) International New York Times, http://nyti.ms/10OWqpi, last visited December 15, 2017.

Omicini, Andrea, 'The Distributed Autonomy - Software Abstractions and Technologies for Autonomous Systems', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Osborn, Kris, 'Navy LRASM Missile Destroys Enemy Targets Semi-Autonomously; Lockheed Tests Ship-Fired Variant', (19 January 2016), www.scout.com/military/warrior/story/1634058-navy-weapon-destroys-target-semi-autonomously, last visited December 15, 2017.

Otto, Roland, 'Targeted Killings and International Law – With Special Regard to Human Rights and International Humanitarian Law', (Springer, 2012).

Parks, W. Hays, 'The Protection of Civilians from Air Warfare', (1997) 27 Israel Yearbook on Human Rights, 65-111.

Parks, W. Hays, 'Means and Methods of Warfare', (2006) 38 George Washington International Law Review, 1-15.

Parks, W. Hays, 'Conventional Weapons and Weapons Reviews', in T. McCormack and A. McDonald (eds.), Yearbook of International Humanitarian Law (T.M.C. Asser Press, 2007).

Paust, Jordan J., 'Use of Armed Force Against Terrorists in Afghanistan, Iraq, and Beyond', (2002) 35 Cornell International Law Journal, 533-557.

Paust, Jordan J., 'Self-Defense, Targeting of Non-State Actors and Permissibility of U.S. Force of Drones in Pakistan', (2009) 19 Journal of Transnational Law and Policy, 237-280.

Pejic, Jelena, 'Extraterritorial targeting by means of armed drones: Some legal implications', (2015) International Review of the Red Cross, www.icrc.org/en/document/jelena-pejic-extraterritorial-targeting-means-armed-drones-some-legal-implications, last visited December 15, 2017.

Pellerin, Cheryl, 'Eight teams earn DARPA funds for 2014 robotics finals', (27 December 2013), www.army.mil/article/117612/eight_teams_earn_darpa_funds_for_2014_robotics_finals, last visited December 15, 2017.

People’s Republic of China, 'Statement by China to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016), www.unog.ch/80256EE600585943/(httpPages)/37D51189AC4FB6E1C1257F4D004CAFB2, last visited December 15, 2017.

Permanent Mission of Sweden to the United Nations, 'Challenges to International Humanitarian Law', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/0B1EA0C5D19CEE25C1257F9B0051BB4E/$file/2016_LAWS+MX+ChallengestoIHL_Statement_Sweden.pdf, last visited December 15, 2017.

Pisillo-Mazzeschi, Riccardo, 'The Due Diligence Rule and the Nature of the International Responsibility of States', (1992) German Yearbook of International Law, 9-51.

Pomerleau, Mark, 'DOD plans to invest $600M in unmanned underwater vehicles', (4 February 2016), https://defensesystems.com/articles/2016/02/04/dod-navy-uuv-investments.aspx, last visited December 15, 2017.

Prabhakar, Arati, 'IBGC Speaker Series and the Tufts Institute for Innovation: “Technologies to Bend the Arc of the Future”', (31 March 2016), http://fletcher.tufts.edu/Calendar/2016/03/31/Technologies-to-Bend-the-Arc-of-the-Future-with-Dr-Arati-Prabhakar-of-DARPA.aspx, last visited December 15, 2017.

Quintana, Elizabeth, 'Operational Considerations for LAWS', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Richter, Wolfgang, 'Military Rationale for Autonomous Functions in Weapons Systems (AWS)', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Rickli, Jean-Marc, 'Some Considerations of the Impact of LAWS on International Security: Strategic Stability, Non-State Actors and Future Prospects', (16 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Riza, M. Shane, 'Killing without a Heart. Limits on Robotic Warfare in an Age of Persistent Conflict', (Potomac Books, 2013).

Roff, Heather, 'Strategic, Operational and Tactical Considerations for LAWS', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Roff, Heather, 'Killing in War: Responsibility, Liability and Lethal Autonomous Robots', (2013), www.academia.edu/2606840/Killing_in_War_Responsibility_Liability_and_Lethal_Autonomous_Robots, last visited December 15, 2017.

Rosen, Frederik, 'Extremely Stealthy and Incredibly Close: Drones, Control and Legal Responsibility', (2014) Journal of Conflict and Security Law, 113-131.

Rowe, Peter, 'Members of the Armed Forces and Human Rights Law', in A. Clapham and P. Gaeta (eds.), The Oxford Handbook of International Law in Armed Conflict (Oxford University Press, 2014).

Rubenstein, Michael; Cornejo, Alejandro and Nagpal, Radhika, 'Programmable self-assembly in a thousand-robot swarm', (2014) 345 Science, 795-799.

Russell, Stuart, 'Artificial Intelligence: Implications for Autonomous Weapons', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Sandoz, Yves; Swinarski, Christophe and Zimmermann, Bruno, 'Commentary on the Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I)', (Springer, 1987).

Sanger, David E. and Schmitt, Eric, 'Russian Ships Near Data Cables Are Too Close for U.S. Comfort', (25 October 2015) International New York Times, http://nyti.ms/1kFmzsA, last visited December 15, 2017.

Sartor, Giovanni, 'Liabilities for autonomous systems in the civil domain', (15 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Sassòli, Marco, 'Le DIH, une lex specialis par rapport aux droits humains?', in A. Auer, A. Flückiger and M. Hottelier (eds.), Les droits de l'homme et la constitution, Etudes en l'honneur du Professeur Giorgio Malinverni (Schulthess, 2007).

Sassòli, Marco, 'Is increasing dehumanization leading to less humanity in warfare?', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

Sassòli, Marco, 'State responsibility for violations of international humanitarian law', (2002) 84 International Review of the Red Cross, 401-434.

Sassòli, Marco, 'Autonomous Weapons – Potential advantages for the respect of international humanitarian law', (2013) Professionals in Humanitarian Assistance and Protection (PHAP), 1-5.

Sauer, Frank, 'Autonomous Weapons Systems. Humanising or Dehumanising Warfare?', (2014) 4 Global Governance Spotlight, 1-4.

Scharioth, Klaus, 'Making ESDP strong will strengthen NATO and the Transatlantic Partnership', in E. Brimmer (ed.), The EU’s Search for a Strategic Role - ESDP and Its Implications for Transatlantic Relations (Center for Transatlantic Relations, 2002), 165-174.

Scharre, Paul, 'The Lethal Autonomous Weapons Governmental Meeting (Part I: Coping with Rapid Technological Change)', (9 November 2017), www.justsecurity.org/46889/lethal-autonomous-weapons-governmental-meeting-part-i-coping-rapid-technological-change/, last visited December 15, 2017.

Scharre, Paul, 'Presentation at the UN CCW - Technical Issues I', (13 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Scharre, Paul, 'Lethal Autonomous Weapons and Policy-Making Amid Disruptive Technological Change', (14 November 2017), www.justsecurity.org/47082/lethal-autonomous-weapons-policy-making-disruptive-technological-change/, last visited December 15, 2017.

Scharre, Paul, 'Army of None: Autonomous Weapons and the Future of War', (forthcoming 2018).

Scharre, Paul and Burg, Daniel, 'To Save Money, Go Unmanned', (22 October 2014) War on the Rocks, http://warontherocks.com/2014/10/to-save-money-go-unmanned/, last visited December 15, 2017.

Schmitt, Michael, 'Regulating Autonomous Weapons Might be Smarter Than Banning Them', (10 August 2015) Just Security, www.justsecurity.org/25333/regulating-autonomous-weapons-smarter-banning/, last visited December 15, 2017.

Schmitt, Michael N., 'Unmanned Combat Aircraft Systems and International Humanitarian Law: Simplifying the oft benighted Debate', (2012) 30 Boston University International Law Journal, 595-619.

Schmitt, Michael N., 'Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics', (2013) Harvard National Security Journal Features, http://harvardnsj.org/2013/02/autonomous-weapon-systems-and-international-humanitarian-law-a-reply-to-the-critics/, last visited December 15, 2017.

Schmitt, Michael N. and Thurnher, Jeffrey S., 'Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict', (2013) 4 Harvard National Security Journal, 231-281.

Seiring, Olaf, 'Der Einsatz unbemannter Flugsysteme in nicht internationalen bewaffneten Konflikten', (Berliner Wissenschafts-Verlag, 2016).

Sharkey, Noel, 'Automating Warfare: Lessons Learned from the Drones', (2011) 21 Journal of Law, Information and Science, 140-154.

Sharkey, Noel, 'Killing Made Easy: From Joysticks to Politics', in P. Lin, K. Abney and G. A. Bekey (eds.), Robot Ethics - The Ethical and Social Implications of Robotics (MIT Press, 2012), 111-128.

Sharkey, Noel E., 'The evitability of autonomous robot warfare', (2012) 94 International Review of the Red Cross, 787-799.

Siegrist, Michael, 'A purpose-oriented working definition for autonomous weapons systems', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/558F0762F97E8064C1257F9B0051970A/$file/2016.04.13+LAWS+Legal+Session+(as+read).pdf, last visited December 15, 2017.

Simon, Bob, 'Will Israel’s “Iron Dome” help bring peace?', (17 February 2013), www.cbsnews.com/news/will-israels-iron-dome-help-bring-peace/, last visited December 15, 2017.

Simon, Matt, 'Boston Dynamics’ New Rolling, Leaping Robot Is an Evolutionary Marvel', (3 January 2017), www.wired.com/2017/03/boston-dynamics-new-rolling-leaping-robot-evolutionary-marvel/, last visited December 15, 2017.

Simon-Michel, Jean-Hugues, 'Report of the 2014 informal Meeting of Experts on LAWS', (10 June 2014), www.unog.ch/80256EDD006B8954/(httpAssets)/350D9ABED1AFA515C1257CF30047A8C7/$file/Report_AdvancedVersion_10June.pdf, last visited December 15, 2017.

Singer, Peter W., 'Corporate Warriors - The Rise of the Privatized Military Industry', (Cornell University Press, 2003).

Singer, Peter W., 'Wired for War - The Robotics Revolution and Conflict in the Twenty-first Century', (Penguin Books, 2009).

Singer, Peter W., 'Der ferngesteuerte Krieg', (2010) Spektrum der Wissenschaft, 1-2.

Smalley, David, 'LOCUST: Autonomous, swarming UAVs fly into the future', (14 April 2015), www.onr.navy.mil/Media-Center/Press-Releases/2015/LOCUST-low-cost-UAV-swarm-ONR.aspx, last visited December 15, 2017.

Snapper, John W., 'Responsibility for Computer Based Errors', (1985) 16 Metaphilosophy, 289-295.

Sparrow, Robert, 'Killer Robots', (2007) 24 Journal of Applied Philosophy, 62-77.

Stone, Richard, 'Scientists Campaign Against Killer Robots', (2013) 342 Science, 1428-1429.

Strüwer, Elisabeth, 'Zum Zusammenspiel von humanitärem Völkerrecht und Menschenrechten am Beispiel des Targeted Killing', (Peter Lang, 2009).

Tarzwell, Amanda, 'In Search of Accountability: Attributing the Conduct of Private Security Contractors to the United States Under the Doctrine of State Responsibility', (2009) Oregon Review of International Law, 179-204.

Tegmark, Max, 'Life 3.0: Being Human in the Age of Artificial Intelligence', (Knopf, 2017).

Thomas, Daniel, 'Driverless convoy: Will truckers lose out to software?', (26 May 2015) BBC News, www.bbc.com/news/business-32837071, last visited December 15, 2017.

Thrun, Sebastian, Montemerlo, Mike, Dahlkamp, Hendrik, Stavens, David, Aron, Andrei, Diebel, James, Fong, Philip, Gale, John, Halpenny, Morgan, Hoffmann, Gabriel, Lau, Kenny, Oakley, Celia, Palatucci, Mark, Pratt, Vaughan and Stang, Pascal, 'Stanley: The Robot that Won the DARPA Grand Challenge', (2006) 23 Journal of Field Robotics, 661-692.

Thurnher, Jeffrey, 'Unmanned (Maritime) Systems and Precautions in Attack', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (13-14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

Thurnher, Jeffrey S., 'The Law That Applies to Autonomous Weapon Systems', (2013) 17 American Society of International Law Insights, www.asil.org/insights/volume/17/issue/4/law-applies-autonomous-weapon-systems, last visited December 15, 2017.

TIME, 'Artificial Intelligence: The Future of Humankind', (Time, 2017).

Tonkin, Hannah, 'State Control over Private Military and Security Companies in Armed Conflict', (Cambridge University Press, 2013).

Tran, Mark, 'Iron Dome: Israel's 'game-changing' missile shield', (9 July 2014), www.theguardian.com/world/2014/jul/09/iron-dome-gaza-israel-air-defence-missile, last visited December 15, 2017.

Triffterer, Otto and Ambos, Kai, 'The Rome Statute of the International Criminal Court: A Commentary', 3rd edition (Beck/Hart, 2016).

Turse, Nick and Engelhardt, Tom, 'Terminator Planet: The First History of Drone Warfare, 2001-2050', (CreateSpace Independent Publishing Platform, 2012).

United Kingdom Ministry of Defence, 'Unmanned aircraft systems (Joint Doctrine Publication (JDP) 0-30.2)', (12 September 2017), www.gov.uk/government/publications/unmanned-aircraft-systems-jdp-0-302, last visited December 15, 2017.

United Kingdom Ministry of Defence, 'The UK Approach to Unmanned Aircraft Systems (UAS)', (30 March 2011), www.gov.uk/government/publications/jdn-2-11-the-uk-approach-to-unmanned-aircraft-systems, last visited December 15, 2017.

United Kingdom Ministry of Defence, 'UK Weapon Reviews', (8 March 2016) Development, Concepts and Doctrine Centre, www.gov.uk/government/uploads/system/uploads/attachment_data/file/507319/20160308-UK_weapon_reviews.pdf, last visited December 15, 2017.

United Kingdom, 'Statement to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016), www.unog.ch/80256EDD006B8954/(httpAssets)/49456EB7B5AC3769C1257F920057D1FE/$file/2016_LAWS+MX_GeneralExchange_Statements_United+Kingdom.pdf, last visited December 15, 2017.

United Kingdom, 'Statement at the CCW Meeting of Experts on Lethal Autonomous Weapon Systems', (13 April 2015), www.unog.ch/80256EDD006B8954/(httpAssets)/1CBF996AF7AD10E2C1257E260060318A/$file/2015_LAWS_MX_United+Kingdom.pdf, last visited December 15, 2017.

United Kingdom, 'Possible Challenges to IHL due to Increasing Degrees of Autonomy', (13 April 2016) 2016 CCW Meeting of Experts on LAWS, www.unog.ch/80256EDD006B8954/(httpAssets)/37B0481990BC31DAC1257F940053D2AE/$file/2016_LAWS+MX_ChallengestoIHL_Statements_United+Kingdom.pdf, last visited December 15, 2017.

United Nations Institute for Disarmament Research, 'The Weaponization of Increasingly Autonomous Technologies in the Maritime Environment: Testing the Waters', (2015) UNIDIR Resources No. 4, www.unidir.org/files/publications/pdfs/testing-the-waters-en-634.pdf, last visited December 15, 2017.

United Nations News Centre, 'UN human rights expert questions targeted killings and use of lethal force', (20 October 2011), www.un.org/apps/news/story.asp?NewsID=40136#.WhH9A7HMy2x, last visited December 15, 2017.

United Nations Office at Geneva, '2017 Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS)', (2017), www.unog.ch/80256EE600585943/ ment, last visited December 15, 2017.

United States Air Force, 'Unmanned Aircraft Systems Flight Plan 2009-2047', (Washington D.C., 2009), fas.org/irp/program/collect/uas_2009.pdf, last visited December 15, 2017.

United States Air Force Office of the Chief Scientist, 'Autonomous Horizons: System Autonomy in the Air Force - A Path to the Future (AF/ST TR 15-01)', (June 2015) I: Human-Autonomy Teaming, www.af.mil/Portals/1/documents/SECAF/AutonomousHorizons.pdf?timestamp=1435068339702, last visited December 15, 2017.

United States Department of Defense, 'Directive Number 3000.09: Autonomy in Weapons Systems', (21 November 2012), www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf, last visited December 15, 2017.

United States Department of Defense, 'Unmanned Systems Integrated Roadmap FY2013-2038, Reference Number: 14-S-0553', (2013), www.defense.gov/pubs/DOD-USRM-2013.pdf, last visited December 15, 2017.

United States Department of Defense, 'Successful Micro-Drone Demonstration - Press Release No: NR-008-17', (9 January 2017), www.defense.gov/News/News-Releases/News-Release-View/Article/1044811/department-of-defense-announces-successful-micro-drone-demonstration, last visited December 15, 2017.

United States Department of Defense, 'Report of the Defense Science Board Task Force on Developmental Test and Evaluation', (2008), www.acq.osd.mil/dsb/reports/ADA482504.pdf, last visited December 15, 2017.

United States Department of the Navy, 'The Navy Unmanned Undersea Vehicle (UUV) Master Plan', (9 November 2004), www.navy.mil/navydata/technology/uuvmp.pdf, last visited December 15, 2017.

United States Department of the Navy, 'AEGIS Weapon System (AWS)', (26 January 2017), www.navy.mil/navydata/fact_display.asp?cid=2100&tid=200&ct=2, last visited December 15, 2017.

United States Department of the Navy, 'U.S. Navy Fact Sheet: MK 15 - Phalanx Close-In Weapons System (CIWS)', (15 November 2013), www.navy.mil/navydata/fact_print.asp?cid=2100&tid=487&ct=2&page=1, last visited December 15, 2017.

United States Navy Program Executive Office Littoral Combat Ships, 'Large Displacement Unmanned Underwater Vehicle Program Achieves Acquisition Milestone', (3 September 2015) NNS150903-22, www.navy.mil/submit/display.asp?story_id=90932, last visited December 15, 2017.

United States Naval Sea Systems Command Office of Corporate Communications, 'Navy Holds Industry Day on Large Displacement Unmanned Undersea Vehicle Program', (6 October 2016) NNS161006-09, www.navy.mil/submit/display.asp?story_id=97063, last visited December 15, 2017.

United States Office of Naval Research, 'LOCUST: Autonomous, swarming UAVs fly into the future', (14 April 2015), www.onr.navy.mil/Media-Center/Press-Releases/2015/LOCUST-low-cost-UAV-swarm-ONR.aspx, last visited December 15, 2017.

United States Office of the General Counsel Department of Defense, 'Department of Defense Law of War Manual', (12 June 2015).

United States Secretary of Defense, 'Department of Defense Policy on Blinding Lasers, SECDEF Memo U00888/97, 17 January 1997', (1997) reprinted in: Annotated Supplement to the Commander's Handbook on the Law of Naval Operations, Naval War College, www.jag.navy.mil/distrib/instructions/AnnotatedHandbkLONO.pdf, last visited December 15, 2017.

United States Secretary of the Navy, 'Department of the Navy implementation and operation of the Defense Acquisition System and the Joint Capabilities Integration and Development System - SECNAV INSTRUCTION 5000.2E', (1 September 2011), www.public.navy.mil/cotf/OTD/SECNAVINST 5000.2E.pdf, last visited December 15, 2017.

Vanderhaegen, Frédéric, 'Les dissonance peuvent-elles affecter la résilience des systèmes autonomes', (14 April 2015) CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), www.unog.ch/80256EE600585943/(httpPages)/6CE049BE22EC75A2C1257C8D00513E26?OpenDocument, last visited December 15, 2017.

Vyver, Johan D. Van der, 'International Criminal Court and the Concept of Mens Rea in International Criminal Law', (2004) 12 University of Miami International and Comparative Law Review, 57-151.

Wagner, Markus, 'Taking Humans out of the Loop: Implications for International Humanitarian Law', (2011) 21 Journal of Law, Information and Science, 155-165.

Wagner, Markus, 'Autonomy in the Battlespace: Independently Operating Weapon Systems and the Law of Armed Conflict', (2013) International Humanitarian Law and the Changing Technology of War, 99-122.

Watts, Sean, 'Regulation-Tolerant Weapons, Regulation-Resistant Weapons and the Law of War', (2015) 91 International Law Studies, 540-621.

Wein, Leon E., 'The Responsibility of Intelligent Artifacts: Towards an Automation Jurisprudence', (1992) 6 Harvard Journal of Law and Technology, 103-154.

Weingärtner, Dieter, 'Dehumanization: Legal aspects from a practitioner’s perspective', International Conference on the Dehumanization of Warfare at European University Viadrina Frankfurt (14 February 2015), www.rewi.europa-uni.de/de/lehrstuhl/or/voelkerrecht/projekte/Tagung-DEHUM-2015/DEHUM-2015.html, last visited December 15, 2017.

Weizmann, Nathalie, 'Remotely Piloted Aircraft and International Law', in M. Aaronson and A. Johnson (eds.), Hitting the target? - How new capabilities are shaping international intervention (Stephen Austin & Sons, 2013) 33-44.

White House - Office of the Press Secretary, 'Fact Sheet: The 2015 National Security Strategy', (6 February 2015), www.whitehouse.gov/the-press-office/2015/02/06/fact-sheet-2015-national-security-strategy, last visited December 15, 2017.

White, Nigel D. and MacLeod, Sorcha, 'EU Operations and Private Military Contractors: Issues of Corporate and Institutional Responsibility', (2008) 19 European Journal of International Law, 965-988.

Wilson, William, 'Criminal Law – Doctrine and Theory', 4th edition (Longman, 2011).

Woedtke, Niclas von, 'Die Verantwortlichkeit Deutschlands für seine Streitkräfte im Auslandseinsatz und die sich daraus ergebenden Schadensersatzansprüche von Einzelpersonen als Opfer deutscher Militärhandlungen', (Duncker & Humblot, 2010).

World Economic Forum, 'What If: Robots Go to War?', (21 January 2016), www.weforum.org/events/world-economic-forum-annual-meeting-2016/sessions/what-if-robots-go-to-war, last visited December 15, 2017.

Yaron, Maya, 'Statement by Israel to the Informal Meeting of Experts on Lethal Autonomous Weapons Systems', (11 April 2016), www.unog.ch/80256EDD006B8954/(httpAssets)/A02C15B2E5B49AA1C1257F9B0029C454/$file/2016_LAWS_MX_GeneralDebate_Statements_Israel.pdf, last visited December 15, 2017.

Zacharias, Greg L., 'Presentation to the House of Representatives Armed Services Committee Subcommittee on Emerging Threats and Capabilities - Advancing the Science and Acceptance of Autonomy for Future Defense Systems', (November 2015), http://docs.house.gov/meetings/AS/AS26/20151119/104186/HHRG-114-AS26-Wstate-ZachariasG-20151119.pdf, last visited December 15, 2017.

Zaloga, Steven J., 'Unmanned Aerial Vehicles: Robotic Aerial Vehicles, 1917-2007', (Osprey Publishing, 2008).

VIII. Appendix: Curriculum Vitae

Fredrik von Bothmer

Personal information
Date of birth: July 18, 1984
Place of birth: Soltau, Germany
Nationality: German

Education
01.2014 – 09.2018  University of St. Gallen (HSG), Switzerland, PhD in Law
09.2015 – 05.2016  The Fletcher School of Law and Diplomacy, USA, Master in International Law (LL.M.)
03.2015 – 07.2015  New York University (NYU) School of Law, USA, Visiting Doctoral Researcher
10.2005 – 03.2013  Ludwig-Maximilians University, Germany, First State Examination in law

Work Experience
Since 10.2017      United Nations, Department of Political Affairs, USA, Associate Political Affairs Officer
04.2017 – 07.2017  MONUSCO, Democratic Republic of the Congo, DDR Officer (UNV)
10.2008 – 08.2013  Institute of International Law, LMU Munich, Germany, Research Assistant/Lecturer
11.2009 – 11.2016  Various internships with the German Federal Foreign Office and United Nations
