Artificial Intelligence, Strategic Stability and Nuclear Risk


Artificial Intelligence, Strategic Stability and Nuclear Risk
SIPRI Policy Paper
Vincent Boulanin, Lora Saalman, Petr Topychkanov, Fei Su and Moa Peldán Carlsson
June 2020

STOCKHOLM INTERNATIONAL PEACE RESEARCH INSTITUTE

SIPRI is an independent international institute dedicated to research into conflict, armaments, arms control and disarmament. Established in 1966, SIPRI provides data, analysis and recommendations, based on open sources, to policymakers, researchers, media and the interested public. The Governing Board is not responsible for the views expressed in the publications of the Institute.

GOVERNING BOARD
Ambassador Jan Eliasson, Chair (Sweden)
Dr Vladimir Baranovsky (Russia)
Espen Barth Eide (Norway)
Jean-Marie Guéhenno (France)
Dr Radha Kumar (India)
Ambassador Ramtane Lamamra (Algeria)
Dr Patricia Lewis (Ireland/United Kingdom)
Dr Jessica Tuchman Mathews (United States)

DIRECTOR
Dan Smith (United Kingdom)

Signalistgatan 9
SE-169 72 Solna, Sweden
Telephone: +46 8 655 9700
Email: [email protected]
Internet: www.sipri.org

Contents

Preface v
Acknowledgements vi
Abbreviations vii
Executive Summary ix
1. Introduction 1
Box 1.1. Key definitions 6
2. Understanding the AI renaissance and its impact on nuclear weapons and related systems 7
I. Understanding the AI renaissance 7
II. AI and nuclear weapon systems: Past, present and future 18
Box 2.1. Automatic, automated, autonomous: The relationship between automation, autonomy and machine learning 15
Box 2.2. Historical cases of false alarms in early warning systems 20
Box 2.3. Dead Hand and Perimetr 22
Figure 2.1. A brief history of artificial intelligence 10
Figure 2.2. The benefits of machine learning 11
Figure 2.3. Approaches to the definition and categorization of autonomous systems 14
Figure 2.4. Benefits of autonomy 16
Figure 2.5. Foreseeable applications of AI in nuclear deterrence 24
3. AI and the military modernization plans of nuclear-armed states 31
I. The United States 33
II. Russia 44
III. The United Kingdom 52
IV. France 59
V. China 67
VI. India 78
VII. Pakistan 87
VIII. North Korea 93
Box 3.1. The artificial intelligence race 32
Figure 3.1. Recent policy developments related to artificial intelligence in the United States 34
Figure 3.2. Recent policy developments related to artificial intelligence in Russia 45
Figure 3.3. Recent policy developments related to artificial intelligence in the United Kingdom 53
Figure 3.4. Recent policy developments related to artificial intelligence in France 60
Figure 3.5. Recent policy developments related to artificial intelligence in China 68
Figure 3.6. Recent policy developments related to artificial intelligence in India 79
Figure 3.7. Recent policy developments related to artificial intelligence in Pakistan 88
Table 3.1. Applications of artificial intelligence of interest to the US Department of Defense 38
Table 3.2. State of adoption of artificial intelligence in the United States nuclear deterrence architecture 42
Table 3.3. State of adoption of artificial intelligence in the Russian nuclear deterrence architecture 50
Table 3.4. State of adoption of artificial intelligence in the British nuclear deterrence architecture 58
Table 3.5. State of adoption of artificial intelligence in the French nuclear deterrence architecture 66
Table 3.6. State of adoption of artificial intelligence in the Chinese nuclear deterrence architecture 77
Table 3.7. State of adoption of artificial intelligence in the Indian nuclear deterrence architecture 85
Table 3.8. State of adoption of artificial intelligence in the Pakistani nuclear deterrence architecture 92
Table 3.9. North Korean universities conducting research and studies in artificial intelligence 96
Table 3.10. State of adoption of artificial intelligence in the North Korean nuclear deterrence architecture 99
4. The positive and negative impacts of AI on strategic stability and nuclear risk 101
I. The impact on strategic stability and strategic relations 101
II. The impact on the likelihood of nuclear conflict: Foreseeable risk scenarios 113
Box 4.1. Machine learning and verification of nuclear arms control and disarmament: Opportunities and challenges 104
5. Mitigating the negative impacts of AI on strategic stability and nuclear risk 123
I. Mitigating risks: What, how and where 123
II. Possible technical and organizational measures for risk reduction 127
III. Possible policy measures for risk reduction 130
Figure 5.1. Risks and challenges posed by the use of artificial intelligence in nuclear weapons 124
6. Conclusions 136
I. Key findings 136
II. Recommendations 140
Figure 6.1. Possible risk reduction measures and how they can be implemented 142
Figure 6.2. Four key measures to deal with the negative impact of AI on strategic stability and nuclear risk 143
About the authors 144

Preface

The current period sees the post-cold war global strategic landscape in an extended process of redefinition. This is the result of a number of different trends. Most importantly, the underlying dynamics of world power have been shifting with the economic, political and military rise of China, the reassertion under President Vladimir Putin of a great power role for Russia, and the disenchantment expressed by the current United States administration with the international institutions and arrangements that the USA itself had a big hand in creating. As a result, the China–US rivalry has increasingly supplanted the Russian–US nuclear rivalry as the core binary confrontation of international politics. This pair of dyadic antagonisms is, moreover, supplemented by growing regional nuclear rivalries and strategic triangles in South Asia and the Middle East.
Against this increasingly toxic geopolitical background, the arms control framework created at the end of the cold war has deteriorated. Today, the commitment of the states with the largest nuclear arsenals to pursue stability through arms control and potentially disarmament is in doubt. The impact of coronavirus disease 2019 (COVID-19) is not yet clear but may well be a source of further unsettling developments. All of this is the volatile backdrop to considering the consequences of new technological developments for armament dynamics.

The world is going through a fourth industrial revolution, characterized by rapid advances in artificial intelligence (AI), robotics, quantum technology, nanotechnology, biotechnology and digital fabrication. The question of how these technologies will be used has not yet been answered in full detail. It is beyond dispute, however, that nuclear-armed states will seek to use these technologies for their national security.

The SIPRI project ‘Mapping the impact of machine learning and autonomy on strategic stability’ set out to explore the potential effect of AI exploitation on strategic stability and nuclear risk. The research team has used a region-by-region approach to analyze the impact that the exploitation of AI could have on the global strategic landscape. This report is the final publication of this two-year research project, funded by the Carnegie Corporation of New York; it presents the key findings and recommendations of the SIPRI authors derived from their research as well as from a series of regional and transregional workshops organized in Europe, East and South Asia and the USA. It follows and complements the trilogy of edited volumes that compile the perspectives of experts from these regions on the topic.
SIPRI commends this study to decision makers in the realms of arms control, defence and foreign affairs, to researchers and students in departments of politics, international relations and computer science, as well as to members of the general public who have a professional and personal interest in the subject.

Dan Smith
Director, SIPRI
Stockholm, June 2020

Acknowledgements

The authors would like to express their sincere gratitude to the Carnegie Corporation of New York for its generous financial support of the project. They are also indebted to all the experts who participated in the workshops and other events that SIPRI organized in Stockholm, Beijing, Colombo, New York, Geneva and Seoul. The content of this report reflects the contributions of this international group of experts. The authors also wish to thank the external reviewer, Erin Dumbacher, as well as SIPRI colleagues Sibylle Bauer, Mark Bromley, Tytti Erästö, Shannon Kile, Luc van de Goor, Pieter Wezeman and Siemon Wezeman for their comprehensive and constructive feedback. Finally, we would like to acknowledge the invaluable editorial work of David Cruickshank and the SIPRI editorial department. Responsibility for the views and information presented in this report lies entirely with the authors.

Abbreviations

A2/AD Anti-access/area-denial
AGI Artificial general intelligence
AI Artificial intelligence
ATR Automatic target recognition
AURA Autonomous Unmanned Research Aircraft
CAIR Centre for Artificial Intelligence and Robotics
CBM Confidence-building measure
CCW Certain Conventional Weapons Convention
CD Conference on Disarmament
DARPA Defense Advanced Research Projects Agency
DBN Deep belief network
DCDC Development, Concepts and Doctrine Centre
DGA Direction générale de l’armement (directorate general of armaments of France)
DIU Defense Innovation Unit
DOD Department of Defense
DRDO Defence Research and Development Organisation
GAN Generative adversarial network
ICBM Intercontinental ballistic missile