Democratic Defense Against Disinformation 2.0
Alina Polyakova and Daniel Fried

ISBN-13: 978-1-61977-592-3

This report is written and published in accordance with the Atlantic Council Policy on Intellectual Independence. The authors are solely responsible for its analysis and recommendations. The Atlantic Council and its donors do not determine, nor do they necessarily endorse or advocate for, any of this report's conclusions.

June 2019

TABLE OF CONTENTS

INTRODUCTION
    Framing Solutions (Against a Moving Target)
PROGRESS REPORT
    Europe Seeks Solutions
    The G7 in the Game
    The United States Tries to Get a Grip
    Social Media Companies on the Spot
    Civil Society Stays the Course
NEXT STEPS
    The US Executive Should Get Serious
    The US Congress Should Fill the Gap
    Europe Should Make it Real
    Social Media Companies Should Step Up
ABOUT THE AUTHORS

INTRODUCTION

This second edition of the paper, Democratic Defense Against Disinformation, seeks to capture the rapid development of policy responses to the challenge—especially by governments and social media companies—since initial publication in February 2018. The authors stand by the fundamentals of the earlier analysis and recommendations: that democratic societies can combat and mitigate the challenge of foreign disinformation while working within democratic norms and respect for freedom of expression. The first edition offered a "whole-of-society" vision, with policy suggestions for governments, social media companies, and civil-society groups. Since publication of the first edition, the European Union (EU) and, to a lesser degree, the US government have taken actions that parallel some of these suggestions. For their part, social media companies have moved from an initial and unsustainable denial of the problem to a stance of willingness to help deal with it, though the depth of this commitment (and the effectiveness of the responses) has yet to be determined.

Collectively, democracies have moved beyond "admiring the problem," meaning a sort of existential despair in the face of a new threat not easily managed. They have now entered a period of "trial and error," in which new ideas and solutions for countering, and building resilience against, disinformation are being tested, though unevenly and with setbacks. Meanwhile, the disinformation challenge has evolved and advanced, as state and nonstate actors deploy new technologies, develop new methods of exploitation, and adapt to responses.

Supply and demand of disinformation. Many of our recommendations address the "supply side" of disinformation; i.e., they recommended, and continue to recommend, policies and actions to limit the influx of disinformation into US and other media ecosystems. But tools to block disinformation will be imperfect, and some degree of disinformation will be part of the media landscape for the indefinite future. Addressing the "demand side" of disinformation—i.e., reducing general social acceptance of fabrications and distortions—is likely to be more important for sustained societal immunity. It is critical that governments, social media companies, and civil-society groups invest in long-term resilience against disinformation, including raising social awareness of disinformation and encouraging digital literacy education, including how to discern fabricated or deceptive content and sources.

Democratic Defense Against Disinformation 2.0 will review developments over the past year, assess their effectiveness thus far, and offer suggestions for next steps.

Framing Solutions (Against a Moving Target)

Vladimir Putin's Russia was perhaps first among major powers to deploy techniques of full-spectrum, state-sponsored disinformation for the digital age—the intentional spread of inaccurate information designed to influence societies. It will not be the last. Other state actors with perhaps greater capabilities, such as China, and nonstate actors, such as terrorist groups with a higher tolerance for risk, will adapt the disinformation toolkit to undermine democracies or are already doing so. There is nothing new about state propaganda and other means of political subversion ("active measures" was the term of art for Soviet efforts of this kind). But digital and social media, in combination with more traditional methods, offer new means to achieve traditional ends.

[Photo caption: Vladimir Putin's Russia was perhaps first among major powers to deploy techniques of full-spectrum, state-sponsored disinformation for the digital age. Photo Credit: The Russian Presidential Press and Information Office]

Russia's democratic and pro-Western neighbors, especially Ukraine, Georgia, and the Baltic states, have contended with Russian disinformation attacks for years. Other targets of state-sponsored disinformation campaigns—the United States and some Western European countries—woke up late to the challenge, with the United States doing so only after its 2016 presidential election, in which Russia played a large and malign role. The Department of Justice Special Counsel Report[1] and two independent reports, prepared for the US Senate's Select Committee on Intelligence and published in December 2018, detail Russia's disinformation tactics during and after the 2016 US elections, including by the Russian-government-supported Internet Research Agency (IRA), the now-notorious St. Petersburg troll farm.[2] The February 2018 Department of Justice indictment of thirteen Russian operatives involved in the IRA information operations provides the most in-depth research to date about the internal machinery of the Russian operation.[3]

Exposure is not enough. Disinformation campaigns are not going away. Quite the opposite—other malicious state actors with an interest in undermining democracies, including Iran, North Korea, and China, are learning from Russian tactics. Meanwhile, the tools of attack are evolving and adapting to democratic responses. State-sponsored disinformation campaigns aim to amplify existing social divisions and further polarize democratic societies. As such, they don't stop when the ballot box closes. Still, due to the high level of attention and consequential outcomes, elections provide an ideal high-impact opportunity for this type of influence operation. Ahead of elections throughout Europe and North America in 2019 and 2020, governments, social media companies, and civil-society groups must learn from each other and accelerate the implementation of best practices to defend against disinformation.

Democracies are learning that means of defense and norms of resilience applicable to traditional propaganda and subversion are inadequate to meet the present danger. Also, disinformation techniques will continue to evolve. For example, innovation in artificial intelligence (AI) is producing "deepfakes" and other "synthetic media" products—video and audio manipulation with the capability to manufacture the appearance of reality, such as nonexistent, but real-looking, remarks by a political leader. As these tools become more low-cost and accessible, they will become perfect weapons for information warfare.[4] Such technologies could drive the next great leap in AI-driven disinformation.[5] More generally, disinformation techniques are shifting from the use of simple automated bots to more sophisticated interaction with (and manipulation of) domestic groups, extremist and otherwise, through various forms of impersonation and amplification of organic posts by domestic actors. Thus, it may be increasingly difficult to disentangle foreign-origin disinformation from domestic social media conversations. Rather than trying to break through the noise, the new strategy aims to blend in with the noise—obfuscating manipulative activity and blurring the line between authentic and inauthentic content.

The first edition of Democratic Defense Against Disinformation offered recommendations on ways democratic governments and free societies can combat disinformation, while respecting the norms and values of free expression and the rule of law.[6] As democratic countries learned in the struggle against Soviet communism, we need not become them to fight them: democratic societies should not fight propaganda with propaganda,

[1] Robert S. Mueller, III, "Report On The Investigation Into Russian Interference In The 2016 Presidential Election" (US Department of Justice, Washington, DC, 2019), https://www.justice.gov/storage/report.pdf.
[2] Renee DiResta et al., The Tactics & Tropes of the Internet Research Agency, New Knowledge, December 17, 2018, https://cdn2.hubspot.net/hubfs/4326998/ira-report-rebrand_FinalJ14.pdf; Philip N. Howard et al., The IRA and Political Polarization in the United States, Oxford Internet Institute, August 22, 2018, https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2018/12/The-IRA-Social-Media-and-Political-Polarization.pdf.
[3] US Department of Justice, "Internet Research Agency Indictment" (US Department of Justice, Washington, DC, 2018), https://www.justice.gov/file/1035477/download.
[4] Alina Polyakova, Weapons of the weak: Russia and AI-driven asymmetric warfare, Brookings Institution, November 15, 2018, https://www.brookings.edu/research/weapons-of-the-weak-russia-and-ai-driven-asymmetric-warfare/.
[5] Alina Polyakova and Spencer Phipps Boyer, The future of political warfare: Russia, the