Understanding State-Sponsored Trolls on Twitter and Their Influence On
Total pages: 16 | File type: PDF, size: 1,020 KB
Recommended publications
How Russia Tried to Start a Race War in the United States
Michigan Journal of Race and Law, Volume 24 (2019). William J. Aceves, California Western School of Law. Recommended citation: William J. Aceves, Virtual Hatred: How Russia Tried to Start a Race War in the United States, 24 Mich. J. Race & L. 177 (2019), available at https://repository.law.umich.edu/mjrl/vol24/iss2/2.
Abstract: During the 2016 U.S. presidential election, the Russian government engaged in a sophisticated strategy to influence the U.S. political system and manipulate American democracy. While most news reports have focused on the cyber-attacks aimed at Democratic Party leaders and possible contacts between Russian officials and the Trump presidential campaign, a more pernicious intervention took place. Throughout the campaign, Russian operatives created hundreds of fake personas on social media platforms and then posted thousands of advertisements and messages that sought to promote racial divisions in the United States. This was a coordinated propaganda effort.
ASD-Covert-Foreign-Money.Pdf
Covert Foreign Money: Financial Loopholes Exploited by Authoritarians to Fund Political Interference in Democracies. August 2020. Authors: Josh Rudolph (Fellow for Malign Finance) and Thomas Morley (Research Assistant). © 2020 The Alliance for Securing Democracy. This publication can be downloaded for free at https://securingdemocracy.gmfus.org/covert-foreign-money/. The views expressed in GMF publications and commentary are the views of the authors alone.
Alliance for Securing Democracy: The Alliance for Securing Democracy (ASD), a bipartisan initiative housed at the German Marshall Fund of the United States, develops comprehensive strategies to deter, defend against, and raise the costs on authoritarian efforts to undermine and interfere in democratic institutions. ASD brings together experts on disinformation, malign finance, emerging technologies, elections integrity, economic coercion, and cybersecurity, as well as regional experts, to collaborate across traditional stovepipes and develop cross-cutting frameworks.
Taming the Trolls: The Need for an International Legal Framework to Regulate State Use of Disinformation on Social Media
Ashley C. Nicolas*
Introduction: Consider a hypothetical scenario in which hundreds of agents of the Russian GRU arrive in the United States months prior to a presidential election.1 The Russian agents spend the weeks leading up to the election going door to door in vulnerable communities, spreading false stories intended to manipulate the population into electing a candidate with policies favorable to Russian positions. The agents set up television stations, use radio broadcasts, and usurp the resources of local newspapers to expand their reach and propagate falsehoods. The presence of GRU agents on U.S. soil is an incursion into territorial integrity, a clear invasion of sovereignty.2 At every step, Russia would be required to expend tremendous resources, overcome traditional media barriers, and risk exposure, making this hypothetical grossly unrealistic. Compare the hypothetical with the actual actions of the Russians during the 2016 U.S. presidential election. Sitting behind computers in St. Petersburg, without ever setting foot in the United States, Russian agents were able to manipulate the U.S. population in the most sacred of domestic affairs: an election. Russian "trolls" targeted vulnerable populations through social media, reaching millions of users at a minimal cost and without reliance on established media institutions.3 Without using [...]
* Georgetown Law, J.D. expected 2019; United States Military Academy, B.S. 2009; Loyola Marymount University, M.Ed. 2016. © 2018, Ashley C. Nicolas. The author is a former U.S. Army Intelligence Officer.
Recent Trends in Online Foreign Influence Efforts
Diego A. Martin, Jacob N. Shapiro, Michelle Nedashkovskaya. Woodrow Wilson School of Public and International Affairs, Princeton University, Princeton, New Jersey, United States. E-mail: [email protected]; E-mail: [email protected]; E-mail: [email protected]
Abstract: Foreign governments have used social media to influence politics in a range of countries by promoting propaganda, advocating controversial viewpoints, and spreading disinformation. We analyze 53 distinct foreign influence efforts (FIEs) targeting 24 different countries from 2013 through 2018. FIEs are defined as (i) coordinated campaigns by one state to impact one or more specific aspects of politics in another state, (ii) through media channels, including social media, (iii) by producing content designed to appear indigenous to the target state. The objectives of such campaigns can be quite broad and to date have included influencing political decisions by shaping election outcomes at various levels, shifting the political agenda on topics ranging from health to security, and encouraging political polarization. We draw on more than 460 media reports to identify FIEs, track their progress, and classify their features.
Introduction: Information and Communications Technologies (ICTs) have changed the way people communicate about politics and access information on a wide range of topics (Foley 2004, Chigona et al. 2009). Social media in particular has transformed communication between leaders and voters by enabling direct politician-to-voter engagement outside traditional avenues, such as speeches and press conferences (Ott 2017). In the 2016 U.S. presidential election, for example, social media platforms were more widely viewed than traditional editorial media and were central to the campaigns of both Democratic candidate Hillary Clinton and Republican candidate Donald Trump (Enli 2017).
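The FIE definition above is, in effect, a small classification schema: each effort is coded by attacker, target, time span, media channels, and objectives. As a minimal sketch of how such a record might be encoded and queried (not taken from the paper; field names and sample records are purely hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class InfluenceEffort:
    """One foreign influence effort (FIE), coded along the three defining criteria."""
    attacker: str                                        # state running the coordinated campaign (criterion i)
    target: str                                          # state whose politics are targeted (criterion i)
    start_year: int
    end_year: int
    media_channels: list = field(default_factory=list)   # criterion ii, e.g. ["Twitter", "Facebook"]
    appears_indigenous: bool = True                       # criterion iii: content made to look domestic
    objectives: list = field(default_factory=list)        # e.g. ["election outcome", "polarization"]

# Illustrative records only -- not data from the study.
efforts = [
    InfluenceEffort("Russia", "United States", 2013, 2018,
                    ["Twitter", "Facebook"], True, ["election outcome", "polarization"]),
    InfluenceEffort("Russia", "Germany", 2015, 2017,
                    ["Twitter"], True, ["agenda shift: security"]),
]

# Example query: efforts active in 2016 whose objectives included shaping an election outcome.
active_2016 = [e for e in efforts
               if e.start_year <= 2016 <= e.end_year and "election outcome" in e.objectives]
print(len(active_2016))  # -> 1
```

A flat record format of this kind is enough to support the paper's descriptive statistics, such as counting efforts per target country or per objective over time.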
Fact Sheet for Reining in the Runet
Reining in the Runet: The Kremlin's Struggle to Control Cyberspace. Lincoln Pigman.
Since 2011–2012, when the combination of the Arab Spring and anti-government demonstrations in the Russian Federation left the country's political elites determined to bring the Russian internet, or Runet, under state control, Russia has witnessed the establishment of a domestic internet control regime encompassing four strategies of control in cyberspace. These include 1) restricting internet users' access to problematic content and information; 2) passively deterring online dissent by limiting internet users' anonymity; 3) actively deterring online dissent by threatening internet users with punitive sanctions; and 4) competing with and drowning out online dissent by covertly producing and disseminating pro-government content and information. This report provides an original framework for the study of Russia's evolving domestic internet control regime as well as a guide to understanding the online struggle between Russia's political elites and its non-systemic political opposition, an increasingly critical element of contemporary Russian politics.
Key insights:
• Punitive measures under the guise of countering "extremism" serve as the government's primary method of active deterrence.
• The Kremlin has systematically disseminated pro-government content designed to counter those critics undeterred by limited anonymity and the threat of state retribution.
• While Russia's political elites have demonstrated their intent to limit access to platforms benefiting the non-systemic opposition, they are simultaneously willing to engage on the very platforms they are trying to curtail.
• At the moment, Russia's primary strategy to restrict internet freedom focuses on restricting users' access to problematic content and information.
COVID-19 Effects and Russian Disinformation Campaigns
Homeland Security Affairs Journal, Special COVID-19 Issue. COVID-19 Effects and Russian Disinformation Campaigns. By Wesley R. Moy and Kacper Gradon.
Abstract: The effects of the novel coronavirus and its related disease COVID-19 have been far reaching, touching American society and its Western partners' health and mortality, economics, social relationships, and politics. The effects have highlighted longstanding societal divisions, disproportionately harming minority groups and the socioeconomically disadvantaged, with Black Americans and their communities hit especially hard. The problem being considered is the potential for Russian malign foreign influence to exploit the divides exacerbated by the coronavirus and target the United States and its European allies during the period leading to the 2020 elections and beyond. Possible coronavirus effects are inventoried and categorized into economic, healthcare, and environmental issue areas that may be leveraged and weaponized. The article includes the scenarios of such weaponization, including the description of focal points that may be attacked. We conclude with a set of recommendations to counter malign foreign influence.
Suggested citation: Moy, Wesley and Kacper Gradon. "COVID-19 Effects and Russian Disinformation." Homeland Security Affairs 16, Article 8 (December 2020), www.hsaj.org/articles16533.
Introduction: In late May 2020, the novel coronavirus (SARS-CoV-2) and its related illness COVID-19 had resulted in the deaths of more than 100,000 Americans and infected nearly 1.7 million. By mid-October 2020, deaths had increased to over 210,000 and nearly 8,000,000 had been infected.
Disinformation, 'Fake News' and Influence Campaigns on Twitter
October 2018. Matthew Hindman (George Washington University) and Vlad Barash (Graphika).
Contents: Executive Summary; Introduction; A Problem Both Old and New; Defining Fake News Outlets; Bots, Trolls and 'Cyborgs' on Twitter; Map Methodology; Election Data and Maps (Election Core Map, Election Periphery Map, Postelection Map); Fake Accounts From Russia's Most Prominent Troll Farm; Disinformation Campaigns on Twitter: Chronotopes (#NoDAPL, #WikiLeaks, #SpiritCooking, #SyriaHoax, #SethRich); Conclusion; Bibliography; Notes.
Executive summary: This study is one of the largest analyses to date on how fake news spread on Twitter both during and after the 2016 election campaign. Using tools and mapping methods from Graphika, a social media intelligence firm, we study more than 10 million tweets from 700,000 Twitter accounts that linked to more than 600 fake and conspiracy news outlets. Crucially, we study fake and conspiracy news both before and after the election, allowing us to measure how the fake news ecosystem has evolved since November 2016. Much fake news and disinformation is still being spread on Twitter. Consistent with other research, we find more than 6.6 million tweets linking to fake and conspiracy news publishers in the month before the 2016 election. Yet disinformation continues to be a substantial problem postelection, with 4.0 million tweets linking to fake and conspiracy news publishers found in a 30-day period from mid-March to mid-April 2017. Contrary to claims that fake news is a game of "whack-a-mole," more than 80 percent of the disinformation accounts in our election maps are still active as this report goes to press.
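The core measurement described above is a count of tweets whose links resolve to a fixed list of fake and conspiracy outlets within given time windows (the month before the 2016 election, and a 30-day window in March–April 2017). The following is a minimal sketch of that kind of counting, using made-up domains and tweet records rather than the report's Graphika-based pipeline or data:

```python
from datetime import date
from urllib.parse import urlparse

# Hypothetical outlet list and tweet records; neither reflects the study's actual data.
fake_outlets = {"example-fakenews.com", "conspiracy-daily.net"}

tweets = [
    {"account": "user1", "url": "http://example-fakenews.com/story1", "date": date(2016, 10, 12)},
    {"account": "user2", "url": "https://www.nytimes.com/article",    "date": date(2016, 10, 14)},
    {"account": "user3", "url": "http://conspiracy-daily.net/post",   "date": date(2017, 4, 1)},
]

def count_linking_tweets(tweets, outlets, start, end):
    """Count tweets dated within [start, end] whose link points to a listed outlet."""
    count = 0
    for t in tweets:
        domain = urlparse(t["url"]).netloc.lower()
        if domain.startswith("www."):          # normalize common prefix before matching
            domain = domain[4:]
        if domain in outlets and start <= t["date"] <= end:
            count += 1
    return count

# Windows analogous to those in the report: the month before the 2016 election,
# and a 30-day period from mid-March to mid-April 2017.
print(count_linking_tweets(tweets, fake_outlets, date(2016, 10, 8), date(2016, 11, 7)))  # -> 1
print(count_linking_tweets(tweets, fake_outlets, date(2017, 3, 15), date(2017, 4, 14)))  # -> 1
```

Comparing the two window counts on the full tweet corpus is what allows the before/after-election comparison the report emphasizes.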
Runet RIPE 79
Internet Governance in Russia: Trend on Sovereignization. Ilona Stadnik, Saint-Petersburg State University, Russia.
Cyberspace alignment to national borders instead of Internet fragmentation (Mueller 2017).
Theoretical framework (methods to implement alignment):
1. National securitization
• Reframing cybersecurity as a national security issue
• Militarization of cyberspace
• Nationalization of threat intelligence
• Reliance on national standards and technologies
• Reassertion of legal authority for network kill switches
2. Territorialization of information flows
• Content filtering
• Data localization
3. Efforts to structure control of critical Internet resources along national lines
National securitization: reframing cybersecurity as a national security issue (Doctrine on Information Security, 2000 and 2016); militarization of cyberspace ("information operations troops" since 2013); nationalization of threat intelligence (GOSSOPKA, NCCCI, and public/private CERTs); reliance on national standards and technologies (import substitution program for software, 2015); reassertion of legal authority for network kill switches (discourse of an external kill switch; local shutdowns of mobile Internet in the Ingushetia Republic and Moscow).
Territorialization of information flows: content filtering practices since 2012 through adoption of specific laws (child pornography, information promoting drugs and suicide, calls for mass riots, extremist activities, participation in mass public events that violate the established procedure, unlicensed content) – by a court decision or by request of federal [...]
Hacks, Leaks and Disruptions | Russian Cyber Strategies
Chaillot Paper Nº 148, October 2018. Hacks, Leaks and Disruptions: Russian Cyber Strategies. Edited by Nicu Popescu and Stanislav Secrieru, with contributions from Siim Alatalu, Irina Borogan, Elena Chernenko, Sven Herpig, Oscar Jonsson, Xymena Kurowska, Jarno Limnell, Patryk Pawlak, Piret Pernik, Thomas Reinhold, Anatoly Reshetnikov, Andrei Soldatov and Jean-Baptiste Jeangène Vilmer. European Union Institute for Security Studies, Paris. The views expressed in this Chaillot Paper are solely those of the authors and do not necessarily reflect the views of the Institute or of the European Union.
Contents: Executive summary; Introduction: Russia's cyber prowess – where, how and what for? (Nicu Popescu and Stanislav Secrieru); Russia's cyber posture: 1. Russia's approach to cyber: the best defence is a good offence (Andrei Soldatov and Irina Borogan); 2. Russia's trolling complex at home and abroad (Xymena Kurowska and Anatoly Reshetnikov); 3. Spotting the bear: credible attribution and Russian operations in cyberspace (Sven Herpig and Thomas Reinhold); 4. Russia's cyber diplomacy (Elena Chernenko); Case studies of Russian cyberattacks: 5. The early days of cyberattacks: the cases of Estonia, [...]
Breaking the Spin Cycle: Teaching Complexity in the Age of Fake News
Lane Glisson.
Abstract: This article describes a discussion-based approach for teaching college students to identify the characteristics of ethical journalism and scholarly writing, by comparing fake news with credible information in a strategically planned slideshow. Much has been written on the need to instruct our students about disinformation. This librarian shares a lesson plan that engages students' critical thinking skills by using a blend of humor, analysis, and a compelling visual presentation. The teaching method is contextualized by research on the distrust of the press and scientific evidence since the rise of hyper-partisan cable news, Russian troll farms, and alternative facts.
Introduction:
"Throughout our culture, the old notions of 'truth' and 'knowledge' are in danger of being replaced by the new ones of 'opinion,' 'perception' and 'credibility.'" Michio Kakutani1
"What if truth is not an absolute or a relative, but a skill—a muscle, like memory, that collectively we have neglected so much that we have grown measurably weaker at using it? How might we rebuild it, going from chronic to bionic?" Kevin Young2
In 2015, I knew I had a problem. After several years of teaching library instruction classes, I noticed that my conception of factuality and that of my students had diverged. Most students preferred Google and YouTube to do their research. When asked in my classes how they discerned the credibility of a website, most shrugged.
Identities and Their Discontents: YouTube as a Platform for Political Opposition in Contemporary Russia
By Theo Tindall. Submitted to Central European University, Department of Political Science, in partial fulfilment of the requirements for the degree of Master of Arts in Political Science. Supervisor: Professor András Bozóki. Advisor: Professor Alexandra Kowalski. Vienna, Austria (2021).
Abstract: This thesis will examine the role played by online media in the development of political opposition in contemporary Russia, focusing in particular on the role of YouTube as an alternative to traditional forms of mass media, and the way in which online political opposition interacts with authoritative constructions of Russian national and popular identities. By examining a range of theoretical approaches to nationalism and twentieth- and twenty-first-century mass media, this thesis will argue that identities should be understood as objects of discursive contestation which may be disputed or instrumentalised by opposition in order to undermine political authority, before exploring the implications of this argument in the context of contemporary Russian politics. In Russia, YouTube offers opposition a platform for the publication of independent content and a way of circumventing state controls on television and other traditional media. However, YouTube, which has been owned by Google since 2006, ensures that this national opposition must be articulated within the wider discursive structures of global capitalism. As such, even as YouTube provides an opportunity for the development and dissemination of politically oppositional material, this opposition is shaped by the conditions of its articulation within the globalised, profit-oriented space of YouTube. Russian YouTube is therefore characterised by the tension stemming from its location within these two competing authoritative discourses, which, even as they allow the development of political opposition, condition the forms it may take.
H-Diplo | ISSF POLICY Series America and the World—2017 and Beyond
Fractured: Trump's Foreign Policy after Two Years. Essay by David C. Hendrickson, Colorado College. Published on 29 January 2019 | issforum.org. Editor: Diane Labrosse. Web and Production Editor: George Fujii. Shortlink: http://tiny.cc/PR-1-5BN. Permalink: http://issforum.org/roundtables/policy/1-5BN-fractured. PDF URL: http://issforum.org/ISSF/PDF/Policy-Roundtable-1-5BN.pdf
The presidency of Donald Trump is the strangest act in American history; unprecedented in form, in style an endless sequence of improvisations and malapropisms.1 But in substance there is continuity, probably much more than is customarily recognized. It is hard to recognize the continuity, amid the daily meltdowns (and biennial shutdowns), but it exists. In large measure Trump has been a Republican president, carrying out a Republican agenda. His attack on the regulatory agencies follows a Republican script. His call for a prodigious boost to military spending, combined with sharp cuts in taxes, has been the Republican program since the time of Ronald Reagan's presidency. His climate skepticism corresponds with that of Republican leaders in Congress. On trade and immigration, Trump has departed most radically from Bush Republicanism, but even in that regard Trump's policies harken back to older traditions in the Grand Old Party. He is different in character and temperament from every Republican predecessor as president, yet has attached himself to much of the traditional Republican program.2 It is in foreign policy, the subject of this essay, where Trump's role has been most disorienting, his performance 'up-ending' in substance and method.