
Russian Information Operations

By Sarah Morrison
Student Number: 102128946
Doctor of Philosophy
Swinburne University of Technology
March 2021

Amended with Examiners' Feedback, July 2021


Abstract
Since the fall of the Soviet Union, Russia has increasingly engaged in information operations both internally and abroad. The current research examines the historical development of Russian information operations from 1989 to the present day, paying particular attention to the role technology, specifically the Internet and social media networks (SMNs), has played in developing these campaigns. The development of Russian information operations is a contemporary issue due to the perceived effect of Russian campaigns on targeted countries, such as the US during the 2016 presidential election.

The research question ‘How have Russian information operations developed since the collapse of the Soviet Union?’ is answered by examining various doctrines on information campaigns and the intersection of technology and society from both a Western and a Russian perspective, applying a qualitative study with quasi-quantitative methodologies. The primary source of data used throughout the research project was tweets and accounts propagated by the Russian Internet Research Agency (IRA), which Twitter compiled and released for research purposes.

The research demonstrates that Russian information operations continuously evolve and adapt to the successes and failures of deployed information campaigns. Furthermore, the Internet, particularly SMNs, has added a ‘sticky’ element to information operations: information spread on SMNs is more likely to stick with people than information spread through traditional media sources like newspapers and television. When looking specifically at the 2016 US presidential election, the research findings indicate that, contrary to allegations that Russia created disinformation to sow discord and interrupt the US democratic process throughout 2016, the IRA did not need to develop new disinformation to spread online. Instead, the IRA relied heavily on information already circulating on SMNs and in traditional media at the time, raising the question: where did the disinformation originate, if not from Russia?


Acknowledgement
I would like to thank my Principal Supervisor, Associate Professor James Martin, and my Co-Supervisor, Dr Belinda Barnett, for their support and assistance in producing this research. I would also like to thank my husband, David, and two children, Connor and Emily, for their patience and understanding.


Candidate Declaration
I declare that the following thesis contains no material that has been accepted for the award to the candidate of any other degree or diploma and, to the best of my knowledge, contains no material previously published or written by another person except where due reference is made in the text of the examinable outcome, and with permission received to republish the work in the thesis.

Sarah Morrison


Table of Contents

ABSTRACT ...... 2

ACKNOWLEDGEMENT ...... 3

CANDIDATE DECLARATION ...... 4

TABLE OF CONTENTS ...... 5

LIST OF TABLES ...... 9

CHAPTER ONE – INTRODUCTION ...... 11

1.1 INTRODUCTION ...... 11
1.2 BACKGROUND ...... 11
1.3 THE US AND RUSSIA – A SHAKY PAST ...... 13
1.4 THE COLD WAR, PROPAGANDA CAMPAIGNS AND THE BEGINNING OF INFORMATION WARFARE ...... 14
1.5 PRESENT DAY ...... 15
1.6 THE CURRENT STUDY ...... 17
1.7 THESIS STRUCTURE ...... 18
1.7.1 CHAPTER TWO: LITERATURE REVIEW ...... 18
1.7.2 CHAPTER THREE: METHODOLOGY ...... 18
1.7.3 CHAPTER FOUR: THE RISE OF THE KREMLIN TROLL ...... 18
1.7.4 CHAPTER FIVE: DEFINING INFORMATION IN INFORMATION OPERATIONS ...... 19
1.7.5 CHAPTER SIX: CRIMEA ...... 19
1.7.6 CHAPTER SEVEN: 2014: RUSSIA ATTACKS THE US ...... 19
1.7.7 CHAPTER EIGHT: THE 2016 US ELECTION ...... 19
1.7.8 CHAPTER NINE: A WORLD OF DISINFORMATION ...... 20
1.7.9 CHAPTER TEN: CONCLUSION ...... 20

CHAPTER TWO – LITERATURE REVIEW ...... 21

2.1 BACKGROUND ...... 21
2.2 DEFINING RUSSIA'S STRATEGY ...... 23
2.2.1 CURRENT WESTERN LITERATURE ...... 23
2.2.2 INFORMATION WARFARE ...... 26
2.2.3 RUSSIA'S USE OF CYBER TECHNOLOGY ...... 32
2.3 TROLL FARMS, TROLLS AND SOCIAL BOTS ...... 35
2.4 RUSSIA’S MODERN-DAY PROPAGANDA CAMPAIGNS ...... 38
2.4.1 THE RE-INVENTION OF SOVIET PROPAGANDA ...... 38
2.4.2 THE EFFECTIVENESS OF RUSSIA’S INFORMATION WARFARE AGAINST EUROPE AND NEIGHBOURING COUNTRIES ...... 42
2.5 RUSSIA'S NEW INFORMATION WARFARE AGAINST THE US ...... 44
2.5.1 RUSSIA'S INFORMATION WARFARE AGAINST US PRESIDENTIAL ELECTIONS ...... 44
2.5.2 THE US 2016 PRESIDENTIAL ELECTION ...... 44
2.5.3 THE USE OF TROLL FARMS AGAINST THE US ...... 46


CHAPTER THREE – METHODOLOGY ...... 48

3.1 RESEARCH QUESTIONS ...... 48
3.2 RESEARCH APPROACH ...... 48
3.2.1 THEORY ...... 50
3.2.2 EPISTEMOLOGICAL PERSPECTIVE ...... 51
3.2.3 ONTOLOGICAL ISSUES ...... 53
3.2.4 QUALITATIVE, QUANTITATIVE OR MIXED METHODS ...... 54
3.2.5 INFLUENCES ...... 55
3.2.6 LIMITATIONS ...... 56
3.3 RESEARCH DESIGN ...... 56
3.4 RESEARCH METHOD ...... 59
3.4.1 STAGE ONE – COMPILING ...... 60
3.4.2 STAGE TWO – FURTHER COMPILING ...... 61
3.4.3 STAGE THREE – DISASSEMBLING ...... 63
3.4.4 STAGE FOUR – REASSEMBLING ...... 65
3.4.5 STAGE FIVE – INTERPRETING ...... 66
3.4.6 STAGE SIX – CONCLUSION ...... 69
3.5 CHAPTER REVIEW ...... 69
3.5.1 CHAPTER FOUR – THE RISE OF THE KREMLIN TROLL ...... 69
3.5.2 CHAPTER FIVE AND SIX ...... 69
3.5.3 CHAPTER SEVEN AND CHAPTER EIGHT ...... 70
3.5.4 CHAPTER NINE ...... 70

CHAPTER FOUR—THE RISE OF THE KREMLIN TROLL ...... 71

4.1 INTRODUCTION ...... 71
4.2 CHECHEN WARS ...... 71
4.3 GEORGIAN WAR, 2008 ...... 76
4.4 ARAB SPRINGS, 2010-2011 AND RUSSIAN DEMONSTRATIONS, 2011 ...... 81
4.4.1 THE KREMLIN’S RESPONSE ...... 85
4.5 THE RISE OF THE TROLL FARM ...... 87
4.6 THE INTERNET RESEARCH AGENCY (IRA) ...... 89
4.7 CONCLUSION ...... 91

CHAPTER FIVE – DEFINING INFORMATION IN INFORMATION OPERATIONS ...... 94

5.1 INTRODUCTION ...... 94
5.2 STATE INFORMATION DOCTRINES ...... 95
5.3 RUSSIA’S CONTEMPORARY INFORMATION OPERATIONS ...... 97
5.4 THE DYNAMICS OF SMNS ...... 101
5.5 THE LITTLE BLUE BIRD – TWITTER ...... 106
5.6 RUSSIA’S INFORMATION OPERATIONS ON TWITTER ...... 109
5.7 CONCLUSION ...... 113

CHAPTER SIX – CRIMEA ...... 115

6.1 INTRODUCTION ...... 115
6.2 RUSSIA’S POST-SOVIET UNION DISINFORMATION, PROPAGANDA AND CONSPIRACY CAMPAIGNS ...... 116


6.3 THE DANGEROUS OTHER ...... 118
6.4 AND THE ANNEXATION OF CRIMEA ...... 119
6.5 CRIMEA’S INFORMATION OPERATIONS ...... 121
6.6 CRIMEA’S RUSSIAN HERITAGE ...... 125
6.7 THE INTERNET RESEARCH AGENCY IN 2014 ...... 126
6.7.1 TOP TEN (10) TWITTER ACCOUNTS WITH THE HIGHEST FOLLOWERS ...... 128
6.7.2 KEY WORD SEARCH ANALYSIS ...... 130
6.7.3 GENERAL OBSERVATIONS ...... 133
6.8 CONCLUSION ...... 135

CHAPTER SEVEN – 2014: RUSSIA ATTACKS THE US ...... 146

7.1 INTRODUCTION ...... 146
7.2 RUSSIA ATTACKS AMERICA ...... 146
7.3 THE COLUMBIAN CHEMICAL INFORMATION CAMPAIGN ...... 149
7.4 EBOLA AND A SHOOTING IN ATLANTA ...... 157
7.5 KOCH’S TURKEY FARM ...... 160
7.6 THE START OF AN INFORMATION WAR ...... 164
7.7 TRUST NETWORKS ...... 166
7.8 CONCLUSION ...... 169

CHAPTER EIGHT – THE 2016 US PRESIDENTIAL ELECTION ...... 174

8.1 INTRODUCTION ...... 174
8.2 A HISTORY OF INTERFERENCE ...... 175
8.3 THE 2016 US PRESIDENTIAL ELECTION ...... 177
8.3.1 CYBER-ACTIVITY ...... 178
8.3.2 STATE-FUNDED MEDIA ...... 179
8.3.3 GROWTH OF ANTI-AMERICAN RHETORIC ...... 179
8.3.4 THE INTERNET RESEARCH AGENCY ...... 180
8.3.5 SOCIAL MEDIA POSTS ...... 181
8.3.6 PURPLE STATES ...... 182
8.4 PREVIOUS STUDIES ...... 182
8.5 TWITTER ANALYSIS ...... 183
8.6 RUSSIAN TROLL FARM ACTIVITY ...... 185
8.7 TWITTER ...... 193
8.7.1 ADDITIONAL REPEATED THEMES ...... 196
8.7.2 OTHER EVENTS ...... 202
8.8 CONCLUSION ...... 203

CHAPTER NINE – A WORLD OF DISINFORMATION ...... 205

9.1 INTRODUCTION ...... 205
9.2 OTHER IDENTIFIED DISINFORMATION CAMPAIGNS IN 2016 ...... 206
9.3 #PIZZAGATE – A CASE STUDY ...... 209
9.4 QANON ...... 218
9.5 SERGEI SKRIPAL ...... 219
9.6 SOCIAL MEDIA NETWORK’S RESPONSE TO THE 2016 US PRESIDENTIAL ELECTIONS ...... 221
9.6.1 ...... 221


9.6.2 ...... 223
9.6.3 TWITTER ...... 224
9.7 IRA ACTIVITY SINCE THE 2016 US PRESIDENTIAL ELECTION ...... 224
9.8 THE 2020 US PRESIDENTIAL ELECTION ...... 226
9.9 CONCLUSION ...... 228

CHAPTER TEN – CONCLUSION ...... 229

10.1 MAJOR FINDINGS ...... 229
10.2 LIMITATIONS ...... 233
10.3 CONCLUDING REMARKS ...... 233
10.4 IMPLICATIONS ...... 234
10.5 FURTHER RESEARCH ...... 234

APPENDIX ONE – ETHICS APPROVAL CERTIFICATES ...... 236

APPENDIX TWO – SQL QUERIES, COMMAND-LINE SEARCHES, BASH COMMANDS AND PERL SCRIPTS ...... 238

BIBLIOGRAPHY ...... 240


List of Tables

Table 3.1—Epistemology Definitions and Extensions 51
Table 3.2—Fundamental Differences between quantitative and qualitative research strategies 56
Table 3.3—Primary Research Designs (Bryman, 2016, p. 32) 57
Table 3.4—Snapshot of the IRA Twitter Data 63
Table 5.1—2013 IRA Tweets concerning Russia’s, Ukraine’s and the EU’s relationship 114
Table 6.1—2014 Ukraine and Crimea Key Word Searches 129
Table 6.2—Top Ten Twitter Accounts with the highest number of followers 130
Table 6.3—The Ten Most Retweeted Tweets from the most followed accounts – Crimea Campaign 138
Table 6.4—The categorisation of tweets from the top 10 accounts with the most followers – Crimea Campaign 139
Table 6.5—The 11 most retweeted tweets from “Крым” search – Crimea Campaign 141
Table 6.6—The 10 most retweeted tweets from “Украин” search – Crimea Campaign 143
Table 6.7—The 10 most retweeted tweets from “Ukrain” search – Crimea Campaign 144
Table 6.8—IRA Account Creation Year 135
Table 6.9—Top Ten Hashtags Used Throughout Crimea Campaign 136
Table 7.1—Twitter Hashtags for Columbian Chemical Information Campaign 151
Table 7.2—Keyword Search – 2014 US Campaigns 152
Table 7.3—Total Number of IRA English Language Tweets per Year 157
Table 7.4—Total Tweets based on the Creation Date of Accounts Used in the Columbian Chemical Information Campaign per Month in September 2014 158
Table 7.5—Client Used to Coordinate Action Across Multiple Accounts in 2014 US Campaigns 159
Table 7.6—Top Ten Follower Accounts for the 2014 Campaigns 174
Table 7.7—Top 10 Followed Accounts in Crimea Campaign 174
Table 7.8—Ten most followed accounts interaction – 2014 US Campaigns and Crimea Campaigns 162
Table 7.9—Analysis of #wakeupamerica 166
Table 7.10—Top Ten Followed Accounts in Koch’s Turkey Farm Campaign 176
Table 8.1—Examples of the IRA Campaign regurgitating ideas that were already in circulation 189
Table 8.2—Previously Identified Media Campaigns 198


Table 9.1—Common Hashtags Used Throughout the 2016 US Campaign 216


Chapter One – Introduction

1.1 Introduction
This research project's primary parameter is the development of Russian information operations since the collapse of the Soviet Union. In examining this parameter, this thesis will demonstrate that Russia's renewed interest in information operations after the fall of the Soviet Union began as a means to compensate for the nation’s diminished military and geopolitical position. However, the rapid growth of Russia’s economy at the turn of the 21st century saw Russia demonstrate the full power of its information operations. As Chapter Four discusses, Vladimir Putin's tenure as President of Russia between December 1999 and May 2008, and again from May 2012 to the present day, saw a flourishing of Soviet-era active measure campaigns (Abrams, 2016).

This research provides a unique approach to the study of Russian information operations. It is a qualitative study with quasi-quantitative methodologies, which has adopted a blended approach that is both longitudinal in design and a case study. As a case study, this research observes the Internet Research Agency (IRA), a Russian troll farm that utilised Twitter to conduct information campaigns, and how its social media tactics have evolved from 2009 to the present day. As the available data on the IRA stems from 2009 to 2018, the case study is also longitudinal.

1.2 Background
Soviet KGB disinformation and propaganda campaigns, aktivnyye meropriyatiya (active measures), appeared on a massive scale at the height of the Cold War. Although they were widely used by the Kremlin, according to Abrams (2016), the West never really understood the effectiveness of aktivnyye meropriyatiya. With Vladimir Putin's election as Russia's second President in 1999, reports of increased active measure campaigns began to surface. Nine years later, a Czech Republic report provided evidence that Russia was engaged in active measure campaigns against neighbouring countries, stating that Russia was once again using active measure campaigns modelled on Soviet-bloc techniques (Intelligence Services, 2009, p. 5). The report contended further that 'the efficiency of the Soviet propaganda [of the 1980s] and the Russian propaganda now on individuals or groups of people are comparable, regardless of their culture, social, economic or political background' (Intelligence Services, 2009, p. 5). However, according to Giles (2016b) and Paul & Matthews (2016), insufficient attention has

historically been paid to Russian information operations, as many in the West have believed that the false narratives deployed by the Kremlin are so implausible, deviating so far from the truth, that no one would take them seriously.

Russian information operations rely on two primary factors: first, a high number of channels and messages, and second, "a shameless willingness to disseminate partial truths or outright fictions" (Paul & Matthews, 2016, p. 1). Both factors run contrary to traditional theories of influence campaigns, which emphasise truth, credibility and the avoidance of contradiction (Paul & Matthews, 2016). Giles (2016) argues that placing too much emphasis on these traditional criteria has led Russian campaigns to be measured against the wrong benchmarks, because researchers fundamentally misunderstand Russian objectives. With reports that Russia undertook information campaigns against the US during the 2016 presidential election, a renewed interest in Russian information campaigns has surfaced.

As demonstrated throughout this thesis, there is a wealth of research specific to Russian information warfare. For example, Keir Giles's research explores Russian military, defence and security issues, while Paul and Matthews (2016) examine Russia’s propaganda model, dubbed by the authors Russia’s ‘Firehose of Falsehood’. A history of Russia’s relationship with propaganda and conspiracy is outlined in Ilya Yablokov's writings, while Jaitner and Mattisson (2015) provide a modern perspective on Russian conspiracy theories. Research examining the relationship between information operations and technology may be found in digital and media studies. Kate Starbird’s research, for example, examines disinformation, the intersection of technology and society, and the role social media platforms play in this interaction. On the relationship between society and technology, José van Dijck and Nicolas P Suzor are of relevance. However, these studies do not trace the development of Russian information operations through to the establishment and use of troll farms. The current research project builds on authors such as Paul and Matthews (2016) in examining the development of Russian information operations and the use of disinformation online, and on Kate Starbird’s research examining SMNs as a tool for disinformation. In doing so, it demonstrates that Russian information operations are not static; they evolve, adapt and develop based on their successes and failures, abandoning failed information operations and testing new approaches. This project thereby extends Giles's (2016) research. Unlike Western countries, which claim to only carry out tactical information

operations in times of hostility, Russia's information operations are all-encompassing, form part of its broader information security structure, and are carried out on an ongoing basis. Russia aims to control information, whether in print or digital media, the mass consciousness, and even individual consciousnesses (Giles, 2016). As a result, Russia can take advantage of what Galeotti (2019) describes as the West's reluctance to capitalise on political and information operations to disrupt government and military leadership. Russia intends to destabilise any sense of certainty with regard to world affairs and to undermine Western domestic political institutions. As O’Loughlin (2015, p. 169) explains, "such a strategy undermines the very fundamentals of information and credibility that informed debate is supposed to rest upon".

1.3 The US and Russia – A Shaky Past
Prior to World War II, relations between the United States and the Soviet Union could, at best, be described as fractured due to events dating back to the Bolshevik Revolution of 1917. It was during this revolution that the Soviet Union campaigned for a communist regime that would conquer the world, repudiate czarist debts, many of which were owed to Americans, and confiscate from Russian citizens any property that was American-centric, such as clothing and music (Paterson & McMahon, 1999). However, in the early 1920s some of the antagonism between the two nation-states lifted, with the United States sending relief to Russia during a time of famine. In turn, this aid led to investment by American businesses in the Union of Soviet Socialist Republics (USSR) (Paterson & McMahon, 1999). Nonetheless, communist doctrine, with its attendant ambitions of a global proletariat revolution, prompted the United States to continue to view the Soviet Union as a foreign threat (Lewis, 2000).

In 1941 Stalin joined with Great Britain and the United States in the ‘Grand Alliance’ to defeat Hitler, and the Soviet Union’s image appeared to improve in America’s mass media. Russia had put communist ideologies aside, at least to the extent of cooperating with foreign capitalist powers, in order to defeat a common enemy. Factions of the United States government even began to believe that Russia had started to embrace democracy (Lewis, 2000). In 1944, to the Allies' benefit, the Soviet Union created “arguably, the greatest disaster for [Germany] in World War II” (Beckhusen, 2017, p. 1). Using Soviet maskirovka, or military deception (a form of information warfare), against the Germans during Operation Bagration, the Soviets employed misdirection (by sending troops to Finland) and ‘radio silence’ (to distract German headquarters) to


mislead German military forces. By doing so, the Soviet soldiers opened up an opportunity to attack an insufficiently defended front, leading ultimately to the destruction of the Wehrmacht's Army Group Centre (Beckhusen, 2017).

At the end of the war, citizens in many nations involved in the conflict struggled to survive due to shortages of food and other necessities. No such challenges faced the United States, which emerged from the war “the strongest nation on the globe” (Paterson & McMahon, 1999, p. 4). The United States envisioned rebuilding a free world that was both politically reliable and stable, based on a capitalist model. It soon became evident that the Soviet Union would continue to follow the communist doctrine, which contradicted the United States’ world vision (Paterson & McMahon, 1999). The Soviet Union had already demonstrated military strength and strategic and tactical capability; the question remained whether it would continue to expand its sphere of influence beyond those territories conquered during the war.

1.4 The Cold War, Propaganda Campaigns and the Beginning of Information Warfare
The Cold War that began after 1945 saw the United States and the Soviet Union become primary adversaries fighting for global hegemony. This conflict lasted decades and focused on “spheres of influence, economic and strategic advantage, nuclear-weapons supremacy, control of international organizations and ideological superiority” (Paterson & McMahon, 1999, p. xi). Some would argue that the Cold War was born out of a ‘brave and essential’ stance against communist aggression. Others contend that the United States' stance stemmed from its need to control Eastern Europe and to prevent the Soviet Union from gaining further power (Paterson & McMahon, 1999). Goldgeier and McFaul (2003, p. 1) provide a simpler view:

The Soviet Union and the United States were rivals not only because they were the two greatest powers in the international system, but because they were two powers with antithetical visions about how political, social and economic life should be organised.

From the late 1940s to the end of the Cold War in 1989, world issues, such as conflict in the Middle East, European defence and civil wars in Africa, Asia and South America, were influenced by the rivalry between the US and the USSR, with both nations often carving out areas as staging grounds against one another. It was not only on the political world stage that


the US and USSR figuratively duelled for the world to see, but also through the subtle art of information operations carried out by both nations at various times. For example, in the 1950s the US engaged in soft power campaigns, distributing pro-US fliers throughout the Soviet empire under Project Troy (discussed in detail in Chapter Eight), while both the US and the Soviet Union made outright attempts at election manipulation between the 1970s and 1990s.

1.5 Present Day
Today, information operations are carried out by a multitude of countries across social media networks. In 2019 a report released by the Computational Propaganda Research Project revealed that information operations conducted through social media had become a global phenomenon carried out by ‘cyber troops’ employed by a range of governments and other political actors (Bradshaw & Howard, 2019). The Computational Propaganda Research Project defines cyber troops as actors who have been tasked by government or political actors to manipulate public opinion online (Bradshaw & Howard, 2017). However, this definition does not differentiate between political advertising and paid trolls employed by nation-states to sow discord in foreign countries.

The 2019 report examines cyber activity in several countries, demonstrating that coordinated manipulation campaigns occur in every political regime type.1 The report highlights how deliberate propaganda was most clearly visible during elections and referendums. Computational propaganda was defined as the use of:

Automation, algorithms and big-data analytics to manipulate public life [encompassing] issues to do with so-called “fake news”, the spread of misinformation on social media platforms, illegal data harvesting and micro-profiling, the exploitation of social media platforms for foreign influence

1 The 2019 report identified the following countries as manipulating social media: Angola, Argentina, Armenia, Australia, Austria, Azerbaijan, Bahrain, Bosnia & Herzegovina, Brazil, Cambodia, , Colombia, Croatia, , Czech Republic, Ecuador, Egypt, Eritrea, Ethiopia, Georgia, Germany, Greece, Honduras, Guatemala, Hungary, , Indonesia, Iran, Israel, Italy, Kazakhstan, Kenya, Kyrgyzstan, Macedonia, Malaysia, Malta, , Moldova, Myanmar, , , North Korea, Pakistan, Philippines, Poland, Qatar, Russia, Rwanda, Saudi Arabia, Serbia, , South Korea, Spain, Sri Lanka, Sweden, , Taiwan, Tajikistan, Thailand, Tunisia, Turkey, Ukraine, United Arab Emirates, , United States, Uzbekistan, , Vietnam, and .


operations, the amplification of hate speech or harmful content through fake accounts or political bots, and clickbait content for optimized social media consumption (Bradshaw & Howard, 2018, p. 4).

Over 30 countries have introduced legislation or established government agencies and task forces to counter fake news and combat propaganda. These efforts include disseminating counter-narratives, raising citizen awareness and creating fact-checking portals.

In Australia, for example, the Turnbull Government established the Electoral Integrity Taskforce in preparation for the federal by-elections held in July 2018. The task force supports three main functions:

• Electoral laws: Electoral communications are required to be authorised to ensure voters know who is communicating on any said electoral matter. • Foreign Influence: Any person lobbying, participating in communication activities or disbursement activities during an election, on behalf of a foreign principal, must register the action under the Foreign Influence Transparency Scheme to ensure transparency to voters. • Security of Elections: Proactive measures will be provided by the Australian Signals Directorate (ASD) and the Australian Cyber Security Centre (ACSC) to maintain electoral systems' cybersecurity.

The task force is comprised of the following agencies:2

• Australian Electoral Commission; • Department of Finance; • Department of Prime Minister and Cabinet; • Department of Infrastructure, Transport, Regional Development and Communications; • Attorney-General’s Department; • Department of Home Affairs; and • Australian Federal Police (Australian Electoral Commission, 2020b).

In 2019 the taskforce commenced an advertising campaign on social media called “Stop and Consider”. The campaign called for voters to check the source of any electoral communication they engaged with during the 2019 federal election. The campaign also asked voters to review

2 As of January 2021 this task force was still in existence.

the reliability and currency of the information, and whether the information could be a scam (Australian Electoral Commission, 2020a). Examples of the campaign captured on Twitter can be seen in Image 1.1 below.

Image 1.1 – An example of Australia’s Stop and Consider Campaign

According to Bradshaw and Howard (2018), propaganda is generally funded and coordinated by a government body. In some instances, political parties were found to be the primary organisers of disinformation campaigns, using taskforces designed to protect citizens from the very activity they would then engage in, such as censorship and propaganda campaigns (Bradshaw & Howard, 2019).

1.6 The Current Study
What follows is an analysis of the development of Russian information operations since the collapse of the Soviet Union and how they have changed over time. It demonstrates that Russian information operations are not static and continue to evolve. To support this claim, Russian troll farm activity on Twitter, released for research purposes by Twitter, is used throughout this thesis to demonstrate how Russian information operations have evolved and adapted since the advent of the Internet. The overarching research question examined throughout this thesis is “how have Russian information operations developed since the collapse of the Soviet Union?” Following a timeline of events, it is argued that this evolution provides insight into Russian information operations, and into information operations in the Internet age more broadly.
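The analyses of this Twitter-released troll farm data (reported, for example, in the keyword-search tables of Chapters Six and Seven) were carried out with the SQL queries, command-line searches and scripts listed in Appendix Two. As a minimal illustrative sketch only, not the thesis's actual code, a keyword search of the kind described might look like the following; the column names mirror the layout of Twitter's publicly released IRA dataset, and the sample rows are invented for illustration:

```python
import csv
import io

# Invented sample in the shape of Twitter's released IRA tweet data.
sample = io.StringIO(
    "tweetid,tweet_time,tweet_language,tweet_text\n"
    "1,2014-03-01 12:00,ru,Крым наш\n"
    "2,2014-09-11 08:30,en,Explosion reported #ColumbianChemicals\n"
    "3,2016-10-07 19:00,en,#WakeUpAmerica check the facts\n"
)

def keyword_search(rows, keyword, year=None):
    """Return rows whose tweet text contains the keyword (case-insensitive),
    optionally restricted to tweets posted in a given year."""
    keyword = keyword.lower()
    return [
        row for row in rows
        if keyword in row["tweet_text"].lower()
        and (year is None or row["tweet_time"].startswith(str(year)))
    ]

rows = list(csv.DictReader(sample))
# Cyrillic search terms (e.g. "Крым") work the same way as English ones.
print(len(keyword_search(rows, "крым")))                  # 1
print(len(keyword_search(rows, "#wakeupamerica", 2016)))  # 1
```

A case-insensitive substring match of this sort is the simplest way to reproduce searches such as the “Крым”, “Украин” and “Ukrain” queries tabulated in Chapter Six; the real analyses, per Appendix Two, used SQL and shell tooling rather than Python.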


1.7 Thesis Structure
The following section provides a brief description of each chapter of this thesis.

1.7.1 Chapter Two: Literature Review
The literature review will provide a critical examination of the existing literature regarding Russia’s use of information and influence operations. As will be demonstrated, several terms have been developed to describe Russian information campaigns, such as hybrid warfare, information warfare and information operations. Chapter Two will define these and other relevant terms used throughout the thesis, such as troll, troll farm and social bot. The chapter will also examine various doctrines on information campaigns, including an in-depth analysis of relevant Western and Russian literature and literature from media studies. Through an examination of this literature, Chapter Two will demonstrate a gap in research on Russian troll farm activity which this thesis intends to fill.

1.7.2 Chapter Three: Methodology
Chapter Three provides a roadmap of the approach adopted throughout the research project. In brief, the methodology relies upon Bryman’s (2016) social research process. As explained in Chapter Three, the research method takes the form of a longitudinal case study of the Internet Research Agency (IRA), a Russian troll farm identified in 2015. This approach aims to offer an understanding of how the Russian government has adapted technology systems over time.

1.7.3 Chapter Four: The Rise of the Kremlin Troll
After the collapse of the Soviet Union, the Russian government demonstrated several failings regarding military strategy and information operations. As shown throughout Chapter Four, however, the Russian government also learnt from these failings, reformulating its approach to information warfare and adapting to online communication applications. This chapter will examine the failings of the Chechen and Georgian wars and demonstrate how Russia learnt from previous military mistakes to take control of the information narrative, for example by creating troll farms such as the IRA.


1.7.4 Chapter Five: Defining Information in Information Operations
Chapter Five will examine the role social media networks play in providing a means to accomplish all the elements required for a successful information campaign. As will be demonstrated, the Russian government places significant importance on information and has invested in various means of ensuring control over it. The propagation of falsehoods and fake news is an integral part of Russia’s culture and history. In contemporary society, this tradition continues, as Russian propaganda is produced en masse and distributed via many channels. Further, government-created narratives are repeated, supported by experts, and propagated by trusted sources to ensure the success of these propaganda and disinformation campaigns.

1.7.5 Chapter Six: Crimea
Having examined the development of Russian information operations since the collapse of the Soviet Union and established the importance of the Internet and SMNs to future information operations, Chapter Six will examine Russia’s information campaign during the annexation of Crimea. As will be demonstrated, although propagated by new means, including the Internet, Russia's information operations still displayed characteristics of Russian active measure campaigns dating back centuries.

1.7.6 Chapter Seven: 2014: Russia Attacks the US
After a successful Crimea campaign, Russia appeared to set its sights on the US, with four consecutive information operations targeting US citizens. Chapter Seven examines these four campaigns, which include claims that the Columbian Chemicals plant in Louisiana had exploded, an Ebola outbreak, the shooting of an unarmed black woman in Atlanta, and a food-poisoning event at Thanksgiving. As will be demonstrated, all four campaigns were based on lies and disinformation, and all four operations appear to have failed. However, examining the campaigns reveals specific lessons that the IRA would learn from each failed attempt.

1.7.7 Chapter Eight: The 2016 US Election
The US and Russia have a long history of interfering in each other's political affairs, including elections. Chapter Eight will examine this history before analysing events that occurred during the 2016 US presidential election. As will be demonstrated, the IRA’s 2016 US election campaign would not involve creating new content, nor investing in useful fools to legitimise false stories, as discussed in previous campaigns. Instead, the Russian government relied

heavily on existing content circulating in the mainstream news and social media networks. Research presented in Chapter Eight demonstrates that by 2016 IRA operatives had embedded themselves in echo chambers and engaged in conversations and insults based on the existing narratives developed throughout the election, indicating a significant change in approach to information warfare when compared to previous operations.

1.7.8 Chapter Nine: A World of Disinformation Today it is not uncommon to see disinformation across social media networks (SMNs). Chapter Nine will examine disinformation campaigns running in conjunction with the 2016 IRA information campaign, demonstrating that these campaigns, rather than complementing each other, run in silos. Chapter Nine will also examine the response by SMNs to the IRA and disinformation campaigns in general and assess how the 2016 information campaigns impacted the way the world viewed the 2020 US presidential election.

1.7.9 Chapter Ten: Conclusion The final chapter, Chapter Ten, will review the significant findings of the research project before providing suggestions for further research.


Chapter Two – Literature Review

The following chapter will examine the literature on Russia's use of information campaigns and the development of influence operations. It begins by examining scholarly work on Russia's use of information as a weapon, before defining Russia's information strategy through an examination of current Western literature and Russian doctrine. Modern technological examples of Russia's strategy will then be examined, as described by sources such as the American think tank RAND, the US Center for Naval Analyses in Arlington, and scholarly texts. As will be demonstrated, there is a plethora of research defining Russia's approach to information, using various terminologies and definitions. The review will then examine the literature on Russia's current propaganda campaigns to describe how Russia has married Soviet-era information warfare techniques with technological advances to produce a modern-day approach. This section will also examine the literature on the effectiveness of these campaigns against Eastern Europe and how neighbouring governments and the North Atlantic Treaty Organization (NATO) are countering them.

The final section will examine the literature on Russia's information warfare against the US, paying particular attention to the Russian government's attempts to influence US public opinion: first, by analysing historical efforts to influence US presidential elections; second, by examining the anti-American rhetoric from Russian troll farms that began to appear in 2014; and third, by reviewing the literature describing Russia's influence campaign during the 2016 US presidential election.

2.1 Background There is extensive literature on information as a strategic weapon; information warfare has a long history. The Cold War, for example, saw both Russia and the US wage political warfare against each other. US operations included "covertly starting magazines and organisations to organise artists and intellectuals against communism and providing financial and logistical support to dissidents behind the Iron Curtain" (Doran, 2017, p. 1). Examining the Soviet campaign, Kennan (1948) defined political warfare as

the employment of all the means at a nation's command, short of war, to achieve its national objectives. Such operations are both overt and covert. They range from such overt actions as political alliances, economic measures […], and "white" propaganda, to such covert operations as clandestine support of "friendly" foreign elements, "black" psychological warfare and even encouragement of underground resistance in hostile states.

The end of the Cold War saw the US move away from its political warfare legacy, with a new focus on influencing foreign audiences through public diplomacy and strategic communication in place of political warfare (Boot & Doran, 2013). According to Suzen (2018), this left the US vulnerable to political warfare, as discussed below. In contrast, Russia focused on establishing new units employing personnel "for monitoring, analysing, and countering the perceived risks and threats of the global network society" (Libicki, 1995, pp. 7-8). Libicki (1995) suggests that at the collapse of the Soviet Union there was a reduction in conventional military budgets and a new emphasis placed on cyber threats and information security. The Cold War, as suggested by many Russian observers, never ended (Schweitzer, 2016), with Russia's military behaviour a continuation of prior patterns (Fasola, 2017).

Parallel to the end of the Cold War was the continuing development of a 'knowledge economy', a term coined in the works of Castells (1989), with the primary product of the digital age set to be information. A by-product of this economy was a new emphasis on information warfare (Libicki, 1995). According to Knight (2004), information warfare became prevalent with the realisation that 'information and ideas' contribute to military success in the same way that 'weapons and tactics' had done in the past. Sergei Ivanov (2007), then Deputy Premier and former Defence Minister of the Russian Federation, states, "the development of information technology has resulted in the information itself turning into a certain kind of weapon. It is a weapon that allows us to carry out would-be military actions in practically any theatre of war and most importantly, without using military power". Russia recognised information warfare as a proxy for more traditional forms of war (Blank, 2017).

Blank (2017) proposes that Russia's perceived defeat in the Yugoslav and Kosovar conflicts of 1998 and 1999 was attributed by the Russian state to wavering domestic public support and to Russia's inability to counter NATO's attack against Serbian dissidents (Goldgeier & McFaul, 2003). In response, Russia curtailed media access to ensure the support of the Russian people, with the result that "enduring public support allowed Putin to give the military a freer rein to fight a long war without any hint of public opposition" (Blank, 2017, p. 84). Russia has demonstrated a history of propaganda and agitation to drive change and rally support; Russia's Kosovar campaign, according to Berkhoff (2012), is an example of old tactics described in new ways.

A key term used throughout this research, as highlighted above, is propaganda. According to Jackall (1995), the term propaganda originated in religion during the 1600s with the establishment of de Propaganda Fide, which sought to preach and catechise "the countries… lost to the Church in the debacle of the sixteenth century and to organize into an efficient corps the numerous missionary enterprises for the diffusion of the gospel in pagan lands" (p. 1). With the Great War and then the Cold War, propaganda became a tool for ensuring the allegiance of civilian populations (Jackall, 1995). Propaganda is often difficult to distinguish from professions such as marketing, advertising, public relations and mass communications (Laskin, 2019). According to Laskin (2019), propaganda is a strategy that seeks to unite people by dissolving individuality while promoting an us-versus-them dichotomy. The key characteristic distinguishing propaganda from professions such as marketing and public relations is that propaganda demands the sacrifice of the individual in the name of society, country, politics or a god. As such, Laskin (2019) defines propaganda as "a persuasive communication activity that establishes symbiotic relations between an individual and a larger entity into which this individual is being dissolved" (p. 313).

Current literature does not dispute Russia's use of propaganda and information as part of its broader military strategy. What does appear to be under some debate is the terminology used to define this strategy. Information and information technologies continue to grow in importance with regard to "national security in general and to warfare specifically" (Libicki, 1995, p. ix). The various terms also imply different perceptions of information warfare. These terms must therefore be defined to ensure a consistent approach to policy and legislation focused on foreign information operations.

2.2 Defining Russia's Strategy

2.2.1 Current Western Literature Western literature on Russian influence operations can roughly be divided into two tendencies. One focuses on the technicalities of the Russian campaign, applying Western concepts to Russian activity (e.g. hybrid warfare). The other is more concerned with the strategic context and tries to capture the Russian campaign within its own terms. For example, since Russia's annexation of Crimea in 2014, the notion of hybrid warfare has been used by various authors to describe Russia's strategic approach (e.g. Renz 2016, Dickey et al. 2015 and Beehner et al. 2018) as "an operational approach to warfighting that uses an explicit mix of military and non-military tactics" (Renz, 2016, p. 283).

Beehner, Collins, Ferenzi, Person, and Brantly (2018, p. 3) define hybrid warfare as the integration of "cyber and other non-kinetic tools into a conventional strategy". The Russia-Georgia war was, according to Beehner et al. (2018), the first example of hybrid war. As the authors explain, the tactical advantage gained by Russia's use of cyberspace in the war had limited impact. In comparison, the strategic and psychological effects of cyberspace were much more pronounced, demonstrating the value of information operations (Beehner et al., 2018), discussed below. As Beehner et al. (2018, p. 3) write, "where states once tried to control the radio waves, broadcast television channels, newspapers, or other form of soft communication, they now add to these sources of information control cyberspace and its component aspects, websites, and social media". Dickey, Everett, Galvach, Mesko, and Soltis (2015) provide two examples of Russia's modern forms of hybrid or asymmetric warfare, which they suggest form part of Russia's political warfare campaign: the recent movements in the Crimean Peninsula and Eastern Ukraine. Asymmetric warfare, according to the authors, refers to the disadvantaged position of Russia's military compared to nation-states with more significant resources and finances, such as the US, and organisations such as NATO.

Renz (2016) suggests that Russia's current approach to warfare is not new and that the term hybrid warfare is not useful. Firstly, it does not reflect the ongoing modernisation of Russia's military strategy, and secondly, it makes Russia look stronger than it is. In agreement with Renz (2016) is Galeotti (2016), who argues that hybrid war is not new and reflects a long tradition of Russian/USSR strategy. As Galeotti (2016, p. 296) writes, "from the tsars through the Bolsheviks, they have been accustomed to a style of warfare that embraces much more eagerly the irregular and the criminal, the spook and the provocateur, the activist and the fellow-traveller3".

3 Fellow-traveller is a term given to Russian political and ideological sympathisers (Caute, 1988).


Kalinina (2016), writing under a grant from the University of West Bohemia in Czechia, agrees that Russia views information as a weapon but suggests that there is confusion regarding the term hybrid warfare. Hybrid warfare refers to a coordinated effort to bring both a psychological and a physical element to a confrontation. Kalinina (2016), borrowing from Russian military expert Vladimir Slipchenko (2002, p. 150) and his writings on new generation warfare, suggests that a more suitable heading for Russia's activities is information confrontation, which refers to surpassing the enemy's "ability to analyse, acquire and use information as well as regarding its quality and quantity". Strategies of Russian information confrontation, according to Kalinina (2016), include:

• Maskirovka (deception);
• Disinformation;
• Radio-electronic confrontation;
• Physical destruction of information infrastructure;
• Computer network attacks;
• Information impact; and
• Information aggression.

Any or all of these could be combined with an extensive list of specially designed tools, such as logic bombs4, computer viruses5, aggression or psychological attacks (Slipchenko, 2002, p. 48).

A further term used to define Russia's current military approach is information strategy. Information strategy involves both technical and non-technical aspects, including strategies applied against the enemy and strategies in place to aid the protagonist (Arquilla, 2007). According to Blank (2017), Russia's current approach to cyberwar is a mix of information operations and political warfare, the objective of information operations being predominantly to exert power by influencing the behaviour of a target audience. As Dahl (1957, p. 203) explains, this is the ability of A to influence B, to the extent that A "can get B to do something that B would not otherwise do". There is no actual distinction in Russian discourse between

4 A logic bomb is a piece of code inserted into a software application or operating system which, after a certain length of time (or if certain conditions are not met), carries out a malicious attack (Security, 2018).
5 A computer virus is "a piece of code which is capable of copying itself and typically has a detrimental effect, such as corrupting the system or destroying data" (Dictionary.com, 2018).


information operations and cyber warfare, as both come under the heading of information warfare (Blank, 2017). Blank (2017) writes that media campaigns and asymmetric war, which incorporates information warfare and information operations, gained support within the Russian government due to a belief that this type of political and information warfare was and is being carried out effectively against Russia by its enemies. As a RAND think tank report suggests, Russia not only sees a considerable shortfall in its ability to coordinate information operations against the US but also over-estimates the US's ability to spur 'democratic and liberal revolutions' (Robinson, Jones, Janicke and Maglaras, 2018).

To counter any shortcomings, according to Osipova (2017), Moscow has introduced the concept of soft power into its information campaigns. Soft power is a term defined by Joseph Nye to describe influence campaigns designed to gently nudge or push individuals in a different direction (Nye, 2017). According to Ahlawat (2017), soft power was developed to describe a 'Western liberal framework'; however, it has been adopted by Russia "to achieve global recognition and a significant role in regional and world politics" (Osipova, 2017, p. 346). According to Osipova (2017), Moscow views soft power as a tool of American expansionism, with the Moldovan crisis of 2009, the 2011-2012 anti-Putin protests and the Ukrainian upheaval of 2014 cited by Russia as examples of successful soft power campaigns carried out by America. In response, the Kremlin decided not only to balance American soft power with Moscow's own application of the strategy but to oppose American soft power efforts when necessary (Osipova, 2017). Rutland and Kazantsev (2016) suggest, however, that Russia's Soviet and Cold War past, combined with its use of force against Ukraine, Syria and Georgia, has reinforced the negative hard power stereotype attached to Russia, hard power being the use of pay-offs and coercion, for example (Nye Jr, 2009). Therefore, Russian leaders have failed to develop "soft power as an effective policy tool" (Rutland & Kazantsev, 2016, p. 395).

2.2.2 Information Warfare With the advent of the cyber realm, Connell and Vogler (2017) argue that Russia has conceptualised cyber operations within the broader heading of information warfare: a holistic approach that incorporates information operations, psychological operations, electronic warfare and computer network operations. Russia uses the word informatization to capture this holistic approach and to frame cyber operations on a global scale (Connell & Vogler, 2017). Szafranski (1995) describes information warfare as a way of reducing violence and subduing your opponent without the need to engage in traditional, kinetic war. A favourite analogy used by many authors is that "information warfare may better be considered a mosaic of forms rather than one particular form" (Libicki, 1995, p. 6). In agreement are Applebaum and Lucas (2016), who describe information warfare as a variety of tactics including command and control; intelligence-based; electronic; psychological; economic; cyberwarfare; and computer hacking.

Research by authors such as Blank (2017) and Brangetto and Veenendaal (2016) into Russia's modern information warfare strategies suggests Russia's strategy is based on the political warfare strategies developed during the Soviet era. Information warfare is seen as a "war-winning strategy that avoids attribution, inhibits enemy reactions and minimizes expenses – all crucial strategic issues for Russia" (Blank, 2017, p. 93). Further, information warfare is not only about getting messages across ahead of the opponent; it also involves "confusing, distracting, dividing, and demoralizing the adversary" (Brangetto & Veenendaal, 2016, p. 113). With the outcome of information warfare campaigns being disruption rather than destruction, cyberspace provides a perfect platform to carry out influence campaigns and to test the weapon that is information (Brangetto & Veenendaal, 2016).

As Russian Defence Minister Sergei Ivanov stated in an interview in 2007,

the development of information technology has resulted in information itself turning into a certain kind of weapon. It is a weapon that allows us to carry out would-be military actions in practically any theatre of war and most importantly, without using military power. That is why we have to take all the necessary steps to develop, improve, and, if necessary—and it already seems to be necessary—develop new multi-purpose automatic control systems, so that in the future we do not find ourselves left with nothing (qtd. in Blank, 2017, p. 85).

The evolution of Russia's approach to information warfare can be traced through official and semi-official publications, as well as through research literature. The official and semi-official sources include: Russia's National Security Strategy 2020, an official Russian document that outlines Russia's strategic priorities with regard to foreign and domestic policy; the Russian Foreign Policy Concept, which acknowledges the National Security Strategy 2020 and provides a broader framework for Russian strategic and security priorities; the work of Russian military specialists Chekinov and Bogdanov in Military Thought (2010 and 2013), which examines Russia's military security; and the Gerasimov Doctrine, written by the Russian military strategist General Valery Vasilyevich Gerasimov, which explores the changing nature of war and the use of technology by the military.

According to Jaitner and Geers (2015), one of the primary purposes of Russia's National Security Strategy 20206 is to conceptualise Russia's strategic position regarding information security. The document states that Russia's primary goal is to protect its information security borders and disseminate truthful information to Russian citizens through local internet platforms and other forms of social media. The document outlines how the advent of the Internet has allowed Russia to enhance information warfare by providing new means and new tactics for military strategy, and argues that Moscow should embrace the wealth of possibilities that have emerged from Western social trends and technologies. Giles (2009) describes the National Security Strategy 2020 as an umbrella document for Russian policies on security and strategy issues, one that defines Russia's foreign and domestic threats as well as recommendations for dealing with them. The document does not name the countries it believes are attempting military supremacy (Dimitrakopoulou & Liaropoulos, 2010), although several authors argue that the document implies this threat is the US (e.g. Giles, 2009 and Dimitrakopoulou & Liaropoulos, 2010). The strategy does state, however, that the expansion of NATO near Russia's borders is a threat to Russia and its citizens (De Haas, 2009).

The 2013 Russian Foreign Policy Concept, according to Monaghan (2013), was written as part of a periodic update to Russia's domestic and international affairs, as well as an update to Russian strategic planning guidelines. Included in the update was the introduction of the term soft power as an alternative to classical Russian diplomacy. According to the policy, sovereign states currently use soft power against Russia to influence its people and policy. In response to this soft power influence, Moscow has urged Russian media and businesses to promote and fortify Russia in their reporting, with Moscow pledging support for those who engage in this activity (Monaghan, 2013). Within the policy, soft power is defined as a "comprehensive toolkit for achieving foreign policy objectives building on civil society potential, information,

6 On the 31st of December 2020, Putin signed a decree approving the development of a new National Security Strategy (see tass.com for further details).

cultural and other methods and technologies alternative to traditional diplomacy" (Federation, 2013, p. 3). Thus, soft power is a tool to counter any foreign state's application of soft power towards Russia and a way to promote Russian culture.

The Russian political narrative (that is, the narrative Russia disseminates regarding the government and the government's treatment of the Russian people) is targeted not only at individuals residing in Russia but at every Russian-speaking individual worldwide (Jaitner & Geers, 2015). Within this narrative, Russia represents itself as a misunderstood superpower that has held on to traditional values and a conventional way of life. Conventional values in Western society, according to this narrative, have decayed and cannot be trusted, specifically with regard to information that is contrary to Russia's perspective. As Jaitner and Geers (2015, p. 93) write, while "propaganda in Soviet times, […] was largely an unidirectional, top-down phenomenon, today's information warfare encompasses a worldwide audience that is both narrative-bearing and narrative-developing". The narrative Russia has developed and is delivering, according to Jaitner and Geers (2015), suggests to those listening that the West has waged information warfare against the regime and that Russia is the victim.

Chekinov and Bogdanov (2010) reiterate this point when writing on Russia's military security. According to the authors, globalisation and foreign political influences are not aimed at assisting Russia but serve their own political and financial advantage, suggesting that globalisation is a geopolitical process that seeks to "exploit the modern world's realities in their selfish interests disregarding the needs of the remaining participants in international relations" (Chekinov and Bogdanov, 2010, p. 2). The solution to countering this activity, according to Chekinov and Bogdanov (2010), is to invest in asymmetrical military strategies. Chekinov and Bogdanov (2010) write that within Russia's military security there is a growing need for non-military alternatives to confrontation to avoid the deployment of next-generation weapons, defining military security as the incorporation of all forms of security: political, economic, and informational. The authors state that Russia's current military strength cannot compete with that of NATO and the US, and so must rely on an asymmetric war strategy, the core of asymmetric actions being the ability of weaker opponents to fight against stronger opponents. The authors suggest that once this occurs, Russia's military position will be strengthened, as it will include both symmetric and asymmetric strategies in times of war.


Three years later, Chekinov and Bogdanov (2013) expand on their discussion of Russia's military and new-generation warfare and explain how these alternatives are used alongside traditional military strategy in times of war. The strategy is broken down into three stages. The first would occur in the lead-up to war and would involve massive propaganda to lower the morale of both the civilian population and the military personnel of the adversary. Second, on commencement of the war, traditional kinetic means (conventional warfare) would be employed alongside electronic warfare to undermine government systems by targeting critical infrastructure. During the third and final stage, ground forces would be utilised to annihilate the last points of resistance, the final stage being the most crucial, as it ensures the enemy is no longer in a position to retaliate (Chekinov and Bogdanov, 2013).

This strategy was reconfirmed in a report by the Institute of Modern Russia in 2014, on the Kremlin's weaponization of information. The report highlighted three distinct objectives Russia must undertake as part of its information strategy in times of war. The first is to shatter communications, the second is to demoralise the enemy, and the last is to take out the opposition's command structures (Pomerantsev & Weiss, 2014). Unlike Chekinov and Bogdanov’s (2013) strategy, the Institute of Modern Russia's report does not mention the annihilation by ground forces or any last points of resistance; this may be due to the focus of the report being solely information warfare, while Chekinov and Bogdanov (2013) are looking at a more holistic approach to war which includes both kinetic and non-kinetic means.

One of the best-known and most debated Russian doctrines to emerge last decade was the Gerasimov Doctrine. The Gerasimov Doctrine was based on the work of retired Army General Nikolay Yegorovich Makarov (Galeotti, 2016) and was published in 2013 by his successor, General Valery Gerasimov, Chief of the General Staff of the Russian Federation. The Doctrine states that technology is changing war and that, strategically, there is a need to move with these changes: to envisage the future while working towards possible new technologies. The Doctrine also states that the military and academia need to work together to manage this perceived future if they are to succeed (Gerasimov, 2013). Dickey et al. (2015) describe Gerasimov's Doctrine as new generation warfare, with an emphasis on the role non-military measures play in achieving strategic goals. According to Eronen (2016), the Gerasimov Doctrine was written at a time when Russia was unable to compete with foreign capabilities in hybrid warfare, described above. The Doctrine, therefore, is not only a means of highlighting these inadequacies but also a way of informing the government that a more significant investment was needed to bring Russian capabilities in line with foreign adversaries (namely the West) and to exceed foreign capabilities.

There is some debate as to whether the Gerasimov Doctrine was written with the Russian or the US military in mind. Charap (2015), for example, argues that the Gerasimov Doctrine is not about Russia but is "describing what he [Gerasimov] sees as the new US way of war, not Russian Doctrine" (Charap, 2015, p. 53). Bartles (2016), in agreement with Charap (2015), suggests that the West misunderstands Gerasimov's Doctrine and that aspects of the apparent 'new generation warfare' found in Western interpretations of the Doctrine have been practised since the beginning of warfare. Furthermore, the Doctrine clearly states that the greatest threat to Russia is the US.

Robinson et al.'s (2018) research, funded as part of a US Army project into the history of political war, suggests that current Russian doctrine is built not only on Russia's rich history of political warfare but also on its continuing investment in information operations. The Kremlin is analysing the changing nature of warfare via three means: firstly, through observing current US activities; secondly, by drawing on past policy and 'old methods'; and thirdly, by drawing on its domestic political system. The authors further suggest that Russian political warfare has many elements, including diplomatic and proxy, information, cyber, military, intelligence, economic, coordination and leadership (Robinson et al., 2018, pp. 56-63).

According to Robinson et al. (2018, p. 54), Russian "senior leaders take the growing importance of non-military tactics for achieving political objectives" seriously. Robinson et al. (2018) provide a five-category schema to conceptualise Russia's clout over organisations and foreign countries:

1. Organisations that are owned and operated by the government, such as Rossiya Segodnya, a Russian government organisation whose intention is to act as an official Russian international information agency;
2. Semi-governmental organisations that are funded by the government and openly work with the government. An example is RT, formerly Russia Today, whose mission is to "provide an alternative perspective on major global events and acquaint international audiences with the Russian viewpoint" (RT, 2018);
3. Organisations that have no official ties but are widely perceived to be connected to the government. An example is troll farms, described in detail below, which are organised groups formed for the specific purpose of affecting public opinion through the generation of misinformation and/or disinformation on the Internet (Snider, 2018);
4. Organisations such as the Orthodox Church, which are independent decision-makers but whose agendas parallel those of the Russian regime; and
5. Organisations that are not influenced at all by Russia's regime.

Pomerantsev (2015) states that the Kremlin's approach to maintaining control and manipulating the political process with disinformation and propaganda, as well as a culture of buying loyalty, has led to a 'cynical citizenry' (a counterargument could be that it has merely aided the continuation of the cynical citizen). In 1999, one of Putin's priorities as president was to take control of Russia's media. While previously the media had been run by Russian oligarchs who used the platforms to disseminate their own messages, under Putin the media began to report under the Kremlin's rigid constraint, Putin's administration being described by Pomerantsev (2015, p. 40) as "80 per cent propaganda and 20 per cent violence". As a result, Pomerantsev (2015) argues, Russia's allies and international adversaries are left wondering what to expect from the government regarding the information or disinformation that will be propagated.

2.2.3 Russia's Use of Cyber Technology The literature suggests that Russia's government uses any means possible to impact information and spread disinformation (see, for example, Pomerantsev, 2015; Applebaum & Lucas, 2016 and Arabidze, 2018); it is therefore not surprising that the advent of the information age has seen Russia move information warfare online. Brangetto and Veenendaal (2016) define influence cyber operations as online information warfare campaigns that offer minimal possibility of attribution as well as little "real chance of provoking an armed (or even any kind of) response" (Brangetto & Veenendaal, 2016, p. 119). Influence cyber operations are low-risk, low-cost capabilities that contribute to the destabilisation of adversaries and provide an alternative method of achieving Russia's political objectives (Brangetto & Veenendaal, 2016). Influence cyber operations not only provide plausible deniability; they are also a modern-day example of active measures campaigns. Moscow has developed robust capabilities which enable Russia "to shape the perceptions of broad swathes of a population and thus influence elections or bring pressure to bear on democratic governments" (Stout, 2017, p. 1).

Cyber technology provides a gateway for Russia to undertake several information warfare operations, including:

• Intelligence operations;
• Counter-intelligence operations;
• Deceit campaigns;
• The spread of disinformation;
• Electronic warfare;
• Debilitation of communications;
• Degradation of navigation support;
• Psychological pressure;
• Degradation of information systems; and
• Propaganda (Brangetto & Veenendaal, 2016).

As Nye (2017) explains, the use of cyber technology has changed the way information warfare is carried out. Cyber technology makes information warfare not only cheaper and faster but also far-reaching and difficult to detect, thus making it easy to deny.

An important message identified in the literature is that the social and political landscape has changed on a global scale due to the development of information and communication technologies. This is leading to a change in warfare with the introduction of cyber conflicts, as demonstrated in the 2007 attack on Estonia. This was the first time Russia was accused of using cyber technology as a form of warfare, via a distributed denial-of-service (DDoS) attack7 which left significant infrastructure and facilities in Estonia crippled8 (Davis, 2007). Emin Azizov, a representative from Russian Hacker magazine, suggests that the Russian government

7 A distributed denial-of-service (DDoS) attack leverages multiple systems to attack the victim, making it difficult to defend against. This is achieved by the attacker gaining control over one system, dubbed the master system, and then using this system to control other 'zombie' systems to form a botnet. "Once the botnet is assembled, the attacker can use the traffic generated by the compromised devices to flood the target domain and knock it offline" (Rouse, 2016).
8 According to Brangetto and Veenendaal (2016), DDoS attacks are the most common of all influence cyber operations.

did not coordinate these attacks. Instead, Davis (2007) claims that "patriarchs of Russia were responsible", that is, the children and grandchildren of Russian soldiers who fought in World War II, defending the honour of the fallen by protesting Estonia's decision to move a bronze statue of a Russian soldier to a new location. Azizov (in Davis, 2007) argues that the botnets9 involved in the attack, which were usually bots for hire used by Russian criminal networks and organised crime groups, were volunteered at no cost: it was Russian pride that drove the attack. By contrast, Herzog (2011) concluded that the attack had been undertaken by Russian hackers controlled by the government, together with criminal hackers who, when caught, had been given a choice either to work for Russia's Federal Security Service (FSB) or to go to jail. This is in line with Davis' (2007) conclusion that a particular department within the FSB had since been identified as responsible for the attack against Estonia, and that it was not just a patriotic response from Russian hackers.
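The mechanics described in footnote 7 — one master coordinating many 'zombie' machines whose combined traffic overwhelms a target — can be illustrated with a simple back-of-the-envelope model. The sketch below is purely illustrative, not attack code: all capacity and rate figures are invented assumptions, and the point is only the arithmetic of why *distribution* is what makes a DDoS effective.

```python
"""Toy model of the DDoS structure in footnote 7: a server that easily
absorbs any single attacker is overwhelmed once a botnet of 'zombies'
floods it simultaneously. All figures below are invented for illustration."""

SERVER_CAPACITY = 10_000   # requests/second the target can serve (assumed)
NORMAL_TRAFFIC = 2_000     # legitimate requests/second (assumed)
ZOMBIE_RATE = 50           # requests/second each compromised machine sends (assumed)

def is_overwhelmed(num_zombies: int) -> bool:
    """True if legitimate traffic plus the botnet's flood exceeds capacity."""
    flood = num_zombies * ZOMBIE_RATE
    return NORMAL_TRAFFIC + flood > SERVER_CAPACITY

# A single attacking machine is easily absorbed...
print(is_overwhelmed(1))      # False
# ...but a modest botnet of 200 machines knocks the service over.
print(is_overwhelmed(200))    # True
```

The asymmetry the model captures is the one the literature emphasises: no individual 'zombie' is remarkable, so the attack is hard to attribute, yet the aggregate is decisive.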

Seven years later, during the 2014 Russia–Ukraine conflict, Russia's information warfare, according to the literature, appeared to intensify and extend globally, demonstrating a new calibre of modern warfare strategy (see, for example, Thornton, 2015 and Snegovaya, 2015). According to Kalinina (2016), Russia's use of technology and advanced communications made it difficult for its opponents to counter the rhetoric. As Kalinina (2016, p. 161) explains,

The funds and vigour invested in these information campaigns mediated via any available channel led to Russia’s almost certain superiority within the information landscape. Even though it is difficult to measure the effect of these undertakings, as they run parallel to other, more “traditional”, political and military activities, it is evident that the content of the mediated information has a significant impact. Carefully constructed narratives have legitimising and mobilising effect and create social reality for people living within the information landscapes where these narratives are being communicated.

It is suggested by many commentators, including the European People's Party's think tank, that Putin, since coming to power, has engaged in an information war against liberal values to upset the balance in Europe (Samadashvili, 2015). Furthermore, the literature suggests that

9 As described in Footnote 7, a botnet is a network of compromised computers, known as zombies, that can be controlled simultaneously from a master system (DuPaul, 2012).

Russia is attempting to befuddle western countries through the appropriation of western media outlets and through buying political clout. According to Salome Samadashvili (2015), Executive Director of Samadashvili International Consultants and Ambassador to the Kingdom of Belgium, Russia is utilising political and business allies across Europe to help spread its message. One example is Russian non-governmental organisations (NGOs) such as the Institute for Democracy and Cooperation in New York, which, according to Samadashvili (2015), is a Russian think tank situated in the US.

Wirtz (2015) asserts that Russia, more than any other actor in the cyber realm, has devised a way to achieve political objectives using cyber warfare. Russia sees information warfare as an essential prelude to traditional military operations (Wirtz, 2015): for example, the use of propaganda campaigns to sap the morale of an opponent's citizens and army, and/or strikes to take down communication networks (see Chekinov and Bogdanov, 2013 and Stout, 2017 above). This point is reinforced by the fact that, despite strategic inferiority to NATO and the US, Russia has won every conflict it has participated in since 2000 (Blank, 2017), Russia's "use of information warfare in all these operations [attesting] to its improved grasp of how information and cyber operations (which are a unified phenomenon in its thinking) contribute to victory" (Blank, 2017, p. 85). Russia's information warfare campaign has not gone unnoticed. For example, after Russia's 2014 invasion of Ukraine and the accompanying propaganda blitz that confused people in Russian-speaking areas of Ukraine, the Ukrainian Government set up StopFake, an organisation whose priority is to "expose and ridicule Russian propaganda" (Applebaum & Lucas, 2016, p. 1).

2.3 Troll Farms, Trolls and Social Bots

Essential to Russian online operations is the existence of troll farms, trolls and social bots. According to Snider (2018), a troll farm is an organised group formed with the specific aim of influencing public opinion. The trolls work to generate misinformation and disinformation through the use of online traffic. Misinformation, according to Hao and Wei (2013), is the idea that the sender of a message can manipulate it in such a way that the receiver interprets the message within the constructs of the sender's information structure. Misinformation, according to Anwar, Ang, and Jayakumar (2021), although false, may not have been created with the intention of causing harm. In comparison, disinformation may be defined as falsehoods or rumours which were:


(i) Propagated as part of a political agenda by a domestic group/relativization/differing interpretation of facts based on ideological bias—some of this can achieve viral status whether or not there is malicious intent; (ii) Part of state-sponsored disinformation campaigns, and which can undermine national security and resilience (Anwar et al., 2021, p. 7)

In their analysis of trolling behaviour, Paavola, Helo, Jalonen, Sartonen, and Huhtinen (2016) describe trolling as an effective tool for propagating disinformation, forcing discussions off-track even when there are no legitimate facts to back the argument. In today's hyper-connected network society, where social network posts have caused stock market crashes10, trolling is performed not only by individuals wanting "to cause disruption and/or trigger or exacerbate conflict for [...] their own amusement" (Hardaker, 2010, p. 237), but also as a form of information warfare.

Trolling is divided into two types: classic and hybrid trolling (Paavola et al., 2016). Classic trolling behaviour is where the individual feeds a psychological need to troll, while hybrid trolling behaviour involves the propagation of a particular agenda, usually political (Paavola et al., 2016). In comparison, according to Zannettou et al. (2018), trolls are either individuals who are 'paid to operate' or individuals who have been called out by other users as 'trolls'. Russia began training internet trolls in the 2000s to spread positive propaganda about the government of Russian President Vladimir Putin. According to Gregory (2018), Putin's online army of trolls flooded news and commentary sites with sometimes abusive content in defence of Russia's military actions. According to the literature, Russian trolls today are used to produce disinformation campaigns "engineered for the social media age, and they fling up swarms of falsehoods, concocted theories and red herrings, intended not so much to persuade people but to bewilder them" (Warrick & Troianovski, 2018, p. 1). There appears to be no grand Russian masterplan in relation to disinformation campaigns, just a broad strategy of creating an environment more conducive to Putin and the interests of Moscow (Galeotti, 2019). Modern technology has allowed this process of saturating an audience and then recycling

10 In 2013, a tweet sent from a hacked account claiming President Obama was injured saw the Dow Jones Industrial Average drop 143.5 points and the Standard & Poor's 500 Index lose more than $136 billion in value within seconds of the post being sent.

disinformation campaigns to do it all again. Sock puppet websites can achieve significant reach and penetration, making it easy for a disinformation campaign to propagate (Giles, 2016).

Unlike trolling activity, which is carried out by humans, social bots, according to DuBois, Golbeck, and Srinivasan (2011), are computer algorithms that automate interactions with humans on social media networks (SMNs) such as Twitter and Facebook. Social bots are a relatively new phenomenon, first appearing in approximately 2010 during the US midterm elections (Ratkiewicz et al., 2011). Social bots can be innocuous or even favourable, aggregating content in ways that Ratkiewicz et al. (2011) suggest are useful to the user. In contrast, Ferrara, Varol, Davis, Menczer, and Flammini (2016) define social bots as malicious entities with the specific purpose of manipulating SMNs via rumour spreading, spam, misinformation propagation, slander, malware and noise. Malicious social bots, according to Ferrara et al. (2016), will in certain circumstances lead a user to believe that an idea is more popular than it is by populating SMN sites with information about a particular idea or position. Combined with the study by Kramer, Guillory, and Hancock (2014), which demonstrated that emotions on SMNs can be contagious, malicious social bots are well positioned to manipulate particular online social networks and alter user perceptions of reality (Ferrara et al., 2016).
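Ferrara et al.'s point about manufactured popularity can be made concrete with a toy simulation. The sketch below is purely illustrative — the feed size, level of genuine support and bot count are invented assumptions — but it shows how a bloc of bots all pushing one position shifts the apparent balance of opinion that a user would infer from their feed.

```python
"""Toy simulation of social-bot amplification: bots uniformly posting one
position inflate its apparent support in a mixed feed. All parameters are
invented for illustration."""

import random

random.seed(0)  # fixed seed so the sketch is reproducible

def build_feed(n_humans: int, n_bots: int, human_support: float) -> list[str]:
    """Humans post 'pro' with probability human_support; every bot posts 'pro'.
    The feed is shuffled, so bots are indistinguishable from humans."""
    feed = ["pro" if random.random() < human_support else "anti"
            for _ in range(n_humans)]
    feed += ["pro"] * n_bots
    random.shuffle(feed)
    return feed

def perceived_support(feed: list[str]) -> float:
    """Share of posts in the feed that back the 'pro' position."""
    return feed.count("pro") / len(feed)

organic = build_feed(n_humans=1000, n_bots=0, human_support=0.30)
botted = build_feed(n_humans=1000, n_bots=500, human_support=0.30)

print(f"organic feed: {perceived_support(organic):.0%} appear 'pro'")
print(f"with bots:    {perceived_support(botted):.0%} appear 'pro'")
```

With roughly 30 per cent genuine support, adding 500 bot posts to 1,000 human posts pushes the apparent support above half, illustrating why Ferrara et al. (2016) argue that bots can make an idea seem more popular than it is.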

Ferrara et al. (2016) suggests that estimates concerning the spread of social bots on SMNs vary greatly. Social bots may be owned by

State or non-state actors, local and foreign governments, political parties, private organisations, and even single individuals with adequate resources could obtain the operational capabilities and technical tools to deploy armies of social bots and affect the directions of online […] conversation (Bessi & Ferrara, 2016, p. 11).

Various studies have demonstrated that social media is used extensively to foster political conversations (see, for example, Piedrahita, Borge-Holthoefer, Moreno, and González-Bailón (2018)) and now plays a significant role in advancing these conversations. It is estimated that 400,000 bots were engaged in political discussion during a four-week review of twenty-two election hashtags11 on Twitter leading up to the 2016 US presidential election (Ferrara et al., 2016). A report by Kim, Graham, Wan, and Rizoiu (2019, p. 1) demonstrated that troll and bot activity is not mutually exclusive: during the interference in the 2016 US presidential election, the two account types established a symbiotic relationship "to weaponise social media, to spread state-sponsored propaganda and to destabilise foreign politics". With regard to Twitter, human actors are crucial in spreading propaganda and disinformation (Zannettou et al., 2018).

According to Giles (2016), Russian trolls and social bots are a vital means of spreading disinformation. They allow for direct interaction with the reader through services such as online discussion boards and Twitter accounts, which "act as a force multiplier for driving home the Russian message". In 2015, news of the existence of the Russian troll farm the Internet Research Agency (IRA) began to surface publicly, with several newspaper articles and exposés emerging on the IRA (see, for example, Chen, 2015 and AJC, 2015). However, it should be noted that academic research is scant on the subject of troll farms, and research on the rise of the Kremlin troll in particular is minimal. This research project fills that gap in academic research on Russian troll farm activity.

2.4 Russia’s Modern-Day Propaganda Campaigns

2.4.1 The Re-invention of Soviet Propaganda

According to Fitzgerald and Brantly (2017), the re-invention of Soviet propaganda laid the foundation for future generations to respond to geopolitical events. This included the use of press agencies, organisations and news outlets, and the recruitment of journalists outside the Soviet Union to produce articles as per Soviet direction (Fitzgerald & Brantly, 2017). By 2000, Moscow had become “aware of the increasing prevalence and importance of the Internet [and] established websites dedicated to the dissemination of Russia propaganda” (Fitzgerald & Brantly, 2017, p. 225). Today, Russia’s propaganda model requires a high number of channels to disseminate information and a willingness of these channels to disseminate lies and partial truths (Paul & Matthews, 2016). The modern literature suggests that Russia’s approach to propaganda is built on:

11 #election2016, #elections2016, #tcot, #p2, #hillaryclinton, #donaldtrump, #presidentialdebate, #debates2016, #imwithher, #trump2016, #nevertrump, #neverhillary, #trumppence16, #hillary, #trumpwon, #debate, #trump, #garyjohnson, #jillstein, #jillnothill, #debatenight, #debates, #VPDebate.

Soviet Cold War-era techniques, with an emphasis on obfuscation and on getting targets to act in the interests of the propagandist without realising that they have done so. In other ways, it is entirely new and driven by the characteristics of the contemporary information environment. Russia has taken advantage of technology and available media in ways that would have been inconceivable during the Cold War (Paul & Matthews, 2016, p. 1).

According to Siggett (2017, p. iii), Russia’s information warfare strategy is the amalgamation of “old school tactics with new age technology”, with the information operations Russia conducted during the Cold War having evolved to include trolls and social media, as described above. In this sense, cyber technology is used to circulate pro-Russian propaganda and undermine perceived rival governments and institutions via two primary means. The first is cyberespionage, where technology is used to source confidential political information from private systems, which is then leaked publicly. The second is through the employees of troll farms, who create online profiles and fake blogs to spread misleading or pro-Russian viewpoints. As discussed, the aim of the trolls is not to fabricate lies that everyone will believe, but merely to sow doubt in the minds of those who read their rhetoric (Connell & Vogler, 2017). According to Ben Nimmo of the Atlantic Council, quoted in the 2018 book LikeWar, Russia’s information warfare strategy relies on the four Ds: “dismiss the critic, distort the facts, distract from the main issue, and dismay the audience” (Bunker, 2018, p. 107).

While previous attempts to plant stories in international media outlets had, according to Treverton (2017), former Chair of the US National Intelligence Council, been almost impossible for the Soviet Union, today’s social media platforms have made doing so easier and more accessible. They allow Russia to create what Treverton (2017) refers to as ‘noise’, as well as deception to distract the reader. Treverton (2017, p. 18) asks, “much as magicians divert to deceive, how much of Russian noise is intended to distract, and why?”. An example of ‘noise’ is described by Fitzgerald and Brantly (2017) in their account of a story planted by Moscow during the 2008 Georgia war. Russia, exploiting the underlying ethnic tension already present in Georgia at the time, circulated stories of brutality by the Georgian government. These stories were picked up by international media outlets and later disproven. The “unfounded stories of atrocities were nevertheless successful in helping to alienate the Georgian identity from minority populations in Abkhazia and Ossetia” (Fitzgerald & Brantly,


2017, p. 226). Russia had drawn on a massacre of the Abkhazian people that occurred almost 200 years earlier; by invoking this historical episode, Russia was able to rekindle racial, social and cultural emotions, which led to the success of the disinformation strategy.

Thomas Rid (2012), in his article ‘Cyber War Will Not Take Place’, identified three categories under which he claims all cyber activities fall: espionage, subversion and sabotage. According to Nissen (2015), however, these three categories are not enough to cover social media and the activities and effects created through the use of these networks. Instead, Nissen (2015) derived an effects framework which incorporated Rid’s (2012) three categories but also considered authors such as British academic Shima Keene (2011) and NATO’s research into ‘effects’, to create a framework “for organising and describing activities and effects in order to systematize the analysis of both the theoretical and the empirical aspects of social media in contemporary conflicts” (Nissen, 2015, p. 59). The framework is made up of six core activities in which SMNs may be applied to military ends by both non-state and state actors: intelligence collection, targeting, psychological warfare, cyber-operations, defence, and command and control.

According to Nissen (2015), targeting finds or identifies which SMN profiles and accounts should be hacked (attacked), monitored or influenced. Intelligence collection is the search for and subsequent analysis of information from SMNs. Psychological warfare is the use of disseminated messages to influence target audiences: how those audiences perceive certain messages, and their attitudes and behaviour towards them. Operations seek to breach, alter content, or render an SMN account unusable. Defence is the protection of the user's own SMN network, and command and control is the use of SMNs to synchronise and coordinate activities. As Nissen (2015, p. 60) states, “a common denominator, […] is that all of these activities, regardless of whether they may have both on- and offline effects, can be conducted in and through social network media”.


[Diagram 2.1 depicts ‘The weaponization of SMNs (activities and effects)’ as six core activities — Targeting, Intelligence Collection, Command and Control, Psychological Warfare, Defence and Operations — each surrounded by its associated effects (e.g. monitor, collect, exploit, facilitate, coordinate, synchronise, shape, inform, influence, manipulate, mislead, expose, diminish, detect, promote, prevent, deceive, secure, coerce, protect, deter, mobilise, deny, convince, disrupt, degrade, breach, destroy).]

Diagram 2.1 – Activities and Effects Framework (adapted from Nissen 2015)
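For readers who think in data structures, Nissen's six core activities can be encoded as a small lookup table. The descriptions below paraphrase the summaries given in the text; the table itself and the `classify` helper are illustrative devices of this sketch, not part of Nissen's (2015) framework.

```python
"""Illustrative encoding of Nissen's (2015) six core activities for the
weaponization of SMNs. Descriptions paraphrase the thesis text; the
structure and helper function are this sketch's own invention."""

NISSEN_ACTIVITIES = {
    "Targeting": "find or identify which SMN profiles and accounts should be "
                 "hacked, monitored or influenced",
    "Intelligence Collection": "search for and analyse information from SMNs",
    "Psychological Warfare": "disseminate messages to shape target audiences' "
                             "perceptions, attitudes and behaviour",
    "Operations": "breach, alter content or render an SMN account unusable",
    "Defence": "guard the user's own SMN network and accounts",
    "Command and Control": "use SMNs to synchronise and coordinate activities",
}

def classify(keyword: str) -> list[str]:
    """Naive keyword lookup: which activities mention this effect verb?"""
    return [name for name, desc in NISSEN_ACTIVITIES.items()
            if keyword.lower() in desc.lower()]

print(classify("monitor"))      # ['Targeting']
print(classify("synchronise"))  # ['Command and Control']
```

A lookup like this is of course far cruder than Nissen's diagram, which groups dozens of effect keywords around the six activities, but it captures the framework's basic shape: a fixed set of activities, each carrying its own family of effects.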

The weaponization of SMNs, however, transcends Russian information warfare techniques and has been studied across various communities and cultures. Oxford University’s Computational Propaganda Research Project found at least 48 regimes that have applied a model of steering public opinion, spreading misinformation and undermining critics (Bunker, 2018). According to Suzor (2018, p. 129), for example, “the military of Myanmar executed an extensive, systematic campaign involving hundreds of military personnel who used fake Facebook accounts to spread anti-Rohingya propaganda, flooding news and celebrity pages with incendiary comments and disinformation”. The military campaign, according to Suzor (2018), used language to dehumanise and demonise the Rohingya people. A further example may be seen in the experience of Ceyda Karan, a journalist who worked for one of Turkey’s few independent newspapers, Cumhuriyet, and who endured a three-day trolling campaign after two

journalists with a high volume of followers posted fraudulent information against her. The trolling campaign involved almost 14,000 tweets and was incited after Karan spoke out against the Turkish government (Nyst & Monaco, 2018).

Alice Marwick and Rebecca Lewis from the Data and Society Research Institute have documented “how far-right political groups have learnt to exploit the algorithms that influence visibility on social media platforms to increase the circulation of propaganda and fake news” (Suzor, 2018, p. 22). According to van Dijck (2018), the emergence of the platform society has given people a way to exploit the Internet in order to manipulate the controls in place to protect individuals. The platform society is a hybrid of platform owners and users which bypasses regulations grounded in societal binaries such as private versus public, consumer versus citizen and market versus state — binaries on which institutional and legal frameworks are predicated. As van Dijck (2018, p. 21) explains, “the deliberately hybrid status allows platform operators and users to bypass regulations or escape professional norms and standards to which most sectors are subjected, either by law or by customer, this creating a legal and social grey area”. In contrast, SMNs have also been used to liberate individuals: for example, the Turkish academic Zeynep Tufekci has demonstrated how people use social media networks to organise political action, and the way that SMNs have empowered social movements, as discussed in relation to the colour revolutions in Chapter Five.

2.4.2 The Effectiveness of Russia’s Information Warfare Against Europe and Neighbouring Countries

The effect of Russian propaganda campaigns in Russia and the neighbouring countries of Ukraine, Azerbaijan and Kyrgyzstan has, according to studies, produced mixed results. For example, a study by Gerber and Zavisca (2016b) concluded that “the Russian narrative regarding the malevolence of the United States—and, secondarily, its European allies—resonates most in Russia and least in Ukraine [with] mixed appeal in Kyrgyzstan and Azerbaijan” (Gerber & Zavisca, 2016b, p. 86). The study also concluded that the more exposed a person was to Russian-based broadcasts, the more likely they were to accept the Russian propaganda narrative.

Do Russian propaganda campaigns only influence people who are already sympathetic to Russia, or do they also influence those who were not previously sympathetic? As Treverton (2017, p. 12) highlights, modern technology, rather than allowing people to view all angles of an argument, instead segments people into ‘echo chambers’ “in which they hear only what they want and learn only what they already thought”. Under this theory, Treverton (2017) states that only those individuals who have already subscribed to Russian propaganda, or are sympathetic to Russia, are viewing Russia’s narrative. Bunker (2018) describes this phenomenon as homophily, meaning ‘love of the same’.

In numerous studies, across numerous countries, involving millions of people, researchers have discovered a cardinal rule that explains how information disseminates across the internet, as well as how it shapes our politics, media, and wars. The best predictor is not accuracy or even content; it is the number of friends who share the content first. They are more likely to believe what it says – and then to share it with others who, in turn, will believe what they say (Bunker, 2018, p. 123).
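The ‘cardinal rule’ in the passage above — that the probability of sharing rises with the number of friends who have already shared, independent of accuracy — can be sketched as a simple probabilistic cascade on a toy friendship network. Everything here (the network shape, the per-friend probability, the round limit) is an invented assumption used only to illustrate the dynamic.

```python
"""Toy cascade model of the sharing rule quoted from Bunker (2018):
each extra friend who has already shared an item raises a user's own
chance of sharing it. All parameters are invented for illustration."""

import random

random.seed(42)  # fixed seed so the run is reproducible

def simulate_spread(friends: dict[int, list[int]], seeds: set[int],
                    per_friend_prob: float, rounds: int = 10) -> set[int]:
    """Each round, every non-sharer shares with probability
    1 - (1 - per_friend_prob) ** exposures, where exposures counts
    friends who have already shared. Accuracy plays no role."""
    shared = set(seeds)
    for _ in range(rounds):
        new = set()
        for user, user_friends in friends.items():
            if user in shared:
                continue
            exposures = sum(1 for f in user_friends if f in shared)
            if exposures and random.random() < 1 - (1 - per_friend_prob) ** exposures:
                new.add(user)
        if not new:       # cascade has died out
            break
        shared |= new
    return shared

# Toy ring network: 20 users, each befriending their 4 nearest neighbours.
friends = {u: [(u + d) % 20 for d in (-2, -1, 1, 2)] for u in range(20)}
reached = simulate_spread(friends, seeds={0}, per_friend_prob=0.4)
print(f"{len(reached)} of 20 users shared the item")
```

Note what the model deliberately omits: the item's truthfulness never enters the sharing decision, which is precisely the point the quoted studies make.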

Robinson et al. (2018), writing on Russia’s information warfare, suggest that the influence of Russia’s information campaigns varies depending on the context of the campaign, with a greater impact seen in and around Russia’s borders. However, according to Robinson et al. (2018, p. 69), due to an increase in Russian information warfare and hybrid threats, “Russia will most likely continue to hone and expand its information tools of influence both close to and far beyond its borders”. As such, according to the authors, Russia will continue to develop information warfare techniques with which to influence a broader range of individuals and governments, regardless of whether they are already sympathetic to Russia.

In addition to the above authors, many government bodies have also researched Russia's propaganda campaigns. The European Endowment for Democracy undertook a comprehensive study of the reach and impact of Russian-language media, which has led to numerous recommendations. For example, the organisation has recommended surveys of Russia's neighbouring countries to monitor Russian propaganda. A further recommendation is to support grassroots organisations such as StopFake, which is dedicated to finding and exposing fake news stories (Waszczykowski, 2015). The East StratCom Task Force was also set up, in 2015, to report and analyse Russian disinformation trends and to explain and correct disinformation narratives, with the Task Force charged with raising public awareness of disinformation (European Union External Action, 2017). NATO countries have also set up a centre in Latvia to respond to Russian disinformation. With regard to Western countries such as the US, little had been invested before the 2016 presidential election results to assist in


the understanding of Russian disinformation campaigns (Applebaum & Lucas, 2016) despite literature suggesting for several decades that the US is vulnerable to all forms of information warfare (e.g. Libicki, 1995).

2.5 Russia's New Information Warfare Against the US

2.5.1 Russia's Information Warfare against US Presidential Elections

Russia and the US have a long history as adversaries. It is therefore not surprising that Russia (as discussed in greater detail in Chapter Eight) would have an interest in who is elected to run the opposing country. According to various sources discussed below, over the last half-century Russia has made several attempts to influence US presidential elections. The first known case was in 1960, when Adlai Stevenson was approached by a Russian diplomat and informed that the Russian government believed Stevenson would be the best candidate to help with the development of Russian–US relations. Moscow was reported to have offered support to Stevenson if he considered running for President; Stevenson was said to have turned the offer down (Daley, 2017b). In 1968, presidential nominee Hubert Humphrey was offered a similar partnership to that offered to Stevenson; Humphrey is also reported to have turned the offer down (Doran, 2017).

In 1976, the USSR was accused once more of interfering in a US presidential election when Moscow falsified documents to derail Henry Jackson's presidential chances. The falsified reports claimed that Jackson was a homosexual, an allegation which, if believed, would have sent Jackson to jail (Doran, 2017). A further example of the USSR attempting to influence a US presidential election came in 1984, with propaganda that portrayed presidential nominee Ronald Reagan as a warmonger, the USSR promoting the slogan 'Reagan means War' (Economist, 2016). After the 1984 election, Russian interference, or attempted interference, in US elections appears to have stopped. According to Richard Burr, Republican senator from North Carolina, in an interview with PolitiFact, the 2016 US presidential election is the first known case of Russian interference in 40 years.

2.5.2 The US 2016 Presidential Election

The Intelligence Community Assessment's examination of Russia's interference in the 2016 election concluded that Russian President Vladimir Putin had ordered the campaign aimed at the 2016 US presidential election. The report claimed that the goal of the operation was to "undermine public faith in the US democratic process, denigrate Secretary Clinton, and harm


her electability and potential presidency" (Intelligence Community Assessment, 2017, p. ii). Furthermore, the Intelligence Community Assessment concluded that the Russian campaign demonstrated a clear preference for President-elect Trump. The influence campaign, examined in detail throughout this thesis, consisted of cyber-activity and state-funded media campaigns as well as social media campaigns.

A robust discussion has followed regarding whether the Russian information warfare campaign against the 2016 US presidential election was successful. According to Senator Mark R. Warner, the campaign was very successful; as he stated during the US Senate Judiciary Subcommittee on Crime and Terrorism, "if you look back at the results, […] it's a pretty good return on investment" (Timberg, Dwoskin, & Entous, 2017, p. 1). One interpretation of Warner's comments is that, without Russia's interference, President Trump would not have been elected. According to former National Security Agency and CIA director General Michael Hayden, Russia's interference in the 2016 US presidential election was "the most successful covert influence campaign in recorded history" (qtd in Gioe, 2018, p. 2).

Gioe (2018) argues that Russia's success is a result of a combination of traditional human intelligence (HUMINT) and cyber tactics, such as phishing, hacking, social engineering, and weaponizing stolen information. In contrast, Allen and Parnes (2017) and Isikoff and Corn (2018) argue that even though Russia may have been partially responsible for the election of President Trump and the failure of Hillary Clinton's presidential bid, it was Clinton's own actions that contributed to her loss. What is evident in both arguments, suggests Inkster (2016), is that the election tampering demonstrated the West's vulnerability to information warfare.

Gioe (2018), although in agreement that the Russian influence campaign was successful, suggests that this success cannot be measured by voter turnout, as indicated by Warner (2017). Instead, the success of the campaign can only be measured by the political fallout that has ensued since the discovery of the Russian interference, fallout which has fractured American society. James Clapper, the former Director of National Intelligence in the US, stated that Moscow's methods and tactics during the 2016 presidential election generated a considerable amount of influence that is impossible to measure by the election outcome alone (qtd in Gioe, 2018).


2.5.3 The Use of Troll Farms against the US

Gerber and Zavisca (2016b) state that Russia's use of troll farms is not a new phenomenon. Previously, Russia has used troll farms to push rhetoric that suited the Russian government in both a domestic and an international capacity (Gerber & Zavisca, 2016b). According to Platon Mamatov, director of the Russian company Magic Inc PR, between 2008 and 2011 Mamatov ran a troll farm in the Ural Mountains which employed 20–30 people to "carry out online tasks for Kremlin contacts and regional authorities from Putin's party" (Chen, 2015a, p. 1). Mamatov, in a 2017 interview, stated that troll farms at that time employed approximately 1,000 individuals in Russia. In 2014 there appeared to be an increase in anti-American rhetoric coming out of Russian troll farms (Gerber & Zavisca, 2016b). A year later, the Strategic Communications Centre of Excellence, set up by NATO in Latvia to counter Russian propaganda, reported that the Russian government was "pursuing a conscious strategy of swaying public opinion in its favour and against the United States and its NATO allies both domestically and abroad" (Gerber & Zavisca, 2016b, p. 80).

In 2015, an exclusive magazine exposé was run on the Russian troll farm operating under the company name Internet Research Agency (IRA) (Chen, 2015a). The IRA, according to the United States District Court for the District of Columbia (2018) indictment, was formed with the specific purpose of waging an information warfare campaign against the US through the use of internet trolls. The article recounted numerous incidents carried out by the IRA, which appeared to be tests of strength by the Russian troll farm. The first of these events occurred on the 11th of September 2014, when the troll farm executed a barrage of tweets, fabricated images and footage of an explosion at a Columbian Chemical plant in the USA, discussed in detail in Chapter Six. As would later be reported, there was no explosion, no fallout and no company named Columbian Chemical (R. Smith, 2015). The images, and the effort put in to convince the US public that another 11th of September attack had occurred, were, according to Smith (2015), extraordinary. The second and third campaigns carried out by the IRA, also discussed in Chapter Six, occurred in Atlanta during December 2014 and involved a hoax Ebola outbreak and a hoax campaign around the alleged shooting of an unarmed African American woman. Once more, the Internet was bombarded with tweets, images and alleged footage of the incidents (AJC, 2015a). There does not appear to be any clear motive behind the attacks. However, as suggested by Chen (2015a), the social-media campaigns could be construed as 'practice' runs for a much bigger campaign.

In 2016, according to the literature, the attention of the IRA appears to have moved to the upcoming US presidential election. The United States District Court for the District of Columbia (2018, p. 6) indictment stated that the IRA’s primary goal during this time was to spread "distrust towards the candidates and the political system in general". According to an internal IRA memorandum written in 2016, employees were to use any opportunity made available to criticise Hillary Clinton and other candidates, excluding from this criticism one Democratic candidate and the Republican candidate Donald Trump.

Numerous studies have been undertaken on the 2016 US presidential election and the influence the Russian government may or may not have had on the outcome. For example, Tornoe (2018), writing on a Pew Research Center study, concluded that 64 percent of adults surveyed believed that fake news stories caused significant confusion, while 23 percent of those surveyed admitted to either unintentionally or intentionally sharing fake news themselves. One particular case Tornoe (2018) examines is @Jenn_Abrams, a persona run by a Russian operative working out of the IRA during the election campaign, which was quoted by several major news outlets, such as USA Today and the BBC. According to Rodriguez (2019), in his study of disinformation operations aimed at democratic elections, it is clear from the IRA example that this activity violates the primary norms of international law, such as sovereignty, non-intervention and due diligence.

This chapter has examined in detail the literature on Russian information operations. As demonstrated, there is little debate that Russia is engaged in an information campaign designed not only to flood the information market with Kremlin-manipulated narratives, but also to confuse and sow doubt among those listening. More recently, the Kremlin has extended its information operations to the Internet, where Moscow’s narrative is now able to reach a global audience at speeds never experienced before. With the introduction of the Internet and social media networks (SMNs) the Kremlin has extended its reach towards the West, where the Russian government initiated an information campaign around the 2016 US presidential election. Although such interference is not unique to this period, it has been argued, as demonstrated above, that the Russian government succeeded in its mission to throw doubt on the American election process and cause chaos and confusion in the aftermath.


Chapter Three – Methodology

This project is built on specific research methodologies, as will be outlined in this chapter; what follows is a roadmap of the overall research approach adopted. The chapter will describe the methods used and the justification for applying these methods as opposed to other research approaches. The chapter will also defend the methods chosen to answer the research questions.

3.1 Research Questions

This research project explores theories of Russian information operations on social media, demonstrating that Russian information operations are fluid rather than fixed: they evolve and adapt by learning, as demonstrated through the examination of Kremlin troll activity on Twitter. The overarching research question for this project is therefore “How have Russian information operations developed since the collapse of the Soviet Union?”

Subsequent research questions included:

1. How has the Kremlin troll operation/strategy developed?
2. How does Russia use social media to support its strategic goals?
3. How do social media platforms respond to State-sponsored trolling operations?

To answer these questions, a variety of methodological approaches is applied throughout this dissertation. The final section of this chapter, the Chapter Review, will provide an overview of the various methodological approaches applied throughout each chapter.

3.2 Research Approach

When deciding on a research approach to adopt, a review of social research texts demonstrated varying approaches to research design. For example, Bryman (2016) suggests that social research should follow specific methodological processes, beginning with the identification of theory and concluding with the identification of external influences on the researcher, as outlined in Process Diagram 3.1.


Theory → Epistemological Perspective → Ontological Issues → Research Strategy → Influences

Process Diagram 3.1—Social Research Process (Adapted from Bryman 2016).

In comparison, Crotty (1998) suggests that the researcher’s epistemological stance impacts the theoretical perspectives chosen by the researcher; this, in turn, has a knock-on effect on the methodology adopted and the researcher’s methods. Gray (2013) has captured Crotty’s (1998) view of the relationship between epistemology, theoretical perspectives, methodology and methods, as per Process Diagram 3.2.

Epistemology → Theoretical Perspectives → Methodology → Methods

Process Diagram 3.2 – Crotty’s (1998) Research Design

There is a distinct difference in the way that Bryman (2016) and Crotty (1998) define epistemology, ontology and theoretical perspectives. Crotty (1998) identifies the epistemological perspectives as objectivism, constructivism and subjectivism. In comparison, Bryman (2016) defines epistemology as belonging to either the natural or the social sciences; it may therefore be defined, for example, as positivism or interpretivism.

Bryman’s (2016) approach to social research has been applied as a framework for this study, as outlined in Process Diagram 3.1. This is primarily because Bryman (2016) is the more contemporary text, which was pertinent when examining the examples both Crotty (1998) and Bryman (2016) provide throughout their explanations of the research process. Being the newer text, Bryman (2016) drew on recent research projects and related them to research methodologies, several of which touch on E-research, big data and computer-assisted content analysis, most of which did not exist in 1998. For example, Bryman’s (2016) use of Snee’s (2013) and Greaves et al.’s (2014) research using social media networks, and of Humphreys, Gill, and Krishnamurthy’s (2014) and Ledford and Anderson’s (2013) use of big data, all provided a basis for me to understand my research project.


However, the research that I have undertaken is novel and unique in some respects: this is the first time an SMN has released a dataset of State-sponsored activity for academic study. Therefore, as will be demonstrated, I have at times had to develop my own methodological approach to fit the current research project.

3.2.1 Theory

Researchers utilise theory in many ways, for example deductively and inductively. A deductive approach is one in which theoretical ideas drive the collection and analysis of data; an inductive approach is an open-ended strategy in which "theoretical ideas emerge out of the data" (Bryman, 2016). According to Gray (2013), an inductive approach to research implies that pre-existing theories or ideas are not considered when approaching the research. Richards (1993, p. 40) argues, however, that one cannot go theory-free into a study, and that inductive and deductive research are linked approaches. Further, research requires a repeated exchange between inductive and deductive approaches (Parkhe, 1993). The research design and questions for this project were established via an examination of current information operation theories, as outlined in Chapter Two. The project then remained open to new theories that emerged from the analysis of the Twitter data and secondary sources.

Researching theory before commencing the body of a research project can play a crucial role in the project's design. For researchers to continue to advance theory, a regular exchange is required between inductive and deductive research methodologies (Parkhe, 1993). As Perry and Jenson (2001, p. 1) state, “pure induction with no prior theory might prevent the researcher from benefiting from existing theory, just as pure deduction might prevent the development of new and useful theory”. In the current research project, the literature review explored and examined the research relating to Russian information operations, as well as the relevant theoretical headings12 to which Russian information operations pertain. A narrative review, also known as a traditional literature review, which involves a “critical interpretation of the literature” (Bryman, 2016, p. 693), was therefore undertaken.

It should be stipulated at this time, that the purpose of this research is not to validate the theories identified in the literature review. Instead, the study is concerned with identifying new theories

12 Such as Information Warfare, Hybrid War and Political Warfare

born out of the research results. Therefore, this research project is a mixture of both deductive and inductive. Deductive, as the project took into consideration prior theoretical positions when developing the research project, inductive as I identified new theories as the research developed.

3.2.2 Epistemological Perspective

Identifying the epistemological perspective clarifies issues relating to the research design, including the overarching structure of the research, the evidence the study will offer and how that research will be interpreted (Easterby-Smith and Thorpe 2002). Two primary epistemologies were considered for this research, as outlined in Table 3.1.

Table 3.1 – Epistemology Definitions and Extensions

Positivism
  Definition: Positivism (and realism) take a scientific approach to research. It is founded in the natural sciences and focuses on the social reality and beyond.
  Extensions: Post Positivism; Realism; Empirical Realism; Critical Realism.

Interpretivism
  Definition: Interpretivism takes a social science approach to research which involves applying subjective meaning to the social action that is being researched.
  Extensions: Symbolic Interactionism; Phenomenology.

(Adapted from Bryman, 2016).

According to Bryman (2016), positivism has five principles:

1. The principle of phenomenalism;
2. The principle of deductivism;
3. The principle of inductivism;
4. The research must be objective; and
5. The senses cannot confirm normative statements.

While positivism strives to explain human behaviour, interpretivism sets out to understand human behaviour; interpretivism is therefore centred on the understanding of human action. Interpretivism examines “culturally derived and historically situated interpretations of the social life-world” (Crotty 1998). Within social science research it is theorised that knowledge is transmitted through experiences, ideas and discourses; this is the definition of interpretivism (MyTutor 2019) and this is the approach taken throughout this research project. As discussed, this research project is primarily inductive rather than deductive, and the research is based in the social sciences rather than the natural sciences, so it does not fall within the positivist framework.

Interpretivism “is concerned with the uniqueness of a particular situation, contributing to the underlying pursuit of contextual depth” (Bryant 1985). The goal of interpretivism within this research is to contextualise the events that have occurred, interpret the situation being researched and identify the relationships between people, technology and the organisation. According to Lacity and Janson (1994), there are two text analysis methods within interpretivism: hermeneutics and intentional analysis. Hermeneutics was founded in exegesis and began with research centred on the interpretation of sacred and ancient texts. Two primary goals exist within hermeneutics: the first is the exact translation of the text, and the second is the discovery of any instructions contained in the translated text. Exegesis provides the rules for the researcher in interpreting the translated text and its historical and cultural meaning (Lacity and Janson 1994).

In comparison, intentional analysis assumes that the researcher and the research have commonalities, such as existing in the same period (epoch), speaking a common language and being culturally compatible. Intentional analysis provides four steps to assist the research with interpreting text data; these are:

1. The researcher describes the facts of the phenomenon;
2. The researcher determines the way the participants ascribe meaning to their separate realities;
3. The researcher identifies themes or variants; and
4. The researcher abstracts the essence from the text. (Lacity and Janson 1994)

The current project will apply intentional analysis to determine the specific purpose of the IRA and its activity, the decision-making process of the IRA and how effective Russian troll farm activity is in spreading disinformation and propaganda. Detailed descriptions, direct quotes from the IRA Tweets and a continuous review process of the project and how the data is


interpreted will be applied to validate the qualitative research project. Secondary sources such as Senate hearing transcripts, indictments and academic sources will support and supplement the IRA Twitter data (see Influences for further insight concerning the validation process).

3.2.3 Ontological Issues

Ontology is defined as “the nature of reality that is to be studied, and what can be known about it” (Terre Blanche, Durrheim et al. 2007). Social ontology feeds into the conduct of social research, with the researcher’s ontological assumptions and commitments feeding into the formulation of research questions and how the research is carried out (Bryman 2016). According to Bryman (2016), there are two ontological positions: objectivism and constructionism, also known as constructivism13. Objectivism “asserts that social phenomena and their meanings have an existence that is independent of social actors”, while constructionism asserts that “social phenomena and their meanings are continually being accomplished by social actors” (Bryman 2016). In simple terms, objectivism holds that the way people think and feel about what is happening in the world does not determine a social phenomenon: the phenomenon exists outside of social influence. In comparison, constructionism holds that what is happening in the world will affect and influence the social phenomenon under study. Buzan, Wæver et al.’s (1998) study on security examined how terms such as danger and threat contribute to the construction of a perceived security issue. The authors found that the construction and definition of these terms came from shared beliefs, ideas and values, which in turn exerted influence on policy and political action.

Constructivism’s foundations lie in the writings of Max Weber and Émile Durkheim, and it has been compared with the social theory of Rational Choice:

Rational Choice is a social theory that offers a framework for understanding how actors operate with fixed preferences that they attempt to maximize under a set of constraints. It makes no claims about the content of those preferences; they could be world domination or religious salvation. Nor does it assume anything about the content of constraints; they could be guns or ideas. Rational choice offers no claims about the actual patterns of world politics. Like rational choice, constructivism is a

13 Bryman (2016) does not distinguish between constructionism and constructivism.

social theory that is broadly concerned with the relationship between agents and structures, but it is not a substantive theory (Gheciu, Wohlforth et al. 2018).

The current project will take a constructionist approach to the research.

3.2.4 Qualitative, Quantitative or Mixed Methods

Bryman (2016) states that there are several fundamental differences between qualitative and quantitative research strategies; Table 3.2 captures these differences. Social research draws on the social sciences, such as criminology, political science and history, for conceptual and theoretical inspiration (Bryman, 2016, p. 32). The social research undertaken for this dissertation is primarily qualitative, allowing for the examination of Russian information operations on SMNs. The Twitter data is unstructured and primarily text-based; combined with the use of secondary sources, this supports a qualitative research approach. However, the research also measures the spread of Russian information campaigns through statistical analysis of the tweet data14. As such, this research project has both quantitative and qualitative aspects. According to Silverman (2007), qualitative research can have aspects of quantitative research, known as quasi-quantitative research, when terms such as 'some', 'many' and 'often' appear in the research. Such terms are used in the current research project when answering questions such as:

1. How many times was an IRA tweet re-tweeted?
2. How often did IRA operatives post?
3. What are some of the more popular IRA tweets?

When conducting qualitative research, it is often necessary for the researcher to understand the relative frequency of the phenomenon in question; this is where quasi-quantitative research becomes essential. However, when using quasi-quantitative methods it is imperative that the counting has meaning and enhances the research project (Bryman, 2016). The current research project applies a mix of supplementary and corroborative counting via quasi-quantitative methods: supplementary, as the data adds additional dimensions allowing for the development of the qualitative findings, and corroborative, as the quasi-quantitative data also corroborates the qualitative findings.
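The quasi-quantitative counts described above amount to simple frequency analysis of the tweet data. The sketch below illustrates one way this could be done; the field names (`userid`, `retweet_count`) are illustrative assumptions, not the exact schema of the Twitter release, and the sample records are invented for demonstration.

```python
from collections import Counter

def activity_counts(tweets):
    """Count posts per account and rank tweets by re-tweet reach.

    `tweets` is a list of dicts; the field names used here are
    illustrative only.
    """
    posts_per_account = Counter(t["userid"] for t in tweets)
    by_reach = sorted(tweets, key=lambda t: t["retweet_count"], reverse=True)
    return posts_per_account, by_reach

# Illustrative sample only -- not real IRA data.
sample = [
    {"userid": "account_a", "retweet_count": 120},
    {"userid": "account_a", "retweet_count": 5},
    {"userid": "account_b", "retweet_count": 48},
]
counts, ranked = activity_counts(sample)
print(counts["account_a"])         # → 2 (how often this account posted)
print(ranked[0]["retweet_count"])  # → 120 (the most re-tweeted post)
```

Counts of this kind supplement the qualitative reading of individual tweets rather than replacing it, in line with the supplementary and corroborative uses described above.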

14 This includes the number of times a tweet was liked, quoted, re-tweeted and commented on.

Table 3.2 – Fundamental differences between quantitative and qualitative research strategies

Principal orientation to the role of theory in relation to research:
  Quantitative – Deduction; testing of theory.
  Qualitative – Induction; generation of theory.

Epistemological orientation:
  Quantitative – Natural science model, in particular positivism.
  Qualitative – Interpretivism.

Ontological orientation:
  Quantitative – Objectivism.
  Qualitative – Constructionism.

(Bryman, 2016, p. 32)

3.2.5 Influences

To ensure transparency, it is important to highlight any influences that may be present, in particular personal values and practical considerations that could have affected my judgement when undertaking this research project. According to Horsburgh (2003, p. 308), “qualitative research usually operates from the premise that total detachment on the part of the researcher is unattainable”. Reflexivity refers to the researcher’s acknowledgement that the decisions they make throughout the research process are not without repercussions for the research, and that their actions and decisions influence the journey of the research project. Furthermore, Horsburgh (2003) argues that the researcher can be neither detached from the research nor neutral. However, this does not mean that any conclusions drawn from the research lack rigour; the purpose of a thorough and structured methodology section is to demonstrate that the findings of the research are grounded (Horsburgh, 2003, p. 308).

I acknowledge that my lived experiences have contributed to the research. However, it is essential to note that I have no allegiance to either Russia or the other countries examined in this dissertation, the exception being Chapter One, where I examined Australia’s response to information operations that take place on social media networks. Further, the primary source of data used for this research project, the IRA Twitter data, was produced externally to the research project and has therefore not been influenced by the agenda of this research project. However, having adopted an interpretivist approach to the research, it is acknowledged that

my lived experiences would have influenced the research project in ways of which I may be unaware.

3.2.6 Limitations

The final stage of the research process is to address the limitations of the research project. As noted, this project relies heavily on data produced by Twitter on IRA activity. However, because Twitter protects the algorithms used to identify State-sponsored activity on its platform, there is no way to test how Twitter identified this data as IRA activity, or to validate that Twitter has identified and included all IRA activity on its platform. It can be assumed that if Twitter disclosed the algorithms used to identify State-sponsored activity, State actors could learn to circumvent them.

A further limitation is that the data released by Twitter extends only from 2009 to mid-2018; no more recent data on IRA activity has been made available to researchers. Additionally, without access to Russian government documents on information operations, any theories explored or strategies identified throughout this project cannot be verified against government documents.

3.3 Research Design

The research design is the framework which outlines how evidence will be generated to answer the research question. Bryman (2016) states that there are five primary research designs, as outlined in Table 3.3.

Table 3.3 – Primary Research Designs (Bryman, 2016, p. 32)

Experimental (Quantitative)
  Involves the manipulation of the independent variable in order to assess the effect on the dependent variable. It involves two or more experimental groups (Bhat, 2019).

Cross-sectional or survey (Qualitative and Quantitative)
  Observational study that takes a snapshot of a single point in time. Cross-sectional studies involve recording information regarding the subject matter without manipulating the study environment (Institute for Work & Health, 2015).

Longitudinal (Qualitative and Quantitative)
  Observational study that records information regarding the subject matter without manipulating the study environment. Longitudinal studies run for long periods, collecting data at various intervals along the way (Institute for Work & Health, 2015).

Comparative (Qualitative and Quantitative)
  Compares and contrasts two or more ideas or objects in order to identify any differences or similarities between them (Aftab & Bukhari, 2011).

Case Study (Qualitative and Quantitative)
  A detailed and intensive study of a single case, such as a community, school, family or organisation. Often employs both qualitative and quantitative research elements (Bryman, 2016).

The data utilised for this project was released by Twitter to support research; Twitter did not specify what the research projects should entail, leaving the design of studies to the discretion of researchers. Applications to use the Twitter data became available six months into the research project and have since driven the direction of this research. It is the first time a social media provider has released extensive records of state-sponsored trolling activity. The data, although part of this research project, was not collected specifically for this dissertation, nor is it categorised as secondary data. The data is a new and emerging form of data known as ‘big data’, a term that is difficult to define but usually refers to abundant sources of data collected by retailers and SMNs (Bryman, 2016). Big data has been used in several

studies to examine the effect of social media during political movements; an example is the study by Bruns, Highfield, and Burgess (2014) on the role social media played in the Arab Spring uprisings.

There are two categories of big data research study: the first concerns the content of social media postings and the second concerns the structure and process of social media activity (Bryman, 2016). This project was interested in the content of the social media postings; as such, a thematic analysis was undertaken to extract core themes, such as the annexation of Crimea, discussed in Chapter Six, and the Columbian Chemical Campaign (CCC), discussed in Chapter Seven. To identify themes, coding, also referred to as indexing, was undertaken. Coding refers to names and symbols, tags for assigning units of meaning and/or labels used for classification (Lofland & Lofland, 1995). Coding occurred via two means: the first was through the examination of the literature on Russian information operations and the second was through an examination of the Twitter data. Once coding was complete, themes were used to sort the information (see Research Method for further information on the coding undertaken throughout this research project). Utilising the software Aeon Timeline 2, significant events identified during the coding process were sorted into a timeline. Through the investigation of the events in the timeline, additional codes and themes were then identified. An example is the analysis of the Koch Turkey Farm disinformation campaign, identified while reviewing data on the CCC15 and discussed in Chapter Seven. It was during the analysis of CCC tweets that an additional disinformation campaign was identified, involving an alleged food-poisoning incident involving Koch Turkey over the Thanksgiving period in 2015. Although the campaign had been reported in the media, up until this point my research had not identified it.
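The coding step described above can be sketched as matching tweet text against a code book of trigger keywords. The code book below is a hypothetical illustration under assumed labels and keywords; it is not the code book actually used in this thesis.

```python
# Hypothetical code book mapping theme codes to trigger keywords.
# Labels and keywords are illustrative only.
CODE_BOOK = {
    "crimea": ["crimea", "annexation"],
    "ccc": ["columbian chemical", "#columbianchemicals"],
}

def code_tweet(text):
    """Return the set of codes whose keywords appear in the tweet text."""
    lowered = text.lower()
    return {code for code, keywords in CODE_BOOK.items()
            if any(keyword in lowered for keyword in keywords)}

print(code_tweet("Explosion reported at Columbian Chemical plant"))  # → {'ccc'}
```

In practice such automated tagging would only be a first pass; each coded tweet would still be read and interpreted qualitatively, consistent with the interpretivist approach adopted here.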

This project has adopted a blended research approach that is both case study and longitudinal in design: case study, as it draws on the uniqueness of the IRA, which is an integral feature of case study research, and longitudinal, as it observes the IRA on Twitter over eight years. As Bryant (1985) wrote with regard to Klein and Myers' (1999) proposed principles for interpretive field research in information systems, "longitudinal case studies can offer a broader understanding of how people adapt technological systems for their purposes, particularly over time"; the current study applied the same principles. The Twitter data under examination

15 For further information on this process refer to the Research Methods section.

extends over almost nine years, providing a snapshot of IRA troll farm activity, which was then used to explore the history of the development of the Kremlin troll. As the data was extracted from Twitter verbatim, no outside influences manipulated the raw data; the data supplied is the data produced by IRA trolls as part of their work directive.

Process Diagram 3.3 provides a graphical representation of the research approach adopted throughout this research project.

Theory: Deductive; Inductive
Epistemological Perspective: Actor-Network Theory (interpretivism)
Ontological Issues: Constructionist
Research Strategy: Qualitative; Quasi-quantitative
Research Design: Case Study; Longitudinal

Process Diagram 3.3 – The Methodology Framework of the Current Research Project

3.4 Research Method

Having framed the current research project as a qualitative study with quasi-quantitative methodologies, the next part of this chapter will examine the stages undertaken throughout this research project. Included in this section is the justification for choosing the IRA Twitter data as the primary source of data, and the process of analysis applied to the IRA Twitter data.

According to Castleberry and Nolen (2018), there are five steps to qualitative analysis; these are:

• Compiling;
• Disassembling;
• Reassembling;
• Interpreting; and
• Concluding.

A review of the stages of this research project will provide insight into each of these five steps, beginning with Compiling the data as outlined in Stages One and Two. Compiling refers to the researcher familiarising themselves with data associated with the research and other textual references and in so doing grouping the data into a consistent and organised format (Castleberry


& Nolen, 2018). Process Diagram 3.4 provides a visual representation of the process undertaken.

Stage 1 – Review of Literature (Compiling)
Stage 2 – Review of Twitter Data (Compiling)
Stage 3 – Coding (Disassembling; Interpreting)
Stage 4 – Themes (Reassembling)
Stage 5 – Findings (Interpretation)
Stage 6 – Conclusion

Process Diagram 3.4 – The Research Method

3.4.1 Stage One – Compiling

The first stage of this research project involved understanding the research topic and the data associated with it, beginning with an examination of the literature on Russian information and influence campaigns. As part of this review, four broad categories were established to group the literature on Russian information and influence campaigns:

1. Western literature focusing on the technicalities of Russian campaigns.
2. Western literature focusing on the strategic and military context of Russian campaigns.
3. Russian literature focusing on the strategic and military context of Russian campaigns.

4. Russian literature focusing on the Russian National Security document.

Within these fields of research, additional headings and themes used to describe Russian information and influence campaigns emerged throughout the literature review; these include:

• Hybrid Warfare
• Information Warfare
• Political Warfare
• Asymmetric Warfare
• Information Confrontation
• Information Operations
• Cyber Campaigns

Based on the analysis of this literature, a decision was made to refer to Russian information and influence campaigns on SMNs as information operations, as per Russian doctrine on the subject.

3.4.2 Stage Two – Further Compiling

The original premise of this research project was a case study of Russian information operations conducted via SMNs. However, on the 17th of October 2018, Twitter released an extensive dataset consisting of accounts and "related content associated with potential information operations" (Gadde & Roth, 2018) to enable independent research. A decision was made to extend the research project to include analysis of the Twitter data for the following reasons:

1. The Twitter data was a unique dataset, available to researchers wishing to utilise the data in research projects.
2. The Twitter data allowed for a more focused approach to the broad heading of SMNs, as it provided a specific focus on Twitter.
3. The data extended over nine years, allowing for an examination of the development of the Kremlin troll farm.
4. The data provided a unique insight into the IRA, the only known Kremlin troll farm.


Ethical permission was sought and approved by the Macquarie University Ethics Committee on the 7th of March 2019 and then re-applied for via two separate ethics applications (Appendix One) under Swinburne University of Technology on the 3rd of October 2019, with approval granted by the Swinburne University of Technology Human Research Ethics Committee on the 28th of April 2020. The first ethics application sought approval to undertake the research project using the IRA Twitter data and the second sought permission to include sanitised tweets from legitimate Twitter accounts throughout this research project. The IRA dataset included 3,841 accounts originating from Russia and affiliated with the IRA, dating back to 2009. The content analysis section of this dissertation utilises the IRA data. A snapshot of the data is in Table 3.4.

Table 3.4. Snapshot of the IRA Twitter Data

Year    Number of Tweets    Distinct Users    Languages
2009    212                 4                 2
2010    19,063              7                 7
2011    14,656              13                14
2012    217,324             67                17
2013    148,503             283               36
2014    2,329,674           2,100             47
2015    3,132,627           2,614             55
2016    1,565,337           1,379             47
2017    1,330,419           1,045             45
2018    10,817              27                17

(2009 and 2018 do not comprise a full year of data.)
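Yearly summaries of this kind can be reproduced from released CSV files with simple command-line tools. The sketch below is illustrative only: the file name and column positions are hypothetical stand-ins rather than the actual schema of the Twitter release, which stores the posting time in a tweet_time field.

```shell
#!/usr/bin/env bash
# Hedged sketch: tallying tweets per year from a CSV extract.
# The file name and column layout here are hypothetical.
set -euo pipefail

cat > tweets_sample.csv <<'EOF'
tweetid,tweet_time,account_language
1,2015-06-01 10:00,ru
2,2015-07-12 11:30,en
3,2016-01-05 09:15,en
EOF

# Skip the header, isolate the timestamp column, keep the 4-digit
# year prefix, then count occurrences of each year.
tail -n +2 tweets_sample.csv | cut -d, -f2 | cut -c1-4 | sort | uniq -c
```

The same pipeline applied to the full release would yield the per-year tweet counts shown in Table 3.4.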

Several negative factors concerning the Twitter data were identified early in the research project. The first was the size of the data, which consisted of 8,768,632 tweets. Due to this volume, not every tweet posted by the IRA and provided as part of the overall Twitter dump was read for this research project, limiting the study to specific events. Tinati, Halford, Carr, and Pope (2014) argue that by reducing big data into smaller-scale analyses, the researcher fails to take advantage of the full potential of the data, and that such research is "methodologically limited because social scientists have approached big data with methods that cannot explore many of the particular qualities that make it so appealing to use: that is, the scale, proportionality, dynamism and relationality" (Tinati et al., 2014, p. 665). To avoid this scenario, data mining techniques were applied to the whole data set, reducing it to the most relevant tweets, which then allowed me to focus on the development of Russian information operations on Twitter. For a complete review of the data mining that took place, see Stage 5 – Interpreting below. In addition, quantitative analysis was undertaken on the Twitter data, as demonstrated in this chapter and Chapter Five.
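As a rough illustration of the reduction described above, a keyword filter over the tweet text can shrink the dump to event-relevant rows before closer reading. The file name, column layout and keywords below are hypothetical, chosen only to show the mechanics.

```shell
#!/usr/bin/env bash
# Hedged sketch: reducing a large tweet dump to event-relevant rows
# with a case-insensitive keyword filter. All names are hypothetical.
set -euo pipefail

cat > dump_sample.csv <<'EOF'
tweetid,tweet_text
1,Protest in Moscow today
2,Lovely weather in St Petersburg
3,Crimea referendum results announced
EOF

# Keep only rows mentioning any of the chosen event keywords;
# two of the three sample rows match.
grep -iE 'protest|crimea|election' dump_sample.csv > relevant.csv
wc -l < relevant.csv
```

In practice, successive filters of this kind narrow millions of tweets down to a set small enough for qualitative reading.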

Another issue arose with regard to the diversity of languages used by IRA operatives. The Twitter analysis undertaken for this thesis revealed several patterns early in the research stages, such as Russia's expansion of language diversity. In 2009, Russian and English were the only two languages present, with Russian the dominant language. Language capabilities then slowly expanded, with the number of languages almost doubling between 2012 and 2014, suggesting that Russia was widening its information campaigns to reach a larger global audience. As such, SmartCat.ai was used to translate non-English tweets.
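The language-diversity pattern noted above can be checked mechanically by counting the distinct languages appearing in each year. Again, the sketch below uses a hypothetical file and column layout rather than the actual schema of the Twitter release.

```shell
#!/usr/bin/env bash
# Hedged sketch: counting distinct tweet languages per year.
# File name and columns are hypothetical stand-ins.
set -euo pipefail

cat > lang_sample.csv <<'EOF'
tweet_time,tweet_language
2009-03-01,ru
2009-05-02,en
2012-06-03,ru
2012-07-04,en
2012-08-05,de
EOF

# Build (year, language) pairs, de-duplicate them, then count how
# many unique languages remain for each year.
tail -n +2 lang_sample.csv \
  | awk -F, '{print substr($1,1,4), $2}' \
  | sort -u \
  | cut -d' ' -f1 \
  | uniq -c
```

On the sample data this reports two languages for 2009 and three for 2012; run over the full release, it would reproduce the Languages column of Table 3.4.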

A further issue arose when trying to download data from 2014 through to 2017: the sheer volume of tweets made the files difficult for my home computer to process, which meant the data could only be examined month by month. Files containing too much data slowed processing considerably, with Excel often closing unexpectedly if a file was too large. It was during Stage Two that I began to familiarise myself with the data by translating tweets and examining patterns throughout the Twitter data.

3.4.3 Stage Three – Disassembling
Having decided to incorporate the Twitter data into the research project, the literature was revisited to identify high-level coding categories. The high-level coding categories, known as deductive codes, acted as a reference guide throughout the research process, producing a codebook. The codebook changed and developed as the research progressed, with new codes added and existing codes re-organised as the process continued (Yi, 2018). As Yi (2018) explains, there are two primary coding methods, deductive and inductive coding. Inductive coding "is used when you know little about the research subject and conducting heuristic or exploratory research" (Yi, 2018). In comparison, deductive coding methods are applied when the researcher has access to information regarding the subject before commencement of the research project (Yi, 2018). With regards to the current research project, information regarding the subject was drawn from the literature review and media stories about the 2016 US presidential election and the IRA; therefore, a deductive coding approach was adopted.

The coding process began with the initial categories being entered into an Excel spreadsheet for later incorporation into Aeon Timeline 2. Categories included internal protests in Russia, external conflicts between Russia and neighbouring countries, and the broad heading of allegations of Kremlin trolling activity. The purpose of the Excel spreadsheet was to gather information into groupings that might be relevant to the research project. Coding schemes may be identified during a review of the literature inside and outside of the researcher's discipline and require the researcher to ask specific questions, such as:

• What is happening in the text?
• Who are the actors, and what are their roles?
• When did it happen?
• Where did it happen?
• What are the identifiable reasons as to why it happened?
• How did it happen? (the process or strategy) (Castleberry & Nolen, 2018).

In addition to the coding undertaken on the literature of Russian information operations, additional coding occurred through an examination of the Twitter data. Included were:

• Political commentary around Russian elections;
• Tweets concerning the colour revolutions;
• The annexation of Crimea;
• Disinformation around events that fictitiously occurred in the US, such as an Ebola scare and a food poisoning scare;
• Commentary around the election; and
• Stories of fake news around Kremlin trolls.

Coding of data occurred in two stages: initial coding and then focused coding. Initial coding is where I asked simple questions of the data, such as: what is the data an example of? what is going on? what does the data represent? Initial coding is designed to recognise as many topics as possible. Once this had occurred, I then began focused coding, which builds on the initial coding process but is selective. According to Lofland and Lofland (1995), focused coding requires the researcher to ask more focused questions, such as: what categorisation is the data a part of, what questions does the categorisation address, and what answers will the categorisation produce. It was during Stage Three that decisions were made to focus on the development of the Kremlin troll through both secondary sources and the Twitter data, using the US as a specific case example of this development.

3.4.4 Stage Four – Reassembling
Once the coding of the data was complete, categorisation and mapping of the data occurred to create themes. A theme captures "something important about the data concerning the research question, and represents some level of patterned response or meaning within the data set" (Braun & Clarke, 2006). A theme may be:

• A categorisation that was identified through the analysis of data;
• A categorisation that relates to the research focus or even the research questions;
• Something that builds upon codes identified in the disassembling stage;
• Something that provides the basis for a theoretical understanding of the researcher’s data which will help contribute to the body of research (Bryman, 2016, p. 584).

Aeon Timeline 2 software was utilised not only to capture an event but also to place the event on a timeline, as demonstrated in Image 3.1. Themes were then able to be identified through the timeline.


Image 3.1 – Timeline of events relevant to research (extracted from the timeline developed in Aeon Timeline 2).

3.4.5 Stage Five – Interpreting
Although the interpretation of the data commenced during the compiling stage of the research design and continued throughout the disassembling and reassembling stages, most of this process occurred during Stage Five, with the data analysis. Themes were identified not by how often they appeared but by whether the theme captured something relevant to the overall research questions. A thematic map, which is a visual representation of the themes, codes and their relationships, as per Diagram 3.6, was developed to assist in the process of interpretation.


[Diagram omitted. The thematic map's central node, 'Russian Information Campaigns', branches into four themes: developments in Russian information campaign capabilities; the collapse of the Soviet Union; Russian doctrine on information campaigns; and the US elections. Sub-nodes include the establishment of the Internet Research Agency (IRA) in 2009, Vladimir Putin becoming president, US and Russian interference during the Cold War, the colour revolutions, the Chechen Wars, Bill Clinton's support of Yeltsin, the 2007 Estonia cyber attack, the 2013 Gerasimov Doctrine, the 2016 US presidential election, Hillary Clinton's comments regarding the 2011 Russian Duma elections, the 2014 annexation of Crimea, and the IRA beginning to target the US in 2014.]

Diagram 3.6 – Thematic Map

Throughout this stage of the research project, Yin's (2015) five goals of qualitative interpretivism were pursued to achieve transparency in the research process and to ensure that the results would be consistent if tested by other researchers. Yin’s (2015) five goals of qualitative interpretivism are as follows:

1. The beginning, middle and end of how interpretations were reached should be transparent to the reader;
2. Any other researcher should be able to reach the same conclusions if given the same data to work with;
3. The interpretations should be representative of the raw data;
4. The interpretation will add value to the research area; and
5. The researcher should use credible research methods that are respected by the researcher’s colleagues.

To ensure attainment of goals 2, 3 and 5 when handling and analysing the data, specific care was taken to adhere to ethical standards of data handling. The Twitter data was downloaded and stored on a SharePoint drive owned by Swinburne University of Technology. When analysis of a particular section of the data was required, SQL queries were utilised to extract a copy of the required data into Comma Separated Value (CSV) files. The original data was not manipulated or analysed directly, ensuring the content was not corrupted. The CSV files were then opened in Excel or Numbers and examined as part of the qualitative analysis, or, if a quasi-quantitative inquiry was required, the analysis was carried out via command-line or Bash commands. Computer code was also utilised in the form of Perl scripts to obtain more detailed analysis, as demonstrated in Chapter Eight. To ensure transparency and to enable other researchers to replicate the research findings, a list of the SQL queries, command-lines, Bash commands and Perl scripts utilised for this research project is provided in Appendix Two. The process is also outlined in Process Diagram 3.5 below. All analysis was undertaken on an Apple computer on which MySQL was installed, and all command-line and Bash commands were run in Terminal. To ensure attainment of goals 1 and 5, a thorough analysis was undertaken of the data throughout the relevant chapters.
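The quasi-quantitative step described above (a CSV extract summarised at the command line) can be sketched as follows. The file and column names are hypothetical illustrations; the actual extracts were produced by the queries listed in Appendix Two.

```shell
#!/usr/bin/env bash
# Hedged sketch: a quasi-quantitative summary (tweets per user)
# run over a CSV extract in Terminal. All names are hypothetical.
set -euo pipefail

cat > extract.csv <<'EOF'
userid,tweet_text
u1,IRA tweet one
u2,IRA tweet two
u1,IRA tweet three
EOF

# Count tweets per distinct user, most active accounts first.
tail -n +2 extract.csv | cut -d, -f1 | sort | uniq -c | sort -rn
```

Because the pipeline only reads the extract and writes to standard output, the original CSV remains untouched, in keeping with the data handling approach described above.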

[Diagram omitted. The data handling process ran as follows: the Twitter data was downloaded into MySQL and stored on a Swinburne SharePoint drive; MySQL queries were used to extract specific data for analysis; the query results were downloaded in CSV format from SharePoint and saved onto a local drive; the CSV files were opened and analysed in Numbers or Microsoft Excel, or command-line/Bash commands or Perl scripts were utilised to undertake quasi-quantitative analysis; and upon completion of the analysis the CSV files were saved back to SharePoint for storage.]

Process Diagram 3.5 – Data Handling Process

3.4.6 Stage Six – Conclusion
In stage six of the research process, conclusions were drawn from the data analysis and the research questions were addressed and answered.

3.5 Chapter Review
The remainder of this chapter will provide a review of the thesis chapters, demonstrating the described methodological approaches as applied to the specific research undertaken.

3.5.1 Chapter Four – The Rise of the Kremlin Troll
Chapter Four examines Russia's approach to information warfare during the Chechen Wars, the Georgia War, the 2010/2011 Arab Spring protests, and the international protests and internal unrest in Russia at the end of 2011, to answer Question 1, ‘How has the Kremlin troll operation/strategy developed?’ Secondary sources, such as known Western and Russian commentary on Russia's strategic approach to information operations and warfare, were also utilised in answering this question. Examples include Senior Consulting Fellow at Chatham House Keir Giles; Senior Non-Resident Fellow at the Institute of International Relations Prague Mark Galeotti; Russian General Valery Gerasimov; and Chekinov and Bogdanov, Russian experts on military theory. The chapter also draws on various disciplines, including Politics and International Studies (for example, the work of Viktoria Spaiser), Human Centred Design (for example, Kate Starbird) and Media and Communication Studies (for example, Göran Bolin).

3.5.2 Chapter Five and Six
Chapter Five will examine the development of the Kremlin troll and the Internet Research Agency (IRA). Primary and secondary research is utilised throughout the chapter, including analysis of the Twitter data, the 2017 US Indictment (United States District Court for the District of Columbia, 2018) against the IRA and the second-hand accounts of two former IRA employees. Chapter Six will then examine Russia’s annexation of Crimea, drawing on secondary sources such as Ilya Yablokov, a Teaching Fellow in Russian Studies at the University of Leeds, when examining the role conspiracy theory plays in helping the Russian government control a narrative of its choosing, and Chekinov and Bogdanov writing on Russian military theory. Political commentary from authors such as Bērziņš (2014) and statements by political figures,


such as Putin (2011), were also utilised throughout both chapters. In doing so, the question ‘How does Russia use social media to support its strategic goals?’ begins to be answered.

As Chapters Five and Six include analysis of the Twitter data, a qualitative approach to the research was adopted, as well as a quasi-quantitative approach, to understand the flow and spread of disinformation in Russia and Crimea. The impact this had on Russian citizens and the citizens of neighbouring countries, such as Ukraine, was also examined.

3.5.3 Chapter Seven and Chapter Eight
In continuing to answer the question ‘How does Russia use social media to support its strategic goals?’, Chapter Seven and Chapter Eight analyse specific IRA information campaigns against the US, applying an inductive approach to the research. In Chapter Seven, the theory of trust networks is explored and applied to four information campaigns perpetrated by the IRA against US citizens on social media. Chapter Eight continues the inductive approach, demonstrating the importance of echo chambers as well as conspiracy theory. Elements of Castells's political theory of networks are also applied in both chapters, including the description of the information campaigns carried out by the IRA against US citizens, the assessment of the impact the information campaigns had at the time, the identification of themes in the IRA Twitter data, and conclusions based on my analysis of the information operations.

Chapters Seven and Eight continue the examination of big data and how technology influences human behaviour. I also examine the phenomena of social networks, echo chambers and conspiracy theories, and how they have assisted IRA information campaigns on SMNs via the manipulation of existing echo chambers through current societal demands, demonstrating that social media networks were turned into a mechanism for information warfare. Finally, as in Chapters Five and Six, in analysing the Twitter data, a qualitative approach was adopted throughout both chapters, as well as a quasi-quantitative approach, in order to understand the flow and spread of disinformation.

3.5.4 Chapter Nine
Chapter Nine examines Question 3, ‘How do social media platforms respond to State-sponsored trolling operations?’ This is undertaken through an examination of the 2017 Senate Intelligence Committee hearing with social media representatives, as well as various media statements and responses from Twitter, Facebook and Instagram.


Chapter Four—The Rise of the Kremlin Troll

4.1 Introduction
Russia's reliance on disinformation and propaganda is not a new instrument in the Kremlin's toolkit; it is, however, one that was largely forgotten by Western society, believed to be a remnant of the Cold War, only to be revitalised through Russian military reform (Galeotti, 2016, p. 2). According to Giles (2015), two key initiatives have led to Russian military reform since the end of the Cold War. The first was the inauguration of the former head of the Russian Federal Security Service (FSB)16, Vladimir Putin, in 1999, and the second was Russia's failure in the 2008 Georgia War. However, it was not until 2011 that Russia's modern-day military began to take shape as a result of a series of critical events which have been said to have revolutionised Russia's information operations (Giles, 2016): the Chechen Wars in the 1990s; the armed conflict in Georgia in 2008; the 2010/2011 Arab Spring protests; the international protests and internal unrest in Russia at the end of 2011; and the annexation of Crimea in 2014.

This chapter explores Russia's adaptation of information operations and the rise of the Kremlin troll. What follows is an examination of the development of the Internet Research Agency (IRA), a known troll farm in St Petersburg. As will be demonstrated through this timeline, rather than failing at information operations, Russia has been learning and adapting these techniques and strategies using online communication applications, particularly SMNs. Included in this strategy is the flooding of communications with a narrative designed to confuse and cast doubt on Russian and world events, as well as to depict the role Russia envisions the West is playing in Russian affairs.

4.2 Chechen Wars
In 1991, after the fall of the Soviet Union, Chechnya declared independence from Russia. Using former Soviet military equipment left behind after the end of the Cold War, Chechnya was able to create its own military force. By 1994, however, Russia began to reassert authority over Chechnya, working with Chechen opposition leaders to try to assert control over the government and remove the former Russian Air Force General turned President of Chechnya, Dzhokhar Dudayev, from power. Russia undertook 'black' operations, using

proxies to attack Chechnya rather than attacking Chechnya directly. Russia's use of proxies17 may be seen as a time-honoured strategy to demonstrate strength with limited resources and achieve Russia’s geo-political objectives (Galeotti, 2016). During 1994, Russia deployed tanks to Chechen opposition fighters in an attempted coup against the Chechen President. The coup failed and soon after Russia's involvement was made public by the independent Russian press (Finch, 1998). According to Finch (1998), when Russian leaders realised that the Chechen proxies would not be able to defeat Dudayev, and also to avoid any implication in the failed coup, Russian Ministers counselled the then President of Russia to deploy new forces to Chechnya in the form of an outright Russian attack using conventional military forces. The sudden haste in the deployment of Russian troops revealed noticeable disorientation in Russia's command and control capabilities (Finch, 1998, p. 2).

16 Federalnaya Sluzhba Bezopasnosti.

Map 4.1 – Key Points of Interest During the First and Second Chechen Wars

17 Such proxies include criminal groups, political lobbyists, journalists and hackers, any of which could be "activated and deployed in the pursuit of a political victory, one in which kinetic measures may play a major, minor, or even no role" (Galeotti, 2016, p. 297).

Galeotti (2016) describes the Chechen wars, or counterinsurgencies, as invasions lacking in traditional Russian panache. In particular, traditional Russian flair was missing from the information and political aspects of the first campaign. Journalists, Chechen government sources and Chechen sympathisers were given unfettered access to report on the events taking place, including the scourge of Russian operations and the casualties of the war. Russia did not have a compelling voice or counter-narrative in the global discussion of the event (Galeotti, 2016). Russia's information operations, or lack thereof, were not the sole reason why Russia lost the first Chechen war; they merely contributed to the result of the war, which lasted until 1996. During the conflict, the Russian government and military made little effort to generate internal and external public support (Finch, 1998).

Furthermore, little explanation was given to Russian citizens regarding Russia's military operation (Thomas, 2003), with the very nature of the conflict pitting the Russian military against Russian citizens living in Chechnya, a move that Russian media widely criticised (Finch, 1998). As Finch (1998, p. 6) describes, if the

Russian government was intent on winning the hearts and minds of the Chechen people, and convincing them to remain a part of Russia, then carpet bombing, and massed artillery strikes on civilian targets were the wrong tools. Having failed to apply lesser means of persuasion, use of the military was premature.

The lack of commentary provided by the Russian government also meant that journalists, as well as civilians, began turning to sources outside of Russia for information about the war. Head of Russia’s Federal Security Service (FSB), Sergei Stepashin, recalled after the campaign that journalists, unable to receive details of the war from Russia, turned to Chechnya for information (Falichev, 1995). Russia was not prepared for the propaganda and ideological campaign Chechnya delivered, with Russian military forces ill-equipped to deal with the press (Thomas, 2003). In 1996, Russia brokered a cease-fire after Chechen guerrilla warfare led to the demoralisation of Russian troops (Global Security, 2019).

In May 1999, President Yeltsin was facing impeachment for his 1994 decision to deploy Russian troops to Chechnya. By October 1999, however, media and public support of the second Chechnya campaign and for President Yeltsin and Prime Minister Vladimir Putin was widespread (Thomas, 2003). In the second Chechen War (1999-2009) Russia seemed to have


learnt from previous mistakes, taking draconian measures to ensure control of the media narrative of the war, in both Russia and abroad (Galeotti, 2016). Russia also appeared to have adopted the USA's example of information control during the Gulf War and entered into the second Chechen war with a strategic information plan (Thomas, 2003).

The Russian government created a narrative of fighting Islamic extremists in Chechnya after terrorist attacks on apartment buildings occurred in Moscow and Volgodonsk, Russia, in September 199918, leading up to the invasion. Reporters also showed little sympathy for Chechen fighters, who since 1992 had kidnapped approximately 1,800 people in Dagestan, the centre of the conflict during the second campaign. Some of the kidnapped victims were brutally murdered by Chechen fighters; amongst the victims were local citizens, foreigners and journalists (Thomas, 2003). In December 1999, President Yeltsin initiated Russian Federation Resolution No. 1538, which ensured that the Russian population would receive only select information regarding the conflict from foreign sources and filtered what information concerning the campaign would be disseminated from Chechnya. Russia studied NATO's press conferences to learn how to speak to the press and, according to Thomas (2003, p. 113), "placed experienced people in key positions to ensure media control".

Concern in the Russian government that the Russian information war was failing outside of Russia began to emerge in October 1999. The sentiment was reiterated by the Head of the FSB Public Relations Centre, Aleksandr Zhanovich, who, when speaking to the Russian administration, criticised the foreign press for allowing Chechen rebels airtime (Thomas, 2003). It was also during October 1999 that the Russian Information Centre was established on the order of the newly appointed Prime Minister, Vladimir Putin, and headed by former Public Relations figure Mikhail Margelov, in what appears to have been an attempt to control the media. The centre offered instructions to reporters, via a website, on how to report from the front. The website also offered information on events occurring within the war, including maps and expert opinions. Margelov, however, continued to express his concerns, including the concern that the Chechen militants were using the foreign media to open a second front in the information war (Thomas, 2003).

18 It is unclear whether these attacks occurred against Russians by the Russian State or whether they were indeed terrorist attacks.

In response, Russia put in place information blockades, shutting down independent reporting and taking control of television and newspapers to ensure the release of sanctioned news stories only. The Russian government explained that these measures were necessary to prevent the enemy "from objectively assessing the situation" (Polkovnikov, 1999). However, these information blocks were limited in their usefulness, as the Kremlin had underestimated the power of the Internet. Chechnya used the new resource as a means of communicating to the outside world what was occurring internally. Several internet sites were established by Chechen supporters to report on the Chechen version of events in Dagestan. As the Chechen versions were the only unfettered means of information, media outlets from around the world, including some within Russia, began using these sites to report on events in Chechnya. Once more, the Chechen version of events became the primary source of reporting on the campaign (Thomas, 2003).

In January 2000, heavy fighting in Groznyy saw high casualties. This, combined with the Russian government's broken promises that the campaign was coming to an end, led to wavering public support for the war within Russia. It was also at this time that Chechen internet usage expanded, providing video footage of attacks on Russia, interviews with Chechen commanders and videos of Chechen fighters in action. At the start of the second Chechen war, the Russian media appeared to accept the Russian government's story with regards to the conflict. However, as the war progressed, Chechens bypassed the information blockade imposed by Russia through the use of the Internet and foreign news reporters (Thomas, 2003). An important takeaway from the second Chechen War was that even though Russia may have won the war, it did not win the information war (Giles, 2016).

The lessons Russia took from the Chechen wars were twofold. First, the mindset of Russia’s leaders had been altered, providing practical knowledge and insights into Russia's approach to information warfare (Heickerö, 2010). Russia had learnt the importance of controlled information flow and the psychological impact information could have on society, both of which have since been identified by the Kremlin as cardinal. As Heickerö (2010) writes, "by controlling the information-psychological aspects such as the mass media—for instance, TV, radio and newspapers – as well as the information flow, stability can be achieved" (Heickerö, 2010, p. 17). Second, the Internet was a destabilising factor in information


operations, and public access to the Internet, and to information itself, should be controlled (Giles, 2016).

4.3 Georgian War, 2008
Since the end of the Cold War and extending into the Chechen and Georgia wars, the North Atlantic Treaty Organisation (NATO) may be seen as an area of contention for Russia. After the fall of the Berlin Wall in November 1989, the US government worked with West German leaders to reunite Germany. Formerly secret US government documents from the early 1990s reveal that an implied agreement was made with Russia that NATO would not expand beyond West Germany if Germany reunified. Although there is no formal contract or agreement, Sarotte’s (2014) investigative article discovered a trail of letters and notes suggesting that promises were made to then President of the Soviet Union Mikhail Gorbachev by the US and West Germany that NATO would not expand east from its current position. According to the letter trail, James Baker, the US Secretary of State at the time, acting on behalf of the US Government, made assurances to Gorbachev that NATO would not expand past West Germany. On hearing of this discussion, staff members from the US National Security Council wrote to Helmut Kohl, the West German chancellor, on behalf of President George H. W. Bush, explaining that the decision not to expand into eastern Germany after reunification did not make sense. Further, the letter requested that Kohl, in his upcoming meeting with Gorbachev, inform the Soviet leader that special military status would apply to what is now eastern Germany. As Sarotte (2014, p. 93) explains, “Although the letter did not define exactly what the special status would entail, the implication was clear: all of Germany would be in the alliance, but to make it easier for Moscow to accept this development, some kind of face-saving regulations would apply to its eastern region”. Kohl made a decision not to relay the new message, but instead reiterated Baker’s assertions that NATO would not expand (Sarotte, 2014).

When it became apparent in 1994 that NATO was planning an expansion, President Yeltsin, according to Goldgeier and McFaul (2003), became enraged. From the Russian President’s position, previous agreements regarding NATO's expansion had been broken. Further, NATO had been established as a response to the Soviet threat; therefore, the continued expansion and even the very existence of NATO after the collapse of the Soviet Union suggested that the West still considered Russia a threat. During the Summit of the Council


on Security and Cooperation in Europe in December 1994, President Yeltsin responded to the news of NATO’s expansion plans in his address:

Europe, even before it has managed to shrug off the legacy of the Cold War, is risking encumbering itself with a cold peace. NATO was created in Cold War times. Today, it is trying to find a place in Europe, not without difficulty. It is important that this search not create new divisions but promote European unity. We believe that the plans of expanding NATO are contrary to this logic. Why sow the seeds of distrust? After all, we are no longer adversaries, we are partners. Some explanations that we hear imply that this is ‘expansion of stability,’ just in case developments in Russia go the undesirable way. If this is the reason why some want to move the NATO area of responsibility closer to the Russian borders, let me say this: it is too early to give up on democracy in Russia! (Asmus, 2004, p. 95).

When President Vladimir Putin came to power in 1999, the relationship between Russia and NATO was unresolved. In November 2001, in what appears to have been an attempt to resolve relations, Russia established the NATO-Russia Council. However, the alliance did nothing to stop NATO’s intentions of expansion. From 2003 the West continued to extend its influence in Eastern Europe by funding anti-Russian revolutions in Georgia and Ukraine (Thalis, 2018). According to the Australian Institute of International Affairs, over one billion dollars in aid from both the US and Eastern Europe was directed into Georgia, and Western NGOs played “a key role in financing opposition parties and organising demonstrations” (Thalis, 2018). In March 2004, NATO accepted seven new member states, three of which were Baltic states: NATO was now the closest it had ever been to the Russian heartland. Later that same year, Georgia and Ukraine would also sign Individual Partnership Action Plans with NATO. NATO continued to be perceived as a threat to Russia, with the Bush administration in 2007 releasing plans for a missile defence system in Eastern Europe under the pretext of protecting Europe from an Iranian nuclear attack. Russia responded with a counter-proposal to construct a joint Russia-US warning system in Azerbaijan, but the proposal was rejected by the US. In response, President Putin declared that NATO was a real threat to Russia. In 2008 NATO released a statement asserting its intention to extend an invitation to Georgia and Ukraine to join the alliance, a move that would position military forces on Russia’s doorstep (Thalis, 2018). The 2008 Russo-Georgian war would prevent this from occurring (Donovan, 2009).


Russia's campaign in Georgia in 2008 is viewed as a success, as Russia met its goal of taking control of Abkhazia and South Ossetia. The military operation was planned carefully, demonstrating a coordinated approach between military, cyber warfare and diplomatic offensives (Vendil Pallin & Westerlund, 2009). As Donovan (2009, p. 1) explains, "in the brief war, the Russian military in a quick and decisive campaign overwhelmed Georgian forces to gain control of two breakaway republics, destroyed much of Georgia's armed forces on land and sea, and caused NATO to reconsider its offer of membership to Georgia". Galeotti (2016, p. 296) agrees, stating that Russia secured a convincing victory in the Georgian war, demonstrating coordination at the highest level between state and non-state actors and between military and political actors.

4.2 Map of Key Points of Interest During 2008 Georgian War

The "Ossetian problem", as it has been referred to (see, for example, Donovan, 2009), is the result of ethnic enclaves between Georgians and Ossetians created deliberately during the Soviet era as a way to manage the territory and prevent the centralisation of authority in the region. On the collapse of the Soviet Union, both South Ossetia and Abkhazia declared their independence from Georgia. Georgia, however, sought to regain control of the South Ossetian republic. In response, Russia volunteered to aid in peacekeeping exercises and, in doing so, gained a permanent position in the peacekeeping forces in the Ossetian regions. In the months leading up to the war, several activities provided the opportunity for Russia to increase its troops and military presence in the region. An additional 1,000 Russian peacekeeping troops were introduced in approximately April 2008. The troops were reported to be paratroopers, which Donovan (2009, p. 10) describes as "some of the best trained and prepared forces within Russia". Russia also sent battalion troops into Abkhazia to repair a disused railroad in anticipation of the Sochi Olympics; the troops finished their work one week before the commencement of the war. Lastly, Russian military training was held in the North Caucasus, opposite South Ossetia (Donovan, 2009).

When the war began, Russia presented itself as a peacemaker: proxy South Ossetian militias carried out attacks that provoked Tbilisi into making the first move, and separatist forces were also engaged in South Ossetia and Abkhazia. Russia's information campaign against Georgia began strongly. Russia portrayed President Mikheil Saakashvili as the aggressor and itself as the victim, obliged to defend its citizens as Georgia thwarted attempts by South Ossetia and Abkhazia to gain recognition from the Russian parliament. Russia entered South Ossetia anticipating that Georgian troops would respond to separatist forces breaking a ceasefire that had been in place since 1992, and accused Georgia of aggression towards South Ossetia (Roudik, 2019). The back-story presented by Russia was so compelling that 92 per cent of people polled by CNN at the time found in favour of Russia's intervention (Cohen & Hamilton, 2011). According to Thomas (2003), the ostensibly humanitarian position Russia had taken in joining the conflict was believed to fit with what Russia referred to as the "Western doctrine's" need to legitimise military intervention on the international stage. Vladimir Putin, then Prime Minister of Russia, would also blame the US for what was occurring in Georgia, claiming that America should have done more to prevent the conflict. Putin went as far as accusing the US of orchestrating the campaign as part of an election stunt (Chance, 2008). In response to Putin and Russia's information campaign, Georgia launched a counter-disinformation campaign led by a private consultancy and public relations firms. The Georgian campaign included images of the Russian military targeting civilians, portraying Moscow as the villain (Iasiello, 2017).

Cyberspace also played an essential role in the Russo-Georgian War, as both military and civilian actors on both sides of the conflict leveraged its power (Donovan, 2009). Media and communications were redistributed via the Internet by means of blogs, news channels and rumours, proving so useful that Russian internet and television sites were filtered by Georgian authorities (Deibert, Rohozinski, & Crete-Nishihata, 2012). Command-and-control servers originating from Russia were responsible for malicious hacks, DoS and DDoS attacks against Georgian systems and websites, including web page defacements and attacks on critical government, financial services and media sites (Iasiello, 2017). It was unclear, however, who was responsible for these cyberattacks. As Donovan (2009) explains, the Russian government has never claimed responsibility for these activities, and it remains unclear whether these operations were coordinated, encouraged, or officially tolerated by Russian authorities.

Several lessons learnt from the Chechnya conflict may be seen in the deployment of the five-day war in Georgia. A communication plan was implemented coordinating responses between the government and military, with a pro-Russian message seen across traditional media and social media sites. Influential political figures were engaged to undertake political communications on the conflict, among them Prime Minister Vladimir Putin, former President Mikhail Gorbachev and then-current President Medvedev. Russian strikes were also launched against "key communication facilities severely restrict[ing] communication with the national command authority. National fibre-optic trench lines were severed, and DDoS activities disrupted Internet-based communication" (Donovan, 2009, p. 10).

However, the five-day war in Georgia demonstrated the unevenness of Russian military proficiency; one example may be seen in Russian command and control capabilities (Vendil Pallin & Westerlund, 2009, p. 407). Russia fell short with poor communication strategies on numerous fronts, and criticism of the information warfare strategy employed throughout the campaign emerged within the Russian government. As previously noted, pro-Russian media coverage was undertaken, and cyberattacks were carried out by alleged Russian patriot hackers throughout the war. Russian analysts, however, suggest that the campaigns were amateur and that the personnel attached to the information warfare division were not trained efficiently (Vendil Pallin & Westerlund, 2009). As for the cyberattacks undertaken by Russia against Georgia, although they were successful in disrupting websites and Georgian government information systems, they appear to have had no apparent impact on Georgia's fighting ability (Donovan, 2009). Further, Russian command and control capabilities fell short with regard to a spokesperson: while Georgian nationals presented themselves to a global audience speaking clearly and precisely in English, there was no one with the same skill set to speak for Russia (Giles, 2016).

In response to Russia's command and control deficiencies during the Georgian war, the idea was formed to create specialised information troops. The information troops would include specialists in a range of areas such as hacking, journalism, psychological operations, strategic communications and linguistics. Although there is no proof that the information troops came to fruition, a push for change in information capabilities was orchestrated (Giles, 2009; Giles, 2016), leading to, for example, the development of the Russian troll. The war was also used to emphasise Russia's need for military reform, as well as to establish the need to improve Russian military equipment and capabilities (Vendil Pallin & Westerlund, 2009).

4.4 Arab Springs, 2010-2011 and Russian Demonstrations, 2011

A critical event that demonstrated the power of social media to Russia and the world was the Arab Springs uprisings, which saw citizens in various countries in the Middle East and North Africa, such as Egypt, Libya, Tunisia, Bahrain and Syria, unite in anti-government protests (Bruns et al., 2014). Several factors may be seen to have contributed to the uprisings. In June 2010, Khaled Said was brutally murdered by two police officers after the Alexandrian man posted a video online of the same police officers carrying out a drug deal and exchanging money. Said's parents were told that he had choked on a packet of drugs; however, the Internet was soon flooded with images of Said's bloodied and disfigured face, causing public outrage in Egypt and around the world (Eltantawy & Wiest, 2011). Later that year, Mohammed Bouazizi, a vegetable merchant in Tunisia, set himself alight in front of a municipal building in protest against the government (Howard et al., 2011). Not long after the death of Bouazizi, the world witnessed the fall of President Zine El Abidine Ben Ali on the back of the Tunisian revolution, which was said to have further inspired Egyptian protestors. As Eltantawy and Wiest (2011, p. 1212) write, even though Egypt had committed to protests on Egypt's Army Day, "the success in Tunisia appears to have influenced Egyptians and strengthened a sense of collective identity and purpose, primarily because the similarities in the oppressive conditions under which both groups lived and the goals of the citizen-activists".


4.3 Map of Key Points of Interest During the Arab Springs Uprisings and Russian Demonstrations

During the protests, the Internet and SMNs represented a critical new capability in citizen solidarity and a unified movement (Eltantawy & Wiest, 2011). In a study by Eltantawy and Wiest (2011, p. 1218) on the use of SMNs during the Egyptian revolutions19 it was demonstrated that:

Egyptian protesters were able to disseminate a continuous stream of text, videos, and images from the streets of the revolution directly to millions via social media technologies, and indirectly through the republication of these messages on news networks such as Al Jazeera and CNN.

SMNs had created a new form of social movement, known as cyberactivism, which would change the landscape of collective action (Eltantawy & Wiest, 2011). According to Eltantawy and Wiest (2011), the revolution may be traced back to the early 2000s, when Egyptian bloggers began to tackle political issues online, attracting a global audience. Then in 2008, Egypt saw its first cyberactivism attempt, when textile workers used social media to organise a strike. The strike, however, was not successful and was defeated by Egyptian State security forces (Eltantawy & Wiest, 2011). By the end of 2010, Western news sites were being used by Egyptian individuals and political organisations "to spread credible information to their supporters through the revolutionary period" (Howard et al., 2011).

19 The Egyptian protests began in December 2010 and led to an 18-day revolution, which resulted in the resignation of President Hosni Mubarak in February 2011 (Eltantawy & Wiest, 2011).

SMNs were also used to organise and mobilise demonstrators to facilitate regime change (Giles, 2015). When the Egyptian and Libyan governments realised that SMNs were being used to coordinate protests and provide footage to the outside world of internal unrest and violence, cellular communications and the Internet were turned off. In response, how-to documents instructing people in the use of dial-up modems were distributed. Additionally, engineers from Twitter, Google and SayNow initiated 'Speak2Tweet', which provided a means for activists to call and leave messages that would then be tweeted (Eltantawy & Wiest, 2011). Bloggers whose servers resided outside of Egypt were also relied on to spread news of the protests, knowing that their voices would not be taken offline (Howard et al., 2011). In a study of the Tunisian and Egyptian protests, Howard et al. (2011) found that social media was used by democracy advocates to connect with supporters outside their respective countries. These connections provided a means to get information out on what was happening during the protests and throughout the various regions in order to inform the Western world. Additionally, in many cases, the researchers found that "democracy advocates in Egypt and Tunisia picked up followers in other countries, where similar democratic protests would later erupt. Ultimately social media brought a cascade of messages about freedom and democracy across North Africa and the Middle East and helped raise expectations for the success of political uprisings" (Howard et al., 2011, pp. 2-3).

It is easy to imagine that Russian authorities would have been watching the various uprisings which were occurring on its doorstep and monitoring for potential replications inside the State (Bechev, 2016). Russia's last revolution led to the collapse of the Soviet Union and what Putin referred to as "the greatest geopolitical disaster of the 20th century". Russian media responded by suggesting that the colour revolutions and the Arab Springs uprisings were orchestrations of the West in a direct attack against Russia and the Russian way of life (Katz, 2011). In 2015 Vladimir Putin reiterated this allegation in his state-of-the-nation address, where he accused the US of creating "a zone of chaos" in Libya, Iraq and Syria (CBSNews, 2015).


Russia portrayed the Arab Springs uprisings as a product of 'social control technology' set in motion by the US as a form of aggression towards Russia. A year later, during the protest movement in Russia surrounding the parliamentary and presidential elections, Moscow would again claim that the manifestation of aggression was a result of information encroachment formulated by the US and codenamed 'Anti-Putin' (Darczewska, 2014). Putin accused the US of having spent millions of dollars to influence the Russian elections and incite political change in Russia (ABC News, 2011). According to Lonkila (2012), the Putin regime was caught by surprise by the internal protests and the degree of civil unrest witnessed in the lead-up to the Russian Duma elections in 2011. United Russia was "dubbed a party of swindlers and thieves" (Lonkila, 2012, p. 3) as Russia witnessed online activism and public demonstrations denouncing Vladimir Putin. Where in the past there had been public fear of participating in political opposition, the result of previous public beatings and the deaths of human rights activists and journalists, such as the murder of Anna Politkovskaya, by December 2011 videos ridiculing Putin and his political party had begun to appear (Lonkila, 2012).

Putin also blamed the then US Secretary of State, Hillary Clinton, for interfering in Russia by setting the "tone for some opposition activists" (CBSNews, 2015). Clinton, during her speech at the meeting of the 56-nation Organization for Security and Co-operation in Europe (OSCE), stated that the US had "serious concerns" regarding Russia's Duma elections. Further, Clinton told the room that "when authorities fail to prosecute those who attack people for exercising their rights or exposing abuses, they subvert justice and undermine the people's confidence in their governments" (Mohammed & Adomaitis, 2011).

Russia's protests, like other cries for democracy around the world, were a direct result of the growth of the Internet and mobile communications (Lonkila, 2012). Whereas previously Russian households had received their news from Kremlin-controlled news sources such as Sputnik and RT, the growth of the Internet and mobile communications provided a new source of information for the Russian population. Stories of political corruption and maladministration began to appear on YouTube, questioning Putin's government and authority, while gatherings and protests were organised via SMNs. SMNs brought together like-minded people and provided a platform where participants trusted what was said, something that had been lost previously through corrupt Kremlin-controlled media sources. SMNs also provided a way for participants to stay anonymous (Lonkila, 2012).

In 2016, Russian General Valery Vasilyevich Gerasimov, speaking on the events of the Arab Spring, suggested that the rules of war had changed. "The role of non-military means of achieving political and strategic goals has grown, and, in many cases, they have exceeded the power of force of weapons in their effectiveness" (Valerii Gerasimov, 2016, p. 24). Valerii Gerasimov (2016) suggested that information operations open vast asymmetrical possibilities to reduce an enemy’s fighting potential and that a coordinated effort of research organisations, ministries and agencies could achieve this.

4.4.1 The Kremlin’s Response

During the Chechen and Georgian wars, Russia had learnt that the Internet was a powerful tool for controlling perceptions of the events taking place. The Kremlin's response, at first wary, had changed: its aim was no longer to control internet communications. As Galeotti (2019, p. 35) writes:

The Internet was identified as a potential threat [but] emerged at a time when the security apparatus was relatively weak and in no position to control it. While attempts have been made […] to try and control online activity, instead the security structures had to accept that they operated in an information age and instead looked to means to exploit this.

Russia had invested heavily after 2008 in Twitterbots and targeted DDoS attacks, combining modern technology with "old-fashioned dirty tricks" (Giles, 2016, p. 30). However, the results were unsettling as it became evident to the Kremlin that something was missing from Russia's disinformation strategy. Automated systems were not enough, and actual human engagement was needed to penetrate the mass consciousness online (Giles, 2016). The solution: troll farms.

Exactly when the Kremlin began to use troll farms20 to spread disinformation and propaganda is debatable. A 2011 report suggests that Russian troll farms began using Twitter to spread propaganda and misinformation to Russian citizens and their neighbours in 2009 (Howard et al., 2011)21. Then in 2012, Russia began targeting misinformation at US voters, utilising the techniques deployed on Russian citizens and neighbouring Eastern European countries (Howard et al., 2011). According to Platon Mamatov, director of the Russian company Magic Inc PR, Mamatov ran a troll farm of approximately 20-30 people from 2008 to 2013, in partnership with the founder of Ra Digital, Arseny Kamyshev, to "carry out online tasks for Kremlin contacts and regional authorities from Putin's United Russia Party" (Chen, 2015b).

Olga Kryshtanovskaya, a Russian sociologist, activist, deputy from the United Russia party and director of Kryshtanovskaya Labs, suggests that Russia's use of troll farms began in approximately 2011 in response to Alexei Navalny's successful social media campaign. Navalny, the Russian opposition leader, used social media to gain support from the Russian people in the lead-up to the 2012 Russian parliamentary election. As the Russian media is mostly under the control of the Russian government, opposition activists in Russia are often intentionally ignored by the media leading up to elections. In response, Navalny turned to an alternative communication space built on SMNs: "This space influences traditional media and the political agenda of the country, giving Navalny a far-ranging voice" (Herasimenka, 2018). Navalny's campaign gained significant momentum very quickly, as he built a rapport with younger voters. Herasimenka (2018) attributes Navalny's success to five points:

1. Navalny politicised VKontakte (VK), Russia's largest social media network;
2. Navalny's campaign used an encrypted platform called Telegram to communicate, protecting members;
3. Navalny's team created their own TV network utilising YouTube;
4. Navalny targeted his campaign at the Russian provinces, which had until this point kept out of politics, as many in these areas saw politics as Moscow's domain; and
5. The use of SMNs identified supporters of Navalny as an opposition activist from all over Russia, who would mobilise and spill over from the online sphere into the streets.

20 A 'troll farm' is an organised group that has come together for the specific purpose of affecting public opinion through the generation of misinformation and/or disinformation on the internet. In contrast, an individual disseminating misinformation and/or disinformation is referred to as a troll or internet troll (Snider, 2018).
21 See also Keller, Schoch, Stier, and Yang (2020).

According to Weir (2018), the Kremlin would soon learn to mimic this behaviour to spread propaganda and disinformation online. In approximately 2012, the Kremlin tasked Vyacheslav Volodin, known as ‘Putin's Cardinal’, with designing a strategy to deal with the challenges presented by Facebook, Twitter and other social networks used during the 2011 Russian demonstrations. In response, Volodin installed Prism, a computer program said to monitor 60 million online sources at once, enabling online access to public opinion. Prism provided Volodin with a way to closely monitor social media sites and social tensions, ensuring the government's immediate reaction when necessary (Chen, 2015). Volodin also introduced the mandatory registration of Russian bloggers and began blacklisting internet sites without legal authority, based solely on what the Kremlin believed to be unsuitable for the Russian people. Alexei Navalny's blog was among the internet sites that have since been blacklisted (Weir, 2018).

4.5 The Rise of the Troll Farm

The first troll farm activity that I was able to identify was Platon Mamatov’s troll farm in the Ural Mountains. According to Platon Mamatov, in an interview with The New York Times, he coordinated a group of Internet trolls to assist in boosting the image of Alexander Misharin, Governor of the Sverdlovsk Region (Business Quarter, 2013). The aim of the farm was to "carry out online tasks for Kremlin contacts and regional authorities from Putin's United Russia Party" (Chen, 2015). The existence of the farm was confirmed in late 2011, when the Russian news outlet URA.RU reported that paid commentators were operating in the Ural segment of the Russian Internet in a bid to form a positive image of the Urals regional authorities. The story came to light after hackers posted correspondence from the Kremlin and Kremlin officials outlining the campaign (URA, 2012b). Members of Mamatov's staff confirmed the story after not having been paid for the work they had carried out throughout the campaign (URA, 2012b). The project, entitled Improving the Information Background in the Sverdlovsk Runet Segment, had been operational since mid-December 2011 in response to online criticism leading up to the Russian Duma elections (URA, 2012a). According to URA (2012a), approval for the project was granted by Andrei Vorobyov, the head of the Central Executive Committee of United Russia, "when it became clear that opposition activity on the Internet posed a real political threat" (URA, 2012a).


Deputy Prime Minister Alexei Bagaryakov coordinated the project and was in charge of ensuring online discussions 'did not get heated' and of directing conversations in what has been described as a more constructive direction (URA, 2012a). However, reports indicate that the trolls were instead utilised as a political tool in the upcoming elections, to paint a favourable picture of Vladimir Putin and Alexander Misharin. Not long after, the trolls became known as 'Misharin bots'22, when online users noticed an influx of positive feedback in forums and blogs concerning the regional authorities, and Misharin in particular (Business Quarter, 2013). The comments were tied back to the PR consultant Platon Mamatov. At the time of discovery, Mamatov did not try to hide his involvement in the campaign, and as highlighted previously, he has been willing to talk about his work with reporters. Mamatov described the process of his operation as follows:

The group of influence will include: a curator from the administration of the governor (Yevgeny Zorin), a coordinator, a monitoring specialist, and commentators. At the first stage, it is supposed to use ten commentators provided with a special program complex and geographically located outside Yekaterinburg.

Subsequently, volunteers from various social movements, members of United Russia, members of the regional government and other people loyal to the regional administration can be connected to the comment. The coordinator of the influence group will also be coordinating and monitoring their work.

Each commentator will have at his disposal several (from three to five) network characters. Each of them will not be a faceless “bot”, but a unique personality with a separate IP address, its own character, life history, activity on the Internet, relationships with other users and other properties. Every character will be completely indistinguishable from a real person (URA, 2012b).

From news reports and interviews with Platon Mamatov, it is clear that the troll farms Mamatov created were based on human interactions and not bot activity. Mamatov's operation also appears to have been a private company that was hired to run information campaigns for Russian political figures (Chen, 2015; URA, 2012b).

22 Although they were known as ’Misharin bots’, evidence provided by URA (2012a) suggests that the campaign was orchestrated by paid trolls and not bots.

As previously mentioned, there are several discrepancies in the dates given for the start of troll farm activity in Russia. A recent report by the Computational Propaganda Research Project suggests that Russian troll farms began using Twitter to spread propaganda and misinformation to Russian citizens and their neighbours in 2009. The report is in contrast to Platon Mamatov's claim to have run troll farms targeting Russian citizens in the Ural Mountains from 2008 to 2013, and to Olga Kryshtanovskaya's assertion that troll farms began operations in 2011. The Twitter data that has been made available on the Internet Research Agency suggests that while accounts were operating in 2009, they contained no political commentary. Further analysis of the Twitter data was undertaken to examine whether operatives had run information campaigns to influence local opinion in the Ural Mountains leading up to the Duma elections; during the elections in 2010, there was no such activity. The analysis, therefore, suggests that Platon Mamatov's company and the troll farm activity related to the IRA were two separate operations.

4.6 The Internet Research Agency (IRA)

One of the most notorious troll farms, which came to public attention in 2014, is the Internet Research Agency (IRA), said to have been established in 2013. Information on the IRA for the purpose of this research has been drawn primarily from the 2018 US indictment (United States District Court for the District of Columbia, 2018); an exposé that appeared in The New York Times; and the second-hand reports of two former employees of the IRA, which have provided insight into the internal operations of the organisation. The first source, Ludmila Savchuk, was a reporter who went undercover to work for the IRA in order to expose the trolling activities of the farm. The second, Marat Mindiyarov, was a former employee of the IRA who turned informant to various Western newspapers in 2018.

According to Savchuk, the IRA operated out of a basement until 2014 (Chen, 2015). Then, with an increase in online activity concerning the Ukraine crisis and Russia's annexation of Crimea, the IRA's offices expanded to cover the extra workload expected of employees. The IRA moved to a different location, where it occupied over four floors and employed more than 600 workers, split into two central departments23 (Chen, 2015). At the beginning of each shift, employees were assigned a technical task sheet that contained a message that the employee was instructed to support and spread online (Walker, 2015). As the technical task sheet of former employee Marat Mindiyarov reads:

23 The troll farm has since moved to Optikov Street in Saint Petersburg, with no known description available of the operation as it stands today (Nechepurenko & Schwirtz, 2018).

The majority of experts agree that the US is deliberately trying to weaken Russia, and Ukraine is being used only as a way to achieve this goal. If the Ukrainian people had not panicked and backed a coup, the west would have found another way to pressure Russia. However, our country is not going to go ahead with the US plans, and we will fight for our sovereignty on the international stage (Walker, 2015).

According to Walker (2015), the job of the first group of employees was to troll social media sites, both legitimate sites and sites set up by other employees, and spread the message on their daily task sheet. The employees were told to create original and new content for each message they posted; that is, they were not to be repetitive in their postings (Walker, 2015). Often employees would work in groups of three: one would make a comment or respond to an online post, and the other two employees would then respond to the post in order to start a discussion and get the thread trending (Walker, 2015). The second group of employees undertook the task of creating mundane blogs or accounts covering everyday living, such as gardening or craft. Then, between the mundane posts, the employees would include political commentary in an attempt to influence followers. Both groups of employees used Virtual Private Networks (VPNs) to route their operations through computers outside of Russia, presumably to hide the operatives’ location. During the 2016 US presidential election, VPNs were used to route operations through computers located in the US (Lee, 2018).

As per Savchuk’s original reporting on the IRA activities, employees would create content for popular SMNs such as LiveJournal, Vkontakte, Facebook, Twitter and Instagram. Comments were also left in the comment section of news outlets. For example, when opposition leader was murdered, Savchuk was moved into a specific team to leave comments on various news sites, to suggest that Nemtsov's murder was initiated by his party and not by the Kremlin as per speculation of various sources (Satter, 2017; Chen, 2015). Savchuk would work two days on and two days off on twelve-hour shifts. Over those two shifts, Savchuk would be expected to submit five political posts, ten non-political posts and 150-200 comments on other workers posts. Grammar and what Savchuk describes as politology lessons, that is, the study of the way the Kremlin manipulated and reported on politics, was also provided to new


employees to ensure they were aware of the "proper Russian point of view on current events" (Chen, 2015).

In a recount of his two months working for the IRA, Mindiyarov explains how he was assigned to post comments on Russian political sites, work he describes as his rendition of George Orwell's 1984. Mindiyarov also discusses how he applied for a position in the IRA's English comment department, a specific area where employees left comments in English on Western sites, a position held in high regard and attracting a higher salary. He did not, however, pass the assessment, which was to write an essay in English on his views on 'if Hillary Clinton were to become president'. Mindiyarov, in his interview with Washington's Top News, recalled writing favourably of Hillary Clinton becoming president, which he believes was the basis of his unsuccessful application (Green, 2018).

At the time of writing, the IRA continues to be referenced as the only known and researched troll farm in Russia. However, Giles (2016, p. 10) suggests that rather than being the only troll farm in existence, the IRA serves as an "effective distraction from the wider network of troll farms, or the organisation behind them".

4.7 Conclusion

After the fall of the Soviet Union, Russia appeared to have lost the sophistication and flair of information warfare seen during the Cold War era. The Russian government demonstrated several failings after the end of the Cold War in terms of military strategy and information operations. However, it also made improvements through critical learnings in its formulation of information warfare and its adaptation to online communication applications. During the first Chechen war, in its haste to deploy troops, Russia's government was not prepared for an information war against Chechnya. In the second Chechen war, Russia's information operations started strong. However, Russia was unable to maintain control of the information war, as the Internet had emerged as a new and formidable force, the likes of which had never been seen before. In 2008, Russia's military entered the Georgia war prepared for both an information and a kinetic war. Unknown persons unleashed an array of cyber-attacks against Georgian websites, and the Russian government established a communication plan to ensure it remained in control of the information flow. However, once again, Russia fell short with regard to information operations. The cyber-attacks did not impact Georgia's ability to respond to Russia's attacks, and unqualified personnel appeared to have led the information operations side of Russia's campaign. Unlike the first Chechen war, Russia demonstrated a successful kinetic military strategy in both the second Chechen war and the Georgia war, but not a successful information strategy. Tanks and soldiers were not enough for Russia to win on all sides of the war; more was needed.

Russia learnt from the failings of these three wars and began to create an information army to respond to the type of harmful rhetoric seen during both the Chechen wars and the Georgia war. A new imposing strength was in the making: the Internet troll. With internal unrest occurring in countries bordering Russia, and eventually entering Russia, the Russian government needed a way to control the discourse; the Internet appears to be the Kremlin's answer. Beginning in 2008/2009, Russian government officials could be seen using online communication forums to spread messages favourable to the United Russia party and Vladimir Putin. Russian authorities also took back control of the Internet, implementing Prism and patrolling internet sites for anti-Russia discourse. From a Russian government perspective, it would appear that Russia's development of kinetic and information warfare techniques was in response to the growing threat posed by NATO and the US. At the end of the Cold War, NATO implied that it would not expand. However, by 1994 NATO was the closest it had ever been to Russia and was threatening to advance even closer to Russia's borders, with an invitation extended to Georgia and Ukraine; if not for the 2008 Georgia war, this expansion would almost certainly have occurred.

As the coming chapters will demonstrate, Russia's SMN influence has evolved into a sophisticated information operation occurring over various mediums in three stages. The first stage is the creation of disinformation via multiple sources, including: Kremlin-controlled media outlets such as RT and Sputnik; media outlets sympathetic to and supportive of the Russian government, such as True Pundit; sites such as YouTube, which host user-generated media; and hackers, such as Guccifer 2.0, who leak information relevant to the story. The second stage involves trolls and bots amplifying and disseminating the (dis)information created in stage one. The final stage is undertaken by "useful idiots or fools"24, who are usually

24 This is a term used to describe civilians who inadvertently help Russia spread propaganda and disinformation (Bogart & Thorburn, 2005).

ideologically robust supporters of the Kremlin25. These individuals often hold some form of authority over the (dis)information that is disseminated, their central role being to add some form of legitimacy to the campaign (Helmus et al., 2018). Although, as will be demonstrated, by 2016 the useful fool would be replaced with a sock puppet responding to a social media thread. Old Kremlin techniques used to sow or gather disinformation have therefore been reinvented and given new power through the likes of Twitter and Facebook, which are specifically designed to target users with personalised messages.

25 An example of the recruitment of a useful fool, or someone ideologically friendly to the Kremlin, may be seen in Russia's HIV/AIDS disinformation campaign of the 1980s, in which a retired biophysicist and Communist sympathiser was recruited to support the disinformation campaign that the HIV/AIDS virus was a biological weapon created by the US (Bogart, L. M., & Thorburn, S. (2005). Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? Journal of Acquired Immune Deficiency Syndromes, 38(2), 213-218).

Chapter Five – Defining Information in Information Operations

5.1 Introduction

As demonstrated in the previous chapter, Russia's development of information warfare on social media networks (SMNs) was decades in the making, starting in the 1990s with the Chechen wars and continuing to the present day. The Russian government's information warfare campaigns have centred on controlling the narrative and providing information on events and situations through communication and media networks pre-approved by the government. This chapter will now examine how easily and seamlessly the Russian government was able to utilise SMNs to conduct propaganda and spread disinformation within Russia and abroad. In doing so, this chapter will review the concept of information from a geopolitical perspective, paying particular attention to how Russia and, in recent years, the US define the weaponization of information in an online battlespace, before examining the role information plays in Russia's contemporary information campaigns. It will demonstrate that the dynamics of SMNs such as Twitter were a perfect fit for state and non-state actors looking for avenues to distribute propaganda and disinformation. Such features include the ability of users to generate their own content26, the ease of repeating a narrative through multiple subscribers, likes and retweets (with regard to Twitter), and the ease with which a user can distort and misrepresent content. In addition to these characteristics, SMNs have seen the development of information warriors as well as online 'echo chambers'. Echo chambers involve people with similar beliefs and ideologies coming together to form global online communities, while 'information warriors', a term used by Singer and Brooking (2018), describes individuals who use the Internet to shape storylines that frame their audience's understanding of events, provoke a response and incite followers to action, for whatever cause the information warrior is advocating.

Next, this chapter will examine Russia's use of Twitter specifically, highlighting key characteristics that made it appealing for Russian state actors to weaponize. As van Dijck (2018) explains, Twitter has become a central node in the accumulation of online platforms which have come together to create a communication network to "effectively function as news aggregators", known as the platform ecosystem. However, unlike traditional news aggregators, which rely on professional editors to choose content on behalf of users, social media permits everyone to "share news or other content from anyone and from anywhere" (van Dijck, 2018, pp. 52-53). Thus, troll farms such as the Internet Research Agency (IRA), which employ hundreds of people to create multiple user accounts, can manipulate a user's Twitter experience by penetrating online echo chambers and by producing a repetitive narrative that is picked up by Twitter's algorithms as a trending or potentially trending headline. The last section of this chapter will provide an account of the first IRA user groups on Twitter through an analysis of Twitter's IRA data from 2009 to 2013, as described in Chapter Three.

26 Also known as user-generated content (UGC).
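The mechanism by which a repetitive narrative can be mistaken for organic interest is easy to illustrate. The sketch below is a deliberately naive trending metric, not Twitter's actual algorithm (whose details are proprietary): hashtags are ranked by how concentrated their mentions are in a recent time window, so a coordinated burst from a handful of sock-puppet accounts outranks a topic discussed steadily by many genuine users. All hashtags and numbers are illustrative.

```python
from collections import Counter

def trending_scores(posts, window_start, window_end):
    """Naive trending metric: hashtag mentions inside a short recent
    window divided by total mentions, so sudden bursts rank highest.
    `posts` is a list of (timestamp, hashtag) tuples."""
    in_window = Counter(tag for t, tag in posts if window_start <= t < window_end)
    overall = Counter(tag for _, tag in posts)
    return {tag: in_window[tag] / overall[tag] for tag in in_window}

# Organic chatter: one hashtag mentioned steadily across 100 time steps.
organic = [(t, "#news") for t in range(100)]
# Coordinated accounts: 30 sock puppets posting the same tag in a burst.
burst = [(95, "#narrative")] * 30

scores = trending_scores(organic + burst, window_start=90, window_end=100)
# The burst tag scores 1.0 (all of its mentions fall inside the window),
# while the steady tag scores only 0.1, so the coordinated burst dominates.
print(scores)
```

The point of the sketch is the asymmetry: any ranking signal that rewards recency and velocity can be gamed by a small number of accounts posting in concert.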

5.2 State Information Doctrines

Several Russian government documents published over the last two decades demonstrate the importance the Russian government places on information security. In September 2000 Russia released the Doctrine of Information Security, which detailed the necessity of resetting Russia's image on the public stage abroad. The doctrine laid down the foundations of the Russian government's position regarding the information sphere. The main objectives, according to Veebel (2015, p. 1), were "to protect strategically important information, to protect against deleterious foreign information, and to inculcate in the people patriotism and Russian values". Numerous other doctrines and strategies followed this document, such as the Conceptual Views Regarding the Activities of the Armed Forces of the Russian Federation in the Information Space 2011 and the Basic Principles for State Policy of the Russian Federation in the Field of International Information Security 2013. A 2009 document, the National Security Strategy to 2020, highlighted that information warfare would continue to exert a negative influence on Russia's interests. As the doctrine states: "the global information struggle will intensify, threats will increase to the stability of industrialised and developing countries, their socio-economic development and democratic institutions". The doctrine does not, however, suggest who would be responsible for these threats.

Each of the documents described above focuses on information security, rather than the Western terminology of cybersecurity, suggesting that Russia's focus concerns both the technical and cognitive wholeness of information, as opposed to just the technical wholeness (Jaitner & Mattsson, 2015). In 2012 the US Department of Defense took a similar approach in defining the information environment in its Information Operations (2012) doctrine:


[The information environment is] the aggregate of individuals, organizations, and systems that collect, process, disseminate, or act on information. This environment consists of three interrelated dimensions, which continuously interact with individuals, organizations, and systems. These dimensions are known as physical, information, and cognitive. The physical dimension is composed of command and control systems, key decision-makers, and supporting infrastructure that enable individuals and organizations to create effects. The informational dimension specifies where and how information is collected, processed, stored, disseminated, and protected. The cognitive dimension encompasses the minds of those who transmit, receive, and respond to or act on information.

The Information Operations (2012) doctrine is both technical and military and provides a useful framework that ties together technology, processes and content to form an information battlespace (Nissen, 2015). As Nissen (2015) explains, the traditional notion of the battlespace or conflict environment has been replaced with contemporary battles taking asymmetric form, where SMNs play a central role. Played alongside the "theatres of operations" which depict geographical conflict are now "virtual theatres", where forms of remote warfare empower individuals who were previously unable to create any effect in the traditional battlespace. The battlespace has since branched into the political, economic and social spheres because of the information revolution, which has interlinked networks and expanded the information dimension in all aspects of life, including conflicts at a tactical level.

In addition to the doctrines described above are strategic narratives, which allow political actors to collaborate and shape the perceptions, behaviours and beliefs of other actors (Miskimmon, O'Loughlin, & Roselle, 2014), as seen in the competing stories during the conflicts described in Chapter Four. As these examples demonstrate, the strategic narrative is not a single story but several stories, repeated through multiple channels, which come together to form a narrative. "The basic concept of a strategic narrative is therefore that it offers a framework through which conflicts' past, present and future can be structured to help establish and maintain power in the international system and to shape the context and the system itself" (Nissen, 2015, p. 45). However, in a global information environment there is no longer a monopoly on telling a story. Instead, there are many stories and many 'truths'. As the UK military doctrine on Strategic Communication (2019) states, competing narratives may come from deliberately combative sources, such as a state's adversaries or even a hostile media, and so battle it out in an enduring competition.


As is the premise of this research, in contemporary society SMNs are a conduit for state actors such as Russia to disseminate propaganda, as stories are interlinked and made to appear as though they are coming from multiple sources, with words and images shared in support of the narrative (see also Nissen, 2015). Further, the creation of various sources, or a large social media presence, builds on the notion of force multiplication. The US Department of Defense defines force multiplication as "a capability that, when added to and employed by a combat force, significantly increases the combat potential of that force and thus enhances the probability of successful mission accomplishment" (US Department of Defense, 2020). The beneficiary of force multiplication thus appears stronger, creating the appearance of a significant online presence, so that not only will their message spread further, but it also gives the impression of many followers. Fake peer endorsement, or social proof, whereby users who are uncertain what to do turn to users with more followers for guidance (Cialdini, 1984), may in turn lead to real followers (Nissen, 2015).
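The logic of fake peer endorsement can be rendered as a toy threshold model. This is a sketch only: the population size, the block of fabricated endorsements and the uniform scepticism distribution are invented parameters, not empirical values. Each user adopts a message once its apparent endorsement count meets their private threshold, so a block of fake endorsements can unlock a cascade of genuine ones.

```python
import random

def adoption_cascade(n_users, fake_endorsements, seed=0):
    """Toy social-proof cascade: each user adopts the message if the
    apparent endorsement count (fake accounts plus earlier real
    adopters) meets or exceeds a privately drawn scepticism threshold."""
    rng = random.Random(seed)
    endorsements = fake_endorsements
    real_adopters = 0
    for _ in range(n_users):
        threshold = rng.randint(1, 50)  # illustrative scepticism range
        if endorsements >= threshold:
            endorsements += 1
            real_adopters += 1
    return real_adopters

no_bots = adoption_cascade(1000, fake_endorsements=0)
with_bots = adoption_cascade(1000, fake_endorsements=25)
print(no_bots, with_bots)  # zero fake endorsements -> no cascade at all
```

The exact counts depend on the seed, but the asymmetry does not: with zero seeded endorsements no user ever crosses even the lowest threshold, whereas a modest block of fake endorsements typically carries the cascade through most of the population.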

5.3 Russia's Contemporary Information Operations

Russian information operations have developed to use force multiplication by utilising a multitude of channels to manipulate, distort, exfiltrate, extract, and insert information, and to ensure that the only information presented is information the Russian government approves (Giles, 2019). The channels include "fake or real news media for planting disinformation; troll campaigns; official government statements; speeches at rallies or demonstrations; defamatory YouTube videos; direct messages by SMS, or even walking up to somebody on the street and telling them something" (Giles, 2019, p. 6). As reflected in Russian doctrine, the critical principle of information operations is information, regardless of the channel transmitting it. The aim of the Russian government, therefore, is to control all information irrespective of the platform, whether it be cyber, print media, the individual or the mass consciousness (Giles, 2019).

According to Paul and Matthews (2016), however, controlling information is not enough to ensure a successful propaganda campaign; three additional components are necessary. First, as discussed through the theory of force multiplication above, a variety of sources is needed to push the narrative, as multiple sources reiterating the same message are more convincing than a single source. Further, various sources suggest a variety of perspectives,


which leads to greater consideration on the part of the user (Harkins & Petty, 1987). Second, the number of sources supporting the position provides endorsement, regardless of whether that endorsement is false, which can boost confidence, reliance, and trust in the information regardless of the quality of the arguments (Cialdini, 1984). In 2018 a Yale University study revealed that the discerning factor for people when deciding whether to believe something was familiarity. The research demonstrated that the more times a person saw a headline, or a similar headline, the more likely that person was to believe the headline to be real, even if the story was untrue (Pennycook, Cannon, & Rand, 2018). Last, users are more likely to accept messages disseminated by groups with which they are associated or identify.

An examination of various Russian propaganda campaigns over the last century demonstrates each of these components at work. The Soviet era provides numerous examples of successful persistent narratives, for example the Katyn massacres, the name given to the slaughter of 22,000 Polish prisoners-of-war executed en masse in 1940. In 1939 the Soviet Union invaded Poland after signing a secret pact with Nazi Germany, leading to approximately 250,000 Polish citizens being taken prisoner by Soviet soldiers. After interrogating prisoners with police and military backgrounds, Soviet authorities executed those prisoners without trial. The prisoners, broken up into groups, were escorted to various locations and executed by a single shot to the back of the head, with only a few members of each group surviving. In 1943 the German government announced the finding of a mass grave in Katyn forest, the first of many such graves from the massacre. For several decades, the Soviet government denied responsibility for the murders. In 1972, however, British authorities revealed reports from 1940 which indicated Soviet responsibility. In response, the Soviet government reported on the village of Khatyn, located 250 kilometres west of Katyn, whose population was slaughtered by the Germans in 1943, in what appears to be an attempt by the Soviet authorities to control the narrative (The War Institute Review, 2019). The Russian government ran a strong, persistent, repetitive narrative that distracted from the truth (Giles, 2019). According to a newspaper article published in 1974:

The Russians have tried to erase Katyn from maps and history books. The reference to it in the 1953 edition of the Soviet Encyclopedia was dropped in the 1973 edition. No visitors are allowed to the area, and no memorial has been erected.

98

It was not until 1969 that the Russians announced the unveiling of a “memorial complex” on the site of the village of Khatyn. It was one of 9,200 Byelorussian villages destroyed by the Germans, and one of 136 of which all the inhabitants were killed.

The Russians appear to have chosen Khatyn because of the similarity of its name to Katyn. They hoped in this way to obscure the fact they have erected no memorial to the victims of Katyn, which was no less a crime than the one committed at Khatyn (Fitzgibbon, 1980).

The persistent narrative of the Khatyn murders was so compelling that in 1974 US President Richard Nixon visited Khatyn, where Moscow had in 1969 erected a national war memorial to honour the fallen Red Army soldiers (Miller, 2012).

The second component discussed by Paul and Matthews (2016) concerns the number of sources supporting and endorsing the propaganda campaign, which can boost confidence, reliance, and trust in the information regardless of the quality of the views disseminated. An example of this may be seen in 2014, when Aleksandr Kornilov, a member of the Coordination Council of Russian Compatriots in Estonia (KSORS), announced the launch of three new online news sites in the Baltics on the basis of entertaining viewers. However, in 2018 computers seized in an investigation targeting Kornilov revealed a four-year Skype discussion between Kornilov and Aleksandr Svyazin, an employee of the Russian government-owned Rossiya Segodnya, which operates the Russian media platforms Sputnik and RIA Novosti. Kornilov was implicated in a Russian state-sponsored propaganda campaign (Roonemaa & Springe, 2018). The campaign involved Kornilov receiving instructions from Svyazin on what to publish across the multiple media sites. A specific example reported by Buzzfeed, the news site that exposed the campaign, may be seen on the 18th of December 2015, when Svyazin instructed Kornilov to publish several surveys regarding the EU's position on both the US and Russia. As Svyazin stated in the correspondence, "We have a command to publish five surveys about Europe, conducted by a European company on the order of the Flagship [RIA Novosti]. The published materials need to include thorough comments by experts" (qtd. in Roonemaa & Springe, 2018; brackets in the original quote). On the 19th of December 2015, Svyazin contacted Kornilov requesting immediate publication of a survey stating that half of the people in Germany, France, and the UK wished the EU was not as dependent on the US as it was. The second survey stated that most EU citizens believed that the EU, when deciding on Russian sanctions, had been influenced by the US.
The third survey said that Europeans had a rising concern regarding the EU's dependence on the US. The company credited with the surveys, ICM Unlimited, stated in an interview with Buzzfeed that it did not undertake the surveys but had been hired in 2015 by Sputnik to conduct polling. ICM Unlimited was unable, however, to comment on whether the results of the polls were those reported on the news sites (Roonemaa & Springe, 2018).

Propaganda supported by experts has a long history in Russia. Yablokov (2018) highlights the significant role that public intellectuals play in spreading anti-Western rhetoric, providing numerous examples, including the case of Natalya Narochnitskaya, a public intellectual used to help explain Russia's loss of 'superpower status'. Narochnitskaya, a Russian politician and academic who specialises in nation-building through the interpretation of history, argues that "Russia's past territorial and geopolitical achievements cannot be dismissed today since they form the basis of the country's greatness. These achievements were not realized by state rulers, but by ordinary Russians who sacrificed their lives in defence of their country" (Yablokov, 2018, p. 38). Narochnitskaya constantly reminds her followers of her successes as an academic, politician, and diplomat to enhance her public profile and elevate her status as an expert in geopolitics (Yablokov, 2018), while tapping into the veins of nationalism at a working-class level.

The last component that Paul and Matthews (2016) discuss is that users are more likely to accept messages disseminated by groups with which they are associated or identify. Russian trolls have become proficient in utilising SMNs to spread propaganda and disinformation, and they seem well trained in creating and penetrating echo chambers (discussed below) that are sceptical of mainstream information or media channels. A by-product of Cold War information tactics is that Russian citizens have little to no trust in official communications (Jaitner & Mattsson, 2015). As Benn (1996) explains, the foundations for media freedom in Russia appeared in the Gorbachev period, with the first media law of the new era coming into effect on the 1st of August 1990. The law promised freedom of expression and prohibited censorship and the shutdown of media outlets except under specific circumstances, and only then by court order. Before this, Soviet media was under strict government control and censorship. Given this, interpersonal interactions such as those provided by SMNs have gained importance amongst


Russian citizens since their inception. Information shared through SMNs by acquaintances is trusted more than messages spouted by the mainstream media. The trust placed in news stories shared by friends reflects a cardinal rule identified by multiple researchers across numerous studies: whether information propagates online, and how it shapes individuals' perceptions of politics, wars and the media, depends on the number of times the content is shared by a user's friends (Singer & Brooking, 2018). As Singer and Brooking (2018, p. 123) write with regard to posting on SMNs, a person is "more likely to believe what [a friend] says – and then to share it with others, who, in turn, will believe what they say". This concept, known as homophily (Byrne, 1971), implies that people tend to have positive ties with those who are socially similar to themselves, and hence have an innate preference for those individuals, their thoughts and their opinions. In open societies, Russian trolls have adopted this methodology to "successfully create doubts regarding objectivity that is desired from the mainstream media" (Jaitner & Mattsson, 2015, p. 47).
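This friend-trust effect can be sketched as a simple diffusion model. The base rate and the per-friend lift below are illustrative assumptions, not measured values: the probability that a user believes and re-shares a story rises with the number of friends who have already shared it, so a story can saturate a tightly connected friend cluster from a single seed.

```python
import random

def p_believe(friend_shares, base=0.05, lift=0.15):
    """Probability of believing/re-sharing a story, rising with the
    number of friends who have already shared it (capped at 1.0)."""
    return min(1.0, base + lift * friend_shares)

def spread_round(graph, sharers, rng):
    """One diffusion round: each non-sharer starts sharing with
    probability p_believe(count of sharers among their friends)."""
    new = set(sharers)
    for user, friends in graph.items():
        if user not in sharers:
            k = sum(f in sharers for f in friends)
            if rng.random() < p_believe(k):
                new.add(user)
    return new

# A tight cluster of five mutual friends, seeded with one sharer.
clique = {i: [j for j in range(5) if j != i] for i in range(5)}
rng = random.Random(1)
sharers = {0}
for _ in range(5):
    sharers = spread_round(clique, sharers, rng)
print(sorted(sharers))
```

Each new sharer raises every remaining friend's belief probability, which is why content circulating inside a homophilous cluster feels ubiquitous to its members even if it never escapes the cluster.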

The introduction of SMNs has provided a successful propaganda tool that allows the Russian government to amalgamate all three qualities highlighted by Paul and Matthews (2016) into a single propaganda campaign, exacerbating, or powering, a type of information warfare not previously possible, at a scale never seen before. Russia has harnessed the power of SMNs to undermine faith in institutions such as the media and the judiciary, not only inside Russia but also inside countries hostile to Russia. Russian authorities use SMNs to create pro-Russian sentiment at home, while also using them to undermine faith in traditional institutions in the West, and, as will be demonstrated, successfully, at a speed made possible only by the growth of the Internet and SMNs. SMNs offer an opportunity to broadcast a repetitive narrative through various sources with whom users identify, as will be demonstrated in Chapter Six's analysis of the annexation of Crimea and Chapter Eight's analysis of the 2016 US presidential election.

5.4 The Dynamics of SMNs

Social media is defined by Kaplan and Haenlein (2010, p. 60) as "a group of internet-based applications that build on the ideological and technological foundations of Web 2.0, and that allow the creation and exchange of user-generated content". From the mid-2000s, the world saw a rapid growth of social media platforms such as Facebook, Twitter, YouTube and


LinkedIn, which promoted connectivity while capitalising on user data. The interconnection of the platforms mentioned above with non-profit and smaller for-profit sites resulted in what van Dijck (2013) describes as a new infrastructure. This ecosystem moved the Internet from a networked communication tool to a platformed society. As van Dijck (2013, p. 5) explains:

Until the turn of the millennium, networked media were mostly generic services that you could join or actively utilize to build groups, but the service itself would not automatically connect you to others. With the advent of Web 2.0, shortly after the turn of the millennium, online services shifted from offering channels for networked communication to becoming interactive, two-way vehicles for networked sociality.

The creation of a two-way communication channel has provided a means for the Russian government not only to create a narrative but also to control it, through information warriors (discussed below) who penetrate users' echo chambers.

Singer and Brooking (2018) suggest there are five basic concepts of what it takes to win online battles: narrative, emotion, authenticity, community and inundation; 'successful information warriors' demonstrate all five characteristics. As the authors explain, the war that SMNs have enabled is not won on the battlefield, but instead by warriors who can shape storylines which in turn provoke a response that spurs people to action. The warriors connect with their audience on a personal level, creating a sense of camaraderie and fellowship, and can do so globally, again and again. Just as Robert D. Leigh described radio as a tool for propaganda, so too has the Internet proved to be a tool for anyone wishing to spread rumours and fake news (Singer & Brooking, 2018). Explored below are several theories as to why SMNs have proven to be a successful conduit for information campaigns.

SMNs are online technologies which encourage people to share content, insights, opinions, perspectives, experiences and even traditional media. SMNs are further characterised “by easy access, global reach, and the rapid (close to real-time) flow of multimedia information” (Nissen, 2015, p. 35). According to van Dijck (2013) at the time of publication, SMNs had not only provided a way for individuals to get in touch and stay in touch with friends, family and


colleagues, but, due to their diversity, SMNs have also provided a platform for users to create an archive of their lives. The user can then share this archive with a select group of others, open the archive to a broader audience, or utilise the SMN as a "stage for digital flaneurs" (Boyd & Ellison, 2007, p. 155), where the user is not only seen but can also see into the lives of other users (Boyd & Ellison, 2007; Garde-Hansen, 2009). The result is users with common interests, from various demographics and time zones, gathering to conduct one-to-one and one-to-many exchanges or conversations. SMNs also allow the creation of user-generated content (UGC), the automation of content, and the permutation and repetition of information. They also offer the ability to impersonate people or organisations, offer user anonymity, and give the user the ability to intentionally distort and misrepresent content.

SMNs have six key characteristics; they are pervasive, ubiquitous, instantaneous, interactive, socially specific and sticky (Collings & Rohozinski, 2009). Pervasive, as the world's internet usage continues to increase and the use of SMNs becomes more common. Ubiquitous, as the number of network-enabled devices continues to grow, facilitating the migration of content from digital devices such as camera phones to SMN sites. Instantaneous, as SMNs are close to real-time, allowing the user to upload an image which is then available instantly and globally. Interactive, as SMNs give friends, family and even total strangers a way to communicate with each other through messages and comments. Socially specific, as existing networks and shared interests lend clout to social connections. Lastly, 'sticky', as, according to the authors, information seen and read on SMNs is more likely to have an impact on the audience than traditional media content. As Nissen (2015, p. 38) states, "through participation in information aggregation, crowd-sourcing information, or just following events, content can stick with audiences". Nguyen (2019) drew a similar conclusion in his study of radical polarisation and online communities.

Radical polarisation occurs when groups of people live in different ‘worlds’, where the facts that surround them differ from those of other groups (Nguyen, 2019). Nguyen (2019) describes this as occurring via two means. The first is the epistemic bubble, which forms when a member of society gets their news from only one online source, such as Facebook, with their online Facebook community sharing the same political views. According to scholars such as Pariser (2011) and Sunstein (2018), the bubble forms due to the personalisation offered by SMNs. While in the past individuals were restricted to viewing only news presented by traditional media sources,

the Internet allows the user to tailor news to their personal choices. As van Dijck (2018, pp. 65-66) states, platformisation leads to news becoming a personal value rather than a public value: “this indicates that content personalisation cannot be solely attributed to either platform algorithms or user preferences and practices but results from the interplay between platforms, users, and news organisations”. Personalisation therefore leads to social fragmentation, which forms filter bubbles that enclose users and prevent them from experiencing social values and perspectives different from those of the bubble that encases them (van Dijck, 2018). In agreement is the SMN Facebook (2018), which suggests it is the individual user’s biased selection of specific news articles and friend networks that exposes users to particular stories, not any algorithm produced by Facebook.

The second mechanism that Nguyen (2019) describes is the online echo chamber: a community or group of people who hold similar beliefs or biases (Treverton, 2017). Online echo chambers further contribute to an individual's willingness to believe disinformation and propaganda; individuals who have embedded themselves in these small, self-enclosed online networks are exposed to skewed information. The echo chambers justify disinformation because of the limited information individuals are exposed to within their online environment (Sunstein and Vermeule, 2009). Sunstein and Vermeule (2009) call this a crippled epistemology: most individuals rely on other people to think for them and lack personal and direct information regarding real-world events. Further, Sunstein and Vermeule (2009, p. 211) state that these individuals know very few things, and what they do know is wrong. As Nguyen (2019) explains, "an echo chamber leads its members to distrust everybody on the outside of that chamber. And that means that an insider's trust for other insiders can grow unchecked". Nelson and Webster (2017) suggest that echo chamber members are exposed to both sides of an argument; they are, however, conditioned to deny and dismiss any evidence contrary to their beliefs (Nguyen, 2019), including evidence from traditional news media. Related to the culture of echo chambers, according to Singer and Brooking (2018), is confirmation bias, which occurs when a person favours information that confirms their existing beliefs. According to Nickerson (1998), confirmation bias occurs when people search for or interpret information in support of their preconceptions and ignore any evidence which could disconfirm their beliefs.


In her study of Russian trolling and propaganda in Finland, Aro (2016) describes the dissemination of disinformation in various forms to target competing demographics. Memes, for example, are used to lure a younger audience, while echo chambers form to give individuals a sense of belonging and community. Those individuals who do not conform to a crippled epistemology are bullied and threatened within their chosen platform and channelled into complying with these ideologies. In these scenarios, members are lured into an SMN using social engineering, a technique that employs deception to manipulate individuals. Users are manipulated, in this instance, into accepting an agenda or political view once they have established themselves as part of the community, on the threat of being ostracised from that community if they do not. However, in their study on the effectiveness of Russian propaganda in Russia and its neighbours Ukraine, Azerbaijan and Kyrgyzstan, Gerber and Zavisca (2016b) found that only those individuals who already subscribed to Russian propaganda, or were sympathetic to it, would be viewing this narrative. The study found that "the Russian narrative regarding the malevolence of the United States—and, secondarily, its European allies—resonates most in Russia and least in Ukraine. It has mixed appeal in Kyrgyzstan and Azerbaijan" (Gerber & Zavisca, 2016b, p. 86). The study concluded that the more exposed a person was to Russian-based broadcasts, the more likely that person was to accept the Russian propaganda narrative. Treverton (2017), in agreement with the theory of echo chambers and confirmation bias defined above, argues that rather than allowing people to view all angles of an argument, the Internet segments people into echo chambers "in which they hear only what they want and learn only what they already thought" (Treverton, 2017, p. 12).
As Chapter Eight demonstrates, propaganda campaigns targeted at US citizens by the IRA circumvented this shortcoming by reinventing personas to fit into existing echo chambers and push the Kremlin’s narrative.

A further appealing characteristic of SMNs is the ability to manipulate a user’s emotional selection process, whereby negative emotions (anger, distress, fear) are provoked through distinct accounts or rumours (Aro, 2016). According to Bell and Sternberg (2002), emotional selection is the ability of a photo or verbal statement to evoke emotions such as fear, anger and disgust. During the Arab Spring uprisings, for example, SMNs proved to be a valuable source of information to those involved in the uprisings and to their supporters globally. SMNs also worked as an alternative news source when governments shut off


communication to the outside world. The imagery produced by protestors and shared on SMNs, sparked a global emotional response.

As well as being a tool to assist freedom of speech and democracy, SMNs have also provided a new platform for disinformation (Starbird, 2017). Platforms like Twitter allow the dispersal of a small amount of information in each tweet; such short messages gain significant influence when repeated. Once a message has spread and gained traction, it is difficult to dissuade people from believing it, especially when the rumour triggers intense feelings (Aro, 2016). As Sunstein and Vermeule (2009, p. 216) explain, "in the marketplace of ideas, emotional selection plays a significant role, and it helps to explain such diverse phenomena as [for example] moral panics about deviant behaviour [and] hysteria about child abuse". Regarding memes, Bell and Sternberg (2002) suggest that they are not only selected but retained in a social environment based on the meme’s ability to tap into and provoke emotions. Consistent with SMNs’ manipulation of users’ emotional responses through distinct accounts and rumours, untruthful memes often spark intense emotions and therefore survive in social environments. In their study of fake news stories on Twitter between 2006 and 2017, Vosoughi, Roy, and Aral (2018) found that false stories disseminated on Twitter spread farther, faster and deeper than stories that told the truth. Further, depending on the news presented, users’ emotional responses drove a tweet’s dynamics: users perceived fake news as novel, and fake tweets produced responses such as fear, disgust and surprise. In comparison, “true stories inspired anticipation, sadness, joy and trust” (Vosoughi et al., 2018).

5.5 The Little Blue Bird—Twitter

Twitter27 is an SMN that allows a user to write a 140-character burst, based on the 160-character limit imposed by the short message service (SMS) minus 20 characters Twitter set aside for the uniform resource locator (URL) (if the URL is shorter than 20 characters, Twitter automatically expands it to 20 characters). As Twitter’s co-founder Jack Dorsey explained, Twitter was named after the founders looked in the dictionary for an appropriate word. Twitter’s definition was perfect, “short bursts of inconsequential information” and “chirps of birds”, which is how Dorsey himself defined the product (Singer & Brooking, 2018, p. 48). According to van Dijck (2013, p. 69), “Twitter presents itself as an echo chamber of random chatter, the online underbelly of mass opinions where collective emotions are formed and where quick-lived trends wax and wane in the public eye”. van Dijck (2013) further describes Twitter as a paradox, which enables connectedness through engineering connectivity, where algorithms weigh and select tweet content and user contributions based on the social practices of following and trending, two distinct features of Twitter. Following occurs when a user subscribes to other users’ tweets—subscribers are known as followers. Following allows users to engage in real-time communal dialogue. Twitter implemented following so individuals could connect, interact and exchange conversation. The term would also come to represent trailing a user and buying into their ideas (van Dijck, 2013).

27 For a full account of the role Twitter plays in contemporary politics and culture see Burgess and Baym (2020).

As Burgess and Baym (2020, p. 8) state, by mid-2007 Twitter “was receiving significant mainstream media and technology press attention. Politicians John Edwards and Barack Obama were there by May 2007, as were leading international media outlets like the BBC, Al Jazeera and the New York Times”. Additionally, a diverse and wide range of communities had also discovered Twitter, such as academics, fans of popular culture and activists (Burgess & Baym, 2020). In late 2008, Twitter continued to expand the user experience with the implementation of ‘trending topics’, which empowered users to organise posts by topic through the hash sign (#), the hashtag providing a means for users to condense topics into words or phrases. Users could now follow trending topics by watching and using hashtags, or they could passively track topics. Trending topics therefore not only “refer to streams of ‘surfacing content’ but may also signal content messages aimed at pushing the message to go viral and spill over into other social platforms and mainstream media” (van Dijck, 2013, p. 77). According to Burgess and Baym (2020), hashtags play both a structural and a semantic role, as they coordinate not only topics but communities, providing a means for users to find each other as well as contributions to discussions on issues and events that may be relevant to the user.

In 2009, Twitter introduced retweeting, which generated an enormous amount of traffic on Twitter as it allowed users to repost tweets by other Twitter users. Twitter had implemented, and was encouraging, conversational tagging. Just as hashtags provided a way to organise topics such as news reports and events, “retweeting became inextricable from this” (Burgess & Baym, 2020, p. 96). Further functionality of relevance to this study includes the 2010 addition of a geospatial function and the 2011 ‘discover’ function. The geospatial function gave users a way to initiate conversational interactions with people in specific areas and locations, while


the discover function was a way to forward “the most relevant stories from your world” (Twitter.com).

Twitter’s design fits neatly into the Russian modern-day propaganda model discussed in the literature review, which requires a high number of channels to spread propaganda and a willingness of these channels to disseminate lies and partial truths. It is easy to register user accounts on Twitter, and the system defaults to users talking to each other publicly. Twitter’s very design also makes usage fast, open and difficult to control, a point Twitter’s CEO Dick Costolo admitted in a 2015 internal memo:

We suck at dealing with abuse and trolls on the platform and we've sucked at it for years. It's no secret and the rest of the world talks about it every day. We lose core user after core user by not addressing simple trolling issues that they face every day.

I'm frankly ashamed of how poorly we've dealt with this issue during my tenure as CEO. It's absurd. There's no excuse for it. I take full responsibility for not being more aggressive on this front. It's nobody else's fault but mine, and it's embarrassing.

We're going to start kicking these people off right and left and making sure that when they issue their ridiculous attacks, nobody hears them.

Everybody on the leadership team knows this is vital.

@dickc

Adding to Twitter’s value as an information weapon, Twitter’s number of active users expanded considerably. In 2008, during the Georgia war, Twitter had 2.8 million unique users producing an average of 300,000 tweets a day. By 2014 Twitter had grown to 288 million active users, reaching a peak of 661 million tweets in a single day during the 2014 Football World Cup. From 2009, Russia's reliance on SMNs increased in line with the global usage of SMNs (Jonsson & Seely, 2015). This expansion enabled "not only instantaneous news consumption and distribution but also crowdsourcing, enabling mass manipulation across the information spectrum" (Jonsson & Seely, 2015). Twitter was a perfect tool for the Kremlin's information warfare toolbox.


5.6 Russia’s Information Operations on Twitter

SMNs such as Twitter and Facebook have “rapidly become central nodes in the platform ecosystem, where they effectively function as news aggregators but with a few twists” (van Dijck, 2018, p. 52). Whereas traditional news distributors rely on professional editors to select content, on SMNs anyone can share news content from anywhere, with algorithms selecting what users see. SMNs are not just new technological means of exerting influence and communicating; they are also a weapon-system that provides non-state and state actors intelligence, authority, operations and even command and control capabilities (Nissen, 2015). Thus, as alluded to in the introduction, troll farms such as the IRA can amplify any message on SMNs by merely ordering their employees to start talking about a topic and to like and retweet other employees’ tweets on the same topic. The premise behind such behaviour is to manipulate users’ Twitter experience through a repetitive narrative, in the hope that the narrative is picked up by Twitter's algorithms as a trending or potentially trending headline.
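
The amplification premise described above can be illustrated with a short, purely hypothetical sketch. This is a toy model, not Twitter's actual trending algorithm; the threshold value, field names and sample data are all invented for illustration:

```python
from collections import Counter

def trending(tweets, threshold=5):
    # Toy rule: a hashtag "trends" once its total volume passes a threshold.
    volume = Counter(tag for t in tweets for tag in t["tags"])
    return [tag for tag, n in volume.items() if n >= threshold]

# Organic chatter alone stays below the threshold...
organic = [{"tags": ["#news"]}] * 3
# ...but a troll farm's coordinated reposting of one narrative crosses it.
coordinated = organic + [{"tags": ["#narrative"]}] * 6

print(trending(organic))      # []
print(trending(coordinated))  # ['#narrative']
```

The point of the sketch is only that volume-sensitive ranking rewards coordinated repetition regardless of who produces it, which is precisely what a troll farm exploits.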

Russia began its troll farm activity on Twitter in 2009, four years before the registration of the IRA and corresponding with the period in which, van Dijck (2013) claims, Twitter's audience began to soar. An examination of the Twitter data associated with the IRA in 2009 reveals only four active accounts, in two languages, Russian and English, with a total of 212 tweets during 2009. The 2009 tweets showed no obvious political rhetoric, only what appears to be conversational banter. For example, “Не столько ум, сколько сердце помогает человеку сближаться с людьми и быть им приятным” [“Not so much the mind as the heart helps a person to come closer to people and to be pleasant to them”]. In 2010, the IRA had increased to seven Twitter accounts, utilising seven different languages, with a total of 19,063 tweets. Some of the tweets from this time refer to minor political situations, for example, "Как гуляла делегация Грузии после саммита НАТО в Лиссабоне http://j.mp/gDgsXU" ["How the Georgian delegation partied after the NATO summit in Lisbon http://j.mp/gDgsXU"]. Satirical comments about American politicians may also be seen—Джордж Бушь катался на велосипеде, упал и повредил ногу, ответственность за теракт взяла на себя алькаида.... [“George Bush rode a bicycle, fell and injured his leg, and responsibility for the attack was claimed by al-Qaeda”].
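
The year-by-year figures cited in this chapter can be reproduced from the released dataset with a few lines of Python. The sketch below is illustrative only: the field names follow Twitter's election-integrity release (tweet_time, userid, tweet_language) but are assumptions here, and the sample rows are invented rather than drawn from the real data.

```python
from collections import defaultdict
from datetime import datetime

# Invented sample rows standing in for the real IRA dataset.
sample = [
    {"tweet_time": "2009-06-01 12:00", "userid": "u1", "tweet_language": "ru"},
    {"tweet_time": "2009-07-09 09:30", "userid": "u2", "tweet_language": "en"},
    {"tweet_time": "2010-03-15 18:45", "userid": "u1", "tweet_language": "ru"},
]

def yearly_summary(rows):
    """Return {year: (active accounts, languages, tweet count)}."""
    accounts, languages, counts = defaultdict(set), defaultdict(set), defaultdict(int)
    for row in rows:
        year = datetime.strptime(row["tweet_time"], "%Y-%m-%d %H:%M").year
        accounts[year].add(row["userid"])
        languages[year].add(row["tweet_language"])
        counts[year] += 1
    return {y: (len(accounts[y]), len(languages[y]), counts[y]) for y in counts}

print(yearly_summary(sample))  # {2009: (2, 2, 2), 2010: (1, 1, 1)}
```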


Even though Twitter introduced hashtags in 2008, Russian operatives made little use of them in their first few years on Twitter. In 2009, the Twitter data files contained only seven (7) hashtags across 212 records. The hashtags appeared random and of no significance. They included: ‘#rutwitter’, ‘#RuFollowMe’, ‘#Question’, ‘#iswearithurts’, ‘#aintnothinglike’, ‘#musicmonday’ and ‘#classicmoviequotes’. In 2010 the Twitter data files contained 321 tweets with hashtags out of the 19,063 tweets, the most popular hashtags being #rusia (64 times), #music (48 times) and #ru_ff, which appears to stand for Russia Friends Forever (91 times). By 2011 the use of hashtags had dropped to 78 tweets, comprising random hashtags such as #SoundCloud, #ru_ff, #sex and #STARBUCKS_COFFEE.
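
Hashtag counts of this kind can be derived with a simple extraction over the raw tweet texts. The following is a minimal sketch, not the dissertation's actual code; note that in Python 3 the `\w` pattern also matches Cyrillic, so hashtags such as #СПб are captured alongside Latin ones:

```python
import re
from collections import Counter

def top_hashtags(tweet_texts, n=3):
    """Count hashtags across tweet texts, case-insensitively."""
    tags = Counter()
    for text in tweet_texts:
        tags.update(t.lower() for t in re.findall(r"#\w+", text))
    return tags.most_common(n)

sample = ["#ru_ff привет #music", "#RU_FF again", "#СПб news"]
print(top_hashtags(sample))  # [('#ru_ff', 2), ('#music', 1), ('#спб', 1)]
```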

The original purpose of the IRA is unclear. For example, had the Russian government planned the IRA's expansion, or were the first troll farms a social experiment that proved successful? By 2011, however, IRA activity had grown to 13 accounts, utilising 14 languages. 2011 also saw the agency produce fewer tweets, the total number dropping to 14,656; however, the focus of the tweets began to change. The tweets were a mix of content, with the operatives' primary objective during this time appearing to be the establishment of an increased number of accounts and the acquisition of followers. Included were numerous pro-Putin statements, for example, “Не вам сражаться; вы станьте и смотрите на то, как я буду спасать ваши души зачерствевшие. На бой с Путиным, я православием благословлен" ["It is not for you to fight; you stand and look at how I will save your stale souls. To the battle with Putin, I am blessed by Orthodoxy."] and "пусть всегда будет солнце, пусть всегда будет небо, пусть всегда будет Путин, пусть всегда буду Я" ["may there always be sun, may there always be heaven, may there always be Putin, may there always be I."]. Many tweets appear to have been written with the intention of shocking or baiting the reader. For example, "Новиночка. Резиновая женщина для любителей анального секса, с грудями на спине” [“New thing. Rubber woman for lovers of anal sex, with breasts on her back”] and "-К пушкам,трусы! -закричал Немцов. -Палите картечью,цельтесь в кремлевские звезды и в окна президента и премьер.министра!” ["-To the guns, cowards! cried Nemtsov28 -Fire buckshot, aim at the Kremlin stars and at the windows of the president and prime minister!"].

28 Boris Nemtsov was a Russian Politician and critic of Vladimir Putin. Nemtsov was a supporter of capitalism in post-Soviet Russia and served under Boris Yeltsin in the 1990s. In 2015 Nemtsov would be murdered with suspicion falling on the Kremlin for his death (Birnbaum & Branign, 2015)


In 2012, with both the Russian Duma elections and the Chechen elections due to occur, the IRA's conversation on Twitter changed once more, with the hashtag playing a significant role. #Putin and #CleanElections were used heavily by IRA operatives. With regards to the Chechen election, one operative tweeted on the 26th of January 2012, “Об этом событии расскажут СМИ всего Мира – радио, телевидение, газеты, Интернет. Моя душа поет и ликует. Горжусь Чечней и чеченским народом" ["The media of the whole World will tell about this event—radio, television, newspapers, Internet. My soul sings and rejoices. I am proud of Chechnya and the Chechen people"]. Also during this time, fake news stories appeared to increase, with one operative posting "Собчак и Навальный тайные любовники. Шок! Видио на Lifenews.ru!” ["Sobchak and Navalny secret lovers. Shock! Vidio [sic] on Lifenews.ru!”]29 and “Путин в храме, Медведев в храме, а Навальный где-то в Майами.” ["Putin is in the temple, Medvedev is in the temple, and Navalny is somewhere in Miami"]. Minor reference is also made to the Arab Spring uprisings, with one operative tweeting “Революции в Тунисе и Египте на сегодняшний день—самые дорогие пиар-акции курортов Краснодарского края” [“The revolutions in Tunisia and Egypt today are the most expensive public relations actions of resorts in the Krasnodar Territory”].

In 2012 the IRA Twitter data also demonstrated an expansion to 67 accounts, in 17 languages, with a staggering 217,324 tweets. According to Howard, Ganesh, Liotsiou, Kelly, and François (2018), 2012 was also the year that Russia began targeting disinformation at US voters utilising the techniques deployed on Russian citizens and neighbouring Eastern European countries. However, a review of the IRA data for 2012 demonstrated no direct information campaigns against western countries. Russian was still the dominant language used by the IRA; despite the expanded use of other languages, only a few English tweets had been produced. What was produced in English was primarily social banter, for example, "The Cure-Killing an Arab. http://t.co/hybIlA0W", which refers to the 1978 song by The Cure, which had received racist interpretations in the early 2000s, leading The Cure to place a sticker on a rebranding of the song which read:

The song ‘Killing an Arab’ has absolutely no racist overtones whatsoever. It is a song which decries the existence of all prejudice and consequent violence. The Cure

29 Sobchak and Navalny were in opposition to Putin during the 2012 Election.

condemn its use in furthering anti-Arab feeling (From the Cover of Standing on a Beach, The Cure).

However, it is unclear whether the tweet was about popular culture or an attempt to stir up ethnic tensions, a standard tool of the Russian government discussed throughout this dissertation.

In 2013 the Twitter landscape of the IRA shifted once more, the total number of tweets produced by the IRA dropping to 148,503 across 283 distinct users in 36 different languages. The 2013 tweets, when examined in samples, show no apparent signs of political bias and read like friendly ‘chit-chat’, everyday updates and, in some cases, news feeds. For example, iris0_o wrote on the 1st of January 2013, “В Новой Зеландии утонул звукорежиссер "Властелина колец" Майкл Хопкинс #кино” [“The Lord of the Rings sound engineer Michael Hopkins drowned in New Zealand #cinema”]; on the 12th of September 2013 Mary_Giber wrote, “Мультфильм «Холодное сердце» вырвался в лидеры кинопроката США http://t.co/uaaAYeEObj” [“The cartoon "Frozen" broke into the leaders of the United States film distribution http://t.co/uaaAYeEObj”]; and on the 28th of December 2013 NovostiSPb wrote, “Левада-центр: Путин снова стал "человеком года" для россиян http://t.co/SqR8UWwZWo #СПб” [“Levada Center: Putin again became the "man of the year" for the Russians http://t.co/SqR8UWwZWo #StPetersburg”]. During 2013 it was also not unusual for Twitter accounts to read as news feeds, such as NovostiSPb, which described itself as “По отчеству—с Невы #Петербург #Питер #СПб #СанктПетербург #Новости ВКонтакте: https://t.co/uef6IFQdoi” [“Patronymic—from the Neva #Petersburg #Peter #SPb #StPetersburg #News VKontakte: https://t.co/uef6IFQdoi”] and was responsible for 14,187 tweets throughout 2013 alone.

In 2014 the world witnessed the annexation of Crimea by Russia, as discussed in Chapter Six. A search of the 2013 IRA tweets, undertaken to see whether a campaign regarding the soon-to-be-annexed region had begun in the lead-up to the event, was unsuccessful. The search was conducted using keywords, as per Table 5.1 below. A tweet was considered to mention Russia/Ukraine/EU if it referred to either Russia and Ukraine’s relationship or Ukraine’s planned move to the EU. All other tweets fell into “Does not mention Russia/Ukraine/EU”.


Table 5.1 – 2013 IRA tweets concerning Russia’s, Ukraine’s and the EU’s relationship

Keyword search                  Number of tweets   Mentions            Does not mention
                                                   Russia/Ukraine/EU   Russia/Ukraine/EU
Крым (Crimea)                   127                6                   121
Крымский (Crimean)              2                  0                   2
Украина (Ukraine)               200                35                  165
Украинец (Ukrainian/Ukrainia)   5                  0                   0
Украинцы (Ukrainians)           8                  1                   7
Crimea(n)                       0                  N/A                 N/A
Ukraine                         6                  0                   6
Ukrainian(s)                    0                  N/A                 N/A
Ukrainia                        2                  0                   2

Examples of tweets that mention Russia/Ukraine/EU include: “Украина в системном кризисе и Россия для нее «наименьшее зло»” [“Ukraine is in a systemic crisis and Russia is “the least evil” for it”]; “Держись, Украина. Россия идет на помощь. http://t.co/LcNhu3GDYX” [“Hang on, Ukraine. Russia is coming to the rescue. http://t.co/LcNhu3GDYX”]; and “Представители Украины и РФ обсудили вопросы пребывания в Крыму ЧФ РФ” [“Representatives of Ukraine and the Russian Federation discussed issues of the Russian Black Sea Fleet’s presence in Crimea”]. As the table demonstrates, there was very little IRA activity on Twitter concerning Ukraine’s proposed move to the EU, no mention of Russia’s relationship with Crimea and minimal political commentary about Russia and Ukraine’s relationship.
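
The keyword search summarised in Table 5.1 can be approximated with a case-insensitive substring count per keyword. This is a sketch of the approach, not the dissertation's actual code: substring matching means, for example, that ‘Крым’ would also match ‘Крымский’, which is one reason inflected forms were searched separately; the sample tweets below are taken from the examples quoted in the text.

```python
KEYWORDS = ["Крым", "Украина", "Crimea", "Ukraine"]

def keyword_counts(tweet_texts, keywords):
    # Number of tweets containing each keyword (case-insensitive substring).
    return {k: sum(k.lower() in t.lower() for t in tweet_texts)
            for k in keywords}

sample = [
    "Держись, Украина. Россия идет на помощь.",
    "Представители Украины и РФ обсудили вопросы пребывания в Крыму ЧФ РФ",
    "Hang on, Ukraine.",
]
print(keyword_counts(sample, KEYWORDS))
# {'Крым': 1, 'Украина': 1, 'Crimea': 0, 'Ukraine': 1}
```

Classifying each hit as mentioning or not mentioning the Russia/Ukraine/EU relationship, as in the table's final two columns, remains a manual reading step.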

5.7 Conclusion

The Russian government places significant importance on information and, as such, has invested in a variety of means to ensure control over it. As demonstrated, the propagation of falsehoods and fake news is an integral part of Russia’s culture and history. In contemporary society this tradition continues, as Russian propaganda is produced en masse


and distributed via a multitude of channels, including "text, video, audio and still imagery propagated via the Internet, social media, satellite television, and traditional radio and television broadcasting" (Paul & Matthews, 2016, p. 2). To ensure the success of these propaganda and disinformation campaigns, a narrative is repeated, supported by experts and propagated by trusted sources. Such information operations are by no means new techniques, as the disinformation campaign surrounding the Katyn massacre demonstrated in 1943. What is new, however, is the role social media networks play in providing a means to accomplish all the elements required for a successful information campaign. More significantly, SMNs provide an avenue for state actors such as Russia to carry out propaganda campaigns unchecked, the dynamics of SMNs providing the Russian government with an almost perfect platform for information operations. SMNs have a global reach, provide information in close to real-time and allow users with common interests to come together in online communities. Most importantly, SMNs are sticky, meaning that information received on SMNs tends to stick with audiences more than that from traditional media sources (Nissen, 2015).

Information warriors have appeared on SMNs shaping storylines to suit their agendas and provoke responses from other SMN users (Singer & Brooking, 2018), and although not all information warriors may be attributed to the Russian government, Russian operatives have easily and seamlessly adopted SMNs as part of the state's information warfare toolkit. This was reflected in 2015 when Twitter's CEO made a formal statement to Twitter's employees concerning trolling activities on the platform and the fact that Twitter "sucked" at identifying and policing them. An example of such trolling operations may be seen in the appearance in 2009 of the Twitter accounts later attributed to Russia's Internet Research Agency (IRA). The seemingly harmless tweets of 2009 consisted of conversational banter with no political commentary or direction, with geopolitical statements and opinions beginning to appear from 2012.

The following chapter will examine the 2014 conflict between Russia and Ukraine and the role SMNs such as Twitter played in the annexation of Crimea.


Chapter Six – Crimea

6.1 Introduction

As the previous chapter demonstrated, the Russian government controls the spread of information, propaganda and disinformation through a repetitive narrative that is supported by experts and propagated by trusted sources. The current chapter will examine Russia’s use of conspiracy theories to run information warfare campaigns. An underlying theme in the Russian conspiracy theories examined in this chapter is a plot by the West, in particular the US, to undermine Russian values. In this sense, the US and the West are seen as the dangerous ‘other’, polluting Russia’s way of life while trying to wrench away from Russia those countries that still have a large Russian population and a long connection to Russia, such as former Soviet Union states. The use of conspiracy theory to drive a political agenda in Russia, as will be demonstrated, dates back to the mid-18th century, with anti-Western conspiracy theories being amongst the “most popular instruments of social cohesion used by the political elites to maintain control over the country” (Yablokov, 2018, p. 48). Examples of anti-Western conspiracy theory include the anti-Semitic conspiracy that appeared in the Soviet period, which claimed that the US had created a virtual state within Israel for the sole purpose of dominating the Middle East. Another, more recent, example may be seen in the colour revolutions, where Moscow accused the West of orchestrating events in an attempt to start a revolution in Russia.

In 2013, when a wave of demonstrations and civil unrest spread across Ukraine in what would be dubbed Euromaidan, the Russian government once more saw this as a direct assault by the West (Mankoff, 2014). Ukraine is a former state of the Soviet Union and a neighbour of Russia. Any dependency on the EU would lead Ukraine closer to NATO and the UN and, in turn, push Ukraine further away from Russia. In what appears to be a response to Euromaidan and an attempt to damage the relationship between Ukraine and the US, NATO and the UN, Russia undertook both a kinetic and an information warfare campaign against Ukraine, resulting in the annexation of Crimea as well as conflict in the eastern regions of the country. The information warfare campaign, as will be demonstrated, was sophisticated, relentless and successful. It was carried out via multiple channels, including the SMN Twitter. The final section of this chapter will examine in detail the IRA’s campaign on Twitter during the annexation of Crimea, in so doing demonstrating a repetitive narrative,

supported by experts and propagated by trusted sources. In many cases the narrative falls under the definition of conspiracy theory.

6.2 Russia’s Post-Soviet Union Disinformation, Propaganda and Conspiracy Campaigns

Conspiracy theory may be defined as the projection of responsibility for a tragic or harmful event onto a powerful group or organisation, thereby rejecting any official version of events. As Byford (2011, p. 21) writes, conspiracy theories are “reserved for conspiracy-based explanations which deal with large scale, dramatic social and political events […]; for explanations that do not just describe or explain an alleged conspiracy, but also uncover it and in doing so expose some remarkable and hitherto unknown ‘truth’ about the world”. Conspiracy theory in Russia may be dated back to at least the mid-18th century (Zorin, 2001), with fears regarding masonic plots emerging in the upper echelons of Russian society. Rumours that Freemasonry had ties to the devil and that members took part in sexual rituals spread in Russia (and in other European countries), with freemasons accused of starting the French Revolution (Smith, 1999). Fear that the freemasons were plotting a revolution in Russia led to the closure of Freemason societies in the country and the imprisonment of Nikolai Novikov, the Russian Freemason leader (Smith, 1999). By the 19th century, conservative conspiracy theorists such as the prominent writer Mikhail Katkov focused on the conspiracy of the 'Western plot'. According to Katkov (1863), a 'genuine Russian' was an Orthodox Christian, a loyal subject and a committed monarchist; anyone who was not was seen as the dangerous ‘other’ and an enemy of the state, ‘other’ being defined by the two opposing domains of ‘us’ and ‘them’. ‘Them’ refers to individuals who do not hold the same beliefs or ideologies as the ‘us’, owing to such things as religious or political views, race, colour or ethnicity, leading to the notion that if you were not with ‘us’, then you were ‘them’: a dangerous ‘other’ who threatened the existing social network (Jackson, 2002).

By the Soviet era, conspiracy theory had become a mechanism for the ruling party to interpret domestic and foreign policy to suit its particular agenda. As Yablokov (2018, p. 20) states, "this understanding of the Soviet Union as a besieged nation became a norm in Soviet life, especially in the 1930s when the active search for public enemies and wreckers began". An early example of this is seen in a 1937 article in the Soviet Union's main newspaper


at the time, Pravda: "We know that engines do not stop by themselves, machine tools do not break down on their own, boilers do not explode on their own. Someone's hand is hidden behind these events" (qtd in Rittersporn, 2014, p. 34). In this example, conspiracy theory was used as a mechanism to explain away dysfunctions in Soviet industry and the economy (Yablokov, 2018). As early as 1923, the Russian government had institutionalised the development of disinformation and propaganda as a tool of the ruling elite, with Józef Unszlicht establishing the Dezinfobiuro or dezbiuro [The Bureau of Sabotage, Misinformation and Special Propaganda] within the Soviet Union's secret police30. This was followed in 1959 by Department D, named for Деза [disinformation], which was founded under the KGB. The Soviet Union's military intelligence31 is also suspected32 of operating a disinformation department.

Although the military intelligence operations were not confirmed, the success of the KGB's Department D meant that by 1963 it had been promoted to Service A of the 1st Directorate of the KGB, the A standing for aktivnye meropriyatiya [active measures] (Darczewska & Zochowski, 2017, p. 26):

the special services carried out functions which were informative (analysis of the situation), organisational, security-related, monitoring, and also conceptual and inspirational. They used their channels to build a network of agents who exerted influence and financed actions in support of the Kremlin’s policies. […] Agents were placed in positions in the opinion-making media [and] international organisations […]. By generating false documents, by activating the internal opposition in the West and creating political and social crises there […] by causing events which were desirable from the point of view of the Kremlin, they worked under its supervision and control.

With the collapse of the Soviet Union, the US requested the dissolution of the Soviet Union’s active measures department. However, reports in 2002 by Colonel Sergei Tretiakov, a Russian defector, suggest that rather than being dissolved, Service A was transformed into Мероприятия содействия [the facilitation assistance unit] within the Служба внешней разведки [Foreign Intelligence Service (FIS)] (Darczewska & Zochowski, 2017). In 1999,

30 The Объединенное Государственное Политическое Управление (OGPU) [Joint State Political Directorate].
31 Главное разведывательное управление (GRU) [Main Intelligence Directorate].
32 See, for example, Darczewska and Zochowski (2019).

the FSB also created the Directorate for Support Programs, a parallel structure to Мероприятия содействия (Soldatov & Borogan, 2015).

6.3 The Dangerous Other

The Russian government has a long history of weaponizing conspiracies, propaganda and disinformation to explain away events which did not reflect favourably on it. Included in this narrative was the idea of the dangerous other: anyone who did not fit the profile of an ideal Russian. As Yablokov (2018, p. 23) writes, “suspicion and paranoid assumptions about the people next door were central to the consciousness of the late Soviet citizen, and [would survive] the Soviet collapse”. A powerful example of a conspiracy theory seen in Imperial and Soviet times, and still used today, is the anti-Semitic conspiracy, which encouraged fear of a powerful, albeit small, group of Jewish people within the state. For example, Maksim Shevchenko, a modern Russian public intellectual known for spreading anti-Western conspiracy theories in Russia, has written about Israel as a ‘purely virtual state’, created by the US for the sole purpose of domination in the Middle East (Yablokov, 2018). Shevchenko, exploiting long-standing prejudices against the Russian Jewish population, has transformed the Jews into a conspiring dangerous ‘other’ who hamper the improvement of interethnic peace in Russia and abet conflicts. As Yablokov (2018, p. 49) explains, “the anti-Western conspiracy theories expounded by public intellectuals have become a populist tool; this serves to legitimise the authoritarian rule of the president and delegitimise his opponents”.

However, it could be argued that the anti-Western conspiracy theory is more than just a tool to be utilised by the Russian government. For example, several sources have described how Moscow saw the colour revolutions in Georgia (2003), Ukraine (2004) and Kyrgyzstan (2005) as orchestrated by the West to destabilise the region, in the hope of having a knock-on effect in Russia (Pezard & Rhoades, 2020). The Russian political scientist Gleb Olegovich Pavlovsky, after returning to Russia from Ukraine, stated that “the West was not interested in Ukraine per se, but the real goal of the Orange Revolution was to set off revolution in Russia” (qtd in Yablokov, 2018, pp. 30-31). The 2015 Russian National Security Strategy also makes specific reference to the revolutions and how they are a threat to Russia.


6.4 Euromaidan and the Annexation of Crimea

Since the early 1990s, after the collapse of the Soviet Union, Russia has supported breakaway regions in Eurasia. Roslycky (2011) suggests that the Russian government not only supports these regions but, by using military and soft-power operations, has turned them into a geopolitical tool that assists Russia in maintaining influence over these breakaway states. According to Mankoff (2014), Moscow fans ethnic tensions, applying limited force once political uncertainty takes hold. Then, in order to retain some form of control in the region, Moscow supports territorial revisions that back a strategy involving Russia’s assistance. As Mankoff (2014, p. 60) suggests, “Moscow’s meddling has created so-called frozen conflicts in these states, in which the splinter territories remain beyond the control of the central governments and local de facto authorities enjoy Russian protection and influence”.

The Russian compatriot policy may also be seen as a way for the Russian government to maintain a grip over its neighbours. The compatriot policy has offered financial, social and cultural support to ethnic Russians living outside Russia’s borders since the collapse of the Soviet Union (Pezard & Rhoades, 2020). A former President of Russia referred to these countries as the ‘zone of privileged interests’, which includes Russia’s neighbouring countries and former Soviet Union states, whose populations have ethnic ties and varying degrees of loyalty towards Russia. The Russian government has demonstrated an uneasiness with the ‘zone of privileged interests’ adopting a Western orientation since the collapse of the Soviet Union (Pezard & Rhoades, 2020). With regard to Ukraine and Georgia this is particularly discernible; the Euromaidan revolution of 2013-2014 prompted “both Russia’s annexation of Crimea and its initiation of a violent conflict in Eastern Ukraine” (Pezard & Rhoades, 2020, p. 5). Any attempt to pull a country in the ‘zone of privileged interests’ away from Russia’s influence may be seen as stepping over Russia’s line of tolerance (Pezard & Rhoades, 2020).

Ukraine has been the object of some 200 invasions over the last thousand years, which has left Ukrainians sensitive to border conflicts (Magocsi, 2010). On the collapse of the USSR in 1991, President Leonid Kravchuk announced, “Ukraine will defend its integrity, sovereignty in line with the Constitution, by all means available to it” (Marples & Duke, 1995, p. 279). Recognition of Ukraine’s borders was a foreign policy priority, with the Ukrainian government seeking recognition of its borders both in international law and from its neighbours.


In 2013, when Ukrainian President Viktor Yanukovych opted for a Russian loan to assist the Ukrainian economy in place of an Association Agreement with the European Union, the Ukrainian population protested for the government to turn to Europe for assistance rather than ally itself with Russia once more (Diuk, 2014). On the 21st of November 2013, Ukrainian journalist Mustafa Nayyem posted on Facebook: “we are meeting at 22:30 under the Monument of Independence. Dress warm, bring umbrellas, tea, coffee, good mood and friends. Reposts are highly encouraged!” (Bohdanova, 2014). The post, requesting Ukrainians to meet at Independence Square, led to Euromaidan, or ‘European Square’, an uprising by the Ukrainian population against President Yanukovych involving hundreds of thousands of people and lasting three months.

Euromaidan was not the first time the Ukrainian population had protested for independence from Russia. In 1990, Lenin Square, the former name for Independence Square, saw thousands of people come together in support of Ukrainian sovereignty, which resulted in Ukraine’s independence from Russia in a referendum a year later. Then, five months before Euromaidan, a movement inspired by Ukraine’s opposition party saw 20,000 to 30,000 demonstrators protesting against the Ukrainian government in what was dubbed “Rise Up, Ukraine”, but the movement did not gain traction. In comparison, Euromaidan, aided by social media, was able to mobilise 100,000 people within days (Steinzova & Oliynyk, 2018). Although Facebook was only the tenth most popular website in Ukraine in late 2013, it was the leading platform for discussing political views, and it was also the primary source of information for Ukrainian independent online media.

Mustafa Nayyem was a trusted source, a famous journalist whose views, according to Bohdanova (2014), resonated with a large number of Ukrainian SMN users. As a result, Nayyem’s message was shared over and over again. As Bohdanova (2014, p. 135) writes, “when it comes to protests, online social networks mobilise people in the same way that offline social networks do: users are most motivated to join when someone from their circle of friends decides to participate”. As such, Facebook facilitated the real-time mobilisation of the first Euromaidan gathering, with users changing their Facebook status to ‘going to Maidan’, or reporting in real time that they had arrived at the square. Euromaidan resulted in Yanukovych fleeing Ukraine to Russia, with Oleksandr Turchynov taking on the role of acting President of Ukraine on the 23rd of February 2014. Russia refused to recognise Turchynov and Ukraine’s new government,


with Putin stating in an interview on the 4th of March 2014 that Turchynov’s position as President was illegitimate (The Embassy of the Russian Federation, 2014). Twelve days later, on the 16th of March 2014, in a referendum that took place in Crimea, 95.5% of Crimean voters voted to re-join Russia (BBC, 2014).

In a statement made after the annexation of Crimea, Putin accused the West of having a history of interfering in Eastern Europe and the Middle East, citing examples including the colour revolutions and Kosovo. Putin described how the US assisted in the installation of a new President in Kosovo with the support of the UN, and how protestors, struck by poverty and tyranny, were taken advantage of by the West during the colour revolutions: "Instead of democracy and freedom, there was chaos, outbreaks in violence and a series of upheavals. The Arab Spring turned into the Arab Winter" (Putin, 2014). Putin also spoke of Russia's persistent efforts to build trust and cooperation with the US, yet "we saw no reciprocal steps" (Putin, 2014). Using the example of the Coordinating Committee for Multilateral Export Controls list, developed by the West to control the export of technology to the USSR, Putin argued that it still exists today to control Russia and Russia's development.

6.5 Crimea’s Information Operations

Before Crimea, the Russian government had intervened in territories occupied by Russian troops through political and military means. However, it had never deposed a local government, nor had it annexed any of these territories. The annexation of Crimea saw the Russian government depart from these previous tactics, raising the stakes significantly (Mankoff, 2014). During the time of the colour revolutions, Russia reacted by increasing Georgia’s gas prices by nearly 500% in 2005, and in 2006 applied similar measures against Ukraine (Pezard & Rhoades, 2020). Why Moscow changed tactics cannot be known for sure; however, Mankoff (2014) suggests that the annexation of Crimea in 2014 may be perceived as a direct assault against the West. As described in the previous chapter, post-Soviet states such as Georgia and Ukraine expressed an intention to cooperate with Western organisations like NATO, the European Union (EU) and the World Trade Organization (WTO), and such intentions were perceived as a threat to Russia's geopolitical power (Mankoff, 2014; Roslycky, 2011). Crimea, therefore, may be seen as the finale to almost three decades of the Russian government fine-tuning its information warfare techniques.


Russia's campaign in Crimea saw four strategies in play during the conflict: kinetic violence, economic and energy disruption, and information and political influence operations (Jonsson & Seely, 2015). According to Jaitner and Mattsson (2015), it was information that extended over the whole course of events that transpired in Crimea; Russia's end goal was to control the flow of information. Russian information operations extended into both the cyber and non-cyber domains. In 2015, a year after the annexation of Crimea, Chekinov and Bogdanov (2015), two prominent Russian military authors, wrote about the power of information as demonstrated in the destabilisation of Ukraine: primarily, information may be used to delude adversaries, influence public opinion, disorganise governments and organise anti-government protests.

Crimea saw a shift in Russian operations from straightforward destruction through kinetic warfare, as seen in the Chechen wars, to an active campaign of influence (Bērziņš, 2014). Disinformation campaigns were used as a mechanism to distract from military operations and to justify Moscow's actions during the annexation (Darczewska, 2014). For example, Russia's media inaccurately reported on Ukrainian atrocities towards the Russian-speaking population, distracting from Russia’s own atrocities. Reports claimed that the Russian-speaking population was seeking refuge in Russia, with media outlets providing video and photo evidence of crossings en masse over the Ukrainian border (Jaitner & Mattsson, 2015). The photos and video footage were discovered to be of the Ukrainian-Polish border, not the Ukrainian-Russian border as reported (Jaitner & Mattsson, 2015). The Russian government also relentlessly portrayed Ukrainian soldiers as criminals, murderers and Nazi perpetrators across various media channels. The Russian military strategy was to destroy the morale of Ukrainian soldiers while simultaneously causing division among the Ukrainian population along regional, religious, political and ethnic lines. As Veebel (2015, p. 2) wrote, "false stories of crucified children and raped women were created and replicated in order to discredit the Ukrainian army", the mass consciousness being a long-standing target of Russian disinformation campaigns (Giles, 2019). The Russian government also targeted Ukrainian soldiers directly, sending geographically targeted messages to their smartphones as the soldiers arrived on the front line and before Russian artillery began firing, with messages such as “They’ll find your bodies when the snow melts” (Singer & Brooking, 2018, p. 59).


It was not just Ukrainian soldiers who were targeted using technology; Russian citizens were also targeted. For example, a Russian woman was sentenced to 320 hours of hard labour “for discrediting the political disorder” (Singer & Brooking, 2018, p. 94) after posting negative stories regarding the invasion of Ukraine. Singer and Brooking (2018) suggest that acts like this punishment in turn lead to a ‘spiral of silence’, a theory formed by the German political scientist Noelle-Neumann, which suggests that people fear isolation and therefore prefer to keep their opinions to themselves rather than voice them and risk being rejected and isolated from the society they are part of. In this scenario, according to Singer and Brooking (2018), the spiral of silence led people who held similar or like-minded views to the woman punished not to speak up, for fear of retribution by the Russian authorities.

The Russian government’s campaign in Crimea also utilised online communities to conduct information operations. For example, in what appears to have been a coordinated response, pro-Kremlin groups began to engage in online political debates across various SMNs, discrediting the Ukrainian opposition leader and disrupting the organisation of protests aimed against the Russian government. The name given to these pro-Kremlin online groups was "Kremlin trolls" (Jaitner & Geers, 2015). However, the Kremlin trolls did not replace the more traditional channels of propaganda and disinformation. Instead, trolls were used to complement the Kremlin's more traditional means of disinformation. For example, TV channels in Russia and Ukraine controlled by the Russian government and pro-Russian oligarchs, such as LifeNews and Ukraina 24, were used to spread content favourable to the Russian government. Russian and Ukrainian newspapers such as Komsomolskaya Pravda v Ukraine, web pages such as LiveJournal, and Russian radio stations such as Radio Majak were also part of the Russian government's toolkit (Veebel, 2015). As Nissen (2015, p. 56) explains, “at an operational or conceptual level the weaponization of social media also entails adopting a cross-media communication approach to planning how the target audience should experience the multimedia content and be encouraged to participate in the conversations through exposure to several interlinked media and platforms”.

To be successful, the Russian government not only had to flood the media with its intended narrative but also customise the narrative to fit the audience of the selected media, a strategy seen throughout the Crimea campaign (Nissen, 2015). As Darczewska


(2014, p. 5) wrote, Crimea "served as an occasion for Russia to demonstrate to the entire world the capabilities and the potential of information warfare". For example, in a coordinated move, Russian malware designed to generate revenue via clickbait33, along with Russian-owned Finnish-language Twitter accounts with official-sounding Finnish names34 that had likewise been designed to generate income through clickbait, were repurposed to spread disinformation and links to RT (Bērziņš, 2014). As stated, there is no direct evidence to support the notion that Russia had invested in information troops, yet Russia displayed improved information operations during the Crimea campaign; this included improved engagement with local and regional audiences and the use of cyberattacks leading up to and during the conflict. Russia also engaged in cyber espionage before, throughout and after Crimea's annexation (Iasiello, 2017). Such activity would have provided Russia with a strategic advantage in military tactics, as well as insight into how the Ukrainian media would report the campaign. Information operations were also carried out by Russia against the US, including Russia’s denial of the presence of Russian military in Ukraine and blaming the West for undertaking information warfare against Russia (Iasiello, 2017).

An example of Russia’s use of multiple channels may be seen in what would come to be known as the “Fuck the EU” quote. In 2014, a conversation between Victoria Nuland, US Assistant Secretary of State, and the US Ambassador to Ukraine, Geoffrey Pyatt, was intercepted, recorded and then dispersed via SMNs. While there was no direct proof, Russia was seen as the most likely culprit (BBC News, 2014). In the conversation, Nuland and Pyatt indicate that, contrary to the USA’s advertised position at the time that it was up to the Ukrainian people to decide their own fate in the upcoming election, the US had decided what the outcome of the election should be and was clearly working towards that goal. In reference to the EU’s participation in the conflict, Nuland states, “Fuck the EU”. The incident has been described as a damaging episode and an embarrassment to the US given the ease of the hack (BBC News, 2014). A review of Twitter in May 2020 revealed a mere 28 instances in 2014 where the hashtag #FucktheEu had been used, suggesting that, in the Twitterverse at least, the attempt to discredit the US and Nuland did not appear to work.

33 Clickbait is content that intentionally misleads or over-promises in order to get users to click on a link that leads them to another website (Riggins, 2017).
34 For example, @Vaalit, @Eurovaalit and @Eduskuntavaalit.

6.6 Crimea’s Russian Heritage

Part of Russia's successful information campaign in Crimea may be attributed to the connection between Russia and Ukraine. Ukraine, according to Kuzio (2014), remained a de facto part of Russia in the minds of many Russians after the fall of the Soviet Union. Ukraine remains home to a large number of ethnic Russians, and in 2014 Russian was the primary language used by a significant proportion of Ukrainian citizens. Crimea in particular has an ethnic Russian majority: in 2014, the BBC reported that 58% of Crimea’s citizens were ethnic Russians at the time of the annexation. Russia's involvement with Crimea and the Black Sea region dates back to at least the 1774 Treaty of Kaynarca, which placed the Crimean Khanate under Russia's influence (Toucas, 2017). Almost 100 years later, in 1853, Russia fought the Crimean War against the Ottoman Empire, leaving hundreds of thousands of people dead. With the collapse of the Soviet Union, Russia retained military infrastructure on the Crimean peninsula in order to maintain the operation of its Black Sea Fleet. In 1997, the Ukraine-Russia Friendship Treaty saw the Black Sea Fleet split 81 per cent to Russia and 19 per cent to Ukraine, and Sevastopol leased to Russia for 20 years in exchange for concessionary energy prices and the cancellation of Ukraine's debt to Russia. In 2010 the lease was extended to 2042 (Toucas, 2017).

In their study on salient identities, Metzger, Bonneau, Nagler and Tucker (2016) state that the heightened political tensions that existed in Ukraine from the start of Euromaidan to the annexation of Crimea in 2014 would lead to increased identification by Ukrainian citizens as either Ukrainian or Russian, or as Ukrainian speakers or Russian speakers. As Metzger et al. (2016, p. 35) explain, "at a fine-grained level the relationship between certain types of shocks and the activation of ethnic identity" would present itself. Metzger et al. (2016) collected real-time geolocated tweets sent within Ukraine and Crimea from Euromaidan until the annexation of Crimea, narrowing the search to specific tweets using keywords, hashtags and the users' nominated locations. The dataset demonstrated that 50.8 per cent of tweets were in Russian, followed by 20.7 per cent in English and 19.9 per cent in Ukrainian. According to Metzger et al. (2016, p. 36), there is a "relationship between dominant language and ethnonational identity", which would suggest, on the surface, that language serves as a proxy for these ethnonational identities. The authors set out to demonstrate that the Ukrainian language would become more important to the Ukrainian people as the Euromaidan protests and the subsequent annexation of Crimea progressed.
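The language breakdown reported by Metzger et al. (50.8 per cent Russian, 20.7 per cent English, 19.9 per cent Ukrainian) amounts to a proportional tally over the detected-language tag of each geolocated tweet. The sketch below is a minimal illustration of such a tally only, not a reproduction of the authors' actual code; the language codes and toy corpus are hypothetical.

```python
from collections import Counter

def language_shares(lang_codes):
    """Return each language's share of the corpus as a percentage,
    rounded to one decimal place."""
    counts = Counter(lang_codes)
    total = sum(counts.values())
    return {lang: round(100 * n / total, 1) for lang, n in counts.items()}

# Toy corpus of detected-language codes for geolocated tweets (illustrative).
codes = ["ru", "ru", "ru", "ru", "ru", "en", "en", "uk", "uk", "de"]
print(language_shares(codes))
# {'ru': 50.0, 'en': 20.0, 'uk': 20.0, 'de': 10.0}
```

Applied to the full geolocated corpus, the same tally yields the study's 50.8 / 20.7 / 19.9 per cent split.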


However, the study results demonstrated that rather than an increase in the use of the Ukrainian language, there was in fact an increase in the use of the Russian language. The authors regarded these results as a failure to confirm their hypothesis that key political events would lead to a reversion to what they perceived to be the language of choice, Ukrainian. Instead, the results demonstrated that both Ukrainian- and Russian-language accounts preferred Russian as their primary language during the annexation of Crimea, when there was heightened social media support for the annexation. In comparison, during Euromaidan, when there was heightened social support for Ukrainian independence from Russia, the preference was for the Ukrainian language. As such, one of the reasons the Russian government was so successful in its information campaign during the annexation of Crimea may have been a high number of Ukrainian citizens identifying as Russian and not Ukrainian, as indicated by the Metzger et al. (2016) study. It is unclear whether the patriotism towards Russia was a result of Russia’s information campaign during Crimea, or whether a high percentage of Ukrainian citizens still identified as Russian; further study is needed.

The last section of this chapter examines the IRA’s Twitter campaign during 2014, focusing on tweets specific to the Russia/Ukraine relationship. As will be demonstrated, the IRA saturated Twitter with messages framing reports that Russia was attacking Ukraine as propaganda and conspiracy theory. Further, the chapter details how the IRA, in refining its information operations, used accounts with large followings, which would have contributed to the validation of the tweets, as outlined in Chapter Five. The high number of followers would also have acted as a trusted network for users. Lastly, the analysis demonstrates that the IRA used repetitive narratives across the tweets.

6.7 The Internet Research Agency in 2014

SMNs have increasingly been used strategically by a multitude of actors, with effects on international power relations. As Nissen (2015, p. 9) writes, “the increasing strategic uses of social network media, and the effects achievable in and through the use of them, empower a multitude of actors and have a re-distributive effect on international power relations”. As Chapter Four demonstrated, Russia’s weaponization of SMNs can be dated back to 2009. By 2014 and the annexation of Crimea, Russia had laid strong foundations and established online networks and communities. In 2014, the Internet Research Agency (IRA)


alone produced 2,329,674 individual tweets via 2,100 distinct accounts in 47 different languages. An analysis of the IRA’s use of Twitter at the time demonstrates exponential growth, with tweets heavily laden with information on Ukraine and Crimea, as per Table 6.1 below.

Table 6.1—2014 Ukraine and Crimea Key Word Searches

Keyword search                                         Number of Tweets
Крым/Крымский (Crimea/n)                                         23,955
Украин/Украина/Украинцы (Ukraine/Ukrainian/s)                   257,361
Ukrain/Ukraine/Ukrainia (Ukrainian/s)                             6,820
Crimea/n                                                            161

N.B. The above count does not consider tweets which used multiple keywords, such as #Crimea #Ukraine #Украин.

The search demonstrated a significant increase from 2013, which revealed a combined total of only 350 tweets, compared with the 288,297 tweets shown in Table 6.1.
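The counts in Table 6.1 rest on a stem-based keyword search: a tweet is counted for a row if its text contains any of that row's stems, so "Крым" also catches "Крымский". The sketch below is a minimal, hypothetical illustration of that matching logic, assuming the IRA tweet texts have been loaded into a Python list; it is not the dataset's actual schema or the thesis's exact code.

```python
# Minimal sketch of the keyword search behind Table 6.1.
# `tweets` stands in for the 2014 IRA tweet texts released by Twitter.

def count_keyword_hits(tweets, stems):
    """Count tweets containing at least one of the given stems
    (case-insensitive substring match, so 'Крым' also matches 'Крымский')."""
    stems = [s.lower() for s in stems]
    return sum(
        1 for text in tweets
        if any(stem in text.lower() for stem in stems)
    )

tweets = [
    "Крымский референдум состоялся",      # matches the Крым stem
    "News about Ukraine and the crisis",  # matches the Ukrain stem
    "Погода в Москве сегодня",            # matches neither
]

print(count_keyword_hits(tweets, ["Крым", "Крымский"]))              # 1
print(count_keyword_hits(tweets, ["Украин", "Ukrain", "Crimea"]))    # 1
```

Because each row's stems are counted independently, a tweet using both #Crimea and #Украин is counted in both rows, which is why the N.B. under Table 6.1 flags multi-keyword tweets.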

To demonstrate whether or not the IRA was using the tactics described in Chapter Five, analysis was undertaken on the 2014 Twitter data in three stages. The first stage looked at the top ten (10) Twitter accounts with the highest number of followers across the combined keyword searches shown in Table 6.1. As discussed in Chapter Five, successful propaganda campaigns rest on a repetitive narrative that is supported by experts and propagated by trusted sources. On SMNs, expertise is often inferred from follower counts: people with a high number of followers are seen as experts, and users will turn to these accounts when they are unsure of an answer. Finally, users are more likely to trust the people they follow than individuals they do not follow. The second stage of the analysis examined each of the keyword searches shown in Table 6.1, looking at the type of engagement the tweets received via retweets, likes and replies. The third stage examined general observations, such as the use of hashtags and account creation dates.


6.7.1 Top Ten (10) Twitter Accounts with the Highest Followers

A search was conducted on the combined tweets to identify the Twitter accounts with the most followers, as demonstrated in Table 6.2 below.

Table 6.2 – Top Ten Twitter Accounts with the Highest Number of Followers

Twitter Account    Account Creation Date    Number of Followers    Following    Number of Tweets
ruopentwit         2014                     35,736                 409          1,275
MaryMozhaiskaya    2014                     37,362                 2,617        3
byDrBre            2014                     43,504                 88           77
ComradZampolit     2013                     48,917                 1,797        52
coldwar20_ru       2014                     53,175                 5,434        2,474
Jenn_Abrams        2014                     79,152                 22,607       1
LavrovMuesli       2014                     84,642                 2,575        298
KadirovRussia      2011                     123,989                10           798
MaxDementiev       2014                     134,805                2,796        10
NovostiSPb         2012                     149,672                1,024        338
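The first stage of the analysis amounts to ranking the distinct accounts in the combined keyword results by follower count and keeping the top ten. A hedged sketch of that ranking follows, assuming per-tweet records pairing an account name with its reported follower count; the record layout is illustrative, not the released dataset's actual schema, though the example counts are taken from Table 6.2.

```python
# Sketch of stage one: top accounts by follower count.
# In the real data an account appears once per tweet, so we de-duplicate
# by keeping the highest follower count observed per account.

def top_accounts(records, n=3):
    """Rank distinct accounts by their highest observed follower count."""
    followers = {}
    for account, count in records:
        followers[account] = max(count, followers.get(account, 0))
    return sorted(followers.items(), key=lambda kv: kv[1], reverse=True)[:n]

records = [
    ("NovostiSPb", 149_672),
    ("MaxDementiev", 134_805),
    ("KadirovRussia", 123_989),
    ("coldwar20_ru", 53_175),
    ("NovostiSPb", 149_000),  # same account, seen on an earlier tweet
]

print(top_accounts(records))
# [('NovostiSPb', 149672), ('MaxDementiev', 134805), ('KadirovRussia', 123989)]
```

With `n=10` over the combined keyword results, this selection reproduces the ten accounts in Table 6.2.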

A total of 5,325 tweets were identified from these top ten accounts. The tweets were then reviewed and broken into two groups: the first consists of tweets with a direct link to the conflict in Ukraine, such as military fighting, the Crimean referendum, and perceived Western influence; the second consists of tweets which could not clearly be identified as referring to the conflict in Ukraine, or which had no correlation to it. The second group contained a high number of replies to other tweets, which may explain the lack of context. For example, “@Dbnmjr Вы точно уверены, что тематика данного твита соответствует названию блога? При чём тут новости Украины то вообще?” [“@Dbnmjr Are you sure that the theme of this tweet matches the name of the blog? What does Ukraine’s news have to do with it?”]. The analysis demonstrated that 4,054 (76.1%) of the 5,325 tweets were clearly linked to the conflict in Ukraine. Forty-two of the tweets referred to the shooting down of the Malaysian airliner MH17 over Ukrainian airspace, with one tweet proclaiming “BUSTED !!! Ukranian Airforce shot down MH17 !!! ФОТО: Украинский Су-27 сбивает БОИНГ 777 !!! : http://t.co/6zDU48t0d5” [“BUSTED !!! Ukranian Airforce shot down MH17 !!! PHOTO: Ukrainian Su-27 knocks down Boeing 777


!!! : http://t.co/6zDU48t0d5”]. This tweet was part of a disinformation campaign to discredit Ukraine and shift the blame for the attack away from Russia.35

The Twitter data was also reviewed to identify any repetitive narratives, and two were found. The first was the claim that the Ukrainian government was responsible for the armed response in Ukraine and that Ukrainian soldiers were fighting Ukrainian rebels and citizens. The second was that the US and the West were conspiring with Ukraine against Russia. Examples include KadirovRussia’s tweet of the 14th of April 2014, “Западные и Украинские СМИ заявляют о причастности России к тому,что они называют сепаратизм.Окей.Предъявите доказательства,документальные.” [“Western and Ukrainian media claim Russia’s involvement in what they call separatism. Okay. Present documentary evidence.”]; ruopentwit’s tweet of the 29th of October 2014, “@SofiaFond забавно,что за несколько месяцев полномасштабной военной операции армии РФ у Украины так и не появилось ни одной фотографии” [“@SofiaFond it's funny that for several months of a full-scale military operation of the Russian army, Ukraine did not have a single photo”]; and LavrovMuesli’s tweet of the 3rd of November 2014, “США потребовал от РФ не вводить войску на Украину. Но ведь они полгода уже твердят, что там наши войска! Как так то?! http://t.co/MxMILZFzka” [“The US demanded that the Russian Federation not send troops to Ukraine. But they have been saying for six months that our troops are there! How is that?! http://t.co/MxMILZFzka”].

A number of tweets also blamed the US and the West for the events in Ukraine and accused the US of attempting to stifle Russia’s greatness. For example, coldwar20_ru’s tweet of the 15th of March 2014, “Эксперт: Ситуация на Украине — это попытка США помешать России стать великой державой / / Международные эксперты.. http://t.co/zmfpLjFsAV” [“Expert: The situation in Ukraine is the US attempt to prevent Russia from becoming a great power // International experts .. http://t.co/zmfpLjFsAV”]; coldwar20_ru’s tweet of the 28th of June 2014, “Лавров: Без вмешательства США мир на Украине наступил бы быстрее— http://t.co/npuIah6kzv http://t.co/3PLQm9H3ub” [“Lavrov: Without US interference, peace in Ukraine would come faster http://t.co/npuIah6kzv http://t.co/3PLQm9H3ub”]; and coldwar20_ru’s tweet of the 16th of May 2014, “Плоды евромайдана: политическая верхушка #США прибирает к рукам нефтегазовую отрасль Украины. http://t.co/HudXSxPEAf http://t.co/EOYHBWkZKb” [“The fruits of Euromaidan: the #US political elite takes over the oil and gas industry of Ukraine. http://t.co/HudXSxPEAf http://t.co/EOYHBWkZKb”]. A review of the ten most retweeted tweets from the Ukraine conflict data set demonstrates Russia’s reliance on conspiracy theory, as seen in Table 6.3 below.

35 For more information on the Russian-sponsored disinformation campaign concerning Malaysia Airlines flight MH-17 see (Coynash, 2019).

Further analysis was undertaken by breaking the conflict-linked tweets into 12 individual topics, as demonstrated in Graph 6.1. Examples of each of the categories may be seen in Table 6.4.

Graph 6.1 Categorisation of tweets with a direct link to the Ukrainian conflict. [Bar chart; the twelve categories are NATO, Maidan, Western, Elections, Hashtags, Sanctions, Annexation, Military/War, Nazi/Fascism, Referendum, Information War/Propaganda and General Political Commentary; category counts range from 2,101 tweets in the largest category down to 26 in the smallest.]

It was hoped that the data could be sorted via hashtags; however, 2,296 of the 5,325 tweets did not use the hashtag functionality, so a manual sort had to be conducted.
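The step that forced the manual sort, separating hashtagged from non-hashtagged tweets, can be sketched as follows; the sample tweets are hypothetical stand-ins for the data set.

```python
import re

# \w is Unicode-aware in Python 3, so Cyrillic hashtags such as #Крым match too.
HASHTAG_RE = re.compile(r"#\w+")

def extract_hashtags(text: str) -> list:
    """Return every hashtag appearing in a tweet."""
    return HASHTAG_RE.findall(text)

tweets = [
    "Я хочу в Крым #ColumbianChemicals",
    "#Крым #Россия навсегда",
    "Обычный твит без тегов",
]
untagged = [t for t in tweets if not extract_hashtags(t)]
print(f"{len(untagged)} of {len(tweets)} tweets require manual categorisation")
```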

6.7.2 Key Word Search Analysis As discussed, the second stage of the analysis involved an examination of each of the key word searches set out in Table 6.2 above. The results are as follows:


6.7.2.1 Крым/Крымский Крым was used to search the 2014 database to detect any tweet containing Крым or Крымский [Crimea or Crimean]. The search also captured data outside the intended parameters: 554 tweets containing #ВладимирКрымский [#VladimirKrymsky], a hashtag used in reference both to the Russian footballer Vladimir Krymsky and to Russian President Vladimir Putin. The tweets referring to the footballer were removed, reducing the set by four (4) to 23,951 tweets. From the 23,951 tweets, the following activity was identified:
• 3,642 quotes;
• 5,462 replies;
• 26,001 retweets; and
• 8,578 tweets used hashtags.
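A stem search of this kind, where Крым also matches Крымский and #ВладимирКрымский, can be sketched as below; the tweets and the footballer filter are invented illustrations of the false-positive removal described above.

```python
def stem_search(tweets, stem: str):
    """Case-insensitive substring search, so the stem 'Крым'
    also matches 'Крымский' and '#ВладимирКрымский'."""
    s = stem.lower()
    return [t for t in tweets if s in t.lower()]

tweets = [
    "Крым вернулся домой",
    "Крымский мост открыт",
    "#ВладимирКрымский забил гол",   # football false positive
    "Новости дня",
]
hits = stem_search(tweets, "Крым")
# Manual step from the text: drop tweets about the footballer.
hits = [t for t in hits if "забил гол" not in t]
print(len(hits))
```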

The top ten most retweeted tweets (eleven tweets in total) were from two (2) users:
• KadirovRussia (123,989 Followers and Following 10); and
• coldwar20_ru (53,175 Followers and Following 5,434).

All of the tweets were in reference to the Ukraine/Russia conflict, as demonstrated in Table 6.5 below. Tweets which were also in the top ten (10) for ‘Like’ and ‘Reply’ are highlighted in blue.

6.7.2.2 Украин/Украина/Украинцы Украин was used to search the 2014 database to detect any tweet containing Украин, Украина or Украинцы [the stem Ukrain-, Ukraine and Ukrainians]. The search revealed 257,358 tweets, demonstrating the following activity:
• 18,203 quotes;
• 23,946 replies;
• 160,662 retweets; and
• 26,984 tweets used hashtags.

The 10 most retweeted tweets were from five (5) users:
• byDrBre (43,504 Followers and Following 88);
• KadirovRussia (123,989 Followers and Following 10);
• LavrovMuesli (84,642 Followers and Following 2,575);
• PoliteRussia (23,398 Followers and Following 6,022); and
• PolitVlast (18,045 Followers and Following 1,369).

Seven of the tweets were in reference to the Ukraine/Russia conflict, while KadirovRussia’s tweet of the 15th of August 2014 referred to the Boeing shot down over Ukraine in 2014. Additionally, on the 17th of July 2014 and the 10th of September 2014, KadirovRussia and byDrBre respectively made vague tweets whose topic was unclear. Table 6.6 below captures the top 10 retweets for the Украин search. Tweets which were also in the top ten (10) for ‘Like’ and ‘Reply’ are highlighted in blue.

6.7.2.3 Ukrain/Ukraine/Ukrainia/Ukrainian(s) Ukrain was used to search the 2014 database to detect any tweet containing Ukraine, Ukrainia, Ukranian or Ukrainians, revealing 6,820 tweets and demonstrating the following activity:
• 320 quotes;
• 518 replies;
• 5,091 retweets; and
• 6,901 tweets used hashtags.

The 10 most retweeted tweets were from four (4) users:
• coldwar20_ru (53,175 Followers and Following 5,434);
• Caruolan (2,937 Followers and Following 763);
• ruopentwit (35,736 Followers and Following 409); and
• Batman_Bane (1,676 Followers and Following 285).

All of the tweets were in reference to the Ukraine/Russia conflict, as demonstrated in Table 6.7 below. Tweets which were also in the top ten (10) for ‘Like’ and ‘Reply’ are highlighted in blue.

6.7.2.4 Crimea(n) Crimea was used to search the 2014 database to detect any tweet containing Crimea or Crimean. The data was then checked to ensure the search had not captured any unrelated tweets. From the 161 tweets, the following activity was identified:


• 24 quotes;
• 15 replies;
• 45 retweets; and
• 95 tweets used hashtags.

Eleven (11) of the tweets were retweeted one or more times, with the most retweeted tweet, by coldwar20_ru, receiving only ten (10) retweets. All of the top ten (10) retweeted tweets were in reference to the Ukraine/Russia conflict, bar one, which was in reference to Natalia Poklonskaya, a Russian politician. All of the top ten tweets had multiple hashtags, including #Ukraine and #Крым, which meant that they were also included in the other searches referenced above.

All of the tweets were in reference to the Ukraine/Russia conflict, as demonstrated in Table 6.7 below. Tweets which were also in the top ten (10) for ‘Like’ and ‘Reply’ are highlighted in blue.

6.7.3 General Observations The account creation year was reviewed to identify whether the creation date of a Twitter account influenced the number of followers and responses. The data has been captured in Table 6.8 below.

Table 6.8 – IRA Account Creation Year

Year    Followers    Accounts
2009    3,633        19
2010    14           1
2011    1,285        2
2012    21,536       55
2013    127,505      393
2014    133,560      693
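The aggregation behind Table 6.8 is a simple group-by over account creation year; the records below are invented examples, not the real IRA figures.

```python
from collections import defaultdict

# Hypothetical (creation_year, follower_count) pairs.
accounts = [(2013, 1200), (2014, 300), (2014, 150), (2012, 900), (2013, 50)]

followers_by_year = defaultdict(int)
accounts_by_year = defaultdict(int)
for year, followers in accounts:
    followers_by_year[year] += followers
    accounts_by_year[year] += 1

for year in sorted(accounts_by_year):
    print(year, followers_by_year[year], accounts_by_year[year])
```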

Interestingly, the majority of tweets came from accounts created in 2014, a year in which the number of accounts created almost doubled; this surge in account creation indicates that the attack was coordinated. A review of the creation dates of the top ten most followed accounts demonstrates that the majority of these accounts were also created in 2014, as demonstrated in Table 6.2. The results suggest that there is no correlation between trusted accounts and account creation dates.

A review of the use of hashtags was also undertaken across the whole data set: 11,740 tweets in total had utilised the tool, often using multiple hashtags at once. In total there were 861 unique hashtags and hashtag combinations used. The most used hashtags are captured in Table 6.9 below.

Table 6.9 – Top Ten Hashtags Used Throughout the Crimea Campaign

Times used   Hashtag                                  Translation
2,689        #Крым #Россия                            #Crimea #Russia
1,171        #Россия #Путин #Крым                     #Russia #Putin #Crimea
873          #UkrainianLie                            #UkrainianLie
672          #ВладимирКрымский                        #VladimirKrymsky
565          #ОтКрымаДоКамчатки                       #FromCrimeaToKamchatka
418          #ТвиттерскиеОтдыхаютвКрыму               #TweetersRestInCrimea
385          #ТвиттерскиеОтдыхаютВКрыму               #TweetersRestInCrimea
290          #Ukrainianlie                            #Ukrainianlie
281          #RussiainvadedUkraine #УкраинскаяЛожь    #RussiainvadedUkraine #UkrainianLies
181          #КиевСбилБоинг                           #KievShotDownTheBoeing

All but two of the top hashtags referred to the Ukraine conflict, with three of the entries using the hashtag #UkrainianLies/#UkrainianLie, which was used to dismiss as lies the claims that Russia had invaded Ukraine and was the cause of the conflict.
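Counting hashtags and hashtag combinations, as in Table 6.9, amounts to tallying each tweet's full hashtag string; a minimal sketch, with invented sample tweets:

```python
import re
from collections import Counter

HASHTAG_RE = re.compile(r"#\w+")

def hashtag_combo(text: str) -> str:
    """All hashtags in one tweet, joined as a single combination key."""
    return " ".join(HASHTAG_RE.findall(text))

tweets = [
    "За мир! #Крым #Россия",
    "Навсегда #Крым #Россия",
    "Ложь #UkrainianLie",
    "Без тегов",
]
combos = Counter(hashtag_combo(t) for t in tweets if hashtag_combo(t))
for combo, n in combos.most_common():
    print(n, combo)
```

Note that this treats an ordered set of hashtags in one tweet as a single unit, which is one way to arrive at "hashtag and hashtag combination" counts like those in the table.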

Also of interest was the use of the hashtag #ColumbianChemicals in the Крым data set on the 11th of September 2014, in reference to a Russian information campaign being carried out in the US that same year, discussed in Chapter Seven. The tweet itself read “Я хочу в Крым #ColumbianChemicals” [“I want to go to Crimea #ColumbianChemicals”], which appears to indicate some confusion by the operative between the two campaigns.


6.8 Conclusion Conspiracies, propaganda and disinformation have long been weaponised in Russia and, as demonstrated in Ukraine in 2014, deployed by the Russian government as an information warfare tool. An example of this is Russia’s framing of the West as a dangerous other determined to undermine Russian values and the Russian way of life. This framing is prevalent in recent Russian information campaigns around the colour revolutions, which suggest that the revolutions were part of a Western plot to destabilise Russia. Russian authorities stated outright that the colour revolutions were sponsored by the West in the hope that they would spur a revolution in Russia. When Ukrainians protested the Ukrainian government’s decision to abandon closer ties with the European Union and seek financial assistance from Russia instead, ultimately forcing Ukrainian President Viktor Yanukovych to flee to Russia in early 2014, Putin’s response was to deny recognition of Oleksandr Turchynov as the acting president and to blame the West for interfering. A few days later, Russia annexed Crimea, a peninsula of Ukraine that hosts a key Russian naval base.

Russia’s campaign to annex Crimea involved four strategic moves, the most substantial of which was the information warfare campaign. Russia’s information campaign extended into both the cyber and non-cyber domains to influence public opinion, disorganise governments, confuse adversaries and organise protests. Combined with traditional forms of media such as television and radio, Russian information operations were a formidable force in 2014. The operations relied on repetitive messages from trusted sources, as evidenced in the IRA Twitter data, which exhibited an intensive information campaign centred on the conflict in Ukraine and the annexation of Crimea. The IRA used a deluge of accounts to disperse a repetitive narrative: that Russia was not involved in the conflict in Ukraine and that claims of Russian involvement were a construction of Ukraine and the West intended to discredit Russia; that Russia was a saviour to Ukrainian people who identified as Russian; and that the sanctions against Russia were an attempt by the West to control Russia.

The following chapter will examine the Russian information campaign that was carried out against the US in the second half of 2014.


Table 6.3 – The Ten Most Retweeted Tweets from the Most Followed Accounts – Crimea Campaign (figures are Retweets / Likes / Replies)

27/05/2014 – KadirovRussia (525/94/65): “Сколько гордости у тех, кто борется со своим безоружным народом и группой ополченцев при помощи авиации, но побоявшихся сунуться в Крым.” [“How much pride is there for those who fight their unarmed people and a group of militias with the help of aviation, but who are afraid to venture into Crimea.”]

06/06/2014 – KadirovRussia (470/135/65): “Не удивлюсь, если в новой бондиане Агент 007 будет бороться с ополченцами на Юго-Востоке Украины, а девушкой Бонда будет, например, Ляшко.” [“I will not be surprised if in a new Bond film Agent 007 fights the militias in the South-East of Ukraine, and the Bond girl is, for example, Lyashko.”]

13/06/2014 – coldwar20_ru (549/150/28): “Исчерпывающе о том, чей Крым, и кто там были истинные оккупанты. http://t.co/xtmU1r6nqw” [“Exhaustively on whose Crimea is, and who the true occupiers there were.”]

27/02/2014 – KadirovRussia (462/93/57): “Я правильно понимаю, что захватывать правительственные здания в Украине имеет право только "правый сектор", а в Крыму — это "теракт"?” [“Do I understand correctly that only the "right sector" has the right to seize government buildings in Ukraine, while in Crimea it is a "terrorist attack"?”]

16/03/2014 – KadirovRussia (892/210/87): “Я так понимаю, что Украинцы ещё не поняли, что Запад их слил и оставил наедине с шайкой аферистов и бандитов захвативших власть.” [“I understand that the Ukrainians still have not realised that the West has leaked them and left them alone with a gang of swindlers and bandits who seized power.”]

12/06/2014 – KadirovRussia (498/132/121): “А теперь давайте представим,какая бы паника была у укропатриотов,если бы на Украину реально въехали реальные военные на реальных танках из РФ” [“Now let's imagine what kind of panic the 'dill patriots' would have had if real soldiers had actually entered Ukraine on real tanks from Russia”]

21/08/2014 – LavrovMuesli (493/11/4): “Путин увидел свой рейтинг и тихо прошептал: "Слава Украине!"” [“Putin saw his rating and whispered softly: "Glory to Ukraine!"”]

06/06/2014 – KadirovRussia (610/175/52): “Укропатриоты уже столько чеченцев насчитали и убили на Юго-Востоке Украины, что непонятно, остались ли вообще чеченцы в самой Чечне?” [“The Ukropatriots have already counted and killed so many Chechens in the South-East of Ukraine that it is unclear whether any Chechens remain in Chechnya itself?”]

13/05/2014 – coldwar20_ru (448/47/111): “Эти люди организовали "марш за мир", когда в Крыму ДАЖЕ НЕ СТРЕЛЯЛИ!!! Сейчас на Ю-В жгут и убивают, а их не слышно http://t.co/URa8CMbWnh” [“These people organized a "peace march" when in Crimea THERE WAS NOT EVEN ANY SHOOTING!!! Now they are burning and killing in the SE, but they are not heard”]

03/05/2014 – KadirovRussia (443/84/26): “Сжечь людей своей же страны под крики "слава героям!", это так в духе новой Украины, возродившей старые традиции. http://t.co/WThfEOp46R” [“Burning people of your own country to the cries of "Glory to the heroes!" is so in the spirit of a new Ukraine, which has revived old traditions.”]

Table 6.4 – The Categorisation of Tweets from the Top 10 Accounts with the Most Followers – Crimea Campaign

Annexation – 18/03/2014 – coldwar20_ru: “Через несколько минут будет подписан договор о присоединении Крыма к России. Просьба оставаться на своих местах http://t.co/UJS01xKYtV” [“In a few minutes, an agreement will be signed on the annexation of Crimea to Russia. Please remain in your seats http://t.co/UJS01xKYtV”]

Elections – 29/03/2014 – coldwar20_ru: “Москва не считает легитимными майские выборы на Украине. #Крым #Украина #Антимайдан” [“Moscow does not consider the May elections in Ukraine legitimate. #Crimea #Ukraine #Antimaidan”]

Hashtags – 25/03/2014 – coldwar20_ru: “#демократия в каждый дом #беркут #obama #обама #crimea #Крым #аксенов #украина #россия… http://t.co/VRIOGvlcgq” [“#democracy in every home #berkut #obama #obama #crimea #Crimea #aksenov #ukraine #russia… http://t.co/VRIOGvlcgq”]

Information War/Propaganda – 29/04/2014 – KadirovRussia: “Убирая из сетки вещания российские тв.каналы Украина показывает то, что информационную войну они тоже проигрывают.” [“By removing Russian TV channels from the broadcasting network, Ukraine shows that it is losing the information war too.”]

Maidan – 25/04/2014 – KadirovRussia: “По версии Запада и США в Украинском кризисе виновата Россия а не ими поддержанный вооруженный переворот устроенный майданом. Высшая наглость” [“According to the West and the USA, Russia is to blame for the Ukrainian crisis, and not the armed coup they supported, organised by the Maidan. Supreme arrogance”]

Military/War – 04/07/2014 – coldwar20_ru: “Украинские артиллеристы под Славянском.Солдаты убивают женщин и детей, армия Украины... http://t.co/Bkj9EU7i02” [“Ukrainian gunners near Slavyansk. Soldiers kill women and children, the army of Ukraine... http://t.co/Bkj9EU7i02”]

NATO – 28/08/2014 – NovostiSPb: “НАТО и США заявили о вторжении России на Украину http://t.co/POOxbXa9Rk” [“NATO and the United States announced the Russian invasion of Ukraine http://t.co/POOxbXa9Rk”]

Nazi/Fascism – 15/04/2014 – KadirovRussia: “Сегодня фашисты напали на Украинские города Славянск и Краматорск.” [“Today, the fascists attacked the Ukrainian cities of Slavyansk and Kramatorsk.”]

General Political Commentary – 18/03/2014 – coldwar20_ru: “«Мы не могли оставить жителей Крыма в беде. Иначе это было бы просто предательство», сказал Путин” [““We could not leave the inhabitants of Crimea in trouble. Otherwise, it would be just a betrayal,” Putin said”]

Referendum – 16/03/2014 – coldwar20_ru: “В Крыму начался референдум о статусе автономии / / Решение будет приниматься простым большинством голосов / .. http://t.co/JJ1Ks0AvX6” [“In Crimea, a referendum on the status of autonomy has begun // The decision will be made by a simple majority of votes // .. http://t.co/JJ1Ks0AvX6”]

Sanctions – 25/06/2014 – ruopentwit: “Американский бизнес не поддерживает выдвинутую Обамой угрозу санкций проти... http://t.co/kbhJxMG2wb #Россия #Путин #Крым” [“American business does not support Obama's threatened sanctions again... http://t.co/kbhJxMG2wb #Russia #Putin #Crimea”]

Western – 22/03/2014 – KadirovRussia: “Говорят, когда украинского военного, Мамчура, просили покинуть территорию российской базы в Крыму он кричал: Америка с нами!” [“They say that when the Ukrainian serviceman, Mamchur, was asked to leave the territory of the Russian base in Crimea, he shouted: America is with us!”]

Table 6.5 – The 11 Most Retweeted Tweets from the “Крым” Search – Crimea Campaign (figures are Retweets / Likes / Replies)

27/02/2014 – KadirovRussia (245/47/45): “В Крыму всего лишь появилось 7 БТРов без опознавательных знаков а в Киеве уже галстуки жуют.” [“In Crimea, only 7 armoured personnel carriers without identification marks have appeared, and in Kiev they are already chewing ties.”]

27/02/2014 – KadirovRussia (462/93/57): “Я правильно понимаю, что захватывать правительственные здания в Украине имеет право только "правый сектор", а в Крыму — это "теракт"?” [“Do I understand correctly that only the "right sector" has the right to seize government buildings in Ukraine, while in Crimea it is a "terrorist attack"?”]

31/03/2014 – KadirovRussia (379/176/63): “Крым может вернуться в состав Украины, если Украина целиком войдет в состав РФ.” [“Crimea may return to Ukraine if Ukraine as a whole becomes part of the Russian Federation.”]

17/04/2014 – KadirovRussia (255/48/22): “Андрея Бабицкого отстранили от работы на «Радио Свобода» за поддержку позиции России по Крыму. Демократичное радио, свободное.” [“Andrei Babitsky was suspended from work at Radio Liberty for supporting Russia's position on Crimea. Democratic radio, free.”]

11/05/2014 – KadirovRussia (246/73/58): “И,конечно,никаких жертв в Киеве,отторжения Крыма,пожара в Одессе и убиваемых нацгвардией сегодня не было бы,разгони Янукович вовремя майдан.” [“And, of course, there would be no victims in Kiev, no loss of Crimea, no fire in Odessa and no one killed by the National Guard today, had Yanukovych dispersed the Maidan in time.”]

13/05/2014 – coldwar20_ru (488/47/111): “Эти люди организовали "марш за мир", когда в Крыму ДАЖЕ НЕ СТРЕЛЯЛИ!!! Сейчас на Ю-В жгут и убивают, а их не слышно http://t.co/URa8CMbWnh” [“These people organized a "peace march" when in Crimea THERE WAS NOT EVEN ANY SHOOTING!!! Now they are burning and killing in the SE, but they are not heard”]

17/05/2014 – coldwar20_ru (249/36/16): “Поддержал Крым и новые республики ДНР и ЛНР.Настоящий патриот,спортсмен,за которого горд, в отличии от всяких Кличко http://t.co/Kjo6OglcJo” [“He supported Crimea and the new republics of the DPR and LPR. A true patriot, an athlete to be proud of, unlike any Klitschko”]

27/05/2014 – KadirovRussia (525/94/65): “Сколько гордости у тех, кто борется со своим безоружным народом и группой ополченцев при помощи авиации, но побоявшихся сунуться в Крым.” [“How much pride is there for those who fight their unarmed people and a group of militias with the help of aviation, but who are afraid to venture into Crimea.”]

13/06/2014 – coldwar20_ru (549/150/28): “Исчерпывающе о том, чей Крым, и кто там были истинные оккупанты. http://t.co/xtmU1r6nqw” [“Exhaustively on whose Crimea is, and who the true occupiers there were.”]

29/07/2014 – KadirovRussia (305/60/19): “Барак Обама: Россия должна заплатить большую цену за то, что мы не смогли осуществить переворот в Сирии и захват Крыма.” [“Barack Obama: Russia must pay a heavy price for the fact that we could not carry out a coup in Syria and the capture of Crimea.”]

31/12/2014 – KadirovRussia (245/81/26): “Написал в небе трассирующими «Крым наш!».” [“He wrote "Crimea is ours!" in the sky with tracer rounds.”]

Table 6.6 – The 10 Most Retweeted Tweets from the “Украин” Search – Crimea Campaign (figures are Retweets / Likes / Replies)

16/03/2014 – KadirovRussia (892/210/87): “Я так понимаю, что Украинцы ещё не поняли, что Запад их слил и оставил наедине с шайкой аферистов и бандитов захвативших власть.” [“I understand that the Ukrainians still have not realised that the West has leaked them and left them alone with a gang of swindlers and bandits who seized power.”]

25/05/2014 – KadirovRussia (523/151/103): “Долг Украины перед Россией за поставленый газ, наверное, тоже придется Кадырову выбивать.” [“Ukraine's debt to Russia for the supplied gas will probably also have to be knocked out by Kadyrov.”]

06/06/2014 – KadirovRussia (610/175/52): “Укропатриоты уже столько чеченцев насчитали и убили на Юго-Востоке Украины, что непонятно, остались ли вообще чеченцы в самой Чечне?” [“The Ukropatriots have already counted and killed so many Chechens in the South-East of Ukraine that it is unclear whether any Chechens remain in Chechnya itself?”]

12/06/2014 – KadirovRussia (498/132/121): “А теперь давайте представим,какая бы паника была у укропатриотов,если бы на Украину реально въехали реальные военные на реальных танка из РФ” [“Now let's imagine what kind of panic the 'dill patriots' would have had if real soldiers had actually entered Ukraine on real tanks from Russia”]

17/07/2014 – KadirovRussia (482/97/27): “Кстати, прошу никого не забывать, что на Украине люди могут сами себя заживо сжечь, самозастрелиться, и кондиционеры взрываются.” [“By the way, I ask you not to forget that in Ukraine people can burn themselves alive, shoot themselves, and air conditioners explode.”]

15/08/2014 – KadirovRussia (515/84/51): “Хочу напомнить,что в последнее время страны,бурно обвинявшие в крушении Боинга на Украине Россию,молчат даже о ходе расследования этого дела” [“I want to remind you that recently the countries that vigorously accused Russia of the Boeing crash in Ukraine have been silent even about the progress of the investigation of this case”]

21/08/2014 – LavrovMuesli (493/11/4): “Путин увидел свой рейтинг и тихо прошептал: "Слава Украине!"” [“Putin saw his rating and whispered softly: "Glory to Ukraine!"”]

06/09/2014 – PoliteRussia (594/12/6): “Губарев предложил расширить границы Новороссии от Одессы до Луганска. #Украина #ДНР #политика http://t.co/DBqm2Sgktb http://t.co/0ouKU5uhoG” [“Gubarev proposed expanding the borders of New Russia from Odessa to Lugansk. #Ukraine #DPR #politics”]

10/09/2014 – byDrBre (794/15/6): “У Украинцев скачут даже ракеты! #украина #славаукраине #героямслава #тернополь https://t.co/aQQK0vy8jD http://t.co/JjIkwwLNhO” [“Even the Ukrainians' rockets jump! #ukraine #glorytoukraine #glorytotheheroes #ternopil”]

26/09/2014 – PolitVlast (490/9/8): “Яценюк обвинил Россию в попытках«заморозить»Украину.Видимо, Москва, по его мнению, должна поставлять им газ бесплатно http://t.co/QBTkUFLVyw” [“Yatsenyuk accused Russia of trying to “freeze” Ukraine. Apparently, Moscow, in his opinion, should supply them with gas for free”]

Table 6.7 – The 10 Most Retweeted Tweets from the “Ukrain” Search – Crimea Campaign (figures are Retweets / Likes / Replies)

30/05/2014 – coldwar20_ru (149/12/5): “#Славянск Сегодня в 5:00 с Карачуна обстреляна детская поликлиника. #savedonbaspeople #StopUkrainianArmy http://t.co/iNiva5JvbM” [“#Slavyansk Today at 5:00 a children's clinic was shelled from Karachun.”]

31/05/2014 – coldwar20_ru (183/9/4): “Поддержка пришла даже из Львова. #SaveDonbassPeoplefromUkranianArmy #savedonbaspeople #StopUkrainianArmy http://t.co/4t5SmYQuY8” [“Support came even from Lviv.”]

30/05/2014 – coldwar20_ru (175/15/6): “#Славянск С Карачуна начался артобстрел города. #savedonbaspeople #StopUkrainianArmy http://t.co/vdTNDTnnQA” [“#Slavyansk Shelling of the city from Karachun has begun.”]

31/05/2014 – coldwar20_ru (263/26/1): “Германия и Финляндия присоединились к акции #SaveDonbassFromUkrainianArmy C каждым днём,нас всё больше! http://t.co/ipUOxhRFDk” [“Germany and Finland joined the action #SaveDonbassFromUkrainianArmy. Every day there are more and more of us!”]

28/05/2014 – ruopentwit (186/3/1): “Тем кто пишет, что российские войска участвуют в украинском конфликте, скорее всего завтра в школу к 9 #UkrainianLie http://t.co/aSMtwAGoCg” [“Those who write that Russian troops are involved in the Ukrainian conflict most likely have school at 9 tomorrow #UkrainianLie”]

28/05/2014 – coldwar20_ru (139/13/0): “#savedonbaspeople #StopUkrainianArmy http://t.co/7ZOjLNRHqj” [hashtags only]

28/05/2014 – Caruolan (200/1/0): “Армия укропских троллей вывели в топ «утку» про вторжение России #UkrainianLie http://t.co/373FTMthRD http://t.co/aFt46fJcLK” [“An army of 'dill' trolls pushed a "canard" about the Russian invasion to the top #UkrainianLie”]

30/05/2014 – coldwar20_ru (229/18/62): “Вот такое вооружение Хунта разместила в 5 км. севернее #Донецка #savedonbaspeople #StopUkrainianArmy http://t.co/gOEuQ354fs” [“This is the weaponry the junta has placed 5 km north of #Donetsk”]

28/08/2014 – Batman_Bane (187/0/0): “http://t.co/fquasjVIGQ Украинская ложь! #UkrainianLie http://t.co/nuq1GFBbBf” [“Ukrainian lies! #UkrainianLie”]

13/05/2014 – coldwar20_ru (165/27/6): “Расскажите об этом! RT @coldwar20_en: 18+ Show your friends. It will not show CNN. The tragedy of Ukraine http://t.co/r6wAjFyD4V” [“Tell people about this! RT @coldwar20_en: 18+ Show your friends. CNN will not show it. The tragedy of Ukraine”]

Chapter Seven – 2014: Russia Attacks the US

7.1 Introduction The first known attacks on the US from the Russian troll farm the Internet Research Agency (IRA) began in 2014, after Russia carried out a successful information campaign in Crimea. The US examples examined in this chapter did not, however, prove as successful. The campaigns include an alleged explosion at the Columbian Chemicals plant in Louisiana, the alleged shooting of an unarmed black woman in Atlanta, and an alleged Ebola outbreak, also in Atlanta, all three carried out in the closing months of 2014. The last campaign to be examined occurred a year later, in 2015, and involved allegations of food contamination stemming from turkeys bought at Walmart but supplied by Koch Turkey Farms. As will be demonstrated, the Columbian Chemicals campaign (CCC) began with significant momentum, with the IRA flooding Twitter with a repetitive narrative that included imagery, links to false news stories and a Wikipedia page describing the falsified events. However, the campaign failed to take off. A few months later, two consecutive campaigns involving a shooting and then an Ebola outbreak in Atlanta once more flooded social media. Both campaigns, like the CCC, included imagery and supporting Wikipedia pages, yet neither appears to have been picked up at all by the US media; one of the campaigns was even launched in the middle of the night, US time.

A year later, the IRA once more flooded Twitter with a repetitive narrative, this time concerning an allegation of food poisoning from Koch Turkey Farms. The campaign differed from the previous campaigns in that the IRA attempted to embed an operative, going by the name Alice Norton, into an American-centric food platform to seed the allegations, which IRA Twitter operatives then linked to for credibility. The campaign again proved unsuccessful: before long a member of the forum identified Alice Norton as a troll owing to inconsistencies in her story. However, as will be argued, the four examples demonstrate the IRA learning from each campaign.

7.2 Russia Attacks America Between September and December 2014, the US was the victim of several disinformation campaigns carried out by Russia on various social media networks (SMNs). However, as will be demonstrated, these first Russian attempts at information manipulation in the US were unsuccessful. The campaigns were targeted specifically at the US, and many at the time deemed them social media hoax campaigns carried out across multiple SMNs, including Facebook, YouTube and Twitter (Chen, 2015). Chen's (2015) exposé of Russian troll farms in The New York Times would identify the IRA36 as responsible for the campaigns. The campaigns centred on three fictional events: a chemical explosion and subsequent gas leak in Louisiana, an Ebola outbreak, and the fatal police shooting of an unarmed black woman, the latter two of which allegedly took place in Atlanta (Blidner, 2018). In the Louisiana example, residents received warning text messages, and the IRA established a Wikipedia page as well as a 30-second recording of a fake news report on the CCC. The IRA information campaign coincided with the anniversary of the 11th of September 2001 terrorist attacks in the US. In what appears to be an attempt to leverage the fear created by the 2001 attacks, and presumably to maximise panic and confusion, the IRA used terms such as 'cover-up', 'accident', 'ISIS' and 'terrorist' to describe the falsified explosion and subsequent gas leak, while the Wikipedia page and the text message alerts added legitimacy to the campaign.

Similar intensity is seen on the 13th of December 2014, when two new crises were reported to have occurred in Atlanta, US. The first was an alleged Ebola outbreak and pandemic, and the second was the alleged shooting of an unarmed black woman, both events mirroring current headlines that were also prevalent in the news in 2014. Earlier the same year, six shootings by police in the US against black people had previously gained news coverage, including the killing of Michael Brown. The death of Brown, an unarmed 18-year-old student led to violent protests in Ferguson, Missouri (ABC News, 2015). A public emergency had also been declared earlier in 2014, after an Ebola outbreak in West Africa, which spread to the US through travel- association, the index case, a man who travelled from West Africa to the US with Ebola. The index case in the real-world example would go on to contaminate two health care workers before they would die from Ebola. Confirmation of eight other patients in the US followed, all of whom contracted the disease overseas (Centre for Disease Control, 2019). The 2014 Russian information campaign appears to be an attempt to leverage the fear created from actual Ebola

36 The connection was made regarding the three Hoax social media campaigns and the IRA when security specialists examined the tool used to post the tweets—Masss Post. The investigation further revealed that the tweets identified as Masss Post distributed were also associated to the domain add1.ru, which was registered to a Russian business man Mikhal Burchich. Mikhal Burchich of the same email address had previously been identified through leaked emails (by Anonymous International) as the executive director of the IRA (Chen, 2015).


cases. The AJC described the event as "a hoax meant to stick with people long enough to scare them" (AJC, 2015b). However, as will be demonstrated, unlike the initial September 2014 campaign, which caught the attention of the media as a tasteless prank and left local residents genuinely confused for some time, the two December campaigns gained little traction.

In comparison to the 2014 Crimea information campaign carried out by the IRA, the US information campaigns included extensive use of hashtags, predominantly #DeadHorse, #ColumbianChemicals, #Ebolainatlanta and #Shockingmurderinatlanta (AJC, 2015b). The IRA's use of the hashtag in the US campaigns may have something to do with the hashtag's origins on Twitter. According to Burgess and Baym (2020), the hashtag is deeply rooted in social issues and in navigating real-time events, one of the most notable examples being #blacklivesmatter, discussed in Chapter Eight. As Burgess and Baym (2020, p. 68) write,

the rise of #blacklivesmatter and its material ties to street protests and unjust policing serves as an important reminder of the embodiment and liveness of many events that might look merely like “data” or verbal discourse when viewed as hashtags.

By August 2014, #blacklivesmatter had been used 52,288 times. The very nature of Twitter, as a real-time, ever-updating social media network, meant that people could join in local street protests or keep abreast, from afar, of the Ferguson protests associated with #blacklivesmatter in late 2014 (Burgess & Baym, 2020). With the hashtag proving to be such a powerful tool, it is not surprising that the IRA would try to harness it for information warfare. As Papacharissi (2015) demonstrates, the hashtag has become a tool whereby intense feelings coupled with great momentum build within communities online, which then spill out into the real world.

In addition to the use of hashtags, all of the IRA's 2014 US-centric campaigns included video footage of the alleged events. The CCC had footage of an explosion and billowing smoke; the Ebola video was set up to appear like a news report and included a vehicle bearing the logo of Hartsfield-Jackson, Atlanta's airport; while the shooting video was blurred footage, with an unknown voice providing commentary on the alleged shooting happening in the background (AJC, 2015b). It would appear through the creation of the Wikipedia pages,


images and videos to depict the events, and the flooding of SMNs with personal accounts of the incidents, that the IRA invested considerable time in each of these campaigns. After the CCC, Duval Arthur, director of Homeland Security and Emergency Preparedness for St Mary Parish, Louisiana, dismissed the campaign as a "tasteless prank". Rather than a tasteless prank, the #ColumbianChemicals campaign, like the #Ebolainatlanta and #Shockingmurderinatlanta campaigns, was an attempt to create panic in the US through the creation of a fictitious terrorist attack in Louisiana. All three campaigns, however, would prove unsuccessful, as demonstrated through the Twitter campaign, which consisted of a plethora of tweets with minimal return in terms of likes, retweets and shares. The stories also failed to be picked up by primary news sources.

7.3 The Columbian Chemical Information Campaign
The CCC was the first of the known campaigns orchestrated by the IRA in 2014 towards the US. Coincidentally, the CCC occurred in what would become a 'swing state' in the lead-up to the 2016 US presidential election, that is, a state whose final vote could 'swing' between either the Democrats or the Republicans. As such, Louisiana was a primary target of the 2016 IRA campaign, as discussed in the following chapter.

The current research examined the IRA Twitter data for September 2014 and identified ~12,457 tweets specific to the CCC. The identification of tweets occurred in three stages. The first stage identified 3,393 tweets through query searches of the hashtags associated with the campaign, as outlined in Table 7.1. This stage was based on hashtags recorded in the dedicated 'hashtag' field of the data; hashtags which appeared in the tweet text field only were not included in this count.

Table 7.1 – Twitter Hashtags for the Columbian Chemical Information Campaign
#DeadHorse                 #ColumbianChemicals
#911Anniversary            #ChemicalAccidentLouisiana
#NeverForget               #LouisianaExplosion
#September11               #ColumbianChemicalsInNewOrleans
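The stage-one hashtag query can be sketched as follows. This is a minimal illustration rather than the thesis's actual code; the field name `hashtags` and the bracketed, comma-separated string format are assumptions about how the released IRA dataset stores the dedicated hashtag field.

```python
# Hashtags from Table 7.1, lower-cased for case-insensitive matching.
CCC_HASHTAGS = {
    "deadhorse", "columbianchemicals", "911anniversary",
    "chemicalaccidentlouisiana", "neverforget", "louisianaexplosion",
    "september11", "columbianchemicalsinneworleans",
}

def stage_one(tweets):
    """Keep tweets whose structured 'hashtags' field contains a campaign tag."""
    hits = []
    for t in tweets:
        # Assumed format: a bracketed string such as "[DeadHorse, NeverForget]".
        raw = (t.get("hashtags") or "").strip("[] ")
        tags = {h.strip().lower() for h in raw.split(",") if h.strip()}
        if tags & CCC_HASHTAGS:
            hits.append(t)
    return hits
```

Because this query runs on the dedicated hashtag field, tweets whose hashtags appear only in the tweet text are missed at this stage, which is why a second, text-based pass is required.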

Stage two identified 8,898 tweets without the relevant hashtags in the hashtag field through a keyword search in the tweet text area37. The keywords used in stage two appear in Table 7.2.

37 Most of these tweets had hashtags in the tweet text field, but this had not been picked up in the hashtag field. It is unclear why this occurred.


The extracted tweets were then examined manually to ensure their relevance to the CCC. Examples of these tweets include "@RichardAWebster Richard A., What happened this morning in Louisiana? I got the message! I'm very afraid! Help!"; "@foxandfriends Steve, If anything's going to happen to my mother in New Orleans you are going to be the first who I'll curse!"; and "@jrbullington Jonathan, No chance! They have no chance to cover their involvement in a terrible accident that happened in Louisiana". Of the remaining tweets, 166 were manually identified as belonging to the CCC. Such tweets included, "@jrbullington Jonathan, I can't believe it! We are so screwed right now! What can u say on it?"; and "@RonPaul Ron, Everyone's saying to calm down but no one knows what to do. How am I supposed to calm down? I'm pregnant for god's sake!" A date range of the 10th to the 16th of September was established based on the hashtag and keyword searches. The IRA CCC stopped on the 16th of September, with no further mention of the incident occurring after this date. In total, 189,240 tweets were extracted over the seven days. SmartCat.ai was used to translate non-English language tweets.

Table 7.2 – Keyword Search – 2014 US Campaigns
Explosion              Louisiana            Chemical Plant
Columbian Chemicals    DeadHorse            ISIS
Bomb                   Terrorist Attack     Blast
New Orleans            Accident             Calm down
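A stage-two pass over the tweet text can be sketched as below; this is a hypothetical illustration, with the field names `tweetid` and `tweet_text` assumed from the released dataset. Word boundaries are used so that, for example, 'ISIS' does not match inside 'crisis'; even so, broad keywords such as 'accident' over-capture, which is why the extracted candidates were then reviewed manually.

```python
import re

# Keywords from Table 7.2; multi-word entries are matched as phrases.
KEYWORDS = [
    "explosion", "louisiana", "chemical plant", "columbian chemicals",
    "deadhorse", "isis", "bomb", "terrorist attack", "blast",
    "new orleans", "accident", "calm down",
]
PATTERN = re.compile(
    r"\b(?:" + "|".join(re.escape(k) for k in KEYWORDS) + r")\b",
    re.IGNORECASE,
)

def stage_two(tweets, stage_one_ids):
    """Keyword-search the tweet text of tweets not already found in stage one."""
    return [
        t for t in tweets
        if t["tweetid"] not in stage_one_ids
        and PATTERN.search(t.get("tweet_text", ""))
    ]
```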

The first two IRA tweets regarding the CCC appeared on the 10th of September 2014, a day before the alleged explosion, under the hashtag #DeadHorse, posted by operatives running Russian Twitter accounts. The tweets, written in Russian, make no mention of the explosion, gas leak or catastrophic event. Instead, they appear to be an attempt to get hashtags trending through excessive use of the hashtag #DeadHorse, as demonstrated in the first tweet by user @vika_bere, which read, "Как-то скушно без истории #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse" [Somehow it's boring without a story #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse #DeadHorse]. The hashtag #ColumbianChemicals was then added to the Russian tweets, again without reference to the fictional explosion and subsequent gas leak. Examples include "#ColumbianChemicals вывожу" [Posting #ColumbianChemicals]; "Кто-то анфолловнится #ColumbianChemicals" [Someone is unfollowing #ColumbianChemicals]; and "Ну мне лень #ColumbianChemicals" [Well, I'm too lazy #ColumbianChemicals]. As mentioned


in the previous chapter, on the 11th of September there is also a reference to Columbian Chemicals and Crimea, with one user writing, "Я хочу в Крым #ColumbianChemicals" ["I want to go to Crimea #ColumbianChemicals"]; again, the explanation offered here is that the operative confused the two campaigns.

The Russian IRA tweets continued to propagate throughout the 10th and 11th of September, totalling 70 tweets, until the first English tweet on the incident appeared at 12.24 pm on the 11th of September 2014, by an operative using the handle @annemarie_farle: "Run by terrorists! My country is raped… #LouisianaExplosion". The Russian-language tweets then began to slow down, and the English tweets started to pick up, reaching a peak rate of one tweet per second during the height of the information campaign on the 11th of September 2014 (Borthwick, 2015). Four tweets by @AnnRussela followed the @annemarie_farle tweet, which read "#ColumbianChemicals The information about chemical plant explosion received from the witnesses http://t.co/RPOAmc04wi"; "#ColumbianChemicals A powerful explosion heard from miles away happened at a chemical plant in Centerville http://t.co/j30i36S1X5"; "Chemical plant exploded in Centerville, Louisiana #ColumbianChemicals http://t.co/aTWhO7tINl"; and "Chemical plant exploded in Louisiana #ColumbianChemicals". The Twitter data demonstrates that it was typical for operatives to retweet other operatives, as seen in the example of @AnnRussela. While @AnnRussela's first tweet on the CCC received no retweets, the three subsequent tweets were retweeted 1,021 times, 227 of which were by other IRA handles. @AnnRussela tweeted a total of 43 times on the 11th of September 2014 with information on the CCC; however, only the three tweets mentioned above received any retweets.


Image 7.1 – An example of one of @AnnRussela's tweets

The operative going under the handle @JacobsDavid_ then proceeded to tweet "Holy Christ! This #DeadHorse is scary… http://t.co/XYMeONzJ30". A minute later, 144 operatives had retweeted the @JacobsDavid_ post, with a further 89 retweets coming from non-IRA accounts. Russian operatives appear to have chosen #DeadHorse in preparation for sharing images of actual dead horses, with the implication that the horses had died due to the purported gas leak in Louisiana. There may also be a hidden meaning in the hashtag, as the definition of 'dead horse' according to the Merriam-Webster Dictionary is "an exhausted or profitless topic or issue —usually used in the phrases beat a dead horse and flog a dead horse" (2019). With regards to #DeadHorse, it would appear that the IRA intended to exhaust the topic on SMNs. A search of the IRA Twitter image data revealed images of actual dead horses and sickly horses shared on the 11th of September 2014. The photos of the horses were subsequently shared by legitimate Twitter users, as seen below in Images 7.2 and 7.3.

Image 7.2 and 7.3 – Examples of Legitimate Twitter users retweeting IRA images

A common strategy of Russian disinformation campaigns, as demonstrated in the 2014 Ukraine conflict, is the use of imagery. Russian government operatives would manipulate and distribute images of "purported atrocities by the Ukrainian army, including mass graves of tortured people, civilians used for organ trafficking, burning crops to create a famine, recruiting child soldiers, the use of heavy weapons against civilians, and acts of cannibalism" (Lange-Ionatamishvili, Svetoka, & Geers, 2015, p. 108). As Lange-Ionatamishvili et al. (2015) explain, the use of atrocious images in a disinformation campaign can help the campaign spread further


and faster and assist in the manipulation of the targeted audience's behaviour. Pictures of dead horses appear to have been used in the same way, to help the IRA spread the CCC.

Operatives whose reported locations were in the US appear to have tweeted the most during the CCC. There were 307 variants of the users' reported locations, with several locations doubled up due to multiple spellings of the location name; New York, for example, appeared as 'New York', 'New-York', 'NY' and 'New York City'. Twenty-one accounts left their location blank, while none of the operatives listed Louisiana, where the chemical spill was said to have taken place, as their location. Twenty-four users listed New Orleans, with one tweeting "My personal hell starts tomorrow. The city is out of bread. #DeadHorse #ColumbianChemicalsInNewOrleans", while another operative, @MatthewLaborde1, whose location data read New Orleans, tweeted "Only idiots can do such an amazing shit! Russia pls help! #DeadHorse #ColumbianChemicals". The phrase "Russia pls help" was used twenty-seven times during the CCC and recurs throughout the various IRA campaigns.

Five hundred and forty-two individual IRA accounts tweeted on the CCC. IRA user @1DRussianFNDM, whose location read as unknown, tweeted the most of all the IRA operatives in the data set, with 218 tweets. In addition to posting multiple tweets, operatives also targeted prominent figures by tweeting directly at them. For example, operative @zpokodon9 targeted senators and reporters such as Senator Lisa Murkowski "@lisamurkowski Lisa , Are you kidding?? I saw the video #ColumbianChemicals and it looks like hell!!! What a nightmare!"; Senator Chris Coons "@ChrisCoons Chris, You've wrote so many things about how Americans must be stronger. What should we do now. #ColumbianChemicals"; Senator Tom Udall "@SenatorTomUdall Tom , Please help me to let people know that our government is trying to hide the info about an explosion in LA", as well as Fox News reporter Jim Angle "@JimAngleFox Jim, I'm sorry, but what were they thinking about? Are they crazy? It's going to kill us all! #ColumbianChemicals #September145".

The tweets from the 12th of September appeared to be a carry-over from the day before, with one operative tweeting, "@maziehirono Mazie , What happened?? I got the message about the explosion at #ColumbianChemicals! Is this true???". However, the tweets on the 16th of September all came from operative accounts whose user-reported location was Russia, or places within Russia, as per Map 7.1. The exceptions were four operatives whose

locations were reported as Hama and Damascus in Syria, Lyon in France, and Greece. The users' reported locations appear to be the only difference in the campaign on the 16th, as the tweets continued in the same tone as the two previous days. For example, @GubatovOfficial, whose user location was Moscow, wrote, "I can't get it! Don't panic, we still have time! #ColumbianChemicals", while @Abunuwasa, whose location data was Hama, Syria, wrote, "Quit panic and start action! When the fuck it gonna finally end?! #LouisianaExplosion". It is unclear why these accounts were chosen for what appears to be a last attempt to get the hashtags trending. However, a search on the two accounts demonstrated that both were created in early 2014, and neither had received much interaction from other Twitter accounts. For example, @GubatovOfficial was responsible for 6,355 tweets throughout 2014 but received only 626 retweets, while @Abunuwasa was responsible for 3,892 tweets but received a total of only 438 retweets.

Map 7.1: Locations of IRA Operatives in Russia

The CCC was targeted and well organised; however, it did not spread or trend (Borthwick, 2015). As Borthwick (2015) explains, "no matter how much volume, how many tweets, or Facebook likes a campaign generates, if the messages aren’t embedded within existing networks of information flow, it will be very difficult for information to actually propagate" (10). The accounts used to spread the information campaign were part of a network of accounts that had not existed long enough to establish a connection with accounts outside of the network


of operatives (Borthwick, 2015). As demonstrated in Table 7.3, the IRA's English-language tweets were minimal until 2014, when they expanded exponentially, leaving insufficient time to establish a trusted network of followers, as supported by Graph 7.1 and Table 7.4 below. In addition, it is the premise of this thesis that, without the support of traditional media sources, the campaign did not get the attention it needed to spread.

Table 7.3 – Total Number of IRA English Language Tweets per Year
Year    Total English Language Tweets
2009    1
2010    17
2011    10
2012    28
2013    682
2014    332,689
2015    1,118,328
2016    896,531
2017    643,031
2018    5,856

Graph 7.1 – Twitter Account Creation 2014

As Graph 7.1 demonstrates, a large portion of accounts were created throughout 2014 in the lead-up to the information campaign, with a high volume of tweets coming from accounts created in September, as per Table 7.4. Table 7.4 demonstrates that the IRA relied heavily on accounts created in 2014 to carry out the information campaign, with many tweets coming from accounts that were only a few months old and the highest number coming from accounts created in the same month as the information campaign itself. On first review of

the 2014 US campaign, it would appear similar in nature to the Crimea campaign, which was also carried out by newly created accounts. However, the newly created accounts in the Crimea campaign appear to have quickly embedded themselves into echo chambers through association with older IRA accounts; that is, the older accounts vouched for and gave legitimacy to the new accounts. For the US accounts there was no such avenue: the accounts were new to US echo chambers and therefore had not established trust networks.

Table 7.4 – Total Tweets based on the Creation Date of Accounts Used in the Columbian Chemical Information Campaign per Month in September 2014
Account Creation Month    Total Tweets
January      121
February     1
March        213
April        445
May          1,588
June         1,190
July         149
August       716
September    1,671
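A tally like that in Table 7.4 can be reproduced with a simple grouping pass; the sketch below assumes a field named `account_creation_date` in `YYYY-MM-DD` form (the exact field name and date format in the released dataset may differ).

```python
from collections import Counter
from datetime import datetime

def tweets_by_creation_month(tweets, year=2014):
    """Count campaign tweets by the month in which the posting account was created."""
    counts = Counter()
    for t in tweets:
        created = datetime.strptime(t["account_creation_date"], "%Y-%m-%d")
        if created.year == year:
            counts[created.strftime("%B")] += 1  # e.g. "September"
    return counts
```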

A large proportion of tweets came from applications designed to facilitate coordinated action across multiple accounts, such as Masss Post. An examination of the data identified 6,954 tweets posted through such clients, as demonstrated in Table 7.5. Masss Post is a bulk-tweeting application that allows users to post identical content from multiple accounts, a practice now banned by Twitter in an attempt to stop spam, bots and malicious activity and to prevent the false amplification and inflation of tweets (Sugrue, 2018), as demonstrated in the CCC.

Table 7.5 – Client Used to Coordinate Action Across Multiple Accounts in 2014 US Campaigns
Client Name     Number of Tweets
Masss Post      100
Masss Post2     75
Masss Post3     199


Masss Post 4    4,721
Masss Post5     1,868
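A client tally like Table 7.5 reduces to grouping on the posting client; the sketch below assumes the released dataset exposes the client in a field named `tweet_client_name`.

```python
from collections import Counter

def bulk_client_counts(tweets, prefix="Masss Post"):
    """Tally tweets per posting client, keeping only the bulk-posting variants."""
    counts = Counter(t.get("tweet_client_name", "") for t in tweets)
    return {client: n for client, n in counts.items() if client.startswith(prefix)}
```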

7.4 Ebola and a Shooting in Atlanta
Three months after the CCC, IRA Twitter operatives attempted to cause panic by spreading fake news stories of an Ebola outbreak in Atlanta. The information campaign coincided with a second campaign the next day, also in Atlanta, in which a fake video emerged of the police shooting of an unarmed black woman (Chen, 2016). When reviewing the data for both the Ebola and shooting campaigns, the tactics used by the IRA operatives appear to have changed; for example, the number of tweets for both events was dramatically lower.

The following words were used in the searches to identify the relevant IRA tweets:
• Ebola
• Эбола [Ebola]
• Shooting
• стрельба [Shooting]
• Atlanta
• атланта [Atlanta]

Duplicated records were then deleted, leaving a total of 2,693 records. The records were then reviewed and reduced to 563 tweets focused on the shooting in Atlanta and 1,505 tweets about the Ebola outbreak. The first Ebola tweet occurred on the 13th of December 2014 at 7:14 pm, and tweets continued until 3:51 am on the 14th of December 2014. What was interesting was the cross-over of the two campaigns, with the shooting campaign starting at 2:43 am on the morning of the 14th of December and continuing until 3:50 am that same morning. All retweets, consisting of 225 from the Ebola campaign and 23 from the shooting campaign, appear to have come from other IRA operatives; spam bots were not utilised in the campaigns. The Atlanta campaigns ran over 134 individual Twitter accounts, comprising 27 accounts created in 2013 and 107 created between January and October 2014, with the majority of accounts (74) created in May and June of 2014. All but 12 Twitter accounts were marked as being in the US, with only seven accounts noting Atlanta as their location.
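The de-duplication and the split into the two Atlanta campaigns can be sketched as follows. Field names are assumptions about the released dataset; note that a record mentioning both search terms is assigned to the Ebola campaign first in this sketch, whereas the thesis reviewed the records manually.

```python
def split_atlanta_campaigns(tweets):
    """De-duplicate by tweet id, then sort records into the two Atlanta campaigns."""
    seen, ebola, shooting = set(), [], []
    for t in tweets:
        if t["tweetid"] in seen:
            continue  # drop duplicated records
        seen.add(t["tweetid"])
        text = t.get("tweet_text", "").lower()
        if "ebola" in text or "эбола" in text:
            ebola.append(t)
        elif "shooting" in text or "стрельба" in text:
            shooting.append(t)
    return ebola, shooting
```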

When examining the full dataset on Ebola (all tweets in 2014 containing the word Ebola or Эбола), two conspiracy theories were identified. The first involved many Russian-language tweets which implied that Ebola was a biological weapon, and that the US had a vaccine

for Ebola but was not sharing it with the rest of the world. Several operatives used their Twitter accounts to point followers to further 'evidence' of the conspiracy theory through blog articles and YouTube videos. One LiveJournal article, for example, concluded,

Africa is milked like a cow. Ebola is an artificially created virus brought to Africa. It is used as a tool to reduce the number of overpopulated African countries, and also serves as an occasion for the introduction of US troops into Liberia to control the process of mining.

All this looks like a game of chess, where African countries will soon be put a checkmate [sic] (bender190191, 2014).

At one stage, the user milena_NIHAO was retweeted 37 times by other IRA operatives, and a further 11 times by non-IRA users, for her tweet "Очередное эбола-разоблачение http://t.co/Q1z3bibSqg" ["Another Ebola exposé http://t.co/Q1z3bibSqg"], which referenced the conspiracy theory article.

The second conspiracy theory claimed that people were coming back to life after dying from Ebola. The tweets were then picked up in a Big American News and Media Inc. story claiming that a third Ebola victim had risen from the dead in Africa. That the idea of a zombie apocalypse was picked up in mainstream media is consistent with Vosoughi et al.'s (2018) finding that fake news is novel. The story included an image of a zombie-like person, later identified as a character from the movie World War Z, as seen below in Image 7.4. In total, there were 73 tweets in the search results which referenced the Ebola zombie, all of them in English. For example, Colby Bell tweeted on the 13th of December 2014, "ZOMBIE APOCALYPSE : Ebola victims in African village 'rise from the dead'". The campaign does not seem to have gained much traction, as only one tweet was retweeted, once.


Image 7.4 – Image taken from the movie World War Z and portrayed as a real-life event. Source: http://bigamericannews.com/2014/09/30/africa-confirms-3rd-ebola-victim-rises-from-the-dead-releases-picture-of-first-ebola-zombie-captured/

The last stage of the analysis involved an examination of the ten most retweeted tweets and the ten most followed users. The majority of activity occurred during the first few hours of the CCC on the 11th of September 2014, as demonstrated in Table 7.6 below. The same pattern is seen in the Atlanta campaigns, with the most retweeted tweets occurring within a few hours of the campaign starting on the 13th of December 2014. The shooting campaign appears to have received minimal traction, perhaps because it began in the very early morning, so that by the time America woke up on the 14th of December 2014, the campaign had already been proven fraudulent. A search of news reports on the 14th and 15th of December demonstrated that neither campaign had made it into the local news in Atlanta, suggesting that the campaigns had died off on their own.

With regards to the top ten accounts with the most followers, as Table 7.7 demonstrates, six of the accounts tweeted on the CCC, three related to the Ebola campaign and one related to the shooting campaign. The most active were the two accounts with the user location recorded as Russia. The other accounts, although they show a high follower base, demonstrated very little activity in terms of tweets, retweets, likes


and quotes. In comparison, the Crimea campaign’s top ten followers showed a considerable amount of user interaction and tweeting activity, as demonstrated in Table 7.8 below.

Table 7.8 – Ten Most Followed Accounts' Interaction – 2014 US Campaigns and Crimea Campaign
                Average Reply    Average Tweets    Average Retweet    Average Like
US Campaigns    1,353.8          188.6             21.8               12.1
Crimea          7,052.6          71,885.2          14,839.9           10,868.1

7.5 Koch's Turkeys
Almost a year after the Atlanta information campaigns, Twitter user @D_andre_Austin_ posted "#walmart #KochFarms https://t.co/szzqrlBAAy https://t.co/GrTM9etEQ9". The link has since been removed by Twitter, so it is unclear where it pointed. The post was, however, followed by 3,691 tweets which referenced an instance of food poisoning on the American holiday of Thanksgiving, caused by turkeys bought at Walmart and supplied by Koch Turkey Farms. Many of the tweets referenced a cooking forum called Discuss Cooking, where user Alice Norton posted that her family had suffered food poisoning from a turkey bought at Walmart. An image of the forum post is seen in Image 7.5 below.

Image 7.5 – Original post on Discuss Cooking

The post received 31 responses; not all were about the food poisoning incident, and not all were sympathetic to Alice. One forum participant pointed out that Alice had previously introduced herself as having two children, Jacki and Stan, not Robert as indicated in the food poisoning post, as demonstrated in Image 7.6 below.


Image 7.6 – A user questions the credibility of Alice Norton

A review of Alice Norton's posts on Discuss Cooking revealed four posts in total, including the post about the food poisoning incident. The first was Alice introducing herself to the forum and indeed indicating that she had two children, neither named Robert, as demonstrated in Image 7.7 below. The other two posts were a meme and advice to another member, as shown in Images 7.8 and 7.9 below.

Image 7.7 – Alice introduces herself


Image 7.8 – Meme Alice posted

Image 7.9 – Alice helps a user

Alice Norton was a Russian troll working for the IRA who had attempted to embed herself into an American-centric cooking forum. However, it was not long after Alice's post at 4.03 am on the 26th of November 2015 that the post was mocked for its inconsistencies, with one user calling Alice a troll outright, as demonstrated in Image 7.10 below.

Image 7.10 – Alice is identified as a troll

The Koch campaign, when reviewed, appears more sophisticated than the three earlier campaigns, as if Russia were learning from previous mistakes. For example, unlike the 2014 campaigns, all the tweets from the Koch campaign were in English, with the majority of the 2,542 tweets purporting to have come from within the US. One Twitter account's location was marked as WA, assumed here to mean Washington D.C., while the other tweets left their location blank. Also new was the IRA's attempt to build trust in an American-centric online network through the pseudonym Alice


Norton. The IRA used the cooking forum to try to add another layer of legitimacy to their disinformation campaign, as opposed to just creating an SMN post.

A review of the data demonstrated 115 distinct accounts in use throughout the campaign, in comparison to the 132 accounts used in the Ebola campaign, the 31 accounts used in the shooting campaign and the 568 accounts used in the CCC. What was interesting was the cross-over of accounts between the four campaigns. Sixty-three user accounts were used across two of the campaigns, while seven user accounts were used across three of the campaigns. In terms of the 2015 campaign, there were six instances where user accounts were re-used from one of the 2014 campaigns, and one case where an account was used in the 2015 campaign and two of the 2014 campaigns. In terms of user interaction, the Koch campaign received 647 retweets, 141 likes and 4 replies.
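The cross-over counts above reduce to set arithmetic over the account handles used in each campaign; the sketch below uses hypothetical data and assumes each campaign's accounts are available as a set of handles.

```python
from collections import Counter

def account_overlap(campaigns):
    """Given {campaign_name: set_of_handles}, count accounts used in exactly
    two campaigns and in three or more campaigns."""
    uses = Counter()
    for handles in campaigns.values():
        for handle in handles:
            uses[handle] += 1
    in_two = sum(1 for n in uses.values() if n == 2)
    in_three_plus = sum(1 for n in uses.values() if n >= 3)
    return in_two, in_three_plus
```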

Also new in the 2015 campaign was the consistent use of hashtags: each of the tweets identified for the Koch campaign utilised a hashtag. While examining the hashtags used throughout the campaign, a hashtag unrelated to the Koch campaign appeared, #wakeupamerica, with four of the top ten accounts with the most Twitter followers having used this hashtag in their profile description. #wakeupamerica and the phrase 'Wake up America' were then examined and discovered to have been used several times throughout 2015, 2016 and 2017, as demonstrated in Table 7.9 below. A review of the ten most tweeted accounts for the Koch campaign, as seen in Table 7.10, also revealed that the oldest account, @kennygratham, created in 2013, had lain dormant until 2015, with its first activity being the Koch campaign; no prior activity existed before this.

Table 7.9 – Analysis of #wakeupamerica
Year    #wakeupamerica    Wake Up America
2015    1,484             47
2016    2,690             47
2017    397               48
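Yearly counts like those in Table 7.9 can be reproduced by bucketing tweets by year and checking for the hashtag or the plain phrase; a sketch assuming a `tweet_time` field in `YYYY-MM-DD HH:MM` form (field names and formats are assumptions about the released dataset).

```python
from collections import Counter
from datetime import datetime

def tag_usage_by_year(tweets, tag="wakeupamerica", phrase="wake up america"):
    """Yearly counts of a hashtag versus the equivalent plain phrase in tweet text."""
    tag_counts, phrase_counts = Counter(), Counter()
    for t in tweets:
        year = datetime.strptime(t["tweet_time"], "%Y-%m-%d %H:%M").year
        text = t.get("tweet_text", "").lower()
        if "#" + tag in text:
            tag_counts[year] += 1
        elif phrase in text:
            phrase_counts[year] += 1
    return tag_counts, phrase_counts
```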

The last stage of the investigation into the Koch campaign involved an examination of Twitter to see if there was any evidence that the campaign had spread to a broader Twitter audience, as suggested in an article by Fusaro (2018). However, a search of Twitter between the 25th and 26th of November 2015 revealed no mention of the food poisoning, which can indicate either


that the campaign did not reach a wider audience, or that people removed posts regarding the food poisoning campaign once it was demonstrated to be Russian sponsored.

One factor that may have helped to mould the approach the IRA took with the 2015 Koch campaign could have been the release in June 2015 of an exposé by The New York Times entitled The Agency (Chen, 2015), which outlined in detail the three previous campaigns the IRA had orchestrated towards the US in 2014. This may explain why the Koch campaign consisted of operatives whose reported locations were in the US, and why the IRA had embedded an operative in a US-centric cooking site to aid the deception.

7.6 The Start of an Information War
All four of the disinformation campaigns analysed in this chapter have been written about in books and newspaper reports, the most cited being Chen's (2015) The Agency, published in June 2015, which describes the incidents in 2014 as a virtual assault. According to Tromblay (2018), the CCC was an attempt to exploit fears of an environmental disaster and to sow distrust in the American government. The threat that the campaign created, however, could not last. The accounts, the Wikipedia page and the false news reports could not stand the test of time, and they did not. The campaign was over in a few days; the falseness of the incident was exposed within two hours of the campaign beginning in the US, when a news report explained that there had been no explosion and no chemical spill, and the allegations were debunked and determined a hoax (Chen, 2015). Barry (2018), writing on the Koch campaign, found that Walmart Inc. reached a similar conclusion when the IRA posts first came to its attention in November 2015: the posts were a hoax, so it did not investigate further. Barry (2018), however, suggests that the Koch campaign was more than a hoax; it was a test run. Borrowing from Keir Giles' research, Barry (2018) indicates that the IRA was running test campaigns targeted at the US in order to probe the boundaries of what it could and could not get away with in terms of falsified stories, and to see what type of social movement it could coordinate from across the globe. Alternatively, the four campaigns may have been a distraction from a greater campaign being put in motion by the Russian government and played out in 2016 during the US presidential election. As demonstrated below, and discussed in detail in Chapter Eight, alongside the four campaigns discussed above, IRA operatives appear to have been embedding themselves into American-centric echo chambers.


When comparing the four disinformation campaigns to the Crimea campaign, some similarities exist. For example, both the Crimea campaign and the 2014 US campaigns used newly created accounts which flooded Twitter with tweets. In contrast, however, the US campaigns relied on spam bots to publish tweets, while the Crimea posts were organic, meaning users wrote them. To gauge the activity and engagement the IRA user accounts received in 2014 and 2015, the tweet language and account language fields were used to search for all English tweets. As demonstrated in Graph 7.2 below, interaction in terms of retweets was barely in existence in 2014; however, by the end of 2015, the IRA appears to have embedded itself into America's Twitter networks.38 In November 2015, for example, the location 'Black America' was in use, with one user who recorded their location as 'Black America' responsible for 6,544 retweets on a single tweet. User gloed_up, whose profile read "No black person is ugly #BRONZE #BlackLivesMatter #BlackToLive", was responsible for 12,171 retweets on a single tweet in December 2015. The tweet in question read "Her Teacher cut off 1 of Lamya's beautiful braids as a punishment. Paid a small fine and kept her job #HATEIT https://t.co/CW4sjkzp9B". The tweet referred to a seven-year-old black girl who had her braid cut off in the classroom by her teacher because the child was playing with it (Watts 2009). gloed_up had also amassed a large following of 28,943. As was demonstrated in Chapter Six, #BlackLivesMatter and black issues in general would be one of the ways the IRA created and infiltrated trust networks within online American communities. The final section of this chapter will examine this idea of 'trust', and what it means in online social networks.

38 Tweets were identified as American if the user’s recorded location was situated in America and/or the tweet language was English and the user’s name was not Russian or Arabic.
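The filtering and aggregation just described can be sketched in a few lines of Python. This is a minimal illustration, not the thesis’s actual code: the field names are modelled loosely on Twitter’s released datasets and are assumptions, the sample records are invented, and footnote 38’s name check is simplified to Cyrillic characters only (the Arabic check is omitted).

```python
from collections import Counter

# Invented sample records; field names are assumptions modelled
# loosely on Twitter's released IRA datasets.
tweets = [
    {"tweet_language": "en", "user_reported_location": "Atlanta",
     "user_screen_name": "KathieMrr", "tweet_time": "2015-11-03",
     "is_retweet": False},
    {"tweet_language": "ru", "user_reported_location": "Москва",
     "user_screen_name": "digestlj", "tweet_time": "2014-07-12",
     "is_retweet": True},
    {"tweet_language": "en", "user_reported_location": "Black America",
     "user_screen_name": "gloed_up", "tweet_time": "2015-12-01",
     "is_retweet": True},
]

CYRILLIC = set("абвгдежзийклмнопрстуфхцчшщъыьэюя")

def looks_american(tweet):
    # Footnote 38's heuristic, simplified: an English-language tweet
    # and a screen name with no Cyrillic characters.
    name = tweet["user_screen_name"].lower()
    if any(ch in CYRILLIC for ch in name):
        return False
    return tweet["tweet_language"] == "en"

american = [t for t in tweets if looks_american(t)]

# Monthly tweet totals and retweet share, the two series in Graph 7.2.
totals, retweets = Counter(), Counter()
for t in american:
    month = t["tweet_time"][:7]  # "YYYY-MM"
    totals[month] += 1
    if t["is_retweet"]:
        retweets[month] += 1

retweet_pct = {m: 100 * retweets[m] / totals[m] for m in totals}
print(len(american), retweet_pct)
```

Applied to the full dataset, the same logic yields the monthly totals and retweet percentages charted in Graph 7.2.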


[Graph 7.2, reproduced here as a caption only: a monthly series from January 2014 to December 2015 plotting the total number of tweets (left axis, 0–160,000) against the percentage of retweets (right axis, 0–180%).]

Graph 7.2 – Tweets and Percentage of Retweets 2014-2015

7.7 Trust Networks

Trust is a primary component in all human social relationships. Defined as "a measure of confidence that an entity will behave expectedly despite the lack of ability to monitor or control the environment in which it operates" (Sherchan, Nepal, & Paris, 2013, p. 47:44), trust is an essential part of the connection between people or organisations via a computer network, known as a social network (Garton, Haythornthwaite, & Wellman, 1997). Active participation is central to trust: for networking to occur, users must reveal information (Grabner-Kräuter & Bitter, 2015). Trust lays the foundation for constructing trust communities, online communities where members share experiences, thoughts and opinions in a safe environment without fear of judgement or scrutiny (Suryanarayana & Taylor, 2004). A social network is a social structure made up of nodes, each representing an individual, group or organisation, connected by a common interest, for example through trade, friendship, shared values or ideas. According to Sherchan et al. (2013, p. 47:12), social networks are "homogeneous with regards to many sociodemographic, behavioural, and intra-personal characteristics". Individuals primarily associate and bond with those similar to themselves. This tendency, homophily, discussed briefly in Chapter Six, shapes how a member of a social network experiences interactions, the attitudes they form and the information they receive. Members of social networks primarily associate with individuals of the same social status, such as race, ethnicity, age or religion, or with individuals who share the same values, beliefs or attitudes as themselves (Sherchan et al., 2013).


Social networking sites offer a forum for members to share life experiences, opinions, hobbies and even daily activities, with most social networks aiming to connect as many individuals as possible (Sherchan et al., 2013). A key component of social networks is the idea of 'Friend of a Friend' (FOAF), with the underlying assumption that friendship is transitive: if A trusts B and B trusts C, then A and C will trust each other. However, trust is not transitive but propagative. That is, if A trusts B and B trusts C, A's trust of C will depend on how much she trusts B and how much B trusts C. As such, "trust information can be passed from one member to another in a social network, creating trust chains" (Sherchan et al., 2013, p. 47:49). Trust chains thus enable members to trust other members not directly connected to them. Trust chains are formed under specific conditions that allow trust to develop based on a recommendation system. Jøsang, Gray, and Kinateder (2003), in their study of trust networks, describe this phenomenon as a chain of links in which only the first link carries direct trust and each succeeding link carries indirect trust. When examining the 2014 and 2015 information campaigns above, no opportunity appears to have been created for the IRA operatives to build trust with anyone outside of the IRA operation.
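The propagative character of trust described above can be made concrete with a small sketch. The multiplicative decay rule below is an illustrative assumption for this summary of the literature, not the operator Jøsang et al. (2003) formally define:

```python
def chain_trust(weights):
    """Propagate trust along a FOAF chain of pairwise trust weights,
    each in [0, 1]. Only the first link is direct trust; every later
    link is indirect, so derived trust decays with each hop."""
    trust = 1.0
    for w in weights:
        trust *= w
    return trust

# A trusts B strongly (0.9) and B trusts C moderately (0.6):
# A's derived trust in C is weaker than either direct link.
a_to_b, b_to_c = 0.9, 0.6
print(chain_trust([a_to_b, b_to_c]))
```

Under such a rule, a newly created account with no links into a community, like the 2014 US campaign accounts, has no chain at all, while an account embedded since 2009, as in the Crimea campaign, can borrow trust through many short chains.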

According to Grabner-Kräuter and Bitter (2015), trust-building begins when a person first joins a social network community. The foundation of the trust relationship during this first encounter, like the formation of stereotypes, is based on first impressions, or rapid cognitive cues. A transference process may also occur in this initial stage, operating in the same way as Jøsang et al.'s (2003) trust chains. The most mature level of trust, according to Grabner-Kräuter and Bitter (2015, p. 55), "is restricted to interpersonal trust and dominated by internalization of the other's preferences, mutual empathy and identification with each other [italics included]"; this highest level of trust is influenced by shared values, a collective identity, common tasks and goals, and by either emotional or physical closeness. A further way that members form social trust is through Putnam’s (2000) concept of social capital, which is the number of valuable contributions a member of a social network offers other members. Defined in contemporary society39 as “features of social organisations such as networks, norms, and social trust that facilitate coordination and cooperation for mutual benefit" (Putnam, 2000, p. 2), social capital has three core principles that enhance the trust outcome (Lin, 2017). Firstly, it provides for the flow of information; secondly, it exerts influence; and thirdly, it gives a form of social credential. For example, if someone tied into a social network of cybersecurity experts requires a job, the sum of social capital they hold in the group would determine the effort other members invest in assisting their search for employment. Assistance would include spreading information, using influence to help secure employment and providing references for the member.

39 The original idea of capital belongs to Marx and was seen as the surplus-value captured by those who controlled the means of production, namely the bourgeoisie.

According to Grabner-Kräuter and Bitter (2015, p. 56), social capital is the value accrued "through a social network and from the social resources of the actors embedded within that network". An embedded perspective assumes that trust increases as a result of multiple beneficial experiences. Social capital is, therefore, an umbrella theory bringing together various concepts such as trust, social exchange, social support, social networks, and embeddedness. With regard to troll farms, this suggests that for trolls to make a connection with members of a social network community, a troll needs to provide value to that community through positive interactions with other members; it is not enough to simply flood a community with messages, which are often viewed as redundant information. The messages need validation through time spent in the social network building up social capital and trust. Once a troll gains the trust of members, the troll may then begin to propagate this trust through FOAF networks. As Sherchan et al. (2013) explain, social trust therefore consists of three steps: the collection of trust information, the evaluation of trust, and the dissemination of trust, as illustrated in Diagram 7.1. With regard to the four campaigns reviewed at the start of this chapter, Alice Norton appears to have attempted to infiltrate a trust network but failed due to inconsistencies in her story. Alice had also not gained any form of social capital, owing to her minimal interactions, so none of the members appear to have supported her allegations the way a trusted friend would have. In comparison, when examining the Crimea campaign, the IRA had begun to build trust with fellow Russians and Russian speakers on Twitter as early as 2009, and any accounts newly created in 2014 would have benefited from these pre-established FOAF networks.
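Sherchan et al.’s (2013) three steps can be expressed as a minimal pipeline. Everything below, the function names, the sample data and the averaging rule, is an illustrative assumption rather than a model drawn from the literature:

```python
# Illustrative three-step social-trust pipeline after Sherchan et al.
# (2013): collect trust information, evaluate it, disseminate it.

def collect(interactions):
    """Step 1: gather raw trust evidence about a member, here simple
    ratings in [0, 1] left by other members after interactions."""
    return [score for _, score in interactions]

def evaluate(ratings):
    """Step 2: reduce the evidence to a single trust value; a plain
    average stands in for the many evaluation models surveyed."""
    return sum(ratings) / len(ratings) if ratings else 0.0

def disseminate(trust, network, member):
    """Step 3: make the evaluated trust visible to the member's
    direct connections (their FOAF candidates)."""
    return {neighbour: trust for neighbour in network.get(member, [])}

interactions = [("alice", 0.8), ("bob", 0.6), ("carol", 1.0)]
network = {"troll": ["alice", "dave"]}

trust = evaluate(collect(interactions))
print(disseminate(trust, network, "troll"))
```

Read against the Alice Norton example, the pipeline halts at step one: with almost no interactions there is no evidence to evaluate, and hence no trust to disseminate.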


Diagram 7.1 – Process of Creating Social Trust (Sherchan et al., 2013, p. 47:16)

7.8 Conclusion

As demonstrated throughout this chapter, the IRA’s first assault against the US using social media appears to have occurred in 2014. Russia’s approach was to flood Twitter with a repetitive narrative to create chaos. The campaigns differed from the 2014 Crimea campaign in several ways. First, 2014 saw the use of applications such as Masss Post to facilitate the coordination of several tweets at once. The campaigns in the US also had no factual basis; they were created from fiction, and without any truth to the information campaigns, they either failed within a few hours or failed to take off at all. In comparison, the Crimea campaign was supported by several co-occurring media campaigns, online and through traditional media outlets such as newspapers and television. In 2015, the IRA took a different approach to the information war against the US, moving away from applications such as Masss Post and attempting to embed an operative in an America-centric online food forum, but this too proved unsuccessful. Given the IRA’s increase in user interaction by the end of 2015, it might have been expected that the Koch campaign would cause more havoc than it did, which suggests that something was still missing from the IRA’s information campaign formula. It is the premise of this chapter that the lack of success of the Koch Turkey Farm campaign occurred for several reasons. Firstly, as already highlighted, the food poisoning was not reported by other media channels; spreading disinformation over a multitude of channels is a tool that was used frequently in Crimea, as demonstrated in the previous chapter. Secondly, when examining Paul and Matthews’ (2016) components of information campaigns, an element appears to be missing: the element of truth. For a rumour to sustain itself, there needs to be some truth to it, as seen in Chapter Five’s discussion of conspiracy theories; an element of truth can throw doubt on the falsehood of a story. Thirdly, the IRA operatives used throughout the US disinformation campaigns do not appear to have embedded themselves into trusted networks.

For users to gain the support of other users within a social network, trust needs to form. Trust is built by sharing everyday experiences, speaking a common language and being involved in online communities which support the values to which a member may subscribe. In Crimea in 2014, the operatives, although part of the IRA, were of Russian origin; they spoke Russian and would have been up to date with news and current affairs in Russia and Ukraine. As demonstrated, during the Crimea campaign operatives engaged in conversations specific to Russia and Russian culture. Further, operatives focused on communicating with other Russians, or Ukrainians who either identified as Russian or at the very least spoke Russian, as evidenced in the tweets examined in Chapter Six. The 2014 and 2015 tweets, however, showed no such activity. The operatives had not embedded themselves into any subcultures and communities within the US; the first attempt to do so, through Alice Norton, was unsuccessful. What is unclear is whether the IRA had intentionally embedded themselves into Russian and Ukrainian communities or whether this had occurred naturally, meaning the tactic had to be learned through trial and error once the US campaigns began. Or was this a tactical move? That is, were the campaigns meant to be believed, or were they merely a distraction so that other IRA members could embed themselves into US-centric echo chambers? By distracting the world with flawed disinformation campaigns, IRA operatives could move into US-centric echo chambers, such as those supporting #blacklivesmatter, with little suspicion.

One last defining factor to be mentioned between the 2014 Crimea campaign and the 2014 and 2015 US campaigns is the number of followers the IRA operatives for each campaign had. There was a significant difference between the top ten accounts with the most followers in Crimea and the 2014 US accounts. As a result, there was substantially more engagement in the Crimea campaign in terms of likes, retweets and replies. This in turn would have helped the IRA campaign reach a greater audience. In comparison, the US campaigns showed very little interaction with non-IRA accounts.

The next chapter will examine the 2016 US presidential election and the IRA campaign that ran throughout the election year.


Table 7.6 Top Ten Follower Accounts for the 2014 Campaigns
(Tweet, retweet, like and reply counts are totals from account creation to the end of 2014.)

| Username | Account Creation Date | User Recorded Location | Language | Followers | Following | Campaign | Tweets | Retweets | Likes | Replies |
| AndyHashtagger | 26/10/2014 | I AM A CITIZEN OF THE UNIVERSE | English | 22,026 | 12,966 | Ebola | 465 | 125 | 177 | 97 |
| DallasTopNews | 05/07/2014 | , Texas | English | 24,696 | 7,388 | CCC | 77 | 0 | 0 | 0 |
| DickyIrwin | 10/06/2014 | USA | English | 5,986 | 4,850 | CCC | 3 | 0 | 0 | 0 |
| digestlj | 25/12/2013 | Санкт-Петербург [St Petersburg] | Russian | 16,390 | 1,200 | CCC | 5,643 | 995 | 5 | 12 |
| DominicValent | 31/05/2014 | US | English | 5,116 | 4,136 | CCC | 234 | 1 | 1 | 0 |
| KathieMrr | 29/05/2014 | Atlanta | English | 6,709 | 5,606 | Ebola | 1,212 | 8 | 11 | 6 |
| LoraGreeen | 28/05/2014 | New-York | English | 7,446 | 7,377 | Shooting | 358 | 4 | 6 | 2 |
| NewOrleansON | 05/05/2014 | New Orleans | English | 35,988 | 11,010 | CCC | 472 | 2 | 6 | 0 |
| pureDavie | 18/06/2014 | London, UK | English | 4,483 | 3,281 | Ebola | 25 | 0 | 0 | 0 |
| Sinigopajuby | 21/12/2013 | Moscow [Москва] | English | 16,391 | 1,200 | CCC | 5,283 | 751 | 12 | 4 |

Table 7.7 – Top 10 Followed Accounts in Crimea Campaign
(Tweet, retweet, like and reply counts are totals from account creation to the end of 2014.)

| Username | Account Creation Date | User Recorded Location | Language | Followers | Following | Tweets | Retweets | Likes | Replies |
| coldwar20_ru | 19/2/2014 | Россия, Москва | Russian | 53,175 | 5,434 | 16,214 | 390,797 | 69,634 | 38,101 |
| NovostiSPb | 29/02/2012 | Saint Petersburg, Russia | Russian | 149,672 | 1,024 | 28,710 | 23,368 | 1,796 | 1,713 |
| KadirovRussia | 29/12/2011 | Blank | Russian | 123,989 | 10 | 5,154 | 264,362 | 72,577 | 58,984 |
| Ruopentwit | 17/06/2014 | Russia | Russian | 35,736 | 409 | 2,758 | 6,433 | 271 | 296 |
| byDrBre | 22/07/2014 | Blank | Russian | 43,504 | 88 | 1,653 | 13,038 | 832 | 505 |
| ComradZampolit | 03/03/2013 | Москва (СССР—Россия) | Russian | 48,912 | 1,797 | 615 | 3,202 | 173 | 307 |
| Jenn_Abrams | 29/10/2014 | USA | English | 79,152 | 22,607 | 200 | 57 | 41 | 27 |
| LavrovMuesli | 21/07/2014 | Blank | Russian | 84,642 | 2,575 | 1,707 | 11,369 | 1,054 | 1,508 |
| MaryMozhaiskaya | 10/08/2014 | Blank | Russian | 37,362 | 2,617 | 5,439 | 3,681 | 1,237 | 2,719 |
| MaxDementiev | 14/09/2014 | Blank | Russian | 134,805 | 2,796 | 8,076 | 2,545 | 784 | 4,521 |

Table 7.10 – Top Ten Followed Accounts in Koch’s Turkey Farm Campaign
(Tweet, retweet, like and reply counts are totals from account creation to the end of 2014.)

| Username | Account Creation Date | User Recorded Location | Language | Followers | Following | Tweets | Retweets | Likes | Replies |
| _Billy_Moyer_ | 16/05/2014 | USA | English | 466 | 445 | 844 | 41 | 19 | 10 |
| Israel_Wills | 28/08/2014 | USA | English | 1,213 | 1,617 | 808 | 22 | 22 | 5 |
| DesertQueenOh | 30/05/2014 | New York | English | 139 | 224 | 275 | 2 | 4 | 1 |
| Gibbs_Jeremy_ | 04/06/2014 | USA | English | 337 | 356 | 631 | 61 | 32 | 8 |
| Jon_Underwood_ | 20/06/2014 | USA | English | 333 | 279 | 521 | 21 | 24 | 5 |
| jrrbrtt | 11/03/2015 | Blank | English | 443 | 370 | 617 | 46 | 30 | 1 |
| kennygratham | 08/08/2013 | Blank | English | 119 | 122 | 308 | 4 | 1 | 1 |
| MarkMcGregory | 18/05/2014 | USA | English | 301 | 261 | 755 | 9 | 12 | 2 |
| PatriotRaphael | 07/06/2014 | USA | English | 1,467 | 1,527 | 763 | 25 | 15 | 6 |
| Roscoe_Riddle | 20/06/2014 | USA | English | 352 | 284 | 516 | 33 | 18 | 3 |
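The rankings in the tables above can be reproduced by a simple sort-and-sum over the account records. The sketch below uses a hand-copied three-row subset of the data purely to illustrate the method:

```python
# Sketch of deriving the "top followed accounts" tables: filter a
# campaign's accounts, rank by follower count, and sum engagement.
# The three records below are a small subset of the tables above.
accounts = [
    {"username": "coldwar20_ru", "campaign": "Crimea", "followers": 53175,
     "retweets": 390797, "likes": 69634, "replies": 38101},
    {"username": "NewOrleansON", "campaign": "CCC", "followers": 35988,
     "retweets": 2, "likes": 6, "replies": 0},
    {"username": "_Billy_Moyer_", "campaign": "Koch", "followers": 466,
     "retweets": 41, "likes": 19, "replies": 10},
]

def top_by_followers(rows, campaign, n=10):
    """Select one campaign's accounts and rank them by followers."""
    subset = [r for r in rows if r["campaign"] == campaign]
    return sorted(subset, key=lambda r: r["followers"], reverse=True)[:n]

def engagement(rows):
    """Total retweets, likes and replies across a set of accounts."""
    return sum(r["retweets"] + r["likes"] + r["replies"] for r in rows)

crimea_top = top_by_followers(accounts, "Crimea")
koch_top = top_by_followers(accounts, "Koch")
print(engagement(crimea_top), engagement(koch_top))
```

Run over the full tables, the same comparison yields the engagement gap noted in the conclusion: the Crimea accounts dwarf the US campaign accounts in retweets, likes and replies.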


Chapter Eight – The 2016 US Presidential Election

8.1 Introduction

There is a long history of interference between the US and Russia, with both countries attempting to influence the other through social and political narratives. Examples include Project Troy, in which the US spread pro-democracy messages behind the metaphorical Iron Curtain, and Soviet attempts to influence American presidential elections in favour of candidates perceived as sympathetic to Soviet interests. In what appears to be a replication of the 2014 Crimea campaign, the 2016 US presidential campaign was similarly targeted by a Russian disinformation campaign carried out across multiple platforms and involving numerous resources. The campaign included the hacking of the Democratic National Committee (DNC) and Hillary Clinton’s campaign; favourable reporting on presidential nominee Donald Trump by state-funded media outlets such as Sputnik and RT; an increase in anti-American rhetoric; the utilisation of the troll farm the Internet Research Agency (IRA); and social media posts on multiple platforms, including Facebook, Instagram and Twitter.

Several studies analysing Twitter troll farm activity have been undertaken since 2016. Examples include Linvill, Boatwright, Grant, and Warren (2019), who concluded that the IRA was successful in sowing discord in the 2016 election. In contrast, Bail et al. (2020) suggest that the IRA was unsuccessful in sowing discord, as it targeted echo chambers whose members already held beliefs consistent with Russian objectives. Moving away from the question of whether the IRA successfully influenced the election, this chapter will demonstrate that the IRA’s tactics changed significantly compared to previous disinformation and propaganda campaigns. Instead of creating disinformation and propaganda to suit a specifically created Russian government narrative, the IRA relied on stories already in existence and circulating on social media networks (SMNs) to carry out an information campaign in an attempt to influence the 2016 election results. Examples include rumours about Hillary Clinton’s declining health, the 2012 Benghazi attack, which killed four US citizens, and the subsequent scandal surrounding Hillary Clinton’s email server. As will be demonstrated, the IRA also participated in circulating news stories and allegations concerning Bill Clinton’s affair with intern Monica Lewinsky between 1995 and 1997, while Bill Clinton was President, as well as allegations concerning the alleged sexual assault of Juanita Broaddrick in 1978 by Bill Clinton. This shift by Russian operatives on Twitter, from creating their own content, as seen in the IRA campaigns targeted at the US in 2015, to utilising existing material, demonstrates a significant change in Russian disinformation campaigns.

8.2 A History of Interference

Since at least the beginning of the Cold War, Russia and the US have supported different ideologies and alliance systems, dividing the world into sides while carrying out noisy propaganda campaigns against one another (Paterson & McMahon, 1999). After World War II, the US began to invest in studies of ‘soft power’, the capability of changing a person’s mind without the need for violence, which Nye (2004) describes as an instrument of co-operation. For example, in 1950, the Massachusetts Institute of Technology was asked to find a way to get ‘the truth’ to those living behind the Iron Curtain. In response, Project Troy was deployed by the US government and involved “broadcasts and balloons, […] journals, student exchanges, travellers, and movies to get U.S. information to Central and Eastern Europe” (L. Robinson et al., 2018, p. 19). Between 1951 and 1956, the campaign reportedly released 350,000 balloons carrying approximately three hundred million leaflets and posters of friendship (L. Robinson et al., 2018).

Just as the US had attempted to influence the Soviet Union through soft power, so too did the Soviet Union attempt to influence the US. The Soviet government, for instance, attempted to influence US presidential elections by supporting the presidential candidate whose policy was perceived as most favourable or beneficial towards the communist way of life. For example, during the 1960 Democratic primary, Adlai Stevenson campaigned to ban testing of the hydrogen bomb (Daley, 2017a). In response, the Soviet Ambassador to the United States, Mikhail Menshikov, offered Stevenson support on behalf of Soviet Premier Khrushchev and ‘the Presidium’ if Stevenson were to run against Kennedy in the Democratic primary. Stevenson declined the offer and did not run in the primary against Kennedy (Daley, 2017a). According to a memorandum Stevenson wrote on the subject, the Russian Ambassador stated:

In Russia, we know well Mr. Stevenson and his views regarding disarmament, nuclear testing, peaceful coexistence, and the conditions of a peaceful world. He has said many sober and correct things during his visit to Moscow and in his writings and speeches. When we compare all the possible candidates in the United States, we feel that Mr. Stevenson is best for mutual understanding and progress toward peace. […] We believe that Mr. Stevenson is more of a realist than others and is likely to understand Soviet anxieties and purposes (qt. in Daley, 2017a, p. 1).

Eight years later, in 1968, Hubert Humphrey received a similar offer from the Soviet Ambassador to the United States, Anatoly Dobrynin. Humphrey was offered secret funding by the USSR to campaign against Richard Nixon; he too reportedly declined the offer (Doran, 2017).

By 1976, the USSR appeared to have adopted a new approach to influencing US elections: discrediting Henry ‘Scoop’ Jackson, a vocal anti-communist Democrat. Jackson was running for President of the United States on the back of Republican President Nixon’s resignation (Doran, 2017). Due to his anti-Soviet stance, the KGB began operation POROK, which involved several attempts to discredit Jackson as a Zionist and a homosexual, the latter a criminal offence at the time. The campaign involved forging FBI documents and sending copies to major newspapers, as well as to magazines such as Playboy and Penthouse, and even to Jackson’s opponent, Jimmy Carter. When Jackson went on national television to speak out against homosexuality, the KGB once more forged documents, this time stating that Jackson was part of a gay sex club.

By the 1980s, concern was growing regarding Soviet interference in US presidential elections. In response, in 1984 Bill Casey, Director of Central Intelligence, wrote a memorandum requesting resources to undertake a study into the Soviet Union’s influence on past and present US elections. The memo stated that “after years of intense efforts […] the Soviet grasp of the US political system is better than ever. Hence, the Soviet capacity for influencing votes is higher” (Central Intelligence Agency, 1984, p. 1). Later that year, the Kremlin commenced an active measures campaign to discredit presidential nominee Ronald Reagan, promoting the slogan ‘Reagan Means War’ and presenting Reagan as an aggressor who wanted a war (Reiss, 2019). According to The Economist (2016), “Russia propagated stories about Reagan’s militaristic adventurism, rising tensions among NATO allies, discrimination against ethnic minorities and corruption”. However, with Reagan winning the election in a landslide, the campaign had little to no negative effect on Reagan’s career or his credibility.


Until 2016, Russia does not seem to have made any further attempts to influence US elections. As Bastos and Farkas (2019, p. 1) write, the “aftermath of the Cold War was […] marked by a declining trend in information warfare between enemy states”. However, this ceasefire appears to have ended with interference by the US in the 1996 Russian elections, when the US supported Russian President Boris Yeltsin by endorsing an International Monetary Fund loan of $10.2 billion to Russia. A caveat to the loan was that the fund would cut off the money if a Communist party member were to come to power once more in Russia or if Russia abandoned democratic reforms. The announcement of the loan came at the end of Yeltsin’s first presidential term, just before the 1996 election, leading to a growth in support for Yeltsin among the Russian people (Goldgeier & McFaul, 2003). It should also be noted that Yeltsin’s primary opponent was a Russian Communist politician who had led the Russian Communist Party since 1993. As Goldgeier and McFaul (2003) explain, in 1996 the US saw Yeltsin as the lesser of two evils in the Russian race, so it was in President Bill Clinton’s and the US’ best interest to have Yeltsin win.

8.3 The 2016 US Presidential Election

Unlike previous Russian interference in US elections, the 2016 US presidential election revealed, on the surface, an alarming and more sophisticated interference campaign. In November 2017, members of the US Senate Judiciary Subcommittee on Crime and Terrorism commented on the success of the recent Russian campaign to manipulate the 2016 US election results in favour of President-elect Donald Trump. The ranking Democrat on the committee, Senator Mark R Warner, announced, “if you look back at the results, […] it’s a pretty good return on investment” (Timberg et al., 2017, p. 1). In January 2017, a joint report released by the US Central Intelligence Agency, the Federal Bureau of Investigation and the National Security Agency had already confirmed that “Moscow’s influence campaign followed a Russian messaging strategy that blend[ed] covert intelligence operations” (Intelligence Community Assessment, 2017, p. ii). The activities, described below in detail, included cyber activity, state-funded media, anti-American rhetoric, IRA trolling and social media posts. The campaign also targeted America’s ‘purple states’, US states that swing between the Republican Party (red) and the Democratic Party (blue) during elections (Intelligence Community Assessment, 2017).


8.3.1 Cyber-Activity

In September 2015, the Democratic National Committee (DNC) IT Department received a telephone call from the FBI informing them that Russian hackers had compromised at least one of their computers. A quick scan undertaken by a computer technician revealed nothing amiss, so the phone call from the FBI was not escalated and was presumably forgotten (CNN Library, 2018). Two months later, a further phone call from the FBI informed the DNC IT Department that the DNC was now transmitting information back to Russia; once more, the warning was not escalated or pursued (CNN Library, 2018). On 19 March 2016, John Podesta, the Chairman of Hillary Clinton’s presidential campaign, received a phishing email40 disguised as an email from Google. Podesta, doubting the origins of the email, forwarded it to his IT Department to check its authenticity. However, the IT Department reportedly mistyped its response, confirming the email as legitimate rather than illegitimate. Podesta then followed the link and, in doing so, gave malicious actors full access to his email (Sciutto, 2017). Two Democratic computer systems were now breached: that of the DNC and that of the Clinton campaign (Sciutto, 2017).

A month later, in April 2016, the DNC discovered the breach for themselves, informing the FBI and hiring the cybersecurity firm CrowdStrike to eradicate the intruders (Nakashima, 2016). By this time, however, the DNC had been compromised for almost a year (Nakashima, 2016). A hacker calling themselves Guccifer 2.0 would be held responsible for the hack on the DNC and the subsequent dissemination of stolen emails and material throughout the 2016 US election (Price & Sheth, 2018). Although Guccifer 2.0 denied ties to Russia, the US intelligence community concluded that “based on digital fingerprints left on hacks targeting Democrats during the 2016 presidential campaign […] the cyberattacks against the Democratic National Committee and members of Hillary Clinton’s campaign were largely, if not entirely, carried out by Russian intelligence group [sic]” (Price & Sheth, 2018, p. 1). Guccifer 2.0 had been identified as a Russian asset. The stolen information would later be turned over to WikiLeaks and slowly leaked throughout the last stages of the presidential election campaign to discredit Hillary Clinton.

40 A phishing email is a fraudulent email used to trick the recipient into giving away online credentials, financial details or other sensitive information.


8.3.2 State-Funded Media

Sputnik and RT, examined in previous chapters, are examples of Russian state-funded media outlets targeting global audiences. Both entities, according to the Intelligence Community Assessment (2017, p. 4), “consistently cast President-elect Trump as the target of unfair coverage from traditional US media outlets”. In contrast, coverage of Hillary Clinton focused on leaked emails, accusations of poor physical and mental health, corruption and unfavourable material discovered and revealed by WikiLeaks, as described above. One of RT’s most popular videos, reaching nine million views on social media, accused Hillary Clinton of embezzling money intended for charity41 (Intelligence Community Assessment, 6 January 2017).

8.3.3 Growth of Anti-American Rhetoric

In addition to rallying internal support for Putin’s government, research undertaken by Gerber and Zavisca (2015) reported that from 2014 an increase in anti-American rhetoric could be seen within Russian troll farms (although this may be seen as part of a broader strategy of rallying support for Putin by demonising foreigners; see, for example, Kuchin, 2012). The Strategic Communications Centre of Excellence in Latvia, set up by NATO to investigate and counter Russian propaganda, confirmed this increase in 2015, announcing that the “Russian government is pursuing a conscious strategy of swaying public opinion in its favour and against the United States and its NATO allies both domestically and abroad” (Gerber & Zavisca, 2016a, p. 80). Kuchin (2012, p. 2) asserts, however, that Putin’s anti-American rhetoric is merely a ‘domestic political tool’. Labot (2011), in contrast, suggests that Hillary Clinton’s call for a ‘full investigation’ after the re-election of Vladimir Putin in December 2011 resulted in Prime Minister Putin accusing Hillary Clinton and the West of attempting to influence the Russian elections (Labot, 2011, p. 1).

As previously discussed (see Chapter Five), the colour revolutions had a knock-on effect in Russia, where the 2011 elections saw public protests by Russian citizens after allegations of ballot-box stuffing at Russian voting locations. In response, Clinton announced that “the Russian people, like people everywhere, deserve the right to have their voices heard and their votes counted […] and this means they deserve free, fair, transparent elections and leaders who are accountable to them” (qt. in Shuster, 2016). According to Shuster (2016), Putin took significant offence at this statement. Putin also took it as a sign that the US was attempting to manipulate Russian election results, as it had in 1996 with Yeltsin. As Herszenhorn and Barry (2011) reported in The New York Times at the time of the protests, Mr Putin claimed that Clinton had sent “a signal” to actors within Russia, which led to protestors chanting “Putin is a thief” and “Russia without Putin”. Putin went on to claim that the West was investing hundreds of millions of dollars to influence politics in Russia. In response, the White House press secretary, in support of Clinton, reiterated Clinton’s concerns, stating, “When rights are violated in Russia or another country, we speak out” (qt. in Herszenhorn & Barry, 2011).

41 The YouTube version of the video may be found at https://presidential.youtube.com/watch?v=-YC5f1YjMzU.

8.3.4 The Internet Research Agency

The US Intelligence Community Assessment (2017) concluded that Russian President Vladimir Putin ordered an influence campaign intended to disrupt the US election, carried out by the IRA42. In 2018, the United States District Court for the District of Columbia released an indictment stating that the IRA was formed with the specific purpose of waging “information warfare against the United States of America [using] fictitious US personas on social media” (United States District Court for the District of Columbia, 2018, p. 6). The IRA’s primary goal was to spread “distrust towards the candidates and the political system in general” (United States District Court for the District of Columbia, 2018, p. 6). However, as the previous chapters have demonstrated, the IRA’s focus extended well beyond the US: the IRA had targeted citizens of Russia and its neighbouring countries long before it set its sights on the US.

The indictment against the IRA named thirteen employees and their combined failure to disclose their involvement in US domestic activities, namely interfering with the 2016 US presidential election. Evidence in support of the indictment includes a 2016 IRA internal memorandum discovered during the US investigation. The memo instructs the defendants to focus on US politics, using any opportunity to criticise Hillary Clinton and the other candidates, excluding Sanders and Trump. It is unknown whether the memorandum was dated before or after Hillary Clinton won the primaries; it is therefore unclear whether the support for Sanders was intended to undermine Hillary Clinton’s chances of winning the primaries or to undermine the validity of her subsequent win over Sanders. In response to this memorandum, the indictment states that the defendants impaired, obstructed and defeated

42 See also Glaser (2018)

“the lawful government functions of the United States by dishonest means to enable the Defendants to interfere with US political and electoral processes” (United States District Court for the District of Columbia, 2018, p. 12). The thirteen employees of the IRA were accused of violating rules enforced by several US federal regulatory agencies, including the Federal Election Commission (FEC), which administers the Federal Election Campaign Act (FECA) prohibiting contributions, expenditures, independent expenditures, or disbursements for election communications by foreign nationals.

The indictment also cites the US Department of Justice’s Foreign Agent Registration Act (FARA), which manages “the registration, reporting, and disclosure regime for agents of foreign principals […] so that the US government and the people of the United States are informed of the source of information and the identity of persons attempting to influence US public opinion, policy, and law” (United States District Court for the District of Columbia, 2018, p. 11). The third agency the indictment refers to is the US Department of State, which is responsible for issuing non-immigrant visas to foreign individuals. The indictment alleges that some defendants travelled to the US for the sole purpose of reconnaissance work, to gauge the political atmosphere of the US at that time, and were accordingly accused of lying on their visa applications about the true nature of their trip to America. The thirteen defendants were also charged with purchasing space on US computer servers and with using stolen US identities to operate PayPal43 accounts and accounts with other digital payment service providers. The defendants also created false identification documents and posted on social media platforms while using these documents. The employees of the IRA not only posed as US citizens and entities but also stole the identities of US citizens to operate social media accounts designed to attract a US audience.

8.3.5 Social Media Posts According to the United States District Court for the District of Columbia (2018), the IRA manipulated social media platforms such as Facebook, Twitter and Instagram in an attempt to influence the election. The IRA purchased an estimated 3,000-plus advertisements either supporting President-elect Trump or opposing presidential nominee Hillary Clinton. More than 10 million Americans viewed the ads purchased by the IRA (Intelligence Community Assessment, 2017). In addition, personal Facebook posts disseminating similar

43 PayPal is a payment service that allows consumers to make financial transactions online (Hsiao, 2020).

messages to the purchased advertisements reached approximately 126 million Americans. Employees of the IRA, acting as US citizens or entities, were held responsible for these posts by the US Intelligence Community Assessment (2017) (Timberg et al., 2017). The Russian troll farm assault was not restricted to Facebook; it occurred across multiple platforms, including Twitter and Instagram (Intelligence Community Assessment, 2017).

8.3.6 Purple States In addition to the above activity, IRA operatives also posed online as US political volunteers to gain inside information on how to target US voters effectively (David Lee, 2018). During one of these exercises, a Texas-based volunteer informed an IRA worker that campaigning needed ‘to be aimed’ at purple states44 as that was where voting would be ‘tight’ (David Lee, 2018). The IRA took heed, as the indictment specifically mentions the IRA’s focus on purple states such as Colorado, Virginia and Florida. In focusing on purple states, the indictment alleges that IRA employees created organisations and social media campaigns on social media platforms to target specific groups, such as:
• Secure Borders, which focused on immigration;
• Blacktivist, which targeted the Black Lives Matter movement;
• United Muslims of America and Army of Jesus, which covered religion; and
• South United and Heart of Texas, which targeted geographical issues.

8.4 Previous Studies Several studies have been conducted on the IRA’s Twitter activity. One such study by Linvill et al. (2019) examined two narratives: the first, the IRA’s role in sowing discord and chaos during the 2016 election; the second, the IRA’s purported support of Donald Trump. The study found evidence to suggest both narratives were true and not mutually exclusive. As Linvill et al. (2019, p. 298) conclude, “accounts disguised as U.S. citizens infiltrated normal political conversations and inserted false, misleading, or sensationalized information. These practices create an existential threat to the very democratic ideals that grant the electorate confidence in the political process.” According to Linvill et al. (2019), by focusing attention and support on Donald Trump, the IRA destabilised the US’

44 Purple states refer to those US states that have historically not demonstrated consistent support for either the blue (Democratic) or red (Republican) party. Purple states are also known as swing states, suggesting that they swing between red and blue each election (United States District Court for the District of Columbia, 2018).

authentic political discourse. A further study by Stewart, Arif, and Starbird (2018) performed a network analysis on identified IRA Twitter accounts disseminating data on US shooting events and the hashtags #BlackLivesMatter, #BlueLivesMatter and #AllLivesMatter. Their research demonstrated that the IRA utilised a significant level of polarising discourse to propel issues into the public eye. Further, their research claimed that filter bubbles, or echo chambers, played a significant role in the information flow, “likely serving to accentuate disagreement and foster division [and a] calculated form of that exploits the crowd-sourced nature of social media” (Stewart et al., 2018). Bail et al.’s (2020) study also concluded that IRA operatives interacted with individuals who were already significantly polarised.

However, Bail et al.’s (2020) study found that because the IRA interacted with specific echo chambers focused explicitly on US politics, the trolls failed to sow discord: the operatives were mostly interacting with individuals who already held the same or similar polarised beliefs, such as strong political views, within Twitter networks marked by strong ideological homophily. Further, the study concluded that it was not possible to determine systematically whether the IRA influenced public attitudes or behaviour during the 2016 US election. One reason why IRA operatives may have embedded themselves in politically focused echo chambers is discussed by Howard et al. (2018), who claimed that the IRA aimed to influence not only the attitudes of American voters but also their political behaviours. This perspective is supported by Bail et al. (2020, p. 244), who provide the example of the IRA’s attempt to “demobilize African-American voters by spreading negative messages about Hillary Clinton before the 2016 US presidential election”.

8.5 Twitter Analysis In response to Bail et al.’s (2020) study, a review of Twitter data was undertaken as part of this research project. The review demonstrated that the IRA had begun to utilise #BlackLivesMatter in 2015. Upon further inspection, three further topics (Benghazi, Hillary Clinton’s email scandal, and allegations of fraudulent behaviour by The Clinton Foundation) were also identified in the 2015 tweets, with all four issues carrying through to the 2016 election. Graph 8.1 provides a visual representation of the use of these topics throughout 2015.


[Graph: monthly counts of IRA tweets, January to December 2015, on four topics: Benghazi, Black Lives Matter, Email Scandal and Foundation Scandal; vertical axis 0–1,000 tweets.]

Graph 8.1 – 2015 IRA Activity

As demonstrated by the graph, IRA tweets concerning Benghazi and the Black Lives Matter movement began as early as January 2015. Tweets on the email scandal started in March 2015, which matches the timeline of Hillary Clinton being subpoenaed for her Benghazi-related emails (Hicks, 2016). With regard to the scandal around the Clintons’ foundation, reports concerning the allegations began to surface in February 2015; however, a jump in traffic around this topic does not occur until May, peaking in June and then reducing in August 2015. Particularly notable is the number of tweets that IRA operatives generated on Twitter regarding the hashtag #BlackLivesMatter in 2015. The Black Lives Matter movement began in 2013; however, it did not start to gain traction in the US until 2014, with the killing of Michael Brown by Ferguson police. When looking for a correlation between the July 2015 spike in Black Lives Matter tweets and stories in the news at the same time, the spike appears to correlate with the death of Sandra Bland, an African American woman found hanged in her jail cell in Waller County, Texas (Toronto Sun, 2015). The focus on the four headings demonstrates that in 2015 the IRA were experimenting with two different social media attacks simultaneously: the first involved the made-up stories illustrated in Chapter Seven; the second involved inserting operatives into echo chambers specific to US politics at the time, as demonstrated above in Graph 8.1. Alternatively, the IRA were creating noise through the campaigns discussed in Chapter Seven in order to distract from operatives penetrating US-centric echo chambers.
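The monthly topic counts summarised in Graph 8.1 can be reproduced with a script of the following shape. This is a minimal sketch only: the field names ‘tweet_time’ and ‘tweet_text’, and the keyword lists for each topic, are illustrative assumptions rather than the exact fields or search terms used in this project.

```python
from collections import Counter

# Illustrative keyword map for the four 2015 topics shown in Graph 8.1.
TOPICS = {
    "Benghazi": ["benghazi"],
    "Black Lives Matter": ["blacklivesmatter", "black lives matter"],
    "Email Scandal": ["email", "e-mail", "server"],
    "Foundation Scandal": ["foundation", "donor", "charity"],
}

def monthly_topic_counts(rows):
    """Count tweets per (month, topic) for 2015.

    `rows` is an iterable of dicts with 'tweet_time' (e.g. '2015-07-13 ...')
    and 'tweet_text' keys -- assumed field names, not a documented schema.
    """
    counts = Counter()
    for row in rows:
        time, text = row["tweet_time"], row["tweet_text"].lower()
        if not time.startswith("2015"):
            continue  # Graph 8.1 covers 2015 only
        month = time[:7]  # 'YYYY-MM'
        for topic, keywords in TOPICS.items():
            if any(k in text for k in keywords):
                counts[(month, topic)] += 1
    return counts
```

Plotting the resulting counts per month, one line per topic, yields a chart of the same form as Graph 8.1.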


In August 2020 a news site reported on leaked audio from 2016 of President-elect Trump speaking to civil rights leaders just days before his inauguration. In the audio Trump may be heard stating, “Many Blacks didn’t go out to vote for Hillary ‘cause they like me. That was almost as good as getting the vote, you know, and it was great” (Trump qt in McCaskill, 2020). The report goes on to state that a low turnout of Black voters in key US states was the primary reason Hillary Clinton lost the 2016 US presidential election to Trump. Further research is required to establish whether the IRA played a significant role in influencing the Black vote during the 2016 election, including correlating the participation of Black Americans with IRA activity and engagement.

8.6 Russian Troll Farm Activity According to the 2018 indictment, the primary influence campaign undertaken by the IRA began on the 6th of April 2016 and continued until after the November 2016 election. The campaign focused on buying advertisements on social media, organising rallies online, spreading hashtags, operating social media accounts, and posting to social media groups. The indictment also states that all the IRA activity was either in support of President-elect Trump or against Hillary Clinton (United States District Court for the District of Columbia, 2018). One of the first activities undertaken as part of this research project was an examination of whether mainstream media had picked up the 2016 IRA propaganda campaigns. What was discovered instead was that the propaganda campaigns identified as being undertaken by the IRA were based on existing data. In some instances, Russian media campaigns continued rhetoric already seen in the media to derogate Hillary Clinton. In other cases, the Russian media campaign appeared to attempt to circumvent negative press associated with President-elect Trump. As will be discussed further in this chapter, instances of IRA support for Hillary Clinton were also discovered. Table 8.1 provides the results of this research. The first two columns represent the date and an issue as reported in mainstream media or through other means, such as blogs. The next two columns demonstrate the IRA campaigning on the same issue after the fact, a tactic identified throughout this research project and, to date, not identified in any other research.


Table 8.1 – Examples of the IRA campaign regurgitating ideas already in circulation

The African American Vote
Previous reporting on the issue:
• 10/02/16 – A report claimed that Hillary Clinton did not ‘deserve the black vote’. This was based on Bill Clinton’s legacy, the 1994 Crime Bill, which was supported by Hillary Clinton at the time and led to higher incarceration rates of black people (Alexander, 2016).
• 11/03/16 – A Trump rally was cancelled after riots became a security threat. The group Black Lives Matter was identified as one of the major organisers of the protest (Bellware, 2016).
IRA social media campaign:
• 6/4/16 – IRA bought advertisements on social media and/or online sites: “You know, a great number of black people support us saying that #HillaryClintonIsNotMyPresident.”
• 24/5/16 – IRA bought advertisements on social media and/or online sites: “Hillary Clinton Doesn’t Deserve the Black Vote.”
• 16/10/16 – IRA operatives posted on the Instagram account ‘Woke Blacks’: a “particular hype and hatred for Trump is misleading the people and forcing Blacks to vote Killary. We cannot resort to the lesser of two devils. Then we’d surely be better off without voting AT ALL.”
• 3/11/16 – IRA operatives posted on the Instagram account ‘Blacktivist’ a message that read in part: “Choose peace and vote for . Trust me, it’s not a wasted vote.”


Hillary’s Credibility
Previous reporting on the issue:
• 2015 – While Secretary of State, Hillary Clinton set up her own private email server. In 2015 the news that she used her own server hit mainstream media, and an investigation was undertaken to discover whether she had engaged in criminal conduct. The discovery occurred after she was requested to hand over her email correspondence regarding her involvement in the Benghazi incident, in which four Americans were killed (Zurcher, 2016).
• 27/6/16 – Bill Clinton and Attorney General Loretta Lynch had a private chat at an apparent chance meeting in Phoenix, sparking concern that they spoke about the current investigation by the US Justice Department into the private email server created by Hillary Clinton while she was leading the State Department; the meeting was reported on 29/6/16 (Watkins, 2016).
• 30/6/16 – The NRA put together an advertisement for Trump; it does not involve guns, but rather a survivor of Benghazi asking the nation to vote for Trump and not Hillary (Fishel & Stracqualursi, 2016).
• 8/7/16 – Judicial Watch released a statement regarding the events of the 27th of June: “Attorney General Lynch’s meeting with President Clinton creates the appearance of a violation of the law, ethical standards and good judgement” (Judicial Watch, 2016, p. 1).
• 18/7/16 – Political commentator, cartoonist and columnist Ted Rall wrote a commentary entitled “Hillary Cheated” in which he claimed the “National Committee cheated in favor of Clinton and against Sanders. They broke the law. They disenfranchised voters. They broke party rules. And they violated long-standing customs that are so widely accepted that they are essential de facto rules of the Democratic Party and the American political system” (Rall, 2016, p. 1). This sentiment appears to have been picked up on the 4th of August in IRA-led social media campaigns (see below under Iowa).
• 18/7/16 – The GOP convention began, with Republican Party debate as to whether Hillary Clinton should be executed or locked up for her crimes. The allegations stem from Benghazi, the 11th of September 2012 terrorist attack in Libya that killed four Americans (Milbank, 2016).
IRA social media campaign:
• 7/4/16 – IRA bought advertisements on social media and/or online sites: “I say no to Hillary Clinton / I say no to manipulation.”
• 19/4/16 – IRA bought advertisements on social media and/or online sites: “JOIN our #HillaryClintonForPrison2016”.
• 30/6/16 – IRA bought advertisements on social media and/or online sites: “#NeverHillary #HillaryForPrison #Hillary4Prison #HillaryForPrison2016 #Trump2016 #Trump #Trump4President”. This appears to be in response to the events of the 27th of June.
• 10/8/16 – IRA bought advertisements on social media and/or online sites: “We cannot trust Hillary to take care of our veterans!”
• 20/8/16 – The IRA organised a series of coordinated rallies entitled “Florida goes Trump”. This included buying Facebook advertising prior to the event and hiring equipment and people to coordinate it, including having a person dress up as Hillary Clinton in an orange prison suit behind bars.
• 19/10/16 – IRA bought advertisements on social media and/or online sites: “Hillary is a Satan, and her crimes and lies had proved just how evil she is.”

Muslim and anti-Muslim Sentiment
Previous reporting on the issue:
• 1/1/16 – It was reported by Hillary Clinton, and then by mainstream news, that Donald Trump was being used in terrorist propaganda videos (Paletta, 2016).
• 1/7/16 – A group of American Muslims announced plans to march on Washington DC on the 23rd of July 2016 (Hatuqa, 2016).
• 25/7/16 – Khaled A. Beydoun, a law professor and author of American Islamophobia: Understanding the Roots and Rise of Fear, wrote a blog article entitled Muslim Voters Between Hillary Clinton and a Hard Place, suggesting “For Muslim-Americans, to vote for Trump is to do the unthinkable. On the other hand, casting a vote for Clinton means assuming the risks and perils of an expanded ‘war on terror’ at home and abroad” (Beydoun, 2016, p. 1).
IRA social media campaign:
• 10/5/16 – IRA bought advertisements on social media and/or online sites: “Donald wants to defeat terrorism . . . Hillary wants to sponsor it.”
• 9/7/16 – The Facebook group “United Muslims of America” promoted a rally called “Support Hillary. Save American Muslims”. This included paying an American to hold up a sign of Hillary Clinton with the quote, “I think Law will be a powerful new direction of freedom.”
• 26/7/16 – The IRA promoted on the Facebook page United Muslims of America the claim that Muslim voters were “between Hillary Clinton and a hard place.”
• 14/9/16 – The IRA-operated Facebook page ‘Secure Borders’ was told to up its game with regard to criticising Hillary.
• 14/10/16 – IRA bought advertisements on social media and/or online sites: “Among all the candidates Donald Trump is the one and only who can defend the police from terrorists.”
• 8/11/16 – Various social media campaigns of anti-vote messages ran under the group “United Muslims of America”, for example: “American Muslims [are] boycotting elections today, most of the American Muslim voters refuse to vote for Hillary Clinton because she wants to continue the war on Muslims in the middle east and voted yes for invading Iraq.”


Gun Issues
Previous reporting on the issue:
• 7/05/16 – President-elect Trump, during a rally, stated: “Hillary Clinton wants to abolish the Second Amendment. She wants to abolish it. Hillary Clinton wants to take your guns away, and she wants to abolish the Second Amendment. She wants to take the bullets away. She wants to take it” (Rupert, 2016, p. 1).
IRA social media campaign:
• 19/5/16 – IRA bought advertisements on social media and/or online sites: “Vote Republican, vote Trump, and support the Second Amendment!”

Ohio
Previous reporting on the issue:
• 4/7/16 – Mainstream news reported that Trump needed to play catch-up in Ohio (Gabriel, 2016).
IRA social media campaign:
• 20/7/16 – IRA bought advertisements on social media and/or online sites: “Ohio Wants Hillary 4 Prison.”

Iowa
Previous reporting on the issue:
• 2/2/16 – A video appeared showing disorganisation during the Iowa caucus, with onlookers suggesting it was a sign of voter fraud (Bergeson, 2016).
IRA social media campaign:
• 23/7/16 – The IRA organised a rally, “Down with Hillary”. This included advertising the rally on Facebook before the event and hiring equipment and people to coordinate it.
• 4/8/16 – IRA operatives began purchasing advertisements on the Facebook account “Stop A.I.”, alleging that “Hillary Clinton has already committed voter fraud during the Democrat Iowa Caucus.” The IRA also bought advertisements on social media and/or online sites repeating the same allegation.
• 11/8/16 – The IRA purchased advertisements on Facebook to advertise a Florida for Trump rally. It is estimated the ads reached over 59,000 Facebook users in Florida, with 8,300 people clicking on the ads and being directed to the fabricated ‘Being Patriotic’ Facebook page.
• 2/11/16 – IRA operatives began posting allegations via the Twitter account @TEN_GOP that voter fraud was being investigated in North Carolina.
• The IRA circulated the hashtag #VoterFraud against Hillary.


In addition to the above, the IRA also managed a number of general campaigns that first supported, and then opposed, Donald Trump.

General Trump Hype
Previous reporting on the issue: none identified.
IRA social media campaign:
• 7/6/16 – The IRA bought advertisements on social media and/or online sites: “Trump is our only hope for a better future!”
• 25/6/16 – The IRA organised a rally, “March for Trump”. This included buying Facebook advertising prior to the event and hiring equipment and people to coordinate it.
• 12/11/16 – The IRA promoted pro-Trump rallies in New York after the election, called “Show your support for President-elect Donald Trump.”

Posts against Trump
Previous reporting on the issue: none identified.
IRA social media campaign:
• 12/11/16 – The IRA promoted a rally entitled “Trump is not my President” in New York.
• 12/11/16 – The IRA promoted an anti-Trump rally in Charlotte, North Carolina, entitled “Charlotte against Trump.”

8.7 Twitter To understand what the 2016 IRA campaign looked like on Twitter, searches were conducted on a number of key terms and words:


• Hillary/ Clinton
• Trump
• WikiLeaks
• Trump is not my president/ Trumpisnotmypresident
• March for trump/ marchfortrump/ March 4 trump/ March4trump
• Voter Fraud/ VoterFraud
• Ohio
• Down with Hillary
• Hillary for Prison/ Hillary 4 Prison/ HillaryforPrison/ Hillary4prison
• Trump for president/ Trumpforpresident
• Killary
• Hillary is Satan/ Hillaryissatan
• Black lives matter/ Blacklivesmatter
• Benghazi
• Hillary cheated/ Hillarycheated
• Second Amendment/ SecondAmendment
• Muslim
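A search across key terms and their variants can be sketched as follows. The regular expressions shown are illustrative assumptions covering only a handful of the terms above; they are not the exact queries used in this project.

```python
import re

# Each key term expressed as one case-insensitive regex matching its
# spaced, '4'-substituted and hashtag-style variants (e.g. both
# "Hillary for Prison" and "#Hillary4Prison"). Illustrative only.
KEY_TERMS = {
    "Hillary for Prison": re.compile(r"hillary\s*(?:for|4)\s*prison", re.I),
    "March for Trump": re.compile(r"march\s*(?:for|4)\s*trump", re.I),
    "Voter Fraud": re.compile(r"voter\s*fraud", re.I),
    "Killary": re.compile(r"killary", re.I),
}

def match_terms(tweet_text):
    """Return the key terms whose variants appear in a tweet's text."""
    return [term for term, pattern in KEY_TERMS.items()
            if pattern.search(tweet_text)]
```

Running `match_terms` over every tweet and tallying the results produces per-term frequencies of the kind plotted in Graph 8.2.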

The frequency of tweets in 2016 for each of these keywords may be seen in Graph 8.2 below.

[Bar chart of the number of 2016 IRA tweets containing each of the key terms listed above. Counts ranged from single digits for the least-used terms to 46,454 and 75,056 for the two most frequent terms.]

Graph 8.2 – 2016 Key Term and Word Search Results


A preliminary review of the data demonstrated no deviation from what had been identified in Table 8.1 above; that is, the IRA was merely regurgitating pre-existing media reports. Further analysis was then undertaken to identify any new campaigns that might have been co-occurring through the IRA Twitter accounts. The tweets about Hillary/Clinton were reviewed, duplicate tweets were deleted, and the remaining data was sorted based on keywords and known media campaigns occurring in 2016, as demonstrated in Table 8.2 below.

When reviewing the 2016 Hillary Clinton tweets, a pattern of retweeting was identified; that is, the IRA operatives appeared to be frequently retweeting, although it was not clear whether they were retweeting other operatives or legitimate Twitter users. Further analysis revealed that of the 46,342 Hillary Clinton tweets identified in 2016, 30,025 were retweets, which is 65% of the total Hillary Clinton tweets. A Perl script, as displayed in Appendix Two, was then used to identify how many of the retweets were from other IRA operatives and how many were from legitimate Twitter users. The results demonstrated that 28,284 of the retweets were from legitimate users, which is 94.2% of the total retweets and 61% of the total tweets containing the key terms Hillary/Clinton. This is a significant discovery as it supports the hypothesis that IRA operatives relied heavily on material that was already circulating throughout social media networks.
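The retweet-source split performed by the Perl script in Appendix Two can be sketched in outline as follows. This is a hedged illustration, not the script itself: it assumes retweets are identifiable by the conventional “RT @user:” text prefix, and that a set of known IRA screen names is available for comparison.

```python
import re

# Conventional retweet prefix, e.g. "RT @leftyguitar1: ...". An assumption:
# retweets can also be recorded via metadata fields rather than text.
RT_PATTERN = re.compile(r"^RT @(\w+):")

def split_retweet_sources(tweets, ira_handles):
    """Split retweets into IRA-sourced vs other ("legitimate") accounts.

    `tweets` is a list of tweet-text strings; `ira_handles` is a set of
    known IRA screen names, lower-cased. Non-retweets are ignored.
    """
    ira, legitimate = 0, 0
    for text in tweets:
        m = RT_PATTERN.match(text)
        if not m:
            continue  # not a retweet
        if m.group(1).lower() in ira_handles:
            ira += 1
        else:
            legitimate += 1
    return ira, legitimate
```

Applied to counts of the magnitude reported above, 28,284 legitimate-source retweets out of 30,025 total retweets gives a legitimate-source share of roughly 94%.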

Table 8.2 – Previously Identified Media Campaigns
• Voter Fraud in Iowa: Voter Fraud; Voterfraud; Iowa
• Black Lives Matter: BLM; Black Lives Matter; BlackLivesMatter; BlackTwitter; Killary
• Benghazi: Benghazi; Libya; Terrorist; Veterans
• Hillary Clinton’s Email Scandal: Email; E-mail; Server; FBI; Comey
• The Clinton Foundation: Foundation; Fdn; Donation/s; Donor; Charity; Saudi; Mevs
• Hillary Against Guns: NRA; Second Amendment
• Email Hacks: Wikileaks; DNCleaks; Assange; Anonymous; Guccifer; Blumenthal

In addition to the themes identified in Table 8.1 and replicated in Table 8.2 above, several other themes were identified throughout the IRA campaign on Twitter. These themes have been categorised into two groups, “Additional Repeated Themes”, that is, themes which were seen on multiple occasions while reviewing the Hillary/Clinton tweets, and “Other Events”. “Other Events” refers to tweets which referenced headlines from other media outlets or linked to YouTube videos. Examples of “Other Events” are provided towards the end of the chapter.

8.7.1 Additional Repeated Themes

8.7.1.1 Anti-Clinton Themed Tweets There were many tweets which fell under the category of general anti-Clinton themed tweets. These tweets did not reference any particular event, but instead were negative comments directed towards Hillary Clinton or another member of the Clinton family. For example, CatelineWatkins on the 27th of September 2016 wrote “RT @leftyguitar1: #ThingsMoreTrustedThanHillary Hannibal Lechter”, while IlikeBIGbuttand wrote on the 4th of October 2016 “#unlikelythingsheardatwalmart "I'm with her! Vote Clinton—Kaine 2016!"”. Tweets which contained anti-Clinton themes were identified using the following keyword searches:
• The Clintons
• The #Clintons
• Corrupt
• Draintheswamp
• Clintonbodycount
• Jail
• Liar
• Nazi
• Hitler
• Mafia
• Crime Family
• Crime
• Criminal
• Cartel
• Crooked Hillary
• Jihad
• Satan
• Soros
• Deplorable
• I’m with her (Now)
• Iraq

8.7.1.2 Hillary Clinton’s Health Hillary Clinton’s health became a concern during the 2016 election after she appeared to be unsteady on her feet during a 9/11 memorial event on the 11th of September 2016. It would later be revealed that Hillary Clinton was suffering from pneumonia at the time (Seitz-Wald, Alba, Mitchell, Welker, & Hunt, 2016). Example tweets may be seen on the 19th of August 2016 by EmileeWaren—“#ReleaseClintonsMedicalRecords I can tell you without any medical records that they are both ill”; and RH0lbr00k’s tweet on the 18th of September 2016 “RT @michaelbeatty3: DON'T WORRY—HILLARY IS AWAKE #ClintonCollapse #ChelseaNYC #SickHillary #ZombieHillary https://t.co/gMAr7AnPEo”. Tweets which contained information on Hillary Clinton’s health were identified using the following keyword searches:
• #ZombieHillary
• Medical
• Health
• Sick
• Collapse
• Pneumonia
• Bodydouble
• Consciousness
• Fainted
• Collapsed

8.7.1.3 Bill Clinton’s Presidency, Impeachment, Affair and Allegations of Rape Former President Bill Clinton was accused of, and admitted to, an extra-marital affair with an intern between 1995 and 1997 while President of the US, which led to impeachment proceedings against him. In 1999 Juanita Broaddrick accused Bill Clinton of raping her in 1978. In 1998 Kathleen Willey alleged that Bill Clinton made lurid advances and forced her to touch his crotch during a meeting at the White House in 1993, and Paula Jones accused Bill Clinton of exposing himself to her in 1991 (Relman, 2017). Examples of tweets by the IRA repeating these themes may be seen by hyddrox on the 16th of September 2016 “RT @Hublife: @HillaryClinton Your entire career and life has been a disgrace, you rape enabling, treasonous sociopath” and JeffreyKahunas’ tweet on the 30th of October 2016 stating “RT @FreeDavidKing: !!! Bill had affairs.. !!! No ma'am, Bill Clinton was a vicious rapist. And he flew 26 times on the Lolita Express. We h…”. Tweets which contained information about allegations of sexual assault by Bill Clinton were identified using the following keyword searches:
• Rape
• Rapist
• Monica
• Lewinsky
• Bill

8.7.1.4 Chelsea Clinton’s ‘Best Friend’ Getting a Government Contract In October 2016 it was reported that Chelsea Clinton’s close friend Jacqueline Newmyer received a government contract when Hillary Clinton was Secretary of State in 2009, after Hillary Clinton provided a favourable reference for Newmyer. The referral came to light after emails from Hillary Clinton’s email server were recovered and made public by the State Department (Gertz, 2016). An example of the IRA using this information to discredit Hillary Clinton may be seen in the tweet by hyddrox dated the 10th of October 2016 “RT @Patriotancestry: Clinton Sought Pentagon, State Department Contracts for Chelsea’s Friend https://t.co/J3JoN8r9Q1”. Tweets which contained information pertaining to the deal were identified using the search term ‘Chelsea’.

8.7.1.5 Media Outlets Reporting Favourably to Hillary Clinton/Fake News In April 2016, Bernie Sanders supporters gathered outside the CNN building on Sunset Blvd., Hollywood, to protest the coverage of the 2016 US presidential election, voicing concern that the media outlet was responsible for unfair coverage of the election, especially against Bernie Sanders (CBSLA.com, 2016). Donald Trump would later adopt the ‘Fake News’ label during his campaign. Examples of tweets supporting the fake news allegations include RH0lbr00k’s tweet on the 12th of December 2016 “RT @AmyMek: Dear "Media" If you were Caught Colluding with Hillary – you can't pretend you are Going to War Against "Fake News"! #Frauds…” and JavonHIDP’s tweet on the 9th of December 2016 “RT @ggreenwald: A Clinton fan manufactured fake news that MSNBC personalities spread to discredit WikiLeaks docs https://t.co/I1HnHKjclY”. Tweets which contained information pertaining to allegations of fake news were identified using the following keyword searches:
• CNN
• ClintonNewsNetwork
• Clinton News Network
• Fake News

8.7.1.6 FallenAngel Reference was made to the documentary Fallen Angel, which examines allegations of a cover-up of the shooting down of the Army Reserve helicopter ‘Extortion 17’. As the event occurred in 2011, the matter was linked to a cover-up by Hillary Clinton, who was Secretary of State at the time of the incident (Trento, 2016). The following keywords were used to identify these tweets:
• Fallen Angel
• FallenAngel
• Tom Trento

8.7.1.7 The Clintons’ Involvement in Paedophilia

The headline that the Clintons were involved in, or at the very least aided, paedophilia may be seen via two avenues. The first is with regard to Hillary Clinton’s aide Huma Abedin, whose husband was charged and later convicted of sending explicit text messages to an underage girl (BBC News, 2018). The second may be seen in the week leading up to the 2016 US presidential election, when a 4Chan user going by the name ‘anonymous’ and purporting to be an FBI agent posted a comment suggesting that Hillary and Bill Clinton were involved in a paedophile ring. The allegations escalated and morphed into a story that influential and senior Democratic leaders were running a paedophile ring out of a pizzeria in Washington DC. It is unclear whether the campaign was initiated by Russia or a Republican supporter (the original post being shared on an Alt-Right site which demonstrated significant support towards Donald Trump) (BBC Trending, 2016). The campaign saw a significant amount of Twitter traffic from the IRA in a small amount of time, with 389 tweets containing the #Pizzagate hashtag appearing between the 8th of November and the 31st of December 2016. As Chapter Nine will demonstrate, the #Pizzagate theory would present as the blueprint for political disinformation (Cosentino, 2020). With regards to the Hillary Clinton tweets examined throughout this chapter, the allegations of child sexual assault were not limited to #Pizzagate and included terms such as:
• Paedophile
• Pedophile
• Pedo
• Child rapist
• Trafficking

Examples of such tweets alleging the Clintons aided or engaged in paedophilia include KateRitterrrr’s tweet on the 8th of November 2016 “@speakout_april #TrumpTrain bout to steamroll the pedophilic Clinton Crime Family! #DrainTheSwamp #MAGA #TrumpForPresident” and hyddrox’s tweet on the 31st of October 2016 “RT @DeplorableMark1: @HillaryClinton you let classified info onto a pedophiles computer, YOURE DONE! #hillaryforprison”. As will be demonstrated in Chapter Nine, allegations of paedophilia were also linked to the emergence of the group QAnon in 2017, which according to Wendling (2020) is a “wide-ranging, unfounded conspiracy theory that says that former US President Donald Trump is waging a secret war against elite Satan-worshipping paedophiles in government, business and the media”. QAnon, which according to Cosentino (2020) grew out of #Pizzagate, began with an anonymous user posting to 4chan.

8.7.1.8 Hillary Clinton’s Wall Street Speeches

During the Democratic primary election, Bernie Sanders questioned Hillary Clinton’s “coziness” with Wall Street after it was revealed that she had received several large donations towards her campaign. It was also confirmed that after she retired from her position as Secretary of State, Hillary Clinton took on several engagements, for a substantial fee, to speak at private events. The speeches would become a talking point during the election after she refused to release them to the public. In October 2016, the speeches were released as part of the WikiLeaks data dump on Hillary Clinton (Colvin, 2016). Examples of tweets used by the IRA to diminish Hillary Clinton during the election concerning the speeches include KarenParker93’s tweet “RT @JaredWyand: Convincing democrats that giving $20,000/min speeches aren't bribes #HillaryAccomplishments https://t.co/1aYm8L9oHG” on August 11th, 2016 and ErRivvvvers’ tweet on May 12th, 2016 “RT @DeathAndTaxes: All conditions have been met for Hillary Clinton to release her Wall Street speeches https://t.co/JT3kKbl78c https://t.c…”. Both examples demonstrate IRA accounts retweeting from non-IRA accounts.


The terms “wall street” and “speeches” were used to identify this category of tweets.
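The keyword identification procedure used throughout this section (matching search terms against tweet text, regardless of case) can be sketched as follows. This is a minimal illustration only: the tweet structure and the `text` field name are assumptions, not the actual schema of the IRA dataset released by Twitter, which is a large CSV with many metadata columns.

```python
# Minimal sketch of the keyword searches used to categorise tweets.
# Assumes each tweet is a dict with a 'text' field (hypothetical schema;
# the real Twitter IRA release uses its own CSV column layout).

def match_keywords(tweets, keywords):
    """Return tweets whose text contains any keyword, case-insensitively."""
    lowered = [kw.lower() for kw in keywords]
    return [t for t in tweets
            if any(kw in t["text"].lower() for kw in lowered)]

tweets = [
    {"text": "RT @DeathAndTaxes: All conditions have been met for Hillary "
             "Clinton to release her Wall Street speeches"},
    {"text": "RT @Patriotancestry: Clinton Sought Pentagon, State "
             "Department Contracts for Chelsea's Friend"},
]

# The same helper covers every category above: single-term searches
# such as 'Chelsea' and multi-term searches such as 'wall street'/'speeches'.
wall_street = match_keywords(tweets, ["wall street", "speeches"])
chelsea = match_keywords(tweets, ["Chelsea"])
```

Substring matching of this kind will also catch variants such as “Chelsea’s”, which is consistent with how the categories above were built.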

8.7.1.9 Hillary Against Christians

There were two sources of information pertaining to Hillary Clinton’s alleged attack on Christianity. One was a speech, released in the WikiLeaks data dump, that Hillary Clinton made after her retirement as Secretary of State; the second was through numerous emails also released in the WikiLeaks dumps. In both circumstances, Hillary Clinton either implied or stated outright that religious bias needed to be changed. In response, the IRA tweeted several times on the issue, including KateRitterrr’s retweet on the 13th of October 2016 “RT @jimgeraghty: I see we’ve finally found a “Middle Ages dictatorship” the Clinton camp won’t work with: the Catholic Church. https://t.co…” and PrettyLaraPlace’s retweet of @ChristiChat on October 12th, 2016 “RT @ChristiChat: May I remind #Catholics #Christians Hillary is mocking US too. #HillaryBashesEvangelicals #HillaryHatesAmericans https://…”. Although @ChristiChat’s account has been suspended by Twitter, it does not appear that @ChristiChat was an IRA operative, as the username is not among the IRA username data. Searches were conducted on ‘Christian’ and ‘Catholic’ to identify these tweets.

8.7.1.10 Pro-Hillary Clinton Tweets

In addition to the negative comments around Hillary Clinton, there was also positive support of Hillary Clinton by the IRA operatives, with hashtags such as #Hillary4president, #HillaryForPresident and #Imwithher. For example, NoJonathonNo tweeted on the 26th of October 2016 “RT @fawfulfan: I have just cast my ballot for Hillary Rodham Clinton to become the next President of the United States. #ImWithHer #TexasVo…” and Gab1Aldana on the 26th of September 2016 “RT @Anni_Cyrus: #LostIn3Words ✔ Religion of peace ✔ Hillary for President ✔ Islam was hijacked”. The latter tweet, however, could be an attempt to align Hillary Clinton with the anti-Islamic rhetoric seen in the IRA Facebook campaigns mentioned above.

8.7.1.11 WikiLeaks

When examining the Hillary Clinton tweets above, the usage of WikiLeaks data dumps can be seen in a variety of instances, for example, in reference to Hillary Clinton’s Wall Street speeches and Hillary against Christians, as well as general negative press around Hillary Clinton’s person.


To reiterate what was described at the start of the chapter, Russian hackers were held responsible for stealing data from John Podesta’s email as well as from the DNC. During the lead-up to the election, this information was handed over to WikiLeaks. It is unclear how WikiLeaks founder Julian Assange received the data, only that it came into his possession sometime after the hack. In October 2016 WikiLeaks began leaking the information on Hillary Clinton in what was dubbed by WikiLeaks as their ‘October Surprise’ (Heathman, 2016). According to Enten (2016, p. 1), “the drip, drip, drip of the hacked emails—published weekly during October – makes it all but impossible to measure their effect precisely”. However, Enten (2016, p. 1) continues that “(i) Americans were interested in the WikiLeaks releases, and (ii) the timeline of Clinton’s fall in the polls roughly matches the emails’ publishing schedule”. Enten (2016) uses Google Trends to map interest in WikiLeaks during 2016, noting that there was a steep increase in searches for WikiLeaks during October 2016.

8.7.2 Other Events

When reviewing the remaining tweets, there were many headlines utilised that were not recognised, or that simply pointed to a link with a random negative comment regarding Hillary Clinton or the Clintons in general. For example, ChesPlaysChess on the 18th of September 2016 wrote “RT @dcexaminer: Reince Priebus: It's just "ridiculous" to give Clinton a pass on birtherism https://t.co/jZhFjEld3k https://t.co/whKncnse6t”. The tweet was referring to the rumour that Hillary Clinton started the conspiracy theory that Barack Obama was not born in the US. The tweet references a news article from the day before in the Washington Examiner by Rudy Takala headlined “Priebus: 'Ridiculous' to give Clinton a pass on birtherism”. What is interesting about this tweet is the fact that Trump for years denied the legitimacy of Obama’s presidency based on the birther conspiracy theory (Cheney, 2016). A further example is a tweet by WillisBonnerr dated May 24th, 2016 “RT @thehill: Trump on 1993 suicide of top Clinton aide Vince Foster: "Very fishy" https://t.co/14rAqamf0w https://t.co/jmd8CNBAxX”. The link in the tweet redirects to a tweet, pictured below in Image 8.1, which references a conspiracy dating back to the 1990s in which the Clintons were blamed for the death of former associate Vince Foster, who committed suicide. A third example may be seen in the tweet by AmelieBaldwin dated the 30th of September 2016 which read “RT @Lagartija_Nix: Chelsea Clinton Uses Private Jet to Travel to ‘Clean Energy’ Roundtable https://t.co/d1iznZmkks @realDonaldTrump https:/…”. This tweet redirected to a news article in The Washington Free Beacon by Cameron Cawthorn entitled “Chelsea Clinton Uses Private Jet to Travel to ‘Clean Energy’ Roundtable”.

Image 8.1 – A Tweet Referenced by the IRA

8.8 Conclusion

America and Russia have a long history of interfering in each other’s political and social affairs. After the Cold War, the interference appears to have subsided for a short period. Then, in 1996, President Bill Clinton aided in the re-election of Boris Yeltsin, and in 2011 Hillary Clinton spoke out against the legitimacy of Vladimir Putin’s election win. In 2016, Russia engaged in an information operation to interfere in the US presidential election and gain support for President-elect Donald Trump, while spreading unfavourable content towards Hillary Clinton. However, unlike previous information operations carried out by Russia on social media networks, the campaign did not involve creating new content or investing in useful fools to legitimise false stories. Instead, the Russian government relied heavily on existing content that was already circulating in the mainstream news and on social media networks.

Studies have come to different conclusions regarding whether or not the IRA campaign successfully influenced the 2016 election. The aim of this chapter was not to argue for or against either of these conclusions but rather to demonstrate that the tactics of the IRA had changed. As shown in Chapter Seven, the IRA failed in their initial campaigns targeted at the US, as the campaigns did not get buy-in from local media outlets. In response, the research undertaken as part of this project suggests, the IRA relied on material that was already in circulation during 2016 and did not attempt to make up new stories in support of Trump or against Hillary Clinton. That is, the Russian government did not need to create new content to damage Clinton’s standing amongst the voting public; the content was already in circulation. As such, the IRA retweeted existing anti-Clinton content, embedded themselves into echo chambers and engaged in conversations and insults based on the existing narratives which had developed throughout the election. These included, for example, Hillary Clinton’s poor health, allegations around her email server and her involvement in Benghazi, as well as family controversies such as allegations of sexual assault against Bill Clinton and allegations of fraud against the Clinton Foundation.

The following chapter will examine the response by the social media outlets Twitter, Facebook and Instagram to the 2016 IRA campaign, before examining information campaigns in 2020.


Chapter Nine – A World of Disinformation

9.1 Introduction

Chapter Eight examined the Internet Research Agency’s (IRA) troll farm activity during the 2016 US presidential election. It is important to note, however, that the IRA were not the only group spreading disinformation during this time. As demonstrated, the IRA began to rely heavily on retweeted content, rather than creating their own original content, to discredit Hillary Clinton and sow discord in the West. As this chapter will demonstrate, several disinformation campaigns were running adjacent to the IRA campaign but were effectively co-opted by Russian sources. Some of these have been traced back to their source, such as the city of Veles, where students profited from sensationalised headlines during the run-up to the 2016 US presidential election, while other disinformation campaigns have not been attributed to any group or nation-state. One example of the latter is the case of #Pizzagate.

Using the example of #Pizzagate, this chapter will demonstrate the morphology of an online disinformation campaign and, in doing so, demonstrate how the IRA utilised existing content to sow discord and undermine prominent US figures. The IRA, however, were not the only entity regurgitating the #Pizzagate disinformation campaign: US officials also assisted in spreading the campaign and other conspiracy theories which appeared online during the 2016 US presidential election. With the rise of conspiracy groups such as QAnon, and prominent people proliferating conspiracy theories online, the IRA had a plethora of material, supported by prominent US officials, to regurgitate, reducing the need for them to create original content. However, as the case of Sergei Skripal will demonstrate, the Kremlin has not entirely discarded former active measures campaign techniques, which include the creation of new content to confuse and create doubt in existing narratives.

Lastly, this chapter will examine the response by social media networks to information operations, and consider how, if at all, this response has ensured transparency in the 2020 US presidential campaigns.


9.2 Other Identified Disinformation Campaigns in 2016

A review of research conducted on disinformation campaigns in 2016 reveals two prominent campaigns running simultaneously with the 2016 Russian disinformation campaign45. As demonstrated in previous chapters, the use of social media as a political tool saw its beginnings in the colour revolutions, where citizens opposed to authoritarian rule used SMNs to fight for freedom against their oppressors. Since that time, authoritarian nation-states have added social media networks to their toolbox of information operations. In addition to the Veles stories mentioned below and the Russian information campaign carried out during the 2016 US presidential election, a study by Zannettou et al. (2018) demonstrates that Iranian state-sponsored trolls were also using SMNs to spread an anti-Trump campaign simultaneously with the Russian-based pro-Trump campaign. The study found that the Russian trolls were more influential and efficient at spreading URLs over various social media platforms, such as Twitter. The study also noted similar results to the current study, in that the Iranian campaigns echoed real-world headlines. However, rather than the same types of conspiracy theories picked up by Russian trolls, the Iranian trolls regurgitated campaigns against Saudi Arabia and France. As Zannettou et al. (2018, p. 6) explain,

In November 2013 France blocked a stopgap deal related to Iran’s uranium enrichment program, leading to some fiery rhetoric from Iran’s government (and apparently the launch of a troll campaign targeting French speakers). As tweets in French fall off, we also observe a dramatic increase in the use of Arabic in early 2016. This coincides with an attack on the Saudi embassy in Tehran, the primary reason the two countries ended diplomatic relations.

A review undertaken as part of this research project of the IRA Twitter data did not demonstrate any crossover with headlines produced by Iranian trolls.

The second disinformation campaign mentioned above may be seen in the sensationalised headlines which came out of Veles during 2016 in relation to the US presidential election. As evidenced in Chapter Eight, the Russian troll farm the Internet Research Agency (IRA) took

45 This research project is not suggesting that these were the only two information campaigns being run on SMNs in 2016; rather, they demonstrate that the IRA were not the only ones running disinformation campaigns during this time.

advantage of the mass of news stories and reports which appeared online leading up to the 2016 US election. Often these stories fell into what Bradshaw and Howard (2019) define as addictive content, that is, content which is usually fake and designed to attract readers with dramatic headlines. An example of this addictive content may be seen in the fabricated stories created in the small city of Veles in Macedonia. Kirby (2016) has traced most of the fake news stories and websites which appeared online during the 2016 US election to Macedonia, where teenagers created sensationalised headlines for revenue. According to an interview with one of the writers from Veles, hundreds of teenagers churned out fake pro-Trump reports which were plagiarised from American right-wing sites. The writers would copy and paste various articles together under a new catchy headline before paying Facebook to share the news. The more times the story was shared and liked, the more revenue the writer would receive from paid advertisements operating on the author's site. Fake news stories use ‘clickbait’ to attract users to their articles, usually in the form of a provocative title or image. As Metaxas and Finn (2017, p. 1) explain, “clickbait attracts the attention of unsuspecting social media users who click on links to these stories thinking they are visiting a legitimate news site”. The stories published in Veles are prime examples of clickbait. Ubavka Janevska, an investigative reporter in Macedonia, identified seven separate teams active in 2016 which were packaging misinformation online, but estimated that there were hundreds of other school-aged individuals working individually on similar stories.

A review of articles written on the Veles clickbait sites suggests that very little is known about the role these students played during the 2016 election. It is understood that Veles was responsible for a number of stories, two confirmed headlines being that the Pope endorsed Donald Trump and that Mike Pence called Michelle Obama vulgar (C. Silverman & Alexander, 2016). Using keywords from these two headlines, a search was conducted as part of this research project on the IRA Twitter data to see if the IRA picked up on either of these two headlines. The first search was on the keywords ‘Pope’ and ‘Trump’ and revealed only one tweet which appears to support the headline that the Pope endorsed Trump: “RT @tomsparrow: AHEAD Of 1st DEBATE, POPE FRANCIS SHOCKS THE WORLD, Releases THlS STATEMENT ABOUT DONALD TRUMP!—https://t.co/NyS2dOFgLa”. The tweet was from the user CooksnCooks and was published on the 29th of September 2016. The link has since been removed. The original story, that the Pope endorsed Trump, appeared in July 2016 with the headline “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement”, as demonstrated in Image 9.1 below. Although the headlines are similar in nature, the dates do not align, so it is possible the link was to something other than the story that the Pope endorsed Trump.

Image 9.1 – WTOE 5 News, July 2016

The second search undertaken on the IRA data used the keywords ‘Pence’, ‘Obama’ and ‘vulgar’; interestingly, this produced two results, both posted on the 22nd of December 2016 and both disputing the allegation that Pence called Michelle Obama vulgar. The first post was from RyanMaxwell_1, which read “RT @Conservatexian: New post: "AP FACT CHECK: Pence didn't call Michelle Obama 'vulgar'" https://t.co/dgd7ieUdQQ”. The second post, by Mr Moran, read “RT @factcheckdotorg: No, Mike Pence did not call Michelle Obama “the most vulgar first lady." That's a made-up quote from a satire site”. It is unclear what this means with regard to the IRA and how the IRA interacted, if at all, with other disinformation sites. To understand how the IRA operated with other disinformation sites, and whether the IRA were aware that these sites were also promoting disinformation, further research is required. Unfortunately, this is outside the scope of the current research project.

As discussed above, the following section will dissect the disinformation campaign #Pizzagate as an example of how the IRA were quick to pick up on new conspiracy theories and disinformation campaigns that appeared during the 2016 US presidential election. As a by-product of the IRA’s interaction with the #Pizzagate story, it was also discovered that prominent US officials were spreading disinformation relating to the #Pizzagate campaign, fuelling its spread across social media networks (SMNs) and adding legitimacy to the story.

9.3 #Pizzagate – A Case Study

Conspiracy theories circulated online appear to have several information sources attached to them. Sunstein and Vermeule (2009), for example, suggest that people who believe in conspiracy theories do so due to a crippled epistemology.46 However, SMNs have provided an opportunity to post the same content to various online sources, giving the perception that a limited data source is a diverse field of credible information supporting the theory. As demonstrated below, this ‘diversity’ can be created by co-conspirators or others, such as the students in Veles. Mainstream media is also cited as evidence to corroborate event details, while in other cases it is used to directly challenge competing stories presented by other mainstream media outlets. Starbird (2017) suggests that alternative media domains may be acting as a breeding ground for the transmission of conspiratorial ideas. As Starbird (2017) explains, the same conspiratorial content often appears on different sites in different forms, and often these internet sites focus on multiple conspiracy theories at once. As discussed earlier in this thesis, once an individual believes in one conspiracy theory, they are more likely to consider another. Therefore, one conspiracy theory is a gateway to another (Starbird, 2017).

An example of a conspiracy theory propagated online during the 2016 US election47, created on an American right-wing site, is #Pizzagate, mentioned briefly in Chapter Eight. #Pizzagate began in October 2016, when then FBI Director James Comey announced that the investigation into Hillary Clinton’s use of a personal email server to store classified information was re-opening, as new evidence had come to light48. This evidence had been thought lost, and then found again on a computer owned by Anthony Weiner, the estranged

46 Crippled epistemology is defined here as “a sharply limited number of (relevant) informational sources” (C. R. Sunstein & Vermeule, 2009, p. 204). 47 See Allcott and Gentzkow (2017) for further information on conspiracy theories propagated throughout 2016. 48 While in the position of Secretary of State, Hillary Clinton set up her own private email server. In 2015, when mainstream media reported on this, an investigation was undertaken to discover whether or not Hillary Clinton had engaged in criminal activity. As the investigation was highly topical, Comey made the decision to announce when the investigation was complete, as well as the findings of the investigation, which was highly irregular. A few months later, Comey made the decision to recant these findings when the new evidence came to light (Comey, 2018).

husband of Hillary Clinton’s top aide Huma Abedin. Anthony Weiner was under investigation for an unrelated incident when emails about Hillary Clinton were discovered on his computer (it turned out that the computer did not contain any new evidence concerning the email scandal). Anthony Weiner was the subject of an FBI investigation after he had sent sexual messages and images (sexting) to an underage teenage girl (Comey, 2018).

On the 29th of October 2016, Carmen Katz wrote on her Facebook page, concerning the re-opening of Hillary Clinton’s email case:

My NYPD source said its [sic] much more vile and serious than classified material on Weiner’s device. The email DETAIL the trips made by Weiner, Bill and Hillary on their pedophile [sic] billionaire friend’s plane, the Lolita Express. Yup, Hillary has a well documented [sic] predilection for underage girls… We’re talking an international child enslavement and sex ring (qt in Robb, 2017).

An investigation by Rolling Stone, in conjunction with The Investigative Fund and Reveal from the Center for Investigative Reporting, searched for Carmen Katz only to find she did not exist. Instead, the investigation was able to trace Carmen Katz back to Cynthia Campbell, a 60-year-old attorney who admitted to owning the Facebook page Carmen Katz but stated her page had been ‘hacked’ (qt in Robb, 2017). The Foreign Policy Research Institute suggests that rather than having had her page hacked, Katz/Campbell was the victim of clickbait; she had picked up the story on an online chat board and then repeated it in her Facebook feed, becoming a ‘useful fool’ and assisting the story to take off (qt in Robb, 2017). Human actors are often the key to spreading disinformation on Twitter (Starbird, 2017), and Katz was purportedly the human actor for #Pizzagate, a modern-day ‘useful fool’.

The Rolling Stone report was able to trace two sources of the #Pizzagate story posted on message boards. The first post was on the 2nd of July 2016 by an anonymous 4Chan user, later to be referred to as FBIAnon, who claimed to have secrets concerning the Clintons that he was willing to share with 4Chan users. FBIAnon then wrote “Bill and Hillary love foreign donors so much. They get paid in children as well as money.” This sparked the question from a 4Chan user as to whether Hillary Clinton had sex with kidnapped girls, to which @FBIAnon replied “yes” (qt in Robb, 2017). The second seed was planted twelve hours before Katz’s post by @Fatoldman49, who re-posted FBIAnon to ThreeRant. @NIVIsa4031, a social bot (a computer algorithm that automates interactions with humans on SMNs (DuBois et al., 2011)), picked up this post and re-shared it to Twitter. Katz’s original Facebook post was then retweeted by a social bot using the Twitter handle @DavidGoldbergNY. The tweet read that the new emails “point to a paedophilia ring and @HillaryClinton is at the centre” (qt in Fisher, Woodrow Cox, & Hermann, 2016, p. 1). This message was retweeted over 6,000 times before being picked up by far-right sites. One commentator, Alex Jones, then made a YouTube video ‘validating’ the rumour, attracting over 427,000 views (Fisher et al., 2016). The YouTube video and the @DavidGoldbergNY Twitter account have both been suspended.

Research conducted as part of this study suggests that on the 5th of November 2016 #Pizzagate appeared for the first time on Twitter, via a valid Twitter account, @hasko_e. On the 6th of November a Russian Twitter50 user, @drposhlost, appears to have been the second user to tweet #Pizzagate. It is unclear whether any social bot activity occurred between these two tweets, as tweets identified as being carried out by social bots have been deleted from Twitter’s feeds. The hashtag was then picked up globally, with a high share of tweets appearing in the Czech Republic, Cyprus and Vietnam. It is unclear why these countries were involved. The story that would develop involved Hillary Clinton, a paedophilia ring and a pizza parlour (#Pizzagate), and appears to have drawn on one of Podesta’s emails that had been leaked by WikiLeaks during the lead-up to the 2016 election. In true conspiracy style, the story reads as follows: Tony Podesta, John Podesta’s brother, frequented Comet Ping-Pong, a pizza shop in Washington DC. The owner, James Alefantis, five years before the allegations, dated a man called David Brock, founder of Media Matters for America (MMfA) (Kang, 2016). MMfA is a not-for-profit website dedicated to analysing and correcting conservative media coverage in America. MMfA seeks to debunk misinformation, and during the 2016 election tracked press coverage critical of Hillary Clinton (Kang, 2016). It was on the online message board 4Chan

49 An extensive search was undertaken of both the usernames @FBIAnon and @Fatoldman using various tools, including the site weleakinfo.com; however, no previous mentions of @FBIAnon seem to exist before the 4Chan conversation mentioned above. In comparison, @Fatoldman’s posts seem to date back to 2012, assuming the handle belongs to the same person, and continue to the present day.

50 Twitter provided tweets from both IRA operatives and Russian User Accounts deemed to be state-sponsored. This tweet was by a Russian User Account and not by an IRA operative.

that users noticed that a discussion had taken place between Podesta and Alefantis regarding a fund-raiser for Hillary Clinton at Comet Ping-Pong. Speculation grew around Comet Ping-Pong and its links to the Democratic Party and the paedophilia story (LaCapria, 2016). The Washington Police Department would dismiss #Pizzagate as a baseless conspiracy theory (Cosentino, 2020); however, the #Pizzagate story would soon attract over 20,000 subscribers to Reddit's ‘The_Donald’ page (Kang, 2016).

It is unclear whether the first #Pizzagate post appeared in a Reddit forum of Trump supporters or on Twitter on the 5th of November 2016. It would appear, based on news reports51, that the Reddit post did not appear until the 7th of November, which would suggest that #Pizzagate appeared on Twitter first. However, the Reddit conversations have been deleted, so this cannot be confirmed, and no reference to it could be found in any other research article or newspaper report.

The first IRA #Pizzagate tweet occurred on the 8th of November 2016, three days after the original #Pizzagate tweet. Research conducted on the IRA Twitter data demonstrates that there were a total of 846 original IRA tweets and 1,693 IRA tweets containing the #Pizzagate hashtag. Sixteen distinct IRA user accounts were identified as responsible for the 846 original tweets, while seventy-two IRA user accounts retweeted the #Pizzagate hashtag. In addition to the tweets created by the IRA, IRA user accounts retweeted tweets originating from other sources 827 times. In terms of audience, the original 846 tweets by IRA user accounts attracted a total of 5,888 retweets, while the number of likes the original tweets received totalled 5,112. The original IRA tweets were quoted 500 times in other tweets, and there were 627 replies to the original tweets. There is a crossover of users who ‘retweeted’, ‘liked’, ‘quoted’ and/or ‘replied’.
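The engagement figures of the kind reported above can be derived from the released dataset with a simple aggregation over the tweet records. The sketch below is illustrative only: the field names (`author`, `is_retweet`, `retweet_count`, `like_count`) and the toy sample are assumptions, not the actual column schema of the Twitter IRA release.

```python
# Sketch of aggregating hashtag engagement, as was done for #Pizzagate.
# Field names and the sample data are illustrative assumptions only.

def hashtag_stats(tweets, tag):
    """Count tagged tweets, originals, distinct authors, and earned engagement."""
    tagged = [t for t in tweets if tag.lower() in t["text"].lower()]
    original = [t for t in tagged if not t["is_retweet"]]
    return {
        "tagged": len(tagged),                                   # all tweets carrying the tag
        "original": len(original),                               # excluding retweets
        "distinct_authors": len({t["author"] for t in original}),
        "retweets_earned": sum(t["retweet_count"] for t in original),
        "likes_earned": sum(t["like_count"] for t in original),
    }

sample = [
    {"author": "a1", "text": "#Pizzagate must be investigated",
     "is_retweet": False, "retweet_count": 3, "like_count": 5},
    {"author": "a2", "text": "RT @a1: #Pizzagate must be investigated",
     "is_retweet": True, "retweet_count": 0, "like_count": 0},
]

stats = hashtag_stats(sample, "#Pizzagate")
```

Separating original tweets from retweets, as above, is what allows distinct counts such as the sixteen originating accounts versus the seventy-two retweeting accounts to be reported.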

An examination of the tweets demonstrated a pro-Trump position and anti-Hillary/Democrat stance. There were several hashtags used interchangeably between IRA operatives, as seen in Table 9.1 below.

51 See for example Fisher et al. (2016)


Table 9.1—Common Hashtags Used Throughout the 2016 US Campaign:

#draintheswamp    #FindOurGirls
#MAGA             #4Chan
#Buildthewall     #LondonAttacks
#KilltheBill      #Vault7
#Trump            #Killary
#HillaryClinton   #FakeNews
#Qanon            #BoycottAmazon

With regards to the last hashtag, #BoycottAmazon, it is unclear whether this was because Amazon had stocked a paedophilia-related book, which it has since stopped carrying, or whether it was because Amazon dropped WikiLeaks52.

In addition to a plethora of hashtags, the tweets also purported to demonstrate public outrage against the Democrats and the #Pizzagate conspiracy. For example, one tweet read “Wake Up America”, while another stated, “This HAS to be stopped!!!”. Dr Phil, an American reality TV icon, was also brought into the tweet conversation, with one operator tweeting “DR PHIL’S LIFE IN DANGER AS HE EXPOSES ELITE #Pedophile Rings”, while another tweet read “I Tweet, U Decide”. There appeared to be one attempt to rally people for a public protest, with an IRA operator announcing on the 2nd of March 2017, “PROTESTS in DC announced today!!!”; however, there was no follow-up to the announcement. There was also a surge of tweets on the 5th of December 2016, which appears to correspond with Edgar Maddison Welch’s storming of the Comet Ping-Pong pizzeria, discussed below.

Throughout the tweets, reference was made periodically to other conspiracies, such as the Illuminati conspiracy, with one IRA operator writing “Because of the elite Illuminati in congress protecting him”. It was unclear from the tweet to whom ‘him’ referred. References were also made to the Rothschilds and Soros, as well as to #FollowtheWhiteRabbit, a pop-culture reference to the film The Matrix, which has become synonymous with the growing layers of conspiracy theory concerning the 2016 US presidential election found online53. This

52 For further information regarding #BoycottAmazon, see Heritage (2015).
53 For further information, see Kelly (2017).

behaviour is consistent with Starbird’s (2017) theory above, that conspiracy theory spread on social media acts as a breeding ground from which further conspiracy theories can be seeded. In some of the tweets, there is also evidence of the IRA attempting to shift attention away from its own activity. In one tweet, an IRA operative implied that the Democrats had invented the idea that Russia was trying to influence the election, writing “Obama Sets up Russia Plot”. In a few of the tweets analysed, IRA operators even suggest that Russian interference in the 2016 US election was less credible than #Pizzagate, with one operative writing, “#PizzaGate holds more credibility than Russian interfering w/our election”. Another operator suggested that “#RussianCollusion was a ruse to cover up #PizzaGate”. Interestingly, although this review does not go into the names and handles chosen by IRA operators, one Russian operator chose @TheRussianBot as their identity.

According to Metaxas and Finn (2017) and their analysis of Twitter and #Pizzagate, somebody created a conspiracy theory which was then embedded into an ‘emotionally charged’ community or echo chamber using fake Twitter accounts and bots. There was an additional element guiding Russian disinformation and propaganda campaigns during the election: the human element, individuals who held some form of social status, such as the Presidential nominee at the time, Donald Trump, who engaged in the spread of IRA disinformation campaigns. Watts, a senior fellow at the Center for Cyber and Homeland Security at George Washington University and a Foreign Policy Research Institute fellow, testified at the first Senate Intelligence Committee hearing on Russia’s meddling in the 2016 US presidential election, stating that “part of the reason active measures have worked in the US election is that the Commander-in-Chief has used Russian active measures at times against his opponents”. Watts noted in his testimony that fake news was released, passed on and amplified by bots and then reiterated by Trump or his campaign team. Watts demonstrated this point with two examples in which Trump and other members of his team quoted Russian-controlled media outlets as their source of information. The first example was a false report of terror attacks on NATO bases in Turkey broadcast by RT, which Trump and his former campaign chairman echoed. The second was information detailed in a report by Sputnik54, which suggested that Hillary Clinton had been in a position to stop the Benghazi incident from

54 Both RT and Sputnik are known Kremlin-controlled media outlets.

occurring55. Trump read this information verbatim from a piece of paper during one of his campaign rallies (Ackerman, 2017).

On the 3rd of November 2016, Lt. Gen. Michael Flynn, who would briefly hold the position of Donald Trump’s National Security Adviser in January 2017, tweeted a link to an online post claiming that the New York Police Department had evidence linking Clinton to “child exploitation” and “sex crimes with minors.” The tweet did not mention #Pizzagate. It did, however, link to an article by True Pundit, a political conspiracy website with a bias towards the far right ("True Pundit," 2019). The article suggested that Hillary Clinton and key members of her party belonged to a paedophile ring ("Breaking Bombshell: NYPD Blows Whistle on New Hillary Emails: Money Laundering, Sex Crimes with Children, Child Exploitation, Pay to Play, Perjury," 2016). Days later, Michael Flynn’s son, Michael Flynn Jr, would tweet the #Pizzagate conspiracy. Both tweets have since been deleted.

Image 9.2 – General Flynn’s 2nd of November Tweet

Image 9.3 – Flynn Junior’s 9th of November Tweet

The tweets preceded the arrest of 28-year-old Edgar Maddison Welch, who entered Comet Ping-Pong armed with multiple guns, intending to ‘free the children’ from their supposed

55 The Benghazi incident involved the deaths of four Americans during the 11th of September 2012 terrorist attack in Benghazi, Libya. Accusations arose from the Republican Party that the Obama administration and then Secretary of State Hillary Clinton had failed to heed intelligence warnings of an imminent attack.

captivity. Welch’s attempt to find the sex dungeon at Comet Ping-Pong failed, and he surrendered to police without injury to himself or anyone else (Kroll, 2018). Welch, in an interview with the New York Times, revealed how he had heard the story of #Pizzagate by word of mouth. He then went online to investigate the information, only to find that each new article led to further reports on the subject. Welch revealed that he did not believe in the term ‘fake news’ and that he believed the term merely diminished “stories outside the mainstream media”, which he did not wholly trust (Goldman, 2016, p. 1). This sentiment may be seen encapsulated by Trump in his campaign rallies leading up to the election: a common theme throughout his campaign was the belittling of both the mainstream media and Hillary Clinton (, 2016).

The #Pizzagate story gained significant traction. It is too soon to determine whether #Pizzagate will have a lasting effect on those who came in contact with the disinformation campaign, so the full impact of the story remains unknown at this time. A March 2019 graph on social media trends by RiteTag demonstrates that #Pizzagate was still trending under the broader heading of #Conspiracy three years after it first appeared, with the colour green indicating that a topic is ‘hot right now’.

Figure Two—#PizzaGate Trending Graph (Taken from https://ritetag.com/best-hashtags-for/pizzagate)


In December 2016, Public Policy Polling (PPP) undertook a comprehensive US national poll on trending issues. PPP identified that there had recently been much attention on the way fake news spread and was believed, particularly by Trump supporters. The results of their poll on #Pizzagate support this assumption. Fourteen per cent of Trump supporters thought that Hillary Clinton was connected to a paedophile ring run out of a pizzeria in Washington DC. Thirty-two per cent could not say whether the reports of a connection between Hillary Clinton and a paedophile ring, or of the existence of a paedophile ring in a pizzeria in Washington DC, were real or fake. Fifty-four per cent said they did not believe #Pizzagate was real (Jensen, 2016).

As demonstrated throughout this thesis, the use of conspiracy theory as a tool by the Russian government has a long history, with previous research tracing its origins back to the late 1800s, as shown in Chapter Six. According to LaFrance (2020), the same may be said of American culture, with conspiracy theory in some instances propagated by well-known personalities. For example, the ‘birther’ conspiracy mentioned in Chapter Eight, started in 2004 by Illinois political candidate Andy Martin and then repeated in 2011 by Donald Trump, questioned the legitimacy of Barack Obama’s presidency based on Obama’s place of birth. Obama was officially recorded as being born in Hawaii. Despite this, Martin and then Trump made accusations that this was a lie and that Obama was not a natural-born American but instead, according to one allegation, had been born in Africa. This was not the first time that former American President Donald Trump had aided in the spread of disinformation, with The New York Times reporting that Trump had tweeted over 145 times since the end of 2019 on conspiracy theories circulating in the US, including on QAnon, the conspiracy movement which emerged in 2017 (LaFrance, 2020). An example may be seen on the 8th of March 2020, when Trump retweeted a QAnon meme. As Image 9.4 demonstrates, a QAnon supporter was quick to react to the tweet and link it to the greater Q conspiracy discussed below.


Image 9.4 – An example of Donald Trump Helping to Spread QAnon Conspiracy

9.4 QAnon

According to Cosentino (2020),

Pizzagate conspiracy theory is presented as the blueprint for fictional political narratives growing out of the contributions of multiple authors in various world regions. The QAnon conspiracy theory, an offshoot of Pizzagate, is also presented as an open-ended collective narrative based on paranoid attitudes toward political institutions and establishments, typical of the current era of Internet-driven populism and radical politics.

In agreement with Cosentino (2020) is Griffin (2020), who argues that QAnon not only borrows from conspiracy theories such as #Pizzagate but adds much more. For example, Q claimed that the Special Counsel was investigating Hillary Clinton, Barack Obama and John Podesta, rather than Donald Trump, with regards to Russian information operations during the 2016 election. Other theories extend well-known conspiracy theories such as the Illuminati, paedophile rings in Hollywood and even conspiracy theories concerning the Freemasons and the Titanic (Griffin, 2020). QAnon refers to a purported anonymous high-ranking US official with ‘Q clearance’ at the Department of Energy, who is leaking top-secret information on supposed Deep State actors engaged in illicit activity (Newton, 2020). According to Newton (2020), Q’s first revelation occurred in October 2017 on 4Chan, an SMN

which has become synonymous with QAnon content. The post, entitled ‘Calm Before the Storm’, predicted the arrest of Hillary Clinton on the 30th of October 2017 (LaFrance, 2020); this did not occur.

A review of the IRA Twitter data undertaken as part of this study identified 408 tweets across the whole dataset containing the term ‘QAnon’, via two operative accounts, Trump Nation and Amelie Baldwin. It should be noted that the majority of the tweets, 407, came from the account Trump Nation, while Amelie Baldwin tweeted only once: “RT @FemalesForTrump: Rush Limbaugh: IGNORE THE POLLS – Don’t Let Anything Dispirit You (Video) https://t.co/w1WeQanOn5.” The @FemalesForTrump Twitter account has since been suspended but is not part of the IRA dataset. Trump Nation’s tweets were broken up as follows:
• Seven (7) tweets were original content;
• 57 tweets linked to websites; and
• The remainder of the tweets were retweets.
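The ‘QAnon’ figures above rest on a case-insensitive substring search across the dataset, tallied per account. Note that a plain substring match will also fire when the term sits inside a shortened URL, which is presumably how the single Amelie Baldwin tweet (ending in https://t.co/w1WeQanOn5) was captured. A minimal sketch with illustrative data, not the real dataset:

```python
from collections import Counter

# Illustrative (account, tweet_text) pairs standing in for the IRA dataset.
tweets = [
    ("Trump Nation", "QAnon drops another crumb"),
    ("Trump Nation", "RT @somewhere: the storm is coming #QAnon"),
    ("Amelie Baldwin",
     "RT @FemalesForTrump: Rush Limbaugh: IGNORE THE POLLS https://t.co/w1WeQanOn5"),
]

# Case-insensitive substring match, tallied per account.
per_account = Counter(
    account for account, text in tweets if "qanon" in text.lower()
)
print(per_account)  # -> Counter({'Trump Nation': 2, 'Amelie Baldwin': 1})
```

The third row matches only because the lowercased URL contains the substring "qanon", mirroring how a simple keyword search can sweep in tweets that never use the term in their visible text.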

The above review of #Pizzagate and QAnon, together with the analysis in Chapter Eight, appears to point to the conclusion that Russian information campaigns had developed to a point where there was no longer a need to create original content. Instead, operatives needed only to borrow from conspiracy theories already circulating on the Internet. The next section of this chapter, however, will demonstrate that this is not entirely the case. As the attempted assassination of Russian defector and former spy Sergei Skripal, and the Kremlin’s response to it, suggest, the Russian government has not abandoned one tactic in favour of another. When the need arises, Russian government operatives are very comfortable reverting to former active measures techniques and creating content to fit the narrative of the Kremlin.

9.5 Sergei Skripal

Sergei Skripal was a Russian intelligence officer who was recruited by England as a spy in the mid-1990s (Schwirtz & Barry, 2018). According to Schwirtz and Barry (2018), Skripal was a Russian intelligence officer of the same age and training as Russian President Vladimir Putin. Putin is known for “reserving a special hatred” for intelligence officers such as Skripal who betrayed the Russian intelligence community when it was vulnerable after the collapse of the Soviet Union. In 2010, during an exchange of four convicted spies with the West, one of

whom was Skripal, Putin made public comments that he daydreamed about their deaths (Schwirtz & Barry, 2018). In 2018, in Salisbury, England, Skripal was rushed to hospital after being poisoned with a Novichok nerve agent (Guardian, 2018). A week later, British officials accused Russia of sending hitmen to Skripal’s house, where it is alleged they smeared the front door handle with the nerve agent (Schwirtz & Barry, 2018). Within hours of the assassination attempt failing, the Kremlin’s propaganda machine began spinning out false narratives and conspiracy theories through Russian-controlled media outlets. The primary goal of the propaganda and disinformation campaigns that sprouted from the Kremlin appears to have been to plant a seed of doubt regarding Russia’s involvement. Such doubts came in the form of stories both plausible and outlandish: that the CIA, and not the Kremlin, had orchestrated the attack on Skripal; that the attack was the result of Ukrainian activities; that a toxic spill had poisoned Skripal; that Skripal was sick due to Britain’s lousy climate; that Britain was responsible for Skripal’s poisoning, with British Prime Minister Theresa May attempting to frame Russia to distract the English public from the truth; that Skripal had poisoned himself in an attempted suicide; and that Skripal had overdosed on drugs (Warrick & Troianovski, 2018).

An analysis of the IRA Twitter data covering March through June 2018 was undertaken as part of this study to determine the extent of the information campaign on Twitter. Surprisingly, there was no response from the IRA until almost a month after the event, with operatives tweeting primarily on the upcoming Russian election held on the 18th of March 2018. In comparison, RIA Novosti reported on the story on the 6th of March 2018, two days after the poisoning, suggesting Skripal had overdosed himself. Rossiya TV’s broadcast followed on the 7th of March 2018, with the suggestion that it was England’s climate that had poisoned Skripal. The first IRA tweet on Skripal did not occur until the 27th of March 2018, in response to NATO expelling Russian diplomats.

Further tweets by IRA operatives regarding the Skripal poisoning provided direct links to the Kremlin’s official reports. The reports indicated that no evidence had been supplied to implicate the Kremlin in the Skripal incident and that it was yet another ploy by the West to undermine Russia, with one tweet reading "Great Britain is Washington's hand poodle". In another tweet, the operative pointed to an article accusing Scotland Yard of not looking for a criminal. The article interviewed Walter Litvinenko, the father of former FSB agent


Alexander Litvinenko, who was poisoned and killed in London in 2006. Walter Litvinenko stated in the interview that the murder of his son had been staged so that England could demonstrate the "cruelty and aggressiveness" of Russia (Politros, 2018).

The IRA tweets on the Skripal case were all in the Russian language, suggesting that the audience of the information campaign was Russia and Russians. A review of the English-language tweets during the same period found no reference to Skripal. Instead, they appear to have targeted a US audience, with several political posts referencing the NRA, American shootings and Stormy Daniels, the pornography star who at the time alleged she had slept with Donald Trump. There was also a large amount of anti-Trump sentiment, as opposed to the plethora of pro-Trump support the IRA was circulating in 2016. For example, JemiSHaaaZzz tweeted on the 17th of March 2018, “Can we stop assuming "If Obama was in office and did things that Trump did"? Because President Obama would never do any of those things. He has class, morals, and a whole lot of dignity. Something Trump can't even spell... https://t.co/crJjTEbgQv”, and KaniJJackson tweeted on the 30th of March 2018, “#GoodFriday is perfect to remind everyone that Trump presidency exposed the evangelical Christians for what they are — racists, misogynists, pedophile supporters who voted for a man who bragged about sexually assaulting women!.”

The final section of this chapter will examine the response by SMNs to Russia’s interference in the 2016 US presidential elections and the use of SMNs to spread disinformation in general.

9.6 Social Media Networks’ Response to the 2016 US Presidential Elections

As part of the U.S. House of Representatives Permanent Select Committee on Intelligence’s open hearing with senior officials from Facebook, Twitter and Google in November 2017, the heads of the three SMNs were interviewed regarding the troll farm activity carried out during the 2016 US presidential election. The findings of the hearing are as follows.

9.6.1 Facebook

Actors had used Facebook in coordinated networks of fake accounts to attack specific candidates, cause distrust in political institutions, cause general interference in the election and spread confusion. The Committee found that:
• Ad tools were utilised by threat actors;


• The IRA had undertaken a disinformation campaign to deceive and manipulate people in targeted countries, including the US, Russia and Europe;
• 470 accounts and pages (as of the 19th of April 2018) had been linked to the IRA;
• A total of 3,519 advertisements were purchased by the IRA, with more than 11.4 million Americans having been exposed to these advertisements;
• Over a two-year period, the IRA Facebook accounts had generated over 80,000 posts; and
• It is estimated that approximately 126 million people viewed content from the IRA Facebook pages.

In his opening statement to the Hearing before the United States Senate Committee on the Judiciary and the United States Senate Committee on Commerce, Science and Transportation, Mark Zuckerberg, Chairman and Chief Executive Officer of Facebook, stated:

It is not enough to just connect people; we have to make sure those connections are positive. It’s not enough to just give people a voice; we have to make sure people aren’t using it to hurt people or spread disinformation. It’s not enough to give people control of their information; we have to make sure the developers they’ve given it to are protecting it also. Across the board, we have a responsibility not just to build tools, but to make sure those tools are used for good (2018).

In the same statement, Zuckerberg admitted that Facebook was slow not only to spot but also to respond to the Russian interference present in the lead-up to the 2016 US presidential election. Facebook has continued to update its community on what it is doing to protect against fake accounts, with a 2017 post explaining, “we’ve made improvements to recognize these inauthentic accounts more easily by identifying patterns of activity — without assessing the content itself. For example, our systems may detect repeated posting of the same content or an increase in messages sent” (Facebook, 2017). The question remains as to whether Facebook’s statement was a response to what the public wanted to hear, or whether a concerted effort will be made in the future to try to prevent the flow of disinformation on the platform; only time will tell.

In 2018, Facebook announced that it had identified a politically motivated influence campaign on its platform aimed at disrupting the US midterm elections. In response, the platform deleted 32 pages and fake accounts which had engaged in activity on social issues that


Facebook had deemed divisive. Although Facebook did not link the activity to the IRA, according to Facebook officials some of the tools and techniques were similar to those the platform had identified in IRA activity during the 2016 US presidential election. For example, the campaigns targeted white supremacist groups and left-wing activists campaigning to abolish the US Immigration and Customs Enforcement agency (ICE) (Fandos & Roose, 2018). However, unlike in 2016, when Facebook ads were paid for in rubles and operatives on occasion used Russian internet protocol (IP) addresses, the accounts in the 2018 discovery “used advanced security techniques to avoid detection” (Fandos & Roose, 2018, p. 4). For example, the account holders disguised their traffic using virtual private networks (VPNs) and phone services, and used third parties to purchase ads on their behalf.

As of November 2019, Facebook had reportedly deleted over 3.2 billion fake accounts, up from 5.4 million during the same period in 2018 (Palmer, 2019). However, the specifics of these operations and the content being blocked remain hidden from researchers.

9.6.2 Instagram

Instagram does not use the same algorithms as Facebook, so the analysis of data produced for the Senate hearing yielded incomplete results; however, the following could be determined:
• The IRA created 120,000 pieces of content over two years;
• Approximately 20 million people viewed IRA content on Instagram;
• The IRA spent $100,000 on Instagram and Facebook advertising;
• Over 3,000 ads were purchased; and
• Approximately 11 million people viewed ads purchased by the IRA on both Instagram and Facebook.

Known IRA accounts on Facebook and Instagram were shut down in August 2017. Project DATA, a review of the Russian campaign on Facebook, concluded that in addition to the political ads purchased by the IRA, non-political ads were used “to identify, recruit, and organically engage with targets” (Kim, 2018, p. 5). Ads were also used as a tactic for voter suppression; however, these ads were deployed towards the end of the election campaign.


9.6.3 Twitter

According to data released by Twitter to the Committee’s open hearing, there were a total of 3,821 Twitter accounts affiliated with the IRA. The accounts were designed to impersonate US citizens, political parties, news entities and groups focused on political and social issues. Twitter advertisements were also purchased by the Russian news outlet RT. The Committee also reviewed a snapshot of Twitter activity between the 1st of September and the 15th of November 2016 and concluded:

• Over 36,000 Russian-linked bots tweeted on the US election;
• There were approximately 288 million impressions of Russian-linked bot tweets; and
• Over 130,000 tweets were linked to IRA accounts.

A review of the RT-advertised tweets demonstrates a mix of issues, from Russia’s V Day celebrations to tweets regarding a rise in Satan worshippers in America.

According to a 2019 report, since the Committee’s open hearing Facebook and Twitter have taken action against cyber troops from seven countries whose state actors demonstrated sophisticated influence operations, specifically China, India, Iran, Pakistan, Russia, Saudi Arabia and Venezuela (Bradshaw & Howard, 2019). According to Twitter, its policy is to focus on misleading, deceptive and spam behaviour while differentiating between legitimate speech and coordinated manipulative behaviour. The policies most frequently enforced by Twitter to identify such behaviour include coordinated activity, fake accounts, platform manipulation and spam, attributed activity, ban evasion and distribution of hacked material. Once material can be attributed to state-sponsored activity, the datasets are released by Twitter to support further research (Roth, 2019).

9.7 IRA Activity Since the 2016 US Presidential Election

The 2019 National Intelligence Worldwide Threat Assessment and Senate testimony stated that the Russian government will continue its efforts to “aggravate social, political and racial tensions in the United States and among its allies” (Coats, 2019, p. 1). This assessment was based on Putin’s former position as a KGB agent, the intelligence service which created the world’s first intelligence state56, in the form of Soviet Russia. Russia as it stands today relies

on intelligence operations when dealing with foreign policy challenges, as well as for maintaining control closer to home (Coats, 2019). In support of this, Howard et al. (2018) assert that rather than IRA activity decreasing after the publicity and attention the IRA received following the 2016 US presidential election, engagement rates of IRA activity increased. For example, IRA Facebook ads reached peak volume in April 2017, while Instagram saw its most significant increase in operative activity during the same period. The IRA’s activity covered a wide range of issues, including national security, public policy and topics pertinent to young voters.

In comparison, a review of IRA activity on Twitter, as demonstrated in Graph 9.1, reflects a decrease in IRA activity after the 2016 US presidential election. In 2016, this research identified a total of 1,565,337 tweets in the IRA Twitter data; by 2017, this figure had dropped by 15 per cent to 1,330,419. There is also a significant decrease in tweets pretending to come from US citizens. Instead, there is an increase in tweets targeting European nations such as France, Italy and, in particular, Germany. Further analysis is needed to identify whether the IRA tweets correlated with any significant events in these countries during this time. The sudden drop in tweets by the IRA could suggest one of two things. The first is that, due to the media coverage of Kremlin troll farm activity in the US, Twitter was no longer a viable option for the Kremlin to conduct disinformation campaigns. The second is that the Kremlin had shifted troll farm operations to new tactics which were no longer visible to Twitter’s algorithms and therefore could not be detected.
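The 15 per cent figure follows directly from the two yearly totals reported in the data; a quick check of the arithmetic:

```python
# Yearly IRA tweet totals identified in the dataset (from the text above).
tweets_2016 = 1_565_337
tweets_2017 = 1_330_419

# Relative year-on-year decrease.
drop = (tweets_2016 - tweets_2017) / tweets_2016
print(f"{drop:.1%}")  # -> 15.0%
```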

Graph 9.1 – IRA Tweets 2017–2018 (monthly tweet totals, January 2017 to June 2018; series: Total Number of Tweets; Tweets Purporting to be from US Citizens)

Since Russia interfered in the 2016 US presidential election, companies such as Twitter, as outlined above, have had to address the Kremlin’s use and abuse of their services and have

been working to eradicate Russian trolls and bots on their platforms. In early 2019, new information released by Twitter demonstrated a propensity of Russian trolls and bots towards far-right conspiracy theories. The question remains, why? Is the primary purpose to sow discord within Western societies? Is it a gateway to introduce new conspiracy theories that align with the Kremlin’s interests? Or is it a means to do both? In her research on alternative media ecosystems on Twitter57, Starbird (2017) identified several conspiracy themes that spread across all SMNs, including anti-globalist, anti-media, anti-vaccine, anti-GMO and anti-climate-science views. In addition to these themes were references to George Soros and the Rothschilds. George Soros is a philanthropist of Jewish descent who survived the Nazi occupation of Hungary and who has invested billions in support of international democratic ideals (Soros, 2019). The Rothschilds are a wealthy Jewish family, with Jeff Rothschild placing 875th in Forbes’ The Complete List of the world’s billionaires (Forbes, 2020). Many of the allegations circulating on SMNs against the Rothschilds were anti-Semitic (McConnachie & Tudge, 2013), while in almost all of the articles on George Soros and the Rothschilds, reference was made to paedophile rings whose members consisted of high-powered people (Starbird, 2017).

9.8 The 2020 US Presidential Election

At the time of writing (December 2020/January 2021), the active mobile social media population worldwide is 3.91 billion. The most popular social network, based on worldwide audience size, is Facebook, with over 2.6 billion users active each month. The top country by Facebook audience size is India, with 290 million users as of July 2020. In comparison, in the fourth quarter of 2019, Twitter’s daily active users numbered 152 million, with the US having the largest volume of Twitter users as of July 2020, at 62.55 million (Statista.com). According to Statista.com, most of the revenue generated by Facebook is through advertising. Twitter, in comparison, has become a tool for US and international politics. As Statista.com writes, Twitter “has become a way to promote policies and interact with citizens and other officials, and most world leaders and foreign ministries have an official Twitter account”. With such a vast audience available to State actors, it seems unlikely that the Kremlin and other governments will desist from using SMNs to conduct information campaigns.

57 Over a 10-month period Starbird (2017) collected tweets related to alternative narratives, with a particular focus on mass shooting events.


A report by Evans (2020) stated that Facebook faced a massive battle in the lead-up to the 2020 US presidential election. According to the report, Facebook had been attempting to better position itself for the election, but the effect of these attempts has been questioned by experts such as Kovach (2020). However, the report is quick to note that Facebook, although a catalyst for disinformation, is only part of a greater picture. Culliford (2020) reports that Facebook, in an attempt to better handle and respond to inappropriate content, would be launching an independent oversight board; however, it is unclear whether the board addressed propaganda and misinformation concerning the 2020 US presidential election. On the Twitter front, Hale (2020) reported that to combat disinformation campaigns leading up to the 2020 election, Twitter made changes to its core functions from the 20th of October to the 6th of November 2020, including having users fill out a form whenever they wished to retweet content, and removing the ‘liked by’ and ‘followed by’ recommendation features in an attempt to help users see relevant content from outside their echo chambers. Lastly, Twitter provided commentary around trending topics to explain why a topic was trending.

In response to allegations by former President Donald Trump, first regarding the security of postal votes and then that he had lost the election due to widespread voter fraud, both Twitter and Facebook began putting warning labels, known as “Parental Advisory” labels, on the former President’s posts (Fowler, 2020). The labels reportedly did not curb the spread of false election content, nor, according to Fowler (2020), is there any evidence that they work. As Fowler (2020) writes,

online, election week devolved into a mess of false claims that the results were fraudulent. As traditional news networks stepped in to correct the president’s misstatements, his allies turned to a network of new and existing Facebook pages, groups and events to rally people and spark real-world intimidation of poll workers.

Fowler (2020) goes on to explain that rather than being a win for democracy, the labels were instead a public relations win for the SMNs, demonstrating to the world that the platforms were responsive, just responsive enough, that is, to avoid being blamed for the disinformation appearing online.


The 2020 election appears to have been plagued by domestic disinformation campaigns. This is in stark contrast to the allegations of 2016, which blamed the IRA and Russia for sponsoring disinformation campaigns to influence the election. However, as this thesis demonstrates, when examining Twitter, the majority of Russian-sponsored disinformation campaigns of 2016 were not created by the IRA, merely echoed. The question then arises: where did the disinformation campaigns stem from, if not the IRA? Were they from places like Veles, or were they created domestically, in the US?

9.9 Conclusion

This chapter demonstrated that the spread of disinformation during the 2016 US presidential election extended beyond the Russian troll farm, the IRA. Disinformation campaigns such as #Pizzagate aided IRA operations, providing new sensationalised headlines for IRA operatives to retweet and spread. However, it is unclear whether the IRA were particular about where they sourced the headlines they retweeted and regurgitated, as it would appear that the Veles headlines were not among those shared by the IRA, even though they were pro-Trump and attracted a large amount of online activity.

This chapter also introduced a third actor in the spread of disinformation campaigns online: American political figures, including the former President of the US, Donald Trump, who not only repeated Russian disinformation in public speeches and rallies whilst president, but also retweeted conspiracy posts from groups such as QAnon. This chapter then raised the question of whether the IRA helped distract the US from the fact that the 2016 US presidential election was plagued with domestic disinformation campaigns. As demonstrated in previous chapters, and in the case of #Pizzagate, the IRA appear to have primarily tweeted on stories already circulating online and did not need to create new content. The question that arises is: where did this content originate? Lastly, this chapter examined the responses by SMNs, demonstrating that although Facebook and Twitter have begun to censor and label disinformation, more needs to be done.

The final chapter of this thesis will provide concluding remarks concerning the results of the research project, as well as provide suggestions for further research avenues.


Chapter Ten – Conclusion

As argued in the introduction of this thesis, Russia has demonstrated a renewed interest in information operations after the fall of the Soviet Union to compensate for its weak military and political position. However, it seems that little heed was paid by the West to Russian information operations as they were seen to deviate too much from the truth to be plausible. With reports that Russia conducted information operations against the US during the 2016 presidential election, there has been a revival in interest in Russian information operations. One of this research project's main objectives was to help fill a gap in research regarding Russian information operations by focusing on Kremlin troll farm activity and how their techniques have changed. In so doing, this research aimed to answer the question, “How have Russian Information Operations developed since the collapse of the Soviet Union?”

Subsequent research questions included:

1. How has the Kremlin troll operation/strategy developed?
2. How does Russia use social media to support its strategic goals?
3. How do social media platforms respond to State-sponsored trolling operations?

In answering these questions, this research followed a methodological framework that was both deductive and inductive. The research was qualitative with quasi-quantitative aspects and based on a longitudinal case study of the Internet Research Agency (IRA), a known Kremlin troll farm, using a unique dataset released to researchers by Twitter in 2017.

10.1 Major Findings

The significant findings of this research project are as follows:
• The development of Russian information operations online may be traced back to the First Chechen War;
• The 2014 annexation of Crimea involved an information campaign spread over multiple channels, employing a variety of tactics, including SMNs, traditional media outlets, cyber-attacks and espionage; the use of SMNs alone was not enough;


• The IRA predominately participated in the further spread of disinformation already circulating online during the 2016 US presidential election; and
• Contrary to the United States District Court for the District of Columbia (2018), the IRA was not created to attack the US. As demonstrated, the IRA was established in 2009 to create a favourable narrative towards the Russian government in events specific to Russia and neighbouring countries.

In addition to the findings outlined above, several other conclusions may be drawn from this research project concerning the main research question and sub-research questions.

How have Russian Information Operations developed since the collapse of the Soviet Union?

In answering the over-arching research question, it has been demonstrated that Russian information operations are not static. Instead, they evolve, adapt and develop based on their success, with the Russian government abandoning failed information operations and testing new approaches. This becomes evident when looking at Russia's military operations from the collapse of the Soviet Union to the 2014 annexation of Crimea, outlined in Chapters Four and Six. As demonstrated, Russia learned from its failings and created an information army. Information operations evolved into a sophisticated strategy that unfolds over three stages. The first is the creation of disinformation via multiple sources. The second is the production of user-generated media. The third is the use of hackers to capture and then slowly leak information relevant to an information campaign. However, as demonstrated throughout Chapters Four and Six, online information campaigns alone are not enough. The Russian government still relied on traditional methods to spread disinformation, such as conventional forms of media and useful fools.

How has the Kremlin troll operation/strategy developed? And how does Russia use social media to support its strategic goals?

Since 2009, the IRA's language capabilities slowly expanded, with the number of languages almost doubling between 2012 and 2014, suggesting that Russia was expanding its information campaigns to reach a wider global audience. As Chapter Four demonstrated, Russia has been learning and adapting information operation techniques and strategies using online communication applications, particularly SMNs, since the 2011 Russian demonstrations. Included in this strategy is the flooding of communications with a narrative designed to

confuse and cast doubt on Russian and world events, and to depict the malicious role Russia envisions the West playing in Russian affairs.

As demonstrated in Chapter Five, propaganda and disinformation are intrinsic to Russia's history and culture, so a natural progression for Russian information campaigns was to move this narrative online, to the Internet and social media networks (SMNs). SMNs provide an avenue for state actors to carry out information campaigns virtually unchecked. They provide a means to repeat narratives supported by experts and propagated by trusted sources, which, according to Paul and Matthews (2016), are the three critical elements of a successful information campaign. Further to Paul and Matthews' (2016) research is the idea that SMNs add a new dimension to information campaigns: SMNs make information operations sticky. As Nissen (2015) states, information received by individuals on SMNs tends to stick more than information presented in traditional media sources. Therefore, disinformation becomes stickier when distributed in the virtual world. Chapter Six demonstrates that with the annexation of Crimea, Russian troll farm operatives used SMNs to infiltrate echo chambers and propagate falsehoods against the Ukrainian government and army to gain support for the annexation and the war in Ukraine. As demonstrated in Chapter Eight, this strategy was repeated during the 2016 US presidential election, with Russian IRA operatives embedding themselves into US-centric forums. However, giving Russia credit for the full extent of the information campaigns that saturated SMNs during the 2016 US presidential election would be premature.

It was anticipated that the IRA Twitter data would demonstrate that IRA operatives created disinformation campaigns to sway public opinion in favour of presidential candidate Trump. Instead, the data showed that rather than making original content to sway the election, IRA operatives merely echoed disinformation already circulating online at the time. At the time of writing, this finding is believed to be unique to this research paper. For example, the IRA did not need to create content to damage Clinton's reputation to influence the election; the content was already there. The exception is the information released by WikiLeaks, which is believed to have been illegally obtained by the Russian operative Guccifer 2.0 and distributed in batches leading up to the 2016 election. It could, however, be argued that this could have been avoided had Clinton been more transparent throughout the election on matters such as her health and her income from the so-called 'Wall Street speeches'.


It is unclear why the IRA chose not to create original content during the 2016 election, but instead echoed what was already in circulation. One reason may have been the IRA's realisation that, with so much disinformation available online leading up to the election, anything newly created by the IRA might have been lost in the noise. It could also be argued that it is easier to join a conversation that already has traction than to create new sticky content, or that it was cheaper and more straightforward to circulate existing content than to create new content. Alternatively, the Kremlin may have realised that there was no way to manipulate local media outlets, so any new campaigns would not have received the attention they needed to influence the election. As demonstrated in Chapter Seven, as part of the initial information operations against the US during the same period, the IRA could not make allegations of a chemical spill, an Ebola outbreak, the shooting of an unarmed woman, or a food-poisoning outbreak stick. Unlike previous information operations closer to home, there was no support from external sources such as the media or useful fools. The IRA operatives did not appear to be embedded in any US-centric echo chambers, as the only accounts retweeting and sharing the IRA content were other IRA operatives. The timing of the information operations also proved to be a shortcoming, as the campaigns were run on Russian rather than US time.

How do social media platforms respond to State-sponsored trolling operations?

Tracing the Kremlin's troll activity on Twitter using Twitter's troll farm data shows activity peaking in 2015 and the number of tweets declining from that point onwards. The decline may have been due to Russia adapting and improving its online information campaigns, making them more difficult for Twitter to identify. It is possible that the operations became so embedded in SMNs that Twitter's current algorithms cannot identify them as troll activity. Other factors that could have influenced the decline in identifiable Russian tweets include reduced activity by Russia due to ineffectiveness; the Kremlin scaling back the program because it was receiving too much media coverage; or Twitter continuing to suspend troll accounts faster than the Kremlin can re-establish new versions, with Twitter undergoing multiple purges of fake accounts since the troll farms' identification (Kim, Graham, Wan, & Rizoiu, 2019).

Since the 2016 election and the Senate Hearings that followed, SMNs have had to address the Kremlin’s use and abuse of their services and have been working to eradicate Russian trolls and bots from their platforms, as witnessed in the examples provided by Twitter and Facebook.


However, more needs to be done. In a surprising move in early 2021, Twitter banned US President Donald Trump (now former) from its platform over a tweet deemed to incite violence. As Collins and Zadrozny (2021) reported, after the Capitol siege, a violent protest undertaken by Trump supporters on January 6th, 2021, hundreds of Twitter employees signed a letter to CEO Jack Dorsey requesting that he ban the President. The full consequences of this suspension for Twitter and the spread of disinformation are yet to be seen.

Chapters Eight and Nine demonstrated that disinformation campaigns extend far beyond the Kremlin. Through the manipulation of social media, information operations have become a global phenomenon employed by a range of governments and political actors using cyber troops. The use of SMNs to spread disinformation favouring one political candidate over another was once again witnessed in the 2020 US presidential election, which was plagued with disinformation campaigns. This time, however, the disinformation was not blamed on a foreign State but was homegrown in the US, spread by competing Democrat and Republican supporters and by political figures such as former US President Trump. Authoritarian nations are no longer the primary concern for SMNs; disinformation in general needs to be addressed, and it needs to be addressed globally.

10.2 Limitations

As highlighted at the start of this project, there were several limitations to consider. For example, the project relies heavily on data produced by Twitter on IRA activity; as such, there is no way to test how Twitter identified IRA activity or to validate that Twitter correctly identified and included all IRA activity on its platform. Further, the data produced by Twitter only extends from 2009 to mid-2018, so there is no more recent data to analyse. Additionally, without access to Russian government documents on information operations, any theories explored or identified throughout this project remain theories; it is not possible to validate them at this stage.

10.3 Concluding Remarks

Through an analysis of both the technical and the social, this thesis has provided the first scholarly exploration of how the IRA's social media tactics have changed since 2009. Further, it has shown that SMNs have developed beyond their initially intended use and that the very nature of SMNs has provided the perfect breeding ground for conspiracy theories, propaganda and information warfare. In 2008, Georgia initiated an information operation against Russia using the Internet. In 2011, the colour revolutions spilled over into Russia, demonstrating the strength of a united people's voice online. It was only a matter of time before Russia would also use the Internet to sway public opinion in the same way the Russian government had used traditional news media in the past. By 2009, Russia had already begun experimenting with Twitter, as demonstrated in Chapter Five. However, by 2016 Russia was not the only nation using the Internet to spread chaos through disinformation, as shown in Chapter Eight. In Chapter Nine, this thesis explored the responses by SMNs to Russia's information campaigns.

10.4 Implications

The current project has demonstrated the evolution of Russian information operations since the collapse of the Soviet Union. What is apparent is that the Russian government continues to use information operations to influence Russian citizens living inside Russia and abroad and to sow discord in Western countries such as the US. The research has produced no evidence to suggest this pattern of behaviour has slowed since 1989; on the contrary, it appears to have increased since Vladimir Putin came to power in 1999. This project has also demonstrated that conspiracy theories and propaganda have become commonplace on SMNs, keeping them appealing vehicles through which State actors in general can conduct information operations. With regards to Russian information operations, provided current algorithms for detecting State-sponsored activity are calibrated correctly, Russian information operations towards the West on Twitter have, at this time, slowed down. The questions that remain are: when will Russia's information army re-emerge, and what will it look like in terms of information operations? Further, if the Russian government once more sets its sights on the West to carry out information operations, what controls are in place to prevent these operations from being effective?

10.5 Further research

Five additional research questions were identified; however, they were deemed outside the current research project's scope. These were:


1. Was the IRA central in influencing the black vote in the US during the 2016 election? It is recommended that further research be undertaken to correlate, where possible, the participation of black Americans with IRA activity and engagement.

2. How did the IRA operate, if at all, with other disinformation sites?

3. What effect did WikiLeaks have on the 2016 US election?

4. If the IRA was not responsible for the creation of most disinformation throughout the 2016 US presidential election, then who was?

5. What would a global response to disinformation look like?



Appendix One – Ethics Approval Certificates



Appendix Two – SQL Queries, Command-Line Searches, Bash Commands and Perl Scripts

To find data between time frames
SQL Command—SELECT * FROM ira_data WHERE tweet_time BETWEEN "" AND ""

To sort tweets based on key words
SQL Command—SELECT * FROM ira_data WHERE tweet_text LIKE "%%"
or
SQL Command—SELECT * FROM ira_data WHERE tweet_text LIKE "%%" AND tweet_time BETWEEN "" AND ""
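The two query templates above can be sketched end to end with Python's standard sqlite3 module. The table and column names (ira_data, tweet_time, tweet_text) follow the queries above; the sample rows and the keyword "election" are invented stand-ins, not values from the Twitter dataset:

```python
import sqlite3

# Build a tiny stand-in for the IRA dataset (table and column names
# follow the SQL templates above; the rows themselves are invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ira_data (tweet_time TEXT, tweet_text TEXT)")
conn.executemany(
    "INSERT INTO ira_data VALUES (?, ?)",
    [
        ("2016-10-01 12:00", "Example tweet mentioning the election"),
        ("2016-11-20 09:30", "Unrelated tweet"),
        ("2015-03-05 18:45", "Another election mention"),
    ],
)

# Tweets within a time frame (BETWEEN is inclusive at both ends).
rows = conn.execute(
    "SELECT * FROM ira_data WHERE tweet_time BETWEEN ? AND ?",
    ("2016-01-01", "2016-12-31"),
).fetchall()

# Keyword search restricted to the same time frame.
keyword_rows = conn.execute(
    "SELECT * FROM ira_data "
    "WHERE tweet_text LIKE ? AND tweet_time BETWEEN ? AND ?",
    ("%election%", "2016-01-01", "2016-12-31"),
).fetchall()
print(len(rows), len(keyword_rows))  # 2 1
```

Parameterised placeholders (?) take the place of the empty "" and "%%" slots in the templates above, which also avoids quoting problems when keywords contain special characters.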

Count Unique Names
Terminal—sort names2.csv | uniq | wc -l

Account Creation Date
Terminal—sort account_creation_date.csv | uniq -c
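Both terminal pipelines above rely on sorting first, since uniq only collapses adjacent duplicates. A small self-contained demonstration (the file contents are invented sample data, not the research dataset):

```shell
# Create a sample single-column CSV of account names (invented data).
printf 'alice\nbob\nalice\ncarol\nbob\nalice\n' > names2.csv

# Count unique names: sort groups duplicates so uniq can collapse them.
sort names2.csv | uniq | wc -l
# prints 3 (alice, bob, carol)

# Count occurrences per value, as used above for account creation dates.
sort names2.csv | uniq -c
```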

User Reported Location Excel Spreadsheet—Columns/Data/Remove Duplicates Count

2016 IRA Election Tweets—How many of the retweets were from other IRA operatives and how many were from legitimate Twitter users

#!/usr/bin/perl
use warnings;
use strict;
use DBI;

my $filename = 'tweets.csv';
my @listOfNames = ();

open(FH, '<', $filename) or die $!;
while (<FH>) {
    # split tweet; the retweeted handle is the second word
    my @spl = split('\s', $_);
    my $name = $spl[1];

    # remove trailing : from twitter handle
    chop($name);

    # remove @ from name
    $name = substr($name, 1);

    # create array of twitter handles
    push(@listOfNames, $name);
}
close(FH);

my $outFile = 'processed_tweets.csv';
open(FH, '>', $outFile) or die $!;

my $driver   = "SQLite";
my $database = "Twitter.sqlite";
my $dsn      = "DBI:$driver:dbname=$database";
my $userid   = "";
my $password = "";
my $dbh = DBI->connect($dsn, $userid, $password, { RaiseError => 1 })
    or die $DBI::errstr;
print "Opened database successfully\n";

foreach (@listOfNames) {
    my $stmt = qq(SELECT COUNT(*) FROM ira_tweets WHERE user_screen_name LIKE \'$_\';);
    my $sth  = $dbh->prepare($stmt);
    my $rv   = $sth->execute() or die $DBI::errstr;

    if ($rv < 0) {
        print $DBI::errstr;
    }

    while (my @row = $sth->fetchrow_array()) {
        print FH $_ . "," . $row[0] . "\n";
        print $_ . "," . $row[0] . "\n";
    }
}
close(FH);

$dbh->disconnect();
print "CSV File Complete\n";
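For readers without a Perl toolchain, the logic of the script above can be sketched in Python with the standard sqlite3 module. The table and column names (ira_tweets, user_screen_name) follow the Perl script; the database rows and the sample retweet lines are invented stand-ins for illustration:

```python
import sqlite3

# Stand-in database of IRA tweets (schema follows the Perl script above;
# the rows are invented sample data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ira_tweets (user_screen_name TEXT)")
conn.executemany(
    "INSERT INTO ira_tweets VALUES (?)",
    [("TEN_GOP",), ("TEN_GOP",), ("Jenn_Abrams",)],
)

def extract_handle(tweet_line: str) -> str:
    """Second word of a retweet line, e.g. 'RT @TEN_GOP: ...' -> 'TEN_GOP'."""
    handle = tweet_line.split()[1]      # '@TEN_GOP:'
    return handle.rstrip(":").lstrip("@")

# For each retweet, count how often the retweeted handle appears in the
# IRA dataset; a count of zero suggests a non-IRA (legitimate) account.
tweets = ["RT @TEN_GOP: example text", "RT @someuser: other text"]
counts = {}
for line in tweets:
    name = extract_handle(line)
    (n,) = conn.execute(
        "SELECT COUNT(*) FROM ira_tweets WHERE user_screen_name LIKE ?",
        (name,),
    ).fetchone()
    counts[name] = n
print(counts)  # {'TEN_GOP': 2, 'someuser': 0}
```

Unlike the Perl original, this sketch passes the handle as a bound parameter rather than interpolating it into the SQL string, which avoids breakage on handles containing quote characters.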


Bibliography

ABC News. (2015, 4 September). Timeline: Recent US police shootings of black suspects. ABC News. Retrieved from https://www.abc.net.au/news/2015-04-09/timeline-us-police-shootings-unarmed-black-suspects/6379472.
Abrams, S. (2016). Beyond Propaganda: Soviet Active Measures in Putin's Russia. Connections, 15(1), 5-31.
Aftab, S., & Bukhari, H. (2011, 20 November). What is Comparative Study. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1962328.
Ahlawat, D. (2017). Central and South Asia. An overview of soft power prospective (section overview). In N. Chitty, J. Li, G. D. Rawnsley, & C. Hayden (Eds.), The Routledge Handbook of Soft Power. Abingdon, Oxon: Routledge.
AJC. (2015a). Cop shooting, Ebola scare in Atlanta invented by Russians: Report. Retrieved from https://www.ajc.com/news/cop-shooting-ebola-scare-atlanta-invented-russians-report/Ux6zeq8OQeU3pgHMyPKP3O/?icmp=np_inform_variation-test.
AJC. (2015b, 3 June). Cop Shooting, Ebola scare in Atlanta invented by Russians: Report. AJC Atlanta. News. Now. Retrieved from https://www.ajc.com/news/cop-shooting-ebola-scare-atlanta-invented-russians-report/Ux6zeq8OQeU3pgHMyPKP3O/.
Alexander, M. (2016). Why Hillary Clinton Doesn't Deserve the Black Vote. Retrieved from https://www.thenation.com/article/hillary-clinton-does-not-deserve-black-peoples-votes/.
Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236.
Allen, J., & Parnes, A. (2017). Shattered: Inside Hillary Clinton's doomed campaign: Crown Books.
Anwar, N. D., Ang, B., & Jayakumar, S. (2021). Disinformation and Fake News: Palgrave Macmillan.
Applebaum, A., & Lucas, E. (2016, 6 May). The danger of Russian disinformation. The Washington Post. Retrieved from https://www.washingtonpost.com/opinions/the-danger-of-russian-disinformation/2016/05/06/b31d9718-12d5-11e6-8967-7ac733c56f12_story.html?noredirect=on&utm_term=.5b458bd1656a.
Arabidze, I. (2018). Russian Disinformation and Propaganda: Old Strategy in a New Cover? The Pardee Periodical Journal of Global Affairs, III(I).


Aro, J. (2016). The cyberspace war: Propaganda and trolling as warfare tools. European View, 15(1), 121-132.
Arquilla, J. (1999). Ethics and information warfare. Rand-Publications-MR-All Series, 379-402.
Arquilla, J. (2007). Introduction: Thinking about Information Strategy. In J. Arquilla & D. A. Borer (Eds.), Strategy and Warfare: A guide to theory and practice (pp. 1-15). New York and London: Routledge Taylor and Francis Group.
Arquilla, J., & Ronfeldt, D. (1993). Cyberwar is coming! Comparative Strategy, 12(2), 141-165.
Asmus, R. D. (2004). Opening NATO's door: How the alliance remade itself for a new era: Columbia University Press.
Australian Electoral Commission. (2020a). AEC encouraging voters to "stop and consider" this federal election. Retrieved from https://www.aec.gov.au/media/media-releases/2019/04-15.htm.
Australian Electoral Commission. (2020b). Electoral Integrity Assurance Taskforce. Retrieved from https://www.aec.gov.au/elections/electoral-advertising/electoral-integrity.htm.
Bachman, R. D., & Schutt, R. K. (2016). Fundamentals of research in criminology and criminal justice: Sage Publications.
Baczynska, G., & Vasovic, A. (2014, 27 July). Pushing locals aside, Russians take top rebel posts in east Ukraine. Reuters. Retrieved from https://web.archive.org/web/20140728013327/https://www.reuters.com/article/2014/07/27/us-ukraine-crisis-rebels-insight-idUSKBN0FW07020140727.
Bail, C. A., Guay, B., Maloney, E., Combs, A., Hillygus, D. S., Merhout, F., & Volfovsky, A. (2020). Assessing the Russian Internet Research Agency's impact on the political attitudes and behaviors of American Twitter users in late 2017. Proceedings of the National Academy of Sciences of the United States of America, 117(1), 243. doi:10.1073/pnas.1906420116.
Barron, J. (1990). Dezinformatsiya. London: Viking.
Barry, R. (2018). Russian Trolls Tweeted Disinformation Long Before U.S. Election. Retrieved from https://www.wsj.com/graphics/russian-trolls-tweeted-disinformation-long-before-u-s-election/.
Bartles, C. K. (2016). Getting Gerasimov Right. Military Review, 96(1), 30-38.


Basora, A. A., & Fisher, A. (2014). Putin's "Greater " – The Dismemberment of Ukraine? Foreign Policy Research Institute, May 2.
Bastos, M., & Farkas, J. (2019). "Donald Trump Is My President!": The Internet Research Agency Propaganda Machine. Social Media + Society, 5(3). doi:10.1177/2056305119865466.
BBC. (2014, 16 March). Crimea referendum: Voters 'back Russia union'. BBC News.
BBC News (Producer). (2014, 13 March). Ukraine crisis: Transcript of leaked Nuland-Pyatt call. BBC News.
BBC News (Producer). (2018, 12 August). Huma Abedin and Anthony Weiner to settle divorce out of court.
BBC Trending (Producer). (2016, 12 August). The saga of 'Pizzagate': The fake story that shows how conspiracy theories spread.
Bechev, D. (2016, 12 February). Russia in the Middle East: From the Arab Uprisings to the Syrian Conundrum. Alsharq Forum.
Beckhusen, R. (2017, 24 December). Operation Bagration: The Shocking Story of How Russia Crushed Nazi Germany. Retrieved from http://nationalinterest.org/blog/the-buzz/operation-bagration-the-shocking-story-how-russia-crushed-23792.
Beehner, L., Collins, L., Ferenzi, S., Person, R., & Brantly, A. (2018). Analyzing the Russian Way of War: Evidence from the 2008 Conflict with Georgia.
Bell, C., & Sternberg, E. (2002). Emotional Selection in Memes: The Case of Urban Legends. Journal of Personality and Social Psychology, 81, 1028-1041. doi:10.1037//0022-3514.81.6.1028.
Bellware, K. (2016, 12 March). Donald Trump Rally In Chicago Cancelled After Protesters Turn Out In Droves. Huffington Post. Retrieved from https://www.huffingtonpost.com.au/entry/trump-rally-chicago_us_56e366ece4b0b25c9182176f.
bender190191. (2014, 29 October). Афера Эбола [Ebola Scam]. Retrieved from https://boeing-is-back.livejournal.com/50416.html.


Bendrath, R., et al. (2007). From 'cyberterrorism' to 'cyberwar', back and forth. In J. Eriksson & G. Giacomello (Eds.), International Relations and Security in the Digital Age (pp. 57-82). Abingdon: Routledge.
Benn, D. W. (1996). The Russian media in post-Soviet conditions. Europe-Asia Studies, 48(3), 471-479.
Benyumov, K. (2016). How Russia's independent media was dismantled piece by piece. Australian Edition.
Berkhoff, K. C. (2012). Motherland in Danger: Soviet Propaganda during WWII. Cambridge, Massachusetts; London, England: Harvard University Press.
Bērziņš, J. (2014). Russia's new generation warfare in Ukraine: Implications for Latvian Defense Policy. Policy Paper, 2, 2002-2014.
Besemeres, J. (2016). Russian disinformation and Western misconceptions. Australia: ANU Press.
Bessi, A., & Ferrara, E. (2016). Social bots distort the 2016 US presidential election online discussion. SSRN, 21(11).
Beydoun, K. A. (2016, 25 July). Muslim voters between Hillary Clinton and a hard place. Al Jazeera. Retrieved from https://www.aljazeera.com/indepth/opinion/2016/07/muslim-voters-hillary-clinton-hard-place-160725094634857.html.
Bhat, A. (2019). Experimental Research—Definition, Types of Designs and Advantages. Retrieved from https://www.questionpro.com/blog/experimental-research/.
Birnbaum, M., & Branigin, W. (2015, 28 February). Putin critic, Russian opposition leader Boris Nemtsov killed in Moscow. The Washington Post. Retrieved from https://www.washingtonpost.com/world/europe/russian-opposition-leader-boris-nemtsov-reported-killed-in-moscow/2015/02/27/972e15f0-becb-11e4-b274-e5209a3bc9a9_story.html?utm_term=.e5271f7ca418.
Bittman, L. (1985). The KGB and Soviet disinformation: An insider's view. Washington: Pergamon-Brassey's.
Blank, S. (2017). Cyber War and Information War à la Russe. In G. Perkovich & A. E. Levite (Eds.), Understanding Cyber Conflict: Fourteen Analogies (pp. 81-98). Georgetown: Georgetown University Press.


Blank, S. (2014). Signs of New Russian Thinking About the Military and War. Eurasia Daily Monitor, 11(28).
Blidner, R. (2018, 2 June). Russian Agency Creates Army of Trolls to Spread Internet Panic, Propaganda: Report. New York Daily News.
Boatwright, B. C., et al. (2018). Troll factories: The internet research agency and state-sponsored agenda building. The Social Media Listening Center, Clemson University.
Bogart, L. M., & Thorburn, S. (2005). Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? Journal of Acquired Immune Deficiency Syndromes, 38(2), 213-218.
Boghardt, T. (2009). Soviet Bloc Intelligence and Its AIDS Disinformation Campaign. Studies in Intelligence, 53(4), 1-24.
Bohdanova, T. (2014). Unexpected revolution: The role of social media in Ukraine's Euromaidan uprising. European View, 13(1), 133-142.
Boni, B. (2001). E-commerce the Dark Side: Cyber-terrorists and Counter Spies. Network Security, 2001(12), 17-18.
Boot, M., & Doran, M. (2013). Political warfare: Policy Innovation Memorandum No. 33.
Boyd, D. M., & Ellison, N. B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210-230.
Bradshaw, S., & Howard, P. (2017). Troops, trolls and troublemakers: A global inventory of organized social media manipulation. Computational Propaganda Research Project, 2017.12.
Bradshaw, S., & Howard, P. N. (2018). Challenging truth and trust: A global inventory of organized social media manipulation. The Computational Propaganda Project, 1.
Bradshaw, S., & Howard, P. N. (2019). The global disinformation order: 2019 global inventory of organised social media manipulation: Project on Computational Propaganda.
Brangetto, P., & Veenendaal, M. A. (2016). Influence Cyber Operations: The Use of Cyberattacks in Support of Influence Operations. Paper presented at the 8th International Conference on Cyber Conflict.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77-101.
Broniatowski, D. A., et al. (2018). Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. American Journal of Public Health, 108(10), 1378-1384.


Bruns, A., Highfield, T., & Burgess, J. (2014). The Arab Spring and its social media audiences: English and Arabic Twitter users and their networks. In Cyberactivism on the participatory web (pp. 96-128): Routledge.
Bryant, C. G. (1985). Positivism in social theory and research: Macmillan International Higher Education.
Bryman, A. (2016). Social research methods: Oxford University Press.
Brzezinski, Z. (1992). The Cold War and its aftermath. Foreign Affairs, 71(4), 31-49.
Bunker, R. (2018). Like War: The Weaponization of Social Media. Parameters, 48(4), 68-70.
Bunn, A. (1999). Conspiracy theories: Secrecy and Power in American Culture. New York, 44, 83.
Burenok, V. (2014). Knowledge of mass destruction: Modern weapons systems should include means of protection of consciousness. VPK News, (23:541).
Burgess, J., & Baym, N. K. (2020). Twitter: A biography: NYU Press.
Business Quarter. (2013, 3 April). The author of the Urals "Botgate" runs a business with the founder of "Ra Digital". Business Quarter. Retrieved from https://ekb.dk.ru/news/dva-odioznyx-uralskix-piarshhika-obedinyayutsya-v-ramkax-odnogo-pragentstva-236705656.
Byford, J. (2011). Towards a Definition of Conspiracy Theories: Palgrave MacMillan.
Byrne, D. (1971). The attraction paradigm. New York: Academic Press.
Cadwalladr, C. (2018). 'I created Steve Bannon's psychological warfare tool': Meet the data war whistleblower. The Guardian, 17.
Callon, M., & Latour, B. (1981). Unscrewing the big Leviathan: How actors macro-structure reality and how sociologists help them to do so. Advances in Social Theory and Methodology: Toward an Integration of Micro- and Macro-Sociologies, 1.
Canbolat, M., & Sezgin, E. (2016). Is NATO ready for a cyberwar? Monterey, California: Naval Postgraduate School.
Castells, M. (1989). The informational city: Information technology, economic restructuring, and the urban-regional process: Blackwell Oxford.
Castells, M. (2004). The network society: A cross-cultural perspective: Edward Elgar.
Castells, M. (2013). Communication power: OUP Oxford.
Castleberry, A., & Nolen, A. (2018). Thematic analysis of qualitative research data: Is it as easy as it sounds? Currents in Pharmacy Teaching and Learning, 10(6), 807-815.


Caute, D. (1988). The fellow-travellers: Intellectual friends of communism: Yale University Press.
CBSLA.com. (2016, 2 April). Bernie Sanders Supporters Protest Election Coverage At Hollywood CNN Building. CBS Local. Retrieved from https://losangeles.cbslocal.com/2016/04/03/bernie-sanders-supporters-protest-election-coverage-at-hollywood--building/.
CBSNews. (2015, 3 December). Putin makes accusation against U.S. over Syria policy. CBS News. Retrieved from https://www.cbsnews.com/news/vladimir-putin-us-western-powers-zone-of-chaos-syria-iraq-libya/.
Central Intelligence Agency. (1984). [The Soviets and the 1984 US Elections].
Centre for Disease Control. (2019). Ebola outbreak in eastern Democratic Republic of Congo reaches 3,000 cases and 2,000 deaths [Press release]. Retrieved from https://www.cdc.gov/media/releases/2019/s0829-ebola-cases.html.
Chance, M. (2008, 28 August). Putin accuses U.S. of orchestrating Georgian war. CNN. Retrieved from http://edition.cnn.com/2008/WORLD/europe/08/28/russia.georgia.cold.war/index.html.
Charap, S. (2007). Inside Out: Domestic Political Change and Foreign Policy in Vladimir Putin's First Term. Demokratizatsiya, 15(3).
Charap, S. (2015). The Ghost of Hybrid War. Survival, 57(6), 51-58. doi:10.1080/00396338.2015.1116147.
Chekinov, S. G., & Bogdanov, S. A. (2010). Asymmetrical actions to maintain Russia's military security. Military Thought, 1(2010), 1-11.
Chekinov, S., & Bogdanov, S. (2013). The Nature and Content of New-Generation Warfare. Military Thought, 10(2013), 13.
Chekinov, S. B., & Bogdanov, S. A. (2015). The Art of War in the Early 21st Century: Issues and Opinions. Military Thought, 24(11).
Chen, A. (2015, 2 June). The Agency. The New York Times Magazine. Retrieved from https://www.nytimes.com/2015/06/07/magazine/the-agency.html.
Cheney, K. (2016). No, Clinton didn't say the Birther thing, this guy did. Politico. Retrieved from https://www.politico.com/story/2016/09/birther-movement-founder-trump-clinton-228304.
Cialdini, R. B. (1984). Influence: The new psychology of modern persuasion: Morrow.


CNN Library. (2018, 28 February). 2016 presidential campaign hacking fast facts. Retrieved from https://edition.cnn.com/2016/12/26/us/2016-presidential-campaign-hacking-fast-facts/index.html.
Coats, D. R. (2019). Statement for the record: Worldwide threat assessment of the US intelligence community. Senate Select Committee on Intelligence, 29 January 2019. Office of the Director of National Intelligence.
Cohen, A., & Hamilton, R. E. (2011). The Russian military and the Georgia war: Lessons and implications: Strategic Studies Institute.
Collings, D., & Rohozinski, R. (2009). Bullets and blogs: New media and the warfighter. Retrieved from https://www.researchgate.net/publication/235094991_Bullets_and_Blogs_New_Media_and_the_Warfighter.
Colvin, G. (Producer). (2016, 12 August). Hillary Clinton's Goldman Sachs speeches don't matter. Here's why.
Comey, J. (2018). A higher loyalty: Truth, lies, and leadership. New York: Flatiron Books.
Concept of the Foreign Policy of the Russian Federation. (2013).
Connell, M., & Vogler, S. (2017). Russia's approach to cyber warfare.
Conradi, P. (2017). Who lost Russia?: Oneworld Publications.
Cordella, A., & Shaikh, M. (2006). From epistemology to ontology: Challenging the constructed 'truth' of ANT.
Cosentino, G. (2020). From Pizzagate to the Great Replacement: The globalization of conspiracy theories. In (pp. 59-86). Cham: Springer International Publishing.
Costolo, D. (2015). [Twitter memorandum].
Coynash, H. (Producer). (2019). Russian Twitter '' operation to blame Ukraine for MH17 broke all records. Human Rights Protection Group.
Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry & research design: Choosing among five approaches. Thousand Oaks, CA: SAGE.
Crotty, M. (1998). The foundations of social research: Meaning and perspective in the research process: Sage.
Dahl, R. (1957). The concept of power. Behavioral Science, 2(3).


Daley, J. (2017a, 4 January). How Adlai Stevenson stopped Russian interference in the 1960 election. SmartNews. Retrieved from https://www.smithsonianmag.com/smartnews/how-adlai-stevenson-stopped-russian-interference-1960-election-180961681/#Q8AJvO3xhjxQK4ai.99.
Daley, J. (2017b, 4 January). How Adlai Stevenson stopped Russian interference in the 1960 election. SmartNews. Retrieved from https://www.smithsonianmag.com/smart-news/how-adlai-stevenson-stopped-russian-interference-1960-election-180961681/.
Darczewska, J. (2014). The anatomy of Russian information warfare: The Crimean operation, a case study: Ośrodek Studiów Wschodnich im. Marka Karpia.
Darczewska, J., & Zochowski, P. (2017). Active measures: Russia's key export. OSW Studies, 64.
Davis, J. (2007). Hackers take down the most wired country in Europe. Wired Magazine, 15(9).
De Haas, M. (2009). Medvedev's security policy: A provisional assessment. Russian Analytical Digest, 62(09), 2-5.
De Haas, M. (2010). Russia's foreign security policy in the 21st century: Putin, Medvedev and beyond: Routledge.
Deibert, R. J., Rohozinski, R., & Crete-Nishihata, M. (2012). Cyclones in cyberspace: Information shaping and denial in the 2008 Russia–Georgia war. Security Dialogue, 43(1), 3-24.
Denzin, N. K., & Lincoln, Y. S. (1994). Handbook of qualitative research: Sage.
Deyermond, R. (2013). Assessing the reset: Successes and failures in the Obama administration's Russia policy, 2009–2012. European Security, 22(4), 500-523.
Dickey, J. V., Everett, T. B., Galvach, Z. M., Mesko, M. J., & Soltis, A. V. (2015). Russian political warfare: Origin, evolution, and application. Monterey, CA: Naval Postgraduate School.
Dictionary.com. (2018). Computer virus. Retrieved from https://www.dictionary.com/browse/computer-virus?s=t.
Dimitrakopoulou, S., & Liaropoulos, A. (2010). Russia's National Security Strategy to 2020: A great power in the making? Caucasian Review of International Affairs, 4(1), 35.
Diuk, N. (2014). EUROMAIDAN: Ukraine's self-organizing revolution. World Affairs, 176(6), 9-16.


Donovan, G. T. J. (2009). Russian operational art in the Russo-Georgian War of 2008. Retrieved from https://apps.dtic.mil/dtic/tr/fulltext/u2/a500627.pdf.
Doran, W. (2017, 20 June). Here's every time Russian or Soviet spies tried to interfere in US elections. How does 2016 compare? Retrieved from http://www.politifact.com/north-carolina/statements/2017/jun/20/richard-burr/heres-every-time-russian-or-soviet-spies-tried-int/.
DuBois, T., Golbeck, J., & Srinivasan, A. (2011). Predicting trust and distrust in social networks. Paper presented at the 2011 IEEE Third International Conference on Privacy, Security, Risk and Trust and 2011 IEEE Third International Conference on Social Computing.
DuPaul, N. (2012). Common malware types: Cybersecurity 101. Retrieved from https://www.veracode.com/blog/2012/10/common-malware-types-cybersecurity-101.
The Economist. (2016, 17 December). Russia has often tried to influence elections, with little success. Retrieved from https://www.economist.com/united-states/2016/12/17/russia-has-often-tried-to-influence-elections-with-little-success.
Ellis, E. G. (2018). Crying 'pedophile' is the oldest propaganda trick in the book. Wired.
Eltantawy, N., & Wiest, J. B. (2011). The Arab Spring | Social media in the Egyptian revolution: Reconsidering resource mobilization theory. International Journal of Communication, 5, 18.
Enten, H. (2016). How much did WikiLeaks hurt Hillary Clinton? FiveThirtyEight.com. Retrieved from https://fivethirtyeight.com/features/wikileaks-hillary-clinton/.
Eronen, P. (2016). Russian hybrid warfare. Washington, DC.
Essay UK. (2018). Essay: Research methods—qualitative, exploratory, inductive and basic research approaches. Retrieved 17 August 2018 from http://www.essay.uk.com/essays/education/essay-research-methods-qualitative-exploratory-inductive-and-basic-research-approaches/.
European Union External Action. (2017). Questions and answers about the East StratCom Task Force. Retrieved 17 July 2018 from https://eeas.europa.eu/headquarters/headquarters-homepage/2116/-questions-and-answers-about-the-east-stratcom-task-force_en.


Evans, D. (Producer). (2020, 18 October). Facebook's battle against election manipulation.
Facebook. (2017). Improvements in protecting the integrity of activity on Facebook. Retrieved from https://www.facebook.com/notes/facebook-security/improvements-in-protecting-the-integrity-of-activity-on-facebook/10154323366590766/.
Falichev, O. (1995). FCS will certainly publish information on who helped Dudayev and how. Krasnaya Zvezda, 21.
Fasola, N. (2017). Principles of Russian military thought. Prague: Institute of International Relations Prague.
Fenster, M. (1999). Conspiracy theories: Secrecy and power in American culture: U of Minnesota Press.
Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96-104.
Finch, R. C. I. (1998). Why the Russian military failed in Chechnya.
Finkel, E., & Brudny, Y. M. (2012). Russia and the colour revolutions. Democratization, 19(1), 15-36.
Fishel, J., & Stracqualursi, V. (2016, 29 June). NRA ad supporting Donald Trump appears to be filmed inside a national cemetery despite restrictions. Retrieved from https://abcnews.go.com/US/nra-ad-supporting-donald-trump-appears-filmed-inside/story?id=40227403.
Fisher, M., Woodrow Cox, J., & Hermann, P. (2016, 6 December). Pizzagate: From rumor, to hashtag, to gunfire in D.C. The Washington Post. Retrieved from https://www.washingtonpost.com/local/pizzagate-from-rumor-to-hashtag-to-gunfire-in-dc/2016/12/06/4c7def50-bbd4-11e6-94ac-3d324840106c_story.html?utm_term=.cc0cca24c8ce.
Fitzgerald, C. W., & Brantly, A. F. (2017). Subverting reality: The role of propaganda in 21st century intelligence. International Journal of Intelligence and Counterintelligence, 30(2), 215-240.
Fitzgibbon, L. (1980). Katyn vs 'Khatyn': Another Soviet hoax. The Journal of Historical Review, 1, 230-233.
Forbes. (2020). World's billionaires list: The richest in 2020. Retrieved from https://www.forbes.com/billionaires/.


Fowler, G. A. (Producer). (2020, 11 December). Twitter and Facebook warning labels aren't enough to save democracy.
Fusaro, D. (Producer). (2018). Before meddling in U.S. elections, the Russians picked on food. Food Processing.
Gabriel, T. (2016, 4 July). Donald Trump finds himself playing catch-up in all-important Ohio. The New York Times. Retrieved from https://www.nytimes.com/2016/07/05/us/politics/donald-trump-ohio.html.
Gadde, V., & Roth, Y. (2018, 17 October). Enabling further research of information operations on Twitter. Retrieved from https://blog.twitter.com/en_us/topics/company/2018/enabling-further-research-of-information-operations-on-twitter.html.
Galeotti, M. (2016). Hybrid, ambiguous, and non-linear? How new is Russia's 'new way of war'? Small Wars & Insurgencies, 27(2), 282-301.
Galeotti, M. (2019). Russian political war: Moving beyond the hybrid: Routledge.
Garde-Hansen, J. (2009). MyMemories?: Personal digital archive fever and Facebook. In Save as… Digital memories (pp. 135-150): Springer.
Garton, L., Haythornthwaite, C., & Wellman, B. (1997). Studying online social networks. Journal of Computer-Mediated Communication, 3(1).
Gayle, D. (2016). CIA concludes Russia interfered to help Trump win election, say reports. The Guardian. www.theguardian.com.
Gel'man, V. (2004). The unrule of law in the making: The politics of informal institution building in Russia. Europe-Asia Studies, 56(7), 1021-1040.
Gerasimov, V. (2013). The value of science in prediction. Military-Industrial Kurier, 27.
Gerasimov, V. (2016). Po opytu Sirii [On the Syrian experience]. Voenno-promyshlennyi kur'er. Retrieved from http://vpk-news.ru/articles/29579.
Gerber, T., & Zavisca, J. (2016). Does Russian propaganda work? Washington Quarterly, 39(2), 79-89. doi:10.1080/0163660X.2016.1204398.
Gertz, B. (2016, 6 October). Clinton sought Pentagon, State Dept. contracts for Chelsea friend. The Washington Free Beacon. Retrieved from https://freebeacon.com/national-security/clinton-sought-pentagon-state-department-contracts-chelseas-friend/.
Giles, K. (2009). Russia's national security strategy to 2020. NATO Defense College.
Giles, K. (2015). Russia's toolkit. In The Russian challenge. London: Chatham House.


Giles, K. (2013). Internet use and cyber security in Russia. Research Centre for East European Studies.
Giles, K. (2015). Russia and its neighbours: Old attitudes, new capabilities. In Cyber war in perspective: Russian aggression against Ukraine (pp. 19-28).
Giles, K. (2016). Russia's 'new' tools for confronting the West: Continuity and innovation in Moscow's exercise of power. Chatham House, The Royal Institute of International Affairs.
Giles, K. (2016). The next phase of Russian information warfare. NATO Strategic Communications Centre of Excellence.
Giles, K. (2019). Moscow rules: What drives Russia to confront the West: Brookings Institution Press.
Gioe, D. V. (2018). Cyber operations and useful fools: The approach of Russian hybrid intelligence. Intelligence and National Security, 1-20.
Glaser, A. (2018, 18 February). What we know about how Russia's Internet Research Agency meddled in the 2016 election. Retrieved from https://slate.com/technology/2018/02/what-we-know-about-the-internet-research-agency-and-how-it-meddled-in-the-2016-election.html.
Global Security (Producer). (2019, 13 November). First Chechnya War—1994-1996. Retrieved from https://www.globalsecurity.org/military/world/war/chechnya1.htm.
Goldgeier, J. M., & McFaul, M. (2003). Power and purpose: US policy toward Russia after the Cold War: Brookings Institution Press.
Goldman, A. (2016). The Comet Ping Pong gunman answers our reporter's questions. The New York Times. Retrieved from https://www.nytimes.com/2016/12/07/us/edgar-welch-comet-pizza-fake-news.html?module=inline.
Goldman, M. I. (2004). Putin and the oligarchs. Foreign Affairs, 83, 33.
Goldman, M. I. (2005). Putin, the oligarchs & the end of political liberalization. The Economists' Voice, 2(2).
Grabner-Kräuter, S., & Bitter, S. (2015). Trust in online social networks: A multifaceted perspective. Paper presented at the Forum for Social Economics.
Gray, D. E. (2013). Doing research in the real world: Sage.


Greaves, F., Laverty, A. A., Cano, D. R., Moilanen, K., Pulman, S., Darzi, A., & Millett, C. (2014). Tweets about hospital quality: A mixed methods study. BMJ Quality & Safety, 23(10), 838-846.
Green, J. J. (2018, 19 September). Tale of a troll: The Russian operation to target Hillary Clinton. Washington's Top News. Retrieved from https://wtop.com/j-j-green-national/2018/09/tale-of-a-troll-the-operation-to-target-hillary-clinton/.
Gregory, P. (2018). A reassessment of Putin's Russia: The economy. South Central Review, 35(1), 175-195.
Griffin, A. (Producer). (2020, 16 October). What is QAnon? The origins of the bizarre conspiracy theory spreading online.
Grytsenko, O. (2014, 12 April). Armed pro-Russian insurgents in Luhansk say they are ready for police raid. Kyiv Post. Retrieved from https://web.archive.org/web/20140412131249/http://www.kyivpost.com/content/ukraine/armed-pro-russian-insurgents-in-luhansk-say-they-are-ready-for-police-raid-343167.html.
The Guardian (Producer). (2018, 18 October). Russian spy poisoning: Theresa May issues ultimatum to Moscow.
Hao, L., & Wei, L. (2013). Misinformation. International Economic Review, 54(1), 253-277. doi:10.1111/j.1468-2354.2012.00732.x.
Hardaker, C. (2010). Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions. Journal of Politeness Research, 6, 215-242.
Harkins, S. G., & Petty, R. E. (1987). Information utility and the multiple source effect. Journal of Personality and Social Psychology, 52(2), 260.
Hatuqa, D. (2016, 1 July). Thousands of Americans to stage anti-bigotry rally. Al Jazeera. Retrieved from https://www.aljazeera.com/news/2016/06/thousands-americans-stage-anti-bigotry-rally-160630174636942.html.
Heathman, A. (2016, 12 October). Aliens and arms deals: The WikiLeaks 'October Surprise' data dumps have begun. Wired. Retrieved from https://www.wired.co.uk/article/wikileaks-plans-target-us-election.
Heeks, R., & Seo-Zindy, R. (2013). ICTs and social movements under authoritarian regimes: An actor-network perspective. Paper presented at the UKAIS.


Heickerö, R. (2010). Emerging cyber threats and Russian views on information warfare and information operations: Swedish Defence Research Agency (FOI).
Helderman, R. S., & Hamburger, T. (2015). Foreign governments gave millions to foundation while Clinton was at State Dept. The Washington Post. Retrieved from https://www.washingtonpost.com/politics/foreign-governments-gave-millions-to-foundation-while-clinton-was-at-state-dept/2015/2002/2025/31937c31931e-bc31933f-31911e31934-38668-31934e31937ba38439ca31936_story.html.
Helmus, T. C., Bodine-Baron, E., Radin, A., Magnuson, M., Mendelsohn, J., Marcellino, W., & Winkelman, Z. (2018). Russian social media influence: Understanding Russian propaganda in Eastern Europe: Rand Corporation.
Henderson, A., & Kyng, M. (1995). There's no place like home: Continuing design in use. In Readings in human–computer interaction (pp. 793-803): Elsevier.
Herasimenka, A. (2018, 23 February). What's behind Alexei Navalny's digital challenge to Vladimir Putin's regime? Five things to know. The Washington Post. Retrieved from https://www.washingtonpost.com/news/monkey-cage/wp/2018/02/23/whats-behind-alexei-navalnys-digital-challenge-to-vladimir-putins-regime-5-things-to-know/?utm_term=.2ec57e7c5937.
Herd, G., & Cumings, S. W. (2000). Information warfare and the second Chechen campaign. Peace in Post-Soviet Eastern Europe, 31-42.
Herd, G. P. (2018). Russia's hybrid state and President Putin's fourth-term foreign policy? The RUSI Journal, 163(4), 20-28.
Heritage, S. (2015). Why I'm finally going to boycott Amazon. The Guardian.
Herszenhorn, D. M., & Barry, E. (2011, 8 December). Putin contends Clinton incited unrest over vote. The New York Times. Retrieved from https://www.nytimes.com/2011/12/09/world/europe/putin-accuses-clinton-of-instigating-russian-protests.html.
Herzog, S. (2011). Revisiting the Estonian cyber attacks: Digital threats and multinational responses. Strategic Security, 13(4).
Hicks, C. (Producer). (2016, 23 July). Timeline of Hillary Clinton's email scandal.
Hillary Clinton's health scare: 9 unanswered questions. NBC News. Retrieved from https://www.nbcnews.com/politics/2016-election/hillary-clinton-s-health-scare-9-unanswered-questions-n646551.


Historyofvaccines.org. (2018). History of anti-vaccination movements. Retrieved 18 April 2019 from https://www.historyofvaccines.org/index.php/content/articles/history-anti-vaccination-movements.
Hitchens, C. (2014). For the sake of argument: Essays and minority reports: Atlantic Books.
Hoffman, F. G. (2009). Hybrid warfare and challenges. National Defense University, Washington DC: Institute for National Strategic Studies.
Holland, M. (2006). The propagation and power of communist security services dezinformatsiya. International Journal of Intelligence and CounterIntelligence, 19(1), 1-31.
Hollingsworth, M., & Lansley, S. (2010). Londongrad: From Russia with cash—the inside story of the oligarchs: HarperCollins UK.
Horsburgh, D. (2003). Evaluation of qualitative research. Journal of Clinical Nursing, 12(2), 307-312.
Howard, P. N., Duffy, A., Freelon, D., Hussain, M. M., Mari, W., & Maziad, M. (2011). Opening closed regimes: What was the role of social media during the Arab Spring? Available at SSRN 2595096.
Howard, P. N., & Hussain, M. M. (2013). Democracy's fourth wave?: Digital media and the Arab Spring: Oxford University Press.
Howard, P. N., et al. (2015). Opening closed regimes: What was the role of social media during the Arab Spring? SSRN.
Howard, P. N., et al. (2017). Junk news and bots during the US election: What were Michigan voters sharing over Twitter. Computational Propaganda Research Project, Oxford Internet Institute, Data Memo 2017.1.
Howard, P. N., Ganesh, B., Liotsiou, D., Kelly, J., & François, C. (2018). The IRA, social media and political polarization in the United States, 2012-2018: University of Oxford.
Howard, P. N., et al. (2019). The IRA, social media and political polarization in the United States, 2012-2018. DigitalCommons@University of Nebraska – Lincoln, 10-2019.
Hsiao, A. (Producer). (2020). What is PayPal? The Balance Small Business.
Humphreys, L., Gill, P., & Krishnamurthy, B. (2014). Twitter: A content analysis of personal information. Information, Communication & Society, 17(7), 843-857.


Huhtinen, A.-M., & Rantapelkonen, J. (2016). Junk information in hybrid warfare: The rhizomatic speed of social media in the spamosphere. European Conference on Cyber Warfare and Security: Academic Conferences International Limited.
Humor, S. (2017). A brief history of the "Kremlin trolls". The Saker. Wilmington, DE: Saker Analytics, LLC.
Huseynov, V. (2018). The nexus of neoclassical realism and soft power: Georg-August-Universität Göttingen.
Iasiello, E. (2017). Russia's improved information operations: From Georgia to Crimea. Parameters, 47(2).
Inkster, N. (2016). Information warfare and the US presidential election. Survival, 58(5), 23-32. doi:10.1080/00396338.2016.1231527.
Institute for Work & Health. (2015). Cross-sectional vs. longitudinal studies. Retrieved from https://www.iwh.on.ca/what-researchers-mean-by/cross-sectional-vs-longitudinal-studies.
Intelligence Community Assessment. (2017). Assessing Russian activities and intentions in recent US elections (ICA-2017-01D). National Intelligence Council.
Intelligence Services. (2009). Annual report of the Security Information Service for 2008. Czech Republic. Retrieved from https://www.bis.cz/annual-reports/annual-report-of-the-security-information-service-for-2008-4bdea95b.html.
Isakova, I. (2004). Russian governance in the 21st century: Geo-strategy, geopolitics and new governance: Routledge.
Isikoff, M., & Corn, D. (2018). Russian roulette: The inside story of Putin's war on America and the election of Donald Trump: Twelve.
Ivanov, S. (Writer). (2007). NTV, Moscow (in Russian). FBIS-SOV: Open Source Center.
Ižak, Š. (2019). Using the topic of migration by pro-Kremlin propaganda: Case study of Slovakia. Journal of Comparative Politics, 12(1), 53-70.
Jackall, R. (1995). Propaganda (Vol. 8): NYU Press.
Jackson, M. (2002). The politics of storytelling: Violence, transgression, and intersubjectivity (Vol. 3): Museum Tusculanum Press.
Jaitner, M. (2013). Exercising power in social media. In The fog of cyber defence (p. 57).
Jaitner, M., & Geers, K. (2015). Russian information warfare: Lessons from Ukraine. In K. Geers (Ed.), Cyber war in perspective: Russian aggression against Ukraine (p. 89).


Jaitner, M., & Mattsson, P. A. (2015). Russian information warfare of 2014. (pp. 39-52).
Jameson, F. (2013). The political unconscious: Narrative as a socially symbolic act: Routledge.
Jensen, B., et al. (2019). Fancy bears and digital trolls: Cyber strategy with a Russian twist. Journal of Strategic Studies, 42, 1-23.
Jensen, T. (2016). Trump remains unpopular; voters prefer Obama on SCOTUS pick.
Jóhannesson, G. T., & Bærenholdt, J. O. (2009). Actor-network theory/network geographies.
Jonsson, O., & Seely, R. (2015). Russian full-spectrum conflict: An appraisal after Ukraine. The Journal of Slavic Military Studies, 28(1), 1-22.
Jøsang, A., Gray, E., & Kinateder, M. (2003). Analysing topologies of transitive trust. Paper presented at the First International Workshop on Formal Aspects in Security & Trust (FAST2003).
Jovanovic, M. M. (2018). The case of Russian trolls and what next? Antidot! New Independent Multimedia Network & Campus, Western Balkans.
Judicial Watch. (2016, 30 June). Judicial Watch asks Justice Inspector General to investigate Loretta Lynch-Bill Clinton meeting. Retrieved from https://www.judicialwatch.org/press-room/press-releases/judicial-watch-asks-justice-inspector-general-investigate-loretta-lynch-bill-clinton-meeting/.
Kalinina, E. (2016). Narratives of Russia's "information wars". Politics in Central Europe: The Journal of the Central European Political Science Association, 12(1), 147-165.
Kang, C. (2016). Fake news onslaught targets pizzeria as nest of child-trafficking. The New York Times.
Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of social media. Business Horizons, 53(1), 59-68.
Katkov, M. (1863). Samoderzhavie tsaria i edinstvo Rusi [The tsar's autocracy and the unity of Rus'] (I. Yablokov, Trans.): Moskovskie vedomosti.
Katz, M. N. (2011, 26 October). No reason to fear Arab Spring in Russia. The Moscow Times. Retrieved from https://www.themoscowtimes.com/2011/10/26/no-reason-to-fear-arab-spring-in-russia-a10435.
Keene, S. D. (2011). Terrorism and the internet: A double-edged sword. Journal of Money Laundering Control.


Keller, F. B., Schoch, D., Stier, S., & Yang, J. (2020). Political astroturfing on Twitter: How to coordinate a disinformation campaign. Political Communication, 37(2), 256-280. doi:10.1080/10584609.2019.1661888.
Kelly, T. (2017). 'Follow the white rabbit' is the most bonkers conspiracy theory you will ever read. The Daily Dot.
Kennan, G. F. (1948). Policy Planning Staff memorandum (RG 273). Records of the National Security Council, NSC 10/2. Retrieved from http://academic.brooklyn.cuny.edu/history/johnson/65ciafounding3.ht.
Kim, D., Graham, T., Wan, Z., & Rizoiu, M.-A. (2019). Tracking the digital traces of Russian trolls: Distinguishing the roles and strategy of trolls on Twitter. arXiv preprint arXiv:.05228.
Kinniburgh, J., & Denning, D. (2006). Blogs and military information strategy: Joint Special Operations University, Hurlburt Field, FL.
Kirby, J. (2016, 5 December). The city getting rich from fake news. BBC News. Retrieved from https://www.bbc.com/news/magazine-38168281.
Klein, H. K., & Myers, M. D. (1999). A set of principles for conducting and evaluating interpretive field studies in information systems. MIS Quarterly, 23(1), 67-94.
Knight, J. (2004). Information warfare. In Encyclopedia of espionage, intelligence, and security (pp. 107-110): The Gale Group.
Koesel, K. J., & Bunce, V. J. (2013). Diffusion-proofing: Russian and Chinese responses to waves of popular mobilizations against authoritarian rulers. Perspectives on Politics, 11(3), 753-768.
Kofman, M., & Rojansky, M. (2015). A closer look at Russia's 'hybrid war': Woodrow Wilson International Center for Scholars.
Kosenkov, A. (2016). Cyber conflicts as a new global threat. Future Internet, 8(3), 45.
Kozachenko, I. (2015). Bad news for Putin as support for war flags beyond Russia's 'troll farms'. Working Papers of the Communities & Culture Network, 6.
Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 201320040.
Kroll, A. (2018, 9 December). John Podesta is ready to talk about Pizzagate. Rolling Stone. Retrieved from https://www.rollingstone.com/politics/politics-features/john-podesta-pizzagate-766489/.


Kuchins, A. (2012). The demise of the US-Russia reset: What's next? Retrieved from Chatham House: https://www.chathamhouse.org/sites/files/chathamhouse/public/Research/Russia%20and%20Eurasia/181012summary.pdf.
Kumar, S., et al. (2017). An army of me: Sockpuppets in online discussion communities. In Proceedings of the 26th International Conference on World Wide Web: International World Wide Web Conferences Steering Committee.
Kuzio, T. (2014). Crime, politics and business in 1990s Ukraine. Communist and Post-Communist Studies, 47(2), 195-210.
Labott, E. (2011). Clinton cites 'serious concerns' about Russian election. Retrieved from https://edition.cnn.com/2011/12/06/world/europe/russia-elections-clinton/index.html.
LaCapria, K. (Producer). (2016, 5 January). Is Comet Ping Pong pizzeria home to a child abuse ring led by Hillary Clinton?
LaFrance, A. (Producer). (2020, 18 October). The prophecies of Q.
Lange-Ionatamishvili, E.
Lambridge, W. (1996). A note on KGB style. Studies in Intelligence, 15(1).
Lanskoy, M., & Myles-Primakoff, D. (2018). Power and plunder in Putin's Russia. Journal of Democracy, 29(1), 76-85.
Laruelle, M. (2016). The three colors of Novorossiya, or the Russian nationalist mythmaking of the Ukrainian crisis. Post-Soviet Affairs, 32(1), 55-74.
Laskin, A. V. (2019). Defining propaganda: A psychoanalytic perspective. Communication and the Public, 4(4), 305-314.
Latour, B. (1996). Social theory and the study of computerized work sites. In Information technology and changes in organizational work (pp. 295-307): Springer.
Lazer, D. M., et al. (2018). The science of fake news. Science, 359(6380), 1094-1096.
Le, H. T., et al. (2017). Revisiting the American voter on Twitter. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems: ACM.
Ledford, C. J., & Anderson, L. N. (2013). Online social networking in discussions of risk: Applying the CAUSE model in a content analysis of Facebook. Health, Risk & Society, 15(3), 251-264.
Lee, D. (2018, 16 February). The tactics of a Russian troll farm. BBC News. Retrieved from https://www.bbc.com/news/technology-43093390.
Leonard, M., et al. (2007). A power audit of EU-Russia relations. European Council on Foreign Relations, London.


Leshchenko, S. (2014). The Maidan and beyond: The media's role. Journal of Democracy, 25(3), 52-57.
Lewis, J. (2000). The United States and the origins of the Cold War, 1941-1947. ACLS Humanities E-Books.
Libicki, M. C. (1995). What is information warfare? Washington, DC: National Defense University.
Lin, N. (2017). Building a network theory of social capital. In Social capital (pp. 3-28): Routledge.
Liñán, M. V. (2010). History as a propaganda tool in Putin's Russia. Communist and Post-Communist Studies, 43(2), 167-178.
Linvill, D. L., Boatwright, B. C., Grant, W. J., & Warren, P. L. (2019). "The Russians are hacking my brain!" Investigating Russia's Internet Research Agency Twitter tactics during the 2016 United States presidential campaign. Computers in Human Behavior, 99, 292-300. doi:10.1016/j.chb.2019.05.027.
Lipman, M. (2009). Media manipulation and political control in Russia: Chatham House.
Lipton, E., et al. (2016). The perfect weapon: How Russian cyberpower invaded the US. The New York Times, 13.
Lofland, J., & Lofland, L. H. (1995). Developing analysis. In Analyzing social settings (pp. 183-203).
Lonkila, M. (2012). Russian protest on- and offline: The role of social media in the Moscow opposition demonstrations in December 2011. UPI FIIA Briefing Papers, 98.
Lucas, E., & Pomeranzev, P. (2016). Winning the information war: Techniques and counter-strategies to Russian propaganda in Central and Eastern Europe. Washington: Center for European Policy Analysis.
Lucas, S., & Mistry, K. (2009). Illusions of coherence: George F. Kennan, US strategy and political warfare in the early Cold War, 1946–1950. Diplomatic History, 33(1), 39-66.
Magocsi, P. R. (2010). A history of Ukraine: The land and its peoples: University of Toronto Press.
Makow, H. (2002). Illuminati defector details pervasive conspiracy. HenryMakow.com.
Malashenko, A. V. (2013). Russia and the Arab Spring: Carnegie Moscow Center.
Malešević, S. (2018). Nationalism and the longue durée. Nations and Nationalism, 24(2), 292-299.


Mankoff, J. (2014). Russia's latest land grab: How Putin won Crimea and lost Ukraine. Foreign Affairs, 93, 60.
Marcellino, W., et al. (2017). Monitoring social media: Lessons for future Department of Defense social media analysis in support of information operations: RAND National Defense Research Institute, Santa Monica, CA.
Marcus, G. E., & Powell, M. G. (2003). From conspiracy theories in the incipient New World Order of the 1990s to regimes of transparency now.
Marples, D. R., & Duke, D. F. (1995). Ukraine, Russia, and the question of Crimea. Nationalities Papers, 23(2), 261-289.
McCaskill, N. D. (2020, 21 August). 'It was great': In leaked audio, Trump hailed low Black turnout in 2016. Politico.
McConnachie, J., & Tudge, R. (2013). The rough guide to conspiracy theories (3rd ed.): Rough Guides UK.
Metaxas, P., & Finn, S. (2017). The infamous "Pizzagate" conspiracy theory: Insights from a TwitterTrails investigation. Computation + Journalism.
Metzger, M. M., Bonneau, R., Nagler, J., & Tucker, J. A. (2016). Tweeting identity? Ukrainian, Russian, and #Euromaidan. Journal of Comparative Economics, 44(1), 16-40.
Milbank, D. (2016, 13 April). Trey Gowdy injects Benghazi into the 2016 campaign. The Washington Post. Retrieved from https://www.washingtonpost.com/opinions/trey-gowdys-benghazi-surprise/2016/04/13/dd861754-01ab-11e6-9d36-33d198ea26c5_story.html.
Miller, R. S. (2012). America's abandoned sons: Xlibris Corporation.
Miskimmon, A., O'Loughlin, B., & Roselle, L. (2014). Strategic narratives: Communication power and the new world order: Routledge.
Mohammed, A., & Adomaitis, N. (2011, 6 December). Clinton criticizes Russia vote, Germany urges improvement. Reuters. Retrieved from https://www.reuters.com/article/us-russia-election-usa/clinton-criticizes-russia-vote-germany-urges-improvement-idUSTRE7B50IE20111206.
Molander, R. C., et al. (1996). Strategic information warfare: A new face of war: Rand Corporation.
Monaghan, A. (2013). The new Russian foreign policy concept: Evolving continuity. Chatham House, Russia and Eurasia (2013/13).


Monteiro, E. (2000). Actor-network theory and information infrastructure. In From Control to Drift: The Dynamics of Corporate Information Infrastructures (pp. 71-83).

Morgan, G., & Smircich, L. (1980). The case for qualitative research. Academy of Management Review, 5(4), 491-500.

Morozov, V. (2015). Russia's Postcolonial Identity: A Subaltern Empire in a Eurocentric World. Springer.

Nakashima, E. (2016, 14 June). Russian government hackers penetrated DNC, stole opposition research on Trump. The Washington Post. Retrieved from https://www.washingtonpost.com/world/national-security/russian-government-hackers-penetrated-dnc-stole-opposition-research-on-trump/2016/06/14/cf006cb4-316e-11e6-8ff7-7b6c1998b7a0_story.html?noredirect=on&utm_term=.caa0dc1867f7.

Naylor, B. (2016). Trump apparently quotes Russian propaganda to slam Clinton on Benghazi. NPR.

Nechepurenko, I., & Schwirtz, M. (2018). What we know about Russians sanctioned by the United States. The New York Times. Retrieved from https://www.nytimes.com/2018/02/17/world/europe/russians-indicted-mueller.html.

Nelson, J. L., & Webster, J. G. (2017). The myth of partisan selective exposure: A portrait of the online political news audience. Social Media + Society, 3(3), 2056305117729314.

Newcomb, A. (2018). Twitter is purging millions of fake accounts — and investors are spooked. NBC News.

Newton, C. (Producer). (2020, 12 October). What is QAnon, the conspiracy theory spreading throughout the US [US Elections 2020].

Nguyen, T. C. (Producer). (2019, 5 May). The problem of living inside echo chambers.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220. doi:10.1037/1089-2680.2.2.175.

Nissen, T. E. (2015). #TheWeaponizationOfSocialMedia: @Characteristics_of_Contemporary_Conflicts. Royal Danish Defence College.


Noor, K. B. M. (2008). Case study: A strategic research methodology. American Journal of Applied Sciences, 5(11), 1602-1604.

Nye, J. S., Jr. (2009). Get smart: Combining hard and soft power. Foreign Affairs, 160-163.

Nye, J. S. (2004). Soft Power: The Means to Success in World Politics (1st ed.). New York: Public Affairs.

Nye, J. S. (2017). Information warfare versus soft power. The Strategist. Retrieved from https://www.aspistrategist.org.au/information-warfare-versus-soft-power/.

Nyst, C., & Monaco, N. (2018). State-Sponsored Trolling: How Governments Are Deploying Disinformation as Part of Broader Digital Harassment Campaigns. Institute for the Future.

O'Loughlin, B. (2015). The permanent campaign. London: SAGE Publications.

O'Loughlin, J., et al. (2017). The rise and fall of "Novorossiya": Examining support for a separatist geopolitical imaginary in southeast Ukraine. Post-Soviet Affairs, 33(2), 124-144.

Osipova, Y. (2017). Indigenizing soft power in Russia. In N. Chitty, J. Li, G. D. Rawnsley, & C. Hayden (Eds.), The Routledge Handbook of Soft Power. Abingdon, Oxon: Routledge.

Paavola, J., Helo, T., Jalonen, H., Sartonen, M., & Huhtinen, A. M. (2016). Understanding the trolling phenomenon: The automated detection of bots and cyborgs in the social media. Journal of Information Warfare, 15(4), 100-V.

Paletta, D. (2016). Al Qaeda-linked group cites Donald Trump in terrorist recruiting video. The Wall Street Journal. Retrieved from https://blogs.wsj.com/washwire/2016/01/01/al-qaeda-linked-group-cites-donald-trump-in-terrorist-recruiting-video/.

Palmer, A. (2019). Facebook removed 3.2 billion fake accounts between April and September, more than twice as many as last year. CNBC. Retrieved from https://www.cnbc.com/2019/11/13/facebook-removed-3point2-billion-fake-accounts-between-apr-and-sept.html.

Panarin, I. N. (2012). The Information War against Russia: Operation Anti-Putin. Schiller-Institut, Securing Mankind's Future. Berlin, Germany.

Papacharissi, Z. (2015). Affective Publics: Sentiment, Technology, and Politics. Oxford University Press.


Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. London: Viking.

Parkhe, A. (1993). 'Messy' research, methodological predispositions and theory development in international joint ventures. Academy of Management Review, 18(2), 227-268.

Paterson, T. G., & McMahon, R. J. (Eds.). (1999). The Origins of the Cold War (4th ed.). Boston: Houghton Mifflin.

Paul, C., & Matthews, M. (2016). The Russian "Firehose of Falsehood" Propaganda Model. RAND Corporation.

Pearce, K. E. (2015). Democratizing kompromat: The affordances of social media for state-sponsored harassment. Information, Communication & Society, 18(10), 1158-1174.

Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865-1880. doi:10.1037/xge0000465.

Perry, C., & Jensen, O. (2001). Approaches to combining induction and deduction in one research study. ResearchGate. Retrieved from https://www.researchgate.net/publication/255654388_Approaches_to_Combining_Induction_and_Deduction_In_One_Research_Study.

Pezard, S., & Rhoades, A. L. (2020). What Provokes Putin's Russia? Deterring Without Unintended Escalation. RAND Corporation. Retrieved from https://www.rand.org/pubs/perspectives/PE338.html.

Piedrahita, P., Borge-Holthoefer, J., Moreno, Y., & González-Bailón, S. (2018). The contagion effects of repeated activation in social networks. Social Networks, 54, 326-335.

Pikulicka-Wilczewska, A., & Sakwa, R. (2015). Ukraine and Russia: People, Politics, Propaganda and Perspectives. E-International Relations Publishing.

Policy Planning Staff Memorandum. (1948). The Inauguration of Organized Political Warfare.

Politros. (2018). Alexander Litvinenko's father about the "Skripal case": "Scotland Yard doesn't search for a criminal". Politros. Retrieved from https://politros.com/122323-otec-aleksandra-litvinenko-o-dele-skripalya-skotlend-yard-ne-zanimaetsya-poiskami-prestupnika.


Polkovnikov, P. (1999). A painful spot. Nezavisimoye Voyennoye Obozreniye.

Pomerantsev, P. (2015). The Kremlin's information war. Journal of Democracy, 26(4), 40-50. doi:10.1353/jod.2015.0074.

Pomerantsev, P., & Weiss, M. (2014). The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money. New York: Institute of Modern Russia.

Popper, K. (2006). The conspiracy theory of society. In Conspiracy Theories: The Philosophical Debate (pp. 13-16).

Price, R., & Sheth, S. (2018). DNC hacker 'Guccifer 2.0' was reportedly confirmed as a Russian agent after forgetting to conceal his identity online. Business Insider. Retrieved from https://www.businessinsider.com.au/dnc-hacker-guccifer-confirmed-as-russian-agent-after-forgetting-to-conceal-identity-2018-3?r=US&IR=T.

Prudnyk, I. (2018). "Injection of war": Disentangling the war. A case study informed by Actor-Network Theory.

Putin, V. (2011). Address by President of the Russian Federation. The Kremlin, Moscow.

Putin, V. (2014). Speech to the Russian State Duma [Press release].

Putnam, R. D. (2000). Bowling alone: America's declining social capital. In Culture and Politics (pp. 223-234). Springer.

Rall, T. (2016, 4 July). Hillary cheated. CounterPunch. Retrieved from https://www.counterpunch.org/2016/07/04/Hillary-cheated/.

Ratkiewicz, J., Conover, M., Meiss, M. R., Gonçalves, B., Flammini, A., & Menczer, F. (2011). Detecting and tracking political abuse in social media. ICWSM, 11, 297-304.

Reed, M. (1995). The action/structure debate in organizational analysis. Paper presented at the Conference on Structuration Theory and Organizations.

Reiss, M. (2019). Disinformation in the Reagan years and lessons for today. Retrieved from www.jstor.org/stable/resrep19125.

Relman, E. (2017, 16 November). These are the sexual-assault allegations against Bill Clinton. Business Insider. Retrieved from https://www.businessinsider.com.au/these-are-the-sexual-assault-allegations-against-bill-clinton-2017-11?r=US&IR=T.

Renz, B. (2016). Russia and 'hybrid warfare'. Contemporary Politics, 22(3), 283-300. doi:10.1080/13569775.2016.1201316.


Research Methodology. (2018). Inductive approach (inductive reasoning). Retrieved 20 August 2018, from https://research-methodology.net/research-methodology/research-approach/inductive-approach-2/.

Reuters. (2016, 16 October). Donald Trump continues to belittle the media and Hillary Clinton at his rallies. Fortune.com. Retrieved from http://fortune.com/2016/10/16/trump-belittles-media-and-clinton/.

Richards, L. (1993). Writing a qualitative thesis or grant application. In K. Beattie (Ed.), So Where's Your Research Profile? A Resource Book for Academics. South Melbourne: Union of Australian College Academics.

Rid, T. (2012). Cyber war will not take place. Journal of Strategic Studies, 35(1), 5-32. doi:10.1080/01402390.2011.608939.

Riggins, N. (2017). What is clickbait and why should you be careful using it to promote your business? Small Business Trends.

Rittersporn, G. T. (2014). Anguish, Anger, and Folkways in Soviet Russia. Pittsburgh, PA: University of Pittsburgh Press.

Robb, A. (2017). Anatomy of a fake news scandal. Rolling Stone. Retrieved from https://www.rollingstone.com/politics/politics-news/anatomy-of-a-fake-news-scandal-125877/.

Robinson, L., Helmus, T. C., Cohen, R. S., Nader, A., Radin, A., Magnuson, M., & Migacheva, K. (2018). Modern Political Warfare: Current Practices and Possible Responses. Santa Monica, CA: RAND Corporation.

Robinson, M., Jones, K., Janicke, H., & Maglaras, L. (2018). An introduction to cyber peacekeeping. Journal of Network and Computer Applications, 114, 70-87.

Rodriguez, M. (2019). Disinformation operations aimed at (democratic) elections in the context of public international law: The conduct of the Internet Research Agency during the 2016 US presidential election. International Journal of Legal Information, 47(3), 197. doi:10.1017/jli.2019.28.

Roonemaa, H., & Springe, I. (2018, 31 August). This is how Russian propaganda actually works in the 21st century. BuzzFeed News. Retrieved from https://www.buzzfeednews.com/article/holgerroonemaa/russia-propaganda-baltics-baltnews.


Roslycky, L. L. (2011). Russia's smart power in Crimea: Sowing the seeds of trust. Southeast European and Black Sea Studies, 11(3), 299-316. doi:10.1080/14683857.2011.590313.

Roth, Y. (2019). Information operations on Twitter: Principles, process, and disclosure. Retrieved from https://blog.twitter.com/en_us/topics/company/2019/information-ops-on-twitter.html.

Roudik, P. (2019). Russian Federation: Legal aspects of war in Georgia. Retrieved from https://www.loc.gov/law/help/legal-aspects-of-war/russian-georgia-war.php.

Rouse, M. (2016). Denial of service (DoS) attack. Retrieved from http://searchsecurity.techtarget.com/definition/denial-of-service.

RT. (2018). About us. Retrieved from https://www.rt.com/about-us/.

Rupert, E. (2016, 5 July). Trump tells crowd Clinton wants to 'abolish' Second Amendment. The Hill. Retrieved from http://thehill.com/blogs/blog-briefing-room/news/279139-trump-tells-rally-crowd-clinton-wants-to-abolish-second.

Russkiy Mir. (2012). Vladimir Putin: Russia and the changing world. Yandex.dzen.

Rutland, P. (2003). Putin and the oligarchs. In Putin's Russia: Past Imperfect, Future Uncertain (pp. 133-152). Lanham: Rowman & Littlefield.

Rutland, P., & Kazantsev, A. (2016). The limits of Russia's 'soft power'. Journal of Political Power, 9(3), 395-413. doi:10.1080/2158379X.2016.1232287.

Ryan, A. B. (2006). Post-positivist approaches to research. In Researching and Writing Your Thesis: A Guide for Postgraduate Students (pp. 12-26).

Samadashvili, S. (2015). Muzzling the bear: Strategic defence against Russia's undeclared information war on Europe. European View, 14(1), 141. doi:10.1007/s12290-015-0361-7.

Sanger, D. E. (2018). The age of cyberwar is here. We can't keep citizens out of the debate. The Guardian.

Sanovich, S. (2017). Computational propaganda in Russia: The origins of digital misinformation. Working paper.

Sarotte, M. E. (2014). A broken promise? What the West really told Moscow about NATO expansion. Foreign Affairs, 93, 90.

Satter, D. (2017). Russia questions for Rex Tillerson. The Wall Street Journal. Retrieved from https://www.hudson.org/research/13222-russia-questions-for-rex-tillerson.

Sayce, D. (2018). Number of tweets per day? David Sayce, Digital Consultant. London.


Schneider, J. (2017). The information revolution and international stability: A multi-article exploration of computing, cyber, and incentives for conflict. The George Washington University.

Schoen, F., & Lamb, C. J. (2012). Deception, Disinformation, and Strategic Communications: How One Interagency Group Made a Major Difference. National Defense University Press.

Schweitzer, A. (2016, 14 December). This isn't the start of a new cold war – the first one never ended. The Guardian. Retrieved from https://www.theguardian.com/commentisfree/2016/dec/13/cold-war-never-ended-west-russia.

Schwirtz, M., & Barry, E. (2018). A spy story: Sergei Skripal was a little fish. He had a big enemy. The New York Times.

Science Daily. (2020). Confirmation bias. Retrieved from https://www.sciencedaily.com.

Sciutto, J. (2017). How one typo helped let Russian hackers in. CNN Politics. Retrieved from https://edition.cnn.com/2017/06/27/politics/russia-dnc-hacking-csr/index.html.

Search Security. (2018). Logic bomb. Retrieved from https://searchsecurity.techtarget.com/definition/logic-bomb.

Seitz-Wald, A., Alba, M., Mitchell, A., Welker, K., & Hunt, K. (2016, 12 September).

Sherchan, W., Nepal, S., & Paris, C. (2013). A survey of trust in social networks. ACM Computing Surveys (CSUR), 45(4), 47.

Siggett, S. (2017). The cyber cold war: A look into Russia's information warfare capabilities. In C. Riddell & B. Lauman (Eds.). ProQuest Dissertations Publishing.

Silverman, C., & Alexander, L. (2016, 11 December). How teens in the Balkans are duping Trump supporters with fake news.

Silverman, D. (2007). A Very Short, Fairly Interesting and Reasonably Cheap Book About Qualitative Research. London: SAGE.

Simes, D. K. (1998). Russia's crisis, America's complicity. The National Interest, (54), 12-22.

Simons, G. (2011). Attempting to re-brand the branded: Russia's international image in the 21st century. Russian Journal of Communication, 4(3-4), 322-350.

Sindelar, D. (2014). The Kremlin's troll army. The Atlantic, 12.


Singer, P. W., & Brooking, E. T. (2018). LikeWar: The Weaponization of Social Media. Eamon Dolan Books.

Sivkov, K. (2015). Led by the "fifth column"—part I: Lenin's ideas live and sometimes win. VPK News, 20(586).

Slipchenko, V. (2002). Vojny shestogo pokoleniya [Wars of the sixth generation]. Moscow: Veche.

Smith, D. (1999). Working the Rough Stone: Freemasonry and Society in 18th Century Russia. De Kalb: Northern Illinois University Press.

Smith, R. (2015, 4 June). Columbia Chemical hoax tracked to "troll farm" dubbed the Internet Research Agency. News.com.au. Retrieved from https://www.news.com.au/technology/online/social/columbia-chemical-hoax-tracked-to-troll-farm-dubbed-the-internet-research-agency/news-story/128af54a82b83888158f7430136bcdd1.

Snee, H. (2013). Framing the Other: Cosmopolitanism and the representation of difference in overseas gap year narratives. The British Journal of Sociology, 64(1), 142-162.

Snegovaya, M. (2015). Putin's Information Warfare in Ukraine: Soviet Origins of Russia's Hybrid Warfare. Washington.

Snider, M. (2018, 16 February). Robert Mueller investigation: What is a Russian troll farm? USA Today, p. 1. Retrieved from https://www.usatoday.com/story/tech/news/2018/02/16/robert-mueller-investigation-what-russian-troll-farm/346159002/.

Soldatov, A., & Borogan, I. (2015). The Red Web. United States: Public Affairs.

Soros, G. (2019). George Soros. Retrieved from https://www.georgesoros.com.

Speier, H. (1948). The future of psychological warfare. Public Opinion Quarterly, 12(1), 5-18.

Sperling, V. (2012). Nashi Devushki: Gender and political youth activism in Putin's and Medvedev's Russia. Post-Soviet Affairs, 28(2), 232-261.

Spruds, A., et al. (2016). Internet Trolling as a Hybrid Warfare Tool: The Case of Latvia. Riga, LV: NATO Strategic Communications Centre of Excellence. stratcomcoe.org.

Starbird, K. (2017). Examining the alternative media ecosystem through the production of alternative narratives of mass shooting events on Twitter. Paper presented at the Eleventh International AAAI Conference on Web and Social Media.


Steiger, S., et al. (2018). Conceptualising conflicts in cyberspace. Journal of Cyber Policy, 3(1), 77-95.

Stein, R. A. (2017). The golden age of anti-vaccine conspiracies. Germs, 7(4), 168.

Steinzova, L., & Oliynyk, K. (2018). The sparks of change: Ukraine's Euromaidan protests. Radio Free Europe/Radio Liberty.

Stevenson, P. (2016). Trump is headed for a win, says professor who has predicted 30 years of presidential outcomes correctly. The Washington Post. Washington, DC: WP Company LLC.

Stewart, L. G., Arif, A., & Starbird, K. (2018). Examining trolls and polarization with a retweet network. Paper presented at Proc. ACM WSDM, Workshop on Misinformation and Misbehavior Mining on the Web.

Stout, M. (2017). Covert action in the age of social media. Georgetown Journal of International Affairs (December). Retrieved from https://www.georgetownjournalofinternationalaffairs.org/online-edition/2017/12/22/covert-action-in-the-age-of-social-media.

Subrahmanian, V., et al. (2016). The DARPA Twitter bot challenge. Computer, 49(6), 38-46.

Sugrue, C. (2018, 23 March). What Twitter's bulk tweeting ban means to marketers. Falcon.io. Retrieved from https://www.falcon.io/insights-hub/topics/social-media-management/twitter-automation-policy-marketers/.

Sunstein, C. (2018). Is social media good or bad for democracy? Sur International Journal on Human Rights, 15(27), 83.

Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17(2), 202-227.

Suryanarayana, G., & Taylor, R. (2004). A Survey of Trust Management and Resource Discovery Technologies in Peer-to-Peer Applications.

Suzen, H. (2018). A comparative study of Russian political warfare against the West and the Western (NATO & EU) counteractions. Beyond the Horizon. Retrieved from https://www.behorizon.org/russian-political-warfare-against-nato-eu-counteractions/.

Suzor, N. P. (2018). Lawless: The Secret Rules That Govern Our Digital Lives. Open Science Framework.


Svetoka, S., & Geers, K. (2015). Strategic communications and social media in the Russia-Ukraine conflict. In Cyber War in Perspective: Russian Aggression against Ukraine. Tallinn: NATO CCD COE Publications.

Szafranski, R. (1995). A Theory of Information Warfare: Preparing for 2020.

Thalis, A. (2018). Threat or threatened? Russia in the era of NATO expansion. Australian Institute of International Affairs (Australian Outlook). Retrieved from https://www.internationalaffairs.org.au/australianoutlook/threat-or-threatened-russian-foreign-policy-in-the-era-of-nato-expansion/.

The Embassy of the Russian Federation in Canada. (2014). The President of the Russian Federation Vladimir Putin press conference with regard to the situation in Ukraine [Press release].

The Warsaw Institute Review. (2019). The Katyn Massacre – the way to the truth. Retrieved from https://warsawinstitute.review/issue-2019/issue-2-2019/the-katyn-massacre-the-way-to-the-truth/.

Thomas, T. (2003). Manipulating the mass consciousness: Russian & Chechen "information war" tactics in the second Chechen-Russian conflict. In The Second Chechen War (pp. 112-129).

Thornton, R. (2015). The changing nature of modern warfare: Responding to Russian information warfare. The RUSI Journal, 160(4), 40-48.

Timberg, C., Dwoskin, E., & Entous, A. (2017, 1 November). U.S. Senate releases trove of Russian Facebook ads, revealing sophisticated influence campaign. The Washington Post. Retrieved from https://www.thestar.com/news/world/2017/11/01/us-senate-releases-trove-of-russian-facebook-ads-revealing-sophisticated-influence-campaign.html.

Tinati, R., Halford, S., Carr, L., & Pope, C. (2014). Big data: Methodological challenges and approaches for sociological analysis. Sociology, 48(4), 663-681.

Torabi, G. (2018). Revolution and war in contemporary Ukraine: The challenge of change. ResearchGate, 70(2), 202-304.

Tornoe, R. (2018). The dark side of social media. Editor & Publisher, 151(1), 24-25.

Toucas, B. (2017). The Geostrategic Importance of the Black Sea Region: A Brief History. Center for Strategic & International Studies. Retrieved from https://www.csis.org/analysis/geostrategic-importance-black-sea-region-brief-history.


Trento, T. (Producer). (2016). Fallen Angel: Cover-up of SEAL Team Six shoot-down.

Treverton, G. (2017). Influence Operations and the Intelligence/Policy Challenges.

Trifonov, D. (2003). Russian intelligence presence in the CIS. Central Asia-Caucasus Analyst, 17.

Tromblay, D. E. (2018). Political Influence Operations. United Kingdom: Rowman & Littlefield.

True Pundit. (2016, 2 November). Breaking bombshell: NYPD blows whistle on new Hillary emails: Money laundering, sex crimes with children, child exploitation, pay to play, perjury. Retrieved from https://truepundit.com/breaking-bombshell-nypd-blows-whistle-on-new-hillary-emails-money-laundering-sex-crimes-with-children-child-exploitation-pay-to-play-perjury.

True Pundit. (2019). Retrieved from https://mediabiasfactcheck.com/true-pundit/.

Tsfati, Y. (2010). Online news exposure and trust in the mainstream media: Exploring possible associations. American Behavioral Scientist, 54(1), 22-42.

Twitter. (2018). We're focused on serving the public conversation. Twitter (Election Integrity).

United States District Court for the District of Columbia. (2018). Indictment: United States of America v. Internet Research Agency LLC (p. 37).

URA. (2012a, 23 January). For the last month and a half a secret squad of bloggers has been working. They were asked, for example, to call Tagil workers "cattle". URA.RU. Retrieved from https://ura.news/articles/1036257512.

URA. (2012b, 27 March). Hackers have made public the work of Sverdlovsk PR people who are helping the Misharin administration on the Internet. In open access—hundreds of letters with "temniki", directives and instructions. URA.RU. Retrieved from https://ura.news/news/1052141222.

US Department of Defense (Producer). (2020, 5 May). Force multiplier.

van Dijck, J. (2013). Facebook and the engineering of connectivity: A multi-layered approach to social media platforms. Convergence, 19(2), 141-155.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197-208.

van Dijck, J. (2018). The Platform Society: Public Values in a Connective World. Oxford: Oxford University Press.

Vasserman, A. (2017). Sotsial'nyye seti i dezinformatsiya [Social networks and disinformation].

Veebel, V. (Producer). (2015, 5 May). Russian propaganda, disinformation, and Estonia's experience. Foreign Policy Research Institute.

Velkov, K. (2016). Pizzagate: A downfall hidden in plain sight. Geopolitica.ru. Russia.

Ven Bruusgaard, K. (2014). Crimea and Russia's strategic overhaul. Parameters, 44(3), 81.

Vendil Pallin, C., & Westerlund, F. (2009). Russia's war in Georgia: Lessons and consequences. Small Wars & Insurgencies, 20(2), 400-424.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.

Wagner, C., et al. (2012). When social bots attack: Modelling susceptibility of users in online social networks. Making Sense of Microposts, 2(4), 1951-1959.

Walker, S. (2015, 2 April). The Russian troll factory at the heart of the meddling allegations. The Guardian. Retrieved from https://www.theguardian.com/world/2015/apr/02/putin-kremlin-inside-russian-troll-house.

Walsham, G. (1997). Actor-network theory and IS research: Current status and future prospects. In Information Systems and Qualitative Research (pp. 466-480). Springer.

Wang, W., et al. (2015). Forecasting elections with non-representative polls. International Journal of Forecasting, 31(3), 980-991.

Warner, M. (2017, 1 November). [Opening statement of Vice Chairman Warner from Senate Intel open hearing with social media representatives].

Warrick, J., & Troianovski, A. (2018, 10 December). Agents of doubt. The Washington Post. Retrieved from https://www.washingtonpost.com/graphics/2018/world/national-security/russian-propaganda-skripal-salisbury/?utm_term=.6c86494b17ae.

Waszczykowski, W. (2015). The Battle for the Hearts and Minds: Countering Propaganda Attacks Against the Euro-Atlantic Community. 048 CDSDG 15 E: NATO Parliamentary Assembly.

Watkins, E. (2016, 1 July). Bill Clinton meeting causes headaches for Hillary. CNN. Retrieved from https://edition.cnn.com/2016/06/29/politics/bill-clinton-loretta-lynch/index.html.


Weir, F. (2018, 15 January). Before Russia's 'troll farm' turned to US, it had a more domestic focus. The Christian Science Monitor. Retrieved from https://www.csmonitor.com/World/Europe/2018/0221/Before-Russia-s-troll-farm-turned-to-US-it-had-a-more-domestic-focus.

Wendling, M. (2020, 20 August). QAnon: What is it and where did it come from? BBC News. Retrieved from https://www.bbc.com/news/53498434.

Wilson, T., & Starbird, K. (2020). Cross-platform disinformation campaigns: Lessons learned and next steps. The Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-002.

Winner, L. (1993). Upon opening the black box and finding it empty: Social constructivism and the philosophy of technology. Science, Technology, & Human Values, 18(3), 362-378.

Wirtz, J. J. (2015). Cyber war and strategic culture: The Russian integration of cyber power into grand strategy. In K. Geers (Ed.), Cyber War in Perspective: Russian Aggression against Ukraine (pp. 29-38).

Wolfe, L. (2018). Twitter user statistics 2008 through to 2017. The Balance Careers.

Woolley, S., & Howard, P. N. (2016). Automation, algorithms, and politics. International Journal of Communication, 10(9).

Woolley, S. C., & Howard, P. N. (2018). Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media. Oxford University Press.

Yablokov, I. (2015). Conspiracy theories as a Russian public diplomacy tool: The case of Russia Today (RT). Politics, 35(3-4), 301-315.

Yablokov, I. (2018). Fortress Russia: Conspiracy Theories in Post-Soviet Russia. Polity Press.

Yablokov, I. (2019). Russian conspiracy theories: How Kremlin-backed yarns help keep Vladimir Putin in power. The Conversation.

Yi, E. (2018, 8 October). Themes don't just emerge—coding the qualitative data. Data Science. Retrieved from https://medium.com/@projectux/themes-dont-just-emerge-coding-the-qualitative-data-95aff874fdce.

Yin, R. K. (2015). Qualitative Research from Start to Finish. Guilford Publications.

Zannettou, S., Caulfield, T., Setzer, W., Sirivianos, M., Stringhini, G., & Blackburn, J. (2018). Who let the trolls out? Towards understanding state-sponsored trolls. arXiv preprint arXiv:.03130.


Zannettou, S., et al. (2019). Who let the trolls out? Towards understanding state-sponsored trolls. In Proceedings of the 10th ACM Conference on Web Science.

Zatsepin, V. (2005). Performance-oriented defence budgeting: A Russian perspective. Gaidar Institute for Economic Policy.

Zhang, Z., & Gupta, B. B. (2018). Social media security and trustworthiness: Overview and new direction. Future Generation Computer Systems, 86, 914-925.

Zorin, A. (2001). Kormia dvuglavogo orla: Literatura i gosudarstvennaia ideologiia v Rossii v poslednei treti XVIII—pervoi treti XIX veka [Feeding the double-headed eagle: Literature and state ideology in Russia from the last third of the 18th to the first third of the 19th century]. Moscow: Novoe literaturnoe obozrenie.

Zurcher, A. (2016, 6 November). Hillary Clinton emails—what's it all about? BBC News, US & Canada. Retrieved from http://www.bbc.com/news/world-us-canada-31806907.
