Hybrid CoE Working Paper 11
COI HYBRID INFLUENCE
July 2021

Disinformation 2.0: Trends for 2021 and beyond

ISABELLA GARCIA-CAMARGO AND SAMANTHA BRADSHAW

Hybrid CoE Working Papers cover work in progress: they develop and share ideas on Hybrid CoE’s ongoing research/workstrand themes, or analyze actors, events or concepts that are relevant from the point of view of hybrid threats. They cover a wide range of topics related to the constantly evolving security environment.

COI Hybrid Influence looks at how state and non-state actors conduct influence activities targeted at Participating States and institutions, as part of a hybrid campaign, and how hostile state actors use their influence tools in ways that attempt to sow instability, or curtail the sovereignty of other nations and the independence of institutions. The focus is on the behaviours, activities, and tools that a hostile actor can use. The goal is to equip practitioners with the tools they need to respond to and deter hybrid threats. COI HI is led by the UK.

The European Centre of Excellence for Countering Hybrid Threats
tel. +358 400 253800
www.hybridcoe.fi
ISBN (web) 978-952-7282-88-5
ISBN (print) 978-952-7282-89-2
ISSN 2670-160X
July 2021

Hybrid CoE is an international hub for practitioners and experts, building Participating States’ and institutions’ capabilities and enhancing EU-NATO cooperation in countering hybrid threats, located in Helsinki, Finland.

The responsibility for the views expressed ultimately rests with the authors.

Contents

EXECUTIVE SUMMARY
THE EVOLVING THREAT LANDSCAPE: NEW TRENDS IN DIGITAL DISINFORMATION
  Hard-to-verify content
  Top-down disinformation
  Groups and online communities
  Cross-platform coordination
  Algorithmic amplification
  Images and ephemeral content
  Live streams
  Extremist platforms
  Encrypted platforms
FUTURE TECHNOLOGIES: ARTIFICIAL INTELLIGENCE, DEEP FAKES AND THE NEXT GENERATION OF DISINFORMATION
EVALUATING PLATFORM RESPONSES
  Facing coordinated authentic behaviour
  Policy-evading content
  Networked disinformation
  Incentivized disinformation
  AI on the horizon
CONCLUSIONS AND RECOMMENDATIONS
AUTHORS
REFERENCES

Executive summary

Russian interference in the 2016 US presidential election drew attention to the many ways that social media can be leveraged for widescale information operations by nation-state actors. Since then, however, these campaigns have evolved: rather than a large foreign effort to undermine the integrity of the election, the new digital threats to the 2020 US presidential election came from domestic users who co-opted social media platforms to spread disinformation about a “stolen election”. Groups like #StopTheSteal – coordinated by many authentic American users – seeded false claims about illegal voting on Facebook and Twitter.1 Conservative and far-right livestreams on YouTube amplified conspiracies about rigged voting machines to audiences that rival television broadcast shows.2 Users on TikTok and Instagram posted stories about missing, destroyed, or uncounted ballots on election day – many of which would disappear after 24 hours.3 And on fringe or alternative platforms like Parler, Gab, and 4chan, users left comments inciting the “rape, torture, and the assassination of named public officials and private citizens”, rhetoric that ultimately led to an insurrection at the US Capitol on 6 January 2021.

1 Silverman, Mac, and Lytvynenko, ‘Facebook Knows It Was Used To Help Incite The Capitol Insurrection’.
2 Bradshaw et al., ‘Election Delegitimization’.
3 Paul, ‘TikTok: False Posts about US Election Reach Hundreds of Thousands’.

In this Hybrid CoE Working Paper, we evaluate how disinformation has evolved from 2016 to 2020. We define “disinformation” as the purposeful spread of false, misleading, or exaggerated content online, or the use of fake accounts or pages designed to purposefully manipulate or mislead users. Using the 2020 election in the United States as our case study, we examine how the actors, strategies, and tactics for spreading disinformation have evolved over the past four years, and discuss the direction that future policymaking should take to address contemporary trends in information operations. Our data and examples are drawn from the research and analysis we conducted as part of the Election Integrity Partnership, facilitated by the Center for an Informed Public, the Digital Forensic Research Lab, Graphika and the Stanford Internet Observatory. Here, we identified ten key trends in 2020 disinformation campaigns:

1) Disinformation actors made use of hard-to-verify content, which is content that is not falsifiable through fact-checking. Hard-to-verify content can evade takedowns because it blurs the line between truth and fiction, with users framing disinformation as stories they heard from friends, or asking questions about known conspiracies or disinformation.

2) Rather than bots or fake accounts amplifying disinformation, most viral disinformation was top-down, amplified by influencer accounts with millions of followers. As platforms consider speech from influencers – like politicians – newsworthy, content produced and shared by influencer accounts is treated as a public good and has not been moderated as strictly as that from other, non-newsworthy accounts.

3) Disinformation narratives were not spread on a single platform but often appeared across multiple platforms. This made it difficult to remove conspiracy theories or disinformation as they emerged online, since information operations often extended beyond the remit of a single platform to respond.

4) Social media platforms did not take action quickly enough against groups and online communities that encouraged and coordinated the authentic, bottom-up spread of disinformation. These entities were the primary drivers of key disinformation campaigns, which spilled into offline action, including #StopTheSteal.

5) While recommendation algorithms are foundational to the business models of social media platforms, algorithms can also amplify disinformation and reinforce viewpoints within certain groups of users. Some election delegitimization narratives were picked up and recommended by platform algorithms.

6) Many disinformation actors leveraged images, videos and ephemeral content to create highly engaging posts and elicit strong emotional responses in stories that often disappeared after 24 hours. Researchers, fact-checkers, and even social media platforms themselves found it difficult to moderate or track the origin, spread and impact of this specific disinformation tactic.

7) Video streaming platforms which were used to livestream content blended disinformation and conspiracy into online talk shows, often broadcasting to millions of people in real time. The real-time nature of this content, combined with the fact that shows could be an hour-long segment with only a brief mention of a conspiracy theory, introduced a range of moderation challenges that platforms were not prepared to address.

8) Narratives moderated by mainstream platforms would reappear in extremist communities online, which drew a large number of followers who had similarly been removed from mainstream platforms. Extremist communities, like Parler, Gab, 4chan or 8chan, take a laissez-faire approach to content moderation, which allowed disinformation and conspiracy to continue to find a home online.

9) Certain communities of users – like diaspora communities – communicated through encrypted platforms, where harmful disinformation about the legitimacy of the election spread. The encrypted nature of the platforms made this content much harder to track. Additionally, the organic trust engendered within these communities made this disinformation even harder to counter.

10) While advances in artificial intelligence have made it easier to create fake images and videos that look realistic – the use of AI remained largely untapped during the 2020 US presidential election. Sometimes, AI would be used for deception – to create fake profile photos of a person who does not exist to make inauthentic accounts appear real. However, no large-scale disinformation campaigns or viral content relied primarily on the use of these technologies.

Given growing concerns over the misuse of social media platforms, the companies themselves have adopted a series of measures designed to limit the harms of disinformation. These strategies have ranged from improving both human and automated content moderation capabilities to investing in fact-checking initiatives, or labelling information sources to help users distinguish trustworthy entities from dubious ones.4 In the lead-up to the 2020 US presidential election, social media platforms also introduced election-specific policies meant to stem the spread of election-related mis- and disinformation, such as claims delegitimizing election results or calls to interfere with voting operations through violence.5 However, despite these initiatives, the spread of disinformation continues to disrupt democratic deliberation online. We identify four areas in which policies were inadequate to address the realized dynamics:

1) The most salient disinformation narratives were coordinated by authentic users, rather than fake, inauthentic accounts. While platforms have developed and implemented policies designed to target inauthentic behaviour, decisions were much slower and more difficult when real users were involved.

2) While platforms have policies targeting election