Medical Misinformation in the Covid-19 Pandemic


Sarah Kreps and Doug Kriner
Department of Government, Cornell University

Abstract

The World Health Organization has labeled the omnipresence of misinformation about Covid-19 an "infodemic" that threatens efforts to battle the public health emergency. However, we know surprisingly little about the level of public uptake of medical misinformation and whether and how it affects public preferences and assessments. We conduct a pair of studies that examine the pervasiveness and persuasiveness of misinformation about the novel coronavirus's origins, effective treatments, and the efficacy of the government response. Across categories, we find relatively low levels of true recall of even prominent fake claims. However, many Americans struggle to distinguish fact from fiction, with many believing false claims and even more failing to believe factual information. An experiment offers some evidence that corrections may succeed in reducing misperceptions, at least in some contexts. Finally, we find little evidence that exposure to misinformation significantly affected a range of policy beliefs and political judgments.

One of the challenging public health aspects of the Covid-19 epidemic has been the misinformation surrounding the virus. Misinformation in the midst of a pandemic has a long history, dating back at least to the Plague of Athens, when the local population tried to shift blame onto an adversary or far-flung land rather than the local government. What makes the current misinformation context new and potentially threatening is that the internet facilitates the transfer of misinformation—defined as "false or misleading information"1—further and faster than either traditional forms of media or accurate information.2 How pervasive and persuasive is the spread of medical misinformation? Prior studies offer few clues.
Rumors about death panels surrounded the Affordable Care Act, showing that high-stakes public health debates are not inoculated against misinformation and may be even more susceptible to it because the life-and-death consequences make people prone to fear and anxiety.3 Beyond such case-specific studies, however, the pervasiveness and persuasiveness of medical misinformation remain understudied compared with political misinformation, which has received intense scrutiny especially since the 2016 election. Recent research hints that medical misinformation may be less ubiquitous than political misinformation. Confronted with rapidly spreading false claims about Covid-19, social media platforms, the major vehicle for the diffusion of misinformation, have enacted unprecedented moderation policies, removing content and users that the platforms deem a public health risk. Because of the exigent threat to public health, the public has tacitly endorsed these draconian measures and entrusted platforms to act as private regulators of the public information domain.4 However, the sheer volume of Covid-19 related content means that misinformation continues to propagate,5 although the degree of public exposure and its impact remain unclear. According to one recent study, a small sample of fake claims on Facebook was shared 1.7 million times and viewed an estimated 117 million times as of mid-April 2020.6

In this research, we investigate the extent to which misinformation has percolated into the salient considerations on which Americans draw when thinking about the novel coronavirus. Can Americans faithfully recall Covid-19 misinformation? Can they distinguish factual information from misinformation? How does the spread of fake news affect public attitudes about the pandemic, trust in government, and perceptions of international adversaries? We answer these questions with a pair of studies focusing on misinformation about Covid-19, the disease caused by the novel coronavirus.
First, we measure public uptake and perceived credibility of misinformation, comparing recall rates and accuracy perceptions across factual information, prominent misinformation about Covid-19, and placebo misinformation that has not appeared widely on social media. Second, we measure the impact of false claims on a range of attitudes, including public policy preferences, evaluations of and trust in government leaders and institutions, and perceptions of foreign competitors. These studies are the first to measure recall of Covid misinformation, as well as the first to assess the efficacy of corrections in correcting accuracy perceptions and reducing the propensity to distribute misinformation online. We report five main findings.

1. David Lazer, Matthew Baum, Yochai Benkler, Adam Berinsky, Kelly Greenhill, Filippo Menczer, Miriam Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven Sloman, Cass Sunstein, Emily Thorson, Duncan Watts, and Jonathan Zittrain, "The Science of Fake News," Science 359, no. 6380 (9 March 2018): 1094-1096, at 1094.
2. Soroush Vosoughi, Deb Roy, and Sinan Aral, "The Spread of True and False News Online," Science 359, no. 6380 (9 March 2018): 1146-1151.
3. Adam Berinsky, "Rumors and Health Care Reform: Experiments in Political Misinformation," British Journal of Political Science 47, no. 2 (April 2017): 241-262.
4. Sarah Kreps and Brendan Nyhan, "Coronavirus Fake News Isn't Like Other Fake News," Foreign Affairs, 30 March 2020.
5. Ramez Kouzy, Joseph Abi Jaoude, and Khalil Baddour, "Coronavirus Goes Viral: Quantifying the COVID-19 Misinformation Epidemic on Twitter," Cureus 12, no. 3 (March 2020): e7255.
6. Avaaz, How Facebook Can Flatten the Curve of the Coronavirus Infodemic, 15 April 2020, https://secure.avaaz.org/campaign/en/facebook_coronavirus_misinformation/.
First, while misinformation concerning the pandemic is ubiquitous, our data suggest that uptake and retention of misinformation is modest overall, though it varies by category. We find that true recall of fake headlines about the origins of Covid-19 is modest. However, misinformation about alleged treatments and about the effectiveness of the government response to the virus has gained some traction. Second, we find that many Americans fail to correctly identify fake news as false. Perhaps even more troubling, still more Americans fail to correctly identify factual information as true. This suggests a more indirect, but potentially more dangerous, mechanism through which misinformation threatens public health: not by causing majorities to believe erroneous claims, but by saturating the information environment to such an extent that it drowns out accurate information.7 Third, these problems are particularly acute among certain partisan subgroups and among heavy consumers of social media. Fourth, corrections to fake news can counter beliefs in misinformation and reduce Americans' propensity to contribute to its spread; however, these effects vary across categories of fake claims. Finally, exposure to misinformation had little direct effect on Americans' policy preferences for responding to the pandemic or on their political judgments.

Medical Misinformation

The democratic dilemma holds that sound democratic governance hinges on a well-informed citizenry that can meaningfully weigh tradeoffs between policy proposals. Yet most individuals are underinformed about the very policies that they are meant to adjudicate.8 Citizens could become more informed if they took measures to acquire policy-relevant information, but the marketplace of ideas is increasingly crowded, and indeed fraught with misinformation that can impede the acquisition of accurate information.
Research on the spread, uptake, and persuasiveness of misinformation has tended to focus on political misinformation, especially since the 2016 election. Some scholars have found that exposure to misinformation does not translate into persuasion, in part because those most exposed are partisans seeking pro-attitudinal information.9 Other studies, however, have shown that individuals do fall prey to misinformation, although not for directionally motivated, partisan reasons. Rather, scholars suggest that individuals are cognitively "lazy" and judge accuracy on the basis of plausibility, which requires some prior sense of what is reasonable.10 Research on medical misinformation has similarly suggested that individuals believe rumors on the basis of cognitive fluency: the more prevalent a rumor, which can arise from partisan political actors frequently trafficking in particular narratives, the more credible it becomes and the harder it is to upend.11 We investigate the applicability of these findings in the Covid-19 context.

7. The mechanism is similar to the arguments of Berinsky as well as Pennycook and Rand, which suggest that the fluency of information, which comes from repeated exposure, increases the plausibility of claims.
8. Arthur Lupia and Mathew McCubbins, The Democratic Dilemma: Can Citizens Learn What They Need to Know? (Cambridge: Cambridge University Press, 1998).
9. Andrew Guess, Brendan Nyhan, and Jason Reifler, "Exposure to Untrustworthy Websites in the 2016 US Election," Nature Human Behaviour (2020), https://www.nature.com/articles/s41562-020-0833-x.
10. Gordon Pennycook and David Rand, "Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning than by Motivated Reasoning," Cognition (2018).
11. Adam Berinsky, "Rumors and Health Care Reform: Experiments in Political Misinformation," British Journal of Political Science 47, no. 2 (April 2017): 241-262.
Previous research suggests that political misinformation travels faster and farther than accurate information.