Countering Information Influence Activities: The State of the Art

Total pages: 16

File type: PDF, size: 1020 KB

RESEARCH REPORT

Countering Information Influence Activities: The State of the Art

Countering Information Influence Activities: The State of the Art, version 1.4 (1 July 2018)

James Pamment, Howard Nothhaft, Henrik Agardh-Twetman, Alicia Fjällhed
Department of Strategic Communication, Lund University

MSB's point of contact: Fredrik Konnander
Publication number: MSB1261, July 2018
ISBN: 978-91-7383-865-8

MSB has ordered and financed this report. The authors are solely responsible for the content.

Preface

This is version 1.4 of a report that aims to provide an overview of current thinking on how to counteract information influence activities. It was commissioned to support the Swedish Civil Contingencies Agency's (MSB) work in strengthening societal resilience against information influence activities. The report is intended to offer (1) a scientific overview to support the development of the MSB handbook Counter Influence Strategies for Communicators, (2) a guide and framework that can support the development of training and education on counter influence, and (3) a Swedish perspective on the knowledge currently available on information influence activities. The authors wish to thank MSB and the dozens of interviewees and reviewers without whom the study would not have been possible.

Foreword

The term "Fake News" has catapulted the topic of information operations into the centre of a heated and mostly ill-informed public debate. One common misconception is that the problem is new. Another mistake is to assume that information, as the most visible part of the problem, is necessarily the most important. Both of these are at best only partly true.

Political warfare—the disruption of another country's public opinion and decision-making—dates back decades, if not centuries. Information operations form only part of the subversive arsenal. They are usually conducted along with the use of money, intimidation (legal and physical), cyber-attacks, and many other tactics.

But the information environment has indeed changed sharply, and mostly to the advantage of unscrupulous autocratic countries wishing to attack open societies. The internet enables ubiquity, immediacy and, most of all, anonymity of a kind undreamt of in past decades. Technological change has also lowered barriers to entry in the media industries and disrupted the business models of incumbents. The era of media gatekeepers—in effect a cartel of trusted sources—has given way to a kaleidoscope of facts and opinions. This diversity is welcome, but as with all technological change, we are still developing the norms and rules to manage it. In the meantime, our information landscape is ripe for misuse. The abuse of automated communication sources (bots) and paid-for disruptors (trolls) distorts public debate by promoting lies and swamping truth.

Our attackers are also aided by weaker social and political immune systems. Amid a general (and largely welcome) decline in deference, respect for experts has diminished. For complex demographic, economic and social reasons, levels of social trust in many societies have ebbed. At least in the short term, these problems are likely to worsen. We have seen attacks on the confidentiality of data (hacking) and on its availability (swamping). The looming threat is attacks on data integrity. How will we react to "deepfakes"—seemingly authentic video and audio that purport to show our public figures saying words they never said and doing things they never did?

As this excellent and timely report makes clear, the greatest strength of our societies—an open, robust public debate—now risks becoming their greatest vulnerability. Unconstrained by the requirements of honesty, truth or self-respect, attackers can use our information system to confuse us and disrupt our public life.

This report ably brings modern academic thinking on cognition and other topics to bear on the problem, notably in explaining how we form our individual and collective opinions. It offers no easy answers—indeed, it cautions against them. It notes, rightly, that whereas applications may be legitimate, illegitimate, or outright illegal, the techniques themselves are neutral. Parody and satire, for example, are the lifeblood of a modern democracy. If we cannot mock our rulers, we are not truly free. But the same techniques in the hands of an enemy state are no longer amusing and thought-provoking; they are aimed cynically at increasing polarisation and corroding the trust and respect that are essential for the proper functioning of our public life.

Grey areas abound and the choices we face are hard. An entirely passive response invites defeat. An overly zealous one is self-defeating. If we protect ourselves against attacks from autocratic closed societies by giving sweeping and arbitrary powers to our rulers, we may win the battle, but we lose the war.

The most important recommendation in this report is therefore to study information operations first, and to act cautiously in trying to mitigate or counter their effects. Crying "wolf!" (or "Fake News!") at every news item we dislike is a sure way to erode credibility. Our adversaries' biggest and most effective victories come when we do their work for them.

Edward Lucas
Senior Vice President at the Center for European Policy Analysis (CEPA)
July 2018

Table of Contents

Preface ... 3
Foreword ... 4
1. Introduction ... 8
2. Understanding information influence activities ... 14
   2.1 Definition of information influence activities ... 14
   2.2 Diagnosing illegitimate influence ... 16
   2.3 Exploitation of vulnerabilities ... 18
   2.4 Hybrid threats and grey zones ... 21
3. Identifying information influence activities ... 24
   3.1 Influence strategies ... 24
       Positive, negative, oblique ... 24
       Narratives and facts ... 26
       Classic vs. cognitive strategies ... 30
   3.2 Influence techniques ... 31
       Sociocognitive and psychographic hacking ... 33
       Social hacking ... 35
       Para-social hacking ... 39
       Symbolic action ... 41
       Disinformation and "fake news" ... 43
       Forging and leaking ... 49
       Potemkin villages of evidence ... 52
       Deceptive identities ... 55
       Bots, sockpuppets and botnets ... 56
       Trolling and flaming ... 61
       Humour and memes ... 65
       Malign rhetoric ... 68
   3.3 Influence stratagems ... 70
       Black propaganda ... 71
       Point and shriek ... 72
       Laundering ... 73
       Flooding ... 74
       Cheerleading ... 75
       Raiding ... 76
       Polarisation ... 77
       Hack, mix, release ... 78
   3.4 Information influence activities in practice: a case study ... 79
4. Counteracting information influence activities ... 82
   4.1 Approaches to countering information influence activities ... 82
   4.2 The communicator's mandate ... 85
   4.3 Preparation ... 88
       4.3.1 Societal and organisational preparedness ... 88
       4.3.2 Raising awareness ...
Recommended publications
  • Anecdotal Evidence 2013
    Sean Cubitt, "Anecdotal evidence", NECSUS. European Journal of Media Studies, Vol. 2 (2013), No. 1, pp. 5–18. DOI: https://doi.org/10.5117/NECSUS2013.1.CUBI. Published by Amsterdam University Press. Archived version: Repositorium für die Medienwissenschaft, https://doi.org/10.25969/mediarep/15070. Made available under a Creative Commons Attribution - Non Commercial - No Derivatives 4.0 license (https://creativecommons.org/licenses/by-nc-nd/4.0). Keywords: anecdotal evidence, anecdote, communications, media, method, Robert Bresson.

    "La réalité est autre chose." ["Reality is something else."] – (Au hasard Balthazar) When asked to write about recent research I am faced with the problem of reconciling work on visual technologies, environmental issues, de-colonial and indigenous movements, political economy, global media governance, and media arts. Currents of thought link these fields, a general intellectual and perhaps ethical mood, and all are guided by an interest in the materiality of media; they also share a common methodological principle. This principle is largely disparaged among funding agencies as 'anecdotal evidence'.
  • On the Generality and Cognitive Basis of Base-Rate Neglect
    bioRxiv preprint, DOI: https://doi.org/10.1101/2021.03.11.434913 (version posted March 12, 2021; not certified by peer review). The copyright holder has granted bioRxiv a license to display the preprint in perpetuity. Made available under a CC-BY-NC-ND 4.0 International license.

    On the generality and cognitive basis of base-rate neglect. Elina Stengård (a; corresponding author, e-mail: [email protected]), Peter Juslin (a), Ulrike Hahn (b), Ronald van den Berg (a, c). (a) Department of Psychology, Uppsala University, Uppsala, Sweden; (b) Department of Psychological Sciences, Birkbeck University of London, London, UK; (c) Department of Psychology, Stockholm University, Stockholm, Sweden.

    ABSTRACT: Base-rate neglect refers to people's apparent tendency to underweight or even ignore base-rate information when estimating posterior probabilities for events, such as the probability that a person with a positive cancer-test outcome actually does have cancer. While many studies have replicated the effect, there has been little variation in the structure of the reasoning problems used in those studies.
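The cancer-test case mentioned in this abstract is the classic base-rate problem. Below is a minimal worked instance in Python; the numbers (1% base rate, 80% sensitivity, 9.6% false-positive rate) are the standard textbook illustration, assumed here for exposition rather than taken from the study.

```python
# Worked Bayes-rule calculation for the cancer-test problem described above.
# All figures are illustrative assumptions, not values from the preprint.
base_rate = 0.01        # P(cancer): 1% of the screened population
sensitivity = 0.80      # P(positive | cancer)
false_positive = 0.096  # P(positive | no cancer)

# Bayes' rule: P(cancer | positive) = P(pos | cancer) * P(cancer) / P(pos)
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)
posterior = (sensitivity * base_rate) / p_positive

print(f"P(cancer | positive test) = {posterior:.3f}")  # ~0.078
```

Neglecting the base rate pulls answers toward the 80% sensitivity, while the correct posterior is under 8%: the few true positives from the small cancer group are swamped by false positives from the much larger cancer-free group.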
  • Automated Tackling of Disinformation
    Automated tackling of disinformation: Major challenges ahead. STUDY, Panel for the Future of Science and Technology, European Science-Media Hub (ESMH). EPRS | European Parliamentary Research Service, Scientific Foresight Unit (STOA), PE 624.278, March 2019. Original language: EN.

    This study maps and analyses current and future threats from online misinformation, alongside currently adopted socio-technical and legal approaches. The challenges of evaluating their effectiveness and practical adoption are also discussed. Drawing on and complementing existing literature, the study summarises and analyses the findings of relevant journalistic and scientific studies and policy reports in relation to detecting, containing and countering online disinformation and propaganda campaigns. It traces recent developments and trends and identifies significant new or emerging challenges. It also addresses potential policy implications for the EU of current socio-technical solutions.

    AUTHORS: This study was written by Alexandre Alaphilippe, Alexis Gizikis and Clara Hanot of EU DisinfoLab, and Kalina Bontcheva of The University of Sheffield, at the request of the Panel for the Future of Science and Technology (STOA). It has been financed under the European Science and Media Hub budget and managed by the Scientific Foresight Unit within the Directorate-General for Parliamentary Research Services (EPRS) of the Secretariat of the European Parliament. The authors wish to thank all respondents to the online survey, as well as First Draft, WeVerify, InVID, PHEME, REVEAL, and all other initiatives that contributed materials to the study. Administrator responsible: Mihalis Kritikos, Scientific Foresight Unit. To contact the publisher, please e-mail [email protected]. Manuscript completed in March 2019.
  • Why Do People Seek Anonymity on the Internet?
    Why Do People Seek Anonymity on the Internet? Informing Policy and Design. Ruogu Kang (1), Stephanie Brown (2), Sara Kiesler (1). (1) Human Computer Interaction Institute, (2) Department of Psychology, Carnegie Mellon University, 5000 Forbes Ave., Pittsburgh, PA 15213. [email protected], [email protected], [email protected]

    ABSTRACT: In this research we set out to discover why and how people seek anonymity in their online interactions. Our goal is to inform policy and the design of future Internet architecture and applications. We interviewed 44 people from America, Asia, Europe, and Africa who had sought anonymity and asked them about their experiences. A key finding of our research is the very large variation in interviewees' past experiences and life situations leading them to seek anonymity, and how they tried to achieve it. Our results suggest implications for the design of online communities, challenges for policy, and ways to improve anonymity tools and educate users about the different routes and threats to …

    … The literature that exists mainly derives from studies of one or a few online communities or activities (e.g., the study of 4chan in [5]). We lack a full understanding of the real life circumstances surrounding people's experiences of seeking anonymity and their feelings about the tradeoffs between anonymity and identifiability. A main purpose of the research reported here was to learn more about how people think about online anonymity and why they seek it. More specifically, we wanted to capture a broad slice of user activities and experiences from people who have actually sought anonymity, to investigate their experiences, and to understand their attitudes about anonymous and identified communication.
  • Climate Change in the Era of Post-Truth
    Climate Change in the Era of Post-Truth. DOI: https://doi.org/10.15779/Z38W669857. Copyright © 2018 Regents of the University of California.

    INTRODUCTION: In The Madhouse Effect: How Climate Change Denial is Threatening our Planet, Destroying our Politics, and Driving us Crazy, climate scientist Michael Mann joins with Pulitzer Prize-winning cartoonist Tom Toles to take on climate change denialism. Mann, the Director of the Earth System Science Center at The Pennsylvania State University, augments his prose with cartoons from Toles, who normally draws for the editorial section of the Washington Post. Together, Mann and Toles set out to debunk the main arguments that special interest groups use to undermine climate change policy. The book begins with an introduction to the scientific method and its application to climate change science. It then describes the current and potential effects of climate change on everyday life. In its second half, the book transitions to the politics surrounding climate change in the United States. A major focus of the book is the "war on climate science," the phrase Mann and Toles use to describe how the fossil fuel industry has created misinformation to discourage action on climate change. The Madhouse Effect was published in 2016, at a moment when the United States was choosing between Democratic and Republican presidential candidates whose climate change agendas differed wildly. The book's publication failed to avert the election of President Donald Trump, a climate change denier who has referred to the phenomenon as a "hoax" created by China. Still, The Madhouse Effect presents a valuable depiction of the underground currents that influence …
  • Starr Forum: Russia's Information War on America
    MIT Center for International Studies | Starr Forum: Russia's Information War on America

    CAROL SAIVETZ: Welcome everyone. We're delighted that so many people could join us today. Very excited that we have such a timely topic to discuss, and we have two experts in the field to discuss it. But before I do that, I'm supposed to tell you that this is an event that is co-sponsored by the Center for International Studies at MIT, the Security Studies Program at MIT, and MIT Russia. I should also introduce myself. My name is Carol Saivetz. I'm a senior advisor at the Security Studies Program at MIT, and I co-chair a seminar, along with my colleague Elizabeth Wood, whom we will meet after the talk. And we co-chair a seminar series called Focus on Russia. And this is part of that seminar series as well. I couldn't think of a better topic to talk about in the lead-up to the US presidential election, which is now only 40 days away. We've heard so much in 2016 about Russian attempts to influence the election then, and we're hearing again from the CIA and from the intelligence community that Russia is, again, trying to influence who shows up, where people vote. They are mimicking some of Donald Trump's talking points about Joe Biden's strength and intellectual capabilities, et cetera. And we've really brought together two experts in the field. Nina Jankowicz studies the intersection of democracy and technology in central and eastern Europe.
  • How Do Climate Change Skeptics Engage with Opposing Views?
    How do climate change skeptics engage with opposing views? Understanding mechanisms of social identity and cognitive dissonance in an online forum. PREPRINT, February 15, 2021; arXiv:2102.06516v1 [cs.SI], 12 Feb 2021. Lisa Oswald (Hertie School, Friedrichstraße 180, 10117 Berlin, Germany; [email protected]) and Jonathan Bright (Oxford Internet Institute, University of Oxford, 1 St Giles', Oxford OX1 3JS, UK; [email protected]).

    ABSTRACT: Does engagement with opposing views help break down ideological 'echo chambers', or does it backfire and reinforce them? This question remains critical as academics, policymakers and activists grapple with the question of how to regulate political discussion on social media. In this study, we contribute to the debate by examining the impact of opposing views within a major climate change skeptic online community on Reddit. A large sample of posts (N = 3000) was manually coded as either dissonant or consonant, which allowed the automated classification of the full dataset of more than 50,000 posts, with codes inferred from linked websites. We find that ideologically dissonant submissions act as a stimulant to activity in the community: they received more attention (comments) than consonant submissions, even though they received lower scores through up-voting and down-voting. Users who engaged with dissonant submissions were also more likely to return to the forum. Consistent with identity theory, confrontation with opposing views triggered activity in the forum, particularly among users who are highly engaged with the community. In light of the findings, theory of social identity and echo chambers is discussed and enhanced. Keywords: online community · climate change skepticism · reaction to opposition · dissonance · opinion polarisation · social identity · echo chamber.

    Introduction: Scientists worldwide consider climate change as one of the greatest challenges of our time (IPCC, 2015).
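The two-stage coding this abstract describes (hand-label a sample, then classify the rest automatically) is a common pattern; a minimal sketch follows. File names, column names, and the use of raw post text as features are assumptions for illustration; the study itself inferred the codes from the websites each submission linked to.

```python
# Minimal sketch: extend ~3,000 hand-coded labels to the full corpus with a
# text classifier. Paths and column names are hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

coded = pd.read_csv("coded_sample.csv")       # posts labelled dissonant/consonant
uncoded = pd.read_csv("remaining_posts.csv")  # rest of the ~50,000 posts

model = make_pipeline(
    TfidfVectorizer(min_df=5),                # bag-of-words features
    LogisticRegression(max_iter=1000),
)

# Validate on the hand-coded sample before trusting the automated labels.
scores = cross_val_score(model, coded["text"], coded["label"], cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")

model.fit(coded["text"], coded["label"])
uncoded["predicted_label"] = model.predict(uncoded["text"])
```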
  • Homophily and Polarization in the Age of Misinformation
    Eur. Phys. J. Special Topics 225, 2047–2059 (2016). © EDP Sciences, Springer-Verlag 2016. DOI: 10.1140/epjst/e2015-50319-0. The European Physical Journal Special Topics, Regular Article.

    Homophily and polarization in the age of misinformation. Alessandro Bessi (1, 2), Fabio Petroni (3), Michela Del Vicario (2), Fabiana Zollo (2), Aris Anagnostopoulos (3), Antonio Scala (4), Guido Caldarelli (2), and Walter Quattrociocchi (1). (1) IUSS, Pavia, Italy; (2) IMT Institute for Advanced Studies, Lucca, Italy; (3) Sapienza University, Roma, Italy; (4) ISC-CNR, Roma, Italy. Received 1 December 2015 / Received in final form 30 January 2016 / Published online 26 October 2016.

    Abstract. The World Economic Forum listed massive digital misinformation as one of the main threats for our society. The spreading of unsubstantiated rumors may have serious consequences on public opinion, such as in the case of rumors about Ebola causing disruption to health-care workers. In this work we target Facebook to characterize information consumption patterns of 1.2 M Italian users with respect to verified (science news) and unverified (conspiracy news) contents. Through a thorough quantitative analysis we provide important insights about the anatomy of the system across which misinformation might spread. In particular, we show that users' engagement on verified (or unverified) content correlates with the number of friends having similar consumption patterns (homophily). Finally, we measure how this social system responded to the injection of 4,709 pieces of false information. We find that frequent (and selective) exposure to specific kinds of content (polarization) is a good proxy for the detection of homophilic clusters where certain kinds of rumors are more likely to spread.
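At its core, the homophily result in this abstract is a per-user correlation between engagement and the number of like-minded friends. The toy sketch below shows that measurement; all values are invented stand-ins for the roughly 1.2 M-user Facebook dataset the study analysed.

```python
# Toy homophily check: does engagement on a content type correlate with the
# number of friends sharing that consumption pattern? Data are made up.
import numpy as np
from scipy.stats import pearsonr

engagement = np.array([12, 3, 40, 7, 25, 1, 18, 33, 5, 22])        # per-user likes/comments
similar_friends = np.array([30, 5, 80, 10, 55, 2, 35, 70, 8, 50])  # like-minded friends

r, p = pearsonr(engagement, similar_friends)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")  # strongly positive r indicates homophily
```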
  • Perceptions of Social Support Within the Context of Religious Homophily: A Social Network Analysis
    Robicheaux, Sally, "Perceptions of social support within the context of religious homophily: a social network analysis" (2003). LSU Master's Theses, 166. https://digitalcommons.lsu.edu/gradschool_theses/166. Louisiana State University, LSU Digital Commons, Graduate School; part of the Sociology Commons. This thesis is brought to you for free and open access by the Graduate School at LSU Digital Commons. For more information, please contact [email protected].

    PERCEPTIONS OF SOCIAL SUPPORT WITHIN THE CONTEXT OF RELIGIOUS HOMOPHILY: A SOCIAL NETWORK ANALYSIS. A thesis submitted to the Graduate Faculty of the Louisiana State University and Agricultural and Mechanical College in partial fulfillment of the requirements for the degree of Master of Arts in The Department of Sociology, by Sally Robicheaux, B.A., University of Southwestern Louisiana, 1998. May 2003.

    ACKNOWLEDGEMENTS: I would like to take this opportunity to thank several people who accompanied me through this process. First, I am deeply indebted to the members of my examining committee, Jeanne S. Hurlbert, John J. Beggs, and Yang Cao, for their keen insight, direction, and contributions to this thesis. I especially wish to thank my committee chair and advisor, Jeanne S. Hurlbert, for all the invaluable guidance, instruction, and encouragement in helping me design and carry out this project.
  • How the Chinese Government Fabricates Social Media Posts
    American Political Science Review (2017) 111, 3, 484–501. DOI: 10.1017/S0003055417000144. © American Political Science Association 2017.

    How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument. GARY KING, Harvard University; JENNIFER PAN, Stanford University; MARGARET E. ROBERTS, University of California, San Diego.

    The Chinese government has long been suspected of hiring as many as 2 million people to surreptitiously insert huge numbers of pseudonymous and other deceptive writings into the stream of real social media posts, as if they were the genuine opinions of ordinary people. Many academics, and most journalists and activists, claim that these so-called 50c party posts vociferously argue for the government's side in political and policy debates. As we show, this is also true of most posts openly accused on social media of being 50c. Yet almost no systematic empirical evidence exists for this claim or, more importantly, for the Chinese regime's strategic objective in pursuing this activity. In the first large-scale empirical analysis of this operation, we show how to identify the secretive authors of these posts, the posts written by them, and their content. We estimate that the government fabricates and posts about 448 million social media comments a year. In contrast to prior claims, we show that the Chinese regime's strategy is to avoid arguing with skeptics of the party and the government, and to not even discuss controversial issues. We show that the goal of this massive secretive operation is instead to distract the public and change the subject, as most of these posts involve cheerleading for China, the revolutionary history of the Communist Party, or other symbols of the regime.
  • Don't Feed the Trolls: a Cautionary Tale
    Don't Feed the Trolls: A Cautionary Tale. DANA SAMUEL.

    In Eddo Stern's Best. Flame War. Ever. we are presented with an online argument over seemingly insignificant game-playing topics that grows quickly out of proportion and out of the realm of online experience. Two collaged animated masks – each an amalgam of heraldry, knighthood, pixilated warriors, fantasy species, castles, and shields and symbols – animate a flame war that actually took place online over the role-playing game Everquest, an MMPORPG. Did I just make you feel like a total "newb"? Online games, discussion boards and communities have developed their own language since these online worlds were first populated in research and academia, and then, later (post-cold war, after the military had no need for such a specialized …) … others (Linux, of course). In fact, the phrase originated as a reference to the comic book the Fantastic Four, as one of the characters could turn himself into a Human Torch. Chat participants would use the HTML-styled, bracketed annotation to insult other participants online: <flame on> You're pretty stupid for not knowing what MMPORPG means. Go look it up on Wiki. </flame off> When the internet opened to the commercial world, and more people joined these groups, more jargon was developed to describe the goings-on in these communities. A "troll" is one who writes nasty flames just to make others angry and to be annoying. "Meat-puppets" and "sock-puppets" are those who are deceptive about their identities or their true allegiances on a discussion board. How all very "D&D." It's said, by several theorists, that people can be more uninhibited online, they can change their personality, role-play and morph into who they perceive themselves to be.
  • Information Systems in Great Power Competition with China
    STRATEGIC STUDIES QUARTERLY - PERSPECTIVE. Decide, Disrupt, Destroy: Information Systems in Great Power Competition with China. AINIKKI RIIKONEN.

    Abstract: Technologies for creating and distributing knowledge have impacted international politics and conflict for centuries, and today the infrastructure for communicating knowledge has expanded. These technologies, along with attempts to exploit their vulnerabilities, will shape twenty-first-century great power competition between the US and China. Likewise, great power competition will shape the way China develops and uses these technologies across the whole spectrum of competition to make decisions, disrupt the operational environment, and destroy adversary capabilities.

    The 2018 US National Defense Strategy (NDS) cites Russia and the People's Republic of China (PRC) as "revisionist powers" that "want to shape a world consistent with their authoritarian model—gaining veto authority over other nations' economic, diplomatic, and security decisions." It describes these countries as competitors seeking to use "other areas of competition short of open warfare to achieve their [authoritarian] ends" and to "optimize their targeting of our battle networks and operational concepts." The NDS assesses that competition will occur along the entire spectrum of statecraft from peace to open conflict and that Russia and the PRC will align their foreign policies with their models of governance. If this assessment is correct, and if technology plays a significant role in international politics, then technology will affect the whole spectrum of great power competition and conflict. Information architecture—the structures of technology that collect and relay information worldwide—is innately connected to power projection. The PRC has been innovating in this area, and its expanded information capabilities—and risks to US capabilities—will shape competition in the twenty-first century.