Balancing Act: Countering Digital Disinformation While Respecting Freedom of Expression

Total pages: 16

File type: PDF, size: 1020 KB

Balancing Act: Countering Digital Disinformation While Respecting Freedom of Expression
Broadband Commission research report on 'Freedom of Expression and Addressing Disinformation on the Internet', September 2020

Editors: Kalina Bontcheva and Julie Posetti

Contributing authors:
Kalina Bontcheva, University of Sheffield (UK)
Julie Posetti, International Center for Journalists (US); Centre for Freedom of the Media, University of Sheffield (UK); Reuters Institute for the Study of Journalism, University of Oxford (UK)
Denis Teyssou, Agence France-Presse (France)
Trisha Meyer, Vrije Universiteit Brussel (Belgium)
Sam Gregory, WITNESS (US)
Clara Hanot, EU DisinfoLab (Belgium)
Diana Maynard, University of Sheffield (UK)

Published in 2020 by the International Telecommunication Union (ITU), Place des Nations, CH-1211 Geneva 20, Switzerland, and the United Nations Educational, Scientific and Cultural Organization (UNESCO), 7, Place de Fontenoy, 75352 Paris 07 SP, France. ISBN 978-92-3-100403-2.

This research will be available in Open Access under the Attribution-ShareAlike 3.0 IGO (CC BY-SA 3.0 IGO) licence. By using the content of this publication, users accept to be bound by the terms of use of the UNESCO Open Access Repository.

Contents
Foreword
Executive Summary
1 Introduction
2 Typology of Disinformation Responses
3 Research Context and Gaps
4 Identification Responses
  4.1 Monitoring and fact-checking responses
  4.2 Investigative responses
5 Ecosystem Responses Aimed at Producers and Distributors
  5.1 Legislative, pre-legislative, and policy responses
  5.2 National and international counter-disinformation campaigns
  5.3 Electoral-specific responses
6 Responses within Production and Distribution
  6.1 Curatorial responses
  6.2 Technical / algorithmic responses
  6.3 Demonetisation and advertising-linked responses
7 Responses Aimed at the Target Audiences of Disinformation Campaigns
  7.1 Normative and ethical responses
  7.2 Educational responses
  7.3 Empowerment and credibility labelling responses
8 Challenges and Recommended Actions
9 List of Sources Consulted
Appendix A

Figures
Figure 1. Top-level categories of disinformation responses
Figure 2. The four top-level response categories and their eleven sub-categories
Figure 3. Chart source: BuzzFeed News (Silverman et al., 2020)
Figure 4. A geographical map of the IFCN signatories (67 verified active and 14 under renewal in early 2020)
Figure 5. A view of the Duke University Reporters' Lab fact-checking database
Figure 6. A view of the Facebook third-party fact-checking network by continent
Figure 7. Map view of the Facebook third-party fact-checking network's worldwide distribution
Figure 8. Distribution of the Facebook third-party fact-checking programme by organisations involved

Tables
Table 1. Simplified version of the political disinformation campaign categorisation scheme devised by Brooking et al. (2020)
Table 2. Distribution of Facebook's third-party fact-checking network by country and number of operations
Table 3. Legislative, pre-legislative, and policy responses (mapped against the study taxonomy)
Table 4. National and international counter-disinformation campaigns
Table 5. Curatorial responses from internet communication companies
Table 6. Curatorial responses from internet communication companies to the COVID-19 disinfodemic

The designations employed and the presentation of material throughout this publication do not imply the expression of any opinion whatsoever on the part of the Broadband Commission for Sustainable Development concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. The ideas and opinions expressed in this publication are those of the authors; they are not necessarily those of the Broadband Commission for Sustainable Development and do not commit the Commission.

Members of the Working Group on Freedom of Expression and Addressing Disinformation
1. Hessa Al Jaber, Chairperson, Es'hailSat
2. Bocar Ba, SAMENA
3. Moez Chakchouk, Focal Point, UNESCO
4. Piotr Dmochowski-Lipski, EUTELSAT IGO
5. Ramin Guluzade, Republic of Azerbaijan
6. Carlos Manuel Jarque Uribe, América Móvil
7. Beeban Kidron, 5Rights Foundation
8. Robert Kirkpatrick, UN Global Pulse
9. Yee Cheong (Dato') Lee, ISTIC
10. Adrian Lovett, The Web Foundation
11. Philipp Metzger, Director-General, Swiss Ministry of Communications
12. Paul Mitchell, Microsoft
13. Speranza Ndege, Kenyatta University
14. Patrick Nyirishema, Focal Point, Director-General, Rwanda Utilities Regulatory Authority
15. Joanna Rubinstein, World Childhood Foundation
16. Achim Steiner, UNDP (via Minerva Novero-Belec)

Expert oversight group
Fabrício Benevenuto (Federal University of Minas Gerais)
Divina Frau-Meigs (Université Sorbonne Nouvelle - Paris 3)
Cherian George (Hong Kong Baptist University)
Claire Wardle (First Draft)
Herman Wasserman (University of Cape Town)

Acknowledgements
Guy Berger (UNESCO), Guilherme Canela De Souza Godoi (UNESCO), Oscar Castellanos (UNESCO), Jo Hironaka (UNESCO), Anna Polomska (ITU), Cedric Wachholz (UNESCO), Martin Wickenden (UNESCO), Joanna Wright (University of Sheffield), Shanshan Xu (UNESCO)

Additional feedback was received from Broadband Commission Working Group member organisations: UNDP (Minerva Novero-Belec, Robert Opp) and UN Global Pulse (Chris Earney).

Foreword by Audrey Azoulay, UNESCO Director-General
As the past few months have shown, online communication is essential in today's world. The Internet plays a key role when it comes to accessing education, culture and quality information. It allows us to work from home while staying in touch with family and friends. Now more than ever, online communication is at the heart of our connected societies. This greater connectivity goes hand in hand with greater opportunities. It empowers people with information and knowledge, which in turn supports development and democracy. It diversifies language, enables us to do business, and encourages us to appreciate different cultures. However, online communication ...
Recommended publications
  • What Kind of Digital Citizen? A Reflection on Educating for Digital Democracy
    What Kind of Digital Citizen? A Reflection on Educating for Digital Democracy. Ashley Ireland Dann, University of California, Los Angeles, October 1, 2018.
    In order to best prepare a democratic citizenry, one must closely examine the education students are receiving. Educators must ask themselves: what kind of citizen are we helping to create? As the world becomes increasingly digital, one must also examine how students are being prepared to be digital citizens. The following reflection on digital democracy will draw connections between the practices of educating a democratic citizenry and the current deficits of digital citizenship education. It will seek to inspire those teaching digital citizenship to expand their narrow lens in order to create a more participatory and analytic digital citizen. In What Kind of Citizen? The Politics of Educating for Democracy, Joel Westheimer and Joseph Kahne identify three categories of citizenship. The authors describe personally responsible citizens, participatory citizens, and justice-oriented citizens. Arguably the lowest level of citizenship, personally responsible citizens are only concerned and engaged at the individual level. Personally responsible citizens are described as hardworking, honest, and moral. They might demonstrate citizenship by, “picking up litter, giving blood, recycling, obeying laws, and staying out of debt” (Westheimer & Kahne, 2004, p. 39). In contrast, a participatory citizen is a more collaborative and active member of society. Participatory citizens “engage in collective, community-based efforts” (Westheimer & Kahne, 2004, p. 39); they readily participate in government and community organizations. The last category of citizenship is justice-oriented.
  • PR Firms Are Becoming More Powerful, but Good Journalism Still Prevails
    Fourth estate follies: Trawling through the dustbins of the UK media. PR firms are becoming more powerful, but good journalism still prevails. October 19, 2016, 4.00pm BST. Author: John Jewell, Director of Undergraduate Studies, School of Journalism, Media and Cultural Studies, Cardiff University. (Image: Mission accomplished: putting a positive spin on Iraq. Credit: White House.)
    Recent articles about the public relations firm Bell Pottinger are a stark reminder of the power and pervasiveness of PR in today’s fragmented media landscape. The Sunday Times and the Bureau of Investigative Journalism revealed that Bell Pottinger was hired by the Pentagon in Washington to coordinate a covert propaganda campaign to boost America’s profile in Iraq following the “end” of hostilities in 2003. And, earlier this year, South Africa’s Business Day newspaper revealed that the firm had been retained by the scandal-hit billionaire Gupta family to burnish its image after a string of stories accusing it of “state capture” – allegedly using its influence with the president, Jacob Zuma, to advance the family’s business interests. Bell Pottinger’s former chairman Lord Tim Bell confirmed to the Sunday Times that the firm had been paid US$540m for five contracts with the US government between 2007 and 2011. He said the firm reported to the Pentagon, the CIA and the National Security Council while working on the account. The investigation, “Fake News and False Flags”, relied on interviews with a former Bell Pottinger employee, Martin Wells, who claimed that the PR company created short TV reports in the style of Arabic news networks for broadcast in Iraq.
  • An Examination of the Impact of Astroturfing on Nationalism: A Persuasion Knowledge Perspective
    Social Sciences (journal article). An Examination of the Impact of Astroturfing on Nationalism: A Persuasion Knowledge Perspective. Kenneth M. Henrie (1,*) and Christian Gilde (2). 1 Stiller School of Business, Champlain College, Burlington, VT 05401, USA. 2 Department of Business and Technology, University of Montana Western, Dillon, MT 59725, USA; [email protected]. * Correspondence: [email protected]; Tel.: +1-802-865-8446. Received: 7 December 2018; Accepted: 23 January 2019; Published: 28 January 2019.
    Abstract: One communication approach that lately has become more common is astroturfing, which has been more prominent since the proliferation of social media platforms. In this context, astroturfing is a fake grass-roots political campaign that aims to manipulate a certain audience. This exploratory research examined how effective astroturfing is in mitigating citizens’ natural defenses against politically persuasive messages. An experimental method was used to examine the persuasiveness of social media messages related to coal energy in their ability to persuade citizens and increase their level of nationalism. The results suggest that citizens are more likely to be persuaded by an astroturfed message than people who are exposed to a non-astroturfed message, regardless of their political leanings. However, the messages were not successful in altering an individual’s nationalistic views at the moment of exposure. The authors discuss these findings and propose how, in a long-term context, astroturfing is a dangerous addition to persuasive communication. Keywords: astroturfing; persuasion knowledge; nationalism; coal energy; social media.
    1. Introduction: Astroturfing is the simulation of a political campaign which is intended to manipulate the target audience. Often, the purpose of such a campaign is not clearly recognizable because a political effort is disguised as a grassroots operation.
  • Journalism in Jordan: A Comparative Analysis of Press Freedom in the Post-Arab Spring Environment
    Global Media Journal, ISSN 1550-7521, Special Issue 2015. Journalism in Jordan: A comparative analysis of press freedom in the post-Arab Spring environment. Matt J. Duffy (1), Hadil Maarouf (2). 1. PhD, Berry College, GA, USA. 2. Center for International Media Education, Georgia State University. Corresponding author: Matt J. Duffy, PhD, Berry College, GA, USA. E-mail: [email protected]
    1. Abstract: Although once known for one of the most vibrant media sectors in the Arab world, the press freedom ranking of Jordan has declined in recent years. Despite governmental assurances during the height of the Arab Spring, promised reforms in freedom of the press have failed to materialize. By studying primary and secondary sources and interviewing Jordanian journalists, the authors identify four main developments that show diminished press freedom in Jordan. These developments will be described in detail and examined in the context of international media law. The analysis finds that the Jordanian approach to media regulation is often at odds with the approach recommended by the United Nations free speech rapporteurs. The authors also examine the press system of Jordan through the lens of Ostini and Fung’s press system theory.
    2. Introduction: As the Arab region erupted in protests in early 2011, the king of Jordan appeared to see the writing on the wall. He fired his cabinet and called for immediate changes in the organization of his government. Organizations support the observation that press freedom has gotten worse in Jordan (see Figures 1 and 2; Figure 1: Freedom House Press Freedom Ranking).
  • Facts, Truth, and Post-Truth: Access to Cognitively and Socially Just Information
    Facts, Truth, and Post-Truth: Access to Cognitively and Socially Just Information. Rachel Fischer, University of Pretoria, South Africa; Erin Klazar, University of Pretoria, South Africa.
    Abstract: This article addresses facts, truth, post-truth, and the impact on access to cognitively and socially just information. It is predominantly situated within the post-truth context where information is manipulated to such an extent that it becomes disinformation, disguised as truth. The article consists of four main sections: the first section will provide an introduction and overview of key concepts intrinsic to understanding the concerns at hand. The next section is a case study of the role the PR firm, Bell Pottinger, played in South Africa and Iraq and the cognitive and social injustices visible in the corresponding events. The selection of these countries provides an opportunity to demonstrate the effect of post-truth and whistleblowing in relation to the challenges experienced in the Global South. The third section, on Cambridge Analytica and Digitality, is a discussion of the infamous Cambridge Analytica and its interferences in political campaigns in Trinidad and Tobago and the U.S. These discussions lead to the final section as an antidote to post-truth influences, which reflects on the way forward. This section makes recommendations for South African and international initiatives based on UNESCO’s intergovernmental programme known as the Information for All Programme (IFAP). Keywords: cognitive justice; disinformation; fake news; post-truth era/politics; social justice. Publication type: research article.
    Introduction: This article will focus on the role of information in the fairness with which economic, political, and social benefits and burdens are distributed in society.
  • Digital Citizenship Curriculum
    Teaching Digital Citizens in Today's World: Research and Insights Behind the Common Sense Digital Citizenship Curriculum.
    Credits. Authors: Carrie James, Ph.D., Project Zero; Emily Weinstein, Ed.D., Project Zero; Kelly Mendoza, Ph.D., Common Sense Education. Copy editor: Jen Robb. Designers: Elena Beroeva. Suggested citation: James, C., Weinstein, E., & Mendoza, K. (2021). Teaching digital citizens in today's world: Research and insights behind the Common Sense K–12 Digital Citizenship Curriculum (Version 2). San Francisco, CA: Common Sense Media. This is an updated version of the original report published in 2019. Common Sense Education and Project Zero are grateful for the generous support provided for the work described in this report from the Bezos Family Foundation, the William and Flora Hewlett Foundation, Niagara Cares, and Susan Crown Exchange. © 2021 Common Sense Media. All rights reserved. www.commonsense.org/education
    Table of Contents: A Letter from Our Founder; The Digital Landscape by the Numbers; Introduction; Children and Digital Media: An Overview (Children, age 0 to 8; Tweens and Teens, age 8 to 18); Our Approach to the Digital Citizenship Curriculum; What Is Digital Citizenship?; About the Digital Citizenship Curriculum; Our Guiding Theory: A Skills and Dispositions Approach; Five Core Dispositions of Digital Citizenship; Cornerstones of the Curriculum; Rings of Responsibility; Digital Life Dilemmas; Repetition and Routines; Poems, chants, and songs (elementary school); Thinking Routines (1. Digital Habits Check-Up; 2. Feelings and Options; 3. Take a Stand); A Look Inside the Curriculum: Six Topics.
  • Yahotline 79 5.pdf (583.6 KB)
    Taking Action! Sarah Jones. Every time a person makes decisions about how and where to spend money, s/he is exercising his/her power as a consumer. There are several ways that teens can speak out against consumerism, raise awareness about consumerist media messages, and make a difference in the way companies operate.
    Boycott Power. A boycott is the decision to abstain from buying or dealing with a particular organization as a form of protest and/or means of coercion. Teens may think that if they boycott a particular product or company, their actions will make no difference because individually, each teen is of little importance to big corporations. However, there is power in numbers, and the more people who boycott, the heavier the influence on the organization being boycotted. Boycotting gives teens a way to protest and state their beliefs. If they boycott something, others may become interested and take up the cause.
    Complaint Power. Complaint power is another way in which consumers can influence companies to change their practices. This method involves talking to a company directly, either by phone, e-mail, or letter. Shari Graydon suggests that writing a letter is the best method of contacting a company because it is harder to throw away or ignore than an e-mail or phone message. A letter also takes more time than an email or phone call, indicating that the writer is serious about the message (Graydon 87). Again, there is always power in numbers, so the more letters a company receives, the more seriously it will take the complaint.
  • Regulating “Fake News” and Other Online Advertising
    Fool Me Once: Regulating “Fake News” and Other Online Advertising. Abby K. Wood* and Ann M. Ravel†.
    A lack of transparency for online political advertising has long been a problem in American political campaigns. Disinformation attacks that American voters have experienced since the 2016 campaign have made the need for regulatory action more pressing. Internet platforms prefer self-regulation and have only recently come around to supporting proposed transparency legislation. While government must not regulate the content of political speech, it can, and should, force transparency into the process. We propose several interventions aimed at transparency. First, and most importantly, campaign finance regulators should require platforms to store and make available (1) ads run on their platforms, and (2) the audience at whom the ad was targeted. Audience availability can be structured to avoid privacy concerns, and it meets an important speech value in the “marketplace of ideas” theory of the First Amendment, that of enabling counter speech. Our proposed regulations would capture any political advertising, including disinformation, that is promoted via paid distribution on social media, as well as all other online political advertising. Second, existing loopholes in transparency regulations ...
    * Associate Professor of Law, Political Science, and Public Policy at University of Southern California ([email protected]). † Senior Fellow, Maplight Digital Deception Project and former Chair of the Federal Election Commission and California Fair Political Practices Commission. This article has benefited from insights from Rebecca Brown, Chris Elmendorf, and Rick Hasen. Daniel Brovman, Samantha Hay, Justin Mello, Brandon Thompson, and Caroline Yoon provided fantastic research assistance. Teresa Delgado and Alex Manzanares joyfully created the time and space required to focus on the project.
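The core of the proposal excerpted above is a disclosure requirement: platforms would retain each paid political ad together with the audience it was targeted at, released in aggregate form to avoid privacy concerns. As a rough illustration only, and not any platform's actual API or the authors' specification, a single archive entry might carry fields like the following; every name and value here is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PoliticalAdRecord:
    """Hypothetical archive entry for one paid political ad.

    Mirrors the two disclosures the article argues regulators should require:
    (1) the ad as it ran, and (2) the audience it was targeted at, stored here
    as targeting criteria plus aggregated reach rather than individual users.
    """
    ad_id: str
    sponsor: str                       # paying entity as disclosed to the platform
    creative_text: str                 # ad content shown to users
    spend_usd: float
    targeting_criteria: List[str]      # e.g. ["age 30-49", "metro: Des Moines"]
    impressions_by_region: Dict[str, int] = field(default_factory=dict)  # aggregated reach

# Illustrative entry with made-up values
archive = [
    PoliticalAdRecord(
        ad_id="example-001",
        sponsor="Example Advocacy Group",
        creative_text="Vote no on Measure Q",
        spend_usd=12500.0,
        targeting_criteria=["age 30-49", "metro: Des Moines"],
        impressions_by_region={"IA": 84000},
    )
]
print(archive[0].sponsor, archive[0].impressions_by_region)
```

Keeping the audience field as criteria plus regional counts, rather than user lists, is one way such a record could support the counter-speech goal the authors name without exposing individual voters.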
  • Jo Ann Gibson Robinson, the Montgomery Bus Boycott and the Women Who Started It
    National Humanities Center Resource Toolbox: The Making of African American Identity, Vol. III, 1917-1968. Black Belt Press. The Montgomery Bus Boycott and the Women Who Started It: The Memoir of Jo Ann Gibson Robinson. (Photo: Mrs. Jo Ann Gibson Robinson.)
    Black women in Montgomery, Alabama, unlocked a remarkable spirit in their city in late 1955. Sick of segregated public transportation, these women decided to wield their financial power against the city bus system and, led by Jo Ann Gibson Robinson (1912-1992), convinced Montgomery's African Americans to stop using public transportation. Robinson was born in Georgia and attended the segregated schools of Macon. After graduating from Fort Valley State College, she taught school in Macon and eventually went on to earn an M.A. in English at Atlanta University. In 1949 she took a faculty position at Alabama State College in Montgomery. There she joined the Women's Political Council. When a Montgomery bus driver insulted her, she vowed to end racial seating on the city's buses. Using her position as president of the Council, she mounted a boycott. She remained active in the civil rights movement in Montgomery until she left that city in 1960. Her story illustrates how the desire on the part of individuals to resist oppression, once it is organized, led, and aimed at a specific goal, can be transformed into a mass movement. (Photo: Mrs. T. M. Glass.)
    Ch. 2: The Boycott Begins. On Friday morning, December 2, 1955, a goodly number of Montgomery's black clergymen happened to be meeting at the Hilliard Chapel A.
  • Setting Goals and Objectives: Setting Advertising Campaign Goals
    Department: Journalism and Mass Communication. Course name: M.A. JMC, Semester 4. Paper name: Advanced Public Relations and Advertising. Teacher name: Prof. Manukonda Rabindranath. Note: This is for the purpose of reading only, not for any publication.
    Unit 5. 1. Advertising Campaigns: Setting goals and objectives. Setting advertising campaign goals: 1. Brand awareness (it is common for marketers to appoint brand awareness as the most important goal in their marketing strategy); 2. Website traffic; 3. Social engagement; 4. Displaying video ads; 5. Understanding your target audience.
    Advertising has three primary objectives: to inform, to persuade, and to remind. Informative advertising creates awareness of brands, products, services, and ideas. It announces new products and programs and can educate people about the attributes and benefits of new or established products. Within these broad goals, companies normally have more specific, quantified objectives as well. Advertising campaigns are built to accomplish a particular objective or a set of objectives. Such objectives usually include establishing a brand and raising brand awareness. The rate of success or failure in accomplishing these goals is reckoned via effectiveness measures.
    The following types of advertising are most effective: social media (with 56% of Americans having a profile with a social networking service, social media is undoubtedly a killer advertising platform to maximize brand recognition and spend as little money as possible); print media; television; radio; direct mail; email. Here's a breakdown of the top four most trusted advertising mediums, according to a recent study by Future Foundation for FEPE International: 1. TV. Twenty-eight percent of consumers believe television advertising is the most trustworthy, making it the most trusted advertising medium.
  • FreePint Report: Product Review of Factiva
    FreePint Report: Product Review of Factiva. December 2014. In-depth, independent review of the product, plus links to related resources. “...has some 32,000 sources spanning all forms of content of which thousands are not available on the free web. Some source archives go back to 1951...” [SAMPLE] www.freepint.com © Free Pint Limited 2014
    Contents: Introduction & Contact Details; Sources - Content and Coverage; Technology - Search and User Interface; Technology - Outputs, Analytics, Alerts, Help; Value - Competitors, Development & Pricing; FreePint Buyer's Guide: News; Other Products; About the Reviewer.
    About this Report: FreePint raises the value of information in the enterprise by publishing articles, reports and resources that support information sources, information technology and information value. A FreePint Subscription provides customers with full access to everything we publish. Customers can share individual articles and reports with anyone at their organisations as part of the terms and conditions of their license. Some license levels also enable customers to place materials on their intranets. To learn more about FreePint, visit http://www.freepint.com/
    Disclaimer: FreePint Report: Product Review of Factiva (ISBN 978-1-78123-181-4) is a FreePint report published by Free Pint Limited. The opinions, advice, products and services offered herein are the sole responsibility of the contributors. Whilst all reasonable care has been taken to ensure the accuracy of the publication, the publishers cannot accept responsibility for any errors or omissions. Except as covered by subscriber or purchaser licence agreement, this publication MAY NOT be copied and/or distributed without the prior written agreement of the publishers.
  • Political Astroturfing Across the World
    Political astroturfing across the world. Franziska B. Keller (Hong Kong University of Science and Technology), David Schoch (The University of Manchester), Sebastian Stier (GESIS – Leibniz Institute for the Social Sciences), JungHwan Yang (University of Illinois at Urbana-Champaign). Paper prepared for the Harvard Disinformation Workshop Update.
    1. Introduction: At the very latest since the Russian Internet Research Agency’s (IRA) intervention in the U.S. presidential election, scholars and the broader public have become wary of coordinated disinformation campaigns. These hidden activities aim to deteriorate the public’s trust in electoral institutions or the government’s legitimacy, and can exacerbate political polarization. But unfortunately, academic and public debates on the topic are haunted by conceptual ambiguities, and rely on few memorable examples, epitomized by the often cited “social bots” who are accused of having tried to influence public opinion in various contemporary political events. In this paper, we examine “political astroturfing,” a particular form of centrally coordinated disinformation campaigns in which participants pretend to be ordinary citizens acting independently (Kovic, Rauchfleisch, Sele, & Caspar, 2018). While the accounts associated with a disinformation campaign may not necessarily spread incorrect information, they deceive the audience about the identity and motivation of the person manning the account. And even though social bots have been in the focus of most academic research (Ferrara, Varol, Davis, Menczer, & Flammini, 2016; Stella, Ferrara, & De Domenico, 2018), seemingly automated accounts make up only a small part – if any – of most astroturfing campaigns. For instigators of astroturfing, relying exclusively on social bots is not a promising strategy, as humans are good at detecting low-quality information (Pennycook & Rand, 2019). On the other hand, many bots detected by state-of-the-art social bot detection methods are not part of an astroturfing campaign, but unrelated non-political spam bots.
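The excerpt above distinguishes astroturfing, which is defined by coordination among seemingly independent accounts, from automation as such. A minimal sketch of that intuition (not the authors' method, with a hypothetical dataset and thresholds) is to flag pairs of accounts that post identical text within a short time window:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical posts: (account_id, unix_timestamp, text)
posts = [
    ("acct_01", 1000, "Candidate X will restore our economy"),
    ("acct_02", 1030, "Candidate X will restore our economy"),
    ("acct_01", 5000, "Candidate X will restore our economy"),
    ("acct_03", 9000, "Completely unrelated message"),
]

WINDOW_SECONDS = 60   # assumed co-posting window
MIN_CO_POSTS = 1      # assumed threshold for reporting a pair

def coordinated_pairs(posts, window=WINDOW_SECONDS, min_co_posts=MIN_CO_POSTS):
    """Count how often two distinct accounts post identical text within `window` seconds."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[text].append((account, ts))

    pair_counts = defaultdict(int)
    for items in by_text.values():
        # Compare every pair of postings that share the same text
        for (a1, t1), (a2, t2) in combinations(items, 2):
            if a1 != a2 and abs(t1 - t2) <= window:
                pair_counts[tuple(sorted((a1, a2)))] += 1

    return {pair: n for pair, n in pair_counts.items() if n >= min_co_posts}

print(coordinated_pairs(posts))  # {('acct_01', 'acct_02'): 1}
```

Real detection work relies on much richer signals (co-retweet timing, shared links, account metadata), but the sketch captures the point of the excerpt: the tell is coordinated behaviour across accounts, not whether any single account looks automated.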