Defining and Annotating Credibility Indicators in News Articles


A Structured Response to Misinformation: Defining and Annotating Credibility Indicators in News Articles

Amy X. Zhang (MIT CSAIL, Cambridge, MA, USA; axz@mit.edu); Aditya Ranganathan (Berkeley Institute for Data Science, Berkeley, CA, USA; adityarn@berkeley.edu); Sarah Emlen Metz (Berkeley Institute for Data Science, Berkeley, CA, USA; emlen.metz@berkeley.edu); Scott Appling (Georgia Institute of Technology, Atlanta, GA, USA; scott.appling@gtri.gatech.edu); Connie Moon Sehat (Global Voices, London, UK; connie@globalvoices.org); Norman Gilmore (Berkeley Institute for Data Science, Berkeley, CA, USA; norman@virtualnorman.com); Nick B. Adams (Berkeley Institute for Data Science, Berkeley, CA, USA; nickbadams@berkeley.edu); Emmanuel Vincent (Climate Feedback, University of California, Merced, Merced, CA, USA; emvincent@climatefeedback.org); Jennifer 8. Lee (Hacks/Hackers, San Francisco, CA, USA; jenny@hackshackers.com); Martin Robbins (Factmata, London, UK; martin.robbins@factmata.com); Ed Bice (Meedan, San Francisco, CA, USA; ed@meedan.com); Sandro Hawke (W3C, Cambridge, MA, USA; sandro@w3.org); David Karger (MIT CSAIL, Cambridge, MA, USA; karger@mit.edu); An Xiao Mina (Meedan, San Francisco, CA, USA; an@meedan.com)

ABSTRACT
The proliferation of misinformation in online news and its amplification by platforms are a growing concern, leading to numerous efforts to improve the detection of and response to misinformation. Given the variety of approaches, collective agreement on the indicators that signify credible content could allow for greater collaboration and data-sharing across initiatives. In this paper, we present an initial set of indicators for article credibility defined by a diverse coalition of experts. These indicators originate from both within an article's text as well as from external sources or article metadata. As a proof-of-concept, we present a dataset of 40 articles of varying credibility annotated with our indicators by 6 trained annotators using specialized platforms. We discuss future steps including expanding annotation, broadening the set of indicators, and considering their use by platforms and the public, towards the development of interoperable standards for content credibility.

KEYWORDS
misinformation; disinformation; information disorder; credibility; news; journalism; media literacy; web standards

ACM Reference Format:
Amy X. Zhang, Aditya Ranganathan, Sarah Emlen Metz, Scott Appling, Connie Moon Sehat, Norman Gilmore, Nick B. Adams, Emmanuel Vincent, Jennifer 8. Lee, Martin Robbins, Ed Bice, Sandro Hawke, David Karger, and An Xiao Mina. 2018. A Structured Response to Misinformation: Defining and Annotating Credibility Indicators in News Articles. In The 2018 Web Conference Companion, April 23–27, 2018, Lyon, France. ACM, New York, NY, USA, 10 pages. https://doi.org/10.1145/3184558.3188731

This paper is published under the Creative Commons Attribution 4.0 International (CC BY 4.0) license. Authors reserve their rights to disseminate the work on their personal and corporate Web sites with the appropriate attribution. In case of republication, reuse, etc., the following attribution should be used: "Published in WWW2018 Proceedings © 2018 International World Wide Web Conference Committee, published under Creative Commons CC BY 4.0 License." WWW '18 Companion, April 23–27, 2018, Lyon, France. © 2018 IW3C2 (International World Wide Web Conference Committee), published under Creative Commons CC BY 4.0 License. ACM ISBN 978-1-4503-5640-4/18/04. https://doi.org/10.1145/3184558.3188731

1 INTRODUCTION
While the propagation of false information existed well before the internet [11], recent changes to our information ecosystem [24] have created new challenges for distinguishing misinformation from credible content. Misinformation, or information that is false or misleading, can quickly reach thousands or millions of readers, helped by inattentive or malicious sharers and algorithms optimized for engagement. Many solutions to remedy the propagation of misinformation have been proposed—from initiatives for publishers to signal their credibility¹, to technologies for automatically labeling misinformation and scoring content credibility [6, 29, 35, 42], to the engagement of professional fact-checkers or experts [13], to campaigns to improve literacy [19] or crowdsource annotations [31].

While all these initiatives are valuable, the problem is so multifaceted that each provides only partial alleviation. Instead, a holistic approach, with reputation systems, fact-checking, media literacy campaigns, revenue models, and public feedback all contributing, could collectively work towards improving the health of the information ecosystem. To foster this cooperation, we propose a shared vocabulary for representing credibility. However, credibility is not a Boolean flag: there are many indicators, both human- and machine-generated, that can feed into an assessment of article credibility, and differing preferences for what indicators to emphasize or display. Instead of an opaque score or flag, a more transparent and customizable approach would be to allow publishers, platforms, and the public to both understand and communicate what aspects of an article contribute to its credibility and why.

In this work, we describe a set of initial indicators for article credibility, grouped into content signals, that can be determined by only considering the text or content of an article, as well as context signals, that can be determined through consulting external sources or article metadata. These indicators were iteratively developed through consultations with journalists, researchers, platform representatives, and others during a series of conferences, workshops, and online working sessions. While there are many signals of credibility, we focus on article indicators that do not need a domain expert but require human judgment and training. This focus differentiates our work from efforts targeting purely computational or expert-driven indicators, towards broadening participation in credibility annotation and improving media literacy. To validate the indicators and examine how they get annotated, we gathered a dataset of 40 highly shared articles focused on two topics possessing a high degree of misinformation in popular media: public health [49] and climate science [21]. These articles were each annotated with credibility indicators by 6 annotators with training in journalism and logic and reasoning. Their rich annotations help us understand the consistency of the different indicators across annotators and how well they align with domain expert evaluations of credibility. We are releasing the data publicly², and will host an expanded dataset in service to the research community and public.

The process outlined in this paper serves as a template for creating a standardized set of indicators for evaluating content credibility. With broad consensus, these indicators could then support an ecosystem of varied […] could contribute annotations using open standards developed during this work, while any system for displaying or sharing news could make their own decisions about how to aggregate, weight, filter and display credibility information. For instance, systems such as web browsers, search engines, or social platforms could surface information about a news article to benefit readers, much like how nutrition labels for food and browser security labels for webpages provide context in the moment. Readers could also verify an article by building on the annotations left by others or even interrogate the indicators that a particular publisher or other party provides. Finally, the data may be helpful to researchers and industry watchdogs seeking to monitor the ecosystem as a whole.

¹ The Trust Project: https://thetrustproject.org

2 RELATED WORK
In recent years, researchers have sought to better define and characterize misinformation and its place in the larger information ecosystem. Some researchers have chosen to eschew the popularized term "fake news", calling it overloaded [50]. Instead, they have opted for terms such as "information pollution" and "information disorder" [50], to focus not only on the authenticity of the content itself, but also the motivations and actions of creators, including disinformation agents [47], readers, media companies [20] and their advertising models [16], platforms, and sharers. Accordingly, our approach covers a broad range of indicators developed by experts representing a range of disciplines and industries.

An important aspect of characterizing misinformation is understanding how people perceive the credibility of information. Reviews of the credibility research literature [24, 41] describe various aspects of credibility attribution, including judgments about the credibility of a particular source or a broader platform (e.g., a blog versus social media) [43], as well as message characteristics that impact perceptions of credibility of the message or source [30]. Studies have pointed out the differences in perceived credibility that can occur based on differences in personal relevance [53], individual online usage [10], and the co-orientation of reader and writer views [25], among others. This prior work suggests that a one-size-fits-all approach or an approach that provides an opaque "credibility score" will not be able to adapt to individual needs. However, research has also found that readers can be swayed by superficial qualities that may be manipulated, such as a user's avatar on social media [27] or number of sources quoted in an […]
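The "shared vocabulary" and the aggregate/weight/filter workflow described in the introduction above lend themselves to a machine-readable representation. The following Python sketch is illustrative only: the indicator names, fields, and weighting scheme are assumptions made for demonstration, not the schema or indicator set defined in the paper or by any standards body. It shows how per-article annotations of content and context signals from several annotators might be kept inspectable while still yielding a weighted summary for a consuming platform.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class IndicatorAnnotation:
    """One annotator's judgment of one credibility indicator for one article."""
    article_url: str
    indicator: str        # e.g. "clickbait_title" (content) or "expert_sources_cited" (context)
    indicator_type: str   # "content" or "context"
    value: float          # normalized judgment in [0, 1]
    annotator_id: str

def aggregate(annotations, weights=None):
    """Average each indicator across annotators, then combine indicators with
    caller-supplied weights into a single, still-transparent summary score."""
    weights = weights or {}
    by_indicator = defaultdict(list)
    for a in annotations:
        by_indicator[a.indicator].append(a.value)
    per_indicator = {ind: sum(vals) / len(vals) for ind, vals in by_indicator.items()}
    total_weight = sum(weights.get(ind, 1.0) for ind in per_indicator)
    score = sum(v * weights.get(ind, 1.0) for ind, v in per_indicator.items()) / total_weight
    return per_indicator, score

# Hypothetical annotations from two trained annotators on one article.
anns = [
    IndicatorAnnotation("https://example.com/article", "clickbait_title", "content", 0.2, "a1"),
    IndicatorAnnotation("https://example.com/article", "clickbait_title", "content", 0.4, "a2"),
    IndicatorAnnotation("https://example.com/article", "expert_sources_cited", "context", 0.9, "a1"),
    IndicatorAnnotation("https://example.com/article", "expert_sources_cited", "context", 0.7, "a2"),
]
per_indicator, score = aggregate(anns, weights={"expert_sources_cited": 2.0})
print(per_indicator)        # each indicator remains visible, not hidden behind one number
print(round(score, 2))
```

Keeping the per-indicator values alongside any summary score matches the paper's argument that readers and platforms should be able to see why an article was judged credible, not just whether.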
Recommended publications
  • Can You Spot COVID-19 Misinformation?
    Can You Spot COVID-19 Misinformation? We're all seeking the latest information on the COVID-19 pandemic and what might be coming next. But how can we spot and avoid the false information that is also circulating, especially on social media? Here are some things to look out for.
    Have you received a message from your friend that says her aunt/teacher/colleague knows someone who works in the ER and has the following information? If so: Be cautious. Lots of copy and paste rumors are spread this way. Action: try searching for the text on social networks. If you see lots of examples, it has likely travelled a long way before it got to you.
    Did you see a statement being shared on social networks that looks like it has come from the government or a health authority? If so: Be a detective. It may be 'imposter content', which is made to look official when it's not. Action: try visiting the website of the organization quoted and check whether the information on their site is a match.
    Maybe someone sent you a list of top tips to avoid the virus, like eating certain foods or using home remedies? If so: Be skeptical. There's no food or supplement that can stop you getting this virus, and there's currently no treatment - you can only try to manage the symptoms. Action: consult the latest guidelines from the World Health Organization.
    Did you see that very dramatic video on social media showing the latest updates relating to COVID-19? If so: Look closer. Sometimes videos and pictures being shared on social media aren't quite what they seem. Action: try reverse image searching pictures to see if they have been used before.
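    The last tip above suggests reverse image searching a picture to see if it has appeared before. As a rough, local stand-in for that check, the hedged Python sketch below compares a suspect image against previously seen images using perceptual hashes; it assumes the third-party pillow and imagehash packages, and the file paths are placeholders, not real data.

```python
# A minimal stand-in for "reverse image search": compare a suspect image against
# previously seen images using perceptual hashes (assumes `pillow` and `imagehash`).
from PIL import Image
import imagehash

def looks_reused(suspect_path, known_paths, max_distance=8):
    """Return known images whose perceptual hash is close to the suspect image's hash."""
    suspect_hash = imagehash.phash(Image.open(suspect_path))
    matches = []
    for path in known_paths:
        distance = suspect_hash - imagehash.phash(Image.open(path))  # Hamming distance
        if distance <= max_distance:
            matches.append((path, distance))
    return matches

# Placeholder file names for illustration only.
print(looks_reused("viral_frame.jpg", ["flood_2013.jpg", "protest_2016.jpg"]))
```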
  • Validation of a Novel Climate Change Denial Measure Using Item Response Theory
    Validation of a Novel Climate Change Denial Measure Using Item Response Theory
    Mr. George Loram (a, *), Dr. Mathew Ling (a), Mr. Andrew Head (a), Dr. Edward J.R. Clarke (b)
    (a) Deakin University, Geelong, Australia, Misinformation Lab, School of Psychology. (b) Federation University, Mount Helen, Australia, School of Health and Life Sciences. (*) Corresponding Author, Misinformation Lab, Deakin University, Locked Bag 20001, Geelong VIC 3220, Australia. Email address: [email protected]
    Declarations of Interest: All authors declare that no conflict of interest exists. Acknowledgements: This work was supported by Deakin University (4th Year Student Research Funding; 4th Year Publication Award).
    Abstract: Climate change denial persists despite overwhelming scientific consensus on the issue. However, the rates of denial reported in the literature are inconsistent, potentially as a function of ad hoc measurement of denial. This further impacts on interpretability and integration of research. This study aims to create a standardised measure of climate change denial using Item Response Theory (IRT). The measure was created by pooling items from existing denial measures, and was administered to a U.S. sample recruited using Amazon MTurk (N = 206). Participants responded to the prototype measure as well as being measured on a number of constructs that have been shown to correlate with climate change denial (authoritarianism, social dominance orientation, mistrust in scientists, and conspiracist beliefs). Item characteristics were calculated using a 2-parameter IRT model. After screening out poorly discriminating and redundant items, the scale contained eight items. Discrimination indices were high, ranging from 2.254 to 30.839, but item difficulties ranged from 0.437 to 1.167, capturing a relatively narrow band of climate change denial.
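    The abstract reports item discrimination and difficulty from a two-parameter IRT model. A minimal sketch of the standard two-parameter logistic item response function is shown below; the parameter values are illustrative picks from within the ranges quoted in the abstract, not actual item parameters from the study.

```python
import math

def p_endorse(theta, a, b):
    """Two-parameter logistic IRT model: probability that a respondent with latent
    trait level `theta` endorses an item with discrimination `a` and difficulty `b`."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Illustrative values only (within the ranges reported in the abstract).
for theta in (-1.0, 0.0, 0.5, 1.0, 1.5):
    print(theta, round(p_endorse(theta, a=2.25, b=0.44), 3))
```

    With difficulties mostly above zero, items are endorsed mainly by respondents with relatively high levels of the latent trait, which is consistent with the abstract's note that the scale captures a relatively narrow band of climate change denial.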
  • Handbook of Research on Deception, Fake News, and Misinformation Online
    Handbook of Research on Deception, Fake News, and Misinformation Online
    Innocent E. Chiluwa (Covenant University, Nigeria) and Sergei A. Samoilenko (George Mason University, USA), editors. A volume in the Advances in Media, Entertainment, and the Arts (AMEA) Book Series.
    Published in the United States of America by IGI Global, Information Science Reference (an imprint of IGI Global), 701 E. Chocolate Avenue, Hershey PA, USA 17033. Tel: 717-533-8845; Fax: 717-533-8661; E-mail: [email protected]; Web site: http://www.igi-global.com
    Copyright © 2019 by IGI Global. All rights reserved. No part of this publication may be reproduced, stored or distributed in any form or by any means, electronic or mechanical, including photocopying, without written permission from the publisher. Product or company names used in this set are for identification purposes only. Inclusion of the names of the products or companies does not indicate a claim of ownership by IGI Global of the trademark or registered trademark.
    Library of Congress Cataloging-in-Publication Data: Names: Chiluwa, Innocent, editor. | Samoilenko, Sergei A., 1976- editor. Title: Handbook of research on deception, fake news, and misinformation online / Innocent E. Chiluwa and Sergei A. Samoilenko, editors. Description: Hershey, PA: Information Science Reference, 2019. | Includes bibliographical references and index. Identifiers: LCCN 2018055436 | ISBN 9781522585350 (hardcover) | ISBN 9781522585374 (ebook) | ISBN 9781522585367 (softcover). Subjects: LCSH: Fake news. | Deception. | Misinformation. | Social media. Classification: LCC PN4784.F27 H365 2019 | DDC 302.23/1--dc23. LC record available at https://lccn.loc.gov/2018055436
    This book is published in the IGI Global book series Advances in Media, Entertainment, and the Arts (AMEA) (ISSN: 2475-6814; eISSN: 2475-6830). British Cataloguing in Publication Data: A Cataloguing in Publication record for this book is available from the British Library.
  • Infodemic and Misinformation in the Fight Against COVID-19
    Understanding the Infodemic and Misinformation in the Fight Against COVID-19 | Digital Transformation Toolkit, Knowledge Tools, COVID-19 Factsheet, Digital Health
    IMPORTANT NOTE: Stay informed with timely information on the Coronavirus Disease (COVID-19), available on the PAHO and WHO websites and through your national and local public health authorities.
    What is the Infodemic? As stated by the WHO, the COVID-19 outbreak and response has been accompanied by a massive infodemic: an overabundance of information – some accurate and some not – that makes it hard for people to find trustworthy sources and reliable guidance when they need it. Infodemic refers to a large increase in the volume of information associated with a specific topic, whose growth can occur exponentially in a short period of time due to a specific incident, such as the current pandemic. In this situation, misinformation and rumors appear on the scene, along with manipulation of information with doubtful intent. In the information age, this phenomenon is amplified through social networks, spreading farther and faster like a virus. Access to the right information, at the right time, in the right format IS CRITICAL!
    What is Misinformation? Misinformation is false or inaccurate information deliberately intended to deceive. In the context of the current pandemic, it can greatly affect all aspects of life, specifically people's mental health, since searching for COVID-19 updates on the Internet has jumped 50% – 70% across all generations. 361,000,000 videos were uploaded on YouTube in the last 30 days under the "COVID-19" and "COVID 19" classification, and about 19,200 […]
  • Misinformation and Its Correction: Continued Influence and Successful Debiasing
    Misinformation and Its Correction: Continued Influence and Successful Debiasing
    Psychological Science in the Public Interest, 13(3), 106–131. © The Author(s) 2012. Reprints and permission: sagepub.com/journalsPermissions.nav. DOI: 10.1177/1529100612451018. http://pspi.sagepub.com
    Stephan Lewandowsky (1), Ullrich K. H. Ecker (1), Colleen M. Seifert (2), Norbert Schwarz (2), and John Cook (1, 3). (1) University of Western Australia, (2) University of Michigan, and (3) University of Queensland.
    Summary: The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation. We first examine the mechanisms by which such misinformation is disseminated in society, both inadvertently and purposely. Misinformation can originate from rumors but also from works of fiction, governments and politicians, and vested interests. Moreover, changes in the media landscape, including the arrival of the Internet, have fundamentally influenced the ways in which information is communicated and misinformation is spread. We next move to misinformation at the level of the individual, and review the cognitive factors that often render misinformation resistant to correction. We consider how people assess the truth of statements and what makes people believe certain things but not others. We look at people's memory for misinformation and answer the questions of why retractions of misinformation are so ineffective in memory updating and why efforts to retract misinformation can even backfire and, ironically, increase misbelief.
  • Understanding and Countering Climate Science Denial
    Understanding and countering climate science denial
    Journal & Proceedings of the Royal Society of New South Wales, vol. 150, part 2, 2017, pp. 207–219. ISSN 0035-9173/17/020207-13.
    John Cook, Center for Climate Change Communication, George Mason University, Fairfax, Virginia, USA. Email: [email protected]
    Abstract: Science denial causes a range of damaging impacts on society. This is particularly the case with climate science denial, which has resulted in strong polarization around a non-controversial scientific issue, and prolific dissemination of climate misinformation. This in turn has had the effect of reducing public acceptance of climate change, and eroding support for policies to mitigate climate change. In order to develop effective responses to science denial, it is necessary to first understand the drivers of science denial, which leads to deeper understanding of the techniques employed to cast doubt on the reality of climate change. Analysis of denialist techniques can inform development of interventions that neutralize misinformation. Inoculation has been shown to be an effective way to preemptively reduce the influence of climate science denial. Two methods to practically implement inoculation are misconception-based learning (teaching science by addressing scientific misconceptions) and techno-cognition (an interdisciplinary approach that implements psychological principles in the design of technological solutions to misinformation). Interventions and procedures developed for the countering of climate misinformation may also be applied to other scientific topics rife with misinformation, such as vaccination and evolution.
    Introduction: There is an overwhelming scientific consensus that humans are causing global warming. […] damaging and significant impacts of misinformation. In 2014, the World Economic Forum named online misinformation as one […]
  • Media Manipulation and Disinformation Online (Alice Marwick and Rebecca Lewis)
    Media Manipulation and Disinformation Online Alice Marwick and Rebecca Lewis CONTENTS Executive Summary ....................................................... 1 What Techniques Do Media Manipulators Use? ....... 33 Understanding Media Manipulation ............................ 2 Participatory Culture ........................................... 33 Who is Manipulating the Media? ................................. 4 Networks ............................................................. 34 Internet Trolls ......................................................... 4 Memes ................................................................. 35 Gamergaters .......................................................... 7 Bots ...................................................................... 36 Hate Groups and Ideologues ............................... 9 Strategic Amplification and Framing ................. 38 The Alt-Right ................................................... 9 Why is the Media Vulnerable? .................................... 40 The Manosphere .......................................... 13 Lack of Trust in Media ......................................... 40 Conspiracy Theorists ........................................... 17 Decline of Local News ........................................ 41 Influencers............................................................ 20 The Attention Economy ...................................... 42 Hyper-Partisan News Outlets ............................. 21 What are the Outcomes? ..........................................
  • Misinformation, Disinformation, Malinformation: Causes, Trends, and Their Influence on Democracy
    E-PAPER, A Companion to Democracy #3: Misinformation, Disinformation, Malinformation: Causes, Trends, and Their Influence on Democracy. Lejla Turcilo and Mladen Obrenovic. A Publication of Heinrich Böll Foundation, August 2020.
    Preface to the e-paper series "A Companion to Democracy": Democracy is multifaceted, adaptable – and must constantly meet new challenges. Democratic systems are influenced by the historical and social context, by a country's geopolitical circumstances, by the political climate and by the interaction between institutions and actors. But democracy cannot be taken for granted. It has to be fought for, revitalised and renewed. There are a number of trends and challenges that affect democracy and democratisation. Some, like autocratisation, corruption, the delegitimisation of democratic institutions, the shrinking space for civil society or the dissemination of misleading and erroneous information, such as fake news, can shake democracy to its core. Others like human rights, active civil society engagement and accountability strengthen its foundations and develop alongside it. The e-paper series "A Companion to Democracy" examines pressing trends and challenges facing the world and analyses how they impact democracy and democratisation.
    Contents: 1. Introduction, 4; 2. Historical origins of misinformation, disinformation, and malinformation, 5; 3. Information disorder – key concepts and definitions, 7; 3.1. Fake news – definitions, motives, forms, 7; 3.2. Disinformation, misinformation, malinformation, 8; 4. Distortion of truth and manipulation of consent, 12; 5. Democracy at risk in post-truth society – how misinformation, disinformation, and malinformation destroy democratic values, 17; 6. […]
  • Vulnerable to Misinformation? Verifi!
    Vulnerable to Misinformation? Verifi!
    Alireza Karduni, Isaac Cho, Ryan Wesslen, Sashank Santhanam (UNC-Charlotte, Department of Computer Science, Charlotte, North Carolina; {akarduni,icho1,rwesslen,ssantha1}@uncc.edu); Svitlana Volkova, Dustin L. Arendt (Pacific Northwest National Laboratory, Richland, Washington; {svitlana.volkova,dustin.arendt}@pnnl.gov); Samira Shaikh, Wenwen Dou (UNC-Charlotte, Department of Computer Science, Charlotte, North Carolina; {sshaikh2,wdou1}@uncc.edu)
    ABSTRACT: We present Verifi2, a visual analytic system to support the investigation of misinformation on social media. Various models and studies have emerged from multiple disciplines to detect or understand the effects of misinformation. However, there is still a lack of intuitive and accessible tools that help social media users distinguish misinformation from verified news. Verifi2 uses state-of-the-art computational methods to highlight linguistic, network, and image features that can distinguish suspicious news accounts. By exploring news on a source and document level in Verifi2, users can interact with the complex dimensions that characterize misinformation.
    […] practitioners from multiple disciplines including political science, psychology and computer science are grappling with the effects of misinformation and devising means to combat it [14, 44, 52, 53, 63]. Although hardly a new phenomenon, both anecdotal evidence and systematic studies that emerged recently have demonstrated real consequences of misinformation on people's attitudes and actions [2, 46]. There are multiple factors contributing to the widespread prevalence of misinformation online. On the content generation end, misleading or fabricated content can be created by actors with malicious intent. In addition, the spread of such misinformation can be aggravated by social bots.
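    The abstract mentions surfacing linguistic features that can distinguish suspicious news accounts. As a hedged illustration of what simple surface features of that kind might look like, the Python sketch below computes a toy linguistic profile of a post or headline; it is not Verifi2's feature set or implementation, and the example input is invented.

```python
import re

def linguistic_features(text):
    """Toy linguistic profile of a post or headline; illustrative only."""
    words = re.findall(r"[A-Za-z']+", text)
    n = max(len(words), 1)
    return {
        "num_words": len(words),
        "all_caps_ratio": sum(w.isupper() and len(w) > 1 for w in words) / n,
        "exclamation_count": text.count("!"),
        "question_count": text.count("?"),
        "quote_count": text.count('"'),
    }

print(linguistic_features('SHOCKING!!! "Experts" HIDE the truth about this cure?'))
```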
  • How the 'Plandemic' Movie and Its Falsehoods Spread Widely Online
    How the 'Plandemic' Movie and Its Falsehoods Spread Widely Online. Conspiracy theories about the pandemic have gained more traction than mainstream online events. Here's how. By Sheera Frenkel, Ben Decker and Davey Alba.
    There have been plenty of jaw-dropping digital moments during the coronavirus pandemic. There was the time this month when Taylor Swift announced she would air her "City of Lover" concert on television. The time that the cast of "The Office" reunited for an 18-minute-long Zoom wedding. And the time last month that the Pentagon posted three videos that showed unexplained "aerial phenomena."
    Yet none of those went as viral as a 26-minute video called "Plandemic," a slickly produced narration that wrongly claimed a shadowy cabal of elites was using the virus and a potential vaccine to profit and gain power. The video featured a discredited scientist, Judy Mikovits, who said her research about the harm from vaccines had been buried.
    "Plandemic" went online on May 4 when its maker, Mikki Willis, a little-known film producer, posted it to Facebook, YouTube, Vimeo and a separate website set up to share the video. For three days, it gathered steam in Facebook pages dedicated to conspiracy theories and the anti-vaccine movement, most of which linked to the video hosted on YouTube. Then it tipped into the mainstream and exploded.
    Just over a week after "Plandemic" was released, it had been viewed more than eight million times on YouTube, Facebook, Twitter and Instagram, and had generated countless other posts. The New York Times focused on the video's spread on Facebook using data from CrowdTangle, a tool to analyze interactions across the social network.
  • A Disinformation-Misinformation Ecology: The Case of Trump (Thomas J. Froehlich)
    Chapter: A Disinformation-Misinformation Ecology: The Case of Trump. Thomas J. Froehlich.
    Abstract: This paper lays out many of the factors that make disinformation or misinformation campaigns of Trump successful. By all rational standards, he is unfit for office, a compulsive liar, incompetent, arrogant, ignorant, mean, petty, and narcissistic. Yet his approval rating tends to remain at 40%. Why do rational assessments of his presidency fail to have any traction? This paper looks at the conflation of knowledge and beliefs in partisan minds, how beliefs lead to self-deception and social self-deception and how they reinforce one another. It then looks at psychological factors, conscious and unconscious, that predispose partisans to pursue partisan sources of information and reject non-partisan sources. It then explains how these factors sustain the variety and motivations of Trump supporters' commitment to Trump. The role of cognitive authorities like Fox News and right-wing social media sites are examined to show how the power of these media sources escalates and reinforces partisan views and the rejection of other cognitive authorities. These cognitive authorities also use emotional triggers to inflame Trump supporters, keeping them addicted by feeding their anger, resentment, or self-righteousness. The paper concludes by discussing the dynamics of the Trump disinformation-misinformation ecology, creating an Age of Inflamed Grievances.
    Keywords: Trumpism, disinformation, cognitive authority, Fox News, social media, propaganda, inflamed grievances, psychology of disinformation, Donald Trump, media, self-deception, social self-deception.
    1. Introduction: This paper investigates how disinformation-misinformation campaigns, particularly in the political arena, succeed and why they are so hard to challenge, defeat, or deflect.
  • Computer-Assisted Detection and Classification of Misinformation About Climate Change
    Computer-assisted detection and classification of misinformation about climate change
    Travis G. Coan (1), Constantine Boussalis (2), John Cook (3, 4, *), and Mirjam O. Nanko (1)
    (1) Department of Politics and the Exeter Q-Step Centre, University of Exeter, United Kingdom. (2) Department of Political Science, Trinity College Dublin, Ireland. (3) Monash Climate Change Communication Research Hub, Monash University, Australia. (4) Center for Climate Change Communication, George Mason University, USA. (*) Corresponding author email: [email protected]
    ABSTRACT: A growing body of scholarship investigates the role of misinformation in shaping the debate on climate change. Our research builds on and extends this literature by 1) developing and validating a comprehensive taxonomy of climate misinformation, 2) conducting the largest content analysis to date on contrarian claims, 3) developing a computational model to accurately detect specific claims, and 4) drawing on an extensive corpus from conservative think-tank (CTTs) websites and contrarian blogs to construct a detailed history of misinformation over the past 20 years. Our study finds that climate misinformation produced by CTTs and contrarian blogs has focused on attacking the integrity of climate science and scientists and, increasingly, has challenged climate policy and renewable energy. We further demonstrate the utility of our approach by exploring the influence of corporate and foundation funding on the production and dissemination of specific contrarian claims.
    Organised climate change contrarianism has played a significant role in the spread of misinformation and the delay of meaningful action to mitigate climate change. Research suggests that climate misinformation leads to a number of negative outcomes such as reduced climate literacy, public polarization, canceling out accurate information, reinforcing climate silence, and influencing how scientists engage with the public.
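    The abstract describes a computational model for detecting specific contrarian claims. As a hedged, minimal sketch of that general approach, the Python example below trains a tiny supervised text classifier with scikit-learn; the category labels and training snippets are invented placeholders, not the taxonomy or corpus from this study.

```python
# Minimal sketch of supervised claim classification (assumes scikit-learn is installed).
# Labels and training snippets are invented placeholders for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Global temperatures have not risen in years",
    "The warming trend is just natural variability",
    "Climate scientists manipulate the data",
    "Peer review in climate science is broken",
    "Renewable energy cannot power a modern grid",
    "Wind and solar make electricity unaffordable",
]
train_labels = [
    "science_denial", "science_denial",
    "attack_on_scientists", "attack_on_scientists",
    "policy_and_energy", "policy_and_energy",
]

# TF-IDF features over unigrams and bigrams, fed into a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(train_texts, train_labels)
print(model.predict(["Scientists hide the decline in the temperature record"]))
```

    A real system of this kind would need a much larger labeled corpus and a validated taxonomy; the sketch only shows the basic pipeline shape of mapping claim text to claim categories.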