
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the "Post-Truth" Era. Journal of Applied Research in Memory and Cognition, 6(4), 353-369. https://doi.org/10.1016/j.jarmac.2017.07.008

Running head: BEYOND MISINFORMATION

Beyond Misinformation: Understanding and coping with the "post-truth" era

Stephan Lewandowsky, University of Bristol and University of Western Australia
Ullrich K. H. Ecker, University of Western Australia
John Cook, George Mason University

Word count: XXXX excluding references (approximate count due to use of LaTeX)

Stephan Lewandowsky
School of Experimental Psychology and Cabot Institute
University of Bristol
12a Priory Road
Bristol BS8 1TU, United Kingdom
[email protected]
URL: http://www.cogsciwa.com

Abstract

The terms "post-truth" and "fake news" have become increasingly prevalent in public discourse over the last year. This article explores the growing abundance of misinformation, how it influences people, and how to counter it. We examine the ways in which misinformation can have an adverse impact on society. We summarize how people respond to corrections of misinformation, and what kinds of corrections are most effective. We argue that to be effective, scientific research into misinformation must be considered within a larger political, technological, and societal context. The post-truth world emerged as a result of societal mega-trends such as a decline in social capital, growing economic inequality, increased polarization, declining trust in science, and an increasingly fractionated media landscape. We suggest that responses to this malaise must involve technological solutions incorporating psychological principles, an interdisciplinary approach that we describe as "technocognition". We outline a number of recommendations to counter misinformation in a post-truth world.

General Audience Summary

Imagine a world that considers knowledge to be "elitist". Imagine a world in which it is not medical knowledge but a free-for-all opinion market on Twitter that determines whether a newly emergent strain of avian flu is really contagious to humans. This dystopian future is still just that: a possible future.
However, there are signs that public discourse is evolving in this direction: terms such as "post-truth" and "fake news", largely unknown until 2016, have exploded into media and public discourse. This article explores the growing abundance of misinformation in the public sphere, how it influences people, and how to counter it. We show how misinformation can have an adverse impact on society, for example by predisposing parents to make disadvantageous medical decisions for their children. We argue that for counter-measures to be effective, they must be informed by the larger political, technological, and societal context. The post-truth world arguably emerged as a result of societal mega-trends, such as a decline in social capital, growing economic inequality, increased polarization, declining trust in science, and an increasingly fractionated media landscape. Considered against the background of those over-arching trends, misinformation in the post-truth era can no longer be regarded solely as an isolated failure of individual cognition that can be corrected with appropriate communication tools. Rather, it must also be understood in light of alternative epistemologies that defy conventional standards of evidence. Responses to the post-truth era must therefore include technological solutions that incorporate psychological principles, an interdisciplinary approach that we describe as "technocognition". Technocognition uses findings from cognitive science to inform the design of information architectures that encourage the dissemination of high-quality information and that discourage the spread of misinformation.

Beyond Misinformation: Understanding and coping with the post-truth era

Imagine a world that has had enough of experts. A world that considers knowledge to be "elitist". Imagine a world in which it is not expert knowledge but an opinion market on Twitter that determines whether a newly emergent strain of avian flu is really contagious to humans, or whether greenhouse gas emissions do in fact cause global warming, as 97% of domain experts say they do (Anderegg, Prall, Harold, & Schneider, 2010; Cook et al., 2013, 2016; Oreskes, 2004). In this world, power lies with those most vocal and influential on social media: from celebrities and big corporations to botnet puppeteers who can mobilize millions of tweetbots or sock puppets, that is, fake online personas through which a small group of operatives can create the illusion of a widespread opinion (Bu, Xia, & Wang, 2013; Lewandowsky, 2011). In this world, experts are derided as untrustworthy or elitist whenever their reported facts threaten the rule of the well-financed or the prejudices of the uninformed.

How close are we to this dystopian future? We may not be there (yet), although there are reasons to be concerned about our trajectory. The terms "post-truth" and "post-fact", virtually unknown 5 years ago, have exploded onto the media scene with thousands of recent mentions. To illustrate, the media search engine Factiva returns 40 hits in the global media for "post-truth" in all of 2015, compared to 2,535 in 2016 and around 2,400 during the first 3 months of 2017. The prevalence of misinformation in 2016 led the Oxford Dictionary to nominate "post-truth" as the word of the year (Flood, 2016).
The rapidly growing recognition of the role of misinformation follows on the heels of earlier warnings, for example by the World Economic Forum, a not-for-profit institution "committed to improving the state of the world", which ranked the spread of misinformation online as one of the 10 most significant issues facing the world in 2013 (WEF, 2013).

During the 2016 U.S. presidential campaign, the independent fact checker PolitiFact judged 70% of all statements by Donald Trump to be false or mostly false. For his opponent, Hillary Clinton, this rate was much lower (although arguably still quite high) at 26%. Donald Trump won the presidency, suggesting that his comparatively impoverished record of accuracy did not diminish his attractiveness to a large number of voters. Impressions of President Trump's popularity were possibly boosted by the fact that a substantial portion of all pro-Trump traffic on Twitter was driven by tweetbots, with automated pro-Trump traffic being at least 4 times as prevalent as automated pro-Clinton traffic (Kollanyi, Howard, & Woolley, 2016).

The dissociation between accuracy and President Trump's attractiveness to voters is underscored by recent laboratory research investigating the effects of corrections on voters' beliefs and voting intentions: Swire, Berinsky, Lewandowsky, and Ecker (2017) presented statements that President Trump made on the primary campaign trail to a large sample of participants and elicited belief ratings. Half the statements were true (e.g., "the U.S. spent $2 trillion on the war in Iraq") and the other half consisted of false claims (e.g., "vaccines cause autism"). When participants received corrections of the false statements, and affirmations of the correct statements, their belief ratings changed accordingly: all participants, including Trump supporters, believed statements less after they were identified as false, and believed them more after they were affirmed as correct. However, for Trump supporters there was no association between the extent to which they shifted their belief when a statement was corrected and their feelings for President Trump or their intention to vote for him. Thus, it seems that President Trump's false claims did not matter to his supporters, at least not sufficiently to alter their feelings or voting intentions.

This article uses the context of those recent public events to pursue a number of questions: What explains the growing abundance of misinformation? Why do people believe in misinformation? If misinformation is corrected, do people reliably update their beliefs? To what extent are people concerned about whether or not information is accurate? This article places the findings from cognitive research on misinformation into a broader political and societal context.