Medical Misinformation in the Covid-19 Pandemic
Recommended publications
-
Disinformation, 'Fake News' and Influence Campaigns on Twitter
Disinformation, 'Fake News' and Influence Campaigns on Twitter. October 2018. Matthew Hindman (George Washington University) and Vlad Barash (Graphika).

Contents: Executive Summary; Introduction; A Problem Both Old and New; Defining Fake News Outlets; Bots, Trolls and 'Cyborgs' on Twitter; Map Methodology; Election Data and Maps (Election Core Map, Election Periphery Map, Postelection Map); Fake Accounts From Russia's Most Prominent Troll Farm; Disinformation Campaigns on Twitter: Chronotopes (#NoDAPL, #WikiLeaks, #SpiritCooking, #SyriaHoax, #SethRich); Conclusion; Bibliography; Notes.

Executive Summary: This study is one of the largest analyses to date of how fake news spread on Twitter both during and after the 2016 election campaign. Using tools and mapping methods from Graphika, a social media intelligence firm, we study more than 10 million tweets from 700,000 Twitter accounts that linked to more than 600 fake and conspiracy news outlets. Crucially, we study fake and conspiracy news both before and after the election, allowing us to measure how the fake news ecosystem has evolved since November 2016. Much fake news and disinformation is still being spread on Twitter. Consistent with other research, we find more than 6.6 million tweets linking to fake and conspiracy news publishers in the month before the 2016 election. Yet disinformation continues to be a substantial problem after the election, with 4.0 million tweets linking to fake and conspiracy news publishers found in a 30-day period from mid-March to mid-April 2017. Contrary to claims that fake news is a game of "whack-a-mole," more than 80 percent of the disinformation accounts in our election maps are still active as this report goes to press.
-
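The study's core measurement, counting tweets that link to known fake and conspiracy news outlets, can be sketched in miniature. The domain list and tweets below are invented placeholders, not the actual outlets or data Hindman and Barash used, and their real pipeline (Graphika's mapping tools) is far richer:

```python
from urllib.parse import urlparse

# Hypothetical low-credibility outlet list; the study tracked ~600 real outlets.
LOW_CREDIBILITY_DOMAINS = {"fakenewsdaily.example", "conspiracyhub.example"}

def extract_domain(url):
    """Return the hostname of a URL, stripping any 'www.' prefix."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def count_low_credibility_links(tweets):
    """Count tweets containing at least one link to a listed outlet."""
    count = 0
    for tweet in tweets:
        if any(extract_domain(u) in LOW_CREDIBILITY_DOMAINS for u in tweet["urls"]):
            count += 1
    return count

# Invented sample tweets, each with the URLs it contained.
tweets = [
    {"id": 1, "urls": ["https://www.fakenewsdaily.example/story"]},
    {"id": 2, "urls": ["https://reputable.example/report"]},
    {"id": 3, "urls": []},
]
print(count_low_credibility_links(tweets))  # -> 1
```

Run over a time-stamped corpus, the same count bucketed by month would yield the before/after-election comparison the executive summary reports.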
SARAH E. KREPS John L. Wetherill Professor
ACADEMIC EMPLOYMENT
- 2019-Present: Professor of Government, Adjunct Professor of Law, Cornell University
- 2013-Present: Associate Professor, Adjunct Professor of Law, Cornell University
- 2008-2013: Assistant Professor of Government, Cornell University

FELLOWSHIPS AND AFFILIATIONS
- 2020-Present: Non-Resident Senior Fellow, Foreign Policy, Brookings Institution
- 2020-Present: Faculty Affiliate, Institute for Politics and Global Affairs, Cornell University
- 2018-Present: Faculty Fellow, Milstein Program in Technology and Humanity, Cornell University
- 2018-Present: Faculty Affiliate, Roper Center for Public Opinion Research
- 2007-Present: Member, Council on Foreign Relations
- 2017-2018: Adjunct Scholar, Modern War Institute at West Point
- 2015: Summer Security Fellow, Hoover Institution, Stanford University
- 2013-2014: Stanton Nuclear Security Fellow, Council on Foreign Relations
- 2007-2008: Fellow, Belfer Center for Science and International Affairs, Harvard University
- 2006-2007: Fellow, Miller Center for Public Affairs, University of Virginia
- 2006: DAAD Fellow, American Institute for Contemporary German Studies
- 2005-2008: Senior Fellow, Institute for International Law and Politics, Georgetown
- 1998-1999: Research Associate, Environment and Health Program, University of Geneva
- 1997-1999: Research Associate, Environmental Epidemiology, University of Paris V

EDUCATION
- Georgetown University, Ph.D. in Government, fields: international relations, security studies, 2007
- Oxford University, M.Sc. in Environmental Change and Management, with distinction, 1999
- Harvard University, B.A. in Environmental Science and Public Policy, magna cum laude, 1998

BOOKS AND MONOGRAPHS
- Social Media and International Politics (Cambridge University Press, 2020)
- Taxing Wars: The American Way of War Finance and the Decline of Democracy (Oxford University Press, 2018). Reviewed in the New York Times and Washington Post.
- Drones: What Everyone Needs to Know (Oxford University Press, 2016)
-
Fake News on Facebook and Twitter: Investigating How People (Don't) Investigate
CHI 2020, April 25-30, 2020, Honolulu, HI, USA

Fake News on Facebook and Twitter: Investigating How People (Don't) Investigate
Christine Geeng, Savanna Yee, Franziska Roesner
Paul G. Allen School of Computer Science & Engineering, University of Washington
{cgeeng,savannay,franzi}@cs.washington.edu

Abstract: With misinformation proliferating online and more people getting news from social media, it is crucial to understand how people assess and interact with low-credibility posts. This study explores how users react to fake news posts on their Facebook or Twitter feeds, as if posted by someone they follow. We conducted semi-structured interviews with 25 participants who use social media regularly for news, temporarily caused fake news to appear in their feeds with a browser extension unbeknownst to them, and observed as they walked us through their feeds. We found various reasons why people do not investigate low-credibility posts, including taking trusted posters' …

Introduction (excerpt): … well as the accounts who spread these stories. However, the speed, ease, and scalability of information spread on social media means that (even automated) content moderation by the platforms cannot always keep up with the problem. The reality of misinformation on social media begs the question of how people interact with it, whether they believe it, and how they debunk it. To support users in making decisions about the credibility of content they encounter, third parties have created fact-checking databases [28, 75, 78], browser extensions [29, 63], and media literacy initiatives [8, 41, 70]. Facebook and Twitter themselves have made algorithm and user interface (UI) changes to help address this.
-
Matthew Fuhrmann
Matthew Fuhrmann, Curriculum Vitae
Texas A&M University, Department of Political Science, 4348 TAMU, College Station, Texas 77843-4348
Email: mcfuhrmann (at) gmail (dot) com, mfuhrmann (at) tamu (dot) edu
Website: www.matthewfuhrmann.com
Updated: July 19, 2019

PROFESSIONAL POSITIONS

Current, Texas A&M University, Department of Political Science:
- Professor, September 2017-
- Associate Department Head, August 2019 - August 2020
- Presidential Impact Fellow, September 2018-
- Faculty Affiliate, Center for Grand Strategy, September 2018-

Current, Stanford University, Center for International Security and Cooperation:
- Affiliate, September 2017-

Previous:
- Visiting Associate Professor, Stanford University, Center for International Security and Cooperation, September 2016 - August 2017
- Director of Graduate Studies, Texas A&M University, September 2015 - July 2016 and September 2017 - August 2019
- Associate Professor, Texas A&M University, September 2014 - August 2017
- Ray A. Rothrock '77 Fellow, Texas A&M University, September 2014 - August 2017
- Assistant Professor, Texas A&M University, July 2011 - August 2014
- Stanton Nuclear Security Fellow, Council on Foreign Relations, August 2010 - July 2011
- Assistant Professor, University of South Carolina, Department of Political Science, January 2009 - May 2011
- Affiliate, Harvard University, Belfer Center for Science and International Affairs, January 2009 - August 2011
- Research Fellow, Harvard University, Belfer Center for Science and International Affairs, August 2007 - December 2008
- Graduate Research Associate, University of Georgia, Center for International Trade and Security, January 2005 - July 2007

EDUCATION
- Ph.D., Political Science, University of Georgia, 2008
- M.S., International Affairs, Georgia Tech, 2004
- B.A., Political Science (magna cum laude), University of Georgia, 2002

AWARDS AND FELLOWSHIPS
- Open Educator Award, Student Government Association, Texas A&M University, 2019
- Presidential Impact Fellow, Texas A&M University, 2018
- Andrew Carnegie Fellow, Carnegie Corporation of New York, 2016
-
Facts and Myths about Misperceptions, Brendan Nyhan
Facts and Myths about Misperceptions
Brendan Nyhan
Brendan Nyhan is Professor of Government, Dartmouth College, Hanover, New Hampshire. His email is [email protected].

Abstract: Misperceptions threaten to warp mass opinion and public policy on controversial issues in politics, science, and health. What explains the prevalence and persistence of these false and unsupported beliefs, which seem to be genuinely held by many people? Though limits on cognitive resources and attention play an important role, many of the most destructive misperceptions arise in domains where individuals have weak incentives to hold accurate beliefs and strong directional motivations to endorse beliefs that are consistent with a group identity such as partisanship. These tendencies are often exploited by elites, who frequently create and amplify misperceptions to influence elections and public policy. Though evidence is lacking for claims of a "post-truth" era, changes in the speed with which false information travels and the extent to which it can find receptive audiences require new approaches to counter misinformation. Reducing the propagation and influence of false claims will require further efforts to inoculate people in advance of exposure (e.g., media literacy), debunk false claims that are already salient or widespread (e.g., fact-checking), reduce the prevalence of low-quality information (e.g., changing social media algorithms), and discourage elites from promoting false information (e.g., strengthening reputational sanctions).

On August 7, 2009, former vice presidential candidate Sarah Palin reshaped the debate over the Patient Protection and Affordable Care Act when she published a Facebook post falsely claiming that "my parents or my baby with Down Syndrome will have to stand in front of [Barack] Obama's 'death panel' so his bureaucrats can decide …"
-
Echo Chambers
Avoiding the Echo Chamber about Echo Chambers: Why selective exposure to like-minded political news is less prevalent than you think
Andrew Guess (Department of Politics, Princeton University), Brendan Nyhan (Department of Government, Dartmouth College), Benjamin Lyons (Department of Politics, University of Exeter), Jason Reifler (Department of Politics, University of Exeter)
Knight Foundation (knightfoundation.org)

Contents: The Echo Chambers Critique; Selective Exposure: A More Complex Story; The Importance of Social Context; Conclusion; References

Is the expansion of media choice good for democracy? Not according to critics who decry "echo chambers," "filter bubbles," and "information cocoons": the highly polarized, ideologically homogeneous forms of news and media consumption that are facilitated by technology. However, these claims overstate the prevalence and severity of these patterns, which at most capture the experience of a minority of the public. In this review essay, we summarize the most important findings of the academic literature about where and how Americans get news and information. We focus particular attention on how much consumers engage in selective exposure to media content that is consistent with their political beliefs and the extent to which this pattern is exacerbated by technology. As we show, the data frequently contradict or at least complicate the "echo chambers" narrative, which has ironically been amplified and distorted in a kind of echo chamber effect.
-
The Science of Fake News
The Science of Fake News: Addressing fake news requires a multidisciplinary effort
Science, Insights, Policy Forum (Social Science)
By David M. J. Lazer, Matthew A. Baum, Yochai Benkler, Adam J. Berinsky, Kelly M. Greenhill, Filippo Menczer, Miriam J. Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven A. Sloman, Cass R. Sunstein, Emily A. Thorson, Duncan J. Watts, Jonathan L. Zittrain

The rise of fake news highlights the erosion of long-standing institutional bulwarks against misinformation … and the mechanisms by which it spreads. Fake news has a long history, but we focus on unanswered scientific questions raised by the proliferation of its most recent, politically oriented incarnation. Beyond selected references in the text, suggested further reading can be found in the supplementary materials.

What Is Fake News? We define "fake news" to be fabricated information that mimics news media content in form but not in organizational process or … It has been propagated about topics such as vaccination, nutrition, and stock values. It is particularly pernicious in that it is parasitic on standard news outlets, simultaneously benefiting from and undermining their credibility. Some, notably First Draft and Facebook, favor the term "false news" because of the use of fake news as a political weapon (1). We have retained it because of its value as a scientific construct, and because its political salience draws attention to an important subject.

The Historical Setting: Journalistic norms of objectivity and balance arose as a backlash among journalists against the widespread use of propaganda in World War I (particularly their own role in propagating it) and the rise of corporate public relations in the 1920s.
-
Deterrence: What It Can (And Cannot) Do
Deterrence and Conflict: Deterrence: what it can (and cannot) do
Ellen Resnek, Downingtown East High School
2018 FPRI Conference: Understanding the Many Missions of the American Military
Lesson Plan, World History/Contemporary Issues, High School

Essential Question: Assess the validity of the statement: "Deterrence is still fundamentally about influencing an actor's decisions. It is about a solid policy foundation. It is about credible capabilities. It is about what the U.S. and our allies as a whole can bring to bear in both a military and a nonmilitary sense." (Robert Kehler)

Instructional Focus: After this lesson, students will be able to:
- define the acronym NATO and other key terms related to the lesson's content
- explain NATO's purpose
- identify member countries of NATO
- discuss employed defense strategies
- summarize a specific event of NATO efforts

Curriculum Standards:
- CCSS.ELA-LITERACY.RST.11-12.7: Integrate and evaluate multiple sources of information presented in diverse formats and media (e.g., quantitative data, video, multimedia) in order to address a question or solve a problem.
- CCSS.ELA-LITERACY.WHST.11-12.2.B: Develop the topic thoroughly by selecting the most significant and relevant facts, extended definitions, concrete details, quotations, or other information and examples appropriate to the audience's knowledge of the topic.

Objectives (NCSS Standard VI, Power, Authority, and Governance): Understanding the historical development of structures of power, authority, and governance and their evolving functions in contemporary U.S. society and other parts of the world is essential for developing civic competence.

Teacher Background: This lesson plan was conceived and adapted from the lecture "Deterrence and Forward Presence in Europe: From Cold War to Present" by Sarah Kreps, Associate Professor of Government, Cornell University, March 24, 2018. The emergence of the Cold War following WWII did not allow for all U.S. …
-
Partisan Polarization Is the Primary Psychological Motivation Behind Political Fake News Sharing on Twitter (forthcoming, American Political Science Review)
Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter
Mathias Osmundsen (1), Alexander Bor (1), Peter Bjerregaard Vahlstrup (2), Anja Bechmann (2), and Michael Bang Petersen (1)
(1) Department of Political Science, Aarhus University; (2) School of Communication and Culture, Aarhus University
Please cite the final version of the paper, available at American Political Science Review. Corresponding author: [email protected].

Abstract: The rise of "fake news" is a major concern in contemporary Western democracies. Yet research on the psychological motivations behind the spread of political fake news on social media is surprisingly limited. Are citizens who share fake news ignorant and lazy? Are they fueled by sinister motives, seeking to disrupt the social status quo? Or do they seek to attack partisan opponents in an increasingly polarized political environment? This manuscript is the first to test these competing hypotheses based on a careful mapping of psychological profiles of over 2,300 American Twitter users linked to behavioral sharing data and sentiment analyses of more than 500,000 news story headlines. The findings contradict the ignorance perspective but provide some support for the disruption perspective and strong support for the partisan polarization perspective. Thus, individuals who report hating their political opponents are the most likely to share political fake news and to selectively share content that is useful for derogating these opponents. Overall, our findings show that fake news sharing is fueled by the same psychological motivations that drive other forms of partisan behavior, including sharing partisan news from traditional and credible news sources.

With the advent of social media, the circulation of "fake news" has emerged as a major societal concern.
-
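The sentiment analysis of headlines described in the abstract can be illustrated with a toy lexicon-based scorer. The word lists and headlines below are invented for illustration; the study itself applied far richer sentiment methods to over 500,000 real headlines:

```python
# Tiny invented sentiment lexicons; real analyses use large validated lexicons
# or supervised classifiers.
NEGATIVE = {"corrupt", "disaster", "lying", "scandal"}
POSITIVE = {"triumph", "honest", "success", "praised"}

def headline_sentiment(headline):
    """Return a crude polarity score: (# positive words) - (# negative words)."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Invented example headlines.
headlines = [
    "Senator praised for honest budget plan",
    "Governor caught lying in corrupt land deal",
]
scores = [headline_sentiment(h) for h in headlines]
print(scores)  # -> [2, -2]
```

Paired with data on which party each headline targets and who shared it, scores like these support the paper's test of whether sharers selectively spread content derogating their opponents.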
When Does the Mission Determine the Coalition? The Logic of Multilateral Intervention and the Case of Afghanistan
When Does the Mission Determine the Coalition? The Logic of Multilateral Intervention and the Case of Afghanistan
Sarah Kreps
Security Studies, 17:3, 531-567 (Routledge). Version of record first published: 15 September 2008.
http://dx.doi.org/10.1080/09636410802319610
-
Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age
Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age
March 2019, Park Advisors
Authored by Christina Nemr and William Gangware

Acknowledgements: The authors are grateful to the following subject matter experts who provided input on early drafts of select excerpts: Dr. Drew Conway, Dr. Arie Kruglanski, Sean Murphy, Dr. Alina Polyakova, and Katerina Sedova. The authors also appreciate the contributions to this paper by Andrew Rothgaber and Brendan O'Donoghue of Park Advisors, as well as the editorial assistance provided by Rhonda Shore and Ryan Jacobs. This report was produced with support from the US Department of State's Global Engagement Center. Any views expressed in this report are those of the authors and do not necessarily reflect the views of the US State Department, Park Advisors, or its subject matter expert consultants. Any errors contained in this report are the authors' alone.

Table of Contents: Introduction and contextual analysis; How do we define disinformation?; What psychological factors drive vulnerabilities to disinformation and propaganda?; A look at foreign state-sponsored disinformation and propaganda; Platform-specific challenges and efforts to counter disinformation; Knowledge gaps and future technology challenges

Introduction and contextual analysis: On July 12, 2014, viewers of Russia's main state-run television station, Channel One, were shown a horrific story.
-
The Role of Technology in Online Misinformation Sarah Kreps
The Role of Technology in Online Misinformation
Sarah Kreps, June 2020

Executive Summary: States have long interfered in the domestic politics of other states. Foreign election interference is nothing new, nor are misinformation campaigns. The new feature of the 2016 election was the role of technology in personalizing and then amplifying the information to maximize its impact. As a 2019 Senate Select Committee on Intelligence report concluded, malicious actors will continue to weaponize information and develop increasingly sophisticated tools for personalizing, targeting, and scaling up the content. This report focuses on those tools. It outlines the logic of digital personalization, which uses big data to analyze individual interests to determine the types of messages most likely to resonate with particular demographics. The report speaks to the role of artificial intelligence, machine learning, and neural networks in creating tools that distinguish quickly between objects, for example a stop sign versus a kite, or, in a battlefield context, a combatant versus a civilian. Those same technologies can also operate in the service of misinformation through text prediction tools that receive user inputs and produce new text that is as credible as the original text itself. The report addresses potential policy solutions that can counter digital personalization, closing with a discussion of regulatory or normative tools that are less likely to be effective in countering the adverse effects of digital technology.

Introduction: Meddling in domestic elections is nothing new as a tool of foreign influence. … and machine learning about user behavior to manipulate public opinion allowed social media bots to target individuals or demographics known …
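The "text prediction tools" the report describes can be illustrated in miniature with a bigram (Markov chain) generator: given user input, it produces new text by sampling learned continuations. Modern systems use large neural language models rather than anything this simple, and the corpus below is invented:

```python
import random
from collections import defaultdict

# Toy training corpus; a real model would be trained on vast text collections.
corpus = "the quick brown fox jumps over the lazy dog and the quick cat".split()

# Build a bigram table: each word maps to the words observed after it.
model = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev].append(nxt)

def predict(seed, length=5, rng=None):
    """Generate up to `length` words after `seed` by sampling bigram continuations."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    out = [seed]
    for _ in range(length):
        choices = model.get(out[-1])
        if not choices:  # no observed continuation: stop early
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(predict("the"))
```

The same receive-input, emit-plausible-continuation loop, scaled up with neural networks, is what lets misinformation text be generated cheaply and in volume.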