The Challenges of Algorithmically Assigning Fact-Checks: A Sociotechnical Examination of Google's Reviewed Claims
16 pages, PDF, 1,020 KB
Recommended publications
Sociotechnical Systems and Ethics in the Large
Amit K. Chopra (Lancaster University, Lancaster LA1 4WA, UK) and Munindar P. Singh (North Carolina State University, Raleigh, NC 27695-8206, USA)

Abstract: Advances in AI techniques and computing platforms have triggered a lively and expanding discourse on ethical decision-making by autonomous agents. Much recent work in AI concentrates on the challenges of moral decision making from a decision-theoretic perspective, and especially the representation of various ethical dilemmas. Such approaches may be useful but in general are not productive because moral decision making is as context-driven as other forms of decision making, if not more. In contrast, we consider ethics not from the standpoint of an individual agent but of the wider sociotechnical systems (STS) in which the agent operates. Our contribution in this paper is the conception of ethical STS founded on governance that takes into account stakeholder values, normative constraints on agents, and outcomes (states of the STS) that obtain due to actions taken by agents.

From the introduction: . . . question has inspired various versions of "do no harm to humans" maxims, from Asimov to Bostrom and Yudkowsky (2014). And, partly this interest stems from imagining that agents are deliberative entities who will make choices much in the same way humans do: faced with a situation that demands deliberation, an agent will line up its choices and make the best one that is also the most ethical. The trolley problem, a moral dilemma that has been the subject of extensive philosophical discussion, has been discussed extensively in the context of self-driving vehicles (Bonnefon, Shariff, and Rahwan 2016). Concurrently, there has been an expanding body of work in the broad AI tradition that investigates designing and verifying, not individual agents, but sociotechnical systems . . .
The Effects of Fact-Checking Threat: Results from a Field Experiment in the States
New America Foundation research paper by Brendan Nyhan and Jason Reifler,* October 2013.

Executive summary: Politicians in the United States are coming under increasing scrutiny from fact-checkers like PolitiFact, Factcheck.org, and the Washington Post Fact Checker, who examine the accuracy of public statements that are often reported without challenge by traditional news organizations. However, we know little about the effects of this practice, especially on public officials. One possibility is that fact-checking might help to deter the dissemination of misinformation, especially for candidates and legislators at lower levels of government who receive relatively little scrutiny and are sensitive to potential threats to re-election. To test this hypothesis, we conducted a field experiment during the 2012 campaign evaluating the effects of reminding state legislators about the electoral and reputational threat posed by fact-checking. Our experimental sample consisted of nearly 1,200 legislators in nine states with state PolitiFact affiliates. We found that legislators who were sent reminders that they are vulnerable to fact-checking were less likely to receive a negative PolitiFact rating or have the accuracy of their statements questioned publicly than legislators who were not sent reminders. These results suggest that the electoral and reputational threat posed by fact-checking can affect the behavior of elected officials. In this way, fact-checking could play an important role in improving political discourse and strengthening democratic accountability.

* Brendan Nyhan ([email protected]) is an Assistant Professor in the Department of Government at Dartmouth College.
Inside Facebook's (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political-Media Machine: How a Strange New Class of Media Outlet Has Arisen to Take Over Our News Feeds
By John Herrman, The New York Times Magazine, Aug. 24, 2016 (http://nyti.ms/2bzePnO).

Open your Facebook feed. What do you see? A photo of a close friend's child. An automatically generated slide show commemorating six years of friendship between two acquaintances. An eerily on-target ad for something you've been meaning to buy. A funny video. A sad video. A recently live video. Lots of video; more video than you remember from before. A somewhat less-on-target ad. Someone you saw yesterday feeling blessed. Someone you haven't seen in 10 years feeling worried.

And then: A family member who loves politics asking, "Is this really who we want to be president?" A co-worker, whom you've never heard talk about politics, asking the same about a different candidate. A story about Donald Trump that "just can't be true" in a figurative sense. A story about Donald Trump that "just can't be true" in a literal sense. A video of Bernie Sanders speaking, overlaid with text, shared from a source you've never seen before, viewed 15 million times. An article questioning Hillary Clinton's honesty; a headline questioning Donald Trump's sanity. A few shares that go a bit too far: headlines you would never pass along yourself but that you might tap, read and probably not forget.
How White Supremacy Returned to Mainstream Politics
By Simon Clark, Center for American Progress (www.americanprogress.org), July 2020. Cover image: Getty Images/Samuel Corum.

Contents: Introduction and summary; Tracing the origins of white supremacist ideas; How did this start, and how can it end?; Conclusion; About the author and acknowledgments; Endnotes.

Introduction and summary: The United States is living through a moment of profound and positive change in attitudes toward race, with a large majority of citizens coming to grips with the deeply embedded historical legacy of racist structures and ideas. The recent protests and public reaction to George Floyd's murder are a testament to many individuals' deep commitment to renewing the founding ideals of the republic. But there is another, more dangerous, side to this debate—one that seeks to rehabilitate toxic political notions of racial superiority, stokes fear of immigrants and minorities to inflame grievances for political ends, and attempts to build a notion of an embattled white majority which has to defend its power by any means necessary. These notions, once the preserve of fringe white nationalist groups, have increasingly infiltrated the mainstream of American political and cultural discussion, with poisonous results.

For a starting point, one must look no further than President Donald Trump's senior adviser for policy and chief speechwriter, Stephen Miller. In December 2019, the Southern Poverty Law Center's Hatewatch published a cache of more than 900 emails Miller wrote to his contacts at Breitbart News before the 2016 presidential election.
Automated Tackling of Disinformation
Study by the Panel for the Future of Science and Technology (STOA) and the European Science-Media Hub, EPRS | European Parliamentary Research Service, Scientific Foresight Unit, PE 624.278, March 2019. Subtitle: Major challenges ahead.

This study maps and analyses current and future threats from online misinformation, alongside currently adopted socio-technical and legal approaches. The challenges of evaluating their effectiveness and practical adoption are also discussed. Drawing on and complementing existing literature, the study summarises and analyses the findings of relevant journalistic and scientific studies and policy reports in relation to detecting, containing and countering online disinformation and propaganda campaigns. It traces recent developments and trends and identifies significant new or emerging challenges. It also addresses potential policy implications for the EU of current socio-technical solutions.

Authors: This study was written by Alexandre Alaphilippe, Alexis Gizikis and Clara Hanot of EU DisinfoLab, and Kalina Bontcheva of The University of Sheffield, at the request of the Panel for the Future of Science and Technology (STOA). It has been financed under the European Science and Media Hub budget and managed by the Scientific Foresight Unit within the Directorate-General for Parliamentary Research Services (EPRS) of the Secretariat of the European Parliament. The authors wish to thank all respondents to the online survey, as well as First Draft, WeVerify, InVID, PHEME, REVEAL, and all other initiatives that contributed materials to the study.
Climate Change in the Era of Post-Truth
Introduction: In The Madhouse Effect: How Climate Change Denial is Threatening our Planet, Destroying our Politics, and Driving us Crazy, climate scientist Michael Mann joins with Pulitzer Prize-winning cartoonist Tom Toles to take on climate change denialism. Mann, the Director of the Earth System Science Center at The Pennsylvania State University, augments his prose with cartoons from Toles, who normally draws for the editorial section of the Washington Post. Together, Mann and Toles set out to debunk the main arguments that special interest groups use to undermine climate change policy. The book begins with an introduction to the scientific method and its application to climate change science. It then describes the current and potential effects of climate change on everyday life. In its second half, the book transitions to the politics surrounding climate change in the United States. A major focus of the book is the "war on climate science," the phrase Mann and Toles use to describe how the fossil fuel industry has created misinformation to discourage action on climate change.

The Madhouse Effect was published in 2016, at a moment when the United States was choosing between Democratic and Republican presidential candidates whose climate change agendas differed wildly. The book's publication failed to avert the election of President Donald Trump, a climate change denier who has referred to the phenomenon as a "hoax" created by China. Still, The Madhouse Effect presents a valuable depiction of the underground currents that influence . . .

DOI: https://doi.org/10.15779/Z38W669857. Copyright © 2018 Regents of the University of California.
In American Politics Today, Even Fact-Checkers Are Viewed Through Partisan Lenses, by David C. Barker
By David C. Barker, American University.

What can be done about the "post-truth" era in American politics? Many hope that beefed-up fact-checking by reputable nonpartisan organizations like Fact Checker, PolitiFact, FactCheck.org, and Snopes can equip citizens with the tools needed to identify lies, thus reducing the incentive politicians have to propagate untruths. Unfortunately, as revealed by research I have done with colleagues, fact-checking organizations have not provided a cure-all to misinformation. Fact-checkers cannot prevent the politicization of facts, because they themselves are evaluated through a partisan lens.

Partisan Biases and Trust of Fact Checkers in the 2016 Election: In May of 2016, just a couple of weeks before the California Democratic primary, my colleagues and I conducted an experiment that exposed a representative sample of Californians to the statement: "Nonpartisan fact-checking organizations like PolitiFact rate controversial candidate statements for truthfulness." We also exposed one randomized half of the sample to the following statement and image: "Each presidential candidate's current PolitiFact average truthfulness score is placed on the scale below." The PolitiFact image validated the mainstream narrative about Donald Trump's disdain for facts, and also reinforced Bernie Sanders's tell-it-like-it-is reputation. But it contradicted conventional wisdom by revealing that Clinton had been the most accurate candidate overall (though the difference between the Clinton and Sanders ratings was not statistically significant).

What did we expect to find? Some observers presume that Republicans are the most impervious to professional fact-checking.
Starr Forum: Russia's Information War on America
MIT Center for International Studies (Starr Forum transcript).

CAROL SAIVETZ: Welcome everyone. We're delighted that so many people could join us today. Very excited that we have such a timely topic to discuss, and we have two experts in the field to discuss it. But before I do that, I'm supposed to tell you that this is an event that is co-sponsored by the Center for International Studies at MIT, the Security Studies program at MIT, and MIT Russia.

I should also introduce myself. My name is Carol Saivetz. I'm a senior advisor at the Security Studies program at MIT, and I co-chair a seminar, along with my colleague Elizabeth Wood, whom we will meet after the talk. And we co-chair a seminar series called Focus on Russia. And this is part of that seminar series as well.

I couldn't think of a better topic to talk about in the lead-up to the US presidential election, which is now only 40 days away. We've heard so much in 2016 about Russian attempts to influence the election then, and we're hearing again from the CIA and from the intelligence community that Russia is, again, trying to influence who shows up, where people vote. They are mimicking some of Donald Trump's talking points about Joe Biden's strength and intellectual capabilities, et cetera.

And we've really brought together two experts in the field. Nina Jankowicz studies the intersection of democracy and technology in central and eastern Europe.
Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election
Citation: Faris, Robert M., Hal Roberts, Bruce Etling, Nikki Bourassa, Ethan Zuckerman, and Yochai Benkler. 2017. Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election. Berkman Klein Center for Internet & Society Research Paper, August 2017. Citable link: http://nrs.harvard.edu/urn-3:HUL.InstRepos:33759251

Acknowledgments: This paper is the result of months of effort and has only come to be as a result of the generous input of many people from the Berkman Klein Center and beyond. Jonas Kaiser and Paola Villarreal expanded our thinking around methods and interpretation. Brendan Roach provided excellent research assistance. Rebekah Heacock Jones helped get this research off the ground, and Justin Clark helped bring it home. We are grateful to Gretchen Weber, David Talbot, and Daniel Dennis Jones for their assistance in the production and publication of this study. This paper has also benefited from contributions of many outside the Berkman Klein community. The entire Media Cloud team at the Center for Civic Media at MIT's Media Lab has been essential to this research.
Fact or Fiction? Fake News, Alternative Facts, and Other False Information
By Jeff Rand, La Crosse Public Library. Presentation slides ("The Ins and Outs of Media Literacy," Part 1):

- Goals: to give you the knowledge and tools to be a better evaluator of information, and to make you an agent in the fight against falsehood.
- Ground rules: our focus is knowledge and tools, not individuals; you may see words and images that disturb you or do not agree with your point of view; no political arguments. Agree.
- Historical context: "No one in this world . . . has ever lost money by underestimating the intelligence of the great masses of plain people." (H. L. Mencken, September 19, 1926)
- What is happening now and why: the shift from "old" media (business/professional: newspapers, magazines, television, radio) to "new" media (individual/social: Facebook, Twitter, websites/blogs); news platforms.
- Who is your news source? Professional (educated, trained, experienced, supervised, with a code of ethics: https://www.spj.org/ethicscode.asp) or personal?
- Social media and news; Facebook and fake news; Veles, Macedonia.
- Filtering, based on your location, previous searches, previous clicks, previous purchases, and overall popularity, creates filter bubbles and an echo chamber effect.
- Repetition theory: "Coke is the real thing." (repeated five times)
- Our tendencies: filter bubbles (not going outside of your own beliefs); echo chambers (repeating whatever you agree with to the exclusion of everything else); information avoidance (just picking what you agree with and ignoring everything else); satisficing (stopping when the first result agrees with your thinking and not researching further); instant gratification (clicking "Like" and "Share" without thinking) (Dr. ...)
Sociotechnical Agendas: Reviewing Future Directions for Energy and Climate Research
Energy Research & Social Science 70 (2020) 101617. Review article.

Authors: Benjamin K. Sovacool (University of Sussex, United Kingdom, and Aarhus University, Denmark), David J. Hess (Vanderbilt University, United States), Sulfikar Amir (Nanyang Technological University, Singapore), Frank W. Geels (The University of Manchester, United Kingdom), Richard Hirsh (Virginia Polytechnic Institute and State University, United States), Leandro Rodriguez Medina (Universidad de las Americas Puebla, Mexico), Clark Miller (Arizona State University, United States), Carla Alvial Palavicino (Universiteit Utrecht, Netherlands), Roopali Phadke (Macalester College, United States), Marianne Ryghaug (Norwegian University of Science and Technology, Norway), Johan Schot (Universiteit Utrecht, Netherlands), Antti Silvast (Norwegian University of Science and Technology, Norway), Jennie Stephens (Northeastern University, United States), Andy Stirling (University of Sussex, United Kingdom), Bruno Turnheim (Laboratoire Interdisciplinaire Sciences Innovations Sociétés, France), Erik van der Vleuten (Eindhoven University of Technology, Netherlands), Harro van Lente (Universiteit Maastricht, Netherlands), Steven Yearley (The University of Edinburgh, United Kingdom).

Keywords: Science and technology studies; Sociotechnical systems; Science technology and society; Sociology of scientific knowledge.

Abstract: The field of science and technology studies (STS) has introduced and developed a "sociotechnical" perspective that has been taken up by many disciplines and areas of inquiry. The aims and objectives of this study are threefold: to interrogate which sociotechnical concepts or tools from STS are useful at better understanding energy-related social science, to reflect on prominent themes and topics within those approaches, and to identify current research gaps and directions for the future.
Disinformation, 'Fake News' and Influence Campaigns on Twitter
October 2018. Matthew Hindman (George Washington University) and Vlad Barash (Graphika).

Contents: Executive Summary; Introduction; A Problem Both Old and New; Defining Fake News Outlets; Bots, Trolls and 'Cyborgs' on Twitter; Map Methodology; Election Data and Maps (Election Core Map, Election Periphery Map, Postelection Map); Fake Accounts From Russia's Most Prominent Troll Farm; Disinformation Campaigns on Twitter: Chronotopes (#NoDAPL, #WikiLeaks, #SpiritCooking, #SyriaHoax, #SethRich); Conclusion; Bibliography; Notes.

Executive Summary: This study is one of the largest analyses to date on how fake news spread on Twitter both during and after the 2016 election campaign. Using tools and mapping methods from Graphika, a social media intelligence firm, we study more than 10 million tweets from 700,000 Twitter accounts that linked to more than 600 fake and conspiracy news outlets. Crucially, we study fake and conspiracy news both before and after the election, allowing us to measure how the fake news ecosystem has evolved since November 2016.

Much fake news and disinformation is still being spread on Twitter. Consistent with other research, we find more than 6.6 million tweets linking to fake and conspiracy news publishers in the month before the 2016 election. Yet disinformation continues to be a substantial problem postelection, with 4.0 million tweets linking to fake and conspiracy news publishers found in a 30-day period from mid-March to mid-April 2017. Contrary to claims that fake news is a game of "whack-a-mole," more than 80 percent of the disinformation accounts in our election maps are still active as this report goes to press.
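The executive summary above describes the study's basic measurement: tallying tweets whose embedded links point to a fixed list of fake and conspiracy news outlets, separately for a pre-election and a post-election window. Below is a minimal sketch of that kind of domain-matching tally; it is illustrative only, not the authors' actual pipeline or Graphika's tooling, and the domain list, the tweets.csv file, its column names, and the date windows are hypothetical.

```python
from datetime import date
from urllib.parse import urlparse
import csv

# Hypothetical list of flagged outlet domains (illustrative placeholders only).
FLAGGED_DOMAINS = {"example-fake-news.com", "conspiracy-daily.net"}

def outlet_domain(url: str) -> str:
    """Reduce a link to its host name, naively stripping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def count_flagged_tweets(rows, start: date, end: date) -> int:
    """Count tweets dated within [start, end] whose link resolves to a flagged outlet."""
    total = 0
    for row in rows:
        tweeted_on = date.fromisoformat(row["date"])  # hypothetical 'date' column, ISO format
        if start <= tweeted_on <= end and outlet_domain(row["link"]) in FLAGGED_DOMAINS:
            total += 1
    return total

if __name__ == "__main__":
    # Hypothetical export with one row per tweet and 'date' and 'link' columns.
    with open("tweets.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    pre = count_flagged_tweets(rows, date(2016, 10, 8), date(2016, 11, 7))
    post = count_flagged_tweets(rows, date(2017, 3, 15), date(2017, 4, 13))
    print(f"pre-election window: {pre} flagged tweets; post-election window: {post}")
```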