Fake News or Real News: What's the Difference and How to Know (OLLI, Summer 2017)
PDF, 16 pages, 1020 KB
Recommended publications
Fake News and Propaganda: A Critical Discourse Research Perspective
Iulian Vamanu, "Fake News and Propaganda: A Critical Discourse Research Perspective." Research article, Open Information Science 2019; 3: 197–208. https://doi.org/10.1515/opis-2019-0014. Received September 25, 2018; accepted May 9, 2019.

Abstract: Having been invoked as a disturbing factor in recent elections across the globe, fake news has become a frequent object of inquiry for scholars and practitioners in various fields of study and practice. My article draws intellectual resources from Library and Information Science, Communication Studies, Argumentation Theory, and Discourse Research to examine propagandistic dimensions of fake news and to suggest possible ways in which scientific research can inform practices of epistemic self-defense. Specifically, the article focuses on a cluster of fake news of potentially propagandistic import, employs a framework developed within Argumentation Theory to explore ten ways in which fake news may be used as propaganda, and suggests how Critical Discourse Research, an emerging cluster of theoretical and methodological approaches to discourse, may provide people with useful tools for identifying and debunking fake news stories. My study has potential implications for further research and for literacy practices. In particular, it encourages empirical studies of its guiding premise that people who become familiar with certain research methods are less susceptible to fake news. It also contributes to the design of effective research literacy practices.

Keywords: post-truth, literacy, scientific research, discourse studies, persuasion

"Don't be so overly dramatic about it, Chuck. You're saying it's a falsehood [...] Sean Spicer, our press secretary, gave alternative facts to that." (Kellyanne Conway, Counselor to the U.S. President)
Automated Tackling of Disinformation
Automated tackling of disinformation: Major challenges ahead. Study, Panel for the Future of Science and Technology (STOA), European Science-Media Hub, EPRS | European Parliamentary Research Service, Scientific Foresight Unit. PE 624.278, March 2019.

This study maps and analyses current and future threats from online misinformation, alongside currently adopted socio-technical and legal approaches. The challenges of evaluating their effectiveness and practical adoption are also discussed. Drawing on and complementing existing literature, the study summarises and analyses the findings of relevant journalistic and scientific studies and policy reports in relation to detecting, containing and countering online disinformation and propaganda campaigns. It traces recent developments and trends and identifies significant new or emerging challenges. It also addresses potential policy implications for the EU of current socio-technical solutions.

The study was written by Alexandre Alaphilippe, Alexis Gizikis and Clara Hanot of EU DisinfoLab, and Kalina Bontcheva of The University of Sheffield, at the request of STOA. It was financed under the European Science and Media Hub budget and managed by the Scientific Foresight Unit within the Directorate-General for Parliamentary Research Services (EPRS) of the Secretariat of the European Parliament. The authors thank all respondents to the online survey, as well as First Draft, WeVerify, InVID, PHEME, REVEAL, and all other initiatives that contributed materials to the study. Manuscript completed in March 2019.
Starr Forum: Russia's Information War on America
MIT Center for International Studies | Starr Forum: Russia's Information War on America

CAROL SAIVETZ: Welcome, everyone. We're delighted that so many people could join us today, and very excited that we have such a timely topic to discuss, with two experts in the field to discuss it. But before I do that, I'm supposed to tell you that this is an event co-sponsored by the Center for International Studies at MIT, the Security Studies Program at MIT, and MIT Russia. I should also introduce myself. My name is Carol Saivetz. I'm a senior advisor at the Security Studies Program at MIT, and along with my colleague Elizabeth Wood, whom we will meet after the talk, I co-chair a seminar series called Focus on Russia. This event is part of that series as well.

I couldn't think of a better topic to talk about in the lead-up to the US presidential election, which is now only 40 days away. We heard so much in 2016 about Russian attempts to influence the election then, and we're hearing again from the CIA and from the intelligence community that Russia is, again, trying to influence who shows up and where people vote. They are mimicking some of Donald Trump's talking points about Joe Biden's strength and intellectual capabilities, et cetera. And we've really brought together two experts in the field. Nina Jankowicz studies the intersection of democracy and technology in central and eastern Europe.
Political Rhetoric and Minority Health: Introducing the Rhetoric-Policy-Health Paradigm
Saint Louis University Journal of Health Law & Policy, Volume 12, Issue 1: Public Health Law in the Era of Alternative Facts, Isolationism, and the One Percent, Article 7 (2018). Kimberly Cogdell Grainger, North Carolina Central University.

Recommended citation: Kimberly C. Grainger, Political Rhetoric and Minority Health: Introducing the Rhetoric-Policy-Health Paradigm, 12 St. Louis U. J. Health L. & Pol'y (2018). Available at: https://scholarship.law.slu.edu/jhlp/vol12/iss1/7

Abstract: Rhetoric is a persuasive device that has been studied for centuries by philosophers, thinkers, and teachers. In the political sphere of the Trump era, the bombastic, social-media-driven dissemination of rhetoric creates the perfect space to increase its effect. Today, there are clear examples of how rhetoric influences policy. This Article explores the link between divisive political rhetoric and policies that negatively affect minority health in the U.S. The rhetoric-policy-health (RPH) paradigm illustrates the connection between rhetoric and health. Existing public health policy research related to Health in All Policies and the social determinants of health, combined with rhetorical persuasive tools, creates the foundation for the paradigm.
Fake News, Real Hip: Rhetorical Dimensions of Ironic Communication in Mass Media
Fake News, Real Hip: Rhetorical Dimensions of Ironic Communication in Mass Media. By Paige Broussard. A thesis submitted to the faculty of the University of Tennessee at Chattanooga in partial fulfillment of the requirements of the degree of Master of Arts in English. Thesis director: Matthew Guy, Associate Professor; committee: Heather Palmer, Associate Professor, and Rebecca Jones, UC Foundation Associate Professor. Chattanooga, Tennessee, December 2013.

Abstract: This paper explores the growing genre of fake news, a blend of information, entertainment, and satire, in mainstream mass media, specifically examining the work of Stephen Colbert. First, it examines classic definitions of satire and contemporary definitions and usages of irony in an effort to understand how they function in the fake news genre. Using a theory of postmodern knowledge, it aims to illustrate how satiric news functions epistemologically, using both logical and narrative paradigms. Specific artifacts from Colbert's speeches are examined in an effort to understand how rhetorical strategies function during his performances.
Hacks, Leaks and Disruptions | Russian Cyber Strategies
Chaillot Paper Nº 148, October 2018. Hacks, Leaks and Disruptions: Russian Cyber Strategies. Edited by Nicu Popescu and Stanislav Secrieru, with contributions from Siim Alatalu, Irina Borogan, Elena Chernenko, Sven Herpig, Oscar Jonsson, Xymena Kurowska, Jarno Limnell, Patryk Pawlak, Piret Pernik, Thomas Reinhold, Anatoly Reshetnikov, Andrei Soldatov and Jean-Baptiste Jeangène Vilmer. European Union Institute for Security Studies, Paris. Director: Gustav Lindstrom. © EU Institute for Security Studies, 2018.

Contents:
Executive summary
Introduction: Russia's cyber prowess – where, how and what for? (Nicu Popescu and Stanislav Secrieru)
Russia's cyber posture
1. Russia's approach to cyber: the best defence is a good offence (Andrei Soldatov and Irina Borogan)
2. Russia's trolling complex at home and abroad (Xymena Kurowska and Anatoly Reshetnikov)
3. Spotting the bear: credible attribution and Russian operations in cyberspace (Sven Herpig and Thomas Reinhold)
4. Russia's cyber diplomacy (Elena Chernenko)
Case studies of Russian cyberattacks
5. The early days of cyberattacks: the cases of Estonia,
Disinformation Ink Spots a Framework to Combat Authoritarian Disinformation
Brief No. 12.6, April 2020. Disinformation Ink Spots: A Framework to Combat Authoritarian Disinformation Campaigns. Lincoln Zaleski. The Project on International Peace and Security (PIPS), Global Research Institute, College of William & Mary.

Modern disinformation campaigns, enabled by emerging technologies, allow authoritarian regimes to exploit inherent democratic vulnerabilities. This white paper provides a conceptual framework for understanding authoritarian disinformation campaigns, building on the ink-spot approach to countering insurgencies. Using an array of precision-targeting and data-collecting technologies, authoritarian regimes identify key individuals and groups in the United States to reinforce, shape, and connect. The regimes seek to create a domestic network of influential "ink spots." Hostile or antagonistic governments then use these sympathetic spots to undermine U.S. policy and democracy through constant reinforcement and manipulation of identity and beliefs. The Ink-Spot Disinformation framework strengthens the United States government's understanding of the nature of authoritarian disinformation campaigns and provides a new conceptual foundation for U.S. disinformation defense and deterrence.

Introduction: Authoritarian regimes, such as Russia, use information warfare to target inherent vulnerabilities in liberal democratic institutions, societies,
Breaking the Spin Cycle: Teaching Complexity in the Age of Fake News
Lane Glisson, "Breaking the Spin Cycle: Teaching Complexity in the Age of Fake News," portal 19.3, p. 461.

Abstract: This article describes a discussion-based approach for teaching college students to identify the characteristics of ethical journalism and scholarly writing, by comparing fake news with credible information in a strategically planned slideshow. Much has been written on the need to instruct our students about disinformation. This librarian shares a lesson plan that engages students' critical thinking skills by using a blend of humor, analysis, and a compelling visual presentation. The teaching method is contextualized by research on the distrust of the press and scientific evidence since the rise of hyper-partisan cable news, Russian troll farms, and alternative facts.

Introduction

Throughout our culture, the old notions of "truth" and "knowledge" are in danger of being replaced by the new ones of "opinion," "perception" and "credibility." (Michio Kakutani)

What if truth is not an absolute or a relative, but a skill—a muscle, like memory, that collectively we have neglected so much that we have grown measurably weaker at using it? How might we rebuild it, going from chronic to bionic? (Kevin Young)

In 2015, I knew I had a problem. After several years of teaching library instruction classes, I noticed that my conception of factuality and that of my students had diverged. Most students preferred Google and YouTube to do their research. When asked in my classes how they discerned the credibility of a website, most shrugged.
The Psychology of Fake News
10. Your Fake News, Our Facts: Identity-based motivation shapes what we believe, share, and accept. Daphna Oyserman and Andrew Dawson.

Introduction

On June 23, 2016, British voters went to the polls, or rather, seven in ten British voters went to the polls; the others refrained (The Guardian, 2016). The less-than-full turnout was surprising because what was at stake was whether or not Britain (England, Northern Ireland, Scotland, and Wales) would remain part of the European Union (EU), as it had been since 1973. The EU was built on the assumption that members were safer, stronger, and freer together – their countries less likely to face war, their economies more prosperous, their citizens more able to choose their own path. A British generation had grown up with London as an EU financial center (Brush & Weber, 2019), with EU research funds flowing into British universities (UK Research and Innovation, 2019), and with British products flowing seamlessly through the EU, Britain's largest trading partner, which dwarfs trade with its next three largest trading partners combined (McCrae, 2018). This generation had grown up assuming that they could flow too – be educated, get jobs, raise families anywhere in the EU. As noted by the Stay campaign website (www.strongerin.co.uk/), voting to leave would undermine all of that.1 It would leave Britain alone in a connected world and, by creating borders with Ireland, an EU member, would undermine a central element of the 1998 Good Friday peace accord with Northern Ireland that ended a long and bloody history of strife.
Detecting Digital Fingerprints: Tracing Chinese Disinformation in Taiwan
Detecting Digital Fingerprints: Tracing Chinese Disinformation in Taiwan. A joint report from: Nick Monaco, Institute for the Future's Digital Intelligence Lab; Melanie Smith, Graphika; Amy Studdart, The International Republican Institute. August 2020.

Acknowledgments: The authors and organizations who produced this report are deeply grateful to our partners in Taiwan, who generously provided time and insights to help this project come to fruition. This report was only possible due to the incredible dedication of the civil society and academic community in Taiwan, which should inspire any democracy looking to protect itself from malign actors. Members of this community include but are not limited to: all interview subjects, g0v.tw Projects, 0archive, Cofacts, DoubleThink Lab, Taiwan FactCheck Center, The Reporter, Taiwan Foundation for Democracy, Global Taiwan Institute, National Chengchi University Election Study Center, and Prospect Foundation. For their assistance in several aspects of this report, the authors also thank Gary Schmitt, Marina Gorbis, Nate Teblunthuis, Sylvie Liaw, Sam Woolley, Katie Joseff, Camille François, Daniel Twining, Johanna Kao, David Shullman, Adam King, Chris Olsen, and Hsieh Yauling.

Graphika is the network analysis firm that empowers Fortune 500 companies, Silicon Valley, human rights organizations, and universities to navigate the cybersocial terrain. Institute for the Future's (IFTF) Digital Intelligence Lab (DigIntel) is a social scientific research entity conducting work on the most pressing issues at the intersection of technology and society. The International Republican Institute (IRI) is one of the world's leading international democracy development organizations.
Fighting "Fake News" in an Age of Digital Disorientation
Rob Williams. 5. Fighting "Fake News" in an Age of Digital Disorientation: Towards "Real News," Critical Media Literacy Education, and Independent Journalism for 21st Century Citizens.

"Journalism's job is not impartial 'balanced' reporting. Journalism's job is to tell the people what is really going on." – George Seldes

Introduction

"This is what makes covering Donald Trump so difficult," explained baffled CNN reporter John Corker to a national viewing audience in February 2017, shortly after Inauguration Day. "What does he mean when he says words?" (Badash, 2017). This bewildering statement reflects our increasingly disorienting digital landscape of 21st-century U.S. news and information, in which the meanings of words, images, and news stories seem to have become completely unmoored from reality. Trump is just the tip of the iceberg. Decades ago, journalist and 1984 author George Orwell famously warned readers to be wary of "doublethink" and "Newspeak" (from which we derive the modern term "doublespeak"), in which governments deploy phrases designed to disguise, distort or even reverse reality—think "war is peace," or "ignorance is strength." Post-2016 election, the term "fake news" is the latest phrase to capture what is an age-old phenomenon—namely, how powerful state and corporate actors work together to deploy news and information designed to distract and disorient the rest of us. It is no exaggeration to say that we now live in what I call an "age of digital disorientation," in which the very meaning of "reality" itself seems up for grabs in a "post-truth" digital media culture controlled by powerful corporate and state actors, and defined by speed, immediacy, and information oversaturation.
Deplatforming Misogyny
Copyright © 2021 Women's Legal Education and Action Fund (LEAF). Published by LEAF, 180 Dundas Street West, Suite 1420, Toronto, Ontario, Canada M5G 1C7. www.leaf.ca

LEAF is a national, charitable, non-profit organization, founded in 1985. LEAF works to advance the substantive equality rights of women and girls in Canada through litigation, law reform, and public education using the Canadian Charter of Rights and Freedoms. This publication was created as part of LEAF's Technology-Facilitated Violence (TFV) Project. The TFV Project brings together feminist lawyers and academics to conduct research and produce publications imagining legal responses to TFV against women and gender-diverse people that are informed by equality principles. The project also supports and informs LEAF's law reform efforts and potential upcoming interventions concerning TFV.

Acknowledgements: Deep gratitude and appreciation go to the many people whose efforts and support made this publication possible. This report was researched and written by Cynthia Khoo, a technology and human rights lawyer and researcher. Cynthia holds an LL.M. (Concentration in Law and Technology) from the University of Ottawa, where she worked on cases as junior counsel at the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC). Her paper on platform liability for emergent systemic harm to historically marginalized groups received the inaugural Ian R. Kerr Robotnik Memorial Award for the Best Paper by an Emerging Scholar at We Robot 2020. She has managed a sole practice law firm, Tekhnos Law, and obtained her J.D. from the University of Victoria.