
Understanding and Addressing the Disinformation Ecosystem

December 15-16, 2017
Annenberg School for Communication

This workshop brings together academics, journalists, fact-checkers, technologists, and funders to better understand the challenges produced by the current disinformation ecosystem. The facilitated discussions will highlight relevant research, share best practices, identify key questions of scholarly and practical concern regarding the nature and implications of the disinformation ecosystem, and outline a potential research agenda designed to answer these questions.

This conference has been made possible through generous funding from the John S. and James L. Knight Foundation.

Table of Contents

Fake News in Context. Michael Schudson (Columbia University) and Barbie Zelizer (University of Pennsylvania)
Information Disorder: Definitions. Hossein Derakhshan (MIT Media Lab and Harvard University Shorenstein Center) and Claire Wardle (First Draft)
People Don't Trust News Media – And This Is Key to the Global Misinformation Debate. Richard Fletcher and Rasmus Kleis Nielsen (Reuters Institute for the Study of Journalism, University of Oxford)
Taking the Red Pill: Ideological Motivations for Spreading Online Disinformation. Rebecca Lewis (Data & Society Research Institute) and Alice Marwick (University of North Carolina)
The Minority Report on the Fake News Crisis: (Spoiler Alert: It's the Real News). Duncan Watts and David Rothschild (Microsoft Research)
Personalized Information Environments and Their Potential Consequences for Disinformation. Deen Freelon (University of North Carolina)
Making Sense of Information and Judging its Credibility. Natalie Jomini Stroud (University of Texas), Emily Thorson (Syracuse University), and Dannagal Young (University of Delaware)
Misinformation and Identity-Protective Cognition. Dan Kahan (Yale University)
Using Behavioral Theory to Curb Misinformation Sharing. Brian Southwell and Vanessa Boudewyns (RTI International)
Assessing Current Efforts by the Platforms and their Effectiveness. Claire Wardle (First Draft)
Information Operations and Democracy: What Role for Global Civil Society and Independent Media? Sarah Oh (University of California, Berkeley)
Why We Feel Woefully Under-Informed About How We Can Do Our Jobs Better. Amy Sippett (Full Fact)
Field Notes: Observation from a School Library. Michael Barker (Phillips Academy)
How Academics Can Help Platforms Tackle Disinformation. Nic Dias (First Draft)
Face2Face: Real-time Face Capture and Reenactment of RGB Videos. Justus Thies (Technical University of Munich and University of Erlangen-Nuremberg), Michael Zollhöfer (Max-Planck Institute for Informatics), Marc Stamminger (University of Erlangen-Nuremberg), Christian Theobalt (Max-Planck Institute for Informatics), and Matthias Nießner (Technical University of Munich)

Fake News in Context

Michael Schudson, Columbia University
Barbie Zelizer, University of Pennsylvania

“Fake news” landed in the US lexicon in 2016 as if it had arrived from Mars in a flying saucer, without precedent, without lineage. Fears about its influence were widespread, anxieties about its origins, both foreign and domestic, were large, and alarm bells were clamorous about its threat to democracy. But now, with the initial panic at least partially in remission, we may be able to put it in better perspective.

In many ways, the noise created in response to fake news was predictable. From anti-Semitic blood libel stories in 15th century Europe to church-supported missives of divine retribution following the 1755 Lisbon Earthquake, fake news has been around a long while. Dating back hundreds of years to the invention of print and offering primarily inflammatory and sensational accounts of the events of the day, there has been a long history of journalistic hoaxes, scandals, hype, satire and exaggerations. In the US, Thomas Jefferson complained in 1807 to a friend that “the man who never looks into a newspaper is better informed than he who reads them; inasmuch as he who knows nothing is nearer to truth than he whose mind is filled with falsehoods & errors” (Jefferson 1807). Fake news produced antebellum stories of African-Americans turning white or tales of alien populations on the moon, and it prevailed, according to Robert Darnton, in the pasquinades of 16th century Italy, the canards of 17th century Paris, and the “paragraph men” and nouvellistes of London and Paris respectively, who turned 18th century coffeehouse gossip into printed or handwritten papers left on public benches for passersby, not unlike the rumor mongers of today's digital environment (Darnton 2017).

Fake news lessened in centrality only in the late 1800s as printed news, particularly in Britain and the US, came to center on what Jean Chalaby called “fact-centered discursive practices” and people realized that newspapers could compete with one another not simply on the basis of partisan affiliation or on the quality of philosophical and political essays but on the immediacy and accuracy of factual reports (Chalaby 1996). Until that point, few clear notions of journalistic ethics, balance or truth-telling were around to inspire trust in the reliability of news. In the 1890s early journalism textbooks could still recommend strict accuracy about the “essential” facts of a story while taking a tolerant attitude toward pure invention when it came to “non-essentials” (Shuman 1894). And fakery in photography and news photography early in the 20th century was widely accepted as just a part of the game (Tucher 2017).

So even then fake news persisted. It remained consonant with the sensationalism of the tabloids: the exaggerations and lies in the run-up to the Spanish-American War helped make yellow journalism profitable. But attempts to rein in fabrication appeared soon thereafter, notably through norms of media truthfulness and formal codes of ethics, beginning with the Kansas State Editorial Association in 1910.

Reactions against Wilson administration propaganda to whip up support for World War I and against the rapid rise of public relations after the war led more news organizations and associations of journalists to adopt ethical codes, taking shape nationally with the formation of the American Society of Newspaper Editors, which in 1923 adopted a code of journalistic ethics (Samuel 2016). In the 1920s journalists adopted the ideal of “objectivity” in news reporting as a flag they would salute (Schudson 1978).

Fake news echoed around the world during World War I, where longstanding practices of propaganda and disinformation fostered the deliberate creation of false news by official or quasi-official agencies intent on destabilizing the public or shaping public opinion. Latent and manifest attempts at disinformation used both covert and conventional channels, making it difficult to know what was true or fake in an ecosystem that accommodated both: British disinformation efforts during World War II, capped by Sefton Delmer's so-called “Black Propaganda” campaign across Nazi territory, coexisted alongside Allied military efforts at publicizing true but hard-to-believe news of the concentration camps (Zelizer 1998). Fake news intensified during the Cold War, where the Russian practice of dezinformatsiya and similar US efforts vied with each other to produce accounts damaging to the other side; portraying JFK's assassination as a CIA plot was among them (Zelizer 1992). Today some argue that notions of propaganda aptly describe the contemporary information efforts of many online actors and activists (Jack 2017).

Fake news, then, has proliferated in various ways. No surprise that the fake news of one situation is not the fake news of another. Variations in detail, degree and intensity make the label a catch-all phrase, an imprecise moniker for the many kinds of discredited accounts of purportedly reliable information.

Different forms of fake news easily spread across the media. Its adaptable shapes—where it can appear as honest error, misreporting, conspiracy theories, clickbait, unsubstantiated gossip, junk science and downright false news—underscore the hospitable nature of current circumstances, where its dissemination is readily facilitated by many features. They include algorithmic formulae that generate news feeds without regard for accuracy; a news environment driven not only by journalists; access to instantaneous and global dissemination; populist political currents that invoke the label of fake news as a weapon; low occupational esteem and failing business models across journalism; journalistic resistance to embracing perspectival forms of information favored by many beyond the news—blogs, podcasts, crowd-sourcing, listicles, gifs and memes; and populations who distrust the media while lacking the media literacy skills to discern or understand signs of journalistic satire, subtle agendas, or distinctions between news and advertisement. While today's fake news builds on longstanding journalistic, technological, political, social, cultural, economic and legal contours that for the most part preceded the contemporary landscape, they nonetheless suit its current spread particularly well.

To act as if today's fake news environment is fundamentally different from that of earlier times misreads how entrenched fake news and broader attitudes toward fakery have been. Today we readily accept fakery in many forms: reality television that is more staged than real, message manipulation and photoshopping in politics and entertainment, practices of appropriation that produce much-discussed art, pirated computer software, musical remixes and designer fashion rip-offs. Fake news thus echoes an environment that is used to positioning falsehood as a seamless feature of public life.


Though the stubborn persistence of fake news across evolving and disparate circumstances is not new, three things strike us as different, at least in the U.S. context. First, there is great anxiety today about the border between professional journalists and others who through digital media have easy access to promoting their ideas, perspectives, factual reports, pranks, inanities, conspiracy theories, fakes and lies. Second, the President of the United States turns out to be either naïve about how to distinguish truth from fantasy or a faux naïf who effectively masquerades as a gullible believer in wild fantasies in order to draw attention away from or otherwise cast doubt on truthful reports that place him in a bad light. Finally, there is widespread recognition in the academy and beyond that getting at the truth is difficult, that the construction of what counts as truthful is indeed contestable and contested, and that debunking falsehood requires critical skills and moral commitment to truth-seeking. The validity of this understanding that truths are socially constructed stops far short of saying “anything goes.”

What to do, then, about fake news? Currently there are admirable efforts to specify what fake news looks like, where and how it surfaces, what influence it has and how its distribution might be curtailed by watchdog journalists. If necessary, government regulation might curtail fake news, just as it does fraudulent advertising. But in the interim we need to keep clarifying how fake news is distinguishable from legitimate “fact-based discursive practices” and, equally, from openly partisan but honest journalism. We also need to remember that fake news has a history. While that does not preclude recognizing its pernicious nature, at the least it should help us to keep it in perspective.

References

Chalaby, J. “Journalism as an Anglo-American Invention,” European Journal of Communication 11 (3), 1996, 303-326.

Darnton, R. “The True History of Fake News,” New York Review of Books, February 13, 2017. Retrieved from http://www.nybooks.com/daily/2017/02/13/the-true-history-of-fake-news/?printpage=true

Jack, C. “What's Propaganda Got To Do With It,” Data and Society: Points, January 5, 2017. Retrieved from https://points.datasociety.net/whats-propaganda-got-to-do-with-it-5b88d78c3282

Jefferson, T. Letter to John Norvell, June 11, 1807. In Adrienne Koch and William Peden, eds. The Life and Selected Writings of Thomas Jefferson (New York: Modern Library, 1944), 581.

Samuel, A. “To Fix Fake News Look to Yellow Journalism,” JSTOR Daily, November 29, 2016. Retrieved from https://daily.jstor.org/to-fix-fake-news-look-to-yellow-journalism/

Schudson, M. Discovering the News: A Social History of American Newspapers (New York: Basic Books, 1978).

Shuman, E. L. Steps Into Journalism (Evanston, Ill: Correspondence School of Journalism, 1894), 66.


Tucher, A. “‘I Believe in Faking': The Dilemma of Photographic Realism at the Dawn of Photojournalism,” Photography and Culture, June 1, 2017. Retrieved from http://dx.doi.org/10.1080/17514517.2017.1322397

Zelizer, B. Covering the Body: The Kennedy Assassination, the Media and the Shaping of Collective Memory. (Chicago: University of Chicago Press, 1992).

Zelizer, B. Remembering to Forget: Holocaust Memory Through the Camera’s Eye. (Chicago: University of Chicago Press, 1998).


Information Disorder: Definitions

Hossein Derakhshan, Massachusetts Institute of Technology Media Lab
Claire Wardle, First Draft

The use of deception and manipulation is as ancient as language itself, but according to Merriam Webster, the term ‘fake news' only emerged around the end of the 19th century. According to the dictionary website:

“One of the reasons that fake news is such a recent addition to our vocabulary is that the word fake is also fairly young. Fake was little used as an adjective prior to the late 18th century. [Before that point] the most common [description] was false news. We can see this collocation in use as far back as the 16th century.”

The term appears new because of its sudden appearance in our public discourse, caused mostly by its use by Donald Trump himself. This chart from Google Trends (a site that maps how frequently people search for a word or phrase on Google) shows that people suddenly started searching for that term between November 6 and 12, 2016. The US Presidential election was November 8.

Figure 1: Google Trends data for the past 2 years, which shows the number of searches for the phrase ‘fake news.’

The academic community has followed this curve. This chart, which visualizes peer-reviewed papers that included the term ‘fake news,' shows that this subject has become much more popular recently.


Figure 2: Number of peer-reviewed publications from the Web of Science that included ‘fake news’ in the title, abstract, or keywords.

In the months that have followed the US election, we've seen phrases such as alternative facts, post-truth, and post-fact enter public discourse, and most troubling, the term ‘fake news' has been used as a weapon to discredit the media, undermining the concept of a free press. In this paper, we argue that while terminology and definitions always matter, on this topic, which is complex and sprawling, and involves information itself being used as a form of warfare, definitions are critical. Only by developing agreed-upon definitions, and breaking down pieces of the ecosystem effectively, can we start to work collectively on solutions. In this paper, we introduce a new conceptual framework for understanding information disorder.

The term ‘fake news'

One depressing aspect of the recent ‘fake news' panic is that, while it has resulted in an astonishing number of reports, books, conferences, and events, it has produced little other than funding opportunities for research and the development of tools. One key reason for this stagnation, we argue, is an absence of definitional rigor, which has resulted in a failure to recognize the diversity of mis- and dis-information, whether of form, motivation, or dissemination.

As researchers like Claire Wardle, Ethan Zuckerman, danah boyd, and Caroline Jack and journalists like the Washington Post's Margaret Sullivan have argued, the term ‘fake news' is woefully inadequate to describe the complex phenomena of mis- and dis-information. As Zuckerman states, “It's a vague and ambiguous term that spans everything from false balance (actual news that doesn't deserve our attention), propaganda (weaponized speech designed to support one party over another), and disinformatzya (information designed to sow doubt and increase mistrust in institutions).”

A study by Tandoc et al. published in August 2017 examined 34 academic articles that used the term “fake news” between 2003 and 2017. The authors noted that the term has been used to describe a number of different phenomena over the past 15 years:

news satire, news parody, fabrication, manipulation, advertising, and propaganda. Indeed, this term has a long history, long predating President Trump's recent obsession with the phrase.

The term “fake news” has also begun to be appropriated by politicians around the world to describe news organizations whose coverage they find disagreeable. In this way, it's becoming a mechanism by which the powerful can clamp down on, restrict, undermine, and circumvent the free press. It's also worth noting that the term and its visual derivatives (e.g., the red ‘FAKE' stamp) have been even more widely appropriated by websites, organizations, and political figures identified as untrustworthy by fact-checkers to undermine opposing reporting and news organizations (Haigh 2017).

Many have offered new definitional frameworks in attempts to better reflect the complexities of mis- and dis-information.

Facebook defined a few helpful terms in their paper on information operations:

1. Information (or Influence) Operations. Actions taken by governments or organized non-state actors to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome. These operations can use a combination of methods, such as false news, disinformation, or networks of fake accounts aimed at manipulating public opinion (false amplifiers).

2. False News. News articles that purport to be factual, but contain intentional misstatements of fact to arouse passions, attract viewership, or deceive.

3. False Amplifiers. Coordinated activity by inauthentic accounts that has the intent of manipulating political discussion (e.g., by discouraging specific parties from participating in discussion or amplifying sensationalistic voices over others).

In “Fake News. It's Complicated,” Wardle outlines seven types of mis- and dis-information, revealing the wide spectrum of problematic content online, from satire and parody (which, while a form of art, can become misinformation when audiences misinterpret the message) to full-blown fabricated content.


Figure 3: 7 Categories of Information Disorder (Credit: Claire Wardle, First Draft)

While these seven classifications are helpful in encouraging people to see beyond the infamous ‘Pope endorses Trump'-type news sites that received so much attention after the US election, the phenomenon requires an even more nuanced conceptual framework—particularly one that highlights the impact of visuals in perpetuating disinformation. We have therefore created such a framework, and we will use it as the organizing structure for the report.

While we work through terms and descriptions, it's important that we recognize the importance of shared definitions. As Caroline Jack argued in the introduction to her recent report for Data & Society, Lexicon of Lies:

“Journalists, commentators, policymakers, and scholars have a variety of words at their disposal — propaganda, disinformation, misinformation, and so on — to describe the accuracy and relevance of media content. These terms can carry a lot of baggage. They have each accrued different cultural associations and historical meanings, and they can take on different shades of meaning in different contexts. These differences may seem small, but they matter. The words we choose to describe media manipulation can lead to assumptions about how information spreads, who spreads it, and who receives it. These assumptions can shape what kinds of interventions or solutions seem desirable, appropriate, or even possible.”

Our conceptual framework has three components, each of which is also broken down into three parts:

1. The Three Types of Information Disorder: Dis-information, Mis-information, and Mal-information

2. The Three Phases of Information Disorder: Creation, Production, and Distribution


3. The Three Elements of Information Disorder: Agent, Message, and Interpreter

The Three Types of Information Disorder

Much of the discourse on ‘fake news' conflates three notions: mis-information, dis-information, and mal-information. But it's important to distinguish messages that are true from those that are false, and messages that are created, produced, or distributed by “agents” who intend to do harm from those that are not:

Dis-information. Information that is false and deliberately created to harm a person, social group, organization or country.

Mis-information. Information that is false, but not created with the intention of causing harm.

Mal-information. Information that is based on reality, used to inflict harm on a person, organization, or country.

Figure 4: Examining how mis-, dis-, and mal-information intersect around the concepts of falseness and harm.


The Phases and Elements of Information Disorder

In trying to understand any example of information disorder, it is useful to consider it in three elements:

1. Agent. Who were the ‘agents' that created, produced, and distributed the example, and what was their motivation?

2. Message. What type of message was it? What format did it take? What were the characteristics?

3. Interpreter. When the message was received by someone, how did they interpret the message? What action, if any, did they take?

Figure 5: The Three Elements of Information Disorder

We argue that it is also productive to consider the life of an example of information disorder as having three phases:

1. Creation. The message is created.

2. Production (and reproduction). The message is turned into a media product.

3. Distribution. The message is distributed or made public.


Figure 6: The Three Phases of Information Disorder

In particular, it's important to consider the different phases of an instance of information disorder alongside its elements, because the agent that creates the content is often fundamentally different from the agent who produces it. For example, the motivations of the mastermind who ‘creates' a state-sponsored dis-information campaign are very different from those of the low-paid ‘trolls' tasked with turning the campaign's themes into specific posts. And once a message has been distributed, it can be reproduced and redistributed endlessly, by many different agents, all with different motivations. For example, a social media post can be distributed by several communities, leading its message to be picked up and reproduced by the mainstream media and further distributed to still other communities.

In conclusion, only by dissecting information disorder in this manner can we begin to undertake the necessary research to fully understand the different facets of this phenomenon. And without methodologically rigorous, empirical research, we are certainly not going to understand the underlying symptoms that have caused the current situation, or be in a position to work collaboratively on effective solutions.

Resources

boyd, d. (March 27, 2017) “Google and Facebook Can't Just Make Fake News Disappear,” Wired, https://www.wired.com/2017/03/google-and-facebook-cant-just-make-fake-news-disappear/

Haigh et al. (2017) “Stopping Fake News: The work practices of peer-to-peer counter propaganda,” Journalism Studies, 1-26.

Jack, C. (2017) “Lexicon of Lies,” Data & Society, https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf

Merriam Webster. “The Real Story of Fake News,” https://www.merriam-webster.com/words-at-play/the-real-story-of-fake-news

Sullivan, M. (Jan 6, 2017) “It's Time To Retire the Tainted Term Fake News,” Washington Post, https://www.washingtonpost.com/lifestyle/style/its-time-to-retire-the-tainted-term-fake-news/2017/01/06/a5a7516c-d375-11e6-945a-76f69a399dd5_story.html



Tandoc, E. C., Jr., Lim, Z. W., and Ling, R. (Aug. 2017) “Defining ‘Fake News': A Typology of Scholarly Definitions,” Digital Journalism, 5 (7): 1-17

Wardle, C. (Feb 16, 2017) “Fake News. It's Complicated,” First Draft, https://firstdraftnews.com/fake-news-complicated/

Zuckerman, E. (Jan 30, 2017) “Stop Saying Fake News, It's Not Helping,” My Heart is in Accra, http://www.ethanzuckerman.com/blog/2017/01/30/stop-saying-fake-news-its-not-helping/

*Some elements of this paper are taken from a report commissioned by the Council of Europe entitled “Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking.”


People Don’t Trust News Media – and This is Key to the Global Misinformation Debate

Richard Fletcher, Reuters Institute for the Study of Journalism, University of Oxford
Rasmus Kleis Nielsen, Reuters Institute for the Study of Journalism, University of Oxford

If we want to understand the global debate around misinformation, we have to accept that it is playing out in the context of low trust in news media, general scepticism of publishers and platforms alike, and in an environment where people do not make a clear-cut distinction between “real news,” fake news, and other forms of misinformation, but rather see the difference as one of degree. These are three key takeaways from our recent research into how people see news, media, and journalism across the world. There are significant differences from country to country, but also clear common patterns. Crudely put, people often have little trust in the news media, they are skeptical of the information they come across on platforms, and they see poor journalism, political propaganda, and misleading forms of advertising and clickbait as contributing to the problem.

Data

To illustrate these three points, we turn to data from the 2017 Reuters Institute Digital News Report (Newman et al. 2017). The findings in the report are based on an online survey of around 70,000 respondents across 36 markets, as well as an additional eight focus group sessions in four of those markets (Finland, Spain, the UK, and the US). Polling was conducted by YouGov, and the focus groups were moderated by Kantar Media, but both the questionnaire and focus group discussion guides were designed by researchers at the Reuters Institute (including the authors). The survey used nationally representative samples of around 2,000 in each country, with respondents selected based on interlocking quotas for age, gender, and location. The focus groups are not representative in a statistical sense, but were organized by age and were varied in terms of other demographic and news use characteristics (see Vir and Hall 2017 for more details). In an attempt to avoid generalizations based on extrapolating from single-country findings, we will focus here on 10 countries that together represent a wide range of different high-income democracies: Germany, France, Spain, Ireland, Denmark, Greece, Australia, […], the UK, and the US.

First, trust in the news media is low

The first of our three points is that the debate about misinformation is often taking place against the backdrop of low levels of trust in the news media. As is clear from Figure 1, less than half in many countries agree that they can “trust most news most of the time.” Trust is particularly low in Greece (23%) and France (30%), but also in the US (38%). At most, this is only partly due to the current debates about misinformation, as we know from a handful of more detailed studies that in many countries—particularly the US (Ladd 2012)—trust in news media has been declining for years, as part of a broader crisis of confidence in many public institutions (Norris 2011). But it is also clear that this same fall in trust is not happening everywhere (Hanitzsch, van Dalen, and Steindl 2017).


We can see from our own survey data that trust in the news is more widespread in Spain (51%), Germany (50%), and Denmark (50%), with the highest figure of all in Finland (62%, but not considered here).

Trust is a complicated issue, and these figures can be interpreted in a number of ways. However, they should not be treated as a proxy measure for how trustworthy the news media actually are in certain countries. People might trust things that appear deeply untrustworthy from an outside or neutral perspective, particularly if they are unaware of potential alternatives, or if they happen to support the views and actions of the untrustworthy actors. Nor should the figures be used to straightforwardly understand news consumption, as we have known for some time that some people regularly consume news they say they do not trust (Tsfati and Cappella 2003). However, this measure does give us an indication of people's attitudes towards the news media, and gives us some basis for thinking about how initiatives aiming to combat misinformation—particularly those that hinge on trust—might play out in particular contexts. Tagging social media posts as ‘verified' by established news organizations might work well in Germany, where many trust the media, but perhaps less so in Greece, where most people—perhaps rightly—do not.

Figure 1. Proportion that ‘strongly agree’ or ‘tend to agree’ they can trust most news most of the time

Second, confidence in social media is even lower

We mention social media because many see it as absolutely central to the debates about misinformation. We therefore also asked respondents a follow-up question about their confidence in the ability of both news media and social media to separate fact from fiction. We use it to make our second point, namely that although confidence in the news media is generally low, confidence in social media is even lower (see Figure 2). This is true of every country apart from Greece, which, as we have already seen, has particularly low trust in the news media, but broadly comparable levels of confidence in social media. In the UK and the US, no more than one in five agree that social media does a good job in helping them distinguish fact from fiction. Of course, unlike the news media, few social networks claim that this is what they set out to do, but it is nonetheless striking that most people do not feel that social media help them much in this regard.

As people increasingly rely on social media to learn about what's happening in the world, we could interpret this state of affairs in quite a pessimistic fashion—as a catalyst for already-declining trust.

But a more optimistic reading reminds us that we should not conflate exposure to misinformation with uncritical acceptance of misinformation. Elsewhere, we have called the combination of low trust in news media and even lower confidence in social media “generalized scepticism,” as people's views of one are strongly correlated with their views of the other (Fletcher and Nielsen 2017).

Figure 2. Proportion that ‘strongly agree’ or ‘tend to agree’ that the news media/social media does a good job in helping them distinguish fact from fiction

Third, many see fake news as different from poor journalism primarily by degree

In this environment of low trust in news media and even lower confidence in social media and much of the information accessed there, how do people think about the phenomenon of fake news? Top-down definitions abound, including useful typologies of misinformation (e.g. Wardle 2017) and calls for the term “fake news” to be abandoned altogether in favour of more precise vocabulary (Sullivan 2017). But our qualitative research demonstrates that the term is already part of the vernacular because it helps people express their frustration with the media environment, because it resonates with their experiences of coming across many different kinds of misinformation, especially online, and because it is actively used by critics of both news media and platform companies.

We have gathered qualitative evidence to provide an audience-centric, bottom-up set of definitions of fake news from a subset of four countries, specifically the US and the UK (countries where less than half the population trust most news most of the time) and Finland and Spain (where just over half the population say they do so). The central point that came out of our focus groups was that most people do not operate with a categorical distinction between “fake news” and “real news,” but see the difference as one of degree, as this exchange from the UK shows:

Moderator: What does “fake news” mean to you?
M: Made up stories.
F: Do you believe everything that you hear and see and read? I'm sure some of it is made up.
M: It's a spectrum isn't it?
M: There's been fake news for years hasn't there?



We cannot present the focus group data in detail here, or unfold a full analysis (see Nielsen and Graves 2017 for this). But Figure 3 provides a visual summary of the main types of content that people identified as fake news during the discussions, including poor journalism, political propaganda, and misleading forms of advertising and clickbait. Satire and outright false news were also mentioned, but often explicitly discounted as “not news” at all.

Figure 3. Types of misinformation identified by our focus group participants and their overlap with news

This suggests how the global misinformation debate is playing out—not only against the current backdrop of low trust in news and generalized skepticism towards both publishers and platforms, but also against what is often years of frustrating experiences with many kinds of news.

Conclusion

In this essay, we have argued that the global debate around misinformation plays out in a context where people do not trust news media, they are skeptical of the information they come across on platforms, and they see poor journalism, political propaganda, and misleading forms of advertising and clickbait as contributing to the problem. There are very clear and important country-to-country differences in how pronounced these trends are.


Trust in news media is significantly higher in Germany than in the US, people have more confidence in social media in Spain than in the UK, and the fake news discussion is seen as much more relevant by focus group participants in English-speaking countries than elsewhere. But there are also clear commonalities. It is striking that even in relatively high-trust contexts, about half the population do not think they can trust most news most of the time. Whether this is indicative of healthy skepticism or a paralyzing form of hardened cynicism is a matter of judgement. Broadly speaking, societies benefit from people trusting trustworthy institutions and distrusting untrustworthy ones (O'Neill 2002). What we can say here is that, empirically, many people do not see publishers and platforms (or politicians for that matter) as particularly trustworthy when it comes to news and factual information. This is key to the global misinformation debate.

References

Fletcher, Richard, and Rasmus Kleis Nielsen. 2017. “Navigating News on Social Media: A Four-Country Mixed-Methods Analysis.” Paper presented at the APSA Annual Meeting.

Hanitzsch, Thomas, Arjen van Dalen, and Nina Steindl. 2017. “Caught in the Nexus: A Comparative and Longitudinal Analysis of Public Trust in the Press.” International Journal of Press/Politics 23 (1): 3–23.

Ladd, Jonathan M. 2012. Why Americans Hate the Media and How It Matters. Princeton: Princeton University Press.

Newman, Nic, Richard Fletcher, Antonis Kalogeropoulos, David A. L. Levy, and Rasmus Kleis Nielsen. 2017. “Reuters Institute Digital News Report 2017.” Oxford: Reuters Institute for the Study of Journalism.

Nielsen, Rasmus Kleis, and Lucas Graves. 2017. “News You Don't Believe”: Audience Perspectives on Fake News. Oxford: Reuters Institute for the Study of Journalism.

Norris, Pippa. 2011. Democratic Deficit: Critical Citizens Revisited. Cambridge: Cambridge University Press.

O'Neill, Onora. 2002. A Question of Trust. Cambridge: Cambridge University Press.

Sullivan, Margaret. 2017. “It's Time to Retire the Tainted Term ‘Fake News.'” Washington Post. https://www.washingtonpost.com/lifestyle/style/its-time-to-retire-the-tainted-term-fake-news/2017/01/06/a5a7516c-d375-11e6-945a-76f69a399dd5_story.html.

Tsfati, Yariv, and Joseph N. Cappella. 2003. “Do People Watch What They Do Not Trust? Exploring the Association Between News Media Skepticism and Exposure.” Communication Research 30 (5): 504–29.

Vir, Jason, and Kathryn Hall. 2017. “Attitudes to Paying for Online News.” Oxford: Reuters Institute for the Study of Journalism.

Wardle, Claire. 2017. “Fake News. It's Complicated.” First Draft News. https://firstdraftnews.com/fake-news-complicated/.


Taking the Red Pill: Ideological Motivations for Spreading Online Disinformation

Rebecca Lewis, Data & Society Research Institute
Alice Marwick, University of North Carolina at Chapel Hill

Abstract: This paper addresses the ideological motivations of members of far-right internet subcultures who spread misinformation and disinformation online. These groups recruit like- minded individuals who believe that white, male identities and conservative belief systems are under attack from a range of outside forces. They often greatly distrust the mainstream media, which they believe to be dominated by liberal ideology and “politically correct” culture. As group members are radicalized – a process they refer to as “redpilling” – their ideologies and distrust of the media feed on each other and ultimately inform a broader shift in their understanding of reality and veracity. As a result, they may view highly ideological and factually incorrect information as truthful, thus complicating understandings of disinformation.

The role of “fake news,” misinformation, and propaganda in spreading inaccurate information online came to prominence during the run-up to the 2016 election and remains an area of wide concern. Previous research has identified a variety of motivations behind such content, including financial interests (for instance, hyper-partisan clickbait content that promotes audience engagement), the political interests of state actors (such as computational propaganda spread by Russian actors on platforms like Twitter), trolling and disruption (for example, imageboard denizens who spread wild rumors for entertainment), and the desire for fame (as influencers like Milo Yiannopoulos and Mike Cernovich came to prominence during GamerGate and have increased their online footprints by adopting alt-right beliefs) (Caplan & boyd, in press; Woolley & Howard, 2016). For the last year and a half, our team at Data & Society has conducted research into the manipulation of the mainstream media by far-right internet subcultures, including white supremacists, men's rights activists, 4chan and 8chan users, trolls, and conspiracy theorists (Marwick & Lewis, 2017).

For those on the far-right, recruiting like-minded individuals and converting them to their way of thinking is a primary driver of online participation. While these groups have robust online communities, blogs, podcasts and the like, the mainstream media offers a significant opportunity to set agendas, frame current issues, and spread their messaging to a larger and more diverse audience. Since these groups often adhere to conspiratorial thinking which maintains that white, male identities and conservative belief systems are under attack from a hostile Other, whether that be people of color, Muslim immigrants, Jews, global elites, feminists, or the Illuminati, they share a sense of urgency and significance in growing their movement. At the same time, their ideological presuppositions justify what is often aggressive online and offline harassment of feminists, people of color, Jews, journalists, and so forth by presenting an alternate reality in which white men are the victims of an aggressively politically-correct liberal culture.

While individuals may have a variety of reasons to participate in these subcultures, this essay discusses the ideological motivations behind these groups' spread of misinformation and disinformation online—that is, information that is incorrect, and information that is deliberately misleading (Jack, 2017).

Distrust of the Media

Generally, far-right groups spreading mis- and disinformation have very low levels of trust in the mainstream media. To a certain extent, this is reflective of a larger trend; distrust in the media is at an all-time low nationwide, particularly among Republicans, where trust has dipped to 14% (Swift, 2016). Many right-leaning media consumers and young men more generally express frustration with the media's “PC culture,” which they interpret as liberal bias, pointing to media ownership by liberal philanthropists like Jeff Bezos and Ted Turner as proof.

Far-right critiques of mainstream media go beyond perceived liberal bias. For years, self-described “trolls” have criticized the mainstream media for its overly sensational and emotionally manipulative tactics surrounding issues like pedophilia and the murders of young white women (Phillips, 2013). Separately, media consolidation at the corporate level has left “news deserts” in rural areas, meaning that rural news consumers may feel that local media fails to represent their interests. At a time when social media has removed gatekeepers and allowed anyone with internet access to express their opinions, news outlets represent a more traditional, closed system of information distribution, which many members of far-right movements characterize as a distant elite. The extreme version of this belief system maintains that the media deliberately pulls the wool over the eyes of ordinary people in order to justify elite dominance, and often veers into familiar conspiracy theories about Jewish control. Similarly, the liberal beliefs of Silicon Valley media titans have increased distrust in social media platforms, especially when far-right participants believe that content moderation and trending topics favor left-wing viewpoints.

Redpilling and the Shifting Nature of Truth

The process of “redpilling”—a shorthand for far-right radicalization—illustrates how far-right ideology and media distrust feed on each other and ultimately represent a broader shift in people's perceptions of truth, disinformation, and propaganda. The term redpilling speaks to a kind of total revelation, a “waking up” to the realities of the world. The “red pill” refers to the scene in The Matrix in which the character Morpheus offers the protagonist Neo the option of taking a red pill or a blue pill. By taking the red pill, Neo sees “the truth” about the Matrix and his world is upended; had he taken the blue pill, he would have returned to his everyday life, none the wiser. In far-right circles, one is redpilled when they begin believing in a truth that is counterfactual to mainstream belief, which may include white supremacy, Holocaust denial, the danger that immigration posits for white Americans, the oppression of men by feminists, and so forth.

People often first become redpilled in one ideological area, which then serves as an entry point to redpilling in other areas. In other words, our research indicates that someone who has been redpilled on one issue is more likely to be redpilled on others. In this way, susceptibility to redpilling resembles susceptibility to conspiracy theories, as research has shown that one of the greatest predictors of belief in a conspiracy theory is belief in others (Douglas & Sutton, 2008).

Most frequently, the world of Men's Rights Activism—fueled by a backlash to popular feminism, or a lack of romantic or sexual success—serves as an entry point for disaffected young men into white nationalism. Significantly, far-right groups attempting to radicalize people may encourage this piecemeal approach. On the white supremacist website Fash the Nation, for instance, a series called “Red Pill 101” includes articles on “equality” and “race,” but also on “pornography,” which they believe Jews control and use as a strategy to make white men impotent.

The example above shows not only how redpilling leads to a broader radicalization, but also how quickly ideology and distrust of the media feed on each other throughout the process. This also explains the far-right's proclivity for conspiracy theories. If someone begins to believe that white people are actually the oppressed race, it also means that everything that mainstream media is telling them is false. In white supremacist circles, it is widely believed that Jewish elites control the culture (specifically, the media) and are accelerating the destruction of the white race. So, as one becomes redpilled on white supremacy, he is also necessarily redpilled on the mainstream media. The more someone is radicalized against the mainstream media, the more the explanations of ideologues make sense to explain its shortcomings. Thus, redpilled individuals by their very nature distrust any “official story” given to them by mainstream media.

As a result, those who are redpilled fully reject and antagonize the mainstream media and its narratives, as well as traditional institutions of government, business, and education. Not only do the ideologies of the newly-radicalized shift, but their very definitions of misinformation, disinformation, and propaganda, as well as concepts of oppression and inclusion, do as well. Redpilling thus begins to inform an entirely different view of reality, epistemology, and veracity. Redpilled individuals may not view the content that they spread as mis- or disinformation (although in many cases they strategically re-frame events to encourage adoption of their own belief systems), but as necessary in order to protect their own interests and fight for the survival of their very identities.

Misinformation vs. Disinformation

In January 2017, four Chicago teens used Facebook Live to film themselves torturing a mentally disabled man, which received widespread media attention. Alt-right media personality, author, and blogger Mike Cernovich used Twitter's streaming app Periscope to talk to his followers, brainstorming a way to connect the event to larger ideological goals. Since the teenagers were African-American, Cernovich and his followers decided to link the kidnappers to Black Lives Matter, a movement for racial justice which the far-right often describes as a terrorist organization. (One goal of the far-right is getting the federal government to officially label BLM and movements like Antifa terrorist organizations, which would curtail their activities and justify federal action against members.) Using bots and fake accounts, Cernovich and his followers tweeted the hashtag #BLMKidnapping 480,000 times in 24 hours, causing it to trend on Twitter. While both Chicago police and BLM activists debunked the rumor, it spread widely on social media and was picked up by hyper-partisan news sites like Breitbart. The idea that the kidnappers were BLM activists became so pervasive that it was mentioned in most media stories about the kidnapping, even if just to discredit it.


Cernovich and his audience forcibly changed the way the story was framed by pushing out the hashtag. We consider this an example of disinformation, since Cernovich and his audience knew that BLM was not actually linked to the Chicago kidnapping. Instead, they propagated an alternate frame in order to further their larger ideological goals.

By contrast, consider how the far-right has propagated information about Antifa. Antifa is a term for antifascist groups who use direct action techniques to oppose fascist and neo-Nazi thought, often through physical force. In the US, this has been a very small group until quite recently, made up primarily of college-age young people, often linked to anarchist and punk rock communities. In the current political climate, where far-right groups are more visible than in the last two decades, Antifa activism is growing in numbers.

The YouTube documentary “America Under Siege: Antifa” frames Antifa as a radical movement opposed not just to fascism, but to all mainstream conservatism. It presents both accurate historical information about Antifa's European roots and complete inaccuracies, such as the idea that Antifa controls entire towns in East Germany and prevents any right-wing beliefs from being publicly discussed, or that Antifa is funded by George Soros. Some of the documentary's claims are ideologically slanted, but not actually incorrect; for instance, the claim that Antifa members dislike anyone who espouses “Western free enterprise system capitalism” can be read as accurate in one sense, given that Antifa activists are primarily anarchists who by definition are opposed to capitalism. In another sense it is entirely inaccurate – Antifa activists do not, generally, attack “anyone” who supports the status quo. This murkiness, and the lack of understanding around intentionality, makes it difficult to determine whether this documentary is strategic disinformation. Instead, it might be considered misinformation, inaccurate information spread unintentionally. If its creators and distributors are redpilled, they may be entirely convinced that their interpretation of Antifa is correct.

References

Caplan, R., & boyd, danah. (in press). Who's Playing Who? Media Manipulation in an Era of Trump. In Z. Papacharissi & P. J. Boczkowski (Eds.), Trump and the Media. Cambridge, MA: MIT Press.

Douglas, K. M., & Sutton, R. M. (2008). The hidden impact of conspiracy theories: Perceived and actual influence of theories surrounding the death of Princess Diana. The Journal of Social Psychology, 148(2), 210–222.

Jack, C. (2017). Lexicon of Lies: Terms for Problematic Information. New York, NY: Data & Society Research Institute. Retrieved from https://datasociety.net/output/lexicon-of-lies/

Marwick, A., & Lewis, R. (2017). Media Manipulation and Disinformation Online (p. 107). New York: Data & Society Research Institute. Retrieved from https://datasociety.net/output/media-manipulation-and-disinfo-online/

Phillips, W. (2013). The house that Fox built: Anonymous, spectacle, and cycles of amplification. Television & New Media, 14(6), 494–509.


Swift, A. (2016, September 14). Americans' Trust in Mass Media Sinks to New Low. Retrieved from http://www.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx

Woolley, S. C., & Howard, P. N. (2016). Automation, Algorithms, and Politics | Political Communication, Computational Propaganda, and Autonomous Agents — Introduction. International Journal of Communication, 10(0), 9.


The Minority Report on the Fake News Crisis: (Spoiler Alert: It’s the Real News)

Duncan J. Watts, Microsoft Research
David M. Rothschild, Microsoft Research

"To ensure that the concept of self-government outlined by the U.S. Constitution remains a reality into future centuries, the American people must be well informed in order to make decisions regarding their lives, and their local and national communities. It is the role of journalists to provide this information in an accurate, comprehensive, timely and understandable manner." –Society for Professional Journalists

Introduction

Since the 2016 Presidential election, an increasingly familiar narrative has emerged concerning the unexpected victory of Donald Trump. Fake news, much of it produced by Russian sources, was amplified on social networks such as Facebook and Twitter, generating millions of views among a segment of the electorate eager to hear stories about Hillary Clinton's malfeasance and Obama's illegitimacy (Madrigal 2017; Palma and Green 2017; Roose 2017; Timberg 2017). “Alt-right” news sites like Breitbart and Daily Caller amplified and supplemented the outright fake information with highly slanted and misleading coverage of their own (Benkler et al. 2017a; Rutenberg 2017). In parallel, the continuing fragmentation of the media and the increasing ability of Americans to self-select into like-minded “filter bubbles” exacerbated both phenomena, generating a toxic brew of political polarization and skepticism toward traditional sources of authority (Delaney 2017; El-Bermaway 2016). Finally, powerful machine learning algorithms, coupled with reams of data from social networks, allowed groups allied with the Trump campaign to micro-target susceptible voters in swing states with messages specifically tailored to shift their votes (Albright 2017).

Alarmed by these events, the mainstream media has vigorously shouldered the mantle of truth-speakers. The Washington Post changed its motto to “Democracy Dies in Darkness” on February 22, 2017 (Fahri 2017). The New York Times launched a major ad campaign reflecting the nuanced and multifaceted nature of truth, “Truth is …,” on February 26, 2017, during the Oscars (Diaz 2017). CNN hosts now routinely accuse their guests of lying or bending the truth, and chyrons fact-check President Trump as he speaks (Abbruzzese 2016). Headline writers now explicitly call out falsehoods in the stories that follow, rather than leaving it to the text. Journalists readily accuse the President and his staff of indulging in false equivalence, as when he compared Antifa protestors with Nazis and heavily armed white supremacists after the Charlottesville riots.

In parallel, the same outlets have stepped up their already vigorous critiques of the role that technology companies—in particular Facebook, Google, and Twitter—may have played in the election (Cohen 2017; Coren 2017; Entous, Dwoskin and Timberg 2017; Madrigal 2017; Roose 2017).


Stories appear on an almost daily basis highlighting the potential ways in which algorithms and social sharing can combine to spread misinformation (Bajarin 2017; Morgan 2017; Palma and Green 2017). For example, a widely cited BuzzFeed article (Silverman 2016) claimed that the 20 most shared fake news articles on Facebook were shared more frequently than the 20 most shared real news articles. Numerous stories have reported on the manipulation of Facebook's ad system by Russian-affiliated groups, including ads that were targeted at self-described “Jew Haters” (Feldman 2017; Timberg and Dwoskin 2017; Volz 2017). Algorithmic ranking of Google search results has been implicated in amplifying fake reports in the immediate aftermath of the Las Vegas shootings (Machkovech 2017). Lawmakers such as Senator Mark Warner (D-VA) have been prominently profiled on account of their outspoken criticism of the tech industry (Kang 2017). Even Facebook's own employees have been reported as expressing anxiety over their company's role in the election (Isaac 2017).

A problem with the fake news narrative

We agree that fake news, and misinformation more generally, are real problems that require scientific attention (Lazer et al. 2017). More broadly, we agree that social media and other web technologies have contributed to more general problems in democratic discourse, such as the lack of respect for expert opinion and the potentially eroding basis of shared reality (Watts and Rothschild 2017). Nonetheless, we believe that the volume of reporting around fake news, and the role of tech companies in disseminating it, is wildly disproportionate to its likely role in the election outcome, for three reasons.

First, as large as the breathlessly repeated numbers on fake news are made to seem, they are not large when compared with the volume of information to which online users are exposed (Karpf 2017). For example, a New York Times story (Shane and Goel 2017) reported that Facebook had identified over 3,000 ads purchased by fake accounts traced to Russian sources, which generated over $100,000 in advertising revenue. But Facebook's advertising revenue in Q4 of 2016 was over $8,000,000,000, which translates to roughly $88,000,000 per day; that is, the fake ads in total comprised roughly 0.1% of Facebook's daily advertising revenue. Likewise, a 2016 BuzzFeed report (Silverman 2016) claimed that the top 20 fake news stories on Facebook “generated 8,711,000 shares, reactions, and comments” between August 1 and Election Day. Again, this sounds like a large number until one considers that Facebook had 1,500,000,000 active monthly users in 2016. If each user took only a single action per day (likely an underestimate), then over the 100 days prior to the election, the 20 stories in BuzzFeed's study would have accounted for only 0.006% of user actions. Even recent claims that the “real” numbers were much higher than initially reported do not hold up to scrutiny. For example, a New York Times story reported on Oct 30 that “Russian agents…disseminated inflammatory posts that reached 126 million users on Facebook, published more than 131,000 messages on Twitter and uploaded over 1,000 videos to Google's YouTube service” (Isaac and Wakabayashi 2017). Big numbers indeed. Except that several paragraphs later in the same article, the authors concede that over the same period Facebook users were exposed to 11 trillion posts—roughly 87,000 for every fake exposure—while on Twitter
the Russian-linked election tweet represented less than 0.75% of all election- related tweets. On YouTube, meanwhile, the
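These figures are simple back-of-envelope ratios and are easy to reproduce. The short Python sketch below (ours, for illustration only) recomputes the two headline numbers from the reported inputs; the 92-day quarter length is our own assumption, and the one-action-per-user-per-day rate is the deliberately conservative rate used above.

    # Back-of-envelope check of the prevalence figures quoted above. All inputs
    # are the round numbers reported in the cited stories; the 92-day quarter
    # and the one-action-per-user-per-day rate are assumptions, not reported facts.

    fake_ad_revenue = 100_000                 # Russian-linked ad buys (Shane and Goel 2017)
    fb_q4_ad_revenue = 8_000_000_000          # Facebook ad revenue, Q4 2016
    fb_daily_revenue = fb_q4_ad_revenue / 92  # roughly $87 million per day
    print(f"Fake ads vs. one day of ad revenue: {fake_ad_revenue / fb_daily_revenue:.1%}")

    top20_engagements = 8_711_000             # shares, reactions, comments (Silverman 2016)
    monthly_users = 1_500_000_000             # Facebook monthly active users, 2016
    total_actions = monthly_users * 1 * 100   # one action per user per day for 100 days
    print(f"Top-20 fake stories vs. all actions: {top20_engagements / total_actions:.3%}")

Running the sketch prints 0.1% and 0.006%, matching the figures in the text.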

Second, given what is known about the impact of online information on opinions, even the high-end estimates of fake news penetration would be unlikely to have had a meaningful impact on voter behavior. For example, a recent study by Allcott and Gentzkow (2017) estimates that "the average US adult read and remembered on the order of one or perhaps several fake news articles during the election period, with higher exposure to pro-Trump articles than pro-Clinton articles." In turn, they estimate that this level of exposure would have changed vote shares by "an amount on the order of hundredths of a percentage point." As the authors acknowledge, the fake news stories could have been concentrated on specific segments of the population, who in turn could have had a disproportionate impact on the election outcome. Also, potentially, individual stories can exert outsized influence over readers' opinions if they resonate with deeply held beliefs or fears. For all these reasons, fake news and ads could have influenced the election more than their relatively tiny average prevalence suggests. Nevertheless, the effect would have had to be much larger—at least 100 times larger—to account for Trump's margin of victory in the key states on which the election outcome depended.

Third, there are other reasons to think that the effect could have been even smaller. Many of the shares are likely to have been made by automated "bots" rather than by humans (Bessi and Ferrara 2016). Even among human users, many impressions that are registered by social media platforms do not correspond to actual consumption. Finally, the sheer outrageousness of the most popular stories—Pope Francis endorsing Trump; Democrats planning to impose Islamic law in Florida; Trump supporters chanting "We hate Muslims, we hate blacks;" and so on (Ritchie 2016)—made them especially unlikely to have altered voters' pre-existing opinions of the candidates. Notwithstanding the occasional rube who actually believed that Hillary Clinton was running a pedophilia sex ring out of a Washington pizzeria, such stories were most likely consumed by readers who already agreed with their overall sentiment (Hillary is evil, Trump supporters are racists, etc.) and who shared them either to signal their "tribal allegiance" or simply for entertainment value ("cheerleading"), not because they had been persuaded by the stories themselves, or even necessarily believed them to be accurate.

Turning attention to the "real" news
To the extent that one considers the 2016 presidential election outcome a failure of democratic decision making, fake news is therefore unlikely to be the main culprit. But if not, then what is? A potentially more serious threat is what Benkler et al. (2017a) refer to as "a network of mutually-reinforcing hyper-partisan sites that revive what Richard Hofstadter called 'the paranoid style in American politics,' combining decontextualized truths, repeated falsehoods, and leaps of logic to create a fundamentally misleading view of the world." Unlike the fake news numbers above, engagement with sites like Breitbart News, Infowars, and the Daily Caller is substantial—especially in the realm of social media. Nevertheless, the more detailed accounting contained in Benkler et al. (2017b) shows that by any reasonable metric—including Facebook or Twitter shares, but also referrals from other media sites, number of published stories, etc.—the media ecosystem remains dominated by conventional (and mostly left-of-center) sources such as the Washington Post, the New York Times, Huffington Post, and CNN.

In terms of Facebook shares, far and away the largest players were (in order) The New York Times, CNN, Breitbart, Huffington Post, The Hill, and the Washington Post. In terms of Twitter shares, the same five again dominate, albeit in slightly different order (Breitbart drops to fourth after CNN, The New York Times, and The Hill), while in terms of media links Breitbart and Fox News are well down the rankings behind The Washington Post, The New York Times, CNN, and Politico.

Given the attention that these very same news outlets have lavished, post-election, on fake news shared via social media, it may come as a surprise that they themselves dominated social media traffic; but once again some simple math shows why it should not be. While it may have been the case that the twenty most shared fake news stories narrowly outperformed the twenty most shared mainstream stories, the overall volume of stories produced by major news outlets vastly outnumbers fake news. According to Benkler et al. (2017b), "The Washington Post produced more than 50,000 stories over the 18-month period, while the New York Times, CNN, and Huffington Post each published more than 30,000 stories" (p. 28). Presumably not all these stories were about the election, but each such story was also likely reported by many news outlets simultaneously. A rough estimate of thousands of election-related stories published by the mainstream media is therefore not unreasonable.

What did all these stories talk about? Benkler et al. (2017b) investigated this question also, counting sentences that appeared in mainstream media sources that they then classified as detailing one of several Clinton- and Trump-related issues. In particular, they classified each sentence as describing either a scandal (e.g. Clinton Emails, Trump Taxes) or a policy issue (Clinton Jobs, Trump Immigration). As shown in Figure 1, they found roughly four times as many Clinton-related sentences that described scandals as policies, whereas Trump-related sentences were one-and-a-half times as likely to be about policy as scandal. Given the sheer number of scandals in which Trump was implicated—sexual assault, the Trump Foundation, Trump University, redlining in his real-estate developments, insulting a gold star family, numerous instances of racist, misogynist, and otherwise offensive speech—and his repeated claims to have been treated unfairly by the media, it is striking that the media devoted more attention to his policies than to his personal failings. Even more striking, one single Clinton-related scandal—her use of a private email server while serving as Secretary of State—accounted for more sentences than all of Trump's scandals combined (65,000 vs. 40,000) and more than twice as many as were devoted to all of her policy positions.


Figure 1. Count of sentences that appeared in mainstream media sources (e.g. New York Times, Washington Post, Huffington Post, CNN) that were classified as describing either a scandal or a policy issue related to either Trump or Clinton. Reprinted from Benkler et al. (2017a).

To reiterate, these 65,000 sentences were written not by Russian hackers, but by professional journalists employed at mainstream news organizations, such as the New York Times, The Washington Post, and the Wall Street Journal.i To the extent that voters mistrusted Hillary Clinton, or considered her conduct as Secretary of State to have been negligent and potentially criminal, or were generally unaware of what her policies contained or how they may have differed from Donald Trump's, these numbers suggest that their views were entirely consistent with the content being produced by mainstream news sources. Moreover, in light of the often bizarre focus of the fake news stories, and given the issues that appear to have energized voters on both sides—the repeal and replace of Obamacare, Hillary Clinton's email scandal, Donald Trump's misogyny and racism—it is not far-fetched to speculate that the real news was far more influential in driving the election outcome than the fake news.

NYT Campaign Coverage
To shed more light on this possibility, we conducted a more in-depth analysis of a single media outlet—namely the New York Times.ii We gathered two datasets that captured the New York Times' coverage of the final stage of the 2016 presidential election. The first dataset comprised all articles that appeared on the front page of the printed newspaper (399 total) over the last 69 days of the campaign, beginning on Sep 1, 2016 and ending on Nov 8 (election day). The second dataset comprised the full corpus of all 13,481 articles published online over the same period. In both datasets, we first identified all articles that were relevant to the election campaign. We then further categorized each of these articles as belonging to one of three top-level categories:iii Campaign Miscellaneous, Personal/Scandal, and Policy. Within Personal/Scandal we then further classified the article as focused on Clinton or Trump, and within Policy classified it as one of the following: Policy no details, Policy Clinton details, Policy Trump details, and Policy both details:

Campaign Miscellaneous articles focused on the "horse race" elements of the campaign, such as the overall likelihood of victory of the candidates, details of intra-party conflicts, or the mobilization of specific demographic groups (e.g. African Americans, Hispanics). For example, an October 12 story with the headline "Republican Split Over Trump Puts States into Play" described how Clinton's campaign was taking advantage of Trump's battle with the Republican party. This article was manifestly about the campaign, but treated it mostly as a contest in which a dramatic twist had just taken place. It contained little information that would have helped potential voters understand the candidates' policy positions and hence their respective agendas as President.

Personal/Scandal articles focused on the controversial actions and/or statements of the candidates, either during the election itself or prior to it, as well as on the fallout generated by the controversies. An example of the former would be an Oct 8 article, "Tape Reveals Trump Boast About Groping Women," which discussed the infamous Access Hollywood tape. An example of the latter would be an October 29 article, "New Emails Jolt Clinton Campaign in Race's Last Days," which discussed the impact of the reopening of the FBI investigation into Clinton's private email server on the campaign but provided no new details on the scandal itself. In addition, we classified each Personal/Scandal article as being primarily about Clinton (e.g. emails, Benghazi, the Clinton Foundation) or Trump (e.g. sexual harassment, Trump University, the Trump Foundation, etc.).

Policy articles mentioned policy issues such as healthcare, immigration, taxation, abortion, or education. Articles coded as Policy No Details mentioned policy issues as impacting the campaign, but did not describe the actual policies of the candidates. For example, an Oct 26 article, "Growing Costs of Health Law Pose a Late Test," described Donald Trump attacking Hillary Clinton over the issue of premium increases, but did not mention the policy proposals of the two candidates, nor did it note that due to subsidies the hikes would not affect the actual price paid by 86% of people in marketplaces. Policy Clinton Details or Policy Trump Details counted articles that mentioned specifics of the Clinton or Trump platforms respectively but not both, while Policy Both Details compared the specifics of the two candidates' platforms. For example, an October 3 article, "Next President Likely To Shape Health Law Fate," which we coded as Policy Both Details, noted that Clinton had endorsed "a new government-sponsored health plan, the so-called public option, to give consumers an additional choice." It also noted that "Donald J. Trump and Republicans in Congress would go in the direction of less government, reducing federal regulation and requirements so insurance would cost less and no-frills options could proliferate. Mr. Trump would, for example, encourage greater use of health savings accounts, allow insurance policies to be purchased across state lines and let people take tax deductions for insurance premium payments."

Table 1 shows the results of our analysis. Of the 150 front-page articles that discussed the campaign in some way, we classified slightly over half (79) as Campaign Miscellaneous. Slightly over a third (55) were Personal/Scandal, with 30 focused on Trump and 25 on Clinton. Finally, just over 10% (16) of articles discussed Policy, of which six had no details, four provided details on Trump policy only, one on Clinton policy only, and five made some comparison between the two candidates' policies.


Table 2 shows the corresponding results for the full corpus (see Appendix for details of methods). Of the 1,433 articles that mentioned Trump or Clinton, 291 were devoted to scandals or other personal matters, while only 70 mentioned policy. Of those, only 60 mentioned any details of either candidate's positions. In other words, comparing the two datasets, the number of Personal/Scandal stories for every Policy story ranged from 3.4 (for front page stories) to 4.2. Further restricting to policy stories that contained some detail about at least one candidate's positions, these ratios rise to 5.5 and 4.85 respectively.

Table 1: Front page New York Times stories classified as about the campaign that focused primarily on personal scandals, policy, and miscellaneous matters (see text for coding).

        Campaign   Personal/Scandal   Personal/Scandal   Policy      Policy Clinton   Policy Trump   Policy Both
        Misc.      Clinton            Trump              No Detail   Detail           Detail         Detail
Sep.    30         6.5                8.5                2           4                1              3
Oct.    33         13.5               17.5               4           0                0              2
Nov.    16         5                  4                  0           0                0              0
Tot.    79         25                 30                 6           4                1              5

Table 2: New York Times stories classified in the same manner as above but drawn from the full corpus published at nytimes.com.

        Mention          Personal/Scandal   Personal/Scandal   Policy      Policy Clinton   Policy Trump   Policy Both
        Trump/Clinton    Clinton            Trump              No Detail   Detail           Detail         Detail
Total   1,433            87                 204                10          13               30             17
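The scandal-to-policy ratios quoted in the text follow directly from these two tables. As a quick illustration (the grouping below is ours, using only the counts reported above):

    # Scandal-to-policy ratios implied by Tables 1 and 2.
    tables = {
        "front page": {"scandal": 25 + 30, "policy": 6 + 4 + 1 + 5, "detail": 4 + 1 + 5},
        "full corpus": {"scandal": 87 + 204, "policy": 10 + 13 + 30 + 17, "detail": 13 + 30 + 17},
    }
    for name, t in tables.items():
        print(f"{name}: {t['scandal'] / t['policy']:.2f} scandal stories per policy story, "
              f"{t['scandal'] / t['detail']:.2f} per policy story with details")

This prints 3.44 and 5.50 for the front page, and 4.16 and 4.85 for the full corpus, matching the ratios reported above.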

As has become clear since the election, there were in fact profound differences between the two candidates' policy positions, and these differences are already proving enormously consequential to the American people. Under President Trump, the Affordable Care Act is being actively undermined, environmental and consumer protections are being rolled back, international alliances and treaties are being threatened, and immigration policy has been thrown into turmoil, among other dramatic changes. If ever there was an election in which the news media had an obligation to inform citizens of the substantive policy implications of their choice, the election of 2016 should have been such an election. And yet only five out of 150 front page articles that the New York Times ran over the last, most critical months of the election even attempted to make these comparisons evident to their readers, while only ten even mentioned policy details of either candidate.iv


Clinton Email Scandal. In this context, ten is an interesting number because it is also the number of front-page stories that the New York Times ran on the Hillary Clinton email scandal in just six days, from Oct 29 (the day after FBI director James Comey announced his decision to reopen his investigation of possible wrongdoing by Clinton) through Nov 3, just five days before the election (see Fig 2). In retrospect, one cannot help but be struck by the sheer extent of coverage, day after day, of this one issue on the front page of the nation's leading newspaper. The level of coverage becomes truly astounding when juxtaposed with the coverage of all policy issues combined. To reiterate, in just six days the New York Times ran as many cover stories about Hillary Clinton's emails as they did about all policy issues combined in the 69 days leading up to the election.v Nor can this intense focus on the email scandal be written off as inconsequential: the Comey incident and its subsequent impact on Clinton's approval rating among undecided voters was very likely sufficient to have tipped the election (Silver 2017a).

Figure 2: Ten articles on the front page of the New York Times in a six-day period (October 29 to November 3, 2016), discussing the FBI investigation into Secretary Clinton's use of a private email server. In the same time period there were six front page articles on the dynamics of the campaign, one piece on Trump's business, and zero on the public policy positions of the candidates.


Coverage of Obamacare. Setting scandals aside for a moment, arguably the most salient policy issue of the election campaign was the success or failure of Obamacare. In the months after Donald Trump's inauguration, The Upshot, the data-centric sub-section of the New York Times, published two pieces on Obamacare. The first, "One-Third Don't Know Obamacare and Affordable Care Act Are the Same," published on Feb 7, described some important misconceptions about Obamacare held by large percentages of the American public—for example, that almost 40% (and 47% of Republicans) did not know that repealing Obamacare would cause people to lose Medicaid coverage or subsidies for private insurance. The second, "No, Obamacare isn't in a 'Death Spiral,'" published on March 15, 2017, provided readers with some important details about how Obamacare works. For example, it noted that "because of how subsidies work, people were generally shielded from this year's higher prices." It also noted that while prices had gone up recently, they "were lower than expected in the first few years of the program." The article then went on to paint a clear picture of an insurance market that could certainly use improvement, but that the "Obamacare markets will remain stable over the long run, if there are no significant changes."

Given the importance of health care to a large percentage of Americans, including Trump voters, one might have expected to see detailed explanations of how Obamacare worked and how well it was working in the months leading up to the election. In contrast, the Times' pre-election coverage of Obamacare was both surprisingly sparse (we counted only four front page stories between Sept 1 and Nov 8) and surprisingly negative. As Figure 3 shows, the first article, on October 3, strikes a much gloomier note than the Upshot articles above, stating "Mr. Obama's signature domestic achievement will almost certainly have to change to survive." But then, in a three-day period from Oct 25-27, the Times' tone was even more negative: "Choices fall in health law as costs rise" blares the October 25 headline; "Growing costs of health law pose a late test;" and finally "Many prefer tax penalties to health law." All four articles emphasized troubles in the insurance market, failing to mention that most policy holders have subsidized capped prices (so are insulated from premium hikes), or that the government was spending less than anticipated, or that premiums were rising slower than before Obamacare. Further, none of the articles mention the Medicaid expansion, which has proven to be one of the most popular parts of the bill. Instead, the articles draw their negative inferences from the small fraction of Obamacare users who paid full price in the markets, while ignoring all the features that have made it so popular postelection.


Figure 3. All four front-page articles that touch on the Affordable Care Act (aka ObamaCare), published on (from left to right) October 3, 25, 26, 27.

Conclusion
Consistent with Benkler et al.'s (2017b) sentence-level count, our analysis of the New York Times' coverage finds that it mostly focused on "dramatic" issues like the horserace or personal scandals, not on the kind of substantive policy issues that would have allowed readers to make informed decisions about the candidates' platforms.vi If voters had wanted to educate themselves on issues such as healthcare, immigration, taxes, and economic policy—or how these issues would likely be affected by the election of one or other of the candidates as President—they would not have learned much from reading the New York Times. What they would have learned about was that both candidates were plagued by scandal: Hillary Clinton over her use of a private email server for government business while Secretary of State and also allegations of possible conflicts of interest in the Clinton Foundation; and Trump over his failure to release his tax returns, his past business dealings, Trump University, the Trump Foundation, accusations of sexual harassment and assault, and numerous misogynistic, racist, and otherwise insensitive or offensive remarks.vii What they would also have learned about was the ever-fluctuating state of the horse race: who was up and who was down; who might turn out and who might not; and who was happy or unhappy with whom about what.

To reiterate, we chose to analyze the New York Times' coverage not because we felt it was worse than any other mainstream news organization (again, our results are broadly consistent with Benkler et al. (2017b)), but rather because it illustrates a broader failure of mainstream journalism to inform their audiences of the very real and very consequential issues at stake. In retrospect, it seems clear that the Times—along with essentially all its peer organizations—made the mistake of assuming that a Clinton victory was inevitable, and were setting themselves up as credible critics of the next administration (Silver 2017a). Possibly this mistake arose from the failure of journalists to get out of their "hermetic bubble," as some have asserted (Spayd 2017). Possibly it was their misinterpretation of available polling data, which showed all along that a Trump victory, albeit unlikely, was far from inconceivable (Silver 2017b; Watts 2017). These were understandable mistakes, but they were still mistakes.

Yet rather than acknowledging the possible impact that their collective failure of imagination could have had on the election outcome, the mainstream news community has instead focused its critical attention on fake news, Russian hackers, technology companies, algorithmic ranking, the alt-right media, even on "Americans' inability or unwillingness to sift fact from fiction" (Parker 2017)—that is, everywhere but on themselves.

To be fair, journalists were not the only community to be surprised by the outcome of the 2016 election—a great many informed observers, possibly including the candidate himself, failed to take the prospect of a Trump victory seriously. Also to be fair, the difficulty of adequately covering a campaign in which the "rules of the game" were repeatedly upended must surely have been formidable (Hepworth et al. 2016). That said, one could equally argue that Facebook could not have been expected to anticipate the misuse of its advertising platform to seed fake news stories. And one could just as easily argue that the difficulties facing tech companies in trading off between complicity in spreading intentional misinformation on the one hand, and censorship on the other hand, are every bit as formidable as those facing journalists trying to cover Trump. For journalists to excoriate the tech companies for their missteps while barely acknowledging their own reveals an important blind spot in the journalistic profession's conception of itself.

We have no doubt that journalists take their mission to provide their readers with the information they need "in order to make decisions regarding their lives, and their local and national communities... in an accurate, comprehensive, timely and understandable manner"viii seriously. We note, however, that this mission implicitly assumes that journalists are passive observers of events rather than active participants, and that their choices about what to cover and how to cover it do not meaningfully influence the events in question. Given the disruption visited upon the print news business model since the beginning of the 21st century, journalists can perhaps be forgiven for seeing themselves as helpless bystanders in an information ecosystem that is increasingly centered around social media and search. But even if the news media has ceded much of its distribution power to technology companies, its longstanding ability to "set the agenda" (McCombs and Shaw 1972)—that is, to determine what counts as news to begin with—remains formidable. In sheer numerical terms, the information to which voters were exposed during the election campaign was overwhelmingly produced not by fake news sites or even by alt-right media sources, but by household names like the New York Times, the Washington Post, and CNN. Without discounting the role played by malicious Russian hackers and naïve tech executives, we believe that fixing the information ecosystem is much more about improving the real news than stopping the fake stuff.

The authors are grateful to William Cai for valuable research assistance.

Appendix: Analysis of the Full Corpus
What the NYT puts on the front page of its print edition is important, but not necessarily representative of how many readers encounter the news, either because they navigate directly to individual articles from social media sources (mostly Facebook and Twitter), or because articles at nytimes.com can appear in different places at different times. To check that our conclusions regarding coverage of the campaign were not badly biased by restricting our sample to the front page only, we also coded the entire corpus of all articles published on nytimes.com during the same period.

Because this sample is much larger (13,481 vs. 399), we coded them using a combination of machine classification and hand coding.

First, we scraped the headline and first paragraph, if given by the API, of all NYT articles from 9/1/2016 to 11/9/2016 using the archive API for all articles that included the words Clinton or Trump. Note: this criterion would include all campaign-related articles but might also include potentially non-campaign related articles (e.g. about Bill Clinton or Ivanka Trump). Next, we compiled a list of words (details below) delineating three categories of article: campaign (focused on horse race and how people react to events); policy (focused on a policy issue); and personal/scandal. For each article, we check if any of the words in the article begin with one of the stems in our word list. For example, if an article contains the word 'immigration', we notice that it starts with 'immigrat', which is one of our policy words, and we mark it as a policy article. Then, for all articles which are marked as candidate policy articles, we hand-code them into the four sub-categories and toss articles into campaign miscellaneous if they do not actually cover any policy.

Finally, we hand-coded the Policy articles as Policy No Details, Policy Clinton Details, Policy Trump Details, or Policy Both Details using the same criteria as above.

Word list for Clinton/Trump Categories:

1) Clinton personal/scandal words: ['email', 'benghazi', 'foundation', 'road']

2) Trump personal/scandal words: ['', 'foundation', 'university', 'woman', 'women', 'tax', 'sexual assault', 'golf', 'tape', 'kiss']

3) Policy words for both candidates are taken from the list of issues covered by On the Issues: ['abortion', 'budget', 'civil rights', 'corporation', 'crime', 'drug', 'education', 'energy', 'oil', 'environment', 'familiy', 'families', 'children', 'foreign policy', 'trade', 'reform', 'government', 'gun', 'health', 'security', 'immigra', 'technology', 'job', 'principl', 'value', 'social security', 'war', 'peace', 'welfare', 'poverty', 'econom', 'immigrat', 'immigran']

4) Campaign words for both candidates: ['fundraise', 'ads', 'advertisements', 'campaign', 'trail', 'rally', 'endors', 'outreach', 'ballot', 'vote', 'electoral', 'poll', 'donat', 'turnout', 'margin', 'swing state']
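To make the stem-matching step concrete, here is a minimal Python sketch of the procedure described above, assuming the scraped headline and first paragraph as inputs. The word lists are abbreviated, and the function names and category labels are our illustrations, not the authors' actual code.

    # Sketch of the first-pass stem classifier described in this appendix.
    POLICY_STEMS = ['abortion', 'budget', 'immigra', 'econom', 'health', 'gun']
    CAMPAIGN_STEMS = ['campaign', 'poll', 'vote', 'turnout', 'rally', 'endors']

    def matches(text, stems):
        """Return True if any word in the text begins with one of the stems."""
        words = text.lower().split()
        return any(word.startswith(stem) for word in words for stem in stems)

    def rough_category(headline, first_paragraph):
        """First-pass machine label; policy hits were then hand-coded further."""
        text = headline + " " + first_paragraph
        if matches(text, POLICY_STEMS):
            return "policy (hand-code into sub-categories)"
        if matches(text, CAMPAIGN_STEMS):
            return "campaign miscellaneous"
        return "check personal/scandal lists"

    print(rough_category("Next President Likely To Shape Health Law Fate", ""))
    # prints: policy (hand-code into sub-categories)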

References

Abbruzzese, Jason. 2016. "CNN Rolled Out a Pretty Bold Way to Fact-Check Donald Trump." in Mashable.

Albright, Jonathan. 2017. "Cambridge Analytica: the Geotargeting and Emotional Data Mining Scripts." in Medium.

Allcott, Hunt, and Matthew Gentzkow. 2017. "Social Media and Fake News in the 2016 Election." National Bureau of Economic Research.

Bajarin, Tim. 2017. "Forget Politics: Why 'Fake News' Should Infuriate You." in PC Magazine.

Benkler, Yochai, Robert Faris, Hal Roberts, and Ethan Zuckerman. 2017a. "Study: Breitbart-led right-wing media ecosystem altered broader media agenda." in Columbia Journalism Review.

Benkler, Yochai, Hal Roberts, Robert Faris, Bruce Etling, Ethan Zuckerman, and Nikki Bourassa. 2017b. "Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election." Berkman Klein Center for Internet & Society Research Paper.

Bessi, Alessandro, and Emilio Ferrara. 2016. "Social bots distort the 2016 US Presidential election online discussion."

Cohen, Noam. 2017. "Silicon Valley is Not Your Friend." in New York Times.

Coren, Michael. 2017. "Ending fake news means changing how Wall Street values Facebook and Twitter." in Quartz.

Delaney, Kevin J. 2017. "Filter bubbles are a serious problem with news, says Bill Gates." in Quartz. 21 February 2017.

Diaz, Ann-Christine. 2017. "'The Truth is Hard,' Says The New York Times' First-Ever Oscars Ad." in AdAge.

Donchev, Danny. 2017. "36 Mind Blowing YouTube Facts, Figures, and Statistics - 2017." in Fortune Lords.

El-Bermaway, Mostafa. 2016. "Your Filter Bubble is Destroying Democracy." in Wired.

Entous, Adam, Elizabeth Dwoskin, and Craig Timberg. 2017. "Obama tried to give Zuckerberg a wake-up call over fake news on Facebook." in Washington Post.

Fahri, Paul. 2017. "The Washington Post's New Slogan Turns Out to be an Old Saying." in Washington Post.

Feldman, Brian. 2017. "Facebook Hands Over Data to Congress and Promises Better Disclosure on Political Ads." in New York Magazine.

Hepworth, Shelley, Vanessa Gezari, Kyle Pope, Cory Schouten, Carlett Spike, David Uberti, and Pete Vernon. 2016. "Covering Trump: An oral history of an unforgettable campaign." in Columbia Journalism Review.

Isaac, Mike. 2017. "At Facebook, Hand-Wringing Over a Fix for Fake Content." in New York Times.

Isaac, Mike, and Daisuke Wakabayashi. 2017. "Russian Influence Reached 126 Million Through Facebook Alone." in New York Times.

Kang, Cecilia. 2017. "Mark Warner: Tech Millionaire Who Became Tech's Critic in Congress." in New York Times.

Karpf, David. 2017. "People are hyperventilating over a study of Russian propaganda on Facebook. Just breathe deeply." in Washington Post.

Lazer, David, Matthew Baum, Yochai Benkler, Adam Berinsky, Kelly Greenhill, Filippo Menczer, Miriam Metzger, Brendan Nyhan, Gordon Pennycook, David Rothschild, Michael Schudson, Steven A. Sloman, Cass R. Sunstein, Emily Thorson, Duncan J. Watts, and Jonathan Zittrain. 2017. "The science of fake news." Under review.

Machkovech, Sam. 2017. "Google admits citing 4chan to spread fake Vegas shooter news." in Ars Technica.

Madrigal, Alexis C. 2017. "12 October 2017." in The Atlantic.

McCombs, Maxwell E., and Donald L. Shaw. 1972. "The agenda-setting function of mass media." Public Opinion Quarterly 36(2):176-87.

Morgan, Jonathon. 2017. "Facebook and Google need to own their role in spreading misinformation – and fix it." in CNN.

Palma, Bethania, and Vinny Green. 2017. "Misinfluencers, Inc.: How Fake News Is Reaching Millions Using Verified Facebook Accounts." in Snopes.

Parker, Emily. 2017. "Silicon Valley Can't Destroy Democracy Without Our Help." in New York Times.

Rees, Annie. 2016. "In NYT's Hillary Clinton Coverage, An Obsession With 'Clouds' And 'Shadows'." in Talking Points Memo.

Ritchie, Hannah. 2016. "Read all about it: The biggest fake news stories of 2016." in CNBC.

Roose, Kevin. 2017. "We Asked Facebook 12 Questions About the Election, and Got 5 Answers." in New York Times.

Rutenberg, Jim. 2017. "RT, Sputnik and Russia's New Theory of War." in New York Times Magazine.

Shane, Scott, and Vindu Goel. 2017. "Fake Russian Facebook Accounts Bought $100,000 in Political Ads." in New York Times.

Silver, Nate. 2017a. "The Comey Letter Probably Cost Clinton The Election." in FiveThirtyEight.

—. 2017b. "The Media Has A Probability Problem." in FiveThirtyEight.

Silverman, Craig. 2016. "This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook." in BuzzFeed.

Spayd, Liz. 2017. "Seeking More Voices, Even if Some Don't Want to Hear Them." in New York Times.

Timberg, Craig. 2017. "Russian propaganda may have been shared hundreds of millions of times, new research says." in Washington Post.

Timberg, Craig, and Elizabeth Dwoskin. 2017. "Facebook takes down data and thousands of posts, obscuring reach of Russian disinformation." in Washington Post.

Volz, Dustin. 2017. "Facebook will help investigators release Russia ads, Sandberg tells Axios." in Reuters.

Watts, Duncan J., and David Rothschild. 2017. "Rebuilding legitimacy in a post-truth age." Medium Jan 17.

Watts, Duncan J. 2017. "The non-inevitability of Donald Trump (and almost everything)." in Medium.

i Breitbart and other "alt-right" media organizations (e.g. TruthFeed, the Daily Caller) were included in this sample, but a separate count in which the two largest sites (Breitbart and the Daily Caller) were removed yielded qualitatively similar results (personal correspondence with Y. Benkler).

ii We chose the Times for two reasons: first, because its broad reach both among policy elites and ordinary citizens means that the Times has singular influence on public debates; and second, because its reputation for serious journalism implies that if the Times did not inform its readers of the issues then it is unlikely that such information was widely available anywhere.

iii Every front-page article was read by two researchers and coded for these three topline categories and sub-categories, using only the text that appeared on the actual front page (not what may have been continued on later pages). There was very little disagreement between the two researchers. Most critically, there was agreement over the articles that covered policies of both candidates and just one article difference on the total number of articles that included policy. For simplicity, the authors reviewed all disagreements together, by hand, and we report from that dataset.

iv Perhaps even more damning, even the tiny minority of articles that did attempt to deal with policy details skimmed over some important ones. For example, the Oct 3 article from earlier did not mention the key provision of cutting the Medicaid expansion, nor did it mention any consequences of eliminating Essential Benefits (e.g. making healthcare insurance unaffordable for people with pre-existing conditions). Given that healthcare was arguably the most salient issue of the election to many voters, these omissions seem significant.

v If one counts the two additional front page articles on Clinton's email that appeared on November 6 and November 7 (the latter describing Comey's notification to Congress that the renewed investigation had uncovered nothing new), then these numbers look even more lopsided.

vi One of the authors, David Rothschild, also covered the horse race, writing a piece in the New York Times on how the polling margin of error is larger than advertised. He was featured in two other articles, one where he was an outlier in showing Trump to win Florida and another where he argued that the winner of Pennsylvania would win the election.

vii Even when covering scandals, the coverage was less about what happened than the perception of what happened, as evidenced by the frequent references to "shadows," "clouds," and other implied rather than actual wrongdoing (Rees 2016).

viii From the website of the Society of Professional Journalists.


Personalized Information Environments and Their Potential Consequences for Disinformation

Deen Freelon, University of North Carolina

Ever since the internet started to permeate everyday life, academics and journalists have fretted about the ill effects of personalized information environments. This idea was first popularized by Negroponte (1996) in the mid-1990s as the "Daily Me," and developed into its best-known form primarily by Sunstein (2007) and Pariser (2011). Other closely-related concepts include echo chambers, cyberbalkanization, filter bubbles, and selective exposure to media content. Personalized information environments are digital content delivery systems configured to suit the idiosyncratic tastes of a single user. They may be constructed manually, as by following particular accounts on a social media platform; or automatically, as when a recommender system serves content on the basis of a user's prior behaviors. Many contemporary sociotechnical systems, including Facebook and Twitter, incorporate both manual and automated components to generate each user's individual feed.

These environments can become problematic when considered from normative democratic perspectives that value civility and the exchange of information between people and groups that disagree fundamentally. Perhaps the best-known of these in political science and communication circles is deliberative democracy, which upholds cross-cutting interaction as a core priority (Carpini, Cook, & Jacobs, 2004; Freelon, 2010; Mutz, 2006). This perspective condemns as undemocratic media tools and environments that allow people to read only content that flatters their political preconceptions while screening out dissenting opinions. The fear is that people will use such tools as much as possible, polarizing public opinion and perhaps even rendering civil debate between two opposing sides impossible (Gutmann & Thompson, 2004; Schkade, Sunstein, & Hastie, 2010; Sunstein, 2007).

The deliberative objection can be unpacked into two falsifiable research questions:

To what extent do people with the technical capacity to create personalized information environments use them to create ideologically airtight "echo chambers"?

To what extent do people who use highly polarized information environments exhibit extreme political opinions and intolerance of opposing viewpoints?

Given the topic of this conference, we might also consider a third research question:

How is the use of personalized information environments related to use of and trust in disinformation content?

The remainder of this paper will briefly review the empirical literature addressing these questions and conclude by raising research questions of particular relevance for disinformation research.


Do echo chambers exist?
The foundational theoretical work in this area made several key assumptions, among them 1) that it is possible to construct hermetically-sealed information bubbles through which no (or almost no) contrary views can penetrate; and 2) that given a choice between such an environment and a more ideologically balanced information diet, most people would prefer the former (Pariser, 2011, pp. 12, 18; Sunstein, 2007, p. 1). Most empirical work on the topic suggests that neither of these assumptions generally holds. Instead, people tend to exhibit moderate to strong preferences for ideologically-consistent content while failing to routinely reject content from the other side (Bakshy, Messing, & Adamic, 2015; Barberá, Jost, Nagler, Tucker, & Bonneau, 2015; Dvir-Gvirsman, Tsfati, & Menchen-Trevino, 2016; Garrett, 2009; Gentzkow & Shapiro, 2011; Hanna et al., 2013; Hargittai, Gallo, & Kane, 2008; Kobayashi & Ikeda, 2009). Some studies have found differences in the degrees of ideological self-segregation between liberals and conservatives (Barberá et al., 2015; Boutyline & Willer, 2017; Hanna et al., 2013) as well as between political issues (Bakshy et al., 2015; Barberá et al., 2015; Himelboim, McCreery, & Smith, 2013). Overall, the information environments most people choose might look more slanted than Sunstein and Pariser would prefer, but they are less so than predicted.

Do personalized information environments polarize?
We might dismiss the requirement that personalized information environments completely filter out all opinion-inconsistent content as a strawman. It makes more sense to consider the degree of ideological bias: a completely biased environment would fulfill the criteria of an echo chamber, but these are rare if they exist at all. That said, it is still possible that opinion extremity or polarization is correlated with the degree of one-sidedness of one's information environment. The relevant research tends to support this prediction (Garrett et al., 2014; Knobloch-Westerwick, 2012; Levendusky, 2013a, 2013b; Stroud, 2010), although at least one study showed the opposite: that individuals with ideologically diverse social networks hold more polarized feelings toward Democrats and Republicans and identify more strongly as liberal or conservative (Lee, Choi, Kim, & Kim, 2014). That single study notwithstanding, a firm conclusion based on the preponderance of the empirical evidence would be that the more one's media environment looks like an echo chamber, the more extreme one's viewpoints are. There is currently no consensus on the causal direction of this relationship.

Methodological challenges
The original conceptual descriptions of personalized information environments offered specific sites like Google and Reddit as examples, but few people rely on a single service for all the information they consume. If the chief concern is the diversity of people's overall information diets, all possible media types should ideally be considered, whether TV, radio, print, or digital. For obvious reasons, this presents formidable methodological challenges—tracking one person's complete media intake over time is difficult enough, and most researchers would want to track multiple individuals. To my knowledge, no study has yet been able to accomplish this in a rigorous way, although Dvir-Gvirsman et al. (2016) come the closest for online content sources. Consequently, much of the research that purports to examine personalized information environments analyzes only one platform or medium. While a single social media feed may be considered an information environment unto itself, such research does not directly address the question of overall information diversity on an individual level.

Thus, we should not infer that people whose Twitter communication patterns are highly polarized (for example) necessarily experience a media diet that is highly polarized overall. This is not to say that any of the studies cited above do so, but only to point out that some of them define "echo chamber" at the platform level rather than the individual level. Such research is nevertheless still useful, as it indicates whether specific platforms could contribute to individual-level polarization.

Possible effects on disinformation
While misinformation in the sense of unintentional false beliefs has been a long-standing concern of communication research (see Bode & Vraga, 2015; Nyhan, 2010; Weeks, 2015), studies of maliciously-distributed disinformation are much rarer (exceptions include Lewandowsky, Stritzke, Freund, Oberauer, & Krueger, 2013; and Martin, 1982). Studies that analyze the new wave of political disinformation are just now beginning to emerge (Faris et al., 2017; Lazer et al., 2017; Vargo, Guo, & Amazeen, 2017). Thus far, few if any empirical studies have directly addressed the consequences of polarized information environments for disinformation and its effects. However, the following topics would be worth exploring given what we already know:

Ideological asymmetries in the consumption, distribution, and acceptance of disinformation. The following findings support the prediction that conservatives will engage with disinformation substantially more than liberals (and perhaps also centrists):

News fabricators have anecdotally suggested that right-leaning individuals in general, and Trump supporters in particular, are disproportionately susceptible to opinion-reinforcing disinformation online (Dewey, 2016; Sydell, 2016).

An informal analysis by the publication PCWorld found that a Facebook account constructed to appear conservative attracted ten fabricated stories, while one constructed to appear liberal attracted none (Hachman, 2017).

Conservatives have traditionally distrusted mainstream news more than liberals, and this gap has widened in recent years (Barthel & Mitchell, 2017; Swift, 2016). This may drive them toward friendly disinformation sources that lack the perceived bias of traditional news outlets.

Distrust in factual claims generally. Conservative distrust for news long predates the current wave of disinformation. But disinformation itself may drive people to adopt distrust as a default position for novel claims about the world, or at least to disbelieve them more than otherwise. Consider:

Corporate purveyors of disinformation, including the tobacco and oil industries, successfully sowed public doubt about political issues affecting their profits for decades (Oreskes & Conway, 2010).

A 2017 Buzzfeed/Ipsos poll found that 54% of respondents distrust news on Facebook, in part because it has failed to stop disinformation from spreading (Silverman, 2017).


In July 2017, only 45% of Trump voters believed Donald Trump Jr. met with a Russian lawyer to seek damaging information on Hillary Clinton before the 2016 election, even though he publicly admitted to doing so (Durkheimer, 2017).

Misuse and abuse of the rhetoric of "critical thinking." Many purveyors and consumers of disinformation claim to be critical thinkers. This may reduce the efficacy of critical thinking-based solutions to disinformation, especially in polarized information environments.

Some of the best-known proponents of the flat-earth misbelief claim to "look at things objectively with a critical eye" (Ambrose, 2017).

A January 2010 print ad for the media outlet Russia Today compared the feasibility of anthropogenic climate change to that of space aliens. The ad's fine print encouraged readers to "question more" and "challeng[e] the accepted view" ("Russia Today," 2010).

Some teens consider their high level of trust in the information quality of Google's top search results as a form of critical thinking (boyd, 2017).

References

Ambrose, G. (2017, July 7). These Coloradans say Earth is flat. And gravity's a hoax. Now, they're being persecuted. Retrieved September 29, 2017, from http://www.denverpost.com/2017/07/07/colorado-earth-flat-gravity-hoax/

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132.

Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A., & Bonneau, R. (2015). Tweeting from left to right: Is online political communication more than an echo chamber? Psychological Science, 26(10), 1531–1542.

Barthel, M., & Mitchell, A. (2017, May 10). Americans' Attitudes About the News Media Deeply Divided Along Partisan Lines. Retrieved September 28, 2017, from http://www.journalism.org/2017/05/10/americans-attitudes-about-the-news-media-deeply-divided-along-partisan-lines/

Boutyline, A., & Willer, R. (2017). The social structure of political echo chambers: Variation in ideological homophily in online networks. Political Psychology, 38(3), 551–569.

boyd, danah. (2017, January 5). Did Media Literacy Backfire? Retrieved September 29, 2017, from https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d

Carpini, M. X. D., Cook, F. L., & Jacobs, L. R. (2004). Public deliberation, discursive participation, and citizen engagement: A review of the empirical literature. Annu. Rev. Polit. Sci., 7, 315–344.

Dewey, C. (2016, November 17). Facebook fake-news writer: 'I think Donald Trump is in the White House because of me.' Retrieved September 28, 2017, from https://www.washingtonpost.com/news/the-intersect/wp/2016/11/17/facebook-fake-news-writer-i-think-donald-trump-is-in-the-white-house-because-of-me/

Durkheimer, M. (2017, July 27). "Fake News" Leads To No News For The Trump Base. Retrieved September 29, 2017, from https://www.forbes.com/sites/michaeldurkheimer/2017/07/27/fake-news-leads-to-no-news-for-the-trump-base/

Dvir-Gvirsman, S., Tsfati, Y., & Menchen-Trevino, E. (2016). The extent and nature of ideological selective exposure online: Combining survey responses with actual web log data from the 2013 Israeli Elections. New Media & Society, 18(5), 857–877. https://doi.org/10.1177/1461444814549041

Faris, R. M., Roberts, H., Etling, B., Bourassa, N., Zuckerman, E., & Benkler, Y. (2017). Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election. Berkman Klein Center for Internet & Society.

Freelon, D. (2010). Analyzing online political discussion using three models of democratic communication. New Media & Society, 12(7), 1172–1190.

Garrett, R. K. (2009). Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 14(2), 265–285.

Garrett, R. K., Gvirsman, S. D., Johnson, B. K., Tsfati, Y., Neo, R., & Dal, A. (2014). Implications of Pro- and Counterattitudinal Information Exposure for Affective Polarization. Human Communication Research, 40(3), 309–332. https://doi.org/10.1111/hcre.12028

Gentzkow, M., & Shapiro, J. M. (2011). Ideological Segregation Online and Offline. The Quarterly Journal of Economics, 126(4), 1799–1839. https://doi.org/10.1093/qje/qjr044

Gutmann, A., & Thompson, D. F. (2004). Why deliberative democracy? Princeton: Princeton University Press.

Hachman, M. (2017, September 7). Just how partisan is Facebook's fake news? We tested it. Retrieved September 28, 2017, from https://www.pcworld.com/article/3142412/windows/just-how-partisan-is-facebooks-fake-news-we-tested-it.html

Hanna, A., Wells, C., Maurer, P., Friedland, L., Shah, D., & Matthes, J. (2013). Partisan Alignments and Political Polarization Online: A Computational Approach to Understanding the French and US Presidential Elections. In Proceedings of the 2nd Workshop on Politics, Elections and Data (pp. 15–22). New York, NY, USA: ACM. https://doi.org/10.1145/2508436.2508438

Hargittai, E., Gallo, J., & Kane, M. (2008). Cross-ideological discussions among conservative and liberal bloggers. Public Choice, 134(1), 67–86.

Himelboim, I., McCreery, S., & Smith, M. (2013). Birds of a Feather Tweet Together: Integrating Network and Content Analyses to Examine Cross-Ideology Exposure on Twitter. Journal of Computer-Mediated Communication, 18(2), 40–60. https://doi.org/10.1111/jcc4.12001

Knobloch-Westerwick, S. (2012). Selective Exposure and Reinforcement of Attitudes and Partisanship Before a Presidential Election. Journal of Communication, 62(4), 628–642. https://doi.org/10.1111/j.1460-2466.2012.01651.x

Kobayashi, T., & Ikeda, K. (2009). Selective exposure in political web browsing. Information, Communication & Society, 12(6), 929–953.

Lazer, D., Baum, M., Grinberg, N., Friedland, L., Joseph, K., Hobbs, W., & Mattsson, C. (2017). Combating Fake News: An Agenda for Research and Action. Shorenstein Center on Media, Politics, and Public Policy.

Lee, J. K., Choi, J., Kim, C., & Kim, Y. (2014). Social Media, Network Heterogeneity, and Opinion Polarization. Journal of Communication, 64(4), 702–722. https://doi.org/10.1111/jcom.12077

Levendusky, M. (2013a). Partisan Media Exposure and Attitudes Toward the Opposition. Political Communication, 30(4), 565–581. https://doi.org/10.1080/10584609.2012.737435

Levendusky, M. (2013b). Why Do Partisan Media Polarize Viewers? American Journal of Political Science, 57(3), 611–623. https://doi.org/10.1111/ajps.12008

Lewandowsky, S., Stritzke, W. G., Freund, A. M., Oberauer, K., & Krueger, J. I. (2013). Misinformation, Disinformation, and Violent Conflict: From Iraq and the "War on Terror" to Future Threats to Peace. American Psychologist, 68(7), 487–501. https://doi.org/10.1037/a0034515

Martin, L. J. (1982). Disinformation: An instrumentality in the propaganda arsenal. Political Communication, 2(1), 47–64. https://doi.org/10.1080/10584609.1982.9962747

Mutz, D. C. (2006). Hearing the other side: Deliberative versus participatory democracy. Cambridge University Press.

Negroponte, N. (1996). Being Digital. New York, NY: Vintage.

Nyhan, B. (2010). Why the "Death Panel" Myth Wouldn't Die: Misinformation in the Health Care Reform Debate. The Forum, 8(1). https://doi.org/10.2202/1540-8884.1354

Oreskes, N., & Conway, E. M. (2010). Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (1 edition). Bloomsbury Press.

Pariser, E. (2011). The Filter Bubble: What the Internet is hiding from you. New York: Penguin.

Russia Today: "Climate Change" Print Ad by McCann London. (2010, January). Retrieved September 29, 2017, from https://www.coloribus.com/adsarchive/prints-outdoor/russia-today-climate-change-14024155/

Schkade, D., Sunstein, C. R., & Hastie, R. (2010). When Deliberation Produces Extremism. Critical Review, 22(2–3), 227–252. https://doi.org/10.1080/08913811.2010.508634

Silverman, C. (2017, April 13). Most Americans Don't Trust The News They See On Facebook, According To A New Survey. Retrieved September 29, 2017, from https://www.buzzfeed.com/craigsilverman/americans-cant-agree-on-what-news-is-on-facebook

Stroud, N. J. (2010). Polarization and Partisan Selective Exposure. Journal of Communication, 60(3), 556–576. https://doi.org/10.1111/j.1460-2466.2010.01497.x

Sunstein, C. (2007). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Swift, A. (2016, September 14). Americans' Trust in Mass Media Sinks to New Low. Retrieved September 28, 2017, from http://news.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx

Sydell, L. (2016, November 23). We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned. Retrieved September 28, 2017, from http://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-finds-the-head-of-a-covert-fake-news-operation-in-the-suburbs

Vargo, C. J., Guo, L., & Amazeen, M. A. (2017). The agenda-setting power of fake news: A big data analysis of the online media landscape from 2014 to 2016. New Media & Society, 1461444817712086. https://doi.org/10.1177/1461444817712086

Weeks, B. E. (2015). Emotions, Partisanship, and Misperceptions: How Anger and Anxiety Moderate the Effect of Partisan Bias on Susceptibility to Political Misinformation. Journal of Communication, 65(4), 699–719. https://doi.org/10.1111/jcom.12164


Making Sense of Information and Judging its Credibility

Natalie Jomini Stroud, University of Texas at Austin
Emily Thorson, Syracuse University
Dannagal Young, University of Delaware

Introduction
Our collective charge was to write about how people make sense of information and judge its credibility. While this brief paper does not cover these topics comprehensively, it does provide summaries of three areas in which we believe interventions could be especially effective at reducing the spread and impact of misinformation. In particular, we discuss the roles of media content and structure, social identity, and information processing styles.

The role of media content and structure
It almost goes without saying that the content of information can affect whether people find it credible. Weak arguments, disreputable sources, poor grammar, and out-of-date formatting are all examples of characteristics that can affect whether people see information as credible. Beyond these predictable instances, however, research has shown a number of other ways in which information can inspire trust, or a lack thereof.

Different types of reporting affect how credible we find news media coverage. For example, discussing politics in terms of winning and losing, as opposed to emphasizing issues, can lead to greater distrust of the media (Cappella & Jamieson, 1997; Hopmann, Shehata, & Strömbäck, 2015). As another example, the presence of analysis and interpretation in news reporting may increase distrust because it offers more points with which an audience can disagree (Jones, 2004). This is even more possible given that this style of analytic journalism has increased over the past century (Barnhurst & Mutz, 1997). As a final example, the prevalence of critiques of the media's credibility – many as reported by the media! – can undermine trust in the media (Watts, Domke, Fan, & Shah, 1999).

Fact-checking, a specific subset of news reporting, also can affect the ways in which people make sense of what is factual and what is not. Three examples help to illustrate this idea. First, fact-checking that comes from an unexpected source (e.g. a Republican calling something untrue that would normally be endorsed by Republicans) can be more effective at reducing misperceptions (Berinsky, 2015). Second, when the content of the fact-check contains images or text that calls to mind the original misinformation, people are less responsive to the fact-check. In one study attempting to correct the false claim that "Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer," including images of the Imam in traditional Arab clothing reduced the power of the corrected information at reducing misperceptions (Garrett, Nisbet, & Lynch, 2013). Third, correcting misinformation is more effective when an alternative cause can be described as opposed to merely saying that the original information is incorrect (Nyhan & Reifler, 2015). For instance, when trying to counter the misperception that vaccines cause autism, it would be more effective to explain what does cause autism than merely stating that there is not an association between the two.

45 Stroud, Thorson, and Young than merely stating that there is not an way to think about how people make sense association between the two. of information and judge its credibility.

The sheer amount of information also can The role of social identity affect how people make sense of it. When How we see ourselves, as well as how we given two pieces of information, one that want others to see us, both profoundly shape reflects a person’s belief (e.g. a pro-life what information we choose to read, believe, article and a person is pro-life) and one that and share. Even partisan-driven motivated articulates a different point of view (e.g. a reasoning is in many ways a social pro-choice article and a person is pro-life), phenomenon – we seek out information that people are more likely to look at the affirms our social identity as a partisan and counter-attitudinal information. When given avoid information that undermines it ten pieces of information, however, people (Bennett 2012; Green, Palmquist, & select a higher percentage of pro- than Schickler, 2004). In an era of social media, counter-attitudinal information (Fischer, the role of social identity in information Shulz-Hardt, & Frey, 2008). One reason for choice is magnified because so many of this pattern is that when people encounter these choices are public. Twitter and more information, they decide what to look Facebook provide countless opportunities to at based on its perceived quality. And it just signal to our social networks – via liking, so happens that information judged to be of retweeting, sharing, or posting – what high quality tends to agree with our beliefs. content we endorse or oppose (Shin & So when people are inundated with Thorson 2017). information, pro-attitudinal information that is judged as higher quality and is more likely Much existing research on social identity to be selected. and information consumption has focused on how social cues affect what people read, The structures that we put into place also process, and recall. For example, Messing influence the information that we encounter, and Westwood (2014) find that social and help us to tame the deluge of endorsements are extraordinarily powerful information (Neuman, 2016). Search in directing audience attention to a news engines and news aggregators have become story – even more so, in fact, than the increasingly popular places for gaining partisanship of the news source. And when it information. For some, algorithmic news comes to misinformation and rumors, social selection is trusted due to “the perceived signals may carry an outside importance independence of algorithms, which most because other cues that people rely on to considered to be less influenced by political determine credibility (for example, and editorial agendas than journalists” authoritative sources) are often absent (Kantar Media, 2016, p. 64). New structures (Metzger, Flanagin, & Medders 2010). can reshape how the public determines what However, for the purposes of designing is credible. interventions to stop the spread of misinformation, leveraging social identity As this brief review demonstrates, the may be more effective at the point of information itself can signal trustworthiness distribution rather than at the point of and correct misinformation more or less reception. Because most information online effectively. Because information can be spreads through relatively few powerful tailored, we propose this as one important nodes, interventions that discourage those
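To make that intuition concrete, here is a minimal simulation sketch (in Python, assuming the networkx library is available; the network, seed count, and share probability are invented for illustration and are not parameters from Goel et al.). It spreads a story through a heavy-tailed network twice: once unchecked, and once with the highest-degree "hub" accounts declining to reshare.

    # Sketch: muting a few high-degree hubs shrinks a sharing cascade.
    # All parameters are illustrative assumptions.
    import random
    import networkx as nx

    random.seed(1)
    G = nx.barabasi_albert_graph(n=10000, m=3)  # heavy-tailed degree distribution

    def cascade_size(G, p_share=0.05, muted=frozenset(), n_seeds=50):
        """Independent-cascade spread; 'muted' nodes see content but never reshare."""
        infected = set(random.sample(list(G.nodes), n_seeds))
        frontier = set(infected)
        while frontier:
            newly = set()
            for u in frontier:
                if u in muted:
                    continue  # the intervention: this node declines to share
                for v in G.neighbors(u):
                    if v not in infected and random.random() < p_share:
                        newly.add(v)
            infected |= newly
            frontier = newly
        return len(infected)

    hubs = {n for n, d in sorted(G.degree, key=lambda nd: -nd[1])[:100]}  # top 1% by degree
    print("reach, no intervention:", cascade_size(G))
    print("reach, hubs muted:", cascade_size(G, muted=hubs))

Even though only one percent of nodes are muted, they sit on so many sharing paths that reach in runs of this sketch typically drops sharply, which is the logic behind targeting distribution rather than reception.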

Several factors shape a person's decision about whether or not to publicly share or endorse a piece of information. Some of these factors are relatively stable – for example, people high in conflict avoidance are unlikely to share content they believe will be controversial (Vraga et al., 2015) – but others are more malleable. One area that may be especially ripe for intervention is normative beliefs – specifically, concerns over creating negative impressions. People are less likely to endorse content if they believe that others might disapprove of that endorsement (Marder et al., 2016). Of course, these normative beliefs almost always compete with other motivations. For example, a strong Democrat might simultaneously crave the social reward that comes from sharing an unverified rumor about Donald Trump with her liberal friends while also worrying about the potential backlash if that rumor turns out to be false. Given these competing beliefs, a successful intervention designed to stop the spread of misinformation might take on two tasks. First, it could increase the salience of the normative belief that spreading misinformation will incur negative social consequences. Second, it could emphasize the potential social costs of spreading highly partisan information, potentially by building on a growing body of research suggesting that Americans are increasingly turned off by public displays of partisanship (Klar & Krupnikov, 2016).

The role of information processing styles
Psychological factors play a large role in how people process information. In the context of social media, where an abundance of information in a readily accessible, headline-oriented format is sandwiched between photos of friends' children and videos of panda bears, promoting thoughtful, rational consideration of news content and source credibility is particularly challenging (Ecker et al., 2014). Dual-process models of judgment formation posit that the brain processes information in one of two ways: either centrally (systematically), thoughtfully scrutinizing the arguments and evidence presented, or peripherally (heuristically), based on emotional, visual, and social cues. Whether a person processes content centrally or peripherally is contingent on his or her motivation and ability to do so. The social media environment itself (with huge quantities of bite-sized information, constantly updated in real time) promotes peripheral/heuristic processing by placing constraints on both users' ability and motivation to think carefully about the information presented. As a result, factors such as "who shared it" and "does this make me feel good/right" become more likely to dictate processing goals and dissemination behaviors.

Meanwhile, people with certain psychological traits or worldviews are especially likely to rely on heuristics in deciding what to think about – and do with – news stories. Traits such as conspiratorial ideation (Uscinski, Klofstad, & Atkinson, 2016), as well as specific epistemic beliefs (faith in one's own intuition and belief that facts are politically constructed) (Garrett & Weeks, forthcoming), both of which transcend political ideology, are especially predictive of one's likelihood of embracing misinformation as true.

Yet the impact of these traits is not fixed. The tendency to reflexively embrace conspiracy theories or judge stimuli based on "gut instinct" alone can be altered through thoughtful interventions and informational cues (Ecker et al., 2010; Lewandowsky et al., 2012; Tully & Vraga, 2017; Uscinski et al., 2016). Tagging information as problematic, unverified, or stemming from an unreliable source increases users' motivation to centrally process the message rather than relying on heuristics and biases. As Garrett and Weeks (forthcoming) suggest, reminding audiences about the importance of evidence and the risks of being guided by feelings might be particularly effective when dealing with audiences who place faith in their own intuition and see truth as socially constructed.

Discussion
There is no single intervention that can solve the problems created by the spread of false information in politics. But decades of research into how people attend to and evaluate information can shed light on how and why people process, believe, and share misinformation. By building upon this research to design effective, evidence-based interventions, we believe it is possible to substantially reduce the impact of misinformation on political beliefs and attitudes.

References
Barnhurst, K. G., & Mutz, D. (1997). American journalism and the decline in event-centered reporting. Journal of Communication, 47, 27-53.

Bennett, W. L. (2012). The personalization of politics: Political identity, social media, and changing patterns of participation. The ANNALS of the American Academy of Political and Social Science, 644(1), 20-39.

Berinsky, A. J. (2015). Rumors and health care reform: Experiments in political misinformation. British Journal of Political Science, 47, 241-262.

Brotherton, R., & Eser, S. (2015). Bored to fears: Boredom proneness, paranoia, and conspiracy theories. Personality and Individual Differences, 80, 1-5.

Brotherton, R., French, C. C., & Pickering, A. D. (2013). Measuring belief in conspiracy theories: The generic conspiracist beliefs scale. Frontiers in Psychology, 4.

Cappella, J. N., & Jamieson, K. H. (1997). Spiral of cynicism: The press and the public good. New York: Oxford University Press.

Dagnall, N., Drinkwater, K., Parker, A., Denovan, A., & Parton, M. (2015). Conspiracy theory and cognitive style: A worldview. Frontiers in Psychology, 6(206).

Ecker, U. K. H., Lewandowsky, S., Chang, E. P., & Pillai, R. (2014). The effects of subtle misinformation in news headlines. Journal of Experimental Psychology: Applied, 20(4), 323-335.

Ecker, U. K. H., Lewandowsky, S., & Tang, D. T. W. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38, 1087-1100.

Fischer, P., Schulz-Hardt, S., & Frey, D. (2008). Selective exposure and information quantity: How different information quantities moderate decision makers' preference for consistent and inconsistent information. Journal of Personality & Social Psychology, 94, 231-244.

Garrett, R. K., Nisbet, E. C., & Lynch, E. K. (2013). Undermining the corrective effects of media-based political fact checking? The role of contextual cues and naïve theory. Journal of Communication, 63(4), 617-637.

Garrett, R. K., & Weeks, B. E. (forthcoming). Epistemic beliefs' role in promoting misperceptions and conspiracist ideation. Proceedings of the National Academy of Sciences.

Hopmann, D. N., Shehata, A., & Strömbäck, J. (2015). Contagious media effects: How media use and exposure to game-framed news influence media trust. Mass Communication and Society, 18, 776-798.

Jones, D. A. (2004). Why Americans don't trust the media. The International Journal of Press/Politics, 9, 60-75.

Kantar Media (2016). Brand and trust in a fragmented news environment. Report prepared for the Reuters Institute for the Study of Journalism.

Klar, S., & Krupnikov, Y. (2016). Independent politics. Cambridge University Press.

Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS ONE, 8(10), e75637.

Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

Marder, B., Slade, E., Houghton, D., & Archer-Brown, C. (2016). I like them, but won't 'like' them: An examination of impression management associated with visible political party affiliation on Facebook. Computers in Human Behavior, 61, 280-287.

Messing, S., & Westwood, S. J. (2014). Selective exposure in the age of social media: Endorsements trump partisan source affiliation when selecting news online. Communication Research, 41(8), 1042-1063.

Metzger, M. J., Flanagin, A. J., & Medders, R. B. (2010). Social and heuristic approaches to credibility evaluation online. Journal of Communication, 60(3), 413-439.

Neuman, W. R. (2016). The digital difference: Evolving media technology and the theory of communication effects. Cambridge, MA: Harvard University Press.

Nyhan, B., & Reifler, J. (2015). Displacing misinformation about events: An experimental test of causal corrections. Journal of Experimental Political Science, 2(1), 81-93.

Shin, J., & Thorson, K. (2017). Partisan selective sharing: The biased diffusion of fact-checking messages on social media. Journal of Communication.

Tully, M., & Vraga, E. K. (2017). Effectiveness of a news media literacy advertisement in partisan versus nonpartisan online media contexts. Journal of Broadcasting & Electronic Media, 61(1), 144-162.

Uscinski, J. E., Klofstad, C., & Atkinson, M. D. (2016). What drives conspiratorial beliefs? The role of informational cues and predispositions. Political Research Quarterly, 69(1), 57-71.

Vraga, E. K., Thorson, K., Kligler-Vilenchik, N., & Gee, E. (2015). How individual sensitivities to disagreement shape youth political expression on Facebook. Computers in Human Behavior, 45, 281-289.

Watts, M. D., Domke, D., Shah, D. V., & Fan, D. P. (1999). Elite cues and media bias in presidential campaigns: Explaining public perceptions of a liberal press. Communication Research, 26, 144-175.


Misinformation and Identity-Protective Cognition

Dan M. Kahan, Yale University

Abstract: This paper synthesizes existing work on misinformation relating to policy-relevant facts. It argues that misinformation has the greatest power to mislead when it interacts with identity-protective cognition.

Introduction: IPC × misinformation
This paper furnishes a compact (even sub-compact[i]) synthesis of the literature on miscommunication of decision-relevant science. The incidence and effect of such information, the paper argues, is conditional on identity-protective cognition on the part of its consumers (Nyhan & Reifler 2015; Flynn, Nyhan & Reifler 2017).[ii] The paper thus begins with a short discussion of identity-protective cognition before addressing the psychological dynamics of misinformation.

What is IPC?
Identity-protective cognition (IPC) is a species of motivated reasoning. Motivated reasoning refers to the unconscious tendency of people to selectively credit and dismiss factual information in patterns that promote some goal or interest independent of the truth of the asserted facts. When that goal or interest is the protection of one's status within an important affinity group, the information consumer can be said to be displaying IPC (Sherman & Cohen 2006).

Empirical studies of the role IPC plays in public conflicts over science abound (Flynn et al. 2017; Kahan 2010). In one study (Kahan, Jenkins-Smith & Braman 2011), for example, subjects evaluated whether three scientists of impeccable credentials were knowledgeable and trustworthy "experts" on climate change, gun control, and nuclear waste disposal, respectively. The subjects' evaluations, it turned out (e.g., Figure 1), depended strongly on the relationship between the scientists' (experimentally manipulated) positions on those issues and the subjects' own cultural commitments: if a scientist's position matched the one that prevailed within the subjects' cultural community, he was deemed an expert; if not, then not.

If this is a good model of how people identify scientific "experts" in the real world, then we should expect people of diverse cultural outlooks to be as polarized over what expert consensus is as they are over what the facts on politically charged issues are. This turns out to be true, as the same study and countless others (e.g., Funk & Kennedy 2016) have shown.


Figure 1. Identity-protective reasoning in recognition of expertise. Adapted from Kahan, Jenkins-Smith & Braman (2011). Study subjects view a highly credentialed scientist as an "expert" on climate change much less readily when the scientist takes the position opposed to, rather than supportive of, the one that is dominant in their political group. Colored bars reflect the 0.95 level of confidence.

Surprisingly, IPC has been found to intensify, not abate, as members of the public become more proficient in the forms of critical reasoning essential to science comprehension (Giner-Sorolla & Chaiken 1997; Kahan 2016). Numeracy is an aptitude to reason well with quantitative information and to draw appropriate inferences from data (Peters et al. 2006). Thus, the individuals who are highest in numeracy are the best able to recognize whether evidence from a controlled experiment displays the pattern of covariance that supports or negates a hypothesis—unless that evidence relates to a politically charged issue (e.g., gun control). In that case (Figure 2), the most numerate people are even more likely than the least numerate ones to construe such evidence as supporting the factual beliefs that prevail among people who share their political identity, no matter what its true import (Kahan, Peters, Dawson & Slovic 2017).

If this is how people reason outside the lab, then we should expect people who are the most scientifically literate to be the most polarized on disputed forms of decision-relevant science. They are (Kahan 2015).


Figure 2. Responses by subjects of opposing cultural outlooks. Locally weighted regression lines (Cohen et al., 2003) track the proportion of subjects solving the problem correctly in relation to numeracy levels in the various experimental conditions. Blue lines plot relationships for subjects who score below the mean, and red ones for subjects who score above the mean, on Conserv_Repub, the composite measure based on liberal–conservative ideology and identification with one or the other major party. Solid lines are used for subjects in the conditions in which the data, properly interpreted, support the inference that either skin rashes or crime decreased in response to an experimental treatment; dashed lines are used for subjects in conditions in which the data, properly interpreted, support the inference that either skin rashes or crime increased.


IPC and dynamics of misinformation
Social scientists have documented a wide array of misinformation dynamics. Practically all of them bear the signature of IPC. Here are a few examples.

Self-misinformation. To begin, the consequences of misinformation must be assessed in relation to the disposition of culturally diverse citizens to misinform themselves. Consider the two experiments discussed so far. In the scientific-consensus experiment (Kahan et al. 2011), no misinformation was necessary to induce subjects to reason in a manner that thwarted appropriate updating of their perceptions of what expert scientists believe. The same goes for the covariance-detection experiment: high-numeracy subjects were lulled into error by their own predispositions—not by the importuning of misinformers (Kahan et al. 2017). There is thus good reason to believe that even without significant quantities of misinformation, we would still see substantial polarization over decision-relevant science relating to culturally contested issues.

Information search. There is definitely a correlation between exposure to deceptive information in the media and being misinformed on a variety of consequential issues (e.g., Feldman et al. 2012). But in which direction does causation run? A considerable body of research concludes that people's cultural and political predispositions are the source, not the outcome, of the information they consume. Identity-protection, not correctness, is their goal here as well: armed with evidence, people are less vulnerable to succumbing to opposing arguments (e.g., Arceneaux & Johnson 2013).


Biased assimilation and polarization. Biased assimilation refers to the tendency of people to more readily credit information that reinforces their existing factual beliefs on controversial issues (e.g., the deterrent effect of capital punishment), thereby magnifying the gap between those on both sides of a dispute (Lord, Ross & Lepper 1979).

What IPC adds is an account of the advent, direction, and incidence of this information-processing distortion. Biased assimilation doesn't generate a clean prediction, for example, on whether and how individuals will react to information on a novel risk source (such as nanotechnology [Kahan, Braman, Slovic, Gastil & Cohen 2009] or the HPV vaccine [Kahan, Braman, Cohen, Gastil & Slovic 2010]). IPC, however, correctly predicts that even in this context members of opposing cultural groups will polarize (or not) based on tacit cues, such as in-group and out-group conflict (Kahan, Braman et al. 2010) or exposure to social memes that fuse the new risk source to existing ones on which there is already cultural polarization (Kahan, Jamieson, Landrum & Winneg 2017).

Biased assimilation also offers no explanation of why people will sometimes change their minds about policy-consequential facts. IPC does. The answer is not exposure to new, fact-laden, empirical arguments but rather personal observation of behavior by members of their own cultural group. People evince confidence in decision-relevant science when they rely on it to make important decisions. Such behavior vouches not only for the safety (or danger) of a putative risk source (novel or well-established) but also for the safety (or not) of holding the new position within their particular affinity group (Kahan 2015).
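The logic of differential crediting can be made concrete with a toy Bayesian illustration (far simpler than, though in the spirit of, the belief-polarization modeling of Cook & Lewandowsky 2016, cited below; all numbers here are invented). Two groups see the same five expert reports asserting a proposition H, but because they assign the expert different credibility, their beliefs diverge rather than converge:

    # Toy illustration: identical evidence + different perceived source
    # credibility -> polarization. Parameters are invented.
    def update(p_h, credibility):
        """One Bayesian update on a report asserting H.

        credibility = P(report asserts H | H) = P(report denies H | not H).
        """
        numer = credibility * p_h
        return numer / (numer + (1 - credibility) * (1 - p_h))

    p_a = p_b = 0.5            # both groups start undecided about H
    cred_a, cred_b = 0.8, 0.3  # group B deems the expert untrustworthy (< 0.5)

    for _ in range(5):         # five identical reports asserting H
        p_a = update(p_a, cred_a)
        p_b = update(p_b, cred_b)

    print(round(p_a, 3), round(p_b, 3))  # about 0.999 vs 0.014

If cultural identity sets the credibility weight, formally "rational" updating on identical information pulls the two groups apart.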

Retrenchment and Backfiring. Attempts to correct factual misimpressions not only fail to move individuals. Perversely, the correction itself is viewed as certifying that the facts being challenged are within the constellation of beliefs that divide opposing groups. Responding to this cue, those who initially credited the misinformation for identity-protective purposes only dig in deeper when admonished that the beliefs in question are false (Nyhan, Reifler & Ubel 2013). Such "backfiring" has been observed to influence mass opinion on a diverse set of issues, from the safety of vaccines (Nyhan, Reifler, Richey & Freed 2014) to the existence of weapons of mass destruction in Iraq (Nyhan & Reifler 2010) to the consensus of expert views on climate change (Cook & Lewandowsky 2016).

What about "fake news"?
The general election of 2016 was inundated with instances of "fake news"—online fictional accounts disguised to resemble news stories published in legitimate newspapers and like media. Reasoning axiomatically from general dynamics of belief formation, commentators have tended to identify public consumption of "fake news" as an important part of Donald Trump's victory over Hillary Clinton in the electoral contest for President of the United States (e.g., Parkinson 2016).

But the only empirical study published so far casts considerable doubt on this conclusion. Combining an internet database of fake news stories with survey respondents' recollection of having seen particular articles, Allcott and Gentzkow (2017) report that individuals were much more likely to represent that they had read, agreed with, and shared articles that were favorable toward their preferred political candidate or unfavorable toward that candidate's opponent. This is evidence, then, that consumption of the stories (or story: the average number of fake news articles read by a voter, A&G calculated, was 1) was a consequence, not a cause, of their readers' voting preferences.

Conclusion: Two models and one yuuuuge open question
Consider two models of the nature and impact of misinformation (Figure 3). The first account treats the public as credulous consumers of misinformation manufactured by economic or political interests and directed at the public by a compliant or even complicit media. The second, in contrast, views the public as motivated consumers who consciously seek out information supportive of factual beliefs characteristic of their identity-defining cultural groups. That demand, according to the model, creates opportunities, economic and political, for conflict entrepreneurs to supply misinformation that the public will greedily consume as a result of IPC.

Figure 3. Two models of information processing.

The primary conclusion of this synthesis paper is that the latter, "motivated public" position is much closer to being true than is the former, "credulous public" one. Only the "motivated public" account explains the interaction between misinformation and the panoply of reasoning defects associated with identity-protective reasoning (IPC)—from self-deception to biased assimilation, from motivated system 2 reasoning to cultural backlash effects. It is this link to IPC that enables the "motivated public" account to explain why misinformation tends to amplify polarization rather than increase public misunderstandings generally.

The "motivated public" account also highlights the most important missing piece of existing research: how and why particular positions on disputed facts acquire the antagonistic social meanings that turn them into badges of group identity (Lessig 1995; Kahan, Jamieson et al. 2017). Along with accident and misadventure, misinformation no doubt plays a role. But for this purpose, misinformation that takes aim only at the facts that underlie a disputed political issue will not be particularly effective; what will be is misinformation that implies the issue is one that already starkly divides members of opposing identity-defining groups; that's the message that has the greatest potential to spark the sort of group-status conflict that disables the capacity of diverse citizens to recognize what science knows. Not until that sort of information is brought under control will it be possible to protect the science communication environment from the particular toxin of IPC.

References
Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31, 211-236.

Arceneaux, K., & Johnson, M. (2013). Changing minds or changing channels? Partisan news in an age of choice. University of Chicago Press.

Cook, J., & Lewandowsky, S. (2016). Rational irrationality: Modeling climate change belief polarization using Bayesian networks. Topics in Cognitive Science, 8, 160-179.

Feldman, L., Maibach, E. W., Roser-Renouf, C., & Leiserowitz, A. (2012). Climate on cable. The International Journal of Press/Politics, 17, 3-31.

Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The nature and origins of misperceptions: Understanding false and unsupported beliefs about politics. Political Psychology, 38, 127-150.

Funk, C., & Kennedy, B. (Oct. 4, 2016). The politics of climate. Pew Research Center.

Giner-Sorolla, R., & Chaiken, S. (1997). Selective use of heuristic and systematic processing under defense motivation. Personality and Social Psychology Bulletin, 23, 84-97.

Kahan, D. (2010). Fixing the communications failure. Nature, 463, 296-297.

Kahan, D. (2017). Misconceptions, misinformation, and the logic of identity-protective cognition. AAAS: Face of Science White Paper.

Kahan, D., Braman, D., Cohen, G., Gastil, J., & Slovic, P. (2010). Who fears the HPV vaccine, who doesn't, and why? An experimental study of the mechanisms of cultural cognition. Law and Human Behavior, 34(6), 501-516.

Kahan, D. M. (2015). Climate-science communication and the measurement problem. Advances in Political Psychology, 36, 1-43.

Kahan, D. M., Braman, D., Slovic, P., Gastil, J., & Cohen, G. (2009). Cultural cognition of the risks and benefits of nanotechnology. Nature Nanotechnology, 4, 87-91.

Kahan, D. M., Jamieson, K. H., Landrum, A., & Winneg, K. (2017). Culturally antagonistic memes and the Zika virus: An experimental test. Journal of Risk Research, 20, 1-40.

Kahan, D. M., Jenkins-Smith, H., & Braman, D. (2011). Cultural cognition of scientific consensus. Journal of Risk Research, 14, 147-174.

Kahan, D. M., Peters, E., Dawson, E. C., & Slovic, P. (2017). Motivated numeracy and enlightened self-government. Behavioural Public Policy, 1, 54-86.

Lessig, L. (1995). The regulation of social meaning. University of Chicago Law Review, 62, 943-1045.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: Effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37, 2098-2109.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303-330.

Nyhan, B., & Reifler, J. (2015). The roles of information deficits and identity threat in the prevalence of misperceptions.

Nyhan, B., Porter, E., Reifler, J., & Wood, T. (2017). Taking corrections literally but not seriously? The effects of information on factual beliefs and candidate favorability. Available at https://ssrn.com/abstract=2995128.

Nyhan, B., Reifler, J., & Ubel, P. A. (2013). The hazards of correcting myths about health care reform. Medical Care, 51, 127-132.

Parkinson, H. J. (2016). Click and elect: How fake news helped Donald Trump win a real election. The Guardian, available at https://goo.gl/bL6Ww8.

Peters, E., Västfjäll, D., Slovic, P., Mertz, C. K., Mazzocco, K., & Dickert, S. (2006). Numeracy and decision making. Psychological Science, 17, 407-413.

Sherman, D. K., & Cohen, G. L. (2006). The psychology of self-defense: Self-affirmation theory. Advances in Experimental Social Psychology, 38, 183-242.

Notes
[i] For something approaching a mid-size account, see Kahan (2017).
[ii] If you want a luxury-sedan sized account of the sources and consequences of political misinformation, then get hold of these (e.g., Flynn, Nyhan & Reifler 2017; Nyhan, Porter et al. 2017) and other papers by Nyhan and his collaborators.


Using Behavioral Theory to Curb Misinformation Sharing

Brian Southwell, RTI International; Duke University; University of North Carolina
Vanessa Boudewyns, RTI International

How can we stop the spread of misinformation? That question implies we have reason to be concerned about audience encounters with misinformation, a claim for which we have justification (Southwell & Thorson, 2015; Southwell, Thorson, & Sheble, 2018). Perhaps less obvious in that question, but nonetheless crucial, is exactly what spreading entails, including processes of human decision-making and behavior. Although automated social media accounts (or bots) discover and repost false material online (Maréchal, 2016), and algorithm-based approaches to countering misinformation offer some potential for remedy (e.g., Nguyen et al., 2013, or Bode & Vraga, 2015), the problem of misinformation spreading via electronic media outlets and through social networks is attributable to human behavior. Decades of academic concern with rumor suggest that the existence of falsehood alone is not solely responsible for false information diffusion, e.g., Allport and Postman (1947), Rosnow (1991), and Weeks and Southwell (2010); human psychology interacts with an information environment to disseminate falsehoods. If we consider conversation (and information exchange more broadly) between people to be fundamentally an observable behavioral phenomenon constrained by human motivations and skills and with important consequences (Southwell & Yzer, 2007), then misinformation sharing should be a function of human hopes and desires. By foregrounding misinformation sharing as behavior, we can both articulate important distinctions in the circumstances in which sharing occurs and outline considerations for future intervention.

A behavioral approach to understanding misinformation sharing highlights at least three important variables with predictive and ethical consequence: awareness (of misinformation as such), ability (to share), and motivation (to act). Available literature suggests important limitations in human awareness of information accuracy, highlighting the potential for some unintentional spreading to occur and the potential utility of improving awareness. Considerations of how we can share information point to important roles for one's facility with information sharing technologies and circumstances (including emotionally-charged situations). Lastly, we can understand volitional sharing of misinformation through the lens of extant behavioral theory that has been widely applied to a range of civic and health-related behaviors, a move that highlights a path for future intervention involving more than simple awareness-raising.

Awareness and detection of misinformation
If we plan to ask people to opt not to share inaccurate information, we first must assume their ability to recognize the existence of misinformation. Our task is inherently limited in light of the available cognitive psychology literature. As Southwell and Thorson (2015) argued, we can trace questions about our ability to detect misinformation back at least hundreds of years, to a debate that unfolded across the roughly contemporaneous lives of the philosophers Descartes and Spinoza as they disagreed about what happens when people encounter false information (or any information). Rather than accepting Descartes' notion that people screen out false information at the point of exposure, Spinoza argued that people accept information by default and verify or reject it in a subsequent process. Empirical support for Spinoza's account has accumulated in recent decades (e.g., Asp et al., 2012; Gilbert, Krull, & Malone, 1990; Skurnik, Yoon, Park, & Schwarz, 2005), suggesting that people encode all new information as if it were true, even if only momentarily, and later tag the information as being either true or false. Despite this evidence of audience processing, Boudewyns and colleagues (2018) have noted that past research in domain-specific literatures on topics such as consumer advertising largely elides the question of individual awareness of false claims. The health advertising literature, for example, largely has focused on documenting problematic claims more than on audience awareness of false or misleading claims, per se. Such emphasis on the prevalence of misinformation to the exclusion of audience awareness research is unwarranted.

The pre-existing belief systems and social networks of audience members also constrain rejection of misinformation. Humans are vulnerable to deception in advertising, news stories, and even in interpersonal contexts in part because of our information processing tendencies, e.g., see Gilbert et al. (1993) for discussion. People tend to believe information that is compatible and consistent with their current beliefs, if the individual elements which comprise the information are coherent and without internal contradictions, if the source is perceived to be credible, and if there is a perceived social consensus (Lewandowsky et al., 2012). Familiarity with a topic and a predisposition to scrutinize and distrust information may offer some protection, but assessing the credibility of information is not an easy task, especially if information is shared by people in your network (Benkler et al., 2017). People also can misattribute information sources when recalling past encounters with information, e.g., Auslander et al. (2017). Being connected largely to like-minded people can lead to echo chambers and filter bubbles that undermine identification of falsehoods (Pariser, 2011).

The various means of sharing
Consider a false news story about a public official being an alcoholic, planted by people opposing that official. Drawing on a recent catalogue of information-sharing behaviors (Southwell, 2013), there are various ways that any one person can share aspects of that story with others. A person could talk with friends, simply forward the story to another by e-mail, or post it on a bulletin board. A person might publicly protest or denigrate the story, either by declaring the presumed true story to be offensive to society or by claiming the person writing the story was lying; either way, the protest would share a form of the story with other people. A person also could coopt the original story to create a revised or new message; an evening talk show host making a joke or a person writing satire inspired by the story would further spread the story in a sense. Technology also now facilitates overt endorsement of a story; consider the explicit acknowledgement that a person approves a message by "liking" a post on Facebook or otherwise noting their endorsement on a web site by clicking on an icon.

Outlining these various forms of information sharing underscores the potential for unintentional sharing. Relatively inept users of social media technology, for example, might accidentally share misinformation because of an errant click. On a different plane, a sarcastic comment to another person to mock a story known to be false nonetheless might raise awareness of the misinformation in a way not considered deeply by the speaker. New technologies also have facilitated information sharing and have introduced potential influences on misinformation sharing. With sharing now possible with a wide network almost immediately after reading content online, emotional arousal may now play a crucial role, in that we have lowered the threshold of skills and ability necessary for sharing: misinformation spread is possible with an anxious tap on a keyboard.

Misinformation sharing as human behavior
Following Southwell and Yzer's (2007) conceptualization of conversation and Cappella's (1987) definition of interpersonal interaction, we contend that misinformation sharing is an observable behavioral phenomenon involving multiple people, constrained by human motivations and skills, and with important consequences. That framing sets the stage for future research involving beliefs and perceived incentives about information sharing that could extend beyond studies on public understanding of particular facts.

What are the beliefs and perceived incentives that appear to drive information and misinformation sharing? Available literature on news sharing and word-of-mouth diffusion offers useful evidence (Lee & Ma, 2012; Berger, 2014; Chen et al., 2015). Berger (2014), for example, suggests that word of mouth serves five key functions: impression management, emotion regulation, information acquisition, social bonding, and persuasion. These mirror beliefs that people can hold about sharing misinformation. For example, people may think that sharing misinformation will generate social support or help them take vengeance (an aspect of emotion regulation). People may think sharing misinformation will help to communicate a specific identity (which relates to impression management) or reinforce shared views (social bonding). Sharing misinformation might even help a person gain new information from others. Each of these factors could encourage misinformation sharing even in situations in which a person understands the particular information they are sharing is false.

A range of theoretical frameworks that explain and predict social behavior are useful here. The Reasoned Action approach to predicting and explaining behavior, for example, holds important implications for our specific quest to understand the underlying drivers behind a person's intentions to share potentially false information (Fishbein & Ajzen, 2010; Fishbein & Cappella, 2006; Fishbein & Yzer, 2003). A principal tenet of that approach – that intentions most proximally influence behavior – suggests that for behavior to change, interventionists must target precursors of intention, including attitudes concerning the behavior (e.g., approval, disapproval, neutrality, disinterest), beliefs about subjective norms concerning the behavior (how do others think one should respond?), and perceptions about one's confidence in performing the behavior. The more a person believes that sharing misinformation will result in positive consequences, that people who are important to him or her think he or she should share the misinformation (and that the norm is to share such information), and that they are confident in their ability to share the misinformation, the more likely they are to do so. Conversely, conscious effort to avoid sharing is likely a function of a similar array of perceptions.
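To illustrate how those precursors might be formalized, here is a toy expectancy-value sketch in Python (the weights, scales, and example beliefs are invented for illustration; actual Reasoned Action models estimate such quantities from survey data, per Fishbein & Ajzen, 2010):

    # Toy Reasoned Action sketch: intention to share modeled as a weighted
    # sum of attitude, perceived norm, and perceived behavioral control.
    def intention_to_share(behavioral_beliefs, normative_beliefs, control_beliefs,
                           w_att=0.5, w_norm=0.3, w_control=0.2):
        # attitude: belief strength x evaluation of each expected outcome
        attitude = sum(s * e for s, e in behavioral_beliefs)
        # perceived norm: belief strength x motivation to comply, per referent
        norm = sum(s * m for s, m in normative_beliefs)
        # perceived control: average confidence in being able to share
        control = sum(control_beliefs) / len(control_beliefs)
        return w_att * attitude + w_norm * norm + w_control * control

    # (strength, evaluation) pairs on illustrative -3..+3 scales
    eager = intention_to_share(
        behavioral_beliefs=[(3, 2), (2, 3)],  # "venting feels good", "signals identity"
        normative_beliefs=[(3, 2)],           # "my friends expect me to share this"
        control_beliefs=[3, 3],               # one-tap resharing: high confidence
    )
    reluctant = intention_to_share(
        behavioral_beliefs=[(3, -2)],         # "could embarrass me if it's false"
        normative_beliefs=[(2, -1)],          # "others would disapprove"
        control_beliefs=[3, 3],
    )
    print(eager, reluctant)  # higher score -> stronger intention to share

An intervention, on this account, works by shifting the inputs (for example, raising the salience of anticipated disapproval) rather than by simply exhorting people not to share.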

Environmental and intrapersonal variables beyond individual beliefs also are noteworthy. Misinformation must be available in some form to be shared; the environmental prevalence of misinformation is a precondition for its spread (unless a person simply generates misinformation on their own). In addition, we must also consider individual variation (between people, and across time within a person) in emotion and physiology. Misinformation sharing may help a person to regulate their emotions (Berger, 2014) and might also prompt certain emotions, like pride or guilt, that could offer incentive or disincentive for sharing. Emotion can be accommodated within existing cognitive frameworks such as the Reasoned Action approach. For example, if a person believes that sharing misinformation will allow them to vent (which they perceive as positive), then that belief will contribute to a positive attitude. Competing emotions might also discourage sharing; e.g., sharing misinformation may make someone feel guilty.

Implications for intervention and future research
Expecting people to detect false information on their own and to avoid sharing it seems unwise; evidence suggests people often are vulnerable to accepting misinformation at face value. Similarly, simply asking people to not spread misinformation is unlikely to stop sharing in all cases. Communication technology makes some instances of inadvertent sharing possible, and such accidents could be reduced by adding a confirmation step before sharing occurs (see the sketch below). Volitional sharing of misinformation may pose a greater challenge. Addressing such sharing will require that we understand the social identity, bonding, and emotion regulation benefits of sharing. Future survey research could elicit beliefs about misinformation sharing, and researchers could use those beliefs to build predictive models. With that work in hand, we may be in a better position to respectfully offer alternative ways for people to satisfy key needs without spreading misinformation.
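The confirmation step mentioned above can be sketched in a few lines (the function and the flagged-content store are hypothetical names for illustration; any real implementation would live inside a platform's client):

    # Sketch of a confirmation step that interrupts one-tap resharing of
    # content previously flagged as disputed. Names are hypothetical.
    DISPUTED = {"http://example.com/planted-story"}  # assumed fact-check queue

    def attempt_share(url, confirm):
        """confirm: callback that asks the user and returns True/False."""
        if url in DISPUTED:
            if not confirm("Fact-checkers dispute this link. Share anyway?"):
                return "cancelled"
        return "shared"

Such friction cannot stop determined sharers, but it converts some accidental, low-attention shares into deliberate decisions.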


References
Allport, G. W., & Postman, L. (1947). The psychology of rumor. New York: Holt, Rinehart & Winston.

Asp, E., Manzel, K., Koestner, B., Cole, C. A., Denburg, N. L., & Tranel, D. (2012). A neuropsychological test of belief and doubt: Damage to ventromedial prefrontal cortex increases credulity for misleading advertising. Frontiers in Neuroscience, 6, 1–9.

Auslander, M. V., Thomas, A. K., & Gutchess, A. H. (2017). Confidence moderates the role of control beliefs in the context of age-related changes in misinformation susceptibility. Experimental Aging Research, 43(3), 305-322.

Benkler, Y., Faris, R., Roberts, H., & Zuckerman, E. (2017, March 3). Study: Breitbart-led right-wing media ecosystem altered broader media agenda. Columbia Journalism Review. Retrieved from https://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php

Berger, J. (2014). Word of mouth and interpersonal communication: A review and directions for future research. Journal of Consumer Psychology, 24(4), 586-607.

Bode, L., & Vraga, E. K. (2015). In related news, that was wrong: The correction of misinformation through related stories functionality in social media. Journal of Communication, 65(4), 619–638.

Boudewyns, V., Southwell, B., Betts, K., Gupta, C., Paquin, R., O'Donoghue, A., & Vazquez, N. (2018). Awareness of misinformation in health-related advertising: A narrative review of the literature. In B. Southwell, E. Thorson, & L. Sheble (Eds.), Misinformation and mass audiences (pp. 35-50). Austin, TX: University of Texas Press.

Cappella, J. N. (1987). Interpersonal communication: Definitions and fundamental questions. In C. R. Berger & S. H. Chaffee (Eds.), Handbook of communication science (pp. 184-238). Newbury Park, CA: Sage.

Chen, X., Sin, S. C. J., Theng, Y. L., & Lee, C. S. (2015). Why students share misinformation on social media: Motivation, gender, and study-level differences. The Journal of Academic Librarianship, 41(5), 583-592.

Fishbein, M., & Ajzen, I. (2010). Predicting and changing behavior: The reasoned action approach. New York: Psychology Press.

Fishbein, M., & Cappella, J. N. (2006). The role of theory in developing effective health communications. Journal of Communication, 56(s1), s1-s17.

Fishbein, M., & Yzer, M. C. (2003). Using theory to design effective health behavior interventions. Communication Theory, 13, 164-183.

Gilbert, D. T., Krull, D. S., & Malone, P. S. (1990). Unbelieving the unbelievable: Some problems in the rejection of false information. Journal of Personality and Social Psychology, 59, 601–613.

Gilbert, D., Tafarodi, R., & Malone, P. (1993). You can't not believe everything you read. Journal of Personality and Social Psychology, 65(2), 221-233.

Lee, C. S., & Ma, L. (2012). News sharing in social media: The effect of gratifications and prior experience. Computers in Human Behavior, 28(2), 331–339.

Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.

Maréchal, N. (2016). When bots tweet: Toward a normative framework for bots on social networking sites. International Journal of Communication, 10. Retrieved from http://ijoc.org/index.php/ijoc/article/view/6180/1811

Nguyen, N. P., Yan, G., & Thai, M. T. (2013). Analysis of misinformation spread containment in online social networks. Computer Networks, 57(10), 2133–2146.

Pariser, E. (2011). The filter bubble: What the internet is hiding from you. London: Penguin UK.

Rosnow, R. L. (1991). Inside rumor — A personal journey. American Psychologist, 46(5), 484–496.

Skurnik, I., Yoon, C., Park, D. C., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31(4), 713–724.

Southwell, B. G. (2013). Social networks and popular understanding of science and health: Sharing disparities. Baltimore, MD: Johns Hopkins University Press.

Southwell, B. G., & Thorson, E. A. (2015). The prevalence, consequence, and remedy of misinformation in mass media systems. Journal of Communication, 65(4), 589-595.

Southwell, B. G., Thorson, E. A., & Sheble, L. (Eds.). (2018). Misinformation and mass audiences. Austin, TX: University of Texas Press.

Southwell, B. G., & Yzer, M. C. (2007). The roles of interpersonal communication in mass media campaigns. In C. Beck (Ed.), Communication yearbook 31 (pp. 420-462). New York: Lawrence Erlbaum Associates.

Weeks, B., & Southwell, B. (2010). The symbiosis of news coverage and aggregate online search behavior: Obama, rumors, and presidential politics. Mass Communication and Society, 13(4), 341-360.


Assessing Current Efforts by the Platforms and their Effectiveness

Claire Wardle, First Draft, Shorenstein Center on Media, Politics and Public Policy, Harvard Kennedy School of Government

Over the past twelve months, ideas for tackling mis- and disinformation have been discussed endlessly at conferences and in workshops, but we've seen few concrete changes and nothing has significantly moved the needle. While there is certainly more money available from the philanthropic sector, and a myriad of small projects are underway, grand ideas are yet to be implemented. While there are projects developing in terms of news literacy, third-party technology initiatives, credibility scores, and indicators, this paper will focus on the steps taken by the social platforms.

What have the social networks done to tackle agents motivated by financial gain?
As we have seen, one of the primary motivations for creating disinformation is financial gain. Both Google and Facebook have taken steps to prevent revenue flowing to the owners of "bad sites, scams and ads." Google permanently banned nearly 200 publishers from its AdSense advertising network as of late 2016 (Spencer 2017). Facebook made similar moves, updating its policies with language stating they would not display ads that show misleading or illegal content. Facebook has also taken steps to tackle 'imposter content,' stating, "On the buying side, we've taken action against the ability to spoof domains, which will reduce the prevalence of sites that pretend to be real publications" (Written Evidence). And, in late August 2017, Facebook announced it would block ads from pages that had repeatedly shared false news, stating: "Currently, we do not allow advertisers to run ads that link to stories that have been marked false by third-party fact-checking organizations. Now we are taking an additional step. If Pages repeatedly share stories marked as false, these repeat offenders will no longer be allowed to advertise on Facebook" (Shukla 2017).

Facebook moved relatively quickly from Mark Zuckerberg's post-election denial (Shahari 2016) that 'fake news' was a problem on his platform to rolling out a third-party fact-checking initiative on December 15, 2016 that includes the International Fact Checking Network, The Associated Press, The Washington Post, and Snopes (Mosseri 2016). They expanded the project to France and Germany in February and the Netherlands in March. In this initiative, users flag posts they think might be 'false news,' populating a queue that the affiliated fact-checking organizations can see. In August, Facebook announced it would use "updated machine learning" to detect more potential hoaxes. After an article has been fact-checked, any user who sees that content will see that it has been disputed by one of the fact-checking organizations. If someone tries to share a disputed article, they are reminded with a pop-up notice that the content is in dispute. In December 2017, Facebook announced (Lyons 2017) it would replace these disputed flags with 'Related Articles,' a featured box underneath a post that was designed to "help people discover new articles they may find interesting about the same topic" (Su 2017).

After months of requests by the fact-checking organizations themselves (as well as researchers) for data connected to the initiative, in early October Facebook provided information via an email (that was then reported on by Craig Silverman of Buzzfeed) indicating that when content was flagged by third-party fact-checkers, sharing decreased by 80% (Silverman 2017).

What have the social networks done to tackle agents motivated by political influence?
In April 2017, Facebook's security team published a paper on 'Information Operations,' defining the term as "actions taken by organized actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome" (Weedon, et al. 2017). It was the first time Facebook had admitted the scale of the problem it faces in terms of official, tightly organized, networked agents using the platform to disseminate automated disinformation.

In September 2017, Facebook admitted that it had found evidence that 'dark ads' (ads that are only visible to the intended audience, rather than publicly viewable on a page) had been purchased by a Russian organization and directed at US citizens. Facebook explained, "[T]he ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum — touching on topics from LGBT matters to race issues to immigration to gun rights" (Stamos 2017). A few days later, an investigation by the Daily Beast found inauthentic accounts, seemingly located in Russia, had used the Facebook events function to organize anti-immigration rallies in the US (Collins, et al. 2017).

Twitter has also come under scrutiny recently. In a June 2017 blog post, Twitter explained its efforts to fight against bots: "We're working hard to detect spammy behaviors at source,… [and we] also frequently take action against applications that abuse the public API to automate activity on Twitter (Twitter 2016), stopping potentially manipulative bots at the source" (Crowell 2017). They were forced to update this post in late September (Twitter 2017), summarizing the role the platform may have played in terms of Russian interference and the 2016 election and explaining some of the tools they are using to prevent automation, coordinated attacks, and campaigns on the platform. They also argued that researchers using Twitter's own data would not see the steps the platform was taking against this type of coordinated activity, as some of the enforcement would not be visible via the public API. While this is certainly true, the fact that researchers cannot independently assess the effectiveness of these moves makes it very difficult to understand this phenomenon (Warzel 2017).

One significant issue that has surfaced recently is the challenge for researchers who are attempting to understand this phenomenon when the platforms are deleting content. As Jonathan Albright explained to the Washington Post (Timberg and Dwoskin 2017), his analysis of ads, pages, and posts connected to the 2016 elections was taken down by Facebook after he published his research, meaning that his methodology couldn't be replicated and there was no opportunity for other researchers to examine patterns in the data to more fully understand the methods employed.

Providing additional context
Google News recently took steps to allow publishers to highlight fact-checked content for programmatic detection using schema.org, a structured data markup schema supported by the major search engines. This feature first appeared in the UK and the US last October, and in the summer of 2017 the same functionality was added to Google News in Germany, France, Brazil, Mexico, and Argentina.
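The relevant schema.org type for this purpose is ClaimReview. A minimal illustration follows, rendered as JSON-LD; the URLs, organization names, and rating values here are invented, and the authoritative field list is the schema.org/ClaimReview documentation:

    {
      "@context": "https://schema.org",
      "@type": "ClaimReview",
      "url": "https://factchecker.example.org/checks/example-claim",
      "datePublished": "2017-10-01",
      "claimReviewed": "Example claim, quoted exactly as it circulated",
      "author": {"@type": "Organization", "name": "Example Fact Checker"},
      "itemReviewed": {
        "@type": "CreativeWork",
        "author": {"@type": "Organization", "name": "Example News Site"}
      },
      "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False"
      }
    }

Embedding a block like this in a fact-check article is what allows crawlers to detect and surface the verdict programmatically.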

In early October 2017, Facebook started to test new contextual features. As the company explained in its blog post announcing the move:

"For links to articles shared in News Feed, we are testing a button that people can tap to easily access additional information without needing to go elsewhere. The additional contextual information is pulled from across Facebook and other sources, such as information from the publisher's Wikipedia entry, a button to follow their Page, trending articles or related articles about the topic, and information about how the article is being shared by people on Facebook. In some cases, if that information is unavailable, we will let people know, which can also be helpful context" (Anker, et al. 2017).

This move is supported by experimental research by Leticia Bode in 2015, which suggested that when a Facebook post that includes misinformation is immediately contextualized by the 'related stories' feature underneath it, misperceptions are significantly reduced (Bode and Vraga 2015). Hopefully, Facebook will be forthcoming with data about the success of the experiment, to provide cues for the other platforms about whether people will click on additional contextual information if provided with the option.

Conclusion
As this article goes to print, Facebook has announced a significant change to its News Feed algorithm: posts from friends and family will appear higher than posts from news publishers and brands (Mosseri 2018). There was also a suggestion that journalism outlets will be ranked by independent assessments of credibility (Seetharaman 2018), for example via surveys of users about which outlets they would be more prepared to pay for. The relative weighting of these elements within the algorithm is unclear. (For example, an outlet might have a low credibility score, but if all of your family and friends comment on a story from that outlet, will you still see it?)

When assessing moves by the technology companies over the past year, one of the most frustrating elements has been their failure to connect with the research, education, library, civil society, and policy communities at any substantive level. There are decades of research on misinformation, on how people 'read' and make sense of information, and on the factors that slow down or exacerbate rumors. But the responses have often felt knee-jerk and atheoretical, and at times like public relations moves rather than serious attempts to tackle the complexity of the problem. On a topic whose scale and seriousness require sophisticated responses, the technology companies must work more closely with those who have research expertise on this subject, as well as with those working on the ground around the world who experience first-hand the real-world repercussions of polluted information streams.

References

Anker, A., S. Su, and J. Smith (October 5, 2017) News Feed FYI: New Test to Provide Context About Articles, Facebook Newsroom, https://newsroom.fb.com/news/2017/10/news-feed-fyi-new-test-to-provide-context-about-articles/

Bode, L. & Vraga, E. (2015) In Related News, That Was Wrong: The Correction of Misinformation Through Related Stories Functionality in Social Media, Journal of Communication, 65 (4): 619-638.

Collins, B. et al (September 11, 2017) Exclusive: Russia Used Facebook Events to Organize Anti-Immigrant Rallies on U.S. Soil, The Daily Beast, http://www.thedailybeast.com/exclusive-russia-used-facebook-events-to-organize-anti-immigrant-rallies-on-us-soil

Crowell, C. (June 14, 2017) Our Approach to Bots and Automation, Twitter Blog, https://blog.twitter.com/official/en_us/topics/company/2017/Our-Approach-Bots-Misinformation.html

Lyons, T. (2017) News Feed FYI: Replacing Disputed Flags with Related Articles, Facebook Newsroom, https://newsroom.fb.com/news/2017/12/news-feed-fyi-updates-in-our-fight-against-misinformation/

Mosseri, A. (December 15, 2016) News Feed FYI: Addressing Hoaxes and Fake News, Facebook Newsroom.

Mosseri, A. (2018) News Feed FYI: Bringing People Closer Together, Facebook Newsroom, https://newsroom.fb.com/news/2018/01/news-feed-fyi-bringing-people-closer-together/

Seetharaman, D., Alpert, L., & Mullin, B. (2018) Facebook to Overhaul How It Presents News in Feed, Wall Street Journal, https://www.wsj.com/articles/facebook-considers-prioritizing-trustworthy-news-sources-in-feed-1515714244

Shahani, A. (Nov 11, 2016) Zuckerberg Denies Fake News on Facebook Had Impact on the Election, NPR, http://www.npr.org/sections/alltechconsidered/2016/11/11/501743684/zuckerberg-denies-fake-news-on-facebook-had-impact-on-the-election

Shukla, S. (August 28, 2017) Blocking Ads from Pages that Repeatedly Share False News, Facebook Newsroom, https://newsroom.fb.com/news/2017/08/blocking-ads-from-pages-that-repeatedly-share-false-news/

Silverman, C. (2017) Facebook Says Its Fake News Label Helps Reduce The Spread Of A Fake Story By 80%, Buzzfeed, https://www.buzzfeed.com/craigsilverman/facebook-just-shared-the-first-data-about-how-effective-its

Spencer, S. (Jan. 25, 2017) How we fought bad ads, sites and scammers in 2016, Google Blog, https://www.blog.google/topics/ads/how-we-fought-bad-ads-sites-and-scammers-2016

Stamos, A. (September 6, 2017) An Update On Information Operations On Facebook, Facebook Newsroom, https://newsroom.fb.com/news/2017/09/information-operations-update/

Su, S. (April 25, 2017) News Feed FYI: New Test With Related Articles, Facebook Newsroom, https://newsroom.fb.com/news/2017/04/news-feed-fyi-new-test-with-related-articles/

Timberg, C. & E. Dwoskin (Oct 12, 2017) Facebook takes down data and thousands of posts, obscuring reach of Russian disinformation, The Washington Post, https://www.washingtonpost.com/news/the-switch/wp/2017/10/12/facebook-takes-down-data-and-thousands-of-posts-obscuring-reach-of-russian-disinformation/

Twitter (April 6, 2016) Automation Rules, https://support.twitter.com/articles/76915

Twitter Public Policy (Sept 28, 2017) Update: Russian Interference in 2016 US Election, Bots, & Misinformation, Twitter Blog, https://blog.twitter.com/official/en_us/topics/company/2017/Update-Russian-Interference-in-2016-US-Election-Bots-and-Misinformation.html

Warzel, C. (October 3, 2017) Researchers Are Upset That Twitter Is Dismissing Their Work On Election Interference, Buzzfeed, https://www.buzzfeed.com/charliewarzel/researchers-are-upset-that-twitter-is-dismissing-their-work

Weedon, J., W. Nuland & A. Stamos (April 27, 2017) Information Operations and Facebook, p. 4, https://fbnewsroomus.files.wordpress.com/2017/04/facebook-and-information-operations-v1.pdf

Written evidence by Facebook to the UK Parliamentary Inquiry on Fake News, http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/culture-media-and-sport-committee/fake-news/written/49394.html

Information Operations and Democracy: What Role for Global Civil Society and Independent Media?

Sarah Oh, The Center for Information Technology Research in the Interest of Society (CITRIS) and the Banatao Institute

There is increasing concern that the manipulation of social media [i] through targeted information operations (Weedon 2017) could play a significant role in influencing the outcome of future elections and other democratic events and processes globally (Deb 2017). Opinions about what can or should be done to address the production and spread of misinformation, disinformation, and false information, however, are divided (Anderson 2017). Stakeholders in the United States and Western Europe have begun testing a range of responses, including regulatory remedies (Overview 2017), technology solutions to filter online content, and public education initiatives to promote information literacy around political events in their countries (Fleming 2016). These efforts, which are described as content risk support in this paper, have stemmed from discussions around the U.S., French, and German elections and rely on engagement with technology company representatives, foundations, think tanks, and researchers based in the U.S. or Western Europe.

Despite the focus on the experience in the West, concerns about information operations and their effect on democracy are salient in countries transitioning to democracy, where government and media institutions are under immediate threat or unevenly established, if established at all. This paper draws on field experience [ii] in Eastern Europe and Southeast Asia to make the argument that more attention should be paid to groups engaged on this issue outside the West.

In Eastern Europe and Southeast Asia, there are three relevant trends around the aggressive use of social media to promote false information by the following groups: ISIS-aligned terror groups, domestic extremist nationalist and religious groups, and authoritarian governments, including Russia. Social media platforms present a new front for the ongoing ideological warfare that the Kremlin has waged in the Balkans and Baltic region (Priest 2017). In Southeast Asia, authoritarian governments themselves use social media to promote false information. In both regions, ISIS-aligned terror groups have strengthened their foothold, with evidence of increasing strength (Militant) and of the use of social media to recruit supporters (Mejdini 2017). [iii] In both regions, extremist nationalist groups are flourishing online (Ristic 2017).

The effects of these trends may be greater than in the West for several reasons. First, in Indonesia, the Philippines, and Myanmar, the use of false information to influence public opinion is pernicious because of the widespread use of social media to access news and information; Facebook is a critical (if not the only) information lifeline. In Southeast Asia alone, there are more than 241 million Facebook users, approximately 12% of users worldwide (Connecting 2016); in Indonesia, Myanmar, and the Philippines, "Facebook is the internet." Second, recent violence and conflict, and, significantly, the lack of institutions such as a functioning legal system, independent media, and civil society mean there are fewer counter-weights in society to mitigate false and dangerous information campaigns.

For the reasons stated above, the perspective and data possessed by organizations in Southeast Asia and the Western Balkans are valuable. In the Philippines, Myanmar, and Indonesia, as well as Serbia, Kosovo, Albania, and Macedonia, civil society and independent media have been proactively studying and responding to actions taken online by nationalist and extremist groups and governments around elections and other major political events. Increasingly, stakeholders familiar with these environments take the normative position that these groups should be engaged in discussions about possible defensive and offensive strategies against information operations, particularly with industry leaders and academic researchers who seek to learn more about the manipulation or misuse of their platforms and advertising features [iv] globally.

Despite these opportunities, however, discussions around operationalizing engagement between tech industry representatives and academic researchers on the one hand and civil society and media organizations in transition countries on the other are absent. Existing forums are inadequate for constructive engagement, [v] say local organizations, and there is a sense of fatigue and frustration among these organizations that may prevent their knowledge and insights from informing essential discussions in the future. There is a need to invest resources to develop these relationships explicitly during critical democratic transition periods.

An Emerging Community of Content Risk Researchers
There are, at a minimum, three important functions civil society, media organizations, and other international groups could perform concerning content risk research and response. First, civil society organizations focused on issues such as citizen election observation, democracy, and citizen engagement are the front line of defense: users may call on or consult these organizations with social media problems, or with concerns about political events. [vi] Second, new and independent media organizations have studied information operations through reporting and analysis, which have ranged from anecdotal accounts to in-depth reports and data analysis. [vii] Finally, international practitioners, who understand the context in which local organizations work, bring conflict sensitivity awareness and political development expertise useful for external actors. They also act as important facilitators when internal or community conflicts arise among organizations.

Collectively, the groups listed above possess unique knowledge about content risk trends in developing democracies. They often collect data about developments in real time, and conduct analysis and share insights with local communities. They have identified trends such as the influence of ambient fear on why someone might share low-grade information, and have developed impactful, often creative, counter-messaging strategies. They also explain local cultural and political context around why a social media post or comment may or may not violate terms of service agreements. These insights are critical during moments of crisis.

Challenges for Engagement
The unique knowledge civil society and independent media possess has not yet been harnessed; global civil society and media have not yet fully occupied a position to advise industry and policy leaders on these issues on the global stage. [viii] In fact, existing engagement with civil society and media has fallen short because interventions have failed to address the challenges listed below.

First, groups may fear being viewed as conduits of international actors or companies. In some countries, say local groups, engagement with Facebook is seen negatively because of the company's coordination with government representatives. In addition, disagreements about Facebook content decisions have, at times, been wrongfully directed toward organizations by community members. [ix] In general, organizations working on issues related to democracy in developing democracies often work under significant pressure from governments and security agents in their countries, other non-state actors, and even other civil society organizations. They may be subject to harassment themselves and often risk their livelihoods to do the work that they do.

Second, civil society and independent media organizations in developing democracies are young organizations that are still growing; they were formed after transitions or democracy movements that occurred in recent history. Given the dearth of government funding and private foundations in transition countries, these young organizations rely on funding from international organizations, foundations, and foreign governments, which means their funding is project-based. As a result, their capacity to conduct non-project-related work is significantly constrained compared to more established groups. Unless there is an explicit focus within their program on liaising with global policymakers and companies, these interactions cannot be prioritized or proactively developed over providing direct project services, for example.

Third, major research institutions simply do not have direct relationships with international civil society and emerging media organizations covering these issues, largely because of their own resource constraints and lack of previous engagement in these countries.

Finally, language presents challenges. Representatives seeking to engage organizations and communities in transition countries primarily work in English, even if their platform serves localized content. Small rights groups with valuable insights and knowledge whose members do not speak English face barriers to participating in discussions about the policies that govern content on social media platforms.

Opportunities for Investigation
Interviews conducted with civil society organizations in Southeast Asia and the Western Balkans suggest there are three important areas where research could shed light.

How should content risk be quantified in developing democracies? How might context be better factored into risk assessment and identification when considering content trends and behaviors in transition countries (Adams et al)? [x] Are factors such as the state of media capture in a country, trust or distrust in government and media institutions, the role of social media in providing news, availability of independent media, digital and media literacy, active war or conflict, historical divisions in society, and other socio-economic factors that contribute to audience worldviews relevant? Implications: develop foundational risk identification categories that can be used by civil society, media, international researchers, and industry partners to share an understanding of risk and investigate possible content issues.

What organizational structures are needed to facilitate qualitative research on content risk? How might existing forums be improved to foster constructive engagement between technology companies and civil society and independent media organizations around future events, or post-mortems of past events? Between researchers and civil society and independent media? Can these forums be modified and used to discuss information integrity and threats to it? Implications: advise companies and thought leaders on structures for mutually beneficial engagement between local organizations in transition countries and companies, in addition to international researchers.

What skills and resources do civil society organizations and global stakeholders need to engage more meaningfully with each other when it comes to content risk identification and management? What people might be useful for civil society organizations and technology companies seeking to proactively identify and mitigate future content risk events? What interlocutors would enhance engagement between these two groups? Many local organizations note the importance of connectors, individuals familiar with both their environment and corporate representatives. How can connectors be surfaced and their capabilities enhanced? What capabilities are missing among technology companies and civil society organizations? How might insights gained by those at social media companies interacting with civil society actors be mainstreamed across teams and organizations? Implications: inform the restructuring or creation of teams, and the hiring of new connectors who will seek to understand, assess, and mitigate risk in these markets and investigate possible terms of service violations.

To date, transition countries have not been given enough attention by researchers and tech industry representatives seeking to address the online information operations problem. Civil society and media organizations are on the front lines of responding to manifestations of this problem in Southeast Asia, the Western Balkans, and beyond. As social media companies are scrutinized, and the role of government is discussed, there is a larger question to raise around how civil society and media organizations can inform ongoing discussions. Without explicit efforts to engage civil society and media organizations in developing democracies, and to use their knowledge, the tech industry and global policymakers are missing an opportunity. The dynamic nature of the problem makes identifying specific solutions challenging; developing relationships with a diverse set of invested stakeholders, and pathways for information to be shared to identify trends and factor this knowledge into response strategies, may be an important next step.

References
Adams, N. et al. "The Credibility Indicators Working Group." Meedan. https://meedan.com/credibility-indicators/

Anderson, J. and L. Rainie. "The Future of Truth and Misinformation Online." Pew Research Center, October 19, 2017. http://www.pewinternet.org/2017/10/19/the-future-of-truth-and-misinformation-online/

"Connecting Over 241 Million People in Southeast Asia on Facebook." Facebook Business, March 15, 2016. https://www.facebook.com/business/news/Connecting-over-241-Million-People-in-Southeast-Asia-on-Facebook

Deb, A., S. Donohue, and T. Glaisyer. "Is Social Media a Threat to Democracy?" The Omidyar Group, October 1, 2017. https://www.omidyargroup.com/wp-content/uploads/2017/10/Social-Media-and-Democracy-October-5-2017.pdf

Fleming, J. "News Literacy and a Civics Model for Journalism Education." Center for News Literacy, 2016. http://www.centerfornewsliteracy.org/2016_site/wp-content/uploads/2017/05/Jennifer-Fleming-FINAL.pdf

Mejdini, F. et al. "Balkan Jihadi Warriors Remain Safe on the Net." Resonant Voices Initiative, February 2, 2017. http://www.balkaninsight.com/en/article/balkan-jihadi-warriors-remain-safe-on-the-netj-01-27-2017

"Militant Islamists Shift Focus to Southeast Asia." Financial Times. https://www.ft.com/content/b830552a-5176-11e7-bfb8-997009366969

"Overview of the NetzDG Network Enforcement Law." CDT, July 17, 2017. https://cdt.org/insight/overview-of-the-netzdg-network-enforcement-law/

Priest, D. and M. Birnbaum. "Europe Has Been Working to Expose Russian Meddling for Years." Washington Post, June 25, 2017. https://www.washingtonpost.com/world/europe/europe-has-been-working-to-expose-russian-meddling-for-years/2017/06/25/e42dcece-4a09-11e7-9669-250d0b15f83b_story.html

Ristic, M. et al. "Far-Right Balkan Groups Flourish on the Net." Resonant Voices Initiative, May 5, 2017. http://www.balkaninsight.com/en/article/far-right-balkan-groups-flourish-on-the-net-05-03-2017

Weedon, J., W. Nuland, and A. Stamos. "Information Operations and Facebook." Facebook, April 27, 2017. https://fbnewsroomus.files.wordpress.com/2017/04/facebook-and-information-operations-v1.pdf

[i] For the purposes of this paper, social media actors include Facebook, Instagram, YouTube, and Twitter, but the behaviors and trends named in this paper also affect other platforms.
[ii] Individuals interviewed from civil society and media organizations requested that their names not be used in this paper.
[iii] These trends are exacerbated by the growing number of returned fighters from Syria and Iraq in the Western Balkans, as well as Indonesia and the Philippines.
[iv] Twitter noted at an open industry conference that behaviors are changing so quickly that understanding of the context and problem is critical, citing the use of bots to influence political discourse in the Philippines and Mexico. At the same conference, Juniper Downs noted that Google regularly engages civil society, academics, and others to get input in an effort to better respond to changing behaviors and trends.
[v] Existing forums such as the Global Network Initiative, International Telecommunications Union, and Internet Governance Forum, say local organizations, have not provided the desired space and structure for constructive engagement on this issue.
[vi] In Macedonia, many groups focused on tracking false information spread around parliamentary elections to prevent violence or other activities that would have negatively affected the elections.
[vii] Rappler, a media organization in the Philippines, for example, regularly scrapes and collects data on political accounts to track bots and other information operation campaigns around news events.
[viii] Based on interviews with leaders from the Philippines, Myanmar, Indonesia, Sri Lanka, Kosovo, Macedonia, Bosnia, Serbia, and Albania, there may be multiple reasons why this is the case.
[ix] This is part of a larger issue: an important entry point for major social media companies and user communities in other countries is through country governments, due to the companies' interest in operating and complying with the law in these countries.
[x] Conversations could build on existing discussions about information credibility measurement.


Why We Feel Woefully Under-Informed About How We Can Do Our Jobs Better

Amy Sippitt, Full Fact

Full Fact is the UK's independent, non-partisan factchecking charity. We provide free tools, advice, and information so that anyone can check the claims we hear from politicians, journalists, and other figures influencing public debate in the UK.

Simply publishing factchecks is not going to stop the spread of unsubstantiated claims or improve the overall accuracy of public debate. That idea has been discredited at least as far back as the 2008 US presidential election, when FactCheck.org, one of the first generation of factchecking organizations, concluded that despite more aggressive fact-checking than ever before, "millions of voters were bamboozled anyway" (Jamieson and Jackson 2008). Research that does not take account of this is of limited use to us.

Full Fact is a second-generation factchecking organization. We see our dual role as ensuring reliable information is available, and stopping the spread of specific unsubstantiated claims. Africa Check and Argentina's Chequeado are two other examples. Publishing factchecks is just one tool in our work.

In the short term, we factcheck, which has three benefits: giving people reliable information to make up their own minds on the big issues; scrutinizing the claims of people in public debate; and building an evidence base of how specific unsubstantiated claims arise and are spread.

Then, when we see specific unsubstantiated claims, we get those claims corrected at source. We've secured corrections from a Prime Minister and national newspapers, and we've worked with official statistics providers and charities to improve the way they present their information when it has been persistently misinterpreted.

In the medium term, we use the evidence from our factchecking to diagnose systemic problems and secure systemic changes, such as improvements to press complaints procedures.

In the long run, we are in a fight about the culture of public life.

We do not give claims accuracy ratings: we see our job as filling in shades of grey when campaigners often talk in black and white. We also use a variety of formats other than a written factcheck, such as a simple "claim and conclusion," videos, and graphics, as well as interviews on radio and TV.

We believe there is now a need for a third generation of factchecking: responding to challenges including (a) everyone having the power of access to public platforms, vastly increasing the range of monitoring needed; (b) pseudo-grassroots activity online, up to and including state-sponsored disinformation; and (c) the declining relevance of existing standards processes.

Full Fact is running a number of initiatives to prepare for this. These include –

– Organizing our work around debates, not individual claims: to properly inform individuals' beliefs we need to understand how individual claims on a particular topic build up to form a picture about the world.

– Our world-leading automated factchecking work. The tools will first spot claims that have already been factchecked when they appear in new places, helping us monitor the spread of misinformation so we can target and evaluate our interventions effectively; a sketch of the general claim-matching idea follows this list. Second, they will automatically detect and check new claims, freeing up our resources to focus on more complicated claims.

– Our Third Generation Factchecking project, which aims to develop, test, and implement effective reusable editorial packages that could appear anywhere from our website to search results.
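To make the first of these concrete, here is a minimal Python sketch of claim matching. TF-IDF cosine similarity stands in for whatever matching model is actually used; the stored claims, threshold, and method are illustrative assumptions, not a description of Full Fact's pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical store of claims that have already been factchecked.
factchecked_claims = [
    "The UK sends 350 million pounds a week to the EU.",
    "Crime has doubled in the last five years.",
]

def match_claims(sentences, threshold=0.5):
    """Return (new sentence, matched claim, score) for likely repeats."""
    vectorizer = TfidfVectorizer().fit(factchecked_claims + sentences)
    claim_vecs = vectorizer.transform(factchecked_claims)
    sent_vecs = vectorizer.transform(sentences)
    scores = cosine_similarity(sent_vecs, claim_vecs)
    matches = []
    for i, sentence in enumerate(sentences):
        best = scores[i].argmax()
        if scores[i, best] >= threshold:
            matches.append((sentence, factchecked_claims[best], scores[i, best]))
    return matches

# A close paraphrase of the first claim should be flagged for monitoring.
print(match_claims(["We send 350 million pounds a week to the EU."]))
```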

Misinformation and disinformation in the UK
The UK is a fortunate country with high levels of education, well-developed public and civil society institutions, and some highly trusted media. Nevertheless, there is evidence that the public is substantially misinformed on key issues of public debate. The Ipsos MORI Perils of Perception series—which asks participants to answer factual questions—has shown this for citizens in the UK and elsewhere across many different topics (Ipsos MORI 2016).

Observers have pointed to examples of inaccurate information in the UK:

"Politicians tend to promote the statistics that best present their case…In some cases, spinning reduces the story behind the statistics to such an extent that the picture is no longer true" (Jenkin 2013).

"The public debate is being poorly served by inconsistent, unqualified and, in some cases, misleading claims and counter-claims" (Treasury Select Committee 2016).

There is "sufficient evidence to conclude that some sections of the press have deliberately invented stories with no factual basis in order to satisfy the demands of a readership" with "sections of the press…prioritizing the worldview of a title over the accuracy of a story" (Leveson Inquiry 2012).

Our partnership with First Draft brought together verification and factchecking skills for online content during the 2017 general election and showed inaccurate information spreading enthusiastically online, from official, aligned, grassroots, and perhaps pseudo-grassroots sources. But this is not a pattern unique to digital spaces, nor is it new.

Full Fact leaves it up to our readers to judge where inaccuracies fall on the spectrum of misinformation ("the inadvertent sharing of false information") and disinformation ("the deliberate creation and sharing of information known to be false") (Wardle 2017). We see claims which seem to represent genuine misunderstandings as well as careless, reckless, and sometimes willful inaccuracies.

The most common causes we come across seem to us to be the careless use of information or a reasonable misunderstanding of a source. That is why well communicated, high quality, and easy to access information is hugely important to tackling misinformation and disinformation.

Research needs
We welcome and are eager to act on research which –

– Is not just US-based or is clearly transferrable to other countries. The political environment in the US is unique. For example, voters seem to have stronger partisan affiliations (Dalton 2016). In the UK, just 31% of the British public classify themselves as "very" or "fairly strong" supporters of a political party (Hansard Society 2017).

– Tackles specific questions. Fake news is an unhelpful catch-all term (Full Fact 2017).

– Understands the range of interventions factcheckers globally use, from seeking corrections to public education work, and the range of existing and possible formats.

– Has a sophisticated approach to change over time. Factchecking interventions cannot be seen individually, nor should their effect be viewed simply at a single point in time.

– Clearly communicates its practical conclusions and lessons, including under what circumstances impacts have been found, how small or large the effect is, and the level of confidence in the findings.

– Is methodologically rigorous and designed to be replicable, including publishing data.

As it stands, we feel woefully under-informed by the research about how we can do our jobs better, as much research in this field falls short of these hopes or is hard for busy factcheckers to access and apply.

We believe that there is great value in existing research from many fields. We are eager to see more practical and critical literature reviews. In particular, we are keen to see cross-disciplinary research. The transmission of facts is not a new topic, and is not a research area unique to political scientists and psychologists. As well as pursuing new research, we hope to learn what academics in the fields of advertising, education, and elsewhere might be able to tell us.

Some specific research questions
What works to stop misinformation and disinformation spreading online and offline? How does information spread online and offline? Who should be held accountable? Are corrections worth the effort, especially when information spreads so quickly and easily online?

How does factchecking contribute over time to people's beliefs? The existing research has tended to focus on the impact of a participant reading one factcheck, and whether that changes the person's beliefs. However, political campaigns take place over months, and beliefs form over years. The research needs to look at the impact of factchecking and associated activities over much longer periods if we are to properly understand its impact on beliefs.

How can we communicate effectively? We know very little about what works to get a person to simply understand a factcheck. Do some formats work better than others? Do we need to tailor formats to individuals? How does context matter? We are doing some initial work on this with the University of Edinburgh's Neuropolitics Research Lab to look at what works to communicate factual information in digital spaces.

Who are we not reaching and how do we reach them? What does it take to secure trust? Are there people who are too distrustful for our work? If so, how do we reach them?

What interventions work to improve the skills of people receiving information? We know that individuals distrust, but we also know that lots of individuals are misinformed. As Professor Onora O'Neill has said: "We may end up claiming not to trust, and yet for practical purposes place trust in the very sources we claim not to trust" (O'Neill 2002). We need to understand more about whether we need to make people more skeptical, so they are wary of what they see, or less cynical, so they don't feel the need to switch off from any factual messages. We also need to know how factchecking and verification influence individuals' skepticism and cynicism. More specifically, there are journalist training programs and public education programs: what is their impact?

Increasing the cost of lying and the rewards for accuracy
When the vast majority of the public says that they distrust politicians, is there any reason for politicians not to lie?

References
Dalton, R.J. (2016) "Politics." Oxford Research Encyclopedias. http://politics.oxfordre.com/view/10.1093/acrefore/9780190228637.001.0001/acrefore-9780190228637-e-72#acrefore-9780190228637-e-72-figureGroup-1

Hansard Society (2017). Audit of Political Engagement. London: Hansard Society.

House of Commons Treasury Committee (2016). The economic and financial costs and benefits of the UK's EU membership. First Report of Session 2016-17.

Independence and Funding, Full Fact, https://fullfact.org/about/funding/

Ipsos MORI (2016). "Perceptions Are Not Reality: What the World Gets Wrong." Ipsos MORI, https://www.ipsos.com/ipsos-mori/en-uk/perceptions-are-not-reality-what-world-gets-wrong

Jamieson, K.H., and B. Jackson. (2008) "Our Disinformed Electorate," FactCheck.org, http://www.factcheck.org/2008/12/our-disinformed-electorate/

Jenkin, B. (2013) "PASC Demands Government Stats Are Presented With 'The Whole Truth.'" Parliament.UK, http://www.parliament.uk/business/committees/committees-a-z/commons-select/public-administration-select-committee/news/publication-of-communicating-statistics-report/

Leveson Inquiry into the culture, practices and ethics of the British press (2012). Report of the Inquiry into the Culture, Practices and Ethics of the Press, Volume II. London: The Stationery Office.

O'Neill, O. (2002) "Reith Lecture Four: Trust and Transparency." BBC Radio 4. http://www.bbc.co.uk/radio4/reith2002/lecture4.shtml

Wardle, C. (2017) "Fake News. It's Complicated." First Draft, https://medium.com/1st-draft/fake-news-its-complicated-d0f773766c79

Written Evidence Submitted by Full Fact (FNW0097). (2017). Parliament.UK, http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/culture-media-and-sport-committee/fake-news/written/48260.html


Field Notes: Observation from a School Library

Mike Barker, Phillips Academy

Welcome to our Wheelhouse
In the months following the 2016 election the country became fixated on two new phrases: "Fake News" and "Post-Truth." These two phrases were repeated and recycled across endless publications and news platforms, all the while presented as an emerging phenomenon we had never seen before. But, of course, we had. These terms simply described an information ecosystem plagued by disinformation--an historical, well-known threat to civil societies and, in our case, a functioning democracy.

Since the election, we have experienced a scramble toward a solution primarily involving technologists, journalists, fact-checkers, media academics and, to an extent, educators. These collaborations have been fruitful and have yielded a number of very creative and mostly technological solutions. But most solutions to date address the immediate problem of how disinformation circulates on our social platforms. In this way, we have started to think of this issue as more of a technological problem and less as one about knowledge and education. This distinction is important, as it will lead us to more sustainable solutions going forward.

Recent evidence has surfaced suggesting agents with ties to the Russian government bought ad space on our most common technology platforms for the purpose of distributing disinformation (Shane and Goel 2017). For the short-term and current news cycle, this has given politicians scapegoats to blame. While it is clear these companies have work ahead of them, placing blame there does not address the larger, more substantial problem that the impact of this disinformation campaign made evident.

As a country of information consumers, the past election exposed a dearth of critical thinking skills when engaging with text or visual information. It exposed a broad lack of understanding related to the authority of a piece of information and, even worse, a lack of curiosity with regard to information sources and a lack of sophistication in addressing bias.

In the library profession, these skills are commonly referred to as Information Literacy skills. We are graduating students without these skills, for reasons I will describe in more detail. These are skills school libraries and librarians are well-positioned to help develop in the next generation of information consumers. Thus, a more substantial, perhaps longer-term solution to the problem of a "Post-Truth" world is an approach that aims to integrate school library programs better with the rest of the curriculum. According to the American Library Association, we have close to 100,000 school libraries (Number 2015) already situated in our 135,000 public and private institutions (Fast 2017), and we would be smart to leverage the existing presence they hold within schools to better solve the problems we have. With the right partnerships, these libraries can bring significant strategic value to their collective learning communities: a broader advocacy network.

Advancing Inquiry as Pedagogy
Positionally, the library does not drive classroom activity; it supports it. No matter how effective a librarian may be in advocating for information literacy instruction, odds are that a librarian plays a secondary or even tertiary role in choices related to content and pedagogy. In more cases than not--perhaps due to many years of teaching to standardized tests--those choices usually follow a set curriculum, with a set of pre-selected texts. Even in the best-case scenario, this approach usually includes assignments driven by predetermined prompts. Yet this flavor of teaching and learning does not really correlate with a world that is far more open-ended. It does not fully allow for a student to seek out information broadly or to practice assessing what they find. In a nutshell, this mode of teaching is response oriented rather than inquiry oriented.

A shift to an inquiry-oriented pedagogy would offer students opportunities to interact more deeply with a less curated information ecosystem. Short, sustained research projects in which students formulate and investigate their own topics by interacting with the broad array of content on the web, as well as in libraries, would seem to be a more complementary and effective way to train information-consumer habits in the future. We need a research agenda which investigates this notion of open inquiry as an effective pedagogical device. With better evidence on the benefits this shift could bring to learning, traffic would start to flow more heavily into school libraries.

Broadening the Scope and Shape of Information Literacy
While libraries have long carried the flag of information literacy, they are not free of blame either as it relates to the state of things today. One responsibility libraries, and in particular school libraries, have been slower to assume is to look critically at the skills needed today to be "information literate" across an array of mediums and source types. Information is communicated through visual content more than ever before. In this way, instruction cannot be focused simply on assessing print content. How are we to prepare students to assess the authority of articles and tweets and other content produced by bots and artificially intelligent agents? This challenge will only get harder and more complex as AI gets more sophisticated. How are we preparing students with an awareness of the information ecosystems they inhabit and the ability to identify how pieces of information are digitally served to them? How are we preparing them with an awareness of their own biases and two-system cognitive tendencies as they interact with a piece of information?

Shaping a research agenda that might help us to broaden or modernize the traditional library's sense of information literacy into something more holistic and reflective of the technological and cognitive aspects of the term is crucial. In doing so, we would help libraries better prepare information consumers in the future.

Expanding Partnerships
An effective way to promote inquiry-based learning and to expand the definition of information literacy would be to shape partnerships with the very platforms the students use every day. Google for Education has been incredibly successful selling products and technology to schools. Though this is a for-profit venture, it likely accelerated technology adoption in American classrooms. Yet, aside from the Google Books project, there really is no analogous partnership Google offers with school libraries to better develop information literacy in students. While it would be wise not to partner too closely with one group that monetizes information and data, a collaboration of this kind could be beneficial to both sides. A Google-led curriculum on information literacy is the kind of partnership that would attract the attention of teachers and drive activity back toward school libraries.

School Libraries: Defunded & Underutilized ... We Need New Allies
If implemented, strategies like the ones listed above would augment the value proposition of school libraries for the communities they serve, and help reverse a lengthy trend of divestment. This divestment is usually couched in the view that school libraries are underutilized because "we have Google now"; but this is precisely why they matter more than ever before.

For the most part, the library community has relied on a strategy of outreach and advocacy, but library advocacy cannot be the only solution--and to that end, those advocating for the value of libraries cannot primarily be people working within them.

Jim Neal, the incoming President of the American Library Association (ALA), recently stated to School Library Journal that his focus would be on school libraries. "With weakened school libraries, we create a nation that is not able to locate and evaluate information that is accurate and balanced. Knowledge literacy is fundamental," he said (Bayliss 2017).

Libraries were initially established as a means to ensure that citizens operating in a functioning democracy would have access to the information they needed to participate in civic life. In much the same way, Neal's diagnosis signals the role school libraries can play in supporting and strengthening democracy in America. But to realize this mission we need to catalyze programmatic change within the library, and to increase support for these programs with tested and valid research.

Resources
Bayliss, Sarah. "Incoming ALA President Jim Neal: Supporting School Libraries." School Library Journal, May 24, 2017. http://www.slj.com/2017/05/industry-news/incoming-ala-president-jim-neal-supporting-school-libraries/

"Fast Facts: Educational Institutions." National Center for Education Statistics. Accessed October 21, 2017. https://nces.ed.gov/fastfacts/display.asp?id=84

"Number of Libraries in the United States." American Library Association. Last updated September 2015. http://www.ala.org/tools/libfactsheets/alalibraryfactsheet01

Shane, Scott and Vindu Goel. "Fake Russian Facebook Accounts Bought $100,000 in Political Ads." New York Times, September 6, 2017. https://www.nytimes.com/2017/09/06/technology/facebook-russian-political-ads.html

How Academics Can Help Platforms Tackle Disinformation

Nic Dias, First Draft

It was over a year ago that Craig Silverman published the report that brought wide attention to the mis- and disinformation space (Silverman 2016). Journalists and fact-checkers have since reported on countless rumors and hoaxes, and social media researchers have used what data they have to track attempts to amplify this information.

Google, Facebook, and Twitter have finally begun to acknowledge the scale of the mis- and disinformation problem on their platforms. But they continue to fall short of any comprehensive response or transparency. This pace is frustrating for many disinformation scholars. Yet little has been written about what research institutions might do to help these technology companies respond to issues surrounding the quality of information shared on, or surfaced by, their platforms.

In the hope of better understanding how disinformation researchers have interacted – and might better interact – with platforms, I spoke on background with platform representatives and academics about their experiences working with one another. Notably, these interviews pointed to interlocking, if not similar, sets of problems disrupting cooperation. And a majority even brought up similar ideas about how to address those problems.

It's important to acknowledge the potential conflicts that arise when academics begin to play a role that could reasonably be compared to R&D. To be clear, I'm not advocating that researchers devote themselves to responding to the needs of the technology companies. Rather, I provide evidence that thoughtful cooperation between research institutions and platforms would benefit the development of effective, short-term responses to disinformation, and so must be a priority.

Some research institutions can act as honest brokers between stakeholders.
The benefits of bringing together journalists, researchers, and platforms are myriad. For one, institutional and political factors can make it difficult for platforms to explore some research questions. One interviewee explained that research within industry is often overly constrained by the financial calendar and competition between employees, particularly in companies where the bottommost performers are regularly replaced. As a result, emphasis is placed on projects that can produce results within the quarter, even when such projects may have negative long-term side effects.

Cooperation between stakeholders would also allow them to fully benefit from each other's expertise, and to pool resources in incredibly powerful ways, like cross-platform databases. As one platform representative described:

"I have visibility into a slice of the challenge that… our partners touch. If we're thinking about really trying to solve, or at least seriously make a dent in this misinformation challenge, that also touches [Platform X], that also touches [Platform Y]."
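To make that idea concrete, here is a speculative Python sketch of one form a cross-platform database could take: platforms contribute fingerprints of known disinformation content and query the pooled index. Nothing in the interviews specifies such a scheme; every name and detail below is hypothetical.

```python
import hashlib

# hash of known disinformation content -> platforms that contributed it
shared_index: dict[str, set[str]] = {}

def contribute(platform: str, content: bytes) -> None:
    """Add a content fingerprint to the pooled index."""
    digest = hashlib.sha256(content).hexdigest()
    shared_index.setdefault(digest, set()).add(platform)

def seen_elsewhere(platform: str, content: bytes) -> bool:
    """Has any *other* platform already flagged this exact content?"""
    digest = hashlib.sha256(content).hexdigest()
    return bool(shared_index.get(digest, set()) - {platform})

contribute("PlatformX", b"bytes of a known hoax image")
print(seen_elsewhere("PlatformY", b"bytes of a known hoax image"))  # True
```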


Of course, competitive frictions within each sector, and broad mistrust between the sectors, complicate collaboration. There's also an undeniable mismatch between technology companies' move-fast-and-break-things approach and academia's measured methods of progress.

Yet several platform representatives suggested that some research institutions – like academic institutes or more practically-oriented labs – are best positioned to act as trusted brokers between journalists and technology companies. For one, they are relatively trusted by all parties, including the public, particularly where they are charitably funded. As one platform representative explained:

"Think tanks have essentially denigrated into lobbying shops, which has created the need for some sort of neutral hub for projects that require complex coalitions… Working with academic institutes creates the ability to move a project with a reasonable velocity, in a way where there's that layer of intermediation between [platforms] and players that, for whatever set of reasons, it might be hard for us to directly build a relationship with."

Academic institutes also commonly employ people who have worked in industry. This is an advantage for two reasons. First, ex-industry employees have a better sense of how to work with industry. Second, those partnerships which have succeeded were built on personal relationships between current and former employees or students. One academic interviewee said the following:

"I've had personal experiences where I thought that I had great ideas but I didn't have the connections, and so they weren't pursued. And then I had some other cases in which I had ideas that were probably kind of mediocre, but they were pursued just because I had the network contacts."

Research institutions can build on this social currency by valuing industry experience when hiring employees, or valuing industry ambitions when selecting student interns and assistants.

Be more solution-oriented researchers.
Another theme from my interviews was that platform representatives struggle to apply research when it abstracts away from particular platforms. They were also discouraged by what they saw as research's tendency to complicate questions rather than answer them:

"That work is actually 95% useless to industry. Because we know that it's hard. We know that it's complicated. We know that it's ambiguous. But we've got to do something. So, reading an analysis of why it's hard doesn't move us forward."

Ultimately, what product managers need is something to convince engineers to prioritize a feature. Thus, platform representatives said it would be helpful if researchers crafted their research, or at least their discussion sections, with an eye to the circumstances in which platforms operate. Specifically, that means providing explicit suggestions to the platforms where possible, rather than assuming a study's implications are clear. It also means considering how one's research might apply internationally, and being more solution-minded and platform-specific. As one platform representative said:

"If it's Google-specific, or YouTube-specific, or Facebook-specific – it's generally more useful because we can understand more concrete conclusions based on the actual products being used, as opposed to more general statements about here's how people consume information…"

Platform representatives also remarked that it had become impossible to keep up with all the new disinformation studies being published. Indeed, the recent influx of interest and funding for disinformation research has been accompanied by a still-swelling flood of research. Rasmus Kleis Nielsen (2017), for example, found that 17 percent of the papers presented at the most recent Future of Journalism Conference at Cardiff University were focused on "post-truth, truth, and fake news."

As such, platform representatives said brevity in research write-ups is greatly appreciated. One researcher also suggested that more papers reconciling and translating the immense volume of at-times contradictory research would be beneficial. These papers could even be aggregated in a cross-disciplinary journal of disinformation.

Consolidate conferences to minimize time demands.
Another product of the sudden focus on disinformation is an overabundance of conferences organized around the topic. One platform representative described it this way:

"It gets to a point where every other day there is a different conference on the same topic. And it kind of becomes like a traveling carnival, where the same people just move around different hotels and different conference venues."

Unfortunately, these conferences tend to devote most of their agendas to the presentation of papers. This, several platform representatives suggested, is not an optimal use of everyone's time, brainpower, and experience. It's a waste to expend the effort to bring together such an exceptional group of people, only to have them sit in an audience, listening to content they could have already read – and likely have. As one representative put it:

"It's not always the best use of time to listen to a panel of four academics, who have all published papers that we've all read, regurgitate what's in their papers... Conferences should be about discussing what your paper says – the wider framework, the policy recommendations."

The same platform representative suggested trading the current conference schedule for one with fewer, longer gatherings. This would ensure several things: It would minimize time spent traveling and away from the office. It would save everyone's money. It would cut down on redundancies between conferences. And it would ensure everyone can meet.

To make the most of these gatherings, it may also be necessary to provide space for platform representatives to talk under the Chatham House rule, or even off the record. This would provide a genuine opportunity for the powerful players collected to combine their perspectives and build trust:

"If an event is on the record or quotable it makes it really, really hard for companies to discuss hypotheticals or potential policy solutions before we have made decisions internally. Because, ultimately, we work for publicly traded companies that attract a large amount of public interest… And one errant phrase on the record in an event that gets reported can quickly turn into a news story that doesn't reflect the conversation and context, and [is] blown out of proportion."

Conclusion
The centrality of social media and search engines, and the apparent scale of the information pollution on them, suggests that platforms will play an important role in relegating disinformation to the depths of the internet – at least in the short term. However, as broadly trusted and independent experts, academics appear to be well-positioned to strengthen these efforts by informing and convening stakeholders.

As I said at the top of this paper, many scholars may be hesitant to work with industry. Yet none of my interviewees suggested that academics should stop seeking truth for truth's sake. Rather, they argued that a willingness to cooperate with platforms would go a long way toward helping platforms tackle disinformation.

References

Nielsen, R.K. (2017). What is Journalism Studies Studying? [Blog post]. https://rasmuskleisnielsen.net/2017/09/14/what-is-journalism-studies-studying/

Silverman, C. (2016). This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook, Buzzfeed, https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook


Face2Face: Real-time Face Capture and Reenactment of RGB Videos

Justus Thies, Technical University of Munich; University of Erlangen-Nuremberg Michael Zollhöfer, Max-Planck Institute for Informatics Marc Stamminger, University of Erlangen-Nuremberg Christian Theobalt, Max-Planck Institute for Informatics Matthias Nießner, Technical University of Munich

Abstract: Face2Face is an approach for real-time facial reenactment of monocular target video sequences from the Internet. The goal of Face2Face is to animate the facial expressions of the target video in real time based on the input of a commodity webcam that captures the source actor. To this end, Face2Face tackles a hard inverse rendering problem. It reconstructs the facial geometry of both faces and tracks the non-rigid motions. Using a novel expression transfer technique, the facial expressions are transferred from the recovered source face to the target face. Finally, a convincingly re-rendered synthetic video stream is produced based on novel image-based rendering techniques.

Introduction
Our primary goal is to create a mathematical model of our world. Such a model enables computers to reconstruct, understand, and interact with it. Computer Vision tries to obtain this model from image data. Reconstructing our surroundings is increasingly important: new technologies like Virtual Reality and Augmented Reality rely on such data, and the better the quality of the representation, the better these technologies will work.

In contrast to scanning solid objects with, e.g., a Kinect depth sensor, Face2Face concentrates on the harder problem of reconstructing moving faces in uncontrolled environments. Most existing real-time face trackers are based on sparse features and thus capture only a coarse face model. Face2Face tries to use all available information in the captured input, i.e., every pixel, which is why it is called a dense face tracker. The method follows the principle of analysis-by-synthesis, which means that we determine the parameters of our face model by minimizing the error between the input image and a synthesized face image. The resulting synthesized model is so close to the input that it is hard to distinguish between the synthesized face and the real face captured by the RGB camera.

Method Overview
As described above, the primary aim of Face2Face is to reconstruct and track a mathematical model of the human face. To model geometry and skin appearance, Face2Face is built upon a database of 200 scanned faces and 76 modelled expressions. Using this data, new faces can be generated from a relatively small set of parameters.
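As a minimal sketch of the kind of linear parametric model this describes (with toy dimensions, random stand-in bases, and none of the reflectance, pose, or illumination parameters the full system also carries):

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_VERTS = 5000                      # toy vertex count
dims = 3 * NUM_VERTS                  # x, y, z per vertex, flattened

mean_face = rng.normal(size=dims)               # average face geometry
identity_basis = rng.normal(size=(dims, 80))    # e.g., learned from 200 scans
expression_basis = rng.normal(size=(dims, 76))  # the 76 modelled expressions

def synthesize_geometry(alpha, delta):
    """Generate face geometry from identity (alpha) and expression (delta)."""
    return mean_face + identity_basis @ alpha + expression_basis @ delta

# Small parameter vectors are enough to produce a full new face.
geometry = synthesize_geometry(rng.normal(size=80), np.zeros(76))
```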

Figure 1: Synthesized faces with different expressions using Face2Face

Using the standard graphics rendering pipeline of modern computers, we can generate synthetic images as shown in Figure 1. This is key for the analysis-by-synthesis approach that we apply to reconstruct the face of a person in a video stream.

The analysis-by-synthesis approach is an iterative process of synthesis and error minimization. The error in our method describes the difference between an input image from a real camera and the synthetic image of the currently reconstructed face. Face2Face uses a dense error metric, comparing every single pixel of the two images. This requires a lot of computation that is only feasible on modern graphics cards (GPUs). The optimizer used to minimize the error is a highly optimized parallel GPU-based Gauss-Newton solver that enables real-time framerates.
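Schematically, the loop looks like the following Python sketch. The renderer is a toy stand-in, and plain numerical gradient descent replaces the parallel GPU-based Gauss-Newton solver Face2Face actually uses; only the structure of analysis-by-synthesis is meant to carry over.

```python
import numpy as np

def render(params):
    """Toy stand-in for rasterizing the face model to an image."""
    return np.tanh(params.sum()) * np.ones((4, 4))

def photometric_error(params, frame):
    """Dense error: sum of squared differences over every pixel."""
    residual = render(params) - frame
    return (residual ** 2).sum()

def fit(frame, params, lr=1e-2, steps=200, eps=1e-4):
    """Iteratively adjust params to shrink the synthesis error."""
    for _ in range(steps):
        # Numerical gradient of the error w.r.t. each parameter.
        grad = np.array([
            (photometric_error(params + eps * e, frame)
             - photometric_error(params - eps * e, frame)) / (2 * eps)
            for e in np.eye(len(params))
        ])
        params = params - lr * grad
    return params

frame = 0.5 * np.ones((4, 4))            # toy "captured" input frame
fitted = fit(frame, params=np.zeros(3))
print(photometric_error(fitted, frame))  # should be near zero
```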


In Figure 2, we show a sequence of reconstructed faces.

Figure 2: Facial motion capture based on the principle of analysis-by-synthesis: the top row shows the input and the bottom row shows the recovered facial geometry.

Based on this estimation of a face, we implemented facial reenactment. To this end, we reconstruct the faces of both the source actor and the target actor using the approach described above. The source actor is tracked in a live video stream, while the target video is preprocessed. Using the variation of mouth motions in the target video, we create a database of mouth appearances, capturing especially the appearance of the mouth interior. At runtime, we then transform the geometry of the target face to match the reconstructed expression of the source actor. Using the deformed geometry of the target face and the mouth appearance in the database that is closest to the applied deformation, we re-render the face on top of the target video.
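The retrieval step can be pictured as a nearest-neighbor lookup over that database. In the toy sketch below, the descriptors, sizes, and distance metric are illustrative assumptions rather than the paper's exact design:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical database built while preprocessing the target video:
# one expression descriptor per frame plus the stored mouth-interior patch.
mouth_descriptors = rng.normal(size=(500, 76))  # 500 frames, 76-dim expressions
mouth_patches = rng.normal(size=(500, 32, 32))  # cropped mouth appearances

def retrieve_mouth(transferred_expression):
    """Return the stored mouth patch closest to the applied deformation."""
    dists = np.linalg.norm(mouth_descriptors - transferred_expression, axis=1)
    return mouth_patches[dists.argmin()]

best_patch = retrieve_mouth(rng.normal(size=76))
```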

Figure 3 demonstrates the effectiveness of the described approach.

Figure 3: Facial reenactment applied to a video from the Internet.

Use Cases
There are many possible use cases for our dense face tracking technique. The most prominent is post-production in the film industry. There are already techniques that transfer the expressions of a human to a virtual avatar; this is also possible with our technique, and we can even transfer expressions and mouth movements to another human actor without the need for special hardware or long scanning procedures. Thus, we can easily alter the face of an actor after capturing: one can change the lighting or the skin, or modify the expression. We also think that our technique paves the way to live dubbing in a teleconferencing scenario. A similar field where such dense face tracking could be beneficial is the game industry. Here, the motion of a player can be transferred to a virtual avatar of the same person in an online multiplayer game, resulting in more realistic output.

Our project originally stems from another important field: medical research. There we analyzed the healing of patients who suffer from a cleft lip and palate disorder. Head tracking is also important for other medical purposes, e.g., tracking during surgeries. Aside from that, our technique can be adapted to track other body parts or organs, one of the most challenging tasks in modern medicine, needed, for example, to damage as little healthy tissue as possible during a therapy. Many psychologists are also interested in using our system to treat people who suffer from psychological conditions, and there is social psychological research that analyzes biases about which person appears more trustworthy.

Fraud detection: as we reconstruct the face parameters as well as the lighting conditions, our approach can be used to detect inconsistencies. For example, the expressions of a person, and the transitions between them, are unique. Thus, a forgery can be detected by analyzing the tracked expressions in a video sequence and comparing them to a reference video sequence, similar to the analysis of handwritten text (graphoanalysis).
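Purely as an illustration of this idea (the paper specifies no such detector), one could compare summary statistics of tracked expression parameters against a reference recording of the same person:

```python
import numpy as np

def expression_signature(tracked):
    """Summarize a (frames x parameters) sequence of tracked expressions."""
    return np.concatenate([tracked.mean(axis=0), tracked.std(axis=0)])

def looks_manipulated(suspect, reference, threshold=1.0):
    """Flag the suspect video if its signature strays far from the reference."""
    gap = np.linalg.norm(expression_signature(suspect)
                         - expression_signature(reference))
    return gap > threshold   # the threshold would need careful calibration

rng = np.random.default_rng(2)
reference = rng.normal(size=(300, 76))         # toy tracked sequences
suspect = rng.normal(loc=0.3, size=(300, 76))  # systematically shifted
print(looks_manipulated(suspect, reference))   # True for this toy data
```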

In the field of man-machine interaction, the detection and tracking of movements is the key component; accurate tracking thus pushes the development of these applications.

Discussion and Conclusion
The demonstrations of Face2Face at several conferences, and the supplemental video on YouTube, resulted in a controversial discussion of the developed technique. People who watched our system for the first time were thrilled by the results of our method and thought of amusing things they could do with it. But they soon realized that it could also be used to manipulate videos for propaganda or other malicious purposes. In general, they are right in this assessment, but it is important to sensitize people to such video manipulations. Existing methods used for movie production bring dead actors virtually to life, yet nobody thinks of the potential abuse of such a technology. Demonstrating the simplicity of video manipulation teaches people not to blindly trust videos from unknown sources.

Nevertheless, we believe that this system paves the way for many new and exciting applications in the fields of VR/AR, teleconferencing, and on-the-fly dubbing of videos with translated audio.
