Technology-Enabled Disinformation: Summary, Lessons, and Recommendations

Technical Report UW-CSE-2018-12-02, December 2018

John Akers, Gagan Bansal, Gabriel Cadamuro, Christine Chen, Quanze Chen, Lucy Lin, Phoebe Mulcaire, Rajalakshmi Nandakumar, Matthew Rockett, Lucy Simko, John Toman, Tongshuang Wu, Eric Zeng, Bill Zorn, Franziska Roesner ([email protected])
Paul G. Allen School of Computer Science & Engineering, University of Washington

Abstract

Technology is increasingly used — unintentionally (misinformation) or intentionally (disinformation) — to spread false information at scale, with potentially broad-reaching societal effects. For example, technology enables increasingly realistic false images and videos, and hyper-personal targeting means different people may see different versions of reality. This report is the culmination of a PhD-level special topics course (https://courses.cs.washington.edu/courses/cse599b/18au/) in Computer Science & Engineering at the University of Washington’s Paul G. Allen School in the fall of 2018. The goals of this course were to study (1) how technologies and today’s technical platforms enable and support the creation and spread of such mis- and disinformation, as well as (2) how technical approaches could be used to mitigate these issues. In this report, we summarize the space of technology-enabled mis- and disinformation based on our investigations, and then surface our lessons and recommendations for technologists, researchers, platform designers, policymakers, and users.

1 Introduction

Misinformation is false information that is shared with no intention of harm. Disinformation, on the other hand, is false information that is shared with the intention of deceiving the consumer [49, 110]. This phenomenon, also referred to as information pollution [110] or false or fake news [54], is a critical and current issue. Though propaganda, rumors, misleading reporting, and similar issues are age-old, our current technological era allows these types of content to be created easily and realistically, and to spread with unprecedented speed and scale. The consequences are serious and far-reaching, including potential effects on the 2016 U.S. election [33], distrust in vaccinations [9], murders in India [16], and impacts on the lives of individuals [50].

In this report, which is the culmination of a PhD-level special topics course in Computer Science & Engineering at the University of Washington’s Paul G. Allen School in the fall of 2018, we consider the phenomenon of mis/disinformation from the perspective of computer science. The overall space is complicated and will require study from, and collaborations among, researchers from many different areas, including cognitive science, economics, neuroscience, political science, psychology, public health, and sociology, just to name a few. As technologists, we focused in the course, and thus in this report, on the technology. We ask(ed): what is the role of technology in the recent plague of mis- and disinformation? How have technologies and technical platforms enabled and supported the spread of this kind of content? And, critically, how can technical approaches be used to mitigate these issues — or how do they fall short?

In Section 2, we summarize the state of technology in the mis/disinformation space, including reflecting on what is new and different from prior information pollution campaigns (Section 2.1), surveying work that studies current mis/disinformation ecosystems (Section 2.2), and summarizing the underlying enabling technologies (specifically, false video, targeting and tracking, and bots, in Section 2.3), as well as existing efforts to curb the effectiveness or spread of mis/disinformation (Section 2.4). We then step back to summarize the lessons we learned from our exploration (Section 3.1) and then make recommendations particularly for technologists, researchers, and technical platform designers, as well as for policymakers and users (Section 3.2).
Our overarching conclusions are two-fold. First, as technologists, we must be conscious of the potential negative impacts of the technologies we create, recognize that their designs are not neutral, and take responsibility for understanding and minimizing their potential harms. Second, however, the (causes and) solutions to mis/disinformation are not just based in technology. For example, we cannot just “throw some machine learning at the problem”. In addition to designing technical defenses — or designing technology to attempt to avoid the problem in the first place — we must also work together with experts in other fields to make progress in a space where there do not seem to be easy answers.

2 Understanding Mis- and Disinformation

A key goal of this report is to provide an overview of technology-enabled mis/disinformation — what is new, what is happening, how technology enables it, and how technology may be able to defend against it — which we do in this section. Others have also written related and valuable summaries that contributed to our understanding of the space and that we recommend the interested reader cross-reference [49, 54, 63, 110].

2.1 What Is New?

While propaganda and its ilk are not a new problem, it is becoming increasingly clear that the rise of technology has served to catalyze the creation, dissemination, and consumption of mis/disinformation at scale. Here we briefly detail, from a technology standpoint, the major factors that have contributed to our current situation:

1. Democratization of content creation. Anyone can start a website, blog, Facebook page, Twitter account, or similar and begin creating and sharing content. There has been a transformation from a highly centralized system with a few major content producers to a decentralized system with many content producers. This flood of information makes it difficult to distinguish between true and false information (i.e., information pollution). Additionally, tools to create realistic fake content have become available to non-technical actors (e.g., DeepFakes [10, 48]), significantly lowering the barrier to entry in terms of time and resources for people to create extremely realistic fake content.

2. Rapid news cycle and economic incentives. The more clicks a story gets, the more money gets made through ad revenue. Given the rapid news cycle and plethora of information sources, getting more clicks requires producing content that catches consumers’ attention — for instance, by appealing to their emotions [74] or playing into their other cognitive biases [6, 89] (a toy illustration of this engagement-only incentive appears after this list). In addition, social media platforms that provide free accounts to users make money via targeted advertising. This allows a variety of entities, with intentions ranging from helpful to harmful, to get their messages in front of users.

3. Wide and immediate reach and interactivity. Content created on one side of the planet can be viewed on the other side of the planet nearly instantaneously. Moreover, these content creators receive immediate feedback on how their campaigns are performing — and can easily iterate and A/B test — based on the number of likes, shares, clicks, comments, and other reactions on social media. Contrast the six long years that it took for the Russian disinformation campaign about AIDS having been created by the U.S. to take off in the 1980s [7, 20] with the viral spread of false stories like Pizzagate [5] during the 2016 U.S. election.

4. Organic and intentionally created filter bubbles. In today’s web experience, individuals get to choose what content they do or do not want to see, e.g., by customizing their social media feeds or visiting specific websites for news. The resulting echo chambers are often called “filter bubbles” [72] and can help mis/disinformation spread to receptive people who consume it within a limited broader context. Though some recent studies suggest that filter bubbles may not be as extreme as feared (e.g., [39]), others find evidence for increased polarization [62]. Moreover, cognitive biases like the backfire effect (discussed below) suggest that exposure to alternate views is not necessarily effective in helping people change their minds. Worse, entities wishing to spread disinformation can intentionally create filter bubbles by directly and accurately targeting (e.g., via ads or sponsored posts) potentially vulnerable individuals.

5. Algorithmic curation and lack of transparency. Content curation algorithms have become increasingly complex and fine-tuned — e.g., the algorithm that determines which posts, in what order, are displayed in a user’s Facebook news feed, or YouTube’s recommender algorithms, which have been frequently criticized for pushing people towards more extreme content [102]. At the same time, these algorithms are not transparent for end users (e.g., Facebook’s ad explanation feature [3]). This complexity can make it challenging for users to understand why they are having specific viewing experiences online, what the real source of information (e.g., sponsored content) is [64], and even that they are having different experiences from other users.

6. Scale and anonymity in online accounts. Actors wishing to spread disinformation can leverage the weak identity and account management frameworks of many online platforms to create a large number of accounts (“sybils” or bots) in order to, for example, pose as legitimate members of a political movement and/or artificially create the impression that particular content is popular (e.g., [4]); a second toy illustration of this dynamic appears below. It is not clear how to address this issue, as stricter account verification can have negative consequences for some legitimate users [43], and bot accounts can have legitimate purposes [36, 82].
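To make the incentive described in items 2 and 5 concrete, the following minimal sketch (in Python, with entirely hypothetical field names and weights; it is not any platform’s actual algorithm) ranks candidate posts purely by predicted engagement. Because the objective never consults accuracy, an emotionally charged false story can outrank a careful true one.

    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: str
        predicted_click_rate: float   # estimated probability that a user clicks
        predicted_share_rate: float   # estimated probability that a user shares
        emotional_intensity: float    # 0.0 (neutral) to 1.0 (highly charged)
        accuracy_score: float         # 0.0 (false) to 1.0 (verified); unused below

    def engagement_score(post: Post) -> float:
        """Hypothetical engagement-only objective; accuracy_score never appears."""
        return (0.6 * post.predicted_click_rate
                + 0.3 * post.predicted_share_rate
                + 0.1 * post.emotional_intensity)

    def rank_feed(candidates):
        """Order a user's feed by predicted engagement, highest first."""
        return sorted(candidates, key=engagement_score, reverse=True)

    if __name__ == "__main__":
        feed = rank_feed([
            Post("measured-report", 0.02, 0.01, 0.1, 0.9),
            Post("outrage-bait",    0.08, 0.06, 0.9, 0.2),
        ])
        for post in feed:
            print(post.post_id, round(engagement_score(post), 3))

In this caricature the sensational item takes the top slot. The specific weights are made up, but the structural point stands: accuracy is simply absent from the quantity being optimized, so the economic incentive and the curation objective push in the same direction.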
What has not changed: Cognitive biases of human consumers. Though the above technology trends have contributed to creating the currently thriving mis/disinformation ecosystem, ultimately, the content that is effective relies on something that has not changed: the cognitive biases of human consumers [6, 89].
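Returning to item 6 above, the following toy simulation (hypothetical numbers only; not a model of any real platform) shows how a small pool of coordinated automated accounts can dominate a post’s visible share count relative to organic sharing, which is precisely the popularity signal that both readers and trending algorithms tend to trust.

    import random

    def apparent_popularity(organic_users: int, organic_share_prob: float,
                            bot_accounts: int, shares_per_bot: int,
                            seed: int = 0) -> dict:
        """Toy model comparing organic shares to shares from coordinated sybil accounts."""
        rng = random.Random(seed)
        organic_shares = sum(1 for _ in range(organic_users)
                             if rng.random() < organic_share_prob)
        bot_shares = bot_accounts * shares_per_bot
        total = organic_shares + bot_shares
        return {
            "organic_shares": organic_shares,
            "bot_shares": bot_shares,
            "total_shares": total,
            "fraction_inauthentic": round(bot_shares / total, 3) if total else 0.0,
        }

    if __name__ == "__main__":
        # 10,000 real users who each share with 0.5% probability,
        # versus 200 automated accounts sharing 25 times each.
        print(apparent_popularity(10_000, 0.005, 200, 25))

With these made-up parameters, inauthentic accounts contribute roughly 99% of the visible shares, which is why weak identity and account management makes raw popularity counts an unreliable signal of genuine interest.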
