Computational Propaganda


Oxford Studies in Digital Politics
Series Editor: Andrew Chadwick, Professor of Political Communication in the Centre for Research in Communication and Culture and the Department of Social Sciences, Loughborough University

Using Technology, Building Democracy: Digital Campaigning and the Construction of Citizenship, by Jessica Baldwin-Philippi
Taking Our Country Back: The Crafting of Networked Politics from Howard Dean to Barack Obama, by Daniel Kreiss
Expect Us: Online Communities and Political Mobilization, by Jessica L. Beyer
Media and Protest Logics in the Digital Era: The Umbrella Movement in Hong Kong, by Francis L.F. Lee and Joseph M. Chan
If...Then: Algorithmic Power and Politics, by Taina Bucher
Bits and Atoms: Information and Communication Technology in Areas of Limited Statehood, by Steven Livingston and Gregor Walter-Drop
The Hybrid Media System: Politics and Power, by Andrew Chadwick
Digital Cities: The Internet and the Geography of Opportunity, by Karen Mossberger, Caroline J. Tolbert, and William W. Franko
The Only Constant Is Change: Technology, Political Communication, and Innovation Over Time, by Ben Epstein
Revolution Stalled: The Political Limits of the Internet in the Post-Soviet Sphere, by Sarah Oates
Tweeting to Power: The Social Media Revolution in American Politics, by Jason Gainous and Kevin M. Wagner
Disruptive Power: The Crisis of the State in the Digital Age, by Taylor Owen
Risk and Hyperconnectivity: Media and Memories of Neoliberalism, by Andrew Hoskins and John Tulloch
Affective Publics: Sentiment, Technology, and Politics, by Zizi Papacharissi
Democracy’s Fourth Wave?: Digital Media and the Arab Spring, by Philip N. Howard and Muzammil M. Hussain
The Citizen Marketer: Promoting Political Opinion in the Social Media Age, by Joel Penney
The Digital Origins of Dictatorship and Democracy: Information Technology and Political Islam, by Philip N. Howard
China’s Digital Nationalism, by Florian Schneider
Presidential Campaigning in the Internet Age, by Jennifer Stromer-Galley
Analytic Activism: Digital Listening and the New Political Strategy, by David Karpf
News on the Internet: Information and Citizenship in the 21st Century, by David Tewksbury and Jason Rittenberg
The MoveOn Effect: The Unexpected Transformation of American Political Advocacy, by David Karpf
The Civic Organization and the Digital Citizen: Communicating Engagement in a Networked Age, by Chris Wells
Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy, by Daniel Kreiss
Networked Publics and Digital Contention: The Politics of Everyday Life in Tunisia, by Mohamed Zayani

Computational Propaganda
POLITICAL PARTIES, POLITICIANS, AND POLITICAL MANIPULATION ON SOCIAL MEDIA
EDITED BY SAMUEL C. WOOLLEY AND PHILIP N. HOWARD

Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and certain other countries. Published in the United States of America by Oxford University Press, 198 Madison Avenue, New York, NY 10016, United States of America.

© Samuel C. Woolley and Philip N. Howard 2019

All rights reserved.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by license, or under terms agreed with the appropriate reproduction rights organization. Inquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above. You must not circulate this work in any other form and you must impose this same condition on any acquirer.

CIP data is on file at the Library of Congress
ISBN 978-0-19-093141-4 (pbk.)
ISBN 978-0-19-093140-7 (hbk.)

9 8 7 6 5 4 3 2 1

Paperback printed by Sheridan Books, Inc., United States of America
Hardback printed by Bridgeport National Bindery, Inc., United States of America

Contents

Part I: THEORETICAL INTRODUCTION AND ANALYTICAL FRAME
Introduction: Computational Propaganda Worldwide 3
SAMUEL C. WOOLLEY AND PHILIP N. HOWARD

Part II: COUNTRY-SPECIFIC CASE STUDIES
1. Russia: The Origins of Digital Misinformation 21
SERGEY SANOVICH
2. Ukraine: External Threats and Internal Challenges 41
MARIIA ZHDANOVA AND DARIYA ORLOVA
3. Canada: Building Bot Typologies 64
ELIZABETH DUBOIS AND FENWICK MCKELVEY
4. Poland: Unpacking the Ecosystem of Social Media Manipulation 86
ROBERT GORWA
5. Taiwan: Digital Democracy Meets Automated Autocracy 104
NICHOLAS J. MONACO
6. Brazil: Political Bot Intervention During Pivotal Events 128
DAN ARNAUDO
7. Germany: A Cautionary Tale 153
LISA-MARIA N. NEUDERT
8. United States: Manufacturing Consensus Online 185
SAMUEL C. WOOLLEY AND DOUGLAS GUILBEAULT
9. China: An Alternative Model of a Widespread Practice 212
GILLIAN BOLSOVER

Part III: CONCLUSIONS
Conclusion: Political Parties, Politicians, and Computational Propaganda 241
SAMUEL C. WOOLLEY AND PHILIP N. HOWARD

Author Bios 249
Index 253

Part I: THEORETICAL INTRODUCTION AND ANALYTICAL FRAME

Introduction: Computational Propaganda Worldwide
SAMUEL C. WOOLLEY AND PHILIP N. HOWARD

What Is Computational Propaganda?

Digital technologies hold great promise for democracy. Social media tools and the wider resources of the Internet offer tremendous access to data, knowledge, social networks, and collective engagement opportunities, and can help us to build better democracies (Howard, 2015; Margetts et al., 2015). Unwelcome obstacles are, however, disrupting the creative democratic applications of information technologies (Woolley, 2016; Gallacher et al., 2017; Vosoughi, Roy, & Aral, 2018). Massive social platforms like Facebook and Twitter are struggling to come to grips with the ways their creations can be used for political control. Social media algorithms may be creating echo chambers in which public conversations get polluted and polarized. Surveillance capabilities are outstripping civil protections. Political “bots” (software agents used to generate simple messages and “conversations” on social media) are masquerading as genuine grassroots movements to manipulate public opinion. Online hate speech is gaining currency. Malicious actors and digital marketers run junk news factories that disseminate misinformation to harm opponents or earn click-through advertising revenue.

It is no exaggeration to say that coordinated efforts are even now working to seed chaos in many political systems worldwide.
Some militaries and intelligence agencies are making use of social media as conduits to undermine democratic processes and bring down democratic institutions altogether (Bradshaw & Howard, 2017). Most democratic governments are preparing their legal and regulatory responses. But unintended consequences from over-regulation, or regulation uninformed by systematic research, may be as damaging to democratic systems as the threats themselves.

We live in a time of extraordinary political upheaval and change, with political movements and parties rising and declining rapidly (Kreiss, 2016; Anstead, 2017). In this fluctuating political environment, digital technologies provide the platform for a great deal of contemporary civic engagement and political action (Vaccari, 2017). Indeed, a large amount of research has shown that social media play an important role in the circulation of ideas and conversation about politics and public policy. Increasingly, however, social media platforms are also vehicles for manipulative disinformation campaigns. Political campaigns, governments, and regular citizens around the world are employing combinations of people and bots—automated software built to mimic real users—in an attempt to artificially shape public life (Woolley, 2016; Gallacher et al., 2017). But there are still open, and difficult to answer, questions about the specific mechanisms of influence for particular voters, and how governments, news organizations, and civil society groups should respond. How do new forms of civic engagement affect political outcomes? To what extent do online echo chambers and selective exposure to information promote political extremism? How can civil activists respond effectively to “trolling” by hostile political agents?

Computational propaganda is a term that neatly encapsulates this recent phenomenon—and emerging field of study—of digital misinformation and manipulation. As a communicative practice, computational propaganda describes the use of algorithms, automation, and human curation to purposefully manage and distribute misleading information over social media networks (Woolley & Howard, 2016a). As part of the process, coders and their automated software products (including bots) will learn from and imitate legitimate social media users in order to manipulate public opinion across a diverse range of platforms and device networks. These bots are built to behave like real people (for example, automatically generating and responding to conversations online) and then let loose over social media sites in order to amplify or suppress particular political messages. These “automated social actors” can be used to bolster particular politicians and policy positions—supporting them actively and enthusiastically, while simultaneously drowning out any dissenting voices (Abokhodair, Yoo, & McDonald, 2015).
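As a purely illustrative aside (not drawn from the book), the behavioral signals this passage describes, high-volume, repetitive, message-amplifying accounts, are the kind of thing researchers often screen for with very simple heuristics before any deeper analysis. A minimal sketch in Python, with hypothetical thresholds and data shapes, might look like this:

```python
# Illustrative sketch only: a naive first-pass filter for possibly automated
# accounts, based on two signals mentioned above: unusually high posting
# volume and heavy repetition of near-identical messages. Thresholds and
# inputs are hypothetical, not taken from the book or its chapters.
from collections import Counter
from datetime import datetime
from typing import List


def duplicate_ratio(messages: List[str]) -> float:
    """Share of messages that repeat an earlier message verbatim (case-insensitive)."""
    if not messages:
        return 0.0
    counts = Counter(m.strip().lower() for m in messages)
    duplicates = sum(c - 1 for c in counts.values())
    return duplicates / len(messages)


def posts_per_day(timestamps: List[datetime]) -> float:
    """Average daily posting rate over the observed window."""
    if len(timestamps) < 2:
        return float(len(timestamps))
    span_days = max((max(timestamps) - min(timestamps)).days, 1)
    return len(timestamps) / span_days


def looks_automated(messages: List[str], timestamps: List[datetime],
                    rate_threshold: float = 50.0, dup_threshold: float = 0.5) -> bool:
    """Flag an account that both posts at very high volume and mostly repeats itself."""
    return (posts_per_day(timestamps) >= rate_threshold
            and duplicate_ratio(messages) >= dup_threshold)
```

Real bot-detection work, including the country studies collected in this volume, relies on far richer features and human verification; a heuristic like this only narrows the pool of accounts worth closer inspection.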
Recommended publications
  • How the Chinese Government Fabricates Social Media Posts
    American Political Science Review (2017) 111, 3, 484–501. doi:10.1017/S0003055417000144. © American Political Science Association 2017

    How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument
    GARY KING, Harvard University; JENNIFER PAN, Stanford University; MARGARET E. ROBERTS, University of California, San Diego

    The Chinese government has long been suspected of hiring as many as 2 million people to surreptitiously insert huge numbers of pseudonymous and other deceptive writings into the stream of real social media posts, as if they were the genuine opinions of ordinary people. Many academics, and most journalists and activists, claim that these so-called 50c party posts vociferously argue for the government’s side in political and policy debates. As we show, this is also true of most posts openly accused on social media of being 50c. Yet almost no systematic empirical evidence exists for this claim or, more importantly, for the Chinese regime’s strategic objective in pursuing this activity. In the first large-scale empirical analysis of this operation, we show how to identify the secretive authors of these posts, the posts written by them, and their content. We estimate that the government fabricates and posts about 448 million social media comments a year. In contrast to prior claims, we show that the Chinese regime’s strategy is to avoid arguing with skeptics of the party and the government, and to not even discuss controversial issues. We show that the goal of this massive secretive operation is instead to distract the public and change the subject, as most of these posts involve cheerleading for China, the revolutionary history of the Communist Party, or other symbols of the regime.
  • Information Systems in Great Power Competition with China
    STRATEGIC STUDIES QUARTERLY - PERSPECTIVE

    Decide, Disrupt, Destroy: Information Systems in Great Power Competition with China
    AINIKKI RIIKONEN

    Abstract
    Technologies for creating and distributing knowledge have impacted international politics and conflict for centuries, and today the infrastructure for communicating knowledge has expanded. These technologies, along with attempts to exploit their vulnerabilities, will shape twenty-first-century great power competition between the US and China. Likewise, great power competition will shape the way China develops and uses these technologies across the whole spectrum of competition to make decisions, disrupt the operational environment, and destroy adversary capabilities.

    *****

    The 2018 US National Defense Strategy (NDS) cites Russia and the People’s Republic of China (PRC) as “revisionist powers” that “want to shape a world consistent with their authoritarian model—gaining veto authority over other nations’ economic, diplomatic, and security decisions.” It describes these countries as competitors seeking to use “other areas of competition short of open warfare to achieve their [authoritarian] ends” and to “optimize their targeting of our battle networks and operational concepts.” The NDS assesses that competition will occur along the entire spectrum of statecraft from peace to open conflict and that Russia and the PRC will align their foreign policies with their models of governance. If this assessment is correct, and if technology plays a significant role in international politics, then technology will affect the whole spectrum of great power competition and conflict. Information architecture—the structures of technology that collect and relay information worldwide—is innately connected to power projection. The PRC has been innovating in this area, and its expanded information capabilities—and risks to US capabilities—will shape competition in the twenty-first century.
  • Freedom on the Net 2016
    FREEDOM ON THE NET 2016: China

    Population: 1.371 billion
    Internet Penetration 2015 (ITU): 50 percent
    Social Media/ICT Apps Blocked: Yes
    Political/Social Content Blocked: Yes
    Bloggers/ICT Users Arrested: Yes
    Press Freedom 2016 Status: Not Free

                                        2015      2016
    Internet Freedom Status             Not Free  Not Free
    Obstacles to Access (0-25)          18        18
    Limits on Content (0-35)            30        30
    Violations of User Rights (0-40)    40        40
    TOTAL* (0-100)                      88        88
    * 0=most free, 100=least free

    Key Developments: June 2015 – May 2016
    • A draft cybersecurity law could step up requirements for internet companies to store data in China, censor information, and shut down services for security reasons, under the auspices of the Cyberspace Administration of China (see Legal Environment).
    • An antiterrorism law passed in December 2015 requires technology companies to cooperate with authorities to decrypt data, and introduced content restrictions that could suppress legitimate speech (see Content Removal and Surveillance, Privacy, and Anonymity).
    • A criminal law amendment effective since November 2015 introduced penalties of up to seven years in prison for posting misinformation on social media (see Legal Environment).
    • Real-name registration requirements were tightened for internet users, with unregistered mobile phone accounts closed in September 2015, and app providers instructed to register and store user data in 2016 (see Surveillance, Privacy, and Anonymity).
    • Websites operated by the South China Morning Post, The Economist and Time magazine were among those newly blocked for reporting perceived as critical of President Xi Jinping (see Blocking and Filtering).

    www.freedomonthenet.org

    Introduction
    China was the world’s worst abuser of internet freedom in the 2016 Freedom on the Net survey for the second consecutive year.
  • How the Vietnamese State Uses Cyber Troops to Shape Online Discourse
    ISSUE: 2021 No. 22, ISSN 2335-6677
    RESEARCHERS AT ISEAS – YUSOF ISHAK INSTITUTE ANALYSE CURRENT EVENTS
    Singapore | 3 March 2021

    How the Vietnamese State Uses Cyber Troops to Shape Online Discourse
    Dien Nguyen An Luong*

    [Photo caption: A Vietnamese youth checks his mobile phone on the century-old Long Bien Bridge in Hanoi on September 3, 2019. Vietnamese authorities have handled public political criticism - both online and in real life - with a calibrated mixture of toleration, responsiveness and repression. Photo: Manan VATSYAYANA, AFP.]

    * Dien Nguyen An Luong is Visiting Fellow with the Media, Technology and Society Programme of the ISEAS – Yusof Ishak Institute. A journalist with significant experience as managing editor at Vietnam’s top newsrooms, his work has also appeared in the New York Times, the Washington Post, the Guardian, South China Morning Post, and other publications.

    EXECUTIVE SUMMARY
    • The operations of Vietnam’s public opinion shapers and cyber-troops reveal that the online discourse is manipulated to enforce the Communist Party’s line.
    • Vietnamese authorities constantly grapple with the vexing question: How to strike a delicate balance between placating critical public sentiment online while ensuring that it does not spill over into protests against the regime.
    • When it comes to methods, targets and motives, there appears to be significant crossover between public opinion shapers and the government’s cyber troops.
    • The Vietnamese state cyber-troops have been encouraged to use real accounts to mass-report content. This helps explain why it is the only Southeast Asian state to publicly acknowledge having a military cyber unit.
  • Zerohack Zer0pwn Youranonnews Yevgeniy Anikin Yes Men
    Zerohack Zer0Pwn YourAnonNews Yevgeniy Anikin Yes Men YamaTough Xtreme x-Leader xenu xen0nymous www.oem.com.mx www.nytimes.com/pages/world/asia/index.html www.informador.com.mx www.futuregov.asia www.cronica.com.mx www.asiapacificsecuritymagazine.com Worm Wolfy Withdrawal* WillyFoReal Wikileaks IRC 88.80.16.13/9999 IRC Channel WikiLeaks WiiSpellWhy whitekidney Wells Fargo weed WallRoad w0rmware Vulnerability Vladislav Khorokhorin Visa Inc. Virus Virgin Islands "Viewpointe Archive Services, LLC" Versability Verizon Venezuela Vegas Vatican City USB US Trust US Bankcorp Uruguay Uran0n unusedcrayon United Kingdom UnicormCr3w unfittoprint unelected.org UndisclosedAnon Ukraine UGNazi ua_musti_1905 U.S. Bankcorp TYLER Turkey trosec113 Trojan Horse Trojan Trivette TriCk Tribalzer0 Transnistria transaction Traitor traffic court Tradecraft Trade Secrets "Total System Services, Inc." Topiary Top Secret Tom Stracener TibitXimer Thumb Drive Thomson Reuters TheWikiBoat thepeoplescause the_infecti0n The Unknowns The UnderTaker The Syrian electronic army The Jokerhack Thailand ThaCosmo th3j35t3r testeux1 TEST Telecomix TehWongZ Teddy Bigglesworth TeaMp0isoN TeamHav0k Team Ghost Shell Team Digi7al tdl4 taxes TARP tango down Tampa Tammy Shapiro Taiwan Tabu T0x1c t0wN T.A.R.P. Syrian Electronic Army syndiv Symantec Corporation Switzerland Swingers Club SWIFT Sweden Swan SwaggSec Swagg Security "SunGard Data Systems, Inc." Stuxnet Stringer Streamroller Stole* Sterlok SteelAnne st0rm SQLi Spyware Spying Spydevilz Spy Camera Sposed Spook Spoofing Splendide
  • Chinese Computational Propaganda: Automation, Algorithms and the Manipulation of Information About Chinese Politics on Twitter and Weibo
    This is a repository copy of "Chinese computational propaganda: automation, algorithms and the manipulation of information about Chinese politics on Twitter and Weibo."

    White Rose Research Online URL for this paper: http://eprints.whiterose.ac.uk/136994/
    Version: Accepted Version

    Article: Bolsover, G (orcid.org/0000-0003-2982-1032) and Howard, P (2019) Chinese computational propaganda: automation, algorithms and the manipulation of information about Chinese politics on Twitter and Weibo. Information, Communication & Society, 22 (14). pp. 2063-2080. ISSN 1369-118X. https://doi.org/10.1080/1369118X.2018.1476576

    © 2018 Informa UK Limited, trading as Taylor & Francis Group. This is an Accepted Manuscript of an article published by Taylor & Francis in Information, Communication & Society on 24 May 2018, available online: https://doi.org/10.1080/1369118X.2018.1476576

    Chinese computational propaganda: automation, algorithms and the manipulation of information about Chinese politics on Twitter and Weibo
    Gillian Bolsover and Philip Howard
    Oxford Internet Institute, University of Oxford, Oxford, UK.
  • Coordinating Across Chaos: the Practice of Transnational Internet Security Collaboration
    COORDINATING ACROSS CHAOS: THE PRACTICE OF TRANSNATIONAL INTERNET SECURITY COLLABORATION

    A Dissertation Presented to The Academic Faculty by Tarun Chaudhary, in Partial Fulfillment of the Requirements for the Degree International Affairs, Science, and Technology in the Sam Nunn School of International Affairs, Georgia Institute of Technology, May 2019
    Copyright © 2019 by Tarun Chaudhary

    Approved by:
    Dr. Adam N. Stulberg, School of International Affairs, Georgia Institute of Technology
    Dr. Peter K. Brecke, School of International Affairs, Georgia Institute of Technology
    Dr. Michael D. Salomone, School of International Affairs, Georgia Institute of Technology
    Dr. Milton L. Mueller, School of Public Policy, Georgia Institute of Technology
    Dr. Jennifer Jordan, School of International Affairs, Georgia Institute of Technology
    Date Approved: March 11, 2019

    ACKNOWLEDGEMENTS
    I was once told that writing a dissertation is a lonely experience. This is only partially true. The experience of researching and writing this work has been supported and encouraged by a small army of individuals I am forever grateful toward. My wife Jamie has been a truly patient soul and encouraging beyond measure, while also being my intellectual sounding board, always helping guide me to deeper insight. I have benefited from an abundance of truly wonderful teachers over the course of my academic life. Dr. Michael Salomone steered me toward the world of international security studies when I was an undergraduate; I am thankful for his wisdom and the tremendous amount of support he has given me over the past two decades. The rest of my committee has been equally encouraging and provided me with countless insights as this work has been gestating and evolving.
  • Forbidden Feeds: Government Controls on Social Media in China
    FORBIDDEN FEEDS: Government Controls on Social Media in China
    March 13, 2018
    © 2018 PEN America. All rights reserved.

    PEN America stands at the intersection of literature and human rights to protect open expression in the United States and worldwide. We champion the freedom to write, recognizing the power of the word to transform the world. Our mission is to unite writers and their allies to celebrate creative expression and defend the liberties that make it possible. Founded in 1922, PEN America is the largest of more than 100 centers of PEN International. Our strength is in our membership—a nationwide community of more than 7,000 novelists, journalists, poets, essayists, playwrights, editors, publishers, translators, agents, and other writing professionals. For more information, visit pen.org.

    Cover Illustration: Badiucao

    CONTENTS
    EXECUTIVE SUMMARY 4
    INTRODUCTION: AN UNFULFILLED PROMISE 7
    OUTLINE AND METHODOLOGY 10
    KEY FINDINGS 11
    SECTION I: AN OVERVIEW OF THE SYSTEM OF SOCIAL MEDIA CENSORSHIP 12
    The Prevalence of Social Media Usage in China 12
    Digital Rights—Including the Right to Free Expression—Under International Law 14
    China’s Control of Online Expression: A Historical Perspective 15
    State Control over Social Media: Policy 17
    State Control over Social Media: Recent Laws and Regulations 18
    SECTION II: SOCIAL MEDIA CENSORSHIP IN PRACTICE 24
    A Typology of Censored Topics 24
    The Corporate Responsibility to Censor its Users 29
    The Mechanics of Censorship 32
    Tibet and
  • The Post-Trauma of the Great Patriotic War in Russia
    The Post-Trauma of the Great Patriotic War in Russia
    AN ESSAY BY ELIZAVETA GAUFMAN, University of Bremen

    Abstract: Collective memory often functions as embeddedness for a narrative that can have profound legitimation consequences. In order to make a population ‘buy’ a narrative, memory entrepreneurs can manipulate traumatic memories in a population to justify the subversion of democratic processes, which is particularly dangerous. The ‘Great Patriotic War’, as World War II is known in Russia, commemorates not just the defeat of fascism, but also the survival of the nation in the face of extinction. It is also the most important heroic and unifying event in recent Russian history and is now actively used in nation-building efforts. The main argument of this essay is that due to the very traumatic nature of the collective memory of the Great Patriotic War in Russia, its citizens are bound to react in an emotional way to the issues that are discursively connected to the war.

    Keywords: Russia, Ukraine, trauma, fascism, commemoration, memory, epigenetics

    ‘To have the glory of the past in common, a shared will in the present; to have done great deeds together and want to do more of them, are the essential conditions for the constitution of a people’.
    Ernest Renan

    Renan was referring to nation-building in late 19th century France, but his words ring true today. Memory, or to be more precise, emotive memory is indispensable for nation-building; almost all nations have foundational myths that are based on more or less authentic memories of greatness and suffering.
  • Dataset of Propaganda Techniques of the State-Sponsored Information Operation of the People’s Republic of China
    Dataset of Propaganda Techniques of the State-Sponsored Information Operation of the People’s Republic of China
    Rong-Ching Chang ([email protected]), Chun-Ming Lai ([email protected]), Kai-Lai Chang ([email protected]), and Chu-Hsing Lin ([email protected]), Tunghai University, Taiwan

    ABSTRACT
    Digital media, identified as computational propaganda, provides a pathway for propaganda to expand its reach without limit. State-backed propaganda aims to shape the audiences’ cognition toward entities in favor of a certain political party or authority. Furthermore, it has become part of modern information warfare used in order to gain an advantage over opponents. Most current studies focus on using machine learning, quantitative, and qualitative methods to distinguish whether a certain piece of information on social media is propaganda. These studies are mainly conducted on English content, and very little research addresses Chinese Mandarin content. Going beyond propaganda detection, we want to go one step further and provide more fine-grained information on the propaganda techniques that are applied.

    ACM Reference Format:
    Rong-Ching Chang, Chun-Ming Lai, Kai-Lai Chang, and Chu-Hsing Lin. 2021. Dataset of Propaganda Techniques of the State-Sponsored Information Operation of the People’s Republic of China. In KDD ’21: The Second International MIS2 Workshop: Misinformation and Misbehavior Mining on the Web, Aug 15, 2021, Virtual. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/nnnnnnn.nnnnnnn

    1 INTRODUCTION
    Propaganda has the purpose of framing and influencing opinions. With the rise of the internet and social media, propaganda has adopted a powerful tool for its unlimited reach, as well as multiple forms of content that can further drive engagement online and
  • The ‘New’ Private Security Industry, the Private Policing of Cyberspace and the Regulatory Questions (Mark Button)
    The ‘New’ Private Security Industry, the Private Policing of Cyberspace and the Regulatory Questions
    Mark Button, Institute of Criminal Justice Studies, University of Portsmouth, Portsmouth PO1 2HY, UK. Email: [email protected]
    Published in Journal of Contemporary Criminal Justice

    Abstract
    This paper explores the growth of the ‘new’ private security industry and private policing arrangements policing cyberspace. It argues there has been a significant change in policing which is equivalent to the ‘quiet revolution’ associated with private policing that Shearing and Stenning observed in the 1970s and 1980s, marking a ‘second quiet revolution’. The paper then explores some of the regulatory questions that arise from these changes, which have been largely ignored, to date, by scholars of policing and policy-makers, and makes some clear recommendations for their future focus.

    Keywords: private security, private policing, regulation, cybercrime and cyberspace

    Introduction
    Writing in the late 1970s and early 1980s, Shearing and Stenning observed the substantial changes to policing taking place at the time in North America, describing the transformation as a ‘quiet revolution.’ They noted the substantial growth of private security, linked to the advance of mass private property and the under-funding of the police, with a sector focused upon preventative, rather than curative, policing (Stenning and
  • Armchair Detectives and the Social Construction of Falsehoods
    ARMCHAIR DETECTIVES AND THE SOCIAL CONSTRUCTION OF FALSEHOODS: EMERGENT MOB BEHAVIOR ON THE INTERNET

    A dissertation submitted to the Graduate Division of the University of Hawai’i at Mānoa in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Sociology, May 2017
    By Penn Pantumsinchai

    Dissertation Committee:
    Patricia Steinhoff, Chairperson
    David Johnson
    Krysia Mossakowski
    Wayne Buente
    Debora Halbert

    Keywords: Mob justice, online communities, internet vigilantism, collective intelligence, social control, collective behavior

    ©2017, Penn Pantumsinchai. All Rights Reserved.

    ACKNOWLEDGEMENTS
    I dedicate this dissertation to my father, who was the inspiration for me to pursue my PhD and become the second Doctor of the family. In really tough times, his stories of how he survived the PhD and wrote his dissertation gave me hope. Thank you to my dear mother, who always checked in on me to see how I was doing, and encouraged me to never give up, and never take the easier route. Thank you to my brother Nate, who actually gave me the idea to pursue this murky topic of mob justice in the first place. The dissertation would never have happened otherwise! Thank you to my second brother Dan and his family, Jayon and Sienna, for their endless support and encouragements, as well as a healthy supply of chocolates throughout the years of my PhD. More than anyone, thank you to my beloved aunt, Pabol, who supported my studies through all these years financially and morally. She always believed in me and I hope I have made her proud. My family’s staunch support was the pillar that kept me going.