Facebook's "Oversight Board:" Move Fast with Stable Infrastructure and Humility
Recommended publications
“At the End of the Day Facebook Does What It Wants”: How Users Experience Contesting Algorithmic Content Moderation
Kristen Vaccaro, University of Illinois Urbana-Champaign; Christian Sandvig, University of Michigan; Karrie Karahalios, University of Illinois Urbana-Champaign

Interest has grown in designing algorithmic decision making systems for contestability. In this work, we study how users experience contesting unfavorable social media content moderation decisions. A large-scale online experiment tests whether different forms of appeals can improve users’ experiences of automated decision making. We study the impact on users’ perceptions of the Fairness, Accountability, and Trustworthiness of algorithmic decisions, as well as their feelings of Control (FACT). Surprisingly, we find that none of the appeal designs improve FACT perceptions compared to a no-appeal baseline. We qualitatively analyze how users write appeals, and find that they contest the decision itself, but also more fundamental issues like the goal of moderating content, the idea of automation, and the inconsistency of the system as a whole. We conclude with suggestions for – as well as a discussion of the challenges of – designing for contestability.

CCS Concepts: • Human-centered computing → Human computer interaction (HCI); Social media.
Additional Key Words and Phrases: content moderation; algorithmic experience
ACM Reference Format: Kristen Vaccaro, Christian Sandvig, and Karrie Karahalios. 2020. “At the End of the Day Facebook Does What It Wants”: How Users Experience Contesting Algorithmic Content Moderation. Proc. ACM Hum.-Comput. Interact. 4, CSCW2, Article 167 (October 2020), 22 pages. https://doi.org/10.1145/3415238

1 INTRODUCTION
As algorithmic decision making systems become both more prevalent and more visible, interest has grown in how to design them to be trustworthy, understandable and fair.
Lawless: the secret rules that govern our digital lives (and why we need new digital constitutions that protect our rights)
Nicolas P. Suzor. Submitted version; forthcoming 2019, Cambridge University Press.

Table of Contents
Part I: A lawless internet
  Chapter 1. The hidden rules of the internet
    Process matters
  Chapter 2. Who makes the rules?
    Whose values apply?
    The moderation process
    Bias and accountability
  Chapter 3. The internet’s abuse problem
    Abuse reflects and reinforces systemic inequalities
    Dealing with abuse needs the involvement of platforms
How Bad Can It Git? Characterizing Secret Leakage in Public GitHub Repositories
Michael Meli, North Carolina State University; Matthew R. McNiece, North Carolina State University and Cisco Systems, Inc.; Bradley Reaves, North Carolina State University

Abstract—GitHub and similar platforms have made public collaborative development of software commonplace. However, a problem arises when this public code must manage authentication secrets, such as API keys or cryptographic secrets. These secrets must be kept private for security, yet common development practices like adding these secrets to code make accidental leakage frequent. In this paper, we present the first large-scale and longitudinal analysis of secret leakage on GitHub. We examine billions of files collected using two complementary approaches: a nearly six-month scan of real-time public GitHub commits and a public snapshot covering 13% of open-source repositories. We focus on private key files and 11 high-impact platforms with distinctive API key formats. This focus allows us to develop conservative detection techniques that we manually and automatically evaluate to ensure accurate results.

Secrets leaked in this way have been exploited before [4], [8], [21], [25], [41], [46]. While this problem is known, it remains unknown to what extent secrets are leaked and how attackers can efficiently and effectively extract these secrets. In this paper, we present the first comprehensive, longitudinal analysis of secret leakage on GitHub. We build and evaluate two different approaches for mining secrets: one is able to discover 99% of newly committed files containing secrets in real time, while the other leverages a large snapshot covering 13% of all public repositories, some dating to GitHub’s creation. We examine millions of repositories and billions of files to recover hundreds of thousands of secrets targeting 11 different platforms.
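The abstract’s emphasis on “distinctive API key formats” points at regex-first matching: key formats tight enough that a match is rarely a false positive. A minimal sketch of that idea in Python, assuming a small illustrative pattern set (the AWS and Google formats below are widely documented, but this is not the authors’ actual rule set):

```python
import re

# Illustrative "distinctive format" patterns: each key type has a tight
# regex, which keeps the false-positive rate low. The selection here is an
# assumption for illustration only.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "google_api_key": re.compile(r"\bAIza[0-9A-Za-z\-_]{35}"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC |DSA )?PRIVATE KEY-----"),
}

def scan_text(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_string) pairs found in text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group(0)))
    return hits

if __name__ == "__main__":
    sample = 'aws_key = "AKIAIOSFODNN7EXAMPLE"  # accidentally committed'
    for name, value in scan_text(sample):
        print(f"possible {name}: {value}")
```

Such conservative patterns trade recall for precision, which matches the paper’s stated goal of detection techniques accurate enough to evaluate at the scale of billions of files.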
Being ‘Targeted’ about Content Moderation: Strategies for Consistent, Scalable and Effective Response to Disruption & Harm
Robert Lewington (Twitch) & The Fair Play Alliance Executive Steering Committee, April 2021

Content Moderation: Best Practices for Targeted Reporting & Reactive UGC Management at Scale (March 2021)

Abstract
This document provides replicable best practice information on how to moderate User-Generated Content (UGC) in social applications or services (including digital media and video games). Its focus is on reactive moderation, a central component of the growing content moderation toolkit, where a service provider responds to reports submitted by users of its service regarding UGC that may violate its Terms of Service. Specifically, the document explores and advocates for a ‘targeted’ approach to the creation of reporting mechanisms. This allows users to closely identify the specific infraction and utilise evidence of the infraction—access to which is facilitated as part of the design of the reporting process—enabling consistent content moderation at scale. Note, however, that we will also make passing reference to pre, post and proactive (see Appendix A) moderation approaches. Specifics of how best to tailor these best practices to a particular application or service will differ based on various parameters, including: type of service (social media, video game etc.); type of media (text, image, audio, video etc.); sharing mechanism (feed/gallery, avatar, communication etc.); persistence (ephemeral vs. static/immutable) and others. This document should therefore be considered a set of high-level instructive principles rather than prescriptive guidelines.

Contents
i. Background
ii. The Content Moderation Flywheel
  - Community Guidelines/Code of Conduct
  - Targeted reporting
    - Context
  - Scalable Content Moderation
  - Education
  - Technology
iii. …
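A ‘targeted’ report, as the abstract describes it, ties a report to one specific piece of UGC and carries its evidence along. A minimal sketch of what such a report payload might look like in Python (all field names here are assumptions for illustration, not the Fair Play Alliance’s schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TargetedReport:
    """A user report bound to one specific piece of UGC.

    Binding the report to a content ID and a Terms-of-Service category
    (rather than a free-form complaint) is what makes triage consistent
    and scalable.
    """
    reporter_id: str
    content_id: str          # the exact message/clip/image being reported
    infraction: str          # one of the service's infraction categories
    context_ids: list[str] = field(default_factory=list)  # surrounding content captured as evidence
    note: str = ""           # optional free-text detail from the reporter
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example: a chat-harassment report with two earlier messages as context.
report = TargetedReport(
    reporter_id="u123",
    content_id="msg_9876",
    infraction="harassment",
    context_ids=["msg_9874", "msg_9875"],
)
```

Capturing the context IDs at report time is the design point the document stresses: the moderator receives the evidence with the report instead of reconstructing it afterwards.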
The “Weaponization” of Facebook in Myanmar: A Case for Corporate Criminal Liability
Neriah Yue†

The advent of social media platforms in the mid-2000s increased global communication and encouraged innovative activism by ushering in new, effective ways to organize and protest. News agencies have recently reported the misuse of these platforms by individual actors and authoritarian regimes. Autocrats, in particular, twist social media platforms into weapons to silence dissent and spread hate speech. The latter category, hate speech, has contributed to some of the gravest human rights abuses globally. The increased spotlight on the weaponization of social media has motivated scholars, states, and companies to revisit the theory of corporate responsibility. This Note unpacks the potential criminal liability of social media companies for misuse of their platforms that results in grave human rights violations. Specifically, it explores Facebook’s corporate criminal liability in light of authoritarian regimes’ misuse of its platform to incite crimes against humanity. This Note will not cover jurisdictional issues regarding corporate criminal liability. Rather, it identifies, on a theoretical level, which crimes, if any, social media corporations could be held accountable for under international criminal law. While there remain significant obstacles to prosecuting such cases, this Note identifies an accountability gap between Facebook’s actions and the victims of human rights abuses that occur on its platform. Ultimately, this Note concludes that corporate criminal liability is an effective means of ensuring that social media companies remain responsible for doing their part to uphold human rights.

† J.D. Candidate 2020, University of California, Hastings College of the Law; Executive Managing Editor, Hastings Law Journal.
Sarahah (National White Collar Crime Center)

Background
Sarahah is a messaging application launched in November 2016 by Saudi developer Zain al-Abidin Tawfiq. The word “Sarahah” is the pronunciation of the Arabic word for “honesty.” It was originally launched as a service for businesses in Arabic-speaking regions to solicit anonymous, candid feedback from their employees and co-workers. However, it quickly went viral in Saudi Arabia and Egypt as an anonymous messaging application. Building on its regional success, Sarahah rapidly gained traction in North America, Europe, and Australia. A recent update integrated its functionality and network into Snapchat, one of the most popular social media apps in the world. This prompted explosive growth in popularity, with over 14 million registered users and 20 million unique daily visitors (it is possible to leave messages in Sarahah without creating an account).

What is Sarahah?
Sarahah provides a free network for leaving anonymous messages through a public profile that a user shares with other people. Anyone with a user’s profile name can anonymously message that user, without necessarily creating an account. Sarahah can be used via web browser or by installing an app on an iOS or Android device. New accounts can be created only via the mobile app. Registration is required in order to receive messages. New users register with an email address, username, password, first and last name, and “handle” or profile name. The username is used to log in to the service, but only the profile name is displayed to other users. The personal link to receive anonymous messages automatically becomes www.PROFILE_NAME.Sarahah.com, and cannot be changed.
Popular Applications

KIK – Messaging app that allows users to join groups or direct message other users. Photos/videos can be sent, and all can be deleted by deleting the application.
WhatsApp – A messaging service that lets users exchange unlimited text, audio, phone calls, photo and video messages. Messages are encrypted.
Telegram – A cloud-based instant messaging and voice over IP service. Telegram client apps are available for Android, iOS, Windows Phone, Windows NT, macOS and Linux.[16] Users can send messages and exchange photos, videos, stickers, audio and files of any type. Messages can also be sent with client-to-client encryption in so-called secret chats. Unlike Telegram’s cloud-based messages, messages sent within a secret chat can be accessed only on the device upon which the secret chat was initiated and the device upon which the secret chat was accepted; they cannot be accessed on other devices. Messages sent within secret chats can, in principle, be deleted at any time and can optionally self-destruct.
Whisper – A proprietary iOS and Android mobile app available without charge. It is a form of anonymous social media, allowing users to post and share photo and video messages anonymously. You can respond to a message publicly or privately, choosing a public anonymous post or a private pseudonymous chat.
Mocospace – A site similar to other social networking sites. Features include mobile games, chat, instant messaging, eCards, and photos.
Houseparty – A social networking service that enables group video chatting through mobile and desktop apps. Users receive a notification when friends are online and available to group video chat.
Deeper Attention to Abusive User Content Moderation
John Pavlopoulos, Straintek, Athens, Greece; Prodromos Malakasiotis, Straintek, Athens, Greece; Ion Androutsopoulos, Athens University of Economics and Business, Greece

Abstract
Experimenting with a new dataset of 1.6M user comments from a news portal and an existing dataset of 115K Wikipedia talk page comments, we show that an RNN operating on word embeddings outperforms the previous state of the art in moderation, which used logistic regression or an MLP classifier with character or word n-grams. We also compare against a CNN operating on word embeddings, and a word-list baseline. A novel, deep, classification-specific attention mechanism improves the performance of the RNN further, and can also highlight suspicious words for free, without including highlighted words in the training data.

[News portals] suffer from abusive user comments, which damage their reputations and make them liable to fines, e.g., when hosting comments encouraging illegal actions. They often employ moderators, who are frequently overwhelmed, however, by the volume and abusiveness of comments. Readers are disappointed when non-abusive comments do not appear quickly online because of moderation delays. Smaller news portals may be unable to employ moderators, and some are forced to shut down their comments sections entirely. We examine how deep learning (Goodfellow et al., 2016; Goldberg, 2016, 2017) can be employed to moderate user comments. We experiment with a new dataset of approx. 1.6M manually moderated (accepted or rejected) user comments from a Greek sports news portal (called Gazzetta).
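The core idea the abstract describes — an RNN over word embeddings whose attention weights both improve classification and highlight suspicious words — can be sketched briefly. A minimal PyTorch sketch follows; the layer sizes, the GRU choice, and the exact attention form are assumptions for illustration, not the authors’ architecture:

```python
import torch
import torch.nn as nn

class AttentionRNNClassifier(nn.Module):
    """Comment classifier: bidirectional GRU over word embeddings, with an
    attention MLP that scores each time step's hidden state."""

    def __init__(self, vocab_size: int, embed_dim: int = 100, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Sequential(nn.Linear(2 * hidden, 64), nn.Tanh(), nn.Linear(64, 1))
        self.out = nn.Linear(2 * hidden, 1)  # accept/reject logit

    def forward(self, token_ids: torch.Tensor):
        h, _ = self.rnn(self.embed(token_ids))        # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # one weight per token
        context = (weights * h).sum(dim=1)            # attention-weighted summary
        return self.out(context).squeeze(-1), weights.squeeze(-1)

# The attention weights double as token-level "suspicious word" highlights,
# even though training only uses comment-level accept/reject labels.
model = AttentionRNNClassifier(vocab_size=50_000)
logits, attn = model(torch.randint(0, 50_000, (2, 30)))
```

This is the sense in which the highlighting comes “for free”: no word-level labels are needed, because the attention distribution learned for the classification task already concentrates on the tokens that drive a rejection.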
Recommendations for the Facebook Content Review Board
Stanford Policy Practicum: Creating a Social Media Oversight Board, 2018–19

Practicum research team: Shaimaa Bakr, Ph.D. Electrical Engineering ’20; Madeline Magnuson, J.D. ’20; Fernando Berdion-Del Valle, J.D. ’20; Shawn Musgrave, J.D. ’21; Isabella Garcia-Camargo, B.S. ’20; Ashwin Ramaswami, B.S. ’21; Julia Greenberg, J.D. ’19; Nora Tan, B.S. ’19; Tara Iyer, B.S. ’19; Marlena Wisniak, LL.M. ’19; Alejandra Lynberg, J.D. ’19; Monica Zwolinski, J.D. ’19
Instructors: Paul Brest, Faculty Director, Law and Policy Lab; Daniel Ho, William Benjamin Scott and Luna M. Scott Professor of Law; Nathaniel Persily, James B. McClatchy Professor of Law; Rob Reich, Faculty Director, Center for Ethics in Society
Teaching assistant: Liza Starr, J.D./M.B.A. ’21
Policy client: Project on Democracy and the Internet, Stanford Center on Philanthropy and Civil Society
Spring 2019, 559 Nathan Abbot Way, Stanford, CA
https://law.stanford.edu/education/only-at-sls/law-policy-lab//

Contents
Executive Summary
Introduction
I. The Need for Governing Principles
II. Structure and Membership
  A. Selection Process for Board Members
  B. Membership Criteria
The Virtues of Moderation: Online Communities as Semicommons
James Grimmelmann*
IP Scholars workshop draft: not intended for broad distribution or citation

I. Introduction
II. Background
  A. Public and Private Goods
  B. The Tragic Story
  C. The Comedic Story
  D. Layering
III. Online Semicommons
  A. A Formal Model
  B. Commons Virtues
Whose Social Network Account? A Trade Secret Approach to Allocating Rights
Zoe Argento, Roger Williams University School of Law
Michigan Telecommunications and Technology Law Review, Volume 19, Issue 2 (2013)

Recommended citation: Zoe Argento, Whose Social Network Account? A Trade Secret Approach to Allocating Rights, 19 Mich. Telecomm. & Tech. L. Rev. 201 (2013), available at http://repository.law.umich.edu/mttlr/vol19/iss2/1

Who has the superior right to a social network account? This is the question in a growing number of disputes between employers and workers over social network accounts. The problem has no clear legal precedent. Although the disputes implicate rights under trademark, copyright, and privacy law, these legal paradigms fail to address the core issue. At base, disputes over social network accounts are disputes over the right to access the people, sometimes numbering in the tens of thousands, who follow an account.
Online Comment Moderation: Emerging Best Practices
A guide to promoting robust and civil online conversation
By Emma Goodman, www.wan-ifra.org

Researchers: Federica Cherubini, Alexandra Waldhorn
Editors: Alexandra Waldhorn, Amy Hadfield
Supported by: Media Program of the Open Society Foundations, www.opensocietyfoundations.org
Published by: WAN-IFRA, Washingtonplatz 1, 64287 Darmstadt, Germany. Tel. +49 6151 7336 / Fax +49 6151 733800, www.wan-ifra.org
CEO: Vincent Peyrègne
Director of Publications: Dean Roper
World Editors Forum Director: Cherilyn Ireton
© 2013 World Association of Newspapers and News Publishers

Table of contents
Introduction
The status of comments
The moderation process
Issues and challenges
Best practices
What’s next?
Conclusion

Introduction
In many parts of the globe, online comments have become an essential ingredient of a thriving news publication: readers feel that they have […] consequence-free behaviour and a chance to defy social norms, or maybe it’s a factor of the structure of online conversations […]