Online Harassment and Content Moderation: The Case of Blocklists
SHAGUN JHAVER, SUCHETA GHOSHAL, AMY BRUCKMAN, and ERIC GILBERT, Georgia Institute of Technology

Online harassment is a complex and growing problem. On Twitter, one mechanism people use to avoid harassment is the blocklist, a list of accounts that are preemptively blocked from interacting with a subscriber. In this article, we present a rich description of Twitter blocklists – why they are needed, how they work, and their strengths and weaknesses in practice. Next, we use blocklists to interrogate online harassment – the forms it takes, as well as tactics used by harassers. Specifically, we interviewed both people who use blocklists to protect themselves, and people who are blocked by blocklists. We find that users are not adequately protected from harassment, and at the same time, many people feel that they are blocked unnecessarily and unfairly. Moreover, we find that not all users agree on what constitutes harassment. Based on our findings, we propose design interventions for social network sites with the aim of protecting people from harassment, while preserving freedom of speech.

CCS Concepts: • Human-centered computing → Empirical studies in collaborative and social computing; Ethnographic studies;

Additional Key Words and Phrases: Online harassment, moderation, blocking mechanisms, Gamergate, blocklists

ACM Reference format:
Shagun Jhaver, Sucheta Ghoshal, Amy Bruckman, and Eric Gilbert. 2018. Online Harassment and Content Moderation: The Case of Blocklists. ACM Trans. Comput.-Hum. Interact. 25, 2, Article 12 (March 2018), 33 pages.
https://doi.org/10.1145/3185593

1 INTRODUCTION

1.1 Online Harassment

In mid-2016, 25-year-old Erin Schrode was in the middle of her congressional campaign.