Supporting Volunteer Moderation Practices in Online Communities

Joseph Seering
CMU-HCII-20-107
September 2020

Human-Computer Interaction Institute
School of Computer Science
Carnegie Mellon University
Pittsburgh, Pennsylvania 15213
[email protected]

Thesis Committee:
Geoff Kaufman (Chair), Human-Computer Interaction Institute, Carnegie Mellon University
Jason Hong, Human-Computer Interaction Institute, Carnegie Mellon University
Bob Kraut, Human-Computer Interaction Institute, Carnegie Mellon University
Michael Bernstein, Department of Computer Science, Stanford University

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy

Copyright © 2020 Joseph Seering. All rights reserved.

KEYWORDS: Content moderation, platforms, social computing, computer-supported cooperative work, social media, volunteer moderators, cooperative responsibility, Twitch, Reddit, Facebook, AI-mediated communication, social identity theory, rebukes, interpersonal moderation, governance, communities, online communities, self-governance, self-moderation, literature review, platforms and policies, commercial content moderation, human-centered design, metaphors, computational social science, thematic analysis, interviews, digital labor, hate speech, harassment, social networks.

Abstract

In this dissertation, I explore multiple levels of the content moderation ecosystem with a focus on platforms that rely extensively on volunteer user labor. These platforms, such as Reddit, Twitch, and Facebook Groups, expect users to moderate their own communities but reserve the right to intervene when communities or content therein violates sitewide standards for behavior. This thesis contains three parts.