Identifying Opportunities to Improve Content Moderation

IDENTIFYING OPPORTUNITIES TO IMPROVE CONTENT MODERATION

A Dissertation
Presented to
The Academic Faculty

By

Shagun Jhaver

In Partial Fulfillment
of the Requirements for the Degree
Doctor of Philosophy in the
School of Interactive Computing

Georgia Institute of Technology

May 2020

Copyright © Shagun Jhaver 2020

Approved by:

Dr. Amy Bruckman, Advisor
School of Interactive Computing
Georgia Institute of Technology

Dr. Eric Gilbert, Advisor
School of Interactive Computing
Georgia Institute of Technology

Dr. W. Keith Edwards
School of Interactive Computing
Georgia Institute of Technology

Dr. Neha Kumar
Sam Nunn School of International Affairs and the School of Interactive Computing
Georgia Institute of Technology

Dr. Scott Counts
Social Technologies Group
Microsoft Research

Date Approved: March 4, 2020

In loving memory of my mom.

ACKNOWLEDGEMENTS

This thesis would not have been possible without the generosity and support of many, many people.

First, I would like to thank my advisors, Amy Bruckman and Eric Gilbert. They provided me intellectual and emotional support at every step of the way. Our weekly meetings always left me with a clearer focus and joy for my research. I will always cherish these meetings. Most importantly, Amy and Eric showed me how to be a good advisor through their example. Since early on in my PhD, I made a conscious effort to absorb the leadership lessons and values that this collaboration has offered. I look forward to practicing these lessons in my future roles as a teacher, advisor, mentor, and leader.

I am also deeply indebted to an extraordinary group of mentors: Neha Kumar, Michaelanne Dye, Stevie Chancellor, Scott Counts, and Munmun De Choudhury. Their thoughtful advice has been helpful in shaping my research. I also consider myself fortunate to have made friends who I will cherish forever: Koustuv Saha, Benjamin Sugar, Eshwar Chandrasekharan, and Bahador Saket. Additionally, I could not have asked for a more supportive group of lab-mates for this journey: Sucheta Ghoshal, Julia Deeb-Swihart, Darren Scott Appling, and Jane Im. To each of you: Thank you!

I have also been fortunate to have an amazing network of friends outside my department. To my housemates: Kashyap Mohan, Sloane Tribble, Akshay Sakariya, Swapnil Srivastava, and Amber Chowdhary, your constant support has been crucial in maintaining my spirits during the tough times. I have been much enriched by our informal brainstorming sessions. I have also savored every minute of our board games together!

Above all, I am incredibly grateful to my extended family in India and the US. To my grandmother, sister, uncles, aunts, cousins, and friends, you have provided me the emotional and financial support I so desperately needed to continue my journey along this chosen path. I am deeply thankful for the support that I continue to receive from all of you.

Finally, I would like to thank the hundreds of individuals who participated in my studies. Without their voluntary support, this work would not have been possible.

TABLE OF CONTENTS

Acknowledgments
List of Tables
List of Figures

Chapter 1: Introduction
  1.1 Research Framing
    1.1.1 Online Harassment
    1.1.2 Third-party Moderation Systems
    1.1.3 Automated Tools for Moderation
    1.1.4 Fairness in Content Moderation
    1.1.5 Transparency in Content Moderation
  1.2 Research Questions
  1.3 Research Contributions

Chapter 2: Related Work
  2.1 Content Moderation
    2.1.1 Challenges of Moderation Work
    2.1.2 Composition of Moderation Systems
    2.1.3 Role of Human Moderators
    2.1.4 Role of Automated Mechanisms
    2.1.5 Fairness and Transparency in Moderation Systems
    2.1.6 Moderation Research on Specific Sites: Reddit and Twitter
  2.2 Online Harassment
  2.3 Freedom of Speech on Internet Platforms

Chapter 3: Conceptualizing Online Harassment: The Case of Kotaku in Action
  3.1 Introduction
  3.2 Gamergate and Kotaku in Action
    3.2.1 GamerGate
    3.2.2 Kotaku in Action
  3.3 Methods
    3.3.1 Participants
    3.3.2 Analysis
  3.4 Findings
    3.4.1 KiA Community
    3.4.2 KiA and Online Harassment
  3.5 Discussion
    3.5.1 Implications for Theory: Free Speech vs. Harassment
    3.5.2 Implications for Design
    3.5.3 Limitations
  3.6 Conclusion

Chapter 4: Understanding the Use of Third-Party Moderation Tools: The Case of Twitter Blocklists
  4.1 Introduction
    4.1.1 Online Harassment
    4.1.2 Blocking on Twitter
    4.1.3 Research Questions
    4.1.4 Contributions
  4.2 Related Work
    4.2.1 Online Harassment on Twitter
    4.2.2 Twitter Blocklists
  4.3 Methods
    4.3.1 Participant Sampling
    4.3.2 Interviews
    4.3.3 Participants
    4.3.4 Analysis
    4.3.5 Researcher Stance
  4.4 Findings: Online Harassment
    4.4.1 Different perceptions of online harassment
    4.4.2 Tactics used by harassers
    4.4.3 Who is vulnerable to harassment?
    4.4.4 Support of harassed users
  4.5 Findings: Twitter Blocklists
    4.5.1 Algorithmically curated versus socially curated blocklists
    4.5.2 Why do users subscribe to/avoid subscribing to anti-harassment blocklists?
    4.5.3 How did user experience change after using anti-abuse blocklists?
    4.5.4 Challenges of social curation
    4.5.5 Perception that blocklists block too much/block unfairly
    4.5.6 Feelings about being put on blocklists
    4.5.7 Appeals procedure
  4.6 Discussion
    4.6.1 Focusing on vulnerable groups
    4.6.2 Designing support systems for harassed users
    4.6.3 Improving blocking mechanisms
    4.6.4 Building "understanding mechanisms"
  4.7 Conclusion

Chapter 5: Human-Machine Collaboration for Content Moderation: The Case of Reddit Automoderator
  5.1 Introduction
  5.2 Methods
    5.2.1 Selection of Subreddits
    5.2.2 Interviews
    5.2.3 Participants
    5.2.4 Analysis
  5.3 Findings
    5.3.1 Reddit Moderation Tools
    5.3.2 Introduction of Automod on Reddit
    5.3.3 Use of Automod to Enforce Community Guidelines
    5.3.4 Mechanics of Automod Configuration
    5.3.5 Automod Creates New Tasks for Moderators
  5.4 Discussion
    5.4.1 Facilitating Development and Sharing of Automated Regulation Tools
    5.4.2 The Need for Performance Data
    5.4.3 Human versus Automated Moderation Systems
    5.4.4 Limitations and Future Work
  5.5 Conclusion
    5.5.1 For creators of new and existing platforms
    5.5.2 For designers and researchers interested in automated content regulation
    5.5.3 For scholars of platform governance
    5.5.4 For content moderators

Chapter 6: Understanding User Reactions to Content Removals: A Survey of Moderated Reddit Users
  6.1 Introduction
  6.2 Study Context: Content Removals on Reddit
  6.3 Methods
    6.3.1 Survey Instrument
    6.3.2 Data Collection
    6.3.3 Data Preparation
    6.3.4 Participants
    6.3.5 Variables
