HATE SPEECH IN SOCIAL MEDIA: AN EXPLORATION OF THE PROBLEM AND ITS PROPOSED SOLUTIONS

by

CAITLIN ELIZABETH RING
B.A., Clemson University, 2001
M.S., University of Denver, 2004

A dissertation submitted to the Faculty of the Graduate School of the University of Colorado in partial fulfillment of the requirement for the degree of Doctor of Philosophy, Journalism and Mass Communication, 2013

This dissertation entitled: Hate Speech in Social Media: An Exploration of the Problem and Its Proposed Solutions, written by Caitlin Elizabeth Ring, has been approved for the Department of Journalism and Mass Communication

Dr. Robert Trager

Dr. Andrew Calabrese

Date

The final copy of this thesis has been examined by the signatories, and we find that both the content and the form meet acceptable presentation standards of scholarly work in the above-mentioned discipline.

ABSTRACT

Ring, Caitlin Elizabeth (Ph.D., Communication [Dept. of Journalism and Mass Communication])

Hate Speech in Social Media: An Exploration of the Problem and Its Proposed Solutions

Dissertation directed by Professor Emeritus Robert Trager, Ph.D., J.D.

Social media are rife with hate speech. A quick glance through the comments section of a racially charged YouTube video demonstrates how pervasive the problem is. Although most major social media companies such as Google, Facebook and Twitter have their own policies regarding whether and what kinds of hate speech are permitted on their sites, the policies are often inconsistently applied and can be difficult for users to understand. Many of the decisions made by the content removal teams at these organizations are not nearly as speech protective as the First Amendment and U.S. Supreme Court precedent on the subject would mandate. Thus, the current situation gives social media companies unprecedented power to control what videos, text, images, etc. users may or may not post or access on those social media sites.
In an effort to identify solutions for curtailing hate speech in social media, this dissertation will explore the scope and nature of the problem of hate speech in social media today, using YouTube as an exemplar. A review of arguments for and against regulating hate speech online is then presented, along with an overview of current U.S. hate speech and Internet regulations and relevant jurisprudence. The approaches proposed by other legal and communication scholars about whether and how to limit hate speech online are examined and evaluated. Finally, a solution that seeks to minimize hate speech on social media Web sites, while still respecting the protections established by the First Amendment, is proposed. Specifically, a recommendation is made to encourage self-regulation on the part of social media companies, which involves a move from a “.com” generic top-level domain to one called “.social.” In order to be part of the consortium of companies included on the “.social” domain, which will hopefully include YouTube, Facebook, Twitter, Instagram and others, an organization must abide by the industry-developed, uniform rules regarding what kinds of hate speech content are and are not permitted on these sites. A working group composed of social media decision-makers will develop the policy, and staff members of the Federal Communications Commission will facilitate this process. Hopefully, the resulting approach will better reflect precedent on the issue, which hesitates to place any restrictions on expression, regardless of how offensive it may be.

ACKNOWLEDGEMENT

I would like to thank my advisor and dissertation chair, Dr. Robert Trager, for teaching me to be a passionate educator, a rigorous scholar and an enthusiastic servant to the demands of life in academia. His attention to detail, respect for his students and love of the law are traits I can only hope to take with me as I move on to the next stage of my career.
Trager has made it possible for me to complete this dissertation and secure a tenure-track job as an assistant professor, even in the face of the substantial personal obstacles we’ve both faced during this time. For that, and for everything he has done for me and taught me, I am eternally grateful. At the end of the day, Trager is the kind of professor we should all aspire to be. I know I will.

In addition, I owe an enormous debt of gratitude to the members of my committee: Dr. Andrew Calabrese, Dr. Shu-Ling Berggreen, Rothgerber Professor of Constitutional Law Robert Nagel, and Dr. Peter Simonson. Their teaching, guidance, feedback and personal support have helped this project come to fruition. Along with my committee, I would also like to personally thank Martha LaForge and the rest of the faculty and staff at Journalism and Mass Communication for all of their support throughout my four years in the JMC graduate program.

I would also like to thank my family and friends for all of their help throughout the process of researching and writing this dissertation. In particular, I want to acknowledge my mother, Jackie Ring, who served as my copy editor, sounding board and number-one cheerleader. Thank you also to my Dad, Mike Ring; my sister, Melissa Perricone; my boyfriend, Wayne Carlson; and my friends Ben Razes, Shawna Golden and Kate Stokley for their unwavering support. Finally, I would like to thank my cohort members and fellow doctoral students for joining me on this journey.

CONTENTS

I. SCOPE AND NATURE OF HATE SPEECH ON SOCIAL MEDIA WEB SITES……1
      Hate Speech on YouTube……7
      Defining Hate Speech……14
      Defining Social Media
II. ARGUMENTS FOR AND AGAINST HATE SPEECH PROTECTION……22
      Why Protect Hate Speech……22
      Why Ban Hate Speech……29
III. CURRENT REGULATION OF ONLINE HATE SPEECH IN THE UNITED STATES AND ABROAD……38
      Supreme Court Rulings on Hate Speech Statutes……38
      Supreme Court Rulings Regarding Related Cases……52
      International Hate Speech Regulations……58
IV. REGULATING THE INTERNET……68
      Regulating Mass Media……69
      Current Internet Regulations……78
V. POTENTIAL SOLUTIONS……99
      Legislative Action……99
      International Approaches to Regulation……109
      Self-Regulation by Social Media Companies……113
      Filtering Software……115
      Do Not Regulate Hate Speech on Social Media Sites……119
VI. RECOMMENDED APPROACH……125
      Addressing Research Questions……125
      Solution Explanation and Implementation……132
      Arguments in Favor of Recommended Approach……137
      Addressing Counter-Arguments……140
VII. CONCLUSION……144
      Project Limitations……145
      Recommendations for Future Research……145
BIBLIOGRAPHY……149

Chapter I

Scope and Nature of Hate Speech on Social Media Web Sites

Introduction

The Internet is changing the face of communication and culture. In the United States, the Internet has drastically altered the way we get our news, talk to our friends and generally live our lives. Its de-centralized nature makes it a perfect place for amateurs and professionals alike to create and share ideas, information, images, videos, art, music and more. In spite of, or perhaps because of, its democratic nature, the Internet is also populated with Web sites dedicated to inciting hatred against particular ethnic, religious, racial or sexually-oriented groups such as women, Jews, African-Americans, Muslims and members of the lesbian, gay, bisexual and transgendered (LGBT) community.
According to the Southern Poverty Law Center (SPLC), a civil rights organization that tracks and litigates hate group activity, there has been a 69 percent rise in the number of active hate groups from 2000 to 2012.1 This increase is due in large part to the fear and anger surrounding issues of non-white immigration, the sluggish economy and the re-election of an African-American president.2 Today, there are more than 1,000 active hate groups3 in the United States and many of these, including the Ku Klux Klan, the Creativity Movement and the Family Research Institute, have a substantial presence online. According to the Simon Wiesenthal Center’s 2012 Digital Terror and Hate Report, there are approximately 15,000 problematic Web sites, social networking pages, forums and newer online technologies such as games and applications, dedicated to inciting hatred based on ethnicity, race or sexual preference. According to Websense, which is tracking the 15,000 hate and militancy sites, the total number of these sites has tripled since 2009.4 However, the U.S. Department of Homeland Security currently has only one person tracking these suspicious, non-Islamic groups.5 Several of these sites encourage the creation and dissemination of hateful rhetoric, videos and music, while others go so far as to call for physical violence against an out-group. For example, the Creativity Movement’s Web site asks visitors to “arm yourselves and join the fight against the mud races,” a reference to the racial holy war (often referred to as “RAHOWA”) called

________________
1 What We Do, Southern Poverty Law Center.com, http://www.splcenter.org/what-we-do/hate-and-extremism (last visited Nov. 29, 2012).
2 Hate Map, Southern Poverty Law Center.com, http://www.splcenter.org/get-informed/hate-map (last visited Nov. 29, 2012).
3 Id.