
Content Removal by Commercial Social Media Platforms: Implications of Compliance with Local Laws

By Afef Abrougui (student number: 11102950)
Under the supervision of Dr. Stefania Milan
Master’s in Media Studies: New Media and Digital Culture
University of Amsterdam: Graduate School of Humanities
24 June 2016

Table of Contents

Acknowledgements
Abstract
Keywords
Chapter 1: Introduction
Chapter 2: Literature Review
  Speech Regulation in Private Hands
  Governments as Powerful as Ever
  Implications of Compliance with Local Laws
  Not All Requests are Treated Equally
  Involving Users in Questions of Governance
Chapter 3: Research Design
  Turkey as a Case Study
  Data Collection Methods
  Data Analysis Methods
Chapter 4: Findings and Discussion
  Implications of the Country-Withheld Content Policies
  Facebook and Twitter: Platforms of Concern
  A Demand for More Transparency
  How Should Platforms Handle Requests
  Interference of Business Interests
Chapter 5: Conclusion
Appendices
  Appendix A: Interview Guide
  Appendix B: Two Interview Samples
  Appendix C: List of Interviews
Bibliography

Acknowledgements

I would like to express my gratitude to my supervisor Dr. Stefania Milan for her guidance and support throughout the entire writing process. I would also like to thank all the respondents who contributed their valuable insights and knowledge to this dissertation.

Abstract

Taking Turkey as a case study, this thesis examines the implications of country-withheld content (CWC) policies. These policies enable governments to request that social media platforms remove content for violating their local laws. CWC policies have serious implications for freedom of expression online. As the case study shows, speech-restrictive governments like that of Turkey are exploiting social media platforms’ policies of compliance with local laws to crack down on legitimate speech and silence critics. The dissertation draws upon previously published research and primary data collected through interviews conducted with Turkish and non-Turkish net freedom advocates. This thesis contributes to existing research by exploring the attitudes of the net-freedom activist community towards content removal in compliance with local laws. Respondents are concerned about governments’ abuse of these policies and platforms’ increasing compliance. Further, they want to see platforms appealing these requests more often and revealing to the public their criteria for compliance.

Keywords

Country-withheld content (CWC), governance, transparency report, content removal (takedown) requests, country-specific restrictions, compliance.

Chapter 1: Introduction

New media technologies enthusiasts or “cyber-optimists” (Haunss 33) maintain that social media platforms empower citizens to do their own reporting, publish stories that receive little to no coverage from corporate or government-owned media, and enable the emergence and coordination of social movements and protests. This stance is often backed by the coverage of real-life events, including natural disasters, riots, elections, and protests, by citizens using their mobile phones, an internet connection, and “free” commercial platforms on which they publish and disseminate their content. According to social media theorist Clay Shirky, the internet empowers anyone who has a message to easily spread it to a global audience.
In his book Public Parts, journalism professor Jeff Jarvis describes the internet as “everyone’s printing press”, adding that “all of us no longer watch the same, shared news with the same, one-size-fits-all viewpoint” (24). As Yochai Benkler points out in his book The Wealth of Networks, the internet enables the creation of a networked public sphere, providing “anyone with an outlet to speak, to inquire, to investigate, without need to access the resources of a major media organization” (12). He goes on: “We are seeing the emergence of new, decentralized approaches to fulfilling the watchdog function and to engaging in political debate and organization” (Benkler 12). Content posted by users on social media platforms, or user-generated content (UGC) in general, has made available to users “new hope and new possibilities for public reinvolvement in affairs of common interest” (Langlois 92).

When anti-government protests broke out in late 2010 in Tunisia, the state-owned media and the privately owned radio and TV stations that existed at the time either turned a blind eye or focused on airing the government’s narrative that depicted protesters as “thugs” (Alakhbar), ignored their demands, and downplayed the number of victims (Ryan). On Facebook and Twitter, however, videos and photos of police brutality against defenceless protesters demanding “jobs, freedom and dignity” were widely circulated, in a challenge to the mainstream media’s coverage of the events (Delany).

In another instance showcasing how content generated by users empowers involvement in affairs of public interest, the dissemination of footage of police brutality against unarmed black men and women in the US helped spark a debate about discrimination in the country’s criminal justice system (Laughland and Swaine). On 17 July 2014, 43-year-old Eric Garner was pronounced dead in hospital following his arrest by the New York Police Department (NYPD). Witness Ramsey Orta captured Garner’s final moments in a video he filmed with his mobile phone, before handing it over to the New York Daily News. In the video, Garner can be heard repeatedly uttering the phrase “I can’t breathe” as an NYPD officer placed him in a chokehold (Sanburn). Since then, several more videos showing excessive, and at times deadly, use of force by police officers against unarmed black Americans have been recorded and published by bystanders and witnesses. Speaking to CNN, one editor of a black newspaper said that they have been covering police abuse for years, but today “there are more ways to expose it”, thanks to tools that allow the instant dissemination of footage (McLaughlin).

In this regard, social media platforms and content-hosting services like to maintain that their mission is all about empowering users to communicate and express themselves freely (Mackinnon). YouTube provides its users with “a forum to connect, inform, and inspire” (YouTube About Page), while Twitter allows them to “watch events unfold, in real time, from every angle” (Twitter homepage). Facebook, on the other hand, “connects [its users] with friends and the world around them” (Facebook homepage). Over the past years, however, these platforms have increasingly been “facing questions about their responsibilities to their users, to key constituencies who depend on the public discourse they host, and to broader notions of the public interest”, writes communication studies professor Tarleton Gillespie (“The Politics of Platforms” 348).
Governance of speech is one area of regulation about which platforms often face questions. Commercial social media platforms and other online content-hosting services are continuously under the spotlight over their governance of the troves of content their users generate on a daily basis. Providing services to a global community of consumers living under different cultural norms and jurisdictions, these platforms face the challenge of drawing the line between what is acceptable speech and what is not (Gillespie, “Facebook’s improved…”). When Facebook and other corporate actors make decisions about which types of content are allowed on their platforms and which are not (whether nudity, hate speech or graphic content), they are authorising themselves to act as “custodians of content” (Gillespie, “The dirty job of…”).

Mostly owned by American companies, these platforms are also under increased pressure to comply with requests made by foreign governments to remove content under repressive laws, which goes against their promise of user empowerment. For example, during the second half of 2015, Twitter received 2,211 content takedown requests from Turkish authorities, citing violations of local laws, and complied with 23 percent of those requests (Twitter Transparency Report). Turkish activists say that among the content affected were tweets and accounts addressing corruption and criticising public figures (Daraghi and Karakas).

Serving as a “quasi-public sphere” (York, “Policing content…” 3) for a large number of international users, including those based in high-growth countries under undemocratic systems of governance, platforms will continue to face a never-ending dilemma: which comes first, their commitment to freedom of expression or the pursuit of revenue growth? This dilemma often leads to contradictions in the ways platforms handle government takedown requests. In one situation a platform may choose to push back against a government request on the grounds that it violates the right to free speech, while in another it may decide to comply even though doing so clearly violates that same right. Policies of compliance with local laws, and the lack of transparency about how such policies are enforced, leave users at the mercy of private social media platforms and their interests. This has serious implications for freedom of expression online. As the case study will show, Turkish users seeking to express themselves away from government censorship often find themselves silenced by the platforms that were supposed to enable their right to free expression, at the request of their own government.