Content Removal by Commercial Platforms: Implications of Compliance with Local Laws

By Afef Abrougui (student number: 11102950)
Under the supervision of Dr. Stefania Milan
Master’s in Media Studies: New Media and Digital Culture
University of Amsterdam: Graduate School of Humanities
24 June 2016

Table of Contents

Acknowledgements
Abstract
Keywords
Chapter 1: Introduction
Chapter 2: Literature Review
    Speech Regulation in Private Hands
    Governments as Powerful as Ever
    Implications of Compliance with Local Laws
    Not All Requests are Treated Equally
    Involving Users in Questions of Governance
Chapter 3: Research Design
    Turkey as a Case Study
    Data Collection Methods
    Data Analysis Methods
Chapter 4: Findings and Discussion
    Implications of the Country-Withheld Content Policies
    and : Platforms of Concern
    A Demand for More Transparency
    How Should Platforms Handle Requests
    Interference of Business Interests
Chapter 5: Conclusion
Appendices
    Appendix A: Interview Guide
    Appendix B: Two Interview Samples
    Appendix C: List of Interviews
Bibliography

Acknowledgements

I would like to express my gratitude to my supervisor Dr. Stefania Milan for her guidance and support throughout the entire writing process. I would also like to thank all the respondents who contributed their valuable insights and knowledge to this dissertation.


Abstract

Taking Turkey as a case study, this thesis examines the implications of country-withheld content (CWC) policies. These policies enable governments to request that social media platforms remove content for violating their local laws. CWC policies have serious implications for freedom of expression online. As the case study shows, speech-restrictive governments like that of Turkey are exploiting social media platforms’ policies of compliance with local laws to crack down on legitimate speech and silence critics. The dissertation draws upon previously published research and primary data collected through interviews conducted with Turkish and non-Turkish net freedom advocates. This thesis contributes to existing research by exploring the attitudes of the net-freedom activist community towards content removal in compliance with local laws. Respondents are concerned about governments’ abuse of these policies and platforms’ increasing compliance. Further, they want to see platforms appealing these requests more often, and revealing to the public their criteria for compliance.

Keywords

Country-withheld content (CWC), governance, transparency report, content removal (takedown) requests, country-specific restrictions, compliance.

Chapter 1: Introduction

New media technology enthusiasts or “cyber-optimists” (Haunss 33) maintain that social media platforms empower citizens to do their own reporting and publish stories that receive little to no coverage from corporate- or government-owned media, and enable the emergence and coordination of social movements and protests. This stance is often backed by citizens’ coverage of real-life events, including natural disasters, riots, elections, and protests, using their mobile phones, an internet connection, and “free” commercial platforms on which they publish and disseminate their content. According to social media theorist Clay Shirky, the internet empowers anyone who has a message to easily spread it to a global audience. In his book Public Parts, professor Jeff Jarvis describes the internet as “everyone’s printing press”, adding that “all of us no longer watch the same, shared news with the same, one-size-fits-all viewpoint” (24). As Yochai Benkler points out in his book The Wealth of Networks, the internet enables the creation of a networked public sphere, providing “anyone with an outlet to speak, to inquire, to investigate, without need to access the resources of a major media organization” (12). He goes on: “We are seeing the emergence of new, decentralized approaches to fulfilling the watchdog function and to engaging in political debate and organization” (Benkler 12). For instance, content posted by users on social media platforms, or user-generated content (UGC) in general, has made available to users “new hope and new possibilities for public reinvolvement in affairs of common interest” (Langlois 92).

When anti-government protests broke out in late 2010 in Tunisia, the state-owned media and the privately owned radio and TV stations that existed at the time either turned a blind eye or focused on airing the government’s narrative, which depicted protesters as “thugs” (Alakhbar), ignored their demands, and downplayed the number of victims (Ryan). On Facebook and Twitter, however, videos and photos of police brutality against defenceless protesters demanding “jobs, freedom and dignity” were widely circulated, in a challenge to the mainstream media’s coverage of the events (Delany). In another instance showcasing how content generated by users empowers involvement in affairs of public interest, the dissemination of footage of police brutality against unarmed black men and women in the US helped spark a debate about discrimination in the country’s criminal justice system (Laughland and Swaine). On 17 July 2014, 43-year-old Eric Garner was pronounced dead in hospital following his arrest by the New York Police Department (NYPD). Witness Ramsey Orta captured Garner’s final moments in a video he filmed with his mobile phone, before handing it over to the New York Daily News. In the video, Garner can be heard repeatedly uttering the phrase “I can’t breathe” as an NYPD officer placed him in a chokehold (Sanburn). Ever since, several more videos showing excessive, and at times deadly, use of force by police officers against unarmed black Americans have been recorded and published by bystanders and witnesses. Speaking to CNN, one editor of a black newspaper said that they have been covering police abuse for years, but today “there are more ways to expose it”, thanks to tools that allow instant dissemination of footage (McLaughlin).

In this regard, social media platforms and content-hosting services like to maintain that their mission is all about empowering users to communicate and express themselves freely (MacKinnon). Youtube provides its users with “a forum to connect, inform, and inspire” (Youtube About Page), while Twitter allows them to “watch events unfold, in real time, from every angle” (Twitter homepage). Facebook, on the other hand, “connects [its users] with friends and the world around them” (Facebook homepage). Over the past years, however, these platforms have been increasingly “facing questions about their responsibilities to their users, to key constituencies who depend on the public discourse they host, and to broader notions of the public interest”, writes communication studies professor Tarleton Gillespie (“The Politics of Platforms” 348). Governance of speech is one area of regulation platforms often face questions about.

Commercial social media platforms and other online content-hosting services are continuously under the spotlight over their governance of the troves of content their users generate on a daily basis. Providing services to a global community of consumers living under different cultural norms and jurisdictions, these platforms face the challenge of drawing the line between what is acceptable speech and what is not (Gillespie, “Facebook’s improved…”). When Facebook and other corporate actors make decisions about which types of content are allowed on their platforms and which are not (whether nudity or graphic content), they are authorising themselves to act as “custodians of content” (Gillespie, “The dirty job of…”). Mostly owned by American companies, these platforms are also under increasing pressure to comply with requests made by foreign governments to remove content under repressive laws, which goes against their promise of user empowerment. For example, during the second half of 2015, Twitter received 2,211 requests from Turkish authorities to take down content for violating local laws, and complied with 23 percent of those requests (Twitter Transparency Report). Turkish activists say that among the content affected were tweets and accounts addressing corruption and criticising public figures (Daraghi and Karakas). Serving as a “quasi-public sphere” (York, “Policing content…” 3) for a large number of international users, including those based in high-growth countries under undemocratic systems of governance, platforms will continue to face a never-ending dilemma: which comes first, their commitment to freedom of expression or the pursuit of revenue growth? This dilemma often leads to contradictions in the ways platforms handle government takedown requests.
In one situation a platform may choose to push back against a government request on the grounds that it violates the right to free speech, while in another it may decide to comply even though its compliance clearly violates that same right. Policies of compliance with local laws, and the lack of transparency about how such policies are enforced, leave users at the mercy of private social media platforms and their interests. This has serious implications for freedom of expression online. As the case study will show, Turkish users seeking to express themselves away from government censorship often find themselves silenced by the very platforms that were supposed to enable their right to free expression, at the request of their government.

In this dissertation, I explore how commercial social media platforms interfere with users’ free speech rights, and how foreign governments and jurisdictions affect their content regulation policies and practices, taking Turkey as a case study. The Turkey case study will look into the implications of compliance with local laws for Turkish users’ exercise of their online free speech rights, by combining data collected from interviews conducted mostly with Turkish internet freedom advocates with reports illustrating the impact of these policies on Turkish users, such as news reports of content being taken down or transparency reports released by the companies.

Literature used in this dissertation includes academic research in the fields of platform studies, social media governance (Flew) and the privatisation of speech regulation. Research conducted by organizations and activist groups advocating for better protections for online rights, such as the Electronic Frontier Foundation and the Ranking Digital Rights project, is also referred to in this dissertation, as their work usually illustrates concrete examples and cases. Throughout the paper I will be referring to the country-withheld content (CWC) rule or policy. This is how Twitter describes its policy of withholding content in a specific country for violating local laws (Twitter Help Center). Other platforms and companies do not name this policy; they simply refer to country-specific restrictions or government requests for content removal. Though I focus on Turkey as a case study, I will be referring to other countries whenever needed for the purpose of providing relevant examples. Chapter two is a review of the existing literature on the privatisation of speech regulation and its implications for users’ free speech rights. Chapter three lays down the data collection and analysis methods used, while in chapter four I analyze and discuss the findings of the Turkey case study.

Chapter 2: Literature Review

When it comes to the regulation of speech online, commercial social media platforms play a powerful role in deciding what kind of speech is permissible and what is not. Through algorithms, content moderators, and community-reporting or flagging tools, platforms govern the troves of content posted by their users on a daily basis, to make sure that certain red lines such as nudity, threats of violence and hate speech are not crossed. These platforms are not the only powerful actors in the regulation of online content. In fact, governments are increasingly influencing content removal decisions, particularly as platforms have started to abide by local laws.

Speech Regulation in Private Hands

By making and enforcing decisions about what is (and should be considered) appropriate content and what is not, commercial platforms have come to exercise significant power over public speech (Langlois; MacKinnon; Zuckerman). They have been described as “custodians of content” (Gillespie, “The dirty job of…”), “proxy censors” (Kreimer 14), a “social media police force” (Dencik), and “digital superpowers” (MacKinnon 34).

Social media platforms, search engines, and app stores occasionally face accusations of censorship, raising questions about their policies and the power they have gained over our rights and liberties, including our rights to free speech and access to information. Every now and then, we hear of a photo or a video taken down, an account suspended, or a link removed from the results of a search engine. In February 2016, Facebook removed a photograph by Tunisian photographer Karim Kammoun showing bruises on the naked body of a woman victim of domestic violence (Marzouk); the platform does not allow nudity. On 31 December 2015, Twitter suspended the account of human rights activist Iyad El-Baghdadi for half an hour after confusing him with Abu Bakr al-Baghdadi, leader of the terrorist group ISIS (BBC, “Twitter ‘confuses’…”). Twitter also staged a crackdown against groups and individuals using its service to threaten or promote terrorist acts, banning over 125,000 accounts between mid-2015 and early 2016 (Twitter). In September 2015, Apple removed from its app store Metadata+, an application that maps US drone strikes around the world, citing “excessively crude or objectionable content” (Smith).

The cases we hear about from time to time in no way reflect the full reality, as not all cases become public and social media platforms do not publish data about actions taken to enforce their terms of use. Ranking Digital Rights, a project that ranks ICT-sector companies on their respect for free expression and privacy, including internet companies that own the most popular platforms such as Twitter and Facebook, found that not a single internet company out of a total of eight publishes information about its terms-of-service enforcement, including the number of accounts affected and the types of content restricted.

The internet is often credited for its decentralized infrastructure, and the new possibilities it offers citizens to publish and access content without interference, particularly from governments (Hintz). While attending the 1996 World Economic Forum in Davos, John Perry Barlow, co-founder of the digital-rights advocacy group the Electronic Frontier Foundation (EFF), wrote and published “A Declaration of the Independence of Cyberspace”. Addressing governments in the declaration, Barlow wrote: “you have no moral right to rule us nor do you possess any methods of enforcement we have true reason to fear”. Written on the same day then US President Bill Clinton signed the Communications Decency Act to regulate “obscene” content on the internet, the declaration warned governments that “cyberspace does not lie within [their] borders” (Greenberg).

This belief in the emancipatory power of the internet soon proved to be “naive” (Morozov xiii), as both governments and corporate actors managed to acquire key roles in governing cyberspace and the rights of users. In addition, these two actors are increasingly collaborating with each other, whereby the “state outsources interventions into citizens’ communication to these platforms” (Hintz 195). In this regard, cyberspace is neither “independent” nor “sovereign”, as Barlow declared it to be in 1996. “While there might be a radical decentralization of communication online, it does not mean that power relations have disappeared”, writes communication studies scholar Ganaele Langlois (99). According to Crawford and Lumby, networked media regulation “accounts for a diverse, contested environment of agents with differing levels of power and visibility: users, algorithms, platforms, industries and governments” (9). In fact, unlike the regulation of traditional media, which usually involves governments and industry actors, new media governance further involves consumers and computer code.

In the governance of speech online, commercial platforms are key actors, acting as both hosts of and gatekeepers to content generated by their users, through their algorithms and Terms of Service (ToS). Through algorithms, platforms decide what content is most relevant to a specific user (Tufekci). For instance, Google’s algorithms rely on more than 200 “clues” such as freshness of content and location “to guess what [users] might really be looking for” (Google Inside Search). In another example, Facebook’s EdgeRank algorithm decides which stories are prioritized on each user’s newsfeed according to his or her connections and activity (Facebook Help Center). Algorithms further help companies enforce their terms of service, in addition to community reporting and content moderators (Youmans and York). For instance, Youtube has algorithms that help it detect pornographic content (Robinson), while Facebook’s artificial intelligence algorithms are flagging more photos as “offensive” than users or content moderators do (Constine). Terms of Service, on the other hand, play the role of a legally binding document between users and platforms (Bayley), and they “function much as traditional laws do” (Ammori 2276), since they determine what speech is permissible and what is not. Platforms have different conditions, and as a result what is permissible on one platform may not be allowed on another. For instance, while Facebook and Instagram have strict rules regarding nudity, Twitter only asks its users to mark graphic content or nudity as “sensitive” (Fitzpatrick).

Despite the increased responsibilities and powers in the hands of the private sector, governments are not completely sidelined, as they increasingly seek to exercise control by pressuring platforms to enforce national laws and take down certain types of content. For instance, these platforms have recently come under criticism from a number of governments for not doing enough to respond to the spread of extremist propaganda (Frenkel) and the surge of hate speech in the midst of Europe’s refugee crisis (Donahue). In addition, commercial platforms enlist their users as agents in the governance of speech, by providing them with tools to flag or report content for violating community guidelines and rules. Even though the final decision to remove or retain a piece of content remains in the hands of moderators or a company’s policy team, flagging is useful to platforms as it “provides a practical mechanism” (Crawford and Gillespie 2) for regulating the vast troves of content users post every day, and further “offers a powerful rhetorical legitimation” (ibid.) to platforms when they face criticism for deciding to either remove or keep a specific piece of content. These community-reporting tools have previously been used by government and law enforcement agencies in a number of countries. Crawford and Gillespie report that the UK’s Metropolitan Police holds “super flagger” status to report extremist content on Youtube (Crawford and Gillespie 7), while industry sources confirmed to the authors of a UNESCO report entitled “Fostering Freedom Online: the Role of Internet Intermediaries” that governments “sometimes seek to have content restricted” via community reporting mechanisms (Bar, Hickok, Lim and MacKinnon 143).

In the governance of UGC, no actor intervenes independently, as algorithms, terms of service, consumers, corporations, and governments interact with one another and at different levels to determine what speech is tolerated and what is not. Algorithms are not independent of the corporations that put them in place, and they act to enforce corporate choices and policies (Pariser 175). The regulatory choices and policies corporations adopt are, on the other hand, influenced by foreign laws and norms (Ammori 2014). The most popular social media platforms and content-hosting services are owned by transnational companies headquartered in the US. Though these platforms may be subject to US law, their terms of use and community guidelines are also influenced by foreign jurisdictions and norms. Most users of these platforms live outside the US. In fact, 83.6 per cent of Facebook’s daily active users are outside the US and Canada (Facebook newsroom), and 79 per cent of Twitter accounts are also outside the US (Twitter Usage). Thus, it is in a company’s interest to put in place harmonious policies influenced by foreign laws and norms in order to attract a broad range of users (Ammori 2263). In this regard, the US constitution’s First Amendment, which guarantees freedom of speech, is but “a local ordinance”, writes Marvin Ammori, internet policy expert at the Center for Internet and Society at Stanford Law School (ibid.). This is reflected in the regulatory approaches taken by social media platforms to govern “controversial” content that is usually tolerated under the First Amendment, such as hate speech or nudity. In the US, the First Amendment offers broad speech protections compared to European legislation, and of course to that of undemocratic regimes. For example, while in several European countries hate speech is criminalised, this type of speech is more tolerated under the First Amendment unless it is accompanied by a threat or incitement “intended to produce an imminent illegal conduct” (Volokh). As far as nudity is concerned, in the US, even though “obscenity” is considered unprotected speech under the First Amendment, nudity and most pornography are still protected (Ruane). Yet American social media platforms have taken different approaches to regulating such content.
Facebook, for instance, has a strict policy on nudity compared to other platforms like Twitter and Tumblr. While Facebook and Instagram ban photographs of genitals and female nipples (Facebook Community Standards), Twitter and Tumblr only require users posting such content to flag their accounts and blogs (The Twitter Rules; Tumblr Community Guidelines).

Governments as Powerful as Ever

In addition to taking into account foreign laws and norms when crafting their terms of use and community guidelines, American social media platforms are increasingly expected to cooperate with governments. “We are forming our own Social Contract. This governance will arise according to the conditions of our world, not yours. Our world is different”, asserted Barlow in his 1996 declaration. The decentralised infrastructure of the internet promised a space independent of any government interference, where “anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity” (Barlow).

Barlow, like many other cyber-libertarians, failed to anticipate how governments would respond by purchasing and manufacturing technologies that allow them to filter and monitor this new, supposedly liberating space. For example, Iran is reportedly building a ‘halal internet’, an intranet with its own national and religiously permitted services, that would isolate the Iranian cyberspace from the rest of the world (Doug). Undemocratic governments do not even need to go as far as building their own national internet, as they can simply acquire filtering and surveillance technologies, usually purchased from corporations based in western democracies. When trouble arises, such as an uprising or protests, they can resort to entirely shutting down access to internet services, as the Egyptian authorities did in January 2011 to clamp down on protests against the dictatorship of then President Hosni Mubarak (Williams).

Further, throughout the years governments have found ways to make internet companies collaborate with them, enlisting them as “proxy censors to control the flow of information” (Kreimer 14) and pressuring them to take down content for violating their local laws. Platforms, on the other hand, comply based on calculations of legal and financial risk (Zuckerman). For instance, companies that have an office in a foreign country are under a legal obligation to comply with that country’s laws; if they do not comply, they risk having their staff arrested or prosecuted. In countries where they do not have operations, platforms could risk being blocked, which in turn could result in a loss of profits (see the “Not All Requests Are Treated Equally” section in this chapter and “Interference of Business Interests” in chapter four).

Facebook states that governments may ask it to restrict access to a piece of content if they believe it violates their local laws, such as legislation banning Holocaust denial in Germany. If Facebook finds the content in question to be in violation of local laws, it restricts access to it in the relevant country or territory (Facebook government requests report, “About the Reports”). Twitter’s international users “agree to comply with all local laws regarding online conduct and acceptable content” (The Twitter Rules). Google, on the other hand, states that it “regularly receives requests from courts and government agencies around the world to remove information from [its] products” (Google Transparency Report). Other services and platforms that receive requests from foreign governments and courts include the online collaborative encyclopedia Wikipedia, the multinational technology company Yahoo (Yahoo Transparency Report) and the messaging application Snapchat (Snapchat Transparency Report).

In this regard, the “open” and “borderless” infrastructure of the internet is after all not without borders and constraints, and users are increasingly expected to behave according to the local jurisdictions under which they live. Otherwise, they may be silenced by the same platforms that promise to empower them, at the request of their governments. Arne Hintz, senior lecturer at the Cardiff School of Journalism, Media and Cultural Studies, writes that “the deterritorialized spheres of the internet have partly been reterritorialized by states” (217). York, for her part, writes that “instead of an unregulated, decentralized internet, we have centralized platforms serving as public spaces: a quasi-public sphere”, which is “subject to both public and private content controls spanning multiple jurisdictions and differing social mores” (3).

Further, the fact that today users across the world rely on a few big corporations that own the most popular services and products makes it easier for states to exercise power in cyberspace. One particular company, Google, owns so many services that media studies professor Siva Vaidhyanathan described its expansion as the “googlization of everything” (Vaidhyanathan). Google’s services include the largest search engine, Google Search; one of the most popular online news-aggregating services, Google News; the blogging service Blogger; the video-sharing website Youtube, which has over a billion users (Youtube Statistics); and the biggest mobile operating system, Android, which in 2015 had a global market share of 81.5 percent (Hahn). Facebook, on the other hand, is by far the most popular social networking site, with more than 1 billion active users. Over the past years, the company has also purchased the photo-sharing service Instagram and the instant-messaging app Whatsapp. This concentration of ownership in the hands of a few corporate actors leaves users with very limited options and alternatives. For instance, activists living under repressive regimes might be reluctant to leave a large platform like Facebook for a service that is less popular but more supportive of freedom of expression, as they have better opportunities to reach their audiences on Facebook.

Implications of Compliance with Local Laws

Social media platforms’ local-law compliance policies have three main implications for users’ free speech rights. First, by agreeing to comply with local laws, internet companies and online platforms face the dilemma of silencing what is considered legitimate speech under international human rights standards. Governments and platforms do have legitimate reasons to restrict certain types of content, such as child pornography and cyberbullying, as international human rights law allows for certain restrictions that are “provided by law” and “necessary”. Under article 19 of the International Covenant on Civil and Political Rights (ICCPR), freedom of expression may be restricted for the purpose of ensuring “the respect of the rights or reputations of others, and the protection of national security or of public order, or of public health or morals”. However, legal restrictions on freedom of speech differ from one country to another. What is illegal speech in one country could be totally legal in another. And while some countries have in place legislation that is consistent with international human rights law, others do not. So, when companies agree to comply with local laws, they also agree to honour requests that violate their customers’ free speech rights, which are guaranteed by international human rights law. This, in turn, contradicts the public commitment to free speech these platforms and their leaders make, as described earlier. For instance, between January and June 2014, Facebook restricted seven pieces of content inside Saudi Arabia for violating local laws that ban criticism of the royal family (Facebook Government Requests Report, Saudi Arabia). And between January and June 2015, Google received a total of 14 requests from Indian authorities to remove content for violating local laws that prohibit religious offense (Google Transparency Report).
While Google is legally required to comply with Indian legislation since it has offices inside the country, Facebook does not have offices in Saudi Arabia and is not in any way under an obligation to abide by Saudi law.

Second, there is the risk of misinterpreting and misunderstanding local laws, and as a result “inflict[ing] unacceptable collateral damage” on users’ free speech rights (Kreimer 47). As mentioned earlier, most users of social media platforms are based outside the US, living under a variety of jurisdictions, from the most restrictive to the most liberal. These jurisdictions and laws are themselves influenced by “informal institutions”, which include cultural norms and values, traditions and customs, and belief systems (Flew 1). As a result, to properly understand the local laws of a foreign jurisdiction, it is not sufficient to study the laws; it is equally important to understand the cultural, social and political contexts. Internet companies rely on content moderators to remove pieces of content from their platforms to enforce their terms of service. Recently, media coverage of and academic research into commercial content moderation revealed how major social media platforms outsource content moderation to workers usually based in countries like India and the Philippines to “soak up the worst of humanity in order to protect the rest of us” (Chen), in other words to keep our feeds “clean” of gore, nudity, hate speech and threats of violence. This raised questions as to how moderators can judge the “appropriacy” of content destined for an audience living in culturally, socially, and politically different environments from their own. As media studies scholar Sarah Roberts puts it, content moderation tasks “often involve complicated matters of judgment and thus require a moderator be steeped in the social norms and mores of the places in the world for which the content is destined” (Roberts 2).
These same questions apply to the legal teams of social media platforms tasked with addressing content takedown requests made by governments and law enforcement authorities of foreign countries whose languages, laws, judicial systems, cultural norms, and socio-political contexts they may not be familiar with. Though transnational American social media platforms regularly publish transparency reports shedding light on content takedown requests made by courts and law enforcement in foreign countries, there is a lack of transparency about the process of evaluating and responding to such requests, and particularly about the teams that respond to them. To what extent are these teams diverse in terms of the regions represented and the languages spoken, so as to be able to make sound decisions about which content to withhold and which to keep, based on requests made by governments from across the world? These platforms may be capable of understanding and perfectly interpreting the laws of the countries in which they have offices and operations, but what about the larger number of countries in which they do not? “Even cultures that are broadly aligned — such as the United States and Europe — still have marked differences in their approaches to controversial issues”, notes internet governance and privacy expert Emily Taylor in a 2016 report for the Global Commission on Internet Governance (12). Taylor goes on to conclude that “making the right decision is difficult”, and that “for the most part, the line between what is acceptable and unacceptable is not so easy to draw; decisions are difficult and nuanced, and different cultures have varying levels of tolerance” (13).

The third implication is that undemocratic and anti-free speech governments will increasingly make requests to social media platforms to take down content they find “objectionable”, as they now expect these platforms to comply with their local laws. “As governments grow aware of the fact that stifling speech is as easy as submitting an order to a corporation, the number of those orders will drastically increase”, writes York (“Complicity in censorship…”). In recent years, there has indeed been an increase in the number of government requests received by social media platforms and companies offering content hosting services. Since it first started publishing data about third-party requests for content takedown or user data, also known as the transparency report, Google has seen the number of government removal requests more than double. Over the second half of 2009, the company received 1,062 requests, a number that reached 3,467 over the first half of 2015 (Google Transparency Report). Twitter, on the other hand, saw an even more significant increase in the number of takedown requests. In 2012, the platform received only 48 requests from governments, a number which jumped to 5,631 requests in 2015 (Twitter Transparency Report). Though the rise in requests could partly be attributed to growing numbers of users, it also reflects governments’ growing exploitation of content takedown policies. For instance, while Twitter’s number of users did not grow significantly over 2015 (Oreskovic), the number of requests the platform receives continues to rise: over the second half of 2015, Twitter received 4,618 requests, compared to only 1,013 during the first half of the same year. And during the first half of 2013, Google registered a 68 percent rise in the total number of government takedown requests compared to the second half of 2012 (Google Official Blog).

Not All Requests are Treated Equally

Commercial platforms are owned by large internet companies and corporations whose aims include increasing the number of users, boosting revenues and entering new markets (Youmans and York). In this regard, a ban on a commercial platform, particularly in a large and profitable market, would result in the loss of users and revenues. For instance, with more than 44 million internet users as of 2015 (Internet Live Stats), Turkey represents an important market for platforms, and companies might prefer to censor their users there rather than face significant financial losses.

Though “business considerations may trump civic considerations” (Youmans and York 324), engaging in too much censorship could also be harmful to commercial platforms, by bringing about bad publicity (Zuckerman). As a result, these platforms need to strike a balance between their financial interests and user satisfaction. In other words, they need to make sure that they do not upset governments, but do not censor too much either. As Roberts writes in reference to content moderation by commercial platforms: “these decisions may revolve around issues of “free speech” and “free expression” for the user base, but on commercial social media sites and platforms, these principles are always counterbalanced by a profit motive; if a platform were to become notorious for being too restrictive in the eyes of the majority of its users, it would run the risk of losing participants to offer to its advertisers” (5). Ammori (2279), meanwhile, notes: “these companies must figure out how to adopt and enforce policies that comply with various national laws while advancing a corporate (and individual) interest in freedom of expression”.

The need to strike this kind of balance can lead to inconsistencies in the free speech policies and practices of these platforms. Content that a platform agrees to restrict in one country (or in one context), it may refuse to take down in another country (or in a different context). In other words, platforms may be more willing to collaborate with authorities in countries where they have high stakes than in those where they do not. Following last year's attack against the French satirical magazine Charlie Hebdo, Facebook founder and CEO Mark Zuckerberg posted a #jesuisCharlie support message and revealed that in 2010 his platform refused to ban content deemed offensive to the prophet Muhammad in Pakistan, despite the death threats he received (Zuckerberg). A few days later, however, Facebook agreed to ban pages for insulting the same prophet in Turkey following a court ruling (Akkoc, “Facebook censors…”). Turkey and Pakistan are both countries with Muslim-majority populations and legislation banning religious offense. In another example, while in the US Google fought, and won, a court order against the removal of the controversial “Innocence of Muslims” film posted on Youtube in 2012 (Roberts J), the company still agreed to remove the same video inside Pakistan. Authorities there had blocked Youtube since 2012, until Google agreed to launch a localized (in other words, censored) version of its video-sharing platform in the country (Variyar).

For years, several transnational companies sought to rely on American free speech legislation in response to foreign governments’ content takedown requests, but eventually ended up adopting country-withheld content policies (Ammori). Unlike a free speech policy based on the American legal system, a country-withheld content policy gives social media platforms more flexibility in dealing with government requests while allowing them to avoid the risk of being completely banned. Platforms are able to push back on certain requests, but they can always comply when the stakes are high. Meanwhile, if they face criticism from activists and free speech advocates, they have local laws to blame. To give an example, during a Facebook Q&A event, Zuckerberg defended his company’s decision to operate in speech-restrictive countries by laying the blame on local laws. He said: “Most countries have laws restricting some form of speech or another...The problem is if you break the law in a country then often times that country will just [block] the whole service entirely... I can’t think of many examples in history where a company not operating or shutting down in a country in a protest of a law like that has actually changed the law. I think overwhelmingly our responsibility is to continue operating” (Q&A with Mark).

Involving Users in Questions of Governance

Regulation of speech online is part of the broader question of internet governance, to which several international and United Nations-led conferences and meetings have been dedicated over the past years, most notably the Internet Governance Forum (better known by its acronym IGF). Governments and corporations today dominate the Internet Governance (IG) process, including the regulation of our speech rights and the shaping of what kinds of content users are and are not allowed to post online, as shown above.

One principle of the internet governance process is “multistakeholderism”, which refers to the participation on equal footing of multiple stakeholders, including governments, non-governmental organizations, the private sector, academics and the technical community, in forums and discussions that shape IG policies at the international, regional or local levels (Diplo). This principle, however, is not always honoured, as powerful actors (governments and the private sector in particular) often enjoy larger representation in such events than civil society and other marginalised communities, mainly thanks to the resources they have access to. In some regional or national IGFs, government and private actors also have the power to shape the agenda of the meeting and to exclude or marginalise certain participants and communities, civil society in particular. For instance, the 2015 Arab region IGF edition focused on financial and security concerns, while marginalising human rights issues (Nachawati Rego). In addition, not a single civil society member took part in any of the forum’s plenary sessions (Tarakiyee).

It may not be possible to adopt a multi-stakeholder approach in every single decision or policy corporations take or approve, but greater involvement from users, rights groups and activists is needed. Over the past few years, corporations and civil society actors have taken a number of steps to maximize protections for user rights. In 2008, the Global Network Initiative (GNI) was founded with the purpose of “advancing freedom of expression and privacy in Information and Communication Technologies” (GNI brochure). ICT-related companies, civil society organizations, academics, and investors are involved in GNI activities, including companies like Google, Microsoft and Yahoo, and civil society groups like the Committee to Protect Journalists. GNI put in place “principles on freedom of expression and privacy”, which member companies commit to implementing. In another positive development, several companies today publish transparency reports providing insights into requests made by governments for user data or content takedown, even though the information provided is often not comprehensive, and as a result does not allow users and rights groups to get the full picture and hold these companies to account.

However, there is still much to do, not only in terms of enhancing transparency and maximizing rights protections, but also when it comes to involving customers. In her book Consent of the Networked, internet freedom advocate Rebecca MacKinnon refers to the principle of the “consent of the governed” in political philosophy, under which the authority of a government should be derived from the consent of its people. In cyberspace, however, there is no such explicit consent, except for the terms of use most users agree to without reading. This makes governments and corporations “sovereigns operating [in the cyberspace] without the consent of the networked” (MacKinnon, xxiii). In this regard, netizens need to play greater roles in influencing decision- and policy-making processes, and in holding companies to account. This, however, is only possible when companies “build processes for engagement with users and customers, who are re-envisioned as constituents”, writes MacKinnon (248). “If companies want to gain trust amongst users, they need to be aware of the human rights implications of their policies and develop systems to resolve issues between activists—as well as average users—and companies”, noted York in a 2010 report entitled “Policing Content in the Quasi-Public Sphere” (29). One area in which social media platforms can engage users and civil society actors is the development of “realistic and robust processes for content moderation that comply with international human rights standards” (Taylor 16). The development of such standards would make it easier for companies to make decisions regarding content removal, and easier for users to hold the companies to account. A number of respondents in the Turkish case study also mentioned the development of such standards as one of the solutions to mitigate threats to free speech rights when companies make content removal decisions in compliance with local laws (see chapter four).

Chapter 3: Research Design

The purpose of the case study is to examine the attitudes of internet freedom advocates towards the country-withheld content policies and practices of social media platforms. These attitudes are explored through interviews conducted via email or Skype, mostly with Turkish activists.

Turkey as a Case Study

Over the past few years, Turkey has been gradually heading toward more authoritarianism under the rule of the Justice and Development Party (AKP), led by one of its founders, Recep Tayyip Erdogan, who currently serves as the country’s President after winning the presidential election in 2014 (BBC, “Recep Tayyip Erdogan: Turkey’s ruthless president”). In his bid to consolidate his grip on power, Erdogan staged a crackdown on civil and political liberties, including press and media freedoms (Kirişci). In 2013, the Committee to Protect Journalists (CPJ) listed Turkey as the world’s “leading jailer of journalists” (Taal), documenting numerous attacks on and legal cases against journalists covering the anti-government Gezi protests (Dewey). Detentions and prosecutions of journalists are still common to this date. On 6 May, a court sentenced two journalists from the opposition newspaper Cumhuriyet to jail for revealing to the public Turkish arms shipments to Syrian rebels.

In addition to silencing journalists and cracking down on media coverage, Erdogan has also been wary of the internet’s role in supporting political speech in Turkey, describing social media as a “menace to society” and threatening to “eradicate” Twitter in 2014 (Ries). Seeking to control the internet, the parliament passed legislation requiring Internet Service Providers (ISPs) to keep a record of their users’ online activities for two years (Hurriyet, “Turkish parliament…”), and giving the government the power to shut down websites without a court order (Hurriyet, “New bill gives…”). Further, the country’s internet law (Law No. 5651, passed in 2007, on the Regulation of Publications on the Internet and Combating Crimes Committed by Means of Such Publications) bans, among other types of content, child pornography, “obscenity”, and criticism of the founder of the Turkish Republic, Mustafa Ataturk (Akgul). Prosecutions of users for exercising their right to free speech online are not uncommon in Turkey. In 2014, for instance, 29 Twitter users were put on trial for sending tweets related to the 2013 Gezi protests (Amnesty).

The Turkish government repeatedly resorts to blocking social media platforms (Kasapoglu) as a way to “bully” them into complying with its censorship demands (Galperin). In March 2014, Facebook, Twitter, and Youtube were temporarily blocked to halt the circulation of leaked audio recordings purportedly showing evidence of corruption within the inner circle of then Prime Minister, and current President, Recep Tayyip Erdogan. Turkish authorities also block or throttle access to social media platforms entirely to thwart coverage of terror attacks. Following a deadly car bomb attack in the capital on 17 February, activists reported that ISPs either blocked or slowed down access to Twitter and Facebook (Sozeri, “Turkey cracks down…”). Blocking tactics often backfire, with users usually circumventing the censorship. For example, in March 2014 Twitter users in Turkey posted 2.5 million tweets in the hours after ISPs blocked access to the site (Gayomali).

In addition, Turkish authorities have been increasingly taking advantage of content removal policies to pressure social media platforms, particularly Facebook, Twitter, and Youtube, to comply with the country’s anti-free speech laws. Only recently, following a complaint filed by Turkcell, one of the country’s largest mobile service providers, a court ordered Twitter to delete 862 tweets about a scandal at a religious foundation that is linked to the company and has close ties to President Erdogan and his family (Sozeri, “Turkish court orders…”). In fact, Turkey makes a significant number of requests to different companies. During the second half of 2015, Turkish authorities made 2,211 content removal requests to Twitter, almost half the total number of requests received by the social networking site over that period (Twitter Transparency Report). Though the increase in the number of requests can partly be explained by an increase in the number of internet users in the country, Turkey still makes more requests than countries with larger numbers of users, such as the US, Germany or India.

Even though the US, Germany, and India have more internet users (see table 1), Turkey made more content takedown requests to both Google and Twitter from 2012 to mid-2015, as shown below in figures 3.1 and 3.2. Facebook data is not represented below, as the company only provides the number of pieces of content restricted and does not publish the number of requests it receives from each country. But between January and June 2015, the company restricted 4,496 pieces of content inside Turkey, mostly under the country’s internet censorship law. Only India comes before Turkey, with more than 15,000 pieces of content restricted. However, there are more users in India than in Turkey. In addition, the company has several offices in India, making it subject to the country’s legislation.

Year    Turkey¹        Germany²       India³          US⁴
2012    33,779,438     66,273,592     158,960,346     249,635,976
2013    35,253,433     67,812,285     193,204,330     267,028,444
2014    39,568,141     69,509,013     233,152,478     279,070,327
2015    43,953,971     70,569,048     354,114,747     283,712,407

Table 1: Growth of the number of internet users in Turkey, Germany, India, and the US from 2012 to 2015. Source: Internet Live Stats, data elaborated by the International Telecommunications Union (ITU), the World Bank and the United Nations Population Division.

Figure 3.1: Number of removal requests made by Turkey, Germany, India and the US to Google from 2012 to 2015. * denotes only the first half of 2015. Source: Google Transparency Report: requests by the numbers from Germany, India, Turkey, and the US.

1 Number of Internet users in Turkey. Internet Live Stats. Accessed on 30 April 2016. <http://www.internetlivestats.com/internet-users/turkey/>.
2 Number of Internet users in Germany. Internet Live Stats. Accessed on 30 April 2016. <http://www.internetlivestats.com/internet-users/germany/>.
3 Number of Internet users in India. Internet Live Stats. Accessed on 30 April 2016. <http://www.internetlivestats.com/internet-users/india/>.
4 Number of Internet users in the US. Internet Live Stats. Accessed on 30 April 2016. <http://www.internetlivestats.com/internet-users/us/>.

Figure 3.2: Number of removal requests made by Turkey, Germany, India and the US to Twitter from 2012 to 2015. Source: Twitter Transparency Report.

It is for the reasons mentioned above that Turkey was selected as the case study of this dissertation. The government there is increasingly authoritarian and wary of the role the internet plays in providing a public and open space for critics and political opponents. Among the tactics it employs to crack down on dissent is making content removal requests to social media platforms.

Data collection methods

To gain insight into the implications of the country-withheld content policy for the free speech rights of users in Turkey, qualitative interviews were conducted with Turkish and non-Turkish free speech and internet freedom advocates. I chose to conduct qualitative interviews instead of surveys as my aim is “integrating multiple perspectives” (Weiss 25) from the internet-activist community, and not the collection of statistical data. In the study, I am not interested in statistics and numbers as much as I am interested in exploring how Turkish activists perceive the country-withheld content rule and its use by their government to silence critics and suppress dissent. In his book Learning From Strangers: The Art and Method of Qualitative Interview Studies, published in 1995, Robert S. Weiss writes that when we conduct qualitative interviews “we gain in the coherence, depth, and density of the material each respondent provides. We permit ourselves to be informed as we cannot be by brief answers to survey items. The report we ultimately write can provide readers with a fuller understanding of the experiences of our respondents” (16).

The nine interviewees were selected based on their knowledge of the topic and their experience advocating for an open and free internet, in Turkey or elsewhere. Due to the political situation in Turkey, respondents were given the opportunity to remain anonymous. As Brennen writes, qualitative interviews raise “potential ethical dilemmas arising from the use of personal information”, and interviewers “have a moral responsibility to protect their respondents” from any “potential harm” (29). Only one interviewee requested to have her identity concealed. Interviewees include activists, academics, journalists, and researchers. They are:
❖ Ahmet A. Sabancı, a freelance writer and researcher focusing on internet freedom issues in Turkey
❖ Arzu Geybulla, a freelance Azerbaijani writer who also covers Turkey
❖ Asli Telli Aydemir, a researcher with the Alternative Informatics Association, an Istanbul-based civil society group promoting digital rights and internet freedom
❖ Efe Kerem Sozeri, a Turkish journalist who covers internet freedom issues in Turkey
❖ Erkan Saka, a political blogger and assistant professor at Bilgi University in Istanbul
❖ Isik Mater, a Turkish internet freedom activist
❖ Reem AlMasri, technology editor and researcher at 7iber.com, an online magazine based in Amman, Jordan
❖ Sarah Myers West, PhD researcher at the Annenberg School for Communication at the University of Southern California
❖ and finally, a Turkish lawyer who requested to remain anonymous

Having myself conducted research and reported on the intersection of technology and human rights, I have come to personally know and meet most of the respondents, which made it easy for me to get in touch with them. In addition, one respondent recommended and helped me reach out to two other interviewees.

Before conducting the interviews, an ‘interview guide’ was developed based on the literature review and research questions (see Appendix A for the interview guide). Weiss defines an interview guide as “a listing of areas to be covered in the interview along with, for each area, a listing of topics or questions that together will suggest lines of inquiry” (Weiss 80). The interviews are semi-structured in the sense that there is a “pre-established set of questions” (Brennen 28) that do not impose limited answer options. There is also “greater flexibility” with semi-structured interviews, as the order of questions may change and interviewers may ask follow-up questions (ibid.). For this study, questions differed somewhat from one interview to another, depending on how each interview went and the background of each respondent (see Appendix B for two sample interviews). Non-Turkish interviewees were not asked specific questions about Turkey, but rather general questions or questions related to the regions and countries they focus on. The fact that I do not speak Turkish did not represent a challenge, since the respondents speak English, the language in which all interviews were conducted. The interviews were conducted via Skype or email, as not all respondents had time for Skype conversations (see Appendix C for the interviewing method of each respondent). For interviews conducted by email, follow-up questions were sent to respondents whenever needed. Email interviews may reduce the spontaneity of synchronous interviewing or communication, and “spontaneity can be the basis for the richness of data collected in some interviews”; however, they also provide respondents with the opportunity to think about their answers, and even to list references and resources (Opdenakker 2006).

Data analysis methods

After each interview, answers (for emails) and transcripts (for Skype calls) were “coded” (Weiss) to explore which questions and issues were raised, in order to identify “important insights, and information, outline key concepts, opinions, patterns and themes” (Brennen 38). The main themes of each interview were then listed in a spreadsheet, with the corresponding quotes and information provided by each respondent. Whenever needed, further research was conducted to “provide a contextual frame of reference from which the interview quotations are interpreted” (ibid.), and to ease the later writing of the findings and discussion chapter. Brennen notes that qualitative interviewing presents “one potential issue”, namely the “reliability of the information provided by respondents” (38). For this reason, whenever needed, information and data presented by respondents as “factual” were verified and assessed for accuracy. This, however, does not concern the personal opinions and perspectives of interviewees, which in this case study are treated as “authentic responses”, unlike factual information, which “should be verified from other research sources as well as corroborated by other respondents during subsequent interviews” (ibid.).

The findings and discussion chapter (chapter four) was written using the interviews and a qualitative textual analysis of the news coverage and research related to Turkey’s internet freedom situation, in particular how authorities there exploit social media platforms’ content removal policies. It was written based on an “issue-focused analysis” (ibid.). An “issue-focused description is likely to move from discussion of issues within one area to discussion of issues within another, with each area logically connected to the others” (ibid.). This allowed me to divide the chapter into different sections, with each section focusing on a particular theme.

Chapter 4: Findings and Discussion

In the interviews, respondents addressed a number of concerns related to country-specific content removals. Turkish respondents, in particular, were openly critical of the growing compliance of social media platforms (mainly Facebook and Twitter) with their government’s takedown requests, which they said usually aim at silencing critics and suppressing legitimate speech. Respondents want platforms to disclose more information about government requests, and about their criteria for complying with those requests.

Implications of the Country-Withheld Content Policies

Respondents expressed unfavourable views of country-specific restrictions of content in general, and of how they are being exploited by governments, particularly the Turkish government in this case. Ahmet Sabanci, a writer and researcher focusing on internet freedom issues in Turkey, said he opposes such restrictions by social media platforms for two reasons. First, such restrictions contribute to the “balkanization” of the internet. The term “balkanization” is often used pejoratively to refer to the fragmentation of the “universal” and “borderless” internet into “splinternets”, or networks "that are walled off from the rest of the Web" (Maurer). However, the threat of “balkanization” is not only due to attempts by governments, such as Iran’s, to build their own networks isolated from the rest of the web (Meinrath). “Balkanization” is also the result of non-harmonious legislation and regulatory regimes (Bleiberg), which governments increasingly use to demand content removal. As a result, some tweets and accounts that a user in the Netherlands is able to see may not be accessible to another user in Turkey. Second, according to Sabanci, these restrictions are “useless” since users can bypass them by “changing some little account settings”, like setting the account location to a country other than Turkey.

Reem AlMasri, who researches Internet Governance in the Arab region, thinks social media platforms “should not be subject to local laws, especially in this part of the world where laws are made to restrict content”. Country-withheld content policies “give countries like Turkey more leverage to persecute those who say things the authorities do not necessarily like”, said writer and journalist Arzu Geybulla. Citing as examples hate speech laws in Europe and legislation banning Holocaust denial in Germany, Efe Kerem Sozeri, a journalist who covers internet freedom issues in Turkey, stated that internet companies can comply with national laws, but only when there is an independent judiciary, which is not the case in Turkey, where removal requests are “politically motivated” (Sozeri, Efe Kerem. Interview. 11 May 2016). As explained above (‘Research Design’ chapter), Turkish authorities do indeed exploit content removal policies to crack down on speech and dissent. As a result, these policies have implications for users’ free speech rights in Turkey and elsewhere. Respondents particularly mentioned four types of content that Turkish authorities target through country-specific restrictions:
❖ Criticism of Ataturk: internet law No. 5651 prohibits criticism of Mustafa Kemal Ataturk, the founder of the Turkish Republic (Human Rights Watch, “Turkey: Internet Freedom, Rights in Sharp Decline”).
❖ Criticism of the ruling authorities: Article 301 of the Penal Code prohibits the “denigration of the Turkish nation, the state of the Republic of Turkey, the Turkish Parliament, the government of the Republic of Turkey and the legal institutions of the state” (Human Rights Turkey). Article 299 of the same code prohibits insults against the President (Ognianova).
❖ Content and accounts related to Kurdish politicians and activists, and the outlawed Kurdistan Workers’ Party (PKK), a leftist militant group seeking to establish an independent Kurdish state (BBC, “Profile: Kurdistan Workers' Party”). Turkey often uses its anti-terror laws to crack down on activists peacefully promoting Kurdish rights (Human Rights Watch, “Turkey: Terror Laws Undermine Progress on Rights”). The United Nations has previously criticized the use of such laws to prosecute those involved in “non-violent discussions of the Kurdish issue” (Global Legal Monitor).
❖ Coverage of major political events or breaking news: Turkish authorities often place gag orders on coverage of major events, and these orders also apply to social media platforms. For instance, last year a court ordered a ban on Twitter and Youtube over footage showing the hostage-taking of an Istanbul prosecutor by members of a far-left group (Akkoc, “ social…”). The two platforms were, however, unblocked after they complied with the court’s removal requests (Franceschi-Bicchierai). In another example, in an attempt to obstruct the circulation of images showing the aftermath of a deadly car bombing in the capital Ankara in March 2016, both Twitter and Facebook were blocked following a court gag order (Akkoc, “Ankara blast…”). Once again, the two platforms were unblocked later.

Turkey’s tactic of intimidating platforms to comply with its takedown (or censorship) requests is not new. In 2007, YouTube was blocked over videos deemed “insulting” to Ataturk, even though the platform had removed the videos (). The blogging platforms blogger.com and wordpress.com were also targeted over posts deemed libelous against , a creationist TV host and author (Ben Gharbia). This trend has continued to grow over the past few years, as Turkish authorities strengthened their crackdown on civil and political liberties, and as the number of internet users in the country increased. This has in turn brought content regulation and moderation challenges, as platforms struggle between honouring their public commitments to freedom of expression and avoiding upsetting governments that might ban them. While social media platforms “could have once used more light touch content moderation, they are now experiencing difficult cultural norms from country to country as well as different government imperatives as their customer base is growing globally”, said Sarah Myers West from onlinecensorship.org, a project that collects reports from users censored by social media platforms. “What was always a tension has only grown over time as a result of the increasing global base” (Myers West, Sarah. Interview. 2 May 2016).

Facebook and Twitter: Platforms of Concern

How a platform responds to government requests for content removal depends on a number of factors, including its commitment to free speech, its user base, the importance of the market, and whether it has staff and offices inside the country issuing the request. In the interviews, respondents expressed concerns about the ways platforms handle takedown requests coming from Turkey, focusing mainly on two platforms: Facebook and Twitter. There were 39 million daily Facebook users in Turkey as of June 2015 (Daily and ), and 6.5 million monthly Twitter users as of late 2014 (Dogramaci).

Facebook received considerably more negative criticism than Twitter for how it handles the Turkish government’s takedown requests. Respondents were explicitly critical of Facebook, blaming the platform for not fighting back or appealing requests, and for complying “very quickly and very easily” (Aydemir, Asli Telli. Interview, 2 May 2016). Facebook has on several occasions faced accusations of (political) censorship. During and in the aftermath of the 2013 anti-government Gezi protests, activists accused Facebook of removing accounts and pages of political activists and parties, such as the page of the Kurdish Peace and Democracy Party (BDP) (Hurriyet 2013), the citizen journalism group the Others’ Post (Güler, “Facebook facing…”), and the page of the anti-racism and anti-hate speech initiative “DurDe!”, or “Say Stop!” in English (Güler, “Facebook censors…”). The social networking site also engages in removing LGBT-related content (Korkmaz) and posts deemed blasphemous or insulting to Islam, the dominant religion in Turkey (Akkoc, “Facebook censors...”). Because it “removes everything the Turkish government wants [to be removed]”, Facebook is becoming more of a “pro-government media”, Sabanci said.

Respondents also expressed concerns about Turkey’s exploitation of Twitter’s country-withheld content tool, but they were less critical of the platform. For Sabanci, even though Twitter realizes that Turkey is a big market and maintains relationships with the Turkish government, the platform “tries its best to fight back against censorship”, by appealing in courts and not always complying. “They are trying to find a middle ground in this situation, and so far they did some good jobs for fighting back against censorship”, he asserted. According to Erkan Saka from Istanbul’s Bilgi University, “Twitter is relatively careful about handling the situation. It informs the users at every stage, although it complies, and they regularly counter-sue the government to unblock messages or accounts”.

One example showing how Twitter “fights back” is its decision to appeal a court order requesting the removal of 860 tweets related to a child-rape scandal involving an organization close to the Erdogan circle (Sozeri, “Turkish court orders…”). That is not the only case of Twitter appealing a Turkish request. In response to an appeal filed by the platform, in March 2014 a court overturned a previous ban on an account that posted tweets accusing a former government minister of corruption (Gadde). Moreover, during the second half of 2015, the company said that it appealed 70 per cent of the 477 removal requests it received from Turkish courts (Kessel).

The fact that Facebook is not doing enough to appeal the requests led a number of activists, including Sozeri, to quit the platform, preferring Twitter instead to address politics. This turned Twitter into the “most political platform” for Turkish users (Sozeri, Efe Kerem. Interview. 11 May 2016). In addition, the fact that Twitter, unlike Facebook, does not require real-name registration seems to make it possible for users to evade censorship by staying anonymous or adopting pseudonyms. Sabanci mentioned how, once users are notified by Twitter that their accounts or tweets are the subject of a court removal order, they switch their usernames or open new accounts. Once a username changes, Twitter is unable to apply the court order, as it can no longer find the username specified in that order. Such a tactic would of course not be possible where real-name registration is required, as such a policy makes it easier for a platform like Facebook to disable or “punish” accounts, either because they violated its community standards or because they are the subject of a government request (York, “Facebook’s ‘real name’ policy…”).

But as Twitter continues to serve as a “public forum” (Akdeniz and Altiparmak) for Turkish users seeking to debate their country’s current affairs, expose government wrongdoing, and access viewpoints and perspectives sidelined by the mainstream media in their country, the platform is facing growing pressure from the Turkish authorities to comply with their requests. In fact, Twitter started to implement its country-withheld content policy in Turkey following a 2014 ban on the site over tweets related to a corruption scandal involving the inner circle of the then Prime Minister and current President Erdogan (Sozeri, “Uncovering the accounts…”). Though the ban was later overturned by the country’s constitutional court, the damage was already done, as Twitter bowed to Turkey’s pressure and started to comply more often with its content takedown requests.

Twitter’s growing compliance alarmed activists, who documented several cases of “political censorship” on the platform, made possible by the implementation of the CWC policy (Akdeniz and Altiparmak 2). In a 2015 legal notice addressed to Twitter’s legal representatives, Turkish academics and activists Kerem Altiparmak and Yaman Akdeniz accused the platform of “transforming a public forum into a pro-government one” (4) by complying with politically motivated requests against tweets and accounts critical of the government. Content affected by Twitter’s enforcement of its CWC policy in Turkey includes accounts of journalists, media outlets, and users addressing government corruption, criticizing Erdogan and other government officials, and leaking information to the public (Sozeri, “Uncovering the accounts…”). For instance, in January 2015, accounts tweeting and leaking information about a secret Turkish arms shipment to Syrian armed groups fighting the regime of Bashar al-Assad were suspended by Twitter following a court order. The account of the leftist newspaper BirGün was also affected when another court imposed a gag order on the story. In compliance with the order, Twitter withheld certain tweets containing links to the newspaper’s coverage of the story (ibid.). Just as on Facebook, Twitter accounts posting about Kurdish issues also face the risk of being censored or taken down. In October 2014, a government official revealed that Twitter withheld “provocative tweets against Turkey’s national security” (Beam). Those tweets were related to protests taking place in Kurdish-majority cities in Turkey, in support of the Kurdish Syrian town of Kobani, which came under siege by the terrorist group ISIS (BBC, “Turkey ...”).

A Demand for More Transparency

The fact that Facebook is receiving more attention and criticism could of course be explained by its large user base. There are more than 1 billion daily Facebook users (Facebook newsroom), and only 310 million users who log in to Twitter on a monthly basis (Twitter Usage). According to crowdsourced data gathered by onlinecensorship.org, based on reports submitted by users who were subject to censorship by social media platforms, Facebook was the most reported site (Online Censorship). “Facebook has more users and so it has more of a resource problem in terms of its content moderation because it just has more content to review”, explained Myers West.

But because Facebook does not provide comprehensive information about the removal requests it gets from the Turkish government, in particular the number of requests and its compliance rates, it is impossible to tell to what extent the platform is complicit in censoring its users in Turkey. For instance, while Turkish activists and users know that Twitter complied with 23 per cent of the 2,211 removal requests made by their government during the second half of 2015 (Twitter Transparency Report), they only know that Facebook restricted 2,078 pieces of content over the same period (Facebook Government Requests Report). In other words, it is not possible to tell whether Facebook complied with 100 per cent, 60 per cent, 20 per cent or any other proportion of Turkey’s requests. In addition to providing its compliance rate, Twitter further breaks down the number of requests it receives by branch of government (whether a court or a government agency), and states the number of accounts specified in the requests, and the number of tweets and accounts withheld in response to those requests.

Figure 4.1: Data Facebook provides on requests made by Turkey. Source: a screenshot of the Facebook government requests report for Turkey, July 2015-December 2015.

Figure 4.2: Data Twitter provides on government removal requests. Source: the company’s transparency report for July 2015-December 2015

Google, on the other hand, provides more information than both Twitter and Facebook, breaking down the number of requests by reason, for instance defamation, government criticism, or violence. The company, however, does not provide its compliance rate for each reason. For instance, while Google received 19 requests from Turkey for “government criticism” between January and June 2015 (Google transparency report), it does not say whether it partially or fully complied with those requests or did not comply at all.

Figure 4.3: Google breaks down the number of government takedown requests it receives by reason. Source: Google Transparency Report, Turkey requests from January to June 2015.

Respondents want to see platforms disclosing more information about government requests for content takedown in their transparency reports. For Aydemir, the “first problem” with these reports is that companies publish them because “they have to do it” and not because they seek to “keep their transparency at a credible level”. These reports are “definitely better than nothing”, said Saka, since they allow for the “documentation of existing censorship regime”, before adding that they should be further improved to include details about what kind of content is requested to be removed. Sabanci also stated that further information is needed, particularly about what kind of content is removed, why, and under which law: “I believe when it comes to a really meaningful transparency report, we need much more details than design. I believe there is no reason that a company shouldn't give us this information. Somehow they are giving us the summaries, maybe a little bit more detailed information. It is useful but not enough” (Sabanci, Ahmet. Interview. 25 April 2016). Since companies publish exact numbers, they must have a complete database of withheld content, Sozeri said. As a result, companies “can and should make” such databases available to the public, he added.

How Should Platforms Handle Requests?

In addition to being more transparent, respondents want to see social media platforms complying with fewer of the requests they get from Turkish authorities, and appealing more of them. Even though they are aware that such resistance may result in those platforms being blocked, costing them financially, respondents still expect to see internet companies on the side of their users. Geybulla said that platforms “are not doing enough” to protect their users’ rights: “It is simply unacceptable that not only [do] netizens get pressured at home but then they also get pressured for what they say online about the pressure they face at home” (Geybulla, Arzu. Interview. 26 April 2016). “It's already hard to deal with censorship itself but it's getting even harder when the biggest social media platforms comply with the requests. We need their support to preserve freedom of speech”, said Turkish net freedom advocate Isik Mater. While acknowledging that the legal environment in Turkey may not be “suitable” for the social media platforms, a Turkish lawyer who requested to remain anonymous still hopes to see them challenging removal requests more often. Saka admitted that platforms face a “difficult situation”: on the one hand they get blocked if they do not comply, and on the other, when they comply with orders issued by Turkey’s “flawed” judiciary, they contribute to the violation of their users’ free speech rights. So what should platforms do? Sozeri believes that companies should simply follow international human rights law. Because there is “no judicial independence in Turkey”, these platforms “have all the right to fail to comply with Turkish court requests”, he explained.

In their 2015 legal notice to Twitter, Akdeniz and Altiparmak argued that the social networking site is under no legal obligation to enforce its CWC policy in Turkey for two main reasons. First, Twitter, like Facebook, does not have representative offices inside the country, which means it is not liable under Turkish law. In fact, Turkish authorities have previously called on Twitter to open a “liaison office” in the country for “better coordination”, a call the company has so far rejected (Al Jazeera). According to Mater, the Turkish government is urging the opening of such offices “to make [platforms] pay taxes”, but most importantly to “create a more accessible channel for its censorship requests”. Second, Akdeniz and Altiparmak explained that Twitter, and content hosts in general, are not required under Turkish law to execute blocking orders. In this sense, the two academics called on Twitter to respect its users’ human rights in accordance with international human rights law. “Social media companies have a negative obligation to refrain from violating others’ rights and a positive obligation to prevent the violation of fundamental rights” (8), they wrote. Geybulla also stated that social media platforms should not comply, particularly with requests coming from a speech-restrictive government like that of Turkey. She suggested that platforms could develop “an emergency country guidebook”, which would tell them in which countries governments are abusing their content removal policies.

AlMasri stressed the need for a set of international standards respectful of human rights, similar to the Manila Principles on Intermediary Liability and the International Principles on the Application of Human Rights to Communications Surveillance, to be developed by civil society groups and the private sector. These standards would establish a process social media platforms could follow to decide on government takedown requests. Companies need to require a public law before complying with each request, and that law “should be interpreted as narrowly as possible”, said Myers West. Another respondent stated that social media platforms need to consider three criteria before complying with a request: (1) the necessity of removal, (2) how it would hurt freedom of speech, and (3) the potential reaction of users (Anonymous. Interview. 1 May 2016).

It remains, however, unclear which criteria different social media platforms take into account when they decide whether to comply with a request. The lack of transparency regarding such criteria opens the door to inconsistencies in the implementation of content removal policies: “when such choices are made in private, without transparency, there is greater scope for inconsistency in approach, to the detriment of fundamental rights” (Taylor 13). Sozeri mentioned as an example Twitter’s decision not to take down or suspend accounts of journalists tweeting about Kurdish issues, before asking rhetorically: “what is Twitter's criteria of journalist[ic] content and journalist[ic] accounts”? “It is very ambiguous” how the company complies with those requests (Sozeri, Efe Kerem. Interview. 11 May 2016). He suggested that platforms should be more explicit about country-specific restrictions in their terms of use, so that it will be “easier for users to post content”, and for activists to legally fight those restrictions. If, for instance, Twitter states in its terms of use that users in Turkey cannot defame Ataturk, activists would be able to argue that Twitter's ToS violates international human rights law, and would be able to legally fight against those restrictions (ibid.).

Abiding by international human rights standards remains an unlikely scenario, as that would result in the blocking of platforms in many countries (Ammori). Moreover, social media platforms “are not directly required” (Taylor 8) to abide by international human rights conventions, unlike signatory states. For the time being, platforms seem to be complying with requests depending on the country and on “calculations of fiscal and legal risk” (Zuckerman 73). By adopting different rules, companies can “push the outer limit of protection in each nation”, while “making compromises on free expression that make the companies complicit in censorship” (Ammori 2279). As a result, this leads to ambiguities and inconsistencies in the process by which social media platforms implement their content removal policies.

Interference of Business Interests

Though social media platforms may not be very transparent about their criteria and process for removing or keeping content, respondents seem to agree that business interests interfere with these decisions. “As far as I know Twitter does not profit much yet in Turkey, maybe because of this it is more critical of government requests. On the other hand, Facebook has a good advertising revenue and it is more compliant”, Saka speculated in an email interview. But that does not exclude the fact that Twitter sees business opportunities in Turkey, where there is “a big market, a really high penetration rate and a very active and engaged user base” (Sozeri, Efe Kerem. Interview. 11 May 2016).


As an emerging economy with a growing middle class (The World Bank) and more than 40 million internet users, Turkey represents a promising market for internet companies like Twitter and Facebook. These companies have so far been making most of their profits in developed markets. For instance, during the first quarter of 2016, Facebook made most of its advertising revenue ($2,615 million) in the US and Canada, followed by Europe with $1,307 million (Facebook Q1 2016 results). Twitter, on the other hand, has been witnessing a stall in the growth of its user base in the US (Oreskovic), where the company made three quarters of its total ad revenues in 2013 (eMarketer). In fact, while nearly 80 per cent of Twitter users are based outside the US, international users bring in only about 34 per cent of the company’s ad revenues (Frier).

To sustain growth, these companies need to attract and keep users in emerging markets such as Brazil, India, Turkey, and Kenya. These markets are “attractive because of the potential for fast growth and burgeoning consumer spending” (Gross), and social media companies are already seeking to target them. Both Facebook and Twitter are expanding their operations outside the US and Western Europe, particularly in emerging markets. Seeking new advertising revenues, Twitter opened a new office in Hong Kong in March 2015, even though the site is banned in mainland China (Shih). A few months later, the company launched another office in Dubai, in the United Arab Emirates, to serve the Middle East and North Africa region (Brown), considered by Twitter a “key strategic market” thanks to its fast-growing user base (Nazzal). Facebook, for its part, launched the Creative Accelerator advertising programme to target markets in “high-growth” countries like Indonesia, Turkey and India. The programme was designed to help companies and brands develop ad campaigns “tailored to the people in each country and the devices they use to experience Facebook” (Facebook for Business).

In this sense, the platforms we have entrusted with ensuring our free speech and privacy rights are businesses whose aims include growing their user base and making more ad revenue. To achieve these goals, they often need to stay on friendly terms with policymakers and governments (Mackinnon). “I believe those companies are complying with the legal decisions in order not to be banned”, the Turkish lawyer who requested to remain anonymous told me via an email interview. Saka shares a similar opinion: “they do not want to be blocked. Although users can circumvent censorship, platforms lose traffic when they are blocked and that is not good for them”. “At the end of the day these are profit making corporations. Sometimes they try to show more transparency and be more careful when complying with government requests, but if it is affecting their profits they will have to comply”, AlMasri said, adding that she would expect small companies to be more respectful of their users’ rights than big multinational corporations. She mentioned as an example the email service provider Lavabit, which in 2013 decided to shut down after reportedly refusing to comply with a US court order requesting user data (Ackerman).

There have been a few cases of big corporations “fighting back” as well. Recently, technology giant Apple declined to comply with the order of a US federal judge requiring it to help the FBI unlock the iPhone of Syed Farook, one of the perpetrators of the 2015 San Bernardino shootings that left 14 people dead (Kharpal). If Apple was able to appeal and fight that court order, it is thanks to the democratic system of governance in the US and its independent judiciary. Had this happened in a less democratic environment, the company would probably have been obliged either to shut down or to comply, and there have been such cases in the past. In 2009, Apple launched a censored version of its app store in China in compliance with the government’s censorship demands (Mackinnon). In 2013, the company banned inside China an online application that provides access to books banned in the country (Anderlini). Also in China, in 2005, authorities sentenced journalist Shi Tao to ten years in jail after Yahoo complied with an order from the Beijing state security bureau requesting access to his email details (Associated Press in Beijing). Though these cases are not related to Turkey, they signal the tricky and difficult situation internet companies and social media platforms face while seeking new opportunities in countries with undemocratic or repressive regimes like Turkey.

“It is hugely problematic particularly as more and more countries are passing legislation that violate the right to free expression, and those are often the countries that are being targeted as the new markets for these companies”, stated Myers West. However, she also believes that companies would face “a PR problem if they are enabling too many speech restrictions”. For Aydemir, losing users who might migrate to rival services and platforms is another commercial risk companies face when they get involved in censoring their customers. For the time being, however, that risk does not seem significant due to the lack of alternative platforms and services users could migrate to. “Facebook and Twitter’s compliance with authorities upset users and they feel more helpless, but there are no alternatives for the masses yet”, said Saka. Although Sozeri mentioned that he and other activists decided to stop using Facebook, the much larger number of non-activist users still log in to the platform on a daily basis, so activist departures would not bring any noteworthy revenue losses that would in any way make the company rethink its policies and content takedown practices. In addition, although highly motivated and sophisticated users like political activists or internet freedom advocates may find ways to bypass these restrictions, as discussed above, the majority of non-activist users may not even be aware of the existence of such restrictions. As Philip Napoli, journalism and media studies professor at the Rutgers School of Communication and Information, argues, users “engage in less purposeful, directed information-seeking and rely instead on the operation of their social media platforms, and the behaviors of the individuals and organizations within their social networks, to place relevant news and information in front of them” (756).
In other words, most users do not adopt sophisticated behaviors and practices in their usage of social media platforms, and it is on such users that a policy like the country-withheld content policy might have the greatest impact, even though they may not be aware of its existence. Since the case study here focused on activist users, it may be useful in the future to study the awareness (or lack thereof) of non-activist users about content removal, and their attitudes towards such policies and practices.

Chapter 5: Conclusion

Taking Turkey as a case study, this dissertation looked into the implications for free speech rights of social media platforms’ policies and practices of removing content in compliance with local laws. As explored in chapter two, previous research has already looked into the privatisation of speech regulation, and into how decisions to remove or keep a piece of content are increasingly influenced by foreign jurisdictions. This dissertation contributes to existing research by exploring how a speech-restrictive government like that of Turkey exploits country-withheld content policies, and the implications of such policies for speech rights.

The Turkish case study is a clear illustration of how the privatization of speech regulation contributes to the weakening of the public sphere, where matters of general interest should be freely and critically debated (Habermas). As the authorities in Turkey restricted media and press freedoms, Turkish citizens seeking to freely speak their minds, criticize the government, and access viewpoints and stories they may not hear about in the mainstream media turned to social media platforms. These platforms may have enabled the creation of a networked public sphere (Benkler), but the fact that they are owned by a handful of big multinational corporations brings with it a conflict of interests. Seeking to avoid government censorship, Turkish users often find themselves censored by the same platforms that are supposed to enable their free speech rights. They may look open and borderless, but these “social media companies have essentially erected new borders where such borders did not exist before”, as Jillian York from EFF writes (York, “Social media has been privatised…”).

Until recently, US social media platforms only had to abide by US law (Ammori). But as they started to grow and to attract users living under different jurisdictions, they resorted to adopting country-withheld content policies, thus opening a Pandora’s box. As governments now expect internet companies to comply with their local laws, platforms will continue to receive more and more takedown requests. These requests often have implications for freedom of expression, as discussed in the Turkish case study, in particular the censorship of (political) speech protected under international human rights standards. In the interviews, respondents expressed concerns regarding platforms’ growing compliance with the Turkish government’s requests to take down content for violating local laws. Many of these requests are “politically motivated”, and respondents think platforms are not doing enough to stand by the side of their users. They also stressed the need for more transparency about the requests in general, and about the criteria for complying with them. Although respondents realize that platforms are, after all, businesses with commercial interests that might compel them to censor on behalf of governments, they think that in an ideal world platforms would abide by international human rights standards. Finally, it seems that Facebook’s strict policies have pushed a number of activists to stop using the platform; they now prefer Twitter to address sensitive political questions such as government corruption and Kurdish rights. For the time being, however, alternatives to the largest and most popular platforms are lacking.

One limitation of this study is the lack of concrete examples and cases, as it is hard to document each piece of removed content unless a user reports it publicly. Though in the dissertation I mention what types of content are most likely to be removed by social media platforms in response to Turkish government requests, such as pro-Kurdish content and Ataturk criticism, it is very difficult to provide examples of specific tweets and Facebook posts. This is mainly due to the lack of transparency from social media platforms, as discussed above. Throughout the thesis, this limitation was addressed through examples and cases provided by respondents, in addition to news reports. Future research could look into ways to document specific pieces of removed content in order to study the scope of social media censorship. For instance, it could make use of the Lumen Database, previously known as Chilling Effects, an archive of “legal complaints and requests for removal of online materials” (Lumen Database). Sozeri mentioned that he started a database of Turkish government requests to Twitter using information provided by the archive, but could not keep up with the high number of court orders. He also mentioned that he was running a web scraper tool to extract the data, but that the tool stopped working when the URL of the database changed from chillingeffects.org to lumendatabase.org. In this regard, future research could look into techniques and tools that would make it easier to document specific cases of content removal using the Lumen database.

This dissertation focused on Turkey as a case study, since it is the country that by far makes the most requests to Facebook, Google and Twitter. It may thus be useful in future research to study other countries, such as Russia or India, two countries that also seem to be taking advantage of country-specific restrictions. For example, Russia made 1,735 content removal requests to Twitter over the second half of 2015, of which the platform complied with 5 per cent (Twitter Transparency Report). Over the same period, Facebook restricted nearly fifteen thousand pieces of content inside India over anti-religious discourse and hate speech (Facebook Government Requests. India). Researching different countries would make it possible to compare how different governments use (or abuse) country-withheld content policies. It could also help shed light on how platforms respond to requests coming from different governments, and on how consistent they are in their responses. In other words, it would help explore whether there are particular governments whose requests platforms are more likely to comply with.

Finally, as mentioned earlier, future research could also study the attitudes of non-activist users towards content removal in compliance with local laws: to what extent are such users aware of the existence of such policies, and would they change their online behaviors once they become aware of them? This would make it possible to study the impact of country-specific restrictions on non-activist users, who, unlike activist users, may not be familiar with the tools and techniques that help circumvent such restrictions.

Appendices

Appendix A: Interview Guide

Area: Platforms’ policies and practices
➢ What do you think of the country withheld content rule several platforms like Facebook and Twitter adopt to comply with local laws?
➢ What do you think about the different ways platforms handle Turkish authorities’ requests for content removal? Are they doing enough to ensure that users’ rights are not violated in the process? If not, what do you think they should be doing?

Area: Freedom of speech
➢ What do you think are the impacts of the country withheld content rule on Turkish users’ online free speech rights?
➢ Do you think the Turkish government is taking advantage of such policies to censor critics? How?
➢ Have you heard of users/activists or media in Turkey being silenced by such policy? Can you give me an example?
➢ If they do not comply, platforms risk being blocked by the Turkish government, thus affecting all users. What do you think platforms should do in such case?

Area: Business interests
➢ To what extent do you think companies take into account potential loss of profits when they decide to comply with Turkey’s removal requests?

Area: Transparency
➢ To what extent do you think the transparency reports published today by different internet companies are helpful to the net freedom activist community in Turkey? How do you think they could be improved?

Appendix B: Two Interview Samples

Interview 1 (via Skype): Ahmet Sabanci

Question: What do you think of the country withheld content policy several platforms like Facebook and Twitter adopt to comply with local laws? I am against country specific content removal, I am pretty much against all kinds of online censorship. But I am against country specific content removals for two basic reasons. One, it makes internet much more balkanized: you have Twitter, I have Twitter but I can’t see some of the tweets because I am in an X country and you are in a Y country. This makes it pretty much nonsense, because always since the beginning, internet freedom activists and freedom of speech activists have been saying that internet should be free for everyone and should be available for everyone and country specific content removals are basically destroying that. And the second reason, it is basically useless because just changing some little settings in the account settings menu like for example even though you are connecting from Turkey you can set up your account from somewhere else like Argentina or any other country, you can see all the tweets removed for Turkey. This basically makes all censorship useless. But somehow social media companies and other websites use these rules maybe just [to make] the government happy but it’s censorship anyway, it’s too easy to circumvent but it’s censorship.

Q: To what extent do you think companies take into account potential loss of profits when they decide to comply with Turkey’s removal requests? Well, yeah the economic reasons are coming first. For example, Facebook really likes removing content for Turkey because Turkish people are the third most crowded users on Facebook, and they also have very good relationship with Turkey and the Turkish government, so they don’t want to lose [that]. Stupid laws in Turkey can cause censorship on Facebook or Twitter. For example when it comes to Kurdish freedom movement, and Ataturk criticism, Facebook censors this and you can’t get that content back. Facebook does it because they know that Turkey is a big resource and a big market for them and they don’t want to lose that. So yeah economy has a very important role in content removals.

Q: What about Twitter? How do you think they are handling this? Do you think they are doing it differently? Just so far Twitter tries its best to fight back against censorship. For example, they always take this to court in Turkey whenever they can. For example, even though the government asks for some accounts to be blocked or removed for Turkey, they say it’s an activist or a journalist and we can’t remove their accounts. Even though the government asks them they don’t do that and also from time to time they get in touch with the lawyers and other people here to talk about how to fight back. But, on the other hand they are also really in touch with the Turkish government and they somehow know that it [Turkey] is a big market for them, they are trying to find a middle ground in this situation but so far they did some good job fighting back against censorship.

Q: If they do not comply, platforms risk being blocked by the Turkish government, thus affecting all users. What do you think platforms should do in such case? Getting all websites blocked is much worse than just removing one or two tweets or one or two facebook shares. At some point, it gives some extra power for the Turkish government or other governments. They are turning it into propaganda material, like saying we told them and they removed everything. Yes blocking the whole website is worse, specific country removal at some point is better, but the government and pro-government people can use it as a propaganda material. They say that the American companies are obeying what we are ordering and we are the powerful ones. They are using it for themselves, and most of the people who follow the news from pro-government media think that our government does a perfect job putting the traitors and the Americans in their place.

Q: What are the impacts of these policies on Turkish users’ free speech rights? One of the big effects is happening on Facebook, because on Facebook we have a much more complex audience, everyday much more regular people use it. When it comes to content removal in Turkey, it really has a big impact for Facebook [users]. For example, Facebook basically removes everything the Turkish government wants which can be the leftist propaganda, anti-government news, government criticism, Ataturk criticism, Kurdish movements news…, and when you remove them without any kind of questions and without any kind of fighting back it creates an effect like facebook turning into something like a pro-government media because you can’t find any other resource or facebook pages and profiles because most of them are either removed or blocked. So, the impact on Facebook is much more bigger than on Twitter because Twitter users in Turkey know the technology they are using a bit better. For instance, when they hear about a court order they open a secondary account or change their usernames or use another circumvention tool [in order not to be] affected by that. But, on Facebook it’s not like that, because we have a much broader audience in there and they get much more affected by the censorship.

Q: Have you heard of users/activists or media in Turkey being silenced by such policy? Can you give me an example? Socialist and leftist news accounts and journalists regularly get their accounts threatened, but most of the time Twitter does not block [them] or warns them before [they do so] so that they can take some cautions. Other than that, some of journalists and writers...change their username into something different from the court order so that their accounts do not get blocked because there wasn’t any account like that. It happens regularly. Basically in less than six months I saw more than 30 or 40 court orders ordering censorship of tweets or Twitter accounts. It has become a part of our daily lives.

Q: Are there any activist initiatives to bypass these restrictions and educate users? Some of the activists like me are doing mostly this right now, and also some of the journalists who know the technology [show] the other journalists how to deal with these kind of situations. Also, some of the lawyers like Kerem Altiparmak and others are regularly posting court orders related to this to help the people get their cautions about that. We are a small network to help people and we are trying to do our best to circumvent censorship and make it pretty much useless.

Q: To what extent do you think the transparency reports published today by different internet companies are helpful to the net freedom activist community in Turkey? How do you think they could be improved? In the current situation the Google and Twitter transparency reports are much more better than Facebook, they are kind of useful when it comes to telling people in Turkey about the situation: how broad the censorship. It’s useful but when it comes to more detailed research and getting the fight against censorship to the next level, we need much more detailed transparency reports. Twitter and Google are a bit more detailed but I believe they need to be more detailed because when it comes to collecting more information about what kind of content has been removed or why the government asked for removal, we need much more information like which laws they try to use and what was the content. I believe when it comes to a really meaningful transparency report we need much more details than design. I believe there’s no reason for the company not to give us this information, but somehow they are giving us the summaries, maybe a little bit more detailed information like how much they censored and how much they got. It’s useful but not enough.

Interview 2 (via email): Erkan Saka

Question: What do you think of the Country Withheld Content policy several platforms like Facebook and Twitter adopt to comply with local laws? They both comply so that they will not be banned in Turkey. However, Facebook is believed to be more compliant.

Q: To what extent do you think such policy has negative impacts on Turkish users’ online free speech rights? How? Many citizens prefer to be anonymous [in order] not to be targeted by Turkish authorities. Or, I observe more and more self-censorship. Facebook and Twitter’s compliance with authorities upset users and they feel more helpless… There are sometimes calls for new tools to be used but no alternatives for masses are found yet.

Q: To what extent do you think the Turkish government is taking advantage of such policies to censor critics? How?

To the largest extent. Turkish government amends laws to become serviceable for censorship and then ask Facebook and Twitter to obey. It is very difficult to counter sue, it takes time for Twitter to counter government requests.

Q: Have you heard of users/activists or media in Turkey being silenced by such policy? Can you give examples? Now Tweets are censored in bulk. Hundreds of Twitter messages were banned after Ensar Foundation child abuse case. Or after nearly every attack by Islamic State, critics seem to be silenced.

Q: If they do not comply with government requests, platforms risk being blocked by the Turkish government, thus affecting all users. What do you think platforms should do in such case? We should not forget that these platforms are businesses in the final analysis. I understand they do not want to be blocked. Although users can circumvent censorship, by being blocked platforms lose traffic and it is not good for them. On the other hand, since judiciary system is flawed in Turkey, complying with local laws means these platforms help violation of free speech rights. A difficult situation for the platforms.

Q: What do you think about the different ways platforms handle Turkish authorities’ requests for content removal? Are they doing enough to ensure that users’ rights are not violated in the process? If not, what do you think they should be doing? Twitter is relatively careful about handling the situation. It informs the users at every stage although it complies [with] the block orders. It works with a local law agency, they regularly counter-sue the government to unblock messages or accounts. Facebook is responsive sometimes but also elusive. Responses are abstract and mostly just compliant with block orders (i.e. Ötekilerin Postası Facebook page that focuses on the Kurdish issue was banned several times).

Q: To what extent do you think companies take into account potential loss of profits when they decide to comply with Turkey’s removal requests? Partially answered above, they all care. As far as I know Twitter does not profit much yet in Turkey, maybe because of this, it is more critical of government requests. On the other hand, Facebook has a good advertising revenue and it is more compliant.

Q: Several companies today publish transparency reports on government removal requests. To what extent do you think these reports are helpful to the net freedom activist community in Turkey? How do you think they can be improved? They are definitely better than nothing. It is an evidence for what activist cry for [for] years. It is a documentation of existing censorship regime. It is not enough but it gives a leverage. There could be more detailed reporting about what is demanded to be banned by authorities. And of course it could be better if platforms would comply less with the block requests.

Appendix C: List of Interviews

Interviewee | Interviewing method | Date | Country

Ahmet Sabanci | Skype | April 25, 2016 | Turkey

Arzu Geybulla | email | April 26, 2016 | Azerbaijan

Asli Telli Aydemir | Skype | May 2, 2016 | Turkey

Efe Kerem Sozeri | Skype | May 11, 2016 | Turkey

Erkan Saka | email | April 29, 2016 | Turkey

Isik Mater | email | June 8, 2016 | Turkey

Anonymous respondent | email | May 1, 2016 | Turkey

Reem AlMasri | Skype | May 12, 2016 | Jordan

Sarah Myers West | Skype | May 2, 2016 | US

Bibliography

Ackerman, Spencer. “Lavabit email service abruptly shut down citing government interference”. The Guardian. 2013. 23 May 2016. <https://www.theguardian.com/technology/2013/aug/08/lavabit-email-shut-down-edward-snowden>.

Akdeniz, Yaman and Altiparmak, Kerem. “Legal notice to legal representatives of Twitter, Inc. (USA/Turkey)”. Cyber-rights.org. 2015. 22 May 2016. <http://cyber-rights.org.tr/docs/Twitter_ihtar_ENG.pdf>.

Akgul, Mustafa. “Internet Censorship in Turkey”. Internet Policy Review. Vol.4 No.2 (2015). 21 March 2016. <http://policyreview.info/articles/analysis/internet-censorship-turkey>.

Akkoc, Raziye. “Ankara blast: Turkish court 'blocks Twitter and Facebook' after explosion”. The Telegraph. 2016. 20 May 2016. <http://www.telegraph.co.uk/news/worldnews/europe/turkey/12192889/Ankara-blast-Turkish-court-blocks-Twitter-and-Facebook-after-explosion.html>.

Akkoc, Raziye. “Facebook censors images of Prophet Mohammed”. The Telegraph. 2015. 22 March 2016. <http://www.telegraph.co.uk/news/worldnews/europe/turkey/11373723/Facebook-censors-images-of-Prophet-Mohammed.html>.

Akkoc, Raziye. “Turkey blocks social media and Youtube over hostage photos”. The Telegraph. 2015. 20 May 2016. <http://www.telegraph.co.uk/news/worldnews/europe/turkey/11518004/Turkey-blocks-access-to-Facebook-Twitter-and-YouTube.html>.

Alakhbar. “بن علي يتوعد المحتجين بمزيد من الحزم” [Ben Ali threatens protesters with more toughness]. Al-akhbar.com. 2011. 25 May 2016. <http://www.al-akhbar.com/node/1474>.

Aljazeera. “Turkey government still wants Twitter office”. Aljazeera.com. 2014. 22 May 2016. <http://www.aljazeera.com/news/europe/2014/04/turkey-government-still-wants-twitter-office-201441716057445483.html>.

AlMasri, Reem. Interview. 12 May 2016.

Amy L. Beam, Ed.D. “Turkey Censors Twitter and Websites During Kobani Siege”. The Kurdistan Tribune. 2014. 22 May 2016. <http://kurdistantribune.com/2014/turkey-censors-twitter-websites-during-kobani-siege/>.

Ammori, Marvin. “The “New” New York Times: Free Speech Lawyering in the Age of Google and Twitter”. Harvard Law Review. Vol.127 No.8 (June 2014): 2259-2295. 7 March 2016. <http://cdn.harvardlawreview.org/wp-content/uploads/2014/06/vol127_Ammori.pdf>.

Amnesty. “Turkey: Gezi Park Protest Twitter Trial”. Amnesty.org. 2014. 19 May 2016. <https://www.amnesty.org/en/press-releases/2014/04/turkey-gezi-park-protest-twitter-trial/>.

Anderlini, Jamil. “Apple bars China app for ‘illegal’ content”. Financial Times. 2013. 23 May 2016. <https://next.ft.com/content/39e02d6c-9d02-11e2-88e9-00144feabdc0>.

Anonymous respondent. Interview. 1 May 2016.

Associated Press. “Turkey pulls plug on YouTube over Ataturk 'insults'”. The Guardian. 2007. 21 May 2016. <http://www.theguardian.com/world/2007/mar/07/turkey>.

Associated Press in Beijing. “Shi Tao: China frees journalist jailed over Yahoo emails”. The Guardian. 2013. 23 May 2016. <http://www.theguardian.com/world/2013/sep/08/shi-tao-china-frees-yahoo>.

Aydemir, Asli Telli. Interview. 2 May 2016.

Bar, Allon; Hickok, Elonnai; Lim, Hae-in; MacKinnon, Rebecca. “Fostering freedom online: the role of Internet intermediaries”. UNESCO. 2014. 23 March 2016. <http://unesdoc.unesco.org/images/0023/002311/231162e.pdf>.

Barlow, John P. “A Declaration of the Independence of Cyberspace”. EFF. 1996. 13 March 2016. <https://www.eff.org/cyberspace-independence>.

Bayley, Ed. “The Clicks That Bind: Ways Users "Agree" to Online Terms of Service”. EFF. 2009. 25 May 2016. <https://www.eff.org/wp/clicks-bind-ways-users-agree-online-terms-service>.

BBC. “Profile: Kurdistan Workers' Party (PKK)”. British Broadcasting Corporation. 2015. 20 May 2016. <http://www.bbc.com/news/world-europe-20971100>.

BBC. “Recep Tayyip Erdogan: Turkey’s ruthless president”. British Broadcasting Corporation. 2016. 19 May 2016. <http://www.bbc.com/news/world-europe-13746679>.

BBC. “Turkey Kurds: Kobane protests leave 19 dead”. British Broadcasting Corporation. 2014. 22 May 2016. <http://www.bbc.com/news/world-middle-east-29530640>.

BBC. “Twitter 'confuses' Iyad El-Baghdadi with Islamic State leader”. bbc.com. 2016. 10 March 2016. <http://www.bbc.com/news/world-35210527>.

Ben Gharbia, Sami. ”Blogger.com banned in Turkey”. Global Voices Advocacy. 2008. 21 May 2016. <https://advox.globalvoices.org/2008/01/23/turkey-again-blocks-access-to-/>.

Benkler, Yochai. The Wealth of Networks. New Haven and London: Yale University Press, 2006.

Brennen, Bonnie. Qualitative Research Methods for Media Studies. New York and London: Routledge Taylor and Francis Group, 2013.

Brown, Hayes. “Twitter Opened Its Middle East Office In The UAE Just As The UAE Reportedly Arrested A Man For Tweeting”. Buzzfeed. 2015. 23 May 2016. <https://www.buzzfeed.com/hayesbrown/twitter-opened-its-middle-east-office-in-the-uae-just-as-the?utm_term=.wi6RL4YG5#.bbndqMayP>.

Castells, Manuel. Networks of Outrage and Hope: Social Movements in the Internet Age. Cambridge: Polity Press, 2012.

Constine, Josh. “Facebook spares humans by fighting offensive photos with AI”. TechCrunch. 2016. 16 June 2016. <https://techcrunch.com/2016/05/31/terminating-abuse/>.

Crawford, Kate and Gillespie, Tarleton. “What is a flag for? Social media tools and the vocabulary of reporting”. New Media & Society. 2014. 7 March 2016. <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2476464>.

Crawford, Kate and Lumby, Catherine. “Networks of Governance: Users, Platforms, and the Challenges of Networked Media Regulation”. International Journal of Technology Policy and Law. Vol.2 No.1 (2013): 1-15. 5 March 2015. <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2246772>.

Daily Sabah and Anadolu Agency. “39 million log on to Facebook in Turkey daily”. Daily Sabah. 2015. 21 May 2016. <http://www.dailysabah.com/life/2015/09/07/39-million-log-on-to-facebook-in-turkey-daily>.

Daragahi, Borzou and Karakas, Burcu. “How Turkey Became The Top Censor Of Twitter Accounts Worldwide”. Buzzfeed. 2016. 4 March 2016. <http://www.buzzfeed.com/burcukarakas/how-turkey-became-the-top-censor-of-twitter-accounts-worldwi#.aiG1WxVO4>.

Delany, Colin. “How Social Media Accelerated Tunisia’s Revolution: An Inside View”. Huffington Post. 2011. 23 March 2016. <http://www.huffingtonpost.com/colin-delany/how-social-media-accelera_b_821497.html>.

Dencik, L. “Why Facebook Censorship Matters”. JOMEC Blog. 2014. 13 March 2016. <http://www.jomec.co.uk/blog/why-facebook-censorship-matters/>.

Dewey, Caitlin. “A guide to what’s going on in Istanbul’s Gezi Park”. The Washington Post. 2013. 19 May 2016. <https://www.washingtonpost.com/news/worldviews/wp/2013/05/31/a-guide-to-whats-going-on-in--gezi-park/>.

Diplo. “Multistakeholderism in IGF Language”. Diplomacy.edu. Undated. Accessed on 23 March 2016. <http://www.diplomacy.edu/IGFLanguage/multistakeholderism>.

Dogramaci, Esra and Radcliffe, Damian. “How Turkey uses social media”. Digital News Report. 2015. 21 May 2016. <http://www.digitalnewsreport.org/essays/2015/how-turkey-uses-social-media/>.

Donahue, Patrick. “Merkel Confronts Facebook's Zuckerberg Over Policing Hate Posts”. Bloomberg. 2015. 14 March 2016. <http://www.bloomberg.com/news/articles/2015-09-26/merkel-confronts-facebook-s-zuckerberg-over-policing-hate-posts>.

Doug, Bernard. “Iran's Next Step in Building a 'Halal' Internet”. VOA News. 2015. 15 March 2015. <http://www.voanews.com/content/irans-next-step-in-building-a-halal-internet/2672948.html>.

Emarketer. “Emerging Markets Drive Twitter User Growth Worldwide”. Emarketer.com. 2014. 23 May 2016. <http://www.emarketer.com/Article/Emerging-Markets-Drive-Twitter-User-Growth-Worldwide/1010874>.

Facebook Community Standards. <https://www.facebook.com/communitystandards#>.

Facebook for Business. “Storytelling in High-Growth Markets”. Facebook. 2015. 23 May 2016. <https://en-gb.facebook.com/business/news/uk-creative-accelerator>.

Facebook Government Requests Reports. “About the Reports”. Facebook. 2016. 17 March 2016. <https://govtrequests.facebook.com/about/>.

Facebook Government Requests Report. India: June 2015-December 2015. Facebook. <https://govtrequests.facebook.com/country/India/2015-H2/>.

Facebook Government Requests Report. Saudi Arabia: January 2014-June 2014. Facebook. <https://govtrequests.facebook.com/country/Saudi%20Arabia/2014-H1/#>.

Facebook Government Requests Report. Turkey: July 2015-December 2015. Facebook. 2016. 21 May 2016. <https://govtrequests.facebook.com/country/Turkey/2015-H2/>.

Facebook Help Center. “How News Feed Works”. <https://www.facebook.com/help/327131014036297/>.

Facebook homepage. Facebook.com. Accessed on 22 June 2016.

Facebook newsroom. Company info. Facebook. 2016. 7 March 2016. <http://newsroom.fb.com/company-info/>.

Facebook Q1 2016 results. Investor.fb.com. 2016. 23 May 2016. <http://files.shareholder.com/downloads/AMDA-NJ5DZ/2108895290x0x888146/484823BA-5B5D-4BC4-B872-5F239E813384/FB_Q116_Earnings_Slides.pdf>.

Fitzpatrick, Alex. “Why Chelsea Handler Can Post Nudes on Twitter But Not Instagram”. Time. 2014. 7 March 2016. <http://time.com/3550969/chelsea-handler-nudes-twitter-instagram/>.

Franceschi-Bicchierai, Lorenzo. “Twitter and YouTube Are Back, But Turkey Won the Censorship Battle”. MotherBoard. 2015. 20 May 2016. <http://motherboard.vice.com/read/twitter-and-youtube-are-back-but-turkey-won-the-censorship-battle>.

Frenkel, Sheera. “Inside The Obama Administration’s Attempt To Bring Tech Companies Into The Fight Against ISIS”. Buzzfeed. 2016. 14 March 2016. <http://www.buzzfeed.com/sheerafrenkel/inside-the-obama-administrations-attempt-to-bring-tech-compa>.

Frier, Sarah. “How Twitter Plans to Make More Money, in Five Not-So-Easy Steps”. Bloomberg. 2015. 23 May 2016. <http://www.bloomberg.com/news/articles/2015-05-21/how-twitter-plans-to-make-more-money-in-five-not-so-easy-steps>.

Gayomali, Chris. “After A Government-Imposed Twitter Ban, Turkey Sets Tweet Record”. Fast Company. 2014. 27 May 2016. <http://www.fastcompany.com/3028030/fast-feed/after-a-government-imposed-twitter-ban-turkey-sets-tweet-record>.

Gadde, Vijaya. “Victory for free expression in Turkish court”. Twitter. 2014. 21 May 2016. <https://blog.twitter.com/2014/victory-for-free-expression-in-turkish-court>.

Galperin, Eva. “Facebook Caves to Turkish Government Censorship”. EFF. 2015. 21 March 2016. <https://www.eff.org/ja/deeplinks/2015/01/facebook-caves-turkish-government-censorship>.

Geybulla, Arzu. Interview. 26 April 2016.

Gillespie, Tarleton. “Facebook’s improved “Community Standards” still can’t resolve the central paradox”. Culture Digitally. 2015. 13 March 2016. <http://culturedigitally.org/2015/03/facebooks-improved-community-standards-still-cant-resolve-the-central-paradox>.

Gillespie, Tarleton. “The dirty job of keeping Facebook clean”. Culture Digitally. 2012. 13 March 2016. <http://culturedigitally.org/2012/02/the-dirty-job-of-keeping-facebook-clean/>.

Gillespie, Tarleton. “The Politics of “Platforms””. New Media & Society. Vol.12 No.3 (2010): 347-364.

Global Legal Monitor. “Turkey; United Nations: Criticism of Anti-Terrorism Laws”. Library of Congress. 2012. 20 May 2016. <http://www.loc.gov/law/foreign-news/article/turkey-united-nations-criticism-of-anti-terrorism-laws>.

GNI brochure. Undated. Accessed on 23 March 2016. <https://globalnetworkinitiative.org/sites/default/files/GNI_brochure.pdf>.

Germany Requests to remove content. Google. Undated. Accessed on 19 May 2016. <https://www.google.com/transparencyreport/removals/government/DE/?hl=en>.

Google Inside Search. “Algorithms”. Google. Accessed on 10 June 2016. <https://www.google.com/insidesearch/howsearchworks/algorithms.html>.

Google Official Blog. “Transparency Report: Government removal requests continue to rise”. Blogspot. 2013. 27 May 2016. <https://googleblog.blogspot.nl/2013/12/transparency-report-government-removal.html>.

Google Transparency Report. “Government requests to remove content”. Google. 2016. 18 March 2016. <https://www.google.com/transparencyreport/removals/government/?hl=en>.

Greenberg, Andy. “It’s Been 20 Years Since This Man Declared Cyberspace Independence”. Wired. 2016. 25 May 2016. <https://www.wired.com/2016/02/its-been-20-years-since-this-man-declared-cyberspace-independence/>.

Gross, Jenny. “Opportunities Lurk in Emerging Markets”. The Wall Street Journal. 2014. 23 May 2016. <http://www.wsj.com/articles/SB10001424052702304899704579388733322766754>.

Güler, Emrah. “Facebook censors Turkey’s biggest anti-racist initiative”. Hurriyet Daily News. 2013. 21 May 2016. <http://www.hurriyetdailynews.com/facebook-censors-turkeys-biggest-anti-racist-initiative.aspx?pageID=238&nID=39462>.

Güler, Emrah. “Facebook facing accusations of censoring citizen journalism”. Hurriyet Daily News. 2013. 21 May 2016. <http://www.hurriyetdailynews.com/facebook-facing-accusations-of-censoring-citizen-journalism.aspx?pageID=238&nID=51996&NewsCatID=374>.

Habermas, Jurgen. The Structural Transformation of the Public Sphere. Cambridge, Massachusetts: MIT Press, 1991.

Hahn, Jason. “Android claims 81.5% of the global smartphone OS market in 2014, iOS dips to 14.8%”. Digital Trends. 2015. 18 March 2016. <http://www.digitaltrends.com/mobile/worldwide-domination-android-and-ios-claim-96-of-the-smartphone-os-market-in-2014/>.

Hintz, Arne. “Promise and Practice in Studies of Social Media and Movements”. Critical Perspectives on Social Media and Protest. London: Rowman & Littlefield International, 2015. 193-227. Scribd. 6 October 2015. 11 March 2016. <https://www.scribd.com/book/284531870/Critical-Perspectives-on-Social-Media-and-Protest-Between-Control-and-Emancipation>.

Haunss, Sebastian. “Promise and Practice in Studies of Social Media and Movements”. Critical Perspectives on Social Media and Protest. London: Rowman & Littlefield International, 2015. 27-61. Scribd. 6 October 2015. 11 March 2016. <https://www.scribd.com/book/284531870/Critical-Perspectives-on-Social-Media-and-Protest-Between-Control-and-Emancipation>.

Human Rights Turkey. “Article 301: End it, don’t Amend it”. Amnesty International-USA: Turkey Regional Action Network. 2013. 20 May 2016. <https://humanrightsturkey.org/2013/04/03/article-301-end-it-dont-amend-it/>.

Human Rights Watch. “Turkey: Internet Freedom, Rights in Sharp Decline”. HRW. 2014. 20 May 2016. <https://www.hrw.org/news/2014/09/02/turkey-internet-freedom-rights-sharp-decline>.

Human Rights Watch. “Turkey: Terror Laws Undermine Progress on Rights”. HRW. 2013. 20 May 2016. <https://www.hrw.org/news/2013/01/31/turkey-terror-laws-undermine-progress-rights>.

Hurriyet. “Kurdish politicians to take action after Facebook admits to banning pages with PKK content”. Hurriyet Daily News. 2013. 21 May 2016. <http://www.hurriyetdailynews.com/kurdish-politicians-to-take-action-after-facebook-admits-to-banning-pages-with-pkk-content.aspx?pageID=238&nID=53465&NewsCatID=339>.

Hurriyet. “Turkish Parliament approves Internet bill despite concerns”. Hurriyet Daily News. 2014. 19 May 2016. <http://www.hurriyetdailynews.com/turkish-parliament-approves-internet-bill-despite-concerns.aspx?pageID=238&nID=62088&NewsCatID=339>.

Hurriyet. “New bill gives Turkish PM power to shut down websites”. Hurriyet Daily News. 2015. 19 May 2016. <http://www.hurriyetdailynews.com/new-bill-gives-turkish-pm-power-to-shut-down-websites.aspx?pageID=238&nID=77378&NewsCatID=339>.

India Requests to remove content. Google. Undated. Accessed on 19 May 2016. <https://www.google.com/transparencyreport/removals/government/IN/?hl=en>.

International Covenant on Civil and Political Rights. United Nations. Accessed on 10 March 2016. <http://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx>.

International Principles on the Application of Human Rights to Communications Surveillance. Necessaryandproportionate.org. 2014. 23 May 2016. <https://necessaryandproportionate.org/principles>.

Internet Live Stats. <http://www.internetlivestats.com/>.

Jarvis, Jeff. Public Parts: How Sharing in the Digital Age Improves the Way We Work and Live. Simon & Schuster. 27 September 2011. Scribd. 3 March 2016. < https :// www . scribd . com / book /224334529/ Public - Parts - How - Sharing - in - the - Digital - Age - Improves - the - Way - We - Work - and -Live>.

Kharpal, Arjun. “Apple vs FBI: All you need to know”. CNBC. 2016. 23 May 2016. < http :// www . cnbc . com /2016/03/29/ apple - vs - fbi - all - you - need - to - know .html>.

Kreimer, Seth F. “Censorship by Proxy: The First Amendment, Internet Intermediaries, and the Problem of the Weakest Link.” University of Pennsylvania Law Review vol. 155 no. 1 (2006): 11-101. 13 March 2016. <https://www.law.upenn.edu/journals/lawreview/articles/volume155/issue1/Kreimer155U.Pa.L.Rev.11%282006%29.pdf>.

Kirişci, Kemal. “The Rise and Fall of Turkey as a Model for the Arab World”. Brookings. 2013. 19 May 2016. <http://www.brookings.edu/research/opinions/2013/08/15-rise-and-fall-turkey-model-middle-east>.

Kasapoglu, Cagil. “Turkey social media ban raises censorship fears”. BBC. 2015. 21 March 2016. <http://www.bbc.com/news/world-europe-32204177>.

Kessel, Jeremy. “Three years of increased #transparency... and counting”. The Official Twitter Blog. 2015. 23 June 2016. <https://blog.twitter.com/2015/three-years-of-increased-transparency-and-counting>.

Korkmaz, Mustafa. “Facebook; stop censorship and siding homophobia”. Change.org. 2015. 21 May 2016. <https://www.change.org/p/facebook-stop-censorship-and-siding-homophobia>.

Langlois, Ganaele. “Participatory Culture and the New Governance of Communication: The Paradox of Participatory Media”. Television & New Media vol. 14 no. 2 (March 2013): 91-105.

Laughland, Oliver and Swaine, Jon. “'I dream about it every night': what happens to Americans who film police violence?”. The Guardian. 2015. 3 March 2016. <http://www.theguardian.com/us-news/2015/aug/15/filming-police-violence-walter-scott-michael-brown-shooting>.

Lumen Database. Homepage. Accessed on 22 June 2016. <https://lumendatabase.org/>.

MacKinnon, Rebecca. Consent of the Networked. 2012. United States of America: Basic Books, 2013.

Mater, Isik. Interview. 8 June 2016.

McLaughlin, Eliott. “We’re not seeing more police shootings, just more news coverage”. CNN. 2015. 3 March 2016. <http://edition.cnn.com/2015/04/20/us/police-brutality-video-social-media-attitudes/>.

Manila Principles on Intermediary Liability. Manilaprinciples.org. Accessed on 23 May 2016. <https://www.manilaprinciples.org/>.

Marzouk, Zeineb. “Designed to Shock: Facebook Removes Image of Domestic Violence”. Tunisia Live. 2016. 10 March 2016. <http://www.tunisia-live.net/2016/02/18/72146>.

Maurer, Tim and Morgus, Robert. “Stop Calling Decentralization of the Internet ‘Balkanization’”. Slate. 2014. 27 May 2016. <http://www.slate.com/blogs/future_tense/2014/02/19/stop_calling_decentralization_of_the_internet_balkanization.html>.

Meinrath, Sascha. “The Future of the Internet: Balkanization and Borders”. Time. 2013. 27 May 2016. <http://ideas.time.com/2013/10/11/the-future-of-the-internet-balkanization-and-borders/>.

Meyer, Robinson. “The Tradeoffs in Google’s New Crackdown on Child Pornography”. The Atlantic. 2013. 6 March 2016. <http://www.theatlantic.com/technology/archive/2013/11/the-tradeoffs-in-googles-new-crackdown-on-child-pornography/281604/>.

Morozov, Evgeny. The Net Delusion. 2011. England: Penguin Books, 2012.

Myers West, Sarah. Interview. 2 May 2016.

Nachawati Rego, Leila. “The best and worst of the Arab IGF 2015”. Association for Progressive Communication. 2015. Accessed on 23 March 2016. <https://www.apc.org/en/blog/best-and-worst-arab-igf-2015>.

Napoli, Philip. “Social media and the public interest: Governance of news platforms in the realm of individual and algorithmic gatekeepers.” Telecommunications Policy 39 (2015): 751-760.

Nazzal, Noor. “Twitter to open offices in Dubai”. Gulf News. 2015. 23 May 2016. <http://gulfnews.com/news/uae/media/twitter-to-open-offices-in-dubai-1.1473768>.

Ognianova, Nina. “Erdoğan vs the press: Insult law used to silence president's critics”. Committee to Protect Journalists. 2015. 20 May 2016. <https://cpj.org/blog/2015/07/erdogan-vs-the-press-president-uses-insult-law-to-.php>.

Online Censorship. “Onlinecensorship.org launches first report (PDF)”. Onlinecensorship.org. 2016. 21 May 2016. <https://onlinecensorship.org/news-and-analysis/onlinecensorship-org-launches-first-report-download>.

Opdenakker, Raymond. “Advantages and Disadvantages of Four Interview Techniques in Qualitative Research”. Forum: Qualitative Social Research. Vol 7, No 4 (2006). <http://www.qualitative-research.net/index.php/fqs/article/view/175/391>.

Oreskovic, Alexei. “Twitter's stock plunges on revenue miss”. Business Insider. 2016. 23 May 2016. <http://uk.businessinsider.com/twitter-reports-q1-2016-earnings-results-2016-4?r=US&IR=T>.

Pariser, Eli. The Filter Bubble. 2011. USA: Penguin Press, 2012.

Q&A with Mark. “Watch the full video of the third Q&A with Mark from Pontificia Universidad Javeriana in Bogotá, Colombia”. Facebook. 2015. 20 June 2016. <https://www.facebook.com/qawithmark/videos/vb.823440467713730/870053989719044/?type=2&theater>.

Ries, Brian. “'Twitter, Mtwitter!': Turkish Prime Minister's 9 Craziest Quotes About Social Media”. Mashable. 2014. 19 May 2016. <http://mashable.com/2014/03/28/quotes-turkey-erdogan-social-media/#Lpa.433e5uqF>.

Roberts, Jeff John. “In win for Google, court lifts ban on 'Innocence of Muslims' video”. Fortune. 2015. 22 March 2016. <http://fortune.com/2015/05/18/google-video-ban-court/>.

Roberts, Sarah T. “Commercial Content Moderation: Digital Laborers’ Dirty Work”. Western University, Media Studies Publications. 2016. 19 March 2016. <http://ir.lib.uwo.ca/cgi/viewcontent.cgi?article=1012&context=commpub>.

Ruane, Kathleen A. “Freedom of Speech and Press: Exceptions to the First Amendment”. Congressional Research Service. 2014. 8 March 2016. <https://www.fas.org/sgp/crs/misc/95-815.pdf>.

Ryan, Yasmine. “How Tunisia’s revolution began”. Aljazeera. 2011. 25 May 2016. <http://www.aljazeera.com/indepth/features/2011/01/2011126121815985483.html>.

Sanburn, Josh. “One year after filming Eric Garner’s fatal confrontation with police, Ramsey Orta’s life has been upended”. Time. Undated. 3 March 2016. <http://time.com/ramsey-orta-eric-garner-video/>.

Saka, Erkan. Interview. 29 April 2016.

Shih, Gerry. “Twitter opens Hong Kong office, gains China foothold”. Reuters. 2015. 23 May 2016. <http://www.reuters.com/article/us-twitter-hongkong-idUSKBN0M60SH20150310>.

Smith, Jack. “Apple Suddenly Banned an App That Maps U.S. Drone Strikes”. Mic. 2015. 10 March 2016. <http://mic.com/articles/126003/apple-bans-dronestream-metadata-app-that-maps-drone-strikes-in-us>.

Snapchat Transparency Report. Snapchat. 2016. 18 March 2016. <https://www.snapchat.com/transparency/>.

Sozeri, Efe Kerem. Interview. 11 May 2016.

Sozeri, Efe Kerem. “Turkey cracks down on Twitter and Facebook after deadly car bombing”. The Daily Dot. 2016. 21 March 2016. <http://www.dailydot.com/politics/turkey-ankara-bombing-twitter-social-media/>.

Sozeri, Efe Kerem. “Turkish court orders 860 tweets censored after ISP boycott sparked by child-rape scandal”. The Daily Dot. 2016. 30 April 2016. <http://www.dailydot.com/politics/turkcell-twitter-censorship-protest-ensar-foundation/>.

Sozeri, Efe Kerem. “Uncovering the accounts that trigger Turkey's war on Twitter”. The Daily Dot. 2015. 22 May 2016. <http://www.dailydot.com/politics/twitter-transparency-report-turkey-censorship/>.

Tarakiyee, Mohammad. “Arab IGF: Missed opportunities for multistakeholder engagement”. Association for Progressive Communication. Undated. Accessed on 23 March 2016. <https://www.apc.org/en/blog/arab-igf-missed-opportunities-multistakeholder-eng>.

Taal, Maya. “CPJ Risk List: Where Press Freedom Suffered”. Committee to Protect Journalists. 2013. 19 May 2016. <https://www.cpj.org/2014/02/attacks-on-the-press-cpj-risk-list-1.php>.

Taylor, Emily. “The Privatization of Human Rights: Illusions of Consent, Automation and Neutrality”. Global Commission on Internet Governance. 2016. 22 June 2016. <https://ourinternet-files.s3.amazonaws.com/publications/no24_web_2.pdf>.

Shirky, Clay. “How social media can make history”. Ted.com. June 2009. 3 March 2016. <https://www.ted.com/talks/clay_shirky_how_cellphones_twitter_facebook_can_make_history>.

The Economist. “Autocratic Ottoman: Turkey is sending its journalists to prison”. The Economist. 2016. 19 May 2016. <http://www.economist.com/news/europe/21698472-after-forcing-out-his-prime-minister-president-erdogan-muzzles-press-turkey-sending-its>.

The Twitter Rules. Twitter. 2016. 17 March 2016. <https://support.twitter.com/articles/18311>.

The World Bank. “New World Bank Report Looks at Turkey’s Rise to the Threshold of High-Income Status and the Challenges Remaining”. Worldbank.org. 2014. 23 May 2016. <http://www.worldbank.org/en/news/press-release/2014/12/10/new-world-bank-report-looks-at-turkey-rise-to-threshold-of-high-income-and-challenges-remaining>.

Tufekci, Zeynep. “Algorithmic Harms Beyond Facebook and Google: Emergent Challenges of Computational Agency”. Colorado Technology Law Journal. 13.2 (2015): 203-218. 7 March 2016. <http://ctlj.colorado.edu/?page_id=238>.

Tumblr Community Guidelines. Last modified on 26 January 2015. Accessed on 8 March 2016. <https://www.tumblr.com/policy/en/community>.

Turkey Requests to remove content. Google. Undated. Accessed on 19 May 2016. <https://www.google.com/transparencyreport/removals/government/TR/?hl=en>.

Twitter. “Combating Violent Extremism”. The Official Twitter Blog. 2016. 4 March 2016. <https://blog.twitter.com/2016/combating-violent-extremism>.

Twitter Help Center. “Country withheld content”. Twitter. 2016. 25 May 2016. <https://support.twitter.com/articles/20169222#>.

Twitter homepage. Twitter.com. Accessed on 22 June 2016.

Twitter Transparency Report. Removal requests. Twitter. 2016. 17 March 2016. <https://transparency.twitter.com/removal-requests/2015/jul-dec>.

Twitter Usage. Twitter. 2016. 7 March 2016. <https://about.twitter.com/company>.

US Requests to remove content. Google. Undated. Accessed on 19 May 2016. <https://www.google.com/transparencyreport/removals/government/US/?hl=en>.

Vaidhyanathan, Siva. The Googlization of Everything. 2011. Berkeley and Los Angeles: University of California Press, 2012.

Variyar, Mugdha. “Pakistan unblocks YouTube after removal of 'blasphemous' content”. International Business Times. 2016. 22 March 2016. <http://www.ibtimes.co.in/pakistan-unblocks-youtube-after-removal-blasphemous-content-663642>.

Volokh, Eugene. “No, there’s no hate speech exception to the First Amendment”. Washington Post. 2015. 7 March 2016. <https://www.washingtonpost.com/news/volokh-conspiracy/wp/2015/05/07/no-theres-no-hate-speech-exception-to-the-first-amendment/>.

Weiss, Robert S. Learning From Strangers: The Art and Method of Qualitative Interview Studies. 1994. New York: The Free Press, 1995.

Wikimedia Foundation. Transparency Report. Accessed on 18 March 2016. <https://transparency.wikimedia.org/>.

Williams, Christopher. “How Egypt shut down the internet”. The Telegraph. 2011. 15 March 2016. <http://www.telegraph.co.uk/news/worldnews/africaandindianocean/egypt/8288163/How-Egypt-shut-down-the-internet.html>.

Yahoo Transparency Report. Frequently Asked Questions. Yahoo. Accessed on 18 March 2016. <https://transparency.yahoo.com/faq/index.htm#c28_anchor_nav>.

Youmans, William and York, Jillian. “Social Media and the Activist Toolkit: User Agreements, Corporate Interests, and the Information Infrastructure of Modern Social Movements”. Journal of Communication. Vol. 62 No. 2 (2012): 315-329. 21 March 2016.

York, Jillian. “Complicity in Censorship: Facebook's Latest Government Requests Report”. EFF. 2014. 20 March 2016. <https://www.eff.org/es/deeplinks/2014/11/complicity-censorship-facebooks-latest-government-requests-report>.

York, Jillian. “Facebook's 'real names' policy is legal, but it's also problematic for free speech”. The Guardian. 2014. 24 May 2016. <http://www.theguardian.com/commentisfree/2014/sep/29/facebooks-real-names-policy-is-legal-but-its-also-problematic-for-free-speech>.

York, Jillian. “Policing Content in the Quasi-Public Sphere”. Open Net Initiative. 2010. 21 June 2016. <https://opennet.net/sites/opennet.net/files/PolicingContent.pdf>.

York, Jillian. “Social media has been privatised. Why do we treat it as a public space?”. New Statesman. 2014. 27 May 2016. <http://www.newstatesman.com/internet/2014/06/social-media-has-been-privatised-why-do-we-treat-it-public-space>.

Youtube About Page. Youtube. Accessed on 23 June 2016. <https://www.youtube.com/yt/about/>.

Youtube Statistics. Youtube. Accessed on 18 March 2016. <https://www.youtube.com/yt/press/statistics.html>.

Zuckerman, Ethan. “Intermediary Censorship”. Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace. Eds. Ronald Deibert, John Palfrey, Rafal Rohozinski, and Jonathan Zittrain. United States of America: MIT Press, 2010. 71-85.

Zuckerberg, Mark. Facebook. 2015. 22 March 2016. <https://www.facebook.com/zuck/posts/10101844454210771?pnref=story>.