Organization for Security and Co-operation in Europe
The Office of the Representative on Freedom of the Media
DUNJA MIJATOVIĆ

REPORT
Freedom of Expression on the Internet

A study of legal provisions and practices related to freedom of expression, the free flow of information and media pluralism on the Internet in OSCE participating States

The report has been commissioned by the Office of the OSCE Representative on Freedom of the Media. It was prepared by Professor Yaman Akdeniz, Faculty of Law, Istanbul Bilgi University, Turkey.*

This report presents the conclusions of the first comprehensive research on Internet content regulation in the OSCE region. A preliminary report was prepared and published in view of the OSCE Review Conference and the OSCE Astana Summit 2010. The information contained in this report is based on data received from OSCE participating States as well as bona fide sources in response to a questionnaire sent out on 23 September 2010.

* Yaman Akdeniz’s recent publications include Internet Child Pornography and the Law: National and International Responses (London: Ashgate, 2008, ISBN 0 7546 2297 5) and Racism on the Internet, Council of Europe Publishing, 2010 (ISBN 978-92-871-6634-0). For further information about his work see <http://cyberlaw.org.uk/about/>. Akdeniz can be contacted at [email protected].

TABLE OF CONTENTS

INTRODUCTION 4
OSCE COMMITMENTS 7
METHODOLOGY 8
A. INTERNET ACCESS 10
B. INTERNET CONTENT REGULATION 13
C. BLOCKING, FILTERING AND CONTENT REMOVAL 22
D. LICENSING AND LIABILITY RELATED ISSUES, AND HOTLINES TO REPORT ILLEGAL CONTENT 28
E. CONCLUSIONS AND RECOMMENDATIONS 33

OVERVIEW OF LAWS AND PRACTICES ON INTERNET CONTENT REGULATION IN THE OSCE AREA 37

A. INTERNET ACCESS 37
Internet Access – A Fundamental Human Right 37
Legal provisions which could restrict users’ access to the Internet 38
Legal provisions guaranteeing or regulating “Net Neutrality” 40
Conclusion to Part A 47

B. INTERNET CONTENT REGULATION 48
Legal provisions outlawing racist content, xenophobia, and hate speech on the Internet 51
Legal provisions outlawing the denial, gross minimisation, approval or justification of genocide or crimes against humanity 64
Legal provisions outlawing incitement to terrorism, terrorist propaganda and/or terrorist use of the Internet 69
Legal provisions criminalizing child pornography 81
Legal provisions outlawing obscene and sexually explicit (pornographic) content 99
Legal provisions outlawing Internet piracy 103
Legal provisions outlawing libel and insult (defamation) on the Internet 115
Legal provisions outlawing the expression of views perceived to be encouraging extremism 127
Legal provisions outlawing the distribution of “harmful content” 133
Legal provisions outlawing any other categories of Internet content 135
Conclusion to Part B 136

C. BLOCKING, FILTERING, AND CONTENT REMOVAL 139
European Union and Council of Europe policies and projects on blocking access to websites 139
Legal provisions which require closing down and/or blocking access to websites and access to Web 2.0 based services 149
Policies on Filtering Software and Children’s Access to Harmful Content 174
Legal provisions requiring schools, libraries, and Internet cafes to use filtering and blocking systems and software 176
Conclusion to Part C 181
D. LICENSING AND LIABILITY RELATED ISSUES, AND HOTLINES TO REPORT ILLEGAL CONTENT 186
Hotlines to report allegedly illegal content 208
Conclusion to Part D 219

APPENDIX II: RESPONSE STATISTICS 229
APPENDIX III: RESPONSE FREQUENCIES 230

Introduction

Whenever new communications and media platforms have been introduced, their innovation and application were met with scepticism, fear or outright banning by ruling parties and authorities who feared the unknown medium and its capacity to oust them from power. New (mass) media have therefore historically faced suspicion and been liable to excessive regulation, as they spark fear of potential detrimental effects on society, security and political power structures. This has proven true for the publication and transmission of certain types of content from the printing press through the advent of radio, television and satellite transmissions, as well as other forms of communication systems.

During the 1990s, as attention turned to the Internet and as access to this borderless new communications platform increased, the widespread availability of various content, including sexually explicit content and other types of content deemed harmful to children, stirred up a ‘moral panic’1 shared by many states and governments as well as certain civil-society representatives and concerned citizens. Prior to the 1990s, information and content were predominantly within the strict boundaries and control of individual states, whether through paper-based publications, audio-visual transmissions limited to a particular area or even through public demonstrations and debates. Much of the media content made available, and the discussions it triggered, remained confined within territorially defined areas. Today, however, information and content, digitally transmitted and widely available through the Internet, do not necessarily respect national rules or territorial boundaries. This dissolution of the “sovereignty” of content control, coupled with the globalization of information, is accompanied by an increased multilingualism observable in many countries. The increasing popularity of user-driven, interactive Web 2.0 applications and services such as YouTube, Facebook and Twitter seems to eliminate virtual Internet borders even further by creating a seamless global public sphere. This inevitably complicates state-level efforts to find an appropriate balance between the universal right to freedom of opinion and expression, which includes the right to receive and impart information, and the prohibition of certain types of content deemed illegal by nation-state authorities or intergovernmental organizations.

With the widespread availability of the Internet and the increasing number of users, online content regulation has become an important focus of governments and supranational bodies across the globe. Today, many OSCE participating States feel the need to react to the development of the Internet as a major media and communication platform. Governments consider that, on the one hand, the infrastructure requires protective measures and, on the other hand, the content made available necessitates regulation. The past few years have shown that more people access the Internet, more content is made available online, and more states feel obliged to regulate online content. A number of countries across the OSCE region have introduced new legal provisions in response to the availability and dissemination of certain types of (illegal or unwanted) content.
Governments are particularly concerned about the availability of terrorist propaganda,2 racist content,3 hate speech, sexually explicit content, including child pornography,4 as well as state secrets and content critical of certain governments or business practices.

1 Cohen, S., Folk Devils and Moral Panics: The Creation of the Mods and Rockers, Routledge, 30th Anniversary edition, 2002; Jenkins, P., Intimate Enemies: Moral Panics in Contemporary Great Britain, Aldine De Gruyter, 1992.
2 See generally Weimann, G., Terror on the Internet: The New Arena, the New Challenges (Washington: US Institute of Peace, 2006).
3 For a detailed assessment of legal issues surrounding racist content and hate speech on the Internet see Akdeniz, Y., Racism on the Internet, Council of Europe Publishing, 2010 (ISBN 978-92-871-6634-0); Akdeniz, Y., “Introduction,” in Legal Instruments for Combating Racism on the Internet, Council of Europe Publishing, Human Rights and Democracy Series, 2009, pp. 7-37.

However, the governance of illegal as well as harmful Internet content (the latter falling short of illegal) may differ from one country to another, and variations are evident among the OSCE participating States.5 “Harm criteria” remain distinct across jurisdictions, with individual states deciding what is legal and illegal on the basis of their cultural, moral, religious, historical and constitutional values. Typically, the stance taken by many states is that what is illegal and punishable in an offline form must at least be treated equally online. There are, however, several features of the Internet which fundamentally affect approaches to its governance, and while rules and boundaries still exist, enforcing existing laws, rules and regulations against digital content is evidently complex and problematic. Despite the introduction of new laws, or amendments to existing laws, criminalizing the publication or distribution of certain types of content, extraterritoriality remains a major problem in almost all instances in which content hosted or distributed from outside a jurisdiction is deemed illegal within it.6 The question of jurisdiction over content therefore adds to the challenges faced by governments and regulators. Which country’s laws should apply to content providers or to Web 2.0 based platform providers? Should providers be liable in the country where the content has been uploaded, viewed or downloaded, where the server is located, or where the responsible providers reside? Many of these questions remain unanswered. Some countries fear the Internet could undermine their judicial sovereignty; others embrace the Internet and praise its global nature. However, the Internet certainly has created