Platform Liability in the EU: A Need for Reform? 2020
Table of Contents

Definitions
1 Executive summary
2 Introduction
2.1 The purpose of the report
2.2 Method
2.3 Overview
2.4 The context of the report
2.4.1 Platforms and the internal market
2.4.2 Platforms and sustainable development
3 The regulatory framework for platform content liability in the EU
3.1 The e-Commerce Directive
3.2 The Copyright Directive (2019)
3.3 The Audiovisual Media Services Directive (2018)
3.4 The relationship and consistency between the directives
3.5 Self-regulation
4 A further harmonised framework for platform liability in the EU
4.1 Evidence of fragmentation between the Member States
4.2 A harmonised framework for platform liability: issues for further analysis
5 A stricter level of platform liability in the EU?
5.1 Main arguments in favour of a stricter liability
5.2 Main arguments against a stricter liability
5.3 Recommendations
Bibliography and other sources
Sammanfattning på svenska (Summary in Swedish)
Annex: Platform Liability and the e-Commerce Directive – answers from stakeholders

Definitions

The following definitions apply for the purpose of this report.

Platforms: A platform is defined as a “digital service that facilitates interactions between two or more distinct but interdependent sets of users (whether firms or individuals) who interact through the service via the internet”.[1] The platform business model is characterised by the fact that platforms act as intermediaries between two or more types of users.[2] Examples include e-commerce platforms that match retailers and customers, social media platforms that match users and advertisers, and apartment-sharing platforms that match apartment owners with customers seeking short-term rentals.

Illegal content: Illegal content is defined as content that infringes European Union (EU) law or national laws in the Member States,[3] including illegal offers for sale of goods and services. Falling outside the scope of the report are discussions regarding content that is not illegal but rather “harmful”, such as pornographic content.
The distinction between illegal and harmful content is of course fluid and shifts with legislative developments. To take one example, the concept of “fake news” today exists in the grey zone between legal and illegal content. Some countries are taking measures to classify “fake news” as illegal,[4] a trend that has intensified during the COVID-19 pandemic.[5] In this report, we will not discuss “fake news” as an issue of illegal content. This is in line with the approach taken by the Commission in the recent consultation on the Digital Services Act.[6]

Liability: When discussing the liability of platforms in this report, we are addressing the circumstances under which platforms can or cannot be held liable for content posted on the platform by their users. This is commonly referred to as the secondary liability of platforms. If the platform were to publish illegal content on its own, this would concern the primary liability of platforms. The primary liability of platforms falls beyond the scope of this report.

Content filtering: When discussing content filtering in this report, we are referring to the use of software and algorithms to prevent, screen and remove illegal content from platforms. A common example is the Content ID system used by YouTube, which is a digital fingerprinting system to detect copyright infringements. Content filtering in a wider sense may also refer to measures taken by other actors. Examples include an internet access provider that blocks access to certain websites or a parent who uses software to block certain internet content for children. The latter issues are, however, not the subject of this paper.

[1] OECD (2019), p. 21.
[2] See Stefan Larsson & Jonas Andersson Schwarz (2018), p. 20.
[3] This is also the definition used in the recommendation by the Commission on measures to tackle illegal content online. See C(2018) 1177.
[4] Michael-Ross Fiorentino (2018).
[5] EDRi (2020).
[6] Commission (2020a).

1 Executive summary

This report discusses the need to reform the European Union (EU) rules on platform liability for illegal content, such as terrorist propaganda, hate speech and illegal products. In this regard, the President of the European Commission, Ursula von der Leyen, has called for a Digital Services Act that will upgrade the liability and safety rules for platforms in the EU. The purpose of this report is to outline key issues and arguments that need to be considered in the upcoming policy process. The report forms part of the work of the National Board of Trade in offering analysis to the Swedish government ahead of upcoming negotiations regarding the internal market.

The analysis presented in this report is supported by the results of a questionnaire sent out by the National Board of Trade to stakeholders in December 2019 (the NBT questionnaire). The questionnaire was answered by 29 stakeholders and is used to provide a sample of their insights; it is not used to draw statistical conclusions, owing to the relatively small sample size. The answers to the questionnaire are summarised in a separate annex to this report.

The analysis in this report is divided into two parts. The first part discusses fragmentation under the current regulatory framework and potential improvements to increase harmonisation. The second part discusses the arguments for and against a more large-scale reform of the level of liability for platforms.
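As background to that second part, which turns on content filtering, the following is a minimal Python sketch of the fingerprint matching described under Content filtering in the Definitions chapter. It is an illustration only, not any platform's actual implementation: real systems such as YouTube's Content ID rely on proprietary perceptual fingerprints that also match re-encoded or edited copies, whereas this sketch uses exact SHA-256 hashes, and the reference data is invented for the example.

```python
import hashlib

# Hypothetical reference set: fingerprints of files that rights holders or
# authorities have flagged. The single entry below is the SHA-256 hash of
# the bytes b"test", included so the usage example finds a match.
REFERENCE_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def fingerprint(data: bytes) -> str:
    """Compute a fingerprint of an uploaded file.

    SHA-256 only catches byte-identical copies; production filters use
    perceptual fingerprints so that altered copies still match.
    """
    return hashlib.sha256(data).hexdigest()


def screen_upload(data: bytes) -> bool:
    """Return True if the upload matches a known fingerprint and should be
    blocked or routed to human review before publication."""
    return fingerprint(data) in REFERENCE_FINGERPRINTS


if __name__ == "__main__":
    upload = b"test"
    if screen_upload(upload):
        print("Upload blocked: matches a reference fingerprint")
    else:
        print("Upload allowed")
```

Even this toy version makes the policy trade-off visible: an exact-match filter misses trivially altered copies, while the looser perceptual matching used in practice inevitably produces some false positives, which is where the concerns about over-blocking and freedom of expression discussed later in the report arise.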
The discussion in this latter part focuses on the need for the introduction of mandatory filtering obligations.

Analysis

In terms of harmonisation under the current regulatory framework, we present evidence of fragmentation between the Member States. Indeed, national regulations have to a certain extent diverged over the 20 years since the e-Commerce Directive was enacted. To address this issue, we highlight the following issues for further analysis:

• The possibility of introducing a harmonised notice-and-takedown procedure across the EU, including issues such as the territorial scope of takedowns and access to justice.
• The possibility of reforming the scope of the e-Commerce Directive to account for “new services” that were not explicitly addressed in the original directive.
• The possibility of clarifying the distinction between active and passive hosting services. In this regard, the legislator may consider clarifying that good faith efforts by platforms to prevent illegal content will not lead to the loss of the liability exception.
• The possibility of clarifying the distinction between general and specific monitoring obligations, in line with recent case law.
• The possibility of clarifying the relationship between the e-Commerce Directive and other pieces of EU platform legislation, such as the Copyright Directive and the AVMS Directive.

With regard to the level of platform liability, there are forceful arguments both for and against a reform. The situation is characterised by a complex balancing act between competing interests. This was also reflected in the answers to the NBT questionnaire, where stakeholders were divided as to the need for a large-scale reform of the rules on hosting liability.

We present the following recommendations for the policy process moving forward. First of all, we stress the importance of gathering solid evidence. It is crucial that the policy process is not rushed and that proposals are based on solid empirical evidence. Examples of tools that can be used to gather evidence include regulatory sandboxes and public consultations. Second, we note that illegal content is a multi-faceted problem; some areas of content liability may be more suitable for reform than others. For example, stricter liability might be investigated with regard to child sexual abuse material and illegal goods. Finally, it is important that any reform respects fundamental freedoms, such as the freedom of expression and the freedom to conduct a business.

2 Introduction

This year marks the 20th anniversary of the e-Commerce Directive. The Directive