Platform Liability Under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: an Impossible Match
Joint PIJIP/TLS Research Paper Series no. 64, March 2021
Digital Commons @ American University Washington College of Law

Christophe Geiger, Centre for International Intellectual Property Studies (CEIPI), University of Strasbourg, [email protected]
Bernd Justin Jütte, University College Dublin, Sutherland School of Law, [email protected]

Recommended Citation: Geiger, Christophe and Bernd Justin Jütte. "Platform Liability Under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match." (2021) PIJIP/TLS Research Paper Series no. 64. https://digitalcommons.wcl.american.edu/research/64
Christophe Geiger & Bernd Justin Jütte*

“We cannot accept a situation where decisions that have a wide-ranging impact on our democracy are being made by computer programs without any human supervision.”
- Ursula von der Leyen, President of the European Commission, 20 January 2020

* Christophe Geiger is Professor of Law at the Centre for International Intellectual Property Studies (CEIPI), University of Strasbourg (France); Affiliated Senior Researcher at the Max Planck Institute for Innovation and Competition (Munich, Germany); and Spangenberg Fellow in Law & Technology at the Spangenberg Center for Law, Technology & the Arts, Case Western Reserve University School of Law (Cleveland, US). Bernd Justin Jütte is Assistant Professor in Intellectual Property Law, University College Dublin, Sutherland School of Law (Ireland) and Senior Researcher, Vytautas Magnus University, Faculty of Law (Kaunas, Lithuania). This research project was conducted with the support of Bitkom e.V.

CONTENTS

ABSTRACT
EXECUTIVE SUMMARY
INTRODUCTION
FUNDAMENTAL RIGHTS IMPLICATIONS OF PLATFORM LIABILITY
   A. The systemic role of fundamental rights in the EU legal order
   B. Proportionality: The CJEU’s balancing methodology
   C. The relevant fundamental rights with regard to platform liability
      1. Freedom of expression and information, freedom of the arts
      2. Freedom to conduct a business
      3. Data protection, privacy and family life
      4. The right to property and its social function
      5. The right to an effective remedy and to a fair trial
   D. Legal certainty and market harmonization
   E. Interim conclusions
EXAMINING ARTICLE 17 CDSM IN THE LIGHT OF FUNDAMENTAL RIGHTS
   A. The new liability regime established by Article 17 CDSM – An overview
   B. Article 17, automated filtering and fundamental rights
      1. Automated filtering as a necessary consequence of Article 17
      2. Does automated filtering qualify as general monitoring? (Article 17(8))
   C. A closer look at filtering: technological background
      1. Limits of automated filtering technology (false positives)
         a. Infringement threshold
         b. Contextual differentiation
   D. Intermission: Effects of automated filtering on fundamental rights
   E. Addressing fundamental rights in the mechanisms of Article 17 – A proper balance?
      1. Best efforts to obtain authorization (Article 17(4)(a))
      2. Targeted and tailored filtering obligations
         a. Prevent access to specific works and subject matter (Article 17(4)(b))
         b. Disable access upon notification
         c. Best efforts to ensure future availability of specific works
         d. Proportionality as a horizontal requirement (Article 17(5))
      3. Safeguarding fundamental user rights (Article 17(7))
         a. The primary obligation to guarantee lawful uses
         b. The imperatives of exceptions and limitations
         c. Procedural safeguards
   F. Resolving fundamental rights conflicts through an institutional mediator
CONCLUSION

ABSTRACT

The Directive on Copyright in the Digital Single Market (CDSM Directive) introduced a change of paradigm with regard to the liability of certain platforms in the European Union. Under the safe harbour rules of the Directive on electronic commerce (E-Commerce Directive), intermediaries in the EU were shielded from liability for acts of their users committed through their services, provided they had no knowledge of them.
Although platform operators could be required to help enforce copyright online by taking down infringing content, the E-Commerce Directive also drew a very clear line: intermediaries could not be obliged to monitor all communications of their users and to install general filtering mechanisms for this purpose. The Court of Justice of the European Union confirmed this in a series of cases, amongst other reasons because such filtering would restrict the fundamental rights of platform operators and users of intermediary services. Twenty years later, the regime for online intermediaries in the EU has fundamentally shifted with the adoption of Article 17 CDSM Directive, the most controversial and hotly debated provision of this piece of legislation. For a specific class of online intermediaries called “online content-sharing service providers” (OCSSPs), uploads of infringing works by their users now result in direct liability, and OCSSPs are required to undertake “best efforts” to obtain authorization for such uploads. With this new responsibility come further obligations, which oblige OCSSPs to make best efforts to ensure that works for which they have not obtained authorization are not available on their services. How exactly OCSSPs can comply with this obligation is still unclear. However, it seems unavoidable that compliance will require them to install measures such as automated filtering (so-called “upload filters”) that use algorithms to prevent users from uploading unlawful content. Given the scale of the obligation, there is a real danger that measures taken by OCSSPs in fulfilment of it will amount to expressly prohibited general monitoring. What seems certain, however, is that automated filtering, whether general or specific in nature, cannot distinguish appropriately between illegitimate and legitimate uses of content (e.g. uses covered by a copyright exception or limitation).
Hence, there is a serious risk of over-blocking of uses that benefit from strong fundamental rights justifications, such as the freedom of expression and information or the freedom of artistic creativity. This article first outlines the relevant fundamental rights, as guaranteed under the EU Charter of Fundamental Rights and the European Convention on Human Rights, that are affected by an obligation to monitor and filter for copyright-infringing content. Second,