Stanford – Vienna Transatlantic Technology Law Forum
A joint initiative of Stanford Law School and the University of Vienna School of Law

TTLF Working Papers No. 73

Regulating Freedom of Speech on Social Media: Comparing the EU and US Approach

Marie-Andrée Weiss

2021

TTLF Working Papers
Editors: Siegfried Fina, Mark Lemley, and Roland Vogl

About the TTLF Working Papers

TTLF's Working Paper Series presents original research on technology- and business-related law and policy issues of the European Union and the US. The objective of TTLF's Working Paper Series is to share "work in progress". The authors of the papers are solely responsible for the content of their contributions and may use the citation standards of their home country. The TTLF Working Papers can be found at http://ttlf.stanford.edu. Please also visit this website to learn more about TTLF's mission and activities.

If you should have any questions regarding the TTLF's Working Paper Series, please contact Vienna Law Professor Siegfried Fina, Stanford Law Professor Mark Lemley, or Stanford LST Executive Director Roland Vogl at the Stanford-Vienna Transatlantic Technology Law Forum, http://ttlf.stanford.edu.

Stanford Law School                      University of Vienna School of Law
Crown Quadrangle                         Department of Business Law
559 Nathan Abbott Way                    Schottenbastei 10-16
Stanford, CA 94305-8610                  1010 Vienna, Austria

About the Author

Marie-Andrée Weiss is a French-American attorney admitted in New York and in Strasbourg, France. Her solo practice focuses on intellectual property, privacy, data protection, and social media law. Before becoming an attorney, she worked for several years in the fashion and cosmetics industry in New York as a buyer and a director of sales and marketing. She enjoys blogging, is a guest blogger on several sites, and maintains two of her own blogs.

General Note about the Content

The opinions expressed in this paper are those of the author and not necessarily those of the Transatlantic Technology Law Forum or any of its partner institutions, or the sponsors of this research project.

Suggested Citation

This TTLF Working Paper should be cited as:
Marie-Andrée Weiss, Regulating Freedom of Speech on Social Media: Comparing the EU and the US Approach, Stanford-Vienna TTLF Working Paper No. 73, http://ttlf.stanford.edu.

Copyright © 2021 Marie-Andrée Weiss

Abstract

Social media platforms provide forums to share ideas, jokes, images, insults, and threats. These private companies form a contract with their users, who agree in turn to respect the platform's private rules, which evolve regularly and organically, sometimes reacting to a particular event, just as legislatures may do. As these platforms have a global reach, yet are, for the most part, located in the United States, the articulation between the platforms' terms of use and the laws of the states where the users are located varies greatly from country to country. This article proposes to explore the often-tense relationships between the states, the platforms, and the users, whether their speech creates harm or they are the victims of such harm.

The first part of the article is a general presentation of freedom of expression law. This part does not attempt to be a comprehensive catalog of such laws around the world; it is only a general presentation of the U.S. and European Union laws protecting freedom of expression, using France as an example of a particular country in the European Union.
While the principle is freedom of speech, the legal standard is set by international conventions, such as the United Nations Universal Declaration of Human Rights or the European Convention on Human Rights.

The second part of the article presents what the author believes to be the four main justifications for regulating free speech: protecting the public order, protecting the reputation of others, protecting morality, and advancing knowledge and truth. The protection of public order entails the protection of the flag or the king, and lèse-majesté sometimes survives even in a republic. The safety of the economic market, which may dangerously sway if false information floats online, is another state concern, as is the personal safety of the public. Speech sometimes harms, even kills, or places an individual in fear for his or her life. The reputation and honor of others are easily smeared on social media, whether by defamation, insults, or hate speech, a category of speech not clearly defined by law yet at the center of the debate on online content moderation, including whether there is a right to speak anonymously online. What constitutes "morality" is another puzzling question, as blasphemy, indecency, and even pornography have different legal definitions around the world, as well as private definitions set by the platforms. Even truth is an elusive concept, and both states and platforms struggle to define what "fake news" is, and whether clearly false information, such as denying the existence of the Shoah, should be allowed to be published online. Indeed, while four justifications for regulating speech are delineated in this article, the speech and conduct that should be considered an attack on values worthy of protection are not assessed in the same way by different states and different platforms, and how the barriers to speech are placed provides a telling picture of the state of democracy.

The third part examines who should have the power to delete speech on social media. States may exert censorship on the platforms, or even on the pipes, to block access to speech and punish, sometimes harshly, speakers daring to trespass the barriers to free speech erected by the states. For the sake of democracy, the integrity of the electoral process must not be threatened by false information, whether about the candidates, about alleged fraud, or even about the result of the vote. Social media platforms must respect the law. In the United States, Section 230 of the Communications Decency Act of 1996 provides immunity to platforms for third-party content, but also for screening offensive content. Section 230 has been modified several times, and many bills, from both sides of the political spectrum, aim at further reform. In the European Union, the E-commerce Directive similarly provides a safe harbor to social media platforms, but the law is likely to change soon, as the Digital Services Act proposal was published in December 2020. The platforms have their own rules, and may even soon have their own private courts, for example the recently created Facebook Oversight Board. However, other private actors may have a say in what can be published on social media, for instance employers or the governing bodies of regulated professions, such as judges or politicians. Even private users may curtail the right of others to speak freely, using copyright laws, or may use public shaming to frighten speakers into silence.
Such fear may lead users to self-censor their speech, to the detriment of the marketplace of ideas, or they may choose to delete controversial messages. Public figures, however, may not have the right to delete social media posts or to block users.

The article was finished in the last days of 2020, a year which saw attempts to use social media platforms to sway the U.S. elections by spreading false information, France's semi-failed attempt to pass a law protecting social media users against hate speech, and false news about the deadly Covid-19 virus spreading online like wildfire, through malicious or naïve posts. A few days after the article was completed, the U.S. Capitol was attacked, on January 6, 2021, by a seditious mob seeking to overturn the results of the Presidential election, believing that the election had been rigged, a false claim amplified by thousands of users on social media, including the then President of the United States. Several social media platforms responded by blocking the President's social media accounts, either temporarily or, as Twitter did, permanently.

Table of Contents

Introduction
1st Part: General Presentation of Freedom of Expression Laws
  I. The Principle: Freedom of Speech
    A. International Laws
    B. European Union Law
    C. National Laws
      a. U.S. Laws
      b. French Laws
2nd Part: The Four Main Justifications for Regulating Free Speech
  I. Protecting Public Order
    A. Protecting the Flag
    B. Mocking the King