The Marketplace of Ideas Online
Dawn Carla Nunziato*

94 Notre Dame L. Rev. 1519–1584 (2019); GWU Law School Public Law Research Paper No. 2019-36; GWU Legal Studies Research Paper No. 2019-36. Available at SSRN: https://ssrn.com/abstract=3405381 or http://dx.doi.org/10.2139/ssrn.3405381.

© 2019 Dawn Carla Nunziato. Individuals and nonprofit institutions may reproduce and distribute copies of this Article in any format at or below cost, for educational purposes, so long as each copy identifies the author, provides a citation to the Notre Dame Law Review, and includes this provision in the copyright notice.

* William Wallace Kirkpatrick Research Professor of Law, The George Washington University Law School; Codirector, Global Internet Freedom Project. I am grateful to the editors of the Notre Dame Law Review and the participants in the Notre Dame Law Review Symposium on Contemporary Free Speech: The Marketplace of Ideas a Century Later, especially Alexander Tsesis and Fred Schauer, and to my colleagues Chip Lupu and Todd Peterson, for their helpful comments on this Article. I am also grateful to Alexia Khella and Ken Rodriguez for providing excellent research and library assistance in connection with this Article, to Kierre Hannon for excellent administrative assistance, and to Dean Blake Morant for financial support of my research.

INTRODUCTION ..... 1520
I. THE HISTORICAL ORIGINS OF THE MARKETPLACE OF IDEAS ..... 1523
II. THE UNIQUE PROBLEMS OF TODAY’S ONLINE MARKETPLACE OF IDEAS ..... 1527
III. FIXING THE FLAWS IN THE ONLINE MARKETPLACE OF IDEAS ..... 1531
   A. The European Union’s Approach, the German Approach, and Potential Regulatory Spillover to the United States ..... 1531
   B. What Facebook Is Doing ..... 1538
      1. Partnering with Third-Party Fact-Checkers to Evaluate Potentially False Posts ..... 1539
      2. Related Articles/Additional Reporting as Counterspeech and Other Remedies in Response to False News ..... 1540
      3. Transparency and Disclosure Requirements Regarding Political/Electioneering Advertisements ..... 1543
      4. Removing False Posts Intended and Likely to Encourage Violence ..... 1543
      5. Eliminating Fake Accounts ..... 1544
      6. Providing Contextual and Source Information ..... 1545
      7. Allowing Users to Rank Trustworthiness of News Sources ..... 1545
      8. Modifying News Feed ..... 1545
      9. Analysis of Facebook’s Efforts ..... 1546
   C. What Twitter Is Doing ..... 1549
      1. Suspending Fake and Suspicious Accounts ..... 1549
      2. Mechanisms for User Reporting of Content and Accounts ..... 1551
      3. Demoting Tweets from Bad Faith Actors ..... 1551
      4. Transparency and Disclosure Requirements re Political/Electioneering Advertisements ..... 1551
      5. Future Plans ..... 1553
      6. Analysis of Twitter’s Efforts ..... 1553
   D. What the U.S. Legislature Seeks to Do: The Honest Ads Act ..... 1554
   E. What U.S. Litigants Are Doing: Defamation Actions Against Purveyors of False News ..... 1558
CONCLUSION ..... 1560
APPENDIX A ..... 1562
APPENDIX B ..... 1580

INTRODUCTION

One hundred years ago, in the 1919 case of Abrams v. United States, Justice Oliver Wendell Holmes ushered into existence modern First Amendment jurisprudence by introducing the “free trade in ideas” model of free speech.1 According to this model, the ultimate good is reached by allowing speakers to engage in the free trade in ideas—free of government intervention in the way of regulation, censorship, or punishment. Ideas must be allowed to compete freely in an unregulated market, and the best ideas will ultimately get accepted by competing with others in this marketplace. As such, government intervention is unnecessary and counterproductive. Thus, instead of punishing the speakers in Abrams—for criticizing the government’s attempts to crush the Russian Revolution and calling for American workers to strike2—the government should have taken a hands-off approach and allowed these ideas to compete (and lose) in the marketplace of ideas.3

The characteristics of our marketplace(s) of ideas have changed dramatically since 1919, when the Russian immigrants in Abrams threw their leaflets from the fourth floor window of a hat factory in lower Manhattan in an effort to widely disseminate their ideas.4 Russians are still players in our marketplace of ideas, but today’s marketplace suffers from uniquely modern and challenging problems—such as rampant interference in the form of Russian troll farms mass-producing tweets and other widely shared content on social media with the intent and the effect of sabotaging U.S. elections.5

1 Abrams v. United States, 250 U.S. 616, 630 (1919) (Holmes, J., dissenting).
2 Id. at 617–19 (majority opinion).
3 Id. at 630 (Holmes, J., dissenting).
4 Id. at 617–18 (majority opinion).
5 See OFFICE OF THE DIR. OF NAT’L INTELLIGENCE, NAT’L INTELLIGENCE COUNCIL, INTELLIGENCE COMMUNITY ASSESSMENT 2017-01D, ASSESSING RUSSIAN ACTIVITIES AND INTENTIONS IN RECENT US ELECTIONS 2 (2017), https://www.dni.gov/files/documents/ICA_2017_01.pdf.
In addition to the widespread dissemination of false political content from both foreign and domestic sources, today’s online marketplace of ideas is besieged by the increased polarization and siloing of thought and opinion, which renders Holmes’s prescribed remedy for harmful speech—counterspeech—increasingly ineffective.6

In the past two years, we have seen a variety of efforts, both in the United States and across the globe, by governments and by online platform providers themselves, to address the problems, distortions, and imperfections in the online marketplace. Because online platforms like Facebook and Twitter play such a dominant role in the online marketplace of ideas—and the modern marketplace of ideas generally—it is worthwhile to focus specifically on how these platforms are being regulated, as well as how they are regulating themselves. While the United States has essentially taken a hands-off approach to regulating online platforms, the European Union has assumed a relatively aggressive regulatory approach.7 The European Union and several European countries have generally implemented speech regulations to hold platforms liable for failing to police their sites, and have recently imposed sweeping regulations on such platforms. And, in their efforts to comply with such regulations, online platforms like Facebook and Twitter may end up implementing these European regulations in ways that affect what U.S. audiences can access online—since it is often difficult for platforms to implement national regulations in a geographically targeted manner with no spillover beyond the regulating nation’s borders.8 Accordingly, it is worthwhile to examine these international efforts in some detail. The European Union and European countries have recently undertaken sweeping efforts to remedy perceived imperfections in the marketplace,9 including by requiring online platforms to rapidly remove a wide swath of harmful content.10 Among European nations, Germany has led the way by enacting drastic legislation requiring social media sites like Facebook and Twitter to remove false news, defamatory hate speech, and other unlawful content within twenty-four hours of receiving notice of the same, upon pain of multi-

6 See, e.g., Cristina Maza, Florida Shooting: Russian Bots Flooded the Internet with Propaganda About Parkland Massacre, NEWSWEEK (Feb. 16, 2018), https://www.newsweek.com/florida-shooting-russian-bots-twitter-809000 (reporting on Russian-linked bots tweeting about the Parkland shooting and gun reform).
7 See infra Section III.A (comparing Communications Decency Act section 230 and the European Union approach to online intermediary liability).
8 See infra Section III.A.
9 See, e.g., Joanna Plucinska, Macron Proposes New Law Against Fake News, POLITICO (Jan. 3, 2018), https://www.politico.eu/article/macron-proposes-new-law-against-fake-news (reporting on French President Emmanuel Macron’s proposal for