Documents That Would Establish Malwarebytes’ Deep Relationship with Bleeping and Its Collaboration with Bleeping’s Efforts to Divert Sales from Enigma to 6

Total pages: 16

File type: PDF, size: 1,020 KB

No. 19-1284

IN THE
Supreme Court of the United States

MALWAREBYTES, INC., Petitioner,
v.
ENIGMA SOFTWARE GROUP USA, LLC, Respondent.

On Petition for a Writ of Certiorari to the United States Court of Appeals for the Ninth Circuit

BRIEF IN OPPOSITION FOR RESPONDENT

CHRISTOPHER M. VERDINI
ANNA SHABALOV
K&L GATES LLP
210 Sixth Avenue
Pittsburgh, PA 15222
(412) 355-6500

TERRY BUDD
Counsel of Record
BUDD LAW, PLLC
120 Lyndhurst Circle
Wexford, PA 15090
(412) 613-2541
([email protected])

July 27, 2020

QUESTION PRESENTED

Whether the Ninth Circuit correctly held that 47 U.S.C. § 230(c)(2), titled “Protection for ‘Good Samaritan’ blocking and screening of offensive material,” does not provide immunity from liability for companies engaging in predatory practices that intentionally target competitors for anticompetitive reasons.

RULE 29.6 DISCLOSURE STATEMENT

Enigma Software Group USA, LLC is a Florida limited liability company with its principal place of business in Florida. Enigma Software Group USA, LLC is 100% owned by Globalist LLC, a Delaware limited liability company. Globalist LLC has no parent corporation, and no publicly held corporation owns 10% or more of its stock.

TABLE OF CONTENTS

QUESTION PRESENTED
RULE 29.6 DISCLOSURE STATEMENT
TABLE OF AUTHORITIES
INTRODUCTION
STATEMENT
  I. STATUTORY CONTEXT
  II. FACTUAL BACKGROUND
  III. PROCEDURAL HISTORY
    A. District Court Opinion
    B. Ninth Circuit Opinion
REASONS FOR DENYING THE PETITION
  I. THE NINTH CIRCUIT’S DECISION WAS CORRECT
    A. The Ninth Circuit’s Judgment Is Consistent with the Statutory Text
    B. Malwarebytes’ Attacks on the Ninth Circuit Opinion Have No Merit
  II. THERE IS NO CONFLICT WARRANTING THE COURT’S REVIEW
    A. Malwarebytes’ Claim of a Generalized Circuit Conflict Is Baseless
    B. The Decision Below Created No Intra-Circuit Conflict
  III. THE NINTH CIRCUIT’S NARROW AND FACT-BOUND DECISION IS NOT OF SUFFICIENT IMPORTANCE TO WARRANT REVIEW
  IV. THIS CASE IS A POOR VEHICLE FOR FURTHER REVIEW
CONCLUSION

TABLE OF AUTHORITIES

CASES

Almeida v. Amazon.com, Inc., 456 F.3d 1316 (11th Cir. 2006)
Ashcroft v. Iqbal, 556 U.S. 662 (2009)
Asurvio LP v. Malwarebytes Inc., No. 5:18-cv-05409-EJD, 2020 WL 1478345 (N.D. Cal. Mar. 26, 2020)
Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003)
Bell Atlantic Corp. v. Twombly, 550 U.S. 544 (2007)
Breazeale v. Victim Servs., Inc., 878 F.3d 759 (9th Cir. 2017)
Chicago Lawyers’ Comm. for Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666 (7th Cir. 2008)
Comcast Corp. v. FCC, 600 F.3d 642 (D.C. Cir. 2010)
Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008)
Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008)
Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), cert. denied, No. 19-859 (U.S. May 18, 2020)
Goddard v. Google, Inc., No. C 08-2738 JF (PVT), 2008 WL 5245490 (N.D. Cal. Dec. 17, 2008)
Gundy v. United States, 139 S. Ct. 2116 (2019), reh’g denied, 140 S. Ct. 579 (2019)
Hassell v. Bird, 420 P.3d 776 (Cal. 2018), cert. denied, 139 S. Ct. 940 (2019)
Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016)
Johnson v. Arden, 614 F.3d 785 (8th Cir. 2010)
Marshall’s Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263 (D.C. Cir. 2019)
Nat’l Numismatic Certification, LLC v. eBay, Inc., No. 6:08-cv-42-Orl-19GJK, 2008 WL 2704404 (M.D. Fla. July 8, 2008)
Nat’l Soc’y of Prof’l Eng’rs v. United States, 435 U.S. 679 (1978)
Obduskey v. McCarthy & Holthus LLP, 139 S. Ct. 1029 (2019)
Parker Drilling Mgmt. Servs., Ltd. v. Newton, 139 S. Ct. 1881 (2019)
Prager Univ. v. Google LLC, No. 19CV340667, 2019 WL 8640569 (Cal. Super. Ct., Santa Clara Cty., Nov. 19, 2019), appeal docketed, No. H047714 (Cal. Ct. App. 6th Dist. Dec. 19, 2019)
Sherman v. Yahoo! Inc., 997 F. Supp. 2d 1129 (S.D. Cal. 2014)
Shiamili v. Real Estate Grp. of N.Y., Inc., 952 N.E.2d 1011 (N.Y. 2011)
Song fi Inc. v. Google, Inc., 108 F. Supp. 3d 876 (N.D. Cal. 2015)
Wyeth v. Levine, 555 U.S. 555 (2009)
Zango, Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169 (9th Cir. 2009)
Zeran v. Am. Online, Inc., 129 F.3d 327 (4th Cir. 1997)

CONSTITUTION, STATUTES, AND RULES

U.S. Const. amend. I
Allow States and Victims to Fight Online Sex Trafficking Act of 2017, Pub. L. No. 115-164, 132 Stat. 1253 (2018)
Communications Act of 1934, 47 U.S.C. § 151 et seq.
Communications Decency Act of 1996, Pub. L. No. 104-104, tit. V, 110 Stat. 56, 133
47 U.S.C. § 230
47 U.S.C. § 230(b)
47 U.S.C. § 230(b)(2)
47 U.S.C. § 230(c)(1)
47 U.S.C. § 230(c)(2)
47 U.S.C. § 230(c)(2)(A)
47 U.S.C. § 230(c)(2)(B)
47 U.S.C. § 230(e)
47 U.S.C. § 230(e)(3)
28 U.S.C. § 1404
N.Y. Gen. Bus. Law § 349
Fed. R. Civ. P. 12(b), 12(b)(2), 12(b)(6), 45

LEGISLATIVE MATERIALS

“Biased Algorithm Deterrence Act of 2019,” H.R. 492, 116th Cong. (2019)
Clerk, U.S. House of Representatives, 115th Cong., Final Vote Results for Roll Call 91 on H.R. 1865 (Feb. 27, 2018), http://clerk.house.gov/evs/2018/roll091.xml
“EARN IT Act of 2020,” S. 3398, 116th Cong. (2020)
“Ending Support for Internet Censorship Act,” S. 1914, 116th Cong. (2019)
“Limiting Section 230 Immunity to Good Samaritans Act,” S. 3983, 116th Cong. (2020)
“Platform Accountability and Consumer Transparency Act (PACT Act),” S. 4066, 116th Cong. (2020)
“Stop the Censorship Act,” H.R. 4027, 116th Cong. (2019)
U.S. Senate, 115th Cong., Roll Call Vote on H.R. 1865 (Mar. 21, 2018), https://www.senate.gov/legislative/LIS/roll_call_lists/roll_call_vote_cfm.cfm?congress=115&session=2&vote=00060

ADMINISTRATIVE MATERIALS

Exec. Order No. 13,925, 85 Fed. Reg. 34,079 (May 28, 2020)
U.S. Dep’t of Justice, Office of Att’y Gen., Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996, https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996 (last visited July 23, 2020)
U.S. Dep’t of Justice, Section 230 — Nurturing Innovation or Fostering Unaccountability?: Key Takeaways and Recommendations (June 2020), https://www.justice.gov/file/1286331/download
Recommended publications
  • Preparing for a More Fractured Web: The Central Services Under Siege Built Global-Scale Networks
    Preparing for a More Fractured Web. The central services under siege built global-scale networks. If policy trends hold, what comes next? By Andrea O’Sullivan, The Journal of The James Madison Institute.

    What’s your worry about Silicon Valley? Most everyone has one. For some, it allows impolitic speech to flourish online. If you’re like me, you’re more bothered by the threat of targeted content controls. Or you might fear that some companies have just gotten a little too big. Maybe you dislike the entire ad-based business model supporting much of big tech. Maybe you’re concerned about privacy. It could be a combination of many things. Whatever your persuasion, there is usually some good reason to resist big American technology companies. This essay will not debate the merits or demerits of any particular tech criticism. Readers can find many such commentaries tailored to their own liking elsewhere on the world wide web. Instead, I will discuss how the many forces converging against American technology companies may result in a new web that is less, well, worldwide. What might such an internet look like? We already have a good inkling. Most people have heard of one longstanding internet faultline: the so-called Great Firewall of China.1 Residents of China cannot easily access major parts of the

    These trends herald a future where data localization, which limits how information can be used across borders, is the norm.5 Regulating data means regulating commerce. Although framed as a way to bring tech companies in line, data regulations affect anyone who engages in online commerce: that is to say, almost everyone with a computer and a connection. To a foreign retailer, for instance, data controls might as well be a trade control.6 Then there are content controls.
  • House of Representatives Staff Analysis, Bill #: HB 7013
    HOUSE OF REPRESENTATIVES STAFF ANALYSIS
    BILL #: HB 7013 (PCB COM 21-01) Technology Transparency
    SPONSOR(S): Commerce Committee, Ingoglia
    TIED BILLS: HB 7015   IDEN./SIM. BILLS:
    REFERENCE / ACTION / ANALYST / STAFF DIRECTOR or BUDGET/POLICY CHIEF:
    • Orig. Comm.: Commerce Committee – 16 Y, 8 N – Wright – Hamon
    • 1) Appropriations Committee – 19 Y, 8 N – Topp – Pridgeon
    • 2) Judiciary Committee

    SUMMARY ANALYSIS

    A section of the Federal Communications Decency Act (Section 230) provides immunity from liability for information service providers and social media platforms that, in good faith, remove or restrict from their services information deemed “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” While this immunity has fostered the growth of certain parts of the internet, recently, there have been criticisms of the broad federal immunity provision due to actions taken or not taken regarding the censorship of users by internet platforms. Government regulators have recently investigated and initiated cases against certain platforms for antitrust activities. The bill provides that a social media platform must:
    • publish standards used for determining how to censor, deplatform, and shadow ban users, and apply such standards in a consistent manner.
    • inform each user about any changes to its user rules, terms, and agreements before implementing the changes and may not make changes more than once every 30 days.
    • notify the user within 30 days of censoring or deplatforming.
    • provide a user with information relating to the number of other individuals who were provided or shown the user's content or posts upon request by the user.
  • Reframing the Crypto Wars
    CONTENTS: Introduction; A Highlights Reel of the U.S. Encryption Debate; The Early “Crypto Wars” and the Clipper Chip; The Encryption Debate Sequel: Snowden and Surveillance; Privacy and the Encryption Debate; EARN IT: Encryption Controversy Over Section 230; The Current International Debate Around Encryption; The Impact of International Encryption Policy on Domestic Policy; Security Versus Privacy or Security Versus Security; Challenges to Making Progress on the Encryption Debate; The Increased Availability of User-friendly Encryption Services; A Lack of a Common Language; A Lack of a Whole of Government Approach to Cybersecurity; Disagreement About Backdoor Alternatives; Conclusion; About the Authors.

    REFRAMING THE CRYPTO WARS
    By Kathryn Waldron and Sofia Lesmes
    R Street Policy Study No. 237, July 2021

    INTRODUCTION

    For the past several decades, policymakers, law enforcement, private companies, civil liberties advocates and cybersecurity specialists have been locked in a passionate yet seemingly unending battle over encryption. The debate was seemingly set to be answered by the courts in 2016, when Apple refused to craft new software at the behest of the Federal Bureau of Investigation (FBI) that would allow the agency to access encrypted information on a phone belonging to one of the shooters in a 2015 terrorist attack on Inland Regional Center in San Bernardino, California.

    But the “going dark” encryption debate shares a key characteristic with the walking dead: no matter how many times you try to put it to rest, it keeps coming back. In October 2020, the Department of Justice (DOJ) issued an international statement calling for companies to “[e]nable law enforcement access to content in a readable and usable format where authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight.”2 Both at home and abroad, there are increased government calls to weaken encryption in the name of national security.
  • 2020 Post-Election Outlook: Introduction – A Divided Government Frames the Path Forward
    2020 Post-Election Outlook

    Contents: Introduction – A Divided Government Frames the Path Forward; Lame Duck; First 100 Days; Outlook for the 117th Congress and Biden Administration; 2020 Election Results; Potential Biden Administration Officials; Additional Resources; Key Contacts.

    Introduction – A Divided Government Frames the Path Forward

    Former Vice President Joe Biden has been elected to serve as the 46th President of the United States, crossing the 270 electoral vote threshold on Saturday, November 7, with a victory in Pennsylvania. His running mate, Sen. Kamala Harris (D-CA), will be the first woman, first African-American and first South Asian-American to serve as Vice President. Their historic victory follows an election where a record number of voters cast ballots across a deeply divided country, as reflected in the presidential and closely contested Senate and House races. In the Senate, Republicans are on track to control 50 seats, Democrats will control 48 seats, and the final two Senate seats will be decided
  • Cyber Mobs, Disinformation, and Death Videos: The Internet as It Is (and as It Should Be)
    Michigan Law Review, Volume 118, Issue 6 (2020)

    Cyber Mobs, Disinformation, and Death Videos: The Internet as It Is (and as It Should Be)
    Danielle Keats Citron, Boston University School of Law

    Follow this and additional works at: https://repository.law.umich.edu/mlr. Part of the Entertainment, Arts, and Sports Law Commons, Internet Law Commons, and the Law and Society Commons.

    Recommended Citation: Danielle Keats Citron, Cyber Mobs, Disinformation, and Death Videos: The Internet as It Is (and as It Should Be), 118 MICH. L. REV. 1073 (2020). Available at: https://repository.law.umich.edu/mlr/vol118/iss6/9, https://doi.org/10.36644/mlr.118.6.cyber

    This Review is brought to you for free and open access by the Michigan Law Review at University of Michigan Law School Scholarship Repository. It has been accepted for inclusion in Michigan Law Review by an authorized editor of University of Michigan Law School Scholarship Repository. For more information, please contact [email protected].

    CYBER MOBS, DISINFORMATION, AND DEATH VIDEOS: THE INTERNET AS IT IS (AND AS IT SHOULD BE)
    Danielle Keats Citron*

    SABRINA. By Nick Drnaso. Montreal: Drawn & Quarterly. 2018. Pp. 203. $27.95.

    INTRODUCTION

    Nick Drnaso’s1 graphic novel Sabrina provides a powerful snapshot of online norms. The picture is not pretty: A young woman goes missing. Her grief-stricken boyfriend cannot bear to stay in their home and escapes to a friend’s house. Her sister struggles with the pain of her loss. We learn that the woman’s neighbor, a misogynist loner, killed her and recorded the murder.
  • Free Expression, Harmful Speech and Censorship in a Digital World
    Free Expression, Harmful Speech and Censorship in a Digital World
    #TECHPOLICY

    ABOUT THE SERIES

    The John S. and James L. Knight Foundation’s Trust, Media and Democracy initiative aims to address the decline in trust for journalism and other democratic institutions by examining the causes and supporting solutions. As part of the multidisciplinary initiative launched in 2017, Knight Foundation partnered with Gallup on a research series to better understand Americans’ evolving relationship with the media and to inform solutions to the information challenges of our day. Knight Foundation is also investing in technologists, journalists, academic institutions and others with strong, innovative approaches to improve the flow of accurate information, prevent the spread of misinformation and better inform communities. Knight Foundation believes that democracy thrives when communities are informed and engaged. For more, visit kf.org/tmd.

    COPYRIGHT STANDARDS

    This document contains proprietary research, copyrighted and trademarked materials of Gallup, Inc. Accordingly, international and domestic laws and penalties guaranteeing patent, copyright, trademark and trade secret protection safeguard the ideas, concepts and recommendations related within this document. The materials contained in this document and/or the document itself may be downloaded and/or copied provided that all copies retain the copyright, trademark and any other proprietary notices contained on the materials and/or document. No changes may be made to this document without the express written permission of Gallup, Inc. Any reference whatsoever to this document, in whole or in part, on any web page must provide a link back to the original document in its entirety. Except as expressly provided herein, the transmission of this material shall not be construed to grant a license of any type under any patents, copyright or trademarks owned or controlled by Gallup, Inc.
  • Posting Into the Void
    A community report by Hacking//Hustling
    Danielle Blunt, Emily Coombes, Shanelle Mullin, and Ariel Wolf

    Abstract

    As more sex workers and activists, organizers, and protesters (AOP) move online due to COVID-19, the sex working community and organizing efforts are being disrupted through legislative efforts to increase surveillance and platform liability. Sex worker contributions to movement work are often erased,1 despite the fact that a significant amount of unpaid activism work (specific to sex work or otherwise) is funded by activists’ direct labor in the sex trades. This research aims to gain a better understanding of the ways that platforms’ responses to Section 230 carve-outs impact content moderation, and threaten free speech and human rights for those who trade sex and/or work in movement spaces. In this sex worker-led study, Hacking//Hustling used a participatory action research model to gather quantitative and qualitative data to study the impact of content moderation on sex workers and AOP (n=262) after the uprisings against state-sanctioned police violence and police murder of Black people. The results of our survey indicate that sex workers and AOP have noticed significant changes in content moderation tactics aiding in the disruption of movement work, the flow of capital, and further chilling speech.4 The negative impact of content moderation experienced by people who identified as both sex workers and AOP was significantly compounded.

    Key Words: Sex Work, Prostitution, Content Moderation, Section 230, Tech, Public Health, Platform Policing, Censorship, Community Organizing, Activist, Platform Liability, Free Speech, First Amendment

    1 Roderick, Leonie.
  • Cecily Gao, CPSC 610 – Topics in Computer Science and Law (Spring 2021): Section 230: Past, Present, and Future
    Cecily Gao
    Professor Joan Feigenbaum
    CPSC 610 – Topics in Computer Science and Law
    Spring 2021

    Section 230: Past, Present, and Future

    Introduction

    “The Twenty-Six Words That Created the Internet” is no hyperbole when it comes to Section 230 of the Communications Decency Act (CDA) of 1996.1 By every measure, § 230 is objectively the single most influential piece of Internet legislation ever passed. CDA § 230(c)(1) grants online service providers immunity from publisher liability with the following language:

    No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.2

    CDA § 230(c)(2)(A) also grants safe harbor from civil liability for “Good Samaritan” moderation of third-party and user-generated content:

    No provider or user of an interactive computer service shall be held liable on account of— (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.3

    In recent years, as technology giants, like Google and Facebook, have solidified their foothold in nearly every aspect of American online life and the consequences—both positive and negative—

    1 Kosseff, Jeff. The Twenty-Six Words That Created the Internet. Ithaca; London: Cornell University Press, 2019. Accessed May 10, 2021. http://www.jstor.org/stable/10.7591/j.ctvr7fcrd.
    2 Communications Decency Act of 1996, Pub. L. No. 104-104, 110 Stat.
  • The Contradictions of Platform Regulation
    The Contradictions of Platform Regulation1
    Mark A. Lemley2

    Everyone wants to regulate the big tech companies – Amazon, Apple, Facebook, and Google.3 Companies that were darlings of the media and government a decade ago are now under attack from all sides. Scholars and politicians on all sides of the aisle are proposing to remove their immunity from liability, require them to take certain acts, prevent them from taking others, or even break them up entirely.4 Governments have filed numerous antitrust suits against them,5 and the UK has set up a whole new agency just to regulate tech platforms.6

    1 © 2021 Mark A. Lemley.
    2 William H. Neukom Professor, Stanford Law School; partner, Durie Tangri LLP. Thanks to Eric Goldman, Nik Guggenberger, Rose Hagan, Thomas Kadri, Daphne Keller, Tom Nachbar, Pam Samuelson, Rebecca Tushnet, and Tim Wu for comments on an earlier draft, and to Tyler Robbins for research assistance.
    3 Microsoft and Netflix sometimes make this list, but most of the challenges are directed at the four companies listed in text.
    4 See, e.g., The Editorial Board, Joe Biden Says Age Is Just a Number, THE NEW YORK TIMES (Jan. 17, 2020) (President Biden saying Section 230 should be revoked “immedately”); see also Christopher Mims, Republicans and Democrats Find a Point of Agreement: Big Tech Is Too Powerful, WALL STREET JOURNAL (Jul. 30, 2020). Europe is proposing new rules that would require structural separation of the big tech companies. Natalia Drozdiak, Tech Giants Risk Breakup Under Strict EU Digital Rules, Bloomberg Law News, Dec. 15, 2020.
  • Issue Brief July 2020 How Section 230 Enhances the First Amendment
    Issue Brief, July 2020
    How Section 230 Enhances the First Amendment
    Eric Goldman

    The First Amendment says that Congress “shall make no law…abridging the freedom of speech, or the press.” This constitutional provision sets a floor, not a ceiling, on free speech protection in the United States. Legislatures can, and sometimes do, supplement the First Amendment with “speech-enhancing statutes” that facilitate or enhance free speech.1 In 1996, Congress enacted one of its most important “speech-enhancing statutes” ever, 47 U.S.C. § 230 (Section 230), as part of the Communications Decency Act (CDA). Section 230 sets forward a straightforward policy: it categorically shields Internet services2 from liability for publishing third-party content (also called “user generated content,” or “UGC”), subject to a few statutory exceptions, including intellectual property claims and federal criminal prosecutions.3 Together, Section 230 and the First Amendment have contributed to the Internet’s emergence as one of the most remarkable speech venues in human history. We’ve seen the emergence of valuable UGC services that never existed in the offline world, such as Wikipedia’s crowdsourced encyclopedia, consumer review websites like Yelp, and user-uploaded video sites like YouTube. These UGC services give Internet users an unprecedented ability to express themselves to a global audience. UGC services have also created many private benefits, including new jobs and wealth. Those benefits are in extreme peril as regulators target Section 230 for drastic reform. Four

    1 Some examples of “speech-enhancing statutes” include shield laws that protect reporters from being obligated to disclose their confidential sources; defamation retraction-demand statutes that require plaintiffs to demand (and be denied) a retraction before bringing a defamation lawsuit; and anti-SLAPP laws that expedite the dismissal of lawsuits targeting socially beneficial speech that unhappy critics seek to censor, intimidate, or silence.
  • Understanding Section 230 of the Communications Decency Act
    Understanding Section 230 of the Communications Decency Act
    August 2020

    I. Introduction

    As consumer engagement with digital content has evolved over the last three decades, Section 230 of the Communications Decency Act remains foundational not only to past evolution but future innovation. Section 230’s protections emerged during a time when the judicial system was grappling with how to assign liability in an age where the roles of publisher and distributor were blurring. This paper provides a historical overview of liability for content on the Internet prior to the passage of Section 230, the reason for its adoption by Congress, and the post-230 judicial decisions that are continuing to shape liability for entities that offer user-generated content. It concludes with a look at two bills in Congress that are receiving particular attention, the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 (EARN IT Act of 2020) and the Platform Accountability and Consumer Transparency Act (PACT Act).

    II. Liability for Content Prior to Section 230

    In the early years of online activities, bulletin boards were the primary form of user-generated content and it was the free-wheeling speech available on bulletin boards that attracted users. Two of the most prominent companies of their time, CompuServe and Prodigy, hosted bulletin boards allowing users to interact with communities of shared interests. The two companies, however, took somewhat different approaches to the editorial management of the flow of content across their platforms, and that difference was the basis for inconsistent assignment of liability by the courts in separate cases that were brought against the companies.
  • Section 230: An Overview
    Section 230: An Overview
    Congressional Research Service Report R46751, April 7, 2021
    Valerie C. Brannon, Legislative Attorney; Eric N. Holmes, Legislative Attorney
    https://crsreports.congress.gov

    SUMMARY

    Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996, provides limited federal immunity to providers and users of interactive computer services. The law generally precludes providers and users from being held liable—that is, legally responsible—for information provided by a third party, but does not prevent them from being held legally responsible for information that they have developed or for activities unrelated to third-party content. Courts have interpreted Section 230 to foreclose a wide variety of lawsuits and to preempt laws that would make providers and users liable for third-party content. For example, the law has been applied to protect online service providers like social media companies from lawsuits based on their decisions to transmit or take down user-generated content. Two provisions of Section 230 are the primary framework for this immunity. First, Section 230(c)(1) specifies that service providers and users may not “be treated as the publisher or speaker of any information provided by another information content provider.” In Zeran v. America Online, Inc., an influential case interpreting this provision, a federal appeals court said that Section 230(c)(1) bars “lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.” Second, Section 230(c)(2) states that service providers and users may not be held liable for voluntarily acting in good faith to restrict access to “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” material.