GoDaddy and then Google ban neo-Nazi site Daily Stormer. By Washington Post, adapted by Newsela staff, 08.17.17. Word Count 991. Level 1040L.


Flowers surround a photo of 32-year-old Heather Heyer, who was killed when a car plowed into a crowd of people protesting against the white supremacist Unite the Right rally on August 13, 2017, in Charlottesville, Virginia. Photo by Chip Somodevilla/Getty Images

SAN FRANCISCO, California - The push for Internet companies to remove hateful speech has intensified in the wake of violent clashes in Charlottesville, Virginia, over the weekend. Chaos hit Charlottesville last Saturday during a white supremacist rally. White supremacists and neo-Nazis hold the wrong and hateful belief that white people are better than people of other races. The violent day left three people dead and dozens injured.

On Monday, GoDaddy cut off the neo-Nazi website Daily Stormer. GoDaddy is the largest domain name service, registering names for 71 million websites globally. A domain name is the address you type in when you want to visit a site.

The move is the latest and perhaps the broadest indication of how far technology companies are willing to go in response to outcry that their services are being used to promote racism and white supremacy.

This article is available at 5 reading levels at https://newsela.com.

Drawing The Line Against Hate Speech

Although technology companies have long resisted calls to regulate the content they host, they have been under more pressure recently, and they appear to be bowing to at least some of it.

"This may very well indicate that the sense of responsibility among tech companies is deepening," said Susan Benesch, director of the Dangerous Speech Project, a nonprofit group that researches harmful online content and free speech.
"They are under gigantic pressure to solve this problem, and they are reacting as they never have before."

Freedom of speech is guaranteed by the First Amendment to the U.S. Constitution. Many technology companies are facing a challenge over the extent to which they must uphold freedom of speech. Can an Internet company remove content that is hateful? GoDaddy and other Internet companies have to decide where to strike a balance and where to draw the line.

On Monday, GoDaddy kicked the neo-Nazi website Daily Stormer off its systems, citing company policies that prohibit speech that incites "violence against people." GoDaddy said the move was in response to a Twitter user who called attention to a post by Daily Stormer founder Andrew Anglin. He mocked 32-year-old Heather Heyer, who was killed Saturday in Charlottesville when a man plowed into a crowd with his vehicle. The post attacked Heyer's appearance and used a slur to describe her as a loose woman. "Most people are glad she is dead," Anglin wrote.

The Daily Stormer then transferred its listing to Google, prompting an outcry. Google responded quickly, cutting off the white supremacist site and saying that posts on the website violated Google's policies. A company called Cloudflare continued to service the Daily Stormer.

Censorship Issues Raised

Many people praised GoDaddy's decision, saying it represented a shift by tech corporations to take more responsibility. The praise came from both liberals and conservatives. Liberals want social change, while conservatives prefer the way things are traditionally done.

"It's well past time for platforms that already exercise some discretion to stop pretending they are just dumb pipes that allow all types of garbage to flow through them," said Laurence Tribe, a law professor at Harvard University. Tribe called the move "long overdue."

However, the American Civil Liberties Union (ACLU) has a different view.
It said that people should not be quick to condemn the display of even "the most vile white supremacist speech."

People are relieved when speech they disagree with is removed, but the censorship can come back to bite them when they are on the receiving end, said Lee Rowland, a lawyer with the ACLU. The First Amendment has enabled Americans throughout history to challenge views because "we are able to reveal what people really think and counter it," she added.

GoDaddy's decision comes after other moves to crack down on hateful content. After pressure from activists, the payment processor PayPal and the membership platform Patreon recently canceled accounts for some questionable figures. Google recently apologized to major advertisers after their content appeared on hate and white supremacist sites, and promised to do better. Facebook also has blocked several extreme pages in recent weeks. Airbnb stopped neo-Nazis from using its site to book lodging for their rally in Charlottesville.

Mike Cernovich is a member of the "alt-right," a group that is very socially traditional, and many of its members hold views that are hurtful to women and people of color. Cernovich said social media companies should let people use their platforms as long as they aren't promoting violence. "There are always ideas people have on the left and right (that) are going to offend somebody, but do we really want corporations taking sides?"

Tech Companies Take Responsibility For Their Content

In a statement, GoDaddy spokesman Dan Race said that the web-hosting company generally protects free speech. However, it determined that the Daily Stormer had "crossed the line and encouraged and promoted violence," he said. "We generally do not take action on complaints" that would involve "censorship of content," the statement said. However, if a site crosses over to promoting violence against anyone, "we will take action," it said.
GoDaddy said that it had received complaints about the Daily Stormer before, but that those complaints hadn't warranted action. Yet as recently as July, the site had promised to "track down" the families of CNN employees. The Daily Stormer also has posted the names and contact information of a Jewish family in Montana, describing the 12-year-old son with disrespectful words.

In a statement, Cloudflare said it was "aware of the concerns" about some websites, but would not comment on specific sites.

Technology companies are protected by laws that shield them from responsibility for content that appears on their platforms. As they move further into policing speech and expression, technology companies could be accused of being inconsistent. They are blocking some sites in response to public pressure, but thousands of others remain.

Leaders at Twitter and Facebook have said that they do not want to be "arbiters of truth." In other words, they do not want to make determinations about what people see or don't see on their sites.
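The article notes in passing that a domain name is just the human-readable address a registrar such as GoDaddy manages. As a rough illustration of that idea, the sketch below extracts the registrable portion of a URL. The helper name and the two-label heuristic are illustrative assumptions, not anything from the article; real code would consult the Public Suffix List.

```python
from urllib.parse import urlparse

def registered_domain(url: str) -> str:
    """Rough guess at the registered domain for a URL.

    Illustrative only: suffixes like .co.uk span more than one
    label, so production code should use the Public Suffix List.
    """
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    # A registrar (e.g., GoDaddy) manages names at the level of the
    # last two labels in this simplified model.
    return ".".join(labels[-2:]) if len(labels) >= 2 else host

print(registered_domain("https://www.example.com/some/page"))  # example.com
```

Everything left of the registered domain (like "www") is a subdomain the site owner controls; everything right of it is the top-level domain the registry controls.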
Recommended publications
  • S.C.R.A.M. Gazette
    MARCH EDITION, 2017, Volume 4, Issue 3. Chief of Police Dennis J. McEnerney.

    FBI, Sheriff's Office warn of scam artists who take aim at lonely hearts

    DOWNTOWN — The FBI is warning of "romance scams" that can cost victims thousands of dollars for Valentine's Day weekend. A romance scam typically starts on a dating or social media website, said FBI spokesman Garrett Croon. A victim will talk to someone online for weeks or months and develop a relationship with them, and the other person sometimes even sends gifts like flowers. The victim and the other person are never able to meet, with the scammer saying they live or work out of the country or canceling plans to meet.

    Victims "report that they have sent thousands of dollars to someone they met online," Croon said. "And [they] have never even met that person, thinking they were in a relationship with that person."

    If you meet someone online and it seems "too good to be true" and every effort you make to meet that person fails, "watch out," Croon warned. Scammers might send photos from magazines and claim the photo is of them, say they're in love with the victim, or claim to be unable to meet because they're a U.S. citizen who is traveling or working out of the country, Croon said. Victims can be bilked for hundreds or thousands of dollars this way, and Croon said the most common victims are women 40 to 60 years old who might be widowed or divorced.
  • Return to Normalcy: False Flags and the Decline of International Hacktivism
    CYBER THREAT ANALYSIS
    Return to Normalcy: False Flags and the Decline of International Hacktivism
    By Insikt Group® | CTA-2019-0821

    Groups with the trappings of hacktivism have recently dumped Russian and Iranian state security organization records online, although neither have proclaimed themselves to be hacktivists. In addition, hacktivism has taken a back seat in news reporting, and general mentions seem to be in decline. Insikt Group utilized the Recorded Future® Platform and reports of historical hacktivism events to analyze the shifting targets and players in the hacktivism space. The target audience of this research includes security practitioners whose enterprises may be targets for hacktivism.

    Executive Summary

    Hacktivism often brings to mind a loose collective of individuals globally that band together to achieve a common goal. However, Insikt Group research demonstrates that this is a misleading assumption; the hacktivist landscape has consistently included actors reacting to regional events, and has also involved states operating under the guise of hacktivism to achieve geopolitical goals. In the last 10 years, the number of large-scale, international hacking operations most commonly associated with hacktivism has risen astronomically, only to fall off just as dramatically after 2015 and 2016. This constitutes a return to normalcy, in which hacktivist groups are usually small sets of regional actors targeting specific organizations to protest regional events, or nation-state groups operating under the guise of hacktivism. Attack vectors used by hacktivist groups have remained largely consistent from 2010 to 2019, and tooling has assisted actors to conduct larger-scale attacks. However, company defenses have also become significantly better in the last decade, which has likely contributed to the decline in successful hacktivist operations.
  • How Indirect Intermediaries Shape Online Speech
    Anna Mitchell, Professor Nate Persily, INTLPOL 323, December 16, 2018

    Hidden Censors: How Indirect Intermediaries Shape Online Speech

    1. Introduction

    "Literally, I woke up in a bad mood and decided someone shouldn't be allowed on the internet. No one should have that power." After a whirlwind five days of social media protests and internal tension, Matthew Prince, the CEO of Cloudflare, an Internet-services company, decided to terminate its contract with The Daily Stormer. The Internet's most popular neo-Nazi website had vanished from the Web.1

    Cloudflare's content distribution network (CDN), improving the Stormer's performance and protecting it from distributed denial of service (DDoS) attacks, had allowed the site to stay afloat for months or years. But that changed in 2017 during a period of tense racial relations in the United States. From August 11 to 12, the Daily Stormer helped organize the white supremacist Unite the Right rally in Charlottesville, Virginia. Soon after, it published articles calling murdered protestor Heather Heyer a "sloppy lard ass" and labeling her death an accidental heart attack.2

    Reaction was swift. The next day, activist Amy Suskind tagged web hosting service GoDaddy in a Twitter post: "@GoDaddy you host The Daily Stormer - they posted this on their site. Please retweet if you think this hate should be taken down & banned."3 Within 24 hours, GoDaddy responded by withdrawing hosting services. When the Stormer tried to switch to Google, it soon followed suit.4 Zoho refused to provide email service,5 and Twitter shut down associated accounts.6 With multiple major companies withdrawing services, it became increasingly difficult for the Stormer to host and propagate content.
  • Platform Justice: Content Moderation at an Inflection Point
    A HOOVER INSTITUTION ESSAY
    Platform Justice: Content Moderation at an Inflection Point
    Danielle Citron & Quinta Jurecic. Aegis Series Paper No. 1811.

    In June 2016, Facebook Vice President Andrew Bosworth circulated an internal memorandum describing the company's mission. "We connect people," he wrote. "Maybe someone finds love. Maybe [Facebook] even saves the life of someone on the brink of suicide. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools."1

    Bosworth was not speaking metaphorically. The month after he distributed the memo, Facebook was sued by the parents of five victims of terrorist attacks conducted by Hamas, which had made extensive use of the company's platform to recruit and organize. Days before, the social media platform had inadvertently broadcast a murder over the streaming app Facebook Live.

    Yet the outrage Facebook weathered in the aftermath of this violence was nothing compared to the scathing criticism it endured two years later after Buzzfeed News published Bosworth's memo. When the memo leaked in March 2018, Facebook was neck-deep in controversies over its influence on the 2016 presidential election, including the use of the platform's ad-targeting function by Russian trolls to sow political discord and the harvesting of user data under false pretenses by the Trump-affiliated political consultancy Cambridge Analytica.2 Meanwhile, Facebook's competitors, Twitter and Google, faced mounting criticism over their failure to curb disinformation and abusive content.3 In 2016, Bosworth's memo might have been an awkward attempt at self-reflection.
  • Ethical Hacking
    Ethical Hacking. Alana Maurushat. University of Ottawa Press, 2019.

    The University of Ottawa Press (UOP) is proud to be the oldest of the francophone university presses in Canada and the only bilingual university publisher in North America. Since 1936, UOP has been "enriching intellectual and cultural discourse" by producing peer-reviewed and award-winning books in the humanities and social sciences, in French or in English.

    Library and Archives Canada Cataloguing in Publication
    Title: Ethical hacking / Alana Maurushat.
    Names: Maurushat, Alana, author.
    Description: Includes bibliographical references.
    Identifiers: Canadiana (print) 20190087447 | Canadiana (ebook) 2019008748X | ISBN 9780776627915 (softcover) | ISBN 9780776627922 (PDF) | ISBN 9780776627939 (EPUB) | ISBN 9780776627946 (Kindle)
    Subjects: LCSH: Hacking—Moral and ethical aspects—Case studies. | LCGFT: Case studies.
    Classification: LCC HV6773 .M38 2019 | DDC 364.16/8—dc23
    Legal Deposit: First Quarter 2019, Library and Archives Canada

    © Alana Maurushat, 2019, under Creative Commons License Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) https://creativecommons.org/licenses/by-nc-sa/4.0/

    Printed and bound in Canada by Gauvin Press. Copy editing: Robbie McCaw. Proofreading: Robert Ferguson. Typesetting: CS. Cover design: Édiscript enr. and Elizabeth Schwaiger. Cover image: Fragmented Memory by Phillip David Stearns, n.d., Personal Data, Software, Jacquard Woven Cotton. Image © Phillip David Stearns, reproduced with kind permission from the artist.

    The University of Ottawa Press gratefully acknowledges the support extended to its publishing list by Canadian Heritage through the Canada Book Fund, by the Canada Council for the Arts, by the Ontario Arts Council, by the Federation for the Humanities and Social Sciences through the Awards to Scholarly Publications Program, and by the University of Ottawa.
  • Section 230 and the Twitter Presidency
    Copyright 2020 by Michael A. Cheah. Vol. 115, Northwestern University Law Review.

    SECTION 230 AND THE TWITTER PRESIDENCY
    Michael A. Cheah

    ABSTRACT—In response to Twitter's decision to label one of the President's tweets misleading, the Trump White House issued an executive order to limit the scope of Section 230 of the Communications Decency Act via agency rulemaking. In the Order, Trump calls for the Federal Communications Commission (FCC) to "interpret" Section 230 in a manner that curtails websites' ability to remove and restrict user speech. This Essay analyzes the Order and concludes that the President's effort to limit Section 230 will fail. First, the FCC does not have rulemaking authority to issue the proposed rules. Second, the proposed rules cannot be issued because they are inconsistent with the statute. Finally, this Essay will discuss the policy implications of the proposed rules and argue that they would lead to less speech and engagement on the Internet, not more of it.

    AUTHOR—General Counsel of Vimeo, Inc. and adjunct faculty at the University of Miami School of Law. A big thank you to my colleague Erika Barros Sierra Cordera at Vimeo for help in researching and editing, Professor Caroline Mala Corbin at the University of Miami School of Law for her excellent feedback, and the Northwestern University Law Review for their diligent and timely work in shepherding this Essay to publication.

    INTRODUCTION ... 193
    I. SECTION 230 OF THE COMMUNICATIONS DECENCY ACT ... 196
    A. The Genesis of Section 230 ...
  • The Terrifying Power of Internet Censors
    https://nyti.ms/2xYIeCu
    Opinion | Op-Ed Contributor
    The Terrifying Power of Internet Censors
    By KATE KLONICK, SEPT. 13, 2017

    After the white-nationalist rally in Charlottesville, Va., last month where a man drove a car into a crowd, killing a counter-demonstrator, the American neo-Nazi website The Daily Stormer published a long, hate-riddled post mocking the victim. Outcry over the article led its domain registrar, GoDaddy, to end The Daily Stormer's service. The site then registered with Google, which also quickly canceled its hosting. But it wasn't until Cloudflare, a website security and performance service, dropped the site as a client that The Daily Stormer truly lost its ability to stay online. Because of the precise nature of Cloudflare's business, and the scarcity of competitors, its role censoring internet speech is not just new, it's terrifying.

    What makes Cloudflare an essential part of the internet is its ability to block malicious traffic from barraging clients' websites with requests that take them offline. Cloudflare is one of the few companies in the world that provide this kind of reliable protection. If you don't want your website to get taken down by extortionists, jokers, political opposition or hackers, you have to hire Cloudflare or one of its very few competitors.

    Generally speaking, there are two kinds of corporate players on the internet: companies that build infrastructure through which content flows, and companies that seek to curate content and create a community. Internet service providers like Verizon and Comcast, domain name servers, web hosts and security services providers like Cloudflare are all the former — or the "pipe." They typically don't look at the content their clients and customers are putting up, they just give them the means to do it and let it flow.
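The excerpt above describes Cloudflare's core service as blocking malicious traffic before it overwhelms a site. One classic building block for that kind of filtering is a per-client token bucket; the sketch below is a generic illustration of the technique, under the stated assumption that it resembles nothing of Cloudflare's actual design.

```python
import time

class TokenBucket:
    """Per-client request budget: a steady refill rate plus a burst cap."""

    def __init__(self, rate: float, burst: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = burst         # maximum burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill for the elapsed time, never exceeding capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                  # over budget: drop or challenge the request

# A flood of 100 back-to-back requests from one client: only about the
# burst allowance gets through, while slower clients are unaffected.
bucket = TokenBucket(rate=5.0, burst=10)
allowed = sum(bucket.allow() for _ in range(100))
```

A real mitigation layer would key buckets by client IP or fingerprint and combine them with many other signals; the point here is only that "blocking malicious traffic" reduces to enforcing a request budget per client.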
  • Reddit Quarantined: Can Changing Platform Affordances Reduce Hateful Material Online?
    Volume 9 | Issue 4

    Reddit quarantined: can changing platform affordances reduce hateful material online?
    Simon Copland, Australian National University, [email protected]
    DOI: https://doi.org/10.14763/2020.4.1516
    Published: 21 October 2020. Received: 20 May 2020. Accepted: 6 August 2020.
    Competing Interests: The author has declared that no competing interests exist that have influenced the text.
    Licence: This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 License (Germany), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. https://creativecommons.org/licenses/by/3.0/de/deed.en Copyright remains with the author(s).
    Citation: Copland, S. (2020). Reddit quarantined: can changing platform affordances reduce hateful material online? Internet Policy Review, 9(4). DOI: 10.14763/2020.4.1516
    Keywords: Reddit, Affordances, Misogyny, Manosphere, Digital platforms, Online platforms

    Abstract: This paper studies the efficacy of Reddit's quarantine, increasingly implemented in the platform as a means of restricting and reducing misogynistic and other hateful material. Using the case studies of r/TheRedPill and r/Braincels, the paper argues the quarantine successfully cordoned off affected subreddits and associated hateful material from the rest of the platform. It did not, however, reduce the levels of hateful material within the affected spaces. Instead many users reacted by leaving Reddit for less regulated spaces, with Reddit making this hateful material someone else's problem. The paper argues therefore that the success of the quarantine as a policy response is mixed.

    This paper is part of Trust in the system, a special issue of Internet Policy Review guest-edited by Péter Mezei and Andreea Verteş-Olteanu.
  • Chapter 2: The Hacker Group
    Chapter 2: The hacker group
    R. Gevers

    CHAPTER OUTLINE
    Introduction 15
    The hacker as social reformer 16
    The hacker as combatant 17
    The hacker group 19
    Disciplines 20
    Conclusions 39

    INTRODUCTION

    Since the information revolution the Internet has been a driving force behind many—if not most—social reforms. From the 1% marches to the Arab Spring: the Internet was used to fuel, coordinate, and facilitate protests. The Internet turned out to be a safe haven for liberal thinkers and was used to establish contacts with other like-minded individuals at the other end of the globe. The global nature of the Internet makes (targeted) communication accessible to anyone. This was at the core of many great revelations: WikiLeaks being the first, The Intercept and Edward Snowden following quickly.

    In the early days the Internet was a safe haven for free thinkers; there was no censorship and no laws were directly applicable. This opened up opportunities on the Internet to influence governments and their laws. However, this situation has changed: the Internet has become securitized and militarized. Whereas the Internet used to be a place aimed at free and unhindered flow of information and ideas, now it is increasingly influenced by State actors and large non-State actors. Whereas any individual could tread onto the Internet and fight for a cause, nowadays you need to tread carefully.

    Cyber Guerilla, Copyright © 2016 Elsevier Inc. All rights reserved.

    Chapter 1 has described the essence of cyber guerilla strategy, tactics, and the concepts of favorable and unfavorable terrain.
  • Free Speech Concerns As Extreme-Right Evicted from Web 21 August 2017, by Paul Handley
    Free speech concerns as extreme-right evicted from web
    21 August 2017, by Paul Handley

    A sweeping crackdown by US internet and social media companies on neo-Nazi and white supremacist material has sparked warnings in America that the web's grand promise of free speech is on the rocks. Over the past week, Vanguard America, Daily Stormer and other such ultra-right racist groups and their members, known for extremely violent and offensive postings and websites, were essentially scrubbed from the public web.

    ... after they moved. They were blocked a third time by another web host, after reopening with an ostensibly safe Russian domain name. Then Cloudflare, which provides an essential security service to millions of web hosts and sites, also said it would block Daily Stormer. Others found their Facebook and Instagram accounts frozen. Google cut the app for social media site Gab, a favorite venue for far-right groups. And in one of the more ignominious moments, white supremacist Chris Cantwell was booted off dating site OkCupid on Thursday.

    "At OkCupid, we take the truth of everyone's inalienable rights very seriously," said chief executive Elie Seidman. However, Seidman said, "the privilege of being in the OkCupid community does not extend to Nazis and supremacists."

    Free speech in question

    But such moves raise the question: should the private companies that control most web services have the power to make such decisions? Are the internet and social media services now such an indelible part of our daily lives that people...
  • Cloudflare-10-K-2019.Pdf
    UNITED STATES SECURITIES AND EXCHANGE COMMISSION
    Washington, D.C. 20549

    FORM 10-K

    (Mark One)
    ☒ ANNUAL REPORT PURSUANT TO SECTION 13 OR 15(d) OF THE SECURITIES EXCHANGE ACT OF 1934. For the fiscal year ended December 31, 2019, or
    ☐ TRANSITION REPORT PURSUANT TO SECTION 13 OR 15(d) OF THE SECURITIES EXCHANGE ACT OF 1934. For the transition period from to

    Commission file number: 001-39039

    Cloudflare, Inc.
    (Exact name of registrant as specified in its charter)

    Delaware (State or other jurisdiction of incorporation or organization)
    27-0805829 (I.R.S. Employer Identification Number)
    101 Townsend Street, San Francisco, California 94107 (Address of principal executive offices and zip code)
    (888) 993-5273 (Registrant's telephone number, including area code)

    Securities registered pursuant to Section 12(b) of the Act:
    Title of Each Class: Class A Common Stock, $0.001 par value | Trading Symbol: NET | Name of Each Exchange on Which Registered: The New York Stock Exchange

    Indicate by check mark if the registrant is a well-known seasoned issuer, as defined in Rule 405 of the Securities Act. Yes ☐ No ☒
    Indicate by check mark if the registrant is not required to file reports pursuant to Section 13 or Section 15(d) of the Act. Yes ☐ No ☒
    Indicate by check mark whether the registrant (1) has filed all reports required to be filed by Section 13 or 15(d) of the Securities Exchange Act of 1934 during the preceding 12 months (or for such shorter period that the registrant was required to file such reports), and (2) has been subject to such filing requirements for the past 90 days.
  • Regulating Online Content Moderation (Georgetown Law Journal)
    Regulating Online Content Moderation
    KYLE LANGVARDT*

    The Supreme Court held in 2017 that "the vast democratic forums of the Internet in general, and social media in particular," are "the most important places ... for the exchange of views." Yet within these forums, speakers are subject to the closest and swiftest regime of censorship the world has ever known. This censorship comes not from the government, but from a small number of private corporations—Facebook, Twitter, Google—and a vast corps of human and algorithmic content moderators. The content moderators' work is indispensable; without it, social media users would drown in spam and disturbing imagery. At the same time, content moderation practices correspond only loosely to First Amendment values. Leaked internal training manuals from Facebook reveal content moderation practices that are rushed, ad hoc, and at times incoherent.

    The time has come to consider legislation that would guarantee meaningful speech rights in online spaces. This Article evaluates a range of possible approaches to the problem. These include (1) an administrative monitoring-and-compliance regime to ensure that content moderation policies hew closely to First Amendment principles; (2) a "personal accountability" regime handing control of content moderation over to users; and (3) a relatively simple requirement that companies disclose their moderation policies. Each carries serious pitfalls, but none is as dangerous as option (4): continuing to entrust online speech rights to the private sector.

    TABLE OF CONTENTS
    INTRODUCTION ... 1354
    I. THE DILEMMA OF THE MODERATORS ... 1358
    II. MANDATORY LIMITS ON CONTENT MODERATION ... 1363
    A. FIRST AMENDMENT OBJECTIONS TO LIMITS ON CONTENT MODERATION ...