July 28, 2020

The Honorable John Thune
Chairman, Subcommittee on Communications, Technology, Innovation, and the Internet
Senate Committee on Commerce, Science, and Transportation
512 Dirksen Senate Office Building
Washington, DC 20510

The Honorable Brian Schatz
Ranking Member, Subcommittee on Communications, Technology, Innovation, and the Internet
Senate Committee on Commerce, Science, and Transportation
512 Dirksen Senate Office Building
Washington, DC 20510

The Honorable Roger Wicker
Chairman, Senate Committee on Commerce, Science, and Transportation
512 Dirksen Senate Office Building
Washington, DC 20510

The Honorable Maria Cantwell
Ranking Member, Senate Committee on Commerce, Science, and Transportation
512 Dirksen Senate Office Building
Washington, DC 20510

Dear Senators Thune, Schatz, Wicker, and Cantwell:

New America’s Open Technology Institute (OTI) appreciates the opportunity to submit a statement for the record for the hearing entitled “The PACT Act and Section 230: The Impact of the Law that Helped Create the Internet and an Examination of Proposed Reforms for Today’s Online World,” being held by the Subcommittee on Communications, Technology, Innovation, and the Internet. OTI works at the intersection of technology and policy to ensure that every community has equitable access to digital technologies that are open and secure, and to their benefits. We support and defend the right to privacy and freedom of expression, and press internet platforms to provide greater transparency and accountability around their operations, technologies, and impacts.

In recent weeks, there have been a number of proposals seeking to amend Section 230 of the Communications Decency Act, the law that provides that platforms cannot be held liable for user-generated content and that also provides a safe harbor for companies to choose to moderate content and make decisions about what content to permit on their sites. Historically, Section 230 has played a critical role in safeguarding free expression for users, and the application and scope of the law have relied on judicial interpretation, given the fast-paced evolution of digital platforms since the law’s enactment. Most of the proposed bills introduced to date would eviscerate the free expression protections provided by Section 230. If Congress seeks to amend Section 230, it is critical that any modifications to the law be precise enough to accommodate an ever-changing digital environment, and carefully crafted to ensure that Section 230 continues to provide necessary protections for free expression online. Any amendments to Section 230 will affect not only content moderation practices, but also protections around online expression.

As policymakers and stakeholders seek to find the balance between supporting innovation and holding platforms accountable, the Platform Accountability and Consumer Transparency (PACT) Act is a significant improvement over the other Section 230 bills. We commend the bill’s authors on seeking to codify important consumer protections in the digital space. OTI supports a number of provisions in the bill, as well as the intentions behind the drafting, but believes that other provisions should be amended to ensure adequate safeguards for free expression.
Namely, while the bill includes strong transparency requirements, the carve-out from Section 230 is too broad to maintain adequate safeguards for user expression online, and more consideration is needed around the potential consequences of the bill’s procedural requirements and the potential implications of empowering federal agency actors to enforce civil actions.

The PACT Act contains important transparency and accountability measures.

OTI supports the bill’s intent to require greater transparency and accountability around content moderation practices from internet platforms. OTI has been a longstanding advocate for transparency and accountability in this regard. In October 2018, we published a Transparency Reporting Toolkit focused on content takedowns, which evaluates how 35 global internet and telecommunications companies currently report on content takedown efforts through their transparency reports, and makes recommendations on how these reports can be made more meaningful and granular.1 In addition, OTI is one of the original authors of the Santa Clara Principles on Transparency and Accountability in Content Moderation, which outline minimum levels of transparency and accountability that internet platforms should meet.2 Finally, OTI has published four reports that outline how internet platforms can promote greater fairness, accountability, and transparency around the use of algorithmic decision-making in content moderation and content curation processes.3

1 Spandana Singh and Kevin Bankston, The Transparency Reporting Toolkit: Content Takedown Reporting, https://www.newamerica.org/oti/reports/transparency-reporting-toolkit-content-takedown-reporting/.
2 “The Santa Clara Principles on Transparency and Accountability in Content Moderation,” The Santa Clara Principles, last modified May 7, 2018, https://santaclaraprinciples.org/.
3 Spandana Singh, “Holding Platforms Accountable: Online Speech in the Age of Algorithms,” New America’s Open Technology Institute, https://www.newamerica.org/oti/reports/report-series-content-shaping-modern-era/.

OTI welcomes the transparency provisions in the bill that would require greater accountability around companies’ content moderation efforts. In particular, we support the provisions that would require platforms to publish detailed and consistent transparency reports outlining the impact of processes such as Terms of Service-based content moderation and appeals. These provisions are in line with some of the foundational recommendations OTI has put forth in our past work.

At the same time, we believe that the bill could do more to require transparency and accountability from platforms. In particular, the bill should require platforms to report more granular data around how much content and how many accounts individuals and entities such as Trusted Flaggers and Internet Referral Units have flagged to the company. Further, the bill should require that when reporting on content moderation enforcement actions in their transparency reports, platforms, at a minimum, disaggregate data related to content removals from data related to content curation processes such as labeling and deprioritizing. There is currently little transparency around how internet platforms use these algorithmic curation practices and what impact they have on online expression.
The bill’s Section 230 carve-out is too broad to ensure adequate user protections, lacks critical details around court-ordered content and appeals, and raises potential user data privacy concerns.

Section 6 of the bill requires interactive computer services to take down “illegal content or illegal activity” within 24 hours of receiving notice, and to act on “potentially policy-violating content” within 14 days of receiving notice. Under the bill’s procedural regime, content or activity is deemed illegal by a federal or state court order, and one of the elements of deeming the service to have received notice is that the service has received a copy of such an order. For content or activity that is potentially policy-violating, the service is deemed to receive notice through any mechanism it establishes, including the mandated creation of a call center where representatives can take in complaints. Additionally, the bill creates an exception for small business providers.

While the intention behind establishing strict timelines and a carve-out for smaller competitors in the digital space is admirable, the carve-out from Section 230’s protections is too broad to ensure safeguards for free expression online. The strict response timelines for interactive computer services, and especially the bill’s requirement that companies remove illegal content within 24 hours of receiving notice, would incentivize platforms to over-censor user content. Such time-bound restrictions have been implemented in other legislation around the world, such as Germany’s Network Enforcement Act, which also imposes a 24-hour deadline for removing such content. As we have outlined before, these forms of regulation place undue pressure on companies to remove content quickly or face liability, thereby creating strong incentives for them to err on the side of broad censorship.4

The bill’s exception, which excuses a service from providing notice or an opportunity to appeal to the offending content provider if the service “is unable to contact the information content provider after taking reasonable steps to do so,” is also troublesome. Impacted users must always have access to both meaningful notice and a robust appeals process. Typically, when a company offers notice and appeals, it is through a built-in system or via email when a user’s account has been suspended. The exception raises questions around what happens when a user does not respond to a notice of moderation from a company within a particular time frame. This is concerning both because it could strip users of the right to appeal, an important platform-established remedy mechanism, and because “reasonable steps” is never clearly defined.

Further, although we appreciate the bill’s notice and takedown approach and its statement that platforms are not required to actively monitor all content on all their services, Section 6 is not narrowly tailored so as to maintain safeguards for free expression. While
