A Pluralist Approach to Democratizing Online Discourse

Jay Chen, International Computer Science Institute
Barath Raghavan, University of Southern California
Paul Schmitt, Princeton
Tai Liu, NYU Abu Dhabi

arXiv:2108.12573v1 [cs.NI] 28 Aug 2021

ABSTRACT

Online discourse takes place in corporate-controlled spaces thought by users to be public realms. These platforms in name enable free speech but in practice implement varying degrees of censorship, either by government edict or by uneven and unseen corporate policy. This kind of censorship has no countervailing accountability mechanism, and as such platform owners, moderators, and algorithms shape public discourse without recourse or transparency.

Systems research has explored approaches to decentralizing or democratizing Internet infrastructure for decades. In parallel, the Internet censorship literature is replete with efforts to measure and overcome online censorship. However, in the course of designing specialized open-source platforms and tools, projects generally neglect the needs of supportive but uninvolved 'average' users. In this paper, we propose a pluralistic approach to democratizing online discourse that considers both the systems-related and user-facing issues as first-order design goals.

1 INTRODUCTION

"The best test of truth is the power of the thought to get itself accepted in the competition of the market."
Oliver Wendell Holmes

"In order to maintain a tolerant society, the society must be intolerant of intolerance."
Karl Popper

"We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth. We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity. Your legal concepts of property, expression, identity, movement, and context do not apply to us. They are all based on matter, and there is no matter here."
John Perry Barlow

Freedom of speech is key to democracy. A free and open Internet is key to modern global society. Yet how does one reconcile the perspectives above, which speak to fundamental issues and are at once in agreement and at odds? Throughout the 20th century the greatest threat to free speech was that of government censorship. This threat persists under autocracies around the world (e.g., China, Iran), but it has fused with corporate censorship, with autocracies using corporate partners to implement censorship and corporate moderators unevenly removing unprofitable or unpalatable speech.

The character of discourse online is not only a matter of policy or culture, but also one of systems: the systems that provide or prevent the exercise of free speech. The wide reach and accessibility of the early Internet enabled flourishing free speech, but centralization into a handful of platforms (e.g., Facebook, Twitter, Weibo) has surrendered control over online speech. Social media platforms provide users a 'free' service with convenience, usability, and powerful network effects. Users treat these platforms as public realms, but they are private spaces that can unilaterally decide the content they serve [19, 24]. These corporations use one-sided terms of service to justify the removal of content and to provide arbitrary exemptions, typically without explanation [33, 55, 57]. This lack of transparency and accountability raises the question: why should corporations be responsible, de jure (e.g., in China) or de facto (e.g., in the US), for adjudicating speech at all?

In the US, Section 230 of the Communications Decency Act [50] effectively shields online platforms from responsibility for content posted by a third party. However, because social media platforms have elected to apply their own moderation policies to restrict speech (e.g., harassment or hate speech), they have inserted themselves into a difficult position. Recent disputes regarding how to handle controversial political speech on such platforms have resulted in politicians simultaneously criticizing platforms for doing both too little and too much policing [26]. Indeed, recent responses by Facebook and Twitter have differed significantly [12, 27], which further demonstrates how ill-equipped corporations are at arbitrating such matters [38]. At the time of writing, proposals to weaken Section 230 further complicate the problem: if such amendments are made, corporations would have to assume both the responsibility of policing content and the risk of legal liability [2].

Legal systems have traditionally recognized that freedom of speech should be limited when it conflicts with other freedoms [17, 32, 37, 45, 51]. Spam, scams, and other abuses are typically removed because they are nearly universally unwelcome. However, moderation on political or ideological grounds is notoriously difficult because there is so much content to filter and because different people and communities have their own norms and expectations [54, 57]. Today, platform operators generally delegate moderation to individuals who manage the problem at smaller scales or within particular communities. For example, Facebook has approximately 15,000 community managers in the US who review flagged content [54]. While moderators can let their personal views bias their decisions [42], the bigger problem when it comes to political speech is that companies can unevenly apply their own policies in favor of profitable content [43].

In response, some have advocated the creation of so-called free-speech platforms such as Gab, 4chan, and 8chan, which seem to always yield high concentrations of hate speech and are notorious for their racism, misogyny, and homophobia [58]. These and other recent historical examples suggest that moderation is critical for preserving an environment that is not filled with large amounts of gore and hate speech [54].

In this spectrum of options we have seen the promise and pitfalls of the approaches that one might consider aligned with Holmes (e.g., free discourse with an expectation of high-minded debate), Popper (e.g., uneven regulation, either by governments or corporations, to ensure preservation of a relatively open sphere), and Barlow (e.g., a free-for-all). Our observation is that there is a need to balance free speech with moderation, and so we introduce another option:

"[D]emocracy is the worst form of Government except for all those other forms that have been tried from time to time."
Winston Churchill

This democratic political perspective lays the philosophical foundation for our paper and is aligned with notions of free expression and the marketplace of ideas common in Western democracies. It also recognizes that a truly free Internet allows for the possibility of undesirable pockets of content, but that they are not desired by the

2 GOALS

The early Internet consisted of many autonomous providers with little centralized control of services or infrastructure. Today's Internet, while physically distributed, consists of a few popular services whose authority is held in the hands of corporations. As observed by Liu et al. [34]:

"Changes in the Internet's structure have taken place along two orthogonal axes over the past few decades, yet they are often conflated. The first axis concerns physical distribution, centralized vs. decentralized: whether the physical resources being accessed for some service are located at a single machine (at one extreme) or dispersed across many machines all over the planet (at the other). The second axis concerns control: whether the control over the service and the machines providing a service is held by a few versus spread across many individuals or organizations. The Internet of today is quite different from that of a few decades past along both axes: it has gone from partially centralized and democratic to distributed and feudal."

In this paper we are explicitly focused on the decentralization of control rather than that of physical resources, and we use the term "democratization" interchangeably with that notion of decentralization. Numerous systems have been developed in academia, industry, and as open-source projects in attempts to re-democratize the Internet [1, 5, 6, 15, 18, 22, 25, 35, 36, 39, 40, 46-48, 52, 59]. The core problems that these projects tackle predominantly concern basic Internet infrastructure: naming, communication, data storage, serverless web applications, and digital currencies. Rather than focusing purely on systems goals such as scalability, security, and efficiency, we include two meta-level goals:

Democratize control over content: Because today's social media companies own the infrastructure, they possess inordinate power over online content. They control what content may be posted, whether it remains stored, how it is discovered by users, whether it is presented to users, and how it is presented to users. Democratizing control over content (i.e., spreading out power) counteracts digital authoritarianism by definition. This goal is not new and broadly mirrors the motivations behind Tim Berners-Lee's Solid [52] project and many others in the context of re-decentralizing the Web. However, we add a less explored corollary: considering how to prevent re-centralization.

Preserve moderated user experience: Online media companies like Facebook hire armies of moderators to filter out
