
THE AUTOMATED PUBLIC SPHERE

By Frank Pasquale

Table of Contents

Pathologies of the Automated Public Sphere. By the Editors

The Automated Public Sphere. By Frank Pasquale

The Long Shadow of Intermediary Irresponsibility

Toward a Robust Regime of Intermediary Responsibility

Concerns and Concessions

Conclusion: A Return to Professionalism

References

Published by the Rosa Luxemburg Stiftung, New York Office, December 2017

Editors: Stefanie Ehmsen and Albert Scharenberg
Address: 275 Madison Avenue, Suite 2114, New York, NY 10016
Email: [email protected]
Phone: +1 (917) 409-1040

With support from the German Foreign Office.

The Rosa Luxemburg Foundation is an internationally operating, progressive non-profit institution for civic education. In cooperation with many organizations around the globe, it works on democratic and social participation, empowerment of disadvantaged groups, alternatives for economic and social development, and peaceful conflict resolution.

The New York Office serves two major tasks: to work around issues concerning the United Nations and to engage in dialogue with North American progressives in universities, unions, social movements, and politics.

www.rosalux-nyc.org

Pathologies of the Automated Public Sphere

The ubiquity of the internet today—namely of networks and large search engines—has complicated the ways in which content is produced and received, deeply altering how society thinks about the rights to free speech, freedom of the press, and freedom of expression. Public speech can now easily be spread worldwide while remaining anonymous. Nazi-related content liable to prosecution in Europe is safely hosted on US servers where it is protected by the First Amendment. On top of this, our access to content has become completely mediated by algorithms designed to maximize profits. This transformation in the production, distribution, and consumption of content has become the inexorable backdrop of contemporary debates on the basic right to freedom of speech.

As automation emerges as a problem affecting all spheres of production, we are increasingly confronted with its ethical implications. This is especially true for the discussions spurred by the new digital contexts that shape public opinion. The automation of decision-making processes put in motion by digital platforms in sensitive areas such as editing, moderating, advertising, and circulating information is at the source of many controversies. When the ways in which information and opinion are produced and disseminated become open to manipulation, we are forced to deal with the consequences—an unregulated platform that takes advantage of the same capitalist logic that undermines society in so many other ways.

In this new piece, Frank Pasquale, affiliate fellow at Yale Law School’s Information Society Project and author of The Black Box Society: The Secret Algorithms That Control Money and Information, argues that powerful interest groups build their dominance with the help of a regulatory regime that permits secrecy and complexity. The study tackles events surrounding the recent US elections as well as other cases in which online interventions (or lack thereof) have allowed for the spread of hateful ideologies in the broader public. Presenting a series of legal and educational steps, the author shows how we may curtail the effects of the pathologies that the contemporary automated public sphere creates.

Disrupting secretive business strategies, preventing the concentration of power and money into a few hands, and introducing more instances of human intervention are initiatives that put the focus on regulating the power held by big tech companies. However, other proposed measures trigger crucial ethical considerations around this regulatory apparatus. How can we ensure that regulation does not turn into censorship, suppression, and thereby into another tool for manipulation and control of society?

Beyond fully endorsing these proposals or necessarily opposing all such forms of regulation, we believe that as progressive actors—many times on the losing end of digital control and/or harassment—we need to reconsider our strategies and methods under new contexts. How can we rethink regulation to make it work in both fair and transparent ways everywhere? Can we devise ways to regulate users, who play an important part in producing and distributing content, without falling into brute censorship? Are these desirable, forward-looking options, or rather desperate reactions to the reality of today’s digital world? Ultimately, we need to ask, what is the role of the internet in society today and how can we improve the digital environment for all of us?

Stefanie Ehmsen and Albert Scharenberg
Co-Directors of the New York Office, December 2017

The Automated Public Sphere

By Frank Pasquale

As internet usage grew in the 2000s, scholars promoted its emancipatory potential. Yochai Benkler praised not only the wealth that would be promoted by networks, but also its distribution—toward a platform of platforms that would enable millions of new voices to be heard online (Benkler, 2007). This optimism also animated one of the United States’ seminal cases on internet regulation, Reno v. ACLU (1997), which presumed the openness of the internet would redound to the benefit of all. The majority opinion in ACLU darkly cautioned the US government to avoid mucking about in many forms of internet regulation, lest it infringe on free expression rights in an online environment that the justices, as well as later boosters, idealized. Large platforms themselves harbor utopian pretensions to this day; for example, Mark Zuckerberg has marketed Facebook as a nascent global community (even as social critics lament how time online diverts citizens from in-person engagement with friends and neighbours) (Rushkoff, 2016).

Even in the 1990s, scholars warned about the implications of deregulating the internet (Chin, 1997). By the mid-2010s, it is hard to remain optimistic about the role of the internet in organizing a new, and critically important, digital public sphere. Wealth has emerged in online advertising, but it is largely claimed by two firms—Google and Facebook take about 75% of the $73 billion digital advertising market in the US (Bond, 2017). These information intermediaries are driven by profit, and their methods of selecting and arranging newsfeeds and search engine results pages are secret (Pasquale, 2015b, pp. 59-100). The promised Wealth of Networks has given way to a black box society—one where trolls, bots, and even foreign governments maraud to distort the information environment on Facebook, Twitter, Google News, Reddit, and other networks.

We now know that virtually every positive promise made about the internet in the early 2000s has a shadow side. While secrecy has empowered some voices who would otherwise be afraid to speak up, it has also protected trolls, doxers, and other bad actors online who silence others’ speech via intimidation. Moreover, online anonymity is of a piece with financial anonymity, which has empowered thousands of shell companies to obscure who is actually funding messages that could sway the public, legislators, and regulators. Everyone is invited to participate, but so too is “everyone” capable of disrupting other communities of interest online, via hashtag spamming or trolling—whether by civil society groups, state actors, or miscreants pursuing disruption “for the lulz.” First celebrated as a way to hold states accountable for illegal actions, Wikileaks has emerged as a witting agent of authoritarian state interference in elections, with a troubling tendency to emit anti-Semitic messages. While major content owners have found their grip on public attention diminished, fragmentation of audiences has given megaplatforms unprecedented global power over attention-commanding interfaces.

That last reversal is the subject of this essay. In a world of stable and dominant media firms, large social networks and search engines were in a rough equilibrium of power relative to the owners and creators of the content they selected and arranged (Pasquale, 2010). However, a general trend toward media revenue decline (and platform revenue growth) makes a new endgame


apparent: online intermediaries as digital bottlenecks or choke-points, with ever more power over the type and quality of news and non-news media that reaches individuals (Bracha and Pasquale, 2008; Pasquale, 2008b).[1] The source of this power is, at bottom, big data—the ability of megaplatforms to accumulate ever-more-intimate profiles of users, which are then of enormous interest to commercial entities, political campaigns, governments—indeed, anyone aspiring to monitor, monetize, control, and predict human behavior.

Large online intermediaries tend to reduce at least one good type of media pluralism, and to promote a very destructive type of diversity.[2] They make the metric of success online “virality,” promoting material that has received a good deal of attention or seems to match a sub-public’s personalization profile, regardless of whether it is true or minimally decent (Pasquale, 2006). That reduces pluralism by elevating profit considerations over the democratizing functions of public discourse, and effectively automating the public sphere. Decisions that once were made by humans are now made by profit-maximizing algorithms. Moreover, the same intermediaries also promote a very troubling diversity by permitting themselves to be manipulated by the most baseless and dangerous propagandists (Marwick and Lewis, 2017).[3] Such political forces are particularly gifted at creating media capable of influencing and persuading low-information, floating voters—exactly the persons most likely to swing the results of elections.

This essay first describes the documented, negative effects of online propagandists’ interventions (and platforms’ neglect) in both electoral politics and the broader public sphere (Part I). It then proposes several legal and educational tactics to mitigate their power, or to encourage or require them to exercise it responsibly (Part II). The penultimate section (Part III) offers a concession to those suspicious of governmental intervention in the public sphere: some regimes are already too authoritarian and generally unreliable to be trusted with extensive powers of regulation over media (whether old or new media), or intermediaries. However, I conclude that the inadvisability of extensive media regulation in disordered societies only makes this agenda more urgent in well-ordered societies, lest predictable pathologies of the automated public sphere accelerate the degradation of their democracies.

The Long Shadow of Intermediary Irresponsibility

Jürgen Habermas observed in 1962, “the process in which societal power is transformed into political power is as much in need of criticism and control as the legitimate exercise of political domination over society” (Habermas, 1962, trans. 1991, p. 210). As part of the Frankfurt School, Habermas was sensitive to the ways in which new media, manipulated by both corporate and state actors, had fundamentally altered processes of democratic will formation. He deemed such transitions a “structural transformation” of the public sphere, since new forms of media accelerated, altered, and reshaped the expression critical to both opinion formation and expression.

[1] Note, too, that the problem is not altogether plausibly one of left voters needing to be exposed to right voters’ worldview, and vice versa (for who knows how far along the spectrum of ideology one should search for alternative views, or how rotten centrist consensus is). Rather, it is one of a lack of autonomy and understanding of how one’s media environment is shaped.

[2] Media pluralism is necessary for maintaining the integrity of the democratic process; reducing the impact of the misrepresentation and suppression of information; promoting access to diverse information and opinions; and protecting freedom of expression (Smith and Tambini, 2012; Smith, Tambini, and Morisi, 2012).

[3] While the platforms will often insist that they are the true victims of propagandists, they somehow manage to seek out and stop a great deal of the web spam and manipulation that threatens their advertising business models.

The basic contours of mass media-driven politics and culture remained stable through much of the second half of the twentieth century. But since the mid-1990s, the public sphere has endured yet another structural transformation. Megafirms like Facebook and Google have largely automated the types of decisions once made by managers and programmers at television networks, or editors at newspapers. Automated recommendations are often helpful, aiding audiences as they seek to sort out the blooming, buzzing confusion of topics online. But they are also destabilizing traditional media institutions and circuits of knowledge.

For example, the 2016 US election featured deeply disturbing stories about manipulation of social media for political ends. Unreliable sources proliferated, particularly among right-wing echo chambers. In December 2016, a Facebook-fueled fake news story about Hillary Clinton prompted a man to arrive in a pizza parlor in D.C. with a gun (Abrams, 2016).

The fake story reportedly originated in a white supremacist’s tweet. Politically motivated, profit-seeking, and simply reckless purveyors of untruths all prospered. A Macedonian teen churned out stories with no basis, tarring Hillary Clinton with an endless series of lies, in order to score quick profits (Smith and Banic, 2016). For profit-minded content generators, the only truth of Facebook is clicks and ad payments. Bence Kollanyi, Phil Howard, and Samuel Woolley estimated that tens of thousands of the tweets “written” during the second US presidential debate were spewed by bots (Kollanyi, Howard, and Woolley, 2016). These bots serve multiple functions—they can promote fake news, and when enough of them retweet one another, they can occupy the top slots in response to tweets from candidates. They can also flood hashtags, making it very difficult for ad hoc publics to crystallize around an issue.

On Facebook, a metastatic array of fake content generators and hard-core partisan sites flooded news feeds with lies and propaganda. Facebook, as usual, disclaimed any responsibility for the spread of stories falsely claiming that the Pope had endorsed Donald Trump, or that Hillary Clinton is a satanist (to give a mild pair of the lies that swarmed the platform) (Schaedel, 2016; Evon, 2016). But the Silicon Valley firm has several levels of responsibility.

Basic design choices mean that stories shared on Facebook (as well as presented by Google’s AMP) all look very similar, and indeed have for years (Chayka, 2016). Thus a story from the fabricated “Denver Guardian” can appear as authoritative as a Pulitzer Prize-winning New York Times investigation (Lubbers, 2016). More directly, Facebook profits from virality—the more a story is shared (whatever its merits), the more ad revenue it brings in (Molina, 2016). Most disturbingly, we now know that Facebook directly helped the Trump campaign target its voter suppression efforts at African-Americans (Winston, 2016).

Google has suffered from other racially tinged scandals (Noble, 2018). Consider, for instance, recurrent problems with Google’s “autocompletes”—when Google anticipates the rest of a search query from its first word or two. Google autocompletes have often embodied racist and sexist stereotypes (Cadwalladr, 2016). Its image search has also generated biased results, absurdly and insultingly tagging some photos of black people as “gorillas” (Guarino, 2016; Barr, 2015).

If Google and Facebook had clear and publicly acknowledged ideological agendas, users could grasp them and “inoculate” themselves


accordingly, with skepticism toward self-serving views (Pasquale, 2011). However, the platforms are better understood as tools rapidly manipulated to the advantage of search engine optimizers, well-organized extremists, and others at the fringes of political respectability or scientific validity. Thus a search for “Hillary’s Health” in October 2016 would have led to multiple misleading videos and articles groundlessly proclaiming that the US Democratic presidential candidate had Parkinson’s Disease. Google search results reportedly helped shape the racism of Dylann Roof, who murdered nine people in a historically black church in the US in 2015. Roof said that when he googled “black on white crime,” the first website he came to was the Council of Conservative Citizens, which is a white supremacist organization. “I have never been the same since that day,” he said. So too are sources of support for climate denialists, misogynists, ethnonationalists, and terrorists easily developed and cultivated in what has become an automated public sphere.

These terrifying acts of violence and hate are likely to continue if action is not taken. Nor is democracy safe in a carelessly automated public sphere. Without a transparent curation process, the public has a hard time judging the legitimacy of online sources. In response, a growing movement of academics, journalists, and technologists is calling for more algorithmic accountability from Silicon Valley giants (Pasquale, 2015a). As algorithms take on more importance in all walks of life, they are increasingly a concern of lawmakers. And there are many steps both Silicon Valley companies and legislators should take to move toward more transparency and accountability.
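Algorithmic accountability is often discussed in the abstract; one concrete building block is a tamper-evident record of the data a ranking system actually consumed, which auditors can later verify. The sketch below is illustrative only—the class, the record fields, and the hash-chaining scheme are this note’s assumptions, not any platform’s actual design:

```python
import hashlib
import json

class AuditLog:
    """Append-only log: each entry embeds a hash of its predecessor,
    so any later tampering breaks the chain and is detectable."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(record, sort_keys=True)  # canonical serialization
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        """Recompute the whole chain; any edited record breaks it."""
        prev_hash = "0" * 64
        for e in self.entries:
            payload = json.dumps(e["record"], sort_keys=True)
            if e["prev"] != prev_hash:
                return False
            if hashlib.sha256((prev_hash + payload).encode()).hexdigest() != e["hash"]:
                return False
            prev_hash = e["hash"]
        return True

log = AuditLog()
log.append({"query": "news", "signal": "engagement", "weight": 0.7})
log.append({"query": "news", "signal": "source_quality", "weight": 0.3})
assert log.verify()

# Retroactively editing an already-logged record is detectable:
log.entries[0]["record"]["weight"] = 0.9
assert not log.verify()
```

The point is not the cryptography but the institutional design: even if the ranking model itself stays proprietary, an outside auditor holding such a log can check what inputs the system was fed, without the firm being able to rewrite that history quietly.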

Toward a Robust Regime of Intermediary Responsibility

Activist and academic responses to these imbroglios have been multifaceted. Some communication scholars have rightly criticized Facebook for its apparent indifference to the problem of fake or misleading viral content (Tufekci, 2016). Others have focused their ire on the mainstream media, claiming that it was the recklessness or lack of professional responsibility at right-wing news sources (and established media institutions like CNN) which accelerated the rise of authoritarian candidates like Trump (Kreiss, 2016; Robinson, 2016).

In truth, there is no contradiction between a critique of the new media and deep disappointment in old media. Moreover, any enduring solution to the problem will require cooperation between journalists and coders. Facebook can no longer credibly describe itself as merely a platform for others’ content, especially when it is profiting from micro-targeted ads (Pasquale, 2016a). It has to take editorial responsibility. So, too, should mega-platforms like Google take on some basic responsibilities for the content they distribute. This section describes several specific initiatives that would help counter the discrimination, bias, and propaganda now too often polluting (and even overwhelming) online spaces.

A. Label, monitor, and explain hate-driven search results.

In 2004, anti-Semites boosted a Holocaust-denial site called “Jewwatch” into the top 10 results for the query “Jew” (Pasquale, 2006). Ironically, some of those horrified by the site may have


helped by linking to it in order to criticize it. The more a site is linked to, the more prominence Google’s algorithm gives it in search results. The Anti-Defamation League and other civil rights organizations complained to Google about its technology’s gift of prominence to entities eminently unworthy of such attention.

Google responded to complaints by adding a headline at the top of the page entitled “An explanation of our search results.” A web page linked to the headline explained why the offensive site appeared so high in the relevant rankings, thereby distancing Google from the results. The label, however, no longer appears. In Europe and many other countries, lawmakers should consider requiring such labeling in the case of obvious hate speech. To avoid mainstreaming extremism, labels may link to accounts of the history and purpose of groups with innocuous names like “Council of Conservative Citizens” (Pasquale, 2016a; Pasquale, 2008a).

Are there free expression concerns here? Not really. Better labeling practices for food and drugs have escaped First Amendment scrutiny in the US, and why should information itself be different? As law professor Mark Patterson has demonstrated, many of our most important sites of commerce are markets for information: search engines are not offering products and services themselves but information about products and services, which may well be decisive in determining which firms and groups fail and which succeed (Patterson, 2017). If they go unregulated, easily manipulated by whoever can afford the best search engine optimization, people may be left at the mercy of unreliable and biased sources.

B. Audit logs of the data fed into algorithmic systems.

We should expect any company aspiring to order vast amounts of information to try to keep its methods secret, if only to reduce controversy and foil copycat competitors. However wise this secrecy may be as a business strategy, it devastates our ability to truly understand the social world Silicon Valley is creating. Moreover, like a modern-day Ring of Gyges, opacity creates ample opportunities to hide anti-competitive, discriminatory, or simply careless conduct behind a veil of technical inscrutability.

A recurring pattern has developed: some entity complains about a major internet company’s practices, the company claims that its critics don’t understand how its algorithms sort and rank content, and befuddled onlookers are left to sift through rival stories in the press. Massive search operations are so complex, and so protected by both legal and de facto secrecy, that it is almost always impossible for those outside a search engine or social network firm to identify all the signals that are driving a given set of results. Silicon Valley journalists tend to give their advertisers the benefit of the doubt; national media outlets find the mysteries of online content ordering perfectly fit into their own templates of balanced reporting. No one knows exactly what’s going on when a dispute arises, so rival accounts balance into an “objective” equipoise.

Regulators need to be able to understand how some racist or anti-Semitic groups and individuals are manipulating search and social media feeds (Pasquale, 2010). We should require immutable audit logs of the data fed into algorithmic systems. Machine-learning, predictive analytics, or algorithms may be too complex for a person to understand, but the data records are not. They can be subjected to algorithmic audits.

A relatively simple set of reforms could vastly increase the ability of entities outside Google and Facebook to determine whether and how the firms’ results and news feeds are being manipulated. There is rarely adequate profit motive


for firms themselves to do this—but motivated non-governmental organizations can help them be better guardians of the public sphere.

C. Ban certain content.

In cases where computational reasoning behind search results really is too complex to be understood in conventional narratives or equations intelligible to humans, there is another regulatory approach available: to limit the types of information that can be provided. Though such an approach would raise constitutional objections in the US, nations like France and Germany have outright banned certain Nazi sites and memorabilia. Policymakers should also closely study laws regarding “incitement to genocide” to develop guidelines for censoring hate speech with a clear and present danger of causing systematic slaughter or violence against vulnerable groups. It’s a small price to pay for a public sphere less warped by hatred. And unless something like it is done, expect social media-driven panics about despised minorities to lead to violence.

To be sure, this approach would almost certainly draw immediate legal action in the United States, where a form of “free expression fundamentalism” has protected even the most reprehensible speech. Cyberlibertarians support First Amendment protections for algorithmic orderings of information. Relatedly, the same scholars and judges eager to protect the “speech” of computers also promote the idea that massive corporations’ “expression” is deserving of exceptional protection from the very state so often suborned or coopted by those same corporations.

This science fictional appeal to Asimov-ian ideals of “speaking robots” has fed a romanticization of corporate speech. The logical endpoint is a continual “battle for mindshare” by various robot armies, with the likely winner being the firms with the funds to hire the top programmers and the network effect dynamics to gather the most data for the optimal crafting of messages for microtargeted populations. It goes without saying that this type of decomposition of the public sphere does not represent a triumph of classic values of free expression (autonomy and democratic self-rule); indeed, it portends their evaporation into the manufactured consent of a phantom public.

D. Permit outside annotations and hire more humans to judge complaints.

In the US and elsewhere, limited annotations—“rights of reply”—could be permitted in certain instances of defamation of individuals or groups (Pasquale, 2008a). Google continues to maintain that it doesn’t want human judgment blurring the autonomy of its algorithms. But even spelling suggestions depend on human judgment; in fact, Google developed that feature not only by means of algorithms but also through a painstaking, iterative interplay between computer science experts and human beta testers who report on their satisfaction with various results configurations. As Sarah Roberts, Lily Irani, and Paško Bilić have shown, supposedly digitized companies are constantly reliant on manual interventions by human beings (Bilić, 2016; Irani, 2013; Roberts, 2016a; 2016b). Requiring a few more is not a major burden for these firms.

This step is important because we now know (if we ever doubted) that the hoary “marketplace of ideas” metaphor is misleading. The best ideas are not necessarily the most highly valued; the most sensational or gratifying propaganda can beat out careful reporting. Highly motivated, well-resourced groups can easily manipulate newsfeeds or search engine result pages (SERPs). “Dark ads” and sophisticated personalization algorithms enable constant experimentation on unwitting human research subjects, so A/B testing can reveal exactly what manipulation works best. Without conscientious and professional curation of such algorithmic orderings of information, the public sphere’s automation is susceptible to distortion by the most well-resourced and committed entities.

The European Union’s commitments to rights to be forgotten, and rights of erasure, show that the algorithmic ordering of information can be a socially inflected process, with fairer modes of participation for citizens and civil society (Pasquale, 2016b).

E. Limit the predation possible by online intermediaries.

Personalization is leading advertisers to abandon traditional, and even not-so-traditional, publishers in favour of the huge internet platforms. No other rival can approach either the granularity or the comprehensiveness of their data. The result is a revolution-in-process about who can afford to keep publishing, and concomitant alarm about the concentration of media clout into fewer and fewer hands.

One platform owner, Amazon’s Jeff Bezos, accumulated wealth equivalent to one hundred times the total value of the United States’ second most important newspaper, the Washington Post. He bought the Post, with concomitant chilling effects on the paper’s ability to criticize his own business empire-building, or similar strategies by platform capitalists. Given the leverage potential of their own ever-higher expected earnings, large platforms may soon be able to move to buy more content producers themselves, as cable networks and ISPs have done—or perhaps purchase entire cable networks and ISPs. This type of vertical integration would be a major threat to the autonomy of the press.

Given all the negative externalities generated by online intermediaries, policymakers should limit the profits they make relative to revenues of the content owners whose work they depend on. In the health care context in the US, private insurers can only keep a certain percentage of premiums (usually 15 to 20%)—the rest must go to health care providers, like hospitals, doctors, and pharmaceutical firms. Such a rule keeps the intermediary from taking too much of the spending in a sector—a clear and present danger in monopolistic internet contexts, as well. Governments could limit the amount of profits that search engines and social networks make as intermediaries, requiring them to pay some share of their revenues to content generators like newspapers and media firms (Lanier, 2013; Lehdonvirta, 2017). Alternatively, policymakers could simply force large platforms to pay a fair share of the tax they now avoid by shuttling income to tax havens, and use some of that revenue for public broadcasting alternatives.

F. Obscure content that is damaging and not of public interest.

When it comes to search queries on an individual person’s name, many countries have aggressively forced Google to be more careful in how it assembles data dossiers presented as SERPs. Thanks to the Court of Justice of the European Union, Europeans can now request the removal of certain search results revealing information that is “inadequate, irrelevant, no longer relevant or excessive,” unless there is a greater public interest in being able to find the information via a search on the name of the data subject (Pasquale, 2016b).

Such removals represent a middle ground between information anarchy and censorship. They neither disappear information from the internet (it can be found at the original source, and in searches on terms other than the complaining party’s name), nor allow it to dominate the impression of the aggrieved individual. They


are a kind of obscurity that lets ordinary individuals avoid having a single incident indefinitely dominate search results on their names. For example, a woman whose husband was murdered 20 years ago successfully forced Google to take news of the murder off search results on her name. This type of public responsibility is a first step toward making search results and social network newsfeeds reflect public values and privacy rights.
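The logic of such name-based delisting can be made concrete: a granted request suppresses a result only for queries on the data subject’s own name, while the same page remains reachable through any other query. The sketch below is a toy model under stated assumptions—the index, URLs, names, and substring-matching rule are all illustrative, not the mechanics any search engine actually uses:

```python
# Hypothetical mini-index; each document lists the query terms it matches.
INDEX = [
    {"url": "https://example.com/1997-murder-report",
     "terms": {"murder", "trial", "jane doe"}},
    {"url": "https://example.com/town-history",
     "terms": {"murder", "town history"}},
]

# A granted delisting request: (data subject's name, suppressed URL).
DELISTED = {("jane doe", "https://example.com/1997-murder-report")}

def search(query: str) -> list:
    q = query.lower()
    hits = [doc["url"] for doc in INDEX
            if any(term in q for term in doc["terms"])]
    # Suppress a delisted result only when the query contains the name.
    return [url for url in hits
            if not any(name in q and url == u for (name, u) in DELISTED)]
```

Under this model, a search on "jane doe" no longer surfaces the old report, yet a search on "murder trial" still returns it: the information is obscured on the name, not erased from the index—precisely the middle ground the essay describes.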

Concerns and Concessions

There will be fierce opposition to virtually all of the proposals I have listed above. Some will arise merely out of commercial motivations: policing hate speech and fake news is more expensive than letting it flourish. Platforms would rather just pile up advertising revenue. As Jodi Dean has demonstrated, outrageous content stokes at least as much engagement online as it has in the traditional media (Dean, 2010). Indeed, the problem is easily intensified online, as personalization allows platforms to deliver material precisely targeted to maximize clicks, likes, and shares (Citron, 2014).

Slowing that accelerated engagement costs a platform potential advertising, and all-important data about its users (Srnicek, 2017). It also impedes the platform’s ability to shape its users into the kind of people who uncritically act in behaviorally manipulable ways (Schüll, 2012).

Unless platforms can demonstrate that the intermediary responsibilities discussed above would compromise their ability to run the platform at a reasonable rate of return, such cost-based objections should be dismissed. Neither Mark Zuckerberg nor Facebook shareholders have any legitimate expectation of permanent, massive returns on their investment. Indeed, impeding their ability to accumulate the surplus they have used to buy rival and adjacent firms may well encourage innovation (Stucke and Grunes, 2017).

Many apologists for big tech firms claim that this type of responsibility is impossible (or unwise) for a firm like Facebook to take on (Turton, 2016; Lessin, 2016). They argue that the volume of shared content is simply too high to be managed by any individual, or team of individuals. But this argument ignores the reality of continual algorithmic and manual manipulation of search results at Google. As technology writer Timothy Lee explains:

During the 2000s, people got better and better at gaming Google’s search algorithm. Some were running quasi-media companies whose writers churned out dozens of extremely short, poorly researched articles based on popular search terms. […] In a January 2011 blog post, Google search quality czar Matt Cutts acknowledged that Google had a big problem with these “content farms.” […] Later that year, Google brought down the hammer, releasing changes to its search algorithm that caused traffic at major content farms to plummet. [This] represented Google making a deliberate value judgment that some kinds of content were worse than other kinds. Early versions of Google took a naively data-driven approach, assuming that a link from one site to another was a sign of quality. […] [In later, more sophisticated iterations,] Google include[d] human reviewers in the mix because algorithms inevitably make mistakes and manual human review is needed to keep the algorithms on the right track. Previously reviewed pages can be fed back into Google’s software, allowing the algorithms to learn from human judgment and get better over time. So Facebook doesn’t have to choose between fighting fake news with algorithms or human editors. An effective fight against fake news is going to require heavy use of both approaches (Lee, 2016).

9 FRANK PASQUALE THE AUTOMATED PUBLIC SPHERE

There are powerful lessons in this passage. First, be wary of the convenient self-reification of platforms. Facebook may aspire to be merely a technology company. Those aspirations may express themselves as a petulant insistence that unsupervised, rather than supervised, machine learning is the ideal way to solve problems on the platform. But that "identity" is a constructed and convenient one, directly at odds with tech firms' repeated invocation of free expression protections to shield their actions from governmental scrutiny (Pasquale, 2016c).

Beyond economic and technical objections, there is a third, deeper objection to intermediary responsibility, focusing on the regulatory apparatus necessary to make it meaningful and robust. Authoritarian regimes have tried to stifle political dissent by regulating Facebook and Google. For example, the Thai, Russian, Chinese, and Turkish governments have aggressively policed criticism of national leaders, and have intimidated dissidents. Corrupt governments may be susceptible to excessive influence from well-organized lobbies. Fossil fuel lobbyists may influence regulators to force intermediaries to monitor and censor environmental activists committed to resistance against pipeline projects (Citron and Pasquale, 2011, p. 1445; ACLU, 2017). Overly onerous annotation requirements, or rights to be forgotten, may become a pretext for driving a popular platform out of a country. Governments may abuse taxation powers, too, in retaliation against a platform that enables stinging or politically effective criticism of them. Or platforms may successfully lobby to have their own personnel and allies appointed to the agencies and commissions set to regulate them. A search or robotics or social network commission, for example, might start out with a robust agenda, but over years or decades may find itself taken over by appointees closely aligned with dominant industry players.4 Still, there is little reason to assume that the actions of the worst governments are likely in other, more developed and democratic public spheres. Indeed, intervention in the public sphere while a polity is still well-ordered may be the only way to keep it well-ordered.

Some of these concerns are variations on the classic problem of regulatory capture: the very institutions meant to regulate an industry may be taken over by that industry. Fortunately, the problem has now been so carefully studied that many prophylactic measures could be put in place to avoid it (Carpenter and Moss, 2014). Revolving door rules could prevent officials and bureaucrats from working for the industry they are regulating for five or ten years after they depart their agency. Higher pay for regulators would also help assure more independence, as would the type of automatic funding mechanism that now empowers the United States Consumer Financial Protection Bureau to act as that country's lead consumer-oriented financial regulator. While serious, the problem of regulatory capture is not insurmountable.

More serious is a larger problem of circularity, well-identified by Charles Lindblom: the ability of powerful economic entities to take over political institutions and use that political power to enhance their economic power, which in turn gives them resources necessary to entrench their political power (Lindblom, 1977, p. 201-213). The rise of oligarchical power in nations around the world suggests how deep the problem of circularity can be (Winters, 2011). The tendency of oligarchs to enact programs that simultaneously harm the material conditions of their electoral base, while cultivating and consolidating their sense of political identity organized around common grievance, should also serve as a spur to reconsider the foundations of the critiques that motivated the program of reform developed above.

For example, consider the classic problem of the filter bubble (Pariser, 2011; Sunstein,

4 Bracha and Pasquale (2008) included the first discussion in legal literature of a regulatory body targeted at search engines.


2007). Personalization enables internet users to ignore points of view they disagree with, the filter bubble model states, and therefore increases polarization. Let us assume, for now, that there is some extant middle ground of consensus worth saving. Extant solutions to the filter bubble dynamic presume, first, that "all sides" or "both sides" can be exposed to some critical mass of opposing or diverse viewpoints via, say, must-carry rules, or some privately implemented version of them (Pasquale, 2016a, p. 499-500). To make that reform tractable, assume for now a binarily divided society, divided between left and right voters. The great problem for advocates of "filter bubble" reforms is that they cannot adequately model whether exposure of one side's adherents to the other side's version of facts, priorities, ideology, or values, will lead to understanding or revulsion, reconsideration, or recalcitrance.

To be sure, effects studies in media have been contested for decades. It may be impossible for today's digital deliberative democrats to demonstrate the empirical likelihood of open-mindedness among voters (Gutmann and Thompson, 2004). But they should be open to understanding the danger of plausible models of asymmetrical openness to opposing views. A society may have a "hard left" and a "soft right," such that those on the right are quite willing to assess and even adopt some left proposals, while the vast majority of the left is unalterably opposed to accepting any right ideas. In such a scenario, an assault on the filter bubble is only likely to chip away at conservative self-identification among the "soft right," while succoring the hard left. Perhaps intuiting that danger to its coherence and ability to project power, today's right in the United States may be inoculating itself against such ideological slippage. Very often, those in the center right will defend or applaud those to their right, but the comity rarely goes the other way (Nagle, 2017).

In a situation of asymmetrical persuadability, filter bubble-inspired reforms will tend only to consolidate the power of the social group or political party most steadfastly committed to maintaining its own position. We can, of course, imagine 12 Angry Men-type scenarios where a small remnant of deeply moral hold-outs uses its reform-granted exposure to others to gradually convince the rest of society of the wisdom of its position. However, just as likely is a splitting of society into the more contemplative and the more active, à la the famed quote on the "reality-based community" from a member of the George W. Bush administration.5

This elementary challenge to filter bubble-driven reform suggests a larger problem with the deliberativist political theory driving reforms of the automated public sphere (Pasquale, 2008c). How can democracy operate when large swathes of the population subscribe to diametrically opposed conceptions of the nature of politics?

Consider the deliberativist approach as one end of a spectrum of theories of politics, with a Schmittian, decisionist approach on the opposite end. Deliberativists see politics as fundamentally a realm of reasoned disagreement, culminating in some form of agreement (or at least improved understanding) after debate (Parkinson and Mansbridge, 2012). Jürgen Habermas detailed the "ideal speech situation" as the regulative ideal of such political deliberation, where everyone would either be able to voice their own views, and learn from others, or at least count on their political representatives

5 The journalist Ron Suskind authored an article that quoted a senior George W. Bush administration official as saying "that guys like me were 'in what we call the reality-based community,' which he defined as people who 'believe that solutions emerge from your judicious study of discernible reality.' […] 'That's not the way the world really works anymore,' he continued. 'We're an empire now, and when we act, we create our own reality. And while you're studying that reality—judiciously, as you will—we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors […] and you, all of you, will be left to just study what we do.'" Suskind (2004).


in a legislative body engaging in a similar process (Habermas, 1991).

Habermas's conception of parliamentary democracy was part of a long scholarly campaign to lay to rest the type of post-rational, emotivist politics associated with Carl Schmitt (Müller-Doohm, 2017). But Schmitt's critical ideas are finding more traction today, both in diagnoses of political polarization, and in the actual attitudes and actions of many voters and politicians. For those committed to a Schmittian perspective, there are friends and enemies in politics, and almost no new information can dissuade them from their attachment to their party or leader. US President Donald J. Trump memorably bragged that he could "shoot someone on Fifth Avenue," and his voters would still remain devoted to him. That is a Schmittian devotion par excellence. More strategically, a political party may change voting rules to entrench its power, creating a self-reinforcing dynamic: the more the rules change in its favor, the more opportunities it has to entrench majorities and super-majorities that enable further rule changes (Daley, 2015; Berman, 2016). In such circumstances, some or all of the reforms mentioned above could backfire, simply adding to the power of a dominant party in a disordered polity, rather than preserving and promoting the type of pluralism that is a hallmark of a well-ordered democracy.

Conclusion: A Return to Professionalism

Given the potential pitfalls of regulating the automated public sphere, implementation of the reform ideas in Part II above should be undertaken with care in well-ordered polities, and may be impossible or counterproductive in disordered polities. But regardless of those difficult distinctions, those in media can do much to respond to the automated public sphere's infirmities.

Journalists should be more assertive about their own professional prerogatives and identity. In the aftermath of the fake news scandals, Tim O'Reilly asserted that decisions about the organization of newsfeeds and presentation of information in them were inherently algorithmic functions, to be supervised by the engineers at Facebook (O'Reilly, 2016). Certainly the alpha geeks whom O'Reilly describes as his subject share that view: the human editors of trending topics at Facebook were low-status contract workers, who were unceremoniously dumped when a thinly sourced news story asserted that conservative content was being suppressed (Ohlheiser, 2016; CBS News, 2016). Shortly thereafter, Facebook was swamped by the fake news which is now the topic of so much controversy. The real lesson here is that human editors at Facebook should be restored, should be given more authority, not less, and that their deliberations should be open to some forms of scrutiny and accountability.

Some communication scholars have resisted the idea of professionalization of online content creation, curation, and delivery in the name of citizen journalism, which would democratize the power of the press to anyone with a computer and an internet connection. While a beautiful idea in theory, in practice a failure among the de facto sovereigns of the internet to distinguish between stories on the real Guardian and the "Denver Guardian" is not simply a neutral decision to level the informational playing field. Rather, it predictably accelerates propaganda tactics honed by millions of dollars of investment in both data brokerages and shadowy quasi-state


actors now investigated by the CIA as sources of bias, disinformation, and illegal influence in the election (Revesz, 2016; Feldman, 2016). Freedom for the pike is death for the minnows.

In the 1980s, the chair of the US Federal Communications Commission, Mark Fowler, dismissed the bulk of regulation of broadcasters as irrelevant, since he viewed the television as nothing more than "a toaster with pictures" (Boyer, 1987). In the 2010s, for better or worse, vast conglomerates like Facebook and Google effectively take on the role of global communication regulators. Mark Zuckerberg's repeated insistence that Facebook is nothing more than a technology company is a sad reprise of Fowler's laissez-faire ideology. It is also deeply hypocritical, for the firm imposes all manner of rules and regulations on both users and advertisers when those norms generate profits for it (Pasquale, 2015b).

The public sphere cannot be automated like an assembly line churning out toasters. As Will Oremus has explained, there are aspects of the journalistic endeavor that are inherently human; so, too, are editorial functions necessarily reflective of human values (Oremus, 2014). To be sure, there will be deep and serious conflicts over the proper balance between commercial interests and the public interest in assigning prominence to different sources; in deciding how much transparency to give decisions made about such issues; and how much control individual users should have over their newsfeeds, and the granularity of that control. But these are matters of utmost importance to the future of democracy. They can no longer be swept under the rug by plutocrats more interested in stock returns and advances than the basic democratic institutions and civil society that underpin each.

References

Abrams, A. (2016). Pizzagate gunman: 'I regret how I handled' Comet Ping Pong shooting. Time, December 8. time.com/4594988/pizzagate-gunman-comet-ping-pong-regret/

ACLU. (2017). ACLU challenges warrant to search data of Facebook page for group protesting Dakota Access Pipeline [press release], March 8. aclu.org/news/aclu-challenges-warrant-search-data-facebook-page-group-protesting-dakota-access-pipeline

ACLU v. Reno, 521 US 844 (1997). supreme.justia.com/cases/federal/us/521/844

Barr, A. (2015). Google mistakenly tags black people as 'gorillas,' showing limits of algorithms. Wall Street Journal, July 1. blogs.wsj.com/digits/2015/07/01/google-mistakenly-tags-black-people-as-gorillas-showing-limits-of-algorithms/

Benkler, Y. (2007). The wealth of networks: How social production transforms markets and freedom. New Haven, CT: Yale University Press.

Berman, A. (2016). Give us the ballot. New York: Picador.

Bilić, P. (2016). Search algorithms, hidden labour and information control. Big Data & Society, 3(1). DOI: 10.1177/2053951716652159.

Bond, S. (2017). Google and Facebook build digital duopoly. Financial Times, March 14.


ft.com/content/30c81d12-08c8-11e7-97d1-5e720a26771b

Boyer, P. J. (1987). Under Fowler, F.C.C. treated TV as commerce. New York Times, January 19, p. C15.

Bracha, O. & Pasquale, F. (2008). Federal search commission? Access, fairness, and accountability in the law of search. Cornell Law Review, 93:1149.

Cadwalladr, C. (2016). Google, democracy and the truth about internet search. The Guardian, December 4. theguardian.com/technology/2016/dec/04/google-democracy-truth-internet-search-facebook

Carpenter, D. P. & Moss, D. A. (2014). Preventing regulatory capture: Special interest influence and how to limit it. New York: Cambridge University Press.

CBS News. (2016). Facebook: 'No evidence' conservative stories were suppressed, May 10. cbsnews.com/news/facebook-no-evidence-conservative-stories-trending-suppressed-gizmodo

Chayka, K. (2016). Facebook and Google make lies as pretty as truth. The Verge, December 6. theverge.com/2016/12/6/13850230/fake-news-sites-google-search-facebook-instant-articles

Chin, A. (1997). Making the World Wide Web safe for democracy: A medium-specific First Amendment analysis. Hastings Communications and Entertainment Law Journal, 19:309.

Citron, D. (2014). Hate crimes in cyberspace. Cambridge, MA: Harvard University Press.

Citron, D. K. & Pasquale, F. (2011). Network accountability for the domestic intelligence apparatus. Hastings Law Journal, 62:1441-1494.

Daley, D. (2015). Ratf**ked: The true story behind the secret plan to steal America's democracy. New York: Kirkus.

Dean, J. (2010). Blog theory: Feedback and capture in the circuits of drive. Malden, MA: Polity.

Evon, D. (2016). Spirit cooking. Snopes, November 5. snopes.com/john-podesta-spirit-cooking

Feldman, J. (2016). CIA concluded Russia intervened in election to help Trump, WaPo reports. Mediaite, December 9. mediaite.com/online/cia-concluded-russia-intervened-in-election-to-help-trump-wapo-reports

Guarino, B. (2016). Google faulted for racial bias in image search results for black teenagers. Washington Post, June 10. washingtonpost.com/news/morning-mix/wp/2016/06/10/google-faulted-for-racial-bias-in-image-search-results-for-black-teenagers/?utm_term=.1a3595bb8624

Gutmann, A. & Thompson, D. F. (2004). Why deliberative democracy? Princeton, NJ: Princeton University Press.

Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category of bourgeois society. Burger, T. (Trans.). Cambridge, MA: MIT Press.

Irani, L. (2013). The cultural work of microwork. New Media and Society, 17(5): 720-739.

Kollanyi, B., Howard, P. N. & Woolley, S. C. (2016). Bots and automation over Twitter during the second US presidential debate. COMPROP Data Memo, October 19. politicalbots.org/wp-content/uploads/2016/10/Data-Memo-Second-Presidential-Debate.pdf

Kreiss, D. (2016). Social media did not give us Trump and it is not weakening democracy. Culture Digitally, November 9. culturedigitally.org/2016/11/social_media_trump

Lanier, J. (2013). Who owns the future? New York: Simon & Schuster.


Lehdonvirta, V. (2017). Could data pay for global development? Introducing data financing for global good. Oxford Internet Institute [blog], January 3. oii.ox.ac.uk/blog/could-data-pay-for-global-development-introducing-data-financing-for-global-good

Lee, T. B. (2016). Facebook should crush fake news the way Google crushed spammy content farms. Vox, December 8. vox.com/new-money/2016/12/8/13875960/facebook-fake-news-google

Lessin, J. (2016). Facebook shouldn't fact check. New York Times, November 29. nytimes.com/2016/11/29/opinion/facebook-shouldnt-fact-check.html

Lindblom, C. E. (1977). Politics and markets. New York: Basic Books.

Lubbers, E. (2016). There is no such thing as the Denver Guardian. Denver Post, November 5. denverpost.com/2016/11/05/there-is-no-such-thing-as-the-denver-guardian/

Marwick, A. & Lewis, R. (2017). Media manipulation and disinformation online. New York: Data & Society Research Institute.

Molina, B. (2016). Report: Fake election news performed better than real news on Facebook. USA Today, November 17. usatoday.com/story/tech/news/2016/11/17/report-fake-election-news-performed-better-than-real-news-facebook/94028370/

Müller-Doohm, S. (2017). Habermas: A biography. D. Steuer (Trans.). Malden, MA: Polity Press.

Nagle, A. (2017). Kill all normies: The online culture wars from Tumblr and 4chan to the alt-right and Trump. Washington, DC: Zero Books.

Noble, S. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.

Ohlheiser, A. (2016). Three days after removing human editors, Facebook is already trending fake news. Washington Post, August 29. washingtonpost.com/news/the-intersect/wp/2016/08/29/a-fake-headline-about-megyn-kelly-was-trending-on-facebook/?utm_term=.f857ac42b2e9

O'Reilly, T. (2016). Media in the age of algorithms. Medium, November 11. medium.com/the-wtf-economy/media-in-the-age-of-algorithms-63e80b9b0a73#.9l86jw9r4

Oremus, W. (2014). The prose of the machines. Slate, July 14. slate.com/articles/technology/technology/2014/07/automated_insights_to_write_ap_earnings_reports_why_robots_can_t_take_journalists.html

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press.

Parkinson, J. & Mansbridge, J. J. (2012). Deliberative systems: Deliberative democracy at the large scale. New York: Cambridge University Press.

Pasquale, F. (2006). Rankings, reductionism, and responsibility. Cleveland State Law Review, 54:115-139.

Pasquale, F. (2008a). Asterisk revisited: Debating a right of reply on search results. Journal of Business & Technology Law, 3:61-86.

Pasquale, F. (2008b). Internet nondiscrimination principles: Commercial ethics for carriers and search engines. University of Chicago Legal Forum, 2008:263-300.

Pasquale, F. (2008c). Reclaiming egalitarianism in the political theory of campaign finance reform. University of Illinois Law Review, 2008:599-660.

Pasquale, F. (2010). Beyond competition and innovation: The need for qualified transparency


in internet intermediaries. Northwestern University Law Review, 104:105-174.

Pasquale, F. (2011). Restoring transparency to automated authority. Journal on Telecommunications and High Technology Law, 9:235.

Pasquale, F. (2015a). Digital star chamber: Algorithms are producing profiles of you. What do they say? You probably don't have the right to know. Aeon, August 18. aeon.co/essays/judge-jury-and-executioner-the-unaccountable-algorithm

Pasquale, F. (2015b). The black box society: The secret algorithms behind money and information. Cambridge, MA: Harvard University Press.

Pasquale, F. (2016a). Platform neutrality: Enhancing freedom of expression in spheres of private power. Theoretical Inquiries in Law, 17:487-514.

Pasquale, F. (2016b). Reforming the law of reputation. Loyola Law Review, 47:515-540.

Pasquale, F. (2016c). Search, speech, and secrecy: Corporate strategies for inverting net neutrality debates. Yale Law and Policy Review Inter Alia, May 15. ylpr.yale.edu/inter_alia/search-speech-and-secrecy-corporate-strategies-inverting-net-neutrality-debates

Patterson, M. R. (2017). Antitrust law in the new economy. Cambridge, MA: Harvard University Press.

Revesz, R. (2016). Steve Bannon's data firm in talks for lucrative White House contracts. Independent, November 23. independent.co.uk/news/world/americas/cambridge-analytica-steve-bannon-robert-rebekah-mercer-donald-trump-conflicts-of-interest-white-a7435536.html

Roberts, S. T. (2016a). Commercial content moderation: Digital laborers' dirty work. In Noble, S. U. and Tynes, B. (Eds.), The intersectional internet: Race, sex, class and culture online (pp. 147-159). New York: Peter Lang.

Roberts, S. T. (2016b). Digital refuse: Canadian garbage, commercial content moderation and the global circulation of social media's waste. Wi: Journal of Mobile Media, 10(1):1-18. wi.mobilities.ca/digitalrefuse/

Robinson, N. J. (2016). The necessity of credibility. Current Affairs, December 6. currentaffairs.org/2016/12/the-necessity-of-credibility

Rushkoff, D. (2016). Throwing rocks at the Google bus. New York: Portfolio.

Schaedel, S. (2016). Did the Pope endorse Trump? FactCheck.org, October 24. factcheck.org/2016/10/did-the-pope-endorse-trump/

Schüll, N. D. (2012). Addiction by design: Machine gambling in Las Vegas. Princeton, NJ: Princeton University Press.

Smith, A. & Banic, V. (2016). Fake news: How a partying Macedonian teen earns thousands publishing lies. NBC News, December 9. nbcnews.com/news/world/fake-news-how-partying-macedonian-teen-earns-thousands-publishing-lies-n692451

Smith, R. C. & Tambini, D. (2012). Measuring media plurality in the United Kingdom: Policy choices and regulatory challenges. Robert Schuman Centre for Advanced Studies, Working Paper No. RSCAS 2012/36. cadmus.eui.eu/bitstream/handle/1814/23314/RSCAS_2012_36.pdf?sequence=1&isAllowed

Smith, R. C., Tambini, D. & Morisi, D. (2012). Regulating media plurality and media power in the 21st century. LSE Media Policy Project: Media Policy, Brief No. 7. eprints.lse.ac.uk/45041/1/LSEMPPBrief7.pdf

Srnicek, N. (2017). Platform capitalism. Malden, MA: Polity.

Stucke, M. E. & Grunes, A. P. (2017). Data-opolies. University of Tennessee Legal Studies, Research Paper No. 316. papers.ssrn.com/sol3/papers.cfm?abstract_id=2927018

Sunstein, C. R. (2007). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Suskind, R. (2004). Faith, certainty and the presidency of George W. Bush. New York Times, October 17. nytimes.com/2004/10/17/magazine/faith-certainty-and-the-presidency-of-george-w-bush.html

Tufekci, Z. (2016). Mark Zuckerberg is in denial. New York Times, November 15. nytimes.com/2016/11/15/opinion/mark-zuckerberg-is-in-denial.html?_r=2

Turton, W. (2016). The New York Times Facebook op-ed is a joke. Gizmodo, November 30. gizmodo.com/new-york-times-public-editor-slams-facebook-op-ed-1789511334

Winston, J. (2016). How the Trump campaign built an identity database and used Facebook ads to win the election. Medium, November 18. medium.com/startup-grind/how-the-trump-campaign-built-an-identity-database-and-used-facebook-ads-to-win-the-election-4ff7d24269ac#.4oaz94q5a

Winters, J. (2011). Oligarchy. New York: Cambridge University Press.

Related Publications

PLATFORM COOPERATIVISM Challenging the Corporate Sharing Economy Trebor Scholz - January 2016

www.rosalux-nyc.org
