Laws of Air and Ether: Copyright, Technology Standards, and Competition

Ren R. Bucholz

A Dissertation Submitted to the Faculty of Graduate Studies in Partial Fulfillment of the Requirements for the Degree of Master of Arts

Joint Graduate Program in Communication and Culture York University & Ryerson University, Toronto, Ontario

September, 2008


Abstract

At the dawn of each new technological era, it is common to hear how the most recent development is so powerful and novel that all social, political, geographic, and economic constraints will evaporate in its wake. This paper examines how the mythology of openness—a product of infrastructure-centered telecommunications policy—developed and continues to influence policy in the digital age. Section II explores this dynamic in the realm of network neutrality. Section III introduces "overlay networks" of control, which can operate even on "neutral" networks. Section IV explores the history of new, private standards organizations and their role in the deployment of overlay networks. It includes a case study of the Digital Video Broadcasting project (DVB). Section V explores the substantive problems with private techno-legal policy regimes and identifies shortcomings in strategies for addressing those problems. Specifically, competition law is identified as a promising but inadequate tool.

Acknowledgments

This work would not have been possible without the help of many hands. It could not have been written without Rosemary Coombe's guidance and incredible intellectual generosity. Before David Skinner and Greg Elmer became part of my committee, they were my professors and friends. They are also among the instructors—in the Communication and Culture Programme and at Osgoode Hall Law School—in whose classes I began to think and write about the positions that constitute this thesis. I am also grateful to my colleagues at the Electronic Frontier Foundation and the Google Policy Fellowship program, which gave me the freedom to explore the ideas in Section V during the summer of 2008. I am deeply indebted to my family, here and abroad, for their unstinting support and encouragement. Hannah and Don Bucholz, whose love of libraries and technology made me the geek that I am today, are especially culpable. Most of all I thank Laura, who makes everything possible on and off the page.

Table of Contents

I. Introduction: Internet Mythology and the Rhetoric of Openness
   A. Literature Roadmap: "Critical Information Studies"
   B. A Brief Cultural History of the Internet
   C. Converging Approaches to Internet Governance
II. The Diversionary Effects of Network Neutrality
   A. What is Net Neutrality?
   B. How Neutral is the Net?
   C. Open Paths, Closed Packages
III. Distributed Control: DRM as an Overlay Network
   A. Overlay Networks: Proliferation of Paths
   B. DRM and the Evolution of IP Metaphors
   C. Why DRM Matters
      1. Information & Subject Creation
      2. New Modes of Production
      3. Reifying Borders
      4. Normalization of Control
   D. Is DRM Dead?
      1. How DRM Works, Fails
      2. Fixing DRM's Problems Through Standardization
IV. The Evolution of Standards
   A. What are Standards?
   B. The State's (Declining) Role in Standard-Setting
   C. The Rise of Consortia
   D. Standards + DRM
   E. DVB Case Study
V. Competition Law vs. Standardized DRM?
   A. Competition in America
   B. Rethinking Competition, Standards, and DRM
   C. Framing Observations
      1. Harm to Open Source Vendors: Concerted Refusals to Deal
      2. Harm to Consumers: Tying
      3. Standards and Intellectual Property Misuse
      4. Competition Law Beyond the US
VI. Conclusion

Table of Figures

Figure 1: Layered Networks
Figure 2: DVD Regions
Figure 3: DVB Worldwide
Figure 4: How CPCM Works

I. Introduction: Internet Mythology and the Rhetoric of Openness

The sweeping paradigm shift is a seductive image; we love to declare that something, anything, has changed everything. This may account for the popular conception of the Internet as a radically new medium that will have significant positive effects on the democratic project. According to this view, the realization of human potential, as measured by any number of metaphysical yardsticks, cannot help but be accelerated by unfettered access to information and the blistering pace of technological progress. To the extent that this new medium is perceived as vulnerable, the most commonly discussed threats are familiar to communications scholars. Monopolists and censors have plagued every mass communication technology that humans have ever produced.

When considered alongside the Internet, these threats give rise to rhetorical skirmishes that pit freedom against control, "open" against "closed" networks. In this framework, freedom and openness are cast as material characteristics of the Internet. Anyone can connect via its open protocols. Once connected, the network treats all data equally. As a result, innovation takes place at the "edges" of the network without being throttled by some central authority.1 Most importantly, this model's preoccupation with centers and edges assumes that the links of the network are apolitical, frictionless, and open. So for early theorists of the Internet, freedom and openness became more than ideals or metaphors: they became organizing principles and blueprints, deviation from which would lead to structural instability and eventual collapse.

On the other hand, supporters of a controlled network—usually the owners of the network's physical infrastructure—explain that the future will arrive only when some measure of order is imposed on chaotic data flows.2 Because of the limited capacity of the wires in the ground and the spectrum in the air, networks demand prioritization and hierarchy if rich media experiences are to be delivered. In their view, it is time for the Internet to mature so as to realize its commercial potential.

This dynamic is playing out in one of the most hard-fought battles over Internet governance. In response to attempts by Internet Service Providers (ISPs) to establish a more structured Internet, Web-savvy activists have sought "network neutrality" regulations from governments. Such rules would grant legal protections to guarantee at least some of the technical characteristics of openness. They would forbid ISPs from treating Internet traffic differently based on its source, destination, or content, so as to prevent ISPs from exerting centralized control over network usage. By using law to protect the soul of these networked machines, techno-utopians hope to preserve the network's democratizing promise.

Unfortunately, even purely "open" networks, in the sense meant by the typical network neutrality advocate, fail to address emerging, distributed forms of control. In fact, these new strategies for exercising power over the flow and use of information are remarkable precisely because they operate seamlessly in "open" technical environments. When implemented well, the limitations they create feel like laws of nature, not restrictions designed by human hands. The result may be the reinvigoration of the hierarchical power structures that were supposed to be "flattened" by the Web. This paper argues that traditional loci of communicative power can thrive by promoting illusions of change, and that these powers will remain relevant in our digital future. I also hope to illustrate the history of power on the Internet as a neoliberal parable that incorporates debates over the role of the state in ordering our lives, considers the ways in which capital encourages conflation between the market and the marketplace of ideas, and explores how modern power can be both distributed in application and centralized in control. All of these features can be seen on and offline; each context can illuminate the other.

To understand these new threats we need to examine several apparent contradictions. First, networks of control can be overlaid on top of open networks. Even on the Internet, a successful push for open pathways may provide an illusion of freedom while enabling more subtle forms of control. Second, "overlay networks" of digital rights management (DRM) are becoming sufficiently sophisticated that they may effectively constrain how the public uses certain types of information. Until now, the vast majority of attempts to create such systems have failed in convincing, public ways, which have in turn created a popular sense of inevitability: no matter how cleverly the old guard can craft its digital locks, urban legend has it, some 15-year-old in Norway will be along to pick them within minutes. This sense of inevitability, however, is ill-founded in the face of technical advances that are already on their way to market.
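The basic logic of such an overlay can be made concrete with a small sketch. The device list, usage rules, and function below are hypothetical illustrations of the principle rather than any actual DRM scheme discussed in this thesis: restrictions travel with the content and are enforced by every compliant device, regardless of how open the transport network was.

```python
# Hypothetical set of 'compliant' devices licensed by a content consortium.
COMPLIANT_DEVICES = {"player-A", "player-B"}

def request_use(device_id: str, usage: str) -> bool:
    """Toy DRM gate: the rule set rides along with the content itself."""
    if device_id not in COMPLIANT_DEVICES:
        return False                  # unrecognized hardware is refused
    return usage in {"play"}          # copying and excerpting are unlicensed

print(request_use("player-A", "play"))   # True: licensed device, licensed use
print(request_use("player-A", "copy"))   # False: the use itself is barred
print(request_use("homebrew", "play"))   # False: non-compliant device
```

Nothing in this sketch depends on the network being closed; the packets that deliver the content can flow perfectly "neutrally" while the gate does its work at the endpoint.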

Addressing these changes will require a recalibrated perspective. Has network neutrality become the primary site of struggle over Internet governance, and do its proponents believe that policies enshrining 'openness' are sufficient for the task of protecting the Internet's democratic promise? If new forms of control are emerging, who is building them? What ideological and structural shifts have enabled their production? How robust are these structures, where are they vulnerable, and whose interests do they serve? In Section I, I approach these questions via a brief cultural history of Internet mythology, then chart its intersection with traditional media scholarship. Section II is a more detailed account of some of network neutrality's propositions, such as the belief in the primordial myth of the Internet's neutral origins. In Section III, I explain how overlay networks and DRM function, how they have evolved, and how they work on open networks. Section IV explores the history of new, private standards organizations and their role in the deployment of overlay networks. It also includes a case study of a particular standards consortium called the Digital Video Broadcasting project (DVB), which is developing an overlay network that exhibits the characteristics described in Section III. In Section V, I explain the substantive problems with private techno-legal policy regimes, as well as identify some shortcomings of obvious strategies for addressing those problems. Specifically, I point to competition law as a promising but inadequate tool for addressing these concerns.

A. Literature Roadmap: "Critical Information Studies"

This inquiry is inherently interdisciplinary, and it engages most directly with what media historian and cultural studies scholar Siva Vaidhyanathan has described as the emerging field of "critical information studies" (CIS).3 Vaidhyanathan locates CIS at the border of cultural studies and critical legal theory, and casts its central concern as an investigation of the "structures, functions, habits, norms, and practices that guide global flows of information and cultural elements."4 Importantly, the discipline is concerned with "positive" rights, or the search for policies that facilitate the exercise of rights and not just prohibitions on government interference therewith.5 This unabashedly liberal/progressive approach to information policy is applied in the search for "semiotic democracy."6 That term was coined by media scholar John Fiske in 1988, and it refers to the individual's ability to interact meaningfully with and contribute to the universe of signs that she inhabits, not just serve as a consumer of cultural goods. The term also incorporates earlier insights from cultural studies figures like Stuart Hall, whose theory of encoding/decoding suggested that the meaning of a communication is mediated by the vocabulary of signs shared by the sender and the receiver.7 Since the birth of the Internet, the study of user-generated initiatives—cultural, technical, commercial, etc.—has only accelerated.8 What CIS contributes to this trajectory is an interdisciplinary, aggressively political model for not only observing these phenomena, but building structures that foster their growth.

Another argument for the relevance of CIS is its preoccupation with theorizing power and control. If net neutrality is about the centralization of control, DRM is about its distribution. Where net neutrality deals with the familiar problem of power's concentration and subsequent abuse, DRM is a model for disseminating tiny, self-enforcing fragments of a coherent ideology. In order to function, however, those fragments must be circulated in prearranged channels. Without a network of compliant devices and open pathways, DRM-wrapped content would be useless to audiences and valueless to content owners. Legal scholar Julie Cohen, whose work is associated with CIS, refers to the content industry's coordinated information control projects—of which DRM is one example—as "pervasively distributed copyright enforcement,"9 and she argues that it represents an evolution of Foucault's disciplinary society.10 Where Foucault claimed that the specter of surveillance would lead to the internalization of discipline,11 Cohen argues persuasively that tools like DRM can instill "the 'correct' rules for interacting with digital content" through constant but subtle coercion.12 Neither wholly centralized nor wholly distributed, this "hybrid" form of power is new and poorly understood, but she stops short of opining on the likelihood of its success. A variety of others have been more sanguine about its failure, claiming that attempts to achieve airtight control are bound to fail.13 In their view, information still wants to be free, and it will always find a way to escape. This position has the benefit of empirical evidence, as DRM systems have so far been unable to contain the unauthorized flow of copyrighted material. I submit that the near future of distributed control mechanisms will be less ambiguous and more effective than either of these accounts suggests. Understanding why those changes will occur, however, requires a more detailed inquiry into the materiality of both the Internet and DRM.

Which brings us to my third and final argument for the embrace of CIS: its focus on the materiality of new media. This project is an effort to understand the "nature" of digital networks, as well as the increasingly independent "nature" of digitized cultural artifacts. As a result, I believe that it is necessary to engage with literatures that look critically at those two subjects. CIS is rife with such inquiries. For example, new media researcher and communications scholar Tarleton Gillespie writes about how the specific materiality of digital media—what he calls the "shape of digital culture"—is being assailed by attempts to "build the legal standards of copyright directly into the artifact."14 The ability of copyright holders to lock down uses of intellectual works with DRM is one of the central themes of this thesis, and that kind of control is especially important because it is the very malleability of digital media—not just the ease with which it is distributed—that makes it such a significant development in the construction of semiotic democracy. Technology theorist Yochai Benkler articulates this concern as the "extent to which a medium permits its users to participate in structuring its message"15 and investigates the ways in which digital information networks facilitate new modes of production. Cultural studies scholars like Henry Jenkins describe a shift towards "participatory culture," where "each of us constructs our own personal mythology from bits and fragments extracted from the media flow and transformed into resources through which we make sense of our everyday lives."16 In other words, the familiar link between medium and message makes it clear that changing the former bounds the possibilities of the latter. This interaction between information goods, technology, and culture has sparked broad interdisciplinary interest, and it is why Vaidhyanathan has cast the evolution of modern, technologically alloyed copyright law as a focus of CIS.

In exploring the rise of modern information control networks through the lens of CIS, I have endeavoured to trace a line through a cluster of pertinent disciplines. The result is that each section—on the cultural history of the Internet, the network neutrality movement, the role of digital rights management technologies in the global flow of information, the political economy of standardization, etc.—attempts to provide its own theoretical and/or historical context. As such, each section contains a review of the literature on which it relies. The final section, on the application of competition law to the conduct of standards organizations, takes up the CIS-inspired search for policy recommendations that might be applied beyond the academy.

B. A Brief Cultural History of the Internet

Myths about technology are among our favourite things. At the dawn of each new technological era, it is common to hear how the most recent development is so powerful and novel that all social, political, geographic, and economic constraints will evaporate in its wake.17 Those sentiments were particularly persuasive when first applied to the Internet. Its rhizomatic,18 many-to-many architecture was so new, so alien from the one-to-many broadcast model that had dominated mass communication for hundreds of years that it seemed tailor-made for the post-Cold War, postmodern era.19 In retrospect, it appears totally natural that the first Web browser—the vehicle for navigating the Information Superhighway—was released within months of the dissolution of the Soviet Union. Neoliberal fervour was breathing new life into the international institutions that Realists had dismissed for half a century, and the logic of an ostensibly borderless Internet made more sense as the world reorganized itself, as maps were redrawn, as walls came down.20

Within a few years, the Internet had exploded into public view with an entourage of boosters, gurus, and digerati who were eager to explain that revolution was nigh. Hailing from online communities like The Well and Usenet, writing in Wired and Mondo 2000, and drawing inspiration from the fiction of William Gibson and Bruce Sterling, many of those who were familiar with the pre-Web Internet shared a range of counter-cultural touchstones that supported their revolutionary fervour.21 Internet policy scholar Susan Crawford describes this group of "Internet exceptionalists" as:

"[A]rdent proponents of the idea that communication in cyberspace is not the same as terrestrial communication, at least not with respect to choice of law, jurisdiction, and intellectual property questions. The theory of Internet exceptionalism relies in part on the strong belief that the continued unfettered evolution of the Internet is of great public import. Internet exceptionalists often express deep skepticism as to the appropriateness of government regulation of the internet."22 [Footnotes omitted]

In other words, Internet exceptionalists started from the assumption that the Internet was not like other media. In fact, they said, it was something much more than a medium: it was a sovereign space where traditional ideas about governance did not apply. As John Perry Barlow wrote in his widely read manifesto on cyberspace:

"We have no elected government, nor are we likely to have one, so I address you with no greater authority than that with which liberty itself always speaks. I declare the global social space we are building to be naturally independent of the tyrannies you seek to impose on us. You have no moral right to rule us nor do you possess any methods of enforcement we have true reason to fear."23

An implicit element of this narrative was that those who "get" the Internet, who are comfortable with its lingua franca of code, have no "true reason to fear" the machinations of the old guard.24 Faith in technology's ability to empower the individual dovetailed with elements of Internet culture—variously described as "technolibertarian"25 or as subscribing to the "New Communalism"26—that view government regulators as either incompetent or malicious but always unwelcome. Instead of bailouts from, or deference to, the State, the problems of cyberspace would be solved by self-reliant pioneers27 on the electronic frontier.28

The rejection of governmental jurisdiction on the Internet was tied up in an earlier idea popularized by Barlow: that information "wants to be free." According to the former Grateful Dead lyricist, "digital technology is detaching information from the physical plane, where property law of all sorts has always found definition."29 In a world where knowledge and capital are fungible, the transportability of the former would surely have a redistributive effect on the latter. It is no surprise that Barlow's manifesto was addressed to "Governments of the Industrial World," those "weary giants of flesh and steel;"30 in the minds of Internet exceptionalists, the rise of the Internet and the "new economy" was intimately linked to the impending death of old modes of production and governance.

These twin ideas—that the Internet would defy traditional expressions of political and economic power by granting its denizens unfettered access to information—became core components of Internet mythology.

Ideological predispositions toward suspicion of the government were validated by the Internet's early political and legal history. Throughout the Internet's first decade as a popular medium,31 most of the major conflicts over how it would be governed were between public interest advocates and the government.32 Somewhat surprisingly—at least to people who did not self-identify as cyberspace civil libertarians—the government lost many of these battles. When the Secret Service seized the computers of a Texas board game maker and part-time electronic bulletin board operator, the court found that they had violated his subscribers' privacy rights by reading the subscribers' email without a warrant.33 For the first time, offline rules about civil liberties were extended to the online world.

This trend continued in what would come to be known as the "Crypto Wars,"34 a conflict that revolved around the government's attempts to control the publication and availability of information related to codes and the art of secret writing, collectively known as cryptography. To technolibertarians, cryptography is a method for promoting anonymity online and securing one's privacy against prying eyes. Cryptography is a core technology of empowerment on the Internet.35 To the government, however, cryptography was a dangerous technology that could be used to hide criminal activity and prevent lawful surveillance. For example, in 1995, a graduate student at the University of California at Berkeley named Daniel Bernstein was frustrated to learn that his doctoral thesis, which discussed aspects of cryptography, was subject to the Arms Export Control Act and the International Traffic in Arms Regulations regulatory scheme. Before Bernstein could post his thesis on the Internet, where it would be available to people all over the world, these laws required him "to submit his ideas about cryptography to the government for review, to register as an arms dealer, and to apply for and obtain from the government a license to publish his ideas."36 Bernstein felt that these requirements violated his Constitutionally protected right to free expression and won several court challenges on these grounds.37 Before the case reached the United States Supreme Court, the government abandoned its requirements. This and several other cases combined to force the American legal system to recognize code as speech, thereby leveraging protections for free expression in a way that made cryptography available to the general public. Again, a government attempt to exercise its will had run aground on the shoals of cyberspace.
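The technology at stake is easier to grasp with a toy example. The sketch below implements a one-time pad, one of the simplest forms of "secret writing"; it illustrates the general idea only, and is not Bernstein's system or any cipher actually at issue in the litigation:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with a key byte; applying the same key twice
    # returns the original, so one function both encrypts and decrypts.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the usual place"
key = secrets.token_bytes(len(message))   # truly random, used only once
ciphertext = xor_cipher(message, key)     # unreadable without the key
assert xor_cipher(ciphertext, key) == message
```

A dozen lines of code suffice to put a message beyond the reach of any eavesdropper who lacks the key, which is precisely why the export-control regime treated such publications as munitions.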

It would be misleading to suggest that all early conflicts over Internet governance were decided against the more traditionally powerful participant, but it happened with enough regularity that the prophecy of Net-enabled individual empowerment seemed to be at least partially self-fulfilling.

C. Converging Approaches to Internet Governance

As the community of technolibertarians developed a hazy sense of inevitability about the success of their new medium, another group of media watchers had begun to weigh its potential. Many of the people who wrote and thought about the way communications systems should be managed had spent the latter half of the 20th century gnashing their teeth at the steady concentration of "traditional" media.38 Throughout the 1990s, the experience of prevailing against the government in a policy battle would have been an alien prospect to most media scholars. The end of the Cold War had signaled a huge increase in the pace of deregulation generally, and media activists had loudly protested each wave. For example, the United States Congress passed the Telecommunications Act of 1996 (the Act) over the vigorous protests of media activists.39 The Act relaxed ownership restrictions on radio and television stations, which led to rapid consolidation of both industries. Before the Act, a single company could own no more than 20 radio stations across the country, and its holdings in a single market were subject to limits dictated by the overall size of the market. After the Act, the nationwide cap on radio station ownership was abolished altogether, and the common ownership limit in local markets was doubled.40 According to the Federal Communications Commission (FCC), between 1996 and 2002, the number of radio station owners declined by 34% even though the total number of stations increased.41 Another study found that in most of the United States, four firms controlled upwards of 70% of the radio stations.42 Throughout the 1990s, newspapers, radio and television stations, and cable companies were increasingly owned by a shrinking group of expanding conglomerates.43

Notwithstanding the fall of communism, media scholars warned that democracy would suffer as the American communications sector—particularly the news media—became more tightly woven into oligopolistic structures of capital.44 In such an environment, they claimed, journalists would be forbidden from pursuing stories that undermined the logic of capitalism, dominant political regimes, or anything else that might disrupt the market. As a result, these scholars argued, citizens would be starved of vital political knowledge while drowned in marketing information, constantly prodded to choose between commodities while totally unable to make informed choices about their leaders.45 At the very least, diversity would suffer as corporate media owners sought to build economies of scale, cut local content, and cede ever more column inches to advertisers. While there was some disagreement about whether consolidated media was really responsible for the alleged ill health of Western democracy,46 there was considerable consensus around the position that concentrated media did not help.

Against this backdrop, those who had been railing against media consolidation greeted the Internet with cautious optimism. For example, Robert McChesney writes:

"The Internet has opened up very important space for progressive and democratic communication, especially for activists hamstrung by traditional commercial media. This alone has made the Internet an extremely positive development. Yet whether one can extrapolate from this fact to see the Internet becoming the democratic medium for society writ large is another matter. "47

The Internet seemed to solve many of the problems of "traditional" media, like the huge expense associated with running a newspaper or television station, their one-to-many architectures, the proliferation of gatekeepers, and the extent to which their content had been co-opted by powerful commercial interests intent on turning everything into entertainment.48 Communication scholars were cautious, however, because new media accompanied by big democratic promises are a staple of modern history. What made the Internet different from those other systems?

In response, the digerati pointed to the perceived material realities of the Internet. Older communication networks like broadcast television were centralized, and they gave a single actor—i.e. the owner of the television transmitter—the power to decide what content would travel over the network. In contrast, the Internet was designed to be decentralized, and to "[interpret] censorship as damage and [route] around it."49 It was constructed in a way that eschewed central control points and left the power to innovate and filter, speak and distribute, at the "edges" of the network. This structure is sometimes described as a "distributed network" that has no "chain of command, only autonomous agents who [operate] according to certain pre-agreed "scientific" rules of the system"50 that are called "protocols." As Lawrence Lessig explained, "this design pushes complexity out of the basic Internet protocols, leaving it to the applications, or ends, to incorporate any sophistication that a particular service may require."51 In other words, the principle of "end-to-end" networking—a simple network with intelligent ends—was built into the Internet's basic protocological form. The apparent result is an open network that treats all data similarly, regardless of source or destination.
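A minimal sketch makes the end-to-end idea concrete. The router below is hypothetical and radically simplified (the names and routing table are invented), but it captures the design Lessig describes: the core consults only the destination, never the payload or the sender's identity.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    source: str       # sending endpoint
    destination: str  # receiving endpoint
    payload: bytes    # application data, opaque to the core

def next_hop(packet: Packet, routes: dict) -> str:
    # A 'dumb' core: forwarding depends on the destination alone, so a
    # video stream and a dissident pamphlet receive identical treatment.
    # All sophistication (encryption, error recovery, interpretation)
    # is left to the applications at the ends.
    return routes[packet.destination]

routes = {"reader.example.net": "hop-7"}
pamphlet = Packet("press.example.org", "reader.example.net", b"...")
stream = Packet("tv.example.com", "reader.example.net", b"...")
assert next_hop(pamphlet, routes) == next_hop(stream, routes)
```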

As the new century approached, the ardent beliefs of Internet exceptionalists had morphed into a sort of conventional wisdom. While radio stations cycled through the same 20 songs every day, the Internet was a celestial jukebox where one could find any song on demand. Where television was a wasteland of reality television and screaming heads, the Internet provided a laboratory for dissecting the spin. Bloggers began to blog, and citizen media began to take off in all its messy, grammatically dubious glory. The democratizing effects of the Internet had become impossible to ignore, and the myth of digital emancipation, delivered over open networks, continued to grow.

Notes to Section I

1. The center/edge terminology commonly used in describing the Internet is distinct from the core/periphery discourse found in political science, political economy, and development literatures. For a description of the center/edge construction in network theory, see Lawrence Lessig, Code: And Other Laws of Cyberspace, Version 2.0 (New York: Basic Books, 2006), 44. ("Rather than build into this network a complex set of functionality thought to be needed by every single application, this network philosophy pushes complexity to the edge of the network—to the applications that run on the network, rather than the network's core. The core is kept as simple as possible. [...] This design principle was named [...] as the end-to-end principle. It has been a core principle of the Internet's architecture, and, in my view, one of the most important reasons that the Internet produced the innovation and growth that it has enjoyed.")
2. Christopher Stern, "The Coming Tug of War Over the Internet," The Washington Post, January 22, 2006, B01. (Summarizing the debate over net neutrality: "Would these new fees imposed by carriers alter the basic nature of the Internet by putting bumps and detours on the much ballyhooed information superhighway? No, say the telephone companies. Giving priority to a company that pays more, they say, is just offering another tier of service—like an airline offering business as well as economy class. Network neutrality, they say, is a solution in search of a problem.")
3. Siva Vaidhyanathan, "Afterword: Critical Information Studies," Cultural Studies 20 (2006): 292-315. (Providing an excellent overview of the field he seeks to define.)
4. Ibid at 303.
5. For a modern liberal articulation of the distinction between 'positive' and 'negative' rights, see Stephen Holmes and Cass R. Sunstein, The Cost of Rights: Why Liberty Depends on Taxes (W. W. Norton & Company, 2000). (Arguing, inter alia, that meaningful rights demand not only protection, but facilitation for their exercise, from the state.)
6. John Fiske, Television Culture, 1st ed. (New York and London: Routledge, 1987).
7. Stuart Hall, "Encoding/Decoding," in The Cultural Studies Reader, ed. Simon During (New York and London: Routledge, 1993), 90-103.
8. This literature is discussed more fully in Section III(C).
9. Julie E. Cohen, "Pervasively Distributed Copyright Enforcement," Georgetown Law Journal 95, no. 1 (November 2006): 1-48, 2.
10. Cohen 2006, 19-29.
11. Cohen's chief departure from Foucault's vision of normal discipline is in whether and how power is exercised. For Cohen, power is constantly exercised, but at a level that affects structures just outside our view, so that individual actions feel like agency within the "natural" constraints of the world. Perfectly implemented DRM would disappear, as it would define the wholly normalized relationship between a cultural artifact and its user. When a user tries to deviate from this relationship, 'pervasively distributed copyright enforcement' will restrain her actions; its exercise of power is constant but quiet. Foucault claimed, on the other hand, that "the perfection of power should tend to render its actual exercise unnecessary." Michel Foucault, Discipline & Punish: The Birth of the Prison (Vintage, 1995), 201.
12. Cohen 2006, 29.
13. The "Darknet Paper," written by several DRM designers from Microsoft, heavily influences this camp. It argues that an unprotected, unauthorized version of protected content will always be produced if it is useful or interesting to do so, and that private or semi-private networks will be used in its distribution. Peter Biddle et al., "The Darknet and The Future of Content Distribution," in Proc. ACM Conference on DRM, 2002, http://crypto.stanford.edu/DRM2002/darknet5.doc. For an example of how this technical argument is applied in legal/policy arguments, see Fred von Lohmann, "Measuring the DMCA Against the Darknet: Implications for the Regulation of Technological Protection Measures," Loyola Entertainment Law Review 24, no. 4 (2004): 635.
14. Tarleton Gillespie, Wired Shut: Copyright and the Shape of Digital Culture (Cambridge: MIT Press, 2007), 7.
15. Benkler 2003, 1265.
16. Henry Jenkins, Convergence Culture: Where Old and New Media Collide (New York: NYU Press, 2006), 3.
17. The literature on myth and technology is rich and extensive. In perhaps the most important work on this relationship, media historian James Carey identifies the "mythos of the electronic revolution," and describes some representative futurists—including Buckminster Fuller and Alvin Toffler—as follows: "They all convey an impression that electrical technology is the great benefactor of mankind. Simultaneously, they hail electrical techniques as the motive force of desired social change, the key to re-creation of a humane community, the means for returning to a cherished naturalistic bliss. Their shared belief is that electricity will overcome historical forces and political obstacles that prevented previous Utopias." James W. Carey, "The Mythos of the Electronic Revolution," in Communication as Culture: Essays on Media and Society, 1st ed. (Routledge, 1992), 115. For a more recent inquiry into the mythology of networks like the Internet, see Vincent Mosco, The Digital Sublime: Myth, Power, and Cyberspace (Cambridge: The MIT Press, 2004). For a more wide-ranging account of the ways in which early communications technologies were incorporated into other rhetorical strategies—i.e. nationalism and militarism—see William Boddy, New Media and Popular Imagination: Launching Radio, Television, and Digital Media in the United States (Oxford University Press, USA, 2004).
18. See generally Galloway's discussion of Deleuze and Guattari, where he explains the metaphor of a rhizome in relation to information networks. Alexander Galloway, Protocol: How Control Exists After Decentralization (Cambridge: MIT Press, 2004), 33-38.
19. The term 'postmodern' is notoriously resistant to definition, but here I refer to its association with the death of meta-narrative. The Internet, a communications medium that empowers average citizens to construct their own information environments and broadcast them to the world, is the logical heir to the monolithic media institutions of the modern age. For more on this articulation of postmodernism, see David Harvey, The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change (Wiley-Blackwell, 1992), 44-45.
20. Some political economists argue that the retreat of telecommunications regulators was anticipated and encouraged by the logic of capitalism. See, e.g. Dan Schiller, Digital Capitalism: Networking the Global Market System (The MIT Press, 2000), 88. ("Capital's stewardship of the Net, taking the form of multilateral support for cyberspace as a stateless jurisdiction, works to ensure that the market development process will only deepen and broaden the incursions on national sovereignty.")
21. For an excellent overview of the link between 1960s American counterculture and the techno-utopianism of the early Internet, see Fred Turner, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (University of Chicago Press, 2008).
22. Susan Crawford, "Someone to Watch Over Me: Social Policies for the Internet," Cardozo Legal Studies Research Paper 129 (2005): 3.
23. John Perry Barlow, "A Declaration of the Independence of Cyberspace," Electronic Frontier Foundation Website, 1996, http://www.eff.org/pub/Publications/John_Perry_Barlow/barlow_0296.declaration
24. At the time, the "old guard" was synonymous with the United States government. For an account of early Internet policy battles over cryptography and free speech, see Steven Levy, Crypto: How the Code Rebels Beat the Government—Saving Privacy in the Digital Age, 1st ed. (New York: Penguin, 2001), and Bruce Sterling, The Hacker Crackdown: Law And Disorder On The Electronic Frontier (New York: Bantam, 1993).
25. Paulina Borsook, Cyberselfish: A Critical Romp through the Terribly Libertarian Culture of High Tech (PublicAffairs, 2001).
26. Turner 2006, 4-5. (Describing how Internet pioneers like Stewart Brand came from a tradition of 1960s counterculture and Vietnam-era rejection of government interference with everyday life.)
27. For an interesting discussion of the historical and metaphorical link between Internet "pioneers" and the post-Vietnam "back to the land" movement, see Turner 2006, 69-102.
28. Barlow coined this term and was one of three founders of the Electronic Frontier Foundation, the first Internet policy think tank and my employer between 2001-2007. For more on the rhetoric of the electronic frontier, see Turner 2006, 282 note 85.
29. John Perry Barlow, "The Economy of Ideas," Wired, March 1994, http://www.wired.com/wired/archive/2.03/economy.ideas.html
30. Barlow 1996, 1.
31. Timelines like this are inherently subjective, but I submit that the 1990s best fit this description. For example, this period saw the development and widespread use of the World Wide Web.
32. See, e.g. Borsook 2001, 73-114.
33. Steve Jackson Games, Inc. v. United States Secret Service, 36 F.3d 457 (5th Cir. 1994).
34. See generally Levy 2001.
35. Section III(D) explains more about cryptographic theory and the ways in which it has been used to enable networks of control, not just technologies of individual empowerment.
36. Electronic Frontier Foundation, "Judge Patel to Decide if Government Restrictions on Cryptography Violate the First Amendment," Electronic Frontier Foundation (September 18, 1996). http://www.eff.org/press/archives/2008/04/21-38 (accessed August 29, 2008).
37. Bernstein v. United States Dept. of Justice, 192 F.3d 1308 (9th Cir. 1999) (Holding, for the first time, that computer code is expressive content that qualifies for protection under the First Amendment) and Bernstein v. DOC, 2004 U.S. Dist. 6672 (N.D. Cal. April 19, 2004).
38. See, e.g., Harvey J. Levin, "Competition Among Mass Media and the Public Interest," The Public Opinion Quarterly 18, no. 1 (Spring 1954): 62-79. (An early version of the argument that competition among media outlets was essential to their democratic functions.)
39. Angela Campbell, "A Public Interest Perspective on the Impact of the Broadcasting Provisions of the 1996 Act," Federal Communications Law Journal 58 (2006): 455-476, 456.
40. Campbell 2006, 464.
41. George Williams & Scott Roberts, Media Bureau, FCC, Radio Industry Review 2002: Trends in Ownership, Format, and Finance (2002): 3-4.
42. Peter DiCola & Kristin Thomas, The Future of Music Coalition, Radio Deregulation: Has it Served Citizens and Musicians? (2002), 3.
43. See, e.g., Ben H. Bagdikian, The New Media Monopoly (Boston: Beacon Press, 2004), 16. ("In 1983 there were fifty dominant media corporations; today there are five. These five decide what most citizens will—or will not—learn").
44. See, e.g., James Curran, "Rethinking Media and Democracy," in Mass Media and Society, ed. Michael Gurevitch (Oxford University Press, 2000): 120-154.
45. See, e.g., James Winter, Democracy's Oxygen: How Corporations Control the News (Montreal: Black Rose Books, 1997).
46. Pippa Norris, A Virtuous Circle: Political Communications in Postindustrial Societies, 1st ed. (Cambridge University Press, 2000), 4. (Norris suggests that the moral panic around media consolidation is producing an urge to shoot the messenger, and that democracy suffers from "more deep-rooted flaws in systems of representative government").
47. Robert W. McChesney, Corporate Media and the Threat to Democracy (Open Media, 1997), 30.
48. See, e.g., Neil Postman, Amusing Ourselves to Death: Public Discourse in the Age of Show Business (New York: Penguin, 1986).
49. Philip Elmer-DeWitt, "First Nation in Cyberspace," Time Magazine (Online Edition), December 6, 1993, http://www.time.com/time/magazine/article/0,9171,979768,00.html
50. Galloway 2004, 38.
51. Lawrence Lessig, Code and Other Laws of Cyberspace (New York: Basic Books, 2000), 33.

II. The Diversionary Effects of Network Neutrality

"Together these shifts can move the boundaries of liberty along all three vectors of liberal political morality. They enable democratic discourse to flow among constituents, rather than primarily through controlled, concentrated, commercial media designed to sell advertising, rather than to facilitate discourse. They allow individuals to build their own windows on the world, rather than seeing it largely through blinders designed to keep their eyes on the designer's prize. They allow passive consumers to become active users of their cultural environment, and they allow employees, whose productive life is marked by following orders, to become peers in common productive enterprises."

-Benkler1

After the turn of the century, as technolibertarians and progressive media activists started seeing eye-to-eye on the essential character of the Internet, the rhetoric of openness moved from a design principle to a full-blown social movement. Academics like Benkler began to imagine how open networks would transform the freshly minted information economy—one still organized around industrial information production—into a networked information economy that empowered ordinary people. Open source software, typically written by distributed groups of part-time programmers who had never met, seemed to prove this point. By using the Internet to design, build, and distribute software, open source advocates began to successfully compete against the world's largest corporations. All of these practices, mundane and revolutionary, were aided by the neutral character of the network that connected the practitioners. In modern discussions of the Internet, openness has achieved totemic powers.

It is therefore unsurprising that attacks on the perceived openness of the Internet have produced the most vigorous fight over Net governance to date. The recent political history of the Internet, especially in the United States, has been dominated by the struggle to ensure that those who control significant portions of the Internet's physical infrastructure do not undermine what has come to be known as "network neutrality."2

Unfortunately, this binary choice between an open and closed Internet is an oversimplification, both of what the network is today, and of the directions in which it may develop. On closer inspection, the Internet is far from neutral; it is constantly managed by parties large and small. What's more, it is this very management that keeps the whole thing working. However, the focus on an open/closed binary suggests that only an open Internet will fulfill the medium's potential and that all we must do is ensure that the network remains neutral. The truth is that a neutral network may bar obvious forms of coercion, but it also facilitates—or at least does nothing to combat—emerging webs of control. This paper argues that for all of the talk about the Internet's technical resistance to regulation, strategies of control not only remain, but also are poised to thrive, largely because of recent changes in how technical standards are set and how digital rights management technologies are designed.

Why has this phenomenon received so little attention? In part because of another Internet governance issue that fits more comfortably in the discourse of openness. This other issue is easier to understand and address, as both the villains and protagonists in its narrative perform their appointed roles: telecommunications companies abuse their monopolies, the public cries out for redress, and government regulators mete out punishment. However, closer examination suggests that this story is more complicated, and that its successful resolution would have no effect on the spread of distributed control networks. Therefore, understanding control networks must begin with a critical examination of the nature of net neutrality.

A. What is Net Neutrality?

In the past four years, several American telecommunications executives have made public statements about their plans to start exerting centralized control over Internet traffic. For example, in 2005, SBC (Southwestern Bell Corporation) Communications CEO Ed Whitacre said in reference to application providers like Google:

"Now what they would like to do is use my pipes free, but I ain't going to let them do that because we have spent this capital and we have to have a return on it. So there's going to have to be some mechanism for these people who use these pipes to pay for the portion they're using. Why should they be allowed to use my pipes?"3

These comments were especially alarming because they came just a month before SBC acquired AT&T4 in a merger that substantially recreated the AT&T monopoly that was dismantled by federal regulators in 1984.5

The bluntness of these threats was a surprise to those who had come to think of the Internet as an electronic frontier where old rules did not apply. As a practical matter, how could such a threat be carried out on the Internet, a medium that was supposed to be designed to withstand central control? It turns out that the Internet's anti-authoritarian architecture is a matter of scope. Globally, the Internet is as the technolibertarians promised: distributed, adaptive, and robust. But if you zoom in to the individual user, a very different picture emerges. The Internet might hold an ocean of data, but the average user sips it from a single cable, telephone wire, or over-the-air connection that is owned by a local monopoly or duopoly. The telephone and cable companies, by virtue of their role in providing Internet service, have the potential to be the gatekeepers of the Net.

Cyberspace is not beyond the reach of centralized modes of control, but the institutions positioned to exercise such control had, until recently, chosen not to use it. In fact, using available technology, ISPs can easily and effectively separate Internet users into audiences that are exposed to only a subset of the Internet's content. In these commercial ghettoes, data from companies that pay to access an ISP's stable of subscribers would be favoured, while data from sources that refuse to pay a tithe could be slowed or blocked.6 In the same way that broadcast television stations are said to sell 'eyeballs' to advertisers, the attention of Internet users would be auctioned to the highest bidder. The exercise of these powers would be anathema to those who trumpet the emancipatory potential of the Internet and undermine any notion that it is an inherently resistive space. In short, it would expose the Internet to the same monopolistic depredations that afflicted "old media." Thus most of the intellectual and political capital currently being spent on questions of Internet policy is dedicated to one question: how can we 'save the Internet' by legislating neutrality?7
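A toy scheduler shows how little machinery such tiering would require. The partner list and tier numbers below are invented for illustration; the point is simply that servicing a queue by who pays, rather than by arrival order, is trivial for whoever owns the last mile:

```python
import heapq

# Hypothetical commercial arrangement: paying sources get low tier numbers.
PAID_TIERS = {"partner-video.example.com": 0}
DEFAULT_TIER = 9  # everyone who has not paid the tithe

def enqueue(queue, arrival, source, data):
    # The heap orders packets by (tier, arrival), so paid traffic
    # is always serviced before unpaid traffic that arrived earlier.
    tier = PAID_TIERS.get(source, DEFAULT_TIER)
    heapq.heappush(queue, (tier, arrival, source, data))

queue = []
enqueue(queue, 0, "independent-blog.example.org", b"post")
enqueue(queue, 1, "partner-video.example.com", b"frame")
# Despite arriving second, the partner's packet is serviced first.
assert heapq.heappop(queue)[2] == "partner-video.example.com"
```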

This highlights a puzzle about the contemporary North American schizophrenia surrounding how and whether the Internet should be governed. For most of the 20th century, the typical response to monopolistic behaviour was government competition law and policy. These actions took the form of antitrust prosecutions and any number of regulations to help "order" the market. However, contemporary valorization of free markets—and an attendant distaste for regulation—runs deep in the intellectual contingent of Internet gurus. Many of the most vocal figures associated with Internet policy were intellectually influenced by traditions that are highly suspicious of government intervention. In the 1990s, the Internet's most vocal proponents argued that it would invert the oppressive structures of traditional mass media because it technically embodied a libertarian Zeitgeist. Its resistance to regulatory interference was the very heart of its emancipatory promise. Today, little more than a decade since the Internet entered the public consciousness, those who believed in its democratizing promise are most often found lobbying for more government regulation. Indeed, much of the most visible, effective work on Internet governance policy is intended to attract oversight from the very agencies (e.g. the Federal Communications Commission and Canadian Radio-television and Telecommunications Commission) that were rejected time and again in earlier moments of Net advocacy.

Why the shift? In part, I would argue, because 'traditional' media scholarship, with its attendant political and economic framework, has inspired a sort of regulatory nostalgia. Over the course of the 20th century, scholars and activists were able to point to a clear causal relationship between the centralization of control in the mass media and the relaxation of government oversight. For example, media studies scholar Ben Bagdikian argued, "The indiscriminate passion for deregulation of everything by corporate-minded ideologues has produced unmitigated disaster for cities and states throughout the United States, in the economy and particularly in the relationship or lack of it between the mass media and the American public."8 Preoccupation with corporate power and the belief that state power was the most likely, effective, or desirable way to check it, are defining characteristics of that intellectual tradition. For many in the world of critical media studies, government regulation is a sought-after commodity. With this history in mind, it makes sense that investigators of 'offline' media would gravitate towards Internet governance issues that can be articulated in familiar terms: private firms accumulate too much power over communications infrastructure, and the public seeks corrective state action.

These arguments have proven convincing, and the emergent alliance between pro-regulation media scholars and anti-establishment Internet exceptionalists has produced an unexpected synergy. Both camps identified a serious threat to one of the Internet's core characteristics and, more unusually, agreed on a credible solution to that threat. The result has been a surprisingly effective and vigorous pursuit of government intervention. For example, FreePress.org, a lobbying group founded by media studies scholar McChesney, which now counts Lawrence Lessig as a board member, has been the primary organizing force behind the activity of SaveTheInternet.org. That coalition contains hundreds of other advocacy groups from across the political spectrum, from the American Civil Liberties Union to the Gun Owners of America. Since 2005, it has succeeded in getting at least four pieces of draft legislation introduced in the United States Congress.9 The group claims to have defeated an anti-net neutrality bill after over one million of its supporters contacted Congress using activism tools from its website. On a number of axes, net neutrality has sparked the most mainstream, effective grassroots political response of any Internet policy issue to date.

None of this is to say that all other discussions of Internet policy have fallen silent. It is, however, a powerful indicator of how the open/closed dichotomy has come to dominate discussions about the future and health of the Internet. It is also a testament to the effectiveness of harnessing the narrative of openness. What may come as a surprise is how much of that narrative is nonetheless fallacious.

B. How Neutral is the Net?

Under closer scrutiny, the open "nature" of the Internet becomes more complicated.10 While data is supposed to travel unmolested from one end of the Web to another, numerous exceptions actually exist. Network operators routinely make management decisions that block certain traffic in order to maximize performance.11 In some cases, those choices are motivated by normative considerations, rather than technical necessity.12 If a single actor controls enough of the technical infrastructure, the open protocols of the Internet can be overcome through brute force. This is the kind of control to which ISPs aspire, but China is a more typical example; there and elsewhere, the state keeps disfavoured content from reaching its citizens.13 Smaller-scale filtering regimes are sold on a commercial basis to institutions throughout the world, though their use is typically limited to offices and publicly funded Internet access points.14 Subtle forms of techno-political control exist as well. Network theorist Alexander Galloway argues that the Internet's protocols are themselves a form of control that mirror the Deleuzian model of the rhizome.15 Political economist Daniel Pare, in his study of the bodies that allocate Web domain names, has pointed out that the governance of the Internet is more hierarchical than is widely acknowledged.16 It seems incontrovertible that neutrality is a contested, contingent feature of the Internet, not its 'natural' state.

However, not all network discrimination is undertaken for nefarious purposes; in some cases, discrimination enables the network to function. For the vast majority of its users, the operation of the Internet is a given. Messages travel instantly from one glowing screen to another; typing numbers into a website can cause real goods to be delivered to the front door. So familiar have we become with this magic that we remark on these practices primarily when they malfunction. The truth is considerably messier. The Internet is composed of millions of interconnected subnetworks, each of which must be kept running against hardware failures, unplugged power strips, severed undersea cables, and countless other material threats to its ephemeral stability. The world's sysadmins are constantly making decisions about which services to allow and which network policies to implement. For example, modern network management norms make it acceptable to block certain kinds of data traffic on Internet-connected email servers. In the past, email servers operated like a network of postal drop boxes: any correspondence that had enough postage (read: "was properly encoded") and was sent to a valid address would be routed into the larger postal network, and it would eventually be delivered to its intended recipient. Anonymity, or at least pseudonymity, was easy to achieve. Over time, resourceful spammers realized that they could use this feature of email servers to obfuscate their identities, circumventing anti-spam measures and flooding the Internet with messages about cheap Vl@gr@. To combat this excessive openness, the world's network administrators have coalesced around the practice of instructing email servers not to forward anonymous messages to the wider Internet, essentially ripping out the old post boxes and replacing them with post offices where identification must be presented before services are rendered. This kind of change accounts for the social values of Internet users—most people do not like spam—by restricting the network's technical neutrality.
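The shift can be captured in a few lines of code. The sketch below is a toy model in Python, with invented function and parameter names rather than any real mail server's interface; it shows the policy change from the open 'drop box' to the modern rule that outbound relay requires identification.

    def accept_for_relay(sender_authenticated: bool, recipient_is_local: bool) -> bool:
        """Toy mail-relay policy; the names and logic are illustrative only.

        Old 'drop box' servers relayed any properly encoded, properly
        addressed message. The modern norm accepts mail destined for
        local mailboxes but relays outbound mail only for identified senders.
        """
        if recipient_is_local:
            return True               # delivering to our own users is always allowed
        return sender_authenticated   # outbound relay now requires identification

    # An anonymous message bound for the wider Internet is refused:
    assert accept_for_relay(sender_authenticated=False, recipient_is_local=False) is False

The single conditional is the entire normative change: mail still flows, but no longer anonymously.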

Despite the existence of non-neutral practices on the Internet, net neutrality advocates insist that the kind of centralized control now threatened by ISPs is of a different order. Their campaign is directed only at bad discrimination, not the 'good' discrimination that keeps the network running. While this may accurately reflect advocates' intentions, it is technically incoherent and, given their own premises, ideologically incoherent as well, for it undermines the movement's core claims: that the Internet is technically open, and that openness is always good. Rhetoric about openness, freedom, and neutrality obscures the real constraints that exist in every complex system, undermining the accuracy of their assessments and preventing necessary change. Moreover, continuing to argue that openness is the lynchpin for realizing the Internet's democratic potential diverts attention from other, more insidious forms of control.

C. Open Paths, Closed Packages

The work being done by net neutrality advocates is not wasted, even if it is orthogonal to the other threats described in this paper. In fact, a meaningful response to the rise of DRM will likely require similar appeals to government regulators and to the public at large. Moreover, understanding those threats will require a more detailed analysis of DRM systems and the political and economic structures that enable them. Both politically and intellectually, such an intervention will likely need to build on the degree of government regulation that Internet exceptionalists have come to embrace. However, net neutrality remains a strategic diversion from new models of information control.

Net neutrality is, at its core, an economic theory that seeks to describe the conditions that encourage rapid technical innovation,17 and that focuses on the Internet as a transport medium. A neutral network explicitly rejects inquiries into the nature of the cargo that it carries. Yet the nature of that cargo and, more importantly, the restrictions that travel with it are of utmost importance. For example, open digital networks can be used to deliver cultural artifacts that contain technological usage restrictions more severe than those contemplated by intellectual property law, and those restrictions rely on hardware and software that are inconsistent with the supposedly emancipatory thrust of the Internet.18 When power is used to discipline objects and not just the networks that carry them, it becomes critical to question what flows and how it can be experienced, not just the direction or speed of its flow. At a normative level, net neutrality is meant to ensure that network policies do not frustrate free use of the Internet, but I wonder about uses that ought to be frustrated. If the Internet becomes a 'free' and 'open' delivery mechanism for content wrapped in digital chains, would it be better than a 'controlled' network that discriminates in favour of content that may be freely used? A network that is non-neutral in ways that benefit the public interest is technically possible, but its discussion is taboo so long as openness is seen as an end in itself.19

Neither a libertarian rejection of public authority nor a liberal reliance on state-directed competition regulation, I would suggest, will address the restrictions to our cultural practices in digital environments that the private sector is already planning. Network neutrality advocates are not railing against an imaginary problem; the danger they describe is real. The trouble is that neutrality is insufficient as a means to address it. Worse, it might be characterized as a Gramscian exercise in resistance theatre, which is to say that it produces the illusion of resistance while actually supporting the dominant capitalist ideology. In particular, technological advances in the field of digital rights management (DRM) threaten to make cultural artifacts—whether delivered over "open" networks or not—into a billion tiny fiefdoms of control.

This kind of multi-threaded universe, where 'closed' media coexist with media of 'freedom,' may or may not be desirable. Yet it is certainly the world towards which we are headed if current attempts to "save the Internet"20 do not expand to deal seriously with the emergence of control regimes that have little to do with telecommunication and broadcasting infrastructure. Even if pro-regulation net neutrality activists achieved their wildest legislative dreams, there is no reason to believe that the spread of DRM-wrapped content would be affected. While such success would prevent those with infrastructure monopolies from abusing their positions, it would not address the spread of DRM systems that restrict our ability to use information, in ways that are inconsistent with the democratic promise of the Internet. As private technical regimes are overlaid on top of an "open" Internet, the most pressing question for media theorists will not be about who owns the wires, but which technologically enforced norms apply to the content that travels over them.

It is therefore critical to expand beyond network-centered visions of control, which typically revolve around how to regulate the owners of communications infrastructure. This model has deep roots in the communication and media studies camps discussed earlier, and it made sense when media policy was primarily concerned with securing public interest access to scarce communicative resources. The political economy of centralized media pushed issues of control over infrastructure—broadcast licenses, cable networks, etc.—to center stage. But today, controlling infrastructure is only one way of managing information. Instead of network- or infrastructure-oriented notions of control, I argue that object-oriented control, where disciplinary regimes can be encoded into individual cultural artifacts, is increasingly important.21 The following section investigates the nature of other, newer, object-oriented forms of control.

Yochai Benkler, "Freedom in the Commons: Towards a Political Economy of Information (Lecture)." Duke Law Journal 52.6 (2003): 1245-1276, 1249. 2 For a discussion of the somewhat complicated motives for initiating such a change, see Tim Wu, "Network Neutrality, Broadband Discrimination," Journal of Telecommunications & High Technology Law 2 (2003): 141-76, 149-56.

33 3 Patricia O'Connell, "At SBC, It's All About 'Scale and Scope'," BusinessWeek.com, November 7,2005, http://www.businessweek.com/@@n34h*IUQu7KtOwgA/magazine/content/05_45/b3958092.htm 4 Dawn Kawamoto, "SBC to acquire AT&T for $16 billion," ZDNet, January 31, 2005, http://news.zdnet.com/2100-1035-5556793.html 5 See, e.g. Jonathan B. Baker, "The Case for Antitrust Enforcement," SSRN eLibrary (2003): 12-13, http://ssrn.com/paper=452182 6 For a discussion of how this might be done, see Edward W Felten, "Nuts and Bolts of Network Neutrality," AEI-Brookings Joint Center for Regulatory Studies, August 2006, 1-14, http://www.aei-brookings.org/publications/abstract.php?pid=1106 7 See, e.g., SaveTheInternet.com, Free Press Action Fund, http://savetheinternet.com 8 Ben H. Bagdikian, The Media Monopoly 6th Edition, (Boston: Beacon Press, 2000), 137. SaveTheInternet.com, Free Press Action Fund, http://savetheinternet.com. 10 This discussion revolves around users of open networks, but of course the benefits of an open network are limited to those users who are able to access it. For people on the wrong side of the digital divide, openness is as illusory as it is irrelevant. An exceedingly thorough account of this issue can be found in Pippa Norris, Digital Divide: Civic Engagement, Information Poverty, and the Internet Worldwide (Cambridge University Press, 2001). 11 For example, efforts to combat unsolicited email—"spam"—frequently take the form of network management decisions. See, e.g. Joshua Goodman, Gordon V. Cormack, and David Heckerman, "Spam and the Ongoing Battle for the Inbox," Communications of the ACM 50, no. 2 (February 2007): 25-31. 12 For example, in 2005 Telus, a major Canadian ISP, blocked access to the website of a labour organization with which it was feuding. See Ian Austen, "A Canadian Telecom's Labor Dispute Leads to Blocked Web Sites and Questions of Censorship," The New York Times, August 1,2005, sec. Business / World Business, http://www.nytimes.com/2005/08/01/business/worldbusiness/01telus.html 13 For more information about technologies used by the Chinese government to filter Internet content, see Jonathan Zittrain and Benjamin G. Edelman, "Internet Filtering in China," SSRN eLibrary, http://ssrn.com/paper=399920. l4See, e.g. Paul T. Jaeger, John Carlo Bertot, and Charles R. McClure, "The effects of the Children's Internet Protection Act (CIPA) in public libraries and its implications for research: A statistical, policy, and legal analysis," Journal of the American Society for Information Science and Technology 55, no. 13 (2004): 1131-1139. 15 Galloway 2006, 33-28. 16 Daniel J. Pare, Internet Governance in Transition: Who is the Master of this Domain? (Oxford: Rowman & Littlefield Publishers, Inc, 2003), 2-5. 17 Tim Wu, "Network Neutrality, Broadband Discrimination," Journal of Telecommunications & High Technology Law 2 (2003): 141-176, 145. 34 Not all usage restrictions are technical, of course. Where intellectual property law can set a baseline for the relationship between rights holders and users of cultural artifacts, private ordering can dramatically alter that relationship. See, e.g. Niva Elkin-Koren, "Contracts in Cyberspace: Rights Without Laws," Chicago-Kent Law Review 73 (1998): 1155. Section III(A) describes attempts to battle peer-to-peer file sharing by changing the technical characteristics of the Internet. For instance, network operators can block data transferred via protocols associated with popular file sharing software. 
The same principle could be applied to data that is wrapped in digital restrictions: network operators could identify that traffic and slow or block it as they see fit. Both practices would be violations of net neutrality principles, but an environment modeled after the latter would arguably reduce the distribution of'controlled' or 'closed' content in our information environment. I take no position on the wisdom of such a policy, but point out that it cannot be rationally evaluated without the ability to discuss the merits and demerits of openness, and an investigation of whether the digital control mechanisms at issue are likely to be effective. That question is taken up in Section III(D). This phrase has been popularized by the Save the Internet coalition. The theme of object-oriented control is discussed more fully in Section III.

III. Distributed Control: DRM as an Overlay Network

This section begins with a description of the layered structure of the Internet and how it can accommodate a theoretically infinite number of "overlay networks."1 To date, most overlay networks have empowered Internet users and some, like peer-to-peer (P2P) networks, have proven remarkably resistant to regulation. However, similarly robust networks that constrain the use and distribution of information—instead of encouraging its dissemination and promiscuous use—are also being developed. The key shift in this area, I submit, is the development of robust, networked DRM technologies. I therefore dedicate the remainder of this section to an examination of how DRM works, why it fails, and how it is getting better.

A. Overlay Networks: Proliferation of Paths

Understanding the interaction of DRM and the Internet requires an understanding of the layered construction of communications networks. Though multiple articulations of this concept exist,2 I prefer the model suggested by Lessig, which has the virtues of elegance and simplicity.3 In his formulation, a communications network is composed of four distinct layers: a physical link, rules of logic, an application type, and content. Each layer supports the subsequent layer, and changes to lower levels will ripple upwards.

Take, for example, a call made via the original telephone system:4 the content layer of the call is a birthday greeting; its application layer is voice communication; its logical layer is the electrical pattern of modulation and demodulation that carries the sound; and its physical layer is a strand of copper wire strung between two people. If you cut the wire, all of the other layers are severed as well. But if you stop talking, the three layers below the content layer will continue to exist. Power, therefore, has traditionally increased at lower levels of the network.
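On the assumption that a small schematic helps, Lessig's four layers can be rendered as a simple data structure. The following Python sketch is my own scaffolding, not Lessig's formalism; only the layer names and the telephone example come from the text.

    from dataclasses import dataclass

    @dataclass
    class LayeredNetwork:
        physical: str     # the material link; cut it and everything above fails
        logical: str      # the rules or protocols organizing signals on the link
        application: str  # the service built on those rules
        content: str      # what the service actually carries

    # The original telephone call described above:
    phone_call = LayeredNetwork(
        physical="a strand of copper wire strung between two people",
        logical="electrical modulation and demodulation of sound",
        application="voice communication",
        content="a birthday greeting",
    )

Emptying the content field leaves the lower three fields intact, while invalidating the physical field invalidates everything above it: that asymmetry is the concentration of power at lower levels that the text describes.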

The Internet is a collection of distinct physical networks that are joined by the logical rules—the protocols—that comprise the Internet. More important, from the perspective of an Internet user, is its ability to support multiple application networks. The Internet provides the environment in which services like the World Wide Web, email, and instant messaging can be deployed, and this application diversity is a large part of the Internet's appeal. However, this model—one physical layer, multiple services—is not how communications networks were originally conceived. In the classic model of communications networks, a single firm owned the physical layer and was subsequently entitled—legally or by fiat—to control the logic and application layers. The content layer was typically regulated by other means, such as laws barring fraud, harassment, etc. This arrangement gave infrastructure owners like the telephone company the power to dictate the uses of the network. It also had the effect of centralizing the process of innovation. This is why, for most of the 20th century, the telephone network was designed and used only for telephone calls.

The single-use network began to fall out of favour in the late 1960s, as it became apparent that a single set of physical links could be used for multiple, undiscovered, potentially beneficial applications. The American telephone system was a strictly controlled fiefdom of the phone companies for most of the 20th century. Any equipment that attached to the network had to be approved by the phone company, and the bulk of the approved equipment was manufactured by a subsidiary of AT&T. In 1968, the Federal Communications Commission (FCC) fundamentally changed this dynamic by decreeing that any device that did not interfere with the network could be attached to it.5 Known as the Carterfone Decision, this change effectively decoupled ownership of the physical layer from control over the logic and application layers. By forcing infrastructure companies to open their wires to competing applications, the new model favoured a proliferation of services. Within a few years, consumers were able to use answering machines, fax machines, and even modems. The Internet is itself an overlay on a number of physical networks—cable television, telephone, satellite, etc.—and it provides a unifying logical layer from which even more applications and overlay networks can emerge.

Figure 1: Layered Networks

Note that Figure 1.B could illustrate other physical networks as well; the wires originally laid for cable television and the ether that carries wireless signals are equally capable of providing the physical layer on which Internet protocols, and the wealth of applications that rely on them, can travel.

Restrictions on manipulation of lower levels of the network can make regulating the higher levels more difficult. Consider the archetypical example of an overlay network on the Internet: peer-to-peer (P2P) file sharing systems. Though the technical details vary slightly, true P2P networks are characterized by individual computers, connected to the Internet, running software that advertises files on a user's computer while allowing that user to browse the files of others. As a result, P2P networks have been the targets of concerted legal campaigns for their role in the unauthorized distribution of copyrighted material. In the United States alone, the recording industry has commenced litigation against more than 20,000 individuals6 and more than a half-dozen file-sharing software makers.7 These legal efforts have had little effect.8 The most effective—though by no means entirely successful—methods of controlling P2P systems have involved logic-level changes to the network. For example, ISPs have used strategies like bandwidth throttling, where the traffic a customer may generate is slowed or capped at a small amount each month, or blocking the ports used by popular P2P programs. These are, of course, the core practices that neutrality advocates protest.9
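A minimal sketch of such logic-level discrimination, assuming invented port numbers and rate limits rather than any actual ISP's configuration, might look like this:

    # Toy logic-layer policy: discriminate by destination port.
    # Port 6881 is conventionally associated with BitTorrent; the limits are invented.
    POLICIES = {
        6881: {"action": "throttle", "max_kbps": 64},  # slow suspected P2P traffic
        25:   {"action": "block"},                     # refuse unauthenticated relay traffic
    }

    def classify(dst_port: int) -> dict:
        """Return the treatment this toy network gives a packet."""
        return POLICIES.get(dst_port, {"action": "forward"})

    print(classify(6881))  # {'action': 'throttle', 'max_kbps': 64}
    print(classify(443))   # {'action': 'forward'}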

If P2P represents a kind of overlay network that increases the mobility of information, what about networks that restrict the use and transport of information goods? Might those networks of control be similarly robust? If all the money in the world has failed to meaningfully reduce the popularity of P2P networks, how will average citizens impact the deployment of networked technologies of restriction? It is worth reconsidering the ideological opposition to non-neutral network management strategies if restrictive overlay networks, like P2P networks, benefit from these open paths. The following sections describe how DRM technology is evolving into just such an overlay network.

B. DRM and the Evolution of IP Metaphors

DRM refers to any technical system that conditions the use and distribution of digital information. Its development was pioneered by copyright holders intent on stanching the unauthorized flow of copyrighted material on the Internet, but it has grown into a much more ambitious project. It raises questions about the "nature" of digital information, whether or not it "wants to be free," and the familiar forms of property to which it should be analogized. Is downloading a song without authorization the moral and legal equivalent of shoplifting a compact disc? Are people who do those things comparable to seafaring, eye-patched buccaneers? The popular discourse around intellectual property, which tends to overshadow virtually every other conversation about information commoditization, is caught up in a struggle to determine which metaphors to use for creative work. In many ways, the debate over DRM is a discussion about the contours of a shared hallucination: what kind of familiar object will we imagine information goods to be?

Intellectual property rights holders first answered this question long ago: intellectual property should be treated like physical property.10 For example, the world's first copyright law, the Statute of Anne, was adopted in 1709 in the midst of Britain's enclosure movement. At that time, the concept of private property was just emerging as the moral foundation for enclosure because, as John Locke argued, property was a natural right that emerged through the application of labour. He reasoned that working a piece of land generated a moral "right" to that property and to the wealth it produced in perpetuity.

In contrast, the Statute of Anne provided for exclusive rights to the products of intellectual work for only 21 years. At the end of this term, the work was supposed to revert to the "public domain," a phrase borrowed from enclosure battles. But in an argument that would become very familiar over the next 300 years, publishers claimed that their products were more than metaphorically linked to real property; they wanted copyrighted work to have the same legal treatment as land. In 1769, after decades of fighting, an English court sided with publishers and declared that common law protected copyrights in perpetuity, just as it did for tangible property.11 That standard only lasted five years (the House of Lords overturned it)12 but it set the stage for future copyright holders to seek parity with tangible property.13

Though often repeated, copyright holders' dedication to real property metaphors has been less than absolute. For instance, most copyright holders reject the traditional exceptions to real and physical property that have helped balance their effects on the wider public. Concepts like adverse possession and the finders doctrine, which seek to reward non-owners for productively exploiting property that lies fallow or is lost, have no direct analogs in copyright law.14 Also, the broad discretion to dispose of physical property as its owner sees fit is not extended to individuals who have lawfully acquired a piece of copyrighted content. One can rent one's car to a friend, but renting a piece of computer software is likely to violate copyright law.15 These fractures in the IP-as-real-property metaphor may exist, in part, because copyright holders believe that they can do better. Where copyright holders might once have been content with a system that applied the constraints of physical media—difficult reproduction, unlimited duration of ownership, etc.—to their digital equivalents, DRM is increasingly being employed to expand copyright holders' control far beyond what physical property owners enjoy in the offline world.

C. Why DRM Matters

If "architecture is politics,"16 the architects of a technology have the opportunity to fashion it in ways that privilege some uses while frustrating others. Because of huge costs associated with switching to new technical systems, the decisions made at a technology's inception can be nearly impossible to unmake. With that in mind, copyright holders have started to use technology to shape norms about how the public interacts with cultural goods.17 This is most clearly visible in the area of DRM technology, which facilitates discord between legal norms and technical realities of dealing with copyrighted material.18 For example, a person might be permitted by law to extract an excerpt from a

DVD movie for the purpose of criticism, but the disc and its player are built on technologies that disallow such an action. In this way, the law can provide for a use that is denied by another party's choice of technological design. As numerous critical studies scholars have pointed out, if law prohibits the circumvention of that technologically embedded restriction, the practical result is a copyright policy drafted by copyright holders—not legislatures—and enforced by code.19 43 DRM also has implications outside of the world of copyright law. Those effects

are certainly less obvious but arguably more important, and a few of them are discussed

below.

1. Information & Subject Creation

As outlined in the literature review of Section I, critical information studies is concerned with how the direction, flow, and materiality of information impact the construction of identity. Those concerns are made visible in the literature on "fandom," or the practice of users drawing raw materials from popular culture and refiguring them in original stories, shooting unauthorized teleplays, or otherwise alloying their own labour onto pre-existing creative work. For example, when frustrated fans decided that George Lucas' Star Wars prequel was a product of the Dark Side, they excised the objectionable pieces to produce their own "Phantom Edit."20 At Subscene.com, legions of volunteers work to subtitle movies and television shows for audiences that would otherwise never see them.21 These exercises are just a sample of a larger pool of user-created content that runs the gamut of intellectual work; some is funny, some is political, much of it is terrible by one standard or another. But whatever critical designation is applied to the final product, it is clear that user-generated media represents a huge number of creative and resistive practices of meaning-making.

Importantly, each of these elements—the characters, storylines, trademarks, broadcasts, etc.—falls within the legal domain of intellectual property, and most of the uses described by Jenkins and others are technically unlawful. In a world with pervasive, functional DRM, many of these practices would be hindered or barred outright. This is troubling because, as Coombe points out, despite any potential legal impropriety:

"The texts protected by intellectual property laws signify: they are cultural forms that assume local meanings in the lifeworlds of those who incorporate them into their daily lives. Circulating widely in contemporary public spheres, they provide symbolic resources for the construction of identity and community, subaltern appropriations, parodic interventions, and counterhegemonic narratives."22

In this sense, the semiotic weight of a cultural artifact is in tension with the technical and legal authority that its legal owners may wish to exert over its use. Granting rights holders too much authority, then, threatens emerging forms of meaning-making in order to safeguard traditional concentrations of power.

2. New Modes of Production

DRM also plays a role in the larger philosophical debate about emerging modes of digital production. Open source technologies—software and devices that encourage user-modifiability and experimentation23—have emerged in the last two decades as a major force in the digital economy. By requiring the distribution of source code along with finished applications, and by requiring parties who make new products with that code to do the same, open source advocates break down the distinction between creators and users. This practice encourages independent development of technology and expands the pool of potential innovators past a single company or consortium's R&D department. Benkler writes about this as an economic paradigm shift from a centralized "industrial information economy" to the distributed "networked information economy,"24 and it is one of the most widely discussed themes in current literature on digital culture.25

This development is sometimes articulated as a move from "consumers" of media to "users," a taxonomic shift that reflects the more active role of the masses in the production of information goods. As Benkler argues, "Technology now makes possible the attainment of decentralization and democratization by enabling small groups of constituents and individuals to become users—participants in the production of their information environment."26 In other words, we are moving away from industrial modes of information production—where information goods are centrally produced, frozen in a physical medium, and sold like so many sacks of corn—to a networked information economy where information goods derive value from their mutability. Dyer-Witheford sees these developments as having a potentially revolutionary effect on the dominant ideology of capitalist production:

"[By] setting in motion the powers of scientific knowledge and social cooperation, capital undermines the basis of its own rule. [...] [The] profoundly social qualities of the new technoscientific systems—so dependent for their invention and operation on forms of collective, communicative, cooperation—will overflow the parameters of private property. The more technoscience is applied to production, the less sustainable will become the attachment of income to jobs and the containment of creativity within the commodity form."27

A danger of changing the "social qualities" of cultural artifacts and their exchange, then, is that it represents an attack on the viability of collaborative modes of production like open source software and other technologically embellished methods for providing public goods. For instance, sufficiently sophisticated DRM systems can preclude the use of open source software with copyrighted cultural goods, which would restrict their use and redistribution.28 Attacks on the viability of open source are attacks on modes of production enabled by the Internet, but which have little to do with the ownership structure of the network itself.

3. Reifying Borders

Despite the compression of distance that is supposed to be abetted by the rise of networks like the Internet, DRM is also being used to recreate the borders that are supposed to be melting in the furnace of globalization. Content holders increasingly seek "region coding"—a particular use of DRM—in a variety of systems. The term region coding has two meanings here. Its first meaning refers to a specific technology associated with DVDs that limits their playback to devices manufactured for the same geographic region. The second meaning of region coding is broader, referring to any system or combination of systems that marks cultural goods as "proper" to a particular area. These new borders do not necessarily reflect existing national or regional mappings, but instead reflect new, artificial boundaries imagined by corporate actors.

Content producers employ these measures for a number of reasons. By disrupting private importation of goods, they have greater control over when or if a particular title will be released in a given market. A corollary to this power is the ability to engage in price discrimination (or "differential pricing") in different markets. Economic orthodoxy suggests that this kind of differentiation is a way to reduce arbitrage and increase economic efficiency.29 Governments also have reasons for favouring these kinds of digital fences. In situations where public content producers operate with taxpayer funding—the British Broadcasting Corporation, for example—politics may dictate that their products remain within the nation's borders.30 The inverse of this situation occurs when a country is concerned with blocking the consumption of foreign media for cultural, political, or religious reasons. In other words, digital fences are potentially useful to censors and cultural protectionists as well as Hollywood movie studios. Note that none of these motivations stems from the perspective of the audience.

A more concrete example of region coding can be found in DVD technology. DVDs employ an encryption system called CSS ("Content Scrambling System") that, among other things, provides region-coding functionality. In this scheme, the world is carved into six geographic zones that echo earlier colonial boundaries [Figure 2]. Viewing such a map for the first time, one might be reminded of a child's attempt to identify the world's continents: Mexico is linked with South America, which is linguistically appropriate but geographically suspect; Johannesburg, Reykjavik, and Tehran are grouped together; and China is severed from the rest of Asia. It is possible to think of these new boundaries as an unintentional act of political subversion (Bjork's video oeuvre, for example, can be viewed on Iranian DVD players in the original Icelandic!), but different borders are borders still. Each region's basic unit of construction is still the nation, and the boundaries of each nation remain intact. DVDs themselves need not be released with any regional restrictions ("Region 0" discs can be viewed on any player), but in practice nearly all commercially released discs are limited to a single geographic area.31

Figure 2: DVD Regions

The technology underlying this system is encumbered by patents and trademarks, licenses to which are administered by the DVD Copy Control Association (DVD CCA).32 DVD CCA is, in turn, composed of the major movie studios and consumer electronics companies. In order to make a DVD player, you must seek permission from DVD CCA, and that permission is contingent on your willingness to enforce the region coding system. This structure provides a mechanism for controlling who may produce DVD players and what features they may have. It also hints at a globalized information environment that is nevertheless carved into distinct regions, enabling a sort of intellectual and cultural redlining.
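The obligation that DVD CCA imposes on licensees reduces to a few lines of logic. The sketch below is a simplified model of the region check (real players implement it in licensed hardware and firmware under the CSS rules), using the commonly cited zone assignments; the function itself is my own illustration.

    # Simplified model of the region check a licensed DVD player must enforce.
    # "Region 0" discs carry no restriction and play anywhere.
    REGIONS = {
        1: "United States and Canada",
        2: "Europe, Japan, the Middle East, South Africa",
        3: "Southeast and East Asia",
        4: "Latin America and Oceania",
        5: "Africa, Russia, and South Asia",
        6: "China",
    }

    def player_accepts(disc_region: int, player_region: int) -> bool:
        if disc_region == 0:
            return True                     # region-free disc plays everywhere
        return disc_region == player_region  # otherwise regions must match

    assert player_accepts(0, 1)
    assert not player_accepts(2, 1)  # a European disc is refused in Canada

Note that the control lives in the player, not the disc: the disc merely declares its region, and the licensed device refuses to cooperate.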

While DVD region coding provides a useful case study of digital fences, such measures are not limited to that medium. Video game manufacturers have been employing such systems for years. Similar architectures are being devised for use in satellite, cable, and terrestrial television broadcasting. IBM, Apple, and other companies are now shipping millions of products with "trusted computing" modules that can be used for the most hacker-proof DRM ever deployed on a personal computer. In addition, to connect this back to the earlier discussion of music as a mobile commodity, innovative music distribution systems like the iTunes Music Store filter visitors by the country in which they appear to be physically located and limit their purchasing options accordingly. In short, as digital fences become more sophisticated, they are also being applied to a broader selection of cultural goods than ever before. If the 20th century was characterized by the death of distance, culminating in our heady expatriation to the borderless terrain of cyberspace, it has become apparent that DRM is being used to defend a more antique cartography.

4. Normalization of Control

More pernicious than any individual control element is the prospect that the sum of DRM's restrictions—or the existence of myriad restrictive information regimes—will become normalized, thereby altering what people expect to be able to do with information. According to scholars like Althusser, the process of making a particular practice or exercise of authority coincident with the lived experience of society is critical to weaving it into the dominant ideology.33 Once power is no longer strange, it is more readily accepted and more difficult to interrogate. In reference to a similar, earlier form of epistemological naturalization, Postman writes,

"There is no more disturbing consequence of the electronic and graphic revolution than this: that the world as given to us through television seems natural, not bizarre. [...] Our culture's adjustment to the epistemology of television is by now almost complete; we have so thoroughly accepted its definitions of truth, knowledge and reality that irrelevance seems to us to be filled with import, and incoherence seems eminently sane."34

In retrospect it seems clear that Postman was writing about both the form and content of the medium of television, not the fact that its ownership structure is highly consolidated. While consolidation may have been an aggravating factor, the real damage was done by the material and semantic realities of the cultural objects that were transmitted via television, which encouraged viewers to be passive consumers and produced content in a monolithically capitalist structure. Those features of television—and other "old media" that once seemed so distant from the revolutionary promises of the Web—are being woven into an ever-increasing array of digital cultural goods that can be distributed over the Internet.

Another salient feature of these systems of control is that they are network-agnostic. DRM regimes can be overlaid on top of networks that are open because, pursuant to the goals of open network supporters, even "good" discrimination is forbidden. In a perverse way, DRM is actually aided by the net neutrality movement because, in an "open" environment, users are surrounded by the illusion of choice between cultural inputs. In other words, even though a particular person might interact exclusively with content that uses some sort of DRM, the theoretical availability of "open" content makes their unconscious submission to a technological control mechanism feel like an exercise of will. Worse, the "resistant" act of keeping the network open is a necessary precondition for the proliferation of subtler forms of control like DRM, which might be defeated by either "positive" network discrimination or popular reaction to the perception that DRM-laden content is the only available choice. In the most Gramscian sense, resistance to network-oriented control has the unintended effect of supporting a dominant ideology that privileges private ownership and control of cultural objects. It is through producing this kind of "false consciousness" in the consumer, as Fiske and others have argued, that "a dominant class wins the willing consent of the subordinate class to the system that ensures their subordination."35 Winning the ability to choose among cultural goods that enforce usage policies against their users is a troubling victory indeed.

D. Is DRM Dead?

DRM has sparked furious controversy, but some claim that its time has passed. Canada has thus far declined to grant legal protection to DRM, despite enormous pressure from the United States.36 France considered rejecting legal protections for DRM in 2006.37 Steve Jobs, the Apple CEO whose iTunes Music Store dominates the digital music market—and the much more lucrative market for digital audio players—by selling songs wrapped in DRM, has publicly called on entertainment companies to reconsider their use of the technology.38 Even the American official who architected the worldwide spread of DRM in two treaties now says that those "policies didn't work out very well" and that "our attempts at copyright control have not been successful."39

In addition to these legal and political setbacks, DRM has been plagued by a seemingly endless parade of technical problems. One kind of DRM on compact discs turned out to be so fragile that a user armed with nothing more than a Sharpie marker could disable it.40 Another kind of DRM for audio CDs was so aggressive that the record company that used it was successfully sued by consumers for exposing their computers to outside attackers.41 Even circumventing the DRM on DVDs, one of the most popular consumer technologies ever released, has proven trivial for skilled adversaries. In 1999, a 16-year-old Norwegian computer programmer named Jon Johansen became the public face of DRM's futility. Johansen wanted to be able to view his DVD library on the Linux computer operating system, which is the computer user's equivalent of building your own muscle car: assembly is definitely required, standard parts may not work right off the shelf, and a non-trivial amount of tinkering is necessary to get it running smoothly. In Johansen's case, DVD CCA had not granted a license for DVD players to run on Linux, meaning that he could not watch his lawfully obtained DVDs. However, an error in another DVD player manufacturer's implementation of CSS provided an Achilles heel that allowed Johansen and two anonymous collaborators to reverse-engineer the system and disable it. By funneling what they learned into thirteen lines of computer code called "DeCSS," they broke an encryption scheme that the American motion picture industry had spent years devising. As with other breaches of early DRM technology, the initial work of those skilled adversaries was followed closely by the distributed, pseudonymous production of user-friendly resources that extended the unauthorized functionality to individuals with little or no technical skill.

These " "break once, run always" attacks have two primary characteristics. First, the circumvention technique must be implemented in software and it must not require a user to solder, disassemble, or otherwise physically modify her player. Those relatively

specialized procedures may result in damage to the device, and relatively few people are prepared to undertake that endeavor. Second, because the solution is just computer code, it can be distributed widely via the Internet. That these characteristics seemed to apply to every breach of a DRM system simply reinforced the idea that information control systems would never be able to restrict clever users armed with the Internet.

In light of these foibles, it is tempting to agree with the pundits who claim that DRM is, in fact, an antiquated technology that is on its way to extinction. However, closer examination of how DRM works—and how it is improving—suggests that tales of its demise are greatly exaggerated. First, these systems are becoming harder to break. Second, once broken, the "solutions" are harder to distribute widely.

1. How DRM Works, Fails

At its most basic, DRM relies on restricting copying of and access to digital media files. Almost every use of digitized information requires either the production of a copy, or the display of a copy that has already been made. By strategically restricting those two activities, a copyright holder can recreate certain real-world constraints on property, like scarcity, rivalrousness, and exclusivity. She can also create restrictions that have been difficult to accomplish with physical property. For example, DRM is particularly useful in providing the rights holders of cultural goods with post-release control over their work. Older forms of media, once sent into the market, were beyond the influence of their publishers in important ways. Barring some kind of physical intervention, you can read a paperback novel in any country, read it as many times as you wish, and flip past the advertisement for the author's other books. DRM, however, can dictate that sections of a television program cannot be skipped,42 that a DVD can only be viewed for a limited amount of time before expiring,43 and that media must be viewed in a particular geographic region. But this list of what DRM does—or what its makers want it to do—does not address the question of how it works.
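A hedged sketch of the post-release controls just listed, with every field name and value invented for illustration rather than drawn from any deployed DRM scheme:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class UsageLicense:
        """Toy model of the rules DRM can attach to a single copy."""
        region: int           # playback allowed only in this region
        expires: datetime     # the viewing window closes at this moment
        ads_skippable: bool   # may the viewer skip designated sections?

    def may_play(lic: UsageLicense, player_region: int, now: datetime) -> bool:
        return lic.region == player_region and now < lic.expires

    # A 48-hour rental locked to region 1, with unskippable advertisements:
    rental = UsageLicense(region=1, expires=datetime(2008, 9, 3, 12, 0),
                          ads_skippable=False)
    print(may_play(rental, player_region=1, now=datetime(2008, 9, 2, 12, 0)))  # True
    print(may_play(rental, player_region=2, now=datetime(2008, 9, 2, 12, 0)))  # False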

The most benighted forms of DRM came from the music industry, and they provide a useful example of early approaches to DRM. In the late 1990s, the music industry found itself in a strange position. Just as the Internet hit the mainstream and cheap data storage became widely available, an alarming realization began to dawn: for nearly two decades, the music industry had been distributing digital versions of its catalogs all over the world. Compact discs were a great way to get people to open their wallets for an album that they might already own on vinyl, tape, or 8-track, but they were also easily and perfectly reproducible. With modern compression technology, an album could easily be extracted from a compact disc and transferred over the Internet. Before long, this was happening at an astonishing rate.44 The record companies attempted to control copying and access in a number of uncoordinated ways. Some companies tried to manufacture their CDs with "bad sectors" that would confuse computer CD-ROM drives, thereby hindering their online distribution.45 Other methods relied on loading CDs with self-executing computer programs that would disable other computer programs that could be used to copy the disc's contents.46 Multiple companies emerged to craft DRM that would work in a technical ecosystem that had a decades-long head start.

What these methods shared was that each attempted to graft DRM onto an environment that was not designed to accommodate it, and the results were uniformly hopeless. In fact, DRM on CDs failed at both the technical and normative level. Technically, it was impossible to secure the vast store of digital music that was available on CD because CD players were not obligated to obey any of those new rules. Even if all of the record companies had agreed on a single kind of DRM, millions of non-responsive players were already in the market. With so many different ways to access the data on CDs, there was no way for record companies to put the genie back in its bottle. Normatively, people expected that any CD should function like all the others—i.e., it should play in a car or a computer, and its music should be transferable between devices. Without a cohesive software and hardware environment, the desired control was impossible to achieve.

Ironically, the next generation of DRM was made incrementally better by incorporating sophisticated cryptography, the cyberlibertarians' quintessential technology of freedom. Instead of protecting personal data, however, copyright holders use cryptography to scramble the movies, music, and other material they distribute. Only people—or, more accurately, devices—with the proper authorization are allowed to decrypt that content into a legible form. In a simple cryptographic exchange, a message is encoded with a cipher and then passed to its intended recipient, who decodes the message with the same cipher. If an enemy intercepted the message, he would have to know, guess, or intuit the cipher that had been used to encode the message before obtaining its contents. Once a sufficiently clever code-breaker figured out the cipher, all subsequent communications could be decoded at will. This kind of fragile cryptography was used up through World War II and, because of its fragility, still required that encoded messages be kept out of the hands of adversaries where possible.

A more recent development called "public key cryptography" has enabled the widespread deployment of effective DRM systems.48 With the rise of computers, it became possible to craft complex cryptographic algorithms that can encode a message when provided with a user-defined "key." However, this meant that both the encrypted message and its key would have to make their way from sender to receiver. The insight of public key cryptography was that two keys could be used: a private key that is never disclosed, and a public key that is shared with the world. A message encoded with the public key can be decoded only by the holder of the matching private key. Gillespie provides a useful example:

"It's as if I can leave an open lock box in the middle of the town square with my name on it; you can put something inside for me and lock it, without needing a key to do so; even if our enemy knows that the box is there, that there is something in it, and that it is for me, without the key only I have he cannot open it."49

This is particularly important because, in the relationship between copyright holders and the public, the public is the attacker. By using public key cryptography, content can be robustly encrypted and distributed with the knowledge that only authorized devices will be able to unscramble it. Those devices, in turn, are supposed to enforce restrictions like those described above.
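Gillespie's lock box can be demonstrated with running code. The snippet below uses Python's widely available cryptography package for a single public-key round trip; the plaintext stands in for the content key that a licensing authority might deliver to an authorized device. It is a generic RSA illustration, not the cipher suite of any particular DRM system.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # The recipient's key pair: the public key is the open lock box,
    # the private key is the only key that opens it.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Anyone can lock a message in the box using the public key...
    ciphertext = public_key.encrypt(b"content decryption key", oaep)

    # ...but only the private-key holder can open it again.
    assert private_key.decrypt(ciphertext, oaep) == b"content decryption key"

The asymmetry is the point: distributing the public key costs the copyright holder nothing, while decryption remains confined to devices that hold the private key and promise to enforce the attached rules.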

Where earlier systems have been susceptible to the break once, run always solutions mentioned above, the architects of DRM have been working to avoid such vulnerabilities in future versions. First, those brittle designs have, in large part, been improved to the point where a mixture of software and hardware expertise would be necessary to defeat modern systems. This increased level of specialization, combined with the risk of damaging the device being modified, dramatically lowers the number of potential assailants.

Second, it may be difficult or impossible to translate these hardware/software circumvention techniques into user-friendly software applications that can be distributed over the Internet. If physically altering a machine really were required, then the most portable solution one might devise would be a good set of instructions. A single machine could be compromised, but that compromise would not yield knowledge that could be generalized to other machines. This compartmentalization of failures would substantially increase the system's effectiveness and limit the unauthorized flow of information. Blu-ray players, high-definition successors to the popular DVD format that have begun to be sold around the world, already employ a new DRM system that implements these more robust characteristics.50 While this system has been compromised, its failsafe mechanisms have held up. For consumer Blu-ray players—the boxes that sit in the living room entertainment center—hardware modification is currently necessary to circumvent their DRM. For software players—the computer programs that use a computer's high-definition disc drive—the circumvention method is sophisticated enough that ongoing maintenance is required. A company called SlySoft, based in Antigua to avoid liability from content holders, updates subscribers' software players at least every three months to maintain their functionality.51

Third, new DRM systems are taking advantage of the connected nature of modern technology. Instead of releasing atomized units of cultural goods that are played on isolated pieces of hardware, both content and content players are increasingly linked to one another and to central authorities. For example, Blu-ray, the new high-definition format, includes a method for movie studios to remotely disable a player that they believe has been tampered with. This 'feature' is being included in a range of devices that goes beyond consumer electronics for entertainment. In some DRM systems, devices can query one another to determine their proximity—in order to see if they are in the same "household"53—or discover their geographic location,54 and then automatically enforce policies based on that information. On connected devices, copying and display policies can be implemented and changed from afar, based on whatever private ordering scheme copyright holders desire.
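As a schematic of this connected enforcement, consider a player that consults a revocation list and a household rule before rendering content. Every identifier below is invented, and real systems (Blu-ray's key revocation, for example) operate on cryptographic keys rather than simple IDs; this is a sketch of the idea only.

    # Toy model of networked DRM: remote revocation plus a household check.
    REVOKED = {"player-0042"}  # stand-in for a centrally published blacklist

    def fetch_revocation_list() -> set:
        # A real player would fetch this from a licensing server or read it
        # from newly distributed media; here it is simply a constant.
        return REVOKED

    def may_render(player_id: str, player_household: str, content_household: str) -> bool:
        if player_id in fetch_revocation_list():
            return False                               # device disabled from afar
        return player_household == content_household   # proximity policy

    print(may_render("player-0001", "house-A", "house-A"))  # True
    print(may_render("player-0042", "house-A", "house-A"))  # False: revoked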

2. Fixing DRM's Problems Through Standardization

Throughout this paper I have referred to the emergence of "nuanced" and "unexpected" forms of control. This may have been misleading. What could be more predictable than an industry trying to preserve and expand its business models in response to disruptive technologies? What is unexpected and nuanced is the level of sophistication that these efforts now exhibit. The makers of digital rights management technologies have been learning from their very public mistakes and, in the near future, it will be possible to deploy distributed control networks that are not likely to be broken by teenagers. However, the technical innovations in DRM are only a part of the story. These systems require a high level of coordination in order to succeed in the market, and organizations that are able and willing to facilitate that coordination are a relatively recent development. Private standards consortia—enabled by the same neoliberal erosion of government regulation that has animated so much of this story so far—are the laboratories in which these new systems are being developed.

1 I use 'overlay network' in its generic sense, as any network of users and applications that is superimposed on another telecommunication system. Their most significant characteristic, for my purposes, is that an appropriately designed network can support multiple overlay networks in the same way that a single set of roads can accommodate multiple delivery routes for different goods. More on this in Section III(A).
2 See, e.g. the 7-layer model favoured by computer scientists in J.D. Day and H. Zimmermann, "The OSI reference model," Proceedings of the IEEE 71, no. 12 (1983): 1334-1340.
3 Lawrence Lessig, The Future of Ideas: The Fate of the Commons in a Connected World, 1st ed. (Random House, 2001), 23.
4 Modern telephone systems have been moving to digital, packet-switched networks that are more complicated than the example provided. In fact, the packet-switched telephone network is an excellent example of a new overlay network, at the logical layer, that uses an older physical layer.
5 See Use of the Carterfone Device in Message Toll Tel. Serv., 13 F.C.C.2d 420 (1968) at 424. ("[A] customer desiring to use an interconnecting device ... should be able to do so, so long as the interconnection does not adversely affect the telephone company's operations or the telephone system's utility for others.")
6 David Kravets, "New RIAA Lawsuit Defense Tactic: Admit Liability, Challenge the Law," Threat Level at Wired.com, July 28, 2008, http://blog.wired.com/27bstroke6/2008/07/new-riaa-lawsui.html.

61 7 For example, copyright holders have also sued file sharing firms like Napster (A&MRecords, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001)); Aimster (In Re: Aimster Copyright Litigation, 334 F.3d 643 (7th Cir. 2003), Grokster and Morpheus (MGMStudios, Inc. v. Grokster, Ltd. 545 U.S. 913 (2005)); Kazaa (Jeremy W. Peters, "Kazaa Said to Pay $10 Million in Settlement," The New York Times, November 1, 2006, sec. Technology); iMesh (John Borland, "RIAA Sues iMesh file-trading firm," C\Net News.com, September 19,2003, http://news.cnet.com/2100-1025 3-5079454.html); and Limewire (Greg Sandoval, "Music industry sues P2P firm Lime Wire," C\Net News.com, August 4, 2006, http://news.cnet.eom/2100-1025 3-6102509.html). 8 Sudip Bhattacharjee et al., "Impact of Legal Threats on Online Music Sharing Activity: An Analysis of Music Industry Legal Actions," The Journal of Law and Economics 49, no. 1 (2006): 91-114. (Finding that while lawsuits alter the behaviour of people sharing large numbers of files, the post-lawsuit environment remains flooded with substantial numbers of files from users with smaller libraries.) 9 Saul Hansell, "F.C.C. Chief Would Bar Comcast From Imposing Web Restrictions," The New York Times, July 12, 2008, sec. Technology, http://www.nytimes.com/2008/07/12/technology/12comcast.html 10 See, e.g. Frank H. Easterbrook, "Intellectual Property is Still Property," Harvard Journal of Law & Public Policy 13, no. 1 (1990): 108,112. (Easterbrook claims that "the right to exclude in intellectual property is no different in principle than the right to exclude in physical property."); For an excellent critical survey of the conflation between real and intellectual property, see Mark A. Lemley, "Property, Intellectual Property, and Free Riding," Texas Law Review 83 (2005): 1031. 11 Millar v. Taylor, 4 Burr. 2303, 98 Eng. Rep. 201 (K.B. 1769). 12 Donaldson v. Beckett, 2 Brown's Pari. Cases 129, 1 Eng. Rep. 837; 4 Burr. 2408, 98 Eng. Rep. 257 (1774). 13 For a more extensive account of this period in the development of copyright history, see Mark Rose, "The Author as Proprietor: Donaldson v. Becket and the Genealogy of Modern Authorship," Representations, no. 23 (Summer 1988): 51-85. 14 For an extensive analysis of the application of physical property exceptions to intellectual property, see Michael A. Carrier, "Cabining Intellectual Property Through a Property Paradigm," Duke Law Journal 54, no. 1 (October 2004): 1-145 (Arguing that intellectual property's acceptance as real property is irreversible, and that we should therefore worry about how real property concepts like easements and covenants will limit its scope); For a more general discussion of the slippages between property and intellectual property, see Stephen L. Carter, "Does it Matter Whether Intellectual Property is Property?," Chicago Kent Law Review 68 (1992): 715; Many other scholars have investigated how physical property metaphors could be useful in the context of intellectual property. See,e.g. Constance E. Bagley and Gavin Clarkson, "Adverse Possession for Intellectual Property: Adapting an Ancient Concept to Resolve Conflicts Between Antitrust and Intellectual Property Laws in the Information Age," Harvard Journal of Law & Technology 16, no. 2 (2003): 327. 15 Such a rental would be legal with the express permission of the copyright holder, but such permission is not required from the car manufacturer. 
US Copyright Office, "The Computer Software Rental Amendments Act of 1990: The Nonprofit Library Lending Exemption to the 'Rental Right'," September 15, 1994, http://www.copyright.gov/reports/software_ren.html
Lessig 2000, 243 note 19.
See e.g. Burk, Dan L. "Legal and Technical Standards in Digital Rights Management Technology." Minnesota Legal Studies Research Paper No. 05-16, 2005, http://ssrn.com/abstract=699384
Deirdre K. Mulligan and John S. Erickson, "The Technical and Legal Dangers of Code-Based Fair Use Enforcement," Proceedings of the IEEE 92, no. 6 (2004): 985-996.
See, e.g. Matt Jackson, "Using Technology to Circumvent the Law: the DMCA's Push to Privatize Copyright," Hastings Communications and Entertainment Law Journal 23 (2001): 607-646; Lessig 2000; Pamela Samuelson, "DRM {and, or, vs.} the Law," Communications of the ACM 46.4 (2003): 41-45.
Daniel Kraus, "The Phantom Edit," Salon.com, November 5, 2001, http://archive.salon.com/ent/movies/feature/2001/11/05/phantom_edit/print.html
The website SubScene.com solicits homemade subtitling data for movies and television shows. http://SubScene.com
Rosemary J. Coombe, The Cultural Life of Intellectual Properties: Authorship, Appropriation, and the Law (Duke University Press, 1998), 7.
The Gnu General Public License (GPL) is the most well known open source license, though it is by no means the only "open source" legal license. Work disseminated under the GPL grants users permission to study, change, and distribute changed versions of the work. See "Gnu General Public License 3.0" Free Software Foundation: http://www.gnu.org/copyleft/gpl.html
Benkler 2003, 1251.
See e.g. Benkler, Yochai. "Intellectual Property and the Organization of Information Production." International Review of Law and Economics 22.1 (2002): 81-107; Ghosh, Rishab Aiyer ed. CODE: Collaborative Ownership and the Digital Economy (Cambridge: The MIT Press, 2005); and Lessig 2001.
Yochai Benkler, "From Consumers to Users: Shifting the Deeper Structures of Regulation Toward Sustainable Commons and User Access," Federal Communications Law Journal 52.3 (2000): 561-587, 562.
Nick Dyer-Witheford, Cyber-Marx: Cycles and Circuits of Struggle in High Technology Capitalism (Chicago: University of Illinois Press, 1999), 4.
I conduct a more detailed analysis of DRM's effects on open source in Section VI(C)(1).
This position is somewhat counterintuitive, but it makes sense in the rarified but topical discourse of economics. The premise of the claim is that in a market with only one price for a given good, some people will get the good but be charged less than they are willing to pay. Also, some people will not get the good even though they would have paid something less than the market price but more than the cost of making the good (the "marginal cost"). This results in two forms of waste: the lost profits that arise in the first scenario, and the deadweight loss—i.e. the would-be buyer's loss of utility—in the second scenario. The solution to that problem, according to economists, is price discrimination, or charging each party what they are willing to pay, provided that that amount is higher than the good's marginal cost. See Benjamin Klein and John Shepard Wiley Jr., "Competitive Price Discrimination as an Antitrust Justification for Intellectual Property Refusals to Deal," Antitrust Law Journal 70 (2003): 599-642, 608-633.
30 For example, the BBC's "Creative Archive" is an innovative project that makes thousands of BBC programs available online under very permissive licensing terms. However, it is only available to people with Internet addresses within Britain. Three reasons seem likely for the inclusion of these restrictions: to maintain an export market, to avoid the perception of taxpayer money subsidizing the viewing habits of foreign citizens, and a lack of distribution rights outside of the United Kingdom. For general information on the Creative Archive, see Katie Dean, "BBC to Open Content Floodgates," Wired News, June 16, 2004, http://www.wired.com/news/culture/0,1284,63857,00.html
31 There is also a special region designation for discs that are intended to be played on cruise ships and airplanes.
32 This is an example of the private consortia that will be discussed further in Section IV(D).
33 Louis Althusser, Lenin and Philosophy and Other Essays, Trans. Ben Brewster (New York: Monthly Review Press, 2001), 112.
34 Neil Postman, Amusing Ourselves to Death: Public Discourse in the Age of Show Business (New York: Penguin, 1986), 79-80.
35 John Fiske, "Culture, ideology and interpellation," Eds. Julie Rivkin & Michael Ryan, Literary theory: An anthology (Malden: Blackwell Publishers, 1998), 305-311, 310.
36 See e.g., Bill Curry, "Ottawa Accused of Caving in to Hollywood on Copyright," The Globe and Mail (December 22, 2007): A1.
37 Margaret Kane, "France Backs Down on DRM Bill," CNET News.com (May 12, 2006), http://news.cnet.com/8301-10784_3-6071581-7.html
38 Steve Jobs, "Thoughts on Music," Apple.com (February 6, 2007).
39 Bruce Lehman, "The Digital Rights Management Dilemma" (presented at the Musical Myopia, Digital Dystopia: New Media and Copyright Reform, McGill University, March 23, 2007), http://mediasite.campus.mcgill.ca/mediasite2/viewer/Viewer.aspx?layoutPrefix=LayoutTopLeft&layoutOffset=Skins/Clean&width=800&height=631&peid=6e197c68-0b63-4474-ac3b-f770e220de0e&pid=2276b8bb-0299-4©e-9be8-d83d3539f313&pvid=501&mode=Default&shouldResize=false&playerType=WM64Lite.
40 J. Alex Halderman and Edward W. Felten, "Lessons from the Sony DRM Episode," in Proceedings of the 15th USENIX Security Symposium (Vancouver, British Columbia, 2006), 77-92, 4.1, http://www.usenix.org/events/sec06/tech/full_papers/halderman/halderman_html/
41 Halderman 2006.
42 Philips Electronics, for example, was recently granted a patent on technology that allows it to stop viewers from using a personal video recorder to skip commercials. See, e.g. Grant Robertson, "Channel Surfers Protest Philips TV Technology," GlobeandMail.com, April 21, 2006, http://www.theglobeandmail.com/servlet/story/LAC.20060421.IBPHILIPS21/TPStory/TPBusiness

43 In 2003, movie studios made a deal to distribute "disposable DVDs" that would corrode and become unusable 48 hours after their exposure to air. See, e.g. Stefanie Olsen, "Disney, CinemaNow Ink Net-Movie Deal," News.com, September 14, 2003, http://news.com.com/Disney,+CinemaNow+ink+Net-movie+deal/2100-1025_3-5076170.html
44 For example, in a single month in 2001, Napster users transferred 2.8 billion music files. Sam Costello, "Webnoize reports Napster downloads drop 36 percent in April," InfoWorld, (May 1, 2001), http://www.infoworld.com/articles/hn/xml/01/05/01/010501hnnapster.html
45 This method relies on exploiting slight differences between standards for audio and data CDs, with the goal of creating discs that play in "audio" CD players but which fail in "data" CD players like those found in computers. However, differences in the implementation of those two standards created unexpected (dis)functionality in a range of electronic devices. For more information on how this system works, see US Patent 6,425,098, July 23, 2002.
46 Halderman 2006, 4.1.
47 The most famous example of this problem is the Allies' successful attack on the German "Enigma Machine" during World War II. See generally, Kozaczuk, Wladyslaw. Enigma: How the German Machine Cipher Was Broken and How It Was Read by the Allies in World War Two. (University Publications of America, 1984).
48 For a more detailed but still readable account of cryptography in DRM, see Gillespie 2007, 248-253.
49 Gillespie 2007, 252.
50 For a concise account of how these copy prevention measures are implemented in new Blu-ray disc technology, see Kevin Henry et al., "An Overview of the Advanced Access Content System (AACS)," Technical Report CACR 2007-25 (2007), http://www.cacr.math.uwaterloo.ca/techreports/2007/cacr2007-25.pdf
51 Thomas Claburn, "AACS Copy Protection For DVDs Defeated Again," InformationWeek.com, May 18, 2007, http://www.informationweek.com/news/internet/showArticle.jhtml?articleID=199602031
52 For a discussion of the trend in putting remote-disabling functionality in a range of new technologies, see Bruce Schneier, "I've Seen the Future, and It Has a Kill Switch," Security Matters - Wired.com, June 26, 2008, http://www.wired.com/politics/security/commentary/securitymatters/2008/06/securitymatters_0626.
53 The determination of what constitutes a valid "household" is an interesting normative exercise. DRM systems like CPCM, discussed in Section V, enforce policies based on whether devices are located in the same household, which is loosely determined by the number and proximity of CPCM-enabled devices. This raises the interesting question about whether a non-standard household—i.e. one with more devices, one that spans the separate households of divorced parents, etc.—can be accommodated in such a scheme. For now, the makers of DVB have avoided this question by leaving the definition of a "Single Household Metric" to an as-yet unassembled enforcement committee. See DVB BlueBook A094r2 (DVB, February 2008), 422, http://www.dvb.org/technology/standards/a094r2.l-10.CPCM.pdf
54 DVB provides a useful example: "Content could be delivered to a Singaporean home via a satellite subscription television receiver [...] for use by any Device on the home network [...], then after 24 hours (a possible Remote Access Rule condition), be accessed by a remote TV [...] in the same user's portable device while he is travelling in Hawaii (another Device in the same [authorized domain])." This neatly describes the highly granular, device- and location-level control that the system grants to copyright holders. DVB Bluebook 2008, 9.

IV. The Evolution of Standards

"A variety of social systems [...] can function only when parties with conflicting interests agree on what technical characteristics to build into products."

- Crane1

Technically, the world is a complicated place. Each of the developments that make up the technological history of the last two centuries—telegraphy, telephony, radio, television, computer networks, etc.—relies on the coordination of countless individual devices, made by thousands of different manufacturers, operating in every conceivable environment. Common rules for the performance, design, and interoperation of those devices—rules known as "standards"—make it possible for complex communication systems to function. In a standardized environment, television signals are broadcast with the confidence that television sets will understand and display their content; email is successfully exchanged between people using different brands of computers. Today more than ever, standardized technologies permeate our lives. Without standards, our desks and living rooms would be littered with inert boxes of silicon and steel.

The interoperability promoted by standards has led the public, industry, and regulators to view standards in an overwhelmingly positive light. According to the conventional wisdom, standardized technologies beget greater consumer choice, lower prices, and a host of other benefits that accrue in competitive markets. As a result, the ways in which standards are developed, as well as the normative thrust of many standards, have escaped serious public scrutiny.3 There has been some interest in these shifts, but those limited investigations have been confined largely to understanding their implications within the specialized "world of standards."4

This lack of attention is odd in light of the overtly political nature of standard setting. Standards always have been, and continue to be, a form of governance. In the past, standards were used as non-tariff trade barriers in the protection of domestic industries.5 Today, the choice to standardize a technology can open or close markets and condition the rules of competition. If a standard requires the use of patented technologies, the firms that control such intellectual property may acquire competitive advantages over their peers. Even matters like copyright, voting, and freedom of expression, which are nominally in the sole policy-making purview of government actors, are affected by the decisions of standard-setting organizations (SSOs). Perhaps because of the difficulty of overcoming that "persistent sense that technologies exist outside the frailty, inertia, and selfishness of human politics,"6 we chronically neglect the processes through which politics come to be embedded in technological systems.

Consequently, the implications of a series of recent changes to those processes have been largely ignored. Should this trouble us? Ought the average person be concerned with the technical details of whether a digital television program finds its way to her living room via Standard A, or Standard B? Or are these silly questions, the answers to which only interest the geekiest of government engineers? This section suggests that such technical details, and the processes by which they are established, have far-reaching consequences for our political, economic and cultural life. It is also important to understand how the story of standards—that is to say, the story of deregulation and privatization coupled with dramatic technical advances—is related to the other narratives explored in this thesis. As more of our social fabric is woven through complex webs of communications infrastructure, it is vital that we examine the construction of those webs critically.

A. What Are Standards?

It is generally accepted that standards refer to rules that are voluntarily adopted by a large number of marketplace actors.7 Within that broad definition, standards typically fall into two categories. First, "performance standards" are rules limited to conduct within a single industry.8 These standards focus on an industry's relationships with consumers and government officials, and they tend to be concerned with issues like best practices for the manufacture of products, safety testing, and other aspects of their businesses that are likely to attract the attention of activists or regulators.9 In short, performance standards are a way to align corporate conduct with public norms while avoiding the disciplinary attention of the state.10

This paper is primarily concerned with a second kind of standard, one that deals with the interoperability of technologies and usually spans many different industries. Known as "interface standards," these rules facilitate compatibility between devices or systems, depending on the level of abstraction of a particular inquiry. While interface standards do not necessarily "specify how a [...] machine should be designed or manufactured,"11 they demand faithful interpretation at the junction between two machines, or between a machine and a network. A typical interface standard will be expressed as a technical document that describes everything from cryptographic algorithms for encoding data to the shared grammar of computer protocols. Interface standards bridge the multiplicity of dialects spoken by different technologies.
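
To make this concrete, here is a minimal sketch in Python of the kind of compatibility an interface standard secures. The frame format, the magic number, and both function names are hypothetical inventions for illustration, not drawn from any real specification:

```python
import struct

# An imaginary interface rule: every frame starts with a 2-byte magic number,
# a 1-byte version, and a 2-byte big-endian payload length, then the payload.
MAGIC = 0xD7B1                  # hypothetical value "fixed" by the imaginary spec
HEADER = struct.Struct(">HBH")  # magic, version, payload length

def encode_frame(version: int, payload: bytes) -> bytes:
    """One vendor's transmitter: emit a frame exactly as the rule dictates."""
    return HEADER.pack(MAGIC, version, len(payload)) + payload

def decode_frame(frame: bytes) -> bytes:
    """Another vendor's receiver: accept any frame that honours the same rule."""
    magic, _version, length = HEADER.unpack_from(frame)
    if magic != MAGIC:
        raise ValueError("not a conforming frame")
    return frame[HEADER.size:HEADER.size + length]

# Two implementations that never shared code still interoperate:
assert decode_frame(encode_frame(1, b"hello")) == b"hello"
```

The two devices need not share designs, manufacturers, or countries of origin; faithful interpretation at the junction is enough.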

The logic behind the push for this kind of interoperability comes from "network effects" and "positive externalities," two related ideas that suggest that the value of a network increases as the network expands.12 The telephone network is a classic example of network effects: the first telephone was worthless until a second telephone was sold, and each subsequent telephone had greater utility than the last. Common rules, therefore, are desirable because they reduce market fragmentation that would result from the need to accommodate multiple interfaces, formats, or protocols. The conventional wisdom is that standards help to grow the network by lowering barriers to connectivity, thus increasing competition and utility while decreasing prices for consumers.13
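
One common formalization of this intuition is Metcalfe's law, which values a network by the number of pairwise connections it makes possible; the quadratic growth below is an assumption of that model, not a claim advanced in the sources cited here:

```python
def potential_connections(n: int) -> int:
    # Each of n devices can reach the other n - 1; dividing by two
    # avoids counting each pair twice.
    return n * (n - 1) // 2

for n in (1, 2, 10, 100):
    print(f"{n:>3} devices -> {potential_connections(n):>5} possible connections")
# 1 device supports none (the worthless first telephone); 2 support one;
# 10 support 45; 100 support 4,950.
```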

It should be noted that these two categories of standards, like most, are somewhat artificial. In some cases, a performance standard is developed before, or in tandem with, an interface standard. A performance standard might represent the public face of a technology project while the interface standard provides a blueprint for implementing the desired features. In most cases, a performance standard will be just as illegible to the layman as an interface standard. Within each category, one can find myriad subcategories with varying degrees of overlap.14

But important distinctions remain. While performance standards are often expressed in the language of policy and are intended to deal with "public" issues, interface standards are rendered in code and couched in technical argot. Performance standards arguably would not exist but for the specter of government intervention.15 The conduct that they address is assumed to be within the proper scope of government action.16 While performance standards are largely responsive to "threats" of government intervention, interface standards emerge when government is assumed to lack knowledge or legitimacy. According to Cutler et al., "In areas where technology is complex or information plays a significant role, the private sector is sometimes viewed by participants as more capable than governments in designing appropriate rules and procedures."17 Put more bluntly, "the prevailing view is now that standards are developed exclusively in support of industry (and thus only when and where industry needs them). The role of governments is understood today only to be a facilitator."18 In other words, interface standards exist where the state is assumed to lack competence. This aura of technical sophistication has allowed industry to develop interface standards with relatively little public scrutiny, though such insulation is a recent development in the history of standards.

B. The State's (Declining) Role in Standard-Setting

The technical ordering of our communications infrastructure has been informed by the political and economic context of the international system since the first standards organization emerged in the mid-19th century. States collaborated on ways to connect state-run systems; those systems were globally heterogeneous but nationally monolithic; the line between users of the network and its operators was bright. But in the last three decades, the state-centered international system has begun to splinter under the weight of globalization and neoliberalism.19 Our technological environment has become similarly global and pluralistic. Today, communication networks—most notably the Internet and its ever-expanding stable of overlay networks—display little willingness to submit to traditional modes of regulation.20 Instead of asking how the Internet should be managed, we have spent most of the last decade wondering whether it can be managed at all. Indeed, until the rise of network neutrality as a serious political issue, the second question was regularly answered in the negative.

As an historical matter, the state was not always so peripheral to the establishment of interface standards. This was especially true for telegraphy, the first large-scale, real-time communications network. In 1865, 20 European nations formed the International Telegraph Union (ITU),21 the first major technology standards organization. The goal of the alliance was to facilitate interconnectivity between, while maintaining the integrity of, each separate state-run telegraph system. That model made sense for several historically contingent reasons. The maintenance of heterogeneous national systems was a form of protectionism for domestic equipment manufacturers and patent-holders.22 The use of standards as "non-tariff trade barriers" continued for much of the next century, and it extended from telegraphy and telephony to television and a range of other communications systems.23 And while European nations were cognizant of market pressure to interconnect, they were wary of becoming dependent on foreign technology.

This model proliferated, and for most of the next 150 years a typical interface standard was developed in state-supported SSOs like the ITU and the American National Standards Institute (ANSI),24 or in confederations of SSOs like the International Organization for Standardization (ISO).25 These groups share several important characteristics, including free access to deliberations, public circulation of draft standards, and requirements for licensing of intellectual property at reasonable and non-discriminatory rates.26 In this way, standards are often thought of as "public goods," or goods that provide public benefits and are provided by the state because they are unlikely to be produced by traditional market mechanisms.27 It follows that SSO membership is drawn from a pool of "stakeholders" that includes government representatives, companies, academics, and members of the public with the intention that "no actors [be] excluded from standards development" or the benefits of using standards.28 The drafting process is highly formalized and governed by the SSO's quasi-legal rules. Success is achieved when the group's members reach a consensus—not a simple majority. As a result, SSOs often produce standards at a glacial pace.29 While no legislative action is necessary for the promulgation of the final standard,30 government representatives are usually an integral part of each step in the drafting process. Finally, while SSO proceedings are typically public, the public typically pays them little mind.

Public scrutiny may be the exception in standards bodies, but a sense of obligation to the public interest underlies much of their work.31 The ITU, for example, is a specialized agency of the United Nations and is bound by the public interest agenda of that body.32 As such, it dedicates a substantial portion of its resources to "help spread equitable, sustainable and affordable access to information and communication technologies"33 to the developing world. ANSI, the coordinator and accreditor of many standards developed in the United States, counts as its goals the promotion of consumer welfare and quality of life improvements for Americans.34 The high level of state involvement in SSOs—as participants, funders, and, ultimately, legitimizers—requires some level of coincidence between standards and public norms. Standards, then, have most often been associated with a sort of soft regulation. When a system became sufficiently complex, large, or important, its ability to interoperate with other systems and maintain internal functionality became a concern of the state. Because of the number of actors involved, plus their myriad self-interested positionalities, the state was required to serve as an honest broker in the formulation of new standards.

But over the last 25 years the state-centered approach to standards has begun to decline alongside the state-centered approach to telecommunications policy in general. Between 1984 and 1996, at least 44 public telecommunications operators were privatized around the world.35 Where that infrastructure was already privately owned, the state dramatically reduced its regulatory activities.36 This spate of deregulation and national divestment was part of the larger march towards globalization and neoliberalism that continues to play out around the world. The de-emphasis of the state as the central organizing unit of the international system—what international relations theorists call the "neoliberal turn"37—produced a series of vacuums in governance. As Hall writes, "In many states, the privatization of government activities, the deregulation of industries and sectors, increased reliance on market mechanisms in general, and the delegation of regulatory authority to private business associations and agencies are expanding the opportunities for the emergence of private and self-regulatory regimes."38 As we see in the next section, the market was enthusiastic about these developments.

C. The Rise of Consortia

"It is less intriguing to hold on to the idea that the state is weakening or that public and private governance capacities are mutually exclusive. [...] In essence, we are confronted with dynamic, more synergetic relationships, in which public and private contributions might reinforce each other over time. "

- Heritier39

By the 1980s the state was receding from the ownership and regulation of electronic communications infrastructure, but those systems still needed standardized rules of engagement to maintain and expand their functionality. Declining confidence in state regulation, combined with the accelerating pace of technological development, produced a widespread perception that traditional standardization processes were too slow for the information economy. How would standards be fashioned in an era when the state was presumed to lack the expertise, authority, or will to do so? As suggested by Hall and others,40 private regimes emerged to augment the changing role of state authority.

By the late 1980s, a new breed of cooperative business arrangement had begun to colonize the evolving regulatory landscape.41 As in so many other areas of social, political, and economic life in the early 1990s, the market seemed to have the answers. In less than a decade, over 140 new "industry consortia" (consortia)—private versions of SSOs—were established around the world.42 Like SSOs, consortia have highly formalized rules regarding consensus around work items. They address the same kinds of technical issues. But consortia and SSOs differ in significant ways as well. Consortia are usually funded by their members, which presupposes that members have significant financial resources. Most consortia participants come from the private sector. While public representatives are usually able to participate in consortia, internal rules about governance often ensure that the groups are led by private actors.

In addition to differences in composition and control, consortia and SSOs have distinguishable motives. While both kinds of organizations are dedicated to the pursuit of technical compatibility, SSOs must do so in a way that accounts, however imperfectly, for some notion of the public interest.43 Consortia need not pay even lip service to that agenda. Hawkins argues, "Although some [consortia] make a public interest commitment, none are formally accountable to the public interest other than to conform with the laws and regulations that apply in their countries of legal registration."44 Instead, consortia are primarily driven by the needs of their membership.

Consortia also invert the traditional logic of geography in standards. As mentioned earlier, the dominant pre-consortia model involved the establishment of national standards followed by multi-state deliberations at an international SSO. Territorial representation was the norm at the SSO level, which facilitated a bottom-up path for standardization. As Werle argues, new international organizations have "abolished the principle of territorial representation and are open to direct membership of firms," while national organizations "only transpose [...] what has been developed internationally."45 In other words, today's consortia-drafted standards are birthed in international forums, then adopted in local jurisdictions that likely had nothing at all to do with their creation.

These concerns are amplified when one recognizes that interface standards, produced in the dominant consortia-led paradigm,46 are almost exclusively drafted by multinational corporations based in the industrialized North.47 Less developed countries are at a great disadvantage when it comes to influencing the development of standardized technologies. At best, the global South is put in the position of having to select among standardized technologies from its Northern neighbors. If the first deployment of a standardized system—digital broadcasting, for example—includes elements like DRM, the user restrictions discussed above will be part of the technical baseline for consumers in the South. Also, it is likely that those consumers will be limited to proprietary technologies, though open source solutions may be cheaper.48 Salter describes this phenomenon:

"Standards "hardwire" existing differences between participators and non-participators in shaping the 'new economy.' They make change to these relationships harder, if not impossible. Indeed, if ever there was a possibility for indigenous economic development or for smaller nation-states to exert significant control over the powerful forces represented by the new information technologies, it is undermined by widespread agreement upon technical standards developed by multinationals with other agendas in mind."49

This does not mean that developing countries should avoid standardized technologies altogether. In fact, standards can lower the cost of new technological infrastructure. But we should scrutinize the mechanisms that bring new technologies to developing countries, and, where possible, identify the line between their technically necessary and communicatively biased components.

It is important to note that the emergence of new, private regimes did not mean the disappearance of SSOs. Indeed, the major SSOs remain, even though they now serve very different functions. As alluded to above, SSOs are increasingly called upon to baptize standards that they did not draft. As Hawkins explains, "Once a consortium specification looks like it could assume the role of an 'industry standard' in a more conventional sense, the practice of many consortia is then to shift the initiative over to the [SSO] system. Indeed, the [SSO] and consortia systems are now closely intertwined."50 Today, the oversight of an SSO in the standard-setting process—and the implied oversight of the public over the SSO—provides the final rules with an air of legitimacy. The presence of strict procedures for consultation and drafting, plus the purifying fires of public bureaucracy, help to dispel the appearance of anti-competitive behaviour that normally attends the close coordination of firms. Put another way, modern standards "[permit] firms to continue with activities that otherwise might attract lawsuits or public controversy."51

D. Standards + DRM

The extent to which the character of standards has evolved along with their modes of production is unclear, but anecdotal evidence suggests that new tensions are developing. For example, standards are supposed to be invisible, to ease barriers to entering markets, to reduce consumer confusion, and to generally lubricate the machinery of markets that rely on technology. The traditional model of DRM contradicts all of these assumptions. Not only is DRM visible to the consumer, it is actively disliked.52 Where standards putatively promote competition and reduce consumer confusion, DRM blocks open source competition and produces a range of incompatible files, services and devices.53 However, it is the very incongruity between our experiences with DRM and the promise of standards that has brought the two together. Invisible, ubiquitous, functional DRM would be a huge improvement over the models described in Section III. The belief of content owners is that traditional DRM has problems that standardized DRM will fix.

It could be argued that successful, effective DRM demands uniform deployment that can only be achieved through standardization. If some devices enforce the restrictions of content owners while others do not, consumers will tend to choose the less restrictive option. If devices enforce restrictions but are easily modified by end-users, some significant percentage of consumers will make such modifications. The net result of these disparities would be that consumers would become aware of the restrictions on their behaviour, as well as the artificiality of those limits. Therefore, effective DRM relies on uniformity in the marketplace and the prevention of user modification. In markets where multiple technologies and hundreds of companies are required to interoperate—like the world of digital broadcasting—standardized technology is often the only way to achieve that goal.

While it is easy to see why the content industry wants standardized DRM, it is worth considering why that goal is easier to achieve in private standards consortia. To start, while complex systems demand technical coordination, their development by an SSO makes little sense: why would a state-run organization deliberately craft technical rules that are at odds with its national copyright regime? More to the point, where is the incentive for a government to take on such a politically fraught project—designing rules that consumers do not want and trying to convince broadcasters and consumer electronics firms to adopt them—when private entities are clamouring to do that work on their own? On the other hand, when such rules are drafted by industry, fashionable notions of market-driven solutions, combined with an aura of technical sophistication, dampen appeals for standards that incorporate public interest-oriented concerns (if the public is even aware that such rules are in development, of course).

The rise of consortia also explains why some firms that participate in DRM standard setting act in ways that appear contrary to their self-interest. For example, why do technology firms create products and services that give their customers less freedom to interact with digital information, especially if more empowering technologies are cheaper and easier to make? Robust DRM, after all, requires development over and above the basic system of information exchange. It also necessitates the licensing of intellectual property related to the standard, compliance with the technical demands of the standard, and the prospect of being excluded from the market for refusing to bear any of these burdens. In other words, why do technology firms dedicate resources to standards that will make their products less valuable to consumers and more expensive to produce?

The answer to this puzzle has everything to do with the intellectual property systems described in the previous section. If the world's largest producers of copyrighted entertainment threaten to withhold their wares from any broadcast channel that does not use a robust DRM system,54 what other option exists for industries that count such material as an essential input? One might ask whether that threat to withhold content is credible or appropriate—some evidence suggests that it is neither55—but most technology firms would rather not take the risk. If they do not adopt the DRM system, they may be barred from the market entirely. If they do adopt it, they will be only as disadvantaged as all their other competitors. As a result, the rational decision is not only to adopt the content industry's requested DRM, but also to participate in its creation, for a number of reasons. First, being involved in a DRM standard might make it better—i.e. cheaper or less restrictive—than it would be if left to designers from the content industry. Second, your competitors are less likely to secure an advantage—like the inclusion of their intellectual property in the standard56—if you are there to monitor the situation. Third, you may be able to pull a similar trick on your competitors, thereby eking out an advantage. So long as other firms are not allowed to "cheat" on the standard by releasing more feature-rich or less expensive devices, a firm has every incentive to play along in the standardization of DRM.

The structure of consortia, especially as employed in the production of DRM, has implications for the participation of open source vendors as well. Membership fees, licensing fees for proprietary technology, and high participation costs are typical in consortia, but open source projects usually are loosely governed, composed of volunteers spread around the world, and poorly capitalized. This means that open source groups are unlikely to participate in the dominant, industry-led standards consortia. Moreover, DRM applications like CPCM, discussed below, often discourage open source approaches on technical as well as financial levels. In order to access DRM-restricted content, the viewer must use hardware and software that resists "tampering" by users.57 One of the fundamental features of open source products, of course, is that users must be able to modify the code. So while a specification might not explicitly ban open source implementations, its rules can be articulated in a way that privileges the incumbent industrial information economy. As Gillespie puts it, "The standard presumes particular legal and economic arrangements, which themselves become standardized."58

The fact of a public-private shift in the creation of standards is not difficult to see, but its implications are not as obvious. However, it is clear that interface standards, especially those related to communications technology, deal with more than mere technical compatibility. All modern network technologies are substantially standardized, and we rely on those technologies for information about our world and how to shape it. The sum of these changes is that, over the last half-century, standards have evolved from blunt instruments of national trade policy into sophisticated regimes of private authority. In the context of DRM, standards can describe new geographies of power and control. This is troubling because the public interest protections of the public standards system have been hobbled through delegation to the private sector, while the importance of standards has increased in proportion to our reliance on information networks like the Internet. As a result, we have a fundamental political interest in understanding how a standardized technology will filter and privilege information, how we are enabled or constrained in our attempts to access and process cultural inputs.59 Decisions made by a standards body, public or private, can constitute a de facto policy regime.

Our need to critically interrogate technological systems grows in proportion to their centrality in our political, economic, and cultural lifeworlds. It is therefore sobering that the concerns raised above are far from exhaustive—in fact, it is useful to remember that most are drawn from one case study of a single private consortium. Hundreds of similar groups are working on standards of their own, untroubled by public scrutiny and unburdened by democratic safeguards. At a time when network technologies have seized our imagination with their promises of emancipation and democratic reinvigoration, private standardization is an important counter-narrative that must be addressed. The following case study illustrates these changes by looking at one archetypical consortium working on DRM standards.

E. DVB Case Study

[Figure 3: DVB Worldwide — a map indicating where DVB standards for digital television are in use, adopted, or planned, and where other standards are in use.]

The Digital Video Broadcasting Project (DVB) is a consortium that drafts digital television standards that are used in Europe, Asia, Africa, and South America60 [Fig. 3].61 The group was formed in 1993 to coordinate the long-heralded move from analog to digital television, which will require that millions of viewers replace their analog television receivers with new digital tuners. This shift will, in turn, depend on the availability, desirability, and affordability of such devices. While this shift is only now beginning to hit full stride in the world's most prosperous nations, its significance has been apparent for over two decades. Once the transition is complete, broadcasters will use different portions of the electromagnetic spectrum, leaving highly desirable portions of spectrum "vacant." Those frequencies will then be auctioned by national governments, which will bring in tens of billions of dollars for states while opening markets for new technologies and services.62 In short, an orderly, quick transition to digital television is an extraordinarily high priority on national communication policy agendas around the world.

What kind of private organization is capable of drafting rules that would have been squarely in the purview of state-sponsored organizations just three decades earlier? According to DVB's website:

"The DVB Project is an Alliance of about 250-300 companies, originally of

European origin but now worldwide. Its objective is to agree specifications for

digital media delivery systems, including broadcasting. It is an open, private

sector initiative with an annual membership fee, governed by a Memorandum of

Understanding (MoU).,l63

This concise self-description illustrates several of the changes discussed in the previous section. The group's membership fee is €10,000 per year, and each working group—several of which may be toiling away simultaneously on different aspects of a new standard—meets in a different global city every month. Since a single standard can take years to complete, participation in the creation of a DVB standard, from start to finish, may cost millions of dollars. This helps explain why DVB members tend to be multinational corporations, but the most serious barriers to diverse participation are DVB's membership rules. Every DVB member must fit within four pre-defined categories to "ensure balance of representation:" content providers/broadcasters, infrastructure providers (satellite, cable, terrestrial or network operators), manufacturers and software suppliers, or governments/national regulatory bodies.64 In this context, "balance" means barring academics, individuals, and civil society organizations from becoming DVB members.65

While government representatives are able to join DVB, they are procedurally restricted from leading the organization. For example, DVB is directed by a Steering Board composed of 40 elected representatives drawn from the four aforementioned categories.66 The number of representatives from each category is dictated by DVB's organizing documents, and content providers are allowed the greatest number of votes—twice as many as allotted to government/national regulatory bodies.67 Moreover, only members "indicating their intention to contribute with resources [...] for the benefit of the DVB Project will be eligible to stand for election."68 The message of these choices is clear: the State can participate, but DVB is a private party. The group's private bias is further highlighted on its website, which states, "Specifications are only worth developing if and when they can be translated to products which have a direct commercial value."69

Copyrighted entertainment programming is a crucial input to the commercial value chain contemplated by DVB, and copyright holders are extremely active in the formulation of DVB standards. Though sometimes represented individually, the largest bloc of copyright holders is typically represented by trade groups like the Motion Picture Association of America and its international arm, the Motion Picture Association (referred to collectively as "MPA"). The MPA represents the world's largest producers of commercial movie and television programs.70 While both groups are headquartered in the United States, their interests are considerably more cosmopolitan. The films of MPA member companies are theatrically released in 125 countries, and their television programs are aired in 150 national markets. In addition to geographic breadth, the MPA is concerned with a broad range of media dissemination technologies. For example, the global film industry, once synonymous with the box office, has diversified immensely over the last three decades. Today, theatrical releases account for only 19% of the $42.6 billion worldwide motion picture market.71 In comparison, 36% of industry revenue comes from pay and free-to-air television, and a full 44% is generated by the market for DVDs.72 In short, the industry represented by the MPA reaches its customers via a range of media technologies in nearly every country in the world.
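
Converting those shares into dollar figures makes the shift away from the box office plain (the percentages and the $42.6 billion total are from the MPA study cited above; the arithmetic is mine):

```python
total = 42.6  # worldwide motion picture market, billions of USD
for channel, share in [("theatrical", 0.19), ("television", 0.36), ("DVD", 0.44)]:
    print(f"{channel:>10}: ${total * share:>4.1f} billion")
# theatrical: $ 8.1 billion; television: $15.3 billion; DVD: $18.7 billion
```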

Copyrighted programming produced by MPA members is also a vital input in many other industries. This paper deals primarily with digital broadcasting, and we can view that supply chain as follows: an MPA member produces copyrighted material, that material is licensed to broadcasters, and it is transmitted to the home where, finally, it is viewed. Additional steps will be inserted as required by the business model of the broadcaster. Terrestrial broadcasters typically sell advertising that airs along with the programming. Cable and satellite broadcasters may do the same, and in addition charge a monthly or per-play fee for access to their signals. Further, each broadcaster will have its own suite of technology providers. The electronics companies which manufacture the set-top box, the flatscreen television, and the personal digital video recorder which display that signal also have a vested interest in being part of the copyrighted entertainment supply chain.

As significant as the global film and television production market has become, it is dwarfed by the other links in the value chain. For example, the American cable industry drew over $100 billion in 2007 from subscriber fees and advertising revenue.73 The Consumer Electronics Association, the American trade group that represents manufacturers of home and mobile entertainment equipment, estimates that global sales of consumer electronics reached almost $618 billion in 2007.74 The result is a huge, inverted pyramid of value that depends largely on copyrighted content, to which the copyright holder enjoys a monopoly right.75

Copyright holders have succeeded in expanding DVB's work to include a standard for a sweeping copy-control regime. This initiative is called "Content Protection and Copy Management" (CPCM). It would only allow protected content to be viewed by "approved" receiving devices like televisions and tuner cards, and any other downstream electronics that require access to those signals. The resulting technological environment will give copyright holders vastly greater control over the use of their work than is provided for in the copyright law of any country in the world.76

DVB's activity raises at least two serious public policy problems. First, within the technical environment governed by these standards, technologically enforced rules will effectively replace national copyright policy. Where the relationship between users and copyright holders has nominally involved some kind of legislatively derived balance, the CPCM framework will allow copyright holders to make unilateral changes to that balance. Second, because CPCM requires platforms that are robust against "tampering," it cannot be implemented in open source software. On a technical level, it is important to point out that CPCM is a DRM standard that is distinct from the broadcasting signals over which protected content may travel. Therefore, while a DVB broadcast standard might well be implementable in an open source tuner, that tuner would display nothing whenever it came across CPCM-protected content. In other words, open source software and hardware makers will be unable to meaningfully compete in the market for digital television programming where DVB standards are adopted.

[Figure 4: How CPCM Works — a diagram contrasting how a broadcaster's DRM-protected content is handled by a licensed receiver and by an open source tuner.]

Figure 4 illustrates the difference between how an open source tuner and a DVB-licensed tuner would function in a DVB market, but it does not quite communicate the disadvantage that open source vendors would face. As most content on a commercial broadcasting network is produced by the kind of copyright holders who are busy building CPCM, and because there is no incremental cost to using the system once it is in place, it is likely that the ratio of DRM-protected content to "open" content would be much, much higher than one-to-one. The follow-on effect of this, discussed further in Section V(B)(1), is that open source vendors are effectively barred from the market.
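
The asymmetry in Figure 4 can be modeled in a few lines. This is an illustrative sketch, not the actual CPCM specification, and the 9:1 protected-to-open ratio is a hypothetical stand-in for a commercial broadcast lineup:

```python
# A hypothetical programme lineup dominated by CPCM-protected content.
broadcast = ["protected"] * 9 + ["open"] * 1

def viewable(tuner_is_licensed: bool) -> int:
    # Open source tuners must allow user modification, so (per the text)
    # they cannot satisfy CPCM's robustness rules and never obtain the
    # keys needed to render protected programmes.
    return sum(1 for programme in broadcast
               if tuner_is_licensed or programme == "open")

print("licensed tuner:   ", viewable(True), "of", len(broadcast))   # 10 of 10
print("open source tuner:", viewable(False), "of", len(broadcast))  #  1 of 10
```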

The issues raised by DVB demand consideration of a public response. We rely on advanced communication networks more every day, but the average citizen's capacity to influence their design and hold their designers accountable is in decline. If copyright holders are able, through DVB, to effectively rewrite a country's copyright policy, which mechanisms could the public use to fight such a development? What social costs will be incurred? The following sections explore legal theories that might be useful in addressing these concerns.

1 Crane, Rhonda J., The Politics of International Standards: France and the Color TV War, (Ablex Publishing Corporation, 1979), xii.
2 Antitrust Enforcement and Intellectual Property Rights: Promoting Innovation and Competition (U.S. Dep't of Justice & Fed. Trade Comm'n, April 2007), 33, www.usdoj.gov/atr/public/hearings/ip/222655.pdf.
3 There are notable exceptions, as in antitrust scrutiny of patent pools and the use of standards to exclude market participants. In Section VI(B), I examine why limiting regulatory scrutiny to these kinds of activities is inadequate, especially in light of limitations on NGO participation.
4 Raymund Werle and Eric J. Iversen, "Promoting Legitimacy in Technical Standardization." Science, Technology & Innovation Studies 2.1 (2006): 20.
5 See generally Crane 1979.
6 Gillespie 2007, 2.
7 Liora Salter, "The Standards Regime for Communication and Information Technologies." Private Authority and International Affairs. Claire A. Cutler et al., eds., (Albany: SUNY Press, 1999), 101.
8 Haufler, Virginia, A Public Role for the Private Sector: Industry Self-Regulation in a Global Economy, (Washington, DC: Carnegie Endowment for International Peace, 2001), 3.
9 Ibid.
10 For example, in the wake of ballot-counting problems in the United States' 2000 presidential election, electronic voting machines emerged as an option for updating the country's democratic infrastructure. To facilitate that shift, the country's largest voting machine manufacturers came together to create a standard to codify values like privacy, security, and inclusiveness. However, it was widely criticized for not meeting those goals and was made largely irrelevant by new legislation that banned the voting machine industry's most popular technologies. Here, industry's inability to develop an appropriate standard highlights the political and economic stakes of striking the right balance. See e.g. John Hendren, "Armed to Send Chads into Voting Oblivion," The New York Times, December 17, 2000: sec. 3; and Erica Vonderheid, "Standing Up for a Better E-Ballot Box," The Institute, September 3, 2004.
11 Vonderheid 2004, 1.
12 Gillespie 2007, 140.
13 The economic justifications for standards are examined more fully in Section VI.
14 Egyedi, T. M. "Beyond Consortia, Beyond Standardization? Redefining the Consortium Problem." Advanced Topics in Information Technology Standards and Standardization Research, ed. Kai Jakobs. (Hershey, Pennsylvania: Idea Group Publishing, 2005), 92.
15 Haufler 2001, 3-7.
16 For example, the voting machine scenario discussed at note 10 in this section.
17 Cutler 1999, 4-5.
18 Salter 1999, 113.
19 These themes have been addressed extensively in a range of disciplines. For a good synopsis, see Phillip F. Kelly, "The Geographies and Politics of Globalization." Progress in Human Geography 23 (1999): 379-400.
20 See generally Galloway 2004.
21 "About Us." ITU.org, August 27, 2007, http://www.itu.int/net/about/index.aspx
22 Richard W. Hawkins, "The Doctrine of Regionalism." Telecommunications Policy 16.4 (1992): 239-40.
23 Crane 1979, 41.
24 "American National Standards Institute." ANSI.org, August 27, 2007, http://ansi.org/
25 "International Organization for Standardization." ISO.org, August 27, 2007, http://www.iso.org/
26 P. A. David and M. Shurmer, "Formal Standards-Setting for Global Telecommunications and Information Services. Towards an Institutional Regime Transformation?" Telecommunications Policy 20.10 (1996): 793-94.
27 James Love and Tim Hubbard, "Paying for Public Goods." CODE: Collaborative Ownership and the Digital Economy, Rishab Aiyer Ghosh ed. (Cambridge: The MIT Press, 2005), 207-09.
28 Raluca Bunduchi et al., "Between Public and Private - the Nature of Today's Standards." Proceedings of Standards, Democracy and the Public Interest. (Paris, 25 August 2004) 3.
29 David and Shurmer 1996, 795-96.
30 Though this is debated, for the purposes of this paper I distinguish between mandatory rules (regulations) and voluntary rules (standards). See e.g. Salter 1999, 105. ("Regulations have the force of law. Standards do not have the force of law unless or until they are promulgated as regulations.")
31 Bunduchi 2004, 3.
32 See e.g. U.N. General Assembly, "Resolution 55/2 [United Nations Millennium Declaration]." September 8, 2000. (Committing all U.N. agencies "[to] ensure that the benefits of new technologies, especially information and communication technologies [...] are available to all.")
33 "About Us," ITU.org, August 27, 2007, http://www.itu.int/net/about/index.aspx
34 "ANSI: Overview," American National Standards Institute, November 11, 2007, http://www.ansi.org/about_ansi/overview/overview.aspx?menuid=1
35 Dan Schiller, Digital Capitalism: Networking the Global Market System, (Cambridge: MIT Press, 1999), 45.
36 Alfred Aman, "A Global Perspective on Current Regulatory Reforms: Rejection, Relocation, Or Reinvention?" Indiana Journal of Global Legal Studies 2.2 (1999): 442.
37 See, e.g. David Harvey, A Brief History of Neoliberalism, (Oxford: Oxford University Press, 2006).
38 Rodney Bruce Hall and Thomas J. Biersteker, eds. The Emergence of Private Authority in Global Governance. (Cambridge: Cambridge University Press, 2003), 23.
39 Adrienne Heritier, ed. Common Goods: Reinventing European and International Governance (Boulder, Colorado: Rowman & Littlefield, 2002), 99.
40 See Claire A. Cutler, "Private International Regimes and Interfirm Cooperation," in Hall 2003, 23-41.
41 Richard Hawkins, "The Rise of Consortia in the Information and Communication Technology Industries: Emerging Implications for Policy," Telecommunications Policy 23.2 (1999): 164.
42 Roy Rada and John Ketchell. "Sharing standards: standardizing the European information society," Communications of the ACM (2000): 21-25.
43 Werle 2006, 28.
44 Hawkins 1999, 161.
45 Werle 2006, 29.
46 Other paradigms exist, like the Internet Engineering Task Force (IETF), which produces most of the standards for the public Internet. That group only allows individual membership, conducts meetings over inexpensive email lists, and has a less hierarchical leadership structure. However, analysis of the membership shows that corporate actors dominate even this model. See Kai Jakobs, Standardisation Processes in IT: Impact, Problems and Benefits of User Participation. (Wiesbaden: Vieweg, 1999), 157.
47 The discourse employed to taxonomize countries as either North/South, developed/developing, First World/Third World, core/periphery, etc., is terrifically fraught. I prefer the less normative formulation of North/South, but recognize that this is not without its own pitfalls. For example, it has been argued that the binary is better described as the North defining the South in ways that limit the agency and prospects of the latter's citizens. See, e.g. Roxanne Lynn Doty, Imperial Encounters: The Politics of Representation in North-South Relations (Minneapolis: University of Minnesota Press, 1996).
48 See, e.g. "Integrating Intellectual Property Rights and Development Policy." Report of the UK Commission on Intellectual Property Rights. September 2002, http://www.iprcommission.org/papers/text/final_report/reporthtmfinal.htm
49 Salter 1999, 102.
50 Hawkins 1999, 166.
51 Salter 1999, 102.
52 For example, a survey from one of the major record labels indicated that the number of people who think that DRM is a "good idea because [it] protects copyrighted content from illegal file sharers" dropped from 61% to 49% over the last year. See Bill Rosenblatt, "New EMR Survey: Consumer Attitudes Souring Towards DRM," DRM Watch, March 13, 2008, http://www.drmwatch.com/article.php/3733936
53 Some European countries have begun to express their dissatisfaction with DRM. For example, see David Ibison, "Norway declares Apple's iTunes illegal," Financial Times, January 27, 2007, http://www.ft.com/cms/s/2/1fc40360-abe9-11db-a0ed-0000779e2340.html
54 The American content industry has been vocal about its intention to do this. Allan Friedman et al., "Underlying Motivations in the Broadcast Flag Debate" (presented at the Telecommunications Policy Research Conference, Washington, D.C., September 21, 2003), 22, http://www.sccs.swarthmore.edu/users/02/allan/broadcast_flag_debate.pdf
55 In the past, content holders have threatened to withhold their content from various technologies—i.e. colour television, videotapes—until certain technical concessions were made, but these threats have typically failed as soon as one member broke ranks. See Fred von Lohmann, "Tech Mandates or No DTV? Calling the Cartel's Bluff," Consensus at Lawyerpoint, June 14, 2002, http://bpdg.blogs.eff.org/archives/000142.html
56 This kind of common but anti-competitive behaviour—known as "submarine patenting"—is discussed further in Section VI, 84-85.
57 Cory Doctorow, "The Digital Video Broadcasting Group's Project on Content Protection and Copy Management: A Stealth Attack on Consumer Rights and Competition." EFF.org, September 29, 2005, http://www.eff.org/IP/DVB/dvb_critique.php
58 Gillespie 2007, 143.
59 See, e.g. Lawrence Lessig, Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity (New York: Penguin Press, 2004); Siva Vaidhyanathan, The Anarchist in the Library: How the Clash Between Freedom and Control is Hacking the Real World and Crashing the System (New York: Basic Books, 2004); and Coombe 1996.
60 The DRM system being built by DVB, even if implemented in standards used in North America, would have been difficult to adopt in the United States or Canada for two reasons. First, in the United States, it is unlawful to encrypt "basic tier" broadcasts, which refer to content like local and public broadcasting, and major television networks like NBC and CBS. This provision was meant to facilitate compatibility between television tuners, and to promote the dissemination of over-the-air broadcasts that are retransmitted via cable. This would have been a barrier to the development of a CPCM-like DRM system that relies on the encryption of content. Other methods of control—technology mandates, for example—have been pursued by content holders in the U.S. Second, because of the proximity of Canada's population centers to its border with the U.S., it would have been very difficult for Canada to adopt a distinct digital television standard. Using the same standard means that Canadian consumers enjoy the economies of scale of the U.S. electronics market. This is part of the reason that the U.S. and Canada have shared the same television standards since the 1940s.
61 Data for this figure was compiled from DVB's website. "Digital Video Broadcasting - DVB Worldwide," DVB.org, July 16, 2008, http://www.dvb.org/about_dvb/dvb_worldwide/.
62 In the United States, the 2008 auction for spectrum that will be vacated after the digital television transition raised over $19 billion. Stephen Labaton, "Wireless Spectrum Auction Raises $19 Billion," The New York Times, March 19, 2008, sec. Technology, http://www.nytimes.com/2008/03/19/technology/19fcc.html
63 "History of the DVB Project," [DVB History] DVB.org, http://dvb.org/about_dvb/history/index.xml
64 "Memorandum of Understanding," [MoU] DVB.org, Article 6.1, http://dvb.org/membership/mou/
65 Between 2004 and 2007, the Electronic Frontier Foundation was able to represent open source manufacturers at DVB with the financial support of the John D. and Catherine T. MacArthur Foundation. I managed EFF's work at DVB, including attending the group's meetings, between 2005 and 2007.
66 MoU, Article 6.1.
67 The Memorandum of Understanding stipulates that the steering board have 14 content industry representatives, 9 infrastructure providers, 10 manufacturers, and 7 government or regulatory members. MoU, Article 6.1.
68 Ibid.
69 DVB History.
70 Toby Miller et al., Global Hollywood 2, 2nd ed. (British Film Institute, 2005), 213.
71 Hollinger, Hy. "MPA study: Brighter picture for movie industry." The Hollywood Reporter (March 31, 2008), http://www.hollywoodreporter.com/hr/content_display/news/e3ic5575a8c4f61aadd68a0d344f476d5da
72 Ibid.
73 National Cable & Telecommunications Association, "Statistics," NCTA.com, (March 17, 2008), http://www.ncta.com/Statistic/Statistic/Statistics.aspx
74 Consumer Electronics Association, "International News," CE.org, (March 17, 2008), http://www.ce.org/International/default.asp
75 The competitive implications of this relationship are explored further in Section VI.
76 At its most basic, copyright rewards the creators of original expressions with a temporary monopoly over certain uses of their creations. However, it is important to note that there are significant constraints on that monopoly. For example, fair use and fair dealing—the lawful, context-specific use of copyrighted material without authorization from a rights holder—create space for the public to engage in commentary, criticism, etc. See, e.g. Cory Doctorow, "The Digital Video Broadcasting Group's Project on Content Protection and Copy Management: A Stealth Attack on Consumer Rights and Competition" (EFF.org, September 29, 2005), http://www.eff.org/IP/DVB/dvb_critique.php

V. Competition Law vs. Standardized DRM?

Private standards are remarkably insulated from traditional modes of public accountability. Their drafters are not elected officials who must worry about re-election, and while it is claimed that consumers "vote with their wallets," standards are often designed to eliminate that choice by settling "harmful" competition issues before they cloud the market.1 When standards become public, it is typically through dense technical specifications, or the finished systems that are their physical manifestations. How might members of the public, alone or represented by public interest organizations, influence private groups that have institutionalized their indifference to those constituencies?

An obvious response would be an appeal to the state, but what kind of assistance should be sought? Is effective recourse likely to be found in legislative, judicial, or regulatory action? And what conduct should we address with those tools? For a variety of reasons, I submit that competition law—and particularly the American branch known as antitrust law—provides useful methods for addressing the problem of copyright holders designing distributed systems of information control in private standards bodies.2 First, competition inquiries revolve around structural analysis; they take account of the parties in question, the power that they wield against one another, the roots of that power, the propriety of its application in a particular circumstance, and so on. It is as useful a tool for critically engaging with the concentration and application of power as the law is likely to provide. Second, competition law provides structural remedies to these problems. In their classical form, antitrust remedies ran the gamut from breaking up conglomerates, to enjoining forbidden conduct, to the award of significant monetary damages. The breadth of these remedies has attracted the ire of those who reject government involvement in the market, but perhaps that is more a virtue than a vice. Finally, competition law can provide for private rights of action in addition to government-led inquiries. This increases the pool of potential claimants, which reduces reliance on state agencies that may be out of touch with certain kinds of competitive harms. Other paths, like consumer protection and copyright reform, could be useful and necessary, but competition law provides the best intersection of structural analysis, applicable doctrines, and meaningful remedies.

Why American competition law? Partly because the companies most active in pushing DRM are American entertainment companies. Also, while the trends described above have swept the world, the mythology described in Section I is most closely linked with the rise of the Internet in the United States. Born out of populist concern about the concentration of power, American competition law has long been concerned with the kind of behaviour being investigated in this paper. That the populist inclinations of antitrust law have since been dulled by the worship of economics makes it even more interesting to apply it to other deregulated, privatized phenomena like the Internet and standards. This section also identifies European and Canadian competition doctrines that should be explored further in relation to standards, DRM, and information control.

Ironically, reinvigorated competition law is also a potential remedy for concerns about network neutrality. Some of the concepts discussed below in relation to standards, like the essential facilities doctrine, were debated in proto-network neutrality discussions of competition in the provision of Internet service.3 The argument was that if ISPs were forced to share their wires—the physical layer of their networks—with competitors, the resulting competition would insulate consumers from harmful practices like high pricing and content filtering. Free marketeers vigorously fought that line of reasoning as "infrastructure socialism," and it never convinced regulators.4 However, a more consumer welfare-centered approach could address the monopolistic depredations of ISPs.

While the remedies provided by competition law—injunctions on harmful behaviour, forced restructuring, large damage awards to dissuade future bad conduct, etc.—are attractive, they have become more difficult for public interest-minded litigants to attain. This section looks at ways in which competition law might be dragged back into the service of its public interest origins.

A. Competition in America

With the passage of the Sherman Antitrust Act in 1890, the American framework for competition regulation was launched with the goal of dissuading "every contract, combination [...] or conspiracy in restraint of trade."5 A populist spirit fueled America's golden age of "trust busting," and it produced a symbolic connection between antitrust law and values like consumer protection, market access for small players, and suspicion of consolidated power. This began to change in the 1970s, as American antitrust law went through a revolution led by law and economics scholars associated with the University of Chicago. The Chicago School has been analyzed in enough detail that further inquiry is not merited here,6 but its approach to competition can be summarized as the pursuit of economic efficiency, born of an abiding faith in the ability of the market to solve economic and social problems.7 It correspondingly views government intervention in the market as presumptively inefficient, cumbersome, and undesirable. For example, the Chicago School believes that a functioning market is likely to foster consumer protection, and so it favors market-protecting policies over rules that would protect consumers directly. This way of thinking was a marked departure from the hands-on, populist approach that characterized antitrust inquiries through most of the 20th century.8

While courts and Congress had long before recognized that the Sherman Act's plain language was over-inclusive and demanded qualification, the Chicago School dramatically restricted the role of antitrust law in American life. Today, antitrust violations fall into two categories. "Per se" violations refer to conduct that is barred outright because it is never beneficial to anyone but the monopolist. For example, price fixing is considered so toxic that the mere existence of an arrangement among competitors to set retail prices is enough to attract sanction. A "rule of reason" offence, on the other hand, refers to a restraint of trade that is unreasonable when balanced against the defendant's business justifications, market power, and any alleged pro-competitive effects of its conduct. This standard is supposed to account for efficiencies created by otherwise anticompetitive behavior. Chicago School thinkers are primarily responsible for this shift towards treating the economic efficiencies of business conduct as mitigating factors in competition analysis, which has produced an environment where fewer antitrust claims are now seen as per se violations. For claims evaluated under the rule of reason, alleged anticompetitive behavior now typically requires a showing that the accused wields significant market power and that its conduct cannot be saved by its procompetitive effects. The net result is that antitrust suits are more fact-intensive, time-consuming, and costly than ever before.

Despite the general retrenchment of competition law in the United States, at least two maxims persist: competitors ought not to share information or coordinate their activities overmuch, and scrutiny is warranted where market power accumulates in the hands of a single competitor or group of competitors. Standards and their drafting offer numerous opportunities for classically anticompetitive behavior. Unfortunately, as Lemley has pointed out, these modest values have never been comfortably applied to standards, especially standards for digital technologies. The whole point of an SSO is to allow competitors to meet to exchange information and coordinate their actions. Moreover, competition between standards is self-defeating, because a standard exists precisely to capture the efficiencies produced by eliminating certain kinds of wasteful competition in a market. For most, this tension is resolved by thinking about information technology standards as natural monopolies, or monopolies which are more beneficial than the competition that they displace.

This is not to say that antitrust has no place in the world of standards. For example, competition claims are appropriately brought when a standard is mere camouflage for a conspiracy to set prices. However, such a claim is nonresponsive to the new structures of cooperation embodied by modern standards consortia. Instead, applied to standard-setting processes for digital technology, competition rules have most often dealt with issues of submarine patents and holdup. The former occurs when a consortium participant fails to disclose the existence of a patented technology until after the standard is finished. At that point, because of the high cost of switching to another standard, the patent holder can extract a supracompetitive price for the use of its patent. The related problem of holdup arises when a participant discloses but refuses to license a patent that is necessary for implementation of the standard. If a patent gains significant value by virtue of its becoming embedded in the standard, its owner can again extract supracompetitive licensing fees.9 SSOs typically address these problems by requiring members to agree to license any relevant intellectual property at reasonable and non-discriminatory rates, though problems persist.10

As noted, most modern intersections between competition law and technology standards are governed by the rule of reason. Unless the SSO is hosting meetings where price-fixing occurs, a balancing test will be applied to its activity. And here is where the normative perception of standards has dulled the interest of competition jurists and regulators. Compared to technical balkanization and a proliferation of incompatible technologies, standardized markets look attractive to competition regulators. Such markets are supposed to spur innovation and competition by avoiding wasteful format wars. Moreover, to the extent that the market is ordered by standardized DRM, price discrimination becomes more feasible. Somewhat counter-intuitively, the ability to undertake granular price discrimination is considered a virtue by economists because it helps eliminate deadweight loss.11 This is a fine example of robust economic reasoning, courtesy of the Chicago School, that would strike the average consumer as producing a decidedly unwelcome result. Finally, though contentious among people who follow its development, the deployment of DRM is an accepted, legitimate activity in the eyes of government.12 Protection of IP has been an increasing area of government activity even as interest in antitrust has waned. Combating DRM's presumption of legitimacy is a larger normative task for advocates of balanced copyright and technology policy.
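To make the economists' logic concrete, a minimal worked sketch may help; the notation is mine and purely illustrative, not drawn from the sources cited above. Under a uniform monopoly price, every consumer whose reservation price falls between marginal cost and that price is excluded from the market; under perfect price discrimination, each consumer is charged exactly her reservation price, so no willing buyer is excluded:

% Deadweight loss under a uniform monopoly price p_m versus perfect price
% discrimination; r_i is consumer i's reservation price, c is marginal cost.
\[
\mathrm{DWL}_{\text{uniform}}
  = \sum_{i \,:\, c \,\le\, r_i \,<\, p_m} (r_i - c) \;>\; 0,
\qquad
\mathrm{DWL}_{\text{discrim}} = 0
\quad \text{(each buyer } i \text{ is charged } p_i = r_i\text{)}
\]

The catch is distributional: deadweight loss disappears because every scrap of consumer surplus is transferred to the producer, which is precisely the result the average consumer is likely to find unwelcome.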

Provided that any interested party can adopt the standard, that an actor with market power does not impose it unilaterally, and that any restrictions on its licensing and use are reasonable, a standard typically passes muster with competition regulators. Even if these tenets are violated, a sufficiently compelling countervailing pro-competitive virtue may excuse the practice under the rule of reason. In other words, the bar is extremely high for parties who challenge the activities of standards groups on antitrust grounds.

B. Rethinking Competition, Standards, and DRM

"There is a general consensus that standards provide a wide variety of substantial procompetitive benefits. Because of this, the antitrust enforcement agencies and courts have looked favorably on industry-developed technical standards."

-Anton &Yao, 199513

"Ownership or control of important standards in the network industry may confirm market power that could be used to raise prices, retard innovation or facilitate anticompetitive [conduct]."

-Calderini & Giannaccari, 200614

As suggested by these authors, the conventional wisdom on antitrust and standards has shifted over the last decade. Where standards were once seen as plainly beneficial for competition, the changing nature of how, and by whom, they are drafted has sparked renewed concern over their effect on competition. Despite the current state of antitrust law, there are reasons to believe that the resurgence of a more public interest- oriented approach is possible. As noted competition scholar Hovenkamp has argued, "If one hundred years of federal antitrust policy has taught us anything, it is that antitrust is both political and cyclical."15 In a discussion of the balance between antitrust and intellectual property in pursuit of innovation, former FTC Chairman Robert Pintofsky has cautioned against the eclipse of antitrust by doctrinaire support for intellectual property.1

Growing consumer awareness of copyright policy, scholarly interest in compulsory licensing regimes like the essential facilities doctrine, and general public disdain for DRM systems are just a few factors that suggest the possibility of change on the horizon. This section provides an overview of how several antitrust claims might apply to DRM standards organizations, where they fall short, and how they might be modified to better reflect their populist, pro-consumer origins.

With this in mind, the following analysis should be taken as a thought experiment about claims involving generic consumer groups, open source vendors (OSVs), standards consortia, and consortia participants that, acting individually or in concert, may wield monopoly power in a market. DVB is referenced solely to illustrate points about structure and conduct. The fact that American antitrust law has limited application to DVB's conduct, at least in the European markets where it is most active, should underscore the hypothetical nature of this inquiry. Nonetheless, I submit that it is useful to map the rules of one jurisdiction over the facts of another in order to study the overlap and lacunae that result.

It is worth noting that competition litigation, while conjuring images of government regulators, may also be initiated by private actors in the United States. Aggrieved parties can bring suit for treble damages—three times the amount of any actual damage suffered by a plaintiff—if they have been harmed by anticompetitive behavior, and courts can enjoin ongoing anticompetitive conduct. These remedies are painful, and their power to dissuade should not be underestimated.17 However, aside from government competition regulators, holdup and submarine patent claims can be brought only by private entities that participated in, and were therefore harmed by, a particular standard development process. Claims from parties that have been entirely excluded from the process or fruits of standard-setting—consumers and open source vendors, for example—would need to look to other causes of action.

C. Framing Observations

Before delving into this analysis, four framing observations must be made. First, this analysis proceeds under the assumption that the activity of standards consortia and their content industry participants is not designed to fix prices or reduce output. Those related activities are the most obvious forms of anticompetitive behavior, and they are also the most clearly forbidden. In fact, many other antitrust claims are essentially structural variations on the theme of price fixing. The presence of such activity would be a gift to those who wish to frustrate the work of standards bodies working on DRM, and copyright holders have occasionally indulged in those practices,18 but good competition counsel would have steered any standards group away from such malfeasance. The likely absence of obvious price fixing is expected, but it also forces this antitrust analysis into more nuanced territory. However, real social harms and economic costs are produced by the standardization and ubiquitous deployment of DRM systems, and antitrust law still may provide a sharp tool for their investigation.

Second, because most antitrust offences are analyzed under the rule of reason, which typically requires a fact-specific finding of market power, this analysis proceeds on the assumption that its various evidentiary standards could be met. However, more research is needed to fully understand which parties have market power in the scenarios outlined below. Though such a finding will depend on how one defines the market in question, and though there is no clear statutory or common law threshold above which a firm can clearly be said to enjoy market power, recent jurisprudence suggests that controlling something more than 30% of the market is required.19 For example, it would be useful to know if the MPA wields market power in the sense contemplated by competition jurists. As the primary agitator for DRM standards, is the highly concentrated motion picture industry operating as a monopoly when represented by its unifying trade association? One way to undertake this analysis, sketched below, would be to research the programming served by various broadcasters who use DVB standards and correlate that data with the output of MPA member studios. If it turned out that the MPA provides 70% of the commercially broadcast content in a given market, its activities in relation to downstream firms and consumers might be subject to heightened scrutiny.
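As a rough illustration of that correlation exercise (the symbols are mine and purely hypothetical; this is not an established antitrust test), the relevant share could be expressed as the fraction of commercially broadcast programming hours attributable to MPA member studios:

% Hypothetical screen for MPA market power: h_j denotes hours of programming
% supplied by studio j to commercial broadcasters in the market under study.
\[
s_{\mathrm{MPA}}
  = \frac{\displaystyle\sum_{j \in \mathrm{MPA}} h_j}
         {\displaystyle\sum_{j \in \text{all studios}} h_j}
\]

On the figures hypothesized above, s_MPA would equal 0.70, comfortably above the rough 30% floor that recent jurisprudence suggests for a finding of market power, though the result would remain hostage to how the market itself is defined.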

In this way, groups like DVB could be an example of the "New Wal-Mart Effect"20 gone wrong. Described by regulatory expert Michael Vandenbergh,21 this phenomenon occurs when a market actor—or, in this case, a group of companies represented by a single trade organization—wields power to such a degree that it is able to control the behaviour of downstream actors.22 In the world of digital broadcasting, the MPA enjoys this position because its members control an essential input for the markets in digital television advertising, subscriptions to cable and satellite television services, consumer electronics devices like personal video recorders, etc. However, while the New Wal-Mart Effect was coined in reference to that company's use of its power to improve environmental conditions around the world, copyright holders are using their position to encourage standards consortia to embed controversial policy objectives into their products.

While a company like Wal-Mart exercises its power by controlling access to its retail locations, copyright holders wield power by controlling access to licenses for their content. This phenomenon looks remarkably similar to the "strategic use" of patents, which refers to leveraging a patent right in order to achieve a business goal in another market.23 This model stands in contrast to the "traditional" model of intellectual property, where an exclusive right is granted in order to provide incentives for the creation of more intellectual work. In that model, the purpose of IP protection is to encourage creativity. In the strategic use model, IP rights are sought for their utility in achieving secondary goals.

The phenomenon of strategic use is well documented in patent literature, but its application to copyright is more novel. It is explored in more detail in Section V(B)(3) of this paper.

Third, the social costs of anticompetitive behavior should be expanded to include social harms that are more expansive than high prices. For example, in the context of antitrust, the usual approach to examining harm focuses on "deadweight loss." Deadweight loss occurs when a monopolist raises the price of its goods beyond the highest price that a consumer will pay for a good—known as the consumer's "reservation price"—thereby preventing the consumer from obtaining the good. In the presence of competition, another firm would simply provide the good at the consumer's reservation price, provided that that price is not below the cost of producing another unit of the good—known as the marginal cost. In other words, the social cost of monopoly can be defined as utility that is lost when monopolists sell goods at unacceptable prices.

However, this dynamic also occurs on matters other than price. Take, for example, a monopolist in the car market that arbitrarily chooses to sell only blue cars. Customers who hate blue cars will refuse to buy the monopolist's product, and the absence of competition means that no yellow cars can be had. This kind of non-price deadweight loss is important to quantify, especially in the world of technology. If the only way to receive digital television is to purchase a device that has limited functionality and can be remotely disabled by a copyright holder, some percentage of the market is likely to decline to buy that product at any price. And if that segment of the market is large enough, perhaps it should outweigh the efficiencies that copyright holders gain by being able to undertake granular price discrimination. In other words, if the movie industry extracted an extra $1 billion by using DRM, but the consumer electronics industry lost $2 billion because it was forced to sell undesirable players laden with DRM, has there been a gain in economic efficiency? The current model of American antitrust law does not deal elegantly with this scenario.
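A back-of-the-envelope tally makes the point. The figures are the hypothetical ones above, and the simple additive welfare measure is my own illustrative assumption, not a doctrinal test; summing the changes in surplus across the affected groups gives:

% Hypothetical net welfare change from standardized DRM: a +$1B gain to
% content owners, a -$2B loss to consumer electronics firms, plus Delta CS,
% the (non-positive) change in consumer surplus from buyers who exit the
% market rather than accept DRM-laden devices.
\[
\Delta W = (+\$1\,\mathrm{B}) + (-\$2\,\mathrm{B}) + \Delta CS
         \;\le\; -\$1\,\mathrm{B}
\]

On these assumptions the arrangement reduces total welfare even before the non-price consumer-surplus term is counted, and a purely price-focused antitrust analysis has no obvious place to register that loss.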

Fourth, it is worth asking whether standards really do spark innovation. The conventional wisdom is that standardized interfaces prevent wasteful competition. Wasteful competition is the kind that occurs between formats (e.g. VHS vs. Beta, Blu-ray vs. HD DVD) and scares away consumers. In contrast, standards are supposed to leapfrog that process and free innovators to create desirable new tools and services. When that happens, the public wins.

In the case of DRM standards, however, "innovation" has a very different meaning. DRM standards force conformity in technical design. The entire raison d'être for such a regime is to enforce certain technical policies while prohibiting others. It is because of the standardized DRM in DVDs—which had to be licensed by any company wishing to sell DVD players—that the features of DVD players have remained essentially stagnant for 15 years. In the case of DVB, new features that interact with CPCM-protected content would require organizational approval. This is hardly a model that invites vigorous innovation. For example, despite the popularity of personal video players like the Apple iPod, there is no legally available device that converts DVDs into iPod-friendly formats. Makers of DVD backup software, while enjoying high consumer demand, have been litigated out of existence. Customers want these products, but the consortium that licenses DVD technology has forbidden their sale in the market. Instead, proponents of standardized DRM suggest that innovation will be facilitated in the business models of content delivery. It is worthwhile, I think, to consider whether this kind of business model "innovation" is preferable to the technical and pricing innovations that would occur in a market with robust competition from OSVs.

With these questions in mind, the following subsections walk through three potential antitrust claims. Subsection 1 attempts to deal with the harm to open source vendors in being excluded from the market for digital television devices. Subsection 2 considers the harm to digital television consumers who are forced to purchase proprietary devices with limited functionality. Subsection 3 moves outside of pure antitrust law to explore the doctrine of intellectual property misuse, which may nonetheless be employed to address antitrust concerns. Finally, Subsection 4 briefly analyzes relevant areas of non-U.S. competition law.

1. Harm to Open Source Vendors: Concerted Refusals to Deal

In the context of a DVB-like standard for DRM, OSVs may be excluded from the market for digital television equipment. For example, if the world's major producers of copyrighted entertainment content deal only with broadcasters that use a particular DRM system, and the standard for that system is licensed only to device manufacturing firms that prevent user "tampering," OSVs will be effectively blocked from the consumer electronics market for digital television devices. As outlined in Figure 4, an OSV could manufacture a digital tuner that could receive DVB signals, but any content protected with DRM would be indecipherable to the user. What consumer would purchase an open source tuner if it only worked with non-copy-protected signals? An OSV cannot compete with proprietary electronics companies under these circumstances. This model raises classic competition concerns in that it represents a group of competitors colluding to create rules that non-members cannot meet. For our purposes, the question is whether this harm is the result of illegal anticompetitive behavior.

The harm experienced by OSVs flows from being excluded from the DRM component of a digital television standard. Yet exclusion is not usually a violation of competition law. It is a basic premise of common law that a firm may decide its business partners, or with whom it will deal. However, this rule does not apply in two situations. First, a refusal to deal may be illegal if it is undertaken in order to facilitate monopolization or to maintain a monopoly. This does not seem to be the case here, notwithstanding the fact that some MPA members are also competitors in the consumer electronics market (e.g. Sony Corporation). The second situation in which a refusal to deal may be illegal is when the agreement is made between two or more competitors or even non-competitors—that is, when they enter into a concerted refusal to deal. Such agreements are disfavored because they artificially bar competition by certain firms while providing opportunities for price fixing, cartelization of the market, and non-price discrimination. Though previously considered a per se violation of Section 1 of the Sherman Act, the doctrine is now riddled with so many exceptions that it is more accurately described as a rule of reason offence.24 In particular, to attract sanction, a concerted refusal to deal must be undertaken either by an entity with market power or in an attempt to achieve an anticompetitive goal like price fixing.

Assume for a moment that the MPA—the entity refusing to deal with non-DRM-compliant technology vendors—could be shown to have market power. In that case, its refusal to license its copyrighted material would be the action under scrutiny, and it would be analyzed pursuant to the rule of reason. Today, it is the position of the U.S. Court of Appeals for the Federal Circuit that a refusal to license IP is not a violation of antitrust law unless one of three narrow conditions is met: the IP was obtained by fraud, the litigation to enforce the right was a sham, or the IP was used in an illegal tying scheme.25 These conditions have come under fire from a number of commentators, including the former chair of the Federal Trade Commission, for being overly protective of IP and for ignoring situations where IP rights can be used to achieve anticompetitive ends.26 In other words, the activity of the MPA seems to meet the structural requirements of a concerted refusal to deal (i.e. market power, the presence of an agreement between competitors, harm to competition), but its reason for doing so—protecting IP rights—is likely to provide a "reasonable" justification.

Combating this presumption of reasonableness could take one of two paths. First, one could directly engage with the normative support for DRM. As pointed out in Section III(D), recent consumer polling suggests that DRM is undesirable and unwelcome, and its legal protection has sparked increasing controversy in countries around the world. Leading this kind of evidence might support the position that DRM is not presumptively reasonable, and that its invocation in an antitrust suit should not provide blanket immunity. Second, a plaintiff could try to convince a court that content industry demands for a DRM standard are about more than protecting property rights. For example, could a court be convinced that the MPA's refusal to deal with OSVs is part of an attempted extension of its members' statutory copyrights into heretofore unclaimed territory? Is the MPA attempting to leverage its members' exclusive reproduction rights into control over other uses of copyrighted material, like time shifting, an activity in which American consumers lawfully engage? In other words, should competition regulators care if a copyright holder uses one statutorily defined right to bootstrap power over activities to which it has no legal claim? No court has yet applied this exact reasoning, but several related strategies exist.

Despite the general proposition that an IP right is exclusive and its holder is under no obligation to license it,27 there are some situations where such a license may be compelled. In the U.S., the "essential facilities doctrine" refers to the idea that when a monopolist controls facilities that are necessary for competition, a court can compel it to share access with potential competitors.28 While it has traditionally applied to physical property, there is no reason why this doctrine should not apply to intellectual property as well.29 In that context, it would function as a sort of compulsory licensing regime. This is not a new doctrine, but it is getting new attention. Several recent academic works have begun to mull its usefulness at the intersection of antitrust and intellectual property.30 Even the Department of Justice has echoed some of the concerns that animate the essential facilities doctrine:

"When the licensor and licensees are in a vertical relationship, the Agencies will analyze whether the licensing arrangement may harm competition among entities in a horizontal relationship at either the level of the licensor or the licensees, or possibly in another relevant market. Harm to competition from a restraint may occur if it anticompetitively forecloses access to, or increases competitors' costs of obtaining, important inputs, or facilitates coordination to raise price or restrict output."31

While the Supreme Court has never applied the doctrine explicitly, lower courts have employed it in limited circumstances.32 The lack of Supreme Court jurisprudence on this doctrine is just one barrier to its application in the United States. Even commentators who argue for the doctrine's reinvigoration recognize that its animating principles—a commitment to fair play, a hands-on approach to markets—have taken a beating in the economic turn of antitrust law.33

2. Harm to Consumers: Tying

Tying refers to the sale of a good (the "tying" product) on the condition that a buyer purchase a second product (the "tied" product) that she does not want or wishes to purchase elsewhere. For example, imagine a firm called FooCorp with a monopoly in the market for staplers, but no market power in the competitive market for staples. If FooCorp sold its staplers on the condition that consumers buy only FooCorp brand staples, the consumer would be deprived of choice and competitors in the staple market would be unfairly blocked. Tying used to be a per se violation, but this standard has been relaxed in recent years. The defendant must be shown to have market power in the market for the tying product, and its conduct must affect a "not insubstantial" amount of trade.34 This practice is forbidden by both the general language of the Sherman Act and Section 3 of the Clayton Act.35 Also, it has become accepted that some tying arrangements are pro-competitive.36

In the context of DRM standards, the tying product could be copyrighted material and the tied product would be restrictive DVB tuners. If a consumer wishes to view digital television in a DVB market, they will be forced to purchase DVB-licensed devices. In a competitive market, the consumer would be able to purchase open source devices, which are more functional and, potentially, less expensive than their DVB-licensed alternatives. In essence, the MPA is conditioning the sale of its goods on the tied purchase of a DRM-laden electronics device. Assuming that this affects a "not insubstantial" amount of trade,37 the conduct of the MPA would at first appear to be an illegal tying scheme.

Closer analysis of the tying test reveals problems. First, plaintiffs would have to prevail in their argument that there really are two products at issue. A tying claim will fail if the court is convinced that the conflict is really over a single product with multiple components—for example, the court could find that the conflict is really over the provision of digital television service, of which copyright licenses and tuning devices are simply two components. In Jack Walters & Sons Corp. v. Morton Bldg., Inc., Judge Posner found that a prefabricated building and the trademark attached to the building by its manufacturer were a single product for the purpose of a tying analysis. The willingness to combine a physical item and accompanying IP rights could be a problem for plaintiffs.

Second, it should be noted that in the context of antitrust, the presence of a patent—or, presumably, other IP—does not create a presumption of market power. This is important to recognize because patents and copyrights are often referred to as "time-limited monopolies." While this is a useful framing device for the study of IP, competition scholars and regulators have strenuously resisted it as unnecessarily confusing and inaccurate.38 In the context of modern antitrust, plaintiffs would have to prove that the MPA actually wields market power in the relevant market. Here, the party advancing a tying claim would want to define the market as "the market for programming that is, or is likely to be, aired by commercial broadcasters." Given the appropriate evidence, this formulation would give the plaintiff an opportunity to show that copyright holders represented by the MPA wield market power that constrains the agency of other actors in the digital television market.

Finally, plaintiffs would have to prove that the anticompetitive effects of the tie are not outweighed by its efficiencies. Here again, the normative support for DRM and standards will be difficult to overcome. Defendants would point to the many firms that are active in the consumer electronics market and argue that competition is alive and well. They would also argue that the purpose of the tie is not to secure a market advantage (in terms of market share or the extraction of monopoly profits) in the realm of consumer electronics devices. Instead, they will argue that the standardization of DRM, in service of protecting their copyrights and enabling price discrimination, is actually a pro-competitive endeavor. The fact that OSVs are excluded would be cast as an unfortunate but unintended consequence of their legitimate development of DRM.

In order to combat these strong efficiency-based arguments, plaintiffs will have to focus on consumer harms and the quantification of the efficiencies lost by excluding OSVs. If plaintiffs could show that significant numbers of consumers want functionality that is forbidden by the DRM standard, and that those consumers are choosing to forego digital television devices altogether (i.e. becoming non-price deadweight loss), a court might be persuaded to ban the tie. Similarly, if the competitive benefits of OSV participation could be shown to be significant enough, even a post-Chicago School court might be convinced to sanction the tying relationship.

3. Standards and Intellectual Property Misuse

In the United States, the original, canonical rationale for granting intellectual property protection is to spur the production of creative work, whether scientific or artistic. The Constitution empowers Congress to pass laws to "promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."39 What happens when an intellectual property right is used to stifle creativity instead of promoting it? Though it may seem paradoxical at first, this happens quite often. When too many exclusive rights litter the field of competition, innovation is constrained by a phenomenon known variously as a "patent thicket"40 and the "tragedy of the anti-commons."41 Without the freedom to practice the fundamental technologies protected by the thicket, a would-be innovator may be chilled from entering the market altogether. Similar situations arise in creative pursuits like filmmaking, where entire shots may be discarded or reworked because a snippet of copyrighted music can be heard in the background. These examples of frustrated uses are the inevitable result of a system that grants exclusive rights over creative work.

Notwithstanding their exclusivity, courts have long recognized that an IP right can be wielded so unfairly that it undermines the very rationale for the existence of IP protection. We have already seen that antitrust law can suspend exclusivity in pursuit of competition, but the more general doctrine of intellectual property misuse ("misuse") provides similar assistance. Though it is sometimes confused with antitrust because it is often invoked against an alleged monopolist, misuse can be applied in non-competition contexts as well. As articulated in Lasercomb America Inc. v. Reynolds, the leading case on copyright misuse, "The question is not whether the copyright is being used in a manner violative of antitrust law (such as whether the licensing agreement is "reasonable"), but whether the copyright is being used in a manner violative of the public policy embodied in the grant of a copyright."42 Misuse is an extension of the equitable doctrine of "unclean hands," and it is meant to sanction conduct that is otherwise legal, but which causes harm to the public interest.43 The conduct must "draw strength" from a patent, and a successful patent misuse defense will render the impugned patent unenforceable so long as the objectionable conduct persists.

This doctrine is attractive for several reasons. First, it invites the kind of contextual, public interest-oriented analysis that has been lost in the post-Chicago School era of antitrust law. The ability to punish uses of IP that are contrary to the spirit of copyright/patent law or inconsistent with the First Amendment44 would be a powerful tool for public interest advocates. The doctrine offers standing benefits as well, because the party alleging misuse does not need to be injured by that misuse.45

These features recommend misuse as a remedy for the harms explained above, but problems with the doctrine remain. First, to date, misuse has only been accepted as an affirmative defense. This presents certain procedural barriers for civil litigants who want to use the doctrine to target DRM standardization. Second, while misuse is an accepted, common defense in the realm of patents,46 copyright misuse is more novel. Though several circuits have acknowledged its existence in recent years,47 the Supreme Court has not yet issued any opinions that recognize or define it in an authoritative way. Potential litigants would have to decide whether to file suit in a circuit where the doctrine had already been recognized and is therefore stronger, or in a circuit where it is unknown. The latter strategy would be useful if one wanted to provoke a "circuit split," or a disagreement between circuits of the U.S. federal court system. Such a split is one of the elements that raises the likelihood that a case will be heard before the Supreme Court.

Third, the IP in question merely becomes unenforceable for the duration of the abusive conduct. This limits the usefulness of misuse as a deterrent, since the IP holder's rights become enforceable as soon as the impugned activity ceases. Where antitrust law comes with statutorily defined remedies that have significant deterrent effect, misuse has a comparatively less painful bite. Finally, some of the circuit court rulings have been careful to disassociate the concepts of misuse and a common law compulsory license. For example, in Video Pipeline, Inc. v. Buena Vista Home Entertainment, Inc., the 3rd Circuit recognized the existence of the copyright misuse doctrine, but declined to apply it in a situation where it would have created a de facto compulsory license:

"Finally, copyright law, and the misuse doctrine in particular, should not be interpreted to require Disney, if it licenses its trailers for display on any web sites but its own, to do so willy-nilly regardless of the content displayed with its copyrighted works. Indeed such an application of the misuse doctrine would likely decrease the public's access to Disney's works because it might as a result refuse to license at all online display of its works."48

This case could be distinguished in several ways, but its reasoning would have to be addressed.

4. Competition Law Beyond the U.S.

While it is outside the scope of this paper to undertake a full comparative analysis of global competition law, it is worth noting that other jurisdictions have been grappling with similar issues. The approach of those jurisdictions can be instructive, and several recent international developments build on the public interest-oriented counter-narrative to the economic turn in antitrust law.

For example, European regulators have been much more open to an essential facilities approach, known on the Continent as "abuse of dominant market power." Article 82 of the Treaty Establishing the European Community bars the abuse of dominant market power through a range of activities, including "limiting production, markets or technical development to the prejudice of consumers."49 In a recent case, the European Court of Justice invoked this section when forcing three television stations to license their programming data to an independent firm that intended to publish a unified program guide.50 The stations had refused to furnish this information, thereby preventing the creation of a new product—what amounted to a TV guide—that would be valuable to the public. In finding an abuse of dominant position, the Court found that:

"(1) the Commission had proven indispensability, given that the broadcasters were "the only sources of the basic information on programme scheduling which is the indispensable raw material for compiling a weekly television guide";
(2) the broadcasters' refusal to deal had "prevented the appearance of a new product, a comprehensive weekly guide to television programmes, which the appellants did not offer and for which there was a potential consumer demand";
(3) "there was no justification for such refusal either in the activity of television broadcasting or in that of publishing television magazines"; and
(4) the broadcasters had "reserved to themselves the secondary market of weekly television guides by excluding all competition on that market . . . since they denied access to the basic information which is the raw material indispensable for the compilation of such a guide."51

This reasoning is compellingly pro-consumer and pro-innovator. The third prong of this four-part analysis would be subject to some of the same problems that plague the U.S. approach to a concerted refusal to deal (i.e. the normative acceptance of DRM), but overall this European doctrine appears ripe for exploration.

Canadian competition law includes the concept of "abuse of dominance," which appears at first to be similar to its European cousin.52 This doctrine employs a three-step test that asks whether a firm wields market power, whether it is engaged in anticompetitive behaviour, and whether that behaviour substantially lessens competition.53 The definition of "anticompetitive behaviour" is not provided in the statute, though a non-exhaustive list provides useful guidance.54 For example, the statute defines the "adoption of product specifications that are incompatible with products produced by any other person and are designed to prevent his entry into, or to eliminate him from, a market" as anticompetitive behaviour. This doctrine has never been applied to a standards body working on specifications for DRM, but it does appear germane to that activity.

Despite the promising appearance of this doctrine, Canadian competition law exhibits features that frustrate its utility in the context of DRM and standards. For instance, only the Competition Commissioner—the appointed head of the Competition Bureau—can initiate proceedings under the abuse of dominance provisions of the Competition Act. The lack of a private right of action may explain why only five cases of alleged abuse of dominance have ever been litigated in Canada. Of those cases, the Commissioner has pursued only firms that control over 80% of the market.55 And even where a firm is found to have abused its dominant position, only injunctive remedies are available to discipline its conduct. The resulting structure is doubly unsatisfactory when compared to American antitrust law: fewer parties can raise antitrust concerns, and even successful claims may lack the kind of monetary damages that would deter future conduct.

Despite these practical barriers, Canada's competition law may be moving in the right direction. In 2003, a government report outlined public interest-oriented reforms to the law that included the addition of private rights of action, increased damage awards to injured parties, and the movement of certain offences into the per se category of illegality.56 The recommendations of the report were implemented in legislation during the following year, but that bill died when the Liberal government fell in early 2006.57 Separate legislation created a limited private right of action for tying claims58 and refusals to deal59 in 2002. In other words, it appears that Canada is alive to some of the reforms that would enable public interest-oriented uses of its competition system.

The appeal of competition doctrines, notwithstanding the challenges inherent in reconfiguring their contours, is that they embody a more generalizable approach to ordering our information environment. Recognizing that information goods are essential inputs in many processes—not the least of which is the process of subject formation and the construction of informed electorates—would be a critical development with wide applicability. The same could be said for remembering the origins of why we protect information goods: requiring that those state-granted rights be used in accordance with the spirit of their granting would, in many situations, be more socially valuable than their slavish observance.

1 The idea that some competition is actually harmful to consumers is one of the central tenets of Chicago School antitrust analysis and the "paradox" in Robert H. Bork, The Antitrust Paradox (New York: Basic Books, 1978).
2 A note on jurisdiction: The political and technical developments that I cover are distinct to particular contexts (network neutrality in the United States and, more recently, Canada; DVB in Europe; etc.). However, the Internet and the trends that shape it are global. With apologies to the international and comparative bar, I have endeavoured to identify the geographic scope of the issues I describe, but I have not attempted to reconcile those issues from the standpoint of comparative legal analysis. In other words, while I know that the Sherman Act does not apply to DVB's European activities, I believe that such a thought experiment is useful nonetheless.
3 See, e.g. Adam Thierer, What's Yours is Mine: Open Access and the Rise of Infrastructure Socialism (Washington, D.C., 2003).
4 Ibid.
5 The Sherman Antitrust Act, 15 U.S.C. § 1 (2007).
6 See, e.g. Herbert Hovenkamp, "Antitrust Policy after Chicago," Michigan Law Review 84, no. 2 (November 1985): 213-284; and William H. Page, "The Chicago School and the Evolution of Antitrust: Characterization, Antitrust Injury, and Evidentiary Sufficiency," Virginia Law Review 75, no. 7 (October 1989): 1221-1308.
7 Hovenkamp 1985, 213-284.
8 See, e.g. Daniel A. Crane, "Technocracy and Antitrust," Texas Law Review 86, no. 6 (May 2008): 1159-1222 (examining the tension between antitrust law's populist roots and its present "technocratic" state, concluding that the transition to technocracy should be encouraged) and Eben Moglen, "Antitrust and American Democracy: Where We've Been and Why it Matters," The Nation, November 30, 1998 ("In its original setting, antitrust agitation was a form of conservative populism, seeking government intervention to maintain the traditional level of concentration of private economic power, in the interest of free economic and political competition.").
9 David L. Meyer, "How to Address 'Hold Up' in Standard Setting Without Deterring Innovation: Harness Innovation by SDOs" (presented at the ABA Section of Antitrust Spring Meeting, Washington, D.C., March 26, 2008), http://www.usdoj.gov/atr/public/speeches/234124.htm
10 See, e.g. In re Dell Computer Corp., 121 F.T.C. 616, FTC LEXIS 291 (May 20, 1996) (Dell Computer, one of the world's largest computer manufacturers, made written representations to a standard setting organization that it did not hold any intellectual property related to the standard. Meanwhile, it had applied for patent protection on part of the standard, and subsequently sought to enforce that patent after the standard had been finished.).
11 The notable exception is the Robinson-Patman Act, which condemns the sale of commodities of like grade and quality at different rates when such activity is likely to substantially reduce competition.
12 Notwithstanding recent debate over the wisdom of legal protections for DRM in Canada and elsewhere, summarized in Section III, those approaches are the exception.
13 James Anton and Dennis Yao, "Standard-Setting Consortia, Antitrust, and High-Technology Industries," Antitrust Law Journal 64 (Fall 1995): 247 at 249.
14 Mario Calderini and Andrea Giannaccari, "Standardisation in the ICT sector: The (complex) interface between antitrust and intellectual property," Economics of Innovation and New Technology 15.6 (2006): 543 at 544.
15 Hovenkamp 1985, 213.
16 Robert Pitofsky, "Challenges of the New Economy: Issues at the Intersection of Antitrust and Intellectual Property" (presented at the American Antitrust Institute: An Agenda for Antitrust in the 21st Century, Washington, D.C., June 15, 2000), http://www.ftc.gov/speeches/pitofsky/000615speech.shtm (criticizing the Federal Circuit's ruling in Independent Service Organizations Antitrust Litigation (Xerox), 203 F.3d 1322 (Fed. Cir. 2000), which used language so sweeping that virtually any refusal to license intellectual property, even for anticompetitive ends, would withstand antitrust scrutiny.).
17 For example, American Express recently obtained $2.1 billion from Visa and $1.8 billion from MasterCard in separate settlements over antitrust claims. Of course, the damage multiplier on antitrust awards is a useful deterrent only if the actual damages are significant. Small competitors and new market entrants may not be able to show significant damages, thereby limiting the effectiveness of this remedy. Eric Dash, "MasterCard Pays $1.8 Billion to American Express," The New York Times, June 26, 2008, sec. Business, http://www.nytimes.com/2008/06/26/business/26credit.html?ref=business
18 "Music groups settle on CD price-fixing," BBC News, September 30, 2002 (accessed July 27, 2008), http://news.bbc.co.uk/2/hi/business/2289224.stm (In 2002, the world's largest record companies settled a price-fixing suit by paying over $143 million, though they admitted no wrongdoing.).
19 Jefferson Parish Hospital Dist. No. 2 v. Hyde, 466 U.S. 2 (1984).

20 One might argue that it would be more appropriate to return to the "old" Wal-Mart Effect outlined by Charles Fishman, but I prefer the New Governance connotations of a play on the "New Wal-Mart Effect." For information on the original, see Charles Fishman, The Wal-Mart Effect: How the World's Most Powerful Company Really Works—and How It's Transforming the American Economy (New York: Penguin Press, 2006).
21 Michael Vandenbergh, "The New Wal-Mart Effect: The Role of Private Contracting in Global Governance," 54 UCLA L. Rev. 913 (2007).
22 Ibid. at 916.
23 Daniel Rubinfeld and Robert Maness, "The Strategic Use of Patents: Implications for Antitrust," in Francois Leveque and Howard Shelanski, eds., Antitrust, Patents and Copyright: EU and US Perspectives (Northampton: Edward Elgar Publishing, 2005), 84.
24 A concerted refusal to deal in service of retail price maintenance, which is per se illegal, may still be reviewed under a per se analysis. See, e.g. FTC v. Superior Court Trial Lawyers Association, 493 U.S. 411 (1990) (where an association of trial lawyers refused to take clients from a city's indigent criminal defense program unless the city agreed to pay an increased hourly rate.).
25 In re Independent Service Organizations Antitrust Litigation, 203 F.3d 1322 (Fed. Cir. 2000).
26 Hovenkamp 2000.
27 DOJ Antitrust & IP Enforcement Guidelines, 6 ("Antitrust liability for mere unilateral, unconditional refusals to license patents will not play a meaningful part in the interface between patent rights and antitrust protections.").
28 Though the essential facilities doctrine is not mentioned explicitly in this decision, the contours of the doctrine's reasoning are traced to United States v. Terminal Railroad Association, 224 U.S. 383 (1912) (holding that the association controlling the only railroad termination point in St. Louis, which was owned by railroad companies, had to open its facilities to competing railroad companies that were not members of the association.).
29 See generally Thomas F. Cotter, "The Essential Facilities Doctrine," SSRN eLibrary (accessed July 2, 2008).
30 See, e.g. ibid.; and Spencer W. Waller and Brett M. Frischmann, "Revitalizing Essential Facilities," Antitrust Law Journal 75, no. 1 (2008) (accessed July 2, 2008).
31 DOJ Guidelines, section 4.1.1.
32 Cotter, 5-6.
33 See generally Cotter and Waller & Frischmann.
34 Siegel v. Chicken Delight, Inc., 448 F.2d 43 (9th Cir. 1971) (the oft-cited 9th Circuit articulation of the tying test: "First, ... the scheme in question involved two distinct items and provides that one (the tying product) may not be obtained unless the other (the tied product) is also purchased. Second, ... the tying product possesses sufficient economic power to appreciably restrain competition in the tied product market. Third, ... a 'not insubstantial' amount of commerce is affected by the arrangement.").
35 15 U.S.C. § 14 ("It shall be unlawful for any person [...] to lease or make a sale or contract for sale of goods [...] on the condition, agreement, or understanding that the lessee or purchaser thereof shall not use or deal in the goods [...] of a competitor [...] where the effect of such [agreement] may be to substantially lessen competition or tend to create a monopoly in any line of commerce.").
36 Broadcast Music, Inc. v. CBS, 441 U.S. 1 (1979).
37 Commerce in the amount of $60,800 has been found to meet this threshold. Richard Steuer, "Executive Summary of the Antitrust Laws," FindLaw, http://library.findlaw.com/1999/Jan/1/241454.html
38 Illinois Tool Works, Inc. v. Independent Ink, Inc., 126 S. Ct. 1281 (2006), 1284 (in a unanimous ruling, the Supreme Court rejected the presumption of market power where IP is held); Antitrust-IP Guidelines, Sec. 2.2 ("the Agencies have stated that, when analyzing agreements to license, they do not presume that a patent owner has market power").
39 U.S. Constitution, Article I, Section 8.
40 See, e.g. James E. Bessen, "Patent Thickets: Strategic Patenting of Complex Technologies," SSRN eLibrary (2003), accessed 4 Jul 2008.
41 Michael A. Heller, "The Tragedy of the Anticommons: Property in the Transition from Marx to Markets," SSRN eLibrary, accessed 4 Jul 2008.
42 911 F.2d 970, 15 USPQ2d 1846 (4th Cir. 1990).
43 Though not the first case in which patent misuse was alleged, the modern doctrine can be traced to Morton Salt Co. v. G.S. Suppiger, 314 U.S. 488, 492 (1942) ("It is a principle of general application that courts, and especially courts of equity, may appropriately withhold their aid where the plaintiff is using the right asserted contrary to the public interest."). The defense now enjoys statutory definition in the United States as well: 35 U.S.C. Sec. 271(d)(5) ("No patent owner otherwise entitled to relief for infringement or contributory infringement of a patent shall be denied relief or deemed guilty of misuse or illegal extension of the patent right by reason of his having [...] conditioned the license of any rights to the patent [...] on the acquisition of a license to rights in another patent or purchase of a separate product, unless, in view of the circumstances, the patent owner has market power in the relevant market for the patent or patented product on which the license or sale is conditioned." Emphasis added.).
44 Video Pipeline, Inc. v. Buena Vista Home Entertainment, Inc., 342 F.3d 191 at 205-206 (3rd Cir. 2003) ("A copyright holder's attempt to restrict expression that is critical of it (or of its copyrighted good, or the industry in which it operates, etc.) may, in context, subvert—as do anti-competitive restrictions—a copyright's policy goal to encourage the creation and dissemination to the public of creative activity.").
45 Morton Salt, 314 U.S. at 494 (1942) ("It is the adverse effect upon the public interest of a successful infringement suit in conjunction with the patentee's course of conduct which disqualifies him to maintain the suit, regardless of whether the particular defendant has suffered from the misuse of the patent.").
46 Kathryn Judge, "Rethinking Copyright Misuse," Stanford Law Review 57, no. 3 (December 1, 2004): 901-952, at 908-910.
47 Lasercomb America, Inc. v. Reynolds, 911 F.2d 970 (4th Cir. 1990); Alcatel U.S.A., Inc. v. DGI Technologies, Inc., 166 F.3d 772 (5th Cir. 1999); Practice Management Information Corp. v. American Medical Ass'n, 121 F.3d 516 (9th Cir. 1997); Assessment Technologies of Wisconsin, LLP v. WIREdata, LLP, 350 F.3d 640 (7th Cir. 2003).
48 Video Pipeline, 206.

49 Treaty Establishing the European Community (Nice Consolidated Version), art. 82(b), available at http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:12002E082:EN:NOT.
50 RTE and ITP v. Commission (Magill II), Case C-241/91, [1995] ECR I-743, ¶¶ 7-9.
51 Cotter, 7-8.
52 Competition Act, R.S., 1985, c. C-34, s. 79 [Competition Act].
53 Competition Act, s. 79(1)(a-c).
54 Competition Act, s. 78.
55 It should be noted, however, that the Competition Bureau has indicated its belief that firms controlling less than 35% of the market do not wield market power as contemplated by s. 79(1)(a). With no judicial guidance on firms that control 36%-79% of the market, it is unclear whether a successful claim could be brought against firms in that range. See, e.g. Sheridan Scott, "Abuse of Dominance Under the Competition Act" (presented at the Federal Trade Commission/Department of Justice Hearings on Single-Firm Conduct, Washington, D.C., September 12, 2006).
56 Options for Amending the Competition Act: Fostering a Competitive Marketplace, Discussion Paper (Ottawa: Competition Bureau, June 2003), http://www.competitionbureau.gc.ca/epic/site/cb-bc.nsf/en/01711e.html.
57 Herbert Hovenkamp et al., IP and Antitrust: An Analysis of Antitrust Principles Applied to Intellectual Property Law (Aspen Publishers, 2002), §46.4, note 23.
58 Competition Act, s. 75.
59 Competition Act, s. 77.

VI. Conclusion

"Like a force of nature, the digital age cannot be denied or stopped."

-Nicholas Negroponte1

Myths about technology exist because of our desire to improve, to find solutions to the problems that bedevil us. And if we can build systems that make the right decisions—about how we get information about the world, what we can do with it, and to whom we can tell our stories—we might mitigate some small portion of the risk that comes with agency. Every decision carries the possibility of error, so why not find answers that work and build them into the best tools we can devise? The problem is that we have yet to fashion a technology that pushes us toward our goals without human intervention, and the advent of the Internet is no exception. Instead of charging blindly towards the goal of openness, it appears that we must take our bearings again. Success in charting our digital future—in determining whether we will be empowered or constrained by the technologies that define our age—depends on a sober examination of where we are today. The privatization of infrastructure and the privatization of governance are sprawling issues with deep roots in our present intellectual moment, and understanding how their history colours our choices about technology policy is both vital and difficult. Notwithstanding the hyperbole that heralded its birth, however, the Internet's democratic promise is valuable enough that such a project is necessary. By engaging with the discourse of openness, we can make some of its assumptions available for debate.2

One aspect of the Internet's logic that remains convincing is that the empowerment of individuals is a compelling method of bringing accountability to systems of control. The puzzle is ensuring that tools of empowerment evolve to deal with new forms of control. This observation is important because the mythology of the Internet—as a powerful medium that can improve our lifeworlds in ways unimaginable just 20 years ago—is not the same thing as the rhetoric of openness. Somewhere in our rush to understand the new network, we thought we figured out what made it so powerful, and we began to conflate certain characteristics of a network with the aspirations of its users.

Again, however, this paper is not an argument for the facile turn to "closed" and "controlled." It is an investigation of the kinds of control that are authorized, enabled, and implemented in open environments. Focusing exclusively on issues of openness like net neutrality is a strategic distraction from emerging, under-appreciated control mechanisms like DRM. Some may claim that this juxtaposition is artificial, and that the proponents of net neutrality and the opponents of DRM are not in conflict. Indeed, in many cases, they are the same individuals and organizations.3 Why try to pit one against the other? It is true that the casual observer might not think to choose between these two positions, but they are in competition nonetheless.

First, there is the pedestrian but practical concern about the finite resources—human, financial, temporal—tied up in any political struggle. If both camps are headed in the same direction, they must still fight over fuel for the journey. Political capital, money from individual and institutional donors, public attention, and other resources are finite.

Moreover, the resources mustered by the neutrality movement have arguably been deployed in a policy monoculture. By and large, neutrality advocates are not engaged in a critical analysis of DRM. For example, the groups that animate the network neutrality movement in the U.S.—i.e. Free Press and the coalition that it coordinates, SavetheInternet.org—have no position on DRM at all. Of the charter members of the coalition that are also organizations, 82% have no policy statements, original content, or substantive discussion of DRM on their websites.4 Some of those organizations even support the use of DRM to control the use of cultural goods.5 But for many of the groups in the SavetheInternet.org coalition, neutrality is not just the first Internet policy issue with which they have engaged; it is the only one.

Second, insistence on neutrality may hamper attempts to build systems that frustrate DRM. While I do not advocate a reflexive departure from open networks, it is still necessary to consider what open networks cannot do. One such shortcoming is the inability of truly open networks to employ deep strategies of resistance to DRM. For example, logic-layer alterations to the Internet that would block content wrapped in DRM, instead of unsolicited email, could be an effective technical strategy for combating the spread of distributed forms of information control. If the kinds of tactics derided by network neutrality advocates were employed against restrictive overlay networks, the resulting environment could be more "open" for individuals even as the network becomes less so.
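
To make that strategy concrete, the following minimal sketch (in Python, my own illustration rather than anything proposed in the literature) shows one shape such a logic-layer DRM filter could take. It assumes, purely for the sake of example, that DRM-wrapped payloads can be recognized by byte signatures in their container formats; the signature list and function names below are hypothetical simplifications, and real detection would require format-aware parsing.

    from typing import Optional

    # Hypothetical byte signatures suggestive of DRM-wrapped containers.
    # Illustrative stand-ins only, not a reliable detection scheme.
    DRM_SIGNATURES = [
        b"WRMHEADER",  # marker associated with Windows Media DRM metadata (assumed)
        b"drms",       # atom name associated with FairPlay-protected MP4 audio (assumed)
    ]

    def looks_drm_wrapped(payload: bytes) -> bool:
        """Return True if the payload contains any known DRM signature."""
        return any(signature in payload for signature in DRM_SIGNATURES)

    def forward(payload: bytes) -> Optional[bytes]:
        """Pass ordinary payloads through; drop DRM-wrapped ones.

        This is the logic-layer discrimination described above: a network
        node treats DRM-wrapped content the way a spam filter treats
        unsolicited email.
        """
        if looks_drm_wrapped(payload):
            return None  # blocked inside the network
        return payload

    # A DRM-wrapped payload is dropped; plain media passes unchanged.
    assert forward(b"plain media bytes") == b"plain media bytes"
    assert forward(b"...WRMHEADER...") is None

The point of the sketch is not the detection heuristic, which is crude, but the architecture: the discrimination happens inside the network rather than at its ends, which is precisely the kind of intervention a strict neutrality rule would forbid.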

Finally, it is irresponsible to support the rhetoric of openness without also considering where freedom fails to safeguard equality. A level playing field is at best an initial state and at worst an illusion, for it fails to account for the unequal capabilities of individuals to take advantage of its resources. Over time, open markets produce winners and losers, tycoons and paupers. When resources are available to everyone, the most powerful actors will outperform weaker ones in exploiting them. The same is true of information markets.6

The principle of openness was an effective means for disrupting the centralized and observable exercise of power, but distributed networks of control are not so vulnerable. New tools are needed, which is why competition law, especially in its American articulation, is so promising. Developing a body of law that allows a private right of action, is rooted in populist history, and includes remedies that are equal to the structural nature of the problems being investigated could provide a serious corrective influence to the current direction of global information policy. There are two levels on which this work can be initiated. First, one could work within the modern antitrust discourse. If economic analyses are necessary, then one could seek to better articulate public interest concerns about information policy as economic data. If questions about the economic value of open source competition and balanced copyright law are not ready at hand, their investigation can be encouraged.

The second level of intervention is concerned with changing the economic frame more directly, and scholars are already taking up this work.7 No matter how adept one becomes at speaking the language of the Chicago School, there are some concepts that are likely to have no direct translation. And just as the discourse of promoting international development is unhappily forced into the rhetoric of maximizing trade,8 the discourse of public interest antitrust reform may be unacceptably bounded by the language of economics. This kind of normative shift will require sustained scholarly, popular, and political inquiry into the evolving nature of technology, DRM, standards, and competition.

1 Nicholas Negroponte, Being Digital, 1st ed. (Vintage, 1996), 229.
2 Barbara Warnick, Critical Literacy in a Digital Era: Technology, Rhetoric, and the Public Interest, 1st ed. (Lawrence Erlbaum, 2001), 6 ("Critical analysis of the discursive strategies used by protechnology advocates can make them available for public discussion and debate.").
3 Groups like Public Knowledge, for example, have taken policy positions against legal protections for DRM and for government mandates for network neutrality rules.
4 SavetheInternet.org's website lists 78 charter members that are also organizations. After visiting each organization's website and searching for "DRM" and "digital rights management," it appears that only 14 have substantive resources (as opposed to pointers to other sites) on digital rights management; the remaining 64 of 78, roughly 82%, do not.
5 TK note on Consumers Union and PK testimony before Congress.
6 Chander and Sunder make this argument most elegantly in Anupam Chander and Madhavi Sunder, "The Romance of the Public Domain," California Law Review 92 (2004): 1331.
7 See, e.g. Niva Elkin-Koren and Eli M. Salzberger, Law, Economics and Cyberspace: The Effects of Cyberspace on the Economic Analysis of Law (Edward Elgar Pub, 2004) (arguing that the law and economics paradigm is incoherent and inadequate in a range of technologically advanced environments).

8 See, e.g. Dani Rodrik, The Global Governance of Trade as if Development Really Mattered, Background Paper (New York: United Nations Development Programme, July 2001).
