
Protecting Privacy, the Social Contract, and the Rule of Law in the Virtual World

Matthew Sundquist∗

Introduction and Framework Explanation

I. Valuing Privacy and Determining When to Respond

II. Compounding Privacy Problems: Rational Choice Theory and Technological Growth

III. Legal and Judicial Privacy Guidance

A. Precedents
B. Statutory Guidance
C. Analysis
D. Looking Ahead

IV. Case Study of FTC Enforcement

A. Solution: Enhanced Enforcement
B. Coalition Solutions
C. Lessons from the Collaboration against SOPA
D. Conclusion

A host of laws and regulations engage with the legal,1 technological,2 and social3 meanings4 of privacy. In a country of more than 300 million people and plentiful law enforcement officers, there is

∗ Matthew Sundquist is a graduate of Harvard College. He is the Privacy Manager at Inflection, and a Student Fellow of the Harvard Law School Program on the Legal Profession. This paper represents the opinion of the author. It is not meant to represent the position or opinions of Inflection. For their thoughtful advice and suggestions, I am grateful to Allison Anoll, Beth Givens, Bob Gellman, Bruce Peabody, Christopher Wolf, Erik Jones, Jacqueline Vanacek, James Grimmelmann, and Orin Kerr. For their support in writing this paper and friendship, I am grateful to Brian and Matthew Monahan.

1 The term legal has different definitions in relation to privacy.

• For a discussion of privacy as a principle in law, see Julie M. Carpenter & Robin Meriweather, Brief Amicus Curiae of the DKT Liberty Project in support of Petitioner, Kyllo v. United States, 533 U.S. 27 (2000); Ken Gormley, One Hundred Years of Privacy, 1992 Wisconsin Law Review 1335; Samuel D. Warren and Louis D. Brandeis, The Right to Privacy, 4 Harvard L. Rev. 193 (1890) (describing privacy as the “right to be let alone.”).

• Privacy is also covered by statutory law. See e.g., HIPAA (Health Insurance Portability and Accountability Act); Sarbanes–Oxley Act; Gramm–Leach–Bliley Act; Payment Card Industry Data Security Standard (PCI DSS); Fair and Accurate Credit Transaction Act (FACTA), including the Red Flags Rule; Electronic Fund Transfer Act, Regulation E (EFTA); Stored Communications Act; Federal disclosure laws; e–discovery provisions in the Federal Rules of Civil Procedure; Free and Secure Trade Program (FAST); Fair Credit Reporting Act (FCRA); Right to Financial Privacy Act; Family Educational Rights and Privacy Act; Customer Proprietary Network Information rules; Children’s Online Privacy Protection Act; Cable Act; Video Privacy Protection Act; Driver’s Privacy Protection Act; Title 39 of the Code of Federal Regulations, which deals with the USPS; Section 5 of the Federal Trade Commission Act. This does not include state laws.

likely to be abusive behavior. As a result, we are awash in claims about the definition, function, and value of privacy; awash in potential threats to privacy; and, increasingly, awash in debates about how to fashion remedies to address these issues. It is impossible to survey the entire landscape of privacy dilemmas or threats, or to extract a representative sample of privacy policies or dilemmas. This paper does not attempt to provide a systematic theoretical account of privacy and technology, nor does it outline a typology of circumstances in which privacy might be threatened or abused by private or public entities. Instead, my paper advances a general framework for determining when a legal or social response to a privacy threat is appropriate. The emergent areas I survey demonstrate the utility and application of my approach. This paper is in four sections. I begin by introducing a framework for prioritizing when a virtual or online practice, law, or regulatory deficiency warrants a legal or social response. First, a practice that violates the law should be prosecuted. Second, where privacy laws are ineffectually enforced, we should be on heightened alert. Third, we need a response when a practice violates valued social expectations of how information should flow.5 The first two conditions are important because updating and enforcing our laws in light of technological change is crucial to the maintenance of the social contract. The third condition is important because many of our expectations about information and privacy developed when tracking at the scale now practiced by governments and businesses was impossible. Today, services like Google and Facebook have hundreds of millions of users. Around 1,271 government organizations and 1,931 private

• The Supreme Court has repeatedly set precedents on privacy. See e.g., Ex parte Jackson, 96 U.S. 727 (1878); Meyer v. Nebraska, 262 U.S. 390 (1923); Olmstead v. United States, 277 U.S. 438 (1928); Griswold v. Connecticut, 381 U.S. 479 (1965); Katz v. United States, 389 U.S. 347 (1967); Stanley v. Georgia, 394 U.S. 557 (1969); Kelley v. Johnson, 425 U.S. 238 (1976); Moore v. City of East Cleveland, 431 U.S. 494 (1977); Lawrence v. Texas, 539 U.S. 558 (2003). See generally Laura K. Donohue, Anglo-American Privacy and Surveillance, 96 J. Crim. L. & Criminology 1059 (2006); William Knox, Some Thoughts on the Scope of the Fourth Amendment and Standing to Challenge Searches and Seizures, 40 Mo. L. Rev. 1 (1975).

2 Technology has created intriguing privacy problems. See e.g., Horst Feistel, Cryptography and Computer Privacy, 228 SCIENT. AM. 15 (1973) (exploring enciphering and origin authentication as a means of protecting systems and banks); Rakesh Agrawal & Ramakrishnan Srikant, Privacy-Preserving Data Mining, Proceedings of the 2000 ACM SIGMOD International Conference on Management of Data (attempting to “develop accurate models without access to precise information in individual data records”); Latanya Sweeney, k-Anonymity: A Model for Protecting Privacy, 10 Intern. J. on Uncertainty, Fuzziness and Knowledge-based Systems 557 (2002) (explaining means for anonymizing data to protect privacy). 3 Scholars have often advocated balanced frameworks for interpreting and protecting privacy. See Helen Nissenbaum, A Contextual Approach to Privacy Online, 140 Daedalus 32, 33 (2011) (arguing that our experiences and social surroundings formulate our expectations for information use and privacy); JOHN PALFREY & URS GASSER, BORN DIGITAL 7 (2008) (arguing that the younger generation of technology users, due to their frequent and early adoption of technology, has different conceptions of privacy than their parents or grandparents); see generally ALAN WESTIN, PRIVACY AND FREEDOM (1967) (arguing that privacy comprises intimacy, anonymity, solitude, and reserve); JUDITH DECEW, IN PURSUIT OF PRIVACY: LAW, ETHICS, AND THE RISE OF TECHNOLOGY (1997) (arguing that privacy entails informational privacy, accessibility privacy, and expressive privacy); Jerry Kang, Information Privacy in Cyberspace Transactions, 50 Stan. L. Rev. 1193 (1998) (arguing for privacy in our physical space, choice, and flows of information). 4 I examine privacy of the personal information we create directly by communicating and completing forms, contracts, and documents as well as the information we create indirectly by using browsers, carrying phones with geo-tracking, and purchasing or using products and services.
I focus on the assurances we receive about this information and whether they are complied with. See generally Office of Management and Budget Memorandum, Safeguarding Against and Responding to the Breach of Personally Identifiable Information (May 2007) (defining personally identifiable information and when and how to protect the information). 5 See generally Nissenbaum, supra note.

companies work on counterterrorism, homeland security and intelligence in the U.S.6 Information often flows based on what is technologically possible rather than based on norms or laws.7 In the second section, I examine how technology has allowed more information about people to be gathered and stored online.8 The 92% of online adults who use search engines, the 92% who use email, and the roughly 60% who do so on a typical day leave a large digital trail.9 Businesses and government can quickly create and begin to rely on a new practice, leading to claims that they could not function without it.10 Technology has, as Amazon founder Jeff Bezos explained, begun “eliminating all the gatekeepers” for companies and technical practices.11 Technology has enabled the U.S. government to discover and learn more about Americans and citizens abroad.12 The NSA can intercept and download electronic communications equivalent to the contents of the Library of Congress every six hours.13 Government analysts annually produce 50,000 intelligence reports.14 While economic theory suggests a rational capacity to process the stream of privacy threats and trade-offs we face, people simply cannot process everything about privacy or make fully rational privacy decisions.15 As a result of all these factors, regulatory inaction or a lack of regulations allows more activity—and potentially more invasions of our privacy—to happen faster and at a larger scale. Even as courts and Congress have addressed some questions involving the relationship between evolving technology and privacy, including Constitutional questions, they have avoided others. The

6 See Dana Priest & William Arkin, A hidden world, growing beyond control, WASH. POST, available at projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-growing-beyond-control/. 7 Ibid, at 34. 8 See generally Sweeney, supra note, at 557 (“Society is experiencing exponential growth in the number and variety of data collections containing person-specific information as computer technology, network connectivity and disk storage space become increasingly affordable.”). 9 Kristen Purcell, Search and email still top the list of most popular online activities, Pew Research Center, August 9, 2011. Available at pewinternet.org/Reports/2011/Search-and-email.aspx. 10 See generally World Privacy Forum, Comments of the World Privacy Forum regarding the Federal Trade Commission Preliminary Staff Report, Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Business and Policymakers (Jan. 18, 2011), at 6. Available at www.ftc.gov/os/comments/privacyreportframework/00369-57987.pdf (discussing the FTC’s recent report and narrow focus on tracking and contending that “The Commission needs to focus on the broader picture here and to try to get ahead of developments before they become so embedded in business practices that any limit will be fought as the end of the world as we know it, a cry heard too often on the .”). 11 Thomas Friedman, Do You Want the Good News First?, NYTimes (May 19, 2012), available at www.nytimes.com/2012/05/20/opinion/sunday/friedman-do-you-want-the-good-news-first.html. 12 See Richard A. Posner, The End is Near: Review of Margaret Atwood, Oryx and Crake, The New Republic, at 32 (Sept. 22, 2003) (arguing that the “rate of scientific and technological progress in this new brave new world is accelerating.”); David Kushner, When Man and Machine Converge, Rolling Stone, 57 (Feb.
19, 2009) (discussing Ray Kurzweil's prediction that in our lifetimes “machines will not only surpass humans in intelligence – they will irrevocably alter what it means to be a human.”). 13 See Jane Mayer, The Secret Sharer, New Yorker (May 23, 2011), available at www.newyorker.com/reporting/2011/05/23/110523fa_fact_mayer?currentPage=all (“Even in an age in which computerized feats are commonplace, the N.S.A.’s capabilities are breathtaking…Three times the size of the C.I.A., and with a third of the U.S.’s entire intelligence budget, the N.S.A. has a five-thousand-acre campus at Fort Meade protected by iris scanners and facial-recognition devices. The electric bill there is said to surpass seventy million dollars a year.”). See also Steve Coll, The Rewards (And Risks) of Cyber War, New Yorker (Jun. 7, 2012), available at http://www.newyorker.com/online/blogs/comment/2012/06/the-rewards-and-risks-of-cyberwar.html (describing the digital wing of the “permanent national-security state.” A new Air Force unit is “the first-ever unit designated for the sole purpose of cyberspace operations,” staffed by “Over 5,400 men and women conduct or support 24-hour operations … including 3,339 military, 2,975 civilian, and 1,364 contractor personnel.”). 14 Dana Priest and William Arkin, A hidden world, growing beyond control, WashingtonPost.com. Available at http://projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-growing-beyond-control/. 15 See generally Alessandro Acquisti and Jens Grossklags, Privacy and Rationality in Individual Decision Making, 3 IEEE Security & Privacy 24 (2005); see infra part II.


Supreme Court declined to address whether the police can electronically track citizens,16 and declined to examine whether texting on two-way pagers is private.17 Congress has not updated key privacy legislation,18 and has not responded when the government invokes what two Senators called “secret interpretations” of the Patriot Act.19 Regulatory agencies have sought trivial concessions and non-financial settlements with companies charged with breaking the law.20 Meanwhile, leaders struggle to grasp technology, and election-focused politicians prefer solving problems to preventing them, as solving yields greater credit.21 I argue we need better laws and precedents and better enforcement, and point out specific areas for change.22 I also argue that courts should weigh our social expectations when determining what constitutes a reasonable expectation of privacy.23 As a concluding case study, I examine the recent Federal Trade Commission (FTC) settlements with Google and Facebook. Both companies broke laws and violated our social expectations, but settled for undersized financial penalties or none at all, making only trivial concessions to their customers and the FTC. And while the companies are now repeat offenders, the FTC has rarely responded.24 The actions of these companies and the ensuing lack of enforcement implicate all three prongs of when we need a response: bad laws, broken social expectations, and deficient enforcement. I argue for better laws, if and when necessary; better enforcement; and a change in the professional culture and values of the FTC. In conclusion, I draw lessons from the successful opposition to the Stop Online Piracy Act (SOPA) and emphasize the importance of privacy education.

I. Valuing Privacy and Determining When to Respond

Justice Brandeis called privacy “the most comprehensive of rights and the right most valued by civilized men.”25 But why is privacy so valuable and important?26 Presumably it has a political value in deterring government overreach into our lives. Privacy also seems necessary to ensure democratic participation.27 However, it is difficult to categorize privacy as a value,28 or to measure its risks or worth.29 We value some things as instrumental goods, a means to an end, like money. We value other things as intrinsic moral goods and virtues, like justice.30 Privacy is not clearly intrinsic or instrumental. Charles Reid noted this complication:

16 U.S. v. Jones, 565 U. S. ____ (2012). 17 City of Ontario v. Quon, 130 S.Ct. 2619, 2629 (2010). 18 Infra, Part III. C. 19 Ron Wyden & Mark Udall, Letter to United States Department of Justice (Mar. 15, 2012), https://www.documentcloud.org/documents/325953-85512347-senators-ron-wyden-mark-udall-letter-to.html. 20 Infra Part IV. B. 21 Infra, Part III. C. 22 Infra, Part III. A. 23 Infra, III. A. 24 Infra, Part IV. 25 Olmstead v. United States, 277 U.S. 438, 478 (1928) (Brandeis, J., dissenting). 26 I briefly examine this question; others have given the subject a thorough treatment. See generally Ferdinand D. Schoeman, editor, PHILOSOPHICAL DIMENSIONS OF PRIVACY (1984); J. Roland Pennock and John W. Chapman (eds.), PRIVACY: NOMOS XIII (1971). 27 See e.g., Leslie Shade, Reconsidering the right to privacy in Canada, 28 Bulletin of Science, Technology & Society 80 (2008). 28 For a thorough discussion of this problem, see Jeffery L. Johnson, A Theory of the Nature and Value of Privacy, 6 Pub. Aff. Q. 271 (1992). 29 See Paul Syverson, The Paradoxical Value of Privacy, Proceedings of the 2nd Annual Workshop on Economics and Information Security, 2003. 30 See generally Michael J. Zimmerman, Intrinsic vs. Extrinsic Value, Stanford Encyclopedia of Philosophy (Dec. 17, 2010), available at plato.stanford.edu/entries/value-intrinsic-extrinsic/.


[W]e do not feel comfortable about asserting that privacy is intrinsically valuable, an end in itself--privacy is always for or in relation to something or someone. On the other hand, to view it as simply instrumental, as one way of getting other goods, seems unsatisfactory too.31

Privacy creates a framework that allows other values to exist and develop. Where privacy is available, we can have freedom, liberty, and other intrinsic goods. We can develop friendships, relationships, and love.32 We also develop as independent beings and people when we have privacy.33 Having established the value of privacy, I posit we should prioritize privacy threats of three types: when laws are broken; when laws are insufficiently enforced; and when laws, practices, or frameworks subvert valued social expectations of privacy. The first two speak to the role of government and the social contract.34 According to the social contract, an idea deeply infused in American society and government,35 we trade away the state of nature—how the world is without a government—to form a society and enjoy protection, security, and property. To protect our values, we create laws. The law’s task “[i]s to secure a situation whereby moral goals which, given the current social situation in the country whose law it is, would be unlikely to be achieved without it…”36 The law should serve the common interest, and secure values that will be broadly useful to society.37 Once established, laws and rules must be enforced,38 since the government derives authority from creating and enforcing laws.39 Thus, there is an immediate positive benefit when we protect a valued good like privacy, and a broader benefit, as enforcing the law gives the government credibility and creates a stable society. Indeed, people expect good laws and enforcement from the government; in one survey, 94% of Internet users said that privacy violators should be disciplined.40

31 Charles L. Reid, Choice and action: an introduction to ethics (1981), at 335. 32 See Charles Fried, AN ANATOMY OF VALUES (1970), reprinted in Richard Wasserstrom, editor, TODAY’S MORAL PROBLEMS (1975), at 25 (“Privacy creates the moral capital which we spend in friendship and love.”). 33 Robert F. Murphy, Social Distance and the Veil, 66 Am. Anthropologist (1964) reprinted in Schoeman, supra note, at 36 (“Interaction is threatening by definition, and reserve, here seen as an aspect of distance, serves to provide partial and temporary protection to the self.”). 34 See generally JOHN LOCKE, THE SECOND TREATISE OF CIVIL GOVERNMENT (1690); JEAN JACQUES ROUSSEAU, THE SOCIAL CONTRACT, OR PRINCIPLES OF POLITICAL RIGHT (1762); IMMANUEL KANT, THE FOUNDATIONS OF THE METAPHYSICS OF MORALS (1797). 35 See, e.g., Anita L. Allen, Social Contract Theory in American Case Law, 51 FLA. L. REV. 3 (1999) (“According to some historians, the American colonists relied upon liberal, Lockean notions of a social contract to spirit rebellion against unwanted British rule. Historians have maintained that social contractarian theories of political order significantly influenced the people who wrote and defended the Declaration of Independence, the original Constitution, and the Bill of Rights.”); Christine N. Cimini, The New Contract: Welfare Reform, Devolution, and Due Process, 61 MD. L. REV. 246, 275 (2002) (“[T]he Declaration of Independence, original state constitutions, the Articles of Confederation, and the federal Constitution with its accompanying Bill of Rights all based their notions of the structure of democratic government on ideas of social contract. These documents amount to a formalization of the social contract between the government and its people.”). 36 Raz, supra note, at 14. 37 See JOHN RAWLS, A THEORY OF JUSTICE 29 (1971).
38 Joseph Raz, Reasoning with Rules, 54 CURRENT LEGAL PROBLEMS 1, 17 (2001) (“Again we can see how rules are the inevitable backbone of any structure of authority, of which the law is a paradigm example.”). 39 Joseph Raz, About Morality and the Nature of Law, 48 AM. J. JURISPRUDENCE 1, 11 (2003) (“A proper doctrine of authority should be based on the task to be done argument, qualified by the success and relevance conditions.”). 40 See Epic, supra note. Social contract theory is primarily based on natural law. Nonetheless, the legislative and judicial support for privacy, as well as the social expectation of privacy and law enforcement in the U.S. evidenced by the polling just mentioned and other studies, demonstrate that natural law arguments and legal positivism can both be invoked to support the framework. See generally Leslie Green, Legal Positivism, Stanford Encyclopedia of Philosophy, Jan. 3, 2003. Available at plato.stanford.edu/entries/legal-positivism/ ("What laws are in force in that system depends on what social standards its officials recognize as authoritative; for example, legislative enactments, judicial decisions, or social customs."). However, I do not engage substantially with legal positivism in this paper, as


The third prong of my privacy framework values social expectations. Norms and expectations allow people to feel secure, and ensure society functions.41 Privacy is a social expectation formed by the contexts in which information is collected and gathered. As Helen Nissenbaum points out, “When the flow of information adheres to entrenched norms, all is well; violations of these norms, however, often result in protest and complaint.”42 Problematically, technological limitations change and disappear, allowing information to flow detached from outdated expectations or social, ethical, legal, and political norms.43 Businesses should nonetheless act in accordance with our social expectations, and when they do not, courts and legislatures should step in to protect those expectations. As noted, privacy has value for us, and when we expect privacy and law enforcement and our expectations prove wrong, our ability to be secure in our person and development is undermined. Privacy invasions and exploitation will persist if we do not respond; yet, as I detail in the next section, regulating technology trends is costly, complicated, and cumbersome.

II. Compounding Privacy Problems: Rational Choice Theory and Technological Growth

Across the Web, consumers annually share double the information they did the year before.44 Analysts estimate that over 90 million Americans use smartphones today, and project that by 2015 nearly half the country will own one.45 Wireless data usage is projected to increase by 3,506% by 2014.46 The number of worldwide email accounts is expected to grow from 3.1 billion in 2011 to nearly 4.1 billion by 2015.47 By exploiting this technological growth, the government and businesses can do things that would have been impossible just a few years ago. Our expectations are outdated. Lotame Solutions places cookies on users’ computers to gather information,48 while Apple, Verizon, Target, and others compile information from interactions with their products.49 Roughly 1,271 government organizations and 1,931

I believe others have done so much more thoughtfully than I could. See e.g., Ronald Dworkin, TAKING RIGHTS SERIOUSLY (1978); Ronald Dworkin, LAW'S EMPIRE (1986). 41 See Helen Nissenbaum, PRIVACY IN CONTEXT: TECHNOLOGY, POLICY, AND THE INTEGRITY OF SOCIAL LIFE (2009). 42 Supra note Nissenbaum, at 33. 43 Ibid, at 34. 44 U.S. Securities and Exchange Commission, Form S-1 Registration Statement Under the Securities Act of 1933: Facebook Inc. (Feb. 2012), http://s3.documentcloud.org/documents/288894/facebook-prospectus.pdf. 45 See Stephanie Reese, Quick Stat: Smartphone Users Account for 38% of Mobile Phone Users, eMarketer (Aug. 24, 2011), www.emarketer.com/blog/index.php/quick-stat-smartphone-users-account-38-mobile-phone-users/; Marissa McNaughton, US Mobile Internet Use to Increase 25%, Smartphone Use Nearly 50% in 2011, THE REAL TIME REPORT (Aug. 26, 2011), therealtimereport.com/2011/08/26/us-mobile-internet-use-to-increase-25-smartphone-use-nearly-50-in-2011/ (“Just under half of the total US population will be using the internet via mobile by 2015, according to the latest estimates by eMarketer.”). 46 FEDERAL COMMUNICATIONS COMMISSION, Mobile Broadband: The Benefits of Additional Spectrum, OBI Technical Paper Series 18 (2010), http://download.broadband.gov/plan/fcc-staff-technical-paper-mobile-broadband-benefits-of-additional-spectrum.pdf. 47 The Radicati Group Releases “Email Statistics Report, 2011-2015”, RADICATI TEAM (May 18, 2011), www.radicati.com/?p=7269. 48 Julia Angwin, The Web's New Gold Mine: Your Secrets, W.S.J. (Jul. 30, 2010), www.online.wsj.com/article/SB10001424052748703940904575395073512989404.html. 49 See Charles Arthur, iPhone keeps record of everywhere you go, THE GUARDIAN (Apr.
20, 2011), www.guardian.co.uk/technology/2011/apr/20/iphone-tracking-prompts-privacy-fears (“Security researchers have discovered that Apple's iPhone keeps track of where you go – and saves every detail of it to a secret file on the device which is then copied to the owner's computer when the two are synchronised.”); David Goldman, Your phone company is selling your personal data, CNNMoney (Nov. 1, 2011). money.cnn.com/2011/11/01/technology/verizon_att_sprint_tmobile_privacy/ (“In mid-October, Verizon Wireless

private companies work on counterterrorism, homeland security and intelligence in about 10,000 locations across the United States.50 An estimated 854,000 people hold top-secret security clearance.51 Using a GPS device, police can do what would have formerly required “a large team of agents, multiple vehicles, and perhaps aerial assistance.”52 As a result, technology neither validated nor rejected by law has flourished—examples include respawning cookies,53 beacons and flash cookies,54 and abusive use of a user’s browser history.55 Governments and businesses entrench around new technology and practices in these unregulated spaces, then claim that changes would endanger their business or national security.56 Moreover, though mainstream microeconomic theory suggests we have a rational capacity to process information about the privacy tradeoffs to which we assent in going online, “terms of use” choices, choices about browser settings and software, and choices about purchases and credit cards, etc.,57 are complicated, making it unlikely that the “perfect information” criterion of rationality will be met when we face privacy decisions.58 Even with full information, we may act against our best judgment,59 owing to self-control problems, optimism that something bad will not happen to us,60 or a desire for immediate gratification.61 Privacy decision-making and privacy features are also incredibly complex. An examination of 133 privacy software tools and services revealed a list of 1,241 features.62 Users cannot research these settings under reasonable circumstances, much less choose between them.63 In a recent CMU study of 45 experienced web users, participants were instructed to activate browsers and tools to block cookies. Users blocked much less than they thought they did, often blocking nothing.
changed its privacy policy to allow the company to record customers' location data and Web browsing history, combine it with other personal information like age and gender, aggregate it with millions of other customers' data, and sell it on an anonymous basis.”); Charles Duhigg, How Companies Learn Your Secrets, N.Y.T. (Feb. 16, 2012), www.nytimes.com/2012/02/19/magazine/shopping-habits.html; Natasha Singer, Following the Breadcrumbs on the Data-Sharing Trail, NYTimes (Apr. 28, 2012), available at www.nytimes.com/2012/04/29/business/on-the-data-sharing-trail-try-following-the-breadcrumbs.html?_r=3&adxnnl=1&pagewanted=all. 50 See Washington Post, supra note. 51 Ibid. 52 Jones, supra note, at 11. See also Illinois v. Lidster, 540 U. S. 419, 426 (2004) (noting that typical limitations in police monitoring are “limited police resources and community hostility.” These limitations would not apply to robust electronic or GPS monitoring). 53 See e.g., Ashkin Soltani, et al., Flash Cookies and Privacy (Aug. 10, 2009). Available at SSRN: http://ssrn.com/abstract=1446862 or http://dx.doi.org/10.2139/ssrn. 54 Wendy Davis, The Forever Cookie: New Tracking Technologies Continue To Threaten Privacy, MEDIA POST (Oct. 11, 2010), http://www.mediapost.com/publications/?fa=Articles.showArticle&art_aid=137459. 55 See, e.g., Cory Doctorow, What The Internet Knows About You: your browser is giving away your history, BOINGBOING (Sep. 2, 2010), boingboing.net/2009/09/02/whattheinternetknows.html. 56 World Privacy Forum, supra note. 57 See generally Herbert Simon, MODELS OF BOUNDED RATIONALITY (1982). It is worth noting that there are also rational bounds to our capacity to understand medicine, science, finance, etc. 58 Supra note Alessandro Acquisti and Jens Grossklags.
59 Alessandro Acquisti, Privacy in electronic commerce and the economics of immediate gratification, Proceedings of ACM Electronic Commerce Conference (EC 04), pp. 21-29, New York, (2004), http://www.heinz.cmu.edu/~acquisti/papers/privacy-gratification.pdf. 60 See e.g., S.M. Corey, Professional attitudes and actual behavior, 28 J. ED. PSYC. 271 (1937). 61 Ibid. 62 Benjamin Brunk, Understanding the Privacy Space, 7 FIRST MONDAY (2002), firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/991/912. 63 Ibid. 64 Pedro G. Leon, Blase Ur, Rebecca Balebako, Lorrie Faith Cranor, Richard Shay, and Yang Wang, Why Johnny Can’t Opt Out: A Usability Evaluation of Tools to Limit Online Behavioral Advertising, CMU-CyLab-11-017 (Oct. 31, 2011), www.cylab.cmu.edu/research/techreports/2011/tr_cylab11017.html. See also Michelle Madejski et al., The Failure of Online Social Network Privacy Settings (2011), available at

Users unsuccessfully used tools designed for privacy, but companies and governments creating technological, legal, and societal defaults aim to gather information.64 Behavioral economics offers insight into these problems.65 Comprehension challenges are compounded by legal confusion, inaction, and non-compliance, which I address in the next section.
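The cookie "respawning" practice noted earlier in this section can be illustrated schematically: a tracker mirrors its identifier into a secondary store that the browser's cookie controls do not clear (historically, Flash Local Shared Objects), then restores the deleted HTTP cookie from that mirror on the next visit. The following Python sketch is purely illustrative; the store names and structure are hypothetical, not the actual implementation of Lotame or any vendor:

```python
# Schematic illustration of cookie "respawning" (cf. Soltani et al., note 53).
# The tracker keeps two copies of its ID; if the user deletes one (the HTTP
# cookie), it is silently recreated from the other (the durable store).
import uuid


def respawn_tracking_id(http_cookies: dict, durable_store: dict) -> str:
    """Return the tracking ID, recreating whichever copy is missing."""
    tid = http_cookies.get("uid") or durable_store.get("uid")
    if tid is None:                 # genuinely new visitor: mint a fresh ID
        tid = str(uuid.uuid4())
    http_cookies["uid"] = tid       # (re)set the HTTP cookie
    durable_store["uid"] = tid      # mirror into the store cookie tools miss
    return tid


# A user who clears cookies is still re-identified on the next visit:
cookies, durable = {}, {}
first_visit = respawn_tracking_id(cookies, durable)
cookies.clear()                     # user deletes HTTP cookies
second_visit = respawn_tracking_id(cookies, durable)
assert first_visit == second_visit  # the same ID "respawns"
```

This is why the CMU participants who activated cookie-blocking tools often "blocked nothing" in effect: deleting or blocking one storage channel leaves the mirrored identifier intact.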

III. Legal and Judicial Privacy Guidance

A. Precedents

Chief Justice Marshall called it “emphatically the province and duty of the judicial department to say what the law is.”66 The Supreme Court should do so. To be fair, the Court has recognized that new technology can “shrink the realm of guaranteed privacy,”67 and should be considered when defining “the existence, and extent, of privacy expectations” under our Fourth Amendment privacy guarantees.68 Nonetheless, the Court is careful when grafting privacy protections and expectations onto technological changes; the Justices waited nearly a century after the invention of the telephone to protect phone calls from the government listening in, and even then, only when the individual has a reasonable expectation of privacy.69 The Court now applies a two-part privacy test to determine whether (1) something encroaches on an “actual (subjective) expectation of privacy,” and whether (2) “the expectation [is] one that society is prepared to recognize as ‘reasonable.’”70 We do expect that certain technology will not be used to exploit, expose, or abuse our privacy.71 Federal Courts have occasionally protected these expectations,72 but the Supreme Court has hesitantly

www.cs.columbia.edu/~maritzaj/publications/2011-tr-madejski-violations.pdf (“We present the results of an empirical evaluation that measures privacy attitudes and intentions and compares these against the privacy settings on Facebook. Our results indicate a serious mismatch: every one of the 65 participants in our study confirmed that at least one of the identified violations was in fact a sharing violation.”). 65 See Acquisti, supra note. 66 Marbury v. Madison, 5 U.S. (1 Cranch) 137, 177 (1803). 67 Kyllo v. United States, 533 U.S. 27, 34 (2001). 68 Supra note Quon. See also U.S. Const. amend.
IV (“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”). 69 See Katz v. United States, 389 U.S. 347, 352–53 (1967). For examples of Courts attempting to deal with technology and the Fourth Amendment, see Recent Case, United States v. Comprehensive Drug Testing, Inc., 579 F.3d 989 (9th Cir. 2009) (en banc), 123 HARV. L. REV. 1003, 1007–10 (2010), suggesting that the Ninth Circuit overreacted to an unreasonable digital search and seizure; United States v. Maynard, Nos. 08-3030, 08-3034, 2010 WL 3063788, at *1, *7–18 (D.C. Cir. Aug. 6, 2010), holding that a warrantless use of a GPS tracking device violated a reasonable expectation of privacy. 70 See Katz, 389 U.S. at 361 (Harlan, J., concurring); see Minnesota v. Carter, 525 U.S. 83, 97 (1998) (“[The] Katz test . . . has come to mean the test enunciated by Justice Harlan’s separate concurrence in Katz . . . .”). See generally Renée McDonald Hutchins, Tied Up in Knotts? GPS Technology and the Fourth Amendment, 55 UCLA L. REV. 409, 427 (2007) (citing Kyllo, 533 U.S. at 32). 71 In one survey, 91% of respondents were concerned their identities might be stolen and used to make unauthorized purchases. See Epic, supra note. Ninety percent of cloud-computing users in the United States would be “very concerned” if cloud service providers sold their files to a third party. See Memorandum from John B. Horrigan, Pew Internet & Am. Life Project 2, 6–7 (Sept. 2008), http://www.pewinternet.org/~/media//Files/Reports/2008/PIP_Cloud.Memo.pdf.pdf. A Business Week/Harris Poll found that 89% of respondents were not comfortable with web tracking if “data was combined with an individual’s identity.” See Epic, supra note. 
An American Society of Newspaper Editors study found that 81% of respondents were “very concerned” or “somewhat concerned” that a company might violate their privacy. Ibid. 72 See e.g., United States v. Warshak, 631 F.3d 266 (6th Cir. 2010) (holding that the Stored Communications Act violates the Fourth Amendment to the extent it allows secret searches of emails stored with third parties). But see Orin Kerr, The Legal Rulings in Warshak v. United States, The Volokh Conspiracy (Jun. 21, 2007), available at http://volokh.com/posts/1182446445.shtml (“The court simply had no business trying to imagine all the ways the statute might be applied and resolving the constitutionality of all of those hypothetical applications.”).

addressed the Fourth Amendment’s relationship to technology, most notably in two recent cases. First, the Court concluded that police must have a warrant to place a GPS tracker on a car. This holding aligns with societal expectations: a recent poll found that 73% of Americans believe police must have a warrant to put a GPS tracking device on a car.73 Some members of the Court even recognized that long-term warrantless GPS monitoring violates our social expectations.74 The Court acknowledged that tracking someone electronically could be “an unconstitutional invasion of privacy,” but concluded that addressing that question would lead “needlessly into additional thorny problems,”75 despite our societal expectations and the reality that long-term GPS monitoring relies less and less on an actual GPS device.76 Second, in a text-message case, the Court faced what Justice Kennedy termed “issues of far-reaching significance.”77 The Court avoided them, deeming two-way pagers, a decades-old device, an “emerging technology.” The judiciary, Kennedy concluded, would take a risk by engaging “the Fourth Amendment implications of emerging technology before its role in society has become clear.” Some see this as unhelpful.78 Some hesitancy and delay in recognizing social expectations is an inevitable outcome of the relationship among case law, technology, and legislation. Cases do not reach the courts until years after the incidents at issue occurred, and courts are beholden to the laws Congress passes. Nonetheless, by the time a case reaches the Court, social expectations may be settled. For example, 73% of participants in a recent poll viewed it as extremely important not to have someone watching or listening to them without permission.79 The Court could recognize this reality and find that certain communications and movements carry reasonable privacy expectations society is prepared to recognize. 
Justice Brennan believed that “Judges cannot avoid a definitive interpretation because they feel unable to, or would prefer not to, penetrate to the full meaning of the Constitution's provisions.”80 Judges can apply Fourth Amendment rules to the virtual world without creating new jurisprudence or frameworks.81 For example, the header, address, contents, and size of emails could be protected just like the address, size, and dates on letters.82 Information in briefcases—which has privacy protections83—mirrors our virtual lives, full of emails, photos, correspondence, address books, and the like.84 The Court need not create a new privacy doctrine or theorize in a black box about expectations; it can rely on polling to examine the societal-expectation prong of the Katz test. Polling is increasingly easy to conduct and can be evaluated for accuracy.85 By using polling, the Court can determine and validate our societal privacy expectations.86

73 PublicMind Poll, High Court Agrees with Public in U.S. v. Jones: Electronic Tails Need a Warrant (Jan. 23, 2012), available at publicmind.fdu.edu/2012/tailing/final.pdf. 74 Ibid. (Sotomayor, J., concurring, at 3; Alito, J., concurring, at 13). 75 Jones, supra note, at 11. 76 Jones, supra note (Alito, J., concurring, at 9-12). 77 See Quon, supra note 24, at 2619. 78 See e.g., Rehberg v. Paulk, 611 F.3d 828, 844 (11th Cir. 2010) (“The Supreme Court’s most-recent precedent [Quon] shows a marked lack of clarity in what privacy expectations as to content of electronic communications are reasonable.”). 79 See Epic, supra note. 80 William J. Brennan, Speech given at the Text and Teaching Symposium, Georgetown University (Oct. 12, 1985), www.pbs.org/wnet/supremecourt/democracy/sources_document7.html. 81 See e.g., Orin S. Kerr, Applying the Fourth Amendment to the Internet: A General Approach, 62 STAN. L. REV. 1005 (2010); Jonathan Zittrain, Searches and Seizures in a Networked World, 119 HARV. L. REV. F. 83, 83 (2006) (arguing for Fourth Amendment protections online). 82 See Orin S. Kerr, Internet Surveillance Law After the USA Patriot Act: The Big Brother That Isn’t, 97 NW. U. L. REV. 607, 611-14 (2003). 83 United States v. Freire, 710 F.2d 1515, 1519 (11th Cir. 1983). 84 New Jersey v. T.L.O., 469 U.S. 325, 339 (1985). See generally David A. Couillard, Defogging the Cloud: Applying Fourth Amendment Principles to Evolving Privacy Expectations in Cloud Computing, 93 MINN. L. REV. 2205 (2009). 85 See e.g., Nate Silver, Pollster Ratings v4.0: Results, FiveThirtyEight.com (Jun. 6, 2010), available at www.fivethirtyeight.com/search/label/pollster%20ratings.


B. Statutory Guidance

A host of legislation covers privacy.87 No single office or piece of legislation covers all personal information.88 I apply my framework to three laws, pinpointing areas where legislation, or a lack of legislation, allows abuse, subversion, or violations of social expectations of privacy. Outdated legislation, for example, can become problematic in application, as can legislation with wide-ranging coverage of technology, people, and content. And because of its impact on the social contract, it is crucial to examine how federal agencies gather, use, and disclose our information, and whether the government keeps its word and mandates compliance with the law. The Privacy Act of 1974 regulates how the government may gather, use, and distribute personal information.89 It states that “No agency shall disclose any record which is contained in a system of records by any means of communication to any person, or to another agency, except pursuant to a written request by, or with the prior written consent of, the individual to whom the record pertains . . . .”90 The law spans the entire lifetime of citizens, “including records of births, marriages, divorces, professional licenses, voting information, worker’s compensation, personnel files (for public employees), property ownership, arrests, victims of crime, criminal and civil court proceedings, and scores of other information.”91 But the Act only applies to the public sector. 
Members of Congress skirt it by releasing information gathered by the government and buying back re-purposed, enhanced versions of that information from data brokers.92 Moreover, a Congressional Research Service report found that twenty-three federal agencies disclosed the personal information of their websites’ users to banks, retailers, distributors, and trade organizations.93 The Act has about a dozen exceptions,94 and a widely-criticized, broad exemption for “routine use.”95 It is little wonder the Act has been called “toothless.”96 The Electronic Communications Privacy Act (ECPA) was drafted to protect the communication privacy of American citizens.97 Written when copying records was a physical activity and records could

86 For an example of a valuable collection of privacy polls, see EPIC, Public Opinion on Privacy, supra note. See also Public Mind Poll, available at publicmind.fdu.edu (“Fairleigh Dickinson University's PublicMindTM is a research center that conducts polling, survey, and other research on politics, society, popular culture, consumer and economic trends.”). 87 Supra note on privacy laws. 88 See Julia Angwin, Obama Administration Seeks Privacy Protections, New Policy Office, W.S.J. (Nov. 11, 2010), http://online.wsj.com/article/SB10001424052748703848204575608970171176014.html. 89 Privacy Act of 1974, 5 U.S.C. § 552a (1974). 90 5 U.S.C. § 552a(b) (2000). 91 See Solove, infra note 92, at 1139. 92 See Daniel Solove, Access and Aggregation: Public Records, Privacy, and the Constitution, 86 MINN. L. REV. 1138, 1138-1139 (2002) (“[T]he government routinely pour[s] [personal] information into the public domain…through posting information and/or databases on the Internet.”); See generally Melissa Carrie Oppenheim, The Dark Data Cycle: How the U.S. Government Has Gone Rogue in Trading Personal Data from an Unsuspecting Public (Mar. 2012), thesis on file with the author. 93 Harold Relyea, The Privacy Act: Emerging Issues and Related Legislation, Congressional Research Service and the Library of Congress’ Report for Congress, at 5 (Feb. 26, 2002). 94 PHILIPPA STRUM, PRIVACY 50-51 (1998). 95 Paul M. Schwartz, Privacy and Participation: Personal Information and Public Sector Regulation in the United States, 80 IOWA L. REV. 553, 593 (1995). 96 Anna Kimbol, The Privacy Act May be Toothless, Health Law Perspectives (Sept. 2008), http://www.law.uh.edu/healthlaw/perspectives/homepage.asp. 97 See S. Rep. No. 99-541, at 3557-59 (1986) (“With the advent of computerized record keeping systems Americans have the ability to lock away a great deal of personal and business information . . . [T]he law must advance with technology to ensure the continued vitality of the fourth amendment. . . . 
Congress must act to protect the privacy of our citizens. . . The Committee believes that [this Act] represents a fair balance between the privacy expectations of American citizens and the legitimate needs of law enforcement agencies.”).

be destroyed, ECPA has not been updated since it was passed in 1986. Applying it to email, texting, social networks, data storage, and other new technology is quite difficult.98 Unnecessarily complex and overly technical distinctions have emerged, such as between opened and unopened email, and between emails in transit and in storage.99 Though it may have made sense at the time of passage, it now defies our expectations of reality to distinguish privacy in this way or the others recognized by ECPA. Third, the USA PATRIOT Act covers the scope and type of information the federal government can gather in counter-terrorism efforts.100 The Act allows the FBI to issue National Security Letters (NSLs) demanding information without judicial oversight or probable cause.101 NSLs include a gag order to prevent a recipient from appealing or discussing the request.102 A person who complies with the order cannot be held liable for producing information, thereby incentivizing people not to resist.103 The FBI guidelines for issuing an NSL and other investigative techniques are classified.104 The secrecy is concerning when we examine how NSLs have been used. From 2003 to 2006, the FBI issued nearly 200,000 NSLs,105 which cover “any tangible things…for an authorized investigation…to protect against international terrorism or clandestine intelligence activities.”106 The Bureau violated NSL rules more than 1,000 times in an audit of 10% of NSLs.107 While Courts have intermittently regulated NSLs,108 two Senators familiar with the Patriot Act claim that “there is now a significant gap between what most Americans think the law allows and what the government secretly claims the law allows. This is a

98 See Achal Oza, Note, Amend the ECPA: Fourth Amendment Protection Erodes as E-mails Get Dusty, 88 B.U. L. REV. 1043 (2008) (arguing that technology has outpaced ECPA); see also Patricia L. Bellia, Surveillance Law through Cyberlaw’s Lens, 72 GEO. WASH. L. REV. 1375, 1396-1397 (2004) (stating that “[s]tored communications have evolved in such a way that [ECPA’s layer of statutory protection for stored communications], often referred to as the Stored Communications Act (“SCA”), are becoming increasingly outdated and difficult to apply.”). 99 “…distinctions recognized by ECPA include electronic mail in transit; electronic mail in storage for less than or more than 180 days; electronic mail in draft; opened vs. unopened electronic mail; electronic communication service; and remote computing service.” Robert Gellman, Report for the World Privacy Forum, Privacy in the Clouds: Risks to Privacy and Confidentiality from Cloud Computing (Feb. 23, 2009). See generally J. Beckwith Burr, The Electronic Communications Privacy Act of 1986: Principles for Reform, at 9, noting that “a growing chorus of academics argues that these distinctions do not make sense.” Available at www.digitaldueprocess.org/files/DDP_Burr_Memo.pdf. 100 Pub. L. No. 107-56, 115 Stat. 272 (Oct. 26, 2001). 101 See David Kravets, Patriot Act Turns 10, With No Signs of Retirement, WIRED (Oct. 26, 2011), www.wired.com/threatlevel/tag/national-security-letter/ (discussing National Security Letters and calling them “perhaps the most invasive facet of the law [Patriot Act]”); see generally EFF, National Security Letters, available at https://www.eff.org/issues/national-security-letters. 102 Wayne Ma, Google’s Gdrive (and its ad potential) raise privacy concerns, POPULAR MECHANICS (Nov. 29, 2009), http://www.popularmechanics.com/technology/industry/4234444.html. 103 Supra note 47, at (e). 
(“A person who, in good faith, produces tangible things under an order pursuant to this section shall not be liable to any other person for such production. Such production shall not be deemed to constitute a waiver of any privilege in any other proceeding or context.”). 104 National Security Investigation Guidelines, § II(A). 105 My National Security Letter Gag Order, WASH. POST (Mar. 23, 2007), www.washingtonpost.com/wp-dyn/content/article/2007/03/22/AR2007032201882.html. 106 50 U.S.C. § 1861(b)(2)(a). 107 John Solomon, FBI Finds It Frequently Overstepped in Collecting Data, WASH. POST (Jun. 14, 2007), www.washingtonpost.com/wp-dyn/content/article/2007/06/13/AR2007061302453.html. See generally U.S. Department of Justice, A Review of the Federal Bureau of Investigation’s Use of National Security Letters (Mar. 2007). 108 Doe v. Gonzales, 449 F.3d 415 (2d Cir. 2006) (finding parts of the NSL secrecy mandates unconstitutional); see also Internet Archive et al. v. Mukasey et al., No. 07-6346-CW (N.D. Cal.) (ultimately, the FBI withdrew the NSL and gag order as part of a settlement). See generally Jameel Jaffer, Panel report: secret evidence in the investigative stage: FISA, administrative subpoenas, and privacy, 5 CARDOZO PUB. L. POL’Y & ETHICS J. 7 (2006).

problem, because it is impossible to have an informed public debate about what the law should say when the public doesn't know what its government thinks the law says.”109

C. Analysis

Few issues become salient to voters in elections.110 Congress has on a few occasions considered privacy legislation,111 but privacy generally has low salience as a political priority. Part of this is due to how Congress prioritizes oversight.112 One model is the police-patrol model, which is centralized, active, and direct: Congress proactively searches for violations of its legislative goals and for remedies. Congress, however, seems to prefer a fire-alarm model, in which it establishes rules, procedures, and practices that empower individuals to examine administrative decisions, charge executive agencies with violating congressional goals, and subsequently seek remedies. The costs and pain of fire-alarm detection are borne by citizens and advocacy groups. Legislators can then solve problems and take credit alongside those who sounded the alarm; rather than preventing a problem, they solve one. As noted, privacy is a good that is difficult to value, hard to understand, and complex, which may partially explain why Congress has not prioritized the issue. Hyper-partisanship has impeded compromise and action in the legislative branch,113 and Congressional members’ interest in re-election impacts policy, often negatively.114 Political parties also shape our laws, but neither party has championed privacy.115 Privacy lacks an organized constituency, while established businesses have connections, experience, money, and lobbying capacity, and the government has far-reaching power.116 Privacy can also become a proxy discussion of national security, child pornography, and the “War on Terror.” And as previously mentioned, psychological processing problems and the low salience of privacy as an issue to voters also seem to play a role. Yet if each branch of government accepts legislative and regulatory inaction, and privacy abuse, the separation of powers117 will likewise fail to protect privacy.118

109 Ron Wyden & Mark Udall, supra note. 110 See generally Edward Carmines & James Stimson, On the Structure and Sequence of Issue Evolution, 80 AM. POL. SCI. REV. 901, 915 (1986) (“The issue space—that tiny number of policy debates that can claim substantial attention both at the center of government and among the passive electorate—is strikingly limited by mass inattention.”). 111 See Commercial Privacy Bill of Rights Act of 2011, S. 799, 112th Congress (2011); Building Effective Strategies To Promote Responsibility Accountability Choice Transparency Innovation Consumer Expectations and Safeguards Act, H.R. 611, 112th Congress (2011); Consumer Privacy Protection Act of 2011, H.R. 1528, 112th Congress (2011). 112 Matthew McCubbins & Thomas Schwartz, Congressional Oversight Overlooked: Police Patrols versus Fire Alarms, 28 AM. J. POL. SCI. 165 (1984). 113 Dino Grandoni, Senate Gridlock Explained in One Chart, THE ATLANTIC (Mar. 8, 2012), available at www.theatlanticwire.com/national/2012/03/us-senate-now-completely-polarized/49641/ (“As you can see something remarkable happened last year: any remaining ideological overlap between the Democratic and Republican parties totally disappeared in the Senate, as the vote ratings, for the first time, were divided neatly by party line.”). 114 See generally DAVID R. MAYHEW, CONGRESS: THE ELECTORAL CONNECTION (1975). 115 See Daryl J. Levinson & Richard H. Pildes, Separation of Parties, Not Powers, 119 HARV. L. REV. 2311 (2006) (“Political competition and cooperation along relatively stable lines of policy and ideological disagreement quickly came to be channeled not through the branches of government but rather through an institution the Framers could imagine only dimly but nevertheless despised: political parties.”); see also Ezra Klein, The Unpersuaded, NEW YORKER (Mar. 
19, 2012), www.newyorker.com/reporting/2012/03/19/120319fa_fact_klein?currentPage=all (“…we have a system that was designed to encourage division between the branches but to resist the formation of political parties. The parties formed anyway, and they now use the branches to compete with one another.”). 116 RICHARD FENNO JR., THE POWER OF THE PURSE (1966). 117 The Founders gave “each department the necessary constitutional means and personal motives to resist encroachments of the others.” THE FEDERALIST NO. 51, at 286 (Alexander Hamilton) (E.H. Scott ed., 1898). See generally John A. Fairlie, The Separation of Powers, 21 MICH. L. REV. 393, 393 (1923) (“This tripartite system of


D. Looking Ahead

Senators, scholars, and advocates have asserted that agencies are infringing on our privacy.119 The Supreme Court cannot easily interpret poorly written or imprecise laws, or serve as a supplemental lawmaker capable of applying Congressional intent.120 Congress is particularly well placed to handle this type of large-scale public problem legislatively.121 Congress could begin by holding public hearings on secret abuses and on privacy legislation, bringing the issue into the public eye. Congress should then update obsolete frameworks in the ECPA and the Privacy Act, amending them with an eye toward current technology, current uses, and the future.122 Congress should empower Courts and administrative agencies to revisit these issues,123 and as necessary, redefine and amend legislative goals,124 particularly in the case of abused or subverted legislation. Where the Department of Justice has found violations within FBI and executive practices, it should vigilantly expose and oppose such violations. To counteract the fact that leaders have trouble understanding technology,125 Congress could rely on technologists when creating legislation, and Courts could call on experts as witnesses or to file amicus curiae briefs.126 The White governmental authorities was the result of a combination of historical experience and a political theory generally accepted in this country as a fundamental maxim in the latter part of the eighteenth century.”). 118 See generally Bruce G. Peabody & John D. Nugent, Toward a Unifying Theory of the Separation of Powers, 53 AM. U. L. REV. 1, 44 (2003) (explaining the importance of the balance of powers and concluding that “while our model underscores the extent to which the U.S. policy process, superimposed over the separation of powers framework, is inevitably disjointed and lurching, the iterative and staggered nature of this process frequently produces broad consensus and guarantees that contentious issues can be readily revisited.”). 
Complacency among the branches can create problems or inaction on other issues as well. See e.g., Matthew Sundquist, Worcester v. Georgia: A Breakdown in the Separation of Powers, 35 AM. IND. L. REV. 239 (2010). 119 See supra notes 25, 31, and 34. 120 Beth M. Henschen, Judicial Use of Legislative History and Intent in Statutory Interpretation, 10 LEGIS. STUD. Q. 353 (1985) (“Thus, the role that the Supreme Court adopts as supplemental lawmaker depends in part on the opportunities for judicial policy making that Congress provides in its statutes.”). 121 See e.g., Orin Kerr, The Fourth Amendment and New Technologies: Constitutional Myths and the Case for Caution, 102 MICH. L. REV. 801, 805–806 (2004). 122 See e.g., Orin S. Kerr, A User's Guide to the Stored Communications Act, and a Legislator's Guide to Amending It, 72 GEO. WASH. L. REV. 1208 (2004). 123 See e.g., The Administrative Procedure Act of 1946, which established the right to notice, the right to a hearing, the right to due process in decision making, and the right to independent review. See also The National Environmental Policy Act of 1969, an act in which Congress gave a broad mandate to government agencies. 124 Matthew McCubbins & Thomas Schwartz, Congressional Oversight Overlooked: Police Patrols versus Fire Alarms, 28 AM. J. POL. SCI. 165, 174 (1984). 125 See generally Garrett Graff, Don’t know their Yahoo from their YouTube, WASH. POST (Dec. 2, 2007), http://www.washingtonpost.com/wp-dyn/content/article/2007/11/30/AR2007113001802.html. For example, when asked about Twitter, Justice Scalia said “I don’t even know what it is. To tell you the truth, I have heard it talked about.” See Mike Masnick, Supreme Court Justices Discuss Twitter, TECHDIRT (May 25, 2010), www.techdirt.com/articles/20100521/1631459536. Senator Ted Stevens, during hearings on a significant internet bill he sponsored, described the internet this way: “Ten movies streaming across that, that Internet, and what happens to your own personal Internet? 
I just the other day got… an Internet was sent by my staff at 10 o'clock in the morning on Friday. I got it yesterday [Tuesday]. Why? Because it got tangled up with all these things going on the Internet commercially. It’s a series of tubes.” WIRED, Your Own Personal Internet, (Jun. 20, 2006), http://www.wired.com/threatlevel/2006/06/your_own_person/. Senator John McCain explained he would not turn to a vice-president for advice on national security, but might “rely on a vice president” for help on less important issues such as “information technology.” See Graff, ibid. 126 A novel solution is moving Camp David to Silicon Valley so the President and Senators can interact with technology and technologists. See “Bridging the Continental Divide: From the Valley to D.C,” a panel discussion at the Tech Policy Summit and Center for Policy on Emerging Technologies breakfast on November 15, 2011 in Menlo Park, CA. Available at: http://vimeo.com/32851257.


House could call on Congress to pass robust privacy legislation, directing the FTC to aggressively enforce the FTC Act and protect privacy. The President himself should engage in the legislative arena,127 enact executive policies to protect privacy,128 and help mobilize interest groups.129 The government must access certain information. But the rules governing access and practices should be public. Secret, unchallengeable demands threaten due process, prevent public debate, and invade our privacy. Secret policies and interpretations mean we cannot assess what political philosopher John Rawls called justice as regularity: “the regular and impartial, and in this sense fair, administration of law.”130 If we do not know when, why, and how the government obtains and uses information, or is permitted to use information, how can we evaluate the justice of the government and its actions?

IV. Case Study of FTC Enforcement

Having reviewed where privacy has stalled legislatively and judicially, and the potential solutions, I now turn to enforcement. Congress has empowered the FTC and FCC to regulate businesses and protect consumers. However, privacy laws can be confusing and difficult to apply.131 The agencies have tepidly enforced these laws against companies that have broken them and violated our social expectations.132 The ensuing problem—poor laws, poor enforcement, and broken social expectations—triggers all three aspects of my proposed privacy framework. The lack of regulation sends mixed signals: if companies broke the law and violated our privacy, as the FTC claims and is evident, why are there no meaningful consequences, fines, or prosecutions? Instead, the FTC could exercise its litigation and compliance authorities, extracting financial and business costs from legal violators and pursuing criminal charges. The Facebook and Google cases illustrate a pattern also employed against MySpace,133 Twitter,134 and others. As matters stand, it is rational for companies to settle and enter into a Consent Decree with the FTC.135 Companies avoid admitting wrongdoing or paying fines.136 Companies are required to develop privacy plans; submit to privacy reviews; seek their customers’ permission before sharing their information; and pledge not to further misrepresent their privacy policies.137 This raises the odd question of whether the companies were previously permitted to misrepresent their policies.

127 The President could push for legislation to reverse or address Court decisions that punt on important privacy questions. For example, in response to the Supreme Court’s decision in Ledbetter v. Goodyear Tire & Rubber Co., 550 U.S. 618 (2007), President Barack Obama signed the Lilly Ledbetter Fair Pay Act of 2009, to “restore the law to where it was before the Supreme Court's decision.” See Macon Phillips, Now Comes Lilly Ledbetter, White House Blog, http://www.whitehouse.gov/now-comes-lilly-ledbetter/ (Jan. 25, 2009). 128 See generally Terry Moe & William G. Howell, The Presidential Power of Unitary Action, 15 J. L. ECON. & ORGANIZATION 132 (1999). 129 See generally Mark Peterson, The Presidency and Organized Interests: White House Patterns of Interest Group Liaison, 86 AM. POL. SCI. REV. 625 (1992). 130 JOHN RAWLS, A THEORY OF JUSTICE 207 (rev. ed. 1999). 131 See generally Kerr, supra notes 73 and 119. 132 See EPIC Comments, Comments of the Electronic Privacy Information Center to the FTC In the Matter of Facebook, Inc. (Dec. 27, 2011), epic.org/privacy/facebook/Facebook-FTC-Settlement-Comments-FINAL.pdf, at 2 (“…the proposed Order is insufficient to address the concerns originally identified by EPIC and the consumer coalition, as well as those findings established by the Commission.”); see also EPIC, Google's Circumvention of Browser Privacy Settings, available at epic.org/privacy/google/tracking/googles_circumvention_of_brows.html. 133 FTC MySpace settlement, supra note. 134 FTC Twitter settlement, supra note. 135 See Richard A. Posner, The Behavior of Administrative Agencies, 1 J. LEGAL STUD. 305 (1972), reprinted in ESSAYS IN THE ECONOMICS OF CRIME AND PUNISHMENT, at 232 (Gary S. Becker and William M. Landes eds., 1974). See generally Malcolm B. Coate et al., Fight, Fold, or Settle?: Modeling the Reaction to FTC Merger Challenges, 81 ECON. INQUIRY 829 (1995). 136 Supra note, FTC cases against Facebook and Google. 137 Ibid.


Google and Facebook gathered and used information in a host of ways that violated their terms of service, privacy policies, and our broader social expectations. I suggest it also violated our social expectations for the companies to repeatedly escape punishment. The Google Decree arose over the ways Google Buzz shared information.138 Then, Google Street View cars gathered e-mails, passwords, photos, chat messages, and sites visited from bystanders, even if users were not using a computer at the time.139 Google blamed an engineer, though the practice was planned and known to supervisors.140 Google later subverted Safari tracking protections, despite user indications that they did not wish to be tracked; Google claimed “we didn’t anticipate this would happen.”141 Google then altered its privacy policies in a widely-criticized way that used users’ information in a new fashion.142 Google settled an FCC violation for $25,000 after having “impeded” and “delayed” a U.S. inquiry; the fine amounts to .000066% of Google's annual revenue of $37.9 billion.143 Another settlement, of $22.5 million, is likewise a minuscule fine in comparison to the infraction and Google's revenue.144 Facebook publicly displayed information users thought was private; allowed advertisers to gather users’ personal information; and allowed access to users’ information even after a user deleted his or her profile.145 The FTC called the practices “unfair and deceptive.”146 The FTC did not respond when Facebook tracked users who were logged out of their Facebook accounts,147 or when Facebook unveiled Timeline, which shared information in new, intrusive ways.148

A. Solution: Enhanced Enforcement

The FTC has tools to respond to law-breakers, particularly once companies have entered the types of agreements Google and Facebook have. In the past, the FTC has relied upon self-regulation. Critics of self-regulation tend to believe it does not work,149 or that it might work too well.150 In a large group of

138 Google Buzz, supra note. 139 David Streitfeld & Kevin O'Brien, Google Privacy Inquiries Get Little Cooperation, NYTimes (May 22, 2012), available at www.nytimes.com/2012/05/23/technology/google-privacy-inquiries-get-little-cooperation.html?_r=1 (noting that Google Street View cars collected “e-mails, photographs, passwords, chat messages, postings on Web sites and social networks — all sorts of private Internet communications…”). 140 David Streitfeld, Data Harvesting at Google Not a Rogue Act, Report Finds, N.Y.T. (Apr. 28, 2012), www.nytimes.com/2012/04/29/technology/google-engineer-told-others-of-data-collection-fcc-report-reveals.html?_r=2&smid=tw-nytimes&seid=auto. 141 Steven Musil, Google faces new investigations over Safari tracking, CNET (Mar. 15, 2012), news.cnet.com/8301-1023_3-57398571-93/google-faces-new-investigations-over-safari-tracking/. See also Jonathan Mayer, Safari Trackers, Web Policy (Feb. 17, 2012), www.webpolicy.org/2012/02/17/safari-trackers/. 142 See generally EPIC: EPIC v. FTC (Enforcement of the Google Consent Order), at epic.org/privacy/ftc/google/consent-order.html. 143 Brian Womack & Todd Shields, Google Gets Maximum Fine After ‘Impeding’ Privacy Probe, BLOOMBERG (Apr. 16, 2012), www.bloomberg.com/news/2012-04-15/fcc-seeks-25-000-fine-from-google-in-wireless-data-privacy-case.html. 144 See Geoff Duncan, Google’s $22.5 million FTC penalty is not enough, Digital Trends (Jul. 12, 2012), available at www.digitaltrends.com/mobile/googles-22-5-million-ftc-penalty-is-not-enough-heres-why/ (noting that the fine is “Half a day’s pay,” and arguing that “it’s hard to believe any company trying to compete with Google or Facebook will consider dodgy privacy practices anything more than a minor cost of doing business.”). 145 Somini Sengupta, F.T.C. Settles Privacy Issue at Facebook, NYTimes (Nov. 29, 2011), available at www.nytimes.com/2011/11/30/technology/facebook-agrees-to-ftc-settlement-on-privacy.html. 146 Ibid. 
147 See Jennifer Valentino-DeVries, Facebook Defends Getting Data From Logged-Out Users, WSJ Blogs (Sept. 26, 2011), available at blogs.wsj.com/digits/2011/09/26/facebook-defends-getting-data-from-logged-out-users/. 148 See EPIC Comments, supra note. 149 See e.g., Comments of Chris Hoofnagle before the Department of Commerce, Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework (Jan. 28, 2011), available at http://www.ntia.doc.gov/comments/101214614-0614-01/attachments/hoofnagle_doc_comments.pdf.

companies, in which no individual contribution or lack of participation makes a difference, it is unlikely that a solution will emerge without coercion or exogenous factors.151 This has been the case with privacy self-regulation, where initiatives have often stalled or failed.152 For example, the FTC agreement on self-regulation for behavioral ad targeting and delivery failed.153 Perhaps the FTC fears that if it litigated a case and lost, it would erode its authority. If so, it should request that Congress pass more robust legislation empowering it to punish wrong-doers, and Congress should do so. Perhaps FTC Commissioners are hindered by the lack of available technology.154 Perhaps FTC Commissioners, many of whom come from or go to the corporate world,155 are concerned

150 FTC Commissioner Thomas Rosch voiced the second concern, noting that although certain best practices are desirable, firms should not be mandated to adopt them because “large, well-entrenched firms engaging in ‘self-regulation’ [could] dictate what the privacy practices of their competitors should be.” See Statement of Commissioner J. Thomas Rosch, Dissenting in Part, Internet Privacy: The Views of the FTC, FCC, and NTIA, Testimony before the House Subcommittee on Commerce, Manufacturing, and Trade and House Subcommittee on Communications and Technology of the House Committee on Energy and Commerce (July 14, 2011), available at http://www.ftc.gov/os/2011/07/110714roschdissentingstatement.pdf. 151 MANCUR OLSON, THE LOGIC OF COLLECTIVE ACTION 44 (1968). 152 See generally Robert Gellman and Pam Dixon, supra note. Another example is the Network Advertising Initiative, an FTC-supported effort. In one study, only 11% of participants were able to determine the function of the NAI opt-out website. Aleecia McDonald et al., Americans’ Attitudes About Internet Behavioral Advertising Practices, Proceedings of the 9th Workshop on Privacy in the Electronic Society (WPES) (October 4, 2010). The FTC found that the NAI’s “[c]urrent membership constitutes over 90% of the network advertising industry in terms of revenue and ads served,” and concluded that “only legislation can compel the remaining 10% of the industry to comply with fair information practice principles. Self-regulation cannot address recalcitrant and bad actors, new entrants to the market, and drop-outs from the self-regulatory program.” FEDERAL TRADE COMMISSION, Online Profiling: A Report to Congress: Part 2 Recommendations (July 2000), http://www.ftc.gov/os/2000/07/onlineprofiling.pdf. Another example is the Platform for Privacy Preferences (P3P), a self-regulatory mechanism for websites to communicate privacy policies to user agents. Lorrie Faith Cranor, Web Privacy with P3P, http://p3pbook.com/papers.html. 
Thousands of websites use P3P compact policies to misrepresent their privacy practices. Pedro Giovanni Leon et al., Token Attempt: The Misrepresentation of Website Privacy Policies Through the Misuse of P3P Compact Policy Tokens, Tech. Rep. 10-014, Carnegie Mellon University, CyLab (2010). 153 “…The agreement and the related self-regulatory body – called the Network Advertising Initiative or NAI – have failed to protect consumers and have failed to self-regulate the behavioral targeting industry.” See Pam Dixon, The Network Advertising Initiative: Failing at Consumer Protection and at Self-Regulation (Nov. 2, 2007), at 2, available at www.worldprivacyforum.org/pdf/WPF_NAI_report_Nov2_2007fs.pdf. 154 See generally Peter Maass, How a grad student scooped the government and what it means for online privacy, Salon (Jun. 28, 2012), www.salon.com/2012/06/28/how_a_lone_grad_student_scooped_the_government_and_what_it_means_for_your_online_privacy_salpart/ (“The desktop in their office is digitally shackled by security filters that make it impossible to freely browse the Web. Crucial websites are off-limits, due to concerns of computer viruses infecting the FTC’s network, and there are severe restrictions on software downloads…Only one FTC official has an unfiltered desktop.”). But see Kashmir Hill, The FTC, 'Your Privacy Watchdog,' Does Have Some Teeth, Forbes (Jun. 29, 2012), available at www.forbes.com/sites/kashmirhill/2012/06/29/your-privacy-watchdog-does-have-some-teeth/ (defending the FTC and taking issue with the Salon article). 155 Former government employees frequently provide expert policy advice. See Kevin T. McGuire, Lobbyists, Revolving Doors and the U.S. Supreme Court, 16 J.L. & POL. 113, 120 (2000) (“[I]n the world of pressure politics, policy-makers reward those representatives who provide them with the types of reliable information that enable them to advance their respective goals.”). See generally JOHN P. 
HEINZ ET AL., THE HOLLOW CORE: PRIVATE INTERESTS IN NATIONAL POLICY MAKING 22 (1993), a study based on interviews with 300 interest groups, 800 lobbyists, and 300 government officials who seek to influence federal policy. See also Mike Masnick, Mapping Out The Revolving Door Between Gov’t And Big Business In Venn Diagrams, Techdirt (Dec. 22, 2011), available at www.techdirt.com/articles/20111221/17561617164/mapping-out-revolving-door-between-govt-big-business-venn-diagrams.shtml. I have examined this pattern as it relates to the Supreme Court. See Matthew Sundquist, Learned in Litigation: Former Solicitors General in the Supreme Court Bar, 5 Charl. L. Rev. 59 (2010).

about future job prospects.156 If that is the case, candidates less concerned about their post-Commission professional prospects could be appointed.157 Perhaps the FTC is under-staffed. If so, it could request a larger staff.158 FTC Commissioners may genuinely believe in unbridled capitalism, and worry that robust fines or regulations will discourage innovation or competition.159 Regardless, as the World Privacy Forum points out, there is an unfortunate reality: “companies that are the target of Commission actions know that the penalties are often weak in comparison to the profits, and that it is more cost-effective to exploit consumers today and say that they are sorry tomorrow if they are caught.”160 The FTC has broad powers to investigate cases, bring charges against companies, and punish law-breakers.161 The FTC Policy Statement on Deception says deception is a representation, omission, or practice that is likely to mislead consumers acting reasonably, to the consumer’s detriment.162 Section 5 of the FTC Act stipulates that “unfair or deceptive acts or practices in or affecting commerce, are hereby declared unlawful.”163 If a user is led to believe something, and it turns out the user has been misled, the FTC can bring a Section 5 action. The FTC can also assess penalties of $16,000 per day, per violation, for violating a consent decree of the type Facebook and Google have entered.164 And though courts have limited the government from imposing excessively large fines,165 such fines can be justified if they are necessary to deter future misconduct.166 The repeated privacy violations affecting millions of Google and Facebook users could justify levying substantial fines of the type that would attract companies’ attention. One can imagine businesses reacting by accusing the FTC of unprecedented anti-business practices, stifling creativity, or not understanding technology. But breaking the law warrants punishment.
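To give a sense of the leverage the FTC already holds, the statutory arithmetic can be sketched as follows. The violation counts and durations below are hypothetical inputs, and whether each affected user counts as a separate violation is a contested legal question; the sketch only shows how quickly a $16,000 per-violation, per-day ceiling compounds:

```python
# Hypothetical illustration of FTC civil penalty exposure under a consent
# decree: $16,000 per violation, per day. The inputs below are invented.
DAILY_PENALTY = 16_000

def max_exposure(violations: int, days: int) -> int:
    """Maximum statutory penalty for a continuing decree violation."""
    return DAILY_PENALTY * violations * days

# One practice treated as a single violation, continuing for 30 days:
print(max_exposure(violations=1, days=30))          # $480,000
# The same practice treated as one violation per million affected users:
print(max_exposure(violations=1_000_000, days=30))  # $480 billion
```

Even on the conservative reading, the exposure dwarfs the $25,000 FCC fine discussed earlier; on the aggressive reading it would exceed annual revenue.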

B. Coalition Solutions

156 For example, Robert Pitofsky is Of Counsel to Arnold & Porter LLP; Timothy Muris is Of Counsel to Kirkland & Ellis LLP; Pamela Jones Harbour is a partner at Fulbright & Jaworski LLP; Deborah Platt Majoras is the CLO at Procter & Gamble; Thomas Leary is Of Counsel to Hogan Lovells; Mozelle W. Thompson is the CEO of Thompson Strategic Consulting. 157 Officials elsewhere in the government have sought to reduce the revolving-door pattern by extending the no-lobbying period. See the Close the Revolving Door Act of 2010, S. 3272 (111th). The White House could look outside the corporate world for regulatory candidates, examining policy experts, advocates, scholars, and others less interested in a corporate job after their tenure. Congress could ban former regulators and staffers from lobbying, advocating, consulting, or representing companies governed by the agency they worked for, either indefinitely or for 5-10 years. 158 See Salon, supra note (“The mismatch between FTC aspirations and abilities is exemplified by its Mobile Technology Unit, created earlier this year to oversee the exploding mobile phone sector. The six-person unit consists of a paralegal, a program specialist, two attorneys, a technologist and its director, Patricia Poss.”). 159 See e.g. FTC Report: To Promote Innovation: The Proper Balance of Competition and Patent Law and Policy (Oct. 2003), at 1, available at www.ftc.gov/os/2003/10/innovationrpt.pdf (“Competition through free enterprise and open markets is the organizing principle for most of the U.S. economy. Competition among firms generally works best to achieve optimum prices, quantity, and quality of goods and services for consumers.”). 160 World Privacy Forum, supra note, at 2. See also Salon article, supra note (Stanford graduate student Jon Mayer notes that tech companies “know that as long as they stay within lax boundaries, it’s unlikely the FTC will bring enforcement actions against them.”). 
161 See FEDERAL TRADE COMMISSION, A Brief Overview of the Federal Trade Commission's Investigative Law and Enforcement Authority, (Jul. 2008), www.ftc.gov/ogc/brfovrvw.shtm. 162 FTC Deception Policy Statement, 103 F.T.C. 174 (1984), http://www.ftc.gov/bcp/policystmt/ad-decept.htm. 163 Federal Trade Commission Act § 5, 15 U.S.C. § 45(a)(1) (2006). 164 Debt Collection Improvement Act of 1996, Pub. L. 104-134, § 31001(s) (amending the Federal Civil Penalties Inflation Adjustment Act of 1990, 28 U.S.C. § 2461 note), and Federal Trade Commission Rule 1.98, 16 C.F.R. § 1.98, 74 Fed. Reg. 857 (Jan. 9, 2009). 165 United States v. Hosep Krikor Bajakajian, 524 U.S. 321 (1998) (holding that imposed fine was unconstitutional under the Eighth Amendment). 166 BMW of North America, Inc. v. Gore, 517 U.S. 559 (1996).


Having reviewed some of the reasons why legislation and self-regulation are unlikely to be sufficiently successful tactics for privacy protection, and having explained why the FTC can serve as a successful but necessarily limited agent for privacy enforcement, the following section considers another strategic approach. Stakeholders in business, technology, government, and consumer protection have advocated better privacy, individually or in groups, and have created privacy frameworks through standardized agreements. None is perfect, but they are a good start. In essence, there are two distinct problems to address. First, how should we lobby Congress, corporations, and other policymakers to implement and enforce meaningful privacy policies? Second, in the absence of this solution, or perhaps as a supplement, how can we promote effective behavior among users and businesses? Education is a crucial factor, and advocacy must come from all stakeholders. On the advocacy front, a handful of alliances of government, industry, and advocacy groups have defined best practices, supported responsible data usage, and advocated privacy in the cloud.167 Each called for ECPA reforms. Government-led coalitions have already begun to leverage their organizational capacity.168 Cisco, SAP, EMC, and others have embraced an Open Cloud Manifesto supporting standardization based on customer requirements.169 The cloud computing industry has created semi-standardized privacy policies and practices in the form of End User License Agreements

167 Consumer Protection in Cloud Computing Services: Recommendations for Best Practices from a Consumer Federation of America Retreat on Cloud Computing (Nov. 30, 2010), available at http://www.consumerfed.org/pdfs/Cloud-report-2010.pdf. Cloud2 was an industry group formed at the encouragement of the Federal Chief Information Officer and the U.S. Department of Commerce. See also Tech America, Summary Report of the Commission on the Leadership Opportunity in U.S. Deployment of the Cloud (CLOUD2), www.techamericafoundation.org/content/wp-content/uploads/2011/07/CLOUD2_Summary.pdf; see also Burr, supra note, discussing the Digital Due Process Coalition; see also Computer & Communications Industry Association, Public Policy for the Cloud: How Policymakers Can Enable Cloud Computing (Jul. 2011), www.ccianet.org/CCIA/files/ccLibraryFiles/Filename/000000000528/CCIA%20-%20Public%20Policy%20for%20the%20Cloud.pdf; Industry Recommendations on the Orientation of a European Cloud Computing Strategy (Nov. 2011), at 1, available at www.ec.europa.eu/information_society/activities/cloudcomputing/docs/industryrecommendations-ccstrategy-nov2011.pdf (“This report presents recommendations from a select industry group to the European Commission on the orientation of a Cloud computing strategy for Europe, and proposes some actions for the European Commission and Industry to achieve this.”). 168 The White House has advocated for a Consumer Privacy Bill of Rights, identifying “…a need for transparency to individuals about how data about them is collected, used, and disseminated and the opportunity for individuals to access and correct data that has been collected about them.” See White House Report, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (Feb. 2012), at 13, available at www.whitehouse.gov/sites/default/files/privacy-final.pdf. 
The National Strategy for Trusted Identities in Cyberspace (NSTIC) is another White House initiative to work with companies, advocacy groups, and agencies to improve online privacy. The Strategy calls for interoperable technology with which people, companies, and technology can be authenticated. The idea is to create a system wherein individuals could securely validate their identities when necessary or stay anonymous if they choose. See NSTIC, About NSTIC, www.nist.gov/nstic/about-nstic.html. Identity Commons was originally created in 2001 to support an identity infrastructure. The identity validation program will pilot on federal websites with AOL, Google, PayPal, Verizon, and Yahoo. See OPEN IDENTITY EXCHANGE, An Open Market Solution for Online Identity Assurance, www.openidentityexchange.org/sites/default/files/oix-white-paper-2010-03-02.pdf. 169 OPEN CLOUD MANIFESTO (Spring 2009), www.opencloudmanifesto.org/Open%20Cloud%20Manifesto.pdf; see generally Patrícia Takako Endo et al., A Survey on Open-source Cloud Computing Solutions, presented at High Performance Computing and Communications (HPCC), 2011 IEEE 13th International Conference, Sept. 2-4, 2011, available at sbrc2010.inf.ufrgs.br/anais/data/pdf/wcga/st01_01_wcga.pdf. Amazon, Google, Microsoft, and Salesforce.com did not join, demonstrating how far away industry agreement may be.


(EULAs),170 Terms of Service (TOS), or Service Level Agreements (SLAs). These may be informative,171 but they are infrequently read and difficult to understand.172 Information security management best practices have also been defined by ISO/IEC 27001 and 27002, the information security management system (ISMS) standards,173 though they remain imperfect.174 For these groups to be successful, they will need to find broad areas of agreement where they can pursue specific, tangible goals, as the coalition opposing SOPA did.

C. Lessons from the Collaboration Against SOPA

The multi-stakeholder process to stop SOPA is a useful example for a privacy coalition.175 SOPA would have made internet service providers responsible for filtering copyright-infringing material, targeting those who “enable or facilitate” copyright infringement. Commentators argued that Google, YouTube, and other sites could be blocked. Some claimed it would lead to an “Internet blacklist,” or a “great firewall of America.”176 The deck was stacked in favor of SOPA. Industry repeat players enjoy better financing, established organization, and superior institutional knowledge and relationships.177 As the President of the Computer and Communications Industry Association pointed out, “if you are a member of the Judiciary Committee, year after year after year, the content industry has been at your fundraisers over and over.”178 Organizations supporting SOPA had given 450% as much money to members of Congress as organizations in opposition.179 Indeed, Representative Lamar Smith, the bill’s sponsor, called five supportive witnesses before the House Judiciary Committee and only one opposition witness.180 The Center for Democracy and Technology and the Electronic Frontier Foundation had been initial opponents, but soon more stakeholders joined a coalition organizing “American Censorship Day,” supported by Mozilla, Wikimedia, and others.181 Google, AOL, and Facebook criticized SOPA in a full-page New York Times ad.182 The Twitter hashtag #DontBreakTheInternet trended upwards, and 87,000 people called Congress to voice their opposition. California Democrat Zoe Lofgren organized meetings for opponents of the legislation. President Obama then publicly opposed SOPA.183 It failed to pass.

170 Jens Grossklags & Nathan Good, Empirical studies on software notices to inform policy makers and usability designers, Proceedings of the 11th International Conference on Financial Cryptography and 1st International Conference on Usable Security (2007). 171 Robert W. Gomulkiewicz & Mary L. Williamson, A Brief Defense of Mass Market Software License Agreements, 22 RUTGERS COMP. & TECH. L.J. 335 (1996). 172 Balachandra Reddy Kandukuri et al., Cloud Security Issues, 2009 IEEE International Conference on Services Computing (2009), available at www.mendeley.com/research/cloud-security-issues/. 173 See COMPUTER WEEKLY, Security Zone: Promoting accountability through ISO/IEC 27001 & 27002 (formerly ISO/IEC 17799) (Dec. 2008), www.computerweekly.com/feature/Security-Zone-Promoting-accountability-through-ISO-IEC-27001-27002-formerly-ISO-IEC-17799. 174 See ISO/IEC 27002, http://www.iso27001security.com/html/27002.html. 175 H.R. 3261, 112th Congress (2011-2012). 176 Will Oremus, supra note. 177 See Marc Galanter, Why the “Haves” Come Out Ahead: Speculations on the Limits of Legal Change, 9 LAW & SOC’Y REV. 95, 97 (1974). 178 Jennifer Martinez, Shootout at the digital corral, POLITICO (Nov. 16, 2011), www.politico.com/news/stories/1111/68448_Page4.html. 179 MAPLIGHT, H.R. 3261 - Stop Online Piracy Act (accessed Dec. 11, 2011), www.maplight.org/us-congress/bill/112-hr-3261/1019110/total-contributions.table. 180 Supra note 121. 181 See e.g., fightforthefuture.org; http://americancensorship.org. 182 See Cory Doctorow, Internet giants place full-page anti-SOPA ad in NYT, BOINGBOING (Nov. 16, 2011), www.boingboing.net/2011/11/16/internet-giants-place-full-pag.html. 183 John Gaudiosi, Obama Says So Long SOPA, Killing Controversial Internet Piracy Legislation, FORBES (Jan. 16, 2012), www.forbes.com/sites/johngaudiosi/2012/01/16/obama-says-so-long-sopa-killing-controversial-internet-piracy-legislation/.


SOPA showed a moment of unity, but in privacy, everyone has varied interests. Consumers have different views of privacy than businesses and governments. Opposing legislation is quite different from formulating ideas and advocating policy positions or legislation. However, as this group and groups like the Future of Privacy Forum and the Digital Due Process Coalition demonstrate,184 there are areas where stakeholders can work together. Social media is empowering in this regard, as is calling Congress, signing petitions,185 and, on an individual level, filing complaints with the FTC,186 the FCC,187 and, depending on your state of residency, your Attorney General or Governor.188 I file as often as I find privacy infringements or misleading terms or policies, and I encourage others to do likewise.

D. Conclusion

In the short term, education orients users to privacy practices and to whether their expectations are realistic, in tune with the law, and enforced. Advocacy groups have written helpful educational materials.189 A WikiMedia project uses crowdsourcing to apply four icons that explain websites’ privacy practices.190 An iPhone tracker visualizes what information can be gleaned from the files on your phone.191 “Take This Lollipop” is a short video that, using Facebook Connect, depicts a crazed man stalking you on Facebook.192 Pleaserobme.com combines information from Foursquare.com and Twitter to identify when people have willingly provided their location information. Ghostery blocks cookies and displays which cookies have tracked you.193 The Electronic Frontier Foundation has a project to demonstrate to users all the information computers transmit to websites.194 The FTC has shown exceptional energy in educating consumers, leading industry discussions, and advocating that companies promote privacy.195 Once society understands and is eager to fix these problems, we can set off fire alarms, putting our representatives on notice that we value the social contract and that privacy is indeed a cherished value.

184 The Future of Privacy Forum is a D.C.-based think tank that brings together privacy advocates from academia, technology, business, and consumer protection. See Future of Privacy Forum, available at http://www.futureofprivacy.org/. The Digital Due Process Coalition is a group of business, government, and advocacy groups organized by the Center for Democracy and Technology to advocate amending the ECPA. See Digital Due Process, available at www.digitaldueprocess.org/index.cfm?objectid=37940370-2551-11DF-8E02000C296BA163. 185 Issue-specific petitions have been compiled in this vein. See e.g., Not Without a Warrant, available at https://notwithoutawarrant.com (advocating amending the ECPA). 186 See FTC, “Complaint Assistant,” available at https://www.ftccomplaintassistant.gov. 187 See FCC, Consumer & Governmental Affairs Bureau, “File a Complaint,” available at http://esupport.fcc.gov/complaints.htm. 188 See e.g. State of California Department of Justice, Office of the Attorney General, “Consumer Alerts, Information & Complaints,” available at http://oag.ca.gov/consumers/general. California is a particularly good state for filing, as the California Consumers Legal Remedies Act allows consumers to claim damages. See California Civil Code §§ 1750 et seq. 189 See e.g., CENTER FOR DEMOCRACY AND TECHNOLOGY, Getting Started: Web Site Privacy Policies, https://www.cdt.org/privacy/guide/start/privpolicy.php; see also PRIVACY RIGHTS CLEARINGHOUSE, Online Privacy: Using the Internet Safely (revised 2012), https://www.privacyrights.org/fs/fs18-cyb.htm. 190 See DISCONNECT ME, http://disconnect.me/db/. 191 See iPhone Tracker, petewarden.github.com/iPhoneTracker/. 192 See TAKE THIS LOLLIPOP, www.takethislollipop.com. In general, Facebook applications can access an incredible amount of information, including your inbox and friend lists and requests. See FACEBOOK, Permissions, www.developers.facebook.com/docs/reference/api/permissions/. 
193 Similar programs have minimal effectiveness. See Jonathan Mayer, Tracking the Trackers: Self Help Tools, WEB POLICY (Sep. 13, 2011), webpolicy.org/2011/09/13/tracking-the-trackers-self-help-tools/ (Reviewing tracking and blocking tools and concluding that “Most desktop browsers currently do not support effective self-help tools.”). 194 See Panopticlick, https://panopticlick.eff.org/. 195 See generally FTC Report, Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers (Mar. 2012), at 14. Available at www.ftc.gov/os/2012/03/120326privacyreport.pdf.


Kinakuta, a fictional island in the science fiction novel Cryptonomicon, is used to traffic data outside legal regulations.196 A large corporation with the willpower and financing could theoretically create a floating data center beyond government reach or user protection.197 Given the legal gray area surrounding founding countries198 and data storage in space,199 it is not inconceivable that a corporation or group of individuals could create a real-world Kinakuta, where information that threatens or violates our privacy could be gathered, processed, and exploited. However, corporations need not resort to the safety of clandestine islands: privacy violations happen on our own shores, but quietly, secretly, and beyond the scope of challenge or knowledge. Or they occur brazenly, in the open, when laws are vague enough or poorly enforced enough that companies and the government need not establish a haven. Their havens are ignorance, obfuscation, secrecy, complacency, and confusion.

196 Neal Stephenson, CRYPTONOMICON (1999). In a real-world comparison, an abandoned oil rig off the coast of England, “Sealand,” nearly became a data center targeting customers looking for complete freedom from government. Learn more about Sealand at www.sealandgov.org. See generally James Grimmelmann, Sealand, HavenCo, and the Rule of Law, 2012 U. Ill. L. Rev. 405. 197 http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2456/2171. 198 PATRI FRIEDMAN, SEASTEADING: A PRACTICAL GUIDE TO HOMESTEADING THE HIGH SEAS (beta version 2009), http://seasteading.org/seastead.org/book_beta/full_book_beta.pdf; JEROME FITZGERALD, SEA-STEADING: A LIFE OF HOPE AND FREEDOM ON THE LAST VIABLE FRONTIER (2006). 199 Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, Including the Moon and Other Celestial Bodies, Jan. 27, 1967, 18 U.S.T. 2410; MYRES S. MCDOUGAL ET AL., LAW AND PUBLIC ORDER IN SPACE (1963); GEORGE S. ROBINSON & HAROLD M. WHITE, JR., ENVOYS OF MANKIND: A DECLARATION OF FIRST PRINCIPLES FOR THE GOVERNANCE OF SPACE SOCIETIES (1986); Barton Beebe, Note, Law’s Empire and the Final Frontier: Legalizing the Future in the Early Corpus Juris Spatialis, 108 YALE L.J. 1737 (1999).
