Protecting Privacy, the Social Contract, and the Rule of Law in the Virtual World

Matthew Sundquist∗

Introduction and Framework Explanation
I. Valuing Privacy and Determining When to Respond
II. Compounding Privacy Problems: Rational Choice Theory and Technological Growth
III. Legal Guidance for Privacy
   A. Precedents
   B. Statutory Guidance
   C. Analysis
   D. Looking Ahead
IV. Case Study of FTC Enforcement
   A. Solution: Enhanced Enforcement
   B. Coalition Solutions
   C. Lessons from the Collaboration against SOPA
   D. Conclusion

A host of laws and regulations engage with the legal,1 technological,2 and social3 meanings4 of privacy. In a country of more than 300 million people and plentiful law enforcement officers, there is likely to be abusive behavior. As a result, we are awash in claims about the definition, function, and value of privacy; we are awash in potential threats to privacy; and, increasingly, we are awash in debates about how to fashion remedies to address these issues. It is impossible to survey the entire landscape of privacy dilemmas or threats, or to extract a representative sample of privacy policies or dilemmas. This paper does not attempt to provide a systematic theoretical account of privacy and technology, nor does it outline a typology of circumstances in which privacy might be threatened or abused by private or public entities. My paper advances a general framework for when a legal or social response to a privacy threat is appropriate.

The emergent areas I survey demonstrate the utility and application of my approach. This paper is in four sections. I begin by introducing a framework for prioritizing when a virtual or online practice, law, or regulatory deficiency warrants a legal or social response. First, a practice that violates the law should be prosecuted. Second, where privacy laws are ineffectually enforced, we should be on heightened alert. Third, we need a response when a practice violates valued social expectations of how information should flow.5 The first two conditions are important because updating and enforcing our laws in light of technological change is crucial to the maintenance of the social contract. The third condition is important because many of our expectations about information and privacy developed when tracking at the scale now practiced by the government and businesses was impossible. Now, Google and Facebook have hundreds of millions of users. Around 1,271 government organizations and 1,931 private companies work on counterterrorism, homeland security, and intelligence in the U.S.6 Information often flows based on what is technologically possible rather than based on norms or laws.7

In the second section, I examine how technology has allowed more information about people to be gathered and stored online.8 The 92% of online adults using search engines daily, 92% using email daily, and 60% of adults doing both on a typical day leave a large digital trail.9 Businesses and government can quickly create and begin to rely on a new practice, leading to claims that they could not function without that practice.10 Technology has, as Amazon founder Jeff Bezos explained, begun “eliminating all the gatekeepers” for companies and technical practices.11 Technology has enabled the U.S. government to discover and learn more about Americans and citizens abroad.12 The NSA can intercept and download electronic communications equivalent to the contents of the Library of Congress every six hours.13 Government analysts annually produce 50,000 intelligence reports.14 While economic theory suggests a rational capacity to process the stream of privacy threats and trade-offs we face, people simply cannot process everything about privacy or make rational privacy decisions.15 As a result of all these factors, regulatory inaction or a lack of regulations allows more activity—and potentially invasions of our privacy—to happen faster and at a larger scale. Even as courts and Congress have addressed some questions involving the relationship between evolving technology and privacy, including Constitutional questions, they have avoided others. The

∗ Matthew Sundquist is a graduate of Harvard College. He is the Privacy Manager at Inflection and a Student Fellow of the Harvard Law School Program on the Legal Profession. This paper represents the opinion of the author. It is not meant to represent the position or opinions of Inflection. For their thoughtful advice and suggestions, I am grateful to Allison Anoll, Beth Givens, Bob Gellman, Bruce Peabody, Christopher Wolf, Erik Jones, Jacqueline Vanacek, James Grimmelmann, and Orin Kerr. For their support in writing this paper and friendship, I am grateful to Brian and Matthew Monahan.

1 The term legal has different definitions in relation to privacy.
• For a discussion of privacy as a principle in law, see Julie M. Carpenter & Robin Meriweather, Brief Amicus Curiae of the DKT Liberty Project in support of Petitioner, Kyllo v. United States, 533 U.S. 27 (2001); Ken Gormley, One Hundred Years of Privacy, 4 Wisconsin Law Review 1335 (1992); Samuel D. Warren and Louis D. Brandeis, The Right to Privacy, 4 Harvard L. R. 193 (1890) (describing privacy as the “right to be let alone.”).
• Privacy is also covered by statutory law. See e.g., HIPAA (Health Insurance Portability and Accountability Act); Sarbanes–Oxley Act; Gramm–Leach–Bliley Act; Payment Card Industry Data Security Standard (PCI DSS); Fair and Accurate Credit Transaction Act (FACTA), including Red Flags Rule; Electronic Fund Transfer Act, Regulation E (EFTA); Stored Communications Act; Federal disclosure laws; e-discovery provisions in the Federal Rules of Civil Procedure; Free and Secure Trade Program (FAST); Fair Credit Reporting Act (FCRA); Right to Financial Privacy Act; Family Educational Rights and Privacy Act; Customer Proprietary Network Information rules; Children’s Online Privacy Protection Act; Cable Act; Video Privacy Protection Act; Driver’s Privacy Protection Act; Title 39 of the Code of Federal Regulations, which deals with the USPS; Section 5 of the Federal Trade Commission Act. This does not include state laws.
• The Supreme Court has repeatedly set precedents on privacy. See e.g., Ex parte Jackson, 96 U.S. 727 (1878); Meyer v. Nebraska, 262 U.S. 390 (1923); Olmstead v. United States, 277 U.S. 438 (1928); Griswold v. Connecticut, 381 U.S. 479 (1965); Katz v. United States, 389 U.S. 347 (1967); Stanley v. Georgia, 394 U.S. 557 (1969); Kelley v. Johnson, 425 U.S. 238 (1976); Moore v. City of East Cleveland, 431 U.S. 494 (1977); Lawrence v. Texas, 539 U.S. 558 (2003). See generally Laura K. Donohue, Anglo-American Privacy and Surveillance, 96 J. Crim. L. & Criminology 1059 (2006); William Knox, Some Thoughts on the Scope of the Fourth Amendment and Standing to Challenge Searches and Seizures, 40 Mo. L. Rev. 1 (1975).

2 Technology has created intriguing privacy problems. See e.g., Horst Feistel, Cryptography and Computer Privacy, 228 SCIENT. AM. 15 (1973) (exploring enciphering and origin authentication as a means of protecting systems and personal data banks); Rakesh Agrawal & Ramakrishnan Srikant, Privacy-Preserving Data Mining, Proceedings of the 2000 ACM SIGMOD International Conference on Management of Data (attempting to “develop accurate models without access to precise information in individual data records”); Latanya Sweeney, k-Anonymity: A Model for Protecting Privacy, 10 Intern. J. on Uncertainty, Fuzziness and Knowledge-based Systems 557 (2002) (explaining means for anonymizing data to protect privacy).

3 Scholars have often advocated balanced frameworks for interpreting and protecting privacy. See Helen Nissenbaum, A Contextual Approach to Privacy Online, 140 Daedalus 32, 33 (2011) (arguing that our experiences and social surroundings formulate our expectations for information use and privacy); JOHN PALFREY & URS GASSER, BORN DIGITAL 7 (2008) (arguing that the younger generation of technology users, due to their frequent and early adoption of technology, has different conceptions of privacy than their parents or grandparents); see generally ALAN WESTIN, PRIVACY AND FREEDOM (1967) (arguing that privacy is comprised of intimacy, anonymity, solitude, and reserve); JUDITH DECEW, IN PURSUIT OF PRIVACY: LAW, ETHICS, AND THE RISE OF TECHNOLOGY (1997) (arguing that privacy entails informational privacy, accessibility privacy, and expressive privacy); Jerry Kang, Information Privacy in Cyberspace Transactions, 50 Stan. L. R. 1193 (1997) (arguing for privacy in our physical space, choice, and flows of information).

4 I examine privacy of the personal information we create directly by communicating and completing forms, contracts, and documents, as well as the information we create indirectly by using browsers, carrying phones with geo-tracking, and purchasing or using products and services. I focus on the assurances we receive about this information and whether they are complied with. See generally Office of Management and Budget Memorandum, Safeguarding Against and Responding to the Breach of Personally Identifiable Information (May 2007) (defining personally identifiable information and when and how to protect the information).

5 See generally Nissenbaum, supra note.

6 See Dana Priest & William Arkin, A hidden world, growing beyond control, WASH. POST, available at projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-growing-beyond-control/.

7 Ibid, at 34.

8 See generally Sweeney, supra note, at 557 (“Society is experiencing exponential growth in the number and variety of data