Designing for Consent


Chris Jay Hoofnagle*

EuCML · Issue 4/2018 · Articles · 162

Abstract: While the General Data Protection Regulation leans away from consent as a mechanism to justify the collection and use of personal data, consent remains a prime mechanism for protecting privacy and limiting marketing in the United States (U. S.). This article summarizes the difficulties businesses face in gaining high-quality consent and sets out the different consent standards in U. S. statutory law. In an exploratory effort, it then shows how companies have designed consent experiences for particularly privacy-invasive devices – home voice assistants and "smart" televisions – through testing and documentation of their function. The article then explains the quality of consent that U. S. law demands in different contexts, and elucidates the limits of consent as a privacy tool in light of the kinds of information-intensive devices that consumers are adopting. This paper is part of an ongoing research project looking into the privacy and security aspects of voice assistants.

1. Introduction

This article gives an overview of how "consent" remains an important mechanism to regulate uses of personal data and control over marketing in the U. S. It also tests a group of popular products to show how product manufacturers have designed for consent, that is, how clever dialogues have been implemented to elicit consent from the user.

There are three main approaches to consent in the U. S.: opt-in and opt-out consent, and "no-opt." In opt-in consent, the consumer must take some affirmative action, even a very subtle one, to accept a collection or use of data. In opt-out, the consumer must take action to object to data collection or use, thus placing the burden on the consumer to act.¹ The third approach typically goes unmentioned: "no opt."² That is, situations where the consumer has no ability to object to information practices.

Consent is a problematic tool to regulate privacy.³ An extensive literature describes the problem space. Companies impose transaction costs to shape consumer choice,⁴ use technology to deny benefits to those who make privacy-preserving choices,⁵ and select defaults that consumers do not want or expect.⁶ At the same time, opt-in consent is problematic because requiring it may violate the commercial free expression rights of companies,⁷ because obtaining consent is transactionally difficult, because consent needs a scaffold of other protections to address downstream uses of data, and because it may be insufficient to protect consumers against sophisticated businesses.⁸ As a result of fierce criticism, consent is disfavored under the General Data Protection Regulation (GDPR).⁹ The GDPR's architects burdened consent to prevent companies from using take-it-or-leave-it approaches.

Rather than rehash those debates, this paper focuses on the reality that consent is still a central approach in the U. S. context. In fact, consent seems to be increasing in importance, as firms need to rely on some form of user authorization to embed tracking in highly-private spaces. For instance, the rise of voice-assistant appliances (Apple's HomePod, Google Home and Amazon Alexa) means that consumers are putting internet-connected microphones in their private homes – indeed even in their bedrooms. Similar surveillance capacity is now present in many "smart" televisions. Part 2 of this article introduces the idea of "quality of consent" in U. S. law. Part 3 then explains how U. S. statutory law regulates in-home voice assistants and smart televisions with consent of varying quality. Part 4 presents an initial survey of how specific device makers have implemented consent requirements in voice assistants and smart televisions. Finally, part 5 concludes.

2. Quality of Consent in U. S. Law

Internet utopians predicted a world of custom-tailored online contracts.¹⁰ In reality, the internet cemented the practice of standard-form, take-it-or-leave-it user agreements. Internet lawyers interpreted consent to mean that the consumer saw these form terms and clicked a resource with the mouse. In fact, American lawyers devote significant attention to the contours of "browsewrap" and "clickwrap" agreements. But whether consent is manifested through visiting a website or the millisecond action of clicking on a box, we know it is a contrived exercise.

In such agreements, it is common for American lawyers to describe choice rules without explaining their implications. A timely example comes from the Facebook–Cambridge Analytica controversy, where Cambridge Analytica was able to obtain the information of millions of people whose friends filled out a personality survey. As Facebook lawyer Paul Grewal explained, "Approximately 270,000 people downloaded the [personality testing] app. In so doing, they [the downloaders] gave their consent for [the application developer] to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it."¹¹ Under this logic, the transaction was consensual because users who did not download the app assumed the risk that others might do so. To reject the risk, users had to understand the implications of privacy policies and numerous settings, and then change those settings. As Professor Daniel Solove has recognized in his treatment of consent, Facebook's style of consent focuses on an up-front, abstract statement of the rules instead of warning the user of the downstream implications of those rules.¹²

If we are serious about quality of consent, we would probe the following factors in transactions:
• Who can consent?
• How informed is the consent?
• How serious is the right or privilege involved?
• What was the voluntariness of the consent?
• What is the scope of the consent?
• What are the implications of consent?
• What are the procedural safeguards in the consent experience?

Table 1: Practical Problems in Achieving High-Quality Consent

Quality: Who consents
Tension: The "secondary" user problem
Example: Alice and Bob have a voice-assistant appliance, the Amazon Alexa, in their kitchen. They invite Mallory over for dinner, who doesn't know about the Alexa.

Quality: Informed
Tension: Services often do not know how they will use personal information, so they cannot inform others
Example: Privacy notices tend to reserve a broad set of future data uses to address unforeseen utility derived from personal data.

Quality: Informed
Tension: Services sometimes do not know what their systems actually do, so they cannot inform others
Example: A website operator could not have consented to collection of personal information by a third-party advertising service because the service expressly claimed it did not collect such data.[1]

Quality: Informed
Tension: Services do not want to tell users very clearly about data practices
Example: Google secretly moved its email scanning appliance from monitoring delivered emails to reading messages before they were delivered, thus converting it into a wiretap.[2]

Quality: Seriousness
Tension: Seriousness depends on downstream uses of data
Example: If data are only used for service delivery, versus building an advertising profile.

Notes

* Chris Hoofnagle is adjunct full professor of information and of law at the University of California, Berkeley, e-mail [email protected]. Paper prepared for the conference "Consumer Law in the Data Economy", sponsored by the Centre for the Study of European Contract Law and the Journal of European Consumer and Market Law, April 13, 2018.
1 Edward J Janger & Paul M Schwartz, 'The Gramm-Leach-Bliley Act, Information Privacy, and the Limits of Default Rules' (2001) 86 Minnesota Law Review 1219.
2 Jeff Sovern, 'Opting In, Opting Out, or No Options at All: The Fight for Control of Personal Information' (1999) 74 Washington Law Review 1033.
3 Daniel J Solove, 'Privacy Self-Management and the Consent Dilemma' (2013) 126 Harvard Law Review 1880.
4 Janger & Schwartz (n 1).
5 See, for the discussion of Google's Safari tracking workaround, Chris Jay Hoofnagle, 'The Federal Trade Commission's Inner Privacy Struggle' in Evan Selinger, Jules Polonetsky & Omer Tene (eds), The Cambridge Handbook of Consumer Privacy (Cambridge University Press 2018) ("If firms are strategic, they will compete to both capture benefits and deny them to competitors. Through this lens, Google's tracking of Safari users could be motivated by a desire to capture benefits from tracking, but also to deny Apple the ability to compete on privacy. Denying Apple the competitive benefit could also affect the psychology of consumers, leading them to think no company can protect privacy.").
6 Douglas Edwards, I'm Feeling Lucky: The Confessions of Google Employee Number 59 (Houghton Mifflin Harcourt 2011) 341 ("We would clearly want to set the default as 'accept Google's cookies.' If we fully explained what that meant to most users, however, they would probably prefer not to accept our cookie. So our default setting would go against users' wishes. Some people might call that evil, and evil made Marissa [then Google executive Marissa Mayer] uncomfortable. She was disturbed that our current cookie-setting practices made the argument a reasonable one. She agreed that at the very least we should have a page telling users how they could delete their cookies, whether set by Google or by some other website.").
7 Sorrell v IMS Health Inc. [2011] 564 US 552 (SCT).
8 Lauren E Willis, 'When Nudges Fail: Slippery Defaults' (2013) 80 University of Chicago Law Review 1155.
9 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data [2016] OJ L 119/1 (General Data Protection Regulation).