Five Privacy Principles (from the GDPR) the United States Should Adopt To Advance Economic Justice

Michele E. Gilman*

ABSTRACT

Algorithmic profiling technologies are impeding the economic security of low-income people in the United States. Based on their digital profiles, low-income people are targeted for predatory marketing campaigns and financial products. At the same time, algorithmic decision-making can result in their exclusion from mainstream employment, housing, financial, health care, and educational opportunities. Government agencies are turning to algorithms to apportion social services, yet these algorithms lack transparency, leaving thousands of people adrift without state support and not knowing why. Marginalized communities are also subject to disproportionately high levels of surveillance, including facial recognition technology and the use of predictive policing software.

American privacy law is no bulwark against these profiling harms, instead placing the onus of protecting personal data on individuals while leaving government and businesses largely free to collect, analyze, share, and sell personal data. By contrast, in the European Union, the General Data Protection Regulation (GDPR) gives EU residents numerous, enforceable rights to control their personal data. Spurred in part by the GDPR, Congress is debating whether to adopt comprehensive privacy legislation in the United States. This article contends that the GDPR contains several provisions that have the potential to limit digital discrimination against the poor, while enhancing their economic stability and mobility. The GDPR provides the following: (1) the right to an explanation about automated decision-making; (2) the right not to be subject to decisions based solely on automated profiling; (3) the right to be forgotten; (4) opportunities for public participation in data processing programs; and (5) robust implementation and enforcement tools. The interests of low-income people must be part of privacy lawmaking, and the GDPR is a useful template for thinking about how to meet their data privacy needs.

* Venable Professor of Law and Director, Saul Ewing Civil Advocacy Clinic, University of Baltimore School of Law; Faculty Fellow, Data & Society. B.A., Duke University; J.D., University of Michigan Law School. Many thanks for feedback to the researchers at Data & Society and the faculties of William & Mary Law School, Pace Law School, and Drexel University School of Law.

ABSTRACT
I. INTRODUCTION
II. THE CLASS DIFFERENTIAL IN DATA PRIVACY HARMS
   A. Digital Discrimination/Electronic Exploitation
   B. Dirty Data and Careless Coding
   C. Surveillance
   D. Conclusion
III. THE GAPS IN AMERICAN PRIVACY PROTECTIONS
   A. Constitution
   B. Privacy Statutes
   C. Notice and Consent
   D. Enforcement
   E. Anti-Discrimination Law
   F. Workplace Protections
IV. FIVE GDPR PRINCIPLES TO ADVANCE ECONOMIC JUSTICE
   A. Right to an Explanation
   B. Right to Object to Automated Profiling
   C. Right To Be Forgotten and Criminal Records
   D. Public Participation
   E. Implementation and Enforcement
V. WHAT ABOUT THE CALIFORNIA CONSUMER PRIVACY ACT?
VI. CONCLUSION

I. INTRODUCTION

On May 25, 2018, citizens of the European Union awoke to a new, robust set of data privacy protections codified in the General Data Protection Regulation (GDPR), which gives them new levels of control over their personal information.1 Meanwhile, Americans rise daily to the same fragmented privacy regime that has failed to forestall a drumbeat of data breaches, online misinformation campaigns, and a robust market in the sale of personal data, usually without their knowledge.2 For instance, in 2017, the credit reporting company Equifax disclosed that hackers had breached its servers and stolen the personal data of almost half the United States’s population.3 In 2018, we learned that Cambridge Analytica harvested the data of over 50 million Americans through personality quizzes on Facebook in order to target voters with political advertisements for the Trump campaign.4 And, as consumers are bombarded with advertisements based on their internet searches or remarks captured by digital assistants such as Amazon’s Alexa, Americans are increasingly realizing that their online and offline behavior is being tracked and sold as part of a massive, networked data-for-profit and surveillance system.5

The American privacy regime is largely based on a notice and consent model that puts the onus on individuals to protect their own privacy.6 The model is not working. In the absence of congressional action, some states have enacted laws to protect the data privacy and/or security of their citizens.7 California has enacted the most comprehensive statute, effective January 2020, entitled the California Consumer Privacy Act, which expands transparency about the big data marketplace and gives consumers the right to opt out of having their data sold to third parties.8 Other states may soon follow suit. After years of resistance, Big Tech companies such as Facebook, Google, Microsoft, and Apple are now advocating for a national data protection law, primarily because they are worried about the emergence of varying state standards.9 Moreover, Big Tech is already working to comply with the GDPR for their millions of European consumers.10 In light of these new laws and the American public’s “techlash” against revelations of big data scandals,11 Congress is finally and seriously considering comprehensive privacy legislation.12 While the issue has bipartisan support, proposals vary in their solicitude for corporations versus consumers.13 As these issues are being actively debated, it is essential that the legislative process include the interests of all Americans, and not just elites and industry.

Simply put, digital privacy needs are not the same for all Americans. Based on their digital profiles, low-income people are targeted with marketing campaigns for predatory products such as payday loans and for-profit educational scams.14 At the same time, algorithmic decision-making can result in their exclusion from mainstream employment, housing, financial, and educational opportunities.15 Government agencies are turning to algorithms to apportion public benefits, yet these automated decision-making systems lack transparency, leaving thousands of people adrift without state support and not knowing why.16 Marginalized communities are also subject to high levels of law enforcement surveillance, including the use of

1. Commission Regulation 2016/679, 2016 O.J. (L 119) 1 [hereinafter GDPR].
2. See SARAH E. IGO, THE KNOWN CITIZEN: A HISTORY OF PRIVACY IN MODERN AMERICA 353–55 (2018).
3. Stacy Cowley, 2.5 Million More People Potentially Exposed in Equifax Breach, N.Y. TIMES (Oct. 2, 2017), https://www.nytimes.com/2017/10/02/business/equifax-breach.html [https://perma.cc/8VWZ-L653].
4. Matthew Rosenberg, Nicholas Confessore & Carole Cadwalladr, How Trump Consultants Exploited the Facebook Data of Millions, N.Y. TIMES (Mar. 17, 2018), https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html?module=inline [https://perma.cc/9LVD-9DAG].
5. Lee Rainie, Americans’ Complicated Feelings About Social Media in an Era of Privacy Concerns, PEW RES. CTR. (Mar. 27, 2018), https://www.pewresearch.org/fact-tank/2018/03/27/americans-complicated-feelings-about-social-media-in-an-era-of-privacy-concerns/ [https://perma.cc/8WLL-QL4Y].
6. See Daniel J. Solove, Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1880, 1882–83 (2013).
7. State Laws Related to Internet Privacy, NAT’L CONF. ST. LEGISLATURES (Jan. 27, 2020), http://www.ncsl.org/research/telecommunications-and-information-technology/state-laws-related-to-internet-privacy.aspx#Consumer [https://perma.cc/68V2-C338].
8. CAL. CIV. CODE § 1798.100 (West 2020). For more analysis of the California Consumer Privacy Act, see infra Part IV.
9. See, e.g., Ben Brody & Spencer Soper, Amazon Joins Tech Giants in Backing Federal Privacy Safeguards, BLOOMBERG (Sept.