FACULTY OF LAW

LUND UNIVERSITY

MAXIMIN ORSERO

UNDERSTANDING THE DATA DIVIDE

BETWEEN THE EU AND THE US

SUPERVISOR: XAVIER GROUSSOT

JAEM03 MASTER THESIS 30 HIGHER EDUCATION CREDITS

EUROPEAN BUSINESS LAW TERM: SPRING 2019

ABSTRACT

This Thesis seeks to give its reader the tools to understand the data privacy divide between the EU and the US. It explains the crucial notions, historical and jurisprudential factors, and regulatory frameworks that underlie and constitute it.

First, it answers why regulating data privacy is paramount to our democratic societies on both sides of the Atlantic. The growing importance of the data-driven economy, whose raw material is our personal data, creates challenges to basic democratic values, for example privacy and the freedom of speech. This Thesis explores the darker side of the digital economy, sometimes referred to as a form of surveillance capitalism. It describes how the advertisement-based business model of some of the most successful internet companies may, if left unregulated, render citizens vulnerable to enhanced forms of influence and manipulation, and weaken essential counter-powers such as dissidents, whistle-blowers and the press.

Second, it answers how the EU and US approaches to regulating data privacy differ. In essence, different historical roots and economic incentives on both sides of the Atlantic explain the difference. The EU has had a painful experience with government surveillance and invasions of privacy, in particular in the former German Democratic Republic. The US, by contrast, has no such history, and its economy has enormously benefited from lax data privacy regulations, allowing it to grow internet giants. As a result, the EU regulates privacy and data protection tightly and enshrines them as fundamental rights, while the US takes a more market-based and light-touch approach by treating data privacy essentially as a subset of consumer protection law.

Third, it answers why the CJEU decided to invalidate the adequacy decision concerning the first attempt at bridging the divide, the Safe Harbor. In summary, this Thesis argues that the Court was trying to give leverage to the EU as the negotiations of the Safe Harbor 2.0 (now Privacy Shield) were nearing their end, in order for the US to make concessions and agree on a more protective framework than would otherwise have been possible.

Fourth, it synthesizes the current avenues for transferring personal data from the EU to the US, that is to say, primarily, the Privacy Shield, and other vehicles such as consent and contracts, the SCCs, the BCRs, codes of conduct and certifications.


ACKNOWLEDGEMENTS

I would like to thank my supervisor Xavier Groussot for his support, not only in writing this final assignment but also when he was coaching our team in the 2017-2018 EU law moot court competition, and throughout my time in Lund more generally.

I would also like to give special thanks to Clément for sharing two ducks at Tolbiac on the eve of the oral. It worked wonders.

Maximin Orsero


ABBREVIATIONS

BCR Binding Corporate Rules

CCPA California Consumer Privacy Act

CIA US Central Intelligence Agency

CJEU / ECJ / The Court Court of Justice of the European Union

Convention 108 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data

Data Protection Directive Directive 95/46/EC

DoC US Department of Commerce

DoJ US Department of Justice

DoT US Department of Transport

DPA Data Protection Authority

DPC Irish Data Protection Commissioner

ECHR Convention for the Protection of Human Rights and Fundamental Freedoms

ECtHR European Court of Human Rights

EDPS European Data Protection Supervisor

EDPB European Data Protection Board

EEA European Economic Area

EU European Union

FBI US Federal Bureau of Investigation

FISA US Foreign Intelligence Surveillance Act

FTC US Federal Trade Commission

GC General Court of the European Union

GDPR General Data Protection Regulation (Regulation 2016/679)


IAPP International Association of Privacy Professionals

IRM Independent Recourse Mechanism

MEP Member of the European Parliament

NGO Non-Governmental Organisation

NSA US National Security Agency

NSLs National Security Letters

ODNI US Office of the Director of National Intelligence

PNR Passenger Name Record

Pr. Professor

SCC Standard Contractual Clauses

TCE Transaction Costs Economics

TEU Treaty on European Union

TFEU Treaty on the Functioning of the European Union

The Charter / CFR The Charter of Fundamental Rights of the European Union

UN United Nations

US United States of America


TABLE OF CONTENTS

Abstract 2

Acknowledgements 3

Abbreviations 4

Table of contents 6

Introduction 9

Background 9

Purpose and research questions 9

Delimitations, methodology and resources 9

Outline 11

1 Why data privacy matters on both sides of the Atlantic 12

1.1 Preliminary remarks on property and data 12

1.1.1 The nature of data 12

1.1.2 Comparison with real and personal property 13

1.1.3 Comparison with intellectual property and related rights 13

1.1.4 Risks associated with the creation of property rights for data 14

1.2 The reasons for regulating data privacy 15

1.2.1 Surveillance capitalism 15

1.2.1.1 The internet’s most popular business model 15

1.2.1.2 The cost of ‘free’ 16

1.2.2 Government surveillance and civil liberties 17

1.2.2.1 Chilling effect 17

1.2.2.2 Balance of powers 18

1.2.3 Security & Transparency v Privacy 19

1.2.3.1 ‘I’ve got nothing to hide’ 20

1.2.3.2 The 21

1.3 First sub-conclusion 24

2 The different approaches to data privacy between the EU and the US 25

2.1 A European rights-talk approach and ‘omnibus’ laws 25

2.1.1 Privacy and data protection as fundamental rights 25


2.1.2 The differences between data protection and privacy in Europe 27

2.1.3 An ‘omnibus’ legal framework 28

2.2 An American marketplace approach and ‘patchwork’ laws 29

2.2.1 Personal data as a commodity for the privacy consumer 29

2.2.2 A ‘patchwork’ legal framework 29

2.3 Second sub-conclusion 31

3 The downfall of the Safe Harbor 32

3.1 Background 32

3.1.1 How the EU regulates international personal data transfers in general 32

3.1.2 Commission Decision 2000/520 ‘Safe Harbor’ 33

3.1.3 The Snowden revelations 34

3.2 The Schrems judgement 35

3.2.1 Facts of the case 35

3.2.2 The reference for a preliminary ruling 36

3.2.3 Reasoning of the Court 37

3.3 Receptions of the judgement 38

3.4 Ten myths about the Schrems case 40

3.4.1 The invalidation of Commission Decision 2000/520 was inevitable 41

3.4.2 The Safe Harbor was a treaty 41

3.4.3 The Safe Harbor was invalidated 41

3.4.4 The Safe Harbor was meant to address government surveillance issues 42

3.4.5 The Safe Harbor was poorly enforced by the FTC 42

3.4.6 The main victims were the US companies participating in the program 43

3.4.7 Data transfers to the US were rendered illegal 43

3.4.8 US law was not ‘adequate’ because not ‘equivalent’ to the EU legal order 43

3.4.9 EU data subjects had access to no means of redress nor remedies 44

3.4.10 The CJEU enhanced privacy protection for EU citizens 44

3.5 Third sub-conclusion 45

4 The current framework: the Privacy Shield and other compliance vehicles 46

4.1 Negotiations of Safe Harbor 2.0 and ‘re-branding’ to Privacy Shield 46

4.2 Content of the Privacy Shield package 46

4.3 Annual reviews of the Privacy Shield 47

4.3.1 Commercial considerations 48

4.3.2 Access by US authorities for national security and law enforcement 50


4.4 The uncertain future of the Privacy Shield 51

4.5 Alternatives to the Privacy Shield 54

4.5.1 Consent and contracts 54

4.5.2 Standard Contractual Clauses 55

4.5.3 Binding Corporate Rules 56

4.5.4 Codes of conduct and certifications 57

4.6 Fourth sub-conclusion 57

Summary and general conclusion 58

References 60

Academic sources 60

ECtHR case law and documentation 62

CJEU case law and opinions of Advocates General 62

US Supreme Court case law 63

Other US case law 63

EU statutes 63

US federal statutes 64

California statutes 64

References from official US institutions 65

References from official EU institutions 65

Sources from official EU Member States’ institutions 67

Professional publications by international law firms 67

Other professional legal publications 68

Press and media sources 69

Other resources 71


INTRODUCTION

Background

Data protection and privacy are fundamental rights in the EU. In the US, data privacy is a subset of consumer protection law and has no such quasi-constitutional value. The EU, as a matter of principle, considers transfers of personal data outside its borders illegal. The American regime is the exact opposite. US technology companies are the most valuable in the world; the EU never developed a comparable data-driven industry. Yet the EU and the US are the two largest economies in the world, and data flows between them are paramount for the current digital economy. Starting from these two diametrically opposed positions, I wondered how this Transatlantic data privacy divide is bridged.

Purpose and research questions

The purpose of this Thesis will be to give its reader an understanding of 1) what makes privacy so important in our modern economies and democracies on both sides of the Atlantic, 2) how the EU and US approaches to regulating it differ, 3) why the first framework bridging the Transatlantic privacy divide, i.e. the Safe Harbor, was eventually discontinued and what this entailed, and 4) what the current regimes and frameworks to transfer personal data from the EU to the US are and what the future holds.

Therefore, the four research questions of this Thesis are:

1) Why does regulating data privacy matter so much on both sides of the Atlantic?

2) How do the EU and US approaches to regulating data privacy differ?

3) Why was the Safe Harbor framework discontinued?

4) What are the current regimes and frameworks available to transfer personal data from the EU to the US?

Delimitations, methodology and resources

This Thesis will not study in detail the regulatory regimes on both sides of the Atlantic. It will not, for instance, describe how the GDPR works in the EU or how the FISA Court functions in the US. It will instead elaborate on the similarities and/or ideological differences between the two systems and the efforts of EU and US institutions to try and bridge them. I will also not explore issues of data privacy and trade, taxation and antitrust, nor will this Thesis deal with questions of non-personal data or cybersecurity. Furthermore, this Thesis will touch upon certain politically sensitive topics, but will only mention them insofar as doing so enhances our understanding of the issues it addresses, and without going into any specifics. For example, I will mention a potential ‘Trump effect’ on the Privacy Shield, whereby US President Donald J. Trump could decide that the US would discontinue it by unilaterally withdrawing PPD-28, but I will not explore in depth the likelihood of this event happening nor the reasons that might lead him to do so.

Privacy is a term known in both the American and European legal systems, the latter also speaking of a ‘right to a private life.’1 However, it is important to clarify that ‘data protection’ is a term originating from German law2 and enshrined in EU law. American law does not contain such an expression and instead speaks of ‘information privacy’3 or ‘consumer privacy law,’4 and legal scholars sometimes use the expression ‘data privacy law.’5 In order to remain neutral and for the sake of simplicity, this Thesis will use privacy and data protection as synonyms and preferentially use the term ‘data privacy’ to refer to both at once.

This Thesis will use resources from academics on both sides of the Atlantic and beyond. It will cite cases from the ECtHR, the CJEU, the General Court, the US Supreme Court, and at times cases before EU DPAs and US federal courts. It will use both EU and US statutes as well as official publications and resources from US, EU and Member State authorities and institutions such as the FTC, the EDPB or the Irish DPC. It will also rely on other sources about privacy such as professional publications by international law firms, e.g. Baker McKenzie, Linklaters or White & Case, trade associations, e.g. the IAPP, NGOs, e.g. Max Schrems’ None Of Your Business, and blogs, e.g. the European Law Blog. It will also exploit media resources from renowned newspapers: for example, articles from The Financial Times will be cited when they provide valuable information about Max Schrems’ litigation that cannot be found in any court documents, and articles from The New York Times’ Privacy Project will serve as sources to describe certain economic or political issues surrounding the topics treated in this Thesis.

1 Charter of Fundamental Rights of the European Union (the ‘Charter’), Article 7.
2 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017), fn. 62.
3 Solove D., Schwartz P. M., Information Privacy Law, Wolters Kluwer, 6th edition (2017).
4 California Consumer Privacy Act of 2018, Cal. Cons. Priv. Code §§ 1798.100 - 1798.198.
5 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017).

All links were accessed and valid as of Monday 27 May 2019.

Outline

This Thesis will start by putting privacy into perspective. It will explore the reasons why it is such an important value on both sides of the Atlantic, how crucial it is to our democratic societies, and how recent technological advances are causing it to conflict with core values such as dignity, freedom of speech, transparency and security.

It will explain how the EU and the US regulate data privacy within their respective jurisdictions.

It will then describe how transferring personal data from the EU to the US may be problematic, and study the first attempt to bridge the divide: the Safe Harbor. It will analyze its background and explain the causes and consequences of its downfall.

It will lastly analyze the current avenues to transfer personal data harvested in the EU towards the US, focusing on the Privacy Shield and other available compliance vehicles.


1 WHY DATA PRIVACY MATTERS ON BOTH SIDES OF THE ATLANTIC

Why does regulating data privacy matter so much on both sides of the Atlantic?

This chapter will expand on the elements that make EU and US data privacy laws converge as a matter of policy. It will explore why protecting privacy is crucial in our democratic societies based on individual freedom, fundamental rights and civil liberties, while emphasizing the current importance of personal data in our modern digital economies.

1.1 Preliminary remarks on property and data

Headlines may read ‘The currency of the future is personal data,’6 ‘Can you make money selling your data?,’7 ‘The business of selling your location’8 etc., but make no mistake: these are simplifications. These statements rely on a premise: that personal data can be appropriated. However, as Pr. Lothar Determann puts it: ‘No one owns data.’9 This section will discuss how privacy and personal data are not compatible with any notion of property.

1.1.1 The nature of data

The words ‘data’ and ‘information’ are treated as synonyms in both the EU and American legal systems. Data science distinguishes them: data refers to raw material, patterns with no meaning, while information refers to data that has been organized and is meaningful and usable by human beings.10 Under both definitions, and regardless of its physical embodiment, be it signs, sounds, shapes or smells, the underlying information is never appropriable. This is so because of the very nature of information: it is ubiquitous. It may never be acquired, only accessed.11

6 Baldet A., The Currency of the Future is Personal Data, Quartz (2018) .
7 Harrison S., Can You Make Money Selling Your Data?, BBC Capital (2018) .
8 The Daily Podcast, The Business of Selling your Location, The New York Times (2018) .
9 Determann L., No One Owns Data, UC Hastings Research Paper No. 265 (2018). This passage by and large synthesises this article.
10 Ibid., fn. 21.
11 Varian H. R., Economic Aspects of Personal Privacy, US DoC, Privacy and Self-Regulation in the Information Age (1997).

1.1.2 Comparison with real and personal property

In both the civil and common law traditions, the right to property implies ‘the right to the exclusive use of an asset [and the faculty to] dispose of it at will.’12 Information, however, cannot be disposed of at will. When someone buys a house, they may choose to have it wrecked. When someone acquires a painting, they may choose to burn it. But when I tell someone that I once visited a castle in Scotland, telling them does not make me forget it. I may forget about that event one day, but I may not do so at will. Moreover, information duplicates as it is communicated. These characteristics make information incompatible with any form of physical property, be it real or personal, because, colloquially, one may not have one’s cake and eat it too.

1.1.3 Comparison with intellectual property and related rights

Pr. Pamela Samuelson has argued that treating privacy as intellectual property may have some appeal, but she explains that this approach suffers from major limitations that refute the case for treating personal information this way.13 The whole purpose of intellectual property is to incentivize innovators and creators to spread their discoveries and works by financially rewarding their efforts. However, as she puts it: ‘the purpose of the proposed new personal data property right is almost the inverse of traditional intellectual property law, for it would grant a property right in order to restrict the flow of personal data to achieve privacy goals.’14

Concerning copyright, what is protected is not the paper on which a book is written, but the original expression embodied in it. As Lothar Determann explains, ‘copyright law protects only the creative expression of information, and not the information itself.’15 The same is true in the EU under the Database Directive and in the US under state laws on misappropriation: the ‘wholesale copying’ of databases or news articles is prohibited, but discrete information forming part of a database16 and the underlying news themselves may not be copyrighted.17 Patents suffer from the very same issue. As the Federal Circuit ruled in

12 Black J. and others, A Dictionary of Economics, Oxford, 5th edition (2017).
13 Samuelson P., Privacy as Intellectual Property, Stanford Law Review (1999).
14 Ibid., p.1141.
15 Determann L., No One Owns Data, UC Hastings Research Paper No. 265 (2018), p.18.
16 Ibid., p.22.
17 Int’l News Serv. v Associated Press, 248 U.S. 215 (1918).

Digitech Image,18 ‘although the use, storage or application of data can be patentable, the underlying data is not eligible for patent protection.’19 Trademarks are interesting in that personal information such as a person’s name may be trademarked, but a trademark only grants a right to oppose other persons’ use of the information under certain conditions. It does not give ownership over that information: Paul Smith is a trademark and the name of a famous fashion designer, but he cannot prevent anyone called Smith from naming their child Paul. Lastly, trade secret protection cannot justify data ownership either. On both sides of the Atlantic, it ceases to apply if the underlying information becomes public, for instance via independent discovery or reverse engineering,20 hence defeating the purpose of privacy protection.

1.1.4 Risks associated with the creation of property rights for data

Besides being hardly justifiable as a matter of principle, granting ownership of personal data is unnecessary and would be counter-productive as a matter of policy. German lawmakers are considering creating a notion of ‘data ownership’ to incentivize and develop a ‘data market.’ Pr. Determann explains that such a reform is unnecessary and that its premise is deeply flawed: data trade has been growing at an exponential rate for two decades with no need for ‘incentivizing through data propertization.’ More importantly, creating a property right in personal data may actually defeat its purpose of enhancing privacy protection. Pr. Determann clearly summarized the issue: ‘Companies that acquire ownership to personal data (…) could exclude [the data subject] from using data about him or herself. Such an exclusion right would be diametrically opposed to the policy objectives of data privacy laws,’21 namely, preserving human dignity and personal privacy.

18 Digitech Image Techs. v Elecs. for Imaging, 758 F.3d 1344, 1350 (Fed. Cir. 2014).
19 Determann L., No One Owns Data, UC Hastings Research Paper No. 265 (2018), p.16.
20 Kewanee Oil Co. v Bicron Corp., 416 US 470, 476 (1974).
21 Determann L., No One Owns Data, UC Hastings Research Paper No. 265 (2018), p.37.

1.2 The reasons for regulating data privacy

1.2.1 Surveillance capitalism

1.2.1.1 The internet’s most popular business model

Personal data is often referred to as the ‘oil’ of the digital economy.22 The handling of such private information by leading technology companies has recently been tainted by major scandals such as Facebook’s Cambridge Analytica case. It caused a wide uproar on both sides of the Atlantic and sparked criticism over the privacy practices of internet giants.

Pr. David Lyon defines surveillance as ‘The focused, systematic and routine attention to personal details for purposes of influence, management, protection or direction.’23 Capitalism refers to an ‘economic system (…) in which most of the means of production are privately owned and production is guided and income distributed largely through the operation of markets.’24 The expression ‘surveillance capitalism’ was coined by Harvard Business School Pr. Shoshana Zuboff.25 It can be defined as the generalization of the business model pioneered by major technology companies, chiefly Google and Facebook. It draws on the classification of the uses of computer-mediated transactions proposed by Hal Varian, founder of the Berkeley School of Information and Chief Economist at Google: ‘data extraction and analysis,’ ‘new contractual forms due to better monitoring,’ ‘personalization and customization,’ and ‘continuous experiments.’26

Essentially, first, these companies collect information about behavior not traditionally conceived of as economic transactions, such as simply browsing the internet without making any purchase. Second, they translate this newfound raw material, i.e. human experience, into a ‘behavioral surplus,’ and give interested third parties, such as advertisers, access to that information against remuneration. Third, they use this behavioral information to better understand their customers and to constantly improve and refine their services. Pr. Zuboff explains

22 The Economist, The World’s Most Valuable Resource is No Longer Oil, but Data (2017) .
23 Lyon D., Surveillance Studies: An Overview, Cambridge (2007).
24 Encyclopaedia Britannica, Definition of Capitalism .
25 Zuboff S., The Age of Surveillance Capitalism, Profile Books (2019).
26 Varian H. R., Beyond Big Data, Business Economics (2014).

that this ‘reflects the history of capitalism of taking things that live outside the market sphere and declaring their new life as market commodities.’ This model permits end users to use these services for free, and is where the expression ‘if it’s free online, you are the product’ comes from.27

What is the harm, if it is free? Pr. Zuboff explains that the generalization of that business model redistributes, one might say erodes, privacy. The very architecture of the model incentivizes the constant and ever-deeper extraction and accumulation of behavioral information, what she calls ‘surveillance assets,’ to ever better predict consumer preferences. She warns that the other side of this coin is a shift of incentives from predicting behavior to influencing and controlling it: ‘It is no longer enough to automate information flows about us; the goal is to automate us.’ This is where a threat to autonomy, freedom of choice and, more broadly, civil liberties resides.28

1.2.1.2 The cost of ‘free’

This raises the question: are all these services really ‘free’? Berkeley Pr. of law and information Chris Hoofnagle answers in the negative.29 He argues that orthodox economic theory fails to shed light on the true costs of ‘free’ and proposes instead to analyze the issue through the lens of Transaction Costs Economics. TCE theory posits that, to quantify the total costs of an economic interaction, one should account for both the price and any down-the-road expenses associated with it, be they economic or not.30 It ‘is particularly useful in cases of divergence between price and cost.’31 Pr. Hoofnagle argues that an appropriate way to determine the value of personal information is to assess the cost of replacing it.

Consider the following statistic: Facebook’s average yearly revenue per North-American user is over $13, and just over $5 for the rest of the world. ‘To Facebook, you represent about a billionth of its revenue-making potential; your individual value to the company is below negligible. But the information you store on the service — your photos, your private messages,

27 Zuboff S., Surveillance Capitalism and the Challenge of Collective Action, New Labor Forum (2019).
28 Ibid.
29 Hoofnagle C. J., Whittington J., Free: Accounting for the Costs of the Internet’s Most Popular Price, UCLA Law Review (2014).
30 Coase R. H., 1991 Nobel Lecture: The Institutional Structure of Production, The Nature of the Firm: Origins, Evolutions, and Development, Oliver E. Williamson & Sidney G. Winter eds. (1991).
31 Hoofnagle C. J., Whittington J., Free: Accounting for the Costs of the Internet’s Most Popular Price, UCLA Law Review (2014), p.615.

your personal history — is priceless, or at least extremely valuable to you. If Facebook loses, damages or leaks your data, it has no effect on them and a tremendous effect on you.’32

Tangible transaction costs to individuals include enhanced risks of identity theft, fraud and stalking, the creation of filter bubbles, undue influence and attempts at manipulation. Suppose a person’s email address is compromised and highly valuable information such as her phone number, address or credit card details leaks into the hands of ill-intentioned people due to the lax data practices of companies like Facebook. The potentially hundreds of dollars she would need to spend to repair those damages would far outstrip the benefits she would ever have gained from the company’s services. In that sense, Pr. James Grimmelmann has also proposed to look at privacy from a ‘product safety’33 perspective. As explained by Pr. Jacob Silverman, the costs of protecting oneself against surveillance and these sorts of leaks, in the form of ‘digital hygiene’ and ‘security planning,’ should also be factored into the value of ‘free’ online services.34 The development of 5G and the democratization of ‘smart’ devices will only add to this phenomenon and require rapid action.

1.2.2 Government surveillance and civil liberties

Another major issue in data privacy concerns government surveillance. I will briefly review the ‘Dangers of Surveillance’ explained by Pr. Neil M. Richards in the article of the same name published in the Harvard Law Review.35

1.2.2.1 Chilling effect

‘Surveillance is harmful because it can chill the exercise of our civil liberties.’36 The values concerned here are at the core of our democracies on both sides of the Atlantic: freedom of thought, free speech, and self-determination – the EU also refers specifically to the right to dignity.37 A free society can only be so if it allows its citizens freedom of thought and speech.

32 Herrman J., This is How Much You’re Worth To Facebook, BuzzFeed News (2012) .
33 Grimmelmann J., Privacy as Product Safety, Widener Law Journal (2010).
34 Silverman J., Privacy under Surveillance Capitalism, Social Research (2017), p.160.
35 Richards N. M., The Dangers of Surveillance, Harvard Law Review (2013).
36 Ibid., p.1935.
37 , It’s about human dignity and autonomy (2018) .

In the aggregate, the ‘marketplace of ideas’38 would let truth prevail and benefit all of society. These liberties and this objectivity would in turn enable citizens to make up their own minds about the world, determine what kind of persons they wish to be, and decide what kinds of lives they wish to live.

However, ‘intellectual privacy theory suggests that new ideas often develop best away from the intense scrutiny of public exposure.’39 For a micro-example, a child, then a teenager, needs a certain private sphere away from the scrutiny of their parents to develop their independence and their own personality. For a macro-example, consider this quote from Cindy Cohn, executive director of the NGO Electronic Frontier Foundation: ‘Every social movement started with a private conversation. Somebody somewhere turned to somebody else, and said, I don't think slavery is a good idea. That's how you start movements.’40 If citizens can no longer be assured that their conversations, thoughts and deeds are truly private, they will start behaving differently, fear challenging the status quo, and stop exercising liberties such as freedom of speech.41 In light of this, the following quote from Edward Snowden is spot-on: ‘When you say “I don’t care about the right to privacy because I have nothing to hide,” that’s no different than saying “I don’t care about freedom of speech because I have nothing to say.”’42

1.2.2.2 Balance of powers

Surveillance has an ‘effect on the power dynamic between the watcher and the watched, giving the watcher greater power to influence or direct the subject of surveillance.’43 Drawing on the assumption that information equals power, I will first explain Pr. Richards’s points on blackmail, then on persuasion and discrimination, and conclude by detailing another point about the freedom of the press.

38 Gordon J., John Stuart Mill and the Marketplace of Ideas, Social Theory and Practice (1997), pp.235-249.
39 Richards N. M., The Dangers of Surveillance, Harvard Law Review (2013), p.1946; Richards N. M., Intellectual Privacy, Texas Law Review (2008).
40 The Stanford Raw Data Podcast, Data Confidential (2016) .
41 The Daily Podcast, The Business of Selling your Location, The New York Times (2018) .
42 The Guardian, Edward Snowden: a right to privacy is the same as freedom of speech – video interview
43 Richards N. M., The Dangers of Surveillance, Harvard Law Review (2013), p.1953.

A surveilling government necessarily has an edge over its population and can use intelligence to discredit social change activists. A disconcerting example concerns Martin Luther King. The FBI engaged in a surveillance program to uncover embarrassing elements of his private life. It discovered that he had an extra-marital affair and sought to use this information to discredit him.44

Surveillance can also give rise to subtler forms of harm, not amounting to outright blackmail, which are more adeptly performed by private actors: persuasion and discrimination. This was explored in the previous section. Suffice it to say here that the danger lies in the fact that governments can technically rely on surveillance capitalists’ capabilities and divert them into tools of political pressure. As Pr. Richards put it: ‘We must recognize that surveillance transcends the public/private divide.’45 For example, profiles of customers from certain minorities compiled by a private company could be used by an ill-intentioned government to identify them more easily and discriminate against them.

A less obvious effect of surveillance is its impact on the freedom of the press. Often referred to as the ‘Fourth Estate,’46 a free press is essential to reveal political scandals and hold power accountable – consider The Washington Post’s motto, ‘Democracy dies in darkness.’47 To perform their work, journalists need to ensure their sources’ protection and anonymity. However, if they cannot hide their digital shadows, journalists cannot assure whistle-blowers that their communications will not be intercepted or that no evidence of their meetings will remain.

1.2.3 Security & Transparency v Privacy

The major counter-argument to the previous developments is of course that surveillance offers security. Another one is that too much privacy to the detriment of transparency actually hinders freedom of speech and access to information.

44 Ibid.
45 Ibid., p.1952.
46 Schleiffer J. T., Toqueville’s Insight, The New York Times (1981).
47 The Washington Post’s homepage.

1.2.3.1 ‘I’ve got nothing to hide’

The security v privacy argument is explained by Pr. Daniel Solove in his article titled ‘“I’ve got nothing to hide” and other misunderstandings of privacy.’48 In its most basic form, the case for government surveillance goes: ‘If you’ve got nothing to hide, you’ve got nothing to fear.’ Mass-scale indiscriminate surveillance makes it possible to intercept crimes, which benefits all of society. ‘If the wiretapping stops one of these Sept. 11 incidents, thousands of lives are saved.’49 Hence, most people support government surveillance. From that angle, surrendering privacy is a valuable trade-off to prevent terrorism and thwart organized crime.

Pr. Richard Posner nuances and strengthens the argument. If privacy is a ‘right to conceal discreditable facts about [one’s] self,’ such as breaking the law or social norms, the law should not protect such behavior.50 He makes a powerful analogy with ‘the efforts of sellers to conceal defects in their products.’ The argument carries particular force when government surveillance only concerns limited personal information, such as meta-data about whom a person calls but not the content of the calls. Furthermore, privacy would not be threatened by government surveillance because most of the data is processed by computers, which are not sentient beings and can make no value judgment about a person’s life. Only a limited number of security officials would have access to the data. This would not amount to an all-out exposure of one’s personal life to the government. Thus, any potential harm would be balanced and justified in the general interest.51

The case in favor of surveillance, when conceived in such a balanced way, is particularly compelling. However, privacy should not be simply construed as a right to withhold information nor as a right to ‘hiding a wrong.’52 Solove finds a series of major rebuttals to the ‘nothing to hide’ argument. In essence, the harm of surveillance ‘is less one to particular individuals than it is a structural harm.’53

48 Solove D., 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy, San Diego Law Review (2007).
49 Schneider J., Letter to the Editor, NSA Wiretaps Necessary, St. Paul Pioneer Press, Aug. 24, 2006, at 11B.
50 Posner R. A., The Right of Privacy, Georgia Law Review (1977), p.399.
51 Posner R. A., Our Domestic Intelligence Crisis, The Washington Post (2005).
52 Solove D., 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy, San Diego Law Review (2007), p.764.
53 Ibid., p.770.

First, as explored in the previous section, the main harms of surveillance are the chilling effects on civil liberties and the asymmetry of power between the watcher and the watched. ‘The issue is not about whether the information gathered is something people want to hide, but rather about the power and structure of government.’54

Second, surveillance poses a collective action problem. As Pr. Ann Bartow explains, privacy advocates’ arguments fail to demonstrate harm beyond ‘feelings of unease,’ and privacy lacks ‘blood and death, or at least broken bones and buckets of money.’55 This un-sexiness of privacy law fails to draw citizens’ attention to the harm surveillance causes each individual. This, in turn, causes them not to seek protection. Solove responds: ‘Privacy is threatened not by singular egregious acts, but by a slow series of relatively minor acts which gradually begin to add up. [This] resembles certain environmental harms.’56

Third, a society based on an absence of privacy is not a society of trust. Citizens must understand that what matters is not so much each discrete piece of information a government may access, no matter how abstract and anodyne, but how their combination may reveal information they believe is no one else’s business.57 Citizens may agree that their data be accessed for one purpose: preventing terrorism. However, most people do not understand how ‘aggregation’ and ‘secondary use’ work, and how information collected for one purpose can reveal things about them that they never would have wanted to reveal, be it to their government, private companies, or anyone at all.58

1.2.3.2 The right to be forgotten

Withdrawing information on the basis of one’s privacy may cause transparency issues and conflict with the right to information and the freedom of expression of others. This is especially so for high-profile persons, such as public officials or celebrities, or if a legitimate interest warrants disclosure, such as information about a job applicant’s criminal record.

54 Ibid., p.767.
55 Bartow A., A Feeling of Unease About Privacy Law, University of Pennsylvania Law Review (2006).
56 Solove D., 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy, San Diego Law Review (2007), p.769.
57 Ibid., p.766.
58 Turow J. and Hoofnagle C. J., Mark Zuckerberg’s Delusion of Consumer Consent, The New York Times (2019).

The EU’s right to be forgotten, also referred to as right to erasure, is at the core of this issue. It permits a person to ask a company to delete personal data it may hold about her. It originates from the French concept of ‘droit à l’oubli,’ for ‘right to oblivion,’ which allows former criminals to clear their record and be rehabilitated after a sufficiently long time.59 It is now a general right, enshrined in Article 17 of the GDPR. Prior to that, it was debated whether the former Data Protection Directive also contained it. The CJEU, in the Google Spain case,60 ruled that it did. I will summarize the facts of the case and briefly explain the controversy surrounding it.61

In 1998, a newspaper was ordered by the Spanish Ministry of Labor and Social Affairs to publish and upload online an article about social security debt proceedings concerning the applicant. In 2009, he asked the newspaper to remove the article, because it would show up when his name was typed into Google. He thought this could negatively impact his reputation, because the case had been resolved long ago and he deemed it no longer relevant. The newspaper refused. The applicant then asked Google to de-reference the article, in vain.62

The CJEU was asked whether the Directive contained the right for a data subject to ask a search engine to remove true information about him ‘on the ground that that information may be prejudicial to him or that he wishes it to be “forgotten” after a certain time.’63 The Court answered in the affirmative, contrary to the opinion of its Advocate General. Its reasoning was that, in this particular case, the fundamental rights to privacy and data protection overrode the economic interests of the search engine, and that there were no ‘particular reasons substantiating a preponderant interest of the public in having’ access to the data because the applicant was not a ‘public figure.’64

The detractors of the right concede that it stems from good intentions: offering redemption, allowing teenagers to delete stupid content they would later regret, giving citizens a sense of control over

59 French Ministry of Justice, Le sens de la peine et le droit à l’oubli.
60 Case C-131/12 Google Spain and Google EU:C:2014:317.
61 The case also raised questions about the territorial application and personal scope of the Directive, but I will only focus on the right to be forgotten.
62 C-131/12 Google Spain and Google EU:C:2014:317, §15-20; Opinion of Advocate General Jääskinen, §18-22.
63 C-131/12 Google Spain and Google EU:C:2014:317, §89.
64 Ibid., §97-98.

their personal information, among other things.65 However, this right may be problematic as a matter of principle and for empirical reasons.66 First, as Pr. Jeffrey Rosen explains: ‘It represents the biggest threat to free speech on the Internet in the coming decade.’67 This right may create a ‘dramatic clash between European and American conceptions of the proper balance between privacy and free speech.’68 It would be a slippery slope towards a generalization of ‘editing history’ and a form of distributed censorship threatening the ‘integrity of the internet.’69 Second, Advocate General Jääskinen noted that, besides the above arguments, the right to be forgotten would create an ‘unmanageable number of requests’70 to be handled by data controllers and processors. He suggested that this might be at odds with technology companies’ freedom to conduct a business,71 also a fundamental right under EU law.72

An argument in favor of the right is that ‘Access to transparent and complete information is, on its face, a laudable goal for the Internet. However, this can come at the cost of people's reputations, relationships, career goals, and dignity.’73 Moreover, the right to be forgotten is not absolute, and claims that it would irremediably alter transparency on the internet are overblown. The GDPR now lays out a set of exceptions, for journalism, for public figures, etc. A sweeping definition in the law would only become problematic if unbounded enforcement ensued. The balanced judicial interpretation based on the principle of proportionality currently in place suggests that the dystopian drift envisaged by the right to be forgotten’s opponents should not happen.

65 CJEU, Press release No 70/14 on C-131/12 Google Spain.
66 There are many more concerns and critiques of the right to be forgotten. Since the purpose of this section is solely to provide a quick overview of the debates around it, I will not go into more detail. To anyone interested, I highly recommend Google’s global privacy counsel Peter Fleisher’s blog post (Peter Fleisher, Foggy Thinking about the Right to Oblivion, Peter Fleisher blog (2011)) and a Federalist Society podcast on the subject (The Federalist Society, The Right to Be Forgotten (2018)).
67 Rosen J., The Right to Be Forgotten, Stanford Law Review Online (2012).
68 Ibid.
69 The Federalist Society, The Right to Be Forgotten (2018).
70 Opinion of Advocate General Jääskinen in Google Spain, §133.
71 C-360/10 SABAM v Netlog EU:C:2012:85, §46.
72 Charter, Article 16.
73 Belbin R., When Google Becomes the Norm: The Case for Privacy and the Right to Be Forgotten, Dalhousie Journal of Legal Studies (2018), p.33.

1.3 First sub-conclusion

In the light of the discussion in this chapter, regulating data privacy matters so much on both sides of the Atlantic because of the ever-growing importance of the data-driven economy and surveillance capitalism. The very business model of the internet, based on supposedly free services in exchange for personal data which we supposedly own, may in effect erode or chill core values of our democracies such as freedom of speech and privacy. Surrendering our privacy for the convenience of better digital services in reality comes at the price of becoming more vulnerable to the influence of the highest-bidding advertisers. Furthermore, trading it away for the sake of security would in reality weaken essential counter-powers of our democratic societies, by giving governments orders-of-magnitude stronger tools to blackmail, pressure and influence social-change activists, dissidents, and journalists.


2 THE DIFFERENT APPROACHES TO DATA PRIVACY BETWEEN THE EU AND THE US

How do the EU and US approaches to regulating data privacy differ?

This chapter will expand on the elements that make EU and US data privacy laws diverge as a matter of regulation, as explained by Pr. Paul Schwartz and Karl-Nikolaus Peifer in their article ‘Transatlantic Data Privacy Law.’74 In essence, EU and US privacy regimes widely differ in historical foundations and legislative structures: the EU’s vision of the rights to privacy and data protection is driven by a ‘rights-talk’ approach, while the US focuses on a ‘market-based’ approach to data privacy.

2.1 A European rights-talk approach and ‘omnibus’ laws

Privacy’s importance in the EU’s legal system can be explained by the structure of the European integration project itself, centered around human rights protection, and by the post-World War II experience with government surveillance in former Eastern-bloc states, chiefly in the former German Democratic Republic.

2.1.1 Privacy and data protection as fundamental rights

In the EU, privacy and data protection are fundamental rights ‘anchored in interests of dignity, personality and self-determination.’75 In the words of Commission President Jean-Claude Juncker during his 2016 State of the Union address: ‘[b]eing European means the right to have your personal data protected by strong, European laws. […] Because in Europe, privacy matters. This is a question of human dignity.’76 According to Pr. Schwartz, the individual, as a bearer of rights, a ‘data subject,’ is the center of gravity of EU privacy and data protection law. After the horrors of Fascism, Nazism and World War II, European countries such as Italy and Germany became standard-bearers of constitutional dignity protection. Dignity became the centerpiece of European fundamental rights law, which immediately after the war developed at a supra-national level with the European Convention on Human Rights. In the European

74 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017).
75 Ibid., p.123.
76 Jean-Claude Juncker, 2016 State of the Union address at the European Parliament.

Economic Communities, the ancestors of the EU, the Stauder77 case of the Court of Justice is the most important early development in this field. Interestingly, this case, the first ever to address human rights in the EU legal order, indirectly concerned privacy. A welfare scheme created coupons allowing beneficiaries to buy butter at a cheaper price. These coupons would bear the names of their beneficiaries. Mr. Stauder argued that this created a stigma in violation of his dignity. Essentially, publicly disclosing his name through the coupons conveyed the idea that he, as a beneficiary, was poor and untrustworthy.

This post-war emphasis on fundamental rights in Western Europe was an effort to reinforce its democratic mode of government by contrast to the situation on the Eastern side of the Iron Curtain. One of the reasons why privacy is so dearly valued in Germany can be traced back to the mass surveillance by the East German political police, the Stasi, during the Cold War. This was eloquently addressed in the movie ‘The Lives of Others.’78 It is interesting to point out that Martin Selmayr, a German and Secretary-General of the European Commission, one of the architects, some may say the father, of the GDPR, is the grandson of a former West German intelligence chief one of whose main missions was to combat the Stasi.79 A reaction against this dramatic deprivation of privacy, used to suppress political dissent, would have given Germans a sense of how important privacy is in a free society. In turn, this German example influenced the EU to enact strong data protection laws placing the individual and dignity at their center when the digital age was barely in its infancy. As a matter of fact, the negotiations for the 1995 Data Protection Directive began shortly after the Fall of the Berlin Wall. This historical and ideological background explains the general European wariness towards privacy-invasive technology and the principle that personal data processing is by default not authorized.

77 C-29/69 Stauder EU:C:1969:57.
78 Bradshaw P., The Lives of Others, The Guardian (2007).
79 Heath R., The World’s Most Powerful Tech Regulator: Martin Selmayr, Politico Europe (2018).

2.1.2 The differences between data protection and privacy in Europe

According to CJEU Advocate General Juliane Kokott,80 both the CJEU and the ECtHR81 interpret the right to privacy broadly and consider it the core of the right to data protection. These rights are respectively enshrined in Articles 7 and 8 of the Charter of Fundamental Rights of the European Union. The European Convention on Human Rights recognizes the right to a private life in its Article 8. The Council of Europe passed a ‘Data Protection Convention 108,’ but it does not in principle fall under the jurisdiction of the European Court of Human Rights.82 This convention and EU law share a similar definition of personal data, namely ‘information relating to an identified or identifiable individual.’83

Privacy and data protection differ in scope and enforceability between the legal system of the EU and that of the Council of Europe. First, the personal scope of the rights diverges: legal persons may not rely on EU data protection84 but may be entitled to the ECHR right to a private life.85 The substantive scope is also different: the right to data protection is broader than that of privacy because it also covers data that may not be considered private.86 Moreover, different bodies of EU data protection and privacy law may be enforceable against public87 or private persons,88 whereas the ECtHR’s right to privacy may only be enforced against the contracting parties to the Convention.89 Second, the regime for justifying interferences with either right differs slightly between the CJEU and the ECtHR. Both courts will assess them

80 Kokott J., Sobotta C., The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR, International Data Privacy Law (2013).
81 European Court of Human Rights.
82 Ibid.
83 Council of Europe, Convention for the Protection of Individuals with regard to the Processing of Personal Data (”Convention 108”), Article 2(a).
84 Case C-92/09 Volker and Schecke EU:C:2010:662, §52, 53 and 87.
85 Bernh Larsen Holding AS and others v Norway App no 24117/08, §159.
86 Case C-617/10 Åkerberg Fransson EU:C:2013:105, §20; Case C-279/09 DEB EU:C:2010:811, §32; Rotaru v Romania App no 28341/95.
87 Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) No 45/2001 and Decision No 1247/2002/EC.
88 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), (the ”GDPR”), Article 82.
89 European Convention on Human Rights, Article 1.

according to their general proportionality tests, which are slightly different.90 In practical terms, however, both courts’ concerns in their assessment can be summarized by the ECtHR’s statement in Rotaru: “(T)he greater the scope of the recording system, and thus the greater the amount and sensitivity of the data held and available for disclosure, the more important the content of the safeguards to be applied at the various crucial stages in the subsequent processing of the data.”91

2.1.3 An ‘omnibus’ legal framework

The EU’s legal framework is harmonized. This means that the EU’s rules do not merely set a minimum standard of privacy and data protection leaving Member States leeway to choose their methods of implementation. A single piece of legislation is directly applicable throughout the Union: the GDPR. Pr. Paul Schwartz calls it an ‘omnibus’92 law as ‘No areas are left unregulated, and data subjects are guaranteed remedies for privacy harms.’93 It is complemented by Regulation 2018/1725,94 and will be further complemented by the ePrivacy Regulation.95 Member States retain legislative powers in this area, but these are merely residual; in the GDPR, for instance, they deal with niche concerns such as the age of consent.96 This all flows from the quasi-constitutional value of privacy and data protection in the EU’s legal system, with Article 16 TFEU granting the EU competence to regulate this area, and Articles 7 and 8 of the Charter enshrining them as fundamental rights.

90 The ECtHR follows a three-pronged test: ‘accordance with the law,’ ‘legitimate basis,’ ‘necessity in a democracy;’ the CJEU’s proportionality test is more flexible, but would, in its most extensive form, consist of a legal basis, a legitimate aim, adequacy, necessity, and proportionality stricto sensu.
91 Rotaru v Romania App no 28341/95, §200.
92 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017), p.128.
93 Ibid., p.129.
94 Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) No 45/2001 and Decision No 1247/2002/EC.
95 Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications).
96 GDPR, Article 8.

2.2 An American marketplace approach and ‘patchwork’ laws

2.2.1 Personal data as a commodity for the privacy consumer

The US, contrary to the EU, does not protect privacy primarily via constitutional or fundamental rights. It relies on a market-based approach protecting individuals as ‘trader[s] of a commodity, namely [their] personal data,’97 with a regime more akin to consumer protection.

The US Constitution protects privacy in what Europeans call vertical relations, between the state and its citizens. This essentially concerns the use of personal information during criminal investigations, for example through the Fourth Amendment98 and the Due Process Clause of the Fourteenth Amendment.99 However, American law does not grant privacy consumers any constitutional protection in private-to-private relationships.

The principle in US law is the very opposite of that in EU law: it ‘starts with a principle of free information flow and permits the processing of any personal data unless a law limits this action.’100 This market-based approach is reflected in the fact that the top US privacy enforcer is the Federal Trade Commission, whose mission is to ‘protect consumers and promote competition’ and ‘prevent unfair or deceptive acts or practices in or affecting commerce.’101 The American regulation of the digital economy has mostly been light-touch and supply-side friendly. It prioritizes innovation and the growth of the technology sector by relying primarily on industry self-regulation. As Pr. Schwartz explains: ‘Personal information is another commodity in the market,’ hence, ‘The focus of information privacy law in the United States is policing fairness in exchanges of personal data.’102

2.2.2 A ‘patchwork’ legal framework

The American federal level contains no equivalent of a general piece of privacy legislation akin to the GDPR. Hence, Pr. Schwartz calls it a ‘patchwork’ legal framework. It

97 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017), p.121.
98 Concerning searches and seizures, mainly in the context of criminal investigations.
99 Whalen v Roe, 429 U.S. 589 (1977), creating a general right to information privacy. It is debated whether this case is still good law.
100 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017), p.135.
101 FTC, What We Do.
102 Schwartz P. M., Property, Privacy, and Personal Data, Harvard Law Review (2004), p.132.

consists of disparate sector-, issue- or person-specific laws, with dedicated statutes for: children’s privacy,103 electronic communications,104 financial privacy,105 information security,106 government-handled data,107 health privacy and insurance,108 motor vehicle records,109 cable TV,110 video privacy,111 and computer fraud.112

Besides the above, each State typically has its own privacy regime. The most notable example is California, often considered the privacy vanguard113 of the US. Privacy is enshrined in Article 1, Section 1 of the State’s Constitution: ‘All people are by nature free and independent and have inalienable rights. Among these are enjoying and defending life and liberty, acquiring, possessing, and protecting property, and pursuing and obtaining safety, happiness, and privacy.’114 California’s own privacy legislation is likewise not harmonized and also consists of sector-, issue- or person-specific statutes,115 although the California Consumer Privacy Act116 is an attempt at a generalized and unified framework.

The disparate nature of the US privacy landscape creates uncertainty for both businesses and consumers. It has also been a cause of concern for EU leaders and policymakers.

103 15 USC 91 Children's Online Privacy Protection.
104 47 USC 222 Privacy of Customer Information.
105 31 CFR 103.120 Anti-Money Laundering Programs.
106 15 U.S.C. 6801(b), 6805(b)(2) Standards for Safeguarding Customer Information.
107 5 USC 552a Records Maintained on Individuals – Government Organizations and Employees.
108 42 USC 201 Health Insurance Portability and Accountability Act of 1996.
109 18 USC 123 Prohibition on Release and Use of Certain Personal Information from State Motor Vehicle Records.
110 47 USC 551 Protection of Subscriber Privacy.
111 18 USC 2710 Wrongful Disclosure of Video Tape Rental or Sale Records.
112 18 USC 1030 Fraud and Related Activity in Connection with Computers.
113 Freiwald S., At the Privacy Vanguard: California’s Electronic Communications Privacy Act (CalECPA), Berkeley Technology Law Journal (2018).
114 Constitution of the State of California, Article 1 ‘Declaration of Rights,’ Section 1.
115 Morrison Foerster, Privacy Library.
116 California Assembly Bill No. 375, Chapter 55 Privacy: Personal Information: Businesses.

2.3 Second sub-conclusion

In the light of the discussion in this chapter, the EU and US regulatory approaches differ due to historical and economic differences. The data-driven economy presents both the EU and the US with fundamental challenges with which they deal in dramatically opposite ways. The US in many respects pioneered and enabled the digital revolution. It capitalized on it and has become the global leader in this domain. The EU never developed such industries and seeks to play on the technological world stage through regulation and a ‘Brussels effect,’117 to influence other countries and export its highly protective regime. Furthermore, several EU Member States’ histories of government surveillance and oppression led the EU to take a stronger stance towards the protection of data privacy. As a result, the incentives of the EU and the US for regulation are almost diametrically opposed. The US may accuse the EU of data protectionism,118 and the EU may accuse the US of digital imperialism.119 In any case, the EU’s export of its data privacy rules seems to have influenced California’s adoption of the CCPA, which, given the sheer size of California’s economy and the fact that nearly all major US tech companies are headquartered there, in Silicon Valley, could in turn lead the US Congress to take up the issue and federalize this area of law, although this looks unlikely in the near future.120 Bridging the Transatlantic divide may thus prove difficult to conceive without one model taking over the other.

117 Bradford A., The Brussels Effect, Northwestern University Law Review (2012); Gady F.-S., EU/US Approaches to Data Privacy and the ‘Brussels Effect’: A Comparative Analysis, Georgetown Journal of International Affairs (2014).
118 Denton J., Digital Protectionism Demands Urgent Response, The Financial Times (2019).
119 Wasik B., Welcome to the Age of Digital Imperialism, The New York Times (2015).
120 On the ‘California effect:’ Perry I. E., California is Working: The Effects of California’s Public Policy on Jobs and the Economy since 2011, UC Berkeley Labor Center (2017).

3 THE DOWNFALL OF THE SAFE HARBOR

Why was the Safe Harbor framework discontinued?

This chapter will focus on the CJEU’s Schrems case, which caused the downfall of the Safe Harbor. It will detail the background leading up to the case, its facts, the conclusion of the Court, and its reception and consequences for EU-US personal data transfers.

3.1 Background

3.1.1 How the EU regulates international personal data transfers in general

The principle in EU law is that personal data of EU citizens, or data collected in the EU, may not be transferred to a third country unless the Commission has found that this third country affords an ‘adequate’ level of protection. This used to be regulated under Article 25 of the Data Protection Directive. Nowadays, it is addressed by Article 45 of the GDPR.

Although what constitutes an adequate level of protection is defined neither by the Directive nor by the GDPR, the Commission provides the following insight: ‘Under EU law, an adequacy finding requires the existence of data protection rules comparable to the ones in the EU. This concerns both the substantive protections applicable to personal data and the relevant oversight and redress mechanisms available in the third country.’121

Such decisions have been granted to 13 countries, including Switzerland,122 Israel123 and Japan.124 The US has only ever been granted partial adequacy findings. For data transfers from the EU to the US to be adequate, they have to be funneled through approved mechanisms such as the Safe Harbor then, and the Privacy Shield now.

121 COM/2017/07 Communication From The Commission To The European Parliament And The Council, Exchanging and Protecting Personal Data in a Globalised World.
122 Commission Decision 2000/518/EC of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequate protection of personal data provided in Switzerland.
123 Commission Decision 2011/61/EU of 31 January 2011 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequate protection of personal data by the State of Israel with regard to automated processing of personal data.
124 Commission Implementing Decision (EU) 2019/419 of 23 January 2019 pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council on the adequate protection of personal data by Japan under the Act on the Protection of Personal Information.

Other secondary mechanisms exist to transfer personal data from the EU to other countries, such as the Standard Contractual Clauses,125 also referred to as ‘Model Contracts,’126 the Binding Corporate Rules,127 and specific agreements concerning Passenger Name Records,128 and the Terrorist Financing Tracking Programme.129

3.1.2 Commission Decision 2000/520 ‘Safe Harbor’

The Commission’s adequacy Decision 2000/520 (the Safe Harbor Decision)130 stated that companies processing data of EU and EEA nationals could transfer this data to the US, provided that they complied with the Safe Harbor Privacy Principles and had registered to do so with the US Department of Commerce. It created a presumption that transfers taking place through this program were ‘adequate’ for the purposes of Article 25 of the Data Protection Directive. It consisted of 11 recitals, 6 articles and 7 annexes and was intended to regulate the onward transfer of personal data collected in the EU toward servers located in the United States.

It created a compliance program operated by the US Department of Commerce whereby companies were able to transfer personal data from the EU to the US, provided that they self-certified their compliance with a set of seven ‘Privacy Principles’ reflecting the protections granted by the EU Data Protection Directive. Those were: notice, choice, onward transfer, security, data integrity, access and enforcement. The Federal Trade Commission was tasked with enforcing compliance with the program. The program was reviewed and renewed by

125 European Commission, Standard Contractual Clauses (SCC) . 126 Linklaters, AG Opinion Issued in Schrems – Safe Harbor invalid. Are Model Contracts also at Risk? . 127 European Commission, Binding Corporate Rules (BCR) . 128 European Commission, Passenger Name Record (PNR) . 129 European Commission, Terrorist Finance Tracking Programme . 130 Commission Decision 2000/520/EC of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce.

the Commission in 2002 and 2004.131 It was operated throughout much of the life of the Data Protection Directive.

3.1.3 The Snowden revelations

In 2013, former CIA and NSA contractor Edward Snowden132 revealed to newspapers such as The Guardian133 and The Washington Post134 an unprecedented number of classified documents pertaining to surveillance activities undertaken by US authorities. In essence, his successive leaks135 brought to the fore the existence of secret US surveillance programs targeted not only at its own population but also at citizens of allied countries, including EU Member States such as Germany and France. Chief among them was the PRISM program, which allowed the NSA to access data gathered by major US tech companies such as Facebook, Microsoft and Google.136

This spurred a backlash in public opinion and a reckoning with the importance of data privacy. US NGOs such as the Electronic Frontier Foundation, the American Civil Liberties Union and the Electronic Privacy Information Center filed a number of lawsuits challenging the constitutionality of the programs.137 In the EU, relations between Germany and the US were strained after it was alleged that American agencies had been phone-tapping Chancellor Angela Merkel.138 This led the heads of the German DPAs to express that ‘there is a substantial

131 Coudret F., Schrems vs. Data Protection Commissioner: A Slap on the Wrist for the Commission and New Powers for Data Protection Authorities, European Law Blog (2015) . 132 Express VPN Education, Biography of Edward Snowden . 133 Macaskill E. and Dance G., NSA Files: Decoded, The Guardian (2013) . 134 Noack R., Edward Snowden Revelations, The Washington Post (2016) . 135 The Lawfare Blog, for a chronology of the Snowden Revelations . 136 BBC News, Edward Snowden: Leaks that Exposed US Spy Programme (2014) . 137 Brown I., The feasibility of transatlantic privacy-protective standards for surveillance, International journal of law and information technology (2014). 138 BBC News, Snowden NSA: Germany Drops Merkel Phone-Tapping Probe (2015) .

likelihood that the principles of the Commission’s [Safe Harbor] decision are being violated.’139 It also prompted MEP and former Vice President of the European Commission Viviane Reding to say that ‘The Safe Harbor agreement may not be so safe after all.’140 This event, which Max Schrems calls the ‘Chernobyl of data protection,’141 served as a backdrop for the CJEU to invalidate the Safe Harbor agreement.

3.2 The Schrems judgement

3.2.1 Facts of the case

Maximilian Schrems was a law student from Austria who, during an exchange semester at Santa Clara University in Silicon Valley, had a defining encounter with Ed Palmieri, an in-house privacy counsel at Facebook. During a seminar in which they discussed European Data Protection laws, Schrems recalls: ‘He was basically saying: “F*** it, we do whatever we want to and there’s no consequence.”’142 This prompted Schrems to write a paper about Facebook’s privacy practices. For his research, he requested the company to send him all the data it held about him, using his right to access under the Directive. ‘He discovered that Facebook’s dossiers on individual users are hundreds of pages long and include information users thought had been deleted’143 and are written in an extremely complex language virtually impossible for lay people to understand.144 He makes an analogy between Facebook’s profiling of users and the files of the Stasi on East Germans: ‘The most prominent politicians had a thousand pages in their files […] I’m just a normal guy who’s been on Facebook for three years. Imagine this in 10 years: every demonstration I’ve been to, my political thoughts, intimate conversations,

139 COM/2013/0847 Communication From The Commission To The European Parliament And The Council on the Functioning of the Safe Harbour from the Perspective of EU Citizens and Companies Established in the EU . 140 European Commission, Memo on the Informal Justice Council in Vilnius . 141 Kuchler H., Max Schrems: the Man Who Took On Facebook – and Won, The Financial Times (2018) . 142 Ibid. 143 Hill K., Max Schrems: The Austrian Thorn in Facebook’s Side, Forbes (2012) . 144 ’How can a user work for 10 hours a day, then go back home and understand how Facebook’s algorithm works? I don’t understand and I’ve been doing that for seven years’ Kuchler H., Max Schrems: the Man Who Took On Facebook – and Won, The Financial Times (2018) .

discussion of illnesses.’145 When he got back to Austria, he founded the association Europe-v- Facebook146 aiming to streamline his litigation efforts and gather media attention around the company’s handling of personal data. Today, most of his advocacy work happens through his NGO None Of Your Business.147

This led Max Schrems to pursue a number of data protection lawsuits, chiefly the one that struck down the Safe Harbor. In that particular case, although the company was involved, he did not directly sue Facebook or any other tech company. In 2013, he lodged a complaint with the Irish Data Protection Commissioner to prohibit Facebook, whose European operations are managed from Ireland, from transferring his personal data to the US. His argument was that the country did not guarantee an adequate level of protection against the surveillance activities of US authorities, referring to the Snowden revelations.148 The DPC rejected the complaint, on the ground that he had not proven that the NSA accessed his data, and that in any case the US had been found by the Commission to afford an adequate level of protection.149

3.2.2 The reference for a preliminary ruling

The High Court, on appeal, referred the case to the CJEU, noting that Snowden had demonstrated a ‘significant over-reach’ by US authorities and that EU citizens had no effective right to be heard to challenge these practices.150 It added that, had the case been decided solely on the basis of Irish law, ‘the mass and undifferentiated accessing of personal data is clearly contrary to the principle of proportionality and the fundamental values protected by the Irish Constitution.’151

The High Court asked the CJEU whether the Safe Harbor Decision precluded a Data Protection Authority from assessing the adequacy of the protection afforded by the US. The Court equated

145 Hill K., Max Schrems: The Austrian Thorn in Facebook’s Side, Forbes (2012) . 146 Facebook’s data pool as described on Max Schrems’ Europe-v-Facebook website . 147 None Of Your Business, the current NGO through which Max Schrems carries out his advocacy and judicial activities . 148 C-362/14 Schrems EU:C:2015:650, §28. 149 Ibid., §29. 150 Ibid., §30-31. 151 Ibid., §33.

this question with that of the validity of the Decision itself: it would make no sense to allow a DPA to investigate American protections without giving it the right to find them inadequate; consequently, if one DPA were to find the protection in the US inadequate, this would defeat the purpose of the whole Safe Harbor. This followed Advocate General Yves Bot’s Opinion. He had advised the Court to find the Safe Harbor Decision invalid and to hold that Member States’ DPAs should be able to review the use of the Safe Harbor regarding data flowing from their country and, if need be, to suspend those flows.152

3.2.3 Reasoning of the Court

The CJEU, in a Grand Chamber formation, found the Safe Harbor Decision invalid. Its reasoning essentially focused on the absence of oversight, legal protection and remedies against access to EU citizens’ data by US public authorities, most importantly intelligence agencies.

First, the Court set a ‘strict’ standard of review for the Commission’s Decision, an interpretation warranted by the importance of privacy and data protection under the Charter of Fundamental Rights.

Second, it noted that Article 1 of Decision 2000/520 created a ‘presumption of adequacy’ of the Safe Harbor program with EU data protection law. The second paragraph of Annex I to the Decision excluded public authorities from its scope. The fourth paragraph of Annex I contained an exception to compliance with the Safe Harbor Principles by private companies taking part in the program. This exception meant that ‘“national security, public interest, or law enforcement requirements” have primacy over the safe harbor principles.’153 In other words, ‘Clearly, where US law imposes a conflicting obligation, US organisations whether in the safe harbor or not must comply with the law.’154 The Court also noted that US law did not lay down any rules limiting such surveillance where EU citizens were concerned.155 In the light of this, the Court found that US authorities were able to process data ‘on a generalized basis […] without any differentiation, limitation or exception.’156 This would be a breach of the

152 Opinion of Advocate General Bot in Schrems, §237. 153 C-362/14 Schrems EU:C:2015:650, §28. 154 Part B of Annex IV of the Safe Harbor Decision. 155 C-362/14 Schrems EU:C:2015:650, §88. 156 Ibid., §93.

essence of the fundamental right to respect for private life under Article 7 of the Charter and its Digital Rights Ireland157 jurisprudence.158

Third, a further argument supporting its reasoning was the fact that the enforcement role of the FTC over the Safe Harbor Principles did not, by definition, extend to the processing of personal data by public authorities.159 In effect, EU citizens would be left with ‘no administrative or judicial means of redress.’160 The Court found that this would be a violation of the right to effective judicial protection under Article 47 of the Charter and its Les Verts161 jurisprudence.

As a result, because respect for fundamental rights is a condition of validity of EU acts under the Kadi162 jurisprudence, and because the essence of those rights was violated, the Court declared the Safe Harbor Decision invalid without even needing to examine its content163 or assess it against the principle of proportionality.164

3.3 Receptions of the judgement

The Schrems case has been widely criticized by certain scholars. Pr. David Bender provided a comprehensive and compelling critique in his article for the journal International Data Privacy Law: ‘Having mishandled Safe Harbor, will the CJEU do better with Privacy Shield? A US perspective.’165 His account of the case can be summarized as follows: the Court wrongly relied on the premise that PRISM was a bulk data collection program, failed to take into account substantial changes to surveillance law in the US following the Snowden revelations, and ‘took no note of the fact that, overall, US national security surveillance is more privacy-sensitive than is most EU Member State national security surveillance.’166

157 C-293/12 and C-594/12 Digital Rights Ireland and others EU:C:2014:238, §39. 158 C-362/14 Schrems EU:C:2015:650, §94. 159 Opinion of Advocate General Bot in Schrems, §205 (cited by court at §89), itself citing FAQ 11 in Annex II to Decision 2000/520 and Annexes III, V and VII. 160 C-362/14 Schrems EU:C:2015:650, §94. 161 C-294/83 Les Verts v Parliament EU:C:1986:166, §23. 162 C‑584/10 P, C‑593/10 P and C‑595/10 P Commission and Others v Kadi EU:C:2013:518, §66. 163 C-362/14 Schrems EU:C:2015:650, §98. 164 Article 52(1) of the Charter does not permit the use of a proportionality test when the essence of a fundamental right is impaired. 165 Bender D., Having mishandled Safe Harbor, will the CJEU do better with Privacy Shield? A US perspective, International Data Privacy Law (2016). 166 Ibid., p.118.

First, the major issue which led the CJEU to find that the PRISM program breached the essence of the rights to privacy and data protection was the premise that it was a bulk data collection program. This information came directly from the revelations by Edward Snowden to The Washington Post. However, the newspaper later corrected some of its claims,167 and even the Fundamental Rights Agency of the European Union recognized the factual errors. As former White House privacy expert Peter Swire explained in a 2015 white paper to the Belgian DPA: ‘Based on the corrected facts, the Fundamental Rights Agency and the US Privacy and Civil Liberties Oversight Board have found that PRISM is not a bulk collection program but instead is based on the use of targeted selectors such as emails.’168

Second, Bender argues that the CJEU applied a double standard to US authorities. He points to the fact that in the wake of terrorist attacks and threats in countries such as France and Belgium, EU Member States have adopted laws that he considers less protective than the new American ones. For example, he cites a 2015 New York Times op-ed169 by the former Council of Europe Commissioner for Human Rights flagging France’s most recent surveillance law as: ‘a controversial law on surveillance that permits major intrusions, without prior judicial authorization, into the private lives of suspects and those who communicate with them, live or work in the same place or even just happen to be near them.’ Even though the Court has minimal jurisdiction over national security matters and could not declare these laws contrary to EU law the way it invalidated the Safe Harbor, it seems odd to hold a non-EU country to a higher standard than EU Member States.

On the other hand, after the judgement was handed down, Edward Snowden tweeted, lauding Max Schrems: ‘You’ve changed the world for the better.’170 The latter wrote on the website of Europe-v-Facebook: ‘[I] very much welcome the judgement of the Court, which will hopefully be a milestone when it comes to online privacy. This judgement […] clarifies that mass

167 Hall J., Washington Post Updates, Hedges on Initial PRISM Report, Forbes (2013) . 168 Swire P., White Paper on US Surveillance Law, Safe Harbor, and Reforms Since 2013 submitted to the Belgian DPA for its 2015 forum on ’The Consequences of the Judgement in the Schrems case’ . 169 Muiznieks N., Europe is Spying on You, The New York Times (2015) . 170 Snowden E., tweet about Maximilian Schrems (2015) .

surveillance violates our fundamental rights. Reasonable legal redress must be possible.’171 The CJEU’s decision also seemed to please the European Parliament, which held ‘a long-standing position regarding the lack of adequate level of protection of fundamental rights under the Safe Harbor regime’ and had ‘repeatedly called for the suspension of the Safe Harbor principles.’172 As it noted in its 2017 ‘From Safe Harbour to Privacy Shield’ report, ‘Some US technology companies saw the striking down of the SH as a wake-up call for businesses.’173 Others, such as Microsoft, generally welcomed the case as an occasion to further the discussion about the future of global regulation of the internet, and of privacy in particular.174 Organizations such as the Jean Monnet Saar also ran headlines reading ‘Your Facebook data just got a lot more secure.’175 But did it, really?

3.4 Ten myths about the Schrems case

This section will explore ten myths about the Schrems case and the Safe Harbor, selected from articles by Baker McKenzie Partners Lothar Determann176 and Brian Hengesbaugh.177 Their analysis is essential to complete our understanding of the background leading up to the Privacy Shield.178

171 Max Schrems’ initial response to the CJEU’s Schrems decision . 172 European Parliament, From Safe Harbour to Privacy Shield – Advances and Shortcomings of the New EU-US Data Transfer Rules . 173 Ibid. 174 Smith B., The Collapse of the US-EU Safe Harbor: Solving the New Privacy Rubik’s Cube (2015) . 175 Bagchi K., Dissecting the Safe Harbor Decision of the ECJ (2015) . 176 L. Determann, Partner in the Palo Alto office of Baker McKenzie. He is also Professor of Law at Berkeley and was cited in the section about property rights and data. Determann L., US Privacy Safe Harbor – More Myths and Facts, Bloomberg BNA (2016) . 177 B. Hengesbaugh is a Partner in the Chicago office of Baker McKenzie and is the firm’s Global Chair of Data Privacy and Security, Hengesbaugh B., Five Myths About Safe Harbor, International Association of Privacy Professionals (2015) . 178 The titles from this section mostly come from the two articles they wrote (referenced in the two previous footnotes) and most further references also come from links they used in those articles.

3.4.1 The invalidation of Commission Decision 2000/520 was inevitable

As explained by Lothar Determann: ‘The High Court in Ireland could have simply overturned the Irish Data Protection Commissioner's decision in the case at hand, finding it should have suspended data flows based on Article 3 of the Safe Harbor Adequacy Decision.’ This would have cut data flows from Ireland without jeopardizing transfers from the rest of the EU. Moreover, at the time of the judgement, the Commission had already been negotiating a Safe Harbor 2.0 for two years, which could have addressed the concerns of the Court. More importantly, the applicant, Max Schrems, did not even argue that the Safe Harbor was invalid.

3.4.2 The Safe Harbor was a treaty

The Safe Harbor was neither a treaty nor any kind of pact within the meaning of the UN Vienna Convention on the Law of Treaties.179 It did not even consist of a single document. It was a privacy protection compliance program, an ‘international cooperative arrangement,’180 operated by the US DoC and enforced by the FTC, which happened to have been deemed ‘adequate’ by the Commission.

3.4.3 The Safe Harbor was invalidated

As a result, there was no question of termination or invalidation. The US DoC continued operating the program after the Schrems judgement.181 The CJEU’s decision simply ‘de-harmonized’182 EU/US personal data transfers. Both the Commission and each EU Member State’s Data Protection Authority had their own adequacy decisions. The CJEU merely invalidated the Commission’s, not any of the Member States’. The ruling simply meant that

179 Vienna Convention on the Law of Treaties of 1969 . 180 Hengesbaugh B., Five Myths About Safe Harbor, International Association of Privacy Professionals (2015) . 181 Scott M., Data Transfer Pact Between U.S. and Europe is Ruled Invalid, The New York Times (2015) . 182 Determann L., US Privacy Safe Harbor – More Myths and Facts, Bloomberg BNA (2016) .

Member States were no longer bound by the Commission’s decision and were able to individually declare the Safe Harbor non-adequate as far as their State was concerned.183

3.4.4 The Safe Harbor was meant to address government surveillance issues

The Safe Harbor was in reality a commercial arrangement; the expressions ‘government surveillance’ and even simply ‘surveillance’ do not appear in its documents. It was concluded before the 9/11 World Trade Center attacks, the Patriot Act184 and the enactment of the PRISM program revealed by Edward Snowden. Government surveillance is specifically addressed by other instruments such as the Umbrella Agreement.185 If anything, surveillance matters remain outside the competence of the EU institutions, hence a Commission Decision could not have dealt with them. At best, one might have argued that it indirectly addressed surveillance enabled by private actors, as explained in the section about Surveillance Capitalism.

3.4.5 The Safe Harbor was poorly enforced by the FTC

As Brian Hengesbaugh explains: ‘After counseling many hundreds of companies over the years on the range of options to address cross-border data transfers, I can say with certainty that the deterrent effect of potential FTC action is a strong motivator for US organizations to build strong privacy programs implementing the Safe Harbor rules and to maintain those programs on an on-going basis.’186

183 For instance, countries such as France, Spain and Germany made such findings, see p.14 of European Parliament, From Safe Harbour to Privacy Shield – Advances and Shortcomings of the New EU-US Data Transfer Rules . 184 US DoJ, The USA PATRIOT Act: Preserving Life and Liberty , referring to Pub.L. 107-56 Patriot Act. 185 EUR-Lex, Summary of EU Legislation: EU-US Agreement on Personal Data Protection . 186 Hengesbaugh B., Five Myths About Safe Harbor, International Association of Privacy Professionals (2015) and FTC, Legal Resources, cases brought under the Safe Harbor and Privacy Shield Frameworks .


3.4.6 The main victims were the US companies participating in the program

The companies most affected were those operating in the EU/EEA which needed to send personal data from the EEA to the US, regardless of whether they were European or American. It is important to bear in mind that registration for and compliance with the program were voluntary. Neither EU nor US law imposed the use of the Safe Harbor. Companies which did not wish to participate had other alternatives, such as the Binding Corporate Rules and the Standard Contractual Clauses. In practical terms, the invalidation could have led companies certified under the Safe Harbor to keep their European customers’ data on European servers and seek a new legal basis to justify potential transatlantic transfers.187

As a side note, Safe Harbor’s simplicity had led other countries such as Israel and Switzerland to adhere to the program. As a result, companies transferring personal data from these countries were also affected, regardless of where they were incorporated. In total, around 4,600 companies had self-certified through the Safe Harbor.188

3.4.7 Data transfers to the US were rendered illegal

As mentioned above, the CJEU’s decision did not have any impact on the adequacy decisions concerning the Binding Corporate Rules and the Standard Contractual Clauses. It was therefore still possible to transfer personal data from the EU to the US via these mechanisms, or even still through the Safe Harbor, by changing the legal basis of these transfers from Commission Decision 2000/520 to adequacy decisions made by Member States.

3.4.8 US law was not ‘adequate’ because not ‘equivalent’ to the EU legal order

The wording used by the CJEU is confusing. In paragraph 96 of the Schrems decision, it seems to link ‘adequacy’ to a test of ‘equivalence’ or ‘essential equivalence.’ For the Commission to make an adequacy decision, a third country must afford ‘by reason of its domestic law or its international commitments, a level of protection of fundamental rights essentially equivalent to

187 Nyst C., At Last, The Data Giants Have Been Humbled, The Guardian (2015) . 188 Gilbert F., Invalidation of the Safe Harbor: Will it Cause the Adoption of Data Silos?, Bloomberg BNA (2015) .

that guaranteed in the EU legal order.’ This seems to conflate ‘adequacy,’ which is a test addressed to transfers from the EU/EEA to third countries, with the test of ‘equivalence,’ which the Data Protection Directive, and now the GDPR, only requires between EEA Member States.

However, considering each EEA Member State’s privacy laws equivalent is a fiction. Member States’ rules concerning surveillance differ widely.189 It is therefore questionable to ask a third country with a vastly different legal system, i.e. the US, to pass a bar that states with supposedly similar legal systems cannot.

3.4.9 EU data subjects had access to no means of redress nor remedies

The CJEU focused on the absence of remedies in case of government interference with data held by private companies certified under the Safe Harbor. However, the program contained strong means of redress for all other kinds of infringements on the part of participating companies. If anything, the invalidation deprived EU citizens of the protection of the FTC.

3.4.10 The CJEU enhanced privacy protection for EU citizens

Lothar Determann explains that ‘intelligence services in the US and in Europe are not subject to the CJEU decision. For all we know, they have been gathering data just as much since [the Judgement] on both sides of the Atlantic and continue to operate closely with each other.’190 As a result, if the CJEU’s goal was to curb US surveillance of European citizens, the Schrems judgement might have missed the point. It did, however, cause compliance headaches for companies on which US surveillance agencies might have been eavesdropping, and offered leverage to the Commission in its negotiations of Safe Harbor 2.0, now known as the Privacy Shield.

189 Determann L., K. T. Guttenberg, On War and Peace in Cyberspace – Security, Privacy, Jurisdiction, Hastings Constitutional Law Quarterly (2014). 190 Ibid.

3.5 Third sub-conclusion

In the light of the discussion in this chapter, the Safe Harbor was discontinued due to mounting political pressure against the US’ use of private companies to gather intelligence and harness the personal data of EU citizens, as revealed by whistle-blower Edward Snowden. The CJEU sought to give EU representatives more leverage to address the issue as the negotiations for a Safe Harbor 2.0 were coming to a close, in order to ultimately award EU citizens stronger privacy protections, for example concerning redress and dispute resolution mechanisms. The Schrems case is still highly controversial and politically loaded, and has been criticized as a display of judicial activism.


4 THE CURRENT FRAMEWORK: THE PRIVACY SHIELD AND OTHER COMPLIANCE VEHICLES

What are the current regimes and frameworks available to transfer personal data from the EU to the US?

This chapter will provide its reader with background on the negotiations, content, review and challenges of and to the Privacy Shield, as well as basics on the alternative compliance vehicles to transfer personal data from the EU to the US.

4.1 Negotiations of Safe Harbor 2.0 and ‘re-branding’ to Privacy Shield

At the time of the invalidation of the Safe Harbor Decision by the CJEU, EU and US authorities were already negotiating an updated framework, intended to be called ‘Safe Harbor 2.0.’ It was extremely difficult to find official documents detailing the negotiations, but draft minutes from the 103rd meeting of the Article 29 Working Party inform us that the delivery of the Schrems judgement did not stop the negotiations and that both EU and US negotiators were hoping to reach a new agreement by the end of January 2016.191 Former US Secretary of Commerce Penny Pritzker announced the completion of what had been re-branded the ‘Privacy Shield’ on 2 February 2016. She hailed it: this ‘historic agreement is a major achievement for privacy and for businesses on both sides of the Atlantic. It provides certainty that will help grow the digital economy by ensuring that thousands of European and American businesses and millions of individuals can continue to access services online.’192

4.2 Content of the Privacy Shield package

What was said above in relation to the Safe Harbor applies to the Privacy Shield too. It is not an international treaty. It is a voluntary certification scheme administered by the DoC which happens to have been deemed ‘adequate’ by the Commission. As such, it does not consist of a single document but is rather a ‘package.’ Its core is made up of the seven ‘Privacy Shield Principles.’ It is complemented by five letters from US institutions aimed at clarifying both

191 Article 29 Data Protection Working Party, Minutes of the 103rd meeting of the Article 29 Data Protection Working Party . 192 Statement from U.S. Secretary of Commerce Penny Pritzker on EU-U.S. Privacy Shield (2016) .

parties’ understanding and commitments from the US side. I call these the Privacy Shield ‘correspondence’ and will address it first.

The correspondence is principally made up of: an International Trade Administration letter193 and the Arbitral Model; a letter from US Secretary of State John Kerry to Commissioner Jourova194 with the Ombudsperson Mechanism;195 a letter from FTC Chair Edith Ramirez to Commissioner Jourova;196 a letter from the ODNI to the US Department of Commerce;197 and a letter from US Deputy Assistant Attorney General Bruce Swartz to the Department of Commerce.198 In essence, they contain precisions and clarifications, as well as reassurances for the EU that US institutions take the framework’s requirements seriously and are giving themselves the means to achieve its purpose.

The core of the Privacy Shield framework consists of the following seven principles: Notice; Choice; Accountability for onward transfer; Security; Data integrity and purpose limitation; Access; and Recourse, enforcement and liability. In essence, these aim to mirror, or attempt to mirror, requirements of EU law to be applied by participants in the program, even concerning operations that would otherwise have fallen outside the territorial scope of EU law. They act as a bridge, a middle ground, between the EU and US regulatory regimes. They could be considered a softer version of EU data protection law, or a reinforced US information privacy law.

4.3 Annual reviews of the Privacy Shield

The adequacy decision provides for a yearly review of the functioning of the Privacy Shield. The first annual review, in 2017, logically focused on certification rather than monitoring and supervision, because the program was just getting started. The Commission used the

193 International Trade Administration Letter . 194 Letter from US Secretary of State John Kerry to Commissioner Jourova . 195 Ibid, Annex on the Ombudsperson Mechanism . 196 Letter from FTC Chair Edith Ramirez to Commissioner Jourova . 197 Letter from the ODNI to the US Department of Commerce . 198 Letter from US Deputy Assistant Attorney General Bruce Swartz to the Department of Commerce .

occasion to propose an array of recommendations going forward. This section will focus on the second report, from 2018. It concentrated on US authorities’ incorporation of the 2017 recommendations in two areas: the commercial aspects of the program, and access by US authorities to data of EU citizens transferred through it.

4.3.1 Commercial considerations

The report mostly focused on several issues: (re)certification, compliance monitoring, enforcement, and complaint handling.

First, concerning the certification and re-certification process,199 after the program’s first two years in place, about 3,900 companies had already signed up and certified to the program,200 nearly as many as under the Safe Harbor. Around 1,000 more applications were still pending.201 Between 2017 and 2018, 93% of companies decided to renew their certifications. The DoC added elements to the certification procedure, such as a standstill obligation whereby ‘companies should not be allowed to make public representations about their Privacy Shield certification before the DoC has finalized the certification.’202 It also published guidance on the Privacy Shield’s website to clarify the procedure.203 So far, the program has proved extremely successful. Only a few companies decided not to re-certify, citing ‘corporate or business changes, changes to activities where personal data from the EU is no longer collected and the use of other tools [such as the SCCs and the BCR]. Concerns about onerous requirements (in particular the Onward Transfer Principle), as well as validity and long-term stability of the Privacy Shield were also raised.’204

Second, concerning DoC compliance monitoring, the Department put in place a variety of supervision instruments. It carries out random ‘spot-checks’ to verify the data practices of certified companies, conducts ‘sweeps’ of participants’ privacy policies, and screens public reports to make sure the compliance information on their websites remains

199 For companies already certified and wishing to renew their certification.
200 European Commission, Staff Working Document for the Second Annual Review of the Privacy Shield, p.4 <https://ec.europa.eu/info/sites/info/files/staff_working_document_-_second_annual_review.pdf>.
201 Over 4,000 in 15 years of existence.
202 Ibid, p.6.
203 US Department of State, Privacy Shield Ombudsperson.
204 European Commission, Staff Working Document for the Second Annual Review of the Privacy Shield, p.8, fn. 22 <https://ec.europa.eu/info/sites/info/files/staff_working_document_-_second_annual_review.pdf>.

up to date. It also works closely with EU DPAs to detect potential compliance issues. Moreover, it developed a comprehensive plan to identify false claims of participation in the Privacy Shield. To that end, it has sent over 400 warning letters and referred more than 100 companies to the FTC since the launch of the program.

Third, concerning FTC enforcement, only 8 of the 100 cases referred to it by the DoC led to enforcement steps – the mere threat of a fine led the overwhelming majority of companies to fall in line without the need to actually initiate proceedings. Most of those cases concerned false claims of participation, failure to complete the certification procedure, or failure to recertify. Besides, the agency can choose to engage in investigations of its own volition, focusing either on a particular industry or on a specific privacy principle. One example is its investigation into the infamous Cambridge Analytica, which is no longer part of the Privacy Shield. However, the report does not provide insights on the FTC’s handling of that case, as the agency does not share information pertaining to on-going investigations. Depending on the outcome of the case, the DoC stated that ‘Facebook will be removed from the Privacy Shield list should the FTC determine it failed to comply with its commitments under the framework.’205

Fourth, concerning complaint handling, the report noted that an extremely low number of complaints by data subjects to companies was brought forward during the first two years of the program. One reason is that companies actually receive many enquiries and claims, but that these are resolved voluntarily before escalating into complaints. The DoC, DoT and FTC likewise saw few or no direct complaints from data subjects, and the Arbitration Panel has not yet been triggered. On the other hand, the Independent Recourse Mechanisms received a great number of complaints: the Better Business Bureau saw some 700 and TrustArc about 300. However, the report notes that ‘the majority of complaints turn out to be ineligible, because they contain an incoherent message, do not come from EU individuals or are not related to the Privacy Shield.’206

Lastly, the report mentioned specific areas of improvement suggested by the Commission, such as deeper engagement in awareness raising and cooperation between EU and US authorities. It also included a discussion on specific areas of potential divergences which will need to be

205 Ibid.
206 Ibid.

addressed in the future, namely concerning automated individual decision making and the definition of human resources data.

4.3.2 Access by US authorities for national security and law enforcement

American authorities are able to access data transferred to the US through the Privacy Shield. However, they may only do so under specific circumstances, regulated by the Foreign Intelligence Surveillance Act207 or by statutes pertaining to National Security Letters. In any case, the ‘collection of personal data is always targeted.’ Bulk data collection is an exceptional mechanism that may never be applied to data transferred under the Privacy Shield.208

The report welcomed recent developments in US law in favor of stronger privacy protections in relation to national security, namely the reauthorization of FISA Section 702, the affirmation of commitment to PPD-28209 and the CLOUD210 Act. The report also noted the US’ efforts to fill all the positions at the Privacy and Civil Liberties Oversight Board, which had only one board member left at the time of the first review. The Commission however regretted that the US had not yet appointed a permanent Privacy Shield Ombudsperson,211 making this grievance the headline of its press release on the 2018 report. This was remedied on 18 January 2019, when the White House announced the nomination of Keith Krach212 to the role.213 Moreover, during the 2018 review meeting, the US Inspector General for the Intelligence Community was present and ‘confirmed that any referral from the Privacy Shield Ombudsperson would receive his “serious, timely and effective attention.”’

207 50 USC 36 §1801 et seq. Foreign Intelligence Surveillance Act (FISA).
208 European Commission, Staff Working Document for the Second Annual Review of the Privacy Shield, p.25.
209 Presidential Policy Directive 28: Signals Intelligence Activities.
210 Pub.L. 115-141 Clarifying Lawful Overseas Use of Data Act (CLOUD).
211 Lee A., US to Appoint Permanent Privacy Shield Ombudsperson, as EU Pressure Tells, Euractiv (2019).
212 The White House, President Donald J. Trump Announces Intent to Nominate Individual to Key Administration Posts.
213 Daily Dashboard by the International Association of Privacy Professionals, Keith Krach Nominated as Privacy Shield Ombudsperson.

Concerning access to personal data for law enforcement purposes, the Carpenter214 case has been one of the most important developments noted by the Commission. The Commission also discussed with the DoJ a 2017 memo by Deputy Attorney General Rod Rosenstein regarding the Stored Communications Act.215 Under that act, ‘US authorities, on the basis of a warrant, subpoena or court order, can obtain records and information relating to customers or subscribers from providers of electronic communications services or remote computer services.’216 The memo intends to ‘harmonise the current practice of applications for protective orders and set a general ceiling on how long a notification can be withheld.’ Protective orders are tools permitting law enforcement authorities not to notify, for a limited period of time, a person that their data is being investigated. The DoJ reiterated to the Commission that the memo ‘contributes to increased transparency and prevention of overuse of protective orders.’217

4.4 The uncertain future of the Privacy Shield

The foremost critique of the Privacy Shield comes from Max Schrems himself. He referred to it as a ‘soft update of the Safe Harbor,’ an ‘absolutely laughable proposal for a new US data sharing agreement.’218 Firstly, he argues that it ‘does not even regulate the vast majority of processing operations by US controllers, as the proposed Notice & Choice principles only limit the “change of purpose” and any forwarding of data to a third party.’219 For example, sensitive operations by private companies, such as profiling, are excluded from the scope of the Privacy Shield. Secondly, concerning government surveillance, he essentially accuses the new framework of playing on words, with limitations applying only to data ‘used in bulk’ but not ‘collected in bulk.’220 He also contends that the redress mechanism is ‘an outright affront to the highest court in the European Union,’ because the Ombudsperson cannot afford any real remedy

214 Carpenter v U.S., 585 U.S. 138 (2018), concerning privacy and access to phone location history for law enforcement purposes. The Supreme Court held that under the Fourth Amendment, authorities needed a warrant to access such information.
215 18 USC 121 §§2701-2712 Stored Communications Act.
216 US DoJ, Rod Rosenstein, Memorandum on Policy Regarding Applications for Protective Orders Pursuant to 18 USC §2705(b).
217 Ibid.
218 Schrems M., The Privacy Shield is a Soft Update of the Safe Harbor, European Data Protection Law Review (2016).
219 Ibid.
220 Ibid.

but ‘only inform [an individual placing a request before it] that all US laws have been complied with, or that non-compliance was remedied.’221

To date, two direct actions and one indirect challenge have been brought against the Privacy Shield. The first was brought by the NGO Digital Rights Ireland, which sought an annulment action under Article 263 TFEU before the General Court.222 Its claims can be summarized as follows. It argued that the FISA permitted ‘public authorities to have secret access on a generalised basis to the content of electronic communications,’223 that the Privacy Shield ‘failed to safeguard against indiscriminate access to electronic communications by foreign law enforcement authorities,’ and that it did not constitute ‘“international commitments” within the meaning of Article 25(6) of Directive 95/46.’224 It concluded that the framework breached the rights to privacy, data protection, freedom of expression and assembly, the right to an effective judicial remedy and the principle of good administration. However, the General Court dismissed the action as inadmissible for lack of legal interest. It held that Digital Rights Ireland could not argue that its data was endangered by transfers to the US because it is not a natural person, and that in any case it did not transfer personal data to US companies certified under the Privacy Shield. Furthermore, it considered that because Digital Rights Ireland is not an association, it may not bring an ‘actio popularis’225 in the public interest of consumers, and that in any case it did not demonstrate that it had been empowered by consumers to do so.

The second was also an Article 263 action before the General Court. It was brought by the French NGO La Quadrature du Net and others226 and is still pending. Its arguments are similar to those of Digital Rights Ireland: US law would allow for generalized data collection, and the Privacy Shield would not afford any meaningful remedy, thus failing to meet the ‘essential adequacy’ standard set by the Court in Schrems.

The third challenge to the Privacy Shield is indirect, and originates in the first Schrems case. Following the CJEU’s decision, Max Schrems asked the Irish DPC to reformulate and re-submit

221 Ibid.
222 Case T-670/16 Order of the General Court Digital Rights Ireland v Commission ECLI:EU:T:2017:838.
223 Ibid.
224 Ibid.
225 Ibid.
226 Case T-738/16 Action brought on 25 October 2016 La Quadrature du Net and Others v Commission.

his complaint originally addressed at the Safe Harbor. The DPC agreed, and Max Schrems challenged the Standard Contractual Clauses. The proceedings before the Irish courts have been tedious ever since and have been the subject of six explanatory memoranda by the Data Protection Commissioner.227 Suffice it to say here that Max Schrems has structured his arguments in such a way as to put the CJEU in a corner, forcing the Privacy Shield to fall along with the 2010 Standard Contractual Clauses Commission Decision in the event of the latter’s invalidation. In essence, among many pleas and eleven questions referred by the Irish High Court to the CJEU, he asked whether the Privacy Shield Ombudsperson mechanism would be capable of handling surveillance issues arising from the use of the SCCs. By so doing, he led the High Court to ask whether the Privacy Shield Decision ‘constitute[s] a finding of general application binding on data protection authorities [and member states’ courts] to the effect that the US ensures an adequate level of protection’228 according to the Directive. Given the way this question was asked, the CJEU will most certainly have to assess the validity of the Privacy Shield, for the exact same reasons it had to assess that of the Safe Harbor. As pointed out by William Krouse, ‘Privacy Shield contains the exact same language as the previously invalidated Safe Harbor agreement: “Adherence to these Principle may be limited: (A) to the extent necessary to meet national security, public interest, or law enforcement requirements.”’229 The Court may well decide that the Ombudsperson mechanism does not appropriately remedy this. The major difference with the first Schrems case is political: it remains to be seen whether the CJEU will be willing to invalidate the Privacy Shield framework after only a few years of existence.

These legal challenges, however, are not the only hurdles rendering the Privacy Shield’s future uncertain. Without getting into specifics, political challenges on both sides of the Atlantic, such as what Prof. Schwartz calls the ‘Trump effect,’230 will be critical going forward. Essentially, the adequacy finding in the Privacy Shield Decision rests on the assumption that the US’ framework meets the EU’s standard. However, a critical piece of that framework

227 Irish Data Protection Commission, Explanatory Memoranda on the Litigation Concerning the Standard Contractual Clauses (SCCs).
228 Ibid.
229 Krouse W., The Inevitable Demise of Privacy Shield: How to Prepare, The Computer & Internet Lawyer (2018).
230 Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017).

is Presidential Policy Directive 28,231 which President Donald J. Trump could unilaterally revoke, thereby jeopardizing the whole agreement.

4.5 Alternatives to the Privacy Shield

Other compliance vehicles designed for transferring personal data outside of the EEA generally may also be used between the EU and the US as alternatives to the Privacy Shield.

4.5.1 Consent and contracts

The rights to privacy and personal data protection afford data subjects the possibility to allow any transfer of their personal data by providing their explicit consent to such processing.232 Therefore, in theory, there is no need for any Privacy Shield or other instrument to transfer data from the EU to the US: so long as a data subject233 or privacy consumer says ‘Yes,’ such an onward transfer would be covered by a legally binding contract. However, in practice, this is not a viable option at scale. For consent to be valid under EU law, it must be freely given, specific, informed and unambiguous, and it may be revoked at any time.234 The Article 29 Working Party issued guidelines on how to interpret these requirements. In any case, consent is a possible but not advisable basis for EU-US personal data transfers. As an example, Advocate General Szpunar in Planet49235 advised the CJEU that pre-ticked boxes to accept internet cookies should not be deemed to express valid consent.236

231 According to the Department of Homeland Security: ’This Policy Instruction establishes the policies and procedures governing the safeguarding by Office of Intelligence and Analysis (I&A) employees of personal information collected from signals intelligence activities as required by Presidential Policy Directive-28, ”Signals Intelligence Activities,”’ January 17, 2014.
232 GDPR, Article 49.
233 Article 29 Data Protection Working Party, Guidelines on Consent under Regulation 2016/679.
234 GDPR, Article 4 and Convention 108, Article 5(2).
235 Case C‑673/17 Planet49 ECLI:EU:C:2019:246, Opinion of Advocate General Szpunar.
236 Norton Rose Fulbright, EU Advocate General Issues Opinion on Consent for Cookies and the Intersection between the ePrivacy Directive and the GDPR.

4.5.2 Standard Contractual Clauses

The Commission has issued a set of ‘Standard Contractual Clauses,’ also known as Model Contracts,237 which are essentially templates whereby an undertaking contractually commits to certain data protections not otherwise guaranteed in the jurisdiction of an onward transfer.238 Two Model Contracts exist: from EU controllers to non-EEA controllers,239 and from EU controllers to non-EEA processors.240 They were adopted under the Data Protection Directive and have been grandfathered by the GDPR. Undertakings are able to make modifications, so long as any ‘homemade’ provisions do not contradict the GDPR, and to the extent permitted by the SCCs themselves.241

Their advantages lie in the fact that they are easier to negotiate than the Binding Corporate Rules and somewhat less bureaucratic to manage, may be used between more than two parties, and are in principle applicable towards any jurisdiction.242 Their downsides mainly reside in their relative lack of flexibility, and in the fact that they are difficult to enforce, for example when foreign authorities seek to access EEA data in the context of a corporate investigation.

When the Safe Harbor Decision was invalidated, Facebook switched to these mechanisms for its transatlantic transfers. However, Max Schrems shortly thereafter challenged them too.243 In this new case,244 the High Court of Ireland filed a request for a preliminary ruling to the CJEU in April 2018245 on the validity of the EU controller to non-EEA processor SCC. The order for

237 Linklaters, AG Opinion Issued in Schrems – Safe Harbor Invalid. Are Model Contracts also at Risk?
238 European Commission, Standard Contractual Clauses (SCC).
239 Originally from Decision 2001/497/EC, then from Decision 2004/915/EC.
240 From Decision 2010/87/EU.
241 GDPR, Recital 109.
242 Voigt P., Von dem Bussche A., The EU General Data Protection Regulation (GDPR), Springer, 1st edition (2017), p.122.
243 Kelleher D., Standard Contractual Clauses to be Reviewed by CJEU, International Association of Privacy Professionals (2017).
244 Irish Data Protection Commission, Litigation Concerning the Standard Contractual Clauses.
245 Irish Data Protection Commission, Reference for a Preliminary Ruling before the CJEU by the Irish High Court.

reference has been filed under C-311/18.246 At the time of writing, the case is still pending and no Advocate General opinion has been issued.

4.5.3 Binding Corporate Rules

The Binding Corporate Rules are akin to codes of conduct for data transfers within international corporate groups. ‘They allow multinational companies to transfer personal data internationally within the same corporate group to countries that do not provide an adequate level of protection.’247 Contrary to the SCCs, they do not rely on templates. However, the former Article 29 Working Party issued several working papers,248 now endorsed by the EDPB,249 detailing what rules they should include and how they should be structured. They essentially reflect the principles of the former Directive and the GDPR, such as transparency, data quality, security and enforcement. In addition, to be valid, they must be approved by a DPA.250

In theory, they offer more flexibility than the SCCs and force companies to undertake measures that can in turn facilitate their compliance with EU data protection law in general.251 Some have even argued that they are the ‘most well regarded and prestigious method to legitimise the transfer of personal data in both the eyes of the European regulators and the data privacy community.’252 However, they have also been criticized as burdensome and lacking clarity,253 and have only been put in place within about 130 companies.254

246 Case C-311/18 Data Protection Commissioner v Facebook Ireland Limited, Maximillian Schrems, Reference for a preliminary ruling from the High Court (Ireland) made on 9 May 2018.
247 European Commission, Binding Corporate Rules (BCR).
248 Article 29 Working Party Guidelines 263, 264, 265, 256, 257, 133, 153, 154, 155, 74, 107 and 108: EDPB, GDPR: Guidelines, Recommendations, Best Practices.
249 EDPB, Endorsement of the EDPB of Soft Law Issued by the Article 29 Working Party.
250 GDPR, Article 47.
251 Voigt P., Von dem Bussche A., The EU General Data Protection Regulation (GDPR), Springer, 1st edition (2017), p.129.
252 White & Case, Binding Corporate Rules for Data Processors.
253 Voigt P., Von dem Bussche A., The EU General Data Protection Regulation (GDPR), Springer, 1st edition (2017), p.125.
254 List of companies for which the EU BCR cooperation procedure is closed as of 24 May 2018.

4.5.4 Codes of conduct and certifications

Codes of conduct255 and certifications256 are ‘semi-self-regulating’ instruments257 introduced by the GDPR and supplemented by guidelines from the EDPB. Codes of conduct are established by trade associations and representative bodies. They need to be approved by DPAs or the EDPB and, once approved, apply to a whole industry or sector. They essentially function as enhanced best practices258 at an EEA-wide level, but nothing limits them to that territorial scope and they could theoretically include companies located outside of the EEA.259 Certifications are a mechanism permitting a company to voluntarily undergo an audit by the relevant DPAs and the EDPB in order to be declared GDPR compliant across its structure, including operations and transfers outside the EEA.

4.6 Fourth sub-conclusion

The current ways to transfer personal data from the EU to the US are, first and preferentially, through the Privacy Shield, and, second, through the other standard compliance vehicles: consent and contracts, the SCCs, the BCRs, codes of conduct and certifications.

255 GDPR, Articles 40 and 41.
256 GDPR, Articles 42 and 43.
257 Bird & Bird, Codes of Conduct and Certifications.
258 UK Information Commissioner Office, Codes of Conduct.
259 Heimes R., Top 10 Operational Impacts of the GDPR: Part 9 – Codes of Conduct and Certifications, International Association of Privacy Professionals (2016).

SUMMARY AND GENERAL CONCLUSION

This Thesis seeks to give its reader the tools to understand the data privacy divide between the EU and the US. It explains the crucial notions, historical and jurisprudential factors and regulatory frameworks underlying and constituting it.

First, it answered why regulating data privacy is paramount to our democratic societies on both sides of the Atlantic. In essence, the growing importance of the data driven economy, whose raw material is our personal data, creates challenges to basic values of our democratic societies, for example privacy and the freedom of speech. This Thesis explored the darker side of the digital economy, sometimes referred to as a form of surveillance capitalism. It described how the advertisement-based business model of some of the most successful internet companies may, if left unregulated, render citizens vulnerable to enhanced forms of influence and manipulation, and weaken essential counter-powers such as dissidents, whistle-blowers and the press.

Second, it answered how the EU and US approaches to regulating data privacy differ. In essence, all processing of personal data in the EU must have a legal basis and a legitimate aim to be allowed, whereas there is no such conception in the US, where all processing of personal data is in principle legal. This originates in different historical roots and economic incentives on both sides of the Atlantic. The EU has had a painful experience with government surveillance and invasions of privacy, especially in the states of the former Eastern Bloc, in particular the former German Democratic Republic. On the contrary, the US does not have such a history, and its economy has enormously benefited from lax data privacy regulations, allowing it to grow internet giants such as Facebook and Google. As a result, the EU regulates privacy and data protection tightly and enshrines them as fundamental rights, while the US takes a more market-based and light-touch approach by treating data privacy essentially as a subset of consumer protection law. This explains the universalist will of the EU to enact all-encompassing regulations such as the GDPR, while the US’ data privacy landscape remains fragmented and arguably less protective.


Third, it answered why the first attempt at bridging the divide, the Safe Harbor, was discontinued. In essence, the CJEU decided to invalidate the Safe Harbor adequacy decision following the revelations by whistle-blower Edward Snowden. He had disclosed surveillance activities by US authorities which seriously strained the EU’s trust in the US’ level of protection of the data privacy of EU citizens. The Court essentially tried to put pressure on the US negotiators of what was then to be called the Safe Harbor 2.0, so that they would make concessions and afford stronger levels of protection for the data of EU citizens against American surveillance activities.

Fourth, it synthesized the current avenues for transferring personal data from the EU to the US. In essence, the Privacy Shield is the main vehicle enabling such data transfers. It consists of an enhanced update of the Safe Harbor, especially concerning compliance monitoring, enforcement and redress mechanisms. However, the Privacy Shield has already been subject to several challenges before the CJEU, some of which are still pending, in particular another one brought, yet again, by Max Schrems. Furthermore, this Thesis outlined the other tools which companies might decide to use in order to transfer personal data processed in the EU towards the US, such as consent and contracts, the SCCs, the BCRs, codes of conduct and certifications.

In conclusion, the Transatlantic data privacy divide can be understood as the result of different histories and economic incentives. Those have led the EU and the US to address in vastly different ways the challenges posed by the data-driven economy to their respective democratic systems of government. The Safe Harbor and the Privacy Shield have been attempts at bridging the gap, but it remains to be seen whether the EU and the US’ regulations of data privacy will ever converge or drift further apart.


REFERENCES

Academic sources

Bartow A., A Feeling of Unease About Privacy Law, University of Pennsylvania Law Review (2006).

Belbin R., When Google Becomes the Norm: The Case for Privacy and the Right to Be Forgotten, Dalhousie Journal of Legal Studies (2018).

Bender D., Having mishandled Safe Harbor, will the CJEU do better with Privacy Shield? A US perspective, International Data Privacy Law (2016).

Black J. and others, A Dictionary of Economics, Oxford, 5th edition (2017).

Bradford A., The Brussels Effect, Northwestern University Law Review (2012).

Brown I., The feasibility of transatlantic privacy-protective standards for surveillance, International journal of law and information technology (2014).

Coase R. H., 1991 Nobel Lecture: The Institutional Structure of Production, The Nature of the Firm: Origins, Evolutions, and Development, Oliver E. Williamson & Sidney G. Winter eds. (1991).

Determann L., K. T. Guttenberg, On War and Peace in Cyberspace – Security, Privacy, Jurisdiction, Hastings Constitutional Law Quarterly (2014).

Determann L., No One Owns Data, UC Hastings Research Paper No. 265 (2018).

Freiwald S., At the Privacy Vanguard: California’s Electronic Communications Privacy Act (CalECPA), Berkeley Technology Law Journal (2018).

Gady F.-S., EU/US Approaches to Data Privacy and the ‘Brussels Effect’: A Comparative Analysis, Georgetown Journal of International Affairs (2014).

Gordon J., John Stuart Mill and the Marketplace of Ideas, Social Theory and Practice (1997).

Hoofnagle C. J., Whittington J., Free: Accounting for the Costs of the Internet’s Most Popular Price, UCLA Law Review (2014).

Grimmelmann J., Privacy as Product Safety, Widener Law Journal (2010).


Kokott J., Sobotta C., The distinction between privacy and data protection in the jurisprudence of the CJEU and the ECtHR, International Data Privacy Law (2013).

Krouse W., The Inevitable Demise of Privacy Shield: How to Prepare, The Computer & Internet Lawyer (2018).

Lyon D., Surveillance Studies: An Overview, Cambridge (2007).

Perry I. E., California is Working: The Effects of California’s Public Policy on Jobs and the Economy since 2011, UC Berkeley Labor Center (2017).

Posner R. A., The Right of Privacy, Georgia Law Review (1977).

Richards N. M., The Dangers of Surveillance, Harvard Law Review (2013).

Rosen J., The Right to Be Forgotten, Stanford Law Review Online (2012).

Samuelson P., Privacy as Intellectual Property, Stanford Law Review (1999).

Schrems M., The Privacy Shield is a Soft Update of the Safe Harbor, European Data Protection Law Review (2016).

Schwartz P. M., Property, Privacy, and Personal Data, Harvard Law Review (2004).

Schwartz P. M., Peifer K.-N., Transatlantic Data Privacy Law, Georgetown Law Journal (2017).

Silverman J., Privacy under Surveillance Capitalism, Social Research (2017).

Solove D., 'I've Got Nothing to Hide' and Other Misunderstandings of Privacy, San Diego Law Review (2007).

Solove D., Schwartz P. M., Information Privacy Law, Wolters Kluwer, 6th edition (2017).

Varian H. R., Economic Aspects of Personal Privacy, US DoC, Privacy and Self-Regulation in the Information Age (1997).

Varian H. R., Beyond Big Data, Business Economics (2014).

Voigt P., Von dem Bussche A., The EU General Data Protection Regulation (GDPR), Springer, 1st edition (2017).

Zuboff S., The Age of Surveillance Capitalism, Profile Books (2019).


Zuboff S., Surveillance Capitalism and the Challenge of Collective Action, New Labor Forum (2019).

ECtHR case law and Council of Europe documentation

Bernh Larsen Holding AS and others v Norway App no 24117/08.

Rotaru v Romania App no 28341/95.

Council of Europe, Convention for the Protection of Individuals with regard to the Processing of Personal Data (Convention 108).

CJEU case law and opinions of Advocates General

Case C-29/69 Stauder EU:C:1969:57.

Case C-294/83 Les Verts v Parliament EU:C:1986:166.

Case C-92/09 Volker and Schecke EU:C:2010:662.

Case C-279/09 DEB EU:C:2010:811.

Case C-360/10 SABAM v Netlog EU:C:2012:85.

Case C‑584/10 P C‑593/10 P and C‑595/10 P Commission and Others v Kadi EU:C:2013:518.

Case C-617/10 Åkerberg Fransson EU:C:2013:105.

Case C-131/12 Google Spain and Google EU:C:2014:317 and Opinion of Advocate General Jääskinen.

Case C-293/12 and C-594/12 Digital Rights Ireland and others EU:C:2014:238.

Case C-362/14 Schrems EU:C:2015:650 and Opinion of Advocate General Bot.

Case T-670/16 Order of the General Court Digital Rights Ireland v Commission.

Case C‑673/17 Planet49 ECLI:EU:C:2019:246, Opinion of Advocate General Szpunar.

Case C-311/18 Data Protection Commissioner v Facebook Ireland Limited, Maximillian Schrems, Reference for a preliminary ruling from the High Court (Ireland) made on 9 May 2018.


US Supreme Court case law

Int’l News Serv. v Associated Press, 248 U.S. 215 (1918).

Kewanee Oil Co. v Bicron Corp., 416 U.S. 470 (1974).

Whalen v Roe, 429 U.S. 589 (1977).

Carpenter v U.S., 585 U.S. 138 (2018).

Other US case law

Digitech Image Techs. v Elecs. for Imaging, 758 F.3d 1344, 1350 (Fed. Cir. 2014).

EU statutes

Commission Decision 2000/518/EC of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequate protection of personal data provided in Switzerland.

Commission Decision 2000/520/EC of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce.

Commission Decision 2011/61/EU of 31 January 2011 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequate protection of personal data by the State of Israel with regard to automated processing of personal data.

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) No 45/2001 and Decision No 1247/2002/EC.


Commission Implementing Decision (EU) 2019/419 of 23 January 2019 pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council on the adequate protection of personal data by Japan under the Act on the Protection of Personal Information.

Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications).

US federal statutes

5 USC 552a Records Maintained on Individuals – Government Organizations and Employees.

15 USC 91 Children's Online Privacy Protection.

15 U.S.C. 6801(b), 6805(b)(2) Standards for Safeguarding Customer Information.

18 USC 121 §§2701-2712 Stored Communications Act.

18 USC 123 Prohibition on Release and Use of Certain Personal Information from State Motor Vehicle Records.

18 USC 1030 Fraud and Related Activity in Connection with Computers.

18 USC 2710 Wrongful Disclosure of Video Tape Rental or Sale Records.

31 CFR 103.120 Anti-Money Laundering Programs.

42 USC 201 Health Insurance Portability and Accountability Act of 1996.

47 USC 222 Privacy of Customer Information.

47 USC 551 Protection of Subscriber Privacy.

50 USC 36 §1801 et seq. Foreign Intelligence Surveillance Act (FISA).

Pub.L. 107-56 Patriot Act.

Pub.L. 115-141 Clarifying Lawful Overseas Use of Data Act (CLOUD).

Presidential Policy Directive 28: Signals Intelligence Activities.

California statutes

Constitution of the State of California, Article 1 ‘Declaration of Rights,’ Section 1.


California Assembly Bill No. 375, Chapter 55 Privacy: Personal Information: Businesses.

California Consumer Privacy Act of 2018, Cal. Civ. Code §§ 1798.100–1798.198.

References from official US institutions

FTC, What We Do.

FTC, Legal Resources, cases brought under the Safe Harbor and Privacy Shield Frameworks.

US DoJ, The USA PATRIOT Act: Preserving Life and Liberty.

US DoJ, Rod Rosenstein, Memorandum on Policy Regarding Applications for Protective Orders Pursuant to 18 USC §2705(b).

US Department of State, Privacy Shield Ombudsperson.

The White House, President Donald J. Trump Announces Intent to Nominate Individual to Key Administration Posts.

References from official EU institutions

CJEU, Press release No 70/14 on C-131/12 Google Spain.

Jean-Claude Juncker, 2016 State of the Union address at the European Parliament.

COM/2013/0847 Communication from the Commission to the European Parliament and the Council on the Functioning of the Safe Harbour from the Perspective of EU Citizens and Companies Established in the EU.

COM/2017/07 Communication from the Commission to the European Parliament and the Council, Exchanging and Protecting Personal Data in a Globalised World.

European Commission, Binding Corporate Rules (BCR).

European Commission, Standard Contractual Clauses (SCC).

European Commission, Passenger Name Record (PNR).

European Commission, Terrorist Finance Tracking Programme.

European Commission, Staff Working Document for the Second Annual Review of the Privacy Shield.

European Commission, Memo on the Informal Justice Council in Vilnius.

European Parliament, From Safe Harbour to Privacy Shield – Advances and Shortcomings of the New EU-US Data Transfer Rules.

EUR-Lex, Summary of EU Legislation: EU-US Agreement on Personal Data Protection.

Article 29 Data Protection Working Party, Minutes of the 103rd Meeting of the Article 29 Data Protection Working Party.

Article 29 Data Protection Working Party, Guidelines on Consent under Regulation 2016/679.

EDPB, GDPR: Guidelines, Recommendations, Best Practices.

Sources from official EU Member States’ institutions

French Ministry of Justice, Le sens de la peine et le droit à l’oubli [The Meaning of Punishment and the Right to Be Forgotten].

Irish Data Protection Commission, Explanatory Memoranda on the Litigation Concerning the Standard Contractual Clauses (SCCs).

Irish Data Protection Commission, Litigation Concerning the Standard Contractual Clauses.

Irish Data Protection Commission, Reference for a Preliminary Ruling before the CJEU by the Irish High Court.

UK Information Commissioner’s Office, Codes of Conduct.

Professional publications by international law firms

Bird & Bird, Codes of Conduct and Certifications.

Linklaters, AG Opinion Issued in Schrems – Safe Harbor Invalid. Are Model Contracts Also at Risk?

Morrison Foerster, Privacy Library.

Norton Rose Fulbright, EU Advocate General Issues Opinion on Consent for Cookies and Intersection Between ePrivacy Directive and GDPR.

White & Case, Binding Corporate Rules for Data Processors.

Other professional legal publications

Daily Dashboard by the International Association of Privacy Professionals, Keith Krach Nominated as Privacy Shield Ombudsperson.

Determann L., US Privacy Safe Harbor – More Myths and Facts, Bloomberg BNA (2016).

Gilbert F., Invalidation of the Safe Harbor: Will it Cause the Adoption of Data Silos?, Bloomberg BNA (2015).

Heimes R., Top 10 Operational Impacts of the GDPR: Part 9 – Codes of Conduct and Certifications, International Association of Privacy Professionals (2016).

Hengesbaugh B., Five Myths About Safe Harbor, International Association of Privacy Professionals (2015).

Kelleher D., Standard Contractual Clauses to be Reviewed by CJEU, International Association of Privacy Professionals (2017).


Press and media sources

Baldet A., The Currency of the Future is Personal Data, Quartz (2018).

BBC News, Snowden NSA: Germany Drops Merkel Phone-Tapping Probe (2015).

Bradshaw P., The Lives of Others, The Guardian (2007).

Denton J., Digital Protectionism Demands Urgent Response, The Financial Times (2019).

Hall J., Washington Post Updates, Hedges on Initial PRISM Report, Forbes (2013).

Harrison S., Can You Make Money Selling Your Data?, BBC Capital (2018).

Heath R., The World’s Most Powerful Tech Regulator: Martin Selmayr, Politico Europe (2018).

Herrman J., This is How Much You’re Worth To Facebook, BuzzFeed News (2012).

Hill K., Max Schrems: The Austrian Thorn in Facebook’s Side, Forbes (2012).

Kuchler H., Max Schrems: the Man Who Took On Facebook – and Won, The Financial Times (2018).

Lee A., US to Appoint Permanent Privacy Shield Ombudsperson, as EU Pressure Tells, Euractiv (2019).

MacAskill E. and Dance G., NSA Files: Decoded, The Guardian (2013).

Muiznieks N., Europe is Spying on You, The New York Times (2015).

Noack R., Edward Snowden Revelations, The Washington Post (2016).

Posner R. A., Our Domestic Intelligence Crisis, The Washington Post (2005).

The Daily Podcast, The Business of Selling your Location, The New York Times (2018).

The Economist, The World’s Most Valuable Resource is No Longer Oil, but Data (2017).

The Guardian, Edward Snowden: a Right to Privacy is the Same as Freedom of Speech – Video Interview.

The Washington Post’s homepage.

Turow J. and Hoofnagle C. J., Mark Zuckerberg’s Delusion of Consumer Consent, The New York Times (2019).

Schleifer J. T., Tocqueville’s Insight, The New York Times (1981).

Scott M., Data Transfer Pact Between U.S. and Europe is Ruled Invalid, The New York Times (2015).

Wasik B., Welcome to the Age of Digital Imperialism, The New York Times (2015).

Other resources

Bagchi K., Dissecting the Safe Harbor Decision of the ECJ (2015).

Coudert F., Schrems vs. Data Protection Commissioner: A Slap on the Wrist for the Commission and New Powers for Data Protection Authorities, European Law Blog (2015).

Encyclopaedia Britannica, Definition of Capitalism.

Express VPN Education, Biography of Edward Snowden.

Facebook’s data pool as described on Max Schrems’ Europe-v-Facebook website.

Max Schrems’ initial response to the CJEU’s Schrems decision.

None Of Your Business, the current NGO through which Max Schrems carries out his advocacy and judicial activities.

Fleischer P., Foggy Thinking about the Right to Oblivion, Peter Fleischer blog (2011).

Privacy International, It’s about Human Dignity and Autonomy (2018).

The Federalist Society, The Right to Be Forgotten (2018).

The Lawfare Blog, Chronology of the Snowden Revelations.

The Stanford Raw Data Podcast, Data Confidential (2016).

Smith B., The Collapse of the US-EU Safe Harbor: Solving the New Privacy Rubik’s Cube (2015).

Snowden E., Tweet about Maximilian Schrems (2015).

Statement from U.S. Secretary of Commerce Penny Pritzker on EU-U.S. Privacy Shield (2016).

Swire P., White Paper on US Surveillance Law, Safe Harbor, and Reforms Since 2013, submitted to the Belgian DPA for its 2015 forum on ’The Consequences of the Judgement in the Schrems Case’.

Vienna Convention on the Law of Treaties (1969).