<p>Informational Privacy in a World of Big Data and Transnational Data Flows</p><p>Liberty finds no refuge in a jurisprudence of doubt</p><p>Sandra Day O’Connor, Planned Parenthood v. Casey</p><p>The very purpose of a Bill of Rights was to withdraw certain subjects from the vicissitudes of political controversy, to place them beyond the reach of majorities and officials and to establish them as legal principles to be applied by the courts. One's right to life, liberty, and property, to free speech, a free press, freedom of worship and assembly, and other fundamental rights may not be submitted to vote; they depend on the outcome of no elections.</p><p>Robert H. Jackson, West Virginia State Board of Education v. Barnette</p><p>1. Introduction</p><p>Almost every aspect of law has been profoundly influenced by the advent of modern electronic communication technologies. The Internet is truly what economic historians call a ‘transforming technology’: it not only changes our patterns of enterprise and daily activity but also creates the ‘building blocks’ of an entirely new ecosystem in which new businesses can prosper. The history of the Internet shows that it has already undergone a series of transformations, from what was virtually an electronic telegraph, through the WWW page system, to today’s broadband, prosumer, video- and content-sharing web. It has also steadily expanded its reach, from a network of a handful of academic physicists to billions of people on every continent. These changes have also forced changes in the regulatory environment. </p><p>I believe that we are now on the eve of another profound change in the nature of the Internet. Technology, mainly the ‘Internet of things’1, will create an overwhelming torrent of data about users and their most intimate activities. Already smartphones are collecting and aggregating</p><p>1 The term ‘Internet of things’ (‘IoT’) describes technologies that allow household appliances and all kinds of devices (e.g. 
washing machines, cars, refrigerators, etc.) to connect directly to the Internet. Such devices will not merely be intermediaries between a human and the Internet; they will create an interactive ecosystem of their own and generate gigantic quantities of data. This makes it possible to control such devices at a distance and creates all sorts of possibilities, e.g. optimizing electricity consumption and the usage of electricity grids, roads and other infrastructure. It can also provide detailed information about the manner in which such devices are used, creating the possibility of onerous surveillance. See http://www.wired.com/2014/03/data-centers-internet-things-come/ (accessed 4 May 2014). scores of data on millions of people, creating possibilities for surveillance that would have been unimaginable just a few years ago. Progress in software and data-analysis tools makes it possible to aggregate data and ‘mine’ detailed information about almost everyone using the net. Already, the basic business model of Internet firms is to buy the user’s privacy with convenient services. I will try to show that this model is unsustainable if we want to stay committed to fundamental rights. It is becoming a transaction that is less and less beneficial for users and creates an escalating probability of catastrophic privacy leaks with profound consequences. The Internet firms simply know too much. </p><p>In this paper I shall try to analyze the legal consequences of technological change in the field of electronic communications in the context of EU law. Mainly, I will try to determine the impact of big data, data mining technologies and the intensification of transnational, transjurisdictional data flows on the fundamental rights to privacy and to data protection as laid down in art. 7 and 8 of the Charter of Fundamental Rights (‘the Charter’) and on the whole, unique European system of data protection. I will focus on private, commercial usage of data. 
I will not deal with the workings of governments and other authorities. Although now, after Edward Snowden’s revelations, the abuses committed by governments may seem to be the greater threat to privacy, in my opinion they are not. Governments, at least in our democratic context, are centralized organizations susceptible to pressure from the judiciary and public opinion. Private actors, by contrast, operating in the chaotic and decentralized ecosystem of modern Internet business and equipped with modern technology that allows costless and infinite reproduction of data, are far more difficult to regulate effectively. </p><p>2. Informational Privacy and the European data protection system </p><p>Defining the scope of the right to privacy is an extremely controversial issue, especially in the ever-changing context of information technologies. One need only look to US Supreme Court jurisprudence to see that what is covered by the protection of privacy is one of the most controversial legal issues in the USA2. Although there is nothing inherently national about defining the scope of privacy online, courts in different countries take different approaches. In Europe, courts still treat privacy cases as fact-based, case-by-case issues and often give contradictory judgments3; no general overarching rule can be found.4 Defining privacy on the Internet is</p><p>2 See e.g. H.L.A. Hart, American Jurisprudence Through English Eyes: The Nightmare and the Noble Dream, at: http://digitalcommons.law.uga.edu/lectures_pre_arch_lectures_sibley/33/ (accessed 9 May 2014). even more complex. One reason for this is the durability and accessibility of electronic data5. The ‘substantial content’ of the right to privacy is also in constant flux because of the evolution of services offered online. 
The rise of services such as Google Street View, new kinds of personal information like geolocation data generated by smartphones and, recently, Google Glass constantly extend privacy considerations to new fields of human activity6. </p><p>In EU law online privacy is protected (or supposed to be protected) by an elaborate system of protection of personal data. In this scheme art. 87 of the Charter is perceived as a means of achieving the general privacy protection stipulated by art. 78 of the Charter and art. 89 of the European Convention on Human Rights (“ECHR”). This “subsidiary” character of art. 8 of the Charter is the position that seems to be taken by European human rights protection bodies10. Art. 52(3) of the Charter makes a direct reference to the ECHR, proclaiming that where the provisions of the two acts constitute the same right, the meaning and scope of the Charter right will be the same as in the ECHR. This allows the recognition in EU law of the rich jurisprudence of the European Court of Human Rights (ECtHR)11. Because the Charter is a relatively new law, while the ECtHR has long been ruling in human rights cases, the CJEU frequently refers to standards from the ECHR. </p><p>3 See J. Kulesza, Walled Gardens of Privacy or “Binding Corporate Rules?”: A Critical Look at International Protection of Online Privacy, 34 U. ARK. LITTLE ROCK L. REV., pp. 747-765. </p><p>4 J. Kulesza claims that the minimum standard set by the courts can be defined as “a right to establish and develop relationships with other human beings.” See id., p. 753. </p><p>5 Id.</p><p>6 Id.</p><p>7 “1. Everyone has the right to the protection of personal data concerning him or her. 2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified. 3. 
Compliance with these rules shall be subject to control by an independent authority”</p><p>8 “Everyone has the right to respect for his or her private and family life, home and communications.”</p><p>9 “1. Everyone has the right to respect for his private and family life, his home and his correspondence. 2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.”</p><p>10 See also the Handbook on European Data Protection Law, published by the Fundamental Rights Agency and the Council of Europe in December 2013, pp. 14-20. </p><p>11 E.g. Rotaru v. Romania [GC], ECtHR, No. 28341/95. Id. pp. 63. In the context of the Internet the right to privacy is thus realized through the protection of personal data. The assumption is that the privacy of individuals in cyberspace will be protected by special legal restrictions on the freedom of processing and obtaining such data, and thus by a restriction of the freedom of information12. The aim of personal data protection, in my opinion, is to attain “informational privacy”, understood as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others”13. The main goal of privacy protection online is thus to give individuals a degree of control over their personal data. This is a Sisyphean task, mainly because of the ease of reproducing and generating data with the use of modern technologies14. </p><p>There is a myriad of acts concerning personal data protection in the framework of EU law15. Data protection is rooted in the treaties: in the relevant articles of the Charter and in art. 
16 of the Treaty on the Functioning of the European Union (“TFEU”). Art. 16 TFEU establishes the right to the protection of personal data and in section 2 mandates that the European Parliament and the Council shall lay down the principles that govern the usage of personal data by EU institutions and Member States. The most important act in secondary law is Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data16 (“Data Protection Directive”). The Data Protection Directive creates the framework for European data protection law; other acts apply the same principles in specialized sectors (e.g. medical data)17. Of those leges speciales the most significant for this paper is Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the</p><p>12 See J. Kulesza, Freedom of information in the global information society – the question of The Internet Bill of Rights, University of Warmia and Mazury in Olsztyn Law Review, vol. 1 (2008), pp. 81-95.</p><p>13 R. Warner, R.H. Sloan, The Impact of Information Processing Technology on Informational Privacy, in: J. Balcarczyk (ed.), Rights of personality in the XXI century. New values, rules, technologies, Warsaw 2012, p. 384. The authors quote an earlier definition formulated by J.B. Rule in Towards Strong Privacy: Values, Markets, Mechanisms, and Institutions, 54 U. Toronto L.J. 183 (2004). </p><p>14 Kulesza, supra note 4.</p><p>15 For a comprehensive list see Commentary on art. 8 of the Charter of Fundamental Rights, ed. A. Wróbel, SIP Legalis (online legal information database), pt. 1-18 (accessed 30 April 2014).</p><p>16 Official Journal L 281, 23/11/1995, pp. 31-50.</p><p>17 See supra note 14. 
electronic communications sector (“Directive on Privacy and Electronic Communications”)18. Until 8 April 2014 the third most important legal act in the European data protection framework was the Data Retention Directive19, which had been widely criticized by privacy activists and national courts20. The Data Retention Directive was declared invalid by the CJEU in the Digital Rights Ireland case21 on the grounds that it disproportionately limited the fundamental rights to privacy and to data protection22. It is important to note that the fundamental character of the right to data protection was recognized only after the Data Protection Directive had been enacted. </p><p>The European system of data protection is undergoing a “Copernican revolution”. One reason is the fallout from the disclosure of the PRISM program. European judges are less susceptible to the rhetoric of national security and seem to be genuinely concerned about the scale of surveillance. This creates an environment in which privacy considerations are becoming more important. One product of this atmosphere is the Digital Rights Ireland judgment. The other is the reform of the system proposed by Commissioner Viviane Reding. The reform consists of two parts. One is the proposal for a new European regulation (“Draft Regulation”)23 that will repeal the Data Protection Directive and, as a directly applicable act, will replace the data protection laws of Member States24. The other, less known part is a proposal to amend the international data exchange regime and to replace the “safe harbor</p><p>18 Official Journal L 201, 31/07/2002, pp. 37-47.</p><p>19 Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC. 
See Handbook on European Data Protection Law, pp. 19-20.</p><p>20 See Kulesza, supra note 3, p. 752.</p><p>21 Joined Cases C-293/12 and C-594/12.</p><p>22 C-293/12: “§ 69 Having regard to all the foregoing considerations, it must be held that, by adopting Directive 2006/24, the EU legislature has exceeded the limits imposed by compliance with the principle of proportionality in the light of Articles 7, 8 and 52(1) of the Charter.” </p><p>23 Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), published by the European Commission in Brussels on 25 January 2012, COM(2012) 11 final, http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf (downloaded 2 May 2014).</p><p>24 See Handbook on European Data Protection Law, p. 21. system” with the “binding corporate rules system”25. This reform was proposed mainly to answer the challenges created by technological change. The limitations of this paper do not allow me to elaborate comprehensively on the first part of the reform, the proposal for a data protection regulation. This proposal has been significantly delayed and will not be enacted soon. It also, as I will try to show, may not be enough to protect privacy, because it does not address the fundamental features of the system most impacted by big data and other technological changes. The shape of the reform is still unknown, mainly because of the lobbying efforts of big Internet companies26. The only certain benefit of the reform, if it comes to pass, will be the enactment of a uniform standard, replacing the differing national laws that implement the Data Protection Directive27. One of the reasons for this reform is that the Data Protection Directive had been enacted before the elevation of data protection to the status of a fundamental right. 
One of its chief goals was to create a stable regulatory environment and facilitate the growth of Internet companies. Such a philosophy should no longer be accepted; with the elevation of data protection to the status of a fundamental right, the priorities of the European legislator should change. In this paper I will write about the laws in force now, as of May 2014; when necessary I shall refer to the proposed changes. </p><p>Individuals’ control over personal data is to be realized by certain features of European data protection legislation. Those features are sets of key principles laid down in the Charter of Fundamental Rights and in the Data Protection Directive, which are supposed to guarantee lawful processing, purpose specification and limitation, data quality, fair processing and accountability28. These principles are to be achieved by a set of rules governing the processing of data: rules on lawful processing (different for sensitive and non-sensitive data), rules on the security of processing and rules promoting compliance.29 There is also a broad definition of personal data30, in order to prevent the circumvention of</p><p>25 See Kulesza, supra note 3, p. 757.</p><p>26 See Nie ma woli, by lepiej chronić prywatność (There is no will to protect privacy better), an interview given by Katarzyna Szymielewicz of the Panoptykon Foundation, the chief Polish privacy pressure group, Dziennik Gazeta Prawna, 30 April – 4 May 2014, nr 83/84.</p><p>27 It is widely understood that the Data Protection Directive failed to create a uniform standard for the protection of personal data. One example of this is the differing judgments of the courts of Member States in the context of Google Street View. See the Explanatory Memorandum in: supra note 23.</p><p>28 See Handbook on European Data Protection Law, pp. 63-78.</p><p>29 Id. pp. 81-105. data protection rules. Processing of data is also defined broadly31, in order to encompass a broad scope of activities. 
The CJEU in its jurisprudence recognizes and reaffirms this trend and generally considers operations on data to constitute data processing32. These rules impose obligations on the data controller33. </p><p>The principle of lawful processing stipulates that data processing is lawful only when it is in accordance with the law, pursues a legitimate purpose and is necessary in a democratic society in order to achieve that purpose.34 This catalogue is an exception to the principle of freedom of information and is supposed to be a strict limitation on personal data processing. It is a set of general principles directed at national parliaments when they implement the Data Protection Directive35. </p><p>The rules of lawful data processing are a concretization and consequence of those principles. There are two different sets of rules of lawful processing. The first set governs the processing of non-sensitive data (art. 7 of the Data Protection Directive); the second refers to certain data that is considered sensitive (art. 8). The rules for processing sensitive data are stricter, but for both categories they can be summarized in three groups. The</p><p>30 Art. 2(a) of the Data Protection Directive: “'personal data' shall mean any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;”. It is worth noting that the Draft Regulation uses a different regulatory technique for defining personal data. Art. 4(1) of the Regulation proposal defines a ‘data subject’ and art. 4(2) defines personal data as any data relating to the data subject. In effect the Draft Regulation dismisses identifiability as the decisive criterion and nexus between data and personal data. 
Such an approach is intended to strengthen the privacy of data subjects, because in order to invade a person’s privacy heavily, identification is not needed; such an intrusion can be brought about whenever a person can effectively be “singled out”. See supra note 22. In the legislative process in the European Parliament proposals were made to return to the definition from the Data Protection Directive http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2F%2FEP%2F%2FTEXT%2BREPORT%2BA7-2013-0402%2B0%2BDOC%2BXML%2BV0%2F%2FEN&language=EN - title1 (accessed 8 May 2014). </p><p>31 Art. 2(b) of the Data Protection Directive: “'processing of personal data' ('processing') shall mean any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction;”.</p><p>32 E.g. C-101/01 Bodil Lindqvist. See supra note 27.</p><p>33 European data protection laws define a controller as a person (legal or natural) who determines the purposes and means of the processing of personal data, and a processor as an entity who processes data on behalf of the controller. See art. 2 of the Data Protection Directive and Handbook on European Data Protection Law, p. 47. </p><p>34 Id. p. 64.</p><p>35 Id. See also the preamble of the Draft Regulation. processing is legal when it is based on the consent of the data subject, when the vital interests of the subject require it, or when the legitimate interests of the controller or third parties are the reason for processing (but only as long as they are not overridden by the protection of the data subject’s fundamental rights)36. The consent of the data subject must also be a qualified consent: it cannot be implied; it must be free and informed. </p><p>Unfortunately those rules are frequently circumvented. 
One of the ways of such circumvention is to invoke the “legitimate interest of the controller or a third party”37. The two most recent examples of actions that in the opinion of many constitute such circumvention are Google’s adoption of a unified privacy policy for all its services (using the data gathered by them to build profiles for advertising purposes) and LinkedIn’s mobile app, which was intended to have access to a calendar of meetings on the mobile device but also gathers a lot of other data from it38. </p><p>The data controller also has certain duties to the data subject. In processing personal data he must respect the subject’s rights and interests, protect the data from unlawful usage and allow the data subject to access the data gathered about him39. </p><p>The European data protection system also provides certain rights to the data subject. As I mentioned, the data subject has the right to access data gathered about her; she has the right to rectify (or block) the data if they are inaccurate; and she can also demand the deletion of data processed illegally. The data subject can also object to certain kinds of data processing.40 In these stipulations, like nowhere else, the nature of informational privacy as control over data</p><p>36 Id. p. 84 and art. 7 of the Draft Regulation.</p><p>37 Art. 7(f) of the Data Protection Directive: “(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests for fundamental rights and freedoms of the data subject which require protection under Article 1 (1).”</p><p>38 Six European data protection ombudsmen considered such activities unlawful and have started proceedings against Google. 
See Reform of European Personal Data Protection Laws, an opinion issued by the Panoptykon Foundation, http://panoptykon.org/sites/panoptykon.org/files/panoptykon_foundation_gdpr_brief_07.2013_0.pdf (downloaded 4 May 2014). </p><p>39 See supra note 10, pp. 110-119.</p><p>40 Id. relating to the data subject can be seen. Unfortunately, the data subject’s rights are the least respected part of the system and are harder and harder to enforce.41 </p><p>3. Fallacies of the European data protection system </p><p>Many of the rights and principles mentioned above do not work in the way they were intended to. I have already described how the “legitimate interest of the data controller” is being abused in order to circumvent limitations on data processing. Other rights are also almost illusory, making them unenforceable fallacies rather than legal safeguards of fundamental rights. To some extent this is the result of a fundamental flaw in the Data Protection Directive: as I already mentioned, that law was enacted before the recognition of data protection as a fundamental right and was originally intended to remove administrative obstacles for Internet companies. The other reason is the nature of electronic data: high-frequency, durable and easy to reproduce42. Some guarantees have been undermined by technological change. All this has created an environment in which the fundamental rights of data subjects are not respected. </p><p>Technological change makes certain principles of the European data protection system illusory or unenforceable. This set of fallacies of the data protection system consists of: the fallacy of consent, the fallacy of private enforcement, the fallacy of anonymisation and the fallacy of the distinction between sensitive and non-sensitive data.</p><p>The fallacy of informed, free consent is the most obvious one. 
As a recent report prepared for the US President noted: “Only in some fantasy world do users actually read these notices and understand their implications before clicking to indicate their consent.”43 The “consumers do not read” critique is the simplest one, but even if consumers did read and understand the terms and conditions, their consent would still not be informed. As Warner and Sloan put it:</p><p>41 See the Explanatory Memorandum in: supra note 23.</p><p>42 See Kulesza, supra note 13. </p><p>43 http://www.whitehouse.gov/blog/2014/05/01/pcast-releases-report-big-data-and-privacy. Big Data: Seizing Opportunities, Preserving Values, Executive Office of the President, May 2014, http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf (downloaded 10 May 2014). See also http://www.economist.com/blogs/democracyinamerica/2014/05/regulating-big-data “Even if consumers did read and understand the privacy notices, they would not obtain all the information necessary to give informed consent. The problem is that information collected on one occasion for one purpose is typically retained, analyzed, and distributed for a variety of other purposes in unpredictable ways. The unpredictability of future uses makes informed consent a practical impossibility.”44 </p><p>In fact, the constant invention of new ways to use our data is one of the main vehicles of progress in the information age45. The requirement of consent shows how technology can change the meaning of laws. Progress in the usage of data makes the consequences of consenting to data processing more and more unpredictable. This makes informed consent similar to the freedom of contract doctrine of the nineteenth century, when freedom of contract was used as a fig leaf to mask outright discrimination and the exploitation of workers and consumers46. The most obvious example of this was the American “substantive due process” doctrine47. 
The reaction was a string of statutes limiting the freedom of contract. The growing inequality of the parties to “information contracts” in cyberspace will probably also require such limitations, or perhaps even ditching the private-law method of regulation in favor of a more restrictive, administrative regulation. This issue was recognized in the Draft Regulation, which stipulates that consent must be “explicit”.48 In the legislative process an amendment was proposed to return to the “unambiguous consent” standard already laid down in the Data Protection Directive. The “unambiguous consent” standard does not provide adequate informational privacy protection for data subjects49. </p><p>Consent must also be free. The “freedom” of consent on the Internet is questionable. Many companies require prior, vague consent as a prerequisite for using their services; it is a sine qua non condition50. As more and more services and daily activities move online this will</p><p>44 Supra note 13, p. 386.</p><p>45 See supra note 43. </p><p>46 See K. Zweigert & H. Kötz, An Introduction to Comparative Law, Oxford 1998, Chapter 24: Freedom of Contract and its Limits.</p><p>47 See T. Koopmans, Courts and Political Institutions. A Comparative View, Cambridge 2005, Chapter 8. </p><p>48 See supra note 23, art. 4, 8, 6(1) and 7 of the Draft Regulation. </p><p>49 See supra note 38. </p><p>50 See supra note 38. become a bigger problem. The business model in which Internet outfits require scores of personal data will raise concerns about discrimination. </p><p>The other fallacy is the fallacy of private enforcement. In most European countries privacy is recognized as a personal right (dobro osobiste, Persönlichkeitsrecht). The consequence of this is that the misuse or abuse of personal data can be treated as a tort, giving the ability to seek a remedy in civil procedure51. This path is long and ineffective. 
The main problems in obtaining an effective remedy are the cost, the burden of proof and the uncertainty of outcomes52. Furthermore, in some jurisdictions the assessment of damages becomes a problem, especially for a victim who has suffered emotional trauma and stress because of the data breach and disclosure53. One policy proposal to facilitate the pursuit of data breach remedies is to relax procedural requirements, especially as to costs and the burden of proof.54 </p><p>But effective data breach remedies in cyberspace are difficult to achieve because of the nature of the Internet itself. As I already mentioned, the nature of data reproduction on the Internet makes it difficult to erase information once it has gone viral. A good example of this is the fallout from the WikiLeaks leak: the most powerful state in the world could not erase diplomatic cables and other documents from the Internet once they had been placed there55. Obtaining a remedy against one private company is almost certainly not enough to erase information from the Internet. This problem was recognized by the CJEU in the recent Google Spain judgment56. </p><p>These considerations make an effective remedy in tort all but illusory. It would require the adoption of a strict, universal “data publishing prohibition order”, which is not only unknown</p><p>51 See J. Kulesza, International Protection of Human Rights On-Line: Key Problems [in:] Collection of Papers from the International Scholastic Conference “Law as a Unifying Factor in Europe – Jurisprudence and Practice”, Bratislava 2011, pp. 387-394, http://www.academia.edu/2392030/International_Protection_of_Human_Rights_On-Line_Key_Problems (downloaded 27 April 2014).</p><p>52 See Access to data protection remedies in EU Member States, a report by the Fundamental Rights Agency, January 2014. 
http://fra.europa.eu/en/publication/2014/access-data-protection-remedies-eu-member-states (downloaded 8 May 2014).</p><p>53 See Kulesza, supra note 51.</p><p>54 See supra note 52.</p><p>55 See H. Bińkiewicz, Odpowiedzialność w sieci (Responsibility on the Internet) [in Polish] in Odpowiedzialność w sensie ekonomicznym i prawnym (Responsibility in the economic and legal sense) (forthcoming). in some jurisdictions but in the current technological environment would simply be unenforceable57.</p><p>This has led to proposals to strengthen the role of civil society organizations and national data protection authorities58. One such proposal is the right to be forgotten stipulated in the Draft Regulation59. Although the exact wording of the provisions guaranteeing the right to be forgotten is not yet known, there are serious doubts whether such a right can be effectively enforced on the Internet without the introduction of legal and technical measures that infringe the freedom of the Internet60. The right to be forgotten is a legitimate policy goal to pursue; without it informational privacy is likely to remain in danger.</p><p>It is possible that the right to be forgotten will not be effectively enacted. The legislative history and lobbying efforts surrounding the legislative path of the Draft Regulation are not a good augury. The other avenue for the effective enforcement of informational privacy is robust court action. One option is the adoption of the “horizontal effect” of art. 7 and 8 of the Charter by the CJEU and the gradual development of the responsibility of private Internet companies for privacy breaches, similar to the responsibility of Member States for breaches of EU law. A simple striking-down of the Data Protection Directive (or the regulation that will be enacted in its place), as was done with the Data Retention Directive, does not seem to be a good option, because it would only decrease the standard of privacy protection. 
Development of applicable remedies and procedures in the CJEU’s jurisprudence, followed by their application by national courts via the doctrines of effet utile and acte clair, seems to be the only way to preserve the civil law character of privacy protection. A “horizontal effect” of data protection could also give national</p><p>56 Case C-131/12, “97. As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public by its inclusion in such a list of results, it should be held, as follows in particular from paragraph 81 of the present judgment, that those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question.”</p><p>57 See Kulesza, supra note 51.</p><p>58 See supra note 52; see also the Draft Regulation, art. 73-79.</p><p>59 See supra note 23.</p><p>60 One such measure could be something similar to the “Great Firewall of China” (although this firewall is considered to be ineffective and easy to circumvent). See Kulesza, supra note 3.</p><p>judges a powerful tool to assess the compatibility of national legislation with fundamental rights standards.</p><p>In the face of what looks like a “regulatory capture” of the legislative organs of the European Union, and in light of the recent judgments of the CJEU in Digital Rights Ireland and Google Spain, this scenario seems more and more plausible. 
This seems to be a positive development from the point of view of privacy activists. The CJEU seems to be the only actor fully committed to informational privacy. </p><p>4. The power of big data and data mining </p><p>The rise of big data has turned anonymisation and the notion of non-sensitive personal data into fallacies. A group of advisors to the US President defines big data as:</p><p>“[…] data that is so large in volume, so diverse in variety or moving with such velocity, that traditional modes of data capture and analysis are insufficient — characteristics colloquially referred to as the “3 Vs.” The declining cost of collection, storage, and processing of data, combined with new sources of data like sensors, cameras, geospatial and other observational technologies, means that we live in a world of near-ubiquitous data collection. The volume of data collected and processed is unprecedented. This explosion of data — from web-enabled appliances, wearable technology, and advanced sensors to monitor everything from vital signs to energy use to a jogger’s running speed — will drive demand for high-performance computing and push the capabilities of even the most sophisticated data management technologies. […] With the rising capabilities of “data fusion,” which brings together disparate sources of data, big data can lead to some remarkable insights. […] Furthermore, data collection and analysis is being conducted at a velocity that is increasingly approaching real time, which means there is a growing potential for big data analytics to have an immediate effect on a person’s surrounding environment or decisions being made about his or her life.”61 </p><p>Big data and the Internet of things will be able to merge the industrial and Internet economies.62 In the current context, big data technologies are used to build “profiles” of Internet users in order to market certain goods and services to them. 
Profiles are built by “data fusion” (pooling many databases from different sources), followed by the use of “data mining” techniques.</p><p>61 Big Data: Seizing Opportunities, Preserving Values, Executive Office of the President, May 2014, pp. 4-5; see supra note 43.</p><p>62 Id.</p><p>Data mining is the use of computing power to search for hidden patterns. Although profiling is the marketing buzzword of the day, it is just an overture63. Big data is in its infancy; the datasets used to build profiles are still relatively small. But today’s big data is tomorrow’s small data, and even the issues arising from profiling and direct marketing are worrying. Today, one can reasonably expect to be able to buy a list of five thousand women who buy sexy underwear and work in the public sector, of impotent middle-aged men, of women who wear wigs, or of registered Republicans who buy pornography, including pornography with S-M themes64.</p><p>Already, ubiquitous data gathering on the Internet allows marketers to use non-sensitive personal data in order to obtain sensitive personal data. One of the best-known examples of this was the disclosure of a teenage pregnancy by the store chain Target. Target analysed its customers’ data and sent them discount coupons for articles that its algorithms predicted they might want to buy. It sent discount coupons for articles for pregnant women to a teenager; her father received the mail and contacted the company with a complaint, only to find out later that his teenage daughter was indeed pregnant65. This story has become an anecdote (algorithms detecting a daughter’s pregnancy before her father does), but it is in fact frightening. The amount of information that profiling can disclose about Internet users is vast66, and we must remember that this is just the beginning of the use of big data techniques. 
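</p><p>The two steps just described (data fusion, then mining the pooled data for a hidden pattern) can be sketched in a few lines of code. The following is a minimal, hypothetical illustration only: every customer, purchase and “predictor” item is invented, and the flagging rule is a crude stand-in for the far more sophisticated statistical models real marketers use.</p>

```python
# Toy illustration of "data fusion" followed by a simple data mining step.
# All customers, names and purchase data are invented for this example.

# Two separate databases, keyed by a shared customer id.
loyalty_purchases = {
    101: {"unscented lotion", "zinc supplement", "cotton balls"},
    102: {"beer", "crisps"},
    103: {"unscented lotion", "magnesium supplement", "large tote bag"},
}
demographics = {
    101: {"name": "A. Kowalska", "age": 19},
    102: {"name": "B. Nowak", "age": 45},
    103: {"name": "C. Wisniewska", "age": 27},
}

# Step 1: data fusion - pool the two sources into one profile per customer.
profiles = {
    cid: {**demographics[cid], "basket": loyalty_purchases[cid]}
    for cid in loyalty_purchases
}

# Step 2: data mining - search the fused profiles for a hidden pattern.
# A crude stand-in for Target's model: a basket containing at least two
# items from a hypothetical "pregnancy predictor" set triggers a flag.
PREDICTORS = {"unscented lotion", "zinc supplement",
              "magnesium supplement", "large tote bag", "cotton balls"}

flagged = [
    p["name"] for p in profiles.values()
    if len(p["basket"] & PREDICTORS) >= 2
]
print(flagged)  # -> ['A. Kowalska', 'C. Wisniewska']
```

<p>Even this toy version shows the mechanism: neither database alone reveals anything sensitive, yet the fused profiles support an inference about a highly intimate matter.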
One can only imagine what kind of information could be discovered by running data mining on a database obtained by merging medical records (more and more of which are digital), credit card bills and geolocation data. Not only detailed commercial habits could be reconstructed, but also health conditions, political views and sexual preferences. The sky is the limit. Big data also offers great possibilities for scientific progress67. As more and more daily activities are conducted online, the data stream will grow, and more information will be decodable from it. The current business model of Internet companies, offering services in exchange for personal data, is becoming</p><p>63 Id.</p><p>64 See supra note 13. </p><p>65 See http://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/ (accessed 15 May 2014).</p><p>66 See supra note 43.</p><p>67 For a more detailed analysis see supra note 43.</p><p>more and more intrusive. In the process of obtaining data about consumer behaviour, those firms have collected, and are collecting, far too much information.</p><p>New data mining technology has also significantly hindered the anonymisation and pseudonymisation of data. Under the Data Protection Directive, data can be anonymised in order to reap the benefits of open data. Anonymised data are no longer considered personal data, and they can be crunched in a free, unrestricted way. This assumption, that data can be anonymised, is also becoming a fallacy. Cases of de-anonymisation of databases abound; the Netflix database and the disclosure of the identities of credit card users in a Wal-Mart database68 show how easy yesterday’s anonymisation standards are to break today, especially when data from different databases are combined69. 
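</p><p>The kind of de-anonymisation just described is often a simple “linkage attack”: an “anonymised” release is joined with a public dataset on quasi-identifiers such as date of birth, postcode and sex. The sketch below uses entirely invented records to show how little is needed for the join to succeed.</p>

```python
# Toy linkage attack: re-identifying an "anonymised" medical database by
# joining it with a public register on quasi-identifiers (all data invented).

# Released "anonymised" data: direct identifiers removed, but date of
# birth, postcode and sex (the quasi-identifiers) were left in.
medical = [
    {"dob": "1961-07-31", "postcode": "00-950", "sex": "F", "diagnosis": "diabetes"},
    {"dob": "1984-02-12", "postcode": "31-154", "sex": "M", "diagnosis": "depression"},
]

# Public data, e.g. an electoral roll, containing the same quasi-identifiers.
register = [
    {"name": "Janina Kowalska", "dob": "1961-07-31", "postcode": "00-950", "sex": "F"},
    {"name": "Piotr Nowak", "dob": "1984-02-12", "postcode": "31-154", "sex": "M"},
    {"name": "Anna Lis", "dob": "1990-05-01", "postcode": "31-154", "sex": "F"},
]

def key(record):
    # The quasi-identifier tuple used to join the two datasets.
    return (record["dob"], record["postcode"], record["sex"])

# Index the public register by quasi-identifier.
by_key = {}
for person in register:
    by_key.setdefault(key(person), []).append(person["name"])

# Any medical record whose quasi-identifiers match exactly one person
# in the register is re-identified, diagnosis and all.
reidentified = {
    names[0]: rec["diagnosis"]
    for rec in medical
    for names in [by_key.get(key(rec), [])]
    if len(names) == 1
}
print(reidentified)  # -> {'Janina Kowalska': 'diabetes', 'Piotr Nowak': 'depression'}
```

<p>Note that no direct identifier appears in the medical data at all; the combination of three ordinary attributes is unique enough to re-identify both patients, which is exactly why anonymisation standards must anticipate which other databases an attacker can combine.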
It is imperative to remember that, as I mentioned, today’s big data is tomorrow’s small data, and that we should take technological development into consideration when setting our anonymisation standards.</p><p>Anonymisation is problematic because once data are made publicly available on the Internet, they are hard to delete. As critics of strict intellectual property laws say: “Information wants to be free”. Such data can be harvested, de-anonymised and used in all sorts of malicious ways70.</p><p>The data gathering by private firms also creates an escalating possibility of a disastrous data theft or leak. Data leaks in the commercial sector are less publicized than those in the government sector, but they happen regularly71. The nature of, and scope for, leaking data is changing rapidly. Daniel Ellsberg worked for months to carry a few hundred documents out of the Pentagon in the Pentagon Papers case in the 1970s; that was by far the largest security breach in the history of the USA up to that date72. Chelsea Manning managed to leak hundreds of thousands of pages of classified documents with just a few mouse clicks. </p><p>68 See supra note 38.</p><p>69 See Opinion 05/2014 of the Article 29 Data Protection Working Party on Anonymisation Techniques, adopted on 10 April 2014. http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp216_en.pdf (downloaded 9 May 2014).</p><p>70 Id. </p><p>71 The latest example is the South Korean credit card data theft. See Card sharps, “The Economist”, 25 January 2014. http://www.economist.com/news/finance-and-economics/21595059-enormous-data-heist-may-dim-koreans-love-affair-credit-cards-card-sharps </p><p>72 See supra note 55.</p><p>Crooks in Poland use old phonebooks to obtain the personal data of elderly people and swindle money out of them. This shows that databases have a life of their own. 
If a large Internet company is hacked and its data are leaked onto the Internet, the crooks of the future may have much more effective tools at their disposal.</p><p>5. BCRs or Splinternet </p><p>The European Commission has realized that the EU stands alone in its insistence on data protection. Or at least it stood alone until very recently73. The commissioners realized that in a world of rapid, transnational data flows, compliance with European data protection laws can be hindered. The Internet is accessible from all over the world, which allows for the circumvention of local laws. In a speech delivered in November 2011, Viviane Reding74 announced a plan for data protection reform. One part of the reform is the Draft Regulation; another proposal was the adoption of Binding Corporate Rules (“BCRs”). BCRs are sets of good business practices adopted by companies voluntarily and applied in all of their branches; once voluntarily adopted, they become legally binding75. If a company’s BCRs are accepted by a data protection authority in one Member State, other data protection authorities will accept them as well. Mrs. Reding also proposed to strengthen the powers of data protection authorities in order to allow them to effectively prosecute data breaches76. </p><p>Commissioner Reding proposed to “push the boundaries of traditional regulatory models”77 and to allow for trans-EU recognition of the actions of a single data protection ombudsman. The</p><p>73 Evidence seems to suggest that even the US is changing its approach to data protection. See supra note 61.</p><p>74 Viviane Reding, Vice President of the European Commission & EU Justice Commissioner, Address at the IAPP Europe Data Protection Congress in Paris: Binding Corporate Rules: Unleashing the Potential of the Digital Single Market and Cloud Computing. 
http://europa.eu/rapid/press-release_SPEECH-11-817_en.htm?locale=en </p><p>75 Opinion 02/2014 on a "Referential for requirements for Binding Corporate Rules submitted to national Data Protection Authorities in the EU and Cross Border Privacy Rules submitted to APEC CBPR Accountability Agents". http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp212_en.pdf See also supra note 3.</p><p>76 See supra note 74.</p><p>77 “I believe that for efficient data protection and to ensure effective rights for individuals, we have to push the boundaries of traditional regulatory models. If European businesses are to compete with the rest of the world, we need to encourage innovation. And we need to embrace new technology. The first area we need to reconsider is the idea of geographical borders. The internet and other technologies have made it just as easy to purchase online from the United States or India, as it is to buy from your local store. Data protection laws that apply only within a given territory just do not work in an era where information flows are global: personal data is stored in one country, effectively processed in another and the data subject is located in a completely different country.” Reding, supra note 74.</p><p>BCRs will also apply to all internal and transnational data flows within the scope of a company that has adopted them. The full adoption of Mrs. Reding’s proposal would give national data protection ombudsmen a kind of transnational jurisdiction78. This resonates well with the nature of cyberspace and with functionalism, the founding philosophy of the EU. Functionalists argued that the boundaries of jurisdiction cannot be dictated by territory; they must be dictated by function. This approach, originally proposed in the 1950s, seems surprisingly well suited to cyberspace regulation. 
</p><p>If BCRs prove to be a model that does not protect privacy well enough, it is likely that the EU will eventually adopt some kind of “Internet-filtering” technology similar to the “Great Firewall of China”. As Kulesza puts it:</p><p>“The vision of an effectively and extensively filtered Internet is sometimes pejoratively referred to as “splinternet,” a term depicting the loss of the universality of the network (an Internet “splintered” into separate local webs)”79. </p><p>The ushering in of the era of the Splinternet is a grim prospect, but it would make data protection a lot more enforceable. The possibility of a Splinternet also seems to lie in the “penumbras” and “emanations” of the Digital Rights Ireland judgment: the Court suggested that the locating of servers holding sensitive personal data outside EU territory was one of the reasons for its decision80. </p><p>A lot of powerful forces work against the freedom of the Internet. Russia and China are trying to fence off “their own Internet” for security reasons. Iran limits access to certain portals for security and religious reasons. The USA considered adopting the SOPA and PIPA acts, which would effectively have created a US “copyright-violation-free” Internet. The lobbies of the “old media” companies and of intellectual property owners are pressing for more IP protection and more surveillance. The nature of the Internet economy is also a factor. Currently, four companies control a substantial and growing part of the traffic on the Internet, a situation radically</p><p>78 See supra note 3.</p><p>79 Kulesza, supra note 3, p. 762. See also The Future of the Internet: a Virtual Counter-Revolution, “The Economist”, 2 September 2010, http://www.economist.com/node/16941635.</p><p>80 See Cases C-293/12 and C-594/12 Digital Rights Ireland.</p><p>different than even 10 years ago81. The “iron law of oligarchy” also works online. This makes the Internet more homogeneous and more susceptible to control. 
“The right not to be googled” is much easier to enforce than the right to be forgotten. The freedom and internationality of the Internet hinder the enforcement of data protection, but they are valuable things. The problem is that one day it may turn out that we are weighing the enforceability of our fundamental rights against something that no longer exists.</p><p>6. Conclusion: the judges march in</p><p>Tim Koopmans claims that the protection of human rights gives a whole new dimension to judicial review82. This seems to be the prevailing mood in Europe. The CJEU’s judgments in Google Spain and Digital Rights Ireland showed that it is in the avant-garde of data protection. However, that is not a good sign. The CJEU lacks the administrative power to effectively police the Internet; such power is vested in the Commission and the national governments. The procedure for the adoption of the data protection reform shows that privacy concerns and fundamental rights are not a “trump card” over the commercial interests of Internet companies. That is not a good omen of things to come. Even the EU, the jurisdiction most open to data protection concerns, seems hopelessly unprepared for the torrent of data from the “Internet of Things”. It seems unlikely that we will manage to preserve the level of privacy protection stipulated in our founding documents. Still, it is imperative to stress that privacy protection is a fundamental right and that limitations of such a right can be imposed “only if they are necessary and genuinely meet objectives of general interest recognized by the Union or the need to protect the rights and freedoms of others”. </p><p>81 Facebook, Google, Amazon and Apple. See Another game of thrones, “The Economist”, 1 December 2012. http://www.economist.com/news/21567361-google-apple-facebook-and-amazon-are-each-others-throats-all-sorts-ways-another-game</p><p>82 See T. Koopmans, Courts and Political Institutions: A Comparative View, Cambridge 2005, Chapter 8.</p>