PERSUASIVE TECHNOLOGY and HUMAN RIGHTS

LL.M. Law and Technology, Tilburg University, Netherlands

Panagiota Tokamani

2016

Abstract

The fast-paced development of technology over the last decades has contributed, inter alia, to the emergence of a new notion known as "persuasive technology". Computing systems are, in many cases, designed with the intention of shaping human attitudes or behaviors. Under these terms, the question that reasonably arises from a legal perspective is the following: to what extent may persuasive technology affect fundamental human rights established in the European legal order, and what level of regulatory protection do these rights enjoy, so that this possible effect may be mitigated?

This paper presents the issues that follow from this emerging question, analyzing the legislative imprint of the human right to privacy, the cornerstone of two closely related rights: the right to data protection and the right to autonomy. Conclusions are drawn with respect to the interaction of these human rights with the doctrine of "persuasive technology", examining whether privacy-protective law, under the current legislative framework, is consistent with the policies used by the technology of persuasive computing systems.






Table of contents

CHAPTER 1: Introduction
1.1. Abstracted description of the topic
1.2. Overview of the paper

CHAPTER 2: Persuasive Technology - A newly introduced notion
2.1. Introduction to the term
2.1.1. Persuasion
2.1.2. Technology
2.1.3. Persuasive Technology
2.2. Captology
2.3. Practical examples of persuasive technology
2.4. Conclusion

CHAPTER 3: Right to Privacy
3.1. The notion of privacy
3.2. The aspects of the right to privacy
3.3. Conclusion

CHAPTER 4: Right to Privacy and Persuasive Technology
4.1. The legal context
4.2. Persuasive technology and e-privacy: Challenges

CHAPTER 5: Right to Data Protection and Persuasive Technology
5.1. The legal context
5.2. Data protection and persuasive technology

CHAPTER 6: Right to Autonomy and Persuasive Technology
6.1. The context of autonomy
6.2. The right to autonomy and persuasive technology

CONCLUSION


LIST OF SOURCES
List of books
List of articles
Table of legislation
Table of case law
Table of sites


CHAPTER 1

Introduction

1.1 Abstracted description of the topic

Persuasion is "an attempt to shape, reinforce or change behaviors, feelings or thoughts about an issue, object or action" [1]. This is the most general definition, used in the psychological analysis of human characteristics and conduct. People can persuade, artifacts may persuade, and lately technology is considered to persuade. This thesis focuses on persuasive technologies, and especially computing systems, with the aim of analyzing the meaning and extent of this term, and of probing, through the spectrum of the legal order, this newly introduced subject and how it may affect specific fundamental human rights.

The term "persuasive technology" triggers various questions requiring deeper analysis of both of its component words. How can persuasion be achieved through or with technology? Which faculties of technology can be identified as persuasive? Who wields this power of technological influence? What is the distinctive characteristic of persuasion, so that it is not confused with deception or coercion? These questions are addressed first, as an introductory note, so that the reader becomes familiar with the issue in question before going deeper into the examination of its legal aspects in the following chapters.

Within the context of this introduction to the main issue, a newly emerged field of study, captology, will be used as a component for the comprehension of "persuasive technology" in terms of persuasive computing systems. Captology is the study of the area in which computing and operating systems intersect with persuasion [2]; its name is an acronym derived from the term "Computers As Persuasive Technologies". By definition, captology focuses on the persuasive "power" that computing systems alone may bear. Accordingly, the present paper narrows the wide range of persuasive technologies to "computing/operating systems as persuasive tools", focusing on cases of persuasion originating from computing systems and drawing on the reviews and analyses coming from the field of captology.

The most recognized and thorough definition of "persuasion" in the field of technological development may be the one interpreting persuasion as "the interaction of computing systems in order to change behaviors and attitudes" [3].

[1] B. J. Fogg, "Persuasive Computers: Perspectives and Research Directions" (1998), CHI 98, Stanford University, p. 225.
[2] B. J. Fogg, Persuasive Technology: Using Computers to Change What We Think and Do (Morgan Kaufmann Publishers, 2011), p. 5.


The key point of persuasion in the case under discussion is intentionality: persuasion does not result as an accidental consequence of a sequence of incidents; it implies an intention to change the attitudes of the persuaded subjects. More particularly, the designed effects accruing from the planned persuading process may lead users of technological equipment to register their personal data on myriad internet sites under the pretext of an urgent need for identification, to accept the terms of a service in order to use it on the assumption of their own safety, or to be motivated by various apps to foster a specific way of life that is controlled entirely by those apps themselves [4]. These examples raise concerns about the consequences of persuasive technology's prevalence with regard to the respect or violation of the fundamental human rights implicated in these practical examples.

"Technology" is the second component of the term "persuasive technology". Pursuant to the theory of captology, the main virtual environments that play a major persuasive role on the technological scene are computers, mobile phones and video games [5]. This generalization does not exclude any other computing system from the list of persuasive technological equipment when the latter intends to persuade its users in a specific direction [6]. Nevertheless, for the needs of the present paper, the term "persuasive technology" refers merely to computing systems.

In more precise terms, computers were first introduced as tools for storing, handling and loading data, and mobile phones as a means of connecting people outside their personal space regardless of distance. Their initial goal was not to persuade. Their invasion into daily life, however, secured them a more robust role. Easy access to websites of widely ranging interest, the downloading of apps used as finance managers, music players or exercise coaches [7], and instant participation in social media (blogs, social networking tools) have invigorated users' close association with these facilities, not only in the field of personal entertainment but also, at a national level, in the sectors of health, education, business, cultural heritage and environmental protection [8]. As these achievements became part of national priorities, their persuasive power increased by design. Taking the persuasive dominance of the aforementioned operating systems as a springboard, the next step to be taken is the exploration, within a legal framework, of the interaction of persuasive technology with vested, fundamental human rights.

[3] Ibid., p. 1.
[4] Ibid., pp. 17 and 19; Harri Oinas-Kukkonen, "Behavior Change Support Systems: A Research Model and Agenda" (2010), 5th International Conference, PERSUASIVE 2010, p. 4.
[5] Supra note 1, p. 225.
[6] Supra note 2, p. 2.
[7] Jennifer J. Preece, "I Persuade, They Persuade, It Persuades!" (2010), 5th International Conference, PERSUASIVE 2010, p. 2.
[8] Ibid.



The principal human rights engaged by the examined issue are, without any doubt, the right to privacy, the right to data protection and the right to autonomy. Privacy is the core of the right to data protection and the right to autonomy; these rights constitute central aspects of the public interest [9] and are therefore identified as fundamental human rights. Privacy can be distinguished, inter alia, into three different kinds: decisional privacy, local privacy and informational (data) privacy [10]. The last type is the least protected, from a regulatory point of view, in the debate on persuasive technology, and relates to the processing of personal data, namely location and traffic data. Such data are processed mainly through monitoring and tracking. It is notable that the e-Privacy Directive [11] explicitly addresses the practice of cookies and its statutory regulation, since cookies are used to serve the purposes of monitoring and tracking, as sketched below.
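To make the tracking mechanism concrete, the following minimal sketch (in Python, with hypothetical names such as tracker_id; not drawn from any cited system) shows how a persistent identifier cookie lets a site link separate visits by the same user into one browsing trail:

```python
import uuid

# In-memory stand-in for a tracking backend (illustrative only).
visit_log = {}  # cookie identifier -> list of pages visited

def handle_request(page, cookies):
    """Serve a page: set a persistent identifier cookie if absent,
    then record the visit against that identifier."""
    cookie_id = cookies.get("tracker_id")
    if cookie_id is None:
        cookie_id = str(uuid.uuid4())      # new visitor: mint an identifier
        cookies["tracker_id"] = cookie_id  # a real server would send Set-Cookie
    visit_log.setdefault(cookie_id, []).append(page)
    return cookies

# The browser's cookie jar persists across requests, so visits link up:
jar = {}
for page in ["/home", "/running-shoes", "/running-shoes/reviews"]:
    jar = handle_request(page, jar)

print(visit_log)  # one identifier now maps to the full browsing trail
```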

Data protection norms provide all the legislative safeguards with which the designers of persuasive technology have to comply when processing the personal data communicated by data subjects. The revelation and communication of this personal information is the consequence of a successfully designed persuading process, which aims at convincing data subjects to follow, inadvertently, the procedure that persuasive operating systems (i.e. apps or sites) demand for their "proper usage", including the compulsory disclosure of personal data. The most illustrative example of this situation is the requirement that the recipient register with personal information in order for a persuasive app to be downloaded, a persuasive site to be displayed or a persuasive video game to be properly installed. The issue raised is to what extent the right to personal data protection, as stipulated in Article 8 of the European Convention on Human Rights, is respected or infringed by the designers of these persuasive technological tools. Profiling is an additional facet of this issue within the field of the right to data protection in persuasive technology. It is based on the aggregation of the sites visited, the keywords used or the content of the searches performed by the data subject in the past, and it is used in practice as a tool for recommending to data subjects sites or apps which statistically touch upon their interests, as sketched below. The Data Protection Directive [12] extensively regulates this issue.
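A toy illustration of such profiling, assuming a simple keyword-counting model (all names and data below are hypothetical):

```python
from collections import Counter

# Toy catalogue: each item is tagged with topics (hypothetical data).
catalogue = {
    "marathon-training-app": {"running", "fitness", "health"},
    "budget-planner-app":    {"finance", "planning"},
    "trail-shoes-shop":      {"running", "shopping"},
}

def build_profile(search_history):
    """Aggregate past search keywords into a weighted interest profile."""
    profile = Counter()
    for query in search_history:
        for word in query.lower().split():
            profile[word] += 1
    return profile

def recommend(profile, top_n=2):
    """Score items by how strongly their tags overlap the profile."""
    scores = {item: sum(profile[tag] for tag in tags)
              for item, tags in catalogue.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

history = ["running shoes", "running plan for beginners", "fitness tracker"]
print(recommend(build_profile(history)))  # running-related items rank first
```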

Finally, it is notable that, in the majority of cases, vulnerable groups, such as children, the elderly, the mentally disabled or chronically lonely persons, are considered to be influenced the most by these practices of persuasive technology [13].

[9] International Council on Human Rights Policy, "Navigating the Dataverse: Privacy, Technology, Human Rights" (2011), p. ii.
[10] Ibid., p. iii.
[11] Article 5(3), Directive 2002/58/EC.
[12] Articles 7 and 8, Directive 95/46/EC.


It may be speculated that this aspect has to be examined through an ethical lens; nonetheless, legal concerns arise as well. The groups referred to are treated, so to speak, as docile instruments, in many cases unable to detect the danger that technology conceals in these fields. The question that arises here concerns the information provided by the designers of persuasive computing systems to the recipients, and the terms and conditions of the consent that may be given by the latter, so that the rights to privacy, data protection and autonomy are not violated. The core of these fundamental rights, as already mentioned, is privacy [14], and therefore the detailed examination of its multiple aspects is inextricably linked to the final analysis of the rights to data protection and autonomy and their intersection with persuasive technology, so that conclusions can be drawn.

From this brief introductory analysis, the main issue that arises is the interactive relation between the rapidly advancing technological persuasion achieved by the intervention of computing systems in everyday life, and the vested human rights putatively violated by this newly introduced phenomenon. How easily can legislation keep a balance between these intersecting forces? Under what legislative spectrum does the discussed issue have to be examined? These queries will be addressed in what follows, citing the current, and potentially future, provisions that have been enacted or proposed in the legal community.

1.2 Overview of the paper

The main research question that the present paper sets out to answer is whether, and how, persuasive technology affects human rights established under the European legal order, and precisely the right to privacy, the right to data protection and the right to autonomy. What is the regulatory level of protection of these human rights under current legislation, and is the protective law effective in preventing or mitigating the possible adverse effects of persuasive technology? Further, how has the EU charted the directions of potential future legislation, so that the most adequate protection of these human rights can be achieved? These questions will be answered in the following chapters of the thesis, and especially in the concentrated conclusions drawn in the last chapter, which summarize the main points of the preceding analysis.

In the first chapter, a brief introduction has been presented, approaching the debated topic in a way that provides the reader with a main idea of the subjects that the present thesis purports to analyze.

[13] Supra note, p. 232.
[14] Supra note 9, p. iv.


The term "persuasive technology" is quite new in the field of research, and the topic of this paper is therefore largely unknown to the average reader, who needs a summary understanding of its meaning and extent. Additionally, persuasive technology has been placed within the framework of legal research, and especially in relation to the violation of the specific human rights listed above, so that the further analysis of the central concept of the paper is more perspicuous.

In the second chapter, the doctrine of persuasive technology is examined in more practical terms. Although the notion of persuasive technology dates only from the last two decades, its earliest roots are found in ancient times, while its consequences in the modern age have led to controversial results. Therefore, a detailed description of the features of persuasive technology will be given, in the way it has been studied by the science of captology, and practical examples will be presented proving the leading role this phenomenon holds in various aspects of everyday life.

In the third chapter, an introduction will be given to the human rights affected, within the present paper, by the notion of persuasive technology, together with a more academic approach to the right to privacy. The right to privacy constitutes the cornerstone of two further rights related closely and directly to it: data protection is an inherent component of the right to privacy and, at the same time, the precondition for the individual's right to autonomy. A brief presentation will be given of the key points of the concept of privacy and its multiple facets, more particularly the right to private life, the right to data protection and the right to autonomy, which will be further examined in the following chapters.

In the fourth chapter, the framework of the first of the fundamental human rights closely linked with the impacts of persuasive technology will be examined: privacy. Privacy, as stipulated under Article 8 ECHR, is an established right of essential importance, and the safeguards for its protection are required to be maintained in the field of persuasive technology. One of the main issues in this vein is the extent to which the designers of persuasive technological tools respect the implications of the right to e-privacy, as regulated under Directive 2002/58/EC, and how the legislation may be violated in practice by the operation of persuasive computing systems.

In the fifth chapter, the right to data protection will be probed in detail. The writer deals with the correlation between the operation of persuasive computing systems and the right to the protection of personal information. The legitimate enforcement of the law, as described in the European Convention on Human Rights and Directive 95/46/EC, will be examined, together with the citation of tangible examples of persuasive technology's practices. Apart from the main provisions regulating the conflicts raised between persuasion in computing systems and the right to personal data protection, further norms provided by the same legislative corpus may apply with respect to the conditions that must be met for proper consent to be given by the recipient regarding the processing of her personal information.


These conditions, which ought to be satisfied, will be examined under the spectrum of legislation and case law in extensive correlation.

The legislation implemented in the cases of the vested rights to privacy and data protection allows conclusions to be drawn about the nature of the right to autonomy. In the sixth chapter, the intersection of persuasive technology with the right to autonomy will be examined. No explicit provision regulating this right per se exists in current legislation. Nonetheless, autonomy arises from the core of the rights to privacy [15] and data protection, and therefore the analysis carried out in the previous chapters will contribute to the most comprehensive presentation possible of the engagement of the right to autonomy with the notion of persuasive technology.

At the end, a conclusion will summarize the main points of the present thesis, aiming to answer the main research question in view of all the data and facts previously presented. The writer's opinion regarding the outcomes of the research will be given at the end of the paper.

Before going deeper into the analysis of the issue of this paper, the sectors that the present thesis does not approach should be noted, so that the scope of the paper is clearer. For the purposes of the present analysis, the human rights affected by persuasive technology are examined within private activity; the public sphere is not considered. This implies that no theory will be presented about, for instance, the role of persuasive technology in the integrity of the electoral process or in the traffic of data for national security reasons.

Further, the present paper aims exclusively at classifying and examining the aforementioned human rights as they relate to individuals and the private use of persuasive technology, and merely from a legal perspective, so no ethical extensions are displayed within its content. Finally, as will be analyzed in more detail in the following chapters, persuasive technology is understood in the present thesis only in the field of computing systems, namely computers, mobile apps and videogames; the examination of other types of persuasive technological equipment is excluded.

[15] Article 8 ECHR.


CHAPTER 2

Persuasive Technology - A newly introduced notion

2.1 Introduction to the term

Persuasive technology constitutes a newly introduced concept in technological research and in legal academic circles. Because of the relatively young origins of this subject, there is much controversy regarding the deep understanding of its meaning at a practical level, and legislation therefore has to face the so-called Collingridge dilemma [16]. According to this doctrine, regulators confronting a new technology must take a risk: they can regulate the unanticipated effects of the technology early, but not adequately, since its consequences cannot yet be foreseen; or they can wait until those effects are actually understood, by which time the peril of losing control over regulation is inevitable. This phenomenon will be demonstrated in more detail in the next chapters, which approach the gaps of the current legislation in the field of persuasive operating systems.

Returning to the deconstruction of the compact term "persuasive technology": at first sight, the term seems easily understood by the average reader, with no further processing necessary for its comprehension. On a second view, however, the notions expressed by the words "persuasion" and "technology" require deeper examination, so that the scope of this thesis is more precise.

This chapter undertakes that deeper examination of the term "persuasive technology" by approaching the definition and meaning of the words "persuasion" and "technology" separately in the first section. In the first part of this section, the term "persuasion" is examined from its origins, and the criteria that must be satisfied by default for the notion of "persuasion" in technology to apply are given in detail. The second part relates to the term "technology". In this thesis, as already mentioned, "technology" means merely computing systems, namely computers, videogames and phone apps. Computing systems are examined in the light of their role as social actors, which they hold within the "three-dimension theory". At the end of the first section, the combined term "persuasive technology" is defined, underlining the substantial role of intentionality integrated into its meaning for the needs of this thesis.

[16] Wolfgang Liebert and Jan C. Schmidt, "Collingridge's dilemma and technoscience: An attempt to provide a clarification from the perspective of the philosophy of science", Poiesis & Praxis (Springer-Verlag, 2010, DOI 10.1007/s10202-010-0078-2), pp. 57-58.


In the second section of this chapter, a brief presentation of the study of "captology" takes place. "Captology" is the study of the field in which persuasion and the technology of computing systems intersect, giving rise to further inferences regarding the debated concept of persuasive technology. In the third section, a more practical approach is attempted, giving examples of how persuasive technology has come to surround our daily life, namely e-health and m-health, e-commerce, marketing policy, and social and personal dynamics. Finally, a brief overview is given of the main elements of persuasive technology on which the present thesis will focus.

2.1.1 Persuasion

People are social animals who constantly interact with each other, persuading and being persuaded. This is the first fact upon which Aristotle builds his theory of persuasion in his Rhetoric [17]. Analyzing the importance of persuasion, Aristotle listed three elements, namely credibility (ethos), emotion (pathos) and logic (logos), the combination of which contributes to the perfection of the art of persuasion; according to Aristotle, a person exposing himself in public is required to possess this qualification at an optimal level [18]. Credibility is one of the components upon which persuasive technology is built, as will be presented in the coming sections.

Using as a basis the theory developed by the great philosopher around the significance of persuasion, many sciences have founded doctrines regarding the meaning of persuasion. There is no single definition that can be used dogmatically to cover all possible aspects of the term. Psychology, marketing, advertising and communication studies have elaborated on the essence of persuasion from differentiated points of observation [19].

In psychology [20], persuasion entails a process through which a person's attitude or behavior is affected by various factors, inter alia the communicational influence of other people. Persuasion in terms of advertising, on the other hand, focuses on the influence of product promotion on consumers' choices, with the ultimate goal of increasing purchases of the specific promoted product or service [21]. Finally, persuasive communication concerns the attempt of people, especially those connected closely to others by family or friendship ties, to affect each other's beliefs.

[17] Αριστοτέλης, Ρητορική, Βιβλίο Πρώτο (Θεσσαλονίκη: Ζήτρος, 2002), σελ. 136-161; Aristotle, Rhetoric, Book 1, Introduction and Translation (Zitros Ed., 2002), pp. 136-161.
[18] Ibid.
[19] Supra note 2, p. 24.
[20] Miller, Richard, Brickman, Philip, and Bolen, Diana, "Attribution versus persuasion as a means for modifying behavior" (1975), Journal of Personality and Social Psychology, Vol. 31(3), p. 430.
[21] Thomas J. Reynolds, Charles E. Gengler and Daniel J. Howard, "A means-end analysis of brand persuasion through advertising" (1995), International Journal of Research in Marketing, Vol. 12, Issue 3, p. 145.


This method relies predominantly on emotion, but always under the condition of free choice [22].

The features of the above definitions may vary according to the field in which they were "born". Their common hallmark, however, is that persuasion purports to alter attitudes and opinions toward a specific way of living. Persuasion used in technological terms does not deviate from this general rule. Therefore, persuasion in technology is the intended endeavor to change the behavior and attitudes of the users of persuasive technological equipment, but under the condition of voluntary choice and without the use of any deceptive means [23]. Deception should not be confused with persuasion: deception includes the element of misinformation, which is absent from the procedure of persuasion [24]. Users choose of their own will to use the persuasive technological equipment, being fully aware of its utility and the aspects of its operation. The aforementioned condition of voluntary choice must be met by default.

Owing to the wide international use of technology, persuasion in technology may have an impact on attitude change not only at a personal level but also at a social level. This social extent is confirmed when social networking, discussion groups, recommendations or problem reporting serve national priorities concerning health, energy, environmental issues, day-to-day business policy, problems of society, or communication within circles of relatives and friends [25]. The global character of these consequences proves the ground that persuasion in technology has gained throughout the years.

2.1.2 Technology

Technology, as a general term, includes all the machinery, modifications and arrangements used by people with the assistance of tools, and which purport to improve standards of living. For the issue under discussion, however, technology should be interpreted in a narrower way. Where persuasion is concerned, the technological achievements that play the predominant role are computers, video games and mobile phones; in other words, computing systems [26].

Computers were first introduced to the market as a means of storing and retrieving data; they gradually became part of ordinary everyday life, and today they provide services of essential importance for the satisfaction of practical needs.

[22] Richard M. Perloff, The Dynamics of Persuasion: Communication and Attitudes in the 21st Century (2nd ed., Lawrence Erlbaum Associates, Mahwah, 2003), p. 1.
[23] Supra note 2, p. 15.
[24] Ibid.
[25] Supra note 7, p. 2.
[26] Supra note 2, pp. 19 and 23; supra note 7, p. 3.


It was not their intended purpose to persuade or to impose a specific way of living. Nonetheless, owing to the technological breakthroughs of recent years, computing systems today play a three-dimensional role: they operate as tools, as media and as social actors. This is the so-called phenomenon of the "functional triad" [27], intelligibly depicted as a triangle whose vertices are these three functions.

The first function is reasonably comprehensible: computing systems perform activities that humans can also do, but they reach the same result in a more efficient and simplified way and in the shortest time.

Even from this perspective, computers may persuade data subjects through the calculations they perform or the process they follow in handling the command given by the user [28]. Operating as media, computers affect mainly the behavior of the user by welcoming him into a virtual environment. This is described as the sensory function of the second dimension of computing systems. The environment provided can motivate its "visitors" to understand better the relationships that rely on cause-effect links and to gain insight into their own behaviors [29]. The third and final aspect of the role of operating systems in the concept of "persuasive technology" is the one with which the present paper deals in relation to the meaning of "technology", as it is considered the most interactive: computing systems operate as social actors. This role is the most essential in the persuasive technology addressed in the present thesis since, under this spectrum, computing systems take over the role of a person who could potentially interact with the user face to face. Computing systems may work as coaches, assistants, opponents, even pets, expressing feelings and creating very close, perhaps inseparable, bonds with the data subject. They influence the general behavior of the latter in the social pyramid, and they adopt social dynamics which direct the user [30]. What ought to be outlined in this phenomenon is that, although computers do not use the first person singular ("I"), they are treated by humans as persons able to meet their demands and to respond to them accordingly. It is reasonably deduced that in this way users become emotionally involved with their computer devices [31].

A second technological achievement, also considered a form of computing system, which over the years has developed an ever more persuading character, is videogames. Their distinguishing feature as social actors, compared with computers, which are used by people of all ages, is that they target a narrower group of users, especially children and teenagers. Videogames set goals which the players have to pursue in order to win.

[27] Supra note 2, p. 23.
[28] Ibid., p. 33.
[29] Supra note 1, p. 227.
[30] Ibid.
[31] Supra note 2, p. 26.


Regardless of the type of videogame, this is the main structure of the idea of playing; it constitutes the entertaining part and the main reason for their existence. It has been pointed out, however, that the greater the popularity and use video games have gained, the more means they employ to persuade. The designers of video games put their best effort into making them compelling enough to be more valuable on the market, and for this reason they enhance their performance with sounds, graphics or even dialogues which reward or disapprove of a particular way of playing. These elements create the persuasive environment in which the user takes the strategic decisions followed within the virtual environment of the game. It is inferred that he is persuaded to act in a given way, being eager to follow precisely the instructions required for reaching the highest score [32].

Finally, a new service related to the outbreak of new technology is mobile apps. Various new apps run on phones and turn them into social actors. Smartphones have replaced the classic mobile devices, whose role was limited to reaching other mobile telephone users. Nowadays, phones have become a platform for apps with a highly persuasive character. They substitute for music players, personal coaches, business managers, or weather forecasters for people whose jobs depend directly on the weather conditions [33]. One of the most promising and rapidly spreading uses of these apps is in the field of health. The promotion of a healthy way of life, the quitting of smoking, exercise and the management of addictions have been incorporated into myriad apps, which are increasingly designed to adjust to the particular needs of each individual user [34].

The above-mentioned computing systems have a common denominator: they all operate as social actors. This constitutes one of the most essential aspects of the notion of persuasive technology, as such systems interact directly with users and are used so widely owing to the needs (in some cases only artificial ones, but this is not considered in this thesis) they are constructed to create in individuals [35]. In other words, computing systems, in their meaning as social actors, persuade the person to conduct her behavior in a given way, leaving room for possible violation of her human rights, as will be argued in the following chapters.

2.1.3 Persuasive Technology

[32] Ibid., pp. 19 and 230-232.
[33] Supra note 7, p. 2.
[34] Sajanee Halko and Julie A. Kientz, "Personality and Persuasive Technology: An Exploratory Study on Health-Promoting Mobile Applications" (2010), 5th International Conference, PERSUASIVE, p. 151.
[35] Supra note 2, pp. 26-27.


The combination of the two words "persuasion" and "technology" has its roots in ancient Greece. Aristotle gave the etymology of these words [36], proving their close relation. Over the years, the technological revolution has verified this relation, first claimed by Aristotle, and today persuasive technology has been introduced into the fields of research as a new phenomenon with multiple aspects.

Persuasive technology is interactive technology designed to change attitudes and behaviors. The hallmark of this emerging phenomenon is the intentionality that it implies. Persuasion is not stipulated as the side effect of a sequence of incidents; instead, the technology has been designed in a way that intends to persuade [37]. For computing systems to be identified as persuasive, the human interaction with the computing system must be designed to result in the persuasion of the user: "A computer qualifies as a persuasive technology only when those who create, distribute, or adopt the technology do so with intent to affect human attitudes or behaviors" [38].

Intentionality appears in three different types: endogenous, exogenous and autogenous. Referring briefly to the characteristics of each type [39], the main distinction between them concerns the actors by whom persuasion is intended. Endogenous intent comes from the designer of the technological tool; exogenous intent, by contrast, comes from external, independent persons who try to persuade the users of computing systems; finally, autogenous intent is present when the user herself aims to persuade herself through the use of technology. The limits of these definitions are not always precise enough, and therefore in some cases the type cannot be recognized clear-cut. Even in those cases, intentionality must always exist if we are to speak of persuasive technology.

2.2 Captology

Captology is documented in this part of the paper for the sake of scientific substantiation of what has already been presented in the previous sections of the present chapter regarding the direct intersection of persuasion and technology. Captology, as described in the following paragraphs, argues, from a more scientific point of view, in favor of the existence of the notion of "persuasive technology". It further gives academic and research circles the springboard to subsequently examine the potential effects of this debated concept upon human rights. This is the reason for briefly explaining the study of captology in the following lines.

[36] Supra note 17.
[37] Supra note 1, p. 226.
[38] Fogg, unknown date; John W. Shaffer, "Captology: The Study of Computers As Persuasive Technology" (2004), p. 1.
[39] Ibid.


Persuasive technology has been introduced to the international scientific community only in recent decades, shedding light on more parameters of the basic concept upon which technology has been built: to be used as a tool for the improvement of people's primary standards of living. Persuasive technology has raised new issues, which have been examined from two different angles: the interaction of people through computing systems, and their interaction with computing systems. The first type concerns computer-mediated communication, realized as the interaction of the user with the real world using technology-in-use as an active mediator but not as the terminus of his action [40]. In this case technology plays an auxiliary role in the user's communication with reality, as a channel throwing a bridge between the two "subjects".

The second type of examination investigates the interaction of humans with computing systems. It underlines the persuading relation per se between the individual user and the operating system. This is the delimited field of study of captology, the acronym of the term "Computers As Persuasive Technologies" (CAPT-ology) [41].

Captology is, in short, the study of the intersection of interactive computing and operating systems with the general boundaries of persuasion. The term was introduced in Human Computer Interaction studies in 1997 by B. J. Fogg, and it further triggered scientific interest in deeper investigation and research.

Captology examines the subject of the present thesis through two separate prisms: the first answers the question regarding the level of designed persuasion, and the second refers to the types of subjects interacting with, and finally being persuaded by, technology. With respect to the first issue, the levels at which persuasion exists have been categorized into macrosuasion and microsuasion. Macrosuasion is the construction or design of technological products with the exclusive purpose of persuading per se. Microsuasion is defined as the incorporation of small persuasive elements into technological products, which assign to the latter the persuasive character of their final use [42].

The second question that captology has been challenged to deal with is whether persuasion has an impact only at an individual level or at a broader social one.

[40] Peter-Paul Verbeek, "Persuasive Technology and Moral Responsibility: Toward an ethical framework for persuasive technologies" (2006), paper for Persuasive06, Eindhoven University of Technology, p. 3.
[41] Supra note 1; supra note 2.
[42] Supra note 2, pp. 17-18.


It is recognized that any conceptualized notion that works within a narrow space may also work within broader limits. Consequently, not only individual life but also society may be subject to the effects of persuasive technology; more precisely, researchers have pointed out that the outcomes of persuasive technology may be evident in the family, in societal organization and in inter-individual relations, in other words in the community in general [43].

Central to any successful attempt at persuasion is credibility [44]. This factor operates as a normative requirement for technology's aim of persuading users to succeed. The source of the information or other stimulus must be both clearly trustworthy and able to demonstrate sufficient expertise to legitimate the user's acceptance of the persuasion. Systems that behave erratically or are obviously driven by particular interests (such as commercial benefit) lose credibility. To be successful, though, the persuasive stimulus must be delivered at the right time and place: just when and where the user is ready to adopt the new attitude or behavior. For Fogg [45], credibility is the determining factor in the use of the internet for persuasion, and the reason why social networking sites are so successful (and potentially dangerous). The fact that the user is among a "community" of "friends" and peers gives the site, and the opinions expressed on it, an endogenous credibility. In a similar way, mobile technology that is always on and always with us provides a powerful platform for ensuring that credibility is achieved.
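The "right time and place" condition can be illustrated with a minimal, hypothetical trigger check; the context conditions and the message below are invented for illustration and are not taken from Fogg:

```python
from datetime import datetime

def ready_for_prompt(user):
    """Deliver the persuasive stimulus only at an opportune moment:
    an engaged user, the right place, and a sensible time of day."""
    now = datetime.now()
    return (
        user["app_open"]                # the user is already engaged
        and user["location"] == "gym"   # the right place for a fitness nudge
        and 6 <= now.hour <= 21         # not in the middle of the night
    )

user = {"app_open": True, "location": "gym"}
if ready_for_prompt(user):
    # A peer-consensus message borrows the credibility of 'friends':
    print("Nice warm-up! Your friends usually add ten more minutes.")
```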

2.3 Practical examples of persuasive technology

Persuasive technology, resulting from the interaction of users with computing systems, is a phenomenon that has intervened in myriad aspects of everyday life. The broad implementation of technology shows up in approximately every sector of business, individual and social life, and the designers of persuasive technology, having targeted practically all the aforementioned sectors, have already achieved highly successful results. In this section, some of the most common fields involved in the persuasive technology of computing systems will be presented briefly, together with perceptible examples from each.

Health apps running on mobile phones have been downloaded by the majority of holders of mobile devices in their effort to gain a healthier way of life by following instructions recognized as expert. The combination of unhealthy eating habits with a sedentary lifestyle has led to major physical and mental problems. Consumer awareness has therefore led to the adoption of programs working at a precautionary level or providing solutions to already emerging health problems.

[43] Supra note 1, p. 228.
[44] Supra note 2, pp. 125-126.
[45] Ibid.


These programs incorporate strategies assisting weight loss, quitting smoking or following a work-out regime. Facing these issues, many sites and mobile apps have been created that motivate people to exercise by following a training program, replacing the role of a personal coach; an example is the "I move you" site, which engages people in a challenge with friends, who are notified to accept or decline the challenge. Persuasion is found in the fact that users are motivated to act in the shortest time and are "forced" to perform at a somewhat competitive level [46], as sketched below. Many other sites, like DirectLife, MiLife or FitBug, operate on the same motif, personalizing the services they provide according to the needs of the user, in accordance with strategies and theories flowing from persuasion research.
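A rough sketch of this challenge mechanic, assuming a simple notify-and-respond flow (the names are hypothetical; this is not the actual implementation of any of the sites mentioned):

```python
from dataclasses import dataclass, field

@dataclass
class Challenge:
    """A fitness challenge issued to friends, who are notified to accept or decline."""
    issuer: str
    goal: str
    invitees: list
    responses: dict = field(default_factory=dict)

    def notify(self):
        # A real app would send push notifications; printing stands in here.
        for friend in self.invitees:
            print(f"{friend}: {self.issuer} challenges you to {self.goal}. Accept?")

    def respond(self, friend, accepted):
        self.responses[friend] = accepted
        if accepted:
            print(f"{friend} accepted: the time pressure and rivalry begin.")

challenge = Challenge("alice", "run 5 km this week", ["bob", "carol"])
challenge.notify()
challenge.respond("bob", True)
```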

Another domain in which persuasive technology is highly incorporated is marketing. Products and services are promoted with a view to persuading consumers to buy them. The main idea behind this performance is, once again, the personalization of services according to the needs of consumers, using e-commerce recommendations based on those needs and tracking the information for these recommendations via cookies [47]. When people intend to invest their money in a purchase, they want prior feedback regarding the quality of the product or the degree of satisfaction it delivers. In the real world this can be achieved through the opinion of a friend or relative, but this practice cannot always work properly on the web. Even consumer fora, where opinions about a prospective product or service are exchanged, are not always credible: the comments and opinions posted on a forum are impersonal, and they may come from users who derive commercial benefit from the good or bad advertising of the product or service concerned.

This drawback of e-commerce has been compensated for through persuasive technology. In practice, platforms have been created with the reviews of previous consumers; customers can compare the comments of other buyers using a link which almost always accompanies the description of the qualities of the product or service. Another persuading strategy can be implemented through the promotion of the product for sale according to the principle of scarcity. Prominent statements surround the potentially purchasable item, such as: "This item is available today at a special discount rate" [48]; this constitutes a persuasive message intended to lure viewers into becoming more eager to obtain the item in the shortest time, under the pretext of possibly losing the discount. With these tactics, e-commerce has gained great power over market demand, using persuasive technology to enhance its credibility and to become more approachable to customer groups.

[46] B. J. Fogg, unknown date; ibid.
[47] Phillip King and Jason Tester, "The Landscape of Persuasive Technologies", Communications of the ACM (1999), Vol. 42, Issue 5, pp. 31-38; Ran Cheng, "Persuasion Strategies for Computers as Persuasive Technologies", Department of Computer Science, University of Saskatchewan, p. 2.
[48] Maurits Kaptein, "Adaptive Persuasive Messages in an E-commerce Setting: The Use of Persuasion Profiles", Eindhoven University of Technology, p. 6.


These persuasive strategies have been highly successful owing to the breakthrough of computing systems technology, as illustrated by the sketch below. The ground gained over the years by e-commerce, replacing common commercial practices to a great extent, is an outcome of the technology of computing systems, which has persuaded users worldwide to adopt a new way of living that includes the online shopping of products or services in their everyday routine.
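The "persuasion profile" idea cited above [48] can be sketched as a toy model: the shop keeps a per-user estimate of responsiveness to each influence strategy, shows the message matching the highest estimate, and updates the estimate from the user's reaction. All names, messages and numbers below are hypothetical:

```python
# Candidate influence strategies and example messages (illustrative only).
MESSAGES = {
    "scarcity":  "Only 2 left: available today at a special discount rate!",
    "consensus": "87% of previous buyers rated this product 5 stars.",
    "authority": "Recommended by certified experts.",
}

# A persuasion profile: estimated responsiveness per strategy,
# learned from the user's past reactions to each message type.
profile = {"scarcity": 0.2, "consensus": 0.5, "authority": 0.3}

def pick_message(profile):
    """Show the message of the strategy this user responds to best."""
    best = max(profile, key=profile.get)
    return best, MESSAGES[best]

def update(profile, strategy, clicked, lr=0.1):
    """Nudge the estimate toward the observed outcome (1 = clicked, 0 = ignored)."""
    profile[strategy] += lr * (clicked - profile[strategy])

strategy, message = pick_message(profile)
print(message)                        # the consensus message is shown first
update(profile, strategy, clicked=0)  # ignored: its estimate drops slightly
```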

Finally, one of the main features of persuasive technology, as has already been mentioned, is its influence on social dynamics [49]. Social dynamics reflect, within the broad social environment, the unwritten, implicit rules according to which people communicate and interact with each other. The same rules apply to the relation between people and computing systems. A first example achieving the intended scope of affecting personal dynamics, an aspect composing the whole of social dynamics, is the digital toys targeted at children, which have been launched onto the market attempting to conduct children toward specific attitudes. Microsoft's ActiMates characters are only the first samples of this "industry", promoted with the quality of expressing emotions and asking the recipients to respond with comparable feelings [50]. Consequently, children detect certain protocols of behavior through this process, which they then adopt in their interaction with real humans as well. An additional example of social dynamics is the email program Eudora. Like plenty of other similar software, it motivates users to register with their personal data in order to use the facilities of the program properly, giving them two options: to register now or later, but not to cancel the pop-up registration box. Clicking the "maybe later" button implies that users may be convinced to register later and, as the box pops up constantly, at some point they finally do so; a sketch of this dialog logic follows. Social dynamics can be promoted through technology in many respects: when technological systems operate in an environment that encourages a particular way of treating them, using wizards or expressing attitudes and emotions, they gain the ground to nominate the social norms that should be followed in the real world as well.
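A minimal sketch of the "register now or maybe later, but never cancel" dialog logic described above (illustrative only; not Eudora's actual code):

```python
def registration_prompt(answer_stream):
    """Re-prompt on every launch until the user registers.
    The dialog offers 'now' or 'maybe later'; there is no 'never'."""
    for launch, answer in enumerate(answer_stream, start=1):
        print(f"Launch {launch}: Register to unlock all features. [Now / Maybe later]")
        if answer == "now":
            print("User registered: personal data disclosed.")
            return launch
        # 'maybe later' merely defers the same prompt to the next launch.
    return None

# Repeated exposure wears the user down by design:
registration_prompt(["later", "later", "later", "now"])
```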

The examples presented above form part of what is called "ambient persuasion". The term refers to the persuasive, intelligent computing systems that are established in public and private places and can directly interact with users in their everyday life. Access to these systems is very easy, and they can be found in any place where the person spends most of his daily time, such as the home, vehicles, the office, and public entertainment or shopping places. It is inferred that persuasive technology has been well placed to intervene dynamically in everyday life.

[49] Supra note 7.
[50] Supra note 2, pp. 105-106.


2.4 Conclusion

Persuasive technology is a newly introduced notion which in recent years has become a subject of examination. It concerns the change of human attitudes and behavior through computing systems. Probing the term "persuasive technology" in detail, the two words composing it have a close relation in meaning, which dates from ancient times. Persuasion can be achieved in many ways, depending on the intended purpose; in the case of technology, plenty of technological achievements, predominantly computers, mobile phones and videogames, have been engaged with persuasion, acting in different roles, as tools, media or social actors, in order to achieve their designed persuading scope.

The notion of persuasive technology in computing systems has been studied within a specific field named captology, which focuses on the interaction of people with, and not through, computing systems. This indicates the dependent relation that has developed between the two subjects, humans and technological tools. The power of technology's persuasion over human attitudes is highly impressive, affecting social dynamics and applying social norms of behavior. Various practical examples surrounding people in their everyday lifestyle confirm the power that persuasive technology has gained.

This paper focuses on examining the persuasive interaction of users with computing systems operating as social actors. Under this spectrum, technology, which for the needs of this thesis covers, as has already been made clear, merely the use of computing systems, is designed with the intent to persuade individuals to conduct a specific way of living, personalizing the services provided to them. The further issue that needs to be examined is whether this close interaction of people with persuasive technology has positive or negative implications with regard to the protection of fundamental human rights. In legal terms, could persuasive technology affect human rights regulated by European legislation, and what is the framework of safeguards with respect to this potential violation? This is the subject on which the following chapters will elaborate.


CHAPTER 3

Right to privacy

3.1 The notion of privacy

The right to privacy is an ancient right, with roots in various religious traditions, including the Jewish, Christian and Muslim traditions, as well as in ancient Greece and China. Some sort of protection for privacy existed in England as far back as 1361, when the Justices of the Peace Act criminalized eavesdropping and peeping toms [51].

The legal right to privacy is recognized in nearly every national constitution and in most international human rights treaties, including the Universal Declaration of Human Rights, the International Covenant on Civil and Political Rights and the European Convention on Human Rights.

The right to privacy is the cornerstone of every democratic society. In its essential form, privacy is based on the notions of personal integrity and dignity. Privacy is a multifaceted concept [52] that is currently seriously challenged by overall technological developments. The concept of privacy has always been subject to change [53]. Patterns of privacy may differ significantly from society to society, depending on social, cultural and political factors, as well as on the historical situation. Moreover, privacy is often balanced against other values [54].

All these factors make the concept of privacy difficult to define; the list of its possible definitions seems endless [55]. Nevertheless, the lack of a single definition should not imply that the issue lacks importance [56]. Some of the viewpoints on privacy given below demonstrate the different approaches taken in the literature to defining the rich and quite controversial phenomenon that privacy is. As a legislative approach to the notion is analysed in the next section, the current one attempts to describe privacy from a more theoretical, academic angle.

[51] Electronic Privacy Information Center and Privacy International, Privacy and Human Rights 2006: An International Survey of Privacy Laws and Developments (2006), p. 5.
[52] M. Friedewald, "A New Concept for Privacy in the Light of Emerging Sciences and Technologies" (April 2010), From the Selected Works of Michael Friedewald, p. 71.
[53] Ibid.
[54] Ibid.
[55] M. Foutouchos, "The European Workplace: The Right to Privacy and Data Protection" (2005), Accounting, Business & the Public Interest, Vol. 4, No. 1, p. 38.
[56] C. Laurant, Privacy & Human Rights 2003: An International Survey of Privacy Laws and Developments (Electronic Privacy Information Center, Washington, D.C.; Privacy International, London, 2003).


Privacy is recognized as a fundamental human right in Europe under Article 8 of the European Convention on Human Rights. In 2008, the European Court of Human Rights declined to give an exhaustive definition of the notion of privacy and, relying on judgments in previous relevant cases, named the aspects of life that fall under the spectrum of the right to privacy. More precisely, in S. and Marper v. United Kingdom [57], the Court interpreted Article 8 ECHR with the following statement: "the concept of 'private life' is a broad term not susceptible to exhaustive definition".

Outside of the strictly legal context, privacy protection is frequently seen as a way of drawing the line at how far society can intrude into a person's affairs, or as a restriction on the disclosure of information. One of the first definitions, and apparently one of the most broadly accepted, is that privacy is "the right to be let alone". It was coined in the 1890s by the future United States Supreme Court Justice Louis Brandeis and his co-author Warren58.

Schoeman defined privacy as “a claim, entitlement or right of an individual to determine what information about himself may be communicated to others; the measure of control an individual has over information about himself”59. According to Robert Ellis Smith, an editor of the Privacy Journal, privacy is a “desire by each of us for physical space where we can be free of interruption, intrusion, embarrassment, or accountability and the attempt to control the time and manner of disclosures of personal information about ourselves.”60 Furthermore, Alan Westin has defined privacy as “the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others”.61

There is also an attempt to define privacy through three "zones" in need of protection.62 The first deals with territorial or spatial aspects (e.g. privacy within somebody's home). The second zone refers to the person as such, linking privacy exclusively to intimate or sensitive aspects of one's life. Finally, the last zone of privacy is understood in terms of information control.

57 S. and Marper v. United Kingdom, nos. 30562/04 and 30566/04, judgment of 4 December 2008, para. 66
58 Warren/Brandeis, 'The Right to Privacy', (Harvard Law Review, Vol. IV, No. 5, December 1890)
59 Veghes/Pantea/Balan/Lalu, 'European Union Consumers' Views on the protection of their personal data: an exploratory assessment', (Annales Universitatis Apulensis Series Oeconomica, 11(2), 2009), p. 988
60 R E Smith, Ben Franklin's Web Site: Privacy and Curiosity from Plymouth Rock to the Internet, (Sheridan Books, 2000), p. 6
61 A Westin, Privacy and Freedom, (1967), London, Bodley Head, p. 7
62 Supra note 45, p. 37


When discussing privacy in this last aspect, it is more appropriate to operate with the notion of "information privacy". Not surprisingly, in the digital age the third zone of privacy and its "informational dimension" has become the primary focus of public attention and legislative development.63 The following definition of information privacy may be provided here: "it is an interest that individuals have in controlling, or at least significantly influencing, the handling of personal data about themselves"64. The term "data privacy" is sometimes used in the same way. The notion emerged during the mid-1960s, and the growth of its importance is often perceived to be directly linked to the development of computer technologies. An understanding of privacy through its third, "informational" zone is of particular importance for the current research, which deals directly with the law and policy on information privacy and data protection.

The definition of privacy in this context would remain insufficient without a better understanding of what exactly counts as private information. However trivial it may sound, a definition given by Stanley Benn in 1971 explains the core idea behind private information more clearly than anything else. He defined it by referring to a simple example: a couple kissing in the bushes to hide from the public, thus acting privately, or in private. Although the couple's act may have been meant as a private affair, the two could later decide to share this experience with someone else, at which point the private matter becomes public. Benn believes that "it is not that the information is kept out of sight or from the knowledge of others that makes it private. Rather, what matters is that it would be inappropriate for others to try to find out about, much less to report on this information, without the couple's consent".65

Privacy is a broad concept relating to the protection of individual autonomy and the relationship between an individual and society (including governments, companies, and other individuals). Privacy is considered essential in protecting an individual's ability to develop ideas and personal relationships. Although it is often summarized as "the right to be left alone"66, it encompasses a wide range of rights, including protection from intrusions into family and home life, control of sexual and reproductive rights, and communications secrecy.67 It is commonly recognized as a core right on which human dignity and other values are based.

3.2 The aspects of the right to privacy

63 R Clarke, 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms', (15 August 1997), Roger Clarke's Web-Site
64 R Clarke, 'Beyond the OECD Guidelines: Privacy Protection for the 21st Century', (4 January 2000), Roger Clarke's Web-Site
65 S I Benn, 'Privacy, Freedom, and Respect for Persons', in R J Pennock & J W Chapman (eds), Privacy (pp. 1-26), New York: Atherton Press, 1971, p. 2
66 Olmstead v. U.S., Supreme Court Justice Louis Brandeis, 1928
67 Niemietz v. Germany, ECHR: "The Court does not consider it possible or necessary to attempt an exhaustive definition of the notion of 'private life'".


At the core of the right to privacy lies the right to private life, in the sense of a right to conceal, but also to voluntarily dispose of, personal information. At the same time, this right in its enlarged form nowadays involves other aspects relevant to this presentation, namely the right to data protection and the right to autonomy. These are not conceived as self-standing human rights under international and European law, but rather as two important aspects of an enlarged "right to privacy"68.

Strongly related to the right to privacy is the right to data protection. Due to the close interaction between these rights, it seems necessary, before proceeding to the analysis of the legal framework for data protection, to first define the relationship between privacy and data protection.

Privacy and data protection have been characterized as two sides of the same coin. The hallmark of privacy may be claimed to be opacity, whereas data protection concerns transparency69. Two main theoretical approaches to the relation between privacy and data protection can be outlined: theoretical attitudes, as well as legal policies and practices across the globe, either treat data protection as subsumed by or largely intersecting with privacy, or, alternatively, treat the two categories as entirely distinct.

In the European legal order there is conclusive evidence in favour of treating data protection interests as an integral part of a more general right to privacy, with the consequence that data protection interests enjoy the full status of a fundamental human right.70 Under European legislation, the right to data protection is an inherent component of the right to privacy and, at the same time, a precondition for the individual's autonomy.

As will be further discussed below, one of the key pieces of legislation relevant to the right to privacy is Article 8 ECHR. This article establishes the right to privacy in its totality, including both the right to private life and the right to data protection. Information privacy (or the right to the protection of personal data) under the European model is conceived as a fundamental component of a broader right to privacy71. As the latter is largely safeguarded by means of Article 8 ECHR, data protection should also benefit from the shielding power of the basic human rights instrument that the Convention is. Personal consumer information, as a result, cannot be freely exchanged in the marketplace, but must be protected from exploitation. For business, the consequence is a clear delimitation of data collection possibilities, with little room for interpretation.

68 Privacy International, 'Privacy and Human Rights: An International Survey of Privacy Laws and Practice', Global Internet Liberty Campaign
69 Supra note 9, p. 55
70 Supra note 56, p. 3
71 Ibid


At the same time, the right to data protection has been incorporated into the protection guaranteed by Article 8 ECHR through the jurisprudence of the European Court of Human Rights. In the case of I. v. Finland72, the Court stated that "personal information relating to a patient undoubtedly belongs to his or her private life". This judgment confirmed the wide scope of the protected privacy rights and prepared the ground for including the entire body of data protection rules within the privacy interests protected by Art. 8 ECHR.73

Regarding the right to autonomy, there is no explicit provision for it in the current legislation. Instead, the Universal Declaration of Human Rights assumes that persons have the ability to lay the foundations for an environment in which they can develop their autonomy. Further, Article 8 ECHR provides for the inclusion of the right to autonomy under the broader umbrella of the right to privacy. Personal autonomy, conceived here as people being free and having the authority to determine their own course of action according to their own goals and values, takes a prominent place in discussions about persuasive technological tools.

A normative implication of free, autonomous will, which essentially means independence, is that the person bears full responsibility for his own choices. In parallel, autonomous will implies that others respect one's right to follow one's own decision-making process without intervening in it. In brief, the right to autonomy should be understood as "a fundamental boundary not to be violated"74.

As a result, human autonomy, besides its interaction with the right to privacy and the right to personal data protection, also encompasses the right to freedom, and especially the freedom of choice. At this point, it seems that persuasive technological tools, designed inevitably and often implicitly to help shape human actions and perception, can stand opposed to the fundamental human right to autonomy.

3.3 Conclusion

New forms of technology appear nowadays more and more threatening to the right to privacy. Specifically, persuasive technology methods, in their attempt to ameliorate human life, put at risk the traditionally guaranteed right to privacy. The massive storage of personal information about the user, obtained after a dubious consent, endangers the legitimate right of every human being to keep private information secret and to dispose of it freely and voluntarily.

72 I. v. Finland, Application no. 20511/03, ECHR 17 July 2008
73 Supra note 56, p. 12
74 J. Anderson, 'Autonomy', (2012), in H. LaFollette, J. Deigh and S. Stroud (eds), International Encyclopedia of Ethics, Wiley-Blackwell


These are precisely the critical aspects of the right that are jeopardized by this new persuasive technology. Crucial for the purposes of this presentation are the words "consent", "information" and "freely and voluntarily". These words refer directly to the right to privacy in its narrower sense, but also to the right to protection of personal data and the right to autonomy.

In the following chapters, each of these aspects will be discussed and examined through the prism of its interaction with persuasive technology and the challenges posed by this newly introduced technological method.


CHAPTER 4

Right to privacy and persuasive technology

4.1 The legal context

Nowadays, there is a particular legislative focus on the protection of this right. Relevant international and European texts, as well as national legislation, focus more and more on the necessity of obtaining clear consent from a properly informed individual. As will be presented below, the appearance of persuasive technology has led the international community to reinforce the legislation relevant to the right to privacy, as new threats have appeared.

In the twentieth century, privacy appeared as a human right at the international level. The Universal Declaration of Human Rights (UDHR) of 194875 captured the first effort to ensure privacy as an autonomous human right. Although the UDHR does not contain legally binding provisions, its contribution to the codification of this right has proved crucial, and the right to privacy can be found in many other legal documents, including the legally binding International Covenant on Civil and Political Rights (ICCPR). Article 17 of the ICCPR provides that: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks."76

According to the ICCPR, the right to privacy extends to multiple forms of activity on the internet, including the collection and storage of users' personal information. According to United Nations resolution 68/16777, the international conventions set out the general conditions and criteria for the protection of human rights, which are further specified at the national level.

Although the right to privacy is not absolute78, any restriction of this right must be provided by law (the legal text must be accessible, clear and precise, so that an individual may have access to the law and ascertain who is permitted to conduct data surveillance); must be necessary for achieving a particular objective; and must be proportionate. Moreover, every restriction posed on the right must be shown to be

75 UN, Universal Declaration of Human Rights (UDHR), 1948, Article 12, available at http://www.un.org/en/universal-declaration-human-rights/
76 International Covenant on Civil and Political Rights (ICCPR), adopted by the United Nations General Assembly on 16 December 1966, Article 17, available at https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf
77 Resolution A/RES/68/167 adopted by the General Assembly on 18 December 2013, available at https://ccdcoe.org/sites/default/files/documents/UN-131218-RightToPrivacy.pdf
78 Supra note 68


appropriate for achieving that objective79. A burden of proof is therefore established that the restriction is necessary and appropriate for achieving the legal aim. Furthermore, any restriction to the right to privacy must not go so far as to negate the essence of the right itself, and must be consistent with other human rights. If a restriction does not fulfil these criteria, it is illegal.

Article 8 of the European Convention on Human Rights reads as follows: "Everyone has the right to respect for his private and family life, his home and his correspondence. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others."80

By this definition of the right, it becomes obvious that European public policy has to face many problems and threats. The protection of the right to privacy has a special significance, as it is a distinct human right protected by the ECHR. The complexity and variety of the threats posed to the right to privacy nowadays make the creation of an appropriate legal framework rather difficult. Apart from the problems arising from the use of technology, even national laws can set barriers to the right to privacy.

It is worth mentioning that the European Court of Human Rights has contributed to the strengthening of the right in two ways. First, it has in many cases reviewed national laws and imposed penalties on member states for failing to protect the right to privacy adequately. This is the so-called positive obligation of the states, in addition to their duty of non-interference, to comply with the provisions protecting the right to privacy81. Second, the Court has extended the application of Article 8 beyond the actions of governments to those of private persons, where it appears that the state ought to have prohibited such actions82.

The Council of Europe Convention 108, which is based on the model of Article 8 of the ECHR, incorporates the data subject's right to privacy, capturing at the same time the basic principles for data processing. Further, it states that "it is desirable to extend the safeguards for everyone's rights and fundamental freedoms,

79 Douwe Korff, 'The Standard Approach under Articles 8-11 ECHR and Article 2 ECHR', (2008), London Metropolitan University
80 Council of Europe, Convention for the Protection of Human Rights and Fundamental Freedoms (ETS No. 005), Strasbourg
81 Marckx v. Belgium, ECHR, 1979
82 X and Y v. the Netherlands (1985) 8 EHRR 235


and in particular the right to the respect for privacy, taking account of the increasing flow across frontiers of personal data undergoing automatic processing”83.

According to the Internet Governance Strategy of the Council of Europe, “the freedom, dignity and privacy of Internet users must be a central concern and priority for democracies, especially governments which rely upon and encourage the use of new technologies.”84

Article 7 of the EU Charter of Fundamental Rights states that "everyone has the right to respect for his or her private and family life, home, and communications".

At the regulatory level, privacy and trust as regards electronic communications in Europe are mainly ensured by Directive 2002/58/EC85. It is worth noting that the "right to respect for [one's] private and family life, home and communications" and the "right to the protection of personal data" are viewed as fundamental and universal human rights. Comprehensive data protection legislation has been enacted in the various EU countries. This legislation grants specific rights to data subjects, while imposing on data controllers important limitations as regards data processing. While data protection authorities are playing a larger role in the enforcement of this legislation, much still remains to be done in order to achieve real awareness of these data protection provisions among both data subjects and data controllers. In Europe, it is significant to note that the aforementioned legislative texts tend to regard self-regulation and co-regulation schemes (Article 27 of Directive 95/4686) as an enhancement of, rather than a substitute for, the means of making data protection legislative requirements more effective and legitimate.

At this point, we will examine the content of the e-Privacy Directive, which is an application of the data protection principles to the electronic communications sector. The e-Privacy Directive's provisions should be examined in this section, since they apply to the personal data processed through persuasive computing systems. As already mentioned in the introductory note, various persuasive technological systems, namely mobile apps or internet sites, require the registration of

83 European Convention 108, Preamble, Strasbourg 1981, available at https://www.privacycommission.be/sites/privacycommission/files/documents/convention_108_en.pdf
84 CM (2011) 175, Internet Governance Strategy, Council of Europe 2012-2015, Section III, 10.1, available at https://wcd.coe.int/ViewDoc.jsp?id=1919461
85 Directive 2002/58/EC of the European Parliament and the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32002L0058&from=EN
86 Article 27 of Directive 95/46 of the European Parliament and the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, available at http://ec.europa.eu/justice/policies/privacy/docs/95-46-ce/dir1995-46_part1_en.pdf


personal information by the user before their use, download and installation. This information allows the persuasive program to identify the user via "cookies", which are used to serve the purposes of tracking and monitoring.
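To make this mechanism concrete, the following is a minimal sketch, in Python, of how a site might use a cookie to recognize a returning browser and accumulate a visit history under one identifier. The cookie name, pages and logic are hypothetical illustrations, not taken from any particular system.

```python
import uuid

# Minimal sketch: a site assigns each new visitor a random identifier in a
# cookie and logs the pages visited under that identifier. All names here
# are invented for illustration.

visit_log: dict[str, list[str]] = {}  # tracker id -> pages visited

def handle_request(page: str, cookies: dict[str, str]) -> dict[str, str]:
    """Serve a page, setting a tracking cookie on the first visit."""
    tracker_id = cookies.get("tracker_id")
    if tracker_id is None:
        # First visit: issue a persistent identifier to the browser.
        tracker_id = uuid.uuid4().hex
        cookies["tracker_id"] = tracker_id
    # Every later request carrying the cookie links back to one profile.
    visit_log.setdefault(tracker_id, []).append(page)
    return cookies

# The same cookie jar, sent back by the browser, ties the visits together.
jar = handle_request("/fitness-app", {})
jar = handle_request("/running-routes", jar)
print(visit_log)  # {'<random id>': ['/fitness-app', '/running-routes']}
```

Once the identifier persists in the browser, every later request can be tied back to the same profile; this persistence is precisely what makes cookies suitable for the tracking and monitoring described above.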

This Directive contains a provision on data breaches that is of interest in the context of information security management. According to the Directive, a data breach is "a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed."87 Furthermore, it is provided that notification to the individuals concerned is required if the data breach is likely to adversely affect their personal data or privacy, as in the case of identity theft. The protection against data breaches is part of a broader obligation to guarantee the confidentiality of electronic communications networks, that is, to ensure that communications are not eavesdropped on, tapped or otherwise intercepted.

The consent regime plays the most significant role in e-privacy rights as well as in data protection. European legislation requires that the developers of persuasive computing systems ask for the consent of users before the final installation or use of persuasive computing technology88. In more detail, the designers have to provide users with all the necessary information about the basic features of the computing systems to be used and the legal ground on which data are processed. Users need to respond to this information and, after having formed an accurate judgment, to act in an active way, mainly by clicking the button "install", "register" or "use", an option which is usually shown as the final step for downloading or using sites, apps or other computing tools.

This action may fulfil the requirement of Article 5 of the e-Privacy Directive, but it further needs to be proved that legitimate and explicit information was supplied to the user beforehand89. Consequently, consent must meet the standards of being explicit, informed and specific. This type of consent should not be confused with the notion of consent required for the processing of the user's personal data, which will be described in the next chapter. In order to avoid this potential confusion, granular consent is strongly recommended; accordingly, consent must be given for every kind of data before its processing. This policy serves a main legal purpose: to provide the user with adequate information and thereby to validate the specific consent given by the latter90.
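As an illustration of the granular consent policy just described, the sketch below assumes a hypothetical app with invented data categories and purposes: it records a separate, explicit consent per data category and refuses to process any category for which no specific consent was given.

```python
from dataclasses import dataclass, field

# Minimal sketch: one explicit consent per data category, checked before
# any processing. The app, categories and purposes are invented.

@dataclass
class ConsentRecord:
    user_id: str
    granted: dict[str, bool] = field(default_factory=dict)  # per category

    def grant(self, category: str, purpose_shown: str) -> None:
        """Record consent only after the purpose was displayed to the user."""
        print(f"Shown to user: '{category}' will be used for {purpose_shown}")
        self.granted[category] = True

def process(record: ConsentRecord, category: str, value: str) -> None:
    """Refuse to process any category lacking its own specific consent."""
    if not record.granted.get(category, False):
        raise PermissionError(f"no specific consent for '{category}'")
    print(f"processing {category}={value} for user {record.user_id}")

consent = ConsentRecord(user_id="u-42")
consent.grant("location", "recommending running routes")
process(consent, "location", "51.56N,5.09E")  # allowed: consent on record
try:
    process(consent, "contacts", "address book")  # refused: never consented
except PermissionError as exc:
    print("refused:", exc)
```

The design choice worth noting is that consent is the exception, not the default: a category missing from the record blocks processing, which mirrors the "specific and informed" standard discussed above.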

87 Article 2 e-Privacy Directive
88 Article 5(3) e-Privacy Directive
89 European Commission, 'Commission Staff Working Document on the existing EU legal framework applicable to lifestyle and wellbeing apps', (2014), SWD(2014) 135 final, p. 14
90 Ibid, p. 15


4.2 Persuasive technology and e-privacy: Challenges

One aspect of the potential violation of the right to e-privacy by persuasive technology is the processing of the personal information known as traffic data, which is required in many cases by computing systems: indicatively, for the use of a mobile app, for instance a fitness app recommending a specific route for exercise, or for a video game in which players engage globally in real time. The basic structure of e-privacy law imposes obligations on the treatment of personal information once collected91. A critical component places limits on the duration of storage of personal information.

Further, the use of the so-called "cookies" method does not always comply with the safeguards of the e-Privacy Directive. In many cases, registration with a site or an app requires the user first to sign in with his personal social networking profile. In 2009, researchers Krishnamurthy and Wills demonstrated that cookies may be used to link social networking profiles with the broader internet habits of users92. This is merely one example of how persuasive computing systems may use the cookie method inappropriately, tracking the internet behaviour of their users against the regulatory protection provided by the e-Privacy Directive.

In general, the whole persuasive technology strategy rests on the idea of knowing as much as possible about the user. The more the system knows, the more appropriately it responds to its customer's desires. This is the principle, mentioned in previous chapters, of personalization of the service provided to users. In that way persuasive computing systems gain popularity, while the user does not consider the dangers lurking for his personal right to privacy.

Although persuasive technology seems to benefit both the market and the consumer, by offering each the right, well-directed products and services based on a fully recorded history of the user, especially via the tracking of the user's personal information, the idea that someone can access, archive and process information and facts of one's private life without clear consent and proper knowledge is alarming.

The threat posed by this method lies in the storage of one's personal information without knowing exactly how it will be used, by whom, and for how long it will remain stored93. One of the main dangers seems to be the strategy of aggressive and perfectly

91 Article 6 e-Privacy Directive
92 Balachander Krishnamurthy, Craig E. Wills, 'On the Leakage of Personally Identifiable Information Via Online Social Networks', (2009), pp. 8-9
93 Supra note 89


targeted marketing which, although favouring the market, seems to violate completely the free will and right of choice of every person.

Knowledge of all aspects of a person's life in fact puts the other side in an advantageous position, leaving the customer only an illusion of choice. And as the safeguards in the current legislation are inadequate, the existing safety valves do not appear to protect the consumer adequately.

It becomes clear that the main challenge currently is to strike a balance between the right to privacy as guaranteed by Article 8 ECHR and the present situation of rapid technological expansion. Clearly, the legal tools already exist in the European environment. However, the solution consists in demanding that member states not only implement privacy-protective legislation, but also ensure its effective enforcement, especially compliance by the companies coming from the other side of the Atlantic and providing their services on European ground. In other words, Article 8 ECHR must be interpreted as also implying positive obligations of a State with regard to the protection of the right to privacy94.

At the same time, it seems necessary that the legal texts concerning privacy be developed and interpreted through the prism of the new challenges posed by persuasive technology. Since computing systems demand more and more of the user's personal information in order to run properly, the provisions must be enforced within a stricter framework, leaving less space for the designers of persuasive technology to exploit illegitimately the personal data that users are required to provide.

This argues for the urgency, besides the positive obligations imposed on States by Article 8 ECHR and the necessary evolution in the interpretation of legal instruments, of extending the obligations of Article 8 ECHR to the designers of persuasive technological tools95, so that the crucial balancing of persuasive technology with the established right to privacy can be achieved. Applying the theory of positive obligations to Internet privacy would, however, entail proving that the privacy infringements committed by a certain network or application are so substantial that they amount to a breach of a fundamental right, and that the State ought to have regulated this field in order to prevent them.

94 Supra note 79, p. 1
95 Bart Preneel, Demosthenes Ikonomou, 'Privacy Technologies and Policy', (2012), First Annual Privacy Forum, pp. 61-62


CHAPTER 5

Right to data protection and persuasive technology

5.1 Legal context

The right to data protection is considered one of the fundamental rights existing within the European Union, which is why the law regulating it is quite explicit and detailed. The right to data protection falls under the wide umbrella of the right to privacy as stipulated in Article 8 ECHR, since the interpretation given in the jurisprudence of the European Court of Human Rights has been quite broad96.

The EU was created as an international environment with the aim of creating a common market. Human rights appeared gradually, as it became obvious that market opening could threaten human rights and that the protection existing under national laws was not adequate. At first, the European Court of Justice took the leading role in establishing human rights protection. In the 1990s, however, the European legislature became active.

Within Europe we can detect a variety of laws protecting personal data. The first modern data protection law in the world appeared in Germany in 197097, followed by national laws in Sweden98 and France99. These laws eventually resulted in an attempt to harmonise national law through the directive of 1995, the EU Data Protection Directive 95/46/EC100.

The Data Protection Directive, proposed in 1990 and adopted in 1995, establishes a complex regulatory framework at the national level to protect human rights. At that time, as the EU was still focused on the aim of the common market, the objective of the data protection regulation was to prevent data protection abuses by market actors and by government101. Recently, nonetheless, EU data protection has taken a different direction.

96 Supra note 57
97 Datenschutzgesetz [Data Protection Act], Oct. 7, 1970, Hessisches Gesetz- und Verordnungsblatt I
98 Swedish Data Act, 1973 (revised in 1998)
99 French Data Protection Act of 6 January 1978 (revised in 2004)
100 Directive 95/46 of the European Parliament and the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, available at http://ec.europa.eu/justice/policies/privacy/docs/95-46-ce/dir1995-46_part1_en.pdf
101 Francesca Bignami, 'Privacy and Law Enforcement in the European Union: The Data Retention Directive', (2007), Chicago Journal of International Law, Vol. 8, No. 1, p. 234


The EU policies behind the 1995 Data Protection Directive are shaped by the established view of privacy as a human right. A general argument of the Directive is that reliance on the recognition of a property right in personal information would have the undesirable consequence of placing responsibility on each individual to protect his or her own interest. Without an external authority imposing and enforcing regulations on business, the individual user's interest in protection and the businesses' interest in data collection are in direct conflict, with the business organizations holding the superior position in this unequal bargaining process.

As established in the Privacy and Human Rights 2002 report, "the basic principles established by the Directive are: the right to know where the data originated; the right to have inaccurate data rectified; a right of recourse in the event of unlawful processing; and the right to withhold permission to use data in some circumstances. For example, individuals have the right to opt-out free of charge from being sent direct marketing material"102.

The Data Protection Directive applies to the processing of personal data, which, as stated in Article 2 of the Directive, means "any information relating to an identified or identifiable individual". The term "any information" covers both objective and subjective information, in any form, including what is called sensitive data (Art. 8 of the Directive). Moreover, the information can be of any type, such as graphical or photographic data103.

Furthermore, the information can be connected to the person either directly or indirectly. The Article 29 Working Party states that, in order to determine whether data relate to an individual, three specific criteria are examined: content, purpose and result. Fulfilment of even one of these criteria leads to a positive answer: either the content of the data concerns the individual; or the information is not directly about a person but is used with the objective of taking actions concerning this person; or the information, although not related to the person, is capable of having an impact on this individual. What can serve as an identifier has to be examined ad hoc, case by case104.

Moreover, the individual needs to be identified or identifiable. This second criterion is fulfilled when, even if the person is not yet identified, the possibility of his or her

102 EPIC, Privacy and Human Rights 2002: An International Survey of Privacy Laws and Developments, (Washington, D.C.: Electronic Privacy Information Center and Privacy International, August 2002)
103 Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data
104 Ibid


identification exists.105 Furthermore, the Art. 29 WP takes a pragmatic approach by stressing that this criterion should take all the factors at stake into account: the cost of conducting the identification, the intended purpose, the way the processing is structured, the expected advantages, the interests at stake for the individual, the risks of organizational dysfunctions (e.g. breaches of the duty of confidentiality), and technical failures. The test should also take into account the state of technological development at the relevant time. Special importance must be given to the purpose pursued by the data controller106.

In order to modernize the 1995 Data Protection Directive, the European Union has prepared a regulation adapting the legislation to the vast technological changes, among them persuasive technology. The evolution of the European legislation seems necessary, as will be further analyzed below, and the reform has already been adopted. The regulation is planned to enter into force in 2018107.

The first aspect included in the current draft which differs from the Directive concerns responsibility. Under the Directive, any data "by which an individual can be identified"108 was the sole responsibility of the data controller, the owner of the data. Under the new regulation, however, any company or individual that processes such data will also be held responsible for its protection, including third parties such as cloud providers. Therefore, third parties will need to be extra vigilant when it comes to securing the data of others, and data owners will want to thoroughly vet their partners109.

Moreover, the legal framework on transferring the data of EU citizens outside Europe will become stricter. Even where sharing is allowed (however legitimate the data controller considers it to be), the Directive currently prohibits personal data from being transferred outside the European Economic Area (EEA) unless the controller ensures an adequate level of privacy protection110.

Under the Directive, users already have the right to see the data collected about them. Under the new regulation, users can also demand that their data be erased. Furthermore, controllers must inform and remind users of their rights, and must document the fact that they have done so111. Finally, the regulation will allow users to claim damages in the instance of

105 Ibid
106 Ibid
107 http://www.consilium.europa.eu/en/policies/data-protection-reform/data-protection-regulation/
108 Article 2(a) Data Protection Directive
109 Supra note 104
110 Article 5 Data Protection Directive
111 Supra note 104


data loss as a result of unlawful processing, including through collective redress, the equivalent of a US-style class action lawsuit.

5.2 Data protection and persuasive technology

Since the 1970s, one development has made enforcement of the law crucial and has given a new dimension to the interaction between law enforcement and data protection laws: technology. Gradually over the last decades, with the outbreak of persuasive technology, we have come to live our lives in digital space, using computing systems that are highly persuasive by design. We register with our personal data in order to download an app; we install video games, providing the software with our real personal information so that it can identify us and record our scores. Consequently, our personal information and details are easily recorded, stored and searched.

When accessing a site, app or video game, as mentioned above, the first issue the user has to cope with is the registration of his personal data, which he is typically encouraged to complete by connecting the site, app or video game to his personal e-mail account or his social media account. This is a decision the data subject must take with no real option of skipping it; otherwise he may not be able to access many of the activities provided by the aforementioned computing systems.

A representative example of this policy in the digital world can be found in the field of m-health, which constitutes one of the inherent components of e-health persuasive technology. Users are required to describe their medical records in detail in order to enjoy the promised benefits of the apps or sites in the field of e-health/m-health, which "guarantee" to lead users to a much healthier way of living and sometimes to recovery from their health problems. Consequently, the designers of these apps or sites may expose users to unwanted legal threats if the legislation regulating the protection of sensitive personal data112 is not respected to its full extent. According to an investigation by the Financial Times, 9 of the top 20 health-care apps track information from users' mobile phones and send it to the corresponding companies113.

Another example of a possible threat to personal data protection, identified in the area of persuasive technology and appearing most often in the use of social networks, is

112 Article 8 Data Protection Directive
113 Financial Times, 'Health apps run into privacy snags', (2013); European Commission, 'Green Paper on mobile Health ("mHealth")', (2014), COM(2014) 219 final, p. 8


targeted advertising114. Behavioural advertising is defined as a market practice of targeting advertisements to users based upon observed or known information about the individual, such as age, profile and online activity; it is consequently strongly related to the collection and long-term storage of personal data and involves the tracking of consumers. The user's lack of awareness of this processing of personal data, which may have been provided, for instance, on a retail site, leads in many cases to the uncontrolled processing of these data. The personal characteristics stored may include the IP address, the web pages the individual has visited, how much time was spent on each web page, which online purchases the person has made, and so on. In short, it traces a pattern of online activities.
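By way of illustration, the following sketch shows how the tracked characteristics listed above could be aggregated into an interest profile for ad targeting. The page-to-category map and the event log are invented; no actual ad platform's logic is implied.

```python
from collections import Counter

# Minimal sketch: tracked browsing events are condensed into an interest
# profile by weighting each page's category with the time spent on it.

CATEGORY_OF_PAGE = {
    "/running-shoes": "sports",
    "/marathon-training": "sports",
    "/mortgage-rates": "finance",
}

def build_profile(events: list[dict]) -> Counter:
    """Sum dwell time per interest category across the observed visits."""
    profile: Counter = Counter()
    for event in events:
        category = CATEGORY_OF_PAGE.get(event["page"])
        if category:
            profile[category] += event["seconds"]
    return profile

tracked = [  # one user's observed activity: IP, page, dwell time
    {"ip": "203.0.113.7", "page": "/running-shoes", "seconds": 120},
    {"ip": "203.0.113.7", "page": "/marathon-training", "seconds": 300},
    {"ip": "203.0.113.7", "page": "/mortgage-rates", "seconds": 15},
]

profile = build_profile(tracked)
print(profile.most_common(1))  # [('sports', 420)] -> serve sports ads
```

The point of the sketch is that the raw observations (IP address, pages, dwell times) are themselves personal data, and the derived profile is a further processing of them, both of which fall under the rules discussed next.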

The above are just a couple of examples of what can now be done with our web data, and an image of what might be done with it in the future. The most important point raised by these examples is that personal or sensitive personal data are processed, in the sense of collection, storage, retrieval, use or disclosure115. For the sake of data protection, personal data must be processed only with the consent of the person116. The current legal framework governing the collection and storage of personal data for behavioural advertising purposes is not adequate and poses many threats to personal data protection, especially as regards free, clear and informed consent.117

Through the prism of the particular form of persuasive computing systems that require access to users' personal information, a first potential problem can be detected. Some social network companies assume that IP addresses cannot be regarded as personal information. The Working Party, however, has held that the IP address constitutes personal data of the user118. Moreover, existing technology provides the means of directly relating a user's IP address to his or her personal information, such as name, address and telephone number. Consequently, a user's personal information may be passed to third parties not only without the user's consent, but without his even being aware of it.

This leads to the second problem, already touched upon, which relates to the user's consent. Generally, users are simply uninformed about the processing of their personal data119. The information given to them before they disclose personal data for the use of a computing system does not, in most cases, fulfil the criteria posed by data

114 Article 29 Data Protection Working Party, Opinion 2/2010 on online behavioural advertising, p. 4
115 Article 2(b) Data Protection Directive
116 Article 7(a) Data Protection Directive
117 European Commission, 'A comprehensive approach on personal data protection in the European Union', p. 6
118 Working Party 37, 'Privacy on the Internet - An integrated EU Approach to On-line Data Protection', adopted on 21.11.2000
119 Article 29 Data Protection Working Party, Opinion 15/2011 on the definition of consent, 2011, pp. 18-19


protection legislation120. The multiplication of actors participating in the persuasive technology field, and the technological complexity of that field, hamper the individual's ability to know and understand whether his personal data are being collected, by whom, and why121.

Referring to consent does not mean merely any simple consent. Consent must be freely given and unambiguous, which means that the purpose of the processing of personal data should be properly communicated to the user, so that he can evaluate the necessity of the processing122. Further, data must be processed fairly and lawfully, for specific and concrete purposes, with limited use, and stored only for a definite period of time123. At the same time, the security of the data must be guaranteed, and it must be ensured that the data remain in countries with an appropriate framework of protection.

Another key problem is that each legislative approach treats the phenomenon reactively rather than preventively: the system starts by collecting and storing users' personal data unless the user expressly requests their erasure. Besides, even where consent is demanded, under the current circumstances one can hardly speak of clear and informed consent, given the tactics used to extract it today124.

In any case, the main issue posed by persuasive technology and threatening the right to personal data protection remains the system's position of superiority over the user, which leads to the subconscious guidance of people who are not familiar with these practices and, by extension, to the violation of their rights. Persuasive technology tailors the needs of the users to its own purposes, luring them into using it without being aware of the potential latent threats.

A common feature of persuasive computing systems is the precondition that the user first register with his personal data. This requirement is encountered in the installation of a mobile app, in signing in to internet sites, and in joining a virtual community to play a video game. This disclosure of personal information is not accompanied, in most cases, by the safeguards that should apply for the adequate information and protection of users with respect to the processing of their disclosed personal data. As people making

120 Supra note 68, p. 11
121 Supra note 56, p. 6
122 Supra note 77, p. 4
123 Articles 6, 7 and 8 Data Protection Directive
124 Neil J. Rubenking, 'Who's Watching You Surf?', (PC Mag., June 26, 2000)


use of persuasive computing means are gradually more exposed to the threat of illegitimate processing of their personal data, they should be provided, under stricter legal provisions, with the right to prior information about, and consent to, this processing by the data controllers.

In conclusion, the right to personal data protection seems at first sight to be completely protected by European legislation, given the vast number of legislative texts. However, the rapid evolution of technology and, especially, the rise of persuasive technology pose a new threat to this aspect of the right to privacy. Since users are in a way forced to register more and more of their personal data for persuasive computing systems to operate properly, a higher level of data protection is urgently required.

Despite the legislative efforts, it seems there are still gaps in the legal framework, in particular on the question of consent. The peculiarity of the methods used by persuasive technology calls for a different approach.

The European legislation on this matter seems quite complete, particularly given the impending replacement of the relevant Directive. However, given that persuasive technology tools are an important factor for the market, and that the EU tries to reconcile human rights with the development of the market, fully protective European legislation appears difficult to achieve. In any case, it deserves to be emphasized that the countries of Europe have adopted a considerably more rigorous approach to this subject than, for example, the USA, where there is greater flexibility125.

The European Commission has suggested a potential solution in order to avoid abuses of data protection legislation by persuasive technological tools. More specifically, when reviewing the Data Protection Directive in 2011126, the Commission proposed a technologically neutral mixture of rules inside the EU, capable of withstanding future technological evolution. In particular, this effort focused on tightening the responsibility of data controllers, including the obligation to notify data breaches, and on adopting the principle of "privacy by design". This concept provides for the application of privacy requirements from the very start of a system's development and throughout its life cycle.
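A minimal sketch of what "privacy by design" can mean at the collection step is given below, assuming a hypothetical collector: data minimisation, pseudonymisation and a retention limit are decided when the system is designed, not bolted on afterwards. All field names and values are invented for illustration.

```python
import hashlib
import os

# Minimal sketch: the collector keeps only the fields its stated purpose
# needs, replaces the direct identifier with a salted pseudonym, and fixes
# a retention period up front.

SALT = os.urandom(16)              # kept separately from the stored records
NEEDED_FIELDS = {"age_range", "activity_level"}  # purpose-bound minimum

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a salted, hashed pseudonym."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:12]

def collect(raw_record: dict) -> dict:
    """Store the minimum fields, under a pseudonym, with a retention limit."""
    record = {k: v for k, v in raw_record.items() if k in NEEDED_FIELDS}
    record["subject"] = pseudonymize(raw_record["user_id"])
    record["retain_days"] = 30     # storage limit decided at design time
    return record

raw = {"user_id": "anna@example.org", "age_range": "25-34",
       "activity_level": "high", "home_address": "not needed, not kept"}
print(collect(raw))  # address dropped, identity pseudonymized
```

The design choice embodied here is that the protective rules live in the code path itself: a field outside the stated purpose can never be stored, regardless of later decisions by the controller.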

125 InfoSec Institute, 'Differences between the privacy laws in the EU and the US', (January 2013)
126 European Commission, 'Commission Staff Working Paper, Impact Assessment', (2012), SEC(2012) 72 final


CHAPTER 6

Right to autonomy and persuasive technology

6.1 The context of autonomy

As analysed in the previous chapters, the new technology and communication systems, used as a force of persuasion, can violate the rights to privacy and data protection, fundamental rights established in the European environment. As an inevitable result, persuasive technological tools also affect the human right to privacy in its dimension of autonomy, as protected under Art. 8 ECHR and briefly presented in the previous chapters.

Warren and Brandeis127 argued in favor of an inextricable link between the notions of autonomy and privacy. Etymologically, the word "autonomy" derives from "autonomos", a Greek compound consisting of autos, that is "self", and nomos, which means "law". Consequently, autonomy is associated with the concept of a person defining his self-identity and giving himself his own laws, in the sense of owning his own self and deciding freely about himself. Evidently, this definition falls within the broader domain of the notion of privacy, as already presented in detail, in the sense of the right "to be let alone".128

Early on, different points of view were voiced by academics with regard to the inclusion of the right to autonomy under the wide umbrella of the right to privacy. Indicatively, Hyman Gross explicitly rejected the conception of privacy in terms of autonomy. He focused particularly on the hazard raised by conflating the notion of autonomy with privacy, claiming that this potential confusion emerges from the use of the phrase "to be let alone" as a synonym for the right to privacy129.

On the other hand, Professor David Richards took a strongly positive attitude towards describing autonomy in terms of privacy. Accordingly: "It is natural to call the autonomy a right of privacy in the sense that moral principles no longer define these matters as issues of proper public concern but as matters of highly personal self-definition"130. This latter approach, expressed by Professor Richards,

127 Supra note 53
128 Supra note 61
129 Carl Wellman, The Right to Privacy and Personal Autonomy, (Volume 29, Springer Netherlands), p. 177; Gross, 1971, p. 100
130 Ibid; Richards, 1979, p. 1000


appears to prevail in the literature and case law, not only under American constitutional law131, but also in the legislation in force on the eastern side of the Atlantic.

Deborah Hurley goes so far as to state that "in Europe privacy and personal data protection is regarded as an inalienable right because it is so important to [their] dignity and sense of autonomy"132. Friedman has also argued for the inclusion of human values in the design of computer systems and software agents, especially respecting and enhancing "user autonomy"133.

There is no explicit provision for the right to autonomy in the current legislation. Instead, the Universal Declaration of Human Rights assumes that people have the ability to give themselves their own rules, and undertakes to lay the groundwork for an environment in which people can conduct themselves autonomously.134 Personal autonomy implies the freedom, ability and authority of a person to choose his own course of action in accordance with his goals, values and moral intentions, bearing full responsibility for his decisions and choices135.

This personal responsibility borne by the autonomous person has further normative implications regarding the role of others in this exercise of free will. More particularly, others have the obligation to respect one's right to decide on and follow a certain course of action. In these terms, the close relation between personal autonomy and personal dignity is underlined, each constituting an inherent condition of the other and, in general, of an autonomous way of living. The right to autonomy should be understood as "a fundamental boundary not to be violated"136.

In 2002, the European Court of Human Rights dealt for the first time with a case regarding the right to autonomy and its interpretation under Article 8 ECHR137. The Court referred to the concept of personal autonomy, stating that "The Court is not prepared to exclude that this constitutes an interference with her [the applicant's] right to respect for private life as guaranteed under Article 8 § 1 of the Convention"138. Consequently, Article 8 ECHR protects the right to self-determination and thereby the right to autonomy through the implementation of the Convention in the European legal order. Within the commentary given under the

131 www.justia.com
132 D Hurley, 'Privacy in Play', (1998), Think Leadership Magazine, p. 17
133 B. Friedman, 'Value-sensitive design', Interactions, (ISSN 1072-5520, 1996), pp. 3(6): 16-23
134 Bart Kamphorst, 'The primacy of human autonomy: understanding agent rights through the human rights framework', p. 2; Supra note 67
135 Ibid
136 Ibid
137 Pretty v. the United Kingdom, ECHR, 2346/02, 2002
138 Ivana Roagna, Protecting the right to respect for private and family life under the European Convention on Human Rights, (Council of Europe human rights handbooks, 2012), p. 15


section on the scope of paragraph 1 of Article 8 ECHR, the inclusion of the right to autonomy under the protective frame of Article 8 ECHR is clearly provided for, safeguarding the right to privacy with the following text: "(..) embraces personal autonomy, the right to make choices regarding one's own life without interference (…) to develop one's own personality".139

As a result, human autonomy, besides its interaction with the right to privacy and the right to personal data protection, also encompasses the right to freedom, and especially the freedom of choice. At this point, it seems that persuasive technological tools, designed inevitably and often implicitly to help shape human actions and perception, can stand opposed to the fundamental human right to autonomy. As mentioned before, pursuant to Article 8 ECHR, autonomy is incorporated as an inherent element of the right to privacy. At the same time, many national laws, among them the Greek Constitution, guarantee autonomy as an individual right140.

6.2 The right to autonomy and persuasive technology

It is obvious that digital technology, particularly the internet, introduces potential complications into our discussion and understanding of free will; even as the internet appears to open up options and capacities for individuals to exercise increased autonomy, it also has the potential to change the very ways in which human beings think, thereby impeding human capacities for meaningful self-reflection, a necessary if not sufficient criterion for rational autonomy.141

As the use of persuasive computing systems rises, the free choices of users seem to diminish. Even if the technology can be used to ameliorate our lives and to offer us better, more targeted and faster choices, this appears to set aside the free will of the human being. The more numerous and the faster the options and choices, the less we think, and our decision no longer seems a clear and conscious one. The directed presentation of choices, tailored to the user's needs and desires through management practices unfamiliar to the ordinary user, endangers human autonomy.

The autonomy of the human being is closely related to, and ultimately threatened by, the evolution of persuasive technology. As Verbeek stated, "autonomy was thought to be attacked when human actions are explicitly and consciously steered with the help of

139 http://echr-online.info/article-8-echr/
140 A. Ciacchi, 'Party autonomy as a fundamental right in the European Union', European Review of Contract Law (6 ERCL 2010), p. 303
141 Supra note 1


technology. This reduction of autonomy was even perceived as a threat to human dignity. If the human actions are not a result from deliberating decision but from steering technologies, people were thought to be deprived from what makes them human”142.

Besides, the very idea of persuasive technology rests on the vulnerability of human autonomy. The vulnerability and susceptibility of the human being towards this environment are used by the market to its advantage by means of technology. This practice could lead to a restriction of free will and freedom of choice.143 By accepting the method of persuasive technology, the current legislation in fact places limits on the right to autonomy for the purpose of developing the market.

Research carried out on the interaction between the use of persuasive technology and autonomous decision-making has yielded some interesting conclusions. The users participating in this particular experimental research expressed feelings of reactance when they felt exposed to autonomy-threatening messages from persuasive technology tools144. The more psychological reactance a user experienced, the more negative his evaluation of the persuasive technology. In short, the perceived threat to the right to autonomy caused by the use of persuasive technology tools leads the users who perceive this threat to detest these tools.

Freedom of choice allows individuals and collectives to resolve cognitive dissonance and helps create personal autonomy. On the other hand, persuasion that operates without the user being aware of the programmers’ intent could threaten this freedom. As a result, in order to balance respect for human autonomy against the development of persuasive technology, an additional condition should be fulfilled by the technological tool. More specifically, the purpose of the persuasion, in other words the macrosuasion intent, has to be disclosed at the beginning of one’s engagement with a program. It would then be possible for the user to determine the program’s relevance and exercise his right to accept or reject its offering145.
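
A minimal, hypothetical sketch of such a disclosure gate is the following; the names PersuasionDisclosure and start_engagement are illustrative assumptions, not an existing design:

from dataclasses import dataclass

@dataclass
class PersuasionDisclosure:
    # The designer's persuasive intent (the "macrosuasion" purpose) and
    # the techniques the program will employ, stated before any use.
    purpose: str
    methods: list

def start_engagement(disclosure, decide):
    # Show the disclosure first; the persuasive program runs only if the
    # user, now aware of the intent, accepts the offering.
    print(f"This program is designed to persuade you to: {disclosure.purpose}")
    for method in disclosure.methods:
        print(f" - uses: {method}")
    return decide(disclosure)

# Example: a user who rejects the offering never enters the program.
disclosure = PersuasionDisclosure("exercise daily", ["reminders", "praise"])
if start_engagement(disclosure, decide=lambda d: False):
    print("Engagement begins.")
else:
    print("Offering rejected; no persuasion takes place.")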

The core of this idea is the respect that technological equipment needs to show for people’s autonomy. The intent of persuasive technology should converge with the intent of the users. This may be achieved through the design of persuasive technological tools. It seems urgent for persuasive technology to adjust its persuasive messages to the intention of the average user, so that the effects of persuasive technology are not judged so severely in terms of autonomy146.

142 Supra note 106, p. 2; P-P. Verbeek, Designing Morality, in Ethics, Technology and Engineering: An Introduction (Wiley-Blackwell, 2011), chapter 7
143 Julie E. Cohen, “Examined Lives: Informational Privacy and the Subject as Object”, (2000) 52 STAN. L. REV. 1373, pp. 1424-27
144 Harri Oinas-Kukkonen, “The dominant robot: Threatening robots cause psychological reactance”, (2010), 5th International Conference, PERSUASIVE 2010, p. 183
145 A. Kobsa, “Privacy Enhanced Personalization”, Communications of the ACM, 50(8), 24-33, DOI 10.1145/1278201.1278202

Friedman was a proponent of this theory, claiming that human values, and especially the right to autonomy, should be integrated into the design of computing systems147. According to his reasoning, users are in various cases mistaken about the moral role of computing systems, confusing it with the morality of human beings; but in order to speak of morality, intentionality is an essential precondition. Computing systems lack intentionality per se. Intentionality can be determined only at the design stage of computing systems, by their designers; thus, this distortion can be confronted only by the designers of technological tools. Two ways towards this goal proposed by Friedman are the “non-anthropomorphic interface design” of computing systems, so that the latter are distinguished from human beings, and “participatory design”, whereby the user may participate in the process of designing the persuasive technological tool, himself identifying the problems that computing systems should resolve148.

Once the prerequisite of incorporating human values into the design of persuasive technological equipment is fulfilled, the second step to be taken is the one already mentioned briefly above: the presentation of the intent of persuasion to the user at the very first level of his interaction with the computing system. The design of persuasive technological tools has a highly mass character; this implies that persuasive technology systems apply broadly to all of their users, without distinguishing individuals’ personal demands and goals. This is the so-called lack of personalization and tailoring techniques in persuasive technology149.

The user should be aware of this policy, understanding that the type of needs that can be satisfied by the use of persuasive computing systems has already been predetermined by the designer of this kind of technology. This can be achieved via the disclosure of the scope of persuasion of the technological tool to the person, so that the latter is able to follow his own decision-making process according to his personal necessities150.

146 Supra note 116
147 Supra note 106, p. 2; Supra note 105
148 Ibid
149 Supra note 106, p. 5; this policy should not be confused with the personalization of the services offered by persuasive computing systems after the tracking of users’ personal data.
150 Ibid


In conclusion, and in more general terms, it can be inferred that the right to autonomy is protected under a rather vague veil, recognized merely under the broader right to privacy. Article 8 ECHR is considered to incorporate an open-ended right, under which falls a range of interests that are not regulated in separate parts of the Convention. This is partly due to the lack of a comprehensive definition of the facets of the right to privacy stipulated in article 8, by both the Commission and the European Court of Human Rights151. The result has been a failure to adapt the multiple facets of the interests covered by the right to privacy to the changes of the times152.

Therefore, the right to autonomy does not constitute a distinct right under the Convention, such that the parameters of its protection could be regulated in more detailed and explicit terms following the demands of modern times, and in particular addressing more adequately the need to safeguard the right against its affection by persuasive technology. Accordingly, the first step to be taken from a legal point of view is the clear and explicit provision of the right to autonomy in European and national legislation. Once the framework of the right to autonomy is structured into legislation and case law, the designers of persuasive technological equipment would have to comply with this legitimate framework, and users would be better informed and better safeguarded with regard to the protection of their right to autonomy.

151 UK Human Rights Blog, “Article 8 | Right to private and family life”
152 Ibid


CONCLUSION

The notion of persuasive technology in computing systems has been studied within a specific field named captology. It focuses on the interaction of people with, and not through, technology. This indicates the dependent relation that has developed between the two subjects, humans and technological tools. The power of technology’s persuasion over human attitudes is highly impressive, promoting social dynamics and applying social norms of behavior.

With regard to fundamental human rights, persuasive technological tools pose threats to the right to privacy. This right has as inherent facets the right to protection of personal data and the right to human autonomy. These two rights are incorporated into the core of the right to privacy under article 8 ECHR, although the right to data protection is additionally regulated under the provisions of the Data Protection Directive.

The current legislation in Europe, with emphasis on article 8 of the European Convention on Human Rights, the e-Privacy Directive and Directive 95/46/EC, appears not broad enough to address the threats of this new type of technology.

Therefore, the European Court of Human Rights, which assumes the role of interpreter of the Convention’s rights and, at the same time, through its jurisprudence leads an evolution of these rights in order to adapt them to new demands, helps to fill the gaps of the current European legislation and shields the rights to privacy and data protection against the threats of persuasive technology.

At the same time, the European Commission also contributes to ensuring this protection, adopting guidelines and even reviewing the current directives, in order to impose limits and conditions on the use of persuasive technological tools. The regulation expected in 2017, concerning the adaptation of Directive 95/46/EC to the new aspects of technology and, thus, to the threats posed by persuasive technology, will eventually contribute to the creation of a strict framework of protection for personal data in the European environment.

Although the legislative efforts of the European Community are important, addressing risks arising from technology is always hindered by the singularity of technology: the fast pace of evolution in this field. New technologies emerge at a rhythm that cannot be followed by the legislator.

As regards persuasive technology in particular, the targeted choices offered to a user, based on a full collection of information about his private life and his personal data, clearly pose problems of violation of human rights legislation in the current context. The key problem, where the gaps in the legislation are most apparent, is the absence of clear and informed consent by the user. At the same time, the shifting of the burden from the system and the companies onto the user, who is expected to have the ability and awareness to react in order to protect his rights, seems to be a basic failure of the current legislation. It is clear that the methods adopted in the field of persuasive technology restrain the autonomy of the human being and his ability of free choice.

The question raised, then, concerns the interaction between human rights and persuasive technology, and whether human rights are protected under the current legal framework. Firstly, it is obvious that, even if one of the main goals of persuasive technology is the improvement of internet users’ lives, the methods adopted in order to fulfill this goal threaten the right to privacy in multiple aspects, such as the right to data protection and human autonomy. Thus, the interaction between persuasive technology and human rights endangers each of the aforementioned basic rights.

As seen above, the European community makes efforts to protect the broad right to privacy from this new threat. Legislative production on the matter, with emphasis on data protection, is massive in the EU and stricter than in other regions. However, the rhythm of technological evolution introduces new threats almost every day, posing a huge difficulty for the legislator, who has no time to react. Indeed, in the area of technology, even a broad wording of the law does not ensure that the provision will cover the new parameters.

Current legislative production in the EU seems to focus on the major problem of the user’s informed and clear consent. However, it is doubtful whether any legal provision can clearly guarantee this with regard to the marketing methods used by persuasive technology. In the opinion of the writer, a better choice would be to address these methods directly, setting criteria and limits on their use, an approach that in fact seems difficult, as a parallel goal of the EU is the development of the market.

In an effort to enrich the current legal framework, the European Commission, during the process of revising the Data Protection Directive in 2011, suggested focusing on stricter responsibility for data controllers, including the obligation to notify data breaches, and on adopting the principle of “privacy by design”. This concept provides for the application of privacy requirements from the very start of a system’s development and throughout its life cycle, as the sketch below illustrates.
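
A minimal, hypothetical sketch of the privacy-by-design principle is the following; PrivacySettings and store_event are illustrative names, and the defaults shown are assumptions, not requirements drawn from the Directive:

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Protective defaults built in from the start of development:
    collect_analytics: bool = False       # nothing is tracked unless the user opts in
    retention_days: int = 30              # data kept no longer than the purpose requires
    allowed_fields: tuple = ("user_id",)  # data minimisation: only what is needed

def store_event(event, settings):
    # Persist only a minimised record, and only if the user has opted in.
    if not settings.collect_analytics:
        return None  # the default path stores nothing at all
    return {k: v for k, v in event.items() if k in settings.allowed_fields}

# Example: with the default settings the event is simply discarded.
print(store_event({"user_id": 42, "location": "Tilburg"}, PrivacySettings()))  # None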

Another solution proposed in the theory was the addition of a further condition to the legal practice of persuasive technology: the disclosure of the purpose of the persuasion. In other words, real consent can be guaranteed only if the user is informed a priori not merely about the methods but also about the intent of the designer of persuasive computing systems. This condition has to be revealed at the beginning of one’s engagement with a computing system. It would then be possible for the user to determine the system’s relevance and exercise his right to accept or reject its offering.

Alongside a guarantee of informed and clear consent, it also seems necessary that users obtain full and real control over their data, knowing who holds them, for what reason and for how long. Moreover, it is necessary that they have a clear option to refuse this use. In the opinion of the writer, it would be more appropriate to place greater emphasis on the information given at the outset and on clear consent, which comes first and is the act that begins the cycle of collection and storage of personal information, than on the later ability of repudiation. A sketch of what such user-facing control could look like follows.
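
A minimal, hypothetical sketch of such user-facing control is the following; ProcessingRecord and its fields are illustrative assumptions about what a user-inspectable record could contain:

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ProcessingRecord:
    # One entry the user can inspect: who holds the data, why, for how long.
    controller: str
    purpose: str
    obtained: date
    retention: timedelta
    revoked: bool = False

    def expires(self):
        return self.obtained + self.retention

    def revoke(self):
        # The clear option to refuse further use of the data.
        self.revoked = True

# Example: the user reviews a record and then revokes it.
record = ProcessingRecord("ExampleShop BV", "targeted offers",
                          date(2016, 1, 10), timedelta(days=365))
print(f"{record.controller} holds data for '{record.purpose}' until {record.expires()}")
record.revoke()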

It has become clear, following the analysis in the present paper, that persuasive technology is justifiably characterized as a dubious phenomenon. It stands midway between persuading and manipulating, and the human rights affected by its evolution urgently require protection in stricter and more up-to-date terms.


LIST OF SOURCES

List of books

Aristotle, Rhetoric, Book One: Introduction, Translation, (Thessaloniki: Zitros, 2002)

Benn S.I., Privacy, Freedom, and Respect for Persons, (in R J Pennock & J W Chapman (Eds.), Privacy (pp. 1-26). New York: Atherton Press, 1971)

Ciacchi A., Party autonomy as a fundamental right in the European Union, in European Review of Contract Law (6 ERCL 2010)

Electronic Privacy Information Center and Privacy International, Privacy and Human Rights 2006: An International Survey of Privacy Laws and Developments (2006)

EPIC, Privacy and Human Rights 2002: An International Survey of Privacy Laws and Developments, (Washington, D.C.: Electronic Privacy Information Center and Privacy International, August 2002)

Fogg B.J., Persuasive Technology: Using Computers to Change What We Think and Do, (2011, Morgan Kaufmann Publishers)

Friedman B., Value-sensitive design, Interactions, (ISSN 1072-5520, 1996)

Laurant C., Privacy & Human Rights 2003: An International Survey of Privacy Laws and Developments, (Electronic Privacy Information Center, Washington D.C., Privacy International, London, UK, 2003)

Liebert W. and Schmidt J.C., Collingridge’s dilemma and technoscience: An attempt to provide a clarification from the perspective of the philosophy of science, (Poiesis Prax, Springer-Verlag, DOI 10.1007/s10202-010-0078-2, 2010)

Perloff R.M., The Dynamics of Persuasion: Communication and Attitudes in the 21st Century, (2nd ed., Lawrence Erlbaum Associates, Mahwah, NJ, 2003)

Roagna I., Protecting the right to respect for private and family life under the European Convention on Human Rights, (Council of Europe human rights handbooks, 2012)

Rubenking N.J., Who’s Watching You Surf?, (PC MAG., June 26, 2000)

Smith R.E., Franklin's Web Site: Privacy and Curiosity from Plymouth Rock to the Internet, (Sheridan Books, 2000)

Veghes et al., European Union Consumers’ Views on the protection of their personal data: an exploratory assessment, (Annales Universitatis Apulensis Series Oeconomica, 11(2), 2009)


Verbeek P-P., Designing Morality, in Ethics, Technology and Engineering: An Introduction, (Wiley-Blackwell, 2011), chapter 7

Warren S. and Brandeis L., The Right to Privacy, (Harvard Law Review, Vol. IV, No. 5, December 1890)

Wellman C., The Right to Privacy and Personal Autonomy, (Volume 29, Springer Netherlands)

List of articles

Anderson J., “Autonomy”, (2012), in H. LaFollette, J. Deigh and S. Stroud (eds.), International Encyclopedia of Ethics, Wiley-Blackwell

Bignami F., “Privacy and Law Enforcement in the European Union: The Data Retention Directive”, (2007), Chicago Journal of International Law, Vol. 8, No 1

Cheng R., “Persuasion Strategies for Computers as Persuasive Technologies”, Department of Computer Science, University of Saskatchewan

Clarke R., ‘Beyond the OECD Guidelines: Privacy Protection for the 21st Century’, (4 January 2000), in Roger Clarke's Web-Site

Clarke R., ‘Introduction to Dataveillance and Information Privacy, and Definitions of Terms’, (15 August 1997), in Roger Clarke's Web-Site

Cohen J.E., “Examined Lives: Informational Privacy and the Subject as Object”, (2000), 52 STAN. L. REV. 1373

European Commission, “A comprehensive approach on personal data protection in the European Union”

European Commission, “Commission Staff Working Document on the existing EU legal framework applicable to lifestyle and wellbeing apps”, (2014), SWD 135 final

European Commission, “Commission Staff Working Paper, Impact Assessment”, (2012), SEC(2012) 72 final

European Commission, “GREEN PAPER on mobile Health ("mHealth")”, (2014) COM(2014) 219 final

Financial Times, “Health apps run into privacy snags”, (2013)


Fogg B.J., “Persuasive Computers: Perspectives and Research Directions”, (1998), CHI 98, Stanford University

Foutouchos M., ‘The European Workplace: The Right to Privacy and Data Protection’, (2005), Accounting Business & the Public Interest, vol. 4, No. 1

Friedewald M., ‘A New Concept for Privacy in the Light of Emerging Sciences and Technologies’, (April 2010), From the Selected Works of Michael Friedewald

Halko S. and Kientz J., “Personality and Persuasive Technology: An Exploratory Study on Health-Promoting Mobile Applications”, (2010), 5th International Conference, PERSUASIVE

Hurley D., “Privacy in Play”, (1998), Think Leadership Magazine

InfoSec Institute, “Differences between the privacy laws in the EU and the US”, (January 2013)

International Council on Human Rights Policy, “Navigating the Dataverse: Privacy, Technology, Human Rights”, (2011)

Kamphorst B., “The primacy of human autonomy: understanding agent rights through the human rights framework”

Kaptein M., “Adaptive Persuasive Messages in an E-Commerce Setting: The Use of Persuasion Profiles”, Eindhoven University of Technology

King P. and Tester J., “The Landscape of Persuasive Technologies”, Communications of the ACM, (1999), Volume 42, Issue 5

Kobsa A., “Privacy Enhanced Personalization”, Communications of the ACM, 50(8), 24-33, DOI 10.1145/1278201.1278202

Korff D., “The standard approach under articles 8-11 ECHR and article 2 ECHR” (2008), London Metropolitan University

Krishnamurthy B. and Wills C.E., “On the Leakage of Personally Identifiable Information Via Online Social Networks”, (2009)

Miller et al., “Attribution versus persuasion as a means for modifying behavior” (1975), Journal of Personality and Social Psychology, Vol 31(3)

Oinas-Kukkonen H., “Behavior Change Support Systems: A Research Model and Agenda”, (2010), 5th International Conference, PERSUASIVE 2010

Oinas-Kukkonen H., “The dominant robot: Threatening robots cause psychological reactance”, (2010), 5th International Conference, PERSUASIVE 2010


Preece J.J., “I Persuade, They Persuade, It Persuades!” (2010), 5th International Conference, PERSUASIVE 2010

Preneel B. and Ikonomou D., “Privacy Technologies and Policy”, (2012), First Annual Privacy Forum

Privacy International, “PRIVACY AND HUMAN RIGHTS An International Survey of Privacy Laws and Practice”, GLOBAL INTERNET LIBERTY CAMPAIGN

Reynolds T., Gengler C.E. and Howard D.J., “A means-end analysis of brand persuasion through advertising”, (1995), International Journal of Research in Marketing, Vol. 12, Issue 3

Shaffer J.W., “Captology: The Study of Computers As Persuasive Technology”, (2004)

UK Human rights Blog, “Article 8 | Right to private and family life”

Verbeek, “Persuasive Technology and Moral Responsibility. Toward an ethical framework for persuasive technologies”, (2006), Paper for Persuasive06, Eindhoven University of Technology

Westin A., “Privacy and Freedom”, (1967), London: Bodley Head

Table of legislation

Directive 2002/58/EC

Directive 95/46/EC

Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data

Article 29 Data Protection Working Party, Opinion 2/2010 on online behavioural advertising

Article 29 Data Protection Working Party, Opinion 15/2011 on the definition of consent, (2011)

CM (2011) 175, Internet Governance Strategy, Council of Europe 2012 – 2015


Council of Europe, Convention for the Protection of Human Rights and Fundamental Freedoms, (ETS No. 005), Strasbourg

Datenschutzgesetz [Data Protection Act], Oct. 7, 1970, Hessisches Gesetz- und Verordnungsblatt I

European Convention 108 for the Protection of Individuals with regard to Automatic Processing of Personal Data

French Data Protection Act of 6 January 1978 (revised in 2004)

International Covenant on Civil and Political Rights (ICCPR)

Resolution A/RES/68/167

Swedish Data Act, 1973 (revised in 1998)

Universal Declaration of Human Rights (UDHR), 1948

Working Party 37, “Privacy on the Internet - An integrated EU Approach to On-line Data Protection”, adopted on 21.11.2000

Table of case law

I v. Finland [2008] Application no. 20511/03, ECHR

Marckx v. Belgium [1979] ECHR

Niemietz v. Germany [1992] ECHR

Olmstead v. United States [1928] U.S. Supreme Court (Brandeis J., dissenting)

Pretty v. the United Kingdom [2002] ECHR, 2346/02

S. and Marper v. United Kingdom [2008] nos. 30562/04 and 30566/04

X and Y v. the Netherlands [1985] 8 EHRR 235

Table of sites

http://www.un.org/en/universal-declaration-human-rights/
https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf
https://ccdcoe.org/sites/default/files/documents/UN-131218-RightToPrivacy.pdf


https://www.privacycommission.be/sites/privacycommission/files/documents/convention_108_en.pdf
https://wcd.coe.int/ViewDoc.jsp?id=1919461
http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32002L0058&from=EN
http://ec.europa.eu/justice/policies/privacy/docs/95-46-ce/dir1995-46_part1_en.pdf
http://www.consilium.europa.eu/en/policies/data-protection-reform/data-protection-regulation/
www.justia.com
http://echr-online.info/article-8-echr/
