Tilburg Institute for Law, Technology and Society

LL.M. Law and Technology

Master's Thesis

CHATBOTS AND CONSUMER PROTECTION: AN INVESTIGATION INTO THE REGULATORY FRAMEWORK AND CHALLENGES

Author: J.H.J.A. Faber SNR 2020871

Supervisor: Ms. S. de Conca LL.M.

Second reader: Professor mr. ir. M.H.M. Schellekens

Tilburg, January 2019

Acknowledgement

I would like to express my gratitude to my thesis supervisor, Ms. S. de Conca LL.M., for her continued advice, guidance and valuable comments during the process of researching and writing this Master's Thesis.

Furthermore, I would like to thank Professor mr. ir. M.H.M. Schellekens for his useful suggestions and engagement throughout the learning process of this thesis.

Finally, I would like to express my gratitude to my wife, my parents and my parents-in-law for their support and patience. This accomplishment would not have been possible without them.

Gloria sine labore nulla

Author

Jelle Faber


“Education is not the learning of facts but training the mind to think”

- Albert Einstein - (1879 - 1955)


Table of Contents

Chapter I Introduction

1.1 Background 7
1.1.1 Technology and chatbots 8
1.1.2 Social, legal and ethical concerns 9
1.2 Problem statement and hypothesis 12
1.3 Central research question and sub-questions 12
1.4 Methodology and significance 13
1.4.1 Method of research 13
1.4.2 Significance 13
1.5 Thesis outline 13
Bibliography 15

Chapter II Artificial Intelligence and chatbots

2.1 Introduction 19
2.2 Definitions of Artificial Intelligence 19
2.2.1 Stuart Russell and Peter Norvig’s approach 23
2.2.2 Eran Kahana’s approach 25
2.2.3 The road to a working definition 26
2.3 Definitions of chatbots 29
2.4 Practical examples of chatbot applications in a B2C environment 35
2.4.1 Watson Assistant 35
2.4.2 Google Duplex 37
2.5 Conclusions 39
Bibliography 41


Chapter III Regulatory framework for B2C contracts in the Netherlands and EU for consumer protection

3.1 Introduction 46
3.2 Regulatory framework for B2C contracts in the Netherlands 46
3.2.1 Special agreements under Dutch law 46
3.2.2 Consumer sale regime 48
3.2.3 Legal consequences 49
3.2.4 The Dutch consumer sale regime 51
3.3 The applicable EU framework of consumer protection 55
3.3.1 E-commerce Directive 55
3.3.1.1 E-commerce Directive 58
3.3.2 Consumer Rights Directive 60
3.3.2.1 Consumer Rights Directive 67
3.3.3 Ongoing EU regulation initiatives on consumer protection within B2C contracts 70
3.4 Conclusions 75
Bibliography 77

Chapter IV Potential regulatory challenges for chatbots

4.1 Introduction 81
4.2 How the existing legal framework for B2C contracts in the Netherlands and EU can be applied to chatbots 81
4.2.1 The current legal framework for B2C contracts applied to a sale and purchase scenario carried out by a chatbot 82
4.3 Potential practical challenges associated with the use of chatbots in B2C contracts 86
4.3.1 Potential practical challenges 86
4.3.2 Perspective on the use of chatbots in future B2C contracts 88
4.4 Conclusions 89
Bibliography 90


Chapter V Future perspective on enhanced consumer protection for chatbots in B2C contracts

5.1 Introduction 93
5.2 Rationales of consumer rights protection 93
5.3 Resemblances between rationales of consumer rights protection and those of the GDPR 94
5.4 An alternative regulatory approach applied to the use of chatbots in B2C contracts 96
5.5 Conclusions 99
Bibliography 100

Chapter VI Conclusions and recommendations

6.1 Conclusions 102
6.2 Recommendations 104


Chapter I Introduction

1.1 Background

The European Union recently announced the development of a code of conduct for robots, to prevent devices and other ‘smart’ technologies from causing damage. The Commission also announced that it will allocate 1.5 billion euros from its research budget for AI for the period 2018 – 2020, and 500 million euros from the European Fund for Strategic Investments for start-up and AI projects.1

We are at the beginning of a new phase in digital evolution. The first phase was about the release of information through the wide expansion of the Internet. The second phase was about mobile technology and new forms of communication. The third phase is about automation and Artificial Intelligence.2 Chatbots3, i.e. chatbots using the underlying techniques of Artificial Intelligence, could be exploited within organisations for multiple purposes, such as process improvement, building interfaces to enterprise solutions, scalable 1:1 communication, verification and compliance, collecting information, searching for information and serving as a service channel.4

So far, chatbots are mainly used for commercial purposes, and while this use is not common yet, an increase in the use of chatbots is to be expected in the upcoming years in the business-to-consumer (B2C5) sector. Due to this expected growth in the number of chatbots and

1 ‘EU-gedragscode moet robots in toom houden’ De Telegraaf (Rotterdam, 25 April 2018), accessed 6 January 2019.

2 Steven van Belleghem, Customers the day after tomorrow: hoe trek je klanten aan in een wereld van AI, bots en automatisering (Editor Lannoo Campus, Leuven, Belgium 2017) p. 9.

3 Definition ‘chatbot’: A computer program designed to simulate conversation with human users, especially over the Internet, English Oxford Living Dictionaries, accessed 6 January 2019.

4 Jaap Linssens, ‘Zeven slimme toepassingsgebieden van (chats)bots’ (Emerce, 2018), accessed 6 January 2019.

5 Definition ‘B2C’: Business-to-consumer, denoting trade conducted via the Internet between businesses and consumers, English Oxford Living Dictionaries, accessed 6 January 2019.

increased use throughout different existing and new industries, social, legal and ethical issues will appear.6 As a result of this societal development, it is important to examine the legal aspects of chatbots, particularly in relation to potential regulatory challenges and consumer protection in a B2C environment. It is time to act now, before “casualties” start happening.

A brief overview of the technology behind chatbots (section 1.1.1) and some potential social, legal and ethical concerns (section 1.1.2) follows hereafter.

1.1.1 Technology and chatbots

Several types of chatbots can be distinguished, for example scripted and smart chatbots. Most chatbots currently in use are scripted bots, in the sense that they have been pre-programmed with scripts, which can be seen as pre-written answers, based on an algorithm (a software program), to provide customers with relevant information on certain topics. Smart chatbots, on the other hand, are developed on ‘learning algorithms’7 that draw on a data repository (a data library or data archive8). These algorithms are capable of ‘training’ the chatbot.
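The distinction between scripted and smart chatbots can be illustrated with a minimal, purely illustrative sketch of a scripted bot. All keywords and answers below are hypothetical examples, not drawn from any actual product: every reply is pre-written, and nothing is learned from the conversation.

```python
# Minimal sketch of a *scripted* chatbot: every answer is pre-written and
# selected by simple keyword matching; the bot does not learn from dialogue.
# All keywords and answers are hypothetical, illustrative examples.

SCRIPT = {
    "opening hours": "We are open Monday to Friday, 9:00-17:00.",
    "delivery": "Standard delivery takes 2-3 working days.",
    "return": "You can return a product within 14 days of receipt.",
}

FALLBACK = "Sorry, I did not understand. A human colleague will contact you."

def scripted_reply(message: str) -> str:
    """Return the pre-programmed answer whose keyword occurs in the message."""
    text = message.lower()
    for keyword, answer in SCRIPT.items():
        if keyword in text:
            return answer
    return FALLBACK

print(scripted_reply("What are your opening hours?"))  # → We are open Monday to Friday, 9:00-17:00.
print(scripted_reply("Can I return my order?"))        # → You can return a product within 14 days of receipt.
```

A smart chatbot would differ precisely in that the mapping from message to answer is not hand-written but learned from data in a repository.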

A concrete example of a smart chatbot is the one used by KLM9, a conversational chatbot on Facebook Messenger. This chatbot is a user-friendly, informative and communicative bot, which makes it easier for customers to receive and search for instant flight information. It was developed as a short-interaction alternative enabling KLM to have instant, real-time conversations with its customers.10

6 Loraine Nijhuis, ‘Hoe krijg je ethiek in je chatbot?’(Emerce, 2017), accessed 6 January 2019.

7 Two types of learning algorithms are distinguished: ‘supervised learning’ and ‘unsupervised learning’. For an overview of ‘learning algorithms’, accessed 6 January 2019.

8 Jeff Aldorisio, ‘What is a Data Repository?’ Digital Guardian (London 2018), accessed 6 January 2019.

9 KLM, accessed 6 January 2019.

10 Fadoa Schurer, ‘Dit is de stand van zaken van chatbots in Nederland’ (Marketingfacts, 2017), accessed 6 January 2019.


The sudden increase in the use of chatbot platforms can well be explained by the drawbacks of apps in general (such as the inability to switch between different mobile operating systems, the limited number of apps used per person, apps with different types of interfaces and the lack of standardization among app designers), in combination with the increasing popularity of chat platforms such as WhatsApp.11

According to Marcel Broersma, professor of journalistic culture and media at the University of Groningen in the Netherlands, who is conducting research on the use of news via social media: ‘We see that people who previously would share news topics via Facebook or Twitter increasingly do so by means of WhatsApp groups. Due to the closed character of these types of WhatsApp groups, users feel freer to share their ideas. We tend to become more careful on public platforms. Many people seem to be more reluctant to share their opinion in public. The WhatsApp group feels like a safer alternative.’12

Based on people's generally positive experience with chat platforms as a smooth and pleasant way of communicating (for private and business purposes), visionaries consider messaging to be the platform of the future. By converting messaging into a new development platform, a potential alternative could be created to the world wide web and the various app stores.13

1.1.2 Social, legal and ethical concerns

Algorithms and chatbots tend to play an increasingly important role in our current and future society. By using and improving the underlying algorithmic techniques, chatbots are able to ‘train’ themselves and learn to respond and adapt to human behaviour. The downside of this development is that, although they offer a great variety of opportunities for technological innovation, algorithms can become a sort of ‘black box’.

11 Sander Duivestein, ‘Chatbots, de nieuwe vorm van communiceren’ (ICT magazine, 2017), accessed 6 January 2019.

12 Kaya Bouma, ‘De wurggreep van de WhatsApp groep’ de Volkskrant (Amsterdam, 31 March 2018).

13 Sander Duivestein, ‘Chatbots, de nieuwe vorm van communiceren’ (ICT magazine, 2017), accessed 6 January 2019.


The ‘black box’ problem relates to the way Artificial Intelligence (AI) systems are trained. AI systems are trained using back-propagation, a gradient-based optimization method applied to a continuous function called the loss function.14 Once back-propagation is done, the AI is ‘trained’, meaning that its internal matrices are fine-tuned to perform a task. However, back-propagation gives rise to the ‘black box’ problem: a trained system's accuracy can be measured on test data, but how it arrives at its outputs cannot be explained. Moreover, for the same task, the matrices are different after every training run. Thus, for now, it is not fully possible to monitor the progress of AI learning over time.15
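The core idea of training by minimizing a loss function can be made concrete with a toy sketch, assuming a deliberately simplified model. This is not the procedure described by the cited sources, merely an illustration under stated assumptions: a single trainable weight w, an arbitrary illustrative dataset in which the ground-truth outputs equal 3 times the inputs, and an arbitrarily chosen learning rate.

```python
# Toy illustration of training by minimizing a loss function with gradient
# descent (the mechanism underlying back-propagation). We fit one weight w
# so that the prediction w*x approximates the ground-truth outputs y = 3*x.
# Dataset and learning rate are arbitrary, illustrative choices.

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # (input, ground-truth output)
w = 0.0              # the model's single trainable parameter
learning_rate = 0.02

for step in range(500):
    # Gradient of the mean squared loss: mean of 2*(prediction - truth)*input.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad  # "back-propagate" the error into the weight

print(round(w, 3))  # → 3.0
```

Note how this already hints at the ‘black box’: we can verify that the trained weight predicts well, but in a real network with millions of such weights, inspecting the final matrices does not explain *why* a particular output was produced.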

As a corollary, important legal questions have arisen around the application of chatbots, at least for the ones using AI. The legal aspects involved are, inter alia, first, website terms, conditions and disclaimers and, second, the applicability of chatbots in regulated industries.16

As an example of the first aspect: if a chatbot is assisting a user with booking a flight, a disclaimer stating that the service is computer generated and that users are responsible for checking the information provided before booking travel may be appropriate.17 As an example of the second aspect: where chatbots are used in regulated industries, their activities must be programmed to comply with industry regulations and standards; in addition, where a chatbot gives advice, the information fed to it must be kept up to date.18

The research conducted within this thesis - associated with the use of chatbots - will primarily focus on two essential legal aspects: the regulatory challenges in B2C contracts

14 The loss function is the difference between the ground-truth output vector and the AI output, representing the deviation for any prediction. By computing the gradient, the error can be ‘back-propagated’ to make the AI ‘smarter’. This amounts to finding the best parameters (weight matrices) for which the loss function is minimized.

15 Théo Szymkowiak, ‘The Artificial Intelligence Blackbox Problem & Ethics’ (2017), <https://medium.com/mcgill-artificial-intelligence-review/the-artificial-intelligence-black-box-problem-ethics-8689be267859> accessed 6 January 2019.

16 Farah Mukaddam, ‘Chatbots: Some Legal Issues’ (Social Media Law Bulletin, 6 June 2017), <https://www.socialmedialawbulletin.com/2017/06/chatbots-legal-issues/> accessed 6 January 2019.

17 Ibid.

18 Ibid.

and the protection of consumers' interests. In addition, recommendations for future regulation will be brought forward.

Underlying the legal concerns lie important ethical challenges concerning the application of algorithms and chatbots. Especially in a business environment, the following ethical considerations should be taken into account. Transparency into the algorithms and data that drive a chatbot's behaviour is important to engender trust; otherwise, market uptake may be impeded. Legal systems will need to consider how to allocate legal responsibility for loss or damage caused by chatbots, in order to determine accountability for the harm they cause.19

Additional ethical and legal questions arise in a business environment: when building chatbots, organizations must decide, inter alia, whom the chatbot primarily serves and who owns the data shared with it, the business or the customer. A lack of clarity on the ownership of information shared with a chatbot could, if not handled correctly, lead to intellectual property issues.20

The aforementioned social, legal and ethical concerns will contribute to the recommendations and conclusions, set out in the last chapter of this thesis.

19 Maya Medeiros, ‘Chatbots gone wild? Some ethical considerations’ (30 October 2017), <https://www.aitech.law/blog/chatbots-gone-wild-some-ethical-considerations> accessed 6 January 2019.

20 Trips Reddy, ‘The code of ethics for AI and chatbots that every brand should follow’ (2017), accessed 6 January 2019.


1.2 Problem statement and hypothesis

In this section, first the problem statement and subsequently the hypothesis will be presented.

Due to the expected increasing use and significance of chatbots, as described in the previous section, the following problem statement is at the core of this research:

We are on the verge of the widespread use of chatbots, yet there are many uncertainties about potential regulatory challenges in a B2C environment. At the same time, consumers should be provided with the highest level of protection, due to the asymmetry of information and power in the B2C relationship. This analysis can help protect consumers in the context of contracts concluded with the intervention of chatbots.

Based on the above defined problem statement the following hypothesis will be applied:

The lack of proper regulation for the use of chatbots in a B2C environment could cause potential liability claims for businesses. At the same time, due to the infinite opportunities of the use of chatbots in a B2C environment, in the end the interests of businesses will prevail, which will undeniably lead to insufficient protection of consumers' interests.

The next section will be dedicated to the central research question and sub-questions.

1.3 Central research question and sub-questions

The above-mentioned problems lead to the central research question of this thesis: how can chatbots be regulated in the Netherlands in the case of B2C contracts?

In order to respond to the central research question, the following sub-questions are addressed:

(i) What are the definitions of Artificial Intelligence & chatbots, their potential uses in B2C contracts and practical examples of already existing uses or experiments?

(ii) What is the regulatory framework for B2C contracts in the Netherlands and EU for consumer protection?

(iii) What are the potential regulatory challenges deriving from the introduction of chatbots in the B2C relationship, within the existing legislative framework?


1.4 Methodology and significance

1.4.1 Method of research

The central and sub-questions - as defined in section 1.3 - will be answered by means of desktop research and literature review. The research methodology applied in this thesis is doctrinal legal research.21

The research in this thesis is based on the following sources: primary sources, such as national regulation, EU legislation (via the legal database EUR-Lex) and court cases; secondary sources, such as books, (online) articles, newspaper articles, technical blogs and websites; and databases, such as WorldCat, Google Scholar and SSRN.

In addition, technical blogs are frequently used in this work, given that chatbots are a developing field closely related to the discipline of computer science.

The reviewed sources can be consulted in the bibliography.

1.4.2 Significance

The research in this thesis aims to assess how the regulatory challenges deriving from chatbots can be addressed in the Netherlands in the case of B2C contracts. This thesis will help to bridge the gap between existing regulations and a recommended future regulatory framework for chatbots. Moreover, this analysis serves as a signal to prioritize consumers' interests above business interests, in the context of contracts concluded with the intervention of chatbots.

1.5 Thesis outline

This thesis is divided into three main components, each part providing an answer to one of the sub-questions, defined in section 1.3.

Chapter 2 (covering the first sub-question) analyses the definitions of Artificial Intelligence & chatbots and their potential uses in B2C contracts, including examples of already existing uses or experiments. In chapter 3 (covering the second sub-question), the focus

21 Paul Chynoweth (2008), Chapter three Legal research, Advanced Research Methods in the Built Environment (Blackwell Publishing Ltd, Oxford, United Kingdom 2008) pp. 28–38.

will be on the regulatory framework for B2C contracts in the Netherlands and the applicable EU framework for consumer protection. Chapter 4 (covering the third sub-question) will examine and evaluate the potential regulatory challenges for chatbots. Chapter 5 will add critical remarks and recommendations for future regulatory frameworks.

Finally, in chapter 6, the conclusions and recommendations of this research will be presented.


Bibliography

1. Andrew Murray, Information Technology Law: The Law and Society (Oxford University Press, Oxford, United Kingdom 2016).

2. Ava Chisling, ‘Bots vs chatbots vs robots vs AI’, accessed 6 January 2019.

3. Cándido García Molyneux and Rosa Oyarzabal, ‘What is a Robot under EU Law?’ (EU law and regulatory, 2017) accessed 6 January 2019.

4. ‘EU-gedragscode moet robots in toom houden’ De Telegraaf (Rotterdam, 25 April 2018), <https://www.telegraaf.nl/nieuws/1961099/eu-gedragscode-moet-robots-in-toom-houden> accessed 6 January 2019.

5. EUR-Lex, accessed 6 January 2019.

6. European parliament, accessed 6 January 2019.

7. European Parliament Resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)), accessed 6 January 2019.

8. Fadoa Schurer, ‘Dit is de stand van zaken van chatbots in Nederland’ (Marketingfacts, 2017), accessed 6 January 2019.

9. Farah Mukaddam, ‘Chatbots: Some Legal Issues’ (Social Media Law Bulletin, 6 June 2017), accessed 6 January 2019.


10. Hamza Harkous, Kassem Fawaz, Kang G. Shin, Karl Aberer, ‘PriBots: Conversational Privacy with Chatbots’ (2016).

11. Horst Eidenmueller, The Rise of Robots and the Law of Humans (Oxford Legal Studies Research Paper No. 27/2017), accessed 6 January 2019.

12. Jaap Linssens, ‘Zeven slimme toepassingsgebieden van (chats)bots’ (Emerce, 2018), accessed 6 January 2019.

13. James Le, ‘The 10 Algorithms Machine Learning Engineers Need to Know’ (2016), accessed 6 January 2019.

14. Jeff Aldorisio, ‘What is a Data Repository?’ Digital Guardian (London 2018), accessed 6 January 2019.

15. Joost Linnemann, ‘Juridisch toepassingen van (toepassingen van) blockchain’ (Computerrecht, No 218(6), 2016).

16. Katharina Schwab, ‘Nest founder: “I Wake Up In Cold Sweats Thinking, What Did We Bring To the World?”’ (2017), accessed 6 January 2019.

17. Kaya Bouma, ‘De wurggreep van de WhatsApp groep’ de Volkskrant (Amsterdam, 31 March 2018).

18. Loraine Nijhuis, ‘Hoe krijg je ethiek in je chatbot?’(Emerce, 2017), accessed 6 January 2019.


19. Maya Medeiros, ‘Chatbots gone wild? Some ethical considerations’ (30 October 2017), <https://www.aitech.law/blog/chatbots-gone-wild-some-ethical-considerations> accessed 6 January 2019.

20. Nicolas Petit, ‘Law and Regulation of Artificial Intelligence and Robots: conceptual framework and normative implications’ (Working paper, 2017), accessed 6 January 2019.

21. ‘Partnership on AI to benefit people and society’, accessed 6 January 2019.

22. Paul Chynoweth (2008), Chapter three Legal research, Advanced Research Methods in the Built Environment (Blackwell Publishing Ltd, Oxford, United Kingdom 2008) pp. 28-38.

23. Prins, C., & Roest, J., ‘AI en de rechtspraak: Meer dan alleen de ‘robotrechter’ (Nederlands Juristenblad, No 93(4), 2018) pp. 260-268.

24. Rachel Hall, ‘Ready for robot lawyers? How students can prepare for the future of law’ The Guardian (London, 31 July 2017), accessed 6 January 2019.

25. Ronald Leenes, Erica Palmerini, Bert-Jaap Koops, Andrea Bertolini, Pericle Salvini and Federica Lucivero, ‘Regulatory challenges of robotics: some guidelines for addressing legal and ethical issues’ (Law, Innovation and Technology, No 9(1), 2017) pp. 1-44, accessed 6 January 2019.

26. Ryan Calo, A. Michael Froomkin, Ian Kerr, Robot Law (Edward Elgar Publishing Limited, Cheltenham, United Kingdom 2016).

27. Sander Duivestein, ‘Chatbots, de nieuwe vorm van communiceren’ (ICT magazine, 2017), accessed 6 January 2019.


28. Steven van Belleghem, Customers the day after tomorrow: hoe trek je klanten aan in een wereld van AI, bots en automatisering (Editor Lannoo Campus, Leuven, Belgium 2017).

29. Théo Szymkowiak, ‘The Artificial Intelligence Blackbox Problem & Ethics’ (2017), <https://medium.com/mcgill-artificial-intelligence-review/the-artificial-intelligence-black-box-problem-ethics-8689be267859> accessed 6 January 2019.

30. Eric Tjong Tjin Tai, ‘Aansprakelijkheid voor robots en algoritmes’ (Nederlands Tijdschrift voor Handelsrecht, No 14(3), 2017) pp. 123-132.

31. Trips Reddy, ‘The code of ethics for AI and chatbots that every brand should follow’ (2017), accessed 6 January 2019.

32. Ugo Pagallo, The Laws of Robots: Crimes, Contracts and Torts (Law, Governance and Technology Series 10, Springer, Dordrecht, The Netherlands 2013).


Chapter II Artificial Intelligence and chatbots

2.1 Introduction

The advance of AI is closely related to the increasing digitalization of our society. Virtually anything can have a digital component and generate data.22 Combining large amounts of data with the use of algorithms to analyse information makes it possible to perceive relationships within the combined information and to gain new insights and knowledge.23

In the long term, AI will eventually affect the everyday lives of millions of people and is likely to have an impact in the medium term on eight domains: transportation, home/service robots, healthcare, education, low-resource communities, public safety and security, employment and workplace, and entertainment.24

This chapter answers the first sub-question of this research: what are the definitions of Artificial Intelligence & chatbots, their potential uses in B2C contracts and practical examples of already existing uses or experiments? Section 2.2 examines the different definitions of AI. Section 2.3 gives an overview of some definitions in use for chatbots. Section 2.4 is devoted to two practical examples of chatbot applications in a B2C environment. Finally, the findings will be summarised in section 2.5.

2.2 Definitions of Artificial Intelligence

Defining the term “Artificial Intelligence” (AI) is complicated, since there is no universally accepted definition of AI.25 The definition and meaning of the single word intelligence, and even more so of Artificial Intelligence, is the subject of discussion and debate. Moreover,

22 Corien Prins en Jurgen Roest, AI en de rechtspraak: Meer dan alleen de ‘robotrechter’, Nederlands Juristenblad (93(4) 2018) p. 262.

23 Ibid, p. 261.

24 Stanford University, Artificial Intelligence and Life in 2020, One Hundred Year Study on Artificial Intelligence, Report of the 2015 Study Panel (2016) pp. 18 – 41.

25 Ibid, p. 12.

the definitions have changed over time, due to rapid technological developments.26

The Oxford Dictionaries, consulted as a starting point, use a fairly broad variant by defining Artificial Intelligence as the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making and translation between languages.27

Hereafter, the definitions of AI used by some scholars and AI researchers, such as computer scientists and engineers, will be discussed.

Nils John Nilsson, Professor Emeritus of Engineering at Stanford University28, describes Artificial Intelligence as “that activity devoted to making machines intelligent, and intelligence is that quality that enables an entity to function appropriately and with foresight in its environment”.29 On an extended continuum along which entities with various degrees of intelligence are arrayed, machines and many animals lie at the primitive end. At the other end are humans, who are able to reason, achieve goals, understand and generate language, and perceive and respond to sensory inputs.30

The definition of Artificial Intelligence applied by Nilsson encompasses a rather generous conception of AI, in its broad view that “intelligence” lies on a multi-dimensional spectrum.31

26 Joost N. Kok et al and UNESCO, Artificial Intelligence: encyclopaedia of life support systems (EOLSS), Artificial Intelligence: Definition, Trends, Techniques, and Cases (Eolss Publishers 2009) pp. 1 – 2.

27 English Oxford Living Dictionaries, accessed 6 January 2019.

28 Stanford University, accessed 6 January 2019.

29 Nils J. Nilsson, The Quest for Artificial Intelligence: A History of Ideas and Achievements (Cambridge University Press, Cambridge, UK 2010) p. 13.

30 Ibid.

31 Stanford University, Artificial Intelligence and Life in 2020, One Hundred Year Study on Artificial Intelligence, Report of the 2015 Study Panel (2016), p. 12.


Pei Wang, associate Professor of Computer and Information Sciences at Temple University32, analysed and compared five types of working definitions of AI, based on structure, behaviour, capability, function and principle.33

In short, these working definitions can be clarified as follows: structure-AI contributes to the study of the human brain; behavior-AI contributes to the study of human psychology; capability-AI contributes to various application domains by solving practical problems in a specific area; function-AI contributes to computer science by producing new software or hardware that can carry out various types of computation; and, ultimately, principle-AI delivers its contributions to the study of information processing in various situations.34

These definitions of AI differ since they set different goals, require different methods, produce different results and cannot replace one another, although they can be integrated into a coherent, satisfying definition. In Pei Wang's opinion, AI should mean building computer systems that are similar to the human mind, although there are very different ideas on where this similarity should lie, depending on which of the aforementioned working definitions is applied.35

According to John McCarthy36, Professor Emeritus of Computer Science at Stanford University37, AI is “the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable”. By the latter, McCarthy meant that AI need not be confined to being ‘like humans’; instead, AI can solve problems in entirely different ways than humans would.38

32 Temple University, accessed 6 January 2019.

33 Pei Wang, What Do You Mean by “AI”? Frontiers in Artificial Intelligence and Applications (17(1) 2008) p. 4.

34 Ibid, pp. 4 – 8.

35 Ibid, pp. 10 – 11.

36 John McCarthy, What is Artificial Intelligence?, Stanford University (2007) p. 2.

37 Stanford University, accessed 6 January 2019.

38 David Pryce, Blog AI Tech World: Machine learning at an enterprise scale, accessed 6 January 2019.


AI can also be defined as a branch of computer science that studies the properties of intelligence by synthesizing intelligence.39 In a survey conducted by Shane Legg40 and Marcus Hutter41, several different informal definitions of AI from researchers were identified.42 Examples include AI defined as “the field that studies the synthesis and analysis of computational agents that act intelligently” (David L. Poole43 and Alan K. Mackworth44)45, or AI defined as “the study of computations that make it possible to perceive, reason and act” (Patrick Henry Winston46).47

In the next two sections, two approaches to defining AI will be discussed: first

39 Stanford University, Artificial Intelligence and Life in 2020, One Hundred Year Study on Artificial Intelligence, Report of the 2015 Study Panel (2016) p. 12 and Herbert A. Simon, Artificial Intelligence: An Empirical Science, Elsevier (77(1) 1995) pp. 95 – 127.

40 Shane Legg is researcher and cofounder of the AI company DeepMind Technologies Ltd. (UK), acquired by Google in 2014. Bloomberg, accessed 6 January 2019.

41 Marcus Hutter is Professor of Engineering and Computer Science at the Australian National University. Australian National University, accessed 6 January 2019.

42 Shane Legg and Marcus Hutter, A Collection of Definitions of Intelligence, Frontiers in Artificial Intelligence and Applications (Volume 157, 2007) p. 7.

43 David L. Poole is Professor of Computer Science at the University of British Columbia. Personal website, accessed 6 January 2019.

44 Alan K. Mackworth is a Professor in the Department of Computer Science at the University of British Columbia. Personal website, accessed 6 January 2019.

45 David L. Poole and Alan K. Mackworth, Artificial Intelligence: Foundations of Computational Agents (Cambridge University Press, Cambridge, UK 2010) pp. 3-4.

46 Patrick Henry Winston is Professor of Artificial Intelligence and Computer Science at the Massachusetts Institute of Technology (MIT). MIT, accessed 6 January 2019.

47 Patrick Henry Winston, Artificial Intelligence, Third edition, Addison-Wesley Publishing Company, Reading Massachusetts (US) (1992) p. 5.

the one from Stuart Russell and Peter Norvig and, second, a functional approach based on an intelligence/sophistication continuum (see hereafter) applied by Eran Kahana.

The reason for choosing these two approaches is that, on the one hand, Stuart Russell and Peter Norvig's approach provides a complete overview by integrating several different definitions into one practical model; their AI model is also described in the leading introductory textbook on AI48, which contributes to its reliability. Eran Kahana's model, on the other hand, based on a broad scale of Artificial Intelligence applications viewed as a continuum, represents a meaningful, practical and contemporary concept.

2.2.1 Stuart Russell and Peter Norvig’s approach

The leading introductory textbook on AI, Stuart Russell49 and Peter Norvig's50 “Artificial Intelligence: A Modern Approach”, presents eight different definitions of AI organized into four main categories: thinking humanly, acting humanly, thinking rationally, and acting rationally. Over time, the importance of each of these definitional concepts has waxed and waned within the AI research community.51

The categories are presented in figure 2.1. Stuart Russell and Peter Norvig arrange these definitions along two main dimensions. Roughly, the ones on top are concerned with thought processes and reasoning, whereas the ones on the bottom address behavior. The definitions on the left measure success in terms of fidelity to human performance, whereas the ones on the right measure against an ideal concept of intelligence, which we will call rationality. A system is rational if it does the "right thing," given what it knows.52

48 Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, Harvard Journal of Law and Technology (29 (2) 2016) p. 360.

49 Stuart Russell is Professor of Computer Science at the University of California, Berkeley. University of California, Berkeley, accessed 6 January 2019.

50 Peter Norvig is a Director of Research at Google. Google AI, accessed 6 January 2019.

51 Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, Harvard Journal of Law and Technology (29 (2) 2016) p. 360.

52 Stuart J Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, Third edition, Prentice Hall, New York (2003) pp. 1 – 2.

23

Today, the most widely used approach to defining AI appears to focus on the concept of machines that work to achieve goals – a key component of 'acting rationally' in Russell and Norvig's scheme.53 Consensus on the definition of AI has emerged around the idea of a rational agent that perceives and acts in order to maximally achieve its objectives.54 However, from a regulatory perspective, this goal-oriented approach does not seem particularly helpful, because it simply replaces one difficult-to-define term (intelligence) with another (goal).55 Consequently, it is not clear how defining AI through the lens of goals could provide a solid working definition of AI for regulatory purposes.56

Figure 2.1 Some definitions of Artificial Intelligence, organized into four categories57

53 Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, Harvard Journal of Law and Technology (29 (2) 2016) p. 361.

54 Stuart Russell, Provably Beneficial Artificial Intelligence, in The Next Steps: Exponential Life, p. 1.

55 Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, Harvard Journal of Law and Technology (29 (2) 2016) p. 361.

56 Ibid.

57 Stuart J Russell and Peter Norvig, Artificial intelligence: A Modern Approach, Third edition, Prentice Hall, New York (2003) pp. 1 – 2.

24

2.2.2 Eran Kahana's approach

Eran Kahana58, counsel and legal research fellow at Stanford Law School, provides an interesting classification of Artificial Intelligence applications (apps) - although originally written from an intellectual property angle - along an intelligence or sophistication continuum, by categorising the following four types of apps.59

- Level A apps vary in their query-response, are programmatically constrained to perform a specific operation and are incapable of operational variance. An example is a chess computer.
- Level B apps retrieve data from sources external to the host device (e.g. an iPhone). Examples of these sources are websites and other apps granted the necessary access rights. For instance, fraud detection applications fall within this category.
- Level C apps feature autonomous decision-making capabilities. These apps can dynamically evaluate and decide from what source and what data to retrieve and how most effectively to present it. Practical examples are AI applications such as Watson Assistant (see paragraph 2.4.1) and Google Duplex (see paragraph 2.4.2).
- Level D apps manifest intelligence levels so sophisticated that they are able to create apps without human involvement. Apps at level D can also use the data they find in any manner they decide, in ways that indistinguishably replicate (and even exceed) human behaviour. An example is the (controversial) use of fully autonomous weapons, also known as 'killer robots'.60

The benefit of classifying AI applications from a functional point of view into different levels of apps, as conceptualised by Eran Kahana, is a better understanding of the AI sophistication level involved. However, as with any categorisation, discussions might arise with regard to the classification of a specific AI application into a particular category.
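Kahana's continuum can be illustrated with a small sketch. The following Python fragment is purely illustrative: the capability flags are hypothetical simplifications of Kahana's level descriptions, not part of his paper.

```python
from enum import IntEnum

class KahanaLevel(IntEnum):
    """Eran Kahana's intelligence/sophistication continuum, low to high."""
    A = 1  # fixed query-response, no operational variance (e.g. a chess computer)
    B = 2  # retrieves data from sources external to the host device (e.g. fraud detection)
    C = 3  # autonomous decision-making over sources, data and presentation
    D = 4  # creates new apps without human involvement

def classify(app) -> KahanaLevel:
    """Illustrative classifier: map an app's capability flags to a level."""
    if app.get("creates_apps_autonomously"):
        return KahanaLevel.D
    if app.get("autonomous_decision_making"):
        return KahanaLevel.C
    if app.get("uses_external_data"):
        return KahanaLevel.B
    return KahanaLevel.A

# A Level C example such as Watson Assistant or Google Duplex:
assistant = {"uses_external_data": True, "autonomous_decision_making": True}
print(classify(assistant).name)  # prints "C"
```

The sketch also shows why borderline classifications can be contested: an application's level depends entirely on which capabilities one attributes to it.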

Figure 2.2 (see next page) provides a graphical overview of the levels A, B, C and D apps.

58 Maslon, accessed 6 January 2019.

59 Eran Kahana, Intellectual Property Infringement by Artificial Intelligence Applications, Stanford Center for Legal Informatics (2016) pp. 1 - 2.

60 Human Rights Watch, accessed 6 January 2019.

25

Figure 2.2 Continuum from Level A apps (low intelligence) to Level D apps (high intelligence) (Eran Kahana's approach)

2.2.3 The road to a working definition

In conclusion, given the great variety of definitions of AI among scholars and researchers, a generally accepted definition of AI would be appropriate: a definition which is, on the one hand, flexible and, on the other hand, does not block innovation.61 Definitions of intelligence vary widely and focus on myriad interconnected human characteristics that are themselves difficult to define, including consciousness, self-awareness, language use, the ability to learn, the ability to abstract, the ability to adapt, and the ability to reason.62

61 Europees Parlement, Verslag met aanbevelingen aan de Commissie over civielrechtelijke regels inzake robotica (2015/2103(INL)), Commissie juridische zaken, Zittingsdocument A8/0005/2017 (24 January 2017), p. 3. European Parliament, accessed 6 January 2019.

62 Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, Harvard Journal of Law and Technology (29 (2) 2016) p. 360.

26

Nevertheless, it is important to be able to rely on a legal definition of Artificial Intelligence.63 In the absence of a generally accepted and applicable definition, the following solutions can be proposed to arrive at a useful working definition:

First, AI should be considered merely as an umbrella term for self-learning computer systems (Corien Prins64 and Jurgen Roest65).66

Second, as an alternative to a general definition of AI, the definition could be restricted to "AI systems", as most of them can be classified into systems that think like humans, systems that act like humans, systems that think rationally and systems that act rationally (Joost Kok67 et al. and Mannes Poel68).69

Third, as a merely practical approach, a workable definition could be adopted based on ongoing EU policy preparation initiatives in the AI field, including initiatives on a legal framework for AI.

In the latter context, a European approach to AI was recently announced by the European

63 Gary Lea, Why we need a legal definition of Artificial Intelligence (Australian National University 2015), accessed 6 January 2019.

64 Corien Prins is Professor of Law and Informatisation at Tilburg Law School. University of Tilburg, accessed 6 January 2019.

65 Jurgen van der Roest is lawyer at Houthoff. Houthoff, accessed 6 January 2019.

66 Corien Prins and Jurgen Roest, AI en de rechtspraak: Meer dan alleen de ‘robotrechter’, Nederlands Juristenblad (93(4) 2018) footnote 9.

67 Joost Kok is an expert in the field of processing and analysing data at Leiden University. Leiden University, accessed 6 January 2019.

68 Mannes Poel is an Assistant Professor in the Department of Computer Science in the field of processing and analysing data at the University of Twente. University of Twente, accessed 6 January 2019.

69 Joost Kok et al and UNESCO, Artificial Intelligence: encyclopaedia of life support systems (EOLSS), Artificial Intelligence: Definition, Trends, Techniques, and Cases (Eolss Publishers 2009) pp. 1 – 2.

27

Commission70, which is responsible for planning, preparing and proposing new European legislation.71 In its communication to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions, the Commission provided the following comprehensive, more descriptive definition of AI:

“AI refers to systems that display intelligent behaviour by analysing their environment and taking actions – with some degree of autonomy – to achieve specific goals. AI-based systems can be purely software-based, acting in the virtual world (e.g. voice assistants, image analysis software, search engines, speech and face recognition systems) or AI can be embedded in hardware devices (e.g. advanced robots, autonomous cars, drones or Internet of Things (IoT) applications). AI is used on a daily basis, e.g. to translate languages, generate subtitles in videos or to block email spam. Many AI technologies require data to improve their performance. Once they perform well, they can help improve and automate decision making in the same domain. For example, an AI system will be trained and then used to spot cyber-attacks on the basis of data from the concerned network or system”.72

Merely for practical reasons, due to the lack of consensus on a commonly accepted definition of AI and its broad definitional scope, the above-mentioned comprehensive and extensive description of AI provided by the European Commission will be applied as the working definition of AI throughout this thesis.

70 Press release European Commission, accessed 6 January 2019.

71 European Commission, accessed 6 January 2019.

72 European Commission, Communication from the Commission to the European Council, The European Economic and Social Committee and the Committee of the Regions: Artificial Intelligence for Europe (25 April 2018) p. 2. European Commission, accessed 6 January 2019.

28

2.3 Definitions of chatbots

With over 2.5 billion users who currently have at least one messaging app installed73, chat has emerged as the preferred communication channel of a large user base. This has lured the big technology players, such as Microsoft and Facebook, into building new ecosystems on top of the chat environment, in the form of chatbots.74 Chatbots can, for example, assist customer service by gathering information ahead of the eventual interaction with a human representative, understanding what happened and what the consumer wants, and can even be trained to solve basic issues automatically.75

The Oxford Dictionaries, consulted as a starting point, describe a chatbot as a computer program designed to simulate conversation with human users, especially over the internet.76 In this section, the definitions of chatbots offered by several scholars will be briefly examined, followed by a clarification of the relation between the terms 'chatbots' and 'AI'.

Robert Gorwa77 and Douglas Guilbeault78 consider chatbots as a form of human-computer dialog system operating through natural language via text or speech. In other words, they are programs that approximate human speech and interact with humans through some sort of interface. Developers of functional chatbots seek to design programs that can hold at least a basic dialogue with a human user. This entails processing inputs somehow (through natural

73 ‘Bots, the next frontier’ (The Economist, 9 April 2016), accessed 6 January 2019.

74 Hamza Harkous, Kassem Fawaz, Kang G. Shin and Karl Aberer, ‘PriBots: Conversational Privacy with Chatbots’ (2016) p. 1.

75 Michael Schneider, ‘Bots, Messenger and the Future of Customer Service’, TechCrunch (8 May 2016), accessed 6 January 2019.

76 The Oxford dictionaries, accessed 6 January 2019.

77 Robert Gorwa is PhD student, Researcher at the University of Oxford. Personal website, accessed 6 January 2019.

78 Douglas Guilbeault is Research Assistant in the Network Dynamics Group and PhD student at the Annenberg School for Communication at the University of Pennsylvania. The Computational Propaganda Project, accessed 6 January 2019.

29

language processing79 for example), and making use of a corpus of data to formulate a response to this input.80
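The dialog loop Gorwa and Guilbeault describe - process the input somehow, consult a corpus of data, formulate a response - can be reduced to a minimal sketch. The corpus entries and the fallback line below are hypothetical; a real chatbot replaces the crude keyword lookup with genuine natural language processing.

```python
# Minimal human-computer dialog loop: normalise input, match it against a
# small corpus, and formulate a response. Purely illustrative.

CORPUS = {
    "opening hours": "We are open Monday to Friday, 9:00-17:00.",
    "return policy": "You can return any item within 14 days.",
}

def respond(user_input: str) -> str:
    text = user_input.lower().strip("?!. ")   # crude input processing
    for topic, answer in CORPUS.items():      # consult the corpus
        if topic in text:
            return answer
    return "Sorry, I did not understand. Could you rephrase that?"

print(respond("What are your opening hours?"))
# prints "We are open Monday to Friday, 9:00-17:00."
```

Even this toy version exhibits the defining property of the class: it holds at least a basic dialogue by mapping natural language input onto a stored body of responses.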

Modern chatbots are substantially more sophisticated than their predecessors: today, chatbot programs have many commercial implementations and are often known as virtual assistants or assisting conversational agents, with current voice-based examples including Apple's Siri and Amazon's Alexa. Another implementation for chatbots is within messaging applications, and text-based chatbots have been developed for multiple messaging apps, including Facebook Messenger, Skype and WeChat. These bots have been built by developers to perform a range of practical functions, including answering frequently asked questions and performing organizational tasks.81

According to Nicole Radziwill and Morgan Benton82, both Associate Professors at James Madison University, chatbots can be considered one of the categories of conversational agents83, which are software systems that mimic interactions with real people. Chatbots are typically not embodied in the form of avatars, humans, or humanoid robots (those programs are considered to be “embodied conversational agents”).84 Interactive Voice Response (IVR)

79 Natural language processing is the application of computational techniques to the analysis and synthesis of natural language and speech. Oxford dictionaries, accessed 6 January 2019.

80 Robert Gorwa and Douglas Guilbeault, Understanding Bots for Policy and Research: Challenges, Methods, and Solutions (2018) p. 6.

81 Ibid.

82 Nicole Radziwill is Associate Professor, ISAT at James Madison University. James Madison University, accessed 6 January 2019. Morgan Benton is Associate Professor, ISAT at James Madison University. James Madison University, accessed 6 January 2019.

83 Conversational agents exploit natural language technologies to engage users in text-based information-seeking and task-oriented dialogs for a broad range of applications. James Lester, Karl Branting, Bradford Mott, Conversational Agents, (CRC Press LLC. 2004) p. 2.

84 Embodied conversational agents are computer-generated cartoonlike characters that demonstrate many of the same properties as humans in face-to-face conversation, including the ability to produce and respond to verbal and nonverbal communication. Elisabeth Andre et al., Interaction with Embodied Conversational Agents (the MIT Press, 2018) pp. 1 – 23.

30

systems85 (e.g. “Press or Say 1 for English”) are dialog systems86, but are not usually considered conversational agents.87 The relationship amongst the above technical terms is represented in figure 2.3. Chatbots receive natural language input, sometimes interpreted through speech recognition software88, and execute one or more related commands to engage in goal-directed behavior (often on behalf of a human user). As intelligent agents, they are usually autonomous, reactive, proactive, and social. The most advanced systems employ machine learning89 so that they may also adapt to new information or new requests.90

85 Interactive Voice Response is a technology that allows a computer to interact with a human through the use of voice and input via keyboard. Jamie Tolentino, Enhancing customer engagement with interactive voice response, future of communication (20 April 2015), accessed 6 January 2019.

86 A dialogue system is a computer program that communicates with a human user in a natural way, providing an interface between the user and a computer-based application that permits interaction with the application in a relatively natural manner. Suket Arora et al., Dialogue System: A Brief Review (2013) p. 1.

87 Nicole Radziwill and Morgan Benton, Evaluating Quality of Chatbots and Intelligent Conversational Agents (2017) p. 3.

88 Speech recognition software is computer software that allows a computer to understand spoken words. Cambridge Dictionary, accessed 6 January 2019.

89 Machine learning is the capacity of a computer to learn from experience, i.e. to modify its processing on basis of newly acquired information. Oxford Dictionaries, accessed 6 January 2019.

90 Nicole Radziwill and Morgan Benton, Evaluating Quality of Chatbots and Intelligent Conversational Agents (2017) p. 4.

31

Figure 2.3 Relationships between classes of software-based dialog systems91

It is important to note that the word chatbot is often used in the media and the industry as a synonym for conversational agent.92

Daniel Jurafsky93 and James H. Martin94 define chatbots as systems that can carry on extended conversations with the goal of mimicking the unstructured conversations or 'chats' characteristic of human-human interaction.95

91 Ibid

92 Daniel Jurafsky & James H. Martin, Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition, Chapter 29 Dialog Systems and Chatbots (Third Edition draft, 2017) p. 2.

93 Daniel Jurafsky is Professor of Computer Science at Stanford University. Stanford University, accessed 6 January 2019.

94 James H. Martin is Professor of Computer Science at University of Colorado Boulder. University of Colorado Boulder, accessed 6 January 2019.

95 Ibid, p. 4.

32

Hamza Harkous and others96 define chatbots as computer agents that are designed to simulate a conversation with human users via auditory or textual methods.97

Rohan Kar98 and Rishin Haldar99 make use of a functional description of two types of chatbots: on the one hand, chatbots that function on pre-defined rules and commands and, on the other hand, chatbots that operate supported by AI. The first category is often limited, as these bots are only as smart as they are (functionally) programmed. The second category gives the impression of being “intelligent”, as these bots are capable of understanding natural language, not just pre-defined commands, and get smarter as they interact more, due to their ability to maintain different states.100

The benefit of applying Rohan Kar and Rishin Haldar's definition of chatbots is that it provides a clear understanding of the relation between the terms 'chatbots' and 'AI', namely the distinction between the two aforementioned types of chatbots: one that functions on pre-defined rules (also known as scripted bots) and the other supported by AI (also

96 Hamza Harkous is Postdoctoral Researcher at École Polytechnique Fédérale de Lausanne (EPFL). EPFL, accessed 6 January 2019.

97 Hamza Harkous, Kassem Fawaz, Kang G. Shin and Karl Aberer, ‘PriBots: Conversational Privacy with Chatbots’ (2016) p. 1.

98 Rohan Kar is a Graduate Student at UC Berkeley. UC Berkeley, accessed 6 January 2019.

99 Rishin Haldar is Assistant Professor at the VIT University of Vellore (India). VIT University, accessed 6 January 2019.

100 Rohan Kar and Rishin Haldar, Applying Chatbots to the Internet of Things: Opportunities and Architectural Elements, International Journal of Advanced Science and Applications (7(11) 2016) pp. 1 – 9.

33

known as AI bots).101 A scripted chatbot does not carry even a glimpse of AI, whereas AI bots are built on Natural Language Processing (NLP)102 and Machine Learning (ML)103.104
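The functional contrast between the two categories can be sketched as follows. This is a deliberately simplified illustration: the "AI bot" side merely approximates, with tolerant keyword matching and a conversation state, what a real AI bot achieves with NLP and machine learning, and all names and rules are hypothetical.

```python
# Scripted bot: recognises only exact pre-defined commands.
SCRIPT = {"ORDER": "Order placed.", "CANCEL": "Order cancelled."}

def scripted_bot(command: str) -> str:
    return SCRIPT.get(command, "Unknown command.")   # exact match only

# "AI bot" sketch: tolerant of free phrasing and maintains dialogue state,
# standing in for a genuine NLP/ML pipeline.
class AIBotSketch:
    def __init__(self):
        self.state = {}                              # maintains different states

    def reply(self, utterance: str) -> str:
        words = utterance.lower().split()
        if "order" in words:
            self.state["intent"] = "order"
            return "Sure, what would you like to order?"
        if self.state.get("intent") == "order":
            return f"Adding {utterance} to your order."
        return "Could you tell me a bit more?"

print(scripted_bot("I would like to order"))  # prints "Unknown command."
bot = AIBotSketch()
print(bot.reply("I would like to order"))     # understood despite free phrasing
```

The scripted bot fails on anything outside its script, while the stateful bot can carry the "order" intent across turns, which is precisely the functional difference Kar and Haldar describe.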

As a result of the important distinction between scripted bots and AI bots and the lack of consensus on a commonly accepted definition of chatbots, the functional definition of chatbots advocated by Rohan Kar and Rishin Haldar will be applied as the general working definition of chatbots throughout this thesis. However, by way of qualification for the research objectives, the remainder of this thesis will be confined to the second type, i.e. AI bots.

In the next section two practical examples of chatbot applications in a B2C environment will be discussed: Watson Assistant and Google Duplex respectively.

101 Hira Saeed, The difference between an A.I. interface and an A.I. chatbot, accessed 6 January 2019.

102 Natural Language Processing is the application of computational techniques to the analysis and synthesis of natural language and speech. Oxford Dictionaries, accessed 6 January 2019.

103 Machine learning is the capacity of a computer to learn from experience, i.e. to modify its processing on basis of newly acquired information. Oxford Dictionaries, accessed 6 January 2019.

104 Hira Saeed, The difference between an A.I. interface and an A.I. chatbot, accessed 6 January 2019.

34

2.4 Practical examples of chatbots applications in a B2C environment

AI in general has amazing potential to improve our lives, helping us live healthier and happier lives and generating large numbers of new jobs.105 To indicate the implications of AI, it is useful to distinguish three types of AI use. First, AI can support the description of processes and procedures, for example by extracting patterns from enormous amounts of data and translating them into different types of customer buying behaviour. Second, insights generated by AI can be used to predict the expected behaviour of actors or the implications of their choices. Third, expected customer behaviour can be anticipated through policy or targeted actions with the objective of pre-influencing conduct.106

This section will briefly describe two practical examples of chatbot applications in a B2C environment, both supported by AI: the first use case is the chatbot application Watson Assistant by IBM (paragraph 2.4.1) and the second is Google Duplex (paragraph 2.4.2).

2.4.1 Watson Assistant

The chatbot solution by IBM Watson is named the Watson Assistant service (hereafter Watson Assistant or W.A., formerly known as Watson Conversation107). W.A. enables enterprises to build a solution that understands natural language input and uses machine learning to respond to customers in a way that simulates a conversation between humans.108

Watson Assistant contains the following functional features:109

105 Thomas Metzinger, Peter J. Bentley, Olle Häggström, Miles Brundage, Should we fear Artificial Intelligence? In Depth Analysis: Science and Technology Options Assessment, Scientific Foresights Unit (STOA), European Parliamentary Research Service (EPRS) (2018) p. 11.

106 Corien Prins and Jurgen Roest, AI en de rechtspraak: Meer dan alleen de ‘robotrechter’, Nederlands Juristenblad (93(4) 2018) pp. 261-262.

107 IBM Watson, accessed 6 January 2019.

108 IBM Cloud Docs, accessed 6 January 2019.

109 IBM Watson, accessed 6 January 2019.

35

First, dialogue or conversation tree110 view tooling makes it easy to program multi-turn dialog and provide response variations based on different conditions. Second, folders are included to keep dialog nodes111 organized, making it possible to scale and simplify the context of the nodes and enabling users to interact with the application through user interfaces112, for example by means of a simple chat window or a mobile app, or even a robot with a voice interface.113 Finally, user interfaces are built into an application or a device, are pre-trained with industry-relevant content and can make sense of historical chat or call logs.114
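A dialog or conversation tree of the kind described above can be sketched as a simple data structure: each node holds a bot response and named branches the user can follow across turns. The sketch is illustrative only; it is not IBM Watson Assistant's actual API, and the node names and responses are invented.

```python
# Illustrative dialog/conversation tree for multi-turn dialog.
class DialogNode:
    def __init__(self, response: str, branches=None):
        self.response = response
        self.branches = branches or {}   # user choice -> child node

tree = DialogNode("Hello! Do you need 'support' or 'sales'?", {
    "support": DialogNode("Is your issue 'billing' or 'technical'?", {
        "billing": DialogNode("Connecting you to the billing team."),
        "technical": DialogNode("Connecting you to technical support."),
    }),
    "sales": DialogNode("Our sales team will contact you shortly."),
})

def walk(node: DialogNode, choices):
    """Traverse the tree for a multi-turn dialog, yielding each bot response."""
    yield node.response
    for choice in choices:
        node = node.branches[choice]
        yield node.response

print(list(walk(tree, ["support", "billing"])))
```

Grouping related nodes (here, everything under "support") mirrors the folder feature: branches can be scaled or simplified without touching the rest of the tree.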

Watson Assistant has, for example, enabled Staples Inc. to increase revenue by means of higher order frequency, increased order size and improved service scores.115 By using the natural language processing and machine learning capabilities of W.A., Staples Inc. has transformed and enriched its traditional order process into an intelligent ordering ecosystem that customers can use to order supplies easily via voice, text or email.116

The figure below shows an example of a potential customer conversation with IBM Watson.

110 Dialog or conversation trees are tree structures representing all possible developments of the dialogue, where users can decide between different branches using multiple choice. Tibor Bosse and Simon Provoost, Integrating Conversation Trees and Cognitive Models within an ECA for Aggression De-escalation Training (2015) pp. 1 – 2.

111 Examples of nodes are: action node, concept node, default node, entity node, folder node, getUserInput node, goto node, if node, input node, output node, search node, variables node. IBM Cloud Docs, accessed 6 January 2019.

112 User interface is the means by which the user and a computer system interact, in particular the use of input devices and software. Oxford Dictionaries, accessed 6 January 2019.

113 IBM Cloud Docs, accessed 6 January 2019.

114 IBM Cloud Docs, accessed 6 January 2019.

115 The case study Staples Inc., IBM, accessed 6 January 2019.

116 Ibid.

36

Figure 2.4 Example screenshot of a potential customer conversation with IBM Watson117

2.4.2 Google Duplex

A big game changer could be the recently launched voice chatbot Google Duplex.118 This new technology was launched during Google's annual developer conference in May 2018 and is capable of conducting natural conversations and carrying out "real world" tasks over the phone. Google Duplex is programmed to complete specific tasks, such as scheduling appointments or reserving a table at a restaurant. For such tasks, the system makes the conversational experience as natural as possible, allowing people to speak normally, as they would to another person, without having to adapt to a machine.119

The Google Duplex system is capable of carrying out sophisticated conversations and it completes the majority of its tasks fully autonomously, without human involvement. The

117 WordPress, accessed 6 January 2019.

118 Google`s AI assistant Can Now Make Real Phone Calls. YouTube, accessed 6 January 2019.

119 Yaniv Leviathan, Principal Engineer at Google and Yossi Matias, Vice President, Engineering at Google. ‘Google Duplex An AI System for Accomplishing Real-World Tasks over the phone’ Google AI blog, the latest news from Google AI (8 May 2018), accessed 6 January 2019.

37

system has a self-monitoring capability, which allows it to recognize the tasks it cannot complete autonomously (e.g., scheduling an unusually complex appointment). In these cases, it signals a human operator, who can complete the task.120
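The self-monitoring handoff described above can be sketched as a simple decision rule: estimate the system's own confidence for a task and signal a human operator when the task cannot be completed autonomously. This is a hypothetical illustration; Google has not published Duplex's internals, and the task names, confidence scores and threshold are invented.

```python
# Hedged sketch of a self-monitoring handoff: low-confidence tasks are
# escalated to a human operator instead of being attempted autonomously.

CONFIDENCE = {                      # illustrative per-task confidence scores
    "book restaurant table": 0.95,
    "schedule complex appointment": 0.40,
}

THRESHOLD = 0.8                     # invented cut-off for autonomous handling

def handle(task: str) -> str:
    if CONFIDENCE.get(task, 0.0) >= THRESHOLD:
        return f"autonomous: completing '{task}'"
    return f"handoff: signalling human operator for '{task}'"

print(handle("book restaurant table"))
print(handle("schedule complex appointment"))
```

Unknown tasks default to a confidence of zero and are therefore always handed off, which is the conservative behaviour one would want from such a system.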

After the launch, Google was accused of using deceitful and unethical technology, as the Google Duplex voice chatbot can trick humans into believing they are talking to another human, because of the voice and the little expressions used.121 In its reaction to the controversy, a Google spokeswoman declared in a statement that “We (Google) are designing this feature with disclosure built-in, and we’ll make sure the system is appropriately identified. What we showed at I/O was an early technology demo, and we look forward to incorporating feedback as we develop this into a product.”122

Google Duplex is currently in a test phase; it will be tested intensively this year and will probably become part of Google Assistant over time.123

In the next chapter, the protection of consumers within B2C contracts will be explored from two angles: the regulatory framework for B2C contracts in the Netherlands and the EU framework of consumer protection.

120 Ibid.

121 The Guardian, accessed 6 January 2019.

122 Techspot, accessed 6 January 2019.

123 Scott Huffman, VP Engineering, ‘The future of the Google Assistant: Helping you get things done to give you time back’ (8 May 2018). Google, accessed 6 January 2019.

38

2.5 Conclusions

This chapter answered the first sub-question of this research: what are the definitions of AI and chatbots, their potential uses in B2C contracts and practical examples of already existing uses or experiments?

By examining the various definitions of Artificial Intelligence and chatbots respectively, a working definition has been adopted for each notion (see paragraph 2.2.3 and section 2.3), in the absence of generally accepted formal definitions amongst scholars.

First, the adopted working definition of Artificial Intelligence according to the European Commission:

“AI refers to systems that display intelligent behaviour by analysing their environment and taking actions – with some degree of autonomy – to achieve specific goals. AI-based systems can be purely software-based, acting in the virtual world (e.g. voice assistants, image analysis software, search engines, speech and face recognition systems) or AI can be embedded in hardware devices (e.g. advanced robots, autonomous cars, drones or Internet of Things (IoT) applications). AI is used on a daily basis, e.g. to translate languages, generate subtitles in videos or to block email spam. Many AI technologies require data to improve their performance. Once they perform well, they can help improve and automate decision making in the same domain. For example, an AI system will be trained and then used to spot cyber-attacks on the basis of data from the concerned network or system”.

Merely for practical reasons, due to the lack of consensus on a commonly accepted definition of AI and its broad definitional scope, the above-mentioned comprehensive and extensive description of AI provided by the European Commission will be applied as the working definition of AI throughout this thesis.

Second, the adopted functional working definition of chatbots according to the scholars Rohan Kar and Rishin Haldar, who distinguish two main types of chatbots:

The first category of chatbots, 'scripted bots', function on pre-defined rules and commands and are only as smart as they are (functionally) programmed. The second category of chatbots, 'AI bots', operate supported by AI and are capable of understanding natural language, not just pre-defined commands, and get smarter as they interact more due to their ability to maintain different states.

39

The reason for applying the working definition of chatbots according to Rohan Kar and Rishin Haldar is the insight it provides into the relation with Artificial Intelligence: the first type functions solely on pre-defined rules, whereas the latter is supported by AI. However, for research purposes, the remaining part of this thesis will be confined to the second type of chatbots.

Finally, two practical examples of AI applications in the B2C field were discussed, Watson Assistant and Google Duplex (see paragraphs 2.4.1 and 2.4.2). Watson Assistant enables enterprises to build a solution that understands natural language input and uses machine learning to respond to customers in a way that simulates a conversation between humans. The voice chatbot Google Duplex is capable of carrying out sophisticated conversations and completes the majority of its tasks fully autonomously, without human involvement.

40

Bibliography

1. Anirudh Khanna et al., A Study of Today’s A.I. through Chatbots and Rediscovery of Machine Intelligence, International Journal of u- and e- Service, Science and Technology (Volume 8 (7) 2015) pp. 277-284.

2. ‘Bots, the next frontier’ (The Economist, 9 April 2016), accessed 6 January 2019.

3. Corien Prins and Jurgen Roest, AI en de rechtspraak: Meer dan alleen de ‘robotrechter’, Nederlands Juristenblad (93(4) 2018) pp. 260-268.

4. Daniel Jurafsky & James H. Martin, Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition, Chapter 29 Dialog Systems and Chatbots (Third Edition draft, 2017) pp. 441 – 458.

5. David L. Poole and Alan K. Mackworth, Artificial Intelligence: Foundations of Computational Agents (Cambridge University Press The Edinburgh Building, New York, 2010) pp. 1 - 682.

6. David L. Poole, Alan K. Mackworth, and Randy Goebel, Computational Intelligence: A Logical Approach (Oxford University Press, New York 1998) pp. 1 - 558.

7. Dimitar Shterionov, Dissertation Design and Development of Probabilistic Inference Pipeline, KU Leuven, Faculty of Engineering, Department of Computer Science (Heverlee, Belgium 2010) pp. 1 – 230.

8. Elisabeth Andre et al., Interaction with Embodied Conversational Agents (the MIT Press, 2018) pp. 1 – 23.

9. Eran Kahana, Intellectual Property Infringement by Artificial Intelligence Applications, Stanford Center for Legal Informatics (2016).


10. Press release European Commission, accessed 6 January 2019.

11. European Commission, Communication from the Commission to the European Council, The European Economic and Social Committee and the Committee of the Regions: Artificial Intelligence for Europe (25 April 2018). European Commission, accessed 6 January 2019.

12. Europees Parlement, Verslag met aanbevelingen aan de Commissie over civielrechtelijke regels inzake robotica (2015/2103(INL), Commissie juridische zaken, Zittingsdocument A8/0005/2017 (24 Januari 2017). Europees Parlement, accessed 6 January 2019.

13. Gary Lea, Why we need a legal definition of Artificial Intelligence (Australian National University 2015), accessed 6 January 2019.

14. Hamza Harkous, Kassem Fawaz, Kang G. Shin and Karl Aberer, ‘PriBots: Conversational Privacy with Chatbots’ (2016) pp. 1 - 9.

15. Herbert A. Simon, Artificial Intelligence: An Empirical Science, Elsevier (77(1) 1995) pp. 95–127.

16. Ian R. Kerr, Bots, Babes and the Californication of Commerce, University of Ottawa Law & Technology Journal (2004) pp. 285 – 324.

17. James Lester, Karl Branting, Bradford Mott, Conversational Agents, (CRC Press LLC. 2004) pp. 1 – 17.

18. John McCarthy, What is Artificial Intelligence?, Stanford University (2007) pp. 1 – 15.


19. Joost N. Kok et al., Artificial Intelligence: Definition, Trends, Techniques, and Cases, in UNESCO, Encyclopaedia of Life Support Systems (EOLSS) (Eolss Publishers 2009) pp. 1 - 20.

20. Kay Firth-Butterfield and Yoon Chae, World Economic Forum, White paper Artificial Intelligence collides with Patent Law, Center for the Fourth Industrial Revolution, Geneva, (2018) pp. 1 – 24.

21. Marcus Hutter and Shane Legg, Universal Intelligence: A definition of Machine Intelligence, Minds and Machines (17 (4) 2007) pp. 391 and 405-423.

22. Marshal S. Willick, Artificial Intelligence: Some Legal Approaches and Implications, AI Magazine (Volume 4 (2) 1983) pp. 5 - 16.

23. Matthew U. Scherer, Regulating Artificial Intelligence Systems: Risks, Challenges, Competencies, and Strategies, Harvard Journal of Law and Technology (29 (2) 2016) pp. 353 - 400.

24. Michael Negnevitsky, Artificial Intelligence: A Guide to Intelligent Systems (2nd edition, Pearson Education Limited, Harlow, UK 2005) pp. 1 – 415.

25. Michael Schneider, ‘Bots, Messenger and the Future of Customer Service’, TechCrunch (8 May 2016), accessed 6 January 2019.

26. Nicolas Petit, Law and Regulation of Artificial Intelligence and Robots: conceptual framework and normative implications (Working paper, 2017) pp. 1 – 37.

27. Nicole Radziwill and Morgan Benton, Evaluating Quality of Chatbots and Intelligent Conversational Agents (2017) pp. 1 – 21.

28. Nils J. Nilsson, The Quest for Artificial Intelligence: A History of Ideas and Achievements (Cambridge University Press, Cambridge, UK 2010) pp. 1 – 707.


29. Patrick Henry Winston, Artificial Intelligence (Third edition, Addison-Wesley Publishing Company, Reading, Massachusetts (US) 1992) pp. 1 – 730.

30. Pei Wang, What Do You Mean by “AI”? Frontiers in Artificial Intelligence and Applications (17(1) 2008) pp. 362 – 373.

31. Robert Gorwa and Douglas Guilbeault, Understanding Bots for Policy and Research: Challenges, Methods, and Solutions (2018) pp. 1 – 29.

32. Rohan Kar and Rishin Haldar, Applying Chatbots to the Internet of Things: Opportunities and Architectural Elements, International Journal of Advanced Science and Applications (7(11) 2016) pp. 1 – 9.

33. Ryan Calo, Robotics and the Lessons of Cyberlaw, California Law Review (Volume 103 (3) 2015) pp. 513 – 563.

34. Sam N. Lehman-Wilzig, Frankenstein unbound: Towards a legal definition of Artificial intelligence (Elsevier Volume 13 (6) 1981) pp. 442 – 457.

35. Sameera A. Abdul-Kader and John Woods, Survey on Chatbot Design Techniques in Speech Conversation Systems, (IJACSA) International Journal of Advanced Computer Science and Applications (Volume 6 (7) 2015) pp. 72 – 80.

36. Shane Legg and Marcus Hutter, A Collection of Definitions of Intelligence, Frontiers in Artificial Intelligence and Applications (Volume 157, 2007) pp. 17 – 24.

37. Stanford University, Artificial Intelligence and Life in 2030, One Hundred Year Study on Artificial Intelligence, Report of the 2015 Study Panel (2016) pp. 1 – 52.

38. Stuart J. Russell and Peter Norvig, Artificial Intelligence: A Modern Approach (Third edition, Prentice Hall 2010) pp. 1 – 1132.

39. Stuart Russell, Provably Beneficial Artificial Intelligence, in The Next Step: Exponential Life, pp. 1 – 13.


40. Suket Arora et al., Dialogue System: A Brief Review (2013) pp. 1- 4.

41. Thomas Metzinger, Peter J. Bentley, Olle Häggström, Miles Brundage, Should we fear Artificial Intelligence? In Depth Analysis: Science and Technology Options Assessment, Scientific Foresights Unit (STOA), European Parliamentary Research Service (EPRS) (2018) pp. 1 – 40.

42. Tibor Bosse and Simon Provoost, Integrating Conversation Trees and Cognitive Models within an ECA for Aggression De-escalation Training (2015) pp. 1 – 9.


Chapter III Regulatory framework for B2C contracts in the Netherlands and EU for consumer protection

3.1 Introduction

This chapter answers the second sub-question of this research: what is the regulatory framework for B2C contracts in the Netherlands and EU for consumer protection?

Section 3.2 examines the regulatory framework for B2C contracts in the Netherlands. Section 3.3 provides an overview of the applicable EU framework for consumer protection. Finally, the findings will be summarised in section 3.4.

3.2 Regulatory framework for B2C contracts in the Netherlands

In this section the regulatory framework for business-to-consumer (B2C) contracts in the Netherlands will be outlined. First, the special agreements under Dutch law (paragraph 3.2.1), subsequently the consumer sale regime (paragraph 3.2.2) and finally the legal consequences associated with this sales regime (paragraph 3.2.3) will be described.

Where relevant, especially for reasons of clarity, the legal terminology used in this section will be accompanied by the original Dutch terms in brackets.

The starting point of the regulatory framework for B2C contracts in the Netherlands consists of the special agreements (‘bijzondere overeenkomsten’), a special regime under Dutch law codified in Books 7, 7A and 8 of the Dutch Civil Code. B2C contracts are agreements concluded between legal persons (businesses) and natural persons (consumers).124

3.2.1 Special agreements under Dutch law

The above-mentioned Books 7, 7A and 8 of the Dutch Civil Code (‘Burgerlijk Wetboek’ or ‘BW’), hereafter the Dutch Civil Code, contain the regulatory framework for various special agreements under Dutch law. Title 1 of Book 7 of the Dutch Civil Code125 includes the relevant

124 Jan van Beckum and Gert-Jan Vlasveld, Contractmanagement voor opdrachtgever en leverancier, Van Haren Publishing 2014, p. 5.

125 Articles 1-50 title 1: Special Agreements Book 7 of the Dutch Civil Code.

provisions applicable to one of those special agreements, i.e. purchase (‘koop’) and exchange (‘ruil’). Special agreements derogate from the rules on agreements in general126 (‘overeenkomsten in het algemeen’), based on the lex specialis principle.127

The sale-purchase agreement (‘koopovereenkomst’) under Dutch law has its own provisions under the consumer sale title (‘kooptitel’)128 of Book 7 of the Dutch Civil Code. The Dutch legislator, following the European Directive, offers the consumer-buyer - usually considered the weaker contracting party in relation to the seller - additional protection within sales agreements.129 Stronger consumer protection often leads to tighter legislation, aiming to provide consumers with a better (more balanced) position in such relationships: consumer protection law.130

As a result of this policy, many provisions under the consumer sale title have a mandatory legal character (‘dwingendrechtelijk karakter’), meaning that no derogation from them is permitted.131 As a consequence, legal acts (‘rechtshandelingen’) that are in contradiction with these mandatory rules of law (‘dwingend recht’) are in general void (‘nietig’) or voidable (‘vernietigbaar’) on the basis of article 40 paragraph 2 Book 3 of the Dutch Civil Code.132

126 Articles 213–279 title 5: Agreements in general Book 6 of the Dutch Civil Code.

127 Silvia Zorzetto, The Lex Specialis Principle and its Uses in Legal Argumentation: An Analytical Inquire, Eunomia. Revista en Cultura de la Legalidad (3) 2013, p. 61.

128 The consumer sale title: title 1 Book 7 of the Dutch Civil Code.

129 Marco Loos, Consumentenkoop, Monografieën BW (B65b) Kluwer, 2014, p. 1.

130 Kitty Lieverse and Jac Rinkes, Oneerlijke handelspraktijken en handhaving van consumentenbescherming in de financiële sector, Deel 106, Kluwer, Deventer, 2010, p. 139.

131 Marc Loth, Dwingend en aanvullend recht, Monografieën BW (A19), Kluwer 2009, p. 3.

132 Asser/Arthur Severijn Hartkamp and Carla Sieburgh, 6-III Algemeen overeenkomsten recht, Verbintenissenrecht 2010, nr. 323.


A practical example of such a mandatory provision: in case of non-conformity of a product, consumers are entitled to compensation in accordance with sections 9 and 10 of Book 6 of the Dutch Civil Code.133

In addition to these mandatory provisions, special provisions were introduced that apply solely to the consumer sale (‘consumentenkoop’) regime134 (see hereafter).

In the next paragraph the scope of the notion of consumer sale will be explored further.

3.2.2 Consumer sale regime

Consumer sale under Dutch law is defined as the sale of a movable property concluded between a seller acting in the course of his trade, business, craft or profession and a buyer, a natural person, acting for purposes outside his trade, business or professional activity.135 The notion of consumer sale extends to an authorised representative acting in the course of his profession or business as a seller. In that case the consumer sale regime also applies, unless the buyer knows that the principal (‘volmachtgever’) is not acting in the course of his profession or business.136

Moreover, the Dutch legislator extended the definition of consumer sale to contracts for work on movable property (‘overeenkomsten tot aanneming van werk’). Where a consumer is involved in such a contract, the rules on consumer sale will ultimately prevail in case of a conflict between the two contract types.137 A practical example is an agreement on custom hair units bought by a consumer from a professional party.138

133 Articles 7 paragraph 1 and 24 paragraph 1 Book 7 of the Dutch Civil Code.

134 Marco Loos, Consumentenkoop, Monografieën BW (B65b) Kluwer, 2014, p. 1.

135 Article 5 paragraph 1 Book 7 of the Dutch Civil Code.

136 Article 5 paragraph 2 Book 7 of the Dutch Civil Code.

137 Article 5 paragraph 4 Book 7 of the Dutch Civil Code.

138 Judgement of the Dutch Court Arnhem-Leeuwarden of 6 October 2015, ECLI:NL:GHARL:2015:7448.


Furthermore, according to article 5 paragraph 5 Book 7 of the Dutch Civil Code, the provisions of the consumer sale title139 are also declared applicable to the supply of electricity, water, gas and district heating to a natural person. Equally, the provisions on consumer sale apply to the ‘supply of digital content that is not delivered on a tangible medium’140, except for streaming agreements (for example the streaming of videos, films or online radio), which are explicitly excluded from the consumer sale regime.141

In addition, judgements by Dutch courts have given the notion of consumer sale further precision. For instance, in its Beeldbrigade judgement142 the Dutch Supreme Court applied the consumer sale regime to agreements concerning the purchase of ‘standard computer software’ – on a data carrier or via download – for use unlimited in time against payment of a certain amount, since such agreements serve ‘to provide the purchaser with something that is individualized and on which he can exercise real power’.

In the next paragraph the legal consequences associated with the consumer sale regime will be discussed.

3.2.3 Legal consequences

The qualification of an agreement as consumer sale has the following five legal consequences.

First, the provisions in parts 1 up to and including 7 of title 1 Book 7 of the Dutch Civil Code apply mandatorily in case of consumer sale, in the sense that deviation to the detriment of the consumer is not permitted,143 except for the five provisions mentioned under article 6 paragraph 2 Book 7 of the Dutch Civil Code, concerning costs of

139 Except for the articles 9, 11 and 19a Book 7 of the Dutch Civil Code.

140 Caroline Cauffman, kroniek consumentkoop 2014-2015, Tijdschrift voor Consumentenrecht en handelspraktijk (5) 2016, p. 212.

141 Article 5 paragraph 5 Book 7 of the Dutch Civil Code and Caroline Cauffman, kroniek consumentkoop 2014 -2015, Tijdschrift voor Consumentenrecht en handelspraktijk (5) 2016, p. 213.

142 Judgement of the Dutch Supreme Court in ‘het Beeldbrigade arrest’, HR 27 April 2012, ECLI:NL:HR:2012:BV1301.

143 Article 6 paragraph 1 Book 7 of the Dutch Civil Code.

delivery, payments and increase of the sales price after the conclusion of the agreement.144 Deviation from these latter provisions is solely permitted by individual clause (‘individueel beding’); deviation from them in the general terms and conditions (‘de algemene voorwaarden’) is considered unreasonably onerous (‘onredelijk bezwarend’) and consequently voidable.145

Second, title 1 Book 7 of the Dutch Civil Code contains certain additional special mandatory legal provisions that exclusively apply to consumer sale. Examples include explicit provisions on the transfer of risk in case of home delivery, costs of home delivery, prepayment, the ability to dissolve the agreement (in case of a price increase or delivery after 3 months), misleading advertising by the producer, the shifting of the burden of proof and a shorter limitation period for legal claims (2 years).146

Third, article 6 paragraph 3 Book 7 of the Dutch Civil Code implemented Directive 1999/44/EC of the European Parliament and the Council of 25 May 1999 on certain aspects of the sale of consumer goods and associated guarantees147, hereafter the Consumer Sales Directive (1999) (see section 3.3). The Consumer Sales Directive aimed to harmonise those parts of consumer sale contract law that concern legal and commercial guarantees.148 Article 6 paragraph 3 Book 7 of the Dutch Civil Code explicitly determines that, if the parties choose the law of a country outside the European Economic Area (EEA) as the applicable law, consumers still retain the mandatory protection of the Consumer Sales Directive, provided the buyer has his normal residence in one of the countries of the EEA.149

144 Marco Loos, Consumentenkoop, Monografieën BW (B65b) Kluwer, 2014, p. 6. The relevant provisions are articles 12, 13 first and second sentence, 26 and 35 respectively of Book 7 of the Dutch Civil Code.

145 Ibid.

146 Respectively articles 11, 13, 26 paragraph 2, 35, 18 paragraphs 1 – 2 and 28 Book 7 of the Dutch Civil Code.

147 EUR-Lex, accessed 6 January 2019.

148 European Commission, accessed 6 January 2019.

149 Marco Loos, Consumentenkoop, Monografie Nieuw BW (B65c), Kluwer, 2004, p. 14.


Fourth, the general rules arising from Book 3 Property law (‘Vermogensrecht’) of the Dutch Civil Code and Book 6 Law of obligations (‘Verbintenissenrecht’) of the Dutch Civil Code equally apply to consumer sale, insofar as no exception has been made under title 1 Book 7 of the Dutch Civil Code.150

Fifth, the transposition of the Consumer Rights Directive 2011/83/EU of 25 October 2011 (see section 3.3) into the Dutch Civil Code has led to the introduction of a new part 2B under title 5 of Book 6 of the Dutch Civil Code with mandatory provisions, restricted to B2C contracts.151

The provisions under the new part 2B of the Dutch Civil Code relate to information duties for agreements between a professional seller or service provider and a consumer concluded on business premises, at a distance, or by means of the use of automatic call and communication systems without human intervention, faxes and electronic messages for the transmission of unsolicited communications obtained without prior consent from the subscriber or user152, hereinafter referred to as ‘spam’. These provisions include a mandatory reflection period for consumers for agreements concluded at a distance or by means of spam.153

3.2.4 The Dutch consumer sale regime

In this paragraph the existing consumer sale regime under Dutch law will be applied to chatbots. Specifically, the following elements will be discussed: information requirements and remedies, execution of the contract and remedies and, finally, passage of the risk and remedies.

150 Marco Loos, Consumentenkoop, Monografieën BW (B65b) Kluwer, 2014, p. 7.

151 Giovanni de Christofaro and Alberto de Franceschi and others, Consumer Sales in Europe After the implementation of the Consumer Right Directive, Intersentia Ltd, Cambridge, 2016, p. 128.

152 Article 230h paragraph 2m Book 6 of the Dutch Civil Code and article 11.7 of the Telecommunicatiewet (Tw).

153 Marco Loos, Consumentenkoop, Monografieën BW (B65b) Kluwer, 2014, pp. 2-4.


Information requirements and remedies

The information requirements that apply to all consumer sales transactions under Dutch law are laid down in part 2B of title 5 (articles 230g-230z Book 6 of the Dutch Civil Code). In principle, this part of the Dutch Civil Code applies mandatorily to all consumer sale contracts154 and makes a clear distinction between three types of contracts with consumers involved, namely on premises (e.g. a purchase in a shop), at a distance (e.g. a purchase online) and outside business premises (e.g. a door-to-door purchase between a trader and a consumer155).156

The information requirements concern information regarding the seller or supplier, to be provided in a clear and understandable manner for contracts concluded at a distance and outside business premises.157 The required information shall be provided to consumers in a clear, comprehensible and prominent manner prior to the placing of the actual order158, and the order must contain an explicit acknowledgement that it implies an obligation to pay, for instance by means of a correspondingly labelled button on the website.159 Finally, the information requirements concern the confirmation of the agreement, which shall be provided on a durable medium, for instance by email.160

The remedies under Dutch law for consumers consist of the right to terminate contracts within fourteen days161, without cause, from the date of receipt of the goods by the consumer or by a

154 Articles 230h paragraph 1 and 230i paragraph 1 Book 6 of the Dutch Civil Code.

155 Marco Loos, Consumentenkoop, Monografieën BW (B65b), Kluwer, 2014, p. 36.

156 Respectively articles 230g paragraph 1(g)), 230g paragraph 1(e)) and article 230g paragraph 1(f) Book 6 of the Dutch Civil Code.

157 Article 230b Book 6 of the Dutch Civil Code.

158 Article 230m Book 6 of the Dutch Civil Code.

159 Article 230v paragraph 2 Book 6 of the Dutch Civil Code.

160 Article 230v paragraph 3 Book 6 of the Dutch Civil Code.

161 Article 230p Book 6 of the Dutch Civil Code contains some exceptions, for example sales by public auctions or the supply of goods which are liable to deteriorate or expire rapidly.

third party indicated by the consumer, other than the carrier.162 Where non-compliance with article 230m Book 6 of the Dutch Civil Code (information requirements) is identified, the withdrawal period can be extended to a maximum of one year.163 Furthermore, in principle the consumer is not liable for any depreciation of the goods164 and, finally, certain additional costs may not be charged to consumers.165

Execution of the contract and remedies

The general national legal regime applicable to the execution of consumer contracts and remedies is enshrined in articles 9 and 19a Book 7 of the Dutch Civil Code respectively. Execution means that the seller must provide possession of the good(s) sold, in accordance with articles 114 and 115 Book 3 of the Dutch Civil Code (on possession).166 In the case of consumer sales, transfer of possession implies providing consumers with the ‘physical possession or control over the good(s)’.167

Under Dutch law the consumer has two remedies in case of non-execution: either termination of the agreement or a proportionate reduction of the price.168 The consumer is not entitled to have the sales contract rescinded if the lack of conformity is minor.169 These two regulatory remedies (termination and proportionate price reduction) only arise when repair and replacement are not feasible, cannot be demanded from the trader, or the trader has failed to comply with his obligations.170

162 Article 230o paragraph 1 Book 6 of the Dutch Civil Code.

163 Article 230o paragraph 2 Book 6 of the Dutch Civil Code.

164 Article 230s paragraph 3 Book 6 of the Dutch Civil Code.

165 Article 230s paragraph 5 Book 6 of the Dutch Civil Code.

166 Article 9 paragraph 2 Book 7 of the Dutch Civil Code.

167 Marco Loos, Consumentenkoop, Monografieën BW (B65b), Kluwer, 2014, p. 63.

168 Article 22 paragraph 1 Book 7 of the Dutch Civil Code.

169 Article 22 paragraph 1(a) Book 7 of the Dutch Civil Code.

170 Article 22 paragraph 2 Book 7 of the Dutch Civil Code.


Passage of the risk and remedies

According to the main rule under Dutch law171, the buyer bears the risk for the purchased good(s) from the moment of taking possession of the good(s).172 However, in the case of a consumer sale in which the good(s) are delivered by the seller or an authorised carrier, the consumer bears the risk for the purchased good(s) from the moment of actual receipt of the good(s).173 In cases where the consumer has appointed the carrier, the risk passes to the consumer at the moment the good(s) are received by the carrier.174 This rule, laid down in article 11 Book 7 of the Dutch Civil Code, is of a non-mandatory nature (‘regelend recht’), allowing the contracting parties to conclude differing agreements.175

Under national law, clauses in the trader’s general conditions of sale that are less favourable to the consumer are considered voidable. Where such a voidable clause exists, the consumer is entitled to one of two remedies: either termination of the sale-purchase agreement or a request for replacement.176 For example, if the purchased goods were already damaged during transport, the consumer does not have to accept a limited liability clause of the carrier and may either terminate the sale-purchase agreement or ask for an adequate substitute.

171 Article 10 paragraph 3 Book 7 of the Dutch Civil Code.

172 Marco Loos, Consumentenkoop, Monografieën BW (B65b), Kluwer, 2014, p. 2.

173 Article 11 paragraph 1 Book 7 of the Dutch Civil Code.

174 Article 11 paragraph 2 Book 7 of the Dutch Civil Code.

175 L.J. van Apeldoorn, Inleiding tot de Studie van het Nederlandsche recht, Zwolle, W.E.J. Tjeenk Willink, 1933, p. 90.

176 Article 233 Book 6 and article 6 paragraph 2 Book 7 of the Dutch Civil Code.


The Consumer Rights Directive has been implemented almost word-for-word into the Dutch Civil Code by the Dutch legislator.177 Since the EU Directive plays such an important role at the national level and in the creation and implementation of national provisions, the next section describes the EU framework of consumer protection for B2C contracts.

3.3 The applicable EU framework of consumer protection

This section addresses the current applicable EU framework of consumer protection for B2C contracts.

First, the E-commerce Directive 2000/31/EC (paragraph 3.3.1) will be discussed, followed by the Consumer Rights Directive 2011/83/EU (paragraph 3.3.2) and the related Unfair Contract Terms Directive (1993) and Consumer Sales Directive (1999). Finally, the end of this section is devoted to ongoing EU regulatory initiatives to enhance consumer protection, also in the context of B2B contracts, by discussing two proposals for draft Directives currently under preparation by the European legislator (paragraph 3.3.3).

3.3.1 E-commerce Directive

More than fifteen years after its publication, the E-commerce Directive 2000/31/EC178 of 8 June 2000 remains at the heart of the EU e-commerce regulatory framework.179 The E-commerce Directive 2000/31/EC, hereafter the E-commerce Directive, introduced an internal market framework for electronic commerce, aiming to provide legal certainty for business and

177 Giovanni de Christofaro and Alberto de Franceschi and others, Consumer Sales in Europe After the implementation of the Consumer Right Directive, Intersentia Ltd, Cambridge, 2016, p. 111.

178 EUR-Lex, accessed 6 January 2019.

179 Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK) p. 17.

consumers alike.180 The E-Commerce Directive provides the legal framework for online consumer transactions181 and was transposed into Dutch law in 2004.182

The purpose, scope and legal impact of this Directive will be briefly explained hereafter.

The objective of the E-Commerce Directive is to remove obstacles to cross-border online services in the European Union and to provide legal certainty to businesses and citizens in cross-border online transactions. Although the E-Commerce Directive is not specifically targeted at consumers (in contrast to, for instance, the Consumer Rights Directive)183, it sets out basic requirements on, among other things, mandatory consumer information, the steps to be followed in online contracting and rules on commercial communications (i.e. rules on online advertisement and unsolicited commercial communications).184

The relevant definitions are set out in article 2 of the E-Commerce Directive. A ‘consumer’ is defined as any natural person who is acting for purposes which are outside his or her trade, business or profession. Furthermore, a ‘service provider’ is defined as any natural or legal person providing an information society service. The notion ‘information society service’ includes any service normally provided for remuneration (e.g. the services of Netflix), at a distance (e.g. a hotel reservation via Booking.com), by electronic means (which includes

180 European Commission, accessed 6 January 2019.

181 European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis (EPRS, PE 568.322, 2015) p. 8.

182 Europa decentral, accessed 6 January 2019.

183 Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK) p. 11.

184 European Commission, accessed 6 January 2019.

all activities carried out by desktop, tablet, laptop, smartphone, etc.) and at the individual request of a recipient of services (e.g. visiting a website).185

Article 9 paragraph 1 determines that Member States shall ensure that their legal systems allow contracts to be concluded by electronic means, with the exception of certain types of contracts mentioned under article 9 paragraph 2.186

Article 10 describes the minimum information requirements: (a) the technical steps necessary to conclude the contract, (b) the filing and accessibility of the contract, (c) the identification and correction of input errors and (d) the language of the contract. These requirements do not apply if parties who are not consumers have agreed otherwise.187 The minimum information requirements must be given prior to the order being placed; once the information has been provided, the handling of the order can begin. Moreover, the information must be given ‘clearly, comprehensibly and unambiguously’.188

The aforementioned minimum information requirements can be briefly explained as follows. The first requirement (a) aims to prevent people from being contractually bound without knowing it. A filed and accessible contract (b) may give the recipient more confidence in the provider and influence his purchase. Before goods or services are ordered, it must be possible to correct input errors. The

185 Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK) pp. 6-11.

186 Among others, the following are excluded: (a) contracts that create or transfer rights in real estate (e.g. contracts involving real estate), except for rental rights; (b) contracts requiring by law the involvement of courts, public authorities or professions exercising public authority (e.g. contracts involving a notary public); (c) contracts of suretyship granted and on collateral securities furnished by consumers (e.g. contracts for financial services) and (d) contracts governed by family law (e.g. a prenuptial agreement) or by the law of succession (e.g. a testament).

187 Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK) p. 25.

188 Ibid, pp. 25-27.

recipient must be aware of this possibility (c). Finally, it must be indicated in what language the contract is concluded (d).189

In short, the E-Commerce Directive is a minimum harmonisation Directive (see next section) - leaving Member States free to maintain or adopt more consumer-friendly implementing provisions190 - that solely applies to online contracts, including contracts for the online sale of consumer goods. The minimum information rights described above for parties concluding electronic contracts are mandatorily applicable to B2C contracts.191

3.3.1.1 E-commerce Directive

In this paragraph the E-commerce Directive will be discussed. In particular, the following elements will be addressed: information requirements and remedies, execution of the contract and remedies and, finally, passage of the risk and remedies.

Information requirements and remedies

The E-Commerce Directive 2000/31/EC192 - providing the EU legal framework for online consumer transactions193 and transposed into national legislation - contains minimum information requirements for concluding online contracts, such as (1) the technical steps necessary to conclude the contract, (2) the filing and accessibility of the contract, (3) the identification and correction of input errors and (4) the language of the contract.194 These

189 Ibid.

190 European Parliament, Briefing EU Legislation in Process Contracts for online and other distance sales of goods (February 2016) p. 2.

191 Ibid, p. 4.

192 EUR-Lex, accessed 6 January 2019.

193 European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis EPRS, PE 568.322, 2015, p. 8.

194 Article 5 and 10 of the E-Commerce Directive 2000/31/EC.

information requirements are rules of mandatory law, are applicable to online B2C contracts, and should be given prior to the order being placed and provided to the consumer ‘clearly, comprehensibly and unambiguously’.195

There are two remedies under the E-commerce Directive. First, the trader is obliged to send the consumer an acknowledgement of receipt. It is important to note that, as long as this confirmation has not been received by the consumer, the latter is entitled to withdraw from the agreement.196 Second, failure to comply with the aforementioned minimum information requirements entitles the consumer either to annul the agreement or to withdraw from it.197

Execution of the contract and remedies

Under the E-Commerce Directive, the moment of execution of an online contract is determined by the placing of the order and the trader's acknowledgement of receipt of the order.198 The order and the acknowledgement of the order are deemed to be received at the moment the parties are able to access them; for example, as soon as an e-mail has arrived at a mail server, the e-mail is deemed to be received.199 Without the aforementioned acknowledgement of receipt by the trader, the consumer has the remedy of withdrawing from the agreement.200

195 Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK), pp. 25-27.

196 Article 227c paragraph 2 Book 6 of the Dutch Civil Code and article 11 of the E-Commerce Directive 2000/31/EC.

197 Article 227b Book 6 of the Dutch Civil Code.

198 Article 11 of the E-Commerce Directive 2000/31/EC and Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK), p. 28.

199 Ibid.

200 Article 227c paragraph 2 Book 6 of the Dutch Civil Code.


Passage of the risk and remedies

Under the E-commerce Directive, the risk for online contracts passes at the moment the good(s) are actually received by the consumer or when the good(s) are received by the carrier delivering the goods.201 The remedy consists of termination of the agreement without reason within fourteen calendar days (or, in case of failure to comply with the mandatory information requirements, within an extended period of up to twelve months).202 These time periods start to run after the consumer has received the good(s), given the purpose of the fourteen-day reflection period.203

3.3.2 Consumer Rights Directive

Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights204, hereafter the Consumer Rights Directive or CRD, was adopted on 23 June 2011205 in order to achieve a B2C internal market206 with a high level of consumer protection and competitive businesses.207 The intervention of the European legislator aimed to

201 Respectively article 11 paragraph 1 and 2 Book 7 of the Dutch Civil Code.

202 Article 230o paragraphs 1 and 2 Book 6 of the Dutch Civil Code.

203 Marco Loos, Consumentenkoop, Monografieën BW (B65b), Kluwer, 2014, p. 40.

204 EUR-Lex, accessed 6 January 2019.

205 Press Release Database European Commission, accessed 6 January 2019.

206 The internal market of the European Union is a single market in which the free movement of goods, services, capital and persons is assured, and in which citizens are free to live, work, study and do business. EUR-Lex, available at accessed 6 January 2019.

207 European Commission, accessed 6 January 2019 and recital 2 of the Preamble to Directive 2011/83/EU of the European Parliament and the Council on consumer rights of 25 October 2011.

fill gaps among national private law regimes, since at the time of the Consumer Sales Directive's adoption in 1999 there were no common rules on contractual conformity in the case of sales of goods.208

By adopting the new Consumer Rights Directive the rights of consumers in all 27 Member States of the EU were strengthened.209 The purpose of tailoring sales rules for consumers in the context of the internal market made it possible for the CRD to be adopted on the basis of article 114 Treaty on the Functioning of the European Union (TFEU)210.211 The CRD - applicable to contracts concluded after 13 June 2014212 and transposed into the Dutch Civil Code (see paragraph 3.2.3) - repealed and amended the following four EU Directives.

The Consumer Rights Directive repealed the Distance Selling Directive 97/7/EC of 20 May 1997213 and Directive 85/577/EEC of 20 December 1985 on protection of consumers in respect of contracts negotiated away from business premises214, also known as the Doorstep Selling Directive.215 Whilst these former Directives only provided for a minimum level of

208 Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, p. 94.

209 Press Release Database European Commission, accessed 6 January 2019.

210 The objective of article 114 TFEU is to complete the internal market. See Treaty on the Functioning of the European Union (TFEU), accessed 6 January 2019.

211 Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, p. 95 and recitals 3 and 4 of the Preamble to Directive 2011/83/EU of the European Parliament and the Council on consumer rights of 25 October 2011.

212 Article 28 of Directive 2011/83/EU of the European Parliament and of the Council on consumer rights.

213 EUR-Lex, accessed 6 January 2019.

214 EUR-Lex, accessed 6 January 2019.

215 Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, p. 33.

harmonisation of consumer protection rules, the Consumer Rights Directive is, in principle, a full harmonisation Directive216 (see hereafter).

The CRD amended Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts217, hereafter the Unfair Contract Terms Directive (1993), and Directive 1999/44/EC of the European Parliament and of the Council of 25 May 1999 on certain aspects of the sale of consumer goods and associated guarantees, hereafter the Consumer Sales Directive (1999).218 The former aims to protect consumers in the EU against unfair standard contract terms applied by traders219, while the latter intended to harmonise those parts of consumer sales contract law that concern legal and commercial guarantees, including the seller's liability for non-conformity of the object sold.220

The relevant definitions, scope, exemptions and legal impact of the CRD will be explained below.

Article 2 of the Consumer Rights Directive establishes, among others, the definitions of ‘consumer’ and ‘trader’. In parallel with the E-commerce Directive, a ‘consumer’ is defined as a natural person who, in contracts covered by this Directive, is acting for purposes which are outside his trade, business, craft or profession. A ‘trader’ is defined as any natural person or

216 European Commission, DG Justice Guidance Document concerning Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council, 2014, p. 4.

217 EUR-Lex, accessed 6 January 2019.

218 EUR-Lex, accessed 6 January 2019.

219 European Commission, accessed 6 January 2019.

220 European Commission, accessed 6 January 2019 and the European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis (EPRS, PE 568.322, 2015) p. 8.

any legal person, irrespective of whether privately or publicly owned, who is acting, including through any other person acting in his name or on his behalf, for purposes relating to his trade, business, craft or profession in relation to contracts covered by this Directive.

Article 3 paragraph 1 outlines the scope of the Directive, which applies to any B2C contract between a trader and a consumer, including contracts for the supply of water, gas, electricity or district heating, also where these commodities are provided by public providers.221 The CRD applies to different types of contracts, among them sales contracts, service contracts, distance contracts and off-premises contracts.222 Moreover, the CRD contains several specific provisions on distance and off-premises contracts, such as information requirements and formal requirements, the right of withdrawal, the execution and consequences of withdrawal, the obligations of traders and consumers and the exceptions to the right of withdrawal.223

The Consumer Rights Directive includes substantial information rules, so-called consumers’ information rights, that extend to all consumer contracts.224 These mandatory consumers’ information rights can be divided into two categories: pre-contractual information225 and post-contractual information226.

The first category covers among others respectively, (a) the main characteristics of a product or service, (b) the identity and contact details of the trader, (c) the total price of the goods or services inclusive of taxes, (d) where applicable the arrangements for payment, delivery, performance and complaint handling policy, (e) a reminder of the existence of a legal guarantee of conformity for goods, (f) the duration of the contract where applicable or if a

221 Article 3 paragraph 1 of the Consumer Rights Directive.

222 The definitions of the different types of contracts are set out in article 2 of the Consumer Rights Directive.

223 Chapter III: article 6 – 16 of the Consumer Rights Directive.

224 Catherine Barnard and Steve Peers, European Union Law, Second Edition, Oxford University Press, Oxford, 2017, p. 692 and the European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis (EPRS, PE 568.322, 2015) p. 8.

225 Articles 5 and 8 of the Consumer Rights Directive.

226 Article 8 paragraph 7 of the Consumer Rights Directive.

contract is of indeterminate duration or is to be extended automatically, the conditions for terminating the contract, (g) where applicable the functionality, including applicable technical protection measures, of the digital content and (h) where applicable any relevant interoperability of digital content with hardware and software.227

The second category consists of the aforementioned pre-contractual information, accompanied by confirmation of the concluded contract on a durable medium (e.g. by e-mail)228 within a reasonable time after the conclusion of the contract, and at the latest at the moment of delivery of the goods or before the performance of the service commences.229

However, due to specific EU rules put in place in certain sectors of industry, the Consumer Rights Directive has an extensive list of subjects that are excluded from its scope of application.230 For example, article 3 paragraph 3 excludes, among others, contracts for social services (e.g. social housing or child care), healthcare, gambling (e.g. lotteries or casino games), financial services, immovable property (e.g. the purchase of a house), the construction of new buildings (e.g. building a new house), the rental of accommodation for residential purposes (e.g. a tenancy agreement) and package holidays.

Equally excluded are contracts established by a public office-holder with a statutory obligation to be independent and impartial (e.g. notarial deeds), contracts for the supply of foodstuffs, beverages or other goods physically supplied on frequent and regular rounds to the consumer’s home, residence or workplace (e.g. the sale of consumption goods), contracts for

227 Article 5 paragraph 1 of the Consumer Rights Directive.

228 ‘durable medium’ means any instrument which enables the consumer or the trader to store information addressed personally to him in a way accessible for future reference for a period of time adequate for the purposes of the information and which allows the unchanged reproduction of the information stored (article 2(10) and Annex I of the Consumer Rights Directive).

229 Article 8 paragraph 7 of the Consumer Rights Directive.

230 European Commission, DG Justice Guidance Document concerning Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council, 2014, p. 4.

passenger transport services (e.g. airline tickets), contracts concluded by means of automatic vending machines and, ultimately, contracts concluded with telecommunications operators through public payphones for their use (e.g. through a phone booth).231

It is important to note the relationship between the Consumer Rights Directive and the two aforementioned Directives, the Unfair Contract Terms Directive (1993) and the Consumer Sales Directive (1999). Both Directives are based on minimum harmonisation and therefore allow more stringent provisions, while the CRD, according to its article 4, pursues maximum harmonisation.232

A brief note to clarify these harmonisation principles: minimum harmonisation, in a nutshell, entails that Member States that want to impose an even higher level of consumer protection may do so. Maximum or full harmonisation entails that Member States cannot go beyond the scope of the European rules in their transposition, which is considered to lead to a more compact and unitary system of consumer protection.233 As article 4 of the Consumer Rights Directive clearly states its aim to achieve a full harmonisation of EU consumer law, only a few options remain for Member States to derogate from the full harmonisation character of the CRD.234

Under the CRD there are three exceptions to the maximum harmonisation principle. First, Member States are not required to apply maximum harmonisation to contracts which involve day-to-day transactions.235 Second, Member States may maintain or introduce language requirements regarding the contractual information.236 Finally, the information

231 The majority of these excluded contacts are regulated by separate EU Directives, according to Article 3 paragraph 3 of the Consumer Rights Directive.

232 Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, p. 53.

233 Ibid, p. 18.

234 Joasia Luzak and Vanessa Mak, The Consumer Rights Directive, Amsterdam Law School Legal Studies Research Paper, Centre for the Study of European Contract Law, Working Paper Series No. 2013-01, p. 5.

235 Articles 5 paragraphs 3 and 4 of the Consumer Rights Directive.

236 Article 6 paragraph 7 of the Consumer Rights Directive.

requirements laid down in the CRD are in addition to the information requirements in Directive 2006/123/EC on services in the internal market and Directive 2000/31/EC on electronic commerce. Member States may impose additional information requirements in accordance with those Directives, and in case of a conflict between Directive 2006/123/EC or Directive 2000/31/EC and the CRD, the provisions of the CRD prevail.237

Hence, should the coexistence of the minimum harmonisation principle - applied under the Unfair Contract Terms Directive (1993) and the Consumer Sales Directive (1999) - and the full harmonisation principle - applied in the Consumer Rights Directive - cause barriers to the internal market in the form of large differences between Member States, a review of the Consumer Rights Directive could lead to a proposal from the European Commission to amend the CRD accordingly, with the objective of achieving a high, common level of consumer protection.238

On the one hand, due to the maximum harmonisation principle, the Consumer Rights Directive can be considered an important step in European consumer law and an important boost to the further development of consumer rights in Europe.239 On the other hand, there are also concerns that the consumer character of EU law is being diluted by an obsession with maximum harmonisation that threatens national traditions.240

Finally, it is important to note the legal consequences of these three aforementioned Directives for B2C contracts. The Consumer Rights Directive applies indistinctly to all consumer contracts, with certain exceptions, as indicated above.241 The Unfair Contract Terms Directive (1993) applies to all consumer contracts concluded online or offline, with only some

237 Article 6 paragraph 8 of the Consumer Rights Directive.

238 Recital 62 of Directive 2011/83/EU of the European Parliament and the Council on consumer rights.

239 Joasia Luzak and Vanessa Mak, The Consumer Rights Directive, Amsterdam Law School Legal Studies Research Paper, Centre for the Study of European Contract Law, Working Paper Series No. 2013-01, p. 18.

240 Catherine Barnard and Steve Peers, European Union Law, Second Edition, Oxford University Press, Oxford, 2017, p. 708.

241 European Parliament, Briefing EU Legislation in Process Contracts for online and other distance sales of goods (February 2016) p. 3.

exceptions242.243 Ultimately, the Consumer Sales Directive (1999) also applies to all consumer sales transactions (online and offline).244

3.3.2.1 Consumer Rights Directive

This paragraph discusses the Consumer Rights Directive, addressing in particular the following elements: information requirements and remedies, execution of the contract and remedies and, finally, passage of the risk and remedies.

Information requirements and remedies

The Consumer Rights Directive 2011/83/EU245 - aiming to achieve a B2C market with a high level of consumer protection and competitive businesses246 and transposed into the Dutch Civil Code (see paragraph 3.3.2) - contains consumers’ information rights that extend to all

242 Exemptions apply for terms by which a supplier of financial services reserves the right to terminate unilaterally a contract of indeterminate duration without notice where there is a valid reason (Annex 2(a) Unfair Terms Directive (1993)) and terms under which a supplier of financial services reserves the right to alter the rate of interest payable by the consumer or due to the latter, or the amount of other charges for financial services, without notice where there is a valid reason (Annex 2(b) Unfair Terms Directive (1993)). Moreover, excluded are transactions in transferable securities, financial instruments and other products or services where the price is linked to fluctuations in a stock exchange quotation or index or a financial market rate that the seller or supplier does not control, as well as contracts for the purchase or sale of foreign currency, traveller's cheques or international money orders denominated in foreign currency (Annex 2(c) Unfair Terms Directive (1993)).

243 Ibid, p. 4 and European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis (EPRS, PE 568.322, 2015) p. 12.

244 Ibid, p. 3 and European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis (EPRS, PE 568.322, 2015) p. 10.

245 EUR-Lex, accessed 6 January 2019.

246 European Commission, accessed 6 January 2019 and recital 2 of the Preamble to Directive 2011/83/EU of the European Parliament and the Council on consumer rights of 25 October 2011.

consumer contracts.247 These mandatory consumers’ information rights can be divided into two categories, namely pre-contractual information requirements248 and post-contractual249 information requirements.

The first category covers, among others, respectively (a) the main characteristics of a product or service, (b) the identity and contact details of the trader, (c) the total price of the goods or services inclusive of taxes, (d) where applicable the arrangements for payment, delivery, performance and complaint handling policy, (e) a reminder of the existence of a legal guarantee of conformity for goods, (f) the duration of the contract where applicable or if a contract is of indeterminate duration or is to be extended automatically, the conditions for terminating the contract, (g) where applicable the functionality, including applicable technical protection measures of the digital content and (h) where applicable relevant interoperability of digital content with hardware and software.250

The second category consists of the aforementioned pre-contractual information accompanied with the confirmation of the contract concluded on a durable medium251 (e.g. by email), to be provided within a reasonable time after the conclusion of the contract, at the latest at the moment of delivery of the goods or before the commencement of the service.252

247 Catherine Barnard and Steve Peers, European Union Law, Second Edition, Oxford University Press, Oxford, 2017, p. 692 and the European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis EPRS, PE 568.322, 2015, p. 8.

248 Articles 5 and 8 of the Consumer Rights Directive.

249 Article 8 paragraph 7 of the Consumer Rights Directive.

250 Article 5 paragraph 1 of the Consumer Rights Directive.

251 A ‘durable medium’ means any instrument which enables the consumer or the trader to store information addressed personally to him in a way accessible for future reference for a period of time adequate for the purposes of the information and which allows the unchanged reproduction of the information stored (article 2(10) and Annex I of the Consumer Rights Directive).

252 Article 8 paragraph 7 of the Consumer Rights Directive.


The remedy under the Consumer Rights Directive consists of a withdrawal period of fourteen days253 for consumers, which can be extended by up to twelve months in case the trader has not provided the consumer with the information on his or her right of withdrawal254. This remedy under the Consumer Rights Directive was transposed into national legislation in article 230o Book 6 of the Dutch Civil Code.

Execution of the contract and remedies

The Consumer Rights Directive determines that, unless the parties have agreed otherwise, the trader shall deliver the goods within thirty days from the conclusion of the contract.255 If the trader has failed to fulfil his obligation to deliver, the consumer is entitled to call upon him to perform the delivery within an additional period of time. If the trader again fails to deliver, the consumer has the remedy of terminating the contract.256 In that case, the trader shall reimburse the consumer all sums paid under the contract.257

Passage of the risk and remedies

The Consumer Rights Directive contains a specific provision on the passing of risk, which determines that, in contracts where the trader dispatches the goods to the consumer, the risk of loss of or damage to the goods passes to the consumer when he, or a third party indicated by the consumer other than the carrier, has acquired physical possession of the goods. However, the risk passes to the consumer upon delivery to the carrier if the carrier was commissioned by the consumer to carry the goods and that choice was not offered by the trader, without prejudice to the rights of the consumer against the carrier.258

253 Article 9 of the Consumer Rights Directive.

254 Article 10 of the Consumer Rights Directive.

255 Article 18 paragraph 1 of the Consumer Rights Directive.

256 Article 18 paragraph 2 of the Consumer Rights Directive.

257 Article 18 paragraph 3 of the Consumer Rights Directive.

258 Article 20 of the Consumer Rights Directive.


A brief comment on the Unfair Contract Terms Directive (1993) and the Consumer Sales Directive (1999): these Directives do not contain any specific provisions applicable to B2C contracts on information requirements and remedies, execution of the contract and remedies, or passage of the risk and remedies.

Finally, in order to gain a full and comprehensive picture of the EU framework of consumer protection for B2C contracts, the next paragraph will pay attention to two important EU legislation initiatives, intended to achieve enhanced consumer protection in this field.

3.3.3 Ongoing EU regulation initiatives on consumer protection within B2C contracts

As indicated, this paragraph looks forward and offers a few words on two ongoing EU legislative initiatives - currently under preparation by the European Commission - with enhanced consumer protection objectives relevant to B2C contracts. These two legislative initiatives, announced by the European Commission in 2015, aim to harmonise the rules for the supply of digital content and the online sales of goods in B2C contracts.259

Earlier, in October 2011, the European Commission submitted a proposal for a Regulation on a Common European Sales Law.260 This Common European Sales Law or CESL261 would have been introduced into the national legal systems of the Member States of the European Union as a ‘second national system of contract law’, to be applied if the parties to a contract so choose. The full harmonisation approach (see paragraph 3.3.2) thus seems to have been abandoned and replaced by an optional instrument of contract law.262

259 European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the supply of digital content (COM, 2015, 634 final, 2015/0287, COD) p. 2.

260 Marco Loos, The Regulation of Digital Content B2C Contracts in CESL, Amsterdam Law School Research, Paper Centre for the Study of European Contract Law, Working Paper Series No. 2013-10, p. 3.

261 EUR-Lex, accessed 6 January 2019.

262 Marco Loos, The Regulation of Digital Content B2C Contracts in CESL, Amsterdam Law School Research, Paper Centre for the Study of European Contract Law, Working Paper Series No. 2013-10, p. 3.


The proposal for the CESL incorporated the existing acquis communautaire263, in particular the Consumer Rights Directive, the Unfair Contract Terms Directive (1993) and the Consumer Sales Directive (1999), and would sometimes offer additional protection to consumers.264 However, the CESL remained stuck in the EU legislative procedure, was finally withdrawn in September 2015265 and has been succeeded by two draft proposals for new Directives: one on certain aspects concerning contracts for the supply of digital content, hereafter the draft Digital Content Directive, and another on certain aspects concerning contracts for the online and other distance sales of goods, hereafter the draft Online Sales Directive.266

The draft proposals for the new Directives are part of the Digital Single Market (DSM) strategy for Europe, announced by the European Commission in May 2015267, and contain a targeted, fully harmonised set of rules.268 The aim of the DSM strategy is to create a digital single market where the free movement of goods, services, capital and data is guaranteed and where citizens and businesses can seamlessly and fairly access online goods and services whatever their nationality and wherever they live.269 The general objective of these proposals

263 This is a French term referring to the legal order of the European Union. It is the cumulative body of European Union legislation consisting of primary (treaties and protocols) and secondary legislation (regulations, directives and decisions) and the case law of the European Court of Justice. European Commission, accessed 6 January 2019.

264 Ibid.

265 European Parliament, accessed 6 January 2019.

266 European Law Institute (ELI), Statement of the European Law Institute, on the European Commission’s Proposed Directive on the Supply of Digital Content to Consumers, (COM, 2015, 634 final) p. 1.

267 Press release of 6 May 2015 from the European Commission. European Parliament, accessed 6 January 2019.

268 European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the online and other distance sales of goods (COM, 2015, 635 final, 2015/0288 COD) p. 2.

269 European Commission, accessed 6 January 2019.

is to contribute to faster growth of the opportunities offered by creating a true Digital Single Market, to the benefit of both consumers and businesses.270

The first, the draft Digital Content Directive, was initiated as a result of the unclear position of digital content271 due to the absence of contractual rights for specific defective digital content such as an app or a film.272 This draft Digital Content Directive, based on article 114 TFEU273 and presented by the European Commission on 9 December 2015, aims at filling the current legal gap in the consumer acquis at EU level regarding certain contractual aspects for which there are currently no rules274 and also supplements the E-Commerce Directive275 (see paragraph 3.3.1).

The draft Digital Content Directive concerns B2C contracts for the supply of digital content and covers data produced and supplied in digital form (e.g. music, online video), services allowing for the creation, processing or storage of data in digital form (e.g. cloud storage), services allowing for the sharing of data (e.g. Facebook, YouTube) and any durable medium used exclusively as a carrier of digital content (e.g. DVDs).276

270 European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the online and other distance sales of goods (COM, 2015, 635 final, 2015/0288 COD) p. 2.

271 Catherine Barnard and Steve Peers, European Union Law, Second Edition, Oxford University Press, Oxford, 2017, p. 700.

272 Eerste Kamer der Staten-Generaal, accessed 6 January 2019.

273 EUR-Lex, accessed 6 January 2019.

274 European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the supply of digital content (COM, 2015, 634 final, 2015/0287, COD) p. 3.

275 Ibid.

276 Press release of 8 June 2017 from the Council of the European Union: New rules for contracts for the supply of digital content – Council adopts its position. Council of the European Union, accessed 6 January 2019.


The second, the (amended) draft Online Sales Directive, is equally based on article 114 TFEU277 and concerns contracts for online and other distance sales of goods. Further, it intends to entirely replace the existing Consumer Sales Directive (1999) (see paragraph 3.3.2) by including offline (face-to-face) sales within its scope.278 The draft Online Sales Directive takes the rules of the Consumer Sales Directive (1999) as a basis and equally supplements the E-Commerce Directive279 (see paragraph 3.3.1).

However, the draft Online Sales Directive, in contrast to the currently applicable Consumer Sales Directive (1999), provides for maximum instead of minimum harmonisation (see for this notion paragraph 3.3.2), effectively barring Member States from introducing or maintaining more consumer-friendly rules.280 This means that, when implementing the Directive into domestic law, Member States will be obliged to offer exactly the same level of consumer protection as that envisaged by the Directive.281

In sum, the draft Digital Content Directive aims to define harmonised rules concerning the supply of digital content, while the draft Online Sales Directive aims to harmonise these rules with regard to online sales of goods.282 For completeness, the latest status of these two draft proposals will be discussed below.

277 EUR-Lex, accessed 6 January 2019.

278 European Parliament, Briefing EU Legislation in Process on Consumer sale of goods (March 2018) pp. 1-2.

279 European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the online and other distance sales of goods (COM, 2015, 635 final, 2015/0288 COD), pp. 2–4.

280 Ibid.

281 Ibid.

282 Eerste Kamer der Staten-Generaal, accessed 6 January 2019.


The expected next step for the draft Digital Content Directive is negotiations with the European Parliament, as part of the trilogue negotiations283.284 These negotiations were expected to take place during the autumn of 2018.285 The expected next step for the draft Online Sales Directive is equally the legislative phase of trilogue negotiations.286 Its most recent status was a policy debate with the responsible ministers, organised by the Council of the European Union, in May 2018.287

In the next chapter the potential regulatory challenges for chatbots will be extensively examined.

283 Negotiations between the institutions on legislative proposals generally take the form of tripartite meetings ('trilogues') between Parliament, the Council and the Commission. For a given file, each institution designates its negotiators and defines its negotiating mandate. European Parliament, accessed 6 January 2019.

284 European Parliament, Briefing EU Legislation in Process on Contracts for the supply of digital content and digital services (February 2018) p. 1.

285 Press release of 8 June 2017 from the Council of the European Union: New rules for contracts for the supply of digital content – Council adopts its position. Council of the European Union, accessed 6 January 2019.

286 European Parliament, Briefing EU Legislation in Process on Consumer sale of goods (March 2018) p. 1.

287 Council of the European Union, accessed 6 January 2019.


3.4 Conclusions

This chapter answered the second sub-question of this research: what is the regulatory framework for B2C contracts in the Netherlands and EU for consumer protection?

As we saw, B2C contracts in the Netherlands are regulated by title 1 of Book 7 of the Dutch Civil Code, known as the consumer sale title. Many provisions under this title have a mandatory legal character when consumers are involved and, moreover, special provisions were introduced applying solely to this consumer sale regime. Furthermore, the general rules from Book 3 (Property law) and Book 6 (Law of obligations) of the Dutch Civil Code apply equally, provided that no exception has been made under the consumer sale title of Book 7 of the Dutch Civil Code.

Additionally, the Consumer Sales Directive (1999), intended to harmonise those parts of consumer sale contract law concerning legal and commercial guarantees, and the Consumer Rights Directive, dating from 2011, were both transposed into the Dutch Civil Code. The first Directive was implemented under the consumer sale title, while the latter Directive led to a new part 2B under title 5 of Book 6 of the Dutch Civil Code, containing mandatory provisions restricted to B2C contracts.

As we saw, the EU framework of consumer protection for B2C contracts consists of four Directives: the E-commerce Directive, the Unfair Terms Directive (1993), the Consumer Sales Directive (1999) and the Consumer Rights Directive.

The E-commerce Directive - providing the legal framework for online consumer transactions - solely applies to online contracts. The Unfair Terms Directive (1993) - containing provisions to protect consumers in the EU against unfair standard contract terms used by traders - applies to all business-to-consumer contracts, online or offline. The Consumer Sales Directive (1999) - concerning legal and commercial guarantees, including the seller’s liability for non-conformity - applies to all consumer sales transactions (online or offline). Finally, the Consumer Rights Directive - including consumers’ information rights - equally applies, subject to exceptions, to any contract between a trader and a consumer.


In closing, the aforementioned four Directives should not be dissociated from two ongoing EU legislative initiatives, under preparation by the European Commission since 2015 and aiming to further enhance consumer protection within B2C contracts: one draft proposal for a new Directive on digital content and another on online sales.


Bibliography

1. Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK) pp. 17-58.

2. Asser/Arthur Severijn Hartkamp and Carla Sieburgh, 6-III Algemeen overeenkomsten recht, Verbintenissenrecht 2010, nr. 323.

3. Caroline Cauffman, Kroniek consumentenkoop 2014-2015, Tijdschrift voor Consumentenrecht en handelspraktijk (5) 2016, pp. 211-223.

4. Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, pp. 18-33, 53-66, 94-125, 234-238.

5. Catherine Barnard and Steve Peers, European Union Law, Second Edition, Oxford University Press, Oxford, 2017, pp. 687-709.

6. David Edward and Robert Lane, European Union Law, Edward Elgar, Cheltenham and Northampton, 2013.

7. Dominika Bezáková, The Consumer Rights Directive and its Implications for Consumer Protection regarding intangible Digital Content, Masaryk University Journal of Law and Technology (7) 2013, pp. 177-191.

8. Esther Arroyo Amayuelas, La Propuesta de Directiva relativa a determinados aspectos de los contratos de compraventa en línea y otras ventas de bienes a distancia, InDret Revista para el Analisis del Derecho (3) 2016, pp. 1-33.

9. European Commission, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee: a new Deal for Consumers (COM, 2018, 183 final) pp. 1–17.


10. European Commission, DG Justice Guidance Document concerning Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council, 2014, pp. 1-79.

11. European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the supply of digital content (COM, 2015, 634 final, 2015/0287, COD) pp. 1-33.

12. European Commission, Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on certain aspects concerning contracts for the online and other distance sales of goods (COM, 2015, 635 final, 2015/0288 COD), pp. 1-31.

13. European Law Institute (ELI), Statement of the European Law Institute, on the European Commission’s Proposed Directive on the Supply of Digital Content to Consumers, (COM, 2015, 634 final) pp. 1–72.

14. European Parliament, Briefing EU Legislation in Process Contracts for online and other distance sales of goods (February 2016) pp. 1-11.

15. European Parliament, Briefing EU Legislation in Process on Contracts for the supply of digital content and digital services (February 2018) pp. 1-12.

16. European Parliament, Briefing EU Legislation in Process on Consumer sale of goods (March 2018) pp. 1-12.

17. European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis (EPRS, PE 568.322, 2015) pp. 1-29.

18. Ewoud Hondius and G.J. Rijken, Handboek Consumentenrecht, Uitgeverij Paris, Zutphen, 2015.


19. Geoffrey Woodroffe and Robert Lowe, Consumer Law and Practice, Eighth Edition, Sweet & Maxwell, London, 2010.

20. Giovanni De Cristofaro and Alberto De Franceschi and others, Consumer Sales in Europe: After the implementation of the Consumer Rights Directive, Intersentia Ltd, Cambridge, 2016, pp. 109-130.

21. Jan van Beckum and Gert-Jan Vlasveld, Contractmanagement voor opdrachtgever en leverancier, Van Haren Publishing 2014, pp. 1-237.

22. Joasia Luzak and Vanessa Mak, The Consumer Rights Directive, Amsterdam Law School Legal Studies Research Paper, Centre for the Study of European Contract Law, Working Paper Series No. 2013-01, pp. 1-20.

23. Kamerstukken 34071 Wijziging van de Boeken 6 en 7 van het Burgerlijk Wetboek, in verband met verduidelijking van het toepassingsbereik van de koopregels van titel 7.1 BW.

24. Keven Davis and Gemma Lobb, Draft Directives on the online sale of digital content and tangible goods, Department for Business Innovation & Skills, London, 2016, pp. 1-11.

25. Kitty Lieverse and Jac Rinkes, Oneerlijke handelspraktijken en handhaving van consumentenbescherming in de financiële sector, Deel 106, Kluwer, Deventer, 2010, pp. 139- 157.

26. Marc Loth, Dwingend en aanvullend recht, Monografieën BW (A19), Kluwer 2009, pp. 2- 12.

27. Marco Loos, Consumentenkoop, Monografie Nieuw BW (B65c), Kluwer, 2004, pp. 1-112.

28. Marco Loos, Consumentenkoop, Monografieën BW (B65b), Kluwer, 2014, pp. 1-142.

29. Marco Loos, De koopregeling in het voorstel voor een richtlijn consumentenrechten, Studiekring ‘Prof. Mr. J. Offerhaus’, nieuwe reeks nr. 12, Kluwer, Deventer, 2009.


30. Marco Loos, Flow charts, Introduction of the new Section 6.5.2B (Provisions for agreements between traders and consumers), 2014, pp. 1-11.

31. Marco Loos, The Regulation of Digital Content B2C Contracts in CESL, Amsterdam Law School Research, Paper Centre for the Study of European Contract Law, Working Paper Series No. 2013-10, pp. 1–22.

32. Martijn Hesselink and Marco Loos and others, Het voorstel voor een Europese richtlijn consumentenrechten: een Nederlands perspectief, Centre for the Study of European Contract Law, Boom Juridische uitgevers, Den Haag, 2009, pp. 1-20.

33. Milà Rafel R., 'Intercambios digitales en Europa: las propuestas de directiva sobre compraventa en línea y suministro de contenidos digitales', Revista CESCO de Derecho de Consumo (17) 2016, pp. 11-44.

34. Natali Helberger and Marco Loos and others, Digital Content Contracts for Consumers, Journal of Consumer Policy (36) 2016, pp. 37-57.

35. René Barents and Laurens Jan Brinkhorst, Grondlijnen van Europees recht, 13-de druk Kluwer, Deventer, 2012, pp. 645-646, 755-757.

36. Silvia Zorzetto, The Lex Specialis Principle and its Uses in Legal Argumentation: An Analytical Inquire, Eunomia. Revista en Cultura de la Legalidad (3) 2013, pp. 61-87.

37. Sjef van Erp, Magna Charta: Leergang Contractenrecht, 2016, pp. 1-443.

38. Willem van Boom, Handhaving consumentenbescherming: Een toelichting op de Wet handhaving consumentenbescherming, 2010.

39. Willem van Boom and Rob Kottenhagen, De Richtlijn oneerlijke bedingen en haar plaats in het Nederlandse recht, 2014, pp. 1-23.


Chapter IV Potential regulatory challenges for chatbots

4.1 Introduction

This chapter answers the third and last sub-question of this research: what are the potential regulatory challenges deriving from the introduction of chatbots in the B2C relationship, within the existing legislative framework?

Section 4.2 examines how the existing legal framework for B2C contracts in the Netherlands and the EU can be applied to chatbots. Section 4.3 provides an overview of potential practical challenges associated with the use of chatbots in B2C contracts and takes a perspective on their future use. Finally, the findings will be summarised in section 4.4 of this chapter.

4.2 How the existing legal framework for B2C contracts in the Netherlands and EU can be applied to chatbots

Chapter 3 discussed the regulatory framework for consumer protection, consisting of the Dutch Civil Code and four EU Directives - all implemented in Dutch law - respectively the E-Commerce Directive, the Consumer Rights Directive, the Unfair Terms Directive (1993) and finally the Consumer Sales Directive (1999). For research purposes, the previous chapter focussed on three specific elements within the applicable Dutch and EU regulatory framework for B2C contracts, namely the information requirements, the execution of the contract and the passage of risk, together with the corresponding remedies.

As outlined in chapter 3, under the existing legal framework for B2C contracts in the Netherlands and the EU, there are presently no specific legal provisions in place governing chatbots that make use of AI (see section 2.3). Therefore, this section now turns to the question of how the existing legal framework for B2C contracts in the Netherlands and the EU actually responds to chatbots.

This section will illustrate possible scenarios deriving from the application of the existing legal framework for B2C contracts to a sale-purchase transaction conducted by a chatbot application, namely Amazon Lex (paragraph 4.2.1).


4.2.1 The current legal framework for B2C contracts applied to a sale-purchase scenario carried out by a chatbot

In this paragraph the existing legal framework for B2C contracts in the Netherlands and the EU will be applied to chatbots, using as a starting point a fictitious example of an online sale-purchase scenario conducted by Amazon Lex288, a chatbot application developed by retailer Amazon289.

As a start, suppose Consumer A searches for a legal handbook and needs some additional support to identify the legal handbook he desires. Consumer A browses to Amazon’s website and asks Amazon Lex for assistance. At the end of the conversation with Amazon Lex the consumer decides to order a legal handbook and have the parcel with the handbook delivered to his home address, through an official carrier appointed by Amazon.

The following steps describe the process of the sale-purchase transaction in detail, in accordance with Dutch and EU consumer legislation. The possible ending of this scenario could take place either in an email environment (via email) or in a chatbot environment (via a chat window), as clarified below.

Steps one to six describe the sale-purchase scenario in detail, as carried out by the chatbot, and subsequently connect each step to the applicable legal framework.

Step 1 placing the order

Consumer A wishes to consult the chatbot on Amazon`s website and starts a conversation by clicking on the chatbot logo on the website. Amazon Lex, hereafter the chatbot, pops up on the screen, identifies itself as Amazon`s digital assistant and asks how it can be of assistance. Consumer A responds that information on legal handbooks is desired. The chatbot replies with some additional questions to specify what kind of legal handbook is requested and suggests some scholars. The consumer replies which scholars he is interested in

288 AWS Amazon Lex, accessed 6 January 2019.

289 AWS, accessed 6 January 2019.

and subsequently the chatbot provides some relevant offerings of available legal handbooks corresponding to consumer A`s criteria. Consumer A makes the final choice and notifies the chatbot which legal handbook he wishes to purchase.

Amazon Lex must provide the consumer with the mandatory information required for online B2C transactions, as set out under Dutch law, the E-commerce Directive and the Consumer Rights Directive.

Step 2 payment inside the chatbot environment

Subsequently, the chatbot requests the consumer to proceed with the payment for the legal handbook, by initiating the payment procedure in accordance with the customer`s preselected payment method(s), as set out in his or her personal Amazon payment settings. All this takes place inside the chatbot environment.

The consumer is obliged to pay the purchase price for the legal handbook, in line with Dutch legislation290.

Step 3 acknowledgment

The chatbot receives the order for the legal handbook, verifies its availability in Amazon’s warehouse (probably via an in-house Enterprise Resource Planning (ERP) system291) and, assuming its availability, subsequently acknowledges receipt of the consumer`s order.

The execution moment of the online B2C sale-purchase agreement is determined after the placing of the order by the consumer and once the chatbot has acknowledged receipt of the consumer’s order, in accordance with Dutch law and the E-Commerce Directive.

There are two possible variants of the ‘step 3’ scenario: the first, acknowledgement of the order by the chatbot in an email environment (‘step 3a’), or the second, acknowledgement of

290 Article 26 paragraph 1 Book 7 of the Dutch Civil Code.

291 Enterprise resource planning is the management of all the information and resources involved in a company's operations by means of an integrated computer system. Oxford Dictionaries, accessed 6 January 2019.

the order by the chatbot in a chatbot environment (‘step 3b’).

Variant step 3a acknowledgement by the chatbot inside the email environment

The order and the acknowledgement are deemed to be received at the moment the parties are able to access them. Thus, assuming the acknowledgement is conducted by the chatbot in an email environment, once the chatbot`s email acknowledging the order has arrived at the consumer`s mail server, the email is deemed to be received by the consumer.

Variant step 3b acknowledgement by the chatbot inside the chatbot environment

In this variant, parties are able to access both the order and acknowledgement immediately inside the chatbot environment, which implies that the implementation of the contract has started.

Step 4 delivery of the order

The chatbot has the order shipped to the relevant Amazon warehouse and from there delivered to the consumer at his or her home address, via a carrier - appointed by Amazon - responsible for the delivery from the specific warehouse to the customer`s home address.

The legal handbook is due to be delivered by Amazon to the consumer within a period of thirty days, calculated from the execution moment (step 3), in accordance with Dutch law and the Consumer Rights Directive.

Step 5 passage of the risk in case of a loss

Under Amazon’s general terms and conditions, hereafter T&C, the passage of the risk from Amazon to the consumer in case of a loss, occurs upon delivery from the warehouse to the carrier, who is responsible for the delivery of the parcel and appointed by Amazon.292

292 Amazon, Conditions of Use (last updated: May 21, 2018), accessed 6 January 2019.


In accordance with the main rule under Dutch law and the Consumer Rights Directive, the passage of risk from Amazon to the consumer occurs once the consumer has received the parcel with the legal handbook from the carrier. However, the contracting parties may agree otherwise, which is the case in this scenario, as indicated in Amazon`s Conditions of Use.

Step 6 remedies in case of withdrawal

Suppose, after receiving the legal handbook at his or her home address, the consumer would decide to withdraw from the contract and return the legal handbook to Amazon by mail. In line with Amazon`s Conditions of Use the consumer is entitled to return items within 30 days of delivery for a full refund.293

The following EU and Dutch regulation applies in this situation. In accordance with the E-Commerce Directive regime, as long as the chatbot has not acknowledged receipt of the online order for the handbook (step 3), the consumer is entitled to cancel the agreement. The same applies in case of a potential failure of the chatbot to provide the minimum mandatory information.

Furthermore, the consumer has a regular withdrawal period of fourteen days, without having to give a reason, which can be extended up to twelve months in case the chatbot failed to notify the consumer of his or her right of withdrawal, in accordance with the Consumer Rights Directive.

The Dutch legislation on remedies is consistent with the aforementioned EU rules on remedies under the E-commerce Directive and the Consumer Rights Directive.
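Purely as an illustration (and not as legal advice), the six steps above, together with the delivery and withdrawal periods discussed in this section, can be sketched as a minimal state machine in Python. All names are hypothetical, the code bears no relation to the actual Amazon Lex API, and the twelve-month extension is simplified to 365 days:

```python
from datetime import date, timedelta

# Hypothetical sketch of the six-step B2C transaction described above.
STEPS = ["order_placed", "paid", "acknowledged", "delivered", "withdrawn"]

def delivery_deadline(execution_moment: date) -> date:
    """The goods are due within thirty days of the execution moment (step 3)."""
    return execution_moment + timedelta(days=30)

def withdrawal_deadline(delivery: date, consumer_informed: bool) -> date:
    """Regular 14-day withdrawal period; extended where the chatbot failed
    to notify the consumer of the right of withdrawal (simplified here to
    an extra 365 days instead of calendar months)."""
    period = timedelta(days=14)
    if not consumer_informed:
        period += timedelta(days=365)
    return delivery + period

# Example: order acknowledged on 7 January 2019, delivered a week later.
execution = date(2019, 1, 7)
delivered = date(2019, 1, 14)
print(delivery_deadline(execution))          # latest permissible delivery date
print(withdrawal_deadline(delivered, True))  # consumer duly informed
print(withdrawal_deadline(delivered, False)) # extended withdrawal period
```

The sketch only makes the deadlines discussed in steps 4 and 6 computable; the legal qualification of each step remains as set out in the text above.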

The next section will discuss some practical challenges associated with the use of chatbots.

293 Amazon, Conditions of Use (last updated: May 21, 2018), Returns Center, accessed 6 January 2019.


4.3 Potential practical challenges associated with the use of chatbots in B2C contracts

This section begins by assessing some potential practical challenges with chatbots (paragraph 4.3.1) and is followed by a perspective on their use in B2C contracts (paragraph 4.3.2).

4.3.1 Potential practical challenges

It has been shown that - under the existing regulatory framework for B2C contracts in the Netherlands and EU – no specific legal mechanism is currently in place to protect consumers’ interests against potential misuse of chatbots programmed on the basis of AI.

Since chatbots can also be used for malicious purposes, consumers need to be extremely careful - in particular for security reasons - about the websites they visit and the systems they interact with.294 For example, talking with a malicious chatbot can be as dangerous as entering one's credit card details into a phishing website.295

In addition, as a result of the maximum harmonisation principle (see paragraph 3.3.2), Member States cannot go beyond the regulatory scope of the current European rules on consumer protection when transposing Directives.296

Moreover, the earlier described ‘black box’ problem (see paragraph 1.1.2) could lead to uncertainty in the outcomes associated with the use of AI chatbots. This problem, posing severe technical challenges and entailing transparency risks for consumers due to potentially unverifiable outcomes, will be explained hereafter.

The ‘black box’ problem - concerning the nature of the algorithms and datasets supporting these systems297 - is related to the way Artificial Intelligence systems are trained, namely via backpropagation, which results in different weight matrices after every training run. Thus, it is presently not fully possible to technically monitor the progress of AI learning over time298, since the algorithm finds the rules itself without leaving an audit trail to explain its decision299.

294 Panda Security, ‘Chatbots and AI - are they dangerous?’ (2018) <https://www.pandasecurity.com/mediacenter/technology/chatbots-ai-dangerous/> accessed 6 January 2019.

295 Ibid.

296 Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, p. 18.

297 Adam Thierer et al., Artificial Intelligence and Public Policy, Mercatus Center, George Mason University, 2017, p. 31.

In other words, what goes into the ‘black box’ and what comes out of it can be retrieved, but far less obvious are the considerations which underlie the outcome.300 For example, in the case of an erroneous decision taken by a chatbot regarding personalised financial advice for consumers (e.g. a mortgage application), the manner in which that particular conclusion was reached by the chatbot using AI can, as a result of the black box problem, even be unknown to the business itself.301
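The audit-trail point can be made concrete with a toy sketch (plain Python, invented data, no real chatbot or credit-scoring system involved). It trains a single artificial neuron by gradient descent; afterwards, the only ‘explanation’ the model can offer for a decision is a set of learned numeric weights:

```python
import math
import random

def sigmoid(z: float) -> float:
    """Squash a raw score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

# Tiny invented dataset: (income, debt) -> mortgage granted (1) or refused (0).
data = [((3.0, 0.5), 1), ((2.5, 0.4), 1), ((1.0, 2.0), 0), ((0.8, 1.5), 0)]

random.seed(42)
w1, w2, b = random.random(), random.random(), 0.0

# Gradient-descent training: the loop adjusts the weights step by step,
# but nothing records *why* each weight ends up at its final value.
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        err = p - y                      # gradient of the log-loss
        w1 -= 0.1 * err * x1
        w2 -= 0.1 * err * x2
        b -= 0.1 * err

# The outcome of a 'mortgage application' can be observed...
granted = sigmoid(w1 * 3.0 + w2 * 0.5 + b) > 0.5
# ...but the underlying 'considerations' are just three opaque numbers.
print(granted, (w1, w2, b))
```

The point is the asymmetry described in the text above: the inputs and the output are retrievable, while the intermediate ‘reasoning’ is stored nowhere in a human-readable form.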

Finally, as described in chapter 1, important ethical challenges need to be taken into account concerning the application of algorithms and chatbots (see paragraph 1.1.2). Especially in a business environment, chatbots should embed human values to avoid breaching human rights and creating bias. Ethical transparency into the algorithms and data that drive the chatbot`s behaviour is important to engender trust; otherwise market uptake may be impeded.302

Given the current lack of consumer protection regulation for chatbots in B2C contracts, the regulatory restrictions with regard to potentially enhanced consumer protection as a result of the maximum harmonisation principle under the Consumer Rights Directive, the risks of a lack of transparency caused by the ‘black box’ problem and the important ethical concerns

298 Théo Szymkowiak, ‘The Artificial Intelligence Blackbox Problem & Ethics’ (2017) < https://medium.com/mcgill-artificial-intelligence-review/the-artificial-intelligence-black-box-problem-ethics- 8689be267859> accessed 6 January 2019.

299 MIT Technology Review, accessed 6 January 2019.

300 European University Institute (EUI), (Working Papers, LAW 2018/11), Consumer Law and artificial intelligence Challenges to the EU consumer law and policy stemming from the business’ use of artificial intelligence, Department of Law, Final report of the ARTSY project, pp. 20-21.

301 Jarno Duursma, ‘Kunnen we kunstmatige intelligentie wel vertrouwen? Het Blackbox probleem’ (2017) < https://nl.linkedin.com/pulse/kunnen-we-kunstmatige-intelligentie-wel-vertrouwen-het-duursma-> accessed 6 January 2019.

302 Maya Medeiros, ‘Chatbots gone wild? Some ethical considerations’(30 October 2017), accessed January 2019.

associated, potential regulatory challenges for chatbots may arise.

In this light, the next paragraph takes a perspective on the use of chatbots in future B2C contracts as a possible way forward.

4.3.2 Perspective on the use of chatbots in future B2C contracts

Thinking about the future of chatbots in B2C contracts - given the total lack of legislation in the field of chatbots, the regulatory limitations due to the maximum harmonisation principle, and the potential risks of opacity and the important ethical concerns involved - additional measures for chatbots in B2C contracts need to be designed, aiming to safeguard consumers` interests.

In this perspective, in my opinion, consumers` interests could be safeguarded in two ways. First, by introducing a mandatory identification requirement for chatbots, aiming to inform consumers in advance and make them explicitly aware that a chatbot is involved. Second, since chatbots are a product of humankind, by ultimately introducing the possibility of an overruling human intervention in those cases in which the chatbot would generate unverifiable outcomes or where consumers would be faced with unforeseeable consequences.

Therefore, looking forward and rethinking the limitations of the current legal framework, the next and final chapter will address these two recommendations in the form of an alternative regulatory approach, inspired by certain provisions of the General Data Protection Regulation (EU) 2016/679 of 27 April 2016303, applied to the future use of chatbots in a B2C environment.

303 EUR-Lex, accessed 6 January 2019.


4.4 Conclusions

This chapter answered the third sub-question of this research: what are the potential regulatory challenges with the introduction of chatbots in the B2C relationship, within the existing legislative framework?

The current legal framework for B2C contracts was applied to a sale-purchase scenario involving a consumer and carried out by a chatbot application, namely Amazon Lex. The steps involved, from placing the order until the remedies in case of withdrawal, were described in detail and, one at a time and where relevant, connected to the legal framework.

As shown, given the current lack of consumer protection regulation for chatbots in B2C contracts, the regulatory restrictions with regard to potentially enhanced consumer protection as a result of the maximum harmonisation principle under the Consumer Rights Directive, the risks of a lack of transparency caused by the ‘black box’ problem and the important ethical challenges associated, potential regulatory challenges for chatbots may arise.

In this perspective, consumers` interests could be further safeguarded in two ways. First, by the introduction of a mandatory identification requirement for chatbots, aiming to inform consumers in advance and make them explicitly aware that a chatbot is involved. Second, since chatbots are a product of humankind, by ultimately enabling the possibility of an overruling human intervention, especially in those cases in which the chatbot would generate unverifiable outcomes or where consumers would be faced with unforeseeable consequences.


Bibliography

1. Adam Thierer et al., Artificial Intelligence and Public Policy, Mercatus Center, George Mason University, 2017, pp. 1-57.

2. Amazon`s Conditions of Use (Last updated: May, 21 2018), accessed 6 January 2019.

3. Arno Lodder, Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market, in Arno Lodder and Andrew Murray and others, EU Regulation of E-commerce: A Commentary (Chapter 2), Elgar Commentaries, Edward Elgar Publishing Limited, Cheltenham (UK) pp. 1-39.

4. Arnoud Engelfriet, De wet op Internet, edities 2017/2018, ICT recht, IUS Mentis B.V., Amsterdam, 2017, pp. 1-146.

5. Article 29 Data Protection Working Party, 17/EN WP 251, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 WP 251, 2017, pp. 1-34.

6. Bayan Abu Shawar and Eric Atwell, Chatbots, are they Really Useful? LDV-Forum 22(1) 2007, pp. 1-22.

7. Bayan Abu Shawar and Eric Atwell, Chatbots: Are they Really Useful?, LDV-Forum 22(1) 2007, pp. 29-49.

8. Cătălina Goanţă, Convergence in European Consumer Sales Law A Comparative and Numerical Approach, Intersentia Ltd., Cambridge, 2016, p. 18.

9. Catherine Barnard and Steve Peers, European Union Law, Second Edition, Oxford University Press, Oxford, 2017, p. 692.


10. Christos Giakoumopoulos et al., Handbook on European data protection law, edition 2018, European Union Agency for Fundamental Rights (FRA) and Council of Europe, Publication Office of the European Union, Imprimerie Centrale, Luxembourg, 2018, pp. 1- 402.

11. David Edward and Robert Lane, European Union Law, Edward Elgar, Cheltenham and Northampton, 2013.

12. European Commission, Communication European standards for the 21st century, COM 358 final, 2016, pp. 1-12.

13. Europese Commissie, DG Justitie, DG Justitie - Leidraad, betreffende Richtlijn 2011/83/EU van het Europees Parlement en de Raad van 25 oktober 2011 betreffende consumentenrechten, tot wijziging van Richtlijn 93/13/EEG van de Raad en van Richtlijn 1999/44/EG van het Europees Parlement en de Raad en tot intrekking van Richtlijn 85/577/EEG en van Richtlijn 97/7/EG van het Europees Parlement en de Raad, 2014, pp. 1- 91.

14. European Parliament, Briefing EU Legislation in Process Contracts for online and other distance sales of goods, 2016, pp. 3-4.

15. European Parliament, Contract Law and the Digital Single Market, Toward a new EU online consumer sales law, in-depth-analysis EPRS, PE 568.322, 2015, pp. 8-12.

16. European Parliament, Environmental Policy: General Principles and Basic Framework, pp. 1-4.

17. European University Institute (EUI), (Working Papers, LAW 2018/11), Consumer Law and artificial intelligence Challenges to the EU consumer law and policy stemming from the business’ use of artificial intelligence, Department of Law, Final report of the ARTSY project, pp. 20-21.


18. Gianclaudio Malgieri and Giovanni Comandé, International Data Privacy Law, Volume 11 Issue 4, 1 November 2017, Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation, pp. 243–265.

19. Jan Spanjaard, Rechtskeuzebedingen in consumentenovereenkomsten: spitsroeden lopen, Actualia contractspraktijk, Boom juridisch, Contracteren 2016 (4), pp. 118-121.

20. Kamerstukken II 2012/13, 33 520, nummer 3, 2012, pp. 17-18.

21. L.J. van Apeldoorn, Inleiding tot de Studie van het Nederlandsche recht, Zwolle, W.E.J. Tjeenk Willink, 1933, p. 90.

22. Marco Loos, Consumentenkoop, Monografieën BW (B65b), Kluwer, 2014, pp. 1-112.

23. Nicole Radziwill and Morgan Benton, Evaluating Quality of Chatbots and Intelligent Conversational Agents, 2017, pp. 1-21.

24. Paulius Cerka et al., Liability for damages caused by Artificial Intelligence, Computer Law & Security Review (31) 2015, pp. 376-389.

25. R. Rojas, Neural Networks, Springer-Verlag, Berlin, 1996, pp. 151-184.

26. Signe Annette Bøgh, A World Built on Standards – A Textbook for Higher Education, 2015, pp. 1-115.

Chapter V Future perspective on enhanced consumer protection for chatbots in B2C contracts

5.1 Introduction

It has been argued in chapter 4 that, in striking the right balance between consumer protection and technological advancement in the use of chatbots in B2C contracts, enhanced consumer protection should prevail. The current legislation in this field is marked by the lack of mandatory identification requirements for chatbots and the absence of overruling human intervention in the event of undesirable outcomes for consumers.

A future perspective for realizing enhanced consumer protection could consist of an additional set of provisions regulating the use of chatbots in B2C contracts, integrating these two elements and inspired by certain provisions of the General Data Protection Regulation (GDPR) 2016/679 of 27 April 2016, which recently entered into force.

Section 5.2 examines the rationales of consumer rights protection. Section 5.3 explores the resemblances between the rationales of consumer protection and those of the GDPR. Section 5.4 presents an alternative regulatory approach applied to the use of chatbots in B2C contracts. Finally, section 5.5 summarises the findings.

5.2 Rationales of consumer rights protection

Consumer rights at EU level can be described as those rights under EU law aiming to safeguard specific consumers' interests.304 According to article 169 paragraph 1 TFEU305, three areas of consumer protection can be distinguished: first, health and safety; second, the protection of consumers' economic interests; and finally, the promotion of consumers' rights to information and education, and of consumers organising themselves in order to safeguard their own interests.306 These three areas of interest have determined the content of EU

304 René Barents and Laurens Jan Brinkhorst, Grondlijnen van Europees recht, 13-de druk Kluwer, Deventer, 2012, p. 753.

305 EUR-Lex, accessed 6 January 2019.

306 Fabian Amtenbrink and Hans Veder, Recht van de Europese Unie, 6-de druk, Boom Juridische uitgevers, Den Haag 2017, p. 507.

consumer law since the 1970s.307

The reason for the specific mandatory rules applicable to consumer sale, as outlined in chapter 3, is to provide consumers with additional protection, since consumers are often considered to be the weaker and less informed contracting party.308 The protection of the consumer as the weaker contracting party in particular is a developing field: the weaker party must be adequately informed by the other party about the content of the agreement and the objective(s) pursued with the conclusion of the contract.309 The mandatory information requirements described in chapter 3 are one example of this.

The rationale of consumer rights protection is thus the protection of the weaker and less informed contracting party, aiming to redress the inequality of contractual power between businesses and consumers310 in the contractual relationship.

5.3 Resemblances between rationales of consumer rights protection and those of the GDPR

The General Data Protection Regulation 2016/679/EU of 27 April 2016311, hereafter the GDPR (see section 5.4), lays down rules relating to the protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data.312 The GDPR aims to protect the fundamental rights and freedoms of natural persons, in particular their right to the protection of personal data.313

307 René Barents and Laurens Jan Brinkhorst, Grondlijnen van Europees recht, 13-de druk Kluwer, Deventer, 2012, p. 753.

308 Bob Wessels and Albert Verheij, Bijzondere overeenkomsten, 2013, Deventer, Wolters Kluwer, derde druk, p. 44.

309 Frits de Vries, De overeenkomst in het algemeen, Monografieën BW (B54), Kluwer, 2016, p. 7.

310 Pablo Cortés, Online Dispute Resolution for Consumers in the European Union, Routledge, London and New York 2011, p. 11.

311 EUR-Lex, accessed 6 January 2019.

312 Article 1 paragraph 1 of the GDPR.

313 Article 1 paragraph 2 of the GDPR.

The rationale of the GDPR is that, since rapid technological developments and globalisation have brought new challenges for the protection of personal data,314 natural persons should have control of their own personal data.315 Hence, to ensure a consistent and high level of protection of natural persons and to remove the obstacles to flows of personal data within the European Single Market, the level of protection of the rights and freedoms of natural persons with regard to the processing of such data should be equivalent in all Member States.316

There are a number of significant similarities between the rationales of consumer rights protection and the rationales of the GDPR.

First, both the rationales of consumer rights protection and those set out in the GDPR relate to natural persons (the consumer and the 'data subject' respectively). Second, both have the intention to protect the weaker party (the consumer as a buyer and the data subject with regard to his or her personal data). Finally, in order to promote consistency throughout Europe, both consumer rights protection and the GDPR strive for harmonization at EU level.

These resemblances between the rationales of consumer rights protection and those of the GDPR have led the author of this work to propose an alternative regulatory approach to the use of chatbots in B2C contracts, one inspired by the GDPR. This approach is addressed in the next section.

314 Recital 6 of the GDPR.

315 Recital 7 of the GDPR.

316 Recital 10 of the GDPR.

5.4 An alternative regulatory approach applied to the use of chatbots in B2C contracts

In this section an alternative regulatory approach, inspired by certain provisions of the GDPR and applied to the use of chatbots in B2C contracts, will be presented.

As shown above, the GDPR, which entered into force on 25 May 2018317, aims to strengthen individuals' fundamental rights in the digital age and to facilitate business by clarifying the rules for companies and public bodies in the digital single market.318 In sum, the GDPR regulates the processing of personal data relating to individuals in the EU by an individual, a company or an organisation319 and is binding in its entirety and directly applicable in all EU Member States.320

Under article 23 of the GDPR, Member States can introduce legislation concerning profiling321 (e.g. the gathering of information about the interests of an individual or group of individuals)322 and automated decision-making (e.g. in mortgage applications or during recruitment processes323).324

317 Article 99 of the General Data Protection Regulation 2016/679/EU.

318 European Commission, accessed 6 January 2019.

319 European Commission, accessed 6 January 2019.

320 The final text of the General Data Protection Regulation 2016/679/EU.

321 ‘profiling’ means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements (article 4(4) of the General Data Protection Regulation 2016/679/EU).

322 Article 29 Data Protection Working Party, 17/EN WP 251, Guidelines on Automated individual decision- making and Profiling for the purposes of Regulation 2016/679 WP 251, 2017, p. 7.

323 Christos Giakoumopoulos et al., Handbook on European data protection law, edition 2018, European Union Agency for Fundamental Rights (FRA) and Council of Europe, Publication Office of the European Union, Imprimerie Centrale, Luxembourg, 2018, p. 355.

324 Article 29 Data Protection Working Party, 17/EN WP 251, Guidelines on Automated individual decision- making and Profiling for the purposes of Regulation 2016/679 WP 251, 2017, p. 9 and article 22 of the General Data Protection Regulation 2016/679/EU.

Transparency must be ensured for all processing of personal data and is especially important in relation to internet services and other complex automated data processing, such as the use of algorithms (see paragraph 1.1.1) for decision-making. To ensure fair and transparent processing, the GDPR requires the controller to inform the data subject in advance with meaningful information about the logic involved in automated decision-making, including profiling, according to article 13 paragraph 2(f) and article 14 paragraph 2(g) of the GDPR.325

Moving on to the use of chatbots in B2C contracts: by analogy with articles 22, 23, 13 paragraph 2(f) and 14 paragraph 2(g) of the GDPR, a similar approach could be followed by adding comparable types of mandatory provisions to the Consumer Rights Directive, applicable specifically to chatbots. Such provisions could be embedded under the mandatory consumers' information rights (see paragraph 3.3.2).

In the author's opinion, the mandatory consumers' information rights under the Consumer Rights Directive provide the best foundation for the proposed mandatory provisions applicable to chatbots (see hereafter). Since transparency in the outcomes of chatbots is best guaranteed through the consumers' information rights under the Consumer Rights Directive, the CRD may serve as an adequate basis for harmonizing future EU initiatives on the regulation of chatbots in B2C contracts.

In the light of the above, the author proposes the following provisions, to be applied to chatbots in B2C contracts.

Identification requirements for chatbots

1. ‘In case the business party were to use a chatbot application, the consumer shall be informed, prior to placing the order, with meaningful information about the logic involved, as well as the significance and the envisaged consequences that the consumer might face from the use of the chatbot’.326

325 Christos Giakoumopoulos et al., Handbook on European data protection law, edition 2018, European Union Agency for Fundamental Rights (FRA) and Council of Europe, Publication Office of the European Union, Imprimerie Centrale, Luxembourg, 2018, p. 357.

326 By analogy with article 13 paragraph 2(f) and article 14 paragraph 2(g) of the GDPR.

Overruling human intervention

2. ‘In the case referred to in point 1, the trader shall implement suitable measures to safeguard the consumer’s rights in the use of a chatbot application, assuring the right of the consumer to obtain human intervention, to express his or her point of view and to contest the decision’.327

For the foreseeable future, this proposed alternative regulatory approach, inspired by certain provisions of the GDPR, could be a way forward in enhancing the degree of consumer protection associated with the use of chatbots in B2C contracts.

327 By analogy with article 22 paragraph 3 of the GDPR.
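Purely by way of illustration, and not as part of the legal analysis, the two draft provisions above could be operationalised in a trader's chatbot front end roughly as follows. This is a minimal, hypothetical Python sketch: the class name, messages and trigger words are invented for the example and do not reproduce the proposed legal texts; an actual implementation would depend on the chatbot platform used.

```python
from dataclasses import dataclass, field


@dataclass
class ChatSession:
    transcript: list = field(default_factory=list)
    escalated_to_human: bool = False

    def start(self) -> None:
        # Proposed provision 1 (identification requirement): before any
        # order is placed, tell the consumer that a chatbot is involved,
        # what logic it uses and what the consequences of ordering are.
        self.transcript.append(
            "You are chatting with an automated assistant (chatbot). "
            "It matches your questions against product data to propose "
            "offers; any order you confirm is a binding purchase."
        )

    def handle(self, message: str) -> str:
        # Proposed provision 2 (overruling human intervention): at any
        # point the consumer can ask for a human agent to review or
        # contest a decision taken by the chatbot.
        if message.strip().lower() in {"human", "agent", "contest decision"}:
            self.escalated_to_human = True
            return "Transferring you to a human agent who can review the decision."
        self.transcript.append(message)
        return "Automated reply: how can I help with your order?"


session = ChatSession()
session.start()
reply = session.handle("human")
```

In this sketch the disclosure message is shown before any order step (provision 1), and asking for a human at any point routes the conversation to an agent who can review the contested decision (provision 2).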

5.5 Conclusions

This chapter explored a future perspective on enhanced consumer protection for chatbots in B2C contracts.

There is a legal vacuum in the existing Dutch and EU legislation on the use of chatbots in B2C environments, namely the lack of mandatory identification requirements and the absence of overruling human intervention in the event of undesirable outcomes for consumers. The future perspective proposed and discussed in this chapter therefore consists of an alternative regulatory approach applied to the use of chatbots in a B2C environment. This approach aims to realize further consumer protection in the area of chatbots and is explicitly inspired by the resemblances between the rationales of consumer rights protection and those of the GDPR.

The rationale of consumer rights protection is the protection of the weaker and less informed contracting party, while the rationale of the GDPR is the protection of the fundamental rights and freedoms of natural persons, in particular their right to the protection of personal data. As a result of the significant similarities between the two, the author proposed a practical alternative regulatory approach applied to the use of chatbots in B2C contracts, one inspired by certain provisions of the GDPR.

By analogy with articles 22, 23, 13 paragraph 2(f) and 14 paragraph 2(g) of the GDPR, similar types of mandatory provisions could be laid down in the Consumer Rights Directive, embedded under the mandatory consumers' information rights and applicable specifically to chatbots. In applying these provisions to chatbots in B2C contracts, as a way forward in enhancing the degree of consumer protection, the author drew up two draft legal texts on the use of chatbots: the first describing the identification requirements and the second describing the overruling human intervention for chatbots.

Bibliography

1. Andrea Biondi, Piet Eeckhout and Stefanie Ripley, EU Law After Lisbon, Oxford University Press, Oxford 2012, pp. 48, 255.

2. Article 29 Data Protection Working Party, 17/EN WP 251, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 WP 251, 2017, pp. 1-34.

3. Bob Wessels and Albert Verheij, Bijzondere overeenkomsten, 2013, Deventer, Wolters Kluwer, derde druk, p. 44.

4. Christos Giakoumopoulos et al., Handbook on European data protection law, edition 2018, European Union Agency for Fundamental Rights (FRA) and Council of Europe, Publication Office of the European Union, Imprimerie Centrale, Luxembourg, 2018, pp. 1-402.

5. David Edward and Robert Lane, European Union Law, Edward Elgar Publishing, Cheltenham (UK) 2013, pp. 404, 504, 925.

6. Derrick Wyatt and Allan Dashwood, European Union Law, 6th edition, Hart Publishing, Oxford 2011, pp. 378, 412-413, 443-445.

7. European Data Protection Supervisor (EDPS), EDPS Opinion 8/2018, on the legislative package “A New Deal for Consumers”, 2018, pp. 1-27.

8. Fabian Amtenbrink and Hans Veder, Recht van de Europese Unie, 6-de druk, Boom Juridische uitgevers, Den Haag 2017, p. 507.

9. Frits de Vries, De overeenkomst in het algemeen, Monografieën BW (B54), Kluwer, 2016, p. 7.

10. Jan Willem Sap, Ars Aequi Jurisprudentie Europees Recht 1963-2014, 3-de druk, Ars Aequi Libre, 2015, pp. 1-453.

11. Koen Lenaerts and Piet van Nuffel, Europees Recht, 6-de editie, Intersentia, Antwerpen 2017, pp. 272-273.

12. Pablo Cortés, Online Dispute Resolution for Consumers in the European Union, Routledge, London and New York 2011, p. 11.

13. Paul de Hert and Vagelis Papakonstantinou, The New Police and Criminal Justice Data Protection Directive: A First Analysis, New Journal of European Criminal Law (7) 2016, pp. 1-13.

14. Paul de Hert and Vagelis Papakonstantinou, The new General Data Protection Regulation: Still a sound system for the protection of individuals?, Computer Law & Security Review (32) 2016, pp. 179-194.

15. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). EUR-Lex, accessed 6 January 2019.

16. René Barents and Laurens Jan Brinkhorst, Grondlijnen van Europees recht, 13-de druk Kluwer, Deventer, 2012, pp. 753-758.

Chapter VI Conclusions and recommendations

This chapter concludes and summarises the core parts of this thesis, complemented by some practical recommendations, from the perspective of enhancing consumer protection in the use of chatbots in B2C contracts.

The aim of this thesis was to investigate the central research question: How can chatbots be regulated in the Netherlands in case of B2C contracts?

In order to respond to the central research question, the following sub-questions were defined:

(i) What are the definitions of Artificial Intelligence and chatbots, their potential uses in B2C contracts and practical examples of already existing uses or experiments?

(ii) What is the regulatory framework for B2C contracts in the Netherlands and the EU for consumer protection?

(iii) What are the potential regulatory challenges deriving from the introduction of chatbots in the B2C relationship, within the existing legislative framework?

Finally, some recommendations were made, by proposing a new path to a future with key legislative safeguards to mitigate negative outcomes for consumers in the use of chatbots.

6.1 Conclusions

As we have seen in chapter 2, due to the absence of generally accepted formal definitions amongst scholars, two working definitions, for Artificial Intelligence and for chatbots, were adopted. The working definition of AI used throughout this thesis was based on the comprehensive and extensive definition of AI applied by the European Commission. The definition of chatbots was based on the functional description applied by the scholars Rohan Kar and Rishin Haldar, who distinguished two main types: ‘scripted bots’ and ‘AI bots’. The latter was used as the working definition of chatbots throughout this thesis.

Moreover, two practical examples of AI applications in the field were discussed: Watson Assistant and Google Duplex. The first enables enterprises to build a solution that understands natural language input and uses machine learning to respond to customers in a way that simulates a conversation between humans. The second is capable of carrying out sophisticated conversations and completing the majority of its tasks fully autonomously, that is, without human involvement.

As outlined in chapter 3, the regulatory framework in the Netherlands for B2C contracts is laid down in title 1 of Book 7 of the Dutch Civil Code, also known as the consumer sale title. Many provisions under this title have a mandatory legal character when consumers are involved. In addition, where applicable, the general rules from Book 3 (Property law) and Book 6 (Law of obligations) of the Dutch Civil Code apply equally.

The EU framework of consumer protection for B2C contracts consists of the following four Directives: the E-commerce Directive, the Unfair Terms Directive (1993), the Consumer Sales Directive (1999) and the Consumer Rights Directive. The E-commerce Directive, providing the legal framework for online consumer transactions, solely applies to online contracts. The Unfair Terms Directive (1993), protecting consumers in the EU against unfair standard contract terms used by traders, applies to all business-to-consumer contracts, online or offline. The Consumer Sales Directive (1999), on legal and commercial guarantees and the seller's liability (non-conformity included), applies to all online and offline consumer sales transactions. Finally, the Consumer Rights Directive, containing consumers' information rights, equally applies, subject to exceptions, to any contract between a trader and a consumer.

In addition, two ongoing EU legislative initiatives were discussed, currently under preparation by the European Commission and aiming to further enhance consumer protection within B2C contracts: the draft proposal for a new Directive on digital content and the draft proposal for a new Directive on online sales.

As set out in chapter 4, the existing Dutch and EU legal framework for B2C contracts was approached from three distinct angles: information requirements and remedies; execution of the contract and remedies; and passage of the risk and remedies. This framework was subsequently applied to a sale-purchase scenario involving a consumer and carried out by the chatbot application Amazon Lex. The steps involved, from placing the order to remedies in the case of a withdrawal, were described in detail and applied one at a time to the existing Dutch and EU legal framework.

In conclusion, potential regulatory challenges for chatbots may arise from the current lack of consumer protection regulation for chatbots in a B2C environment, from the restrictions on enhanced consumer protection resulting from the applicable maximum harmonisation principle under the Consumer Rights Directive, from the risk of a lack of transparency caused by the ‘black box’ problem and, finally, from the important ethical challenges associated with the use of chatbots.

In this perspective, consumers' interests could be further safeguarded in two ways: by the introduction of a mandatory identification requirement for chatbots, aiming to inform consumers in advance that a chatbot is involved, and by enabling the possibility of overruling human intervention, especially in cases in which a chatbot would generate unverifiable outcomes or consumers would be faced with unforeseeable consequences.

6.2 Recommendations

As shown in chapter 5, the legal vacuum in the existing Dutch and EU legislation on the use of chatbots in B2C environments requires action. To this end, a new alternative regulatory approach was proposed and applied to the use of chatbots in a B2C environment, aiming to realize further consumer protection in the area of chatbots. As a result of the resemblances between the rationales of consumer rights protection and those of the GDPR, this renewed regulatory approach, applied to the use of chatbots in B2C contracts, was inspired by certain provisions of the GDPR.

By analogy with articles 22, 23, 13 paragraph 2(f) and 14 paragraph 2(g) of the GDPR, similar types of mandatory provisions could be laid down in the Consumer Rights Directive, embedded under the mandatory consumers' information rights and applicable specifically to chatbots. In applying these provisions to chatbots in B2C contracts, as a possible way forward in enhancing the degree of consumer protection, the author drew up two draft legal texts on the use of chatbots: the first describing the identification requirements and the second describing the overruling human intervention.

Lastly, by implementing the suggested approach in the Consumer Rights Directive, undesirable risks for consumers arising from the use of chatbots can be mitigated in a practical way.
