
Papeles del Psicólogo / Psychologist Papers, 2020, Vol. 41(1), pp. 27-34
https://doi.org/10.23923/pap.psicol2020.2920
http://www.papelesdelpsicologo.es | http://www.psychologistpapers.com

HOW TO CREATE A PSYCHOLOGIST-CHATBOT

Miriam Romero, Cristina Casadevante, and Helena Montoro
Universidad Autónoma de Madrid


In recent decades, the development of conversational agents or chatbots has increased, especially in the commercial sector. Although the first chatbot in computational history was presented as a psychotherapist, few virtual agents with this type of function have been built since then. In the present article we describe the fundamental aspects of designing a psychologist-chatbot and, more specifically, a bot with psychological assessment functions. We review the available tools and the concepts for its construction. We also propose a series of assessment objectives that would guide the conversational agent's dialogue. Finally, we discuss the advantages and disadvantages of chatbots and the scientific guarantees that they need to fulfill.
Key words: Chatbot, Conversational agent, Psychological assessment, Technology, Artificial intelligence.

Received: 16 July 2019 - Accepted: 11 November 2019
Correspondence: Miriam Romero. Centro de Psicología Aplicada. Universidad Autónoma de Madrid. C/Iván Pavlov, nº 6. 28049 Madrid. España. E-mail: [email protected]

A conversational agent or chatbot is a computer program with which it is possible to have a conversation and from which information or some type of action can be obtained (Hill, Ford, & Farreras, 2015; Khan & Das, 2017; Shawar & Atwell, 2005). To understand the history of these machines and the human efforts to improve them, it is imperative to talk about Alan Turing. In the middle of the twentieth century, Turing proposed a theoretical postulate in which he tested the intelligent behavior of a machine against that of a person (Mauldin, 1994; Turing, 1951; Turing, Braithwaite, Jefferson, & Newman, 1952). The "Turing Test" or "Imitation Game" is intended to elucidate whether a robot can show behavior similar to that of a human. To do this, an evaluator has a conversation, through an interface, with two interlocutors: a bot and a human. If the evaluator is not able to distinguish which one is the bot within a time interval of five minutes, it is concluded that the machine has passed the test. In the Loebner Prize competition, which first took place in 1991, various robots compete to pass the famous test. Prizes have been awarded to the best robots for many years, but it was not until 2014, more than two decades after the first competition, that a machine managed to pass the test (Khan & Das, 2017; Mauldin, 1994; Warwick & Shah, 2014, 2016).

The origins of chatbots are necessarily linked to psychology. The first steps in the development of these machines are attributed to Weizenbaum (1966), who built a program that simulated a conversation with a psychologist. This program, called ELIZA, can be considered the first chatbot or virtual agent in history (Khan & Das, 2017). ELIZA identifies keywords in the text that the user enters, from which it generates questions. When it is not able to identify them, it uses set phrases that encourage the user to speak more: "Why do you say that?", "Can you go into more detail?" (Weizenbaum, 1966). Although the answers are predefined, this robot conveys the feeling that it is capable of understanding the user (Eliza, 2018).

ELIZA served as the inspiration for subsequent works (Khan & Das, 2017). An example of this is the chatbot Alicebot, created in 1995 by Richard Wallace, who has won the Loebner Prize several times. It was built with more than 40,000 knowledge categories (ELIZA had around 200). These categories were composed of a question and an answer, and were integrated into a tree diagram to facilitate dialogue. By design, this virtual agent was conceived to be in continuous development and improvement: in charge of this was the botmaster, who created new content to adjust Alicebot's responses (Wallace, 2009). A few years later, the SmarterChild bot was created, which not only allowed the possibility of having a conversation, but also offered information on various topics (sports, movies, the weather, etc.) (Klopfenstein, Delpriori, Malatini, & Bogliolo, 2017; Khan & Das, 2017).


Subsequently, animated virtual agents have also been developed, that is, avatars with human appearance, gestures, and expressions that interact with users. Apparently, this type of presentation encourages the user to perceive the chatbot as more sociable and enjoyable (Klopfenstein et al., 2017).

These pioneer bots established the basis for the design of a great diversity of conversational agents. The best-known ones are currently on our devices, such as Google Assistant (developed by Google), Siri (developed by Apple), Cortana (developed by Microsoft), and Watson (developed by IBM). They are able to interact with the user based on text and voice inputs, and are intended to help them perform multiple actions such as playing music, organizing medical appointments, answering questions of all kinds, or even ordering takeaway food for home delivery (Khan & Das, 2017). Also, more and more companies include a chatbot on their website or in their social networks in order to offer products and services to customers. An example is Irene, the virtual assistant of the company Renfe, which makes it easy for the user to prepare his or her trip and also has an animated avatar.

Although most chatbots have been developed for commercial purposes, these machines are also useful in other areas: a potential case of non-commercial use could be as a support tool in the tasks of psychological assessment and intervention. We have so far found only one bot aimed at carrying out psychological assessment tasks. It is called Sentinobot (Sentino, 2018) and was built with the objective of evaluating personality traits (ELIZA and Alicebot are not specifically presented as tools to aid evaluation, even though they encouraged the user to reveal their psychological problems). Sentinobot collects information on the Big Five traits (extraversion, conscientiousness, agreeableness, neuroticism, and openness to experience) through multiple-choice questions answered on a Likert-type scale. In other words, it is a virtual agent that administers an evaluation test with closed questions and answers. Sentinobot is a tool with a different format from traditional assessment tests, but we have not found studies on the psychometric guarantees of this prototype.

The best example of help in intervention work is Woebot (Fitzpatrick, Darcy, & Vierhile, 2017), a conversational agent based on elements of cognitive behavioral therapy. This bot administers a self-help program to users who have symptoms of depression. In a study with a sample of 70 students, it was observed that the group that received therapy from the virtual assistant reduced their symptoms of depression, compared to the group that simply received information about the disorder. The authors of this study conclude that chatbots can be a tool of potential utility in administering cognitive behavioral therapy (Fitzpatrick et al., 2017).

Also worth mentioning is the chatbot Replika (Brandtzaeg & Følstad, 2018; Replika, 2018), which was designed with the objective that the user maintains a pleasant interaction and "feels better". Replika introduces itself as a virtual friend, and asks questions about the daily activities, hobbies, aspirations, and feelings of its interlocutor. Although it is not defined as a psychologist, nor was it built with that intention, it is able to identify keywords linked to psychological distress; perhaps most remarkable is its referral to a specialized care service when it detects suicidal ideation. Regarding this virtual agent, we have not found controlled studies that offer evidence about its efficacy, effectiveness, or efficiency.

In our research team we are currently designing a psychologist-chatbot whose purpose is to conduct an initial psychological assessment interview. With this project, we intend to create a conversational robot with guarantees and scientific rigor that will help in psychological assessment work. In addition, once the tool has been built, it will be possible to study the differences between interviews in which the user interacts with a chatbot (human-chatbot process) and interviews in which the user interacts with a human psychologist (human-human process). This will allow us not only to improve the chatbot, but also to understand in depth which forms of interaction are most effective in achieving a certain objective. Furthermore, it will serve to establish the basis for the development of future tools, as well as to initiate a debate about the relevance and the necessary regulation of their use.

Next, we present the necessary foundations for developing a virtual psychological assessment agent, describing the basic concepts for its creation and design, and the necessary psychological considerations.

HOW TO CREATE A CHATBOT

Tools for developing conversational agents
Since the development of conversational agents has increased in recent years, the number of tools that enable them to be designed has also increased. Some of the most important platforms that offer services related to the development of chatbots are IBM Watson, API.ai, Wit.ai, and Microsoft LUIS. These tools facilitate the programming of conversational agents with apparent intelligence and are under constant improvement (Khan & Das, 2017). The services provided by Chatfuel and Flow XO are also well known; these are simpler tools than those mentioned above, but they are more limited (Janarthanam, 2017; Kothari, Zyane, & Hoover, 2017).

Within the scope of psychology, it will be necessary to use advanced platforms that allow a fluid and rich interaction with users since, if the chatbot does not appear to have a certain level of intelligence, the user's distrust of the tool will increase. In any case, in order to design a conversational assistant, it is necessary to know a series of basic concepts that apply to the vast majority of platforms designed to develop them.


Chatbot structuring: intentions, entities, and dialogue
There are three essential concepts that must be considered for the construction of a chatbot: intentions, entities, and dialogue (Khan & Das, 2017).

Intentions refer to the actions or demands that the user requires (buying a train ticket, reserving a table, communicating a problem, asking a question, etc.). The first thing that a conversational agent must identify is what the human with whom it is interacting is requesting. However, since human language is broad and rich, there are multiple ways to communicate the same thing, and it should be borne in mind that the chatbot can only identify the user's intentions if the different ways of expressing the same intention have been previously described in its programming.

Imagine this simple example: a study center offers online training and implements a chatbot on its website in order to understand the wishes of the visitors to the site and help them sign up for the course that most interests them. A user needs to enroll in a training course and, for this, he or she can tell the chatbot: "I need a course", "I want to sign up for a course", "I would like to attend a course at your center", etc. In this case, we need to have included all the ways of expressing that request in the intention #course_sign_up (it is usual to use the "#" symbol before each intention).

In addition, the same intention can have different nuances. For example, one user may be interested in enrolling in a clinical psychology course, while another may prefer a place on a human resources management course. The intention is the same, to enroll in a course, but the users are not referring to the same course. Entities are what make it possible to make this type of distinction within the same intention. They could be considered "keywords" and are generally identified using the "@" symbol. Thus, in the example just mentioned, we could distinguish between the concepts @clinical_psychology and @human_resources, and we could program the chatbot so that, if it recognizes the intention #course_sign_up, it asks: "Are you interested in courses in the area of clinical psychology or in the area of human resources?" If the user replies "I want to find out information on clinical psychology courses", the chatbot will identify the entity @clinical_psychology and offer information about it.

Finally, the chatbot needs a set of questions or phrases for interacting with the user, in other words, a dialogue. This dialogue is programmed in detail according to the type of interaction to be carried out. In the process of creating it, the intentions and entities shape and direct the dialogue: the system works like a decision tree, that is, according to the intentions and entities that it detects in the user's responses, it decides which node to pass to. Following the previous example, if a user shows interest in signing up for a course, and the chatbot "understands" his or her intention, it will ask questions (described in the dialogue) to try to identify what type of course he or she wants, the schedule, etc. To ensure that the assistant does not get lost in the conversation, it is essential that all possible conversation options are contemplated in the dialogue tree.
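By way of illustration, the following minimal sketch in Python shows one way these three concepts fit together. It is not the implementation of any particular platform: the intention #course_sign_up and the two entities come from the example above, the phrase lists are deliberately tiny, and the simple substring matching merely stands in for the natural language understanding that the platforms mentioned earlier provide.

```python
# Minimal sketch: intentions, entities, and dialogue routing.
# On a real platform a trained language model does the matching;
# here, plain phrase lookup stands in for that step.

INTENTS = {
    "#course_sign_up": [
        # the different ways of expressing the same intention
        "i need a course",
        "i want to sign up for a course",
        "i would like to attend a course at your center",
    ],
}

ENTITIES = {
    # "keywords" that distinguish nuances within the same intention
    "@clinical_psychology": ["clinical psychology"],
    "@human_resources": ["human resources"],
}

def detect_intent(user_text):
    """Return the first intention whose phrases appear in the input."""
    text = user_text.lower()
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return None

def detect_entity(user_text):
    """Return the first entity whose keywords appear in the input."""
    text = user_text.lower()
    for entity, keywords in ENTITIES.items():
        if any(keyword in text for keyword in keywords):
            return entity
    return None

# Dialogue: the detected intentions and entities decide which node the
# conversation passes to, as in a decision tree.
if detect_intent("I want to sign up for a course") == "#course_sign_up":
    print("Are you interested in courses in the area of clinical "
          "psychology or in the area of human resources?")
reply = "I want to find out information on clinical psychology courses"
if detect_entity(reply) == "@clinical_psychology":
    print("Here is the information on our clinical psychology courses.")
```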
In particular, the chatbots that are designed to perform a commercial function are built considering the concepts mentioned above. Their main purpose is to provide the information requested by the humans with whom they interact. In other words, the users require a product or service, and what they say to the chatbot (whether spoken or written) will, in turn, direct the next question the bot asks. So, if a customer indicates that he or she wants to buy a ticket, the chatbot will detect that it has to identify what type of ticket is needed (theater, cinema, etc.) and will respond accordingly.

However, as will be indicated in the next section, this is slightly different when we are talking about a chatbot that is intended to do the work of a psychologist who performs the initial assessment of a clinical case. In this use, the psychologist-chatbot is the one that requires the necessary information from the human in order to assess his or her problem. The virtual agent will guide the conversation with the user so that he or she is able to describe the problem in the best possible way (Figure 1).

FIGURE 1. RELATIONSHIP BETWEEN THE PSYCHOLOGIST-CHATBOT AND THE USER

Therefore, when building a psychologist-chatbot, a series of aspects must be considered before defining the intentions, the entities, and the dialogue tree.

CREATING A PSYCHOLOGIST-CHATBOT

Previous considerations
Before starting to create the chatbot, it is necessary to consider what type of virtual agent we want to program and for what purpose. In our case, we will present the necessary elements to build a chatbot aimed at performing preliminary psychological assessment tasks, although the steps we describe could be generalized to robots with similar functions.

It is important to be clear about the audience at which it is aimed. Many companies offer services through a chatbot hosted on their website or on their social networks, open to all audiences with unspecified objectives.


In our case, we believe that the psychologist-chatbot should only be available to clients of health centers where preliminary psychological assessments are required.

It is also necessary to consider the information that the chatbot will obtain from the users. Given that psychologists work with confidential information, we must take maximum care over the processing of the data and their accessibility. The virtual agent must be built so that the access to and custody of the information comply with all the requirements of the European Data Protection Regulation (EU) 2016/679 of April 27, 2016 (GDPR) and the Spanish Organic Law 3/2018, of December 5, on the Protection of Personal Data and the Guarantee of Digital Rights. Among other things, it must be hosted on a secure server, and access to the data must be limited to the psychologists of the center where the case is treated. Of course, users who interact with the chatbot will be informed about the processing of their personal data and must consent to this before starting the conversation.

Once all these steps are clear, we can begin to consider the objectives that the chatbot or conversational agent will have.

Objectives of the chatbot with initial psychological assessment functions
When a human psychologist conducts a pre-assessment interview (initial psychological assessment), he or she does so with a set of pre-established objectives. Firstly, it is necessary to know the user's basic data: name, age, occupation, etc. Secondly, the psychologist who conducts the interview collects data on the client's problem and guides him or her to describe it as accurately as possible. For example, if the client indicates that he or she has anxiety, the psychologist must determine the intensity of the symptoms, in what situations they occur, since when, how often, etc. In this case, it is not the client who asks the chatbot for a certain type of information (schedules, types of tickets, etc.) or requires certain actions (booking, buying, etc.); instead, it is the chatbot that asks the client, with the purpose of later providing the relevant information about his or her case to a psychological clinic.

To build a conversational assistant with psychological assessment functions, we must adequately specify the objectives of the interview in order to be able to transmit them to the tool through the flow of dialogue and entities. Remember that in this case the chatbot does not have to identify what the client wants; instead, the predefined dialogue tree guides the conversation. This means that it will not be necessary to define intentions, but rather entities or keywords that allow the bot to identify whether it is collecting the appropriate information.

The pre-assessment objectives that we propose below for integration into the psychologist-chatbot are based on the behavioral interview proposed by Fernández Ballesteros (2015) and on the experience and criteria of the authors of this article. With this type of question, we do not seek to make a diagnosis, but rather to obtain general information that helps us both to analyze the problem later (that is, to explain why the user is failing to cope with his or her difficulties successfully) and to determine how the assessment phase should continue if the user decides to start therapy. Also, as will be described later, this tool can be very useful for clinics that have several professionals, since the data collected in the conversation can help in deciding to which specialist the user will be referred. Figures 2 and 3 reflect the structure and objectives that should pertain to a chatbot whose purpose is to pre-evaluate a user who seeks psychological help.

FIGURE 2. STRUCTURE OF A CHATBOT WITH INITIAL PSYCHOLOGICAL SCREENING FUNCTIONS

FIGURE 3. DETAIL OF THE OBJECTIVES OF THE PSYCHOLOGIST-CHATBOT. THESE ARE FORMULATED THROUGH QUESTIONS PRESENTED IN A CERTAIN ORDER TO THE USER
Dialogue and keywords
The keywords and the dialogue will be established on the basis of the objectives of the chatbot with psychological assessment functions. For example, regarding sociodemographic questions, if the objective is to know the user's occupation, the dialogue could be: "What is your current occupation?" or "What do you do for a living?". Likewise, the keywords indicating that the question has been answered, and that the chatbot can therefore continue with the next objective, would be "study", "work", "unemployed", etc., which would correspond to the entities @study, @work, @unemployment, etc.

In short, each chatbot objective must have an associated list of possible synonymous questions that make up the dialogue. In addition, lists of keywords that the chatbot will have to identify in the response (via text or voice) will be recorded. However, in certain parts of the dialogue these keywords may not be defined for detection; instead, the user will be allowed to respond more openly. Specifically, in almost all the questions designed to gather information about the client's problem, it is not very useful to make a list of keywords to be identified in the response, since the conversation must move forward without restricting the users' way of expressing themselves or the information they provide.
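As a hedged illustration, one assessment objective might be represented as follows. The question variants and the entities @study, @work, and @unemployment come from the paragraph above; the synonym lists and the detection logic are our own minimal additions, not a prescribed design.

```python
# Sketch of one assessment objective: question variants plus the entity
# keywords that mark the objective as met. Synonym lists are illustrative.

OCCUPATION_QUESTIONS = [
    "What is your current occupation?",
    "What do you do for a living?",
]

OCCUPATION_ENTITIES = {
    # more specific entities are checked first (dicts keep insertion order)
    "@unemployment": ["unemployed", "out of work"],
    "@study": ["student", "study"],
    "@work": ["work", "job"],
}

def check_occupation_answer(user_text):
    """Return the detected entity, or None if the chatbot should
    rephrase the question using another variant from the list above."""
    text = user_text.lower()
    for entity, keywords in OCCUPATION_ENTITIES.items():
        if any(keyword in text for keyword in keywords):
            return entity
    return None

print(check_occupation_answer("I work as a nurse"))  # -> "@work"
print(check_occupation_answer("mostly at home"))     # -> None: rephrase

# For the open questions about the client's problem, no keyword list is
# defined: the reply is simply stored verbatim for the psychologist.
```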


In order to build an intelligent and efficient chatbot, the dialogue tree must contain the following types of node (outlined in Figure 4; a code sketch follows this list):
- Nodes with pre-assessment questions expressed in different ways.
- An explanatory node for the occasions when the client does not understand the question asked (e.g., "I meant whether there is any other problem that may be relevant, or that you may have remembered during the conversation, which also causes you distress").
- A node in which the user is asked to complete the response when it is markedly brief (e.g., "Could you give me a little more detail?"), with the aim of expanding the information.
- Nodes that express understanding and support in relation to the problem posed by the user (e.g., "I am sorry that you are going through this situation").
- Nodes that reinforce the user for continuing the interview (e.g., "You are doing very well, let's continue").
- Nodes that remind the user that he or she is interacting with a chatbot and that it is likely that it will not understand everything he or she says (e.g., "You know I am a robot. Sometimes I don't understand everything").

For a more advanced development, it would be very useful to have a configurator that allows the questions to be managed, so that the appearance of each question would be associated with a certain probability, or even with the tool's own learning. However, since these advances require more complex computer development, they are not essential in the initial phase of the chatbot.
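The node types listed above might be represented as in the following sketch. The utterances are taken from the examples in the list; the MIN_WORDS threshold for a "markedly brief" response is an invented value for illustration, not part of the original design.

```python
# Sketch of the node types listed above for one pre-assessment question.

MIN_WORDS = 4  # illustrative threshold for a "markedly brief" answer

NODE_TEXTS = {
    "clarify": ("I meant whether there is any other problem that may be "
                "relevant and that also causes you distress."),
    "expand": "Could you give me a little more detail?",
    "support": "I am sorry that you are going through this situation.",
    "reinforce": "You are doing very well, let's continue.",
    "reminder": ("You know I am a robot. "
                 "Sometimes I don't understand everything."),
}

def next_utterance(answer, understood):
    """Choose the chatbot's reaction to one pre-assessment answer."""
    if not understood:
        # explanatory node, plus a reminder that a robot is listening
        return NODE_TEXTS["clarify"] + " " + NODE_TEXTS["reminder"]
    if len(answer.split()) < MIN_WORDS:
        return NODE_TEXTS["expand"]  # ask the user to complete the answer
    # empathic support and reinforcement before moving to the next node
    return NODE_TEXTS["support"] + " " + NODE_TEXTS["reinforce"]

print(next_utterance("bad", understood=True))  # asks for more detail
```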

User instructions
Once we have an overview of the dialogue and the keywords that the chatbot will "recognize", it is time to consider how the conversation will begin. It is useful to warn users about the functioning of the virtual agent, indicating that it is a robot and that it needs them to interact with it using concise phrases. Otherwise, users could begin to describe their problem using long text entries, which would make it difficult for the virtual agent to "understand" the content (that is, to recognize the appropriate entities). The user may also be advised that he or she may be asked for data already mentioned in previous answers. Sometimes, in the same text entry, the client may report several data points (e.g., the place where he or she had an anxiety attack and the people present). If the chatbot is not programmed to identify both data points in a single entry, it is likely that in later questions it will ask the user again for information that he or she has already given. If we do not warn the client that this may occur, he or she is more likely to become frustrated and leave the conversation without having finished the interview.

On the other hand, as mentioned before, it is necessary to inform the client about the use of the personal data that he or she will provide. In our case, we consider that this information, in accordance with the current legislation, must be offered to the user before the conversation with the device begins. However, there is no harm in the virtual agent giving a reminder at the beginning of the interaction.

Synthesis of the information collected by the chatbot
Once the user has completed the interview with the chatbot, it will be stored on a secure server with restricted access. The information that identifies the user will be encrypted and kept separate from the rest of the information. For this type of chatbot, specifically, we have designed the data output so that the whole conversation can be obtained together with the relevant words (e.g., "sad", "anxiety", "fear", "pain", "die", etc.). At the beginning of the conversation record, a summary table will be generated that collects these words in addition to the description of the user's request. Thus, at a first glance, one can know the subject of the case.

As previously announced, the information collected by the chatbot may be useful in deciding which professional to assign the case to within a team of psychologists. This work will be streamlined thanks to the tool, since it will not be necessary to spend time personally evaluating each case. Likewise, this avoids the situation where the user starts therapy with a therapist other than the one who performed the initial assessment.
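The following sketch illustrates this data output. The list of relevant words comes from the text above; the hashing step is only our assumption of how identifying data might be pseudonymized in the record while the real identity is stored separately, and hashing is of course not a substitute for the encryption described above.

```python
import hashlib

# Sketch: flag relevant words in the stored conversation and build the
# summary table placed at the head of the record.

RELEVANT_WORDS = ["sad", "anxiety", "fear", "pain", "die"]

def build_summary(conversation, request_description, user_id):
    """Build the summary placed at the beginning of the case record."""
    text = conversation.lower()
    found = sorted(word for word in RELEVANT_WORDS if word in text)
    # the real identity is kept apart, under restricted access;
    # the record itself carries only a pseudonymous case code
    case_code = hashlib.sha256(user_id.encode()).hexdigest()[:12]
    return {
        "case": case_code,
        "request": request_description,
        "relevant_words": found,
    }

print(build_summary(
    "I feel sad and the anxiety does not stop",
    "User asks for help with anxiety at work",
    "maria.lopez@example.com",  # hypothetical identifier
))
# -> {'case': '...', 'request': '...', 'relevant_words': ['anxiety', 'sad']}
```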


It must be said that the therapist assigned to the case will have the information collected by the chatbot, which will facilitate the analysis of the problem and the preparation of the face-to-face sessions. If this tool is used by a single professional, the information collected will also help to plan the assessment and even to make a first approach to explaining the problem.

Interdisciplinary development
From everything described above, it can be seen that collaboration among psychologists, computer engineers, and linguists is vital. Psychologists are responsible for indicating what the chatbot's questions should be and the type of information that should be collected. In addition, they are responsible for pointing out the situations that require priority referral to a human psychologist or even, more specifically, to a specialized emergency service (e.g., the user presents suicidal ideation and a set plan). The engineers and linguists are responsible for designing and building the chatbot so that it can understand the user (identifying certain words in the text or voice input), guide the conversation, interact with language that is as natural as possible, etc.

It must not be forgotten that a chatbot must be in continuous development. The ideal is to have machine learning tools, but not everyone has access to this type of technology. In any case, this process will have to be supervised by a botmaster (and in the case of a psychologist-chatbot, this task must be performed by psychologists, engineers, and linguists).

FIGURE 4. GENERAL SCHEME OF THE DECISION TREE GOVERNING THE CHATBOT'S QUESTIONS

DISCUSSION
Chatbots are a reality that is constantly growing and developing. Although their origins date back more than half a century, it is only now that their presence is becoming widespread. This is possible, in large part, thanks to the latest advances in computational science and artificial intelligence (Brandtzaeg & Følstad, 2018). These machines are capable of offering us services of unquestionable potential: they help us with the management of daily tasks and with the acquisition of products and services, and they are even offered as virtual friends. As for the field of psychology, although the first chatbot in history presented itself as a psychotherapist, few virtual agents of this type have been developed since then, and even fewer have undergone controlled studies to obtain evidence about their efficacy, effectiveness, and efficiency.

We believe that the creation of chatbots dedicated to assessment and intervention work can be of particular interest and usefulness. Virtual psychological assessment agents have the potential to be one more tool for the psychologist in charge of the evaluation of a clinical case. One of their main advantages is savings in terms of time and the use of physical space. First, since the tool is available through a web service, an office is not necessary for carrying out the initial assessment. Second, if the chatbot manages to collect the information, it can expedite the evaluation process, saving time for the psychologist in charge of assigning the case to the different therapists and, consequently, for the user him- or herself. Likewise, virtual agents can serve as support during treatment, facilitating the client's learning of the guidelines and techniques required for his or her specific case.

It is important to reflect on the limitations of these conversational agents.
First, they do not have intelligence as we understand the concept; their functioning depends on how we have defined the intentions, the entities, and the dialogue. Obviously, an initial version of the bot will not be enough; it will need to undergo a process of constant "learning" to facilitate its improvement. Second, the use of virtual agents requires users to be familiar with new technologies. To interact with a chatbot, one needs a computer or mobile device, as well as an internet connection. Therefore, a psychologist-chatbot may not be the best option for some sectors of the population (for example, children and the elderly). Also, in relation to the need to be connected to the internet, the interaction with this tool is subject to possible connection problems, which can negatively affect the user experience.


Another aspect to consider is the type of information that the client receives in the interaction with a conversational assistant. Psychologists not only offer verbal information; nonverbal language also plays an important role, so if the virtual agent does not have an avatar, the interaction can be impoverished. The same applies to the information collected by the chatbot, because for now there is no advanced and affordable technology that would allow us to understand the user's nonverbal language. Therefore, the type and amount of information obtained is clearly different from that which a human would collect. In any case, we must not forget that this is a support tool and can never completely replace the psychologist.

On the other hand, and regardless of the discipline to which they are attached, chatbots must comply with the guidelines related to data protection regulations. These guidelines are even stricter in the case of psychology, since the data handled are personal and must be treated confidentially and in accordance with the law. On the contact platform, users must be guaranteed confidentiality, be informed of the use that will be made of the data obtained, and be asked for their consent. In this case, the regulations require a high level of protection, and this information must only be accessible to authorized psychologists. In addition, we believe that it is important for chatbots dedicated to psychological assessment and intervention work to have regulations that guarantee compliance with minimum guidelines regarding the ethical standards that are so relevant in this profession. It is also essential to be able to offer users the information that has been obtained through the chatbot, should they request it (just as a user can request a report on the data collected at the psychological center during the therapeutic process).

The design of a virtual psychologist agent necessarily requires the contribution of knowledge from psychology, computer engineering, and linguistics. The machine requires an advanced programming base for its creation and must be constructed so that it is able to "understand" the natural language of the users. At the same time, it must take into account the assessment and intervention strategies recognized as efficacious and effective so that it achieves its purpose successfully.

One of the benefits of developing these conversational agents is the possibility of studying the characteristics of the chatbot-human therapeutic process and analyzing the differences it presents with respect to the human-human process. For example, we can study whether they differ in interaction time, in the number of sentences, in the subjective assessment of the degree of help, and in perceived satisfaction. We can also study whether there are expressions that are more effective in obtaining a certain type of information, etc. This, in turn, will allow us to optimize the conversational agent (Hill et al., 2015), implementing the possible improvements that we detect. We hope that the chatbot that is currently being developed and that has been described in this work will allow us to obtain this type of data, so that we can contribute to improving and expanding the use of these tools in the field of psychology.

The objective of this work has been to present, in a summarized way, the state of the question in relation to the development of chatbots and their link with psychology. It is also intended to encourage professionals to participate in the design and testing of these tools in order to demonstrate their efficacy, effectiveness, and efficiency, and to ensure good practice. No doubt, in the coming years, technological advances will allow substantial improvements in the development of virtual agents, which will facilitate their creation and improvement. In the field of psychology, we hope that these advances incorporate elements that are relevant to the profession, such as the possibility of identifying users' nonverbal language.

CONFLICT OF INTERESTS
There is no conflict of interest.

REFERENCES
Brandtzaeg, P. B., & Følstad, A. (2018). Chatbots: Changing user needs and motivations. Interactions, 25(5), 38-43. doi: 10.1145/3236669
Eliza (2018). DixiLabs. Retrieved from http://deixilabs.com/eliza.html#impl
Fernández Ballesteros, R. (2015). Evaluación psicológica: Conceptos, métodos y estudio de casos [Psychological assessment: Concepts, methods, and case studies] (2nd ed.). Madrid: Ediciones Pirámide.
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. doi: 10.2196/mental.7785
Hill, J., Ford, W. R., & Farreras, I. G. (2015). Real conversations with artificial intelligence: A comparison between human-human online conversations and human-chatbot conversations. Computers in Human Behavior, 49, 245-250. doi: 10.1016/j.chb.2015.02.026
Janarthanam, S. (2017). Hands-on chatbots and conversational UI development: Build chatbots and voice user interfaces with Chatfuel, Dialogflow, Microsoft Bot Framework, Twilio, and Alexa Skills. Birmingham, UK: Packt.
Khan, R., & Das, A. (2017). Build better chatbots: A complete guide to getting started with chatbots (1st ed.). New York: Apress.
Klopfenstein, L. C., Delpriori, S., Malatini, S., & Bogliolo, A. (2017). The rise of bots: A survey of conversational interfaces, patterns, and paradigms. In Proceedings of the 2017 Conference on Designing Interactive Systems (pp. 555-565). New York: ACM.


Kothari, A., Zyane, R., & Hoover, J. (2017). Chatbots for eCommerce: Learn how to build a virtual shopping assistant. Bleeding Edge Press.
Mauldin, M. L. (1994). Chatterbots, tinymuds, and the Turing test: Entering the Loebner Prize competition. Association for the Advancement of Artificial Intelligence, 94, 16-21. Retrieved from https://bit.ly/2xN1AfV
Replika (2018). Retrieved from https://replika.ai/
Sentino (2018). Retrieved from https://sentino.org/
Shawar, B. A., & Atwell, E. (2005). Using corpora in machine-learning chatbot systems. International Journal of Corpus Linguistics, 10(4), 489-516. doi: 10.1075/ijcl.10.4.06
Turing, A. (1951). Can digital computers think? In B. J. Copeland (Ed.), The essential Turing: Seminal writings in computing, logic, philosophy, artificial intelligence and artificial life, plus the secrets of Enigma (pp. 476-486). Oxford: Oxford University Press.
Turing, A., Braithwaite, R., Jefferson, G., & Newman, M. (1952). Can automatic calculating machines be said to think? In B. J. Copeland (Ed.), The essential Turing: Seminal writings in computing, logic, philosophy, artificial intelligence and artificial life, plus the secrets of Enigma (pp. 487). Oxford: Oxford University Press.
Wallace, R. S. (2009). The anatomy of ALICE. In R. Epstein, G. Roberts, & G. Beber (Eds.), Parsing the Turing Test (pp. 181-210). Dordrecht: Springer.
Warwick, K., & Shah, H. (2014). Assumption of knowledge and the Chinese room in Turing test interrogation. AI Communications, 27(3), 275-283.
Warwick, K., & Shah, H. (2016). Can machines think? A report on Turing test experiments at the Royal Society. Journal of Experimental & Theoretical Artificial Intelligence, 28(6), 989-1007. doi: 10.1080/0952813X.2015.1055826
Weizenbaum, J. (1966). ELIZA: A computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), 36-45. Retrieved from https://dl.acm.org/citation.cfm?id=365168
