

Emotion Transfer from Frontline Social Robots to Human Customers During Service Encounters: Testing an Artificial Emotional Contagion Model

Research-in-Progress Paper

Author: Ruth Maria Stock, Technische Universität Darmstadt, Germany. Email: [email protected]

Abstract

This research examines mood transitions during human-robot interactions (HRI) compared with human-human interactions (HHI) during service encounters. Based on emotional contagion and social identity theory, we argue that emotion transmission within HRI (e.g., between a frontline service robot and a human customer) may occur through the customer's imitation of the robot's verbal and bodily expressions and may be stronger for positive than for negative emotions. The customer's positive attitude toward and anxiety about robots will further be examined as contingencies that strengthen or weaken the emotion transfer during the HRI. We already identified the five most important emotions during service encounters (critical incident study with 131 frontline employees). The corresponding output behavior was programmed into a robot and validated (ratings from 234 students). In the next step, we attempt to manipulate the emotional expressions of a frontline social robot and a frontline employee within an experimental study.

Key Words: Human Resource IS – Robotic Psychology – Human-Robot Interaction – Artificial Emotional Contagion

Introduction

"As robots step into humans' daily lives, we can't stress enough the importance of natural human–robot interactions." (Park et al. 2010, p. 332)

In recent years, service firms increasingly have relied on social robots to provide routine services. These social robots "exist primarily to interact with people" (Kirby, Forlizzi, and Simmons 2010, p. 322) and assist human users in various settings, such as retailing, hospitality, education, and health care (Han, Lin, and Song 2013; Zhang et al. 2008). For example, Marriott recently started using social robots to provide room service (Jeffrey and DeSocio 2016). Nestlé has placed hundreds of frontline social robots on shop floors to sell Nescafé in Japan (Nestlé 2016). The Japanese travel agency HIS runs the Henn-na Hotel almost completely with robots, which function as receptionists, luggage carriers, and room service personnel (Rajesh 2015). As these examples illustrate, the tasks of humanoid robots can be manifold, ranging from carrying customer items and transportation services to welcoming and checking in customers or answering routine questions. As frontline social robots become increasingly important during service encounters (Park, Kim, and Chung 2010), "the market of service robots is forecasted to grow" (Han, Lin, and Song 2013, p. 1290).

Accordingly, human–robot interactions (HRI) have attracted considerable attention in the robotic research community (for an overview, see Han, Lin, and Song 2013). With the cohesive goal of ensuring that robots exhibit "more humanlike behaviors" (Han, Lin, and Song 2013, p. 1251), three research streams have emerged in this field. The first focuses on manipulating robots' human-level emotion expression (e.g., Breazeal 2004; Gadanho and Hallam 1998; Mavridis and Hanson 2009; Mavridis et al. 2011). For example, early researchers sought to teach robots to learn human-like emotions from environmental conditions (Gadanho and Hallam 1998). Xin et al. (2013) even have developed an algorithm that enables robots to exhibit emotions through their facial expressions and physical behavior. A second research stream seeks ways to design robotic emotional expressions that synchronize with and mimic emotions exhibited by human users (e.g., Broekens 2007; Han, Lin, and Song 2013; Kwon et al. 2007; Rani et al. 2004). Broekens (2007) thus was able to generate artificial emotional expressions by adopting human facial expressions. Han, Lin, and Song's (2013) path model of robotic emotional expression suggests a means to program human-like facial expressions by robots, based on the robot's personality and mood state, as well as users' emotions. Rani et al. (2004) also offer an affect-sensitive human–robot cooperative framework, according to which a robot senses the affective state of the human and responds appropriately.

These two research streams provide valuable insights from a robot-centric perspective, related to the design and methods available to make robots express emotions in a human-like manner. However, they do not cover the human side of the HRI; in particular, they ignore human responses to a social robot's emotions.

The third stream of robotic research is different, in that it seeks to define human responses to artificial robotic emotions (Hashimoto, Yamano, and Usu 2009; Kirby, Forlizzi, and Simmons 2010). This research is based on prior findings showing that humans can observe a robot's expressed emotions (Ogata and Sugano 2000; Zhang et al. 2008). Hashimoto, Yamano, and Usu (2009) find that humans feel comfortable when robots' expressions synchronize with their own facial expressions. In applied research, Shamsuddin et al. (2012) use the humanoid robot Nao to teach children with autism to express their emotions. Although this research clearly indicates that humans can observe a robot's artificial emotions, it is still unclear how humans actually respond to these emotions and which underlying psychological mechanisms might trigger or hinder these human responses. That is, research explains how robots make use of humans' emotions, from a robot-centric perspective, but knowledge about whether and how robots might transfer different types of emotions to humans, from a human-centric perspective, is scarce. Specifically, we do not know whether and how humans synthesize robotic artificial emotions.

This gap is surprising. Companies increasingly use robots in social contexts, such as service encounters with customers. When human service personnel and human customers interact, these actors can "catch" emotions from each other, a finding well established in emotional contagion literature (Hennig-Thurau et al. 2006; Hochschild 1983). Yet no study has examined such service encounters between social service robots and customers and their potential for emotional contagion. This research gap is significant and relevant; addressing it could provide better insights for designing interactions during service encounters, especially because "humans' [attitudes] about robots are expected to be a significant factor in the success of service robots" (Broadbent et al. 2007, p. 3703; see also Bauer 2003; Breazeal 2004; Dario, Laschi, and Guglielmelli 1999; Dautenhahn et al. 2005; Libin and Libin 2004). To make robots effective tools that facilitate service encounters, we need "to expand our understanding of the psychological aspects of these [HRI] exchanges" (Broadbent et al. 2007, p. 3703). Accordingly, this research contributes to HRI literature by adding insights about the transfer of emotions, expressed by a frontline service robot, to a human customer.

To that end, this study strives to answer three research questions. First, through which underlying psychological processes does a human customer adopt artificial emotions expressed by a frontline social robot? To examine the psychological mechanisms of emotion transfer from a frontline social robot to a customer, this study relies on the emotional contagion concept (Hennig-Thurau et al. 2006; Hochschild 1983), which holds that emotions pass among humans during service encounters, because of the human tendency to mimic and synchronize facial expressions, vocalizations, and bodily expressions (i.e., postures and movements) with those of another person and thereby converge emotionally (Hatfield, Cacioppo, and Rapson 1994; Tamietto and de Gelder 2008). The emotional contagion effect is well understood in human settings, including frontline service encounters (for an overview, see Hennig-Thurau et al. 2006). To the best of the author's knowledge though, this study is the first to test an "artificial emotion contagion effect" from a frontline social robot to a customer.
Artificial emotional contagion is different from simple human responses to emotions that might be expressed by a humanoid robot. In particular, emotional contagion involves more than the observation of a robot's artificial emotions; it instead entails the human's unconscious synchronization with the social robot's emotions. Emotional contagion also reflects a specific set of emotions, such as pleasantness vs. unpleasantness, that gets transferred from the service robot to a human customer. Second, this study accounts for the valence of the emotion (Russell and Bullock 1985) by examining differential contagion effects for positive vs. negative emotions, as expressed by the frontline social robot during the HRI. In this sense, this study asks, does the emotional contagion effect during the HRI depend on the valence of the emotion expressed by the frontline social robot? Third, a customer's affect toward humanoid robots may be a contingency factor, influencing emotion transfer during the HRI. A customer's affect toward humanoid robots can be captured by positive attitudes toward robots and by robot anxiety. Thus, this study asks, how does a customer's affect toward humanoid robots influence the intensity of the emotional contagion effect from a frontline social robot to a customer?

This research is based on an experimental study, investigating emotion transfer from a frontline social robot to a customer, rather than emotion transfer among humans in a service encounter. The humanoid robot Nao will serve as the focal frontline social robot. Face recognition software, video recordings, and independent third-rater assessments are used to examine humans' emotional responses to Nao's exhibited emotions.
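To give a sense of how facial affect could be scored from the recorded videos, the following minimal Python sketch uses the open-source fer package together with OpenCV. The paper does not name its face recognition software, so the library choice, the file name, and the pleasantness index below are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch: scoring facial affect from a participant video with the open-source
# 'fer' package and OpenCV. Library choice, file name, and the pleasantness index
# are illustrative assumptions; the paper does not name its software.
import cv2
from fer import FER

detector = FER(mtcnn=True)                      # pretrained facial-expression model
video = cv2.VideoCapture("participant_01.mp4")  # assumed file name

frame_scores = []
ok, frame = video.read()
while ok:
    faces = detector.detect_emotions(frame)     # per-face emotion probabilities
    if faces:
        emotions = faces[0]["emotions"]         # e.g. {'happy': 0.8, 'angry': 0.05, ...}
        frame_scores.append(emotions["happy"] - emotions["angry"])
    ok, frame = video.read()
video.release()

# crude pleasantness index for the whole video (mean of per-frame scores)
pleasantness = sum(frame_scores) / max(len(frame_scores), 1)
print(pleasantness)
```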

A Model of Artificial Emotional Contagion During Service Encounters

Emotional Contagion Theory

The theoretical basis for exploring the transfer of emotions from a frontline social robot to a human customer during the service encounter is emotional contagion theory (Hochschild 1983), which is rooted in social psychology (e.g., Gump and Kulik 1997; Higgins and Range 1996). It also has found broad empirical support in psychology and management literature that describes emotion transfers among humans (e.g., Hennig-Thurau et al. 2006; Pugh 2001). The basic idea of emotional contagion theory is that even minimal contact between individuals can allow emotions to pass from one person to another (Barsade 2002; Gump and Kulik 1997; Hatfield, Cacioppo, and Rapson 1994; Rozin and Royzman 2001). Emotions refer to "an immediate affective response to the evaluation of some event (or stimuli) as being of major significance" (Kirby, Forlizzi, and Simmons 2010, p. 323). Emotional contagion is the tendency to automatically mimic and synchronize one's facial expressions, vocalizations, and bodily expressions (i.e., postures and movements) with those of another person, such that the two parties converge emotionally (Hatfield, Cacioppo, and Rapson 1994; Tamietto and de Gelder 2008). Initial research on emotional contagion largely focused on facial expressions; more recent evidence indicates that emotions also transfer in response to vocal and bodily expressions (Bradley and Lang 2000; Magnee et al. 2007).

A Conceptual Model of Artificial Emotional Contagion

Figure 1 depicts the proposed model of artificial emotional contagion. The model compares emotion transfer during human–human interactions (HHI; i.e., between a frontline service employee and a human customer) (e.g., Hennig-Thurau et al. 2006; Stock and Hoyer 2005) against emotion transfer during an HRI (i.e., between a frontline social robot and a human customer). The customer's emotional contagion is captured by the change in customer affect, defined as the extent to which a customer has positive or negative feelings after the interaction with the frontline social robot or frontline employee. This study also examines human customers' affect toward humanoid robots (positive attitude, robot anxiety) as a contingency that may strengthen or weaken the emotional contagion effect from a frontline social robot to a human customer.

Main Effects

Emotional contagion theorists argue that emotions transfer among humans through imitations of bodily language (Hatfield, Cacioppo, and Rapson 1994; Hochschild 1983). Conceptual robotic research has mentioned, though without empirical proof, that emotional contagion may occur during HRI (Hashimoto, Yamano, and Usu 2009). This study argues that emotional contagion is likely to occur for both pleasant and unpleasant emotions within HRI. A presurvey by the author with 234 students revealed that humans can observe both positive and negative emotional expressions by Nao (Figure 2). Thus, it seems likely that humans also might tend to synchronize both types of emotions. It is further argued that emotional contagion is weaker for HRI as opposed to HHI, in line with social identity theory (SIT) (Tajfel 1978, 1981; Tajfel and Turner 1985). According to SIT, people classify others into different categorization schemes to define them and locate themselves in their social environment. Social identification, or "the perception of oneness with or belongingness to some human aggregate" (Ashforth and Mael 1989, p. 21), is more likely to result when the similarity between the focal individual and the other is greater (Tajfel 1981). The weak similarity in verbal and non-verbal expression modes between robots and humans, compared with the strong similarities among humans, suggests that a frontline social robot may not provide a source for emotional imitation. Therefore, emotional contagion effects should be weaker for HRI as opposed to HHI:

H1: Emotional contagion within the HRI occurs for (a) pleasant and (b) unpleasant emotions.

H2: Emotional contagion effects are weaker for HRI as opposed to HHI.

Another hypothesis relates the emotion contagion effect to the valence of the emotion. Research in psychology (Barrett 2004, 2006; Russell and Bullock 1985) describes the valence of emotional expressions on pleasurable (e.g., happiness, surprise) and displeasurable (e.g., anger, frustration) axes. Research has shown that positive robot behaviors result in more positive assessments of the robots' task fulfillment; negative robot behaviors result in negative assessments (Broadbent et al. 2007). It is argued though that the emotional contagion effect during the HRI is stronger for positive emotions than for negative emotions expressed by a social service robot. This notion is consistent with the findings of a second presurvey of 230 human users by the author, which showed that positive robot emotions during service encounters generally are perceived more intensively by recipients than are negative emotions. Therefore, the emotional contagion effect should generally be weaker for negative than for positive emotions, and negative emotions expressed by the frontline social robot during the HRI may attract less attention, such that they are less easily caught by the customer during the service encounter. Thus,

H3: Emotional contagion effects during the HRI are stronger for positive emotions as opposed to negative emotions.

Contingency Effects

Beyond the underlying mechanisms of the robot-to-human emotion transmission, a customer's positive or negative attitudes toward robots will be investigated as contingencies for the emotional contagion between a frontline social robot and a customer. Positive attitudes toward robots indicate the extent to which a customer has a positive affective state toward interactions with robots, including positive feelings toward situations marked by interactions with robots and the social influence of robots (Nomura et al. 2004). Customers with positive attitudes may tend to be particularly receptive to emotions expressed by robots (Dautenhahn et al. 2005). Thus,

H4: A customer's positive attitudes toward robots strengthen the transfer of emotions from a frontline social robot to the customer during an HRI.

Information systems research also investigates "technophobia," or human anxiety about using computers or information technologies, and shows that it is widespread (Brosnan 1998; Raub 1981). Furthermore, humans experience latent anxiety when dealing with humanoid robots (Bartneck et al. 2005, 2007; Kaplan 2004; Oyedele, Hong, and Minor 2007; Syrdal et al. 2009), leading Nomura et al. (2006, p. 372) to propose that "negative emotions such as anxiety should be given more attention in human-robot interaction." Robot anxiety is examined as a contingency variable that might affect the emotional contagion effect within the HRI. Robot anxiety refers to "emotions of anxiety or fear preventing individuals from interaction with robots having functions of communication in daily life, in particular, communication in a human-robot dyad" (Nomura et al. 2006, p. 372). People with this form of anxiety are less receptive to signals expressed by a robot than are people with little anxiety (Oyedele, Hong, and Minor 2007). Accordingly, the emotional contagion effect from a frontline social robot to a customer might be weaker for humans with robot anxiety. Thus,

H5: A customer's robot anxiety weakens the transfer of emotions from a frontline social robot to the customer during an HRI.

Study 1: Critical Emotions During the Service Encounter

Critical Incident Study

An important step in preparation for the experiment was to identify the most relevant emotions to capture with the experiment. These emotions determine the frontline service robot's output behavior and the frontline employee's expressed emotions for this experiment. Conceptually, this study relies on research in psychology that has examined individual differences in the extent to which people emphasize valence and in their self-reported emotion expression (Barrett 2004, 2006; Russell and Bullock 1985). Valence incorporates the axes of pleasure–displeasure, whereas arousal includes the axes of arousal–sleepiness to describe expressions of human mood states (Russell and Bullock 1985). To capture the most important emotions that are relevant for HRI during service encounters, 131 part-time master's students in psychology, working as frontline employees in various service industries (retailing 53.8%, hospitality services 10.6%, health services 11.7%, consulting 9.2%, banking/insurance 5.3%, IT services 4.4%, other 2.0%), were surveyed using open-ended questions. The structure of the survey questions focused on negative or positive incidents during service encounters, in accordance with extant literature (Hughes, Williamson, and Lloyd 2007). The respondents were asked to report critical incidents and details they recalled that made the experience memorable to them. A critical incident is one that "contributes to, or detracts from, the general aim of the activity in a significant way" (Bitner, Booms, and Tetreault 1990, p. 73), which describes exactly what this study aimed to uncover. Regarding their feelings, the following question was asked: "What did you feel when the incident happened? Please describe your feelings as specifically as you can." The frontline employees reported 143 negative and 132 positive incidents. The respondents experienced happiness (59% of the positive incidents), a further positive emotion (27%), and surprise (11%). As negative emotions, the frontline employees reported anger (59%) and frustration (30%). Only in relatively few incidents did the respondents report fear (11%). Therefore, four important emotions experienced during service encounters were identified, expressing both pleasantness (happiness and positive surprise) and unpleasantness (anger and frustration).

Emotional Expressions by the Robotic Arbitrator Nao

The mechanical basis for the robot in the second study is the humanoid robot Nao, a 57 cm tall robot with 25 degrees of freedom, produced by Aldebaran. Nao previously has been used in various HRI settings, to study simulated body movements (Domingues et al. 2011; Xu et al. 2014), storytelling and conversation (Csapo et al. 2012; Gelin et al. 2010; Han et al. 2012), smart home interactions with human beings (Louloudi et al. 2010), and emotion transitions for therapeutic treatments (Miskam et al. 2014; Shamsuddin et al. 2012, 2013). Emotions are typically transferred among humans by facial, vocal, and bodily expressions (Bänziger, Grandjean, and Scherer 2009; Hall and Bernieri 2001; Karpouzis et al. 2007). Nao's platform offers only simple facial features; for the experiments, the LED head displays a graphical face. In line with recent evidence from emotional contagion research (Bradley and Lang 2000; Magnee et al. 2007), this study focuses on vocal and bodily expressions, which can be easily expressed by Nao. Nao's graphical programming tool Choregraphe (Pot et al. 2009) also supports the design of complex behaviors in an intuitive way. The appropriate positions for Nao to express emotions in its output behavior were identified in three steps: First, the author relied on extant literature in psychology (Pease 1981) and robotic research that suggested various behavioral outputs for the emotional expressions (Shamsuddin et al. 2012). Second, a web search was conducted, and 100 pictures for each of the five emotions were identified. Based on these pictures, the two most typical bodily expressions for happiness, surprise, anger, and frustration, as well as one neutral position, were programmed. Third, the nine programmed positions were presented to 234 students (18–43 years of age; 67% men; 80% technical background), who were asked to rate the bodily expressions by Nao. The respondents clearly identified the bodily expressions for all four emotions identified in the CIT study, across both pleasantness (happiness 91%; positive surprise 95%) and unpleasantness (anger 83%; frustration 94%). In addition, the neutral expression by the frontline social robot was tested and clearly recognized by 93% of the students. There were no significant differences between technically oriented students (i.e., those studying information science, engineering, business engineering) and other students (studying sociology, politics, management). In the experiment, each of the five emotions expressed by Nao's body gestures will be displayed to the customer during the conversation (see also Gelin et al. 2010).
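As an illustration of how such a posture can be scripted outside Choregraphe, the following minimal Python sketch uses the NAOqi SDK's ALMotion, ALRobotPosture, and ALTextToSpeech modules. The robot address, joint angles, and spoken line are illustrative assumptions, not the study's actual parameters.

```python
# Minimal sketch: setting one of Nao's emotional body postures via the NAOqi Python SDK.
# All concrete values (robot address, joint angles, utterance) are illustrative assumptions.
from naoqi import ALProxy

NAO_IP, NAO_PORT = "192.168.1.10", 9559           # assumed robot address on the lab network

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
posture = ALProxy("ALRobotPosture", NAO_IP, NAO_PORT)
tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)

motion.wakeUp()                                   # enable joint stiffness
posture.goToPosture("Stand", 0.8)                 # neutral baseline position

# "Happiness" pose: both arms raised outward, head tilted slightly up.
joint_names = ["LShoulderPitch", "RShoulderPitch",
               "LShoulderRoll", "RShoulderRoll", "HeadPitch"]
joint_angles = [-1.0, -1.0, 0.6, -0.6, -0.2]      # radians, illustrative values
motion.setAngles(joint_names, joint_angles, 0.2)  # 0.2 = fraction of maximum speed

tts.say("Welcome to our hotel!")                  # accompanying vocal expression
```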

Study 2: Emotional Contagion within the HRI

Experimental Setup

The experiment will be carried out in a hotel setup, in a research lab associated with the author's university. Visitors to the hotel can ask the frontline social robot (Figure 3a) or the frontline employee (Figure 3b) questions about an offer. During the experiment, each participant in the role of a customer will interact separately with the robot/employee. All visual displays and sounds will be recorded by external HD cameras positioned on Nao's body. The experimenter will not be visible to the customer; at a hidden station, the experimenter will observe video streams from the cameras on a separate screen.

Experimental design. An important step is selecting the type of research design (Aguinis and Bradley 2014), whether between-subject, within-subject, or mixed (Atzmüller and Steiner 2010). For this study, a mixed, 4 × 2 design will be applied in multiple steps. Specifically, the three different emotion valences (neutral, pleasant, and unpleasant) will be tested for four different groups, namely, the HHI and HRI, each in a pleasant or an unpleasant condition, in a between-subject design. In the first setting (t0), neither the frontline service robot nor the frontline service employee will express particularly pleasant or unpleasant emotions, beyond a generally friendly and professional treatment of the participant as a customer (neutral baseline). To avoid learning effects, we will test the pleasantness and unpleasantness conditions for these two groups, HHI and HRI, with a time lag of eight weeks (t1). To avoid order effects, we will randomize the order of the pleasant and the unpleasant conditions during the experiments at t1. The experimental design of the study is depicted in Figure 4; a sketch of the assignment logic follows below.

Setting. In both groups, participants will be asked to imagine themselves in one of two hypothetical situations during the service encounter, describing (a) an interaction with a frontline social robot (HRI treatment) or (b) an interaction with a frontline service employee (HHI treatment). The experiments will capture a scenario in which the participants must attempt to check in at the service counter of a hotel, with the help of a frontline service robot or a frontline employee, or to check out at the service counter of the same hotel (see Appendix 1 for the HRI/HHI treatments). Three alternative treatments for each of the two groups will be conducted: In treatment 1 (neutral condition), the participant will have to check in with an employee or a robot at t0. In treatments 2 and 3, the participants also check in; in one of the conditions, the service representative will express pleasurable emotions, while in the other, the service representative will express unpleasurable emotions.
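The following brief Python sketch makes the allocation logic concrete: participants are assigned to the four groups shown in Figure 4, and the run order of the valenced t1 sessions is randomized. Group composition follows the design above; the seed, function, and field names are illustrative assumptions.

```python
# Sketch: allocating participants to the mixed design (4 groups x 2 measurement points).
# Group composition follows Figure 4; everything else is an illustrative assumption.
import random

GROUPS = {
    1: ("HRI", "pleasant"),
    2: ("HRI", "unpleasant"),
    3: ("HHI", "pleasant"),
    4: ("HHI", "unpleasant"),
}

def assign_participants(participant_ids, n_per_group=25, seed=42):
    rng = random.Random(seed)
    pool = list(participant_ids)
    rng.shuffle(pool)                                 # random group assignment
    plan = {}
    for g, (interaction, valence) in GROUPS.items():
        for pid in pool[(g - 1) * n_per_group : g * n_per_group]:
            plan[pid] = {"group": g,
                         "interaction": interaction,  # between-subject factor
                         "t0": "neutral",             # baseline condition
                         "t1": valence}               # valenced session, 8 weeks later
    session_order = list(GROUPS)
    rng.shuffle(session_order)                        # randomized run order at t1
    return plan, session_order
```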

Actors. To increase the level of immersion, intensively trained actors will be used as confederates to play the roles of frontline service employees in the experiment. To reduce any confounding effects of their personal communication styles and to standardize the interaction, standardized service scripts will be used (the detailed scripts can be requested from the author), similar to those commonly used by hotels in business practice. To increase the realism of the experiments, reference will be made to an existing service offer by a hotel. The service representative will check in the participant as a customer, according to a standardized script.

Controls. In the laboratory setting for the experiment, the customer's technological experience, defined as familiarity with the use of information technology (Smarr et al. 2012), and robot familiarity (Mitzner et al. 2011; Smarr et al. 2012) will be controlled for. Furthermore, the customer's nonverbal sensitivity, defined as the ability to recognize emotions or interpersonal attitudes from nonverbal cues in the face, body, and/or voice (Hall and Bernieri 2001), will be controlled for.

Sample and Measurement

The participants in this study will be approximately 80 students, enrolled in psychology or information science courses, who will volunteer to participate. The minimum sample size to ensure valid measures, according to the G*Power 3.1.92 software, is 75 respondents. The planned sample thus is sufficient to test our hypotheses. To suit the aims of this study, a list of selection criteria for participants will be used: They must have no deficits in hearing or vision and must be able to understand, speak, and follow simple commands in English. As an incentive, all participants will receive financial remuneration of $12 for completing the study.

Emotional contagion will be measured by participants' self-reports, face recognition software, and observers' ratings of mood via video-taped ratings of the participants during the interaction, as suggested by Barsade (2002; see also Hsee, Hatfield, and Chemtob 1992). Following the procedure of Barsade (2002), four video coders will be trained extensively to code human customers' emotions, as displayed by their facial expressions, body language, and verbal tone. The coders will be kept intentionally unaware of the experimental conditions and study purpose. In addition, the coders will only view the videos of the participants, not the human frontline employee or the robot, to reduce any potential coding bias due to the human confederate's or the robot's behavior. The coders will assess emotional contagion by watching the human customers' facial expressions, body language, and verbal tone throughout the course of the experiment, while rating the level of each participant's pleasant mood every 30 seconds, on a scale from 1 (not at all) to 7 (very much). The 30-second segments will be aggregated across coders for the second part of the experiment to create a Time 2 mood scale based on the video coders' ratings.

Self-report measures. To capture the customers' self-rated pleasant emotional contagion, the self-reported pleasant emotion measure will be compared with the parallel measure taken during the latter part of the experiment. Customers will be instructed to describe their emotions on a scale (Barsade 2002), in response to the prompt, "To what extent do you feel this way right now, that is, at the present moment," on a 9-point Likert-type scale (1 = "not at all" to 9 = "extremely much"). The included adjectives are pleasant, happy, optimistic, and warm, as well as unhappy, pessimistic, gloomy, lethargic, depressed, and sad, which will be reverse coded. The measures for the contingency variables (i.e., positive attitude toward robots and robot anxiety) reflect existing scales, developed in prior robotic research. The scale that will be used to capture positive attitudes toward robots was developed by Nomura et al. (2004); the scale for robot anxiety comes from Nomura et al. (2006). Finally, this study will control for demographic customer characteristics, which can affect human interactions with robots in general, beyond the service encounter: age (Arras and Cerqui 2005; Czaja and Sharit 1998), gender (Forlizzi 2007; Mutlu et al. 2006; Nomura et al. 2008), education (Giulini, Scopelliti, and Fornara 2005; Libin and Libin 2004), and experience with robots (Dijkers et al. 1991).
Customer personality will further be controlled for, assessed on the basis of the Big Five personality traits (openness to experience, extraversion, conscientiousness, agreeableness, and neuroticism), using the 60-item NEO-FFI scale (Costa and McCrae 1992).
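The following Python sketch illustrates how the video coders' 30-second segment ratings could be aggregated into the Time 2 mood scale and combined with the self-reports into a change score. The file and column names are assumptions for illustration, not the study's actual data layout.

```python
# Sketch: aggregating the four coders' 30-second mood ratings (1-7 scale) into a
# Time 2 mood score and computing each participant's affect change.
# File and column names are illustrative assumptions.
import pandas as pd

# one row per participant x coder x 30-second segment
ratings = pd.read_csv("coder_ratings.csv")      # columns: participant, coder, segment, phase, mood

t1_mood = (ratings.query("phase == 't1'")
           .groupby("participant")["mood"]
           .mean()                              # average across segments and coders
           .rename("t1_mood_coded"))

self_reports = pd.read_csv("self_reports.csv")  # columns: participant, t0_affect, t1_affect
scores = self_reports.set_index("participant").join(t1_mood)
scores["affect_change"] = scores["t1_affect"] - scores["t0_affect"]
print(scores.head())
```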

Test of Hypotheses

Hypothesis 1 argues for an emotional contagion effect from frontline social robots to human customers. In other words, human customers are predicted to "catch" artificial emotions from a frontline social robot. The emotion expression by the robot will be checked by asking the customer whether and to what extent she or he perceives expressions of pleasant or unpleasant emotions by the robot (manipulation check). To test the emotional contagion effect, the originally self-reported customer emotions will be compared with the emotions reported right after the experiment. Then, the video recordings will be assessed (a) with face recognition software and (b) by the four trained, independent observers, comparing the early versus the latter part of the experiment. Through this procedure, Hypothesis 2 can also be tested, in which it was argued that emotion contagion is more likely to occur in an HHI rather than an HRI setting. This study further anticipates that the emotional contagion effect is stronger for positive as opposed to negative emotions transferred between a frontline social robot and a customer (Hypothesis 3). A between-subject design is planned to test this hypothesis: Different participants will be involved in two different and randomly ordered HRI settings, one with positive and another with negative emotions expressed by the robot. This procedure allows effective intrapersonal comparisons in terms of whether a customer catches emotions from the robot; a sketch of the planned contrasts follows below.
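A minimal Python sketch of the planned contrasts follows, using simple t-tests in scipy as placeholders for the full analysis; the column names and the change-score data frame (built as in the earlier sketch) are illustrative assumptions.

```python
# Sketch of the planned contrasts for H1-H3, using simple t-tests as placeholders
# for the full analysis. File and column names are illustrative assumptions.
import pandas as pd
from scipy import stats

scores = pd.read_csv("scores.csv")  # columns: participant, interaction, valence,
                                    #          t0_affect, t1_affect, affect_change
hri = scores.query("interaction == 'HRI'")
hhi = scores.query("interaction == 'HHI'")

# H1: within the HRI groups, did customer affect shift from t0 to t1?
print(stats.ttest_rel(hri["t1_affect"], hri["t0_affect"]))

# H2: is the magnitude of the contagion effect weaker in HRI than in HHI?
print(stats.ttest_ind(hri["affect_change"].abs(), hhi["affect_change"].abs()))

# H3: within HRI, is contagion stronger for pleasant than for unpleasant robot emotions?
print(stats.ttest_ind(hri.query("valence == 'pleasant'")["affect_change"].abs(),
                      hri.query("valence == 'unpleasant'")["affect_change"].abs()))
```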

Conclusion

This study proposes a model of artificial emotion contagion, from a frontline social robot to a human customer. The author has developed an empirical design for an experiment to test this model by using the humanoid robot Nao in a hotel setting. On the basis of emotional contagion theory, the emotional transmission from social robots to humans, as well as some contingencies that may strengthen or weaken this transmission, will be tested. The results thus will contribute to robotic research in several important respects. First, this study focuses on service encounters in which frontline social robots interact with customers. Insights about the mechanisms that underlie robot-to-human emotion transmission may provide a conceptual basis for continued studies that seek to examine whether and how emotion transmission occurs in long-term robot–human relationships. Second, with the investigation of emotional transitions during HRI, the results may establish a valuable foundation for examining emotional contagion in mixed teams of robots and customers. Third, firms may benefit from these insights as a basis for the adequate design and placement of frontline social robots.

FIGURES:

[Figure 1 diagram: an emotional expression (pleasantness vs. unpleasantness; frontline social robot vs. frontline employee) operates through the emotional contagion mechanism (unconscious synchronization) on the customer affect change (pleasantness vs. unpleasantness; self-perceived/video-coded), moderated by the contingency effects positive attitude toward robots and robot anxiety.]

Figure 1. Model of Artificial Emotional Contagion

Figure 2. Emotions Expressed by Nao During the Experiment: Neutral, Pleasantness (Happiness, Positive Surprise) and Unpleasantness (Anger, Frustration)

(A) Human-Robot-Interaction (HRI) (B) Human-Human-Interaction (HHI)

Figure 3. Experimental Setup of the Study

[Figure 4 diagram: emotions expressed by the service representative (human FLE or service robot) are neutral at t0 and pleasant or unpleasant at t1, for four groups of n = 25 each (groups 1 and 2: human-robot interaction; groups 3 and 4: human-human interaction).]

Figure 4. Experimental Design of the Study

APPENDIX 1. Hotel Service Script for the HRI and HHI Treatments

Check-In at the Hotel (neutral baseline condition / treatments 2 and 3: pleasantness vs. unpleasantness)

1) As soon as the customer enters the hotel and steps toward the service counter, the service representative (frontline social robot/frontline service employee) makes eye contact with the customer and welcomes him or her to the hotel.

2) The service representative asks a series of standardized questions about the customer's room booking. These questions follow a standardized order and cover the following topics: single room vs. double room and duration of stay (in nights). The service representative also gathers personal information on a tablet screen, following a standardized script.

3) The service representative asks a series of standardized questions about the customer's room habits and preferences. These questions follow a standardized order and cover the following topics: (a) room preference on the upper or lower floor, (b) usage of the spa, (c) usage of the restaurant, and (d) help with luggage.

4) The service representative finally asks the customer whether she or he has any further questions. If not, the service representative offers help if further questions occur at a later time.

5) When the check-in is finalized, the service representative proceeds with an explanation of the electronic room key system. The customer is thanked for his or her reservation and wished a pleasant stay in the hotel.

Notes: The service representative (frontline service robot or frontline employee) is instructed to express pleasantness (treatment 2) or unpleasantness (treatment 3) in a mixed design. The within-subject design pertains to the transmission of neutral, positive, and negative emotions (randomized order for positive and negative emotions). The between-subject design entails the transmission of emotions within HRI vs. HHI. To compare HHI and HRI, the confederate (frontline service employee) and the frontline social robot perform the same script in the same setting. In the neutral condition (baseline), the instructions for the confederate/the programming of the robot include the following elements: (1) atmosphere: cool and professional friendliness; (2) facial expression: neutral, less emotional; (3) gestures: almost none; (4) tone of voice: medium loud, relatively monotonous, medium speaking speed; and (5) verbal expressions: use of phrases like "sure" and "no problem." For the pleasantness condition, the instructions/programming lead to the following conditions: (1) atmosphere: warm and heartfelt friendliness; (2) facial expression: happiness, positive surprise; (3) gestures: intense and positive (e.g., open arms); (4) tone of voice: loud, enthusiastic, fast speaking; and (5) verbal expressions: use of phrases like "happy to do so" and "with pleasure." Finally, in the unpleasantness condition: (1) atmosphere: impersonal and unfriendly; (2) facial expression: anger, frustration; (3) gestures: intense and negative (e.g., arms on hips); (4) tone of voice: loud, annoyed, fast speaking; and (5) verbal expressions: use of phrases like "no, that's impossible," "I am sorry...," and "I cannot...." A sketch of how these conditions could be rendered on the robot follows below.
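To show how the three scripted conditions could be rendered on Nao, the following Python sketch pairs one scripted line per condition with a stock gesture via NAOqi's ALAnimatedSpeech module. The specific gesture paths and phrasings are illustrative assumptions, not the study's published protocol.

```python
# Sketch: delivering one scripted line per emotional condition with NAOqi's
# ALAnimatedSpeech, which pairs an utterance with a tagged gesture animation.
# Gesture paths and phrasings are illustrative assumptions.
from naoqi import ALProxy

speech = ALProxy("ALAnimatedSpeech", "192.168.1.10", 9559)  # assumed robot address

LINES = {
    "neutral": "^start(animations/Stand/Gestures/Explain_1) "
               "Sure, no problem. Your room is on the upper floor.",
    "pleasant": "^start(animations/Stand/Gestures/Enthusiastic_4) "
                "With pleasure! Happy to do so.",
    "unpleasant": "^start(animations/Stand/Gestures/No_3) "
                  "No, that's impossible. I am sorry.",
}

def deliver(condition):
    speech.say(LINES[condition])  # speaks the line while playing the tagged gesture
```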

References

Aguinis, H. and Bradley, K. J. 2014. "Best Practice Recommendations for Designing and Implementing Experimental Vignette Methodology Studies," Organizational Research Methods (17), pp. 351-371.
Arras, K. O. and Cerqui, D. 2005. "Do We Want to Share our Lives and Bodies with Robots? A 2000 People Survey," Technical Report Nr. 0605-001, Autonomous Systems Lab, Swiss Federal Institute of Technology Lausanne (EPFL).
Ashforth, B. E. and Mael, F. 1989. "Social Identity Theory and the Organization," Academy of Management Review (14:1), pp. 20-39.
Atzmüller, C. and Steiner, P. M. 2010. "Experimental Vignette Studies in Survey Research," Methodology: European Journal of Research Methods for the Behavioral and Social Sciences (6), pp. 128-138.
Bänziger, T., Grandjean, D., and Scherer, K. 2009. "Emotion Recognition from Expressions in Face, Voice, and Body: The Multimodal Emotion Recognition Test (MERT)," Emotion (9:5), pp. 691-704.
Barrett, L. F. 2004. "Feelings or Words? Understanding the Content in Self-Report Ratings of Experienced Emotion," Journal of Personality and Social Psychology (87), pp. 266-281.
Barrett, L. F. 2006. "Solving the Emotion Paradox: Categorization and the Experience of Emotion," Personality and Social Psychology Review (10:1), pp. 20-46.
Barsade, S. G. 2002. "The Ripple Effect: Emotional Contagion and Its Influence on Group Behavior," Administrative Science Quarterly (47:December), pp. 644-675.
Bartneck, C., Nomura, T., Kanda, T., Suzuki, T., and Kennsuke, K. 2005. "A Cross-Cultural Study on Attitudes towards Robots," in HCI International 2005.
Bartneck, C., Suzuki, T., Kanda, T., and Nomura, T. 2007. "The Influence of People's Culture and Prior Experiences with Aibo on their Attitude towards Robots," AI & Society (21:1-2), pp. 217-230.
Bauer, J. C. 2003. "Service Robots in Health Care: The Evolution of Mechanical Solutions to Human Resource Problems," Bon Secours Health System, Inc. Technology Early Warning System White Paper. Retrieved August 2006 from http://bshsi.com/tews/docs/TEWS%20Service%20Robots.pdf.
Bitner, M. J., Booms, B., and Tetreault, M. 1990. "The Service Encounter: Diagnosing Favorable and Unfavorable Incidents," Journal of Marketing (54), pp. 71-84.
Bradley, M. M. and Lang, P. J. 2000. "Affective Reactions to Acoustic Stimuli," Psychophysiology (37), pp. 204-215.
Breazeal, C. 2004. "Social Interactions in HRI: The Robot View," IEEE Transactions on Systems, Man, and Cybernetics (34), pp. 181-186.
Broadbent, E., MacDonald, B., Jago, L., Juergens, M., and Mazharullah, O. 2007. "Human Reactions to Good and Bad Robots," Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, CA, USA, Oct 29-Nov 2.
Broekens, J. 2007. "Emotion and Reinforcement: Affective Facial Expressions Facilitate Robot Learning," in T. S. Huang et al. (eds.), Human Computing, LNAI 4451, pp. 113-132.
Brosnan, M. 1998. Technophobia: The Psychological Impact of Information Technology, Routledge.
Costa, P. T. and McCrae, R. R. 1992. Revised NEO Personality Inventory (NEO PI-R) and NEO Five-Factor Inventory (NEO-FFI): Professional Manual, Psychological Assessment Resources.
Csapo, A., Gilmartin, E., Grizou, J., Han, J., Meena, R., Anastasiou, D., and Wilcock, G. 2012. "Multimodal Conversational Interaction with a Humanoid Robot," Cognitive Infocommunications (CogInfoCom), IEEE 3rd International Conference, pp. 667-672.
Czaja, S. J. and Sharit, J. 1998. "Age Differences in Attitudes towards Computers," Journal of Gerontology (53), pp. 329-340.
Dario, P., Laschi, C., and Guglielmelli, E. 1999. "Design and Experiments on a Personal Robotic Assistant," Advanced Robotics (13:2), pp. 153-169.
Dautenhahn, K., Woods, S., Kaouri, C., Walters, M. L., Koay, K. L., and Werry, I. 2005. "What Is a Robot Companion - Friend, Assistant or Butler?" Intelligent Robots and Systems, 2005 IEEE/RSJ International Conference, pp. 1192-1197.
Dijkers, M. I., deBear, P. C., Erlandson, R. F., Kristy, K., Geer, D. M., and Nichols, A. 1991. "Patient and Staff Acceptance of Robotic Technology in Occupational Therapy: A Pilot Study," Journal of Rehabilitation Research and Development (28), pp. 33-44.
Domingues, E., Lau, N., Pimentel, B., Shaffii, N., Reis, L. P., and Neves, A. J. R. 2011. "Humanoid Behaviors: From Simulation to a Real Robot," Progress in Artificial Intelligence, pp. 352-364, Springer Berlin Heidelberg.
Forlizzi, J. 2007. "How Robotic Products Become Social Products: An Ethnographic Study of Cleaning in the Home," in Proceedings of the 2007 ACM/IEEE International Conference on Human-Robot Interaction, Virginia, USA, pp. 129-136.
Gadanho, S. S. and Hallam, J. 1998. "Emotion-Triggered Learning for Autonomous Robots," Workshop on Grounding Emotions in Adaptive Systems at the 5th International Conference of the Society for Adaptive Behavior (SAB'98), Zurich.
Gelin, R., d'Alessandro, C., Le, Q. A., Deroo, O., Doukhan, D., Martin, J. C., and Rosset, S. 2010. "Towards a Storytelling Humanoid Robot," AAAI Fall Symposium: Dialog with Robots.
Giulini, M. V., Scopelliti, M., and Fornara, F. 2005. "Elderly People at Home: Technological Help in Everyday Activities," in Proceedings of the 2005 IEEE International Workshop on Robots and Human Interactive Communication, Nashville, TN, pp. 365-370.
Gump, B. B. and Kulik, J. A. 1997. "Stress, Affiliation, and Emotional Contagion," Journal of Personality and Social Psychology (72:2), pp. 305-319.
Hall, J. A. and Bernieri, F. J. 2001. Interpersonal Sensitivity: Theory and Measurement, Mahwah, NJ: Erlbaum.

Han, J., Campbell, N., Jokinen, K., and Wilcock, G. 2012. "Investigating the Use of Non-Verbal Cues in Human-Robot Interaction with a Nao Robot," in Cognitive Infocommunications (CogInfoCom), IEEE 3rd International Conference, pp. 679-683.
Han, M.-J., Lin, C.-H., and Song, K.-T. 2013. "Emotional Expression Generation Based on Mood Transition and Personality Model," IEEE Transactions on Cybernetics (43:4), pp. 1290-1303.
Hashimoto, M., Yamano, M., and Usu, T. 2009. "Effects of Emotional Synchronization in Human-Robot KANSEI Communication," 18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan.
Hatfield, E., Cacioppo, J. T., and Rapson, R. L. 1994. Emotional Contagion, Cambridge, UK: Cambridge University Press.
Hennig-Thurau, T., Groth, M., Paul, M., and Gremler, D. D. 2006. "Are All Smiles Created Equal? How Emotional Contagion and Emotional Labor Affect Service Relationships," Journal of Marketing (70:3), pp. 58-73.
Higgins, L. and Range, L. M. 1996. "Does Information That a Suicide Victim was Psychiatrically Disturbed Reduce the Likelihood of Contagion?" Journal of Applied Social Psychology (26:9), pp. 781-785.
Hochschild, A. R. 1983. The Managed Heart: Commercialization of Human Feeling, Berkeley: University of California Press.
Hsee, C. K., Hatfield, E., and Chemtob, C. 1992. "Assessments of the Emotional States of Others: Conscious Judgments versus Emotional Contagion," Journal of Social and Clinical Psychology (11), pp. 119-128.
Hughes, H., Williamson, K., and Lloyd, A. 2007. "Critical Incident Technique," in S. Lipu (ed.), Exploring Methods in Information Literacy Research, Topics in Australasian Library and Information Studies, Number 28, Centre for Information Studies, Charles Sturt University, Wagga Wagga, N.S.W., pp. 49-66.
Jeffrey, S. and DeSocio, T. 2016. "Meet Wally. The Room Service Robot of the Residence Inn Marriott at LAX," Fox11, [http://www.foxla.com/news/local-news/93131443-story].
Kaplan, F. 2004. "Who is Afraid of the Humanoid? Investigating Cultural Differences in the Acceptance of Robots," International Journal of Humanoid Robotics (1:3), pp. 465-480.
Karpouzis, K., Caridakis, G., Kessous, L., Amir, N., Raouzaiou, A., Malatesta, L., and Kollias, S. 2007. "Modeling Naturalistic Affective States Via Facial, Vocal, and Bodily Expressions Recognition," in T. S. Huang et al. (eds.), Human Computing, LNAI 4451, pp. 91-112.
Kirby, R., Forlizzi, J., and Simmons, R. 2010. "Affective Social Robots," Robotics and Autonomous Systems (58), pp. 322-332.
Kwon, D.-S., Kwan, Y. K., Park, J. C., Chung, M. J., Jee, E.-S., Park, K.-S., Kim, H.-R., Kim, Y.-M., Park, J.-C., Kim, E. H., Hyun, K. H., Min, H.-J., Lee, H. S., Park, J. W., Jo, S. H., Park, S.-Y., and Lee, K.-W. 2007. "Emotion Interaction System for a Service Robot," 16th IEEE International Symposium on Robot and Human Interactive Communication, August 26-28, Jeju, Korea, pp. 351-356.
Libin, A. V. and Libin, E. V. 2004. "Robotic Psychology," in C. Spielberger (ed.), Encyclopedia of Applied Psychology, Vol. 3, pp. 295-298, NY: ScienceDirect.
Louloudi, A., Mosallam, A., Marturi, N., Janse, P., and Hernandez, V. 2010. "Integration of the Humanoid Robot Nao Inside a Smart Home: A Case Study," in Proceedings of the Swedish AI Society Workshop (SAIS), Linköping Electronic Conference Proceedings (48), pp. 35-44.
Magnee, M. J., Stekelenburg, J. J., Kemner, C., and de Gelder, B. 2007. "Similar Facial Electromyographic Responses to Faces, Voices, and Body Expressions," Neuroreport (18), pp. 369-372.
Mavridis, N., AlDhaheri, A., AlDhaheri, L., Khanii, M., and AlDarmaki, N. 2011. "Transforming IbnSina into an Advanced Multilingual Interactive Robot," in GCC Conference and Exhibition (GCC), IEEE, pp. 120-123.
Mavridis, N. and Hanson, D. 2009. "The IbnSina Center: An Augmented Reality Theater with Intelligent Robotic and Virtual Characters," in Robot and Human Interactive Communication, RO-MAN 2009, The 18th IEEE International Symposium, pp. 681-686.
Miskam, M. A., Masnin, N. F. S., Jamhuri, M. H., Shamsuddin, S., Omar, A. R., and Yussof, H. 2014. "Encouraging Children with Autism to Improve Social and Communication Skills through the Game-based Approach," Procedia Computer Science (42), pp. 93-98.
Mitzner, T. L., Smarr, C.-A., Beer, J. M., Chen, T. L., Springman, J. M., Prakash, A., Kemp, C. C., and Rogers, W. A. 2011. "Older Adults' Acceptance of Assistive Robots" (HFA-TR-1105), Georgia Institute of Technology, School of Psychology, Human Factors and Aging Laboratory, Atlanta, GA. Retrieved from http://hdl.handle.net/1853/39671.
Mutlu, B., Osman, S., Forlizzi, J., Hodgins, J., and Kiesler, S. 2006. "Task Structure and User Attributes as Elements of Human-Robot Interaction Design," in Proceedings of the 15th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN06), University of Hertfordshire, Hatfield, UK, pp. 74-79.
Nestlé 2016. "Nestlé to Use Humanoid Robot to Sell Nescafé in Japan," [http://www.nestle.com/media/news/nestle-humanoid-robot-nescafe-japan].
Nomura, T., Kanda, T., Suzuki, T., and Kato, K. 2004. "Psychology in Human-Robot Communication: An Attempt through Investigation of Negative Attitudes and Anxiety toward Robots," Proceedings of the 13th IEEE International Workshop on Robot and Human Interactive Communication (September), pp. 35-40.
Nomura, T., Kanda, T., Suzuki, T., and Kato, K. 2008. "Prediction of Human Behavior in Human-Robot Interaction Using Psychological Scales for Anxiety and Negative Attitudes toward Robots," IEEE Transactions on Robotics (24), pp. 442-451.
Nomura, T., Suzuki, T., Kanda, T., and Kato, K. 2006. "Measurement of Anxiety toward Robots," Robot and Human Interactive Communication, RO-MAN 2006, The 15th IEEE International Symposium, pp. 372-377.

Ogata, T. and Sugano, S. 2000. "Emotional Communication between Humans and the Autonomous Robot WAMOEBA-2 (Waseda Amoeba) Which Has the Emotion Model," JSME International Journal (43:3), pp. 568-574.
Oyedele, A., Hong, S., and Minor, M. S. 2007. "Contextual Factors in the Appearance of Consumer Robots: Exploratory Assessment of Perceived Anxiety toward Humanlike Consumer Robots," CyberPsychology & Behavior (10:5), pp. 624-632.
Park, J. W., Kim, W. H., and Chung, M. J. 2010. "Artificial Emotion Generation Based on Personality, Mood, and Emotion for Life-Like Facial Expressions of Robots," in P. Forbrig, F. Paternó, and A. Mark-Pejtersen (eds.), HCIS 2010, IFIP AICT (332), pp. 223-233.
Pease, A. 1981. Body Language: How to Read Others' Thoughts by their Gestures, Great Britain: University Printing House, Oxford.
Pot, E., Monceaux, J., Gelin, R., and Maisonnier, B. 2009. "Choregraphe: A Graphical Tool for Humanoid Robot Programming," in Robot and Human Interactive Communication, RO-MAN 2009, The 18th IEEE International Symposium, pp. 46-51.
Pugh, S. D. 2001. "Service with a Smile: Emotional Contagion in the Service Encounter," Academy of Management Journal (44:October), pp. 1018-1027.
Rajesh, M. 2015. "Inside Japan's First Robot-Staffed Hotel," Japan Holidays, [http://www.theguardian.com/travel/2015/aug/14/japan-henn-na-hotel-staffed-by-robots].
Rani, P., Sarkar, N., Smith, C. A., and Kirby, L. D. 2004. "Anxiety Detecting Robotic System - Towards Implicit Human-Robot Collaboration," Robotica (22:1), pp. 85-95.
Raub, A. C. 1981. Correlates of Computer Anxiety in College Students, Ph.D. dissertation, University of Pennsylvania.
Rozin, P. and Royzman, E. B. 2001. "Negativity Bias, Negativity Dominance, and Contagion," Personality and Social Psychology Review (5:4), pp. 296-320.
Russell, J. A. and Bullock, M. 1985. "Multidimensional Scaling of Emotional Facial Expressions: Similarity from Preschoolers to Adults," Journal of Personality and Social Psychology (48:5), pp. 1290-1298.
Shamsuddin, S., Yussof, H., Ismail, L. I., Mohamed, S., Hanapiah, F. A., and Zahari, N. I. 2012. "Initial Response in HRI - A Case Study on Evaluation of Child with Autism Spectrum Disorders Interacting with a Humanoid Robot Nao," Procedia Engineering (41), pp. 1448-1455.
Shamsuddin, S., Yussof, H., Miskam, M. A., Che Hamid, A., Malik, N. A., and Hashim, H. I. 2013. "Humanoid Robot NAO as HRI Mediator to Teach Emotions Using Game-Centered Approach for Children with Autism," in HRI 2013 Workshop on Applications for Emotional Robots, Tokyo, Japan.
Smarr, C.-A., Prakash, A., Beer, J. M., Mitzner, T. L., Kemp, C. C., and Rogers, W. A. 2012. "Older Adults' Preferences for and Acceptance of Robot Assistance for Everyday Living Tasks," Proceedings of the Human Factors and Ergonomics Society Annual Meeting (56:1), pp. 153-157.
Stock, R. M. and Hoyer, W. D. 2005. "An Attitude-Behavior Model of Salespeople's Customer Orientation," Journal of the Academy of Marketing Science (33:4), pp. 536-552.
Syrdal, D. S., Dautenhahn, K., Koay, K. L., and Walters, M. L. 2009. "The Negative Attitudes towards Robots Scale and Reactions to Robot Behaviour in a Live Human-Robot Interaction Study," Adaptive and Emergent Behaviour and Complex Systems.
Tamietto, M. and de Gelder, B. 2008. "Emotional Contagion for Unseen Bodily Expressions: Evidence from Facial EMG," in Automatic Face & Gesture Recognition, FG'08, 8th IEEE International Conference, pp. 1-5.
Tajfel, H. 1978. "The Achievement of Group Differentiation," in H. Tajfel (ed.), Differentiation Between Social Groups: Studies in the Social Psychology of Intergroup Relations, pp. 77-98, London: Academic Press.
Tajfel, H. 1981. Human Groups and Social Categories: Studies in Social Psychology, Cambridge, England: Cambridge University Press.
Tajfel, H. and Turner, J. C. 1985. "The Social Identity Theory of Intergroup Behavior," in S. Worchel and W. G. Austin (eds.), Psychology of Intergroup Relations (2nd ed.), pp. 7-24, Chicago: Nelson-Hall.
Xin, L., Lun, X., Wang, Z.-l., and Fu, D.-M. 2013. "Robot Emotion and Performance Regulation Based on HMM," International Journal of Advanced Robotic Systems (10), pp. 1-6.
Xu, J., Broekens, J., Hindriks, K., and Neerincx, M. A. 2014. "Robot Mood is Contagious: Effects of Robot Body Language in the Imitation Game," in Proceedings of the 2014 International Conference on Autonomous Agents and Multi-agent Systems, pp. 973-980.
Zhang, T., Zhu, B., Lee, L., and Kaber, D. 2008. "Service Robot Anthropomorphism and Interface Design for Emotion in Human-Robot Interaction," 4th IEEE Conference on Automation Science and Engineering, Washington, DC, USA, August 23-26, p. 200.
