The Perception of Emotion in Artificial Agents
Ruud Hortensius
Wales Institute for Cognitive Neuroscience, School of Psychology, Bangor University, Bangor, Wales, and Institute of Neuroscience and Psychology, School of Psychology, University of Glasgow, Scotland
[email protected]

Felix Hekele
Wales Institute for Cognitive Neuroscience, School of Psychology, Bangor University, Bangor, Wales
[email protected]

Emily S. Cross†
Wales Institute for Cognitive Neuroscience, School of Psychology, Bangor University, Bangor, Wales, and Institute of Neuroscience and Psychology, School of Psychology, University of Glasgow, Scotland
[email protected]

†To whom correspondence should be addressed: [email protected]

Abstract— Given recent technological developments in robotics, artificial intelligence and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how well people perceive emotions expressed by an artificial agent via channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Besides accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

Keywords— artificial agent, emotion, human-robot interaction

I. INTRODUCTION

We spend our entire lives navigating a vast and complex social environment. From our homes to our schools, from work to online life, our lives revolve around interactions with other people. Emotions serve as cues for these interactions. Powerful drivers of human behaviour [1], emotions expressed by an agent communicate social information to an observer. Whether subtle facial expressions of sadness or angry bodily gestures that signal dominance, we are experts in recognizing emotion expression across a broad range of situations. The importance of emotion recognition for these interactions is well established [2]-[5]. For instance, accurate recognition of emotional facial expressions is related to higher perceived quality of daily interactions between people [4]. However, technology continuously changes the social interactions we have. In recent years, and in the decades to come, artificial agents are increasingly entering our social environment, with growing numbers of these agents appearing in hospitality, care, and education contexts [6]. From conversing with a humanoid robot to check into a hotel room, to collaborating with a virtual agent in a medical rehabilitation context, social interactions with artificial agents will increasingly become part of our daily lives. This prospect raises important questions regarding how these agents will be accepted and incorporated into our social milieu.

To maximise the quality of social interactions between humans and artificial agents, it is important that the artificial agent is not only responsive to emotions expressed by the human agent, but is also able to express emotions itself. As interest and investment in social robotics continue to grow, developing artificial agents with this kind of emotional capacity is a core requirement for truly social robotic agents [7], [8]. While the emotional component of artificial agents has long been neglected, a recent surge in technological development and empirical investigation is starting to shift the focus. Emerging research documents how expressive robots are rated as more likeable and humanlike, and lead to higher engagement and more pleasurable interactions [9]-[11]. Importantly, acceptance of and cooperation with a robot depend on the match between the situation and the emotional behaviour of the robot [12]. Understanding how we perceive and interact with an emotional artificial agent is thus crucial given the growing deployment of these agents in social settings. As we depart from an era where our social exchanges take place solely with other humans, it is vital that we understand how rich social information can be conveyed and understood between humans and artificial agents.

In this review, we aim to provide an integrative overview of the perception of emotions in artificial agents by human observers and discuss insights and perspectives from the fields of robotics, virtual reality, psychology, and neuroscience. We discuss how people recognize emotions expressed by artificial agents via different modalities such as the face and body, and consider two distinct types of emotional artificial agents: robotic and virtual agents. The first category includes physically instantiated or embodied robots, such as humanoid or pet-like robots. The second category includes visually presented agents or avatars, such as those used in virtual reality environments [13], therapeutic interventions [14], and educational contexts [15]. The embodied and virtual artificial agents discussed in this review are described in detail in Table 1, and several examples are presented in Fig. 1. In the second part of this review, we describe the behavioural consequences of an emotional artificial agent by separately considering positive reactions, such as empathy for robots, as well as negative reactions, such as aggression toward robots. We conclude by examining some of the challenges that remain in studying emotion perception during human-robot interaction and articulate guiding principles for future research and development of emotional artificial agents. This review is not meant to be all-encompassing. Instead, we focus on highlighting research on how the human observer perceives and reacts to emotional artificial agents. While the technical aspects of emotion recognition within artificial agents are beyond the scope of the present review, the literature covered here certainly has the potential to inform future development of emotional and social artificial agents.

Fig. 1. Examples of emotional artificial agents. (1a) iCat (Philips Research); (1b) AIBO (Sony Corp.); (1c) Sparky (Interval Research Corp.); (1d) KaMERo (Korea Advanced Institute of Science and Technology); (1e) Mung (Yonsei University/Korea Advanced Institute of Science and Technology); (1f) The Haptic Creature (Haptic Creature Project); (1g) Unnamed Lego robot [68]; (1h) Keepon (National Institute of Information and Communications Technology); (2a) NAO (Aldebaran Robotics); (2b) iCub (Istituto Italiano di Tecnologia/RobotCub Consortium); (2c) WE-4RII (Waseda University); (2d) Pepper (Aldebaran Robotics); (2e) F.A.C.E. (Hanson Robotics); (2f) Robovie-X (Vstone Corp.); (2g) Einstein (Hanson Robotics); (2h) BERT2 (Elumotion/Bristol Robotics Lab); (3a) Virtual version of NAO [103]; (3b) Female virtual avatar; (3c) & (3d) Male virtual avatar [76]; (3e) Female virtual avatar [48]; (3f) Female virtual avatar [41]; (3f) Male virtual avatar [56]; (3g) Male virtual avatar [40].

II. PERCEPTION OF EMOTION IN ARTIFICIAL AGENTS

Humans use a range of facial and bodily expressions, as well as vocal tone, to communicate their emotional state and salient events in the environment to other individuals. Artificial agents are becoming progressively better at reading and appropriately responding to emotions expressed by humans [7], and many artificial agents have been programmed to display emotional reactions such as sadness or joy. From a design perspective, emotional reactions such as facial expressions often act as feedback to interaction partners. Psychology and neuroscience already provide a detailed picture of the processes underlying perception of human emotional expressions [3], [19], with the most studied emotional signals being facial and bodily expressions. Work from these fields can provide a framework for understanding the mechanisms of perception of artificial agents' emotional expressions.

Table 1. List of expressive robots reviewed in this article

Name           Type of Robot   Manufacturer                                                  Expressive capabilities
AIBO           Pet-like        Sony Corp.                                                    Facial and bodily expressions
ASIMO          Humanoid        Honda Motor Co.                                               Bodily expressions
BERT(2)        Humanoid        Elumotion/Bristol Robotics Lab                                Facial, bodily and vocal expressions
DustCart       Mechanical      DustBot Project                                               Facial expressions
Einstein       Humanoid        Hanson Robotics                                               Facial and vocal expressions
F.A.C.E.       Android         Hanson Robotics                                               Facial expressions
Feelix         Mechanical      [22]                                                          Facial expressions
Geminoid HI-2  Android         Advanced Telecommunication Research Institute International   Facial and bodily expressions

A. Facial Expressions

The crucial role played by faces in emotional expression during social interaction between humans has inspired creators of artificial agents to incorporate elements of dynamic facial expressions in both virtual and physically