Automatic, Dimensional and Continuous Emotion Recognition

International Journal of Synthetic Emotions, 1(1), 68-99, January-June 2010
IGI Publishing, 701 E. Chocolate Avenue, Hershey, PA 17033-1240, USA. Tel: 717/533-8845; Fax: 717/533-8661; URL: http://www.igi-global.com
This paper appears in the International Journal of Synthetic Emotions, Volume 1, Issue 1, edited by Jordi Vallverdú. © 2010, IGI Global.
DOI: 10.4018/jse.2010101605
Copyright © 2010, IGI Global. Copying or distributing in print or electronic forms without written permission of IGI Global is prohibited.

Hatice Gunes, Imperial College London, UK
Maja Pantic, Imperial College London, UK and University of Twente, EEMCS, The Netherlands

ABSTRACT

Recognition and analysis of human emotions have attracted a lot of interest in the past two decades and have been researched extensively in neuroscience, psychology, cognitive sciences, and computer sciences. Most of the past research in machine analysis of human emotion has focused on recognition of prototypic expressions of six basic emotions based on data that has been posed on demand and acquired in laboratory settings. More recently, there has been a shift toward recognition of affective displays recorded in naturalistic settings as driven by real-world applications. This shift in affective computing research is aimed toward subtle, continuous, and context-specific interpretations of affective displays recorded in real-world settings and toward combining multiple modalities for analysis and recognition of human emotion. Accordingly, this article explores recent advances in dimensional and continuous affect modelling, sensing, and automatic recognition from visual, audio, tactile, and brain-wave modalities.

Keywords: Bodily Expression, Continuous Emotion Recognition, Dimensional Emotion Modelling, Emotional Acoustic and Bio-signals, Facial Expression, Multimodal Fusion

INTRODUCTION

Human natural affective behaviour is multimodal, subtle and complex. In day-to-day interactions, people naturally communicate multimodally by means of language, vocal intonation, facial expression, hand gesture, head movement, body movement and posture, and possess a refined mechanism for understanding and interpreting information conveyed by these behavioural cues.

Despite the available range of cues and modalities in human-human interaction (HHI), the mainstream research on human emotion has mostly focused on facial and vocal expressions and their recognition in terms of seven discrete, basic emotion categories (neutral, happiness, sadness, surprise, fear, anger and disgust; Keltner & Ekman, 2000; Juslin & Scherer, 2005). In line with the aforementioned, most of the past research on automatic affect sensing and recognition has focused on recognition of facial and vocal expressions in terms of basic emotional states, and has been based on data posed on demand or acquired in laboratory settings (Pantic & Rothkrantz, 2003; Gunes, Piccardi, & Pantic, 2008; Zeng, Pantic, Roisman, & Huang, 2009). Additionally, each modality (visual, auditory, and tactile) has been considered in isolation. However, a number of researchers have shown that in everyday interactions people exhibit non-basic, subtle and rather complex mental/affective states like thinking, embarrassment or depression (Baron-Cohen & Tead, 2003). Such subtle and complex affective states can be expressed via tens (or possibly hundreds) of anatomically possible facial expressions, bodily gestures or physiological signals. Accordingly, a single label (or any small number of discrete classes) may not reflect the complexity of the affective state conveyed by such rich sources of information (Russell, 1980). Hence, a number of researchers advocate the use of dimensional descriptions of human affect, where an affective state is characterized in terms of a small number of latent dimensions (e.g., Russell, 1980; Scherer, 2000; Scherer, Schorr, & Johnstone, 2001).

It is not surprising, therefore, that researchers in automatic affect sensing and recognition have recently started exploring how to model, analyse and interpret the subtlety, complexity and continuity of affective behaviour in terms of latent dimensions, rather than in terms of a small number of discrete emotion categories.

A number of recent survey papers exist on automatic affect sensing and recognition (e.g., Gunes & Piccardi, 2008; Gunes et al., 2008; Zeng et al., 2009). However, none of those focus on dimensional affect analysis. This article, therefore, sets out to explore recent advances in human affect modelling, sensing, and automatic recognition from visual (i.e., facial and bodily expression), audio, tactile (i.e., heart rate, skin conductivity, thermal signals, etc.) and brain-wave (i.e., brain and scalp signals) modalities by providing an overview of theories of emotion (in particular the dimensional theories), expression and perception of emotions, data acquisition and annotation, and the current state of the art in automatic sensing and recognition of emotional displays using a dimensional (rather than categorical) approach.

BACKGROUND RESEARCH

Emotions are researched in various scientific disciplines such as neuroscience, psychology, and linguistics. Development of automated affective multimodal systems depends significantly on progress in the aforementioned sciences. Accordingly, we start our analysis by exploring the background in emotion theory, and human perception and recognition.

THEORIES OF EMOTION

According to research in psychology, three major approaches to emotion modelling can be distinguished (Grandjean, Sander, & Scherer, 2008): (1) the categorical approach, (2) the dimensional approach, and (3) the appraisal-based approach.

The categorical approach is based on research on basic emotions, pioneered by Darwin (1998), interpreted by Tomkins (1962, 1963) and supported by the findings of Ekman and his colleagues (1992, 1999). According to this approach, there exist a small number of emotions that are basic, hard-wired in our brain, and recognized universally (e.g., Ekman & Friesen, 2003). Ekman and his colleagues conducted various experiments on human judgment of still photographs of deliberately displayed facial behaviour and concluded that six basic emotions can be recognized universally. These emotions are happiness, sadness, surprise, fear, anger and disgust (Ekman, 1982). Although psychologists have suggested different numbers of such basic emotions, ranging from 2 to 18 categories (Ortony & Turner, 1990; Wierzbicka, 1992), there has been considerable agreement on the aforementioned six emotions. To date, Ekman's theory on the universality and interpretation of affective nonverbal expressions in terms of basic emotion categories has been the most commonly adopted approach in research on automatic affect recognition.

On the other hand, a number of researchers in psychology have argued that it is necessary to go beyond discrete emotions. Among various classification schemes, Baron-Cohen and his colleagues, for instance, have investigated cognitive mental states (e.g., agreement, concentrating, disagreement, thinking, reluctance, and interest) and their use in daily life. They did so via analysis of multiple asynchronous information sources such as facial actions, purposeful head gestures, and eye-gaze direction. They showed that cognitive mental states occur more often in everyday interactions than the basic emotions (Baron-Cohen & Tead, 2003). These states were also found relevant in representing problem-solving and decision-making processes in the human-computer interaction (HCI) context and have been used by a number of researchers, though based on deliberately displayed behaviour rather than on natural scenarios (e.g., El Kaliouby & Robinson, 2005).

According to the dimensional approach, affective states are not independent of one another; rather, they are related to one another in a systematic manner. In this approach, the majority of affect variability is covered by three dimensions: valence, arousal, and potency (dominance) (Davitz, 1964; Mehrabian & Russell, 1974; Osgood, Suci, & Tannenbaum, 1957). The valence dimension refers to how positive or negative the emotion is, and ranges from unpleasant feelings to pleasant feelings of happiness. The arousal dimension refers to how excited or apathetic the emotion is, and ranges from sleepiness or boredom to frantic excitement. The power dimension refers to the degree of power or sense of control over the emotion. Taking the aforementioned into account, a reasonable space of emotion can be modelled as illustrated in Figure 1a. Russell (1980) introduced a circular configuration called the Circumplex of Affect (see Figure 1b) and proposed that each basic emotion represents a bipolar entity being a part of the same emotional continuum. The proposed polars are arousal (relaxed vs. aroused) and valence (pleasant vs. unpleasant). As illustrated in Figure 1b, the proposed emotional space consists of four quadrants: low arousal positive, high arousal positive, low arousal negative, and high arousal negative.

Figure 1. Illustration of a) the three dimensions of emotion space (V-valence, A-arousal, P-power), and b) the distribution of the seven emotions in arousal-valence (A-V) space. Images adapted from
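The arousal-valence quadrant scheme above lends itself to a simple computational representation. The following is a minimal illustrative sketch, not from the article: it assumes both dimensions are normalised to [-1, 1], and the function name and the emotion anchor points are hypothetical choices that only qualitatively follow Russell's circumplex.

```python
# Minimal sketch of the arousal-valence (A-V) quadrant scheme.
# Assumptions (not from the article): valence and arousal are
# normalised to [-1, 1]; names and anchor values are illustrative.

def av_quadrant(valence: float, arousal: float) -> str:
    """Map a point in A-V space to one of the four quadrants."""
    v = "positive" if valence >= 0 else "negative"
    a = "high arousal" if arousal >= 0 else "low arousal"
    return f"{a} {v}"

# Hypothetical anchor points for a few emotions in (valence, arousal) form,
# placed only qualitatively after the circumplex of affect.
emotions = {
    "happiness": (0.8, 0.5),
    "sadness": (-0.7, -0.4),
    "anger": (-0.6, 0.7),
    "relaxed": (0.6, -0.6),
}

for name, (v, a) in emotions.items():
    print(f"{name}: {av_quadrant(v, a)}")
```

A dimensional recogniser would typically regress continuous valence and arousal values from sensed signals; a quadrant mapping like this is only a coarse discretisation on top of that continuous output.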
