Recommended publications
  • Neural Networks for Emotion Classification
    Neural Networks for Emotion Classification. For obtaining the master degree in Computer Science at Leiden Institute of Advanced Computer Science, under supervision of Dr. Michael S. Lew and Dr. Nicu Sebe, by Yafei Sun, August 2003. ACKNOWLEDGMENTS: I would like to express my heartfelt gratitude to Dr. Michael S. Lew for his support, invaluable guidance, time, and encouragement. I would like to express my sincere appreciation to Dr. Nicu Sebe for his countless ideas and advice, his encouragement, and for sharing his valuable knowledge with me. Thanks to Ernst Lindoorn and Jelle Westbroek for their help with the experiment setup. Thanks to all the friends who participated in the construction of the authentic emotion database. I would also like to thank Dr. Andre Deutz, Miss Riet Derogee and Miss Rachel Van Der Waal for helping me out with a problem in the scheduling of the oral presentation of my master thesis. Finally, a special word of thanks to Dr. Erwin Bakker for attending my graduation exam ceremony. Table of Contents: Abstract; 1 Introduction; 2 Background Research in Emotion Recognition; 2.1 An Ideal System for Facial Expression Recognition: Current Problems; 2.2 Face Detection and Feature Extraction
  • Truth Be Told
    Truth Be Told. washingtonpost.com, Sunday, November 25, 2007; Page N06. In the arena of lie detecting, it's important to remember that no emotion tells you its source. Think of Othello suffocating Desdemona because he interpreted her tears (over Cassio's death) as the reaction of an adulterer, not a friend. Othello made an assumption and killed his wife. Oops. The moral of the story: Just because someone exhibits the behavior of a liar does not make them one. "There isn't a silver bullet," psychologist Paul Ekman says. "That's because we don't have Pinocchio's nose. There is nothing in demeanor that is specific to a lie." Nevertheless, there are indicators that should prompt you to make further inquiries that may lead to discovering a lie. Below is a template for taking the first steps, gleaned from interviews with Ekman and several other experts, as well as their works. They are psychology professors Maureen O'Sullivan (University of San Francisco), Robert Feldman (University of Massachusetts) and Bella DePaulo (University of California at Santa Barbara); communication professor Mark Frank (University at Buffalo); and body language trainer Janine Driver (a.k.a. the Lyin' Tamer). How to Detect a Lie
  • An Analysis of Annotated Corpora for Emotion Classification in Text
    An Analysis of Annotated Corpora for Emotion Classification in Text. Laura-Ana-Maria Bostan and Roman Klinger, Institut für Maschinelle Sprachverarbeitung, University of Stuttgart, Pfaffenwaldring 5b, 70569 Stuttgart, Germany. [email protected] [email protected] Abstract: Several datasets have been annotated and published for classification of emotions. They differ in several ways: (1) the use of different annotation schemata (e.g., discrete label sets, including joy, anger, fear, or sadness, or continuous values, including valence or arousal), (2) the domain, and (3) the file formats. This leads to several research gaps: supervised models often only use a limited set of available resources. Additionally, no previous work has compared emotion corpora in a systematic manner. We aim at contributing to this situation with a survey of the datasets, and aggregate them in a common file format with a common annotation schema. Based on this aggregation, we perform the first cross-corpus classification experiments in the spirit of future research enabled by this paper, in order to gain insight and a better understanding of differences of models inferred from the data. This work also simplifies the choice of the most appropriate resources for developing a model for a novel domain. One result from our analysis is that a subset of corpora is better classified with models trained on a different corpus. For none of the corpora is training on all data altogether better than using a subselection of the resources. Our unified corpus is available at http://www.ims.uni-stuttgart.de/data/unifyemotion. Title and abstract in German: "Eine Analyse von annotierten Korpora zur Emotionsklassifizierung in Text" (An Analysis of Annotated Corpora for Emotion Classification in Text). Various text corpora already exist that were created for building models for automatic emotion classification.
  • Ekman, Emotional Expression, and the Art of Empirical Epiphany
    Journal of Research in Personality 38 (2004) 37–44, www.elsevier.com/locate/jrp. Ekman, emotional expression, and the art of empirical epiphany. Dacher Keltner, Department of Psychology, University of California, Berkeley, 3319 Tolman, 94720 Berkeley, CA, USA. Introduction: In the mid and late 1960s, Paul Ekman offered a variety of bold assertions, some seemingly more radical today than others (Ekman, 1984, 1992, 1993). Emotions are expressed in a limited number of particular facial expressions. These expressions are universal and evolved. Facial expressions of emotion are remarkably brief, typically lasting 1 to 5 s. And germane to the interests of the present article, these brief facial expressions of emotion reveal a great deal about people's lives. In the present article I will present evidence that supports this last notion advanced by Ekman, that brief expressions of emotion reveal important things about the individual's life course. To do so I first theorize about how individual differences in emotion shape the life context. With this reasoning as backdrop, I then review four kinds of evidence that indicate that facial expression is revealing of the life that the individual has led and is likely to continue leading. Individual differences in emotion and the shaping of the life context: People, as a function of their personality or psychological disorder, create the situations in which they act (e.g., Buss, 1987). Individuals selectively attend to certain features of complex situations, thus endowing contexts with idiosyncratic meaning. Individuals evoke responses in others, thus shaping the shared, social meaning of the situation.
  • E_motions in Process Barbara Rauch
    e_motions in process. Barbara Rauch. Abstract: This research project maps virtual emotions. Rauch uses 3D-surface capturing devices to scan facial expressions in (stuffed) animals and humans, which she then sculpts with the Phantom Arm/SensAble FreeForm device in 3D virtual space. The results are rapidform-printed objects and 3D animations of morphing faces and gestures. Building on her research into consciousness studies and emotions, she has developed a new artwork to reveal characteristic aspects of human emotions (i.e. laughing, crying, frowning, sneering, etc.), which utilises new technology, in particular digital scanning devices and special-effects animation software. The proposal is to use a 3D high-resolution laser scanner to capture animal faces and, using the data of these faces, animate and then combine them with human emotional facial expressions. The morphing of the human and animal facial data is not merely a layering of the different scans; by applying an algorithmic programme to the data, crucial landmarks in the animal face are merged in order to match those of the human. The results are morphings of the physical characteristics of animals with the emotional characteristics of the human face in 3D. The focus of this interdisciplinary research project is a collaborative practice that brings together researchers from UCL in London and researchers at OCAD University's data and information visualization lab. Rauch uses Darwin's metatheory of the continuity of species and other theories on evolution and internal physiology (Ekman et al.) in order to re-examine previous and new theories with the use of new technologies, including the SensAble FreeForm device, which, as an interface, allows for haptic feedback from digital data. Keywords: interdisciplinary research, 3D-surface capturing, animated facial expressions, evolution of emotions and feelings, technologically transformed realities.
  • A Survey on Performance Evaluation of Emotion Recognition on Unison Model with LSSVM Classifier
    IJRECE Vol. 7 Issue 2 (April–June 2019), ISSN: 2393-9028 (print) | ISSN: 2348-2281 (online). A Survey on Performance Evaluation of Emotion Recognition on Unison Model with LSSVM Classifier. Miss. Mayuri Nikam, Sheetal Thokal, Department of Computer Engineering, JSPM's Imperial College of Engineering and Research, Wagholi, Pune. Abstract: The analysis of social networks is a very challenging research area; a fundamental aspect concerns the detection of user communities. The existing work on emotion recognition on Twitter depends specifically on the use of lexicons and simple classifiers on bag-of-words models. The vital question of our observation is whether or not we can enhance their overall performance using machine learning algorithms. Plutchik extended Ekman's categorization with two additional emotions and presented his categorization in a wheel of emotions. Finally, Profile of Mood States (POMS) is a psychological instrument that defines a six-dimensional mood state representation. The novel POMS-based algorithm generates a twelve-dimensional mood state representation using 65 adjectives, combining Ekman's and Plutchik's emotion categories: anger, depression, fatigue, vigour, tension, confusion, joy, disgust, fear, trust, surprise and anticipation. Previous work generally studied only one emotion classification; working with multiple categorizations simultaneously enables performance comparisons between different emotion categorizations on the same data. These emotions are classified with the help of text-based bag-of-words and LSI algorithms.
  • Facial Expression and Emotion Paul Ekman
    1992 Award Addresses. Facial Expression and Emotion. Paul Ekman. Cross-cultural research on facial expression and the development of methods to measure facial expression are briefly summarized. What has been learned about emotion from this work on the face is then elucidated. Four questions about facial expression and emotion are discussed: What information does an expression typically convey? Can there be emotion without facial expression? Can there be a facial expression of emotion without emotion? How do individuals differ in their facial expressions of emotion? We found evidence of universality in spontaneous expressions and in expressions that were deliberately posed. We postulated display rules, culture-specific prescriptions about who can show which emotions, to whom, and when, to explain how cultural differences may conceal universals in expression, and in an experiment we showed how that could occur. In the last five years, there have been a few challenges to the evidence of universals, particularly from anthropologists (see review by Lutz & White, 1986). There is, however, no quantitative data to support the claim that expressions are culture specific. The accounts are more anecdotal, without control for the possibility of observer bias and without evidence of interobserver reliability. There have been recent challenges also from psychologists (J. A. Russell, personal communication, June 1992) who study how words are used to judge photographs of facial expression. In 1965, when I began to study facial expression, few thought there was much to be learned. Goldstein (1981) pointed out that a number of famous psychologists (F. and G. Allport, Brunswik, Hull, Lindzey, Maslow, Osgood, Titchner) did only one facial study, which was not what earned them their reputations.
  • Emotion Classification Based on Biophysical Signals and Machine Learning Techniques
    Symmetry, Article: Emotion Classification Based on Biophysical Signals and Machine Learning Techniques. Oana Bălan 1,*, Gabriela Moise 2, Livia Petrescu 3, Alin Moldoveanu 1, Marius Leordeanu 1 and Florica Moldoveanu 1. 1 Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Bucharest 060042, Romania; [email protected] (A.M.); [email protected] (M.L.); fl[email protected] (F.M.). 2 Department of Computer Science, Information Technology, Mathematics and Physics (ITIMF), Petroleum-Gas University of Ploiesti, Ploiesti 100680, Romania; [email protected]. 3 Faculty of Biology, University of Bucharest, Bucharest 030014, Romania; [email protected]. * Correspondence: [email protected]; Tel.: +40722276571. Received: 12 November 2019; Accepted: 18 December 2019; Published: 20 December 2019. Abstract: Emotions constitute an indispensable component of our everyday life. They consist of conscious mental reactions towards objects or situations and are associated with various physiological, behavioral, and cognitive changes. In this paper, we propose a comparative analysis between different machine learning and deep learning techniques, with and without feature selection, for binarily classifying the six basic emotions, namely anger, disgust, fear, joy, sadness, and surprise, into two symmetrical categorical classes (emotion and no emotion), using the physiological recordings and subjective ratings of valence, arousal, and dominance from the DEAP (Dataset for Emotion Analysis using EEG, Physiological and Video Signals) database. The results showed that the maximum classification accuracies for each emotion were: anger: 98.02%, joy: 100%, surprise: 96%, disgust: 95%, fear: 90.75%, and sadness: 90.08%. In the case of four emotions (anger, disgust, fear, and sadness), the classification accuracies were higher without feature selection.
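    The binary setup the abstract describes (each emotion vs. "no emotion") can be sketched as follows. This is a minimal illustration on synthetic stand-ins for DEAP-style physiological features, not the paper's actual pipeline, feature set, or classifier configuration; the shapes and the labeling rule are assumptions for the example.

    ```python
    # Sketch: binary emotion-present vs. emotion-absent classification on
    # synthetic "physiological" features (assumed shapes; illustrative only).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32))                   # 200 trials x 32 features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic "emotion" label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)          # one binary classifier per emotion
    print(f"accuracy: {clf.score(X_te, y_te):.2f}")
    ```

    In the paper's setting, one such classifier would be trained per emotion, with the reported accuracies coming from the real DEAP recordings rather than synthetic data.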
  • LNU-THESIS-2016.Pdf (2.995Mb): Real Time Classification of Emotions to Control Stage Lighting During Dance Performance
    REAL TIME CLASSIFICATION OF EMOTIONS TO CONTROL STAGE LIGHTING DURING DANCE PERFORMANCE. A Thesis Presented to the Faculty of the Department of Biomedical Engineering, University of Houston, in Partial Fulfillment of the Requirements for the Degree Master of Science in Biomedical Engineering, by Shruti Ray, August 2016. Approved: Chair of the Committee: Dr. Jose Luis Contreras-Vidal, Professor, Department of Electrical and Computer Engineering. Committee Members: Dr. Ahmet Omurtag, Associate Professor, Department of Biomedical Engineering; Dr. Saurabh Prasad, Assistant Professor, Department of Electrical and Computer Engineering. Dr. Suresh K. Khator, Associate Dean, Cullen College of Engineering; Dr. Metin Akay, Founding Chair, John S. Dunn Cullen Endowed Professor, Department of Biomedical Engineering. Acknowledgement: I would like to show my deepest gratitude to my advisor, Dr. Jose Luis Contreras-Vidal, for his continuous guidance, encouragement and support throughout this research project. I would also like to thank my colleagues from the Laboratory for Noninvasive Brain-Machine Interface Systems for their immense support and encouragement and their help in data collection for analysis. I would like to thank Ms. Rebecca B. Valls and Ms. Anastasiya Kopteva for their dance performances with EEG caps to help me with the data collection. Additionally, I would like to thank all my friends Su Liu, Thomas Potter, Dr. Kinjal Dhar Gupta and my sister Shreya Ray, who have supported me in both happy and adverse conditions. Last, but not least, I would like to thank my parents and family for believing in my dreams and supporting my quest for higher education.
  • Summarizing Emotions from Text Using Plutchik's Wheel of Emotions
    Advances in Intelligent Systems Research, volume 166. 7th Scientific Conference on Information Technologies for Intelligent Decision Making Support (ITIDS 2019). Summarizing Emotions from Text Using Plutchik's Wheel of Emotions. Mohsin Manshad Abbasi and Anatoly Beltiukov, Theoretical Foundation of Computer Sciences, Udmurt State University, Izhevsk, Russian Federation. [email protected] [email protected] Abstract: Text is an important and major source of communication over the Internet. It is analyzed to identify interesting information and trends of communication. Within this work, we analyze emotions expressed by people on the Internet using Plutchik's wheel of emotions. Plutchik's wheel of emotions is used as a tool to identify and summarize emotions into their primary classes. To accomplish this, we allocate a weight to each emotion depending upon the class it belongs to and its distance from the center of Plutchik's wheel of emotions. These weights are then multiplied by the frequencies of emotions in the text to identify their intensity level. We analyze and summarize the emotions from text using the concept of Plutchik's wheel of emotions [1]. In the 1980s, Robert Plutchik divided emotions into eight main categories. Half of these emotions are positive, and the other half are negative; they are seen as opposite to each other. We can observe this among secondary emotions: joy is opposite to sadness, surprise is opposite to anticipation, trust is opposite to disgust, and anger is opposite to fear. He explained each emotion in detail and divided it
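    The weighting scheme described in the abstract (a weight per emotion word based on its primary class and its ring on the wheel, multiplied by its frequency in the text) can be sketched as follows. The weights and the tiny lexicon here are hypothetical; the paper's exact values are not given in this excerpt.

    ```python
    # Sketch of the abstract's idea: weight emotion words by their intensity ring
    # on Plutchik's wheel (inner ring = most intense = highest weight), then
    # multiply by word frequency to score each primary class. Hypothetical values.
    from collections import Counter

    # word -> (primary class, ring weight); a tiny illustrative lexicon
    LEXICON = {"ecstasy": ("joy", 3), "joy": ("joy", 2), "serenity": ("joy", 1),
               "rage": ("anger", 3), "anger": ("anger", 2), "annoyance": ("anger", 1)}

    def summarize(tokens):
        """Accumulate weight * frequency per primary emotion class."""
        freq = Counter(t.lower() for t in tokens)
        scores = {}
        for word, (primary, weight) in LEXICON.items():
            if freq[word]:
                scores[primary] = scores.get(primary, 0) + weight * freq[word]
        return scores

    print(summarize(["joy", "joy", "rage", "serenity"]))  # {'joy': 5, 'anger': 3}
    ```

    A full implementation would cover all eight primary classes and all three rings, but the scoring step is the same multiply-and-accumulate shown here.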
  • Spin the Wheel of Emotions Well-Being Level 4-6
    Spin the Wheel of Emotions. Emotional Well-Being, Grade Level 4-6. Materials: Access to the internet. Learning Outcome: Recognize a variety of emotions including opposite emotions, mixed emotions, and intensities of emotions. Description: Ask the child to name as many emotions as possible. After they have done so, visit Plutchik's Wheel of Emotions webpage and look at the wheel of emotions together. Review each section of the wheel and think about what the sections have in common. Observe the emotions with no colour and guess what they may mean. Read all of the emotions and provide a definition for any that the child does not know. After reviewing the Wheel of Emotions, share with the child that Robert Plutchik was a psychologist who stated there are eight basic emotions: joy, trust, fear, surprise, sadness, anticipation, anger, and disgust. Each basic emotion has a polar opposite. This means: Joy is the opposite of sadness; Fear is the opposite of anger; Anticipation is the opposite of surprise; Disgust is the opposite of trust. Look at the webpage again and explain to the child that the emotions with no colour represent an emotion that is a mix of two of the basic emotions. For example, the emotions of anticipation and joy combine to be the emotion of optimism. Also, explain that emotions get more intense as they move from the outside of the wheel to the center of the wheel. You can see this represented on the wheel with the darker shades representing the most intense emotions. After looking at the wheel again, ask the child: What do you think
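    The structure this activity walks through (four opposite pairs, plus mixed emotions such as anticipation + joy = optimism) can be written down as a small lookup, which may help when building anything on top of the wheel. Only the pairs and the one dyad named in the activity are included; anything further would be an assumption.

    ```python
    # The four opposite pairs and one example dyad from the activity above.
    OPPOSITE = {"joy": "sadness", "fear": "anger",
                "anticipation": "surprise", "disgust": "trust"}
    DYADS = {frozenset({"anticipation", "joy"}): "optimism"}

    def opposite(emotion):
        """Return the polar opposite; the pairing is symmetric, so check both ways."""
        inverse = {v: k for k, v in OPPOSITE.items()}
        return OPPOSITE.get(emotion) or inverse.get(emotion)

    print(opposite("sadness"))                          # joy
    print(DYADS[frozenset({"joy", "anticipation"})])    # optimism
    ```

    Using a `frozenset` key makes the dyad lookup order-independent, mirroring the wheel, where a mix of two basic emotions is the same emotion regardless of which you name first.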
  • Facial Expressions of Emotion Influence Interpersonal Trait Inferences
    FACIAL EXPRESSIONS OF EMOTION INFLUENCE INTERPERSONAL TRAIT INFERENCES. Brian Knutson. ABSTRACT. Theorists have argued that facial expressions of emotion serve the interpersonal function of allowing one animal to predict another's behavior. Humans may extend these predictions into the indefinite future, as in the case of trait inference. The hypothesis that facial expressions of emotion (e.g., anger, disgust, fear, happiness, and sadness) affect subjects' interpersonal trait inferences (e.g., dominance and affiliation) was tested in two experiments. Subjects rated the dispositional affiliation and dominance of target faces with either static or apparently moving expressions. They inferred high dominance and affiliation from happy expressions, high dominance and low affiliation from angry and disgusted expressions, and low dominance from fearful and sad expressions. The findings suggest that facial expressions of emotion convey not only a target's internal state, but also differentially convey interpersonal information, which could potentially seed trait inference. What do people infer from facial expressions of emotion? Darwin (1872/1962) suggested that facial muscle movements which originally subserved individual survival problems (e.g., spitting out noxious food, shielding the eyes) eventually allowed animals to predict the behavior of their conspecifics. Current emotion theorists agree that emotional facial expressions can serve social predictive functions (e.g., Ekman, 1982; Izard, 1972; Plutchik, 1980). For instance, Frank (1988) hypothesizes that a person might signal a desire to cooperate by smiling at another. Although a viewer may predict a target's immediate behavior on the basis of his or her facial expressions, the viewer may also extrapolate to the more distant future, as in the case of inferring personality traits.