Emotion Classification Using Facial Expression

Total Pages: 16

File Type: pdf, Size: 1020 KB

Emotion Classification Using Facial Expression

(IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 2, No. 7, 2011

Devi Arumugam, Research Scholar, Department of Computer Science, Mother Teresa Women's University, Kodaikanal, India.
Dr. S. Purushothaman, B.E., M.E., Ph.D (IIT-M), Principal, Sun College of Engineering and Technology, Sun Nagar, Erachakulam, Kanyakumari, India.

Abstract— Human emotional facial expressions play an important role in interpersonal relations, because humans demonstrate and convey much evident information visually rather than verbally. Although humans recognize facial expressions virtually without effort or delay, reliable expression recognition by machine remains a challenge today. To automate recognition of emotional state, machines must be taught to understand facial gestures. In this paper we develop an algorithm that identifies a person's emotional state, such as angry, disgust, or happy, from facial expression, across different age groups of people in different situations. We use a Radial Basis Function Network (RBFN) for classification, and Fisher's Linear Discriminant (FLD) and Singular Value Decomposition (SVD) for feature selection.

Keywords— Universal Emotions; FLD; SVD; Eigen Values; Eigen Vector; RBF Network.

I. INTRODUCTION

Human beings express their emotions in everyday interactions with others. Emotions are frequently reflected on the face, in hand and body gestures, and in the voice. Recent psychology research has shown that the most expressive way humans display emotions is through facial expressions. Mehrabian [1] indicated that the verbal part of a message contributes only 7% to the effect of the message as a whole, the vocal part 38%, and facial expressions 55%. Emotions are feelings or responses to a particular situation or environment, and they are an integral part of our existence: one smiles to show greeting, frowns when confused, or raises one's voice when enraged. Because we understand others' emotions and react to those expressions, they enrich our interactions. Computers, by contrast, are "emotionally challenged": they neither recognize others' emotions nor possess emotions of their own [2].

The goal is to enrich the human-computer interface from point-and-click to sense-and-feel: to develop non-intrusive sensors and lifelike software agents that can express and understand emotion. Computer systems with this capability have a wide range of applications in different research areas, including security, law enforcement, clinical practice, education, psychiatry, and telecommunications [4]. There has been much research on recognizing emotion through facial expressions.

In emotional classification there are two basic emotions, love and fear, on which a division into positive and negative emotions is based. The six basic emotions are angry, happy, fear, disgust, sad, and surprise; one more expression is neutral. Other emotions include embarrassment, interest, pain, shame, shyness, anticipation, smiling, laughing, sorrow, hunger, and curiosity. Each emotion may be expressed in different ways: anger as enraged, annoyed, anxious, irritated, resentful, miffed, upset, mad, furious, or raging; happiness as joy, ecstatic, fulfilled, contented, glad, complete, satisfied, or pleased; disgust as contempt, exhausted, peeved, upset, or bored.

These emotions have characteristic facial configurations. If we are angry, the brows are lowered and drawn together, vertical lines appear between the brows, the lower lid is tensed, the eyes hold a hard stare or bulge, the lips can be pressed firmly together with corners down or in a square shape as if shouting, the nostrils may be dilated, and the lower jaw juts out. If we are happy, the corners of the lips are drawn back and up, the mouth may or may not be parted with teeth exposed, a wrinkle runs from the outer nose to the outer lip, the cheeks are raised, the lower lid may show wrinkles or be tense, and crow's feet appear near the outside of the eyes. Disgust has been identified as one of the basic emotions; its basic definition is "bad taste", extended secondarily to anything which causes a similar feeling through the sense of smell, touch, or even sight. The well-defined facial expression of disgust is characterized by furrowing of the eyebrows, closure of the eyes and pupil constriction, wrinkling of the nose, upper-lip retraction and upward movement of the lower lip and chin, and drawing the corners of the mouth down and back.

This research describes a neural-network-based approach for emotion classification: we learn a classifier that can recognize three basic emotions.

II. RELATED WORK

There are several approaches taken in the literature for learning classifiers for emotion recognition [2][6].

In the static approach, the classifier assigns each frame of a video to one of the facial expression categories based on the tracking results of that frame. Bayesian network classifiers, and often Naïve Bayes classifiers, were commonly used in this approach; because of the unrealistic independence assumption, some works used Gaussian classifiers instead.

In the dynamic approach, classifiers take into account the temporal pattern of the displayed facial expression. Hidden Markov Model (HMM) based classifiers have been used for facial expression recognition in recent works; Cohen and Sebe [4] further advanced this line of research and proposed a multi-level HMM classifier.

This field has been a research interest for scientists from several different tracks, such as computer science, engineering, psychology, and neuroscience [1]. In this paper, we propose a complete emotional classification using facial expression. The network is trained with the various facial emotions of people of different age groups in different situations; given an input, the RBF network, with the help of FLD and the training examples, produces the output.

The foundational research was developed by Paul Ekman [3] from a psychology perspective. Since the early 1970s, Ekman and his colleagues have performed extensive studies of human facial expression [7]. They found evidence to support universality in facial expressions: these "universal facial expressions" are those representing happiness, sadness, anger, fear, surprise, and disgust. They studied facial expressions in different cultures, including preliterate cultures, and found much commonality in the expression and recognition of emotions on the face. However, they observed that differences in expression are governed by "display rules" in different social contexts. For example, Japanese subjects and American subjects showed similar facial expressions while viewing the same stimulus film, yet in the presence of authorities the Japanese viewers were more reluctant to show their real expressions. In the early 1990s the engineering community started to use these results to construct automatic methods of recognizing emotions from facial expressions in images or video [4], based on various tracking techniques [5]. An important problem in the emotion recognition field is the lack of an agreed benchmark database and of methods for comparing the performance of different approaches; the Cohn-Kanade database is a step in this direction [2].

Work in computer-assisted quantification of facial expressions did not start until the 1990s. Mase [10] used optical flow (OF) to recognize facial expressions and was one of the first to apply image-processing techniques to this problem. Lanitis [11] used a flexible shape and appearance model for image coding, person identification, pose recovery, gender recognition, and facial expression recognition. Black and Yacoob [12] used local parameterized models of image motion to recover non-rigid motion; once recovered, these parameters were used as inputs to a rule-based classifier to recognize the six basic facial expressions. Yacoob and Davis [13] computed optical flow and used similar rules to classify the six facial expressions. Rosenblum, Yacoob, and Davis [14] also computed optical flow of regions on the face and then applied a radial basis function network to classify expressions. Essa and Pentland [15] used an optical-flow region-based method to recognize expressions. Donato et al. [16] tested different features for recognizing facial AUs and inferring the facial expression in the frame. Otsuka and Ohya [17] first computed optical flow, then computed 2D Fourier transform coefficients, which were used as feature vectors for a hidden Markov model (HMM) to classify expressions; the trained system was able to recognize one of the six expressions in near real time (about 10 Hz), and the tracked motions were further used to control the facial expression of an animated Kabuki system [18]. Martinez [19] introduced an indexing approach based on the identification of frontal face images under different illumination conditions, facial expressions, and occlusions; a Bayesian approach was used to find the best match between the local observations and the learned local feature model, and an HMM was employed to achieve good recognition even when the new conditions did not correspond to…
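The pipeline the paper outlines — dimensionality reduction over flattened face-image vectors followed by a radial basis function network — can be sketched roughly as below. This is an illustrative reconstruction, not the authors' implementation: the RBFN here picks its centers with k-means and fits the output weights by least squares, the "face" data are random stand-ins for real images, and every size and parameter value is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import TruncatedSVD

class RBFN:
    """Minimal radial basis function network: Gaussian hidden units whose
    centers come from k-means, and linear output weights fit by least squares."""

    def __init__(self, n_centers=9, gamma=None):
        self.n_centers = n_centers
        self.gamma = gamma  # if None, chosen from the data scale in fit()

    def _phi(self, X):
        # Gaussian activations exp(-gamma * ||x - c||^2) for every center c
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-self.gamma_ * d2)

    def fit(self, X, y):
        km = KMeans(n_clusters=self.n_centers, n_init=10, random_state=0).fit(X)
        self.centers_ = km.cluster_centers_
        d2 = ((X[:, None, :] - self.centers_[None, :, :]) ** 2).sum(axis=-1)
        self.gamma_ = self.gamma if self.gamma is not None else 1.0 / d2.mean()
        self.classes_, y_idx = np.unique(y, return_inverse=True)
        targets = np.eye(len(self.classes_))[y_idx]           # one-hot labels
        self.weights_, *_ = np.linalg.lstsq(self._phi(X), targets, rcond=None)
        return self

    def predict(self, X):
        return self.classes_[np.argmax(self._phi(X) @ self.weights_, axis=1)]

# Stand-in data: random "flattened face" vectors for three expression classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(30, 64)) for c in (0.0, 1.0, 2.0)])
y = np.repeat(["angry", "disgust", "happy"], 30)

Z = TruncatedSVD(n_components=8, random_state=0).fit_transform(X)  # SVD features
clf = RBFN(n_centers=9).fit(Z, y)
print(f"training accuracy: {(clf.predict(Z) == y).mean():.2f}")
```

The paper pairs SVD with Fisher's Linear Discriminant for feature selection; swapping `TruncatedSVD` for scikit-learn's `LinearDiscriminantAnalysis` transform would approximate that variant.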
Recommended publications
  • Neural Networks for Emotion Classification
    Neural Networks for Emotion Classification. For obtaining the master degree in computer science at Leiden Institute of Advanced Computer Science, under supervision of Dr. Michael S. Lew and Dr. Nicu Sebe, by Yafei Sun, August 2003. ACKNOWLEDGMENTS: I would like to express my heartfelt gratitude to Dr. Michael S. Lew for his support, invaluable guidance, time, and encouragement. I'd like to express my sincere appreciation to Dr. Nicu Sebe for his countless ideas and advice, his encouragement and sharing his valuable knowledge with me. Thanks to Ernst Lindoorn and Jelle Westbroek for their help on the experiments setup. Thanks to all the friends who participated in the construction of the authentic emotion database. I would also like to thank Dr. Andre Deutz, Miss Riet Derogee and Miss Rachel Van Der Waal for helping me out with a problem on the scheduling of my oral presentation of the master thesis. Finally, a special word of thanks to Dr. Erwin Bakker for attending my graduation exam ceremony. Table of Contents: Abstract (5); 1 Introduction (5); 2 Background Research in Emotion Recognition (8); 2.1 An Ideal System for Facial Expression Recognition: Current Problems (11); 2.2 Face Detection and Feature Extraction (11).
  • Truth Be Told
    Truth Be Told. washingtonpost.com, Sunday, November 25, 2007; Page N06. In the arena of lie detecting, it's important to remember that no emotion tells you its source. Think of Othello suffocating Desdemona because he interpreted her tears (over Cassio's death) as the reaction of an adulterer, not a friend. Othello made an assumption and killed his wife. Oops. The moral of the story: just because someone exhibits the behavior of a liar does not make them one. "There isn't a silver bullet," psychologist Paul Ekman says. "That's because we don't have Pinocchio's nose. There is nothing in demeanor that is specific to a lie." Nevertheless, there are indicators that should prompt you to make further inquiries that may lead to discovering a lie. Below is a template for taking the first steps, gleaned from interviews with Ekman and several other experts, as well as their works. They are psychology professors Maureen O'Sullivan (University of San Francisco), Robert Feldman (University of Massachusetts) and Bella DePaulo (University of California at Santa Barbara); communication professor Mark Frank (University at Buffalo); and body language trainer Janine Driver (a.k.a. the Lyin' Tamer).
  • Emotion Detection in Twitter Posts: a Rule-Based Algorithm for Annotated Data Acquisition
    2020 International Conference on Computational Science and Computational Intelligence (CSCI). Emotion detection in Twitter posts: a rule-based algorithm for annotated data acquisition. Maria Krommyda, Anastatios Rigos, Kostas Bouklas, Angelos Amditis; Institute of Communication and Computer Systems, Athens, Greece. Abstract—Social media analysis plays a key role to the understanding of the public's opinion regarding recent events and decisions, to the design and management of advertising campaigns, as well as to the planning of next steps and mitigation actions for public relationship initiatives. Significant effort has been dedicated recently to the development of data analysis algorithms that will perform, in an automated way, sentiment analysis over publicly available text. Most of the available research work focuses on binary categorizing text as positive or negative without further investigating the emotions leading to that categorization. The current needs, however, for in-depth analysis of the available content, combined with the complexity and multi-… …the person that provided the text, and it is categorized as positive, negative, or neutral. Such analysis is mainly used for movies, products and persons, when measuring their appeal to the public is of interest [4], and requires extended quantities of text collected over a long period of time from different sources to ensure an unbiased and objective output. While this technique is very popular and well established, with many important use cases and applications, there are other equally important use cases where such analysis fails to provide the needed information.
  • Ekman, Emotional Expression, and the Art of Empirical Epiphany
    Journal of Research in Personality 38 (2004) 37–44, www.elsevier.com/locate/jrp. Ekman, emotional expression, and the art of empirical epiphany. Dacher Keltner, Department of Psychology, University of California, Berkeley, 3319 Tolman, 94720 Berkeley, CA, USA. Introduction: In the mid and late 1960s, Paul Ekman offered a variety of bold assertions, some seemingly more radical today than others (Ekman, 1984, 1992, 1993). Emotions are expressed in a limited number of particular facial expressions. These expressions are universal and evolved. Facial expressions of emotion are remarkably brief, typically lasting 1 to 5 s. And germane to the interests of the present article, these brief facial expressions of emotion reveal a great deal about people's lives. In the present article I will present evidence that supports this last notion advanced by Ekman, that brief expressions of emotion reveal important things about the individual's life course. To do so I first theorize about how individual differences in emotion shape the life context. With this reasoning as backdrop, I then review four kinds of evidence that indicate that facial expression is revealing of the life that the individual has led and is likely to continue leading. Individual differences in emotion and the shaping of the life context: People, as a function of their personality or psychological disorder, create the situations in which they act (e.g., Buss, 1987). Individuals selectively attend to certain features of complex situations, thus endowing contexts with idiosyncratic meaning. Individuals evoke responses in others, thus shaping the shared, social meaning of the situation.
  • E Motions in Process Barbara Rauch
    e_motions in process. Barbara Rauch. Abstract: This research project maps virtual emotions. Rauch uses 3D-surface capturing devices to scan facial expressions in (stuffed) animals and humans, which she then sculpts with the Phantom Arm/SensAble FreeForm device in 3D virtual space. The results are rapidform printed objects and 3D animations of morphing faces and gestures. Building on her research into consciousness studies and emotions, she has developed a new artwork to reveal characteristic aspects of human emotions (i.e. laughing, crying, frowning, sneering, etc.), which utilises new technology, in particular digital scanning devices and special effects animation software. The proposal is to use a 3D high-resolution laser scanner to capture animal faces and, using the data of these faces, animate and then combine them with human emotional facial expressions. The morphing of the human and animal facial data are not merely layers of the different scans; rather, by applying an algorithmic programme to the data, crucial landmarks in the animal face are merged in order to match with those of the human. The results are morphings of the physical characteristics of animals with the emotional characteristics of the human face in 3D. The focus of this interdisciplinary research project is a collaborative practice that brings together researchers from UCL in London and researchers at OCAD University's data and information visualization lab. Rauch uses Darwin's metatheory of the continuity of species and other theories on evolution and internal physiology (Ekman et al) in order to re-examine previous and new theories with the use of new technologies, including the SensAble FreeForm Device, which, as an interface, allows for haptic feedback from digital data. Keywords: interdisciplinary research, 3D-surface capturing, animated facial expressions, evolution of emotions and feelings, technologically transformed realities.
  • Machine Learning Based Emotion Classification Using the Covid-19 Real World Worry Dataset
    Anatolian Journal of Computer Sciences, Volume: 6, No: 1, 2021, © Anatolian Science, pp. 24–31, ISSN: 2548-1304. Machine Learning Based Emotion Classification Using the COVID-19 Real World Worry Dataset. Hakan ÇAKAR (1*), Abdulkadir ŞENGÜR (2); (1, 2) Fırat Üniversitesi, Teknoloji Fak., Elektrik-Elektronik Müh. Bölümü, Elazığ, Türkiye. Received: Sep. 27, 2020; Accepted: Jan. 1, 2021; Published: Mar. 1, 2021. Abstract— The COVID-19 pandemic has had a dramatic impact on economies and communities all around the world. With social distancing in place and various measures of lockdown, it becomes significant to understand emotional responses on a great scale. In this paper, a study is presented that determines human emotions during COVID-19 using various machine learning (ML) approaches. To this end, techniques such as Decision Trees (DT), Support Vector Machines (SVM), k-nearest neighbor (k-NN), Neural Networks (NN) and Naïve Bayes (NB) methods are used in the determination of human emotions. The mentioned techniques are applied to the Real World Worry Dataset (RWWD), which was collected during COVID-19. The dataset, which covers eight emotions on a 9-point scale, grading anxiety levels about the COVID-19 situation, was collected from 2500 participants. The performance evaluation of the ML techniques on emotion prediction is carried out using the accuracy score, and five-fold cross-validation is adopted in the experiments. The experimental results show that the ML approaches are promising in determining emotion in the COVID-19 RWWD.
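The evaluation protocol this abstract describes — several standard classifiers compared by accuracy under five-fold cross-validation — can be sketched with scikit-learn. This is a generic illustration, not the study's code: the RWWD data are replaced here by a synthetic stand-in from `make_classification`, and all parameter choices are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for extracted text features (the study used RWWD).
X, y = make_classification(n_samples=500, n_features=40, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

models = {
    "DT":   DecisionTreeClassifier(random_state=0),
    "SVM":  make_pipeline(StandardScaler(), SVC()),
    "k-NN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "NB":   GaussianNB(),
}

# Five-fold cross-validated accuracy, as in the paper's protocol.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name:5s} mean accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```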
  • Facial Expression and Emotion Paul Ekman
    1992 Award Addresses. Facial Expression and Emotion. Paul Ekman. Cross-cultural research on facial expression and the developments of methods to measure facial expression are briefly summarized. What has been learned about emotion from this work on the face is then elucidated. Four questions about facial expression and emotion are discussed: What information does an expression typically convey? Can there be emotion without facial expression? Can there be a facial expression of emotion without emotion? How do individuals differ in their facial expressions of emotion? We found evidence of universality in spontaneous expressions and in expressions that were deliberately posed. We postulated display rules—culture-specific prescriptions about who can show which emotions, to whom, and when—to explain how cultural differences may conceal universals in expression, and in an experiment we showed how that could occur. In the last five years, there have been a few challenges to the evidence of universals, particularly from anthropologists (see review by Lutz & White, 1986). There is, however, no quantitative data to support the claim that expressions are culture specific. The accounts are more anecdotal, without control for the possibility of observer bias and without evidence of interobserver reliability. There have been recent challenges also from psychologists (J. A. Russell, personal communication, June 1992) who study how words are used to judge photographs of facial expression. In 1965, when I began to study facial expression, few thought there was much to be learned. Goldstein (1981) pointed out that a number of famous psychologists—F. and G. Allport, Brunswik, Hull, Lindzey, Maslow, Osgood, Titchener—did only one facial study, which was not what earned them their reputations.
  • Emotion Classification Based on Biophysical Signals and Machine Learning Techniques
    S S symmetry Article Emotion Classification Based on Biophysical Signals and Machine Learning Techniques Oana Bălan 1,* , Gabriela Moise 2 , Livia Petrescu 3 , Alin Moldoveanu 1 , Marius Leordeanu 1 and Florica Moldoveanu 1 1 Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Bucharest 060042, Romania; [email protected] (A.M.); [email protected] (M.L.); fl[email protected] (F.M.) 2 Department of Computer Science, Information Technology, Mathematics and Physics (ITIMF), Petroleum-Gas University of Ploiesti, Ploiesti 100680, Romania; [email protected] 3 Faculty of Biology, University of Bucharest, Bucharest 030014, Romania; [email protected] * Correspondence: [email protected]; Tel.: +40722276571 Received: 12 November 2019; Accepted: 18 December 2019; Published: 20 December 2019 Abstract: Emotions constitute an indispensable component of our everyday life. They consist of conscious mental reactions towards objects or situations and are associated with various physiological, behavioral, and cognitive changes. In this paper, we propose a comparative analysis between different machine learning and deep learning techniques, with and without feature selection, for binarily classifying the six basic emotions, namely anger, disgust, fear, joy, sadness, and surprise, into two symmetrical categorical classes (emotion and no emotion), using the physiological recordings and subjective ratings of valence, arousal, and dominance from the DEAP (Dataset for Emotion Analysis using EEG, Physiological and Video Signals) database. The results showed that the maximum classification accuracies for each emotion were: anger: 98.02%, joy:100%, surprise: 96%, disgust: 95%, fear: 90.75%, and sadness: 90.08%. In the case of four emotions (anger, disgust, fear, and sadness), the classification accuracies were higher without feature selection.
  • DEAP: a Database for Emotion Analysis Using Physiological Signals
    IEEE Trans. Affective Computing. DEAP: A Database for Emotion Analysis using Physiological Signals. Sander Koelstra, Student Member, IEEE; Christian Mühl; Mohammad Soleymani, Student Member, IEEE; Jong-Seok Lee, Member, IEEE; Ashkan Yazdani; Touradj Ebrahimi, Member, IEEE; Thierry Pun, Member, IEEE; Anton Nijholt, Member, IEEE; Ioannis Patras, Member, IEEE. Abstract—We present a multimodal dataset for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded as each watched 40 one-minute long excerpts of music videos. Participants rated each video in terms of the levels of arousal, valence, like/dislike, dominance and familiarity. For 22 of the 32 participants, frontal face video was also recorded. A novel method for stimuli selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection and an online assessment tool. An extensive analysis of the participants' ratings during the experiment is presented. Correlates between the EEG signal frequencies and the participants' ratings are investigated. Methods and results are presented for single-trial classification of arousal, valence and like/dislike ratings using the modalities of EEG, peripheral physiological signals and multimedia content analysis. Finally, decision fusion of the classification results from the different modalities is performed. The dataset is made publicly available and we encourage other researchers to use it for testing their own affective state estimation methods. Index Terms—Emotion classification, EEG, Physiological signals, Signal processing, Pattern classification, Affective computing. 1 INTRODUCTION: …information retrieval. Affective characteristics of multimedia are important features for describing multimedia content and can be presented by such emotional tags. Emotion is a psycho-physiological process triggered by conscious and/or unconscious perception of an…
  • Emotion Classification Based on Brain Wave: a Survey
    Li et al. Hum. Cent. Comput. Inf. Sci. (2019) 9:42, https://doi.org/10.1186/s13673-019-0201-x. REVIEW, Open Access. Emotion classification based on brain wave: a survey. Ting-Mei Li (1), Han-Chieh Chao (1,2*) and Jianming Zhang (3). *Correspondence: [email protected]. (1) Department of Electrical Engineering, National Dong Hwa University, Hualien, Taiwan. Full list of author information is available at the end of the article. Abstract: Brain wave emotion analysis is the most novel method of emotion analysis at present. With the progress of brain science, it is found that human emotions are produced by the brain. As a result, many brain-wave emotion related applications appear. However, the analysis of brain wave emotion increases the difficulty of analysis because of the complexity of human emotion. Many researchers have used different classification methods and proposed methods for the classification of brain wave emotions. In this paper, we investigate the existing methods of brain wave emotion classification and describe various classification methods. Keywords: EEG, Emotion, Classification. Introduction: Brain science has shown that human emotions are controlled by the brain [1]. The brain produces brain waves as it transmits messages. Brain wave data is one of the biological messages, and biological messages usually have emotion features. The feature of emotion can be extracted through the analysis of brain wave messages. But the environmental background and cultural differences of human growth cause the complexity of human emotion. Therefore, in the analysis of brain wave emotion, the classification algorithm is very important. In this article, we will focus on the classification of brain wave emotions.
  • Who Can Read Your Facial Expression? A Comparison of Humans and Machine Learning Classifiers Detecting Emotion from Faces of People with Different Coverings
    A Comparison of Humans and Machine Learning Classifiers Detecting Emotion from Faces of People with Different Coverings. Harisu Abdullahi Shehu, Will N. Browne, Hedwig Eisenbarth. Received: 9th August 2021 / Accepted: date. Abstract: Partial face coverings such as sunglasses and face masks unintentionally obscure facial expressions, causing a loss of accuracy when humans and computer systems attempt to categorise emotion. Experimental psychology would benefit from understanding how computational systems differ from humans when categorising emotions, as automated recognition systems become available. We investigated the impact of sunglasses and different face masks on the ability to categorize emotional facial expressions in humans and computer systems, to directly compare accuracy in a naturalistic context. Computer systems, represented by the VGG19 deep learning algorithm, and humans assessed images of people with varying emotional facial expressions and with four different types of coverings, i.e. unmasked, with a mask covering the lower face, a partial mask with transparent mouth window, and with sunglasses. Further… Harisu Abdullahi Shehu (corresponding author), School of Engineering and Computer Science, Victoria University of Wellington, 6012 Wellington; E-mail: [email protected]; ORCID: 0000-0002-9689-3290. Will N. Browne, School of Electrical Engineering and Robotics, Queensland University of Technology, Brisbane QLD 4000, Australia; Tel: +61 45 2468 148; E-mail: [email protected]; ORCID: 0000-0001-8979-2224. Hedwig Eisenbarth, School of Psychology, Victoria University of Wellington, 6012 Wellington; Tel: +64 4 463 9541; E-mail: [email protected]; ORCID: 0000-0002-0521-2630.
  • Hybrid Approach for Emotion Classification of Audio Conversation Based on Text and Speech Mining
    Available online at www.sciencedirect.com. ScienceDirect, Procedia Computer Science 46 (2015) 635–643. International Conference on Information and Communication Technologies (ICICT 2014). Hybrid Approach for Emotion Classification of Audio Conversation Based on Text and Speech Mining. Jasmine Bhaskar (a,*), Sruthi K (a), Prema Nedungadi (b). (a) Department of Computer Science, Amrita University, Amrita School of Engineering, Amritapuri, 690525, India; (b) Amrita Create, Amrita University, Amrita School of Engineering, Amritapuri, 690525, India. Abstract: One of the greatest challenges in speech technology is estimating the speaker's emotion. Most of the existing approaches concentrate either on audio or on text features. In this work, we propose a novel approach for emotion classification of audio conversation based on both speech and text. The novelty in this approach is in the choice of features and the generation of a single feature vector for classification. Our main intention is to increase the accuracy of emotion classification of speech by considering both audio and text features. In this work we use standard methods such as Natural Language Processing, Support Vector Machines, WordNet Affect and SentiWordNet. The dataset for this work has been taken from SemEval-2007 and the eNTERFACE'05 EMOTION Database. © 2015 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Peer-review under responsibility of the organizing committee of the International Conference on Information and Communication Technologies (ICICT 2014). Keywords: emotion classification; text mining; speech mining; hybrid approach.
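The core idea of the hybrid approach above — concatenating audio features and text features into a single feature vector before classification — can be sketched as follows. This is a toy illustration, not the paper's system: the audio and text features are random stand-ins (the real work used speech statistics and lexicon resources such as WordNet Affect and SentiWordNet), and `fake_utterance` and all sizes are hypothetical.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_per_class, classes = 60, ["angry", "happy", "sad"]

def fake_utterance(label_idx):
    """Stand-in for one utterance: 13 audio statistics (e.g. pitch/energy
    summaries) and 5 text affect scores, fused into one hybrid vector."""
    audio = rng.normal(loc=label_idx, scale=1.0, size=13)
    text = rng.normal(loc=-label_idx, scale=1.0, size=5)
    return np.concatenate([audio, text])     # single feature vector

X = np.array([fake_utterance(i)
              for i, _ in enumerate(classes)
              for _ in range(n_per_class)])
y = np.repeat(classes, n_per_class)

# SVM on the fused vectors, matching the paper's choice of classifier family.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)
clf = make_pipeline(StandardScaler(), SVC()).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

The design choice worth noting is early fusion: both modalities are merged before the classifier sees them, so a single SVM can exploit correlations between vocal and lexical cues.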