Emotion Classification Using Facial Expression
(IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 2, No. 7, 2011

Emotion Classification Using Facial Expression

Devi Arumugam, B.E., M.E.
Research Scholar, Department of Computer Science, Mother Teresa Women’s University, Kodaikanal, India.

Dr. S. Purushothaman, Ph.D. (IIT-M)
Principal, Sun College of Engineering and Technology, Sun Nagar, Erachakulam, Kanyakumari, India.

Abstract— Human emotional facial expressions play an important role in interpersonal relations, because humans demonstrate and convey a great deal of evident information visually rather than verbally. Although humans recognize facial expressions virtually without effort or delay, reliable expression recognition by machine remains a challenge today. To automate recognition of emotional state, machines must be taught to understand facial gestures. In this paper we develop an algorithm to identify a person’s emotional state from facial expressions such as anger, disgust, and happiness, across different age groups of people in different situations. We use a Radial Basis Function network (RBFN) for classification, and Fisher’s Linear Discriminant (FLD) and Singular Value Decomposition (SVD) for feature selection.

Keywords - Universal Emotions; FLD; SVD; Eigen Values; Eigen Vector; RBF Network.

I. INTRODUCTION

Human beings express their emotions in everyday interactions with others. Emotions are frequently reflected on the face, in hand and body gestures, and in the voice, to express our feelings or liking. Recent psychology research has shown that the most expressive way humans display emotions is through facial expressions. Mehrabian [1] indicated that the verbal part of a message contributes only 7% to the effect of the message as a whole, the vocal part 38%, and facial expressions 55% of the effect of the speaker’s message. Emotions are feelings or responses to a particular situation or environment, and they are an integral part of our existence: one smiles to show greeting, frowns when confused, or raises one’s voice when enraged.

The six basic emotions are anger, happiness, fear, disgust, sadness, and surprise; one more expression is neutral. Other emotions include embarrassment, interest, pain, shame, shyness, anticipation, smiling, laughing, sorrow, hunger, and curiosity.

Each emotion may be expressed in different ways depending on the situation. Anger may be expressed as enraged, annoyed, anxious, irritated, resentful, miffed, upset, mad, furious, or raging. Happiness may be expressed as joyful, greedy, ecstatic, fulfilled, contented, glad, complete, satisfied, or pleased. Disgust may be expressed as contempt, exhaustion, peevishness, upset, or boredom.

These emotions also have characteristic facial expressions. If we are angry, the brows are lowered and drawn together, vertical lines appear between the brows, the lower lids are tensed, the eyes stare hard or bulge, the lips may be pressed firmly together with corners down or squared as if shouting, the nostrils may be dilated, and the lower jaw juts out.

If we are happy, the corners of the lips are drawn back and up, the mouth may or may not be parted with teeth exposed, a wrinkle runs from the outer nose to the outer lip, the cheeks are raised, the lower lid may show wrinkles or be tense, and crow’s feet appear near the outside of the eyes.

Disgust has been identified as one of the basic emotions. Its basic definition is “bad taste”, applied secondarily to anything which causes a similar feeling through the sense of smell, touch, or even sight. The well-defined facial expression of disgust is characterized by furrowing of the eyebrows, closure of the eyes and pupil constriction, wrinkling of the nose, upper-lip retraction and upward movement of the lower lip and chin, and drawing of the corners of the mouth down and back.
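The pipeline described in the abstract — dimensionality reduction of face features by SVD, followed by a Radial Basis Function network for classification — can be sketched as follows. This is a minimal illustrative sketch, not the authors’ implementation: the synthetic data, the number of retained singular vectors, the kernel width, the placement of RBF centers at class means, and the least-squares fit of output weights are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for preprocessed face-feature vectors, 3 emotion
# classes (e.g. angry / happy / disgust). Real inputs would be face images.
n_per_class, dim, n_classes = 30, 100, 3
means = rng.normal(0.0, 1.0, (n_classes, dim))
X = np.vstack([m + 0.3 * rng.normal(0.0, 1.0, (n_per_class, dim)) for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)

# Step 1: SVD-based feature reduction — project centered data onto the
# top-k right singular vectors (k = 10 is an assumed choice).
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 10
Z = Xc @ Vt[:k].T  # reduced feature vectors, shape (90, 10)

# Step 2: RBF network — Gaussian hidden units centered at the class
# means in reduced space, linear output layer.
centers = np.vstack([Z[y == c].mean(axis=0) for c in range(n_classes)])
sigma = 1.0  # assumed kernel width

def rbf_features(Zin):
    # Squared distance of each sample to each center, then Gaussian kernel.
    d2 = ((Zin[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

H = rbf_features(Z)
T = np.eye(n_classes)[y]  # one-hot targets
W, *_ = np.linalg.lstsq(H, T, rcond=None)  # output weights by least squares

pred = (rbf_features(Z) @ W).argmax(axis=1)
print("training accuracy:", (pred == y).mean())
```

Fitting the output layer by least squares is one common way to train an RBF network; the training procedure actually used in this paper may differ.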
It is because we understand others’ emotions, and react based on those expressions, that our interactions are enriched. Computers, by contrast, are “emotionally challenged”: they neither recognize the emotions of others nor possess emotions of their own [2].

We therefore wish to enrich human-computer interaction from point-and-click to sense-and-feel, to develop non-intrusive sensors, and to develop lifelike software agents and devices that can express and understand emotion. Computer systems with this capability have a wide range of applications in different research areas, including security, law enforcement, clinics, education, psychiatry, and telecommunications [4]. There has been much research on recognizing emotion through facial expressions. In emotional classification there are two basic emotions, love and fear; based on these, emotions are classified as positive or negative.

This research describes a neural-network-based approach for emotion classification. We learn a classifier that can recognize three basic emotions.

II. RELATED WORK

There are several approaches taken in the literature to learning classifiers for emotion recognition [2][6].

In the static approach, the classifier classifies each frame of the video into one of the facial expression categories based on the tracking results of that frame. Bayesian network classifiers were commonly used in this approach, and Naïve Bayes classifiers were also used often; because this approach can be unrealistic, some works used Gaussian classifiers instead.

In the dynamic approach, the classifiers take into account the temporal pattern of displaying facial expressions. Hidden Markov Model (HMM) based classifiers for facial expression recognition have been used in recent works; Cohen and Sebe [4] further advanced this line of research and proposed a multi-level HMM classifier.

This field has been a research interest for scientists from several different tracks, such as computer science, engineering, psychology, and neuroscience [1]. In this paper, we propose a complete emotional classification using facial expression. The network is trained with the various emotions of the face, for different age groups of people in different situations. When given an input, the RBF network, with the help of FLD and trained on various facial expressions as training examples, produces the output.

The existing research was developed by Paul Ekman [3] from a psychology perspective. In the early 1990s the engineering community started to use these results to construct automatic methods of recognizing emotions from facial expressions in images or video [4], based on various techniques of tracking [5]. An important problem in the emotion recognition field is the lack of an agreed benchmark database and methods for comparing the performance of different approaches; the Cohn-Kanade database is a step in this direction [2].

Since the early 1970s, Paul Ekman and his colleagues have performed extensive studies of human facial expression [7]. They found evidence to support universality in facial expressions. These “universal facial expressions” are those representing happiness, sadness, anger, fear, surprise, and disgust. They studied facial expressions in different cultures, including preliterate cultures, and found much commonality in the expression and recognition of emotions on the face. However, they observed that differences in expression are governed by “display rules” in different social contexts. For example, Japanese and American subjects showed similar facial expressions while viewing the same stimulus film; however, in the presence of authorities, the Japanese viewers were more reluctant to show their real expressions. These findings inspired much of the research done in automatic facial expression recognition in recent years.

The work in computer-assisted quantification of facial expressions did not start until the 1990s. Mase [10] used optical flow (OF) to recognize facial expressions; he was one of the first to use image-processing techniques for this purpose. Lanitis [11] used a flexible shape and appearance model for image coding, person identification, pose recovery, gender recognition, and facial expression recognition. Black and Yacoob [12] used local parameterized models of image motion to recover non-rigid motion; once recovered, these parameters were used as inputs to a rule-based classifier to recognize the six basic facial expressions.

Yacoob and Davis [13] computed optical flow and used similar rules to classify the six facial expressions. Rosenblum, Yacoob, and Davis [14] also computed optical flow of regions on the face, then applied a radial basis function network to classify expressions. Essa and Pentland [15] used an optical-flow region-based method to recognize expressions. Donato et al. [16] tested different features for recognizing facial AUs and inferring the facial expression in the frame. Otsuka and Ohya [17] first computed optical flow, then computed the 2D Fourier transform coefficients, which were used as feature vectors for a hidden Markov model (HMM) to classify expressions; the trained system was able to recognize one of the six expressions in near real time (about 10 Hz). Furthermore, they used the tracked motions to control the facial expression of an animated Kabuki system [18]. Martinez [19] introduced an indexing approach based on the identification of frontal face images under different illumination conditions, facial expressions, and occlusions. A Bayesian approach was used to find the best match between the local observations and the learned local-feature model, and an HMM was employed to achieve good recognition even when the new conditions did not correspond to