Facial Coding Platform for Measuring Emotions in Farm Animals


Happy Cow or Thinking Pig? WUR Wolf – Facial Coding Platform for Measuring Emotions in Farm Animals

Suresh Neethirajan
Adaptation Physiology Group, Department of Animal Sciences, Wageningen University & Research, 6700 AH Wageningen, The Netherlands
Correspondence: [email protected]
bioRxiv preprint, doi: https://doi.org/10.1101/2021.04.09.439122 (version posted April 11, 2021; not certified by peer review)

Abstract: Emotions play an indicative and informative role in the investigation of farm animal behaviors. Systems that can measure and respond to emotions provide a natural user interface for digitalizing animal welfare platforms. The faces of farm animals are among the richest channels for expressing emotions. We present WUR Wolf (Wageningen University & Research: Wolf Mascot), a real-time facial expression recognition platform that automatically codes the emotions of farm animals. Using Python-based algorithms, we detect and track the facial features of cows and pigs; analyze their appearance, ear postures, and eye white regions; and correlate these features with the animals' mental and emotional states. The system is trained on a dataset of facial images of farm animals collected from more than six farms and has been optimized to operate with an average accuracy of 85%. From these features, we infer the emotional states of the animals in real time. The software detects 13 facial actions and 9 emotional states, including whether an animal is aggressive, calm, or neutral. We present a real-time emotion recognition system built on YOLOv3- and YOLOv4-based facial detection and an ensemble of convolutional neural networks (Faster R-CNN). Detecting the expressions of farm animals simultaneously and in real time enables many new automated decision-making tools for livestock farmers. Emotion sensing offers vast potential for improving animal welfare and animal-human interactions.

Keywords: biometric verification; facial profile images; animal welfare; precision livestock farming; welfare monitoring; YOLOv3; YOLOv4; Faster R-CNN
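The abstract describes a two-stage pipeline: a YOLO-based detector first localizes the animal's face, after which the cropped face is classified into facial actions and emotional states. As a rough, non-authoritative sketch of the detection stage, the snippet below uses OpenCV's DNN module; the config and weight file names are placeholders, since the paper's trained models are not distributed with this text.

```python
# Hedged sketch of the face-detection stage; "yolov4-faces.cfg/.weights"
# are placeholder files, not the authors' released models.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov4-faces.cfg", "yolov4-faces.weights")
out_layers = net.getUnconnectedOutLayersNames()

def detect_faces(frame, conf_thr=0.5, nms_thr=0.4):
    """Return (x, y, w, h) boxes for detected cow/pig faces in a BGR frame."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores = [], []
    for output in net.forward(out_layers):
        for det in output:                   # det = [cx, cy, bw, bh, obj, cls...]
            conf = float(det[5:].max())
            if conf > conf_thr:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh)])
                scores.append(conf)
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thr, nms_thr)
    return [boxes[i] for i in np.array(keep).flatten()]
```

Each detected face crop would then be passed to the CNN ensemble for facial-action coding.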
1. Introduction

Digital technologies, in particular precision livestock farming and artificial intelligence, have the potential to shape the transformation of animal welfare [1]. Innovative tools are needed to ensure access to sustainable, high-quality health care and welfare in animal husbandry management. Unlocking the full potential of automated measurement of the mental and emotional states of farm animals through digitalization, such as facial coding systems, would help blur the lines between biological, physical, and digital technologies.

Animal caretakers, handlers, and farmworkers typically rely on hands-on observation and measurement when monitoring animal welfare. To avoid additional handling of animals when collecting functional or physiological data, and to reduce the subjectivity of manual assessments, automated systems for measuring animal behavior and physiology can complement traditional welfare assessment tools and processes, enhancing the detection of animals in distress or pain in the barn [2]. Automated and continuous monitoring of animal welfare through digital alerting is rapidly becoming a reality [3].

In the human context, facial recognition platforms have long been in use for various applications, such as password systems on smartphones, identification at international border checkpoints, identification of criminals [4], diagnosis of Turner syndrome [5], detection of genetic disorder phenotypes [6], as a potential diagnostic tool for Parkinson's disease [7], measuring tourist satisfaction through emotional expressions [8], and quantification of customer interest during shopping [9].

Emotions

Emotions are believed to be a social and survival mechanism present in many species. In humans, emotions are understood as deep and complex psychological experiences that influence physical reactions. An entire sector of science is devoted to understanding the sophisticated inner workings of the human brain, yet many questions about human emotions remain unanswered. Even less scientific research focuses on the emotional capacity of non-human primates and other animals. Interpreting the emotional state of an animal is considerably more difficult than interpreting the emotional state of a human [10]. The human face is capable of a wide array of expressions that communicate emotion and social intent to other humans. These expressions are so clear that even some non-human species, like dogs, can identify human emotion through facial expression [11]. Each species has its own unique physiological composition, resulting in distinctive forms of expression. Despite human intellectual capacity, emotional understanding of other species through facial observation has proven difficult.

Early studies [12] noted the influence of human bias on interpretations and accidental interference with the natural responses of animals. It is not uncommon for humans to anthropomorphize the expressions of animals. The baring of teeth is an example: humans commonly consider such an expression to be "smiling" and interpret it as a sign of positive emotions, whereas in other species, such as non-human primates, the baring of teeth more commonly expresses a negative emotion associated with aggression [13]. For these reasons and many others, technology is critical for maintaining accurate and unbiased assessments of animal emotions and for individual animal identification.
In recent years, the number of studies concerning technological intervention in the field of animal behavior has increased [10]. The capacity of customized software to improve research, animal welfare, food animal production, legal identification, and medical practice is substantial.

Understanding Animal Emotions

Human comprehension of animal emotions may seem trivial; however, it is a mutually beneficial skill. Whether animals can express complex emotions, such as love and joy, is still debated within behavioral science; other emotions, such as fear, stress, and pleasure, are studied more commonly. These basic emotions shape how animals feel about their environment and interact with it, as well as how they interact with their conspecifics.

Non-domesticated species are commonly observed in the wild and maintained in captivity in order to understand and conserve them. Changes in the natural environment caused by human actions can be stressful for individuals within a species. Captive non-domesticated animals also experience stress created by artificial environments and artificial mate selection. If even one animal experiences and displays signs of stress or aggression, its companions are likely to perceive and attempt to respond to its emotional state [14]. These responses can result in stress, conflict, and the uneven distribution of resources [15]. Understanding emotional expression in captive animals can help caretakers determine the most beneficial forms of care and companion matching for each individual, resulting in a better quality of life for the animals in question.

Companion animals are another category that can benefit from a deeper understanding of animal emotion. Just like humans, individual animals have different thresholds for coping with pain and discomfort. Since many companion animals must undergo medical procedures for their own well-being and that of their species, it is important to understand their physical responses. Animals cannot tell humans how much pain they are in, so it is up to their caretakers to interpret the pain level an animal is experiencing and treat it appropriately [16]. This task is most accurately completed when the emotions of an animal are clearly and quickly detectable.

Understanding expressions of stress and pain also matters in animal agriculture. Animals used for food production often yield higher-quality products when they do not experience unpleasant emotions [17]. Detecting individual animals experiencing stress also allows early identification of medical complications. A study of sows during parturition showed a uniform pattern of facially expressed discomfort across the birthing cycle [18].
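To make the idea of coding emotions from facial actions concrete, here is a toy rule-based mapping in the spirit of the platform described in the abstract (ear postures and eye-white visibility feeding a coarse emotion label). The action names, thresholds, and rules below are invented for illustration; they are not the WUR Wolf coding scheme.

```python
# Illustrative mapping from facial-action scores to a coarse emotion label;
# action names and thresholds are hypothetical, not the paper's scheme.
from typing import Dict

def infer_emotion(actions: Dict[str, float]) -> str:
    """Map facial-action scores in [0, 1] to a coarse emotional state."""
    ears_back = actions.get("ears_backward", 0.0)
    eye_white = actions.get("eye_white_visible", 0.0)
    ears_neutral = actions.get("ears_neutral", 0.0)
    if eye_white > 0.6 and ears_back > 0.6:
        return "aggressive"     # high arousal, negative valence
    if ears_neutral > 0.7 and eye_white < 0.2:
        return "calm"           # low arousal
    return "neutral"

print(infer_emotion({"ears_backward": 0.8, "eye_white_visible": 0.7}))  # aggressive
```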
Recommended publications
  • Neuroticism and Facial Emotion Recognition in Healthy Adults
Sanja Andric, Nadja P. Maric, Goran Knezevic, Marina Mihaljevic, Tijana Mirjanic, Eva Velthorst and Jim van Os (School of Medicine and Clinic for Psychiatry, University of Belgrade; Special Hospital for Psychiatric Disorders Kovin, Serbia; Academic Medical Centre, University of Amsterdam; Icahn School of Medicine, New York; EURON, Maastricht). Brief Report, Early Intervention in Psychiatry 2015; doi: 10.1111/eip.12212. Aim: To examine whether healthy individuals with higher levels of neuroticism, a robust independent predictor of psychopathology, exhibit altered facial emotion recognition performance. Methods: Facial emotion recognition accuracy was investigated in 104 healthy adults using the Degraded Facial Affect Recognition task (DFAR). Results: A significant negative correlation between the degree of neuroticism and the percentage of correct answers on the DFAR was found only for happy facial expressions (significant after applying Bonferroni correction). Conclusions: Altered sensitivity to the emotional context represents a useful and easy way to obtain a cognitive phenotype that correlates strongly with…
  • How Deep Neural Networks Can Improve Emotion Recognition on Video Data
Pooya Khorrami, Tom Le Paine, Kevin Brady, Charlie Dagli and Thomas S. Huang (Beckman Institute, University of Illinois at Urbana-Champaign; MIT Lincoln Laboratory). Abstract: There have been many impressive results obtained using deep learning for emotion recognition tasks in the last few years. In this work, we present a system that performs emotion recognition on video data using both convolutional neural networks (CNNs) and recurrent neural networks (RNNs). We present our findings on videos from the Audio/Visual+Emotion Challenge (AV+EC2015). In our experiments, we analyze the effects of several hyperparameters on overall performance while also achieving superior performance to the baseline and other competing methods. From the introduction: An alternative way to model the space of possible emotions is to use a dimensional approach [11], where a person's emotions can be described using a low-dimensional signal (typically 2 or 3 dimensions). The most common dimensions are (i) arousal and (ii) valence. Arousal measures how engaged or apathetic a subject appears, while valence measures how positive or negative a subject appears. Dimensional approaches have two advantages over categorical approaches. First, they can describe a larger set of emotions: the arousal and valence scores define a two-dimensional plane, while the six basic emotions are represented as points in that plane. Second, they can output time-continuous labels, which allows for more realistic modeling of emotion over time. Index Terms: Emotion Recognition, Convolutional Neural Networks, Recurrent Neural Networks, Deep Learning, Video Processing.
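As a rough sketch of the architecture this abstract describes, the snippet below (assuming PyTorch) embeds each frame with a small CNN, runs an RNN over the frame sequence, and regresses the two-dimensional (arousal, valence) signal; all layer sizes are illustrative, not those of the paper.

```python
# Hedged sketch of a CNN + RNN model for dimensional emotion recognition.
import torch
import torch.nn as nn

class CnnRnnAffect(nn.Module):
    def __init__(self, feat_dim=128, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(              # tiny stand-in frame encoder
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim),
        )
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)       # (arousal, valence)

    def forward(self, clips):                  # clips: (batch, time, 3, H, W)
        b, t = clips.shape[:2]
        feats = self.cnn(clips.flatten(0, 1)).view(b, t, -1)
        out, _ = self.rnn(feats)               # sequence of frame features
        return self.head(out)                  # per-frame (arousal, valence)

print(CnnRnnAffect()(torch.randn(2, 8, 3, 64, 64)).shape)  # torch.Size([2, 8, 2])
```

The time-continuous output is exactly what the dimensional approach enables over categorical labels.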
  • Emotion Recognition Depends on Subjective Emotional Experience and Not on Facial
Travis Wearne, Katherine Osborne-Crowley, Hannah Rosenberg, Marie Dethier and Skye McDonald (School of Psychology, University of New South Wales, Sydney, Australia; Department of Psychology: Cognition and Behavior, University of Liège, Liège, Belgium). Correspondence: Dr Travis A. Wearne, School of Psychology, University of New South Wales, NSW, Australia, 2052; Email: [email protected]; Phone: (+612) 9385 3310. Background: Recognising how others feel is paramount to social situations and commonly disrupted following traumatic brain injury (TBI). This study tested whether problems identifying emotion in others following TBI are related to problems expressing or feeling emotion in oneself, as theoretical models place emotion perception in the context of accurate encoding and/or shared emotional experiences. Methods: Individuals with TBI (n = 27; 20 males) and controls (n = 28; 16 males) were tested on an emotion recognition task and asked to adopt facial expressions and relay emotional memories according to the presentation of stimuli (words and photos). After each trial, participants self-reported their feelings of happiness, anger and sadness. Judges who were blind to the presentation of stimuli assessed emotional facial expressivity. Results: Emotional experience was a unique predictor of affect recognition across all emotions, while facial expressivity did not contribute to any of the regression models. Furthermore, difficulties in recognising emotion for individuals with TBI were no longer evident after cognitive ability and experience of emotion were entered into the analyses. Conclusions: Emotion perception difficulties following TBI may stem from an inability to experience affective states and may tie in with alexithymia in clinical conditions.
  • Facial Emotion Recognition
Issue 1 | 2021. Facial Emotion Recognition (FER) is the technology that analyses facial expressions from both static images and videos in order to reveal information on one's emotional state. The complexity of facial expressions, the potential use of the technology in any context, and the involvement of new technologies such as artificial intelligence raise significant privacy risks. I. What is Facial Emotion Recognition? Facial Emotion Recognition is a technology used for analysing sentiments from different sources, such as pictures and videos. It belongs to the family of technologies often referred to as 'affective computing', a multidisciplinary field of research on computers' capabilities to recognise and interpret human emotions and affective states, and it often builds on artificial intelligence technologies. Facial expressions are forms of non-verbal communication, providing hints about human emotions. For decades, decoding such emotion expressions has been a research interest in the field of psychology (Ekman and Friesen 2003; Lang et al. …). FER analysis comprises three steps: (a) face detection, (b) facial expression detection, and (c) expression classification to an emotional state (Figure 1). Emotion detection is based on the analysis of facial landmark positions (e.g. end of nose, eyebrows). Furthermore, in videos, changes in those positions are also analysed in order to identify contractions in a group of facial muscles (Ko 2018). Depending on the algorithm, facial expressions can be classified to basic emotions (e.g. anger, disgust, fear, joy, sadness, and surprise) or compound emotions (e.g. happily sad, happily surprised, happily disgusted, sadly fearful, sadly angry, sadly surprised) (Du et al. 2014). In other cases, facial expressions could be linked to the physiological or mental state of mind.
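Step (c) above, classification from facial landmark positions, can be illustrated with a toy geometric rule; the landmark names and the single smile heuristic below are invented for the sketch and bear no relation to any production FER system.

```python
# Toy illustration of expression classification from landmark geometry;
# landmark names and the rule are hypothetical.
from typing import Dict, Tuple

Point = Tuple[float, float]

def classify_expression(landmarks: Dict[str, Point]) -> str:
    """Map a landmark configuration to an emotion label (toy rule)."""
    left = landmarks["mouth_corner_left"]
    right = landmarks["mouth_corner_right"]
    center = landmarks["mouth_center"]
    # Mouth corners raised above the mouth center suggest a smile-like shape
    # (image y grows downward, so "raised" means a smaller y value).
    if (left[1] + right[1]) / 2 < center[1]:
        return "joy"
    return "neutral"

print(classify_expression({
    "mouth_corner_left": (10.0, 48.0),
    "mouth_corner_right": (30.0, 48.0),
    "mouth_center": (20.0, 52.0),
}))  # joy
```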
  • Deep-Learning-Based Models for Pain Recognition: a Systematic Review
Rasha M. Al-Eidan, Hend Al-Khalifa and AbdulMalik Al-Salman (College of Computer and Information Sciences, King Saud University, Riyadh 11451, Saudi Arabia; correspondence: [email protected]). Applied Sciences, Article. Received: 31 July 2020; Accepted: 27 August 2020; Published: 29 August 2020. Abstract: Traditional standards employed for pain assessment have many limitations. One such limitation is reliability, linked to inter-observer variability. Therefore, there have been many approaches to automate the task of pain recognition. Recently, deep-learning methods have appeared to solve many challenges, such as feature selection and cases with a small number of data sets. This study provides a systematic review of pain-recognition systems based on deep-learning models over the last two years. Furthermore, it presents the major deep-learning methods used in the reviewed papers. Finally, it provides a discussion of the challenges and open issues. Keywords: pain assessment; pain recognition; deep learning; neural network; review; dataset. 1. Introduction: Deep learning is an important branch of machine learning used in many applications, and a powerful way of achieving supervised learning. The history of deep learning has undergone different naming conventions and developments [1]. It was first referred to as cybernetics in the 1940s-1960s; then, in the 1980s-1990s, it became known as connectionism; finally, the phrase deep learning began to be utilized in 2006. There are also some names that are based on human knowledge.
  • A Review of Alexithymia and Emotion Perception in Music, Odor, Taste, and Touch
Thomas Suslow and Anette Kersting (Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig Medical Center, Leipzig, Germany). Mini Review, published 30 July 2021; doi: 10.3389/fpsyg.2021.707599. Abstract: Alexithymia is a clinically relevant personality trait characterized by deficits in recognizing and verbalizing one's emotions. It has been shown that alexithymia is related to an impaired perception of external emotional stimuli, but previous research focused on emotion perception from faces and voices. Since sensory modalities represent rather distinct input channels, it is important to know whether alexithymia also affects emotion perception in other modalities and expressive domains. The objective of our review was to summarize and systematically assess the literature on the impact of alexithymia on the perception of emotional (or hedonic) stimuli in music, odor, taste, and touch. Eleven relevant studies were identified. On the basis of the reviewed research, it can be preliminarily concluded that alexithymia might be associated with deficits in the perception of primarily negative but also positive emotions in music, and with a reduced perception of aversive taste. The data available on olfaction and touch are inconsistent or ambiguous and do not allow conclusions to be drawn. Future investigations would benefit from a multimethod assessment of alexithymia and control of negative affect. Multimodal research seems necessary to advance our understanding of emotion perception deficits in alexithymia and to clarify the contribution of modality-specific and supramodal processing impairments.
  • Emotion Recognition in Conversation: Research Challenges, Datasets, and Recent Advances
Soujanya Poria, Navonil Majumder, Rada Mihalcea and Eduard Hovy (School of Computer Science and Engineering, NTU, Singapore; CIC, Instituto Politécnico Nacional, Mexico; Computer Science & Engineering, University of Michigan, USA; Language Technologies Institute, Carnegie Mellon University, USA). Abstract: Emotion is intrinsic to humans, and consequently emotion understanding is a key part of human-like artificial intelligence (AI). Emotion recognition in conversation (ERC) is becoming increasingly popular as a new research frontier in natural language processing (NLP) due to its ability to mine opinions from the plethora of publicly available conversational data on platforms such as Facebook, YouTube, Reddit, Twitter, and others. Moreover, it has potential applications in health-care systems (as a tool for psychological analysis), education (understanding student frustration), and more. ERC can be used to analyze conversations that take place on social media; it can also aid in analyzing conversations in real time, which can be instrumental in legal trials, interviews, e-health services, and more. Additionally, ERC is extremely important for generating emotion-aware dialogues that require an understanding of the user's emotions. Catering to these needs calls for effective and scalable conversational emotion-recognition algorithms. Unlike vanilla emotion recognition of sentences/utterances, ERC ideally requires context modeling of the individual utterances. This context can be attributed to the preceding utterances and relies on the temporal sequence of utterances. Compared to the recently published works on ERC [10, 11, 12], both lexicon-based [13, 8, 14] and modern deep learning-based [4, 5] vanilla emotion recognition approaches…
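A minimal sketch of the context modeling described above, assuming PyTorch: a GRU consumes the temporal sequence of utterance embeddings so that each turn's emotion logits can depend on the preceding utterances. Dimensions are illustrative, not from any published ERC system.

```python
# Hedged sketch: conversational context via a recurrent layer over turns.
import torch
import torch.nn as nn

class ContextualERC(nn.Module):
    def __init__(self, utt_dim=256, hidden=128, n_emotions=6):
        super().__init__()
        self.context = nn.GRU(utt_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_emotions)

    def forward(self, utterances):            # (batch, turns, utt_dim)
        ctx, _ = self.context(utterances)     # each turn sees its history
        return self.classifier(ctx)           # per-turn emotion logits

logits = ContextualERC()(torch.randn(4, 10, 256))
print(logits.shape)  # torch.Size([4, 10, 6])
```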
  • Recognition of Emotional Face Expressions and Amygdala Pathology*
Chiara Cristinzio, David Sander and Patrik Vuilleumier (Laboratory for Behavioural Neurology and Imaging of Cognition, Department of Neuroscience and Clinic of Neurology, University of Geneva; Swiss Center for Affective Sciences, University of Geneva; Department of Psychology, University of Geneva). Abstract (also provided in French as "Reconnaissance d'expressions faciales émotionnelles et pathologie de l'amygdale"): The amygdala is often damaged in patients with temporal lobe epilepsy, either because of the primary epileptogenic disease (e.g. sclerosis or encephalitis) or because of secondary effects of surgical interventions (e.g. lobectomy). In humans, the amygdala has been associated with a range of important emotional and social functions, in particular with deficits in emotion recognition from faces. Here we review data from recent neuropsychological research illustrating the amygdala's role in processing facial expressions. We describe behavioural findings subsequent to focal lesions and possible factors that may influence the nature and severity of deficits in patients.
  • Physiological Sensors Based Emotion Recognition While Experiencing Tactile Enhanced Multimedia
Aasim Raheel, Muhammad Majid, Majdi Alnowami and Syed Muhammad Anwar (Department of Computer Engineering and Department of Software Engineering, University of Engineering and Technology, Taxila 47050, Pakistan; Department of Nuclear Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia; correspondence: [email protected]). Sensors, Article. Received: 29 April 2020; Accepted: 14 May 2020; Published: 21 July 2020. Abstract: Emotion recognition has increased the potential of affective computing by providing instant feedback from users and thereby a better understanding of their behavior. Physiological sensors have been used to recognize human emotions in response to audio and video content that engages single (auditory) and multiple (auditory and vision) human senses, respectively. In this study, human emotions were recognized using physiological signals observed in response to tactile enhanced multimedia content that engages three human senses (tactile, vision, and auditory). The aim was to give users an enhanced real-world sensation while engaging with multimedia content. To this end, four videos were selected and synchronized with an electric fan and a heater, based on timestamps within the scenes, to generate tactile enhanced content with cold and hot air effects, respectively. Physiological signals, i.e., electroencephalography (EEG), photoplethysmography (PPG), and galvanic skin response (GSR), were recorded using commercially available sensors while participants experienced these tactile enhanced videos. The precision of the acquired physiological signals (including EEG, PPG, and GSR) is enhanced using pre-processing with a Savitzky-Golay smoothing filter.
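A minimal example of the Savitzky-Golay pre-processing step mentioned above, using SciPy on a synthetic GSR-like trace; the window length and polynomial order are illustrative choices, not the study's parameters.

```python
# Hedged sketch: smoothing a noisy physiological trace with Savitzky-Golay.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)                       # 10 s at 50 Hz
gsr = 2.0 + 0.5 * np.sin(0.8 * t) + rng.normal(0, 0.1, t.size)  # noisy trace

# window_length must be odd and larger than polyorder.
smoothed = savgol_filter(gsr, window_length=31, polyorder=3)
print(float(np.abs(gsr - smoothed).mean()))       # magnitude of removed noise
```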
  • Overview of the Advancements in Automatic Emotion Recognition: Comparative Performance of Commercial Algorithms
Mariya Malygina (Department of Psychology, National Research University Higher School of Economics; Neurodata Lab, Moscow, Russia), Mikhail Artemyev, Andrey Belyaev and Olga Perepelkina (Neurodata Lab, Moscow, Russia). Preprint, December 24, 2019. Abstract: In recent years facial emotion recognition algorithms have evolved, and in some cases top commercial algorithms detect emotions like happiness better than humans do. To evaluate the performance of these algorithms, the common practice is to compare them with human-labeled ground truth. This article covers monitoring of the advancements in automatic emotion recognition solutions, and we suggest an additional criterion for their evaluation: the agreement between algorithms' predictions. In this work, we compare the performance of four commercial algorithms (Affectiva Affdex, Microsoft Cognitive Services Face module Emotion Recognition, Amazon Rekognition Face Analysis, and Neurodata Lab Emotion Recognition) on three datasets, AFEW, RAVDESS, and SAVEE, which differ in terms of control over the conditions of data acquisition. We assume that consistency among algorithms' predictions indicates the reliability of the predicted emotion. Overall results show that the algorithms with higher accuracy and F1 scores against human-labeled ground truth (Microsoft's, Neurodata Lab's, and Amazon's) showed higher agreement between their predictions. Agreement among algorithms' predictions is a promising criterion for further exploring the option of replacing human data labeling with automatic annotation. Keywords: Emotion Recognition, Affective Computing, Facial Expression. 1. Introduction: Various automatic facial expression recognition systems have been developed to detect human emotions.
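The proposed criterion, agreement between algorithms' predictions, can be computed for example as pairwise Cohen's kappa; a minimal sketch using scikit-learn, with fabricated stand-in predictions (the article does not specify its agreement statistic, so kappa here is an assumption).

```python
# Hedged sketch: pairwise agreement between emotion-recognition algorithms.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

predictions = {                      # per-clip labels; values are made up
    "algo_a": ["happy", "sad", "angry", "happy", "neutral"],
    "algo_b": ["happy", "sad", "happy", "happy", "neutral"],
    "algo_c": ["happy", "neutral", "angry", "happy", "neutral"],
}

for (name1, p1), (name2, p2) in combinations(predictions.items(), 2):
    print(name1, "vs", name2, round(cohen_kappa_score(p1, p2), 2))
```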
  • Recognition of Emotional Facial Expressions and Alexithymia in Patients with Chronic Facial Pain
Harry von Piekartz, Gesche Mohr, Kerstin Limbrecht-Ecklundt, Harald Traue and Hendrik Kessler. Research Article, Annals of Psychiatry and Mental Health (ISSN 2374-0124), Open Access. Submitted: 17 November 2018; Accepted: 07 December 2018; Published: 08 December 2018. Corresponding author: Harry von Piekartz, Department of Movement and Rehabilitation Science, University of Applied Science Osnabrück, Caprivistrasse 30a, 49076 Osnabrück, Germany; Tel: +49 541 969-3526. Keywords: alexithymia; face pain; emotion recognition; emotion expression. Abstract. Objectives: Alexithymia, conceived as difficulty identifying emotions, is said to be related to several pain syndromes. This study examined the recognition of facially expressed emotions and its relation to alexithymia in subjects with chronic facial pain. Methods: A total of 62 subjects were recruited: n = 20 patients with chronic facial pain and n = 42 healthy controls. All subjects were tested for the recognition of facially expressed emotions (Facially Expressed Emotion Labelling (FEEL) test).
  • Automated Pain Detection from Facial Expressions Using FACS: a Review
Zhanli Chen, Rashid Ansari and Diana J. Wilkie. Abstract: Facial pain expression is an important modality for assessing pain, especially when the patient's verbal ability to communicate is impaired. The facial muscle-based action units (AUs), which are defined by the Facial Action Coding System (FACS), have been widely studied and are highly reliable as a method for detecting facial expressions (FE), including valid detection of pain. Unfortunately, FACS coding by humans is a very time-consuming task that makes its clinical use prohibitive. Significant progress on automated facial expression recognition (AFER) has led to its numerous successful applications in FACS-based affective computing problems. However, only a handful of studies have been reported on automated pain detection (APD), and its application in clinical settings is still far from a reality. In this paper, we review the progress in research that has contributed to automated pain detection, with a focus on 1) the framework-level similarity between spontaneous AFER and APD problems; 2) the evolution of system design, including the recent development of deep learning methods; 3) the strategies and considerations in developing a FACS-based pain detection framework from existing research; and 4) an introduction of the most relevant databases available for AFER and APD studies. We attempt to present key considerations in extending a general AFER framework to an APD framework in clinical settings. In addition, performance metrics for evaluating an AFER or an APD system are also highlighted.
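For context, one widely used FACS-based pain score (not defined in this abstract, so included here as background) is the Prkachin-Solomon pain intensity, computed directly from AU intensities; a minimal sketch:

```python
# Prkachin-Solomon pain intensity (PSPI) from FACS action-unit intensities;
# AUs 4, 6, 7, 9, 10 are coded 0-5 and AU43 (eye closure) is 0/1.
def pspi(au: dict) -> float:
    """PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43."""
    return (au.get(4, 0) + max(au.get(6, 0), au.get(7, 0))
            + max(au.get(9, 0), au.get(10, 0)) + au.get(43, 0))

print(pspi({4: 3, 6: 2, 7: 4, 9: 1, 43: 1}))  # 9
```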