Emotion and Pain: A Functional Cerebral Systems Integration
Total pages: 16. File type: PDF, size: 1020 KB.

Recommended publications
CROSS LINGUISTIC INTERPRETATION of EMOTIONAL PROSODY Åsa Abelin, Jens Allwood Department of Linguistics, Göteborg University
Abstract: This study has three purposes: the first is to examine whether there is any stability in the way we interpret different emotions and attitudes from prosodic patterns; the second is to see whether this interpretation depends on the listener's cultural and linguistic background; and the third is to find out whether there is any recurring relation between acoustic and semantic properties of the stimuli. Recordings of a Swedish speaker uttering a phrase while expressing different emotions were interpreted by listeners with different L1s (Swedish, English, Finnish and Spanish), who were asked to judge the emotional content of the expressions. The results show that some emotions, e.g. "anger", "fear", "sadness" and "surprise", are interpreted in accordance with the intended emotion to a greater degree than other emotions, which are interpreted as expected to a lesser degree.

On the question of which emotions are primary, see e.g. Woodworth (1938), Izard (1971), Roseman (1979). In the present study we have chosen some of the most commonly discussed emotions and attitudes, "joy", "surprise", "sadness", "fear", "shyness", "anger", "dominance" and "disgust" (in English translation), in order to cover several types.

2. Method. 2.1. Speech material and recording: In order to isolate the contribution of prosody to the interpretation of emotions and attitudes, we chose one carrier phrase in which the different emotions were to be expressed. The carrier phrase was "salt sill, potatismos och pannkakor" (salted herring, mashed potatoes and pancakes). The thought was that many different emotions can be held towards food, and that the sentence was therefore quite neutral with respect to …
Decoding Emotions from Nonverbal Vocalizations: How Much Voice Signal Is Enough?
Motivation and Emotion, https://doi.org/10.1007/s11031-019-09783-9. Original Paper. Paula Castiajo, Ana P. Pinheiro. © Springer Science+Business Media, LLC, part of Springer Nature 2019.

Abstract: How much acoustic signal is enough for accurate recognition of nonverbal emotional vocalizations? Using a gating paradigm (7 gates from 100 to 700 ms), the current study probed the effect of stimulus duration on recognition accuracy for emotional vocalizations expressing anger, disgust, fear, amusement, sadness and neutral states. Participants (n = 52) judged the emotional meaning of vocalizations presented at each gate. Increased recognition accuracy was observed from gate 2 to gate 3 for all types of vocalizations. Neutral vocalizations were identified with the shortest amount of acoustic information relative to all other types of vocalizations. A shorter acoustic signal was required to decode amusement compared to fear, anger and sadness, whereas anger and fear required equivalent amounts of acoustic information to be accurately recognized. These findings confirm that the time course of successful recognition of discrete vocal emotions varies by emotion type. Compared to prior studies, they additionally indicate that the type of auditory signal (speech prosody vs. nonverbal vocalizations) determines how quickly listeners recognize emotions from a speaker's voice.

Keywords: Nonverbal vocalizations, Emotion, Duration, Gate, Recognition

Introduction: The voice is likely the most important sound category in a social environment. It carries not only verbal information, but also socially relevant cues about the speaker, conveyed by acoustic parameters (e.g. F0, intensity) serving the expression of emotional meaning. Nonetheless, compared to speech prosody, nonverbal vocalizations are considered to represent more primitive expressions of emotions and an auditory analogue of facial emotions (Belin et al. …
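The gating paradigm the abstract describes can be sketched numerically: the vocalization waveform is truncated at successive cumulative durations (100 ms, 200 ms, … 700 ms), and listeners judge each truncation. A minimal illustration, assuming a 44.1 kHz sampling rate (the excerpt does not report the study's audio parameters) and silence as a stand-in for a real recording:

```python
# Sketch of the gating paradigm: truncate a waveform at cumulative gate
# durations (100, 200, ..., 700 ms). SAMPLE_RATE is an assumption; the
# excerpt does not state the study's actual sampling rate.
SAMPLE_RATE = 44100  # samples per second (assumed)

def make_gates(signal, gate_step_ms=100, n_gates=7):
    """Return successively longer truncations ("gates") of the signal."""
    gates = []
    for i in range(1, n_gates + 1):
        n_samples = SAMPLE_RATE * gate_step_ms * i // 1000  # integer sample count
        gates.append(signal[:n_samples])
    return gates

# A 700 ms placeholder "vocalization" (silence stands in for real audio).
vocalization = [0.0] * (SAMPLE_RATE * 700 // 1000)
gates = make_gates(vocalization)
print([len(g) / SAMPLE_RATE for g in gates])  # gate durations: 0.1 s up to 0.7 s
```

Each gate simply extends the previous one, which is why recognition accuracy can only be compared across gates of the same stimulus, as the study does.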
DISGUST: Features and Clinical Implications
Journal of Social and Clinical Psychology, Vol. 24, No. 7, 2005, pp. 932–962.

DISGUST: CHARACTERISTIC FEATURES, SOCIAL MANIFESTATIONS, AND CLINICAL IMPLICATIONS
Bunmi O. Olatunji, University of Massachusetts; Craig N. Sawchuk, University of Washington School of Medicine

Emotions have been a long-standing cornerstone of research in social and clinical psychology. Although the systematic examination of emotional processes has yielded a rather comprehensive theoretical and scientific literature, dramatically less empirical attention has been devoted to disgust. In the present article, the nature, experience, and other associated features of disgust are outlined. We also review the domains of disgust and highlight how these domains have expanded over time. The function of disgust in various social constructions, such as cigarette smoking, vegetarianism, and homophobia, is highlighted. Disgust is also becoming increasingly recognized as an influential emotion in the onset, maintenance, and treatment of various phobic states, Obsessive-Compulsive Disorder, and eating disorders. In comparison to the other emotions, disgust offers great promise for future social and clinical research efforts, and prospective studies designed to improve our understanding of disgust are outlined.

The nature, structure, and function of emotions have a rich tradition in the social and clinical psychology literature (Cacioppo & Gardner, 1999). Although emotion theorists have contested the number of discrete emotional states and their operational definitions (Plutchik, 2001), most agree that emotions are highly influential in organizing thought processes and behavioral tendencies (Izard, 1993; John- …

Preparation of this manuscript was supported in part by NIMH NRSA grant 1F31MH067519-1A1 awarded to Bunmi O. …
About Emotions There Are 8 Primary Emotions. You Are Born with These
About Emotions

There are 8 primary emotions. You are born with these emotions wired into your brain. That wiring causes your body to react in certain ways, and causes you to have certain urges, when the emotion arises. Here is a list of the primary emotions:

Eight Primary Emotions
- Anger: fury, outrage, wrath, irritability, hostility, resentment and violence.
- Sadness: grief, sorrow, gloom, melancholy, despair, loneliness, and depression.
- Fear: anxiety, apprehension, nervousness, dread, fright, and panic.
- Joy: enjoyment, happiness, relief, bliss, delight, pride, thrill, and ecstasy.
- Interest: acceptance, friendliness, trust, kindness, affection, love, and devotion.
- Surprise: shock, astonishment, amazement, and wonder.
- Disgust: contempt, disdain, scorn, aversion, distaste, and revulsion.
- Shame: guilt, embarrassment, chagrin, remorse, regret, and contrition.

All other emotions are made up by combining these basic 8 emotions. Sometimes we have secondary emotions, an emotional reaction to an emotion. We learn these. Some examples:
- Feeling shame when you get angry.
- Feeling angry when you have a shame response (e.g., hurt feelings).
- Feeling fear when you get angry (maybe you've been punished for anger).

There are many more. These are NOT wired into our bodies and brains, but are learned from our families, our culture, and others. When you have a secondary emotion, the key is to figure out what the primary emotion, the feeling at the root of your reaction, is, so that you can take an action that is most helpful.
Interoception: the Sense of the Physiological Condition of the Body AD (Bud) Craig
Interoception: the sense of the physiological condition of the body
AD (Bud) Craig

Converging evidence indicates that primates have a distinct cortical image of homeostatic afferent activity that reflects all aspects of the physiological condition of all tissues of the body. This interoceptive system, associated with autonomic motor control, is distinct from the exteroceptive system (cutaneous mechanoreception and proprioception) that guides somatic motor activity. The primary interoceptive representation in the dorsal posterior insula engenders distinct, highly resolved feelings from the body that include pain, temperature, itch, sensual touch, muscular and visceral sensations, vasomotor activity, hunger, thirst, and 'air hunger'. In humans, a meta-representation of the primary interoceptive activity is engendered in the right anterior insula, which seems to provide the basis for the subjective image of the material self as a feeling (sentient) entity, that is, emotional awareness.

… the body share. Recent findings that compel a conceptual shift resolve these issues by showing that all feelings from the body are represented in a phylogenetically new system in primates. This system has evolved from the afferent limb of the evolutionarily ancient, hierarchical homeostatic system that maintains the integrity of the body. These feelings represent a sense of the physiological condition of the entire body, redefining the category 'interoception'. The present article summarizes this new view; more detailed reviews are available elsewhere [1,2].

A homeostatic afferent pathway. Anatomical characteristics: Cannon [3] recognized that the neural processes (autonomic, neuroendocrine and behavioral) that maintain optimal physiological balance in the body, or homeostasis, must receive afferent inputs that report the condition of the tissues of the body. …

Addresses: Atkinson Pain Research Laboratory, Division of Neurosurgery, …
Exposure to Negative Socio-Emotional Events Induces Sustained Alteration of Resting-State Brain Networks in the Elderly
Exposure to negative socio-emotional events induces sustained alteration of resting-state brain networks in the elderly

Authors: Sebastian Baez Lugo (1,2,*), Yacila I. Deza-Araujo (1,2), Fabienne Collette (3), Patrik Vuilleumier (1,2), Olga Klimecki (1,4), and the Medit-Ageing Research Group

Affiliations: 1 Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland. 2 Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Neuroscience, Medical School, University of Geneva, Geneva, Switzerland. 3 GIGA-CRC In Vivo Imaging Research Unit, University of Liège, Liège, Belgium. 4 Psychology Department, Technische Universität Dresden, Dresden, Germany.
* Correspondence concerning this article should be addressed to: [email protected]

Research Article. Keywords: Ageing, Anxiety, Rumination, Emotional resilience, Default Mode Network, Functional connectivity, Amygdala, Insula, Posterior cingulate cortex, fMRI

Posted Date: November 3rd, 2020. DOI: https://doi.org/10.21203/rs.3.rs-91196/v2. License: This work is licensed under a Creative Commons Attribution 4.0 International License.

Abstract: Socio-emotional functions seem well-preserved in the elderly. …
Emotion Classification Based on Biophysical Signals and Machine Learning Techniques
Symmetry (MDPI). Article.

Emotion Classification Based on Biophysical Signals and Machine Learning Techniques
Oana Bălan 1,*, Gabriela Moise 2, Livia Petrescu 3, Alin Moldoveanu 1, Marius Leordeanu 1 and Florica Moldoveanu 1

1 Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Bucharest 060042, Romania
2 Department of Computer Science, Information Technology, Mathematics and Physics (ITIMF), Petroleum-Gas University of Ploiesti, Ploiesti 100680, Romania
3 Faculty of Biology, University of Bucharest, Bucharest 030014, Romania
* Correspondence: [email protected]; Tel.: +40722276571

Received: 12 November 2019; Accepted: 18 December 2019; Published: 20 December 2019

Abstract: Emotions constitute an indispensable component of our everyday life. They consist of conscious mental reactions towards objects or situations and are associated with various physiological, behavioral, and cognitive changes. In this paper, we propose a comparative analysis of different machine learning and deep learning techniques, with and without feature selection, for binary classification of the six basic emotions (anger, disgust, fear, joy, sadness, and surprise) into two symmetrical categorical classes (emotion and no emotion), using the physiological recordings and subjective ratings of valence, arousal, and dominance from the DEAP (Dataset for Emotion Analysis using EEG, Physiological and Video Signals) database. The results showed that the maximum classification accuracies for each emotion were: anger 98.02%, joy 100%, surprise 96%, disgust 95%, fear 90.75%, and sadness 90.08%. For four of the emotions (anger, disgust, fear, and sadness), the classification accuracies were higher without feature selection.
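The binary formulation the abstract describes (each emotion versus its absence) can be sketched with any off-the-shelf classifier. The toy example below uses a hand-rolled nearest-centroid rule on invented "valence/arousal" feature pairs; the feature values, labels and class structure are illustrative assumptions, not the DEAP data or the authors' pipeline:

```python
import math

# Invented training data: (valence, arousal) pairs -> 1 = "emotion", 0 = "no emotion".
# DEAP provides real physiological features; these numbers are placeholders.
train = [((8.1, 7.9), 1), ((7.5, 8.3), 1), ((7.9, 7.2), 1),
         ((2.2, 2.5), 0), ((1.8, 3.1), 0), ((2.9, 2.0), 0)]

def centroid(points):
    """Mean of a list of 2-D feature points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def fit(data):
    """Compute one centroid per class: a minimal nearest-centroid 'model'."""
    by_class = {0: [], 1: []}
    for x, y in data:
        by_class[y].append(x)
    return {label: centroid(pts) for label, pts in by_class.items()}

def predict(model, x):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    return min(model, key=lambda label: math.dist(x, model[label]))

model = fit(train)
print(predict(model, (7.0, 7.5)))  # near the "emotion" centroid -> 1
print(predict(model, (2.5, 2.8)))  # near the "no emotion" centroid -> 0
```

Running one such binary classifier per emotion, then comparing accuracy with and without a feature-selection step, mirrors the comparison structure of the paper without reproducing its methods.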
Emotion Classification Based on Brain Wave: a Survey
Li et al., Human-centric Computing and Information Sciences (2019) 9:42, https://doi.org/10.1186/s13673-019-0201-x. Review. Open Access.

Emotion classification based on brain wave: a survey
Ting-Mei Li 1, Han-Chieh Chao 1,2,* and Jianming Zhang 3
* Correspondence: [email protected]. 1 Department of Electrical Engineering, National Dong Hwa University, Hualien, Taiwan. Full list of author information is available at the end of the article.

Abstract: Brain wave emotion analysis is the most novel method of emotion analysis at present. With the progress of brain science, it has been found that human emotions are produced by the brain. As a result, many brain-wave emotion related applications have appeared. However, the complexity of human emotion makes brain wave emotion analysis difficult. Many researchers have used different classification methods and proposed methods for the classification of brain wave emotions. In this paper, we investigate the existing methods of brain wave emotion classification and describe the various classification methods.

Keywords: EEG, Emotion, Classification

Introduction: Brain science has shown that human emotions are controlled by the brain [1]. The brain produces brain waves as it transmits messages. Brain wave data is one kind of biological signal, and biological signals usually carry emotion features. The features of emotion can be extracted through the analysis of brain wave signals. But the environmental and cultural differences of human development make human emotion complex. Therefore, in the analysis of brain wave emotion, the classification algorithm is very important. In this article, we focus on the classification of brain wave emotions.
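Most EEG emotion classifiers surveyed in work like this start from spectral features of the raw signal. A common first step, shown here as a generic illustration rather than any specific method from the survey, is to estimate power in the classic frequency bands; a synthetic 10 Hz oscillation correctly shows up as alpha-dominant:

```python
import numpy as np

FS = 256  # sampling rate in Hz (typical for consumer EEG headsets; assumed here)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # band edges in Hz

def band_powers(signal, fs=FS):
    """Average spectral power per EEG band, via a plain periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

t = np.arange(0, 2.0, 1.0 / FS)      # two seconds of "recording"
eeg = np.sin(2 * np.pi * 10 * t)     # pure 10 Hz oscillation (alpha band)
powers = band_powers(eeg)
print(max(powers, key=powers.get))   # the dominant band
```

The resulting band-power vector (one value per band, per channel) is the kind of feature that then feeds the classifiers the survey compares.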
Emotion Autonomic Arousal
PSYC 100: Emotions (2/23/2005)

Emotion
- Emotions reflect a "stirred up" state.
- Emotions have valence: positive or negative.
- Emotions are thought to have 3 components:
  - Physiological arousal
  - Subjective experience
  - Behavioral expression

Autonomic Arousal
- Increased heart rate
- Rapid breathing
- Dilated pupils
- Sweating
- Decreased salivation
- Galvanic Skin Response

Theories of Emotion: James-Lange
- Each emotion has an autonomic signature.

Assessment of James-Lange Theory of Emotion
- Cannon's arguments against the theory:
  - Visceral responses are slower than emotions.
  - The same visceral responses are associated with many emotions (increased heart rate with anger and joy).
- Subsequent research provides support:
  - Different emotions are associated with different patterns of visceral activity.
  - Accidental transection of the spinal cord greatly diminishes emotional reactivity (prevents visceral signals from reaching the brain).

Cannon-Bard Criticisms
- Arousal without emotion (exercise).

Facial Feedback Model
- Similar to James-Lange, but the signature is facial rather than autonomic: a facial signature is associated with each emotion.

Facial Expression of Emotion
- There is an evolutionary link between the experience of emotion and facial expression of emotion: facial expressions serve to inform others of our emotional state.
- Different facial expressions are associated with different emotions (Ekman's research).
- Facial expression can alter emotional experience: engaging in different facial expressions can alter heart rate and skin temperature.

Emotions and Darwin
- Adaptive value
- Facial expression in infants
- Facial expression across cultures
- Understanding of expressions across cultures

Facial Expressions of Emotion
- Right cortex controls the left side of the face.

Emotions and Learning
- But experience matters: research with isolated monkeys.

Culture and Display Rules
- Public vs. …
Conceptual Framework for Quantum Affective Computing and Its Use in Fusion of Multi-Robot Emotions
Electronics (MDPI). Article.

Conceptual Framework for Quantum Affective Computing and Its Use in Fusion of Multi-Robot Emotions
Fei Yan 1, Abdullah M. Iliyasu 2,3,* and Kaoru Hirota 3,4

1 School of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130022, China
2 College of Engineering, Prince Sattam Bin Abdulaziz University, Al-Kharj 11942, Saudi Arabia
3 School of Computing, Tokyo Institute of Technology, Yokohama 226-8502, Japan
4 School of Automation, Beijing Institute of Technology, Beijing 100081, China
* Correspondence: [email protected]

Abstract: This study presents a modest attempt to interpret, formulate, and manipulate the emotion of robots within the precepts of quantum mechanics. Our proposed framework encodes emotion information as a superposition state, whilst unitary operators are used to manipulate the transition of emotion states, which are subsequently recovered via appropriate quantum measurement operations. The framework described provides essential steps towards exploiting the potency of quantum mechanics in a quantum affective computing paradigm. Further, the emotions of multi-robots in a specified communication scenario are fused using quantum entanglement, thereby reducing the number of qubits required to capture the emotion states of all the robots in the environment, so that fewer quantum gates are needed to transform the emotion of all or part of the robots from one state to another. In addition to the mathematical rigour expected of the proposed framework, we present a few simulation-based demonstrations to illustrate its feasibility and effectiveness. This exposition is an important step in the transition of formulations of emotional intelligence to the quantum era.
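The ingredients the abstract names (emotion encoded as a superposition, manipulated by unitary operators, and shared across robots via entanglement) can be sketched with a plain state-vector simulation. The emotion basis labels and the rotation angle below are invented for illustration; this is not the paper's own encoding or operators:

```python
import numpy as np

# One qubit spanning a toy emotion basis: |0> = "calm", |1> = "excited".
calm = np.array([1.0, 0.0])

def ry(theta):
    """Single-qubit rotation about Y: a unitary that mixes the basis states."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# Drive a robot's emotion from "calm" into an equal superposition.
state = ry(np.pi / 2) @ calm
probs = np.abs(state) ** 2            # Born rule: measurement probabilities
print(np.round(probs, 3))             # roughly equal chance of either emotion

# Entangle two robots' emotion qubits into a Bell state (|00> + |11>) / sqrt(2):
# a 4-amplitude vector captures both robots at once, the qubit saving the
# abstract alludes to.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
print(np.round(np.abs(bell) ** 2, 3))  # outcomes: both "calm" or both "excited"
```

Measuring either qubit of the Bell state fixes the other, which is the sense in which a single operation can steer the fused emotion of the group.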
WHO CAN READ YOUR FACIAL EXPRESSION? A Comparison of Humans and Machine Learning Classifiers Detecting Emotion from Faces of People with Different Coverings
A Comparison of Humans and Machine Learning Classifiers Detecting Emotion from Faces of People with Different Coverings
Harisu Abdullahi Shehu · Will N. Browne · Hedwig Eisenbarth
Received: 9th August 2021

Abstract: Partial face coverings such as sunglasses and face masks unintentionally obscure facial expressions, causing a loss of accuracy when humans and computer systems attempt to categorise emotion. Experimental Psychology would benefit from understanding how computational systems differ from humans when categorising emotions, as automated recognition systems become available. We investigated the impact of sunglasses and different face masks on the ability to categorize emotional facial expressions in humans and computer systems, to directly compare accuracy in a naturalistic context. Computer systems, represented by the VGG19 deep learning algorithm, and humans assessed images of people with varying emotional facial expressions and with four different types of coverings, i.e. unmasked, with a mask covering the lower face, a partial mask with transparent mouth window, and with sunglasses. Further…

Harisu Abdullahi Shehu (corresponding author), School of Engineering and Computer Science, Victoria University of Wellington, 6012 Wellington. E-mail: [email protected]. ORCID: 0000-0002-9689-3290.
Will N. Browne, School of Electrical Engineering and Robotics, Queensland University of Technology, Brisbane QLD 4000, Australia. Tel: +61 45 2468 148. E-mail: [email protected]. ORCID: 0000-0001-8979-2224.
Hedwig Eisenbarth, School of Psychology, Victoria University of Wellington, 6012 Wellington. Tel: +64 4 463 9541. E-mail: [email protected]. ORCID: 0000-0002-0521-2630.
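The covering conditions the abstract lists can be simulated directly on pixel arrays before images reach a classifier such as VGG19. The sketch below, an illustration with a blank placeholder image rather than the authors' stimuli or pipeline, occludes the lower half of a face image the way an opaque face mask would:

```python
import numpy as np

def apply_lower_face_mask(image, mask_value=0):
    """Occlude the lower half of an H x W x 3 image, as a face mask would.

    `mask_value` is the fill intensity; 0 simulates an opaque covering.
    """
    masked = image.copy()
    h = image.shape[0]
    masked[h // 2:, :, :] = mask_value
    return masked

# Placeholder uniform-grey "face" at the 224 x 224 input size VGG19 expects.
face = np.full((224, 224, 3), 128, dtype=np.uint8)
masked = apply_lower_face_mask(face)

print(masked.shape)                              # input shape is preserved
print(masked[:112].mean(), masked[112:].mean())  # upper half intact, lower occluded
```

Sunglasses or a transparent mouth window would be modelled the same way, by filling (or leaving) different regions, which is what makes the four conditions directly comparable for both humans and the network.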
Pell Biolpsychology 2015 0.Pdf
Biological Psychology 111 (2015) 14–25. Contents lists available at ScienceDirect. Journal homepage: www.elsevier.com/locate/biopsycho

Preferential decoding of emotion from human non-linguistic vocalizations versus speech prosody
M.D. Pell a,b,*, K. Rothermich a, P. Liu a, S. Paulmann c, S. Sethi a, S. Rigoulot b
a School of Communication Sciences and Disorders, McGill University, Montreal, Canada
b International Laboratory for Brain, Music, and Sound Research, Montreal, Canada
c Department of Psychology and Centre for Brain Science, University of Essex, Colchester, United Kingdom

Article history: Received 17 March 2015; Received in revised form 4 August 2015; Accepted 19 August 2015; Available online 22 August 2015

Abstract: This study used event-related brain potentials (ERPs) to compare the time course of emotion processing from non-linguistic vocalizations versus speech prosody, to test whether vocalizations are treated preferentially by the neurocognitive system. Participants passively listened to vocalizations or pseudo-utterances conveying anger, sadness, or happiness as the EEG was recorded. Simultaneous effects of vocal expression type and emotion were analyzed for three ERP components (N100, P200, late positive component). Emotional vocalizations and speech were differentiated very early (N100), and vocalizations elicited stronger, earlier, and more differentiated P200 responses than speech. At later stages (450–700 ms), anger vocalizations evoked a stronger late positivity (LPC) than other vocal expressions, which was similar but delayed for angry speech. Individuals with high trait anxiety exhibited early, heightened sensitivity to vocal emotions (particularly vocalizations).

Keywords: Emotional communication, Vocal expression, Speech prosody, ERPs