Classification of Human Emotions from Electroencephalogram (EEG) Signal Using Deep Neural Network


(IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 8, No. 9, 2017

Abeer Al-Nafjan, College of Computer and Information Sciences, Imam Muhammad bin Saud University, Riyadh, Saudi Arabia
Areej Al-Wabil, Center for Complex Engineering Systems, King Abdulaziz City for Science and Technology, Riyadh, Saudi Arabia
Manar Hosny, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia
Yousef Al-Ohali, College of Computer and Information Sciences, King Saud University, Riyadh, Saudi Arabia

Abstract—Estimation of human emotions from Electroencephalogram (EEG) signals plays a vital role in developing robust Brain-Computer Interface (BCI) systems. In our research, we used a Deep Neural Network (DNN) to address EEG-based emotion recognition, motivated by the recent advances in accuracy and efficiency of deep learning techniques in pattern recognition and classification applications. We adapted a DNN to identify the human emotions of a given EEG signal (DEAP dataset) from power spectral density (PSD) and frontal asymmetry features. The proposed approach is compared to state-of-the-art emotion detection systems on the same dataset. Results show how EEG-based emotion recognition can greatly benefit from using DNNs, especially when a large amount of training data is available.

Keywords—Electroencephalogram (EEG); Brain-Computer Interface (BCI); emotion recognition; affective state; Deep Neural Network (DNN); DEAP dataset

I. INTRODUCTION

Recent developments in BCI (Brain-Computer Interface) technologies have facilitated emotion detection and classification. Many BCI studies have investigated, detected, and recognized the user's affective state and applied the findings in varied contexts, including communication, education, entertainment, and medical applications. BCI researchers are considering different responses in various frequency bands, ways of eliciting emotions, and various models of affective states. Different techniques and approaches have been used in the processing steps to estimate the emotional state from the acquired signals.

BCI systems based on emotion detection are considered a passive/involuntary control modality. For example, affective computing focuses on developing applications which automatically adapt to changes in the user's state, thereby improving interaction and leading to more natural and effective usability (e.g., games adjusting to the interest of the user) [1]. Recognizing a user's affective state can also be used to optimize training and enhance BCI operations [2].

EEG is often used in BCI research experimentation because the process is non-invasive to the research subject and minimal risk is involved. The devices' usability, reliability, cost-effectiveness, and the relative convenience of conducting studies and recruiting participants due to their portability have been cited as factors influencing the increased adoption of this method in applied research contexts [3], [4]. These advantages are often accompanied by challenges such as low spatial resolution and difficulty managing signal-to-noise ratios. Power-line noise and artifacts caused by muscle or eye movement may be permissible, or even exploited as a control signal, for certain EEG-based BCI applications. Therefore, signal-processing techniques can be used to eliminate or reduce such artifacts.

This paper explains our research, which involves implementing and examining the performance of a Deep Neural Network (DNN) in modeling a benchmark emotion dataset for classification.

The remainder of this paper is organized as follows: Section 2 provides background on emotion models; Section 3 briefly reviews EEG-based emotion detection systems; Section 4 describes our proposed method and techniques; Section 5 discusses the results; and Section 6 draws conclusions.

II. EMOTION MODEL

Emotions can be generally classified on the basis of two models: discrete and dimensional [2], [5]. The dimensional model of emotion proposes that emotional states can be accurately represented by a small number of underlying affective dimensions. It represents the continuous emotional state as a vector in a multidimensional space. Most dimensional models incorporate valence and arousal. Valence refers to the degree of 'pleasantness' associated with an emotion. It ranges from unpleasant (e.g., sad, stressed) to pleasant (e.g., happy, elated). Arousal, in contrast, refers to the strength of the experienced emotion. Arousal occurs along a continuum and may range from inactive (e.g., uninterested, bored) to active (e.g., alert, excited) [5].

Discrete theories of emotion propose the existence of a small number of separate emotions, characterized by coordinated response patterns in physiology, neural anatomy, and morphological expressions. Six basic emotions frequently advanced in research papers include happiness, sadness, anger, disgust, fear, and surprise [6], [7].

Emotion measurement and assessment methods can be subjective and/or objective. Subjective measures use different self-report instruments, such as questionnaires, adjective checklists, and pictorial tools, to represent any set of emotions, and can be used to measure mixed emotions. Self-reporting techniques, however, do not provide objective measurements: they measure conscious emotions, and they cannot capture the real-time dynamics of the experience.

Objective measures can be obtained without the user's assistance. They use physiological cues derived from the physiological theories of emotion. Instruments that measure blood pressure, skin responses, pupillary responses, brain waves, and heart responses are all used as objective measurement methods. Moreover, hybrid methods combining both subjective and objective measures have been used to increase the accuracy and reliability of detecting affective emotional states [2], [6].

III. EEG-BASED EMOTION DETECTION SYSTEMS

The volume of studies and publications on EEG-based emotion recognition has surged in recent years. Different models and techniques yield a wide range of systems. However, these systems can be differentiated by their differences in stimulus, detection features, temporal window, classifiers, number of participants, and emotion model.

Although the number of research studies conducted on EEG-based emotion recognition has been increasing in recent years, EEG-based emotion recognition is still a relatively new area of research, and the effectiveness and efficiency of these algorithms are somewhat limited. Unsolved issues in current algorithms and approaches include time constraints, accuracy, the number of electrodes, the number of recognized emotions, and benchmark EEG affective databases [6], [8]. Accuracy and reliability of sensory

IV. PROPOSED SYSTEM

The performance of EEG recognition systems depends on the feature extraction algorithm and the classification process. Hence, the aim of our study is to investigate the possibility of using EEG for the detection of four affective states, namely excitement, meditation, boredom, and frustration, using classification and pattern recognition techniques. For this purpose, we conducted rigorous offline analysis to investigate computational intelligence for emotion detection and classification. We used deep learning classification on the DEAP dataset to explore how to employ intelligent computational methods, in the form of a classification algorithm that could effectively mirror the affective states of subjects. We also compared our classification performance with a Random Forest classifier.

We built our system in an open-source programming language, Python, and used the Scikit-Learn toolbox for machine learning, along with SciPy for EEG filtering and preprocessing, MNE for EEG-specific signal processing, and the Keras library for deep learning.

In this section, we illustrate our methodology along with some implementation details of our proposed system. We start by describing the benchmark dataset, then describe our extracted features, and finally discuss the classification process and model evaluation method.

A. DEAP Dataset

DEAP is a benchmark affective EEG database for the analysis of spontaneous emotions. The DEAP database was prepared by Queen Mary University of London and published in [9]. The database contains physiological signals of 32 participants. It was created with the goal of building an adaptive music video recommendation system based on the user's current emotion. DEAP has been used in a number of studies, and it has proved well-suited for testing new algorithms [9], [10].

To evaluate our proposed classification method, we used the preprocessed EEG dataset from the DEAP database, in which the original recorded data at 512 Hz were down-sampled to a sampling rate of 128 Hz, a bandpass frequency filter of 4.0-45.0 Hz was applied, and EOG artifacts were eliminated from the signals.

B. Feature Extraction

Feature extraction plays a critical role in designing an EEG-based BCI system. Different features have been used in the literature, including Common Spatial Patterns, Higher Order Crossings, Hjorth parameters, time-domain statistics, EEG
Recommended publications
  • Why Feelings Stray: Sources of Affective Misforecasting in Consumer Behavior
    Vanessa M. Patrick, University of Georgia; Deborah J. MacInnis, University of Southern California

    ABSTRACT: Affective misforecasting (AMF) is defined as the gap between predicted and experienced affect. Based on prior research that examines AMF, the current study uses qualitative and quantitative data to examine the sources of AMF (i.e., why it occurs) in the consumption domain. The authors find evidence supporting some sources of AMF identified in the psychology literature, develop a fuller understanding of others, and find evidence for novel sources of AMF not previously explored. Importantly, they find considerable differences in the sources of AMF depending on whether feelings are worse than or better than forecast.

    … drivers of AMF has considerable import for consumer behavior, particularly in the area of consumer satisfaction, brand loyalty and positive word-of-mouth. Figure 1 depicts the process by which affective misforecasting occurs (for greater detail see MacInnis, Patrick and Park 2005). As Figure 1 suggests, affective forecasts are based on a representation of a future event and an assessment of the possible affective reactions to this event. AMF occurs when experienced affect deviates from the forecasted affect on one or more of the following dimensions: valence, intensity and duration. Since forecasts can be made regarding the valence of the feelings, the specific emotions expected to be experienced, the intensity of feelings or the duration of a projected affective response, affective misforecasting can consequently occur along any of these dimensions.

    INTRODUCTION: Before purchase: "I can't wait to use this all the time, it is going to be so much fun, I'm going to go out with my buddies
  • What We Mean When We Talk About Suffering—And Why Eric Cassell Should Not Have the Last Word
    Tyler Tate* and Robert Pearlman†, Perspectives in Biology and Medicine, Volume 62, Number 1, Winter 2019, pp. 95-110. Published by Johns Hopkins University Press. https://muse.jhu.edu/article/722412

    ABSTRACT: This paper analyzes the phenomenon of suffering and its relationship to medical practice by focusing on the paradigmatic work of Eric Cassell. First, it explains Cassell's influential model of suffering. Second, it surveys various critiques of Cassell. Next it outlines the authors' concerns with Cassell's model: it is aggressive, obscure, and fails to capture important features of the suffering experience. Finally, the authors propose a conceptual framework to help clarify the distinctive nature of subjective patient suffering. This framework contains two necessary conditions: (1) a loss of a person's sense of self, and (2) a negative affective experience. The authors suggest how this framework can be used in the medical encounter to promote clinician-patient communication and the relief of suffering.

    *Center for Ethics in Health Care and School of Medicine, Oregon Health and Science University, Portland. †National Center for Ethics in Health Care, Washington, DC, and School of Medicine, University of Washington, Seattle. Correspondence: Tyler Tate, Oregon Health and Science University, School of Medicine, Department of Pediatrics, 3181 SW Sam Jackson Park Road, Portland, OR 97239-3098.
  • Measuring the Happiness of Large-Scale Written Expression: Songs, Blogs, and Presidents
    J Happiness Stud, DOI 10.1007/s10902-009-9150-9. RESEARCH PAPER. Peter Sheridan Dodds · Christopher M. Danforth. © The Author(s) 2009. This article is published with open access at Springerlink.com.

    Abstract: The importance of quantifying the nature and intensity of emotional states at the level of populations is evident: we would like to know how, when, and why individuals feel as they do if we wish, for example, to better construct public policy, build more successful organizations, and, from a scientific perspective, more fully understand economic and social phenomena. Here, by incorporating direct human assessment of words, we quantify happiness levels on a continuous scale for a diverse set of large-scale texts: song titles and lyrics, weblogs, and State of the Union addresses. Our method is transparent, improvable, capable of rapidly processing Web-scale texts, and moves beyond approaches based on coarse categorization. Among a number of observations, we find that the happiness of song lyrics trends downward from the 1960s to the mid 1990s while remaining stable within genres, and that the happiness of blogs has steadily increased from 2005 to 2009, exhibiting a striking rise and fall with blogger age and distance from the Earth's equator.

    Keywords: Happiness · Hedonometer · Measurement · Emotion · Written expression · Remote sensing · Blogs · Song lyrics · State of the Union addresses

    1 Introduction. The desire for well-being and the avoidance of suffering arguably underlies all behavior (Argyle 2001; Layard 2005; Gilbert 2006; Snyder and Lopez 2009).
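The word-level method this abstract describes, averaging direct human happiness ratings of words over a text, can be sketched in a few lines. The lexicon values below are invented stand-ins for the crowd-sourced scores the authors actually use; only the averaging scheme is taken from the description above.

```python
# Toy word-happiness scores on a 1-9 scale; real crowd-sourced values differ.
HAPPINESS = {"love": 8.4, "happy": 8.2, "rain": 5.0, "war": 1.8, "sad": 2.4}

def text_happiness(text, lexicon=HAPPINESS):
    """Mean per-word happiness over the words of `text` found in the lexicon."""
    scores = [lexicon[w] for w in text.lower().split() if w in lexicon]
    return sum(scores) / len(scores) if scores else None

print(text_happiness("War and sad rain"))
```

Because the score is a plain average over matched words, the method is transparent and scales to Web-size corpora, which is the property the abstract emphasizes.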
  • About Emotions
    About Emotions. There are 8 primary emotions. You are born with these emotions wired into your brain. That wiring causes your body to react in certain ways and for you to have certain urges when the emotion arises. Here is a list of primary emotions:

    Eight Primary Emotions
    Anger: fury, outrage, wrath, irritability, hostility, resentment and violence.
    Sadness: grief, sorrow, gloom, melancholy, despair, loneliness, and depression.
    Fear: anxiety, apprehension, nervousness, dread, fright, and panic.
    Joy: enjoyment, happiness, relief, bliss, delight, pride, thrill, and ecstasy.
    Interest: acceptance, friendliness, trust, kindness, affection, love, and devotion.
    Surprise: shock, astonishment, amazement, astound, and wonder.
    Disgust: contempt, disdain, scorn, aversion, distaste, and revulsion.
    Shame: guilt, embarrassment, chagrin, remorse, regret, and contrition.

    All other emotions are made up by combining these basic 8 emotions. Sometimes we have secondary emotions, an emotional reaction to an emotion. We learn these. Some examples of these are:
    o Feeling shame when you get angry.
    o Feeling angry when you have a shame response (e.g., hurt feelings).
    o Feeling fear when you get angry (maybe you've been punished for anger).

    There are many more. These are NOT wired into our bodies and brains, but are learned from our families, our culture, and others. When you have a secondary emotion, the key is to figure out what the primary emotion, the feeling at the root of your reaction, is, so that you can take an action that is most helpful.
  • Preverbal Infants Match Negative Emotions to Events
    Developmental Psychology © 2019 American Psychological Association 2019, Vol. 55, No. 6, 1138–1149 0012-1649/19/$12.00 http://dx.doi.org/10.1037/dev0000711 How Do You Feel? Preverbal Infants Match Negative Emotions to Events Ashley L. Ruba, Andrew N. Meltzoff, and Betty M. Repacholi University of Washington There is extensive disagreement as to whether preverbal infants have conceptual categories for different emotions (e.g., anger vs. disgust). In addition, few studies have examined whether infants have conceptual categories of emotions within the same dimension of valence and arousal (e.g., high arousal, negative emotions). The current experiments explore one aspect of infants’ ability to form conceptual categories of emotions: event-emotion matching. Three experiments investigated whether infants match different negative emotions to specific events. In Experiment 1, 14- and 18-month-olds were randomly assigned to 1 of 3 negative emotion conditions (Anger, Fear, or Disgust). Infants were familiarized with an Emoter interacting with objects in an anger-eliciting event (Unmet Goal) and a disgust-eliciting event (New Food). After each event, the Emoter expressed an emotion that was either congruent or incongruent with the event. Infants matched unmet goals to the expression of anger. However, neither age matched the expression of disgust to an event involving exposure to new food. To probe whether this was a design artifact, a revised New Food event and a fear-congruent event (Strange Toy) were created for Experiment 2. Infants matched the expression of disgust to the new food event, but they did not match fear to an event involving an unfamiliar object.
  • Emotion Classification Using Physiological Signals
    Emotion Classification Using Physiological Signals. Byoung-Jun Park1, Eun-Hye Jang1, Myoung Ae Chung1, Sang-Hyeob Kim1*, Jin Hun Sohn2*. 1IT Convergence Technology Research Laboratory, Electronics and Telecommunications Research Institute, Daejeon, 305-700. 2Department of Psychology/Brain Research Institute, Chungnam National University, Daejeon, 305-765.

    ABSTRACT. Objective: The aim of this study is to discriminate negative emotions, such as sadness, fear, surprise, and stress using physiological signals. Background: Recently, the main topic of emotion classification research is to recognize human's feeling or emotion using various physiological signals. It is one of the core processes to implement emotional intelligence in human computer interaction (HCI) research. Method: Electrodermal activity (EDA), electrocardiogram (ECG), skin temperature (SKT), and photoplethysmography (PPG) are recorded and analyzed as physiological signals. And emotional stimuli are audio-visual film clips which have been examined for their appropriateness and effectiveness through a preliminary experiment. For classification of negative emotions, machine learning algorithms, i.e., LDA, CART, SOM, and Naïve Bayes, are used. Results: Result of emotion classification shows that the accuracy of emotion classification by CART (84.0%) was the highest and by LDA (50.7%) was the lowest. SOM showed emotion classification accuracy of 51.2% and Naïve Bayes was 76.2%. Conclusion: We could identify that CART was the optimal emotion classification algorithm for classifying 4 negative emotions (sadness, fear, surprise, and stress). Application: This result can be helpful to provide the basis for the emotion recognition technique in HCI. Keywords: Emotion classification, Negative emotion, Machine learning algorithm, Physiological signal

    1. Introduction ... Nasoz, Alvarez, Lisetti, and Finkelstein, 2003).
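The CART classification this abstract reports can be approximated with scikit-learn's DecisionTreeClassifier, which implements a CART-style algorithm. The synthetic features below merely stand in for the EDA/ECG/SKT/PPG statistics the study extracts; the tree depth and feature count are illustrative, not the study's settings.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-ins for per-trial physiological feature vectors and
# four negative-emotion labels (sadness, fear, surprise, stress).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
y = rng.integers(0, 4, size=200)

cart = DecisionTreeClassifier(criterion="gini", max_depth=5, random_state=1)
cart.fit(X, y)
print(cart.predict(X[:5]))
```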
  • OWNING YOUR FEELINGS
    OWNING YOUR FEELINGS. It can be easy to get caught up in your emotions as you're feeling them. Most people don't think about what emotions they are dealing with, but taking the time to really identify what you're feeling can help you to better cope with challenging situations. The English language has over 3,000 words for emotions.

    Tips for success

    Allow yourself to feel. Sometimes there are societal pressures that encourage people to shut down their emotions, often expressed through statements like, "Big girls don't cry," or "Man up." These outdated ideas are harmful, not helpful. Everyone has emotions; they are part of the human experience, and you have every right to feel them, regardless of gender, sexual orientation, ethnicity, socio-economic status, race, political affiliation or religion.

    Don't ignore how you're feeling. Most of us have heard the term "bottling up your feelings" before. When we try to push feelings aside without addressing them, they build strength and make us more likely to "explode" at some point in the future. It may not always be appropriate to process your emotions at the very moment you are feeling them, but try to do so as soon as you can.

    Talk it out. Find someone you trust that you can talk to about how you're feeling.

    Sidebar: People who are good at being specific about identifying and labeling their emotions are less likely to binge drink, be physically aggressive, or self-injure when distressed. When school-aged kids are taught about emotions for 20-30 minutes per week, their social behavior and school performance
  • Fear, Anger, and Risk
    Journal of Personality and Social Psychology, 2001, Vol. 81, No. 1, 146-159. Copyright 2001 by the American Psychological Association, Inc. 0022-3514/01/$5.00. DOI: 10.1037//0022-3514.81.1.146. Jennifer S. Lerner, Carnegie Mellon University; Dacher Keltner, University of California, Berkeley.

    Drawing on an appraisal-tendency framework (J. S. Lerner & D. Keltner, 2000), the authors predicted and found that fear and anger have opposite effects on risk perception. Whereas fearful people expressed pessimistic risk estimates and risk-averse choices, angry people expressed optimistic risk estimates and risk-seeking choices. These opposing patterns emerged for naturally occurring and experimentally induced fear and anger. Moreover, estimates of angry people more closely resembled those of happy people than those of fearful people. Consistent with predictions, appraisal tendencies accounted for these effects: Appraisals of certainty and control moderated and (in the case of control) mediated the emotion effects. As a complement to studies that link affective valence to judgment outcomes, the present studies highlight multiple benefits of studying specific emotions.

    Judgment and decision research has begun to incorporate affect into what was once an almost exclusively cognitive field (for discussion, see Lerner & Keltner, 2000; Loewenstein & Lerner, in press; Loewenstein, Weber, Hsee, & Welch, 2001; Lopes, 1987; Mellers, Schwartz, Ho, & Ritov, 1997). To date, most judgment and decision researchers have taken a valence-based approach to affect, contrasting the influences of positive-affect traits and states. In the present studies we follow the valence tradition by examining the striking influence that feelings can have on normatively unrelated judgments and choices. We diverge in an important way, however, by focusing on the influences of specific emotions rather than on global negative and positive affect (see also Bodenhausen, Sheppard, & Kramer, 1994; DeSteno et al., 2000; Keltner, Ellsworth, & Edwards, 1993; Lerner & Keltner, 2000).
  • From Desire Coherence and Belief Coherence to Emotions: a Constraint Satisfaction Model
    Josef Nerb, University of Freiburg, Department of Psychology, 79085 Freiburg, Germany. http://www.psychologie.uni-freiburg.de/signatures/nerb/ [email protected]

    Abstract: Appraisal theory explains the elicitation of emotional reactions as a consequence of subjective evaluations based on personal needs, goals, desires, abilities, and beliefs. According to the appraisal approach, each emotion is caused by a characteristic pattern of appraisals, and different emotions are associated with different appraisals. This paper presents the DEBECO architecture, a computational framework for modeling the appraisal process. DEBECO is based upon three core coherence principles: belief coherence, desire coherence and belief-desire coherence. These three coherence mechanisms allow two elementary types of goals to be represented: approach and avoidance goals. By way of concrete simulation examples, it is shown how discrete emotions such as anger, pride, sadness, shame, surprise, relief and disappointment emerge out of these

    The model is able to account for the elicitation of discrete emotions using these five criteria. In the DEBECO architecture, the five criteria evolve out of three basic principles: belief coherence, desire coherence and belief-desire coherence. Based on these principles, which are all well supported by empirical evidence from psychological research, DEBECO also allows the representation of two elementary types of goals: approach goals and avoidance goals. The DEBECO architecture is implemented as a parallel constraint satisfaction network and is an extension of Thagard's HOTCO model (Thagard, 2000, 2003; Thagard & Nerb, 2002). Within DEBECO, emotions are seen as valenced reactions to events, persons or objects, with their particular nature being determined by the way in which the eliciting situation is construed (cf.
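The parallel constraint satisfaction mechanism this abstract mentions can be illustrated with a small relaxation loop: units spread activation over excitatory (coherence) and inhibitory (incoherence) links until the network settles. This is a generic connectionist sketch in the spirit of HOTCO-style models, with invented weights and update constants, not Nerb's DEBECO implementation.

```python
import numpy as np

def settle(weights, clamped, steps=200, decay=0.05, rate=0.1):
    """Relax unit activations in a parallel constraint satisfaction net.

    Positive weights encode coherence (excitatory) links, negative weights
    incoherence (inhibitory) links; `clamped` pins evidence units to fixed
    activations while the rest of the network settles.
    """
    acts = np.zeros(len(weights))
    for unit, value in clamped.items():
        acts[unit] = value
    for _ in range(steps):
        acts = np.clip((1 - decay) * acts + rate * (weights @ acts), -1.0, 1.0)
        for unit, value in clamped.items():
            acts[unit] = value
    return acts

# Units: 0 = observed event, 1 = event-consistent appraisal, 2 = rival appraisal.
W = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, -1.0],
              [0.0, -1.0, 0.0]])
acts = settle(W, clamped={0: 1.0})
print(acts)  # the supported appraisal settles positive, its rival negative
```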
  • Emotion Classification Based on Biophysical Signals and Machine Learning Techniques
    Article. Emotion Classification Based on Biophysical Signals and Machine Learning Techniques. Oana Bălan 1,*, Gabriela Moise 2, Livia Petrescu 3, Alin Moldoveanu 1, Marius Leordeanu 1 and Florica Moldoveanu 1. 1 Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Bucharest 060042, Romania; [email protected] (A.M.); [email protected] (M.L.); fl[email protected] (F.M.). 2 Department of Computer Science, Information Technology, Mathematics and Physics (ITIMF), Petroleum-Gas University of Ploiesti, Ploiesti 100680, Romania; [email protected]. 3 Faculty of Biology, University of Bucharest, Bucharest 030014, Romania; [email protected]. * Correspondence: [email protected]; Tel.: +40722276571. Received: 12 November 2019; Accepted: 18 December 2019; Published: 20 December 2019.

    Abstract: Emotions constitute an indispensable component of our everyday life. They consist of conscious mental reactions towards objects or situations and are associated with various physiological, behavioral, and cognitive changes. In this paper, we propose a comparative analysis between different machine learning and deep learning techniques, with and without feature selection, for binarily classifying the six basic emotions, namely anger, disgust, fear, joy, sadness, and surprise, into two symmetrical categorical classes (emotion and no emotion), using the physiological recordings and subjective ratings of valence, arousal, and dominance from the DEAP (Dataset for Emotion Analysis using EEG, Physiological and Video Signals) database. The results showed that the maximum classification accuracies for each emotion were: anger: 98.02%, joy: 100%, surprise: 96%, disgust: 95%, fear: 90.75%, and sadness: 90.08%. In the case of four emotions (anger, disgust, fear, and sadness), the classification accuracies were higher without feature selection.
  • On Emotion Specificity in Decision Making
    Judgment and Decision Making, Vol. 3, No. 1, January 2008, pp. 18-27. On emotion specificity in decision making: Why feeling is for doing. Marcel Zeelenberg 1, Rob M. A. Nelissen 1, Seger M. Breugelmans 2, & Rik Pieters 3. 1 Department of Social Psychology and TIBER, Tilburg University. 2 Department of Developmental, Clinical and Cross-cultural Psychology, Tilburg University. 3 Department of Marketing and TIBER, Tilburg University.

    Abstract: We present a motivational account of the impact of emotion on decision making, termed the feeling-is-for-doing approach. We first describe the psychology of emotion and argue for a need to be specific when studying emotion's impact on decision making. Next we describe what our approach entails and how it relates emotion, via motivation, to behavior. Then we offer two illustrations of our own research that provide support for two important elements in our reasoning. We end with specifying four criteria that we consider to be important when studying how feeling guides our everyday doing. Keywords: emotion, decision making, motivation, action.

    1 Introduction. Most theories of rational decision making are descriptively implausible, especially if taken as process models. The idea that we take the time and invest the effort to produce a list of advantages and disadvantages, or costs and benefits for all alternatives in each single decision and so … 1988). Bounded rationality may be helped by the existence of emotion, because emotions restrict the size of the consideration set and focus the decision maker on certain, relevant aspects of the options (Hanoch, 2001). Emotions assign value to objects, aid the learning of how to obtain those objects, and provide the motivation for doing so (Gifford, 2002).
  • The Thinking-Feeling Connection
    Back from the Bluez, Module 3: The Thinking-Feeling Connection. Contents: The Thinking-Feeling Connection; Making the Connection; Module Summary; About the Modules. Centre for Clinical Interventions: Psychotherapy, Research, Training.

    The Thinking-Feeling Connection. People often believe that the feelings and emotions they experience are determined by external events, situations, and the behaviour of others. For example, we may hear ourselves say, "My boss made me so nervous," "My partner made me so angry," "This trip down south made me feel so relaxed," or "I'm depressed because I didn't get the job I wanted." What is the assumption underlying these statements? That someone or something other than ourselves was directly determining the feelings we experienced. We come to these conclusions automatically without asking ourselves if this assumption is true. However, if we stop to analyse the process that links an external situation to our emotional responses, we will find that there is a step in between.

    How Our Thoughts Influence Our Feelings. What really makes us feel and respond the way we do is often not the situation or the words and actions of another person, but how we perceive that situation or that person's actions. It is how we see something or someone and what we think about it or them that really influences how we feel. It is our thoughts and beliefs about an event that significantly influence our emotions and actions. Here's an example. Suppose you went to a party and your host introduces you to Mike.