Classification of Human Emotions from Electroencephalogram (EEG) Signal Using Deep Neural Network
Abeer Al-Nafjan
College of Computer and Information Sciences
Imam Muhammad bin Saud University
Riyadh, Saudi Arabia

Areej Al-Wabil
Center for Complex Engineering Systems
King Abdulaziz City for Science and Technology
Riyadh, Saudi Arabia

Manar Hosny
College of Computer and Information Sciences
King Saud University
Riyadh, Saudi Arabia

Yousef Al-Ohali
College of Computer and Information Sciences
King Saud University
Riyadh, Saudi Arabia

Abstract—Estimation of human emotions from Electroencephalogram (EEG) signals plays a vital role in developing robust Brain-Computer Interface (BCI) systems. In our research, we used a Deep Neural Network (DNN) to address EEG-based emotion recognition. This was motivated by the recent advances in accuracy and efficiency from applying deep learning techniques in pattern recognition and classification applications. We adapted a DNN to identify the human emotions of a given EEG signal (DEAP dataset) from power spectral density (PSD) and frontal asymmetry features. The proposed approach is compared to state-of-the-art emotion detection systems on the same dataset. Results show how EEG-based emotion recognition can greatly benefit from using DNNs, especially when a large amount of training data is available.

Keywords—Electroencephalogram (EEG); Brain-Computer Interface (BCI); emotion recognition; affective state; Deep Neural Network (DNN); DEAP dataset

I. INTRODUCTION

Recent developments in BCI (Brain-Computer Interface) technologies have facilitated emotion detection and classification. Many BCI studies have investigated, detected and recognized the user's affective state and applied the findings in varied contexts including, among other things, communication, education, entertainment, and medical applications. BCI researchers are considering different responses in various frequency bands, ways of eliciting emotions, and various models of affective states. Different techniques and approaches have been used in the processing steps to estimate the emotional state from the acquired signals.

BCI systems based on emotion detection are considered a passive/involuntary control modality. For example, affective computing focuses on developing applications that automatically adapt to changes in the user's state, thereby improving interaction and leading to more natural and effective usability (e.g. games adjusting to the interest of the user) [1]. Recognizing a user's affective state can also be used to optimize training and enhancement of BCI operations [2].

EEG is often used in BCI research experimentation because the process is non-invasive to the research subject and minimal risk is involved. The devices' usability, reliability, cost-effectiveness, and the relative convenience of conducting studies and recruiting participants due to their portability have been cited as factors influencing the increased adoption of this method in applied research contexts [3], [4]. These advantages are often accompanied by challenges such as low spatial resolution and difficulty managing signal-to-noise ratios. Power-line noise and artifacts caused by muscle or eye movement may be permissible or even exploited as a control signal for certain EEG-based BCI applications. Therefore, signal-processing techniques can be used to eliminate or reduce such artifacts (a minimal filtering sketch is shown at the end of this section).

This paper explains our research, which involves implementing and examining the performance of a Deep Neural Network (DNN) to model a benchmark emotion dataset for classification.

The remainder of this paper is organized as follows: In Section 2, we give background on emotion models; in Section 3, we briefly review EEG-based emotion detection systems; in Section 4, we describe our proposed method and techniques; in Section 5, we discuss the results; and in Section 6, we draw conclusions.
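As a concrete illustration of the kind of signal-processing step mentioned above, the sketch below suppresses 50 Hz power-line noise with a notch filter and keeps the 4-45 Hz band of a single EEG channel using SciPy. The sampling rate, line frequency, band edges, and filter order are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch


def clean_eeg_channel(signal, fs=128.0, line_freq=50.0, band=(4.0, 45.0)):
    """Suppress power-line noise and keep the 4-45 Hz band of one EEG channel.

    fs, line_freq, the band edges and the filter order are illustrative
    assumptions, not values taken from the paper.
    """
    # Notch filter centred on the power-line frequency.
    b_notch, a_notch = iirnotch(w0=line_freq, Q=30.0, fs=fs)
    signal = filtfilt(b_notch, a_notch, signal)

    # Zero-phase Butterworth band-pass filter.
    b_band, a_band = butter(N=4, Wn=band, btype="bandpass", fs=fs)
    return filtfilt(b_band, a_band, signal)


if __name__ == "__main__":
    fs = 128.0
    t = np.arange(0, 10, 1 / fs)
    # Synthetic 10 Hz alpha-like rhythm contaminated with 50 Hz line noise.
    raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
    cleaned = clean_eeg_channel(raw, fs=fs)
    print(cleaned.shape)
```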
II. EMOTION MODEL

Emotions can generally be classified on the basis of two models: discrete and dimensional [2], [5]. The dimensional model of emotion proposes that emotional states can be accurately represented by a small number of underlying affective dimensions. It represents the continuous emotional state as a vector in a multidimensional space. Most dimensional models incorporate valence and arousal. Valence refers to the degree of 'pleasantness' associated with an emotion. It ranges from unpleasant (e.g. sad, stressed) to pleasant (e.g. happy, elated). Arousal, in contrast, refers to the strength of the experienced emotion. It occurs along a continuum and may range from inactive (e.g. uninterested, bored) to active (e.g. alert, excited) [5].

Discrete theories of emotion propose the existence of a small number of separate emotions. These are characterized by coordinated response patterns in physiology, neural anatomy, and morphological expressions. The six basic emotions frequently advanced in research papers include happiness, sadness, anger, disgust, fear, and surprise [6], [7].

Emotion measurement and assessment methods can rely on subjective and/or objective affective measures. Subjective measures use different self-report instruments, such as questionnaires, adjective checklists, and pictorial tools, to represent any set of emotions, and can be used to measure mixed emotions. Self-reporting techniques, however, do not provide objective measurements; they measure only conscious emotions and cannot capture the real-time dynamics of the experience.

Objective measures can be obtained without the user's assistance. They use physiological cues derived from the physiology theories of emotion. Instruments that measure blood pressure responses, skin responses, pupillary responses, brain waves, and heart responses are all used as objective measurement methods. Moreover, hybrid methods combining both subjective and objective measures have been used to increase the accuracy and reliability of assessing affective emotional states [2], [6].
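To make the dimensional model concrete, the short sketch below maps valence and arousal ratings on the 1-9 self-assessment scale used by DEAP onto the four quadrants of the valence-arousal plane. The midpoint threshold of 5 and the quadrant labels are illustrative assumptions, not the labelling scheme used in the paper.

```python
def quadrant(valence, arousal, threshold=5.0):
    """Map a (valence, arousal) rating pair to a valence-arousal quadrant.

    Assumes ratings on a 1-9 scale (as in DEAP self-assessment) and a
    midpoint threshold of 5; both are illustrative choices.
    """
    high_v = valence >= threshold
    high_a = arousal >= threshold
    if high_v and high_a:
        return "high valence / high arousal (e.g. excited)"
    if high_v and not high_a:
        return "high valence / low arousal (e.g. calm, content)"
    if not high_v and high_a:
        return "low valence / high arousal (e.g. stressed)"
    return "low valence / low arousal (e.g. bored)"


print(quadrant(7.5, 8.0))   # high valence / high arousal (e.g. excited)
print(quadrant(3.0, 2.5))   # low valence / low arousal (e.g. bored)
```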
III. EEG-BASED EMOTION DETECTION SYSTEMS

The volume of studies and publications on EEG-based emotion recognition has surged in recent years. Different models and techniques yield a wide range of systems. However, these systems can be differentiated by their stimuli, detection features, temporal windows, classifiers, numbers of participants, and emotion models.

Although the number of research studies conducted on EEG-based emotion recognition in recent years has been increasing, EEG-based emotion recognition is still a relatively new area of research, and the effectiveness and efficiency of these algorithms are somewhat limited. Some unsolved issues in current algorithms and approaches include the following: time constraints, accuracy, number of electrodes, number of recognized emotions, and benchmark EEG affective databases [6], [8]. Accuracy and reliability of sensory

IV. PROPOSED SYSTEM

The performance of EEG recognition systems depends on the feature extraction algorithm and the classification process. Hence, the aim of our study is to investigate the possibility of using EEG for the detection of four affective states, namely excitement, meditation, boredom, and frustration, using classification and pattern recognition techniques. For this purpose, we conducted rigorous offline analysis to investigate computational intelligence for emotion detection and classification. We used deep learning classification on the DEAP dataset to explore how to employ intelligent computational methods in the form of a classification algorithm that could effectively mirror the affective states of subjects. We also compared our classification performance with a Random Forest classifier.

We built our system in Python, an open-source programming language, and used the Scikit-Learn toolbox for machine learning, along with SciPy for EEG filtering and preprocessing, MNE for EEG-specific signal processing, and the Keras library for deep learning.

In this section, we illustrate our methodology along with some implementation details of our proposed system. We start by describing the benchmark dataset, then describe our extracted features. Finally, we discuss the classification process and model evaluation method.

A. DEAP Dataset

DEAP is a benchmark affective EEG database for the analysis of spontaneous emotions. The DEAP database was prepared by Queen Mary University of London and published in [9]. The database contains physiological signals of 32 participants. It was created with the goal of building an adaptive music video recommendation system based on the user's current emotion. DEAP has been used in a number of studies and has proved well-suited for testing new algorithms [9], [10].

To evaluate our proposed classification method, we used the preprocessed EEG dataset from the DEAP database, in which the originally recorded data (512 Hz) were down-sampled to a sampling rate of 128 Hz, a bandpass frequency filter from 4.0-45.0 Hz was applied, and the EOG artifacts were removed from the signals.

B. Feature Extraction

Feature extraction plays a critical role in designing an EEG-based BCI system. Different features have been used in the literature, including Common Spatial Patterns, Higher Order Crossings, Hjorth parameters, time-domain statistics, EEG
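The excerpt is cut off at this point, but to illustrate the two feature types named in the Abstract, power spectral density (PSD) and frontal asymmetry, the sketch below computes band-wise PSD features with Welch's method and a frontal asymmetry score as the difference in log alpha power between a symmetric frontal electrode pair. The F3/F4 pair, the band boundaries, and the window length are common choices assumed here for illustration; the paper's exact configuration is not given in this excerpt.

```python
import numpy as np
from scipy.signal import welch

# Illustrative frequency bands (Hz); the paper's exact bands are not
# specified in this excerpt.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}


def band_powers(signal, fs=128.0):
    """Mean Welch PSD per band for one EEG channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}


def frontal_asymmetry(left_channel, right_channel, fs=128.0, band="alpha"):
    """Asymmetry as log power (right) minus log power (left), e.g. F4 vs. F3.

    The electrode pair and the use of the alpha band are common choices
    assumed here for illustration.
    """
    p_left = band_powers(left_channel, fs)[band]
    p_right = band_powers(right_channel, fs)[band]
    return np.log(p_right) - np.log(p_left)


if __name__ == "__main__":
    fs = 128.0
    rng = np.random.default_rng(0)
    f3 = rng.standard_normal(int(60 * fs))   # one minute of synthetic "F3" data
    f4 = rng.standard_normal(int(60 * fs))   # one minute of synthetic "F4" data
    print(band_powers(f3, fs))
    print(frontal_asymmetry(f3, f4, fs))
```

In a full pipeline, these per-channel band powers and asymmetry scores would be concatenated into one feature vector per trial before being passed to the classifier.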