Reading Emotion from Mouse Cursor Motions: Affective Computing Approach
Cognitive Science (2017) 1–49
Copyright © 2017 Cognitive Science Society, Inc. All rights reserved.
ISSN: 0364-0213 print / 1551-6709 online
DOI: 10.1111/cogs.12557

Reading Emotion From Mouse Cursor Motions: Affective Computing Approach

Takashi Yamauchi, Kunchen Xiao
Department of Psychological and Brain Science, Texas A&M University

Received 1 March 2016; received in revised form 5 July 2017; accepted 22 August 2017

Abstract

Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16–26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Participants were induced into positive or negative emotional states by music, film clips, or emotional pictures, and they indicated their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%–20% of the variance of the positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as area under the curve and direction change help infer the emotions of computer users.

Keywords: Mouse cursor motion analysis; Affective computing; Choice reaching trajectory; Emotion and motor control

Correspondence should be sent to Takashi Yamauchi, Department of Psychology and Brain Science, Texas A&M University, College Station, TX 77843. E-mail: [email protected]

1. Introduction

As the face conveys information about a person's emotions, do movements of the computer cursor also convey her emotions? Affective computing research—an interdisciplinary research arena for the design of systems that can recognize, interpret, and simulate human emotions (http://www.acii2015.org/)—has advanced emotion recognition technologies using facial expressions, voices, gaits, and physiological signals (Calvo, D'Mello, Gratch, & Kappas, 2015); yet these methods are costly and cumbersome (e.g., wearing head gear for an electroencephalography [EEG] measure, but see Yamauchi, Xiao, Bowman, & Mueen, 2015), making them difficult for everyday applications (Calvo & D'Mello, 2010). By integrating the mouse cursor motion analysis method developed by Spivey, Dale, Freeman, and others into affective computing (Dale, Kehoe, & Spivey, 2007; Freeman, Ambady, Midgley, & Holcomb, 2011; Spivey, Grosjean, & Knoblich, 2005; Xiao & Yamauchi, 2014, 2015; Yamauchi, Kohn, & Yu, 2007), this article reports findings from four experiments that suggest that movements of the computer cursor can provide information about the emotions of the user.
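To make this idea concrete, the sketch below illustrates how trajectory features of the kind named in the abstract, such as traveled distance, direction changes, and the area under the curve, might be computed from the x-y coordinates of a recorded cursor movement. This is a minimal illustration in Python, not the authors' implementation; the exact operationalizations are assumptions.

import numpy as np

def trajectory_features(x, y):
    """Simple choice-reaching trajectory features from cursor samples.

    x, y : 1-D arrays of cursor positions ordered in time; the trajectory
    is assumed to start at (x[0], y[0]) and end at (x[-1], y[-1]),
    with distinct start and end points.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)

    # Traveled distance: sum of Euclidean distances between successive samples.
    distance = np.hypot(np.diff(x), np.diff(y)).sum()

    # Direction changes: how often the cursor reverses its horizontal movement
    # (one common operationalization of "x-flips").
    dx = np.diff(x)
    x_flips = int(np.sum(np.diff(np.sign(dx[dx != 0])) != 0))

    # Area under the curve: area between the observed trajectory and the
    # straight line connecting the start and end points.
    start, end = np.array([x[0], y[0]]), np.array([x[-1], y[-1]])
    line = end - start
    line_len = np.hypot(line[0], line[1])
    # Signed perpendicular deviation of each sample from the start-end line,
    # integrated over progress along that line.
    deviation = ((x - start[0]) * line[1] - (y - start[1]) * line[0]) / line_len
    progress = ((x - start[0]) * line[0] + (y - start[1]) * line[1]) / line_len
    auc = np.trapz(np.abs(deviation), progress)

    return {"distance": distance, "x_flips": x_flips, "auc": auc}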
1.1. Theoretical background

Choice-reaching—reaching a target object by hand or with a relevant tool (the computer pointer in this study)—is a dynamic decision-making process. It involves continuous assimilation of the desired location of the hand, relevant motor commands, and feedback signaling the discrepancy between the actual and the desired states (Song & Nakayama, 2009; Spivey, 2007; Wolpert & Ghahramani, 2000). In this process, higher cortical systems make a coarse action plan, and sensorimotor subsystems coordinate the hand movement by processing contextual information in real time through three internal models: the forward model, the inverse model, and the forward feedback model (Glover, 2004; Thelen, 1995; Wolpert & Ghahramani, 2000; Wolpert, Ghahramani, & Jordan, 1995).

Our hypothesis is that emotion influences this coordination process and that the interaction between emotion and hand motion can be analyzed in the trajectories of the computer cursor in a choice-reaching task.

Research in embodied cognition suggests that people's cognitive, attitudinal, and affective states are expressed in their bodily actions, which in turn invoke affective states (Barsalou, 1999; Barsalou, Niedenthal, Barbey, & Ruppert, 2003). Barsalou's (1999) perceptual symbol systems hypothesis states that off-line cognition arises from a reenactment of sensory and perceptual modules.

Neurological motor disorders, such as Parkinson's disease, support the idea that human emotion can be reflected in cursor motions in a choice-reaching task. The basal ganglia, which play a pivotal role in voluntary goal-directed motor control, also aid cognitive and affective coordination through topographically organized neural circuits connecting the cortex and the thalamus (i.e., cortico-basal ganglia-thalamocortical circuits; Wichmann & DeLong, 2013). These circuits encompass motor-related areas (e.g., primary motor, supplementary motor, and primary somatosensory cortices) and the regions that control emotion, motivation, and decision making (e.g., orbital and mesial frontal cortices, the anterior cingulate gyrus, the hypothalamus, and the basolateral amygdala) (Mendoza & Foundas, 2008; Mink, 2008; Wichmann & DeLong, 2013). These circuits are interactive partly due to the involvement of the dopamine system, which influences affective, motor, and cognitive activities by modulating neurons in the basal ganglia, the limbic system, and the cerebral cortex (Björklund & Dunnett, 2007; Mendoza & Foundas, 2008).

As a result, an imbalance in dopamine in these areas causes many behavioral aberrations of learning and memory, cognitive and motivational processing, and decision making. Neurological movement disorders such as Parkinson's disease and Tourette syndrome are known to be linked to an imbalance in dopamine in the basal ganglia; these motor disorders are often accompanied by cognitive and emotional disruptions such as depression, apathy, and obsessive-compulsive disorder (OCD) (Mink, 2008; Rabey, 2007; Weintraub & Stern, 2007). More than 40% of people suffering from Tourette syndrome experience symptoms of OCD (Mink, 2008); nearly 60% of patients with Parkinson's disease suffer from depression, and about 40% of patients have apathy—"a decrease in goal-directed behavior, thinking, and mood" (Weintraub & Stern, 2007). Interestingly, depression often precedes the motor symptoms of Parkinson's disease (Rabey, 2007).

Given that a considerable amount of integration of emotion, motion, and cognitive control takes place in the neural circuits that link the cortex, the basal ganglia, and the thalamus (Wichmann & DeLong, 2013), it is likely that subtle disruptions in emotion can be reflected in cursor motions in a choice-reaching task.
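As a concrete illustration of how such a link could be tested, the sketch below mirrors the participant-wise 10-fold cross-validation summarized in the abstract: a statistical model is fit on the trajectory features of "known" participants and then used to predict the affect ratings of held-out "unknown" participants, whose explained variance is examined. The use of scikit-learn, ridge regression, and synthetic data here is an assumption for illustration, not the authors' analysis pipeline.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(0)
n_participants, n_features = 100, 20                   # e.g., a 16-26 dimensional feature vector
X = rng.normal(size=(n_participants, n_features))      # stand-in trajectory features
y = 0.5 * X[:, 0] + rng.normal(size=n_participants)    # stand-in affect ratings
groups = np.arange(n_participants)                     # participant identifiers

r2_per_fold = []
for train_idx, test_idx in GroupKFold(n_splits=10).split(X, y, groups):
    model = Ridge(alpha=1.0).fit(X[train_idx], y[train_idx])  # fit on "known" participants
    predictions = model.predict(X[test_idx])                  # predict "unknown" participants
    r2_per_fold.append(r2_score(y[test_idx], predictions))

print(f"mean out-of-sample R^2: {np.mean(r2_per_fold):.3f}")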
1.2. Mouse cursor motion analysis in affective computing and cognitive science

1.2.1. Affective computing

In human–computer interaction research, mouse cursor motion analysis originated in the late 1970s, when researchers started to compare the performance of different input devices (Accot & Zhai, 1997, 1999; Card, English, & Burr, 1978; MacKenzie, Kauppinen, & Silfverberg, 2001). In the last 15 years, researchers have studied activities of the computer cursor for emotion recognition. Zimmermann et al. (Zimmermann, Guttormsen, Danuser, & Gomez, 2003) used a film-based emotion elicitation technique and investigated the impact of arousal and valence on cursor motion in an online shopping task (N = 76). The study showed that the total duration of cursor movement and the number of velocity changes were related to arousal. However, no evidence linking valence (e.g., positive and negative affect) to cursor activities was found. Kapoor et al. (Kapoor, Burleson, & Picard, 2007) integrated a pressure-sensitive mouse into their multichannel automatic affect detection system. The researchers measured the mean, variance, and skewness of mouse pressure while subjects (middle school students, N = 24) learned to solve a Tower of Hanoi puzzle. Mouse pressure was as discriminative as skin conductance measures for the detection of frustration.

Azcarraga and Suarez (2012) evaluated EEG signals and mouse activities (the number of mouse clicks, distance traveled, click duration) during algebra learning with an intelligent tutoring system (N = 25). Emotion prediction rates based solely on EEG were 54%–88%. When mouse activity data were added to the EEG data, accuracy rates increased to as high as 92%, indicating that mouse activity can supply useful information for emotion recognition on top of EEG data. Grimes et al. (Grimes, Jenkins, & Valacich, 2013) used pictures for emotion elicitation and measured mouse cursor motion patterns such as traveled distance, speed, and direction change while participants indicated their valence and arousal with the Self-Assessment Manikin (SAM) (Bradley & Lang, 1994). They found that high and low arousal, as well as positive and negative valence, influenced cursor movements such as traveled distance and direction change. Hibbeln et al. (Hibbeln, Jenkins, Schneider, Valacich, & Weinmann, 2017) and Sun et al. (Sun, Paredes, & Canny, 2014) employed task-based emotion elicitation and showed that users' stress and frustration felt in a simple interface manipulation task (e.g., pointing and dragging icons) could be reflected in cursor activities such as traveled distance and direction change (e.g., controlling oscillation of the cursor). Beyond these studies, clear evidence that links cursor activities and emotions is