Emotion Words: Adding Face Value

Jennifer M. B. Fugate (University of Massachusetts Dartmouth), Maria Gendron (Northeastern University), Satoshi F. Nakashima (Kyoto University), and Lisa Feldman Barrett (Northeastern University and Massachusetts General Hospital, Boston, Massachusetts)

Online First Publication, June 12, 2017. http://dx.doi.org/10.1037/emo0000330

Citation: Fugate, J. M. B., Gendron, M., Nakashima, S. F., & Barrett, L. F. (2017, June 12). Emotion Words: Adding Face Value. Emotion. Advance online publication. http://dx.doi.org/10.1037/emo0000330

Emotion © 2017 American Psychological Association. ISSN 1528-3542.

Abstract: Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d′) to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word.
The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory.

Keywords: emotion perception, emotional faces, emotion words, semantic priming, signal detection theory

Jennifer M. B. Fugate, Department of Psychology, University of Massachusetts Dartmouth; Maria Gendron, Department of Psychology, Northeastern University; Satoshi F. Nakashima, Graduate School of Education, Kyoto University; Lisa Feldman Barrett, Department of Psychology, Northeastern University, and Psychiatric Neuroimaging Research Program and Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, Massachusetts. Satoshi F. Nakashima is now at the Department of Psychology, Hiroshima Shudo University.

Portions of these data were presented in poster format at the meetings of the Society for Personality and Social Psychology, the American Psychological Society, and the International Society for Research on Emotion, and in oral format at the meetings of the American Psychological Society and the Society for Experimental Social Psychology.

Experiments were supported by a National Research Service Award (NRSA; F32MH083455) from the National Institute of Mental Health to Jennifer M. B. Fugate and by NRSA 5F32MH105052 to Maria Gendron, as well as a National Institutes of Health Director's Pioneer Award (DP1OD003312) and National Science Foundation Grant BCS 0721260 to Lisa Feldman Barrett. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institute of Mental Health or the National Institutes of Health.

We thank (in alphabetical order) Eve Barnard, Katie Bessette, Dasha Cherkosav, Colleen Mahrer, and Elisabeth Secor for help with data collection and coding. We also thank Souhee Yang for help with additional data analysis. In addition, we thank Erika Siegel for helpful feedback on an earlier version of this article.

Correspondence concerning this article should be addressed to Jennifer M. B. Fugate, Department of Psychology, University of Massachusetts Dartmouth, Dartmouth, MA 0274. E-mail: jfugate@umassd.edu

Emotion perception happens with ease: humans can effortlessly look at another person's face and see happiness, sadness, anger, or some other emotion. Structural approaches to emotion perception suggest that the information in another person's face informs discrete emotion judgments due to inherent links between facial actions and emotions (e.g., Ekman, 1972, 1992; Ekman & Cordaro, 2011; Izard, 1971, 1994; Matsumoto, Keltner, Shiota, O'Sullivan, & Frank, 2008; Tomkins, 1962, 1963). According to these views, language is either independent of emotion perception (e.g., Brosch, Pourtois, & Sander, 2010; Sauter, LeGuen, & Haun, 2011) or a byproduct of the communication process (i.e., innate categories sediment out into language; Scherer, 2000). As a result, labels applied to faces should be the result of processing of the structural information of the face. An alternative constructionist account suggests that emotion labels are not applied as a result of structural processing, but rather contribute to emotion perception in a predictive manner by bringing online conceptual knowledge associated with an emotion category. Constructionist accounts are similar to structural approaches in that some structural features of facial actions inform perceptions. These approaches are distinct, however, in that perceivers routinely infer emotional states that are informed by other sources of information. Said another way, emotion perception is shaped by the internal context that exists in the mind of a perceiver or by additional external context in the environment (Barrett, 2006a, 2006b, 2011, 2012, 2013, 2014).

Language is a key part of the internal context which informs emotion perception. In our model of emotion perception, emotion words like "anger," "sadness," and "fear" name folk categories that divide up the continuous and highly variable measurable outcomes (e.g., facial muscle movements, peripheral physiology, behavior; Barrett, 2006a, 2006b, 2011, 2012, 2013, 2014). Although in some instances of an emotion, individuals might produce a canonical "expression" (i.e., widened eyes, mouth agape), what Russell (2003) referred to as "blue-ribbon" instances, there are many more instances in which this structural information is not present, or not sufficient in and of itself. The language-as-context hypothesis (Barrett, 2009; Barrett, Lindquist, & Gendron, 2007; Barrett, Mesquita, & Gendron, 2011; for a recent review, see Fugate & Barrett, 2014; Lindquist & Gendron, 2013) suggests that emotion words provide an internal context that helps to constrain the complex, multidimensional flow of information. As a result, bringing an emotion word online might change the nature of the structural information which is encoded from emotional faces. In the current experiments, we build on the language-as-context hypothesis to examine how emotion words change perceptual memory for emotional faces. Specifically, we investigate how emotion words change participants' likelihood to correctly select a target face from distractor faces. We also investigate how emotion words change perceptual memory biases when the time to encode the face is systematically varied and the emotion words presented are not fully congruent with the face.

Emotion Words and Emotion Judgments

The impact of emotion words on emotion judgments is demonstrated by several lines of research. When language is made less accessible in a laboratory task (via semantic satiation), participants' ability to judge whether two faces match in emotional content is no better than chance (Lindquist, Barrett, Bliss-Moreau, & Russell, 2006). In addition, patients with semantic dementia (a neurodegenerative disease defined by the loss of abstract words) do not categorize emotional faces into discrete categories. Instead, they sort faces into more basic, affective (positive and negative) piles (Lindquist, Gendron, Barrett, & Dickerson, 2014).

Perceptual memory is also impacted by the accessibility of emotion language. Participants remember morphed emotional faces as more "angry" when they are paired with the word "anger" than when paired with no word (Halberstadt, 2003, 2005; Halberstadt & Niedenthal, 2001). Subsequent research suggested that participants do not simply reconstruct the emotional faces at the time of recall, but rather they encode the faces in the presence of words as actually more intense (e.g., more angry). To distinguish perceptual encoding effects from postprocessing effects, participants' facial musculature was measured with EMG while encoding the morphed faces with differing category information (e.g., emotion words, personality adjectives, and ideographs from a foreign language). Consistent with the previous demonstrations, participants indicated a more intense exemplar (more angry) as the target during recall when encoded with a matching emotion word; moreover, the amount of facial muscle activity in the perceiver's face (assumed to be the product of simulation of the emotion word) during encoding predicted the amount of memory bias. That is, participants who showed more facial activity congruent with the stereotype of the emotion label during encoding indicated targets as containing more of that stereotyped emotion during recall, even when the category information was not paired with the face at