[Version for pre-print view only; revised in January 2021] Chen et al., 2021

Facial expressions dynamically decouple the transmission of emotion categories and intensity over time

Chaona Chen1, Daniel S. Messinger2, Cheng Chen3, Hongmei Yan4, Yaocong Duan1, Robin A. A. Ince5, Oliver G. B. Garrod5, Philippe G. Schyns1,5, & Rachael E. Jack1,5

1School of Psychology, University of Glasgow, Scotland, UK
2Department of Psychology, University of Miami, Florida, USA
3Foreign Language Department, Teaching Center for General Courses, Chengdu Medical College, Chengdu, China
4The MOE Key Lab for Neuroinformation, University of Electronic Science and Technology of China, Chengdu, China
5Institute of Neuroscience and Psychology, University of Glasgow, Scotland, UK

Abstract. Facial expressions dynamically transmit information-rich social messages. How they achieve this complex signalling task remains unknown. Here we identified, in two cultures – East Asian and Western European – the specific face movements that transmit two key signalling elements – emotion categories (e.g., ‘happy’) and intensity (e.g., ‘very intense’) – in basic and complex emotions. Using a data-driven approach and information-theoretic analyses, we identified in the six basic emotions (e.g., happy, fear, sad) the specific face movements that transmit the emotion category (classifiers), intensity (intensifiers), or both (classifier+intensifier) to each of 60 participants in each culture. We validated these results in a broader set of complex emotions (e.g., excited, shame). Cross-cultural comparisons revealed cultural similarities (e.g., eye whites as intensifiers) and differences (e.g., mouth gaping). Further, in both cultures, classifier and intensifier face movements are temporally distinct. Our results reveal that facial expressions transmit complex emotion messages by cascading information over time.

One Sentence Summary.
Facial expressions of emotion universally transmit multiplexed emotion information using specific face movements that signal emotion categories and intensity in a temporally structured manner.

Social communication is essential for the survival of most species because it provides important information about the internal states1 and behavioural intentions2 of others. Across the animal kingdom, social communication is often achieved using non-verbal signalling such as facial expressions3-6. For example, when smiling retracts the corners of the lips, this facial movement is often readily perceived as a sign of happiness or appeasement in humans, apes, and dogs7-9. Facial expressions can also convey additional important information such as emotional intensity – for example, from contentment to cheerful to delighted and ecstatic – each of which can also signal affiliation and social bonding or reward and joy10-12. Across human cultures, the intensity of expressed emotion can also lead to different social inferences – for example, in Western European cultures, broad smiling is often associated with positive traits such as competence and leadership. In contrast, in Eastern cultures such as Russia and China, where milder expressions are favoured, broad smiles are often associated with negative traits such as low intelligence13 or high dominance14. Therefore, facial expressions are a powerful tool for social communication because they can transmit information-rich social messages, such as emotion categories and their intensities, that inform and shape subsequent social perceptions and interactions15-20. However, how facial expressions achieve this complex signalling task remains unknown – that is, which specific components of facial expression signals transmit the key elements of a social message: its category and intensity.
Here, we address this question by studying the communicative functions and adaptive significance of human facial expressions of emotion from the perspective of theories of communication (see Fig. 1). These theories posit that signals are designed to serve several main purposes, two of which are particularly important for social communication. The first is ‘classifying,’ which enables the receiver to recognize a particular emotion category. For example, smiles are typically associated with states of happiness and positive affect. The second is ‘intensification,’ where specific modulations of a signal – such as variations in amplitude, size, duration, or repetition rate – enhance the signal’s salience, quickly draw the receiver’s attention, and communicate the magnitude of the social message. For example, larger, higher-amplitude signals are detectable from longer distances21, and signals with long durations or high repetition rates can easily draw the attention of otherwise distracted receivers22,23, thereby enabling them to focus on analysing the signal in more detail24-26, which may be particularly important in cases of threat. Although certain signals might serve to communicate either the emotion category or its intensity, some might play a dual role, particularly for emotions that require efficient signalling to elicit rapid responses from others, such as surprise, fear, disgust, or anger. We study these communicative functions in two distinct cultures – East Asian and Western European – each with known differences in perceiving facial expressions27,28, to derive a culturally informed understanding of facial expression communication29,30. Fig. 1 illustrates the logic of our hypothesis as a Venn diagram, where each colour represents a different communicative function.

Fig.
1 | Sending and receiving signals for social communication. To communicate a message to others, the sender encodes a message (e.g., “I am very happy,” coloured in blue) in a signal. Here, the signal is a facial expression composed of different face movements, called Action Units (AUs)31. The sender transmits this signal to the receiver across a communication channel. On receiving the signal, the receiver decodes a message from it (“he is very happy”) according to existing associations. A complex signal such as a facial expression could contain certain components – e.g., smiling, crinkled eyes, or a wide-open mouth – that transmit specific elements of the message, such as the emotion category (‘happy’) or its intensity (‘very’). We represent these different communicative functions using the Venn diagram: green represents the set of AUs that receivers use to classify the emotion category (‘Classify,’ e.g., ‘happy’), red represents those that receivers use to perceive emotional intensity (‘Intensify,’ e.g., ‘very’), and the orange intersection represents those that serve the dual role of classification and intensification (‘Classify & Intensify,’ e.g., ‘very happy’).

The empirical question we address is to identify, in each culture, the facial signals – here, individual face movements called Action Units31 (AUs) and their dynamic characteristics, such as amplitude and temporal signalling order – that serve each type of communicative function (see Fig. 2 for the methodological approach).
We find that, in each culture, individual face movements such as smiling, eye widening, or scowling each serve a specific communicative function of transmitting emotion category and/or intensity information. Cross-cultural comparisons showed that certain face movements serve a similar communicative function across cultures – for example, Upper Lid Raiser (AU5) serves primarily as an emotion classifier with occasional use as an intensifier – while others serve different functions across cultures – for example, Mouth Stretch (AU27) primarily serves as an emotion classifier for East Asian participants and an intensifier for Western participants (see Fig. 3). An analysis of the temporal ordering of classifier and intensifier face movements shows that, in each culture, they are temporally distinct, with intensifier face movements peaking earlier or later than classifiers. Together, our results reveal for the first time how facial expressions, as a complex dynamical signalling system, transmit multi-layered emotion messages. Our results therefore provide new insights into the longstanding goal of deciphering the language of human facial expressions3,4,32-35.

Results

Identifying face movements that communicate emotion categories and intensity. To identify the specific face movements that serve each communicative function – emotion classifier, intensifier, or dual classifier and intensifier – we used a data-driven approach that agnostically generates face movements and tests them against subjective human cultural perception36. We then measured the statistical relationship between the dynamic face movements – i.e., Action Units (AUs) – presented on each trial and the participants’ responses using an information-theoretic analysis37. Fig. 2 operationalizes our hypothesis and illustrates our methodological approach with the six classic emotions – happy, surprise, fear, disgust, anger, and sad.
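To make the logic of such an information-theoretic analysis concrete, the sketch below shows a minimal, hypothetical illustration (not the authors' actual analysis pipeline or data) of computing the mutual information, in bits, between the binary presence of a single AU on each trial and the emotion category a participant responded with. The AU labels, trial data, and function name are illustrative assumptions only; a classifier AU would show high mutual information with category responses, whereas an AU unrelated to the response would show mutual information near zero.

```python
from collections import Counter
from math import log2

def mutual_information(au_present, responses):
    """Estimate the mutual information (bits) between a binary AU
    presence variable and categorical emotion responses, from the
    empirical joint distribution over trials."""
    n = len(au_present)
    joint = Counter(zip(au_present, responses))  # joint counts p(x, y)
    p_x = Counter(au_present)                    # marginal counts p(x)
    p_y = Counter(responses)                     # marginal counts p(y)
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        mi += p_xy * log2(p_xy / ((p_x[x] / n) * (p_y[y] / n)))
    return mi

# Hypothetical trials: whether AU12 (Lip Corner Puller) was shown,
# and the emotion category each receiver perceived on that trial.
au12 = [1, 1, 1, 1, 0, 0, 0, 0]
resp = ['happy', 'happy', 'happy', 'sad', 'sad', 'sad', 'sad', 'happy']
print(round(mutual_information(au12, resp), 3))  # → 0.189
```

In practice such estimates would be computed per AU and per cultural group, and assessed for statistical significance, but the quantity being measured is the same: how much uncertainty about the response is reduced by knowing which face movements were presented.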