
The Impact of Emotional Expression on the Association between Sensorimotor Simulation and Empathic Accuracy

Master’s Thesis

Presented to

The Faculty of the Graduate School of Arts and Sciences
Brandeis University
Department of Psychology
Jennifer Gutsell, Advisor

In Partial Fulfillment of the Requirements for the Degree

Master of Arts in Psychology

by Tong Lin

August 2019

Copyright by

Tong Lin

© 2019

ABSTRACT

The Impact of Emotional Expression on the Association between Sensorimotor Simulation and Empathic Accuracy

A thesis presented to the Department of Psychology

Graduate School of Arts and Sciences, Brandeis University, Waltham, Massachusetts

By Tong Lin

Understanding other people's emotions is one of the most important things people do in daily life. Multiple systems are involved in this process, including sensorimotor simulation, the vicarious activation of sensorimotor networks in the perceiver (Wood, Rychlowska, & Niedenthal, 2016).

Sensorimotor simulation has been studied in simple tasks such as reading a narrative or looking at pictures, but it is unclear how this simulation-related brain activity might play out in daily interactions, which are far more complex than the experience of research participants in traditional social neuroscience studies that use non-naturalistic stimuli; nor do we know its impact on empathic accuracy. Therefore, in this study we explored people's ability to understand another's emotions by combining a face-to-face interaction with a reliable index of sensorimotor simulation: mu suppression, the suppression of an 8–13 Hz band of the electroencephalogram (EEG) recorded over the motor cortex (Ulloa & Pineda, 2007). In the experiment, we recorded EEG as participants took turns sharing one positive and one negative experience with their partner while the partner listened passively. They later watched video recordings of the shared experiences and rated the emotions expressed by themselves and their partner. Correlations between self- and partner-ratings served as an estimate of empathic accuracy, the extent to which they understood each other's emotions. Following the experience sharing task, participants watched their partner repeatedly squeezing a ball, and EEG mu suppression during the observation of the partner's ball squeezing served as a measure of sensorimotor simulation. We hypothesized that stronger mu suppression would be correlated with higher empathic accuracy and that this association would be moderated by the emotional expression displayed in the conversations, such that more emotional expression would be associated with a stronger correlation between mu suppression and empathic accuracy. We did not find a correlation between mu suppression and empathic accuracy, but we found evidence supporting a positive correlation between emotional expressions and empathic accuracy.


List of Tables

Table 1. Inter-rater Reliability Table
Table 2. Correlation Table for Mu, Emotional Expressions and Empathic Accuracy
Table 3. Model Comparisons Table
Table 4. Mu Suppression at Different Sites


List of Figures

Figure 1. The Experiment Procedure
Figure 2. The Scatterplot of the Continuous Variables
Figure 3. The Relationship between Emotional Expressions and Empathic Accuracy
Figure 4. The Relationship between Mu and Empathic Accuracy
Figure 5. Topoplot of the Ball Squeezing Task
Figure 6. Alpha Wave Comparisons between the Central Electrodes and Occipital Electrodes


The Impact of Emotional Expression on the Association between Sensorimotor Simulation and Empathic Accuracy

Empathy is an important ability that allows us to perceive and respond to the emotional states of others (Decety, 2016). It plays a crucial role in interpersonal interaction, as illustrated by Barack Obama, who told Northwestern University graduates at the 2006 commencement that people should try to "put [themselves] in someone else's shoes" and "to see the world through those who are different from [them]" (Obama, 2016). Furthermore, impairments in empathy can result in severe problems in social functioning, such as those seen in autism spectrum disorders and psychopathy (Zaki, Weber, Bolger, & Ochsner, 2009). However, the exact mechanisms that underlie people's ability to accurately empathize are still under debate: studies have linked facial mimicry, the tendency to imitate others' emotional facial expressions (Achaibou, Pourtois, Schwartz, & Vuilleumier, 2008), to empathic accuracy, which is defined as the observer's ability to correctly infer the target's emotion (Hess & Blairy,

2001). Other studies have linked sensorimotor simulation, a mechanism that activates similar neural networks in perceivers as they would if they were to perform the same action or experience the same emotions themselves (Zaki et al., 2012), and empathic accuracy (Ong, Zaki,

Perry, & Ong, 2018, preprint), but to our knowledge, no study has looked at how emotional expression and sensorimotor simulation might interact to influence empathic accuracy. Therefore, the goal of the current study is to examine neural and behavioral indicators of the simulation of another's emotional states and to relate them to empathic accuracy.


Experience Sharing, its Precursors and Consequences

Characteristics of empathy can be divided into three mental categories: (1) experience sharing: “vicariously sharing targets’ internal states”, (2) mentalizing: understanding the targets’ mental states, and (3) prosocial concern: “expressing motivation to improve targets’ experiences”

(Zaki & Ochsner, 2012). The category of focus in this study is experience sharing, which is often tied to sensorimotor simulation (Zaki et al., 2012). Previous research has identified factors that contribute to the degree to which individuals share the experience of others: target-related factors include similarity (Gutsell & Inzlicht, 2012), familiarity (Khalil, 2002), past experience (Stinson & Ickes, 1992), learning (Preston & De Waal, 2002), and emotional expressivity (Zaki, Bolger, & Ochsner, 2009); situational factors include material costs (Zaki,

2014); perceiver related factors include personality traits (Avenanti, Minio-Paluello, Bufalari, &

Aglioti, 2009) and the socioeconomic status of the perceivers (Kraus, Côté, & Keltner, 2010).

Yet no study has examined the effect of the perceivers’ emotional expressivity on their ability to share the experiences of others.

Sensorimotor simulation is one of the brain mechanisms involved in experience sharing

(Ong et al., 2018, under review). The association between sensorimotor simulation and different components in empathy has been explored. For example, sensorimotor simulation is associated with perspective-taking (Woodruff, Martin, & Bilyk, 2011), an aspect of trait empathy that allows individuals to understand situations from another point of view (Galinsky, Maddux, Gilin,

& White, 2008), and plays an important role in the social learning of reactions to pain (Avenanti,

Bueti, Galati, & Aglioti, 2005). Experience sharing can influence social understanding through its interplay with mentalizing (Kanske, Böckler, Trautwein, Parianen Lesemann, & Singer,

2016). For instance, research using a measure of empathic ability and propensity that separates


empathy and Theory of Mind showed that experience sharing could inhibit mentalizing in emotionally intense situations (Kanske et al., 2016). The neurological investigation of sensorimotor simulation can be traced back to the discovery of mirror neurons in macaque monkeys. In an experiment that examined a macaque monkey's premotor area, mirror neurons responded both when the monkey performed a particular action and when it observed the same action (Gallese & Goldman, 1998). Later studies also found sensorimotor simulation in response to emotional faces (Moore, Gorodnitsky, & Pineda, 2012) and tasks that require empathy (Pineda

& Hecht, 2009).

Most of the existing research on the relationship between experience sharing and empathy has focused on trait empathy or state empathy, but it is also important to look at empathic accuracy because it is the aspect of empathy that includes both the state of the empathic perceiver and the state of the target, and examines the extent to which people accurately empathize with the target at the moment (Zaki et al., 2008). Moreover, experience sharing is usually examined through self-report (such as Atkins, Uskul, & Cooper, 2016; Ali, Amorim, &

Chamorro-Premuzic, 2009; Clark, Winkielman, & McIntosh, 2008), or assessed on the neural level through vicarious neural activation in relevant sensorimotor, pain, or emotional areas (such as Fabi, & Leuthold, 2018; Cheng, Chen, & Decety, 2014; Oberman, Hubbard, McCleery,

Altschuler, Ramachandran, & Pineda, 2005). The targets of empathy in these studies are images or videos of facial expressions, hands in painful situations, or similarly artificial stimuli, rather than actual interaction partners, which compromises mundane realism. Using similar methods, a previous study found supporting evidence for a positive association between experience sharing, indexed through EEG mu suppression (an 8–13 Hz band of the electroencephalogram over the motor cortex; Ulloa & Pineda, 2007), and empathic accuracy, in a paradigm where


participants watched videos of other people sharing their stories and rated the emotions of the targets (Ong et al., 2018, preprint). However, given that the social situation in that study was somewhat artificial, it is still unclear how experience sharing might affect empathic accuracy in a more realistic environment in which the perceiver also interacts and communicates with the target.

Finally, on a behavioral level experience sharing sometimes is revealed in vicarious emotional facial expressions and has been investigated using facial electromyography (EMG).

For instance, facial EMG activity showed that people with high trait empathy are more reactive to emotional faces in comparison with individuals with low trait empathy (Dimberg, Andréasson,

& Thunberg, 2011). The high-empathy group showed stronger corrugator muscle activity, indicative of frowning and other negative emotion expressions, in response to angry faces, and stronger zygomatic muscle activity, indicative of smiling and positive emotion expressions, in response to happy faces (Dimberg et al., 2011). Behavioral methods such as facial expression coding have not been used in empathic accuracy studies, but they could give valuable insights for two reasons: First, neural activation can only be a proxy, and we do not yet know to what extent it really translates into the experience of emotion sharing. Second, self-report is subjective, and people are not always able to accurately report their emotions (Nisbett & Wilson, 1977). Furthermore, the most naturalistic study of empathic accuracy in the existing literature used minutes-long video clips to measure participants' empathic responses, with no actual interaction. Here we took a novel approach to experience sharing and examined it during an in vivo interaction to fill this gap in the literature. People might behave more naturally and express the emotions they feel more readily in a real interaction than when they simply sit in front of a video.


Emotional Expressions during Interaction

Emotional expressions when measured during social interactions might result from several distinct social processes and motivations, including mimicry. Mimicry potentially allows us to share other individuals’ emotions because subjective emotional experience is affected by the feedback from facial mimicry (Hatfield, Cacioppo, & Rapson, 1993). Specifically, the embodied theory postulates the facial feedback pathway: it suggests that perceiving emotions involves somatovisceral and motoric reexperiencing of the relevant emotion in oneself

(Niedenthal, 2007), implying that people decode another person’s expression by simulating it in their own facial musculature (Neal, & Chartrand, 2011; Dimberg et al., 2011). The hypothesis that empathy is mediated by mimicry of another person’s behavior was first proposed by Lipps

(1907), although this initial model was not restricted to mimicry of facial expression (Blairy,

Herrera, & Hess, 1999). Specifically, he proposed three steps in this model as the process of understanding others: first, the interaction with another individual leads to imitation by the observer; second, the observer's mimicry induces a feedback process; third, the observer employs his or her internal state to understand the observed individual (Blairy et al., 1999). This theory of mimicry is supported by multiple studies of social interactions that found behavioral and physiological evidence of synchronization between people, including mutual eye-gaze, similar skin conductance responses, contagious yawning, and similar body postures (Chartrand & Lakin, 2013; Prochazkova & Kret, 2017; Kret, 2015; Kret, Fischer, &

De Dreu, 2015). The second step, in particular, is outlined in the facial feedback hypothesis

(Blairy et al., 1999). It states that the observer’s mimicry induces a feedback process, leading the observer to experience the emotion they were led to express and this process is supported by numerous studies (Helt, Eigsti, Snyder, & Fein, 2010; Tia, Saimpont, Paizis, Mourey, Fadiga, &


Pozzo, 2011). Indeed, when researchers manipulated participants’ facial muscles, individuals instructed to move their facial muscles to frown were less happy than those in a “smile” condition (Hatfield et al., 1993). Moreover, people display positive and negative emotional reactions unconsciously on their faces (Dimberg, Thunberg, & Elmehed, 2000). For instance, when participants were presented with face pictures of different emotions, they reacted with facial electromyographic (EMG) reactions that corresponded to the happy and angry faces

(Dimberg et al., 2000). Interfering with the imitation process also interfered with the ability to recognize emotional expressions, at least for highly empathic participants (Jospe, Flöel, & Lavidor, 2018). Finally, in another study that tested the facial feedback hypothesis, the researchers found evidence suggesting that simulated tearing up would cause sadness (Mori &

Mori, 2007).

Despite the evidence for Lipps' theory about the role of emotion expression in empathy, as well as the facial feedback hypothesis, there are critical accounts against it (e.g., Jacob, 2008; Decety, 2010). For example, a large-scale study (17 replications) failed to replicate the original facial feedback study (Wagenmakers, Beek, Dijkhoff, Gronau, Acosta, Adams, & Bulnes, 2016; but also see Noah, Schul, & Mayo, 2018, for a methodological critique of the replication). Heilman (2002) also reported a case in which a patient who was unable to display facial expressions showed no deficits in recognizing facial expressions (Decety, 2010). This calls into question whether mimicry really is necessary for empathy, but it does not speak against simulation on a neural level: such patients could still engage in sensorimotor simulation. Besides the emotional sharing aspect of mimicry, it may also serve as social glue and a communicative act. Studies suggest that having a goal of affiliation increases nonconscious mimicry (Lakin & Chartrand,

2003) and the tendency to adopt other people’s behaviors allows individuals to maintain


harmonious relationships with their group members (Lakin, Jefferis, Cheng, & Chartrand, 2003).

Therefore, facial expressions that people display in interpersonal interactions might result from a combination of the experience sharing and the communication aspects of mimicry. Last, facial expressions in social interactions might be automatic expressions from one’s own experience, which may or may not be the result of empathy.

To conclude, there are two potential reasons to expect that the facial expressions individuals display when interacting with others would strengthen the relationship between sensorimotor simulation and empathic accuracy: (1) facial expressions as the result of simulation-based mimicry, and (2) facial expressions as the result of people's own empathic feelings.

There are also two possible reasons why facial expressions might influence empathic accuracy independent of sensorimotor simulation: (1) mimicry of facial expressions as a communicative tool, for example driven by similarity and liking; and (2) facial expressions of the perceivers' non-empathic emotions. In this study, we will statistically control for the latter two reasons (for example, by controlling for participants' liking for their partners and the emotional content of their stories) as a follow-up to our main analyses, and hypothesize the involvement of emotional expressions in sensorimotor simulation. However, we will not be able to differentiate the first two mechanisms.

Based on the previous research on mimicry and facial feedback theory, the current study theorizes that facial expressions moderate the relationship between sensorimotor simulation and empathic accuracy in a face-to-face interaction. Specifically, we predicted that showing more facial expressions that are congruent with the emotion conveyed by the observed target would strengthen the positive connection between sensorimotor simulation and the empathic accuracy of the observer. This prediction is based on the assumption that mimicry might elicit actual emotions from the observer that are similar to those of the target. Thus, independent of why


exactly people might mimic, the mere act of mimicking should facilitate empathy through facial feedback.

Although researchers have explored sensorimotor simulation theory, mimicry, and facial feedback theory, to the best of our knowledge no study has examined sensorimotor simulation, emotion expressions, and empathic accuracy together, and no existing literature has looked at them in naturalistic face-to-face interactions. Therefore, this study explored the interactions among sensorimotor simulation, emotion expressions, and empathic accuracy in face-to-face interactions.

Measuring Sensorimotor Simulation

Sensorimotor simulation can be measured with most brain imaging methods, including functional magnetic resonance imaging (fMRI), transcranial magnetic stimulation (TMS), and

EEG. The most widely used index of sensorimotor simulation in EEG research is the suppression of the mu rhythm measured over sensorimotor areas. Theoretically, mu suppression has been linked to activation of the putative mirror neuron system (e.g., Cochin et al. and Holz et al., as cited in Woodruff, Martin, & Bilyk, 2011), and a meta-analysis also supports using mu suppression as an index of the mirror neuron system (Fox, Bakermans-Kranenburg, Yoo, Bowman, Cannon,

Vanderwert, & van IJzendoorn, 2016). Although there are some critiques arguing that the effect is weak and could be confounded with occipital alpha related to differences in attention (Hobson,

& Bishop, 2016), mu rhythm is still a reliable measurement when optimal parameters and approaches are employed (Bowman, Bakermans-Kranenburg, Yoo, Cannon, Vanderwert, Ferrari,

& Fox, 2017). Accordingly, we analyzed the effects for both central and occipital alpha and included a self-action condition as a baseline for comparison. The connection between mu


suppression and empathic abilities has been supported by many studies in humans (Perry, Bentin, Bartal, Lamm, & Decety, 2010; Woodruff, Barbera, & Von Oepen, 2016; Pineda, 2005), but the relationship to empathic accuracy has only been investigated in one previous study, which found a positive correlation between mu suppression over sensorimotor regions and empathic accuracy in response to emotional stimuli (Ong et al., 2018). To our knowledge, no study has looked at the link between action-related mu suppression and empathic accuracy. Although mu suppression was not measured during the interactions in the present study, it was measured right after the interaction through a ball squeezing task, which has been used to elicit mu suppression in previous studies (Coll, Bird, Catmur, & Press, 2015).

We hypothesized that stronger mu suppression would predict higher empathic accuracy in naturalistic contexts. In addition, based on the theory of embodied cognition and the facial feedback hypothesis, we expected this correlation to be stronger when participants expressed more positive affect during positive experience sharing or more negative affect during negative experience sharing.

Methods

Participants

Our sample consisted of 76 typical adult participants. Participants were recruited from the

Boston area through Craigslist.com and recruitment flyers, and they signed informed consent prior to participation. After excluding participants with incomplete data, the final sample size for data analysis was 49 participants. A sensitivity analysis revealed that the study could only detect relatively large effects with a power of .80. The final sample included 26 males, 21 females, 1 participant who identified as "Other", and 1 who did not report gender, with a mean age of 39.15 years. It included 40 White


participants, 8 Black participants, and 1 participant who identified as "Other". The sample consisted mainly of African-American and Caucasian participants because this study was part of a larger experiment examining interracial interaction between these two groups.

Procedures

Participants were run in pairs in a single experimental session that lasted about three hours (see Figure 1 for a diagram of the procedure). All instruments and measures referred to here are described in detail in the next section. Participants were fitted with EEG caps and seated in comfortable chairs face-to-face, 2 meters apart. Once the participants were set up for EEG recording, we measured their baseline brain activity for 5 minutes while they sat completely still, alternating between keeping their eyes open and closed. They then did an icebreaker task in which each participant recounted a neutral event (how they got to the lab that day) to familiarize themselves with the paradigm of experience sharing. Next, each participant shared one positive and one negative experience with their partner. While the dyads were talking and listening, both partners' facial expressions were videotaped. Immediately after the experience sharing task, participants watched their partner squeezing a ball for 30 seconds (observation condition) and also squeezed the ball themselves for 30 seconds (self-action condition). Then the participants watched both their own and their partner's videos and rated the emotions they perceived on a 9-point slider ranging from "very negative" to "very positive". The slider allowed participants to continuously update their affect ratings while watching the videos. Empathic accuracy was calculated based on the similarity between the participant's continuous ratings of his/her partner and the partner's continuous ratings of himself/herself. At the end, the participants completed questionnaires about the experience using the Subjective Experience


Assessment Scale and the Questionnaire of Cognitive and Affective Empathy, as described in the next section.

Figure 1. The Experiment Procedure

Materials & Measures

EEG recording and processing: EEG data were recorded from 32 active electrodes embedded in a stretch-Lycra cap (ActiCap, BrainProducts GmbH, Munich, Germany) using

BrainAmp amplifiers and the BrainVision recorder software (BrainProducts, GmbH, Munich,

Germany), and digitized at 512 Hz. Electrodes were arranged according to the 10-20 system with impedances kept below 15 KΩ and with an initial reference at FCz. Offline data analyses were done using custom MATLAB (R2018a, version 7.10.0) scripts and the EEGLAB toolbox

(Delorme, & Makeig, 2004).

First, data were segmented into intervals corresponding to the duration of each task to be analyzed. We used the cleanline EEGLAB extension to reduce line noise (Delorme & Makeig, 2004), re-referenced the data to the average reference, filtered temporally with a high-pass of 1 Hz and a low-pass of 30 Hz,


and spatially with a Laplacian filter. Data were then subjected to a SOBI-based independent component analysis (ICA), and the maximally independent source components were categorized as reflecting either brain or non-brain source activity using EEGLAB's ICLabel extension (Makeig & Onton, 2011). Components identified by ICLabel as blinks, facial muscle activity, or other noise with a probability higher than 50% were removed. We then used EEGLAB's pop_eegthresh function for epoch rejection.
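For readers who wish to follow the pipeline, a rough sketch of these preprocessing steps under an EEGLAB workflow is given below. The file name, the 50% threshold applied to all non-brain ICLabel classes, and the exact options are assumptions for illustration, not the lab's actual script; the cleanline step, the spatial Laplacian, and the final epoch rejection are indicated only as comments because their parameters depend on the local setup.

EEG = pop_loadset('filename', 'sub01_sharing.set');   % hypothetical file name
% ...cleanline line-noise removal would be applied here...
EEG = pop_reref(EEG, []);                   % re-reference to the average reference
EEG = pop_eegfiltnew(EEG, 1, 30);           % 1 Hz high-pass, 30 Hz low-pass
% ...spatial Laplacian filtering would be applied here...

EEG = pop_runica(EEG, 'icatype', 'sobi');   % SOBI-based ICA ('sobi' assumed to be an available icatype)
EEG = iclabel(EEG);                         % ICLabel component classification (plugin function)
probs = EEG.etc.ic_classification.ICLabel.classifications;   % nComponents x 7 class probabilities
% Columns 2-7 are the non-brain classes (muscle, eye, heart, line noise, channel noise, other);
% flag components whose most probable non-brain class exceeds 50%, per the text above.
bad = find(max(probs(:, 2:7), [], 2) > 0.5);
EEG = pop_subcomp(EEG, bad);                % remove the flagged components
% ...epochs exceeding amplitude thresholds would then be rejected with pop_eegthresh...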

Mu power (8–13 Hz) was calculated for both the five-minute eyes-open baseline period and the 30-second ball squeezing task. To do so, we first segmented these periods into artifact-free epochs of 2000 ms, extracted through a Hamming window with 80% overlap to minimize data loss, and then Fourier transformed them using the spectopo EEGLAB function. We then averaged the artifact-free segments across the duration of the tasks.

The resulting mu-power averages (8-13 Hz) were then log-transformed (log(ball squeezing observation condition / self ball squeezing baseline)) to correct for the non-normality of the data. We used the self ball squeezing baseline in this log ratio to control for individual differences and session-specific differences that might affect EEG recordings. Stronger suppression would lead to smaller mu power; therefore, a higher similarity between the activation during observation and the activation during the self-action condition would indicate more resonance. We hypothesized a positive correlation between empathic accuracy and mu suppression, so our prediction implies a negative correlation between empathic accuracy and the mu-power scores.
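To make the computation concrete, the following is a minimal plain-MATLAB sketch of this mu-power index under the parameters stated above (2000-ms Hamming-windowed epochs, 80% overlap, 8-13 Hz, log ratio of observation over self-action power). The actual pipeline used EEGLAB's spectopo; the variable names and the synthetic single-channel signals here are only illustrative.

fs      = 512;                         % sampling rate (Hz), as reported above
winLen  = 2 * fs;                      % 2000-ms epochs
step    = round(0.2 * winLen);         % 80% overlap -> advance by 20% of a window
muBand  = [8 13];                      % mu band (Hz)

eegObs  = randn(1, 30*fs);             % stand-in for one cleaned channel, observation condition (30 s)
eegSelf = randn(1, 30*fs);             % stand-in for the same channel, self-action condition (30 s)

muObs  = bandPower(eegObs,  fs, winLen, step, muBand);   % mean mu power, observation
muSelf = bandPower(eegSelf, fs, winLen, step, muBand);   % mean mu power, self action
muSuppression = log(muObs / muSelf);   % log ratio; more negative = stronger suppression

function p = bandPower(x, fs, winLen, step, band)
    % Mean power in a frequency band across overlapping Hamming-windowed epochs.
    w = 0.54 - 0.46*cos(2*pi*(0:winLen-1)'/(winLen-1));  % Hamming window
    f   = (0:winLen-1) * fs / winLen;                    % frequency axis of the FFT
    idx = f >= band(1) & f <= band(2);                   % bins inside the band
    nEpochs = floor((numel(x) - winLen)/step) + 1;
    acc = 0;
    for k = 1:nEpochs
        seg = x((k-1)*step + (1:winLen));
        X   = fft(seg(:) .* w);
        acc = acc + mean(abs(X(idx)).^2);                % band power for this epoch
    end
    p = acc / nEpochs;
end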

Experience Sharing Task: Participants were told that they would be sharing two experiences with their partner: one positive experience and one negative experience.

Specifically, they were told to “take a moment to think of a most positive/negative personal


event that [they] would like to share” and “try to choose an event which [they] can speak about for a full 2 minutes, but no longer”. Before experience sharing, participants were asked to recount a neutral event (how they got to the lab that day) to familiarize themselves with the paradigm, while the listeners were instructed not to talk during the experience sharing.

Experimenters video recorded both the speaker and the observer during this experience sharing task.

Empathic Accuracy Task (adapted from Zaki et al., 2008): Participants watched their partner's and their own videos after the ball squeezing task. While watching the videos, they rated the emotions of their partner and of themselves on a 9-point rating scale from "very negative" to "very positive" using the arrow keys: pressing the left arrow indicated more negative emotion and pressing the right arrow indicated more positive emotion. We followed previous studies in calculating empathic accuracy: we operationalized empathic accuracy by quantifying participants' agreement on how the emotional content of the video changed

(Ong et al., 2018, under review). For each one-second interval, we classified participants' ratings into one of three categories: either an increase in affect, a decrease, or maintained from the previous epoch. Specifically, we defined deltaPart(t) as a scale-invariant derivative:

deltaPart(t) = { "increase", if part(t) - part(t-1) > 0

{ "maintain", if part(t) - part(t-1) == 0

{ "decrease", if part(t) - part(t-1) < 0

We defined a corresponding variable, deltaTarg(t), using the target's self-reported ratings

(Ong et al., 2018, preprint). We then operationalized change detection such that a "successful" change detection occurs if participants' rated change matches the target's change at that second. If they do not match, this would be a "failed" change detection (Ong et al., 2018, preprint).


changeDetection(t) = {1, if deltaPart(t)==deltaTarg(t)

{ 0, otherwise

Therefore, change detection is a binary variable for each one-second interval that reflects whether the participant successfully detected any change in the target's affect (Ong et al., 2018, under review). Lastly, we averaged across the entire time period of each shared experience and then averaged the scores for the positive and the negative experience for each participant, so the final empathic accuracy score is a continuous variable ranging from 0 to 1.
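As an illustration, a minimal MATLAB sketch of this change-detection score is given below, assuming two equal-length vectors holding one rating per second; the variable names and toy data are not from the actual study.

percRating = [3 3 4 5 5 4 4 3];      % perceiver's second-by-second rating of the target's affect (toy data)
targRating = [3 4 4 5 4 4 3 3];      % target's second-by-second rating of his/her own affect (toy data)

deltaPart = sign(diff(percRating));  % +1 = increase, 0 = maintain, -1 = decrease
deltaTarg = sign(diff(targRating));

changeDetection  = (deltaPart == deltaTarg);   % 1 = successful change detection for that second
empathicAccuracy = mean(changeDetection);      % proportion of matches, ranging from 0 to 1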

Facial expression coding: Each video was coded by two research assistants using the Facial Expression Coding System (FACES; Kring & Sloan, 1991). The intraclass correlation between coders was higher than 0.8, indicating adequate inter-rater reliability, as shown in Table 1.

Table 1
Inter-rater Reliability Table

              Frequency   Duration   Intensity
Positive        0.93        0.90       0.88
Negative        0.82        0.80       0.83

The subcomponents of the facial expression coding include the frequency, duration, intensity, and valence of the expressions. When raters detected a change in the participant's facial expression on the video, they recorded when this change occurred and ended, as well as its intensity and valence (either positive or negative). Therefore, the minimum rating for frequency, duration, and intensity is zero, and there is no upper limit. Following previous studies that used this scale (Kring & Sloan, 2007; Kring & Gordon, 1998), we only used frequency, to reduce the number of dependent variables in the analysis; the other two components are also strongly correlated with frequency. As pre-registered, for the positive condition we used the total frequency of positive expressions as the index of each participant's emotional expressions, while for the negative condition we used the total frequency of negative expressions. We also averaged the frequencies of both positive and negative facial expressions in both the positive and negative conditions to calculate the average emotional expressivity of each participant.
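A minimal sketch of how these frequency indices could be computed from the coded expressions is shown below; the string arrays standing in for the coders' output are purely illustrative and not the lab's actual data format.

posVideo = ["positive" "positive" "negative" "positive"];  % valences coded in the positive-condition video (toy data)
negVideo = ["negative" "positive" "negative"];             % valences coded in the negative-condition video (toy data)

posIndex = sum(posVideo == "positive");   % frequency of positive expressions, positive condition
negIndex = sum(negVideo == "negative");   % frequency of negative expressions, negative condition

% average emotional expressivity: all coded expressions, averaged over the two videos
avgExpressivity = (numel(posVideo) + numel(negVideo)) / 2;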

The QCAE (Questionnaire of Cognitive and Affective Empathy) (Reniers, Corcoran,

Drake, Shryane, & Völlm, 2011): Participants filled out the QCAE, which measured their trait empathy on a 4-point Likert scale ("strongly agree", "slightly agree", "slightly disagree",

“strongly disagree”).

Subjective Experience Assessment: After completion of the empathic accuracy measure, participants completed a series of self-report measures of their subjective experience of the interaction: their desire to affiliate, their subjective closeness to the interaction partner (Aron, Aron, & Smollan, 1992), and their positive evaluations of the partner (Holoien, Bergsieker, Shelton, & Alegre, 2015).

Emotional Content of the Conversation: To analyze the emotional content of participants' narratives, we used Sonix, an online automated transcription service, to convert the audio files (.MP4) of participants telling their positive and negative experiences into text files, with an average accuracy above 85%, and then performed sentiment analysis using Stanford CoreNLP version 3.8.0 (Manning, Surdeanu, Bauer, Finkel, Bethard, & McClosky, 2014).


Statistical Model of Pre-registered Hypothesis

A Multiple Regression Model will be used for this study as shown below:

Estimated Empathic Accuracy = β0 + β1·MuSuppression + β2·EmotionalExpression + β3·(MuSuppression × EmotionalExpression) + εi

Hypothesis 1: Mu suppression will significantly predict empathic accuracy.

β1 will be statistically significant (p< .005)

Hypothesis 2: The effect of mu suppression on empathic accuracy would be moderated by emotion expression; we expected an interaction between mu suppression and emotion expression scores.

β3 will be statistically significant (p< .005)

We will also run the multiple regression model separately for positive experience and negative experience sharing and compare the differences between the two conditions as below:

For the positive condition: Estimated Empathic Accuracy = β0 + β1·MuSuppression + β2·EmotionalExpression|positive + β3·(MuSuppression × EmotionalExpression)|positive + β4·TraitEmpathy + β5·SubjectiveExperience + εi

For the negative condition: Estimated Empathic Accuracy = β0 + β1·MuSuppression + β2·EmotionalExpression|negative + β3·(MuSuppression × EmotionalExpression)|negative + β4·TraitEmpathy + β5·SubjectiveExperience + εi

We will also conduct follow-up analyses that control for potential confounds.1

1 Follow-up analysis: Estimated Empathic Accuracy = β0 + β1·MuSuppression + β2·EmotionalExpression + β3·(MuSuppression × EmotionalExpression) + β4·TraitEmpathy + β5·SubjectiveExperience + εi. Predictors: mu suppression, emotional expression; outcome: empathic accuracy; covariates: trait empathy, subjective experience.
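For concreteness, a minimal MATLAB sketch of fitting the basic pre-registered model with fitlm (Statistics and Machine Learning Toolbox) is shown below; the simulated data table and variable names are assumptions, and the interaction term in the Wilkinson formula corresponds to the moderation term β3 above.

n   = 49;                                   % final sample size
tbl = table(randn(n,1), 10*rand(n,1), rand(n,1), ...
    'VariableNames', {'MuSuppression','EmotionalExpression','EmpathicAccuracy'});   % simulated stand-in data

% 'EmpathicAccuracy ~ MuSuppression*EmotionalExpression' expands to the two
% main effects plus their interaction (the moderation term).
mdl = fitlm(tbl, 'EmpathicAccuracy ~ MuSuppression*EmotionalExpression');
disp(mdl)                                   % coefficient estimates, standard errors, and p-values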


Results

This study was pre-registered (https://osf.io/72atz). We first conducted descriptive statistics to get a general idea of the distribution and normality of all dependent variables. Following the methodology of previous studies, we used baseline normalization for the mu suppression data (Oberman, Ramachandran, & Pineda, 2008; Perry, Bentin, Bartal, Lamm, & Decety, 2010) because power-law scaling has been observed in the EEG spectrum (Ferree & Hwa, 2003). The average mu suppression at C3, C4, and Cz is approximately symmetric with one peak (n=49, mean=0.01, median=0.02, sd=0.10). Mu suppression at C3 is slightly left skewed with one peak (n=49, mean=0.01, median=0.01, sd=0.16). Mu suppression at C4 is normally distributed (n=49, mean=0.02, median=0.02, sd=0.13). Mu suppression at Cz is slightly right skewed (n=49, mean=0.01, median=0.01, sd=0.26). Empathic accuracy (n=49, mean=0.59, median=0.58, sd=0.12) is comparatively normally distributed. The average emotional expressions (n=49, mean=3.34, median=3.28, sd=3.19) are slightly right skewed. For the positive and negative conditions, the positive emotions in the positive condition (n=49, mean=5.19, median=4.0, sd=4.91) are comparatively right skewed, as are the negative emotions in the negative condition (n=49, mean=1.48, median=0.5, sd=2.03).

We then used Tukey's method to identify outliers in mu suppression, empathic accuracy, and emotional expressions, defined as values more than 1.5 × IQR above the third quartile or below the first quartile: no outliers were found for empathic accuracy; two outliers were found for emotional expressions and removed, which changed the average emotional expressions from 3.34 to 2.91; three outliers were found for mu suppression, but removing them did not change the mean value of mu suppression.2

2 Mean (mu) before removing the outliers: 0.01; mean (mu) after removing the three outliers: 0.01.
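A minimal MATLAB sketch of the Tukey screening described above is given below, assuming a single column of scores; the variable x and the toy values are illustrative only.

x = [randn(47,1); 8; -7];                       % toy scores with two obvious outliers
q = prctile(x, [25 75]);                        % first and third quartiles (Statistics Toolbox)
iqrVal = q(2) - q(1);                           % interquartile range
isOutlier = x < q(1) - 1.5*iqrVal | x > q(2) + 1.5*iqrVal;   % Tukey fences
xClean = x(~isOutlier);                         % values retained for analysis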

Bivariate analysis

We used a scatterplot matrix among the continuous variables to visually present the data to see the direction, linearity, and strength of the relationship between the predictors and predicted variables, as shown in Figure 2.

Figure 2. The Scatterplot of the Continuous Variables

The relationship between emotional expressions and empathic accuracy appears to be non-linear, as shown in Figure 3, so we conducted a Box-Cox analysis to see whether we needed to transform the data: the optimal λ was around 1.5, indicating that no transformation was strictly necessary, so we kept the original data.


Figure 3. The Relationship between Emotional Expressions and Empathic Accuracy

We also examined the simple association between mu suppression averaged across the three sites and empathic accuracy, because we pre-registered this prediction, as shown in Table 2. The correlation is slightly negative though not significant, r = -.08, t(47) = -0.57, p = .56. There is a strong positive correlation between emotional expression and empathic accuracy, r = .42, t(47) = 3.19, p = .003. This relatively strong zero-order correlation signals that, in a regression model where empathic accuracy serves as the outcome variable, emotional expressions alone will help explain a meaningful proportion of the variation in empathic accuracy.

Table 2
Correlation Table for Mu, Emotional Expressions and Empathic Accuracy

                          Mu Suppression   Emotional Expressions   Empathic Accuracy
Mu Suppression                  -                  -0.17                -0.08
Emotional Expressions         -0.17                  -                  0.42**
Empathic Accuracy             -0.08                0.42**                 -

Note: . p<0.1, * p<0.05, ** p<0.01, *** p<0.001. The numbers are correlation coefficients.
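A minimal sketch of how the zero-order correlations in Table 2 could be computed is shown below; the three toy vectors stand in for the participant-level variables and are not the actual data.

muSuppression        = randn(49,1);             % stand-in data, one row per participant
emotionalExpressions = 10*rand(49,1);
empathicAccuracy     = rand(49,1);

X      = [muSuppression, emotionalExpressions, empathicAccuracy];
[R, P] = corr(X, 'Rows', 'pairwise');           % 3 x 3 correlation and p-value matrices (Statistics Toolbox)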


Hypothesis Testing

Basic Model (M1): Empathic accuracy = β0 + β1 × mu suppression + β2 × emotional expressions + β3 × mu suppression × emotional expressions.

H1: We hypothesized that stronger mu suppression would be positively correlated with empathic accuracy. More mu suppression would lead to smaller mu power, so our prediction was a negative correlation between mu power and empathic accuracy. Thus, we expected to find a statistically significant coefficient for β1 (mu suppression).

The regression result showed that the model is significant, F(3,45)=3.26, p=.03, R2=12%, although the adjusted R2 is small. We did not find evidence supporting our first hypothesis,

β1(mu suppression) = -0.004, p = 0.97

H2: We predicted a moderation effect of emotional expressions on the association between mu and empathic accuracy, so we expected to find a statistically significant correlation coefficient for the interaction term β3(mu suppression*emotional expressions).

We did not find evidence supporting our second hypothesis on the moderation effect.

The first fitted moderation model is: Empathic Accuracy = 0.53 - 0.004 × mu suppression + 0.016 × emotional expressions - 0.002 × mu suppression × emotional expressions, as shown in Figure 4.


Figure 4. The Relationship between Mu and Empathic Accuracy

The coefficient for the interaction between mu and emotional expressions (β3) is not statistically significant, as shown in Table 3: β3 (mu suppression × emotional expressions) = -0.002, p = 0.95. Mu suppression might not be significantly correlated with empathic accuracy partly because the experiments took place in different testing rooms, which might confound the EEG signals due to differences in external noise, amplifiers, and other recording conditions. Hence, participants tested in one room might show higher mu suppression than participants tested in another room due to equipment differences. Therefore, in the next model, the eyes-open baseline is included as a covariate to control for potential noise caused by the testing room difference.

Model 2: Empathic accuracy = β0 + β1 × mu suppression (average) + β2 × emotional expressions + β3 × eye open state + β4 × mu suppression (average) × emotional expressions. The regression result shows that the model is significant, F(4,44) = 2.9, p = .03, R2 = 13.9%, as shown in Table 3.3 This model shows the effect of mu suppression on empathic accuracy, moderated by emotional expressions, controlling for the testing room differences: the effect of mu suppression is not statistically significant, and emotional expressions have a statistically significant effect on empathic accuracy, although it was not a moderation effect.

3 We also explored several possible additional confounds that might influence the association between mu suppression and empathic accuracy, as pre-registered: the desire to affiliate with the partner, the personal evaluation of the partner, trait empathy (both affective and cognitive), and the emotional content of the conversation. Statistical analysis shows that only the personal evaluation of the partner is negatively correlated with empathic accuracy: Empathic accuracy = 0.90 + 0.55 × mu suppression + 0.02 × emotional expressions - 0.006 × eye open state - 0.05 × personal evaluation - 0.03 × mu suppression × emotional expressions. The regression results show that the model is significant, F(5,39) = 5.35, p = .0008, R2 = 33.1%.

Table 3
Model Comparisons Table

                              M1          M2
Intercept                   0.53***     0.60***
                            (0.02)      (0.05)
Mu (average)                -0.004       0.14
                            (0.25)      (0.27)
Emotional Expressions       0.016*      0.016*
                            (0.005)     (0.005)
Mu*Emotional Expressions    -0.002      -0.01
                            (0.048)     (0.05)
Resting Baseline                        -0.005
                                        (0.004)
Evaluation
R2                           0.12        0.14
F (df1, df2)                3.25*        2.9*
                            (3,45)      (4,44)
RMSE                         0.11        0.12

Note: . p<0.1, * p<0.05, ** p<0.01, *** p<0.001. The numbers are regression coefficients and the numbers in parentheses are standard errors.


Follow up Analysis

Positive and Negative Conditions

The moderation models are similar in the positive and negative conditions. In the positive condition, where the participants shared a positive experience with their partners: Empathic accuracy = 0.6 + 0.16 × mu suppression + 0.01 × positive emotional expressions - 0.006 × eye open state - 0.008 × mu suppression × positive emotional expressions, F(4,44) = 2.85, p = .03, R2 = 0.13. Emotional expressions are positively correlated with empathic accuracy in this model, β (emotional expressions) = .01, p = .03, indicating that for every additional emotional expression the participant displays, empathic accuracy increases by .01 unit on average, controlling for the other variables. In the negative condition: Empathic accuracy = 0.62 + 0.15 × mu suppression + 0.02 × negative emotional expressions - 0.005 × eye open state - 0.04 × mu suppression × negative emotional expressions, F(4,44) = 1.95, p = .12, R2 = 0.12. Emotional expressions are also positively correlated with empathic accuracy in the negative condition, β (emotional expressions) = .02, p = .03, indicating that emotional expressions positively contribute to empathic accuracy.

Mu Suppression at Different Sites

We tested Model 2 at C3, Cz, and C4 separately: Empathic accuracy = β0 + β1 × mu suppression + β2 × emotional expressions + β3 × eye open state + β4 × mu suppression × emotional expressions. This model is significant at all three sites (C3, Cz, C4) over the sensorimotor area, as shown in Table 4. The coefficient for emotional expressions is also significant at all three sites.


Table 4
Mu Suppression at Different Sites

                            M2 (C3)     M2 (C4)     M2 (Cz)
Intercept                   0.58***     0.59***     0.60***
                            (0.05)      (0.04)      (0.05)
Mu (C3)                      0.27
                            (0.20)
Mu (C4)                                 -0.13
                                        (0.21)
Mu (Cz)                                              0.14
                                                    (0.15)
Emotional Expressions       0.016*      0.014*      0.014*
                            (0.005)     (0.006)     (0.005)
Mu*Emotional Expressions    -0.067      -0.02       -0.005
                            (0.037)     (0.047)     (0.02)
Eye Open State              -0.004      -0.003      -0.007
                            (0.003)     (0.003)     (0.004)
R2                           0.18        0.21        0.17
F (df1, df2)                3.57*       3.49**       3.5*
                            (4,44)      (4,44)      (4,44)
RMSE                         0.11        0.11        0.11

Note: . p<0.1, * p<0.05, ** p<0.01, *** p<0.001. Numbers in parentheses are standard errors.

Differentiating central from occipital effects

We compared mu suppression with alpha activity from the occipital region. As shown in Figure 5, there is strong activation in the occipital region, so to test whether we are able to isolate mu suppression due to sensorimotor activation as opposed to attention-related changes in occipital alpha, we conducted a 3 (lateralization: left, central, right) x 2 (position: central vs. occipital) x 2 (condition: self vs. other) ANOVA. There is a main effect of centrality, F(1,63) = 61.47, p < .001, and a main effect of lateralization, F(2,126) = 9.38, p < .001. No main effect is found for condition, F(1,63) = .11, p = .74. The main effect of centrality is qualified by a significant interaction between centrality and condition, F(1,63) = 31.1, p < .001. No other interactions are significant (all p-values > .14). To break down the significant centrality × condition interaction, we conducted a simple effects analysis. Alpha waves from the central electrodes and the occipital electrodes have similar patterns in the observation condition, as shown in Figure 6. However, although the simple effect of condition is significant in both the occipital and central electrodes, it goes in opposite directions: central electrodes show less mu during the self condition than the observation condition, as expected, x̄ (self-other) = -.46, p = .045, while occipital electrodes show more mu in the self condition than during the observation condition, x̄ (self-other) = .74, p = .01. This pattern of results suggests that the effects observed in central electrodes are unlikely to be a bleed-over of attention-related occipital alpha.


Figure 5. Topoplot of the Ball Squeezing Task (the Observation Condition is on the left and the Self Condition on the right)

Figure 6. Alpha Wave Comparisons between the Central Electrodes and Occipital Electrodes (Means)


To ensure that our observed effects are not solely due to differences in attention, we also tested the effects of occipital attention-related alpha: Empathic accuracy = 0.61 - 0.37 × alpha desynchronization (occipital region) - 0.005 × eye open state baseline + 0.014 × emotional expressions + 0.05 × alpha × emotional expressions. The coefficient for alpha wave suppression is not statistically significant, β(alpha) = -0.37, p = .06, so we did not find strong support for a relationship between attention-related occipital alpha during action observation and empathic accuracy, suggesting that our mu-suppression findings are restricted to sensorimotor regions. Emotional expressions have a statistically significant association with empathic accuracy in this attention model that includes occipital activity, β(emotional expressions) = 0.01, t(44) = 2.9, p = .005, but emotional expressions do not have a moderation effect on the association between alpha waves and empathic accuracy at the occipital sites either, indicating that attention is not playing an important role in our paradigm.

Discussion

The objective of this study was to examine the association between sensorimotor simulation and empathic accuracy, the role of emotional expressions in empathic accuracy, and how emotional expressions might influence the relationship between sensorimotor simulation and empathic accuracy.

Our pre-registered analysis on the average of central electrodes did not find a correlation between mu-suppression and empathic accuracy, nor a moderation effect of emotional expressions. Thus, we were unable to provide evidence for our proposed statistical relationship between sensorimotor simulation and empathic accuracy. We did, however, find a statistically significant correlation between emotional expressions and empathic accuracy, supporting a


potential association between expressing more emotions and a better ability to understand other people's emotions.

Hypothesis 1

We predicted that sensorimotor simulation, as indexed by EEG mu suppression, would predict better empathic accuracy. The averaged mu suppression from the central electrodes was not significantly correlated with empathic accuracy. Since EEG mu suppression is often lateralized to the contralateral left hemisphere when participants observe right-hand movements, we looked at effects specific to electrodes C3, Cz, and C4 in an exploratory set of analyses. The multiple regression models were significant for all three sites. We found a marginally significant trend only at electrode C3, suggesting that EEG mu suppression within the left sensorimotor cortex might predict empathic accuracy, and we also found that emotional expressions moderated the association between mu suppression at C3 and empathic accuracy. It is important to note, though, that this analysis was not pre-registered and is thus exploratory in nature. The statistical results were weak, and the observed trend was positive, suggesting that more sensorimotor simulation interfered with rather than facilitated empathic accuracy, a result opposite to what we predicted. This study was also underpowered, as the sensitivity analysis shows that we only had sufficient power to find large effects, so a future replication study with higher power could potentially clarify the results.

One possible explanation for our inconclusive results might be the baseline technique.

There are several ways of choosing the baseline for mu when using it as an index of sensorimotor simulation in the existing literature (Hobson & Bishop, 2016). For example, a previous study showed that a within-trial baseline is better than a between-trial baseline at revealing the pattern of mu suppression (Hobson et al., 2016). We used self action as the baseline (Pineda, Allison, & Vankov, 2000), whereas some previous studies used resting states as the baseline (Ong et al., 2018, preprint; Coll, Bird, Catmur, & Press, 2015). Unfortunately, previous experiments all used simple stimuli to activate the mu rhythm, so there is not enough literature supporting any particular method of calculating mu suppression in a face-to-face interaction.

The lack of conclusive evidence for the association between mu suppression and empathic accuracy might also be due to the complexity of the paradigm: the participants did not just passively receive information from their partners. Instead, they were constantly communicating non-verbally with their partners and potentially adjusting their behaviors based on real-time feedback from their partners. Unlike previous studies in which the participants were merely perceivers of images (Dimberg et al., 2000) or video recordings (Ong et al., 2018) of an emotional display and did not see the targets in person, participants in this study had a much more intimate interaction with the targets. There might be more factors to consider when interacting with another person face to face, and hence this study had to control for more complicated confounds, such as the target's conversational skill. We controlled for the content of the conversation and how approachable the target was (e.g., the likeability question in the personal evaluation questionnaire), but we did not have control measures for many additional confounds, including the speaker's non-verbal communication skills. A study showed that conversation "openers" can reinforce their partners' participation in a conversation and tend to display more attentive facial expressions (Purvis, Dabbs, & Hopper, 1984). Thus, a future direction for this study would be to control for more target-related confounding variables.


Hypothesis 2

We did not find our expected moderation effect of facial expression in the multiple regression model; the negative coefficient of the interaction term in the model is not significant.

As mentioned earlier, there are four potential reasons behind emotional expressions, and three of them might lead to an association between emotional expressions and empathic accuracy so we predicted a moderation effect of emotional expressions. We found an effect but not a moderation effect of emotional expressions potentially because multiple mechanisms were present in the study: the emotional expressions might be influenced by both mimicry and the receivers’ own non-empathic emotions.

In our exploratory testing, we found a marginally significant correlation between mu suppression at C3 and empathic accuracy, suggesting that EEG mu suppression within the left sensorimotor cortex might predict empathic accuracy; we also found that emotional expressions slightly moderated the association between mu suppression at C3 and empathic accuracy. Because the coefficient of mu was positive, the negative moderation effect of emotional expression indicated that, at the same level of empathic accuracy, expressing more emotion while listening to the stories was associated with smaller differences between the mu rhythm during action observation and the mu rhythm during action execution.

Moreover, emotional expression is positively and significantly correlated with empathic accuracy, indicating that the more emotional expressions people display during the interaction, the better they are at detecting their partner's feelings. This finding is consistent with the theory of embodied cognition, which suggests that the motor system influences individuals' cognition (Garbarini & Adenzato, 2004). Although the origin of the emotional expressions was not tested in this paradigm, whether it is a result of mimicry or a result of the participants' own feelings towards their partner, this association between emotional expression and empathic accuracy still enriches our understanding of the role that non-verbal communication plays in a face-to-face interaction. Our findings suggest that when we see people having a conversation with others, the individuals who display more facial expressions are likely to be the ones who understand others better, even though we cannot infer a causal relationship between emotional expressions and the ability to understand others' feelings.

Future Directions

By examining the role of facial expression in the connection between sensorimotor simulation and empathic accuracy during in vivo interaction, this study sheds light on the possible influence of emotion expression on how well we understand others in real life. Some early research failed to find a connection between sensorimotor simulation and empathy (Blairy, Herrera, & Hess,

1999), yet the lack of evidence could be caused by the simplicity of the stimuli. According to

Swann (1984), there are crucial differences between active and passive perceivers (Blairy et al.,

1999). Active perceivers might be more motivated to engage in understanding their targets

(Blairy et al., 1999). Having more facial expressivity might reflect more communicative engagement, actual experience of the observed emotions, and mimicry, although the current study cannot disentangle them or pinpoint which one has the strongest effect. Future studies should use measures that can capture facial mimicry, such as simultaneous recordings of both the speaker's and the listener's facial expressions through facial EMG or automated facial-coding APIs.


Conclusions

The existing models of mimicry propose that facial expressions of the speakers might be followed by a mimicry response from the observer (Sel, Calvo-Merino, Tuettenberg, & Forster,

2015; Niedenthal, 2007). This relationship is reciprocal, meaning that the facial muscle movements triggered by the target's facial movements might evoke an emotional state that can then influence the observer's perception of emotional states in the speaker (Strack et al., 1988;

Kuhn et al., 2011 as cited in Sel et al., 2015). In addition, the effects of one’s own facial expressions might also influence this information processing (Critchley and Nagai, 2012 as cited in Sel et al., 2015). However, it remains unclear how well people can estimate another’s emotions after interacting with the person and what role sensorimotor simulation plays in empathic accuracy.

This study investigated the relation of sensorimotor simulation to understanding others' emotions by addressing two questions: First, is sensorimotor simulation positively correlated with empathic accuracy in a face-to-face interaction? Second, how are the neural networks underlying empathy influenced by emotional expression? Although we did not find conclusive evidence for a moderation effect of emotional expression on the connection between sensorimotor simulation and empathic accuracy, we did find evidence supporting a connection between listeners' emotional expressions and their empathic accuracy. This finding shows the importance of non-verbal communication for people's ability to understand other individuals' feelings. It indicates that individuals might be better at understanding the person they are interacting with when they express themselves more during the interaction.


References

Achaibou, A., Pourtois, G., Schwartz, S., & Vuilleumier, P. (2008). Simultaneous recording of EEG and facial muscle reactions during spontaneous emotional mimicry. Neuropsychologia, 46(4), 1104-1113. Avenanti, A., Minio-Paluello, I., Bufalari, I., & Aglioti, S. M. (2009). The pain of a model in the personality of an onlooker: influence of state-reactivity and personality traits on embodied empathy for pain. Neuroimage, 44(1), 275-283. Avenanti, A., Bueti, D., Galati, G., & Aglioti, S. M. (2005). Transcranial magnetic highlights the sensorimotor side of empathy for pain. Nature neuroscience, 8(7), 955. Atkins, D., Uskul, A. K., & Cooper, N. R. (2016). Culture shapes empathic responses to physical and social pain. Emotion, 16(5), 587. Ali, F., Amorim, I. S., & Chamorro-Premuzic, T. (2009). Empathy deficits and trait in psychopathy and Machiavellianism. Personality and Individual Differences, 47(7), 758-762. Aron, A., Aron, E. N., & Smollan, D. (1992). Inclusion of other in the self scale and the structure of interpersonal closeness. Journal of personality and social psychology, 63(4), 596. Blairy, S., Herrera, P., & Hess, U. (1999). Mimicry and the judgment of emotional facial expressions. Journal of Nonverbal behavior, 23(1), 5-41. Bowman, L. C., Bakermans-Kranenburg, M. J., Yoo, K. H., Cannon, E. N., Vanderwert, R. E., Ferrari, P. F., ... & Fox, N. A. (2017). The mu-rhythm can mirror: Insights from experimental design, and looking past the controversy. Cortex, 96, 121-125. Chartrand, T. L., & Lakin, J. L. (2013). The antecedents and consequences of human behavioral mimicry. Annual review of psychology, 64, 285-308. Cheng, Y., Chen, C., & Decety, J. (2014). An EEG/ERP investigation of the development of empathy in early and middle childhood. Developmental Cognitive Neuroscience, 10, 160- 169. Colla, M. P., Birdc, G., Catmure, C., & Pressf, C. Crossmodal repetition effects in the mu rhythm indicate tactile mirroring during action observation 2. Coll, M. P., Bird, G., Catmur, C., & Press, C. (2015). Cross-modal repetition effects in the mu rhythm indicate tactile mirroring during action observation. Cortex, 63, 121-131. Clark, T. F., Winkielman, P., & McIntosh, D. N. (2008). Autism and the extraction of emotion from briefly presented facial expressions: stumbling at the first step of empathy. Emotion, 8(6), 803. Delorme, A., & Makeig, S. (2004). EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of neuroscience methods, 134(1), 9-21. Decety, J. (2010). To what extent is the experience of empathy mediated by shared neural circuits? Emotion Review, 2(3), 204-207. Decety, J., Bartal, I. B. A., Uzefovsky, F., & Knafo-Noam, A. (2016). Empathy as a driver of prosocial behaviour: highly conserved neurobehavioural mechanisms across species. Phil. Trans. R. Soc. B, 371(1686), 20150077. Dimberg, U., Thunberg, M., & Elmehed, K. (2000). Unconscious facial reactions to emotional facial expressions. Psychological science, 11(1), 86-89.


Dimberg, U., Andréasson, P., & Thunberg, M. (2011). Emotional empathy and facial reactions to facial expressions. Journal of Psychophysiology.
Fabi, S., & Leuthold, H. (2018). Racial bias in empathy: Do we process dark- and fair-colored hands in pain differently? An EEG study. Neuropsychologia, 114, 143-157.
Ferree, T. C., & Hwa, R. C. (2003). Power-law scaling in human EEG: Relation to Fourier power spectrum. Neurocomputing, 52, 755-761.
Fox, N. A., Bakermans-Kranenburg, M. J., Yoo, K. H., Bowman, L. C., Cannon, E. N., Vanderwert, R. E., ... & van IJzendoorn, M. H. (2016). Assessing human mirror activity with EEG mu rhythm: A meta-analysis. Psychological Bulletin, 142(3), 291.
Gallese, V., & Goldman, A. (1998). Mirror neurons and the simulation theory of mind-reading. Trends in Cognitive Sciences, 2(12), 493-501.
Garbarini, F., & Adenzato, M. (2004). At the root of embodied cognition: Cognitive science meets neurophysiology. Brain and Cognition, 56(1), 100-106.
Gutsell, J. N., & Inzlicht, M. (2012). Intergroup differences in the sharing of emotive states: Neural evidence of an empathy gap. Social Cognitive and Affective Neuroscience, 7(5), 596-603.
Galinsky, A. D., Maddux, W. W., Gilin, D., & White, J. B. (2008). Why it pays to get inside the head of your opponent: The differential effects of perspective taking and empathy in negotiations. Psychological Science, 19(4), 378-384.
Hess, U., & Bourgeois, P. (2010). You smile–I smile: Emotion expression in social interaction. Biological Psychology, 84(3), 514-520.
Hess, U., & Blairy, S. (2001). Facial mimicry and emotional contagion to dynamic emotional facial expressions and their influence on decoding accuracy. International Journal of Psychophysiology, 40(2), 129-141.
Hatfield, E., Cacioppo, J. T., & Rapson, R. L. (1993). Emotional contagion. Current Directions in Psychological Science, 2(3), 96-100.
Hobson, H. M., & Bishop, D. V. (2016). Mu suppression – A good measure of the human mirror neuron system? Cortex, 82, 290-310.
Helt, M. S., Eigsti, I. M., Snyder, P. J., & Fein, D. A. (2010). Contagious yawning in autistic and typical development. Child Development, 81(5), 1620-1631.
Holoien, D. S., Bergsieker, H. B., Shelton, J. N., & Alegre, J. M. (2015). Do you really understand? Achieving accuracy in interracial relationships. Journal of Personality and Social Psychology, 108(1), 76.
Jacob, P. (2008). What do mirror neurons contribute to human social cognition? Mind & Language, 23(2), 190-223.
Jospe, K., Flöel, A., & Lavidor, M. (2018). The interaction between embodiment and empathy in facial expression recognition. Social Cognitive and Affective Neuroscience, 13(2), 203-215.
Khalil, E. L. (2002). Similarity versus familiarity: When empathy becomes selfish. Behavioral and Brain Sciences, 25(1), 41-41.
Kraus, M. W., Côté, S., & Keltner, D. (2010). Social class, contextualism, and empathic accuracy. Psychological Science, 21(11), 1716-1723.


Kret, M. E. (2015). Emotional expressions beyond facial muscle actions: A call for studying autonomic signals and their impact on social perception. Frontiers in Psychology, 6, 711.
Kret, M. E., Fischer, A. H., & De Dreu, C. K. (2015). Pupil mimicry correlates with trust in in-group partners with dilating pupils. Psychological Science, 26(9), 1401-1410.
Kring, A. M., & Sloan, D. (1991). The facial expression coding system (FACES): A user's guide. Unpublished manuscript.
Lakin, J. L., & Chartrand, T. L. (2003). Using nonconscious behavioral mimicry to create affiliation and rapport. Psychological Science, 14(4), 334-339.
Lakin, J. L., Jefferis, V. E., Cheng, C. M., & Chartrand, T. L. (2003). The chameleon effect as social glue: Evidence for the evolutionary significance of nonconscious mimicry. Journal of Nonverbal Behavior, 27(3), 145-162.
Manning, C., Surdeanu, M., Bauer, J., Finkel, J., Bethard, S., & McClosky, D. (2014). The Stanford CoreNLP natural language processing toolkit. In Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics: System Demonstrations (pp. 55-60).
Makeig, S., & Onton, J. (2011). ERP features and EEG dynamics: An ICA perspective. In S. J. Luck & E. S. Kappenman (Eds.), The Oxford handbook of event-related potential components (pp. 51-88).
MATLAB. (2010). Version 7.10.0 (R2010a). Natick, MA: The MathWorks Inc.
Mori, H., & Mori, H. (2007). We feel sorry because we cry: A test of the passive facial feedback hypothesis. Perceptual and Motor Skills, 105, 1242-1244.
Moore, A., Gorodnitsky, I., & Pineda, J. (2012). EEG mu component responses to viewing emotional faces. Behavioural Brain Research, 226(1), 309-316.
Neal, D. T., & Chartrand, T. L. (2011). Embodied emotion perception: Amplifying and dampening facial feedback modulates emotion perception accuracy. Social Psychological and Personality Science, 2(6), 673-678.
Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84(3), 231.
Niedenthal, P. M. (2007). Embodying emotion. Science, 316(5827), 1002-1005.
Noah, T., Schul, Y., & Mayo, R. (2018). When both the original study and its failed replication are correct: Feeling observed eliminates the facial-feedback effect. Journal of Personality and Social Psychology, 114(5), 657.
Ong, D. C., Zaki, J., Perry, A., & Ong, D. C. (2018). Mu rhythm suppression over sensorimotor regions is associated with greater empathic accuracy. Unpublished manuscript.
Oberman, L. M., Hubbard, E. M., McCleery, J. P., Altschuler, E. L., Ramachandran, V. S., & Pineda, J. A. (2005). EEG evidence for mirror neuron dysfunction in autism spectrum disorders. Cognitive Brain Research, 24(2), 190-198.
Perry, A., Bentin, S., Bartal, I. B. A., Lamm, C., & Decety, J. (2010). "Feeling" the pain of those who are different from us: Modulation of EEG in the mu/alpha range. Cognitive, Affective, & Behavioral Neuroscience, 10(4), 493-504.
Pineda, J. A., Allison, B. Z., & Vankov, A. (2000). The effects of self-movement, observation, and imagination on mu rhythms and readiness potentials (RPs): Toward a brain-computer interface (BCI). IEEE Transactions on Rehabilitation Engineering, 8(2), 219-222.
Preston, S. D., & De Waal, F. B. (2002). Empathy: Its ultimate and proximate bases. Behavioral and Brain Sciences, 25(1), 1-20.


Purvis, J. A., Dabbs Jr., J. M., & Hopper, C. H. (1984). The "opener": Skilled user of facial expression and speech pattern. Personality and Social Psychology Bulletin, 10(1), 61-66.
Pineda, J. A., & Hecht, E. (2009). Mirroring and mu rhythm involvement in social cognition: Are there dissociable subcomponents of theory of mind? Biological Psychology, 80(3), 306-314.
Pineda, J. A. (2005). The functional significance of mu rhythms: Translating "seeing" and "hearing" into "doing". Brain Research Reviews, 50(1), 57-68.
Prochazkova, E., & Kret, M. E. (2017). Connecting minds and sharing emotions through mimicry: A neurocognitive model of emotional contagion. Neuroscience & Biobehavioral Reviews, 80, 99-114.
Reniers, R., Corcoran, R., Drake, R., Shryane, N., & Völlm, B. (2011). The QCAE: A Questionnaire of Cognitive and Affective Empathy. Journal of Personality Assessment, 93(1), 84-95.
Sel, A., Calvo-Merino, B., Tuettenberg, S., & Forster, B. (2015). When you smile, the world smiles at you: ERP evidence for self-expression effects on face processing. Social Cognitive and Affective Neuroscience, 10(10), 1316-1322.
Stinson, L., & Ickes, W. (1992). Empathic accuracy in the interactions of male friends versus male strangers. Journal of Personality and Social Psychology, 62(5), 787.
Tia, B., Saimpont, A., Paizis, C., Mourey, F., Fadiga, L., & Pozzo, T. (2011). Does observation of postural imbalance induce a postural reaction? PLoS ONE, 6(3), e17799.
The Stanford NLP Group. (n.d.). Retrieved June 4, 2019, from https://nlp.stanford.edu/software/
Woodruff, C. C., Martin, T., & Bilyk, N. (2011). Differences in self- and other-induced mu suppression are correlated with empathic abilities. Brain Research, 1405, 69-76.
Woodruff, C. C., Barbera, D., & Von Oepen, R. (2016). Task-related dissociation of EEG β enhancement and suppression. International Journal of Psychophysiology, 99, 18-23.
Wood, A., Rychlowska, M., Korb, S., & Niedenthal, P. (2016). Fashioning the face: Sensorimotor simulation contributes to facial expression recognition. Trends in Cognitive Sciences, 20(3), 227-240.
Wagenmakers, E. J., Beek, T., Dijkhoff, L., Gronau, Q. F., Acosta, A., Adams Jr., R. B., ... & Bulnes, L. C. (2016). Registered Replication Report: Strack, Martin, & Stepper (1988). Perspectives on Psychological Science, 11(6), 917-928.
Zaki, J., Bolger, N., & Ochsner, K. (2008). It takes two: The interpersonal nature of empathic accuracy. Psychological Science, 19(4), 399-404.
Zaki, J., Weber, J., Bolger, N., & Ochsner, K. (2009). The neural bases of empathic accuracy. Proceedings of the National Academy of Sciences, 106(27), 11382-11387.
Zaki, J., Bolger, N., & Ochsner, K. (2009). Unpacking the informational bases of empathic accuracy. Emotion, 9(4), 478.
Zaki, J., & Ochsner, K. N. (2012). The neuroscience of empathy: Progress, pitfalls and promise. Nature Neuroscience, 15(5), 675.
Zaki, J. (2014). Empathy: A motivated account. Psychological Bulletin, 140(6), 1608.


Appendix 1: Subjective Experience Assessment

Scale: Items were rated on scales from 1 (not at all) to 7 (very) and were averaged to form subscale composites (a brief computational sketch follows the item lists below).

Items were personalized by inserting the interaction partner's first name whenever an item referenced the partner.

Desire to affiliate

● How much did you want to get along with your partner?

● How much did you want to have a smooth interaction with your partner?

● How much did you like your partner?

Positive Evaluations of Interaction Partner:

● Based on your interactions with your partner overall, how pleasant did he or she seem?

● Based on your interactions with your partner overall, how warm did he or she seem?

● Based on your interactions with your partner overall, how likable did he or she seem?

● Based on your interactions with your partner overall, how natural did he or she seem?

● Based on your interactions with your partner overall, how confident did he or she seem?

● Based on your interactions with your partner overall, how attractive did he or she seem?

● Based on your interactions with your partner overall, how intelligent did he or she seem?
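The following is a minimal, hypothetical sketch of how such composites could be computed from the 1–7 item ratings; the column names (e.g., get_along, pleasant) are illustrative placeholders rather than the actual variable names used in the study.

import pandas as pd

# Illustrative item names for the two composites described above (assumed, not actual).
AFFILIATION_ITEMS = ["get_along", "smooth_interaction", "like_partner"]
EVALUATION_ITEMS = ["pleasant", "warm", "likable", "natural",
                    "confident", "attractive", "intelligent"]

def score_composites(ratings: pd.DataFrame) -> pd.DataFrame:
    """Average the 1-7 item ratings within each subscale for every participant."""
    return pd.DataFrame({
        "desire_to_affiliate": ratings[AFFILIATION_ITEMS].mean(axis=1),
        "positive_evaluation": ratings[EVALUATION_ITEMS].mean(axis=1),
    })

Applied to one row per participant, this simply averages each participant's item responses within a subscale, which is all the composite scoring described above requires.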


Appendix 2: Questionnaire of Cognitive and Affective Empathy (QCAE)

People differ in the way they feel in different situations. Below you are presented with a number of characteristics that may or may not apply to you. Read each characteristic and indicate how much you agree or disagree with the item by ticking the appropriate box. Answer quickly and honestly.

Strongly agree / Slightly agree / Slightly disagree / Strongly disagree

1. I sometimes find it difficult to see things from the ‘other guy’s’ point of view.

2. I am usually objective when I watch a film or play, and I don’t often get completely caught up in it.

3. I try to look at everybody’s side of a disagreement before I make a decision.

4. I sometimes try to understand my friends better by imagining how things look from their perspective.

5. When I am upset at someone, I usually try to ‘put myself in his shoes' for a while.

6. Before criticising somebody, I try to imagine how I would feel if I was in their place.

7. I often get emotionally involved with my friends’ problems.

8. I am inclined to get nervous when others around me seem to be nervous.

9. People I am with have a strong influence on my mood.

10. It affects me very much when one of my friends seems upset.

11. I often get deeply involved with the feelings of a character in a film, play or novel.

12. I get very upset when I see someone cry.

13. I am happy when I am with a cheerful group and sad when the others are glum.

14. It worries me when others are worrying and panicky.

15. I can easily tell if someone else wants to enter a conversation.


16. I can pick up quickly if someone says one thing but means another.

17. It is hard for me to see why some things upset people so much.

18. I find it easy to put myself in somebody else's shoes.

19. I am good at predicting how someone will feel.

20. I am quick to spot when someone in a group is feeling awkward or uncomfortable.

21. Other people tell me I am good at understanding how they are feeling and what they are thinking.

22. I can easily tell if someone else is interested or bored with what I am saying.

23. Friends talk to me about their problems as they say that I am very understanding.

24. I can sense if I am intruding, even if the other person does not tell me.

25. I can easily work out what another person might want to talk about.


Appendix 3: FACES Coding Sheet

FACES Coding Sheet

Participant: _____   Video #: _____   Rater: _____   (Page 1)

(If you are not sure whether the expression was positive or negative, please look at the Positive and Negative Affect Descriptors on page 2.)

Time start: _____  End: _____  Valence: Positive / Negative  Intensity: 1 (Low)  2 (Medium)  3 (High)  4 (Very High)

Time start: _____  End: _____  Valence: Positive / Negative  Intensity: 1 (Low)  2 (Medium)  3 (High)  4 (Very High)

Time start: _____  End: _____  Valence: Positive / Negative  Intensity: 1 (Low)  2 (Medium)  3 (High)  4 (Very High)

Time start: _____  End: _____  Valence: Positive / Negative  Intensity: 1 (Low)  2 (Medium)  3 (High)  4 (Very High)

Time start: _____  End: _____  Valence: Positive / Negative  Intensity: 1 (Low)  2 (Medium)  3 (High)  4 (Very High)


LIKERT FORMAT SUMMARY SHEET (Page 2): FACES Summary Sheet

Participant: _____   Video #: _____   Rater: _____

What is the overall level of expressiveness for this person for this film?
1 (Low)   2 (Fairly low)   3 (Medium)   4 (Fairly high)   5 (High)

Total number of positive expressions: _____
Total number of negative expressions: _____
Total duration of positive expressions: _____
Total duration of negative expressions: _____
Total intensity of positive expressions: _____
Total intensity of negative expressions: _____

Other notes (such as body movements, nodding): _____

*Positive and Negative Affect Descriptors
Positive: Happy, Delighted, Glad, Amused, Pleased, Content, Satisfied, Calm, Serene, Excited, Astonished, Cheerful, Surprised, Active, Content
Negative: Miserable, Distressed, Annoyed, Jittery, Nervous, Angry, Gloomy, Anxious, Afraid, Tense, Alarmed, Frustrated, Disgusted, Depressed, Hostile

*Facial tics (should not be counted): Occasionally, a participant may repeatedly display facial movements that do not appear to be expressions of emotion and are instead facial tics.

*Detecting an expression: While viewing a participant's record, an expression is detected if the face changes from a neutral display to a non-neutral display and then back to a neutral display.


Appendix 4: Instructions for Using FACES

(Adapted from Kring, A. M., & Sloan, D. (1991). The facial expression coding system (FACES): A user's guide. Unpublished manuscript.)

1. Detecting an expression: When you see any change in the face from a neutral display to a non-neutral display, begin to fill in the following information for this expression.

2. Start time: Write down the time (e.g., 01:21) when the expression starts.

3. Valence: Rate the valence (positive or negative) of the expression.

4. Intensity: Write down the intensity of the detected expression on a 4-point Likert scale (1 = low, 4 = very high).

5. End time: Write down the time (e.g., 01:23) when the expression ends, then write down the duration (duration = end time of the expression minus its start time).

After coding each video, complete the summary (a computational sketch follows at the end of this appendix):

1. Frequency: Count the number of positive and negative expressions and record these on the summary form.

2. Duration: Compute the total duration of expressions by adding together the durations (in seconds) of the positive and the negative expressions (computed separately) and record them on the summary form.

3. Intensity: Add the positive intensity ratings together, and then add all the negative intensity ratings together.

4. Rate the degree to which the participant expressed the emotions listed on the summary form.

Detecting an additional expression: If, after a participant displays a shift from a neutral to a non-neutral display, the participant shows a clear change in affective expression instead of returning to a neutral display, this change is counted as an additional expression. For example, if, while smiling, a participant raises his or her eyebrows and stops smiling, indicating more of a surprised look, two expressions are coded.

Valence: If there is uncertainty as to whether the expression is positive or negative, consult the Positive and Negative Affect Descriptors on the summary sheet.

Intensity: Intensity ratings for an individual expression range from one to four (1 = low, 2 = medium, 3 = high, 4 = very high). The low rating is given for those expressions that are mild, such as a smile where a participant slightly raises the corners of his/her mouth but does not show the teeth, and very little movement around the eyes occurs. The medium rating is given for those expressions that are more moderate than mild in intensity, such as a smile bordering on a laugh, with the eyebrows slightly raised and the lips apart, exposing the teeth. The high rating is given for an expression that involves most, if not all, of the face, such as laughing with an open mouth and raising the eyebrows and cheeks. The very high rating is reserved for those expressions that are very intense. An example of such an expression is one where a participant is undeniably laughing, with the mouth completely open and the eyebrows and cheeks substantially raised.
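The following is a minimal, hypothetical sketch of how the summary statistics described above could be tabulated from a coder's expression records; the data structure and field names are illustrative assumptions, not the actual coding sheets or scripts used in the study.

from dataclasses import dataclass

@dataclass
class Expression:
    start: str      # time code when the expression begins, e.g. "01:21"
    end: str        # time code when the expression ends, e.g. "01:23"
    valence: str    # "positive" or "negative"
    intensity: int  # 1 = low, 2 = medium, 3 = high, 4 = very high

def to_seconds(timestamp: str) -> int:
    """Convert an mm:ss time code into seconds."""
    minutes, seconds = timestamp.split(":")
    return int(minutes) * 60 + int(seconds)

def summarize(expressions):
    """Tally frequency, total duration, and summed intensity per valence."""
    summary = {}
    for valence in ("positive", "negative"):
        coded = [e for e in expressions if e.valence == valence]
        summary[valence] = {
            "frequency": len(coded),
            "duration": sum(to_seconds(e.end) - to_seconds(e.start) for e in coded),
            "intensity": sum(e.intensity for e in coded),
        }
    return summary

Run over the expressions coded for one video, this yields the frequency, total duration (in seconds), and summed intensity for positive and negative expressions separately, matching the fields on the summary sheet.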
