Multimodal Emotion Processing in Autism Spectrum Disorders
Multimodal emotion processing in autism spectrum disorders: An event-related potential study

Matthew D. Lerner a,b,∗, James C. McPartland c, James P. Morris a

a Department of Psychology, University of Virginia, 102 Gilmer Hall, P.O. Box 400400, Charlottesville, VA 22904-4400, United States
b Department of Psychiatry and Behavioral Neuroscience, University of Chicago, 5841 S. Maryland Avenue, MC 3077, Chicago, IL 60637, United States
c Child Study Center, Yale University, 230 South Frontage Road, New Haven, CT 06520, United States
∗ Corresponding author. E-mail addresses: [email protected], [email protected] (M.D. Lerner), [email protected] (J.C. McPartland), [email protected] (J.P. Morris).

Developmental Cognitive Neuroscience 3 (2013) 11–21. http://dx.doi.org/10.1016/j.dcn.2012.08.005

Article history: Received 14 April 2012; Received in revised form 17 August 2012; Accepted 22 August 2012

Keywords: Autism spectrum disorder; ERP; N170; Emotion processing; Emotion recognition; Social cognition

Abstract

This study sought to describe heterogeneity in emotion processing in autism spectrum disorders (ASD) via electrophysiological markers of perceptual and cognitive processes that underpin emotion recognition across perceptual modalities. Behavioral and neural indicators of emotion processing were collected, as event-related potentials (ERPs) were recorded while youth with ASD completed a standardized facial and vocal emotion identification task. Children with ASD exhibited impaired emotion recognition performance for adult faces and child voices, with a subgroup displaying intact recognition. Latencies of early perceptual ERP components, marking social information processing speed, and amplitudes of subsequent components reflecting emotion evaluation, each correlated across modalities. Social information processing speed correlated with emotion recognition performance, and predicted membership in a subgroup with intact adult vocal emotion recognition. Results indicate that the essential multimodality of emotion recognition in individuals with ASDs may derive from early social information processing speed, despite heterogeneous behavioral performance; this process represents a novel social-emotional intervention target for ASD.

© 2012 Elsevier Ltd. All rights reserved.

Accurate recognition of affect is a requisite skill for adaptive social functioning and a noted domain of impairment in individuals with autism spectrum disorder (ASD; Baker et al., 2010; Philip et al., 2010; Rump et al., 2009). Deficits in identification of emotional content in faces (Rump et al., 2009) and voices (e.g. prosody; Baker et al., 2010) are well-documented in this population and are posited to represent a core deficit (Philip et al., 2010). Nevertheless, some research reveals subgroups of individuals with ASD who exhibit relatively preserved emotion recognition ability (Bal et al., 2010; Harms et al., 2010; O'Connor et al., 2005), highlighting the under-studied topic of heterogeneity within ASD. Despite variable presentation and a limited understanding of its nature and course, emotion decoding represents a common intervention target (Beauchamp and Anderson, 2010; Lerner et al., 2011; Solomon et al., 2004). A critical goal for research is elucidating the sensory and cognitive processes (i.e. mechanisms) underlying emotion perception (i.e. attending to, noticing, decoding, and distinguishing nonverbal emotional inputs) in ASD; this will aid in identifying meaningful subgroups and concretizing appropriate treatment targets.
1. Multimodality of emotion perception

Emotion recognition in humans is often hypothesized to emerge from a unified, multimodal emotion processing ability (Borod et al., 2000) – that is, a latent capacity to process emotions in multiple domains. The vast majority of studies of emotion perception examine faces, often even asserting that "basic to processing emotions is the processing of faces themselves" (Batty et al., 2011, p. 432). Crucially, this claim rests on the assumption of multimodal emotion processing; that is, the utility of facial emotional processing for informing understanding of general emotion perception lies in the notion that emotional face processing is an emergent process stemming from a more generalized latent emotion processing ability. Thus, assessment of the multimodality of emotion processing fills an essential gap in the literature linking emotional face processing to emotion processing generally.

In ASD populations, it is not known whether individuals exhibit multimodal deficits in emotion perception, and recent behavioral studies assessing the presence of a unified multimodal emotion recognition deficit have yielded inconsistent results. Often, emotion perception deficits in ASD have been hypothesized to reflect a generalized, rather than sensory modality-specific, deficit in social perception (Beauchamp and Anderson, 2010; Jones et al., 2010; Philip et al., 2010). This account characterizes emotion perception deficits in different domains as a common developmental consequence of impairments in more basic social perceptual abilities (Batty et al., 2011). Therefore, difficulties with emotion perception should be evident in multiple sensory domains, or multimodal. In consonance with this model, Philip et al. (2010) found substantial emotion recognition deficits in faces, voices, and bodies among adults with ASD.

In contrast, some research contradicts this account. Jones et al. (2010) found no fundamental difficulty with facial or vocal (multimodal) emotion recognition, and Rump et al. (2009) found intact basic emotion recognition, in adolescents with ASD. These results suggest an account in which putative "deficits" may simply reflect discrete modality-specific perceptual abnormalities, or instances of heterogeneity in social perception among youth with ASDs. However, studies considering the question of multimodal emotion perception deficits in ASD have primarily employed standardized and unstandardized behavioral tasks, which limit generalizability and do not address differences in underlying processes (Harms et al., 2010; Jones et al., 2010). Overall, it is unresolved whether observed deficits in emotion perception in ASD are modality-specific or reflect a generalized emotion processing difficulty.

In this study, we addressed this question directly by examining lower-level perceptual and cognitive processes across sensory modalities during emotion decoding within a sample of youth with ASD. Researchers have used methods such as functional magnetic resonance imaging (fMRI; Wicker et al., 2008) and EEG/ERP (Wong et al., 2008) to address processing of emotional information even when behavioral performance appears unimpaired. While ERP measurement involves its own set of limitations (e.g. movement artifacts, requirement of a relatively large number of trials, highly controlled environment), it provides a powerful method for isolating and quantifying cognitive processes that may underlie perceptual processes (Nelson and McCleery, 2008). Notably, its high temporal resolution allows for description of both the speed/efficiency (latency) and the salience/intensity (amplitude) of such processes at discrete stages representing distinct cognitive events. We employed ERPs recorded during a normed facial and vocal emotion identification task to assess the presence and heterogeneity of a multimodal emotion processing abnormality in youth with ASD, and, if present, to specify its impact on emotion recognition.
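To make the latency/amplitude distinction concrete, the short sketch below illustrates how the peak latency (speed) and peak amplitude (intensity) of a negative-going ERP component might be quantified from an averaged waveform. This is an illustrative example only, not the pipeline used in this study; the synthetic electrode data, 500 Hz sampling rate, and 130-200 ms search window are assumptions chosen for demonstration.

    import numpy as np

    def peak_latency_amplitude(erp, times, window=(0.13, 0.20)):
        """Peak latency (s) and amplitude of a negative-going component.

        erp    : 1-D array of averaged voltages for one electrode (e.g. microvolts)
        times  : 1-D array of sample times in seconds, same length as erp
        window : search window in seconds; 130-200 ms is an assumed, typical
                 window for an early perceptual component
        """
        in_window = np.where((times >= window[0]) & (times <= window[1]))[0]
        peak = in_window[np.argmin(erp[in_window])]  # most negative sample = peak
        return times[peak], erp[peak]

    # Demonstration with synthetic data: 500 Hz sampling, -100 to 500 ms epoch,
    # with an artificial negativity injected around 150 ms.
    rng = np.random.default_rng(0)
    times = np.arange(-0.1, 0.5, 1 / 500)
    erp = rng.normal(0.0, 0.5, times.size)
    erp[(times > 0.14) & (times < 0.18)] -= 4.0
    latency, amplitude = peak_latency_amplitude(erp, times)
    print(f"peak latency: {latency * 1000:.0f} ms, peak amplitude: {amplitude:.1f} uV")

In this framing, a longer peak latency indexes slower processing at that stage, while a larger (more negative) peak amplitude indexes a stronger neural response.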
2. Neural correlates of visual (facial) emotion perception

ERPs have frequently been used to examine facial emotion processing in typically developing (TD) populations, with two components yielding highly consistent evidence. The N170 has been shown to mark early perceptual processes involved in recognition and identification of configural information in faces (Bentin et al., 1996; Nelson and McCleery, 2008) and has been shown to respond to emotional expressions (Blau et al., 2007); however, see Eimer et al. (2003) for a contrasting account. Neural generators of the N170 have been localized to occipitotemporal sites critical for social perception, including the fusiform gyrus (Shibata et al., 2002) and superior temporal sulcus (Itier and Taylor, 2004).

Meanwhile, components of processing subsequent to face structural encoding have been shown to be consistently reflective of emotional valence. Specifically, the fronto-central N250 (Luo et al., 2010) is evoked by observation of an emotionally expressive face (Balconi and Pozzoli, 2008; Carretie et al., 2001; Streit et al., 2001; Wynn et al., 2008). The N250 is considered to mark higher-order face processing such as affect decoding, and is presumed to reflect the modulatory influence of subcortical structures, including the amygdala (Streit et al., 1999). Overall, while other components have sometimes been shown to be responsive to the presence of faces (e.g. the P1; Hileman et al., 2011), these two ERP components can be seen to reflect the earliest structural (N170) and evaluative (N250) processes