Affective Computing, Emotional Development, and Autism
OUP UNCORRECTED PROOF – FIRSTPROOFS, Thu Jul 17 2014, NEWGEN

CHAPTER 39
Affective Computing, Emotional Development, and Autism
Messinger, Duvivier, Warren, Mahoor, Baker, Warlaumont, Ruvolo

Abstract

Key Words:

Introduction

Affective Computing and Child Development

Children's development is a fertile application of affective computing. The nonverbal emotional communication of children and infants may be less impacted by social display rules than the communication of older individuals, thus offering a rich environment for the automated detection and modeling of emotion. Substantively, early dyadic interaction between infants and parents offers a model for understanding the underpinnings of nonverbal communication throughout the lifespan. These interactions, for example, may lay the basis for the development of turn-taking and mutual smiling that are fundamental to later nonverbal communication (Messinger, Ruvolo, Ekas, & Fogel, 2010). At the same time, the child's development affects the adult he or she will become. Interventions based in affective computing that help children develop optimally have the potential to benefit society in the long term. Throughout, whenever appropriate, we discuss how the reviewed studies of detection and modeling of emotions have contributed to our understanding of emotional development in children with ASD.

Affective Computing and the Development of Autism Spectrum Disorders

Disordered development can provide insights into typical development. This chapter discusses the detection and modeling of emotion—and the application of interventions grounded in affective computing—in children with autism spectrum disorders (ASDs) and their high-risk siblings. Autism spectrum disorders are pervasive disorders of social communication and impact a broad range of nonverbal (as well as verbal) interactive skills (American Psychiatric Association, 2000). Because the symptoms of these developmental disorders emerge before 3 years of age, ASDs provide a window into early disturbances of nonverbal social interaction. In addition, the younger siblings of children with an ASD—high-risk siblings—can offer a prospective view of the development of ASDs and related symptoms. Approximately one-fifth of these ASD siblings will develop an ASD, and another fifth will exhibit ASD-related symptoms by 3 years of age that are below the threshold for a clinical diagnosis (Boelte & Poustka, 2003; Bolton, Pickles, Murphy, & Rutter, 1998; Constantino et al., 2006; Messinger et al., 2013; Murphy et al., 2000; Ozonoff et al., 2011; Szatmari et al., 2000; Wassink, Brzustowicz, Bartlett, & Szatmari, 2004). Automated measurement and modeling often focus on high-risk siblings to provide objective data on the development of ASD-related symptoms.

Chapter Overview

In a developmental context, affective computing involves the use of computer software to detect behavioral signs of emotions, to model emotional functioning and communication, and to construct software and hardware agents that interact with children. The chapter begins with a review of automated measurement of facial action and the application of those measures to better understand early emotion expression. Emotional communication is complex, and the chapter then reviews time-series and machine-learning approaches to modeling emotional communication in early interaction, which includes comparisons between typically developing children and children with ASDs. Next, we review automated approaches to emotion detection—and to the identification of ASDs—from children's vocalizations, and we discuss efforts to model the vocal signal using graph-based and time-series approaches. The final measurement section reviews new approaches to the collection of electrophysiological data (electrodermal activation [EDA]), focusing on efforts in children with ASD. Finally, we review translational applications of affective computing in two areas that have shown promise in helping children with ASD develop skills in the areas of emotional development and social communication: embodied conversational agents (ECAs) and robotics. The chapter ends with a critical discussion of accomplishments and opportunities for advancement in affective computing efforts with children.

Automated Measurement of Emotional Behavior

Automated Facial Measurement

The face is central to the communication of emotion from infancy through old age. However, manual measurement of facial expression is laborious and resource-intensive (Cohn & Kanade, 2007). As a consequence, much more is known about the perception of facial expressions than about the production of facial expressions. Software-based automated measurement offers the possibility of efficient, objective portraits of facial expression and emotion communication. Here, we describe a methodological framework for the automated measurement of facial expression in infants and their parents during early interaction.

A growing body of research on infant–parent interaction uses automated measurement based on the facial action coding system (FACS) (Ekman & Friesen, 1992; Ekman, Friesen, & Hager, 2002) and its application to infants (BabyFACS) (Oster, 2006). FACS is a comprehensive manual system for recording anatomically based appearance changes in the form of facial action units (AUs; Lucey, Ashraf, & Cohn, 2007). To better understand the dynamics of expression and emotional communication, the strength of key AUs is measured using an intensity metric that specifies whether a facial action is present and, if present, its strength from minimal to maximal using FACS criteria (Mahoor et al., 2008). Objective measurement of facial expression intensity allows for time-series modeling of interactive influence.

A commonly used automated measurement pipeline combines active appearance and shape models (AASMs) and support vector machines (SVMs) (Messinger et al., 2012). Active appearance and shape models are used to detect and track facial movement (see Figure 39.1). The shape component of the AASM unites the two-dimensional representations of the movement of 66 vertices (Baker, Matthews, & Schneider, 2004; Cohn & Kanade, 2007). Mouth opening, for example, can be measured as the vertical distance between the upper and lower lips in the shape component of the AASM. The appearance component of the AASM contains the grayscale values for each pixel contained in the modeled face; appearance is the grayscale texture within the region defined by the mesh. In the research reported here, nonlinear manifold learning (Belkin & Niyogi, 2003) was used to reduce the dimensionality of the appearance and shape data to produce a set of variables that are used to train SVMs. Support vector machines are machine-learning classifiers that were used to determine whether the AU in question was present and, if present, its intensity level. To make this assignment, a one-against-one classification strategy was used in which each intensity level was pitted against each of the others (Chang & Lin, 2001; Mahoor et al., 2008).

Fig. 39.1 Facial measurement. [Pipeline panels: input video with tracking; affine warp; feature extraction (SIFT); AU detection (e.g., AU 12, intensity 0.00).]

Emotion Measurement via Continuous Ratings

Here, we describe a method for collecting continuous ratings of emotion constructs in time that can be modeled in their own right and used to validate automated measurements of emotional behavior. In automated facial expression measurement, levels of cross-system (automated vs. expert manual measurement of facial actions) reliability are typically comparable to standard interobserver (manual vs. manual) reliability. However, intersystem agreement speaks to the validity of the automated measurements but not to the emotional meaning of the underlying behaviors. One approach to validating automated measurements of the face as indices of emotion intensity is continuous ratings made by third-party observers (http://measurement.psy.miami.edu/). Continuous emotion measurement is similar to the affect rating dial, in which participants in an emotional experience can provide a continuous report on their own affective state (Gottman & Levenson, 1985; Levenson & Gottman, 1983; Ruef & Levenson, 2007). In the research described here, however, continuous ratings were made by observers who moved a joystick to indicate the affective valence they perceived in an interacting infant or parent. The ratings of multiple independent observers were united into a mean index of perceived emotional valence (Waldinger, Schulz, Hauser, Allen, & Crowell, 2004). Continuous nonexpert ratings have strong face validity because they reflect a precise, easily interpretable description of a construct such as positive ("joy, happiness, and pleasure") or negative emotion ("anger, sadness, and distress").

Applying Automated and Other Measurement to Early Emotion Expression

THE CASE OF SMILING

Automated measurement of the intensity of smiling has yielded insights into early positive emotion. Although infant smiles occur frequently in social interactions and appear to index positive emotion, adult smiles occur in a range of contexts, not all of which are associated with positive emotion. This has led some investigators to propose that a particular type of smiling, Duchenne smiling, is uniquely associated with the expression of positive
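The shape-based measure described earlier—mouth opening as the vertical distance between upper- and lower-lip points in the AASM shape component—reduces to a simple computation over tracked landmarks. The following is a minimal sketch, not the authors' implementation; the landmark indices and coordinates are hypothetical and do not reflect the actual 66-vertex mesh numbering:

```python
# Sketch: deriving an expression measure from tracked shape points.
# Mouth opening is taken as the vertical distance between a hypothetical
# upper-lip landmark (51) and lower-lip landmark (57); y increases downward.
def mouth_opening(landmarks, upper_lip=51, lower_lip=57):
    """landmarks: {index: (x, y)} for one video frame -> opening in pixels."""
    return landmarks[lower_lip][1] - landmarks[upper_lip][1]

frame = {51: (120.0, 200.0), 57: (121.0, 216.5)}
print(mouth_opening(frame))  # → 16.5
```

Computed per frame over a tracked video, this yields the kind of continuous shape signal that supports the time-series modeling discussed above.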
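The one-against-one strategy used above to assign AU intensity levels—each level pitted against each of the others, with the most-voted level winning—can be illustrated with toy stand-in classifiers. This is a sketch of the voting scheme only, not the chapter's pipeline: the pairwise "classifiers" here are nearest-centroid rules rather than SVMs, and the feature data are invented:

```python
# Sketch of one-against-one intensity classification: one binary model per
# pair of intensity levels, combined by majority vote.
from itertools import combinations
from statistics import mean

def train_pairwise(data):
    """data: {intensity_level: [feature_vectors]} -> pairwise centroid models."""
    centroids = {lvl: [mean(dim) for dim in zip(*vecs)] for lvl, vecs in data.items()}
    return {(a, b): (centroids[a], centroids[b]) for a, b in combinations(sorted(data), 2)}

def dist(u, v):
    return sum((x - y) ** 2 for x, y in zip(u, v))

def classify(models, x):
    """Each (a, b) model votes for one level; the level with most votes wins."""
    votes = {}
    for (a, b), (ca, cb) in models.items():
        winner = a if dist(x, ca) <= dist(x, cb) else b
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)

# Toy AU-intensity training data: level 0 = action absent, higher = stronger.
data = {0: [[0.0, 0.1], [0.1, 0.0]],
        2: [[1.0, 1.1], [1.1, 0.9]],
        4: [[2.0, 2.1], [2.1, 1.9]]}
models = train_pairwise(data)
print(classify(models, [1.05, 1.0]))  # → 2
```

With k intensity levels this trains k(k-1)/2 pairwise models, which is the structure of the strategy cited above (Chang & Lin, 2001).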
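Uniting the continuous joystick ratings of multiple independent observers into a mean index of perceived valence, as described in the continuous-ratings section above, amounts to averaging time-aligned series. A minimal sketch with hypothetical ratings on an assumed -1 (negative) to +1 (positive) scale:

```python
# Sketch: combine per-observer continuous valence ratings, sampled on a
# shared time base, into one mean valence time series.
def mean_valence(ratings):
    """ratings: list of equal-length per-observer series -> mean series."""
    if len({len(r) for r in ratings}) != 1:
        raise ValueError("observers must be rated on a common time base")
    return [sum(vals) / len(vals) for vals in zip(*ratings)]

observer_a = [0.1, 0.4, 0.8, 0.5]
observer_b = [0.3, 0.4, 0.6, 0.7]
observer_c = [0.2, 0.1, 0.7, 0.6]
print([round(v, 2) for v in mean_valence([observer_a, observer_b, observer_c])])
# → [0.2, 0.3, 0.7, 0.6]
```

In practice the raw joystick streams would first be resampled to a common rate; the averaging step is what produces the single index modeled in the studies reviewed above.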