FACIAL EMOTION RECOGNITION IN GENERALIZED ANXIETY DISORDER AND MAJOR DEPRESSION: ASSESSING FOR UNIQUE AND COMMON RESPONSES TO EMOTIONS AND NEUTRALITY

A dissertation submitted to Kent State University in partial fulfillment of the requirements for the degree of Doctor of Philosophy

by
Eftihia Linardatos
Fall 2011

Dissertation written by
Eftihia Linardatos
B.S., University of Oregon, 2002
M.A., Kent State University, 2008
Ph.D., Kent State University, 2011

Approved by
David M. Fresco, Chair, Doctoral Dissertation Committee
John Gunstad, Member, Doctoral Dissertation Committee
John A. Updegraff, Member, Doctoral Dissertation Committee
David L. Hussey, Member, Doctoral Dissertation Committee
William Kalkhoff, Member, Doctoral Dissertation Committee

Accepted by
Maria S. Zaragoza, Chair, Department of Psychology
Timothy S. Moerland, Dean, College of Arts and Sciences


TABLE OF CONTENTS

LIST OF TABLES

ACKNOWLEDGMENTS

INTRODUCTION

METHOD

RESULTS

DISCUSSION

REFERENCES


LIST OF TABLES

1. Table 1. Summary of Planned Nonorthogonal Contrast Coefficients

2. Table 2. Demographic, clinical, and treatment data for Major Depressive Disorder (MDD), Generalized Anxiety Disorder (GAD), comorbid MDD+GAD, and controls

3. Table 3. Tests of premorbid IQ (AMNART), visuospatial ability (BFRT), processing speed ability (DSC), and working memory (LNS) for participants with Generalized Anxiety Disorder (GAD), Major Depressive Disorder (MDD), comorbid MDD+GAD, and controls (N = 90)

4. Table 4. Contrasts of mean correct responses (%) to facial emotion expressions endorsed by participants with Generalized Anxiety Disorder (GAD), Major Depressive Disorder (MDD), comorbid MDD+GAD, and controls as a function of valence of expression

5. Table 5. Contrasts of mean error responses (%) to facial emotion expressions endorsed by participants with Generalized Anxiety Disorder (GAD) and controls as a function of valence of expression

6. Table 6. Contrasts of mean sad error responses (%) to facial emotion expressions endorsed by participants with Major Depressive Disorder (MDD), comorbid MDD+GAD, and controls as a function of valence of expression

7. Table 7. Hierarchical Linear Regression predicting interpersonal functioning from diagnostic status (GAD, MDD) and facial emotion recognition


ACKNOWLEDGMENTS

First, I would like to express my deepest gratitude to my advisor, Dr. David Fresco, for his guidance, support, and endless encouragement not only during the dissertation process, but throughout my graduate training. I also want to extend my gratitude to my committee members for their helpful feedback and insightful comments.

Second, I would like to thank my trusted colleagues in the Fresco Lab for supporting me both professionally and personally. I must acknowledge my project coordinator, Ashley Garvin, for volunteering her time to collect the data for this project. I would also like to thank Jenn and Thao for their friendship, extraordinary thoughtfulness, and often much needed distraction over the last year. Dr. Terry Takahashi, Dr. Julie Nelligan, and Robert Tell, thank you for encouraging me to go to graduate school. Dr. John Akamatsu, thank you for your support and guidance throughout my graduate training.

Third, I am eternally grateful to my parents and brothers, Andrew and Napoleon, for their love, support, and generosity. They inspired me to pursue my goals, work hard, and achieve them.

And lastly, my sincere appreciation goes to the study participants who were willing to share their experiences and make this research project possible.

Myrtos, Kefalonia


INTRODUCTION

Emotions have a central role in human experience, and their importance has been recognized since at least the 4th century BCE. For Aristotle in the Art of Rhetoric (1991), emotions play a functional role in that they give us the capacity to take action in response to external and internal stimuli. In more recent years, Darwin (1872/1965) proposed that expressions of emotion serve a communicative function and are important for human well-being. Indeed, accurate facial recognition of emotions has been shown to be central to effective social communication and adaptive behavior (Ekman, 1992). Consequently, facial emotion recognition may play a central role in emotional disorders.

Interpersonal functioning in GAD and MDD: Does facial emotion recognition play a role?

Generalized anxiety disorder (GAD) and major depressive disorder (MDD) are prevalent, chronic, and potentially debilitating emotional disorders. Both GAD (e.g., Borkovec, Newman, Pincus, & Lytle, 2002) and depression (e.g., Fredman, Weissman, Leaf, & Bruce, 1988) have been linked to difficulties in social and interpersonal functioning, raising the question as to whether these conditions are also associated with deficits in facial emotion recognition. Specifically, GAD has been associated with significant functional impairment and low life satisfaction (e.g., Wittchen, Carter, Pfister, Montgomery, & Kessler, 2000). In the interpersonal realm, individuals with GAD endorse more worry themes concerning interpersonal relationships than individuals with other anxiety disorders (e.g., Breitholtz, Johansson, & Öst, 1999) and nonanxious controls (e.g., Craske, Rapee, Jackel, & Barlow, 1989). Further, persons with GAD report interpersonal lifestyles characterized by more problems with nonassertiveness, over-accommodation, self-sacrificing behaviors, and intrusiveness or neediness compared to healthy controls (e.g., Eng & Heimberg, 2006). Finally, GAD has been associated with marital dissatisfaction (Whisman, Sheldon, & Goering, 2000), separation and divorce (Hunt, Issakidis, & Andrews, 2002), and low satisfaction in relationships with friends and relatives (Henning, Turk, Mennin, Fresco, & Heimberg, 2007). Despite speculation that facial emotion recognition impairments may underlie these interpersonal difficulties, there are no published studies to date that have examined the process of facial emotion recognition in GAD.

Individuals with MDD also report poor quality of life and impairment in their overall functioning, with difficulties in social and interpersonal relationships contributing considerably to these deficits (e.g., Pyne et al., 1997; Rapaport, Clary, Fayyad, & Endicott, 2005). More specifically, individuals with depression report less supportive social networks (Billings, Cronkite, & Moos, 1983), more negative interactions with others (Keitner & Miller, 1990), and greater discord in intimate relationships as compared to healthy controls (Zlotnick, Kohn, Keitner, & Della Grotta, 2000). With regard to intimate relationships, couples with depressed partners have also been found to exhibit increased expressions of dysphoria and lower levels of constructive problem solving, mutual self-disclosure, and reciprocal support compared with their nondepressed counterparts (e.g., Barnett & Gotlib, 1988). Additionally, individuals with depression report having less contact with relatives and are less satisfied in their relationships with friends and significant others (Fredman, Weissman, Leaf, & Bruce, 1988). Moreover, interpersonal problems in depression involving deficits in social skills (e.g., inappropriate negative self-disclosure, low assertiveness; Segrin, 2000) are associated with negative mood (e.g., Coyne, 1976) and interpersonal rejection (e.g., Joiner, Alfano, & Metalsky, 1992).

Although previous studies have examined the role of facial emotion recognition in MDD, the findings have been inconsistent. Specifically, a considerable number of studies has supported the association between depression and facial emotion recognition deficits (e.g., Cooley & Nowicki, 1989; Carton et al., 1999; Csukly, Czobor, Szily, Takács, & Simon, 2009; Feinberg, Rifkin, Schaffer, & Walker, 1986; Langenecker, Bieliauskas, Rapport, Zubieta, Wilde, et al., 2005; Persad & Polivy, 1993), but others have failed to find such a relationship (e.g., Bediou, Krolak-Salmon, Saoud, Henaff, Burt, et al., 2005; Gaebel & Wölwer, 1992; Joormann & Gotlib, 2006; Kan, Mimura, Kamijima, & Kawamura, 2009; Ridout et al., 2007). Therefore, the function of facial emotion recognition in MDD remains unclear.

Shared and unique features in GAD and MDD: Where does facial emotion recognition fall?

In addition to being associated with interpersonal difficulties, GAD and MDD overlap substantially in both clinical and community samples. Brown and colleagues (2001), for instance, found that two thirds of a clinical sample composed of individuals with MDD had a current additional diagnosis of GAD, whereas a third of those with a primary diagnosis of GAD also had major depressive disorder (Brown, Campbell, Lehman, Grisham, & Mancill, 2001). The comorbidity rates are higher for lifetime diagnoses, with two thirds of individuals with GAD having a lifetime history of MDD (Brown et al., 2001). The high comorbidity levels in GAD and MDD stem, in part, from shared features at the genotypic and phenotypic level. Specifically, twin studies have consistently shown that GAD and MDD share a common genetic diathesis (e.g., Roy, Neale, Pedersen, Mathé, & Kendler, 1995). Neuroticism, a personality trait associated with the tendency to experience negative emotional states, has also been found to be genetically associated with both GAD and MDD (e.g., for a review, see Brandes and Bienvenu, 2006).

At the phenotypic level, both GAD and MDD are associated with disturbances in the emotional processing of negatively valenced emotions. Specifically, GAD is characterized by excessive anxiety, whereas depression is characterized by depressed mood or reduced interest or pleasure in usual activities (American Psychiatric Association; APA, 1994). Further, anxiety is typically accompanied by worry, a repetitive thought process that serves an avoidant function in GAD (e.g., Borkovec, Alcaine, & Behar, 2004). Like individuals with GAD, individuals with MDD often engage in a repetitive thought pattern, often referred to as depressive rumination (e.g., Nolen-Hoeksema, 1998). Finally, GAD and MDD have four diagnostic criteria in common: restlessness, fatigue, difficulty concentrating, and sleep disturbance.

Because of the substantial overlap between GAD and MDD, some investigators (cf. Watson, O'Hara, & Stuart, 2008) have recently proposed that these disorders should be reclassified into a common category in the successor to the Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV; APA, 1994). However, including GAD in the same cluster as MDD may be premature if we take into consideration the unique features and processes associated with these mental health conditions. At the cognitive level, for instance, GAD and MDD are characterized by unique thought patterns that operate through attention, perception, memory, and interpretation (Beck, 1976). Clark and Beck (1989) formulated the cognitive specificity hypothesis, which posits that cognitive processing of emotion-congruent information is characterized by domains that are idiosyncratic to depression and anxiety. This hypothesis then predicts that individuals with anxiety tend to be preoccupied with threatening information, whereas depressed individuals are concerned with thoughts related to loss, failure, and sadness. Indeed, empirical findings suggest that depressed individuals report more negative thoughts involving past failure and loss (e.g., Clark, Beck, & Stewart, 1990). In contrast, anxious individuals report more thoughts related to anticipated harm and danger (e.g., Jolly, Dyck, Kramer, & Wherry, 1994). In addition to automatic thoughts and appraisals specific to GAD and MDD, cognitive biases, or selective processing of emotion-relevant information, have also been associated with these mental health conditions (e.g., Mineka, 1992; Mineka & Tomarken, 1989). Specifically, depression has been linked to both memory and attentional biases for negative information in general and sadness in particular. In contrast, individuals with anxiety tend to endorse attention biases related to threatening cues. The cognitive biases and automatic thoughts in anxiety and depression may interfere with the processing of facial expressions of emotion and, as a result, change the interpretation of the sensory input. Cognitive biases may be especially significant in the processing of neutral expressions, given that neutrality is associated with more ambiguity and uncertainty than expressions of emotion (e.g., anger, sadness, happiness). Therefore, exploring the process of facial emotion recognition in GAD, MDD, and comorbid MDD+GAD may be instrumental in identifying differential patterns of response to stimuli with emotional and neutral content. These response patterns may subsequently enhance our conceptualization of these two highly co-occurring mental health problems.

The Process of Facial Emotion Recognition

Emotions are physiological responses to external (e.g., people, events) and/or internal stimuli (e.g., other emotions) that motivate us to action (e.g., Izard, 1971, 1972). One of the primary roles of the human face is to convey these emotions. Subsequently, facial emotion recognition allows others to receive these emotions and hence facilitates human communication. Specifically, recognition is a process that is based on the perception of a visual stimulus, such as a facial expression of emotion. Although perception requires detection and processing of the features of the visual image, recognition requires that we also have prior knowledge of what the image represents (Adolphs, 2002). In other words, recognition entails that we remember earlier experiences of the particular stimulus. Remembering specific facial expressions of emotion then allows the person to discriminate among different expressions of emotion and interpret their meaning during recognition. It is not uncommon, however, that the interpretation of emotional facial expressions also draws on other nonverbal behaviors, such as body movements, gaze, and tone of voice.

Numerous behavioral and physiological studies have provided empirical support for the innate nature of emotion recognition. In cross-cultural studies of literate and preliterate societies, for example, Ekman and colleagues (1971, 1987) found that there are at least six emotions, known as basic emotions, which are reliably recognized universally. The basic emotions include happiness, sadness, fear, disgust, surprise, and anger and correspond to distinctive facial expressions (Ekman, 1992; 1999).

Facial Emotion Recognition: Physiological correlates

At the muscular level, the expression of the basic emotions involves a number of facial muscles, including the depressor anguli oris, orbicularis oris, orbicularis oculi, zygomaticus major, and frontalis. These core muscles exhibit high bilateral symmetry and appear to be present in all individuals, whereas other facial muscles vary in their occurrence and are often asymmetrical (Waller, Cray, & Burrows, 2008). Facial expressions of emotion can then be either voluntary or involuntary depending on whether the neural systems controlling the movement of these muscles are volitional or automatic. The primary motor cortex appears to be responsible for the regulation of voluntary facial expressions, whereas the insula and basal ganglia control involuntary facial expressions (Hopf, Muller-Forell, & Hopf, 1992).

In addition to controlling the movement of facial muscles, neuroanatomical structures are also responsible for the recognition of facial expressions of emotion. Functional imaging studies have identified the amygdala as the brain region associated with the recognition of most basic emotions, including anger, fear, happiness, and sadness (e.g., Blair, Morris, Frith, Perrett, & Dolan, 1999; Killgore & Yurgelun-Todd, 2004; Phillips, Drevets, Rauch, & Lane, 2003). The right amygdala has also been implicated in the recognition of sad facial expressions, whereas the facial expression of happiness also activates the bilateral fusiform gyri (e.g., Surguladze et al., 2005) and the basal ganglia (e.g., Phan, Wager, Taylor, & Liberzon, 2002). Further, the ventrolateral prefrontal cortex has been linked to the identification of expressions of anger (e.g., Blair & Cipolotti, 2000). Finally, the recognition of disgust appears to activate neural structures such as the anterior insula and basal ganglia (e.g., Calder, Keane, Manes, Antoun, & Young, 2000).

Facial Emotion Recognition: Behavioral correlates

Do facial expressions of emotion influence human behavior, and how?

Through the process of facial emotion recognition, the neural signals for the basic emotions translate into human behavior. The functional role of facial expressions of emotion is multifaceted; they not only convey information about our feelings, thoughts, and intentions, but they also allow us to regulate others' behavior (Salovey & Mayer, 1990). The expression of a smile, for example, serves multiple functions in social settings. It may express happiness, which is associated with the urge or action tendency of approach (Fridja, Kuipers, & Ter Schure, 1989). Besides expressing happiness, smiles are used to reduce conflict or tension (Ikuta, 1999) and to please others (Hecht & LaFrance, 1998). In contrast, the expression of sadness may communicate a sense of loss for something or someone and thus solicit social support (Huppert & Alley, 2004). Disgust may express that ingesting something may be harmful (e.g., Rozin, Haidt, & McCauley, 2000). Disgust may also correspond to moral judgment and thus indicate repulsion toward, and avoidance of, socially inappropriate relationships and behaviors (e.g., Haidt, Rozin, McCauley, & Imada, 1997). Although the expression of fear is associated with imminent threat and danger (Barlow, 2002), it can also be employed to decrease conflict or promote conciliatory responses (e.g., Marsh, Ambady, & Kleck, 2005). In contrast, surprise denotes that something unexpected or unanticipated has occurred. Finally, anger communicates that a standard of socially acceptable behavior has been violated (Averill, 1982) and thus signals a readiness for physical or symbolic attack (e.g., Schupp et al., 2004). As a result, anger is considered to be a threatening expression (e.g., Berkowitz, 1990).

Empirical findings also indicate that the expression of basic emotions, such as anger, fear, and happiness, can influence human behavior. Marsh and colleagues (2005), for instance, showed participants photographs of men and women expressing anger and fear and instructed them to either push a lever away from themselves (indicating avoidance) or pull it toward themselves (indicating approach). The facial expressions of anger facilitated avoidance-related behaviors in participants, whereas fear expressions facilitated approach behaviors. In the case of happiness, Winkielman, Berridge, and Wilbarger (2005) found that subliminal happy facial expressions increased the consumption of an unfamiliar beverage in thirsty participants. However, when the participants were exposed to facial expressions of anger, their beverage consumption decreased. The authors suggested that facial expressions of emotion interact in a dynamic way with other complex processes, such as motivation, to determine a person's behavior.


Does facial emotion recognition accuracy matter in human behavior?

Facial emotion recognition accuracy is a prerequisite not only for successful social interaction, but for a person's overall functioning as well. Indeed, accumulating empirical findings have shown that accurate recognition of facial affective signals facilitates social communication and interpersonal functioning, whereas inaccuracies in emotional processing are associated with reduced quality of life. Specifically, individuals with deficits in facial emotion recognition report having few close interpersonal relationships and tend to feel isolated and frustrated in their relationships (Carton, Kessler, & Pape, 1999). Furthermore, numerous psychiatric disorders, such as schizophrenia (e.g., Addington & Addington, 1998; Kohler, Bilker, Hagendoorn, Gur, & Gur, 2000), bipolar disorder (e.g., Ketter & Lembke, 2002), obsessive compulsive disorder (OCD; Corcoran, Woody, & Tolin, 2008), anorexia nervosa (e.g., Kucharska-Pietura, Nikolaou, Masiak, & Treasure, 2004), and alcoholism (e.g., Kornreich et al., 2001a; Kornreich et al., 2001b), have been associated with deficits in facial emotion recognition. In some of these disorders, interpersonal difficulties have been attributed, in part, to impairment in facial emotion recognition. In schizophrenia, for example, difficulties labeling facial expressions of affect are inversely related to social competence and functioning (e.g., Addington, Saeedi, & Addington, 2006). Similarly to the aforementioned mental health disorders, MDD has been linked to deficits in facial emotion recognition (e.g., Csukly et al., 2009; Persad & Polivy, 1993). However, GAD remains one of the few prevalent mental health disorders for which the role of facial emotion recognition has not yet been investigated.


How is facial emotion recognition accuracy assessed?

Behavioral paradigms have been instrumental in assessing a person's ability to accurately discriminate emotional expressions and in identifying deficits in facial emotion recognition. These paradigms typically involve a recognition task during which study participants are presented with a set of stimuli expressing one of the basic emotions (e.g., anger, sadness, happiness) in addition to neutral expressions. Although some studies have employed stimuli involving schematic drawings of facial expressions of affect in the past, a set of standardized photographs of female and/or male individuals depicting an emotion is more common in this area of research. These images can be either static unmorphed or static morphed faces. The static unmorphed faces are comprised of still photographs of emotional expressions. The static morphed faces, on the other hand, are comprised of still photographs that have been interpolated between each prototype (e.g., happy) and neutral expression to create a continuum of evoked low (0%; neutral) to high (100%; happy) intensity expressions of affect. Investigators can choose either to vary the intensity of the morphed stimuli within the experimental task or to present the stimuli at a set intensity (e.g., a 70% happy expression). The unmorphed and morphed faces are presented either on paper or electronically (e.g., on a computer), and study participants are instructed to identify and then classify the specific emotion. Further, a computerized task can involve a forced response option that requires participants to identify emotions in a predetermined time frame set by the experimenter. Alternatively, the computer task can be programmed to allow for the selection of emotional expressions without any time restrictions. In both cases, participants are instructed to respond to the stimulus as quickly and accurately as possible. Although sets of morphed and unmorphed stimuli have typically included black-and-white photographs of posed facial expressions of affect restricted in ethnicity and age, color images of individuals representing a greater age range and ethnic diversity have become more common over time. In recent years, researchers have also employed animated or dynamic images to examine the process of facial emotion recognition. Dynamic faces are typically comprised of video clips of natural human faces of male and female individuals who portray neutral expressions in addition to the basic emotions. Irrespective of the stimulus type, facial emotion recognition accuracy is measured as the percentage of facial expressions judged correctly.
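
To make the two quantitative conventions in this paragraph concrete, the following Python sketch illustrates (a) the intensity continuum used to create morphed stimuli and (b) the percent-correct accuracy score. It is a minimal illustration, not the software used in this study: production morphing packages also warp facial landmarks rather than simply cross-fading pixels, and the trial record format (dictionaries with "target" and "response" keys) is a hypothetical assumption.

```python
import numpy as np

# The seven response categories used in this paradigm.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def morph(neutral_img, prototype_img, intensity):
    """Interpolate between a neutral face (intensity 0.0) and a prototype
    expression (intensity 1.0), e.g., intensity=0.7 for a 70% happy face.
    NOTE: a plain pixel cross-fade, shown only to illustrate the 0-100%
    intensity parameter; real morphing also warps facial geometry."""
    return (1.0 - intensity) * np.asarray(neutral_img) + intensity * np.asarray(prototype_img)

def recognition_accuracy(trials):
    """Score a list of trial dicts (hypothetical {'target', 'response'} format)
    as the percentage of expressions judged correctly, overall and per emotion."""
    scores = {}
    for emotion in EMOTIONS:
        subset = [t for t in trials if t["target"] == emotion]
        if subset:
            correct = sum(t["response"] == t["target"] for t in subset)
            scores[emotion] = 100.0 * correct / len(subset)
    scores["overall"] = 100.0 * sum(t["response"] == t["target"] for t in trials) / len(trials)
    return scores
```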

GAD and facial emotion recognition: Behavioral, cognitive, and biological findings

Although GAD has been linked to interpersonal difficulties, no study to date has examined the process of facial emotion recognition in this disorder. One factor that may account for the lack of research in this area is the limited research devoted to GAD overall compared to other anxiety and mood disorders. For example, Dugas and colleagues (2010) have recently shown that research in GAD lags behind research on most other anxiety disorders, such as posttraumatic stress disorder (PTSD), social anxiety disorder, and panic disorder (Dugas, Anderson, Deschenes, & Donegan, 2010). Further, the authors indicated that research in GAD has focused more on treatment issues and less on the clinical features of the disorder and processes related to its cognitive, emotional, and biological characteristics. Finally, research in the area of emotion has only recently regained attention in the field of psychology (e.g., Gross, 1999), which may have also contributed to the scarcity of research concerning GAD and facial emotion recognition.

With regard to cognitive processing, research has shown that individuals with GAD endorse attentional biases for a range of emotional information, including negative or threatening words, threatening faces, and, less frequently, positive cues (for a review, see Mogg & Bradley, 2005). For example, Bradley and colleagues (1999) found that individuals with GAD showed greater vigilance for faces expressing anger, relative to neutral faces, than healthy controls (Bradley, Mogg, White, Groom, & de Bono, 1999). In a similar study, Mogg, Millar, and Bradley (2000) found that compared to nonanxious controls, individuals with GAD were more likely to look toward angry than neutral faces, and at a relatively faster rate. Further, GAD has been associated with interpretive biases in favor of threatening meanings. In an early study of interpretive bias, participants provided interpretations of ambiguous scenarios (Butler & Mathews, 1983). The findings revealed that participants with GAD endorsed more threatening interpretations than the nonanxious participants. In a different study, Mathews, Richards, and Eysenck (1989) instructed participants with GAD and healthy controls to complete a spelling task that included homophones with both threatening and neutral meanings. Participants with GAD were more likely than controls to spell the threatening versions of the words. Overall, research indicates that individuals with GAD continuously scan their environment for threat-related stimuli and interpret ambiguous situations as dangerous. This cognitive vulnerability in GAD may maintain and/or exacerbate feelings of anxiety and worry, and it is also plausible that it indirectly influences the person's interpersonal functioning. If so, studies exploring the relationship of facial expression recognition of threatening emotions and neutrality to interpersonal functioning in GAD may be particularly instrumental in our understanding of the disorder.

Although the studies in GAD have been limited to adolescent populations, findings indicate that GAD may be associated with anatomical deficits in brain regions involved in the process of facial emotion recognition. For example, in a magnetic resonance imaging (MRI) study, De Bellis and colleagues (2000) found that right and total amygdala volumes were larger in participants with GAD as compared to non-psychiatric controls (De Bellis et al., 2000). Another fMRI study demonstrated that although both GAD participants and the control group endorsed an attentional bias to a brief exposure of angry faces, the right amygdala in participants with GAD, relative to controls, was more activated (Monk et al., 2008). Furthermore, the researchers found a differential influence of the ventrolateral prefrontal cortex on the amygdala depending on diagnostic status and stimulus exposure time. Neuroimaging studies have indicated that vigilance in response to threatening stimuli is controlled by the synergy of the amygdala and the ventrolateral prefrontal cortex. The amygdala appears to influence the immediate processing of threats, whereas the ventrolateral prefrontal cortex is responsible for regulating responses at later stages in the process as well as modulating the amygdala responses. In the study by Monk and colleagues (2008), the right ventrolateral prefrontal cortex was more activated in participants with GAD, relative to controls, in response to prolonged exposure to angry faces; however, there were no differences in amygdala activation. Further, this activation was greater in GAD participants with mild anxiety than in those with severe anxiety. The investigators suggested that the ventrolateral prefrontal cortex might have compensated for the lack of activity in the amygdala in the case of prolonged exposure to danger.

MDD and facial emotion recognition: Behavioral, cognitive, and biological findings

Research findings in the interpersonal realm suggest that MDD may be associated with deficits in facial emotion recognition (e.g., Persad et al., 1998; Zuroff et al., 2006). However, behavioral studies investigating the ability of individuals with depression to recognize facial expressions of emotion have yielded inconsistent findings. Specifically, some studies suggest that depression is associated with a general deficit in facial emotion recognition (e.g., Cooley & Nowicki, 1989; Carton et al., 1999; Csukly, Czobor, Szily, Takács, & Simon, 2009; Feinberg, Rifkin, Schaffer, & Walker, 1986; Langenecker, Bieliauskas, Rapport, Zubieta, Wilde, et al., 2005; Persad & Polivy, 1993). Other studies, however, have demonstrated that individuals with depression have difficulties recognizing specific emotions, such as happiness (Mandal & Bhattacharya, 1985), or interest and sadness (Rubinow & Post, 1992). Additionally, other studies have found that individuals with MDD do not differ from healthy controls in their ability to decode facial expressions of emotion (e.g., Bediou et al., 2005; Gaebel & Wölwer, 1992; Joormann & Gotlib, 2006; Kan, Mimura, Kamijima, & Kawamura, 2009; Ridout et al., 2007).

One factor that may account for the discrepant findings in facial emotion recognition accuracy in MDD is methodological variability. Specifically, some studies have used samples comprised of participants with MDD only (e.g., Joormann et al., 2006; Kan et al., 2009; Persad & Polivy, 1993), whereas others have used samples comprised of participants with unipolar and bipolar depression (e.g., Gur et al., 1992; Rubinow et al., 1992). Given that unipolar and bipolar depression are neurochemically, physiologically, and genetically distinct (Baxter et al., 1989; Goodwin and Jamison, 1990), using heterogeneous samples may produce results that are not necessarily representative of MDD. Furthermore, some studies have included only female participants (e.g., Langenecker et al., 2005; Persad & Polivy, 1993), and others have included both males and females (e.g., Bediou et al., 2005; Gaebel & Wölwer, 1992; Ridout et al., 2005). In terms of emotional stimuli, some studies have used morphed faces (e.g., Bediou et al., 2005; Joormann & Gotlib, 2006; Kan et al., 2009), whereas others have used static faces (e.g., Csukly et al., 2009; Feinberg et al., 1986; Gessler et al., 1989), or even dynamic images (e.g., Ridout et al., 2007).

Despite their methodological differences, the aforementioned behavioral investigations have provided the empirical basis for the process of facial emotion recognition in MDD. Feinberg and colleagues (1986) conducted one of the first studies of facial emotion recognition in depression with a clinical sample composed of MDD participants only. The participants were presented with black-and-white facial expressions of the 6 basic emotions and neutrality on a slide screen and were instructed to select and circle the appropriate emotion on a score sheet. The findings revealed that individuals with MDD evidenced significantly lower accuracy in decoding facial expressions of all basic emotions compared to the healthy control group. In a more recent study, Csukly and colleagues (2009) presented a sample of individuals with clinical MDD (n = 23) and healthy controls (n = 23) with static morphed faces evoking the six basic emotions at 5 intensity levels ranging from 20% to 100%. The study participants were also presented with neutral facial expressions. The authors found that participants with depression showed a significant impairment in their overall ability to recognize facial expressions, and that the impairment was most pronounced for facial stimuli representing low-arousal expressions (neutral, disgust, sadness). Further, participants with MDD made more misattribution errors in the direction of high-arousal emotions (fear, anger, surprise) compared to controls. They were also more likely to attribute high-arousal emotions to neutral expressions relative to the control group. The authors suggested that the findings may be attributed to amygdala abnormalities in MDD and potentially reflect deficits in cognitive processing associated with the disorder. In an earlier study conducted by Bediou and his colleagues (2005), these general and specific deficits in facial emotion recognition were not present in MDD. The authors compared groups of patients with depression, patients with schizophrenia, and healthy controls on a computerized task of morphed faces expressing neutrality and 3 basic emotions (disgust, fear, happiness) at nine intensity levels (10% to 90%). Although patients with schizophrenia endorsed significantly lower facial expression recognition scores than both the patients with MDD and controls, the differences between depression patients and controls were not significant. In one of the few behavioral studies employing dynamic stimuli in MDD, Ridout and his colleagues (2007) found similar results. The authors instructed participants with MDD (n = 17) and healthy controls (n = 22) to identify the emotions (basic and neutral) portrayed by the protagonist of 28 video clips, each lasting 15 to 60 seconds. The videos featured either a single individual speaking on the telephone or directly addressing the viewer, or 2 different individuals engaging in conversation. The authors suggested that the lack of significant differences between the groups might be due to the use of highly ecologically valid stimuli, as all participants achieved above 80% recognition accuracy on the emotion recognition task.

In addition to impairments in facial emotion recognition, depression has also been associated with cognitive anomalies in emotional processing. Studies have found that individuals with MDD are better at recognizing and recalling negatively valenced words compared to healthy controls (for a review, see Mineka, Rafaeli, & Yovel, 2003). Individuals with depression exhibit similar cognitive biases in response to facial stimuli. Ridout and colleagues (2003), for example, found that participants with MDD exhibited memory biases for emotional facial expressions. Specifically, they presented study participants with a set of pictures with sad, happy, and neutral facial expressions during a facial emotion recognition task. Subsequently, participants had to indicate which of the faces from the first set of pictures was included in a different set of pictures. Results indicated that depressed participants recognized significantly more sad faces than happy faces, whereas healthy controls identified more happy faces than sad faces. In addition to biases in recall, research has shown that individuals with depression endorse attentional biases for sadness. For example, Gotlib and his colleagues (2004) found that participants with clinical depression endorsed attentional biases specific to sad faces compared to healthy controls and individuals with GAD (Gotlib, Krasnoperova, Yue, & Joormann, 2004). Other studies have also suggested that MDD is associated with an oversensitivity that is specific to sadness and/or with difficulties in the recognition of positive emotions.


Joormann and Gotlib (2006), for example, found that individuals with MDD required higher emotional intensity of facial expressions to correctly identify happiness compared to controls. The opposite trend was observed for sad expressions, in that the MDD group required less intensity to correctly identify sadness relative to the controls. In other studies, individuals with MDD recognized neutral expressions less accurately than the control group (e.g., Gollan, Pane, McCloskey, & Coccaro, 2008; Gur et al., 1992; Kan et al., 2009; Leppänen, Milders, Bell, Terriere, & Hietanen, 2004). Leppänen and his colleagues (2004), for instance, instructed patients with clinical MDD (n = 18) and healthy controls (n = 18) to make a forced-choice response to happy, sad, and neutral static unmorphed stimuli presented briefly on a computer screen. Although the control group recognized neutral faces as accurately as the happy and sad faces, patients with depression recognized neutral faces less accurately than either happy or sad faces. The results also revealed that these difficulties reflected negative biases; namely, the MDD participants attributed sadness to the neutral faces more often than the healthy controls. These difficulties in processing neutral facial expressions remained following remission of the depressive symptoms. Thus, the authors suggested that the impairment might be a trait characteristic of MDD that persists despite improvement in mood state.

Neuroimaging studies have found that individuals with depression exhibit structural anomalies in neural networks involved in emotional processing (for a review, see Phillips, Drevets, Rauch, & Lane, 2003b). For example, studies report differences in amygdala volume among individuals with MDD, albeit with somewhat conflicting findings. Specifically, individuals with recurrent major depression have evidenced no volumetric changes in the amygdala (e.g., Mervaala et al., 2000), whereas individuals with recent-onset major depression appear to have an enlarged amygdala volume (e.g., Frodl et al., 2002; Lange & Irle, 2004). Alternatively, Lange and Irle (2004) have suggested that the volume of the amygdala in MDD is enlarged at the onset of the disorder and may then decrease with extended disorder duration. Anomalies in neural activity in MDD have also been evidenced in studies directly assessing facial emotion recognition. For instance, in one study individuals with MDD and healthy controls were exposed to happy and sad facial expressions. Results revealed a linear increase in activity in the left amygdala in response to increasingly intense sad faces in the MDD group but not in controls (Surguladze et al., 2005). Similarly, Fu and colleagues (2004) found an exacerbated amygdala response to facial expressions of sadness in patients with MDD as opposed to healthy controls.

Facial emotion recognition in GAD and MDD: Integrating biological, cognitive, and behavioral components

Facial emotion recognition has a central role in human communication, and thus examining this process in the context of emotional disorders, such as GAD and depression, may have significant theoretical and clinical value. Individuals with GAD and depression share many genotypic and phenotypic features, leading to the high rates of comorbidity in these two prominent disorders. In terms of facial emotion recognition, GAD and depression appear to share deficits in some brain regions associated with the process of facial emotion recognition. For example, both disorders have been linked to anomalies in amygdala activity in response to facial expressions of emotion. Thus, the accuracy of facial recognition in GAD and MDD may be reflective, at least in part, of neurobiological deficits associated with the disorders.

At the cognitive level, GAD and depression have been associated with biases that appear to interfere with or influence emotional processing. For example, individuals with GAD tend to attend more to facial expressions of threat compared to controls, whereas individuals with MDD have been shown to recall significantly more sad facial expressions than controls. These cognitive biases reflect differential cognitive patterns in GAD and MDD. Specifically, GAD is associated with thoughts revolving around threatening information, whereas thoughts in depression are related to loss, failure, and sadness. These unique cognitive mechanisms may also play a role in the process of facial emotion recognition, resulting in differential patterns of responses to facial expressions of emotion for GAD and depression.

Behaviorally, both disorders have been associated with difficulties in interpersonal relationships and social functioning. Further, individuals with MDD have been shown to have a general impairment in facial emotion recognition as well as impairments for specific emotions such as happiness. In contrast, facial emotion recognition has not yet been explored in individuals with GAD. Overall, I propose that the process of facial emotion recognition in GAD and depression is influenced by the interplay of cognitive and biological processes that will be evident in the behavioral responses of individuals completing a facial emotion recognition task. Specifically, an overall deficit in facial emotion recognition will be driven mainly by neurobiological mechanisms, whereas differential responses to specific facial expressions of emotion will be influenced by cognitive biases.

With these theoretical and empirical perspectives in mind, the goals of this study were threefold: 1) examine the overall accuracy of facial emotion recognition as well as that for specific emotions in GAD, MDD, and comorbid MDD+GAD; 2) examine misattributions in facial expression recognition in response to anger, sadness, and neutral expressions in GAD, MDD, and comorbid MDD+GAD; and 3) investigate the relationship of facial emotion recognition and interpersonal functioning in the context of GAD and MDD.

The present study

We aimed to compare and contrast GAD and MDD in the context of facial expression recognition of all basic emotions and neutrality to account for the interpersonal difficulties associated with these two conditions. This study was the first to examine the process of facial emotion recognition in GAD. With regard to MDD, some behavioral studies have found that depression is associated with deficits in facial emotion recognition, whereas other studies have failed to find any impairment. Furthermore, very few studies have examined the response patterns of recognition to neutral expressions in MDD. Previous studies have only compared response patterns for sad and happy expressions in MDD; this study also explored these patterns in response to neutral expressions. The purpose of the present study was to replicate and expand the findings from past studies by adding facial expressions (e.g., fear, disgust) as well as neutral expressions to the experimental paradigm. Because MDD overlaps substantially with GAD, we also aimed to examine the role of facial emotion recognition in the comorbid MDD+GAD condition.

Further, although previous studies have proposed that impairments in facial emotion recognition in GAD and MDD may underlie the social and interpersonal deficits in these disorders, no study to date has directly assessed interpersonal functioning in relation to the process of facial emotion recognition in these populations. In the present study, we directly assessed the interpersonal ability of the study participants and examined whether facial emotion recognition accuracy could predict interpersonal functioning above and beyond psychopathology.

Finally, very few studies in this line of research have examined whether deficits in facial emotion recognition can be attributed to deficits in executive functioning in general versus deficits in emotional processing in particular. The ability to recognize facial expressions of emotion can be influenced by factors that impair recognition broadly, such as general intelligence (IQ), attention, and perception. Thus, a global impairment in facial emotion recognition may reflect a global deficit in the recognition process or a global perceptual impairment rather than a deficit in the perception of emotion. With this in mind, the present study included measures that assess executive functioning ability, including premorbid IQ, visuospatial ability, and processing speed.

Hypothesis 1. Participants with GAD, MDD, and MDD+GAD will not differ from control participants on executive functioning measures of premorbid IQ, visuospatial and processing speed ability, and working memory.


Hypothesis 2. GAD and MDD have been associated with anomalies in structures involved in emotional processing in particular. In light of these findings, individuals with GAD, MDD, and comorbid MDD+GAD will exhibit a general deficit in their ability to recognize facial expressions of emotions as compared to control participants.

Hypothesis 3. Individuals with GAD and depression will endorse differential patterns in response to expressions of neutrality. Findings indicate that depression and anxiety are associated with mood-congruent biases (Beck, 1976). Specifically, individuals with anxiety are preoccupied with threatening information, whereas depressed individuals are concerned with thoughts related to loss, failure, and sadness. The comorbid MDD+GAD group will likely manifest response biases associated with the MDD group. Thus, we predicted the following:

a. Participants with GAD, MDD, and MDD+GAD will recognize neutral expressions less accurately as compared to controls.

b. Participants with GAD will attribute anger to neutral expressions more so than the control participants. This response pattern would be consistent with a negative response bias reflecting cognitive schemas involving themes of threat and danger in anxiety.

c. MDD participants and comorbid MDD+GAD participants will attribute sadness to neutral expressions more frequently as compared to healthy controls. This response pattern would be consistent with a negative response bias reflecting cognitive schemas involving themes of loss, failure, and sadness in depression.


Hypothesis 4. Response biases may also interfere in the processing of facial expressions of emotion; we therefore examined the types of errors made by the study participants and hypothesized:

a. Participants with GAD will tend to interpret non-anger emotions (e.g., fear, disgust) as anger more often than control participants.

b. Participants with MDD and comorbid MDD+GAD will tend to interpret non-sad emotions (e.g., fear, disgust) as sadness more often than control participants.
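
As a minimal illustration of the error analysis implied by Hypotheses 4a and 4b, the sketch below computes a misattribution rate: among trials whose target expression is not a given emotion, the percentage that a participant nonetheless labeled as that emotion. The trial record format is the same hypothetical one used in the accuracy sketch above.

```python
def misattribution_rate(trials, emotion):
    """Among trials whose target expression is not `emotion`, the percentage
    the participant nonetheless labeled as `emotion` (e.g., non-anger
    expressions read as anger, per Hypothesis 4a)."""
    pool = [t for t in trials if t["target"] != emotion]
    if not pool:
        return 0.0
    return 100.0 * sum(t["response"] == emotion for t in pool) / len(pool)
```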

Hypothesis 5. Emotion theorists posit that deficits in facial emotion recognition may underlie difficulties in interpersonal functioning. Thus, we hypothesized that deficits in facial emotion recognition would predict levels of interpersonal difficulties above and beyond diagnostic status (e.g., GAD, MDD).
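
Hypothesis 5 implies a two-step hierarchical regression. The following sketch, using the statsmodels library, shows how the increment in R-squared from adding recognition accuracy over diagnostic status could be computed; the column names (iip_sc, gad, mdd, fer_accuracy) are hypothetical placeholders, not variables from the study's dataset.

```python
import pandas as pd
import statsmodels.api as sm

def hierarchical_regression(df: pd.DataFrame):
    """Two-step OLS: does facial emotion recognition accuracy predict
    interpersonal problems above and beyond diagnostic status?"""
    # Step 1: diagnostic status only (0/1 dummy columns for GAD and MDD).
    X1 = sm.add_constant(df[["gad", "mdd"]])
    step1 = sm.OLS(df["iip_sc"], X1).fit()

    # Step 2: add overall recognition accuracy (% correct).
    X2 = sm.add_constant(df[["gad", "mdd", "fer_accuracy"]])
    step2 = sm.OLS(df["iip_sc"], X2).fit()

    # The R-squared increment is the quantity of interest for Hypothesis 5.
    return step1.rsquared, step2.rsquared, step2.rsquared - step1.rsquared
```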

METHOD

Participants

Ninety individuals (72 females; 80%) were included in this study. The sample was composed of university undergraduates recruited from mass testing and through fliers in the Kent Hall Psychological Clinic, the Counseling and Human Development Center, and University Psychological Services. Only individuals who met pre-determined inclusion criteria were recruited into this study (see screening procedures). The study groups included a GAD group (without depression), a group with depression (without GAD), a comorbid MDD+GAD group, and a control group (i.e., no current mental health diagnosis). Prospective participants were contacted via telephone or email and were invited to participate in this study in exchange for research points or monetary reimbursement.

Stimuli

Participants were presented with grayscale pictures of 4 female and 4 male faces expressing various emotions, including fear, anger, happiness, sadness, surprise, and disgust, as well as neutral expressions. Pictures were selected from Ekman and Friesen's Pictures of Facial Affect (POFA; 1976). The pictures were presented on a high-resolution 17-inch screen Macintosh computer (1024 x 768 pixel display), running the OS X 10.4.11 operating system, positioned approximately 60 cm from the chair. Stimulus presentation, timing, and data collection [percentage of correct responses, recognition errors for neutral expressions as well as emotions, and reaction time (in ms)] were controlled with the SuperLab program (Cedrus Corporation, 2009).

Measures

The American National Adult Reading Test (AMNART; Grober & Sliwinski, 1991). This test measures premorbid intellectual function (IQ). Participants are instructed to read 45 irregularly spelled words (e.g., ache, thyme) aloud. Assuming that the participant is familiar with the word, accuracy of pronunciation is used to predict IQ.

The Beck Depression Inventory – Second Edition (BDI-II; Beck et al., 1996). This is a 21-item self-report instrument that broadly assesses the symptoms of depression, including the affective, cognitive, behavioral, somatic, and motivational domains. Each symptom is rated on a 4-point scale, ranging from 0 to 3, with higher scores indicating more depressive symptomatology. Total scores can range from 0 to 63. The measure has high internal consistency (Beck et al., 1996) and has been used extensively in the literature. A score of 0-13 reflects minimal depression, 14-19 reflects mild depression, 20-28 reflects moderate depression, and 29-63 reflects severe depression (Beck et al., 1996). Previous studies examining the relationship between MDD and facial emotion recognition have primarily employed the BDI to assess symptom severity of depression; across these studies, the group mean severity score is M = 27 (SD = 7). Thus, participants had to score at the moderate level of depression or higher to be included in the MDD group.

The Benton Facial Recognition Test, Short Form (BFRT; Benton et al., 1994; Levin et al., 1975). This test measures the ability to recognize unfamiliar human faces. Participants are asked to select, from a set of six aligned 2 x 3-inch black-and-white photographs, the face with the same identity as a reference face. In the short form, six items require one identity recognition response and seven items require three identity recognition responses. The 13 items yield a maximum score of 27 correct choices; the effective range is from 11 to 27.

The Center for Epidemiological Studies Depression Scale (CES-D; Radloff, 1977). This measure is a 20-item self-report scale designed to evaluate depression in the general population. The items comprise six scales reflecting major dimensions of depression: depressed mood, feelings of guilt and worthlessness, feelings of helplessness and hopelessness, psychomotor retardation, loss of appetite, and sleep disturbance. Depression symptoms experienced in the past week are rated on a 4-point scale, ranging from 0 to 3, with higher scores indicating more depressive symptomatology. In epidemiological research, a CES-D score of 0-15 has been used to place participants in a non-depressed group, whereas scores of 16-20 and 21-30 place participants in mild and moderate depression groups, respectively (Radloff et al., 1977; 1986). However, research with medical patients has shown that a CES-D score from 16 to 26 is indicative of mild depression, whereas scores of 27 or greater reflect major depression (Zich et al., 1990; Ensel, 1986). In a recent study with a college sample (Shean et al., 2008), a mean CES-D score of 27.53 (SD = 8.36) correctly identified study participants who were diagnosed with a unipolar depressive disorder with the Diagnostic Interview Schedule-IV (DIS-IV; American Psychiatric Association, 1994). Thus, in this study a score of 16 ≤ CES-D ≤ 19 indicated mild depression, whereas a score of CES-D ≥ 20 corresponded to problems with depression at the moderate/severe level.
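
A minimal sketch of the CES-D screening bands just described, assuming a simple integer total score; the cutoffs below are the study-specific ones, not Radloff's original epidemiological bands.

```python
def cesd_group(score):
    """Map a CES-D total (0-60) onto the study-specific screening bands
    described above (study cutoffs, not Radloff's original bands)."""
    if score >= 20:
        return "moderate/severe depression"
    if score >= 16:
        return "mild depression"
    return "non-depressed"
```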

The Digit Symbol Coding (DSC) subtest of the Wechsler Adult Intelligence Scale—Third Edition (WAIS-III; Wechsler, 1997). This test assesses speed of information processing. It consists of rows containing small blank squares, each paired with a randomly assigned number from 1 to 9. Above these rows is a printed key that pairs each number with a nonsense symbol. The participant is asked to fill in the blank spaces with the symbol paired with the number above each blank space for 120 seconds. The score is the number of squares filled in correctly.


The Pictures of Facial Affect (POFA; Ekman & Friesen, 1976). The POFA collection consists of 110 photographs of facial expressions of the six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) and neutrality. All images are black and white and include 14 different male and female posers. The development and validation of the POFA was based on Ekman and Friesen's highly reliable and valid Facial Action Coding System (FACS; Ekman & Friesen, 1978). FACS is a coding system that maps facial muscle configurations to different emotional experiences, and it was used to select photographs for the POFA (Ekman & Friesen, 1978).

The Inventory of Interpersonal Problems, Short Circumplex (IIP-SC; Soldz et al., 1995). This measure is a 32-item self-report scale designed to evaluate interpersonal distress associated with things that are hard to do or that are done too much, including being: assertive, submissive, intimate, sociable, responsible, and controlling. Each item is rated on a 5-point scale, ranging from 0 to 4, yielding a maximum total score of 128. A higher score reflects higher distress related to interpersonal difficulties. The IIP-SC has been shown to have high internal consistency, with α = .89 (Hopwood, Pincus, DeMoor, & Koonce, 2008).

The Generalized Anxiety Disorder Questionnaire for DSM-IV (GAD-Q-IV; Newman et al., 2001). This questionnaire is a 9-item self-report measure that reflects the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV; APA, 1994) criteria for GAD. Most items are dichotomous (YES/NO) and assess the presence of GAD criteria, such as excessive and uncontrollable worry, or assess the presence of physical symptoms (e.g., restlessness, muscle tension). One item is open-ended and asks for a listing of the participant's most frequent worry topics. Finally, two items are rated on a scale of zero (“None”) to eight (“Very Severe”) and measure functional impairment and subjective distress. In an outpatient community sample (N = 156), Barnes, Haigh, and Fresco (2006) found that the GAD-Q-IV criterion-based scoring scheme (relative to the Structured Clinical Interview for DSM-IV; SCID; First et al., 2002) provided a good balance of sensitivity (Sn = .82) and specificity (Sp = .75), correctly classifying 78% of patients. Haigh, Linardatos, Fresco, Bartko, & Logue (unpublished data) replicated these findings in a sample of 125 outpatient primary care patients assessed with the SCID (Sn = .67, Sp = .86; correctly classified = 81%). This criterion-based scoring scheme was used in the current study.

The Health History Questionnaire (adapted by Gunstad & Benitez, 2008). This instrument asks participants to identify a history of conditions with the potential to impact cognitive function, including head injury and seizures, among others.

The Letter-Number Sequencing (LNS) subtest of the Wechsler Adult Intelligence Scale—Third Edition (WAIS-III; Wechsler, 1997). This test is designed to measure working memory. Participants are presented with a mixed list of numbers and letters (e.g., T-9-A-3) and are asked to repeat the list by saying the numbers first in ascending order and then the letters in alphabetical order (e.g., 3-9-A-T). The test includes 7 items, and each item has 3 lists of numbers and letters, yielding a maximum score of 21 correct responses.
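
Because the expected LNS response is a deterministic transformation of the stimulus, it can be expressed compactly. The sketch below reproduces the document's own example ("T-9-A-3" yields "3-9-A-T"); the hyphen-delimited string format is an assumption made for illustration.

```python
def lns_answer(items):
    """Given a mixed letter-number string like 'T-9-A-3', return the expected
    response: digits in ascending order, then letters alphabetically
    ('3-9-A-T')."""
    parts = items.split("-")
    digits = sorted(p for p in parts if p.isdigit())
    letters = sorted(p for p in parts if p.isalpha())
    return "-".join(digits + letters)
```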

The Structured Clinical Interview for DSM-IV-TR (SCID; First, Spitzer, Gibbon, & Williams, 2002). This is a widely used clinician-administered diagnostic interview that allows for current and lifetime diagnoses based on DSM-IV-TR criteria. Inter-rater reliability using the SCID ranges from .61 to .80 for a diagnosis of MDD and from .63 to .81 for GAD (Zanarini et al., 2000). In the current study, the SCID was administered by one of 3 graduate students who had received practicum training in conducting the interview, as well as one year of supervision rounds as part of their programmatic clinical training.

Procedures

Screening. Participants who met criteria for GAD and MDD based on the GAD-Q-IV and CES-D self-report measures, respectively, were invited to participate in the study. Participants who met any of the following criteria were excluded from the study: (1) active suicidal ideation (based on the BDI-II), (2) bipolar disorder (based on health questionnaire data and the SCID), (3) current diagnosis of an organic health problem (e.g., brain disease, head injury; based on health questionnaire data), and (4) met DSM-IV criteria for other Axis I diagnoses, but not for GAD and/or MDD. Of the 139 participants recruited for the study, a total of thirty-five were excluded from our data analyses based on the aforementioned criteria. Specifically, eight participants produced abnormal BFRT scores (≤ 40), two participants reported a history of a severe head injury, and one participant was unable to complete the task because of motor difficulties. Further, a total of 24 participants were excluded because they were diagnosed with a mental health disorder other than MDD and/or GAD, including bipolar disorder (5), posttraumatic stress disorder (4), panic disorder (1), social anxiety disorder (3), anxiety disorder not otherwise specified (2), alcohol/drug abuse (3), bulimia nervosa (1), and adjustment disorder (2). Finally, a total of 16 participants were excluded from our analyses because technical problems with the facial emotion recognition task produced incomplete data profiles for these individuals. In total, our final sample was composed of 90 participants.

Testing. Upon arrival at the lab, participants provided informed consent and completed a battery of self-report measures, including the Health History Questionnaire, GAD-Q-IV, BDI-II, and the IIP-SC. Participants were also administered a series of neurocognitive measures to assess IQ and visuoperceptual functioning. Participants then underwent the facial emotion recognition task. Participants were tested individually in a quiet room with a comfortable chair and a Macintosh computer. Each experimental trial consisted of the following sequence of events: First, a fixation signal ("+") was presented in the middle of the computer screen for 500 ms. Research has shown that the time required to access conceptual knowledge of the emotion signaled by a face is less than 300 ms (e.g., Adolphs, 2002). Thus, each facial expression was presented on the screen for 300 ms, and participants were asked to identify which emotion had been presented by pressing the appropriate keyboard button as quickly and accurately as possible. The assignment of keyboard numbers to emotions (e.g., #1 for sad, #2 for disgust), as well as the order of emotion presentation (with no more than 2 consecutive emotions in the same emotion category), were randomized. After the subject's response, a 1500 ms intertrial interval preceded the start of the next trial. Each facial expression was presented 2 times, resulting in 112 trials (7 emotions x 8 posers x 2). Although each participant was scheduled to receive 16 presentations of each of the 7 emotions (for a total of 112 trials), due to technological difficulties this number varied from 10 to 16 presentations within and across participants. Thus, one participant may have received 18 presentations of the emotion of happiness and only 14 presentations of the emotion of sadness (total 112 trials), whereas another may have received 17 presentations of happy faces and 15 presentations of sad faces. Participants completed 14 practice trials (with different female and male posers) prior to the experiment to ensure they comprehended the task. The task took between 15 and 30 minutes to complete. Upon completion of the emotion recognition task, participants were administered the SCID.
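
Although the task software itself is not reproduced here, the randomization constraint (no more than 2 consecutive trials from the same emotion category) can be illustrated with a brief Python sketch; the names and the rejection-sampling approach are hypothetical, not the study's actual implementation:

import random

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def randomized_trial_order(n_posers=8, n_repeats=2, seed=None):
    """Return 112 (emotion, poser) trials with no run of 3+ same-category emotions."""
    rng = random.Random(seed)
    trials = [(emotion, poser) for emotion in EMOTIONS
              for poser in range(n_posers) for _ in range(n_repeats)]
    while True:
        rng.shuffle(trials)
        # Reject any ordering that contains 3 consecutive same-category trials.
        if all(not (trials[i][0] == trials[i + 1][0] == trials[i + 2][0])
               for i in range(len(trials) - 2)):
            return trials

order = randomized_trial_order(seed=1)
assert len(order) == 112  # 7 emotions x 8 posers x 2 presentations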

Statistical analysis plan

The percentage and count of correct responses were calculated for each facial expression category for each participant. To test hypothesis 1, a series of one-way analyses of variance (ANOVAs) was performed for the executive functioning measures. A power analysis using Cohen's (1992) conventions, with power set to .80 and a two-tailed alpha set to .05, revealed that a sample of 180 was required to detect group mean differences of a medium magnitude (f = .25) with four groups. Given that the actual sample consisted of 90 participants, these analyses were powered to detect effects as small as f = .36.

To test hypotheses 2 through 4, a series of one-way ANOVAs was performed, followed by a priori nonorthogonal contrasts. Because the hypotheses were specific and theory-driven (a priori), we interpreted the contrasts irrespective of the significance level of the omnibus test (Thompson, 1994). For a summary of the nonorthogonal contrast coefficients, see Table 1.


Table 1.

Summary of Planned Nonorthogonal Contrast Coefficients

Hypothesis    Control    GAD    MDD    MDD+GAD
2                 3       -1     -1       -1
3a                3       -1     -1       -1
3b               -1        1      0        0
3c               -2        0      1        1
4a               -1        1      0        0
4b               -2        0      1        1
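
To make the contrast procedure concrete, a planned contrast over independent groups is evaluated as t = Σ(cj Mj) / sqrt(MSE Σ(cj²/nj)), with N - k error degrees of freedom, where MSE is the pooled error term from the one-way ANOVA. A minimal Python sketch follows; the group arrays in the commented call are hypothetical placeholders, not the study data:

import numpy as np
from scipy import stats

def planned_contrast(groups, coeffs):
    """Two-tailed t-test for a planned contrast over independent groups."""
    n = np.array([len(g) for g in groups])
    means = np.array([np.mean(g) for g in groups])
    # Pooled error term (MSE) from the one-way ANOVA.
    sse = sum(((np.asarray(g) - m) ** 2).sum() for g, m in zip(groups, means))
    df_error = n.sum() - len(groups)
    mse = sse / df_error
    c = np.asarray(coeffs, dtype=float)
    t = (c @ means) / np.sqrt(mse * (c ** 2 / n).sum())
    return t, 2 * stats.t.sf(abs(t), df_error)

# Hypothesis 2: controls (+3) versus GAD, MDD, and MDD+GAD (-1 each).
# t, p = planned_contrast([controls, gad, mdd, comorbid], [3, -1, -1, -1])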

In terms of statistical power, hypotheses 2 through 4 posit focused, directional predictions, indicating the need for a sample size of 128 with a two-tailed test or 102 with a one-tailed test. With a sample of 90, these analyses were powered to detect effects as small as f = .30 using a two-tailed alpha of .05 and power of .80.

Finally, to examine study hypothesis 5, a hierarchical linear regression was conducted. Interpersonal functioning, as measured by the IIP-SC, was the dependent variable. Diagnostic status (GAD, MDD) was entered simultaneously at step 1. Participants with the comorbid MDD+GAD condition were assigned to both the GAD and MDD groups. The overall facial emotion recognition accuracy was then entered at step 2. We predicted that the ability to accurately recognize emotions from facial expressions would contribute significantly to explaining interpersonal functioning, above and beyond its association with GAD or MDD status. A power analysis using Cohen's (1992) conventions, with power set to .80 and a two-tailed alpha set to .05, revealed that a sample of 55 was required to detect the incremental prediction of deficits in interpersonal functioning above and beyond diagnosis at a medium magnitude (f2 = .15). With a sample of 90, this analysis possesses power of .95 to detect medium effect sizes and power of .80 to detect effect sizes as small as Cohen's f2 = .09.
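
The two-step regression can be sketched as follows in Python (statsmodels) using synthetic stand-in data; the variable names and generated values are hypothetical illustrations, not the study dataset:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({                       # hypothetical stand-in data
    "gad": rng.integers(0, 2, 90),        # 1 = GAD (comorbid coded 1 on both)
    "mdd": rng.integers(0, 2, 90),        # 1 = MDD
    "accuracy": rng.normal(79, 7, 90),    # overall % correct
})
df["iip"] = 21 + 8 * df.gad + 17 * df.mdd + rng.normal(0, 12, 90)

step1 = smf.ols("iip ~ gad + mdd", data=df).fit()              # diagnoses only
step2 = smf.ols("iip ~ gad + mdd + accuracy", data=df).fit()   # add accuracy

# F test for the R-squared change attributable to recognition accuracy.
print(sm.stats.anova_lm(step1, step2))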

RESULTS

The demographic, clinical, and treatment characteristics of our sample are summarized in Table 2. Although our sample was predominantly Caucasian (86.7%), our study participants also reported other ethnicities, including Asian American/Pacific Islander (5.6%), African American (4.4%), Hispanic/Latino (2.2%), and other (1.1%). The study groups did not differ significantly in terms of gender, age, or education. As expected, the four groups differed significantly in their BDI scores, in that the comorbid MDD+GAD and MDD groups endorsed higher levels of depression compared to the GAD group and healthy controls.


Table 2.

Demographic, clinical, and treatment data for Major Depressive Disorder (MDD), Generalized Anxiety Disorder (GAD), comorbid MDD+GAD, and controls (N = 90)

Group            MDD (13)         GAD (19)        MDD+GAD (22)    Controls (36)   Total (90)
Age (years)      19.46 (1.94)     21.74 (6.20)    21.05 (4.58)    19.42 (2.49)    20.31 (4.08)
Sex (female)     86%              79%             90%             72%             80%
BDI-II score     24.54 (11.05)*   11.16 (4.94)    26.95 (7.98)*   6.56 (4.16)     -
Psychotropics    2                3               3               1               9
  Amitriptyline 1
  Citalopram 1
  Escitalopram 1a
  Fluoxetine 1
  Paroxetine 1b, 1
  Sertraline 1 1b 1
  Wellbutrin 1a
  Venlafaxine 1
Psychotherapy    1                1               5               1               8

Note: BDI-II = Beck Depression Inventory (second edition); superscripts a and b indicate that a study participant was taking more than one psychotropic medication.
* p < .01


With regard to diagnoses, of the 19 participants with a primary diagnosis of GAD, one received a secondary diagnosis of social anxiety and another was diagnosed with OCD tendencies and past PTSD. Seventeen of the 19 participants with GAD were also diagnosed with a past mental health condition, including seven with MDD, one with depressive disorder, one with panic disorder, and one with adjustment disorder. Of the 13 participants diagnosed with MDD, three were also diagnosed with secondary social anxiety, another three had an additional diagnosis of PTSD, and one had a past diagnosis of an eating disorder. One participant with comorbid MDD+GAD received a past diagnosis of polysubstance abuse, and another of panic disorder. Finally, five healthy controls were diagnosed with past MDD, and one with past MDD and PTSD.

Hypothesis 1: Diagnostic status and executive functioning

As hypothesized, participants with GAD, MDD, and MDD+GAD did not differ from control participants on executive functioning measures of premorbid IQ, F(3, 86) = .31, p = .81, visuospatial ability, F(3, 86) = .14, p = .93, processing speed ability, F(3, 86) = 1.19, p = .32, and working memory, F(3, 86) = 1.38, p = .25. Results are summarized in Table 3.


Table 3.

Tests of premorbid IQ (AMNART), visuospatial ability (BFRT), processing speed ability (SC), and working memory (LNS) for participants with Generalized Anxiety Disorder (GAD), Major Depressive Disorder (MDD), comorbid MDD+GAD, and controls (N = 90)

          Controls         GAD              MDD              MDD+GAD
AMNART    111.51 (5.21)    112.19 (4.53)    113.48 (4.52)    111.25 (6.03)
BFRT      46.58 (3.61)     46. (4.11)       46.69 (3.17)     46.45 (2.54)
SC        86.86 (12.24)    82.26 (12.14)    81.85 (14.85)    80.82 (11.81)
LNS       20.81 (2.83)     21.79 (2.90)     21.85 (3.29)     20.36 (2.94)

Hypothesis 2: Overall facial emotion recognition accuracy in GAD and MDD

We hypothesized that individuals with GAD, MDD, and comorbid MDD+GAD would exhibit a general deficit in their ability to recognize facial expressions of emotion compared to control participants. A one-way ANOVA did not reveal any significant differences between the analogue clinical groups and healthy controls, F(3, 86) = .29, p = .83. A summary of the mean overall and specific facial emotion recognition accuracies and the nonorthogonal contrasts is presented in Table 4. As in previous studies, happy expressions were the best recognized of all emotions across groups.


Table 4.

Contrasts of mean correct responses (%) to facial emotion expressions endorsed by participants with Generalized Anxiety Disorder (GAD), Major Depressive Disorder (MDD), comorbid MDD+GAD, and controls as a function of valence of expression (N = 90)

             Controls         GAD              MDD              MDD+GAD          Contrast
             M (SD)
Overall      77.7 (8.10)      78.57 (7.69)     79.20 (7.50)     79.51 (6.64)     p = .40
Anger        67.43 (15.11)    70.39 (17.89)    68.35 (19.87)    75.38 (12.26)    p = 1.00
Disgust      72.78 (17.68)    68.37 (21.60)    74.24 (13.50)    79.04 (19.18)    p = .78
Fear         60.61 (19.02)    63.68 (24.88)    59.52 (18.67)    66.59 (18.80)    p = .55
Happiness    96.91 (3.60)     98.48 (2.63)     97.47 (2.85)     99.25 (1.92)     p = .23
Sad          61.11 (21.99)    67.98 (19.03)    67.97 (22.06)    58.30 (20.49)    p = .43
Surprise     90.80 (12.43)    92.28 (10.10)    91.35 (8.98)     88.21 (11.44)    p = .94

Note: Contrasts were evaluated by 2-tailed t-tests with 86 degrees of freedom.


Hypothesis 3: Response patterns to neutrality in GAD and MDD

We hypothesized that individuals with GAD, MDD, and the comorbid MDD+GAD condition would have more difficulty identifying neutral expressions compared to healthy controls. Contrary to our predictions, the study findings did not reveal any significant differences between GAD (M = 90.49, SD = 12.58), MDD (M = 94.06, SD = 6.18), MDD+GAD (M = 89.31, SD = 9.94), and the healthy controls (M = 90.42, SD = 10.49), t(86) = -3.81, p = .70.

We also hypothesized that participants with GAD and depression (MDD and comorbid MDD+GAD) would endorse differential emotional patterns, namely anger and sadness respectively, in response to expressions of neutrality compared to healthy controls. Although participants with GAD (M = .07, SD = .09) labeled expressions of neutrality as anger more frequently than the control group (M = .04, SD = .06), the difference was not statistically significant, t(86) = 1.65, p = .10. Participants with depression [MDD (M = .02, SD = .03) and MDD+GAD (M = .03, SD = .07)] did not label neutral expressions as sad more frequently than healthy controls (M = .04, SD = .06), t(86) = -.90, p = .37.

Hypothesis 4: Misattributions of specific emotions (other than neutral) in GAD and MDD

We hypothesized that participants with GAD would tend to interpret non-anger-related emotions (other than neutral) as anger more often than control participants. The findings did not reveal any significant differences between participants with GAD and healthy controls for any of the 5 emotions. Also, none of the participants with GAD mislabeled the emotions of happiness or surprise as anger; thus, the means for these emotions were zero. The findings are summarized in Table 5.

Table 5.

Contrasts of mean anger error responses (%) to facial emotion expressions endorsed by participants with Generalized Anxiety Disorder (GAD) and controls as a function of valence of expression (N = 90)

                   Controls      GAD           Contrast
                   M (SD)
Disgust/Anger      .23 (.12)     .29 (.21)     p = .30
Fear/Anger         .03 (.06)     .05 (.07)     p = .24
Happiness/Anger    -             -             -
Sad/Anger          .03 (.04)     .04 (.05)     p = .61
Surprise/Anger     -             -             -

Note: Contrasts were evaluated by 2-tailed t-tests with 86 degrees of freedom.

Similarly, we hypothesized that participants with MDD and comorbid MDD+GAD would be more likely to interpret non-sad emotions (other than neutral) as sadness compared to participants in the control group. Contrary to our predictions, participants with depression (MDD, MDD+GAD) did not differ from healthy controls in their misattributions of sadness in response to these emotions. Moreover, none of the participants with depression mislabeled the emotions of happiness or surprise as sadness. As a result, the mean error responses for these emotions were zero. Table 6 summarizes the findings.

Table 6.

Contrasts of mean sad error responses (%) to facial emotion expressions endorsed by participants with Major Depressive Disorder (MDD), comorbid MDD+GAD, and controls as a function of valence of expression (N = 90)

                 Controls      MDD           MDD+GAD       Contrast
                 M (SD)
Anger/Sad        .06 (.09)     .09 (1.00)    .06 (.09)     p = .55
Disgust/Sad      .01 (.03)     .00 (.00)     .01 (.02)     p = .15
Fear/Sad         .01 (.03)     .01 (.02)     .00 (.00)     p = .64
Happiness/Sad    -             -             -             -
Surprise/Sad     -             -             -             -


Hypothesis 5: Facial emotion recognition, interpersonal functioning and diagnostic status

Finally, we hypothesized that deficits in facial emotion recognition would predict levels of interpersonal difficulties above and beyond diagnostic status (GAD or MDD). In step 1, we entered GAD and MDD status simultaneously. Participants with the comorbid MDD+GAD condition were assigned to both the GAD and MDD groups during this analysis. In step 2, we entered the mean facial emotion recognition accuracy (%). Interpersonal functioning, assessed with a self-report measure (the IIP-SC), was the dependent variable. The regression equations in Table 7 indicate that GAD and MDD status, but not facial emotion recognition accuracy, predicted difficulties in the interpersonal realm.


Table 7.

Hierarchical Linear Regression predicting interpersonal functioning from diagnostic status (GAD, MDD) and facial emotion recognition (N = 90)

                                Interpersonal Functioning
Variable              B (SE B)         β         R2        F
Step 1                                           .33       21.1*
  GAD                 7.72 (3.31)      .21*
  MDD                 17.52 (3.38)     .47**
Step 2                                           .33       .03
  GAD                 7.74 (3.33)      .21*
  MDD                 17.57 (3.41)     .48**
  Total % Accuracy    -.04 (.22)       -.02

*p < .05
**p < .01

Additional Analysis

Because our a priori hypotheses were not supported by our findings, we conducted additional analyses to try to account for the non-significant results. Specifically, we performed a one-way ANOVA to examine whether there were differences in interpersonal functioning based on diagnostic status. Previous research has indicated that individuals with GAD and MDD experience higher levels of interpersonal problems as compared to healthy controls. Consistent with other studies, our findings also indicated that interpersonal functioning differed significantly across our 4 diagnostic groups, F(3, 68) = 15.77, p < .001. Tukey post-hoc comparisons revealed that individuals with GAD (M = 36.68, SD = 17.13) endorsed more interpersonal problems compared to healthy controls (M = 21.42, SD = 11.20), p < .01. Further, participants with MDD (M = 48.77, SD = 17.57) reported more interpersonal difficulties compared to controls, p < .01. Finally, participants with the comorbid MDD+GAD condition (M = 45.00, SD = 14.65) also endorsed more interpersonal difficulties relative to healthy controls, p < .01.

Because the reaction time (measured in milliseconds) to the emotion stimuli might have influenced our facial emotion recognition accuracy findings, we tested this variable as a function of participants' diagnostic status with independent samples t-tests. For example, although participants in the analogue clinical groups did not differ from healthy controls in their ability to recognize facial emotional expressions, they might have required more time to do so compared to the controls. Our data did not reveal any significant differences in reaction time between the analogue clinical groups and controls for any type of emotional expression, including anger, t(88) = .33, p = .37, disgust, t(88) = .58, p = .20, fear, t(88) = .42, p = .21, neutral, t(88) = -.54, p = .51, happiness, t(88) = -1.03, p = .83, sad, t(88) = -.89, p = .33, and surprise, t(88) = .09, p = .60. Overall, participants' reaction time in response to happiness was faster compared to other emotions for both the analogue clinical groups (M = 2349, SD = 514) and controls (M = 2233, SD = 536), whereas participants required more time to respond to images of anger (clinical groups: M = 3520, SD = 699; controls: M = 3570, SD = 713) and fear (clinical groups: M = 3526, SD = 730; controls: M = 3598, SD = 875).

Female participants only were included in the analyses below

Because the meta-analysis conducted by Linardatos and Fresco (2011) indicated that deficits in facial emotion recognition are more pronounced in women with MDD than in men with MDD, we performed additional analyses with the female participants only. Further, our study sample included only a small number of males (20%) overall and in the analogue clinical groups in particular: GAD (27%), MDD (15%), and MDD+GAD (9%). We also combined the MDD and MDD+GAD groups to increase our sample size for the depressed group, especially because we expected the two groups to be similar on cognitive and neuroanatomical variables and, hence, to endorse similar responses in terms of accuracy. Excluding the males from the analysis resulted in a sample of 72 participants: fifteen with GAD, thirty-one with depression, and twenty-six healthy controls. For the analysis, we transformed the interpersonal relationship variable (range of 1 to 73) into a categorical variable by dividing it (at the median = 33) into 2 groups, one representing low interpersonal dysfunction and the other high interpersonal dysfunction. We then performed a two-way ANOVA to examine the relationship of facial emotion recognition accuracy to interpersonal dysfunction (low, high) and diagnostic status (GAD, depression, healthy controls). There was a significant main effect of interpersonal functioning on overall accuracy, F(1, 66) = 6.00, p = .017, η2 = .083. Diagnostic status did not have a significant effect on overall accuracy, F(1, 66) = 1.32, p = .27. The interaction of interpersonal functioning and diagnostic status was significant, F(1, 66) = 4.44, p = .015, η2 = .012. Depressed participants (MDD, MDD+GAD) with higher levels of interpersonal dysfunction endorsed higher levels of accuracy (M = 81.6, SD = 6.7) relative to depressed participants with lower levels of interpersonal dysfunction (M = 77.7, SD = 3.5), whereas participants with GAD and healthy controls endorsed lower levels of accuracy at higher levels of interpersonal dysfunction.
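
The median split and factorial ANOVA just described can be sketched as follows; this is a hypothetical illustration on synthetic data in Python (statsmodels), not the study dataset:

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
females = pd.DataFrame({                       # hypothetical stand-in data
    "group": rng.choice(["control", "GAD", "depressed"], 72),
    "iip": rng.integers(1, 74, 72),            # IIP-SC totals (range 1-73)
    "accuracy": rng.normal(79, 7, 72),         # overall % correct
})
# Median split (reported sample median = 33) into low/high dysfunction.
females["dysfunction"] = np.where(females["iip"] > 33, "high", "low")

model = smf.ols("accuracy ~ C(dysfunction) * C(group)", data=females).fit()
print(sm.stats.anova_lm(model, typ=2))         # main effects and interaction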

Because we hypothesized that our analogue clinical groups would endorse unique response patterns to anger (for GAD), sadness (for depression), and neutrality (GAD and depression), we examined the relationship of interpersonal functioning and diagnostic status to facial expression accuracy for these emotions with two-way ANOVAs. Although there were no significant findings for the emotions of anger and neutrality, the emotion of sadness revealed significant main and interaction effects. Specifically, there was a significant main effect of interpersonal functioning on facial recognition accuracy for sadness, F(1, 66) = 4.00, p = .05, η2 = .057. Diagnostic status also had a significant main effect on accuracy for sadness, F(1, 66) = 3.38, p = .04, η2 = .09. The interaction of interpersonal functioning and diagnostic status was significant, F(1, 66) = 3.61, p = .03, η2 = .09. Participants with depression endorsed higher levels of accuracy in response to sadness when they reported higher levels of interpersonal dysfunction (M = 64.3, SD = 22.6), relative to lower levels of interpersonal dysfunction (M = 53.9, SD = 16.5), whereas participants with GAD and healthy controls endorsed lower levels of accuracy in response to sadness when they reported higher levels of interpersonal dysfunction.

We also examined the relationship of interpersonal functioning and diagnostic status to sad and anger misattributions in response to neutral expressions with a two-way ANOVA. We found only a significant interaction of diagnostic status and interpersonal functioning for anger misattributions, F(1, 66) = 3.78, p = .03, η2 = .10. Participants with GAD (M = .10, SD = .11) and depression (M = .05, SD = .06) were more likely to attribute anger to neutral expressions when they endorsed higher levels of interpersonal dysfunction, relative to lower levels of dysfunction, as compared to healthy controls (M = .02, SD = .03).

Comparisons regarding possible effects of medication and psychotherapy on facial emotion recognition were not possible because of the limited number of study participants receiving treatment (19%) in the original sample.

DISCUSSION

The present study examined the accuracy of facial emotion recognition in GAD, MDD, and the comorbid MDD+GAD condition and its relationship to interpersonal functioning. The findings from the present study did not support our hypotheses of a general deficit in facial emotion recognition accuracy in depression and anxiety compared to healthy individuals with matched intelligence and overall executive functioning. With regard to depression, this finding is inconsistent with many of the previous studies that have linked MDD to impairments in facial emotion recognition accuracy (e.g., Csukly et al., 2009; Feinberg et al., 1986; Persad & Polivy, 1993). However, it is consistent with several other studies that have reported a lack of deficits in facial emotion recognition in MDD (e.g., Bediou et al., 2005; Gaebel & Wölwer, 1992; Gessler, Cutting, Frith, & Weinman, 1989). One factor that may account for our null findings is the level of difficulty of our facial emotion recognition task. In general, increasing the number of stimuli presented during the task (ranging from 1 to 7 in previous studies) while decreasing the time allowed for stimulus presentation (ranging from 200 milliseconds to 5 minutes in previous studies) also increases task difficulty. We specifically included 7 stimuli in our experimental paradigm and set our stimulus duration at 300 milliseconds. Our decision to use these experimental parameters was based on our goal to examine the participants' responses to all 6 basic emotions (in addition to neutral expressions) and to ensure that our stimulus duration reflects the time necessary for humans to perceive a visual stimulus (< 300 milliseconds). Although previous studies have used stimulus durations as low as 200 milliseconds, they employed fewer than seven stimuli. Despite the high difficulty level of our facial emotion recognition task, the analogue clinical groups and healthy controls did not differ in their reaction time to the task stimuli.

Further, we failed to identify any specific response biases associated with diagnostic status (e.g., MDD vs. healthy controls). Previous studies, for example, have found that depression is associated with negative biases, in that depressed individuals tend to label neutral facial expressions as sad (e.g., Gur, Erwin, Gur, & Zwil, 1992; Leppänen et al., 2004). Although some studies have shown response biases to neutral expressions, other studies have failed to observe similar findings. For example, Langenecker and colleagues (2005) found that individuals with MDD did not differ from healthy controls in their ability to identify neutral expressions.

Similarly to MDD, our findings suggest that individuals with GAD do not differ from healthy controls in their ability to identify affect from facial expressions. Further, individuals with GAD do not appear to endorse any response biases to facial emotion stimuli or to neutral expressions compared to individuals with MDD or healthy controls. Because research in the area of facial emotion recognition in GAD is currently very limited, it is premature to offer any conclusions about this process based on our findings alone. Future research is required to determine whether our results are specific to GAD or are related to factors specific to our study (e.g., study population).

With regard to interpersonal functioning, researchers have suggested that deficits in facial emotion recognition in MDD may underlie the interpersonal difficulties endorsed by individuals with depression. To our knowledge, none of the previous studies in the area of facial emotion recognition have assessed perceived levels of interpersonal functioning. Our findings showed that the analogue clinical groups differed from the healthy controls in their report of interpersonal problems, in that interpersonal problems were higher in the analogue clinical groups relative to controls. However, difficulties in facial emotion recognition did not predict interpersonal problems above and beyond psychopathology.


Although our hypotheses were not supported by the data, additional analyses based on past empirical and theoretical perspectives in this line of research revealed potentially interesting relationships among our study variables. A recent meta-analysis conducted by Linardatos and Fresco (2011) indicated that women with depression endorsed lower overall facial emotion recognition accuracy compared to men with depression as well as healthy women. As a result, we conducted additional analyses with female participants only. With regard to overall accuracy, participants with depression were more likely to endorse greater overall accuracy when they reported higher difficulties in the interpersonal realm. However, participants with GAD and healthy controls endorsed more deficits in facial emotion recognition when they reported more interpersonal difficulties. Our analysis also revealed that individuals with depression were more likely to recognize sad expressions correctly when they endorsed high levels of interpersonal dysfunction relative to lower levels of dysfunction. This finding, although tentative, appears to be consistent with previous research suggesting that individuals with depression can more readily identify expressions of sadness compared to controls. It may be that accurately identifying expressions of sadness exacerbates negative automatic thoughts and feelings about the self and others in depression and, in turn, negatively influences interpersonal relationships. Alternatively, increased interpersonal problems may increase sensitivity to mood-congruent cues and stimuli in depression, which may then further increase interpersonal problems. Regardless of the directionality between interpersonal functioning and facial recognition accuracy for sad expressions in depression, this observed trend might explain the lack of facial emotion recognition deficits in general between our analogue clinical groups and healthy controls. Specifically, it is possible that the enhanced accuracy for sad expressions in the depression group when interpersonal dysfunction is high, coupled with the high levels of interpersonal problems associated with the group, counteracted the overall facial emotion recognition deficits associated with psychopathology in our sample.

Further, participants with GAD and depression were more likely to attribute anger to neutral expressions when experiencing higher levels of interpersonal dysfunction, relative to lower dysfunction. In contrast, healthy controls endorsed more anger-related misattributions at lower, relative to higher, levels of interpersonal functioning. Previous research has suggested that individuals with GAD tend to attend to threatening stimuli in general, and in ambiguous situations in particular. It is plausible that when individuals with GAD experience difficulties in their interpersonal life, their tendency to detect threat is enhanced, especially when ambiguity and uncertainty are high. Although depression has been primarily linked to thoughts and feelings related to loss, failure, and sadness, it is possible that significant interpersonal problems and low self-worth would predispose depressed individuals to interpret ambiguous stimuli as angry and threatening.

Although the aforementioned findings offer a more detailed account of the role of facial emotion recognition in GAD and depression in the context of interpersonal functioning, it would be premature to draw any definitive conclusions at this time. Future research with larger sample sizes that are inclusive of male participants is required to replicate and expand these findings.


Limitations

Findings from the present study must be considered in light of some limitations, including sample size and methodology. First, due to technological difficulties, the number of trials the participants received within an emotion category varied during the task. Our results, however, were based on percentages, reducing the effects of this variability within each emotion category. Additionally, the total number of trials presented during the task was equal across participants. Second, due to time constraints and difficulties recruiting individuals with clinical levels of psychopathology, our sample size was relatively small, which resulted in a slight lack of statistical power.

Finally, the sample size for our clinical groups was relatively small. Previous studies of facial emotion recognition in depression, for example, have used sample sizes ranging from 16 to 23 participants with MDD, whereas our sample included only 13 participants with MDD. Further, most of the studies in the field of facial emotion recognition have included only two groups, namely a clinical group and healthy controls. We were also interested, however, in comparing the MDD group to another clinical group to determine whether potential impairments in facial emotion recognition in MDD are unique to this mental health condition. Given that MDD co-occurs highly with GAD, including a fourth, comorbid group in our sample seemed conceptually essential, but it increased the sample size needed statistically. When we combined the MDD group and the comorbid MDD+GAD group in the analyses, some significant trends in the process of facial emotion recognition were uncovered, suggesting that larger clinical sample sizes may be instrumental in this line of research. Additionally, larger clinical samples would offer more opportunities to examine the relationship of gender to facial emotion recognition, given that women are consistently shown to be at greater risk than men of developing MDD and GAD at some point during their lives (e.g., Nolen-Hoeksema, 1990; Regier et al., 1993).

In terms of the methodological limitations, it is important to examine the use of static versus animated (i.e., dynamic) stimuli in the facial emotion recognition task. We used a set of prototypical facial expressions of the six basic emotions, in addition to neutral expressions, from a standardized series of images (POFA; Ekman & Friesen, 1976). Some researchers have suggested, however, that animated faces may be more appropriate for identifying deficits in facial emotion recognition because they are more ecologically valid compared to static stimuli (e.g., Gollan et al., 2008). In real-life situations, people respond to facial expressions of affect that vary in intensity and duration; we are not only sensitive to intense and sustained facial expressions of emotion, but also to subtle changes in emotional displays (e.g., Ambadar, Schooler, & Cohn, 2005). In contrast, other researchers have argued that animated faces may produce higher recognition rates, which in turn could result in a ceiling effect that may decrease the possibility of identifying differences among study groups (Csukly et al., 2009). Indeed, healthy participants perceive emotion intensity to be higher in dynamic facial expressions than in static facial expressions (e.g., Biele & Grabowska, 2006). Further, dynamic stimuli have been found to facilitate facial expression recognition (Ambadar et al., 2005). Similarly to dynamic stimuli, static but morphed images have been considered more ecologically valid than unmorphed stimuli. Because this type of stimulus includes images that gradually change emotional expression from neutral (no emotion) to full intensity (e.g., sad, happy, angry), researchers have argued that it represents the interpersonal interactions occurring in everyday life more realistically than unmorphed stimuli (e.g., Joormann et al., 2006).

Although the difference between the overall facial emotion recognition accuracy of the analogue clinical groups and the healthy controls was not statistically significant in our study, the healthy controls performed slightly worse than the analogue clinical groups (M = 77.70%, SD = 8.70 vs. M = 79.10%, SD = 7.10). This finding raises the question of whether healthy controls performed at an optimal level on the facial emotion recognition task. However, we did not assess the study participants' level of engagement and effort during the task. Having this information might have allowed us to identify suboptimal levels of effort among participants. In other words, it is possible that the findings from the task underrepresented the participants' true level of emotional processing due to lack of effort.

Additionally, our study participants were a convenience sample composed of college students. Therefore, the generalizability of our findings is limited due to the unique characteristics of our sample with respect to age, education, and general functioning. For example, the analogue clinical groups and controls in our sample had an estimated average IQ equal to or greater than 111, which is above the population mean of 100. Linardatos and Fresco (2011) conducted a meta-analysis of 13 studies examining facial emotion recognition accuracy in MDD (unpublished data). All but one of the studies reviewed in the meta-analysis included clinical samples of individuals in inpatient and outpatient settings seeking or currently receiving treatment for their depression. Of the 54 participants included in our analogue clinical groups (GAD, MDD, MDD+GAD), only 11 individuals were receiving some type of treatment for their mental health difficulties, including psychotherapy only (2), pharmacotherapy only (4), and a combination of psychotherapy and pharmacotherapy (5), at the time of their participation. Further, the average age of the patients included in the meta-analysis was M = 40.33 and the average BDI score was M = 26.65. In our study, however, the average age of the participants in all analogue clinical groups was M = 20.3 and the average BDI score was M = 20.8. The average age of the MDD participants alone was M = 19.5 and the average BDI score was M = 24.54.

Future directions

The aforementioned limitations provide several opportunities for future research projects in the area of MDD and GAD and facial emotion recognition. The current experimental paradigm could be replicated with static morphed images or dynamic stimuli. Animated images of facial expressions of emotion may represent human emotions more accurately than static stimuli and hence provide greater sensitivity and ecological validity for future experimental designs. It is also plausible that the intensity of facial expressions modulates the recognition of facial expressions of affect, and thus employing static morphed stimuli of varying intensity may be essential in this line of research. Our findings, albeit tentative, suggested that participants with depression are more accurate in labeling expressions of sadness when interpersonal functioning is low.

A future study employing static morphed stimuli, for instance, could examine the relationship of interpersonal functioning and sensitivity to facial expressions of sadness to determine whether high dysfunction is also related to a lower recognition threshold for sad expressions. Further, effort measures could be employed to identify suboptimal levels of performance on the facial emotion recognition task. At the end of the experiment, for example, study participants may be asked to indicate whether they found the task boring or interesting, and overall easy or difficult. They could also be asked whether they put their full effort into the facial recognition task throughout the experiment, or only at the beginning or end. The investigators may then use this information to eliminate lack of effort as a potential contributing factor to inaccuracies in facial emotion recognition. Future research may also examine the process of facial emotion recognition and its relationship to interpersonal functioning in clinical populations of GAD and MDD who vary in terms of socioeconomic status, age, symptom severity and overall functioning, treatment-seeking behavior (e.g., medication and/or psychotherapy), and treatment response.

Finally, functional imaging studies contrasting GAD, MDD, and comparison groups using different emotional categories may establish the neural substrates of potential emotion recognition deficits in these disorders.

Depression and anxiety are mental health problems often characterized by a chronic course during which symptoms wax and wane and impairments in overall functioning range from mild to severe. Over 75% of individuals with depression, for instance, have more than one depressive episode in their lifetime, and 50% relapse within 2 years of recovery (e.g., Keller & Boland, 1998; Keller et al., 1992). More importantly, the risk for relapse increases with each additional depressive episode (e.g., Kessler et al., 2003). With regard to emotional processing, studies have shown that deficits in facial emotion recognition are associated with relapse (Bouhuys, Geerts, & Gordijn, 1999) and symptom persistence in depression (Hale, 1998). Additionally, Leppänen and colleagues (2004) found that deficits in facial recognition of neutral expressions persisted following the remission of depressive symptoms. However, findings from another study have indicated that impairments in facial emotion recognition are alleviated during remission and thus are more likely to be a state rather than a trait feature of depression (Surguladze et al., 2004). Similarly, GAD has low rates of remission (e.g., Kessler et al., 2005). Although several cognitive and behavioral models of GAD have informed treatment practices over the last two decades, success in treating GAD lags behind that achieved for other anxiety and mood disorders (Borkovec & Ruscio, 2001; Fisher, 2006). Given the chronic patterns of depression and anxiety, future research may examine the process of facial emotion recognition with longitudinal designs.

Clinical Implications

Our findings suggested that study participants with MDD, GAD, or the comorbid condition did not differ from healthy controls in their ability to identify emotions from facial expressions. Other studies, however, have found deficits in emotional processing in both depression (e.g., Leppänen et al., 2004) and anxiety (e.g., Joormann et al., 2006). Thus, evaluating potential risk and protective factors associated with the process of facial emotion recognition in depression and anxiety might be essential for the development of appropriate treatments in the future. In the area of treatment in schizophrenia, which has also been associated with deficits in facial emotion recognition, training programs in facial emotion recognition have been developed and successfully implemented. According to Wölwer and colleagues, for example, a computer-based program that includes training with ambiguous expressions of affect as well as identification of the 6 basic emotions can improve the ability to identify emotions from facial expressions in schizophrenia (Wölwer, Frommann, Halfmann, Piaszek, Streit, & Gaebel, 2005). Similar training approaches could be incorporated into the treatment of depression and anxiety to enhance facial emotion recognition and subsequently improve interpersonal functioning. Research has indicated, for example, that the quality of interpersonal relationships may predict long-term treatment outcomes in both GAD (Durham, Allan, & Hackett, 1997) and depression (Beach & O'Leary, 1986).

In spite of the limitations of the present study and the lack of support for our hypotheses, a substantial strength of this project lies in the experimental design. Specifically, we included measures of executive functioning ability (e.g., premorbid IQ, visuospatial ability) in our study to determine whether potential deficits in facial emotion recognition reflect a global perceptual impairment rather than a deficit in the perception of emotion. Eight participants in our study sample, for example, produced abnormal BFRT scores suggesting a deficit in discriminating faces that could have confounded our results. Further, although researchers have proposed that impairments in facial emotion recognition may underlie the interpersonal difficulties often associated with depression and anxiety, previous studies have not examined this variable. Our study was the first to include measures of perceived interpersonal functioning in the experimental design and to directly explore the relationship of interpersonal functioning to facial emotion recognition accuracy. Future research is required to evaluate whether these abnormalities in emotional processing do indeed contribute to vulnerability for the development and/or maintenance of depression and anxiety, or are a byproduct of these disorders. Moreover, our study was the first to examine the process of facial emotion recognition in GAD. Although the accuracy of facial emotion recognition has been examined in other anxiety disorders, such as social anxiety (e.g., Joormann et al., 2006) and OCD (e.g., Aigner et al., 2007), this process has not been addressed in GAD to date. Our findings indicated that GAD is not associated with deficits in facial emotion recognition, and further research to replicate these findings is warranted. Finally, our study was the first to evaluate the process of recognizing emotions from facial expressions in a group of individuals with co-occurring MDD and GAD. Given that MDD and GAD are highly comorbid, an important task of future research in facial emotion recognition is to discern the common and unique features of these disorders. In summary, research in emotional processing in MDD and GAD would benefit greatly from longitudinal designs that would allow us to better comprehend the nature of these mental health conditions and potentially develop more efficacious treatments.

REFERENCES

Adolphs, R. (2002). Recognizing emotion from facial expressions: Psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews, 1, 21–61.

Aigner, M., Sachs, G., Bruckmüller, E., Winklbaur, B., Zitterl, W., Kryspin-Exner, I., Gur, R., & Katschnig, H. (2007). Cognitive and emotion recognition deficits in obsessive-compulsive disorder. Psychiatry Research, 15, 121–128.

American Psychiatric Association (1994). Diagnostic and statistical manual of mental disorders (4th ed.). Washington, DC: Author.

Anderson, A. K., & Phelps, E. A. (2000). Expression without recognition: Contributions of the human amygdala to emotional communication. Psychological Science, 11, 106–111.

Aristotle (1991). The art of rhetoric. London: Penguin.

Averill, J. R. (1982). Anger and aggression: An essay on emotion. New York: Springer-Verlag.

Barlow, D. H. (2002). Anxiety and its disorders: The nature and treatment of anxiety and panic (2nd ed.). New York: The Guilford Press.

Barnett, P. A., & Gotlib, I. H. (1988). Psychosocial functioning and depression: Distinguishing among antecedents, concomitants, and consequences. Psychological Bulletin, 104, 97–126.

Barnett, M. A., King, L. M., & Howard, J. A. (1979). Inducing affect about self or other: Effects on generosity in children. Developmental Psychology, 15, 164–167.

Beach, S. R. H., & O'Leary, D. K. (1986). The treatment of depression occurring in the context of marital discord. Behaviour Therapy, 17, 43–50.

Beck, A. T. (1976). Cognitive therapy and the emotional disorders. New York: International Universities Press.

Beck, A. T., Brown, G., & Steer, R. A. (1996). Beck Depression Inventory II manual. San Antonio, TX: The Psychological Corporation.

Bediou, B., Krolak-Salmon, P., Saoud, M., Henaff, M., Burt, M., Dalery, J., et al. (2005). Facial expression and sex recognition in schizophrenia and depression. The Canadian Journal of Psychiatry, 50, 525–533.

Benton, A. L., Sivan, A. B., Hamsher, K., Varney, N. R., & Spreen, O. (1994). Contributions to neuropsychological assessment. New York: Oxford University Press.

Berkowitz, L. (1990). On the formation and regulation of anger and aggression: A cognitive-neoassociationistic analysis. American Psychologist, 45, 494–503.

Blair, R. J. R., & Cipolotti, L. (2000). Impaired social response reversal: A case of 'acquired sociopathy'. Brain, 123, 1122–1141.

Blair, R. J. R., Morris, J. S., Frith, C. C., Perrett, D. I., & Dolan, R. J. (1999). Dissociable neural responses to facial expressions of sadness and anger. Brain, 122, 883–893.

Borkovec, T. D., Alcaine, O., & Behar, E. (2004). Avoidance theory of worry and generalized anxiety disorder. In R. G. Heimberg, C. L. Turk, & D. S. Mennin (Eds.), Generalized anxiety disorder: Advances in research and practice (pp. 77–108). New York: Guilford Press.

Borkovec, T. D., Newman, M. G., Pincus, A. L., & Lytle, R. (2002). A component analysis of cognitive-behavioral therapy for generalized anxiety disorder and the role of interpersonal problems. Journal of Consulting and Clinical Psychology, 70, 288–298.

Botteron, K., Raichle, M., Drevets, W. C., Heath, A., & Todd, R. D. (2002). Volumetric reduction in left subgenual prefrontal cortex in early onset depression. Biological Psychiatry, 51, 342–344.

Bouhuys, A. L., Geerts, E., & Gordijn, M. C. (1999). Depressed patients' perceptions of facial emotions in depressed and remitted states are associated with relapse: A longitudinal study. Journal of Nervous and Mental Disease, 187, 595–602.

Bowlby, J. (1973). Attachment and loss: Vol. 2. Separation and anger. New York: Basic Books.

Bradley, B. P., Mogg, K., White, J., Groom, C., & Bono, J. D. (1999). Attentional bias for emotional faces in generalized anxiety disorder. British Journal of Clinical Psychology, 38, 267–278.

Brandes, M., & Bienvenu, O. J. (2006). Personality and anxiety disorders. Current Psychiatry Reports, 8, 263–269.

Breitholtz, E., Johansson, B., & Ost, L. G. (1999). Cognitions in generalized anxiety disorder and panic disorder patients: A prospective approach. Behaviour Research and Therapy, 37, 533–544.

Bremner, J. D., Vythilingham, M., Vermetten, E., Nazeer, A., Adil, J., Khan, S., et al. (2002). Reduced volume of orbitofrontal cortex in major depression. Biological Psychiatry, 51, 273–279.

Brown, T. A., Campbell, L. A., Lehman, C. L., Grisham, J. R., & Mancill, R. B. (2001). Current and lifetime comorbidity of the DSM-IV anxiety and mood disorders in a large clinical sample. Journal of Abnormal Psychology, 110, 585–599.

Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305–327.

Calder, A. J., Young, A. W., Keane, J., & Dean, M. (2000). Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance, 26, 527–551.

Carstensen, L., Pasupathi, M., Mayr, U., & Nesselroade, J. R. (2001). Emotional experience in everyday life across the adult life span. Journal of Personality and Social Psychology, 79, 644–655.

Carton, J. S., Kessler, E. A., & Pape, C. L. (1999). Nonverbal decoding skills and relationship well-being in adults. Journal of Nonverbal Behavior, 23, 91–100.

Clark, D. A., & Beck, A. T. (1989). Cognitive theory and therapy of anxiety and depression. In P. C. Kendall & D. Watson (Eds.), Anxiety and depression: Distinctive and overlapping features (pp. 379–411). San Diego: Academic Press.

Clark, D. A., Beck, A. T., & Stewart, B. (1990). Cognitive specificity and positive-negative affectivity: Complementary or contradictory views on anxiety and depression? Journal of Abnormal Psychology, 99, 148–155.

Corcoran, K. M., Woody, S. R., & Tolin, D. F. (2008). Recognition of facial expressions in obsessive-compulsive disorder. Journal of Anxiety Disorders, 22, 56–66.

Coyne, J. C. (1976). Depression and the response of others. Journal of Abnormal Psychology, 85, 186–193.

Craske, M. G., Rapee, R. M., Jackel, L., & Barlow, D. H. (1989). Qualitative dimensions of worry in DSM-III-R generalized anxiety disorder subjects and nonanxious controls. Behaviour Research and Therapy, 27, 397–402.

Csukly, G., Czobor, P., Szily, E., Takács, B., & Simon, L. (2009). Facial expression recognition in depressed subjects: The impact of intensity level and arousal dimension. The Journal of Nervous and Mental Disease, 197, 98–103.

Darwin, C. (1898). The expression of the emotions in man and animals. New York: D. Appleton & Co.

De Bellis, M. D., Casey, B. J., Dahl, R. E., Birmaher, B., Williamson, D. E., & Thomas, K. M. (2000). A pilot study of amygdala volumes in pediatric generalized anxiety disorder. Biological Psychiatry, 48, 51–57.

Dugas, M. J., Anderson, K. G., Deschenes, S. S., & Donegan, E. (2010). Generalized anxiety disorder publications: Where do we stand a decade later? Journal of Anxiety Disorders, 24, 780–784.

Durham, R. C., Allan, T., & Hackett, C. A. (1997). On predicting improvement and relapse in generalized anxiety disorder following psychotherapy. British Journal of Clinical Psychology, 36, 101–119.

Edwards, T., Manstead, A. S. R., & MacDonald, C. J. (1984). The relationship between children's sociometric status and ability to recognize facial expression of emotion. European Journal of Social Psychology, 14, 235–238.

Ekman, P. (1992). Are there basic emotions? Psychological Review, 99, 550–553.

Ekman, P. (1999). Facial expressions. In T. Dalgleish & M. Power (Eds.), The handbook of cognition and emotion (pp. 301–320). UK: John Wiley & Sons.

Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124–129.

Ekman, P., & Friesen, W. V. (1978). Facial action coding system. Palo Alto, CA: Consulting Psychologists Press.

Ekman, P., Friesen, W. V., O'Sullivan, M., Chan, A., Diacoyanni-Tarlatzis, I., Heider, K., et al. (1987). Universals and cultural differences in the judgments of facial expressions of emotion. Journal of Personality and Social Psychology, 53, 712–717.

Eng, W., & Heimberg, R. G. (2006). Interpersonal correlates of generalized anxiety disorder: Self versus other perception. Journal of Anxiety Disorders, 20, 380–387.

Ensel, W. (1986). Measuring depression: The CES-D scale. In N. Lin, A. Dean, & W. M. Ensel (Eds.), Social support, life events and depression. New York: Academic Press.

Feinberg, T. E., Rifkin, A., Schaffer, C., & Walker, E. (1986). Facial discrimination and emotional recognition in schizophrenia and affective disorders. Archives of General Psychiatry, 43, 276–279.

First, M. B., Spitzer, R. L., Gibbon, M., & Williams, J. B. W. (2002). Structured Clinical Interview for DSM-IV-TR Axis I Disorders - Patient Edition (with Psychotic Screen). New York: New York State Psychiatric Institute.

Fisher, P. L. (2006). The efficacy of psychological treatments for generalized anxiety disorder. In G. C. L. Davey & A. Wells (Eds.), Worry and its psychological disorders: Theory, assessment and treatment (pp. 359–377). New York: John Wiley and Sons.

Fredman, L., Weissman, M. M., Leaf, P. J., & Bruce, M. L. (1988). Social functioning in community residents with depression and other psychiatric disorders: Results of the New Haven Epidemiologic Catchment Area Study. Journal of Affective Disorders, 15, 103–112.

Frijda, N. H., Kuipers, P., & ter Schure, E. (1989). Relations among emotion, appraisals, and emotional action readiness. Journal of Personality and Social Psychology, 57, 212–228.

Gaebel, W., & Wölwer, W. (1992). Facial expression and emotional face recognition in schizophrenia and depression. European Archives of Psychiatry and Clinical Neuroscience, 242, 46–52.

Gessler, S., Cutting, J., Frith, C. D., & Weinman, J. (1989). Schizophrenic inability to judge facial emotion: A controlled study. British Journal of Clinical Psychology, 28, 19–29.

Grober, E., & Sliwinski, M. (1991). Development and validation of a model for estimating premorbid verbal intelligence in the elderly. Journal of Clinical and Experimental Neuropsychology, 13, 933–949.

Gross, J. J. (1999). Emotion and emotion regulation. In L. A. Pervin & O. P. John (Eds.), Handbook of personality: Theory and research (2nd ed., pp. 525–552). New York: Guilford Press.

Gur, R. C., Erwin, R. J., Gur, R. E., Zwil, A. S., Heimberg, C., & Kraemer, H. C. (1992). Facial emotion discrimination: II. Behavioral findings in depression. Psychiatry Research, 42, 241–251.

Hale, W. W. (1998). Judgment of facial expressions and depression persistence. Psychiatry Research, 80, 265–274.

Hecht, M. A., & LaFrance, M. (1998). License or obligation to smile: The effect of power and sex on amount and type of smiling. Personality and Social Psychology Bulletin, 24, 1332–1342.

Henning, E. R., Turk, C. L., Mennin, D. S., Fresco, D. M., & Heimberg, R. G. (2007). Impairment and quality of life in individuals with generalized anxiety disorder. Depression and Anxiety, 24, 342–349.

Hopf, H. C., Muller-Forell, W., & Hopf, N. J. (1992). Localization of emotional and volitional facial paresis. Neurology, 42, 1918–1923.

Hopwood, C. J., Pincus, A. L., DeMoor, R. M., & Koonce, E. A. (2008). Psychometric characteristics of the Inventory of Interpersonal Problems-Short Circumplex (IIP-SC) with college students. Journal of Personality Assessment, 90, 615–618.

Huppert, J. D., & Alley, A. C. (2004). The clinical application of emotion research in generalized anxiety disorder: Some proposed procedures. Cognitive and Behavioral Practice, 11, 387–392.

Ikuta, M. (1999). The self-regulatory function of facial expression in conflict discourse situations. Japanese Journal of Counseling Science, 32, 43–48.

Izard, C., Fine, S., Schultz, D., Mostow, A., Ackerman, B., & Youngstrom, E. (2001). Emotion knowledge as a predictor of social behavior and academic competence in children at risk. Psychological Science, 12, 18–23.

Joiner, T. E., Alfano, M. S., & Metalsky, G. I. (1992). When depression breeds contempt: Reassurance seeking, self-esteem, and rejection of depressed college students by their roommates. Journal of Abnormal Psychology, 101, 165–173.

Jolly, J. B., Dyck, M. J., Kramer, T. A., & Wherry, J. N. (1994). Integration of positive and negative affectivity and cognitive content-specificity: Improved discrimination of anxious and depressed symptoms. Journal of Abnormal Psychology, 103, 544–552.

Joormann, J. & Gotlib, I. H. (2006). Is This Happiness I See? Biases in the identification

of emotional facial expressions in depression and social phobia. Journal of

Abnormal Psychology, 115, 705-714.

Kan, Y., Mimura, M., Kamijima, K., & Kawamura, M. (2004). Recognition of emotion

from moving facial and prosodic stimuli in depressed patients. Journal of

Neurology, Neurosurgery, and Psychiatry, 75, 1667-1671.

Keitner, G. I., & Miller, I. W. (1990). Family functioning and major depression: An

overview. American Journal of Psychiatry, 147, 1128-1137.

Kessler, R. C., Chiu, W. T., Demler, O., Merikangas, K. R., & Walters, E. E. (2005).

Prevalence, severity, and comorbidity of 12-month DSM-IV disorders in the

National Comorbidity Survey Replication. Archives of General Psychiatry, 62,

617-627.

Killgore, W. D. S., & Yurgelun-Todd, D. A. (2004). Activation of the amygdala and

anterior cingulate during nonconscious processing of sad versus happy faces.

NeuroImage, 21, 1215–1223.

Kohler, C. G., Bilker, W., Hagendoorn, M., Gur, R. E., & Gur, R. C. (2000). Emotion

recognition deficit in schizophrenia: Association with symptomatology and

cognition. Biological Psychiatry, 48, 127–136.

Kornreich, C., Blairy, S., Philippot, P., Dan, B., Foisy, M., Hess, U., et al. (2001).

Impaired emotional facial expression recognition in alcoholism compared with

obsessive-compulsive disorder and normal controls. Psychiatry Research, 102,

235-248.

Kucharska-Pietura, K., Nikolaou, V., Masiak, M., & Treasure, J. (2004). The recognition

of emotion in the faces and voice of anorexia nervosa. International Journal of

Eating Disorders, 35, 42–47.

Langenecker, S. A., Bieliauskas, L. A., Rapport, L. J., Zubieta, J. K., Wilde, E. A., &

Berent, S. (2005). Face emotion perception and executive functioning deficits in

depression. Journal of Clinical and Experimental Neuropsychology, 27, 320-333.

Leppänen, J. M., Milders, M., Bell, J. S., Terriere, E., & Hietanen, J. K. (2004).

Depression biases the recognition of neutral faces. Psychiatry Research, 128,

123–133.

Macmillan, N. A., & Creelman, C. D. (1990). Response bias: Characteristics of detection

theory, threshold theory, and “nonparametric” measures. Psychological Bulletin,

107, 401-413.

Macmillan, N. A., & Creelman, C. D. (2005). Detection theory: A user’s guide (2nd ed.).

Mahwah, NJ: Erlbaum.

Marsh, A. A., Ambady, N., & Kleck, R. E. (2005). The effects of fear and anger facial

expressions on approach- and avoidance-related behaviors. Emotion, 5, 119-124.

Mathews, A., & MacLeod, C. (1994). Cognitive approaches to emotion and emotional

disorders. Annual Review of Psychology, 45, 25-50.

McClure, E. B. (2000). A meta-analytic review of sex differences in facial expression

processing and their development in infants, children, and adolescents.

Psychological Bulletin, 126, 424–453.

Mineka, S. (1992). Evolutionary memories, emotional processing and the emotional

disorders. In D. Medin (Ed.), The psychology of learning and motivation (Vol.

28, pp. 161-206). New York: Academic Press.

Mineka, S., & Tomarken, A. (1989). The role of cognitive biases in the origins and

maintenance of fear and anxiety disorders. In T. Archer & L. Nilsson (Eds.),

Aversion, avoidance, and anxiety: Perspectives on aversively motivated behavior

(pp. 195-221). Hillsdale, NJ: Erlbaum.

Mogg, K., Millar, N., & Bradley, B. P. (2000). Biases in eye movements to threatening

facial expressions in generalised anxiety disorder and depressive disorder.

Journal of Abnormal Psychology, 109, 695-704.

Monk, C. S., Telzer, E. H., Mogg, K., et al. (2008). Amygdala and ventrolateral

prefrontal cortex activation to masked angry faces in children and adolescents

with generalized anxiety disorder. Archives of General Psychiatry, 65, 568-576.

Newman, M. G., Zuellig, A. R., Kachin, K. E., Constantino, M. J., Przeworski, A.,

Erickson, T., & Cashman-McGrath, L. (2002). Preliminary reliability and validity

of the Generalized Anxiety Disorder Questionnaire-IV: A revised self-report

diagnostic measure of generalized anxiety disorder. Behavior Therapy, 33,

215-233.

Nolen-Hoeksema, S. (1998). The other end of the continuum: The costs of rumination.

Psychological Inquiry, 9, 216-219.

Persad, S. M., & Polivy, J. (1993). Differences between depressed and nondepressed

individuals in the recognition of and response to facial emotional cues. Journal

of Abnormal Psychology, 102, 358–368.

Phan, K. L., Wager, T., Taylor, S. F., & Liberzon, I. (2002). Functional neuroanatomy of

emotion: A meta-analysis of emotion activation studies in PET and fMRI.

NeuroImage, 16, 331-348.

Phillips, M. L., Drevets, W. C., Rauch, S. L., & Lane, R. (2003a). Neurobiology of

emotion perception I: The neural basis of normal emotion perception.

Biological Psychiatry, 54, 504-514.

Pyne, J. M., Patterson, T. L., Kaplan, R. M., Gillin, J. C., Koch, W. L., & Grant, I.

(1997). Assessment of the quality of life of patients with major depression.

Psychiatric Services, 48, 224-230.

Radloff, L. S. (1977). The CES-D Scale: A self-report depression scale for research in the

general population. Applied Psychological Measurement, 1, 385-401.

Ridout, N., O’Carroll, R., Dritschel, B., Christmas, D., Eljamel, M., & Matthews, K.

(2007). Emotion recognition from dynamic emotional displays following anterior

cingulotomy and anterior capsulotomy for chronic depression.

Neuropsychologia, 45, 1735-1743.

Roy, M.-A., Neale, M. C., Pedersen, N. L., Mathé, A. A., & Kendler, K. S. (1995). A twin

study of generalized anxiety disorder and major depression. Psychological

Medicine, 25, 1037-1049.

Rozin, P., & Fallon, A. E. (1987). A perspective on disgust. Psychological Review, 94,

23-41.

Rozin, P., Haidt, J., & McCauley, C. R. (2000). Disgust. In M. Lewis & J. Haviland

(Eds.), Handbook of emotions (2nd ed., pp. 637-653). New York: Guilford

Press.

Salovey, P., & Mayer, J. D. (1990). Emotional intelligence. Imagination, Cognition and

Personality, 9, 185-211.

Schmolck, H., & Squire, L. R. (2001). Impaired perception of facial emotions following

bilateral damage to the anterior temporal lobe. Neuropsychology, 15, 30-38.

Segrin, C. (2000). Social skills deficits associated with depression. Clinical Psychology

Review, 20, 379-403.

Stark, C. E. L., & Squire, L. R. (2000). Intact visual perceptual discriminations in humans

in the absence of perirhinal cortex. Learning and Memory, 7, 273-278.

Thompson, B. (1994). Planned versus unplanned and orthogonal versus nonorthogonal

contrasts: The neo-classical perspective. In B. Thompson (Ed.), Advances in

social science methodology. Greenwich, CT: JAI Press.

Wallbott, H. G. (1991). Recognition of emotion from facial expression via imitation?

Some indirect evidence for an old theory. British Journal of Social Psychology,

30, 207-219.

Waller, B. M., Cray, J. J., & Burrows, A. M. (2008). Selection for universal facial

emotion. Emotion, 8, 435-439.

Watson, D., O'Hara, M. W., & Stuart, S. (2008). Hierarchical structures of affect and

psychopathology and their implications for the classification of emotional

disorders. Depression and Anxiety, 25, 282-288.

Wechsler, D. (2008). WAIS-IV administration and scoring manual. San Antonio, TX:

The Psychological Corporation.

Winkielman, P., Berridge, K. C., & Wilbarger, J. L. (2005). Unconscious affective

reactions to masked happy versus angry faces influence consumption behavior

and judgments of value. Personality and Social Psychology Bulletin, 31, 121-135.

Wölwer, W., Frommann, N., Halfmann, S., Piaszek, A., Streit, M., & Gaebel, W. (2005).

Remediation of impairments in facial affect recognition in schizophrenia:

Efficacy and specificity of a new training program. Schizophrenia Research, 80,

295-303.

Zich, J. M., Attkisson, C. C., et al. (1990). Screening for depression in primary care

clinics: The CES-D and the BDI. International Journal of Psychiatry in

Medicine, 20, 259-277.