Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture

Aaron J. Newmana,b,c,d,1, Ted Supallae, Nina Fernandezf, Elissa L. Newporte,1, and Daphne Bavelierf,g

aDepartment of Psychology and Neuroscience, Dalhousie University, Halifax, NS, Canada B3H 4R2; bDepartment of Psychiatry, Dalhousie University, Halifax, NS, Canada B3H 4R2; cDepartment of Surgery, Dalhousie University, Halifax, NS, Canada B3H 4R2; dDepartment of Pediatrics, Division of Neurology, Dalhousie University, Halifax, NS, Canada B3H 4R2; eDepartment of Neurology, Georgetown University, Washington, DC 20007; fDepartment of Brain and Cognitive Sciences, University of Rochester, Rochester, NY 14620; and gFaculty of Psychology and Educational Sciences, University of Geneva, CH-1205 Geneva, Switzerland

Contributed by Elissa L. Newport, July 9, 2015 (sent for review November 13, 2013; reviewed by Barbara Landau and Rachel I. Mayberry)

Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual–manual modality with a nonlinguistic symbolic communicative system—gesture—further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages—supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network—demonstrating an influence of experience on the perception of nonlinguistic stimuli.

brain | American Sign Language | fMRI | deafness | neuroplasticity

Significance

Although sign languages and nonlinguistic gesture use the same modalities, only sign languages have established vocabularies and follow grammatical principles. This is the first study (to our knowledge) to ask how the brain systems engaged by sign language differ from those used for nonlinguistic gesture matched in content, using appropriate visual controls. Signers engaged classic left-lateralized language centers when viewing both sign language and gesture; nonsigners showed activation only in areas attuned to human movement, indicating that sign language experience influences gesture perception. In signers, sign language activated left hemisphere language areas more strongly than gestural sequences. Thus, sign language constructions—even those similar to gesture—engage language-related brain systems and are not processed in the same ways that nonsigners interpret gesture.

Author contributions: A.J.N., T.S., N.F., E.L.N., and D.B. designed research; A.J.N. and N.F. performed research; T.S. contributed new reagents/analytic tools; A.J.N. and D.B. analyzed data; and A.J.N., E.L.N., and D.B. wrote the paper.

Reviewers: B.L., Johns Hopkins University; and R.I.M., University of California, San Diego.

The authors declare no conflict of interest.

1To whom correspondence may be addressed. Email: [email protected] or [email protected].

This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1510527112/-/DCSupplemental.

Sign languages such as American Sign Language (ASL) are natural human languages with linguistic structure. Signed and spoken languages also largely share the same neural substrates, including left hemisphere dominance revealed by brain injury and neuroimaging. At the same time, sign languages provide a unique opportunity to explore the boundaries of what, exactly, "language" is. Speech-accompanying gesture is universal (1), yet such gestures are not language—they do not have a set of structural components or combinatorial rules and cannot be used on their own to reliably convey information. Thus, gesture and sign language are qualitatively different, yet both convey symbolic meaning via the hands. Comparing them can help identify the boundaries between language and nonlinguistic symbolic communication.

Despite this apparently clear distinction between sign language and gesture, some researchers have emphasized their apparent similarities. One construction that has been a focus of contention is "classifier constructions" (also called "verbs of motion"). In ASL, a verb of motion (e.g., moving in a circle) will include a root expressing the motion event, morphemes marking the manner and direction of motion (e.g., forward or backward), and also a classifier that specifies the semantic category (e.g., vehicle) or size and shape (e.g., round, flat) of the object that is moving (2). Although verbs of motion with classifiers occur in some spoken languages, in ASL these constructions are often iconic—the forms of the morphemes are frequently similar to the visual–spatial meanings they express—and they have therefore become a focus of discussion about the degree to which they (and other parts of ASL) are linguistic or gestural in character. Some researchers have argued that the features of motion and spatial relationships marked in ASL verbs of motion are in fact not linguistic morphemes but are based on the analog imagery system that underlies nonlinguistic visual–spatial processing (3–5). In contrast, Supalla (2, 6, 7) and others have argued that these ASL constructions are linguistic in nature, differing from gesture in that they have segmental structure, are produced and perceived in a discrete categorical (rather than analog) manner, and are governed by morphological and syntactic regularities found in other languages of the world.

These similarities and contrasts between sign language and gesture allow us to ask some important questions about the neural systems for language and gesture. The goal of this study was to examine the neural systems underlying the processing of ASL verbs of motion compared with nonlinguistic gesture. This allowed us to ask whether, from the point of view of neural systems, there is

11684–11689 | PNAS | September 15, 2015 | vol. 112 | no. 37 | www.pnas.org/cgi/doi/10.1073/pnas.1510527112

"linguistic structure" in ASL verbs of motion. It also allowed us to distinguish networks involved in "symbolic communication" from those involved specifically in language, and to determine whether "sign language experience" alters systems for gesture comprehension.

It is already established that a very similar, left-lateralized neural network is involved in the processing of many aspects of lexical and syntactic information in both spoken and signed languages. This includes the inferior frontal gyrus (IFG) (classically called Broca's area), the superior temporal sulcus (STS) and adjacent superior and middle temporal gyri, and the inferior parietal lobe (IPL) (classically called Wernicke's area), including the angular (AG) and supramarginal gyri (SMG) (4, 8–18). Likewise, narrative and discourse-level aspects of signed language depend largely on right STS regions, as they do for spoken language (17, 19).

Although the neural networks engaged by signed and spoken language are overall quite similar, some studies have suggested that the linguistic use of space in sign language engages additional brain regions. During both comprehension and production of spatial relationships in sign language, the superior parietal lobule (SPL) is activated bilaterally (4, 5, 12). In contrast, parallel studies in spoken languages have found no (12) or only left (20) parietal activation when people describe spatial relationships. These differences between signed and spoken language led Emmorey et al. (5) to conclude that "the location and movements within [classifier] constructions are not categorical morphemes that are selected and retrieved via left hemisphere language regions" (p. 531). However, in these studies, signers had to move their hands whereas speakers did not; it is unclear whether parietal regions are involved in processing linguistic structure in sign language as opposed to simply using the hands to symbolically represent spatial structure and relationships.

Other studies have touched on the question of symbolic communication, comparing the comprehension of sign language with pantomime and with meaningless, sign-like gestures (11, 21–23). In signers, activation for both sign language and pantomime gestures was reported in classical language-related areas including the IFG, the posterior region of the STS (STSp), and the SMG, although typically these activations are stronger for sign language than gesture. Similar patterns of activation—although often more bilateral—have been observed in nonsigners, for meaningful as well as for meaningless gesture perception (24–27). Thus, on the one hand, the classical left-lateralized "language-processing" network appears to be engaged by both signers and nonsigners for interpreting both sign language and nonlinguistic gesture. On the other hand, sign language experience appears to drive a specialization of these regions for signs over nonsigns in signers, whereas similar levels of activation are seen for gestures and spoken descriptions of human actions in nonsigners (27). In the right hemisphere, when viewing gestures both signers and nonsigners show activation of the homologs of the left hemisphere language regions noted above, including the STS (the STSp associated with biological motion, as well as more anterior areas), inferior and superior parietal regions, and the IFG.

One important caveat in considering this literature is that most studies have compared task-related activation to a simple baseline condition that did not control for the types of movements made or for other low-level stimulus features. Thus, apparent differences between sign language and gesture may in fact be attributable to their distinct physical/perceptual qualities, whereas subtle but important activation differences between sign language and nonlinguistic communication may have been "washed out" by the overall similarity of brain activation when people attempt to find meaning in hand, body, and face movement relative to a static stimulus.

Our goal in the present study was to further investigate the neural processing of ASL verbs of motion and to determine whether we could dissociate the brain systems involved in deriving meaning from "symbolic communication" (including both language and gesture) from those specifically engaged by "linguistically structured content" (sign language), while controlling for the sensory and spatial processing demands of the stimuli. We also asked whether sign language experience influences the brain systems used to understand nonlinguistic gesture. To address these questions, we measured brain activation using functional MRI (fMRI) while two groups of people, deaf native ASL signers and nonsigning, hearing native English speakers, viewed two types of videos: ASL verbs of motion constructions describing the paths and manners of movement of toys [e.g., a toy cow falling off a toy truck as the truck moves forward (2); see Fig. S5 for examples], and gestured descriptions of the same events. Importantly, we also included a "backward-layered" control condition in which the ASL and gesture videos were played backward, with three different videos superimposed (as in refs. 17 and 18).

We predicted that symbolic communication (i.e., both gesture and ASL, in signers and nonsigners) would activate areas typically seen in both sign language and meaningful gesture processing, including the left IFG and the SMG and STSp bilaterally. We further predicted that the left IFG would show stronger activation for linguistically structured content, i.e., in the contrast of ASL versus gesture in signers but not in nonsigners. We also predicted that other areas typically associated with syntactic processing and lexical retrieval, including the anterior and middle left STS, would be more strongly activated by ASL in signers. Such a finding would provide evidence in favor of the argument that verbs of motion constructions are governed by linguistic morphology, and argue against their being gestural rather than linguistic constructions. We further predicted that visual–manual language experience would lead to greater activation of the left IFG in signers than in nonsigners when viewing gesture, although less than for ASL.

Results

Behavioral Performance. Overall, deaf signers showed greater accuracy than hearing nonsigners in judging which picture matched the preceding video, as shown in Fig. S1. Deaf signers were more accurate than nonsigners for both ASL (signers: 1.99% errors; nonsigners: 17.85% errors) and gestures (signers: 2.37% errors; nonsigners: 5.68% errors). A generalized linear mixed model fitted using a binomial distribution, with factors Group (signers, nonsigners) and Stimulus Type (ASL, gesture), identified a Group × Stimulus Type interaction (z = 3.53; P = 0.0004). Post hoc tests showed that signers were significantly more accurate than hearing nonsigners for both ASL (z = 7.61; P < 0.0001) and gestures (z = 2.9; P = 0.004). Furthermore, deaf signers showed similar levels of accuracy on both ASL and gesture stimuli, whereas hearing nonsigners were significantly more accurate in judging gestural than ASL stimuli (z = 6.91; P < 0.0001).

Examination of the reaction times (RTs) shown in Fig. S1 suggested an interaction between Group and Stimulus Type for this measure as well, with signers showing faster responses to ASL (1,543.4 ms; SD = 264.1) than to gestures (1,815.4 ms; SD = 314.3), but nonsigners showing faster responses to gestures (1,701.3 ms; SD = 325.3) than to ASL (1,875.0 ms; SD = 334.4). This observation was borne out by the results of a 2 (Group) × 2 (Stimulus Type) linear mixed-effects analysis, which included a significant Group × Stimulus Type interaction [F(1, 2,795) = 148.08; P < 0.0001]. RTs were significantly faster for signers than nonsigners when judging ASL stimuli (t = 3.36; P = 0.0006); however, the two groups did not differ significantly when judging gesture stimuli (t = 1.15; P = 0.2493). Signers were also significantly faster at making judgments for ASL than gesture stimuli (t = 10.54; P < 0.0001), whereas nonsigners showed the reverse pattern, responding more quickly to gesture stimuli (t = 6.63; P < 0.0001).

fMRI Data. Although we developed backward-layered control stimuli to closely match and control for the low-level properties of our stimuli, we first examined the contrast with fixation to compare our

data with previous studies (most of which used this contrast) and to gain perspective on the entire network of brain regions that responded to ASL and gesture. This is shown in Fig. S2, with details in Tables S1 and S2. Relative to fixation, ASL and gesture activated a common, bilateral network of brain regions in both deaf signers and hearing nonsigners, including occipital, temporal, inferior parietal, and motor regions. Deaf signers showed unique activation in the IFG and the anterior and middle STS bilaterally.

Contrasts with Backward-Layered Control Stimuli. The better-matched contrasts of ASL and gesture with backward-layered control stimuli identified a subset of the brain regions activated in the contrast with fixation, as seen in Fig. 1. Details of activation foci are provided in Tables S3 and S4. Very little of this activation was shared between signers and nonsigners. Notably, all of the activation in the occipital and superior parietal lobes noted in the contrast with fixation baseline was eliminated when we used a control condition that was matched on higher-level visual features. On the lateral surface of the cortex, activations were restricted to areas in and around the IFG and along the STS, extending into the IPL. Medially, activation was found in the ventromedial frontal cortex and fusiform gyrus for both groups, for ASL only. Signers uniquely showed activation in the left IFG and bilateral anterior/middle STS, for both ASL and gesture. Nonsigners uniquely showed activation bilaterally in the STSp for both ASL and gesture (although with a small amount of overlap with signers in the left STSp), as well as in the left inferior frontal sulcus and right IFG for ASL only. The only area showing any extensive overlap between signers and nonsigners was the right STS.

Comparison of ASL and Gesture Within Each Group. No areas were more strongly activated by gesture than by ASL in either group. In signers, the ASL–gesture contrast yielded an exclusively left-lateralized network, including the IFG and middle STS, as well as the fusiform gyrus (Fig. 2, Left, and Table S5). By contrast, the only areas that showed significantly stronger activation for ASL than gesture in hearing people were a small part of the left STSp (distinct from the areas activated in signers) and the posterior cingulate gyrus. Although nonsigners showed significant activation for ASL but not gesture in or around the IFG bilaterally, these activations were not significantly stronger for ASL than for gesture, implicating subthreshold activation for gesture. Between-condition contrasts for the data relative to fixation baseline (without the backward-layered control condition subtracted) are shown in Fig. S3.

Between-Group Comparisons. Signers showed significantly stronger activation for ASL than nonsigners in the anterior/middle STS bilaterally and in the left IFG (Fig. 2, Right, and Table S6). The area of stronger activation in the left IFG did not survive multiple-comparison correction. However, because differences between groups were predicted a priori in this region, we interrogated it using a post hoc region of interest analysis. Left IFG was defined as Brodmann's areas 44 and 45 (28), and within this we thresholded activations at z > 2.3, uncorrected for multiple comparisons. As seen in Fig. 2, the area that showed stronger activation for signers than nonsigners was within the left IFG cluster that, in signers, showed stronger activation for ASL than gesture. Hearing nonsigners showed greater activation than signers only for gesture, and this was restricted to the STSp/SMG of the right hemisphere. Between-group contrasts for the data relative to fixation baseline (without the backward-layered control condition subtracted) are shown in Fig. S4.

Discussion

The central question of this study was whether distinct brain systems are engaged during the perception of sign language, compared with gestures that also use the visual–manual modality and are symbolic communication but lack linguistic structure. Some previous work has suggested that aspects of sign language—such as verbs of motion—are nonlinguistic and are processed like gesture, thus relying on brain areas involved in processing biological motion and other spatial information. This position would predict shared brain systems for understanding verbs of motion constructions and nonlinguistic gestures expressing similar content. In contrast, we hypothesized that verbs of motion constructions are linguistically governed and, as such, would engage language-specific brain systems in signers distinct from those used for processing gesture. We also investigated whether knowing sign language influenced the neural systems recruited for nonlinguistic gesture, by comparing responses to gesture in signers and nonsigners. Finally, we compared signers and nonsigners to determine whether understanding symbolic communication differs when it employs a linguistic code as opposed to when it is created ad hoc. Because there is little symbolic but nonlinguistic communication in the oral–aural channel, such a comparison is best done using sign language and gesture.

Although many neuroimaging studies have contrasted language with control stimuli, to our knowledge no studies have compared linguistic and nonlinguistic stimuli while attempting to match the semantic and symbolic content. Here, ASL and gesture were each used to describe the same action events.
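As context for the voxelwise cutoff of z > 2.3 used for the activation maps and the region of interest analysis above, that threshold corresponds to an uncorrected one-tailed p of about 0.011, which is why cluster-size correction is applied to the surviving clusters in the whole-brain maps. A minimal standard-library sketch of the conversion (illustrative only; not part of the authors' FSL pipeline):

```python
import math

def z_to_p_one_tailed(z):
    """Upper-tail probability P(Z > z) for a standard normal variate."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# The voxelwise threshold used in this study
print(round(z_to_p_one_tailed(2.3), 4))  # -> 0.0107
```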

Fig. 1. Statistical maps for each stimulus type relative to the backward-layered control stimuli, in each subject group. Statistical maps were masked with the maps shown in Fig. S2, so that all contrasts represent brain areas activated relative to fixation baseline. Thresholded at z > 2.3, with a cluster size-corrected P < 0.05. In the coronal and sagittal views, the right side of the brain is shown on the right side of each image.
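For intuition about the Group × Stimulus Type interaction reported in Behavioral Performance, the error rates can be compared on the logit scale, the scale on which a binomial model operates. The sketch below recomputes the difference-of-differences in log-odds from the reported percentages (illustrative only; the published z and P values come from the full trial-level generalized linear mixed model, not from this cell-means arithmetic):

```python
import math

# Error rates reported in Behavioral Performance (proportion incorrect)
err = {
    ("signer", "ASL"): 0.0199,
    ("signer", "gesture"): 0.0237,
    ("nonsigner", "ASL"): 0.1785,
    ("nonsigner", "gesture"): 0.0568,
}

def log_odds(p):
    """Log-odds (logit) of an error probability."""
    return math.log(p / (1 - p))

# Interaction on the logit scale: the gesture-vs-ASL difference for signers,
# minus the same difference for nonsigners.
signer_diff = log_odds(err[("signer", "gesture")]) - log_odds(err[("signer", "ASL")])
nonsigner_diff = log_odds(err[("nonsigner", "gesture")]) - log_odds(err[("nonsigner", "ASL")])
interaction = signer_diff - nonsigner_diff
print(round(interaction, 2))  # -> 1.46, a nonzero difference-of-differences
```

The signer difference is near zero while the nonsigner difference is large and negative, which is the crossover pattern the reported interaction (z = 3.53) reflects.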

Fig. 2. (Left) Between-condition differences for each subject group, for the contrasts with backward-layered control stimuli. No areas showed greater activation for gesture than ASL in either group. Thresholded at z > 2.3, with a cluster size-corrected P < 0.05. (Right) Between-group differences for the contrast of each stimulus type relative to the backward-layered control stimuli. No areas were found that showed stronger activation in signers than nonsigners for gesture, nor in nonsigners than signers for ASL. Thresholded at z > 2.3, with a cluster size-corrected P < 0.05. Significant between-group differences in the left IFG were obtained in a planned region of interest analysis at z > 2.3, uncorrected for cluster size.

In the present study, compared with the low-level fixation baseline, ASL and gesture activated an extensive, bilateral network of brain regions in both deaf signers and hearing nonsigners. This network is consistent with previous studies of both sign language and gesture that used similar contrasts with a low-level baseline (10, 22–27). Critically, however, compared with the backward-layered conditions that controlled for stimulus features (such as biological motion and face perception) and for motor responses, a much more restricted set of brain areas was activated, with considerably less overlap across groups and conditions. Indeed, the only area commonly activated across sign language and gesture in both signers and nonsigners was in the middle/anterior STS region of the right hemisphere. In general, when there were differences between stimulus types, ASL elicited stronger brain activation than gesture in both groups. However, the areas that responded more strongly to ASL were almost entirely different between groups, again supporting the influence of linguistic experience in driving the brain responses to symbolic communication.

Linguistic Structure. Our results show that ASL verbs of motion produce a distinct set of activations in native signers, based in the classic left hemisphere language areas: the IFG and anterior/middle STS. This pattern of activation was significantly different from that found in signers observing gesture sequences expressing approximately the same semantic content, and was wholly different from the bilateral pattern of activation in the STSp found in hearing nonsigners observing either sign language or gestural stimuli. These results thus suggest that ASL verbs of motion are not processed by native signers as nonlinguistic imagery—because nonsigners showed activation primarily in areas associated with general biological motion processing—but rather are processed in terms of their linguistic structure (i.e., as complex morphology), as Supalla (2, 6, 7) has argued. This finding is also consistent with evidence that both grammatical judgment ability and left IFG activation correlated with age of acquisition in congenitally deaf people who learned ASL as a first language (14). Apparently, despite their apparent iconicity, ASL verbs of motion are processed in terms of their discrete combinatorial structure, like complex words in other languages, and depend for this type of processing on the left hemisphere network that underlies spoken as well as other aspects of signed languages.

The other areas more strongly activated by ASL—suggesting linguistic specialization—were in the left temporal lobe. These included the middle STS—an area associated with lexical (lemma) selection and retrieval in studies of spoken languages (29)—and the posterior STS. For signers, this left-lateralized activation was posterior to the STSp region activated bilaterally in nonsigners for both ASL and gesture and typically associated with biological motion processing. The area activated in signers is in line with the characterization of this region as "Wernicke's area" and its association with semantic and phonological processing.

Symbolic Communication. Symbolic communication was a common feature of both the ASL and gesture stimuli. Previous studies had suggested that both gesture and sign language engage a broad, common network of brain regions including classical left hemisphere language areas. Some previous studies used pantomimed actions (10, 21, 27), which are more literal and less abstractly symbolic than some of the gestures used in the present study; other studies used gestures with established meanings [emblems (22, 27, 30)] or meaningless gestures (21, 23, 30). Thus, the stimuli differed from those in the present study in terms of their meaningfulness and degree of abstract symbolism. Our data revealed that, when sign language and gesture stimuli are closely matched for content, and once perceptual contributions are properly accounted for, a much more restricted set of brain regions is commonly engaged across stimulus types and groups—restricted to middle and anterior regions of the right STS. We have consistently observed activation in this anterior/middle right STS area in our previous studies of sign language processing, in both hearing native signers and late learners (16) and in deaf native signers, both for ASL sentences with complex morphology [including spatial morphology (18)] and for narrative and prosodic markers (17). The present findings extend this to nonlinguistic stimuli in nonsigners, suggesting that the right anterior STS is involved in the comprehension of symbolic manual communication regardless of linguistic structure or sign language experience.

Sign Language Experience. Our results indicate that lifelong use of a visual–manual language alters the neural response to nonlinguistic manual gesture. Left frontal and temporal language processing regions showed activation in response to gesture only in signers, once low-level stimulus features were accounted for. Although these same left hemisphere regions were more strongly activated by ASL, their activation by gesture exclusively in signers suggests that sign language experience drives these areas to

Newman et al. PNAS | September 15, 2015 | vol. 112 | no. 37 | 11687 Downloaded by guest on September 24, 2021 attempt to analyze visual–manual symbolic communication even have had their origins in gesture, over generations of use they have when it lacks linguistic structure. An extensive portion of the become regularized and abstracted into segmental, grammatically right anterior/middle STS was also activated exclusively in controlled linguistic units—a phenomenon that has been repeatedly signers. This region showed no specialization for linguistically described in the development and evolution of sign languages structured material, although in previous studies we found sen- (31–33) and the structure and historical emergence of full-fledged sitivity of this area to both morphological and narrative/prosodic adult sign languages (2, 6, 7, 34). Human communication systems structure in ASL (17, 18). It thus seems that knowledge of a seem always to move toward becoming rapid, combinatorial, and – ’ visual manual sign language can lead to this region s becoming highly grammaticized systems (31); the present findings suggest more sensitive to manual movements that have symbolic content, that part of this change may involve recruiting the left hemisphere whether linguistic or not, and that activation in this region in- network as the substrate of rapid, rule-governed computation. creases with the amount of information that needs to be integrated to derive meaning. Materials and Methods It is interesting to note that our findings contrast with some Please see SI Materials and Methods for detailed materials and methods. previous studies that compared ASL and gesture, and found left IFG activation only for ASL (10, 21). This finding was inter- Participants. Nineteen congenitally deaf native learners of ASL (signers) and preted as evidence for a “gating mechanism” whereby signs were 19 normally hearing, native English speakers (nonsigners) participated. 
distinguished from gesture at an early stage of processing in native signers, with only signs passed forward to the IFG. However, those previous studies did not use perceptually matched control conditions (allowing for the possibility of stimulus feature differences) and used pantomimed actions. In contrast, we used sequences of gestures that involved the articulators to symbolically represent objects and their paths. In this sense, our stimuli more closely match the abstract, symbolic nature of language than does pantomime. Thus, there does not appear to be a strict "gate" whereby the presence or absence of phonological or syntactic structure determines whether signers engage the left IFG; rather, signers may engage language-related processing strategies when meaning needs to be derived from abstract, symbolic manual representations.

Conclusions

This study was designed to assess the effects of linguistic structure, symbolic communication, and linguistic experience on brain activation. In particular, we sought to compare how sign languages and nonlinguistic gesture are treated by the brain, in signers and nonsigners. This comparison is of special interest in light of recent claims that some highly spatial aspects of sign languages (e.g., verbs of motion) are organized and processed like gesture rather than like language. Our results indicate that ASL verbs of motion are not processed like spatial imagery or other nonlinguistic materials, but rather are organized and mentally processed like grammatically structured language, in specialized brain areas including the IFG and STS of the left hemisphere.

Although in this study both ASL and gesture conveyed information to both signers and nonsigners, we identified only restricted areas of the right anterior/middle STS that responded similarly to symbolic communication across stimulus types and groups. Overall, our results suggest that sign language experience modifies the neural networks that are engaged when people try to make sense of nonlinguistic, symbolic communication. Nonsigners engaged a bilateral network typically involved in the perception of biological motion. For native signers, on the other hand, rather than sign language being processed like gesture, gesture is processed more like language. Signers recruited primarily language-processing areas, suggesting that lifelong sign language experience leads to imposing a language-like analysis even on gestures that are immediately recognized as nonlinguistic.

Finally, our findings support the analysis of verbs of motion classifier constructions as being linguistically structured (2, 6, 7), in that they specifically engage classical left hemisphere language-processing regions in native signers, but not areas subserving nonlinguistic spatial perception. We suggest that this is because, although the signs for describing spatial information may

Materials and Methods

All provided informed consent. Study procedures were reviewed by the University of Rochester Research Subjects Review Board.

ASL and Gesture Stimuli. Neural activation to sign language compared with gesture was observed by asking participants to watch videotaped ASL and gesture sequences expressing approximately the same content. Both the ASL and gesture video sequences were elicited by the same set of stimuli, depicting events of motion. These stimuli included a set of short videos of toys moving along various paths, relative to other objects, and a set of line drawings depicting people, animals, and/or objects engaged in simple activities. Examples of these are shown in Fig. S5. After viewing each such video, the signer or gesturer was filmed producing, respectively, ASL sentences or gesture sequences describing the video. These short clips were shown to participants in the scanner. The ASL constructions were produced by a native ASL signer; gestured descriptions were produced by three native English speakers who did not know sign language. The native signer and the gesturers were instructed to describe each video or picture without speaking, immediately after viewing it. The elicited ASL and gestures were video recorded, edited, and saved as digital files for use in the fMRI experiment. Backward-layered control stimuli were created by making the signed and gestured movies partly transparent, reversing them in time, and then overlaying three such movies (all of one type, i.e., ASL or gesture) in software and saving these as new videos.

fMRI Procedure. Each participant completed four fMRI scans (runs) of 40 trials each. Each trial consisted of a task cue (instructions to try to determine the meaning of the video, or to watch for symmetry between the two hands), followed by a video (ASL, gesture, or control), followed by a response prompt. For ASL and gesture movies, participants saw two pictures and had to indicate via button press which best matched the preceding video. For control movies, participants made button presses indicating whether, during the movie, three hands had simultaneously had the same handshape. Two runs involved ASL stimuli only, whereas the other two involved gesture stimuli only. Within each run, one-half of the trials were the ASL or gesture videos and the other half were their backward-layered control videos. The ASL movies possessed enough iconicity, and the "foil" pictures were designed to be different enough from the targets, that task performance was reasonably high even for nonsigners viewing ASL. Data were collected using an echo-planar imaging pulse sequence on a 3-T MRI system (echo time, 30 ms; repetition time, 2 s; 90° flip angle; 4-mm isotropic resolution). fMRI data were analyzed using FSL FEAT software according to the recommendations of the software developers (fsl.fmrib.ox.ac.uk/fsl/fslwiki/).

ACKNOWLEDGMENTS. We are grateful to Dara Baril, Patricia Clark, Joshua Diehl, Jason Droll, Matt Hall, Elizabeth Hirshorn, Michael Lawrence, Don Metlay, Raylene Harris Paludneviciene, and Jennifer Vannest for their help on this project, and to Barbara Landau and Rachel Mayberry for their thoughtful comments. This study was supported by a grant from the James S. McDonnell Foundation (to D.B., E.L.N., and T.S.) and by NIH Grants DC00167 (to E.L.N. and T.S.) and DC04418 (to D.B.). A.J.N. was supported by the Natural Sciences and Engineering Research Council (Canada) and the Canada Research Chairs program.
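The backward-layering procedure used to build the control stimuli (partial transparency, time reversal, overlay of three same-type movies) can be sketched in array terms. This is a minimal illustration of the compositing logic only, not the video-editing software actually used; it assumes each movie has been decoded into a NumPy array of frames, and the function name `backward_layered_control` is ours:

```python
import numpy as np

def backward_layered_control(movies, opacity=1/3):
    """Overlay time-reversed, partly transparent copies of several movies.

    movies: list of equally shaped uint8 arrays (frames, height, width,
    channels). Each movie is reversed in time, scaled by `opacity`, and
    the scaled copies are summed into a single composite movie, mirroring
    the paper's description of the backward-layered control videos.
    """
    reversed_movies = [m[::-1].astype(float) for m in movies]  # reverse in time
    blended = opacity * np.sum(reversed_movies, axis=0)        # transparent overlay
    return np.clip(np.round(blended), 0, 255).astype(np.uint8)
```

With three input movies and an opacity of 1/3, the composite is simply the frame-wise average of the time-reversed originals, which roughly preserves their low-level visual properties while removing their meaning.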
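The run and trial structure described under fMRI Procedure (four runs of 40 trials; two ASL-only and two gesture-only runs; within each run, half forward videos and half backward-layered controls) can be sketched as follows. This is an illustrative reconstruction under those stated counts; the function name, seeding, and shuffling scheme are ours, not taken from the study's presentation software:

```python
import random

def build_runs(seed=0):
    """Generate a trial list matching the design described in the paper:
    four runs of 40 trials each; two runs use ASL stimuli and two use
    gesture stimuli; within each run, 20 trials are forward videos and
    20 are their backward-layered controls, in shuffled order."""
    rng = random.Random(seed)
    runs = []
    for stimulus_type in ["ASL", "ASL", "gesture", "gesture"]:
        trials = ([(stimulus_type, "forward")] * 20 +
                  [(stimulus_type, "control")] * 20)
        rng.shuffle(trials)  # randomize trial order within the run
        runs.append(trials)
    return runs
```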

1. McNeill D (1985) So you think gestures are nonverbal? Psychol Rev 92(3):350–371.
2. Supalla T (1982) Structure and acquisition of verbs of motion and location in American Sign Language. PhD thesis (University of California, San Diego).
3. Liddell SK (2003) Grammar, Gesture, and Meaning in American Sign Language (Cambridge Univ Press, Cambridge, UK).
4. Emmorey K, et al. (2002) Neural systems underlying spatial language in American Sign Language. Neuroimage 17(2):812–824.

11688 | www.pnas.org/cgi/doi/10.1073/pnas.1510527112 Newman et al.
5. Emmorey K, McCullough S, Mehta S, Ponto LLB, Grabowski TJ (2013) The biology of linguistic expression impacts neural correlates for spatial language. J Cogn Neurosci 25(4):517–533.
6. Supalla T (1990) Serial verbs of motion in ASL. Theoretical Issues in Sign Language Research, eds Fischer SD, Siple P (The Univ of Chicago Press, Chicago), pp 127–152.
7. Supalla T (2003) Revisiting visual analogy in ASL classifier predicates. Perspectives on Classifier Constructions in Sign Language, ed Emmorey K (Lawrence Erlbaum Associates, Mahwah, NJ), pp 249–257.
8. Bavelier D, et al. (2008) Encoding, rehearsal, and recall in signers and speakers: Shared network but differential engagement. Cereb Cortex 18(10):2263–2274.
9. Campbell R, MacSweeney M, Waters D (2008) Sign language and the brain: A review. J Deaf Stud Deaf Educ 13(1):3–20.
10. Corina D, et al. (2007) Neural correlates of human action observation in hearing and deaf subjects. Brain Res 1152:111–129.
11. Hickok G, Bellugi U, Klima ES (1996) The neurobiology of sign language and its implications for the neural basis of language. Nature 381(6584):699–702.
12. MacSweeney M, et al. (2002) Neural correlates of British sign language comprehension: Spatial processing demands of topographic language. J Cogn Neurosci 14(7):1064–1075.
13. MacSweeney M, Capek CM, Campbell R, Woll B (2008) The signing brain: The neurobiology of sign language. Trends Cogn Sci 12(11):432–440.
14. Mayberry RI, Chen JK, Witcher P, Klein D (2011) Age of acquisition effects on the functional organization of language in the adult brain. Brain Lang 119(1):16–29.
15. Neville HJ, et al. (1998) Cerebral organization for language in deaf and hearing subjects: Biological constraints and effects of experience. Proc Natl Acad Sci USA 95(3):922–929.
16. Newman AJ, Bavelier D, Corina D, Jezzard P, Neville HJ (2002) A critical period for right hemisphere recruitment in American Sign Language processing. Nat Neurosci 5(1):76–80.
17. Newman AJ, Supalla T, Hauser PC, Newport EL, Bavelier D (2010) Prosodic and narrative processing in American Sign Language: An fMRI study. Neuroimage 52(2):669–676.
18. Newman AJ, Supalla T, Hauser P, Newport EL, Bavelier D (2010) Dissociating neural subsystems for grammar by contrasting word order and inflection. Proc Natl Acad Sci USA 107(16):7539–7544.
19. Hickok G, et al. (1999) Discourse deficits following right hemisphere damage in deaf signers. Brain Lang 66(2):233–248.
20. Damasio H, et al. (2001) Neural correlates of naming actions and of naming spatial relations. Neuroimage 13(6 Pt 1):1053–1064.
21. Emmorey K, Xu J, Gannon P, Goldin-Meadow S, Braun A (2010) CNS activation and regional connectivity during pantomime observation: No engagement of the mirror neuron system for deaf signers. Neuroimage 49(1):994–1005.
22. Husain FT, Patkin DJ, Thai-Van H, Braun AR, Horwitz B (2009) Distinguishing the processing of gestures from signs in deaf individuals: An fMRI study. Brain Res 1276:140–150.
23. MacSweeney M, et al. (2004) Dissociating linguistic and nonlinguistic gestural communication in the brain. Neuroimage 22(4):1605–1618.
24. Decety J, et al. (1997) Brain activity during observation of actions. Influence of action content and subject's strategy. Brain 120(Pt 10):1763–1777.
25. Lotze M, et al. (2006) Differential cerebral activation during observation of expressive gestures and motor acts. Neuropsychologia 44(10):1787–1795.
26. Villarreal M, et al. (2008) The neural substrate of gesture recognition. Neuropsychologia 46(9):2371–2382.
27. Xu J, Gannon PJ, Emmorey K, Smith JF, Braun AR (2009) Symbolic gestures and spoken language are processed by a common neural system. Proc Natl Acad Sci USA 106(49):20664–20669.
28. Eickhoff SB, et al. (2007) Assignment of functional activations to probabilistic cytoarchitectonic areas revisited. Neuroimage 36(3):511–521.
29. Indefrey P (2011) The spatial and temporal signatures of word production components: A critical update. Front Psychol 2:255.
30. Husain FT, Patkin DJ, Kim J, Braun AR, Horwitz B (2012) Dissociating neural correlates of meaningful emblems from meaningless gestures in deaf signers and hearing nonsigners. Brain Res 1478:24–35.
31. Newport EL (1981) Constraints on structure: Evidence from American Sign Language and language learning. Minnesota Symposium on Child Psychology, ed Collins WA (Lawrence Erlbaum Associates, Hillsdale, NJ), Vol 14, pp 93–124.
32. Newport EL (1999) Reduced input in the acquisition of signed languages: Contributions to the study of creolization. Language Creation and Language Change: Creolization, Diachrony, and Development, ed DeGraff M (MIT Press, Cambridge, MA), pp 161–178.
33. Senghas A, Kita S, Ozyürek A (2004) Children creating core properties of language: Evidence from an emerging sign language in Nicaragua. Science 305(5691):1779–1782.
34. Supalla T, Clark P (2015) Sign Language Archeology: Understanding the History and Evolution of American Sign Language (Gallaudet Univ Press, Washington, DC).
35. Worsley KJ (2001) Statistical analysis of activation images. Functional Magnetic Resonance Imaging: An Introduction to the Methods, eds Jezzard P, Matthews PM, Smith SM (Oxford Univ Press, New York), pp 251–270.

Newman et al. PNAS | September 15, 2015 | vol. 112 | no. 37 | 11689