Neural Systems Underlying Spatial Language in American Sign Language


NeuroImage 17, 812–824 (2002), doi:10.1006/nimg.2002.1187

Karen Emmorey,*,1 Hanna Damasio,*,† Stephen McCullough,* Thomas Grabowski,† Laura L. B. Ponto,† Richard D. Hichwa,† and Ursula Bellugi*
*Salk Institute for Biological Studies, La Jolla, California 92037; and †University of Iowa, Iowa City, Iowa 52242

Received August 28, 2001

1 To whom reprint requests should be addressed at Laboratory for Cognitive Neuroscience, The Salk Institute for Biological Studies, 10010 North Torrey Pines Rd., La Jolla, CA 92037. E-mail: [email protected].

A [15O]water PET experiment was conducted to investigate the neural regions engaged in processing constructions unique to signed languages: classifier predicates, in which the position of the hands in signing space schematically represents spatial relations among objects. Ten deaf native signers viewed line drawings depicting a spatial relation between two objects (e.g., a cup on a table) and were asked either to produce a classifier construction or an American Sign Language (ASL) preposition that described the spatial relation, or to name the figure object (colored red). Compared to naming objects, describing spatial relationships with classifier constructions engaged the supramarginal gyrus (SMG) within both hemispheres. Compared to naming objects, naming spatial relations with ASL prepositions engaged only the right SMG. Previous research indicates that retrieval of English prepositions engages both right and left SMG, but more inferiorly than for ASL classifier constructions. Compared to ASL prepositions, naming spatial relations with classifier constructions engaged left inferior temporal (IT) cortex, a region activated when naming concrete objects in either ASL or English. Left IT may be engaged because the handshapes in classifier constructions encode information about object type (e.g., flat surface). Overall, the results suggest more right hemisphere involvement when expressing spatial relations in ASL, perhaps because signing space is used to encode the spatial relationship between objects. © 2002 Elsevier Science (USA)

INTRODUCTION

For more than a century we have known that the left hemisphere of the human brain is critical for producing and comprehending spoken language. Damage to perisylvian areas within the left hemisphere produces various types of aphasia, whereas damage to homologous areas within the right hemisphere does not generally produce aphasic symptoms, such as effortful speech, phonological or morphological errors, or difficulty understanding words or sentences. Similarly, research over the past two decades has indicated that the left cerebral hemisphere is also critical to processing signed languages. Damage to perisylvian areas within the left, but not the right, hemisphere leads to sign language aphasias, and the general dichotomy between anterior–posterior lesions and nonfluent–fluent aphasias holds for signed language as well (for reviews see Corina, 1998, and Hickok and Bellugi, 2000).

Although sign aphasia does not result from right-hemisphere damage, evidence indicates that some signers with right hemisphere damage exhibit a specific impairment in the topographic use of signing space (Poizner et al., 1987; Emmorey et al., 1995; Emmorey, 1996). In American Sign Language (ASL), as well as in other signed languages, signing space can function topographically to represent spatial relations among objects. Signing space is the term used for the three-dimensional space in front of the signer, extending from the waist to the forehead, where signs can be articulated. Signers schematize this space to represent physical space, as well as to represent abstract conceptual structure (see Emmorey, 2001). For most locative expressions in ASL, there is a schematic correspondence between the location of the hands in signing space and the position of physical objects in the world. When describing spatial relations in ASL, the identity of each object is first indicated by a lexical sign (e.g., HOUSE, BIKE2). The location of the objects, their orientation, and their spatial relation vis-à-vis one another are indicated by where the appropriate classifier signs are articulated. Figure 1 provides a simple illustration of an ASL locative sentence that could be translated as "The bike is near the house."

2 Words in capital letters represent English glosses for ASL signs. Multiword glosses connected by hyphens are used when more than one English word is required to translate a single sign. English translations are given in quotes.

FIG. 1. An illustration of a simple spatial description in ASL, using classifier constructions. An English translation would be "The bike is near the house." Whole-entity CL refers to the type of classifier handshape morpheme, and +loc refers to the position/movement morpheme (a short downward movement) that means "to be located."

Classifier predicates are complex forms in which the handshape is a morpheme that encodes information about object type (Supalla, 1986; see papers in Emmorey, in press, for an in-depth discussion of classifier constructions in signed languages). For example, in Fig. 1 the hooked 5 handshape (fingers spread and curved) specifies a large bulky object (such as a house or box), and the 3 handshape (thumb, middle, and index fingers extended) refers to vehicles (such as a bicycle, car, or ship). The first and third signs of Fig. 1 are nouns that refer to the ground and figure objects, respectively (the sign for BIKE is made with the right hand, while the left hand holds the classifier sign referring to the house). This ordering of figure and ground may be an effect of the visual–spatial modality of sign languages (Emmorey, 1996). For example, to present a scene visually by drawing a picture, the ground object tends to be drawn first, and then the figure is located with respect to the ground. Thus, if drawing a picture of a bike next to a house, most people draw the house first. Crucially, the spatial relationship expressed by the classifier construction in Fig. 1 is not encoded by a separate word, as it would be in English with the preposition near. Although ASL has prepositions such as NEAR, ON, or IN (see Fig. 2C), signers prefer to use classifier constructions when describing spatial relationships. Rather than encoding spatial information with prepositions, such information is conveyed by a schematic and isomorphic mapping between where the hands are placed in signing space and the locations of the objects being described.

It is this topographic use of signing space that may be impaired with right hemisphere damage (RHD). For example, Poizner et al. (1987) report a RHD signer (BI) who, when asked to describe her room, displaced all of the objects to the right in signing space and did not respect spatial relations, haphazardly piling the furniture in one place. This signer also exhibited neglect on nonlinguistic drawing tasks. However, in other nonspatial contexts, BI was reported to use spatial locations on the left half of space for pronominal reference and verb agreement. Similarly, Corina et al. (1996) reported that the RHD signer JH exhibited neglect, but he produced signs using the left half of his body (e.g., LAZY is signed by tapping the left shoulder with an L handshape), and he also directed pronouns and verbs toward the left half of signing space in spontaneous signing. Interestingly, Corina et al. (1996) reported difficulty eliciting spatial descriptions from JH, who tended to "avoid using topographic space and simply list[ed] the contents of his room" (p. 338). It appears that left hemisphere control for linguistic production generally compensates for left-side attention deficits unless the spatial complexity of the discourse requires the use of several locations representing left–right spatial distinctions.

Emmorey et al. (1995) described another RHD signer (DN) who was impaired in retelling ASL descriptions that involved spatial layouts. Her impairment was not in remembering the objects in the spatial descriptions, but in the correct placement of classifier signs within signing space to indicate the spatial relations among the objects. This patient was not aphasic for ASL—her descriptions of spatial layouts were fluent and grammatical, but the location and orientation of the objects were described incorrectly.

Further evidence that the right hemisphere is involved in the comprehension of the topographic functions of signing space, particularly within classifier constructions, comes from two other studies. The first further examined the RHD signer DN, who was hearing and bilingual for ASL and English. When DN was asked to set up real objects in accordance with spatial descriptions given in either English or ASL, she performed well in English but poorly when the same description was given in ASL (Corina et al., 1990; Emmorey, 1996). For the English descriptions, the spatial relations were encoded by a preposition (e.g., "The pen is on the paper"), but in ASL the spatial relations had to be recovered from the spatial positioning of the classifier signs within signing space (e.g., the classifier handshape for pen (a 1 handshape: fist with index finger extended) was placed on top of the classifier handshape representing a piece of paper (a B handshape: fingers together, palm down)).

FIG. 2. (A) Example stimuli with a flat-surface ground object and with a cylindrical ground object. (B) Example classifier construction responses for the example stimuli. (C) Example preposition responses for the example stimuli.

…tions of ASL prepositions are given in Fig. 2C. The signers were asked to pick the picture that best matched a preposition (e.g., IN) or a classifier construction depicting a similar spatial relation (e.g., the clas…
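The introduction describes the structure of an ASL locative classifier construction: each object is named by a lexical sign, a classifier handshape morpheme encodes the object's type, and the spatial relation is read off the placement of the two classifier handshapes in signing space rather than from a preposition. As a purely illustrative aid (not part of the paper), the following Python sketch models that structure as a small data type; the class names, the coordinate scheme, and the ordering convention are simplifications assumed for the example, while the handshape-to-object-type pairings follow the examples given in the text.

```python
# Illustrative sketch only: a toy model of an ASL locative classifier construction.
# All identifiers and coordinate values are hypothetical; only the handshape/object-type
# pairings (hooked-5 = bulky object, 3 = vehicle, B = flat surface, 1 = thin object)
# come from the examples in the text.

from dataclasses import dataclass
from typing import List, Tuple

# Classifier handshape morphemes keyed by semantic object class.
CLASSIFIER_HANDSHAPES = {
    "bulky_object": "hooked-5",   # e.g., HOUSE, BOX
    "vehicle": "3",               # e.g., BIKE, CAR, SHIP
    "flat_surface": "B",          # e.g., PAPER
    "thin_object": "1",           # e.g., PEN
}

@dataclass
class ClassifierSign:
    gloss: str                     # lexical sign naming the object (English gloss)
    object_class: str              # key into CLASSIFIER_HANDSHAPES
    location: Tuple[float, float]  # schematic (x, y) placement in signing space

def locative_description(ground: ClassifierSign, figure: ClassifierSign) -> List[tuple]:
    """Order the construction as described in the text: the ground is named and its
    classifier placed first, then the figure's classifier is placed relative to it,
    so the spatial relation is conveyed by hand placement, not by a preposition."""
    return [
        (ground.gloss, CLASSIFIER_HANDSHAPES[ground.object_class], ground.location),
        (figure.gloss, CLASSIFIER_HANDSHAPES[figure.object_class], figure.location),
    ]

# Toy example mirroring Fig. 1: "The bike is near the house."
house = ClassifierSign("HOUSE", "bulky_object", (0.0, 0.0))
bike = ClassifierSign("BIKE", "vehicle", (0.2, 0.0))  # placed near the house's location
print(locative_description(house, bike))
```

In this toy encoding, the nearness of the bike to the house is recoverable only by comparing the two location values, which parallels the paper's point that the spatial relation is expressed by where the hands are placed rather than by a separate lexical item.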