A Neurolinguistic Model of Grammatical Construction Processing

Total pages: 16

File type: PDF, size: 1020 KB

Peter Ford Dominey (CNRS UMR 5015, France), Michel Hoen (CNRS UMR 5015, France), and Toshio Inui (Kyoto University, Japan)

Journal of Cognitive Neuroscience 18:12, pp. 2088–2107. © 2006 Massachusetts Institute of Technology.

Abstract

One of the functions of everyday human language is to communicate meaning. Thus, when one hears or reads the sentence, "John gave a book to Mary," some aspect of an event concerning the transfer of possession of a book from John to Mary is (hopefully) transmitted. One theoretical approach to language referred to as construction grammar emphasizes this link between sentence structure and meaning in the form of grammatical constructions. The objective of the current research is to (1) outline a functional description of grammatical construction processing based on principles of psycholinguistics, (2) develop a model of how these functions can be implemented in human neurophysiology, and then (3) demonstrate the feasibility of the resulting model in processing languages of typologically diverse natures, that is, English, French, and Japanese. In this context, particular interest will be directed toward the processing of novel compositional structure of relative phrases. The simulation results are discussed in the context of recent neurophysiological studies of language processing.

INTRODUCTION

One of the long-term quests of cognitive neuroscience has been to link functional aspects of language processing to its underlying neurophysiology, that is, to understand how neural mechanisms allow the mapping of the surface structure of a sentence onto a conceptual representation of its meaning. The successful pursuit of this objective will likely prove to be a multidisciplinary endeavor that requires cooperation between theoretical, developmental, and neurological approaches to the study of language, as well as contributions from computational modeling that can eventually validate proposed hypotheses. The current research takes this multidisciplinary approach within the theoretical framework associated with grammatical constructions. The essential distinction within this context is that language is considered to consist of a structured inventory of mappings between the surface forms of utterances and meanings, referred to as grammatical constructions (see Goldberg, 1995). These mappings vary along a continuum of complexity. At one end are single words and fixed "holophrases" such as "Gimme that" that are processed as unparsed "holistic" items (see Tomasello, 2003). At the other extreme are complex abstract argument constructions that allow the use of sentences like this one. In between are the workhorses of everyday language, abstract argument constructions that allow the expression of spatiotemporal events that are basic to human experience, including active transitive (e.g., John took the car) and ditransitive (e.g., Mary gave my mom a new recipe) constructions (Goldberg, 1995).

In this context, the "usage-based" perspective holds that the infant begins language acquisition by learning very simple constructions in a progressive development of processing complexity, with a substantial amount of early ground that can be covered with relatively modest computational resources (Clark, 2003; Tomasello, 2003). This is in contrast with the "continuity hypothesis" issued from the generative grammar philosophy, which holds that the full range of syntactic complexity is available in the form of a universal grammar and is used at the outset of language learning (see Tomasello, 2000, and comments on the continuity hypothesis debate). In this generative context, the universal grammar is there from the outset, and the challenge is to understand how it got there. In the usage-based construction approach, initial processing is simple and becomes increasingly complex, and the challenge is to explain the mechanisms that allow full productivity and compositionality (composing new constructions from existing ones). This issue is partially addressed in the current research.

Theoretical Framework for Grammatical Constructions

If grammatical constructions are mappings from sentence structure to meaning, then the system must be capable of (1) identifying the type of grammatical construction for a given sentence and (2) using the identified construction and its corresponding mapping to extract the meaning of the sentence. Interestingly, this corresponds to Townsend and Bever's (2001) two steps of syntactic processing, that is, (i) parsing of the sentence into phrasal constituents and words (requiring access to lexical categories and word order analysis) and (ii) subsequent analysis of phrasal structure and long-distance dependencies by means of syntactic rules.

Figure 1A illustrates an example of an active transitive sentence and its mapping onto a representation of meaning. The generalized representation of the corresponding active transitive construction is depicted in Figure 1B. The "slots" depicted by the NPs and V can be instantiated by different nouns and verbs in order to generate an open set of new sentences. For each sentence corresponding to this construction type, the mapping of sentence to meaning is provided by the construction. Figure 1C illustrates a more complex construction that contains an embedded relative phrase. In this context, a central issue in construction grammar will concern how the potential diversity of constructions is identified.

[Figure 1. Grammatical construction overview. (A) Specific example of a sentence-to-meaning mapping. (B) Generalized representation ...]

Part of the response to this lies in the specific ways in which function and content words are structurally organized in distinct sentence types. Function words (also referred to as closed class words because of their limited number in any given language), including determiners, prepositions, and auxiliary verbs, play a role in defining the grammatical structure of a sentence (i.e., in specifying who did what to whom), although they carry little semantic content. Content words (also referred to as open class words because of their essentially unlimited number) play a more central role in contributing pieces of meaning that are inserted into the grammatical structure of the sentence. Thus, returning to our examples in Figure 1, the thematic roles for the content words are determined by their relative position in the sentences with respect to the other content words and with respect to the function words. Although this is the case in English, Bates, McNew, MacWhinney, Devescovi, and Smith (1982) and MacWhinney (1982) have made the case more generally, stating that across human languages, the grammatical structure of sentences is specified by a combination of cues including word order, grammatical function words (and/or grammatical markers attached to the word roots), and prosodic structure.

The general idea here is that these constructions are templates into which a variety of open class elements (nouns, verbs, etc.) can be inserted in order to express novel meanings. Part of the definition of a construction is the mapping between slots or variables in the template and the corresponding semantic roles in the meaning, as illustrated in Figure 1. In this context, a substantial part of the language faculty corresponds to a structured set of such sentence-to-meaning mappings, and these mappings are stored and retrieved based on the patterns of structural markers (i.e., word order and function word patterns) unique to each grammatical construction type. Several major issues can be raised with respect to this characterization. The issues that we address in the current research are as follows: (1) Can this theoretical characterization be mapped onto human functional neuroanatomy in a meaningful manner and (2) can the resulting system be demonstrated to account for a restricted subset of human language phenomena in a meaningful manner?

Implementing Grammatical Constructions in a Neurocomputational Model

Interestingly, this characterization of grammatical constructions can be reformulated into a type of sequence learning problem, if we consider a sentence as a sequence of words. Determining the construction type for a given sentence consists in analyzing or recoding the sentence as a sequence of open class and closed class elements, and then performing sequence recognition on this recoded sequence. Dominey (1995) and Dominey, Arbib, and Joseph (1995) have demonstrated how such sequence recognition can be performed by a recurrent prefrontal cortical network (a "temporal recurrent network" [TRN]) that encodes sequential structure (see also Dominey, 1998a, 1998b). Then corticostriatal connections allow the association of different categories of sequences represented in the recurrent network with different behavioral responses.

Once the construction type has thus been identified, the corresponding mapping of open class elements onto their semantic roles must be retrieved and performed. This mapping corresponds to what we have called "abstract structure" ...
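As a concrete illustration of the two steps just described (identify the construction from its word-order and function-word pattern, then map the open-class elements onto semantic roles), here is a minimal sketch in Python. It is a hypothetical mock-up, not the authors' model: the function-word list, the construction inventory, and all names are illustrative, and a dictionary lookup stands in for the sequence recognition that the model attributes to the temporal recurrent network.

```python
# Minimal sketch (not the authors' neural implementation): identify a
# construction from its closed-class/word-order pattern, then map the
# open-class words onto semantic roles. All names here are illustrative.

# A toy closed-class vocabulary; everything else is treated as open class.
FUNCTION_WORDS = {"the", "a", "to", "was", "by", "that"}

# Construction inventory: the recoded pattern is the retrieval key, and the
# stored mapping sends open-class slot positions to semantic roles.
CONSTRUCTION_INVENTORY = {
    # Active transitive: "John took the car"
    ("OC", "OC", "the", "OC"): {"agent": 0, "action": 1, "object": 2},
    # Transfer/dative: "John gave a book to Mary"
    ("OC", "OC", "a", "OC", "to", "OC"): {"agent": 0, "action": 1,
                                          "object": 2, "recipient": 3},
}

def recode(sentence):
    """Recode a sentence as closed-class words plus open-class placeholders."""
    words = sentence.lower().replace(".", "").split()
    pattern = tuple(w if w in FUNCTION_WORDS else "OC" for w in words)
    open_class = [w for w in words if w not in FUNCTION_WORDS]
    return pattern, open_class

def comprehend(sentence):
    """Return a predicate-argument meaning for a recognized construction."""
    pattern, open_class = recode(sentence)
    mapping = CONSTRUCTION_INVENTORY.get(pattern)
    if mapping is None:
        raise ValueError(f"No stored construction matches pattern {pattern}")
    return {role: open_class[i] for role, i in mapping.items()}

if __name__ == "__main__":
    print(comprehend("John gave a book to Mary"))
    # {'agent': 'john', 'action': 'gave', 'object': 'book', 'recipient': 'mary'}
    print(comprehend("John took the car"))
    # {'agent': 'john', 'action': 'took', 'object': 'car'}
```

In the neurocomputational model itself, the recoded open/closed-class sequence is recognized by the recurrent prefrontal network and its corticostriatal associations rather than by exact lookup, which is what lets any new sentence sharing a construction's structural pattern retrieve the same sentence-to-meaning mapping.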
Recommended publications
  • Regular and Irregular Aspects of Grammatical Constructions (Konrad Szcześniak, Palacký University, Olomouc, Czech Republic, and University of Silesia, Sosnowiec, Poland)
    This study focuses on two grammatical forms, the Incredulity Response Construction and the je vidět / vidieť construction. By analyzing their form and usage, I intend to weigh in on the debate concerning the importance of regularity and unpredictability inherent in grammatical constructions. The debate, initiated by Chomsky's dismissal of idiosyncratic forms as peripheral, has gained momentum especially in recent years, with cognitive linguists uncovering more unpredictable aspects of language forms and claiming that language is primarily idiosyncratic, not regular. As a result, current cognitive linguistic descriptions of constructions highlight what is unpredictable and all but dismiss the regular aspects of constructions. The present study argues that the two sides (the predictable and the form-specific) coexist and thus have obvious implications for models of mental representations of language forms. By extrapolation, a recognition of a balance of the regular and irregular properties is relevant to our theorizing on how speakers learn and use constructions: just as the presence of idiosyncratic properties makes it necessary to memorize special cases of forms, the presence of regular properties can be assumed to ease the learning burden. An accurate description of any construction should attempt to detail both those aspects that can be predicted by rule and those that are endemic to a given construction and must be memorized. Keywords: grammatical construction, idiosyncrasy, regularity, IRC, je vidět / vidieť construction. 1. Introduction: The notions of irregularity, idiosyncrasy and unpredictability have received widely different treatments in major theoretical frameworks.
  • The ERP Technique in Children's Sentence Processing: A Review
    Marília Uchôa Cavalcanti Lott de Moraes Costa (Universidade Federal do Rio de Janeiro, Brazil). Revista de Estudos da Linguagem, Belo Horizonte, v. 25, n. 3, p. 1537–1566, 2017. DOI: 10.17851/2237-2083.25.3.1537-1566. Abstract: The measurement of brain activation through the technique of event-related potentials (ERP) has been valuable in shedding light on various human cognitions. Language is one of those cognitions that has been studied using this technique, which allows for a more accurate temporal resolution between the stimulus presented and the time in which we observe an activation resulting from this stimulus. The area of language acquisition has especially benefited from this technique, since it is possible to investigate relationships between linguistic data and brain activation without the need for an explicit response, such as pressing a button or pointing to an image. The aim of this article is to present the state of the art on sentence processing in children using the ERP technique. Keywords: ERP/EEG; sentence processing; language acquisition; N400; P600.
  • An Introduction to Cognitive Grammar
    Ronald W. Langacker (University of California, San Diego). Cognitive Science 10, 1–40 (1986). Cognitive grammar takes a nonstandard view of linguistic semantics and grammatical structure. Meaning is equated with conceptualization. Semantic structures are characterized relative to cognitive domains, and derive their value by construing the content of these domains in a specific fashion. Grammar is not a distinct level of linguistic representation, but reduces instead to the structuring and symbolization of conceptual content. All grammatical units are symbolic: Basic categories (e.g., noun and verb) are held to be notionally definable, and grammatical rules are analyzed as symbolic units that are both complex and schematic. These concepts permit a revealing account of grammatical composition with notable descriptive advantages. Despite the diversity of contemporary linguistic theory, certain fundamental views enjoy a rough consensus and are widely accepted without serious question. Points of general agreement include the following: (a) language is a self-contained system amenable to algorithmic characterization, with sufficient autonomy to be studied in essential isolation from broader cognitive concerns; (b) grammar (syntax in particular) is an independent aspect of linguistic structure distinct from both lexicon and semantics; and (c) if meaning falls within the purview of linguistic analysis, it is properly described by some type of formal logic based on truth conditions. Individual theorists would doubtlessly qualify their assent in various ways, but (a)-(c) certainly come much closer than their denials to representing majority opinion. What follows is a minority report. Since 1976, I have been developing a linguistic theory that departs quite radically from the assumptions of the currently predominant paradigm.
  • The Potentials for Basic Sentence Processing: Differentiating Integrative Processes
    Marta Kutas and Jonathan W. King. Abstract: We show that analyzing voltage fluctuations known as "event-related brain potentials," or ERPs, recorded from the human scalp can be an effective way of tracking integrative processes in language on-line. This is essential if we are to choose among alternative psychological accounts of language comprehension. We briefly review the data implicating the N400 as an index of semantic integration and describe its use in psycholinguistic research. We then introduce a cognitive neuroscience approach to normal sentence processing, which capitalizes on the ERP's fine temporal resolution as well as its potential linkage to both psychological constructs and activated brain areas. We conclude by describing several reliable ERP effects with different temporal courses, spatial extents, and hypothesized relations to comprehension skill during the reading of simple transitive sentences; these include (1) occipital potentials related to fairly low-level, early visual processing, (2) very slow frontal positive shifts related to high-level integration during construction of a mental model, and (3) various frontotemporal potentials associated with thematic role assignment, clause endings, and manipulating items that are in working memories. In it comes out it goes and in between nobody knows how flashes of vision and snippets of sound get bound to meaning. From percepts to concepts seemingly effortless integration of highly segregated
  • Psych150/ Ling155 Spring 2015 Review Questions: Speech Perception, Word Recognition, Sentence Processing, and Speech Production
    Terms/concepts to know: mondegreen, lack of invariance problem, Motor Theory of Speech Perception, McGurk effect, Ganong effect, priming, lexical decision, facilitation, inhibition, neighborhood density, homophone, homograph, Cohort model, TRACE model, Dual Route model, graphemes, developmental dyslexia, acquired dyslexia, parsing, incrementality, garden path sentence, global ambiguity, garden path theory, constraint-based approach, thematic relation, intransitive, transitive, ditransitive, sentential complement, slip of the tongue, tip of the tongue state, serial model, cascaded model, bidirectional cascaded model.
    1. The lack of invariance problem led to the development of the Motor Theory of Speech Perception (MToSP). What is the lack of invariance problem? How did the MToSP propose to solve that problem?
    2. How does the McGurk effect differ from the Ganong effect?
    3. Can priming occur without conscious awareness of the prime? What paradigm do researchers use to assess this possibility?
    4. How does a lexical decision task work? Can lexical decision be used for both visual (written) and auditory (spoken) experiments? What is the dependent measure in lexical decision tasks?
    5. Priming can cause facilitation or inhibition. How would these processes be reflected in reaction times?
    6. How do homophones and homographs differ?
    7. According to the Cohort model of auditory word recognition, what happens as the beginning of a word is detected?
    8. What does the TRACE model predict about the possibility of "speaker" and "beaker" being activated at the same time during word recognition? Why?
    9. Writing systems differ in terms of the primary level of linguistic representations that they tap into, i.e., that symbols refer to (phonemes, syllables, morphemes).
  • Cortical Tracking of Speech: Toward Collaboration Between the Fields of Signal and Sentence Processing
    Eleonora J. Beier, Suphasiree Chantavarin, Gwendolyn Rehrig, Fernanda Ferreira, and Lee M. Miller. Abstract: In recent years, a growing number of studies have used cortical tracking methods to investigate auditory language processing. Although most studies that employ cortical tracking stem from the field of auditory signal processing, this approach should also be of interest to psycholinguistics, particularly the subfield of sentence processing, given its potential to provide insight into dynamic language comprehension processes. However, there has been limited collaboration between these fields, which we suggest is partly because of differences in theoretical background and methodological constraints, some mutually exclusive. In this paper, we first review the theories and methodological constraints that have historically been prioritized in each field and provide concrete examples of how some of these constraints may be reconciled. [...] between the two fields could be mutually beneficial. Specifically, we argue that the use of cortical tracking methods may help resolve long-standing debates in the field of sentence processing that commonly used behavioral and neural measures (e.g., ERPs) have failed to adjudicate. Similarly, signal processing researchers who use cortical tracking may be able to reduce noise in the neural data and broaden the impact of their results by controlling for linguistic features of their stimuli and by using simple comprehension tasks. Overall, we argue that a balance between the methodological constraints of the two fields will lead to an overall improved understanding of language processing as well as greater clarity on what mechanisms cortical tracking of speech reflects. Increased collaboration will help resolve debates in both fields and will lead ...
  • Introducing Sign-Based Construction Grammar
    Ivan A. Sag, Hans C. Boas, and Paul Kay. 1 Background. Modern grammatical research, at least in the realms of morphosyntax, includes a number of largely nonoverlapping communities that have surprisingly little to do with one another. One, the Universal Grammar (UG) camp, is mainly concerned with a particular view of human languages as instantiations of a single grammar that is fixed in its general shape. UG researchers put forth highly abstract hypotheses making use of a complex system of representations, operations, and constraints that are offered as a theory of the rich biological capacity that humans have for language. This community eschews writing explicit grammars of individual languages in favor of offering conjectures about the 'parameters of variation' that modulate the general grammatical scheme. These proposals are motivated by small data sets from a variety of languages. A second community, which we will refer to as the Typological (TYP) camp, is concerned with descriptive observations of individual languages, with particular concern for idiosyncrasies and complexities. Many TYP researchers eschew formal models (or leave their development to others), while others in this community refer to the theory they embrace as 'Construction Grammar' (CxG). Footnotes: (1) For comments and valuable discussions, we are grateful to Bill Croft, Chuck Fillmore, Adele Goldberg, Stefan Müller, and Steve Wechsler. We also thank the people mentioned in footnote 8 below. (2) The nature of these representations has changed considerably over the years. Seminal works include Chomsky 1965, 1973, 1977, 1981, and 1995.
  • Cognitive Grammar with Reference to Passive Construction in Arabic
    Dr. Bahaa-eddin Hassan (Faculty of Arts, Sohag University, Egypt). International Journal of Linguistics, Literature and Culture (LLC), March 2017, Vol. 4, No. 1, ISSN 2410-6577. Abstract: The purpose of this study is to examine Cognitive Grammar (CG) theory with reference to the active and the passive voice construction in Arabic. It shows how the cognitive approach to linguistics and construction grammar work together to explain motivation beyond the use of this syntactic construction; therefore, formal structures of language are combined with the cognitive dimension. CG characterizations reveal the limitations of the view that grammatical constructions are autonomous categories. They are interrelated with conceptualization in the framework of cognitive grammar. The study also accounts for the verbs which have active form and passive meaning, and the verbs which have passive form and active meaning. Keywords: Cognitive Grammar, Voice, Arabic, Theme-oriented. 1. Introduction: In his Aspects of the Theory of Syntax, Chomsky represented each sentence in a language as having a surface structure and a deep structure. The deep structure represents the semantic component and the surface structure represents the phonological approach. He explained that languages' deep structures share universal properties which surface structures do not reveal. He proposed transformational grammar to map the deep structure (semantic relations of a sentence) onto the surface structure. Individual languages use different grammatical patterns for their particular set of expressed meanings. Chomsky, then, in Language and Mind, proposed that language relates to mind. Later, in an interview, he emphasized that language represents a state of mind.
  • EEG Theta and Gamma Responses to Semantic Violations in Online Sentence Processing
    Lea A. Hald, Marcel C. M. Bastiaansen, and Peter Hagoort (Max Planck Institute for Psycholinguistics, Nijmegen; F. C. Donders Centre for Cognitive Neuroimaging, Nijmegen; Center for Research in Language, University of California, San Diego). Brain and Language 96 (2006) 90–105. Abstract: We explore the nature of the oscillatory dynamics in the EEG of subjects reading sentences that contain a semantic violation. More specifically, we examine whether increases in theta (∼3–7 Hz) and gamma (around 40 Hz) band power occur in response to sentences that were either semantically correct or contained a semantically incongruent word (semantic violation). ERP results indicated a classical N400 effect. A wavelet-based time-frequency analysis revealed a theta band power increase during an interval of 300–800 ms after critical word onset, at temporal electrodes bilaterally for both sentence conditions, and over midfrontal areas for the semantic violations only. In the gamma frequency band, a predominantly frontal power increase was observed during the processing of correct sentences. This effect was absent following semantic violations. These results provide a characterization of the oscillatory brain dynamics, and notably of both theta and gamma oscillations, that occur during language comprehension. (A short numerical sketch of this kind of band-power analysis appears after this list.)
  • Introduction to Psycholinguistics, Lecture 3: Sentence Processing
    Matthew W. Crocker (Computerlinguistik, Universität des Saarlandes). What makes up psycholinguistics? Computational models of the representations, architectures and mechanisms that underlie human language processing. Linguistics: how people represent linguistic knowledge. Psycholinguistics: how people use this knowledge to produce and understand language. Computational models: implementations of psycholinguistic theory, using linguistic representations; a more complete theory models and predicts human behaviour. Experiments and data: descriptive studies of human language processing, both off-line (grammaticality judgement, completions, global reading time) and on-line ("word-by-word", self-paced reading, eye-tracking, ERP), which test the predictions of theories and computer models; corpus data supply word frequency, category and sense bias, and subcategorization information. Sentence processing is the means by which the words of an utterance are combined to yield an interpretation. All people do it well, yet it is a difficult task, involving complexity and ambiguity; it is not simple 'retrieval', like lexical access, but compositional: an interpretation must be built, rapidly, even for novel word/structure input. What are the architectures, mechanisms and representations underlying this process? Architectures: modularity vs. interaction. Mechanisms: how input is mapped to interpretations using knowledge. Representations: how knowledge is encoded. A simple theory of grammar consists of a small set of phrase-structure rules (S -> NP VP; NP -> PN; NP -> Det N; NP -> NP PP; PP -> P NP; VP -> V; VP -> V NP; VP -> V NP PP) and a lexicon (Det = {the, a, every}; N = {man, woman, book, hill, telescope}; PN = {John, Mary}; P = {on, with}; V = {saw, put, open, read, reads}), with which a sentence such as "the man read every book" can be generated and parsed (the slide's parse tree is not reproduced here); a runnable sketch of this toy grammar appears after this list.
  • Recovery from Misinterpretations During Online Sentence Processing
    Lena M. Blott, Jennifer M. Rodd, Fernanda Ferreira, and Jane E. Warren (Division of Psychology and Language Sciences, University College London, UK; Department of Psychology, University of California, Davis, CA, USA). Abstract: Misinterpretations during language comprehension are common. The ability to recover from such processing difficulties is therefore crucial for successful day-to-day communication. The present study investigated the outcome of comprehension processes and on-line reading behaviour when misinterpretations occurred. Although group-level effects of reinterpretation on sentence comprehension and on-line processing are of great theoretical interest, individual differences in the recovery from processing difficulty are of particular practical relevance. Even adult readers vary considerably in their "lexical expertise", their knowledge of word forms and meanings and their experience with written material. We therefore also investigated the effect of individual differences in lexical expertise on processes related to the recovery from misinterpretations. Ninety-six adult participants read "garden-path" sentences in which an ambiguous word was disambiguated towards an unexpected meaning (e.g. The ball was crowded), while their eye movements were monitored. A Meaning Coherence Judgement task additionally required them to decide whether or not each sentence made sense. Results suggested that readers did not always engage in reinterpretation processes but instead followed a "good enough" processing strategy. Successful detection of a violation to sentence coherence and associated reinterpretation processes also required additional processing time compared to sentences that did not induce a misinterpretation.
  • Generalizing Beyond the Input: The Functions of the Constructions Matter
    Florent Perek and Adele E. Goldberg (Princeton University, Department of Psychology, Peretsman-Scully Hall, Princeton, NJ 08540, United States). Journal of Memory and Language 84 (2015) 108–127. Abstract: A growing emphasis on statistics in language learning raises the question of whether and when speakers use language in ways that go beyond the statistical regularities in the input. In this study, two groups were exposed to six novel verbs and two novel word order constructions that differed in function: one construction but not the other was exclusively used with pronoun undergoers. The distributional structure of the input was manipulated between groups according to whether each verb was used exclusively in one or the other construction (the lexicalist condition), or whether a minority of verbs was witnessed in both constructions (the alternating condition). Production and judgments results demonstrate that participants tended to generalize the constructions for use in appropriate discourse contexts, ignoring evidence of verb-specific behavior, especially in the alternating condition. Our results suggest that construction learning ... Keywords: Language acquisition; Artificial language learning; Novel construction learning; Statistical learning; Argument structure; Generalization.
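As a numerical illustration of the band-power measure described in the EEG theta and gamma abstract above, the following Python sketch convolves a signal with complex Morlet wavelets and averages theta-band (3–7 Hz) power over a 300–800 ms post-onset window. It is a sketch under my own assumptions (the sampling rate, epoch length, wavelet width, and synthetic data are all invented), not the authors' analysis pipeline.

```python
import numpy as np

def morlet_power(signal, sfreq, freqs, n_cycles=7):
    """Time-frequency power via convolution with complex Morlet wavelets."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)               # temporal width (s)
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / sfreq)
        wavelet = np.exp(2j * np.pi * f * t - t**2 / (2 * sigma_t**2))
        wavelet /= np.linalg.norm(wavelet)                 # unit energy
        power[i] = np.abs(np.convolve(signal, wavelet, mode="same")) ** 2
    return power

if __name__ == "__main__":
    sfreq = 500.0                                  # assumed sampling rate (Hz)
    # Epoch long enough that the lowest-frequency wavelet fits inside it.
    times = np.arange(-1.0, 2.0, 1 / sfreq)        # seconds around word onset
    rng = np.random.default_rng(0)
    epoch = rng.normal(size=times.size)            # synthetic "EEG" channel
    theta = morlet_power(epoch, sfreq, freqs=np.arange(3, 8))
    window = (times >= 0.3) & (times <= 0.8)       # 300-800 ms after onset
    print("mean theta power, 300-800 ms:", theta[:, window].mean())
```

With real recordings, the same power estimate would be computed per trial and electrode and then compared between the semantically correct and semantic-violation conditions.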
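The toy grammar and lexicon quoted in the "Introduction to Psycholinguistics" lecture excerpt also lend themselves to a runnable example. The sketch below is my own illustration, not Crocker's lecture code: it encodes the listed rules and enumerates parses by brute-force span splitting, which is enough to parse "the man read every book" and to expose the structural ambiguity the lecture cites as making sentence processing more than simple retrieval.

```python
# A runnable sketch (not Crocker's own code) of the toy grammar and lexicon from
# the lecture excerpt above. It enumerates parses by trying every rule over
# every split of a word span, which is fine at this scale.

GRAMMAR = {
    "S":  [("NP", "VP")],
    "NP": [("PN",), ("Det", "N"), ("NP", "PP")],
    "PP": [("P", "NP")],
    "VP": [("V",), ("V", "NP"), ("V", "NP", "PP")],
}

LEXICON = {
    "Det": {"the", "a", "every"},
    "N": {"man", "woman", "book", "hill", "telescope"},
    "PN": {"John", "Mary"},
    "P": {"on", "with"},
    "V": {"saw", "put", "open", "read", "reads"},
}

def parses(category, words):
    """All bracketed parses of the word tuple `words` as `category`."""
    if category in LEXICON:  # lexical categories rewrite to a single word
        return ([f"({category} {words[0]})"]
                if len(words) == 1 and words[0] in LEXICON[category] else [])
    found = []
    for rhs in GRAMMAR.get(category, ()):
        for children in _split(rhs, words):
            found.append(f"({category} {' '.join(children)})")
    return found

def _split(rhs, words):
    """All ways to realize the symbol sequence `rhs` over `words`."""
    if not rhs:
        return [[]] if not words else []
    head, rest = rhs[0], rhs[1:]
    results = []
    # No empty rules, so each remaining symbol needs at least one word.
    for i in range(1, len(words) - len(rest) + 1):
        for head_tree in parses(head, words[:i]):
            for rest_trees in _split(rest, words[i:]):
                results.append([head_tree] + rest_trees)
    return results

if __name__ == "__main__":
    print(parses("S", tuple("the man read every book".split())))
    # [(S (NP (Det the) (N man)) (VP (V read) (NP (Det every) (N book))))]
    print(len(parses("S", tuple("John saw the man with the telescope".split()))))
    # 2 -- the final PP can attach to the NP or to the VP
```

The second example prints 2 because the grammar licenses both NP attachment (NP -> NP PP) and VP attachment (VP -> V NP PP) for the final prepositional phrase, which is exactly the kind of ambiguity that keeps sentence processing from being a simple retrieval problem.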