XNL-Soar, Incremental Parsing, and the Minimalist Program


Deryle Lonsdale, LaReina Hingson, Jamison Cooper-Leavitt, Warren Casbeer, Rebecca Madsen
BYU Department of Linguistics

Abstract

The object: a new incremental language modeling parser
- Syntactic representation based on the Minimalist Program (Chomsky 1995)
- Updates the prior GB (Principles & Parameters) representation

The framework: Soar
- Symbolic, rule-based, goal-directed cognitive modeling approach (Newell 1990)
- Machine learning architecture (Laird et al. 1987)
- Models on-line human language processing

The goal: explore MP correlations with prior psycholinguistic findings in human language processing (Lewis 1993)

The approach: process incoming lexical items incrementally via operators
- Lexical access operator for each incoming word
- Build MP syntactic structure via projection, merge, and movement operators
- Constrain operators using subcategorization and thematic information
- Allow strategies for limited, local structural reanalysis of unproblematic ambiguities

Related issues:
- Semantic interpretation incrementally mapped from syntactic structures
- Machine learning: chunking up prior decisions and reusing them later
- Performance issues: memory usage across time, parsing difficulties

Incremental parsing

Human language processing is incremental, and its processes are largely lexically driven. Word-by-word processing: words...
- enter the system’s perceptual buffer
- are attended to via a lexical access operator
- disappear if unattended to after a specified duration

Structure (syntactic and semantic) is constructed piecemeal.

Open question: Is parsing within the MP incrementally feasible in a cognitively plausible way?

Syntax in the Minimalist Program (MP)

Minimalist principles (Chomsky 1995):
- Merge: build structure based on uninterpretable categorial features
  - First: head + complement
  - Second: spec + head (the X-bar level)
- Move: generate grammaticality in the surface representation
  - Head-head movement
  - Phrasal movement
- Hierarchy of Projections (Adger 2003): inherent knowledge for specifying structural projections
  - Nominal: D > (Poss) > n > N
  - Clausal: C > T > (Neg) > (Perf) > (Prog) > (Pass) > v > V
- Features play a central role
  - Strong (movement) and weak (merge)
  - Feature checking
- NP, VP symmetry, including shells

Operator trace (partial)

For the input "Their investigation should have exonerated the defendant":

1: lexaccess(their)
2: lexaccess(investigation)
3: project(NP)
4: hop(n)
5: merge1(n’)
6: merge2(nP)
7: movehead(N)
8: hop(D)
9: merge1(D’)
10: merge2(DP)
11: lexaccess(should)
12: lexaccess(have)
13: lexaccess(exonerated)
14: lexaccess(the)
15: lexaccess(defendant)
16: project(NP)
17: hop(n)
18: merge1(n’)
19: merge2(nP)
20: movehead(N)
21: hop(D)
22: merge1(D’)
23: merge2(DP)
24: merge1(V’)
25: merge2(VP)
26: hop(v)
27: merge1(v’)
28: merge2(vP)
29: movehead(V)
30: hop(Perf)
31: merge1(Perf’)
32: merge2(PerfP)
33: hop(T)
34: merge1(T’)
35: merge2(TP)

[The poster pairs the trace with partial-tree snapshots at steps 1, 2-3, 5, 6, 7, 9-10, 11-15, 16-24, 25-29, 30-34, and 35; the tree diagrams are not recoverable from this extraction.]
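Read as pseudocode, the trace suggests a simple loop of operator applications over incoming words. The following is a minimal sketch of that loop, not the XNL-Soar implementation itself: the operator names (lexaccess, project, hop, merge1/merge2) and the two hierarchies come from the poster, while the lexicon entries, tree encoding, and function bodies are invented here for illustration.

```python
# Minimal sketch of operator-driven incremental structure building.
# Operator names mirror the trace above; data structures are hypothetical.

NOMINAL_HOP = ["D", "Poss", "n", "N"]                      # Adger 2003
CLAUSAL_HOP = ["C", "T", "Neg", "Perf", "Prog", "Pass", "v", "V"]

def lexaccess(word, lexicon):
    """Lexical access: retrieve lexically-related information for a word."""
    return {"form": word, **lexicon[word]}

def project(item):
    """Project: create a bare-structure maximal projection from the item."""
    return {"label": item["cat"] + "P", "head": item}

def hop(label, hierarchy):
    """HoP: consult the hierarchy of projections, return the next target level."""
    idx = hierarchy.index(label)
    return hierarchy[idx - 1] if idx > 0 else None

def merge(head_label, complement, specifier=None):
    """Merge: merge1 builds head + complement; merge2 adds the specifier."""
    x_bar = {"label": head_label + "'", "comp": complement}          # merge1
    return {"label": head_label + "P", "spec": specifier, "bar": x_bar}  # merge2

lexicon = {"their": {"cat": "D"}, "investigation": {"cat": "N"}}

item = lexaccess("investigation", lexicon)   # trace step 2
np = project(item)                           # step 3: project(NP)
level = hop("N", NOMINAL_HOP)                # step 4: hop(n) -> 'n'
nP = merge(level, np)                        # steps 5-6: merge1(n'), merge2(nP)
print(nP["label"])                           # nP
```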
Background: Soar

- General theory of human problem solving
- Cognition: language, action, performance (in all their varieties)
- Operator: the basic unit of cognitive processing
  - Stepwise progress toward a specified goal
  - Observable mechanisms, time course of behaviors, deliberation
- Knowledge levels and their use
- Cognitive modeling architecture instantiated as an intelligent agent
- Instantiates the model as a computational system:
  - Symbolic rule-based architecture
  - Subgoal-directed problem specification
  - Operator-based problem solving
  - Machine learning
- Applications: robotics, video games and simulation, tutorial dialogue, etc.

Background: NL-Soar

NL-Soar: a natural language processing engine built on Soar
- Soar extension for modeling language use
- Unified theory of cognition + Soar cognitive modeling system + NL components
- A unified cognitive architecture for overall cognition, including NL
- Used specifically to model language tasks: acquisition, language use, language/task integration, etc.
- Different modalities supported:
  - Parsing/comprehension: derive semantics of incoming utterances
  - Generation/production: output utterances expressing semantic content
  - Mapping: convert between semantic representations
  - Discourse/dialogue: learn and execute dialogue plans
- XNL-Soar updates the syntactic component of NL-Soar to use the MP

Operators and operator types

Various types and functions in XNL-Soar:
- Lexical access: retrieve and store lexically-related information
- Merge: construct syntax via MP-specified merge operations
- Movehead: perform head-to-head movement (via adjunction)
- HoP: consult the hierarchy of projections, return the next possible target level
- Project: create a bare-structure maximal projection from a lexical item

External knowledge sources

- WordNet 2.0 (wordnet.princeton.edu)
  - Lexical semantics: part-of-speech, word senses, subcategorization
  - Inflectional and derivational morphology
- English LCS lexicon (www.umiacs.umd.edu/~bonnie/verbs-English.lcs)
  - Thematic information: θ-grids, θ-roles
  - Used to derive uninterpretable features
  - Triggers syntactic construction
  - Aligned with WordNet information

WordNet data

Overview of the verb exonerate:

  The verb exonerate has 1 sense (first 1 from tagged texts)
  (1) acquit, assoil, clear, discharge, exonerate, exculpate -- (pronounce not guilty of criminal charges; "The suspect was cleared of the murder charges")
  Semantic class: verb.communication
  Verb frames:
    Somebody ---s somebody.
    Somebody ---s somebody of something.

English LCS lexicon data

  10.6.a#1#_ag_th,mod-poss(of)#exonerate#exonerate#exonerate#exonerate+ed#(2.0,00874318_exonerate%2:32:00::)
  "10.6.a" "Verbs of Possessional Deprivation: Cheat Verbs/-of"
  WORDS (absolve acquit balk bereave bilk bleed burgle cheat cleanse con cull cure defraud denude deplete depopulate deprive despoil disabuse disarm disencumber dispossess divest drain ease exonerate fleece free gull milk mulct pardon plunder purge purify ransack relieve render rid rifle rob sap strip swindle unburden void wean)
  ((1 "_ag_th,mod-poss()") (1 "_ag_th,mod-poss(from)") (1 "_ag_th,mod-poss(of)"))
  "He !!+ed the people (of their rights); He !!+ed him of his sins"
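XNL-Soar reaches these resources through Tcl/Perl glue code (see "Rules for XNL-Soar" below). As a rough modern stand-in only, the same two lookups can be sketched with NLTK's WordNet interface and a plain field split over the '#'-delimited LCS entry above. The NLTK calls are real, but NLTK bundles a newer WordNet than 2.0, so sense numbers and offsets will not match the data shown; the LCS field layout is inferred from the single example entry, not from the lexicon's documentation.

```python
# Sketch only: requires `pip install nltk` and nltk.download('wordnet').
from nltk.corpus import wordnet as wn

for syn in wn.synsets("exonerate", pos=wn.VERB):
    print(syn.lemma_names())  # e.g. ['acquit', 'assoil', 'clear', ...]
    print(syn.definition())   # 'pronounce not guilty of criminal charges'
    print(syn.lexname())      # semantic class, e.g. 'verb.communication'

# The LCS entry is '#'-delimited; field names below are guesses from the
# one example above.
entry = ("10.6.a#1#_ag_th,mod-poss(of)#exonerate#exonerate#exonerate"
         "#exonerate+ed#(2.0,00874318_exonerate%2:32:00::)")
levin_class, sense, theta_grid, *forms = entry.split("#")
print(levin_class)  # '10.6.a' -> Verbs of Possessional Deprivation
print(theta_grid)   # '_ag_th,mod-poss(of)' -> agent, theme, optional of-PP
```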
Our goals

- Integrate the MP into a cognitive modeling engine
- Explore language/task integrations using the MP
- Test cross-linguistic implementation possibilities with the MP
- Ultimately, determine whether the MP supports incremental, operator-based processing

Our approach

- Map the syntactic parsing task onto an operator-based framework
- Specify goals, subgoals, etc. for parsing
- Develop operator types for the various MP syntactic operations
- Implement constraints, preconditions, and precedence hierarchies
- Integrate necessary and relevant external knowledge sources

Strengths:
- We have already done this for a prior syntactic model.
- The MP has an operator-like feel to it.
- The Soar operator-based framework is versatile and flexible.

Weaknesses:
- The MP literature does not address incremental parsing in depth.
- The external knowledge sources are somewhat incommensurable.
- Our knowledge of human performance data is sketchy.

Current status

- Proof of concept for fundamental syntactic structures
- Basic transitive sentences work; ditransitives soon
- Unergatives and unaccusatives work
- All functional and lexical projections appear in syntactic structure
- Feature percolation and feature checking mostly work
- Some constraints derived from thematic information

Future functionality

- XP adjunction
- More semantics/deeper semantics
- Quantifier raising
- Scopal relationships
- C-command and other interpretive mechanisms
- More detailed LCS structures
- Web-based interactive Minimalist Parser grapher

Rules for XNL-Soar

- IF→THEN (productions): if certain conditions are met, the agent performs some action
- Productions represent a priori knowledge
- Several rule/production firings can be bundled together as operators
- Current XNL-Soar system: about 60 productions (the NL-Soar system has 3500)
- External knowledge sources: interfaced via 1000+ lines of Tcl/Perl
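Soar productions are written in Soar's own rule syntax; purely to illustrate the IF→THEN shape described above, a proposal rule can be mocked up in Python. The condition and action below are invented for exposition and are not one of the system's actual productions.

```python
# Illustration only: the IF -> THEN shape of a production, mocked up in
# Python. Real XNL-Soar productions are written in Soar rule syntax.

def propose_lexaccess(state):
    """IF an unattended word sits in the perceptual buffer,
    THEN propose a lexaccess operator for it."""
    if state["buffer"] and state["attended"] is None:
        return {"operator": "lexaccess", "word": state["buffer"][0]}
    return None  # conditions not met: no operator proposed

state = {"buffer": ["their", "investigation"], "attended": None}
print(propose_lexaccess(state))  # {'operator': 'lexaccess', 'word': 'their'}
```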
Issue: finding a balance between generation and parsing

- Most MP descriptions are generative, not recognitional, in focus
- Is it advisable and well motivated to “undo” or “reverse” movements?
- If not, is generate-and-test the right mechanism for parsing input?
- What are the implications for learning and bootstrapping language capabilities (e.g. parsing in the service of generation)?

Similar work

- Incremental parsing in general (Phillips 2003)
- Other linguistic theories for incremental parsing:
  - GB (Kolb 1991)
  - Dependency grammar (Milward 1994, Ait-Mokhtar et al. 2002)
  - Categorial Grammar (Izuo 2004)
  - Finite-state methods (Ait-Mokhtar & Chanod 1997)
- Minimalist parsing in other frameworks (Stabler 1997, Harkema 2001)
- Thematic information and parsing (Schlesewsky & Bornkessel 2004)
- Crosslinguistic considerations in incremental parsing (Schneider 2000)
- Human studies on ambiguity and reanalysis:
  - Eye tracking (Kamide, Altmann, & Haywood 2003)
  - ERP (Bornkessel, Schlesewsky, & Friederici 2003)

Future applications

- Integrate syntax/semantics into a discourse/conversation component
- Develop human-agent and agent-agent communication
- Parameterize XNL-Soar for processing languages other than English
- Model cognition in reading
- Model real-time language/task integrations

Note: all of the above have been implemented in NL-Soar, the predecessor system.

References

Chomsky, N. 1995. The Minimalist Program. Cambridge, MA: MIT Press.
Laird, J., A. Newell, and P. Rosenbloom. 1987. Soar: An architecture for general intelligence. Artificial Intelligence 33:1-64.
Lewis, R. 1993. An architecturally-based theory of human sentence comprehension. PhD thesis, Carnegie Mellon University, School of Computer Science.
Newell, A. 1990. Unified Theories of Cognition. Cambridge, MA: Harvard University Press.
Pritchett, B. 1992. Grammatical Competence and Parsing Performance. Chicago, IL: University of Chicago Press.

For more information ...

- on Soar: http://sitemaker.umich.edu/soar
- on NL-Soar: http://linguistics.byu.edu/nlsoar