An Introduction to Unification-Based Approaches to Grammar


The Harvard community has made this article openly available.

Citation: Shieber, Stuart M. 2003. An introduction to unification-based approaches to grammar. Brookline, Massachusetts: Microtome Publishing. Reissue of Shieber, Stuart M. 1986. An introduction to unification-based approaches to grammar. Stanford, California: CSLI Publications.

Citable link: http://nrs.harvard.edu/urn-3:HUL.InstRepos:11576719

Terms of Use: This article was downloaded from Harvard University's DASH repository, and is made available under the terms and conditions applicable to Other Posted Material, as set forth at http://nrs.harvard.edu/urn-3:HUL.InstRepos:dash.current.terms-of-use#LAA

An Introduction to Unification-Based Approaches to Grammar
Stuart M. Shieber
Microtome Publishing, Brookline, Massachusetts

© Stuart M. Shieber 1986, 1988, 2003. This work by Stuart M. Shieber is licensed under a Creative Commons Attribution 3.0 Unported License. The full text of the license is available at http://creativecommons.org/licenses/by/3.0/deed.en_US.

Preface

These notes were originally developed as a supplement to a tutorial on unification-based approaches to grammar presented at the 23rd Annual Meeting of the Association for Computational Linguistics on 7 July, 1985 at the University of Chicago. My intention was to present a set of formalisms and theories from a single organizing perspective, that of their reliance on notions of identity of complex feature structures and on unification in the underlying algebras. Of course, as discussed in the first chapter, such an expositional method precludes covering the formalisms from the perspective put forward by particular researchers in the field.
Fortunately, I can now recommend an introduction to the two main linguistic theories discussed in these notes which does present their own view in an introductory way yet with surprisingly broad coverage: Peter Sells' Lectures on Contemporary Syntactic Theories, published in this same CSLI Lecture Note Series, serves this purpose admirably.

Because of the brevity of the present work, and its intended (though not completely realized) nonparochial nature, I have purposefully left out any linguistic analyses from a unification-based perspective of any but the most cursory sort. My reasoning here was that the vast linguistics literature in the area provides ample examples of such analyses. (For interested readers, Chapter 5 contains references to some of this research.) A future edition, however, may be extended with examples of analyses of control, long-distance dependencies, and various of the other canonical linguistic phenomena described using the tools of unification-based formalisms.

Much of my thinking on these issues has been influenced by researchers at the Center for the Study of Language and Information at Stanford University, and especially those in the Foundations of Grammar project. CSLI is a unique environment for research, and has done more to shape this work and the opinions presented herein than any other single force. I am also indebted to Martin Kay, Fernando Pereira, Ivan Sag, Susan Stucky, Hans Uszkoreit, Thomas Wasow, and Annie Zaenen for their comments on earlier drafts of these notes. However, the views and conclusions expressed here are not necessarily shared by them, nor are they accountable for any errors remaining. The exposition in these notes, such as it is, was considerably improved, as usual, by my editor at SRI, Savel Kliachko. Any unfortunate idiosyncrasies in the text are undoubtedly lapses of the author; Savel never errs.
I also extend heartfelt thanks to the CSLI editor, Dikran Karagueuzian, for his patience and support of this work. Finally, the preparation of these notes was considerably simplified by the efforts of Emma Pease in the formatting and indexing of the final version. The preparation of the published version of these notes was made possible by a gift from the System Development Foundation.

Menlo Park, California
13 February, 1986

This second printing corrects some typographical and bibliographical errors pointed out by readers of the first printing. Special thanks go to Geoffrey K. Pullum for his assiduous comments on the citations.

Menlo Park, California
2 February, 1988

This reissue of "An Introduction to Unification-Based Approaches to Grammar Formalisms" is only lightly edited from the previous 1988 printing. By now, the content is considered quite elementary and the presentation is grossly out of date, but it may have some remaining utility by virtue of the former property at least. More complete and modern presentations of the formal basis for the type of formalisms described in this book can be found in the books by Carpenter (1992) and Johnson (1988). From a linguistic perspective, a good starting point is the textbook by Sag and Wasow (1999).

Cambridge, Massachusetts
20 August, 2003

Contents

Preface
Chapter 1. Introduction and History
Chapter 2. The Underlying Framework
  2.1. The Role of Grammar Formalisms
  2.2. Some Particular Design Choices
  2.3. Coverage of the Framework
Chapter 3. The Simplest Unification-Based Formalism
  3.1. Overview of the Formalism
  3.2. The Informational Domain
  3.3. Combinatory Rules
  3.4. Some PATR-II Grammars
Chapter 4. Extended Formalisms
  4.1. Introduction
  4.2. Two Classes of Formalisms
  4.3. Functional Unification Grammar
  4.4. Definite-Clause Grammars
  4.5. Lexical-Functional Grammar
  4.6. Generalized Phrase Structure Grammar
  4.7. Head Grammar and Head-Driven Phrase Structure Grammar
  4.8. Lexical Organization
  4.9. Other Extensions of Formalisms
Chapter 5. Conclusions
  5.1. Emergent Issues Concerning Formalisms
  5.2. A Summary
Appendix A. The Sample PATR-II Grammars
  A.1. Sample Grammar One
  A.2. Sample Grammar Two
  A.3. Sample Grammar Three
  A.4. Sample Grammar Four
Appendix B. The Literature
  B.1. General Papers
  B.2. Background and Overviews of the Formalisms
  B.3. Handling Specific Linguistic Phenomena
  B.4. Related Formalisms and Languages from Computer Science
  B.5. Related Implementation Techniques
  B.6. Implementations of Unification-Based Formalisms
Bibliography

Chapter 1. Introduction and History

These notes discuss a particular approach to encoding linguistic information, both syntactic and semantic, which has been called "unification-based" or "complex-feature-based." The term "unification-based grammar formalism" covers quite a variety of formal devices, which have been developed along historically different paths. The fact that they are regarded as a group is due perhaps partly to accidental and historical circumstances, partly to a shared underlying view of grammar, but most importantly to their reliance in some way on one particular device—unification.

Historically, these grammar formalisms are the result of separate research in computational linguistics, formal linguistics, and natural-language processing; related techniques can be found in theorem proving, knowledge representation research, and the theory of data types. Several independently initiated strains of research have converged on the idea of unification to control the flow of information.

Beginning with the augmented-transition-network (ATN) concept (like so much of the research in modern computational linguistics) and inspired by Bresnan's work on lexically oriented nontransformational linguistics, the lexical-functional grammar (LFG) framework of Bresnan and Kaplan was evolved.
Simultaneously, Kay devised the functional grammar (later unification grammar, now functional unification grammar [FUG]) formalism. Independently, Colmerauer had produced the Q-system and metamorphosis grammar formalisms as tools for natural-language processing. The logic-programming community, specifically Pereira and Warren, created definite-clause grammars (DCG) on the basis of Colmerauer's earlier work on these formalisms and on the programming language Prolog. Independent work in logic programming has used DCG as the foundation of many unification-based formalisms, such as extraposition, slot, and gapping grammars.

Another strain of parallel research grew out of the work on nontransformational linguistic analyses, from which Gazdar developed generalized phrase structure grammar (GPSG). In its later formalization by Gazdar and Pullum, GPSG imported a unification relation. Pollard defined head grammars in his dissertation, basing them on GPSG. Implementation of GPSG at Hewlett-Packard led Pollard and his colleagues to design their head-driven phrase-structure grammar (HPSG) as a successor to GPSG and head grammars.

Most recently, influenced by early papers on GPSG, Rosenschein and the author devised PATR as a successor to the DIAGRAM grammar formalism at SRI International. Although PATR did not use unification, it developed (under the influence of FUG, later work in GPSG, and DCG) into PATR-II, perhaps the simplest of the unification-based formalisms.

One thing has become clear through this morass of historical fortuities: unification is a powerful tool for grammar formalisms. All these formalisms make crucial use of unification in one manner or another, supplementing it, for various linguistic or computational reasons, with other mechanisms.

A Caveat. Unification formalisms, as we have seen, can be traced to a diversity of sources. Consequently, proponents of each formalism have