Lexical Heads, Phrase Structure and the Induction of Grammar
Recommended publications
Language Structure: Phrases
Language Structure: Phrases

"Productivity", a property of Language
• Definition – Language is an open system. We can produce a potentially infinite number of different messages by combining elements differently.
• Example – Words into phrases.

An Example of Productivity
• Human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features. (24 words)
• I think that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (33 words)
• I have always thought, and I have spent many years verifying, that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (42 words)
• Although some people might not agree with me, I have always thought…

Creating Infinite Messages
• Discrete elements – Words, Phrases
• Selection – Ease, Meaning, Identity
• Combination – Rules of organization

Models of Word Recombination
1. Word chains (Markov model): phrase-level meaning is derived from understanding each word as it is presented in the context of immediately adjacent words.
2. Hierarchical model: there are long-distance dependencies between words in a phrase, and these inform the meaning of the entire phrase.

Markov Model
• Rule: Select and concatenate (according to meaning and what types of words should occur next to each other).
  [Word-chain diagram: "Man" followed by columns of candidate next words such as "bites", "jumps", "over", "house".]
• Assumption – Only adjacent words are meaningfully (and lawfully) related.
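The word-chain model described above can be made concrete with a toy implementation. The following sketch is not from the slides; it is a minimal bigram ("word chain") generator in Python whose tiny transition table echoes the words in the slide's diagram, illustrating the Markov assumption that only adjacent words constrain each other.

import random

# Toy bigram table: each word lists the words allowed to follow it.
# The vocabulary and transitions are invented for illustration only.
BIGRAMS = {
    "<s>":   ["the", "a"],
    "the":   ["man", "dog", "house"],
    "a":     ["man", "dog", "house"],
    "man":   ["bites", "jumps", "</s>"],
    "dog":   ["bites", "jumps", "</s>"],
    "house": ["</s>"],
    "bites": ["the", "a"],
    "jumps": ["over"],
    "over":  ["the", "a"],
}

def markov_sentence(max_len=12):
    """Generate a word chain by repeatedly choosing a legal next word.

    Only the immediately preceding word is consulted at each step, which is
    the Markov assumption; long-distance dependencies (the hallmark of the
    hierarchical model) cannot be enforced in this setup.
    """
    words, current = [], "<s>"
    for _ in range(max_len):
        current = random.choice(BIGRAMS[current])
        if current == "</s>":
            break
        words.append(current)
    return " ".join(words)

if __name__ == "__main__":
    for _ in range(3):
        print(markov_sentence())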
Antisymmetry Kayne, Richard (1995)
CAS LX 523 Syntax II, Spring 2001
Paul Hagstrom
March 13, 2001
Week 7: Antisymmetry

Kayne, Richard (1995). The antisymmetry of syntax. Cambridge, MA: MIT Press.
Koopman, Hilda (2000). The spec-head configuration. In Koopman, H., The syntax of specifiers and heads. London: Routledge.

The basic proposals:
X-bar structures (universally) have a strict order: Spec-head-complement.
There is no distinction between adjuncts and specifiers.
There can be only one specifier.

But wait!—What about SOV languages? What about multiple adjunction?
Answer: We’ve been analyzing these things wrong.

Now, we have lots of work to do, because lots of previous analyses relied on the availability of “head-final” structures, or multiple adjunction.
Why make our lives so difficult? Wasn’t our old system good enough?
Actually, no.
A number of things had to be stipulated in X-bar theory (which we will review); they can all be made to follow from one general principle.
The availability of a head-parameter actually fails to predict the kinds of languages that actually exist.

(1) [Example tree: A branches into B and E; B branches into C and D; E branches into F and G; C, D, F, and G dominate the terminals c, d, e, and f respectively.]

(2) A node α ASYMMETRICALLY C-COMMANDS β if α c-commands β and β does not c-command α.
• B asymmetrically c-commands F and G.
• E asymmetrically c-commands C and D.
• No other non-terminal nodes asymmetrically c-command any others.

(3) d(X) is the image of a non-terminal node X.
d(X) is the set of terminal nodes dominated by node X.
• d(C) is {c}.
• d(B) is {c, d}.
• d(F) is {e}.
• d(E) is {e, f}.
• d(A) is {c, d, e, f}.
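The definitions in (1) to (3) can be checked mechanically. The sketch below is not part of the handout: it encodes the tree in (1) as a parent-child table (the encoding and function names are mine) and computes asymmetric c-command and the images d(X).

# Tree from (1): A dominates B and E; B dominates C and D; E dominates F and G;
# C, D, F, G dominate the terminals c, d, e, f respectively.
CHILDREN = {
    "A": ["B", "E"],
    "B": ["C", "D"],
    "E": ["F", "G"],
    "C": ["c"], "D": ["d"], "F": ["e"], "G": ["f"],
}
TERMINALS = {"c", "d", "e", "f"}

def dominates(x, y):
    """True if x properly dominates y."""
    return any(y == kid or dominates(kid, y) for kid in CHILDREN.get(x, []))

def parent(x):
    return next((p for p, kids in CHILDREN.items() if x in kids), None)

def c_commands(x, y):
    """x c-commands y if neither dominates the other and x's parent dominates y
    (a simplified definition using the immediate parent, adequate for the
    non-terminal pairs checked below)."""
    if x == y or dominates(x, y) or dominates(y, x):
        return False
    p = parent(x)
    return p is not None and dominates(p, y)

def asymmetrically_c_commands(x, y):
    return c_commands(x, y) and not c_commands(y, x)

def image(x):
    """d(X): the set of terminal nodes dominated by X."""
    return {t for t in TERMINALS if dominates(x, t)}

if __name__ == "__main__":
    nonterminals = [n for n in CHILDREN]
    pairs = [(x, y) for x in nonterminals for y in nonterminals
             if asymmetrically_c_commands(x, y)]
    print(pairs)   # expect [('B', 'F'), ('B', 'G'), ('E', 'C'), ('E', 'D')]
    print({n: sorted(image(n)) for n in nonterminals})  # matches the d(X) list in (3)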
Antisymmetry and the Lefthand in Morphology*
CatWPL 7, 1999, 71-87

Antisymmetry and the Lefthand in Morphology*
Frank Drijkoningen
Utrecht Institute of Linguistics-OTS, Department of Foreign Languages
Kromme Nieuwegracht 29, 3512 HD Utrecht, The Netherlands
[email protected]
Received: December 13th 1998. Accepted: March 17th 1999.

Abstract
As Kayne (1994) has shown, the theory of antisymmetry of syntax also provides an explanation of a structural property of morphological complexes, the Righthand Head Rule. In this paper we show that an antisymmetry approach to the Righthand Head Rule eventually is to be preferred on empirical grounds, because it describes and explains the properties of a set of hitherto puzzling morphological processes—known as discontinuous affixation, circumfixation or parasynthesis. In considering these and a number of more standard morphological structures, we argue that one difference bearing on the proper balance between morphology and syntax should be re-installed (re- with respect to Kayne), a difference between the antisymmetry of the syntax of morphology and the antisymmetry of the syntax of syntax proper.
Key words: antisymmetry, Righthand Head Rule, circumfixation, parasynthesis, prefixation, category-changing prefixation, discontinuities in morphology.

Resum. L'antisimetria i el costat esquerre en morfologia
As Kayne (1994) shows, the theory of antisymmetry in syntax also provides an explanation of a structural property of morphological complexes, the Righthand Head Rule. In this article we show that an antisymmetric treatment of the Righthand Head Rule is eventually to be preferred on empirical grounds, because it describes and explains the properties of a series of hitherto puzzling morphological processes, known as discontinuous affixation, circumfixation or parasynthesis.
Processing English with a Generalized Phrase Structure Grammar
PROCESSING ENGLISH WITH A GENERALIZED PHRASE STRUCTURE GRAMMAR
Jean Mark Gawron, Jonathan King, John Lamping, Egon Loebner, E. Anne Paulson, Geoffrey K. Pullum, Ivan A. Sag, and Thomas Wasow
Computer Research Center, Hewlett-Packard Company, 1501 Page Mill Road, Palo Alto, CA 94304

ABSTRACT
This paper describes a natural language processing system implemented at Hewlett-Packard's Computer Research Center. The system's main components are: a Generalized Phrase Structure Grammar (GPSG); a top-down parser; a logic transducer that outputs a first-order logical representation; and a "disambiguator" that uses sortal information to convert "normal-form" first-order logical expressions into the query language for HIRE, a relational database hosted in the SPHERE system. We argue that theoretical developments in GPSG syntax and in Montague semantics have specific advantages to bring to this domain of computational linguistics. The syntax and semantics of the system are totally domain-independent, and thus, in principle, highly portable.

…can be achieved without detailed syntactic analysis. There is, of course, a massive pragmatic component to human linguistic interaction. But we hold that pragmatic inference makes use of a logically prior grammatical and semantic analysis. This can be fruitfully modeled and exploited even in the complete absence of any modeling of pragmatic inferencing capability. However, this does not entail an incompatibility between our work and research on modeling discourse organization and conversational interaction directly. Ultimately, a successful language understanding system will require both kinds of research, combining the advantages of precise, grammar-driven analysis of utterance structure and pragmatic inferencing based on discourse structures and knowledge of the world. We stress, however, that our concerns at this
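The staging described in the abstract (grammar and top-down parser, then logic transducer, then sortal disambiguator, then database query) can be pictured as a chain of functions. The sketch below is only an illustration of that staging: every function body, name, and toy data format is an invented placeholder, not the authors' code, and it does not reproduce the HIRE query language.

# Illustrative staging only: the real system used a GPSG grammar, a top-down
# parser, a logic transducer, and a sortal disambiguator feeding HIRE; none of
# those components is reproduced here.

def parse(sentence):
    """Placeholder for the top-down GPSG parser: return a toy constituent tree."""
    words = sentence.split()
    return ("S", ("NP", words[0]), ("VP", words[1], ("NP", words[2])))

def to_logical_form(tree):
    """Placeholder for the logic transducer: map the tree to a first-order-style tuple."""
    _, (_, subj), (_, verb, (_, obj)) = tree
    return (verb, subj, obj)

def disambiguate(lf):
    """Placeholder for the sortal disambiguator (identity here)."""
    return lf

def to_query(lf):
    """Placeholder for translation into a database query string."""
    pred, subj, obj = lf
    return f"SELECT * WHERE {pred}({subj}, {obj})"

def understand(sentence):
    # The point is the ordering: syntactic analysis feeds a logical form,
    # which is disambiguated and only then turned into a database query.
    return to_query(disambiguate(to_logical_form(parse(sentence))))

print(understand("Kim manages widgets"))   # SELECT * WHERE manages(Kim, widgets)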
Head Words and Phrases Heads and Their Dependents
Head Words and Phrases
Tallerman: Chapter 4

Heads and their Dependents
• Properties of heads
  – Head bears most important semantic information of the phrase.
  – Word class of head determines word class of entire phrase.
    [NP very bright [N sunflowers]]  [VP [V overflowed] quite quickly]  [AP very [A bright]]  [AdvP quite [Adv quickly]]  [PP [P inside] the house]
  – Head has same distribution as the entire phrase.
    • Go inside the house. / Go inside.
    • Kim likes very bright sunflowers. / Kim likes sunflowers.
  – Heads normally can’t be omitted.
    • *Go the house.
    • *Kim likes very bright.
  – Heads select dependent phrases of a particular word class.
    • The soldiers released the hostages. / *The soldiers released.
    • He went into the house. / *He went into.
    • bright sunflowers / *brightly sunflowers
    • Kambera:
      Lalu mbana-na na lodu
      too hot-3SG the sun
      ‘The sun is hot.’
      *Lalu uma
      too house
  – Heads often require dependents to agree with grammatical features of head.
    • French:
      un livre vert (a:MASC book green:MASC) ‘a green book’
      une pomme verte (a:FEM apple green:FEM) ‘a green apple’
  – Heads may require dependent NPs to occur in a particular grammatical case.
    • Japanese:
      Kodomo-ga hon-o yon-da
      child-NOM book-ACC read-PAST
      ‘The child read the book.’
• More about dependents
  – Adjuncts and complements
    • Adjuncts are always optional; complements are frequently obligatory.
    • Complements are selected by the head and therefore bear a close relationship with it; adjuncts add extra information.
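A minimal sketch (mine, not Tallerman's) of two of the head properties listed above: the head's word class determines the word class of the whole phrase, and the head selects the word class of its complement. The toy lexicon and the project function are invented for illustration.

# Toy lexicon: each head word has a category and (optionally) the category of
# the complement it selects. Entries are invented for illustration only.
LEXICON = {
    "inside":     {"cat": "P", "selects": "NP"},   # 'inside the house'
    "released":   {"cat": "V", "selects": "NP"},   # 'released the hostages'
    "sunflowers": {"cat": "N", "selects": None},
    "bright":     {"cat": "A", "selects": None},
}

def project(head, complement_cat=None):
    """Build the phrase label for a head, enforcing two head properties:
    the phrase's word class comes from the head, and the head's selectional
    requirement for a complement must be satisfied."""
    entry = LEXICON[head]
    if entry["selects"] != complement_cat:
        raise ValueError(
            f"{head} ({entry['cat']}) selects {entry['selects']}, got {complement_cat}")
    return entry["cat"] + "P"   # N -> NP, V -> VP, P -> PP, A -> AP

print(project("sunflowers"))        # NP
print(project("inside", "NP"))      # PP, as in 'inside the house'
print(project("released", "NP"))    # VP, as in 'released the hostages'
try:
    project("released")             # mirrors *'The soldiers released'
except ValueError as err:
    print("ill-formed:", err)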
Acquiring Verb Subcategorization from Spanish Corpora
Acquiring Verb Subcategorization from Spanish Corpora
Grzegorz Chrupała
[email protected]
Universitat de Barcelona, Department of General Linguistics
PhD Program “Cognitive Science and Language”
Supervised by Dr. Irene Castellón Masalles
September 2003

Contents
1 Introduction 5
2 Verb Subcategorization in Linguistic Theory 7
  2.1 Introduction 7
  2.2 Government-Binding and related approaches 7
  2.3 Categorial Grammar 9
  2.4 Lexical-Functional Grammar 11
  2.5 Generalized Phrase-Structure Grammar 12
  2.6 Head-Driven Phrase-Structure Grammar 14
  2.7 Discussion 17
3 Diathesis Alternations 19
  3.1 Introduction 19
  3.2 Diathesis 19
    3.2.1 Alternations 20
  3.3 Diathesis alternations in Spanish 21
    3.3.1 Change of focus 22
    3.3.2 Underspecification 26
    3.3.3 Resultative construction 27
    3.3.4 Middle construction 27
    3.3.5 Conclusions 29
4 Verb classification 30
  4.1 Introduction 30
  4.2 Semantic decomposition 30
  4.3 Levin classes 32
    4.3.1 Beth Levin’s classification 32
    4.3.2 Intersective Levin Classes 33
  4.4 Spanish: verbs of change and verbs of path 35
    4.4.1 Verbs of change 35
    4.4.2 Verbs of path 37
    4.4.3 Discussion 39
  4.5 Lexicographical databases 40
    4.5.1 WordNet 40
    4.5.2 VerbNet 41
    4.5.3 FrameNet 42
5 Subcategorization Acquisition 44
  5.1 Evaluation measures 45
    5.1.1 Precision, recall and the F-measure 45
    5.1.2 Types and tokens 46
  5.2 SF acquisition systems 47
    5.2.1 Raw text
On Agent Nominalizations and Why They Are Not Like Event Nominalizations
On agent nominalizations and why they are not like event nominalizations
Mark C. Baker and Nadya Vinokurova
Rutgers University and Research Institute of Humanities, Yakutsk

Abstract: This paper focuses on agent-denoting nominalizations in various languages (e.g. the finder of the wallet), contrasting them with the much better studied action/event-denoting nominalizations. In particular, we show that in Sakha, Mapudungun, and English, agent-denoting nominalizations have none of the verbal features that event-denoting nominalizations sometimes have: they cannot contain adverbs, voice markers, expressions of aspect or mood, or verbal negation. An apparent exception to this generalization is that Sakha allows accusative-case marked objects in agentive nominalizations. We show that in fact the structure of agentive nominalizations in Sakha is as purely nominal as in other languages, and the difference is attributable to the rule of accusative case assignment. We explain these restrictions by arguing that agentive nominalizers have a semantics very much like the one proposed by Kratzer (1996) for Voice heads. Given this, the natural order of semantic composition implies that agentive nominalizers must combine directly with VP, just as Voice heads must. As a preliminary to testing this idea typologically, we show how a true agentive nominalization can be distinguished from a headless subject relative clause, illustrating with data from Mapudungun. We then present the results of a 34-language survey, showing that indeed none of these languages allow clause-like syntax inside a true agentive nominalization. We conclude that a generative-style investigation into the details of particular languages can be a productive source of things to look for in typological surveys.
On Object Shift, Scrambling, and the PIC
On Object Shift, Scrambling, and the PIC
Peter Svenonius
University of Tromsø and MIT*

1. A Class of Movements
The displacements characterized in (1-2) have received a great deal of attention. (Boldface in the gloss here is simply to highlight the alternation.)

(1) Scrambling (exx. from Bergsland 1997: 154)
a. ... gan nagaan slukax igaaxtakum (Aleut)
   his.boat out.of seagull.ABS flew
   ‘... a seagull flew out of his boat’
b. ... quganax hlagan kugan husaqaa
   rock.ABS his.son on.top.of fell
   ‘... a rock fell on top of his son’1

(2) Object Shift (OS)
a. Hann sendi sem betur fer bréfið niður. (Icelandic)
   he sent as better goes the.letter down
b. Hann sendi bréfið sem betur fer niður.
   he sent the.letter as better goes down
   (Both:) ‘He fortunately sent the letter down’

* I am grateful to the University of Tromsø Faculty of Humanities for giving me leave to traipse the globe on the strength of the promise that I would write some papers, and to the MIT Department of Linguistics & Philosophy for welcoming me to breathe in their intellectually stimulating atmosphere. I would especially like to thank Noam Chomsky, Norvin Richards, and Juan Uriagereka for discussing parts of this work with me while it was underway, without implying their endorsement. Thanks also to Kleanthes Grohmann and Ora Matushansky for valuable feedback on earlier drafts, and to Ora Matushansky and Elena Guerzoni for their beneficent editorship.
1 According to Bergsland (pp. 151-153), a subject preceding an adjunct tends to be interpreted as definite (making (1b) unusual), and one following an adjunct tends to be indefinite; this is broadly consistent with the effects of scrambling cross-linguistically.
Theories of Syntax, Semantics and Discourse Interpretation for Natural Language
Theories of Syntax, Semantics and Discourse Interpretation for Natural Language
Ted Briscoe
Computer Laboratory, University of Cambridge
© Ted Briscoe (3 sessions, Michaelmas Term 2011)
October 26, 2011

Abstract
This handout builds on the Intro to Linguistics material by presenting formal theories of aspects of language. We mostly focus on the formalisations of aspects of the subtasks of syntactic, semantic and discourse interpretation, rather than computational approximations to solve them, because it is useful to thoroughly understand the problem you are trying to solve before you start to solve it. This handout is not meant to replace textbooks – see Intro to Linguistics for readings and references herein. Please read each section in advance of the session, attempt the exercises, and be prepared to ask and answer questions on the material covered.

Contents
1 (Context-free) Phrase Structure Grammar 2
  1.1 Derivations 3
  1.2 Ambiguity 4
  1.3 Inadequacies of CF PSG 6
  1.4 Unification and Features 7
2 Compositional Semantics 10
  2.1 Predicates & Arguments 10
  2.2 NP Semantics 12
  2.3 Scope Ambiguities 15
  2.4 Intensionality 16
  2.5 Word Meaning 18
  2.6 Theorem Proving 19
3 Discourse Processing 21
  3.1 Abductive Inference 22
  3.2 Scripts and Plans 23
  3.3 Shallow, Knowledge-Poor Anaphora Resolution 23

1 (Context-free) Phrase Structure Grammar
A generative grammar is a finite set of rules which define the (infinite) set of grammatical sentences of some language. Here are some example rules for English:
a) S → NP VP
b) NP → Det N
c) NP → Nname
d) VP → Vt NP
These rules assign the sentence The people love Sandy the same analysis and phrase structure tree that was proposed in the Intro.
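To see how rules a) to d) license the example sentence, here is a small sketch (not part of the handout): the four rules, a tiny invented lexicon, and a naive top-down recognizer that accepts "The people love Sandy".

# Rules a)-d) from the handout, plus a small invented lexicon so that the
# example sentence can actually be recognized.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"], ["Nname"]],
    "VP": [["Vt", "NP"]],
}
LEXICON = {
    "Det":   {"the"},
    "N":     {"people"},
    "Nname": {"sandy", "kim"},
    "Vt":    {"love", "loves"},
}

def expand(symbols, words):
    """Top-down expansion: try to derive a prefix of `words` from `symbols`
    and return every possible remaining suffix."""
    if not symbols:
        return [words]
    first, rest = symbols[0], symbols[1:]
    remainders = []
    if first in LEXICON:
        # Preterminal: consume one word if it belongs to this category.
        if words and words[0] in LEXICON[first]:
            remainders.extend(expand(rest, words[1:]))
    else:
        # Nonterminal: try each rewrite rule in turn.
        for rhs in RULES[first]:
            for mid in expand(tuple(rhs), words):
                remainders.extend(expand(rest, mid))
    return remainders

def recognize(sentence):
    words = tuple(sentence.lower().split())
    # The sentence is grammatical if S can be expanded to exactly these words.
    return any(rest == () for rest in expand(("S",), words))

print(recognize("The people love Sandy"))   # True
print(recognize("People the love Sandy"))   # False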
The Antisymmetry of Syntax
Contents
Series Foreword xi
Preface xiii
Acknowledgments xvii
Chapter 1 Introduction 3
  1.1 Introduction 3
  1.2 Proposal 5
Chapter 2 Deriving X-Bar Theory 7
PART II 13
Chapter 3 Adjunction 15
  3.1 Segments and Categories 15
  3.2 Adjunction to a Head 17
  3.3 Multiple Adjunctions: Clitics 19
  3.4 Multiple Adjunctions: Nonheads 21
  3.5 Specifiers 22
  3.6 Verb-Second Effects 27
  3.7 Adjunction of a Head to a Nonhead 30
Chapter 4 Word Order 33
  4.1 The Specifier-Complement Asymmetry 33
  4.2 Specifier-Head-Complement as a Universal Order 35
  4.3 Time and the Universal Specifier-Head-Complement Order 36
  4.4 Linear Order and Adjunction to Heads 38
  4.5 Linear Order and Structure below the Word Level 38
  4.6 The Adjunction Site of Clitics 42
Chapter 5 Further Consequences 47
  5.1 There Is No Directionality Parameter 47
  5.2 The LCA Applies to All Syntactic Representations 48
  5.3 Agreement in Adpositional Phrases 49
  5.4 Head Movement 50
  5.5 Final Complementizers and Agglutination 52
Chapter 6 Coordination 57
  6.1 More on Coordination 57
  6.2 Coordination of Heads, including Clitics 59
  6.3 Coordination with With 63
  6.4 Right Node Raising 67
Chapter 7 Complementation 69
  7.1 Multiple Complements and Adjuncts 69
  7.2 Heavy NP Shift 71
  7.3 Right-Dislocations 78
Chapter 8 Relatives and Possessives 85
  8.1 Postnominal Possessives in English 85
  8.2 Relative Clauses in English 86
  8.3 N-Final Relative Clauses 92
  8.4 Reduced Relatives and Adjectives 97
  8.5 More on Possessives 101
  8.6 More on French De 105
  8.7 Nonrestrictive Relatives 110
Dependency Grammars
Dependency Grammars
Syntactic Theory, Winter Semester 2009/2010
Antske Fokkens
Department of Computational Linguistics, Saarland University
27 October 2009

Outline
1 Introduction
2 Phrase Structures
3 Dependency Grammars
  Introduction
  Dependency relations
  Properties of Dependencies

Overview of this lecture
In the next three lectures, we will discuss Dependency Grammars:
• Dependencies and Phrase Structures: basic objectives of syntactic analysis; properties of phrase structure grammars
• Basic definitions of Dependencies: What are dependencies? Example analyses
• Differences and Relations between Dependencies and Phrase Structures
• Syntactic Theory/CL and Dependencies: Meaning to Text Theory; Prague Dependency Treebank

Syntactic Analysis
Syntax investigates the structure of expressions. Some reasons for performing syntactic analysis:
• To understand something about how language works
• To analyze language: how can we relate speech/written text to meaning?
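To make the notion of a dependency analysis concrete before the formal definitions, here is a small sketch (mine, not from the slides): a sentence's dependency structure is encoded as head-to-dependent arcs, with a check that every non-root word has exactly one head. The example sentence, relation labels, and function names are invented for illustration.

# A dependency analysis is a set of arcs from heads to dependents.
# Tokens are (index, word); arcs are (head_index, dependent_index, relation).
TOKENS = [(1, "the"), (2, "child"), (3, "read"), (4, "the"), (5, "book")]
ARCS = [
    (3, 2, "subj"),   # 'child' depends on 'read' as its subject
    (3, 5, "obj"),    # 'book' depends on 'read' as its object
    (2, 1, "det"),    # 'the' depends on 'child'
    (5, 4, "det"),    # 'the' depends on 'book'
]
ROOT = 3              # 'read' is the root of the sentence

def head_of(i):
    return next((h for h, d, _ in ARCS if d == i), None)

def dependents_of(i):
    return [(d, rel) for h, d, rel in ARCS if h == i]

def is_tree(tokens, arcs, root):
    """Single-head and single-root checks: each non-root word has exactly one head."""
    heads = [d for _, d, _ in arcs]
    return (len(set(heads)) == len(heads)
            and set(heads) == {i for i, _ in tokens} - {root})

if __name__ == "__main__":
    words = dict(TOKENS)
    for i, w in TOKENS:
        h = head_of(i)
        print(f"{w}: head = {words.get(h, 'ROOT')}, dependents = "
              f"{[(words[d], rel) for d, rel in dependents_of(i)]}")
    print("well-formed dependency tree:", is_tree(TOKENS, ARCS, ROOT))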
A Microparametric Approach to the Head-Initial/Head-Final Parameter
A microparametric approach to the head-initial/head-final parameter
Guglielmo Cinque
Ca’ Foscari University, Venice

Abstract: The fact that even the most rigid head-final and head-initial languages show inconsistencies and, more crucially, that the very languages which come closest to the ideal types (the “rigid” SOV and the VOS languages) are apparently a minority among the languages of the world, makes it plausible to explore the possibility of a microparametric approach for what is often taken to be one of the prototypical examples of a macroparameter, the ‘head-initial/head-final parameter’. From this perspective, the features responsible for the different types of movement of the constituents of the unique structure of Merge from which all canonical orders derive are determined by lexical specifications of different generality: from those present on a single lexical item, to those present on lexical items belonging to a specific subclass of a certain category, or to every subclass of a certain category, or to every subclass of two or more, or all, categories, (always) with certain exceptions.

Keywords: word order, parameters, micro-parameters, head-initial, head-final

1. Introduction
An influential conjecture concerning parameters, subsequent to the macroparametric approach of Government and Binding theory (Chomsky 1981, 6ff and passim), is that they can possibly be “restricted to formal features [of functional categories] with no interpretation at the interface” (Chomsky 1995, 6) (also see Borer 1984 and Fukui 1986). This conjecture has opened the way to a microparametric approach to differences among languages, as well as to differences between related words within one and the same language (Kayne 2005, §1.2).