Practical Instructions for Working with the Formalism of Lexical Functional Grammar*
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
Language Structure: Phrases
Language Structure: Phrases

“Productivity”: a property of language
• Definition: Language is an open system. We can produce potentially an infinite number of different messages by combining elements differently.
• Example: Words into phrases.

An Example of Productivity
• Human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features. (24 words)
• I think that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (33 words)
• I have always thought, and I have spent many years verifying, that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (42 words)
• Although some people might not agree with me, I have always thought…

Creating Infinite Messages
• Discrete elements: words, phrases
• Selection: ease, meaning, identity
• Combination: rules of organization

Models of Word Combination
1. Word chains (Markov model): Phrase-level meaning is derived from understanding each word as it is presented in the context of immediately adjacent words.
2. Hierarchical model: There are long-distance dependencies between words in a phrase, and these inform the meaning of the entire phrase.

Markov Model
Rule: Select and concatenate (according to meaning and what types of words should occur next to each other).
[Figure: word-chain diagram linking words such as “man”, “bites”, “jumps”, “over”, and “house”.]
• Assumption: Only adjacent words are meaningfully (and lawfully) related.
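To make the word-chain idea above concrete, here is a minimal Python sketch of a bigram Markov generator. The toy corpus and function names are invented for illustration; what it demonstrates is the excerpt's assumption that each word is chosen only on the basis of the immediately preceding (adjacent) word, with no long-distance dependencies.

```python
import random
from collections import defaultdict

def build_bigram_table(sentences):
    """Map each word to the words observed immediately after it."""
    table = defaultdict(list)
    for sentence in sentences:
        words = sentence.split()
        for prev, nxt in zip(words, words[1:]):
            table[prev].append(nxt)
    return table

def generate(table, start, max_len=8):
    """Choose every next word using only the current (adjacent) word."""
    out = [start]
    for _ in range(max_len - 1):
        followers = table.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = ["the man bites the dog", "the dog jumps over the house"]
table = build_bigram_table(corpus)
print(generate(table, "the"))   # e.g. "the dog jumps over the man bites the"
```

Because the table only records adjacent pairs, the output is locally plausible but has no phrase-level structure, which is exactly the contrast the slides draw with the hierarchical model.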
HPSG and Lexical Functional Grammar (Chapter 30)
Chapter 30: HPSG and Lexical Functional Grammar
Stephen Wechsler, The University of Texas; Ash Asudeh, University of Rochester & Carleton University

This chapter compares two closely related grammatical frameworks, Head-Driven Phrase Structure Grammar (HPSG) and Lexical Functional Grammar (LFG). Among the similarities: both frameworks draw a lexicalist distinction between morphology and syntax, both associate certain words with lexical argument structures, both employ semantic theories based on underspecification, and both are fully explicit and computationally implemented. The two frameworks make available many of the same representational resources. Typical differences between the analyses proffered under the two frameworks can often be traced to concomitant differences of emphasis in the design orientations of their founding formulations: while HPSG's origins emphasized the formal representation of syntactic locality conditions, those of LFG emphasized the formal representation of functional equivalence classes across grammatical structures. Our comparison of the two theories includes a point-by-point syntactic comparison, after which we turn to an exposition of Glue Semantics, a theory of semantic composition closely associated with LFG.

1 Introduction
Head-Driven Phrase Structure Grammar is similar in many respects to its sister framework, Lexical Functional Grammar or LFG (Bresnan et al. 2016; Dalrymple et al. 2019). Both HPSG and LFG are lexicalist frameworks in the sense that they distinguish between the morphological system that creates words and the syntax proper that combines those fully inflected words into phrases and sentences.

Citation: Stephen Wechsler & Ash Asudeh. 2021. HPSG and Lexical Functional Grammar. In Stefan Müller, Anne Abeillé, Robert D. Borsley & Jean-Pierre Koenig (eds.), Head-Driven Phrase Structure Grammar: The handbook.
Syntax III Assignment I (Team B): Transformations
Syntax III Assignment I (Team B): Transformations
Julie Winkler, Andrew Pedelty, Kelsey Kraus, Travis Heller, Josue Torres, Nicholas Primrose, Philip King

A layout of this paper:
I. Transformations! An Introduction
   a. What
   b. Why
   c. What Not
II. A-Syntax: Operations on Arguments Pre-Surface Structure
III. Morphology: Form and Function
   a. Lexical assumptions
   b. VP-Internal Subject Raising Hypothesis
   c. Form Rules
   d. Morpho-Syntactic Rules
   e. Further Questions
IV. A-Bar Syntax
   a. Not A-Syntax: Movement on the Surface
   b. Islands
   c. Features & Percolation: Not Just for Subcats
   d. Successive Cyclic Movement
V. Where Do We Go From Here?
   a. Summary

I. Introduction
Before going into the specifics of transformational syntax as we've learned it, it behooves us to consider transformations on a high level. What are they, how do they operate, and, very importantly, why do we have them? We should also make a distinction regarding the purpose of this paper, which is not to provide a complete or authoritative enumeration of all transformational operations (likely an impossible endeavour), but rather to investigate those operations and affirm our understanding of the framework in which they exist.

I. a. What
What are transformations? A transformation (Xn) takes an existing syntactic structure and renders a new construction by performing one or more of the following three operations upon it: movement, deletion, or insertion. These operations are constrained by a number of rules and traditionally adhered-to constraints. Furthermore, it has been mentioned that semi-legitimate syntactic theories have eschewed certain of these operations while retaining sufficient descriptive power.
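The three operations named in the excerpt (movement, deletion, insertion) can be pictured with a toy tree manipulation. The Python sketch below is purely illustrative and assumes a much-simplified wh-movement: the wh-phrase is deleted from its base position (leaving a trace) and inserted at the left edge of CP. The tree encoding and function names are mine, not the authors'.

```python
# A toy constituent tree as nested lists: [label, child, ...]; leaves are strings.
tree = ["CP", ["C", "did"],
        ["TP", ["DP", "you"],
               ["VP", ["V", "see"], ["DP", "who"]]]]

WH_WORDS = {"who", "what", "which"}

def remove_wh(node):
    """Deletion step: take out the first wh-DP found, leaving a trace behind."""
    for i, child in enumerate(node[1:], start=1):
        if isinstance(child, list):
            if child[0] == "DP" and isinstance(child[1], str) and child[1] in WH_WORDS:
                node[i] = ["DP", "t"]          # trace left in the base position
                return child
            found = remove_wh(child)
            if found:
                return found
    return None

def wh_movement(root):
    """Movement = deletion at the base position + insertion at the left edge of CP."""
    moved = remove_wh(root)
    if moved:
        root.insert(1, moved)                  # insertion step
    return root

print(wh_movement(tree))
# ['CP', ['DP', 'who'], ['C', 'did'],
#  ['TP', ['DP', 'you'], ['VP', ['V', 'see'], ['DP', 't']]]]
```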
Lexical-Functional Grammar and Order-Free Semantic Composition
COLING 82, J. Horecký (ed.), North-Holland Publishing Company, © Academia 1982

Lexical-Functional Grammar and Order-Free Semantic Composition
Per-Kristian Halvorsen
Norwegian Research Council's Computing Center for the Humanities and Center for Cognitive Science, MIT

This paper summarizes the extension of the theory of lexical-functional grammar to include a formal, model-theoretic semantics. The algorithmic specification of the semantic interpretation procedures is order-free, which distinguishes the system from other theories providing model-theoretic interpretation for natural language. Attention is focused on the computational advantages of a semantic interpretation system that takes as its input functional structures as opposed to syntactic surface structures.

A pressing problem for computational linguistics is the development of linguistic theories which are supported by strong independent linguistic argumentation, and which can, simultaneously, serve as a basis for efficient implementations in language processing systems. Linguistic theories with these properties make it possible for computational implementations to build directly on the work of linguists both in the area of grammar-writing, and in the area of theory development (cf. universal conditions on anaphoric binding, filler-gap dependencies etc.). Lexical-functional grammar (LFG) is a linguistic theory which has been developed with equal attention being paid to theoretical linguistic and computational processing considerations (Kaplan & Bresnan 1981). The linguistic theory has ample and broad motivation (vide the papers in Bresnan 1982), and it is transparently implementable as a syntactic parsing system (Kaplan & Halvorsen forthcoming). LFG takes grammatical relations to be of primary importance (as opposed to the transformational theory where grammatical functions play a subsidiary role).
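As a rough illustration of why interpretation from functional structures can be order-free, here is a small Python sketch: a hypothetical, much-simplified f-structure for "John likes Mary" is interpreted by looking up grammatical functions (SUBJ, OBJ) rather than word positions, so the same structure yields the same logical form regardless of surface word order. The dictionary encoding and the interpret function are my own simplification, not Halvorsen's actual interpretation rules.

```python
# A hypothetical, simplified f-structure for "John likes Mary", as a Python dict.
f_structure = {
    "PRED": "like<SUBJ, OBJ>",
    "TENSE": "present",
    "SUBJ": {"PRED": "John"},
    "OBJ": {"PRED": "Mary"},
}

def interpret(f):
    """Build a logical-form string from grammatical functions, not word positions."""
    if set(f) == {"PRED"}:                      # an argument with no further structure
        return f["PRED"]
    predicate = f["PRED"].split("<")[0]
    args = [interpret(f[gf]) for gf in ("SUBJ", "OBJ") if gf in f]
    return f"{predicate}({', '.join(args)})"

print(interpret(f_structure))   # like(John, Mary)
```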
Introduction to Transformational Grammar
Introduction to Transformational Grammar
Kyle Johnson, University of Massachusetts at Amherst, Fall 2004

Contents
Preface
1 The Subject Matter
  1.1 Linguistics as learning theory
  1.2 The evidential basis of syntactic theory
2 Phrase Structure
  2.1 Substitution Classes
  2.2 Phrases
  2.3 X phrases
  2.4 Arguments and Modifiers
3 Positioning Arguments
  3.1 Expletives and the Extended Projection Principle
  3.2 Case Theory and ordering complements
  3.3 Small Clauses and the Derived Subjects Hypothesis
  3.4 PRO and Control Infinitives
  3.5 Evidence for Argument Movement from Quantifier Float
  3.6 Towards a typology of infinitive types
  3.7 Constraints on Argument Movement and the typology of verbs
4 Verb Movement
  4.1 The “Classic” Verb Movement account
  4.2 Head Movement's role in “Verb Second” word order
  4.3 The Pollockian revolution: exploded IPs
  4.4 Features and covert movement
5 Determiner Phrases and Noun Movement
  5.1 The DP Hypothesis
  5.2 Noun Movement
6 Complement Structure
  6.1 Nouns and the θ-roles they assign
  6.2 Double Object constructions and Larsonian shells
  6.3 Complement structure and Object Shift
7 Subjects and Complex Predicates
  7.1 Getting into the right position
  7.2 Subject Arguments
    7.2.1 Argument Structure
    7.2.2 The syntactic benefits of ν
  7.3 The relative positions of µP and νP: Evidence from ‘again’
  7.4 The Minimal Link Condition and Romance causatives
  7.5 Remaining Problems
    7.5.1 The main verb in English is too high
Processing English with a Generalized Phrase Structure Grammar
PROCESSING ENGLISH WITH A GENERALIZED PHRASE STRUCTURE GRAMMAR
Jean Mark Gawron, Jonathan King, John Lamping, Egon Loebner, E. Anne Paulson, Geoffrey K. Pullum, Ivan A. Sag, and Thomas Wasow
Computer Research Center, Hewlett Packard Company, 1501 Page Mill Road, Palo Alto, CA 94304

ABSTRACT
This paper describes a natural language processing system implemented at Hewlett-Packard's Computer Research Center. The system's main components are: a Generalized Phrase Structure Grammar (GPSG); a top-down parser; a logic transducer that outputs a first-order logical representation; and a "disambiguator" that uses sortal information to convert "normal-form" first-order logical expressions into the query language for HIRE, a relational database hosted in the SPHERE system. We argue that theoretical developments in GPSG syntax and in Montague semantics have specific advantages to bring to this domain of computational linguistics. The syntax and semantics of the system are totally domain-independent, and thus, in principle, highly portable.

… can be achieved without detailed syntactic analysis. There is, of course, a massive pragmatic component to human linguistic interaction. But we hold that pragmatic inference makes use of a logically prior grammatical and semantic analysis. This can be fruitfully modeled and exploited even in the complete absence of any modeling of pragmatic inferencing capability. However, this does not entail an incompatibility between our work and research on modeling discourse organization and conversational interaction directly. Ultimately, a successful language understanding system will require both kinds of research, combining the advantages of precise, grammar-driven analysis of utterance structure and pragmatic inferencing based on discourse structures and knowledge of the world. We stress, however, that our concerns at this…
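The abstract's pipeline centres on a top-down parser driven by phrase structure rules. Below is a tiny, self-contained recursive-descent recognizer for an invented miniature grammar, included only to illustrate what "top-down parsing with a phrase structure grammar" means; the HP system's actual GPSG rules, feature machinery, and logic transducer were far richer.

```python
# Toy top-down (recursive-descent) recognizer for a miniature phrase structure grammar.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"], ["PN"]],
    "VP": [["V", "NP"], ["V"]],
}
LEXICON = {
    "Det": {"the", "a"},
    "N":   {"manager", "employee"},
    "PN":  {"HIRE"},
    "V":   {"lists", "works"},
}

def parse(symbol, words, i):
    """Return every input position reachable after expanding `symbol` from position i."""
    if symbol in LEXICON:                       # terminal category
        return [i + 1] if i < len(words) and words[i] in LEXICON[symbol] else []
    positions = []
    for rhs in GRAMMAR[symbol]:                 # try each rule, expanding top-down
        frontier = [i]
        for child in rhs:
            frontier = [k for j in frontier for k in parse(child, words, j)]
        positions.extend(frontier)
    return positions

def recognizes(sentence):
    words = sentence.split()
    return len(words) in parse("S", words, 0)

print(recognizes("the manager lists the employee"))   # True
print(recognizes("manager the lists"))                 # False
```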
Psychoacoustics, Speech Perception, Language Structure and Neurolinguistics
David House: Psychoacoustics, speech perception, language structure and neurolinguistics (2018.01.25)

Hearing acuity
• Sensitive to sounds from 20 to 20 000 Hz
• Greatest sensitivity between 1000 and 6000 Hz
• Non-linear perception of frequency intervals, e.g. octaves:
  – 100 Hz - 200 Hz - 400 Hz - 800 Hz - 1600 Hz
  – 100 Hz vs. 800 Hz perceived as a large difference
  – 3100 Hz vs. 3800 Hz perceived as a small difference

Absolute auditory threshold

Demo: SPL (sound pressure level), dB
• Decreasing noise levels
  – 6 dB steps, 10 steps, 2*
  – 3 dB steps, 15 steps, 2*
  – 1 dB steps, 20 steps, 2*

Constant loudness levels in phons

Demo: SPL and loudness (phons)
• 50-100-200-400-800-1600-3200-6400 Hz
  – 1: constant SPL 40 dB, 2*
  – 2: constant 40 phons, 2*

Critical bands
• Bandwidth increases with frequency
  – 200 Hz (critical bandwidth 50 Hz)
  – 800 Hz (critical bandwidth 80 Hz)
  – 3200 Hz (critical bandwidth 200 Hz)

Critical bands demo

Effects of masking
• Fm = 200 Hz (critical bandwidth 50 Hz): B = 300, 204, 141, 99, 70, 49, 35, 25, 17, 12 Hz
• Fm = 800 Hz (critical bandwidth 80 Hz): B = 816, 566, 396, 279, 197, 139, 98, 69, 49, 35 Hz
• Fm = 3200 Hz (critical bandwidth 200 Hz): B = 2263, 1585, 1115, 786, 555, 392, 277, 196, 139, 98 Hz

Effects of masking
• Low frequencies more effectively mask high frequencies
• Demo: how…

Holistic vs. analytic listening
• Demo 1: audible harmonics (1-5)
• Demo 2: melody with harmonics
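A short Python sketch of the arithmetic behind two points in these slides: octave steps double the frequency (100, 200, 400, ... Hz), and sound pressure level in dB is logarithmic, so a 6 dB change corresponds to roughly a halving or doubling of sound pressure. The formulas are standard definitions; the function names are mine, not from the slides.

```python
import math

def spl_db(pressure_pa, reference_pa=20e-6):
    """Sound pressure level in dB re 20 µPa (the conventional reference pressure)."""
    return 20 * math.log10(pressure_pa / reference_pa)

def octave_series(start_hz, steps):
    """Each octave doubles the frequency: 100, 200, 400, 800, 1600 Hz, ..."""
    return [start_hz * 2 ** k for k in range(steps)]

print(octave_series(100, 5))            # [100, 200, 400, 800, 1600]
print(round(spl_db(20e-6 * 1000)))      # 60: a factor of 1000 in pressure is 60 dB
print(round(20 * math.log10(0.5), 1))   # -6.0: halving the pressure lowers the level ~6 dB
```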
Derivation and Representation in Modern Transformational Syntax
Derivation and Representation in Modern Transformational Syntax
Howard Lasnik

0 Introduction
The general issue of derivational vs. representational approaches to syntax has received considerable attention throughout the history of generative grammar. Internal to the major derivational approach, transformational grammar, a related issue arises: are well-formedness conditions imposed specifically at the particular levels of representation made available in the theory, or are they imposed “internal” to the derivation leading to those levels? This second question will be the topic of this chapter. Like the first question, it is a subtle one, perhaps even more subtle than the first, but since Chomsky (1973), there has been increasing investigation of it, and important argument and evidence have been brought to bear. I will be examining a range of arguments, some old and some new, concerning (i) locality constraints on movement, and (ii) the property forcing (overt) movement, especially as this issue arises within the “minimalist” program of Chomsky (1995b).

1 Locality of Movement: Subjacency and the ECP
The apparent contradiction between unbounded movement as in (1) and strict locality as in (2) has long played a central role in syntactic investigations in general and studies of the role of derivation in particular:
(1) Who do you think Mary said John likes
(2) ?*Who did you ask whether Mary knows why John likes
Based on the unacceptability of such examples as (2), Chomsky (1973), rejecting earlier views (including the highly influential Ross 1967a), proposed that long-distance movement is never possible. (1) must then be the result of a series of short movements, short movements that are, for some reason, not available in the derivation of (2).
66 EMPTY CATEGORIES in TRANSFORMATIONAL RULES Abstract
JL3T: Journal of Linguistics, Literature & Language Teaching, Vol. V, No. 1, June 2019

Empty Categories in Transformational Rules
Ruly Adha, IAIN Langsa, [email protected]

Abstract
There are three theories that are always developed in any study of language, namely the theory of language structure, the theory of language acquisition, and the theory of language use. Among those three theories, the theory of language structure is regarded as the most important one. It is assumed that if someone knows the structure of a language, he/she can develop theories about how the language is acquired and used. This made Chomsky interested in developing the theory of language structure. Chomsky introduced a theory of grammar called Transformational Generative Grammar, or Transformational Syntax. Transformational Syntax is a method of sentence formation which applies some syntactic rules (also called transformational rules). Transformational rules consist of three types, namely movement transformation, deletion transformation, and substitution transformation. When those transformational rules are applied in a sentence, they will leave empty categories. Empty categories can be in the form of Complementizer (Comp), Trace, and PRO. The objectives of this article are to elaborate those empty categories; to show their appearance in the transformational rules; and to describe the characteristics of each empty category. Comp, Trace, and PRO can be found in movement and deletion transformations. In this article, tree diagrams and bracketing are used as the methods of sentence analysis.

Keywords: Empty Categories, Transformational Rules, Comp, Trace, PRO

Introduction
Chomsky (in Radford, 1988) states that there are three inter-related theories which any detailed study of language ultimately seeks to develop, namely: 1. …
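To give a concrete feel for the bracketing method the abstract mentions, here is a small Python sketch with invented labelled bracketings containing the three empty categories (a trace, a non-overt Comp, and PRO) and a helper that simply reports which markers occur in each string. The example sentences, notation, and function are illustrative only and are not taken from the article.

```python
# Hypothetical labelled bracketings, one per empty category named in the abstract.
BRACKETINGS = {
    "wh-movement leaves a trace":
        "[CP Who_i [C' did [TP you [VP see [DP t_i]]]]]",
    "non-overt complementizer (Comp)":
        "[TP I [VP think [CP [C e] [TP she left]]]]",
    "understood subject of an infinitive (PRO)":
        "[TP John [VP wants [CP [TP PRO to leave]]]]",
}

EMPTY_MARKERS = ("t_", "PRO", "[C e]")

def empty_categories(bracketing):
    """Report which empty-category markers appear in a labelled bracketing."""
    return [marker for marker in EMPTY_MARKERS if marker in bracketing]

for description, example in BRACKETINGS.items():
    print(description, "->", empty_categories(example))
```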
Phrase Structure Rules, Tree Rewriting, and Other Sources of Recursion Structure Within the NP
Introduction to Transformational Grammar, LINGUIST 601, September 14, 2006
Phrase Structure Rules, Tree Rewriting, and other sources of Recursion; Structure within the NP

1 Trees
(1) a tree for ‘the brown fox sings’
[Tree diagram: A branches into B and C (“sings”); B branches into D (“the”) and E; E branches into F (“brown”) and G (“fox”). Equivalently: [A [B [D the] [E [F brown] [G fox]]] [C sings]]]

• Linguistic trees have nodes. The nodes in (1) are A, B, C, D, E, F, and G.
• There are two kinds of nodes: internal nodes and terminal nodes. The internal nodes in (1) are A, B, and E. The terminal nodes are C, D, F, and G. Terminal nodes are so called because they are not expanded into anything further. The tree ends there. Terminal nodes are also called leaf nodes. The leaves of (1) are really the words that constitute the sentence ‘the brown fox sings’, i.e. ‘the’, ‘brown’, ‘fox’, and ‘sings’.

(2) a. A set of nodes form a constituent iff they are exhaustively dominated by a common node.
    b. X is a constituent of Y iff X is dominated by Y.
    c. X is an immediate constituent of Y iff X is immediately dominated by Y.

Notions such as subject, object, prepositional object etc. can be defined structurally. So a subject is the NP immediately dominated by S and an object is an NP immediately dominated by VP etc.

(3) a. If a node X immediately dominates a node Y, then X is the mother of Y, and Y is the daughter of X.
    b. A set of nodes are sisters if they are all immediately dominated by the same (mother) node.
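The tree in (1) and the relations in (2)-(3) translate directly into a small data structure. The Python sketch below is my own illustration (the node labels follow the handout; the class and method names do not): it encodes the tree and checks dominance, immediate dominance, and motherhood as defined above.

```python
class Node:
    def __init__(self, label, children=(), word=None):
        self.label, self.children, self.word = label, list(children), word

    def dominates(self, other):
        """X dominates Y iff Y is a daughter of X or is dominated by one of X's daughters."""
        return any(child is other or child.dominates(other) for child in self.children)

    def immediately_dominates(self, other):
        return other in self.children

    def mother_of(self, other):
        """Return the node that immediately dominates `other`, if any (its mother)."""
        if self.immediately_dominates(other):
            return self
        for child in self.children:
            found = child.mother_of(other)
            if found:
                return found
        return None

# (1) the tree for 'the brown fox sings'
D = Node("D", word="the"); F = Node("F", word="brown"); G = Node("G", word="fox")
C = Node("C", word="sings")
E = Node("E", [F, G]); B = Node("B", [D, E]); A = Node("A", [B, C])

print(A.dominates(F))               # True: A dominates every other node
print(B.immediately_dominates(E))   # True: B is the mother of D and E
print(A.mother_of(G) is E)          # True: G's mother is E, so F and G are sisters
```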
Transformational Grammar and Problems of Syntactic Ambiguity in English
Central Washington University, ScholarWorks@CWU
All Master's Theses, 1968

Transformational Grammar and Problems of Syntactic Ambiguity in English
Shirlie Jeanette Verley, Central Washington University

Follow this and additional works at: https://digitalcommons.cwu.edu/etd
Part of the Liberal Studies Commons, and the Scholarship of Teaching and Learning Commons

Recommended Citation
Verley, Shirlie Jeanette, "Transformational Grammar and Problems of Syntactic Ambiguity in English" (1968). All Master's Theses. 976. https://digitalcommons.cwu.edu/etd/976

This Thesis is brought to you for free and open access by the Master's Theses at ScholarWorks@CWU. It has been accepted for inclusion in All Master's Theses by an authorized administrator of ScholarWorks@CWU. For more information, please contact [email protected].

TRANSFORMATIONAL GRAMMAR AND PROBLEMS OF SYNTACTIC AMBIGUITY IN ENGLISH
A Thesis Presented to the Graduate Faculty, Central Washington State College, in Partial Fulfillment of the Requirements for the Degree Master of Education
by Shirlie Jeanette Verley, August 1968

APPROVED FOR THE GRADUATE FACULTY: D. W. Cummings, Committee Chairman; Lyman B. Hagen; Donald G. Goetschius

ACKNOWLEDGMENTS
I wish to thank, first, my chairman, Dr. D. W. Cummings, for his assistance and guidance in the writing of this thesis. Next, I wish to thank Dr. Lyman Hagen and Dr. Don Goetschius for their participation on the committee. Last, my very personal thanks to my family, Gene, Sue, Steve and Shane, for their patience.

TABLE OF CONTENTS
CHAPTER PAGE
I. …
The Syntactic Power of Transformational Rules in Shona
Journal of Literature, Languages and Linguistics (www.iiste.org), ISSN 2422-8435, An International Peer-reviewed Journal, Vol. 20, 2016

The Syntactic Power of Transformational Rules in Shona
Isaac Mhute, Faculty of Arts and Education, Department of Language and Media Studies, Zimbabwe Open University, Masvingo Regional Campus

Abstract
The paper ventures into the linguistic branch of syntax with the goal of finding out how much power transformational rules possess in Shona sentences. Focus is on the effects that the application of various transformational rules may have on the various components of the sentences. It has been realised that transformational rules have the power to rearrange, delete or add some words into sentences. In the process, forms of the words may also be altered for various reasons. It was also demonstrated that at times one finds himself bound to apply certain rules or risk making people draw various implications in response.

Keywords: Shona, syntactic, transformation, power, linguistic, rules

Introduction and Brief Literature Review
Fowler (1971) asserts that the grammarian's task is to describe sentences in such a way as to account for what mature native speakers of the language know about those sentences. Fowler further observes that one kind of generative grammar that performs these tasks particularly well is called a transformational grammar. Transformational grammar is a grammar that recognizes the deep structure and surface structure distinction in syntax and employs certain formally distinctive kinds of linguistic rules linking the two distinct levels. Fowler (ibid.) also describes transformational rules as rules that relate sentences to each other, adding that this is done by relating partially overlapping derivations.