Government-Binding/Principles and Parameters Theory
Recommended publications
Government and Binding Theory
Introduction: The Chomskian Perspective on Language Study

Contents
Introduction
1 Linguistics: the science of language
2 The native speaker: grammaticality and acceptability
  2.1 Descriptive adequacy
  2.2 Grammaticality and acceptability
  2.3 The grammar as a system of principles
3 Knowledge of language
  3.1 The poverty of the stimulus
  3.2 Universal grammar
  3.3 Parameters and universal grammar
  3.4 Language learning and language acquisition
  3.5 The generative linguist
4 The new comparative syntax
  4.1 Principles and parameters: a recapitulation
  4.2 The pro-drop properties
  4.3 Relating the properties
  4.4 Agreement and pro-drop
5 Purpose and organization of the book
  5.1 General purpose
  5.2 Organization
6 Exercises

Introduction
The aim of this book is to offer an introduction to the version of generative syntax usually referred to as Government and Binding Theory.1 I shall not dwell on this label here; its significance will become clear in later chapters of this book. Government-Binding Theory is a natural development of earlier versions of generative grammar, initiated by Noam Chomsky some thirty years ago. The purpose of this introductory chapter is not to provide a historical survey of the Chomskian tradition. A full discussion of the history of the generative enterprise would in itself be the basis for a book.2 What I shall do here is offer a short and informal sketch of the essential motivation for the line of enquiry to be pursued. Throughout the book the initial points will become more concrete and more precise. By means of footnotes I shall also direct the reader to further reading related to the matter at hand.
On Binding Asymmetries in Dative Alternation Constructions in L2 Spanish
On Binding Asymmetries in Dative Alternation Constructions in L2 Spanish
Silvia Perpiñán and Silvina Montrul
University of Illinois at Urbana-Champaign

1. Introduction
Ditransitive verbs can take a direct object and an indirect object. In English and many other languages, the order of these objects can be altered, giving as a result the Dative Construction on the one hand (I sent a package to my parents) and the Double Object Construction (I sent my parents a package, hereafter DOC) on the other. However, not all ditransitive verbs can participate in this alternation.

The study of the English dative alternation has been a recurrent topic in the language acquisition literature. This argument-structure alternation is widely recognized as an exemplar of the poverty of the stimulus problem: from a limited set of data in the input, the language acquirer must somehow determine which verbs allow the alternating syntactic forms and which ones do not: you can give money to someone and donate money to someone; you can also give someone money but you definitely cannot *donate someone money.

Since Spanish, apparently, does not allow the DOC (give someone money), L2 learners of Spanish whose mother tongue is English have to become aware of this restriction in Spanish, without negative evidence. However, it has been noticed by Demonte (1995) and Cuervo (2001) that Spanish has a DOC which is not identical to, but shares syntactic and, crucially, interpretive restrictions with, the English counterpart. Moreover, within the Spanish Dative Construction, the order of the objects can also be inverted without superficial morpho-syntactic differences (Pablo mandó una carta a la niña 'Pablo sent a letter to the girl' vs.
Circumstantial Evidence for Syntactic Head Movement
Circumstantial Evidence for Syntactic Head Movement
Bartosz Wiland
University of Poznań

1. Introduction
Recently, a number of analyses have advanced a thesis that syntactic heads are immobile and that head movement does not exist in grammar (cf. Mahajan 2001, 2003; Müller 2004, a.o.) or is severely restricted (e.g. Koopman and Szabolcsi 2000, Nilsen 2003). Such approaches take dislocation of the head X⁰ to be an instance of a remnant movement of the XP-constituent, preceded by vacating movements of other members of the XP. Detrimental to the claim that head movement does not exist is a scenario in which a dislocation of X⁰ is followed by a remnant movement of the XP-constituent. Such a derivational scenario is outlined in (1).

(1) a. [YP Y⁰ [ΣP Σ⁰ [XP X⁰ ZP]]]
    b. [YP Y⁰ [ΣP X⁰ + Σ⁰ [XP tX⁰ ZP]]]
    c. [YP [XP tX⁰ ZP] [Y′ Y⁰ [ΣP X⁰ + Σ⁰ tXP]]]

The only possibility of dislocating the head X⁰ before remnant XP-fronting (in (1c)) is by X⁰-movement (in (1b)). In this paper, I argue that the derivational scenario in (1) is attested in Polish and it allows us to explain the interpretive contrast between (2a-d) and (2e).

(2) a. Jan znowu posłał Marii książkę. (repetitive)
       Jan-NOM again sent Mary-DAT book-ACC
    b. Jan znowu Marii posłał książkę. (repetitive)
       Jan-NOM again Mary-DAT sent book-ACC
    c. Jan znowu książkę posłał Marii. (repetitive)
       Jan-NOM again book-ACC sent Mary-DAT
    d.
A Framework for Grammatical Features II
e-Content Submission to INFLIBNET
Subject name: Linguistics
Paper name: Grammatical Categories
Paper Coordinator name and contact: Ayesha Kidwai
Module name: A Framework for Grammatical Features II
Content Writer (CW): Ayesha Kidwai
Email id: [email protected]
Phone: 9968655009

Table of Contents
1. Introduction
2. Features and Values
3. Contextual Features
  3.1 Government
  3.2 Agreement
4. A formal description of features and their values
5. Conclusion
References

1. Introduction
In this unit, we adopt (and adapt) the typology of features developed by Kibort (2008) (but not necessarily all her analyses of individual features) as the descriptive device we shall use to describe grammatical categories in terms of features. Sections 2 and 3 are devoted to this exercise, while Section 4 specifies the annotation schema we shall employ to denote features and values.

2. Features and Values
Intuitively, a feature is expressed by a set of values, and is really known only through them. For example, a statement that a language has the feature [number] can be evaluated to be true only if the language can be shown to express some of the values of that feature: SINGULAR, PLURAL, DUAL, PAUCAL, etc. In other words, recalling our definitions of distribution in Unit 2, a feature creates a set out of values that are in contrastive distribution, by employing a single parameter (meaning or grammatical function) that unifies these values. The name of the feature is the property that is used to construct the set. Let us therefore employ (1) as our first working definition of a feature:

(1) A feature names the property that unifies a set of values in contrastive distribution.
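Definition (1) can be made concrete in a short sketch. The encoding below (plain Python sets keyed by feature name) and the sample feature inventory are illustrative assumptions, not part of the module:

```python
# Working definition (1): a feature names the property that unifies
# a set of values in contrastive distribution.
# The inventory here is illustrative, not exhaustive.
features = {
    "number": {"SINGULAR", "PLURAL", "DUAL", "PAUCAL"},
    "gender": {"MASCULINE", "FEMININE", "NEUTER"},
}

def feature_of(value, feature_table):
    """Return the name of the feature (the unifying property) a value realizes."""
    for name, values in feature_table.items():
        if value in values:
            return name
    raise KeyError(f"{value!r} realizes no known feature")

def contrastive(v1, v2, feature_table):
    """Two values are in contrastive distribution when a single
    parameter (one feature) unifies them as alternatives."""
    return feature_of(v1, feature_table) == feature_of(v2, feature_table)
```

On this encoding, SINGULAR and PLURAL contrast because the same property, [number], unifies them, while PLURAL and FEMININE belong to different sets and so do not contrast.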
Wh-Concord in Okinawan = Syntactic Movement + Morphological Merger
University of Pennsylvania Working Papers in Linguistics
Volume 22, Issue 1: Proceedings of the 39th Annual Penn Linguistics Conference (2016), Article 20

Wh-Concord in Okinawan = Syntactic Movement + Morphological Merger
Kunio Kinjo and Yohei Oseki

Recommended Citation
Kinjo, Kunio and Oseki, Yohei (2016) "Wh-Concord in Okinawan = Syntactic Movement + Morphological Merger," University of Pennsylvania Working Papers in Linguistics: Vol. 22, Iss. 1, Article 20. Available at: https://repository.upenn.edu/pwpl/vol22/iss1/20

Abstract
The main purpose of this paper is to provide a novel account for Wh-Concord in Okinawan based on the Copy Theory of Movement and Distributed Morphology. We propose that Wh-Concord interrogatives and Japanese-type wh-interrogatives have exactly the same derivation in the syntactic component: the Q-particle -ga, base-generated as adjoined to a wh-phrase, undergoes movement to the clause-final position. The two types of interrogatives are distinguished in the post-syntactic component: only in Wh-Concord, the -r morpheme on C0 triggers Morphological Merger, which makes it possible to Spell-Out the lower copy of -ga. It is shown that the proposed analysis correctly predicts three descriptive generalizations on the distribution of -ga in
Derivational Economy in Syntax and Semantics
Derivational Economy in Syntax and Semantics
Željko Bošković and Troy Messick

Summary
Economy considerations have always played an important role in the generative theory of grammar. They are particularly prominent in the most recent instantiation of this approach, the Minimalist Program, which explores the possibility that Universal Grammar is an optimal way of satisfying requirements imposed on the language faculty by the external systems that interface with it, and that the language faculty itself has an optimal, computationally efficient design. In this respect, the operations of the computational system that produce linguistic expressions must be optimal in that they must satisfy general considerations of simplicity and efficient design. Simply put, the guiding principles here are: do something only if you need to; and if you do need to, do it in the most economical/efficient way. These considerations ban superfluous steps in derivations and superfluous symbols in representations.

Under economy guidelines, movement takes place only when there is a need for it (with both syntactic and semantic considerations playing a role here), and when it does take place, it takes place in the most economical way: it is as short as possible and carries as little material as possible. Furthermore, economy is evaluated locally, on the basis of immediately available structure. The locality of syntactic dependencies is also enforced by minimal search and by limiting the number of syntactic objects and the amount of structure that are accessible in the derivation. This is achieved by transferring parts of syntactic structure to the interfaces during the derivation, the transferred parts not being accessible for further syntactic operations.
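The guidelines summarized above (no superfluous steps; shortest possible movement) can be caricatured computationally. The sketch below is my own construal for illustration, not the authors' formalism: derivations are lists of steps, and candidates that converge on the same output are compared lexicographically by step count and total movement length.

```python
def cost(derivation):
    """A derivation is a list of (operation, distance) steps; distance
    is 0 for non-movement operations such as Merge."""
    n_steps = len(derivation)
    move_len = sum(dist for _, dist in derivation)
    return (n_steps, move_len)  # fewer steps first, then shorter movement

def most_economical(derivations):
    """Among convergent derivations, pick the cheapest one."""
    return min(derivations, key=cost)

d1 = [("merge", 0), ("move", 2)]                 # shorter movement
d2 = [("merge", 0), ("move", 5)]                 # longer movement
d3 = [("merge", 0), ("merge", 0), ("move", 2)]   # superfluous step
```

Here d1 wins over d2 (its movement is shorter) and over d3 (which takes a superfluous step), mirroring the "do it only if needed, and then as economically as possible" guideline.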
Syntactic Movement 24.900/R02
Syntactic movement
Introduction to Linguistics (24.900), Instructor: Prof. Adam Albright
Section R02, Suzana Fong ([email protected])
Recitation #6, October 19, 2018

1 Overview and recap
Recall: this is the structure that we have for sentences:

(1) [CP [C′ C [IP NP [I′ I [VP (modifier(s)) [V′ V (XP)]]]]]]

Where XP is the complement of the verb; it can be an NP, a PP, or a CP. This gives us a neat structure to capture the linear order of English interrogative sentences:

(2) Yes/No question (closed-ended question)
    a. The dog will eat the watermelon.
    b. Will the dog eat the watermelon?

(3) Constituent question (open-ended question)
    a. Rosa has criticized Ana.
    b. Who has Rosa criticized __?

Hypothesis: auxiliaries and interrogative phrases (what, which picture, etc.) are born in the position that their non-interrogative counterparts occupy and then they are moved to C and Spec-CP, respectively.

• Movement: an operation that takes a constituent (phrase or head) and places it elsewhere in the sentence.
• Consequence of movement: constituents can be interpreted in one position (e.g. complement of VP), but pronounced in another (e.g. Spec-CP). This property of languages is called displacement.

Formalizing movement
(4) a. Take an element α (word or phrase) and move it to an eligible higher position P.
       i. P must be empty (no overt morpheme).
       ii. P must be a suitable host for the type of element (heads to heads/argument positions, phrases to modifier positions)
    b.
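The movement operation in (4) can be sketched as a function over simple bracketed trees. The list encoding, the function names, and the example derivation below are illustrative assumptions, not the course's own notation:

```python
# Movement: take a constituent, leave a trace 't' in its base position,
# and re-attach it in a higher landing site (here, Spec-CP).
def move(tree, target, landing_label):
    def leave_trace(node):
        # Rebuild the tree, swapping the moved element for a trace.
        if isinstance(node, list):
            return ["t" if child == target else leave_trace(child)
                    for child in node]
        return node
    return [landing_label, target, leave_trace(tree)]

# Deriving (3b) "Who has Rosa criticized?" from the base order in (3a),
# showing only wh-movement to Spec-CP (aux movement to C omitted):
base = ["IP", "Rosa", ["I'", "has", ["VP", ["V'", "criticized", "who"]]]]
moved = move(base, "who", "Spec-CP")
```

The displaced element is interpreted at the trace position (complement of V) but pronounced in Spec-CP, which is exactly the displacement property noted above.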
Language Structure: Phrases
Language Structure: Phrases

"Productivity": a property of Language
• Definition: Language is an open system. We can produce potentially an infinite number of different messages by combining elements differently.
• Example: Words into phrases.

An Example of Productivity
• Human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features. (24 words)
• I think that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (33 words)
• I have always thought, and I have spent many years verifying, that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (42 words)
• Although some mainstream people might not agree with me, I have always thought…

Creating Infinite Messages
• Discrete elements: Words, Phrases
• Selection: Ease, Meaning, Identity
• Combination: Rules of organization

Models of Word Recombination
1. Word chains (Markov model): Phrase-level meaning is derived from understanding each word as it is presented in the context of immediately adjacent words.
2. Hierarchical model: There are long-distance dependencies between words in a phrase, and these inform the meaning of the entire phrase.

Markov Model
Rule: Select and concatenate (according to meaning and what types of words should occur next to each other).
(Figure: word-chain diagram linking the words man, bites, jumps, over, house.)

Markov Model
• Assumption: Only adjacent words are meaningfully (and lawfully) related.
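The contrast between the two models can be made concrete with a toy word chain. The vocabulary and transition table below are illustrative assumptions echoing the words in the slide's diagram, not part of the original lecture:

```python
# Markov (word-chain) model: each word is licensed only by its
# immediate predecessor, per the adjacency assumption above.
transitions = {
    "man":   ["bites", "jumps"],
    "dog":   ["bites", "jumps"],
    "bites": ["man", "dog"],
    "jumps": ["over"],
    "over":  ["man", "dog", "house"],
    "house": [],
}

def licensed(sentence):
    """A word chain is well-formed iff every adjacent pair is licensed."""
    words = sentence.lower().split()
    return all(nxt in transitions.get(cur, [])
               for cur, nxt in zip(words, words[1:]))
```

A hierarchical model would instead license whole phrases, capturing the long-distance dependencies that this purely adjacent check cannot see.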
A PROLOG Implementation of Government-Binding Theory
A PROLOG Implementation of Government-Binding Theory
Robert J. Kuhns
Artificial Intelligence Center
Arthur D. Little, Inc.
Cambridge, MA 02140 USA

Abstract
A parser which is founded on Chomsky's Government-Binding Theory and implemented in PROLOG is described. By focussing on systems of constraints as proposed by this theory, the system is capable of parsing without an elaborate rule set and subcategorization features on lexical items. In addition to the parse, theta, binding, and control relations are determined simultaneously.

1. Introduction
A number of recent research efforts have explicitly grounded parser design on linguistic theory (e.g., Bayer et al. (1985), Berwick and Weinberg (1984), Marcus (1980), Reyle and Frey (1983), and Wehrli (1983)). Although many of these parsers are based on generative grammar, and transformational grammar in particular, with few exceptions (Wehrli (1983)) the modular approach as suggested by this theory has been lagging (Barton (1984)). Moreover, Chomsky (1986) has recently suggested that rule-based

For the purposes (and space limitations) of this paper we only briefly describe the theories of X-bar, Theta, Control, and Binding. We also will present three principles, viz., Theta-Criterion, Projection Principle, and Binding Conditions.

2.1 X-Bar Theory
X-bar theory is one part of GB-theory which captures cross-categorial relations and specifies the constraints on underlying structures. The two general schemata of X-bar theory are:

(1) a. XP → Specifier X′
    b. X′ → X Complement

The types of categories that may precede or follow a head are similar, and Specifier and Complement represent this commonality of the pre-head and post-head categories, respectively.
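The point of the two schemata is their cross-categorial generality: the same projection rules serve any head. The tuple encoding below is a hypothetical sketch in Python, not the paper's PROLOG encoding:

```python
# Project any head X to X' (head + complement, schema (1b)) and then
# to XP (specifier + X', schema (1a)); the same rules serve N, V, P, ...
def xp(head_cat, specifier=None, complement=None):
    x_bar = (f"{head_cat}'", head_cat, complement)   # (1b)
    return (f"{head_cat}P", specifier, x_bar)        # (1a)

np_tree = xp("N", specifier="the")       # e.g. a simple NP
vp_tree = xp("V", complement=np_tree)    # a V head taking that NP as complement
```

The Specifier slot precedes the head and the Complement slot follows it, mirroring the commonality of pre-head and post-head categories that the schemata are meant to capture.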
HPSG and Lexical Functional Grammar
Chapter 30
HPSG and Lexical Functional Grammar
Stephen Wechsler, The University of Texas
Ash Asudeh, University of Rochester & Carleton University

This chapter compares two closely related grammatical frameworks, Head-Driven Phrase Structure Grammar (HPSG) and Lexical Functional Grammar (LFG). Among the similarities: both frameworks draw a lexicalist distinction between morphology and syntax, both associate certain words with lexical argument structures, both employ semantic theories based on underspecification, and both are fully explicit and computationally implemented. The two frameworks make available many of the same representational resources. Typical differences between the analyses proffered under the two frameworks can often be traced to concomitant differences of emphasis in the design orientations of their founding formulations: while HPSG's origins emphasized the formal representation of syntactic locality conditions, those of LFG emphasized the formal representation of functional equivalence classes across grammatical structures. Our comparison of the two theories includes a point by point syntactic comparison, after which we turn to an exposition of Glue Semantics, a theory of semantic composition closely associated with LFG.

1 Introduction
Head-Driven Phrase Structure Grammar is similar in many respects to its sister framework, Lexical Functional Grammar or LFG (Bresnan et al. 2016; Dalrymple et al. 2019). Both HPSG and LFG are lexicalist frameworks in the sense that they distinguish between the morphological system that creates words and the syntax proper that combines those fully inflected words into phrases and sentences.

Stephen Wechsler & Ash Asudeh. 2021. HPSG and Lexical Functional Grammar. In Stefan Müller, Anne Abeillé, Robert D. Borsley & Jean-Pierre Koenig (eds.), Head-Driven Phrase Structure Grammar: The handbook.
Syntax III Assignment I (Team B): Transformations
Syntax III Assignment I (Team B): Transformations
Julie Winkler, Andrew Pedelty, Kelsey Kraus, Travis Heller, Josue Torres, Nicholas Primrose, Philip King

A layout of this paper:
I. Transformations! An Introduction
   a. What
   b. Why
   c. What Not
II. A-Syntax: Operations on Arguments Pre-Surface Structure
III. Morphology: Form and Function
   a. Lexical assumptions
   b. VP Internal Subject Raising Hypothesis
   c. Form Rules
   d. Morpho-Syntactic Rules
   e. Further Questions
IV. A-Bar Syntax
   a. Not A-Syntax: Movement on the Surface
   b. Islands
   c. Features & Percolation: Not Just for Subcats
   d. Successive Cyclic Movement
V. Where Do We Go From Here?
   a. Summary

I. Introduction
Before going into the specifics of transformational syntax as we've learned it, it behooves us to consider transformations on a high level. What are they, how do they operate, and, very importantly, why do we have them? We should also make a distinction regarding the purpose of this paper, which is not to provide a complete or authoritative enumeration of all transformational operations (likely an impossible endeavour), but rather to investigate those operations and affirm our understanding of the framework in which they exist.

I.a. What
What are transformations? A transformation (Xn) takes an existing syntactic structure and renders a new construction by performing one or more of the following three operations upon it: movement, deletion, or insertion. These operations are constrained by a number of rules and traditionally adhered-to constraints. Furthermore, it has been mentioned that semi-legitimate syntactic theories have eschewed certain of these operations while retaining sufficient descriptive power.
Lexical-Functional Grammar and Order-Free Semantic Composition
COLING 82, J. Horecký (ed.)
North-Holland Publishing Company © Academia, 1982

Lexical-Functional Grammar and Order-Free Semantic Composition
Per-Kristian Halvorsen
Norwegian Research Council's Computing Center for the Humanities and Center for Cognitive Science, MIT

This paper summarizes the extension of the theory of lexical-functional grammar to include a formal, model-theoretic semantics. The algorithmic specification of the semantic interpretation procedures is order-free, which distinguishes the system from other theories providing model-theoretic interpretation for natural language. Attention is focused on the computational advantages of a semantic interpretation system that takes as its input functional structures as opposed to syntactic surface-structures.

A pressing problem for computational linguistics is the development of linguistic theories which are supported by strong independent linguistic argumentation, and which can, simultaneously, serve as a basis for efficient implementations in language processing systems. Linguistic theories with these properties make it possible for computational implementations to build directly on the work of linguists both in the area of grammar-writing, and in the area of theory development (cf. universal conditions on anaphoric binding, filler-gap dependencies, etc.). Lexical-functional grammar (LFG) is a linguistic theory which has been developed with equal attention being paid to theoretical linguistic and computational processing considerations (Kaplan & Bresnan 1981). The linguistic theory has ample and broad motivation (vide the papers in Bresnan 1982), and it is transparently implementable as a syntactic parsing system (Kaplan & Halvorsen forthcoming). LFG takes grammatical relations to be of primary importance (as opposed to the transformational theory where grammatical functions play a subsidiary role).