The Minimalist Program
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
Lisa Pearl LING200, Summer Session I, 2004 Syntax – Sentence Structure
I. Syntax
   A. Gives us the ability to say something new with old words. Though you have seen all the words that I’m using in this sentence before, it’s likely you haven’t seen them put together in exactly this way before. But you’re perfectly capable of understanding the sentence, nonetheless.
   B. Structure: how to put words together.
   C. Grammatical structure: a way to put words together which is allowed in the language.
      a. I often paint my nails green or blue.
      b. *Often green my nails I blue paint or.
II. Syntactic Categories
   A. How to tell which category a word belongs to: look at the type of meaning the word has, the type of affixes which can adjoin to the word, and the environments you find the word in.
   B. Meaning: tells lexical category from nonlexical category.
      a. lexical – meaning is fairly easy to paraphrase (vampire ≈ “night time creature that drinks blood”; dark ≈ “without much light”; drink ≈ “ingest a liquid”)
      b. nonlexical – meaning is much harder to pin down (and ≈ ??; the ≈ ??; must ≈ ??)
      c. lexical categories: Noun, Verb, Adjective, Adverb, Preposition
         i. N: blood, truth, life, beauty (≈ entities, objects)
         ii. V: bleed, tell, live, remain (≈ actions, sensations, states)
         iii. A: gory, truthful, riveting, beautiful (≈ property/attribute of N), e.g. gory scene, riveting tale
         iv. Adv: sharply, truthfully, loudly, beautifully (≈ property/attribute of V), e.g. speak sharply, dance beautifully
         v. P: to, from, inside, around, under (≈ indicates path or position w.r.t. N) […]
The Empirical Base of Linguistics: Grammaticality Judgments and Linguistic Methodology
Carson T. Schütze. 2019. The empirical base of linguistics: Grammaticality judgments and linguistic methodology (Classics in Linguistics 2). Berlin: Language Science Press. DOI: 10.17169/langsci.b89.101. This title can be downloaded at http://langsci-press.org/catalog/book/89 and is published under the Creative Commons Attribution 4.0 Licence (CC BY 4.0): http://creativecommons.org/licenses/by/4.0/. ISBN: 978-3-946234-02-9 (digital), 978-3-946234-03-6 (hardcover), 978-3-946234-04-3 (softcover), 978-1-523743-32-2. Also deposited as a UCLA Previously Published Work on eScholarship (permalink https://escholarship.org/uc/item/05b2s4wg, ISBN 978-3946234043, publication date 2016-02-01, peer reviewed). The Classics in Linguistics series (chief editors: Martin Haspelmath, Stefan Müller; ISSN 2366-374X) also includes Christian Lehmann, Thoughts on grammaticalization (vol. 1) and Derek Bickerton, Roots of language (vol. 3).
CAS LX 522 Syntax I
Week 14b: PRO and control.

“It is likely…”: in Mary is likely to leave, the EPP is satisfied in both clauses. The main clause has Mary in SpecIP; the embedded clause has the trace in SpecIP. This specific instance of A-movement, where we move a subject from an embedded clause to a higher clause, is generally called subject raising. [Tree diagrams from the slides are not reproduced here.]

Now, consider: Mary is reluctant to leave. This looks very similar to Mary is likely to leave. Can we draw the same kind of tree for it? How many θ-roles does reluctant assign? Reluctant has two θ-roles to assign: one to the one feeling the reluctance (Experiencer), and one to the proposition about which the reluctance holds (Proposition). Leave has one θ-role to assign: to the one doing the leaving (Agent).

In Mary is reluctant to leave, what θ-role does Mary get? Mary is doing the leaving, so she gets Agent from leave; Mary is showing the reluctance, so she gets Experiencer from reluctant. Reluctant assigns its θ-roles within AP as required, and Mary moves up to SpecIP in the main clause by Spellout. And we have a problem: Mary appears to be getting two θ-roles, in violation of the θ-criterion. But what gets the θ-role from leave, and what…
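As a rough aid to the reasoning above, here is a minimal Python sketch (my own illustration, not part of the course slides) of the θ-criterion check: each argument must receive exactly one θ-role. The assignments dictionary is a hypothetical encoding of the two-role problem just described.

```python
# Minimal, illustrative theta-criterion check (not from the course materials).
# Each argument is mapped to the theta-roles it would receive under the
# raising-style analysis of "Mary is reluctant to leave".
assignments = {
    "Mary": ["Agent (from leave)", "Experiencer (from reluctant)"],
}

def check_theta_criterion(assignments):
    """Each argument must bear exactly one theta-role."""
    for argument, roles in assignments.items():
        if len(roles) != 1:
            print(f"*{argument} receives {len(roles)} theta-roles: {roles}")
        else:
            print(f"{argument} receives {roles[0]} -- OK")

check_theta_criterion(assignments)
# Output flags Mary: analyzing "reluctant" exactly like "likely" (raising)
# would violate the theta-criterion, which is the problem the slides raise.
```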
Linguistics: An Introduction (Second Edition)
Andrew Radford, Martin Atkinson, David Britain, Harald Clahsen and Andrew Spencer (University of Essex). 2009. Linguistics: An Introduction, second edition. Cambridge: Cambridge University Press. Information on this title: www.cambridge.org/9780521849487. ISBN-13: 978-0-511-47924-3 (eBook, EBL), 978-0-521-84948-7 (hardback), 978-0-521-61478-8 (paperback). Contents (opening): list of illustrations; list of tables; preface to the second edition; a note for course organisers and class teachers; Introduction (Linguistics, Developmental linguistics, Psycholinguistics, Neurolinguistics, Sociolinguistics, Exercises, Further reading and references); Part I: Sounds (1 Introduction, 2 Sounds, …).
Chapter 3: Rethinking Remerge: Merge, Movement and Music (Hedde Zeijlstra, Georg-August-Universität Göttingen)
Chapter 3 Rethinking remerge: Merge, movement and music Hedde Zeijlstra Georg-August-Universität Göttingen In an influential paper, Katz & Pesetsky (2011) present the identity thesis for lan- guage and music, stating that “[a]ll formal differences between language and music are a consequence of differences in their fundamental building blocks (arbitrary pairings of sound and meaning in the case of language; pitch classes and pitch- class combinations in the case of music). In all other respects, language and music are identical.” Katz & Pesetsky argue that, just like syntactic structures, musical structures are generated by (binary) Merge, for which they provide a number of ar- guments: for instance, musical structures are endocentric (each instance of Merge in music, just like in language, has a labelling head). They also argue that move- ment phenomena (i.e., the application of Internal Merge) can be attested in both language and music. While fully endorsing the view that musical structures are the result of multiple applications of External (binary) Merge, this paper argues that the arguments in favour of the presence of Internal Merge in music are at best inconclusive and arguably incorrect. This is, however, not taken as an argument against the identity thesis for language and music; rather, I take it to follow from it: the identity thesis for language and music reduces all differences between language and music to its basic building blocks. If the application of Internal Merge in natural language is driven by uninterpretable features (cf. Chomsky 1995; 2001; Bošković 2007; Zeijlstra 2012) that are language-specific and not applicable to music (the rea- son being that only building blocks that are pairings of sound and meaning can be made up of interpretable and uninterpretable features), the direct consequence is that Internal Merge cannot be triggered in music either. -
Greek and Latin Roots, Prefixes, and Suffixes
This is a resource pack that I put together for myself to teach roots, prefixes, and suffixes as part of a separate vocabulary class (short weekly sessions). It is a combination of helpful resources that I have found on the web as well as some tips of my own (such as the simple lesson plan). Contents: Lesson Plan Ideas (Simple Lesson Plan for Word Study; Lesson Plan Idea 2); Background Information (Why Study Word Roots, Prefixes, and Suffixes?; Latin and Greek Word Elements; Latin Roots, Prefixes, and Suffixes); Root, Prefix, and Suffix Lists (List 1: MEGA root list; List 2: Roots, Prefixes, and Suffixes; List 3: Prefix List).
Language Structure: Phrases
“Productivity”, a property of language. Definition – Language is an open system: we can produce potentially an infinite number of different messages by combining elements differently. Example – Words into phrases.

An example of productivity:
• Human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features. (24 words)
• I think that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (33 words)
• I have always thought, and I have spent many years verifying, that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (42 words)
• Although mainstream some people might not agree with me, I have always thought…

Creating infinite messages: discrete elements (words, phrases); selection (ease, meaning, identity); combination (rules of organization).

Models of word recombination:
1. Word chains (Markov model): phrase-level meaning is derived from understanding each word as it is presented in the context of immediately adjacent words.
2. Hierarchical model: there are long-distance dependencies between words in a phrase, and these inform the meaning of the entire phrase.

Markov model. Rule: select and concatenate (according to meaning and what types of words should occur next to each other). [Word-chain diagram omitted; its nodes repeat the words man, bites, jumps, over, house.] Assumption: only adjacent words are meaningfully (and lawfully) related.
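To make the word-chain idea concrete, here is a minimal Python sketch (my own illustration, not from the original slides) of a bigram Markov model: each next word is chosen only on the basis of the immediately preceding word, which is exactly the adjacency assumption that the hierarchical model rejects. The tiny training set is invented for the example.

```python
import random
from collections import defaultdict

# Illustrative bigram (Markov) word-chain model: the next word depends
# only on the immediately preceding word, never on long-distance structure.
training = [
    "the man bites the dog",
    "the dog jumps over the house",
    "the man jumps over the dog",
]

chain = defaultdict(list)
for sentence in training:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        chain[prev].append(nxt)

def generate(start="the", max_len=8):
    """Select and concatenate, one adjacent word at a time."""
    words = [start]
    while len(words) < max_len and chain[words[-1]]:
        words.append(random.choice(chain[words[-1]]))
    return " ".join(words)

print(generate())
# A hierarchical model would instead build phrases, e.g. [the man] [bites [the dog]],
# so that non-adjacent words can be meaningfully related.
```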
From Merge and Move to Form Dependency*
M. Rita Manzini

0 Introduction
In this article, I shall first briefly present the minimalist framework that emerges from the recent work of Chomsky (1995). As I shall show, a number of classical problems in movement theory, concerning in particular locality in the sense of Manzini (1992), remain open under minimalism. What is more, while the operation Merge can be argued to be necessary on conceptual grounds alone, there is no analogous motivation for the operation Move, but only considerations of language use. I shall then outline a theory which substitutes Chomsky's (1995) movement rule with a rule of Form Dependency. As I shall show, this rule is sufficient to subsume Merge, thus effectively eliminating the Merge-Move dualism. Finally, I shall indicate how the classical problems in locality theory can also be solved within the version of minimalism that I propose.

1 Merge and Move
In the minimalist model of Chomsky (1995), each linguistic expression is characterized by two representations, a P(honological) F(orm) representation, interfacing with the articulatory-perceptual system(s), and a L(ogical) F(orm) representation, interfacing with the conceptual-intentional system(s). This represents a clear departure from previous models, even within the principles and parameters framework, which are characterized by at least two additional syntactic representations, namely D(eep)-S(tructure) and S(urface)-S(tructure). The minimalist model is obviously preferable on conceptual grounds, since in it linguistic representations are seen to simply match the interfaces between language and other cognitive systems. Furthermore, it is compatible with known empirical evidence and arguably superior in dealing with at least some of it.
Chapter 30: HPSG and Lexical Functional Grammar (Stephen Wechsler, The University of Texas; Ash Asudeh, University of Rochester & Carleton University)
This chapter compares two closely related grammatical frameworks, Head-Driven Phrase Structure Grammar (HPSG) and Lexical Functional Grammar (LFG). Among the similarities: both frameworks draw a lexicalist distinction between morphology and syntax, both associate certain words with lexical argument structures, both employ semantic theories based on underspecification, and both are fully explicit and computationally implemented. The two frameworks make available many of the same representational resources. Typical differences between the analyses proffered under the two frameworks can often be traced to concomitant differences of emphasis in the design orientations of their founding formulations: while HPSG’s origins emphasized the formal representation of syntactic locality conditions, those of LFG emphasized the formal representation of functional equivalence classes across grammatical structures. Our comparison of the two theories includes a point by point syntactic comparison, after which we turn to an exposition of Glue Semantics, a theory of semantic composition closely associated with LFG.

1 Introduction
Head-Driven Phrase Structure Grammar is similar in many respects to its sister framework, Lexical Functional Grammar or LFG (Bresnan et al. 2016; Dalrymple et al. 2019). Both HPSG and LFG are lexicalist frameworks in the sense that they distinguish between the morphological system that creates words and the syntax proper that combines those fully inflected words into phrases and sentences.

Citation: Stephen Wechsler & Ash Asudeh. 2021. HPSG and Lexical Functional Grammar. In Stefan Müller, Anne Abeillé, Robert D. Borsley & Jean-Pierre Koenig (eds.), Head-Driven Phrase Structure Grammar: The handbook.
The Complexity of Narrow Syntax: Minimalism, Representational Economy, and Simplest Merge
Published in: Measuring Grammatical Complexity, ed. Frederick J. Newmeyer ... Oxford: Oxford University Press, 2014, pp. 128-147. ISBN 978-0-19-968530-1.

Andreas Trotzke and Jan-Wouter Zwart

7.1 Introduction
The issue of linguistic complexity has recently received much attention by linguists working within typological-functional frameworks (e.g. Miestamo, Sinnemäki, and Karlsson 2008; Sampson, Gil, and Trudgill 2009). In formal linguistics, the most prominent measure of linguistic complexity is the Chomsky hierarchy of formal languages (Chomsky 1956), including the distinction between a finite-state grammar (FSG) and more complicated types of phrase-structure grammar (PSG). This distinction has played a crucial role in the recent biolinguistic literature on recursive complexity (Sauerland and Trotzke 2011). In this chapter, we consider the question of formal complexity measurement within linguistic minimalism (cf. also Biberauer et al., this volume, chapter 6; Progovac, this volume, chapter 5) and argue that our minimalist approach to complexity of derivations and representations shows similarities with that of alternative theoretical perspectives represented in this volume (Culicover, this volume, chapter 8; Jackendoff and Wittenberg, this volume, chapter 4). In particular, we agree that information structure properties should not be encoded in narrow syntax as features triggering movement, suggesting that the relevant information is established at the interfaces. Also, we argue for a minimalist model of grammar in which complexity arises out of the cyclic interaction of subderivations, a model we take to be compatible with Construction Grammar approaches. We claim that this model allows one to revisit the question of the formal complexity of a generative grammar, rephrasing it such that a different answer to the question of formal complexity is forthcoming depending on whether we consider the grammar as a whole, or just narrow syntax.
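The FSG/PSG distinction from the Chomsky hierarchy that the authors invoke can be illustrated with a small sketch (my own example, not taken from the chapter): a finite-state recognizer handles the pattern (ab)^n with a fixed number of states, while the pattern a^n b^n, which requires matching counts, is handled naturally by a recursive phrase-structure-style rule S -> a S b | empty.

```python
# Illustrative sketch of the FSG/PSG contrast in the Chomsky hierarchy.

def fsg_accepts(s):
    """Finite-state recognizer for (ab)^n: finitely many states, no memory."""
    state = 0                      # 0 = expecting 'a', 1 = expecting 'b'
    for ch in s:
        if state == 0 and ch == "a":
            state = 1
        elif state == 1 and ch == "b":
            state = 0
        else:
            return False
    return state == 0

def psg_accepts(s):
    """Recursive recognizer for a^n b^n (S -> a S b | empty): needs unbounded memory."""
    if s == "":
        return True
    return s.startswith("a") and s.endswith("b") and psg_accepts(s[1:-1])

print(fsg_accepts("ababab"), psg_accepts("aaabbb"))   # True True
print(fsg_accepts("aaabbb"), psg_accepts("ababab"))   # False False
```

No finite-state device can recognize a^n b^n for arbitrary n, which is why the FSG/PSG cut serves as a baseline measure of formal complexity in this literature.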
Lexical-Functional Grammar and Order-Free Semantic Composition
COLING 82, J. Horecký (ed.). North-Holland Publishing Company, © Academia 1982.
Per-Kristian Halvorsen, Norwegian Research Council's Computing Center for the Humanities and Center for Cognitive Science, MIT.

This paper summarizes the extension of the theory of lexical-functional grammar to include a formal, model-theoretic semantics. The algorithmic specification of the semantic interpretation procedures is order-free, which distinguishes the system from other theories providing model-theoretic interpretation for natural language. Attention is focused on the computational advantages of a semantic interpretation system that takes as its input functional structures as opposed to syntactic surface-structures.

A pressing problem for computational linguistics is the development of linguistic theories which are supported by strong independent linguistic argumentation, and which can, simultaneously, serve as a basis for efficient implementations in language processing systems. Linguistic theories with these properties make it possible for computational implementations to build directly on the work of linguists both in the area of grammar-writing, and in the area of theory development (cf. universal conditions on anaphoric binding, filler-gap dependencies etc.). Lexical-functional grammar (LFG) is a linguistic theory which has been developed with equal attention being paid to theoretical linguistic and computational processing considerations (Kaplan & Bresnan 1981). The linguistic theory has ample and broad motivation (vide the papers in Bresnan 1982), and it is transparently implementable as a syntactic parsing system (Kaplan & Halvorsen forthcoming). LFG takes grammatical relations to be of primary importance (as opposed to the transformational theory where grammatical functions play a subsidiary role).
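As a loose, hypothetical illustration (not Halvorsen's actual formalism or notation), the sketch below treats a functional structure as an attribute-value dictionary: because semantic composition consults grammatical functions such as SUBJ and OBJ by name rather than by position, the interpretation procedure is order-free in the sense described above.

```python
# Loose illustration only: an f-structure as an attribute-value structure
# whose grammatical functions are consulted by name, so interpretation
# does not depend on surface word order.
f_structure = {
    "PRED": "see<SUBJ, OBJ>",
    "TENSE": "past",
    "SUBJ": {"PRED": "Mary"},
    "OBJ": {"PRED": "girl", "DEF": True},
}

def interpret(fs):
    """Order-free composition: look up the functions the predicate selects."""
    predicate = fs["PRED"].split("<")[0]
    subj = fs["SUBJ"]["PRED"]
    obj = fs["OBJ"]["PRED"]
    return f"{predicate}({subj}, {obj})"

print(interpret(f_structure))   # see(Mary, girl)
```

The same f-structure could have been built from quite different surface orders; interpretation only cares about the attributes, which is the computational advantage the abstract highlights.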
Experimental Syntax for Biolinguistics? (Koji Fujita)
Title Experimental Syntax for Biolinguistics? Author(s) Fujita, Koji Citation (2009) Issue Date 2009-11-15 URL http://hdl.handle.net/2433/87611 Right c Koji Fujita Type Presentation Textversion publisher Kyoto University Language evolution (and development) Experimental Syntax boils down to the emergence of: for Biolinguistics? Recursive Unbounded Merge Interfaces Lexicon Koji Fujita cf. FLN / FLB dichotomy Kyoto University 1 3 Real-time Grammar (Phillips' theses): Biolinguistics: Human language is "implementation Naturalization, or biologization, of human dependent." language faculty (biosyntax, biosemantics, etc.) Grammar is a real-time structure building system. Design Derivation proceeds mostly from left Development (top) to right (bottom). Evolution C. Phillips & S. Lewis. Derivational order in syntax: Evidence and architectural consequences. 2 4 Grammar = Parser? FLN? Competence TP What a cognitive system could achieve John 3 with unbounded resources T 3 VP FLB? Performance John3 saw3 the girl What it can achieve when it is subject to real-life resource limitations Mismatch between derivations and Phylogeny/ontogeny? C. Phillips & M. Wagers. Relating structure and time in linguistics and psycholinguistics. Oxford Handbook of Psycholinguistics. 5 7 Major Issues: Unbounded Merge "… unbounded Merge is not only a genetically determined property of language, but also unique Mismatch between Theoretical and Psycho-/ to it." Neuro-Linguistics Lack of Comparative Methods "… for both evolution and development, there seems to be little reason to suppose that there were Modularity as an end result of evolution & precursors to unbounded Merge." development - N. Chomsky 6 8 Pirahã: A Language without Recursion? Unbounded, recursive Merge: Competence Cross-linguistic variations: Performance ti gái -sai kó'oi hi kaháp -ií I say-old.info Kó'oi he leave-intention "..