CAS LX 522 Syntax I: Some History of Generative Grammar

CAS LX 522 Syntax I
Week 13. Loose ends, minimalism

Some history of generative grammar
• Transformational Grammar (Chomsky 1955, Chomsky 1957)
• Standard Theory (Chomsky 1965)
• Extended Standard Theory (Chomsky 1970, …)
• Government and Binding Theory (Chomsky 1981, 1986)
  – we've mostly been in here somewhere…
• Minimalist Program (Chomsky 1993)

Transformational grammar
• Grammar was a set of phrase structure rules:
  – S → NP VP
  – NP → D N
  – VP → V NP
  – D → the
  – N → man, dog, sandwich, …
  – V → meet, see, …
• Start with S and apply rules until none are left (a toy generator illustrating this procedure is sketched below).
• …and transformations:
  – Tpassive: NP1-Aux-V-NP2 → NP2 + be + V+en by NP1
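The "start with S, apply rules until none are left" procedure can be pictured as a tiny top-down generator. The sketch below is only illustrative: the rule table mirrors the rules on the slide, but the representation and the function name are my own assumptions, not part of the original handout.

```python
import random

# Toy phrase structure grammar from the slide: each nonterminal maps to a list
# of possible expansions; any symbol not listed here is a terminal word.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"]],
    "VP": [["V", "NP"]],
    "D":  [["the"]],
    "N":  [["man"], ["dog"], ["sandwich"]],
    "V":  [["meet"], ["see"]],
}

def generate(symbol="S"):
    """Start with S and keep applying rules until no nonterminals are left."""
    if symbol not in RULES:                    # a terminal: nothing left to rewrite
        return [symbol]
    expansion = random.choice(RULES[symbol])   # apply one rule for this symbol
    words = []
    for sym in expansion:
        words.extend(generate(sym))
    return words

print(" ".join(generate()))   # e.g. "the dog see the sandwich"
```

Transformations such as Tpassive then operate on the structures that this base component produces; they are not further rewrite rules of the same kind.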
Standard Theory
• Introduced the idea of a lexicon.
• Tied DS to meaning and SS to pronunciation.
• Development of the innateness argument and of levels of adequacy (descriptive, explanatory).
• Treated reflexivization as a transformation (the beginnings of Binding Theory):
  – DS: Bill saw Bill
  – SS: Bill saw himself

Generative semantics vs. interpretive semantics
• In the late 60s there was a rift between those who thought meaning should be tied to DS and those who thought meaning should be tied to SS.
• The DS people were the generative semanticists; the SS people were the interpretive semanticists.
• The path we're following took the SS side.

Extended Standard Theory
• Replaced phrase structure rules with X-bar theory.
• Gradually started replacing construction-specific rules with more general constraints (the binding conditions, the complex NP constraint, the wh-island condition) and rules (NP movement, wh-movement):
  – The editor didn't find many mistakes.
  – Many mistakes weren't found by the editor.
• Development of theta-theory.

Government and Binding
• Grammar has a highly modular character: separate modules govern separate things, and all of them have to be satisfied for a sentence to be grammatical. This is the logical extreme of the increasing generality:
  – X-bar theory
  – Binding theory
  – Theta theory
  – Bounding theory
  – Case theory
  – Movement rules (NP, wh, head)
• Constraints began to refer to structural relations (c-command, m-command, government).
• The level of LF was introduced, along with covert movement (like QR).
• This is the model we have been using, basically.

Minimalist Program
• Since 1993, the syntactic paradigm has shifted to the Minimalist Program.
• The motivation behind the Minimalist Program was that syntax was starting to seem too complicated: perhaps the syntactic machinery inherited from previous approaches was as complicated as the phenomena it was supposed to explain.
• The goal of the MP was, in a sense, to "start over" with syntax, given what we now know after years of learning vast amounts about the structure of language.
• We start with only the things that have to be true, and then we carefully justify everything else that we need as we rebuild the system from scratch.
• Practically speaking, what happened was a change in the fundamental perspective on what is happening in syntax, but it turned out to have little effect on the day-to-day life of syntacticians: there is still Case to be assigned (checked), there are still theta-roles, and the trees all look basically the same.
• We'll go through things in more detail in Syntax II.

Ways to think
• In GB theory, there were three kinds of movement rules:
  – NP movement (movement of DPs, e.g., for Case)
  – wh-movement (movement of wh-words)
  – head movement (movement of heads to heads)
• It was observed that each kind of movement served to get two things close together:
  – NP movement of the subject brings it into SpecTP, close to T, so that it can get Case.
  – Wh-movement brings wh-words into SpecCP, close to the [+Q, +WH] C.
  – Head movement brings V up close to T.
• So closeness seems to matter.
• This evolved into the idea that lexical items (and phrases) have features, and that they need to be close to each other in order for those features to be checked.
• So, with wh-movement, the [wh] feature of the wh-word needs to be checked against the [+WH] feature of the interrogative C, and to do this it needs to be close; SpecCP counts as close. Hence, the wh-word needs to move to SpecCP (see the sketch below).
• If we assume that all movement is driven by the requirement to check features (and that all features must be checked in a grammatical derivation), then this has to be what happens in head movement too.
• The idea would be that, for example, interrogative Q has a feature on it that needs to be checked against a feature of T.
• So instead of saying that there is a rule moving T to C when C is [+Q], we say that when C is [+Q] it is also [+T]; the feature-checking system takes care of the rest.
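To make the feature-checking idea concrete, here is a minimal sketch of a single checking step for wh-movement. The class, the feature labels, and the way "close" is modeled (simply as the specifier of the checking head) are my own simplifying assumptions, not the course's formal system.

```python
# A toy model of feature-driven wh-movement: a [wh] phrase raises to Spec,CP
# so that its feature can be checked against [+WH] on the interrogative C.

class Phrase:
    def __init__(self, label, features=()):
        self.label = label
        self.features = set(features)
        self.spec = None            # the specifier position (counts as "close")

def check_wh(cp, wh_phrase):
    """Move the wh-phrase to Spec,CP and check off the matching features."""
    if "+WH" in cp.features and "wh" in wh_phrase.features:
        cp.spec = wh_phrase                 # movement to SpecCP
        cp.features.discard("+WH")          # both features are now checked
        wh_phrase.features.discard("wh")
        return True
    return False                            # nothing to check, so no movement

cp = Phrase("CP", features={"+Q", "+WH"})
what = Phrase("DP", features={"wh"})
print(check_wh(cp, what), cp.features)      # True {'+Q'}
```

On this way of thinking, a derivation counts as grammatical only if every such feature has been checked off by the end, which is what forces the movement rather than merely permitting it.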
Other changes
• There are various other changes in MP thinking which we can't really get into here, but they all tend to have the result that we get basically the same (or simpler) structures out of a dramatically simpler system.
• Somewhat fundamental changes occurred in the notions of DS and of X-bar theory, and even more recent work has broken down the distinction between overt and covert movement somewhat.

VP shells
• Let's go back and consider VP shells a bit.
• The ice melted. The boat sank. The door closed.
• The ice, the boat, and the door are all Themes, suggesting that these verbs are unaccusative: the argument starts in "object" (complement of V) position.
  – [VP [V′ [V melt] [DP the ice]]]

VP shells
• So far, so good.
• Now: Bill melted the ice.
• The ice is still a Theme. The verb is still melt.
• Uniform Theta Assignment Hypothesis (UTAH) (Baker 1988): two arguments which fulfill the same thematic function with respect to a given predicate must occupy the same underlying (DS) position in the syntax.
• So the ice must still be a complement of the verb at DS.

VP shells
• In Bill melted the ice, what have we done? We've added a causer.
• Bill caused [the ice to melt].
• We've already supposed that the light verb assigns the Agent θ-role in ditransitives.
• It isn't much of a jump to think of it as having a meaning something like CAUSE.

VP shells
• Bill melted the ice.
  – DS: [vP [DP Bill] [v′ v [VP [V′ [V melt] [DP the ice]]]]]
• Then the main verb moves up to the light verb, yielding the surface order (a toy sketch of this derivation appears at the end of this summary).
  – Later, Bill will move to SpecTP for Case and EPP reasons.
• Why does V move to v? We'll assume that it does this for a reason analogous to why V moves to T (for French verbs, say).

VP shells
• Warning: even though v may carry a "causative" meaning, this does not mean that it is synonymous with the English word cause.
• The water boiled. / Bill boiled the water.
  – Bill_i T t_i v+boil the water
• The "cause" meaning is a bit more general.
• Bill caused the water to boil.
  – Bill cause TP

VP shells
• Bill remarked that Patrick runs fast.
• Bill remarked to her that Patrick runs fast.
• UTAH and the CP.

VP shells
• You must satisfy the jury that you're innocent.
• The jury gets the same kind of theta role, something like Experiencer (but with no to), and it is not optional.
• It strikes me that Bill runs fast.
• It seems to me that our analysis needs more light verbs.

VP shells
• Object control predicates.
• Ever try to draw the tree for They persuaded Bill to leave? Again, there are too many arguments and not enough syntactic positions available in a binary-branching tree.
• They persuaded me that I should leave.

VP shells
• He sold me a camel.
• Following along as before…
  – He_i T t_i v+sell me t_V a camel
• Compare that to He gave Mary a book. Ah.
• It turns out this alternative to Larson is more crosslinguistically applicable (the IO seems to start out higher in the tree than the DO across languages). It also means that Bill gave me a book is the more basic form, and Bill gave a book to me is the more derived one.

VP shells
• He lied.
• Agent, no Theme.
• Suppose that Agents only come about by virtue of a v; that is, if there's an Agent, it's in the specifier of a vP at DS.
• Compare He told a lie. The verb lie seems to be denominal. Like dance… and others.

Unergative verbs
• Hale & Keyser proposed that denominal verbs like lie involve head movement of an N up to a light (verbalizing) verb.
• If we're going to do that, perhaps we can deal with verb-particle constructions the same way:
  – Bill turned on the light.
  – Bill turned the light on.

AgrSP
• They have probably all left.
• *They have completely probably all left.
• They probably all have left.
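To tie the VP-shell discussion together, here is the toy derivation sketch referenced above for Bill melted the ice. The nested-list representation and the function names are my own illustrative assumptions, not notation from the course; it models only the V-to-v step, since Bill's later raising to SpecTP does not change the word order here.

```python
# Toy illustration of the vP-shell derivation for "Bill melted the ice".
# Trees are [label, child, child, ...]; leaves are words; "" marks silence.

# DS: the Theme "the ice" is the complement of V (per UTAH);
# the causer "Bill" sits in Spec,vP; v is a silent light verb meaning ~CAUSE.
DS = ["vP",
      ["DP", "Bill"],
      ["v'",
       ["v", ""],
       ["VP", ["V", "melted"], ["DP", "the", "ice"]]]]

def move_V_to_v(tree):
    """Head-move V up to the light verb v, leaving an unpronounced trace."""
    _, _, vbar = tree
    _, v, vp = vbar
    _, V, _ = vp
    v[1] = V[1]          # V adjoins to the light verb
    V[1] = ""            # the trace of V is not pronounced
    return tree

def terminal_string(tree):
    """Read the pronounced words off the tree, left to right."""
    if isinstance(tree, list):
        return [w for child in tree[1:] for w in terminal_string(child)]
    return [tree] if tree else []

move_V_to_v(DS)
print(" ".join(terminal_string(DS)))   # -> Bill melted the ice
# (Bill subsequently raises to Spec,TP for Case/EPP; the string is unchanged.)
```

Nothing in this sketch is specific to melt: on the alternative to Larson described above, the same v-VP skeleton extends to the ditransitive cases (He sold me a camel) by placing the indirect object above the direct object inside the VP.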