Structure Rules and Transformational Rules, Requires Three Steps to Explain the Sentence

Total Pages: 16

File Type: pdf, Size: 1020 KB

DOCUMENT RESUME

ED 022 771    TE 000 681
By: Hayes, Curtis W.
SYNTAX: SOME PRESENT-DAY CONCEPTS.
Pub Date: Jan 67
Note: 9p.
Journal Cit: English Journal; v56 n1 p89-96 Jan 1967
EDRS Price: MF-$0.25 HC-$0.44
Descriptors: ENGLISH INSTRUCTION, KERNEL SENTENCES, LINGUISTICS, PHRASE STRUCTURE, *SENTENCE STRUCTURE, *STRUCTURAL GRAMMAR, *SYNTAX, *TRANSFORMATION GENERATIVE GRAMMAR, TRANSFORMATIONS (LANGUAGE), TRANSFORMATION THEORY (LANGUAGE)

The value of a transformational model of syntax can be illustrated by comparing the taxonomic grammatical description of a complex sentence to a transformation-oriented description of the same sentence. The taxonomic approach, an immediate constituent analysis, requires 10 steps to break the sample sentence into its grammatical components; the transformational approach, incorporating both phrase structure rules and transformational rules, requires three steps to explain the sentence. Because the transformational method allows for generalizations about the process of embedding, it can make more economical statements about syntax. Furthermore, since the transformational theory holds that a finite set of phrase structure rules plus a finite set of transformational rules can explain any sentence, it is linguistically more complete and consistent and, thus, more practical in the classroom than the taxonomic theory, which assumes that an infinite set of phrase structure rules is necessary to describe all sentences. (LH)

ENGLISH JOURNAL
The Official Journal of the Secondary Section of the National Council of Teachers of English
Editor: Richard S. Alm, University of Hawaii
Volume 56, Number 1, January 1967

Matter and Meaning of Motion Pictures   23   The Rev. I. Paul Calico, C.S.C.
The Faculty Club, Wittenberg (Verse)   37   William F. Gavin
Reeling in English Class   38   Sister Mary Labouré Hang, S.N.D.
Loneliness of the Long Distance Runner: First Film Fare   41   Sister M. Amanda Ely, O.P.
Macbeth for the Busy Reader (Verse)   44   Richard Gaggin
Aboard the Narcissus   45   Evalee Hart
An Approach to Teaching The Secret Sharer   49   Marian C. Powell
The Role of Order and Disorder in The Long March   54   Welles T. Brandriff
A Letter to J. A. (Verse)   59   Pansye H. Powell
The Weltanschauung of Steinbeck and Hemingway: An Analysis of Themes   60   Samuel Scoville
Miss Brownstone and the Age of Science   64   John H. Bens
Helping Students See Their Language Working   67   W. Wilbur Hatfield
The Feature System in the Classroom   74   Keith Schap
Syntax: Some Present-Day Concepts   89   Curtis W. Hayes
Some Usage Forms Die Hard Thanks to College Entrance Exams   97   Evelyn Schroth
Is Composition Obsolete?   100   Solomon S. Simonson
Teaching Writing Today: Composition or Decomposition?   103   Edward Lueders
Priming the Pump and Controlling the Flow   109   Vivian Buchan
How To Write Less Efficiently   114   Arthur A. Stern

Syntax: Some Present-Day Concepts
Curtis W. Hayes
Department of English, University of Nebraska, Lincoln, Nebraska

THE APPEARANCE in 1957 of Noam Chomsky's monograph entitled Syntactic Structures (The Hague: Mouton and Co.) divided linguistic science into two schools, sharply divergent. The older of these two, which by now may almost be called the traditional or structural school, traces its origin to 1933, the year of Leonard Bloomfield's monumental Language. The newer school, which may be called the MIT, or Chomskyan, school, had its birth in 1958. The Chomskyans maintain that their own system describes human language "as in itself it really is" (to borrow Matthew Arnold's phrase); all competing theories they feel to be unsophisticated, inherently incapable of describing the complexities of language. In a number of fairly recent publications, they have drawn attention to what they feel to be the inadequacies of the older school.[1]

These attacks have taken several paths, but in general the Chomskyans have judged the older grammar (frequently labeled the taxonomic grammar) to be inadequate in its power to describe certain linguistic facts and processes. Chomsky himself, for instance, in one of his more recent publications, argues "that a taxonomic model (or any of its variants within a modern study of language) is far too oversimplified to be able to account for the facts of linguistic structure and that the transformational model of generative grammar is much closer to the truth."[2] In Syntactic Structures (pp. 18 ff.), he had argued that the taxonomic grammar was inadequate because it would not generate all the grammatical sentences of a language and only those. Specifically, he held, it would not generate the "nesting" (or self-embedding) properties typical of certain English sentences.

[1] The traditional system is represented by Charles C. Fries, The Structure of English: An Introduction to the Construction of English Sentences (New York: Harcourt, Brace and Co., Inc., 1952); George L. Trager and Henry Lee Smith, Jr., An Outline of English Structure (Studies in Linguistics, Occasional Papers, No. 3) (Norman, Oklahoma: Battenburg Press, 1951); and Archibald A. Hill, Introduction to Linguistic Structures: From Sound to Sentence in English (New York: Harcourt, Brace and Co., Inc., 1958), to mention only a few.

[2] See, for example, Chomsky's paper, "The Logical Basis of Linguistic Theory," which appears in the Proceedings of the Ninth International Congress of Linguists (The Hague: Mouton and Co., 1964), pp. 914-1008.

Another member of this school, Paul Postal, discusses in a recent monograph the inadequacies of traditional models of linguistic description.[3] Taxonomic models, according to Postal, cannot account for the intuitively-felt relationships among sentences such as active and passive, interrogative and declarative, assertive and negative, incorporated and non-incorporated. Nor, he adds, can this grammar account for such grammatical processes as the following:

Deletion: XAY → AY
Substitution: XAY → XBY
Permutation: XAYBZ → XBYAZ
Adjunction (embedding of constituents): XABY → ZBWX

THE GENERAL tenet of transformational/generative grammar (i.e., a Chomskyan grammar) is that every adult possesses a relatively few, simple sentence patterns (the kernels[4]) and a complex set of rules (called transformations) which describe the operations by which he combines and modifies simple sentences into the infinite number of complicated sentences he can produce. The processes of combining and modifying sentences to form even more complex sentences are technically known as transformations. In other words, a speaker learns a finite set of basic sentence patterns together with a finite set of transformational rules. Then, in accordance with these rules, he may construct an infinite number of sentences. This avoids the notion that a human speaker learns to talk by mastering all the complex sentence patterns of his language, each one separately. Take, for instance, the following example:

Union Oil sells oil

may be considered a kernel (a base sentence) for further transformations, such as

the passive: Oil is sold by Union Oil.
the negative: Union Oil doesn't sell oil.
the negative/passive: Oil isn't sold by Union Oil.
the interrogative: Does Union Oil sell oil?
the negative/interrogative: Doesn't Union Oil sell oil?
the negative/passive/interrogative: Isn't oil sold by Union Oil?

For the linguist (as well as those schooled in logic or mathematics), the above algebra may give rigor, consistency, and exactness to statements about language. But, for the outsider, including perhaps the classroom teacher, these rules may be repellent and thus may have only negligible value in a classroom situation. It is with this difficulty in mind that this paper is written: first, to explain a few of the insights of transformational grammar and how they lead to a complete as well as to a simple view of grammatical processes; and second, to compare a description of a complex sentence provided by a taxonomic grammar and a description of that same sentence using the transformational approach. In the course of this paper, I should like to make more easily understandable the complex equations of the transformational/generative grammar.

[3] "Constituent Structure: A Study of Contemporary Models of Syntactic Descriptions," Part III (January 1964).

[4] The notion of kernel perhaps is outmoded in its original definition (i.e., sentences which have had no optional transformations performed on them), yet it is still useful to think of a human speaker as having a set of basic sentences, perhaps a finite set, from which he can produce an infinite number of sentences. In this paper the term kernel can be liberalized to include the notion of base sentence.

WE MAY take this sentence from Harold Whitehall's book, Structural Essentials of English (Harcourt, 1956), for analysis:[5]

To sing such songs to a poor old man persuaded of his own approaching death had been a charitable act I had not contemplated.

The traditional or taxonomic linguist, following the rules of immediate constituent analysis, would analyze such a sentence in linear order; that is, he would use a "straight line"

The verbal group then parses into
3. head: to sing
   noun group: such songs (with songs as its head)
   prepositional group: to a poor old man persuaded of his own approaching death
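The kernel-plus-transformations idea above can be sketched in code. This is an illustration only: the tuple representation and the hard-coded string rules are mine, not the article's, and a real transformational grammar operates on phrase structure, not on strings.

```python
# Toy sketch of kernel transformations, hard-coded for the article's
# example kernel "Union Oil sells oil" represented as (subject, verb, object).

KERNEL = ("Union Oil", "sells", "oil")

def passive(k):
    subj, verb, obj = k
    return f"{obj.capitalize()} is sold by {subj}."

def negative(k):
    subj, verb, obj = k
    return f"{subj} doesn't sell {obj}."

def negative_passive(k):
    subj, verb, obj = k
    return f"{obj.capitalize()} isn't sold by {subj}."

def interrogative(k):
    subj, verb, obj = k
    return f"Does {subj} sell {obj}?"

print(passive(KERNEL))        # Oil is sold by Union Oil.
print(negative(KERNEL))       # Union Oil doesn't sell oil.
print(interrogative(KERNEL))  # Does Union Oil sell oil?
```

Each function stands in for one of the transformations listed above; composing them (negative of passive, and so on) mirrors how the article derives the negative/passive/interrogative forms from the same kernel.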
Recommended publications
  • Scaffolding Learning of Language Structures with Visual‐Syntactic Text
    UC Irvine Previously Published Works

    Title: Scaffolding learning of language structures with visual-syntactic text formatting
    Permalink: https://escholarship.org/uc/item/6235t25b
    Journal: British Journal of Educational Technology, 50(4)
    ISSN: 0007-1013
    Authors: Park, Y; Xu, Y; Collins, P; et al.
    Publication Date: 2018
    DOI: 10.1111/bjet.12689
    Peer reviewed

    British Journal of Educational Technology Vol 0 No 0 2018 1–17, doi:10.1111/bjet.12689

    Scaffolding learning of language structures with visual-syntactic text formatting
    Youngmin Park, Ying Xu, Penelope Collins, George Farkas and Mark Warschauer

    Youngmin Park is a Lecturer at Pusan National University, Korea. She received her Ph.D. degree from the University of California, Irvine, specializing in Language, Literacy and Technology. Ying Xu is a Ph.D. student at the University of California, Irvine, specializing in Language, Literacy and Technology. Penelope Collins is an associate professor at the University of California, Irvine. Her research examines the development of language and literacy skills for children from linguistically diverse backgrounds. George Farkas is a professor at the University of California, Irvine. He has employed a range of statistical approaches and databases to examine the causes and consequences of the reading achievement gap across varying age groups and educational settings. Mark Warschauer is a professor at the University of California, Irvine. He works on a range of research projects related to digital media in education. Address for correspondence: Ying Xu, University of California Irvine, 3200 Education Bldg, Irvine, CA 92697, USA.
  • Language Structure: Phrases. “Productivity”, a Property of Language
    Language Structure: Phrases

    "Productivity", a property of Language
    • Definition - Language is an open system. We can produce potentially an infinite number of different messages by combining elements differently.
    • Example - Words into phrases.

    An Example of Productivity
    • Human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features. (24 words)
    • I think that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (33 words)
    • I have always thought, and I have spent many years verifying, that human language is a communication system that bears some similarities to other animal communication systems, but is also characterized by certain unique features, which are fascinating in and of themselves. (42 words)
    • Although some people might not agree with me, I have always thought…

    Creating Infinite Messages
    • Discrete elements - Words, Phrases
    • Selection - Ease, Meaning, Identity
    • Combination - Rules of organization

    Models of Word Recombination
    1. Word chains (Markov model): Phrase-level meaning is derived from understanding each word as it is presented in the context of immediately adjacent words.
    2. Hierarchical model: There are long-distance dependencies between words in a phrase, and these inform the meaning of the entire phrase.

    Markov Model
    • Rule: Select and concatenate (according to meaning and what types of words should occur next to each other).
    [Diagram: word-chain transitions among "man", "bites", "jumps", "over", "house"]
    • Assumption - Only adjacent words are meaningfully (and lawfully) related.
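The word-chain (Markov) model described in this entry can be sketched as follows. The function names and the toy training text are mine, purely illustrative of the "adjacent words only" assumption the slides contrast with hierarchical structure.

```python
import random
from collections import defaultdict

# Word-chain (Markov) sketch: each next word is chosen only from words
# observed immediately after the current word, with no memory of
# anything earlier in the phrase.

def build_chain(text):
    chain = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, max_words, seed=0):
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_words:
        successors = chain.get(out[-1])
        if not successors:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(successors))
    return " ".join(out)

chain = build_chain("the dog bites the man the man jumps over the house")
print(generate(chain, "the", 6))
```

Because the model consults only the immediately preceding word, it happily produces locally plausible but globally incoherent strings, which is exactly the deficiency the hierarchical model is meant to fix.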
  • Chapter 30 HPSG and Lexical Functional Grammar Stephen Wechsler the University of Texas Ash Asudeh University of Rochester & Carleton University
    Chapter 30
    HPSG and Lexical Functional Grammar
    Stephen Wechsler, The University of Texas
    Ash Asudeh, University of Rochester & Carleton University

    This chapter compares two closely related grammatical frameworks, Head-Driven Phrase Structure Grammar (HPSG) and Lexical Functional Grammar (LFG). Among the similarities: both frameworks draw a lexicalist distinction between morphology and syntax, both associate certain words with lexical argument structures, both employ semantic theories based on underspecification, and both are fully explicit and computationally implemented. The two frameworks make available many of the same representational resources. Typical differences between the analyses proffered under the two frameworks can often be traced to concomitant differences of emphasis in the design orientations of their founding formulations: while HPSG's origins emphasized the formal representation of syntactic locality conditions, those of LFG emphasized the formal representation of functional equivalence classes across grammatical structures. Our comparison of the two theories includes a point by point syntactic comparison, after which we turn to an exposition of Glue Semantics, a theory of semantic composition closely associated with LFG.

    1 Introduction

    Head-Driven Phrase Structure Grammar is similar in many respects to its sister framework, Lexical Functional Grammar or LFG (Bresnan et al. 2016; Dalrymple et al. 2019). Both HPSG and LFG are lexicalist frameworks in the sense that they distinguish between the morphological system that creates words and the syntax proper that combines those fully inflected words into phrases and sentences.

    Stephen Wechsler & Ash Asudeh. 2021. HPSG and Lexical Functional Grammar. In Stefan Müller, Anne Abeillé, Robert D. Borsley & Jean-Pierre Koenig (eds.), Head-Driven Phrase Structure Grammar: The handbook.
  • Chapter 7 Linguistics As a Science of Structure Ryan M
    Chapter 7
    Linguistics as a science of structure
    Ryan M. Nefdt, University of the Western Cape

    Generative linguistics has rapidly changed during the course of a relatively short period. This has caused many to question its scientific status as a realist scientific theory (Stokhof & van Lambalgen 2011; Lappin et al. 2000). In this chapter, I argue against this conclusion. Specifically, I claim that the mathematical foundations of the science present a different story below the surface. I agree with critics that due to the major shifts in theory over the past 80 years, linguistics is indeed opened up to the problem of pessimistic meta-induction or radical theory change. However, I further argue that a structural realist approach (Ladyman 1998; French 2006) can save the field from this problem and at the same time capture its structural nature. I discuss particular historical instances of theory change in generative grammar as evidence for this interpretation and finally attempt to extend it beyond the generative tradition to encompass previous frameworks in linguistics.

    1 Introduction

    The generativist revolution in linguistics started in the mid-1950s, inspired in large part by insights from mathematical logic and in particular proof theory. Since then, generative linguistics has become a dominant paradigm, with many connections to both the formal and natural sciences. At the centre of the newly established discipline was the syntactic or formal engine, the structures of which were revealed through modelling grammatical form. The generativist paradigm in linguistics initially relied heavily upon the proof-theoretic techniques introduced by Emil Post and other formal logicians to model the form language takes (Tomalin 2006; Pullum 2011; 2013).[1] Yet despite these aforementioned formal beginnings, the generative theory of linguistics has changed its commitments quite

    [1] Here my focus will largely be on the formal history of generative syntax but I will make some comments on other aspects of linguistics along the way.
  • Lexical-Functional Grammar and Order-Free Semantic Composition
    COLING 82, J. Horecký (ed.)
    North-Holland Publishing Company
    © Academia, 1982

    Lexical-Functional Grammar and Order-Free Semantic Composition
    Per-Kristian Halvorsen
    Norwegian Research Council's Computing Center for the Humanities and Center for Cognitive Science, MIT

    This paper summarizes the extension of the theory of lexical-functional grammar to include a formal, model-theoretic semantics. The algorithmic specification of the semantic interpretation procedures is order-free, which distinguishes the system from other theories providing model-theoretic interpretation for natural language. Attention is focused on the computational advantages of a semantic interpretation system that takes as its input functional structures as opposed to syntactic surface-structures.

    A pressing problem for computational linguistics is the development of linguistic theories which are supported by strong independent linguistic argumentation, and which can, simultaneously, serve as a basis for efficient implementations in language processing systems. Linguistic theories with these properties make it possible for computational implementations to build directly on the work of linguists both in the area of grammar-writing, and in the area of theory development (cf. universal conditions on anaphoric binding, filler-gap dependencies etc.). Lexical-functional grammar (LFG) is a linguistic theory which has been developed with equal attention being paid to theoretical linguistic and computational processing considerations (Kaplan & Bresnan 1981). The linguistic theory has ample and broad motivation (vide the papers in Bresnan 1982), and it is transparently implementable as a syntactic parsing system (Kaplan & Halvorsen forthcoming). LFG takes grammatical relations to be of primary importance (as opposed to the transformational theory where grammatical functions play a subsidiary role).
  • Syntactic Structures and Their Symbiotic Guests. Notes on Analepsis from the Perspective of On- Line Syntax*
    Pragmatics 24:3.533-560 (2014)
    International Pragmatics Association
    DOI: 10.1075/prag.24.3.05aue

    SYNTACTIC STRUCTURES AND THEIR SYMBIOTIC GUESTS. NOTES ON ANALEPSIS FROM THE PERSPECTIVE OF ON-LINE SYNTAX*

    Peter Auer

    Abstract

    The empirical focus of this paper is on utterances that re-use syntactic structures from a preceding syntactic unit. Next utterances of this type are usually treated as (coordination) ellipsis. It is argued that from an on-line perspective on spoken syntax, they are better described as structural latency: a grammatical structure already established remains available and can therefore be made use of with one or more of its slots being filled by new material. A variety of cases of this particular kind of conversational symbiosis are discussed. It is argued that they should receive a common treatment. A number of features of the general host/guest relationship are discussed.

    Keywords: Analepsis; On-line syntax; Structural latency.

    "Ein Blick in die erste beste Erzählung und eine einfache Ueberlegung muss beweisen, dass jede frühere Aeusserung des Erzählenden die Exposition aller nachfolgenden Prädikate bildet." ("A glance into any story and a simple consideration must prove that every earlier utterance of the narrator forms the exposition of all subsequent predicates.") (Wegener 1885: 46)1

    1. Introduction

    The notion of 'ellipsis' is often regarded with some skepticism by Interactional Linguists - the orientation to 'full' sentences is all too obvious (cf. Selting 1997), and the idea that speakers produce complete sentences just in order to delete some parts of them afterwards surely fails to account for the processual dynamics of sentence production and understanding (cf. Kindt 1985) in time. On the other hand, there can be little doubt that speakers often produce utterance units that could not be understood

    * I wish to thank Elizabeth Couper-Kuhlen and Susanne Günthner for their helpful comments on a previous version of this paper.
  • LING83600: Context-Free Grammars
    LING83600: Context-free grammars
    Kyle Gorman

    1 Introduction

    Context-free grammars (or CFGs) are a formalization of what linguists call "phrase structure grammars". The formalism is originally due to Chomsky (1956) though it has been independently discovered several times. The insight underlying CFGs is the notion of constituency. Most linguists believe that human languages cannot be even weakly described by CFGs (e.g., Shieber 1985). And in formal work, context-free grammars have long been superseded by "transformational" approaches which introduce movement operations (in some camps), or by "generalized phrase structure" approaches which add more complex phrase-building operations (in other camps). However, unlike more expressive formalisms, there exist relatively-efficient cubic-time parsing algorithms for CFGs. Nearly all programming languages are described by, and parsed using, a CFG,1 and CFG grammars of human languages are widely used as a model of syntactic structure in natural language processing and understanding tasks.

    Bibliographic note: This handout covers the formal definition of context-free grammars; the Cocke-Younger-Kasami (CYK) parsing algorithm will be covered in a separate assignment. The definitions here are loosely based on chapter 11 of Jurafsky & Martin, who in turn draw from Hopcroft and Ullman 1979.

    2 Definitions

    2.1 Context-free grammars

    A context-free grammar G is a four-tuple ⟨N, Σ, R, S⟩ such that:

    • N is a set of non-terminal symbols, corresponding to phrase markers in a syntactic tree.
    • Σ is a set of terminal symbols, corresponding to words (i.e., X⁰s) in a syntactic tree.
    • R is a set of production rules.
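The four-tuple definition can be made concrete with a toy grammar. The grammar, symbol names, and expansion routine below are mine, not the handout's; the sketch rewrites non-terminals until only terminal words remain.

```python
import random

# A context-free grammar as a four-tuple (N, SIGMA, R, START):
# non-terminals, terminals, production rules, and the start symbol.
N = {"S", "NP", "VP", "Det", "Noun", "V"}
SIGMA = {"the", "a", "fox", "dog", "sings", "barks"}
R = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "Noun"]],
    "VP": [["V"]],
    "Det": [["the"], ["a"]],
    "Noun": [["fox"], ["dog"]],
    "V": [["sings"], ["barks"]],
}
START = "S"

def expand(symbol, rng):
    # Rewrite non-terminals (via a randomly chosen production)
    # until only terminal symbols, i.e. words, remain.
    if symbol in SIGMA:
        return [symbol]
    out = []
    for child in rng.choice(R[symbol]):
        out.extend(expand(child, rng))
    return out

rng = random.Random(1)
print(" ".join(expand(START, rng)))
```

Every derivation from this grammar has the shape Det Noun V, which is the constituency insight in miniature: the rules describe phrases, not word-to-word transitions.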
  • Generative Linguistics and Neural Networks at 60: Foundation, Friction, and Fusion*
    Generative linguistics and neural networks at 60: foundation, friction, and fusion* Joe Pater, University of Massachusetts Amherst October 3, 2018. Abstract. The birthdate of both generative linguistics and neural networks can be taken as 1957, the year of the publication of foundational work by both Noam Chomsky and Frank Rosenblatt. This paper traces the development of these two approaches to cognitive science, from their largely autonomous early development in their first thirty years, through their collision in the 1980s around the past tense debate (Rumelhart and McClelland 1986, Pinker and Prince 1988), and their integration in much subsequent work up to the present. Although this integration has produced a considerable body of results, the continued general gulf between these two lines of research is likely impeding progress in both: on learning in generative linguistics, and on the representation of language in neural modeling. The paper concludes with a brief argument that generative linguistics is unlikely to fulfill its promise of accounting for language learning if it continues to maintain its distance from neural and statistical approaches to learning. 1. Introduction At the beginning of 1957, two men nearing their 29th birthdays published work that laid the foundation for two radically different approaches to cognitive science. One of these men, Noam Chomsky, continues to contribute sixty years later to the field that he founded, generative linguistics. The book he published in 1957, Syntactic Structures, has been ranked as the most influential work in cognitive science from the 20th century.1 The other one, Frank Rosenblatt, had by the late 1960s largely moved on from his research on perceptrons – now called neural networks – and died tragically young in 1971.
  • Psychoacoustics, Speech Perception, Language Structure and Neurolinguistics Hearing Acuity Absolute Auditory Threshold Constant
    David House: Psychoacoustics, speech perception, language structure and neurolinguistics (2018.01.25)

    Hearing acuity
    • Sensitive to sounds from 20 to 20,000 Hz
    • Greatest sensitivity between 1000 and 6000 Hz
    • Non-linear perception of frequency intervals, e.g. octaves: 100 Hz - 200 Hz - 400 Hz - 800 Hz - 1600 Hz; 100 Hz to 800 Hz is perceived as a large difference, 3100 Hz to 3800 Hz as a small difference

    Absolute auditory threshold; SPL (sound pressure level), dB
    • Demo: decreasing noise levels (6 dB steps, 10 steps, 2x; 3 dB steps, 15 steps, 2x; 1 dB steps, 20 steps, 2x)

    Constant loudness levels in phons
    • Demo: SPL and loudness (phons) at 50-100-200-400-800-1600-3200-6400 Hz (1: constant SPL 40 dB, 2x; 2: constant 40 phons, 2x)

    Critical bands
    • Bandwidth increases with frequency: 200 Hz (critical bandwidth 50 Hz), 800 Hz (critical bandwidth 80 Hz), 3200 Hz (critical bandwidth 200 Hz)
    • Critical bands demo: Fm = 200 Hz, B = 300, 204, 141, 99, 70, 49, 35, 25, 17, 12 Hz; Fm = 800 Hz, B = 816, 566, 396, 279, 197, 139, 98, 69, 49, 35 Hz; Fm = 3200 Hz, B = 2263, 1585, 1115, 786, 555, 392, 277, 196, 139, 98 Hz

    Effects of masking
    • Low frequencies more effectively mask high frequencies

    Holistic vs. analytic listening
    • Demo 1: audible harmonics (1-5)
    • Demo 2: melody with harmonics
    • Demo: how
  • The Chomsky Enigma
    Weekly Worker 655 - Thursday January 4 2007

    The Chomsky enigma

    How is it that a powerful critic of US imperialism has been regarded as a valued asset by the US military? In the first of three articles, Chris Knight of the Radical Anthropology Group begins his examination of the life and work of Noam Chomsky.

    Noam Chomsky ranks among the leading intellectual figures of modern times. He has changed the way we think about what it means to be human, gaining a position in the history of ideas - at least according to his supporters - comparable with that of Galileo, Descartes or Newton. Since launching his intellectual assault against the academic orthodoxies of the 1950s, he has succeeded - almost single-handedly - in revolutionising linguistics and establishing it as a modern science.

    Such intellectual victories, however, have come at a cost. The stage was set for the "linguistics wars"1 when Chomsky published his first book. He might as well have thrown a bomb. "The extraordinary and traumatic impact of the publication of Syntactic Structures by Noam Chomsky in 1957," recalls one
  • Phrase Structure Rules, Tree Rewriting, and Other Sources of Recursion Structure Within the NP
    Introduction to Transformational Grammar, LINGUIST 601
    September 14, 2006
    Phrase Structure Rules, Tree Rewriting, and other sources of Recursion; Structure within the NP

    1 Trees

    (1) a tree for 'the brown fox sings'

                A
               / \
              B   C
             / \   \
            D   E  sings
            |  / \
          the F   G
              |    \
           brown   fox

    • Linguistic trees have nodes. The nodes in (1) are A, B, C, D, E, F, and G.
    • There are two kinds of nodes: internal nodes and terminal nodes. The internal nodes in (1) are A, B, and E. The terminal nodes are C, D, F, and G. Terminal nodes are so called because they are not expanded into anything further. The tree ends there. Terminal nodes are also called leaf nodes. The leaves of (1) are really the words that constitute the sentence 'the brown fox sings', i.e. 'the', 'brown', 'fox', and 'sings'.

    (2) a. A set of nodes form a constituent iff they are exhaustively dominated by a common node.
        b. X is a constituent of Y iff X is dominated by Y.
        c. X is an immediate constituent of Y iff X is immediately dominated by Y.

    Notions such as subject, object, prepositional object etc. can be defined structurally. So a subject is the NP immediately dominated by S and an object is an NP immediately dominated by VP etc.

    (3) a. If a node X immediately dominates a node Y, then X is the mother of Y, and Y is the daughter of X.
        b. A set of nodes are sisters if they are all immediately dominated by the same (mother) node.
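The dominance and sisterhood definitions in this handout can be sketched directly. The class and function names here are mine; the node labels follow the example tree for 'the brown fox sings'.

```python
# Minimal sketch of tree dominance, mother/daughter, and sisterhood.

class Node:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

    def dominates(self, other):
        # X dominates Y iff Y is somewhere below X in the tree.
        return any(c is other or c.dominates(other) for c in self.children)

def walk(node):
    yield node
    for c in node.children:
        yield from walk(c)

def sisters(x, y, root):
    # Sisters are immediately dominated by the same (mother) node.
    return any(x in n.children and y in n.children for n in walk(root))

# The tree in (1): A -> B C; B -> D E; E -> F G; terminals C, D, F, G.
F = Node("brown")
G = Node("fox")
E = Node("E", [F, G])
D = Node("the")
B = Node("B", [D, E])
C = Node("sings")
A = Node("A", [B, C])

print(A.dominates(F))    # True: A dominates every node below it
print(sisters(B, C, A))  # True: B and C share mother A
print(sisters(D, F, A))  # False: D's mother is B, F's mother is E
```

A structural notion like "subject" could then be read off the tree (e.g. the NP node immediately dominated by S), exactly as the handout describes.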
  • G612310 Syntactic Theory and Analysis
    The Openness of Natural Languages Paul M. Postal, New York University e-mail: [email protected] Preface It might seem plausible to the non-specialist who thinks about natural language (NL) that a given NL, NLx, permits one to report linguistic performances, both performances of NLx elements and those of NLs distinct from NLx. By ‘reporting linguistic performances’ I refer to nothing more arcane than forming statements like ‘Amanda just shouted ‘where’s my baby?’’ It might also seem to a non-specialist that NLx permits one to do descriptive linguistics, not only the descriptive linguistics of NLx, but that of other distinct NLs. By ‘doing descriptive linguistics’ I mean nothing more exotic than forming sentences like ‘The German word for ‘air force’ is ‘Luftwaffe’’. But while these non-specialist assumptions might seem not only plausible but self-evidently true, modern linguistics in its dominant instantiation called generative grammar, in fact denies both these claims. Of course, it does this only implicitly and most advocates of generative grammar may be unaware that its doctrines taken literally preclude what any non-specialist would assume possible. Readers not easily accepting this conclusion will find their skepticism addressed in what follows, for a major goal of this study is to justify in detail the claim that generative grammar has the evidently intolerable implications just mentioned. Section 1 Background Near the beginning of the generative grammar movement in linguistics the following claims were made (all emphases mine: PMP): (1)a. Chomsky (1959: 137) “A language is a collection of sentences of finite length all constructed from a finite alphabet (or, where our concern is limited to syntax, a finite vocabulary) of symbols.” b.