Syntactic Structures Revisited: Contemporary Lectures on Classic Transformational Theory

Total Pages: 16

File Type: PDF, Size: 1020 KB

SYNTACTIC STRUCTURES REVISITED
Contemporary Lectures on Classic Transformational Theory

Howard Lasnik
with Marcela Depiante and Arthur Stepanov

The MIT Press
Cambridge, Massachusetts
London, England

© 2000 Massachusetts Institute of Technology

All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.

This book was set in Times New Roman by Asco Typesetters, Hong Kong, and was printed and bound in the United States of America.

Library of Congress Cataloging-in-Publication Data
Lasnik, Howard.
Syntactic structures revisited : contemporary lectures on classic transformational theory / Howard Lasnik with Marcela Depiante and Arthur Stepanov.
p. cm. (Current studies in linguistics; 33)
Based on tape recordings made in the fall of 1995 of a portion of a syntax course taught by Howard Lasnik at the University of Connecticut.
Includes bibliographical references and index.
ISBN 0-262-12222-7 (hc : alk. paper). ISBN 0-262-62133-9 (pb : alk. paper)
1. Generative grammar. 2. Grammar, Comparative and general—Syntax. 3. Chomsky, Noam. Syntactic structures. I. Depiante, Marcela A. II. Stepanov, Arthur. III. Title. IV. Series: Current studies in linguistics series; 33.
P158.L377 2000
415-dc21 99-39907
CIP

Contents

Preface vii
Introduction 1

Chapter 1
Structure and Infinity of Human Language 5
1.1 Structure 5
1.2 Infinity 11
Exercises 34
1.3 English Verbal Morphology 35

Chapter 2
Transformational Grammar 51
2.1 What Is a Transformation? 51
2.2 A First Look at Transformations: The Number and Auxiliary Transformations T15 and T20 53
2.3 Properties of Transformations 56
2.4 Other Properties of Transformations 64
2.5 Transformations in Syntactic Structures 66
Exercises 105
2.6 Theoretical Issues Raised by Syntactic Structures 106
2.7 Learnability and Language Acquisition 114
Exercises 124

Chapter 3
Verbal Morphology: Syntactic Structures and Beyond 125
3.1 Problems in Syntactic Structures 125
3.2 X-Bar Theory 128
3.3 Subcategorization and Selection 129
3.4 English Verbal Morphology Revisited 136
3.5 V-Raising and Split I 163
3.6 Verb Movement and Economy: Chomsky 1991 165
3.7 Chomsky 1993 181
3.8 Syntactic Structures Revived: Lasnik 1995 187

Notes 197
References 203
Index 207

Preface

This book is an introduction to some classic ideas and analyses of transformational generative grammar, viewed both on their own terms and from a more modern perspective. Like A Course in GB Syntax (Lasnik and Uriagereka 1988), the book grew out of a transcript (created from tapes) of a portion of a course, in particular, the first several units of the first-semester graduate syntax course at the University of Connecticut. The tapes were made in the fall of 1995, and Marcela Depiante, a UConn graduate student, did the transcription and initial editing the following year. Arthur Stepanov, another UConn graduate student, did the subsequent editing, including organizing the material into chapters, numbering the examples, and providing bibliographic references.
In the book, as in the course, I examine in considerable detail the central analyses presented by Noam Chomsky in Syntactic Structures (1957) and the theory underlying those analyses, a theory completely formulated in Chomsky's (1955) The Logical Structure of Linguistic Theory. The major focus is on the best set of analyses in Syntactic Structures and The Logical Structure of Linguistic Theory (and, in many respects, the best set of analyses in the history of our field), those treating English verbal morphology. I show how the technology works, often filling in underlying assumptions and formal particulars that are left unstated in Syntactic Structures. I emphasize the virtues of these analyses because those virtues have been overlooked in recent decades.

However, as is well known, the analyses are not without defects, particularly with respect to questions of explanatory adequacy, that is, questions of how the child, faced with limited data, arrives at the correct grammar out of the vast set of possible grammars made available by the theory. Thus, in this book, after laying out the Syntactic Structures account, I follow the pendulum swing the field took toward greater explanatory adequacy, as I present successive theoretical developments

Introduction

Our concerns in this course will be driven by two fundamental inquiries: What is "knowledge of language"? How does it arise in the individual?

At the beginning, I want to forestall some potential confusion that has overtaken a large part of the field. In fact, there are two branches of the investigation of language. One is represented by "practitioners," that is, practicing linguists, practicing psychologists, and so on. The other is the "reflective" part of the field represented by philosophers and cognitive scientists who are concerned with deep conceptual issues relating to the study of language. The confusion has arisen in the "reflective" branch, which often tries to treat natural language words as if they were technical terms and vice versa. Consider the following English sentence:

(1) Howard knows English

Philosophers have always been quite rightly concerned with the concept "knowledge." However, the trap is to think that a sentence like (1) can tell us something about this and other concepts. There is no a priori reason to think that it can. To see the point, consider this:

(2) Howard forced Arthur to leave the room

There is no reason to think that this sentence can tell a physicist something about the concept "force." That seems trivially obvious. (1) is a similar case, but it is actually misleading in two ways. One way in which it is misleading is that it uses the word know. However, there are languages (like French) that do not even use the word for know in expressing the meaning of (1). The other way in which (1) is misleading is that it looks like a two-place predicate know establishing some kind of relation (or "knowing") between Howard and English. Assuming that individuals like "Howard" exist, and given this relation of "knowing," the question is, what is this object "English"?

This object has some peculiar properties. One such property is that a sentence like

(3) This floor needs washing

is and is not a part of English. In some parts of the United States, in particular western Pennsylvania, this sentence is never said (although still understood), whereas in the rest of the country it is perfectly good. Conversely, a sentence like

(4) This floor needs washed

is not and is a part of English, since it is said in the parts of the country where (3) is not acceptable, whereas for people in the rest of the country it sounds very peculiar. It is easy to multiply parallel examples. What is "English," then, the system that has these strange contradictory properties? It is some sort of sociological and political construct, based on things like history and colors on a map. Following standard usage, I will continue to use terms like English and Russian, but I will be careful not to attribute much significance to them.

Thus, we cannot get too far investigating the concepts we are interested in just by relying on common sense or by looking at sentences of natural language. We want to treat sentences of natural languages as data that we need to analyze, but we do not expect them to tell us what the concepts of our field are. Following Chomsky, the approach that I want to take is the following: I will not grant (1) any theoretical

Knowledge of language cannot be a list of behaviors. Linguists came to that conclusion for many reasons, but they can be summed up in the term that Chomsky often uses (borrowed from Humboldt, and perhaps most significantly Descartes): the creative aspect of language use. Speakers of human languages can produce and understand sentences they've never heard or produced before. How can a behavior that was never produced before be shaped by operant conditioning? One frequent answer invokes analogy; that is to say, the new behaviors we produce that we haven't produced before are "analogous" to behaviors that we have produced and have been reinforced for before. The hard question is how to make the notion "analogous" precise. Ultimately, the right notion of analogy will have to be something like what can be found in pages 111-114 of Syntactic Structures, a set of rules for one kind of structure and another set of rules for relating one structure to another. But then, why call it analogy at all?

The list of behaviors of which knowledge of language purportedly consists has to rely on notions like "utterance" and "word." But what is a word? What is an utterance? These notions are already quite abstract. Even more abstract is the notion "sentence." Chomsky has been and continues to be criticized for positing such abstract notions as transformations and structures, but the big leap is what everyone takes for granted. It's widely assumed that the big step is going from sentence to transformation, but this in fact isn't a significant leap. The big step is going from "noise" to "word." Once we have some notion of structure, we are in a position to address
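To make the phrase "a set of rules for one kind of structure and another set of rules for relating one structure to another" concrete, here is a toy sketch in Python. It is not from the book: the tiny grammar, the list-based tree representation, and the inversion rule are illustrative assumptions, not Chomsky's or Lasnik's actual formulations. A few phrase-structure rules build a bracketed structure for a simple clause, and one structure-to-structure rule relates the declarative structure to its inverted counterpart.

# Toy sketch (not from the book): phrase-structure rules build a structure,
# and a separate rule relates that structure to another one.

# Phrase-structure rules: rewrite a symbol as a sequence of symbols.
PS_RULES = {
    "S":   ["NP", "Aux", "VP"],
    "NP":  ["John"],
    "Aux": ["will"],
    "VP":  ["leave"],
}

def expand(symbol):
    """Build a bracketed structure by recursively applying the PS rules."""
    if symbol not in PS_RULES:          # terminal: a word
        return symbol
    return [symbol] + [expand(child) for child in PS_RULES[symbol]]

def invert(tree):
    """A rule relating structures: [S NP Aux VP] -> [S Aux NP VP]."""
    label, np, aux, vp = tree
    assert label == "S"
    return [label, aux, np, vp]

def terminals(tree):
    """Read the word string off a structure."""
    if isinstance(tree, str):
        return [tree]
    return [word for child in tree[1:] for word in terminals(child)]

declarative = expand("S")
question = invert(declarative)
print(" ".join(terminals(declarative)))   # John will leave
print(" ".join(terminals(question)))      # will John leave

Running the sketch prints the terminal strings of the two related structures: the PS rules generate one kind of structure, and the invert rule maps it to another, which is the division of labor the passage above describes.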
Recommended publications
• Scaffolding Learning of Language Structures with Visual-Syntactic Text Formatting
UC Irvine Previously Published Works
Title: Scaffolding learning of language structures with visual-syntactic text formatting
Permalink: https://escholarship.org/uc/item/6235t25b
Journal: British Journal of Educational Technology, 50(4). ISSN 0007-1013
Authors: Park, Y; Xu, Y; Collins, P; et al.
Publication Date: 2018. DOI: 10.1111/bjet.12689. Peer reviewed.

British Journal of Educational Technology Vol 0 No 0 2018 1–17, doi:10.1111/bjet.12689. Scaffolding learning of language structures with visual-syntactic text formatting. Youngmin Park, Ying Xu, Penelope Collins, George Farkas and Mark Warschauer. Youngmin Park is a Lecturer at Pusan National University, Korea. She received her Ph.D. degree from the University of California, Irvine, specializing in Language, Literacy and Technology. Ying Xu is a Ph.D. student at the University of California, Irvine, with a specialization in Language, Literacy and Technology. Penelope Collins is an associate professor at the University of California, Irvine. Her research examines the development of language and literacy skills for children from linguistically diverse backgrounds. George Farkas is a professor at the University of California, Irvine. He has employed a range of statistical approaches and databases to examine the causes and consequences of the reading achievement gap across varying age groups and educational settings. Mark Warschauer is a professor at the University of California, Irvine. He works on a range of research projects related to digital media in education. Address for correspondence: Ying Xu, University of California Irvine, 3200 Education Bldg, Irvine, CA 92697, USA.
• Chapter 7 Linguistics As a Science of Structure Ryan M. Nefdt
Chapter 7. Linguistics as a science of structure. Ryan M. Nefdt, University of the Western Cape.

Generative linguistics has rapidly changed during the course of a relatively short period. This has caused many to question its scientific status as a realist scientific theory (Stokhof & van Lambalgen 2011; Lappin et al. 2000). In this chapter, I argue against this conclusion. Specifically, I claim that the mathematical foundations of the science present a different story below the surface. I agree with critics that due to the major shifts in theory over the past 80 years, linguistics is indeed opened up to the problem of pessimistic meta-induction or radical theory change. However, I further argue that a structural realist approach (Ladyman 1998; French 2006) can save the field from this problem and at the same time capture its structural nature. I discuss particular historical instances of theory change in generative grammar as evidence for this interpretation and finally attempt to extend it beyond the generative tradition to encompass previous frameworks in linguistics.

1 Introduction

The generativist revolution in linguistics started in the mid-1950s, inspired in large part by insights from mathematical logic and in particular proof theory. Since then, generative linguistics has become a dominant paradigm, with many connections to both the formal and natural sciences. At the centre of the newly established discipline was the syntactic or formal engine, the structures of which were revealed through modelling grammatical form. The generativist paradigm in linguistics initially relied heavily upon the proof-theoretic techniques introduced by Emil Post and other formal logicians to model the form language takes (Tomalin 2006; Pullum 2011; 2013).1 Yet despite these aforementioned formal beginnings, the generative theory of linguistics has changed its commitments quite

1 Here my focus will largely be on the formal history of generative syntax but I will make some comments on other aspects of linguistics along the way.
• Syntactic Structures and Their Symbiotic Guests. Notes on Analepsis from the Perspective of On-Line Syntax*
Pragmatics 24:3.533-560 (2014). International Pragmatics Association. DOI: 10.1075/prag.24.3.05aue

SYNTACTIC STRUCTURES AND THEIR SYMBIOTIC GUESTS. NOTES ON ANALEPSIS FROM THE PERSPECTIVE OF ON-LINE SYNTAX*
Peter Auer

Abstract. The empirical focus of this paper is on utterances that re-use syntactic structures from a preceding syntactic unit. Next utterances of this type are usually treated as (coordination) ellipsis. It is argued that from an on-line perspective on spoken syntax, they are better described as structural latency: A grammatical structure already established remains available and can therefore be made use of with one or more of its slots being filled by new material. A variety of cases of this particular kind of conversational symbiosis are discussed. It is argued that they should receive a common treatment. A number of features of the general host/guest relationship are discussed.

Keywords: Analepsis; On-line syntax; Structural latency.

"A glance into any narrative and a simple reflection must prove that every earlier utterance of the narrator forms the exposition of all subsequent predicates." (Wegener 1885: 46)

1. Introduction

The notion of 'ellipsis' is often regarded with some skepticism by Interactional Linguists - the orientation to 'full' sentences is all too obvious (cf. Selting 1997), and the idea that speakers produce complete sentences just in order to delete some parts of them afterwards surely fails to account for the processual dynamics of sentence production and understanding (cf. Kindt 1985) in time. On the other hand, there can be little doubt that speakers often produce utterance units that could not be understood

* I wish to thank Elizabeth Couper-Kuhlen and Susanne Günthner for their helpful comments on a previous version of this paper.
  • LING83600: Context-Free Grammars
LING83600: Context-free grammars
Kyle Gorman

1 Introduction

Context-free grammars (or CFGs) are a formalization of what linguists call "phrase structure grammars". The formalism is originally due to Chomsky (1956), though it has been independently discovered several times. The insight underlying CFGs is the notion of constituency. Most linguists believe that human languages cannot be even weakly described by CFGs (e.g., Shieber 1985). And in formal work, context-free grammars have long been superseded by "transformational" approaches which introduce movement operations (in some camps), or by "generalized phrase structure" approaches which add more complex phrase-building operations (in other camps). However, unlike more expressive formalisms, there exist relatively efficient cubic-time parsing algorithms for CFGs. Nearly all programming languages are described by—and parsed using—a CFG,1 and CFG grammars of human languages are widely used as a model of syntactic structure in natural language processing and understanding tasks.

Bibliographic note: This handout covers the formal definition of context-free grammars; the Cocke-Younger-Kasami (CYK) parsing algorithm will be covered in a separate assignment. The definitions here are loosely based on chapter 11 of Jurafsky & Martin, who in turn draw from Hopcroft and Ullman 1979.

2 Definitions

2.1 Context-free grammars

A context-free grammar G is a four-tuple (N, Σ, R, S) such that:
• N is a set of non-terminal symbols, corresponding to phrase markers in a syntactic tree.
• Σ is a set of terminal symbols, corresponding to words (i.e., X0s) in a syntactic tree.
• R is a set of production rules.
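As a concrete illustration of the four-tuple definition and of the cubic-time parsing the handout mentions, here is a minimal Python sketch (not part of the handout): a toy grammar encoded as (N, Sigma, R, S), together with a CYK-style recognizer. The toy grammar, the dictionary encoding of R, and the name cyk_recognize are illustrative assumptions; CYK requires rules in Chomsky normal form (A -> B C or A -> terminal), which this grammar satisfies.

# Minimal sketch: a CFG as the four-tuple (N, Sigma, R, S) and a CYK recognizer.

N = {"S", "NP", "VP", "V", "D", "N"}       # non-terminal symbols
Sigma = {"the", "dog", "cat", "chased"}     # terminal symbols
R = {                                       # production rules (CNF)
    "S":  [("NP", "VP")],
    "NP": [("D", "N")],
    "VP": [("V", "NP")],
    "D":  [("the",)],
    "N":  [("dog",), ("cat",)],
    "V":  [("chased",)],
}
S = "S"                                     # start symbol

def cyk_recognize(words):
    """Return True if the word sequence is derivable from S (O(n^3) time)."""
    n = len(words)
    # table[i][j] holds the non-terminals that derive words[i:j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, word in enumerate(words):        # terminal rules fill the diagonal
        for lhs, rhss in R.items():
            if (word,) in rhss:
                table[i][i].add(lhs)
    for span in range(2, n + 1):            # length of the substring
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):           # split point between two parts
                for lhs, rhss in R.items():
                    for rhs in rhss:
                        if (len(rhs) == 2
                                and rhs[0] in table[i][k]
                                and rhs[1] in table[k + 1][j]):
                            table[i][j].add(lhs)
    return S in table[0][n - 1]

print(cyk_recognize("the dog chased the cat".split()))   # True
print(cyk_recognize("dog the chased cat the".split()))   # False

The recognizer only consults R and S; N and Sigma are listed to complete the four-tuple. The three nested loops over span, start position, and split point are what give the cubic running time mentioned above.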
  • Generative Linguistics and Neural Networks at 60: Foundation, Friction, and Fusion*
Generative linguistics and neural networks at 60: foundation, friction, and fusion*
Joe Pater, University of Massachusetts Amherst
October 3, 2018

Abstract. The birthdate of both generative linguistics and neural networks can be taken as 1957, the year of the publication of foundational work by both Noam Chomsky and Frank Rosenblatt. This paper traces the development of these two approaches to cognitive science, from their largely autonomous early development in their first thirty years, through their collision in the 1980s around the past tense debate (Rumelhart and McClelland 1986, Pinker and Prince 1988), and their integration in much subsequent work up to the present. Although this integration has produced a considerable body of results, the continued general gulf between these two lines of research is likely impeding progress in both: on learning in generative linguistics, and on the representation of language in neural modeling. The paper concludes with a brief argument that generative linguistics is unlikely to fulfill its promise of accounting for language learning if it continues to maintain its distance from neural and statistical approaches to learning.

1. Introduction

At the beginning of 1957, two men nearing their 29th birthdays published work that laid the foundation for two radically different approaches to cognitive science. One of these men, Noam Chomsky, continues to contribute sixty years later to the field that he founded, generative linguistics. The book he published in 1957, Syntactic Structures, has been ranked as the most influential work in cognitive science from the 20th century.1 The other one, Frank Rosenblatt, had by the late 1960s largely moved on from his research on perceptrons – now called neural networks – and died tragically young in 1971.
  • The Chomsky Enigma
Weekly Worker 655, Thursday January 11 2007

The Chomsky enigma

How is it that a powerful critic of US imperialism has been regarded as a valued asset by the US military? In the first of three articles, Chris Knight of the Radical Anthropology Group begins his examination of the life and work of Noam Chomsky.

Noam Chomsky ranks among the leading intellectual figures of modern times. He has changed the way we think about what it means to be human, gaining a position in the history of ideas - at least according to his supporters - comparable with that of Galileo, Descartes or Newton. Since launching his intellectual assault against the academic orthodoxies of the 1950s, he has succeeded - almost single-handedly - in revolutionising linguistics and establishing it as a modern science.

Such intellectual victories, however, have come at a cost. The stage was set for the "linguistics wars"1 when Chomsky published his first book. He might as well have thrown a bomb. "The extraordinary and traumatic impact of the publication of Syntactic structures by Noam Chomsky in 1957," recalls one
  • G612310 Syntactic Theory and Analysis
The Openness of Natural Languages
Paul M. Postal, New York University
e-mail: [email protected]

Preface

It might seem plausible to the non-specialist who thinks about natural language (NL) that a given NL, NLx, permits one to report linguistic performances, both performances of NLx elements and those of NLs distinct from NLx. By 'reporting linguistic performances' I refer to nothing more arcane than forming statements like 'Amanda just shouted 'where's my baby?''. It might also seem to a non-specialist that NLx permits one to do descriptive linguistics, not only the descriptive linguistics of NLx, but that of other distinct NLs. By 'doing descriptive linguistics' I mean nothing more exotic than forming sentences like 'The German word for 'air force' is 'Luftwaffe''. But while these non-specialist assumptions might seem not only plausible but self-evidently true, modern linguistics in its dominant instantiation called generative grammar in fact denies both these claims. Of course, it does this only implicitly, and most advocates of generative grammar may be unaware that its doctrines taken literally preclude what any non-specialist would assume possible. Readers not easily accepting this conclusion will find their skepticism addressed in what follows, for a major goal of this study is to justify in detail the claim that generative grammar has the evidently intolerable implications just mentioned.

Section 1: Background

Near the beginning of the generative grammar movement in linguistics the following claims were made (all emphases mine: PMP):

(1)a. Chomsky (1959: 137) "A language is a collection of sentences of finite length all constructed from a finite alphabet (or, where our concern is limited to syntax, a finite vocabulary) of symbols."
b.
  • Right Association in Parsing and Grammar*
Right Association in Parsing and Grammar*
Colin Phillips, Massachusetts Institute of Technology

1. Introduction

It is straightforward enough to show that sentence parsing and grammaticality judgements are different. There are sentences which are easy to parse but ungrammatical (e.g. that-trace effects), and there are sentences which are extremely difficult to parse, but which may be judged grammatical given appropriate time for reflection (e.g. multiply center-embedded sentences). This classic argument shows that parsing and grammar are not identical, but it tells us very little about just how much they have in common. The goal of this paper is to point to one principle which may account for a number of effects of ambiguity in both domains.

I have in mind two closely related kinds of ambiguity. Both occur in situations in which structure is underdetermined by meaning. The first kind of ambiguity is familiar from the literature on sentence comprehension: there are points in the comprehension of sentences where the input is consistent with more than one structure. Not knowing the speaker's intended meaning, the comprehender must make a structural choice independent of meaning. This is a typical instance of a processing ambiguity. There is, however, an additional kind of situation in which structural choices are undetermined by meaning, one which has received less attention. When I say something, knowing the meaning that I intend to convey and the words that I intend to use may not be sufficient to determine the structure that my grammar assigns to them. There could be more than one way of structuring the same set of elements such that the semantic rules of my grammar assign the intended interpretation to the sentence.
  • STRUCTURE and STRUCTURALISM in PHILOSOPHY of LANGUAGE and SEMIOTICS by Susan Petrilli
STRUCTURE AND STRUCTURALISM IN PHILOSOPHY OF LANGUAGE AND SEMIOTICS, by Susan Petrilli

Abstract

Structuralism covers a broad range of different tendencies in different disciplines over the entire twentieth century. The term structuralism is plurivocal: it is used for different trends from a variety of different scientific fields and may even diverge on the theoretical and methodological levels. This essay examines some of the main trends in structuralism not only in linguistics, but beyond in other areas of research on language and signs, including philosophy of language through to latest developments in semiotics, and most recently biosemiotics. A critical approach to structuralism is proposed for the development of critical structuralism involving such problematics as Marxian proto-structuralism; the intersemiotic transposition of semiotic approaches to linguistic and socio-cultural structures; ontological structuralism and methodological structuralism; the human being as a semiotic animal and a structuralist animal.
  • Syntactic Structures in Dialogue: the Role of Influence
University of Northern Iowa. UNI ScholarWorks. Dissertations and Theses @ UNI. Student Work. 2016.

Syntactic structures in dialogue: The role of influence. Brenda Ellen Hite, University of Northern Iowa. Copyright © 2016 Brenda Ellen Hite. All Rights Reserved.

Recommended Citation: Hite, Brenda Ellen, "Syntactic structures in dialogue: The role of influence" (2016). Dissertations and Theses @ UNI. 286. https://scholarworks.uni.edu/etd/286

SYNTACTIC STRUCTURES IN DIALOGUE: THE ROLE OF INFLUENCE. An Abstract of a Thesis Submitted in Partial Fulfillment of the Requirements for the Degree Master of Arts in Education. Brenda Ellen Hite, University of Northern Iowa, July 2016.

ABSTRACT

Genuine dialogues reflect the way that language is initially developed by providing opportunities to hear and practice language. Participants within dialogues are found to speak in similar ways through the priming effect, giving language learners continued opportunities to hear and practice a greater variety of syntactic structures. The purpose of this study was to determine if genuine dialogue, and the priming that occurs, could aid in the use and expansion of syntactic structures. Within the context of Reading Recovery, the teacher/researcher (a female member of the dominant culture) and a single student participant (male, African American, 6 years old) were engaged in dialogue.
  • Syntactic Enhancement and Second Language Literacy: an Experimental Study
Language Learning & Technology, October 2016, Volume 20, Number 3, pp. 180–199. http://llt.msu.edu/issues/october2016/parkwarschauer.pdf

SYNTACTIC ENHANCEMENT AND SECOND LANGUAGE LITERACY: AN EXPERIMENTAL STUDY
Youngmin Park, University of California, Irvine
Mark Warschauer, University of California, Irvine

This experimental study examined how the reading and writing development of sixth-grade L2 students was affected by syntactic enhancement. Visual-syntactic text formatting (VSTF) technology, which visualizes syntactic structures, was used to convert a textbook to one with syntactic enhancement. The sample (n = 282), which was drawn from a larger study conducted in Southern California, was a mixed population of English proficiency levels: low-proficiency L2 students (n = 113) and high-proficiency L2 students (n = 169). Over a school year, VSTF students read their English language arts (ELA) textbooks in VSTF on their laptops and control students read their regular block-formatted textbooks either on their laptops or in print. Observations and interviews revealed that VSTF reading facilitated student engagement in ELA instruction by drawing students' attention to syntactic structures. The results of the California Standards Tests (CST) before and after the intervention were examined to evaluate participants' learning outcomes. Although high-proficiency students did not show a significant improvement on the post-test, low-proficiency students made significant gains on two subtests of the CST: written conventions and writing strategies. These findings suggest that VSTF reading may facilitate improving syntactic awareness, which is essential for L2 students to develop their English reading and writing skills in academic contexts.

Language(s) Learned in this Study: English
Keywords: Computer-assisted Language Learning, Grammar, Reading, Writing
APA Citation: Park, Y., & Warschauer, M.
  • Relating Structure and Time in Linguistics and Psycholinguistics Colin Phillips and Matthew Wagers
CHAPTER 45. Relating structure and time in linguistics and psycholinguistics. Colin Phillips and Matthew Wagers.

45.1 Linguistics and psycholinguistics

The field of psycholinguistics advertises its mentalistic commitments in its name. The field of linguistics does not. Psycholinguistic research frequently involves ingenious experimental designs, fancy lab equipment such as eye-trackers or electroencephalograms (EEGs), large groups of experimental subjects, and detailed statistical analyses. Linguistic research typically requires no specialized equipment, no statistical analyses, and somewhere between zero and a handful of cooperative informants. Psycholinguistic research is most commonly conducted in a Department of Psychology. Linguistic research is not. Some of these differences may contribute to the widespread perception, well-represented among lin-

questions, and vice versa. This is an area where a rich linguistic literature and a sizeable body of psycholinguistic research address closely related phenomena. The most widely discussed form of unbounded dependency occurs when a noun phrase (NP) such as which voters in (1) appears in a position that is structurally distant from the verb that it is an argument of (e.g. bribe in (1)). Following standard practice, we mark the canonical position of the direct object of bribe with an underline or gap in (1), but it is a matter of great controversy whether such gap positions are a part of the mental representation of sentences like (1). After reviewing some of the competing linguistic analyses of unbounded dependency constructions in section 45.2, we discuss in section 45.3 the contributions of psycholinguistics to the question of how these dependencies are represented.