LNGT0101 Introduction to Linguistics, Lecture #15: Transformations (Primarily Movement)


LNGT0101 Introduction to Linguistics
Lecture #15
Nov 5th, 2012

Announcements

LAP typed proposal is due by e-mail or as hard copy in my mailbox this Friday by 3pm. Mention the language of your choice and the interesting linguistic or nonlinguistic aspect that you hope to study in your project. List at least two references that you'll be using.
HW4 will be the final homework, so you can focus on the LAP.
On Wed, please sit in 'dialectal' groups, as in the clip we saw on HW3: http://www.youtube.com/watch?v=fGxlxOcS-tE
I will send a doodle link for possible times for the screening of The Linguists.

A visual puzzle: http://www.magicmgmt.com/gary/oi_pac_tri/#

Another visual illusion (just for fun): http://www.youtube.com/watch?v=hPCoe-6RRks&feature=player_embedded#!

Summary of Syntax so far

Syntax is the study of sentence structure. The key notion to understanding sentence structure in human language is "constituency." Constituency of a string of words can be determined by objective diagnostics: the substitution, movement, clefting, and stand-alone tests.
Constituents are phrases. A phrase is a string of words composed of a syntactic head, its complement (if needed), and its specifier (if any). All phrases follow the X'-schema:
  XP → Specifier X'
  X' → X Complement
The syntactic categories we talked about so far are: NP, VP, PP, AP, AuxP, and CP.
Our grammar thus far has two types of rules:
(i) Phrase structure rules (PSRs) of the form A → B C, and
(ii) Lexical insertion rules, which insert words into syntactic structures generated by PSRs.
The grammar we have constructed so far can account for the following aspects of speakers' syntactic knowledge:
- Grammaticality
- Recursiveness
- Ambiguity

Today's agenda

It remains to account for sentence relatedness. We do this with regard to the relationship between statements and questions (a variation on Linda's question from last time). We also need to explain why languages differ in their syntax. We do this with regard to word order (a variation on Danielle's question from last time).

Sentence relatedness revisited

As we said before, some sentences are intuitively "felt" to be related, e.g.,
a. Your friend can play the piano.
b. Can your friend play the piano?
We know that a phrase structure grammar can generate the (a) sentence, but the question now is: Can it also generate the sentence in (b)?
Here's the mini PSG again:
1. CP → C AuxP
2. AuxP → NP Aux'
3. Aux' → Aux VP
4. VP → V (NP) (PP)
5. VP → V (CP)
6. VP → V (AP)
7. NP → (Det) N (PP)
8. PP → (Deg) P NP
9. AP → (Deg) A (PP)
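To make the question concrete, here is a minimal sketch (plain Python, no external libraries) of a recognizer for a fragment of this mini PSG. It starts from AuxP rather than CP (the C head is silent in the statement anyway) and uses a toy four-word lexicon; both choices are simplifying assumptions for illustration, not part of the course handout.

```python
# GRAMMAR mirrors rules 2-4 and 7 of the mini PSG; the optional PP/CP/AP
# expansions are left out because this fragment does not need them.
GRAMMAR = {
    "AuxP": [["NP", "Aux'"]],              # rule 2
    "Aux'": [["Aux", "VP"]],               # rule 3
    "VP":   [["V", "NP"], ["V"]],          # rule 4 (without the PP option)
    "NP":   [["Det", "N"], ["N"]],         # rule 7 (without the PP option)
}

# A toy lexicon standing in for the lexical insertion rules.
LEXICON = {
    "Det": {"your", "the"},
    "N":   {"friend", "piano"},
    "Aux": {"can"},
    "V":   {"play"},
}

def derives(symbol, words):
    """Can `symbol` derive exactly this sequence of words?"""
    if symbol in LEXICON:
        return len(words) == 1 and words[0] in LEXICON[symbol]
    return any(splits(rhs, words) for rhs in GRAMMAR.get(symbol, []))

def splits(rhs, words):
    """Try every way of dividing `words` among the symbols in `rhs`."""
    if not rhs:
        return not words
    return any(derives(rhs[0], words[:i]) and splits(rhs[1:], words[i:])
               for i in range(len(words) + 1))

statement = "your friend can play the piano".split()
question  = "can your friend play the piano".split()
print(derives("AuxP", statement))   # True:  the PSG generates the statement
print(derives("AuxP", question))    # False: no rule puts Aux clause-initially
```

The second check fails for exactly the reason discussed on the next slide: every rule that introduces Aux puts it after the subject NP.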
The answer then is probably not. There is no PSR that will allow the Aux "can" to appear at the beginning of the sentence. But why should this be a problem? Can't we simply add a rule that allows us to have an Aux head at the beginning? After all, this is a mini-grammar, not an exhaustive grammar. Yes, we sure can. Here's one possible rule:
AuxP → Aux NP VP
Can this rule help? It can, but at a high cost: now we simply have no direct explanation for why a statement and a corresponding question are felt to be related. In essence, while a phrase structure grammar can account for grammaticality, ambiguity, and recursiveness, it fails to account for sentence relatedness in a straightforward manner, which is not a good result. To solve this problem, we need to enrich our grammar.

Transformational rules

A solution, first proposed by Chomsky in the 1950s, is to include another component in the grammar in addition to the phrase structure component: a transformational component that consists of a set of transformational rules.
What is a transformational rule? A transformational rule is a syntactic operation that takes one structure as input and operates on it, producing a modified syntactic structure as output.

Deep and surface structure

For this purpose, a fundamental distinction in the grammar has to be made between two separate levels of structure:
(a) a pre-transformational structure, which is called deep structure (or D-structure) and is derived by phrase structure rules, and
(b) a post-transformational structure, which is called surface structure (or S-structure) and is derived through the application of transformational rules.

Deriving English yes-no questions

So, let's now get back to the yes-no question "Can your friend play the piano?" and see how we can implement a transformational analysis. Instead of drawing a tree for the yes-no question directly, we actually draw a tree for the corresponding statement "Your friend can play the piano." The only difference is that such a structure will be marked as interrogative. We can do that, say, by adding a [+Q] feature on C in the tree. (Note: [+Q] indicates this sentence is interrogative. After all, we do not want to say that both sentences are identical. They obviously are not.)
D-structure ("Your friend can play the piano"):
[CP [C +Q] [AuxP [NP [Det your] [N friend]] [Aux' [Aux can] [VP [V play] [NP [Det the] [N piano]]]]]]
Now, a transformation, Aux-to-C Movement, moves Aux to C, thereby deriving the S-structure ("Can your friend play the piano?"):
[CP [C+Q can] [AuxP [NP [Det your] [N friend]] [Aux' [Aux t] [VP [V play] [NP [Det the] [N piano]]]]]]
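Since a transformational rule is just a mapping from one structure to another, Aux-to-C Movement can be sketched as a small function over labeled-bracket trees. The nested-list encoding, the helper names, and the decision to leave a trace "t" under Aux are assumptions made for this illustration, not an implementation from the course.

```python
D_STRUCTURE = \
    ["CP",
      ["C[+Q]"],                                    # silent interrogative C
      ["AuxP",
        ["NP", ["Det", "your"], ["N", "friend"]],
        ["Aux'",
          ["Aux", "can"],
          ["VP", ["V", "play"],
                 ["NP", ["Det", "the"], ["N", "piano"]]]]]]

def aux_to_c(cp):
    """Move the Aux head into an empty [+Q] C, leaving a trace 't' in Aux."""
    label, c, auxp = cp
    if label != "CP" or c != ["C[+Q]"]:
        return cp                                   # only applies to bare [+Q] CPs
    np, aux_bar = auxp[1], auxp[2]
    aux_word = aux_bar[1][1]                        # the Aux head, e.g. "can"
    new_aux_bar = ["Aux'", ["Aux", "t"], aux_bar[2]]
    return ["CP", ["C[+Q]", aux_word], ["AuxP", np, new_aux_bar]]

def yield_words(tree):
    """Read the pronounced terminals off a tree, skipping traces."""
    words = []
    for child in tree[1:]:                          # tree[0] is the category label
        if isinstance(child, list):
            words.extend(yield_words(child))
        elif child != "t":
            words.append(child)
    return words

s_structure = aux_to_c(D_STRUCTURE)
print(" ".join(yield_words(D_STRUCTURE)))           # your friend can play the piano
print(" ".join(yield_words(s_structure)))           # can your friend play the piano
```

The point of the toy is that one D-structure and one general movement operation give us both related sentences, which is exactly what the enlarged phrase structure grammar could not do.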
Evidence for Aux-to-C movement

But how do we prove that there is actually Aux-to-C movement in English yes-no questions? What evidence is there that 'invisible triangles' actually exist in syntax? Well, consider:
He asked if your friend could play the piano.
*He asked if could your friend play the piano.

Deriving wh-questions

OK, let's try another kind of question, the so-called wh-questions, e.g.,
What will your friend play?
Since "what" is interpreted as the object of "play," we assume that this is where it starts at D-structure:
your friend will play what
We apply PSRs to derive the D-structure:
[CP [C +Q] [AuxP [NP [Det your] [N friend]] [Aux' [Aux will] [VP [V play] [NP [N what]]]]]]
Now, since this is a question, we apply Aux-to-C movement to derive the S-structure:
[CP [C+Q will] [AuxP [NP [Det your] [N friend]] [Aux' [Aux t] [VP [V play] [NP [N what]]]]]]
But does that give us the desired sentence?

Where do wh-phrases end up?

To get the desired surface structure, we need to move the wh-phrase "what" to the front of the sentence. The question now is: Where does the wh-phrase move to? There is a restriction, however. It's called structure preservation: phrases can move only to specifier positions, and heads can only move to head positions. Wh-movement therefore takes "what" to the specifier of CP:
[CP [NP what] [C' [C+Q will] [AuxP [NP [Det your] [N friend]] [Aux' [Aux t] [VP [V play] [NP t]]]]]]
("t" stands for trace of the moved phrase, another invisible triangle.)

A puzzle: wanna-contraction

Who do you want to kiss?
Who do you wanna kiss?
Who do you want to kiss Mary?
*Who do you wanna kiss Mary?
Compare: I want to kiss Mary. I wanna kiss Mary.
Provide a principled explanation for the contrast for 4 points of extra credit. Deadline: Monday Nov 12th in class. Assume that 'do' starts under Aux. Assume also that verbs like 'want' subcategorize for an AuxP complement headed by the Aux element 'to,' so a VP rule in that case is VP → V AuxP. And think triangles.

Parameters of question-formation

Notice that not all languages are like English when it comes to wh-questions. Some languages like English form a question by fronting the wh-word:
What did you see _?
These are typically referred to as wh-fronting languages.
In other languages like Japanese, Chinese, and Egyptian Arabic, the wh-word appears where other nouns normally appear:
Japanese: John-ga dare-o butta ka? (John-Subj who-Obj hit Q-particle) "Who did John hit?"
Egyptian Arabic: /inta Suft miin? (you saw who) "Who did you see?"
Languages of this type are called wh-in-situ languages.

Syntax: The grammar model

Phrase structure grammar (X'-theory)
↓
D-structure
↓
Transformations (primarily Movement)
↓
S-structure
But if this language model is universal, why do languages differ then?

Universal Grammar: Principles and Parameters

Languages differ because UG (Universal Grammar, remember?) includes two components: principles and parameters. The principles are invariant; they hold in all languages. For example, grammatical rules are all structure-dependent, as discussed in Myth 12 early in the semester, in the textbook (pp. 157-60). Parameters are also universal, but unlike principles, they come in the form of (usually) binary options, and this is where the locus of cross-linguistic variation exists.
As Chomsky notes: "We can think of the initial state of the faculty of language as a fixed network connected to a switch box; the network is constituted of the principles of language, while the switches are the options to be determined by experience."
Or, we can represent this graphically: [switch-box diagram]
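In the same spirit, here is a toy sketch of the switch-box idea in code: one invariant structure-building principle, plus two illustrative binary parameters (head direction and wh-fronting) applied to the wh-question examples above. The parameter names, the flat linearization, and the omission of Aux movement and the Japanese Q-particle are simplifications assumed here, not claims from the course materials.

```python
# Parameter settings ("switches") for two of the languages discussed above.
SETTINGS = {
    "English":  {"head_initial": True,  "wh_fronting": True},
    "Japanese": {"head_initial": False, "wh_fronting": False},
}

def linearize(head, complement, params):
    """One invariant principle: a head combines with its complement.
    Only the linear order is fixed by a binary parameter."""
    return [head, complement] if params["head_initial"] else [complement, head]

def wh_question(subject, verb, wh_word, params):
    """Build a bare wh-question (Aux movement and Q-particles omitted)."""
    vp = linearize(verb, wh_word, params)       # VP = V plus its wh object
    clause = [subject] + vp
    if params["wh_fronting"]:                   # the wh-movement "switch"
        clause.remove(wh_word)
        clause = [wh_word] + clause
    return " ".join(clause)

print(wh_question("you", "saw", "what", SETTINGS["English"]))    # what you saw
print(wh_question("John", "hit", "who", SETTINGS["Japanese"]))   # John who hit
```

The same two functions serve both languages; only the switch settings differ, which is the sense in which parameters localize cross-linguistic variation while the principles stay fixed.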