Dependency Grammar
Christopher Manning
Dependency Grammar and Dependency Structure

Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations ("arrows") called dependencies.

[Figure: dependency tree for "Bills on ports and immigration were submitted by Senator Brownback, Republican of Kansas", with typed arcs such as nsubjpass(submitted, Bills), auxpass(submitted, were), prep(submitted, by), pobj(by, Brownback), appos(Brownback, Republican), and pobj(of, Kansas).]

• The arrow connects a head (governor, superior, regent) with a dependent (modifier, inferior, subordinate)
• The arrows are commonly typed with the name of grammatical relations (subject, prepositional object, apposition, etc.)
• Usually, dependencies form a tree (connected, acyclic, single-head)

Dependency Grammar/Parsing History

• The idea of dependency structure goes back a long way
  • To Pāṇini's grammar (c. 5th century BCE)
  • Basic approach of 1st millennium Arabic grammarians
• Constituency is a new-fangled invention
  • 20th century invention (R.S. Wells, 1947)
• Modern dependency work is often linked to the work of L. Tesnière (1959)
  • Was the dominant approach in the "East" (Russia, China, …)
  • Good for freer word order languages
• Among the earliest kinds of parsers in NLP, even in the US:
  • David Hays, one of the founders of U.S. computational linguistics, built an early (perhaps the first) dependency parser (Hays 1962)

Dependency Grammar and Dependency Structure

ROOT Discussion of the outstanding issues was completed .

• Some people draw the arrows one way; some the other way!
  • Tesnière had them point from head to dependent…
• Usually a fake ROOT is added, so that every word is a dependent of precisely 1 other node

Relation between phrase structure and dependency structure

• A dependency grammar has a notion of a head. Officially, CFGs don't.
• But modern linguistic theory and all modern statistical parsers (Charniak, Collins, Stanford, …) do, via hand-written phrasal "head rules":
  • The head of a Noun Phrase is a noun/number/adj/…
  • The head of a Verb Phrase is a verb/modal/…
• The head rules can be used to extract a dependency parse from a CFG parse (a sketch of this conversion follows below)
• The closure of dependencies gives constituency from a dependency tree
• But the dependents of a word must then all be at the same level (i.e., "flat"): there can be no VP!
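To make the head-rule idea concrete, here is a minimal sketch of the CFG-to-dependency conversion the slide describes. The Node class and the tiny HEAD_RULES table are illustrative stand-ins, not the actual rule tables used by the Charniak, Collins, or Stanford parsers:

class Node:
    """A constituency-tree node; leaves carry a word, internal nodes a label."""
    def __init__(self, label, children=None, word=None):
        self.label = label
        self.children = children or []
        self.word = word

# Toy head-rule table: for each phrase label, the child categories that can
# head it, in priority order. Illustrative only; real parsers use much larger
# hand-written tables in the Magerman/Collins style.
HEAD_RULES = {
    "S":  ["VP"],
    "VP": ["VBN", "VBD", "VBZ", "VB", "MD"],  # a verb/modal heads a VP
    "NP": ["NN", "NNS", "NNP", "CD", "JJ"],   # a noun/number/adj heads an NP
    "PP": ["IN"],
}

def head_word(node):
    """Return the leaf that lexically heads this subtree."""
    if node.word is not None:
        return node
    for cat in HEAD_RULES.get(node.label, []):
        for child in node.children:
            if child.label == cat:
                return head_word(child)
    return head_word(node.children[0])        # fallback: leftmost child

def dependencies(node, deps=None):
    """Each non-head child's head word becomes a dependent of the head
    child's head word; recursing over the tree yields a dependency parse."""
    if deps is None:
        deps = []
    if node.word is not None:
        return deps
    h = head_word(node)
    for child in node.children:
        c = head_word(child)
        if c is not h:
            deps.append((h.word, c.word))     # (head, dependent) pair
        dependencies(child, deps)
    return deps

# "Discussion was completed": prints [('completed', 'Discussion'),
# ('completed', 'was')], the unlabeled arcs of the running ROOT example.
tree = Node("S", [
    Node("NP", [Node("NN", word="Discussion")]),
    Node("VP", [Node("VBD", word="was"), Node("VBN", word="completed")]),
])
print(dependencies(tree))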
Dependency Conditioning Preferences

What are the sources of information for dependency parsing?
1. Bilexical affinities: [issues → the] is plausible
2. Dependency distance: dependencies mostly hold between nearby words
3. Intervening material: dependencies rarely span intervening verbs or punctuation
4. Valency of heads: how many dependents on which side are usual for a head?

Dependency Parsing

• A sentence is parsed by choosing, for each word, what other word (including ROOT) it is a dependent of
• Usually some constraints apply:
  • Only one word is a dependent of ROOT
  • No cycles are allowed: A → B, B → A
  • This makes the dependencies a tree
• A final issue is whether arrows can cross (non-projective) or not

ROOT I 'll give a talk tomorrow on bootstrapping
(an example with a crossing, non-projective dependency)

Methods of Dependency Parsing

1. Dynamic programming (like in the CKY algorithm)
   You can do it similarly to lexicalized PCFG parsing: an O(n⁵) algorithm. Eisner (1996) gives a clever algorithm that reduces the complexity to O(n³) by producing parse items with heads at the ends rather than in the middle.
2. Graph algorithms
   You create a Minimum Spanning Tree for a sentence. McDonald et al.'s (2005) MSTParser scores dependencies independently using a machine learning classifier (McDonald uses MIRA, for online learning, but it could be something else).
3. Constraint Satisfaction
   Edges that don't satisfy hard constraints are eliminated. Karlsson (1990), etc.
4. "Deterministic parsing"
   Greedy choice of attachments, guided by good machine learning classifiers: MaltParser (Nivre et al. 2008).

Probabilistic dependency grammars: a generative model

1. Start with the left wall $
2. Generate the root w0
3. Generate left children w-1, w-2, …, w-ℓ from the FSA λ_w0
4. Generate right children w1, w2, …, wr from the FSA ρ_w0
5. Recurse on each wi for i in {-ℓ, …, -1, 1, …, r}, sampling αi (steps 2-4)
6. Return α-ℓ…α-1 w0 α1…αr

[Figure: the root w0 is generated next to the wall $; the FSAs λ_w0 and ρ_w0 generate its left- and right-child sequences.]

(These slides are based on slides by Jason Eisner and Noah Smith.)

Dependency Grammar: Naïve Recognition/Parsing

[Figure: naïve parsing of "It takes two to tango": parse items are spans that carry their head word anywhere inside, so combining items involves the span indices i, j, k plus head positions, for O(n⁵) combinations.]

Cubic Recognition/Parsing (Eisner & Satta, 1999)

• Triangles: spans over words, where the tall side of the triangle is the head, the other side is a dependent, and no non-head words are expecting more dependents
• Trapezoids: spans over words, where the larger side is the head, the smaller side is a dependent, and the smaller side is still looking for dependents on its side of the trapezoid
• A triangle is a head with some left (or right) subtrees; there is one trapezoid per dependency
• The goal item needs only O(n) combinations (indices 0, i, n); triangles and trapezoids each need O(n³) combinations (indices i, j, k)
• This gives O(n³) dependency grammar parsing, as sketched below

[Figure: triangle and trapezoid decomposition of "It takes two to tango".]
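To make the triangle/trapezoid bookkeeping concrete, here is a sketch of first-order projective parsing in this O(n³) style, in the commonly used per-span formulation of the Eisner algorithm. Two simplifications are mine, not Eisner & Satta's: arc scores arrive as an arbitrary matrix, and ROOT is not restricted to a single dependent.

import numpy as np

def eisner(score):
    """Best projective dependency tree, first-order Eisner-style DP.

    score[h, d] = score of an arc from head h to dependent d; index 0 is
    the artificial ROOT. Returns the head index of each word 1..n-1.
    """
    n = score.shape[0]
    NEG = float("-inf")
    # Complete items ("triangles") and incomplete items ("trapezoids"),
    # indexed [s, t, d]: d = 0 means the head is t, d = 1 means it is s.
    C = np.full((n, n, 2), NEG); Cb = np.zeros((n, n, 2), dtype=int)
    I = np.full((n, n, 2), NEG); Ib = np.zeros((n, n, 2), dtype=int)
    for s in range(n):
        C[s, s, 0] = C[s, s, 1] = 0.0
    for width in range(1, n):
        for s in range(n - width):
            t = s + width
            # Trapezoid = two inward-facing triangles plus one new arc.
            for r in range(s, t):
                join = C[s, r, 1] + C[r + 1, t, 0]
                if join + score[t, s] > I[s, t, 0]:   # arc t -> s
                    I[s, t, 0], Ib[s, t, 0] = join + score[t, s], r
                if join + score[s, t] > I[s, t, 1]:   # arc s -> t
                    I[s, t, 1], Ib[s, t, 1] = join + score[s, t], r
            # Triangle = a trapezoid extended by a same-direction triangle.
            for r in range(s, t):
                if C[s, r, 0] + I[r, t, 0] > C[s, t, 0]:
                    C[s, t, 0], Cb[s, t, 0] = C[s, r, 0] + I[r, t, 0], r
            for r in range(s + 1, t + 1):
                if I[s, r, 1] + C[r, t, 1] > C[s, t, 1]:
                    C[s, t, 1], Cb[s, t, 1] = I[s, r, 1] + C[r, t, 1], r
    heads = [0] * n
    def backtrack(s, t, d, complete):
        if s == t:
            return
        if complete:
            r = Cb[s, t, d]
            if d == 0:
                backtrack(s, r, 0, True); backtrack(r, t, 0, False)
            else:
                backtrack(s, r, 1, False); backtrack(r, t, 1, True)
        else:
            r = Ib[s, t, d]
            if d == 0:
                heads[s] = t                          # arc t -> s
            else:
                heads[t] = s                          # arc s -> t
            backtrack(s, r, 1, True); backtrack(r + 1, t, 0, True)
    backtrack(0, n - 1, 1, True)
    return heads[1:]

# Example: 4 words plus ROOT, random arc scores.
rng = np.random.default_rng(0)
print(eisner(rng.random((5, 5))))

The three nested loops over s, t, and the split point r are exactly where the cubic running time comes from; the naïve O(n⁵) head positions have disappeared because every item keeps its head at an end of the span.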
Graph Algorithms: MSTs

McDonald et al. (2005 ACL), "Online Large-Margin Training of Dependency Parsers"

• One of the two best-known recent dependency parsers
• Score of a dependency tree = sum of the scores of its dependencies, and each score is independent of the other dependencies
• If such scores are available, parsing can be solved as a minimum spanning tree problem, using the Chu-Liu/Edmonds algorithm
• One then needs a score for dependencies:
  • Edge scoring is via a discriminative classifier, which can condition on rich features of the context
  • Each dependency score is a linear function of features times weights
  • Feature weights were learned with MIRA, an online large-margin algorithm, but you could use an SVM, maxent, or a perceptron
  • Features cover: head and dependent word and POS separately; head and dependent word and POS bigram features; words between head and dependent; and the length and direction of the dependency

Greedy Transition-Based Parsing: MaltParser [Nivre et al. 2008]

• A simple form of greedy discriminative dependency parser
• The parser does a sequence of bottom-up actions, roughly like "shift" or "reduce" in a shift-reduce parser, but the "reduce" actions are specialized to create dependencies with the head on the left or on the right
• The parser has:
  • a stack σ, written with its top to the right, which starts with the ROOT symbol
  • a buffer β, written with its top to the left, which starts with the input sentence
  • a set of dependency arcs A, which starts off empty
  • a set of actions

Basic transition-based dependency parser

Start: σ = [ROOT], β = w1, …, wn, A = ∅
1. Shift: σ, wi|β, A ⇒ σ|wi, β, A
2. Left-Arc_r: σ|wi, wj|β, A ⇒ σ, wj|β, A ∪ {r(wj, wi)}
3. Right-Arc_r: σ|wi, wj|β, A ⇒ σ, wi|β, A ∪ {r(wi, wj)}
Finish: β = ∅

Note: unlike the usual presentation of the CFG reduce step, dependencies here combine one item from the stack with one from the buffer.

Actions ("arc-eager" dependency parser)

Start: σ = [ROOT], β = w1, …, wn, A = ∅
1. Left-Arc_r: σ|wi, wj|β, A ⇒ σ, wj|β, A ∪ {r(wj, wi)}
   Precondition: (wk, r′, wi) ∉ A, wi ≠ ROOT
2. Right-Arc_r: σ|wi, wj|β, A ⇒ σ|wi|wj, β, A ∪ {r(wi, wj)}
3. Reduce: σ|wi, β, A ⇒ σ, β, A
   Precondition: (wk, r′, wi) ∈ A
4. Shift: σ, wi|β, A ⇒ σ|wi, β, A
Finish: β = ∅

This is the common "arc-eager" variant: a head can immediately take a right dependent, before all of that dependent's own dependents have been found.

Example: Happy children like to play with their friends .

Action     Stack σ              Buffer β               Arcs A
(start)    [ROOT]               [Happy, children, …]   ∅
Shift      [ROOT, Happy]        [children, like, …]    ∅
LA_amod    [ROOT]               [children, like, …]    {amod(children, Happy)} = A1
Shift      [ROOT, children]     [like, to, …]          A1
LA_nsubj   [ROOT]               [like, to, …]          A1 ∪ {nsubj(like, children)} = A2
RA_root    [ROOT, like]         [to, play, …]          A2 ∪ {root(ROOT, like)} = A3
Shift      [ROOT, like, to]     [play, with, …]        A3
LA_aux     [ROOT, like]         [play, with, …]        A3 ∪ {aux(play, to)} = A4
RA_xcomp   [ROOT, like, play]   [with, their, …]       A4 ∪ {xcomp(like, play)} = A5

A runnable sketch of these transitions follows below.
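The transition sequence in the table is mechanical enough to execute directly. Here is a minimal sketch of the four arc-eager actions; the class name and the replay driver are mine for illustration, and a real MaltParser chooses each transition with a machine-learned classifier rather than a scripted sequence:

class ArcEagerParser:
    """The four arc-eager transitions from the slide. Which transition to
    apply is normally decided by a trained classifier (MaltParser's job);
    here a hand-specified sequence is replayed instead."""

    def __init__(self, words):
        self.stack = ["ROOT"]          # sigma, top at the right end
        self.buffer = list(words)      # beta, front at the left end
        self.arcs = set()              # A: (head, relation, dependent)

    def _has_head(self, w):
        return any(d == w for (_, _, d) in self.arcs)

    def shift(self):                   # sigma, wi|beta => sigma|wi, beta
        self.stack.append(self.buffer.pop(0))

    def left_arc(self, r):             # sigma|wi, wj|beta => sigma, wj|beta
        wi = self.stack[-1]
        assert wi != "ROOT" and not self._has_head(wi)   # preconditions
        self.stack.pop()
        self.arcs.add((self.buffer[0], r, wi))

    def right_arc(self, r):            # sigma|wi, wj|beta => sigma|wi|wj, beta
        wj = self.buffer.pop(0)
        self.arcs.add((self.stack[-1], r, wj))
        self.stack.append(wj)          # eager: the new dependent joins the stack

    def reduce(self):                  # sigma|wi, beta => sigma, beta
        assert self._has_head(self.stack[-1])            # precondition
        self.stack.pop()

# Replay the derivation from the example table above.
p = ArcEagerParser("Happy children like to play with their friends .".split())
p.shift();  p.left_arc("amod")
p.shift();  p.left_arc("nsubj")
p.right_arc("root")
p.shift();  p.left_arc("aux")
p.right_arc("xcomp")
print(sorted(p.arcs))   # the five arcs of A5
print(p.stack)          # ['ROOT', 'like', 'play']

Running it reproduces the arc set A5 from the table, with σ = [ROOT, like, play] and "with" at the front of β, matching the table's final row.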