
Christopher Manning
Dependency Grammar and Dependency Structure

Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations ("arrows") called dependencies.

[Figure: dependency parse of "Bills on ports and immigration were submitted by Senator Brownback, Republican of Kansas", with arrows drawn from heads to dependents (e.g., submitted → Bills, submitted → by, by → Brownback).]

Christopher Manning
Dependency Grammar and Dependency Structure

Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations ("arrows") called dependencies.

• The arrow connects a head (governor, superior, regent) with a dependent (modifier, inferior, subordinate)
• The arrows are commonly typed with the name of grammatical relations (subject, prepositional object, apposition, etc.)
• Usually, dependencies form a tree (connected, acyclic, single-head)

[Figure: the same parse with typed arcs, e.g. nsubjpass(submitted, Bills), auxpass(submitted, were), prep(submitted, by), pobj(by, Brownback), nn(Brownback, Senator), appos(Brownback, Republican), prep(Republican, of), pobj(of, Kansas), plus cc/conj arcs for "ports and immigration".]
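The arcs in the figure can be written down directly as typed head-to-dependent triples. A minimal sketch (mine, not the lecture's; the data structure is purely illustrative) for part of the example parse:

```python
# Typed dependencies as (head, relation, dependent) triples,
# copied from part of the parse above.
dependencies = [
    ("submitted", "nsubjpass", "Bills"),
    ("submitted", "auxpass",   "were"),
    ("submitted", "prep",      "by"),
    ("by",        "pobj",      "Brownback"),
]

# "Single-head" tree property: every dependent has exactly one governor.
dependents = [d for _, _, d in dependencies]
assert len(dependents) == len(set(dependents))
```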

Christopher Manning
Dependency Grammar/Parsing History

• The idea of dependency structure goes back a long way
  • To Pāṇini's grammar (c. 5th century BCE)
  • Basic approach of 1st millennium Arabic grammarians
• Constituency is a new-fangled invention
  • 20th century invention (R.S. Wells, 1947)

• Modern dependency work often linked to work of L. Tesnière (1959)
  • Was dominant approach in "East" (Russia, China, …)
  • Good for free-er word order languages
• Among the earliest kinds of parsers in NLP, even in the US:
  • David Hays, one of the founders of U.S. computational linguistics, built early (first?) dependency parser (Hays 1962)

• Some people draw the arrows one way; some the other way!
  • Tesnière had them point from head to dependent…
• Usually add a fake ROOT so every word is a dependent of precisely 1 other node

Christopher Manning
Relation between phrase structure and dependency structure

• A dependency grammar has a notion of a head. Officially, CFGs don't.
• But modern linguistic theory and all modern statistical parsers (Charniak, Collins, Stanford, …) do, via hand-written phrasal "head rules":
  • The head of a Noun Phrase is a noun/number/adj/…
  • The head of a Verb Phrase is a verb/modal/…
• The head rules can be used to extract a dependency parse from a CFG parse (see the sketch after the next slide)
• The closure of dependencies gives constituency from a dependency tree
  • But the dependents of a word must be at the same level (i.e., "flat") – there can be no VP!

Christopher Manning
Dependency Conditioning Preferences

What are the sources of information for dependency parsing?
1. Bilexical affinities: [issues → the] is plausible
2. Dependency distance: mostly with nearby words
3. Intervening material: dependencies rarely span intervening verbs or punctuation
4. Valency of heads: how many dependents on which side are usual for a head?

[Example: ROOT Discussion of the outstanding issues was completed .]
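To make the head-rule idea concrete, here is a toy sketch (the rule table, tree encoding, and example are my assumptions, not the lecture's code): percolate lexical heads up a CFG parse, then read off one arc from each phrasal head to the head of each non-head child.

```python
# Hypothetical, tiny head-rule table: for each phrase label, the
# preferred head-child labels in priority order.
HEAD_RULES = {
    "S":  ["VP", "NP"],
    "VP": ["VBD", "VB", "VP"],
    "NP": ["NN", "NNS", "NP"],
    "PP": ["IN"],
}

def head_word(tree):
    """Lexical head of a tree node: (label, children) or (POS, word)."""
    label, children = tree
    if isinstance(children, str):          # preterminal: (POS tag, word)
        return children
    for wanted in HEAD_RULES.get(label, []):
        for child in children:
            if child[0] == wanted:
                return head_word(child)
    return head_word(children[0])          # fallback: leftmost child

def extract_deps(tree, deps):
    """Collect (head, dependent) arcs from a CFG parse."""
    label, children = tree
    if isinstance(children, str):
        return
    h = head_word(tree)
    for child in children:
        ch = head_word(child)
        if ch != h:
            deps.append((h, ch))           # arc: head -> dependent
        extract_deps(child, deps)

tree = ("S", [("NP", [("NNS", "Bills")]),
              ("VP", [("VBD", "passed")])])
deps = []
extract_deps(tree, deps)
print(deps)    # [('passed', 'Bills')]
```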

Christopher Manning

Dependency Parsing

• A sentence is parsed by choosing for each word what other word (including ROOT) it is a dependent of.
• Usually some constraints:
  • Only one word is a dependent of ROOT
  • Don't want cycles: A → B, B → A
  • This makes the dependencies a tree
• Final issue is whether arrows can cross (non-projective) or not

[Example of crossing arcs: ROOT I 'll give a talk tomorrow on bootstrapping]

Christopher Manning
Methods of Dependency Parsing

1. Dynamic programming (like in the CKY algorithm)
   You can do it similarly to lexicalized PCFG parsing: an O(n^5) algorithm. Eisner (1996) gives a clever algorithm that reduces the complexity to O(n^3), by producing parse items with heads at the ends rather than in the middle.
2. Graph algorithms
   You create a Minimum Spanning Tree for a sentence. McDonald et al.'s (2005) MSTParser scores dependencies independently using a ML classifier (he uses MIRA, for online learning, but it can be something else).
3. Constraint Satisfaction
   Edges are eliminated that don't satisfy hard constraints. Karlsson (1990), etc.
4. "Deterministic parsing"
   Greedy choice of attachments guided by good machine learning classifiers. MaltParser (Nivre et al. 2008)

Christopher Manning
Probabilistic dependency grammar: generative model

1. Start with left wall $
2. Generate root w0
3. Generate left children w-1, w-2, ..., w-ℓ from the FSA λw0
4. Generate right children w1, w2, ..., wr from the FSA ρw0
5. Recurse on each wi for i in {-ℓ, ..., -1, 1, ..., r}, sampling αi (steps 2–4)
6. Return α-ℓ ... α-1 w0 α1 ... αr

[Figure: the left wall $, head w0, its left-child FSA λw0, its right-child FSA ρw0, and recursive subtrees for each child.]

These 5 slides are based on slides by Jason Eisner and Noah Smith
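A tiny runnable sketch of this generative story (my simplification, not Eisner's or Smith's code: the per-head FSAs λw and ρw are replaced by a single geometric stop/continue choice over a toy vocabulary, so child distributions are not head-specific):

```python
import random

VOCAB = ["the", "dog", "barked", "loudly"]

def gen_children(p_stop=0.7):
    """Stand-in for the FSAs: emit child words until a stop event."""
    kids = []
    while random.random() > p_stop:
        kids.append(random.choice(VOCAB))
    return kids

def gen_subtree(head):
    """Steps 3-5: left children, right children, recurse on each."""
    left = [gen_subtree(w) for w in gen_children()]    # w-1, w-2, ..., w-l
    right = [gen_subtree(w) for w in gen_children()]   # w1, w2, ..., wr
    return (head, left, right)

def linearize(node):
    """Step 6: return alpha_-l ... alpha_-1 w0 alpha_1 ... alpha_r."""
    head, left, right = node
    out = [w for child in reversed(left) for w in linearize(child)]
    out.append(head)
    out += [w for child in right for w in linearize(child)]
    return out

random.seed(0)
root = gen_subtree(random.choice(VOCAB))   # step 2: generate root w0
print(" ".join(linearize(root)))
```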

Christopher Manning
Naïve Recognition/Parsing

• O(n^5) combinations: combining an item over span (i, j) headed by p with an item over (j, k) headed by c involves five free indices (i, j, k, p, c)

[Figure: naïve chart-item combination over "It takes two to tango", with the goal item covering 0–n.]

Christopher Manning
Dependency Grammar Cubic Recognition/Parsing (Eisner & Satta, 1999)

• Triangles: span over words, where tall side of triangle is the head, other side is dependent, and no non-head words expecting more dependents
• Trapezoids: span over words, where larger side is head, smaller side is dependent, and smaller side is still looking for dependents on its side of the trapezoid

[Figure: triangle and trapezoid items illustrated over "It takes two to tango".]

Christopher Manning
Dependency Grammar Cubic Recognition/Parsing (Eisner & Satta, 1999)

• A triangle is a head with some left (or right) subtrees
• One trapezoid per dependency

• Goal items: O(n) combinations (spans 0–i–n)
• Triangle/trapezoid combinations: O(n^3) combinations (indices i, j, k)
• Gives O(n^3) dependency grammar parsing (a sketch of the algorithm follows)

[Figure: the four triangle/trapezoid combination schemata over spans i–j–k, and the goal item, over "It takes two to tango".]
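A compact sketch of the resulting algorithm (my implementation of the standard Eisner recurrences, not code from the lecture). `score[h, d]` is an arbitrary arc score for head h → dependent d, index 0 is ROOT; `comp` holds the triangles and `incomp` the trapezoids described above:

```python
import numpy as np

def eisner(score):
    """Max-scoring projective parse; returns heads (heads[0] = -1 for ROOT)."""
    n = score.shape[0]
    comp = np.zeros((n, n, 2))      # triangles; d=0: head at right end, d=1: at left
    incomp = np.zeros((n, n, 2))    # trapezoids (one per dependency)
    comp_bp = np.zeros((n, n, 2), dtype=int)
    incomp_bp = np.zeros((n, n, 2), dtype=int)

    for span in range(1, n):
        for i in range(n - span):
            j = i + span
            # Trapezoid: join two facing triangles and add the arc between i and j
            best, arg = max((comp[i, r, 1] + comp[r + 1, j, 0], r) for r in range(i, j))
            incomp[i, j, 0] = best + score[j, i]        # arc j -> i
            incomp[i, j, 1] = best + score[i, j]        # arc i -> j
            incomp_bp[i, j, 0] = incomp_bp[i, j, 1] = arg
            # Triangle: a trapezoid plus a same-direction triangle
            comp[i, j, 0], comp_bp[i, j, 0] = max(
                (comp[i, r, 0] + incomp[r, j, 0], r) for r in range(i, j))
            comp[i, j, 1], comp_bp[i, j, 1] = max(
                (incomp[i, r, 1] + comp[r, j, 1], r) for r in range(i + 1, j + 1))

    heads = [-1] * n
    def backtrack(i, j, d, complete):
        if i == j:
            return
        if complete:
            r = comp_bp[i, j, d]
            if d == 0:
                backtrack(i, r, 0, True); backtrack(r, j, 0, False)
            else:
                backtrack(i, r, 1, False); backtrack(r, j, 1, True)
        else:
            heads[i if d == 0 else j] = j if d == 0 else i
            r = incomp_bp[i, j, d]
            backtrack(i, r, 1, True); backtrack(r + 1, j, 0, True)
    backtrack(0, n - 1, 1, True)    # goal: triangle headed at ROOT spanning 0..n-1
    return heads

# Toy usage with random scores for ROOT + a 3-word sentence:
rng = np.random.default_rng(0)
print(eisner(rng.random((4, 4))))
```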

Christopher Manning
McDonald et al. (2005 ACL): Online Large-Margin Training of Dependency Parsers
Graph Algorithms: MSTs

• One of the two best-known recent dependency parsers
• Score of a dependency tree = sum of scores of dependencies
• Scores are independent of other dependencies
• If scores are available, parsing can be solved as a minimum spanning tree problem
  • Chu-Liu-Edmonds algorithm (see the sketch below)
• One then needs a score for dependencies
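A small sketch of this reduction (the toy sentence and scores are mine, and I use networkx's implementation of Edmonds' algorithm rather than MSTParser's own Chu-Liu-Edmonds code). Maximizing the score sum equals finding a minimum spanning arborescence over negated scores:

```python
import networkx as nx

words = ["ROOT", "John", "saw", "Mary"]
scores = {(2, 1): 20, (2, 3): 30, (0, 2): 10,    # (head, dependent): arc score
          (1, 2): 5, (3, 2): 5, (0, 1): 3,
          (0, 3): 3, (1, 3): 4, (3, 1): 4}

G = nx.DiGraph()
for (h, d), s in scores.items():
    G.add_edge(h, d, weight=-s)     # negate: max total score = min total weight

# ROOT (node 0) has no incoming edges, so the arborescence is rooted there.
# Note this plain reduction does not enforce "only one dependent of ROOT".
tree = nx.minimum_spanning_arborescence(G)
print(sorted((words[h], words[d]) for h, d in tree.edges()))
# [('ROOT', 'saw'), ('saw', 'John'), ('saw', 'Mary')]
```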

Christopher Manning
McDonald et al. (2005 ACL): Online Large-Margin Training of Dependency Parsers

• Edge scoring is via a discriminative classifier
  • Can condition on rich features in that context
• Each dependency is a linear function of features times weights
• Feature weights were learned by MIRA, an online large-margin algorithm
  • But you could use an SVM, maxent, or a perceptron
• Features cover:
  • Head and dependent word and POS separately
  • Head and dependent word and POS bigram features
  • Words between head and dependent
  • Length and direction of dependency

Greedy Transition-Based Parsing
MaltParser

Christopher Manning
MaltParser [Nivre et al. 2008]

• A simple form of greedy discriminative dependency parser
• The parser does a sequence of bottom-up actions
  • Roughly like "shift" or "reduce" in a shift-reduce parser, but the "reduce" actions are specialized to create dependencies with head on left or right
• The parser has:
  • a stack σ, written with top to the right
    • which starts with the ROOT symbol
  • a buffer β, written with top to the left
    • which starts with the input sentence
  • a set of dependency arcs A
    • which starts off empty
  • a set of actions

Christopher Manning
Basic transition-based dependency parser

Start: σ = [ROOT], β = w1, …, wn, A = ∅
1. Shift        σ, wi|β, A ⇒ σ|wi, β, A
2. Left-Arc_r   σ|wi, wj|β, A ⇒ σ, wj|β, A ∪ {r(wj, wi)}
3. Right-Arc_r  σ|wi, wj|β, A ⇒ σ, wi|β, A ∪ {r(wi, wj)}
Finish: β = ∅

Notes:
• Unlike the regular presentation of the CFG reduce step, dependencies combine one thing from each of stack and buffer
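A direct transcription of the three transitions into code (a sketch; in MaltParser the next action comes from a trained classifier, while here a hand-picked action sequence for a toy sentence of my own is supplied):

```python
def parse(words, actions):
    stack, buffer, arcs = ["ROOT"], list(words), set()
    for act in actions:
        if act == "shift":                       # sigma, wi|beta => sigma|wi, beta
            stack.append(buffer.pop(0))
        elif act == "left-arc":                  # r(wj, wi): buffer front governs stack top
            arcs.add((buffer[0], stack.pop()))
        elif act == "right-arc":                 # r(wi, wj): stack top governs buffer front,
            wi, wj = stack.pop(), buffer.pop(0)  # and wi returns to the buffer front
            arcs.add((wi, wj))
            buffer.insert(0, wi)
    return arcs

acts = ["shift", "left-arc", "shift", "right-arc", "right-arc", "shift"]
print(parse(["John", "saw", "Mary"], acts))
# {('saw', 'John'), ('saw', 'Mary'), ('ROOT', 'saw')}  (set order may vary)
```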

Christopher Manning

Actions ("arc-eager" dependency parser):
1. Left-Arc_r   σ|wi, wj|β, A ⇒ σ, wj|β, A ∪ {r(wj, wi)}
   Precondition: (wk, r′, wi) ∉ A, wi ≠ ROOT
2. Right-Arc_r  σ|wi, wj|β, A ⇒ σ|wi|wj, β, A ∪ {r(wi, wj)}
3. Reduce       σ|wi, β, A ⇒ σ, β, A
   Precondition: (wk, r′, wi) ∈ A
4. Shift        σ, wi|β, A ⇒ σ|wi, β, A

Example: Happy children like to play with their friends .
Start: σ = [ROOT], β = w1, …, wn, A = ∅

          [ROOT]                [Happy, children, …]  ∅
Shift     [ROOT, Happy]         [children, like, …]   ∅
LA_amod   [ROOT]                [children, like, …]   {amod(children, Happy)} = A1
Shift     [ROOT, children]      [like, to, …]         A1
LA_nsubj  [ROOT]                [like, to, …]         A1 ∪ {nsubj(like, children)} = A2
RA_root   [ROOT, like]          [to, play, …]         A2 ∪ {root(ROOT, like)} = A3
Shift     [ROOT, like, to]      [play, with, …]       A3
LA_aux    [ROOT, like]          [play, with, …]       A3 ∪ {aux(play, to)} = A4
RA_xcomp  [ROOT, like, play]    [with, their, …]      A4 ∪ {xcomp(like, play)} = A5

This is the common "arc-eager" variant: a head can immediately take a right dependent, before its dependents are found.

Christopher Manning

RA_prep   [ROOT, like, play, with]          [their, friends, …]  A5 ∪ {prep(play, with)} = A6
Shift     [ROOT, like, play, with, their]   [friends, .]         A6
LA_poss   [ROOT, like, play, with]          [friends, .]         A6 ∪ {poss(friends, their)} = A7
RA_pobj   [ROOT, like, play, with, friends] [.]                  A7 ∪ {pobj(with, friends)} = A8
Reduce    [ROOT, like, play, with]          [.]                  A8
Reduce    [ROOT, like, play]                [.]                  A8
Reduce    [ROOT, like]                      [.]                  A8
RA_punc   [ROOT, like, .]                   []                   A8 ∪ {punc(like, .)} = A9

You terminate as soon as the buffer is empty. Dependencies = A9

Example (arc-eager), from Prashanth Mannem's Dependency Parsing slides (October 20, 2014):

_ROOT_ Red figures on the screen indicated falling stocks

[Figure series: stack (S) and queue (Q) states after each transition.]

Action sequence, with the arc each step adds (the arcs follow deterministically from the transitions; see the replay sketch after this trace):

Shift
Left-arc     figures → Red
Shift
Right-arc    figures → on
Shift
Left-arc     screen → the
Right-arc    on → screen
Reduce
Reduce
Left-arc     indicated → figures
Right-arc    ROOT → indicated
Shift
Left-arc     stocks → falling
Right-arc    indicated → stocks
Reduce
Reduce
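The action sequence above can be replayed mechanically with the four arc-eager transitions. A sketch (my code, not MaltParser's; it prints exactly the arcs listed in the trace):

```python
def arc_eager(words, actions):
    stack, buf, arcs = ["ROOT"], list(words), []
    for act in actions:
        if act == "shift":
            stack.append(buf.pop(0))
        elif act == "left-arc":            # buffer front governs stack top
            arcs.append((buf[0], stack.pop()))
        elif act == "right-arc":           # stack top governs buffer front;
            arcs.append((stack[-1], buf[0]))
            stack.append(buf.pop(0))       # the dependent stays on the stack
        elif act == "reduce":              # pop a word that already has a head
            stack.pop()
    return arcs

sent = "Red figures on the screen indicated falling stocks".split()
acts = ["shift", "left-arc", "shift", "right-arc", "shift", "left-arc",
        "right-arc", "reduce", "reduce", "left-arc", "right-arc",
        "shift", "left-arc", "right-arc", "reduce", "reduce"]
for head, dep in arc_eager(sent, acts):
    print(f"{head} -> {dep}")
# figures -> Red, figures -> on, screen -> the, on -> screen,
# indicated -> figures, ROOT -> indicated, stocks -> falling, indicated -> stocks
```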

Christopher Manning
MaltParser [Nivre et al. 2008]

• We have left to explain how we choose the next action
• Each action is predicted by a discriminative classifier (often SVM, can be perceptron, maxent classifier) over each legal move
  • Max of 4 untyped choices, max of |R| × 2 + 2 when typed
  • Features: top of stack word, POS; first in buffer word, POS; etc.
• There is NO search (in the simplest and usual form)
  • But you could do some kind of beam search if you wish
• It provides VERY fast linear time parsing
• The model's accuracy is slightly below the best Lexicalized PCFGs (evaluated on dependencies), but
• It provides close to state of the art parsing performance

Christopher Manning
Evaluation of Dependency Parsing: (labeled) dependency accuracy

Acc = # correct deps / # of deps

ROOT She saw the video lecture
  0   1   2   3    4     5

UAS = 4 / 5 = 80%
LAS = 2 / 5 = 40%

Gold                      Parsed
1 2 She     nsubj         1 2 She     nsubj
2 0 saw     root          2 0 saw     root
3 5 the     det           3 4 the     det
4 5 video   nn            4 5 video   nsubj
5 2 lecture dobj          5 2 lecture ccomp
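The metric in a few lines (a sketch; tokens are keyed by index, with (head, relation) pairs copied from the Gold/Parsed columns above):

```python
# Per token: (head index, relation), copied from the slide's columns.
gold   = {1: (2, "nsubj"), 2: (0, "root"), 3: (5, "det"),
          4: (5, "nn"),    5: (2, "dobj")}
parsed = {1: (2, "nsubj"), 2: (0, "root"), 3: (4, "det"),
          4: (5, "nsubj"), 5: (2, "ccomp")}

# UAS counts correct heads; LAS also requires the correct relation label.
uas = sum(parsed[i][0] == g[0] for i, g in gold.items()) / len(gold)
las = sum(parsed[i] == g for i, g in gold.items()) / len(gold)
print(f"UAS = {uas:.0%}, LAS = {las:.0%}")   # UAS = 80%, LAS = 40%
```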

Christopher Manning

Representative performance numbers

• The CoNLL-X (2006) shared task provides evaluation numbers for various dependency parsing approaches over 13 languages
  • MALT: LAS scores from 65–92%, depending greatly on the language
• Here we give a few UAS numbers for English to allow some comparison to constituency parsing

Parser                                                       UAS%
Sagae and Lavie (2006) ensemble of dependency parsers        92.7
Charniak (2000) generative, constituency, as dependencies    92.2
Collins (1999) generative, constituency, as dependencies     91.7
McDonald and Pereira (2005) MST graph-based dependency       91.5
Yamada and Matsumoto (2003) transition-based dependency      90.4

Christopher Manning
Projectivity

• Dependencies from a CFG tree using heads must be projective
  • There must not be any crossing dependency arcs when the words are laid out in their linear order, with all arcs above the words
• But dependency theory normally does allow non-projective structures to account for displaced constituents
  • You can't easily get the semantics of certain constructions right without these nonprojective dependencies

Example (non-projective): Who did Bill buy the coffee from yesterday ?
(A projectivity test sketch follows the example.)
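The crossing-arcs condition is easy to test. A sketch (my code; the heads list below encodes one plausible analysis of the example, with "from" governing "Who"):

```python
def is_projective(heads):
    """heads[d] = head index of word d (words 1..n; heads[0] unused; 0 = ROOT)."""
    arcs = [(min(h, d), max(h, d)) for d, h in enumerate(heads[1:], start=1)]
    return not any(a < c < b < e                 # arcs (a,b) and (c,e) cross
                   for a, b in arcs for c, e in arcs)

# Who(1) did(2) Bill(3) buy(4) the(5) coffee(6) from(7) yesterday(8)
heads = [0, 7, 4, 4, 0, 6, 4, 4, 4]   # arc from(7) -> Who(1) crosses ROOT -> buy(4)
print(is_projective(heads))           # False
```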

Christopher Manning

Handling non-projectivity

• The arc-eager algorithm we presented only builds projective dependency trees
• Possible directions to head:
  1. Just declare defeat on nonprojective arcs
  2. Use a dependency formalism which only admits projective representations (a CFG doesn't represent such structures…)
  3. Use a postprocessor to a projective dependency parsing algorithm to identify and resolve nonprojective links
  4. Add extra types of transitions that can model at least most non-projective structures
  5. Move to a parsing mechanism that does not use or require any constraints on projectivity (e.g., the graph-based MSTParser)

Dependencies encode relational structure

Relation Extraction with Stanford Dependencies

Christopher Manning
Dependency paths identify relations like protein interaction
[Erkan et al. EMNLP 07, Fundel et al. 2007]

[Figure: dependency parse of "The results demonstrated that KaiC interacts rhythmically with SasA, KaiA, and KaiB", with arcs nsubj, ccomp, compl, det, advmod, prep_with, conj_and.]

KaiC ←nsubj– interacts –prep_with→ SasA
KaiC ←nsubj– interacts –prep_with→ SasA –conj_and→ KaiA
KaiC ←nsubj– interacts –prep_with→ SasA –conj_and→ KaiB

(A path-extraction sketch follows the next slide.)

Christopher Manning
Stanford Dependencies
[de Marneffe et al. LREC 2006]

• The basic dependency representation is projective
• It can be generated by postprocessing headed phrase structure parses (Penn Treebank syntax)
• It is generated directly by dependency parsers, such as MaltParser, or the Easy-First Parser

[Figure: dependency tree of "The little boy jumped over the fence", with nsubj, det, amod, prep, pobj arcs.]
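As promised above, a sketch of extracting dependency paths like the KaiC/SasA patterns (my code; the arcs are copied from the collapsed example). Typed arcs are treated as an undirected graph, and a BFS between the two proteins records each arc's label and direction:

```python
from collections import deque

arcs = [("interacts", "nsubj", "KaiC"),
        ("interacts", "prep_with", "SasA"),
        ("SasA", "conj_and", "KaiA"),
        ("SasA", "conj_and", "KaiB")]

def path(src, dst):
    """Shortest labeled path between two words in the dependency graph."""
    adj = {}
    for h, rel, d in arcs:
        adj.setdefault(h, []).append((f"-{rel}->", d))   # head-to-dependent
        adj.setdefault(d, []).append((f"<-{rel}-", h))   # dependent-to-head
    q, seen = deque([(src, [src])]), {src}
    while q:
        node, p = q.popleft()
        if node == dst:
            return " ".join(p)
        for lbl, nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                q.append((nxt, p + [lbl, nxt]))

print(path("KaiC", "KaiB"))
# KaiC <-nsubj- interacts -prep_with-> SasA -conj_and-> KaiB
```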

Christopher Manning
BioNLP 2009/2011 relation extraction shared tasks [Björne et al. 2009]

[Figure: histogram comparing dependency distance and linear distance between related entities; many relationships become short distance in the dependency graph.]

Christopher Manning
Graph modification to facilitate semantic analysis

Bell, based in LA, makes and distributes electronic and computer products.

[Figure: basic dependency tree: nsubj(makes, Bell), cc(makes, and), conj(makes, distributes), dobj(makes, products), partmod(Bell, based), prep(based, in), pobj(in, LA), amod(products, electronic), cc(electronic, and), conj(electronic, computer).]

Christopher Manning
Graph modification to facilitate semantic analysis

Bell, based in LA, makes and distributes electronic and computer products.

[Figure: collapsed dependency graph: nsubj(makes, Bell), nsubj(distributes, Bell), conj_and(makes, distributes), dobj(makes, products), partmod(Bell, based), prep_in(based, LA), amod(products, electronic), amod(products, computer), conj_and(electronic, computer).]

(A collapsing sketch follows.)
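A sketch of the preposition-collapsing step shown in the figure (my code and triple format, not Stanford's implementation; CoreNLP performs the full set of such transformations): the prep arc to a preposition plus its pobj arc become a single prep_X arc, shortening the path between content words (in → LA becomes prep_in(based, LA)).

```python
def collapse_preps(arcs):
    """arcs: list of (head, relation, dependent) triples."""
    out, preps = [], {}
    for h, rel, d in arcs:
        if rel == "prep":
            preps[d] = h                   # preposition d hangs off head h
    for h, rel, d in arcs:
        if rel == "prep":
            continue                       # drop the intermediate prep arc
        if rel == "pobj" and h in preps:
            out.append((preps[h], f"prep_{h}", d))   # e.g. prep_in(based, LA)
        else:
            out.append((h, rel, d))
    return out

arcs = [("based", "prep", "in"), ("in", "pobj", "LA")]
print(collapse_preps(arcs))   # [('based', 'prep_in', 'LA')]
```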
