The English VerbNet PropBank
VerbNet and PropBank
Word and VerbNets for Semantic Processing
DAAD Summer School in Advanced Language Engineering, Kathmandu University, Nepal
Day 3: Annette Hautli
Recap
Levin (1993)
- Detailed analysis of syntactic alternations in English
- Verbs with a common meaning partake in the same syntactic alternations
→ Verb classes that share syntactic properties
→ How can we use this information in NLP?
Today
1. The English VerbNet
   - Lexical information
   - The VerbNet API
2. PropBank
Overview
- Hierarchically arranged verb classes
- Inspired by Levin (1993) and further extended for NLP purposes
- Levin: 240 classes (47 top level, 193 second level), around 3,000 lemmas
- VerbNet: 471 classes, around 4,000 lemmas
- Explicit links to other resources, e.g. WordNet and FrameNet
Components: Thematic roles
- 23 thematic roles are used in VerbNet
- Each argument is assigned a unique thematic role
- When verbs have more than one sense, they usually have different thematic roles
Role          Example
Actor         Susan was chitchatting.
Agent         Cynthia nibbled on the carrot.
Asset         Carmen purchased a dress for $50.
Attribute     The price of oil soared.
Beneficiary   Sandy sang a song to me.
Cause         My heart is pounding from fear.
Destination   The king deported the prisoner to the island.
Experiencer   Kate fears snakes.
Extent        The price of oil soared 10%.
Instrument    He pounded the metal with a hammer.
Location      He danced in the hall.
Material      David constructed a house out of sticks.
Product       David constructed a house.
Patient       The girl reddened.
Recipient     I gave him the book.
Source        He departed from Kathmandu.
Stimulus      The pea soup tasted delicious.
Theme         He pushed the wheelchair to the door.
Time          The concert began at 5 o'clock.
Topic         They talked about their wedding.
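A minimal sketch of how such role-labeled examples could be held in code. The role names come from the VerbNet inventory above; the dictionary layout and the second role assigned in each example are our own assumptions, not VerbNet's actual format.

```python
# Illustrative sketch only: thematic-role annotations as plain dictionaries.
# Role names follow the VerbNet inventory above; the labels for the second
# argument in each sentence are our assumptions.

ROLE_ANNOTATIONS = {
    "Cynthia nibbled on the carrot.": {"Agent": "Cynthia", "Patient": "the carrot"},
    "He pushed the wheelchair to the door.": {"Theme": "the wheelchair",
                                              "Destination": "the door"},
    "Kate fears snakes.": {"Experiencer": "Kate", "Stimulus": "snakes"},
}

def roles_of(sentence):
    """Return the thematic roles annotated for a sentence, sorted."""
    return sorted(ROLE_ANNOTATIONS[sentence])

assert roles_of("Kate fears snakes.") == ["Experiencer", "Stimulus"]
```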
Components: Selectional restrictions
- Semantic restrictions that a word imposes on the environment it occurs in
- For example, the verb to eat:
  - Requirement 1: the subject refers to an animate entity
  - Requirement 2: the object is something concrete
→ The boy ate the mango.
→ #The mountains eat sincerity.
BUT: What about "The boy eats a rock." or "The mouse eats an elephant."?
Selectional restrictions (continued)
- Selectional restrictions are taken from EuroWordNet → an ontological structure of selectional restrictions
- 36 selectional restrictions in v3.2
- E.g. the selectional restriction location has three subfields:
  1. region (from under the rug)
  2. place (under the rug)
  3. object (on the table)
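Checking a selectional restriction can be sketched as a type lookup. This is a toy ontology of our own; VerbNet's real hierarchy (36 restrictions in v3.2, drawn from EuroWordNet) is considerably richer.

```python
# Toy sketch of selectional-restriction checking. The ontology below is a
# hand-made assumption, not VerbNet/EuroWordNet data.

# Entity -> set of semantic types it belongs to.
TYPES = {
    "boy": {"animate", "concrete"},
    "mango": {"concrete"},
    "mountains": {"concrete"},
    "sincerity": {"abstract"},
}

# Restrictions of 'to eat': subject must be animate, object concrete.
EAT = {"subject": "animate", "object": "concrete"}

def satisfies(restrictions, subject, obj):
    """True if both arguments carry the required semantic types."""
    return (restrictions["subject"] in TYPES[subject]
            and restrictions["object"] in TYPES[obj])

assert satisfies(EAT, "boy", "mango")                # The boy ate the mango.
assert not satisfies(EAT, "mountains", "sincerity")  # #The mountains eat sincerity.
```

Note that such hard constraints are exactly what examples like "The boy eats a rock." challenge: the sentence is odd but grammatical, which is why restrictions are often better treated as soft preferences.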
Components: Syntactic frames
- Short-hand description for the surface realization of arguments
  → Only possible because English word order is rather fixed!
- Syntactic arguments are combined with thematic roles
- Total number of 357 syntactic frames
- Encoding of additional restrictions: quotation, plural, infinitival
- 57 possible prepositions are listed, based on a hierarchy
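A frame pairs a surface pattern with thematic roles in order. A hypothetical encoding (field names and layout are ours, not VerbNet's XML schema):

```python
# Hypothetical encoding of a VerbNet-style syntactic frame: a surface
# pattern plus the thematic roles realized in that order. Field names
# are assumptions for illustration.

from dataclasses import dataclass, field

@dataclass
class Frame:
    pattern: str                     # e.g. "NP V PP"
    roles: list                      # thematic roles in surface order
    restrictions: dict = field(default_factory=dict)  # e.g. preposition choice

jump_frames = [
    Frame("NP V", ["Theme"]),                           # The horse jumped.
    Frame("NP V PP", ["Theme", "Location"],             # He jumped over the fence.
          {"preposition": "over"}),
]

assert jump_frames[1].roles == ["Theme", "Location"]
```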
Components: Semantic predicates
- Denote the relation between participants and events denoted by the verb
- Convey the key components of meaning for each class
- Semantic information is expressed in the following way:
  - to run: the verb is an instance of motion performed by a Theme
  - to eat: the verb is an instance of an event where an Agent takes in a Patient
Semantic predicates for multiple events
Sometimes a verb entails two events:
  Susan brought the book. → Susan moves, and the book moves towards some location.
VerbNet: the motion of the Agent equals the motion of the Theme.
Components: Event classification
States: eventualities without a perceptible change
  (1) a. The air is cold.
      b. He has a baby.
  VerbNet: (E)
Activities: the event does not denote a goal
  (2) a. Henry swims.
      b. Amra dances.
  VerbNet: during(E)
Accomplishments: the event denotes a goal and can be modified with the atelic "for an hour" → durative
  (3) a. Ravi paints the house (for an hour).
      b. Amra cooks the water (for an hour).
  VerbNet: result(E)
Achievements: the event denotes a goal but cannot be modified with the atelic "for an hour" → not durative
  (4) a. Tina arrives (*for an hour).
      b. I bake the cake (*for an hour).
  VerbNet: during(E), end(E)
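The classification above can be tabulated directly. The dictionary mirrors the slide's mapping; the helper function encodes only the "for an hour" diagnostic the slide discusses.

```python
# Sketch: Vendler-style event classes mapped to the VerbNet event
# predicates given above. The mapping mirrors the slide; the code is
# illustrative only.

EVENT_PREDICATES = {
    "state": "(E)",                      # The air is cold.
    "activity": "during(E)",             # Henry swims.
    "accomplishment": "result(E)",       # Ravi paints the house.
    "achievement": "during(E), end(E)",  # Tina arrives.
}

def takes_for_an_hour(event_class):
    """Per the slide: accomplishments accept atelic 'for an hour',
    achievements do not (states and activities are not tested here)."""
    return event_class == "accomplishment"

assert takes_for_an_hour("accomplishment")    # Ravi paints the house for an hour.
assert not takes_for_an_hour("achievement")   # *Tina arrives for an hour.
```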
Example: VerbNet class entry
- ID: e.g. run-51.3.2
- Members (with WordNet ID): e.g. run (run%2:38:00), walk (walk%2:38:03)
- Thematic roles + selectional restrictions:
    Agent [type = animate]
    Theme [type = animate | type = machine]
    Location [type = concrete]
- Syntactic frames:
    NP1 = Theme, e.g. The horse jumped.
    NP1 = Theme, PP = Location, e.g. He jumped over the fence.
- Semantic predicates: during(E)
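The pieces of such a class entry fit naturally into one record. A hypothetical in-memory rendering of run-51.3.2 as sketched above (the keys and layout are ours, not VerbNet's XML schema):

```python
# Hypothetical in-memory rendering of the run-51.3.2 entry sketched above.
# Keys and structure are assumptions, not VerbNet's actual schema.

run_class = {
    "id": "run-51.3.2",
    "members": {"run": "run%2:38:00", "walk": "walk%2:38:03"},  # WordNet IDs
    "roles": {
        "Agent": ["animate"],
        "Theme": ["animate", "machine"],   # disjunction on the slide
        "Location": ["concrete"],
    },
    "frames": [
        {"pattern": "NP V", "roles": ["Theme"]},                 # The horse jumped.
        {"pattern": "NP V PP", "roles": ["Theme", "Location"]},  # He jumped over the fence.
    ],
    "predicates": ["during(E)"],
}

assert "run" in run_class["members"]
assert run_class["roles"]["Theme"] == ["animate", "machine"]
```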
The VerbNet API
They reached the hill.
  motion(during(E), Agent)
  not(location(start(E), Agent, Destination))
  location(end(E), Agent, Destination)
→ motion(during(E), they)
  not(location(start(E), they, hill))
  location(end(E), they, hill)
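The step from the class-level predicates to the sentence-level reading is a substitution of thematic-role placeholders by the sentence's arguments. A minimal sketch (the substitution function is ours; a real system would first obtain the bindings from a parser):

```python
# Sketch of instantiating class-level semantic predicates with the
# arguments of "They reached the hill", as on the slide above. The
# string-substitution approach is an illustration, not VerbNet's API.

TEMPLATE = [
    "motion(during(E), Agent)",
    "not(location(start(E), Agent, Destination))",
    "location(end(E), Agent, Destination)",
]

def instantiate(template, bindings):
    """Replace thematic-role placeholders with the sentence's arguments."""
    out = []
    for pred in template:
        for role, filler in bindings.items():
            pred = pred.replace(role, filler)
        out.append(pred)
    return out

result = instantiate(TEMPLATE, {"Agent": "they", "Destination": "hill"})
assert result[0] == "motion(during(E), they)"
assert result[2] == "location(end(E), they, hill)"
```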
Start up your engines...
The VerbNet API
Task #1: Use the VerbNet Inspector to determine the following verb properties.
1. What are the thematic roles of the verb "to reach"?
2. What is the semantic representation of "I bake the potatoes"?
3. What are the syntactic frames of the verb "to balance" (the spatial configuration)?
Conclusion
Central question of semantic NLP: Who did What to Whom, and How, When and Where?
So far: VerbNet gets pretty close in defining many of the components we need for semantic NLP. It
- connects arguments with thematic roles,
- encodes event structure,
- marks some abstract meaning of events.
Facts
- A section of the Penn Treebank 2 annotated with thematic roles
- 112,917 annotated examples, 3,257 unique verbs
- Core arguments are numbered; optional arguments receive a label
- Primary goal: develop an annotated corpus as training data for supervised machine-learning systems
- Facilitates the kind of experiments that currently dominate NLP
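To make "numbered core arguments plus labeled optional arguments" concrete, here is how the first exercise sentence below might look as a PropBank-style record. The frame name achieve.01 and the dictionary layout are our assumptions for illustration, not official PropBank output.

```python
# Hypothetical PropBank-style annotation: numbered core arguments
# (ARG0, ARG1, ...) plus an ARGM modifier. Frame name and structure
# are assumptions, not actual PropBank data.

annotation = {
    "rel": "achieve.01",                                       # assumed frame
    "ARG0": "Ford",                                            # achiever
    "ARG1": "a 2% increase in sales figures",                  # thing achieved
    "ARGM-MNR": "by repeatedly launching clever adverts on television",
}

def core_args(ann):
    """Numbered core arguments only (ARG0-5), excluding ARGM modifiers."""
    return {k: v for k, v in ann.items()
            if k.startswith("ARG") and not k.startswith("ARGM")}

assert sorted(core_args(annotation)) == ["ARG0", "ARG1"]
```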
Frame file
(figure taken from Christopher Potts on Semantic Role Labeling)

PropBank example
(figure taken from Christopher Potts on Semantic Role Labeling)

Frame file
(figure taken from Christopher Potts on Semantic Role Labeling)

PropBank argument labels
(figure taken from Christopher Potts on Semantic Role Labeling)

PropBank verb labels
(figure taken from Christopher Potts on Semantic Role Labeling)

PropBank traces
(figure taken from Christopher Potts on Semantic Role Labeling)
Let's annotate some data!
Task #2: Go to the English PropBank at http://verbs.colorado.edu/propbank/ and choose the English frame sets. Then annotate the following sentences:
1. Ford achieves a 2% increase in sales figures by repeatedly launching clever adverts on television.
2. Confidential content of the meeting leaked to the press.
3. John's operation of the knee lasted three hours.
4. Any admission of guilt by the suspect helps the court.
5. I was warmly welcomed at Kathmandu Airport.
6. The boy whistled.
7. The bullet whistled through the air.
Summary
Virtues:
- Full gold-standard parses
- Different levels of annotation for one corpus (syntax, thematic roles)
Drawbacks:
- The labels ARG2-ARG5 are overloaded (VerbNet and FrameNet provide more fine-grained role labels)
- The WSJ is too domain-specific and too financial
- What about argument-taking nouns and adjectives?
Wrap-up