
Syntactic and Semantic Improvements to Computational Metaphor Processing

Kevin Stowe
B.A., Michigan State University, 2009
M.A., Indiana University, 2011

A thesis submitted to the Faculty of the Graduate School of the University of Colorado in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Department of Linguistics, 2019.

This thesis entitled "Syntactic and Semantic Improvements to Computational Metaphor Processing," written by Kevin Stowe, has been approved for the Department of Linguistics.

Martha Palmer
James H. Martin
Date

The final copy of this thesis has been examined by the signatories, and we find that both the content and the form meet acceptable presentation standards of scholarly work in the above-mentioned discipline.

Stowe, Kevin (Ph.D., Linguistics)
Syntactic and Semantic Improvements to Computational Metaphor Processing
Thesis directed by Professors Martha Palmer & Jim Martin

Identifying and interpreting figurative language is necessary for comprehensive natural language understanding. The main body of work on computational metaphor processing is based in lexical semantics. Recent evidence shows that syntactic constructions play a part in our production and comprehension of metaphors; the goal of this work is to identify areas where these theories can improve metaphor processing. This is done by exploiting dependency parses, syntax-based lexical resources, and distant supervision using linguistic analysis. Through these methods we show improvements over state-of-the-art deep learning models on a variety of metaphor processing tasks.

Dedicated to my parents, Ron and Kristine, who made all of this possible.

Acknowledgements

I would first and foremost like to thank my advisors Martha and Jim for all of their help making this possible, and for providing me the opportunity to pursue this interesting path of research.
I would also like to thank Susan Brown for her support on all things VerbNet, and for her constant encouragement. I am grateful to Laura Michaelis for her invaluable insight into every facet of language, and for helping me pursue many of the semantic details we both find interesting. I am also grateful for the support of Oana David, and for her inspiring work on the interaction between metaphors and constructions.

I would also like to thank the many other collaborators I have had the opportunity to work with at the University of Colorado. First, the people at Project EPIC: Leysia Palen, Ken Anderson, Jennings Anderson, Marina Kogan, and Melissa Bica have all been tremendous supporters and colleagues, giving me the opportunity to employ what I've learned in practical settings and opening my mind to more concrete, practical applications of our research. Additionally, the support of the Institute of Cognitive Science has been tremendous, especially through the practicums offered by Sidney D'Mello and Tamara Sumner.

Finally, I would like to thank all the fellow students who have helped and inspired my work. Jenette Preciado was a constant source of help and encouragement. I would also like to thank Claire Bonial, Meredith Green, Rebecca Lee, James Gung, and Tim O'Gorman for their support in all of the various linguistic, computational, and personal components that make this kind of research possible.

Contents

1 Introduction
    1.1 The Problem
    1.2 Research Questions
    1.3 Approach
2 Linguistic Background
    2.1 Some Basics
    2.2 Conceptual Metaphor Theory
        2.2.1 Invariance Principle
        2.2.2 Hierarchical Structure
        2.2.3 Word Senses
        2.2.4 Metaphor as Purely Cognitive
        2.2.5 Analysis
    2.3 Selectional Preferences and Lexical Features
        2.3.1 Selectional Preferences
        2.3.2 Lexical Features
        2.3.3 Analysis
    2.4 Alternatives
        2.4.1 Blending Theory
        2.4.2 Class Inclusion
    2.5 Frames, Metaphors, and Constructions
        2.5.1 Adjective-noun Constructions
        2.5.2 Argument Structure Constructions
        2.5.3 Metaphor Identification by Construction
        2.5.4 Analysis
    2.6 Differentiating Metaphors from Other Language
        2.6.1 Literal vs Figurative
        2.6.2 Similes
        2.6.3 Metonymy
        2.6.4 Idioms
        2.6.5 Analysis
    2.7 Summary
3 Computational Background
    3.1 What's the Task?
    3.2 Knowledge-based Systems
        3.2.1 MIDAS
        3.2.2 Induction-based Reasoning
        3.2.3 MetaNet
    3.3 Machine Learning
        3.3.1 Features
        3.3.2 Syntax and Lexical Resources
    3.4 Word Embeddings
        3.4.1 Types of Embedding Models
        3.4.2 Embeddings for Metaphor
    3.5 Neural Networks
    3.6 Summary
4 Lexical Resources
    4.1 Metaphors in Lexical Resources
    4.2 VerbNet
        4.2.1 Metaphoric/Literal VerbNet Classes
        4.2.2 Thematic Roles
        4.2.3 Syntactic Frames
        4.2.4 Semantic Frames
        4.2.5 Previous Applications of VerbNet for Metaphor Processing
        4.2.6 Summary
    4.3 FrameNet
        4.3.1 Frames
        4.3.2 Metaphoric/Literal FrameNet Frames
        4.3.3 Frame Elements
        4.3.4 Previous Applications of FrameNet for Metaphor Processing
        4.3.5 Summary
    4.4 PropBank
        4.4.1 Previous Applications of PropBank for Metaphor Processing
        4.4.2 Summary
    4.5 WordNet
        4.5.1 OntoNotes Sense Groupings
        4.5.2 Previous Applications of WordNet for Metaphor Processing
        4.5.3 Summary
    4.6 Lexical Resources Summary
5 Corpora
    5.1 Introduction
    5.2 Difficulties in Annotation
        5.2.1 Conventionalized Metaphors
        5.2.2 Unit of Analysis
        5.2.3 Different Kinds of Figuration
    5.3 VUAMC
    5.4 LCC
    5.5 TroFi
    5.6 The Mohammad et al. Dataset (MOH)
    5.7 Summary
6 Methods
    6.1 Tasks
        6.1.1 VUAMC
        6.1.2 MOH-X
        6.1.3 TroFi
        6.1.4 LCC
    6.2 Computational Methods
        6.2.1 Feature-based Machine Learning
            Support Vector Machines
        6.2.2 Deep Learning
            Long Short-Term Memory Networks
    6.3 Syntactic Features and Representations
    6.4 Baselines
        6.4.1 A Note on Significance
    6.5 Summary
7 Dependency Structures
    7.1 Introduction
    7.2 Implementation