
Computationally Constructed Concepts: A Machine Learning Approach to Metaphor Interpretation Using Usage-Based Construction Grammatical Cues

Zachary Rosen
Department of Linguistics
The University of Colorado, Boulder
295 UCB, Boulder, CO 80309
[email protected]

Abstract

The current study seeks to implement a deep learning classification algorithm using an argument-structure level representation of metaphoric constructions for the identification of source domain mappings in metaphoric utterances. It thus builds on previous work in computational metaphor interpretation (Mohler et al. 2014; Shutova 2010; Bollegala & Shutova 2013; Hong 2016; Su et al. 2017) while implementing a theoretical framework based on work at the interface of metaphor and construction grammar (Sullivan 2006, 2007, 2013). The results indicate that it is possible to achieve an accuracy of approximately 80.4% using the proposed method, combining construction grammatical features with a simple deep learning NN. I attribute this increase in accuracy to the use of constructional cues extracted from the raw text of metaphoric instances.

1 Introduction

Lakoff's theory of conceptual metaphor has been highly influential in cognitive linguistic research since its initial publication (Lakoff & Johnson 1980). Conceptual metaphors represent fine-grained mappings of abstract concepts like "love" onto more concrete, tangible phenomena, like "journeys", which have material and culturally salient attributes like a PATH, various LANDMARKS, and a THEME which undergoes movement from a SOURCE to a GOAL (Lakoff & Johnson 1980). These tangible phenomena then serve as the basis for models from which speakers can reason about abstract ideas in a culturally transmissible manner. For example, consider the following metaphoric mappings for the metaphor LOVE IS MAGIC, as shown in figure 1.

  LOVER is a MAGICIAN             She cast her spell over me
  ATTRACTION is a SPELL           I was spellbound
  A RELATIONSHIP is BEWITCHMENT   He has me in a trance
Figure 1: Metaphoric Mapping & Example

To date, while automatic metaphor detection has been explored at some length, computational metaphor interpretation is still relatively new, and a growing number of researchers are beginning to explore the topic in greater depth. Recently, work by the team behind Berkeley's MetaNet has shown that a constructional and frame-semantic ontology can be used to accurately identify metaphoric utterances and generate possible source domain mappings, though at the cost of requiring a large database of metaphoric exemplars (Dodge et al. 2015; Hong 2016). Researchers from the Department of Cognitive Science at Xiamen University (Su et al. 2017) report that, using word embeddings, they have created a system that can reliably identify nominal-specific conceptual metaphors as well as interpret them, albeit within a very limited scope: the nominal modifier metaphors that they work with only include metaphors in which the source and target domain share what they refer to as a "direct ancestor", as in the case of "the surgeon is a butcher", limiting researchers to analyzing noun phrases with modifiers that exist in a single source and target domain. Other approaches have included developing literal paraphrases of metaphoric utterances (Shutova 2010; Bollegala & Shutova 2013) and, as an ancestor to the current study, clustering thematic co-occurrents (the AGENT, PATIENT, and ATTRIBUTE of the metaphoric sentence), which allowed researchers to predict a possible source domain label; think of "The bill blocked the way forward", where for the word "bill" the system predicted that it mapped to a "PHYSICAL OBJECT" role in the source domain (Mohler et al. 2014).

2 Construction Grammatical Approaches to Metaphor

The constructional makeup of metaphoric language has been explored at some length by a handful of researchers to date. Karen Sullivan, for example, has done considerable work on how syntactic structures (i.e. constructions) restrict the interpretation of metaphoric utterances in predictable ways, by both instantiating a semantic frame and mapping the target domain referent to a semantic role within the instantiated frame (Sullivan 2006, 2009, 2013). Notable examples of computational implementations of Sullivan's theories include Stickles et al. (2016) and Dodge et al. (2015), who have compiled a database of metaphoric frames, MetaNet, organized into an ontology of source domains for researchers to use in analyzing metaphoric utterances, similar to FrameNet.

One of the advantages of construction grammar with respect to figurative language interpretation lies in the regularity with which constructions establish form-meaning pairings. The various meanings of constructions rely heavily on particular "cues", including the verb as well as the syntactic template and argument-structure, which point speakers in the direction of a specific interpretation (Goldberg 2006). For the purpose of the current study, I will be focusing on the argument-structure of metaphoric utterances which, though it supplies a rather coarse-grained view of the meaning of an utterance, provides an excellent and stable constructional cue with respect to its interpretation (Goldberg 2006). As an example of how this might work, consider the difference between "the Holidays are coming up on us" and "we're coming up on the Holidays." In the first sentence, "the Holidays" is established as being mapped to a MOVING OBJECT in the source domain by virtue of its position in the argument-structure of the sentence. Meanwhile, in the second utterance "the Holidays" is mapped to a LOCATION or GOAL in the source domain due to its change in position in the argument-structure of the construction. Implicitly, this means that important information about the interpretation of a construction can be gleaned by extracting the arguments that fill its argument-structure and analyzing these arguments' relationships to one another, independent of cues beyond the sentence itself.
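To make this concrete, the difference in argument-structure position can be read directly off a dependency parse. The following is a minimal illustrative sketch, not part of the original study, assuming spaCy with its en_core_web_sm model; it simply prints the dependency slot that "Holidays" occupies in each of the two sentences above.

    # Illustrative sketch only (not from the original study): inspect which
    # argument-structure slot "Holidays" fills in each sentence, using spaCy's
    # dependency parser. Assumes the en_core_web_sm model is installed.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    for sent in ["The Holidays are coming up on us.",
                 "We're coming up on the Holidays."]:
        doc = nlp(sent)
        for tok in doc:
            if tok.text == "Holidays":
                # tok.dep_ is the dependency relation; tok.head is the governing word
                print(f"{sent!r}: 'Holidays' is {tok.dep_} of {tok.head.text!r}")

    # Expected (roughly): nsubj of "coming" in the first sentence, but pobj of
    # "on" in the second, mirroring the MOVING OBJECT vs. GOAL mappings above.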
3 Data Collection

All of the examples in this experiment were taken from the EN-Small LCC Metaphor Dataset, compiled and annotated by Mohler et al. (2016). The corpus contains 16,265 instances of conceptual metaphors from government discourse, including the immediate context sentences preceding and following them. Each sentence is given a metaphoricity score ranging from "-1" to "3", where "3" indicates high confidence that the sentence is metaphoric, "0" indicates that the sentence is not metaphoric, and "-1" indicates an invalid syntactic relationship between the target and source domain referents in the sentence (Mohler et al. 2016). Additionally, the corpus is annotated for polarity (negative, neutral, and positive), intensity, and situational protagonists (e.g. the "government", "individuals", etc.). The most important annotations for this study, though not present for every sentence, were the annotations for source-target domain mappings. A total of 7,941 sentences were annotated for these mappings, with 108 source domain tags, by five annotators (Mohler et al. 2016). Each annotator indicated not only what they thought the source domain was, but also gave the example an additional metaphoricity score based on their own judgment.

For the purposes of this study, I only used the metaphoric instances that were annotated for source-target domain mappings. For the source domain labels, I selected the label assigned by the annotator who had given the example the highest metaphoricity score. I initially attempted to select the source domain annotations that had the highest agreement amongst the annotators who had annotated the sentence, but this proved trickier than I had anticipated. After calculating the average Cohen's kappa score (54.4%), I decided that selecting labels based on their associated metaphoricity would be better. This effectively removed two annotators from the pool, who consistently ranked each metaphoric sentence as having a metaphoricity score of 1 or less.
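As a rough illustration of this label-selection step, the sketch below picks, for a single sentence, the source domain tag supplied by the annotator who gave it the highest metaphoricity score. The per-sentence data structure (a list of annotation dicts with hypothetical field names) is an assumption made for illustration and is not the LCC corpus's actual format.

    # Illustrative sketch only: the annotation records below use hypothetical
    # field names, not the LCC dataset's actual schema.
    def select_source_label(annotations):
        """Return the source domain tag given by the annotator who assigned
        this sentence the highest metaphoricity score."""
        best = max(annotations, key=lambda a: a["metaphoricity"])
        return best["source_domain"]

    example = [
        {"annotator": "A1", "source_domain": "PHYSICAL_OBJECT", "metaphoricity": 3},
        {"annotator": "A2", "source_domain": "BARRIER", "metaphoricity": 1},
    ]
    print(select_source_label(example))  # -> PHYSICAL_OBJECT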
I further restricted the training and test data by excluding multi-word expressions from the dataset for simplicity, though in the future I would very much like to re-test the methods outlined in the rest of this paper including the omitted MWEs. Finally, I removed any source domain annotations that included only a single example and split the data into training and testing data sets, using 85% as training data and 15% as testing data. Because of my exclusion of MWEs and of source domain tags that were used only once, this left me with a total of 1,985 sentences used in this experiment (1,633 of those were used as training data, and 352 were reserved as test data) with 77 source domain labels. The source labels were converted to integers and used as classes in the following Deep Neural Net (DNN) classifier.

4 The Neural Network Approach to Source Domain Interpretation

For each corpus example, I took the annotated target domain referent and found the verb that it was directly dependent on in the sentence. This ensured that the target domain referent was in its immediate context. Once the verb was found, I then built a representation of the argument structure of the sentence by extracting the following dependencies: (1) the verb for which the target domain referent was a dependency, (2) the subject of the verb in (1), (3) the object of the verb in (1), and, if the target domain referent was not included in the subject or direct object, (4) the target domain referent as a nominal modifier and (5) any prepositional arguments that it had as a dependency. Additionally, I extracted (6) the universal dependency tags for each of the arguments in the verb's argument-structure and converted them into a list of tags that I simply labeled "syntax", or "SYN", on the assumption that knowing what the dependencies were might help in identifying the exact relationships between the lexemes that had been collected.
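The sketch below shows one way this extraction step could be implemented. The paper does not state which dependency parser was used, so spaCy (with the en_core_web_sm model), the dependency label set, and the simple dictionary layout are assumptions for illustration rather than the study's exact pipeline.

    # Sketch of the argument-structure extraction described above, assuming a
    # spaCy dependency parse; parser choice, label set, and dict layout are
    # illustrative assumptions, not the study's exact implementation.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    def argument_structure(sentence, target):
        doc = nlp(sentence)
        tgt = next(tok for tok in doc if tok.text == target)
        # (1) the verb the target domain referent depends on (walk up the tree)
        verb = tgt.head
        while verb.pos_ != "VERB" and verb.head is not verb:
            verb = verb.head
        feats = {"V": verb.text, "SUBJ": None, "OBJ": None, "NMOD": None, "PREP": []}
        for child in verb.children:
            if child.dep_ in ("nsubj", "nsubjpass"):   # (2) subject of the verb
                feats["SUBJ"] = child.text
            elif child.dep_ in ("dobj", "obj"):        # (3) object of the verb
                feats["OBJ"] = child.text
            elif child.dep_ == "prep":                 # (5) prepositional arguments
                feats["PREP"].append(child.text)
        # (4) the target as a nominal modifier, when it is not the subject/object
        if target not in (feats["SUBJ"], feats["OBJ"]) and tgt.dep_ in ("nmod", "compound", "amod", "poss"):
            feats["NMOD"] = tgt.text
        # (6) the "SYN" list: dependency tags of the verb's arguments
        feats["SYN"] = [tok.dep_ for tok in verb.children]
        return feats

    print(argument_structure("The bill blocked the way forward.", "bill"))
    # e.g. {'V': 'blocked', 'SUBJ': 'bill', 'OBJ': 'way', ..., 'SYN': ['nsubj', 'dobj', ...]}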