Computing Word-Pair Antonymy

Saif Mohammad†  Bonnie Dorr†  Graeme Hirst φ

† Laboratory for Computational Linguistics and Information Processing, Institute for Advanced Computer Studies and Computer Science, University of Maryland and Human Language Technology Center of Excellence
{saif,bonnie}@umiacs.umd.edu

φ Department of Computer Science, University of Toronto
[email protected]

Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing, pages 982–991, Honolulu, October 2008. © 2008 Association for Computational Linguistics

Abstract

Knowing the degree of antonymy between words has widespread applications in natural language processing. Manually-created lexicons have limited coverage and do not include most semantically contrasting word pairs. We present a new automatic and empirical measure of antonymy that combines corpus statistics with the structure of a published thesaurus. The approach is evaluated on a set of closest-opposite questions, obtaining a precision of over 80%. Along the way, we discuss what humans consider antonymous and how antonymy manifests itself in utterances.

1 Introduction

Native speakers of a language intuitively recognize different degrees of antonymy—whether two words are strongly antonymous (hot–cold, good–bad, friend–enemy), just semantically contrasting (enemy–fan, cold–lukewarm, ascend–slip) or not antonymous at all (penguin–clown, cold–chilly, boat–rudder). Over the years, many definitions of antonymy have been proposed by linguists (Cruse, 1986; Lehrer and Lehrer, 1982), cognitive scientists (Kagan, 1984), psycholinguists (Deese, 1965), and lexicographers (Egan, 1984), which differ from each other in small and large respects. In its strictest sense, antonymy applies to gradable adjectives, such as hot–cold and tall–short, where the two words represent the two ends of a semantic dimension. In a broader sense, it includes other adjectives, nouns, and verbs as well (life–death, ascend–descend, shout–whisper). In its broadest sense, it applies to any two words that represent contrasting meanings. We will use the term degree of antonymy to encompass the complete semantic range—a combined measure of the contrast in meaning conveyed by two words and the tendency of native speakers to call them opposites. The higher the degree of antonymy between a target word pair, the greater the semantic contrast between them and the greater their tendency to be considered antonym pairs by native speakers.

Automatically determining the degree of antonymy between words has many uses, including detecting and generating paraphrases (The dementors caught Sirius Black / Black could not escape the dementors) and detecting contradictions (Marneffe et al., 2008; Voorhees, 2008) (Kyoto has a predominantly wet climate / It is mostly dry in Kyoto). Of course, such “contradictions” may be a result of differing sentiment, new information, non-coreferent mentions, or genuinely contradictory statements. Antonyms often indicate the discourse relation of contrast (Marcu and Echihabi, 2002). They are also useful for detecting humor (Mihalcea and Strapparava, 2005), as satire and jokes tend to have contradictions and oxymorons. Lastly, it is useful to know which words are semantically contrasting to a target word, even if simply to filter them out. For example, in the automatic creation of a thesaurus it is necessary to distinguish near-synonyms from word pairs that are semantically contrasting. Measures of distributional similarity fail to do so. Detecting antonymous words is not sufficient to solve most of these problems, but it remains a crucial, and largely unsolved, component.

Lexicons of pairs of words that native speakers consider antonyms have been created for certain languages, but their coverage has been limited. Further, as each term of an antonymous pair can have many semantically close terms, the contrasting word pairs far outnumber those that are commonly considered antonym pairs, and they remain unrecorded. Even though a number of computational approaches have been proposed for semantic closeness, and some for hypernymy–hyponymy (Hearst, 1992), measures of antonymy have been less successful. To some extent, this is because antonymy is not as well understood as other classical lexical-semantic relations.

We first very briefly summarize insights and intuitions about this phenomenon, as proposed by linguists and lexicographers (Section 2). We discuss related work (Section 3). We describe the resources we use (Section 4) and present experiments that examine the manifestation of antonymy in text (Sections 5 and 6). We then propose a new empirical approach to determine the degree of antonymy between two words (Section 7). We compiled a dataset of 950 closest-opposite questions, which we used for evaluation (Section 8). We conclude with a discussion of the merits and limitations of this approach and outline future work.

2 The paradoxes of antonymy

Antonymy, like synonymy and hyponymy, is a lexical-semantic relation that, strictly speaking, applies to two lexical units—combinations of surface form and word sense. (That said, for simplicity and where appropriate we will use the term “antonymous words” as a proxy for “antonymous lexical units”.) However, accepting this leads to two interesting and seemingly paradoxical questions (described below in the two subsections).

2.1 Why are some pairs better antonyms?

Native speakers of a language consider certain contrasting word pairs to be antonymous (for example, large–small), and certain other seemingly equivalent word pairs as less so (for example, large–little). A number of reasons have been suggested: (1) Cruse (1986) observes that if the meaning of the target words is completely defined by one semantic dimension and the words represent the two ends of this semantic dimension, then they tend to be considered antonyms. We will refer to this semantic dimension as the dimension of opposition. (2) If, on the other hand, as Lehrer and Lehrer (1982) point out, there is more to the meaning of the antonymous words than the dimension of opposition—for example, more semantic dimensions or added connotations—then the two words are not so strongly antonymous. Most people do not think of chubby as a direct antonym of thin because it has the additional connotation of being cute and informal. (3) Cruse (1986) also postulates that word pairs are not considered strictly antonymous if it is difficult to identify the dimension of opposition (for example, city–farm). (4) Charles and Miller (1989) claim that two contrasting words are identified as antonyms if they occur together in a sentence more often than chance. However, Murphy and Andrew (1993) claim that the greater-than-chance co-occurrence of antonyms in sentences is because together they convey contrast well, which is rhetorically useful, and not really the reason why they are considered antonyms in the first place.

2.2 Are semantic closeness and antonymy opposites?

Two words (more precisely, two lexical units) are considered to be close in meaning if there is a lexical-semantic relation between them. Lexical-semantic relations are of two kinds: classical and non-classical. Examples of classical relations include synonymy, hyponymy, troponymy, and meronymy. Non-classical relations, as pointed out by Morris and Hirst (2004), are much more common and include concepts pertaining to another concept (kind, chivalrous, formal pertaining to gentlemanly), and commonly co-occurring words (for example, problem–solution pairs such as homeless, shelter). Semantic distance (or closeness) in this broad sense is known as semantic relatedness. Two words are considered to be semantically similar if they are associated via the synonymy, hyponymy–hypernymy, or the troponymy relation. So terms that are semantically similar (plane–glider, doctor–surgeon) are also semantically related, but terms that are semantically related may not always be semantically similar (plane–sky, surgeon–scalpel).

Antonymy is unique among these relations because it simultaneously conveys both a sense of closeness and of distance (Cruse, 1986). Antonymous concepts are semantically related but not semantically similar.

3 Related work

Charles and Miller (1989) proposed that antonyms occur together in a sentence more often than chance. This is known as the co-occurrence hypothesis. They also showed that this was empirically true for four adjective antonym pairs. Justeson and Katz (1991) demonstrated the co-occurrence hypothesis for 35 prototypical antonym pairs (from an original set of 39 antonym pairs compiled by Deese (1965)) and also for an additional 22 frequent antonym pairs. All of these pairs were adjectives. Fellbaum (1995) conducted similar experiments on 47 noun, verb, adjective, and adverb pairs (noun–noun, noun–verb, noun–adjective, verb–adverb and so on) pertaining to 18 concepts (for example, lose(v)–gain(n) and loss(n)–gain(n), where lose(v) and loss(n) pertain to the concept of “failing to have/maintain”).

… between two words in text and also cue words such as but, from, and and. Unfortunately, they evaluated their method on only 18 word pairs. Neither of these methods determines the degree of antonymy between words, and they have not been shown to have substantial coverage. Schwab et al. (2002) create an “antonymous vector” for a target word. The closer this vector is to the context vectors of the other target word, the more antonymous the two target words are. However, the antonymous vectors are manually created. Further, the approach is not evaluated beyond a handful of word pairs.

Work in sentiment detection and opinion mining aims at determining the polarity of words. For example, Pang, Lee and Vaithyanathan (2002) detect that adjectives such as dazzling, brilliant, and gripping cast their qualifying nouns positively, whereas adjectives such as bad, cliched, and boring portray the noun negatively. Many of these gradable adjectives have antonyms, but these approaches do not attempt to determine pairs of positive and negative
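The co-occurrence hypothesis discussed above can be made concrete with sentence-level pointwise mutual information (PMI): a pair co-occurs more often than chance exactly when its PMI is positive. The following is a minimal sketch, not the method of any paper cited here; the mini-corpus and word pairs are invented for illustration.

```python
from math import log2

def sentence_pmi(sentences, x, y):
    """PMI of words x and y appearing in the same sentence:
    log2( P(x,y) / (P(x) * P(y)) ), with probabilities estimated
    from sentence counts. PMI > 0 means greater-than-chance
    co-occurrence, as the co-occurrence hypothesis predicts for antonyms."""
    n = len(sentences)
    n_x = sum(1 for s in sentences if x in s)
    n_y = sum(1 for s in sentences if y in s)
    n_xy = sum(1 for s in sentences if x in s and y in s)
    if 0 in (n_x, n_y, n_xy):
        return float("-inf")  # never co-occur: no evidence of association
    return log2((n_xy / n) / ((n_x / n) * (n_y / n)))

# Toy corpus: each sentence reduced to its set of word tokens.
corpus = [
    {"the", "water", "ran", "hot", "and", "cold"},
    {"hot", "days", "follow", "cold", "nights"},
    {"a", "hot", "meal"},
    {"a", "cold", "wind"},
    {"a", "chilly", "wind"},
]

print(sentence_pmi(corpus, "hot", "cold"))     # antonym pair: positive PMI
print(sentence_pmi(corpus, "cold", "chilly"))  # near-synonyms here: never co-occur
```

On real corpora the same computation would use sentence-segmented text and raw counts; smoothing matters for rare words, which this sketch ignores.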
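The antonymous-vector idea of Schwab et al. (2002), mentioned above, can also be sketched: score a candidate word by the cosine between a hand-built vector of contexts expected around the target's opposites and the candidate's observed context vector. This is an illustrative toy, not their implementation; all vectors and feature words below are invented.

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity of two sparse vectors given as word -> count dicts."""
    dot = sum(u[k] * v.get(k, 0) for k in u)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

# Hand-built "antonymous vector" for hot: contexts expected around its opposites.
antonymous_hot = {"freeze": 2, "ice": 3, "winter": 2, "shiver": 1}

# Observed context vectors (toy counts) for two candidate words.
context_cold = {"ice": 4, "winter": 3, "freeze": 1, "wind": 2}
context_chilly = {"wind": 3, "evening": 2, "autumn": 1}

print(cosine(antonymous_hot, context_cold))    # high: cold looks antonymous to hot
print(cosine(antonymous_hot, context_chilly))  # low: chilly shares little context
```

The manual construction of the antonymous vector is exactly the limitation noted above: the score is only as good as the hand-chosen contexts.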