Montague Meets Markov: Deep Semantics with Probabilistic Logical Form

Islam Beltagy§, Cuong Chau§, Gemma Boleda†, Dan Garrette§, Katrin Erk†, Raymond Mooney§
§Department of Computer Science, †Department of Linguistics
The University of Texas at Austin, Austin, Texas 78712
§{beltagy,ckcuong,dhg,mooney}@cs.utexas.edu
[email protected], [email protected]
[email protected] g Abstract # » # » sim(hamster, gerbil) = w We combine logical and distributional rep- resentations of natural language meaning by gerbil( transforming distributional similarity judg- hamster( ments into weighted inference rules using Markov Logic Networks (MLNs). We show x hamster(x) gerbil(x) f(w) that this framework supports both judg- 8 ! | ing sentence similarity and recognizing tex- tual entailment by appropriately adapting the Figure 1: Turning distributional similarity into a MLN implementation of logical connectives. weighted inference rule We also show that distributional phrase simi- larity, used as textual inference rules created on the fly, improves its performance. els (Turney and Pantel, 2010) use contextual sim- ilarity to predict semantic similarity of words and 1 Introduction phrases (Landauer and Dumais, 1997; Mitchell and Tasks in natural language semantics are very diverse Lapata, 2010), and to model polysemy (Schutze,¨ and pose different requirements on the underlying 1998; Erk and Pado,´ 2008; Thater et al., 2010). formalism for representing meaning. Some tasks This suggests that distributional models and logic- require a detailed representation of the structure of based representations of natural language meaning complex sentences. Some tasks require the ability to are complementary in their strengths (Grefenstette recognize near-paraphrases or degrees of similarity and Sadrzadeh, 2011; Garrette et al., 2011), which between sentences.