
Joint Syntactic and Semantic Parsing with Combinatory Categorial Grammar

Jayant Krishnamurthy and Tom M. Mitchell
Carnegie Mellon University
5000 Forbes Avenue, Pittsburgh, PA 15213
[email protected] [email protected]

Abstract

We present an approach to training a joint syntactic and semantic parser that combines syntactic training information from CCGbank with semantic training information from a knowledge base via distant supervision. The trained parser produces a full syntactic parse of any sentence, while simultaneously producing logical forms for portions of the sentence that have a semantic representation within the parser's predicate vocabulary. We demonstrate our approach by training a parser whose semantic representation contains 130 predicates from the NELL ontology. A semantic evaluation demonstrates that this parser produces logical forms better than both comparable prior work and a pipelined syntax-then-semantics approach. A syntactic evaluation on CCGbank demonstrates that the parser's dependency F-score is within 2.5% of state-of-the-art.

1 Introduction

Integrating syntactic parsing with semantics has long been a goal of natural language processing and is expected to improve both syntactic and semantic processing. For example, semantics could help predict the differing prepositional phrase attachments in "I caught the butterfly with the net" and "I caught the butterfly with the spots." A joint analysis could also avoid propagating syntactic parsing errors into semantic processing, thereby improving performance.

We suggest that a large populated knowledge base should play a key role in syntactic and semantic parsing: in training the parser, in resolving syntactic ambiguities when the trained parser is applied to new text, and in its output semantic representation. Using semantic information from the knowledge base at training and test time will ideally improve the parser's ability to solve difficult syntactic parsing problems, as in the examples above. A semantic representation tied to a knowledge base allows for powerful inference operations, such as identifying the possible entity referents of a noun phrase, that cannot be performed with shallower representations (e.g., frame semantics (Baker et al., 1998) or a direct conversion of syntax to logic (Bos, 2005)).

This paper presents an approach to training a joint syntactic and semantic parser using a large background knowledge base. Our parser produces a full syntactic parse of every sentence, and furthermore produces logical forms for portions of the sentence that have a semantic representation within the parser's predicate vocabulary. For example, given a phrase like "my favorite town in California," our parser will assign a logical form like λx.CITY(x) ∧ LOCATEDIN(x, CALIFORNIA) to the "town in California" portion. Additionally, the parser uses predicate and entity type information during parsing to select a syntactic parse.
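To make this logical form concrete, the following minimal Python sketch (ours, not ASP's implementation) treats λx.CITY(x) ∧ LOCATEDIN(x, CALIFORNIA) as an executable predicate over a toy knowledge base; all entities and facts below are invented for illustration.

    # Minimal sketch (not ASP's implementation): the logical form
    # lambda x. CITY(x) AND LOCATEDIN(x, CALIFORNIA) as an executable
    # predicate over a toy knowledge base. All facts below are invented.
    CITY = {"sacramento", "portland"}
    LOCATEDIN = {("sacramento", "california"), ("portland", "oregon")}

    def town_in_california(x):
        return x in CITY and (x, "california") in LOCATEDIN

    # The kind of inference a KB-grounded representation supports:
    # enumerating the possible entity referents of the noun phrase.
    referents = [e for e in sorted(CITY) if town_in_california(e)]
    print(referents)  # ['sacramento']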
Our parser is trained by combining a syntactic parsing task with a distantly-supervised relation extraction task. Syntactic information is provided by CCGbank, a conversion of the Penn Treebank into the CCG formalism (Hockenmaier and Steedman, 2002a). Semantics are learned by training the parser to extract knowledge base relation instances from a corpus of unlabeled sentences, in a distantly-supervised training regime. This approach uses the knowledge base to avoid expensive manual labeling of individual sentence semantics. By optimizing the parser to perform both tasks simultaneously, we train a parser that produces accurate syntactic and semantic analyses.

We demonstrate our approach by training a joint syntactic and semantic parser, which we call ASP. ASP produces a full syntactic analysis of every sentence while simultaneously producing logical forms containing any of 61 category and 69 relation predicates from NELL. Experiments with ASP demonstrate that jointly analyzing syntax and semantics improves semantic parsing performance over comparable prior work and a pipelined syntax-then-semantics approach. ASP's syntactic parsing performance is within 2.5% of state-of-the-art; however, we also find that incorporating semantic information reduces syntactic parsing accuracy by approximately 0.5%.

2 Prior Work

This paper combines two lines of prior work: broad coverage syntactic parsing with CCG and semantic parsing.

Broad coverage syntactic parsing with CCG has produced both resources and successful parsers. These parsers are trained and evaluated using CCGbank (Hockenmaier and Steedman, 2002a), an automatic conversion of the Penn Treebank into the CCG formalism. Several broad coverage parsers have been trained using this resource (Hockenmaier and Steedman, 2002b; Hockenmaier, 2003b). The parsing model in this paper is loosely based on C&C (Clark and Curran, 2007b; Clark and Curran, 2007a), a discriminative log-linear model for statistical parsing. Some work has also attempted to automatically derive logical meaning representations directly from syntactic CCG parses (Bos, 2005; Lewis and Steedman, 2013). However, these approaches to semantics do not ground the text to beliefs in a knowledge base.

Meanwhile, work on semantic parsing has focused on producing semantic parsers for answering simple natural language questions (Zelle and Mooney, 1996; Ge and Mooney, 2005; Wong and Mooney, 2006; Wong and Mooney, 2007; Lu et al., 2008; Kate and Mooney, 2006; Zettlemoyer and Collins, 2005; Kwiatkowski et al., 2011). This line of work has typically used a corpus of sentences with annotated logical forms to train the parser. Recent work has relaxed the requisite supervision conditions (Clarke et al., 2010; Liang et al., 2011), but has still focused on simple questions. Finally, some work has looked at applying semantic parsing to answer queries against large knowledge bases, such as YAGO (Yahya et al., 2012) and Freebase (Cai and Yates, 2013b; Cai and Yates, 2013a; Kwiatkowski et al., 2013; Berant et al., 2013). Although this work considers a larger number (thousands) of predicates than we do, none of these systems are capable of parsing open-domain text. Our approach is most closely related to the distantly-supervised approach of Krishnamurthy and Mitchell (2012).

The parser presented in this paper can be viewed as a combination of both a broad coverage syntactic parser and a semantic parser trained using distant supervision. Combining these two lines of work has synergistic effects; for example, our parser is capable of semantically analyzing conjunctions and relative clauses based on the syntactic annotation of these categories in CCGbank. This synergy gives our parser a richer semantic representation than previous work, while simultaneously enabling broad coverage.

3 Parser Design

This section describes the Combinatory Categorial Grammar (CCG) parsing model used by ASP. The input to the parser is a part-of-speech tagged sentence, and the output is a syntactic CCG parse tree, along with zero or more logical forms representing the semantics of subspans of the sentence. These logical forms are constructed using category and relation predicates from a broad coverage knowledge base. The parser also outputs a collection of dependency structures summarizing the sentence's predicate-argument structure. Figure 1 illustrates ASP's input/output specification.

[Figure 1: Example input and output for ASP. Given the POS-tagged phrase "area that includes beautiful London", the parser produces a CCG syntactic tree whose root carries the logical form λz.∃x,y.LOCATION(z) ∧ x = z ∧ M(y, "london", CITY) ∧ LOCATEDIN(y, x) (top), and a collection of dependency structures relating head words to their arguments (bottom).]

3.1 Knowledge Base

The parser uses category and relation predicates from a broad coverage knowledge base both to construct logical forms and to parametrize the parsing model. The knowledge base is assumed to have two kinds of ontological structure: a generalization/subsumption hierarchy and argument type constraints. This paper uses NELL's ontology (Carlson et al., 2010), which, for example, specifies that the category ORGANIZATION is a generalization of SPORTSTEAM, and that both arguments to the LOCATEDIN relation must have type LOCATION. These type constraints are enforced during parsing. Throughout this paper, predicate names are shown in SMALLCAPS.
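As a rough illustration of these two kinds of ontological structure, the following Python sketch (ours; NELL's ontology is far larger and is not accessed this way) encodes a small subsumption hierarchy and per-relation argument type constraints, and checks candidate relation instances against them.

    # Illustrative sketch of the assumed ontological structure (not NELL's API).
    GENERALIZATIONS = {  # child category -> parent category
        "SPORTSTEAM": "ORGANIZATION",
        "CITY": "LOCATION",
    }
    ARG_TYPES = {  # relation -> (first argument type, second argument type)
        "LOCATEDIN": ("LOCATION", "LOCATION"),
    }

    def is_a(category, ancestor):
        # True if `category` equals `ancestor` or generalizes to it.
        while category is not None:
            if category == ancestor:
                return True
            category = GENERALIZATIONS.get(category)
        return False

    def type_check(relation, arg1_category, arg2_category):
        # Enforce the relation's argument type constraints, as done during parsing.
        t1, t2 = ARG_TYPES[relation]
        return is_a(arg1_category, t1) and is_a(arg2_category, t2)

    print(type_check("LOCATEDIN", "CITY", "LOCATION"))    # True
    print(type_check("LOCATEDIN", "SPORTSTEAM", "CITY"))  # False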
3.2 Syntax

ASP uses a lexicalized and semantically-typed Combinatory Categorial Grammar (CCG) (Steedman, 1996). Most grammatical information in CCG is encoded in a lexicon Λ, containing entries such as:

    person := N : PERSON : λx.PERSON(x)
    London := N : CITY : λx.M(x, "london", CITY)
    great := N1/N1 : — : λf.λx.f(x)
    bought := (S[dcl]\NP1)/NP2 : ACQUIRED : λf.λg.∃x,y.g(x) ∧ f(y) ∧ ACQUIRED(x, y)

Each entry pairs a word with a syntactic category, a semantic type, and a logical form. The semantic type is a knowledge base predicate that concisely represents the word's semantics; it is used to enforce type constraints during parsing and to include semantics in the parser's parametrization.
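To fix the shape of these lexicon entries in code, the sketch below stores each entry as a (word, syntactic category, semantic type, logical form) tuple. The encoding and the string rendering of the logical forms are ours rather than ASP's internal format, and the logical form shown for "bought" follows the pattern of the "includes" entry in Figure 1.

    from collections import namedtuple

    # One possible encoding of CCG lexicon entries (not ASP's internal format).
    # semantic_type is a knowledge base predicate, or None when the word has
    # no KB semantics; logical forms are written as plain strings for brevity.
    LexiconEntry = namedtuple(
        "LexiconEntry", ["word", "category", "semantic_type", "logical_form"])

    LEXICON = [
        LexiconEntry("person", "N", "PERSON", "\\x.PERSON(x)"),
        LexiconEntry("London", "N", "CITY", "\\x.M(x, 'london', CITY)"),
        LexiconEntry("great", "N1/N1", None, "\\f.\\x.f(x)"),
        LexiconEntry("bought", "(S[dcl]\\NP1)/NP2", "ACQUIRED",
                     "\\f.\\g.Ex,y. g(x) & f(y) & ACQUIRED(x, y)"),
    ]

    def entries_for(word):
        # Look up the lexicon entries for a word; a real parser would also
        # consult the POS tag and back off for unknown words.
        return [e for e in LEXICON if e.word == word]

    print(entries_for("London"))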