Learning Structured Natural Language Representations for Semantic Parsing

Jianpeng Cheng†  Siva Reddy†  Vijay Saraswat‡  and Mirella Lapata†
†School of Informatics, University of Edinburgh  ‡IBM T.J. Watson Research
{jianpeng.cheng, siva.reddy}@ed.ac.uk, [email protected], [email protected]

Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, pages 44-55, Vancouver, Canada, July 30 - August 4, 2017. https://doi.org/10.18653/v1/P17-1005

Abstract

We introduce a neural semantic parser which is interpretable and scalable. Our model converts natural language utterances to intermediate, domain-general natural language representations in the form of predicate-argument structures, which are induced with a transition system and subsequently mapped to target domains. The semantic parser is trained end-to-end using annotated logical forms or their denotations. We achieve the state of the art on SPADES and GRAPHQUESTIONS and obtain competitive results on GEOQUERY and WEBQUESTIONS. The induced predicate-argument structures shed light on the types of representations useful for semantic parsing and how these differ from linguistically motivated ones.¹

1 Introduction

Semantic parsing is the task of mapping natural language utterances to machine-interpretable meaning representations. Despite differences in the choice of meaning representation and model structure, most existing work conceptualizes semantic parsing along two main approaches. Under the first approach, an utterance is parsed and grounded to a meaning representation directly via learning a task-specific grammar (Zelle and Mooney, 1996; Zettlemoyer and Collins, 2005; Wong and Mooney, 2006; Kwiatkowski et al., 2010; Liang et al., 2011; Berant et al., 2013; Flanigan et al., 2014; Pasupat and Liang, 2015; Groschwitz et al., 2015). Under the second approach, the utterance is first parsed to an intermediate task-independent representation tied to a syntactic parser and then mapped to a grounded representation (Kwiatkowski et al., 2013; Reddy et al., 2014, 2016; Krishnamurthy and Mitchell, 2015; Gardner and Krishnamurthy, 2017). A merit of the two-stage approach is that it creates reusable intermediate interpretations, which potentially enables the handling of unseen words and knowledge transfer across domains (Bender et al., 2015).

The successful application of encoder-decoder models (Bahdanau et al., 2015; Sutskever et al., 2014) to a variety of NLP tasks has provided strong impetus to treat semantic parsing as a sequence transduction problem where an utterance is mapped to a target meaning representation in string format (Dong and Lapata, 2016; Jia and Liang, 2016; Kočiský et al., 2016). Such models still fall under the first approach; however, in contrast to previous work (Zelle and Mooney, 1996; Zettlemoyer and Collins, 2005; Liang et al., 2011), they reduce the need for domain-specific assumptions, grammar learning, and more generally extensive feature engineering. But this modeling flexibility comes at a cost, since it is no longer possible to interpret how meaning composition is performed. Such knowledge plays a critical role in understanding modeling limitations so as to build better semantic parsers. Moreover, without any task-specific prior knowledge, the learning problem is fairly unconstrained, both in terms of the possible derivations to consider and in terms of the target output, which can be ill-formed (e.g., with extra or missing brackets).

In this work, we propose a neural semantic parser that alleviates the aforementioned problems. Our model falls under the second class of approaches, where utterances are first mapped to an intermediate representation containing natural language predicates. However, rather than using an external parser (Reddy et al., 2014, 2016) or manually specified CCG grammars (Kwiatkowski et al., 2013), we induce intermediate representations in the form of predicate-argument structures from data. This is achieved with a transition-based approach which by design yields recursive semantic structures, avoiding the problem of generating ill-formed meaning representations. Compared to most existing semantic parsers, which employ a CKY-style bottom-up parsing strategy (Krishnamurthy and Mitchell, 2012; Cai and Yates, 2013; Berant et al., 2013; Berant and Liang, 2014), the transition-based approach we propose does not require feature decomposition over structures and thereby enables the exploration of rich, non-local features. The output of the transition system is then grounded (e.g., to a knowledge base) with a neural mapping model under the assumption that grounded and ungrounded structures are isomorphic.² As a result, we obtain a neural model that jointly learns to parse natural language semantics and induce a lexicon that helps grounding.

The whole network is trained end-to-end on natural language utterances paired with annotated logical forms or their denotations. We conduct experiments on four datasets, including GEOQUERY (which has logical forms; Zelle and Mooney 1996), SPADES (Bisk et al., 2016), WEBQUESTIONS (Berant et al., 2013), and GRAPHQUESTIONS (Su et al., 2016) (which have denotations). Our semantic parser achieves the state of the art on SPADES and GRAPHQUESTIONS, while obtaining competitive results on GEOQUERY and WEBQUESTIONS. A side-product of our modeling framework is that the induced intermediate representations can contribute to rationalizing neural predictions (Lei et al., 2016). Specifically, they can shed light on the kinds of representations (especially predicates) useful for semantic parsing. Evaluation of the induced predicate-argument relations against syntax-based ones reveals that they are interpretable and meaningful compared to heuristic baselines, but they sometimes deviate from linguistic conventions.

¹ Our code is available at https://github.com/cheng6076/scanner.
² We discuss the merits and limitations of this assumption in Section 5.
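The claim that a transition system yields well-formed structures by design, rather than by hoping a string decoder balances its own brackets, can be made concrete with a small sketch. The following Python fragment is an illustration, not the authors' implementation: the action names NT (open a subtree), TER (emit a terminal) and RED (reduce, i.e., close the current subtree) are borrowed from the transition-based parsing literature (Dyer et al., 2016), and the state representation is an assumption made for exposition.

```python
# Sketch: a derivation is a sequence of actions, and only actions that
# keep the structure well-formed are ever offered, so no action sequence
# can produce unbalanced brackets.  Illustrative assumptions throughout.

def valid_actions(stack):
    """Actions legal in the current state: TER and RED are only
    available while at least one subtree is open."""
    return ["NT"] + (["TER", "RED"] if stack else [])

def execute(steps):
    """Run a sequence of (action, symbol) pairs; return a FunQL string."""
    stack = []  # each entry: [predicate, arg1, arg2, ...]
    for action, symbol in steps:
        if action not in valid_actions(stack):
            raise ValueError(f"illegal action {action} in this state")
        if action == "NT":            # open a subtree headed by a predicate
            stack.append([symbol])
        elif action == "TER":         # attach a terminal argument
            stack[-1].append(symbol)
        else:                         # RED: close the current subtree
            head, *args = stack.pop()
            subtree = f"{head}({', '.join(args)})"
            if not stack:
                return subtree        # derivation complete
            stack[-1].append(subtree)
    raise ValueError("derivation ended with unclosed subtrees")

print(execute([("NT", "answer"), ("NT", "exclude"),
               ("NT", "states"), ("TER", "all"), ("RED", None),
               ("NT", "border"), ("TER", "texas"), ("RED", None),
               ("RED", None), ("RED", None)]))
# -> answer(exclude(states(all), border(texas)))
```

Because a decoder over such a system can only score actions returned by valid_actions, outputs with extra or missing brackets are simply unrepresentable, which is exactly the failure mode of string-based decoding discussed above.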
2 Preliminaries

Problem Formulation  Let K denote a knowledge base or, more generally, a reasoning system, and x an utterance paired with a grounded meaning representation G or its denotation y. Our problem is to learn a semantic parser that maps x to G via an intermediate ungrounded representation U. When G is executed against K, it outputs denotation y.

Grounded Meaning Representation  We represent grounded meaning representations in FunQL (Kate et al., 2005), amongst many other alternatives such as lambda calculus (Zettlemoyer and Collins, 2005), λ-DCS (Liang, 2013) or graph queries (Holzschuher and Peinl, 2013; Harris et al., 2013). FunQL is a variable-free query language, where each predicate is treated as a function symbol that modifies an argument list. For example, the FunQL representation for the utterance "which states do not border texas" is:

answer(exclude(state(all), next_to(texas)))

where next_to is a domain-specific binary predicate that takes one argument (i.e., the entity texas) and returns a set of entities (e.g., the states bordering Texas) as its denotation. all is a special predicate that returns a collection of entities. exclude is a predicate that returns the difference between two input sets.

An advantage of FunQL is that the resulting s-expression encodes the semantic compositionality and derivation of the logical forms. This property makes FunQL logical forms convenient to predict with recurrent neural networks (Vinyals et al., 2015; Choe and Charniak, 2016; Dyer et al., 2016). However, FunQL is less expressive than lambda calculus, partially due to the elimination of variables. A more compact logical formulation, to which our method also applies, is λ-DCS (Liang, 2013). In the absence of anaphora and composite binary predicates, conversion algorithms exist between FunQL and λ-DCS; however, we leave this to future work.
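To make the FunQL semantics above concrete, here is a toy interpreter. The three-entity knowledge base is invented for illustration; only the predicate behaviour (function symbols over sets, with all returning a collection of entities and exclude computing a set difference) follows the description above.

```python
# Toy FunQL evaluation over a hand-made geography fragment.
# The facts below are invented for illustration.
STATES = {"texas", "oklahoma", "vermont"}
BORDERS = {"texas": {"oklahoma"}, "oklahoma": {"texas"}}

ALL = set(STATES)                      # 'all': an entire set of entities

def state(entities):                   # type predicate: keep only states
    return entities & STATES

def next_to(entity):                   # domain-specific binary predicate:
    return BORDERS.get(entity, set())  # the states bordering `entity`

def exclude(a, b):                     # logical connector: set difference
    return a - b

def answer(entities):                  # denotation wrapper
    return entities

# "which states do not border texas"
print(answer(exclude(state(ALL), next_to("texas"))))
# -> {'texas', 'vermont'}  (set order may vary)
```

Note that, per the logical form, texas itself survives the exclusion: the denotation is determined entirely by the meaning representation, not by the interpreter.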
Ungrounded Meaning Representation  We also use FunQL to express ungrounded meaning representations. The latter consist primarily of natural language predicates and domain-general predicates. Assuming for simplicity that domain-general predicates share the same vocabulary in ungrounded and grounded representations, the ungrounded representation for the example utterance is:

answer(exclude(states(all), border(texas)))

where states and border are natural language predicates. In this work we consider five types of domain-general predicates, illustrated in Table 1. Notice that domain-general predicates are often implicit, or represent extra-sentential knowledge. For example, the predicate all in the above utterance …

| Predicate          | Usage                                  | Sub-categories                 |
|--------------------|----------------------------------------|--------------------------------|
| answer             | denotation wrapper                     | —                              |
| type               | entity type checking                   | stateid, cityid, riverid, etc. |
| all                | querying for an entire set of entities | —                              |
| aggregation        | one-argument meta predicates for sets  | count, largest, smallest, etc. |
| logical connectors | two-argument meta predicates for sets  | intersect, union, exclude      |

Table 1: List of domain-general predicates.

The algorithm uses a buffer to store input tokens in the utterance and a stack to store partially completed trees. A major difference in our semantic parsing scenario is that tokens in the buffer are not fetched in a sequential order or removed from the buffer. This is because the lexical alignment between an utterance and its semantic representation is hidden. Moreover, some predicates cannot be clearly anchored to a token span. Therefore, we allow the generation algorithm to pick tokens and combine logical forms …
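A minimal sketch of the buffer behaviour just described: unlike the queue of a syntactic shift-reduce parser, tokens may be picked in any order and are not consumed, because the alignment between utterance and logical form is latent. The hard pick-by-index below is an illustrative stand-in for the selection the neural model would learn; nothing here is the paper's implementation.

```python
# Sketch: a buffer whose tokens can be picked in any order and reused,
# versus a conventional queue that is consumed left to right.
class Buffer:
    def __init__(self, tokens):
        self.tokens = list(tokens)

    def pick(self, index):
        """Fetch a token without removing it: the same token may anchor
        more than one predicate, and the order of picks follows the
        logical form being built, not the sentence."""
        return self.tokens[index]

buf = Buffer("which states do not border texas".split())
# predicates are generated in derivation order, not string order:
print(buf.pick(1), buf.pick(4), buf.pick(5))   # states border texas
```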
