Learning Structured Natural Language Representations for Semantic Parsing

Jianpeng Cheng†, Siva Reddy†, Vijay Saraswat‡ and Mirella Lapata†
†School of Informatics, University of Edinburgh  ‡IBM T.J. Watson Research
{jianpeng.cheng,siva.reddy}@ed.ac.uk, [email protected], [email protected]

Abstract

We introduce a neural semantic parser which converts natural language utterances to intermediate representations in the form of predicate-argument structures, which are induced with a transition system and subsequently mapped to target domains. The semantic parser is trained end-to-end using annotated logical forms or their denotations. We achieve the state of the art on SPADES and GRAPHQUESTIONS and obtain competitive results on GEOQUERY and WEBQUESTIONS. The induced predicate-argument structures shed light on the types of representations useful for semantic parsing and how these differ from linguistically motivated ones.¹

¹Our code will be available at https://github.com/cheng6076/scanner.

1 Introduction

Semantic parsing is the task of mapping natural language utterances to machine-interpretable meaning representations. Despite differences in the choice of meaning representation and model structure, most existing work conceptualizes semantic parsing following two main approaches. Under the first approach, an utterance is parsed and grounded to a meaning representation directly via learning a task-specific grammar (Zelle and Mooney, 1996; Zettlemoyer and Collins, 2005; Wong and Mooney, 2006; Kwiatkowksi et al., 2010; Liang et al., 2011; Berant et al., 2013; Flanigan et al., 2014; Pasupat and Liang, 2015; Groschwitz et al., 2015). Under the second approach, the utterance is first parsed to an intermediate task-independent representation tied to a syntactic parser and then mapped to a grounded representation (Kwiatkowski et al., 2013; Reddy et al., 2014, 2016; Krishnamurthy and Mitchell, 2015; Gardner and Krishnamurthy, 2017). A merit of the two-stage approach is that it creates reusable intermediate interpretations, which potentially enables the handling of unseen words and knowledge transfer across domains (Bender et al., 2015).

The successful application of encoder-decoder models (Bahdanau et al., 2015; Sutskever et al., 2014) to a variety of NLP tasks has provided strong impetus to treat semantic parsing as a sequence transduction problem where an utterance is mapped to a target meaning representation in string format (Dong and Lapata, 2016; Jia and Liang, 2016; Kočiský et al., 2016). Such models still fall under the first approach; however, in contrast to previous work (Zelle and Mooney, 1996; Zettlemoyer and Collins, 2005; Liang et al., 2011) they reduce the need for domain-specific assumptions, grammar learning, and more generally extensive feature engineering. But this modeling flexibility comes at a cost, since it is no longer possible to interpret how meaning composition is performed. Such knowledge plays a critical role in understanding modeling limitations so as to build better semantic parsers. Moreover, without any task-specific prior knowledge, the learning problem is fairly unconstrained, both in terms of the possible derivations to consider and in terms of the target output, which can be ill-formed (e.g., with extra or missing brackets).

In this work, we propose a neural semantic parser that alleviates the aforementioned problems. Our model falls under the second class of approaches, where utterances are first mapped to an intermediate representation containing natural language predicates. However, rather than using an external parser (Reddy et al., 2014, 2016) or manually specified CCG grammars (Kwiatkowski et al., 2013), we induce intermediate representations in the form of predicate-argument structures from data. This is achieved with a transition-based approach which by design yields recursive semantic structures, avoiding the problem of generating ill-formed meaning representations. Compared to existing chart-based semantic parsers (Krishnamurthy and Mitchell, 2012; Cai and Yates, 2013; Berant et al., 2013; Berant and Liang, 2014), the transition-based approach does not require feature decomposition over structures and thereby enables the exploration of rich, non-local features. The output of the transition system is then grounded (e.g., to a knowledge base) with a neural mapping model, under the assumption that grounded and ungrounded structures are isomorphic.² As a result, we obtain a neural network that jointly learns to parse natural language semantics and induce a lexicon that helps grounding.

²We discuss the merits and limitations of this assumption in Section 5.

The whole network is trained end-to-end on natural language utterances paired with annotated logical forms or their denotations. We conduct experiments on four datasets, including GEOQUERY (which has logical forms; Zelle and Mooney 1996), SPADES (Bisk et al., 2016), WEBQUESTIONS (Berant et al., 2013), and GRAPHQUESTIONS (Su et al., 2016) (which have denotations). Our semantic parser achieves the state of the art on SPADES and GRAPHQUESTIONS, while obtaining competitive results on GEOQUERY and WEBQUESTIONS. A side-product of our modeling framework is that the induced intermediate representations can contribute to rationalizing neural predictions (Lei et al., 2016). Specifically, they can shed light on the kinds of representations (especially predicates) useful for semantic parsing. Evaluation of the induced predicate-argument relations against syntax-based ones reveals that they are interpretable and meaningful compared to heuristic baselines, but they sometimes deviate from linguistic conventions.

2 Preliminaries

Problem Formulation. Let K denote a knowledge base or, more generally, a reasoning system, and x an utterance paired with a grounded meaning representation G or its denotation y. Our problem is to learn a semantic parser that maps x to G via an intermediate ungrounded representation U. When G is executed against K, it outputs denotation y.
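To make the formulation concrete, here is a minimal type-level sketch of the x → U → G → y pipeline. All names below (semantic_parse, parse, ground, execute) are our own illustrative choices, not the paper's code.

```python
# A schematic sketch of the two-stage pipeline, assuming hypothetical
# components; none of these names come from the paper's implementation.
from typing import Any, Callable, Set

Utterance = str
Ungrounded = Any   # U: predicate-argument structure over natural language predicates
Grounded = Any     # G: e.g., a FunQL expression over knowledge-base predicates
Denotation = Set[str]

def semantic_parse(
    x: Utterance,
    parse: Callable[[Utterance], Ungrounded],    # learned transition system
    ground: Callable[[Ungrounded], Grounded],    # learned neural mapping model
    execute: Callable[[Grounded], Denotation],   # deterministic executor over K
) -> Denotation:
    u = parse(x)       # x -> U: induce the ungrounded predicate-argument structure
    g = ground(u)      # U -> G: ground natural language predicates to KB predicates
    return execute(g)  # G against K -> y: executing G yields the denotation
```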
Grounded Meaning Representation. We represent grounded meaning representations in FunQL (Kate et al., 2005), amongst many other alternatives such as lambda calculus (Zettlemoyer and Collins, 2005), λ-DCS (Liang, 2013) or graph queries (Holzschuher and Peinl, 2013; Harris et al., 2013). FunQL is a variable-free query language, where each predicate is treated as a function symbol that modifies an argument list. For example, the FunQL representation for the utterance which states do not border texas is:

    answer(exclude(state(all), next_to(texas)))

where next_to is a domain-specific binary predicate that takes one argument (i.e., the entity texas) and returns a set of entities (e.g., the states bordering Texas) as its denotation. all is a special predicate that returns a collection of entities. exclude is a predicate that returns the difference between two input sets.

An advantage of FunQL is that the resulting s-expression encodes the semantic compositionality and derivation of the logical forms. This property makes FunQL logical forms natural to generate with recurrent neural networks (Vinyals et al., 2015; Choe and Charniak, 2016; Dyer et al., 2016). However, FunQL is less expressive than lambda calculus, partially due to the elimination of variables. A more compact logical formulation to which our method also applies is λ-DCS (Liang, 2013). In the absence of anaphora and composite binary predicates, conversion algorithms exist between FunQL and λ-DCS; however, we leave this to future work.
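To illustrate FunQL's function-symbol semantics, the following is a minimal executor sketch for the example above, run against a toy knowledge base of our own invention; the real GeoQuery domain and the paper's executor will differ.

```python
# A minimal sketch of executing a grounded FunQL expression against a toy
# knowledge base; the KB contents below are invented for illustration.
TOY_KB = {
    "state": {"texas", "oklahoma", "new mexico", "maine"},
    "next_to": {"texas": {"oklahoma", "new mexico"}},  # entity -> bordering states
}

def execute(expr, kb):
    """Evaluate a FunQL expression given as a nested tuple, e.g.
    ("answer", ("exclude", ("state", ("all",)), ("next_to", "texas")))."""
    if isinstance(expr, str):               # a terminal entity such as "texas"
        return expr
    head, *args = expr
    if head == "all":                       # special predicate: the whole domain
        return set(kb["state"])
    vals = [execute(a, kb) for a in args]
    if head == "answer":                    # denotation wrapper: pass through
        return vals[0]
    if head == "state":                     # type checking: keep only states
        return {e for e in vals[0] if e in kb["state"]}
    if head == "exclude":                   # difference between two input sets
        return vals[0] - vals[1]
    if head == "next_to":                   # domain-specific binary predicate
        return kb["next_to"].get(vals[0], set())
    raise ValueError(f"unknown predicate: {head}")

# "which states do not border texas"
query = ("answer", ("exclude", ("state", ("all",)), ("next_to", "texas")))
print(execute(query, TOY_KB))   # {'texas', 'maine'}: Texas does not border itself
```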
Ungrounded Meaning Representation. We also use FunQL to express ungrounded meaning representations. The latter consist primarily of natural language predicates and domain-general predicates. Assuming for simplicity that domain-general predicates share the same vocabulary in ungrounded and grounded representations, the ungrounded representation for the example utterance is:

    answer(exclude(states(all), border(texas)))

where states and border are natural language predicates. In this work we consider five types of domain-general predicates, illustrated in Table 1. Notice that domain-general predicates are often implicit, or represent extra-sentential knowledge. For example, the predicate all in the above utterance represents all states in the domain, which are […]

    Predicate             Usage                                    Sub-categories
    answer                denotation wrapper                       —
    type                  entity type checking                     stateid, cityid, riverid, etc.
    all                   querying for an entire set of entities   —
    aggregation           one-argument meta predicates for sets    count, largest, smallest, etc.
    logical connectives   two-argument meta predicates for sets    intersect, union, exclude

    Table 1: List of domain-general predicates.

Consider again the ungrounded representation answer(exclude(states(all), border(texas))), which is tree structured. Each predicate (e.g., border) can be visualized as a non-terminal node of the tree and each entity (e.g., texas) as a terminal. The predicate all is a special case which acts as a terminal directly. We can generate the tree top-down with a transition system reminiscent of recurrent neural network grammars (RNNGs; Dyer et al. 2016). Similar to RNNG, our algorithm uses a buffer to store input tokens in the utterance and a stack to store partially completed trees. A major difference in our semantic parsing scenario is that tokens in the buffer are not fetched in a sequential order or removed from the buffer. This is because the lexical alignment between an utterance and its semantic representation […]
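As a rough illustration of the top-down generation just described, the sketch below replays an RNNG-style action sequence (NT opens a predicate subtree, TER attaches a terminal, RED closes the newest open subtree) to derive the example tree. It is our simplification: it omits the buffer and the neural network that chooses actions, both of which are central to the actual model.

```python
# A toy sketch of top-down tree construction with RNNG-style transitions;
# this replays a gold action sequence and omits the buffer and the learned
# action scorer used in the actual parser.
def build_tree(actions):
    stack = []                        # holds partially completed subtrees
    for act, arg in actions:
        if act == "NT":               # open a subtree rooted in predicate `arg`
            stack.append([arg])
        elif act == "TER":            # attach a terminal (an entity, or `all`)
            stack[-1].append(arg)
        elif act == "RED":            # close the newest subtree, attach to parent
            subtree = stack.pop()
            if not stack:
                return subtree        # the tree is complete
            stack[-1].append(subtree)
    raise ValueError("action sequence left the tree unfinished")

# Action sequence deriving answer(exclude(states(all), border(texas))):
actions = [
    ("NT", "answer"), ("NT", "exclude"),
    ("NT", "states"), ("TER", "all"), ("RED", None),
    ("NT", "border"), ("TER", "texas"), ("RED", None),
    ("RED", None), ("RED", None),
]
print(build_tree(actions))
# ['answer', ['exclude', ['states', 'all'], ['border', 'texas']]]
```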
