High-order Semantic Role Labeling

High-Order Semantic Role Labeling

Zuchao Li^{1,2,3}, Hai Zhao^{1,2,3,*}, Rui Wang^{4}, and Kevin Parnow^{1,2,3}

^{1}Department of Computer Science and Engineering, Shanghai Jiao Tong University (SJTU)
^{2}Key Laboratory of Shanghai Education Commission for Intelligent Interaction and Cognitive Engineering, Shanghai Jiao Tong University, Shanghai, China
^{3}MoE Key Lab of Artificial Intelligence, AI Institute, Shanghai Jiao Tong University, China
^{4}National Institute of Information and Communications Technology (NICT), Kyoto, Japan

[email protected], [email protected], [email protected], [email protected]

*Corresponding author. This paper was partially supported by National Key Research and Development Program of China (No. 2017YFB0304100), Key Projects of National Natural Science Foundation of China (U1836222 and 61733011), Huawei-SJTU Long Term AI Project, and Cutting-edge Machine Reading Comprehension and Language Model. Rui Wang was partially supported by JSPS grant-in-aid for early-career scientists (19K20354): "Unsupervised Neural Machine Translation in Universal Scenarios" and NICT tenure-track researcher startup fund "Toward Intelligent Machine Translation".

Abstract

Semantic role labeling is primarily used to identify predicates, arguments, and their semantic relationships. Due to the limitations of modeling methods and the conditions of pre-identified predicates, previous work has at most focused on the relationships between predicates and arguments and the correlations between arguments, while the correlations between predicates have long been neglected. High-order features and structure learning were very common in modeling such correlations before the neural network era. In this paper, we introduce a high-order graph structure for the neural semantic role labeling model, which enables the model to explicitly consider not only isolated predicate-argument pairs but also the interactions between predicate-argument pairs. Experimental results on the 7 languages of the CoNLL-2009 benchmark show that high-order structural learning techniques are beneficial to strong-performing SRL models and further boost our baseline to new state-of-the-art results.

1 Introduction

Linguistic parsing seeks the syntactic/semantic relationships between language units, such as words or spans (chunks, phrases, etc.). Parsing algorithms usually use factored representations of graphs to accomplish this target: a set of nodes and relational arcs. The types of features that the model can exploit during inference depend on the information included in the factorized parts.

Before the introduction of deep neural networks, several works in syntactic parsing (a kind of linguistic parsing) (McDonald and Pereira, 2006; Carreras, 2007; Koo and Collins, 2010; Zhang and McDonald, 2012; Ma and Zhao, 2012) showed that high-order parsers utilizing richer factorization information achieve higher accuracy than low-order ones, as the extensive decision history can lead to significant improvements in inference (Chen et al., 2010).

Semantic role labeling (SRL) (Gildea and Jurafsky, 2002; Zhao and Kit, 2008; Zhao et al., 2009b, 2013) captures the predicate-argument structure of a given sentence and is defined as a shallow semantic parsing task, which is also a typical linguistic parsing task. Recent high-performing SRL models (He et al., 2017; Marcheggiani et al., 2017; He et al., 2018a; Strubell et al., 2018; He et al., 2018b; Cai et al., 2018), whether they label the arguments of one predicate at a time with a sequence tagging model or classify candidate predicate-argument pairs, mainly belong to first-order parsers. High-order information is an overlooked potential performance enhancer; however, it suffers from enormous spatial complexity and expensive time cost in the inference stage. As a result, most previous algorithms for high-order syntactic dependency tree parsing are not directly applicable to neural parsing. In addition, the target of model optimization, the high-order relationship, is very sparse, so training with negative log-likelihood is not as convenient as it is for first-order structures, because efficient gradient backpropagation of parsing errors from the high-order parsing target is indispensable in neural parsing models.

To alleviate the computational and graphics memory challenges of explicit high-order modeling in the training and inference phases, we propose a novel high-order scorer and an approximate high-order decoding layer for the SRL parsing model. For the high-order scorer, we adopt a triaffine attention mechanism, extended from biaffine attention (Dozat and Manning, 2017), to score the second-order parts. To ensure that high-order errors backpropagate in the training stage and that fused first-order and highest-order part scores are available when searching for the highest-scoring parse during decoding, we apply recurrent layers, inspired by Lee et al. (2018) and Wang et al. (2019), to approximate high-order decoding iteratively and hence make it differentiable.
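To make the triaffine scorer concrete, the sketch below shows one plausible reading of such a component: a predicate representation and two arc-endpoint representations are contracted with a learned third-order tensor, directly generalizing the matrix form of biaffine attention. All module and parameter names here are our own illustrative choices rather than the authors' released code, and details such as bias terms and candidate pruning are omitted.

```python
import torch
import torch.nn as nn

class TriaffineScorer(nn.Module):
    """Sketch of a triaffine scorer for second-order parts.

    Biaffine attention scores a (head, dependent) pair with a matrix;
    this extends it to score a (predicate, argument, argument) triple
    with a third-order tensor W of shape (d, d, d).
    """

    def __init__(self, input_dim: int, hidden_dim: int):
        super().__init__()
        # Role-specific MLP projections, as is standard for (bi)affine scorers.
        self.mlp_p = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.mlp_a = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.mlp_b = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.weight = nn.Parameter(torch.randn(hidden_dim, hidden_dim, hidden_dim) * 0.01)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, input_dim) contextualized token encodings.
        p = self.mlp_p(h)  # predicate view,  (b, n, d)
        a = self.mlp_a(h)  # first-arc view,  (b, n, d)
        c = self.mlp_b(h)  # second-arc view, (b, n, d)
        # s[b, i, j, k] scores the second-order part formed by word i as
        # predicate and words j, k as the two interacting arc endpoints.
        s = torch.einsum('bid,def,bje,bkf->bijk', p, self.weight, a, c)
        return s  # (b, n, n, n): the O(L^3) score tensor for J = 2
```

Note that materializing the full (b, n, n, n) tensor is exactly the memory pressure discussed in Section 2, which is why such scorers are typically paired with pruning or approximate decoding.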
We conduct experiments on popular English and multilingual benchmarks. From the evaluation results on both the in-domain and out-of-domain test sets, we observe a statistically significant increase in semantic F1 score with the second-order enhancement and report new state-of-the-art performance on the test sets of all 7 languages except for the English out-of-domain test set. Additionally, we also evaluate the setting without pre-identified predicates and compare the effects of every high-order structure combination on all languages to explore how the high-order structures contribute and how their effect differs from language to language. Our analysis of the experimental results shows that explicit high-order structure learning yields steady improvements over our replicated strong BERT baseline in all scenarios.

[Figure 1: Left: the second-order parts (structures) considered in this paper — sibling (sib), co-parent (cop), and grandparent (gp) — where P stands for a predicate and A for an argument. Right: an example of semantic role labeling from the CoNLL-09 training dataset ("Bell, based in Los Angeles, makes and distributes electronic, computer and building products").]

2 High-order Structures in SRL

High-order features or structure learning is known to improve linguistic parser accuracy. In dependency parsing, high-order dependency features encode more complex sub-parts of a dependency tree structure than features based on first-order, bigram head-modifier relationships, and a clear trend has shown that the addition of such high-order features improves parse accuracy (McDonald and Pereira, 2006; Carreras, 2007; Koo and Collins, 2010; Zhang and McDonald, 2012; Ma and Zhao, 2012). We find that this addition can also benefit semantic parsing: a tree is a specific form of a graph, and the high-order properties that exist in a tree apply to the graphs in semantic parsing tasks as well.

For a long time, SRL has been formulated as a sequential tagging problem or a candidate pair (word pair) classification problem. In the sequential tagging pattern, only the arguments of one single predicate are labeled at a time, and a CRF layer is generally used to model the relationships between the arguments implicitly (Zhou and Xu, 2015). In the candidate pair classification pattern, He et al. (2018a) propose an end-to-end approach for jointly predicting all predicates, arguments, and their relationships. This pattern focuses on the first-order relationship between predicates and arguments and adopts dynamic programming decoding to enforce constraints on the arguments. From the perspective of existing SRL models, high-order information has long been ignored. Although current first-order neural parsers can encode high-order relationships implicitly through stacked self-attention layers, explicit modeling offers lower training cost and better stability than implicit modeling. The performance improvements brought by high-order features and structure learning in syntactic parsing suggest that the same benefits might be observed in SRL. Thus, this paper explores the integration and effect of high-order structure learning in the neural SRL model.

The trade-offs between rich high-order structures (features), decoding time complexity, and memory requirements need to be considered carefully, especially in current neural models. The work of Li et al. (2020) suggests that with the help of deep neural network design and training, exact decoding can be replaced with an approximate decoding algorithm, which significantly reduces decoding time complexity at a very small performance loss; however, using high-order structures unavoidably brings problematically high graphics memory demands due to the gradient-based learning methods in neural network models. Given an input sentence of length $L$ and a parsing model of order $J$, the required memory is $O(L^{J+1})$. Under current GPU memory conditions, second order ($J = 2$) is the upper limit that can be explored in practice without pruning.
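As a quick, illustrative back-of-the-envelope check of this $O(L^{J+1})$ bound (our own numbers, assuming one float32 score per part and ignoring labels, batching, and autograd buffers):

```python
# One float32 score per (J+1)-tuple of words in a length-L sentence.
def score_tensor_bytes(L: int, J: int, bytes_per_score: int = 4) -> int:
    return (L ** (J + 1)) * bytes_per_score

L = 100  # a long but realistic sentence
for J in (1, 2, 3):
    print(f"J={J}: {score_tensor_bytes(L, J) / 2**30:.4f} GiB per sentence")
# J=1: 0.0000 GiB  (L^2 first-order scores)
# J=2: 0.0037 GiB  (L^3 second-order scores; feasible, but multiplied by
#      batch size, part types, and activations saved for backpropagation)
# J=3: 0.3725 GiB  (L^4 third-order scores; prohibitive in practice)
```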
The predicate-argument structure is regarded as a general dependency relation, with the predicate as the head and the argument as the dependent (dep). Formally, we describe the task with a sequence $X = w_1, w_2, \ldots, w_n$; a set of unlabeled arcs $Y_{\mathrm{arc}} = W \times W$, where $\times$ is the Cartesian product; and a set of labeled predicate-argument relations $Y_{\mathrm{label}} = W \times W \times R$, which, together with the set of arcs, is the target to be predicted by the model. Here, $W = \{w_1, w_2, \ldots, w_n\}$ refers to the set of all words, and $R$ is the candidate semantic role set.
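As a minimal illustration of this formalization (a toy instantiation with hypothetical words and roles, not the paper's code), the candidate sets $Y_{\mathrm{arc}}$ and $Y_{\mathrm{label}}$ can be enumerated directly:

```python
from itertools import product

# Toy instantiation of the task formalization: W is the word set,
# R a tiny hypothetical candidate semantic role set.
sentence = ["Bell", "makes", "products"]
W = list(range(len(sentence)))   # word positions w_1..w_n
R = ["A0", "A1", "AM-LOC"]       # illustrative roles only

# Y_arc = W x W: every (predicate, argument) position pair is a candidate.
Y_arc = list(product(W, W))
# Y_label = W x W x R: each candidate arc paired with each candidate role.
Y_label = list(product(W, W, R))

print(len(Y_arc))    # n^2 = 9 unlabeled arc candidates
print(len(Y_label))  # n^2 * |R| = 27 labeled candidates
# A first-order parser scores these candidates independently; the
# second-order parts in this paper (sibling, co-parent, grandparent)
# additionally score interactions between pairs of such arcs.
```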
