The Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)

Learning Uniform Semantic Features for Natural Language and Programming Language Globally, Locally and Sequentially

Yudong Zhang,1 Wenhao Zheng,1,3 Ming Li1,2
1National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, 210023, China
2Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing, 210023, China
3Search Product Center, WeChat Search Application Department, Tencent, China
[email protected], [email protected], [email protected]

Copyright © 2019, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

Abstract

Semantic feature learning for natural language and programming language is a preliminary step in addressing many software mining tasks. Many existing methods leverage lexical and syntactic information to learn features for textual data. However, such information is inadequate to represent the entire semantics of either a text sentence or a code snippet. This motivates us to propose a new approach that learns semantic features for both languages by extracting three levels of information, namely global, local and sequential information, from textual data. For tasks involving both modalities, we project the data of both types into a uniform feature space so that the complementary knowledge between them can be utilized in their representations. In this paper, we build a novel and general-purpose feature learning framework called UniEmbed to uniformly learn comprehensive semantic representations for both natural language and programming language. Experimental results on three real-world software mining tasks show that UniEmbed outperforms state-of-the-art models in feature learning and demonstrate the capacity and effectiveness of our model.

    // Global-information pair
    public static int array_top1(int[] array) {
        Arrays.sort(array);
        return array[array.length - 1];
    }
    public static int find_zero(int[] array) {
        int index = array.length - 1;
        while (index >= 0 && array[index] != 0) {
            index--;
        }
        return index;
    }

    // Local-information pair
    public static int[][] loop1(int[][] array, int col) {
        for (int row = 0; row < 10; ++row) {
            array[row][col] = row * 10 + col;
        }
        return array;
    }
    public static int[][] loop2(int[][] array, int col) {
        for (int row = 0; row < 10; ++row) {
            array[row][col] = col * 10 + row;
        }
        return array;
    }

    // Sequential-information pair
    public static int[] sort1(int[] array) {
        array[4] *= 10;
        Arrays.sort(array);
        return array;
    }
    public static int[] sort2(int[] array) {
        Arrays.sort(array);
        array[4] *= 10;
        return array;
    }

Figure 1: Examples of Java code snippet pairs differing in each level of information

Introduction

In recent years, semantic representation learning (Bengio, Courville, and Vincent 2013) for languages has gained much attention and has been widely applied to many software mining tasks. Among those tasks, natural language and programming language are two common data types that require feature extraction as a preliminary step so that their representations can be processed by specific models. The extracted features have been used to address tasks such as bug localization (Huo, Li, and Zhou 2016; Huo and Li 2017; Hoang et al. 2018), code clone detection (White et al. 2016; Wei and Li 2017; 2018), code summarization (Allamanis, Peng, and Sutton 2016), etc. However, one main factor that determines the capacity of a representation lies in the information exploited from the data: partial semantic information can result in features of poor quality and thus prevent models from improving performance. Therefore, finding a way to learn comprehensive semantic features for natural language and programming language has become a vital problem to resolve.

As we observe, natural language and programming language, despite their different modalities, uniformly share three levels of semantic information, namely global, local and sequential information. The three levels depict the semantic meaning of textual content from different aspects. To give a clean explanation below, we provide detailed instances and discussions that separately illustrate each level of information in both languages.

In natural language, global information refers to the aggregation of every term's meaning within the sentence as a whole. All terms contribute to the sentence's semantics, and replacing any term is likely to shift the global information. For example, the sentences "write a program in Python" and "debug the Java program" discuss disparate things because their lexicons differ. Local information is carried by a small group of words standing together as a conceptual unit, like the phrase "Depth First Search": only by considering those terms as a unit rather than as separate words can we understand it. Sequential information is contained in the logical order of the terms forming the sentence. For instance, changing the order of terms in "apply quick sort to the array" to "quick to apply array the sort" alters the semantic meaning and makes the sentence incomprehensible.

Similar to natural language, programming language also contains three levels of information; its semantics relates to the functionality of the code snippets. As the examples in Figure 1 show, functions array_top1 and find_zero are manifestly different in the variables and terms within their bodies. This causes a disparity in their global information and in turn leads to a difference in functionality. In the local-information example, function loop2 changes the substructure of the for-loop in loop1 while keeping all the terms and the organization of statements (global and sequential information) unchanged; nevertheless, the behaviors of the two functions differ due to the nuance in local information. In the sequential-information example, two statements have reversed positions in sort2 compared with sort1. As a result, the dependencies between statements (sequential information) change and the functionality mutates, despite the fact that the terms and substructures (global and local information) are nearly the same in the two functions.

According to the discussion above, we observe that (1) both natural language and programming language contain three levels of information, i.e. global, local and sequential information, all of which are indispensable for understanding the semantic meaning; and (2) the three levels of information are complementary rather than redundant, as changing one level while keeping the other two can still cause a semantic shift.

Many previous works learn representations for both languages in software mining tasks by extracting semantic information. (Huo and Li 2017; Huo, Li, and Zhou 2016) model bug reports and source code by leveraging local and sequential information in their representations to address the bug localization task, (Wei and Li 2017) learns hash codes as code snippets' representations by exploiting global and sequential information, while (Allamanis, Peng, and Sutton 2016) generates features based only on local information to encode code snippets. However, those existing models fail to take all of the aforementioned three levels of information into account, so their learned representations cover only partial semantics.

For software mining tasks involving both natural language and programming language, such as bug localization and code annotation, it is prevalent to treat the two modalities differently by learning representations in separate feature spaces among

• We are the first to leverage three levels of semantic information, i.e. global, local and sequential information, in learning features for textual data. The learned representations comprehensively cover the semantics of both natural language and programming language. To the best of our knowledge, no previous work adopts a similar strategy in its feature learning method or achieves comparable performance to our framework.

• We propose a novel representation learning framework, UniEmbed, which learns uniform features for natural language and programming language when both modalities are involved in the same task. Such representations are capable and effective in helping address a set of real-world software mining tasks.

Related Work

Most existing research involving feature learning in software mining tasks has focused on various methods to extract semantic information from textual data. (Lukins, Kraft, and Etzkorn 2010) considers each word in bug reports with certain probabilities and applies a Latent Dirichlet Allocation (LDA) model to locate buggy files, and (Gay et al. 2009) represents bug reports and source code files as feature vectors based on concept localization using a Vector Space Model (VSM). More recently, besides global information (lexicon), local information (structure) within textual content has also been taken into consideration. (Allamanis, Peng, and Sutton 2016) parses code token groups to extract local information with a Convolutional Attention Model, and (White et al. 2016; Wei and Li 2017) exploit both lexicons and syntactic structures in learning features for source code. (Huo and Li 2017) further considers long-term dependencies in code snippets and thus incorporates sequential information into the representation. However, all these models leverage only one or two types among global, local and sequential information, which may result in incomplete semantics in learned features.
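The divergence described for the sequential-information pair can be checked directly: sort1 and sort2 contain exactly the same statements, yet reversing their order changes the computed result. The sketch below re-runs the pair on a sample input; the int[] return types, the java.util.Arrays.sort call, and the driver class are our assumptions where the figure leaves them implicit.

```java
import java.util.Arrays;

// Demonstrates the "sequential information" pair from Figure 1:
// identical statements, reversed order, different functionality.
public class SequentialInfoDemo {
    // Scale the element at index 4 first, then sort.
    public static int[] sort1(int[] array) {
        array[4] *= 10;
        Arrays.sort(array);
        return array;
    }

    // Sort first, then scale whichever element ends up at index 4.
    public static int[] sort2(int[] array) {
        Arrays.sort(array);
        array[4] *= 10;
        return array;
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 9, 1, 7, 2};
        System.out.println(Arrays.toString(sort1(a.clone()))); // [1, 2, 3, 5, 9, 70]
        System.out.println(Arrays.toString(sort2(a.clone()))); // [1, 2, 3, 5, 70, 9]
    }
}
```

Because the two orderings disagree on which element gets scaled, purely global features (the bag of terms) and purely local features (the statement-level substructures) cannot tell the functions apart; only sequential information can.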
