
Methods for Numeracy-Preserving Word Embeddings

Dhanasekar Sundararaman1, Shijing Si1, Vivek Subramanian1, Guoyin Wang2, Devamanyu Hazarika3, Lawrence Carin1
1 Duke University   2 Amazon Alexa AI   3 National University of Singapore
[email protected]

Abstract

Word embedding models are typically able to capture the semantics of words via the distributional hypothesis, but fail to capture the numerical properties of numbers that appear in a text. This leads to problems with numerical reasoning involving tasks such as question answering. We propose a new methodology to assign and learn embeddings for numbers. Our approach creates Deterministic, Independent-of-Corpus Embeddings (referred to as DICE) for numbers, such that their cosine similarity reflects the actual distance on the number line. DICE outperforms a wide range of pre-trained word embedding models across multiple examples of two tasks: (i) evaluating the ability to capture numeration and magnitude; and (ii) performing list maximum, decoding, and addition. We further explore the utility of these embeddings in downstream applications by initializing numbers with our approach for the task of magnitude prediction. We also introduce a regularization approach to learn model-based embeddings of numbers in a contextual setting.

1 Introduction

Word embeddings capture semantic relationships between words by operationalizing the distributional hypothesis (Harris, 1954; Firth, 1957). They can be learned either non-contextually (Mikolov et al., 2013b; Pennington et al., 2014; Bojanowski et al., 2017) or contextually (Devlin et al., 2018; Peters et al., 2018). Non-contextual embeddings have worked well on various language understanding and semantic tasks (Rumelhart et al., 1988; Mikolov et al., 2013a,b). More recently, they have also been used as pre-trained word embeddings to aid more sophisticated contextual models for solving rigorous natural language processing (NLP) problems, including translation, paraphrasing, and sentence-similarity tasks (Kiros et al., 2015; Wieting et al., 2015).

While word embeddings effectively capture semantic relationships between words, they are less effective at capturing the numeric properties associated with numbers. Though numbers represent a significant percentage of tokens in a corpus, they are often overlooked. In non-contextual word embedding models, they are treated like any other word, which leads to misinterpretation. For instance, they exhibit unintuitive similarities with other words and do not contain strong prior information about the magnitude of the number they encode. In sentence similarity and reasoning tasks, failure to handle numbers causes as much as 29% of contradictions (De Marneffe et al., 2008). In other data-intensive tasks where numbers are abundant, like neural machine translation, they are masked to hide the translation model's inefficiency in dealing with them (Mitchell and Lapata, 2009).

There are a variety of tests proposed to measure the efficiency of number embeddings. For instance, Naik et al. (2019) show that GloVe (Pennington et al., 2014), word2vec (Mikolov et al., 2013b), and fastText (Joulin et al., 2016; Bojanowski et al., 2017) fail to capture the numeration and magnitude properties of a number. Numeration is the property of associating numbers with their corresponding word representations ("3" and "three"), while magnitude represents a number's actual value (3 < 4). Further, Wallace et al. (2019) propose several tests for analyzing the numerical reasoning of number embeddings, which include list maximum, decoding, and addition.
In this paper, we experimentally demonstrate that if the cosine similarity between word embeddings of two numbers reflects their actual distance on the number line, the resultant word embeddings are useful in downstream tasks. We first demonstrate how Deterministic, Independent-of-Corpus Embeddings (DICE) can be constructed such that they almost perfectly capture properties of numeration and magnitude. These non-contextual embeddings also perform well on related tests for numeracy (Wallace et al., 2019).

To demonstrate the efficacy of DICE for downstream tasks, we explore its utility in two experiments. First, we design a DICE embedding initialized Bi-LSTM network to classify the magnitude of masked numbers in the 600K dataset (Chen et al., 2019). Second, given the popularity of modern contextual model-based embeddings, we devise a regularization procedure that emulates the hypothesis proposed by DICE and can be employed in any task-based fine-tuning process. We demonstrate that adding such regularization helps the model internalize notions of numeracy while learning task-based contextual embeddings for the numbers present in the text. We find promising results in a numerical reasoning task that involves numerical question answering based on a sub-split of the popular SQuAD dataset (Rajpurkar et al., 2016).
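To make the regularization idea concrete before its formal description, the following is a minimal sketch of what a DICE-emulating auxiliary loss could look like during task-based fine-tuning. It is illustrative rather than the authors' implementation: the function name dice_regularizer, the batch-wise distance normalization, and the mean-squared-error penalty are all assumptions made for this example.

    # Illustrative sketch only (not the paper's code): a DICE-style auxiliary loss
    # that can be added to any fine-tuning objective.  `number_embs` holds the
    # contextual embeddings of the numbers found in a batch, `values` their numeric values.
    import math
    import torch
    import torch.nn.functional as F

    def dice_regularizer(number_embs: torch.Tensor, values: torch.Tensor) -> torch.Tensor:
        # Target similarity: map absolute numeric distance linearly onto [0, pi]
        # (as in the DICE construction) and take its cosine.
        dist = torch.abs(values.unsqueeze(0) - values.unsqueeze(1))   # (N, N) pairwise distances
        span = dist.max().clamp_min(1e-8)                             # plays the role of |a - b|
        target_sim = torch.cos(dist / span * math.pi)                 # values in [-1, 1]

        # Cosine similarity between the model's current embeddings of the same numbers.
        normed = F.normalize(number_embs, dim=-1)
        model_sim = normed @ normed.t()

        # Penalize disagreement between embedding similarity and numeric proximity.
        return F.mse_loss(model_sim, target_sim)

    # Inside a fine-tuning step (illustrative):
    #   loss = task_loss + reg_weight * dice_regularizer(number_embs, values)

The weight on such an auxiliary term would control how strongly numeracy is enforced relative to the main task loss.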
bers in a variety of tasks, including numera- (2019), where embedding of a number is learned tion, magnitude, list maximum, decoding, and as a simple weighted average of its prototype addition. embeddings. Such embeddings are used in tasks • We also demonstrate that properties of DICE like word similarity, sequence labeling and have can be adapted to contextual models, like been proven to be effective. BERT (Devlin et al., 2018), through a novel regularization technique for solving tasks in- 3 Methods volving numerical reasoning. To overcome NLP models inefficiency in dealing 2 Related Work with numbers, we consider our method DICE to form embeddings. To begin, we embed numerals D The major research lines in this area have been and word forms of numbers as vectors ei 2 R , dedicated to (i) devising probing tests and curating where i indexes numerals identified within a cor- resources to evaluate the numerical reasoning abili- pus. We first preprocess by parsing the corpora ties of pre-trained embeddings, and (ii) proposing associated with each of our tasks (described be- new models that learn these properties. low) for numbers in numeral and word forms to Naik et al.(2019) surveyed a number of non- populate a number vocabulary. Then, the dimen- contextual word embedding models and high- sionality of the embeddings required for that task lighted the failure of those models in capturing is fixed. We explicitly associate the embeddings of two essential properties of numbers – numeration a numeral and word forms of numbers to have the and magnitude. Chen et al.(2019) created a novel same embedding. (a) DICE-2 embedding (b) DICE-D embedding (c) Addition of DICE vectors Figure 1: Proposed DICE embeddings. Vectors are colored according to numeral magnitude. Note that addition of two numbers in this embedding is performed by a shift, scaling, and rotation. Scaling depends only on the vector being added, as illustrated in sub-figure (c) in which the two black lines, corresponding to identical ej, have the same length. 3.1 DICE embeddings in the same direction. When x ? y, the cosine dis- In designing embeddings that capture the aforemen- tance is 1; and when x and y are antiparallel, cosine tioned properties of numeration and magnitude, we distance is 2. consider a deterministic, handcrafted approach (de- We seek a mapping (x; y) 7! (x; y) such that de picted in Figures 1a and 1b). This method relies on monotonically increases as dn increases. We first the fact that tests for both numeration and magni- bound the range of numbers for which we wish to tude are concerned with the correspondence in simi- compute embeddings by [a; b] ⊂ R and, without larity between numbers in token space and numbers loss of generality, restrict x and y to be of unit in embedding space. In token space, two numbers length (i:e:, jjxjj2 = jjyjj2 = 1). Since the cosine function decreases monotonically between 0 and x; y 2 R, in numeral or word form (with the latter being mapped to its corresponding numeral form π, we can simply employ a linear mapping to map for comparison), can be compared using absolute distances sn 2 [0; ja − bj] to angles θ 2 [0; π]: difference, i:e:: s θ(s ) = n π (4) n ja − bj dn(x; y) = jx − yj (1) This mapping achieves the desired direct relation- The absolute value ensures that two numbers are ship between sn and de.