Neural Embeddings of Graphs in Hyperbolic Space

Benjamin Paul Chamberlain, Department of Computing, Imperial College London ([email protected])
James R Clough, Centre for Complexity Science, Imperial College London ([email protected])
Marc Peter Deisenroth, Department of Computing, Imperial College London ([email protected])

arXiv:1705.10359v1 [stat.ML] 29 May 2017

ABSTRACT
Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-structured data, where embeddings of vertices can be learned that encapsulate vertex similarity and improve performance on tasks including edge prediction and vertex labelling. For both NLP and graph-based tasks, embeddings have been learned in high-dimensional Euclidean spaces. However, recent work has shown that the appropriate isometric space for embedding complex networks is not the flat Euclidean space, but negatively curved, hyperbolic space. We present a new concept that exploits these recent insights and propose learning neural embeddings of graphs in hyperbolic space. We provide experimental evidence that embedding graphs in their natural geometry significantly improves performance on downstream tasks for several real-world public datasets.

KEYWORDS
neural networks, graph embeddings, complex networks, geometry

1 INTRODUCTION
Embedding (or vector space) methods find a lower-dimensional continuous space in which to represent high-dimensional complex data [4, 26]. The distance between objects in the lower-dimensional space gives a measure of their similarity. This is usually achieved by first postulating a low-dimensional vector space and then optimising an objective function of the vectors in that space. Vector space representations provide three principal benefits over sparse schemes: (1) they encapsulate similarity, (2) they are compact, and (3) they perform better as inputs to machine learning models [27]. This is true of graph-structured data, where the native data format is the adjacency matrix, a typically large, sparse matrix of connection weights.

Neural embedding models are a flavour of embedding scheme in which the vector space corresponds to a subset of the network weights, which are learned through backpropagation. Neural embedding models have been shown to improve performance in a large number of downstream tasks across multiple domains. These include word analogies [18, 20], machine translation [28], document comparison [16], missing edge prediction [12], vertex attribution [24], product recommendations [2, 10], customer value prediction [6, 14] and item categorisation [3]. In all cases the embeddings are learned without labels (unsupervised) from a sequence of entities.

To the best of our knowledge, all previous work on neural embedding models either explicitly or implicitly (by using the Euclidean dot product) assumes that the vector space is Euclidean. Recent work from the field of complex networks has found that many interesting networks, such as the Internet [5] or academic citations [7, 8], can be well described by a framework with an underlying non-Euclidean hyperbolic geometry. Hyperbolic geometry provides a continuous analogue of tree-like graphs, and even infinite trees have nearly isometric embeddings in hyperbolic space [11]. Additionally, the defining features of complex networks, such as power-law degree distributions, strong clustering and hierarchical community structure, emerge naturally when random graphs are embedded in hyperbolic space [15].

The starting point for our model is the celebrated word2vec Skipgram architecture, which is shown in Figure 3 [18, 19]. Skipgram is a shallow neural network with three layers: (1) an input projection layer that maps from a one-hot-encoded to a distributed representation, (2) a hidden layer, and (3) an output softmax layer. The network is necessarily simple for tractability, as there is a very large number of output states (every word in a language). Skipgram is trained on a sequence of words that is decomposed into (input word, context word)-pairs. The model employs two separate vector representations, one for the input words and another for the context words, with the input representation comprising the learned embedding. The word pairs are generated by taking a sequence of words and running a sliding window (the context) over them. As an example, the word sequence "chance favours the prepared mind" with a context window of size three would generate the following training data: (chance, favours), (chance, the), (favours, chance), ...
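To make the decomposition concrete, the following is a minimal Python sketch of the sliding-window pair generation. The function name and the exact windowing convention are our illustrative assumptions, not the paper's code; word2vec implementations count "window size" in slightly different ways, and we choose the convention that reproduces the pairs listed above.

```python
def skipgram_pairs(words, window=3):
    """Emit (input word, context word) pairs from a word sequence.

    Context words are those within `window - 1` positions on either
    side of the input word (an illustrative convention).
    """
    pairs = []
    for i, word in enumerate(words):
        for j in range(max(0, i - window + 1), min(len(words), i + window)):
            if j != i:  # a word is not its own context
                pairs.append((word, words[j]))
    return pairs

pairs = skipgram_pairs("chance favours the prepared mind".split())
print(pairs[:3])
# [('chance', 'favours'), ('chance', 'the'), ('favours', 'chance')]
```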
Words are initially randomly allocated to vectors within the two vector spaces. Then, for each training pair, the vector representations of the observed input and context words are pushed towards each other and away from all other words (see Figure 2).

The concept can be extended from words to network-structured data by using random walks to create sequences of vertices. The vertices are then treated exactly analogously to words in the NLP formulation. This was originally proposed as DeepWalk [24]. Extensions varying the nature of the random walks have been explored in LINE [29] and Node2vec [12].
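A minimal sketch of this random-walk construction, assuming uniform, fixed-length walks started from every vertex (as in DeepWalk; the helper names and toy graph are our illustrative assumptions, and node2vec would additionally bias the walk transitions):

```python
import random

def random_walk(adjacency, start, length=10):
    """Uniform random walk of `length` vertices; `adjacency` maps a
    vertex to the list of its neighbours."""
    walk = [start]
    for _ in range(length - 1):
        neighbours = adjacency[walk[-1]]
        if not neighbours:  # dead end: stop the walk early
            break
        walk.append(random.choice(neighbours))
    return walk

# Toy graph; each walk is then fed to Skipgram exactly as a sentence would be.
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
walks = [random_walk(graph, v, length=5) for v in graph]
print(walks)
```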
Figure 1b shows a collection of straight hyperbolic lines in Poincaré disc model of pass through a given point the Poincaré disk. Just as in spherical geometry, the shortest path hyperbolic space. Each and are all parallel to the from one place to another is a straight line, but appears as a curve tile is of constant area blue (thicker) line. on a at map. Similarly, these straight lines show the shortest path in hyperbolic space, but (in terms of distance in the underlying hyperbolic space) from one vanishes in Euclidean point on the disk to another, but they appear curved. This is because space at the boundary. it is quicker to move close to the centre of the disk, where distances are shorter, than nearer the edge. In our proposed approach, we Figure 1: Illustrations of properties of hyperbolic space. a will exploit both the conformal property and the circular symmetry Tiles of constant area b Parallel lines. of the Poincaré disk. Overall, the geometric intuition motivating our approach is that vertices embedded near the middle of the disk can have more close neighbours than they could in Euclidean space, whilst vertices just one, but an innite number of parallel lines that pass through nearer the edge of the disk can still be very far from each other. a single point. This is illustrated in Figure 1b where every line is parallel to the bold, blue line and all pass through the same 2.2 Inner Product, Angles, and Distances point. Hyperbolic space is one of only three types of isotropic spaces that can be dened entirely by their curvature. The most The mathematics is considerably simplied if we exploit the sym- familiar is Euclidean, which is at, having zero curvature. Space metries of the model and describe points in the Poincaré disk using with uniform positive curvature has an elliptic geometry (e.g. the polar coordinates, x = ¹re ;θº, with re 2 »0; 1º and θ 2 »0; 2πº. To surface of a sphere), and space with uniform negative curvature dene similarities and distances, we require an inner product. In is called hyperbolic, which is analogous to a saddle-like surface. the Poincaré disk, the inner product of two vectors x = ¹rx ;θx º and As, unlike Euclidean space, in hyperbolic space even innite trees y = ¹ry ;θy º is given by have nearly isometric embeddings, it has been successfully used to hx;yi = kx kkyk cos¹θx − θy º (1) model complex networks with hierarchical structure, power-law = 4 arctanhr arctanhr cos¹θ − θ º (2) degree distributions and high clustering [15]. x y x y One of the dening characteristics of hyperbolic space is that it The distance of x = ¹re ;θº from the origin of the hyperbolic co- is in some sense larger than the more familiar Euclidean space; the ordinate system is given by rh = 2 arctanhre and the circumference area of a circle or volume of a sphere grows exponentially with its ra- of a circle of hyperbolic radius R is C = 2π sinh R.

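These quantities are straightforward to compute. Below is a minimal Python sketch, assuming points are given in the polar coordinates $(r_e, \theta)$ defined above; the function names are our own, not from the paper.

```python
import math

def distance_from_origin(r_e):
    """Hyperbolic distance of a point (r_e, theta) from the origin:
    r_h = 2 arctanh(r_e)."""
    return 2.0 * math.atanh(r_e)

def inner_product(x, y):
    """Poincare-disk inner product of x = (r_x, theta_x), y = (r_y, theta_y):
    <x, y> = 4 arctanh(r_x) arctanh(r_y) cos(theta_x - theta_y), Eq. (2)."""
    (r_x, t_x), (r_y, t_y) = x, y
    return 4.0 * math.atanh(r_x) * math.atanh(r_y) * math.cos(t_x - t_y)

def circumference(R):
    """Circumference of a circle of hyperbolic radius R: C = 2*pi*sinh(R)."""
    return 2.0 * math.pi * math.sinh(R)

# Distances blow up towards the disk boundary: as r_e -> 1, r_h -> infinity.
for r_e in (0.5, 0.9, 0.99, 0.999):
    print(f"r_e = {r_e:5}  ->  r_h = {distance_from_origin(r_e):.3f}")
```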