
Self-Supervised Learning of Contextual Embeddings for Link Prediction in Heterogeneous Networks

Ping Wang¹, Khushbu Agarwal², Colby Ham², Sutanay Choudhury², Chandan K. Reddy¹
¹Department of Computer Science, Virginia Tech, Arlington, VA
²Pacific Northwest National Laboratory, Richland, WA
[email protected], {khushbu.agarwal,colby.ham,sutanay.choudhury}@pnnl.gov, [email protected]

ABSTRACT
Representation learning methods for heterogeneous networks produce a low-dimensional vector embedding (that is typically fixed for all tasks) for each node. Many of the existing methods focus on obtaining a static vector representation for a node in a way that is agnostic to the downstream application where it is being used. In practice, however, downstream tasks such as link prediction require specific contextual information that can be extracted from the subgraphs related to the nodes provided as input to the task. To tackle this challenge, we develop SLiCE, a framework for bridging static representation learning methods using global information from the entire graph with localized attention driven mechanisms to learn contextual node representations. We first pre-train our model in a self-supervised manner by introducing higher-order semantic associations and masking nodes, and then fine-tune our model for a specific link prediction task. Instead of training node representations by aggregating information from all semantic neighbors connected via metapaths, we automatically learn the composition of different metapaths that characterize the context for a specific task without the need for any pre-defined metapaths. SLiCE significantly outperforms both static and contextual embedding learning methods on several publicly available benchmark network datasets. We also demonstrate the interpretability, effectiveness of contextual learning, and the scalability of SLiCE through extensive evaluation.

CCS CONCEPTS
• Mathematics of computing → Graph algorithms; • Computing methodologies → Learning latent representations; Neural networks.

KEYWORDS
Heterogeneous networks, network embedding, self-supervised learning, link prediction, semantic association

ACM Reference Format:
Ping Wang, Khushbu Agarwal, Colby Ham, Sutanay Choudhury, Chandan K. Reddy. 2021. Self-Supervised Learning of Contextual Embeddings for Link Prediction in Heterogeneous Networks. In Proceedings of the Web Conference 2021 (WWW '21), April 19–23, 2021, Ljubljana, Slovenia. ACM, New York, NY, USA, 12 pages. https://doi.org/10.1145/3442381.3450060

This paper is published under the Creative Commons Attribution 4.0 International (CC-BY 4.0) license. Authors reserve their rights to disseminate the work on their personal and corporate Web sites with the appropriate attribution.
WWW '21, April 19–23, 2021, Ljubljana, Slovenia
© 2021 IW3C2 (International World Wide Web Conference Committee), published under Creative Commons CC-BY 4.0 License.
ACM ISBN 978-1-4503-8312-7/21/04. https://doi.org/10.1145/3442381.3450060

1 INTRODUCTION
The topic of representation learning for heterogeneous networks has gained a lot of attention in recent years [1, 5, 10, 29, 33, 35], where a low-dimensional vector representation of each node in the graph is used for downstream applications such as link prediction [1, 5, 37] or multi-hop reasoning [8, 13, 40]. Many of the existing methods focus on obtaining a static vector representation per node that is agnostic to any specific context and is typically obtained by learning the importance of all of the node's immediate and multi-hop neighbors in the graph. However, we argue that nodes in a heterogeneous network exhibit different behaviors based on different relation types and their participation in diverse network communities. Further, most downstream tasks such as link prediction depend on the specific contextual information related to the input nodes, which can be extracted in the form of task-specific subgraphs.

Incorporation of contextual learning has led to major breakthroughs in the natural language processing community [9, 24], in which the same word is associated with different concepts depending on the context of the surrounding words. A similar phenomenon can be exploited in graph-structured data, and it becomes particularly pronounced in heterogeneous networks, where the addition of relation types as well as node and relation attributes leads to increased diversity in a node's contexts. Figure 1 provides an illustration of this problem for an academic network. Given two authors who publish in diverse communities, we posit that the task of predicting the link (Author₁, co-author, Author₂) would perform better if their node representations were reflective of the common publication topics and venues, i.e., Representation Learning and NeurIPS. This is in contrast to existing methods, where author embeddings would reflect information aggregated from all of their publications, including the publications in healthcare and climate science which are not part of the common context.

Figure 1: Subgraph driven contextual learning in an academic network. (a) Author nodes publish on diverse topics (participate in diverse contexts). (b) State-of-the-art methods aggregate global semantics for authors based on all published papers. (c) Our approach uses the context subgraph between authors to contextualize their node embeddings during link prediction.
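To make the notion of a task-specific context subgraph concrete, the following is a minimal sketch of one plausible extraction strategy (taking the union of short paths between the two query nodes), written with networkx. The toy academic graph, the path-union heuristic, and the context_subgraph helper are our own illustration under those assumptions, not the paper's actual subgraph-generation procedure.

```python
# A minimal sketch of extracting a context subgraph for a candidate link.
# The path-union strategy below is one plausible instantiation; the paper's
# own subgraph generation may differ.
import networkx as nx

def context_subgraph(G: nx.Graph, u, v, cutoff: int = 4) -> nx.Graph:
    """Return the subgraph induced by all simple paths of length <= cutoff
    between the two endpoints (u, v) of a candidate link."""
    nodes = {u, v}
    for path in nx.all_simple_paths(G, source=u, target=v, cutoff=cutoff):
        nodes.update(path)
    return G.subgraph(nodes).copy()

# Toy academic network: authors, papers, venues, with relation types on edges.
G = nx.Graph()
G.add_edge("Author1", "PaperA", relation="writes")
G.add_edge("Author2", "PaperB", relation="writes")
G.add_edge("PaperA", "NeurIPS", relation="published_at")
G.add_edge("PaperB", "NeurIPS", relation="published_at")
G.add_edge("Author1", "PaperC", relation="writes")  # unrelated healthcare paper

g_c = context_subgraph(G, "Author1", "Author2")
print(sorted(g_c.nodes()))  # PaperC is excluded: it lies on no connecting path
```

Restricting the search depth keeps the context local: Author1's unrelated healthcare paper never enters g_c, mirroring the intuition of Figure 1(c).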
Contextual learning of node representations in network data has recently gained attention, with different notions of context emerging (see Table 1). In homogeneous networks, communities provide a natural definition of a node's participation in different contexts, referred to as facets or aspects [11, 19, 21, 32, 34]. Given a task such as link prediction, inferring the cluster-driven connectivity between the nodes has been the primary basis for these approaches. However, accounting for higher-order effects over diverse metapaths (defined as paths connected via heterogeneous relations) has been demonstrated to be essential for representation learning and link prediction in heterogeneous networks [5, 16, 33, 35]. Therefore, contextual learning methods that rely primarily on the well-defined notion of graph clustering will be limited in their effectiveness for heterogeneous networks, where modeling semantic association (via meta-paths or meta-graphs) is at least as important as community structure for link prediction.

In this paper, we seek to make a fundamental advancement over these categories, which aim to contextualize a node's representation with regards to either a cluster membership or an association with meta-paths or meta-graphs. We believe that the definition of a context needs to be expanded to subgraphs (comprising heterogeneous relations) that are task-specific, and we learn node representations that represent the collective heterogeneous context. With such a design, a node's embedding changes dynamically based on its participation in one input subgraph versus another. Our experiments indicate that this approach has strong merit for link prediction, improving performance by 10%-25% over many state-of-the-art approaches.

We propose shifting node representation learning from a node's perspective to a subgraph point of view. Instead of focusing on "what is the best representation for a node v", we seek to answer "what are the best collective node representations for a given subgraph g_c" and "how can such representations be useful in a downstream application?" Our proposed framework SLiCE (an acronym for Self-supervised LearnIng of Contextual Embeddings) accomplishes this by bridging static representation learning methods using global information from the entire graph with localized attention-driven mechanisms to learn contextual node representations in heterogeneous networks. While bridging global and local information is a common approach for many algorithms, the primary novelty of our work lies in learning the higher-order semantic association (HSA) between nodes in a context subgraph. We do not assume any prior knowledge about important metapaths, and SLiCE learns important task-specific subgraph structures during training (see Section 4.3). More specifically, we first develop a self-supervised learning approach that pre-trains a model to learn an HSA matrix on a context-by-context basis. We then fine-tune the model in a task-specific manner, where given a context subgraph g_c as input, we encode the subgraph with global features and then transform that initial representation via an HSA-based non-linear transformation to produce contextual embeddings (see Figure 2).
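As a rough sketch of this two-stage recipe, the code below pre-trains by masking one node per context subgraph and predicting it, then scores a candidate link from the contextualized embeddings. A stock transformer encoder layer stands in for the HSA-based transformation; the encode helper, the dimensions, and the dot-product link scorer are all hypothetical stand-ins rather than SLiCE's actual implementation.

```python
# A minimal sketch of the pre-train / fine-tune scheme described above,
# under stated assumptions; not the paper's actual architecture.
import torch
import torch.nn as nn

NUM_NODES, DIM, MASK_ID = 1000, 64, 0       # node id 0 reserved as [MASK]

embed = nn.Embedding(NUM_NODES, DIM)        # global (static) node features
encoder = nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True)
to_vocab = nn.Linear(DIM, NUM_NODES)        # masked-node prediction head

def encode(subgraph_nodes: torch.Tensor) -> torch.Tensor:
    """Contextualize the static embeddings of the nodes in each subgraph."""
    return encoder(embed(subgraph_nodes))

# --- Stage 1: self-supervised pre-training (mask a node, predict it) ---
opt = torch.optim.Adam([*embed.parameters(), *encoder.parameters(),
                        *to_vocab.parameters()], lr=1e-3)
nodes = torch.randint(1, NUM_NODES, (8, 6))  # batch of 8 context subgraphs
masked = nodes.clone()
masked[:, 2] = MASK_ID                       # hide one node per subgraph
logits = to_vocab(encode(masked)[:, 2])
loss = nn.functional.cross_entropy(logits, nodes[:, 2])
loss.backward(); opt.step(); opt.zero_grad()

# --- Stage 2: fine-tuning for link prediction with context subgraph g_c ---
h = encode(nodes)                            # contextual node embeddings
score = torch.sigmoid((h[:, 0] * h[:, -1]).sum(-1))  # u = first, v = last
```

In the full framework, the fine-tuning stage would optimize a link prediction loss over such scores; the dot-product scorer here only illustrates how the contextual embeddings of the two query nodes feed the downstream task.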
Our Contributions: The main contributions of our work are:
• Propose contextual embedding learning for graphs, generalizing the context from a single relation to arbitrary subgraphs.
• Introduce a novel self-supervised learning approach to learn higher-order semantic associations between nodes by simultaneously capturing the global and local factors that characterize a context subgraph.
• Show that SLiCE significantly outperforms existing static and contextual embedding learning methods using standard evaluation metrics for the task of link prediction.
• Demonstrate the interpretability, effectiveness of contextual translation, and the scalability of SLiCE through an extensive set of experiments and the contribution of a new benchmark dataset.

The rest of this paper is organized as follows. Section 2 provides an overview of related work about network embedding learning and