
Washington University in St. Louis
Washington University Open Scholarship
Engineering and Applied Science Theses & Dissertations
McKelvey School of Engineering

Winter 12-15-2019

Graph Deep Learning: Methods and Applications

Muhan Zhang, Washington University in St. Louis

Follow this and additional works at: https://openscholarship.wustl.edu/eng_etds
Part of the Artificial Intelligence and Robotics Commons

Recommended Citation
Zhang, Muhan, "Graph Deep Learning: Methods and Applications" (2019). Engineering and Applied Science Theses & Dissertations. 504.
https://openscholarship.wustl.edu/eng_etds/504

This Dissertation is brought to you for free and open access by the McKelvey School of Engineering at Washington University Open Scholarship. It has been accepted for inclusion in Engineering and Applied Science Theses & Dissertations by an authorized administrator of Washington University Open Scholarship. For more information, please contact [email protected].

WASHINGTON UNIVERSITY IN ST. LOUIS
School of Engineering & Applied Science
Department of Computer Science and Engineering

Dissertation Examination Committee:
Yixin Chen, Chair
Michael Avidan
Sanmay Das
Roman Garnett
Brendan Juba
Yinjie Tang

Graph Deep Learning: Methods and Applications
by
Muhan Zhang

A dissertation presented to The Graduate School of Washington University in partial fulfillment of the requirements for the degree of Doctor of Philosophy

December 2019
St. Louis, Missouri

© 2019, Muhan Zhang

Table of Contents

List of Figures .......................................................... v
List of Tables .......................................................... viii
Acknowledgments ........................................................ x
Abstract ................................................................ xiii

Chapter 1: Introduction ................................................ 1
  1.1 Graph Deep Learning .............................................. 1
  1.2 A Brief History of Graph Neural Networks ....................... 4
  1.3 Graph Neural Networks Basics ................................... 6
  1.4 A Categorization of Graph Neural Networks ..................... 8
    1.4.1 GNNs for node-level tasks .................................. 9
    1.4.2 GNNs for graph-level tasks ................................. 18
    1.4.3 GNNs for edge-level tasks .................................. 19

Chapter 2: Graph Neural Networks for Graph Representation Learning .. 22
  2.1 Graph Neural Networks for Graph Classification ................ 23
    2.1.1 Traditional graph classification methods: graph kernels .. 23
    2.1.2 Limitations of existing GNNs for graph classification .... 25
    2.1.3 Deep Graph Convolutional Neural Network (DGCNN) .......... 26
    2.1.4 Training through backpropagation .......................... 30
    2.1.5 Discussion ................................................. 31
    2.1.6 Experimental results ...................................... 34
    2.1.7 Conclusion ................................................. 41
  2.2 Graph Neural Networks for Medical Ontology Embedding ......... 42
    2.2.1 Introduction ............................................... 42
    2.2.2 Preliminaries .............................................. 44
    2.2.3 Methodology ................................................ 46
    2.2.4 Theoretical analysis ....................................... 50
    2.2.5 Experiments ................................................ 54
    2.2.6 Related work ............................................... 60
    2.2.7 Conclusion ................................................. 61

Chapter 3: Graph Neural Networks for Relation Prediction ............ 63
  3.1 Link Prediction Based on Graph Neural Networks ................ 64
    3.1.1 A brief review of link prediction methods ................. 64
    3.1.2 Limitations of existing methods ........................... 66
    3.1.3 A theory for unifying link prediction heuristics ......... 67
    3.1.4 SEAL: An implementation of the theory using GNN .......... 74
    3.1.5 Experimental results ...................................... 78
    3.1.6 Conclusion ................................................. 83
  3.2 Inductive Matrix Completion Based on Graph Neural Networks ... 83
    3.2.1 Introduction ............................................... 83
    3.2.2 Related work ............................................... 87
    3.2.3 Inductive Graph-based Matrix Completion (IGMC) ........... 88
    3.2.4 Experiments ................................................ 93
    3.2.5 Conclusion ................................................. 98

Chapter 4: Graph Neural Networks for Graph Structure Optimization ... 99
  4.1 Introduction .................................................... 100
  4.2 Related Work .................................................... 103
  4.3 DAG Variational Autoencoder (D-VAE) ........................... 104
    4.3.1 Encoding ................................................... 106
    4.3.2 Decoding ................................................... 112
    4.3.3 Model extensions ........................................... 114
    4.3.4 Encoding neural architectures ............................. 115
    4.3.5 Encoding Bayesian networks ................................ 117
    4.3.6 Advantages of encoding computations for DAG optimization . 118
  4.4 Experiments ..................................................... 119
    4.4.1 Reconstruction accuracy, prior validity, uniqueness and novelty . 121
    4.4.2 Predictive performance of latent representation .......... 122
    4.4.3 Bayesian optimization ..................................... 123
    4.4.4 Latent space visualization ................................ 126
  4.5 Conclusion ...................................................... 128

Chapter 5: Conclusions ............................................... 129

References ........................................................... 133

Appendix A ........................................................... 147
  A.1 Additional details about link prediction baselines ........... 147
  A.2 More related work on neural architecture search and Bayesian network structure learning . 149
  A.3 More details about neural architecture search ................ 151
  A.4 More details about Bayesian network structure learning ...... 153
  A.5 Baselines for D-VAE ............................................ 154
  A.6 VAE training details ........................................... 157
  A.7 More details of the prior validity experiment ................ 158
  A.8 SGP training details ........................................... 159
  A.9 The generated neural architectures ............................ 160
  A.10 The generated Bayesian networks .............................. 161

List of Figures

Figure 1.1: 2D convolution (left) vs. graph convolution (right). Graph convolution can be seen as generalizing 2D convolution on grids to arbitrary structures, where a node's local receptive field is no longer a fixed-size subgrid but is defined to be its one-hop neighboring nodes. Figure is from [166]. ...... 6

Figure 2.1: A consistent input ordering is crucial for CNNs' success on graph classification. If we randomly shuffle the pixels of the left image, state-of-the-art convolutional neural networks (CNNs) will fail to recognize it as an eagle. ...... 27

Figure 2.2: The overall structure of DGCNN. An input graph is first passed through multiple message passing layers, where node information is propagated between neighbors. The vertex features are then sorted and pooled by a SortPooling layer and passed to 1-D convolutional layers to learn a predictive model. ...... 29

Figure 2.3: Training curves of SortPooling