Signal Processing on Graphs
Santiago Segarra, Weiyu Huang, and Alejandro Ribeiro
Dept. of Electrical and Systems Engineering, University of Pennsylvania
[email protected]
http://www.seas.upenn.edu/users/~aribeiro/
April 15, 2019

Outline: Graph Signals, Graph Laplacian, Graph Fourier Transform (GFT), Ordering of frequencies, Inverse graph Fourier transform (iGFT), Graph Filters, Application: Gene Network, Information sciences at ESE

Graph Signals

The support of one-dimensional signals
- We have studied one-dimensional signals, image processing, and PCA
- It is time to understand them in a more unified way
- Consider the support of one-dimensional signals
- There is an underlying graph structure
  ⇒ Each node represents a discrete time instant (e.g., hours in a day)
  ⇒ Edges are unweighted and directed
[Figure: temperature on a spring day in Philadelphia (degrees F) versus time of day (hours 0 to 24), next to the underlying directed cycle graph 0 → 1 → 2 → ... → 23]

The support of images
- Similarly, images also have an underlying graph structure
- Each node represents a single pixel
- Edges denote neighborhoods of pixels
  ⇒ Unweighted and undirected
[Figure: a rectangular lattice of pixels with edges joining neighboring pixels]

PCA uses another underlying graph
- The previous underlying graph assumes a structure between pixels (neighbors in a lattice) a priori, before seeing the images
- PCA considers images as defined on a different graph
- Each node represents a single pixel
- Edges denote the covariance between pairs of pixels in the realizations
  ⇒ A posteriori, after seeing the images
  ⇒ Undirected and weighted, including self loops
[Figure: four pixels p1, p2, p3, p4 joined by edges weighted with the covariances Σ11, Σ12, ..., Σ44, including self loops]

Graphs
- Formally, a graph (or a network) is a triplet (V, E, W)
- V = {1, 2, ..., N} is a finite set of N nodes or vertices
- E ⊆ V × V is a set of edges defined as ordered pairs (n, m)
  ⇒ Write N(n) = {m ∈ V : (m, n) ∈ E} for the in-neighbors of n
- W : E → R is a map from the set of edges to scalar values w_nm
  ⇒ Represents the level of relationship from n to m
  ⇒ Unweighted graphs: w_nm ∈ {0, 1} for all (n, m) ∈ E
  ⇒ Undirected graphs: (n, m) ∈ E if and only if (m, n) ∈ E, and w_nm = w_mn for all (n, m) ∈ E; in-neighbors are then simply neighbors
  ⇒ More often, weights are strictly positive, W : E → R++

Graphs: examples
- Unweighted and directed graph (the cycle of hours 0 → 1 → ... → 23)
  ⇒ V = {0, 1, ..., 23}
  ⇒ E = {(0, 1), (1, 2), ..., (22, 23), (23, 0)}
  ⇒ W : (n, m) ↦ 1 for all (n, m) ∈ E
- Unweighted and undirected graph (the 3 × 3 pixel lattice with nodes 1, ..., 9)
  ⇒ V = {1, 2, 3, ..., 9}
  ⇒ E = {(1, 2), (2, 3), ..., (8, 9), (1, 4), ..., (6, 9)}
  ⇒ W : (n, m) ↦ 1 for all (n, m) ∈ E
- Weighted and undirected graph (the covariance graph on pixels p1, ..., p4)
  ⇒ V = {p1, p2, p3, p4}
  ⇒ E = {(p1, p1), (p1, p2), ..., (p4, p4)} = V × V
  ⇒ W : (n, m) ↦ Σ_nm = Σ_mn for all (n, m) ∈ E
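To make the triplet notation concrete, here is a minimal Python sketch (not part of the original slides) that encodes the three example graphs as vertex lists, edge sets, and weight maps. The covariance matrix Sigma used for the third graph is a made-up illustrative value.

```python
import numpy as np

# Unweighted, directed cycle over the 24 hours of a day: 0 -> 1 -> ... -> 23 -> 0.
V_cycle = list(range(24))
E_cycle = [(n, (n + 1) % 24) for n in V_cycle]
W_cycle = {e: 1.0 for e in E_cycle}              # every edge has weight 1

# Unweighted, undirected 3 x 3 pixel lattice with nodes 1, ..., 9.
V_grid = list(range(1, 10))
E_grid = set()
for n in V_grid:
    row, col = divmod(n - 1, 3)
    if col < 2:                                   # horizontal neighbor to the right
        E_grid |= {(n, n + 1), (n + 1, n)}        # undirected: keep both orientations
    if row < 2:                                   # vertical neighbor below
        E_grid |= {(n, n + 3), (n + 3, n)}
W_grid = {e: 1.0 for e in E_grid}

# Weighted, undirected covariance graph on pixels p1, ..., p4 (complete, with self loops).
# Sigma is an illustrative symmetric covariance matrix, not a value from the slides.
Sigma = np.array([[1.0, 0.5, 0.2, 0.1],
                  [0.5, 1.0, 0.4, 0.3],
                  [0.2, 0.4, 1.0, 0.6],
                  [0.1, 0.3, 0.6, 1.0]])
V_cov = ["p1", "p2", "p3", "p4"]
E_cov = [(i, j) for i in range(4) for j in range(4)]
W_cov = {(V_cov[i], V_cov[j]): Sigma[i, j] for (i, j) in E_cov}

print(len(E_cycle), len(E_grid), len(E_cov))      # 24, 24, 16 ordered edges respectively
```

Edge-keyed dictionaries are only one possible encoding; the adjacency matrices introduced next gather the same information in a single N × N array.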
Adjacency matrices
- Given a graph G = (V, E, W) with N vertices, its adjacency matrix A ∈ R^{N×N} is defined as

      A_nm = w_nm  if (n, m) ∈ E,    A_nm = 0  otherwise

- A is a matrix representation incorporating all information about G
  ⇒ For unweighted graphs, positive entries represent connected pairs
  ⇒ For weighted graphs, they also denote proximities between pairs
- A inherently defines an ordering of vertices
  ⇒ the same ordering used for the graph signals that we will see soon

Adjacency matrices: examples
[Figure: the three example graphs (the 24-hour directed cycle, the 3 × 3 pixel lattice, and the covariance graph on p1, ..., p4) together with their adjacency matrices; for the cycle and the lattice the nonzero entries equal 1, while for the covariance graph A = Σ, i.e., A_nm = Σ_nm]
- A different ordering of the vertices will yield a different A

Graph signals
- Graph signals are mappings x : V → R
- Defined on the vertices of the graph
- May be represented as a vector x ∈ R^N
- x_n represents the signal value at the n-th vertex in V
- Inherently utilizes an ordering of vertices
  ⇒ the same ordering as in adjacency matrices
[Figure: a ten-node graph with a signal value at each node; as a vector, x = [s_0, s_1, ..., s_9]^T = [0.6, 0.7, ..., 0.3, ..., 0.7]^T]

Graphs: gene networks
- Graphs representing gene-gene interactions
  ⇒ Each node denotes a single gene (loosely speaking)
  ⇒ Two genes are connected if their coded proteins participate in the same metabolism
[Figure: a sample gene network with nodes TP53, CTNNB1, PTEN, KRAS, PIK3R1, IRS1, JAK1, IL2RG, and TIAM1]

Graph signals: genetic profiles
- The genetic profile of each patient can be considered as a graph signal
  ⇒ The signal on each node is 1 if the gene is mutated and 0 otherwise
[Figure: mutation profiles on the gene network for sample patients 1 and 2 with subtype 1 and sample patients 1 and 2 with subtype 2]

Plans
- We are going to derive the following concepts for graph signal processing
  ⇒ Total variation
  ⇒ Frequency; the notion of high or low frequency will be less obvious
  ⇒ DFT and iDFT for graph signals
  ⇒ Graph filtering
- And apply graph signal processing to a gene mutation dataset

Graph Laplacian

Degree of a node
- The degree of a node is the sum of the weights of its incident edges
- Given a weighted and undirected graph G = (V, E, W), the degree of node i is defined as deg(i) = Σ_{j ∈ N(i)} w_ij
  ⇒ where N(i) is the neighborhood of node i
- Equivalently, in terms of the adjacency matrix A,
  ⇒ deg(i) = Σ_j A_ij = Σ_j A_ji
- The degree matrix D ∈ R^{N×N} is a diagonal matrix such that D_ii = deg(i)
- In directed graphs, each node has an out-degree and an in-degree
  ⇒ Weights on outgoing and incoming edges need not coincide
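The degree computations translate directly into a few lines of NumPy. The sketch below is an illustration rather than code from the course: it builds the adjacency matrix of the unweighted 3 × 3 lattice from its edge list, then forms the degree vector and the diagonal degree matrix D.

```python
import numpy as np

# Ordered vertex list and undirected edge list for the 3 x 3 pixel lattice (nodes 1..9).
V = list(range(1, 10))
undirected_edges = []
for n in V:
    row, col = divmod(n - 1, 3)
    if col < 2:
        undirected_edges.append((n, n + 1))   # horizontal neighbor
    if row < 2:
        undirected_edges.append((n, n + 3))   # vertical neighbor

# Adjacency matrix: A[n, m] = w_nm for (n, m) in E, zero otherwise (all weights are 1 here).
N = len(V)
index = {v: i for i, v in enumerate(V)}       # chosen vertex ordering -> row/column index
A = np.zeros((N, N))
for n, m in undirected_edges:
    A[index[n], index[m]] = 1.0
    A[index[m], index[n]] = 1.0               # undirected: fill both orientations

# Degrees and the diagonal degree matrix D.
deg = A.sum(axis=1)                           # deg(i) = sum_j A_ij
D = np.diag(deg)

print(deg)                                    # corner pixels: 2, edge pixels: 3, center: 4
print(np.allclose(deg, A.sum(axis=0)))        # True: row sums equal column sums when undirected
```

With A and D in hand, the Laplacian defined next is simply D - A.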
Laplacian of a graph
- Given a graph G with adjacency matrix A and degree matrix D,
- we define the Laplacian matrix L ∈ R^{N×N} as L = D - A
- Equivalently, L can be defined element-wise as

      L_ij = deg(i) = Σ_{k ∈ N(i)} w_ik   if i = j
      L_ij = -w_ij                        if j ∈ N(i)
      L_ij = 0                            otherwise

- We assume G is undirected, so that deg(i) is well defined
- The normalized Laplacian can be obtained as D^(-1/2) L D^(-1/2)
  ⇒ We will mainly focus on the unnormalized version

An example of a graph Laplacian
- Consider a weighted and undirected graph on four nodes with edge weights
  w_12 = 1, w_14 = 2, w_23 = 3, w_24 = 2, w_34 = 1, and its Laplacian

      L = [  3  -1   0  -2
            -1   6  -3  -2
             0  -3   4  -1
            -2  -2  -1   5 ]

- Diagonal elements are strictly positive since no node is isolated
  ⇒ Every node has a non-zero degree
- Off-diagonal elements are non-positive

Multiplication by the Laplacian
- Consider a graph G with Laplacian L and a graph signal x on G
- The signal y = Lx results from multiplying x by the Laplacian
- Component y_i of y is (by the definition of the matrix product, separating the term j = i)

      y_i = Σ_j L_ij x_j = L_ii x_i + Σ_{j ≠ i} L_ij x_j

- We know L_ij = 0 when j ∉ N(i) and L_ij = -w_ij when j ∈ N(i). Then

      y_i = L_ii x_i - Σ_{j ∈ N(i)} w_ij x_j

- We also know that L_ii = deg(i) = Σ_{j ∈ N(i)} w_ij. Therefore

      y_i = Σ_{j ∈ N(i)} w_ij x_i - Σ_{j ∈ N(i)} w_ij x_j = Σ_{j ∈ N(i)} w_ij (x_i - x_j)

- Multiplication by L replaces x_i by a weighted average of its differences with its neighbors

Graph signal diffusion
- Multiplying by the Laplacian yields y_i = Σ_{j ∈ N(i)} w_ij (x_i - x_j)
- y_i measures the difference between x at a node and at its neighbors
- We say the signal diffuses through the graph, like heat diffuses
  ⇒ The temperature at i is averaged with its neighbors' temperatures
- Further Laplacian multiplications continue the diffusion process
  ⇒ L^2 x brings in energy from the 2-hop neighborhood
  ⇒ L^3 x brings in energy from the 3-hop neighborhood
  ⇒ ... L^k x brings in energy from the k-hop neighborhood

Variability of the Laplacian quadratic form
- The Laplacian quadratic form of a graph signal x is x^T L x
- The quadratic form is a number: row vector × matrix × column vector

Theorem. The Laplacian quadratic form of the signal x = [x_1, x_2, ..., x_N]^T is explicitly given by

      x^T L x = (1/2) Σ_i Σ_{j ∈ N(i)} w_ij (x_i - x_j)^2 = (1/2) Σ_{(i,j) ∈ E} w_ij (x_i - x_j)^2

- x^T L x quantifies the local variation of the signal x
  ⇒ signals can be ordered depending on how much they vary
  ⇒ this will be important to order frequencies

Proof of quadratic form variability
Proof.
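As a complement to the algebraic argument, both identities derived above can be checked numerically. The following NumPy sketch (not part of the original slides; the test signal x is an arbitrary made-up value) builds the Laplacian of the four-node example, verifies the diffusion formula y_i = Σ_{j ∈ N(i)} w_ij (x_i - x_j), and compares x^T L x with the weighted sum of squared differences.

```python
import numpy as np

# Four-node weighted, undirected example from the Laplacian slide:
# edges (1,2), (1,4), (2,3), (2,4), (3,4) with weights 1, 2, 3, 2, 1 (0-based indices below).
w = {(0, 1): 1.0, (0, 3): 2.0, (1, 2): 3.0, (1, 3): 2.0, (2, 3): 1.0}

N = 4
A = np.zeros((N, N))
for (i, j), wij in w.items():
    A[i, j] = A[j, i] = wij            # undirected: symmetric adjacency
D = np.diag(A.sum(axis=1))             # degree matrix
L = D - A                              # graph Laplacian L = D - A

x = np.array([0.6, -1.0, 0.3, 2.0])    # arbitrary graph signal (made-up values)

# Diffusion: y_i = sum_{j in N(i)} w_ij * (x_i - x_j)
y = L @ x
y_check = np.array([sum(A[i, j] * (x[i] - x[j]) for j in range(N)) for i in range(N)])
print(np.allclose(y, y_check))         # True

# Quadratic form: x^T L x = (1/2) * sum_i sum_{j in N(i)} w_ij * (x_i - x_j)^2
quad = x @ L @ x
quad_check = 0.5 * sum(A[i, j] * (x[i] - x[j]) ** 2 for i in range(N) for j in range(N))
print(np.isclose(quad, quad_check))    # True
```

Both printed checks evaluate to True for any choice of x, since the identities hold exactly.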