Graph Spectra and Signal Processing on Graphs

Aristotle University of Thessaloniki
Karalias Nikolaos
October 2015

Acknowledgements

I want to thank my advisor, professor Ioannis Pitas, for his patience and suggestions throughout the course of this project. I would also like to thank my parents and my brother for the support and understanding they showed.

Contents

1 Introduction
2 Graph Theory
3 Linear Algebra and Discrete Transforms
  3.1 Linear Algebra
  3.2 Discrete Transforms
  3.3 Circulant Matrices
4 Spectral Graph Theory
  4.1 Basic Properties
  4.2 Graph Spectra
    4.2.1 Circulant Graphs
    4.2.2 The Path Graph
    4.2.3 The Hypercube Graph
    4.2.4 The Grid Graph
  4.3 Additional Remarks
5 Digital Signal Processing on Graphs
  5.1 The Graph Fourier Transform and Graph Filtering
    5.1.1 Graph Products
    5.1.2 The Bilateral Graph Filter
  5.2 Experiments
    5.2.1 Graph Image Filtering
    5.2.2 Spectral Clustering
    5.2.3 Graph Filtering
    5.2.4 Concluding Remarks
Appendices
A Eigenvalue and Eigenvector Plots

Abstract

We analyze the spectra of some basic graphs and explore their relationship with known discrete transforms. We then investigate the field of signal processing on graphs, which has emerged in recent years, and conduct graph filtering experiments on specific graph topologies.

Chapter 1 Introduction

Graphs are a form of representing data and their structure. Vertices of a graph correspond to data points, while edges describe the relationships between them. Weighted graphs are often a natural and intuitive description of data in various applications; there, the edge weights represent some notion of (dis)similarity. The increasing need to model and analyze complex networks and large datasets has inevitably led to a drastic boost in the popularity of graph-theoretic methods [2]. However, apart from modern applications in social media and networking [13][14][15], graphs have also been useful in a variety of other areas, ranging from image processing [16] and graphics [17] to biology [18]. Spectral methods have often been at the core of those developments in a plethora of fields [27][19][20][41]. The popularity of spectral methods can be largely attributed to their relationship with linear algebra, which lends itself very well to computation. Linear algebra algorithms are often suitable for parallel distributed processing or vectorization, which makes them even more appealing in the face of the ever-increasing demand to deal with big volumes of data.

More recently, an interesting subject of study has emerged, namely the area of signal processing on graphs [21][22]. There, the vertices of the graph index a signal, which is then studied using tools from spectral graph theory. As we will see later, the core concepts of signal processing, like the Fourier transform, emerge naturally in this framework. This approach can be successfully utilized for image processing, where various filters are implemented in order to enhance certain aspects of an image (denoising, edge detection) [32][31], as well as for big data analysis [29]. Here, we will first introduce the basic concepts from the relevant areas of linear algebra and graph theory, then move on to analyze some common graph spectra, and finally take a look at signal processing on graphs and some of its applications.
Chapter 2 Graph Theory

A graph G = {V, E} consists of a vertex set V(G) and an edge set E(G), where an edge is an unordered pair of distinct vertices of G. Let (x, y) denote an edge. The vertices x and y are called adjacent, which we denote by x ~ y. A vertex is incident with an edge if it is one of the two vertices of the edge.

Definition 1 Two graphs G1, G2 are equal if and only if they have the same vertex set and the same edge set.

Definition 2 Two graphs G1 and G2 are isomorphic if there is a bijection φ from V(G1) to V(G2) such that x ~ y in G1 if and only if φ(x) ~ φ(y) in G2. We say that φ is an isomorphism from G1 to G2. Relabelling the vertex set of a graph G is an isomorphism.

Definition 3 A directed graph G has an edge set of distinct ordered pairs of adjacent vertices.

Definition 4 The degree of a vertex in a graph G is the number of edges incident on that vertex.

Next, we state some core definitions that are going to be useful for the spectral analysis of graphs.

Definition 5 The adjacency matrix A_{N×N}(G) of a graph G is the integer matrix with rows and columns indexed by the vertices of G, such that the ij-entry of A(G) is equal to the number of arcs from i to j (which is 0 or 1).

Theorem 1 Let G be a graph with adjacency matrix A(G). The number of walks from u to v in G with length r is (A^r)_{uv}.

Definition 6 The degree matrix D(G) of a graph G is the diagonal matrix where d_i is the degree of the vertex in the i-th position.

Definition 7 The Laplacian matrix L(G) of a graph G is the N × N symmetric matrix indexed by the vertices of G defined by L = D − A.

Definition 8 The incidence matrix of a graph G, denoted by B(G), is the matrix of binary entries with rows and columns indexed by the vertices and the edges of G respectively (some authors define it so that the rows correspond to edges and the columns correspond to vertices). The ij-th entry of B(G) is 1 if and only if vertex i belongs to edge j.

Definition 9 The normalized Laplacian matrix is L̃ = D^{−1/2} L D^{−1/2}.

Definition 10 A k-regular graph is one whose vertices all have the same degree k. Common examples of k-regular graphs are the cube, the ring, etc.

Definition 11 A complete graph K_n on n vertices is a graph in which every pair of vertices is connected by an edge.

Definition 12 Let G1, G2, ..., GN be graphs. Then the Cartesian graph product is the graph

G1 □ G2 □ ... □ GN = □_{i=1}^{N} Gi    (2.1)

with vertex set {(x1, x2, ..., xN) | xi ∈ V(Gi)}, and for which two vertices (x1, ..., xN) and (y1, ..., yN) are adjacent whenever (xi, yi) ∈ E(Gi) for exactly one index 1 ≤ i ≤ N and xj = yj for each index j ≠ i. The Cartesian graph product is commutative in the sense that G □ H ≅ H □ G, i.e. the two products are isomorphic. The Cartesian graph product is also associative.

Definition 13 Let G1, G2, ..., GN be graphs. Then the strong graph product is the graph

G1 ⊠ G2 ⊠ ... ⊠ GN = ⊠_{i=1}^{N} Gi    (2.2)

with vertex set {(x1, x2, ..., xN) | xi ∈ V(Gi)}, and for which two vertices (x1, ..., xN) and (y1, ..., yN) are adjacent whenever, for each index 1 ≤ i ≤ N, either (xi, yi) ∈ E(Gi) or xi = yi.

Definition 14 Let G1, G2, ..., GN be graphs. Then the direct graph product is the graph

G1 × G2 × ... × GN = ×_{i=1}^{N} Gi    (2.3)

with vertex set {(x1, x2, ..., xN) | xi ∈ V(Gi)}, and for which two vertices (x1, ..., xN) and (y1, ..., yN) are adjacent only whenever (xi, yi) ∈ E(Gi) for each index 1 ≤ i ≤ N.

Figure 2.1: The P2 path graph.
Figure 2.2: The C3 cycle graph.
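These small graphs make the matrix definitions above easy to check numerically. The following is a minimal sketch (added here for illustration, assuming Python with NumPy; it is not part of the original text) that builds the adjacency, degree, Laplacian and normalized Laplacian matrices of the cycle graph C3 and verifies Theorem 1 for walks of length 2.

import numpy as np

# Adjacency matrix of C3: each of the three vertices is adjacent to the other two.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

# Degree matrix D (Definition 6): diagonal matrix of vertex degrees (row sums of A).
D = np.diag(A.sum(axis=1))

# Laplacian L = D - A (Definition 7) and normalized Laplacian D^{-1/2} L D^{-1/2} (Definition 9).
L = D - A
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L_norm = D_inv_sqrt @ L @ D_inv_sqrt

# Theorem 1: the (u, v) entry of A^r counts the walks of length r from u to v.
# In C3 there are 2 walks of length 2 from a vertex back to itself (one through
# each neighbour) and 1 walk of length 2 between any two distinct vertices.
A2 = np.linalg.matrix_power(A, 2)
print(A2)   # [[2 1 1] [1 2 1] [1 1 2]]
print(L)    # every row of the Laplacian sums to zero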
For a graph G1 on N vertices and a graph G2 on M vertices, the adjacency matrix of the direct product is

A(G1 × G2) = A(G1) ⊗ A(G2),    (2.4)

the adjacency matrix of the Cartesian product is

A(G1 □ G2) = (A(G1) ⊗ I_M) + (I_N ⊗ A(G2)),    (2.5)

and the adjacency matrix of the strong product is

A(G1 ⊠ G2) = A(G1) ⊗ A(G2) + (A(G1) ⊗ I_M) + (I_N ⊗ A(G2))    (2.6)

[10]. For example, consider the following graphs: the path graph on two vertices, P2, and the cycle graph on three vertices, C3. Their graph products are shown below.

Figure 2.3: The Cartesian product G = C3 □ P2.
Figure 2.4: The strong product G = C3 ⊠ P2.
Figure 2.5: The direct product G = C3 × P2.

Chapter 3 Linear Algebra and Discrete Transforms

3.1 Linear Algebra

Here we are going to present some of the required tools and concepts from linear algebra and the theory of discrete transforms that are going to be useful later on.

Definition 15 A matrix A is symmetric when A = A^T.

Definition 16 A matrix A is called singular when it is not invertible.

Definition 17 Matrices B_{N×N} and C_{N×N} are said to be similar matrices whenever there exists a nonsingular matrix Q such that B = Q^{−1}CQ. The product P^{−1}AP is called a similarity transformation on A. If Q = P, where P is a permutation matrix, then B and C are said to be permutation-similar. The set of distinct eigenvalues of A is called the spectrum of A.

Definition 18 A matrix A is called normal whenever AA^T = A^T A. From this definition, we can observe that symmetric matrices are normal.

Definition 19 For an N × N matrix A, the scalars λ and the vectors v_{N×1} that satisfy

Av = λv    (3.1)

are called eigenvalues and eigenvectors of A, respectively.

Definition 20 A square matrix A is said to be diagonalizable whenever A is similar to a diagonal matrix, i.e. whenever there exists a nonsingular matrix P such that P^{−1}AP is diagonal.
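As a bridge between the two chapters, the following sketch (again an added illustration in Python with NumPy, not taken from the original text) constructs the product adjacency matrices of C3 and P2 through Kronecker products as in (2.4)-(2.6), and then diagonalizes the symmetric Cartesian-product matrix in the sense of Definitions 19 and 20.

import numpy as np

A_C3 = np.array([[0, 1, 1],
                 [1, 0, 1],
                 [1, 1, 0]])      # cycle graph C3 (N = 3 vertices)
A_P2 = np.array([[0, 1],
                 [1, 0]])         # path graph P2 (M = 2 vertices)
I3, I2 = np.eye(3), np.eye(2)

# (2.4) direct product, (2.5) Cartesian product, (2.6) strong product.
A_direct    = np.kron(A_C3, A_P2)
A_cartesian = np.kron(A_C3, I2) + np.kron(I3, A_P2)
A_strong    = A_direct + A_cartesian

# The adjacency matrices are symmetric, hence normal (Definition 18) and
# diagonalizable (Definition 20): with Q holding the eigenvectors as columns,
# Q^{-1} A Q is the diagonal matrix of eigenvalues.
eigvals, Q = np.linalg.eigh(A_cartesian)
D = np.linalg.inv(Q) @ A_cartesian @ Q
assert np.allclose(D, np.diag(eigvals))
print(np.round(eigvals, 3))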
