Spectral Projector-Based Graph Fourier Transforms
Joya A. Deri, Member, IEEE, and José M. F. Moura, Fellow, IEEE

arXiv:1701.02690v1 [cs.SI] 10 Jan 2017

Abstract—The paper presents the graph Fourier transform (GFT) of a signal in terms of its spectral decomposition over the Jordan subspaces of the graph adjacency matrix A. This representation is unique and coordinate free, and it leads to unambiguous definition of the spectral components ("harmonics") of a graph signal. This is particularly meaningful when A has repeated eigenvalues, and it is very useful when A is defective or not diagonalizable (as it may be the case with directed graphs). Many real world large sparse graphs have defective adjacency matrices. We present properties of the GFT and show it to satisfy a generalized Parseval inequality and to admit a total variation ordering of the spectral components. We express the GFT in terms of spectral projectors and present an illustrative example for a real world large urban traffic dataset.

Index Terms—Signal processing on graphs, graph signal processing, graph Fourier transform, spectral projection, graph spectral components, Jordan decomposition, generalized eigenspaces, directed graphs, sparse matrices, large networks

I. INTRODUCTION

Graph signal processing (GSP) extends traditional signal processing to data indexed by nodes of graphs. Such data arises in many domains, from genomics to business to social networks, to name a few. In GSP, the graph Fourier transform (GFT) has been defined through the eigendecomposition of the adjacency matrix A of the graph, taken as the graph shift operator [1]–[3], or of the graph Laplacian L [4]. In the GSP approach in [1]–[3], and according to the algebraic signal processing in [5]–[7], the eigenvectors of the shift are the graph frequency or graph spectral components and the eigenvalues are the graph frequencies.

Contributions. There are several issues that need further study: 1) Unicity: the matrix form of the GFT in [1]–[4] is not unique, depending on the (implicit or explicit) choice of bases for the underlying spaces. This is true even if the matrix of interest is diagonalizable. 2) Spectral components: If A or L have repeated eigenvalues, there may be several eigenvectors corresponding to this repeated eigenvalue (frequency)—defining the spectral or frequency components ("harmonics") becomes an issue. 3) Nondiagonalizability: If the shift is not diagonalizable, as happens in many real world applications with large sparse graphs, the matrix A is defective, introducing additional degrees of freedom in the coordinate form definition of the GFT. These topics are particularly relevant when applying GSP to datasets arising in real world problems. In many of these, the graphs are large and sparse and their adjacency matrix is defective.

This paper addresses these issues. We present the coordinate free GFT that leads to a unique spectral decomposition of graph signals and to an unambiguous definition of spectral components, regardless of whether there are repeated eigenvalues or not, or whether the shift is defective. Spectral components [5] are signals that are left invariant by graph filters. For repeated eigenvalues or defective matrices, it makes sense to consider in this context irreducible invariant subspaces—signal subspaces that are invariant to filtering and are irreducible. This is achieved by decomposing the signal space into a direct sum of irreducible filter-invariant (spectral) subspaces. If the dimension of the filter-invariant subspaces is larger than one, the choice of basis for these subspaces is not unique, and neither is the coordinate form of the GFT or identifying spectral components with basis vectors. The coordinate free GFT and the spectral decomposition we present successfully address these challenges. We present a spectral oblique projector-based GFT that allows for a unique and unambiguous spectral representation of a signal over defective adjacency matrices. Invariance to filtering follows from invariance to the shift operator (adjacency matrix A) since, by GSP [1]–[3], shift invariant filters are polynomials in the shift A. The spectral components are the Jordan subspaces of the adjacency matrix.

We show that the GFT allows characterization of the signal projection energies via a generalized Parseval's identity. Total variation ordering of the spectral components with respect to the Jordan subspaces is also discussed.

Synopsis of approach. Before we formally introduce the concepts, and as a way of introduction and motivation, we explain very concisely our approach. From algebraic signal processing (ASP) [5], [6], we know that the basic component is the signal processing model Ω = (𝒜, ℳ, Φ). For a vector space V of complex-valued signals, we can then generalize for this signal model Ω linear filtering theory, where the algebra 𝒜 corresponds to a filter space, the module ℳ corresponds to a signal space, and the bijective linear mapping Φ: V → ℳ generalizes the z-transform [5]. One way to create a signal model is to specify a generator (or generators) for 𝒜, the shift filter or shift operator. The Fourier transform is the map from the signal module ℳ to an irreducible decomposition of ℳ where the irreducible components are invariant to the shift (and to the filters). We are then interested in studying the invariant irreducible components of ℳ. These are the Jordan subspaces as explained below. In GSP, we choose as shift the adjacency matrix A of the underlying graph. Similarly, then, the Jordan subspaces play an important role in the graph Fourier transform defined in Section IV and, in this context, the irreducible, 𝒜-invariant submodules ℳ′ ⊆ ℳ are the spectral components of (signal space) ℳ. The Jordan subspaces are invariant, irreducible subspaces of C^N with respect to the adjacency matrix A; they represent the spectral components. This motivates the definition of a spectral projector-based graph Fourier transform in Section IV.

Section II describes related spectral analysis methods and graph signal processing frameworks. Section III provides the graph signal processing and linear algebra background for the graph Fourier transform defined in Section IV. Section V presents the generalized Parseval's identity as a method for ranking spectral components. Total variation-based orderings of the Jordan subspaces are discussed in detail in Section VI. Section VII shows an application on a real world dataset. Limitations of the method are briefly discussed in Section VIII.

This work was partially supported by NSF grants CCF-1011903 and CCF-1513936 and an SYS-CMU grant. The authors are with the Department of Electrical and Computer Engineering, Carnegie Mellon University, Pittsburgh, PA 15213 USA (email: jderi,[email protected]).

II. PREVIOUS WORK

This section presents a brief review of the literature and some background material.

A. Spectral methods

Principal component analysis (the Karhunen-Loève Transform) is an early signal decomposition method and remains a fundamental tool today. This approach orthogonally transforms data points, often via eigendecomposition or singular value decomposition (SVD) of an empirical covariance matrix, into linearly uncorrelated variables called principal components [8]–[10]. The first principal components capture the most variance in the data; this allows analysis to be restricted to these first few principal components, thus increasing the efficiency of the data representation.

Other methods determine low-dimensional representations of high-dimensional data by projecting the data onto low-dimensional subspaces generated by subsets of an eigenbasis [11]–[14]. References [11], [12] embed high-dimensional vectors onto low-dimensional manifolds determined by a weight matrix with entries corresponding to nearest-neighbor distances. In [13], embedding data in a low-dimensional space is described in terms of the graph Laplacian, where the graph Laplacian is an approximation to the Laplace-Beltrami operator on manifolds. Reference [15] also proves that the algorithm of [11] approximates eigenfunctions of a Laplacian-based matrix.

These methods [11]–[14] focus on discovering low-dimensional representations for high-dimensional data, capturing relationships between data variables into a matrix for their analysis. In contrast, our problem treats the data as a signal that is an input to a graph-based filter. Our approach emphasizes node-based weights (the signal) instead of edge-based weights that capture data dependencies. Related node-based methods in the graph signal processing framework are discussed next.

Data indexed by graphs and Laplacian-based GFTs. The graph signal processing framework developed in this paper assumes that data is indexed by graphs. Studies that analyze data indexed by nodes of a graph include [16]–[18], which use wavelet transforms to study data on distributed sensor networks. Other approaches, such as those in [4], [19]–[24], use the graph Laplacian and its eigenbasis for localized data processing. In particular, [4], [20] define a graph Fourier transform (GFT) as signal projections onto the Laplacian eigenvectors. These eigenvectors form an orthonormal basis since the graph Laplacian is symmetric and positive semidefinite. Graph-based filter banks are constructed with respect to this GFT in [21].

Analyses based on the graph Laplacian do not take into account first-order network structure, that is, any asymmetry like in a digraph or directed edges in a graph. These asymmetries affect network flows, random walks, and other graph properties, as studied, for example, in [25], [26]. The approach we take here preserves the influence of directed edges in graph signal processing by projecting onto the eigenbasis of the adjacency matrix.

Adjacency matrix-based GFTs. References [1]–[3] develop GSP, including filtering, convolution, and the graph Fourier transform, from the graph adjacency matrix A ∈ C^{N×N}, taken to play the role of the shift operator z^{−1} in digital signal processing. According to the algebraic signal processing theory of [5]–[7], [27], the shift generates all linear shift-invariant filters for a class of signals (under certain shift invariance assumptions). In the context of GSP [1], shift invariant filters are polynomials in the shift A.

The graph Fourier transform is also defined in terms of the adjacency matrix. GSP as presented in [1]–[3] preserves the directed network structure, in contrast to second order methods like those based on the graph Laplacian.

The graph Fourier transform of [1] is defined as follows. For a graph 𝒢 = 𝒢(A) with adjacency matrix A ∈ C^{N×N} and Jordan decomposition A = V J V^{−1}, the graph Fourier transform of a signal s ∈ C^N over 𝒢 is defined as

    s̃ = V^{−1} s,    (1)

where V^{−1} is the Fourier transform matrix of 𝒢. This is essentially a projection of the signal onto the eigenvectors of A. It is an orthogonal projection when A is normal (A^H A = A A^H) and the eigenvectors form a unitary basis (i.e., V^{−1} = V^H). This is of course guaranteed with the graph Laplacian. Left unanswered in these approaches is the lack of unicity^1 of V^{−1}, the appropriate definition of spectral components when there are repeated eigenvalues, and finally how to define the GFT uniquely when the shift is defective.

^1 Eigenvectors are defined up to a constant. Different choices lead to scaled polynomial transforms. The discrete Fourier transform corresponds to a very specific choice of basis [5].

This paper addresses these topics and, in particular, focuses on graph signal processing over defective, or non-diagonalizable, adjacency matrices. These matrices have at least one eigenvalue with algebraic multiplicity (the exponent in the characteristic polynomial of A) greater than the geometric multiplicity (the kernel dimension of A − λI), which results in an eigenvector matrix that does not span C^N. The basis can be completed by computing Jordan chains of generalized eigenvectors [28], [29], but the computation introduces degrees of freedom that render these generalized eigenvectors non-unique; in other words, the transform (1) may vary greatly depending on the particular generalized eigenvectors that are chosen. Our approach defines the GFT in terms of spectral projections onto the Jordan subspaces (i.e., the span of the Jordan chains) of the adjacency matrix.^2

^2 We recognize that computing the Jordan decomposition is numerically unstable. This paper is focused on the concepts of a spectral projection coordinate free definition of the GFT and spectral components. Section VIII will address these computational issues that, for lack of space, are fully discussed in [30], [31].

III. BACKGROUND

This section reviews the concepts of graph signal processing and provides a reference for the underlying mathematics. Section III-A defines the graph Fourier transform and graph filters; see also [1]–[3]. Section III-B defines the generalized eigenspaces and Jordan subspaces of a matrix [28], [29], [32].

A. Graph Signal Processing

Fig. 1: Directed cycle graph.

1) Graph Signals: Let 𝒢 = 𝒢(A) = (𝒱, ℰ) be the graph corresponding to matrix A ∈ C^{N×N}, where 𝒱 is the set of N nodes and a nonzero entry [A]_{ij} denotes a directed edge e_{ij} ∈ ℰ from node j to node i. In real-world applications, such nodes can be represented by geo-locations of a road network, and the edges can be specified by one-way or two-way streets. Define graph signal s: 𝒱 → 𝒮 on 𝒢, where 𝒮 represents the signal space over the nodes of 𝒢. We take 𝒮 = C^N such that s = (s_1, ..., s_N) ∈ C^N and s_i represents a measure at node v_i ∈ 𝒱. In real-world applications, such signals can be specified by sensor measurements or datasets.

2) Graph Shift: As in [1], [3], the graph shift is the graph signal processing counterpart to the shift operator z^{−1} in digital signal processing. The graph shift is defined as the operator that replaces the element s_i of a graph signal s = (s_1, ..., s_N) corresponding to node v_i ∈ 𝒱 with the linear combination of the signal elements at its in-neighbors (nodes v_k ∈ 𝒱 that participate in an edge e_{ik} ∈ ℰ), denoted by the set 𝒩_i; i.e., the shifted signal has elements s̃_i = Σ_{v_j ∈ 𝒩_i} [A]_{ij} s_j, or

    s̃ = A s.    (2)

Consistency with discrete signal processing can be seen by considering the directed cycle graph in Figure 1, which represents a finite, periodic time-series signal. The adjacency matrix of the graph is the circulant matrix (elements not shown are zero)

        ⎡          1 ⎤
    C = ⎢ 1          ⎥ .    (3)
        ⎢    ⋱       ⎥
        ⎣       1    ⎦

The shift s̃ = C s yields the time delay s̃_i = s_{i−1 mod N}. Reference [1] shows that the graph shift motivates defining the graph Fourier transform as the signal projection onto the eigenvectors of A. Our transform in Section IV builds on this concept to develop a framework to handle defective adjacency matrices.

3) Graph Filter: The graph shift is a simple graph filter, where a graph filter H ∈ C^{N×N} represents a (linear) system with output H s for any graph signal s ∈ 𝒮. As shown in Theorem 1 of [1], graph filter H is shift-invariant, or

    A (H s) = H (A s),    (4)
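The shift (2), its cycle-graph special case (3), and the shift-invariance condition (4) can be checked numerically. The sketch below (NumPy assumed; variable names are ours) builds the circulant matrix C of (3), verifies that the graph shift reproduces the DSP unit delay, and confirms that a polynomial filter h(C) commutes with the shift as in (4).

```python
import numpy as np

N = 8
# Cycle adjacency matrix C of (3): C[i, (i-1) mod N] = 1.
C = np.roll(np.eye(N), 1, axis=0)

s = np.arange(N, dtype=float)

# Graph shift (2) on the cycle graph: the DSP unit delay,
# s_shifted[i] = s[(i-1) mod N].
s_shifted = C @ s
assert np.allclose(s_shifted, np.roll(s, 1))

# A shift-invariant filter is a polynomial in the shift, H = h(C);
# the coefficients below are an arbitrary example.
H = 2.0 * np.eye(N) + 0.5 * C + 0.25 * (C @ C)

# Shift-invariance (4): A(Hs) = H(As).
assert np.allclose(C @ (H @ s), H @ (C @ s))
```

The same check works for any adjacency matrix A in place of C, since any polynomial in A commutes with A.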

3 Fig. 2: Illustration of generalized eigenspace partitions and Jordan chains of adjacency matrix 퐴 ∈ C푁×푁 for (a) a single Jordan block, (b) one eigenvalue and two Jordan blocks, (c) one eigenvalue and multiple Jordan blocks, and (d) multiple eigenvalues. In (a)-(c) (bottom), each point represents a vector in a Jordan chain of 퐴; points connected by lines illustrate a single Jordan chain. The partial multiplicities depicted for 휆1 are (a) 푁, (b) 푟11 = 푁 − 2 and 2, and (c) 푟11 = 푁 − 6, 2, 2, 1, and 1. Each generalized eigenspace G푖 in (d) can be visualized by (a)-(c).

if and only if a polynomial h(x) = Σ_{i=0}^{L} h_i x^i exists for constants h_0, h_1, ..., h_L ∈ C such that H = h(A) = Σ_{i=0}^{L} h_i A^i. This condition holds whenever the characteristic and minimal polynomials of A are equal [1]. For defective A with unequal characteristic and minimal polynomials, such as the examples seen in this paper, shift-invariance cannot be claimed; however, an equivalent graph filter can be designed in terms of a matrix that is the image of a polynomial of A [1]. The properties of such graph filters are established in [1].

B. Eigendecomposition

This section and Appendix A provide a review of Jordan decompositions. The reader is directed to [28], [29], [32], [33] for additional background. Jordan subspaces and the Jordan decomposition are defined in this section.

The generalized eigenspaces G_i = Ker(A − λ_i I)^{m_i} of A ∈ C^{N×N} corresponding to its k distinct eigenvalues λ_i decompose C^N in terms of the direct sum

    C^N = ⊕_{i=1}^{k} G_i    (5)

as depicted in Figure 2d; see also Appendix A.

Let v_1 ∈ Ker(A − λ_i I), v_1 ≠ 0, be one of the g_i proper eigenvectors of A corresponding to the eigenvalue λ_i. It generates by recursion the generalized eigenvectors

    A v_p = λ_i v_p + v_{p−1},    p = 2, ..., r,    (6)

where r is the minimal positive integer such that (A − λ_i I)^r v_r = 0 and (A − λ_i I)^{r−1} v_r ≠ 0. Such a sequence of vectors (v_1, ..., v_r) that satisfies (6) is a Jordan chain of length r. The Jordan chain vectors are linearly independent and span a Jordan subspace, or

    J = span(v_1, v_2, ..., v_r).    (7)

Order the Jordan subspaces of λ_i by decreasing dimension and denote by J_{ij} the jth Jordan subspace of λ_i with dimension r_{ij} ≤ m_i, where {r_{ij}}_{j=1}^{g_i} are called the partial multiplicities of λ_i. Then the generalized eigenspace G_i = Ker(A − λ_i I)^{m_i} of λ_i can be decomposed as

    G_i = ⊕_{j=1}^{g_i} J_{ij}    (8)

as depicted in Figures 2a-c. Combining (8) and (5), the Jordan subspaces uniquely decompose C^N as

    C^N = ⊕_{i=1}^{k} ⊕_{j=1}^{g_i} J_{ij}.    (9)

Furthermore, the cyclic Jordan subspaces cannot be represented as direct sums of smaller invariant subspaces; that is, the Jordan subspaces are irreducible components of C^N (see, e.g., p. 318 of [32]).

Figure 2 illustrates possible Jordan subspace structures of A, with the top row showing the tessellation of the vector space C^N by the generalized or root eigenspaces G_i = Ker(A − λ_i I)^{m_i} and by the Jordan spaces J_{ij}, and the bottom row illustrating the telescoping of C^N by the generalized eigenspaces of order p. Figure 2a illustrates C^N for a matrix with a single Jordan chain, represented by connected points in C^N. The case of a matrix with two Jordan blocks corresponding to one eigenvalue is shown in Figure 2b. Figure 2c shows C^N for a matrix with a single eigenvalue and multiple Jordan blocks, and Figure 2d depicts the tessellation of the space in terms of the generalized eigenspaces for the case of multiple distinct eigenvalues.

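The recursion (6) and the chain-length condition below it can be illustrated on a small defective matrix (a hypothetical 3 × 3 example of ours; NumPy assumed). For a single Jordan block with eigenvalue λ, the canonical basis vectors form a Jordan chain of length r = 3.

```python
import numpy as np

lam = 2.0
# A single 3x3 Jordan block: one proper eigenvector, chain length r = 3.
A = np.array([[lam, 1.0, 0.0],
              [0.0, lam, 1.0],
              [0.0, 0.0, lam]])

e1, e2, e3 = np.eye(3)

# Proper eigenvector: (A - lam I) e1 = 0.
assert np.allclose(A @ e1, lam * e1)

# Recursion (6): A v_p = lam v_p + v_{p-1}.
assert np.allclose(A @ e2, lam * e2 + e1)
assert np.allclose(A @ e3, lam * e3 + e2)

# Chain length r = 3: (A - lam I)^3 v_3 = 0 but (A - lam I)^2 v_3 != 0.
Nl = A - lam * np.eye(3)
assert np.allclose(np.linalg.matrix_power(Nl, 3) @ e3, 0)
assert not np.allclose(np.linalg.matrix_power(Nl, 2) @ e3, 0)
```

Here the Jordan subspace (7) is all of C^3, so by (8)-(9) the generalized eigenspace of λ is a single irreducible component.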
Jordan decomposition. Appendix A defines the eigenvector matrix V and the Jordan normal form J such that A = V J V^{−1}. An important property of this decomposition is that Jordan chains are not unique and not necessarily orthogonal. For example, the 3 × 3 matrix

        ⎡ 0 1 1 ⎤
    A = ⎢ 0 0 1 ⎥    (10)
        ⎣ 0 0 0 ⎦

can have distinct eigenvector matrices

          ⎡ 1 −1  1 ⎤          ⎡ 1 0  0 ⎤
    V_1 = ⎢ 0  1 −2 ⎥ ,  V_2 = ⎢ 0 1 −1 ⎥ ,    (11)
          ⎣ 0  0  1 ⎦          ⎣ 0 0  1 ⎦

where the Jordan chain vectors are the columns of V_1 and V_2 and so satisfy (6). Since Jordan chains are not unique, the Jordan subspace is used in Section IV to characterize the possible generalized eigenvectors.

IV. SPECTRAL PROJECTOR-BASED GRAPH SIGNAL PROCESSING

This section presents a basis-invariant, coordinate free graph Fourier transform with respect to a set of known proper eigenvectors. For graphs with diagonalizable adjacency matrices, the coordinate form of this transform resolves, with appropriate choice of basis vectors, to that of [1], [3]. The interpretation of the spectral components is settled for the cases of repeated eigenvalues and non-diagonalizable, or defective, adjacency matrices.

A. Spectral Components

Consider matrix A with distinct eigenvalues λ_1, ..., λ_k, k ≤ N, that has Jordan decomposition A = V J V^{−1}. Denote by J_{ij} the jth Jordan subspace of dimension r_{ij} corresponding to eigenvalue λ_i, i = 1, ..., k, j = 1, ..., g_i. Each J_{ij} is A-invariant and irreducible (see Section III-B). Then the Jordan subspaces are the spectral components of the signal space 𝒮 = C^N, and the graph Fourier transform of a graph signal s ∈ 𝒮 is defined as the mapping

    ℱ: 𝒮 → ⊕_{i=1}^{k} ⊕_{j=1}^{g_i} J_{ij}
       s → (ŝ_{11}, ..., ŝ_{1g_1}, ..., ŝ_{k1}, ..., ŝ_{kg_k}).    (12)

That is, the Fourier transform of s is the unique decomposition

    s = Σ_{i=1}^{k} Σ_{j=1}^{g_i} ŝ_{ij},   ŝ_{ij} ∈ J_{ij}.    (13)

The distinct eigenvalues λ_1, ..., λ_k are the graph frequencies of graph 𝒢(A). The frequency or spectral components of graph frequency λ_i are the Jordan subspaces J_{ij}. The total number of frequency components corresponding to λ_i is its geometric multiplicity g_i. In this way, when g_i > 1, frequency λ_i corresponds to more than one frequency component.

To highlight the significance of (12) and (13), consider the signal expansion of a graph signal s with respect to graph 𝒢(A):

    s = s̃_1 v_1 + ··· + s̃_N v_N = V s̃,    (14)

where v_i is the ith basis vector in a Jordan basis of A, V is the corresponding eigenvector matrix, and s̃_i is the ith expansion coefficient. As discussed in Section III-B, the choice of Jordan basis has degrees of freedom when the dimension of a cyclic Jordan subspace is greater than one. Therefore, if dim J_{ij} ≥ 2, there exists an eigenvector submatrix X_{ij} ≠ V_{ij} such that span{X_{ij}} = span{V_{ij}} = J_{ij}. Thus, the signal expansion (14) is not unique when A has Jordan subspaces of dimension r > 1.

In contrast, the Fourier transform given by (12) and (13) yields a unique signal expansion that is independent of the choice of Jordan basis. Given any Jordan basis v_1, ..., v_N with respect to A, the jth spectral component of eigenvalue λ_i is, by (13), ŝ_{ij} = Σ_{k=p}^{p+r_{ij}−1} s̃_k v_k, where v_p, ..., v_{p+r_{ij}−1} are a basis of J_{ij}. Under this definition, there is no ambiguity in the interpretation of frequency components even when Jordan subspaces have dimension r > 1 or there are repeated eigenvalues. The properties of the spectral components are discussed in more detail below.

The spectral components of the Fourier transform (12) are expressed in terms of the basis v_1, ..., v_N and its dual basis w_1, ..., w_N, since the chosen Jordan basis may not be orthogonal. Denote the basis and dual basis matrices by V = [v_1 ··· v_N] and W = [w_1 ··· w_N]. By definition, ⟨w_i, v_j⟩ = w_i^H v_j = δ_{ij}, where δ_{ij} is the Kronecker delta function [33], [34]. Then W^H V = V^H W = I, so the dual basis is the inverse Hermitian W = V^{−H}, and its columns correspond to the left eigenbasis.

Partition the Jordan basis matrix V as in (58) so that each V_{ij} ∈ C^{N×r_{ij}} spans Jordan subspace J_{ij}. Similarly, partition the dual basis matrix by rows as W^H = [··· W_{i1}^H ··· W_{ig_i}^H ···]^T, with each W_{ij}^H ∈ C^{r_{ij}×N}. Suppose V_{ij} = [v_1 ··· v_{r_{ij}}] with corresponding coefficients s̃_1, ..., s̃_{r_{ij}} in the Jordan basis expansion (14). Define an N × N matrix V_{ij}^0 = [0 ··· V_{ij} ··· 0] that is zero except for the columns corresponding to V_{ij}.

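The non-uniqueness in (10)-(11) can be checked numerically (NumPy assumed): both V_1 and V_2 bring A to the same Jordan form J, yet they differ, so the coordinate GFT (1) depends on the choice of basis while the Jordan subspace they span does not.

```python
import numpy as np

# The defective matrix of (10): one eigenvalue (0), one Jordan block.
A = np.array([[0., 1., 1.],
              [0., 0., 1.],
              [0., 0., 0.]])
J = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])  # single 3x3 Jordan block at lambda = 0

# The two eigenvector matrices of (11).
V1 = np.array([[1., -1., 1.],
               [0., 1., -2.],
               [0., 0., 1.]])
V2 = np.array([[1., 0., 0.],
               [0., 1., -1.],
               [0., 0., 1.]])

# Both are valid Jordan bases: A = V J V^{-1} for either choice.
for V in (V1, V2):
    assert np.allclose(V @ J @ np.linalg.inv(V), A)

# But the coordinate transform (1) differs between the two bases:
s = np.array([1., 2., 3.])
assert not np.allclose(np.linalg.inv(V1) @ s, np.linalg.inv(V2) @ s)
```

Since the single Jordan subspace here is all of C^3, the projection-based spectral component (13) of s is s itself under either basis, illustrating why the subspace formulation removes the ambiguity.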
Then each spectral component corresponding to Jordan subspace J_{ij} can be written as (below, diagonal dots and elements not shown are zero)

    ŝ_{ij} = s̃_1 v_1 + ··· + s̃_{r_{ij}} v_{r_{ij}}    (15)
           = V_{ij}^0 s̃    (16)
           = V_{ij}^0 V^{−1} s    (17)
           = V diag(0, ..., I_{r_{ij}}, ..., 0) V^{−1} s    (18)
           = V diag(0, ..., I_{r_{ij}}, ..., 0) W^H s    (19)
           = V_{ij} W_{ij}^H s,    (20)

for i = 1, ..., k and j = 1, ..., g_i. Denote

    P_{ij} = V_{ij} W_{ij}^H,    (21)

which is the projection matrix onto 𝒮_{ij} parallel to the complementary subspace 𝒮 ∖ 𝒮_{ij}. Note that P_{ij}^2 = V_{ij} W_{ij}^H V_{ij} W_{ij}^H = V_{ij} W_{ij}^H = P_{ij}.

The projection matrices {P_{ij}} are related to the first component matrix Z_{i0} of eigenvalue λ_i. The component matrix is defined as [28, Section 9.5]

    Z_{i0} = V diag(0, ..., I_{a_i}, ..., 0) V^{−1},    (22)

where a_i = Σ_{j=1}^{g_i} r_{ij} is the algebraic multiplicity of λ_i. This matrix acts as a projection matrix onto the generalized eigenspace. Theorem 1 provides additional properties of the projection matrix P_{ij}.

Theorem 1. For matrix A ∈ C^{N×N} with eigenvalues λ_1, ..., λ_k, the projection matrices P_{ij} onto the jth Jordan subspace J_{ij} corresponding to eigenvalue λ_i, i = 1, ..., k, j = 1, ..., g_i, satisfy the following properties:
(a) P_{ij} P_{kl} = δ_{ik} δ_{jl} P_{ij}, where δ is the Kronecker delta function;
(b) Σ_{j=1}^{g_i} P_{ij} = Z_{i0}, where Z_{i0} is the component matrix of eigenvalue λ_i.

Proof: (a) Since W^H V = I, the partition of W and V that yields (20) satisfies W_{ij}^H V_{kl} = δ_{ik} δ_{jl} I_{r_{ij}×r_{kl}}, where r_{ij} is the dimension of the Jordan subspace corresponding to P_{ij}, r_{kl} the dimension of the Jordan subspace corresponding to P_{kl}, and the matrix I_{r_{ij}×r_{kl}} consists of the first r_{kl} canonical vectors e_i = (0, ..., 1, ..., 0), where 1 is at the ith index. Then it follows that

    P_{ij} P_{kl} = V_{ij} W_{ij}^H V_{kl} W_{kl}^H    (23)
                  = V_{ij} (δ_{ik} δ_{jl} I_{r_{ij}×r_{kl}}) W_{kl}^H.    (24)

If i = k and j = l, then P_{ij} P_{kl} = V_{ij} I_{r_{ij}×r_{ij}} W_{ij}^H = P_{ij}; otherwise, P_{ij} P_{kl} = 0.

(b) Write

    Σ_{j=1}^{g_i} P_{ij} = Σ_{j=1}^{g_i} V diag(0, ..., I_{r_{ij}}, ..., 0) V^{−1}    (25)
                         = V ( Σ_{j=1}^{g_i} diag(0, ..., I_{r_{ij}}, ..., 0) ) V^{−1}    (26)
                         = V diag(0, ..., I_{Σ_{j=1}^{g_i} r_{ij}}, ..., 0) V^{−1}    (27)
                         = Z_{i0},    (28)

the first component matrix of A for eigenvalue λ_i. ∎

Theorem 1(a) shows that each projection matrix P_{ij} only projects onto Jordan subspace J_{ij}. Theorem 1(b) shows that the sum of projection matrices for a given eigenvalue equals the component matrix of that eigenvalue.

This section provides the mathematical foundation for a graph Fourier transform based on projections onto the Jordan subspaces of an adjacency matrix. The next section motivates ranking signal projections on Jordan subspaces by the energy of the signal projections.

V. GENERALIZED PARSEVAL'S IDENTITY

As discussed above, a chosen Jordan basis for matrix A ∈ C^{N×N}, represented by the eigenvector matrix V, may not be orthogonal. Therefore, Parseval's identity may not hold. Nevertheless, a generalized Parseval's identity does exist in terms of the Jordan basis and its dual; see also [34]. For a dual basis matrix W = V^{−H}, the following property holds:

Property 2 (Generalized Parseval's Identity). Consider graph signals s_1, s_2 ∈ C^N over graph 𝒢(A), A ∈ C^{N×N}. Let V = [v_1 ··· v_N] be a Jordan basis for A with dual basis W = V^{−H} partitioned as [w_1 ··· w_N]. Let s = Σ_{i=1}^{N} ⟨s, v_i⟩ v_i = V s̃_V be the representation of s in basis V and s = Σ_{i=1}^{N} ⟨s, w_i⟩ w_i = W s̃_W be the representation of s in basis W. Then

    ⟨s_1, s_2⟩ = ⟨s̃_{1,V}, s̃_{2,W}⟩.    (29)
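The projectors (21), Theorem 1, and the identity (29) can be sanity-checked on a small synthetic example (our own choice of V and J; NumPy assumed).

```python
import numpy as np

# Jordan form with a 2x2 block at lambda = 2 and a 1x1 block at lambda = 5,
# and a non-orthogonal Jordan basis V (an arbitrary choice).
J = np.array([[2., 1., 0.],
              [0., 2., 0.],
              [0., 0., 5.]])
V = np.array([[1., 1., 0.],
              [0., 1., 1.],
              [0., 0., 1.]])
A = V @ J @ np.linalg.inv(V)
W = np.linalg.inv(V).conj().T          # dual basis, W = V^{-H}

# Projectors (21) onto the two Jordan subspaces.
P11 = V[:, :2] @ W[:, :2].conj().T
P21 = V[:, 2:] @ W[:, 2:].conj().T

# Theorem 1(a): idempotence and mutual annihilation.
assert np.allclose(P11 @ P11, P11)
assert np.allclose(P11 @ P21, np.zeros((3, 3)))

# One Jordan block per eigenvalue here, so each projector equals the
# component matrix Z_i0 of its eigenvalue; together they resolve I,
# and the spectral components (13) sum back to the signal.
assert np.allclose(P11 + P21, np.eye(3))

# Generalized Parseval (29), using s~_V = V^{-1} s and s~_W = W^{-1} s.
s1 = np.array([1., 2., 3.])
s2 = np.array([4., -1., 0.5])
sv1 = np.linalg.inv(V) @ s1
sw2 = np.linalg.inv(W) @ s2
assert np.isclose(np.vdot(sw2, sv1), np.vdot(s2, s1))
```

The last assertion is exactly (29): the ordinary inner product of the signals equals the inner product of one signal's coefficients in the basis with the other's in the dual basis.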

By extension,

    ‖s‖^2 = ⟨s, s⟩ = ⟨s̃_V, s̃_W⟩.    (30)

Equations (29) and (30) hold regardless of the choice of eigenvector basis.

Energy of spectral components. The energy of a discrete signal s ∈ C^N is defined as [35], [36]

    E_s = ⟨s, s⟩ = ‖s‖^2 = Σ_{i=1}^{N} |s_i|^2.    (31)

Equation (30) thus illustrates conservation of signal energy in terms of both a Jordan basis and its dual. The energy of the signal projections onto the spectral components of 𝒢(A) for the GFT (12) is defined next. Write ŝ_{ij} in terms of the columns of V as ŝ_{ij} = α_1 v_{i,j,1} + ··· + α_{r_{ij}} v_{i,j,r_{ij}} and in terms of the columns of W as ŝ_{ij} = β_1 w_{i,j,1} + ··· + β_{r_{ij}} w_{i,j,r_{ij}}. Then the energy of ŝ_{ij} can be defined as

    ‖ŝ_{ij}‖^2 = ⟨α, β⟩    (32)

using the notation α = (α_1, ..., α_{r_{ij}}) and β = (β_1, ..., β_{r_{ij}}).

The generalized Parseval's identity expresses the energy of the signal in terms of the signal expansion coefficients s̃, which highlights the importance of choosing a Jordan basis. This emphasizes that both the GFT {ŝ_{ij}} and the signal expansion coefficients s̃ are necessary to fully characterize the graph Fourier domain.

Normal A. When A is normal (i.e., when A A^H = A^H A), V can be chosen to have unitary columns. Then V = W, so

    ⟨s_1, s_2⟩ = ⟨s̃_1, s̃_2⟩    (33)

and

    ‖s‖^2 = ⟨s, s⟩ = ‖s̃‖^2.    (34)

Note that (33) and (34) do not hold in general for diagonalizable A.

While the Jordan basis, or choice of eigenvectors, is not unique, the image of a signal s under the projection matrix P_{ij} (21) is invariant to the choice of Jordan basis. This section shows that these projections can be ranked in terms of the percentage of recovered signal energy. The next section demonstrates a total variation-based ranking of the spectral components.

VI. TOTAL VARIATION ORDERING

This section defines a mapping of spectral components to the real line to achieve an ordering of the spectral components. This ordering can be used to distinguish generalized low and high frequencies as in [3]. An upper bound for a total variation-based mapping of a spectral component (Jordan subspace) is derived.

As in [3], [37], the total variation for a finite, discrete-valued (periodic) time series s is defined as

    TV(s) = Σ_{n=0}^{N−1} |s_n − s_{n−1 mod N}| = ‖s − C s‖_1,    (35)

where C is the circulant matrix (3) that represents the DSP shift operator. As in [3], (35) is generalized to the graph shift A to define the graph total variation

    TV_G(s) = ‖s − A s‖_1.    (36)

Matrix A can be replaced by A^norm = (1/|λ_max|) A when the maximum eigenvalue satisfies |λ_max| > 0.

Equation (36) can be applied to define the total variation of a spectral component. These components are the cyclic Jordan subspaces of the graph shift A as described in Section III-B. Choose a Jordan basis of A so that V is the eigenvector matrix of A, i.e., A = V J V^{−1}, where J is the Jordan form of A. Partition V into N × r_{ij} submatrices V_{ij} whose columns are a Jordan chain of (and thus span) the jth Jordan subspace J_{ij} of eigenvalue λ_i, i = 1, ..., k ≤ N, j = 1, ..., g_i. Define the (graph) total variation of V_{ij} as

    TV_G(V_{ij}) = ‖V_{ij} − A V_{ij}‖_1,    (37)

where ‖·‖_1 represents the induced L1 matrix norm (equal to the maximum absolute column sum).

The next theorem shows equivalent formulations for the graph total variation (37).

Theorem 3. The graph total variation with respect to the GFT (12) can be written as

    TV_G(V_{ij}) = ‖V_{ij} (I_{r_{ij}} − J_{ij})‖_1    (38)
                 = max_{i=2,...,r_{ij}} { |1 − λ| ‖v_1‖_1 , ‖(1 − λ) v_i − v_{i−1}‖_1 }.    (39)

Proof: Simplify (37) to obtain

    TV_G(V_{ij}) = ‖V_{ij} − V J V^{−1} V_{ij}‖_1    (40)
                 = ‖V_{ij} − V J [0; I_{r_{ij}}; 0]‖_1    (41)
                 = ‖V_{ij} − V [0; J_{ij}; 0]‖_1    (42)
                 = ‖V_{ij} − V_{ij} J_{ij}‖_1    (43)
                 = ‖V_{ij} (I_{r_{ij}} − J_{ij})‖_1.    (44)

Let λ denote the ith eigenvalue and let the columns v_1, ..., v_{r_{ij}} of V_{ij} comprise its jth Jordan chain. Then (38) can be expressed in terms of the Jordan chain:

    TV_G(V_{ij})    (45)

7 ⃦ ⎡ ⎤⃦ ⃦ 1 − 휆 −1 ⃦ ⃦ ⃦ shift 퐴. While this total variation bound may not capture ⃦ ⎢ .. ⎥⃦ ⃦[︀ ]︀ ⎢ 1 − 휆 . ⎥⃦ the true total variation of a spectral component, it can = ⃦ 푣1 . . . 푣푟푖푗 ⎢ ⎥⃦ (46) be generalized as an upper bound for all spectral com- ⃦ ⎢ .. ⎥⃦ ⃦ ⎣ . −1 ⎦⃦ ponents associated with a Jordan equivalence class. This ⃦ 1 − 휆 ⃦ ⃦ ⃦1 concept is explored further in [31]. ⃦[︀(1 − 휆) 푣 (1 − 휆) 푣 − 푣 ··· ]︀⃦ = ⃦ 1 2 1 ⃦1 (47) VII.APPLICATION = max {|1 − 휆| ‖푣1‖1 , ‖(1 − 휆) 푣푖 − 푣푖−1‖1} . 푖=2,...,푟푖푗 We apply the graph Fourier transform (12) to a signal (48) based on 2010-2013 New York City taxi data [40] over the Manhattan road network. The signal represents the Theorem 4 shows that 푉 can be chosen such that four-year average number of trips that pass through each node of the road network as determined by Dijkstra path ‖푉푖푗‖1 = 1 without loss of generality. estimates from the start- and end-coordinates (latitude Theorem 4. The eigenvector matrix 푉 of adjacency ma- 푁×푁 and longitude) provided by the raw data; the computation trix 퐴 ∈ C can be chosen so that each Jordan chain behind these estimates is described in [41]. The road net- 푁×푟푖푗 represented by the eigenvector submatrix 푉푖푗 ∈ C work consists of 6,408 nodes that represent latitude and satisfies ‖푉 ‖ = 1; i.e., ‖푉 ‖ = 1 without loss of 푖푗 1 1 longitude coordinates from [42] that are connected by generality. 14,418 directed edges that represent one-way directions Proof: Let 푉 represent an eigenvector matrix of as verified by Google Maps [43]. The adjacency matrix 퐴 with partitions 푉푖푗 as described above, and let 퐽푖푗 of the road network is defective with 253 Jordan chains represent the corresponding Jordan block. Let 퐷 be a of length 2 and 193 eigenvectors without Jordan chain block diagonal matrix with 푟푖푗 × 푟푖푗 diagonal blocks vectors corresponding to eigenvalue zero (446 Jordan chains total). Details for the eigendecomposition of this 퐷푖푗 = (1/ ‖푉푖푗‖1)퐼푟푖푗 . 
Since 퐷푖푗 commutes with 퐽푖푗, 퐷 commutes with 퐽. Note that 퐷 is a special case matrix are described in [30], [44]. of upper triangular Toeplitz matrices discussed in [28, Figure 3a shows the signal, consisting of the four- Example 6.5.4, Theorem 12.4.1]. year June-August average number of trips at each node Let 푋 = 푉 퐷 and 퐵 = 푋퐽푋−1. Then of the Manhattan road network for Fridays 9pm-10pm. Applying the GFT (12) and computing the energies of −1 퐵 = 푋퐽푋 (49) the signal projections as described in Section V yields a = 푉 퐷퐽퐷−1푉 −1 (50) highly expressed eigenvector shown in Figure 3b. This = 푉 퐷퐷−1퐽푉 −1 (51) eigenvector shows that most of the signal energy is concentrated at Gramercy Park (shown in black), north = 푉 퐽푉 −1 (52) of Gramercy Park, and in Hell’s Kitchen on the west = 퐴. (53) side of Manhattan. Therefore, both 푉 and 푋 are eigenvector matrices of 퐴. VIII.LIMITATIONS In the following, it is assumed that 푉 satisfies Theo- The graph Fourier transform presented in this paper rem 4. Theorem 5 presents an upper bound of (38). solves the problem of uniqueness for defective adjacency matrices by projecting a signal onto Jordan subspaces Theorem 5. Consider matrix 퐴 with 푘 distinct eigenval- instead of eigenvectors and generalized eigenvectors. ues and 푁 × 푟푖푗 matrices 푉푖푗 with columns comprising This method relies on Jordan chain computations, how- the 푗th Jordan chain of 휆푖, 푖 = 1, . . . , 푘, 푗 = 1, . . . , 푔푖. ever, which is sensitive to numerical errors and can Then the graph total variation TV퐺(푉푖푗) ≤ |1 − 휆푖|+1. be expensive when the number of chains to compute and the graph dimension are large and the computing Proof: Let ‖푉푖푗‖ = 1 and rewrite (38): 1 infrastructure is memory-bound. The computation can ⃦ ⃦ TV퐺 (푉푖푗) ≤ ‖푉푖푗‖1 ⃦퐼푟푖푗 − 퐽푖푗⃦1 (54) be accelerated by using equivalence classes over graph ⃦ ⃦ topologies, as we discuss in [31]. These classes allow = ⃦퐼푟푖푗 − 퐽푖푗⃦1 (55) matching between a given graph to graphs of simpler = |1 − 휆 | + 1. 
(56) 푖 topologies such that the graph Fourier transform of a signal with respect to these graphs is preserver. Since Equations (38), (39), and (56) characterize the (graph) the equivalence classes may not be applicable to arbitrary total variation of a Jordan chain by quantifying the graph structures, we explore inexact eigendecomposition change in a set of vectors that spans the Jordan sub- methods in [30] to reduce the execution time and bypass space J푖푗 when they are transformed by the graph the numerically unstable Jordan chain computation.
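The bound of Theorem 5 is straightforward to check numerically. Below is a minimal sketch, assuming NumPy; the eigenvalue `lam`, chain length `r`, and the random chain matrix are hypothetical stand-ins for illustration, not values from the taxi network:

```python
import numpy as np

# Hypothetical r x r Jordan block J_ij for an assumed eigenvalue lam.
lam, r = 0.5, 3
J = lam * np.eye(r) + np.diag(np.ones(r - 1), k=1)

# Hypothetical Jordan chain matrix V_ij, scaled so that ||V_ij||_1 = 1
# (Theorem 4: this normalization is without loss of generality).
rng = np.random.default_rng(0)
V = rng.standard_normal((6, r))
V /= np.linalg.norm(V, 1)          # induced 1-norm = max absolute column sum

# Graph total variation of the chain, as in (45)-(46): ||V_ij (I - J_ij)||_1.
tv = np.linalg.norm(V @ (np.eye(r) - J), 1)

# Theorem 5 bound: TV_G(V_ij) <= |1 - lam| + 1.
bound = abs(1 - lam) + 1
assert tv <= bound + 1e-12
print(f"TV = {tv:.4f} <= bound = {bound:.4f}")
```

The inequality follows from the submultiplicativity of the induced 1-norm, so the assertion holds for any chain matrix normalized this way.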


Fig. 3: Graph signal based on NYC taxi data (a) and the maximum expressed eigenvector (b). (a) Colors denote log10 bins of the four-year average number of trips for Fridays 9pm–10pm (699 log bins; white: 0–20, beige to yellow: 20–230, orange: 230–400, red: 400–550, blue: 550–610, purple to black: 610–2,700). (b) All colors except white indicate locations with a concentrated number of taxi trips. Black indicates the locations of maximum expression in the eigenvector. The plots were generated with ggmap [38] and OpenStreetMap [39].
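The projection-energy ranking applied above reduces to simple linear algebra once a Jordan basis is in hand: with GFT coordinates $\tilde{s} = V^{-1}s$, the energy of the projection onto each Jordan subspace is the squared 2-norm of $V_{ij}\tilde{s}_{ij}$. A toy sketch under that assumption, using NumPy and a hypothetical 3-node defective shift (not the Manhattan matrix):

```python
import numpy as np

# Hypothetical defective adjacency matrix: eigenvalue 0 with a length-2 Jordan
# chain (columns 0-1 of V) and eigenvalue 1 with a single eigenvector (column 2).
A = np.array([[0., 1., 0.],
              [0., 0., 0.],
              [0., 0., 1.]])
V = np.eye(3)                         # a Jordan basis of A (here simply I)
blocks = [(0.0, [0, 1]), (1.0, [2])]  # (eigenvalue, columns of its Jordan chain)

s = np.array([1., 2., 3.])            # graph signal
s_hat = np.linalg.solve(V, s)         # GFT coordinates: s_hat = V^{-1} s

# Energy of the projection of s onto each Jordan subspace.
energies = {lam: float(np.sum(np.abs(V[:, cols] @ s_hat[cols]) ** 2))
            for lam, cols in blocks}
print(energies)                       # the eigenvalue-1 subspace dominates here
```

Ranking the entries of `energies` is the small-scale analogue of identifying the highly expressed eigenvector in Figure 3b.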

IX. CONCLUSION

The graph Fourier transform proposed here provides a unique and non-ambiguous spectral decomposition for signals over graphs with defective, or non-diagonalizable, adjacency matrices. The transform is based on spectral projections of signals onto the Jordan subspaces of the graph adjacency matrix. This coordinate free graph Fourier transform is unique and leads to a unique spectral decomposition of graph signals. This paper shows that the signal projections onto the Jordan subspaces can be ranked by energy via a generalized Parseval's identity. Lastly, a total variation-based ordering of the Jordan subspaces is proposed. This allows ordering frequencies and defining low-, high-, and band-pass graph signals.

APPENDIX A
BACKGROUND ON JORDAN DECOMPOSITIONS

Direct sum. Let $X_1, \dots, X_k$ be subspaces of a vector space $X$ such that $X = X_1 + \cdots + X_k$. If $X_i \cap X_j = \{0\}$ for all $i \ne j$, then $X$ is the direct sum of the subspaces $\{X_i\}_{i=1}^k$, denoted $X = \bigoplus_{i=1}^k X_i$. Any $x \in X$ can be written uniquely as $x = x_1 + \cdots + x_k$, where $x_i \in X_i$, $i = 1, \dots, k$.

Eigenvalues and multiplicities. Consider a matrix $A \in \mathbb{C}^{N \times N}$ with $k$ distinct eigenvalues $\lambda_1, \dots, \lambda_k$, $k \le N$. The eigenvalues of $A$ are the roots of the characteristic polynomial $\phi_A(\lambda) = \det(A - \lambda I) = \prod_{i=1}^k (\lambda - \lambda_i)^{a_i}$, where $I$ is the identity matrix and the exponent $a_i$ is the algebraic multiplicity of eigenvalue $\lambda_i$, $i = 1, \dots, k$. Denote by $\mathrm{Ker}(A)$ the kernel or null space of matrix $A$, i.e., the span of vectors $v$ satisfying $Av = 0$. The geometric multiplicity $g_i$ of eigenvalue $\lambda_i$ equals the dimension of the null space $\mathrm{Ker}(A - \lambda_i I)$. The minimal polynomial $m_A(\lambda)$ of $A$ has the form $m_A(\lambda) = \prod_{i=1}^k (\lambda - \lambda_i)^{m_i}$, where $m_i$ is the index of eigenvalue $\lambda_i$. The index $m_i$ represents the maximum Jordan chain length, or Jordan subspace dimension, which is discussed in more detail below.

Generalized eigenspaces. The eigenvectors and generalized eigenvectors of a matrix $A \in \mathbb{C}^{N \times N}$ partition $\mathbb{C}^N$ into subspaces, some of which are spans of eigenvectors, eigenspaces, or generalized eigenspaces. The subspace $\mathcal{G}_i = \mathrm{Ker}(A - \lambda_i I)^{m_i}$ is the generalized eigenspace or root subspace of $\lambda_i$. The generalized eigenspaces are $A$-invariant; that is, for all $x \in \mathcal{G}_i$, $Ax \in \mathcal{G}_i$. The subspace $\mathcal{S}_{ip} = \mathrm{Ker}(A - \lambda_i I)^p$, $p = 0, 1, \dots, N$, is the generalized eigenspace of order $p$ for $\lambda_i$. For $p \ge m_i$, $\mathcal{S}_{ip} = \mathcal{G}_i$. The proper eigenvectors $v$ of $\lambda_i$, or simply eigenvectors of $\lambda_i$, are linearly independent vectors in $\mathcal{S}_{i1} = \mathrm{Ker}(A - \lambda_i I)$, the eigenspace of $\lambda_i$. There are $g_i = \dim \mathcal{S}_{i1} = \dim \mathrm{Ker}(A - \lambda_i I)$ eigenvectors of $\lambda_i$. The subspaces $\mathcal{S}_{ip}$ form a (maximal) chain of $\mathcal{G}_i$ as depicted

in Figure 2; that is,

$$\{0\} = \mathcal{S}_{i0} \subset \mathcal{S}_{i1} \subset \cdots \subset \mathcal{S}_{i,m_i} = \mathcal{S}_{i,m_i+1} = \cdots \subset \mathbb{C}^N,$$

where $m_i$ is the index of $\lambda_i$. Vectors $v \in \mathcal{S}_{ip}$ with $v \notin \mathcal{S}_{i,p-1}$ are generalized eigenvectors of order $p$ for $\lambda_i$.

Properties of Jordan subspaces. A Jordan subspace (7) is $A$-invariant; that is, for all $x \in \mathcal{J}$, $Ax \in \mathcal{J}$. The Jordan subspace $\mathcal{J}$ is also cyclic, since it can be written by (6) as

$$\mathcal{J} = \operatorname{span}\big( v_r,\ (A - \lambda I)v_r,\ \dots,\ (A - \lambda I)^{r-1} v_r \big) \qquad (57)$$

for $v_r \in \mathrm{Ker}(A - \lambda I)^r$, $v_r \ne 0$. The number of Jordan subspaces corresponding to $\lambda_i$ equals the geometric multiplicity $g_i = \dim \mathrm{Ker}(A - \lambda_i I)$, since there are $g_i$ eigenvectors of $\lambda_i$. It can be shown that the Jordan spaces $\{\mathcal{J}_{ij}\}$, $j = 1, \dots, g_i$, $i = 1, \dots, k$, are disjoint.

Jordan decomposition. Let $V_{ij}$ denote the $N \times r_{ij}$ matrix whose columns form a Jordan chain of eigenvalue $\lambda_i$ of $A$ that spans the Jordan subspace $\mathcal{J}_{ij}$. Then the generalized eigenvector matrix $V$ of $A$ is

$$V = \begin{bmatrix} V_{11} & \cdots & V_{1g_1} & \cdots & V_{k1} & \cdots & V_{kg_k} \end{bmatrix}, \qquad (58)$$

where $k$ is the number of distinct eigenvalues. The columns of $V$ form a Jordan basis of $\mathbb{C}^N$. Then $A$ has block-diagonal Jordan normal form $J$ consisting of Jordan blocks

$$J(\lambda) = \begin{bmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{bmatrix} \qquad (59)$$

of size $r_{ij}$; see, for example, [28] or [45, p. 125]. The Jordan normal form $J$ of $A$ is unique up to a permutation of the Jordan blocks. The Jordan decomposition of $A$ is $A = VJV^{-1}$.

REFERENCES

[1] A. Sandryhaila and J. M. F. Moura, "Discrete signal processing on graphs," IEEE Transactions on Signal Processing, vol. 61, no. 7, pp. 1644–1656, Apr. 2013.
[2] A. Sandryhaila and J. M. F. Moura, "Big data analysis with signal processing on graphs: Representation and processing of massive data sets with irregular structure," IEEE Signal Processing Magazine, vol. 31, no. 5, pp. 80–90, Aug. 2014.
[3] A. Sandryhaila and J. M. F. Moura, "Discrete signal processing on graphs: Frequency analysis," IEEE Transactions on Signal Processing, vol. 62, no. 12, pp. 3042–3054, Jun. 2014.
[4] D. Shuman, S. K. Narang, P. Frossard, A. Ortega, and P. Vandergheynst, "The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains," IEEE Signal Processing Magazine, vol. 30, no. 3, pp. 83–98, Apr. 2013.
[5] M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: Foundation and 1-d time," IEEE Transactions on Signal Processing, vol. 56, no. 8, pp. 3572–3585, Aug. 2008.
[6] M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: 1-d space," IEEE Transactions on Signal Processing, vol. 56, no. 8, pp. 3586–3599, Aug. 2008.
[7] M. Püschel and J. M. F. Moura, "Algebraic signal processing theory: Cooley-Tukey type algorithms for DCTs and DSTs," IEEE Transactions on Signal Processing, vol. 56, no. 4, pp. 1502–1521, Apr. 2008.
[8] K. Pearson, "On lines and planes of closest fit to systems of points in space," Philosophical Magazine Series 6, vol. 2, no. 11, pp. 559–572, 1901.
[9] H. Hotelling, "Analysis of a complex of statistical variables into principal components," Journal of Educational Psychology, vol. 24, no. 6, pp. 417–441, 498–520, 1933.
[10] H. Hotelling, "Relations between two sets of variates," Biometrika, vol. 28, no. 3/4, pp. 321–377, 1936.
[11] J. F. Tenenbaum, V. Silva, and J. C. Langford, "A global geometric framework for nonlinear dimensionality reduction," Science, vol. 290, pp. 2319–2323, 2000.
[12] S. Roweis and L. Saul, "Nonlinear dimensionality reduction by locally linear embedding," Science, vol. 290, pp. 2323–2326, 2000.
[13] M. Belkin and P. Niyogi, "Laplacian eigenmaps for dimensionality reduction and data representation," Neural Computation, vol. 15, no. 6, pp. 1373–1396, 2003.
[14] D. L. Donoho and C. Grimes, "Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data," Proceedings of the National Academy of Sciences, vol. 100, no. 10, pp. 5591–5596, 2003.
[15] M. Belkin and P. Niyogi, "Towards a theoretical foundation for Laplacian-based manifold methods," Journal of Computer and System Sciences, vol. 74, no. 8, pp. 1289–1308, 2008.
[16] D. Ganesan, B. Greenstein, D. Estrin, J. Heidemann, and R. Govindan, "Multiresolution storage and search in sensor networks," ACM Transactions on Storage, vol. 1, pp. 277–315, 2005.
[17] R. Wagner, H. Choi, R. Baraniuk, and V. Delouille, "Distributed wavelet transform for irregular sensor network grids," in Proceedings of the 13th IEEE Workshop on Statistical Signal Processing, 2005, pp. 1196–1201.
[18] R. Wagner, R. Baraniuk, S. Du, D. Johnson, and A. Cohen, "An architecture for distributed wavelet analysis and processing in sensor networks," in Proceedings of the 5th ACM International Conference on Information Processing in Sensor Networks, 2006, pp. 243–250.
[19] R. R. Coifman and M. Maggioni, "Diffusion wavelets," Applied and Computational Harmonic Analysis, vol. 21, no. 1, pp. 53–94, 2006.
[20] D. K. Hammond, P. Vandergheynst, and R. Gribonval, "Wavelets on graphs via spectral graph theory," Applied and Computational Harmonic Analysis, vol. 30, no. 2, pp. 129–150, 2011.
[21] S. K. Narang and A. Ortega, "Perfect reconstruction two-channel wavelet filter banks for graph structured data," IEEE Transactions on Signal Processing, vol. 60, no. 6, pp. 2786–2799, Jun. 2012.
[22] A. Agaskar and Y. Lu, "A spectral graph uncertainty principle," IEEE Transactions on Information Theory, vol. 59, no. 7, pp. 4338–4356, 2013.
[23] V. Ekambaram, G. Fanti, B. Ayazifar, and K. Ramchandran, "Multiresolution graph signal processing via circulant structures," in Proceedings of the IEEE DSP/SPE Workshop, 2013, pp. 112–117.
[24] X. Zhu and M. Rabbat, "Approximating signals supported on graphs," in Proceedings of the 37th IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Mar. 2012, pp. 3921–3924.
[25] S. Aksoy, F. Chung, and X. Peng, "Extreme values of the stationary distribution of random walks on directed graphs," Advances in Applied Mathematics, vol. 81, pp. 128–155, 2016.
[26] F. Chung, P. Horn, and L. Lu, "Diameter of random spanning trees in a given graph," Journal of Graph Theory, vol. 69, pp. 223–240, 2012.
[27] M. Püschel and J. M. F. Moura, "The algebraic approach to the discrete cosine and sine transforms and their fast algorithms," SIAM Journal on Computing, vol. 32, no. 5, pp. 1280–1316, 2003.
[28] P. Lancaster and M. Tismenetsky, The Theory of Matrices, 2nd ed., New York, NY, USA: Academic, 1985.
[29] G. H. Golub and C. F. Van Loan, Matrix Computations, 4th ed., Baltimore, MD, USA: JHU Press, 2013.
[30] J. A. Deri and J. M. F. Moura, "Agile inexact methods for spectral projector-based graph Fourier transforms," in preparation, Nov. 2016.
[31] J. A. Deri and J. M. F. Moura, "Graph equivalence classes for spectral projector-based graph Fourier transforms," in preparation, Nov. 2016.
[32] I. Gohberg, P. Lancaster, and L. Rodman, Invariant Subspaces of Matrices with Applications, vol. 51, Philadelphia, PA, USA: SIAM, 2006.
[33] R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge, U.K.: Cambridge Univ. Press, 2012.
[34] M. Vetterli, J. Kovačević, and V. K. Goyal, Foundations of Signal Processing, Cambridge, U.K.: Cambridge Univ. Press, 2014.
[35] A. V. Oppenheim, A. S. Willsky, and S. H. Nawab, Signals and Systems, vol. 2, Englewood Cliffs, NJ, USA: Prentice-Hall, 1983.
[36] A. V. Oppenheim, R. W. Schafer, and J. R. Buck, Discrete-Time Signal Processing, 2nd ed., Englewood Cliffs, NJ, USA: Prentice-Hall, 1999.
[37] S. Mallat, A Wavelet Tour of Signal Processing, 3rd ed., New York, NY, USA: Academic, 2008.
[38] D. Kahle and H. Wickham, "ggmap: Spatial visualization with ggplot2," The R Journal, vol. 5, no. 1, pp. 144–161, 2013.
[39] OpenStreetMap contributors, "OpenStreetMap," http://www.openstreetmap.org/, accessed 09-Jun.-2016.
[40] B. Donovan and D. B. Work, "New York City Taxi Data (2010–2013)," dataset, http://dx.doi.org/10.13012/J8PN93H8, 2014, accessed 30-Jun.-2016.
[41] J. A. Deri and J. M. F. Moura, "Big data computation of taxi movement in New York City," in Proceedings of the 1st IEEE Big Data Conference Workshop on Big Spatial Data, Dec. 2016, to appear.
[42] Baruch College: Baruch Geoportal, "NYC Geodatabase," https://www.baruch.cuny.edu/confluence/display/geoportal/NYC+Geodatabase, accessed 31-Aug.-2014.
[43] Google, "Google Map of Manhattan, New York City," https://goo.gl/maps/X5PQqrywKpq, accessed 31-Aug.-2014.
[44] J. A. Deri and J. M. F. Moura, "Taxis in New York City: A network perspective," in Proceedings of the 49th Asilomar Conference on Signals, Systems, and Computers, Nov. 2015, pp. 1829–1833.
[45] C. Jordan, Traité des substitutions et des équations algébriques, Paris, 1870.
