
Thus T̂(b̂_j) = Σ_i a_{ji} b̂_i, and the matrix of T̂ with respect to B̂ is A^T, the transpose of the matrix of T with respect to B.

Remark: Natural isomorphism between V and its double dual V̂̂. We have seen above that every finite-dimensional vector space V has the same dimension as its dual V̂, and hence they are isomorphic as vector spaces. Once we choose a basis for V we also obtain a dual basis for V̂, and the correspondence between the two bases gives us an explicit isomorphism between V and V̂. However, this isomorphism is not intrinsic to the space V, in the sense that it depends upon a choice of basis and cannot be described independently of a choice of basis.

The dual space of V̂ is denoted V̂̂; it is the space of all linear mappings from V̂ to F. By all of the above discussion it has the same dimension as V̂ and V. The reason for mentioning this, however, is that there is a "natural" isomorphism θ from V to V̂̂. It is defined in the following way, for x ∈ V and f ∈ V̂ (note that θ(x) belongs to V̂̂, so θ(x)(f) should be an element of F):

θ(x)(f) = f(x).
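The defining formula θ(x)(f) = f(x) can be made concrete by identifying V with R^3 and a functional with its coefficient vector; this identification, and the helper names below, are assumptions for illustration only. A minimal sketch in Python:

```python
# Sketch: the natural map theta from V to its double dual, with V identified
# with R^3 and a functional f represented by its coefficient vector, so that
# f(x) is the dot product of the coefficients with x. (Illustrative only.)

def functional(coeffs):
    """Return the linear functional f with f(x) = sum_i coeffs[i] * x[i]."""
    return lambda x: sum(c * xi for c, xi in zip(coeffs, x))

def theta(x):
    """theta(x) is the element of the double dual that evaluates f at x."""
    return lambda f: f(x)

x = [1, 2, 3]
f = functional([4, 0, -1])   # f(x) = 4*1 + 0*2 - 1*3 = 1

print(theta(x)(f))  # 1, i.e. theta(x)(f) = f(x)
```

Note that theta itself is defined with no reference to any basis; the coefficient representation is used only so that there is something concrete to evaluate.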

To see that θ is an isomorphism, suppose that x_1, ..., x_k are independent elements of V. Then θ(x_1), ..., θ(x_k) are independent elements of V̂̂. To see this, let a_1, ..., a_k be elements of F for which a_1 θ(x_1) + ··· + a_k θ(x_k) = 0. This means that f(a_1 x_1 + ··· + a_k x_k) = 0 for all f ∈ V̂, which means that a_1 x_1 + ··· + a_k x_k = 0, which means that each a_i = 0. Thus θ maps independent sets to independent sets, so it is injective; since V and V̂̂ have the same (finite) dimension, θ is an isomorphism.

1.3 Matrices and Graphs

A directed graph (or digraph) G consists of a non-empty set V of vertices and a set E of ordered pairs of these vertices, called edges. Each edge is directed from one vertex of G to another. An undirected graph is similar, except that the edges are not considered to have a direction. A number of square matrices are typically associated to a graph, the most elementary of which is the adjacency matrix.

Definition 1.3.1. Let G be a (directed) graph with n vertices labelled v_1, ..., v_n. The adjacency matrix A of G is the n × n matrix whose entries are given by

    A_{ij} = 1 if there is an edge directed from v_i to v_j in G
    A_{ij} = 0 otherwise

Example 1.3.2. A directed graph and its adjacency matrix.

[Figure: a directed graph on six vertices v_1, ..., v_6.]

        0 1 0 0 0 0
        1 0 1 0 0 0
    A = 0 1 0 0 1 0
        1 0 0 0 0 0
        0 0 1 1 0 1
        1 0 0 0 0 0

If u and v are vertices in a directed graph, a (directed) walk from u to v is a sequence of vertices that starts at u and finishes at v, with the property that every pair of consecutive entries is a directed edge. The length of a walk is the number of edges in it. In the example above, v_5, v_6, v_1, v_2, v_1, v_2, v_3 is a directed walk of length 6 from v_5 to v_3. The adjacency matrix has the interesting property that its powers count directed walks.
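This counting property can be checked numerically on the adjacency matrix of Example 1.3.2. A sketch in Python (the matmul helper is defined here for the example, not part of the text):

```python
# Sketch: powers of the adjacency matrix of Example 1.3.2 count directed walks.

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][l] * Y[l][j] for l in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0, 1, 0, 0, 0, 0],
     [1, 0, 1, 0, 0, 0],
     [0, 1, 0, 0, 1, 0],
     [1, 0, 0, 0, 0, 0],
     [0, 0, 1, 1, 0, 1],
     [1, 0, 0, 0, 0, 0]]

# A^2 counts walks of length 2: v1 -> v2 -> v3 is the only length-2 walk
# from v1 to v3, so the (1,3) entry of A^2 is 1.
A2 = matmul(A, A)
print(A2[0][2])  # 1

# A^6 counts walks of length 6; the walk v5, v6, v1, v2, v1, v2, v3 from
# the text shows that the (5,3) entry of A^6 is at least 1.
A6 = A2
for _ in range(4):
    A6 = matmul(A6, A)
print(A6[4][2] >= 1)  # True
```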

Theorem 1.3.3. Let G be a directed graph with adjacency matrix A (with respect to the ordering v_1, v_2, ..., v_n of the vertices). For every positive integer k, the (i,j) entry of A^k is the number of walks of length k from v_i to v_j in G.

Proof. By induction on k. The case k = 1 is just the definition of the adjacency matrix. So assume that the theorem is true for k = m − 1 and consider k = m. Then

(A^m)_{ij} = Σ_l A_{il} (A^{m−1})_{lj}.

Every walk of length m from v_i to v_j must start with an edge from v_i to some v_l, followed by a walk of length m − 1 from v_l to v_j. For each l, the entry A_{il} is 1 if (v_i, v_l) is an edge and 0 otherwise. By the induction hypothesis, (A^{m−1})_{lj} is the number of directed walks of length m − 1 from v_l to v_j in G. Thus for each vertex v_l, the integer A_{il}(A^{m−1})_{lj} is the number of walks of length m from v_i to v_j that have v_l as their second vertex. The sum over l of these is the total number of walks of length m from v_i to v_j in G. This completes the induction proof.

An undirected graph resembles a directed graph except that the edges are unordered pairs of vertices. The adjacency matrix of an undirected graph is symmetric.

Note that the adjacency matrix of a (directed or undirected) graph G depends not only on the graph itself but also on the choice of an ordering of the vertices. Suppose that σ is a permutation of the set {1, ..., n}. Let A be the adjacency matrix of G with respect to the ordering v_1, ..., v_n of the vertices, and let A′ be the adjacency matrix with respect to the ordering v_{σ(1)}, ..., v_{σ(n)}. Then A′ is obtained from A by

• first reordering the columns, replacing Column 1 with Column σ(1), Column 2 with Column σ(2), and so on. This means multiplying on the right by the matrix P_σ, in which each Column j has a 1 as its σ(j)-th entry and is otherwise full of zeros.

• then reordering the rows, replacing Row 1 with Row σ(1), Row 2 with Row σ(2), etc. This means multiplying A on the left by the matrix (P_σ)^T, which is also equal to P_σ^{−1}.

A permutation matrix is a matrix that has exactly one 1 in each row and column and is otherwise full of zeros. Every permutation matrix has the property that its inverse is equal to its transpose.

We have shown that adjacency matrices A and A′ represent the same graph if and only if A′ = P^T A P for some permutation matrix P. This relation is known as permutation equivalence; it is a special case of both similarity and congruence.

The adjacency matrix is one of a number of matrices often associated with a graph. We mention a few more.

Definition 1.3.4. Let G be an undirected graph with n vertices v_1, ..., v_n and m edges e_1, ..., e_m.

• The incidence matrix of G is the n × m matrix C that has a 1 in position (i,j) if the vertex v_i is incident with the edge e_j, and zeros elsewhere.

• An oriented incidence matrix of G is the n × m matrix B defined by first assigning a direction to every edge of G and then setting

    B_{ij} = 1 if v_i is the start vertex of e_j
    B_{ij} = −1 if v_i is the end vertex of e_j
    B_{ij} = 0 otherwise

The oriented incidence matrix depends on a choice of ordering of both the vertices and the edges, and on a choice of orientation of the edges.

Given matrices that are associated with graphs, a general philosophy is to consider how the properties of the matrix and the graph are related to each other. In the case of an oriented incidence matrix, the rank of the matrix tells us about the number of connected components in the graph.

Theorem 1.3.5. Let G be a simple graph with n vertices and m edges, and let B be an oriented incidence matrix of G. Then the rank of B is n − t, where t is the number of connected components of G.

Proof. First suppose that G is connected. This means that for any pair of vertices v_i and v_j in G, there exists a walk in G from v_i to v_j. We consider the left nullspace N of the matrix B. Note that every column of B has one entry equal to 1, one equal to −1, and is otherwise full of zeros. This means that the vector (1 1 ... 1) belongs to the left nullspace of B, so this nullspace is at least 1-dimensional.

Suppose that (a_1 a_2 ... a_n) is a non-zero element of N, choose k for which a_k ≠ 0, and write a_k = α. Then (a_1 a_2 ... a_n) must satisfy (a_1 a_2 ... a_n)v = 0 for every column v of B, and in particular for those columns corresponding to edges that are incident with the vertex v_k. It follows that a_i = a_k = α whenever v_i is adjacent to v_k. Now, by the same reasoning applied to the neighbours of v_k, we must have a_j = α whenever v_j is adjacent to a neighbour of v_k. Since G is connected, repetition of this step reaches all vertices of G, and we conclude that a_i = α for all i and that N is a 1-dimensional space. Thus n = 1 + rank(B) and rank(B) = n − 1.

Now suppose that G has t connected components, with n_1, n_2, ..., n_t vertices and m_1, m_2, ..., m_t edges respectively. By ordering the vertices and edges by component, we can arrange that B has a rectangular block diagonal structure with an n_1 × m_1 block in the upper left, and so on. Each block is an oriented incidence matrix of a connected component of G, and so a block with n_i vertices has rank n_i − 1. It follows that the total rank is

(n_1 − 1) + (n_2 − 1) + ··· + (n_t − 1) = n − t.
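Theorem 1.3.5 can be illustrated numerically. The sketch below (Python; the particular graph and the elimination-based rank routine are assumptions for illustration) builds an oriented incidence matrix for a graph with 5 vertices and 2 connected components and checks that its rank is n − t = 3.

```python
# Sketch: the rank of an oriented incidence matrix equals n - t.
from fractions import Fraction

def rank(M):
    """Rank of a matrix (list of rows) by Gaussian elimination over Q."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):
            factor = M[i][c] / M[r][c]
            M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Hypothetical graph: components {v1, v2, v3} (edges v1v2, v2v3) and
# {v4, v5} (edge v4v5); each edge oriented from its lower-numbered vertex.
n, t = 5, 2
B = [[ 1,  0,  0],   # v1
     [-1,  1,  0],   # v2
     [ 0, -1,  0],   # v3
     [ 0,  0,  1],   # v4
     [ 0,  0, -1]]   # v5

print(rank(B), n - t)  # 3 3
```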

Theorem 1.3.6. Let B be an oriented incidence matrix of an undirected simple graph G. Then

BB^T = D − A,

where D is the diagonal matrix whose diagonal entries are the total degrees of the vertices, and A is the adjacency matrix of G.

Proof. For i = 1, ..., n, the entry in the (i,i) position of BB^T is just the ordinary scalar product of Row i of B with itself. Since every entry of this row is 1 or −1 or 0, this scalar product is the number of non-zero entries in Row i, which is the total degree of the vertex v_i.

Note that each column of B has exactly two non-zero entries, which are equal to 1 and −1. For i ≠ j, the entry in the (i,j) position of BB^T is the scalar product of Row i and Row j of B. If this is not zero, it means that there is a column whose only two non-zero entries occur in positions i and j, which means exactly that v_i v_j is an edge in G. There can be at most one such column, since there are no multiple edges in G. So the (i,j) entry of BB^T is −1 if v_i and v_j are adjacent in G and 0 otherwise. We conclude that BB^T = D − A.

Note that a consequence of Theorem 1.3.6 is that the matrix BB^T does not depend on the choice of orientation of the edges.

Definition 1.3.7. Let G be an undirected graph with adjacency matrix A, and let D be the diagonal matrix of vertex degrees. The matrix L = D − A is called the Laplacian matrix of G. Its entries on the main diagonal are the degrees of the vertices of G. Away from the main diagonal, the entry in position (i,j) is −1 or 0 according to whether v_i and v_j are adjacent or not.

Properties of the Laplacian matrix of a graph G carry extensive information about properties of G itself. Moreover, as a real symmetric matrix it enjoys various special properties. For instance, it is a consequence of the following lemma that the rank of L tells us the number of connected components of G.
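Theorem 1.3.6 and the definition of the Laplacian can be checked on a small example; the triangle graph below, and the matmul/transpose helpers, are assumptions for illustration (Python):

```python
# Sketch: check BB^T = D - A for the triangle on vertices v1, v2, v3,
# with each edge oriented from its lower-numbered vertex.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(col) for col in zip(*X)]

# Oriented incidence matrix: columns are the edges v1v2, v1v3, v2v3.
B = [[ 1,  1,  0],
     [-1,  0,  1],
     [ 0, -1, -1]]

A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]        # adjacency matrix
D = [[2, 0, 0], [0, 2, 0], [0, 0, 2]]        # degree matrix
L = [[D[i][j] - A[i][j] for j in range(3)] for i in range(3)]

print(matmul(B, transpose(B)) == L)  # True
```

Reversing the orientation of any edge flips the sign of a whole column of B, which leaves BB^T unchanged, in line with the note above.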

Lemma 1.3.8. Suppose that A ∈ M_{n×p}(R). Then the rank of the n × n matrix AA^T is equal to the rank of A.

Proof. That rank(AA^T) ≤ rank(A) is clear, since every column of AA^T is a real linear combination of the columns of A. We show now that in this special case, the right nullspace of AA^T is equal to the right nullspace of A^T. Suppose that A^T v ≠ 0 for some v ∈ R^n. Since A^T v is a non-zero vector in R^p, its scalar product with itself is positive, so

(A^T v)^T A^T v = v^T AA^T v ≠ 0.

Thus AA^T v ≠ 0, and v does not belong to the right nullspace of AA^T. Then the right nullspaces of A^T and AA^T coincide and have the same dimension d, and the ranks of AA^T and A^T (and A) are all equal to n − d.

As a real symmetric matrix, L has the property that its eigenvalues are all real.

Lemma 1.3.9. Let A be a complex Hermitian n × n matrix and let λ be a complex eigenvalue of A. Then λ ∈ R.

Note: that A is Hermitian means that A = A*, where A* denotes the Hermitian conjugate of A, whose entries are the complex conjugates of the entries of A^T. A real symmetric matrix is a special case of a complex Hermitian matrix.
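The key step in the proof of Lemma 1.3.8, that the right nullspaces of A^T and AA^T coincide, can be seen concretely. In the Python sketch below, the matrix A and the vectors v and w are illustrative choices, and matvec is a helper defined here:

```python
# Sketch: the right nullspaces of A^T and AA^T coincide (Lemma 1.3.8).

A = [[1, 2],
     [2, 4],
     [0, 1]]

def matvec(M, v):
    """Apply a matrix (list of rows) to a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

AT  = [list(col) for col in zip(*A)]                       # 2 x 3
AAT = [[sum(A[i][k] * A[j][k] for k in range(2)) for j in range(3)]
       for i in range(3)]                                   # 3 x 3

v = [-2, 1, 0]                 # A^T v = 0, since Row 2 of A is 2 * Row 1
print(matvec(AT, v))           # [0, 0]
print(matvec(AAT, v))          # [0, 0, 0]

w = [0, 0, 1]                  # A^T w = (0, 1) is non-zero ...
print(matvec(AAT, w))          # ... and indeed AA^T w = [2, 4, 1] is non-zero
```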

Proof. Since A is Hermitian we have, for any vector v ∈ C^n, that v*Av ∈ R, since

(v*Av)* = v*A*v = v*Av.

Thus v*Av is a complex number that is equal to its own complex conjugate, hence it is real. Now let u ∈ C^n be an eigenvector corresponding to λ. Then

u*Au ∈ R ⟹ u*λu ∈ R ⟹ λu*u ∈ R.

Since u*u ∈ R (it is the sum of the entries of u each multiplied by its own complex conjugate), it follows that λ ∈ R also.
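A quick numerical illustration of Lemma 1.3.9: for a 2 × 2 Hermitian matrix (the particular matrix below is an illustrative assumption), the eigenvalues computed from the quadratic characteristic polynomial come out real.

```python
# Sketch: the eigenvalues of a 2x2 Hermitian matrix are real.
import cmath

A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]        # A equals its own Hermitian conjugate

trace = A[0][0] + A[1][1]                       # 5
det   = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 6 - 2 = 4

# Eigenvalues are the roots of x^2 - trace*x + det; here the
# discriminant is a non-negative real number, so both roots are real.
disc = trace * trace - 4 * det                  # 9
lam1 = (trace + cmath.sqrt(disc)) / 2
lam2 = (trace - cmath.sqrt(disc)) / 2

print(lam1, lam2)  # (4+0j) (1+0j)
```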

If G is a graph, it is a consequence of Lemma 1.3.9 that the eigenvalues of the Laplacian matrix L of G are real numbers. In fact they are all non-negative, for let v be an eigenvector of L corresponding to the eigenvalue λ, and let B be an oriented incidence matrix of G. Then

v^T Lv = v^T λv = λ v^T v.

On the other hand,

v^T Lv = v^T BB^T v = (v^T B)(B^T v) = (B^T v)^T (B^T v).

Since v^T v is a positive real number and (B^T v)^T (B^T v) is a non-negative real number, it follows that λ ≥ 0.

How the eigenvalues of L are related to properties of G is one of the themes of spectral graph theory. We will be able to prove the following statements.

1. We know that 0 occurs at least once as an eigenvalue of L. We will show that it occurs exactly once if and only if G is connected.

2. If G is connected, let µ be the least positive eigenvalue of L. This number is called the algebraic connectivity of G. We will show that it can be considered as a measure of how robustly connected the graph is. It is bounded above by the vertex connectivity, which is the least number of vertices whose removal disconnects G.

3. The determinant of the (n−1) × (n−1) submatrix of L obtained by deleting Row i and Column i (for any i) is the number of spanning trees in G. A subgraph of G is a spanning tree if it involves all the vertices of G, is connected, and has no cycles.
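Statement 3 can be previewed on a small example. The complete graph K4 (an illustrative choice) has 4^{4−2} = 16 spanning trees by Cayley's formula, and deleting Row 1 and Column 1 of its Laplacian leaves a 3 × 3 matrix whose determinant is indeed 16. A Python sketch, where det is a naive cofactor-expansion helper defined here:

```python
# Sketch: spanning-tree count for K4 from a minor of its Laplacian.

def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:]
                                          for row in M[1:]])
               for j in range(len(M)))

# Laplacian of K4: degree 3 on the diagonal, -1 off the diagonal.
L = [[3 if i == j else -1 for j in range(4)] for i in range(4)]

# Delete Row 1 and Column 1 (index 0).
minor = [row[1:] for row in L[1:]]

print(det(minor))  # 16, the number of spanning trees of K4
```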

We will return to these statements later, after some investigations of eigenvectors and eigenvalues, in general and for the special case of symmetric matrices.
