
Part IB — Linear Algebra (Definitions)

Based on lectures by S. J. Wadsley
Notes taken by Dexter Chua

Michaelmas 2015

These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures. They are nowhere near accurate representations of what was actually lectured, and in particular, all errors are almost surely mine.

Definition of a vector space (over R or C), subspaces, the space spanned by a subset. Linear independence, bases, dimension. Direct sums and complementary subspaces. [3]

Linear maps, isomorphisms. Relation between rank and nullity. The space of linear maps from U to V, representation by matrices. Change of basis. Row rank and column rank. [4]

Determinant and trace of a square matrix. Determinant of a product of two matrices and of the inverse matrix. Determinant of an endomorphism. The adjugate matrix. [3]

Eigenvalues and eigenvectors. Diagonal and triangular forms. Characteristic and minimal polynomials. Cayley-Hamilton Theorem over C. Algebraic and geometric multiplicity of eigenvalues. Statement and illustration of Jordan normal form. [4]

Dual of a finite-dimensional vector space, dual bases and maps. Matrix representation, rank and determinant of dual map. [2]

Bilinear forms. Matrix representation, change of basis. Symmetric forms and their link with quadratic forms. Diagonalisation of quadratic forms. Law of inertia, classification by rank and signature. Complex Hermitian forms. [4]

Inner product spaces, orthonormal sets, orthogonal projection, V = W ⊕ W⊥. Gram-Schmidt orthogonalisation. Adjoints. Diagonalisation of Hermitian matrices. Orthogonality of eigenvectors and properties of eigenvalues. [4]


Contents

0 Introduction

1 Vector spaces
1.1 Definitions and examples
1.2 Linear independence, bases and the Steinitz exchange lemma
1.3 Direct sums

2 Linear maps
2.1 Definitions and examples
2.2 Linear maps and matrices
2.3 The first isomorphism theorem and the rank-nullity theorem
2.4 Change of basis
2.5 Elementary matrix operations

3 Duality
3.1 Dual space
3.2 Dual maps

4 Bilinear forms I

5 Determinants of matrices

6 Endomorphisms
6.1 Invariants
6.2 The minimal polynomial
6.2.1 Aside on polynomials
6.2.2 Minimal polynomial
6.3 The Cayley-Hamilton theorem
6.4 Multiplicities of eigenvalues and Jordan normal form

7 Bilinear forms II
7.1 Symmetric bilinear forms and quadratic forms
7.2 Hermitian form

8 Inner product spaces
8.1 Definitions and basic properties
8.2 Gram-Schmidt orthogonalization
8.3 Adjoints, orthogonal and unitary maps
8.4 Spectral theory


0 Introduction


1 Vector spaces

1.1 Definitions and examples

Notation. We will use F to denote an arbitrary field, usually R or C.

Definition (Vector space). An F-vector space is an (additive) abelian group V together with a function F × V → V, written (λ, v) ↦ λv, such that

(i) λ(µv) = (λµ)v for all λ, µ ∈ F, v ∈ V (associativity)
(ii) λ(u + v) = λu + λv for all λ ∈ F, u, v ∈ V (distributivity in V)
(iii) (λ + µ)v = λv + µv for all λ, µ ∈ F, v ∈ V (distributivity in F)
(iv) 1v = v for all v ∈ V (identity)

We always write 0 for the additive identity in V. By abuse of notation, we also write 0 for the trivial vector space {0}.

Definition (Subspace). If V is an F-vector space, then U ⊆ V is an (F-linear) subspace if

(i) u, v ∈ U implies u + v ∈ U;
(ii) u ∈ U, λ ∈ F implies λu ∈ U;
(iii) 0 ∈ U.

These conditions can be expressed more concisely as "U is non-empty and if λ, µ ∈ F, u, v ∈ U, then λu + µv ∈ U". Alternatively, U is a subspace of V if it is itself a vector space, inheriting the operations from V. We sometimes write U ≤ V if U is a subspace of V.

Definition (Sum of subspaces). Suppose U, W are subspaces of an F-vector space V. The sum of U and W is

U + W = {u + w : u ∈ U, w ∈ W}.

Definition (Quotient spaces). Let V be a vector space, and U ⊆ V a subspace. Then the quotient group V/U can be made into a vector space called the quotient space, where scalar multiplication is given by λ(v + U) = (λv) + U. This is well-defined since if v + U = w + U ∈ V/U, then v − w ∈ U. Hence for λ ∈ F, we have λv − λw ∈ U. So λv + U = λw + U.

1.2 Linear independence, bases and the Steinitz exchange lemma

Definition (Span). Let V be a vector space over F and S ⊆ V. The span of S is defined as

⟨S⟩ = {λ1 s1 + ··· + λn sn : λi ∈ F, si ∈ S, n ≥ 0}.

This is the smallest subspace of V containing S. Note that the sums must be finite. We will not play with infinite sums, since the notion of convergence is not even well defined in a general vector space.
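
To make this concrete, here is a small numerical check, my own addition rather than anything from the lectures: over R, a vector lies in ⟨S⟩ exactly when appending it to S does not increase the rank of the matrix whose columns are the vectors of S.

```python
import numpy as np

def in_span(v, S, tol=1e-10):
    """Check whether v lies in <S> over R, where S is a list of vectors.

    v is in the span iff appending it does not increase the rank.
    (Numerical rank via SVD; tol guards against floating-point noise.)
    """
    rank_S = np.linalg.matrix_rank(np.column_stack(S), tol=tol)
    rank_Sv = np.linalg.matrix_rank(np.column_stack(S + [v]), tol=tol)
    return rank_Sv == rank_S

S = [np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])]
print(in_span(np.array([2.0, 3.0, 5.0]), S))   # True: 2*s1 + 3*s2
print(in_span(np.array([0.0, 0.0, 1.0]), S))   # False
```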


Definition (Spanning set). Let V be a vector space over F and S ⊆ V. S spans V if ⟨S⟩ = V.

Definition (Linear independence). Let V be a vector space over F and S ⊆ V. Then S is linearly independent if whenever

λ1 s1 + λ2 s2 + ··· + λn sn = 0 with λi ∈ F and s1, s2, ··· , sn ∈ S distinct,

we must have λi = 0 for all i. If S is not linearly independent, we say it is linearly dependent.
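
Again as an illustrative aside (not from the lectures): a finite list of vectors in F^n = R^n is linearly independent precisely when the matrix with those vectors as columns has rank equal to the number of vectors, which is easy to test numerically.

```python
import numpy as np

def linearly_independent(S, tol=1e-10):
    """S (a list of vectors in R^n) is linearly independent iff the
    matrix with the vectors of S as columns has rank |S|."""
    return np.linalg.matrix_rank(np.column_stack(S), tol=tol) == len(S)

print(linearly_independent([np.array([1.0, 0.0]), np.array([1.0, 1.0])]))  # True
print(linearly_independent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # False
```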

Definition (Basis). Let V be a vector space over F and S ⊆ V. Then S is a basis for V if S is linearly independent and spans V.

Definition (Finite dimensional). A vector space is finite dimensional if there is a finite basis.

Definition (Dimension). If V is a vector space over F with finite basis S, then the dimension of V is

dim V = dimF V = |S|.

1.3 Direct sums

Definition ((Internal) direct sum). Suppose V is a vector space over F and U, W ⊆ V are subspaces. We say that V is the (internal) direct sum of U and W if

(i) U + W = V;
(ii) U ∩ W = 0.

We write V = U ⊕ W. Equivalently, this requires that every v ∈ V can be written uniquely as u + w with u ∈ U, w ∈ W. We say that U and W are complementary subspaces of V.
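
A small made-up example, sketched in numpy: R² = U ⊕ W for U = ⟨(1, 1)⟩ and W = ⟨(1, −1)⟩, and the unique decomposition of a vector can be found by solving a linear system.

```python
import numpy as np

# By rank counting, U + W = R^2 and U ∩ W = 0 exactly when
# dim U + dim W = dim(U + W) = 2.
U = np.array([[1.0], [1.0]])        # basis of U as a column
W = np.array([[1.0], [-1.0]])       # basis of W as a column
sum_basis = np.hstack([U, W])

dim_U = np.linalg.matrix_rank(U)
dim_W = np.linalg.matrix_rank(W)
dim_sum = np.linalg.matrix_rank(sum_basis)
print(dim_sum == 2 and dim_U + dim_W == dim_sum)   # True: R^2 = U ⊕ W

# Unique decomposition of v = (3, 1): solve [U | W] c = v.
c = np.linalg.solve(sum_basis, np.array([3.0, 1.0]))
print(c)                            # [2. 1.], i.e. v = 2(1,1) + 1(1,-1)
```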

Definition ((External) direct sum). If U, W are vector spaces over F, the (external) direct sum is

U ⊕ W = {(u, w) : u ∈ U, w ∈ W},

with addition and scalar multiplication defined componentwise:

(u1, w1) + (u2, w2) = (u1 + u2, w1 + w2),   λ(u, w) = (λu, λw).

Definition ((Multiple) (internal) direct sum). If U1, ··· , Un ⊆ V are subspaces of V, then V is the (internal) direct sum

V = U1 ⊕ ··· ⊕ Un

if every v ∈ V can be written uniquely as v = u1 + ··· + un with ui ∈ Ui. This can be extended to an infinite sum with the same definition, just noting that the sum v = ∑ ui has to be finite.


Definition ((Multiple) (external) direct sum). If U1, ··· , Un are vector spaces over F, the external direct sum is

U1 ⊕ ··· ⊕ Un = {(u1, ··· , un) : ui ∈ Ui},

with pointwise operations. This can be made into an infinite sum if we require that all but finitely many of the ui have to be zero.


2 Linear maps

2.1 Definitions and examples

Definition (Linear map). Let U, V be vector spaces over F. Then α : U → V is a linear map if

(i) α(u1 + u2) = α(u1) + α(u2) for all u1, u2 ∈ U;
(ii) α(λu) = λα(u) for all λ ∈ F, u ∈ U.

We write L(U, V) for the set of linear maps U → V.

Definition (Isomorphism). We say a linear map α : U → V is an isomorphism if there is some β : V → U (also linear) such that α ∘ β = idV and β ∘ α = idU. If there exists an isomorphism U → V, we say U and V are isomorphic, and write U ≅ V.

Definition (Image and kernel). Let α : U → V be a linear map. Then the image of α is

im α = {α(u) : u ∈ U}.

The kernel of α is

ker α = {u ∈ U : α(u) = 0}.

2.2 Linear maps and matrices

Definition (Matrix representation). We call the matrix corresponding to a linear map α ∈ L(U, V) under the correspondence L(U, V) ≅ Matn,m(F) (a corollary in the full notes) the matrix representing α with respect to the bases (e1, ··· , em) and (f1, ··· , fn).

2.3 The first isomorphism theorem and the rank-nullity theorem

Definition (Rank and nullity). If α : U → V is a linear map between finite-dimensional vector spaces over F (in fact we just need U to be finite-dimensional), the rank of α is the number r(α) = dim im α. The nullity of α is the number n(α) = dim ker α.
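
As a numerical aside (the matrix below is made up), the rank-nullity theorem r(α) + n(α) = dim U can be checked directly; scipy's null_space returns a basis of the kernel.

```python
import numpy as np
from scipy.linalg import null_space   # orthonormal basis of the kernel

# alpha : R^4 -> R^3 given by a matrix A whose third row is the
# sum of the first two, so r(alpha) = 2.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 2.0]])

rank = np.linalg.matrix_rank(A)       # r(alpha) = dim im alpha
nullity = null_space(A).shape[1]      # n(alpha) = dim ker alpha
print(rank, nullity, rank + nullity == A.shape[1])   # 2 2 True
```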

2.4 Change of basis

Definition (Equivalent matrices). We say A, B ∈ Matn,m(F) are equivalent if there are invertible matrices P ∈ Matm(F), Q ∈ Matn(F) such that B = Q⁻¹AP.

Definition (Column and row rank). If A ∈ Matn,m(F), then

– The column rank of A, written r(A), is the dimension of the subspace of F^n spanned by the columns of A.

– The row rank of A, written r(A), is the dimension of the subspace of F^m spanned by the rows of A. Alternatively, it is the column rank of A^T.
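
A quick sanity check, not a proof: row rank equals column rank, so r(A) = r(A^T) for any matrix.

```python
import numpy as np

# Numerical illustration of row rank = column rank on a random matrix.
rng = np.random.default_rng(42)
A = rng.standard_normal((4, 6))
print(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T))   # True
```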


2.5 Elementary matrix operations

Definition (Elementary matrices). We call the following matrices of GLn(F) elementary matrices:

– S^n_ij: the identity matrix with the ith and jth rows swapped, i.e. with the (i, i) and (j, j) entries 0 and the (i, j) and (j, i) entries 1. This is called a reflection, where the rows we changed are the ith and jth rows.

– E^n_ij(λ), for i ≠ j: the identity matrix with an extra entry λ in the (i, j) position. This is called a shear, where λ appears at the (i, j)th entry.

– T^n_i(λ), for λ ≠ 0: the identity matrix with the (i, i) entry replaced by λ. This is a dilation, scaling the ith coordinate by λ.
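
These are easy to build and play with; the numpy sketch below (0-indexed, my own addition) shows each elementary matrix acting on the rows of a test matrix by left multiplication.

```python
import numpy as np

def S(n, i, j):
    """Reflection: identity with rows i and j swapped."""
    M = np.eye(n)
    M[[i, j]] = M[[j, i]]
    return M

def E(n, i, j, lam):
    """Shear: identity plus lam in the (i, j) entry, i != j."""
    M = np.eye(n)
    M[i, j] = lam
    return M

def T(n, i, lam):
    """Dilation: identity with the (i, i) entry replaced by lam != 0."""
    M = np.eye(n)
    M[i, i] = lam
    return M

A = np.arange(9.0).reshape(3, 3)
print(S(3, 0, 1) @ A)     # swaps the first two rows of A
print(E(3, 0, 2, 5) @ A)  # adds 5 times row 2 to row 0
print(T(3, 1, -1) @ A)    # multiplies row 1 by -1
```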


3 Duality

3.1 Dual space

Definition (Dual space). Let V be a vector space over F. The dual of V is defined as

V* = L(V, F) = {θ : V → F : θ linear}.

Elements of V* are called linear functionals or linear forms.

Definition (Annihilator). Let U ⊆ V. Then the annihilator of U is

U⁰ = {θ ∈ V* : θ(u) = 0, ∀u ∈ U}.

If W ⊆ V*, then the annihilator of W is

W⁰ = {v ∈ V : θ(v) = 0, ∀θ ∈ W}.

3.2 Dual maps

Definition (Dual map). Let V, W be vector spaces over F and α ∈ L(V, W). The dual map to α, written α* : W* → V*, is given by θ ↦ θ ∘ α. Since the composite of linear maps is linear, α*(θ) ∈ V*. So this is a genuine map.
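
As an illustration of what the dual map does in coordinates (an aside, with the usual convention that a functional is stored as a row vector): with respect to dual bases, α* is represented by the transpose of the matrix representing α.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])             # alpha : R^2 -> R^3

theta_row = np.array([1.0, -1.0, 2.0])  # theta in (R^3)*, as a row vector

# alpha*(theta) = theta ∘ alpha is the functional v |-> theta(alpha(v)),
# whose row vector is theta_row @ A, i.e. A^T applied to theta_row.
v = np.array([1.0, 1.0])
print(theta_row @ (A @ v))              # theta(alpha(v)) = 18.0
print((A.T @ theta_row) @ v)            # same number: dual map as A^T
```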


4 Bilinear forms I

Definition (Bilinear form). Let V, W be vector spaces over F. Then a function φ : V × W → F is a bilinear form if it is linear in each variable, i.e. for each v ∈ V, φ(v, · ) : W → F is linear, and for each w ∈ W, φ( · , w) : V → F is linear.

Definition (Matrix representing bilinear form). Let (e1, ··· , en) be a basis for V and (f1, ··· , fm) be a basis for W, and let ψ : V × W → F be a bilinear form. Then the matrix A representing ψ with respect to these bases is defined by

Aij = ψ(ei, fj).

Definition (Left and right kernel). The kernel of ψL is the left kernel of ψ, while the kernel of ψR is the right kernel of ψ.

Definition (Non-degenerate). ψ is non-degenerate if the left and right kernels are both trivial. We say ψ is degenerate otherwise.

Definition (Rank of bilinear form). If ψ : V × W → F is a bilinear form on finite-dimensional vector spaces, then the rank of ψ is the rank of any matrix representing ψ. This is well-defined since r(P^T AQ) = r(A) if P and Q are invertible. Alternatively, it is the rank of ψL (or ψR).
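
Here is a small numerical illustration of this well-definedness (the matrices are made up): changes of basis replace A by P^T AQ with P, Q invertible, which does not change the rank.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank 1: second row is twice the first

rng = np.random.default_rng(1)
P = rng.standard_normal((2, 2))      # invertible with probability 1
Q = rng.standard_normal((3, 3))

print(np.linalg.matrix_rank(A))            # 1
print(np.linalg.matrix_rank(P.T @ A @ Q))  # 1 as well
```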


5 Determinants of matrices

Definition (Determinant). Let A ∈ Matn,n(F). Its determinant is

det A = ∑σ∈Sn ε(σ) A1σ(1) A2σ(2) ··· Anσ(n).
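
Translating the definition literally into code gives an O(n · n!) algorithm, useful only as an illustration (my addition); it agrees with numpy's determinant.

```python
import numpy as np
from itertools import permutations

def sign(sigma):
    """Sign eps(sigma) of a permutation tuple: (-1)^(number of inversions)."""
    inversions = sum(1 for i in range(len(sigma))
                       for j in range(i + 1, len(sigma))
                       if sigma[i] > sigma[j])
    return -1 if inversions % 2 else 1

def det_leibniz(A):
    """Determinant straight from the definition:
    sum over S_n of eps(sigma) * prod_i A[i, sigma(i)]."""
    n = A.shape[0]
    return sum(sign(sigma) * np.prod([A[i, sigma[i]] for i in range(n)])
               for sigma in permutations(range(n)))

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(det_leibniz(A), np.linalg.det(A))   # both ~ 8.0
```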

Definition (Volume form). A volume form on F^n is a function d : F^n × ··· × F^n → F that is

(i) Multilinear, i.e. for all i and all v1, ··· , vi−1, vi+1, ··· , vn ∈ F^n, we have

d(v1, ··· , vi−1, · , vi+1, ··· , vn) ∈ (F^n)*.

(ii) Alternating, i.e. if vi = vj for some i ≠ j, then

d(v1, ··· , vn) = 0.

Definition (Singular matrices). A matrix A is singular if det A = 0. Otherwise, it is non-singular.

Notation. Write Aˆij for the matrix obtained from A by deleting the ith row and jth column.

Definition (Adjugate matrix). Let A ∈ Matn(F). The adjugate matrix of A, written adj A, is the n × n matrix such that (adj A)ij = (−1)^{i+j} det Âji.
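
A direct implementation of this definition (an aside, 0-indexed) also verifies the classical identity A · adj A = (det A) I.

```python
import numpy as np

def adjugate(A):
    """(adj A)_{ij} = (-1)^{i+j} det(A-hat_{ji}), where A-hat_{ji}
    deletes row j and column i (0-indexed here)."""
    n = A.shape[0]
    adj = np.empty_like(A)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
            adj[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return adj

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(adjugate(A))        # [[ 4. -2.] [-3.  1.]]
print(A @ adjugate(A))    # det(A) * I = -2 * I
```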


6 Endomorphisms

6.1 Invariants

Definition (Endomorphism). Let V be a (finite-dimensional) vector space over F. An endomorphism of V is a linear map α : V → V. We write End(V) for the F-vector space of all such linear maps, and ι for the identity map V → V.

Definition (Similar matrices). We say matrices A and B are similar or conjugate if there is some invertible P such that B = P⁻¹AP.

Definition (Trace). The trace of a matrix A ∈ Matn(F) is defined by

tr A = ∑_{i=1}^n Aii.

Definition (Trace and determinant of endomorphism). Let α ∈ End(V), and A be a matrix representing α under any basis. Then the trace of α is tr α = tr A, and the determinant is det α = det A.

Definition (Eigenvalue and eigenvector). Let α ∈ End(V). Then λ ∈ F is an eigenvalue (or E-value) if there is some v ∈ V \ {0} such that αv = λv. v is an eigenvector if α(v) = λv for some λ ∈ F. When λ ∈ F, the λ-eigenspace, written Eα(λ) or E(λ), is the subspace of V containing all the λ-eigenvectors, i.e.

Eα(λ) = ker(λι − α),

where ι is the identity map.

Definition (Characteristic polynomial). The characteristic polynomial of α is defined by

χα(t) = det(tι − α).

Definition (Diagonalizable). We say α ∈ End(V) is diagonalizable if there is some basis for V such that α is represented by a diagonal matrix, i.e. all terms not on the diagonal are zero.
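
The following aside (example matrix made up) computes eigenvalues, the characteristic polynomial and a diagonalisation with numpy; note that np.poly applied to a square matrix returns the coefficients of χ_A(t).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

evals, evecs = np.linalg.eig(A)
print(evals)                      # eigenvalues 3 and 1 (order may vary)
print(np.poly(A))                 # [1. -4. 3.]: chi_A(t) = t^2 - 4t + 3

# A is diagonalizable: P^{-1} A P is diagonal with the eigenvalues.
P = evecs                         # eigenvectors as columns
print(np.linalg.inv(P) @ A @ P)   # approximately diag(3, 1)
```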

6.2 The minimal polynomial

6.2.1 Aside on polynomials

Definition (Polynomial). A polynomial over F is an object of the form

f(t) = am t^m + am−1 t^{m−1} + ··· + a1 t + a0,

with m ≥ 0 and a0, ··· , am ∈ F. We write F[t] for the set of polynomials over F.

Definition (Degree). Let f ∈ F[t]. Then the degree of f, written deg f, is the largest n such that an ≠ 0. In particular, deg 0 = −∞.

Definition (Multiplicity of a root). Let f ∈ F[t] and λ a root of f. We say λ has multiplicity k if (t − λ)^k is a factor of f but (t − λ)^{k+1} is not, i.e.

f(t) = (t − λ)^k g(t)

for some g(t) ∈ F[t] with g(λ) ≠ 0.


6.2.2 Minimal polynomial

Notation. Given f(t) = ∑_{i=0}^m ai t^i ∈ F[t], A ∈ Matn(F) and α ∈ End(V), we can write

f(A) = ∑_{i=0}^m ai A^i,   f(α) = ∑_{i=0}^m ai α^i,

where A^0 = I and α^0 = ι.

Definition (Minimal polynomial). The minimal polynomial of α ∈ End(V) is the non-zero monic polynomial Mα(t) of least degree such that Mα(α) = 0.

6.3 The Cayley-Hamilton theorem

Definition (Triangulable). An endomorphism α ∈ End(V) is triangulable if there is a basis for V such that α is represented by an upper triangular matrix

[ a11  a12  ···  a1n ]
[  0   a22  ···  a2n ]
[  ⋮    ⋮    ⋱    ⋮  ]
[  0    0   ···  ann ],

i.e. one whose entries below the diagonal are all zero.
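
The Cayley-Hamilton theorem says χα(α) = 0. A numerical illustration, not a proof: evaluate the characteristic polynomial at the matrix itself via Horner's rule.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

coeffs = np.poly(A)               # chi_A(t) = t^2 - 5t - 2 -> [1, -5, -2]
n = A.shape[0]
result = np.zeros_like(A)
for c in coeffs:                  # Horner's rule with a matrix argument
    result = result @ A + c * np.eye(n)
print(np.allclose(result, 0))     # True: chi_A(A) is the zero matrix
```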

6.4 Multiplicities of eigenvalues and Jordan normal form

Definition (Algebraic and geometric multiplicity). Let α ∈ End(V) and λ an eigenvalue of α. Then

(i) The algebraic multiplicity of λ, written aλ, is the multiplicity of λ as a root of χα(t).

(ii) The geometric multiplicity of λ, written gλ, is the dimension of the corresponding eigenspace, dim Eα(λ).

(iii) cλ is the multiplicity of λ as a root of the minimal polynomial Mα(t).

Definition (Jordan normal form). We say A ∈ Matn(C) is in Jordan normal form if it is a block diagonal matrix

A = diag(Jn1(λ1), Jn2(λ2), ··· , Jnk(λk)),

where k ≥ 1, n1, ··· , nk ∈ N with n = ∑ ni, the λ1, ··· , λk are not necessarily distinct, and Jm(λ) is the m × m matrix with λ in every diagonal entry, 1 in every entry immediately above the diagonal, and 0 elsewhere. Note that Jm(λ) = λIm + Jm(0).

Definition (Nilpotent). We say α ∈ End(V) is nilpotent if there is some r such that α^r = 0.
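
A quick sketch of both definitions (my addition): build Jm(λ) in numpy and check that Jm(0) is nilpotent, with Jm(0)^m = 0.

```python
import numpy as np

def jordan_block(m, lam):
    """J_m(lambda): lambda on the diagonal, 1 just above it, 0 elsewhere."""
    return lam * np.eye(m) + np.eye(m, k=1)

print(jordan_block(3, 2.0))

# J_m(0) is nilpotent: each power shifts the superdiagonal of ones up,
# so the m-th power vanishes.
N = jordan_block(4, 0.0)
print(np.linalg.matrix_power(N, 4))   # the zero matrix
```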


7 Bilinear forms II

7.1 Symmetric bilinear forms and quadratic forms

Definition (Symmetric bilinear form). Let V be a vector space over F. A bilinear form φ : V × V → F is symmetric if φ(v, w) = φ(w, v) for all v, w ∈ V.

Definition (Congruent matrices). Two square matrices A, B are congruent if there exists some invertible P such that B = P^T AP.

Definition (Quadratic form). A function q : V → F is a quadratic form if there exists some bilinear form φ such that q(v) = φ(v, v) for all v ∈ V.

Definition (Positive/negative (semi-)definite). Let φ be a symmetric bilinear form on a finite-dimensional real vector space V. We say

(i) φ is positive definite if φ(v, v) > 0 for all v ∈ V \ {0};
(ii) φ is positive semi-definite if φ(v, v) ≥ 0 for all v ∈ V;
(iii) φ is negative definite if φ(v, v) < 0 for all v ∈ V \ {0};
(iv) φ is negative semi-definite if φ(v, v) ≤ 0 for all v ∈ V.

Definition (Signature). The signature of a symmetric bilinear form φ is the number p − q, where p is the number of positive entries and q the number of negative entries in any diagonal matrix representing φ.
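
As an aside (the matrix is made up): for a real symmetric matrix representing φ, diagonalising orthogonally exhibits p positive and q negative entries, so the signature can be read off the signs of the eigenvalues via np.linalg.eigvalsh.

```python
import numpy as np

A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, -2.0]])     # symmetric, representing some phi

evals = np.linalg.eigvalsh(A)        # real eigenvalues of a symmetric matrix
p = int(np.sum(evals > 1e-10))
q = int(np.sum(evals < -1e-10))
print(p, q, "signature:", p - q)     # 1 2 signature: -1
```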

7.2 Hermitian form

Definition (Sesquilinear form). Let V, W be complex vector spaces. Then a sesquilinear form is a function φ : V × W → C such that

(i) φ(λv1 + µv2, w) = λ̄φ(v1, w) + µ̄φ(v2, w);
(ii) φ(v, λw1 + µw2) = λφ(v, w1) + µφ(v, w2)

for all v, v1, v2 ∈ V, w, w1, w2 ∈ W and λ, µ ∈ C, where λ̄ denotes the complex conjugate of λ.

Definition (Representation of sesquilinear form). Let V, W be finite-dimensional complex vector spaces with bases (v1, ··· , vn) and (w1, ··· , wm) respectively, and φ : V × W → C be a sesquilinear form. Then the matrix representing φ with respect to these bases is

Aij = φ(vi, wj)

for 1 ≤ i ≤ n, 1 ≤ j ≤ m.

Definition (Hermitian sesquilinear form). A sesquilinear form φ on V × V is Hermitian if φ(v, w) is the complex conjugate of φ(w, v).


8 Inner product spaces

8.1 Definitions and basic properties

Definition (Inner product). Let V be a vector space. An inner product on V is a positive-definite symmetric bilinear form (over R) or Hermitian form (over C). We usually write (x, y) instead of φ(x, y). A vector space equipped with an inner product is an inner product space.

Definition (Orthogonal vectors). Let V be an inner product space. Then v, w ∈ V are orthogonal if (v, w) = 0.

Definition (Orthonormal set). Let V be an inner product space. A set {vi : i ∈ I} is an orthonormal set if for any i, j ∈ I, we have

(vi, vj) = δij.

Definition (Orthonormal basis). Let V be an inner product space. A subset of V is an orthonormal basis if it is an orthonormal set and is a basis.

8.2 Gram-Schmidt orthogonalization

Definition (Orthogonal internal direct sum). Let V be an inner product space and V1, V2 ≤ V. Then V is the orthogonal internal direct sum of V1 and V2 if it is a direct sum and V1 and V2 are orthogonal. More precisely, we require

(i) V = V1 + V2;
(ii) V1 ∩ V2 = 0;
(iii) (v1, v2) = 0 for all v1 ∈ V1 and v2 ∈ V2.

Note that condition (iii) implies (ii), but we write it for the sake of explicitness. We write V = V1 ⊥ V2.

Definition (Orthogonal complement). If W ≤ V is a subspace of an inner product space V, then the orthogonal complement of W in V is the subspace

W ⊥ = {v ∈ V :(v, w) = 0, ∀w ∈ W }.

Definition (Orthogonal external direct sum). Let V1,V2 be inner product spaces. The orthogonal external direct sum of V1 and V2 is the vector space V1 ⊕ V2 with the inner product defined by

(v1 + v2, w1 + w2) = (v1, w1) + (v2, w2), with v1, w1 ∈ V1, v2, w2 ∈ V2. Here we write v1 + v2 ∈ V1 ⊕ V2 instead of (v1, v2) to avoid confusion.
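
Here is a minimal sketch of the Gram-Schmidt process itself (my own addition; the classical, numerically naive variant), turning a linearly independent list into an orthonormal one with the same span.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt over R: subtract projections onto the
    orthonormal vectors found so far, then normalise."""
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, e) * e for e in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0])]
es = gram_schmidt(vs)

# The Gram matrix of the output is the identity: (e_i, e_j) = delta_ij.
print(np.round(np.array([[np.dot(a, b) for b in es] for a in es]), 10))
```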

8.3 Adjoints, orthogonal and unitary maps

Definition (Adjoint). Let V, W be inner product spaces and α ∈ L(V, W). The adjoint of α is the unique map α* : W → V such that (α(v), w) = (v, α*(w)) for all v ∈ V, w ∈ W.


Definition (Self-adjoint). Let V be an inner product space, and α ∈ End(V ). Then α is self-adjoint if α = α∗, i.e.

(α(v), w) = (v, α(w))

for all v, w.

Definition (Orthogonal endomorphism). Let V be a real inner product space. Then α ∈ End(V) is orthogonal if

(α(v), α(w)) = (v, w)

for all v, w ∈ V.

Definition (Orthogonal group). Let V be a real inner product space. Then the orthogonal group of V is

O(V ) = {α ∈ End(V ): α is orthogonal}.

Definition (Unitary map). Let V be a finite-dimensional complex inner product space. Then α ∈ End(V) is unitary if

(α(v), α(w)) = (v, w)

for all v, w ∈ V.

Definition (Unitary group). Let V be a finite-dimensional complex inner product space. Then the unitary group of V is

U(V ) = {α ∈ End(V ): α is unitary}.
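
Looking ahead to spectral theory, a small numerical illustration (example matrix made up): a Hermitian matrix is diagonalised by a unitary matrix of orthonormal eigenvectors, and its eigenvalues are real.

```python
import numpy as np

A = np.array([[2.0, 1.0j],
              [-1.0j, 3.0]])
assert np.allclose(A, A.conj().T)          # A is Hermitian

evals, U = np.linalg.eigh(A)               # eigh is for Hermitian matrices
print(evals)                               # real eigenvalues
print(np.allclose(U.conj().T @ U, np.eye(2)))          # True: U is unitary
print(np.allclose(U.conj().T @ A @ U, np.diag(evals))) # True: diagonalised
```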

8.4 Spectral theory
