L. Vandenberghe ECE133A (Spring 2021) 4. Matrix inverses
• left and right inverse
• linear independence
• nonsingular matrices
• matrices with linearly independent columns
• matrices with linearly independent rows
Matrix inverses 4.1

Left and right inverse
AB ≠ BA in general, so we have to distinguish two types of inverses
Left inverse: a matrix X is a left inverse of A if

XA = I

A is left-invertible if it has at least one left inverse

Right inverse: a matrix X is a right inverse of A if

AX = I

A is right-invertible if it has at least one right inverse
Matrix inverses 4.2

Examples
A = [ −3 −4 ]      B = [ 1 0 1 ]
    [  4  6 ]          [ 0 1 1 ]
    [  1  1 ]
• A is left-invertible; the following matrices are left inverses:

(1/9) [ −11 −10  16 ]      [ 0 −1/2  3 ]
      [   7   8 −11 ]      [ 0  1/2 −2 ]
• B is right-invertible; the following matrices are right inverses:

(1/2) [  1 −1 ]      [ 1 0 ]      [ 1 −1 ]
      [ −1  1 ]      [ 0 1 ]      [ 0  0 ]
      [  1  1 ]      [ 0 0 ]      [ 0  1 ]
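These claims are easy to confirm by multiplying out. A minimal sketch in plain Python (matrices as listed above; one left inverse of A and one right inverse of B), assuming nothing beyond the standard library:

```python
# Check a left inverse of A and a right inverse of B by direct multiplication.

def matmul(X, Y):
    """Product of two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

A = [[-3, -4], [4, 6], [1, 1]]        # 3 x 2, left-invertible
X = [[0, -0.5, 3], [0, 0.5, -2]]      # X A = I (the second left inverse above)

B = [[1, 0, 1], [0, 1, 1]]            # 2 x 3, right-invertible
Y = [[1, 0], [0, 1], [0, 0]]          # B Y = I (the second right inverse above)

print(matmul(X, A))   # [[1.0, 0.0], [0.0, 1.0]]
print(matmul(B, Y))   # [[1, 0], [0, 1]]
```

The other listed inverses can be checked the same way; left (right) inverses of a non-square matrix are not unique.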
Matrix inverses 4.3

Some immediate properties

Dimensions: a left or right inverse of an m × n matrix must have size n × m
Left and right inverse of (conjugate) transpose

• X is a left inverse of A if and only if X^T is a right inverse of A^T:

A^T X^T = (XA)^T = I

• X is a left inverse of A if and only if X^H is a right inverse of A^H:

A^H X^H = (XA)^H = I
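The first identity can be checked on the example of page 4.2. A small sketch (plain Python; X is one of the left inverses listed there):

```python
# If X A = I, then X^T is a right inverse of A^T: A^T X^T = (X A)^T = I.

def matmul(X, Y):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def transpose(X):
    return [list(col) for col in zip(*X)]

A = [[-3, -4], [4, 6], [1, 1]]
X = [[0, -0.5, 3], [0, 0.5, -2]]      # a left inverse: X A = I

print(matmul(X, A))                            # [[1.0, 0.0], [0.0, 1.0]]
print(matmul(transpose(A), transpose(X)))      # same identity matrix
```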
Matrix inverses 4.4

Inverse

if A has a left and a right inverse, then they are equal and unique:

XA = I, AY = I  =⇒  X = X(AY) = (XA)Y = Y

• in this case, we call X = Y the inverse of A (notation: A^{−1})
• A is invertible if its inverse exists
Example

A = [ 1 −3 2 ]
    [ 1 −1 1 ]
    [ 2  2 2 ]

A^{−1} = (1/4) [ −4  10 −1 ]
               [  0  −2  1 ]
               [  4  −8  2 ]
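Inverse pairs are easy to verify by multiplying out both products. A sketch in exact rational arithmetic (the 3 × 3 pair used here has det A = 4, which is why a 1/4 factor appears in the inverse):

```python
# Verify A A^{-1} = I and A^{-1} A = I for a 3x3 pair, in exact arithmetic.
from fractions import Fraction as F

def matmul(X, Y):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

A = [[1, -3, 2], [1, -1, 1], [2, 2, 2]]
Ainv = [[F(c, 4) for c in row]                       # (1/4) * integer matrix
        for row in [[-4, 10, -1], [0, -2, 1], [4, -8, 2]]]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(matmul(A, Ainv) == I3)    # True
print(matmul(Ainv, A) == I3)    # True: the inverse works on both sides
```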
Matrix inverses 4.5

Linear equations

set of m linear equations in n variables
a11 x1 + a12 x2 + · · · + a1n xn = b1
a21 x1 + a22 x2 + · · · + a2n xn = b2
        ⋮
am1 x1 + am2 x2 + · · · + amn xn = bm
• in matrix form: Ax = b
• may have no solution, a unique solution, infinitely many solutions
Matrix inverses 4.6

Linear equations and matrix inverse

Left-invertible matrix: if X is a left inverse of A, then

Ax = b  =⇒  x = XAx = Xb

there is at most one solution (if there is a solution, it must be equal to Xb)

Right-invertible matrix: if X is a right inverse of A, then

x = Xb  =⇒  Ax = AXb = b

there is at least one solution (namely, x = Xb)
Invertible matrix: if A is invertible, then

Ax = b  ⇐⇒  x = A^{−1}b

there is a unique solution
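The invertible case can be illustrated directly. A sketch with a small hypothetical system (the 2 × 2 matrix and right-hand side are chosen for illustration, not taken from the slides):

```python
# Unique solution of Ax = b via x = A^{-1} b, for a 2x2 example.

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

A = [[1, -1], [1, 1]]
Ainv = [[0.5, 0.5], [-0.5, 0.5]]      # inverse of A (det A = 2)
b = [1, 3]

x = matvec(Ainv, b)                   # x = A^{-1} b
print(x)                              # [2.0, 1.0]
print(matvec(A, x))                   # [1.0, 3.0] -- x indeed solves Ax = b
```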
Matrix inverses 4.7

Outline
• left and right inverse
• linear independence
• nonsingular matrices
• matrices with linearly independent columns
• matrices with linearly independent rows

Linear combination
a linear combination of vectors a1, . . . , an is a sum of scalar-vector products

x1 a1 + x2 a2 + · · · + xn an
• the scalars xi are the coefficients of the linear combination
• can be written as a matrix-vector product

x1 a1 + x2 a2 + · · · + xn an = [ a1 a2 · · · an ] [ x1 ]
                                                  [ x2 ]
                                                  [ ⋮  ]
                                                  [ xn ]
• the trivial linear combination has coefficients x1 = · · · = xn = 0
(same definition holds for real and complex vectors/scalars)
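The two ways of writing a linear combination agree entry by entry. A sketch (the vectors and coefficients are chosen for illustration):

```python
# x1*a1 + x2*a2 computed directly, and as the matrix-vector product
# [a1 a2] x, where a1, a2 are the columns of the matrix.

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

a1 = [1, 0, 2]
a2 = [-1, 3, 1]
x = [2, 5]

direct = [x[0] * u + x[1] * v for u, v in zip(a1, a2)]
A = [list(pair) for pair in zip(a1, a2)]   # matrix with columns a1, a2
print(direct)           # [-3, 15, 9]
print(matvec(A, x))     # [-3, 15, 9] -- the same vector
```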
Matrix inverses 4.8

Linear dependence

a collection of vectors a1, a2, . . . , an is linearly dependent if

x1 a1 + x2 a2 + · · · + xn an = 0 for some scalars x1, . . . , xn, not all zero

• the vector 0 can be written as a nontrivial linear combination of a1, . . . , an
• equivalently, at least one vector ai is a linear combination of the other vectors:

ai = −(x1/xi) a1 − · · · − (xi−1/xi) ai−1 − (xi+1/xi) ai+1 − · · · − (xn/xi) an

if xi ≠ 0
Matrix inverses 4.9

Example

the vectors

a1 = [ 0.2 ]      a2 = [ −0.1 ]      a3 = [  0  ]
     [ −7  ]           [  2   ]           [ −1  ]
     [ 8.6 ]           [ −1   ]           [ 2.2 ]

are linearly dependent
• 0 can be expressed as a nontrivial linear combination of a1, a2, a3:

0 = a1 + 2a2 − 3a3

• a1 can be expressed as a linear combination of a2, a3:

a1 = −2a2 + 3a3

(and similarly a2 and a3)
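Both identities can be confirmed numerically; a small tolerance is needed because 0.2, 8.6, and 2.2 are not exactly representable in binary floating point:

```python
# Check 0 = a1 + 2*a2 - 3*a3 and a1 = -2*a2 + 3*a3 for the example vectors.

a1 = [0.2, -7, 8.6]
a2 = [-0.1, 2, -1]
a3 = [0, -1, 2.2]

residual = [u + 2 * v - 3 * w for u, v, w in zip(a1, a2, a3)]
print(all(abs(r) < 1e-12 for r in residual))          # True: a1 + 2a2 - 3a3 = 0

rebuilt = [-2 * v + 3 * w for v, w in zip(a2, a3)]
print(all(abs(u - r) < 1e-12 for u, r in zip(a1, rebuilt)))  # True: a1 = -2a2 + 3a3
```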
Matrix inverses 4.10

Linear independence

vectors a1, . . . , an are linearly independent if they are not linearly dependent

• the zero vector cannot be written as a nontrivial linear combination:

x1 a1 + x2 a2 + · · · + xn an = 0  =⇒  x1 = x2 = · · · = xn = 0

• none of the vectors ai is a linear combination of the other vectors

Matrix with linearly independent columns

A = [ a1 a2 · · · an ] has linearly independent columns if

Ax = 0  =⇒  x = 0
Matrix inverses 4.11

Example

the vectors

a1 = [  1 ]      a2 = [ −1 ]      a3 = [ 0 ]
     [ −2 ]           [  0 ]           [ 1 ]
     [  0 ]           [  1 ]           [ 1 ]

are linearly independent:

x1 a1 + x2 a2 + x3 a3 = [ x1 − x2   ]
                        [ −2x1 + x3 ]  =  0
                        [ x2 + x3   ]

holds only if x1 = x2 = x3 = 0
Matrix inverses 4.12

Dimension inequality

if n vectors a1, a2, . . . , an of length m are linearly independent, then

n ≤ m
(proof is in textbook)
• if an m × n matrix has linearly independent columns then m ≥ n
• if an m × n matrix has linearly independent rows then m ≤ n
Matrix inverses 4.13

Outline
• left and right inverse
• linear independence
• nonsingular matrices
• matrices with linearly independent columns
• matrices with linearly independent rows

Nonsingular matrix

for a square matrix A the following four properties are equivalent
1. A is left-invertible
2. the columns of A are linearly independent
3. A is right-invertible
4. the rows of A are linearly independent

a square matrix with these properties is called nonsingular
Nonsingular = invertible
• if properties 1 and 3 hold, then A is invertible (page 4.5)
• if A is invertible, properties 1 and 3 hold (by definition of invertibility)
Matrix inverses 4.14

Proof

the four properties are linked in a cycle of implications:

(a):  left-invertible  =⇒  linearly independent columns
(b):  linearly independent columns  =⇒  right-invertible
(a’): right-invertible  =⇒  linearly independent rows
(b’): linearly independent rows  =⇒  left-invertible

• we show that (a) holds in general
• we show that (b) holds for square matrices
• (a’) and (b’) follow from (a) and (b) applied to A^T
Matrix inverses 4.15

Part a: suppose A is left-invertible

• if B is a left inverse of A (satisfies BA = I), then

Ax = 0  =⇒  BAx = 0  =⇒  x = 0
• this means that the columns of A are linearly independent: if

A = [ a1 a2 · · · an ]

then

x1 a1 + x2 a2 + · · · + xn an = 0

holds only for the trivial linear combination x1 = x2 = · · · = xn = 0
Matrix inverses 4.16

Part b: suppose A is square with linearly independent columns a1, . . . , an

• for every n-vector b the vectors a1, . . . , an, b are linearly dependent (from the dimension inequality on page 4.13)
• hence for every b there exists a nontrivial linear combination

x1 a1 + x2 a2 + · · · + xn an + xn+1 b = 0

• we must have xn+1 ≠ 0 because a1, . . . , an are linearly independent
• hence every b can be written as a linear combination of a1, . . . , an
• in particular, there exist n-vectors c1, . . . , cn such that

Ac1 = e1,  Ac2 = e2,  . . . ,  Acn = en

• the matrix C = [ c1 c2 · · · cn ] is a right inverse of A:

A [ c1 c2 · · · cn ] = [ e1 e2 · · · en ] = I
Matrix inverses 4.17

Examples

A = [  1 −1  1 ]      B = [  1 −1  1 −1 ]
    [ −1  1  1 ]          [ −1  1 −1  1 ]
    [  1  1 −1 ]          [ −1  1  1 −1 ]
                          [ −1 −1  1  1 ]

• A is nonsingular because its columns are linearly independent:

x1 − x2 + x3 = 0,   −x1 + x2 + x3 = 0,   x1 + x2 − x3 = 0

is only possible if x1 = x2 = x3 = 0

• B is singular because its columns are linearly dependent:

Bx = 0 for x = (1, 1, 1, 1)
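The dependence of B's columns is a one-line check, and nonsingularity of A can be confirmed with a nonzero determinant (determinants are not used elsewhere in these slides; here they serve only as a convenient independent test, with the matrices as transcribed above):

```python
# B(1,1,1,1) = 0 shows B's columns are linearly dependent; det(A) != 0
# confirms that A is nonsingular.

def matvec(A, x):
    return [sum(a * v for a, v in zip(row, x)) for row in A]

def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

A = [[1, -1, 1], [-1, 1, 1], [1, 1, -1]]
B = [[1, -1, 1, -1], [-1, 1, -1, 1], [-1, 1, 1, -1], [-1, -1, 1, 1]]

print(matvec(B, [1, 1, 1, 1]))   # [0, 0, 0, 0]
print(det(A))                    # -4 (nonzero, so A is nonsingular)
print(det(B))                    # 0  (B is singular)
```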
Matrix inverses 4.18

Example: Vandermonde matrix

A = [ 1  t1  t1^2  · · ·  t1^{n−1} ]
    [ 1  t2  t2^2  · · ·  t2^{n−1} ]      with ti ≠ tj for i ≠ j
    [ ⋮   ⋮    ⋮            ⋮      ]
    [ 1  tn  tn^2  · · ·  tn^{n−1} ]

we show that A is nonsingular by showing that Ax = 0 only if x = 0

• Ax = 0 means p(t1) = p(t2) = · · · = p(tn) = 0 where

p(t) = x1 + x2 t + x3 t^2 + · · · + xn t^{n−1}

p(t) is a polynomial of degree n − 1 or less
• if x ≠ 0, then p(t) cannot have more than n − 1 distinct real roots
• therefore p(t1) = · · · = p(tn) = 0 is only possible if x = 0
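A classical equivalent statement (standard, but not derived in these slides) is that the determinant of a Vandermonde matrix equals the product of all differences tj − ti over i < j, which is nonzero exactly when the points are distinct. A sketch for n = 3 with sample points chosen for illustration:

```python
# Vandermonde matrix for sample points t = (1, 2, 4); its determinant
# equals the product of (tj - ti) over i < j.

def det(M):
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

t = [1, 2, 4]
n = len(t)
V = [[ti ** k for k in range(n)] for ti in t]   # rows [1, ti, ti^2]

product = 1
for j in range(n):
    for i in range(j):
        product *= t[j] - t[i]

print(det(V), product)   # 6 6 -- nonzero since the points are distinct
```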
Matrix inverses 4.19

Inverse of transpose and product

Transpose and conjugate transpose: if A is nonsingular, then A^T and A^H are nonsingular and

(A^T)^{−1} = (A^{−1})^T,   (A^H)^{−1} = (A^{−1})^H

we write these as A^{−T} and A^{−H}

Product: if A and B are nonsingular and of equal size, then AB is nonsingular with

(AB)^{−1} = B^{−1} A^{−1}
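Note the reversed order of the factors. A sketch checking this on a 2 × 2 example (matrices chosen for illustration):

```python
# (AB)^{-1} = B^{-1} A^{-1}, checked by direct multiplication.

def matmul(X, Y):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

A = [[1, 1], [0, 1]]
Ainv = [[1, -1], [0, 1]]
B = [[2, 0], [0, 1]]
Binv = [[0.5, 0], [0, 1]]

AB = matmul(A, B)
lhs = matmul(Binv, Ainv)                 # claimed inverse of AB
print(matmul(AB, lhs))                   # [[1.0, 0.0], [0.0, 1.0]]
print(matmul(Ainv, Binv) == lhs)         # False: the order matters in general
```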
Matrix inverses 4.20

Outline
• left and right inverse
• linear independence
• nonsingular matrices
• matrices with linearly independent columns
• matrices with linearly independent rows

Gram matrix

the Gram matrix associated with a matrix A = [ a1 a2 · · · an ] is the matrix of column inner products
• for real matrices:

A^T A = [ a1^T a1  a1^T a2  · · ·  a1^T an ]
        [ a2^T a1  a2^T a2  · · ·  a2^T an ]
        [    ⋮        ⋮               ⋮    ]
        [ an^T a1  an^T a2  · · ·  an^T an ]

• for complex matrices:

A^H A = [ a1^H a1  a1^H a2  · · ·  a1^H an ]
        [ a2^H a1  a2^H a2  · · ·  a2^H an ]
        [    ⋮        ⋮               ⋮    ]
        [ an^H a1  an^H a2  · · ·  an^H an ]
Matrix inverses 4.21

Nonsingular Gram matrix

the Gram matrix A^T A is nonsingular if and only if A has linearly independent columns

• suppose A ∈ R^{m×n} has linearly independent columns:

A^T A x = 0  =⇒  x^T A^T A x = (Ax)^T (Ax) = ‖Ax‖^2 = 0  =⇒  Ax = 0  =⇒  x = 0

therefore A^T A is nonsingular

• suppose the columns of A ∈ R^{m×n} are linearly dependent:

∃x ≠ 0, Ax = 0  =⇒  ∃x ≠ 0, A^T A x = 0

therefore A^T A is singular

(for A ∈ C^{m×n}, replace A^T with A^H and x^T with x^H)
Matrix inverses 4.22

Pseudo-inverse

• suppose A ∈ R^{m×n} has linearly independent columns
• this implies that A is tall or square (m ≥ n); see page 4.13

the pseudo-inverse of A is defined as

A† = (A^T A)^{−1} A^T

• this matrix exists, because the Gram matrix A^T A is nonsingular
• A† is a left inverse of A:

A† A = (A^T A)^{−1} (A^T A) = I

(for complex A with linearly independent columns, A† = (A^H A)^{−1} A^H)
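For the tall matrix A of page 4.2 this recipe can be carried out by hand; doing so reproduces the first left inverse listed there. A sketch in exact arithmetic, with the 2 × 2 Gram matrix inverted by the standard adjugate formula:

```python
# Pseudo-inverse (A^T A)^{-1} A^T of a tall matrix with independent columns.
from fractions import Fraction as F

def matmul(X, Y):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def transpose(X):
    return [list(col) for col in zip(*X)]

def inv2x2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[F(d, det), F(-b, det)], [F(-c, det), F(a, det)]]

A = [[-3, -4], [4, 6], [1, 1]]
G = matmul(transpose(A), A)              # Gram matrix A^T A
pinv = matmul(inv2x2(G), transpose(A))   # pseudo-inverse of A

print(G)                                 # [[26, 37], [37, 53]]
print(matmul(pinv, A))                   # identity: a left inverse of A
print(pinv == [[F(-11, 9), F(-10, 9), F(16, 9)],
               [F(7, 9), F(8, 9), F(-11, 9)]])   # True: the (1/9)[...] on page 4.2
```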
Matrix inverses 4.23

Summary

the following three properties are equivalent for a real matrix A

1. A is left-invertible
2. the columns of A are linearly independent
3. A^T A is nonsingular

• 1 ⇒ 2 was already proved on page 4.16
• 2 ⇒ 1: we have seen that the pseudo-inverse is a left inverse
• 2 ⇔ 3: proved on page 4.22
• a matrix with these properties must be tall or square
• for complex matrices, replace A^T A in property 3 with A^H A
Matrix inverses 4.24

Outline
• left and right inverse
• linear independence
• nonsingular matrices
• matrices with linearly independent columns
• matrices with linearly independent rows

Pseudo-inverse

• suppose A ∈ R^{m×n} has linearly independent rows
• this implies that A is wide or square (m ≤ n); see page 4.13

the pseudo-inverse of A is defined as

A† = A^T (A A^T)^{−1}

• A^T has linearly independent columns
• hence its Gram matrix A A^T is nonsingular, so A† exists
• A† is a right inverse of A:

A A† = (A A^T) (A A^T)^{−1} = I

(for complex A with linearly independent rows, A† = A^H (A A^H)^{−1})
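The same sketch as on page 4.23, now applied to the wide, right-invertible matrix B of page 4.2 (exact arithmetic; the 2 × 2 inverse again uses the adjugate formula):

```python
# Pseudo-inverse B^T (B B^T)^{-1} of a wide matrix with independent rows.
from fractions import Fraction as F

def matmul(X, Y):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def transpose(X):
    return [list(col) for col in zip(*X)]

def inv2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[F(d, det), F(-b, det)], [F(-c, det), F(a, det)]]

B = [[1, 0, 1], [0, 1, 1]]
G = matmul(B, transpose(B))              # B B^T (Gram matrix of B^T)
pinv = matmul(transpose(B), inv2x2(G))   # pseudo-inverse of B

print(G)                                 # [[2, 1], [1, 2]]
print(matmul(B, pinv))                   # identity: a right inverse of B
```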
Matrix inverses 4.25

Summary

the following three properties are equivalent

1. A is right-invertible
2. the rows of A are linearly independent
3. A A^T is nonsingular

• 1 ⇒ 2 and 2 ⇔ 3: by transposing the result on page 4.24
• 2 ⇒ 1: we have seen that the pseudo-inverse is a right inverse
• a matrix with these properties must be wide or square
• for complex matrices, replace A A^T in property 3 with A A^H
Matrix inverses 4.26