Linear maps on $\mathbb{R}^n$

Linear structure of $\mathbb{R}^n$

- $\mathbb{R}^n$ has a linear structure with two features:
  - Vector addition: For all $u, v \in \mathbb{R}^n$ we have $u + v \in \mathbb{R}^n$ with the components $(u + v)_i = u_i + v_i$ for $1 \le i \le n$, with respect, say, to the standard basis vectors of $\mathbb{R}^n$.
  - Scalar multiplication: For all $u \in \mathbb{R}^n$ and all $c \in \mathbb{R}$, we have $cu \in \mathbb{R}^n$ with the components $(cu)_i = c\,u_i$ for $1 \le i \le n$.
- Note that these two structures exist independently of any particular basis we choose to represent the coordinates of vectors.

[Figure: the sum $u + v$ and the scalar multiple $cu$ of two vectors $u, v$]
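- As a quick illustration, here is a minimal NumPy sketch of the componentwise operations above; the particular vectors and scalar are arbitrary examples.

```python
import numpy as np

# Two vectors in R^3 and a scalar (arbitrary example values).
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
c = 2.5

# Vector addition and scalar multiplication act componentwise.
print(u + v)   # [5. 7. 9.]   -> (u + v)_i = u_i + v_i
print(c * u)   # [2.5 5. 7.5] -> (cu)_i = c * u_i
```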
Convex combination of vectors

- The convex combinations $au + bv$ of any two vectors $u, v \in \mathbb{R}^2$, where $0 \le a \le 1$, $0 \le b \le 1$ and $a + b = 1$, form the segment joining $u$ and $v$.

[Figure: the segment joining $u$ and $v$]

- Similarly, the convex combinations $au + bv + cw$ of three vectors $u, v, w \in \mathbb{R}^3$, where $0 \le a, b, c \le 1$ and $a + b + c = 1$, form the triangle with vertices $u$, $v$ and $w$.

[Figure: the triangle with vertices $u$, $v$ and $w$]
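- A small sketch of the two-vector case, with arbitrary example vectors: every sampled point $au + bv$ lies on the segment between $u$ and $v$.

```python
import numpy as np

u = np.array([0.0, 0.0])
v = np.array([2.0, 1.0])

# Convex combinations a*u + b*v with a + b = 1 and 0 <= a <= 1:
# these points run along the segment from v (a = 0) to u (a = 1).
for a in np.linspace(0.0, 1.0, 5):
    b = 1.0 - a
    print(a * u + b * v)
```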
Scalar product of vectors

- Given two vectors $u, v \in \mathbb{R}^3$ with coordinates
\[
u = \begin{pmatrix} u_1 \\ u_2 \\ u_3 \end{pmatrix}, \qquad
v = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix},
\]
their scalar product is given by
\[
u \cdot v = \sum_{i=1}^{3} u_i v_i = \|u\|\,\|v\| \cos\theta,
\]
where $\|u\| = \sqrt{u_1^2 + u_2^2 + u_3^2}$ and $\theta$ is the angle between $u$ and $v$. Check this formula in $\mathbb{R}^2$!

[Figure: vectors $u$ and $v$ with the angle $\theta$ between them]
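- A numerical check of the formula in $\mathbb{R}^2$, with two arbitrary example vectors:

```python
import numpy as np

# Check u.v = ||u|| ||v|| cos(theta) for two vectors in R^2.
u = np.array([3.0, 0.0])
v = np.array([1.0, 1.0])

dot = np.dot(u, v)
theta = np.arctan2(v[1], v[0]) - np.arctan2(u[1], u[0])  # angle between u and v
rhs = np.linalg.norm(u) * np.linalg.norm(v) * np.cos(theta)

print(dot, rhs)  # both print 3.0
```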
Scalar product and coordinates

- Note that the lengths $\|u\|$ and $\|v\|$, as well as the angle $\theta$, are independent of the three mutually perpendicular axes used to determine the coordinates of $u$ and $v$.
- It follows that if we move to another set of mutually perpendicular axes, with respect to which $u$ and $v$ have coordinates
\[
u = \begin{pmatrix} u'_1 \\ u'_2 \\ u'_3 \end{pmatrix}, \qquad
v = \begin{pmatrix} v'_1 \\ v'_2 \\ v'_3 \end{pmatrix},
\]
then we have
\[
u \cdot v = \sum_{i=1}^{3} u_i v_i = \|u\|\,\|v\| \cos\theta = \sum_{i=1}^{3} u'_i v'_i .
\]
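- A quick numerical check of this invariance: the new coordinates are obtained here with an arbitrary rotation about the $z$-axis, which is one way of moving to another set of mutually perpendicular axes.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# R is a rotation about the z-axis (arbitrary angle), so R.T @ R = I;
# u_new and v_new are the coordinates with respect to the rotated axes.
t = 0.7
R = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])
u_new, v_new = R @ u, R @ v

print(np.dot(u, v), np.dot(u_new, v_new))  # equal up to rounding: 32.0 32.0
```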
Linear maps

- A map $f : \mathbb{R}^n \to \mathbb{R}^m$ is called linear if it preserves the linear structure, i.e., if it satisfies:
  - $\forall u, v \in \mathbb{R}^n.\; f(u + v) = f(u) + f(v)$.
  - $\forall u \in \mathbb{R}^n\ \forall c \in \mathbb{R}.\; f(cu) = c\,f(u)$.
- We can write the two conditions as a single condition:
  - $\forall u, v \in \mathbb{R}^n\ \forall a, b \in \mathbb{R}.\; f(au + bv) = a\,f(u) + b\,f(v)$.
- Example. For a fixed $u \in \mathbb{R}^n$, let $f_u : \mathbb{R}^n \to \mathbb{R}$ be defined by
\[
f_u(v) = u \cdot v .
\]
- It is easy to check that $f_u$ is a linear map.
- If $\|u\| = 1$, then $f_u(v)$ is the projection of $v$ onto the direction of $u$.

[Figure: the projection $u \cdot v$ of $v$ onto the unit vector $u$, $\|u\| = 1$]
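- A small sketch of this example: the helper `f_u` below is a hypothetical name used only for illustration, and the vectors are arbitrary.

```python
import numpy as np

# f_u(v) = u . v for a fixed u (hypothetical helper, for illustration only).
def f_u(u, v):
    return np.dot(u, v)

u = np.array([1.0, 0.0])          # unit vector, ||u|| = 1
v = np.array([3.0, 4.0])
w = np.array([-1.0, 2.0])

# Linearity: f_u(a v + b w) = a f_u(v) + b f_u(w)
a, b = 2.0, -3.0
print(f_u(u, a * v + b * w), a * f_u(u, v) + b * f_u(u, w))  # both 9.0

# Since ||u|| = 1, f_u(v) is the signed length of the projection of v onto u.
print(f_u(u, v))  # 3.0, the component of v along u
```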
Linear maps and basis

- Let $f : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map.
- Assume $e_1, e_2, \ldots, e_n$ form a basis of $\mathbb{R}^n$ and $d_1, d_2, \ldots, d_m$ form a basis of $\mathbb{R}^m$.
- By linearity, the action of $f$ is completely determined by knowing the vectors $f(e_j) \in \mathbb{R}^m$ ($j = 1, \ldots, n$).
- Since $d_1, \ldots, d_m$ form a basis of $\mathbb{R}^m$, each vector $f(e_j) \in \mathbb{R}^m$ can be expressed as
\[
f(e_j) = \sum_{i=1}^{m} A_{ij}\, d_i ,
\]
for some coefficients $A_{ij} \in \mathbb{R}$ with $i = 1, \ldots, m$.
- Given any vector $v = \sum_{j=1}^{n} v_j e_j$, we have by linearity:
\[
f(v) = f\Big(\sum_{j=1}^{n} v_j e_j\Big) = \sum_{j=1}^{n} v_j f(e_j) = \sum_{j=1}^{n} v_j \Big(\sum_{i=1}^{m} A_{ij}\, d_i\Big).
\]
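- A minimal sketch of how the coefficients $A_{ij}$ arise, under the assumption that both bases are the standard ones; the map `f` is an arbitrary example.

```python
import numpy as np

# A concrete linear map f : R^3 -> R^2 (arbitrary example), with the
# standard bases e_1, e_2, e_3 of R^3 and d_1, d_2 of R^2.
def f(v):
    return np.array([v[0] + 2.0 * v[1], 3.0 * v[2] - v[0]])

# The j-th column of A holds the coefficients of f(e_j) in the basis d_1, d_2.
E = np.eye(3)
A = np.column_stack([f(E[:, j]) for j in range(3)])
print(A)
# [[ 1.  2.  0.]
#  [-1.  0.  3.]]
```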
Matrix representation of linear maps

- The $m \times n$ matrix $A = (A_{ij})$ with $1 \le i \le m$, $1 \le j \le n$ is called the matrix representation of $f$ with respect to the basis $e_1, \ldots, e_n$ of $\mathbb{R}^n$ and the basis $d_1, \ldots, d_m$ of $\mathbb{R}^m$.
- Given any vector $v = \sum_{j=1}^{n} v_j e_j$, we can compute the vector representing $f(v) \in \mathbb{R}^m$ with respect to the basis $d_1, \ldots, d_m$:
\[
f(v) = f\Big(\sum_{j=1}^{n} v_j e_j\Big) = \sum_{j=1}^{n} v_j f(e_j) = \sum_{j=1}^{n} v_j \Big(\sum_{i=1}^{m} A_{ij}\, d_i\Big)
     = \sum_{i=1}^{m} \Big(\sum_{j=1}^{n} A_{ij} v_j\Big) d_i = \sum_{i=1}^{m} (Av)_i\, d_i .
\]
- Thus, $(Av)_i$ is the coefficient of $d_i$ in the expansion of $f(v)$ with respect to the basis $d_1, \ldots, d_m$, i.e.,
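- A sketch of this computation with a non-standard basis of the codomain, to emphasise that $Av$ holds the coefficients of $f(v)$ in the basis $d_1, \ldots, d_m$; the map and the basis below are arbitrary examples.

```python
import numpy as np

# The same example map f : R^3 -> R^2, now with a non-standard basis d_1, d_2
# of the codomain (both the map and the basis are arbitrary illustrations).
def f(v):
    return np.array([v[0] + 2.0 * v[1], 3.0 * v[2] - v[0]])

D = np.array([[1.0, 1.0],    # columns are d_1 and d_2
              [0.0, 2.0]])

# Column j of A = coefficients of f(e_j) in the basis d_1, d_2,
# obtained by solving D @ a = f(e_j).
E = np.eye(3)
A = np.column_stack([np.linalg.solve(D, f(E[:, j])) for j in range(3)])

v = np.array([1.0, -2.0, 0.5])
coeffs = A @ v                        # (Av)_i = coefficient of d_i in f(v)
print(np.allclose(D @ coeffs, f(v)))  # True: sum_i (Av)_i d_i equals f(v)
```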
- $Av$ represents $f(v)$ with respect to the basis $d_1, \ldots, d_m$.

Extension to $\mathbb{C}^n$

- All the definitions and results for $\mathbb{R}^n$ we have seen here can be extended to the vector space $\mathbb{C}^n$, the complex vector space of dimension $n$.
- For example, $\mathbb{C}^2$ has vectors of the form $v = (v_1, v_2)^T$ with $v_1, v_2 \in \mathbb{C}$.
- The linear structure of $\mathbb{C}^n$ is like that of $\mathbb{R}^n$, except that we can now use scalar multiplication with complex numbers, i.e., for the vector $v$ above: $cv = (cv_1, cv_2)^T$ for $c \in \mathbb{C}$.
- Interestingly, the standard basis for $\mathbb{R}^n$ can be interpreted as a standard basis for $\mathbb{C}^n$.
- Note that even when we deal with real matrices, we cannot avoid using complex vectors.
- In fact, the eigenvalues of a real matrix can be complex numbers, which means that the corresponding eigenvectors would be complex vectors.
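- A standard illustration of this point: a real $2 \times 2$ rotation matrix (arbitrary angle chosen below) has complex eigenvalues and complex eigenvectors.

```python
import numpy as np

# A real rotation matrix has no real eigenvectors:
# its eigenvalues and eigenvectors are genuinely complex.
t = np.pi / 3
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

vals, vecs = np.linalg.eig(R)
print(vals)        # approximately [0.5+0.866j, 0.5-0.866j]
print(vecs[:, 0])  # a complex eigenvector in C^2
```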
The inner product in $\mathbb{C}^n$

- How about the scalar product and the norm of a vector in $\mathbb{C}^n$?
- If we simply copy the definition of the dot product from the real case for complex vectors as well, then the dot product of a complex vector with itself will not necessarily be a non-negative real number, and thus it cannot be used to define the norm of the vector.
- For complex vectors $v, w \in \mathbb{C}^n$, we define the inner product $\langle v, w \rangle := \sum_{i=1}^{n} v_i^*\, w_i$.
- Recall that for $c = c_1 + i c_2 \in \mathbb{C}$ the complex conjugate of $c$ is given by $c^* = c_1 - i c_2$.
- Note that $c^* c = c_1^2 + c_2^2 \ge 0$ for any $c = c_1 + i c_2 \in \mathbb{C}$.
- Observe that the inner product is not commutative: $\langle v, w \rangle = \langle w, v \rangle^*$.
- The norm of $v \in \mathbb{C}^n$ is defined as $\|v\| := \sqrt{\langle v, v \rangle}$.
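- A small numerical sketch of these definitions, with arbitrary example vectors; NumPy's `np.vdot` conjugates its first argument, which matches the convention used above.

```python
import numpy as np

v = np.array([1.0 + 1.0j, 2.0 - 1.0j])
w = np.array([3.0j, 1.0 + 0.0j])

# np.vdot conjugates its first argument: <v, w> = sum_i v_i^* w_i
print(np.vdot(v, w))           # <v, w>
print(np.vdot(w, v))           # <w, v>
print(np.conj(np.vdot(v, w)))  # equals <w, v>: the inner product is not commutative

# <v, v> is a non-negative real number, so it defines a norm.
print(np.sqrt(np.vdot(v, v).real), np.linalg.norm(v))  # both ~2.6458
```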