Basics from Linear Algebra

Definition. A vector space is a set V with an operation of addition + : V × V → V, denoted ~v + ~w = +(~v, ~w), where ~v, ~w ∈ V, and an operation of multiplication by a scalar · : R × V → V, denoted r~v = ·(r, ~v), where r ∈ R and ~v ∈ V, such that the following hold:
(1) We have ~v + (~w + ~u) = (~v + ~w) + ~u for all ~v, ~w, ~u ∈ V.
(2) We have ~v + ~w = ~w + ~v for all ~v, ~w ∈ V.
(3) There exists an element ~0 ∈ V such that for every ~v ∈ V we have ~0 + ~v = ~v + ~0 = ~v. (One can prove that if an element ~0 with this property exists then such an element is unique.)
(4) For every ~v ∈ V there exists an element ~w ∈ V such that ~v + ~w = ~w + ~v = ~0. Again, one can show that for any given ~v an element ~w with this property is unique, and it is denoted ~w = −~v.
(5) For every ~v ∈ V we have 1 · ~v = ~v.
(6) For every r ∈ R and for all ~v, ~w ∈ V we have r(~v + ~w) = r~v + r~w.
(7) For all r, s ∈ R and every ~v ∈ V we have (r + s)~v = r~v + s~v.
(8) For all r, s ∈ R and every ~v ∈ V we have (rs)~v = r(s~v).
Elements ~v of a vector space V are called vectors.

Examples:
(1) If n ≥ 1 is an integer, then the Euclidean space R^n, with the standard operations of addition and multiplication by a scalar, is a vector space.
(2) The set Mn,n(R) of all n × n matrices with entries in R, with the standard operations of addition and multiplication by a scalar, is a vector space.
(3) If X is a nonempty set, then the set F(X, R) of all functions f : X → R, with pointwise addition and pointwise multiplication by a scalar, is a vector space. That is, for f, g : X → R, the function f + g : X → R is defined by (f + g)(x) = f(x) + g(x) for all x ∈ X. Similarly, if r ∈ R and f : X → R, then the function rf : X → R is defined by (rf)(x) := r f(x), where x ∈ X.

Basic properties of vector spaces. Let V be a vector space. Then:
(1) We have 0 · ~v = ~0 for every ~v ∈ V.
(2) We have (−1) · ~v = −~v for all ~v ∈ V.
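As an informal aside (not part of the formal development), the pointwise operations of example (3) are easy to experiment with on a computer. The following is a minimal Python sketch; the names add, scale, f, g and h are ad hoc choices made only for this illustration.

# Sketch: the pointwise operations that make F(X, R) a vector space,
# here with X = R and functions represented as plain Python callables.
def add(f, g):
    # (f + g)(x) = f(x) + g(x)
    return lambda x: f(x) + g(x)

def scale(r, f):
    # (r f)(x) = r * f(x)
    return lambda x: r * f(x)

f = lambda x: x ** 2
g = lambda x: 3.0 * x

h = add(f, scale(2.0, g))   # the function x |-> x^2 + 6x
print(h(1.0))               # 7.0
print(scale(0.0, f)(5.0))   # 0.0, matching the property 0 * v = ~0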

Definition. Let V be a vector space and let ~v1, . . . , ~vm ∈ V be m vectors in V (where m ≥ 1). We say that ~v1, . . . , ~vm are linearly independent in V if whenever c1, . . . , cm ∈ R are such that c1~v1 + · · · + cm~vm = ~0, then c1 = · · · = cm = 0. The vectors ~v1, . . . , ~vm are linearly dependent if they are not linearly independent.

Thus ~v1, . . . , ~vm are linearly dependent if and only if there exist c1, . . . , cm ∈ R such that c1~v1 + · · · + cm~vm = ~0 but ci ≠ 0 for some i.

Examples:
(1) The vectors ~v1 = (0, 1, 3), ~v2 = (−1, 1, 2) ∈ R^3 are linearly independent in R^3.
(2) The vectors ~v1 = (0, 1, 3), ~v2 = (−1, 1, 2), ~v3 = (−2, 3, 7) ∈ R^3 are linearly dependent in R^3. Indeed, 1 · ~v1 + 2 · ~v2 + (−1) · ~v3 = (0, 0, 0) = ~0.
(3) The vectors ~v1 = (0, 1, 3), ~v2 = (0, 0, 0) ∈ R^3 are linearly dependent in R^3. Indeed, 0 · ~v1 + 1 · ~v2 = (0, 0, 0) = ~0 and 1 ≠ 0.
(4) The functions x, x^2, 5x^3 are linearly independent in F(R, R) (try to prove this fact).

Recall that the Euclidean space R^n is also equipped with the dot-product operation (x1, . . . , xn) · (y1, . . . , yn) = x1y1 + · · · + xnyn.
Recall that for ~x = (x1, . . . , xn) ∈ R^n the norm or length of ~x is ||~x|| := √(~x · ~x) = √((x1)^2 + · · · + (xn)^2). Thus we always have ||~x|| ≥ 0 and, moreover, ||~x|| = 0 if and only if ~x = ~0.
A system of vectors ~v1, . . . , ~vm ∈ R^n is called orthogonal if ~vi · ~vj = 0 for all i ≠ j, 1 ≤ i, j ≤ m.
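The examples above can also be checked numerically. The sketch below assumes the numpy library; the variable names are ad hoc. It verifies the dependence relation in example (2) and uses the rank of the matrix whose columns are the given vectors as an independence test.

import numpy as np

# v1, v2 from example (1); v3 added as in example (2).
v1 = np.array([0.0, 1.0, 3.0])
v2 = np.array([-1.0, 1.0, 2.0])
v3 = np.array([-2.0, 3.0, 7.0])

# Full column rank <=> the columns are linearly independent.
print(np.linalg.matrix_rank(np.column_stack([v1, v2])))      # 2 -> independent
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))  # 2 -> dependent
print(1 * v1 + 2 * v2 + (-1) * v3)                           # [0. 0. 0.]

# Dot product and norm as defined above.
print(np.dot(v1, v2))       # 7.0
print(np.linalg.norm(v1))   # sqrt(10)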

Fact: Let ~v1, . . . , ~vm ∈ R^n be an orthogonal system of vectors such that ~vi ≠ ~0 for i = 1, . . . , m. Then the vectors ~v1, . . . , ~vm are linearly independent.

Proof. Suppose c1, . . . , cm ∈ R are such that

c1~v1 + · · · + cm~vm = ~0. Let i ∈ {1, . . . , m} be arbitrary and take the dot-product of the above equation with ~vi. Then

(c1~v1 + ··· + cm~vm) · ~vi = ~0 · ~vi = 0

c1(~v1 · ~vi) + · · · + cm(~vm · ~vi) = 0.
Because ~v1, . . . , ~vm is, by assumption, an orthogonal system, in the above sum all the terms ~vj · ~vi with j ≠ i are equal to 0. Thus we get ci(~vi · ~vi) = ci ||~vi||^2 = 0.

Since, again by assumption, ~vi ≠ ~0, we have ||~vi|| > 0. Therefore from ci ||~vi||^2 = 0 we get ci = 0. Since i ∈ {1, . . . , m} was arbitrary, we conclude that c1 = · · · = cm = 0. Thus ~v1, . . . , ~vm are linearly independent, as claimed. □
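As a small numerical illustration of this fact (a sketch assuming numpy; the vectors below are an arbitrary pairwise orthogonal choice), an orthogonal system of nonzero vectors indeed comes out linearly independent:

import numpy as np

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
u3 = np.array([0.0, 0.0, 2.0])

# The pairwise dot products vanish, so the system is orthogonal ...
print(np.dot(u1, u2), np.dot(u1, u3), np.dot(u2, u3))        # 0.0 0.0 0.0
# ... and the matrix with these columns has full rank, i.e. u1, u2, u3
# are linearly independent, as the fact asserts.
print(np.linalg.matrix_rank(np.column_stack([u1, u2, u3])))  # 3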

Definition. Let V be a vector space and let W ⊆ V be a subset. The subset W is called a linear subspace of V if it satisfies the following properties:
(1) ~0 ∈ W.
(2) Whenever ~v ∈ W and r ∈ R then r~v ∈ W.
(3) For every ~v, ~w ∈ W we have ~v + ~w ∈ W.
If W is a linear subspace of V, we write W ≤ V. Note that if W ≤ V then W is itself a vector space, with the operations of addition and multiplication by a scalar restricted from V.

Examples:
(1) The set W = {(x, y) ∈ R^2 | y = 3x} is a linear subspace of R^2.
(2) The set W = {(x, y) ∈ R^2 | y = 3x + 1} is not a linear subspace of R^2.
(3) The set W = {f : R → R : f(3) = 0} is a linear subspace of F(R, R).
(4) The set W = {f : R → R : f(3) = 2f(5)} is a linear subspace of F(R, R).
(5) The set W = {f : R → R : f is continuous} is a linear subspace of F(R, R).
(6) If A ∈ M2,2(R) is a 2 × 2 matrix, then ker(A) := {(x1, x2) ∈ R^2 | A (x1, x2)^T = (0, 0)^T} is a linear subspace of R^2.

(7) Let V be a vector space and let S ⊆ V be a nonempty subset. The span of S is defined as
Span(S) := {r1~v1 + · · · + rn~vn | n ≥ 1, ~v1, . . . , ~vn ∈ S, and r1, . . . , rn ∈ R}.
(Note that n in the above definition is not fixed, so that Span(S) consists of all finite linear combinations of elements of S.) Then Span(S) is a linear subspace of V.
(8) For S = {(0, 1, 2), (0, 0, −1)} ⊆ R^3, try to prove that Span(S) = {(0, y, z) | y, z ∈ R are arbitrary}.
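For example (8), membership in Span(S) can also be tested numerically by looking for coefficients and checking the residual. The following sketch assumes numpy; the helper name in_span and the test vectors are ad hoc choices made for this illustration.

import numpy as np

# Columns of S_mat are the two vectors of S from example (8).
S_mat = np.column_stack([[0.0, 1.0, 2.0], [0.0, 0.0, -1.0]])

def in_span(A, w, tol=1e-10):
    # Find the best coefficients c with A c ~ w and accept w as an element
    # of the span exactly when the residual is (numerically) zero.
    c, *_ = np.linalg.lstsq(A, w, rcond=None)
    return np.linalg.norm(A @ c - w) < tol

print(in_span(S_mat, np.array([0.0, 4.0, -7.0])))  # True: of the form (0, y, z)
print(in_span(S_mat, np.array([1.0, 0.0, 0.0])))   # False: first entry nonzero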

Definition. Let V be a vector space. A collection of vectors ~v1, . . . , ~vn ∈ V is called a basis of V if the vectors ~v1, . . . , ~vn are linearly independent and Span(~v1, . . . , ~vn) = V.

Fact. A collection of vectors ~v1, . . . , ~vn ∈ V is a basis of V if and only if for every ~v ∈ V there exists a unique n-tuple of real numbers c1, . . . , cn such that c1~v1 + · · · + cn~vn = ~v.
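Concretely, in R^n the coordinates c1, . . . , cn of a vector with respect to a basis can be found by solving a linear system whose coefficient matrix has the basis vectors as columns. The sketch below assumes numpy and uses an arbitrarily chosen basis of R^3.

import numpy as np

# Columns of A are the basis vectors (1,0,0), (1,1,0), (1,1,1) of R^3.
A = np.column_stack([[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [1.0, 1.0, 1.0]])
v = np.array([2.0, 3.0, 4.0])

c = np.linalg.solve(A, v)   # unique coordinates with c1*v1 + c2*v2 + c3*v3 = v
print(c)                    # [-1. -1.  4.]
print(A @ c)                # [2. 3. 4.], i.e. v is recovered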

Basic properties of bases:
(1) If ~v1, . . . , ~vn ∈ V and ~w1, . . . , ~wm ∈ V are bases of V then n = m. For this reason, if a vector space V admits a finite basis ~v1, . . . , ~vn ∈ V, then the number n is called the dimension of V and denoted n = dim V. If a vector space V does not admit a finite basis, we set dim V := ∞.
(2) If ~v1, . . . , ~vm ∈ V is a linearly independent collection of vectors then ~v1, . . . , ~vm is a basis of the linear subspace Span(~v1, . . . , ~vm).
(3) If dim V = n < ∞ and ~v1, . . . , ~vm ∈ V is a linearly independent collection of vectors then m ≤ n and there exist vectors ~vm+1, . . . , ~vn ∈ V such that ~v1, . . . , ~vm, ~vm+1, . . . , ~vn is a basis of V.

(4) If dim(V) = n < ∞ and W ≤ V then dim W ≤ n.

Examples:
(1) dim R^n = n and ~e1, . . . , ~en is a basis of R^n, where ~ei = (0, . . . , 1, . . . , 0) with 1 occurring in the i-th position.
(2) dim F(X, R) = |X|, the cardinality of the set X. In particular, dim F(X, R) < ∞ if and only if X is a finite set.
(3) dim Mn,n(R) = n^2.
(4) Let ~v1, . . . , ~vn ∈ R^n be n vectors in R^n and let A = [~v1 | ~v2 | . . . | ~vn] be the n × n matrix whose i-th column is the vector ~vi. Then ~v1, . . . , ~vn is a basis of R^n if and only if det(A) ≠ 0.
(5) Let ~v1, . . . , ~vn ∈ R^n be an orthogonal system of vectors such that ~vi ≠ ~0 for i = 1, . . . , n. Then the vectors ~v1, . . . , ~vn form a basis of R^n.

Definition. Let V and W be vector spaces. A function T : V → W is called a linear map if for every ~v1, ~v2 ∈ V and r1, r2 ∈ R we have

T(r1~v1 + r2~v2) = r1 T(~v1) + r2 T(~v2).
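For instance, a map given by a fixed matrix satisfies this identity. The sketch below assumes numpy; the matrix A, the vectors and the scalars are arbitrary choices made for the illustration, and a check on one choice of inputs illustrates, but of course does not prove, linearity.

import numpy as np

# T(x) = A x for a fixed 2 x 3 matrix A, so T maps R^3 to R^2.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])
T = lambda x: A @ x

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
r1, r2 = 3.0, -2.0

# The defining identity T(r1 v1 + r2 v2) = r1 T(v1) + r2 T(v2).
print(np.allclose(T(r1 * v1 + r2 * v2), r1 * T(v1) + r2 * T(v2)))  # True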

Basic facts:
(1) If T : V → W is a linear map then T(~0) = ~0.
(2) If T : V → W is a linear map then ker(T) := {~v ∈ V | T(~v) = ~0} is a linear subspace of V.
(3) If T : V → W is a linear map then T(V) = {T(~v) | ~v ∈ V} is a linear subspace of W.
(4) Let V, W be vector spaces and let ~v1, . . . , ~vn be a basis of V. Let T, S : V → W be linear maps such that T(~vi) = S(~vi) for i = 1, . . . , n. Then T = S as functions, that is, T(~v) = S(~v) for all ~v ∈ V.
(5) Let V, W be vector spaces, let ~v1, . . . , ~vn be a basis of V, and let ~w1, . . . , ~wn ∈ W be arbitrary. Then there exists a unique linear map T : V → W such that T(~vi) = ~wi for i = 1, . . . , n.

Examples.
(1) The function T : R^2 → R given by the formula T(x, y) = 3x − 5y is a linear map.
(2) The function T : R^2 → R given by the formula T(x, y) = 3x − 5y + 4 is not a linear map.
(3) Consider the function T : F(R, R) → R^2 given by T(f) = (f(0) + 3f(1), −5f(20)), where f : R → R is arbitrary. Then T is a linear map.
(4) Consider the function T : M2,2(R) → R^3 given by
T([x1 x2; x3 x4]) = (x1 − 2x3, 2x2 + 5x1, x1 + x2 + 4x3).
Then T is a linear map.

(5) Let A be an m × n matrix with entries in R. Consider the map T : R^n → R^m given by T(~x) := A~x (where we think of elements of R^n and of R^m as column vectors and where the right-hand side of the preceding formula refers to the matrix product). Then T is a linear map.
(6) Let ~w ∈ R^n be an arbitrary fixed vector. Consider the function T : R^n → R given by T(~x) := ~x · ~w. Then T is a linear map.
(7) Let ~v1, . . . , ~vn−1 ∈ R^n be n − 1 arbitrary fixed vectors. Consider the map T : R^n → R given by T(~x) = det[~v1 | . . . | ~vn−1 | ~x] for ~x ∈ R^n. Then T is a linear map.
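Example (7) can likewise be spot-checked numerically. The sketch below assumes numpy; the fixed vectors and the test vectors are arbitrary choices made for this illustration.

import numpy as np

# Two fixed vectors in R^3; T(x) = det[v1 | v2 | x].
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
T = lambda x: np.linalg.det(np.column_stack([v1, v2, x]))

x = np.array([1.0, 1.0, 0.0])
y = np.array([0.0, 2.0, 5.0])

# Linearity of T in its argument (it occupies the last column of the determinant).
print(np.isclose(T(3.0 * x - 2.0 * y), 3.0 * T(x) - 2.0 * T(y)))  # True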