
Inner Product Spaces:

Definition: Let V be a vector space. The inner product of two vectors u, v ∈ V is a function, denoted by ⟨u, v⟩, that assigns a real number to each pair of vectors u and v. An inner product space is a vector space with an inner product function. This function has the following properties:

Properties of Inner Products:

1. ⟨u1 + u2, v⟩ = ⟨u1, v⟩ + ⟨u2, v⟩
2. ⟨ku, v⟩ = k⟨u, v⟩
3. ⟨u, v⟩ = ⟨v, u⟩
4. ⟨u, u⟩ > 0 if u ≠ 0, and ⟨u, u⟩ = 0 if u = 0
5. Properties 1 and 2 together give ⟨au1 + bu2, v⟩ = a⟨u1, v⟩ + b⟨u2, v⟩

Definition: Suppose u = (u1, u2, …, un) and v = (v1, v2, …, vn) are vectors in R^n. Then the standard inner product on R^n is defined as ⟨u, v⟩ = u1v1 + u2v2 + … + unvn.
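For instance, on R^3 the standard inner product is just the dot product. A minimal NumPy sketch (the vectors and scalar below are arbitrary) that also checks properties 1–3 above:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])
w = np.array([2.0, 0.0, -2.0])
k = 3.0

# Standard inner product on R^n: <u, v> = u1*v1 + ... + un*vn
print(np.dot(u, v))                                                # 1*4 + 2*(-1) + 3*0.5 = 3.5

# Property 1: <u + w, v> = <u, v> + <w, v>
print(np.isclose(np.dot(u + w, v), np.dot(u, v) + np.dot(w, v)))   # True

# Property 2: <k*u, v> = k*<u, v>
print(np.isclose(np.dot(k * u, v), k * np.dot(u, v)))              # True

# Property 3: symmetry <u, v> = <v, u>
print(np.isclose(np.dot(u, v), np.dot(v, u)))                      # True
```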

Theorem: If V is the vector space of all n × n matrices, then ⟨A, B⟩ = Tr(B^T A) defines an inner product on V.
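A small NumPy sketch (arbitrary 2 × 2 matrices) showing that Tr(B^T A) is the same as summing the entrywise products of A and B:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [-1.0, 2.0]])

# Matrix inner product <A, B> = Tr(B^T A)
trace_form = np.trace(B.T @ A)

# Equivalent to treating the matrices as long vectors and taking the dot product
entrywise = np.sum(A * B)

print(trace_form, entrywise)   # both 7.0
```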

The Norm: Let V be an inner product space and u ∈ V. The norm of u is defined as ||u|| = ⟨u, u⟩^(1/2), or equivalently ||u||^2 = ⟨u, u⟩.

Definition: The distance between vectors u and v in V is defined as d(u, v) = ||u − v||.

Theorem (Cauchy–Schwarz): |⟨u, v⟩| ≤ ||u|| ||v||.
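A short numerical check of the norm, the distance, and the Cauchy–Schwarz inequality (NumPy, arbitrary vectors):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([1.0, -2.0])

norm_u = np.sqrt(np.dot(u, u))        # ||u|| = <u, u>^(1/2) = 5.0
dist = np.linalg.norm(u - v)          # d(u, v) = ||u - v||

# Cauchy–Schwarz: |<u, v>| <= ||u|| ||v||
lhs = abs(np.dot(u, v))
rhs = np.linalg.norm(u) * np.linalg.norm(v)
print(norm_u, dist, lhs <= rhs)       # 5.0, ~6.32, True
```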

Orthogonal Vectors: Let S = {v1, v2, …, vn} be a set of vectors in an inner product space V. Then S is an orthogonal set if ⟨vi, vj⟩ = 0 for all i ≠ j.

Orthonormal Set: If ||vi|| = 1 for all i and ⟨vi, vj⟩ = 0 for all i ≠ j, the set is called an orthonormal set.

Theorem: Nonzero orthogonal vectors are linearly independent.

Theorem: Let V be an inner product space and S = {v1, v2, …, vn} an orthogonal basis for V. Then any vector u ∈ V can be expressed as:

u = (⟨u, v1⟩ / ||v1||^2) v1 + (⟨u, v2⟩ / ||v2||^2) v2 + … + (⟨u, vn⟩ / ||vn||^2) vn

In particular, if the vi are orthonormal vectors, u = ⟨u, v1⟩ v1 + ⟨u, v2⟩ v2 + … + ⟨u, vn⟩ vn.

Orthogonal Complement: Let V be an inner product space and W a subspace of V. The orthogonal complement of W is denoted by W⊥ and is the collection of all vectors in V that are orthogonal to every vector in W.

Properties of the Orthogonal Complement: Let V be an inner product space and W a subspace of V. Let B = {w1, …, wk} be a basis for W and B′ = {u1, …, um} a basis for the orthogonal complement W⊥. Then:

1. W ∩ W⊥ = {0}
2. The set {w1, …, wk, u1, …, um} is a basis for V, so dim(V) = dim(W) + dim(W⊥)
3. Any vector in V can be expressed as the sum of a vector in W and a vector in W⊥

Vector Projection: Let W be a subspace of an inner product space V. Suppose B = {u1, u2, …, un} is an orthogonal basis for W, and let u be any vector in V. Then the projection of u onto W is proj_W(u) = (⟨u, u1⟩ / ||u1||^2) u1 + (⟨u, u2⟩ / ||u2||^2) u2 + … + (⟨u, un⟩ / ||un||^2) un, and the component of u orthogonal to W is u − proj_W(u).
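A minimal NumPy sketch of the projection formula (the orthogonal basis u1, u2 and the vector u below are arbitrary examples):

```python
import numpy as np

# Orthogonal basis for a plane W in R^3
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])            # <u1, u2> = 0
u  = np.array([3.0, 1.0, 5.0])

# proj_W(u) = (<u, u1>/||u1||^2) u1 + (<u, u2>/||u2||^2) u2
proj = (np.dot(u, u1) / np.dot(u1, u1)) * u1 + (np.dot(u, u2) / np.dot(u2, u2)) * u2

orth = u - proj                             # component of u orthogonal to W
print(proj)                                 # [3. 1. 0.]
print(np.dot(orth, u1), np.dot(orth, u2))   # both 0 (up to round-off)
```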

Theorem: Any finite-dimensional inner product space has an orthogonal basis.

To find the orthogonal basis (the Gram–Schmidt process), set v1 = u1 and, for n ≥ 2, take vn = un − [(⟨un, v1⟩ / ||v1||^2) v1 + (⟨un, v2⟩ / ||v2||^2) v2 + … + (⟨un, vn−1⟩ / ||vn−1||^2) vn−1].
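A minimal Gram–Schmidt sketch in NumPy (it assumes the input vectors are linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthogonal list."""
    basis = []
    for u in vectors:
        v = u.astype(float)
        # Subtract the projection of u onto each previously built vi
        for w in basis:
            v = v - (np.dot(u, w) / np.dot(w, w)) * w
        basis.append(v)
    return basis

vs = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0]),
                   np.array([0.0, 1.0, 1.0])])

# Every pair of distinct output vectors is orthogonal
print(np.round([np.dot(vs[i], vs[j]) for i in range(3) for j in range(3) if i < j], 10))
```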

Reminders:

1. If u is a vector in an inner product space V and {w1, w2, …, wn} is an orthogonal basis for a subspace W of V, then proj_W(u) = (⟨u, w1⟩ / ||w1||^2) w1 + (⟨u, w2⟩ / ||w2||^2) w2 + … + (⟨u, wn⟩ / ||wn||^2) wn.
2. Let A be an m × n matrix. Then N(A) and the row space R(A) are orthogonal complements, and N(A^T) and col(A) are also orthogonal complements.
3. Any vector b in an inner product space V can be written as b = w1 + w2, where w1 ∈ W and w2 ∈ W⊥.

Approximating solutions for x in an inconsistent system: Replace b by its projection b̂ = proj_col(A)(b); then the system Ax = b̂ is consistent, and this replacement provides the best approximate solution to Ax = b.

1. Ax = b is inconsistent.
2. Ax = b̂ is consistent, where b̂ = proj_col(A)(b).
3. b̂ is a vector in col(A).
4. For the best x, b − Ax = b − b̂.
5. b − b̂ ∈ N(A^T), since b − b̂ is orthogonal to col(A) and col(A)⊥ = N(A^T).
6. b − Ax ∈ N(A^T).
7. A^T(b − Ax) = 0.
8. A^T A x = A^T b. This equation is called the normal equation associated with Ax = b; the solution to the normal equation is the least-squares solution to Ax = b (see the sketch below).
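A least-squares sketch in NumPy (the overdetermined system below is an arbitrary example); solving the normal equation A^T A x = A^T b agrees with np.linalg.lstsq:

```python
import numpy as np

# An inconsistent (overdetermined) system Ax = b
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equation: A^T A x = A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Built-in least-squares solver for comparison
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(x_normal, x_lstsq)            # same least-squares solution
print(A.T @ (b - A @ x_normal))     # residual lies in N(A^T): A^T(b - Ax) = 0
```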

COS() = (< u, v >)/(||u|| ||v||)

Theorem: Eigenvectors of a symmetric matrix that correspond to different eigenvalues are orthogonal.

Theorem: Any n × n symmetric matrix A has n orthogonal (orthonormal) eigenvectors. If an eigenvalue has two or more independent eigenvectors, apply the Gram–Schmidt algorithm to them: vn = un − [(⟨un, v1⟩ / ||v1||^2) v1 + (⟨un, v2⟩ / ||v2||^2) v2 + … + (⟨un, vn−1⟩ / ||vn−1||^2) vn−1].

Theorem: If A is an n × n symmetric matrix, then A = PDP^(-1), where the columns of P are orthonormal eigenvectors of A and D is the diagonal matrix of eigenvalues; with this choice, P has the property P^(-1) = P^T.
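A quick numerical check (NumPy, arbitrary symmetric matrix): np.linalg.eigh returns orthonormal eigenvectors as the columns of P, and P^(-1) = P^T holds:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # symmetric

eigvals, P = np.linalg.eigh(A)          # columns of P are orthonormal eigenvectors
D = np.diag(eigvals)

print(np.allclose(P.T @ P, np.eye(2)))  # True: P^(-1) = P^T
print(np.allclose(A, P @ D @ P.T))      # True: A = P D P^(-1) = P D P^T
```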

Definition: Any matrix A with the property A^(-1) = A^T (equivalently, AA^T = I) is called an orthogonal matrix.

Definition: Any diagonalizable matrix whose diagonalizing matrix is an orthogonal matrix is said to be orthogonally diagonalizable.

Theorem: Let A be a symmetric matrix and v1, v2, …, vn orthonormal eigenvectors associated with eigenvalues λ1, λ2, …, λn. Then A = λ1 v1 v1^T + λ2 v2 v2^T + … + λn vn vn^T.

Here all vi are column vectors.
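The same idea numerically: the matrix can be rebuilt term by term from A = λ1 v1 v1^T + … + λn vn vn^T (NumPy sketch, arbitrary symmetric matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])                 # symmetric

eigvals, P = np.linalg.eigh(A)                  # orthonormal eigenvectors in the columns

# A = lambda_1 v1 v1^T + lambda_2 v2 v2^T + ...  (vi are column vectors)
A_rebuilt = sum(lam * np.outer(P[:, i], P[:, i]) for i, lam in enumerate(eigvals))

print(np.allclose(A, A_rebuilt))                # True
```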

Definition: A quadratic form is a polynomial of the form Q(X) = X^T A X, where X = (x1, x2, …, xn)^T and A is a symmetric matrix. Then

Q(x1, x2, …, xn) = a11 x1^2 + a22 x2^2 + … + ann xn^2 + 2 Σ_{i<j} aij xi xj

Theorem: Any quadratic form can be written as a sum of squares. Write Q(X) = X^T A X and A = PDP^(-1), where P is orthogonal, so P^(-1) = P^T.

Let Y = P^T X; then Q(X) = Q′(Y) = Y^T D Y = λ1 y1^2 + λ2 y2^2 + … + λn yn^2.
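A sketch of diagonalizing a quadratic form (NumPy; the symmetric matrix A and the point x are arbitrary): with Y = P^T X the cross terms disappear and Q becomes a weighted sum of squares:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                    # symmetric matrix of the form
x = np.array([1.0, 3.0])

Q_x = x @ A @ x                                # Q(X) = X^T A X

eigvals, P = np.linalg.eigh(A)                 # A = P D P^T with P orthogonal
y = P.T @ x                                    # change of variables Y = P^T X

Q_y = np.sum(eigvals * y**2)                   # lambda_1 y1^2 + lambda_2 y2^2
print(Q_x, Q_y)                                # both 26.0
```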

Complex Properties:

1. Modulus: Z = a + bi ⇒ |Z| = (a^2 + b^2)^(1/2)
2. Conjugate: Z = a + bi ⇒ Z̄ = a − bi
3. Z Z̄ = (a + bi)(a − bi) = a^2 + b^2, which is a real number
4. Dividing by a complex number: multiply the numerator and denominator of the fraction by the complex conjugate of the denominator
5. Reciprocal: 1/Z = Z̄ / (Z Z̄) = Z̄ / |Z|^2
6. i^0 = 1, i^1 = i, i^2 = −1, i^3 = −i
7. Two complex numbers (a + bi) and (c + di) are equal iff a = c and b = d
8. (a + bi) + (c + di) = (a + c) + (b + d)i
9. (a + bi)(c + di) = (ac − bd) + (ad + bc)i
10. Polar form: Z = r(cos θ + i sin θ), where r = |Z|
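Python's built-in complex type and the cmath module illustrate most of these (the numbers below are arbitrary):

```python
import cmath

z = 3 + 4j
w = 1 - 2j

print(abs(z))                       # modulus |z| = (3^2 + 4^2)^(1/2) = 5.0
print(z.conjugate())                # conjugate: (3-4j)
print(z * z.conjugate())            # z times its conjugate = 9 + 16 = (25+0j), a real number
print(z / w)                        # division (performed via the conjugate of the denominator)
print(1 / z)                        # reciprocal = conjugate(z) / |z|^2
r, theta = cmath.polar(z)           # polar form z = r (cos(theta) + i sin(theta))
print(r, theta)
```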

Theorem: A complex number Z is a real number iff Z = Z̄.

Complex Inner Product Space:

Let U = (u1, u2, …, un) and V = (v1, v2, …, vn) be vectors in C^n. Then

⟨U, V⟩ = u1v̄1 + u2v̄2 + … + unv̄n, and ⟨V, U⟩ = ū1v1 + ū2v2 + … + ūnvn (the conjugate of ⟨U, V⟩).

If ⟨U, U⟩ = 0 then U = 0; equivalently, if U ≠ 0 then ⟨U, U⟩ > 0.

||U||^2 = |u1|^2 + |u2|^2 + … + |un|^2
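A NumPy sketch of the complex inner product (arbitrary vectors in C^2); note that np.vdot conjugates its first argument, so the argument order must match the convention above:

```python
import numpy as np

U = np.array([1 + 2j, 3 - 1j])
V = np.array([2 - 1j, 1j])

# <U, V> = u1*conj(v1) + u2*conj(v2)   (the convention used above)
inner = np.sum(U * np.conj(V))

# np.vdot conjugates its FIRST argument, so np.vdot(V, U) matches this convention
print(inner, np.vdot(V, U))             # both (-1+2j)

# ||U||^2 = |u1|^2 + |u2|^2 is always real and non-negative
print(np.sum(U * np.conj(U)).real)      # 1 + 4 + 9 + 1 = 15.0
```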

Key Definitions:

1. Symmetric matrix: A^T = A
2. Skew-symmetric matrix: A^T = −A
3. Orthogonal matrix: A^(-1) = A^T
4. Hermitian matrix: A* = A
5. Skew-Hermitian matrix: A* = −A
6. Unitary matrix: A* = A^(-1)
7. Normal matrix: AA* = A*A
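These definitions translate directly into small NumPy checks (the matrices below are arbitrary examples):

```python
import numpy as np

H = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])                   # Hermitian: H* = H
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)          # unitary: U* = U^(-1)

def star(M):
    """Conjugate transpose M*."""
    return M.conj().T

print(np.allclose(star(H), H))                # True: Hermitian
print(np.allclose(star(U) @ U, np.eye(2)))    # True: unitary
print(np.allclose(H @ star(H), star(H) @ H))  # True: Hermitian matrices are also normal
```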