Orthogonal Projections and Orthonormal Bases
• The length (a.k.a. magnitude) of a vector ~v = (v1, . . . , vn) is |~v| = √(v1^2 + ··· + vn^2). Notice this is equivalent to |~v| = √(~v · ~v).
• The dot product of vectors ~v = (v1, . . . , vn) and ~w = (w1, . . . , wn) is ~v · ~w = v1w1 + ··· + vnwn. Notice this is equivalent to the matrix multiplication [v1 ··· vn][w1 ; . . . ; wn], a row vector times a column vector.
• The dot product satisfies ~v · ~w = |~v| |~w| cosθ, where θ is the angle between the vectors.
• The vector projection of ~v onto ~w is proj_~w(~v) = ((~v · ~w)/|~w|^2) ~w.
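These definitions translate directly into a few lines of plain Python; a minimal sketch (the helper names `dot`, `length`, and `proj` are just for illustration):

```python
import math

def dot(v, w):
    """Dot product: v . w = v1*w1 + ... + vn*wn."""
    return sum(vi * wi for vi, wi in zip(v, w))

def length(v):
    """Length |v| = sqrt(v . v)."""
    return math.sqrt(dot(v, v))

def proj(v, w):
    """Vector projection of v onto w: ((v . w) / |w|^2) w."""
    c = dot(v, w) / dot(w, w)
    return [c * wi for wi in w]

print(dot([2, 3], [5, 1]))   # 13
print(length([3, 4]))        # 5.0
print(proj([2, 3], [5, 1]))  # [2.5, 0.5]
```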
1. Consider ~v = (2, 3) and ~w = (5, 1).
(a) Draw ~v and ~w.
(b) Find |~v| and |~w|.
(c) Find the angle between ~v and ~w.
(d) Find a unit vector (a vector of length 1) in the direction of ~w.
(e) Find the vector projection of ~v onto ~w. Draw it on your picture from (a).
(f) Write ~v as the sum of a vector parallel to ~w and a vector perpendicular to ~w.
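If you want to check your answers to #1 numerically, here is a rough sketch (the inline helpers `dot` and `norm` are just for illustration):

```python
import math

v, w = [2, 3], [5, 1]
dot = lambda a, b: sum(x * y for x, y in zip(a, b))
norm = lambda a: math.sqrt(dot(a, a))

# (b) lengths: sqrt(13) and sqrt(26)
print(norm(v), norm(w))

# (c) angle via v . w = |v||w| cos(theta)
theta = math.acos(dot(v, w) / (norm(v) * norm(w)))
print(math.degrees(theta))          # ≈ 45 degrees

# (d) unit vector in the direction of w
u = [x / norm(w) for x in w]

# (e) projection of v onto w
c = dot(v, w) / dot(w, w)
p = [c * x for x in w]
print(p)                            # [2.5, 0.5]

# (f) v = p + (v - p), where v - p is perpendicular to w
perp = [a - b for a, b in zip(v, p)]
print(dot(perp, w))                 # 0.0
```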
2. Find the angle between (1, 2, 1) and (1, −1, 1).
Two vectors ~v and ~w are orthogonal (a.k.a. perpendicular) if ~v · ~w = 0.
Suppose B = {~v1, . . . , ~vn} is a basis of a linear space V. B is called an orthogonal basis if the vectors in B are all orthogonal to each other. B is called an orthonormal basis if the vectors in B are all orthogonal to each other and all have length 1.
3. Let V be the plane 2x + 2y + z = 0, ~u1 = (1/3, −2/3, 2/3), and ~u2 = (−2/3, 1/3, 2/3).
Verify that {~u1, ~u2} is an orthonormal basis of V .
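A quick way to double-check this verification by machine, using Python's exact `fractions` arithmetic to avoid rounding:

```python
from fractions import Fraction as F

u1 = [F(1, 3), F(-2, 3), F(2, 3)]
u2 = [F(-2, 3), F(1, 3), F(2, 3)]
dot = lambda a, b: sum(x * y for x, y in zip(a, b))

# Both vectors lie in the plane 2x + 2y + z = 0:
print(2*u1[0] + 2*u1[1] + u1[2])              # 0
print(2*u2[0] + 2*u2[1] + u2[2])              # 0

# Unit length and mutually orthogonal:
print(dot(u1, u1), dot(u2, u2), dot(u1, u2))  # 1 1 0
```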
Suppose V is a subspace of R^n. For any vector ~x in R^n, we can write ~x = ~x‖ + ~x⊥, where ~x‖ is in V and ~x⊥ is orthogonal to every vector in V.
~x‖ is called the orthogonal projection of ~x onto V and is denoted projV(~x).
4. Let V be a subspace of R^n. Suppose we have an orthonormal basis {~u1, . . . , ~um} of V.
Let’s find a formula for projV(~x).
(a) Explain why projV (~x) can be written as projV (~x) = c1~u1 +···+cm~um for some scalars c1, . . . , cm.
(b) Since ~x = ~x‖ + ~x⊥, we can use (a) to write
~x = (c1~u1 + ··· + cm~um) + ~x⊥.
Express the coefficient ck in terms of ~x,~u1, . . . , ~um.
(c) Write a formula for projV (~x) in terms of ~x,~u1, . . . , ~um.
(d) In coming up with this formula for projV (~x), where was it important that (~u1, . . . , ~um) be an orthonormal basis of V ?
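Once you have the formula from (c) (it should come out to a sum of (~x · ~uk)~uk terms), it can be sketched in a few lines of Python (the function name `project` is just for illustration):

```python
def project(x, basis):
    """Orthogonal projection of x onto span(basis),
    assuming `basis` is an ORTHONORMAL list of vectors."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    out = [0.0] * len(x)
    for u in basis:
        c = dot(x, u)                              # c_k = x . u_k
        out = [o + c * ui for o, ui in zip(out, u)]
    return out

# Sanity check: project onto the xy-plane in R^3
# using the standard (orthonormal) basis vectors e1, e2.
print(project([3, 4, 5], [[1, 0, 0], [0, 1, 0]]))  # [3.0, 4.0, 0.0]
```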
5. Let’s revisit the plane from #3. V is the plane 2x + 2y + z = 0, ~u1 = (1/3, −2/3, 2/3), and ~u2 = (−2/3, 1/3, 2/3). We already saw that {~u1, ~u2} is an orthonormal basis of V.
Let ~x = (1, 4, 8).
(a) Find projV (~x).
(b) Write ~x as the sum of a vector in V and a vector orthogonal to V .
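To check (a) and (b) in exact arithmetic:

```python
from fractions import Fraction as F

u1 = [F(1, 3), F(-2, 3), F(2, 3)]
u2 = [F(-2, 3), F(1, 3), F(2, 3)]
x  = [F(1), F(4), F(8)]
dot = lambda a, b: sum(p * q for p, q in zip(a, b))

c1, c2 = dot(x, u1), dot(x, u2)               # coefficients x.u1 and x.u2
p = [c1*a + c2*b for a, b in zip(u1, u2)]     # proj_V(x) = c1*u1 + c2*u2
perp = [xi - pi for xi, pi in zip(x, p)]      # the part orthogonal to V

print([int(v) for v in p])     # [-3, 0, 6]
print([int(v) for v in perp])  # [4, 4, 2] -- parallel to the normal (2, 2, 1)
```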
The transpose of a matrix A, denoted A^T, is the matrix whose jth row is the jth column of A, and vice versa. For example, if A = [1 2 3 ; 4 5 6] (a 2×3 matrix), then A^T = [1 4 ; 2 5 ; 3 6].
Let {~v1,~v2, ...,~vn} be an orthonormal basis of V , and Q be the matrix containing the basis vectors as column vectors. Then the projection onto the space V is given by the matrix P = QQT , where QT is the transpose matrix.
(c) Compute P , the matrix that projects onto V .
(d) What is P~x? Compute it to verify your answer from (a)!
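A sketch of (c) and (d) in exact arithmetic, building P = QQ^T entrywise (P[i][j] = u1[i]u1[j] + u2[i]u2[j], since Q has ~u1, ~u2 as its columns):

```python
from fractions import Fraction as F

u1 = [F(1, 3), F(-2, 3), F(2, 3)]
u2 = [F(-2, 3), F(1, 3), F(2, 3)]

# P = Q Q^T; entry (i, j) is u1[i]*u1[j] + u2[i]*u2[j]
P = [[u1[i]*u1[j] + u2[i]*u2[j] for j in range(3)] for i in range(3)]

# Print 9P so the entries show as integers:
print([[int(9 * e) for e in row] for row in P])
# [[5, -4, -2], [-4, 5, -2], [-2, -2, 8]]   (i.e. P = (1/9) * this matrix)

x = [F(1), F(4), F(8)]
Px = [sum(P[i][j] * x[j] for j in range(3)) for i in range(3)]
print([int(v) for v in Px])   # [-3, 0, 6] -- matches proj_V(x)
```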
6. (T/F) If ~u1, . . . , ~um are orthonormal vectors in R^n, then they must be linearly independent.
Suppose we have some data points (x1, y1), . . . , (xn, yn). Denote ~x = (x1, . . . , xn) and ~y = (y1, . . . , yn).
The expectation of x is E[x] = (x1 + ··· + xn)/n.
Writing m = E[x], the centered form of ~x is X = (x1 − m, . . . , xn − m).
If X, Y are the centered forms of ~x and ~y, then:
The variance of X is Var[X] = (X · X)/n.
The covariance of X and Y is Cov[X, Y] = (X · Y)/n.
The standard deviation of X is |X|/√n.
The correlation coefficient of X and Y is (X · Y)/(|X| |Y|).
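A small numerical illustration of these definitions on made-up data (here ~y is exactly 2~x, so the correlation should come out to 1):

```python
import math

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # perfectly linear in xs
n = len(xs)

E = lambda v: sum(v) / len(v)
X = [x - E(xs) for x in xs]     # centered form of xs
Y = [y - E(ys) for y in ys]     # centered form of ys

dot = lambda a, b: sum(p * q for p, q in zip(a, b))
var = dot(X, X) / n                          # Var[X] = X.X / n
cov = dot(X, Y) / n                          # Cov[X,Y] = X.Y / n
std = math.sqrt(dot(X, X)) / math.sqrt(n)    # |X| / sqrt(n)
corr = dot(X, Y) / (math.sqrt(dot(X, X)) * math.sqrt(dot(Y, Y)))

print(var, cov)   # 1.25 2.5
print(corr)       # ≈ 1.0, since the data is perfectly linear
```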
Consider data points (x1, y1), . . . , (xn, yn), with X and Y defined as above. The line of best linear fit is given by y = ax + b, where a = Cov[X, Y]/Var[X] and b = E[y] − aE[x].
7. Suppose that we want to fit a line to the data points (−1, 3), (0, 1), and (1, 1).
(a) Do you expect the slope of the line to be positive, negative, or zero?
(b) Find the best-fit line.
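To check your answer to (b), a sketch using exact fractions (note the 1/n factors in Cov and Var cancel in the ratio a = Cov[X, Y]/Var[X]):

```python
from fractions import Fraction as F

pts = [(-1, 3), (0, 1), (1, 1)]
xs = [F(x) for x, _ in pts]
ys = [F(y) for _, y in pts]
n = len(pts)

Ex, Ey = sum(xs) / n, sum(ys) / n
X = [x - Ex for x in xs]   # centered x-data
Y = [y - Ey for y in ys]   # centered y-data
dot = lambda a, b: sum(p * q for p, q in zip(a, b))

a = dot(X, Y) / dot(X, X)  # = Cov[X,Y]/Var[X], the 1/n factors cancel
b = Ey - a * Ex
print(a, b)                # -1 5/3  -> best-fit line y = -x + 5/3
```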
8. In each part, you are given a subspace V of some R^n. Describe V⊥. (We call V⊥ the orthogonal complement of V.)
(a) y = 3x in R^2.
(b) y = 3x in R^3.
(c) span{(1, 2, 3)}.
9. Let V be an m-dimensional subspace of R^n. Consider the linear transformation projV : R^n → R^n.
(a) What is im projV ? What is rank projV ?
(b) What is ker projV ? What is its dimension?