
3. Cross product

Definition 3.1. Let $\vec{v}$ and $\vec{w}$ be two vectors in $\mathbb{R}^3$. The cross product of $\vec{v}$ and $\vec{w}$, denoted $\vec{v} \times \vec{w}$, is the vector defined as follows:
• the length of $\vec{v} \times \vec{w}$ is the area of the parallelogram with sides $\vec{v}$ and $\vec{w}$, that is, $\|\vec{v}\|\,\|\vec{w}\| \sin\theta$, where $\theta$ is the angle between $\vec{v}$ and $\vec{w}$.
• $\vec{v} \times \vec{w}$ is orthogonal to both $\vec{v}$ and $\vec{w}$.
• the three vectors $\vec{v}$, $\vec{w}$ and $\vec{v} \times \vec{w}$ form a right-handed set of vectors.
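For example, take $\vec{v} = (2, 0, 0)$ and $\vec{w} = (0, 3, 0)$ (an arbitrary pair of vectors in the $xy$-plane). The angle between them is $\pi/2$, so the parallelogram they span has area $2 \cdot 3 \cdot \sin(\pi/2) = 6$; the only directions orthogonal to both are $\pm\hat{k}$, and the right-hand rule selects $+\hat{k}$, so $\vec{v} \times \vec{w} = (0, 0, 6)$.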

Remark 3.2. The cross product only makes sense in $\mathbb{R}^3$.

Example 3.3. We have $\hat{\imath} \times \hat{\jmath} = \hat{k}$, $\hat{\jmath} \times \hat{k} = \hat{\imath}$ and $\hat{k} \times \hat{\imath} = \hat{\jmath}$. By contrast $\hat{\jmath} \times \hat{\imath} = -\hat{k}$, $\hat{k} \times \hat{\jmath} = -\hat{\imath}$ and $\hat{\imath} \times \hat{k} = -\hat{\jmath}$.

Theorem 3.4. Let $\vec{u}$, $\vec{v}$ and $\vec{w}$ be three vectors in $\mathbb{R}^3$ and let $\lambda$ be a scalar.
(1) $\vec{v} \times \vec{w} = -\vec{w} \times \vec{v}$.
(2) $\vec{u} \times (\vec{v} + \vec{w}) = \vec{u} \times \vec{v} + \vec{u} \times \vec{w}$.
(3) $(\vec{u} + \vec{v}) \times \vec{w} = \vec{u} \times \vec{w} + \vec{v} \times \vec{w}$.
(4) $\lambda(\vec{v} \times \vec{w}) = (\lambda\vec{v}) \times \vec{w} = \vec{v} \times (\lambda\vec{w})$.

Before we prove (3.4), let's draw some conclusions from these properties.

Remark 3.5. Note that (1) of (3.4) is what really distinguishes the cross product (the cross product is skew commutative). Consider computing the cross product of $\hat{\imath}$, $\hat{\imath}$ and $\hat{\jmath}$. On the one hand,
\[ (\hat{\imath} \times \hat{\imath}) \times \hat{\jmath} = \vec{0} \times \hat{\jmath} = \vec{0}. \]
On the other hand,
\[ \hat{\imath} \times (\hat{\imath} \times \hat{\jmath}) = \hat{\imath} \times \hat{k} = -\hat{\jmath}. \]
In other words, the order in which we compute the cross product is important (the cross product is not associative).

Note that if $\vec{v}$ and $\vec{w}$ are parallel, then the cross product is the zero vector. One can see this directly from the formula; the area of the parallelogram is zero and the only vector of zero length is the zero vector. On the other hand, if $\vec{v}$ and $\vec{w}$ are parallel (and $\vec{v} \neq \vec{0}$), then we know that $\vec{w} = \lambda\vec{v}$. In this case,
\begin{align*}
\vec{v} \times \vec{w} &= \vec{v} \times (\lambda\vec{v}) \\
&= \lambda\, \vec{v} \times \vec{v} \\
&= -\lambda\, \vec{v} \times \vec{v}.
\end{align*}
To get from the second to the third line, we just switched the factors. But the only vector which is equal to its own negative is the zero vector.

Let's try to compute the cross product using (3.4). If $\vec{v} = (v_1, v_2, v_3)$ and $\vec{w} = (w_1, w_2, w_3)$, then
\begin{align*}
\vec{v} \times \vec{w} &= (v_1\hat{\imath} + v_2\hat{\jmath} + v_3\hat{k}) \times (w_1\hat{\imath} + w_2\hat{\jmath} + w_3\hat{k}) \\
&= v_1w_1(\hat{\imath} \times \hat{\imath}) + v_1w_2(\hat{\imath} \times \hat{\jmath}) + v_1w_3(\hat{\imath} \times \hat{k}) \\
&\quad + v_2w_1(\hat{\jmath} \times \hat{\imath}) + v_2w_2(\hat{\jmath} \times \hat{\jmath}) + v_2w_3(\hat{\jmath} \times \hat{k}) \\
&\quad + v_3w_1(\hat{k} \times \hat{\imath}) + v_3w_2(\hat{k} \times \hat{\jmath}) + v_3w_3(\hat{k} \times \hat{k}) \\
&= (v_2w_3 - v_3w_2)\hat{\imath} + (v_3w_1 - v_1w_3)\hat{\jmath} + (v_1w_2 - v_2w_1)\hat{k}.
\end{align*}
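As a quick illustration of the formula, take $\vec{v} = (1, 2, 3)$ and $\vec{w} = (4, 5, 6)$ (any two non-parallel vectors would do). Then
\[ \vec{v} \times \vec{w} = (2 \cdot 6 - 3 \cdot 5,\; 3 \cdot 4 - 1 \cdot 6,\; 1 \cdot 5 - 2 \cdot 4) = (-3, 6, -3), \]
and one can check directly that this vector is orthogonal to both $\vec{v}$ and $\vec{w}$: the dot products $(-3) \cdot 1 + 6 \cdot 2 + (-3) \cdot 3$ and $(-3) \cdot 4 + 6 \cdot 5 + (-3) \cdot 6$ are both zero.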

Definition 3.6. A matrix $A = (a_{ij})$ is a rectangular array of scalars, where $a_{ij}$ is in the $i$th row and $j$th column. If $A$ has $m$ rows and $n$ columns, then we say that $A$ is an $m \times n$ matrix.

Example 3.7.
\[ A = \begin{pmatrix} -2 & 1 & -7 \\ 0 & 2 & -4 \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{pmatrix}, \]
is a $2 \times 3$ matrix. For example, $a_{23} = -4$.

Definition 3.8. If
\[ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \]
is a $2 \times 2$ matrix, then the determinant of $A$ is the scalar

\[ \begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc. \]
If
\[ A = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix} \]
is a $3 \times 3$ matrix, then the determinant of $A$ is the scalar

\[ \begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix} = a\begin{vmatrix} e & f \\ h & i \end{vmatrix} - b\begin{vmatrix} d & f \\ g & i \end{vmatrix} + c\begin{vmatrix} d & e \\ g & h \end{vmatrix}. \]
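For example,
\[ \begin{vmatrix} 1 & 2 & 0 \\ 3 & 1 & 2 \\ 0 & 1 & 4 \end{vmatrix} = 1\begin{vmatrix} 1 & 2 \\ 1 & 4 \end{vmatrix} - 2\begin{vmatrix} 3 & 2 \\ 0 & 4 \end{vmatrix} + 0\begin{vmatrix} 3 & 1 \\ 0 & 1 \end{vmatrix} = 1(4 - 2) - 2(12 - 0) + 0 = -22. \]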

ˆı ˆ kˆ

v1 v2 v3 .

w1 w2 w3 Let’s now turn to the proof of (3.4).
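Indeed, expanding this determinant along the first row gives
\[ \begin{vmatrix} v_2 & v_3 \\ w_2 & w_3 \end{vmatrix}\hat{\imath} - \begin{vmatrix} v_1 & v_3 \\ w_1 & w_3 \end{vmatrix}\hat{\jmath} + \begin{vmatrix} v_1 & v_2 \\ w_1 & w_2 \end{vmatrix}\hat{k} = (v_2w_3 - v_3w_2)\hat{\imath} + (v_3w_1 - v_1w_3)\hat{\jmath} + (v_1w_2 - v_2w_1)\hat{k}, \]
which is exactly the formula for $\vec{v} \times \vec{w}$ obtained above.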

Let's now turn to the proof of (3.4).

Definition 3.9. Let $\vec{u}$, $\vec{v}$ and $\vec{w}$ be three vectors in $\mathbb{R}^3$. The triple scalar product is
\[ (\vec{u} \times \vec{v}) \cdot \vec{w}. \]
The triple scalar product is the signed volume of the parallelepiped formed using the three vectors $\vec{u}$, $\vec{v}$ and $\vec{w}$. Indeed, the volume of the parallelepiped is the area of the base times the height. For the base, we take the parallelogram with sides $\vec{u}$ and $\vec{v}$. The magnitude of $\vec{u} \times \vec{v}$ is the area of this parallelogram. The height of the parallelepiped, up to sign, is the length of $\vec{w}$ times the cosine of the angle, let's call this $\phi$, between $\vec{u} \times \vec{v}$ and $\vec{w}$. The sign is positive if $\vec{u}$, $\vec{v}$ and $\vec{w}$ form a right-handed set and negative if they form a left-handed set.
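For example, take $\vec{u} = (1, 0, 0)$, $\vec{v} = (1, 1, 0)$ and $\vec{w} = (1, 1, 1)$ (an arbitrary right-handed choice). Then $\vec{u} \times \vec{v} = (0, 0, 1)$, so $(\vec{u} \times \vec{v}) \cdot \vec{w} = 1$: the parallelepiped with these edges has a base of area $1$ and height $1$, hence volume $1$, and the sign is positive because the three vectors form a right-handed set.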

Lemma 3.10. If $\vec{u} = (u_1, u_2, u_3)$, $\vec{v} = (v_1, v_2, v_3)$ and $\vec{w} = (w_1, w_2, w_3)$ are three vectors in $\mathbb{R}^3$, then
\[ (\vec{u} \times \vec{v}) \cdot \vec{w} = \begin{vmatrix} u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \\ w_1 & w_2 & w_3 \end{vmatrix}. \]

Proof. We have already seen that

ˆı ˆ kˆ

~u × ~v = u1 u2 u3 .

v1 v2 v3 If one expands this determinant and dots with ~w, this is the same as replacing the top row by (w1, w2, w3),

\[ (\vec{u} \times \vec{v}) \cdot \vec{w} = \begin{vmatrix} w_1 & w_2 & w_3 \\ u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \end{vmatrix}. \]
Finally, if we switch the first row and the second row, and then the second row and the third row, the sign changes twice (which makes no change at all):

\[ (\vec{u} \times \vec{v}) \cdot \vec{w} = \begin{vmatrix} u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3 \\ w_1 & w_2 & w_3 \end{vmatrix}. \qquad \square \]

Example 3.11. The triple scalar product of $\hat{\imath}$, $\hat{\jmath}$ and $\hat{k}$ is one. One way to see this is geometrically; the parallelepiped determined by these three vectors is the unit cube, which has volume $1$, and these vectors form a right-handed set, so that the sign is positive. Another way to see this is to compute directly
\[ (\hat{\imath} \times \hat{\jmath}) \cdot \hat{k} = \hat{k} \cdot \hat{k} = 1. \]
Finally, one can use (3.10),

\[ (\hat{\imath} \times \hat{\jmath}) \cdot \hat{k} = \begin{vmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{vmatrix} = 1. \]

Lemma 3.12. Let $\vec{u}$, $\vec{v}$ and $\vec{w}$ be three vectors in $\mathbb{R}^3$. Then
\[ (\vec{u} \times \vec{v}) \cdot \vec{w} = (\vec{v} \times \vec{w}) \cdot \vec{u} = (\vec{w} \times \vec{u}) \cdot \vec{v}. \]

Proof. In fact all three numbers have the same absolute value, namely the volume of the parallelepiped with sides $\vec{u}$, $\vec{v}$ and $\vec{w}$. On the other hand, if $\vec{u}$, $\vec{v}$ and $\vec{w}$ form a right-handed set, then so do $\vec{v}$, $\vec{w}$ and $\vec{u}$, and vice-versa, so all three numbers have the same sign as well. $\square$

Lemma 3.13. Let $\vec{v}$ and $\vec{w}$ be two vectors in $\mathbb{R}^3$. Then $\vec{v} = \vec{w}$ if and only if
\[ \vec{v} \cdot \vec{x} = \vec{w} \cdot \vec{x}, \]
for every vector $\vec{x}$ in $\mathbb{R}^3$.

Proof. One direction is clear; if $\vec{v} = \vec{w}$, then $\vec{v} \cdot \vec{x} = \vec{w} \cdot \vec{x}$ for any vector $\vec{x}$. So, suppose that we know that $\vec{v} \cdot \vec{x} = \vec{w} \cdot \vec{x}$, for every vector $\vec{x}$. Suppose that $\vec{v} = (v_1, v_2, v_3)$ and $\vec{w} = (w_1, w_2, w_3)$. If we take $\vec{x} = \hat{\imath}$, then we see that

\[ v_1 = \vec{v} \cdot \hat{\imath} = \vec{w} \cdot \hat{\imath} = w_1. \]
Similarly, if we take $\vec{x} = \hat{\jmath}$ and $\vec{x} = \hat{k}$, then we also get

\[ v_2 = \vec{v} \cdot \hat{\jmath} = \vec{w} \cdot \hat{\jmath} = w_2 \qquad \text{and} \qquad v_3 = \vec{v} \cdot \hat{k} = \vec{w} \cdot \hat{k} = w_3. \]
But then $\vec{v} = \vec{w}$ as they have the same components. $\square$

Proof of (3.4). We first prove (1). Both sides have the same magnitude, namely the area of the parallelogram with sides $\vec{v}$ and $\vec{w}$. Further, both sides are orthogonal to $\vec{v}$ and $\vec{w}$, so the only thing to check is the change in sign.

As $\vec{v}$, $\vec{w}$ and $\vec{v} \times \vec{w}$ form a right-handed triple, it follows that $\vec{w}$, $\vec{v}$ and $\vec{v} \times \vec{w}$ form a left-handed triple. But then $\vec{w}$, $\vec{v}$ and $-\vec{v} \times \vec{w}$ form a right-handed triple. It follows that $\vec{w} \times \vec{v} = -\vec{v} \times \vec{w}$. This is (1).

To check (2), we check that
\[ (\vec{u} \times (\vec{v} + \vec{w})) \cdot \vec{x} = (\vec{u} \times \vec{v} + \vec{u} \times \vec{w}) \cdot \vec{x}, \]
for an arbitrary vector $\vec{x}$. We first attack the LHS. By (3.12), we have
\begin{align*}
(\vec{u} \times (\vec{v} + \vec{w})) \cdot \vec{x} &= (\vec{x} \times \vec{u}) \cdot (\vec{v} + \vec{w}) \\
&= (\vec{x} \times \vec{u}) \cdot \vec{v} + (\vec{x} \times \vec{u}) \cdot \vec{w} \\
&= (\vec{u} \times \vec{v}) \cdot \vec{x} + (\vec{u} \times \vec{w}) \cdot \vec{x}.
\end{align*}
We now attack the RHS:
\[ (\vec{u} \times \vec{v} + \vec{u} \times \vec{w}) \cdot \vec{x} = (\vec{u} \times \vec{v}) \cdot \vec{x} + (\vec{u} \times \vec{w}) \cdot \vec{x}. \]
It follows that both sides are equal, and so (2) holds by (3.13).

We could check (3) by a similar argument. Here is another way:
\begin{align*}
(\vec{u} + \vec{v}) \times \vec{w} &= -\vec{w} \times (\vec{u} + \vec{v}) \\
&= -\vec{w} \times \vec{u} - \vec{w} \times \vec{v} \\
&= \vec{u} \times \vec{w} + \vec{v} \times \vec{w}.
\end{align*}
This is (3).

To prove (4), it suffices to prove the first equality, since the fact that the first term is equal to the third term follows by a similar derivation. If $\lambda = 0$, then both sides are the zero vector, and there is nothing to prove. So we may assume that $\lambda \neq 0$. Note first that the magnitude of both sides is the area of the parallelogram with sides $\lambda\vec{v}$ and $\vec{w}$.

If $\lambda > 0$, then $\vec{v}$ and $\lambda\vec{v}$ point in the same direction. Similarly $\vec{v} \times \vec{w}$ and $\lambda(\vec{v} \times \vec{w})$ point in the same direction. As $\vec{v}$, $\vec{w}$ and $\vec{v} \times \vec{w}$ form a right-handed set, then so do $\lambda\vec{v}$, $\vec{w}$ and $\lambda(\vec{v} \times \vec{w})$. But then $\lambda(\vec{v} \times \vec{w})$ is the cross product of $\lambda\vec{v}$ and $\vec{w}$, that is,
\[ (\lambda\vec{v}) \times \vec{w} = \lambda(\vec{v} \times \vec{w}). \]
If $\lambda < 0$, then $\vec{v}$ and $\lambda\vec{v}$ point in opposite directions. Similarly $\vec{v} \times \vec{w}$ and $\lambda(\vec{v} \times \vec{w})$ point in opposite directions. But then $\lambda\vec{v}$, $\vec{w}$ and $\lambda(\vec{v} \times \vec{w})$ still form a right-handed set. This is (4). $\square$
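As a concrete check of (1) and (2), take $\vec{u} = (1, 0, 2)$, $\vec{v} = (0, 1, 1)$ and $\vec{w} = (2, 1, 0)$ (an arbitrary choice). Then
\[ \vec{u} \times \vec{v} = (-2, -1, 1), \qquad \vec{v} \times \vec{u} = (2, 1, -1) = -\vec{u} \times \vec{v}, \]
and
\[ \vec{u} \times (\vec{v} + \vec{w}) = (1, 0, 2) \times (2, 2, 1) = (-4, 3, 2) = (-2, -1, 1) + (-2, 4, 1) = \vec{u} \times \vec{v} + \vec{u} \times \vec{w}. \]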
