
Introduction to Tensors

Contravariant and covariant vectors

Rotation in 2-space:
x' = cos θ x + sin θ y
y' = -sin θ x + cos θ y

To facilitate generalization, replace (x, y) with (x^1, x^2)

Prototype contravariant vector: dr = (dx^1, dx^2)

dx'^1 = cos θ dx^1 + sin θ dx^2

Similarly for dx'^2: dx'^2 = -sin θ dx^1 + cos θ dx^2

Same holds for r, since transformation is linear. 2 Compact notation:

(generalizes to any transformation in a space of any dimension)

Contravariant vector: A'^i = (∂x'^i/∂x^j) A^j
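As a quick sanity check of this rule, the following sketch (illustrative only; the angle, point, and displacement are arbitrary choices, and NumPy is assumed) verifies numerically that the components of dr transform with the same matrix that rotates the coordinates.

```python
import numpy as np

# Rotation matrix for x'^i = R[i][j] x^j (2-space, arbitrary angle)
theta = 0.3
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

x  = np.array([2.0, 1.0])      # a point (x^1, x^2)
dx = np.array([1e-6, -2e-6])   # a small displacement (dx^1, dx^2)

# Transform the two endpoints and take the difference ...
dx_prime_direct = R @ (x + dx) - R @ x
# ... and compare with the contravariant rule dx'^i = (dx'^i/dx^j) dx^j = R dx
dx_prime_rule = R @ dx

print(np.allclose(dx_prime_direct, dx_prime_rule))   # True
```

Because the transformation is linear, the same check works for finite displacements and for r itself.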

Now consider a scalar field φ(r): How does ∇φ transform under rotations?

∂φ/∂x'^i = (∂x^j/∂x'^i) ∂φ/∂x^j, so ∂x^j/∂x'^i appears rather than ∂x'^i/∂x^j.

For rotations in Euclidean n-space:

∂x'^i/∂x^j = ∂x^j/∂x'^i = cos θ_ij, where θ_ij = angle between the x'^i and x^j axes
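The following sketch (illustrative, not from the notes; the field φ, the sample point, and the shear matrix S are ad hoc choices) checks numerically that the gradient transforms with ∂x^j/∂x'^i, i.e. the inverse transpose of the coordinate-transformation matrix, and that this coincides with the matrix itself only when the transformation is a rotation.

```python
import numpy as np

def grad_transforms(P):
    """P is the matrix of a linear transformation x'^i = P[i][j] x^j."""
    phi = lambda x: x[0]**2 + 3.0 * x[0] * x[1]       # an arbitrary scalar field
    x   = np.array([1.3, -0.4])
    xp  = P @ x                                        # the same point, new coords

    grad_old = np.array([2*x[0] + 3*x[1], 3*x[0]])     # d(phi)/dx^j at x
    # gradient rule: d(phi)/dx'^i = (dx^j/dx'^i) d(phi)/dx^j = (P^-1)^T grad_old
    grad_new_rule = np.linalg.inv(P).T @ grad_old

    # numerical gradient wrt the new coordinates, for comparison
    eps   = 1e-6
    phi_p = lambda y: phi(np.linalg.solve(P, y))       # phi as a function of x'
    grad_new_num = np.array([(phi_p(xp + eps*e) - phi_p(xp - eps*e)) / (2*eps)
                             for e in np.eye(2)])

    rule_holds = np.allclose(grad_new_rule, grad_new_num, atol=1e-5)
    same_as_P  = np.allclose(np.linalg.inv(P).T, P)
    return rule_holds, same_as_P

theta = 0.3
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])        # a rotation
S = np.array([[2.0, 1.0],
              [0.0, 1.0]])                             # a non-orthogonal linear map

print(grad_transforms(R))   # (True, True):  rule holds and (P^-1)^T equals R
print(grad_transforms(S))   # (True, False): rule still holds, but differs from S
```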

It is not the case for all spaces and transformations that ∂x^j/∂x'^i = ∂x'^i/∂x^j, so we define a new type of vector that transforms like the gradient:

Covariant vector: A'_i = (∂x^j/∂x'^i) A_j

Explicit demonstration for rotations in Euclidean 2-space: there ∂x^j/∂x'^i = ∂x'^i/∂x^j = cos θ_ij, so covariant and contravariant components transform identically.

What about vectors in Minkowski space?

There the transformation (a Lorentz transformation) is again linear, but ∂x'^i/∂x^j ≠ ∂x^j/∂x'^i => contravariant and covariant vectors are different!

Recap (for arbitrary space and transformation)

Contravariant vector: A'^i = (∂x'^i/∂x^j) A^j

Covariant vector: A'_i = (∂x^j/∂x'^i) A_j

For future convenience, define new notation for partial derivatives: ∂_i ≡ ∂/∂x^i, ∂'_i ≡ ∂/∂x'^i

Note: ∂_j x^i = ∂x^i/∂x^j = δ^i_j = 1 if i = j, 0 if i ≠ j

Tensors

Consider an N-dimensional space (with arbitrary geometry) and an object with components T^{i...k}_{l...n} in the x coord system and T'^{i...k}_{l...n} in the x' coord system.

This object is a mixed tensor, contravariant in i...k and covariant in l...n, under the coord transformation x -> x' if

T'^{i...k}_{l...n} = (∂x'^i/∂x^p) ... (∂x'^k/∂x^q) (∂x^r/∂x'^l) ... (∂x^s/∂x'^n) T^{p...q}_{r...s}

Rank of tensor, M = number of indices.
Total number of components = N^M (e.g., a second-rank tensor in 4 dimensions has 4^2 = 16 components).
Vectors are first-rank tensors and scalars are zero-rank tensors.

If the space is Euclidean N-space and the transformation is a rotation of Cartesian coords, then the tensor is called a "Cartesian tensor".
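A minimal numerical check of the transformation law for a second-rank contravariant Cartesian tensor (illustrative; the tensor is built as an outer product of two arbitrary vectors just to have concrete components to transform):

```python
import numpy as np

theta = 0.4
R = np.array([[ np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])       # rotation of Cartesian coords

A = np.array([1.0, 2.0])
B = np.array([-0.5, 3.0])
T = np.outer(A, B)                                    # T^{ij} = A^i B^j

# Transformation law: one factor of the rotation matrix per index
T_prime_law    = np.einsum('ik,jl,kl->ij', R, R, T)   # T'^{ij} = R^i_k R^j_l T^{kl}
# Alternatively, transform the two vectors first and rebuild the outer product
T_prime_direct = np.outer(R @ A, R @ B)

print(np.allclose(T_prime_law, T_prime_direct))       # True
```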

In Minkowski space and under Poincaré transformations, tensors are called "Lorentz tensors", or "4-tensors".

Zero tensor 0 has all its components zero in all coord systems.

Main theorem of tensor analysis: If two tensors of the same type have all their components equal in one coord system, then their components are equal in all coord systems.

Einstein's summation convention: repeated upper and lower indices => summation

e.g.: A^i B_i ≡ Σ_i A^i B_i. This could also be written A^j B_j; the repeated index is a "dummy index".

Another example: A^i_{jk} B^{jk}

j and k are dummy indices; i is a “free index”

Summation convention also employed with ∂_i, etc.
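The convention maps directly onto np.einsum index strings, as in this illustrative sketch (the arrays are arbitrary; a 3-dimensional space is assumed):

```python
import numpy as np

# A repeated index is summed (dummy); an unrepeated index survives (free).
A = np.random.rand(3)
B = np.random.rand(3)
T = np.random.rand(3, 3, 3)

s = np.einsum('i,i->', A, B)          # A^i B_i : dummy index i, scalar result
v = np.einsum('ijk,j,k->i', T, A, B)  # j, k dummy; i free -> a vector

print(s, v)
```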

Example of a second-rank (mixed) tensor: the Kronecker delta δ^i_j

SP 5.1-2

Tensor algebra (operations for making new tensors from old tensors)

1. Sum of two tensors: add components: C^i_j = A^i_j + B^i_j
   Proof that the sum is a tensor (for one case):
   C'^i_j = A'^i_j + B'^i_j = (∂x'^i/∂x^k)(∂x^l/∂x'^j)(A^k_l + B^k_l) = (∂x'^i/∂x^k)(∂x^l/∂x'^j) C^k_l

2. Outer product: multiply components: e.g., C^{ij}_k = A^i B^j_k (operations 2-5 are illustrated in the sketch after this list)

3. Contraction: replace one superscript and one subscript by a dummy index pair, e.g., C^{ij}_k -> C^{ij}_j

Result is a scalar if no free indices remain, e.g., A^i_i.

4. Inner product: contraction in conjunction with outer product

e.g.: A^i_j B^j

Again, the result is a scalar if no free indices remain, e.g., A^i B_i.

5. Index permutation: e.g., T^{ij} -> T^{ji}
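The sketch below (illustrative; arbitrary components in 3 dimensions, with upper/lower index placement tracked only in the comments) runs operations 2-5 through np.einsum.

```python
import numpy as np

A = np.random.rand(3, 3)     # stands in for a mixed tensor A^i_j
B = np.random.rand(3)        # stands in for a vector B^k

outer    = np.einsum('ij,k->ijk', A, B)   # outer product: C^i_j{}^k = A^i_j B^k
contract = np.einsum('ii->', A)           # contraction A^i_i: a scalar (the trace)
inner    = np.einsum('ij,j->i', A, B)     # inner product A^i_j B^j: one free index
swapped  = np.einsum('ij->ji', A)         # index permutation

print(outer.shape, contract, inner.shape, swapped.shape)
```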

SP 5.3-5

Differentiation of Tensors

Notation: A^i_{,j} ≡ ∂_j A^i; A^i_{,jk} ≡ ∂_k ∂_j A^i, etc.

IF the transformation is linear (so that the p's are all constant), then ∂A'^i/∂x'^j = (∂x'^i/∂x^k)(∂x^l/∂x'^j) ∂A^k/∂x^l, i.e., the derivative transforms as a mixed tensor.

=> derivative of a tensor wrt a coordinate is a tensor only for linear transformations (like rotations and LTs)
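The step behind this conclusion is the chain rule applied to the transformation law of a contravariant vector; a reconstructed sketch (not verbatim from the slides):

```latex
% Differentiate A'^i = (\partial x'^i/\partial x^k) A^k with respect to x'^j:
\begin{aligned}
\frac{\partial A'^i}{\partial x'^j}
  &= \frac{\partial}{\partial x'^j}\!\left(\frac{\partial x'^i}{\partial x^k}\,A^k\right)\\
  &= \underbrace{\frac{\partial x'^i}{\partial x^k}\,
                 \frac{\partial x^l}{\partial x'^j}\,
                 \frac{\partial A^k}{\partial x^l}}_{\text{tensor transformation law}}
   \;+\;
     \underbrace{\frac{\partial^2 x'^i}{\partial x^l\,\partial x^k}\,
                 \frac{\partial x^l}{\partial x'^j}\,A^k}_{\text{vanishes only if the } p\text{'s are constant}} .
\end{aligned}
```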

Similarly, differentiation wrt a scalar (e.g., the proper time τ) yields a tensor for linear transformations.

Now specialize to Riemannian spaces,

characterized by a metric g_{ij} with ds^2 = g_{ij} dx^i dx^j

Assume g_{ij} is symmetric: g_{ij} = g_{ji} (no loss of generality, since g_{ij} and g_{ji} only appear in pairs in ds^2)

If ds^2 > 0 for all nonzero dx^i, then the space is "strictly Riemannian" (e.g., Euclidean N-space)

Otherwise, the space is "pseudo-Riemannian" (e.g., Minkowski space)

g_{ij} is called the "metric tensor".
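A small numerical sketch (an illustration not taken from the notes; the Euclidean plane in polar coordinates (r, φ) is assumed, with g = diag(1, r^2)): the line element ds^2 = g_ij dx^i dx^j reproduces the ordinary Cartesian distance for small displacements.

```python
import numpy as np

def line_element_sq(x, dx):
    """ds^2 = g_ij dx^i dx^j for the polar-coordinate metric g = diag(1, r^2)."""
    r, _phi = x
    g = np.diag([1.0, r**2])      # metric components at the point x
    return dx @ g @ dx

x  = np.array([2.0, 0.5])         # a point (r, phi)
dx = np.array([1e-3, 2e-3])       # a small displacement (dr, dphi)

# Compare with the squared Cartesian distance between the two endpoints
to_cart = lambda p: np.array([p[0]*np.cos(p[1]), p[0]*np.sin(p[1])])
ds2_cart = np.sum((to_cart(x + dx) - to_cart(x))**2)

print(line_element_sq(x, dx), ds2_cart)   # agree to higher order in dx
```

Here g depends on the coordinate r, illustrating that the metric components can vary from point to point.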

Note that the metric tensor may be a function of position in the space.

Proof that g_{ij} is a tensor:

ds^2 is a scalar, so g'_{k'l'} dx'^{k'} dx'^{l'} = g_{ij} dx^i dx^j, with dx^i = (∂x^i/∂x'^{k'}) dx'^{k'} (since dx^i is a vector)

= g_{ij} (∂x^i/∂x'^{k'}) (∂x^j/∂x'^{l'}) dx'^{k'} dx'^{l'} (2 sets of dummy indices)

=> [ g'_{k'l'} - (∂x^i/∂x'^{k'}) (∂x^j/∂x'^{l'}) g_{ij} ] dx'^{k'} dx'^{l'} = 0

It's tempting to divide by dx'^{k'} dx'^{l'} and conclude g'_{k'l'} = (∂x^i/∂x'^{k'}) (∂x^j/∂x'^{l'}) g_{ij}.

But there's a double sum over k' and l', so this isn't possible.

Instead, suppose dx'^{i'} = 1 if i' = 1, and 0 otherwise

=> g'_{11} = (∂x^i/∂x'^1) (∂x^j/∂x'^1) g_{ij}. Similarly for g'_{22}, etc.

Now suppose dx'^{i'} = 1 if i' = 1 or 2, and 0 otherwise

The only contributing terms are k'=1, l'=1; k'=1, l'=2; k'=2, l'=1; k'=2, l'=2, and the first and last of these vanish by the previous results, leaving

g'_{12} + g'_{21} = [ (∂x^i/∂x'^1)(∂x^j/∂x'^2) + (∂x^i/∂x'^2)(∂x^j/∂x'^1) ] g_{ij}

The left side equals 2 g'_{12} since the metric is symmetric; the two terms on the right are equal since i and j are dummy indices (relabel them and use the symmetry of g_{ij}).

=> g'_{12} = (∂x^i/∂x'^1)(∂x^j/∂x'^2) g_{ij}. Similarly for all components, so g_{ij} transforms as a (second-rank covariant) tensor.

General definition of the scalar product: A·B = g_{ij} A^i B^j
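A numerical check of this definition (illustrative; the Euclidean plane compared in Cartesian and polar coordinates is an assumed example): g_ij A^i B^j comes out the same in both coordinate systems.

```python
import numpy as np

r, phi = 2.0, 0.7
# Jacobian d(x,y)/d(r,phi): columns are the derivatives wrt r and phi
J = np.array([[np.cos(phi), -r*np.sin(phi)],
              [np.sin(phi),  r*np.cos(phi)]])

A_cart = np.array([1.0, 2.0])          # contravariant components, Cartesian coords
B_cart = np.array([-0.5, 1.5])
g_cart = np.eye(2)                     # Euclidean metric in Cartesian coords

# Contravariant components transform with the inverse Jacobian: A'^i = (dx'^i/dx^j) A^j
A_pol = np.linalg.solve(J, A_cart)
B_pol = np.linalg.solve(J, B_cart)
# Covariant metric transforms with two factors of the forward Jacobian
g_pol = J.T @ g_cart @ J               # g'_{ij} = (dx^k/dx'^i)(dx^l/dx'^j) g_kl

print(np.isclose(A_cart @ g_cart @ B_cart, A_pol @ g_pol @ B_pol))   # True
```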

Define g^{ij} as the inverse of g_{ij}: g^{ik} g_{kj} = δ^i_j

g^{ij} is also a tensor, since applying the tensor transformation law to it yields

g'^{ik} g'_{kj} = δ^i_j,

which defines g'^{ij} as the inverse of g'_{ij}.

Raising and lowering of indices: another tensor-algebraic operation, defined for Riemannian spaces = inner product of a tensor with the metric tensor, e.g.: A_i = g_{ij} A^j ; A^i = g^{ij} A_j
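Continuing the polar-coordinate example (an assumption for illustration), a short check of the inverse metric and of raising and lowering an index:

```python
import numpy as np

r = 2.0
g     = np.diag([1.0, r**2])          # g_ij at a point with this r
g_inv = np.linalg.inv(g)              # g^{ij}: the matrix inverse

print(np.allclose(g_inv @ g, np.eye(2)))    # g^{ik} g_{kj} = delta^i_j

A_up   = np.array([3.0, 0.5])               # A^i
A_down = np.einsum('ij,j->i', g, A_up)      # A_i = g_ij A^j   (lowering)
print(np.einsum('ij,j->i', g_inv, A_down))  # raising recovers A^i
```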

Note: covariant and contravariant indices must be staggered when raising and lowering is anticipated.

4-tensors

In all coord systems in Minkowski space the metric has the same constant components: g_{μν} = η_{μν}

=> indices can be raised and lowered with η_{μν} and its inverse η^{μν} in every frame,

e.g.: A_μ = η_{μν} A^ν

Under standard Lorentz transformations (boost with speed β along the x^1 axis):

p^0_0 = p^1_1 = γ,   p^0_1 = p^1_0 = -γβ,   p^2_2 = p^3_3 = 1

All the other p's are zero.

e.g.: A'^0 = γ (A^0 - β A^1),  A'^1 = γ (A^1 - β A^0),  A'^2 = A^2,  A'^3 = A^3
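A numerical sketch of these statements (conventions assumed for illustration only: c = 1, a boost along x^1, and metric signature (+,-,-,-); the notes may use a different signature): the matrix of p's preserves the Minkowski metric, and A^μ A_μ is the same in both frames.

```python
import numpy as np

beta  = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)

P = np.array([[ gamma,      -gamma*beta, 0.0, 0.0],
              [-gamma*beta,  gamma,      0.0, 0.0],
              [ 0.0,         0.0,        1.0, 0.0],
              [ 0.0,         0.0,        0.0, 1.0]])   # p^mu_nu = dx'^mu/dx^nu

eta = np.diag([1.0, -1.0, -1.0, -1.0])                 # assumed signature (+,-,-,-)

# Invariance of the metric: p^mu_rho p^nu_sigma eta_{mu nu} = eta_{rho sigma}
print(np.allclose(P.T @ eta @ P, eta))                 # True

A = np.array([5.0, 1.0, 2.0, 0.0])                     # a contravariant 4-vector A^mu
A_prime = P @ A                                        # A'^mu = p^mu_nu A^nu
# The invariant "length" A^mu A_mu is the same in both frames:
print(np.isclose(A @ eta @ A, A_prime @ eta @ A_prime))   # True
```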

SP 5.6-10