
Einstein’s Notation

Kartik Gokhale April 2021

1 Introduction

The Einstein notation or Einstein convention is a notational convention that implies summation over a set of indexed terms in a formula, thus achieving notational brevity. When an index appears twice in a term and is not otherwise defined, it is taken to be summed over all possible values of that index. For example,

$$\sum_{i=1}^{3} c_i x^i = c_1 x^1 + c_2 x^2 + c_3 x^3 \qquad (1)$$

would just be written as $c_i x^i$; the summation is implicit. According to this convention:

1. The whole term is summed over any index which appears twice in that term.
2. Any index appearing only once in a term is not summed over, and must be present on both the LHS and the RHS.
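These two rules map directly onto `numpy.einsum`, whose subscript strings follow the same convention: a repeated index is summed, a free index survives. A minimal sketch of the $c_i x^i$ example above (the component values are arbitrary):

```python
import numpy as np

# c_i x^i: the repeated index i is summed over, giving a scalar
c = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, 5.0, 6.0])

implicit_sum = np.einsum('i,i->', c, x)          # summation is implicit
explicit_sum = sum(c[i] * x[i] for i in range(3))  # the written-out sum
```

Both expressions compute $c_1 x^1 + c_2 x^2 + c_3 x^3$.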

2 Two important symbols

2.1 Kronecker Delta

$\delta_{ij}$ is the Kronecker Delta symbol. It is defined as
$$\delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$$

2.2 Levi-Civita

$\epsilon_{ijk}$ is the Levi-Civita symbol (in 3 dimensions). It is defined as

$$\epsilon_{ijk} = \begin{cases} 1 & \text{if } ijk \text{ is a cyclic permutation of } xyz \text{, e.g. } xyz, yzx, zxy \\ -1 & \text{if } ijk \text{ is an anticyclic permutation of } xyz \text{, e.g. } xzy, zyx, yxz \\ 0 & \text{if any index in } ijk \text{ repeats, e.g. } xxy, xyy \text{ etc.} \end{cases}$$
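Both symbols are easy to materialize as arrays, which makes the identities below checkable numerically; a sketch using indices 0, 1, 2 for $x, y, z$:

```python
import numpy as np

# Kronecker delta as the 3x3 identity matrix
delta = np.eye(3)

# Levi-Civita symbol as a 3x3x3 array
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # cyclic permutations of (x, y, z)
    eps[i, k, j] = -1.0  # swapping two indices flips the sign
# every entry with a repeated index stays 0
```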

3 Application to Vectors

3.1 Dot Product

Consider 2 vectors in 3-d space:
$$\vec{a} = a_1\hat{\imath} + a_2\hat{\jmath} + a_3\hat{k}, \qquad \vec{b} = b_1\hat{\imath} + b_2\hat{\jmath} + b_3\hat{k}$$
Hence, their dot product is
$$\vec{a} \cdot \vec{b} = \sum_{i=1}^{3}\sum_{j=1}^{3} a_i b_j \delta_{ij}$$
This can be written quite succinctly using Einstein's notation as
$$\vec{a} \cdot \vec{b} = a_i b_j \delta_{ij} \qquad (2)$$
as the summation over the indices is implicit.
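Equation (2) can be evaluated verbatim with `numpy.einsum`: the subscripts `i,j,ij->` sum over both repeated indices, reproducing the ordinary dot product. A small check with arbitrary component values:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
delta = np.eye(3)  # Kronecker delta

# a_i b_j delta_ij: both i and j are repeated, so both are summed
dot = np.einsum('i,j,ij->', a, b, delta)
```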

3.2 Cross Product

Consider 2 vectors again:
$$\vec{a} = a_1\hat{\imath} + a_2\hat{\jmath} + a_3\hat{k}, \qquad \vec{b} = b_1\hat{\imath} + b_2\hat{\jmath} + b_3\hat{k}$$

Hence, their cross product is
$$\vec{a} \times \vec{b} = \begin{vmatrix} \hat{\imath} & \hat{\jmath} & \hat{k} \\ a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \end{vmatrix}$$
This can be written quite succinctly using Einstein's notation as
$$\vec{a} \times \vec{b} = a_i b_j \epsilon_{ijk}\,\hat{e}_k \qquad (3)$$
where $\hat{e}_k$ is the unit vector in the $k$ direction, as the summation over the indices is implicit.
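Equation (3) likewise translates directly: contracting $a_i b_j \epsilon_{ijk}$ leaves the free index $k$, the component index of the result. A sketch using $\hat{\imath} \times \hat{\jmath} = \hat{k}$:

```python
import numpy as np

# Levi-Civita symbol: +1 for cyclic, -1 for anticyclic index orders
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0

a = np.array([1.0, 0.0, 0.0])  # unit vector along x
b = np.array([0.0, 1.0, 0.0])  # unit vector along y

# (a x b)_k = a_i b_j eps_ijk: i and j are summed, k is free
cross = np.einsum('i,j,ijk->k', a, b, eps)
```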

3.3 Product of Levi-Civitas

We can resolve the product of two Levi-Civitas (in 3-d) as
$$\epsilon_{ijk}\epsilon_{lpq} = \begin{vmatrix} \delta_{il} & \delta_{ip} & \delta_{iq} \\ \delta_{jl} & \delta_{jp} & \delta_{jq} \\ \delta_{kl} & \delta_{kp} & \delta_{kq} \end{vmatrix}$$
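Contracting the first index of this determinant gives the identity $\epsilon_{ijk}\epsilon_{ilm} = \delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl}$ used in the problems below; it can be verified by brute force over all index values:

```python
import numpy as np

eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k], eps[i, k, j] = 1.0, -1.0
delta = np.eye(3)

# eps_ijk eps_ilm, contracting over the shared first index i
lhs = np.einsum('ijk,ilm->jklm', eps, eps)
# d_jl d_km - d_jm d_kl, with all four indices free
rhs = (np.einsum('jl,km->jklm', delta, delta)
       - np.einsum('jm,kl->jklm', delta, delta))
identity_holds = np.allclose(lhs, rhs)
```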

4 Problems

4.1 BAC-CAB Rule

Consider 3 vectors $\vec{A}, \vec{B}, \vec{C}$. Our goal is to simplify

$$\vec{A} \times (\vec{B} \times \vec{C})$$

Solution: Let us define another vector $\vec{D} = \vec{B} \times \vec{C}$.

Hence, our expression simplifies to

$$\vec{A} \times (\vec{B} \times \vec{C}) = \vec{A} \times \vec{D} = a_i d_j \epsilon_{ijk}\,\hat{e}_k$$

Since $\vec{D}$ is a cross product of 2 vectors,
$$d_j = b_l c_m \epsilon_{lmj}$$

By back-substitution,
$$a_i d_j \epsilon_{ijk}\,\hat{e}_k = a_i b_l c_m \epsilon_{lmj}\epsilon_{ijk}\,\hat{e}_k$$

Recall the product of 2 Levi-Civitas:
$$\epsilon_{ijk}\epsilon_{lpq} = \begin{vmatrix} \delta_{il} & \delta_{ip} & \delta_{iq} \\ \delta_{jl} & \delta_{jp} & \delta_{jq} \\ \delta_{kl} & \delta_{kp} & \delta_{kq} \end{vmatrix}$$

Substituting values and simplifying,
$$a_i b_l c_m \epsilon_{lmj}\epsilon_{ijk}\,\hat{e}_k = a_i b_l c_m (\delta_{lk}\delta_{mi} - \delta_{li}\delta_{mk})\,\hat{e}_k = a_i b_l c_m \delta_{lk}\delta_{mi}\,\hat{e}_k - a_i b_l c_m \delta_{li}\delta_{mk}\,\hat{e}_k$$

Recall these are just dot products

$$= b_k(\vec{A} \cdot \vec{C})\,\hat{e}_k - c_k(\vec{A} \cdot \vec{B})\,\hat{e}_k = \vec{B}(\vec{A} \cdot \vec{C}) - \vec{C}(\vec{A} \cdot \vec{B})$$

Hence, we obtain the very popular result

$$\vec{A} \times (\vec{B} \times \vec{C}) = \vec{B}(\vec{A} \cdot \vec{C}) - \vec{C}(\vec{A} \cdot \vec{B}) \qquad (4)$$
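Equation (4) can be sanity-checked numerically on random vectors; a small sketch (the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3))  # three random 3-d vectors

lhs = np.cross(A, np.cross(B, C))      # A x (B x C)
rhs = B * (A @ C) - C * (A @ B)        # B(A.C) - C(A.B)
bac_cab_holds = np.allclose(lhs, rhs)
```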

4.2 More Problems

Question: The $\nabla$ behaves like a vector in "some sense". For example, the fact that $\nabla \cdot (\nabla \times \vec{A}) = 0$ for any $\vec{A}$ may suggest that it is just like $\vec{A} \cdot (\vec{B} \times \vec{C})$ being zero if any two vectors are equal. Prove that $\nabla \times (\nabla \times \vec{F}) = \nabla(\nabla \cdot \vec{F}) - \nabla^2\vec{F}$. To what extent does this look like the well-known expansion of $\vec{A} \times (\vec{B} \times \vec{C})$?

Solution: This question is very similar to the previous one. We will represent each component of $\nabla$, $\frac{\partial}{\partial x_i}$, by $\partial_i$:
$$\begin{aligned}
\nabla \times (\nabla \times \vec{F}) &= \epsilon_{ijk}\partial_j\left(\nabla \times \vec{F}\right)_k\,\hat{e}_i \\
&= \epsilon_{ijk}\epsilon_{klm}\partial_j\partial_l F_m\,\hat{e}_i \\
&= \epsilon_{kij}\epsilon_{klm}\partial_j\partial_l F_m\,\hat{e}_i \\
&= (\delta_{il}\delta_{jm} - \delta_{im}\delta_{jl})\partial_j\partial_l F_m\,\hat{e}_i && (\because \epsilon_{ijk}\epsilon_{ilm} = \delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl}) \\
&= (\partial_m\partial_i F_m - \partial_l\partial_l F_i)\,\hat{e}_i && (\because \delta_{ij}A_j = A_i) \\
&= \partial_i(\partial_m F_m)\,\hat{e}_i - \partial_l\partial_l(F_i\,\hat{e}_i) && (\because \partial_m\partial_i = \partial_i\partial_m) \\
&= \nabla(\nabla \cdot \vec{F}) - \nabla^2\vec{F} \qquad \blacksquare
\end{aligned}$$

Notice that this proof works because $\partial_m\partial_i = \partial_i\partial_m$, which is just fancy notation for $\frac{\partial^2}{\partial x\,\partial y} = \frac{\partial^2}{\partial y\,\partial x}$. We know that $\vec{A} \times (\vec{B} \times \vec{C}) = \vec{B}(\vec{A} \cdot \vec{C}) - (\vec{A} \cdot \vec{B})\vec{C}$. If we write $\vec{A} = \nabla$, $\vec{B} = \nabla$, $\vec{C} = \vec{F}$, we get $\nabla \times (\nabla \times \vec{F}) = \nabla(\nabla \cdot \vec{F}) - (\nabla \cdot \nabla)\vec{F} = \nabla(\nabla \cdot \vec{F}) - \nabla^2\vec{F}$. So this serves as a fake proof of the identity.
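The identity can also be confirmed symbolically; the sketch below (assuming SymPy is available) builds the curl from $\epsilon_{ijk}$ exactly as in the derivation and compares both sides for a generic $\vec{F}$:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = [x, y, z]
# A generic vector field F with unspecified components F0, F1, F2
F = [sp.Function(f'F{i}')(x, y, z) for i in range(3)]

def curl(V):
    # (curl V)_i = eps_ijk * dV_k/dx_j
    return [sum(sp.LeviCivita(i, j, k) * sp.diff(V[k], coords[j])
                for j in range(3) for k in range(3)) for i in range(3)]

div_F = sum(sp.diff(F[i], coords[i]) for i in range(3))
lhs = curl(curl(F))                                  # curl of curl of F
rhs = [sp.diff(div_F, coords[i])                     # grad(div F)_i
       - sum(sp.diff(F[i], c, 2) for c in coords)    # minus (laplacian F)_i
       for i in range(3)]
identity_holds = all(sp.simplify(l - r) == 0 for l, r in zip(lhs, rhs))
```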

Question: As a more involved example, show that the operator $\vec{L} = -i\,\vec{r} \times \nabla$ (where $i = \sqrt{-1}$) satisfies $\vec{L} \times \vec{L}f = i\vec{L}f$, where $f$ is an arbitrary test function. (Notice that the cross product of an operator with itself does not necessarily vanish; can you see why?)

Solution:

$$\begin{aligned}
\vec{L} \times \vec{L}f &= \epsilon_{ijk} L_j L_k f\,\hat{e}_i \\
&= -i\,\epsilon_{ijk}\epsilon_{jlm} r_l \partial_m L_k f\,\hat{e}_i \\
&= -i\,\epsilon_{jki}\epsilon_{jlm} r_l \partial_m L_k f\,\hat{e}_i \\
&= -i\,(\delta_{kl}\delta_{im} - \delta_{km}\delta_{il}) r_l \partial_m L_k f\,\hat{e}_i \\
&= -i\,(r_k \partial_i L_k f - r_i \partial_k L_k f)\,\hat{e}_i \\
&= -i\,(A_i - B_i)\,\hat{e}_i
\end{aligned}$$

$$A_i = r_k \partial_i L_k f = -i\,\epsilon_{kpq} r_k \partial_i (r_p \partial_q f), \qquad B_i = r_i \partial_k L_k f = -i\,\epsilon_{kpq} r_i \partial_k (r_p \partial_q f)$$

Now we can't directly switch the order of $r_p$ and the derivative; we have to use the product rule: $\partial_a(r_b \partial_c f) = r_b \partial_a \partial_c f + (\partial_a r_b)\partial_c f$. Hence
$$\implies A_i = \underbrace{-i\,\epsilon_{kpq} r_k r_p \partial_i \partial_q f}_{=\,0} - i\,\epsilon_{kpq} r_k (\partial_i r_p)\partial_q f = -i\,\epsilon_{kpq} r_k \delta_{ip} \partial_q f = -i\,\epsilon_{kiq} r_k \partial_q f$$
$$(\because \epsilon_{kpq} r_k r_p \partial_i \partial_q f = -\epsilon_{pkq} r_k r_p \partial_i \partial_q f = -\epsilon_{kpq} r_p r_k \partial_i \partial_q f = -\epsilon_{kpq} r_k r_p \partial_i \partial_q f \implies \epsilon_{kpq} r_k r_p \partial_i \partial_q f = 0)$$
The first term cancels because in the summation one term will have $k,p = x,y$ and another term $k,p = y,x$. Both differ only in their sign, which is due to the $\epsilon_{kpq}$, so they cancel out.
$$\implies B_i = \underbrace{-i\,\epsilon_{kpq} r_i r_p \partial_k \partial_q f}_{=\,0} - i\,\epsilon_{kpq} r_i (\partial_k r_p)\partial_q f = -i\,\epsilon_{kpq} r_i \delta_{kp} \partial_q f = -i\,\epsilon_{ppq} r_i \partial_q f = 0$$
$$(\because \epsilon_{kpq} r_i r_p \partial_k \partial_q f = -\epsilon_{qpk} r_i r_p \partial_k \partial_q f = -\epsilon_{kpq} r_i r_p \partial_q \partial_k f = -\epsilon_{kpq} r_i r_p \partial_k \partial_q f \implies \epsilon_{kpq} r_i r_p \partial_k \partial_q f = 0)$$
Similar reasoning as above is used here: the symmetric $\partial_k\partial_q$ is contracted against the antisymmetric $\epsilon_{kpq}$, so the first term vanishes.

$$\implies \vec{L} \times \vec{L}f = -\epsilon_{kiq} r_k \partial_q f\,\hat{e}_i = \epsilon_{ikq} r_k \partial_q f\,\hat{e}_i = i\vec{L}f \qquad \blacksquare$$

Notice that we have used $\partial_a r_b = \delta_{ab}$. This is because $r_a$ is simply $x$ if $a = x$, $y$ if $a = y$, and so on; if we differentiate that with respect to $b = x$ or $y$ or $z$, the result is 1 if $a = b$ and 0 otherwise, because $\frac{\partial}{\partial x}x = 1$ and $\frac{\partial}{\partial x}y = 0$. The cross product of the operator with itself doesn't vanish here because the operators act on different objects: one $\vec{L}$ acts on $\vec{L}f$, and the other on $f$.
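This operator identity can also be checked symbolically; a sketch (assuming SymPy) that applies $L_i = -i\,\epsilon_{ipq} r_p \partial_q$ twice to a generic test function $f$:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
r = [x, y, z]
f = sp.Function('f')(x, y, z)  # arbitrary test function

def L(g, i):
    # i-th component of L g, with L = -i (r x grad)
    return -sp.I * sum(sp.LeviCivita(i, p, q) * r[p] * sp.diff(g, r[q])
                       for p in range(3) for q in range(3))

# (L x L f)_i = eps_ijk L_j (L_k f): the inner L acts on f,
# the outer L acts on the whole of L_k f (hence no cancellation)
lhs = [sum(sp.LeviCivita(i, j, k) * L(L(f, k), j)
           for j in range(3) for k in range(3)) for i in range(3)]
rhs = [sp.I * L(f, i) for i in range(3)]  # (i L f)_i
operator_identity_holds = all(sp.simplify(a - b) == 0
                              for a, b in zip(lhs, rhs))
```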
