Tensors

o Contravariant and covariant tensors
o Einstein notation
o Interpretation

What is a Tensor?
! It is the generalization of a:
  – Matrix (an operator mapping input to output) to >2 dimensions
  – Vector (data) to >1 dimension
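As a concrete illustration of this generalization (a sketch using NumPy, not part of the original slides): each step up in rank adds one more index to the same kind of object.

```python
import numpy as np

# Rank 0: a scalar
s = np.array(3.0)
# Rank 1: a vector (data extended to one dimension)
v = np.array([1.0, 2.0, 3.0])
# Rank 2: a matrix (an operator mapping input vectors to output vectors)
A = np.arange(6.0).reshape(2, 3)
# Rank 3: a tensor, generalizing the matrix to >2 dimensions
T = np.arange(24.0).reshape(2, 3, 4)

# ndim counts the indices needed to address one entry of each object
print(s.ndim, v.ndim, A.ndim, T.ndim)
```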
! Why do we need it?
  – It came out of differential geometry.
  – It was used by Albert Einstein to formulate the theory of General Relativity.
  – It's needed for many problems, including Deep NNs.

Contravariant and Covariant Vectors

y = Ax

! Contravariant vectors:
  – "Column vectors" that represent data
  – Vectors that describe the position of something
  – x^i for 0 ≤ i < n, with x^i ∈ ℝ (upper index: contravariant)

! Covariant vectors:
  – "Row vectors" that operate on data: a_i for 0 ≤ i < n (lower index: covariant)
  – Gradient vectors are covariant
  – x is n × 1 and a is 1 × n, so y = a x is 1 × 1

[Picture of the multiplication]

! Einstein notation:

  y = Σ_{i=0}^{n-1} a_i x^i = a_i x^i

  – Leave out the sum to make the notation cleaner
  – Always sum over any two indices that appear twice (once upper, once lower)

Vector-Matrix Products as Tensors

y = Ax   (the output index j is contravariant; the summed input index i is covariant)

! 1D contravariant vectors:
  – x^i for 0 ≤ i < n_x
  – y^j for 0 ≤ j < n_y

! 2D tensor (i.e., matrix):
  – A^j_i for 0 ≤ j < n_y and 0 ≤ i < n_x, so A is n_y × n_x
  – There is a gradient for each component y^j

[Picture of the multiplication]

! Einstein notation:

  y^j = Σ_{i=0}^{n_x-1} A^j_i x^i = A^j_i x^i

  – Leave out the sum to make the notation cleaner
  – Always sum over any two indices that appear twice

Tensor Products

! Einstein notation: sum over pairs of indices
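The contractions above can be checked numerically. A minimal sketch using NumPy's einsum (the names a, x, A follow the slides; the sizes are made-up examples, and einsum does not distinguish upper from lower indices — it simply sums any repeated label, which is exactly the Einstein convention):

```python
import numpy as np

n_x, n_y = 3, 2
x = np.array([1.0, 2.0, 3.0])          # contravariant vector x^i
a = np.array([0.5, -1.0, 2.0])         # covariant vector a_i
A = np.arange(6.0).reshape(n_y, n_x)   # 2D tensor A^j_i, shape n_y x n_x

# y = a_i x^i : the repeated index i is summed, leaving a scalar (1 x 1)
y_scalar = np.einsum('i,i->', a, x)

# y^j = A^j_i x^i : sum over the repeated index i; the free index j remains
y_vec = np.einsum('ji,i->j', A, x)

# A tensor product with no repeated index sums nothing: it keeps both
# indices and builds a 2D tensor T^i_j = x^i a_j (an outer product)
T = np.einsum('i,j->ij', x, a)

print(y_scalar)        # 0.5*1 - 1.0*2 + 2.0*3 = 4.5
print(y_vec, A @ x)    # einsum agrees with the ordinary matrix-vector product
```

Note that the subscript string makes the summation explicit: indices appearing in both inputs but not in the output are contracted, just as the convention prescribes.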