
Quantum Mechanics: Completeness
Robert C. Roleda, Physics Department

Basis Expansion

We have seen in [bra-ket] that state kets may be expressed as an expansion in terms of eigenkets,

$$|\psi\rangle = \sum_n a_n |n\rangle$$

and that the expansion coefficients may be obtained from

$$a_n = \langle n|\psi\rangle$$

Inserting the second equation into the first,

$$|\psi\rangle = \sum_n \langle n|\psi\rangle\, |n\rangle$$

Completeness Relation

Since the inner product is just a number, we may move it to the right:

$$|\psi\rangle = \sum_n |n\rangle \langle n|\psi\rangle$$

Consistency requires that

$$\sum_n |n\rangle\langle n| = 1$$

That this is true implies that the eigenkets form a complete set for the expansion, thus forming a basis set. The unit summation is called the completeness relation.

Inner Product

We note that the operation

$$\langle A|B\rangle = \int A^*(x)\, B(x)\, dx$$

yields a scalar (a number). This operation is called an inner product because it is a product of two vectors that gives a lower-ordered entity (a scalar). You may recall that the dot product $\vec{A} \cdot \vec{B}$ is also called a scalar product. An inner product is a bra-ket.

Outer Product

The completeness relation

$$\sum_n |n\rangle\langle n| = 1$$

is a summation over terms that are basically ket-bras, $|n\rangle\langle n|$. More generally, if we consider a ket-bra $A = |m\rangle\langle k|$, then

$$A|n\rangle = |m\rangle\langle k|n\rangle = \langle k|n\rangle\, |m\rangle$$

A ket-bra acting on a ket therefore yields another ket. Mathematical entities such as these are called operators. Because a ket-bra is an operation on two vectors that gives rise to a mathematical object higher than a vector, it is called an outer product.

Matrices

Outer products can be better understood if we refer back to Euclidean vectors. These vectors may be expressed as matrices, whereby the basis vectors in

$$\vec{A} = \sum_{k=1}^{3} A_k \hat{e}_k$$

are expressed as

$$\hat{e}_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad \hat{e}_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad \hat{e}_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$

Since the components are just numbers, a Euclidean vector is a trio $A_k$ expressed as a column matrix,

$$\vec{A} = \sum_{k=1}^{3} A_k \hat{e}_k = \begin{pmatrix} A_1 \\ A_2 \\ A_3 \end{pmatrix}$$

Matrix Multiplication

Matrix multiplication basically involves multiplying a row of the first matrix with a column of the second matrix:

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} e & f \\ g & h \end{pmatrix} = \begin{pmatrix} ae+bg & af+bh \\ ce+dg & cf+dh \end{pmatrix}$$

Such operations can be expressed as

$$C_i^j = \sum_k A_i^k B_k^j$$

where the upper indices are the column numbers and the lower indices are the row numbers of a matrix. The summation is thus over the column number of the first matrix and the row number of the second matrix:

$$C_1^1 = A_1^1 B_1^1 + A_1^2 B_2^1$$
$$C_2^1 = A_2^1 B_1^1 + A_2^2 B_2^1$$
$$C_1^2 = A_1^1 B_1^2 + A_1^2 B_2^2$$
$$C_2^2 = A_2^1 B_1^2 + A_2^2 B_2^2$$

Demonstration

If

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \quad B = \begin{pmatrix} e & f \\ g & h \end{pmatrix}$$

then $A_1^1 = a$, $A_1^2 = b$, $A_2^1 = c$, $A_2^2 = d$, $B_1^1 = e$, $B_1^2 = f$, $B_2^1 = g$, $B_2^2 = h$, and

$$C_1^1 = A_1^1 B_1^1 + A_1^2 B_2^1 = ae + bg$$
$$C_2^1 = A_2^1 B_1^1 + A_2^2 B_2^1 = ce + dg$$
$$C_1^2 = A_1^1 B_1^2 + A_1^2 B_2^2 = af + bh$$
$$C_2^2 = A_2^1 B_1^2 + A_2^2 B_2^2 = cf + dh$$

Thus,

$$C = \begin{pmatrix} ae+bg & af+bh \\ ce+dg & cf+dh \end{pmatrix}$$

Covectors

The dual $\langle A|$ of a vector $|A\rangle$ is that object which will yield a scalar when multiplied with the vector. If the vector is presented as a column matrix,

$$|A\rangle := \begin{pmatrix} A_1 \\ A_2 \\ A_3 \end{pmatrix}$$

then its elements have only row numbers. The matrix multiplication that yields a scalar must therefore be of the form

$$\sum_i A^i A_i$$

We see then that the dual of the vector $|A\rangle$ must be an object that has only column numbers. Such an object is a row matrix,

$$\langle A| := \begin{pmatrix} A^1 & A^2 & A^3 \end{pmatrix}$$

These objects are called covectors.
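As a quick numerical check of the index formula above, the following short sketch builds $C$ entry by entry from $C_i^j = \sum_k A_i^k B_k^j$ and compares it with the built-in matrix product. It is an illustration added to these notes, not part of the original lecture; it assumes Python with NumPy, and the specific numbers standing in for $a, b, \dots, h$ are arbitrary.

import numpy as np

# Hypothetical 2x2 matrices standing in for A = [[a, b], [c, d]] and B = [[e, f], [g, h]]
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# C_i^j = sum_k A_i^k B_k^j: the lower index labels the row, the upper index the column,
# and the sum runs over the column number of A and the row number of B.
C = np.zeros((2, 2))
for i in range(2):        # row of C
    for j in range(2):    # column of C
        C[i, j] = sum(A[i, k] * B[k, j] for k in range(2))

print(C)
print(np.allclose(C, A @ B))   # True: agrees with the built-in matrix product

The explicit double loop mirrors the four component equations written out in the Demonstration; in practice one would simply write A @ B.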
Inner and Outer Products

In Euclidean space, $A^k = A_k$. Thus, the inner product

$$\langle A|A\rangle = \begin{pmatrix} A^1 & A^2 & A^3 \end{pmatrix} \begin{pmatrix} A_1 \\ A_2 \\ A_3 \end{pmatrix} = A^1 A_1 + A^2 A_2 + A^3 A_3 = A_1^2 + A_2^2 + A_3^2$$

gives a scalar. The outer product

$$|A\rangle\langle A| = \begin{pmatrix} A_1 \\ A_2 \\ A_3 \end{pmatrix} \begin{pmatrix} A^1 & A^2 & A^3 \end{pmatrix} = \begin{pmatrix} A_1 A^1 & A_1 A^2 & A_1 A^3 \\ A_2 A^1 & A_2 A^2 & A_2 A^3 \\ A_3 A^1 & A_3 A^2 & A_3 A^3 \end{pmatrix}$$

on the other hand yields a square matrix.

Transpose

The operation of interchanging row and column numbers is called a transpose. Thus, if

$$M = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

its transpose is

$$M^T = \begin{pmatrix} a & c \\ b & d \end{pmatrix}$$

or

$$(M^T)_i^k = M_k^i$$

For a vector,

$$(A^T)^i = A_i$$

The transpose of a column matrix is a row matrix.

Adjoint

In quantum physics, however, we deal with complex numbers, and physicality requires that inner products produce real numbers. We have seen in [bra-ket] that the dual of a function is its complex conjugate. Duality of Euclidean vectors, on the other hand, involves a transposition. Taking these two together, we define the adjoint as the complex conjugate of the transpose. Hence, if

$$M = \begin{pmatrix} 1 & 2i \\ 3 & i \end{pmatrix}$$

then its adjoint is

$$M^\dagger = \begin{pmatrix} 1 & 3 \\ -2i & -i \end{pmatrix}$$

The adjoint of the vector

$$A = \begin{pmatrix} 1 \\ 0 \\ -i \end{pmatrix}$$

is

$$A^\dagger = \begin{pmatrix} 1 & 0 & i \end{pmatrix}$$

Note that complex conjugation ensures that the inner product

$$\langle A|A\rangle = \begin{pmatrix} 1 & 0 & i \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ -i \end{pmatrix} = 1 + 0 + 1 = 2$$

is real. The dual of a vector $|A\rangle$ in quantum physics is therefore its adjoint.
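To tie the pieces together, here is a brief sketch, again an illustrative addition assuming Python with NumPy, that takes the adjoint of the vector $(1, 0, -i)^T$ from the example above as the conjugate transpose, verifies that $\langle A|A\rangle = 2$ is real, forms the outer product $|A\rangle\langle A|$ as a square matrix, and checks the completeness relation $\sum_n |n\rangle\langle n| = 1$ for the standard basis.

import numpy as np

# Column vector |A> = (1, 0, -i)^T from the Adjoint example
ket_A = np.array([[1.0], [0.0], [-1.0j]])

# Adjoint = complex conjugate of the transpose, i.e. the bra <A| = (1  0  i)
bra_A = ket_A.conj().T

# Inner product <A|A> = 1 + 0 + 1 = 2, a real number
print(bra_A @ ket_A)           # [[2.+0.j]]

# Outer product |A><A| is a 3x3 square matrix
print(ket_A @ bra_A)

# Completeness relation for the standard basis e_1, e_2, e_3:
# the sum of the ket-bras |n><n| equals the identity matrix
basis = [np.eye(3)[:, [n]] for n in range(3)]
completeness = sum(e @ e.conj().T for e in basis)
print(np.allclose(completeness, np.eye(3)))   # True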