
Chapter 13  Matrix Representation

Matrix Rep.  Same basics as introduced already.  A convenient method of working with vectors.

Superposition  A complete set of vectors can be used to express any other vector.

A complete set of N orthonormal vectors can form other complete sets of N orthonormal vectors.

For a Hermitian operator, can find a set of vectors satisfying A u = α u.  Eigenvectors and eigenvalues.

Matrix method  Find the superposition of basis states that are eigenstates of a particular operator.  Get eigenvalues.

Orthonormal basis set in N dimensions

{e_j}    basis vectors

Any N-dimensional vector can be written as

x = \sum_{j=1}^{N} x_j e_j    with    x_j = ⟨e_j|x⟩

To get this, project out x_j e_j = e_j ⟨e_j|x⟩, the piece of x that is along e_j, then sum over all e_j.
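A quick numerical sketch of this projection (not from the notes; a minimal illustration assuming a concrete 3-dimensional example in NumPy): the components x_j are just the scalar products ⟨e_j|x⟩, and summing x_j e_j reassembles the vector.

```python
import numpy as np

# An orthonormal basis of R^3: the standard basis rotated 45 degrees about z.
c = np.sqrt(2) / 2
e = [np.array([ c,  c, 0.0]),
     np.array([-c,  c, 0.0]),
     np.array([0.0, 0.0, 1.0])]

x = np.array([7.0, 5.0, 4.0])               # an arbitrary vector

# Components x_j = <e_j | x>: project x onto each basis vector.
components = [np.vdot(ej, x) for ej in e]

# Reassemble x = sum_j x_j e_j from its components.
x_rebuilt = sum(xj * ej for xj, ej in zip(components, e))

print(np.allclose(x, x_rebuilt))             # True
```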

Operator equation

y = A x

Substituting the series in terms of the basis vectors,

\sum_{j=1}^{N} y_j e_j = A \sum_{j=1}^{N} x_j e_j = \sum_{j=1}^{N} x_j A e_j

Left multiplying by e_i,

y_i = \sum_{j=1}^{N} ⟨e_i|A|e_j⟩ x_j

The N² scalar products ⟨e_i|A|e_j⟩ (N values of j for each y_i, and N different y_i) are completely determined by A and the basis set {e_j}.

Writing

a_{ij} = ⟨e_i|A|e_j⟩    matrix elements of A in the basis {e_j}

gives for the linear transformation

y_i = \sum_{j=1}^{N} a_{ij} x_j,    i = 1, 2, …, N.

We know the a_{ij} because we know A and {e_j}.

In terms of the vector representatives,

x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix}    y = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{pmatrix}    vector representatives; must know the basis.

For example, the vector Q = 7x̂ + 5ŷ + 4ẑ has the vector representative \begin{pmatrix} 7 \\ 5 \\ 4 \end{pmatrix}.

(A set of numbers; it gives you the vector when the basis is known.)  The set of N linear algebraic equations can be written as

y = \underline{\underline{A}}\,x    double underline means matrix.

A is an array of coefficients – a matrix:

A = (a_{ij}) = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1N} \\ a_{21} & a_{22} & \cdots & a_{2N} \\ \vdots & & & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{NN} \end{pmatrix}

The a_{ij} are the elements of the matrix A.

y = A x    vector representatives in a particular basis

\begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1N} \\ a_{21} & a_{22} & \cdots & a_{2N} \\ \vdots & & & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{NN} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix}

The product of the matrix A and the vector representative x is a new vector representative y with components

y_i = \sum_{j=1}^{N} a_{ij} x_j.
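As a small check of y_i = Σ_j a_{ij} x_j (an illustrative NumPy sketch, not part of the notes; the 3×3 matrix and vector are arbitrary), the explicit double sum and the built-in matrix-vector product give the same representative y.

```python
import numpy as np

A = np.array([[2.0, 3.0, 0.0],
              [1.0, 4.0, 2.0],
              [0.0, 1.0, 5.0]])      # matrix elements a_ij in some basis
x = np.array([7.0, 5.0, 4.0])        # vector representative in the same basis

# y_i = sum_j a_ij x_j written out explicitly
y_explicit = np.array([sum(A[i, j] * x[j] for j in range(3)) for i in range(3)])

y = A @ x                            # same thing as a matrix-vector product
print(np.allclose(y, y_explicit))    # True
```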

Matrix Properties, Definitions, and Rules

Two matrices A and B are equal, A = B, if a_{ij} = b_{ij}.

The unit matrix

1 = (δ_{ij}) = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}    ones down the principal diagonal

gives the identity transformation,

y_i = \sum_{j=1}^{N} δ_{ij} x_j = x_i,    corresponding to y = 1 x = x.

The zero matrix

0 = \begin{pmatrix} 0 & 0 & \cdots & 0 \\ 0 & 0 & \cdots & 0 \\ \vdots & & & \vdots \\ 0 & 0 & \cdots & 0 \end{pmatrix}    corresponds to 0 x = 0.

Consider the operator equations

y = A x,    z = B y,

so that

z = B A x.

Using the same basis for both transformations,

z_k = \sum_{i=1}^{N} b_{ki} y_i,    z = B y,    B = (b_{ki})    matrix.

z = B y = B A x = C x

C = B A has the elements

c_{kj} = \sum_{i=1}^{N} b_{ki} a_{ij}    law of matrix multiplication.

Example:

\begin{pmatrix} 2 & 3 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 7 & 5 \\ 5 & 6 \end{pmatrix} = \begin{pmatrix} 29 & 28 \\ 41 & 39 \end{pmatrix}

Multiplication is associative:

(A B) C = A (B C)

Multiplication is NOT commutative, except in special cases:

A B ≠ B A

Matrix addition and multiplication by constants:

α A + β B = C    with    c_{ij} = α a_{ij} + β b_{ij}

Reciprocal of a product:

(A B)^{-1} = B^{-1} A^{-1}

Transpose

For a matrix defined as A = (a_{ij}), the transpose is \tilde{A} = (a_{ji})    interchange rows and columns.

Complex Conjugate

A* = (a_{ij}*)    complex conjugate of each element.

Hermitian Conjugate

A† = (a_{ji}*)    complex conjugate transpose.

Inverse of a matrix

A^{-1}, the inverse of A, satisfies A^{-1} A = A A^{-1} = 1    (identity matrix).

A^{-1} = \tilde{A}^{C} / |A|    the transpose of the cofactor matrix (matrix of signed minors) divided by the determinant of A, with |A| ≠ 0.  If |A| = 0, A is singular.

Rules

\widetilde{A B} = \tilde{B} \tilde{A}    transpose of a product is the product of the transposes in reverse order

|\tilde{A}| = |A|    determinant of the transpose is the determinant

(A B)* = A* B*    complex conjugate of a product is the product of the complex conjugates

|A*| = |A|*    determinant of the complex conjugate is the complex conjugate of the determinant

(A B)† = B† A†    Hermitian conjugate of a product is the product of the Hermitian conjugates in reverse order

|A†| = |A|*    determinant of the Hermitian conjugate is the complex conjugate of the determinant

Definitions

\tilde{A} = A    Symmetric
A† = A    Hermitian
A* = A    Real
A* = −A    Imaginary
A^{-1} = A†    Unitary
a_{ij} = a_{ij} δ_{ij}    Diagonal

Powers of a matrix

A^0 = 1,   A^1 = A,   A^2 = A A, …

e^{A} = 1 + A + \frac{A^2}{2!} + ⋯
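The power series for e^A can be summed term by term. A short NumPy sketch (illustrative, with an arbitrary real symmetric 2×2 matrix) compares a truncated series to the exact exponential built from the eigendecomposition.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Truncated power series  e^A = 1 + A + A^2/2! + ...
expA_series = np.eye(2)
term = np.eye(2)
for n in range(1, 20):
    term = term @ A / n                     # accumulates A^n / n!
    expA_series += term

# For a real symmetric A, e^A can also be built from its eigendecomposition.
w, V = np.linalg.eigh(A)
expA_eig = V @ np.diag(np.exp(w)) @ V.T

print(np.allclose(expA_series, expA_eig))   # True
```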

Column vector representative: a one-column matrix,

x = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix}    vector representative in a particular basis.

Then y = A x becomes

\begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1N} \\ a_{21} & a_{22} & \cdots & a_{2N} \\ \vdots & & & \vdots \\ a_{N1} & a_{N2} & \cdots & a_{NN} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{pmatrix}

Row vector: the transpose of the column vector,

\tilde{x} = (x_1, x_2, …, x_N).

y = A x   gives   \tilde{y} = \tilde{x} \tilde{A}    transpose
y = A x   gives   y† = x† A†    Hermitian conjugate

Change of Basis

Orthonormal basis {e_i}:    ⟨e_i|e_j⟩ = δ_{ij}    (i, j = 1, 2, …, N)

A superposition of the {e_i} can form N new, linearly independent vectors – a new basis {e′_i}:

e′_i = \sum_{k=1}^{N} u_{ik} e_k,    i = 1, 2, …, N,

where the u_{ik} are complex numbers.

New Basis is Orthonormal

⟨e′_i|e′_j⟩ = δ_{ij}

if the matrix U = (u_{ik}) of coefficients in the superposition meets the condition

U† U = 1.

U^{-1} = U†    U is unitary – Hermitian conjugate = inverse.

Important result.  The new basis {e′_i} will be orthonormal if U, the transformation matrix, is unitary (see book and Errata and Addenda):

U U† = U† U = 1.

A unitary transformation substitutes the orthonormal basis {e′_i} for the orthonormal basis {e_i}.

Vector x is a line in space (it may be a high-dimensionality abstract space).  Written in terms of the two basis sets,

x = \sum_i x_i e_i = \sum_i x′_i e′_i.

Same vector – different basis.

The unitary transformation U can be used to change a vector representative of x in one orthonormal basis set into its vector representative in another orthonormal basis set:

x  – vector representative in the unprimed basis
x′ – vector representative in the primed basis

x′ = U x    change from unprimed to primed basis
x = U† x′    change from primed to unprimed basis

Example

Consider the basis {x̂, ŷ, ẑ}.

[Figure: the vector s drawn in the x, y, z axis system.]

Vector s is a line in real space.  In terms of the basis,

s = 7x̂ + 7ŷ + 1ẑ.

Vector representative in the basis {x̂, ŷ, ẑ}:

s = \begin{pmatrix} 7 \\ 7 \\ 1 \end{pmatrix}

Change the basis by rotating the axis system 45° around ẑ.  Can find the new representative of s, s′:

s′ = U s

U is

U = \begin{pmatrix} \cos θ & \sin θ & 0 \\ −\sin θ & \cos θ & 0 \\ 0 & 0 & 1 \end{pmatrix}

For a 45° rotation around z,

U = \begin{pmatrix} \sqrt{2}/2 & \sqrt{2}/2 & 0 \\ −\sqrt{2}/2 & \sqrt{2}/2 & 0 \\ 0 & 0 & 1 \end{pmatrix}

Then

s′ = \begin{pmatrix} \sqrt{2}/2 & \sqrt{2}/2 & 0 \\ −\sqrt{2}/2 & \sqrt{2}/2 & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} 7 \\ 7 \\ 1 \end{pmatrix} = \begin{pmatrix} 7\sqrt{2} \\ 0 \\ 1 \end{pmatrix}

s′ = \begin{pmatrix} 7\sqrt{2} \\ 0 \\ 1 \end{pmatrix}    vector representative of s in the basis {e′}.

Same vector, but in the new basis.  Properties are unchanged.

Example – length of the vector, ⟨s|s⟩^{1/2}:

⟨s|s⟩^{1/2} = (s* ⋅ s)^{1/2} = (49 + 49 + 1)^{1/2} = (99)^{1/2}

⟨s|s⟩^{1/2} = (s′* ⋅ s′)^{1/2} = (2 × 49 + 0 + 1)^{1/2} = (99)^{1/2}
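A NumPy version of this rotation example (an illustrative sketch, not part of the notes): apply the 45° unitary to the representative of s and confirm the length is unchanged.

```python
import numpy as np

s = np.array([7.0, 7.0, 1.0])            # representative of s in the {x, y, z} basis

theta = np.pi / 4                         # rotate the axis system 45 degrees about z
U = np.array([[ np.cos(theta), np.sin(theta), 0.0],
              [-np.sin(theta), np.cos(theta), 0.0],
              [ 0.0,           0.0,           1.0]])

s_prime = U @ s                           # representative in the rotated (primed) basis
print(s_prime)                            # approximately [9.8995, 0, 1], i.e. (7*sqrt(2), 0, 1)

# The length of the vector is a property of s itself, not of the basis.
print(np.linalg.norm(s), np.linalg.norm(s_prime))   # both sqrt(99)
```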

Can go back and forth between the representatives of a vector x by

x′ = U x    change from unprimed to primed basis
x = U† x′    change from primed to unprimed basis

These are the components of x in the two different bases.

Consider the linear transformation (operator equation)

y = A x.

In the basis {e} we can write y = A x, or

y_i = \sum_j a_{ij} x_j.

Change to the new orthonormal basis {e′} using U:

y′ = U y = U A x = U A U† x′,

or y′ = A′ x′ with the matrix A′ given by A′ = U A U†.  Because U is unitary,

A′ = U A U^{-1}.

Extremely Important

A′ = U A U^{-1}

Can change the matrix representing an operator in one orthonormal basis into the equivalent matrix in a different orthonormal basis.

Called

Similarity Transformation

In the basis {e}:

y = A x,    A B = C,    A + B = C.

Go into the basis {e′}:

y′ = A′ x′,    A′ B′ = C′,    A′ + B′ = C′.

Relations are unchanged by the change of basis.

Example:  A B = C.  Then

U A B U† = U C U†.

Can insert U† U between A and B because U† U = U^{-1} U = 1:

U A U† U B U† = U C U†.

Therefore

A′ B′ = C′.
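A numerical illustration of this invariance (not from the notes; the matrices and the unitary are randomly generated): with Hermitian A, B and a unitary U, the transformed matrices satisfy A′B′ = C′.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_hermitian(n):
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (M + M.conj().T) / 2

def random_unitary(n):
    M = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    Q, _ = np.linalg.qr(M)              # Q from a QR decomposition is unitary
    return Q

n = 4
A = random_hermitian(n)
B = random_hermitian(n)
C = A @ B                               # AB = C in the original basis
U = random_unitary(n)

Ap = U @ A @ U.conj().T                 # A' = U A U†  (similarity transformation)
Bp = U @ B @ U.conj().T
Cp = U @ C @ U.conj().T

print(np.allclose(Ap @ Bp, Cp))         # True: A'B' = C'
```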

There is an isomorphism between operators in the abstract vector space and their matrix representatives.

Because of the isomorphism, it is not necessary to distinguish abstract vectors and operators from their matrix representatives.

The matrices (for operators) and the representatives (for vectors) can be used in place of the real things.

Hermitian Operators and Matrices

Hermitian operator:

⟨x|A|y⟩ = ⟨y|A|x⟩*

Hermitian matrix (the matrix representing a Hermitian operator):

A† = A

† denotes the complex conjugate transpose – the Hermitian conjugate.

Theorem (proof: Powell and Crasemann, pp. 303–307, or a linear algebra book)

For a Hermitian operator A in a linear vector space of N dimensions, there exists an orthonormal basis U^1, U^2, …, U^N relative to which A is represented by a diagonal matrix

A′ = \begin{pmatrix} α_1 & 0 & \cdots & 0 \\ 0 & α_2 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & α_N \end{pmatrix}.

The vectors U^i and the corresponding real numbers α_i are the solutions of the eigenvalue equation

A U = α U,

and there are no others.

Application of the Theorem

Operator A is represented by the matrix A in some basis {e_i}.  The basis is any convenient basis; in general, the matrix will not be diagonal.

There exists some new basis, the eigenvectors {U^i}, in which A′ represents the operator and is diagonal, with the eigenvalues on the diagonal.

To get from {e_i} to {U^i}, apply a unitary transformation:

{U^i} = U {e_i}.

A′ = U A U^{-1}    The similarity transformation takes the matrix in an arbitrary basis into the diagonal matrix with the eigenvalues on the diagonal.
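A sketch of the theorem in action (illustrative NumPy example with an arbitrary 3×3 Hermitian matrix): numpy.linalg.eigh returns real eigenvalues and an orthonormal set of eigenvectors, and the corresponding unitary similarity transformation makes the matrix diagonal.

```python
import numpy as np

# A Hermitian matrix in some arbitrary (non-eigenvector) basis.
A = np.array([[2.0,        1.0 - 1.0j, 0.5],
              [1.0 + 1.0j, 3.0,        2.0j],
              [0.5,       -2.0j,       1.0]])
assert np.allclose(A, A.conj().T)        # check that A is Hermitian

alpha, V = np.linalg.eigh(A)             # eigenvalues (real) and eigenvector columns
print(alpha)

# Build the unitary U whose rows are the eigenvector representatives;
# then A' = U A U† is diagonal with the eigenvalues on the diagonal.
U = V.conj().T
A_prime = U @ A @ U.conj().T
print(np.allclose(A_prime, np.diag(alpha)))   # True
```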

Matrices and Q.M.

Previously, the state of a system was represented by a vector in an abstract vector space.

Dynamical variables are represented by linear operators.  Operators produce linear transformations:

y = A x.

Real dynamical variables (observables) are represented by Hermitian operators.

Observables are eigenvalues of Hermitian operators:

A|S⟩ = α|S⟩.

Solution of the eigenvalue problem gives the eigenvalues and eigenvectors.

Matrix Representation

Hermitian operators are replaced by their Hermitian matrix representations, A → A.  In the proper basis, A′ is the diagonal Hermitian matrix, and the diagonal matrix elements are the eigenvalues (observables).

A suitable transformation, U A U^{-1}, takes A (arbitrary basis) into A′ (diagonal – the eigenvector basis):

A′ = U A U^{-1}.

U takes the arbitrary basis into the eigenvectors.  Diagonalization of the matrix gives the eigenvalues and eigenvectors.

The matrix formulation is another way of dealing with operators and solving eigenvalue problems.

All rules about kets, operators, etc. still apply.

Example:  Two Hermitian matrices A and B can be simultaneously diagonalized by the same unitary transformation if and only if they commute.

All ideas about matrices are also true for infinite-dimensional matrices.
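An illustrative NumPy check of the simultaneous-diagonalization statement (not from the notes): build two Hermitian matrices with a common eigenbasis, so they commute, then verify that the unitary built from the eigenvectors of A also diagonalizes B. (A has nondegenerate eigenvalues here, so its eigenbasis is essentially unique.)

```python
import numpy as np

rng = np.random.default_rng(1)

# A random unitary (the shared eigenbasis) from a QR decomposition.
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
V, _ = np.linalg.qr(M)

# Two Hermitian matrices sharing the eigenbasis V, hence commuting.
A = V @ np.diag([1.0, 2.0, 3.0]) @ V.conj().T
B = V @ np.diag([5.0, -1.0, 4.0]) @ V.conj().T
print(np.allclose(A @ B, B @ A))               # True: they commute

# The unitary built from A's eigenvectors diagonalizes B as well.
_, W = np.linalg.eigh(A)                        # eigenvectors of A (nondegenerate)
U = W.conj().T
B_prime = U @ B @ U.conj().T
off_diag = B_prime - np.diag(np.diag(B_prime))
print(np.allclose(off_diag, 0))                 # True: B' is diagonal too
```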

Example – Harmonic Oscillator

Have already solved – use the occupation number representation (kets and bras |n⟩, ⟨n|); H is already diagonal in this basis.

H = \frac{1}{2}(P^2 + x^2) = \frac{1}{2}(a a† + a† a)

a|n⟩ = \sqrt{n}\,|n−1⟩
a†|n⟩ = \sqrt{n+1}\,|n+1⟩

Matrix elements of a (rows and columns labeled n = 0, 1, 2, 3, …):

⟨0|a|0⟩ = 0,  ⟨0|a|1⟩ = 1,  ⟨0|a|2⟩ = 0,  ⟨0|a|3⟩ = 0
⟨1|a|0⟩ = 0,  ⟨1|a|1⟩ = 0,  ⟨1|a|2⟩ = \sqrt{2},  ⟨1|a|3⟩ = 0
⋮

a = \begin{pmatrix} 0 & 1 & 0 & 0 & \cdots \\ 0 & 0 & \sqrt{2} & 0 & \cdots \\ 0 & 0 & 0 & \sqrt{3} & \cdots \\ 0 & 0 & 0 & 0 & \ddots \\ \vdots & & & & \end{pmatrix}
\qquad
a† = \begin{pmatrix} 0 & 0 & 0 & 0 & \cdots \\ 1 & 0 & 0 & 0 & \cdots \\ 0 & \sqrt{2} & 0 & 0 & \cdots \\ 0 & 0 & \sqrt{3} & 0 & \cdots \\ \vdots & & & \ddots & \end{pmatrix}

Multiplying out the products by the law of matrix multiplication,

a a† = \begin{pmatrix} 1 & 0 & 0 & 0 & \cdots \\ 0 & 2 & 0 & 0 & \cdots \\ 0 & 0 & 3 & 0 & \cdots \\ 0 & 0 & 0 & 4 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}
\qquad
a† a = \begin{pmatrix} 0 & 0 & 0 & 0 & \cdots \\ 0 & 1 & 0 & 0 & \cdots \\ 0 & 0 & 2 & 0 & \cdots \\ 0 & 0 & 0 & 3 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}

Adding the matrices a a† and a† a and multiplying by ½ gives H:

H = \frac{1}{2}\begin{pmatrix} 1 & 0 & 0 & 0 & \cdots \\ 0 & 3 & 0 & 0 & \cdots \\ 0 & 0 & 5 & 0 & \cdots \\ 0 & 0 & 0 & 7 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}
= \begin{pmatrix} 1/2 & 0 & 0 & 0 & \cdots \\ 0 & 3/2 & 0 & 0 & \cdots \\ 0 & 0 & 5/2 & 0 & \cdots \\ 0 & 0 & 0 & 7/2 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}

The matrix is diagonal, with the eigenvalues on the diagonal.  In normal units the matrix would be multiplied by ℏω.

This example shows the idea, but not how to diagonalize a matrix when you don't already know the eigenvectors.
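The occupation-number matrices above are easy to generate numerically. Here is an illustrative NumPy sketch (not part of the notes) that builds truncated a and a† and confirms that H = ½(aa† + a†a) is diagonal with entries n + ½; the last diagonal entry is corrupted by the truncation, so it is dropped from the printout.

```python
import numpy as np

N = 8                                       # truncate the infinite matrices at 8 levels

# a|n> = sqrt(n)|n-1>, so <n-1|a|n> = sqrt(n): sqrt(1..N-1) on the superdiagonal.
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
adag = a.T                                  # a-dagger (real here, so just the transpose)

H = 0.5 * (a @ adag + adag @ a)             # H = (1/2)(a a† + a† a), in units of hbar*omega

print(np.allclose(H, np.diag(np.diag(H))))  # True: H is diagonal in this basis
print(np.diag(H)[:-1])                      # [0.5 1.5 2.5 3.5 4.5 5.5 6.5]
                                            # (the final entry is a truncation artifact)
```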

Diagonalization

Eigenvalue equation:

A u = α u,    A u − α u = 0,

where A is the matrix representing the operator, u is the representative of an eigenvector, and α is an eigenvalue.

In terms of the components,

\sum_{j=1}^{N} (a_{ij} − α δ_{ij}) u_j = 0,    i = 1, 2, …, N.

This represents a system of equations:

(a_{11} − α) u_1 + a_{12} u_2 + a_{13} u_3 + ⋯ = 0
a_{21} u_1 + (a_{22} − α) u_2 + a_{23} u_3 + ⋯ = 0
a_{31} u_1 + a_{32} u_2 + (a_{33} − α) u_3 + ⋯ = 0
⋮

We know the a_{ij}.  We don't know α (the eigenvalues) or the u_i (the vector representatives, one set for each eigenvalue).

Besides the trivial solution

u_1 = u_2 = ⋯ = u_N = 0,

a solution only exists if the determinant of the coefficients of the u_i vanishes:

\begin{vmatrix} (a_{11} − α) & a_{12} & a_{13} & \cdots \\ a_{21} & (a_{22} − α) & a_{23} & \cdots \\ a_{31} & a_{32} & (a_{33} − α) & \cdots \\ \vdots & & & \ddots \end{vmatrix} = 0

(We know the a_{ij}; we don't know the α's.)

Expanding the determinant gives an Nth-degree equation for the unknown α's (the eigenvalues).

Then, substituting one eigenvalue at a time into the system of equations, the u_i (eigenvector representatives) are found.  The N equations for the u's give only N − 1 conditions.  Use normalization:

u_1* u_1 + u_2* u_2 + ⋯ + u_N* u_N = 1.
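A sketch of this procedure in NumPy (illustrative, for a small symmetric matrix chosen here for convenience): expand the determinant into the characteristic polynomial, find its roots (the eigenvalues), then for each root solve the homogeneous system for the eigenvector and normalize it.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # a simple symmetric (Hermitian) matrix

# Nth-degree characteristic polynomial det(A - alpha*1) = 0 and its roots.
coeffs = np.poly(A)                        # polynomial coefficients in alpha
alphas = np.roots(coeffs)                  # the eigenvalues (here 3 and 1)
print(np.sort(alphas.real))

# For each eigenvalue, solve (A - alpha*1) u = 0 for the eigenvector.
# The homogeneous system gives only N-1 conditions; normalization fixes the rest.
for alpha in alphas:
    _, _, Vh = np.linalg.svd(A - alpha.real * np.eye(2))
    u = Vh[-1]                             # null-space vector (smallest singular value)
    u = u / np.linalg.norm(u)              # normalization: sum |u_i|^2 = 1
    print(alpha.real, u, np.allclose(A @ u, alpha.real * u))
```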

Example – Degenerate Two State Problem

Basis: time-independent kets |α⟩ and |β⟩, orthonormal.

H|α⟩ = E_0|α⟩ + γ|β⟩
H|β⟩ = E_0|β⟩ + γ|α⟩

|α⟩ and |β⟩ are not eigenkets; γ is the coupling.  These equations define H.

The matrix elements are

⟨α|H|α⟩ = E_0,   ⟨β|H|α⟩ = γ,   ⟨α|H|β⟩ = γ,   ⟨β|H|β⟩ = E_0,

and the Hamiltonian matrix is

H = \begin{pmatrix} E_0 & γ \\ γ & E_0 \end{pmatrix}    (rows and columns labeled by α, β).

The corresponding system of equations is

(E_0 − λ) α + γ β = 0
γ α + (E_0 − λ) β = 0.

These only have a solution if the determinant of the coefficients vanishes.  Take the matrix

\begin{pmatrix} E_0 & γ \\ γ & E_0 \end{pmatrix},

subtract λ from the diagonal elements, and make it into a determinant:

\begin{vmatrix} E_0 − λ & γ \\ γ & E_0 − λ \end{vmatrix} = 0.

Expanding,

(E_0 − λ)^2 − γ^2 = 0
λ^2 − 2E_0 λ + E_0^2 − γ^2 = 0.

Energy eigenvalues:

λ_+ = E_0 + γ
λ_− = E_0 − γ

[Energy level diagram: ground state at E = 0, excited state at E_0; the coupling splits the excited state into levels at E_0 + γ and E_0 − γ, a 2γ dimer splitting.]

To obtain the eigenvectors, use the system of equations for each eigenvalue.

|+⟩ = a_+|α⟩ + b_+|β⟩
|−⟩ = a_−|α⟩ + b_−|β⟩    eigenvectors associated with λ_+ and λ_−.

[a_+, b_+] and [a_−, b_−] are the vector representatives of |+⟩ and |−⟩ in the |α⟩, |β⟩ basis set.  We want to find these.

First, for the eigenvalue λ_+ = E_0 + γ, write the system of equations:

(H_{11} − λ_+) a_+ + H_{12} b_+ = 0
H_{21} a_+ + (H_{22} − λ_+) b_+ = 0

with the matrix elements of H

H_{11} = ⟨α|H|α⟩ = E_0,   H_{12} = ⟨α|H|β⟩ = γ,   H_{21} = ⟨β|H|α⟩ = γ,   H_{22} = ⟨β|H|β⟩ = E_0.

The equations become

(E_0 − E_0 − γ) a_+ + γ b_+ = 0
γ a_+ + (E_0 − E_0 − γ) b_+ = 0.

The result is

−γ a_+ + γ b_+ = 0
γ a_+ − γ b_+ = 0.

An equivalent way to get the equations is to use the matrix form:

\begin{pmatrix} E_0 − λ_+ & γ \\ γ & E_0 − λ_+ \end{pmatrix} \begin{pmatrix} a_+ \\ b_+ \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

Substitute λ_+ = E_0 + γ:

\begin{pmatrix} E_0 − E_0 − γ & γ \\ γ & E_0 − E_0 − γ \end{pmatrix} \begin{pmatrix} a_+ \\ b_+ \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}

\begin{pmatrix} −γ & γ \\ γ & −γ \end{pmatrix} \begin{pmatrix} a_+ \\ b_+ \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

Multiplying the matrix by the column vector representative gives the equations

−γ a_+ + γ b_+ = 0
γ a_+ − γ b_+ = 0.

The two equations are identical:

a_+ = b_+.

We always get N − 1 conditions for the N unknown components; the normalization condition gives the necessary additional equation:

a_+^2 + b_+^2 = 1.

Then

a_+ = b_+ = \frac{1}{\sqrt{2}},

and

|+⟩ = \frac{1}{\sqrt{2}}|α⟩ + \frac{1}{\sqrt{2}}|β⟩    eigenvector in terms of the |α⟩, |β⟩ basis set.

For the eigenvalue λ_− = E_0 − γ, using the matrix form to write out the equations:

\begin{pmatrix} E_0 − λ_− & γ \\ γ & E_0 − λ_− \end{pmatrix} \begin{pmatrix} a_− \\ b_− \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.

Substituting λ_− = E_0 − γ,

\begin{pmatrix} γ & γ \\ γ & γ \end{pmatrix} \begin{pmatrix} a_− \\ b_− \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}

γ a_− + γ b_− = 0
γ a_− + γ b_− = 0.

These equations give

a_− = −b_−.

Using normalization,

a_− = \frac{1}{\sqrt{2}},   b_− = −\frac{1}{\sqrt{2}}.

Therefore

|−⟩ = \frac{1}{\sqrt{2}}|α⟩ − \frac{1}{\sqrt{2}}|β⟩.

Can diagonalize H by the transformation

H′ = U H U^{-1}    (H′ diagonal, H not diagonal).

The transformation matrix consists of the representatives of the eigenvectors in the original basis:

U = \begin{pmatrix} a_+ & a_− \\ b_+ & b_− \end{pmatrix} = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & −1/\sqrt{2} \end{pmatrix}

U^{-1} = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & −1/\sqrt{2} \end{pmatrix}    complex conjugate transpose (real here, so the same matrix).

Then

H′ = \begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & −1/\sqrt{2} \end{pmatrix}
\begin{pmatrix} E_0 & γ \\ γ & E_0 \end{pmatrix}
\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & −1/\sqrt{2} \end{pmatrix}.

Factoring out 1/\sqrt{2}, one from each unitary matrix,

H′ = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & −1 \end{pmatrix}
\begin{pmatrix} E_0 & γ \\ γ & E_0 \end{pmatrix}
\begin{pmatrix} 1 & 1 \\ 1 & −1 \end{pmatrix}.

After matrix multiplication,

H′ = \frac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & −1 \end{pmatrix}
\begin{pmatrix} E_0 + γ & E_0 − γ \\ γ + E_0 & γ − E_0 \end{pmatrix}.

More matrix multiplication gives

H′ = \begin{pmatrix} E_0 + γ & 0 \\ 0 & E_0 − γ \end{pmatrix}    diagonal, with the eigenvalues on the diagonal.
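The same two-state problem done numerically (an illustrative NumPy sketch; the values E0 = 1.0 and γ = 0.2 are arbitrary): diagonalizing H reproduces the eigenvalues E_0 ± γ and the eigenvectors (|α⟩ ± |β⟩)/√2.

```python
import numpy as np

E0, gamma = 1.0, 0.2                       # arbitrary illustrative values

H = np.array([[E0, gamma],
              [gamma, E0]])                # Hamiltonian matrix in the |alpha>, |beta> basis

lam, vecs = np.linalg.eigh(H)              # eigenvalues and eigenvector columns
print(lam)                                 # [E0 - gamma, E0 + gamma] = [0.8, 1.2]
print(vecs)                                # columns proportional to (1, -1)/sqrt(2) and (1, 1)/sqrt(2)

# Similarity transformation with the eigenvector matrix diagonalizes H.
U = vecs.T                                 # real, so the transpose is the Hermitian conjugate
H_prime = U @ H @ U.T
print(np.round(H_prime, 12))               # diag(E0 - gamma, E0 + gamma)
```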

Copyright – Michael D. Fayer, 2018