Second Lecture: Quantum Monte Carlo Techniques


Andreas Läuchli, "New states of quantum matter"
MPI für Physik komplexer Systeme, Dresden
http://www.pks.mpg.de/~aml, [email protected]
Lecture notes at http://www.pks.mpg.de/~aml/LesHouches
"Modern theories of correlated electron systems", Les Houches, 19/5/2009

Outline of today's lecture:
- Quantum Monte Carlo (world lines)
- Loop algorithm
- Stochastic series expansion
- Worm algorithm
- Green's function Monte Carlo
- Sign problem

What is not treated here: auxiliary-field Monte Carlo for fermions, diagrammatic Monte Carlo, CT-QMC for quantum impurity problems, real-time quantum Monte Carlo.

1. Quantum Monte Carlo (world lines)

In quantum statistical mechanics the canonical partition function reads

    Z(T) = Tr e^{-H/T}

and expectation values are

    ⟨O⟩ = (1/Z(T)) Tr[O e^{-H/T}] ≡ Tr[O ρ(T)].

To perform Monte Carlo we need to find a mapping onto a "classical" problem:
- world-line methods (Suzuki-Trotter, loop algorithm, worm algorithm, ...)
- stochastic series expansions

Potential problem: the mapping can give "probabilities" which are negative ⇒ the infamous sign problem. Common causes are frustration or fermionic statistics.
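As a concrete reference point, the two traces above can be evaluated exactly for a tiny system by full diagonalization; QMC estimates the same quantities stochastically when the Hilbert space is too large for this. A minimal sketch in Python (the two-site Hamiltonian and the temperature are illustrative choices, not from the lecture):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Toy two-site Hamiltonian (illustrative choice)
J = 1.0
H = J * (kron(sx, sx) + kron(sz, sz)) / 4.0

T = 0.5                       # temperature (k_B = 1)
E, V = np.linalg.eigh(H)      # exact diagonalization
boltzmann = np.exp(-E / T)
Z = boltzmann.sum()           # Z(T) = Tr e^{-H/T}

# <O> = Tr[O e^{-H/T}] / Z, here with O = H (the energy)
O = H
O_eig = V.T @ O @ V
expval = np.sum(np.diag(O_eig) * boltzmann) / Z
print(Z, expval)
```

For this toy H the eigenvalues are {-1/2, 0, 0, 1/2}, so both the partition function and the thermal energy follow in closed form, which makes the script easy to check against by hand.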
Quantum Monte Carlo

Feynman (1953) laid the foundation for quantum Monte Carlo: map a d-dimensional quantum system to classical world lines in d+1 dimensions. Classical particles in space become quantum mechanical world lines extending along an extra "imaginary time" direction. The Metropolis algorithm is then used to update the world lines.

The Suzuki-Trotter decomposition

This is a generic mapping of a quantum spin system to an "Ising"-like (vertex) model; it is the basis of most discrete-time QMC algorithms and is not limited to special models. Split the Hamiltonian into two easily diagonalized pieces, H = H1 + H2. Then

    e^{-ε(H1+H2)} = e^{-εH1} e^{-εH2} + O(ε²),

and one obtains the checkerboard decomposition

    Z = Tr[e^{-βH}] = Tr[e^{-β(H1+H2)}] = Tr[(e^{-(β/M)H1} e^{-(β/M)H2})^M] + O(β³/M²),

with the two factors acting alternately on even and odd bonds in the space/imaginary-time plane.

Path integral QMC

Use Trotter-Suzuki or a simple low-order formula:

    Z = Tr e^{-βH} = Tr e^{-MΔτH} = Tr[(e^{-ΔτH})^M] = Tr[(1 - ΔτH)^M] + O(βΔτ)
      = ∑_{(i1,...,iM)} ⟨i1|1 - ΔτH|i2⟩ ⟨i2|1 - ΔτH|i3⟩ ... ⟨iM|1 - ΔτH|i1⟩.

This gives a mapping to a (d+1)-dimensional classical model: the basis states |i1⟩, ..., |iM⟩ are stacked along the imaginary-time direction, with space running in the other direction. Place particles (spins) on the sites of this space-time lattice.
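The O(ε²) error of the split-up above can be checked numerically on a toy example. A small sketch, assuming two non-commuting 2×2 pieces (illustrative matrices, not a model from the lecture):

```python
import numpy as np

def expm_herm(A):
    """Matrix exponential of a Hermitian matrix via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    return (v * np.exp(w)) @ v.conj().T

H1 = np.array([[0.0, 1.0], [1.0, 0.0]])   # sigma_x-like piece
H2 = np.array([[1.0, 0.0], [0.0, -1.0]])  # sigma_z-like piece

errs = []
for eps in [0.1, 0.05, 0.025]:
    exact = expm_herm(-eps * (H1 + H2))
    trotter = expm_herm(-eps * H1) @ expm_herm(-eps * H2)
    errs.append(np.max(np.abs(exact - trotter)))
    print(eps, errs[-1])

# halving eps should shrink the error by roughly a factor 4,
# consistent with the O(eps^2) estimate
```

The leading error term is (ε²/2)[H1, H2], so each halving of ε reduces the deviation by about a factor of four.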
For Hamiltonians conserving the particle number (magnetization), the particles trace out continuous world lines, and the partition function of the quantum system becomes a sum over classical world-line configurations.

Calculating configuration weights

The weight of a configuration is the product of the matrix elements ⟨i_k|1 - ΔτH|i_{k+1}⟩ over all time slices. Example: particles with nearest-neighbor repulsion,

    H = -t ∑_{⟨i,j⟩} (a_i† a_j + a_j† a_i) + V ∑_{⟨i,j⟩} n_i n_j.

A plaquette where a world line runs straight contributes a factor 1, each hop (kink) contributes a factor Δτt, and each time slice where two particles occupy neighboring sites contributes a factor (1 - ΔτV). In the pictured example the weight builds up as 1 · (Δτt)² · (1 - ΔτV)³ · (Δτt)².
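The product structure of the weight can be sketched in a few lines. Everything here (the helper name and the bookkeeping of hops and nearest-neighbor pairs per slice) is a hypothetical illustration of the rule above, not code from the lecture:

```python
# Hypothetical helper: weight of a discrete-time world-line configuration,
# assuming each hop (kink) contributes a factor dtau*t and each time slice
# with k occupied nearest-neighbor pairs contributes (1 - dtau*V)**k.
def config_weight(n_hops, nn_pairs_per_slice, dtau, t, V):
    w = (dtau * t) ** n_hops
    for k in nn_pairs_per_slice:
        w *= (1.0 - dtau * V) ** k
    return w

# Example resembling the slide's build-up: four hops in total and three
# slices with one nearest-neighbor pair each,
# giving (dtau*t)^2 * (1 - dtau*V)^3 * (dtau*t)^2
w = config_weight(4, [1, 1, 1], dtau=0.1, t=1.0, V=0.5)
print(w)
```

The ordering of the factors is irrelevant since the weight is a plain product; QMC only ever needs ratios of such weights between the current and the proposed configuration.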
Monte Carlo updates

Local updates just move the world lines locally, with probabilities given by the matrix elements of the Hamiltonian. Example: the tight-binding model

    H = -t ∑_i (c_i† c_{i+1} + c_{i+1}† c_i).

Two elementary moves: introduce or remove a pair of kinks, and shift a single kink. The relevant configuration weights are P = 1 for a straight world-line segment, P = (Δτt)² for a segment with two kinks, and P = Δτt per single kink. The Metropolis acceptance probabilities are

    P→ = min[1, (Δτt)²]  and  P← = min[1, 1/(Δτt)²]

for inserting and removing a kink pair, while shifting a kink has P→ = P← = 1.

The continuous time limit

The limit Δτ → 0 can be taken directly in the algorithm [Prokof'ev et al., Pis'ma v Zh. Eksp. Teor. Fiz. 64, 853 (1996)]. In discrete time one stores the configuration at every time step; in continuous time one stores only the times τ1, τ2, ... at which the configuration changes.

Advantages of continuous time:
- No need to extrapolate in the time step: a single simulation is sufficient, with no additional errors from extrapolation.
- Less memory and CPU time: instead of resolving a small time step Δτ ≪ 1/t, we only store the changes in the configuration, which occur at mean imaginary-time separations ≈ 1/t. This gives a speedup of order 1/Δτ ≈ 10.
- Conceptual advantage: we directly sample a diagrammatic perturbation expansion.
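The insertion/removal probabilities above satisfy detailed balance, which a few lines can verify. A minimal sketch, assuming the weight ratio between the configuration with and without the kink pair is (Δτt)²:

```python
# Metropolis acceptance for inserting/removing a pair of kinks, assuming the
# configuration weight ratio W(with kinks)/W(without) = (dtau*t)**2.
dtau, t = 0.1, 1.0
r = (dtau * t) ** 2            # weight ratio of proposed vs. current config

p_insert = min(1.0, r)         # P(no kinks -> two kinks)
p_remove = min(1.0, 1.0 / r)   # P(two kinks -> no kinks)

# Detailed balance: W(A) * P(A -> B) == W(B) * P(B -> A),
# with W(A) = 1 (straight line) and W(B) = (dtau*t)**2:
assert abs(1.0 * p_insert - r * p_remove) < 1e-12
print(p_insert, p_remove)
```

Since Δτt ≪ 1 in practice, insertions are accepted rarely while removals are always accepted, which is exactly what keeps the kink density at its equilibrium value ~ t per unit imaginary time.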
2. Cluster updates: the loop algorithm

Problems with local updates

Local updates cannot change global topological properties: the number of world lines (particles, magnetization) is conserved, winding is conserved, and braiding is conserved, so one cannot sample the grand-canonical ensemble. In addition, local updates suffer from critical slowing down at second-order phase transitions (solved by cluster updates) and from the tunneling problem at first-order phase transitions (solved by extended sampling techniques).

Cluster algorithms: the formal explanation

Extend the phase space from configurations C to configuration-graph pairs (C, G):

    Z = ∑_C W(C) = ∑_C ∑_G W(C,G),  with  W(C) = ∑_G W(C,G).

Choose the graph weights independent of the configuration,

    W(C,G) = Δ(C,G) V(G),  where  Δ(C,G) = 1 if graph G is allowed for C, and 0 otherwise.

Then perform the updates

    C_i → (C_i, G) → G → (C_{i+1}, G) → C_{i+1},

i.e. assign a graph G to the current configuration with probability ∝ V(G) among the allowed graphs, discard the configuration, and pick a new configuration compatible with G.
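The same configuration + graph construction underlies the classical Swendsen-Wang algorithm, which is its simplest concrete instance. A minimal sketch for a periodic 1D Ising chain (an illustrative classical analogue; the quantum loop algorithm builds its graphs on the checkerboard plaquettes instead):

```python
import math
import random

# One Swendsen-Wang sweep: C -> (C, G) by activating bonds, then G -> C'
# by flipping each cluster with probability 1/2.
def sw_sweep(spins, beta, J=1.0):
    n = len(spins)
    parent = list(range(n))             # union-find for cluster labels

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    p = 1.0 - math.exp(-2.0 * beta * J)  # bond activation probability
    for i in range(n):                   # activate bonds between parallel spins
        j = (i + 1) % n
        if spins[i] == spins[j] and random.random() < p:
            parent[find(i)] = find(j)

    flip = {}                            # flip each cluster with probability 1/2
    for i in range(n):
        root = find(i)
        if root not in flip:
            flip[root] = random.random() < 0.5
        if flip[root]:
            spins[i] = -spins[i]
    return spins

random.seed(1)
spins = [1] * 16
for _ in range(100):
    spins = sw_sweep(spins, beta=0.5)
print(spins)
```

One sweep can flip arbitrarily large clusters in a single step, which is precisely how the extended (C, G) phase space evades the slow local dynamics discussed above.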
