From Matrix Product States and Dynamical Mean-Field Theory to Machine Learning
Sommerfeld Theory Colloquium, LMU Munich, November 9, 2016


F. Alexander Wolf | falexwolf.de
Institute of Computational Biology, Helmholtz Zentrum München

Outline
• Matrix Product States / Tensor Trains
• Dynamical Mean-Field Theory
• Machine Learning

"Tensor Trains I": noninteracting bits

[Chain of bits: X_1 – X_2 – X_3 – X_4 – ...]

Consider a vector of random variables X \in \{0,1\}^L with joint probability mass

    p_x = \frac{1}{Z} e^{-H(x)/T}, \quad H(x) = \sum_{n=1}^{L} x_n,

normalized with Z = \sum_x e^{-H(x)/T}. The vector p has 2^L components, indexed by x \in \{(0,0,\ldots,0), (0,0,\ldots,1), \ldots\}. Note that 2^{100} \approx 10^{30}, corresponding to roughly 10^{15} TB of storage.

Correlations follow from \mathrm{cov}(X_n, X_m) = \langle X_n X_m \rangle - \langle X_n \rangle \langle X_m \rangle with

    \langle X_n X_m \rangle = \sum_x x_n x_m \, p_x.

Naive brute force requires 2^L operations; Monte Carlo samples in a space of 2^L states.

Better: independent degrees of freedom X_n imply separability,

    p_x = p_{x_1, x_2, \ldots, x_L} = \frac{1}{Z} e^{-\sum_{n=1}^{L} x_n / T} = \frac{1}{Z} a_{x_1} a_{x_2} \cdots a_{x_L}, \quad a_{x_n} = e^{-x_n/T}.

Correlations can then be computed in 2L operations.
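The separability claim is easy to check numerically. The following is a minimal sketch (not from the slides); the temperature and chain length are chosen arbitrarily, small enough that the 2^L brute-force sum is feasible:

```python
import itertools
import numpy as np

T = 1.0   # arbitrary temperature
L = 10    # small enough to enumerate all 2^L states

# Brute force: sum over all 2^L configurations.
states = np.array(list(itertools.product([0, 1], repeat=L)))  # shape (2^L, L)
weights = np.exp(-states.sum(axis=1) / T)
Z = weights.sum()
p = weights / Z
exp_X0X1 = (states[:, 0] * states[:, 1] * p).sum()  # <X_0 X_1>

# Separable form: a_x = e^{-x/T}; every site factorizes.
a = np.exp(-np.array([0, 1]) / T)                    # (a_0, a_1)
site_mean = (np.array([0, 1]) * a).sum() / a.sum()   # <X_n>, same for all n

# <X_n X_m> = <X_n><X_m>: the covariance vanishes.
assert np.isclose(exp_X0X1, site_mean**2)
print("cov(X_0, X_1) =", exp_X0X1 - site_mean**2)    # ~0 up to float error
```

The factorized evaluation touches each site once, which is the 2L-operation count quoted above, versus 2^L terms in the brute-force sum.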
Explicitly,

    \langle X_n X_m \rangle = \frac{1}{Z} \sum_{x_n} x_n a_{x_n} \sum_{x_m} x_m a_{x_m} \prod_{k \neq n,m} \sum_{x_k} a_{x_k} = \langle X_n \rangle \langle X_m \rangle,

so there are no correlations.

"Tensor Trains II": interacting bits (Ising model)

Now consider nearest-neighbor interactions,

    \tilde p_x = \frac{1}{Z} e^{-H(x)/T}, \quad H(x) = -\sum_{n=1}^{L-1} x_n x_{n+1}.

Two-body interactions imply "almost-separability":

    \sum_x Z \tilde p_x = \sum_x e^{x_1 x_2 / T} e^{x_2 x_3 / T} \cdots = \sum_x A_{x_1 x_2} A_{x_2 x_3} \cdots = \mathrm{gsum}\, A A \cdots, \quad A_{x_n x_{n+1}} = e^{x_n x_{n+1}/T}, \; A \in \mathbb{R}^{2 \times 2},

where gsum denotes the grand sum, i.e. the sum over all entries of the resulting matrix. Compare this to the noninteracting case,

    \sum_x Z p_x = \sum_x a_{x_1} a_{x_2} \cdots, \quad a_{x_n} = e^{-x_n/T}, \; a \in \mathbb{R}^2.

Correlations now cost 2^3 L operations (L matrix products):

    \langle X_n X_m \rangle_{\tilde p} = \frac{1}{Z} \mathrm{gsum} \left[ \prod_{k=1}^{n-1} A^{[k]} \right] M \left[ \prod_{k=n}^{m-1} A^{[k]} \right] M \left[ \prod_{k=m}^{L-1} A^{[k]} \right], \quad M = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.

Compare to the noninteracting case (2L operations):

    \langle X_n X_m \rangle_p = \frac{1}{Z} \sum_{x_n} x_n a_{x_n} \sum_{x_m} x_m a_{x_m} \prod_{k \neq n,m} \sum_{x_k} a_{x_k}.
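The grand-sum (transfer-matrix) formula for Z can be verified against brute force on a short chain. This is an illustrative sketch with arbitrary T and L (not from the slides):

```python
import itertools
import numpy as np

T, L = 1.0, 8
x_vals = np.array([0, 1])

# Transfer matrix A_{x_n x_{n+1}} = e^{x_n x_{n+1} / T}, A in R^{2x2}.
A = np.exp(np.outer(x_vals, x_vals) / T)

# Z = gsum(A A ... A): a product of L-1 matrices, then the sum of all entries
# (the grand sum contracts the free boundary indices x_1 and x_L).
Z_tt = np.linalg.matrix_power(A, L - 1).sum()

# Brute force over all 2^L configurations, with H(x) = -sum_n x_n x_{n+1}.
Z_bf = sum(
    np.exp(sum(x[n] * x[n + 1] for n in range(L - 1)) / T)
    for x in itertools.product([0, 1], repeat=L)
)
assert np.isclose(Z_tt, Z_bf)
```

The matrix-product route costs O(L) small matrix multiplications, versus 2^L terms for the direct sum.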
"Tensor Trains III": long-range interacting bit chain, Wolf (2015)

Consider three-body interactions,

    p_x = \frac{1}{Z} e^{-H(x)/T}, \quad H(x) = -\sum_{n=1}^{L-2} x_n x_{n+1} x_{n+2}.

Then

    Z p_x = \prod_{n=1}^{L-2} A_{x_n x_{n+1} x_{n+2}}, \quad A_{x_n x_{n+1} x_{n+2}} = e^{x_n x_{n+1} x_{n+2}/T}, \; A \in \mathbb{R}^{2 \times 2 \times 2}.

Grouping the last two indices into a single combined index,

    B_{x_n,\, (2 x_{n+1} + x_{n+2})} = A_{x_n x_{n+1} x_{n+2}}, \quad B \in \mathbb{R}^{2 \times 4},

turns the product of three-index tensors into a product of the matrices B and B^t over grouped indices x'_n: the Tensor Train format, at a cost of \frac{1}{2}(2^3 + 4^3) L operations.
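The index regrouping behind the 2×4 matrix B is just a reshape. A minimal sketch (illustrative, not from the slides), assuming the row-major grouping x' = 2 x_{n+1} + x_{n+2} stated above:

```python
import numpy as np

T = 1.0
x = np.array([0, 1])

# Three-index tensor A_{x1 x2 x3} = e^{x1 x2 x3 / T}, A in R^{2x2x2}.
A = np.exp(np.einsum('i,j,k->ijk', x, x, x) / T)

# Group the last two indices: B_{x1, 2*x2 + x3} = A_{x1 x2 x3}, B in R^{2x4}.
# NumPy's row-major reshape implements exactly this index grouping.
B = A.reshape(2, 4)
assert B.shape == (2, 4)
assert B[1, 2 * 1 + 0] == A[1, 1, 0]
```

Reshaping is lossless here; ungrouping with `B.reshape(2, 2, 2)` recovers A exactly.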
"Tensor Trains" in Statistical Mechanics

• Write a probability mass function

    p : \{0, 1, \ldots, d\}^L \to \mathbb{R}, \quad d, L \in \mathbb{N},

as a vector

    p_x = p(x), \quad p \in \mathbb{R}^{d^L},

indexed and parametrized by x \in \{0, 1, \ldots, d\}^L.

• If p_x = p(x) does not couple all index components x_n among each other, there is a low-rank Tensor Train representation. This reduces the computational cost of summations over p(x) from exponential to linear in the system size.

• What about quantum mechanics? We now consider quantum many-body states

    |\psi\rangle = \sum_x c_x |x\rangle,

where |x\rangle = |x_1\rangle \otimes |x_2\rangle \otimes \cdots \otimes |x_L\rangle = |x_1 x_2 \ldots x_L\rangle is a tensor product of single-particle basis states |x_i\rangle, for example |x_i\rangle \in \{|{\uparrow}\rangle_i, |{\downarrow}\rangle_i\}.

• But do we know anything about how the vector of coefficients c = (c_x) couples its components, so that the Tensor Train format is meaningful? Statistical Mechanics vs.
