
Stats 318: Lecture

Agenda: Monte Carlo

• The Metropolis algorithm

• The hard spheres model

• The Ising model

• The Gibbs sampler

• Examples

Metropolis algorithm

1. Initialize X0

2. Repeat

• Sample y from the distribution Q(Xt, ·)

• Sample U ∼ U[0, 1]

• If U < h( π(y)Q(y, Xt) / (π(Xt)Q(Xt, y)) )

Xt+1 = y (Accept proposed move)

Otherwise

Xt+1 = Xt (Stay put)

Until convergence
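The loop above can be sketched generically; the target π, proposal Q, and acceptance function h are placeholders supplied by the user (an illustrative sketch in Python, not the course code):

```python
import math
import random

random.seed(0)

def metropolis(x0, pi, q_sample, q_density, h, n_steps):
    """Generic Metropolis chain.

    pi(x)          : unnormalized target density
    q_sample(x)    : draws y from Q(x, .)
    q_density(x, y): evaluates Q(x, y)
    h              : acceptance function, e.g. h(u) = min(u, 1)
    """
    x = x0
    chain = [x]
    for _ in range(n_steps):
        y = q_sample(x)
        u = random.random()
        ratio = (pi(y) * q_density(y, x)) / (pi(x) * q_density(x, y))
        if u < h(ratio):
            x = y  # accept proposed move
        # otherwise stay put
        chain.append(x)
    return chain

# Example: sample from pi(x) ∝ exp(-x^2/2) with a random-walk proposal
chain = metropolis(
    x0=0.0,
    pi=lambda x: math.exp(-x * x / 2),
    q_sample=lambda x: x + random.uniform(-1, 1),
    q_density=lambda x, y: 1.0,  # symmetric proposal: Q cancels in the ratio
    h=lambda u: min(u, 1.0),
    n_steps=5000,
)
```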

Select Q so that it is easy to sample from Q(x, ·) for any x ∈ S.

Sampling for the model of hard spheres

1. Initialize X0 (with all entries zero, say)

2. Repeat

• Pick a vertex v at random

• If all neighbors are empty, toss a fair coin

– If “Heads”: Xt+1(v) = 1

– If “Tails”: Xt+1(v) = 0

• Otherwise, do nothing

This is a Metropolis algorithm with h(u) = u/(1 + u) and

Q(x, y) = 1/|V| if x and y differ in exactly one site, 0 otherwise
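The hard-spheres chain above can be sketched in Python (the 10-by-10 grid size and step count are illustrative, chosen to mirror the MATLAB implementation that follows):

```python
import random

def neighbors(v, L):
    """Lattice neighbors of site v = (i, j) on an L-by-L grid."""
    i, j = v
    return [(a, b) for a, b in [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            if 0 <= a < L and 0 <= b < L]

def hard_spheres_step(X, L):
    """One step of the chain: pick a vertex at random; if all its
    neighbors are empty, set it to 0 or 1 by a fair coin toss;
    otherwise do nothing."""
    v = (random.randrange(L), random.randrange(L))
    if all(X[a][b] == 0 for a, b in neighbors(v, L)):
        X[v[0]][v[1]] = random.randint(0, 1)

random.seed(0)
L = 10
X = [[0] * L for _ in range(L)]  # initialize with all entries zero
for _ in range(10000):
    hard_spheres_step(X, L)
```

Since a vertex is set to 1 only when its neighborhood is empty, the chain never leaves the set of feasible (hard-spheres) configurations.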

Equivalent formulation (Metropolis form)

1. Initialize X0 (with all entries zero, say)

2. Repeat

• Pick a vertex v at random and flip the bit (new state is y)

• Accept the move with probability

h( π(y)/π(x) ) = 1/2 if y is feasible, 0 otherwise

Implementation

% Initialize the chain
X0 = zeros(10,10);
X = X0;
Empirical.sum = 0;

% Number of steps
N = 10000;
for n = 1:N,
  % Pick a free vertex uniformly at random (rejection method)
  ok = false;
  while (~ok)
    v = randsample(1:10,2,true);
    ok = (sum(neighbors(v,X)) == 0);
  end

  % Assign the value 0 or 1 with prob 1/2
  X(v(1), v(2)) = (rand(1) >= 1/2);
  Empirical.sum = Empirical.sum + sum(X(:));
end

function val = neighbors(v,X)
% Returns the values of X(v') where v' runs through
% the list of neighbors of v

L = size(X,1);
vn = [(v(1)-1) v(2); (v(1)+1) v(2); v(1) (v(2)-1); v(1) (v(2)+1)];
ix = (vn(:,1) >= 1).*(vn(:,1) <= L).*(vn(:,2) >= 1).*(vn(:,2) <= L);
vn = vn(ix == 1,:);
val = [];
for k = 1:size(vn,1),
  val = [val X(vn(k,1),vn(k,2))];
end

A typical configuration

Figure 1: After 10,000 steps

How many 1’s on average?

Run the chain and estimate the theoretical mean by the empirical average (T = 10,000):

(1/T) Σ_{1 ≤ t ≤ T} f(Xt) = 23.1572

Repeat 10 times

23.1572, 23.5349, 23.2513, 23.2399, 23.5056, 23.6810, 23.2418, 24.0579, 23.5041, 23.2266
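The spread of these ten estimates gives a rough sense of the Monte Carlo error; for example:

```python
# The ten empirical averages reported above
estimates = [23.1572, 23.5349, 23.2513, 23.2399, 23.5056,
             23.6810, 23.2418, 24.0579, 23.5041, 23.2266]

mean = sum(estimates) / len(estimates)
# Sample variance (divide by n - 1)
var = sum((e - mean) ** 2 for e in estimates) / (len(estimates) - 1)
std = var ** 0.5
print(f"mean = {mean:.4f}, sample std = {std:.4f}")
```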

Perhaps better to use burn-in, i.e. start from a configuration obtained after running the chain for some time. The idea is to remove the effect of the initial state. Empirical averages:

23.5051, 23.6458, 23.2869, 23.2398, 23.7409, 23.9433, 23.3087, 23.5144, 23.9526, 24.1567

Here X0 is the state after a run of length 10,000.

Simulations of Gibbs distribution

• Energy function E : S → R

• β: real-valued parameter, the inverse temperature

Gibbs distribution

π(x) = e^{−βE(x)} / Z

Partition function

Z = Σ_{x∈S} e^{−βE(x)}

Ising Model

Simplified model of interacting spins

• Spins located at vertices of a graph G = (V,E)

• Each Iv can take on two values: ±1

• State space S = {−1, 1}^|V|

Energy of a spin configuration

E(I) = −(1/2) Σ_{v,v′∈V} J_{v,v′} Iv Iv′ − Σ_{v∈V} Bv Iv

• J is symmetric

• J_{v,v′} = 0 if v and v′ are not adjacent

• J_{v,v′} > 0 if v ∼ v′ (ferromagnetic interaction): spins tend to be aligned

• J_{v,v′} < 0 if v ∼ v′ (anti-ferromagnetic interaction): spins tend to be misaligned

• Bv: strength of the (electro)magnetic field at v
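The energy of a configuration on a general graph can be computed directly from this definition (an illustrative sketch; the two-spin graph and couplings are made-up examples):

```python
def ising_energy(I, J, B):
    """E(I) = -(1/2) sum_{v,v'} J[v][v'] I[v] I[w] - sum_v B[v] I[v].

    I : dict vertex -> spin (+1 or -1)
    J : dict of dicts, symmetric couplings (absent entries mean 0)
    B : dict vertex -> external field strength (absent entries mean 0)
    """
    interaction = -0.5 * sum(J.get(v, {}).get(w, 0.0) * I[v] * I[w]
                             for v in I for w in I)
    field = -sum(B.get(v, 0.0) * I[v] for v in I)
    return interaction + field

# Example: two ferromagnetically coupled spins (J = 1), no field.
# Aligned spins give E = -(1/2)(1 + 1) = -1; misaligned give E = +1,
# so alignment is energetically favored, as stated above.
J = {"a": {"b": 1.0}, "b": {"a": 1.0}}
e_aligned = ising_energy({"a": 1, "b": 1}, J, {})
e_misaligned = ising_energy({"a": 1, "b": -1}, J, {})
```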

Problem

Simulate a typical spin configuration

This is where a Metropolis algorithm is enormously useful

Ising Model on Regular Lattice

• n by n ‘spin’ array

• Iv = ±1, v = (v1, v2), 1 ≤ v1, v2 ≤ n

• Energy

E(I) = −(1/2) Σ_{v∼v′} Iv Iv′

where the sum over v ∼ v′ runs over all pairs of nearest neighbours on the lattice.

π(I) = e^{−βE(I)} / Σ_{all states I′} e^{−βE(I′)}
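For a very small lattice the normalizing sum can be computed by brute-force enumeration, which makes the Gibbs distribution concrete (illustrative only; enumeration is infeasible beyond a handful of spins):

```python
import itertools
import math

def energy(spins, n):
    """E(I) = -(1/2) sum over ordered nearest-neighbour pairs of Iv Iv'
    on an n-by-n lattice (each edge counted twice, then halved)."""
    I = {divmod(k, n): s for k, s in enumerate(spins)}
    e = 0.0
    for (i, j), s in I.items():
        for a, b in [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]:
            if (a, b) in I:
                e -= 0.5 * s * I[(a, b)]
    return e

n, beta = 2, 1.0
states = list(itertools.product([-1, 1], repeat=n * n))
# Partition function and Gibbs probabilities over all 2^(n*n) states
Z = sum(math.exp(-beta * energy(s, n)) for s in states)
pi = {s: math.exp(-beta * energy(s, n)) / Z for s in states}
# The two fully aligned configurations have the lowest energy,
# hence the highest probability under pi.
```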

A Metropolis chain

Transition matrix P (x, y)

1. Pick a site and flip the sign (new state is y)

2. Accept move with probability min(π(y)/π(x), 1)

Implementation

% Temperature parameter
T = 1; beta = 1/T;

% Initialize the chain
X = ones(10,10);

% Number of steps
N = 10000;
for n = 1:N,
  % Pick a vertex uniformly at random
  v = randsample(1:10,2,true);

  % Decide whether to accept a move or not
  G = sum(neighbors(v,X));
  I = X(v(1), v(2));
  R = min(exp(-beta*I*G),1);
  if rand(1) < R,
    X(v(1), v(2)) = -I;
  end
end

An example at cold temperature

T = 1/β = 1

Examples at different temperatures

[Figure: sample configurations at cold, intermediate, and hot temperatures]

Magnetization

The magnetization is M = Σv Iv. Its density is symmetric. Simulate 500 states (10,000 MC steps for each) and plot the histogram.

[Figure: histogram of M over the 500 simulated states, ranging from −100 to 100]

Shows that convergence has not yet occurred (almost there)!

Gibbs sampler for the Ising model

% Temperature parameter
T = 1; beta = 1/T;

% Initialize the chain
X = ones(10,10);

% Number of steps
N = 10000;
for n = 1:N,
  % Pick a vertex uniformly at random
  v = randsample(1:10,2,true);

  % Sample from the conditional distribution
  G = sum(neighbors(v,X));
  if (rand(1) < 1/(1+exp(-beta*G)))
    X(v(1), v(2)) = 1;
  else
    X(v(1), v(2)) = -1;
  end
end
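A rough Python transcription of this Gibbs sweep, mirroring the MATLAB above including its conditional probability 1/(1 + exp(−βG)) (grid size and step count are illustrative):

```python
import math
import random

random.seed(0)
L, T, N = 10, 1.0, 10000
beta = 1.0 / T
X = [[1] * L for _ in range(L)]  # initialize all spins to +1

def neighbor_sum(X, i, j, L):
    """Sum of the spins at the lattice neighbors of site (i, j)."""
    return sum(X[a][b]
               for a, b in [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
               if 0 <= a < L and 0 <= b < L)

for _ in range(N):
    # Pick a vertex uniformly at random
    i, j = random.randrange(L), random.randrange(L)
    G = neighbor_sum(X, i, j, L)
    # Sample the spin from its conditional distribution given the neighbors
    X[i][j] = 1 if random.random() < 1.0 / (1.0 + math.exp(-beta * G)) else -1

M = sum(sum(row) for row in X)  # magnetization of the final state
```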

Magnetization

The magnetization is M = Σv Iv.

[Figure: histogram of M, ranging from −100 to 100]

Other choices

• Other choices of initial state; e.g. Iv i.i.d. with Iv = ±1 w.p. 1/2

• Speeding up convergence

– Simulated annealing

– Updating blocks of spins at a time
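Simulated annealing can be sketched by running a Metropolis chain while gradually increasing the inverse temperature β (an illustrative schedule and toy objective, not from the lecture):

```python
import math
import random

random.seed(0)

def anneal(energy, propose, x0, beta_schedule):
    """Metropolis with a time-varying inverse temperature.

    energy(x)     : energy of state x (to be minimized)
    propose(x)    : proposed new state
    beta_schedule : iterable of increasing inverse temperatures
    """
    x, e = x0, energy(x0)
    for beta in beta_schedule:
        y = propose(x)
        ey = energy(y)
        # Standard Metropolis acceptance at the current temperature
        if random.random() < min(1.0, math.exp(-beta * (ey - e))):
            x, e = y, ey
    return x

# Toy example: minimize E(x) = x^2 over the integers, starting far from 0.
# Early (hot) steps explore freely; late (cold) steps descend greedily.
best = anneal(
    energy=lambda x: x * x,
    propose=lambda x: x + random.choice([-1, 1]),
    x0=25,
    beta_schedule=[0.01 * t for t in range(1, 5001)],
)
```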