Monte Carlo Sampling Methods
[1] Monte Carlo Sampling Methods

Jasmina L. Vujic
Nuclear Engineering Department
University of California, Berkeley
Email: [email protected]
Phone: (510) 643-8085, Fax: (510) 643-9685

UCBNE, J. Vujic

[2] Monte Carlo

Monte Carlo is a computational technique based on constructing a random process for a problem and carrying out a NUMERICAL EXPERIMENT by N-fold sampling from a random sequence of numbers with a PRESCRIBED probability distribution.

x - random variable

\hat{x} = \frac{1}{N} \sum_{i=1}^{N} x_i

\hat{x} - the estimated or sample mean of x
\bar{x} - the expectation or true mean value of x

If a problem can be given a PROBABILISTIC interpretation, then it can be modeled using RANDOM NUMBERS.

[3] Monte Carlo

• Monte Carlo techniques came from the complicated diffusion problems that were encountered in the early work on atomic energy.
• 1772: Comte de Buffon - earliest documented use of random sampling to solve a mathematical problem.
• 1786: Laplace suggested that π could be evaluated by random sampling.
• Lord Kelvin used random sampling to aid in evaluating time integrals associated with the kinetic theory of gases.
• Enrico Fermi was among the first to apply random sampling methods to study neutron moderation in Rome.
• 1947: Fermi, John von Neumann, Stan Frankel, Nicholas Metropolis, Stan Ulam, and others developed computer-oriented Monte Carlo methods at Los Alamos to trace neutrons through fissionable materials.

[4] Monte Carlo

Monte Carlo methods can be used to solve:

a) Problems that are stochastic (probabilistic) by nature:
- particle transport,
- telephone and other communication systems,
- population studies based on the statistics of survival and reproduction.

b) Problems that are deterministic by nature:
- evaluation of integrals,
- solving systems of algebraic equations,
- solving partial differential equations.
[5] Monte Carlo

Monte Carlo methods are divided into:

a) ANALOG, where the natural laws are PRESERVED - the game played is the analog of the physical problem of interest (i.e., the history of each particle is simulated exactly),

b) NON-ANALOG, where the strict analog simulation of particle histories is abandoned in order to reduce the required computational time (i.e., we CHEAT!)

Variance-reduction techniques:
- Absorption suppression
- History termination and Russian Roulette
- Splitting and Russian Roulette
- Forced Collisions
- Source Biasing

[6] Example 1: Evaluation of Integrals

[Figure: a curve f(x) on (a, b) enclosed by a rectangle of height f_max; random points below the curve are accepted, points above it are rejected.]

I = \int_a^b f(x) \, dx - area under the function f(x); R = (b - a) f_{max} - area of the bounding rectangle

P = I / R - the probability that a random point lies under f(x), thus I = R \cdot P

Step 1: Choose a random point (x_1, y_1): x_1 = a + (b - a)\xi_1 and y_1 = f_{max} \xi_2
Step 2: If y_1 \le f(x_1), accept the point; if y_1 > f(x_1), reject the point
Step 3: Repeat this process N times; N_i - the number of accepted points
Step 4: Determine P = N_i / N and the value of the integral, I = R \cdot N_i / N

[7] Major Components of a Monte Carlo Algorithm

• Probability distribution functions (pdf's) - the physical (or mathematical) system must be described by a set of pdf's.
• Random number generator - a source of random numbers uniformly distributed on the unit interval must be available.
• Sampling rule - a prescription for sampling from the specified pdf, assuming the availability of random numbers on the unit interval.
• Scoring (or tallying) - the outcomes must be accumulated into overall tallies or scores for the quantities of interest.
• Error estimation - an estimate of the statistical error (variance) as a function of the number of trials and other quantities must be determined.
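The hit-or-miss procedure of Example 1 can be sketched in a few lines of Python; the function name and the test integrand below are illustrative choices of mine, not from the slides:

```python
import random

def mc_integral(f, a, b, f_max, n=100_000):
    """Estimate I = integral of f(x) over (a, b) by hit-or-miss sampling.

    R = (b - a) * f_max is the area of the bounding rectangle; the
    fraction of random points that fall under f(x) estimates P = I / R.
    """
    hits = 0
    for _ in range(n):
        x = a + (b - a) * random.random()   # Step 1: x1 = a + (b - a)*xi1
        y = f_max * random.random()         #         y1 = f_max * xi2
        if y <= f(x):                       # Step 2: accept if y1 <= f(x1)
            hits += 1
    return (b - a) * f_max * hits / n       # Step 4: I = R * Ni / N

# Example: estimate integral of x^2 over (0, 1), whose true value is 1/3
random.seed(0)
print(mc_integral(lambda x: x * x, 0.0, 1.0, 1.0))
```

Note that f_max must genuinely bound f on (a, b), otherwise points above f_max are never generated and the estimate is biased low.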
• Variance reduction techniques - methods for reducing the variance of the estimated solution, in order to reduce the computational time of the Monte Carlo simulation.
• Parallelization and vectorization - efficient use of advanced computer architectures.

[8] Probability Distribution Functions

Random variable, x - a variable that takes on particular values with a frequency that is determined by some underlying probability distribution.

Continuous probability distribution: P\{a \le x \le b\}
Discrete probability distribution: P\{x = x_i\} = p_i

[9] PDFs and CDFs (continuous)

Probability Density Function (PDF) - continuous
• f(x), with f(x)\,dx = P\{x \le x' \le x + dx\}
• 0 \le f(x), \quad \int_{-\infty}^{\infty} f(x)\,dx = 1
• P\{a \le x \le b\} = \int_a^b f(x)\,dx

Cumulative Distribution Function (CDF) - continuous
• F(x) = \int_{-\infty}^{x} f(x')\,dx' = P\{x' \le x\}
• 0 \le F(x) \le 1
• F(x) is non-decreasing: \frac{d}{dx}F(x) = f(x) \ge 0
• \int_a^b f(x')\,dx' = P\{a \le x \le b\} = F(b) - F(a)

[10] PDFs and CDFs (discrete)

Probability Density Function (PDF) - discrete
• f(x) = \sum_i p_i \, \delta(x - x_i)
• 0 \le f(x_i)
• \sum_i f(x_i)\,\Delta x_i = 1, \quad \text{or} \quad \sum_i p_i = 1

Cumulative Distribution Function (CDF) - discrete
• F(x) = \sum_{x_i < x} p_i = \sum_{x_i < x} f(x_i)\,\Delta x_i
• 0 \le F(x) \le 1

[Figure: the discrete PDF shown as spikes p_1, p_2, p_3 at x_1, x_2, x_3; the CDF as a staircase rising through p_1, p_1 + p_2, to p_1 + p_2 + p_3 = 1.]

[11] Sampling from a given discrete distribution

Given f(x_i) = p_i and \sum_i p_i = 1, i = 1, 2, ..., N, partition the unit interval at the points 0, p_1, p_1 + p_2, ..., p_1 + p_2 + \dots + p_N = 1.

For \xi with 0 \le \xi \le 1, choose x_k such that

\sum_{i=1}^{k-1} p_i \le \xi < \sum_{i=1}^{k} p_i

Then P\{x = x_k\} = p_k = P\{\xi \in d_k\}.
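The discrete-sampling rule above amounts to locating ξ among the cumulative sums, which a binary search does efficiently. A minimal Python sketch (the function name is mine):

```python
import random
from bisect import bisect_right

def sample_discrete(xs, ps, xi):
    """Return x_k such that sum_{i<k} p_i <= xi < sum_{i<=k} p_i,
    i.e. the discrete-distribution sampling rule with a binary search."""
    cdf = []
    total = 0.0
    for p in ps:
        total += p
        cdf.append(total)                 # running sums p1, p1+p2, ...
    k = bisect_right(cdf, xi)             # first index with cdf[k] > xi
    return xs[min(k, len(xs) - 1)]        # guard against rounding in the sums

# Example: draw one value from P{a}=0.2, P{b}=0.3, P{c}=0.5
random.seed(1)
print(sample_discrete(["a", "b", "c"], [0.2, 0.3, 0.5], random.random()))
```

Building the cumulative table once and reusing it across many draws is the usual optimization; it is rebuilt per call here only to keep the sketch short.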
[12] Sampling from a given continuous distribution

If f(x) and F(x) represent the PDF and CDF of a random variable x, if \xi is a random number distributed uniformly on [0,1] with PDF g(\xi) = 1, and if x is such that F(x) = \xi, then for each \xi there is a corresponding x, and the variable x is distributed according to the probability density function f(x).

Proof: For each \xi in (\xi, \xi + \Delta\xi) there is an x in (x, x + \Delta x). Assume that the PDF for x is q(x). Show that q(x) = f(x):

q(x)\,\Delta x = g(\xi)\,\Delta\xi = \Delta\xi = (\xi + \Delta\xi) - \xi = F(x + \Delta x) - F(x)

q(x) = [F(x + \Delta x) - F(x)] / \Delta x \to f(x)

Thus, if x = F^{-1}(\xi), then x is distributed according to f(x).

[13] Monte Carlo Codes

Categories of random sampling:
• Random number generator - uniform PDF on [0,1]
• Sampling from analytic PDF's - normal, exponential, Maxwellian, ...
• Sampling from tabulated PDF's - angular PDF's, spectra, cross sections

For Monte Carlo codes:
• Random numbers, \xi, are produced by the R.N. generator on [0,1]
• Non-uniform random variates are produced from the \xi's by
— direct inversion of CDFs
— rejection methods
— transformations
— composition (mixtures)
— sums, products, ratios, ...
— table lookup + interpolation
— lots (!) of other tricks
• Random sampling typically takes < 10% of total CPU time
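The inversion argument above can be tried on the exponential PDF f(x) = e^{-x}: its CDF F(x) = 1 - e^{-x} inverts in closed form. A small Python sketch (names are my own):

```python
import math
import random

def sample_exponential(xi):
    """Direct inversion for f(x) = e^{-x}, x > 0:
    F(x) = 1 - e^{-x} = xi  =>  x = -log(1 - xi); since (1 - xi) is
    also uniform on (0, 1], the shorter x = -log(xi) is used in practice."""
    return -math.log(xi)

# The sample mean should approach the true mean of f, which is 1
random.seed(2)
samples = [sample_exponential(random.random()) for _ in range(200_000)]
print(sum(samples) / len(samples))
```

The same pattern works for any PDF whose CDF has a cheap closed-form inverse; when it does not (the Klein-Nishina case mentioned later), rejection sampling takes over.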
[14] Random Number Generator

Pseudo-random numbers
• Not strictly "random", but good enough:
— pass statistical tests
— reproducible sequence
• Uniform PDF on [0,1]
• Must be easy to compute and must have a large period

Multiplicative congruential method
• Algorithm:
S_0 = \text{initial seed, odd integer}, < M
S_k = G \cdot S_{k-1} \bmod M, \quad k = 1, 2, \dots
\xi_k = S_k / M
• Typical constants (VIM, MCNP):
S_k = 5^{19} \cdot S_{k-1} \bmod 2^{48}
\xi_k = S_k / 2^{48}
\text{period} = 2^{46} \approx 7.0 \times 10^{13}

[15] Direct Sampling (Direct Inversion of CDFs)

Direct solution of \hat{x} \leftarrow F^{-1}(\xi)

Sampling procedure:
• Generate \xi
• Determine \hat{x} such that F(\hat{x}) = \xi

Advantages
• Straightforward mathematics & coding
• "High-level" approach

Disadvantages
• Often involves complicated functions
• In some cases, F(x) cannot be inverted (e.g., Klein-Nishina)

[16] Rejection Sampling

Used when the inverse of the CDF is costly or impossible to find.

Select a bounding function, g(x), such that
• c \cdot g(x) \ge f(x) for all x
• g(x) is an easy-to-sample PDF

Sampling procedure:
• Sample \hat{x} from g(x): \hat{x} \leftarrow G^{-1}(\xi_1), where G is the CDF of g
• Test: \xi_2 \cdot c \cdot g(\hat{x}) \le f(\hat{x})
— if true, accept \hat{x}, done
— if false, reject \hat{x}, try again

Advantages
• Simple computer operations

Disadvantages
• "Low-level" approach, sometimes hard to understand

[17] Table Look-Up

Used when f(x) is given in the form of a histogram, with values f_i on the bins (x_{i-1}, x_i) and the CDF F_i = F(x_i) tabulated at the bin boundaries.

[Figure: histogram PDF f(x) with bin values f_1, f_2, ..., f_i, and the piecewise-linear CDF F(x) intersected by the sampled \xi.]

Then by linear interpolation

F(x) = \frac{(x - x_{i-1}) F_i + (x_i - x) F_{i-1}}{x_i - x_{i-1}}, \quad x = \frac{(x_i - x_{i-1})\,\xi - x_i F_{i-1} + x_{i-1} F_i}{F_i - F_{i-1}}
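The rejection procedure can be sketched in Python with the simplest possible bounding function, a uniform g(x) = 1 with c = 2 covering f(x) = 2x on (0, 1); the target PDF and all names here are illustrative choices of mine:

```python
import random

def rejection_sample(f, g_sample, g_pdf, c):
    """Rejection sampling: g is an easy-to-sample PDF with c*g(x) >= f(x)
    for all x. Accept xhat when xi2 * c * g(xhat) <= f(xhat)."""
    while True:
        xhat = g_sample()                             # xhat <- G^{-1}(xi1)
        if random.random() * c * g_pdf(xhat) <= f(xhat):
            return xhat                               # accept, done
        # otherwise reject xhat and try again

# Example: sample f(x) = 2x on (0, 1) under the bound 2 * g(x), g(x) = 1
random.seed(4)
samples = [rejection_sample(lambda x: 2.0 * x, random.random,
                            lambda x: 1.0, 2.0)
           for _ in range(10_000)]
print(sum(samples) / len(samples))  # should approach the true mean 2/3
```

The acceptance probability is 1/c, so the tighter the bound c·g(x) hugs f(x), the fewer wasted samples; a loose bound still gives correct results, just slowly.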
[18] Sampling Multidimensional Random Variables

If the random quantity is a function of two or more random variables that are independent, the joint PDF and CDF can be written as

f(x, y) = f_1(x)\,f_2(y)
F(x, y) = F_1(x)\,F_2(y)

EXAMPLE: Sampling the direction of an isotropically scattered particle in 3D

\Omega = \Omega_x \mathbf{i} + \Omega_y \mathbf{j} + \Omega_z \mathbf{k} = u\mathbf{i} + v\mathbf{j} + w\mathbf{k}

f(\Omega)\,d\Omega = \frac{d\Omega}{4\pi} = \frac{\sin\theta\,d\theta\,d\varphi}{4\pi} = \frac{-d(\cos\theta)\,d\varphi}{4\pi} = \frac{-d\mu\,d\varphi}{4\pi}

f(\Omega) = f_1(\mu)\,f_2(\varphi) = \frac{1}{2} \cdot \frac{1}{2\pi} = \frac{1}{4\pi}

F_1(\mu) = \int_{-1}^{\mu} f_1(\mu')\,d\mu' = \frac{1}{2}(\mu + 1) = \xi_1, \quad \text{or} \quad \mu = 2\xi_1 - 1

F_2(\varphi) = \int_{0}^{\varphi} f_2(\varphi')\,d\varphi' = \frac{\varphi}{2\pi} = \xi_2, \quad \text{or} \quad \varphi = 2\pi\xi_2

[19] Direct Sampling Methods for Common PDFs

Linear (L1, L2): f(x) = 2x, \; 0 < x < 1
x \leftarrow \max(\xi_1, \xi_2)

Exponential (E): f(x) = e^{-x}, \; 0 < x
x \leftarrow -\log\xi

2D Isotropic (C): f(\rho) = \frac{1}{2\pi}, \; \rho = (u, v)
u \leftarrow \cos 2\pi\xi_1, \quad v \leftarrow \sin 2\pi\xi_1

3D Isotropic (I1, I2): f(\Omega) = \frac{1}{4\pi}, \; \Omega = (u, v, w)
u \leftarrow 2\xi_1 - 1, \quad v \leftarrow \sqrt{1 - u^2}\cos 2\pi\xi_2, \quad w \leftarrow \sqrt{1 - u^2}\sin 2\pi\xi_2

Maxwellian (M1, M2, M3): f(x) = \frac{2}{\sqrt{\pi}} \frac{\sqrt{x}}{T^{3/2}} e^{-x/T}, \; 0 < x
x \leftarrow -T\left(\log\xi_1 + \log\xi_2 \cos^2\frac{\pi\xi_3}{2}\right)

Watt Spectrum (W1, W2, W3): f(x) = \frac{2 e^{-ab/4}}{\sqrt{\pi a^3 b}}\, e^{-x/a} \sinh\sqrt{bx}, \; 0 < x
w \leftarrow -a\left(\log\xi_1 + \log\xi_2 \cos^2\frac{\pi\xi_3}{2}\right), \quad x \leftarrow w + \frac{a^2 b}{4} + (2\xi_4 - 1)\sqrt{a^2 b\, w}

Normal (N1, N2): f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x - \mu}{\sigma}\right)^2}
x \leftarrow \mu + \sigma\sqrt{-2\log\xi_1}\cos 2\pi\xi_2
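The 3D isotropic recipe (I1, I2) is worth seeing end to end, since it combines both one-dimensional inversions into a unit direction vector. A short Python sketch (the function name is mine):

```python
import math
import random

def isotropic_direction():
    """Sample a unit direction uniformly on the sphere:
    mu = cos(theta) = 2*xi1 - 1, phi = 2*pi*xi2, then
    (u, v, w) = (mu, sin(theta)*cos(phi), sin(theta)*sin(phi))."""
    mu = 2.0 * random.random() - 1.0          # mu = 2*xi1 - 1
    phi = 2.0 * math.pi * random.random()     # phi = 2*pi*xi2
    s = math.sqrt(1.0 - mu * mu)              # sin(theta) from cos(theta)
    return (mu, s * math.cos(phi), s * math.sin(phi))

random.seed(5)
u, v, w = isotropic_direction()
print(u * u + v * v + w * w)  # unit length up to floating-point rounding
```

Sampling mu = cos(theta) uniformly, rather than theta itself, is the essential point: a uniform theta would bunch directions toward the poles, while a uniform mu gives equal probability per unit solid angle.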