Monte Carlo Methods - a Special Topics Course

Tom Kennedy

April 27, 2016

Contents

1 Introduction

I Independent Monte Carlo

2 Basics of direct Monte Carlo
  2.1 The probabilistic basis for direct MC
  2.2 Error estimation
  2.3 Accuracy vs. computation time
  2.4 How good is the confidence interval
  2.5 Estimating a function of several means
  2.6 References

3 Pseudo-random number generators
  3.1 Basics of pseudo-random number generators
  3.2 Some examples
    3.2.1 Linear congruential generators
    3.2.2 Multiple-recursive generators
    3.2.3 Combining generators
  3.3 A little statistics
  3.4 Tests for pseudo-random number generators
    3.4.1 Equidistribution tests
    3.4.2 Gap tests
    3.4.3 Permutation tests
    3.4.4 Rank of binary matrix test
    3.4.5 Two-stage or second order tests
    3.4.6 Test suites
  3.5 Seeding the random number generator
  3.6 Practical advice
  3.7 References

4 Generating non-uniform random variables
  4.1 Inversion
  4.2 Acceptance-rejection
  4.3 Alias method for discrete distributions
  4.4 Tricks for specific distributions
    4.4.1 Box-Muller
    4.4.2 Geometric
    4.4.3 Random permutations
    4.4.4 RV's that are sums of simpler RV's
    4.4.5 Mixtures
  4.5 Generating multidimensional random variables
    4.5.1 Composition method or conditioning
    4.5.2 Acceptance-rejection
    4.5.3 The multivariate normal
    4.5.4 Affine transformations
    4.5.5 Uniform point in a sphere, on a sphere

5 Variance reduction
  5.1 Antithetic variables
  5.2 Control variates
  5.3 Stratified sampling
  5.4 Conditioning

6 Importance sampling
  6.1 The basics
  6.2 Self-normalized importance sampling
  6.3 Variance minimization and exponential tilting
  6.4 Processes

II Markov Chain Monte Carlo

7 Markov chain background
  7.1 Finite state space
  7.2 Countable state space
  7.3 General state space

8 Markov chain Monte Carlo
  8.1 The key idea of MCMC
  8.2 The Metropolis-Hastings algorithm
  8.3 The independence sampler
  8.4 The Gibbs sampler
  8.5 Slice sampler
  8.6 Bayesian statistics and MCMC

9 Convergence and error estimation for MCMC
  9.1 Introduction - sources of errors
  9.2 The variance for correlated samples
  9.3 Variance via batched means
  9.4 Subsampling
  9.5 Burn-in or initialization
  9.6 Autocorrelation times and related times
  9.7 Missing mass or bottlenecks

III Further topics

10 Optimization
  10.1 Simulated annealing
  10.2 Estimating derivatives
  10.3 Stochastic gradient descent

11 Further topics
  11.1 Computing the distribution - CDF's
  11.2 Computing the distribution - kernel density estimation
  11.3 Sequential Monte Carlo
    11.3.1 Review of weighted importance sampling
    11.3.2 Resampling
    11.3.3 Sequential MC

Chapter 1

Introduction

A Monte Carlo method is a computational method that uses random numbers to compute (estimate) some quantity of interest. Very often the quantity we want to compute is the mean of some random variable. Or we might want to compute some function of several means of random variables, e.g., the ratio of two means. Or we might want to compute the distribution of our random variable. Although Monte Carlo is a probabilistic method, it is often applied to problems that do not have any randomness in them. A classic example of this is using Monte Carlo to compute high-dimensional integrals.

Monte Carlo methods may be divided into two types. In the first type we generate independent samples of the random variable. This is usually called direct, simple or crude Monte Carlo. The latter two terms are rather misleading. We will refer to these types of methods as direct Monte Carlo. In this type of Monte Carlo the samples X_i that we generate are an i.i.d. sequence. So the strong law of large numbers tells us that the sample mean $\frac{1}{n} \sum_{i=1}^n X_i$ will converge to the mean of X as $n \to \infty$. Furthermore, the central limit theorem tells us a lot about the error in our computation.

The second type of methods are Markov chain Monte Carlo (MCMC) methods. These methods construct a Markov chain whose stationary distribution is the probability measure we want to simulate. We then generate samples of the distribution by running the Markov chain. As the chain runs we compute the value X_n of our random variable at each time step n. The samples X_n are not independent, but there are theorems that tell us the sample mean still converges to the mean of our random variable. We give some examples.
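Before turning to the examples, the direct Monte Carlo recipe just described (a sample mean plus a CLT-based error bar) can be sketched in a few lines of Python. The random variable X = U^2 with U uniform on [0,1], which has known mean 1/3, is an illustrative choice, not one from the text.

```python
import random
import math

def direct_mc(sample, n):
    """Direct Monte Carlo: average n i.i.d. samples of a random variable
    and report the CLT-based standard error sqrt(s^2 / n) of the estimate."""
    xs = [sample() for _ in range(n)]
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)  # sample variance s^2
    return mean, math.sqrt(var / n)

# Illustrative choice: X = U^2 with U uniform on [0,1], so E[X] = 1/3.
random.seed(0)
est, se = direct_mc(lambda: random.random() ** 2, 100_000)
# est should be within a few standard errors of 1/3
```

The standard error shrinks like 1/sqrt(n), which is the convergence rate discussed above.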
Example - Integration: Suppose we want to compute a definite integral over an interval like

I = \int_a^b f(x) \, dx \qquad (1.1)

Let X be a random variable that is uniformly distributed on [a,b]. Then the expected value of f(X) is

E[f(X)] = \frac{1}{|b-a|} \int_a^b f(x) \, dx \qquad (1.2)

So I = (b-a) E[f(X)]. If we generate n independent samples of X, call them X_1, X_2, \cdots, X_n, then the law of large numbers says that

\frac{1}{n} \sum_{i=1}^n f(X_i) \qquad (1.3)

converges to E[f(X)]. The error in this method goes to zero with n as 1/\sqrt{n}. If f is smooth, simple numerical integration methods like Simpson's rule do much better - 1/n^4 for Simpson's rule. However, the rate of convergence of such methods is worse in higher dimensions. In higher dimensions, Monte Carlo with its slow rate of convergence may actually be faster. And if the integrand is not smooth, Monte Carlo may be the best method, especially if simplicity is important.

Example - more integration: Suppose we want to evaluate the integral

\int_0^{10} e^{-x^2} \, dx \qquad (1.4)

We could follow the example above. However, e^{-x^2} is essentially zero on a large part of the interval [0,10]. So for most of our samples X_i, f(X_i) is essentially zero. This looks inefficient. A large part of our computation time is spent just adding zero to our total. We can improve the efficiency by a technique that is called importance sampling. We rewrite the integral we want to compute as

\int_0^{10} \frac{e^{-x^2}}{c e^{-x}} \, c e^{-x} \, dx \qquad (1.5)

where the constant c is chosen so that \int_0^{10} c e^{-x} \, dx = 1. Let g(x) = e^{-x^2} / (c e^{-x}). If X is a random variable with density c e^{-x}, then the expected value E[g(X)] is the integral we want to compute. As we will learn, it is quite easy to use uniform random numbers on [0,1] to generate samples of X with the desired density. (It is not so easy to generate samples with density proportional to e^{-x^2} on [0,10].)

Example - yet more integration: Another example of the need for importance sampling is the following. Suppose we want to compute a d-dimensional integral over some region D.
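The two integration examples above can be sketched in Python: plain Monte Carlo for the integral (1.4), and importance sampling with the tilted density c e^{-x} of (1.5), where X is generated by inversion of the CDF (a technique covered later in the notes). The sample size and seed are arbitrary choices for illustration.

```python
import random
import math

random.seed(1)
n = 200_000
norm = 1.0 - math.exp(-10.0)  # c = 1/norm makes c*e^{-x} a density on [0,10]

# Plain Monte Carlo: X uniform on [0,10], average (b-a)*f(X).
plain = sum(10.0 * math.exp(-random.uniform(0.0, 10.0) ** 2)
            for _ in range(n)) / n

def is_sample():
    """One importance-sampling draw: generate X with density e^{-x}/norm
    by inverting its CDF, then return g(X) = f(X) / density(X)."""
    x = -math.log(1.0 - random.random() * norm)  # inverse CDF on [0,10]
    return math.exp(-x * x) / (math.exp(-x) / norm)

weighted = sum(is_sample() for _ in range(n)) / n
# Both estimates approximate the true value sqrt(pi)/2 ~ 0.8862,
# but the importance-sampling estimate has much smaller variance.
```

The variance reduction comes from concentrating samples near x = 0, where e^{-x^2} actually contributes to the integral, instead of spreading them uniformly over [0,10].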
If D is a parallelepiped, it is easy to generate a random vector uniformly distributed over D. But if D is a more complicated shape this can be quite difficult. One approach is to find a parallelepiped R that contains D. We then generate random vectors that are uniformly distributed on R, but we reject them when the vector is outside of D. How efficient this is depends on the ratio of the volume of D to that of R. Even if it is very difficult to generate uniform samples from D, it may be possible to find a domain D′ which contains D and whose volume is not too much bigger than that of D. Then generating a uniform sample from D′ and rejecting it if it is not in D can be a much more efficient method than generating a uniform sample from a parallelepiped that contains D.

Example - shortest path in a network: By a network we mean a collection of nodes and a collection of edges that connect pairs of nodes. (Given two nodes, there need not be an edge between them.) For each edge there is a random variable that gives the time it takes to traverse that edge. These random variables are taken to be independent. We fix a starting node and an ending node in the graph. The random variable we want to study is the minimum total time it takes to get from the starting node to the ending node, where the minimum is over all possible paths in the network from the starting node to the ending node. For very small networks you can work out the expected value of this random variable. But this analytic approach becomes intractable for even modest size networks. The Monte Carlo approach is to generate a bunch of samples of the network and for each sample network compute the minimum transit time. (This is easier said than done.) The average of these minima over the samples is then our estimate for the expected value we want.

Example - network connectivity or reliability: We now consider a network but now it is random
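The accept/reject idea for sampling a region D inside an enclosing box R can be sketched as follows. The unit disk inside the square R = [-1,1]^2 is an illustrative choice of D and R, not one from the text.

```python
import random

rng = random.Random(2)

def uniform_in_disk():
    """Rejection sampling: propose points uniform on the square R = [-1,1]^2
    and return the first one that lands inside the unit disk D."""
    while True:
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            return (x, y)

# The efficiency is the acceptance rate vol(D)/vol(R) = pi/4 ~ 0.785,
# so on average about 4/pi ~ 1.27 proposals are needed per accepted point.
trials = 100_000
hits = 0
for _ in range(trials):
    x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
    hits += x * x + y * y <= 1.0
rate = hits / trials  # empirical acceptance rate, close to pi/4
```

In high dimensions this ratio can be tiny (for the unit ball in a cube it shrinks to zero as d grows), which is exactly why a tighter enclosing domain D′ can be worth the trouble.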
