Probability and Random Processes
Ramon van Handel

Probability and Random Processes
ORF 309/MAT 380 Lecture Notes
Princeton University

This version: February 22, 2016


Preface

These lecture notes are intended for a one-semester undergraduate course in applied probability. Such a course has been taught at Princeton for many years by Erhan Çinlar. The choice of material in these notes was greatly inspired by Çinlar's course, though my own biases regarding the material and presentation are inevitably reflected in the present incarnation. As always, some choices had to be made regarding what to present:

• It should be emphasized that this course is not intended for a pure mathematics audience, for whom an entirely different approach would be indicated. The course is taken by a diverse range of undergraduates in the sciences, engineering, and applied mathematics. For this reason, the focus is on probabilistic intuition rather than rigorous proofs, and the choice of material emphasizes exact computations rather than inequalities or asymptotics. The main aim is to introduce an applied audience to a range of basic probabilistic notions and to quantitative probabilistic reasoning.

• A principle I have tried to follow as much as possible is not to introduce any concept out of the blue, but rather to have a natural progression of topics. For example, every new distribution that is encountered is derived naturally from a probabilistic model, rather than being defined abstractly. My hope is that this helps students develop a feeling for the big picture and for the connections between the different topics.

• The range of topics is quite large for a first course on probability, and the pace is rapid. The main missing topic is an introduction to martingales; I hope to add a chapter on this at the end at some point in the future.

It is a fact of life that lecture notes are a perpetual construction zone. Surely errors remain to be fixed and presentation remains to be improved. Many thanks are due to all students who provided me with corrections in the past, and I will be grateful to continue to receive such feedback in the future.

Princeton, January 2016


Contents

0 Introduction
   0.1 What is probability?
   0.2 Why do we need a mathematical theory?
   0.3 This course

1 Basic Principles of Probability
   1.1 Sample space
   1.2 Events
   1.3 Probability measure
   1.4 Probabilistic modelling
   1.5 Conditional probability
   1.6 Independent events
   1.7 Random variables
   1.8 Expectation and distributions
   1.9 Independence and conditioning

2 Bernoulli Processes
   2.1 Counting successes and binomial distribution
   2.2 Arrival times and geometric distribution
   2.3 The law of large numbers
   2.4 From discrete to continuous arrivals

3 Continuous Random Variables
   3.1 Expectation and integrals
   3.2 Joint and conditional densities
   3.3 Independence

4 Lifetimes and Reliability
   4.1 Lifetimes
   4.2 Minima and maxima
   4.3 * Reliability
   4.4 * A random process perspective

5 Poisson Processes
   5.1 Counting processes and Poisson processes
   5.2 Superposition and thinning
   5.3 Nonhomogeneous Poisson processes

6 Random Walks
   6.1 What is a random walk?
   6.2 Hitting times
   6.3 Gambler's ruin
   6.4 Biased random walks

7 Brownian Motion
   7.1 The continuous time limit of a random walk
   7.2 Brownian motion and Gaussian distribution
   7.3 The central limit theorem
   7.4 Jointly Gaussian variables
   7.5 Sample paths of Brownian motion

8 Branching Processes
   8.1 The Galton-Watson process
   8.2 Extinction probability

9 Markov Chains
   9.1 Markov chains and transition probabilities
   9.2 Classification of states
   9.3 First step analysis
   9.4 Steady-state behavior
   9.5 The law of large numbers revisited


0 Introduction

0.1 What is probability?

Most simply stated, probability is the study of randomness. Randomness is of course everywhere around us; this statement surely needs no justification! One of the remarkable aspects of this subject is that it touches almost every area of the natural sciences, engineering, social sciences, and even pure mathematics. The following random examples are only a drop in the bucket.

• Physics: quantities such as temperature and pressure arise as a direct consequence of the random motion of atoms and molecules. Quantum mechanics tells us that the world is random at an even more basic level.

• Biology and medicine: random mutations are the key driving force behind evolution, which has led to the amazing diversity of life that we see today. Random models are essential in understanding the spread of disease, both in a population (epidemics) and in the human body (cancer).

• Chemistry: chemical reactions happen when molecules randomly meet. Random models of chemical kinetics are particularly important in systems with very low concentrations, such as biochemical reactions in a single cell.

• Electrical engineering: noise is the universal bane of accurate transmission of information. The effect of random noise must be well understood in order to design the reliable communication protocols that you use on a daily basis in your cell phones. The modelling of data, such as English text, using random models is a key ingredient in many data compression schemes.

• Computer science: randomness is an important resource in the design of algorithms. In many situations, randomized algorithms provide the best known methods to solve hard problems.

• Civil engineering: the design of buildings and structures that can reliably withstand unpredictable effects, such as vibrations, variable rainfall and wind, etc., requires one to take randomness into account.

• Finance and economics: stock and bond prices are inherently unpredictable; as such, random models form the basis for almost all work in the financial industry. The modelling of randomly occurring rare events forms the basis for all insurance policies, and for risk management in banks.

• Sociology: random models provide basic understanding of the formation of social networks and of the nature of voting schemes, and form the basis for principled methodology for surveys and other data collection methods.

• Statistics and machine learning: random models form the foundation for almost all of data science. The random nature of data must be well understood in order to draw reliable conclusions from large data sets.
• Pure mathematics: probability theory is a mathematical field in its own right, but is also widely used in many problems throughout pure mathematics, in areas such as combinatorics, analysis, and number theory.

• ... (insert your favorite subject here)

As a probabilist¹, I find it fascinating that the same basic principles lie at the heart of such a diverse list of interesting phenomena: probability theory is the foundation that ties all these and innumerable other areas together. This should already be enough motivation in its own right to convince you (in case you were not already convinced) that we are on to an exciting topic.

¹ Official definition from the Oxford English Dictionary: "probabilist, n. An expert or specialist in the mathematical theory of probability."

Before we can have a meaningful discussion, we should at least have a basic idea of what randomness means. Let us first consider the opposite notion. Suppose we throw a ball many times at exactly the same angle and speed and under exactly the same conditions. Every time we run this experiment, the ball will land in exactly the same place: we can predict exactly what is going to happen. This is an example of a deterministic system. Randomness is the opposite of determinism: a random phenomenon is one that can yield different outcomes in repeated experiments, even if we use exactly the same conditions in each experiment. For example, if we flip a coin, we know in advance that it will either come up heads or tails, but we cannot predict before any given experiment which of these outcomes will occur. Our challenge is to develop a framework to reason precisely about random phenomena.

0.2 Why do we need a mathematical theory?

It is not at all obvious at first sight that it is possible to develop a rigorous theory of probability: how can one make precise predictions about a phenomenon whose behavior is inherently unpredictable? This philosophical hurdle hampered the development of probability theory for many centuries.

To illustrate the pitfalls of an intuitive approach to probability, let us consider a seemingly plausible definition. You probably think of the probability that an event E happens as the fraction of outcomes in which E occurs (this is not entirely unreasonable). We could posit this as a tentative definition:

\[
\text{Probability of } E = \frac{\text{Number of outcomes where } E \text{ occurs}}{\text{Number of all possible outcomes}}.
\]

This sort of intuitive definition may look at first sight like it matches your experience. However, it is totally meaningless: we can easily use it to come to entirely different conclusions.

Example 0.2.1. Suppose that we flip two coins. What is the probability that we obtain one heads (H) and one tails (T)?

• Solution 1: The possible outcomes are HH, HT, TH, TT. The outcomes where we have one heads and one tails are HT and TH. Hence,

\[
\text{Probability of one heads and one tails} = \frac{2}{4} = \frac{1}{2}.
\]

• Solution 2: The possible outcomes are two heads, one heads and one tails, two tails. Only one of these outcomes has one heads and one tails. Hence,

\[
\text{Probability of one heads and one tails} = \frac{1}{3}.
\]
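The two solutions cannot both be right, and one quick sanity check is to simulate the experiment and look at the observed frequency. The Python sketch below is a minimal illustration, not taken from the notes, assuming two fair coins flipped independently; the function name, seed, and number of trials are arbitrary choices made for the example.

    import random

    def estimate_one_heads_one_tails(num_trials=100_000, seed=0):
        """Flip two fair coins num_trials times and return the observed
        fraction of trials with exactly one heads and one tails."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(num_trials):
            # Each coin independently shows 'H' or 'T' with probability 1/2.
            first = rng.choice("HT")
            second = rng.choice("HT")
            if first != second:  # exactly one heads and one tails
                hits += 1
        return hits / num_trials

    if __name__ == "__main__":
        # The printed estimate is typically close to 0.5, in line with
        # Solution 1 rather than the 1/3 of Solution 2.
        print(estimate_one_heads_one_tails())

An estimate near 0.5 suggests that the four outcomes counted in Solution 1 are equally likely, whereas the three outcomes counted in Solution 2 are not. This is exactly the kind of ambiguity that an intuitive counting definition cannot resolve and a mathematical theory of probability must.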