Introduction to Stochastic Calculus Math 545
Contents

Chapter 1. Introduction
  1. Motivations
  2. Outline For a Course

Chapter 2. Probabilistic Background
  1. Countable probability spaces
  2. Uncountable Probability Spaces
  3. Distributions and Convergence of Random Variables

Chapter 3. Wiener Process and Stochastic Processes
  1. An Illustrative Example: A Collection of Random Walks
  2. General Stochastic Processes
  3. Definition of Brownian motion (Wiener Process)
  4. Constructive Approach to Brownian motion
  5. Brownian motion has Rough Trajectories
  6. More Properties of Random Walks
  7. More Properties of General Stochastic Processes
  8. A glimpse of the connection with pdes

Chapter 4. Itô Integrals
  1. Properties of the Noise Suggested by Modeling
  2. Riemann-Stieltjes Integral
  3. A motivating example
  4. Itô integrals for a simple class of step functions
  5. Extension to the Closure of Elementary Processes
  6. Properties of Itô integrals
  7. A continuous in time version of the Itô integral
  8. An Extension of the Itô Integral
  9. Itô Processes

Chapter 5. Stochastic Calculus
  1. Itô's Formula for Brownian motion
  2. Quadratic Variation and Covariation
  3. Itô's Formula for an Itô Process
  4. Full Multidimensional Version of Itô's Formula
  5. Collection of the Formal Rules for Itô's Formula and Quadratic Variation

Chapter 6. Stochastic Differential Equations
  1. Definitions
  2. Examples of SDEs
  3. Existence and Uniqueness for SDEs
  4. Weak solutions to SDEs
  5. Markov property of Itô diffusions

Chapter 7. PDEs and SDEs: The Connection
  1. Infinitesimal generators
  2. Martingales associated with diffusion processes
  3. Connection with PDEs
  4. Time-homogeneous Diffusions
  5. Stochastic Characteristics
  6. A fundamental example: Brownian motion and the Heat Equation

Chapter 8. Martingales and Localization
  1. Martingales & Co.
  2. Optional stopping
  3. Localization
  4. Quadratic variation for martingales
  5. Lévy-Doob Characterization of Brownian Motion
  6. Random time changes
  7. Time Change for an sde
  8. Martingale representation theorem
  9. Martingale inequalities

Chapter 9. Girsanov's Theorem
  1. An illustrative example
  2. Tilted Brownian motion
  3. Shifting Wiener Measure: A Simple Cameron-Martin Theorem
  4. Girsanov's Theorem for sdes

Bibliography

Appendix A. Some Results from Analysis

CHAPTER 1

Introduction

1. Motivations

Evolutions in time with random influences/random dynamics. Let $N(t)$ be the "number of rabbits in some population" or "the price of a stock". Then one might want to make a model of the dynamics which includes "random influences". A (very) simple example is
$$\frac{dN(t)}{dt} = a(t)N(t), \quad \text{where} \quad a(t) = r(t) + \text{"noise"}. \tag{1.1}$$
Making sense of "noise" and learning how to make calculations with it is one of the principal objectives of this course. This will allow us to predict, in a probabilistic sense, the behavior of $N(t)$. Examples of situations like the one introduced above are ubiquitous in nature:

i) The gambler's ruin problem. We play the following game: we start with $3 in our pocket and we flip a coin. If the result is tails we lose one dollar, while if the result is heads we win one dollar. We stop when we have no money left to bargain with, or when we reach $9. We may ask: what is the probability that I end up broke?

ii) Population dynamics/Infectious diseases. As anticipated, (1.1) can be used to model the evolution of the number of rabbits in some population. Similar models are used to model the number of genetic mutations in an animal species. We may also think of $N(t)$ as the number of sick individuals in a population. Reasonable and widely applied models for the spread of infectious diseases are obtained by modifying (1.1) and observing its behavior.
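The ruin probability asked for in i) can be estimated by direct simulation before we have any theory. Below is a minimal Monte Carlo sketch (the function name and default parameters are my own, not from the notes); for a fair coin the exact answer is $1 - 3/9 = 2/3$, a fact we will be able to derive once we have optional stopping.

```python
import random

def ruin_probability(start=3, target=9, trials=100_000, seed=0):
    """Estimate P(go broke) in the fair gambler's ruin game:
    begin with `start` dollars, win or lose $1 per fair coin flip,
    and stop on reaching 0 (ruin) or `target`."""
    rng = random.Random(seed)
    broke = 0
    for _ in range(trials):
        wealth = start
        while 0 < wealth < target:
            wealth += 1 if rng.random() < 0.5 else -1
        broke += (wealth == 0)
    return broke / trials

# Exact answer for the fair game: 1 - start/target = 2/3 here.
print(ruin_probability())
```

The estimate converges to $2/3$ at the usual $1/\sqrt{\text{trials}}$ Monte Carlo rate.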
In all these cases, one may be interested in knowing whether it is likely for the disease/mutation to take over the population, or rather to go extinct.

iii) Stock prices. We may think of a set of $M$ risky investments (e.g. stocks), where the price $N_i(t)$, for $i \in \{1, \dots, M\}$, per unit at time $t$ evolves according to (1.1). In this case, one would like to optimize one's choice of stocks to maximize the total value $\sum_{i=1}^{M} \alpha_i N_i(T)$ at a later time $T$.

Connections with diffusion theory and PDEs. There exists a deep connection between noisy processes such as the ones introduced above and the deterministic theory of partial differential equations. This startling connection will be explored and expanded upon during the course, but we anticipate some examples below:

i) Dirichlet problem. Let $u(x,y)$ be the solution to the pde given below with the noted boundary conditions. Here $\Delta = \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2}$. The amazing fact is the following: if we start a Brownian motion diffusing from a point $(x_0, y_0)$ inside the domain, then the probability that it first hits the boundary in the darker region is given by $u(x_0, y_0)$.

ii) Black-Scholes equation. Suppose that at time $t = 0$ the person in iii) is offered the right (without obligation) to buy one unit of the risky asset at a specified price $S$ and at a specified future time $t = T$. Such a right is called a European call option. How much should the person be willing to pay for such an option? This question can be answered by solving the famous Black-Scholes equation, which gives the right value of the European option for any stock price $N(t)$.

[Figure: a planar domain on which $\frac{1}{2}\Delta u = 0$ in the interior, with $u = 1$ on the darker portion of the boundary and $u = 0$ on the rest.]

2. Outline For a Course

What follows is a rough outline of the class, giving a good indication of the topics to be covered, though there will be modifications.
i) Week 1: Motivation and Introduction to Stochastic Processes
   (a) Motivating Examples: Random Walks, Population Model with noise, Black-Scholes, Dirichlet problems.
   (b) Themes: Direct calculation with stochastic calculus, connections with pdes.
   (c) Introduction: Probability Spaces, Expectations, σ-algebras, Conditional expectations, Random walks and discrete time stochastic processes. Continuous time stochastic processes and characterization of the law of a process by its finite dimensional distributions (Kolmogorov Extension Theorem). Markov Processes and Martingales.
ii) Week 2-3: Brownian motion and its Properties
   (a) Definitions of Brownian motion (BM) as a continuous Gaussian process with independent increments. Chapman-Kolmogorov equation, forward and backward Kolmogorov equations for BM. Continuity of sample paths (Kolmogorov Continuity Theorem). More on BM as a Markov process and martingale.
   (b) First and second variation (a.k.a. variation and quadratic variation); application to BM.
iii) Week 4: Stochastic Integrals
   (a) The Riemann-Stieltjes integral. Why can't we use it?
   (b) Building the Itô and Stratonovich integrals. (Making sense of "$\int_0^t \sigma \, dB$".)
   (c) Standard properties of integrals hold: linearity, additivity.
   (d) Itô isometry: $\mathbb{E}\left[\left(\int f \, dB\right)^2\right] = \mathbb{E}\left[\int f^2 \, ds\right]$.
iv) Week 5: Itô's Formula and Applications
   (a) Change of variables.
   (b) Connections with pdes and the Backward Kolmogorov equation.
v) Week 6: Stochastic Differential Equations
   (a) What does it mean to solve an sde?
   (b) Existence of solutions (Picard iteration). Uniqueness of solutions.
vi) Week 7: Stopping Times
   (a) Definition. σ-algebra associated to a stopping time. Bounded stopping times. Doob's optional stopping theorem.
   (b) Dirichlet Problems and hitting probabilities.
   (c) Localization via stopping times.
vii) Week 8: Lévy-Doob Theorem and Girsanov's Theorem
   (a) How to tell when a continuous martingale is a Brownian motion.
   (b) Random time changes to turn a martingale into a Brownian motion.
   (c) Hermite polynomials and the exponential martingale.
   (d) Girsanov's Theorem, the Cameron-Martin formula, and changes of measure.
       (1) The simple example of shifted i.i.d. Gaussian random variables.
       (2) The idea of importance sampling and how to sample from tails.
       (3) The shift of a Brownian motion.
       (4) Changing the drift in a diffusion.
viii) Week 9: sdes with Jumps
ix) Week 10: Feller Theory of one-dimensional diffusions
   (a) Speed measures, natural scales, the classification of boundary points.

CHAPTER 2

Probabilistic Background

Example 0.1. We begin with the following motivating example. Consider a random sequence $\omega = \{\omega_i\}_{i=1}^{N}$ where
$$\omega_i = \begin{cases} +1 & \text{with probability } \tfrac{1}{2} \\ -1 & \text{with probability } \tfrac{1}{2} \end{cases}$$
independently of the other $\omega$'s. We will also write this as
$$\mathbb{P}\left[(\omega_1, \omega_2, \dots, \omega_N) = (s_1, s_2, \dots, s_N)\right] = \frac{1}{2^N}$$
for $s_i = \pm 1$. We can group the possible outcomes into subsets depending on the value of $\omega_1$:
$$\mathcal{F}_1 = \big\{\, \{\omega \in \Omega : \omega_1 = 1\},\ \{\omega \in \Omega : \omega_1 = -1\} \,\big\}. \tag{2.1}$$
These subsets contain the possible events that we may observe knowing the value of $\omega_1$. In other words, this separation represents the information we have on the process at that point.

Let $\Omega$ be the set of all such sequences of length $N$ (i.e. $\Omega = \{-1, 1\}^N$), and consider now the sequence of functions $\{X_n : \Omega \to \mathbb{Z}\}$ where
$$X_0(\omega) = 0, \qquad X_n(\omega) = \sum_{i=1}^{n} \omega_i \tag{2.2}$$
for $n \in \{1, \dots, N\}$. Each $X_n$ is an example of an integer-valued random variable. The set $\Omega$ is called the sample space (or outcome space) and the measure $\mathbb{P}$ on $\Omega$ is called the probability measure. We now recall some basic definitions from the theory of probability which will allow us to put this example on solid ground. Intuitively, each $\omega \in \Omega$ is a possible outcome of all of the randomness in our system.
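The random walk of Example 0.1 is easy to simulate: draw an outcome $\omega \in \Omega$ coordinate by coordinate and form the partial sums $X_n$. The sketch below (function name mine, not from the notes) does this and checks empirically that $\mathbb{E}[X_N] = 0$ and $\operatorname{Var}(X_N) = N$, as independence of the $\pm 1$ steps implies.

```python
import random

def sample_walk(N, rng):
    """Draw omega = (omega_1, ..., omega_N) with independent +/-1 steps,
    then return the partial sums X_0, X_1, ..., X_N."""
    omega = [1 if rng.random() < 0.5 else -1 for _ in range(N)]
    X = [0]
    for w in omega:
        X.append(X[-1] + w)
    return X

rng = random.Random(1)
N, trials = 100, 10_000
endpoints = [sample_walk(N, rng)[-1] for _ in range(trials)]

mean = sum(endpoints) / trials                 # should be near E[X_N] = 0
var = sum(x * x for x in endpoints) / trials   # should be near Var(X_N) = N = 100
print(mean, var)
```

Note that each call produces one point $\omega$ of the sample space $\Omega = \{-1,1\}^N$, so averaging over many calls approximates expectations under $\mathbb{P}$.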