Stochastic Calculus: An Introduction with Applications
Gregory F. Lawler
© 2014 Gregory F. Lawler. All rights reserved.

Contents

1 Martingales in discrete time
   1.1 Conditional expectation
   1.2 Martingales
   1.3 Optional sampling theorem
   1.4 Martingale convergence theorem
   1.5 Square integrable martingales
   1.6 Integrals with respect to random walk
   1.7 A maximal inequality
   1.8 Exercises

2 Brownian motion
   2.1 Limits of sums of independent variables
   2.2 Multivariate normal distribution
   2.3 Limits of random walks
   2.4 Brownian motion
   2.5 Construction of Brownian motion
   2.6 Understanding Brownian motion
      2.6.1 Brownian motion as a continuous martingale
      2.6.2 Brownian motion as a Markov process
      2.6.3 Brownian motion as a Gaussian process
      2.6.4 Brownian motion as a self-similar process
   2.7 Computations for Brownian motion
   2.8 Quadratic variation
   2.9 Multidimensional Brownian motion
   2.10 Heat equation and generator
      2.10.1 One dimension
      2.10.2 Expected value at a future time
   2.11 Exercises

3 Stochastic integration
   3.1 What is stochastic calculus?
   3.2 Stochastic integral
      3.2.1 Review of Riemann integration
      3.2.2 Integration of simple processes
      3.2.3 Integration of continuous processes
   3.3 Itô's formula
   3.4 More versions of Itô's formula
   3.5 Diffusions
   3.6 Covariation and the product rule
   3.7 Several Brownian motions
   3.8 Exercises

4 More stochastic calculus
   4.1 Martingales and local martingales
   4.2 An example: the Bessel process
   4.3 Feynman-Kac formula
   4.4 Binomial approximations
   4.5 Continuous martingales
   4.6 Exercises

5 Change of measure and Girsanov theorem
   5.1 Absolutely continuous measures
   5.2 Giving drift to a Brownian motion
   5.3 Girsanov theorem
   5.4 Black-Scholes formula
   5.5 Martingale approach to Black-Scholes equation
   5.6 Martingale approach to pricing
   5.7 Martingale representation theorem
   5.8 Exercises

6 Jump processes
   6.1 Lévy processes
   6.2 Poisson process
   6.3 Compound Poisson process
   6.4 Integration with respect to compound Poisson processes
   6.5 Change of measure
   6.6 Generalized Poisson processes I
   6.7 Generalized Poisson processes II
   6.8 The Lévy-Khinchin characterization
   6.9 Integration with respect to Lévy processes
   6.10 Symmetric stable process
   6.11 Exercises

7 Fractional Brownian motion
   7.1 Definition
   7.2 Stochastic integral representation
   7.3 Simulation

8 Harmonic functions
   8.1 Dirichlet problem
   8.2 h-processes
   8.3 Time changes
   8.4 Complex Brownian motion
   8.5 Exercises

Introductory comments

This is an introduction to stochastic calculus. I will assume that the reader has had a post-calculus course in probability or statistics. For much of these notes this is all that is needed, but to have a deep understanding of the subject, one needs to know measure theory and probability from that perspective. My goal is to include discussion for readers with that background as well. I also assume that the reader can write simple computer programs either using a language like C++ or by using software such as Matlab or Mathematica. More advanced mathematical comments that can be skipped by the reader will be indented with a different font.
    Comments here will assume that the reader knows the language of measure-theoretic probability theory.

We will discuss some of the applications to finance, but our main focus will be on the mathematics. Financial mathematics is a kind of applied mathematics, and I will start by making some comments about the use of mathematics in "the real world". The general paradigm is as follows.

• A mathematical model is made of some real world phenomenon. Usually this model requires simplification and does not precisely describe the real situation. One hopes that models are robust in the sense that if the model is not very far from reality then its predictions will also be close to accurate.

• The model consists of mathematical assumptions about the real world.

• Given these assumptions, one does mathematical analysis to see what they imply. The analysis can be of various types:

   – Rigorous derivations of consequences.
   – Derivations that are plausible but are not mathematically rigorous.
   – Approximations of the mathematical model which lead to tractable calculations.
   – Numerical calculations on a computer.
   – For models that include randomness, Monte Carlo simulations using a (pseudo) random number generator.

• If the mathematical analysis is successful, it will make predictions about the real world. These are then checked.

• If the predictions are bad, there are two possible reasons:

   – The mathematical analysis was faulty.
   – The model does not sufficiently reflect reality.

The user of mathematics does not always need to know the details of the mathematical analysis, but it is critical to understand the assumptions in the model. No matter how precise or sophisticated the analysis is, if the assumptions are bad, one cannot expect a good answer.

Chapter 1. Martingales in discrete time

A martingale is a mathematical model of a fair game. To understand the definition, we need to define conditional expectation. The concept of conditional expectation will permeate this book.

1.1 Conditional expectation

If X is a random variable, then its expectation, E[X], can be thought of as the best guess for X given no information about the result of the trial. A conditional expectation can be considered as the best guess given some but not total information.

Let X_1, X_2, ... be random variables which we think of as a time series with the data arriving one at a time. At time n we have viewed the values X_1, ..., X_n. If Y is another random variable, then E(Y | X_1, ..., X_n) is the best guess for Y given X_1, ..., X_n. We will assume that Y is an integrable random variable, which means E[|Y|] < ∞. To save some space we will write F_n for "the information contained in X_1, ..., X_n" and E[Y | F_n] for E[Y | X_1, ..., X_n]. We view F_0 as no information. The best guess should satisfy the following properties.

• If we have no information, then the best guess is the expected value. In other words, E[Y | F_0] = E[Y].

• The conditional expectation E[Y | F_n] should only use the information available at time n. In other words, it should be a function of X_1, ..., X_n,

\[ E[Y \mid F_n] = \phi(X_1, \dots, X_n). \]

We say that E[Y | F_n] is F_n-measurable; a brief example follows.
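As a simple illustration of these two properties (the coin-flip setup here is only an example chosen for concreteness, not one from the notes), let X_1, X_2 be independent fair coin flips taking the values ±1 with probability 1/2 each, and let Y = X_1 + X_2. Then

\[ E[Y \mid F_0] = E[Y] = 0, \qquad E[Y \mid F_1] = X_1 + E[X_2] = X_1, \]

so the best guess at time 1 is indeed a function of the observed data, namely φ(X_1) = X_1.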
The definitions above are certainly vague. We can use measure theory to be precise. We assume that the random variables Y, X_1, X_2, ... are defined on a probability space (Ω, F, P). Here F is a σ-algebra or σ-field of subsets of Ω, that is, a collection of subsets satisfying

• ∅ ∈ F;

• A ∈ F implies that Ω \ A ∈ F;

• A_1, A_2, ... ∈ F implies that ∪_{n=1}^∞ A_n ∈ F.

The information F_n is the smallest sub-σ-algebra G of F such that X_1, ..., X_n are G-measurable. The latter statement means that for all t ∈ R and j = 1, ..., n, the event {X_j ≤ t} is in F_n. The "no information" σ-algebra F_0 is the trivial σ-algebra containing only ∅ and Ω.

The definition of conditional expectation is a little tricky, so let us try to motivate it by considering an example from undergraduate probability courses. Suppose that (X, Y) have a joint density f(x, y), 0 < x, y < ∞, with marginal densities

\[ f(x) = \int_{-\infty}^{\infty} f(x,y)\, dy, \qquad g(y) = \int_{-\infty}^{\infty} f(x,y)\, dx. \]

The conditional density f(y | x) is defined by

\[ f(y \mid x) = \frac{f(x,y)}{f(x)}. \]

This is well defined provided that f(x) > 0, and if f(x) = 0, then x is an "impossible" value for X to take. We can write

\[ E[Y \mid X = x] = \int_{-\infty}^{\infty} y\, f(y \mid x)\, dy. \]

We can use this as the definition of conditional expectation in this case,

\[ E[Y \mid X] = \int_{-\infty}^{\infty} y\, f(y \mid X)\, dy = \frac{\int_{-\infty}^{\infty} y\, f(X,y)\, dy}{f(X)}. \]

Note that E[Y | X] is a random variable which is determined by the value of the random variable X. Since it is a random variable, we can take its expectation:

\[ E\bigl[E[Y \mid X]\bigr] = \int_{-\infty}^{\infty} E[Y \mid X = x]\, f(x)\, dx = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\, f(y \mid x)\, dy\, f(x)\, dx = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\, f(x,y)\, dy\, dx = E[Y]. \]

This calculation illustrates a basic property of conditional expectation. Suppose we are interested in the value of a random variable Y and we are going to be given data X_1, ..., X_n. Once we observe the data, we make our best prediction E[Y | X_1, ..., X_n]. If we average our best prediction given X_1, ..., X_n over all the possible values of X_1, ..., X_n, we get the best prediction of Y. In other words,

\[ E[Y] = E\bigl[E[Y \mid F_n]\bigr]. \]

More generally, suppose that A is an F_n-measurable event, that is to say, if we observe the data X_1, ..., X_n, then we know whether or not A has occurred.
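To see the averaging identity E[Y] = E[E[Y | X]] numerically, here is a minimal Monte Carlo sketch in C++. The model is an assumption made only for this illustration (it does not come from the notes): X is exponential with mean 1 and, given X = x, Y is exponential with mean x, so that E[Y | X] = X and both E[Y] and E[E[Y | X]] equal 1.

#include <iostream>
#include <random>

// Monte Carlo check of E[E[Y|X]] = E[Y] for an illustrative model:
// X ~ Exponential(mean 1) and, given X = x, Y ~ Exponential(mean x),
// so E[Y | X] = X and E[Y] = E[X] = 1.
int main() {
    std::mt19937 gen(12345);
    std::exponential_distribution<double> expMeanOne(1.0);   // rate 1, so mean 1
    const int n = 1000000;
    double sumY = 0.0, sumCondExp = 0.0;
    for (int i = 0; i < n; ++i) {
        double x = expMeanOne(gen);                           // sample X
        std::exponential_distribution<double> expMeanX(1.0 / x);  // rate 1/x, so mean x
        double y = expMeanX(gen);                             // sample Y given X = x
        sumY += y;                                            // estimates E[Y]
        sumCondExp += x;                                      // estimates E[E[Y|X]] = E[X]
    }
    std::cout << "E[Y]       approx " << sumY / n << "\n";
    std::cout << "E[E[Y|X]]  approx " << sumCondExp / n << "\n";  // both should be near 1
    return 0;
}

Both printed averages should be close to 1, which is exactly the averaging property computed above.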