Math 288 - Probability Theory and Stochastic Process

Taught by Horng-Tzer Yau
Notes by Dongryul Kim
Spring 2017

This course was taught by Horng-Tzer Yau. The lectures were given MWF 12-1 in Science Center 310. The textbook was Brownian Motion, Martingales, and Stochastic Calculus by Jean-François Le Gall. There were 11 undergraduates and 12 graduate students enrolled. There were 5 problem sets and the course assistant was Robert Martinez.

Last update: August 27, 2018

Contents

1 January 23, 2017
  1.1 Gaussians
  1.2 Gaussian vectors
2 January 25, 2017
  2.1 Gaussian process
  2.2 Conditional expectation for Gaussian vectors
3 January 27, 2017
  3.1 Gaussian white noise
  3.2 Pre-Brownian motion
4 January 30, 2017
  4.1 Kolmogorov's theorem
5 February 1, 2017
  5.1 Construction of a Brownian motion
  5.2 Wiener measure
6 February 3, 2017
  6.1 Blumenthal's zero-one law
  6.2 Strong Markov property
7 February 6, 2017
  7.1 Reflection principle
  7.2 Filtrations
8 February 8, 2017
  8.1 Stopping time of filtrations
  8.2 Martingales
9 February 10, 2017
  9.1 Discrete martingales - Doob's inequality
10 February 13, 2017
  10.1 More discrete martingales - upcrossing inequality
11 February 15, 2017
  11.1 Lévy's theorem
  11.2 Optional stopping for continuous martingales
12 February 17, 2017
13 February 22, 2017
  13.1 Submartingale decomposition
14 February 24, 2017
  14.1 Backward martingales
  14.2 Finite variance processes
15 February 27, 2017
  15.1 Local martingales
16 March 1, 2017
  16.1 Quadratic variation
17 March 3, 2017
  17.1 Consequences of the quadratic variation
18 March 6, 2017
  18.1 Bracket of local martingales
  18.2 Stochastic integration
19 March 8, 2017
  19.1 Elementary processes
20 March 20, 2017
  20.1 Review
21 March 22, 2017
  21.1 Stochastic integration with respect to a Brownian motion
  21.2 Stochastic integration with respect to local martingales
22 March 24, 2017
  22.1 Dominated convergence for semi-martingales
  22.2 Itô's formula
23 March 27, 2017
  23.1 Applications of Itô calculus
24 March 29, 2017
  24.1 Burkholder-Davis-Gundy inequality
25 March 31, 2017
  25.1 Representation of martingales
  25.2 Girsanov's theorem
26 April 3, 2017
  26.1 Cameron-Martin formula
27 April 5, 2017
  27.1 Application to the Dirichlet problem
  27.2 Stochastic differential equations
28 April 7, 2017
  28.1 Existence and uniqueness in the Lipschitz case
  28.2 Kolmogorov forward and backward equations
29 April 10, 2017
  29.1 Girsanov's formula as an SDE
  29.2 Ornstein-Uhlenbeck process
  29.3 Geometric Brownian motion
30 April 12, 2017
  30.1 Bessel process
31 April 14, 2017
  31.1 Maximum principle
  31.2 2-dimensional Brownian motion
32 April 17, 2017
  32.1 More on 2-dimensional Brownian motion
33 April 19, 2017
  33.1 Feynman-Kac formula
  33.2 Girsanov formula again
34 April 21, 2017
  34.1 Kipnis-Varadhan cutoff lemma
35 April 24, 2017
  35.1 Final review

1 January 23, 2017

The textbook is J. F. Le Gall, Brownian Motion, Martingales, and Stochastic Calculus.

A Brownian motion is a random function $X(t) : \mathbb{R}_{\geq 0} \to \mathbb{R}^n$. This is a model for the movement of one particle among many in "billiard" dynamics: for instance, many particles in a box that bump into each other. Robert Brown gave a description of this phenomenon in the 19th century. $X(t)$ is supposed to describe the trajectory of one particle, and it is random.

To simplify, we first model this by a random walk. Let
$$S_N = \sum_{i=1}^{N} X_i, \qquad X_i = \begin{cases} +1 & \text{with probability } 1/2, \\ -1 & \text{with probability } 1/2. \end{cases}$$
For non-integer times, interpolate linearly and let
$$S_N(t) = \sum_{i=1}^{\lfloor Nt \rfloor} X_i + (Nt - \lfloor Nt \rfloor) X_{\lfloor Nt \rfloor + 1}.$$
Then $\mathbb{E}(S_N^2) = N$.
As we let $N \to \infty$, we have three properties:

- $\displaystyle \lim_{N\to\infty} \mathbb{E}\left(\frac{S_N(t)}{\sqrt{N}}\right)^2 = t$,
- $\displaystyle \lim_{N\to\infty} \frac{\mathbb{E}(S_N(t) S_N(s))}{N} = t \wedge s = \min(t, s)$,
- $S_N(t)/\sqrt{N} \to N(0, t)$ in distribution.

This is the model we want.

1.1 Gaussians

Definition 1.1. $X$ is a standard Gaussian random variable if
$$P(X \in A) = \frac{1}{\sqrt{2\pi}} \int_A e^{-x^2/2} \, dx$$
for every Borel set $A \subseteq \mathbb{R}$.

For a complex variable $z \in \mathbb{C}$,
$$\mathbb{E}(e^{zX}) = \frac{1}{\sqrt{2\pi}} \int e^{zx} e^{-x^2/2} \, dx = e^{z^2/2}.$$
This is the moment generating function. When $z = i\xi$, we have
$$\mathbb{E} e^{i\xi X} = e^{-\xi^2/2} = \sum_{k=0}^{\infty} \frac{(-\xi^2/2)^k}{k!} = 1 + i\xi \, \mathbb{E}X + \frac{(i\xi)^2}{2} \mathbb{E}X^2 + \cdots.$$
From this we can read off the moments $\mathbb{E}X^n$ of the Gaussian:
$$\mathbb{E}X^{2n} = \frac{(2n)!}{2^n n!}, \qquad \mathbb{E}X^{2n+1} = 0.$$

Given $X \sim N(0, 1)$, we have $Y = \sigma X + m \sim N(m, \sigma^2)$. Its characteristic function is
$$\mathbb{E} e^{i\xi Y} = e^{im\xi - \sigma^2 \xi^2/2}.$$
Also, if $Y_1 \sim N(m_1, \sigma_1^2)$ and $Y_2 \sim N(m_2, \sigma_2^2)$ are independent, then
$$\mathbb{E} e^{i\xi(Y_1 + Y_2)} = \mathbb{E} e^{i\xi Y_1} \, \mathbb{E} e^{i\xi Y_2} = e^{i(m_1 + m_2)\xi - (\sigma_1^2 + \sigma_2^2)\xi^2/2}.$$
So $Y_1 + Y_2 \sim N(m_1 + m_2, \sigma_1^2 + \sigma_2^2)$.

Proposition 1.2. If $X_n \sim N(m_n, \sigma_n^2)$ and $X_n \to X$ in $L^2$, then

(i) $X$ is Gaussian with mean $m = \lim_{n\to\infty} m_n$ and variance $\sigma^2 = \lim_{n\to\infty} \sigma_n^2$;

(ii) $X_n \to X$ in probability and $X_n \to X$ in $L^p$ for $p \geq 1$.

Proof. (i) By definition $\mathbb{E}(X_n - X)^2 \to 0$, and so $|\mathbb{E}X_n - \mathbb{E}X| \leq (\mathbb{E}|X_n - X|^2)^{1/2} \to 0$. So $\mathbb{E}X = m$. Similarly, because $L^2$ is complete, $\operatorname{Var} X = \sigma^2$. Now the characteristic function of $X$ is
$$\mathbb{E} e^{i\xi X} = \lim_{n\to\infty} \mathbb{E} e^{i\xi X_n} = \lim_{n\to\infty} e^{i m_n \xi - \sigma_n^2 \xi^2/2} = e^{im\xi - \sigma^2 \xi^2/2}.$$
So $X \sim N(m, \sigma^2)$.

(ii) Because $\mathbb{E} X_n^{2p}$ can be expressed in terms of the mean and variance, we have $\sup_n \mathbb{E}|X_n|^{2p} < \infty$ and so $\sup_n \mathbb{E}|X_n - X|^{2p} < \infty$. Define $Y_n = |X_n - X|^p$. Then $Y_n$ is bounded in $L^2$ and hence uniformly integrable. Also it converges to $0$ in probability. It follows that $Y_n \to 0$ in $L^1$, which means that $X_n \to X$ in $L^p$. It then follows that $X_n \to X$ in probability.

Definition 1.3. We say that $X_n$ is uniformly integrable if for every $\epsilon > 0$ there exists a $K$ such that $\sup_n \mathbb{E}[|X_n| \cdot \mathbf{1}_{|X_n| > K}] < \epsilon$.

Proposition 1.4.
A sequence of random variables $X_n$ converges to $X$ in $L^1$ if and only if the $X_n$ are uniformly integrable and $X_n \to X$ in probability.

1.2 Gaussian vectors

Definition 1.5. A vector $X \in \mathbb{R}^n$ is a Gaussian vector if for every $u \in \mathbb{R}^n$, $\langle u, X \rangle$ is a Gaussian random variable.

Example 1.6. Define $X_1 \sim N(0, 1)$ and, independently, $\epsilon = \pm 1$ with probability $1/2$ each. Let $X_2 = \epsilon X_1$. Then $X = (X_1, X_2)$ is not a Gaussian vector: $X_1 + X_2 = (1 + \epsilon) X_1$ vanishes with probability $1/2$, so it is not Gaussian.

Note that $u \mapsto \mathbb{E}[u \cdot X]$ is a linear map, and so $\mathbb{E}[u \cdot X] = u \cdot m_X$ for some $m_X \in \mathbb{R}^n$. We call $m_X$ the mean of $X$. If $X = \sum_{i=1}^{n} X_i e_i$ for an orthonormal basis $e_i$ of $\mathbb{R}^n$, then the mean of $X$ is
$$m_X = \sum_{i=1}^{n} (\mathbb{E}X_i) e_i.$$
Likewise, the map $u \mapsto \operatorname{Var}(u \cdot X)$ is a quadratic form, and so $\operatorname{Var}(u \cdot X) = \langle u, q_X u \rangle$ for some matrix $q_X$. We call $q_X$ the covariance matrix of $X$, and we write $q_X(u) = \langle u, q_X u \rangle$. Then the characteristic function of $X$ is
$$\mathbb{E} e^{iz(u \cdot X)} = \exp(iz \, u \cdot m_X - z^2 q_X(u)/2).$$

Proposition 1.7. If $(\operatorname{Cov}(X_j, X_k))_{1 \leq j, k \leq n}$ is diagonal and $X = (X_1, \ldots, X_n)$ is a Gaussian vector, then the $X_i$ are independent.

Proof. First check that
$$q_X(u) = \sum_{i,j=1}^{n} u_i u_j \operatorname{Cov}(X_i, X_j).$$
Then, for centered $X$,
$$\mathbb{E} \exp(iu \cdot X) = \exp\Bigl(-\frac{1}{2} q_X(u)\Bigr) = \exp\Bigl(-\frac{1}{2} \sum_{i=1}^{n} u_i^2 \operatorname{Var}(X_i)\Bigr) = \prod_{i=1}^{n} \exp\Bigl(-\frac{1}{2} u_i^2 \operatorname{Var}(X_i)\Bigr).$$
This gives a full description of $X$ as a product of independent Gaussians.

Theorem 1.8. Let $\gamma$ be a positive semi-definite symmetric $n \times n$ matrix. Then there exists a Gaussian vector $X$ with $\operatorname{Cov} X = \gamma$.

Proof. Diagonalize $\gamma$ and let $\lambda_j \geq 0$ be its eigenvalues. Choose an orthonormal basis $v_1, \ldots, v_n$ of $\mathbb{R}^n$ consisting of eigenvectors, so that $\gamma v_j = \lambda_j v_j$. Choose $Y_j$ to be independent $N(0, 1)$ Gaussians. Then the vector
$$X = \sum_{j=1}^{n} \sqrt{\lambda_j} \, v_j Y_j$$
is a Gaussian vector with the right covariance.

2 January 25, 2017

We start with a probability space $(\Omega, \mathcal{F}, P)$ and joint Gaussians (i.e., a Gaussian vector) $X_1, \ldots, X_d$.
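The construction in the proof of Theorem 1.8 translates directly into a sampling procedure: diagonalize $\gamma$, draw i.i.d. standard Gaussians $Y_j$, and recombine along the eigenvectors with weights $\sqrt{\lambda_j}$. A minimal NumPy sketch (the matrix `gamma` below is an arbitrary illustrative choice, not from the notes):

```python
import numpy as np

# An arbitrary positive semi-definite symmetric matrix gamma (illustrative choice).
gamma = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

# Diagonalize: gamma = V diag(lam) V^T with orthonormal eigenvector columns in V.
lam, V = np.linalg.eigh(gamma)

rng = np.random.default_rng(0)
n_samples = 200_000
Y = rng.standard_normal((n_samples, 2))  # i.i.d. N(0, 1) coordinates Y_j

# X = sum_j sqrt(lambda_j) v_j Y_j, applied to all samples at once:
# scale each coordinate by sqrt(lambda_j), then rotate back by V.
X = Y * np.sqrt(lam) @ V.T

emp_cov = np.cov(X, rowvar=False)
print(emp_cov)  # close to gamma, up to sampling error
```

By construction the exact covariance of $X$ is $V \operatorname{diag}(\lambda) V^{\top} = \gamma$, so the empirical covariance should match $\gamma$ up to Monte Carlo error of order $1/\sqrt{n_{\text{samples}}}$.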
We have
$$\operatorname{Cov}(X_j, X_k) = \mathbb{E}[X_j X_k] - \mathbb{E}[X_j] \mathbb{E}[X_k], \qquad \gamma_X = (\operatorname{Cov}(X_j, X_k))_{jk} = C.$$
When $C > 0$, the probability density (for centered $X$) is
$$p_X(x) = \frac{1}{(2\pi)^{d/2} \sqrt{\det C}} \, e^{-\langle x, C^{-1} x \rangle / 2},$$
and the characteristic function is
$$\mathbb{E}[e^{it \cdot X}] = e^{-\langle t, Ct \rangle / 2}.$$

2.1 Gaussian process

A (centered) Gaussian space is a closed subspace of $L^2(\Omega, \mathcal{F}, P)$ containing only Gaussian random variables.
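The density formula is easy to sanity-check numerically: for a $2 \times 2$ covariance $C > 0$, summing $p_X$ over a fine grid should give total mass $\approx 1$. A minimal sketch, with an arbitrarily chosen illustrative $C$:

```python
import numpy as np

# Illustrative 2x2 positive definite covariance matrix C (arbitrary choice).
C = np.array([[1.0, 0.5],
              [0.5, 1.5]])
C_inv = np.linalg.inv(C)
d = 2
norm_const = (2 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(C))

# Evaluate p_X(x) = exp(-<x, C^{-1} x>/2) / ((2 pi)^{d/2} sqrt(det C))
# on a grid wide enough to capture essentially all of the mass.
ts = np.linspace(-8.0, 8.0, 401)
h = ts[1] - ts[0]
U, V = np.meshgrid(ts, ts)
pts = np.stack([U.ravel(), V.ravel()], axis=1)          # grid points in R^2
quad = np.einsum("ni,ij,nj->n", pts, C_inv, pts)        # <x, C^{-1} x> per point
total = np.exp(-0.5 * quad).sum() / norm_const * h * h  # Riemann sum of p_X

print(total)  # should be very close to 1
```

Since the density is smooth and decays rapidly, the Riemann sum over $[-8, 8]^2$ captures the integral to high accuracy.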
