Martingale Problems and Stochastic Equations for Markov Processes
1. Basics of stochastic processes
2. Markov processes and generators
3. Martingale problems
4. Existence of solutions and forward equations
5. Stochastic integrals for Poisson random measures
6. Weak and strong solutions of stochastic equations
7. Stochastic equations for Markov processes in $\mathbb{R}^d$
8. Convergence for Markov processes characterized by martingale problems
9. Convergence for Markov processes characterized by stochastic differential equations
10. Martingale problems for conditional distributions
11. Equivalence of stochastic equations and martingale problems
12. Genealogies and ordered representations of measure-valued processes
13. Poisson representations
14. Stochastic partial differential equations
15. Information and conditional expectation
16. Technical lemmas
17. Exercises
18. Stochastic analysis exercises
19. References

http://www.math.wisc.edu/~kurtz/FrankLect.htm

1. Basics of stochastic processes

• Filtrations
• Stopping times
• Martingales
• Optional sampling theorem
• Doob's inequalities
• Stochastic integrals
• Local martingales
• Semimartingales
• Computing quadratic variations
• Covariation
• Itô's formula

Conventions and caveats

State spaces are always complete, separable metric spaces (sometimes called Polish spaces), usually denoted $(E, r)$.

All probability spaces are complete.

All identities involving conditional expectations (or conditional probabilities) only hold almost surely (even when I don't say so).

If the filtration $\{\mathcal{F}_t\}$ involved is obvious, I will say adapted, rather than $\{\mathcal{F}_t\}$-adapted, stopping time, rather than $\{\mathcal{F}_t\}$-stopping time, etc.

All processes are cadlag (right continuous with left limits at each $t > 0$), unless otherwise noted.

A process is real-valued if that is the only way the formula makes sense.

References

Kurtz, Lecture Notes for Math 735, http://www.math.wisc.edu/~kurtz/m735.htm

Ethier and Kurtz, Markov Processes: Characterization and Convergence

Protter, Stochastic Integration and Differential Equations, Second Edition

Filtrations

$(\Omega, \mathcal{F}, P)$ a probability space.

Available information is modeled by a sub-$\sigma$-algebra of $\mathcal{F}$.

$\mathcal{F}_t$: information available at time $t$.

$\{\mathcal{F}_t\}$ is a filtration: $t < s$ implies $\mathcal{F}_t \subset \mathcal{F}_s$.

$\{\mathcal{F}_t\}$ is complete if $\mathcal{F}_0$ contains all subsets of sets of probability zero.

A stochastic process $X$ is adapted to $\{\mathcal{F}_t\}$ if $X(t)$ is $\mathcal{F}_t$-measurable for each $t \geq 0$.

An $E$-valued stochastic process $X$ adapted to $\{\mathcal{F}_t\}$ is $\{\mathcal{F}_t\}$-Markov if
$$E[f(X(t+r)) \mid \mathcal{F}_t] = E[f(X(t+r)) \mid X(t)], \qquad t, r \geq 0,\ f \in B(E).$$
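The defining identity above can be checked numerically in simple cases. The following is a minimal simulation sketch, not part of the original notes, for a simple symmetric random walk; the walk, the function $f$, and all names are illustrative. Conditioning on additional past information beyond the current value $X(t)$ should not change the conditional expectation of $f(X(t+r))$.

```python
import numpy as np

# Illustrative check of the Markov property for a simple symmetric random walk:
# the conditional expectation of f(X(t+r)) given the path up to time t should
# depend on the path only through the current value X(t).
rng = np.random.default_rng(0)
n_paths, t, r = 200_000, 3, 2
steps = rng.choice([-1, 1], size=(n_paths, t + r))
X = np.cumsum(steps, axis=1)            # columns are X(1), ..., X(t+r); X(0) = 0


def f(x):                               # any bounded f would do; x^2 is convenient here
    return x ** 2


present = X[:, t - 1]                   # X(t)
past = X[:, t - 2]                      # X(t-1), extra information contained in F_t
future = f(X[:, t + r - 1])             # f(X(t+r))

for x_past in (0, 2):                   # the two pasts compatible with X(t) = 1
    sel = (present == 1) & (past == x_past)
    print(f"X(t)=1, X(t-1)={x_past}:  E[f(X(t+r))] ~ {future[sel].mean():.3f}")
# Both estimates should be close to 3.0 (up to Monte Carlo error), reflecting
# E[f(X(t+r)) | F_t] = E[f(X(t+r)) | X(t)].
```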
Measurability for stochastic processes

A stochastic process is an indexed family of random variables, but if the index set is $[0, \infty)$, then we may want to know more about $X(t, \omega)$ than that it is a measurable function of $\omega$ for each $t$. For example, for an $\mathbb{R}$-valued process $X$, when are
$$\int_a^b X(s, \omega)\, ds \quad \text{and} \quad X(\tau(\omega), \omega)$$
random variables?

$X$ is measurable if $(t, \omega) \in [0, \infty) \times \Omega \to X(t, \omega) \in E$ is $\mathcal{B}([0, \infty)) \times \mathcal{F}$-measurable.

Lemma 1.1 If $X$ is measurable and $\int_a^b |X(s, \omega)|\, ds < \infty$, then $\int_a^b X(s, \omega)\, ds$ is a random variable. If, in addition, $\tau$ is a nonnegative random variable, then $X(\tau(\omega), \omega)$ is a random variable.

Proof. The first part is a standard result for measurable functions on a product space. Verify the result for $X(s, \omega) = 1_A(s) 1_B(\omega)$, $A \in \mathcal{B}[0, \infty)$, $B \in \mathcal{F}$, and apply the Dynkin class theorem to extend the result to $1_C$, $C \in \mathcal{B}[0, \infty) \times \mathcal{F}$.

If $\tau$ is a nonnegative random variable, then $\omega \in \Omega \to (\tau(\omega), \omega) \in [0, \infty) \times \Omega$ is measurable. Consequently, $X(\tau(\omega), \omega)$ is the composition of two measurable functions.

Measurability continued

A stochastic process $X$ is $\{\mathcal{F}_t\}$-adapted if for all $t \geq 0$, $X(t)$ is $\mathcal{F}_t$-measurable.

If $X$ is measurable and adapted, the restriction of $X$ to $[0, t] \times \Omega$ is $\mathcal{B}[0, t] \times \mathcal{F}$-measurable, but it may not be $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable.

$X$ is progressive if for each $t \geq 0$, $(s, \omega) \in [0, t] \times \Omega \to X(s, \omega) \in E$ is $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable.

Let
$$\mathcal{W} = \{A \in \mathcal{B}[0, \infty) \times \mathcal{F} : A \cap ([0, t] \times \Omega) \in \mathcal{B}[0, t] \times \mathcal{F}_t,\ t \geq 0\}.$$
Then $\mathcal{W}$ is a $\sigma$-algebra and $X$ is progressive if and only if $(s, \omega) \to X(s, \omega)$ is $\mathcal{W}$-measurable.

Since pointwise limits of measurable functions are measurable, pointwise limits of progressive processes are progressive.

Stopping times

Let $\{\mathcal{F}_t\}$ be a filtration. $\tau$ is an $\{\mathcal{F}_t\}$-stopping time if and only if $\{\tau \leq t\} \in \mathcal{F}_t$ for each $t \geq 0$.

If $\tau$ is a stopping time, $\mathcal{F}_\tau \equiv \{A \in \mathcal{F} : A \cap \{\tau \leq t\} \in \mathcal{F}_t,\ t \geq 0\}$.

If $\tau_1$ and $\tau_2$ are stopping times with $\tau_1 \leq \tau_2$, then $\mathcal{F}_{\tau_1} \subset \mathcal{F}_{\tau_2}$.

If $\tau_1$ and $\tau_2$ are stopping times, then $\tau_1$ and $\tau_1 \wedge \tau_2$ are $\mathcal{F}_{\tau_1}$-measurable.

A process observed at a stopping time

If $X$ is measurable and $\tau$ is a stopping time, then $X(\tau(\omega), \omega)$ is a random variable.

Lemma 1.2 If $\tau$ is a stopping time and $X$ is progressive, then $X(\tau)$ is $\mathcal{F}_\tau$-measurable.

Proof. $\omega \in \Omega \to (\tau(\omega) \wedge t, \omega) \in [0, t] \times \Omega$ is measurable as a mapping from $(\Omega, \mathcal{F}_t)$ to $([0, t] \times \Omega, \mathcal{B}[0, t] \times \mathcal{F}_t)$. Consequently, $\omega \to X(\tau(\omega) \wedge t, \omega)$ is $\mathcal{F}_t$-measurable, and
$$\{X(\tau) \in A\} \cap \{\tau \leq t\} = \{X(\tau \wedge t) \in A\} \cap \{\tau \leq t\} \in \mathcal{F}_t.$$

Right continuous processes

Most of the processes you know are either continuous (e.g., Brownian motion) or right continuous (e.g., Poisson process).

Lemma 1.3 If $X$ is right continuous and adapted, then $X$ is progressive.

Proof. If $X$ is adapted, then
$$(s, \omega) \in [0, t] \times \Omega \to Y_n(s, \omega) \equiv X\Big(\frac{[ns] + 1}{n} \wedge t, \omega\Big) = \sum_k X\Big(\frac{k + 1}{n} \wedge t, \omega\Big) 1_{[\frac{k}{n}, \frac{k+1}{n})}(s)$$
is $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable. By the right continuity of $X$, $Y_n(s, \omega) \to X(s, \omega)$ for each $(s, \omega) \in [0, t] \times \Omega$, so $(s, \omega) \in [0, t] \times \Omega \to X(s, \omega)$ is $\mathcal{B}[0, t] \times \mathcal{F}_t$-measurable and $X$ is progressive.

Examples and properties

Define $\mathcal{F}_{t+} \equiv \bigcap_{s > t} \mathcal{F}_s$. $\{\mathcal{F}_t\}$ is right continuous if $\mathcal{F}_t = \mathcal{F}_{t+}$ for all $t \geq 0$.

If $\{\mathcal{F}_t\}$ is right continuous, then $\tau$ is a stopping time if and only if $\{\tau < t\} \in \mathcal{F}_t$ for all $t > 0$.

Let $X$ be cadlag and adapted. If $K \subset E$ is closed, $\tau_K = \inf\{t : X(t) \text{ or } X(t-) \in K\}$ is a stopping time, but $\inf\{t : X(t) \in K\}$ may not be; however, if $\{\mathcal{F}_t\}$ is right continuous and complete, then for any $B \in \mathcal{B}(E)$, $\tau_B = \inf\{t : X(t) \in B\}$ is an $\{\mathcal{F}_t\}$-stopping time. This result is a special case of the debut theorem, a very technical result from set theory. Note that
$$\{\omega : \tau_B(\omega) < t\} = \{\omega : \exists\, s < t \text{ such that } X(s, \omega) \in B\} = \mathrm{proj}_\Omega \{(s, \omega) : X(s, \omega) \in B,\ s < t\}.$$

Piecewise constant approximations

Let $\epsilon > 0$, $\tau_0^\epsilon = 0$, and
$$\tau_{i+1}^\epsilon = \inf\{t > \tau_i^\epsilon : r(X(t), X(\tau_i^\epsilon)) \vee r(X(t-), X(\tau_i^\epsilon)) \geq \epsilon\}.$$
Define $X^\epsilon(t) = X(\tau_i^\epsilon)$, $\tau_i^\epsilon \leq t < \tau_{i+1}^\epsilon$. Then $r(X(t), X^\epsilon(t)) \leq \epsilon$.

If $X$ is adapted to $\{\mathcal{F}_t\}$, then the $\{\tau_i^\epsilon\}$ are $\{\mathcal{F}_t\}$-stopping times and $X^\epsilon$ is $\{\mathcal{F}_t\}$-adapted. See Exercise 4.
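A minimal sketch, not from the notes, of the construction above for a discretely sampled real-valued path with $r(x, y) = |x - y|$. On a discrete time grid the left limit $X(t-)$ plays no role, and the grid, the sample path, and the names (`piecewise_constant_approx`, `eps`) are illustrative choices.

```python
import numpy as np

def piecewise_constant_approx(times, x, eps):
    """Epsilon approximation of a sampled path: tau_0 = 0, and tau_{i+1} is the
    first sample time at which the path is at least eps away from x(tau_i);
    the approximation is frozen at x(tau_i) on [tau_i, tau_{i+1})."""
    jump_indices = [0]
    x_eps = np.empty_like(x)
    anchor = x[0]
    for k in range(len(x)):
        if abs(x[k] - anchor) >= eps:   # a new stopping time tau_{i+1}
            jump_indices.append(k)
            anchor = x[k]
        x_eps[k] = anchor
    return np.asarray(times)[jump_indices], x_eps

# Illustration (arbitrary parameters) on a discretized Brownian-like path on [0, 1].
rng = np.random.default_rng(1)
times = np.linspace(0.0, 1.0, 1001)
path = np.concatenate([[0.0], np.cumsum(rng.normal(scale=np.sqrt(1 / 1000), size=1000))])
taus, approx = piecewise_constant_approx(times, path, eps=0.1)
print("number of jumps of the approximation:", len(taus) - 1)
print("max |X(t) - X_eps(t)| on the grid:", np.max(np.abs(path - approx)))  # < eps
```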
Martingales

An $\mathbb{R}$-valued stochastic process $M$ adapted to $\{\mathcal{F}_t\}$ is an $\{\mathcal{F}_t\}$-martingale if
$$E[M(t + r) \mid \mathcal{F}_t] = M(t), \qquad t, r \geq 0.$$

Every martingale has finite quadratic variation:
$$[M]_t = \lim \sum_i (M(t \wedge t_{i+1}) - M(t \wedge t_i))^2,$$
where $0 = t_0 < t_1 < \cdots$, $t_i \to \infty$, and the limit is in probability as $\max_i (t_{i+1} - t_i) \to 0$. More precisely, for $\epsilon > 0$ and $T > 0$,
$$\lim P\Big\{\sup_{t \leq T} \Big| [M]_t - \sum_i (M(t \wedge t_{i+1}) - M(t \wedge t_i))^2 \Big| > \epsilon\Big\} = 0.$$

For standard Brownian motion $W$, $[W]_t = t$ (see the simulation sketch at the end of this section).

Optional sampling theorem

A real-valued process $X$ is a submartingale if $E[|X(t)|] < \infty$, $t \geq 0$, and
$$E[X(t + s) \mid \mathcal{F}_t] \geq X(t), \qquad t, s \geq 0.$$

If $\tau_1$ and $\tau_2$ are stopping times, then
$$E[X(t \wedge \tau_2) \mid \mathcal{F}_{\tau_1}] \geq X(t \wedge \tau_1 \wedge \tau_2).$$

If $\tau_2$ is finite a.s., $E[|X(\tau_2)|] < \infty$, and $\lim_{t \to \infty} E[|X(t)| 1_{\{\tau_2 > t\}}] = 0$, then
$$E[X(\tau_2) \mid \mathcal{F}_{\tau_1}] \geq X(\tau_1 \wedge \tau_2).$$

Of course, if $X$ is a martingale,
$$E[X(t \wedge \tau_2) \mid \mathcal{F}_{\tau_1}] = X(t \wedge \tau_1 \wedge \tau_2).$$

Square integrable martingales

Let $M$ be a martingale satisfying $E[M(t)^2] < \infty$. Then
$$M(t)^2 - [M]_t$$
is a martingale. In particular, for $t > s$,
$$E[(M(t) - M(s))^2] = E[[M]_t - [M]_s].$$

Doob's inequalities

Let $X$ be a submartingale.
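The simulation sketch referenced above, illustrating $[W]_t = t$: it is not part of the original notes, and the partition sizes and sample counts below are arbitrary choices. Sums of squared Brownian increments over partitions of $[0, 1]$ concentrate around $1$ as the mesh shrinks, consistent with convergence in probability.

```python
import numpy as np

# Sums of squared increments of standard Brownian motion over partitions of
# [0, 1]: as the mesh shrinks they concentrate around [W]_1 = 1.
# Illustrative parameters only.
rng = np.random.default_rng(2)
n_paths = 1000

for n in (10, 100, 1000, 10000):        # uniform partition with mesh 1 / n
    dW = rng.normal(scale=np.sqrt(1.0 / n), size=(n_paths, n))   # W(t_{i+1}) - W(t_i)
    qv = np.sum(dW ** 2, axis=1)        # sum_i (W(t_{i+1}) - W(t_i))^2
    print(f"mesh {1.0 / n:.0e}:  mean {qv.mean():.4f}   std {qv.std():.4f}")
# The mean stays near 1 while the spread shrinks like sqrt(2 / n), consistent
# with convergence in probability of the sums to [W]_1 = 1.
```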