Martingale Problems and Stochastic Equations for Markov Processes
• Review of basic material on stochastic processes
• Characterization of stochastic processes by their martingale properties
• Weak convergence of stochastic processes
• Stochastic equations for general Markov processes in R^d
• Martingale problems for Markov processes
• Forward equations and operator semigroups
• Equivalence of martingale problems and stochastic differential equations
• Change of measure
• Filtering
• Averaging
• Control
• Exercises
• Glossary
• References

1. Review of basic material on stochastic processes

• Filtrations
• Stopping times
• Martingales
• Optional sampling theorem
• Doob's inequalities
• Stochastic integrals
• Local martingales
• Semimartingales
• Computing quadratic variations
• Covariation
• Itô's formula

Conventions and caveats

State spaces are always complete, separable metric spaces (sometimes called Polish spaces), usually denoted (E, r). All probability spaces are complete.

All identities involving conditional expectations (or conditional probabilities) hold only almost surely (even when I don't say so).

If the filtration {F_t} involved is obvious, I will say adapted rather than {F_t}-adapted, stopping time rather than {F_t}-stopping time, etc.

All processes are cadlag (right continuous with left limits at each t > 0) unless otherwise noted. A process is real-valued if that is the only way the formula makes sense.
References

Kurtz, Lecture Notes for Math 735, http://www.math.wisc.edu/~kurtz/m735.htm
Seppalainen, Basics of Stochastic Analysis, http://www.math.wisc.edu/~seppalai/sa-book/etusivu.html
Ethier and Kurtz, Markov Processes: Characterization and Convergence
Protter, Stochastic Integration and Differential Equations, Second Edition

Filtrations

(Ω, F, P) a probability space. Available information is modeled by a sub-σ-algebra of F; F_t is the information available at time t. {F_t} is a filtration: t < s implies F_t ⊂ F_s.

A stochastic process X is adapted to {F_t} if X(t) is F_t-measurable for each t ≥ 0.

An E-valued stochastic process X adapted to {F_t} is {F_t}-Markov if

	E[f(X(t + r)) | F_t] = E[f(X(t + r)) | X(t)],   t, r ≥ 0, f ∈ B(E).

An R-valued stochastic process M adapted to {F_t} is an {F_t}-martingale if

	E[M(t + r) | F_t] = M(t),   t, r ≥ 0.

Stopping times

τ is an {F_t}-stopping time if for each t ≥ 0, {τ ≤ t} ∈ F_t. For a stopping time τ,

	F_τ = {A ∈ F : {τ ≤ t} ∩ A ∈ F_t, t ≥ 0}.

Exercise 1.1
1. Show that F_τ is a σ-algebra.
2. Show that for {F_t}-stopping times σ, τ, σ ≤ τ implies F_σ ⊂ F_τ. In particular, F_{τ∧t} ⊂ F_t.
3. Let τ be a discrete {F_t}-stopping time satisfying {τ < ∞} = ∪_{k=1}^∞ {τ = t_k} = Ω. Show that F_τ = σ{A ∩ {τ = t_k} : A ∈ F_{t_k}, k = 1, 2, ...}.
4. Show that the minimum of two stopping times is a stopping time, and that the maximum of two stopping times is a stopping time.

Examples and properties

Define F_{t+} ≡ ∩_{s>t} F_s. {F_t} is right continuous if F_t = F_{t+} for all t ≥ 0. If {F_t} is right continuous, then τ is a stopping time if and only if {τ < t} ∈ F_t for all t > 0.
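As a toy illustration (my own sketch, not part of the notes): in discrete time, a hitting time of a level is a stopping time precisely because whether {τ ≤ n} has occurred is decided by the path up to time n alone. The helper name below is hypothetical.

```python
import random

def hitting_time(path, level):
    """First index n with path[n] >= level; None if the level is never hit.
    The event {tau <= n} is decided by path[:n+1] alone, which is the
    defining property of a stopping time."""
    for n, x in enumerate(path):
        if x >= level:
            return n
    return None

random.seed(0)
# symmetric simple random walk: a discrete-time martingale
walk = [0]
for _ in range(200):
    walk.append(walk[-1] + random.choice([-1, 1]))

tau = hitting_time(walk, 5)
if tau is not None:
    assert walk[tau] >= 5                     # at or above the level at time tau
    assert all(x < 5 for x in walk[:tau])     # and strictly below it before tau
```

Note that min(τ_1, τ_2) and max(τ_1, τ_2) of two such hitting times are again computable from the same path prefixes, matching part 4 of Exercise 1.1.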
If K ⊂ E is closed, τ_K = inf{t : X(t) or X(t−) ∈ K} is a stopping time, but inf{t : X(t) ∈ K} may not be; however, if {F_t} is right continuous and complete, then for any B ∈ B(E), τ_B = inf{t : X(t) ∈ B} is an {F_t}-stopping time. This result is a special case of the debut theorem, a very technical result from set theory. Note that

	{ω : τ_B(ω) < t} = {ω : ∃ s < t such that X(s, ω) ∈ B} = proj_Ω {(s, ω) : X(s, ω) ∈ B, s < t}.

Piecewise constant approximations

For ε > 0, let τ_0^ε = 0 and

	τ_{i+1}^ε = inf{t > τ_i^ε : r(X(t), X(τ_i^ε)) ∨ r(X(t−), X(τ_i^ε)) ≥ ε}.

Define X^ε(t) = X(τ_i^ε) for τ_i^ε ≤ t < τ_{i+1}^ε. Then r(X(t), X^ε(t)) ≤ ε. If X is adapted to {F_t}, then the {τ_i^ε} are {F_t}-stopping times and X^ε is {F_t}-adapted.

Martingales

An R-valued stochastic process M adapted to {F_t} is an {F_t}-martingale if

	E[M(t + r) | F_t] = M(t),   t, r ≥ 0.

Every martingale has finite quadratic variation:

	[M]_t = lim Σ_i (M(t ∧ t_{i+1}) − M(t ∧ t_i))²,

where 0 = t_0 < t_1 < ···, t_i → ∞, and the limit is in probability as max_i(t_{i+1} − t_i) → 0. More precisely, for ε > 0 and t_0 > 0,

	lim P{ sup_{t ≤ t_0} |[M]_t − Σ_i (M(t ∧ t_{i+1}) − M(t ∧ t_i))²| > ε } = 0

as max_i(t_{i+1} − t_i) → 0. For standard Brownian motion W, [W]_t = t.

Exercise 1.2 Let N be a Poisson process with parameter λ. Then M(t) = N(t) − λt is a martingale. Compute [M]_t.

Optional sampling theorem

A real-valued process X is a submartingale if E[|X(t)|] < ∞, t ≥ 0, and

	E[X(t + s) | F_t] ≥ X(t),   t, s ≥ 0.

If τ_1 and τ_2 are stopping times, then

	E[X(t ∧ τ_2) | F_{τ_1}] ≥ X(t ∧ τ_1 ∧ τ_2).

If τ_2 is finite a.s., E[|X(τ_2)|] < ∞, and lim_{t→∞} E[|X(t)| 1_{τ_2 > t}] = 0, then

	E[X(τ_2) | F_{τ_1}] ≥ X(τ_1 ∧ τ_2).

Square integrable martingales

Let M be a martingale satisfying E[M(t)²] < ∞. Then M(t)² − [M]_t is a martingale. In particular, for t > s,

	E[(M(t) − M(s))²] = E[[M]_t − [M]_s].

Doob's inequalities

Let X be a submartingale.
Then for x > 0,

	P{ sup_{s ≤ t} X(s) ≥ x } ≤ x⁻¹ E[X(t)⁺],
	P{ inf_{s ≤ t} X(s) ≤ −x } ≤ x⁻¹ (E[X(t)⁺] − E[X(0)]).

If X is nonnegative and α > 1, then

	E[ sup_{s ≤ t} X(s)^α ] ≤ (α/(α − 1))^α E[X(t)^α].

Note that by Jensen's inequality, if M is a martingale, then |M| is a submartingale. In particular, if M is a square integrable martingale, then

	E[ sup_{s ≤ t} |M(s)|² ] ≤ 4 E[M(t)²].

Stochastic integrals

Definition 1.3 For cadlag processes X, Y,

	X_− · Y(t) ≡ ∫_0^t X(s−) dY(s) = lim_{max_i |t_{i+1} − t_i| → 0} Σ_i X(t_i)(Y(t_{i+1} ∧ t) − Y(t_i ∧ t))

whenever the limit exists in probability.

Sample paths of bounded variation: If Y is a finite variation process, the stochastic integral exists (apply the dominated convergence theorem) and

	∫_0^t X(s−) dY(s) = ∫_{(0,t]} X(s−) α_Y(ds),

where α_Y is the signed measure with α_Y(0, t] = Y(t) − Y(0).

Existence for square integrable martingales

If M is a square integrable martingale, then

	E[(M(t + s) − M(t))² | F_t] = E[[M]_{t+s} − [M]_t | F_t].

For partitions {t_i} and {r_i},

	E[ ( Σ_i X(t_i)(M(t_{i+1} ∧ t) − M(t_i ∧ t)) − Σ_i X(r_i)(M(r_{i+1} ∧ t) − M(r_i ∧ t)) )² ]
	  = E[ ∫_0^t (X(t(s−)) − X(r(s−)))² d[M]_s ]
	  = E[ ∫_{(0,t]} (X(t(s−)) − X(r(s−)))² α_{[M]}(ds) ],

where t(s) = t_i for s ∈ [t_i, t_{i+1}) and r(s) = r_i for s ∈ [r_i, r_{i+1}).

Cauchy property

Let X be bounded by a constant. As sup_i(t_{i+1} − t_i) + sup_i(r_{i+1} − r_i) → 0, the right side converges to zero by the dominated convergence theorem.

M_X^{(t_i)}(t) ≡ Σ_i X(t_i)(M(t_{i+1} ∧ t) − M(t_i ∧ t)) is a square integrable martingale, so by Doob's inequality

	E[ sup_{t ≤ T} ( Σ_i X(t_i)(M(t_{i+1} ∧ t) − M(t_i ∧ t)) − Σ_i X(r_i)(M(r_{i+1} ∧ t) − M(r_i ∧ t)) )² ]
	  ≤ 4 E[ ∫_{(0,T]} (X(t(s−)) − X(r(s−)))² α_{[M]}(ds) ].

A completeness argument gives existence of the stochastic integral, and the uniformity implies that the integral is cadlag.
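A numerical sanity check (my own sketch, not from the notes) of the two partition sums above: for a simulated Brownian path on a fine partition, the sum of squared increments should be close to [W]_t = t, and the left-endpoint sum approximating ∫_0^t W(s−) dW(s) should be close to (W(t)² − t)/2, the value given by Itô's formula. Function names are mine.

```python
import math
import random

def quadratic_variation(path):
    """Sum of squared increments over the partition underlying `path`."""
    return sum((b - a) ** 2 for a, b in zip(path, path[1:]))

def left_endpoint_integral(X, Y):
    """Ito-type sum  sum_i X(t_i)(Y(t_{i+1}) - Y(t_i)),
    approximating  int_0^t X(s-) dY(s)  on a fine partition."""
    return sum(x * (y1 - y0) for x, y0, y1 in zip(X, Y, Y[1:]))

random.seed(1)
T, n = 1.0, 100_000
dt = T / n
# simulate standard Brownian motion on [0, T] by independent Gaussian increments
w = [0.0]
for _ in range(n):
    w.append(w[-1] + random.gauss(0.0, math.sqrt(dt)))

qv = quadratic_variation(w)          # should be close to [W]_T = T
ito = left_endpoint_integral(w, w)   # should be close to (W(T)^2 - T)/2
assert abs(qv - T) < 0.05
assert abs(ito - (w[-1] ** 2 - T) / 2) < 0.05
```

The two checks are linked: expanding W(T)² over the partition shows the left-endpoint sum equals (W(T)² − Σ(ΔW)²)/2 exactly, so its error is half the quadratic-variation error.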
Local martingales

Definition 1.4 M is a local martingale if there exist stopping times {τ_n} satisfying τ_1 ≤ τ_2 ≤ ··· and τ_n → ∞ a.s. such that M^{τ_n}, defined by M^{τ_n}(t) = M(τ_n ∧ t), is a martingale. M is a local square-integrable martingale if the τ_n can be selected so that M^{τ_n} is square integrable.

{τ_n} is called a localizing sequence for M.

Remark 1.5 If {τ_n} is a localizing sequence for M, and {γ_n} is another sequence of stopping times satisfying γ_1 ≤ γ_2 ≤ ···, γ_n → ∞ a.s., then the optional sampling theorem implies that {τ_n ∧ γ_n} is localizing.

Local martingales with bounded jumps

Remark 1.6 If M is a continuous local martingale, then τ_n = inf{t : |M(t)| ≥ n} is a localizing sequence. More generally, if |ΔM(t)| ≤ c for some constant c, then τ_n = inf{t : |M(t)| ∨ |M(t−)| ≥ n} is a localizing sequence. Note that |M^{τ_n}| ≤ n + c, so M is locally square integrable.
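In discrete time the construction in Remark 1.6 is easy to visualize: freeze the path from the first index at which its absolute value reaches n, producing the stopped path M(τ_n ∧ t). A minimal sketch (illustrative only; the function name is mine):

```python
def stop_at_level(path, n):
    """Return the stopped path M(tau_n ∧ t), where tau_n is the first index
    with |value| >= n.  If the jumps of the path are bounded by c, the
    stopped path is bounded by n + c, hence square integrable."""
    stopped = []
    for x in path:
        if stopped and abs(stopped[-1]) >= n:
            stopped.append(stopped[-1])   # frozen after tau_n
        else:
            stopped.append(x)
    return stopped

m = [0, 2, -1, 4, 7, 3, -8]
s = stop_at_level(m, 4)
assert s == [0, 2, -1, 4, 4, 4, 4]   # frozen from the first time |M| >= 4
```

Note how the stopped path retains the value that crossed the level, which is why the bound is n + c rather than n when jumps of size up to c are allowed.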