Transparencies for Lectures

1. Stochastic equations for Markov processes

• Filtrations and the Markov property
• Itô equations for diffusion processes
• Poisson random measures
• Itô equations for Markov processes with jumps

Filtrations and the Markov property

$(\Omega,\mathcal{F},P)$ a probability space.

Available information is modeled by a sub-$\sigma$-algebra of $\mathcal{F}$; $\mathcal{F}_t$ is the information available at time $t$.

$\{\mathcal{F}_t\}$ is a filtration: $t<s$ implies $\mathcal{F}_t\subset\mathcal{F}_s$.

A stochastic process $X$ is adapted to $\{\mathcal{F}_t\}$ if $X(t)$ is $\mathcal{F}_t$-measurable for each $t\geq 0$.

An $E$-valued stochastic process $X$ adapted to $\{\mathcal{F}_t\}$ is $\{\mathcal{F}_t\}$-Markov if
$$E[f(X(t+r))\mid\mathcal{F}_t]=E[f(X(t+r))\mid X(t)],\qquad t,r\geq 0,\ f\in B(E).$$

An $\mathbb{R}$-valued stochastic process $M$ adapted to $\{\mathcal{F}_t\}$ is an $\{\mathcal{F}_t\}$-martingale if
$$E[M(t+r)\mid\mathcal{F}_t]=M(t),\qquad t,r\geq 0.$$

$\tau$ is an $\{\mathcal{F}_t\}$-stopping time if for each $t\geq 0$, $\{\tau\leq t\}\in\mathcal{F}_t$. For a stopping time $\tau$,
$$\mathcal{F}_\tau=\{A\in\mathcal{F}:\{\tau\leq t\}\cap A\in\mathcal{F}_t,\ t\geq 0\}.$$

Itô integrals

$W$ standard Brownian motion ($W$ has independent increments with $W(t+s)-W(t)$ normally distributed with mean zero and variance $s$).

$W$ is compatible with a filtration $\{\mathcal{F}_t\}$ if $W$ is $\{\mathcal{F}_t\}$-adapted and $W(t+s)-W(t)$ is independent of $\mathcal{F}_t$ for all $s,t\geq 0$.

If $X$ is $\mathbb{R}$-valued, cadlag, and $\{\mathcal{F}_t\}$-adapted, then
$$\int_0^t X(s-)\,dW(s)=\lim_{\max(t_{i+1}-t_i)\to 0}\sum_i X(t_i)\bigl(W(t\wedge t_{i+1})-W(t\wedge t_i)\bigr)$$
exists, and the integral satisfies
$$E\Bigl[\Bigl(\int_0^t X(s-)\,dW(s)\Bigr)^2\Bigr]=E\Bigl[\int_0^t X(s)^2\,ds\Bigr]$$
if the right side is finite. The integral extends to all measurable and adapted processes satisfying
$$\int_0^t X(s)^2\,ds<\infty.$$

$W$ is an $\mathbb{R}^m$-valued standard Brownian motion if $W=(W_1,\ldots,W_m)^T$ for $W_1,\ldots,W_m$ independent one-dimensional standard Brownian motions.

Itô equations for diffusion processes

$\sigma:\mathbb{R}^d\to\mathbb{M}^{d\times m}$ and $b:\mathbb{R}^d\to\mathbb{R}^d$. Consider
$$X(t)=X(0)+\int_0^t\sigma(X(s))\,dW(s)+\int_0^t b(X(s))\,ds,$$
where $X(0)$ is independent of the standard Brownian motion $W$.

Why is $X$ Markov? Suppose $W$ is compatible with $\{\mathcal{F}_t\}$, $X$ is $\{\mathcal{F}_t\}$-adapted, and $X$ is unique (among $\{\mathcal{F}_t\}$-adapted solutions). Then
$$X(t+r)=X(t)+\int_t^{t+r}\sigma(X(s))\,dW(s)+\int_t^{t+r}b(X(s))\,ds,$$
and $X(t+r)=H(t,r,X(t),W_t)$ where $W_t(s)=W(t+s)-W(t)$. Since $W_t$ is independent of $\mathcal{F}_t$,
$$E[f(X(t+r))\mid\mathcal{F}_t]=\int f(H(t,r,X(t),w))\,\mu_W(dw).$$
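As a numerical aside (not part of the original transparencies), the diffusion equation above is often approximated by the Euler–Maruyama scheme: over each small time step the coefficients $\sigma$ and $b$ are frozen at the left endpoint, exactly as in the non-anticipating sums defining the Itô integral. The sketch below is a minimal illustration; the function `euler_maruyama` and the particular coefficients are our own illustrative choices.

```python
import numpy as np

def euler_maruyama(x0, sigma, b, T=1.0, n_steps=1000, rng=None):
    """Euler-Maruyama approximation of
        X(t) = X(0) + int_0^t sigma(X(s)) dW(s) + int_0^t b(X(s)) ds,
    with sigma(x) a d x m matrix and b(x) a d-vector.  Each Brownian
    increment is applied to the coefficients evaluated at the *left*
    endpoint, mirroring X(t_i)(W(t_{i+1}) - W(t_i)) in the Ito sums."""
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.asarray(x0, dtype=float)
    m = sigma(x).shape[1]                           # number of driving Brownian motions
    path = [x.copy()]
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=m)   # W(t_{i+1}) - W(t_i) ~ N(0, dt I)
        x = x + sigma(x) @ dW + b(x) * dt
        path.append(x.copy())
    return np.array(path)

# Illustrative one-dimensional example (d = m = 1): constant diffusion, linear drift.
path = euler_maruyama(x0=[1.0],
                      sigma=lambda x: np.array([[0.5]]),
                      b=lambda x: -x)
```

Refining the partition (a larger `n_steps`) corresponds to the limit $\max(t_{i+1}-t_i)\to 0$ in the definition of the Itô integral.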
Gaussian white noise

$(U,d_U)$ a complete, separable metric space; $\mathcal{B}(U)$ the Borel sets; $\mu$ a (Borel) measure on $U$;
$$\mathcal{A}(U)=\{A\in\mathcal{B}(U):\mu(A)<\infty\}.$$

$W(A,t)$ mean zero, Gaussian process indexed by $\mathcal{A}(U)\times[0,\infty)$ with
$$E[W(A,t)W(B,s)]=t\wedge s\,\mu(A\cap B).$$

$W(\varphi,t)=\int\varphi(u)\,W(du,t)$: for $\varphi(u)=\sum_i a_i\mathbf{1}_{A_i}(u)$, set $W(\varphi,t)=\sum_i a_iW(A_i,t)$, so that
$$E[W(\varphi_1,t)W(\varphi_2,s)]=t\wedge s\int_U\varphi_1(u)\varphi_2(u)\,\mu(du).$$
This defines $W(\varphi,t)$ for all $\varphi\in L^2(\mu)$.

Definition of integral

For $X(t)=\sum_i\xi_i(t)\varphi_i$, a process in $L^2(\mu)$, define
$$I_W(X,t)=\int_{U\times[0,t]}X(s,u)\,W(du\times ds)=\sum_i\int_0^t\xi_i(s)\,dW(\varphi_i,s).$$
Then
$$E[I_W(X,t)^2]=E\Bigl[\sum_{i,j}\int_0^t\xi_i(s)\xi_j(s)\,ds\int_U\varphi_i\varphi_j\,d\mu\Bigr]=E\Bigl[\int_0^t\int_U X(s,u)^2\,\mu(du)\,ds\Bigr].$$
The integral extends to adapted processes satisfying
$$\int_0^t\int_U X(s,u)^2\,\mu(du)\,ds<\infty\quad\text{a.s.}$$
so that
$$I_W(X,t)^2-\int_0^t\int_U X(s,u)^2\,\mu(du)\,ds$$
is a local martingale.

Poisson random measures

$\nu$ a $\sigma$-finite measure on $U$, $(U,d_U)$ a complete, separable metric space; $\xi$ a Poisson random measure on $U\times[0,\infty)$ with mean measure $\nu\times m$ ($m$ Lebesgue measure). For $A\in\mathcal{B}(U)\times\mathcal{B}([0,\infty))$, $\xi(A)$ has a Poisson distribution with expectation $\nu\times m(A)$, and $\xi(A)$ and $\xi(B)$ are independent if $A\cap B=\emptyset$.

$\xi(A,t)\equiv\xi(A\times[0,t])$ is a Poisson process with parameter $\nu(A)$, and $\tilde\xi(A,t)\equiv\xi(A\times[0,t])-\nu(A)t$ is a martingale.

$\xi$ is $\{\mathcal{F}_t\}$-compatible if for each $A\in\mathcal{B}(U)$, $\xi(A,\cdot)$ is $\{\mathcal{F}_t\}$-adapted and for all $t,s\geq 0$, $\xi(A\times(t,t+s])$ is independent of $\mathcal{F}_t$.

Stochastic integrals for Poisson random measures

$X$ cadlag, $L^1(\nu)$-valued, $\{\mathcal{F}_t\}$-adapted. We define
$$I_\xi(X,t)=\int_{U\times[0,t]}X(u,s-)\,\xi(du\times ds)$$
in such a way that
$$E[|I_\xi(X,t)|]\leq E\Bigl[\int_{U\times[0,t]}|X(u,s-)|\,\xi(du\times ds)\Bigr]=\int_{U\times[0,t]}E[|X(u,s)|]\,\nu(du)\,ds.$$
If the right side is finite, then
$$E\Bigl[\int_{U\times[0,t]}X(u,s-)\,\xi(du\times ds)\Bigr]=\int_{U\times[0,t]}E[X(u,s)]\,\nu(du)\,ds.$$

Predictable integrands

The integral extends to predictable integrands satisfying
$$\int_{U\times[0,t]}|X(u,s)|\wedge 1\,\nu(du)\,ds<\infty\quad\text{a.s.}$$
so that
$$E[I_\xi(X,t\wedge\tau)]=E\Bigl[\int_{U\times[0,t\wedge\tau]}X(u,s)\,\xi(du\times ds)\Bigr]$$
for any stopping time satisfying
$$E\Bigl[\int_{U\times[0,t\wedge\tau]}|X(u,s)|\,\nu(du)\,ds\Bigr]<\infty.\qquad(1)$$
If (1) holds for all $t$, then
$$\int_{U\times[0,t\wedge\tau]}X(u,s)\,\xi(du\times ds)-\int_{U\times[0,t\wedge\tau]}X(u,s)\,\nu(du)\,ds$$
is a martingale. ($X$ is predictable if it is the pointwise limit of adapted, left-continuous processes.)

Stochastic integrals for centered Poisson random measures

$X$ cadlag, $L^2(\nu)$-valued, $\{\mathcal{F}_t\}$-adapted. We define
$$I_{\tilde\xi}(X,t)=\int_{U\times[0,t]}X(u,s-)\,\tilde\xi(du\times ds)$$
in such a way that
$$E[I_{\tilde\xi}(X,t)^2]=\int_{U\times[0,t]}E[X(u,s)^2]\,\nu(du)\,ds$$
if the right side is finite. Then $I_{\tilde\xi}(X,\cdot)$ is a square-integrable martingale.

The integral extends to predictable integrands satisfying
$$\int_{U\times[0,t]}|X(u,s)|^2\wedge|X(u,s)|\,\nu(du)\,ds<\infty\quad\text{a.s.}$$
so that $I_{\tilde\xi}(X,t\wedge\tau)$ is a martingale for any stopping time satisfying
$$E\Bigl[\int_{U\times[0,t\wedge\tau]}|X(u,s)|^2\wedge|X(u,s)|\,\nu(du)\,ds\Bigr]<\infty.$$

Itô equations for Markov processes with jumps

$W$ Gaussian white noise on $U_0\times[0,\infty)$ with $E[W(A,t)W(B,t)]=\mu(A\cap B)\,t$; $\xi_i$ Poisson random measure on $U_i\times[0,\infty)$ with mean measure $\nu_i$; let $\tilde\xi_1(A,t)=\xi_1(A,t)-\nu_1(A)t$. Assume
$$\hat\sigma^2(x)=\int\sigma^2(x,u)\,\mu(du)<\infty,\qquad\int\alpha_1(x,u)^2\wedge|\alpha_1(x,u)|\,\nu_1(du)<\infty,\qquad\int|\alpha_2(x,u)|\wedge 1\,\nu_2(du)<\infty,$$
and consider
$$X(t)=X(0)+\int_{U_0\times[0,t]}\sigma(X(s-),u)\,W(du\times ds)+\int_0^t\beta(X(s-))\,ds+\int_{U_1\times[0,t]}\alpha_1(X(s-),u)\,\tilde\xi_1(du\times ds)+\int_{U_2\times[0,t]}\alpha_2(X(s-),u)\,\xi_2(du\times ds).$$
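To make the jump equation concrete, here is a simulation sketch under simplifying assumptions of our own (it is not part of the slides): take $U_0$ to be a single point of unit $\mu$-mass, so the white-noise term reduces to $\int_0^t\sigma(X(s-))\,dW(s)$; take $\alpha_1\equiv 0$ (no compensated jump term); and assume $\nu_2(U_2)<\infty$, so the points of $\xi_2$ in a step of length $dt$ are Poisson with mean $\nu_2(U_2)\,dt$ and carry i.i.d. marks distributed as $\nu_2/\nu_2(U_2)$. The function and parameter names are illustrative.

```python
import numpy as np

def jump_diffusion(x0, sigma, beta, alpha, nu_total, sample_mark,
                   T=1.0, n_steps=1000, rng=None):
    """Euler scheme for the simplified one-dimensional jump equation
        X(t) = X(0) + int sigma(X(s-)) dW(s) + int beta(X(s-)) ds
               + int alpha(X(s-), u) xi_2(du x ds),
    assuming nu_2(U_2) = nu_total < infinity, so the jumps of xi_2 arrive
    at rate nu_total with i.i.d. marks drawn by sample_mark."""
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = float(x0)
    path = [x]
    for _ in range(n_steps):
        x_left = x                                  # X(s-), the left limit
        dW = rng.normal(0.0, np.sqrt(dt))
        x = x_left + sigma(x_left) * dW + beta(x_left) * dt
        # xi_2 puts a Poisson(nu_total * dt) number of points in U_2 x (s, s + dt]
        for _ in range(rng.poisson(nu_total * dt)):
            u = sample_mark(rng)                    # mark with distribution nu_2 / nu_total
            x += alpha(x_left, u)
        path.append(x)
    return np.array(path)

# Illustrative choices: linear drift, constant diffusion, rate-1 jumps with
# Gaussian marks and jump size alpha(x, u) = 0.1 * u.
path = jump_diffusion(x0=1.0,
                      sigma=lambda x: 0.3,
                      beta=lambda x: -x,
                      alpha=lambda x, u: 0.1 * u,
                      nu_total=1.0,
                      sample_mark=lambda rng: rng.normal())
```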
A martingale inequality

Discrete-time version: Burkholder (1973). Continuous-time version: Lenglart, Lepingle, and Pratelli (1980). Easy proof: Ichikawa (1986).

Lemma 1. For $0<p\leq 2$ there exists a constant $C_p$ such that for any locally square-integrable martingale $M$ with Meyer process $\langle M\rangle$ and any stopping time $\tau$,
$$E\bigl[\sup_{s\leq\tau}|M(s)|^p\bigr]\leq C_p\,E\bigl[\langle M\rangle_\tau^{p/2}\bigr].$$

$\langle M\rangle$ is the (essentially unique) predictable process such that $M^2-\langle M\rangle$ is a local martingale. (A left-continuous, adapted process is predictable.)

Graham's uniqueness theorem

Lipschitz condition:
$$\sqrt{\int|\sigma(x,u)-\sigma(y,u)|^2\,\mu(du)}+|\beta(x)-\beta(y)|+\sqrt{\int|\alpha_1(x,u)-\alpha_1(y,u)|^2\,\nu_1(du)}+\int|\alpha_2(x,u)-\alpha_2(y,u)|\,\nu_2(du)\leq M|x-y|.$$

Estimate

Let $X$ and $\tilde X$ be solutions of the SDE. Then
$$\begin{aligned}
E\bigl[\sup_{s\leq t}|X(s)-\tilde X(s)|\bigr]
&\leq E[|X(0)-\tilde X(0)|]+C_1E\Bigl[\Bigl(\int_0^t\int_{U_0}|\sigma(X(s-),u)-\sigma(\tilde X(s-),u)|^2\,\mu(du)\,ds\Bigr)^{1/2}\Bigr]\\
&\quad+C_1E\Bigl[\Bigl(\int_0^t\int_{U_1}|\alpha_1(X(s-),u)-\alpha_1(\tilde X(s-),u)|^2\,\nu_1(du)\,ds\Bigr)^{1/2}\Bigr]\\
&\quad+E\Bigl[\int_0^t\int_{U_2}|\alpha_2(X(s-),u)-\alpha_2(\tilde X(s-),u)|\,\nu_2(du)\,ds\Bigr]\\
&\quad+E\Bigl[\int_0^t|\beta(X(s-))-\beta(\tilde X(s-))|\,ds\Bigr]\\
&\leq E[|X(0)-\tilde X(0)|]+D(\sqrt{t}+t)\,E\bigl[\sup_{s\leq t}|X(s)-\tilde X(s)|\bigr].
\end{aligned}$$
In particular, if $X(0)=\tilde X(0)$ and $t$ is small enough that $D(\sqrt{t}+t)<1$, the estimate forces $X=\tilde X$ on $[0,t]$, and iterating gives uniqueness.

Boundary conditions

$X$ has values in $D$ and satisfies
$$X(t)=X(0)+\int_0^t\sigma(X(s))\,dW(s)+\int_0^t b(X(s))\,ds+\int_0^t\eta(X(s))\,d\lambda(s),$$
where $\lambda$ is nondecreasing and increases only when $X(t)\in\partial D$.
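A numerical sketch (again not from the slides) of the boundary-condition equation in the simplest setting: $d=1$, $D=(0,\infty)$, $\eta\equiv 1$, so $\lambda$ is the amount of pushing at $0$. Each step applies a free Euler increment and then adds to $\lambda$ exactly what is needed to keep the state nonnegative, so $\lambda$ is nondecreasing and increases only when the path is at the boundary. All names are illustrative.

```python
import numpy as np

def reflected_euler(x0, sigma, b, T=1.0, n_steps=1000, rng=None):
    """Projected Euler scheme for the reflected equation on D = (0, inf)
    with eta = 1:
        X(t) = X(0) + int sigma(X) dW + int b(X) ds + lambda(t),
    where lambda is nondecreasing and increases only when X is at 0."""
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x, lam = float(x0), 0.0
    xs, lams = [x], [lam]
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        y = x + sigma(x) * dW + b(x) * dt      # unconstrained Euler step
        push = max(0.0, -y)                    # increment of lambda: positive only if y < 0
        x, lam = y + push, lam + push
        xs.append(x)
        lams.append(lam)
    return np.array(xs), np.array(lams)

# Reflected Brownian motion with a small downward drift (illustrative choices):
xs, lams = reflected_euler(x0=0.5, sigma=lambda x: 1.0, b=lambda x: -0.2)
```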
