Solutions to Selected Problems

Chapter 1. Preliminaries

1. For every $A \in \mathcal{F}_S$ and every $t \ge 0$,
$$A \cap \{T \le t\} = (A \cap \{S \le t\}) \cap \{T \le t\},$$
since $\{T \le t\} \subset \{S \le t\}$ (because $S \le T$). Since $A \cap \{S \le t\} \in \mathcal{F}_t$ and $\{T \le t\} \in \mathcal{F}_t$, we get $A \cap \{T \le t\} \in \mathcal{F}_t$. Thus $\mathcal{F}_S \subset \mathcal{F}_T$.

2. Let $\Omega = \mathbb{N}$ and let $\mathcal{F} = \mathcal{P}(\mathbb{N})$ be the power set of the natural numbers. Let $\mathcal{F}_n = \sigma(\{2\}, \{3\}, \dots, \{n+1\})$ for all $n$. Then $(\mathcal{F}_n)_{n \ge 1}$ is a filtration. Let $S = 3 \cdot 1_{\{3\}}$ and $T = 4$. Then $S \le T$ and
$$\{\omega : S(\omega) = n\} = \begin{cases} \{3\} & \text{if } n = 3 \\ \emptyset & \text{otherwise,} \end{cases} \qquad \{\omega : T(\omega) = n\} = \begin{cases} \Omega & \text{if } n = 4 \\ \emptyset & \text{otherwise.} \end{cases}$$
Hence $\{S = n\} \in \mathcal{F}_n$ and $\{T = n\} \in \mathcal{F}_n$ for all $n$, so $S$ and $T$ are both stopping times. However,
$$\{\omega : T - S = 1\} = \{\omega : 1_{\{3\}}(\omega) = 1\} = \{3\} \notin \mathcal{F}_1.$$
Therefore $T - S$ is not a stopping time.

3. Observe that $\{T_n \le t\} \in \mathcal{F}_t$ and $\{T_n < t\} \in \mathcal{F}_t$ for all $n \in \mathbb{N}$, $t \in \mathbb{R}_+$, since each $T_n$ is a stopping time and we assume the usual hypotheses. Then:
(1) $\sup_n T_n$ is a stopping time, since for all $t \ge 0$, $\{\sup_n T_n \le t\} = \cap_n \{T_n \le t\} \in \mathcal{F}_t$.
(2) $\inf_n T_n$ is a stopping time, since $\{\inf_n T_n < t\} = \cup_n \{T_n < t\} \in \mathcal{F}_t$.
(3) $\limsup_{n\to\infty} T_n$ is a stopping time, since $\limsup_{n\to\infty} T_n = \inf_m \sup_{n \ge m} T_n$ (combine (1) and (2)).
(4) $\liminf_{n\to\infty} T_n$ is a stopping time, since $\liminf_{n\to\infty} T_n = \sup_m \inf_{n \ge m} T_n$ (combine (1) and (2)).

4. $T$ is a stopping time by exercise 3, since $T = \liminf_n T_n$. Because $T \le T_n$ for all $n$, exercise 1 gives $\mathcal{F}_T \subset \mathcal{F}_{T_n}$ for every $n$, hence $\mathcal{F}_T \subset \cap_n \mathcal{F}_{T_n}$. Conversely, pick $A \in \cap_n \mathcal{F}_{T_n}$. For every $n$ and $u \ge 0$ we have $A \cap \{T_n < u\} = \cup_k \left( A \cap \{T_n \le u - 1/k\} \right) \in \mathcal{F}_u$. Since $T_n \downarrow T$, $\{T < u\} = \cup_n \{T_n < u\}$, so $A \cap \{T < u\} = \cup_n (A \cap \{T_n < u\}) \in \mathcal{F}_u$. Therefore
$$A \cap \{T \le t\} = \bigcap_m \left( A \cap \{T < t + 1/m\} \right) \in \bigcap_m \mathcal{F}_{t + 1/m} = \mathcal{F}_{t+} = \mathcal{F}_t,$$
using the usual hypotheses. This shows $\cap_n \mathcal{F}_{T_n} \subset \mathcal{F}_T$ and completes the proof.

5. (a) By completeness of the $L^p$ space, $X \in L^p$. By Jensen's inequality, for $p > 1$,
$$E|M_t|^p = E|E(X \mid \mathcal{F}_t)|^p \le E\left[ E(|X|^p \mid \mathcal{F}_t) \right] = E|X|^p < \infty.$$
(b) By (a), $M_t \in L^p \subset L^1$. For $t \ge s \ge 0$, $E(M_t \mid \mathcal{F}_s) = E(E(X \mid \mathcal{F}_t) \mid \mathcal{F}_s) = E(X \mid \mathcal{F}_s) = M_s$ a.s., so $(M_t)$ is a martingale. Next we show that $(M_t)$ is continuous. By Jensen's inequality, for $p > 1$,
$$E|M_t^n - M_t|^p = E|E(M_\infty^n - X \mid \mathcal{F}_t)|^p \le E|M_\infty^n - X|^p, \quad \forall t \ge 0. \tag{1}$$
It follows that $\sup_t E|M_t^n - M_t|^p \le E|M_\infty^n - X|^p \to 0$ as $n \to \infty$. Fix an arbitrary $\varepsilon > 0$. By Chebyshev's and Doob's inequalities,
$$P\left( \sup_t |M_t^n - M_t| > \varepsilon \right) \le \frac{1}{\varepsilon^p} E\left( \sup_t |M_t^n - M_t|^p \right) \le \left( \frac{p}{p-1} \right)^p \frac{\sup_t E|M_t^n - M_t|^p}{\varepsilon^p} \to 0. \tag{2}$$
Therefore $M^n$ converges to $M$ uniformly in probability, and there exists a subsequence $(n_k)$ such that $M^{n_k}$ converges uniformly to $M$ with probability 1. Then $M$ is continuous, since for almost every $\omega$ it is the limit of uniformly convergent continuous paths.

6. Let $p(n)$ denote the probability mass function of the Poisson distribution with parameter $\lambda t$, and assume $\lambda t$ is an integer, as given. Since $|x| = x + 2x^-$ and $E(N_t - \lambda t) = 0$,
$$E|N_t - \lambda t| = E(N_t - \lambda t) + 2E(N_t - \lambda t)^- = 2E(N_t - \lambda t)^- = 2\sum_{n=0}^{\lambda t} (\lambda t - n)\, p(n)$$
$$= 2e^{-\lambda t} \sum_{n=0}^{\lambda t} (\lambda t - n) \frac{(\lambda t)^n}{n!} = 2\lambda t\, e^{-\lambda t} \left( \sum_{n=0}^{\lambda t} \frac{(\lambda t)^n}{n!} - \sum_{n=0}^{\lambda t - 1} \frac{(\lambda t)^n}{n!} \right) = 2e^{-\lambda t} \frac{(\lambda t)^{\lambda t}}{(\lambda t - 1)!}. \tag{3}$$

7. Since $N$ has stationary increments, for $t \ge s \ge 0$,
$$E(N_t - N_s)^2 = E N_{t-s}^2 = \mathrm{Var}(N_{t-s}) + (E N_{t-s})^2 = \lambda(t-s)\left[ 1 + \lambda(t-s) \right]. \tag{4}$$
As $t \downarrow s$ (or $s \uparrow t$), $E(N_t - N_s)^2 \to 0$. Hence $N$ is continuous in $L^2$, and therefore in probability.

8. Here $\tau_\alpha = \sup\{t : \|B_t\| \le \alpha\}$ is the last exit time from the closed ball of radius $\alpha$. Suppose $\tau_\alpha$ were a stopping time. Since 3-dimensional Brownian motion is transient, $P(\tau_\alpha < \infty) = 1$, and by path continuity $\|B_{\tau_\alpha}\| = \alpha$. Define $W_t := B_{\tau_\alpha + t}$ and $S := \inf\{t > 0 : \|W_t\| \le \alpha\}$. By the strong Markov property, conditionally on $\mathcal{F}_{\tau_\alpha}$, $W$ would be a Brownian motion started at $B_{\tau_\alpha}$, a point of the sphere of radius $\alpha$. A Brownian motion started on that sphere re-enters the closed ball by time $s$ with a probability $c > 0$ which, by rotational invariance, does not depend on the starting point; hence $P(S \le s \mid \mathcal{F}_{\tau_\alpha}) = c$ and
$$P\left( \{\tau_\alpha \le t\} \cap \{S \le s\} \right) = c\, P(\tau_\alpha \le t) > 0$$
for $t$ large enough. But $S \le s$ means that $B$ returns to the ball after time $\tau_\alpha$, so $\{\tau_\alpha \le t\} \cap \{S \le s\} = \emptyset$ by the definition of $\tau_\alpha$, a contradiction. Hence $\tau_\alpha$ is not a stopping time.

9. (a) Since the $L^2$ space is complete, it suffices to show that $S_t^n := \sum_{i=1}^n M_t^i$ is a Cauchy sequence in $L^2$ with respect to $n$. For $m \ge n$, by independence of the $M^i$,
$$E(S_t^n - S_t^m)^2 = E\left( \sum_{i=n+1}^m M_t^i \right)^2 = \sum_{i=n+1}^m E\left( M_t^i \right)^2 = \sum_{i=n+1}^m \frac{t}{i^2}. \tag{5}$$
Since $\sum_{i=1}^\infty 1/i^2 < \infty$, $E(S_t^n - S_t^m)^2 \to 0$ as $n, m \to \infty$. Hence $(S_t^n)_n$ is Cauchy and its limit $M_t$ is well defined for all $t \ge 0$.
(b) First, recall Kolmogorov's convergence criterion: if $(X_i)_{i \ge 1}$ is a sequence of independent random variables with $\sum_i \mathrm{Var}(X_i) < \infty$, then $\sum_i (X_i - EX_i)$ converges a.s.
For all $i$ and $t$, $\Delta M_t^i = (1/i)\Delta N_t^i \ge 0$. By Fubini's theorem and the monotone convergence theorem,
$$\sum_{s \le t} \Delta M_s = \sum_{s \le t} \sum_{i=1}^\infty \Delta M_s^i = \sum_{i=1}^\infty \sum_{s \le t} \Delta M_s^i = \sum_{i=1}^\infty \sum_{s \le t} \frac{\Delta N_s^i}{i} = \sum_{i=1}^\infty \frac{N_t^i}{i}. \tag{6}$$
Let $X_i = (N_t^i - t)/i$. Then $(X_i)_i$ is a sequence of independent random variables with $EX_i = 0$ and $\mathrm{Var}(X_i) = t/i^2$, hence $\sum_{i=1}^\infty \mathrm{Var}(X_i) < \infty$. Kolmogorov's criterion therefore implies that $\sum_{i=1}^\infty X_i$ converges almost surely. On the other hand, $\sum_{i=1}^\infty t/i = \infty$. Therefore
$$\sum_{s \le t} \Delta M_s = \sum_{i=1}^\infty \frac{N_t^i}{i} = \sum_{i=1}^\infty \left( X_i + \frac{t}{i} \right) = \infty \quad \text{a.s.}$$

10. (a) Let $N_t = \sum_i \frac{1}{i}(N_t^i - t)$ and $L_t = \sum_i \frac{1}{i}(L_t^i - t)$. As shown in exercise 9(a), $N$ and $L$ are well defined in the $L^2$ sense. Then $M$ is also well defined in the $L^2$ sense, since by linearity of the $L^2$ space
$$M_t = \sum_i \frac{1}{i}\left[ (N_t^i - t) - (L_t^i - t) \right] = \sum_i \frac{1}{i}(N_t^i - t) - \sum_i \frac{1}{i}(L_t^i - t) = N_t - L_t. \tag{7}$$
Both terms on the right-hand side are martingales which change only by jumps, as shown in exercise 9(b). Hence $M_t$ is a martingale which changes only by jumps.
(b) First we show that for two independent Poisson processes $N$ and $L$, $\sum_{s > 0} \Delta N_s \Delta L_s = 0$ a.s., i.e. $N$ and $L$ almost surely do not jump simultaneously. Let $(T_n)_{n \ge 1}$ be the sequence of jump times of $L$. Then $\sum_{s > 0} \Delta N_s \Delta L_s = \sum_n \Delta N_{T_n}$, and we want to show that $\sum_n \Delta N_{T_n} = 0$ a.s. Since $\Delta N_{T_n} \ge 0$, it is enough to show that $E\Delta N_{T_n} = 0$ for every $n \ge 1$. Fix $n \ge 1$ and let $\mu_{T_n}$ be the probability measure induced on $\mathbb{R}_+$ by $T_n$. By conditioning,
$$E(\Delta N_{T_n}) = E\left[ E(\Delta N_{T_n} \mid T_n) \right] = \int_0^\infty E(\Delta N_{T_n} \mid T_n = t)\, \mu_{T_n}(dt) = \int_0^\infty E(\Delta N_t)\, \mu_{T_n}(dt), \tag{8}$$
where the last equality is by independence of $N$ and $T_n$. Since $\Delta N_t \in L^1$ and $P(\Delta N_t = 0) = 1$ for each fixed $t$ by problem 25, $E\Delta N_t = 0$, hence $E\Delta N_{T_n} = 0$. Next we show that the previous claim holds even for countably many independent Poisson processes $(N^i)_{i \ge 1}$. Let $A \subset \Omega$ be the set on which at least two of the processes $(N^i)_{i \ge 1}$ jump simultaneously, and let $\Omega_{ij}$ denote the set on which $N^i$ and $N^j$ do not jump simultaneously.
Then $P(\Omega_{ij}) = 1$ for $i \ne j$ by the previous assertion. Since $A \subset \cup_{i > j} \Omega_{ij}^c$, $P(A) \le \sum_{i > j} P(\Omega_{ij}^c) = 0$. Therefore, almost surely, jumps do not happen simultaneously. Going back to the main proof: by (a), exercise 9(b), and the fact that $N$ and $L$ do not jump simultaneously, for all $t > 0$,
$$\sum_{s \le t} |\Delta M_s| = \sum_{s \le t} |\Delta N_s| + \sum_{s \le t} |\Delta L_s| = \infty \quad \text{a.s.} \tag{9}$$

11. Continuity: we use the notation of Example 2 in Section 4 (p. 33), so $Z_t = \sum_{k=1}^{N_t} U_k$, and assume $E|U_1| < \infty$. By the independence of the $U_i$, subadditivity, Markov's inequality, and the properties of the Poisson process,
$$\lim_{s \to t} P(|Z_t - Z_s| > \varepsilon) = \lim_{s \to t} \sum_k P(|Z_t - Z_s| > \varepsilon \mid N_t - N_s = k)\, P(N_t - N_s = k)$$
$$\le \lim_{s \to t} \sum_k P\left( \sum_{i=1}^k |U_i| > \varepsilon \right) P(N_t - N_s = k) \le \lim_{s \to t} \sum_k k\, P\left( |U_1| > \frac{\varepsilon}{k} \right) P(N_t - N_s = k) \tag{10}$$
$$\le \lim_{s \to t} \sum_k k^2\, \frac{E|U_1|}{\varepsilon}\, P(N_t - N_s = k) = \lim_{s \to t} \frac{E|U_1|}{\varepsilon} \left[ \lambda(t-s) + \lambda^2(t-s)^2 \right] = 0,$$
since $\sum_k k^2 P(N_t - N_s = k) = E(N_t - N_s)^2 = \lambda(t-s) + \lambda^2(t-s)^2$ by exercise 7. Hence $Z$ is continuous in probability.
Independent increments: by the independence of the $(U_k)_k$ and the strong Markov property of $N$, for arbitrary $t \ge s \ge 0$,
$$E\left( e^{iu(Z_t - Z_s) + ivZ_s} \right) = E\left( e^{iu \sum_{k=N_s+1}^{N_t} U_k + iv \sum_{k=1}^{N_s} U_k} \right) = E\left[ E\left( e^{iu \sum_{k=N_s+1}^{N_t} U_k + iv \sum_{k=1}^{N_s} U_k} \,\Big|\, \mathcal{F}_s \right) \right]$$
$$= E\left[ e^{iv \sum_{k=1}^{N_s} U_k}\, E\left( e^{iu \sum_{k=N_s+1}^{N_t} U_k} \,\Big|\, \mathcal{F}_s \right) \right] = E\left( e^{iv \sum_{k=1}^{N_s} U_k} \right) E\left( e^{iu \sum_{k=N_s+1}^{N_t} U_k} \right) = E\left( e^{ivZ_s} \right) E\left( e^{iu(Z_t - Z_s)} \right). \tag{11}$$
This shows that $Z$ has independent increments.
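The factorization in (11) can also be observed numerically. The sketch below is only an illustration: the parameters $\lambda = 1.5$, $s = 1$, $t = 2.5$, $u$, $v$, the Rademacher jump law $U_k = \pm 1$, and the seed are all assumptions, not from the text. It simulates compound Poisson paths, then compares the joint characteristic function of $(Z_s, Z_t - Z_s)$ with the product of the marginals, and each marginal with the closed form $\exp(\lambda h(\cos w - 1))$ for an increment of length $h$.

```python
import cmath
import math
import random

def poisson(mean, rng):
    """Knuth's product-of-uniforms Poisson sampler."""
    limit, k, prod = math.exp(-mean), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def path_values(lam, s, t, rng):
    """One compound Poisson path on [0, t] with U_k = +-1; return (Z_s, Z_t)."""
    # Given N_t jumps, the jump times are distributed as sorted uniforms.
    jump_times = sorted(rng.uniform(0.0, t) for _ in range(poisson(lam * t, rng)))
    jumps = [rng.choice((-1, 1)) for _ in jump_times]
    z_s = sum(j for tm, j in zip(jump_times, jumps) if tm <= s)
    return z_s, sum(jumps)

rng = random.Random(7)
lam, s, t, u, v, reps = 1.5, 1.0, 2.5, 0.7, 1.1, 40000
samples = [path_values(lam, s, t, rng) for _ in range(reps)]

joint = sum(cmath.exp(1j * (u * (zt - zs) + v * zs)) for zs, zt in samples) / reps
inc = sum(cmath.exp(1j * u * (zt - zs)) for zs, zt in samples) / reps
marg = sum(cmath.exp(1j * v * zs) for zs, zt in samples) / reps

# Independent increments: the joint characteristic function factorizes.
assert abs(joint - inc * marg) < 0.04
# Each factor matches exp(lam * h * (cos w - 1)), since E e^{iwU} = cos w.
assert abs(marg - math.exp(lam * s * (math.cos(v) - 1))) < 0.03
assert abs(inc - math.exp(lam * (t - s) * (math.cos(u) - 1))) < 0.03
```

Note that, unlike the construction in the sketch of exercise 9, the increment $Z_t - Z_s$ here is read off the same path as $Z_s$, so the factorization really is a statement about the process rather than about how the samples were built.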
