Brownian Motion and Stochastic Calculus

Xiongzhi Chen, University of Hawaii at Manoa, Department of Mathematics. September 20, 2008

Abstract: This note covers the Doob-Meyer decomposition and the basics of square-integrable martingales.

Contents

1 Doob-Meyer Decomposition

2 Square Integrable Martingales

Brownian Motion and Stochastic Calculus: Continuous-Time Submartingales

By symmetry in the definitions, it usually suffices to discuss only submartingales; the techniques for supermartingales are the same.

1 Doob-Meyer Decomposition

The Doob-Meyer decomposition clears the obstacle to defining the stochastic integral (via the isometry strategy) with respect to square-integrable martingales, and is hence of fundamental importance.

Definition 1 An increasing process $A$ is called natural if for every bounded, right-continuous martingale $\{M_t, \mathcal{F}_t : 0 \le t < \infty\}$ we have
$$E \int_{(0,t]} M_s \, dA_s = E \int_{(0,t]} M_{s-} \, dA_s \tag{1}$$
for every $0 < t < \infty$.

Problem 2 Suppose $X = \{X_t, \mathcal{F}_t : 0 \le t < \infty\}$ is a right-continuous submartingale. Show that under any one of the following conditions, $X$ is of class DL:
a) $X_t \ge 0$ a.s. for every $t \ge 0$;
b) $X$ has the special form $X_t = M_t + A_t$, $t \ge 0$, suggested by the Doob decomposition.

Show also that if $X$ is a uniformly integrable martingale, then it is of class D.

Proof. By the optional sampling theorem for bounded stopping times (in Note 4), we have
$$\int_{\{X_T > \lambda\}} X_T \, dP \le \int_{\{X_T > \lambda\}} X_a \, dP.$$
Also we have
$$P(X_T > \lambda) \le \frac{E(X_T)}{\lambda} \le \frac{E(X_a)}{\lambda}$$
for all $a > 0$, $\lambda > 0$, $T \in \mathcal{S}_a$ (the class of stopping times bounded by $a$). Therefore
$$\lim_{\lambda \to \infty} \sup_{T \in \mathcal{S}_a} \int_{\{X_T > \lambda\}} X_T \, dP = 0.$$

For the second part, it suffices to show that $\{M_T : T \in \mathcal{S}_a\}$ from the decomposition is uniformly integrable (the family $\{A_T : T \in \mathcal{S}_a\}$ is dominated by the integrable $A_a$). Again from the optional sampling theorem,
$$M_T = E(M_a \mid \mathcal{F}_T)$$
for all $T \in \mathcal{S}_a$, which establishes the needed uniform integrability. If $X$ is a uniformly integrable martingale, then $X_\infty$ closes $X$, and hence $X$ is of class D.

Problem 3 Let $X = \{X_t, \mathcal{F}_t : 0 \le t < \infty\}$ be a continuous, non-negative process with $X_0 = 0$ a.s., and let $A = \{A_t, \mathcal{F}_t : 0 \le t < \infty\}$ be any continuous, increasing process for which

$$E(X_T) \le E(A_T)$$

holds for every bounded stopping time $T$ of $\{\mathcal{F}_t\}$. Introduce the process

$$V_t = \max_{0 \le s \le t} X_s,$$

consider a continuous, increasing function $F$ on $[0,\infty)$ with $F(0) = 0$, and define

$$G(x) = 2F(x) + x \int_x^\infty u^{-1} \, dF(u)$$

for $0 < x < \infty$. Establish the inequalities
$$P[V_T \ge \varepsilon] \le \frac{E(A_T)}{\varepsilon}, \qquad \forall\, \varepsilon > 0, \tag{2}$$
$$P[V_T \ge \varepsilon,\ A_T < \delta] \le \frac{E(\delta \wedge A_T)}{\varepsilon}, \qquad \forall\, \varepsilon, \delta > 0, \tag{3}$$
and
$$E(F(V_T)) \le E(G(A_T)) \tag{4}$$
for any stopping time $T$ of $\{\mathcal{F}_t\}$.

Proof. Define the stopping times

$$H_\varepsilon = \inf\{t \ge 0 : X_t \ge \varepsilon\}, \qquad S_\delta = \inf\{t \ge 0 : A_t \ge \delta\} \tag{5}$$
and $T_n = T \wedge n \wedge H_\varepsilon$.

(Notice that $H_\varepsilon \le T$ need not hold for arbitrary finite $T$; rather, if $V_T \ge \varepsilon$ for some finite $T$, then $H_\varepsilon \le T$; otherwise the left-hand side below is zero, since $\{V_T \ge \varepsilon\}$ is empty for finite $T$, and the inequality holds trivially.) We have
$$\varepsilon P(V_{T_n} \ge \varepsilon) \le E\left( X_{T_n} 1_{\{V_{T_n} \ge \varepsilon\}} \right) \le E(X_{T_n}) \le E(A_{T_n}) \le E(A_T).$$
Now, by the continuity of the probability measure, we take a limit on both sides:
$$\varepsilon P(V_T \ge \varepsilon) \le \varepsilon P(V_{T \wedge H_\varepsilon} \ge \varepsilon) \le E(A_T)$$

(since $\{V_T \ge \varepsilon\} \subseteq \{V_{T \wedge H_\varepsilon} \ge \varepsilon\}$: on $\{V_T \ge \varepsilon\}$ we have $H_\varepsilon \le T$ and, by continuity, $X_{H_\varepsilon} = \varepsilon$). On the other hand, we have
$$P(V_T \ge \varepsilon,\ A_T < \delta) \le P(V_{T \wedge S_\delta} \ge \varepsilon) \le \frac{E(A_{T \wedge S_\delta})}{\varepsilon} \le \frac{E(\delta \wedge A_T)}{\varepsilon} \tag{6}$$

(since $A_T < \delta$ implies $T \le S_\delta$). Then we have

$$P(V_T \ge \varepsilon) = P(V_T \ge \varepsilon,\ A_T < \delta) + P(V_T \ge \varepsilon,\ A_T \ge \delta) \le \frac{E(\delta \wedge A_T)}{\varepsilon} + P(A_T \ge \delta),$$
which is the first inequality in the corollary that follows.

Denote the cdf of $V_T$ by $F_{V_T}$. By the assumption on $F$ we have

$$F(x) = \int_0^\infty 1_{\{u \le x\}} \, dF(u)$$
and

$$E(F(V_T)) = \int_0^\infty F(v)\, dF_{V_T}(v) = \int_0^\infty \int_0^\infty 1_{\{u \le v\}}\, dF(u)\, dF_{V_T}(v) \tag{7}$$
$$= \int_0^\infty dF(u) \int_0^\infty 1_{\{v \ge u\}}\, dF_{V_T}(v) = \int_0^\infty P(V_T \ge u)\, dF(u)$$
$$\le \int_0^\infty \left[ \frac{E(u \wedge A_T)}{u} + P(A_T \ge u) \right] dF(u).$$
Since
$$\int_0^\infty \frac{E(u \wedge A_T)}{u}\, dF(u) = E\left( F(A_T) + A_T \int_{A_T}^\infty u^{-1}\, dF(u) \right) \quad \text{and} \quad \int_0^\infty P(A_T \ge u)\, dF(u) = E(F(A_T)),$$
this yields $E(F(V_T)) \le E(G(A_T))$.

Corollary 4 From the above problem, we have

$$P[V_T \ge \varepsilon] \le \frac{E(\delta \wedge A_T)}{\varepsilon} + P[A_T \ge \delta]$$
and
$$E(V_T^p) \le \frac{2-p}{1-p}\, E(A_T^p), \qquad 0 < p < 1.$$

Proof. Take $F(x) = x^p$ for $0 < p < 1$ and $x \ge 0$. Then

$$G(x) = 2x^p + x \int_x^\infty p u^{p-1} \cdot u^{-1}\, du = 2x^p + p x \int_x^\infty u^{p-2}\, du = 2x^p + p x \cdot \frac{x^{p-1}}{1-p} = 2x^p + \frac{p}{1-p}\, x^p = \frac{2-p}{1-p}\, x^p,$$
and from inequality (4) above,
$$E(V_T^p) \le \frac{2-p}{1-p}\, E(A_T^p)$$
as desired.
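The corollary lends itself to a quick numerical sanity check (an illustration added here, not part of the original notes). A minimal sketch in Python, assuming standard Brownian motion $W$ as the test case: $X_t = W_t^2$ and $A_t = t$ satisfy $E(X_T) \le E(A_T)$ for bounded stopping times (optional sampling), so with $T = 1$ and $p = 1/2$ the corollary predicts $E(V_1^{1/2}) = E(\max_{s \le 1} |W_s|) \le 3$.

```python
import math
import random

random.seed(0)

def max_abs_bm(n_steps=1000):
    """Max of |W_s| over a grid approximation of [0, 1]."""
    dt = 1.0 / n_steps
    w = 0.0
    m = 0.0
    for _ in range(n_steps):
        w += random.gauss(0.0, math.sqrt(dt))
        m = max(m, abs(w))
    return m

# X_t = W_t^2 and A_t = t satisfy E(X_T) <= E(A_T) for bounded stopping
# times T.  With T = 1 and p = 1/2:
#   E(V_1^p) = E((max_s W_s^2)^{1/2}) = E(max_s |W_s|),
#   ((2-p)/(1-p)) * E(A_1^p) = 3.
p = 0.5
n_paths = 2000
lhs = sum(max_abs_bm() ** (2 * p) for _ in range(n_paths)) / n_paths
rhs = (2 - p) / (1 - p)
print(lhs, "<=", rhs)
```

The estimate lands around $1.25$ (the known value of $E \sup_{s \le 1}|W_s|$ is $\sqrt{\pi/2}$), comfortably inside the bound; the constant $(2-p)/(1-p)$ is not sharp for this example.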

2 Square Integrable Martingales

One thing to notice is that, when squaring sums of martingale increments and taking the expectation, one can neglect the cross-product terms. More precisely, if $M \in \mathcal{M}_2$ and $0 \le s < t \le u < v$, then

$$E((M_v - M_u)(M_t - M_s)) = E\{ E((M_v - M_u)(M_t - M_s) \mid \mathcal{F}_u) \} = E\{ (M_t - M_s)\, E(M_v - M_u \mid \mathcal{F}_u) \} = 0. \tag{8}$$
This fact, applied to both $M$ and $M^2 - \langle M \rangle$, gives
$$E\left( (M_v - M_u)^2 \mid \mathcal{F}_t \right) = E\left( M_v^2 - 2 M_u M_v + M_u^2 \mid \mathcal{F}_t \right) \tag{9}$$
$$= E\left( M_v^2 \mid \mathcal{F}_t \right) - 2 E\left( M_u E(M_v \mid \mathcal{F}_u) \mid \mathcal{F}_t \right) + E\left( M_u^2 \mid \mathcal{F}_t \right)$$
$$= E\left( M_v^2 \mid \mathcal{F}_t \right) - 2 E\left( M_u^2 \mid \mathcal{F}_t \right) + E\left( M_u^2 \mid \mathcal{F}_t \right)$$
$$= E\left( M_v^2 - M_u^2 \mid \mathcal{F}_t \right) = E(\langle M \rangle_v - \langle M \rangle_u \mid \mathcal{F}_t),$$
since
$$E(M_u M_v \mid \mathcal{F}_t) = E(E(M_u M_v \mid \mathcal{F}_u) \mid \mathcal{F}_t) = E(M_u E(M_v \mid \mathcal{F}_u) \mid \mathcal{F}_t) = E\left( M_u^2 \mid \mathcal{F}_t \right) \tag{10}$$
for a martingale $M$.
Thus $M_v^2 - \langle M \rangle_v - (M_u^2 - \langle M \rangle_u)$ and $(M_v - M_u)^2 - (\langle M \rangle_v - \langle M \rangle_u)$ have zero expectation conditioned on $\mathcal{F}_t$.
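This orthogonality of increments is easy to check by simulation (an illustration added here, not part of the original notes). A sketch with standard Brownian motion $M = W$, whose increments over disjoint intervals are independent with mean zero and whose quadratic variation is $\langle M \rangle_t = t$:

```python
import math
import random

random.seed(1)

# Standard Brownian motion with s = 0.2 < t = 0.5 <= u = 0.7 < v = 1.0.
# The cross-product E[(M_v - M_u)(M_t - M_s)] should be 0, while
# E[(M_v - M_u)^2] should equal <M>_v - <M>_u = v - u = 0.3.
s, t, u, v = 0.2, 0.5, 0.7, 1.0
n = 200_000
prod_sum = 0.0
sq_sum = 0.0
for _ in range(n):
    inc_ts = random.gauss(0.0, math.sqrt(t - s))  # M_t - M_s
    inc_uv = random.gauss(0.0, math.sqrt(v - u))  # M_v - M_u
    prod_sum += inc_uv * inc_ts
    sq_sum += inc_uv * inc_uv

mean_prod = prod_sum / n  # near 0
mean_sq = sq_sum / n      # near 0.3
print(mean_prod, mean_sq)
```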

Problem 5 Let $\{X_t, \mathcal{F}_t\}$ be a continuous process with the property that, for each fixed $t > 0$ and for some $p > 0$,
$$\lim_{\|\Pi\| \to 0} V_t^{(p)}(\Pi) = L_t$$
in probability, where $L_t$ is a random variable taking values in $[0, \infty)$ a.s. Show that $\lim_{\|\Pi\| \to 0} V_t^{(q)}(\Pi) = 0$ in probability for $q > p$, and $\lim_{\|\Pi\| \to 0} V_t^{(q)}(\Pi) = \infty$ in probability on $\{L_t > 0\}$ for $0 < q < p$.
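Before the proof, a numerical illustration (added here, not part of the original notes), assuming standard Brownian motion, for which the statement holds with $p = 2$ and $L_t = t$: on a fine partition of $[0,1]$, $V^{(3)}$ is nearly zero, $V^{(2)}$ is near $t = 1$, and $V^{(1)}$ is large.

```python
import math
import random

random.seed(2)

# One Brownian path on [0, 1] sampled at n = 2**16 points.  For Brownian
# motion V^(2) converges to t, so (p = 2, L_t = t > 0) the problem predicts
# V^(q) -> 0 for q > 2 and V^(q) -> infinity for q < 2 as the mesh shrinks.
n = 2 ** 16
dt = 1.0 / n
incs = [random.gauss(0.0, math.sqrt(dt)) for _ in range(n)]

def variation(q):
    return sum(abs(d) ** q for d in incs)

v1, v2, v3 = variation(1), variation(2), variation(3)
print(v1, v2, v3)  # v1 large, v2 near 1, v3 near 0
```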

Proof. If $L_t = 0$ a.s., then there is nothing to prove. So it is reasonable to suppose
$$P(L_t > 0) > 0.$$
Let
$$X(t, \delta) = \sup\{ |X_{s'} - X_s| : |s' - s| < \delta,\ s, s' \in [0, t] \}.$$

Then, by the uniform continuity of $X_s$ on the compact set $[0, t]$, we have

$$\lim_{\delta \to 0} \sup\{ |X_{s'} - X_s| : |s' - s| < \delta,\ s, s' \in [0, t] \} = 0.$$
For any partition $\Pi_n = \{ t_i^{(n)} \in [0,t] : 0 = t_0^{(n)} \le t_1^{(n)} \le \cdots \le t_{m_n}^{(n)} = t \}$, let
$$\|\Pi_n\| = \max_{0 \le i < m_n} \left( t_{i+1}^{(n)} - t_i^{(n)} \right).$$

Then it is clear that
$$V_t^{(q)}(\Pi_n) = \sum_{i=0}^{m_n - 1} \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right|^q \le \left( \sum_{i=0}^{m_n - 1} \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right|^p \right) \left( \max_{0 \le i < m_n} \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right| \right)^{q-p}.$$
Consequently, if $q > p$, then
$$\lim_{\|\Pi_n\| \to 0} V_t^{(q)}(\Pi_n) = 0$$
in probability. For the second half, assume the contrary. Then there are constants $\eta, K > 0$ and a sequence of partitions
$$\Pi_n = \left\{ t_i^{(n)} \in [0,t] : 0 = t_0^{(n)} \le \cdots \le t_{m_n}^{(n)} = t \right\}, \qquad \|\Pi_n\| \to 0,$$
such that for the sets
$$A_n = \left\{ L_t > 0,\ V_t^{(q)}(\Pi_n) \le K \right\}$$
it holds that
$$P(A_n) \ge \eta\, P(L_t > 0).$$
Thus from

$$V_t^{(p)}(\Pi_n) = \sum_{i=0}^{m_n - 1} \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right|^q \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right|^{p-q} \le V_t^{(q)}(\Pi_n) \left( \max_{0 \le i < m_n} \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right| \right)^{p-q} \le K \left( \max_{0 \le i < m_n} \left| X_{t_{i+1}^{(n)}} - X_{t_i^{(n)}} \right| \right)^{p-q} \quad \text{on } A_n,$$
it is inferred that
$$P\left( \lim_n V_t^{(p)}(\Pi_n) = 0 \right) \ge \eta\, P(L_t > 0) > 0,$$
which contradicts the convergence
$$P\left( \left| V_t^{(p)}(\Pi) - L_t \right| \ge \varepsilon \right) < \delta$$
for all $\Pi$ with $\|\Pi\| < \delta_0(\varepsilon)$, for some $\delta_0(\varepsilon)$, for any given $\varepsilon, \delta > 0$.

Problem 6 Let $X$ be in $\mathcal{M}_2^c$, and let $T$ be a stopping time of $\{\mathcal{F}_t\}$. If $\langle X \rangle_T = 0$ a.s., then

$$P(X_{T \wedge t} = 0 \text{ for all } 0 \le t < \infty) = 1.$$

Proof. Since $\langle X \rangle$ is continuous and nondecreasing and $t \wedge T \le T$, we have

$$P(\langle X \rangle_{t \wedge T} = 0 \text{ for all } 0 \le t < \infty) = 1.$$
Since $M_t = X_t^2 - \langle X \rangle_t$ is a continuous martingale, by the optional sampling theorem
$$0 = E(M_0) = E(M_{t \wedge T}) = E\left( X_{t \wedge T}^2 \right) - E(\langle X \rangle_{t \wedge T}) = E\left( X_{t \wedge T}^2 \right),$$
which implies $X_{t \wedge T} = 0$ a.s. for each fixed $0 \le t < \infty$. Consequently

$$P\left( X_{r \wedge T} = 0 \text{ for all } r \in \mathbb{Q} \cap [0,\infty) \right) = P\left( \bigcap_{r \in \mathbb{Q} \cap [0,\infty)} \{ X_{r \wedge T} = 0 \} \right) \ge 1 - \sum_{r \in \mathbb{Q} \cap [0,\infty)} P(X_{r \wedge T} \ne 0) = 1.$$
For any $t \in [0,\infty)$, by continuity we have

$$\lim_{r_k \downarrow t,\ r_k \in \mathbb{Q} \cap [0,\infty)} X_{r_k \wedge T} = X_{t \wedge T}.$$
Thus

$$P(X_{t \wedge T} = 0 \text{ for all } t \in [0,\infty)) \ge P\left( \bigcap_{r \in \mathbb{Q} \cap [0,\infty)} \{ X_{r \wedge T} = 0 \} \right) = 1$$
as desired.

Problem 7 Show that for $X, Y \in \mathcal{M}_2^c$ and $\Pi = \{t_0, t_1, \ldots, t_m\}$ a partition of $[0, t]$,

$$\lim_{\|\Pi\| \to 0} \sum_{k=1}^m (X_{t_k} - X_{t_{k-1}})(Y_{t_k} - Y_{t_{k-1}}) = \langle X, Y \rangle_t$$

in probability.

Proof. Let us take a smarter strategy, with the following roadmap: by the definition
$$\langle X, Y \rangle = \frac{1}{4}\left( \langle X + Y \rangle - \langle X - Y \rangle \right)$$
and the fact that

$$V_{X+Y}^{(2)}(t; \Pi) = \sum_{k=1}^m \left[ (X_{t_k} - X_{t_{k-1}}) + (Y_{t_k} - Y_{t_{k-1}}) \right]^2 \xrightarrow{P} \langle X + Y \rangle_t,$$
$$V_{X-Y}^{(2)}(t; \Pi) = \sum_{k=1}^m \left[ (X_{t_k} - X_{t_{k-1}}) - (Y_{t_k} - Y_{t_{k-1}}) \right]^2 \xrightarrow{P} \langle X - Y \rangle_t$$
as $\|\Pi\| \to 0$, we have only to show that
$$V_{X,Y}^{(2)}(t; \Pi) = \sum_{k=1}^m (X_{t_k} - X_{t_{k-1}})(Y_{t_k} - Y_{t_{k-1}}) \xrightarrow{P} \frac{1}{4}\left( \lim_{\|\Pi'\| \to 0} V_{X+Y}^{(2)}(t; \Pi') - \lim_{\|\Pi''\| \to 0} V_{X-Y}^{(2)}(t; \Pi'') \right).$$
Notice that in the above scheme the partitions used may be very different. Then

$$V_{X,Y}^{(2)}(t; \Pi) \xrightarrow{P} \langle X, Y \rangle_t.$$
Obviously, for the same partition $\Pi$,

$$V_{X,Y}^{(2)}(t; \Pi) = \frac{1}{4}\left( V_{X+Y}^{(2)}(t; \Pi) - V_{X-Y}^{(2)}(t; \Pi) \right),$$
which yields the result. To make this more formal: we have shown that, for any $\varepsilon, \delta > 0$, there exists some $\delta_0 > 0$ such that

$$\max\left\{ P\left( \left| V_{X+Y}^{(2)}(t; \Pi) - \langle X+Y \rangle_t \right| \ge \varepsilon \right),\ P\left( \left| V_{X-Y}^{(2)}(t; \Pi) - \langle X-Y \rangle_t \right| \ge \varepsilon \right) \right\} < \delta$$
for every $\Pi$ with $\|\Pi\| < \delta_0$, and that, for all such $\Pi$,
$$V_{X,Y}^{(2)}(t; \Pi) = \frac{1}{4}\left( V_{X+Y}^{(2)}(t; \Pi) - V_{X-Y}^{(2)}(t; \Pi) \right);$$
then
$$P\left( \left| V_{X,Y}^{(2)}(t; \Pi) - \langle X, Y \rangle_t \right| \ge \varepsilon \right) < 2\delta$$
as desired.
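The polarization argument can be illustrated numerically (an addition, not in the original notes). A sketch with $X = W$ and $Y = \rho W + \sqrt{1 - \rho^2}\, B$ for independent Brownian motions $W, B$, so that $\langle X, Y \rangle_t = \rho t$; on a common partition the identity $V_{X,Y}^{(2)} = \frac{1}{4}(V_{X+Y}^{(2)} - V_{X-Y}^{(2)})$ holds exactly, and both sides approach $\rho t$:

```python
import math
import random

random.seed(3)

# X = W, Y = rho*W + sqrt(1 - rho^2)*B with W, B independent Brownian
# motions, so <X, Y>_t = rho * t.  Polarization holds term by term.
rho, t, n = 0.6, 1.0, 2 ** 16
dt = t / n
dX, dY = [], []
for _ in range(n):
    dw = random.gauss(0.0, math.sqrt(dt))
    db = random.gauss(0.0, math.sqrt(dt))
    dX.append(dw)
    dY.append(rho * dw + math.sqrt(1 - rho ** 2) * db)

cross = sum(x * y for x, y in zip(dX, dY))           # sum of (DX)(DY)
v_plus = sum((x + y) ** 2 for x, y in zip(dX, dY))   # V^(2) of X + Y
v_minus = sum((x - y) ** 2 for x, y in zip(dX, dY))  # V^(2) of X - Y
polar = 0.25 * (v_plus - v_minus)
print(cross, polar)  # both near rho * t = 0.6
```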

Remark 8 We don't have to cope with the mess of defining
$$\Delta_k X = X_{t_k} - X_{t_{k-1}}, \qquad \Delta_k Y = Y_{t_k} - Y_{t_{k-1}}, \qquad \Delta_k \langle X, Y \rangle = \langle X, Y \rangle_{t_k} - \langle X, Y \rangle_{t_{k-1}},$$
showing
$$E\left( \sum_{k=1}^m \Delta_k X\, \Delta_k Y - \langle X, Y \rangle_t \right)^2 \longrightarrow 0,$$
and computing
$$E\left( \sum_{k=1}^m \Delta_k X\, \Delta_k Y - \langle X, Y \rangle_t \right)^2 = E\left( \sum_{k=1}^m \left[ \Delta_k X\, \Delta_k Y - \Delta_k \langle X, Y \rangle \right] \right)^2$$
$$= E\left( \sum_{k=1}^m \left\{ (\Delta_k X\, \Delta_k Y)^2 + (\Delta_k \langle X, Y \rangle)^2 - 2\, \Delta_k X\, \Delta_k Y\, \Delta_k \langle X, Y \rangle \right\} \right) + E\left( 2 \sum_{i=1}^{m-1} \sum_{j=i+1}^m \left[ \Delta_i X\, \Delta_i Y - \Delta_i \langle X, Y \rangle \right] \left[ \Delta_j X\, \Delta_j Y - \Delta_j \langle X, Y \rangle \right] \right)$$
$$\le E\left( \sup_{1 \le k \le m} |\Delta_k X| \sup_{1 \le k \le m} |\Delta_k Y| \sum_{k=1}^m |\Delta_k X\, \Delta_k Y| \right) + E\left( \sup_{1 \le k \le m} |\Delta_k \langle X, Y \rangle| \sum_{k=1}^m |\Delta_k \langle X, Y \rangle| \right) + \cdots$$
till we're bored.

Problem 9 Let $X, Y$ be in $\mathcal{M}^{c,loc}$. Then there is a stochastically unique adapted, continuous process of bounded variation $\langle X, Y \rangle$ satisfying $\langle X, Y \rangle_0 = 0$ a.s., such that $XY - \langle X, Y \rangle \in \mathcal{M}^{c,loc}$. If $X = Y$, we write $\langle X \rangle = \langle X, X \rangle$, and this process is nondecreasing.

Proof. This problem differs from the original theorem in the textbook in that here the processes are only local martingales. Thus it is more difficult, but it suffices to show
$$X \in \mathcal{M}^{c,loc} \implies X^2 - \langle X \rangle \in \mathcal{M}^{c,loc}.$$
Then we can take

$$XY - \langle X, Y \rangle = \frac{1}{4}\left\{ (X+Y)^2 - \langle X+Y \rangle - \left[ (X-Y)^2 - \langle X-Y \rangle \right] \right\},$$
which is a linear combination of local martingales and consequently a local martingale. So, let us define sequences of stopping times $\{S_n\}, \{T_n\}$ with $S_n \uparrow \infty$, $T_n \uparrow \infty$, such that
$$X_t^{(n)} = X_{t \wedge S_n}, \qquad Y_t^{(n)} = Y_{t \wedge T_n}$$
are $\{\mathcal{F}_t\}$-martingales (by the optional sampling theorem). Define
$$O_n = \inf\{ t \ge 0 : |X_t| \vee |Y_t| \ge n \}$$
and $R_n = S_n \wedge T_n \wedge O_n$, and set
$$\tilde X_t^{(n)} = X_{t \wedge R_n}, \qquad \tilde Y_t^{(n)} = Y_{t \wedge R_n}.$$
Note that $R_n \uparrow \infty$ a.s. Since
$$\tilde X_t^{(n)} = X_{t \wedge R_n} = X^{(n)}_{t \wedge R_n}, \qquad \tilde Y_t^{(n)} = Y_{t \wedge R_n} = Y^{(n)}_{t \wedge R_n},$$
these processes are also $\{\mathcal{F}_t\}$-martingales and are in $\mathcal{M}_2^c$ because they are bounded. For $m > n$,
$$\tilde X_t^{(n)} = \tilde X^{(m)}_{t \wedge R_n}$$
(key observation), and so

$$\left( \tilde X_t^{(n)} \right)^2 - \left\langle \tilde X^{(m)} \right\rangle_{t \wedge R_n} = \left( \tilde X^{(m)}_{t \wedge R_n} \right)^2 - \left\langle \tilde X^{(m)} \right\rangle_{t \wedge R_n}$$
is a martingale. This implies
$$\left\langle \tilde X^{(n)} \right\rangle_t = \left\langle \tilde X^{(m)} \right\rangle_{t \wedge R_n} \tag{11}$$
by the uniqueness in the Doob-Meyer decomposition. Thus we can define

$$\langle X \rangle_t = \left\langle \tilde X^{(n)} \right\rangle_t$$
whenever $t \le R_n$, and be assured that $\langle X \rangle$ is well defined. The process $\langle X \rangle$ is adapted, continuous, and nondecreasing, and satisfies $\langle X \rangle_0 = 0$ a.s. Furthermore,
$$X^2_{t \wedge R_n} - \langle X \rangle_{t \wedge R_n} = \left( \tilde X_t^{(n)} \right)^2 - \left\langle \tilde X^{(n)} \right\rangle_t \tag{12}$$
is a martingale for each $n$, so $X^2 - \langle X \rangle \in \mathcal{M}^{c,loc}$. By Theorem 5.13 in the text, we can take
$$\langle X, Y \rangle = \frac{1}{4}\left( \langle X+Y \rangle - \langle X-Y \rangle \right).$$

Then $XY - \langle X, Y \rangle \in \mathcal{M}^{c,loc}$. For the uniqueness, suppose both $A$ and $B$ satisfy the conditions required of $\langle X, Y \rangle$. Then $M = XY - A$ and $N = XY - B$ are in $\mathcal{M}^{c,loc}$, so just as before we can construct a sequence $\{R_n\}$ of stopping times with $R_n \uparrow \infty$ such that $M_t^{(n)} = M_{t \wedge R_n}$ and $N_t^{(n)} = N_{t \wedge R_n}$ are in $\mathcal{M}_2^c$. Consequently $M_t^{(n)} - N_t^{(n)} = B_{t \wedge R_n} - A_{t \wedge R_n} \in \mathcal{M}_2^c$, and being of bounded variation this process must be identically zero (see the proof of Theorem 5.13). It follows that $A = B$.

Problem 10 Establish the following:
i) A local martingale of class DL is a martingale.
ii) A nonnegative local martingale is a supermartingale.

iii) If $M \in \mathcal{M}_2^{c,loc}$ and $S$ is a stopping time of $\{\mathcal{F}_t\}$, then $E\left( M_S^2 \right) \le E(\langle M \rangle_S)$, where $M_\infty = \lim_{t \to \infty} M_t$ (whenever this limit exists).

Proof. i) By hypothesis, there exists a sequence of stopping times $T_n \le T_{n+1}$ with $T_n \uparrow \infty$ a.s. such that for each $n$ the stopped process $\{X_{t \wedge T_n}, \mathcal{F}_t\}$ is a martingale. Since the sequence $\{t \wedge T_n : n \ge 1\}$ is bounded by $t$, the DL property implies that $\{X_{t \wedge T_n}\}_n$ is uniformly integrable, so

$$E(X_{t \wedge T_n} \mid \mathcal{F}_s) = X_{s \wedge T_n}$$
a.s. implies
$$\lim_n E(X_{t \wedge T_n} \mid \mathcal{F}_s) = E(X_t \mid \mathcal{F}_s) = \lim_n X_{s \wedge T_n} = X_s,$$
which justifies the martingale property.
ii) Since

$$E(X_{t \wedge T_n} \mid \mathcal{F}_s) = X_{s \wedge T_n}$$
a.s. for all $s \le t$ and for each $T_n$, then (because the limits exist a.s.)
$$X_s = \lim_n X_{s \wedge T_n} = \lim_n E(X_{t \wedge T_n} \mid \mathcal{F}_s),$$
and by the conditional Fatou lemma,
$$E(X_t \mid \mathcal{F}_s) = E\left( \varliminf_n X_{t \wedge T_n} \,\Big|\, \mathcal{F}_s \right) \le \varliminf_n E(X_{t \wedge T_n} \mid \mathcal{F}_s) = X_s.$$
Consequently
$$X_s \ge E(X_t \mid \mathcal{F}_s),$$
so a nonnegative local martingale is a supermartingale.

iii) With $\{T_n\}$ a localizing sequence such that $M^2_{t \wedge T_n} - \langle M \rangle_{t \wedge T_n}$ is a martingale, the optional sampling theorem gives
$$E\left( M^2_{S \wedge T_n \wedge t} \right) = E\left( \langle M \rangle_{S \wedge T_n \wedge t} \right) \le E(\langle M \rangle_S)$$
for each $n$ and $t$. Letting $n, t \to \infty$ and applying Fatou's lemma on the left side yields
$$E\left( M_S^2 \right) \le E(\langle M \rangle_S).$$
Essentially, this establishes (local) uniform integrability.

Remark 11 (Local martingale.) For any $s \le t \le T_n$, we have $t \wedge T_n = t$, $s \wedge T_n = s$, and
$$E(X_t \mid \mathcal{F}_s) = X_s.$$
For $t > T_n$ we have
$$E(X_{T_n} \mid \mathcal{F}_s) = X_s.$$
So it indeed is a martingale locally.

Problem 12 Let $M = \{M_t, \mathcal{F}_t\} \in \mathcal{M}_2^{c,loc}$ and assume that its process $\langle M \rangle$ is integrable: $E(\langle M \rangle_\infty) < \infty$. Then:
i) $M$ is a martingale, and $M$ and $M^2$ are both uniformly integrable; in particular, $M_\infty = \lim_{t \to \infty} M_t$ exists a.s. and $E\left( M_\infty^2 \right) = E(\langle M \rangle_\infty)$;
ii) we may take a right-continuous modification of $Z_t = E\left( M_\infty^2 \mid \mathcal{F}_t \right) - M_t^2$, $t \ge 0$, which is a potential.

Proof. Essentially, establish (local) uniform integrability and use a modification. If $M \in \mathcal{M}_2$, then $E\left( M_t^2 \right) = E(\langle M \rangle_t) \le E(\langle M \rangle_\infty)$. If $M \in \mathcal{M}_2^{c,loc}$, then $E\left( M_S^2 \right) \le E(\langle M \rangle_S) \le E(\langle M \rangle_\infty)$ for every stopping time $S$. Consequently $\{M_S\}_S$ is uniformly integrable (by Hölder's inequality). Thus $M_\infty = \lim_{t \to \infty} M_t$ exists a.s., $E(M_\infty \mid \mathcal{F}_t) = M_t$ a.s., and by Fatou's lemma

$$E\left( M_\infty^2 \right) = E\left( \lim_t M_t^2 \right) \le \varliminf_t E\left( M_t^2 \right) = E(\langle M \rangle_\infty), \tag{13}$$
and Jensen's inequality yields
$$M_t^2 \le E\left( M_\infty^2 \mid \mathcal{F}_t \right).$$
It follows that $M^2$ has a last element, i.e., that $\{M_t^2, \mathcal{F}_t : 0 \le t \le \infty\}$ is a submartingale, which is uniformly integrable, and (13) holds with equality. Letting $Z_t = E\left( M_\infty^2 \mid \mathcal{F}_t \right) - M_t^2$ gives the required process.

Problem 13 Let $M \in \mathcal{M}^{c,loc}$ and show that, for any stopping time $T$ of $\{\mathcal{F}_t\}$,

$$P\left[ \max_{0 \le t \le T} |M_t| \ge \varepsilon \right] \le \frac{E(\delta \wedge \langle M \rangle_T)}{\varepsilon^2} + P[\langle M \rangle_T \ge \delta]$$
for any $\varepsilon, \delta > 0$. In particular, for a sequence $\{M^{(n)}\} \subset \mathcal{M}^{c,loc}$ we have
$$\langle M^{(n)} \rangle_T \xrightarrow{P} 0 \implies \max_{0 \le t \le T} \left| M_t^{(n)} \right| \xrightarrow{P} 0.$$


Proof. Take $X = M^2$ and $A = \langle M \rangle$ and use Problem 3 and Corollary 4 above:

$$P\left[ \max_{0 \le t \le T} |M_t| \ge \varepsilon \right] = P\left[ \max_{0 \le t \le T} M_t^2 \ge \varepsilon^2 \right] \le \frac{E(\delta \wedge \langle M \rangle_T)}{\varepsilon^2} + P[\langle M \rangle_T \ge \delta].$$
The second claim follows readily by substitution.
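A quick simulation check of the inequality (added here, not part of the original notes), assuming $M = W$ on $[0, 1]$, so $\langle M \rangle_1 = 1$ deterministically; choosing $\delta \ge 1$ kills the second term, and the bound reduces to $P[\max_{t \le 1} |W_t| \ge \varepsilon] \le 1/\varepsilon^2$:

```python
import math
import random

random.seed(4)

# M = W on [0, T] with T = 1, so <M>_T = 1 deterministically.  With
# delta = 2 we get P[<M>_T >= delta] = 0 and the bound reads
#   P[max_{t<=T} |M_t| >= eps] <= E(delta ^ <M>_T) / eps^2 = 1 / eps^2.
eps, delta, T = 2.0, 2.0, 1.0
n_steps, n_paths = 500, 4000
hits = 0
for _ in range(n_paths):
    w = 0.0
    m = 0.0
    for _ in range(n_steps):
        w += random.gauss(0.0, math.sqrt(T / n_steps))
        m = max(m, abs(w))
    if m >= eps:
        hits += 1

lhs = hits / n_paths            # estimates P[max |W_t| >= 2]
rhs = min(delta, T) / eps ** 2  # = 0.25
print(lhs, "<=", rhs)
```

The hitting probability comes out well under the bound, which is expected: Lenglart-type inequalities of this kind are crude but dimension-free, and their real use is the convergence implication for sequences stated above.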
