
Stochastic Integration

Prakash Balachandran
Department of Mathematics, Duke University

June 11, 2008

These notes are based on Durrett's Stochastic Calculus, Revuz and Yor's Continuous Martingales and Brownian Motion, and Kuo's Introduction to Stochastic Integration.

1 Preliminaries

Definition: A continuous-time process $X_t$ is said to be a continuous local martingale w.r.t. $\{\mathcal{F}_t; t \geq 0\}$ if there are stopping times $T_n \uparrow \infty$ such that
$$X_t^{T_n} = \begin{cases} X_{T_n \wedge t} & \text{on } \{T_n > 0\} \\ 0 & \text{on } \{T_n = 0\} \end{cases}$$
is a martingale w.r.t. $\{\mathcal{F}_{t \wedge T_n} : t \geq 0\}$. The stopping times $\{T_n\}$ are said to reduce $X$.

Remarks:

1. I brooded over why we set $X_t^T = 0$ on $\{T = 0\}$ in the definition, and this is the only explanation I could find: if we defined
$$X_t^T = X_{T \wedge t} \quad \text{on } \{T \geq 0\},$$
then $X_0^T = X_0$, so according to the above definition of a local martingale, $X_t^T$ a martingale implies
$$E[X_0^T] = E[X_0] < \infty.$$
So, with this definition of $X_t^T$, $X_0$ would have to be integrable.

Since we want to consider more general processes in which $X_0$ need not be integrable, we set
$$X_t^T = \begin{cases} X_{T \wedge t} & \text{on } \{T > 0\} \\ 0 & \text{on } \{T = 0\} \end{cases}$$
so that
$$X_0^T = \begin{cases} X_0 & \text{on } \{T > 0\} \\ 0 & \text{on } \{T = 0\} \end{cases}$$
and, according to the definition of a local martingale, $X_t^T$ a martingale implies that
$$E[X_0^T] = E[X_0; T > 0] < \infty,$$
which does not necessarily imply that $X_0$ is integrable, since $E[X_0; T > 0] \leq E[X_0]$. Thus, our definition of a continuous local martingale frees us from integrability of $X_0$.

2. We say that a process $Y$ is locally $A$ if there is a sequence of stopping times $T_n \uparrow \infty$ such that the stopped processes $Y_t^{T_n}$ have property $A$.

Now, you might ask why the hell we should care about continuous local martingales. Again, I thought about this a lot, and these are the only reasons I could salvage:

Example 1: Let $B_t = \left(B_t^{(1)}, \ldots, B_t^{(n)}\right)$ be $n$-dimensional Brownian motion, where $\left\{B_t^{(j)}\right\}_{j=1}^{n}$ are independent Brownian motions on $\mathbb{R}$.
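As a minimal numerical sketch (not part of the notes' development), such an $n$-dimensional Brownian motion can be approximated on a time grid by summing independent $N(0, \Delta t)$ increments coordinatewise. The dimension, horizon, step size, and seed below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 3          # dimension (arbitrary choice)
T = 1.0        # time horizon (arbitrary choice)
steps = 1000
dt = T / steps

# Independent N(0, dt) increments for each coordinate B^(j).
dB = rng.normal(0.0, np.sqrt(dt), size=(steps, n))

# B_t on the grid 0, dt, 2*dt, ..., T, with B_0 = 0.
B = np.vstack([np.zeros(n), np.cumsum(dB, axis=0)])

# The radial process ||B_t|| that Example 1 goes on to study.
radius = np.linalg.norm(B, axis=1)

print(B.shape, radius.shape)  # → (1001, 3) (1001,)
```

Each row of `B` is one grid-time snapshot of the $n$ coordinates, so the radial process is just the rowwise Euclidean norm.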
Suppose we're interested in the process $\|B_t\| = \sqrt{\left(B_t^{(1)}\right)^2 + \cdots + \left(B_t^{(n)}\right)^2}$; it can be shown that:
$$\|B_t\| = \sum_{j=1}^{n} \int_0^t \frac{B_s^{(j)}}{\|B_s\|}\, dB_s^{(j)} + \frac{n-1}{2} \int_0^t \frac{1}{\|B_s\|}\, ds$$
and that
$$W_t = \sum_{j=1}^{n} \int_0^t \frac{B_s^{(j)}}{\|B_s\|}\, dB_s^{(j)}$$
is a Brownian motion, and hence a martingale.

Now, what about the second integral above? It's not immediately obvious how this integral behaves, but it's certainly not a martingale. In fact, it can be shown that $\frac{1}{\|B_t\|}$ for $t \geq 0$ is a continuous local martingale, and so $\int_0^t \frac{ds}{\|B_s\|}$ is a continuous local martingale.

Since a continuous martingale is certainly a continuous local martingale, it follows that $\|B_t\|$ is a continuous local martingale.

Now, suppose that a particle is exhibiting Brownian motion. If $w(t, \omega)$ is any suitable process which represents a quantity that varies with the distance from the origin to the particle, the process
$$W_t(\omega) = \int_0^t w(s, \omega)\, d\|B_s\|(\omega)$$
represents the total accumulation of this quantity along a path of Brownian motion. So, we need to know how to integrate processes w.r.t. continuous local martingales to evaluate this quantity (and ensure that it does, in fact, exist).

Example 2: Let $X_t$ be a continuous martingale, and let $\phi$ be a convex function (imagine that $X_t$ is the interest rate at time $t$, so that $\phi(X_t) = e^{-tX_t}$ is the present value of a dollar made in $t$ years).

Theorem 1 If $E[|\phi(X_t)|] < \infty$ for each $t$, then $\phi(X_t)$ is a submartingale.

Proof: Jensen's inequality for conditional expectation states that if $\phi$ is convex, and $E[|X_t|], E[|\phi(X_t)|] < \infty$ for each $t$, then for $s < t$:
$$\phi(X_s) = \phi\left(E[X_t \mid \mathcal{F}_s]\right) \leq E[\phi(X_t) \mid \mathcal{F}_s].$$

On the other hand, we have:

Theorem 2 $\phi(X_t)$ is always a local submartingale.

For the proof of this, see the corollary after Theorem 4.

So, if $\phi(t)$ is a cash flow from now to time $T$, then $\int_0^T \phi(t) e^{-tX_t}\, dt = \int_0^T \phi(t)\, dY_t$ is the net present value of this cash flow, where we've set $dY_t = e^{-tX_t}\, dt$. Again, we need to know how to integrate processes w.r.t.
continuous local martingales to evaluate this quantity (and ensure that it does, in fact, exist).

Example 3:

Definition: Let $L_{ad}(\Omega; L^2[a,b])$ denote the space of stochastic processes $f(t, \omega)$ satisfying:

1. $f(t, \omega)$ is adapted to the filtration $\{\mathcal{F}_t\}$ of Brownian motion.

2. $\int_a^b |f(t, \omega)|^2\, dt < \infty$ almost surely.

Definition: Define $L^2([a,b] \times \Omega)$ to be the space of $\{\mathcal{F}_t\}$-adapted processes $f(t, \omega)$ such that $\int_a^b E(|f(t, \omega)|^2)\, dt < \infty$.

Now, by Fubini's Theorem, if $f \in L^2([a,b] \times \Omega)$, then $E\left[\int_a^b |f(t, \omega)|^2\, dt\right] = \int_a^b E[|f(t, \omega)|^2]\, dt < \infty$. Thus, $\int_a^b |f(t, \omega)|^2\, dt < \infty$ almost surely, so that $f \in L_{ad}(\Omega; L^2[a,b])$. Since $f$ was arbitrary, we must have
$$L^2([a,b] \times \Omega) \subseteq L_{ad}(\Omega; L^2[a,b]).$$

Now, in stochastic calculus, one constructs the integral $\int_0^t f(s, \omega)\, dB_s$ for $f \in L^2([a,b] \times \Omega)$. In this case, we have that $\int_0^t f(s, \omega)\, dB_s$ is a martingale.

However, when $f \in L_{ad}(\Omega; L^2[a,b])$, $\int_0^t f(s, \omega)\, dB_s$ need not be a martingale.

Now, in order to proceed, we need a couple of theorems about continuous local martingales, and theorems concerning variance and covariance processes. They may seem irrelevant now, but they'll come in handy later.

2 Continuous Local Martingales

The first section was supposed to convince you why you should care about continuous local martingales. Now, we prove some theorems.

Theorem 3 (The Optional Stopping Theorem) Let $X$ be a continuous local martingale. If $S \leq T$ are stopping times, and $X_{T \wedge t}$ is a uniformly integrable martingale, then $E[X_T \mid \mathcal{F}_S] = X_S$.

Proof: The classic Optional Stopping Theorem states that: if $L \leq M$ are stopping times and $Y_{M \wedge n}$ is a uniformly integrable martingale w.r.t. $\mathcal{G}_n$, then
$$E[Y_M \mid \mathcal{G}_L] = Y_L.$$

To extend the result from discrete to continuous time, let $S_n = \frac{[2^n S] + 1}{2^n}$. Applying the discrete-time result to the uniformly integrable martingale $Y_m = X_{T \wedge m 2^{-n}}$ with $L = 2^n S_n$ and $M = \infty$, we have
$$E[X_T \mid \mathcal{F}_{S_n}] = X_{T \wedge S_n}.$$

Now, the dominated convergence theorem for conditional expectation states: if $Z_n \to$
$Z$ a.s., $|Z_n| \leq W$ for all $n$ where $E[W] < \infty$, and $\mathcal{F}_n \uparrow \mathcal{F}_\infty$, then
$$E[Z_n \mid \mathcal{F}_n] \to E[Z \mid \mathcal{F}_\infty] \quad \text{a.s.}$$

Taking $Z_n = X_T$ and $\mathcal{F}_n = \mathcal{F}_{S_n}$, and noticing that $X_{T \wedge t}$ a uniformly integrable martingale implies that $E[|X_T|] < \infty$, we have $E[X_T \mid \mathcal{F}_{S_n}] \to E[X_T \mid \mathcal{F}_S]$ a.s. Since $E[X_T \mid \mathcal{F}_{S_n}] = X_{T \wedge S_n} \to X_S$ a.s., we have that $X_S = E[X_T \mid \mathcal{F}_S]$.

Theorem 4 If $X$ is a continuous local martingale, we can always take the sequence which reduces $X$ to be
$$T_n = \inf\{t : |X_t| > n\}$$
or any other sequence $T_n' \leq T_n$ that has $T_n' \uparrow \infty$ as $n \uparrow \infty$.

Proof: Let $S_n$ be a sequence that reduces $X$. If $s < t$, then applying the optional stopping theorem to $X_r^{S_n}$ at times $r = s \wedge T_m'$ and $t \wedge T_m'$ gives:
$$E[X_{t \wedge T_m' \wedge S_n} 1_{S_n > 0} \mid \mathcal{F}_{s \wedge T_m' \wedge S_n}] = X_{s \wedge T_m' \wedge S_n} 1_{S_n > 0}.$$

Multiplying by $1_{T_m' > 0} \in \mathcal{F}_0 \subseteq \mathcal{F}_{s \wedge T_m' \wedge S_n}$:
$$E[X_{t \wedge T_m' \wedge S_n} 1_{T_m' > 0,\, S_n > 0} \mid \mathcal{F}_{s \wedge T_m' \wedge S_n}] = X_{s \wedge T_m' \wedge S_n} 1_{T_m' > 0,\, S_n > 0}.$$

As $n \uparrow \infty$, $\mathcal{F}_{s \wedge T_m' \wedge S_n} \uparrow \mathcal{F}_{s \wedge T_m'}$, and $X_{r \wedge T_m' \wedge S_n} 1_{S_n > 0,\, T_m' > 0} \to X_{r \wedge T_m'} 1_{T_m' > 0}$ for all $r \geq 0$. Since $|X_{r \wedge T_m' \wedge S_n} 1_{S_n > 0,\, T_m' > 0}| \leq m$, it follows from the dominated convergence theorem for conditional expectation that:
$$E[X_{t \wedge T_m'} 1_{T_m' > 0} \mid \mathcal{F}_{s \wedge T_m'}] = X_{s \wedge T_m'} 1_{T_m' > 0}.$$

Corollary 1 If $X$ is a continuous martingale, and $\phi$ is a convex function, then $\phi(X_t)$ is a continuous local submartingale.

Proof: By Theorem 4, we can take $T_n = \inf\{t : |X_t| > n\}$ as a sequence of stopping times that reduce $X_t$. By definition of $X_t^{T_n}$, we therefore have $|X_t^{T_n}| \leq n$, so $E[|X_t^{T_n}|] \leq n$ for $t \geq 0$.

Now, since $|X_t^{T_n}| \leq n$, $\phi(X_t^{T_n})$ is contained in $\phi([-n, n])$. Since $\phi$ is convex, it is continuous, so that $\phi([-n, n])$ is bounded. Hence, $|\phi(X_t^{T_n})| \leq M$ for some $0 < M < \infty$, and so $E[|\phi(X_t^{T_n})|] \leq M$. So, by Jensen's inequality:
$$E[\phi(X_t^{T_n}) \mid \mathcal{F}_{T_n \wedge s}] \geq \phi\left(E[X_t^{T_n} \mid \mathcal{F}_{T_n \wedge s}]\right) = \phi(X_s^{T_n}).$$

Thus, $\phi(X_t^{T_n})$ is a submartingale, so that $\phi(X_t)$ is a local submartingale.

In the proof of Theorem 4, we used the fact that $X_t^{T_n}$ is a martingale w.r.t. $\{\mathcal{F}_{t \wedge T_n}; t \geq 0\}$, as per the definition of a continuous local martingale. In general, we have

Theorem 5 Let $S$ be a stopping time.
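The dyadic approximation $S_n = \frac{[2^n S] + 1}{2^n}$ used to discretize the stopping time in the proof of Theorem 3 can be checked numerically: each $S_n$ is the smallest dyadic rational of order $n$ strictly above $S$, the sequence is nonincreasing, and $S_n - S \leq 2^{-n}$, so $S_n \downarrow S$. A minimal sketch (the sample value of $S$ is an arbitrary choice):

```python
import math

def dyadic_upper(s, n):
    """S_n = (floor(2^n * s) + 1) / 2^n: smallest dyadic of order n strictly above s."""
    return (math.floor(2**n * s) + 1) / 2**n

s = 0.7371  # a sample value of the stopping time S (arbitrary choice)
approx = [dyadic_upper(s, n) for n in range(1, 21)]

# Strictly above s, nonincreasing, and within 2^(-n) of s: S_n decreases to S.
assert all(a > s for a in approx)
assert all(a >= b for a, b in zip(approx, approx[1:]))
assert approx[-1] - s <= 2**-20
```

This is why the discrete-time optional stopping theorem, applied along the grid $m 2^{-n}$, passes to the continuous-time statement as $n \uparrow \infty$.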