Martingale Calculus and a Maximal Inequality for Supermartingales
B. Hajek
Department of Electrical and Computer Engineering and the Coordinated Science Laboratory
University of Illinois at Urbana-Champaign
March 15, 2010

Abstract: In the first hour of this two-part presentation, the calculus of semimartingales, which includes martingales with both continuous and discrete components, will be reviewed. In the second hour of the presentation, a tight upper bound is given involving the maximum of a supermartingale. Specifically, it is shown that if $Y$ is a semimartingale with initial value zero and quadratic variation process $[Y,Y]$ such that $Y + [Y,Y]$ is a supermartingale, then the probability that the maximum of $Y$ is greater than or equal to a positive constant $a$ is less than or equal to $1/(1+a)$. The proof uses the semimartingale calculus and is inspired by dynamic programming. If $Y$ has stationary independent increments, the bounds of J.F.C. Kingman apply to this situation. Complements and extensions will also be given. (Preliminary paper posted to arXiv: http://arxiv.org/abs/0911.4444)

Outline
- Brief review of martingale calculus
- Kingman's moment bound
- Bound on the maximum of a supermartingale (with no SII assumption)
- The big jump construction
- Proof of the upper bound
- Discrete time, version I
- Discrete time, version II
- A lemma
- Comparison to Doob's moment bounds

PART I: Brief review of martingale calculus

See, for example, [4, 5, 7, 8, 9] for more detail.

The usual underlying conditions
- Assume $(\Omega, \mathcal{F}, P)$ is complete (subsets of events with probability zero are events).
- Assume the filtration of $\sigma$-algebras $\mathcal{F}_\bullet = (\mathcal{F}_t : t \ge 0)$ is right-continuous, and each $\mathcal{F}_t$ includes all zero-probability events.
- Thus martingales, supermartingales, and submartingales have càdlàg (right continuous with finite left limits) versions. We assume in these slides that such versions are used, without further explicit mention.

Predictable processes and $L^2$ stochastic integrals
- $\mathcal{P}$ = predictable subsets of $\mathbb{R}_+ \times \Omega$ (the $\sigma$-algebra of subsets of $\mathbb{R}_+ \times \Omega$ generated by random processes $U(\omega) I_{(a,b]}(t)$, where $U$ is $\mathcal{F}_a$-measurable).
- A process $H = (H(t,\omega) : t \ge 0)$ is predictable if it is a $\mathcal{P}$-measurable function of $(t,\omega)$.
- $X$ is said to admit a predictable compensator $A$ if $A$ is a predictable process and $X - A$ is a martingale.
- If $M$ is an $L^2$ martingale (so $\sup_{t \ge 0} M_t^2$ is integrable), then $M_t^2$ has a predictable compensator, written $\langle M, M \rangle$.
- Integrals $H \bullet M = \left( \int_0^t H_s \, dM_s : t \ge 0 \right)$ for such $M$ and a class of predictable processes $H$ can be defined by focusing on the isometry, for $t$ fixed:
  $E[(H \bullet M_t)^2] = E[H^2 \bullet \langle M, M \rangle_t].$
- $H \bullet M$ is then also a martingale.

Localization in time and semimartingales
- A process $X = (X_t : t \ge 0)$ stopped at a stopping time $T$:
  $X_t^T = \begin{cases} 0 & \text{if } T = 0 \\ X_t & \text{if } 0 \le t \le T \text{ and } T > 0 \\ X_T & \text{if } t \ge T \text{ and } T > 0 \end{cases}$
- $M$ is a local martingale if there is a sequence of stopping times $(T_n)$ with $T_n \le T_{n+1}$ and $T_n \to \infty$ so that $M^{T_n}$ is a martingale for all $n$. (A simulation sketch of a stopped martingale follows after this slide.)
- $H$ is locally bounded if there is such a sequence $(T_n)$ so that $H^{T_n}$ is bounded for each $n$.
- A semimartingale is a random process $X$ that can be represented as the sum of a local martingale and a (càdlàg) adapted process of locally finite variation.
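As a quick illustration of stopping, here is a minimal simulation sketch (not from the slides; the symmetric walk, the barrier, and the helper name stopped_value are illustrative choices). It stops a simple random walk, which is a martingale, at the first exit from an interval and checks numerically that the stopped process still has constant mean.

```python
# Minimal sketch: stopping preserves the martingale property.
# X is a symmetric simple random walk with X_0 = 0 (a martingale), and
# T is the first time |X| reaches `barrier`. The stopped process X^T
# should satisfy E[X^T_t] = X_0 = 0 for every t.
import random

def stopped_value(n_steps, barrier=5):
    """Return X^T at time n_steps: the walk frozen once it leaves (-barrier, barrier)."""
    x = 0
    for _ in range(n_steps):
        x += random.choice([-1, 1])
        if abs(x) >= barrier:
            return x  # the path stays at X_T from this time on
    return x

random.seed(1)
n_steps, n_runs = 200, 20000
mean_final = sum(stopped_value(n_steps) for _ in range(n_runs)) / n_runs
print(f"E[X^T_{n_steps}] ~ {mean_final:+.3f} (should be near 0)")
```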
Semimartingales as integrators
- One can define $H \bullet X = \left( \int_0^t H_s \, dX_s : t \ge 0 \right)$ for $H$ locally bounded and predictable, and $X$ a semimartingale.
- Use $(T_n)$ such that $X_t^{T_n} = X_0^n + M_t^n + A_t^n$, where $X_0^n$ is bounded, $\sup_t |M_t^n|^2$ is integrable, the variation of $A^n$ is bounded, and $H^{T_n}$ is bounded.
- Define $H \bullet M^n$ as an $L^2$ stochastic integral and $H \bullet A^n$ as a Lebesgue-Stieltjes integral.
- Let $H \bullet X_t = \lim_{n \to \infty} (H \bullet M^n + H \bullet A^n)_t$.
- It can be shown that the limit exists and is the same for all choices of $(T_n)$, $A^n$, $M^n$.
- Note: $\Delta (H \bullet X)_t = H_t \, \Delta X_t$.

Quadratic variation processes
- $[Y,Y]$ for a semimartingale $Y$ is defined by
  $[Y,Y]_t = \lim_{n \to \infty} \sum_{i=0}^{k_n - 1} \left( Y_{t_{i+1}^n} - Y_{t_i^n} \right)^2$
  for any partitions $0 = t_0^n < t_1^n < \cdots < t_{k_n}^n = t$ such that $\max_i |t_{i+1}^n - t_i^n| \to 0$ as $n \to \infty$.
- Decomposition: $[Y,Y]_t = \sum_{s \le t} (\Delta Y_s)^2 + [Y,Y]_t^c$.
- If $M$ is a square integrable martingale, then $M^2 - [M,M]$ and $[M,M] - \langle M, M \rangle$ are martingales. In particular, $\langle M, M \rangle$ is the predictable compensator of both $M^2$ and $[M,M]$.
- If $Y_t = X_t + bt$, then $[X,X] = [Y,Y]$.
- Define $[X,Y]$ similarly, or as $\frac{1}{2}\left( [X+Y, X+Y] - [X,X] - [Y,Y] \right)$.
- If either $X$ or $Y$ has locally finite variation, then
  $[X,Y]_t = X_0 Y_0 + \sum_{0 < s \le t} \Delta X_s \, \Delta Y_s.$

Example: Brownian motion
Let $W$ denote standard Brownian motion. Then, as is well known,
$[W,W]_t = [W,W]_t^c = \langle W, W \rangle_t = t.$

Example: Poisson process
- Let $N$ be a rate $\lambda$ Poisson process.
- $\lambda t$ is the predictable compensator of $N$, so $M_t = N_t - \lambda t$ defines a martingale $M$.
- $[N,N]_t = [M,M]_t = \sum_{0 < s \le t} (\Delta M_s)^2 = \sum_{0 < s \le t} \Delta N_s = N_t$.
- So $[M,M]^c \equiv 0$ and $\langle M, M \rangle_t = \lambda t$.

Generalized Itô formula
Let $F$ be a twice continuously differentiable function and let $X$ be a semimartingale. The generalized Itô formula (also known as the Doléans-Dade-Meyer change of variables formula) is:
$F(X_t) = F(X_0) + \int_0^t F'(X_{u-}) \, dX_u + \sum_{0 < u \le t} \left[ F(X_u) - F(X_{u-}) - F'(X_{u-}) \Delta X_u \right] + \frac{1}{2} \int_0^t F''(X_u) \, d[X,X]_u^c,$
for $t \ge 0$.

PART II: Kingman's moment bound in discrete and continuous time

Kingman's [6] moment bound for random walks
Let $S_0 = 0$ and $S_k = U_1 + \cdots + U_k$ for $k \ge 1$, where $U_1, U_2, \ldots$ are iid with mean $-\mu$ and variance $\sigma^2 < \infty$. Let $S^* = \sup_{k \ge 0} S_k$. Kingman's bound is
$E[S^*] \le \frac{\sigma^2}{2\mu}. \quad (1)$

Proof of Kingman's bound
Let $W_n = \max\{0, U_1, U_1 + U_2, \ldots, U_1 + \cdots + U_n\}$. Then $W_n \nearrow S^*$ as $n \nearrow \infty$, so by monotone convergence, $E[W_n] \nearrow E[S^*]$. Now
$W_{n+1} = \max\{0, \; U_1 + \max\{0, U_2, U_2 + U_3, \ldots, U_2 + \cdots + U_{n+1}\}\},$
where the inner maximum has the same distribution as $W_n$, so
$W_{n+1} \stackrel{d}{=} (U + W_n)_+, \quad (2)$
with $U$ distributed as $U_1$ and independent of $W_n$. Trivially,
$U + W_n = (U + W_n)_+ - (U + W_n)_-. \quad (3)$
Taking expectations on both sides of (3) and applying (2) and the fact $E[W_{n+1}] \ge E[W_n]$ yields
$E[(U + W_n)_-] \ge \mu. \quad (4)$
Squaring both sides of (3), using $(a)_+ (a)_- \equiv 0$, taking expectations, and using (2) and $E[W_{n+1}^2] \ge E[W_n^2]$ yields
$E[U^2] - 2\mu E[W_n] \ge E[(U + W_n)_-^2]. \quad (5)$
Rearranging (5) and applying (4) yields
$E[W_n] \le \frac{E[U^2] - E[(U + W_n)_-^2]}{2\mu} \le \frac{E[U^2] - \mu^2 - E[(U + W_n)_-^2] + E[(U + W_n)_-]^2}{2\mu} = \frac{\sigma^2 - \mathrm{Var}((U + W_n)_-)}{2\mu} \le \frac{\sigma^2}{2\mu},$
where the second inequality uses $E[(U + W_n)_-]^2 \ge \mu^2$, by (4), and the equality uses $E[U^2] = \sigma^2 + \mu^2$. Finally, letting $n \to \infty$ yields (1).

Kingman's bound in continuous time
Kingman's bound readily extends to continuous time. Let $Y$ be a stationary independent increment (SII) process with $Y_0 = 0$ such that for some $\mu > 0$ and $\sigma^2 > 0$,
$E[Y_t] = -\mu t \quad \text{and} \quad \mathrm{Var}(Y_t) = \sigma^2 t. \quad (6)$
Then $E[Y^*] \le \frac{\sigma^2}{2\mu}$.

Proof of Kingman's bound in continuous time
For each integer $n \ge 0$, let $S^n$ denote the random walk $S_k^n = Y_{k 2^{-n}}$. Let $S^{n*} = \sup_{k \ge 0} S_k^n$. By Kingman's moment bound for discrete time processes,
$E[S^{n*}] \le \frac{\mathrm{Var}(S_1^n)}{-2 E[S_1^n]} = \frac{\sigma^2 2^{-n}}{2 \mu 2^{-n}} = \frac{\sigma^2}{2\mu}.$
Since $S^{n*}$ is nondecreasing in $n$ and converges a.s. to $Y^*$, the result follows.
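Returning to the discrete-time bound (1), the following minimal Monte Carlo sketch (not from the slides; the Gaussian step distribution, horizon, and parameter values are illustrative choices) estimates $E[S^*]$ for a random walk with step mean $-\mu$ and variance $\sigma^2$ and compares it to the bound. Since the walk drifts to $-\infty$, its supremum over a long finite horizon approximates $S^*$ well.

```python
# Minimal Monte Carlo sketch of Kingman's bound E[S*] <= sigma^2/(2*mu)
# for S_k = U_1 + ... + U_k with iid steps, E[U] = -mu, Var(U) = sigma^2.
import random

mu, sigma = 0.5, 1.0          # step mean is -mu, step variance is sigma^2
n_steps, n_runs = 1000, 5000  # long horizon, since the walk drifts to -infinity

def sup_of_walk():
    s, s_max = 0.0, 0.0
    for _ in range(n_steps):
        s += random.gauss(-mu, sigma)  # Gaussian steps, chosen for concreteness
        s_max = max(s_max, s)
    return s_max

random.seed(2)
estimate = sum(sup_of_walk() for _ in range(n_runs)) / n_runs
print(f"E[S*] ~ {estimate:.3f} vs bound sigma^2/(2*mu) = {sigma**2 / (2*mu):.3f}")
```

For Gaussian steps the estimate typically lands noticeably below the bound; the bound is approached in the small-step limit, e.g., by Brownian motion with drift, for which $E[Y^*] = \sigma^2/(2\mu)$ exactly.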
Reformulation of Kingman's bound in continuous time
- Suppose $Y$ is SII with $Y_0 = 0$. Then:
  - $E[Y_t] = -\mu t \iff Y_t + \mu t$ is a martingale.
  - $\mathrm{Var}(Y_t) = \sigma^2 t \iff [Y,Y]_t - \sigma^2 t$ is a martingale. ($M_t = Y_t + \mu t$ and $[M,M]_t - \sigma^2 t$ are martingales; $[M,M] = [Y,Y]$.)
- So (6) is equivalent to:
  $Y_t + \mu t$ and $[Y,Y]_t - \sigma^2 t$ are martingales. $\quad (7)$
- Let $\gamma = \frac{\mu}{\sigma^2}$. Then
  $Y_t + \gamma [Y,Y]_t = (Y_t + \mu t) + \gamma \left( [Y,Y]_t - \sigma^2 t \right),$
  so (7) implies that $Y + \gamma [Y,Y]$ is a martingale. We have:

Proposition (Kingman bound in continuous time)
Let $Y$ be a SII process with $Y_0 = 0$ such that $Y + \gamma [Y,Y]$ is a supermartingale. Then $E[Y^*] \le \frac{1}{2\gamma}$. Also, for any $a > 0$, $P\{Y^* \ge a\} \le \frac{1}{2a\gamma}$.

PART III: Bound on maximum of a supermartingale (no SII assumption, continuous time)

Suppose $\gamma > 0$.
- Condition 1: $Y$ is a semimartingale with $Y_0 = 0$, and $Y + \gamma [Y,Y]$ is a supermartingale.
- Condition 2 (stronger than Condition 1): $Y_0 = 0$, $\gamma = \frac{\mu}{\sigma^2}$, and both $(Y_t + \mu t : t \ge 0)$ and $([Y,Y]_t - \sigma^2 t : t \ge 0)$ are supermartingales.

Condition 2 implies Condition 1 because
$Y_t + \gamma [Y,Y]_t = (Y_t + \mu t) + \gamma \left( [Y,Y]_t - \sigma^2 t \right).$
Let $Y^* = \sup\{Y_t : t \ge 0\}$.

Supermartingale bound
Proposition 1. Under Condition 1 or 2, for $a \ge 0$:
(a) The following holds:
$P\{Y^* \ge a\} \le \frac{1}{1 + \gamma a}. \quad (8)$
(b) Equality holds in (8) if and only if the following is true, with $T = \inf\{t \ge 0 : Y_t \ge a\}$: $(Y_{t \wedge T} : t \ge 0)$ has no continuous martingale component, $Y$ is sample-continuous over $[0, T)$ with probability one, $P(Y_T = a \mid T < \infty) = 1$, and $\left( (Y + \gamma [Y,Y])_{t \wedge T} : t \ge 0 \right)$ is a martingale.
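As a numerical sanity check on (8), here is a minimal sketch (not from the slides; the drifted Brownian example and all parameter values are illustrative choices). For $Y_t = \sigma W_t - \mu t$ we have $[Y,Y]_t = \sigma^2 t$, so Condition 2, and hence Condition 1, holds with $\gamma = \mu/\sigma^2$, and the exact value $P\{Y^* \ge a\} = e^{-2\mu a/\sigma^2}$ is known and lies below the bound $1/(1 + \gamma a)$.

```python
# Minimal Monte Carlo sketch of the bound P{Y* >= a} <= 1/(1 + gamma*a)
# on the example Y_t = sigma*W_t - mu*t, for which [Y,Y]_t = sigma^2 * t
# and Y + gamma*[Y,Y] = sigma*W is a martingale when gamma = mu/sigma^2.
import math
import random

mu, sigma, a = 0.5, 1.0, 2.0
gamma = mu / sigma**2
dt, horizon, n_runs = 0.02, 50.0, 5000  # the drift makes later crossings negligible

def crosses(level):
    """Simulate one discretized path of Y and report whether it reaches `level`."""
    y, t = 0.0, 0.0
    while t < horizon:
        y += -mu * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
        if y >= level:
            return True
        t += dt
    return False

random.seed(3)
p_hat = sum(crosses(a) for _ in range(n_runs)) / n_runs
print(f"P(Y* >= {a}) ~ {p_hat:.3f} (discretization biases this slightly low)")
print(f"exact exp(-2*mu*a/sigma^2) = {math.exp(-2*mu*a / sigma**2):.3f}")
print(f"bound 1/(1 + gamma*a)      = {1 / (1 + gamma*a):.3f}")
```

The strict gap between the estimate and the bound is consistent with part (b) of Proposition 1: equality requires that $Y$ have no continuous martingale component up to $T$, whereas this example is driven entirely by one.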