Martingale Theory

CHAPTER 1

Martingale Theory

We review basic facts from martingale theory. We start with discrete-time parameter martingales and proceed to explain what modifications are needed in order to extend the results from discrete time to continuous time. The Doob-Meyer decomposition theorem for continuous semimartingales is stated but the proof is omitted. At the end of the chapter we discuss the quadratic variation process of a local martingale, a key concept in martingale theory based stochastic analysis.

1. Conditional expectation and conditional probability

In this section we review basic properties of conditional expectation. Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space and $\mathcal{G}$ a $\sigma$-algebra of measurable events contained in $\mathcal{F}$. Suppose that $X \in L^1(\Omega, \mathcal{F}, \mathbb{P})$, i.e., $X$ is an integrable random variable. There exists a unique random variable $Y$ which has the following two properties:

(1) $Y \in L^1(\Omega, \mathcal{G}, \mathbb{P})$, i.e., $Y$ is measurable with respect to the $\sigma$-algebra $\mathcal{G}$ and is integrable;

(2) for any $C \in \mathcal{G}$, we have
$$\mathbb{E}\{X; C\} = \mathbb{E}\{Y; C\}.$$

This random variable $Y$ is called the conditional expectation of $X$ with respect to $\mathcal{G}$ and is denoted by $\mathbb{E}\{X|\mathcal{G}\}$.

The existence and uniqueness of conditional expectation is an easy consequence of the Radon-Nikodym theorem in real analysis. Define two measures on $(\Omega, \mathcal{G})$ by
$$\mu\{C\} = \mathbb{E}\{X; C\}, \qquad \nu\{C\} = \mathbb{P}\{C\}, \qquad C \in \mathcal{G}.$$
It is clear that $\mu$ is absolutely continuous with respect to $\nu$. The conditional expectation $\mathbb{E}\{X|\mathcal{G}\}$ is precisely the Radon-Nikodym derivative $d\mu/d\nu$.

If $Y$ is another random variable, then we denote $\mathbb{E}\{X|\sigma(Y)\}$ simply by $\mathbb{E}\{X|Y\}$. Here $\sigma(Y)$ is the $\sigma$-algebra generated by $Y$. The conditional probability of an event $A$ is defined by
$$\mathbb{P}\{A|\mathcal{G}\} = \mathbb{E}\{I_A|\mathcal{G}\}.$$

The following two examples are helpful.

EXAMPLE 1.1. Suppose that the $\sigma$-algebra $\mathcal{G}$ is generated by a partition:
$$\Omega = \bigcup_{i=1}^{\infty} A_i, \qquad A_i \cap A_j = \emptyset \ \text{ if } i \neq j,$$
and $\mathbb{P}\{A_i\} > 0$. Then $\mathbb{E}\{X|\mathcal{G}\}$ is constant on each $A_i$ and is equal to the average of $X$ on $A_i$, i.e.,
$$\mathbb{E}\{X|\mathcal{G}\}(\omega) = \frac{1}{\mathbb{P}\{A_i\}} \int_{A_i} X \, d\mathbb{P}, \qquad \omega \in A_i.$$

EXAMPLE 1.2. The conditional expectation $\mathbb{E}\{X|Y\}$ is measurable with respect to $\sigma(Y)$. By a well known result, it must be a Borel function $f(Y)$ of $Y$. We usually write symbolically
$$f(y) = \mathbb{E}\{X|Y = y\}.$$
Suppose that $(X, Y)$ has a joint density function $p(x, y)$ on $\mathbb{R}^2$. Then we can take
$$f(y) = \frac{\int_{\mathbb{R}} x\, p(x, y)\, dx}{\int_{\mathbb{R}} p(x, y)\, dx}.$$

The following three properties of conditional expectation are often used. Here $X \in \mathcal{G}$ means that $X$ is measurable with respect to $\mathcal{G}$.

(1) If $X \in \mathcal{G}$, then $\mathbb{E}\{X|\mathcal{G}\} = X$; more generally, if $X \in \mathcal{G}$ (and $XY$ is integrable), then
$$\mathbb{E}\{XY|\mathcal{G}\} = X\,\mathbb{E}\{Y|\mathcal{G}\}.$$

(2) If $\mathcal{G}_1 \subseteq \mathcal{G}_2$, then
$$\mathbb{E}\{\mathbb{E}\{X|\mathcal{G}_1\}|\mathcal{G}_2\} = \mathbb{E}\{\mathbb{E}\{X|\mathcal{G}_2\}|\mathcal{G}_1\} = \mathbb{E}\{X|\mathcal{G}_1\}.$$

(3) If $X$ is independent of $\mathcal{G}$, then $\mathbb{E}\{X|\mathcal{G}\} = \mathbb{E}\{X\}$.

The monotone convergence theorem, the dominated convergence theorem, and Fatou's lemma are three basic convergence theorems in Lebesgue integration theory. They still hold when the usual expectation is replaced by the conditional expectation with respect to an arbitrary $\sigma$-algebra.

2. Martingales

A sequence $\mathcal{F}_* = \{\mathcal{F}_n, n \in \mathbb{Z}_+\}$ of increasing $\sigma$-algebras on $\Omega$ is called a filtration (of $\sigma$-algebras). The quadruple $(\Omega, \mathcal{F}_*, \mathcal{F}, \mathbb{P})$ with $\mathcal{F}_n \subseteq \mathcal{F}$ is a filtered probability space. We usually assume that
$$\mathcal{F} = \mathcal{F}_\infty \overset{\text{def}}{=} \bigvee_{n=0}^{\infty} \mathcal{F}_n,$$
the smallest $\sigma$-algebra containing all $\mathcal{F}_n$. Intuitively $\mathcal{F}_n$ represents the information of an evolving random system under consideration accumulated up to time $n$.

A sequence of random variables $X = \{X_n\}$ is said to be adapted to the filtration $\mathcal{F}_*$ if $X_n$ is measurable with respect to $\mathcal{F}_n$ for all $n$. The filtration $\mathcal{F}_*^X$ generated by the sequence $X$ is
$$\mathcal{F}_n^X = \sigma\{X_i,\ i \le n\}.$$
It is the smallest filtration to which the sequence $X$ is adapted.

DEFINITION 2.1. A sequence of integrable random variables $X = \{X_n, n \in \mathbb{Z}_+\}$ on a filtered probability space $(\Omega, \mathcal{F}_*, \mathbb{P})$ is called a martingale with respect to $\mathcal{F}_*$ if $X$ is adapted to $\mathcal{F}_*$ and
$$\mathbb{E}\{X_n|\mathcal{F}_{n-1}\} = X_{n-1}$$
for all $n$. It is called a submartingale or a supermartingale if in the above relation $=$ is replaced by $\ge$ or $\le$, respectively.
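To make Definition 2.1 concrete, here is a small simulation sketch (Python with NumPy; the setup of a symmetric $\pm 1$ random walk and all variable names are our own illustration, not part of the text). It checks empirically that, for the partial sums of fair coin flips, the expectation stays constant and the next increment has mean zero even conditionally on the current position.

```python
import numpy as np

rng = np.random.default_rng(0)

# Partial sums S_n = X_1 + ... + X_n of i.i.d. fair coin flips (+1/-1):
# the simplest example of a martingale with respect to its own filtration.
n_paths, n_steps = 100_000, 20
X = rng.choice([-1, 1], size=(n_paths, n_steps))
S = np.cumsum(X, axis=1)

# Consequence of E{S_n | F_{n-1}} = S_{n-1}: the expectation is constant,
# E S_n = E S_1 = 0 for every n ("on average, a fair game").
print(np.abs(S.mean(axis=0)).max())  # small, of Monte Carlo order

# Empirical version of the defining property: conditionally on the value
# of S_{n-1}, the next increment X_n still has mean approximately zero.
s_prev, inc = S[:, -2], X[:, -1]
for s in np.unique(s_prev):
    sel = s_prev == s
    if sel.sum() > 5000:            # only well-populated values of S_{n-1}
        assert abs(inc[sel].mean()) < 0.1
```

The conditional check is exactly the "gambler's gain" reading of the martingale property: knowing the current fortune gives no edge on the next play.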
If the reference filtration is not explicitly mentioned, it is usually understood from the context. In many situations we simply take the reference filtration to be the one generated by the sequence itself.

REMARK 2.2. If $X$ is a submartingale with respect to some filtration $\mathcal{F}_*$, then it is also a submartingale with respect to its own filtration $\mathcal{F}_*^X$.

The definition of a supermartingale is rather unfortunate, for a supermartingale satisfies $\mathbb{E}X_n \le \mathbb{E}X_{n-1}$; that is, the sequence of expected values is decreasing.

Intuitively, a martingale represents a fair game. The defining property of a martingale can be written as
$$\mathbb{E}\{X_n - X_{n-1}|\mathcal{F}_{n-1}\} = 0.$$
We can regard the difference $X_n - X_{n-1}$ as a gambler's gain at the $n$th play of a game. The above equation says that even after applying all the information and knowledge he has accumulated up to time $n-1$, his expected gain is still zero.

EXAMPLE 2.3. If $\{X_n\}$ is a sequence of independent and integrable random variables with mean zero, then the partial sum
$$S_n = X_1 + X_2 + \cdots + X_n$$
is a martingale.

EXAMPLE 2.4. (1) If $\{X_n\}$ is a martingale and $f: \mathbb{R} \to \mathbb{R}$ a convex function such that each $f(X_n)$ is integrable, then $\{f(X_n)\}$ is a submartingale. (2) If $\{X_n\}$ is a submartingale and $f: \mathbb{R} \to \mathbb{R}$ a convex and increasing function such that each $f(X_n)$ is integrable, then $\{f(X_n)\}$ is a submartingale.

EXAMPLE 2.5. (Martingale transform) Let $M$ be a martingale and $Z = \{Z_n\}$ adapted. Suppose further that each $Z_n$ is uniformly bounded. Define
$$N_n = \sum_{i=1}^{n} Z_{i-1}(M_i - M_{i-1}).$$
Then $N$ is a martingale. Note that in the general summand, the multiplicative factor $Z_{i-1}$ is measurable with respect to the left time point of the martingale difference $M_i - M_{i-1}$.

EXAMPLE 2.6. (Reverse martingale) Suppose that $\{X_n\}$ is a sequence of i.i.d. integrable random variables and
$$Z_n = \frac{X_1 + X_2 + \cdots + X_n}{n}.$$
Then $\{Z_n\}$ is a reverse martingale, which means that
$$\mathbb{E}\{Z_n|\mathcal{G}_{n+1}\} = Z_{n+1},$$
where $\mathcal{G}_n = \sigma\{S_i,\ i \ge n\}$ and $S_i = X_1 + \cdots + X_i$ as in Example 2.3. If we write $W_n = Z_{-n}$ and $\mathcal{H}_n = \mathcal{G}_{-n}$ for $n = -1, -2, \ldots$, then we can write
$$\mathbb{E}\{W_n|\mathcal{H}_{n-1}\} = W_{n-1}.$$

3. Basic properties

Suppose that $X = \{X_n\}$ is a submartingale. We have $\mathbb{E}X_n \le \mathbb{E}X_{n+1}$. Thus on average a submartingale is increasing. On the other hand, for a martingale we have $\mathbb{E}X_n = \mathbb{E}X_{n+1}$, which shows that it is purely noise. The Doob decomposition theorem claims that a submartingale can be decomposed uniquely into the sum of a martingale and an increasing sequence. The following example shows that the uniqueness question for the decomposition is not an entirely trivial matter.

EXAMPLE 3.1. Consider $S_n$, the sum of a sequence of independent and square integrable random variables with mean zero. Then $S_n^2$ is a submartingale. We have obviously
$$S_n^2 = (S_n^2 - \mathbb{E}S_n^2) + \mathbb{E}S_n^2.$$
From $\mathbb{E}S_n^2 = \sum_{i=1}^{n} \mathbb{E}X_i^2$ it is easy to verify that $M_n = S_n^2 - \mathbb{E}S_n^2$ is a martingale. Therefore the above identity is a decomposition of the submartingale $S_n^2$ into the sum of a martingale and an increasing process. On the other hand,
$$S_n^2 = 2\sum_{i=1}^{n} S_{i-1}X_i + \sum_{i=1}^{n} X_i^2.$$
The first sum on the right side is a martingale (in the form of a martingale transform). Thus the above gives another such decomposition. In general these two decompositions are different.

The above example shows that in order to have a unique decomposition we need further restrictions.

DEFINITION 3.2. A sequence $Z = \{Z_n\}$ is said to be predictable with respect to a filtration $\mathcal{F}_*$ if $Z_n \in \mathcal{F}_{n-1}$ for all $n \ge 1$.

THEOREM 3.3. (Doob decomposition) Let $X$ be a submartingale. Then there is a unique increasing predictable process $Z$ with $Z_0 = 0$ and a martingale $M$ such that $X_n = M_n + Z_n$.

PROOF. Suppose that we have such a decomposition. Conditioning on $\mathcal{F}_{n-1}$ in
$$Z_n - Z_{n-1} = X_n - X_{n-1} - (M_n - M_{n-1}),$$
we have
$$Z_n - Z_{n-1} = \mathbb{E}\{X_n - X_{n-1}|\mathcal{F}_{n-1}\}.$$
The right side is nonnegative if $X$ is a submartingale. This shows that a Doob decomposition, if it exists, must be unique. It is now clear how to proceed to show the existence. We define
$$Z_n = \sum_{i=1}^{n} \mathbb{E}\{X_i - X_{i-1}|\mathcal{F}_{i-1}\}.$$
It is clear that $Z$ is increasing, predictable, and $Z_0 = 0$. Define $M_n = X_n - Z_n$. We have
$$M_n - M_{n-1} = X_n - X_{n-1} - \mathbb{E}\{X_n - X_{n-1}|\mathcal{F}_{n-1}\},$$
from which it is easy to see that $\mathbb{E}\{M_n - M_{n-1}|\mathcal{F}_{n-1}\} = 0$. This shows that $M$ is a martingale.

4. Optional sampling theorem

The concept of a martingale derives much of its power from the optional sampling theorem we will discuss in this section. Most interesting events concerning a random sequence occur not at a fixed constant time, but at a random time.
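Before moving on, the Doob decomposition of Section 3 can be checked numerically. The sketch below (Python with NumPy; the choice of i.i.d. standard normal increments and all names are our own illustration) computes $Z_n$ and $M_n$ for the submartingale $S_n^2$ of Example 3.1.

```python
import numpy as np

rng = np.random.default_rng(1)

# The submartingale S_n^2 of Example 3.1 with i.i.d. N(0,1) increments X_i.
# The Doob decomposition gives
#   Z_n - Z_{n-1} = E{S_n^2 - S_{n-1}^2 | F_{n-1}}
#                 = E{2 S_{n-1} X_n + X_n^2 | F_{n-1}} = E{X_n^2} = 1,
# so Z_n = n (predictable, here even deterministic), and M_n = S_n^2 - n
# is the martingale part.
n_paths, n_steps = 200_000, 10
X = rng.normal(size=(n_paths, n_steps))
S = np.cumsum(X, axis=1)

Z = np.arange(1, n_steps + 1)   # increasing predictable part (Z_0 = 0)
M = S**2 - Z                    # candidate martingale part

# A martingale has constant expectation: E M_n = E M_1 = 0 for every n,
# so the column means should all be near zero (Monte Carlo error only).
print(np.abs(M.mean(axis=0)).max())
```

By contrast, the second decomposition in Example 3.1 has increasing part $\sum_{i=1}^n X_i^2$, which is adapted but not predictable; this is exactly why predictability is required for uniqueness in Theorem 3.3.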
