
AN INTRODUCTION TO MARKOV PROCESSES AND THEIR APPLICATIONS IN MATHEMATICAL ECONOMICS

UMUT ÇETİN

Contents
1. Markov Property
2. Brief review of martingale theory
3. Feller Processes
4. Infinitesimal generators
5. Martingale Problems and Stochastic Differential Equations
6. Linear continuous Markov processes

6. Linear continuous Markov processes

In this section we focus on one-dimensional continuous Markov processes on the real line. Our aim is to better understand their extended generators and transition functions, and to construct a diffusion process from a Brownian motion by a change of time and space. We will deal with a Markov process $X$ whose state space $E$ is an interval $(l,r)$ in $\mathbb{R}$, which may be closed, open, semi-open, bounded or unbounded. The lifetime is denoted by $\zeta$ as usual. The following are the standing assumptions on $X$ throughout the section:

(1) $X$ is continuous on $[0,\zeta[$;
(2) $X$ has the strong Markov property;
(3) $X$ is regular; i.e. if $T_x := \inf\{t > 0 : X_t = x\}$, then $P^x(T_y < \infty) > 0$ for any $x \in \mathrm{int}(E)$ and $y \in E$.

The last assumption, the regularity of $X$, is very much like the concept of irreducibility for Markov chains. Indeed, when $X$ is regular, its state space cannot be decomposed into smaller sets from which $X$ cannot exit.

For any interval $I = ]a,b[$ such that $[a,b] \subset E$, we denote by $\sigma_I$ the exit time of $I$. Note that for $x \in I$, $\sigma_I = T_a \wedge T_b$, $P^x$-a.s., and for $x \notin I$, $\sigma_I = 0$, $P^x$-a.s. We also put $m_I(x) := E^x[\sigma_I]$.

Proposition 6.1. $m_I$ is bounded on $I$. In particular, $\sigma_I$ is finite almost surely.

Proof. Let $y$ be a fixed point in $I$. Since $X$ is regular, we can find $t > 0$ such that
$$\alpha := \max\{P^y(T_a > t),\, P^y(T_b > t)\} < 1.$$
If $y < x < b$, then
$$P^x(\sigma_I > t) \le P^x(T_b > t) \le P^y(T_b > t) \le \alpha.$$
Applying the same reasoning to $a < x < y$, we thus get
$$\sup_{x \in I} P^x(\sigma_I > t) \le \alpha < 1.$$
Now, since $\sigma_I = u + \sigma_I \circ \theta_u$ on $[\sigma_I > u]$, we have
$$P^x(\sigma_I > nt) = P^x\big([\sigma_I > (n-1)t] \cap [(n-1)t + \sigma_I \circ \theta_{(n-1)t} > nt]\big).$$
Thus, in view of the Markov property,
$$P^x(\sigma_I > nt) = E^x\Big[\mathbf{1}_{[\sigma_I > (n-1)t]}\, E^{X_{(n-1)t}}\big[\mathbf{1}_{[\sigma_I > t]}\big]\Big].$$
However, on $[\sigma_I > (n-1)t]$ we have $X_{(n-1)t} \in I$; thus $E^{X_{(n-1)t}}[\mathbf{1}_{[\sigma_I > t]}] \le \alpha$, so that
$$P^x(\sigma_I > nt) \le \alpha\, P^x(\sigma_I > (n-1)t),$$
and consequently $P^x(\sigma_I > nt) \le \alpha^n$ for every $x \in I$. Therefore,
$$\sup_{x \in I} E^x[\sigma_I] = \sup_{x \in I} \int_0^\infty P^x(\sigma_I > s)\,ds = \sup_{x \in I} \sum_{n=0}^\infty \int_{nt}^{(n+1)t} P^x(\sigma_I > s)\,ds \le \sum_{n=0}^\infty t\,\alpha^n = \frac{t}{1-\alpha}.$$

In view of the above proposition, for any $l \le a < x < b \le r$ we have
$$P^x[T_a < T_b] + P^x[T_b < T_a] = 1.$$

Theorem 6.1. There exists a continuous and strictly increasing function $s$ on $E$ such that for any $a, b, x$ in $E$ with $l \le a < x < b \le r$,
$$P^x(T_b < T_a) = \frac{s(x) - s(a)}{s(b) - s(a)}.$$
Moreover, if $\tilde{s}$ is another function with the same properties, then $\tilde{s} = \alpha s + \beta$ for some $\alpha > 0$.

Proof. We will only give the proof of the formula when $E$ is a closed and bounded interval. For the rest of the proof and the extension to the general case see the proof of Proposition 3.2 in Chap. VII of Revuz & Yor (or better, try it yourself!). First, observe that
$$[T_r < T_l] = [T_r < T_l,\, T_a < T_b] \cup [T_r < T_l,\, T_b < T_a],$$
and that $T_l = T_a + T_l \circ \theta_{T_a}$ and $T_r = T_a + T_r \circ \theta_{T_a}$ on $[T_a < T_b]$. Thus,
$$P^x(T_r < T_l,\, T_a < T_b) = E^x\big[\mathbf{1}_{[T_a < T_b]}\, \mathbf{1}_{[T_r < T_l]} \circ \theta_{T_a}\big].$$
Using the strong Markov property at $T_a$ and that $X_{T_a} = a$ when $T_a < \infty$, we obtain
$$P^x(T_r < T_l,\, T_a < T_b) = P^x(T_a < T_b)\, P^a(T_r < T_l).$$
Similarly,
$$P^x(T_r < T_l,\, T_b < T_a) = P^x(T_b < T_a)\, P^b(T_r < T_l),$$
so that
$$P^x(T_r < T_l) = P^x(T_a < T_b)\, P^a(T_r < T_l) + P^x(T_b < T_a)\, P^b(T_r < T_l).$$
Setting $s(x) = P^x(T_r < T_l)$ and solving for $P^x(T_b < T_a)$, we get the formula in the statement.
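The formula in Theorem 6.1 is easy to check numerically in the simplest case. The sketch below is not part of the original notes: it simulates a standard Brownian motion, which is on natural scale, so the theorem predicts $P^x(T_b < T_a) = (x-a)/(b-a)$. The function name hit_prob_mc and all numerical parameters are illustrative choices, and the Euler discretization introduces a small overshoot bias.

```python
import numpy as np

def hit_prob_mc(x, a, b, dt=1e-3, n_paths=10_000, seed=0):
    """Crude Monte Carlo estimate of P^x(T_b < T_a) for a standard Brownian
    motion started at x in ]a, b[, using a Gaussian random-walk approximation."""
    rng = np.random.default_rng(seed)
    pos = np.full(n_paths, float(x))
    alive = np.ones(n_paths, dtype=bool)   # paths that have not yet left ]a, b[
    hit_b = np.zeros(n_paths, dtype=bool)
    while alive.any():
        pos[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
        hit_b |= alive & (pos >= b)        # exited through the upper endpoint
        alive &= (pos > a) & (pos < b)     # still strictly inside the interval
    return hit_b.mean()

a, x, b = 0.0, 0.3, 1.0
# Brownian motion is on natural scale (s(x) = x), so Theorem 6.1 predicts
# P^x(T_b < T_a) = (x - a)/(b - a) = 0.3.
print(hit_prob_mc(x, a, b), (x - a) / (b - a))
```

With these (illustrative) parameters the estimate should be close to $0.3$, up to Monte Carlo noise and discretization error.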
Definition 6.1. The function $s$ in the above result is called the scale function of $X$. Observe that $s$ is unique only up to an affine transformation. We will say that $X$ is on natural scale when $s(x) = x$.

Moreover, if we apply the scale function to $X$, we get the following.

Proposition 6.2. Let $\tilde{X} := s(X)$. Then $\tilde{X}$ satisfies the standing assumptions of this section and it is on natural scale.

Let us put $R = T_r \wedge T_l$.

Theorem 6.2. Let $f$ be a locally bounded and increasing Borel function. Then $f$ is a scale function if and only if the stopped process $f(X)^R$ is a local martingale.

Proof. We refer to the proof of Proposition 3.5 in Chap. VII of Revuz & Yor for the proof that $s(X)^R$ is a local martingale. We now give the proof of the converse. Suppose that $f(X)^R$ is a local martingale and let $[a,b]$ be a closed interval in the interior of $E$. Then $f(X)^{T_a \wedge T_b}$ is a bounded martingale. Thus, an application of the Optional Stopping Theorem yields
$$f(x) = f(a)\, P^x(T_a < T_b) + f(b)\, P^x(T_b < T_a).$$
Since $P^x(T_a < T_b) + P^x(T_b < T_a) = 1$, solving for $P^x(T_b < T_a)$ shows that $f$ is a scale function.

Exercise 6.1. Suppose that $X$ is a diffusion with the infinitesimal generator
$$L = \frac{1}{2}\sigma^2(x)\frac{d^2}{dx^2} + b(x)\frac{d}{dx},$$
where $\sigma$ and $b$ are locally bounded functions on $E$ and $\sigma > 0$. Prove that the scale function is given by
$$s(x) = \int_c^x \exp\left(-\int_c^y 2 b(z)\sigma^{-2}(z)\,dz\right) dy,$$
where $c$ is an arbitrary point in the interior of $E$.

Exercise 6.2. A 3-dimensional Bessel process is a Markov process on $(0,\infty)$ satisfying
$$X_t = x + \int_0^t \frac{1}{X_s}\,ds + B_t, \qquad x > 0.$$
Compute its scale function.

Since $X$ has continuous paths, $s(X)$ is a continuous local martingale. Thus, it can be viewed as a time-changed Brownian motion. This time change can be written explicitly in terms of a so-called speed measure.

Recall that $m_I(x) = E^x[\sigma_I]$. Let now $J = ]c,d[$ be a subinterval of $I$. By the strong Markov property, for any $a < c < x < d < b$ one has
$$m_I(x) = E^x[\sigma_J + \sigma_I \circ \theta_{\sigma_J}] = m_J(x) + \frac{s(d)-s(x)}{s(d)-s(c)}\, m_I(c) + \frac{s(x)-s(c)}{s(d)-s(c)}\, m_I(d) > \frac{s(d)-s(x)}{s(d)-s(c)}\, m_I(c) + \frac{s(x)-s(c)}{s(d)-s(c)}\, m_I(d),$$
which says that $m_I$ is an $s$-concave function. Thus, one can define a Green's function $G_I$ on $E \times E$ by
$$G_I(x,y) = \begin{cases} \frac{(s(x)-s(a))(s(b)-s(y))}{s(b)-s(a)}, & \text{if } a \le x \le y \le b;\\ \frac{(s(y)-s(a))(s(b)-s(x))}{s(b)-s(a)}, & \text{if } a \le y \le x \le b;\\ 0, & \text{otherwise;} \end{cases}$$
so that there exists a measure $\mu_I$ on the interior of $E$ such that
$$m_I(x) = \int G_I(x,y)\,\mu_I(dy).$$
In fact, we have more. The measure $\mu_I$ does not depend on $I$ (see Theorem 3.6 in Chap. VII of Revuz & Yor) and one can write
$$m_I(x) = \int G_I(x,y)\,m(dy).$$
The measure $m$ in the above representation is called the speed measure of $X$. The next two results state some of the basic properties of the speed measure.

Proposition 6.3. For any positive Borel function $f$,
$$E^x\left[\int_0^{T_a \wedge T_b} f(X_s)\,ds\right] = \int G_I(x,y)\, f(y)\,m(dy).$$

Proof. See the proof of Corollary 3.8 in Chap. VII of Revuz & Yor.
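As a quick consistency check (not part of the original notes; it uses the standard facts that Brownian motion is on natural scale, $s(x)=x$, and that under the normalization used here its speed measure is $m(dy) = 2\,dy$): the representation $m_I(x) = \int G_I(x,y)\,m(dy)$ gives

$$m_I(x) = 2\int_a^b G_I(x,y)\,dy = \frac{2(b-x)}{b-a}\int_a^x (y-a)\,dy + \frac{2(x-a)}{b-a}\int_x^b (b-y)\,dy = \frac{(b-x)(x-a)^2 + (x-a)(b-x)^2}{b-a} = (x-a)(b-x),$$

which is the classical expected exit time of Brownian motion from $]a,b[$, as it should be.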
Theorem 6.3. For a bounded $f \in D_A$ and $x \in \mathrm{int}(E)$,
$$Af(x) = \frac{d}{dm}\frac{d}{ds} f(x)$$
in the sense that

i) the $s$-derivative $\frac{df}{ds}$ exists except possibly on the set $\{x : m(\{x\}) > 0\}$;
ii) if $x_1$ and $x_2$ are two points for which this $s$-derivative exists, then
$$\frac{df}{ds}(x_2) - \frac{df}{ds}(x_1) = \int_{x_1}^{x_2} Af(y)\,m(dy).$$

Proof. If $f \in D_A$, then $M^f$ is a martingale, where
$$M^f_t = f(X_t) - f(X_0) - \int_0^t Af(X_s)\,ds.$$
Moreover, for any stopping time $T$ with $E^x[T] < \infty$, $(M^f_{t \wedge T})$ is a uniformly integrable martingale since $|M^f_{t \wedge T}| \le 2\|f\| + T\|Af\|$. Thus, if we choose $T = T_a \wedge T_b$, the Optional Stopping Theorem gives
$$E^x[f(X_T)] = f(x) + E^x\left[\int_0^T Af(X_s)\,ds\right].$$
Since $E^x[f(X_T)] = \frac{f(a)(s(b)-s(x)) + f(b)(s(x)-s(a))}{s(b)-s(a)}$ by Theorem 6.1, it then follows from Proposition 6.3 that
$$f(a)(s(b)-s(x)) + f(b)(s(x)-s(a)) - f(x)(s(b)-s(a)) = (s(b)-s(a)) \int_I G_I(x,y)\, Af(y)\,m(dy). \tag{6.1}$$
It can be directly verified that
$$\frac{f(b)-f(x)}{s(b)-s(x)} - \frac{f(x)-f(a)}{s(x)-s(a)} = \int_I H_I(x,y)\, Af(y)\,m(dy),$$
where
$$H_I(x,y) = \begin{cases} \frac{s(y)-s(a)}{s(x)-s(a)}, & \text{if } a < y \le x;\\ \frac{s(b)-s(y)}{s(b)-s(x)}, & \text{if } x \le y < b;\\ 0, & \text{otherwise.} \end{cases}$$
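To reconcile Theorem 6.3 with the classical second-order form of the generator, here is a sketch of a computation that is not in the notes; it assumes $\sigma$ and $b$ are smooth, takes $s$ as in Exercise 6.1, and uses the candidate speed measure density $m(dy) = \frac{2}{\sigma^2(y)\, s'(y)}\,dy$, which is the standard choice for such diffusions. Since $s'(y) = \exp\big(-\int_c^y 2b(z)\sigma^{-2}(z)\,dz\big)$, one has $s'' = -\frac{2b}{\sigma^2}\, s'$, and for smooth $f$,

$$\frac{d}{dm}\frac{d}{ds} f = \frac{\sigma^2 s'}{2}\,\frac{d}{dx}\Big(\frac{f'}{s'}\Big) = \frac{\sigma^2 s'}{2}\cdot\frac{f'' s' - f' s''}{(s')^2} = \frac{1}{2}\sigma^2 f'' + b f',$$

which is exactly the operator $L$ of Exercise 6.1. In other words, in the smooth case the $\frac{d}{dm}\frac{d}{ds}$ representation of Theorem 6.3 reduces to the familiar Kolmogorov form of the generator.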