Palmowski, Z. B., & Rolski, T. (2002). A technique for exponential change of measure for Markov processes. Bernoulli, 8(6), 767–785.

A technique for exponential change of measure for Markov processes

ZBIGNIEW PALMOWSKI 1,2 and TOMASZ ROLSKI 1
1 Mathematical Institute, University of Wrocław, pl. Grunwaldzki 2/4, 50-384 Wrocław, Poland
2 EURANDOM, P.O. Box 513, 5600 MB Eindhoven, The Netherlands. E-mail: [email protected]

We consider a Markov process $X(t)$ with extended generator $A$ and domain $\mathcal{D}(A)$. Let $\{\mathcal{F}_t\}$ be a right-continuous history filtration and let $P_t$ denote the restriction of $P$ to $\mathcal{F}_t$. Let $\tilde{P}$ be another probability measure on $(\Omega, \mathcal{F})$ such that $d\tilde{P}_t/dP_t = E^h(t)$, where
$$E^h(t) = \frac{h(X(t))}{h(X(0))}\exp\left(-\int_0^t \frac{(Ah)(X(s))}{h(X(s))}\,ds\right)$$
is a true martingale for a positive function $h \in \mathcal{D}(A)$. We demonstrate that the process $X(t)$ is a Markov process on the probability space $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}, \tilde{P})$, we find its extended generator $\tilde{A}$, and we provide sufficient conditions under which $\mathcal{D}(\tilde{A}) = \mathcal{D}(A)$. We apply this result to continuous-time Markov chains, to piecewise deterministic Markov processes and to diffusion processes (in this case a special choice of $h$ yields the classical Cameron–Martin–Girsanov theorem).

Keywords: Cameron–Martin–Girsanov theorem; exponential change of measure; extended generator; Markov additive process; Markov process; piecewise deterministic Markov process

1. Introduction

The technique of exponential change of measure has been successfully applied in various theories such as large deviations (Ridder and Walrand 1992; Schwartz and Weiss 1995), queues and fluid flows (Asmussen 1994; 1995; Kulkarni and Rolski 1994; Palmowski and Rolski 1996; 1998; Gautam et al. 1999), risk theory (Dassios and Embrechts 1989; Asmussen 1994; 2000; Schmidli 1995; 1996; 1997a; 1997b), simulation (Ridder 1996; Asmussen 1998) and population genetics (Fukushima and Stroock 1986; Ethier and Kurtz 1993). The process of interest is typically Markovian and, under a suitably chosen new probability measure, it is again a Markov process with some 'nicer' desired properties. To know the parameters of the new process we need to find its generator. Although in some cases determining the new generator is straightforward, there are many situations in which it is more difficult. In such situations, a unified theory simplifies the calculations.

In this paper we present a detailed account of the change of probability measure technique for cadlag Markov processes. We can also accommodate some non-Markovian processes that are Markovian with a supplementary component, for example, piecewise deterministic Markov processes or Markov additive processes. For diffusion processes, a special case of the theory presented is a Girsanov-type theorem.

Consider a Markov process $X(t)$ on a filtered probability space $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}, P)$ having extended generator $A$ with domain $\mathcal{D}(A)$. In this paper we use the martingale approach of Stroock and Varadhan (1979) to define generators. For each strictly positive function $f$, define

$$E^f(t) = \frac{f(X(t))}{f(X(0))}\exp\left(-\int_0^t \frac{(Af)(X(s))}{f(X(s))}\,ds\right), \qquad t \ge 0. \qquad (1.1)$$

If, for some function $h$, the process $E^h(t)$ is a martingale, then it is said to be an exponential martingale. In this case we call $h$ a good function. Using this exponential martingale as a likelihood ratio process, we will define a new probability measure $\tilde{P}$ on $(\Omega, \mathcal{F})$. In Theorem 4.2 we prove that, under some mild assumptions, $X(t)$ is a Markov process on $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}, \tilde{P})$ having extended generator

$$\tilde{A}f = \frac{1}{h}\left[A(fh) - f\,Ah\right] \qquad (1.2)$$
and $\mathcal{D}(\tilde{A}) = \mathcal{D}(A)$. Note that (1.2) can be rewritten using the opérateur carré du champ (see Revuz and Yor 1991, Definition 3.2, p. 326): $\tilde{A}f = Af + h^{-1}\langle h, f\rangle_A$, where $\langle h, f\rangle_A = A(hf) - hAf - fAh$. Kunita (1969) and Ethier and Kurtz (1986) found that the two generators $\tilde{A}$ and $A$ are related by $\tilde{A} = A + B$, where $B$ is a linear operator. However, the domains in their papers are restricted to bounded functions. Under some additional assumptions, Ethier and Kurtz (1986, Theorems 5.4 and 5.11c) show that $B$ is the generator of some Markov process and they refer to $\tilde{A}$ as a perturbation of $A$. If a good function $h$ is harmonic, that is, $Ah = 0$ (or $\tilde{A}h^{-1} = 0$), then $h(X(t))/h(X(0))$ is a martingale. In this case $\tilde{A}f = h^{-1}A(fh) - h^{-1}fAh = h^{-1}A(fh)$ – see Dynkin (1965, Chapter IX, (9.46)) for a similar result expressed in terms of infinitesimal operators, or Doob (1984, Section 2.VI.13), where the idea of $h$-transforms was outlined.

In Section 5 we illustrate the theory, applying it to several cases of special interest. For example, if $X(t)$ is a diffusion, then, choosing the exponential function as a good function, we obtain a Girsanov-type theorem (see Theorem 5.5). We also show that a piecewise deterministic Markov process (PDMP) remains a PDMP under the new measure $\tilde{P}$, and we find its characteristics (see Theorem 5.3). Other explicit forms of $\tilde{A}$ are computed for continuous-time Markov chains (CTMCs) in Proposition 5.1, and for Markov additive processes in Proposition 5.6. Related material on change of measure or Girsanov-type theorems can be found, for example, in Revuz and Yor (1991), Küchler and Sørensen (1997) and Jacod and Shiryaev (1987).

2. Preliminaries

2.1. Basic set-up

Let $E$ be a Borel space. Throughout this paper we denote the space of measurable functions $f : E \to \mathbb{R}$ by $\mathcal{M}(E)$ and the space of continuous functions by $\mathcal{C}(E)$. We add a subscript b to denote the restriction to bounded functions. We consider a stochastic process $X(t)$ assuming values in $E$ and defined on a probability space $(\Omega, \mathcal{F}, P)$, equipped with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$. We assume that $\mathcal{F} = \bigvee_{t \ge 0}\mathcal{F}_t$, that is, $\mathcal{F}$ is the smallest $\sigma$-field generated by all subsets of $\mathcal{F}_t$ for all $t \ge 0$. We denote by $\mathbf{F}$ the pair $(\mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0})$. We call $(\Omega, \mathbf{F})$ a filtered space and $(\Omega, \mathbf{F}, P)$ a filtered probability space. For a probability measure $P$ we denote by $P_t$ its restriction to $\mathcal{F}_t$. All stochastic processes considered here are cadlag.

Let $M(t)$ be a positive cadlag martingale defined on a filtered probability space $(\Omega, \mathbf{F}, P)$ such that $EM(0) = 1$. We define a family of measures $\{\tilde{P}_t\}$ by

$$\frac{d\tilde{P}_t}{dP_t} = M(t). \qquad (2.3)$$

It can be proved that $\{\tilde{P}_t\}$ is a consistent family, that is, for $s \le t$ we have $\tilde{P}_t(A) = \tilde{P}_s(A)$ for all $A \in \mathcal{F}_s$. We say that the standard set-up is satisfied if:

(i) the filtered probability space $(\Omega, \mathbf{F}, P)$ has a right-continuous filtration $\{\mathcal{F}_t\}$;
(ii) there exists a unique probability measure $\tilde{P}$ such that $\tilde{P}_t = \tilde{P}|_{\mathcal{F}_t}$, where $\tilde{P}_t$ is defined by (2.3).

The standard set-up may not be satisfied if the filtration $\{\mathcal{F}_t\}$ is not separable (Parthasarathy 1967, Definition 2.1 and Theorem 4.2); see, for example, the example of Rogers and Williams (1987, p. 83) and Karatzas and Shreve (1988, p. 193) concerning the augmented filtration of Brownian motion.

We now describe a canonical model for the standard set-up. Assume that $\Omega \subset E^{[0,\infty)}$ is the space $D_E[0,\infty)$ of cadlag functions from $[0,\infty)$ into $E$ (in some cases we can also use $C_E[0,\infty)$, the space of continuous functions into $E$) and consider the canonical process $X(t)$ on $(\Omega, \mathbf{F}, P)$ defined by $X(\omega, t) = \omega(t)$. The filtration is defined by $\{\mathcal{F}_t = \mathcal{F}^X_{t+}\}$, where $\mathcal{F}^X_t = \sigma\{X(s), s \le t\}$ is the natural filtration generated by the process $X(t)$. We define a new family of probabilities via (2.3). By Kolmogorov's theorem (see Stroock 1987, Theorem 4.2, p. 106; Stroock and Varadhan 1979, Theorem 1.3.5, p. 34; Parthasarathy 1967, Theorem 4.2, p. 143), there exists a unique probability measure $\tilde{P}$ on $(\Omega, \mathbf{F})$ such that $\tilde{P}_t = \tilde{P}|_{\mathcal{F}_t}$, where $\tilde{P}_t$ satisfies (2.3). The above considerations are, of course, true if we take the natural filtration $\{\mathcal{F}^X_t\}$ instead of $\{\mathcal{F}^X_{t+}\}$, but in applications one often needs the right-continuity assumption to define certain stopping times (e.g. the moment of ruin in risk theory).

2.2. Markov processes

The following theorem was proved by Kunita and Watanabe (1963, Propositions 3 and 5). See also Dynkin (1965) and Küchler and Sørensen (1997, Proposition 6.3.5).

Theorem 2.1. Let $X(t)$ be a Markov process on $(\Omega, \mathbf{F}, P)$. Assume that the standard set-up is satisfied. We define a new probability measure $\tilde{P}$ by (2.3), where the martingale $M(t)$ is a non-negative multiplicative functional for which $E(M(0)) = 1$. Then on the new filtered probability space $(\Omega, \mathbf{F}, \tilde{P})$, the process $X(t)$ is Markovian.

For a Markov process $X(t)$ with values in $E$ we define a full (extended) generator $\mathbf{A}$ by
$$\mathbf{A} = \{(f, f^*) \in \mathcal{M}(E) \times \mathcal{M}(E) : D^f(t) \text{ is a (local) martingale}\},$$
where $D^f(t) = f(X(t)) - \int_0^t f^*(X(s))\,ds$, $t \ge 0$, is Dynkin's (local) martingale and the function $s \to f^*(X(s))$ is integrable $P$-almost surely on $[0, t]$ for all $t \ge 0$. From now on we will identify all versions of the function $f^*$ and we denote all these versions by $Af$ if $(f, f^*) \in \mathbf{A}$. By $\mathcal{D}(A)$ we denote the set of measurable functions $f \in \mathcal{M}(E)$ such that $Af \in \mathcal{M}(E)$ exists, and such that $\int_0^t |Af(X(s))|\,ds < \infty$ $P$-a.s. for all $t \ge 0$. Thus
$$D^f(t) = f(X(t)) - \int_0^t Af(X(s))\,ds \qquad (2.4)$$
is an $\mathbf{F}$-(local) martingale for $f \in \mathcal{D}(A)$. The space $\mathcal{D}(A)$ is called the domain of the full (extended) generator $A$. If we restrict the domain of the extended or the full generator to a subset $\mathcal{D} \subset \mathcal{D}(A)$, then to avoid cumbersome notation we also denote this subset by $\mathcal{D}(A)$.

Remark 2.1. We will also use the notation $D^f(t)$ for the process given in (2.4) for any linear operator $A : \mathcal{M}(E) \to \mathcal{M}(E)$ for which $s \to Af(X(s))$ is integrable $P$-a.s. on $[0, t]$ for all $t \ge 0$.

Remark 2.2. If $D^f(t)$ is a local martingale with respect to the augmented filtration $\{\mathcal{F}^P_t\}$, and the $\{\mathcal{F}^P_t\}$-stopping times in a fundamental sequence are $\{\mathcal{F}^X_{t+}\}$-stopping times, then $D^f(t)$ is also an $\{\mathcal{F}^X_{t+}\}$-local martingale. In this case, the extended generator and its domain do not change under augmentation.

Remark 2.3. Note that for $f \in \mathcal{D}(A)$ the process $\int_0^t Af(X(s))\,ds$ is a continuous process of finite variation. Thus the process $f(X(t))$ is a semimartingale. In particular, since the local martingale $D^f(t)$ is cadlag, this process is also cadlag.

3. Exponential martingale

Consider a Markov process $X(t)$ defined on a filtered probability space $(\Omega, \mathbf{F}, P)$. For a linear operator $A : \mathcal{M}(E) \to \mathcal{M}(E)$ we define

$$\mathcal{M}^*(A) = \left\{ f \in \mathcal{M}(E) : f(x) \ne 0 \text{ for all } x,\ \int_0^t \left|\frac{(Af)(X(s))}{f(X(s))}\right| ds < \infty \text{ and } \int_0^t |Af(X(s))|\,ds < \infty \text{ for all } t \ge 0,\ P\text{-a.s.} \right\}.$$

Moreover, for an extended generator $A$, we let $\mathcal{D}^*(A) = \mathcal{M}^*(A) \cap \mathcal{D}(A)$. Thus, $E^f(t)$ defined in (1.1) makes sense for $f \in \mathcal{M}^*(A)$. The following result is a slight extension of Proposition 3.2 in Ethier and Kurtz (1986, p. 175).

Lemma 3.1. Let $f \in \mathcal{M}^*(A)$. Then the process $\{D^f(t), t \ge 0\}$ is a local martingale if and only if $\{E^f(t), t \ge 0\}$ is a local martingale.

Proof. Assume that the process $D^f(t)$ is a local martingale. For $f \in \mathcal{M}^*(A)$, the process $\int_0^t (Af)(X(s))/f(X(s))\,ds$ is continuous and of finite variation. Hence by Jacod and Shiryaev (1987, Proposition 2.1.3, p. 75),
$$\left[ f(X(t)),\ \exp\left(-\int_0^t \frac{(Af)(X(s))}{f(X(s))}\,ds\right) \right]_t = 0,$$
where $[\cdot,\cdot]_t$ is the quadratic covariation. Using the integration by parts formula for semimartingales, we have

$$dE^f(t) = \frac{1}{f(X(0))}\exp\left(-\int_0^t \frac{(Af)(X(s))}{f(X(s))}\,ds\right) dD^f(t). \qquad (3.5)$$
Thus, by Dellacherie and Meyer (1982, Theorem VIII.3.e, p. 314), the process $E^f(t)$ is a local martingale. Now assume that $E^f(t)$ is a local martingale. By (3.5) we also have that

$$dD^f(t) = f(X(0))\exp\left(\int_0^t \frac{(Af)(X(s))}{f(X(s))}\,ds\right) dE^f(t),$$
which completes the proof. □

Proposition 3.2. Suppose that either one of the following two conditions holds for a positive function $h \in \mathcal{D}(A)$:
(M1) $h \in \mathcal{M}_b(E)$ and $h^{-1}Ah \in \mathcal{M}_b(E)$;
(M2) $h, Ah \in \mathcal{M}_b(E)$ and $\inf_x h(x) > 0$.
Then $h$ is a good function.

Proof. A criterion of Protter (1990, Theorem 47, p. 35) and Lemma 3.1 justify the above statement. □

4. Main theorem

In this section we consider the Markov process from Section 3. Throughout this section, $h$ is a good function and we define a new family of probabilities $\{\tilde{P}_t, t \ge 0\}$ by
$$\frac{d\tilde{P}_t}{dP_t} = E^h(t), \qquad t \ge 0.$$
We now assume that the standard set-up holds and thus there exists a unique probability measure $\tilde{P}$ such that $\tilde{P}_t = \tilde{P}|_{\mathcal{F}_t}$. The Markovian property of $X(t)$ on the filtered probability space $(\Omega, \mathbf{F}, \tilde{P})$ follows from Theorem 2.1. We now quote the following useful lemma (Liptser and Shiryaev 1986, Lemma 4.5.3, p. 188):

Lemma 4.1. An adapted process $Z(t)$ on a filtered space $(\Omega, \mathbf{F})$ is a $\tilde{P}$-local martingale if $Z(t)E^h(t)$ is a $P$-local martingale.

For the operator $\tilde{A}$ defined in (1.2), let
$$\mathcal{D}_A = \left\{ f \in \mathcal{D}(A) : hf \in \mathcal{D}^*(A) \text{ and } \int_0^t |\tilde{A}f(X(s))|\,ds < \infty \text{ for all } t \ge 0,\ \tilde{P}\text{-a.s.} \right\}.$$
For $f \in \mathcal{D}_A$ we have $f(x) \ne 0$. Note also that if $f \in \mathcal{D}_A$, then
$$\int_0^t \frac{(\tilde{A}f)(X(s))}{f(X(s))}\,ds = \int_0^t \frac{(A(fh))(X(s))}{fh(X(s))}\,ds - \int_0^t \frac{(Ah)(X(s))}{h(X(s))}\,ds$$
is well defined. Thus for $f \in \mathcal{D}_A$ we can define the processes
$$\tilde{E}^f(t) = \frac{f(X(t))}{f(X(0))}\exp\left(-\int_0^t \frac{(\tilde{A}f)(X(s))}{f(X(s))}\,ds\right) \qquad (4.6)$$
and
$$\tilde{D}^f(t) = f(X(t)) - \int_0^t (\tilde{A}f)(X(s))\,ds.$$
We have

$$\tilde{E}^f(t)E^h(t) = \frac{f(X(t))}{f(X(0))}\,\frac{h(X(t))}{h(X(0))}\exp\left(-\int_0^t \left[\frac{(\tilde{A}f)(X(s))}{f(X(s))} + \frac{(Ah)(X(s))}{h(X(s))}\right] ds\right) = \frac{fh(X(t))}{fh(X(0))}\exp\left(-\int_0^t \frac{(A(fh))(X(s))}{fh(X(s))}\,ds\right).$$
Applying Lemma 3.1 to $f \in \mathcal{D}_A$ and Lemma 4.1 to $Z(t) = \tilde{E}^f(t)$, we have that $\tilde{D}^f(t)$ is a local martingale. Thus $\tilde{A}$ is an extended generator of the process $X(t)$ on $(\Omega, \mathbf{F}, \tilde{P})$ and $\mathcal{D}_A \subset \mathcal{D}(\tilde{A})$. Unfortunately, with this approach, we cannot easily identify the domain $\mathcal{D}(\tilde{A})$ of the extended generator $\tilde{A}$.

Let
$$\mathcal{D}^h_A = \left\{ f \in \mathcal{D}(A) : fh \in \mathcal{D}(A) \text{ and } \int_0^t |\tilde{A}f(X(s))|\,ds < \infty \text{ for all } t \ge 0,\ \tilde{P}\text{-a.s.} \right\}$$
and
$$\mathcal{D}^{h^{-1}}_{\tilde{A}} = \left\{ f \in \mathcal{D}(\tilde{A}) : fh^{-1} \in \mathcal{D}(\tilde{A}) \text{ and } \int_0^t |Af(X(s))|\,ds < \infty \text{ for all } t \ge 0,\ P\text{-a.s.} \right\}.$$

Remark 4.1. Note that if $\inf_x h(x) > 0$ or $h \in \mathcal{C}(E)$, then by the inequality
$$|\tilde{A}f| \le \frac{|A(fh)|}{h} + |f|\,\frac{|Ah|}{h},$$
we have $\mathcal{D}^h_A = \{ f \in \mathcal{D}(A) : fh \in \mathcal{D}(A) \}$. Similarly, if $h \in \mathcal{M}_b(E)$ or $h \in \mathcal{C}(E)$, then $\mathcal{D}^{h^{-1}}_{\tilde{A}} = \{ f \in \mathcal{D}(\tilde{A}) : fh^{-1} \in \mathcal{D}(\tilde{A}) \}$.

Theorem 4.2. Suppose that $\mathcal{D}^h_A = \mathcal{D}(A)$ and $\mathcal{D}^{h^{-1}}_{\tilde{A}} = \mathcal{D}(\tilde{A})$. Then on the filtered probability space $(\Omega, \mathbf{F}, \tilde{P})$ the process $X(t)$ is Markovian with extended generator $\tilde{A}$ and $\mathcal{D}(\tilde{A}) = \mathcal{D}(A)$.

Proof. Let $f \in \mathcal{D}(A)$. Thus $f \in \mathcal{D}^h_A$ and the process $\tilde{D}^f(t)$ is well defined. Moreover, it is cadlag because, by Remark 2.3, the process $f(X(t))$ is cadlag. We will show that the process $\tilde{D}^f(t)$ is a $\tilde{P}$-local martingale. In that case we also prove the inclusion $\mathcal{D}(A) \subset \mathcal{D}(\tilde{A})$. To prove the reverse inclusion, it suffices to change the measure by
$$\frac{dP_t}{d\tilde{P}_t} = E^h(t)^{-1},$$
which can be done since $h$ is positive. Note that $h\tilde{A}h^{-1} = -h^{-1}Ah$ and hence
$$\tilde{E}^{h^{-1}}(t) = \frac{h^{-1}(X(t))}{h^{-1}(X(0))}\exp\left(-\int_0^t h\tilde{A}h^{-1}(X(s))\,ds\right) = (E^h(t))^{-1} \qquad (4.7)$$
is a $\tilde{P}$-martingale. Note also that $f \equiv 1 \in \mathcal{D}(\tilde{A})$. So by the equality $\mathcal{D}^{h^{-1}}_{\tilde{A}} = \mathcal{D}(\tilde{A})$, we have that $h^{-1} \in \mathcal{D}^*(\tilde{A})$.

We now prove that the process $\tilde{D}^f(t)$ is a local martingale. Following Lemma 4.1, it suffices to show that the process $E^h(t)\tilde{D}^f(t)$ is a $P$-local martingale. By the integration by parts formula for semimartingales, we obtain
$$\tilde{D}^f(t)E^h(t) = \int_0^t \tilde{D}^f(s-)\,dE^h(s) + \int_0^t E^h(s-)\,d\tilde{D}^f(s) + [E^h, \tilde{D}^f]_t.$$
However, from the definition of the process $\tilde{D}^f(t)$ and the operator $\tilde{A}$, we have

$$\tilde{D}^f(t) = D^f(t) - \int_0^t \frac{\langle f, h\rangle_A(X(s))}{h(X(s))}\,ds.$$
Hence

$$\tilde{D}^f(t)E^h(t) = \int_0^t \tilde{D}^f(s-)\,dE^h(s) + \int_0^t E^h(s-)\,dD^f(s) + [E^h, \tilde{D}^f]_t - \int_0^t E^h(s-)\,\frac{\langle f, h\rangle_A(X(s))}{h(X(s))}\,ds. \qquad (4.8)$$

The first two components are local martingales. We now consider the third component. Let $X^c$ be the continuous component and $X^d$ the pure-jump component of the process $X(t)$, and let $\Delta f(X(t)) = f(X(t)) - f(X(t-))$. Then we have
$$d[E^h, \tilde{D}^f]_t = d[f(X), E^h]_t - d\left[\int_0^t \tilde{A}f(X(s))\,ds,\ E^h(t)\right]_t.$$
Note that $\int_0^t \tilde{A}f(X(s))\,ds$ is continuous and of finite variation, and hence $[\int_0^t \tilde{A}f(X(s))\,ds, E^h(t)]_t = 0$. Thus $d[E^h, \tilde{D}^f]_t = d[f(X), E^h]_t$, which by (3.5) is equal to
$$d\left[f(X(t)),\ \int_0^t \frac{E^h(s-)}{h(X(s-))}\,dh(X(s))\right]_t - d\left[f(X(t)),\ \int_0^t \frac{E^h(s-)}{h(X(s-))}\,Ah(X(s))\,ds\right]_t. \qquad (4.9)$$

Note that

$$\int_0^t \left|\frac{E^h(s-)Ah(X(s))}{h(X(s-))}\right| ds \le \frac{1}{h(X(0))}\exp\left(\int_0^t \left|\frac{Ah(X(s))}{h(X(s))}\right| ds\right) \int_0^t |Ah(X(s))|\,ds,$$

which is finite $P$-a.s. for all $t \ge 0$. Thus the process $\int_0^t (1/h(X(s-)))\,E^h(s-)Ah(X(s))\,ds$ is continuous and of finite variation, and the second component of (4.9) is equal to 0. We have
$$d[E^h, \tilde{D}^f]_t = d\left[f(X^c(t)) + f(X^d(t)),\ \int_0^t \frac{E^h(s-)}{h(X(s-))}\,d\big(h(X^c(s)) + h(X^d(s))\big)\right]_t = \frac{E^h(t-)}{h(X(t-))}\,d[f(X(t)), h(X(t))]_t.$$
Using the integration by parts formula for semimartingales again, one obtains

$$d[E^h, \tilde{D}^f]_t = \frac{E^h(t-)}{h(X(t-))}\,d\left(f(X(t))h(X(t)) - \int_0^t f(X(s-))\,dh(X(s)) - \int_0^t h(X(s-))\,df(X(s))\right)$$
$$= \frac{E^h(t-)}{h(X(t-))}\,dD^{fh}(t) - \frac{E^h(t-)}{h(X(t-))}\,f(X(t-))\,dD^h(t) - \frac{E^h(t-)}{h(X(t-))}\,h(X(t-))\,dD^f(t)$$
$$\quad + \frac{E^h(t-)}{h(X(t-))}\big((A(fh))(X(t)) - f(X(t-))(Ah)(X(t)) - h(X(t-))(Af)(X(t))\big)\,dt.$$
The first three components above are all local martingales since they are integrals with respect to local martingales. The last component is a Riemann differential, and is equal to

$$\frac{\langle f, h\rangle_A(X(t))}{h(X(t))}\,E^h(t-)\,dt,$$
which completes the proof in view of (4.8). □

5. Examples

5.1. Continuous-time Markov chain

The simplest case we can analyse is when $X(t)$ is a CTMC. We will only consider CTMCs with a finite state space. The modification to a countable state space is straightforward, but it would require many additional assumptions.

Let $Q = (q_{ij})_{i,j=1,\dots,\ell}$ be the intensity matrix of the process, which, in the case of the finite state space, is also the generator of the process. Here, functions $f, h$ are column vectors. We define $fh = (f_1 h_1, \dots, f_\ell h_\ell)^T$. Let $h$ be positive. Now we have the following proposition (see also Rolski et al. 1999, Lemma 12.3.3).

Proposition 5.1. The new generator is $\tilde{Q} = (\tilde{q}_{ij})$, where
$$\tilde{q}_{ij} = \begin{cases} \dfrac{h_j}{h_i}\,q_{ij}, & i \ne j, \\[2mm] -\displaystyle\sum_{k \ne i} \frac{h_k}{h_i}\,q_{ik}, & i = j. \end{cases}$$

Proof. By inspection. □
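Proposition 5.1 can be verified numerically against the defining relation (1.2), applied coordinatewise: $\tilde{Q}f = h^{-1}[Q(fh) - f\,Qh]$. A minimal sketch (the 3-state intensity matrix `Q` and positive vector `h` below are illustrative choices of ours, not taken from the paper):

```python
import numpy as np

# Illustrative intensity matrix (rows sum to zero) and a positive vector h
Q = np.array([[-3.0, 2.0, 1.0],
              [ 1.0, -4.0, 3.0],
              [ 2.0, 2.0, -4.0]])
h = np.array([1.0, 2.0, 0.5])

# Entrywise formula of Proposition 5.1: q~_ij = (h_j/h_i) q_ij for i != j,
# with the diagonal chosen so that each row sums to zero.
Qt = Q * np.outer(1.0 / h, h)
np.fill_diagonal(Qt, 0.0)
np.fill_diagonal(Qt, -Qt.sum(axis=1))

# Defining relation (1.2): (Q~ f) = h^{-1} [Q(fh) - f (Qh)], tested on basis vectors
def Qtilde_apply(f):
    return (Q @ (f * h) - f * (Q @ h)) / h

Qt_from_12 = np.column_stack([Qtilde_apply(e) for e in np.eye(3)])

assert np.allclose(Qt, Qt_from_12)       # the two constructions agree
assert np.allclose(Qt.sum(axis=1), 0.0)  # Q~ is again an intensity matrix
```

Applying (1.2) to the coordinate basis vectors recovers the columns of $\tilde{Q}$, and the rows of $\tilde{Q}$ again sum to zero, so $\tilde{Q}$ is an intensity matrix.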

Corollary 5.2. If $Q$ is irreducible and reversible with the stationary probability vector $\pi$, then the stationary vector $\tilde{\pi}$ for $\tilde{Q}$ is given by
$$\tilde{\pi}_i = C\,\pi_i h_i \prod_{j \ne i} h_j^{-1}$$
(equivalently, $\tilde{\pi}_i \propto \pi_i h_i^2$), where $C$ is a constant such that $\sum_{j=1}^{\ell} \tilde{\pi}_j = 1$.

5.2. Piecewise deterministic Markov process

We follow Davis's (1993) description of PDMPs. Let $E$ be a state space consisting of pairs $x = (\nu, z)$, where $\nu$ assumes a finite number of values from a set $\mathcal{I}$ and $z$ belongs to an open subset $\mathcal{O}_\nu$ of $\mathbb{R}^{d(\nu)}$ (note that $E$ is a Borel space). The process $X(t)$ is determined (see Davis 1993, p. 62) by the following:

(i) $\mathcal{X}_\nu = \sum_{i=1}^{d(\nu)} g^{\nu,i}(x)\,\partial/\partial z_i$, a vector field determining the flow $\phi_\nu(t, z)$ (we assume that the $g^{\nu,i}$ are locally Lipschitz continuous);
(ii) $\lambda(\cdot)$, a force of transitions;
(iii) $Q(\cdot, \cdot)$, a transition kernel.

Denote by $\partial\mathcal{O}_\nu$ the boundary of $\mathcal{O}_\nu$ and let
$$\partial^*\mathcal{O}_\nu = \{ z \in \partial\mathcal{O}_\nu : z = \phi_\nu(t, z') \text{ for some } (t, z') \in \mathbb{R}_+ \times \mathcal{O}_\nu \},$$
$$\Gamma = \{ (\nu, z) \in \partial E : \nu \in \mathcal{I},\ z \in \partial^*\mathcal{O}_\nu \},$$
$$t^*(\nu, z) = \sup\{ t > 0 : \phi_\nu(t, z) \text{ exists and } \phi_\nu(t, z) \in \mathcal{O}_\nu \}.$$
The set $\Gamma$ is called an active boundary. It is the set of boundary points of $E$ which can be reached from $E$ via integral curves in finite time. For each point $(\nu, z)$, $t^*(\nu, z)$ is the time needed to reach the boundary from $(\nu, z)$. We will assume that $\phi_\nu(t^*(\nu, z), z) \in \Gamma$ if $t^*(\nu, z) < \infty$, which means that the integral curves cannot 'disappear' inside $E$. Assume that
$$\lim_{n \to \infty} T_n = \infty, \qquad P\text{-a.s.}, \qquad (5.10)$$
where $T_n$ denotes the consecutive jump times of the process $X(t)$. To obtain (5.10) we can assume that $\lambda$ is bounded and that one of the following conditions is satisfied: $t^*(x) = \infty$ for each $x = (\nu, z) \in E$, that is, there are no active boundary points (e.g. $\Gamma = \emptyset$); or, for some $\varepsilon > 0$, we have $Q(x, B_\varepsilon) = 1$ for all $x \in \Gamma$, where $B_\varepsilon = \{ x \in E : t^*(x) \ge \varepsilon \}$. The last condition means that the minimal distance between consecutive boundary hitting times is not smaller than $\varepsilon$ (see Davis 1993, Proposition 24.6, p. 60). From Davis (1993, Proposition 26.14, p. 69), the formula for the extended generator is

$$Af(x) = \mathcal{X}f(x) + \lambda(x) \int_E \big(f(y) - f(x)\big)\,Q(x, dy),$$
where $x \in E$ and the domain $\mathcal{D}(A)$ consists of every function $f$ that is the restriction to $E$ of a measurable function $f : E \cup \Gamma \to \mathbb{R}$ satisfying three conditions:
(i) For each $(\nu, z) \in E$ the function $t \to f(\nu, \phi_\nu(t, z))$ is absolutely continuous on $(0, t^*(\nu, z))$.

(ii) For each $x \in \Gamma$,
$$f(x) = \int_E f(y)\,Q(x, dy). \qquad (5.11)$$
(iii) For $n = 1, 2, \dots$,
$$E \sum_{i=1}^n |f(X(T_i)) - f(X(T_i-))| < \infty. \qquad (5.12)$$
From (1.2) we have
$$(\tilde{A}f)(x) = (\mathcal{X}f)(x) + \frac{\lambda(x)H(x)}{h(x)} \int_E \big(f(y) - f(x)\big)\,\frac{h(y)}{H(x)}\,Q(x, dy),$$
where $H(x) = \int_E h(y)\,Q(x, dy)$. We assume that
$$\lim_{n \to \infty} T_n = \infty, \qquad \tilde{P}\text{-a.s.} \qquad (5.13)$$
This condition holds if, for example, $\lambda(x)$ is bounded, $\inf_x h(x) > 0$ and $\Gamma = \emptyset$. We can now conclude with the following theorem.

Theorem 5.3. Assume that $h$ is a good function satisfying $H(x) < \infty$ for all $x \in E$. Suppose that (5.13) holds, and that $\mathcal{D}^h_A = \mathcal{D}(A)$ and $\mathcal{D}^{h^{-1}}_{\tilde{A}} = \mathcal{D}(\tilde{A})$. Then on the new probability space $(D_E[0,\infty), \mathbf{F}, \tilde{P})$, the process $X(t)$ is a PDMP with the unchanged differential operator $\mathcal{X}$ and the following jump intensity and transition kernel:
$$\tilde{\lambda}(x) = \frac{\lambda(x)H(x)}{h(x)}, \qquad \tilde{Q}(x, dy) = \frac{h(y)}{H(x)}\,Q(x, dy). \qquad (5.14)$$
The domain does not change: $\mathcal{D}(\tilde{A}) = \mathcal{D}(A)$.

Remark 5.1. Assume that condition (M1) or (M2) of Proposition 3.2 holds. Then $h$ is a good function and $H(x) < \infty$ for all $x$. Moreover, if, for each $f \in \mathcal{D}(A)$, the function $fh$ satisfies condition (5.11) (e.g. if there are no active boundary points: $\Gamma = \emptyset$), then $\mathcal{D}^h_A = \mathcal{D}(A)$ and $\mathcal{D}^{h^{-1}}_{\tilde{A}} = \mathcal{D}(\tilde{A})$ by Remark 4.1. Thus, if (5.13) holds additionally, then all assumptions of Theorem 5.3 are satisfied. In particular, this is the case when $\Gamma = \emptyset$, $\lambda(x)$ is bounded and $h$ is an absolutely continuous, bounded function such that $\inf_x h(x) > 0$.
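As a concrete illustration of (5.14) (a toy example of ours, not one from the paper): take a PDMP whose jumps add an independent amount $U \sim \mathrm{Exp}(\beta)$ to the state, and $h(x) = e^{\alpha x}$ with $0 < \alpha < \beta$. Then $H(x)/h(x) = \beta/(\beta-\alpha)$ is constant, the tilted intensity is $\tilde{\lambda} = \lambda\beta/(\beta-\alpha)$, and $\tilde{Q}$ exponentially tilts the jump law to $\mathrm{Exp}(\beta-\alpha)$:

```python
import numpy as np

lam, beta, alpha = 2.0, 3.0, 1.0   # jump rate, Exp(beta) jump sizes, tilt exponent (alpha < beta)

# Jump kernel Q(x, .) = law of x + U with U ~ Exp(beta); good function h(x) = exp(alpha*x).
# Then H(x) = int h(y) Q(x, dy) = exp(alpha*x) * beta/(beta - alpha), so H(x)/h(x) is constant.
du = 1e-4
u = (np.arange(600_000) + 0.5) * du          # midpoint grid on [0, 60]
g = beta * np.exp(-beta * u)                 # density of U
H_over_h = np.sum(np.exp(alpha * u) * g) * du
assert abs(H_over_h - beta / (beta - alpha)) < 1e-5

# (5.14): tilted jump intensity and tilted jump density
lam_tilde = lam * H_over_h                   # = lam * beta/(beta - alpha)
g_tilde = np.exp(alpha * u) * g / H_over_h   # density of a jump under Q~(x, .)

assert abs(lam_tilde - lam * beta / (beta - alpha)) < 1e-4
assert abs(np.sum(g_tilde) * du - 1.0) < 1e-5                        # Q~ is a probability kernel
assert abs(np.sum(u * g_tilde) * du - 1.0 / (beta - alpha)) < 1e-4   # tilted jumps ~ Exp(beta - alpha)
```

The same computation works for any jump distribution with a finite exponential moment; only the closed forms used in the assertions are special to the exponential case.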

Remark 5.2. Consider a positive function $h \in \mathcal{D}(A)$ which is harmonic (typically unbounded, so that conditions (M1) and (M2) of Proposition 3.2 do not hold) and which belongs to the domain of the full generator of the process $X(t)$. That is, $h$ is a good function and, by Davis (1993, Remark 26.17, p. 70), it satisfies a condition stronger than (5.12), namely
$$E \sum_{T_i \le t} |h(X(T_i)) - h(X(T_i-))| < \infty \qquad (5.15)$$
for all $t \ge 0$. Assume that (5.13) holds, $H(x) < \infty$ for all $x \in E$ and $fh$ satisfies (5.11) (e.g. $\Gamma = \emptyset$). We restrict the domains of the extended generators $A$ and $\tilde{A}$ to the bounded measurable functions satisfying (i) and (ii), and we also denote these sets by $\mathcal{D}(A)$ and $\mathcal{D}(\tilde{A})$, respectively. Then by Theorem 5.3, the process $X(t)$ on $(D_E[0,\infty), \mathbf{F}, \tilde{P})$ is a PDMP with the parameters given in (5.14) and $\mathcal{D}(\tilde{A}) = \mathcal{D}(A)$. Such a framework is often used in risk theory (see Rolski et al. 1999, p. 460).

5.3. Diffusion processes

Let $a = (a_{ij})_{i,j=1,\dots,d}$ and $b = (b_1, \dots, b_d)^T$ be functions of $x \in \mathbb{R}^d$ satisfying the following conditions:
(i) $b_i \in \mathcal{C}(\mathbb{R}^d)$ and, for some $L > 0$,
$$|b_i(x)| \le L(1 + \|x\|), \qquad x \in \mathbb{R}^d. \qquad (5.16)$$
(ii) $a(x)$ is a strictly positive definite matrix and
$$a_{ij} \in \mathcal{C}_b(\mathbb{R}^d). \qquad (5.17)$$
In this subsection we assume that $X(t)$ is a Markov diffusion process on the state space $E = \mathbb{R}^d$ with a given initial state $X(0)$. We consider here $\Omega = C_{\mathbb{R}^d}[0,\infty)$ and $X(\omega, t) = \omega(t)$ with the right-continuous filtration $\{\mathcal{F}_t = \mathcal{F}_{t+}\}_{t \ge 0}$. The extended generator $A$ of the diffusion process is
$$(Af)(x) = \frac{1}{2}\sum_{i,j=1}^d a_{ij}(x)\frac{\partial^2 f(x)}{\partial x_i \partial x_j} + \sum_{i=1}^d b_i(x)\frac{\partial f(x)}{\partial x_i}, \qquad (5.18)$$
where $x \in \mathbb{R}^d$, and the family of twice continuously differentiable functions $f \in \mathcal{C}^2(\mathbb{R}^d)$ is included in the domain $\mathcal{D}(A)$ of this generator; see Karatzas and Shreve (1988, Proposition 4.2, p. 312) or Rogers and Williams (1987, Theorem 24.1, p. 170). We will write $\mathcal{D}(A) = \mathcal{C}^2(\mathbb{R}^d)$. Let $c(x) = (c_1(x_1), \dots, c_d(x_d))^T : \mathbb{R}^d \to \mathbb{R}^d$, where
$$c_i \in \mathcal{C}_b(\mathbb{R}), \qquad i = 1, \dots, d. \qquad (5.19)$$
For $x_0 = (x_{0,1}, \dots, x_{0,d})$, $x = (x_1, \dots, x_d)$ and $y = (y_1, \dots, y_d)$, we define
$$\int_{x_0}^x c(y)\,dy = \sum_{j=1}^d \int_{x_{0,j}}^{x_j} c_j(y_j)\,dy_j.$$
We use the following function $h : \mathbb{R}^d \to \mathbb{R}_+$:
$$h(x) = \exp\left(\int_{x_0}^x c(y)\,dy\right). \qquad (5.20)$$
For $d = 1$, we have $h(x) = \exp(\int_{x_0}^x c(y)\,dy)$. Note that the function $h(x)$ satisfies
$$\frac{1}{h(x)}\frac{\partial h(x)}{\partial x_j} = c_j(x_j). \qquad (5.21)$$

Moreover, we have that $h \in \mathcal{D}(A)$ and $h(x) \ne 0$. Hence $h \in \mathcal{D}^*(A)$ and thus, by Lemma 3.1, the process $E^h(t)$ is a local martingale. However, a stronger result is also true. We adopt the convention that

$$(c^T b)(x) = c(x)^T b(x), \qquad (ac)(x) = a(x)c(x), \qquad (c^T a c)(x) = c(x)^T a(x) c(x).$$

Lemma 5.4. We have

$$E^h(t) = \exp\left(\int_0^t c(X(s))\,dX(s) - \int_0^t (c^T b)(X(s))\,ds - \frac{1}{2}\int_0^t (c^T a c)(X(s))\,ds\right), \qquad (5.22)$$
where

$$\int_0^t c(X(s))\,dX(s) = \sum_{j=1}^d \int_0^t c_j(X_j(s))\,dX_j(s).$$

Proof. From (5.18) and (5.21) we have

$$\frac{Ah(x)}{h(x)} = \frac{1}{2}\sum_{i=1}^d a_{ii}(x)c'_i(x_i) + \frac{1}{2}(c^T a c)(x) + (c^T b)(x), \qquad (5.23)$$
where $c'_i(t) = \frac{d}{dz}c_i(z)\big|_{z=t}$. Then

$$E^h(t) = \exp\left(\int_{X(0)}^{X(t)} c(x)\,dx\right)\exp\left(-\frac{1}{2}\int_0^t \sum_{i=1}^d a_{ii}(X(s))c'_i(X_i(s))\,ds\right) \times \exp\left(-\frac{1}{2}\int_0^t (c^T a c)(X(s))\,ds - \int_0^t (c^T b)(X(s))\,ds\right).$$
Thus it suffices to show that

$$\sum_{i=1}^d \int_{X_i(0)}^{X_i(t)} c_i(z)\,dz = \sum_{i=1}^d \int_0^t c_i(X_i(s))\,dX_i(s) + \frac{1}{2}\int_0^t \sum_{i=1}^d a_{ii}(X(s))c'_i(X_i(s))\,ds. \qquad (5.24)$$
From Stroock (1987, p. 75), we have

$$d\left[X_i(t) - \int_0^t b_i(X(s))\,ds\right]_t = a_{ii}(X(t))\,dt.$$
Now (5.24) follows from Itô's formula applied to the function $g(x) = \int_{x_0}^x c_i(z)\,dz$:

$$\int_{X_i(0)}^{X_i(t)} c_i(z)\,dz = \int_0^t c_i(X_i(s))\,dX_i(s) + \frac{1}{2}\int_0^t c'_i(X_i(s))\,d[X_i(s)]_s$$
$$= \int_0^t c_i(X_i(s))\,dX_i(s) + \frac{1}{2}\int_0^t c'_i(X_i(s))\,d\left[X_i(s) - \int_0^s b_i(X(v))\,dv\right]_s \qquad (5.25)$$
$$= \int_0^t c_i(X_i(s))\,dX_i(s) + \frac{1}{2}\int_0^t a_{ii}(X(s))c'_i(X_i(s))\,ds,$$
where (5.25) follows from the fact that $\int_0^t b_i(X(s))\,ds$ has finite variation ($b_i \in \mathcal{C}(\mathbb{R}^d)$). □

Note that, by (5.17), (5.19) and Stroock (1987, p. 75), we have
$$\left[\int_0^t c(X(s))\,dX(s) - \int_0^t c(X(s))^T b(X(s))\,ds\right]_t = \int_0^t (c^T a c)(X(s))\,ds \le Kt, \qquad (5.26)$$
for some constant $K$. Thus, from Rogers and Williams (1987, Theorem 37.8, p. 77), the process $E^h(t)$ is a true martingale, that is, $h$ is a good function.

As an illustration of Theorem 4.2, we give another proof of the Cameron–Martin–Girsanov theorem. In this theorem, we consider the new probability measure $\tilde{P}$ defined by the martingale $d\tilde{P}_t/dP_t = E^h(t)$, where $h$ is a good function of the form (5.20). We have
$$(\tilde{A}f)(x) = \frac{1}{2}\sum_{i,j=1}^d a_{ij}(x)\frac{\partial^2 f(x)}{\partial x_i \partial x_j} + \sum_{i=1}^d \left(b_i(x) + \sum_{j=1}^d a_{ij}(x)\frac{1}{h(x)}\frac{\partial h(x)}{\partial x_j}\right)\frac{\partial f(x)}{\partial x_i}. \qquad (5.27)$$

Theorem 5.5. Suppose that $h$ is a good function of the form (5.20). On the new probability space $(C_{\mathbb{R}^d}[0,\infty), \mathbf{F}, \tilde{P})$, the process $X(t)$ is a diffusion process with parameters
$$\tilde{a} = a, \qquad \tilde{b} = b + ac.$$

Proof. The assumptions of Theorem 4.2 are satisfied, that is, $\mathcal{D}^h_A = \mathcal{D}(A) = \mathcal{D}^{h^{-1}}_{\tilde{A}} = \mathcal{D}(\tilde{A}) = \mathcal{C}^2(\mathbb{R}^d)$, and we deduce $\tilde{a}$ and $\tilde{b}$ from the form of the generator given in (5.27). □

Example 5.1. Take $d = 1$ and $c_i(x) = -\alpha$. In this case a diffusion process with infinitesimal variance function $a(x)$ and infinitesimal drift function $b(x)$ has new parameters $\tilde{a}(x) = a(x)$ and $\tilde{b}(x) = b(x) - \alpha a(x)$ after the exponential change of measure. In particular, if we consider Brownian motion $B(t)$, then after the exponential change of measure the process $B(t)$ is Brownian motion with drift $-\alpha$ (see Itô and Watanabe 1965).

Example 5.2. The second example concerns the so-called Gauss–Markov process $X(t)$ (see Karatzas and Shreve 1988, p. 355), which is the solution of the stochastic differential equation
$$dX(t) = mX(t)\,dt + \sigma\,dB(t),$$
where $m, \sigma$ are $d \times d$ matrices and $B(t)$ is $d$-dimensional Brownian motion. We assume that all eigenvalues of $m$ have strictly negative real parts. Then, by Stroock (1987, Theorem 2.6, p. 91), the process $X(t)$ is a diffusion with
$$a(x) = \sigma\sigma^T, \qquad b(x) = mx,$$
where $x = (x_1, \dots, x_d)^T$. After the exponential change of measure, the process $X(t)$ is a diffusion with parameters $\tilde{a}(x) = a(x)$ and $\tilde{b}(x) = mx + ac(x)$, where $c(x) = (c_1(x_1), \dots, c_d(x_d))^T$ and $c_i \in \mathcal{C}_b(\mathbb{R})$. In particular, when $d = 1$, $m = -\alpha$ and $\sigma = \sqrt{a} \in \mathbb{R}_+$, the Gauss–Markov process is a one-dimensional Ornstein–Uhlenbeck process. Taking $c(x) = c \in \mathbb{R}$, after the exponential change of measure, the process $X(t)$ is the diffusion with infinitesimal drift changed from $-\alpha x$ to $-\alpha x + ac$; see Kulkarni and Rolski (1994).
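Example 5.1 is easy to check by simulation: for $d = 1$, $a \equiv 1$ and $c \equiv -\alpha$, the good function is $h(x) = e^{-\alpha x}$ and, by (5.22), $E^h(t) = \exp(-\alpha B(t) - \alpha^2 t/2)$, so reweighting standard Brownian motion by $E^h(t)$ should reproduce the mean $-\alpha t$ of Brownian motion with drift $-\alpha$. A Monte Carlo sketch (the parameter values are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, t, n = 0.5, 1.0, 400_000

# c = -alpha gives h(x) = exp(-alpha*x) and E^h(t) = exp(-alpha*B(t) - alpha^2 t/2)
B = rng.normal(0.0, np.sqrt(t), size=n)            # B(t) under P (standard BM)
weight = np.exp(-alpha * B - 0.5 * alpha**2 * t)   # likelihood ratio E^h(t)

mean_tilted = np.mean(B * weight)                  # estimates E~[B(t)]

assert abs(np.mean(weight) - 1.0) < 0.01           # E^h(t) has mean one
assert abs(mean_tilted - (-alpha * t)) < 0.02      # under P~, B(t) has drift -alpha
```

The same reweighting estimates any functional of the drifted motion from samples of the driftless one, which is exactly how exponential change of measure is used in importance-sampling simulation.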

5.4. Markov additive processes

Following Asmussen and Kella (2000), we consider a Markov additive process $X(t)$, where $X(t) = X^{(1)}(t) + X^{(2)}(t)$, and the independent processes $X^{(1)}(t)$ and $X^{(2)}(t)$ are specified by the characteristics $q_{ij}, G_i, \sigma_i, a_i, \nu_i(dx)$, which will be defined below. Let $J(t)$ be a right-continuous, irreducible, finite state space CTMC, with $\mathcal{I} = \{1, \dots, N\}$ and with the intensity matrix $Q = (q_{ij})$. We denote the jump times of the process $J(t)$ by $\{T_i\}$ (with $T_0 = 0$). Let $\{U^i_n\}$ be i.i.d. random variables with distribution function $G_i(\cdot)$. Define the jump process by

$$X^{(1)}(t) = \sum_{i \in \mathcal{I}} \sum_{n \ge 1} U^i_n\,1_{\{J(T_{n-1}) = i,\ T_n \le t\}}.$$
For each $i \in \mathcal{I}$, let $X^i(t)$ be a Lévy process such that
$$\log E e^{\alpha X^i(1)} = \psi_i(\alpha) = a_i\alpha + \frac{\sigma_i^2\alpha^2}{2} + \int_{-\infty}^{\infty} \big(e^{\alpha y} - 1 - \alpha y 1_{[0,1]}(|y|)\big)\,\nu_i(dy),$$
where the integral above is assumed to be finite. Note that the process $X^i(t)$ has the Lévy characteristics $(a_i t, \sigma_i^2 t, \nu_i(dx))$. By $X^{(2)}(t)$ we denote the process which behaves in law like $X^i(t)$ when $J(t) = i$.

Note that $(X^{(1)}(t), J(t))$ is a PDMP with extended generator
$$(A^{(1)}f)(x, i) = \sum_{k=1}^N q_{ik} \int_{-\infty}^{\infty} \big(f(x+y, k) - f(x, i)\big)\,dG_i(y) = \sum_{k=1}^N q_{ik} \int_{-\infty}^{\infty} f(x+y, k)\,dG_i(y), \qquad (5.28)$$
and domain $\mathcal{D}(A^{(1)})$ consisting of absolutely continuous functions for which the above integrals are finite. By Çinlar et al. (1980, Theorem 7.14, p. 211) and Jacod and Shiryaev (1987, Theorem 2.42, p. 86), the process $X^i(t)$ has extended generator

\[ \mathbf{A}^i f(x) = a_i f'(x) + \frac{\sigma_i^2}{2} f''(x) + \int_{-\infty}^{\infty} \bigl(f(x + y) - f(x) - \mathbf{1}_{[0,1]}(|y|)\, y f'(x)\bigr)\, \nu_i(\mathrm{d}y), \]
with domain $\mathcal{C}^2(\mathbb{R}) \subseteq \mathcal{D}(\mathbf{A}^i)$. By Palmowski (2002, Theorem 2.1), the process $(X^{(2)}(t), X^{(1)}(t), J(t))$ has extended generator $\hat{\mathbf{A}}$ such that, for $h(x, y, i) = g(x) p(y, i)$,

\[ \hat{\mathbf{A}}(gp)(x, y, i) = g(x)\, \mathbf{A}^{(1)} p(y, i) + p(y, i)\, \mathbf{A}^i g(x), \]
for $g \in \mathcal{C}^2(\mathbb{R}) \cap \mathcal{D}(\mathbf{A}^i)$ and $p \in \mathcal{D}(\mathbf{A}^{(1)})$ (that is, $\hat{\mathbf{A}} = \mathbf{A}^{(1)} \oplus \mathbf{A}^i$ and $\mathcal{D}(\mathbf{A}^{(1)}) \times \mathcal{C}^2(\mathbb{R}) \subseteq \mathcal{D}(\hat{\mathbf{A}})$). Letting $\mathbf{Q} \circ \hat{\mathbf{G}}(\alpha) = (q_{ij} \hat{G}_i(\alpha))$, where $\hat{G}_i(\alpha) = \mathbb{E} \exp(\alpha U^i)$ is assumed to be finite, we define
\[ \mathbf{F}(\alpha) = \mathbf{Q} \circ \hat{\mathbf{G}}(\alpha) + \mathrm{diag}(\psi_1(\alpha), \ldots, \psi_N(\alpha)). \]
The Perron–Frobenius eigenvalue and the corresponding right eigenvector of $\mathbf{F}(\alpha)$ are denoted by $\lambda(\alpha)$ and $\mathbf{h}(\alpha)$, respectively. Note that $\lambda(\alpha)$ is real and $\mathbf{h}(\alpha)$ is positive. As a good function we propose $h(x, y, i) = g(x) p(y, i)$, where $g(x) = e^{\alpha x}$ and $p(y, i) = e^{\alpha y} h_i(\alpha)$. Then
\[ (\hat{\mathbf{A}} h)(x, y, i) = e^{\alpha(x+y)} \bigl(\psi_i(\alpha) h_i(\alpha) + (\mathbf{Q} \circ \hat{\mathbf{G}}(\alpha)\, \mathbf{h}(\alpha))_i\bigr) = \lambda(\alpha)\, h(x, y, i). \tag{5.29} \]

Proposition 5.6. The process

\[ E^h(t) = e^{\alpha X(t) - \lambda(\alpha) t}\, \frac{h_{J(t)}(\alpha)}{h_{J(0)}(\alpha)}, \qquad t \ge 0, \tag{5.30} \]
is a mean-one martingale. Furthermore, the process $X(t)$, after the exponential change of measure, has the following characteristics:

\[ \tilde{q}_{ij} = q_{ij}\, \hat{G}_i(\alpha)\, \frac{h_j(\alpha)}{h_i(\alpha)}, \]

\[ \tilde{G}_i(\mathrm{d}x) = \frac{e^{\alpha x}}{\hat{G}_i(\alpha)}\, G_i(\mathrm{d}x), \]

\[ \tilde{\sigma}_i = \sigma_i, \tag{5.31} \]

\[ \tilde{a}_i = a_i + \alpha \sigma_i^2 - \int_{-\infty}^{\infty} y\, \mathbf{1}_{[0,1]}(|y|)\, (1 - e^{\alpha y})\, \nu_i(\mathrm{d}y), \]
\[ \tilde{\nu}_i(\mathrm{d}x) = e^{\alpha x}\, \nu_i(\mathrm{d}x). \]

Proof. By Lemma 3.1, $E^h(t)$ is a local martingale. Following Asmussen and Kella (2000), we have $\mathbb{E}\, E^h(t) = 1$ for all $t \ge 0$ and hence $E^h(t)$ is a true martingale. To prove (5.31), we write
\[ \hat{\mathbf{A}}(fh)(x, y, i) = e^{\alpha(x+y)} h_i(\alpha) \sum_{k=1}^{N} \tilde{q}_{ik} \int_{-\infty}^{\infty} [\,f(x, y+z, k) - f(x, y, i)\,]\, \mathrm{d}\tilde{G}_i(z) + e^{\alpha(x+y)} h_i(\alpha)\, \tilde{\mathbf{A}}^i f(x, y, i) \]
\[ \qquad + e^{\alpha(x+y)} \sum_{k=1}^{N} q_{ik}\, f(x, y, i)\, h_k(\alpha)\, \hat{G}_i(\alpha) + e^{\alpha(x+y)} h_i(\alpha)\, f(x, y, i)\, \psi_i(\alpha), \]
which completes the proof in view of (5.29) and the formula $\tilde{\mathbf{A}} f = h^{-1}(\hat{\mathbf{A}}(fh) - f \hat{\mathbf{A}} h)$. □

For a one-dimensional Lévy process ($U^i_n \equiv 0$ and $N = 1$), taking $h(x, y, i) = h(y) = e^{\alpha y}$, $h_i(\alpha) = 1$ and $\lambda(\alpha) = \psi(\alpha)$, we obtain the well-known Wald martingale

\[ E^h(t) = e^{\alpha X(t) - \psi(\alpha) t}. \tag{5.32} \]
After the exponential change of measure, the Lévy process changes its Lévy characteristics from $(a, \sigma, \nu)$ to $(\tilde{a}, \sigma, \tilde{\nu})$, where

\[ \tilde{a} = a + \alpha \sigma^2 - \int_{-\infty}^{\infty} y\, \mathbf{1}_{[0,1]}(|y|)\, (1 - e^{\alpha y})\, \nu(\mathrm{d}y), \tag{5.33} \]
\[ \tilde{\nu}(\mathrm{d}x) = e^{\alpha x}\, \nu(\mathrm{d}x) \]
(see Küchler and Sørensen 1997, Proposition 2.1.3, p. 11). For a multidimensional Lévy process $(X^1(t), \ldots, X^d(t))$ we take $h(\mathbf{x}) = e^{\boldsymbol{\alpha}^{\mathrm{T}} \mathbf{x}}$, where $\boldsymbol{\alpha}, \mathbf{x} \in \mathbb{R}^d$. This yields the martingale $E^h(t) = \prod_{k=1}^{d} e^{\alpha_k X^k(t) - \psi_k(\alpha_k) t}$; for further generalizations of this result see Küchler and Sørensen (1997, Theorem A.10.1, p. 294), Jacod and Mémin (1976, Theorem 3.3, p. 13) and Jacod and Shiryaev (1987, Theorem 3.24, p. 159).
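For the simplest Lévy process, Brownian motion with drift (characteristics $(a, \sigma, 0)$, so $\psi(\alpha) = a\alpha + \sigma^2\alpha^2/2$), the Wald martingale (5.32) and the drift shift in (5.33) can be illustrated by simulation: reweighting by $e^{\alpha X(t) - \psi(\alpha)t}$ should average to one, and the reweighted mean of $X(t)$ should match the tilted drift $\tilde{a}t = (a + \alpha\sigma^2)t$. A Monte Carlo sketch in plain Python (parameter values, sample size and seed are arbitrary illustration choices):

```python
import math
import random

def wald_check(a, sigma, alpha, t, n, seed=0):
    """Monte Carlo check of the Wald martingale for X(t) = a*t + sigma*B(t).

    Returns (E[W], E[W*X(t)]) with W = exp(alpha*X(t) - psi(alpha)*t).
    By (5.32)-(5.33) one expects E[W] ~ 1 and
    E[W*X(t)] ~ (a + alpha*sigma^2)*t, the mean under the tilted drift.
    """
    psi = a * alpha + sigma**2 * alpha**2 / 2   # Levy exponent, no jump part
    rng = random.Random(seed)
    sw = swx = 0.0
    for _ in range(n):
        x = a * t + sigma * rng.gauss(0.0, math.sqrt(t))
        w = math.exp(alpha * x - psi * t)        # Wald martingale weight
        sw += w
        swx += w * x
    return sw / n, swx / n

mean_w, mean_wx = wald_check(a=0.2, sigma=1.0, alpha=0.4, t=1.0, n=200_000)
print(mean_w, mean_wx)  # mean_w near 1.0; mean_wx near (0.2 + 0.4)*1.0 = 0.6
```

The jump part of (5.33) could be checked the same way for a compound Poisson term, with $\tilde{\nu}(\mathrm{d}x) = e^{\alpha x}\nu(\mathrm{d}x)$ realized as an exponentially tilted jump distribution and intensity.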

Acknowledgements

The work for this paper was partially supported by KBN grants 5 PO3 A 021 20 for 2001–2003 (ZP) and 2 PO3 A 049 15 for 1998–2001 (TR). The final draft of the paper was written while TR was staying at the Department of Mathematical and Computing Sciences of the Tokyo Institute of Technology. TR wishes to thank the Department and Prof. Y. Takahashi for their hospitality. We are grateful to H. Schmidli, H.C. Gromoll and an associate editor for their comments.

References

Asmussen, S. (1994) Busy period analysis, rare events and transient behaviour in fluid models. J. Appl. Math. Stochastic Anal., 7(3), 269–299.
Asmussen, S. (1995) Stationary distribution for fluid models with or without Brownian noise. Stochastic Models, 11, 21–49.
Asmussen, S. (1998) Stochastic Simulation with a View towards Stochastic Processes, MaPhySto Lecture Notes 2. Aarhus: MaPhySto, University of Aarhus.
Asmussen, S. (2000) Ruin Probabilities. Singapore: World Scientific.
Asmussen, S. and Kella, O. (2000) Multi-dimensional martingale for Markov additive processes and its applications. Adv. Appl. Probab., 32, 376–380.
Çinlar, E., Jacod, J., Protter, P. and Sharpe, M.J. (1980) Semimartingales and Markov processes. Z. Wahrscheinlichkeitstheorie Verw. Geb., 54, 161–219.
Dassios, A. and Embrechts, P. (1989) Martingales and insurance risk. Stochastic Models, 5, 181–217.
Davis, M.H.A. (1993) Markov Models and Optimization. London: Chapman & Hall.
Dellacherie, C. and Meyer, P. (1982) Probabilities and Potential B. New York: North-Holland.
Doob, J.L. (1984) Classical Potential Theory and Its Probabilistic Counterpart. New York: Springer-Verlag.
Dynkin, E.B. (1965) Markov Processes, Vol. I. Berlin: Springer-Verlag.

Ethier, S.N. and Kurtz, T.G. (1986) Markov Processes: Characterization and Convergence. New York: Wiley.
Ethier, S.N. and Kurtz, T.G. (1993) Fleming–Viot processes in population genetics. SIAM J. Control Optim., 31, 345–386.
Fukushima, M. and Stroock, D. (1986) Reversibility of solutions to martingale problems. In G.-C. Rota (ed.), Probability, Statistical Mechanics, and Number Theory, Adv. Math. Suppl. Stud. 9, pp. 107–123. Orlando, FL: Academic Press.
Gautam, N., Kulkarni, V., Palmowski, Z. and Rolski, T. (1999) Bounds for fluid models driven by semi-Markov inputs. Probab. Engrg. Inform. Sci., 13, 429–475.
Itô, K. and Watanabe, S. (1965) Transformation of Markov processes by multiplicative functionals. Ann. Inst. Fourier (Grenoble), 15, 13–30.
Jacod, J. and Mémin, J. (1976) Caractéristiques locales et conditions de continuité absolue pour les semi-martingales. Z. Wahrscheinlichkeitstheorie Verw. Geb., 35, 1–37.
Jacod, J. and Shiryaev, A.N. (1987) Limit Theorems for Stochastic Processes. Berlin: Springer-Verlag.
Karatzas, I. and Shreve, S.E. (1988) Brownian Motion and Stochastic Calculus. New York: Springer-Verlag.
Küchler, U. and Sørensen, M. (1997) Exponential Families of Stochastic Processes. New York: Springer-Verlag.
Kulkarni, V. and Rolski, T. (1994) Fluid model driven by an Ornstein–Uhlenbeck process. Probab. Engrg. Inform. Sci., 8, 403–417.
Kunita, H. (1969) Absolute continuity of Markov processes and generators. Nagoya Math. J., 36, 1–26.
Kunita, H. and Watanabe, T. (1963) Notes on transformations of Markov processes connected with multiplicative functionals. Mem. Fac. Sci. Kyushu Univ. Ser. A, 17, 181–191.
Liptser, R.S. and Shiryaev, A.N. (1986) Teorija Martingalov. Moscow: Nauka.
Palmowski, Z. (2002) Lundberg inequalities in a diffusion environment. Insurance Math. Econom. To appear.
Palmowski, Z. and Rolski, T. (1996) A note on martingale inequalities for fluid models. Statist. Probab. Lett., 31, 13–21.
Palmowski, Z. and Rolski, T. (1998) Superposition of alternating on–off flows and a fluid model. Ann. Appl. Probab., 8, 524–541.
Parthasarathy, K.R. (1967) Probability Measures on Metric Spaces. New York: Academic Press.
Protter, P. (1990) Stochastic Integration and Differential Equations: A New Approach. New York: Springer-Verlag.
Revuz, D. and Yor, M. (1991) Continuous Martingales and Brownian Motion. Berlin: Springer-Verlag.
Ridder, A. (1996) Fast simulation of Markov fluid models. J. Appl. Probab., 33, 786–804.
Ridder, A. and Walrand, J. (1992) Some large deviations results in Markov fluid models. Probab. Engrg. Inform. Sci., 6, 543–560.
Rogers, L.C.G. and Williams, D. (1987) Diffusions, Markov Processes, and Martingales. Volume 2: Itô Calculus. New York: Wiley.
Rolski, T., Schmidli, H., Schmidt, V. and Teugels, J.L. (1999) Stochastic Processes for Insurance and Finance. New York: Wiley.
Schmidli, H. (1995) Cramér–Lundberg approximations for ruin functions of risk processes perturbed by diffusion. Insurance Math. Econom., 16, 135–149.
Schmidli, H. (1996) Lundberg inequalities for a Cox model with a piecewise constant intensity. J. Appl. Probab., 33, 196–210.
Schmidli, H. (1997a) An extension to the renewal theorem and an application to risk theory. Ann. Appl. Probab., 7, 121–133.
Schmidli, H. (1997b) Estimation of the Lundberg coefficient for a Markov modulated risk model. Scand. Actuar. J., 48–57.
Schwartz, A. and Weiss, A. (1995) Large Deviations for Performance Analysis. London: Chapman & Hall.
Stroock, D.W. (1987) Lectures on Stochastic Analysis: Diffusion Theory. Cambridge: Cambridge University Press.
Stroock, D.W. and Varadhan, S.R.S. (1979) Multidimensional Diffusion Processes. New York: Springer-Verlag.

Received March 2000 and revised April 2002