SIAM J. CONTROL Vol. 13, No. 5, August 1975

MARTINGALES ON JUMP PROCESSES. II: APPLICATIONS*

R. BOEL, P. VARAIYA AND E. WONG†

1. Introduction and summary. This paper is concerned with applying the theory of martingales of jump processes to various problems arising in communication and control. It parallels the approaches which have recently been developed for similar problems where the underlying process is Brownian motion. Indeed these approaches have recently been extended, starting with the work of Snyder [14], [16], [30] and Brémaud [6], [28], to the case of the Poisson process and its transformations. The paper can then be regarded as a sweeping generalization of this recent work. The paper can also be considered as an illustration of an abstract point of view and a set of instructions which must be followed to obtain certain concrete results in the areas of communication and control. It is hoped that this tutorial function will also be served.

Two results from the abstract theory of martingales form the basis of this abstract view. The first consists of the differentiation rule and the associated calculus of stochastic integrals for martingales and semimartingales [1], and its application to the so-called "exponentiation" formula [2]. The second result consists of the earlier Doob-Meyer decomposition theorem for supermartingales [3]. In order to follow the abstract view, one also needs a third set of results, the so-called "martingale representation" theorems for specific processes. These results form a bridge between the abstract theory and the concrete applications. The representation results used here have been obtained in [4], hence the paper can also be viewed as a continuation of that work.

The paper is organized in the following manner. In the next section are presented many definitions, notations and results from [1], [2], [3], [4] which will be used in the succeeding development. These preliminaries are certainly longer than can be considered proper, and are justified partly to serve the tutorial function, partly because there is no consensus of usage in the literature, and lastly because some of the published literature contains errors and inaccurate or misleading statements which can be exposed only within a carefully and completely developed context. Section 3 is concerned with showing the "global" existence of jump processes over a finite or infinite interval which satisfy certain local descriptions. Existence of such processes is obtained by transforming the laws of "known" processes by an absolutely continuous transformation. We also present a wide class of point processes which can be so transformed to yield solutions to prespecified local descriptions.

* Received by the editors December 21, 1973, and in revised form August 13, 1974.
† Department of Electrical Engineering and Computer Sciences and the Electronics Research Laboratory, University of California, Berkeley, California 94720. This research was supported by the National Science Foundation under Grant GK-10656X3 and by the Army Research Office, Durham, under Contract DAHC04-67-C-0046. The work of the first author was also supported by an ESRO-NASA International Fellowship.

Sufficient conditions are derived which guarantee when this technique is applicable. The question of uniqueness of the solutions is settled for a wide class of local descriptions. Section 4 deals with a specific problem in communication theory, namely the calculation of the likelihood ratio of a process which may be governed by one of two absolutely continuous laws. The techniques of §§ 3 and 4 are the same. Section 5 is concerned with estimating certain random variables or processes which are statistically related to an observed process. The emphasis here is on obtaining "recursive" filters. As special cases one obtains a "closed form" solution for some of the situations where the estimated process is Markovian. Applications to optimal control will be made in a future paper. Throughout, there has been an attempt to link up the results with those which have already appeared in the literature in as precise a manner as limitations of space permit. Any omissions are due to oversight of the authors.

2. Preliminaries and formulations. This section describes most of the results from the literature which are necessary to the sequel. Section 2.1 is definitional in nature. Sections 2.2-2.7 are taken mainly from [1], 2.8 is taken from [2], and the remainder is from [4].

2.1. Processes. Throughout, Ω is a fixed space, the sample space. The time interval of interest is R_+ = [0, ∞) unless specified otherwise. For each t let F_t be a σ-field of subsets of Ω. It will always be assumed that the family F_t, t ∈ R_+, is increasing, i.e., F_s ⊂ F_t for s ≤ t, and right-continuous, i.e., F_t = ∩_{s>t} F_s. Let F = ∨_t F_t be the smallest σ-field containing all the F_t. Let P be a probability measure on (Ω, F). Thus one has a family of probability spaces (Ω, F_t, P), t ∈ R_+. It will always be assumed that probability spaces are complete. Let (Z, 𝒵) be a measurable space. Let x: Ω × R_+ → Z be a function such that {ω | x_t(ω) ∈ B} ∈ F_t for all B ∈ 𝒵, t ∈ R_+. Then (x_t, F_t, P) is a (stochastic) process. Thus every process has attached to it a family (Ω, F_t, P), t ∈ R_+, of probability spaces. The same function x defines a different process if either the family F_t or the measure P is changed. When the context makes it clear we write (x_t, F_t) or (x_t, P) or x_t instead of (x_t, F_t, P). If (x_t, F_t, P) is a process, then so is (x_t, F_t^x, P^x), where F_t^x is the sub-σ-field of F_t generated by x_s, s ≤ t, and P^x is the restriction of P to F^x = ∨_t F_t^x. Two processes (x_t, F_t, P) and (y_t, F_t, P) are said to be modifications or versions of one another if x_t = y_t a.s. P for each t; the exceptional set {x_t ≠ y_t} may vary with t. They are said to be indistinguishable if there is a set N with P(N) = 0 such that for ω ∉ N, x_t(ω) = y_t(ω) for all t. Given (Ω, F, P), a random variable, or r.v., with values in (Z, 𝒵) is an F-measurable map from Ω into Z. Unless explicitly stated otherwise, all r.v.s and processes take values in (R ∪ {∞}, ℬ), where ℬ is the Borel field.

2.2. Stopping times. Consider a family (Ω, F_t, P). A nonnegative r.v. T is a stopping time, s.t., of the family F_t if

{T ≤ t} ∈ F_t for all t.

The s.t. T is said to be predictable if there exists an increasing sequence of s.t.s,

S_1 ≤ S_2 ≤ ⋯, such that

P{T = 0, or S_k < T for all k and lim_{k→∞} S_k = T} = 1.

The s.t. T is said to be totally inaccessible if T > 0 a.s. and if for every increasing sequence of s.t.s S_1 < S_2 < ⋯,

P{S_k < T for all k and lim_{k→∞} S_k = T < ∞} = 0.

2.3. Martingales and increasing processes. A process (m_t, F_t, P) is said to be a (uniformly integrable) martingale if the collection {m_t | t ∈ R_+} of r.v.s is uniformly integrable, and if E(m_t | F_s) = m_s a.s. for s ≤ t. The collection of all such martingales for which m_0 = 0 is denoted M^1 = M^1(F_t, P). (m_t, F_t, P) is said to be a local martingale if there is an increasing sequence of s.t.s S_k, with S_k → ∞ a.s., such that

(m_{t∧S_k} I_{{S_k>0}}, F_t, P) ∈ M^1 for each k.

The collection is denoted M^1_loc(F_t, P). (m_t, F_t, P) is a square integrable martingale if m ∈ M^1 and if sup_t E m_t^2 < ∞. The collection is denoted M^2(F_t, P), and the class M^2_loc(F_t, P) of locally square integrable martingales is defined analogously. It is obvious that M^2_loc ⊂ M^1_loc. Each m ∈ M^1_loc has a version whose sample paths are right-continuous and have left-hand limits. Clearly such a version is unique, i.e., unique modulo indistinguishability. It will always be assumed that local martingales have sample paths with this continuity property.

A process (a_t, F_t, P) is said to be increasing if a_0 = 0 a.s. and if its sample paths are right-continuous and nondecreasing. The collection is denoted 𝒜^+ = 𝒜^+(F_t, P), and 𝒜 = 𝒜(F_t, P) denotes the collection of processes expressible as the difference of two processes in 𝒜^+.

Members of 𝒜^{+1} (𝒜^1) are said to be integrable (or to have integrable variation). a ∈ 𝒜^+ is said to be locally integrable, a ∈ 𝒜^{+1}_loc, if there is an increasing sequence of s.t.s S_k → ∞ a.s. such that a_{t∧S_k} ∈ 𝒜^{+1} for all k; 𝒜^1_loc is defined from 𝒜^1 in the same way. A process (s_t, F_t, P) is a semimartingale, respectively local semimartingale, if it can be expressed as s_t = s_0 + m_t + a_t, where m_t ∈ M^1(F_t, P) and a_t ∈ 𝒜^1(F_t, P), respectively m_t ∈ M^1_loc(F_t, P) and a_t ∈ 𝒜^1_loc(F_t, P). The families are respectively denoted 𝒮(F_t, P) and 𝒮_loc(F_t, P).

2.4. Predictable processes. The family of all processes (y_t, F_t, P) which have left-continuous sample paths generates a σ-field 𝒫(F_t) ⊂ F ⊗ ℬ_+, where ℬ_+ is the Borel field of R_+, with respect to which the functions (ω, t) → y_t(ω) are measurable. 𝒫(F_t) is called the predictable σ-field, and every process (y_t, F_t, P) which is 𝒫(F_t)-measurable is called a predictable process. Note that if F_t^x ⊂ F_t, then 𝒫(F_t^x) ⊂ 𝒫(F_t). For (a_t, F_t, P) ∈ 𝒜^+ let

L^p(a) = {y | (y_t, F_t, P) is predictable and E ∫_0^∞ |y_s|^p da_s < ∞},

L^p_loc(a) = {y | there is a sequence of s.t.s S_k ↑ ∞ such that y I_{[0,S_k]} ∈ L^p(a) for each k}.

The integrals above are Stieltjes integrals.

2.5. Quadratic variation. Two martingales m, n in M^2_loc are orthogonal if their product m_t n_t ∈ M^1_loc. m ∈ M^2_loc is continuous if its sample paths are continuous; it is said to be discontinuous if it is orthogonal to every continuous martingale. Every m ∈ M^2_loc has a unique decomposition m_t = m_t^c + m_t^d such that m^c is continuous and m^d is discontinuous. Clearly if m_t ∈ M^1_loc is continuous, then it is in M^2_loc. To every pair m, n in M^2_loc is associated a unique predictable process, denoted ⟨m, n⟩_t or (⟨m_t, n_t⟩, F_t, P), such that ⟨m, n⟩_t ∈ 𝒜_loc and (m_t n_t − ⟨m, n⟩_t) ∈ M^1_loc(F_t, P). ⟨m, n⟩ is called the predictable quadratic covariation of m, n. For m ∈ M^2_loc, ⟨m⟩_t = ⟨m, m⟩_t is the predictable quadratic variation of m. Note that in general ⟨m, n⟩ depends crucially upon the family (F_t, P). If m, n in M^2_loc have the decompositions m = m^c + m^d, n = n^c + n^d, then the process

[m, n]_t = ⟨m^c, n^c⟩_t + Σ_{s≤t} Δm_s Δn_s,

where Δm_s = m_s − m_{s−}, Δn_s = n_s − n_{s−}, is called the quadratic covariation of m, n, and [m]_t = [m, m]_t is the quadratic variation of m_t. It turns out that

[m, n]_t − ⟨m, n⟩_t ∈ M^1_loc(F_t, P),

so that if, furthermore, m, n are in M^2, then [m, n]_t − ⟨m, n⟩_t ∈ M^1.

2.6. Stochastic integration. If m_t ∈ M^2_loc(F_t, P) and φ_t ∈ L^2_loc(⟨m⟩_t), then φ_t ∈ L^1_loc(⟨m, n⟩_t) for all n_t ∈ M^2_loc(F_t, P) and there is a unique process, denoted (φ ∘ m)_t ∈ M^2_loc(F_t, P), which satisfies

(2.1) ⟨φ ∘ m, n⟩_t = ∫_0^t φ_s d⟨m, n⟩_s for all n_t ∈ M^2_loc.

The integral on the right is a Stieltjes integral. If m_t ∈ M^1_loc, then one cannot define a stochastic integral in this way. Two other possibilities are open.

If m_t = m_t^c + m_t^d ∈ M^1_loc(F_t, P), if m_t^d ∈ 𝒜_loc(F_t, P)¹ and if φ_t ∈ L^1_loc(m_t^d), then the process

(2.2) (φ ∘ m)_t = (φ ∘ m^c)_t + ∫_0^t φ_s dm_s^d ∈ M^1_loc(F_t, P),

where (φ ∘ m^c)_t is defined as in (2.1) whereas the second integral is a Stieltjes integral. Finally, if m_t ∈ M^1_loc and if (φ_t, F_t, P) is a locally bounded² predictable process, then there exists a unique process (φ ∘ m)_t ∈ M^1_loc which satisfies

(2.3) [φ ∘ m, n]_t = ∫_0^t φ_s d[m, n]_s for all n_t ∈ M^1_loc.

The integral on the right is not in general a Stieltjes integral unless [m, n]_t ∈ 𝒜_loc. The precise interpretation of this integral is not given here since it is seldom used below. For details see [1]. The process (φ ∘ m)_t is called the stochastic integral of φ with respect to m. Note that if (φ ∘ m)_t makes sense according to more than one of the three possibilities (2.1), (2.2) or (2.3), then the resulting stochastic integrals coincide.

2.7. Differentiation formula. Let s_t = s_0 + m_t + a_t ∈ 𝒮_loc(F_t, P). The decomposition is not unique. If s_t = s_0 + m_t' + a_t' is another decomposition, then the continuous parts m^c, m'^c of the local martingales are indistinguishable. This unique continuous local martingale is denoted s_t^c. Let s_t = (s_t^1, ⋯, s_t^n) be a process with values in R^n such that s_t^i ∈ 𝒮_loc(F_t, P), i = 1, ⋯, n. Let F: R^n → R be a twice continuously differentiable function. Then the following differentiation formula holds:

F(s_t) = F(s_0) + Σ_{i=1}^n ∫_0^t (∂F/∂x_i)(s_{u−}) ds_u^i + ½ Σ_{i,j=1}^n ∫_0^t (∂²F/∂x_i ∂x_j)(s_{u−}) d⟨s^{ic}, s^{jc}⟩_u
  + Σ_{u≤t} [F(s_u) − F(s_{u−}) − Σ_{i=1}^n (∂F/∂x_i)(s_{u−}) Δs_u^i].

As a special case one obtains the very useful "product" rule. Suppose m and n are in M^1_loc. Then (since m_0 = n_0 = 0), and recalling the definition of [m, n]_t,

m_t n_t = ∫_0^t m_{s−} dn_s + ∫_0^t n_{s−} dm_s + [m, n]_t.
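Because the product rule holds path by path, it can be checked exactly on a simulated path. The following minimal Python sketch does this for the simplest discontinuous martingale, m_t = N_t − λt built from a rate-λ Poisson path; the rate λ, the horizon T and the choice n = m are illustrative assumptions, not taken from the paper. For this m one has [m]_T = N_T, and the identity m_T² = 2∫_0^T m_{s−} dm_s + [m]_T should hold up to floating-point round-off.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, T = 2.0, 5.0

# One Poisson path on [0, T]; jump times are kept exactly, so the identity can be
# verified without any time-discretisation error.
n_jumps = rng.poisson(lam * T)
jump_times = np.sort(rng.uniform(0.0, T, size=n_jumps))

def N(t):
    """Counting path N_t (number of jumps in [0, t])."""
    return int(np.searchsorted(jump_times, t, side="right"))

m_T = N(T) - lam * T                       # m_t = N_t - lam*t, a martingale with m_0 = 0

# Stieltjes integral int_0^T m_{s-} dm_s, split into its jump part and its drift part.
jump_part = sum((N(t) - 1) - lam * t for t in jump_times)    # sum of m_{s-} over jump times

grid = np.concatenate(([0.0], jump_times, [T]))
drift_part = 0.0
for a, b in zip(grid[:-1], grid[1:]):      # N_s is constant (= N(a)) on each interval (a, b)
    drift_part += N(a) * (b - a) - lam * (b**2 - a**2) / 2.0   # int_a^b (N_s - lam*s) ds
drift_part *= lam                          # dm_s contributes -lam * m_{s-} ds between jumps

stoch_int = jump_part - drift_part
print("m_T^2                :", m_T**2)
print("2*int m_- dm + [m]_T :", 2.0 * stoch_int + N(T))      # [m]_T = N_T (unit jumps)
```

The two printed numbers agree to machine precision, which is exactly the content of the product rule applied with n = m.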

2.8. The exponentiation formula. Let s_t ∈ 𝒮_loc(F_t, P) with s_0 = 0. Then there is a unique process y_t ∈ 𝒮_loc(F_t, P) which satisfies the equation

y_t = y_0 + ∫_0^t y_{s−} ds_s, t ≥ 0,

¹ This is a nontrivial restriction on m. It holds for the discontinuous martingales to be introduced in 2.9 below.
² φ_t is locally bounded if there is an increasing sequence of s.t.s S_k ↑ ∞ such that the process φ_{t∧S_k} I_{{S_k>0}} is bounded for all k. Note that if φ_t is a right-continuous process having left-hand limits, then the process φ_{t−} is locally bounded.

for a prespecified F_0-measurable y_0, and y_t is given explicitly by

y_t = y_0 exp(s_t − ½⟨s^c, s^c⟩_t) ∏_{u≤t} (1 + Δs_u) e^{−Δs_u},

where the second (product) term converges a.s. y_t is called the exponential of s_t and is sometimes denoted y_t = ℰ(s_t). Evidently ℰ(s_t) ≥ 0 a.s. if y_0 ≥ 0 a.s. and if 1 + Δs_u ≥ 0 a.s. If, in addition, m_0 ≥ 0 and m_t − m_0 ∈ M^1_loc(F_t, P), then (ℰ(m_t), F_t, P) is a supermartingale, i.e., E(ℰ(m_t) | F_s) ≤ ℰ(m_s), s ≤ t, and so in particular E(ℰ(m_t)) ≤ E(ℰ(m_0)), t ≥ 0. Finally, if m_t ∈ M^1 is bounded, then ℰ(m_t) is a martingale.

2.9. The fundamental jump process. Let (Ω, F_t, P) be a family of spaces and

let (x_t, F_t, P) be a process with values in (Z, 𝒵) such that all the sample paths of x are piecewise constant and have only a finite number of discontinuities in every finite interval, and such that the sample paths are right-continuous, i.e., for all ω, t there is ε_0 > 0 such that x_t(ω) = x_{t+ε}(ω) for 0 ≤ ε ≤ ε_0. Let T_n, n = 0, 1, ⋯, denote the jump times of the process, defined inductively by T_0 = 0 and

T_{n+1}(ω) = inf {t | t > T_n(ω), x_t(ω) ≠ x_{T_n}(ω)}, n ≥ 0,

with T_{n+1}(ω) = ∞ if the set above is empty. (x_t, F_t, P) is a fundamental jump process, or a fundamental process, f.p., with values in (Z, 𝒵), if in addition,
(i) (Z, 𝒵) is a Blackwell space, and then it turns out that the jump times are s.t.s, and
(ii) the s.t.s T_n are totally inaccessible.
Evidently if (x_t, F_t, P) is a f.p., so is (x_t, F_t^x, P^x), where F_t^x is the sub-σ-field of F_t generated by x_s, s ≤ t. For each B ∈ 𝒵, let P(B, t)

be the number of jumps of x which occur prior to t and which end in the set B. Associated with P(B, t) are two unique increasing continuous processes P̃(B, t) ∈ 𝒜^+_loc(F_t, P) and P̃^x(B, t) ∈ 𝒜^+_loc(F_t^x, P) such that

Q(B, t) = P(B, t) − P̃(B, t) ∈ M^2_loc(F_t, P) and Q^x(B, t) = P(B, t) − P̃^x(B, t) ∈ M^2_loc(F_t^x, P).

Furthermore,

⟨Q(B_1, ·), Q(B_2, ·)⟩_t = P̃(B_1 ∩ B_2, t), and

⟨Q^x(B_1, ·), Q^x(B_2, ·)⟩_t = P̃^x(B_1 ∩ B_2, t).

Finally, the functions P̃(·, t), P̃^x(·, t) are countably additive.
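A minimal Python sketch of these objects for the simplest case: jumps arrive at a constant rate λ with marks drawn independently and uniformly from Z = [0, 1], so that P̃(B, t) = λ|B|t is deterministic. The rate, the mark law and the set B are illustrative assumptions, not taken from the paper; the check is that Q(B, T) = P(B, T) − P̃(B, T) has mean zero, as a martingale starting at zero should.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T, n_paths = 3.0, 2.0, 20000
B = (0.0, 0.4)                                  # a mark set B inside Z = [0, 1]

def P_B_T():
    """One sample of P(B, T): jumps arrive at rate lam, marks are i.i.d. uniform on Z."""
    k = rng.poisson(lam * T)                    # P(Z, T), total number of jumps in [0, T]
    marks = rng.uniform(0.0, 1.0, size=k)
    return int(np.sum((marks >= B[0]) & (marks < B[1])))

samples = np.array([P_B_T() for _ in range(n_paths)])
compensator = lam * T * (B[1] - B[0])           # here P~(B, T) = lam * |B| * T, deterministic

print("empirical mean of P(B, T) :", samples.mean())
print("compensator P~(B, T)      :", compensator)
print("mean of Q(B, T)           :", samples.mean() - compensator)   # should be close to 0
```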

Note. The condition that the T_n are totally inaccessible is equivalent to the assertion that the P̃(B, t) are continuous. See [4] for alternative conditions.

A real-valued function f(z, t) = f(z, ω, t) is said to be predictable, and one writes f ∈ 𝒫̃(F_t), if it is measurable with respect to 𝒵 ⊗ F ⊗ ℬ_+ and if for each fixed z, f(z, ·, ·) is predictable in the sense of 2.4 above. The family 𝒫̃(F_t^x) is defined similarly. If f ∈ 𝒫̃(F_t), respectively 𝒫̃(F_t^x), we call f an F_t-predictable, respectively F_t^x-predictable, process. The following classes of predictable functions are used in the martingale representation results:

L^2(P̃^x) = {f ∈ 𝒫̃(F_t^x) | ‖f‖_2^2 = E ∫_Z ∫_{R_+} f^2(z, t) P̃^x(dz, dt) < ∞},

L^1(Q^x) = L^1(P) ∩ L^1(P̃^x). It turns out that ‖f‖_{L^1(P)} = ‖f‖_{L^1(P̃^x)}, hence L^1(P) = L^1(P̃^x) = L^1(Q^x).

L^2_loc(P̃^x) = {f ∈ 𝒫̃(F_t^x) | there exists a sequence of s.t.s S_k ↑ ∞ such that f(z, t) I_{{t≤S_k}} ∈ L^2(P̃^x) for each k}.

The classes L^1_loc(P), etc., are defined in a similar manner. Evidently, L^1_loc(Q^x) = L^1_loc(P) ∩ L^1_loc(P̃^x). Let f(z, ω, t) be a function which is measurable with respect to 𝒵 ⊗ F ⊗ ℬ_+ such that f(z, ·, t) is F_t^x-measurable for fixed z and such that

E ∫_Z ∫_{R_+} |f(z, t)| P̃^x(dz, dt) < ∞.

Then there exists an F_t^x-predictable function f̃ such that

E ∫_Z ∫_{R_+} |f − f̃| P̃^x(dz, dt) = 0.

This result follows easily from [22, Thm. 23]. The result will be used in §§ 4, 5 in the following context: let f ∈ 𝒫̃(F_t) and let f̂(z, t) = E(f(z, t) | F_t^x); it can then be assumed without loss of generality that f̂ is F_t^x-predictable.

2.10. Representation of M^2(F_t^x). For each f ∈ L^2(Q^x) there exists a unique process (f ∘ Q^x)_t ∈ M^2(F_t^x) such that for all g ∈ L^2(Q^x), and α, β in R,

(αf + βg) ∘ Q^x ≡ α(f ∘ Q^x) + β(g ∘ Q^x),

⟨f ∘ Q^x, g ∘ Q^x⟩_t = ∫_Z ∫_0^t f(z, s) g(z, s) P̃^x(dz, ds).

Conversely, if m_t ∈ M^2(F_t^x), then there exists f ∈ L^2(Q^x) such that m_t = (f ∘ Q^x)_t. Similarly, m_t ∈ M^2_loc(F_t^x) if and only if there exists f ∈ L^2_loc(Q^x) such that m_t = (f ∘ Q^x)_t.

2.11. Representation of M^1_loc(F_t^x). If f ∈ L^1(P^x), then (f ∘ Q^x)_t ∈ M^1(F_t^x), where

(f ∘ Q^x)_t = ∫_Z ∫_0^t f(z, s) Q^x(dz, ds) = ∫_Z ∫_0^t f(z, s) P(dz, ds)

− ∫_Z ∫_0^t f(z, s) P̃^x(dz, ds),

the integrals on the right being Stieltjes integrals. Conversely, if m_t ∈ M^1(F_t^x), then there is f ∈ L^1(P^x) such that m_t = (f ∘ Q^x)_t. Finally, m_t ∈ M^1_loc(F_t^x) if and only if there is f ∈ L^1_loc(P^x) such that

m_t = (f ∘ Q^x)_t = ∫_Z ∫_0^t f(z, s)[P(dz, ds) − P̃^x(dz, ds)].

Remark 2.1. 1. If m_t ∈ M^1_loc(F_t^x) has continuous sample paths, then m_t ≡ 0. 2. If more than one representation above applies, then the representations coincide.

2.12. Local description of a fundamental process. Let (x_t, F_t, P) be a fundamental process with values in (Z, 𝒵), and consider the increasing processes P̃(B, t) and P̃^x(B, t). Let A(t) = P̃(Z, t), A^x(t) = P̃^x(Z, t). The countable additivity of these functions with respect to B implies that there exist predictable processes n(B, t) and n^x(B, t) such that for all B ∈ 𝒵,

P̃(B, t) = ∫_0^t n(B, s) A(ds),

P̃^x(B, t) = ∫_0^t n^x(B, s) A^x(ds).

Evidently it can be assumed that n(Z, s) = n^x(Z, s) = 1. The system {n(B, t), A(t)}, or {n(dz, t), A(dt)}, is analogous to a Lévy system for a Hunt process [5]. The system {n(dz, t), A(dt)} will be called an extrinsic local description of x, whereas {n^x(dz, t), A^x(dt)} is called the intrinsic local description of x, because of the following interpretation: the probability that x has a jump in [t, t + dt] given F_t (respectively F_t^x) is A(dt) + o(dt) (respectively A^x(dt) + o(dt)), while n(B, t) (respectively n^x(B, t)) is the probability that x_t ∈ B given F_t (F_t^x) and given that a jump occurs at t. For future reference we note the following trivial but important fact.

Fact. Let {n(B, t), A(t)} and {n^x(B, t), A^x(t)} be extrinsic and intrinsic local descriptions. Then for all B ∈ 𝒵 and 0 ≤ s ≤ t in R_+,

(2.4) E{∫_s^t n(B, u) A(du) | F_s^x} = E{∫_s^t n^x(B, u) A^x(du) | F_s^x} a.s.
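The distinction between the extrinsic and the intrinsic description is already visible for a counting process whose rate is a random variable λ that is F_0-measurable but not F_t^x-measurable (a "Cox" process). The Python sketch below uses a two-point prior on λ, an illustrative assumption not taken from the paper, and checks the unconditional consequence of (2.4) (take s = 0 and expectations): E ∫_0^T A(ds) = E[λT] and E ∫_0^T A^x(ds) = E ∫_0^T E(λ | F_s^x) ds must both equal E P(Z, T).

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_paths = 1.0, 5000
rates = np.array([1.0, 3.0])                    # lambda takes the values 1 or 3, each w.p. 1/2

def lam_hat(n_jumps, t):
    """E(lambda | F_t^x) for this two-point prior: Bayes' rule with Poisson likelihoods."""
    w = rates**n_jumps * np.exp(-rates * t)
    return float((w * rates).sum() / w.sum())

grid = np.linspace(0.0, T, 101)
dt = grid[1] - grid[0]
mean_jumps = mean_extrinsic = mean_intrinsic = 0.0
for _ in range(n_paths):
    lam = rng.choice(rates)
    k = rng.poisson(lam * T)
    times = np.sort(rng.uniform(0.0, T, size=k))
    mean_jumps += k
    mean_extrinsic += lam * T                   # int_0^T A(ds) = lambda * T
    # int_0^T A^x(ds) = int_0^T E(lambda | F_s^x) ds, by a left Riemann sum on the grid
    vals = [lam_hat(int(np.searchsorted(times, s, side="right")), s) for s in grid[:-1]]
    mean_intrinsic += float(np.sum(vals)) * dt

print("E P(Z, T)                 :", mean_jumps / n_paths)
print("E of int_0^T A(ds)        :", mean_extrinsic / n_paths)
print("E of int_0^T A^x(ds)      :", mean_intrinsic / n_paths)   # all three should be close
```

Note that the extrinsic rate is the random constant λ itself, while the intrinsic rate E(λ | F_s^x) is updated as jumps are observed; only their integrated expectations coincide.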

2.13. Fundamental example. The results in the succeeding sections will be specialized to the following example, which covers many practical cases such as Poisson, counting, birth and death, and queueing processes. Let (x_t, F_t, P) be a fundamental process with values in (Z, 𝒵). Suppose that from each z ∈ Z the process can make at most n transitions, where n is a fixed finite number. Thus the transitions can be represented by the "state-transition" diagram of Fig. 1, where the transitions are labeled a_1, ⋯, a_n. Define the counting processes p_i(t), 1 ≤ i ≤ n, by

p_i(t) = number of transitions of type a_i made by the process x_t prior to t.

FIG. 1. State-transition diagram for fundamental example

Then there exist increasing processes p̃_i(t) and p̃_i^x(t) such that

q_i(t) = p_i(t) − p̃_i(t) ∈ M^2_loc(F_t, P) and q_i^x(t) = p_i(t) − p̃_i^x(t) ∈ M^2_loc(F_t^x, P).

Furthermore, m_t ∈ M^2(F_t^x, P), respectively M^2_loc(F_t^x, P), if and only if there exist f_i ∈ L^2(p̃_i^x), respectively L^2_loc(p̃_i^x), such that

(2.5) m_t = Σ_{i=1}^n (f_i ∘ q_i^x)_t,

and m_t ∈ M^1(F_t^x, P), respectively M^1_loc(F_t^x, P), if and only if there exist f_i ∈ L^1(p_i), respectively L^1_loc(p_i), such that

(2.6) m_t = Σ_{i=1}^n (f_i ∘ q_i^x)_t,

where the integral is a Stieltjes integral. We call (p̃_1, ⋯, p̃_n), respectively (p̃_1^x, ⋯, p̃_n^x), the extrinsic, respectively intrinsic, local description.

Remark 2.2. If (x_t, F_t, P) is a counting process,³ then Fig. 1 simplifies to Fig. 2 and there is only one transition. Hence in this case n = 1 in (2.5) and (2.6).
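For the counting-process case just described (n = 1; Fig. 2 follows), the representation (2.5)-(2.6) reduces to stochastic integrals against q_t = p_t − p̃_t. A minimal Python check, assuming a rate-λ Poisson process and the illustrative predictable integrand f_s = p_{s−} (both choices are ours, not the paper's), verifies that the resulting integral has zero mean, as a local martingale started at zero should.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, T, n_paths = 2.0, 3.0, 50000

def stoch_int():
    """One sample of int_0^T p_{s-} dq_s, with q_t = p_t - lam*t and p a rate-lam Poisson path."""
    k = rng.poisson(lam * T)
    times = np.sort(rng.uniform(0.0, T, size=k))
    jump_part = float(np.sum(np.arange(k)))     # p_{s-} equals 0, 1, ..., k-1 at the k jump times
    widths = np.diff(np.concatenate(([0.0], times, [T])))
    drift_part = lam * float(np.sum(np.arange(k + 1) * widths))   # lam * int_0^T p_s ds
    return jump_part - drift_part

vals = np.array([stoch_int() for _ in range(n_paths)])
print("mean of (f o q)_T :", vals.mean(), "+/-", vals.std() / np.sqrt(n_paths))   # ~ 0
```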

FIG. 2. State-transition diagram for counting processes

For this special case Brémaud [6] has obtained the representation for M^2_loc(F_t^x), whereas Davis [7] has extended it to the class M^1_loc(F_t^x). However, both these results were obtained only for the case where the law of (x_t, F_t^x, P) is mutually absolutely continuous with respect to the law of a standard Poisson process (see § 3).

3. Solutions to specified local descriptions by change of law. In § 3.1 we present a very useful technique for transforming one fundamental process (x_t, F_t, P) with a l.d. (local description) (n, A) into another process with a different prespecified l.d. The question of uniqueness of the solution is discussed in § 3.2. Section 3.3 consists of some sufficient conditions which guarantee that the technique is applicable. Finally, § 3.4 presents a class of processes which can be transformed into other processes with this technique.

Let (x_t, F_t^x, P) be a fundamental process with values in (Z, 𝒵) and with intrinsic local description (l.d.) (n^x(dz, t), A^x(dt)), so that

P̃^x(B, t) = ∫_B ∫_0^t n^x(dz, s) A^x(ds), t ∈ R_+.

Since we will only be dealing with the "intrinsic" σ-field F_t^x in this section, the superscript x will be omitted here. Hence F_t = F_t^x, P̃ = P̃^x, etc.

3.1. The transformation technique. Let P_1 be another probability measure on (Ω, F) and suppose that P_1 << P and E(L)⁴ = 1, where L = dP_1/dP. Let L_t = E(L | F_t). Then (L_t, F_t, P) is a uniformly integrable martingale, and lim_{t→∞} L_t = L a.s. and in L^1, by [3, remark after VI, Thm. 6].

PROPOSITION 3.1. (i) If L > 0 a.s. P, then for almost all ω, L_{t−}(ω) > 0 and L_t(ω) > 0 for all t.
(ii) Let

(3.1) T(ω) = inf {t | L_{t−}(ω) = 0 or L_t(ω) = 0}.

³ A counting process is an integer-valued process which starts at 0 and has unit jumps.
⁴ E, E_1 denote expectation with respect to P, P_1.

Then for almost all ω, L_t(ω) = 0 for t ≥ T(ω).
Proof. (i) Clearly L > 0 a.s. implies L_t > 0 a.s., and then the second part of the assertion follows from (ii), while the latter follows from [3, VI, Thm. 15]. ∎
Remarks 3.1. (i) If L > 0 a.s. P, then in fact P << P_1, i.e., the two measures are mutually absolutely continuous.
(ii) It is easy to give examples such that L_t > 0 for all t but P(L = 0) > 0.
For ε > 0 let

(3.2) T_ε(ω) = inf {t | L_{t−}(ω) ≤ ε}.

PROPOSITION 3.2. T_ε is a s.t. for all ε and lim_{ε→0} T_ε(ω) = T(ω) a.s. P.
Proof. The fact that T_ε is a s.t. follows from the fact that the process L_{t−} is left-continuous and from [3, IV, Thm. 52]. T_ε is clearly nondecreasing as ε decreases to 0. Let T_0(ω) = lim_{ε→0} T_ε(ω). Suppose T(ω) = ∞ and, per contra, T_0(ω) < ∞. Then there exists a sequence t_k increasing to t_0 < ∞ such that L_{t_k−}(ω) → 0. By left-continuity L_{t_0−}(ω) = 0 and so T(ω) ≤ t_0, a contradiction. Next suppose T(ω) < ∞. By Proposition 3.1, for almost all such ω, T_0(ω) ≤ T(ω). If T_0(ω) < T(ω), then a repetition of the previous argument ends in a contradiction. Once again T_0(ω) = T(ω). ∎
For ε > 0 let

L_t^ε(ω) = L_{t∧T_ε}(ω), t ∈ R_+.

Then L_t^ε − L_0 ∈ M^1(F_t, P) and L_{t−}^ε ≥ ε for all t. By 2.11 there is a predictable function ψ^ε(z, t) ∈ L^1_loc(P) such that

(3.3) L_t^ε = 1 + ∫_Z ∫_0^t ψ^ε(z, s) Q(dz, ds) = 1 + ∫_Z ∫_0^t ψ^ε(z, s)[P(dz, ds) − P̃(dz, ds)].

Since 1/L_{s−}^ε ≤ 1/ε, the process φ^ε(z, s) = ψ^ε(z, s)/L_{s−}^ε ∈ L^1_loc(P), and hence

(3.4) m^ε(t) = ∫_Z ∫_0^t φ^ε(z, s) Q(dz, ds) ∈ M^1_loc(F_t, P),

which upon substitution into (3.3) gives

L_t^ε = 1 + ∫_0^t L_{s−}^ε dm^ε(s).

Here it is being assumed that L_0 ≡ 1, which is indeed the case if F_0 is trivial. Otherwise, in the sequel, replace the martingale L_t by L_t/L_0.

By the exponentiation formula of 2.8,

(3.5) L_t^ε = ℰ(m^ε)_t = exp(m_t^ε − ½⟨m^{εc}, m^{εc}⟩_t) ∏_{s≤t} (1 + Δm_s^ε) e^{−Δm_s^ε}.

By the Remark in 2.11, m^{εc} ≡ 0, hence (3.5) simplifies to

(3.6) L_t^ε = exp(m_t^ε) ∏_{s≤t} (1 + Δm_s^ε) e^{−Δm_s^ε}.

Rewriting (3.4) as

(3.7) m^ε(t) = ∫_Z ∫_0^t φ^ε(z, s) P(dz, ds) − ∫_Z ∫_0^t φ^ε(z, s) P̃(dz, ds),

and acknowledging that the second integral has continuous sample paths (since P̃ is continuous), it follows that for almost all ω,

(3.8) Δm_s^ε(ω) = m_s^ε(ω) − m_{s−}^ε(ω) = ∫_Z φ^ε(z, s)(ω)[P(dz, s)(ω) − P(dz, s−)(ω)].

Also, since P(B, s)(ω) − P(B, s−)(ω) equals 1 or 0 depending upon whether or not x_{s−}(ω) ≠ x_s(ω) and x_s(ω) ∈ B, the term (1 + Δm_s^ε) in (3.6) can be written, whenever x_{s−}(ω) ≠ x_s(ω), as

(3.9) (1 + Δm_s^ε)(ω) = ∫_Z (1 + φ^ε(z, s)(ω))[P(dz, s)(ω) − P(dz, s−)(ω)].

From (3.8), (3.9) it follows respectively that

Σ_{s≤t} Δm_s^ε(ω) = Σ_{s≤t, x_{s−}≠x_s} φ^ε(x_s(ω), s),

∏_{s≤t} (1 + Δm_s^ε(ω)) = ∏_{s≤t, x_{s−}≠x_s} (1 + φ^ε(x_s(ω), s)),

which upon substitution, together with (3.7), into (3.6) yields, after some cancellation, the first interesting result:

(3.10) L_t^ε = ∏_{s≤t, x_{s−}≠x_s} [1 + φ^ε(x_s, s)] exp(− ∫_Z ∫_0^t φ^ε(z, s) P̃(dz, ds)).
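A minimal Monte Carlo check of (3.10) in Python, assuming the underlying process is a rate-λ Poisson process (so Z is a single point and P̃(dz, ds) = λ ds) and taking a bounded deterministic φ; the particular λ, T and φ are illustrative choices, not from the paper. The product-times-exponential on the right of (3.10) should then integrate to 1 under P, and E_P[L_T N_T] should give the mean jump count under the transformed law, namely ∫_0^T λ(1 + φ(s)) ds.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, T, n_paths = 1.0, 3.0, 100000
phi = lambda s: 0.8 * np.cos(s) ** 2                      # bounded, with 1 + phi(s) > 0

int_phi = 0.8 * lam * (T / 2.0 + np.sin(2.0 * T) / 4.0)   # int_0^T phi(s) * lam ds, closed form

L = np.empty(n_paths)
N = np.empty(n_paths)
for i in range(n_paths):
    k = rng.poisson(lam * T)
    times = rng.uniform(0.0, T, size=k)                   # jump times of the rate-lam path
    L[i] = np.prod(1.0 + phi(times)) * np.exp(-int_phi)   # formula (3.10) for this path
    N[i] = k

print("E_P[L_T]        :", L.mean())                      # ~ 1
print("E_P[L_T * N_T]  :", (L * N).mean())                # ~ mean number of jumps under P_1
print("target          :", lam * T + int_phi)             # int_0^T lam*(1 + phi(s)) ds
```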

Finally, let ε_k > 0, k = 1, 2, ⋯, be a sequence decreasing to 0, let S_0 = 0, S_k = T_{ε_k}, k = 1, 2, ⋯, and let

φ(z, s) = Σ_{k=1}^∞ φ^{ε_k}(z, s) I_{(S_{k−1}, S_k]}(s).

(φ is predictable since each φ^{ε_k} is predictable and each indicator I_{(S_{k−1}, S_k]} is predictable.)

THEOREM 3.1. There exists a predictable function φ(z, s) and an increasing sequence S_k of s.t.s converging to T such that φ_k(z, s) = φ(z, s) I_{{s≤S_k}} ∈ L^1_loc(P̃) and

(3.11) L_{t∧S_k} = ∏_{s≤t∧S_k, x_{s−}≠x_s} [1 + φ_k(x_s, s)] exp(− ∫_Z ∫_0^{t∧S_k} φ_k(z, s) P̃(dz, ds)).

The product on the right converges a.s., whereas the integral is a Stieltjes integral.

Remarks 3.2. (i) If L = dP_1/dP > 0 a.s., then T = ∞ a.s., so that the result above implies that φ ∈ L^1_loc(P̃). However, if this is not the case, then it is not true in general that φ ∈ L^1_loc. Some additional properties of φ are given in Theorem 3.2 below. Nevertheless, very loosely speaking, one can interpret (3.11) as

(3.12) L_t = ∏_{s≤t, x_{s−}≠x_s} [1 + φ(x_s, s)] exp(− ∫_Z ∫_0^t φ(z, s) P̃(dz, ds)) for t < T.

Indeed some such loose interpretation has to be used in understanding the corresponding formulas of [6], [23].
(ii) The characterization (3.11) has been derived earlier [23], [24] for the case where (x_t, F_t, P) is a Brownian motion. The techniques for the proof are identical except that in deriving (3.4) one observes that every martingale on a Brownian motion sample space is a stochastic integral of the Brownian motion (see [5]), and that all martingales are continuous, so that (3.5) simplifies to L_t = exp(m_t − ½⟨m^c, m^c⟩_t).
(iii) For the fundamental example the representation (3.11) becomes, using 2.12,

L_{t∧S_k} = ∏_{i=1}^n ∏_{s≤t∧S_k, (x_{s−}, x_s)∈a_i} [1 + φ_k^i(s)] exp(− Σ_{i=1}^n ∫_0^{t∧S_k} φ_k^i(s) p̃_i(ds))

for some predictable φ^i(s), 1 ≤ i ≤ n, such that φ_k^i ∈ L^1_loc(p̃_i). Here the notation (x_{s−}, x_s) ∈ a_i means that x makes a transition of type a_i at time s. If (x_t, F_t, P) is a Poisson process, then in the above n = 1 and, as is well known, p̃_1(ds) ≡ ds. For this case the result was first obtained by Brémaud [6] without the integrability condition on φ, and, for the case L > 0 a.s., by Van Schuppen [24] and by Davis [7], who proves in addition that then φ ∈ L^1_loc. Brémaud [6] also obtains this representation for the case where the example is a Markov chain.

We proceed to obtain the relations between the local descriptions. The next result is well known.

LEMMA 3.1. m_t ∈ M^1_loc(F_t, P_1) if and only if m_t L_t ∈ M^1_loc(F_t, P).

Proof. Let S_k ↑ ∞ be a sequence of s.t.s such that for each k,

(3.13) m_{t∧S_k} L_{t∧S_k} I_{{S_k>0}} ∈ M^1(P).

First of all, E lm, ^ skltsk 01 ELIm, ^ sk Itsk > o1 EL,^ slmt^sIts>ol (by (3.13))

Next for s __< t, g's) E(m, ^ sL, ^ sIs > oil Ex(m,^sIsk>o[s) E(L,^sk[) From (3.13) and the fact that L e '(P) the right-hand side simplifies to ms ^ skLs ^ SkI(sk O} ms ^ s I{s > ol, Ls ^ sk which proves the "if" part of the assertion. Conversely suppose that (3.14) mt^suI{su>o} /I(P1). It will be shown that for s 5 t, (3.15) E(L^sm^sItso[) L^skltso a.s.P. So let A . Then E(I,L ^ sm ^ sIts o E (Iamt ^ sklts o E(Iam^sItso (by (3.14)) E(L^sIAm^sItsk>o), which proves (3.15). THEOREM 3.2. Let (xt, , P) be a fundamental process with values in (Z, ) and with (intrinsic) 1.d. (n(dz, t), A(dt)). Let Px << P and let L E(dP/dPI) have the representation (3.11). Then (x,, , PI) has l.d. (n(dz, t), A(dt)), where (3.16) A(t) A(dt) and nx(dz, t) (1 + dp(z, t))n(dz, t). Furthermore, it can be assumed that (3.17) (1 + q) => 0 and (1 +)eLo(P with respect to probability measure PI. Proof. By 2.9 there exist continuous increasing processes Pa(B, t) l+oc(P) such that (3.18) Q,(B, t) P(B, t) P, (B, t) e oc(P1). Hence to show (3.16) it is equivalent to prove that (3.19) P-(B, t) (1 + ok(z, s))P(dz, ds). 1036 R. BOEL, P. VARAIYA AND E. WONG

Let S i, bi be as in Theorem 3.1, and let (3.20) Q](B, t) QI(B,

(3.21) m, P(B, t/ S)- (1 + dp(z,s))P(dz, ds).

It will be shown first that m e //oc(P). By Lemma 3.1 it is enough to show that

Since e Lo(P), therefore m is in o(P), also L, e (P) o(P). Hence one can apply the differential formula of 2.7 to obtain

(3.22) Lm ms_ dL + L_ dm + [A(mL)- m-AL- L Am]. From (3.21), L- dm L-P(B, ds) L_(1 + )P(dz, ds), and since A(mL)= (m- + Am)(L- + AL)- m-L-= m-AL + L-Am + AL Am, therefore the last term in (3.22) equals AL Am L-4(z, s)P(dz, ds) from (3.11) and (3.21). Substituting these relations back into (3.22) gives

Ltm m_ dL + L-(1 + dp)P(dz, ds) L_(1 + dp)P(dz, ds)

^s' =flm-dL+fff L_(1 + ch)Q(dz, ds), which is clearly in /goc(P). Hence m,eC{o(P,). Since Q] (B, t) 6 o(P), sub- tracting (3.21) from (3.20) implies that P(B, tAS)-fBfl (1 + c(z, s))P(dz, ds) e /'loe(P1). But this process has continuous sample paths, hence it must vanish, i.e., for almost all co (P measure)

Pl(B, ] Si) (1 / dp(z, s))P(dz, ds) for all t,

which proves (3.19) and thereby (3.16). The assertion contained in (3.17) follows from the fact that P has increasing sample paths and is in 'loc(P1).+ [-] Remark 3.3. (i) It has been shown that b e LIon(P) in the probability space (fL if, P1) and not in (YL if, P). MARTINGALES ON JUMP PROCESSES. II 1037

(ii) The transformation of l.d. for the case where (x_t, P) and (x_t, P_1) are both Hunt processes has been obtained in [5]. For this case the local description is called a Lévy system.
(iii) For the case of the fundamental example with l.d. (p̃_1, ⋯, p̃_n) under P, the l.d. under P_1 is ((1 + φ^1)p̃_1, ⋯, (1 + φ^n)p̃_n), where the φ^i are as in Remark 3.2 (iii).

Theorems 3.1, 3.2 allow us to obtain in certain cases processes which have certain specified l.d. from known processes with other descriptions. Put differently, we have a "synthesis" procedure for obtaining "global" solutions for a class of l.d.s. This is summarized in the following theorem, whose proof is now immediate.

THEOREM 3.3 (Existence of solutions to local descriptions). Let (x_t, F_t, P) be a fundamental process with values in (Z, 𝒵) and with intrinsic l.d. (n(dz, t), A(dt)). Let φ(z, s) be a predictable function such that

(3.23) φ(z, s) ∈ L^1_loc(P̃) and 1 + φ(z, s) ≥ 0,

(3.24) ∫_Ω L dP = 1,

where

(3.25) L_t = ∏_{s≤t, x_{s−}≠x_s} [1 + φ(x_s, s)] exp(− ∫_Z ∫_0^t φ(z, s) P̃(dz, ds)).

Then (x_t, F_t, P_1) is a fundamental process with l.d. (n_1(dz, t), A(dt)), where n_1(dz, t) = (1 + φ(z, t)) n(dz, t), and where the probability measure P_1 is given by dP_1 = L dP.

Remark 3.4. (i) This result is extremely useful in practice, since given an arbitrary l.d. there is no way to determine whether or not there exists a process with such a description. On the other hand, from the viewpoint of dynamical processes, a l.d. is much more natural and useful.
(ii) For the case of Brownian motion the result corresponding to the above was first obtained by Girsanov [9], and the technique was soon adopted in stochastic control problems [10], [11], [12], [13].
(iii) Brémaud [6] was the first to use this result, for the special case where (x_t, F_t, P) is a Poisson process, to obtain existence of several "self-exciting" counting processes (x_t, F_t, P_1). Snyder [14] and Rubin [15] introduce several jump processes through their l.d. However, they do not discuss whether or not there indeed exist processes with these descriptions. The result above can be used to solve this problem.
(iv) The condition (3.24) is a nontrivial restriction. For the Brownian motion case some sufficient conditions on the local description have been derived which guarantee (3.24). See [10], [11]. For our case similar conditions are given below in § 3.3.

(v) Theorem 3.3 does not address itself to the question of uniqueness of the solution. This question is discussed next. 3.2. Uniqueness of solutions with specified I.d. To discuss uniqueness of laws of solutions it is convenient to assume that f is the space of sample functions and that the process x on f is merely the "evaluation" process, i.e., xt(og) o9. The probability on f is then the law of the process. We will be dealing with two such processes, xt and yt, with the same set of sample functions but with different laws. Hence we must have two different probability spaces (fx, t, PX) and (fr, r p) where (fx, ) and (f, otr are copies of the same family (f, t). In particular, then, x and y are identical functions on [0, 1]. Since we are unable to obtain any interesting results for the infinite time interval, therefore in Theorem 3.4 and Corollary 3.1, [0, 1]. DEFINITION 3.1. An (intrinsic) l.d. (n, A) is said to have unique solutions if all fundamental processes (xt, t, P) with l.d. (n, A) have the same law. THEOREM 3.4. Let x, y, 0 <__ <__ 1, be fundamental processes with values in x y (Z, ) and on the (sample function) spaces (fU, t, px), (fy, pr) respectively. Let (n, A) be the l.d. ofx and ((1 + qS)n, A) the 1.d. ofy for some predictable function Suppose that (n, A) has unique solutions, and suppose that for each e > 0 there exist Z and k < such that (i) pX(B) >__ 1 e, where B {cnlx,(og) e Zfor 0 <= <= 1 },

(ii) (PY(dz, ds) + PY(dz, ds)) 09 B, -0-( o s) <= k for where these are Stieltjes integrals. Then - x(o)Pr(do) and dP dPr, where 1,=((oQr),) and ff=-/(l+b).

Proof. The process (m, firt pr) m, (z, s)Qr(dz, ds) (z, s)EPr(dz, ds) Pr(dz, ds)]

is, by (i) and (ii), well-defined as a Stieltjes integral. Hence m, o(r, Pr) so that it is in oc(Pr). Therefore by 2.9 there is a unique process (l, rt, Pr) where l, 8(m,). Let e, > 0 be a decreasing sequence converging to 0. Define the predictable functions

f (O, s) q,"(z o s) I. 0 MARTINGALES ON JUMP PROCESSES. II 1039

Because of(ii), the process m"t, rt, W) where

m, O"(z, s)Qr(dz, ds), is a bounded martingale. Hence by 2.9 the process (l', r, W) is a martingale, where I' (m'). Furthermore from the definition of

(3.26) 1'(09) l,(og) for all and 09 B,. By Theorem 3.3 the fundamental process (y, ffr, p,), where dP" (3.27) dpy 1], has a 1.d. ((1 + if")(1 + qS)n, A), and by the definition of if" and (1 + ")(1 + qS)(z, 09, t) for all t, 09 B. and z Z,. Since (n, A) has unique solutions, it follows that

>= 1-e (by(i)). From (3.26), (3.27) this implies f (o)W(do) > e and since > 0 is arbitrary, the assertion follows. Note. We must have W(B, t)(o9) W(B, 0(09) and PY(dz, dt)(og) [1 + b(z, t)] x n(dz, t)A(dt). COROLLARY 3.1 (uniqueness). Let (xt, , P) be a fundamental process with values in (Z, r) and with 1.d. (n, A) which has unique solutions. Let dp be a predictable function such that qb(z, s) L,oc(n ), E[((b Q)),)] 1. Suppose that clb satisfies (i) and (ii) of Theorem 3.4. Then the 1.d. ((1 + b)n, A)) has unique solutions. Proof. By Theorem 3.3 and the hypothesis there is a solution (yt,Yt, P1) with I.d. ((1 + qS)n, A), where dP g((ck Q'))dP. Suppose (Yt, rt, P2) is another solution with I.d. ((1 + b)n, A). By Theorem 3.4, dP lx dP 12 dP2, and since dP La dP it follows that > 0 a.s. Pi, i= 1,2. Evidently then P2=P. [-] 1040 R. BOEL, P. VARAIYA AND E. WONG
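As a loose numerical illustration of what "unique solutions" means here, one can check that the law produced by the change of measure of Theorem 3.3 agrees with the law of a process constructed directly with the same l.d. The Python sketch below assumes the simplest case: the base process is a standard Poisson process and φ ≡ c is constant, so the target l.d. is that of a Poisson process of rate 1 + c; the value of c is an arbitrary choice. The weighted probabilities E_P[L·1{N_1 = k}] are compared with the Poisson(1 + c) probabilities.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(5)
c, n_paths = 0.7, 400000

N = rng.poisson(1.0, size=n_paths)              # N_1 under P: standard Poisson process at t = 1
L = (1.0 + c) ** N * np.exp(-c)                 # L_1 of (3.25) with constant phi = c

print(" k   E_P[L 1{N_1=k}]   Poisson(1+c) pmf")
for k in range(6):
    via_change_of_law = float((L * (N == k)).mean())
    direct = (1.0 + c) ** k * exp(-(1.0 + c)) / factorial(k)
    print(f"{k:2d}   {via_change_of_law:12.4f}    {direct:12.4f}")
```

The two columns agree up to Monte Carlo error, i.e., the transformed law and the directly constructed law with the same l.d. assign the same probabilities.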

Remark 3.5. (i) Theorem 3.4 is inspired by [9, Lemma 7] and the development there suggests how the result can be generalized. (ii) Conditions (i) and (ii) of Theorem 3.4 are usually easy to verify in practice. x Consider a special case of the fundamental example where (xt, , px) is a Poisson process with rate 1. Then Z is the space of integers and y is then a counting process with local "intensity" rate + b(o, t). Suppose (o, t) is expressed explicitly as a function of the past of x, i.e., (o, t)= f(Xto,l(co), t). Then the conditions (i) and (ii) are satisfied if, for instance, there is an increasing function fo such that ]f(xto.q, t)l + < fo(S) when Ix,l S. I1 + f(xto,q, t)l For a similar condition in the Brownian motion case see [11]. (iii) Corollary 3.1 extends in an obvious way to the time interval R +. How- ever Theorem 3.4 does not. 3.3. Sufficient conditions. Let (x,, , P) be a fundamental process with values in (Z, )and with intrinsic description (n(dz, t), A(dt)). Let (z, t) Lo(P and define the process L, [0, 1], by (3.28) L I-I [1 + 4(G, s) exp d(z, s)(dz, ds) =/=: Then L also satisfies

(3.29) dLt Lt- dmt, where

(3.30) m, ( Q),.

We assume that + (z, t) >__ 0. Then Lt >= 0, Lt is a supermartingale and E(L1) 1. The three results below state conditions on which guarantee E(L)-- 1. The following assumption is made throughout this subsection. Assumption 3.1. There exists an increasing function #:R+ R+ such that (3.31) P(Z, t) <_ p(t) a.s. (Note that this implies P(B, t) <= p(t) for all B e ). PROPOSITION 3.3. Suppose that for some K < , (3.32) I1 <-_ K. Then E(L 1. MARTINGALES ON JUMP PROCESSES. II 1041 Proof. From (3.28), (3.31) and (3.32), L, =< (K + 1)exp Kp(t). Hence IL,-c/)(z, t)l: <= KZ(K + 1): exp 2Kg(t), so that E ]L,-4)(z, t)l:P(dz, dt) <= E K:(K + 1) exp 2K#(t)p(dt) < , which implies that L,_(z, t) e L(P). By 2.10, L is a square-integrable martingale, and in particular E(L)= E(Lo) PROPOSITION 3.4. Suppose that for some K < , (3.33) fzf (1 + (z, t))[ln (1 + (z, t))]P(dz, tit) K a.s. Then E(L) 1. Proof. Define the function " so that 0 otherwise, and let L be obtained from (3.28) by replacing with ". By Proposition 3.3, E(L]) 1 and it is clear that L] converges to L in probability. Hence by [3, II, Thm. 21], E(L) if and only if the set of r.v.s {L]ln l, 2, ...} is uniformly integrable. fine the probability measures P, by

() L(). dP

By Theorem 3.3, (xt,,P,), tel0, 1], is a fundamental process with 1.d. ((1 + ")n, A) and so the corresponding margingales are given by

(, t e(, 0 ,(, 0 e(, 0 ( + 4(z, s(, s.

Because is bounded and because of (3.31), Qe(P,). For later reference define e (P) by r 0n( + 4 ,, and note that

(3.34) (", "), [ln (1 + b")-]2(1 + ck")P(dz, ds). 1042 R. BOEL, P. VARAIYA AND E. WONG We are ready to show that {L] } is a uniformly integrable family. Fix M < . First,

L"l(co)P(dco) P,,{L" > M}. LnI>M Next, {L] > M}

(from (3.28))

exp In (1 + 4)")[P(dz, ds) (1 + ")P(dz, ds)]

exp [(1 + "). In (1 + b") c"]i(dz, ds) > M

= In (1 + ")[P(dz, ds)- (1 + ")P(dz, ds)] > } In M

U [(1 + 49 In (1 + 49 4](dz, ds) > In M =F UF say. So P,{L" > M} <= P,(F1) + P,(F2). From (3.33) it is immediate that P,(F2)= 0 for all sufficiently large M. On the other hand, F2 {" > 1/2 In M}, so by the Chebychev inequality,

4 P"(F2) -<- 2 ( ")1 dP. (In M) fta " 4 (by (3.34), (3.33)). (ln M);K It follows that for all n, P,{L] > M} 0 as M , i.e., {L"} is uniformly integrable. For the next proposition express P(dz, ds) n(dz, s)P(Z, ds) (see 2.11). PROPOSITION 3.5. Suppose that there exist > aM K, K' finite such that (3.35) fz (1 + (z, t))'n(dz, t) K + K'[P(Z, t) + P(Z, t)] a.s. and suppose that for all 0 < M < , (3.36) E exp IMP(Z, 1)] < . MARTINGALES ON JUMP PROCESSES. II 1043

Then for < 7 <-_ (3.37) sup EL < , t[O,1] in particular EL 1. Proof. If (3.37) is satisfied then by [8, II, Thm. 22] the family in 0 =< __< 1} is uniformly integrable and so by [8, VI, Thm. 6], L is a uniformly integrable martingale, hence E(L1) 1. For 0 1/2 7 > 1, define In (1 + dp)Q(dz, ds) + zf0[71n(1 + b) + P(dz, ds) g,(7) exp 74 P(dz, ds) First of all,

f(Y)g,(7) exp 7 In (1 / O)(dz, ds) 7( In (1 + c))P(dz, ds) L (by (3.28)). Next it can be checked by substitution in (3.28) that If(7)] is obtained from (3.28) by replacing (1 + ) with (1 + )r. Hence if 2< 2 so that (1 + )r Lo(P), then we must have E[(y)] for all t. Now by H61der's inequality, eL (e[()3)'/(e[g,()/.- ').- so that EL <_ (E[g,(7)] /(- 1))(- Next, [g(7)]/,- 1) =< exp + P(dz, ds) (since + b >= 0 implies -7q5 _-< 7) __< exp 72/t(t) + {K + K'(P(Z, s) + P(Z, s))}P(Z, ds)l (from (3.31), (3.35)) + K + #(t) + K'P(Z t) tt(t) (from (3.31)) 7 5- __< exp fl exp K'/(1)P(Z, 1) (for some constant fl). 1044 R. BOEL, P. VARAIYA AND E. WONG Hence, EL <= (exp fl)E[exp K'/(1)P(Z, 1)] and the result follows from (3.36). Remark 3.6. (i) Suppose (x, , P) is as in the fundamental example with corresponding increasing processes/(t), ...,/(t). Then Assumption 3.1 trans- lates into the following: there exists an increasing function A :R/ R/ such that L i(t) A(t)a.s. i=1 Similarly (3.35), (3.36) become:there exist > and K, K' such that (3.38) L (1 + ,(t)y <= K + K' L (Pi(t) + ff,(t)). i=1 i=1

(ii) Now suppose that (x_t, F_t, P) is a standard Poisson process. Then (3.38) becomes

(1 + φ(t))^α ≤ K + K'(x(t) + t).

Suppose that φ(t) = c(x(t−))^{1/α} for some α < 1. According to Feller [27, p. 452] a counting process x with rate [1 + φ(t)] has infinitely many jumps in a finite interval, so that it cannot be a fundamental process. Thus Proposition 3.5 is false if only α < 1 is required. We have been unable to resolve the case of "linear" growth, i.e., α = 1.
Remark 3.7. Propositions 3.3, 3.4, 3.5 are inspired by corresponding results in [6], [24], [28] respectively.

3.4. A class of Poisson-measure processes. In order to apply the transformation technique presented earlier one must begin with a fundamental process (with a known l.d.) whose existence is guaranteed. In this section we present a large class of such processes for which the increasing processes P̃(B, t) are deterministic.

Let (Z, 𝒵) be any Blackwell space and let μ be any positive measure on the space (Z × R_+, 𝒵 ⊗ ℬ_+), where ℬ_+ is the Borel field on R_+. Suppose that for all t < ∞, μ(Z × [0, t]) < ∞. Let Ω' be the space of all (nonnegative) integer-valued measures N on (Z × R_+, 𝒵 ⊗ ℬ_+). For each T ∈ R_+, let 𝒢_T' be the family of all subsets of Ω' of the form {N ∈ Ω' | N(C) ∈ K}, where C ∈ 𝒵 ⊗ ℬ([0, T]) and K ⊂ I_+, the set of nonnegative integers. Evidently 𝒢_T' is a σ-algebra on Ω'. Let 𝒢' = ∨_T 𝒢_T'.

Now, for each T define the set function P_T' on (Ω', 𝒢_T') by

P_T'(N(C) ∈ K) = Σ_{k∈K} [μ(C)^k / k!] e^{−μ(C)}.

Note that μ(C) < ∞ since C ⊂ Z × [0, T]. By [31], P_T' defines a probability measure on (Ω', 𝒢_T'). Furthermore, if C_1, C_2 are disjoint subsets of Z × [0, T], then the two random variables defined by N → N(C_1), N → N(C_2) are independent. Finally, the random variable N → N(C) has a Poisson distribution. For A ∈ 𝒵, consider the counting process P'(A, t), t ∈ R_+, defined on the family (Ω', 𝒢_{t∧T}', P_T') by

P'(A, t)(N) = N(A × [0, t ∧ T]).
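A minimal Python sketch of this construction, assuming the product form μ(dz, ds) = n(dz) λ ds with n uniform on Z = [0, 1]; the rate λ, the horizon T and the two sets C_1, C_2 are illustrative choices, not from the paper. The sketch draws a Poisson total number of points and places them i.i.d. according to the normalized μ, then checks the two properties just stated: counts on disjoint sets are (uncorrelated, in fact) independent, and each count is Poisson, so its mean and variance both equal μ(C).

```python
import numpy as np

rng = np.random.default_rng(6)
T, lam, n_samples = 4.0, 2.5, 20000          # mu(dz, ds) = n(dz) * lam ds, n uniform on Z = [0, 1]

def sample_N():
    """One integer-valued random measure N on Z x [0, T]: a Poisson number of i.i.d. points."""
    k = rng.poisson(lam * T)                 # N(Z x [0, T]) is Poisson with mean mu(Z x [0, T])
    z = rng.uniform(0.0, 1.0, size=k)        # mark coordinates, law n(dz)/n(Z)
    s = rng.uniform(0.0, T, size=k)          # time coordinates, law A(ds)/A(T)
    return z, s

n1 = np.empty(n_samples)
n2 = np.empty(n_samples)
for i in range(n_samples):
    z, s = sample_N()
    n1[i] = np.sum((z < 0.5) & (s <= 2.0))   # N(C1) with C1 = [0, 0.5) x [0, 2]
    n2[i] = np.sum((z >= 0.5) & (s <= 3.0))  # N(C2) with C2 = [0.5, 1] x [0, 3], disjoint from C1

print("mean, var of N(C1):", n1.mean(), n1.var(), " target:", 0.5 * 2.0 * lam)
print("mean, var of N(C2):", n2.mean(), n2.var(), " target:", 0.5 * 3.0 * lam)
print("cov(N(C1), N(C2)) :", np.cov(n1, n2)[0, 1], " target: 0 (counts on disjoint sets)")
```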

Evidently E(P'(A, t))= lt(A x [0, A T]), and if A ["l A 2 , then P'(A1, t) and P'(A2, t) are independent processes. Next by Moyal [32], there exists a jump process x, e R+, with values in (Z, e) defined on a family (, ,x PT) such that (i) PT) is isomorphic to (f',ff^ T, P}) and (ii) the counting processes W(A, t) corresponding to xt are "isomorphic" to the processes P'(A, t) constructed above. Furthermore, pX(A, t) =/(A [0, A T]). To finish the construction we merely note that if S < T, then the probability measure Ps on (fl, ffs) coincides with the restriction of PT (defined on ff) to ffs. By the Kolmogorov consistency theorem, there therefore exists a probability measure P on (fl, 5 ) such that

(3.39) /'(A, t) p(A x [0, t]), A e, R +. However the process x may not be a fundamental process. To guarantee this we must be sure that the jump times are totally inaccessible. As mentioned in 2.9, this is equivalent to the requirement that PX(A, t) have continuous sample paths, and hence, from (3.39), to the requirement that p(A x [0, t]) be continuous in for each fixed A. We summarize the main conclusions as follows. THEOREM 3.5. Let (Z, ) be a Blackwell space and let l be any nonnegative measure on (Z x R+ (R) ) such that (i) p(Z x [0, t])< o for all t R+, (ii) p(A x [0, t]) is continuous in for all A . Then there exists a fundamental process xt on a family (YL ',x p) with values in (Z, ) such that P'(A, t) /(A x [0, t]), A 6 , e R + Remark 3.8. (i) The x process has independent increments in the sense that the P(A, t) have independent increments. If xt were vector-valued this would indeed imply that xt has independent increments in the usual sense. (ii) The most useful version of this result would be when is a product measure p(dz, ds) n(dz)A(ds), where n is a finite measure on (Z, ) and A(t) is a continuous increasing function on R+, in which case (n, A) would be a L6vy system. 4. Detection. The prototypical detection problem in communication theory is the following. We observe a sample xt(co), 0 <= < o, of a stochastic process. 1046 R. BOEL, P. VARAIYA AND E. WONG

The process is known to be governed by one of two laws, P or Px. Based upon the observed sample one has to decide which of the two hypotheses, P or P1, is true. The term "detection" arises from a particular instance of this hypothesis testing model, namely, when the process x has the representation dx , under P, (4.1) dxt- white noise + st, under P1, where s is called the "signal". Thus deciding which hypothesis is true is, for the example, equivalent to "detecting" whether the signal is present (hypothesis P1) or absent (hypothesis P). Very recently this problem has been considered for the case where x is a counting process under P1 and a Poisson process under P [6], [7], [15], [16], [17]. The case where x is a Markov chain under P has also been discussed [6]. We generalize these results by considering problems where x is a fundamental process. A well-established procedure for judging which hypothesis is true consists in first calculating the "likelihood" ratio (dP/dP)(x(co)) and then in accepting P if dP/dP > and rejecting P otherwise. The selection of the "threshold" is discussed in 18]. The procedure is often called the "threshold detector". Evidently for this procedure to be meaningful one must assume P1 << P. Also to obtain results of practical value one must specify precisely how the "signal" affects the observation, as for instance in (4.1), where it is assumed to be additive. We proceed to the mathematical model. Let (, ), R +, be a family of spaces and P, Pa two on (fl, ). The observed process is a family of measurable functions x "(, ) (Z, ) such that (x P) and (x are both fundamental processes. The processes t, , t, , P) - P, P, Q and px, QX are the extrinsic and intrinsic (i.e., relative to ) processes corresponding to (xt, P). Similarly P1, Pa, / etc. correspond to (x, Pa). The extrinsic and intrinsic 1.d.'s are (n, A), (nx, Ax) for (xt,P) and (ha ,A), (n, A) for (xt, P1)" We now give the model corresponding to the "signal plus noise" model of (4.1). Assumption 4.1. There exist '-predictable processes (B, co, t), B , and -predictable processes g(z, co, s) and g a(z, co, s) such that E]g(z, s)] < o and Exlg(z, s)l < o for all z, s, and

P(B, t) n(B, t)A(t) g(z, co, S)l(dz, co, ds),

P(B, t) n(B, t)A(t) gl(z, co, s)/a(dz, co, ds),

where the integrals are Stieltjes integrals. Interpretation. In communication theory terms we can say that the "jump rates" P(B, t) are "modulated" by the signal through the functions g, g. DEFINITION 4.1. Let E(g(z, t)l) (z, t) and El(gl(z, t)l) l(Z, t). MARTINGALES ON JUMP PROCESSES. II 1047

PROPOSITION 4.1.

P(B, t) ,(z, s)p(dz, ds) a.s.

P(B, t) , l(Z, s)p(dz, ds) a.s. Proof. It is enough to prove the first assertion since the proof for the second is identical. Fix B e. We know that

(4.2) Q(B, t) P(B, t) g(z, s)p(dz, as)e o(, P)

(4.3) Q(B t)= P(B t)- X(B t) -lot,

Let T, n 0, 1, ..., be the jump times of x t. The T are stopping times for the hmily () as well as for (7). Furthermore EIP(B, A Z)] n. Hence EI Q(B,t A L)l < , _x and we can define a process (Q(B t) t p) such that O(B, T.)= E(Q(B, T.)I) and it is trivial that Q(B, A T,)e (, P). Now P(B, t) and g(z, t) are ff- measurable, hence

Subtracting this from (4.3) implies that

On the other hand it can be directly verified that

" [E(g(z, s)[)- (z, s)]p(dz, ds) 1(, P). d oC Therefore pX(B, tA T,)-ffl ,(z s)p(dz ds) /1(,o''x ,P).

But this is a continuous process. Hence it must vanish, i.e.,

P'(B, ,(z, s)p(dz, ds). [

Remark 4.1. The processes (Q(B, t), 5 p) are called the innovations processes of the process (xt, , P), in analogy with the Brownian motion case [21]. These processes will be used in the next section. 1048 R. BOEL, P. VARAIYA AND E. WONG THEOREM 4.1. Suppose that P << P. Let L E(dPx/dPI) be the likelihood ratio and let T=inf{tlLt=OorLt- =0}.

Then there is a sequence of [ s.t.'s Sk T T a.s. P such that

1 (Z'S) "'x and ^sk (z,s) (4.4) Lt^s l-I s)] exp ,(z, s)p(dz, s<=tASk ,(x, s) J I_ fz fl ,(z,s) ds)l Xs- :/: Xs

Proof. By Theorem 3.1 there exists s.t.'s Sk T T and an -predictable function b such that Lt^s is given by (3.11), and by Theorem 3.2 the intrinsic 1.d. of (xt,P1) is ((1 + )nX, AX), where (nX, A) is the intrinsic 1.d. of (xt, P); so from Proposition 4.1 we can conclude that (1 + c(z, s))n(dz, s)AX(ds) (1 + dp(z, s)),(z, s)p(dz, ds) ,l(z, s)p(dz, ds) Wi(dz, s)A(ds). Therefore (z, s) + c(z, s) "-,(z,s) which upon substitution into (3.11) yields (4.4). [3 COROLLARY 4.1. Suppose in the above that x is as in the fundamental example of 2.12. Suppose there exists a 'xt-predictable process p(t), and --predictable processes 2i(t), 2] (t), <= <= n, such that ,() 2(s)(ds), fii, l(t) (s)]2(ds), l<_i

Then the formula (4.4) changes to

(4.5) ^ i=1 s

Formula (4.6) has also been derived in [6] and [7]. Formula (4.5) for the case n and lz(ds) =- ds appears in [15], although the derivation is not satisfactory, and various additional assumptions, some of which are not easily unverifiable, were made there. (ii) In [6] we can also find (4.5) for the special case where (xt,.t,.-x p) is a Markov chain, in which case the .i can be interpreted in terms of various transition probabilities as suggested in 2.11, 2.12. We apply formulas (4.5) and (4.6) to calculate the mutual information be- tween two fundamental processes. Let x and x' be two such processes on (f, , P) with values in (Z, e) and (Z', e,) respectively. Let l(dz, ds) and IJ'(dz', ds) be -x_ and 7'-predictable processes and g(z,s) g'(z' s) be two -predictable processes with finite expectation such that n(dz, s)A(ds) g(z, s)It(dz, ds), n'(dz', s)A'(ds) g'(z', s)lz'(dz', ds). Let Px, Px' denote the restrictions of P to x and x' respectively. Assume that _-x' the product a-algebra and let P,, P, (R) P, denote the product measure on x(R) '. It is trivial that P << P,. Assume further that Px, << P. The mutual information between x, x' is the quantity I(x,x')= E(ln dd-P, Let ,(z, t) (g(z, ,'(z', t) (g'(z', By Remark 3.2 (i), g/, e Lo(P), g'/,' I Lloc(P ). Assume further that In (g/,)e LI(p), In (g'/')e L l(p,). Then by Theorem 4.1, s) g(z,s) exp S)l(dz, ,(x,,s)J [- fzfo ,(z,s) 1)(z, ds)l} exp -1 x ['(xs S) ,'(z',s) ,'(z,s)l'(dz',ds)]} so that dP g(z, s) ,(z, S)la(dz, ds) 2:: ,(z,s) (4.7) Xs- Xs ,'(dz', ds)l'(dz', ds). 1050 R. BOEL, P. VARAIYA AND E. WONG

Since In (g/,)e L l(p), therefore g(x, s) g(z, In g(z, s)p(dz, ds) (x, ) ,(z, s)s))

,(z,s) so that

E In E In g(z, s)#(dz, ds). [g(x's))ls) (g(z,s)) Ixs_, xs ,(x, fz fo ,(z,s) Similarly, g'(z', s) ln(g'(x'., ,' E In g'(z', s)'(dz', ds). _, g (x, '(z', S E[x Ss),} 1 fzfo )) Taking expectations (4.7) and substituting these relations gives the following result. TnoN 4.2. Suppose P, << P aM In (g/) e Lx(P), In (g'/') e L(P). Then g(z, s) ,(z,s) I(x x') E In 1 g(z, s)p(dz, ds) ,(z, s) g(z, s) (4.8) g'(z'' s) ,'(z',s) + in g'(z', s)p'(dz', ds) ,'(z',s) g'(z', s) ] Remark 4.3. This result for the case where x, x' are both counting processes has appeared in [6], and our proof is adapted- from the one given there. 5. Filtering. A popular model for estimation and filtering problems in com- munication and control is where the observed process, xt, depends upon the "signal" or "state" process, Yt, according to dyt g(Yt) dt + dBx(t),

dx f(xt, y,) dt + dBE(t), where B1,B2 are Brownian motions. The problem is to determine E(ytlt). Note that in the above Yt is a semi-martingale. We begin this section by examining this situation when (xt, t,P) is a fundamental process with values in (Z, e). We need a preliminary fact.

LEMMA 5.1. Let (mt, t, P) /2(, p). Then there exists an t-predictable process h(z, t) such that

(5.1) E Ih(z, t)lZP(dz, dr) < o

and

(5.2) (m,, Q(B, t)) h(z, s)P(dz, ds) for all B e . MARTINGALES ON JUMP PROCESSES. II 1051

Proof. The set, say 5, of all processes (h Q),, where h is any predictable process satisfying (5.1), is easily shown to be a stable subspace of /{2(t,P (see [19] for a definition of a stable subspace). Therefore by [19], there exists a unique decomposition of mt, mt n + l, with-le 50 and (n, l't)= 0 for all 1 50. Let 1, (h Q), and the assertion follows. U Assumption 5.1. There exist '-predictable processes/(B, t), B e , and an -predictable process g(z, t) such that

(5.3) P(B, t) g(z, S)la(dz, ds). Notation. In the following for any process (f, , P), f E(flt). THEOREM 5.1. Let (xt, , P) be a fundamental process satisfying Assumption 5.1. Let (y,, ,, P) 5(,) have the representation (5.4) Y, Yo + a, + m, with a, s'(), m, /2(). Then .9, satisfies the filtering equation

,9, .9o + r/, + k(z, s)Q'(dz, ds), where r/, e (o,), Q'(B, t) P(B, t) Is Ito ,(z, S)l(dz, ds), and where the ,-'- predictable process k satisfies [(y- h(z, s)]g(z, s) k(z, s) .9- + ,(z,s) and h, g are as in (5.2), (5.3) respectively. + Proof. Let #t E(mtlt). Clearly #, ////(t). Now write a, a a- where at+, a- +(t, P). It is easy to verify that the --measurable processes + - t E(at+l), t- E(a-It) are submartinales. By the Doob-Meyer de- composition theorem [3], there exist martingales t+, t- in ////-(t)and predictable increasin processes r/t+, r/t in '+ (,) such that .+ + ,.+. Hence - (5.5) Po + =Po+r/,+ ,, say, where r/, e sC(ff), e ///(ff'). By 2.11 there exists a ff'-predictable process k(z, s) e Lo(P') such that (5.6) , k(z, s)QX(dz, ds).

It remains to evaluate k. By the differentiation formula of 2.7,

ytP(B, t) y_ P(B, ds) + P(B, s-) dy + [m,, Q(B, t)]. 1052 R. BOEL, P. VARAIYA AND E. WONG

Since P(B, t)- P(B, t)and [mt, Q(B, t)] are in o(t), therefore, from the above, for some 7,, ?, /o(),

y,P(B, ) y_P(B, ds) + P(B, ) dy + (m,, (B, )) + y (5.7) (y_ + h(z, s))g(z, s)(dz, ds) + P(B, s-) da + 7, using (5.2), (5.3) and (5.4). Now apply the differential rule to P(B, t) to obtain ,P(B, t) _P(B, ds) + P(B, s-) d + [,, OX(B, t)]. Recalling that P(B, t) P(B, t) and [t, Q(B, t)] (t, Q(B, t)) are in loc(tlx), the relation above implies that for some , e lo(x), tP(B, t) p_pX(B, ds) + P(B, s-) d + + t (5.8) @_ + k(z, s))(z, s)p(dz, ds) + P(B, s-) d + using Proposition 4.1, (5.5), (5.6). Next we make the following observations, which can be verified directly from the martingale definition"

(y_ + h(z, s))g(z, s)g(dz, ds)

[( ](dz, ds) e 1o,,,,

(yo)P(B, s-) da yo P(B, s dq e lo(t). Using these facts and the fact that (ytP(B, t)) tP(B, t), we conclude from (5.7), (5.8) that

{(ys- + k(, sN(z, s) [( )]}u(z, s) e 1o(,),x

and since this process is continuous, it must vanish identically, so that we may assume [(Ys- + h(z, s))g(z, s)] k(z s) y- .'tz, ) [(Y,- Ys- + h(z, s))g(z)] (z,s) MARTINGALES ON JUMP PROCESSES. II 1053

COROLLARY 5.1. Suppose in the above that x is as in the fundamental example of 2.12 and that there exists an 7-predictable process /a(t) and -predictable processes 2, such that

Pi(t) ,i(s)(ds) and let some h Then (mr, qi(t)> o hi(s)Pi(ds) for -predictable processes i.

t o + tit + ki(s)q[(ds) i=l with [@t- + hi(t))'(t)] ki(t t- < < n, A[ and

qT(t) Pi(t) ii(s) ds.

Remark 5.1. (i) Suppose in (5.4) that a is given as

a ds

for some predictable process fit in L'(). Since a -y)sds is in l(ffc)it follows that in the representation for t we have the further specification

(ii) Corollary 5.1 has appeared in the literature for the case where (xt, , P) is a counting process, i.e., n 1. Even here some additional conditions have been imposed on the Yt process (such as, e.g., Yt is Markov [6], [16]) or on the x process (such as, e.g., (xt,-t,--x p) is obtained from a Poisson process by an absolutely con- tinuous change of measure [6], [20]). (iii) Theorem 5.1 has been inspired largely by the procedures of [21], where the underlying process is Brownian motion. See also [24] for the Brownian motion case. (iv) While Theorem 5.1 has some value in terms of clarifying the issues in- volved in obtaining the filtering equations it is of little practical importance since these equations do not lead to a realization by a dynamical system. This is so because the filtering equations contain the terms rh, k and f,'t which are not

computable in terms of ,Pt and xt. In other words, the filtering equation is not recursive. This difficulty persists even when one imposes additional conditions such as Yt is Markov. In the remainder of this section we seek to determine con- ditions under which the filter is recursive. We impose conditions on the dependence between the "signal" or "state" process Yt and the "observation" process x which are considerably stronger than 1054 R. BOEL, P. VARAIYA AND E. WONG those of Assumption 5.1. For the remainder of this section the following assumption holds.

Assumption 5.2. (fL ), R /, is a family of spaces and P, P1 are two prob- ability measures on (fL ). xt and yt are measurable functions on (fL o) with values in (Z, e) and (Y, ) respectively. The following properties are satisfied. (i) Z is a Borel subset of R", e is the Borel field. (The most important practical cases are Z R" or Z is the space of all z e R" with integer components.) Y is a locally compact Hausdorff space, is the Borel field. ---x V ,x. (ii) Under the measure P, (a) (xt,,P) is a fundamental process with independent increments; i.e., xt x is independent of (under P), for s =< t, (b) (yt,,P) is a Markov process whose sample paths are right- continuous and have left-limits, and the jump times of y are totally inaccessible, (c) the processes xt and yt are independent, i.e., -x and r are in- dependent. (iii) P1 << P, there exists an -predictable process f Loc(P with a represen- tation f(z, co, t) (h(z, yt-(co), o9, t), where qS(.,y,.,.) is -predictable for fixed y e Y, and there also exist predictable processes p(B, t) for B e such that E(If(z, t)l) / E(If(z, t)l) < oe for all z, and

$$L_t = \prod_{\substack{s \le t \\ x_{s-} \ne x_s}} [1 + \phi(x_s, y_{s-}, s)]\,\exp\Big(-\int_0^t\!\int_Z \phi(z, y_{s-}, s)\,\tilde p(dz, ds)\Big).$$
Note that we must have $1 + \phi \ge 0$. Let $Q$, $P$ and $Q^x$, $P^x$ be the processes associated with $(x_t, \mathscr F_t, P)$ and $(x_t, \mathscr F_t^x, P)$, and similarly let $Q_1$, $P_1$, $Q_1^x$, $P_1^x$ be the processes corresponding to $(x_t, \mathscr F_t, P_1)$ and $(x_t, \mathscr F_t^x, P_1)$ respectively. From Assumption 5.2 and Proposition 4.1 it is immediate that
$$P(B,t) = P^x(B,t) = \tilde p(B,t), \qquad P_1(B,t) = \int_0^t\!\int_B (1 + f(z,s))\,\tilde p(dz, ds),$$
$$P_1^x(B,t) = \int_0^t\!\int_B (1 + \hat f(z,s))\,\tilde p(dz, ds),$$
where $\hat f(z,t) = E_1(f(z,t)\,|\,\mathscr F_t^x)$. For any $t$ let $\mathscr G_t = \mathscr F_t^x \vee \mathscr F^y$.
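As a numerical illustration of the exponential formula for $L_t$, consider the simplest case in which x is a standard Poisson process (so $\tilde p(dz, ds) = ds$ with a single mark) and $\phi$ is a hypothetical bounded function of s alone, independent of y and of the past of x. The sketch below, under these simplifying assumptions, evaluates the product-exponential and checks by Monte Carlo that $E(L_T) = 1$, as it must be since $L_t$ is a P-martingale.

import numpy as np

rng = np.random.default_rng(1)

T = 5.0
phi = lambda s: 0.5 * np.cos(s)                  # hypothetical phi(s) >= -1
int_phi = 0.5 * np.sin(T)                        # closed form of int_0^T phi(s) ds

def L(jump_times):
    # L_T = prod over jump times of (1 + phi(s)), times exp(-int_0^T phi(s) ds)
    return np.prod(1.0 + phi(jump_times)) * np.exp(-int_phi)

# One realization of a standard Poisson path on [0, T].
ts = np.sort(rng.uniform(0.0, T, rng.poisson(T)))
print("L_T on one path:", L(ts))

# Monte Carlo check that E[L_T] = 1 under P.
vals = [L(np.sort(rng.uniform(0.0, T, rng.poisson(T)))) for _ in range(20000)]
print("Monte Carlo E[L_T] ~", np.mean(vals))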

PROPOSITION 5.3. $y_t = y_{t-}$ a.s. P.
Proof. Prob $\{y_t \ne y_{t-}\}$ = Prob $\{t$ is a jump time of $y\}$. However, since the jump times of y are totally inaccessible, this probability must be zero.
For a real-valued function g on Y we are interested in determining a (recursive) expression for the process $E_1(g(y_t)\,|\,\mathscr F_t^x)$. Now
$$(5.9)\qquad E_1(g(y_t)\,|\,\mathscr F_t^x) = \frac{E(g(y_t)L_t\,|\,\mathscr F_t^x)}{E(L_t\,|\,\mathscr F_t^x)}.$$

It turns out that the numerator of the expression on the right is much better behaved than the ratio, and, furthermore, the denominator does not depend on g. Hence we will seek to determine instead an expression for the numerator.
DEFINITION 5.1. Let $\mathscr G$ be the family of all bounded, measurable, real-valued functions g on Y. For $g \in \mathscr G$ and $t \in R_+$, let
$$(5.10)\qquad \pi_t(g) = E(g(y_t)L_t\,|\,\mathscr F_t^x).$$
PROPOSITION 5.4. $E(L_t\,|\,\mathscr F^y) = 1$ a.s.
Proof. The proof is immediate from the assumptions that $\mathscr F^x$, $\mathscr F^y$ are independent under P and that $\tilde p(B, t)$ is deterministic.
Now fix $g \in \mathscr G$. Since L satisfies $L_t = 1 + \int_0^t L_{s-}\,d(\phi \circ Q)_s$, substitution into (5.10) gives

$$(5.11)\qquad \pi_t(g) = E(g(y_t)\,|\,\mathscr F_t^x) + \int_0^t\!\int_Z E\big[g(y_t)L_{s-}\phi(z, y_{s-}, s)\,|\,\mathscr F_t^x\big]\,Q(dz, ds).$$

Since $\mathscr F^x$ and $\mathscr F^y$ are independent under P,

$$(5.12)\qquad E(g(y_t)\,|\,\mathscr F_t^x) = Eg(y_t).$$
Also,
$$E\big[g(y_t)L_{s-}\phi(z, y_{s-}, s)\,|\,\mathscr F_t^x\big] = E\big[g(y_t)L_s\phi(z, y_s, s)\,|\,\mathscr F_t^x\big] \quad \text{(by Propositions 5.2, 5.3)}$$
$$= E\big[E\{g(y_t)L_s\phi(z, y_s, s)\,|\,\mathscr F_t^x \vee \mathscr F_s^y\}\,\big|\,\mathscr F_t^x\big]$$
$$(5.13)\qquad = E\big[L_s\phi(z, y_s, s)\,E(g(y_t)\,|\,\mathscr F_t^x \vee \mathscr F_s^y)\,\big|\,\mathscr F_t^x\big]$$
$$= E\big[L_s\phi(z, y_s, s)\,E(g(y_t)\,|\,\mathscr F_s^y)\,\big|\,\mathscr F_t^x\big] \quad \text{(by independence of } \mathscr F^x, \mathscr F^y)$$
$$= E\big[L_s\phi(z, y_s, s)\,E(g(y_t)\,|\,y_s)\,\big|\,\mathscr F_t^x\big] \quad \text{(since } y_t \text{ is Markov)}$$
$$= E\big[L_s\phi(z, y_s, s)\,H_{t,s}(g)(y_s)\,\big|\,\mathscr F_s^x\big],$$
since x has independent increments under P, and where $H_{t,s}(g)(y) = E(g(y_t)\,|\,y_s = y)$.

Substitution of (5.12) and (5.13) into (5.11) gives
$$\pi_t(g) = Eg(y_t) + \int_0^t\!\int_Z \pi_s\big(\phi(z,\cdot,s)H_{t,s}(g)\big)\,Q(dz, ds).$$

Note that the integrand in the above expression is a predictable process for each fixed t, as explained at the end of 2.9. We summarize the above.
THEOREM 5.2. Under Assumption 5.2 the process $\pi_t(g)$ satisfies
$$(5.14)\qquad \pi_t(g) = Eg(y_t) + \int_0^t\!\int_Z \pi_s\big(\phi(z,\cdot,s)H_{t,s}(g)\big)\,Q(dz, ds),$$
where
$$(5.15)\qquad H_{t,s}(g)(y) = E\big(g(y_t)\,|\,y_s = y\big),$$
and
$$(5.16)\qquad Q(B, t) = P(B, t) - \tilde p(B, t).$$
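The kernels $H_{t,s}$ of (5.15) are simply the transition operators of the signal. A minimal Python sketch, assuming purely for illustration that y is a time-homogeneous Markov chain on three states with a hypothetical rate matrix, computes $H_{t,s}(g)$ by a matrix exponential; the difference quotient at the end anticipates the infinitesimal generator $A_t$ introduced in Assumption 5.3 below.

import numpy as np
from scipy.linalg import expm

# Hypothetical rate matrix (generator) of a 3-state chain y; rows sum to zero.
A = np.array([[-1.0,  0.7,  0.3],
              [ 0.5, -0.9,  0.4],
              [ 0.2,  0.8, -1.0]])
g = np.array([0.0, 1.0, 4.0])          # a bounded function g on Y = {0, 1, 2}

def H(t, s, g):
    # H_{t,s}(g)(y) = E(g(y_t) | y_s = y) = (exp((t - s) A) g)(y) for a homogeneous chain
    return expm((t - s) * A) @ g

print("H_{1.0,0.3}(g):", H(1.0, 0.3, g))
eps = 1e-6
print("(H_{t+eps,t} g - g)/eps:", (H(eps, 0.0, g) - g) / eps)   # approximately A @ g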

Remark 5.2. (i) Because of Proposition 5.4, $Eg(y_t) = E_1g(y_t)$ and $H_{t,s}(g) = E_1(g(y_t)\,|\,y_s)$.
(ii) From (5.10), $\pi_t(1) = E(L_t\,|\,\mathscr F_t^x)$, where 1 denotes the function on Y which is identically equal to unity. Hence from (5.9),
$$E_1(g(y_t)\,|\,\mathscr F_t^x) = \frac{\pi_t(g)}{\pi_t(1)} = \frac{Eg(y_t) + \int_0^t\int_Z \pi_s\big(\phi(z,\cdot,s)H_{t,s}(g)\big)\,Q(dz,ds)}{1 + \int_0^t\int_Z \pi_s\big(\phi(z,\cdot,s)\big)\,Q(dz,ds)}$$
from (5.14).
(iii) Suppose $(x_t, \mathscr F_t, P)$ is as in the fundamental example. Then (5.14) simplifies to

$$(5.17)\qquad \pi_t(g) = Eg(y_t) + \sum_{i=1}^n \int_0^t \pi_s\big(\phi_i(\cdot,s)H_{t,s}(g)\big)\,[P_i(ds) - \tilde p_i(ds)].$$
(iv) We now derive a more familiar-looking version of (5.14).

For any set $A \in \mathscr Y$, if $P(y_t \in A) = P_1(y_t \in A) = 0$, then $\pi_t(I_A) = 0$ a.s. Hence there exists a measurable function $U_t: Y \to R$ such that
$$(5.18)\qquad \pi_t(I_A) = \int_A U_t(y)\,P_t(dy),$$
where $P_t$ is the marginal distribution of $y_t$ under P and $P_1$. Evidently, if $h \in \mathscr G$, then
$$\pi_t(h) = \int_Y h(y)U_t(y)\,P_t(dy).$$

Next let $P(A, t\,|\,y, s)$, $A \in \mathscr Y$, $s \le t$, be the transition kernel of the Markov process y, so that
$$\big(H_{t,s}(g)\big)(y) = \int_Y g(y')\,P(dy', t\,|\,y, s),$$
and let $P(A, s\,|\,y, t)$, $A \in \mathscr Y$, $t \ge s$, be the backward kernel, so that for $h \in \mathscr G$,
$$E(h(y_s)\,|\,y_t) = \int_Y h(y')\,P(dy', s\,|\,y_t, t).$$
Substituting these relations into (5.14) leads to

$$\int_Y g(y')U_t(y')\,P_t(dy') = \int_Y g(y')\,P_t(dy') + \int_0^t\!\int_Z \int_Y \phi(z, y, s)\Big\{\int_Y g(y')\,P(dy', t\,|\,y, s)\Big\}U_s(y)\,P_s(dy)\,Q(dz, ds)$$
$$= \int_Y g(y')\,P_t(dy') + \int_Y g(y')\Big\{\int_0^t\!\int_Z \int_Y \phi(z, y, s)U_s(y)\,P(dy, s\,|\,y', t)\,Q(dz, ds)\Big\}\,P_t(dy').$$

Since $g \in \mathscr G$ is arbitrary, the process $U_t(y)$ evolves according to

$$(5.19)\qquad U_t(y) = 1 + \int_0^t\!\int_Z \int_Y \phi(z, y', s)U_s(y')\,P(dy', s\,|\,y, t)\,Q(dz, ds).$$

Remark 5.3. (i) For the case of the fundamental example (see (5.17)) the equation above simplifies to

$$(5.20)\qquad U_t(y) = 1 + \sum_{i=1}^n \int_0^t\!\int_Y \phi_i(y', s)U_s(y')\,P(dy', s\,|\,y, t)\,[P_i(ds) - \tilde p_i(ds)].$$
This equation has been derived in [28] for the special case where $(x_t, \mathscr F_t, P)$ is a counting process, so that $n = 1$, and with the additional condition that $\tilde p(ds) = ds$.
(ii) Equations (5.14) and (5.19) are not yet recursive, since the functions $\phi(z, y, t)$, $\phi_i(y, t)$ are allowed to depend on the entire past $x_s$, $0 \le s \le t$. We will see later how under additional conditions these equations become truly recursive.
(iii) Notice that, unlike the representation for $\hat y$ obtained in Theorem 5.1, those for $\pi_t$ in (5.14) and $U_t$ in (5.19) are not semimartingales, because the integrands depend upon t. This dependency can be eliminated by some additional assumptions, as follows. For the remainder of this section the following holds in addition to Assumption 5.2.
Assumption 5.3. The operators $H_{t,s}$ of (5.15) have the following properties:
$$(5.21)\qquad \text{(i)}\quad \lim_{s \uparrow t} H_{t,s} = I, \text{ the identity operator on } \mathscr G;$$

(ii) there exist operators $A_t$, $t \ge 0$, on $\mathscr G$ such that
$$(5.22)\qquad \lim_{\varepsilon \downarrow 0} \frac{1}{\varepsilon}\big(H_{t+\varepsilon, t} - I\big) = A_t.$$
We do not elaborate on the precise theoretical status of the operators $A_t$ (i.e., the precise definitions of their domain, range, etc.), since it would take us too far afield and since this topic is well covered in the semigroup theory of Markov processes (see, e.g., [29]). We merely note that (i) is a continuity assumption and (ii) is a differentiability assumption. The operators $A_t$ are often referred to as the infinitesimal generator, especially when y is a Hunt process. If y is a k-dimensional diffusion, for example, then $A_t$ is just a (partial) differential operator of the form
$$A_t = \frac{1}{2}\sum_{i,j=1}^k \sigma_{ij}(y, t)\,\frac{\partial^2}{\partial y_i\,\partial y_j} + \sum_{i=1}^k m_i(y, t)\,\frac{\partial}{\partial y_i}.$$
We now develop the simplifications induced by (5.21), (5.22) in (5.14). First of all, recalling that $P_0(dy)$ is the probability distribution of $y_0$ and that y is Markov, we get
$$Eg(y_t) = E\big(H_{t,0}(g)(y_0)\big).$$

This, together with (5.22), implies that
$$E\big(g(y_{t+\varepsilon}) - g(y_t)\big) = \int_Y \big(H_{t+\varepsilon, 0} - H_{t,0}\big)(g)(y)\,P_0(dy) \approx \varepsilon \int_Y \big(H_{t,0}A_t(g)\big)(y)\,P_0(dy) = \varepsilon\,E\big[(A_t(g))(y_t)\big].$$
Substituting this into (5.14), and using (5.21) and (5.22), leads us to

$$(\pi_{t+\varepsilon} - \pi_t)(g) \approx \varepsilon\,E\big(A_t(g)(y_t)\big) + \int_t^{t+\varepsilon}\!\int_Z \pi_s\big(\phi(z,\cdot,s)g\big)\,Q(dz, ds) + \varepsilon \int_0^t\!\int_Z \pi_s\big(\phi(z,\cdot,s)H_{t,s}A_t(g)\big)\,Q(dz, ds)$$
$$= \int_t^{t+\varepsilon}\!\int_Z \pi_s\big(\phi(z,\cdot,s)g\big)\,Q(dz, ds) + \varepsilon\,\pi_t\big(A_t(g)\big),$$
using (5.14) applied to $A_t(g)$. Hence

$$(5.23)\qquad \pi_t(g) = \pi_0(g) + \int_0^t \pi_s(A_s g)\,ds + \int_0^t\!\int_Z \pi_s\big(\phi(z,\cdot,s)g\big)\,Q(dz, ds).$$
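Anticipating Theorem 5.3 and its counting-process form (5.24) below, the following Python sketch integrates the recursion by an Euler scheme under illustrative assumptions: y is a three-state Markov chain with a hypothetical generator A, the observation is a single counting process with reference compensator $\tilde p(ds) = ds$, and $\phi(z, y, s) = \phi(y)$ depends only on the current state, so that under $P_1$ the observation intensity is $1 + \phi(y_t)$. The vector pi carries the unnormalized masses $\pi_t(\delta_j)$; all numerical values are assumptions made for this example.

import numpy as np

rng = np.random.default_rng(2)

A = np.array([[-1.0,  0.7,  0.3],           # hypothetical generator of the signal chain
              [ 0.5, -0.9,  0.4],
              [ 0.2,  0.8, -1.0]])
phi = np.array([-0.5, 0.0, 1.0])            # phi(y) >= -1, so the P1-intensity 1 + phi(y) >= 0
p0 = np.array([1/3, 1/3, 1/3])              # distribution of y_0

T, dt = 20.0, 1e-3
n = int(T / dt)

# Simulate the signal y and, given it, the observation increments dx (law P1).
y, dx = 0, np.zeros(n)
for i in range(n):
    rates = A[y].copy(); rates[y] = 0.0
    if rng.random() < rates.sum() * dt:
        y = rng.choice(3, p=rates / rates.sum())
    dx[i] = 1.0 if rng.random() < (1.0 + phi[y]) * dt else 0.0

# Euler scheme for the unnormalized filter: d pi(g) = pi(A g) dt + pi(phi g) (dx - dt).
pi = p0.copy()
for i in range(n):
    pi = pi + (pi @ A) * dt + (phi * pi) * (dx[i] - dt)

posterior = pi / pi.sum()                   # normalization as in Remark 5.2(ii)
print("unnormalized pi:", np.round(pi, 4))
print("P1(y_T = j | F_T^x):", np.round(posterior, 4), "   true terminal state:", y)

The final normalization is exactly the ratio $\pi_t(g)/\pi_t(1)$ of Remark 5.2(ii).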

THEOREM 5.3. Under the additional conditions of Assumption 5.3, the representations (5.14) and (5.17) simplify to (5.23) and (5.24) respectively.

$$(5.24)\qquad \pi_t(g) = \pi_0(g) + \int_0^t \pi_s(A_s g)\,ds + \sum_{i=1}^n \int_0^t \pi_s\big(\phi_i(\cdot, s)g\big)\,[P_i(ds) - \tilde p_i(ds)].$$
As an example illustrating (5.24), suppose that under P, $x_t$ and $y_t$ are independent standard Poisson processes. Then $Z = Y = I_+$, the set of nonnegative integers. Also $n = 1$ in (5.24), $P_1(t) = x_t$ and $\tilde p_1(t) = t$. For $g: I_+ \to R$,
$$H_{t,s}(g)(y) = E\big(g(y_t)\,|\,y_s = y\big)$$

$$= \sum_{k=0}^\infty g(y + k)\,e^{-(t-s)}\,\frac{(t-s)^k}{k!},$$
so that
$$-\frac{\partial}{\partial s}\big(H_{t,s}(g)\big)(y) = \sum_{k=0}^\infty e^{-(t-s)}\,\frac{(t-s)^k}{k!}\,\big[g(y + k + 1) - g(y + k)\big],$$
and hence
$$(5.25)\qquad (A_t g)(y) = (Ag)(y) = g(y + 1) - g(y).$$
Consider the "indicator" functions $\delta_k: I_+ \to R$, where
$$\delta_k(y) = \begin{cases} 1 & \text{if } y = k, \\ 0 & \text{otherwise.}\end{cases}$$

By the linearity of $\pi_t$,

$$\pi_t(g) = \sum_{k=0}^\infty g(k)\,\pi_t(\delta_k),$$
so that it is enough to determine the processes $\pi_t(\delta_k)$, $k = 0, 1, 2, \cdots$. Substitution of $\delta_k$ for g into (5.24) gives, using (5.25),
$$\pi_t(\delta_k) = \pi_0(\delta_k) + \int_0^t \big[\pi_s(\delta_{k-1}) - \pi_s(\delta_k)\big]\,ds + \int_0^t \pi_s\big(\phi(\cdot, s)\delta_k\big)\,(dx_s - ds)$$

$$= \pi_0(\delta_k) + \int_0^t \big[\pi_s(\delta_{k-1}) - \pi_s(\delta_k)\big]\,ds + \int_0^t \phi(k, s)\,\pi_s(\delta_k)\,(dx_s - ds),$$
since $\phi(y, s)\delta_k(y) = \phi(k, s)\delta_k(y)$. Now
$$\pi_0(\delta_k) = E\delta_k(y_0) = \begin{cases} 1 & \text{if } k = 0, \\ 0 & \text{if } k > 0,\end{cases}$$
and $\delta_{-1} \equiv 0$, so that the expression above simplifies to

$$\pi_t(\delta_0) = 1 - \int_0^t \pi_s(\delta_0)\,ds + \int_0^t \phi(0, s)\,\pi_s(\delta_0)\,(dx_s - ds),$$

$$\pi_t(\delta_k) = \int_0^t \pi_s(\delta_{k-1})\,ds - \int_0^t \pi_s(\delta_k)\,ds + \int_0^t \phi(k, s)\,\pi_s(\delta_k)\,(dx_s - ds), \qquad k \ge 1,$$

and these can be rewritten respectively as

$$(5.26)\qquad \begin{aligned} e^t\pi_t(\delta_0) &= 1 + \int_0^t \phi(0, s)\,e^s\pi_s(\delta_0)\,(dx_s - ds),\\ e^t\pi_t(\delta_k) &= \int_0^t e^s\pi_s(\delta_{k-1})\,ds + \int_0^t \phi(k, s)\,e^s\pi_s(\delta_k)\,(dx_s - ds), \qquad k \ge 1.\end{aligned}$$

These linear integral equations can now be solved inductively to yield the explicit formulas

$$(5.27)\qquad \pi_t(\delta_0) = e^{-t}\prod_{\substack{s \le t\\ x_{s-} \ne x_s}} [1 + \phi(0, s)]\,\exp\Big(-\int_0^t \phi(0, s)\,ds\Big),$$
$$(5.28)\qquad \pi_t(\delta_k) = \int_0^t e^{-(t-s)}\,\pi_s(\delta_{k-1})\prod_{\substack{s < \tau \le t\\ x_{\tau-} \ne x_\tau}} [1 + \phi(k, \tau)]\,\exp\Big(-\int_s^t \phi(k, v)\,dv\Big)\,ds, \qquad k \ge 1.$$
Remark 5.4. The result just obtained illustrates the power of the formulation of Theorem 5.3 over the more usual formulations, which involve obtaining a relation for the conditional density (e.g., [16]). We believe that equations (5.14), (5.20), and (5.23), (5.24) are much more useful, since they are linear in the "unknown" linear operators $\pi_t$, whereas the evolution equations for the conditional density are nonlinear. Of course the latter can easily be derived from the former.
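A minimal numerical sketch of the example just worked out: given a simulated observation path and a hypothetical function $\phi(k, s)$ (here chosen independent of the past of x, which is a special case), it integrates the equations preceding (5.26) by an Euler scheme for $k = 0, 1, \cdots, K$ and then normalizes as in Remark 5.2(ii). Evaluating the closed forms (5.27), (5.28) instead would give the same result up to discretization and truncation error.

import numpy as np

rng = np.random.default_rng(3)

T, dt, K = 5.0, 1e-3, 30                 # K truncates the state space of y for the computation
n = int(T / dt)
dx = (rng.random(n) < dt).astype(float)  # a standard Poisson observation path, simulated under P

def phi(k, s):
    # hypothetical phi(k, s); under P1 the observation intensity would be 1 + phi(y_s, s)
    return 0.5 * k / (1.0 + k)

pi = np.zeros(K + 1)
pi[0] = 1.0                              # pi_0(delta_k) = P(y_0 = k) = 1 only for k = 0
for i in range(n):
    s = i * dt
    new = pi.copy()
    for k in range(K + 1):
        below = pi[k - 1] if k >= 1 else 0.0          # pi_s(delta_{k-1}), with delta_{-1} = 0
        new[k] = pi[k] + (below - pi[k]) * dt + phi(k, s) * pi[k] * (dx[i] - dt)
    pi = new

cond = pi / pi.sum()                     # conditional distribution of y_T, per Remark 5.2(ii)
print("E1(y_T | F_T^x) ~", float(np.dot(np.arange(K + 1), cond)))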

REFERENCES
[1] C. DOLÉANS-DADE AND P. A. MEYER, Intégrales stochastiques par rapport aux martingales locales, Séminaire de Probabilités IV, Lecture Notes in Mathematics, Springer-Verlag, Berlin and New York, 1970, pp. 77-107.
[2] C. DOLÉANS-DADE, Quelques applications de la formule de changement de variables pour les semi-martingales, Z. Wahrscheinlichkeitstheorie Verw. Gebiete, 16 (1970), pp. 181-194.
[3] P. A. MEYER, Probabilités et Potentiel, Hermann, Paris, 1966; English translation: Probability and Potential, Blaisdell, Waltham, Mass., 1966.
[4] R. BOEL, P. VARAIYA AND E. WONG, Martingales on point processes I: Representation results, Memo #M-407, Electronics Research Lab., University of California, Berkeley, Calif., 1973.
[5] H. KUNITA AND S. WATANABE, On square integrable martingales, Nagoya Math. J., 30 (1967), pp. 209-245.
[6] P. M. BRÉMAUD, A martingale approach to point processes, Memo #M-345, Electronics Research Lab., University of California, Berkeley, Calif., 1972.
[7] M. H. A. DAVIS, Detection of signals with point process observation, Publication 73/8, Dept. of Computing and Control, Imperial College, London, 1973.
[8] T. T. KADOTA AND L. A. SHEPP, Conditions for absolute continuity between a certain pair of probability measures, Z. Wahrscheinlichkeitstheorie Verw. Gebiete (3), 16 (1960), pp. 13-30.
[9] I. V. GIRSANOV, On transforming a certain class of stochastic processes by absolutely continuous substitution of measures, Theory Probability Appl., 5 (1960), pp. 285-301.
[10] V. BENEŠ, Existence of optimal stochastic control laws, this Journal, 9 (1971), pp. 446-475.
[11] T. E. DUNCAN AND P. VARAIYA, On the solutions of a stochastic control system, this Journal, 9 (1971), pp. 354-371.
[12] M. H. A. DAVIS AND P. VARAIYA, Dynamic programming conditions for partially observable stochastic systems, this Journal, (1973), pp. 226-261.
[13] R. RISHEL, Weak solutions of a partial differential equation of dynamic programming, this Journal, 9 (1971), pp. 519-528.
[14] D. L. SNYDER, Information processing for observed jump processes, Information and Control, 22 (1973), pp. 69-78.
[15] I. RUBIN, Regular point processes and their detection, IEEE Trans. Information Theory, IT-18 (1972), pp. 547-557.
[16] D. L. SNYDER, Filtering and detection for doubly stochastic Poisson processes, Ibid., IT-18 (1972), pp. 91-102.
[17] O. MACCHI AND B. C. PICINBONO, Estimation and detection of weak optical signals, Ibid., IT-18 (1972), pp. 562-573.

[18] E. LEHMANN, Testing Statistical Hypotheses, Wiley, New York, 1959.
[19] P. A. MEYER, Square integrable martingales, a survey, Martingales: A Report on a Meeting at Oberwolfach, Lecture Notes in Mathematics, No. 190, Springer-Verlag, Berlin, 1970.
[20] M. H. A. DAVIS, Nonlinear filtering with point process observations, preprint.
[21] M. FUJISAKI, G. KALLIANPUR AND H. KUNITA, Stochastic differential equations for the nonlinear filtering problem, Osaka J. Math. (1), 9 (1972), pp. 19-40.
[22] C. DELLACHERIE, Capacités et processus stochastiques, Springer-Verlag, Berlin, 1972.
[23] T. E. DUNCAN, On the absolute continuity of measures, Ann. Math. Statist., 41 (1970), pp. 30-38.
[24] J. H. VAN SCHUPPEN, Estimation theory for continuous time processes, a martingale approach, Memo #M-405, Electronics Research Lab., University of California, Berkeley, Calif., 1973.
[25] J. L. DOOB, Stochastic Processes, Wiley, New York, 1953.
[26] W. FELLER, An Introduction to Probability Theory and Its Applications, vol. I, Wiley, New York, 1968.
[27] T. E. DUNCAN AND P. VARAIYA, On the solutions of a stochastic control system II, Memo #M-406, Electronics Research Lab., University of California, Berkeley, Calif., 1973.
[28] P. BRÉMAUD, Filtering for point processes, preprint, 1973.
[29] E. B. DYNKIN, Markov Processes I, Academic Press, New York, 1965.
[30] D. L. SNYDER, Smoothing for doubly stochastic Poisson processes, IEEE Trans. Information Theory, IT-18 (1972), pp. 558-562.
[31] A. PRÉKOPA, On stochastic set functions, Acta Math. Acad. Sci. Hungar. (2), 7 (1956), pp. 215-263.
[32] J. E. MOYAL, The general theory of stochastic population processes, Acta Math. (Uppsala), 108 (1962), pp. 1-31.
[33] J. H. VAN SCHUPPEN, Filtering for counting processes, a martingale approach, Proc. 4th Symp. Nonlinear Estimation and Appl., San Diego, Calif., 1973.