APPENDIX A. Some Facts from Probability Theory

1. Convergence of Moments. Uniform Integrability

In all probability courses we are told that convergence in probability and convergence in distribution do not imply that moments converge (even if they exist).

EXAMPLE 1.1. The standard example is $\{X_n,\ n \ge 1\}$ defined by
$$P\{X_n = 0\} = 1 - \frac{1}{n} \quad\text{and}\quad P\{X_n = n\} = \frac{1}{n}. \tag{1.1}$$

Then $X_n \xrightarrow{P} 0$ as $n \to \infty$, but, for example,

$$EX_n \to 1 \quad\text{and}\quad \operatorname{Var} X_n \to \infty \quad\text{as } n \to \infty,$$

that is, the mean converges, but not to the expectation of the limiting random variable, and the variance diverges.

The reason for this behavior is that the distribution mass escapes to infinity in a forbidden way. The adequate concept in this context is the notion of uniform integrability. A sequence of random variables, $\{X_n,\ n \ge 1\}$, is said to be uniformly integrable if
$$\lim_{\alpha \to \infty} E|X_n| I\{|X_n| > \alpha\} = 0 \quad\text{uniformly in } n. \tag{1.2}$$
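For the sequence defined by (1.1) these quantities are easy to compute explicitly; this is the calculation asked for in Problem 1 of Section 5. We have
$$EX_n = n \cdot \frac{1}{n} = 1, \qquad EX_n^2 = n^2 \cdot \frac{1}{n} = n, \qquad \operatorname{Var} X_n = n - 1 \to \infty,$$
and, for every fixed $\alpha > 0$ and all $n > \alpha$,
$$EX_n I\{X_n > \alpha\} = n \cdot P\{X_n = n\} = 1,$$
so the limit in (1.2) cannot be $0$ uniformly in $n$.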

In particular, the sequence defined by (1.1) is not uniformly integrable. Another way to check uniform integrability is given by the following criterion (see e.g. Chung (1974), Theorem 4.5.3).

Lemma 1.1. A sequence $\{Y_n,\ n \ge 1\}$ is uniformly integrable iff

(i) $\sup_n E|Y_n| < \infty$.

(ii) For every $\varepsilon > 0$ there exists $\delta > 0$, such that for all events $A$ with $P\{A\} < \delta$ we have
$$E|Y_n| I\{A\} < \varepsilon \quad\text{for all } n. \tag{1.3}$$

The following is an important result connecting uniform integrability and moment convergence.

Theorem 1.1. Let $0 < r < \infty$, suppose that $E|X_n|^r < \infty$ for all $n$ and that $X_n \xrightarrow{P} X$ as $n \to \infty$. The following are equivalent:

(i) $X_n \to X$ in $L^r$ as $n \to \infty$,

(ii) $E|X_n|^r \to E|X|^r < \infty$ as $n \to \infty$,

(iii) $\{|X_n|^r,\ n \ge 1\}$ is uniformly integrable.

Furthermore, if $X_n \xrightarrow{P} X$ and one of (i)-(iii) holds, then

(iv) $E|X_n|^p \to E|X|^p$ as $n \to \infty$ for all $p$, $0 < p \le r$.

For proofs, see e.g. Loève (1977), pp. 165-166, where this result (apart from (iv)) is called an $L^r$-convergence theorem, and Chung (1974), Theorem 4.5.4.

Remark 1.1. The theorem remains obviously true if one assumes convergence almost surely instead of convergence in probability, but also (except for (i)) if one assumes convergence in distribution. For the latter, see Billingsley (1968), Theorem 5.4.

Remark 1.2. The implications (iii) => (i), (ii), (iv) remain true for families of random variables.

Lemma 1.2. Let $U$ and $V$ be positive random variables, such that $EU^r < \infty$ and $EV^r < \infty$ for some $r > 0$. Then
$$E(U+V)^r I\{U+V > a\} \le 2^r EU^r I\left\{U > \frac{a}{2}\right\} + 2^r EV^r I\left\{V > \frac{a}{2}\right\} \qquad (a > 0). \tag{1.4}$$

PROOF. Since $U + V \le 2\max\{U, V\}$,
$$E(U+V)^r I\{U+V > a\} \le E(\max\{2U, 2V\})^r I\{\max\{2U, 2V\} > a\} \le 2^r EU^r I\{U > a/2\} + 2^r EV^r I\{V > a/2\}. \qquad \square$$

Lemma 1.3. Let $\{U_n,\ n \ge 1\}$ and $\{V_n,\ n \ge 1\}$ be sequences of positive random variables such that, for some $p > 0$,
$$\{U_n^p,\ n \ge 1\} \quad\text{and}\quad \{V_n^p,\ n \ge 1\} \quad\text{are uniformly integrable.} \tag{1.5}$$
Then
$$\{(U_n + V_n)^p,\ n \ge 1\} \quad\text{is uniformly integrable.} \tag{1.6}$$

PROOF. We have to show that
$$\lim_{\alpha \to \infty} E(U_n + V_n)^p I\{U_n + V_n > \alpha\} = 0 \quad\text{uniformly in } n. \tag{1.7}$$
But this follows from Lemma 1.2 and (1.5). $\square$

Remark 1.3. Lemmas 1.2 and 1.3 can be extended to more than two random variables (sequences) in an obvious way.

2. Moment Inequalities for Martingales

The most technical convergence results are those where moments of stopped sums are involved, in particular, convergence of such moments. For those results we need some moment inequalities for martingales. The study of such inequalities was initiated in Burkholder (1966). Two further important references are Burkholder (1973) and Garsia (1973).

Just as martingales in some sense are generalizations of sequences of partial sums of independent random variables with mean zero, the martingale inequalities we consider below are generalizations of the corresponding inequalities for such sequences of partial sums of independent random variables obtained by Marcinkiewicz and Zygmund (1937, 1938). As a preparation for the proofs of the moment convergence results for stopped random walks we prove moment convergence in the ordinary central limit theorem (see Theorem I.4.2). Since the martingale inequalities below, in the proofs of moment convergence for stopped random walks, will play the role played by the Marcinkiewicz-Zygmund inequalities in the proof of moment convergence in the ordinary central limit theorem, we present the latter inequalities before we derive the former. Finally, since there is equality for the first and second moments of stopped sums just as in the classical case (Theorem I.5.3), we close by quoting an application of Doob's optional sampling theorem to stopped sums, which will be useful.

However, we begin, for completeness, with the definition of a martingale and the most basic convergence result. Thus, let $(\Omega, \mathcal{F}, P)$ be a probability space, let $\{\mathcal{F}_n,\ n \ge 1\}$ be an increasing sequence of sub-$\sigma$-algebras of $\mathcal{F}$ and set $\mathcal{F}_\infty = \sigma\{\bigcup_{n=1}^\infty \mathcal{F}_n\}$ and $\mathcal{F}_0 = \{\emptyset, \Omega\}$. Further, let $\{X_n,\ n \ge 1\}$ be a sequence of integrable random variables. We say that $\{(X_n, \mathcal{F}_n),\ n \ge 1\}$ is a martingale if

(i) $\{X_n,\ n \ge 1\}$ is adapted, that is, $X_n$ is $\mathcal{F}_n$-measurable for all $n$,
(ii) $E(X_n \mid \mathcal{F}_m) = X_m$ for $m \le n$.

Recall that a simple example of a martingale is obtained by $X_n = \sum_{k=1}^n Y_k$, where $\{Y_k,\ k \ge 1\}$ are independent random variables with mean $0$ and $\mathcal{F}_n = \sigma\{Y_k,\ k \le n\}$.
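For this example the martingale property is a one-line check: for $m \le n$, independence of $Y_{m+1}, \dots, Y_n$ from $\mathcal{F}_m$ and $EY_k = 0$ give
$$E(X_n \mid \mathcal{F}_m) = X_m + \sum_{k=m+1}^n E(Y_k \mid \mathcal{F}_m) = X_m + \sum_{k=m+1}^n EY_k = X_m.$$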

The classical reference for martingale theory is Doob (1953). For more recent books, see e.g. Chung (1974) and Neveu (1975).

We say that $\{(X_n, \mathcal{G}_n),\ n \ge 1\}$ is a reversed martingale if $\{\mathcal{G}_n,\ n \ge 1\}$ is a decreasing sequence of sub-$\sigma$-algebras of $\mathcal{F}$ and

(i) $X_n$ is $\mathcal{G}_n$-measurable for all $n$,
(ii) $E(X_n \mid \mathcal{G}_m) = X_m$ for $n \le m$.

Alternatively one can define a reversed martingale exactly like a martingale, but with the index set being the negative integers. A common example of a reversed martingale is $X_n = (1/n)\sum_{k=1}^n Y_k$, where $\{Y_k,\ k \ge 1\}$ are i.i.d. random variables with finite mean and $\mathcal{G}_n = \sigma\{X_k,\ k \ge n\}$.

If the equality sign in (ii) of the definitions is replaced by $\ge$ ($\le$) we have sub(super)martingales.

The classical martingale convergence theorems state that an $L^1$-bounded martingale is a.s. convergent and that all reversed martingales converge a.s. and in $L^1$. The following is an important result on moment convergence for martingales.

Theorem 2.1. Let $\{(X_n, \mathcal{F}_n),\ n \ge 1\}$ be a martingale. The following are equivalent:

(i) $\{X_n,\ n \ge 1\}$ is uniformly integrable,
(ii) $X_n$ converges in $L^1$ as $n \to \infty$,
(iii) $X_n \to X_\infty$ a.s. as $n \to \infty$, where $X_\infty$ is integrable and such that $\{(X_n, \mathcal{F}_n),\ 1 \le n \le \infty\}$ is a martingale,
(iv) there exists a random variable $Y$ such that $X_n = E(Y \mid \mathcal{F}_n)$ for all $n$.

A similar result is true for reversed martingales, but, since every reversed martingale is, in fact, uniformly integrable, all corresponding conditions are automatically satisfied.

Moment Inequalities for Sums of Independent Random Variables

Let $\{X_k,\ k \ge 1\}$ be a sequence of random variables with the same distribution, such that $E|X_1|^r < \infty$ for some $r \ge 1$ and let $\{S_n,\ n \ge 1\}$ be their partial sums. It is easy to see, by applying Minkowski's inequality, that
$$E|S_n|^r \le n^r E|X_1|^r, \tag{2.1}$$
which gives an upper bound for $E|S_n|^r$ of the order of magnitude of $n^r$. Now, if, for example, the summands are i.i.d. with mean $0$ and finite variance, $\sigma^2$, the central limit theorem states that
$$\frac{S_n}{\sigma\sqrt{n}} \xrightarrow{d} N(0, 1) \quad\text{as } n \to \infty.$$

In view of the previous section it would therefore follow that $E|S_n/\sqrt{n}|^r$ converges as $n \to \infty$ provided $\{|S_n/\sqrt{n}|^r,\ n \ge 1\}$ is uniformly integrable, in which case $E|S_n|^r$ would be $\mathcal{O}(n^{r/2})$ instead of $\mathcal{O}(n^r)$. Below we shall see that this is, indeed, the correct order of magnitude when $r \ge 2$.

By starting with Khintchine's inequality and Rademacher functions, Marcinkiewicz and Zygmund (1937, 1938) proved (without using the central limit theorem) the following celebrated inequalities: Let $\{X_k,\ k \ge 1\}$ be independent (not necessarily identically distributed) random variables with mean $0$ and a finite moment of some order $r \ge 1$. Let $\{S_n,\ n \ge 1\}$ denote the partial sums. Then there exist numerical constants $A_r$ and $B_r$, depending on $r$ only, such that

$$A_r E\left(\sum_{k=1}^n X_k^2\right)^{r/2} \le E|S_n|^r \le B_r E\left(\sum_{k=1}^n X_k^2\right)^{r/2}. \tag{2.2}$$

If, in particular, the summands are i.i.d., the rightmost inequality can be estimated by Minkowski's inequality if $r/2 \ge 1$ and by the $c_r$-inequality if $r/2 \le 1$ and one obtains

$$E|S_n|^r \le \begin{cases} B_r\, n\, E|X_1|^r & \text{for } 1 \le r \le 2,\\[2pt] B_r\, n^{r/2}\, E|X_1|^r & \text{for } r \ge 2. \end{cases} \tag{2.3}$$
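In more detail (this is Problem 6 in Section 5): for $r \ge 2$, Minkowski's inequality in $L^{r/2}$ yields
$$E\left(\sum_{k=1}^n X_k^2\right)^{r/2} \le \left(\sum_{k=1}^n \left(E|X_k|^r\right)^{2/r}\right)^{r/2} = n^{r/2}\, E|X_1|^r,$$
and for $1 \le r \le 2$ the $c_r$-inequality (the subadditivity of $x \mapsto x^{r/2}$ for $x \ge 0$) yields
$$E\left(\sum_{k=1}^n X_k^2\right)^{r/2} \le \sum_{k=1}^n E|X_k|^r = n\, E|X_1|^r.$$
Inserting these bounds into the right-hand inequality of (2.2) gives (2.3).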

For $r = 2$ we have, of course, $ES_n^2 = nEX_1^2$ and, for $r = 1$, $E|S_n| \le nE|X_1|$. Note that (2.3), for $r \ge 2$, can be rewritten as $E|S_n/\sqrt{n}|^r \le B_r \cdot E|X_1|^r$. Since, by Billingsley (1968), Theorem 5.3,

$$\liminf_{n \to \infty} E|S_n/\sqrt{n}|^r \ge E|Z|^r,$$
where $Z$ is normal with mean $0$ and variance $\sigma^2$, we have established the correct order of magnitude for these moments. That they actually converge is proved in Theorem I.4.2, by showing that the sequence $\{|S_n/\sqrt{n}|^r,\ n \ge 1\}$ is uniformly integrable.

Moment Inequalities for Martingales

We now turn our attention to martingale extensions of the Marcinkiewicz-Zygmund inequalities (2.2). The first extension is due to Burkholder (1966), Theorem 9, where it is shown that (2.2) remains true for $r > 1$, if $\{X_k,\ k \ge 1\}$ is a sequence of martingale differences and $\{S_n,\ n \ge 1\}$ is a martingale. Later Davis (1970) proved that the right hand inequality also holds for $r = 1$. More precisely, the following result is true.

Theorem 2.2. Let $\{(Z_n, \mathcal{F}_n),\ n \ge 1\}$ be a martingale and set $Y_1 = Z_1$ and $Y_k = Z_k - Z_{k-1}$ for $k \ge 2$. There exist numerical constants $A_r$ and $B_r$ $(r \ge 1)$, depending on $r$ only, such that

(i) $\displaystyle A_r E\left(\sum_{k=1}^n Y_k^2\right)^{r/2} \le E|Z_n|^r \le B_r E\left(\sum_{k=1}^n Y_k^2\right)^{r/2}$ $\quad (r > 1)$,

(ii) $\displaystyle A_1 E\left(\sum_{k=1}^n Y_k^2\right)^{1/2} \le E|Z_n| \le B_1 E\left(\sum_{k=1}^n Y_k^2\right)^{1/2}$ $\quad (r = 1)$.

The application of Theorem 2.2 which will be of interest to us is when the martingale is a stopped sum of i.i.d. random variables with mean $0$ and finite moment of order $r$ $(1 < r \le 2)$. For $r \ge 2$ we shall use the following inequality, which is a special case of Burkholder (1973), Theorem 21.1.

Theorem 2.3. Let $\{(Z_n, \mathcal{F}_n),\ n \ge 1\}$ be a martingale with increments $\{Y_k,\ k \ge 1\}$. There exists a numerical constant $B_r$ $(r \ge 2)$, depending on $r$ only, such that

$$E|Z_n|^r \le B_r E\left(\sum_{k=1}^n E(Y_k^2 \mid \mathcal{F}_{k-1})\right)^{r/2} + B_r E\left(\max_{1 \le k \le n} |Y_k|^r\right). \tag{2.4}$$

In Burkholder (1973) this inequality is proved for more general convex functions than $|x|^r$.

Remark 2.1. The quantities $U_n = (\sum_{k=1}^n Y_k^2)^{1/2}$ and $V_n = (\sum_{k=1}^n E(Y_k^2 \mid \mathcal{F}_{k-1}))^{1/2}$ are called the square function and the conditional square function, respectively, and they play an important role in martingale theory as well as in other branches of mathematics where martingale theory is used. We observe that Theorem 2.2 provides us with a relation between the moments of order $r$ of the martingale and the square function, and Theorem 2.3 gives an upper bound for the moments of order $r$ of the martingale which involves the moment of order $r$ of the conditional square function.
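As an illustration, if the increments $Y_k$ are independent with mean $0$ (the introductory example above), then $E(Y_k^2 \mid \mathcal{F}_{k-1}) = EY_k^2$, so the conditional square function is deterministic and (2.4) reduces to
$$E|Z_n|^r \le B_r \left(\sum_{k=1}^n EY_k^2\right)^{r/2} + B_r E\left(\max_{1 \le k \le n} |Y_k|^r\right) \le B_r \left(\sum_{k=1}^n EY_k^2\right)^{r/2} + B_r \sum_{k=1}^n E|Y_k|^r;$$
in the i.i.d. case the two terms are of order $n^{r/2}$ and $n$, respectively, in agreement with (2.3) for $r \ge 2$.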

An Application of Doob's Optional Sampling Theorem

Finally, suppose that $\{X_k,\ k \ge 1\}$ are i.i.d. random variables with mean $0$, which are adapted to a sequence of increasing sub-$\sigma$-algebras $\{\mathcal{F}_n,\ n \ge 1\}$, let $\{S_n,\ n \ge 1\}$ denote the partial sums and let $\tau$ be a stopping time, that is,
$$\{\tau = n\} \in \mathcal{F}_n \qquad (n \ge 1). \tag{2.5}$$

Since $\{(S_n, \mathcal{F}_n),\ n \ge 1\}$ is a martingale we can apply Doob's optional sampling theorem (see e.g. Doob (1953), Section VII.2 and Chung (1974), Section 9.3), which yields the following result.

Theorem 2.4. Under the above assumptions

$$\{(S_{\tau \wedge n}, \mathcal{F}_n),\ n \ge 1\} \quad\text{is a martingale.} \tag{2.6}$$

In particular,
$$ES_{\tau \wedge n} = 0 \quad\text{for all } n. \tag{2.7}$$
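A typical way (2.7) is used, sketched here under an extra integrability assumption: if $\tau < \infty$ a.s. and $\{S_{\tau \wedge n},\ n \ge 1\}$ is uniformly integrable, then $S_{\tau \wedge n} \to S_\tau$ a.s. as $n \to \infty$, and Theorem 1.1 yields
$$ES_\tau = \lim_{n \to \infty} ES_{\tau \wedge n} = 0.$$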

3. Convergence of Probability Measures

In this section we shall review some results, which are used in Chapter V. A basic reference is Billingsley (1968).

Definition 3.1. Let $S$ be a metric space, let $\mathcal{S}$ be the $\sigma$-algebra generated by the open sets and let $\{P_n,\ n \ge 1\}$ and $P$ be probability measures on $\mathcal{S}$. We say that $P_n$ converges weakly to $P$, denoted $P_n \Rightarrow P$, if
$$\int f\, dP_n \to \int f\, dP \tag{3.1}$$
for all $f \in C(S)$, where $C(S)$ is the set of bounded, continuous, real valued functions on $S$.

The first important case is when $(S, \mathcal{S}) = (C, \mathcal{C})$, where $C = C[0,1]$ is the space of continuous functions on $[0,1]$ with the uniform metric. In this metric we say that, if $\{x_n,\ n \ge 1\}$ and $x$ are elements of $C$, then $x_n$ is $U$-convergent to $x$,
$$x_n \to x\ (U) \quad\text{as } n \to \infty, \tag{3.2}$$
if
$$\sup_{0 \le t \le 1} |x_n(t) - x(t)| \to 0 \quad\text{as } n \to \infty. \tag{3.3}$$

When working with random elements, that is, with mappings from our given probability space $(\Omega, \mathcal{F}, P)$ into $S$, we say that a sequence $\{X_n,\ n \ge 1\}$ of random elements converges in distribution to the random element $X$,
$$X_n \Rightarrow X \quad\text{as } n \to \infty, \tag{3.4}$$
if the corresponding distributions converge weakly in the sense of Definition 3.1.

Remark 3.1. Sometimes it will be convenient to add a letter above the arrow, which denotes the topology considered. (For the space $C$ this is, in general, not necessary; see further below.) We shall also use the mixture $X_n \Rightarrow P$ as $n \to \infty$.

Now, let $x \in C$ and define $\pi_{t_1, \dots, t_k}$ as the mapping that carries $x$ to $(x(t_1), \dots, x(t_k)) \in R^k$ $(k \ge 1)$. The sets of the form $\pi_{t_1, \dots, t_k}^{-1} H$ with $H$ a Borel set in $R^k$ $(k \ge 1)$ constitute the finite-dimensional sets. Now, the convergence of all finite-dimensional distributions does not in general imply weak convergence. In order to find additional criteria for this to be the case we introduce the concept of tightness.

Definition 3.2. Let $S$ be a metric space. A family $\Pi$ of probability measures is tight if for every $\varepsilon > 0$ there exists a compact set $K$ such that
$$P\{K\} > 1 - \varepsilon \quad\text{for all } P \in \Pi. \tag{3.5}$$

The following theorem then holds.

Theorem 3.1. Let $\{P_n,\ n \ge 1\}$ and $P$ be probability measures on $(C, \mathcal{C})$. If the finite-dimensional distributions of $P_n$ converge to those of $P$ and if $\{P_n,\ n \ge 1\}$ is tight, then $P_n \Rightarrow P$ as $n \to \infty$.

It still remains to find criteria which allow us to apply this result. One important case of interest in the context of the present book is the functional central limit theorem called Donsker's theorem, which we now present.

Let $\{\xi_k,\ k \ge 1\}$ be i.i.d. random variables defined on a common probability space $(\Omega, \mathcal{F}, P)$ and suppose that $E\xi_1 = 0$ and $\operatorname{Var} \xi_1 = \sigma^2 < \infty$. Further, set $S_n = \sum_{k=1}^n \xi_k$ $(n \ge 0)$ $(S_0 = 0)$ and define
$$X_n(t, \omega) = \frac{1}{\sigma\sqrt{n}}\, S_{[nt]}(\omega) + \frac{nt - [nt]}{\sigma\sqrt{n}}\, \xi_{[nt]+1}(\omega) \qquad (0 \le t \le 1). \tag{3.6}$$

Theorem 3.2. $X_n \Rightarrow W$ as $n \to \infty$, where $W = \{W(t),\ 0 \le t \le 1\}$ is the Wiener measure on $(C, \mathcal{C})$.
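As a consistency check of (3.6) (not, of course, a proof of the theorem): for fixed $0 < t \le 1$ the interpolation term tends to $0$ in probability, and the ordinary central limit theorem gives
$$\frac{S_{[nt]}}{\sigma\sqrt{n}} = \sqrt{\frac{[nt]}{n}} \cdot \frac{S_{[nt]}}{\sigma\sqrt{[nt]}} \xrightarrow{d} N(0, t) \quad\text{as } n \to \infty,$$
so that $X_n(t) \xrightarrow{d} N(0, t)$, which is precisely the distribution of $W(t)$.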

For the case when the variance is not necessarily finite a similar result will be stated below. In this situation, however, the limiting processes, the stable processes, no longer have continuous paths.

Definition 3.3. $D = D[0,1]$ is the space of functions on $[0,1]$ that are right continuous and have left-hand limits. Also, $\mathcal{D}$ denotes the $\sigma$-algebra generated by the open sets in the $J_1$-topology defined below. (The open sets in the $M_1$-topology defined below actually generate the same $\sigma$-algebra.)

Remark 3.2. All discontinuities of elements of $D$ are of the first kind.

Remark 3.3. $C$ is a subset of $D$.

For the space $D$ we shall define two different topologies, the $J_1$-topology and the $M_1$-topology, which were introduced in Skorohod (1956).

Definition 3.4. Let $\Lambda$ denote the class of strictly increasing, continuous mappings of $[0,1]$ onto itself, such that $\lambda(0) = 0$ and $\lambda(1) = 1$ for all $\lambda \in \Lambda$. Suppose that $\{x_n,\ n \ge 1\}$ and $x$ are elements of $D$. We say that $x_n$ is $J_1$-convergent to $x$,
$$x_n \to x\ (J_1) \quad\text{as } n \to \infty, \tag{3.7}$$
if there exists $\{\lambda_n,\ n \ge 1\}$ in $\Lambda$, such that
$$\sup_{0 \le t \le 1} |\lambda_n(t) - t| \to 0 \quad\text{and}\quad \sup_{0 \le t \le 1} |x_n(t) - x(\lambda_n(t))| \to 0 \quad\text{as } n \to \infty. \tag{3.8}$$

Definition 3.5. Define the graph $G(x)$ as the subset of $R \times [0,1]$ which contains all pairs $(x, t)$ with $t \in [0,1]$ such that the point $x$ belongs to the segment joining $x(t-)$ and $x(t)$, that is,
$$G(x) = \{(x, t)\colon 0 \le t \le 1 \ \text{and}\ x(t-) \le x \le x(t)\}. \tag{3.9}$$
$G(x)$ is a continuous curve in $R \times [0,1]$. The pair of functions $(y(s), t(s))$ gives a parametric representation of the graph if $(y(s), t(s))$ is a continuous 1-1 mapping of $[0,1]$ onto $G(x)$, such that $t(s)$ is nondecreasing.

Definition 3.6. Suppose that $\{x_n,\ n \ge 1\}$ and $x$ are elements of $D$. We say that $x_n$ is $M_1$-convergent to $x$,
$$x_n \to x\ (M_1) \quad\text{as } n \to \infty, \tag{3.10}$$
if there exist parametric representations $(y_n(s), t_n(s))$ of $G(x_n)$ and $(y(s), t(s))$ of $G(x)$, such that
$$\sup_{0 \le s \le 1} \big(|y_n(s) - y(s)| + |t_n(s) - t(s)|\big) \to 0 \quad\text{as } n \to \infty. \tag{3.11}$$

Remark 3.4. The $M_1$-topology is weaker than the $J_1$-topology, which, in turn, is weaker than the $U$-topology. If the limit $x \in C$, the $J_1$-topology and the $M_1$-topology both reduce to the $U$-topology.
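A standard example illustrating the difference: let $x = I_{[1/2,\,1]}$ and $x_n = I_{[1/2 + 1/n,\,1]}$ $(n \ge 3)$. Then $\sup_{0 \le t \le 1} |x_n(t) - x(t)| = 1$ for all $n$, so $x_n \not\to x$ $(U)$; but choosing $\lambda_n \in \Lambda$ piecewise linear with $\lambda_n(1/2 + 1/n) = 1/2$ gives $x_n(t) = x(\lambda_n(t))$ and $\sup_{0 \le t \le 1} |\lambda_n(t) - t| \le 1/n$, so that $x_n \to x$ $(J_1)$ (and hence also $(M_1)$), in accordance with the fact that the limit here is not continuous.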

We are now ready to state the stable counterpart of Theorem 3.2. Let $\{\xi_k,\ k \ge 1\}$ be i.i.d. random variables with mean $0$ and set $S_n = \sum_{k=1}^n \xi_k$ $(n \ge 0)$. Suppose that $\{B_n,\ n \ge 1\}$ are positive, normalizing coefficients such that
$$P\{S_n \le B_n x\} \to G_\alpha(x) \quad\text{as } n \to \infty, \tag{3.12}$$
where $G_\alpha(x)$ is the distribution function of a stable law with index $\alpha$ $(1 < \alpha \le 2)$, and define
$$X_n(t, \omega) = \frac{S_{[nt]}(\omega)}{B_n} \qquad (0 \le t \le 1). \tag{3.13}$$

Theorem 3.3. $X_n \Rightarrow X$ as $n \to \infty$, where $X \in D[0,1]$ is the stable process whose one-dimensional marginal distribution at $t = 1$ is $G_\alpha$.

In connection with this result we also refer to Gikhman and Skorohod (1969), Chapter IX.6.

Remark 3.5. When $\sigma^2 = \operatorname{Var} \xi_1 < \infty$ we can use $B_n = \sigma\sqrt{n}$ in (3.13) and it follows that $X_n \overset{J_1}{\Rightarrow} W$ as $n \to \infty$, but we can actually conclude that $X_n \overset{U}{\Rightarrow} W$ as $n \to \infty$. In this case we may define $X_n$ by (3.6) or by (3.13); see Remark 3.4 and Billingsley (1968), Section 18. However, due to measurability problems it is not always true that one has weak convergence in the $U$-topology for processes converging weakly to $W$ in the $J_1$-topology; see Billingsley (1968), Section 18.

The processes studied in Chapter V are all suitable functions of $\{X_n,\ n \ge 1\}$ as defined by (3.6) or (3.13). Instead of verifying that Theorem 3.1 is applicable we shall use so-called continuous mapping theorems.

Theorem 3.4. Let $S$ and $S'$ be metric spaces, let $h\colon S \to S'$ be measurable and define
$$D_h = \text{the set of discontinuities of } h. \tag{3.14}$$
If
$$P_n \Rightarrow P \quad\text{as } n \to \infty \quad\text{and}\quad P\{D_h\} = 0, \tag{3.15}$$
then
$$P_n h^{-1} \Rightarrow P h^{-1} \quad\text{as } n \to \infty. \tag{3.16}$$

The corresponding counterpart for random elements is the following.

Theorem 3.5. Let $S$, $S'$, $h$ and $D_h$ be as before and suppose that $\{X_n,\ n \ge 1\}$ and $X$ are random elements of $S$. If
$$X_n \Rightarrow X \quad\text{as } n \to \infty \quad\text{and}\quad P\{X \in D_h\} = 0, \tag{3.17}$$
then
$$h(X_n) \Rightarrow h(X) \quad\text{as } n \to \infty. \tag{3.18}$$
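As an illustration of how Theorems 3.2 and 3.5 combine: the functional $h(x) = \sup_{0 \le t \le 1} x(t)$ satisfies $|h(x) - h(y)| \le \sup_{0 \le t \le 1} |x(t) - y(t)|$ and is therefore continuous on $(C, U)$, so that, with $X_n$ as in (3.6),
$$h(X_n) = \max_{0 \le k \le n} \frac{S_k}{\sigma\sqrt{n}} \Rightarrow \sup_{0 \le t \le 1} W(t) \quad\text{as } n \to \infty,$$
where the limit, by the reflection principle, has the same distribution as $|W(1)|$ (cf. Erdős and Kac (1946)).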

It is also possible to obtain such a result for a sequence of functions, $\{h_n,\ n \ge 1\}$, which are not necessarily the same. The following is the result corresponding to Theorem 3.5.

Theorem 3.6. Let $S$ and $S'$ be separable metric spaces and suppose that $\{h_n,\ n \ge 1\}$ and $h$ are measurable mappings from $S$ to $S'$. If
$$X_n \Rightarrow X \quad\text{as } n \to \infty \quad\text{and}\quad P\{X \in E\} = 0, \tag{3.19}$$
where
$$E = \{x\colon h_n x_n \not\to hx \ \text{for some } \{x_n,\ n \ge 1\} \ \text{such that } x_n \to x\}, \tag{3.20}$$
then
$$h_n(X_n) \Rightarrow h(X) \quad\text{as } n \to \infty. \tag{3.21}$$

Remark 3.6. If $h_n = h$ $(n \ge 1)$ then $E = D_h$.

Remark 3.7. The spaces $(C, U)$, $(D, J_1)$ and $(D, M_1)$ are separable (whereas $(D, U)$ is not).

If, for example, we wish to apply Theorem 3.5 to Theorem 3.2 it follows that, if $h$ is continuous on $C$, then $h(X_n) \Rightarrow h(W)$ as $n \to \infty$. If we cannot compute the limiting distribution of $h(W)$ directly, it may be possible to compute the limit of the distributions of $h(X_n)$ as $n \to \infty$ for some special choice of $\{\xi_k,\ k \ge 1\}$ and thus, since the theorem tells us that the limit of $h(X_n)$ is $h(W)$ irrespective of the sequence $\{\xi_k,\ k \ge 1\}$, this provides us with a method to compute the distribution of $h(W)$. Theorems 3.2 and 3.3 are therefore sometimes called (weak) invariance principles.

The continuous mapping theorems just mentioned are related to convergence in distribution. Similarly one can state such theorems for other convergence modes. However, the following result, due to Skorohod (1956, §3) (see also Vervaat (1972a,b) and Whitt (1980), Section 1 and references given there) shows that it actually suffices to consider deterministic continuous mapping results.

Theorem 3.7. Let $\{X_n,\ n \ge 1\}$ and $X$ belong to a separable metric space and suppose that $X_n \Rightarrow X$ as $n \to \infty$. Then there exists a probability space supporting the random elements $\{Y_n,\ n \ge 1\}$ and $Y$ such that
$$Y_n \overset{d}{=} X_n \ (n \ge 1) \quad\text{and}\quad Y \overset{d}{=} X \tag{3.22}$$
and such that
$$Y_n \to Y \quad\text{a.s. as } n \to \infty. \tag{3.23}$$

Let us, as an example, show how this result can be used to prove Theorem 3.5.

EXAMPLE 3.1. Suppose that $h$ is a continuous functional and that $X_n \Rightarrow X$ as $n \to \infty$. Then, since $Y_n \to Y$ a.s. we have $h(Y_n) \to h(Y)$ a.s. as $n \to \infty$ and hence $h(Y_n) \Rightarrow h(Y)$ as $n \to \infty$. Finally, since $h(Y_n) \overset{d}{=} h(X_n)$ $(n \ge 1)$ and $h(Y) \overset{d}{=} h(X)$, we conclude that $h(X_n) \Rightarrow h(X)$ as $n \to \infty$.

The continuity of a number of functions is investigated in Whitt (1980).

Remark 3.8. The above theory has been generalized to the space $D[0, \infty)$, see Lindvall (1973), as follows. Let $\{X_n,\ n \ge 1\}$ and $X$ be random elements in $(D[0,\infty), \mathcal{D}[0,\infty))$ and define, for $b > 0$, $r_b\colon D[0,\infty) \to D[0,b]$ by
$$(r_b x)(t) = x(t) \qquad (t \le b). \tag{3.24}$$
Then
$$X_n \Rightarrow X \ \text{as } n \to \infty \quad\text{iff}\quad r_b X_n \Rightarrow r_b X \ \text{as } n \to \infty, \tag{3.25}$$
for every $b$ such that $P\{X(b) \ne X(b-)\} = 0$. The same is true for the other two topologies discussed above. Thus, convergence in $C[0,\infty)$ is uniform convergence on compact sets. It is easily seen that the proofs of Theorems 3.2 and 3.3 carry over to $C[0,b]$ and $D[0,b]$ for any $b > 0$, that is, the results also hold true in $C[0,\infty)$ and $D[0,\infty)$, respectively. In our problems in Chapter V it is more natural and also easier to work in the latter spaces.

When considering Anscombe type results we shall use random time changes (cf. Billingsley (1968), Section 17). To do so we also need the space $D_0 = D_0[0,\infty)$, which consists of all elements $\varphi$ of $D[0,\infty)$ that are nondecreasing and satisfy $\varphi(t) \ge 0$ $(t \ge 0)$. This space is topologized by the relativized $J_1$-topology. For $x \in D$ and $\varphi \in D_0$ we let $x \circ \varphi$ denote the composition of $x$ and $\varphi$, which is defined through the relation
$$(x \circ \varphi)(t) = x(\varphi(t)). \tag{3.26}$$
This is a measurable mapping $D \times D_0 \to D$.

We shall also use the following results, which involve the notion of convergence in probability, defined next.

Definition 3.7. Let $(S, \rho)$ be a metric space. If for some $a \in S$ we have
$$P\{\rho(X_n, a) \ge \varepsilon\} \to 0 \quad\text{as } n \to \infty \quad\text{for all } \varepsilon > 0, \tag{3.27}$$
we say that $X_n$ converges in probability to $a$ as $n \to \infty$ and write $X_n \xrightarrow{P} a$ as $n \to \infty$.

Theorem 3.8. Let $(S, \rho)$ be a separable metric space and suppose that $\{X_n,\ n \ge 1\}$, $\{Y_n,\ n \ge 1\}$ and $X$ are random elements of $S$, such that $X_n$ and $Y_n$ have the same domain for each $n$. If
$$X_n \Rightarrow X \quad\text{and}\quad \rho(X_n, Y_n) \xrightarrow{P} 0 \quad\text{as } n \to \infty, \tag{3.28}$$
then
$$Y_n \Rightarrow X \quad\text{as } n \to \infty. \tag{3.29}$$

Theorem 3.9. Let $\{X_n,\ n \ge 1\}$ and $X$ be random elements of $S'$ and let $\{Y_n,\ n \ge 1\}$ be random elements of $S''$, where $S'$ and $S''$ are separable metric spaces. Further, assume that $X_n$ and $Y_n$ have the same domain for each $n$. If
$$X_n \Rightarrow X \quad\text{and}\quad Y_n \xrightarrow{P} a \quad\text{as } n \to \infty, \tag{3.30}$$
then
$$(X_n, Y_n) \Rightarrow (X, a) \quad\text{as } n \to \infty. \tag{3.31}$$

4. Strong Invariance Principles

Just as Donsker's theorem above extends the central limit theorem, the classical Hartman-Wintner law of the iterated logarithm has been extended by Strassen (1964); see also e.g. Stout (1974), Chapter 5. Let $\{\xi_k,\ k \ge 1\}$ be i.i.d. random variables with mean $0$ and (for convenience) variance $1$ and define, for $n \ge 3$,
$$X_n(t, \omega) = \frac{S_{[nt]}(\omega)}{\sqrt{2n \log\log n}} + \frac{nt - [nt]}{\sqrt{2n \log\log n}}\, \xi_{[nt]+1}(\omega) \qquad (0 \le t \le 1), \tag{4.1}$$
where, as always, $S_n = \sum_{k=1}^n \xi_k$ $(n \ge 0)$. The Hartman-Wintner (1941) law states that
$$\limsup_{n \to \infty}\ (\liminf_{n \to \infty})\ X_n(1) = +1\ (-1) \quad\text{a.s.,} \tag{4.2}$$
that is, the extreme limit points of $\{X_n(1),\ n \ge 3\}$ are $+1$ and $-1$. A strengthening of this result is that
$$C(\{X_n(1),\ n \ge 3\}) = [-1, 1] \quad\text{a.s.,} \tag{4.3}$$

that is, the set of limit points of $\{X_n(1),\ n \ge 3\}$ is, in fact, the set of all points between the extreme limit points; see e.g. Stout (1974), Section 5.3 or De Acosta (1983), Theorem 2.5. ($C\{x_n\}$ denotes the cluster set of $\{x_n\}$.)

The result below describes the set of limit points of $\{X_n(t)\ (0 \le t \le 1),\ n \ge 3\}$. The method in Strassen (1964) is to prove such a result for Brownian motion and then to use a strong approximation result, which states that the pathwise behaviors of $X_n(t)$ $(0 \le t \le 1)$ and of Brownian motion are "close." (Note the difference with Donsker's theorem, which is a distributional result.) Theorem 4.1 below is therefore sometimes called a strong invariance principle or an almost sure invariance principle. To describe the result we let

$$K = \left\{x \in AC[0,1]\colon x(0) = 0 \ \text{and}\ \int_0^1 (x'(t))^2\, dt \le 1\right\}, \tag{4.4}$$
where $AC$ denotes the set of absolutely continuous functions. If $x$ is considered as being the motion of a particle of mass $2$ from time $0$ to time $1$, then $K$ is the set of motions starting at $0$ and with kinetic energy at most equal to one.
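Two simple observations about $K$ (cf. Remark 4.1 below): the function $x(t) = t$ belongs to $K$, since $x(0) = 0$ and $\int_0^1 (x'(t))^2\, dt = 1$, and, by the Cauchy-Schwarz inequality, every $x \in K$ satisfies
$$|x(1)| = \left|\int_0^1 x'(t)\, dt\right| \le \left(\int_0^1 (x'(t))^2\, dt\right)^{1/2} \le 1,$$
so that (using also $x(t) = ct$, $|c| \le 1$) the projection of $K$ on the coordinate $t = 1$ is exactly $[-1, 1]$, in agreement with (4.3).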

Lemma 4.1. K is compact (in the uniform topology).

PROOF. See Stout (1974), pp. 282-284. $\square$

We are now ready to state Strassen's result.

Theorem 4.1. With probability one the sequence $\{X_n,\ n \ge 3\}$, defined in (4.1), is relatively compact (in the uniform topology) and the set of its limit points coincides with $K$.

Remark 4.1. Since the projections are continuous mappings (cf. above) we obtain (4.3) and the Hartman-Wintner law (4.2) as trivial corollaries. The reason we mention De Acosta (1983) above is that (to the best of our knowledge) this is the first place where a direct proof of (4.3) based on elementary methods only is given (that is, neither Strassen's result nor Skorohod embedding is used). For more on the law of the iterated logarithm we refer to the recent survey paper by Bingham (1986).

Remark 4.2. For the extension to $D[0, \infty)$ we refer to Vervaat (1972a), Section 1.5 (see also Vervaat (1972b)).

5. Problems

1. Show that the sequence given in (1.1) is not uniformly integrable.

2. Let $\{Y_n,\ n \ge 1\}$ be such that $\sup_n E|Y_n|^r < \infty$. Show that $\{|Y_n|^p,\ n \ge 1\}$ is uniformly integrable for all $p$, $0 < p < r$.

3. Observe the strict inequality, $p < r$, in Problem 2. Compare with the sequence (1.1) and $r = 1$. Show that $\sup_n E|X_n| < \infty$ but $\{|X_n|,\ n \ge 1\}$ is not uniformly integrable.

4. Prove Lemma 1.1.

5. Let $\{U_n,\ n \ge 1\}$ and $\{V_n,\ n \ge 1\}$ be sequences of random variables, such that $U_n \le V_n$ a.s. Show that if $\{V_n,\ n \ge 1\}$ is uniformly integrable, then so is $\{U_n,\ n \ge 1\}$.

6. Deduce (2.3) from (2.2).

7. Let $\{X_k,\ k \ge 1\}$ be independent random variables, such that $EX_k = \mu_k$ and $\operatorname{Var} X_k = \sigma_k^2$ exist, finite. Set $S_n = \sum_{k=1}^n X_k$, $m_n = \sum_{k=1}^n \mu_k$ and $s_n^2 = \sum_{k=1}^n \sigma_k^2$ and define $\mathcal{F}_n = \sigma\{X_1, \dots, X_n\}$ $(n \ge 0)$. Prove that
$$\{((S_n - m_n)^2 - s_n^2,\ \mathcal{F}_n),\ n \ge 1\} \quad\text{is a martingale.} \tag{5.1}$$

8. Let $\{X_k,\ k \ge 1\}$ be a sequence of i.i.d. random variables with finite mean. Set $S_n = \sum_{k=1}^n X_k$ $(n \ge 1)$ and define, for $n \ge 1$, $\mathcal{G}_n = \sigma\{S_k,\ k \ge n\}$. Show that
$$\left\{\left(\frac{S_n}{n},\ \mathcal{G}_n\right),\ n \ge 1\right\} \quad\text{is a reversed martingale.} \tag{5.2}$$

9. Let $\tau$ be a stopping time relative to an increasing sequence of $\sigma$-algebras and set $\tau_n = \tau \wedge n$. Show that $\tau_n$ is a stopping time.

10. Let $\tau_1$ and $\tau_2$ be stopping times relative to an increasing sequence of $\sigma$-algebras. Show that $\tau_1 \wedge \tau_2$ and $\tau_1 \vee \tau_2$ are stopping times.

11. What about Problem 10 when $\tau_1$ and $\tau_2$ are stopping times relative to different increasing sequences of $\sigma$-algebras?

12. Let $\{X_n,\ n \ge 1\}$ and $X$ be real valued random variables.
a) What does it mean to say that $\{X_n,\ n \ge 1\}$ is tight?
b) Show that, if $X_n$ converges in distribution to $X$ as $n \to \infty$, then $\{X_n,\ n \ge 1\}$ is tight. (Compare with Theorem 3.1; in the real valued case tightness is automatic.)

13. Show that the $J_1$-topology is weaker than the $U$-topology, that is, let $\{x_n,\ n \ge 1\}$ and $x$ be elements of $D$ and show that $x_n \to x\ (U) \Rightarrow x_n \to x\ (J_1)$.

14. Let $\{x_n,\ n \ge 1\}$ be elements of $D$ and suppose that $x \in C$. Show that $x_n \to x\ (J_1) \iff x_n \to x\ (U)$.

APPENDIX B. Some Facts about Regularly Varying Functions

1. Introduction and Definitions

Regular variation was introduced by Karamata (1930). The theory of regularly varying functions has proved to be important in many branches of probability theory. In this appendix we present definitions and a few facts that are needed in Section IV.5. For further reading we suggest Feller (1971), de Haan (1970) and Seneta (1976).

Definition 1.1. A positive, measurable function $U$ on $(0, \infty)$ is regularly varying (at infinity) with exponent $p$ $(-\infty < p < \infty)$ if
$$\frac{U(tx)}{U(t)} \to x^p \quad\text{as } t \to \infty \tag{1.1}$$
for all $x > 0$.

A function that is regularly varying with exponent 0 is called slowly varying.

Definition 1.2. A positive, measurable function $L$ on $(0, \infty)$ is slowly varying (at infinity) if
$$\frac{L(tx)}{L(t)} \to 1 \quad\text{as } t \to \infty \tag{1.2}$$
for all $x > 0$.

Suppose $U$ is regularly varying with exponent $p$. It is then easy to see that we can write
$$U(x) = x^p L(x), \tag{1.3}$$
where $L$ is slowly varying.

Remark 1.1. One can also define regular and slow variation at some finite point $a$. We shall, however, only deal with regular and slow variation at infinity.

Typical examples of regularly varying functions are
$$x^p, \qquad x^p \log x, \qquad x^p\, \frac{\log x}{\log\log x}, \qquad \text{etc.}$$
Typical examples of slowly varying functions are $\log x$, $\log\log x$, etc., but also all functions having a finite, positive limit as $x \to \infty$, such as $\arctan x$.

Remark 1.2. We tacitly assume that the examples above have been modified so that they are positive for all $x > 0$.
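For instance, for $L(x) = \log x$ and fixed $x > 0$,
$$\frac{L(tx)}{L(t)} = \frac{\log t + \log x}{\log t} = 1 + \frac{\log x}{\log t} \to 1 \quad\text{as } t \to \infty,$$
which verifies (1.2); the corresponding computation for $\log\log$ is analogous.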

2. Some Results

We now present some results for regularly and slowly varying functions. We give no proofs (apart from one exception); they can all be found in the references above. Also, for the "typical examples" mentioned above the results are fairly trivial.

Lemma 2.1 (A Representation Formula). A function $L$ varies slowly iff it is of the form
$$L(x) = c(x) \cdot \exp\left\{\int_1^x \frac{\varepsilon(y)}{y}\, dy\right\}, \tag{2.1}$$
where $\varepsilon(x) \to 0$ and $c(x) \to c$ $(0 < c < \infty)$ as $x \to \infty$.

Remark 2.1. It follows from this result that if $L$ varies slowly then
$$x^{-\varepsilon} < L(x) < x^{\varepsilon} \tag{2.2}$$
for any fixed $\varepsilon > 0$ and all sufficiently large $x$.
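A sketch of the upper bound in (2.2), based on (2.1): given $\varepsilon > 0$, choose $x_0$ such that $|\varepsilon(y)| \le \varepsilon/2$ for $y \ge x_0$. For $x \ge x_0$,
$$L(x) \le c(x) \exp\left\{\int_1^{x_0} \frac{|\varepsilon(y)|}{y}\, dy\right\} \cdot \exp\left\{\frac{\varepsilon}{2} \log \frac{x}{x_0}\right\} = C(x_0)\, c(x)\, (x/x_0)^{\varepsilon/2},$$
which is smaller than $x^{\varepsilon}$ for all sufficiently large $x$, since $c(x) \to c < \infty$; the lower bound is obtained in the same way.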

Before we state our next result we note that the ratio between the arguments in the definitions is constant, $x$. It is an easy exercise to show that if $L$ is slowly varying and $\{a_n,\ n \ge 1\}$ and $\{b_n,\ n \ge 1\}$ are sequences of positive reals, such that
$$\frac{a_n}{b_n} \to c \quad (0 < c < \infty) \quad\text{as } n \to \infty, \tag{2.3}$$
then
$$\frac{L(a_n)}{L(b_n)} \to 1 \quad\text{as } n \to \infty. \tag{2.4}$$

A similar result holds for regularly varying functions. The following result, due to Sreehari (1970), is a kind of converse.

Lemma 2.2. Let $f(t)$ and $g(t)$ be real valued functions tending to infinity as $t \to \infty$ and let $L(t)$ be a slowly varying function. If
$$\left(\frac{f(t)}{g(t)}\right)^p \cdot \frac{L(f(t))}{L(g(t))} \to 1 \quad\text{as } t \to \infty \quad\text{for some } p \ne 0, \tag{2.5}$$
then
$$\frac{f(t)}{g(t)} \to 1 \quad\text{as } t \to \infty. \tag{2.6}$$

PROOF. Set $h_* = f \wedge g$ and $h^* = f \vee g$. We use the representation formula for slowly varying functions given in Lemma 2.1, which yields
$$\left(\frac{f(t)}{g(t)}\right)^p \cdot \frac{L(f(t))}{L(g(t))} = \left(\frac{f(t)}{g(t)}\right)^p \cdot \frac{c(f(t))}{c(g(t))} \cdot \exp\left\{\operatorname{sign}(f(t) - g(t)) \cdot \int_{h_*(t)}^{h^*(t)} \frac{\varepsilon(y)}{y}\, dy\right\}. \tag{2.7}$$
By the mean value theorem we have
$$\int_{h_*(t)}^{h^*(t)} \frac{\varepsilon(y)}{y}\, dy = \bar{\varepsilon}(t) \log \frac{h^*(t)}{h_*(t)}, \tag{2.8}$$
where
$$\inf_{h_*(t) \le s \le h^*(t)} \varepsilon(s) \le \bar{\varepsilon}(t) \le \sup_{h_*(t) \le s \le h^*(t)} \varepsilon(s).$$

Since the LHS in (2.7) tends to $1$ and $c(f(t))/c(g(t)) \to 1$ as $t \to \infty$ it follows that
$$\left(\frac{f(t)}{g(t)}\right)^p \cdot \left(\frac{h^*(t)}{h_*(t)}\right)^{\bar{\varepsilon}(t) \cdot \operatorname{sign}(f(t) - g(t))} \to 1 \quad\text{as } t \to \infty, \tag{2.9}$$
which is the same as
$$\left(\frac{f(t)}{g(t)}\right)^{p + \bar{\varepsilon}(t)} \to 1 \quad\text{as } t \to \infty. \tag{2.10}$$

The conclusion now follows from the fact that $\bar{\varepsilon}(t) \to 0$ as $t \to \infty$. $\square$

We finally need a limit theorem for the case when $U$ has a density, $u$.

Lemma 2.3. Suppose that $U$ is regularly varying with exponent $p \ge 0$ and that $U$ has an ultimately monotone density $u$. Then
$$\frac{t\, u(t)}{U(t)} \to p \quad\text{as } t \to \infty. \tag{2.11}$$

For the case $p = 0$ we also refer to Gut (1974a), Lemma 3.1(a).

Bibliography

AleskeviCiene, A. (1975): On the local limit theorem for the first passage time across a barrier (in Russian). Litovsk. Mat. Sb. XV, 23-66. Anderson, K.K. and Athreya, K.B. (1987): A renewal theorem in the infinite mean case. Ann. Probab. 15, 388-393. Anscombe, F.J. (1952): Large-sample theory of sequential estimation. Proc. Cambridge Philos. Soc. 48, 600-607. Arjas, E., Nummelin, E. and Tweedie, R.L. (1978): Uniform limit theorems for non• singular renewal and Markov renewal processes. J. Appl. Probab. 15, 112-125. Asmussen, S. (1982): Conditioned limit theorems relating a to its associate, with applications to risk reserve processes and the GI/G/1 queue. Adv. in Appl. Probab. 14, 143-170. Asmussen, S. (1984): Approximations for the probability of ruin within finite time. Scand. Actuar. J., 31-57; Correction, ibid. (1985), 64. Asmussen, S. (1987): and Queues. John Wiley, New York. Athreya, K., Mc Donald, D. and Ney, P. (1978): Coupling and the renewal theorem. Amer. Math. Monthly 85,809-814. Barlow, R.E. and Proschan, F. (1965): Mathematical Theory of Reliability. John Wiley, New York. Basu, A.K. (1972): Invariance theorems for first passage time random variables. Canad. Math. Bull. 15, 171-176. Baum, L.E. and Katz, M. (1965): Convergence rates in the law oflarge numbers. Trans. Amer. Math. Soc. 120,108-123. Berbee, H.C.P. (1979): Random Walks with Stationary Increments and . Mathematical Centre Tracts 112, Amsterdam. Bickel, P.J. and Yahav, J.A. (1965): Renewal theory in the plane. Ann. Math. Statist. 36, 946-955. Billingsley, P. (1968): Convergence of Probability Measures. John Wiley, New York. Bingham, N.H. (1973): Maxima of sums of random variables and suprema of stable processes. Z. Wahrsch. verw. Gebiete 26, 273-296. Bingham, N.H. (1986): Variants of the law of the iterated logarithm. Bull. London Math. Soc. 18, 433-467. Bingham, N.H. and Goldie, C.M. (1982): Probabilistic and deterministic averaging. Trans. Amer. Math. Soc. 269, 453-480. Blackwell, D. (1946): On an equation ofWald. Ann. Math. Statist. 17, 84-87. 184 Bibliography

Blackwell, D. (1948): A renewal theorem. Duke Math. J. 15, 145-150. Blackwell, D. (1953): Extension of a renewal theorem. Pacific J. Math. 3, 315-320. Breiman, L. (1965): First exit times from a square root boundary. Proc. Fifth Berkeley Symp. Math. Statist. and Probability, Vol. IIjll, 9-16. University of California Press, Berkeley, CA. Brown, B.M. (1969): Moments of a stopping rule related to the central limit theorem. Ann. Math. Statist. 40,1236-1249. Burkholder, D.L. (1966): Martingale transforms. Ann. Math. Statist. 37,1494-1504. Burkholder, D.L. (1973): Distribution function inequalities for martingales. Ann. Probab. 1, 19-42. Carlsson, H. (1982): Error estimates in d-dimensional renewal theory. Compositio Math. 46" 227-253. Carlsson, H. (1983): Remainder term estimates of the renewal function. Ann. Probab. 11, 143-157. Carlsson, H. and Nerman, O. (1986): An alternative proof of Lorden's renewal inequality. Adv. in Appl. Probab. 18,1015-1016. Carlsson, H. and Wainger, S. (1984): On the multi-dimensional renewal theorem. J. Math. Anal. Appl. 100,316-322. Chang, I. and Hsiung, C. (1979): On the uniform integrability oflb-1!PWMbIP, 0 < P < 2. Preprint, NCU Chung-Li, Taiwan. Chang, I.S. and Hsiung, c.A. (1983): Strassen's invariance principle for random subsequences. Z. Wahrsch. verw. Gebiete 64, 401-409. Choquet, G. and Deny, J. (1960): Sur I'equation de J1 = J1 * u. c.R. Acad. Sci. Paris 250, 799-801. Chow, Y.S. (1966): On the moments of some one-sided stopping rules. Ann. Math. Statist. 37, 382-387.

Chow, Y.S. and Hsiung, A. (1976): Limiting behaviour of maxjsn Sjr" and the first passage times in a random walk with positive drift. Bull. Inst. Math. Acad. Sinica 4, 35-44. Chow, Y.S., Hsiung, C.A. and Lai, T.L. (1979): Extended renewal theory and moment convergence in Anscombe's theorem. Ann. Probab. 7, 304-318. Chow, Y.S. and Lai, T.L. (1979): Moments of ladder variables for driftIess random walks. Z. Wahrsch. verw. Gebiete 48, 253-257. Chow, Y.S. and Robbins, H. (1963): A renewal theorem for random variables which are dependent or nonidentically distributed. Ann. Math. Statist. 34, 390-395. Chow, Y.S., Robbins, H. and Siegmund, D. (1971): Great Expectations: The Theory of Optimal Stopping. Houghton-Miffiin, Boston, MA. Chow, Y.S., Robbins, H. and Teicher, H. (1965): Moments of randomly stopped sums. Ann. Math. Statist. 36, 789-799. Chow, Y.S. and Teicher, H. (1966): On second moments of stopping rules. Ann. Math. Statist. 37, 388-392. Chung, KL. (1948): Asymptotic distribution of the maximum cumulative sum of independent random variables. Bull. Amer. Math. Soc. 54,1162-1170. Chung, KL. (1952): On the renewal theorem in higher dimensions. Skand. Aktuarietidskr. 35, 188-194. Chung, KL. (1974): A Course in Probability Theory. 2nd ed. Academic Press, New York. Chung, KL. and Fuchs, W.H.J. (1951): On the distribution of values of sums of random variables. M em. Amer. Math. Soc. 6. Bibliography 185

Chung, K.L. and Ornstein, D. (1962): On the recurrence of sums of random variables. Bull. Amer. Math. Soc. 68, 30-32. Chung, K.L. and Pollard, H. (1952): An extension of renewal theory. Proc. Amer. Math. Soc. 3, 303-309. Chung, K.L. and Wolfowitz, J. (1952): On a limit theorem in renewal theory. Ann. of Math. (2) 55, 1-6. <;inlar, E. (1975): Introduction to Stochastic Processes. Prentice Hall, Englewood Cliffs, N.J. Cox, D.R. (1967): Renewal Theory. Methuen, London. Cox, D.R. and Miller, H.D. (1965): The Theory of Stochastic Processes. Methuen, London. Cox, D.R. and Smith, W.L. (1953): A direct proof of a fundamental theorem of renewal theory. Skand. Aktuarietidskr. 36, 139-150. Cramer, H. (1955): Collective risk theory. The jubilee volume of Forsiikringsbolaget Skandia, Stockholm. Daley, D.J. (1980): Tight bounds for the renewal function of a random walk. Ann. Probab. 8, 615-621. Davis, B. (1970): On the integrability of the martingale square function. Israel J. Math. 8,187-190. De Acosta, A. (1983): A new proof of the Hartman-Wintner law of the iterated logarithm. Ann. Probab. 11,270-276. De Groot, M.H. (1986): A conversation with David Blackwell. Statistical Science 1, 40-53. de Haan, L. (1970): On Regular Variation and Its Application to the Weak Convergence of Sample Extremes. Mathematical Centre Tracts 32, Amsterdam. Doney, R.A. (1966): An analogue of the renewal theorem in higher dimensions. Proc. London Math. Soc. (3) 16, 669-684. Doney, R.A. (1980): Moments of ladder heights in random walks. J. App/. Probab. 17, 248-252. Doney, R.A. (1982): On the existence of the mean ladder height for random walk. Z. Wahrsch. verw. Gebiete 59, 373-382. Donsker, M. (1951): An invariance principle for certain probability limit theorems. M em. Amer. Math. Soc. 6. Doob, J.L. (1948): Renewal theory from the point of view of the theory of probability. Trans. Amer. Math. Soc. 63,422-438. Doob, J.L. (1953): Stochastic Processes. John Wiley, New York. Doob, J.L., Snell, J.L. and Williamson, R.E. (1960): Applications of boundary theory to sums of independent random variables. Contributions to Probability and . Essays in honor of Harold Hotelling (Eds. Ingram Olkin et al.), 182-197. Stanford University Press, Stanford, CA. Dynkin, E.B. (1955): Limit theorems for sums of independent random quantities. Izv. Akad. Nauk SSSR Ser. Mat. 19,247-266. Selected Translations in Math. Stat. and Prob. 1 (1961),171-189. Englund, G. (1980): Remainder term estimate for the asymptotic normality of the number of renewals. J. App/. Probab. 17, 1108-1113. Erdos, P. (1949): On a theorem of Hsu and Robbins. Ann. Math. Statist. 20, 286- 291. Erdos, P. (1950): Remark on my paper "On a theorem of Hsu and Robbins." Ann. Math. Statist. 21, 138. 186 Bibliography

Erdos, P., Feller, W. and Pollard, H. (1949): A theorem on power series. Bull. Amer. Math. Soc. 55, 201-204. Erdos, P. and Kac, M. (1946): On certain limit theorems of the theory of probability. Bull. Amer. Math. Soc. 52, 292-302. Erdos, P. and Renyi, A. (1970): On a new law oflarge numbers. J. Analyse Math. 23, 103-111. Erickson, K.B. (1970): Strong renewal theorems with infinite mean. Trans. Arner. Math. Soc. 151,263-291. Essen, M. (1973): Banach algebra methods in renewal theory. J. Analyse Math. 26, 303-336. Farrell, R.H. (1964): Limit theorems for stopped random walks. Ann. Math. Statist. 35, 1332-1343. Farrell, R.H. (1966): Limit theorems for stopped random walks III. Ann. Math. Statist. 37, 1510-1527. Feller, W. (1941): On the integral equation ofrenewal theory. Ann. Math. Statist. 12, 243-267. Feller, W. (1949): Fluctuation theory and recurrent events. Trans. Amer. Math. Soc. 67,98-119. Feller, W. (1968): An Introduction to Probability Theory and Its Applications, Vol. 1, 3rd ed. John Wiley, New York. Feller, W. (1971): An Introduction to Probability Theory and Its Applications, Vol. 2, 2nd ed. John Wiley, New York. Feller, W. and Orey, S. (1961): A renewal theorem. J. Math. Mech. 10,619-624. Gafurov, M. U. (1980): On the first exit time of a multidimensional random walk from expanding sets. Soviet Math. Dokl. 22, 492-493. Garsia, A.M. (1973): Martingale Inequalities: Seminar Notes on Recent Progress. Benjamin, Reading. MA. Garsia, A. and Lamperti, J. (1962/63): A discrete renewal theorem with infinite mean. Comment. Math. Helv. 37, 221-234. Gikhman, 1.1. and Skorohod, A.V. (1969): Introduction to the Theory of Random Processes. Saunders, Philadelphia, P A. Gnedenko, B.V., Belyayev, Yu. K. and Solovyev, A.D. (1969): Mathematical Methods of Reliability Theory. Academic Press, New York. Grubel, R. (1986): On harmonic renewal measures. Probab. Th. ReI. Fields 71, 393-404. Gundy, R.F. and Siegmund, D. (1967): On a stopping rule and the central limit theorem. Ann. Math. Statist. 38,1915-1917. Gut, A. (1973): A functional central limit theorem connected with extended renewal theory. Z. Wahrsch. verw. Gebiete 27, 123-129. Gut, A. (1974a): On the moments and limit distributions of some first passage times. Ann. Probab. 2, 277-308. Gut, A. (1974b): On the moments of some first passage times for sums of dependent random variables. . Appl. 2,115-126. Gut, A. (1974c): On convergence in r-mean of some first passage times and randomly indexed partial sums. Ann. Probab. 2, 321-323. Gut, A. (1975a): Weak convergence and first passage times. J. Appl. Probab. 12, 324-332. Gut, A. (1975b): On a.s. and r-mean convergence of random processes with an application to first passage times. Z. Wahrsch. verw. Gebiete 31,333-341. Gut, A. (1983a): Renewal theory and ladder variables. Probability and Mathematical Bibliography 187

Statistics. Essays in honour of Cad-Gustav Esseen (Eds. A. Gut and L. Holst), 25-39. Uppsala. Gut, A. (1983b): Complete convergence and convergence rates for randomly indexed partial sums with an application to some first passage times. Acta Math. Acad. Sci. Hungar. 42, 225-232; Correction, ibid. 45 (1985), 235-236. Gut, A. (1985): On the law of the iterated logarithm for randomly indexed partial sums with two applications. Studia Sci. Math. Hungar. 20, 63-69. Gut, A. and Ahlberg, P. (1981): On the theory of chromatography based upon renewal theory and a central limit theorem for randomly indexed partial sums of random variables. Chemica Scripta 18, 248-255. Gut, A. and Holst, L. (1984): On the waiting time in a generalized roulette game. Statist. Probab. Lett. 2, 229-239. Gut, A. and Janson, S. (1983): The limiting behaviour of certain stopped sums and some applications. Scand. J. Statist. 10,281-292. Gut, A. and Janson, S. (1986): Converse results for existence of moments and uniform integrability for stopped random walks. Ann. Probab. 14, 1296-1317. Hall, P. and Heyde, c.c. (1980): Martingale Limit Theory and Its Application. Academic Press, New York. Hartman, P. and Wintner, A. (1941): On the law of the iterated logarithm. Amer. J. Math. 63,169-176. Hatori, H. (1959): Some theorems in an extended renewal theory I. Kodai Math. Sem. Rep. 11, 139-146. Heyde, c.c. (1964): Two probability theorems and their applications to some first passage problems. J. Austral. Math. Soc. 4, 214-222. Heyde, c.c. (1966): Some renewal theorems with application to a first passage problem. Ann. Math. Statist. 37, 699-710. Heyde, c.c. (1967a): Asymptotic renewal results for a natural generalization of classical renewal theory. J. Roy. Statist. Soc. Ser. B 29,141-150. Heyde, c.c. (1967b): A limit theorem for random walks with drift. J. Appl. Probab. 4, 144-150. Hogfeldt, P. (1977): On the asymptotic behaviour of first passage time processes and certain stopped stochastic processes. Abstract Second Vilnius Conference on Probability Theory and , Vol. 3,86-87. Horvath, L. (1984a): Strong approximation of renewal processes. Stochastic Process. Appl. 18, 127-138. Horvath, L. (1984b): Strong approximation of extended renewal processes. Ann. Probab. 12, 1149-1166. Horvath, L. (1984c): Strong approximation of certain stopped sums. Statist. Probab. Lett. 2, 181-185. Horvath, L. (1985): A strong nonlinear renewal theorem with applications to sequential analysis. Scand. J. Statist. 12,271-280. Horvath, L. (1986): Strong approximations of renewal processes and their applications. Acta Math. Acad. Sci. Hungar. 47, 13-28. Hsu, P.L. and Robbins, H. (1947): Complete convergence and the law oflarge numbers. Proc. Nat. Acad. Sci. U.S.A. 33, 25-31. Huggins, R.M. (1985): Laws of the iterated logarithm for time changed Brownian motion with an application to branching processes. Ann. Probab. 13, 1148-1156. Hunter, J.J. (1974a): Renewal theory in two dimensions: Basic results. Adv. in Appl. Probab. 6, 376-391. 188 Bibliography

Hunter, J.J. (1974b): Renewal theory in two dimensions: Asymptotic results. Adv. in Appl. Probab. 6, 546-562. Hunter, J.J. (1977): Renewal theory in two dimensions: Bounds on the renewal function. Adv. in Appl. Probab. 9,527-541. Iglehart, D.L. and Whitt, W. (1971): The equivalence of functional central limit theorems for counting processes and associated partial sums. Ann. Math. Statist. 42, 1372-1378. Jagers, P. (1975): Branching Processes with Biological Applications. John Wiley, New York. Janson, S. (1983): Renewal theory for m-dependent variables. Ann. Probab.l1, 558-568. Janson, S. (1986): Moments for first passage and last exit times, the minimum and related quantities for random walks with positive drift. Adv. in Appl. Probab. 18, 865-879. Kaijser, T. (1971): A stochastic model describing the water motion in a river. Nordic Hydrology II, 243-265. Karamata, J. (1930): Sur une mode de croissance reguliere des fonctions. Mathematica (Cluj) 4, 38-53. Katz, Melvin L. (1963): The probability in the tail of a distribution. Ann. Math. Statist. 34,312-318. Kemperman, J.H.B. (1961): The Passage Problem for a Stationary . University of Chicago Press, Chicago, IL. Kesten, H. (1974): Renewal theory for functionals of a Markov chain with general state space. Ann. Probab. 2, 355-386. Kiefer, J. and Wolfowitz, J. (1956): On the characteristics of the general queueing process, with applications to random walk. Ann. Math. Statist. 27,147-161. Kingman, J.F.e. (1968): The of subadditive stochastic processes. J. Roy. Statist. Soc. Ser. B 30, 499-510. Kolmogorov, A.N. (1936): Anfangsgriinde der Theorie der Markoffschen Ketten mit unendlich vielen moglichen Zustiinden. Mat. Sb. (N.S.) 1, 607-610. Lai, T.L. (1975): On uniform integrability in renewal theory. Bulllnst. Math. Acad. Sinica 3, 99-105. Lai, T.L. (1976): Asymptotic moments of random walks with applications to ladder variables and renewal theory. Ann. Probab. 4, 51-66. Lai, T.L. (1977): First exit times from moving boundaries for sums of independent random variables. Ann. Probab. 5, 210-221. Lai, T.L. and Siegmund, D. (1977): A nonlinear renewal theory with applications to sequential analysis. Ann. Statist. 5, 946-954. Lai, T.L. and Siegmund, D. (1979): A nonlinear renewal theory-with applications to sequential analysis II. Ann. Statist. 7, 60-76. Lalley, S.P. (1984a): Limit theorems for first-passage times in linear and nonlinear renewal theory. Adv. in Appl. Probab. 16, 766-803. Lalley, S.P. (1984b): Conditional Markov renewal theory I. Finite and denumerable state space. Ann. Probab.12, 1113-1148. Lalley, S.P. (1986): Renewal theorem for a class of stationary sequences. Probab. Th. Rei. Fields 72, 195-213. Lamperti, J. (1958): Some limit theorems for stochastic processes. J. Math. Mech. 7, 433-448. Lamperti, J. (1961): A contribution to renewal theory. Proc. Amer. Math. Soc. 12, 724-731. Bibliography 189

Lindberger, K. (1978): Functional limit theorems for cumulative processes and stopping times. Z. Wahrsch. verw. Gebiete 44, 47-56. Lindvall, T. (1973): Weak convergence of probability measures and random functions in the function space D[O, (0). J. Appl. Probab.l0, 109-121. Lindvall, T. (1977): A probabilistic proof of Blackwell's renewal theorem. Ann. Probab. 5,482-485. Lindvall, T. (1979): On coupling of discrete renewal processes. Z. Wahrsch. verw. Gebiete 48,57-70. Lindvall, T. (1982): On coupling of continuous-time renewal processes. J. Appl. Probab. 19,82-89. Lindvall, T. (1986): On coupling of renewal processes with use of failure rates. Stochastic Process. Appl. 22, 1-15. Loeve, M. (1977): Probability Theory, 4th ed. Springer, New York. Lorden, G. (1970): On excess over the boundary. Ann. Math. Statist. 41, 520-527. Maejima, M. (1975): On local limit theorems and Blackwell's renewal theorem for independent random variables. Ann. Inst. Statist. Math. 27, 507-520. Maejima, M. and Mori, T. (1984): Some renewal theorems for random walks in multidimensional time. Math. Proc. Cambridge Philos. Soc. 95,149-154. Marcinkiewicz, J. and Zygmund, A. (1937): Sur les fonctions independantes. Fund. Math. 29, 60-90. Marcinkiewicz, J. and Zygmund, A. (1938): Quelques theoremes sur les fonctions independantes. Studia Math. vn, 104-120. Meyer, P.A. (1966): Probability and Potential. Waltham, Blaisdell. Mohan, N.R. (1976): Teugels' renewal theorem and stable laws. Ann. Probab. 4, 863-868. Nagaev, A.V. (1979): Renewal theorems in Rd. Theory Probab. Appl. XXIV, 572-581. Neveu, J. (1975): Discrete-Parameter Martingales. North-Holland, Amsterdam. Ney, P. (1981): A refinement of the coupling method in renewal theory. Stochastic Process. Appl. 11,11-26. Ney, P. and Wainger, S. (1972): The renewal theorem for a random walk in two• dimensional time. Studia Math. XLIV, 71-85. Niculescu, S.P. (1979): Extension of a renewal theorem. Bull. Math. Soc. Sci. Math. R.S. Roumanie 23, 289-292. Ornstein, D.S. (1969a): Random walks I. Trans. Amer. Math. Soc. 138, 1-43. Ornstein, D.s. (1969b): Random walks II. Trans. Amer. Math. Soc. 138,45-60. Plucinska, A. (1962): On the joint limiting distribution of times spent in particular states by a Markov process. Colloq. Math. IX, 347-360. Port, S.c. and Stone, c.J. (1967): Hitting time and hitting places for nonlattice recurrent random walks. J. Math. Mech. 17,35-57. Prabhu, N.U. (1965): Stochastic Processes. Macmillan, New York. Prabhu, N.U. (1980): Stochastic Storage Processes. Queues, Insurance Risk, and Dams. Springer, New York. Pyke, R. (1958): On renewal processes related to type I and type II counter models. Ann. Math. Statist. 29, 737-754. Pyke, R. and Root, D. (1968): On convergence in r-mean for normalized partial sums. Ann. Math. Statist. 39, 379-381. Renyi, A. (1957): On the asymptotic distribution of the sum of a random number of independent random variables. Acta Math. Acad. Sci. Hungar. 8,193-199. Revuz, D. (1975): Markov Chains. North-Holland, Amsterdam. 190 Bibliography

Richter, W. (1965): Limit theorems for sequences of random variables with sequences of random indices. Theory Probab. Appl. X, 74-84. Rosen, B. (1961): On the asymptotic distribution of sums of independent identically distributed random variables. Ark. Mat. 4, 323-332. Seal, H.L. (1969): Stochastic Theory of a Risk Business. John Wiley, New York. Seneta, E. (1976): Regularly Varying Functions. Lecture Notes in Mathematics 508. Springer, Berlin. Serfozo, R.I'. (1974): Large deviations of renewal processes. Stochastic Process. Appl. 2,295-301. Serfozo, R.F. (1975): Functional limit theorems for stochastic processes based on embedded processes. Adv. in Appl. Probab. 7, 123-139. Siegmund, D.O. (1967): Some one-sided stopping rules. Ann. Math. Statist. 38, 1641-1646. Siegmund, D.O. (1968): On the asymptotic normality of one-sided stopping rules. Ann. Math. Statist. 39, 1493-1497. Siegmund, D.O. (1969): The variance of one-sided stopping rules. Ann. Math. Statist. 40, 1074-1077. Siegmund, D. (1975): The time until ruin in collective risk theory. Mitt. Verein Schweiz. Versicherungsmath. 75, 157-165. Siegmund, D. (1985): Sequential Analysis. Tests and Confidence Intervals. Springer, New York. Skorohod, A.V. (1956): Limit theorems for stochastic processes. Theory Probab. Appl. 1,261-290. Smith, W.L. (1954): Asymptotic renewal theorems. Proc. Roy. Soc. Edinburgh Sect. A 64,9-48. Smith, W.L. (1955): Regenerative stochastic processes. Proc. Roy. Soc. London Ser. A 232,6-31. Smith, W.L. (1958): Renewal theory and its ramifications. J. Roy. Statist. Soc. Ser. B 20,243-302. Smith, W.L. (1964): On the elementary renewal theorem for nonidentically distributed variables. Pacific J. Math. 14, 673-699. Smith, W.L. (1967): A theorem on functions of characteristic functions and its applications to some renewal theoretic random walk problems. Proc. Fifth Berkeley Symp. Math. Statist. and Probability, Vol. II/II, 265-309. University of California Press, Berkeley, CA. Sparre Andersen, E. (1953a): On sums of symmetrically dependent random variables. Skand. Aktuarietidskr. 36, 123-138. Sparre Andersen, E. (1953b): On the fluctuations of sums of random variables 1. Math. Scand. 1,263-285. Sparre Andl~rsen, E. (1954): On the fluctuations of sums of random variables II. Math. Scand. 2, 195-223. Sparre Andersen, E. (1957): On the collective theory of risk in the case of contagion between claims. Trans. XV Internat. Congr. Actuaries, New York 2, 219-227. Spitzer, F. (1956): A combinatorial lemma and its application to probability theory. Trans. Amer. Math. Soc. 82, 323-339. Spitzer, F. (1960): A Tauberian theorem and its probability interpretation. Trans. Amer. Math. Soc. 94, 150-169. Spitzer, F. (1965): Renewal theory for Markov chains. Proc. Fifth Berkeley Symp. Bibliography 191

Math. Statist. and Probability, Vol. IIj1I, 311-320. University of California Press, Berkeley, CA. Spitzer, F. (1976): Principles of Random Walks, 2nd ed. Springer, New York. Sreehari, M. (1970): On a class of limit distributions for normalized sums of indepen• dent random variables. Theory Probab. Appl. XV, 258-281. Starn, A.J. (1968): Two theorems in r-dimensional renewal theory. Z. Wahrsch. verw. Gebiete 10, 81-86. Starn, A.J. (1969): Renewal theory in r dimensions I. Compositio Math. 21, 383-399. Starn, A.J. (1971): Renewal theory in r dimensions II. Compositio Math. 23,1-13. Statist. Neerlandica 36 (1982),100. Statist. Neerlandica 37 (1983), 46-48. Steinebach, J. (1986): Improved Erdos-Renyi and strong approximation laws for increments of renewal processes. Ann. Probab. 14, 547-559. Stone, C. (1965): On characteristic functions and renewal theory. Trans. Amer. Math. Soc. 120,327-342. Stone, C. (1969): On the potential theory operator for one-dimensional recurrent random walks. Trans. Amer. Math. Soc. 136, 413-426. Stone, C. and Wainger, S. (1967): One-sided error estimates in renewal theory. J. Analyse Math. 20, 325-352. Stout, W.F. (1974): Almost Sure Convergence. Academic Press, New York. Strassen, V. (1964): An invariance principle for the law of the iterated logarithm. Z. Wahrsch. verw. Gebiete 3, 211-226. Strassen, V. (1966): A converse to the law of the iterated logarithm. Z. Wahrsch. verw. Gebiete 4, 265-268. Szynal, D. (1972): On almost complete convergence for the sum of a random number of independent random variables. Bull. Acad. Polon. Sci. Ser. Sci. Math. Astronom. Phys. 20, 571-574. Tacklind, S. (1944): Elementare Behandlung vom Erneuerungsproblem fUr den sta• tionaren Fall. Skand. Aktuarietidskr. 27, 1-15. Takacs, L. (1956): On a probability problem arising in the theory of counters. Proc. Cambridge Phi/os. Soc. 52, 488-498. Takacs, L. (1957): On certain sojourn time problems in the theory of stochastic processes. Acta Math. Acad. Sci. Hungar. 8,169-191. Teicher, H. (1973): A classical limit theorem without invariance or reflection. Ann. Probab. 1,702-704. Teugels, J.L. (1968): Renewal theorems when the first or the second moment is infinite. Ann. Math. Statist. 39,1210-1219. Thorisson, H. (1987): A complete coupling proof of Blackwell's renewal theorem. Stochastic Process. Appl. Toming, I. (1987): The law of the iterated logarithm-cluster points of deterministic and random subsequences. Probab. Math. Statist. VIII. Vervaat, W. (1972a): Success Epochs in Bernoulli Trials (With Applications in Number Theory). Mathematical Centre Tracts 42, Amsterdam. Vervaat, W. (1972b): Functional central limit theorems for processes with positive drift and their inverses. Z. Wahrsch. verw. Gebiete 23, 245-253. von Bahr, B. (1974): Ruin probabilities expressed in terms of ladder height distribu• tions. Scand. Actuar. J., 190-204. Wald, A. (1947): Limit distribution of the maximum and minimum of successive 192 Bibliography

cumulative sums of random variables. Bull. Amer. Math. Soc. 53,142-153. Whitt, W. (1980): Some useful functions for functional limit theorems. Math. Oper. Res. 5,67-8l. Williamson, I.A. (1965): Some renewal theorems for non-negative independent random variables. Trans. Amer. Math. Soc. 114, 417-445. Williamson, I.A. (1968): Random walks and Riesz kernels. Pacific J. Math. 25, 393-415. Woodroofe, M. (1976): A renewal theorem for curved boundaries and moments of first passage times. Ann. Probab. 4, 67-80. Woodroofe, M. (1982): Nonlinear Renewal Theory in Sequential Analysis. CBMS-NSF Regional eon[ Ser. in Appl. Math. 39. SIAM, Philadelphia, PA. Yu, K.F. (1979): On the uniform integrability of the normalized randomly stopped sums of independent random variables. Preprint, Yale University. Index

A

Ahlberg 6, 119
Aleskeviciene 87
Anderson 61
Anscombe 3, 8, 15, 42
  condition 15, 44
Anscombe-Donsker invariance principle 147-149
Anscombe's theorem 3, 4, 15-16, 36, 44, 56, 85, 137, 147, 151
  Lr-analogue of 33
  multidimensional 150
Arjas 53
Asmussen 4, 48, 62, 109, 125
Athreya 53, 61

B

Barlow 122-124
Basu 152
Baum 42, 43
Belyayev 122
Berbee 107
Berry 61
Bickel 62
Billingsley 7, 18, 149, 150, 152, 153, 160, 166, 169, 171, 174, 176
Bingham 103, 156, 178
Binomial process 1
Blackwell 8, 23, 26, 27, 44, 52, 61, 76, 87, 88, 90
Breiman 7
Brown 7
Brownian motion (see Wiener, process)
Burkholder 20, 29, 167, 169, 170

C

(C, ~) 148, 157, 171-176, 177
Carlsson 62, 89, 100
Central limit theorem 1, 7, 8, 15, 17, 18, 29, 36, 38, 55, 61, 72, 85, 92-96, 102, 108, 111-114, 130, 137, 147, 168-169
  moment convergence in 18, 168-169
  multidimensional 117
Chang 33, 163
Choquet 53
Chow 7, 17, 23, 24, 33, 34, 67, 92-95, 103, 106, 107, 132, 137, 142-144, 146
Chromatography 6, 118-121
Chung 4, 15, 62, 65, 73, 90, 100, 165, 166, 168, 170
Çinlar 4, 48, 62
Cluster set (see Limit points, set of)
Combinatorial methods 4, 46
Composition 149, 151, 176
Continuity of function(al)s 164, 174-175
  composition 149, 151
  first passage time 156
  inversion 155
  largest jump 154
  projections 150, 156, 178
  supremum 155
Continuous mapping theorem 149, 150, 154, 163, 174-175
Convergence
  almost sure (a.s.) 4, 9, 10-14, 17, 41, 44, 54, 67, 70, 83-85, 97, 130, 135, 145, 166, 175

Convergence (cont.)
  complete 9, 42-43, 103
  in distribution 4, 9, 10, 15-17, 36, 55, 58, 72, 85-87, 98, 102, 111-112, 115-117, 130-131, 137-138, 166, 179
  of finite-dimensional distributions 150, 156, 171-172
  in Lr 4, 9, 17, 37, 39, 41, 44, 166 (see also Uniform integrability)
  moment 4, 9, 17-20, 36-39, 54, 56, 57, 59, 87, 89, 92-97, 98-102, 106, 108, 109-114, 130-132, 143-145, 165-167
  in probability 4, 9, 10-13, 39, 166, 176
  of probability measures 171-176 (see also Weak convergence)
  rate 42-44, 103-104, 131
  in r-mean 4 (see also Convergence, in Lr)
  weak (see Weak convergence)
Counters 1, 2, 124-125
Coupling 53
Cox 106, 109, 122
Cramer 125

D

(D, ~) 149, 154-157, 164, 172-176, 178
Do 148, 155, 164, 176
Daley 89
Davis 20, 169
De Acosta 177, 178
De Groot 23
de Haan 134, 180
Deny 53
Doeblin 53
Domain of attraction (see Stable law)
Doney 62, 67
Donsker 7, 147
Donsker's theorem 147, 149, 158, 164, 172, 177
  Anscombe version of 147, 162
Doob 21, 48, 53, 54, 87, 167, 168, 170
Dynkin 61

E

Englund 61
Erdos 6, 7, 42, 52, 73, 85
Erickson 53, 61
Esseen 61
Essen 89

F

Farrell 62
Feller 4, 6, 24, 36, 48, 52, 54, 60, 61, 65, 90, 134, 180
First passage time(s) 2, 5-9, 22, 24, 39, 50, 74-107, 108, 109-118, 132, 133, 147, 151-159, 163 (see also First passage times across general boundaries)
  auxiliary 138, 139, 144
  central limit theorem 56, 85, 93
  complete convergence 103
  convergence rate 103-104
  excess over the boundary 76, 97 (see also First passage times, overshoot)
  for the ladder height process 77, 81
  law of large numbers 55, 83-84, 93, 105
  law of the iterated logarithm 102-103, 163
  moment generating function 81
  moments
    convergence 55, 92-97, 106
    finiteness 50, 78-81
  overshoot 76, 97-102, 106
  process 3, 5, 6, 50, 55, 57, 75-107, 119
  subadditivity 55, 83
  uniform integrability 55, 57, 90-92, 94-95
  weak convergence 151-157
First passage times across general boundaries 7, 75, 109, 133-146, 159-161, 163
  central limit theorem 137, 144
  law of large numbers 135-137, 143
  law of the iterated logarithm 145-146, 163
  moment generating function 134
  moments
    convergence 143-145
    finiteness 133-134

  overshoot 145
  weak convergence 159-161
Fluctuation theory 4, 46
Fuchs 4, 65
Functional
  central limit theorem 147, 172 (see also Invariance principle, weak)
  limit theorem 7, 132, 147-164 (see also Invariance principle)

G

Gafurov 62
Garsia 61, 167
Generalized arc sine distribution 61
Gikhman 151, 173
Gnedenko 122
Goldie 103
Grubel 90
Gundy 7
Gut 6, 7, 22, 26, 28, 34, 35, 40, 43, 44, 75, 76, 78, 84-86, 93-98, 100, 102, 103, 107, 109, 111, 113, 115, 117, 119, 122, 126, 127, 133, 134, 136-140, 144-146, 150, 152, 153, 156, 158, 161, 182

H

Hall 163
Hartman 41, 177-178
Hatori 54
Heath 128
Heyde 73, 78, 80, 81, 83, 85, 86, 89, 95, 132, 163
Hogfeldt 158
Holst 126, 127
Horvath 163
Hsiung 17, 33, 34, 92, 94, 103, 132, 137, 142-144, 146, 163
Hsu 42, 43
Huggins 163
Hunter 62

I

Iglehart 156, 164
Insurance risk theory 1, 2, 125-126
Invariance principle 147-164 (see also Functional, limit theorem)
  almost sure (see Invariance principle, strong)
  Anscombe-Donsker 147-149
  strong 7, 42, 147, 161-164, 177-178
  weak 7, 147-161, 175
Inverse relationship 138, 146, 155
  between partial maxima and first passage times 75, 80, 85, 103, 132
  between renewal and counting processes 49, 56, 75

J

Jagers 4, 48, 53
Janson 6, 11, 12, 28, 34, 35, 69, 73, 78, 82, 97, 101, 107, 109, 111, 113, 115, 118, 122, 132, 139, 140, 158

K

Kac 6, 7, 73
Kaijser 121
Karamata 180
Katz 42, 43
Kemperman 62
Kesten 62
Kiefer 132
Kingman 83
Kolmogorov 14, 27, 52, 84

L

Ladder
  epoch 5, 65-66, 77-78, 98, 104-106, 129
    ascending 8, 65
      strong 5, 65, 68-70, 77
      weak 67, 70
    descending
      strong 67
      weak 66
  height 66-67, 98, 101, 129
    first passage times for (see First passage times)
    strong ascending 66, 68-69, 71
  variable 2, 4, 26, 46, 65-67, 71

Ladder (cont.)
  method 76, 77-78, 81, 83, 87-89, 91, 93, 94-96, 97, 101, 152
Lai 7, 17, 28, 33, 34, 59, 67, 85, 91, 92, 94, 100, 133, 142-144
Lalley 62, 87, 102
Lamperti 61
Law of large numbers 1, 8, 13-14, 17, 29, 33, 37, 44, 54, 70, 83-84, 92-94, 108, 109-111, 130, 135-137
  convergence rate 42-43
  converse 27
  Erdos-Renyi 85
  martingale proof of 44
  moment convergence 17-18, 44
Law of the iterated logarithm 1, 8, 9, 41-42, 43, 102-103, 108, 117, 131, 147, 161-164, 177-178
  Anscombe version of 42, 162
  converse 41
Limit points 41, 177
  set of 41-42, 162-164, 177-178
Lindberger 161, 164
Lindvall 53, 175
Local limit theorem 87, 106
Loeve 11, 14, 166
Lorden 100, 101
Lr-convergence theorem 166

M

Maejima 106, 107
Marcinkiewicz 14, 19, 20, 29, 44, 61, 84, 110, 136, 167, 169
Martingale 44, 53, 167-171, 178
  convergence 168
  moment inequalities (see Moments)
  optional sampling theorem 21-25, 167, 170-171
  reversed 44, 168, 178
McDonald 53
Meyer 53
Miller 122
Mohan 53, 57, 61
Moments
  boundedness 39, 40
  convergence (see Convergence)
  finiteness 24, 25-28, 49, 51, 78, 79, 97, 132, 133-134, 145
  inequalities
    for martingales 167, 169-170
    for stopped random walks 20-28
    for sums of independent random variables 167, 168-169
    Marcinkiewicz-Zygmund 19, 20, 167, 169
Mori 107

N

Nagaev 62
Negative binomial process 47, 49, 57, 59, 106
Nerman 100
Neveu 24, 168
Ney 53, 107
Niculescu 53
Number of
  renewals 48-49
  visits 6, 64-65
    expected 64-65, 90
Nummelin 53

O

Optional sampling theorem (see Martingale)
Orey 90
Ornstein 4, 6, 65

P

Partial maxima 5, 7, 46, 67-73, 75, 108, 128-132, 155, 159, 164
Partial minima 5, 67-70, 128, 132
Plucinska 122
Point
  persistent 4, 65
  possible 4
  transient 65
Poisson process 47, 48, 53, 57, 59, 125, 126
Pollard 52, 90
Port 6
Prabhu 4, 48, 65, 68, 69, 73, 75, 90, 109, 125, 126
Projections 150, 171

  continuity of (see Continuity of function(al)s)
Proschan 122-124
Pyke 20, 124

Q

Queueing theory 1, 2, 126

R

Random change of time 148, 151, 160, 176
Random index 1-9, 17, 46, 68
Random walk 1-7, 8, 46, 62-73, 99-100 (see also Stopped random walk)
  arithmetic 47, 65, 97, 105, 107
    d-arithmetic 47, 88, 90, 95-99
      with span d 47, 63
  Bernoulli 1, 47
    symmetric 1
  classification 63-68
  coin-tossing 1
  drifting 4, 6
    to -∞ 64-67, 69, 70
    to +∞ 5, 7, 64-67, 70, 104 (see also Random walk, with positive drift)
  maximum of (see Partial maxima)
  minimum of (see Partial minima)
  with multidimensional indices 107
  nonarithmetic 47, 63, 65, 88, 90, 95-100, 102
  oscillating 4, 6, 64, 66-68
  persistent 4, 64-65
  with positive drift 3, 5, 73, 74-107, 108, 128-132, 133-146, 151-157, 159-161, 163, 164
  randomly indexed 1-9 (see also Stopped random walk)
  simple 1, 63-65, 104-106, 107
    symmetric 1, 24, 35, 66
  stopped (see Stopped random walk)
  transient 4, 6, 64-65
  two-dimensional 3, 6, 108, 109 (see also Random walk, stopped)
Recurrent event 4
Regularly varying function 53, 57, 61, 87, 116, 134, 155, 180-182
Relative compactness 162-164, 178
Reliability theory 1, 2, 8, 123-124
Renewal counting process 2-6, 8, 9, 48-57, 68, 82, 118
  Berry-Esseen theorem 61
  central limit theorem 55
  large deviation 62
  law of large numbers 54
  law of the iterated logarithm 61
  moment generating function 49
  moments
    convergence 54-57
    finiteness 49
    for random walks with positive drift 82
  uniform integrability 54-57
Renewal function 5, 6, 49-53, 61, 90
  extended 89
Renewal measure 63, 90
  harmonic 90
Renewal process 1-6, 46-62, 74, 76, 82, 105, 108, 113, 118, 125, 152, 164
  age 60, 61
  alternating 6, 122
  arithmetic 47, 48, 54, 66
    d-arithmetic 47, 50, 52, 53, 56-58
      with span d 47
  coupling proofs 53
  delayed 62
  integral equation 50, 52
  lifetime 58, 60
    residual 58-60, 61, 97
  nonarithmetic 47, 48, 52-54, 56-60
  pure 62
  terminating 62, 66
Renewal theorem 6, 8, 51-53, 61, 87-90
  Blackwell's 52, 61, 90
    random walk analogue 88
  elementary 5, 51, 61, 89, 90
    random walk analogue 88
  key 53, 61
  remainder term estimate 89
Renewal theory 2-6, 8, 47-62, 74
  in higher dimensions 61
  Markov 62
  for Markov chains 62
  multidimensional 62
  multivariate 62
  nonlinear 7, 133, 146, 163

Renewal theory (cont.)
  for oscillating random walks 6
  for random walks
    (on the real line) 5, 61, 62
    with positive drift 5, 74-107
Renyi 3, 8, 15, 75, 85, 110, 122
Revuz 6
Richter 4, 11, 12
Robbins 23, 24, 42, 43, 93, 95, 106, 107
Root 20
Rosen 67
Roulette 126-127

S

Seal 125
Seneta 134, 180
Sequential analysis 1, 2, 8, 9, 22, 24
Serfozo 62, 150, 162, 164
Siegmund 7, 24, 95, 102, 106, 109, 113, 13~ 13~ 13~ 13~ 140, 146
Skorohod 151, 172, 173, 175, 178
Slowly varying function 61, 116, 134, 180-182
  representation formula 181
Smith 8, 48, 54, 62, 81, 82, 106, 109, 111, 113, 115
Snell 53
Solovyev 122
Sparre Andersen 4, 125
Spitzer 4, 6, 62, 65, 67, 88, 90
Sreehari 181
Stable law 16, 36, 150-151
  domain of attraction of 6, 16, 57, 60, 61, 8~ 11~ 131, 13~ 151, 15~ 15~ 159, 161, 173
Stable process 151, 153, 173 (see also Weak convergence)
  without negative jumps 157, 159
  without positive jumps 153, 157, 161
Stam 62
Steinebach 85
Stone 6, 89
Stopped random walk 4-6, 8-45, 51, 75, 76, 78, 83, 91-93, 148, 162, 167, 170 (see also Stopped two-dimensional random walk)
  central limit theorem 15-16, 38, 111-112
  complete convergence 42-43, 104
  convergence rate 42-44, 104
  law of large numbers 13-14, 37, 93, 109-111, 135
  law of the iterated logarithm 42, 102, 117, 162, 163
  moments 20-28, 51, 78, 133
    convergence 36-39, 93, 109-114, 144, 167
    uniform integrability 28-36, 91-95, 111, 138
  weak convergence 148-151, 157
Stopped sum (see Stopped random walk)
Stopped two-dimensional random walk 108, 109-118, 157-159, 163
  applications 118-132
Stopping summand 39-41, 79, 83, 84, 91, 93, 135, 138, 144
Stopping time 4, 5, 7, 9, 17, 20, 28, 37, 39, 46, 50, 60, 68, 74, 76-78, 133, 170, 178
Storage and inventory theory 1, 2
Stout 7, 41, 177
Strassen 7, 41, 147, 161, 162, 177-178
Strong approximation 163, 177-178
Sums of random variables
  dependent 107, 146
  independent 106, 146, 167-169, 178
  m-dependent 78, 101
  non-i.i.d. 106-107, 146
  stationary 101, 107
  vector valued 117
Szynal 43

T

Täcklind 57, 97
Takacs 8, 54, 122
Teicher 6, 7, 23, 138
Teugels 57, 61
Thorisson 53
Tightness 151, 172, 179
Topology
  J1- 154, 156-157, 172-173, 179
  M1- 155, 172-173
    graph 173
    parametric representation 156, 173
  U- 157, 171, 173, 179
Torrång 42

Tweedie 53

U

Uniform continuity in probability 15 (see also Anscombe, condition)
Uniform integrability 4, 9, 17, 18, 28-36, 40-41, 56-57, 90-92, 98, 109-112, 130, 138-143, 145, 165-167, 178 (see also Convergence, in Lr; Convergence, moment)

V

Vervaat 152, 156, 163, 164, 175, 178
von Bahr 109, 125

W

Wainger 62, 89, 107
Wald 73
Weak convergence 147-161, 171-176
  in the J1-topology 148, 150-160, 173-176
  in the M1-topology 153-156, 159
  in the U-topology 172, 174, 176
Whitt 151, 155-157, 164, 175
Wiener
  measure 148, 150, 172
  process 147, 158
Wiener-Hopf factorization 4, 46
Williamson 53, 61, 106
Wintner 41, 177-178
Wolfowitz 90, 132
Woodroofe 7, 97, 98, 100, 102, 133

Y

Yahav 62
Yu 33

Z

Zygmund 19, 20, 167, 169