arXiv:1509.06583v2 [math.PR] 18 Nov 2016

Central limit theorem for supercritical binary homogeneous Crump-Mode-Jagers processes

Benoit Henry 1,2

1 IECL – UMR 7502, Nancy-Université, Campus scientifique, B.P. 70239, 54506 Vandœuvre-lès-Nancy Cedex, France
2 TOSCA project-team, INRIA Nancy – Grand Est, IECL – UMR 7502, Campus scientifique, B.P. 70239, 54506 Vandœuvre-lès-Nancy Cedex, France
E-mail: [email protected]

Abstract

We consider a supercritical general branching population where the lifetimes of individuals are i.i.d. with arbitrary distribution and each individual gives birth to new individuals at Poisson times independently from the others. The population counting process of such a population is known as a binary homogeneous Crump-Mode-Jagers process. It is known that such processes converge almost surely when correctly renormalized. In this paper, we study the error of this convergence. To this end, we use classical renewal theory and recent works [17, 6, 5] on this model to obtain the moments of the error. Then, we can precisely study the asymptotic behaviour of these moments thanks to Lévy processes theory. These results, in conjunction with a new decomposition of the splitting tree, allow us to obtain a central limit theorem.

MSC 2000 subject classifications: Primary 60J80; secondary 92D10, 60J85, 60G51, 60K15, 60F05.

Key words and phrases: branching process – splitting tree – Crump–Mode–Jagers process – linear birth–death process – Lévy processes – scale function – Central Limit Theorem.

1 Introduction

In this work, we consider a general branching population where individuals live and reproduce independently from each other. Their lifetimes follow an arbitrary distribution $P_V$ and the births occur at Poisson times with constant rate $b$. The genealogical tree induced by this population is called a splitting tree. The population counting process $N_t$ (giving the number of living individuals at time $t$) is a binary homogeneous Crump-Mode-Jagers (CMJ) process. Such processes have been deeply studied [11, 10, 17] and are known to have many applications. For instance, in biology, they have recently been used to model spreading diseases (see [20, 4]). Another example of application appears in queuing theory (see [18] and [12]).

In [19], Nerman shows very general conditions for the almost sure convergence of general CMJ processes. In the supercritical case, it is known that the quantity $e^{-\alpha t} N_t$, where $\alpha$ is the Malthusian
Many papers have studied the second order behaviour of converging branching processes. Early works investigated the Galton-Watson case. In [13] and [14], Heyde obtained rates of convergence and central limit theorems for supercritical Galton-Watson processes when the limit has finite variance. Later, in [1], Asmussen obtained polynomial convergence rates in the general case. In our model, the particular case when the individuals never die (i.e. $P_V = \delta_\infty$, implying that the population counting process is a Markovian Yule process) has already been studied. More precisely, Athreya showed in [3], for a Markovian branching process $Z$ with appropriate conditions, and such that $e^{-\alpha t} Z_t$ converges to some $W$ a.s., that the error

\[
\frac{Z_t - e^{\alpha t} W}{\sqrt{Z_t}}
\]
converges in distribution to some Gaussian random variable. In the case of general CMJ processes, there was no similar result, although the very recent work of Iksanov and Meiners [15] gives sufficient conditions for the error terms in the convergence of supercritical general branching processes to be $o(t^\delta)$ in a very general setting (arbitrary birth point process). Although our model is more specific, we give slightly more precise results. Indeed, we give the exact rate of convergence, $e^{\frac{\alpha}{2}t}$, and characterize the limit. Moreover, we believe that our method could apply to other general branching processes counted by random characteristics, as soon as the birth point process is Poissonian.

The first step of the method is to obtain information on the moments of the error in the a.s. convergence of the process. Using the renewal structure of the tree and formulae on the expectation of a random integral, we are able to express the moments of the error in terms of the scale function of a Lévy process. This process is known to be the contour process of the splitting tree as constructed in [17]. The asymptotic behaviours of the moments are then precisely studied thanks to the analysis of the ladder height process associated to a similar Lévy process and to the Wiener-Hopf factorization. The second ingredient is a decomposition of the splitting tree into subtrees whose laws are characterized by the overshoots of the contour process over a fixed level. Finally, the error term can be decomposed as the sum of the errors made in each subtree. Our controls on the moments ensure that the error in each subtree decreases fast enough compared to the growth of the population (see Section 4 for details).

The first section is devoted to the introduction of the main tools used in this work. The first part recalls basic facts on splitting trees, which are essentially borrowed from [17, 6, 7, 8, 5].
The second part recalls some classical facts on renewal equations, and the last part gives a useful lemma on the expectation of a random integral. Section 3 is devoted to the statement of Theorem 3.2, which is a CLT for the population counting process $N_t$. Section 4 details the main lines of the method. Theorem 3.2 is finally proved in Section 6.

2 Splitting trees and preliminary results

This section is devoted to the statement of results which are constantly used in the sequel. The first subsection presents the model and states results on splitting trees coming from [17, 6, 7, 21, 5]. The second subsection recalls some well-known results on renewal equations. Finally, the last subsection is devoted to the statement and proof of a lemma on the expectation of random integrals.

2.1 Splitting trees

In this paper, we study a model of population dynamics called a splitting tree. We consider a branching tree (see Figure 1), where individuals live and reproduce independently from each other. Their lifetimes are i.i.d., following an arbitrary distribution $P_V$. Given its lifetime, each individual gives birth to new individuals at Poisson times with positive constant rate $b$ until its death, independently of the other individuals. We also suppose that the population starts with a single individual called the root. The finite measure $\Lambda := b P_V$ is called the lifespan measure, and plays an important role in the study of the model.
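The dynamics just described are straightforward to simulate, which helps build intuition for the results below. The following sketch is not part of the original work: the function name is ours, and it explores the tree individual by individual, generating only the births that can matter for $N_t$.

```python
import random

def simulate_nt(t, b, sample_lifetime):
    """Return N_t, the number of individuals alive at time t, in a
    splitting tree started from a single root born at time 0.

    Each individual draws an i.i.d. lifetime from sample_lifetime() and,
    while alive, gives birth at the points of a Poisson process of rate b;
    births occurring after time t cannot contribute to N_t and are skipped.
    """
    alive = 0
    birth_times = [0.0]            # individuals whose subtree is unexplored
    while birth_times:
        s = birth_times.pop()
        v = sample_lifetime()      # lifespan of this individual
        if s + v > t:              # born at s <= t and still alive at t
            alive += 1
        u = s
        while True:                # Poisson birth times in (s, min(s+v, t))
            u += random.expovariate(b)
            if u >= min(s + v, t):
                break
            birth_times.append(u)
    return alive
```

With lifetimes distributed as $\mathrm{Exp}(d)$ this is the linear birth–death process; for instance $b=2$, $d=1$ gives a supercritical tree with Malthusian parameter $\alpha = b - d = 1$, for which $E[N_t] = e^{\alpha t}$.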

Figure 1: Graphical representation of a Splitting tree. The vertical axis represents the biological time and the horizontal axis has no biological meaning. The vertical lines represent the individuals, their lengths correspond to their lifetimes. The dashed lines denote the filiations between individuals.

In [17], Lambert introduces a contour process $Y$ which codes for the splitting tree. Suppose we are given a tree $\mathbb{T}$, seen as a subset of $\bigcup_{k\geq 0}\mathbb{R}\times\mathbb{N}^k$ with some compatibility conditions (see [17]). On this object, Lambert constructs a Lebesgue measure $\lambda$ and a total order relation $\preceq$ which can be roughly summarized as follows: let $x, y$ be in $\mathbb{T}$; the point of birth of the lineage of $x$ during the lifetime of the root splits the tree into two connected components, and $y \preceq x$ if $y$ belongs to the same component as $x$ but is not an ancestor of $x$ (see Figure 2).

If we assume that $\lambda(\mathbb{T})$ is finite, then the application
\[
\varphi : \mathbb{T} \to [0, \lambda(\mathbb{T})], \quad x \mapsto \lambda\left(\{\, y \mid y \preceq x \,\}\right),
\]
is a bijection. Moreover, in a graphical sense (see Figure 2), $\varphi(x)$ measures the length of the part of the tree which is above the lineage of $x$. The contour process is then defined, for all $s$, by


Figure 2: In gray, the set $\{\, y \in \mathbb{T} \mid y \preceq x \,\}$.

\[
Y_s := \Pi_{\mathbb{R}}\left(\varphi^{-1}(s)\right),
\]

where $\Pi_{\mathbb{R}}$ is the projection from $\bigcup_{k\geq 0}\mathbb{R}\times\mathbb{N}^k$ to $\mathbb{R}$.

In a more graphical way, the contour process can be seen as the graph of an exploration process of the tree: it begins at the top of the root and decreases with slope $-1$ while running back along the life of the root until it meets a birth. The contour process then jumps to the top of the life interval of the child born at this time and continues its exploration as before. If the exploration process does not encounter a birth when exploring the life interval of an individual, it goes back to its parent and continues the exploration from the birth-date of the individual it has just left (see Figure 3). It is then readily seen that the intersections of the contour process with the line of ordinate $t$ are in one-to-one correspondence with the individuals in the tree alive at time $t$.

In the case where $\lambda(\mathbb{T})$ is infinite, one has to consider the truncations of the tree above fixed levels in order to have well-defined contours (see [17] for more details). In [17], Lambert shows that the contour process $\left(Y^{(t)}_s,\, s\in\mathbb{R}_+\right)$ of a splitting tree which has been pruned of every part above $t$ (called the tree truncated above $t$) has the law of a spectrally positive Lévy process started at the lifespan $V$ of the root, reflected below $t$ and killed at $0$, with Laplace exponent $\psi$ given by

\[
\psi(x) = x - \int_{(0,\infty]} \left(1 - e^{-rx}\right)\Lambda(dr), \quad x\in\mathbb{R}_+. \tag{2.1}
\]

Figure 3: One-to-one correspondence between the tree and the graph of the contour, represented by corresponding colours.

In particular, the Laplace transform of $P_V$ can be expressed in terms of $\psi$:

\[
\int_{\mathbb{R}_+} e^{-\lambda v}\, P_V(dv) = 1 + \frac{\psi(\lambda) - \lambda}{b}. \tag{2.2}
\]
The largest root of $\psi$, denoted $\alpha$, characterizes the way the population expands. In this paper, we only investigate the behavior of the population in the supercritical case, when $\alpha > 0$. In particular, using the convexity of $\psi$ (see [16]), this is equivalent to $\psi'(0+) < 0$. Now, since

\[
\psi'(x) = 1 - \int_{\mathbb{R}_+} v e^{-xv}\, b\, P_V(dv), \quad \forall x\in\mathbb{R}_+, \tag{2.3}
\]
one can see that the condition $\psi'(0+) < 0$ is also equivalent to $bE[V] > 1$, which is a more usual supercriticality condition. In the supercritical case, the population grows exponentially fast on the survival event with rate $\alpha$. According to (2.2), one can also see that

\[
\int_{\mathbb{R}_+} e^{-\alpha v}\, P_V(dv) = 1 - \frac{\alpha}{b}. \tag{2.4}
\]
As said earlier, an important feature of the contour process is that the number of individuals alive at time $t$ equals
\[
\operatorname{Card}\left\{\, s\in\mathbb{R}_+ \mid Y^{(t)}_s = t \,\right\}.
\]
This is the number of times the contour process hits $t$. This allows getting, thanks to the theory of Lévy processes, the law of the one-dimensional marginals of the process $(N_t,\, t\in\mathbb{R}_+)$. Indeed, let $\tau_t$ (resp. $\tau_0$) be the hitting time of $t$ (resp. of $0$) by the contour process $Y^{(t)}$. Now, for any positive integer $k$, the strong Markov property entails that

\[
P(N_t = k \mid N_t > 0) = E\left[ P_{t\wedge V}\left( \sharp\left\{\, s\geq 0 \mid Y^{(t)}_s = t \,\right\} = k \,\middle|\, \tau_t < \tau_0 \right)\right] = P_t\left( \sharp\left\{\, s > 0 \mid Y^{(t)}_s = t \,\right\} = k-1 \right).
\]
Once again, the strong Markov property gives

\[
P_t\left( \sharp\left\{\, s>0 \mid Y^{(t)}_s = t \,\right\} = k-1 \right) = P_t(\tau_t < \tau_0)\, P_t\left( \sharp\left\{\, s>0 \mid Y^{(t)}_s = t \,\right\} = k-2 \right) = P_t(\tau_t < \tau_0)^{k-1}\, P_t\left( \sharp\left\{\, s>0 \mid Y^{(t)}_s = t \,\right\} = 0 \right).
\]
Now, using fluctuation identities for spectrally positive Lévy processes (see Theorem 8.1 in [16] for the spectrally negative case), we have that
\[
P_t(\tau_t < \tau_0) = 1 - \frac{1}{W(t)},
\]
where $W$ is the scale function of the Lévy process whose Laplace exponent is given by (2.1). The function $W$ is the unique increasing function whose Laplace transform is given by

\[
\mathcal{TL}W(t) = \int_{(0,\infty)} e^{-rt}\, W(r)\, dr = \frac{1}{\psi(t)}, \quad t > \alpha, \tag{2.5}
\]
where $\alpha$ is the largest root of $\psi$. From the discussion above, we see that $N_t$ is a geometric random variable conditionally on $\{N_t > 0\}$. More precisely, for a positive integer $k$,
\[
P(N_t = k \mid N_t > 0) = \frac{1}{W(t)}\left(1 - \frac{1}{W(t)}\right)^{k-1}. \tag{2.6}
\]
In particular,
\[
E[N_t \mid N_t > 0] = W(t). \tag{2.7}
\]
Moreover, it can be shown (see [21]) that

\[
E N_t = W(t) - W\star P_V(t), \tag{2.8}
\]
and
\[
P(N_t > 0) = 1 - \frac{W\star P_V(t)}{W(t)}, \tag{2.9}
\]
where
\[
W\star P_V(t) := \int_{[0,t]} W(t-s)\, P_V(ds).
\]
For the rest of this paper, unless otherwise stated, the notation $P_t$ refers to $P(\,\cdot \mid N_t > 0)$, whereas $P_\infty$ refers to the probability measure conditioned on the non-extinction event (which has positive probability in the supercritical case). Finally, we recall the asymptotic behaviour of the scale function $W(t)$, which is widely used in the sequel.

Lemma 2.1. ([6, Thm. 3.21]) There exists a positive constant $\gamma$ such that

\[
e^{-\alpha t}\psi'(\alpha) W(t) - 1 = \mathcal{O}\left(e^{-\gamma t}\right).
\]
From this lemma and (2.9), one can easily deduce that
\[
P(\mathrm{NonEx}) = \lim_{t\to\infty} P(N_t > 0) = \frac{\alpha}{b}, \tag{2.10}
\]
where $\mathrm{NonEx}$ refers to the non-extinction event. To end this section, let us recall the law of large numbers for $N_t$.

Theorem 2.2. There exists a random variable $\mathcal{E}$ such that
\[
e^{-\alpha t} N_t \xrightarrow[t\to\infty]{} \frac{\mathcal{E}}{\psi'(\alpha)}, \quad \text{a.s. and in } L^2.
\]

Moreover, under $P_\infty$, $\mathcal{E}$ is exponentially distributed with parameter one.

2.2 A bit of renewal theory

The purpose of this part is to recall some facts on renewal equations, borrowed from [9]. Let $h : \mathbb{R} \to \mathbb{R}$ be a function bounded on finite intervals with support in $\mathbb{R}_+$, and let $\Gamma$ be a probability measure on $\mathbb{R}_+$. The equation
\[
F(t) = \int_{\mathbb{R}_+} F(t-s)\, \Gamma(ds) + h(t),
\]
called a renewal equation, is known to admit a unique solution finite on bounded intervals. Here, our interest is focused on the asymptotic behavior of $F$. We say that the function $h$ is DRI (directly Riemann integrable) if, for any $\delta > 0$, the quantities

\[
\delta \sum_{i=0}^{n} \sup_{t\in[\delta i,\, \delta(i+1))} h(t) \quad \text{and} \quad \delta \sum_{i=0}^{n} \inf_{t\in[\delta i,\, \delta(i+1))} h(t)
\]
converge, as $n$ goes to infinity, to some real numbers $I^\delta_{\sup}$ and $I^\delta_{\inf}$ respectively, and

\[
\lim_{\delta\to 0} I^\delta_{\sup} = \lim_{\delta\to 0} I^\delta_{\inf} < \infty.
\]
In the sequel, we use the two following criteria for the DRI property:

Lemma 2.3. Let $h$ be a function as defined previously. If $h$ satisfies one of the next two conditions, then $h$ is DRI:

1. $h$ is non-negative, decreasing and classically Riemann integrable on $\mathbb{R}_+$;
2. $h$ is càdlàg and bounded by a DRI function.

We can now state the next result, which is constantly used in the sequel.

Theorem 2.4. Suppose that $\Gamma$ is non-lattice and $h$ is DRI. Then

\[
\lim_{t\to\infty} F(t) = \gamma \int_{\mathbb{R}_+} h(s)\, ds,
\]
with
\[
\gamma := \left( \int_{\mathbb{R}_+} s\, \Gamma(ds) \right)^{-1}
\]
if the above integral is finite, and $\gamma = 0$ otherwise.

Remark 2.5. In particular, if we suppose that $\Gamma$ is a measure with mass lower than $1$, and that there exists a constant $\alpha \geq 0$ such that
\[
\int_{\mathbb{R}_+} e^{\alpha t}\, \Gamma(dt) = 1,
\]
then one can perform the change of measure

\[
\widetilde{\Gamma}(dt) = e^{\alpha t}\, \Gamma(dt)
\]
in order to apply Theorem 2.4 to a new renewal equation and obtain the asymptotic behavior of $F$ (see [9] for details). This method is also used in the sequel.
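Theorem 2.4 can be made concrete numerically. The sketch below is our own illustration (function names are ours, not from the paper): a first-order discretization of the renewal equation, tested on the pair $\Gamma = \mathrm{Exp}(1)$, $h(t) = e^{-t}$, for which the exact solution is $F \equiv 1$ and the theorem indeed gives $\lim F(t) = 1 \cdot \int_0^\infty e^{-s}\,ds = 1$.

```python
import math

def solve_renewal(h, gamma_cdf, t_max, dt=0.01):
    """Numerically solve F(t) = (F * Gamma)(t) + h(t) on a regular grid.

    gamma_cdf is the cumulative distribution function of the measure Gamma;
    the convolution pairs the mass Gamma((k dt, (k+1) dt]) with the value of
    F at the left endpoint of the shifted interval (a first-order scheme).
    """
    n = int(t_max / dt) + 1
    F = [0.0] * n
    mass = [gamma_cdf((k + 1) * dt) - gamma_cdf(k * dt) for k in range(n)]
    for i in range(n):
        conv = sum(F[i - 1 - k] * mass[k] for k in range(i))
        F[i] = conv + h(i * dt)
    return F
```

The same solver can be combined with the tilting of Remark 2.5 for defective kernels: solve the equation driven by $\widetilde{\Gamma}(dt) = e^{\alpha t}\Gamma(dt)$ and multiply back by $e^{-\alpha t}$.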

2.3 A lemma on the expectation of a random integral with respect to a Poisson random measure

Lemma 2.6. Let $\xi$ be a Poisson random measure on $\mathbb{R}_+$ with intensity $\theta\lambda(da)$, where $\theta$ is a positive real number and $\lambda$ is the Lebesgue measure. Let also $\left(X^{(i)}_u,\, u\in\mathbb{R}_+\right)_{i\geq 1}$ be an i.i.d. sequence of non-negative càdlàg random processes independent of $\xi$. Let also $Y$ be a random variable independent of $\xi$ and of the family $\left(X^{(i)}_u,\, u\in\mathbb{R}_+\right)_{i\geq 1}$. If $\xi_u$ denotes $\xi([0,u])$, then, for any $t\geq 0$,
\[
E\int_{[0,t]} X^{(\xi_u)}_u\,\mathbb{1}_{Y>u}\,\xi(du) = \theta\int_0^t E X_u\, P(Y>u)\, du,
\]
where $(X_u,\, u\in\mathbb{R}_+) = \left(X^{(1)}_u,\, u\in\mathbb{R}_+\right)$. In addition, for any $t\leq s$, we have
\[
E\left[\int_{[0,t]} X^{(\xi_v)}_v\,\mathbb{1}_{Y>v}\,\xi(dv) \int_{[0,s]} X^{(\xi_u)}_u\,\mathbb{1}_{Y>u}\,\xi(du)\right] = \theta\int_0^t E X_u^2\, P(Y>u)\, du + \theta^2 \int_0^t\!\!\int_0^s E X_u\, E X_v\, P(Y>u,\, Y>v)\, du\, dv.
\]

Proof. Since the proofs of the two formulas rely on the same ideas, we only give the proof of the second equation. First of all, let $f : \mathbb{R}_+^2 \to \mathbb{R}_+$ be a positive measurable deterministic function. We recall that, for a Poisson random measure, the measures of two disjoint measurable sets are independent random variables. That is, for $A$, $B$ in the Borel $\sigma$-field of $\mathbb{R}_+$, $\xi(A\cap B^c)$ is independent of $\xi(B)$, which leads to
\[
E[\xi(A)\xi(B)] = E\xi(A)\, E\xi(B) + \operatorname{Var}\xi(A\cap B).
\]
Using the approximation of $f$ by an increasing sequence of simple functions, as in the construction of Lebesgue's integral, it follows from the Fubini-Tonelli theorem and the monotone convergence theorem that

\[
E\int_{[0,t]\times[0,s]} f(u,v)\, \xi(du)\xi(dv) = \theta\int_0^{t} f(u,u)\, du + \theta^2 \int_0^{t}\!\!\int_0^{s} f(u,v)\, du\, dv.
\]
Since the desired relation only depends on the law of our random objects, we can assume without loss of generality that $\xi$ is defined on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ and the family $\left(X^{(i)}_s,\, s\in\mathbb{R}_+\right)_{i\geq 1}$ is defined on another probability space $\left(\widetilde{\Omega}, \widetilde{\mathcal{F}}, \widetilde{\mathbb{P}}\right)$. Then, using a slight abuse of notation, we define $\xi$ on $\Omega\times\widetilde{\Omega}$ by $\xi(\omega,\tilde\omega) = \xi_\omega$, and similarly for the family $X$. Then, by the Fubini-Tonelli theorem, with the notation $\xi^v_\omega = \xi_\omega([0,v])$,

\[
E\left[\int_{[0,t]\times[0,s]} X^{(\xi_v)}_v X^{(\xi_u)}_u\, \xi(du)\xi(dv)\right] = \int_{\Omega\times\widetilde{\Omega}} \int_{[0,t]\times[0,s]} X^{(\xi^v_\omega)}_v(\tilde\omega)\, X^{(\xi^u_\omega)}_u(\tilde\omega)\, \xi_\omega(du)\xi_\omega(dv)\; \mathbb{P}\otimes\widetilde{\mathbb{P}}(d\omega, d\tilde\omega) = \int_{\Omega} \int_{[0,t]\times[0,s]} \left( \int_{\widetilde{\Omega}} X^{(\xi^v_\omega)}_v(\tilde\omega)\, X^{(\xi^u_\omega)}_u(\tilde\omega)\, \widetilde{\mathbb{P}}(d\tilde\omega) \right) \xi_\omega(du)\xi_\omega(dv)\; \mathbb{P}(d\omega).
\]
But since the $X^{(i)}$ are identically distributed and $\xi$ is a simple measure (purely atomic with mass one for each atom), we deduce that, if $u$ and $v$ are two atoms of $\xi_\omega$, then $\xi^v_\omega = \xi^u_\omega$ if and only if $u = v$, which implies that

\[
\int_{\widetilde{\Omega}} X^{(\xi^v_\omega)}_v(\tilde\omega)\, X^{(\xi^u_\omega)}_u(\tilde\omega)\, \widetilde{\mathbb{P}}(d\tilde\omega) = \begin{cases} E X_u\, E X_v, & u\neq v,\\ E X_u^2, & u = v, \end{cases} \quad \xi_\omega\text{-a.e.}
\]
The result follows readily, and the case with the indicator functions of $Y$ is left to the reader.
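Lemma 2.6 is easy to check by simulation. The sketch below is our own verification code, not part of the paper; the choices $X^{(i)}_u = B_i + u$ with $B_i$ uniform on $[0,1]$, $Y \sim \mathrm{Exp}(1)$ and $\theta = 2$ are arbitrary. It compares a Monte Carlo estimate of the left-hand side of the first identity with the exact right-hand side.

```python
import math, random

def lhs_sample(t, theta, rng):
    """One sample of the integral over [0,t] of X^{(xi_u)}_u 1_{Y>u} xi(du),
    with X^{(i)}_u = B_i + u, B_i i.i.d. uniform on [0,1], Y ~ Exp(1)."""
    y = rng.expovariate(1.0)
    total, u = 0.0, 0.0
    while True:
        u += rng.expovariate(theta)   # next atom of the Poisson measure
        if u > t:
            return total
        if y > u:
            b = rng.random()          # fresh B_i: each atom has its own copy
            total += b + u

def rhs_exact(t, theta):
    """theta * int_0^t E[X_u] P(Y>u) du, with E[X_u] = 1/2 + u, P(Y>u) = e^{-u}."""
    return theta * (1.5 - math.exp(-t) * (1.5 + t))
```

Drawing a fresh $B_i$ for every atom is exactly the meaning of the superscript $(\xi_u)$ in the lemma: the process attached to the $j$th atom is the $j$th independent copy.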

3 Statement of the theorem

The a.s. convergence stated in Section 2.1 suggests studying the second order properties of this convergence in order to get central limit theorems. We recall that the Laplace distribution with zero mean and variance $\sigma^2$ is the probability distribution whose characteristic function is given by
\[
\lambda\in\mathbb{R} \mapsto \frac{1}{1 + \frac{1}{2}\sigma^2\lambda^2}.
\]

In particular, it has a density given by

\[
x\in\mathbb{R} \mapsto \frac{1}{\sqrt{2}\,\sigma}\, e^{-\frac{\sqrt{2}\,|x|}{\sigma}}.
\]
We denote this law by $\mathcal{L}\left(0,\sigma^2\right)$. We also recall that, if $G$ is a Gaussian random variable with zero mean and variance $\sigma^2$, and $\mathcal{E}$ is an exponential random variable with parameter $1$ independent of $G$, then $\sqrt{\mathcal{E}}\, G$ is Laplace $\mathcal{L}\left(0,\sigma^2\right)$.

Before stating the main result of the paper, let us recall the law of large numbers for $N_t$.

Theorem 3.1. In the supercritical case, that is $bE[V] > 1$, there exists a random variable $\mathcal{E}$ such that
\[
e^{-\alpha t} N_t \xrightarrow[t\to\infty]{} \frac{\mathcal{E}}{\psi'(\alpha)}, \quad \text{a.s. and in } L^2.
\]

In particular, under $P_\infty$, $\mathcal{E}$ is exponentially distributed with parameter one.
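The representation $\sqrt{\mathcal{E}}\, G$ of the Laplace law is easy to check numerically. The sketch below is ours, not part of the paper; it compares the empirical characteristic function of $\sqrt{\mathcal{E}}\, G$ with the closed form above, for $\sigma^2 = 2$.

```python
import cmath, math, random

def laplace_cf(lam, sigma2):
    # characteristic function of L(0, sigma2)
    return 1.0 / (1.0 + 0.5 * sigma2 * lam * lam)

rng = random.Random(7)
sigma2 = 2.0
# samples of sqrt(E) * G with E ~ Exp(1) and G ~ N(0, sigma2), independent
samples = [math.sqrt(rng.expovariate(1.0)) * rng.gauss(0.0, math.sqrt(sigma2))
           for _ in range(200000)]

def empirical_cf(lam):
    return sum(cmath.exp(1j * lam * x) for x in samples) / len(samples)
```

The variance of $\sqrt{\mathcal{E}}\, G$ is $E[\mathcal{E}]\,\sigma^2 = \sigma^2$, matching the second moment of $\mathcal{L}(0,\sigma^2)$.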

In this work, we prove the following theorem on the second order properties of the above convergence.

Theorem 3.2. In the supercritical case, we have, under P∞,

\[
e^{-\frac{\alpha}{2}t}\left( \psi'(\alpha) N_t - e^{\alpha t}\mathcal{E} \right) \xrightarrow[t\to\infty]{(d)} \mathcal{L}\left(0,\; 2 - \psi'(\alpha)\right).
\]
The proof of this theorem is the subject of Section 6. Note that, according to (2.3), we have

\[
2 - \psi'(\alpha) = 1 + \int_{\mathbb{R}_+} v e^{-\alpha v}\, b\, P_V(dv) > 0.
\]

4 Strategy of proof

Let $(G_n)_{n\geq 1}$ be a sequence of geometric random variables with respective parameters $\frac{1}{n}$, and let $(X_i)_{i\geq 1}$ be an $L^2$ family of i.i.d. random variables with zero mean, independent of $(G_n)_{n\geq 1}$. It is easy to show that the characteristic function of
\[
Z_n := \frac{1}{\sqrt{n}} \sum_{i=1}^{G_n} X_i \tag{4.1}
\]
is given by
\[
E e^{i\lambda Z_n} = \frac{1 + o_n(1)}{1 + \frac{\lambda^2}{2} E X_1^2 + o_n(1)}, \tag{4.2}
\]
from which we deduce that $Z_n$ converges in distribution to $\mathcal{L}\left(0, E X_1^2\right)$.

If we suppose that the population counting process $N$ is a Yule Markov process, it clearly follows from the branching property that, for $s < t$,

\[
N_t = \sum_{i=1}^{N_s} N^i_{t-s}, \tag{4.3}
\]

where the family $\left(N^i_{t-s}\right)_{i\geq 1}$ is an i.i.d. sequence of random variables distributed as $N_{t-s}$ and independent of $N_s$. Moreover, since $N_s$ is geometrically distributed with parameter $e^{-\alpha s}$, taking the renormalized limit leads to
\[
\lim_{t\to\infty} e^{-\alpha t} N_t =: \mathcal{E} = e^{-\alpha s} \sum_{i=1}^{N_s} \mathcal{E}_i,
\]
where $\mathcal{E}_1, \dots, \mathcal{E}_{N_s}$ is an i.i.d. family of exponential random variables with parameter one, independent of $N_s$. Hence,
\[
N_t - e^{\alpha t}\mathcal{E} = \sum_{i=1}^{N_s} \left( N^i_{t-s} - e^{\alpha(t-s)}\mathcal{E}_i \right)
\]
is a geometric sum of centered i.i.d. random variables. This remark and (4.1) suggest the desired CLT in the Yule case.
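The geometric-sum CLT (4.1)–(4.2) can be illustrated by simulation. In the sketch below (ours; centered uniform summands are an arbitrary choice with $E X^2 = 1/3$), the empirical moments of $Z_n$ approach those of the Laplace law $\mathcal{L}(0, 1/3)$, whose kurtosis $E Z^4 / (E Z^2)^2$ equals $6$.

```python
import random

def sample_zn(n, rng):
    """One draw of Z_n = n^{-1/2} sum_{i=1}^{G_n} X_i with G_n geometric of
    parameter 1/n and X_i uniform on [-1, 1] (centered, E X^2 = 1/3)."""
    g = 1
    while rng.random() >= 1.0 / n:   # number of trials until first success
        g += 1
    return sum(rng.uniform(-1.0, 1.0) for _ in range(g)) / n ** 0.5
```

For a Gaussian limit the kurtosis would be $3$; the value near $6$ is the signature of the Laplace limit, i.e. of the exponential randomization of the variance coming from $G_n/n \to \mathcal{E}$.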

Remark 4.1. Let $N$ be an integer-valued random variable. In the sequel, we say that a random vector with random size $(X_i)_{1\leq i\leq N}$ forms an i.i.d. family of random variables independent of $N$ if and only if
\[
(X_1, \dots, X_N) \stackrel{d}{=} \left(\widetilde{X}_1, \dots, \widetilde{X}_N\right),
\]
where $\left(\widetilde{X}_i\right)_{i\geq 1}$ is a sequence of i.i.d. random variables distributed as $X_1$ and independent of $N$.

However, in the general case, we need to overcome some important difficulties. First of all, equation (4.3) is wrong in general. Nevertheless, a much weaker version of (4.3) can be obtained in the general case. To make this clear, if $u < t$,

\[
N_t = \sum_{i=1}^{N_u} N^i_{t-u}(O_i), \tag{4.4}
\]
where $\left(N^i_{t-u}(O_i)\right)_{i\leq N_u}$ denote the population counting processes of the subtrees $\mathbb{T}(O_i)$ induced by each individual. The notation refers to the fact that each subtree has the law of a standard splitting tree, with the only difference that the lifelength of the root is given by $O_i$. More precisely, we define, for all $i\geq 1$ and $o\in\mathbb{R}_+$, $N^i_{t-u}(o)$ as the population counting process of the splitting tree constructed from the same random objects as the $i$th subtree of Figure 4, where the life duration of the first individual is equal to $o$. Hence, from the independence properties between individuals, $\left(N^i_{t-u}(o),\, t\geq u,\, o\geq 0\right)_{i\geq 1}$ is a family of independent processes, independent of $(O_i)_{1\leq i\leq N_u}$, and $\left(N^i_{t-u}(o),\, t\geq u\right)$ has the law of the population counting process of a splitting tree in which the lifespan of the ancestor is $o$. Note that the lifespans of the other individuals are still distributed as $V$. From the discussion above, it follows that the processes $\left(N^i_{t-u}(O_i),\, t\geq u\right)_{1\leq i\leq N_u}$ are dependent only through the residual lifetimes $(O_i)_{1\leq i\leq N_u}$, and the law of $(N_t(O_i),\, t\in\mathbb{R}_+)$ under $P_u$
is the law of the population counting process of a standard splitting tree in which the lifespan of the root is distributed as $O_i$ under $P_u$. Unfortunately, the computation of (4.2) does not apply to (4.4). This issue is solved by the following lemma, whose proof is very similar to that of Proposition 5.5 of [17].

Figure 4: Residual lifetimes, with the subtrees associated to the individuals living at time $u$.

Lemma 4.2. Let $u\in\mathbb{R}_+$. For $i$ an integer between $1$ and $N_u$, we denote by $O_i$ the residual lifetime of the $i$th individual alive at time $u$. Then, under $P_u$, the family $(O_i,\, i\in\llbracket 1, N_u\rrbracket)$ forms a family of independent random variables, independent of $N_u$, and, except $O_1$, having the same distribution, given, for $2\leq i\leq N_u$, by
\[
P_u(O_i\in dx) = \int_{\mathbb{R}_+} \frac{W(u-y)}{W(u)-1}\, b\, P(V-y\in dx)\, dy. \tag{4.5}
\]

Moreover, it follows that the family $\left(N_s(O_i),\, s\in\mathbb{R}_+\right)_{1\leq i\leq N_u}$ is an independent family of processes, i.i.d. for $i\geq 2$, and independent of $N_u$.

Proof. Let $\left(Y^{(i)}\right)_{0\leq i\leq N_u}$ be a family of independent Lévy processes with Laplace exponent
\[
\psi(x) = x - \int_{(0,\infty]}\left(1-e^{-rx}\right)\Lambda(dr), \quad x\in\mathbb{R}_+,
\]
conditioned to hit $(u,\infty)$ before hitting $0$ for $i\in\{0,\dots,N_u-1\}$, and conditioned to hit $0$ first for $i = N_u$. We also assume that
\[
Y^{(0)}_0 = u\wedge V,
\]
and
\[
Y^{(i)}_0 = u, \quad i\in\{1,\dots,N_u\}.
\]


Figure 5: Reflected JCCP with overshoot over t. Independence is provided by the Markov property.

Now, denote by τi the exit time of the ith process out of (0, u) and

\[
T_n = \sum_{i=0}^{n-1} \tau_i, \quad n\in\{0,\dots,N_u+1\}.
\]
Then, the process defined, for all $s$, by

\[
Y_s = \sum_{i=0}^{N_u} Y^{(i)}_{s-T_i}\, \mathbb{1}_{T_i\leq s < T_{i+1}}
\]
has the law of the contour process of the splitting tree truncated above $u$ (see Figure 5).

The result follows easily from
\[
P\left(\tau_0^- < \tau_u^+\right) = 1 - \frac{1}{W(u)}.
\]
Remark 4.3. It is important to note that the law of the residual lifetimes of the individuals considered above depends on the particular time $u$ we choose to cut the tree. That is why, in the sequel, we may write $O^{(u)}_i$ for $O_i$ when we want to underline the dependence in time of the law of the residual lifetimes.

In addition, as suggested by (4.2), we need to compute the expected quadratic error in the convergence of $N_t$,
\[
E\left[\left(\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}\right)^2\right],
\]
which requires computing $E[N_t\mathcal{E}]$. Although this moment is easy to obtain in the Markovian case, the method does not extend easily to the general case. One idea is to characterize it as the solution of a renewal equation, in the spirit of the theory of general CMJ processes.

To do this, we use the renewal structure of a splitting tree: a splitting tree can be constructed (see [17]) by grafting i.i.d. splitting trees onto a branch (a tree with a single individual) of length $V_\emptyset$ distributed as $V$. Therefore, there exists a family $\left(N^{(i)}_t,\, t\in\mathbb{R}_+\right)_{i\geq 1}$ of i.i.d. population counting processes with the same law as $(N_t,\, t\in\mathbb{R}_+)$, and a Poisson random measure $\xi$ on $\mathbb{R}_+$ with intensity $b\, da$, such that
\[
N_t = \int_{[0,t]} N^{(\xi_u)}_{t-u}\, \mathbb{1}_{V_\emptyset > u}\, \xi(du) + \mathbb{1}_{V_\emptyset > t}, \quad \text{a.s.}, \tag{4.6}
\]
where $\xi_u = \xi([0,u])$.

Another difficulty comes from the fact that, unlike in (4.1), the quantities summed in (4.4) are time-dependent, which requires a careful analysis of the asymptotic behaviour of their moments. The computation and the asymptotic analysis of these moments are made in Section 6.1.1: in Lemma 6.1, we compute $E[N_t\mathcal{E}]$, and then, with Lemmas 6.2 and 6.4, we study the asymptotic behaviour of the errors of order 2 and 3 respectively. Section 6.1.2 is devoted to the study of the same questions for the population counting processes of the subtrees described in Figure 4 (when the lifetime of the root is not distributed as $V$). Finally, Section 6.2 is devoted to the proof of Theorem 3.2. One of the difficulties in studying the behaviour of the moments is to get better estimates on the scale function $W$ than those of Lemma 2.1. This is the subject of the next section.

5 Precise estimates on $W$ using Lévy processes

Before stating and proving the result of this section, we need to recall some facts about Lévy processes. We follow the presentation of [16]. First, we recall that the law of a spectrally positive Lévy process $(Y_t,\, t\in\mathbb{R}_+)$ is uniquely characterized by its Laplace exponent $\psi_Y$,

\[
\psi_Y(\lambda) = \log E\left[e^{-\lambda Y_1}\right], \quad \lambda\in\mathbb{R}_+,
\]

which in our case takes the form of (2.1):

\[
\psi_Y(\lambda) = \lambda - \int_{(0,\infty]}\left(1 - e^{-r\lambda}\right) b\, P_V(dr), \quad \lambda\in\mathbb{R}_+.
\]
In this section, we suppose that $Y_0 = 0$. For such a Lévy process, $0$ is irregular for $(0,\infty)$, and in this case the local time at the maximum $(L_t,\, t\in\mathbb{R}_+)$ can be defined as
\[
L_t = \sum_{i=0}^{n_t} e^i, \quad t\in\mathbb{R}_+,
\]
where $\left(e^i\right)_{i\geq 0}$ is a family of i.i.d. exponential random variables with parameter $1$, and
\[
n_t := \operatorname{Card}\left\{\, 0 < s\leq t \;\middle|\; Y_s = \sup_{u\leq s} Y_u \,\right\}.
\]
Denoting by $L^{-1}$ the right-inverse of $L$, the ascending ladder height process is $H_t := Y_{L^{-1}_t}$; the bivariate Laplace exponents $\kappa$ and $\hat\kappa$ of the ascending and descending ladder processes, given by the Wiener-Hopf factorization, take in our case the form

\[
\begin{cases}
\kappa(\alpha,\beta) = \dfrac{\alpha - \psi_Y(\beta)}{\phi_Y(\alpha) - \beta}, & \alpha,\beta\in\mathbb{R}_+,\\[2mm]
\hat\kappa(\alpha,\beta) = \phi_Y(\alpha) + \beta, & \alpha,\beta\in\mathbb{R}_+,
\end{cases}
\]
where $\phi_Y$ is the right-inverse of $\psi_Y$. Taking $\alpha = 0$ allows us to recover the Laplace exponent $\psi_H$ of $H$, from which we obtain the relation

\[
\psi_Y(\lambda) = \left(\lambda - \phi_Y(0)\right)\psi_H(\lambda). \tag{5.1}
\]
We now have all the notation needed to state and prove the main result of this section.

Proposition 5.1 (Behavior of $W$). There exists a positive, non-increasing, càdlàg function $F$ such that
\[
W(t) = \frac{e^{\alpha t}}{\psi'(\alpha)} - e^{\alpha t} F(t), \quad t\geq 0,
\]
and
\[
\lim_{t\to\infty} e^{\alpha t} F(t) = \begin{cases} \dfrac{1}{bEV - 1} & \text{if } EV < \infty,\\[2mm] 0 & \text{otherwise.} \end{cases}
\]

Proof. Let $Y^\sharp$ be a spectrally positive Lévy process with Laplace exponent given by

\[
\psi^\sharp(\lambda) = \lambda - \int_{\mathbb{R}_+}\left(1 - e^{-\lambda x}\right) e^{-\alpha x}\, b\, P_V(dx).
\]
It is known that $Y^\sharp$ has the law of the contour process of the supercritical splitting tree with lifespan measure $b P_V$ conditioned to extinction (see [17]). In this case, the largest root of $\psi^\sharp$ is zero, meaning that the process $Y^\sharp$ does not go to infinity and that $\phi_{Y^\sharp}(0) = 0$. Elementary manipulations on Laplace transforms show that the scale function $W^\sharp$ of $Y^\sharp$ is related to $W$ by

\[
W^\sharp(t) = e^{-\alpha t} W(t), \quad t\in\mathbb{R}_+.
\]
Let $H^\sharp$ be the ascending ladder subordinator associated to the Lévy process $Y^\sharp$. In the case where $\phi_{Y^\sharp}(0) = 0$, and in this case only, the scale function $W^\sharp$ can be rewritten as (see [16] or use Laplace transforms)
\[
W^\sharp(t) = \int_0^\infty P\left(H^\sharp_x \leq t\right) dx. \tag{5.2}
\]
In other words, if we denote by $U$ the potential measure of $H^\sharp$,

\[
W^\sharp(t) = U[0,t].
\]

Now, it is easily seen from (5.1) that the Laplace exponent $\psi_{H^\sharp}$ of $H^\sharp$ takes the form

\[
\psi_{H^\sharp}(\lambda) = \psi'(\alpha) + \int_{[0,\infty]}\left(1 - e^{-\lambda r}\right)\Upsilon(dr),
\]
where
\[
\Upsilon(dr) = \int_{(r,\infty)} e^{-\alpha v}\, b\, P_V(dv)\, dr = E\left[e^{-\alpha V}\mathbb{1}_{V>r}\right] b\, dr.
\]
Moreover,
\[
\Upsilon(\mathbb{R}_+) = 1 - \psi'(\alpha),
\]
which means that $H^\sharp$ is a compound Poisson process with jump rate $1-\psi'(\alpha)$, jump distribution
\[
J(dr) := \frac{E\left[e^{-\alpha V}\mathbb{1}_{V>r}\right] b}{1-\psi'(\alpha)}\, dr,
\]
killed at rate $\psi'(\alpha)$. It is well known (or elementary, by conditioning on the number of jumps at time $x$) that

\[
P\left(H^\sharp_x \in dt\right) = e^{-\psi'(\alpha)x} \sum_{k\geq 0} e^{-(1-\psi'(\alpha))x}\, \frac{\left(\left(1-\psi'(\alpha)\right)x\right)^k}{k!}\, J^{\star k}(dt).
\]
Some calculations now lead to
\[
U(dx) = \sum_{k\geq 0} \Upsilon^{\star k}(dx).
\]
From this point, since $\Upsilon$ is a sub-probability, $U(x) := U[0,x]$ satisfies the following defective renewal equation:
\[
U(x) = \int_{\mathbb{R}_+} U(x-u)\, \Upsilon(du) + \mathbb{1}_{\mathbb{R}_+}(x).
\]
Finally, since
\[
\int_{\mathbb{R}_+} e^{\alpha x}\, \Upsilon(dx) = 1,
\]
and since, from Lemma 2.3,
\[
t\mapsto U(t,\infty)
\]
is clearly a directly Riemann integrable function, as a positive, decreasing, integrable function, it follows, as suggested in Remark 2.5, that
\[
e^{\alpha x}\left(U(\mathbb{R}_+) - U(x)\right) \xrightarrow[x\to\infty]{} \frac{1}{\alpha\mu},
\]
with
\[
\mu = \int_{\mathbb{R}_+} r e^{\alpha r}\, \Upsilon(dr) = \frac{1}{\alpha}\left(bEV - 1\right),
\]
if $V$ is integrable. In the case where $V$ is not integrable, the limit is $0$. To end the proof, note, using relation (5.2) and the fact that $H^\sharp$ is killed at rate $\psi'(\alpha)$, that
\[
W^\sharp(t) = \frac{1}{\psi'(\alpha)} - U(t,\infty).
\]
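As a sanity check (ours, not in the paper), Proposition 5.1 can be verified exactly in the linear birth–death case $P_V = \mathrm{Exp}(d)$ with $b > d$: there $\psi(x) = x(x+d-b)/(x+d)$, $\alpha = b-d$, $\psi'(\alpha) = \alpha/b$, and the scale function is explicit, $W(t) = (b e^{\alpha t} - d)/\alpha$, so that $e^{\alpha t}F(t)$ is the constant $d/\alpha = 1/(bEV - 1)$.

```python
import math

b, d = 2.0, 1.0          # birth and death rates; P_V = Exp(d), so b E[V] = 2 > 1
alpha = b - d            # Malthusian parameter (largest root of psi)

def psi(x):
    # psi(x) = x - int (1 - e^{-rx}) b d e^{-d r} dr = x (x + d - b) / (x + d)
    return x * (x + d - b) / (x + d)

def W(t):
    # explicit scale function of the linear birth-death process
    return (b * math.exp(alpha * t) - d) / alpha

dpsi_alpha = alpha / b   # psi'(alpha), computed by hand for this psi

def eat_F(t):
    # e^{alpha t} F(t) in the decomposition W(t) = e^{at}/psi'(a) - e^{at} F(t)
    return math.exp(alpha * t) / dpsi_alpha - W(t)
```

Here $e^{\alpha t}F(t)$ is not only convergent but constant, which is special to exponential lifetimes; the proposition only claims convergence in general.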

6 Proof of Theorem 3.2

We begin the proof of Theorem 3.2 by computing moments and analysing their asymptotic behaviours. A first part is devoted to the case of a splitting tree where the lifetime of the root is distributed as $V$, whereas a second part studies the case where the lifespan of the root is arbitrary (for instance, as in the subtrees described by Figure 4).

6.1 Preliminary moment estimates

This section is devoted to the computation of the expectation of $\left(N_t - e^{\alpha t}\mathcal{E}\right)^2$. We start with the simple case where the initial individual has life-length distributed as $V$. Secondly, we study the asymptotic behavior of these moments. In Subsection 6.1.2, we prove similar results for arbitrary initial distributions.

The expectations above are given with respect to $P$; however, since $N_t$ and $\mathcal{E}$ vanish on the extinction event, we can easily recover the results with respect to $P_t$ by using (2.10) and (2.9) (see Corollary 6.3).

6.1.1 Case $V_\emptyset \stackrel{\mathcal{L}}{=} V$

We start with the computation of $E[N_t\mathcal{E}]$.

Lemma 6.1 (Joint moment of $\mathcal{E}$ and $N_t$). The function $t\mapsto E[N_t\mathcal{E}]$ is the unique solution, bounded on finite intervals, of the renewal equation

\[
f(t) = \int_{\mathbb{R}_+} f(t-u)\, b e^{-\alpha u} P(V>u)\, du + \alpha b\, E[N_\cdot] \star \left(\int_{\mathbb{R}_+} e^{-\alpha v} P(V>\cdot,\, V>v)\, dv\right)(t) + \alpha \int_{\mathbb{R}_+} e^{-\alpha v} P(V>t,\, V>v)\, dv, \tag{6.1}
\]
and its solution is given by

\[
f(t) = \left(1 + \frac{\alpha}{b} - e^{-\alpha t}\right) W(t) - \left(1 - e^{-\alpha t}\right) W\star P_V(t).
\]

Proof. As explained in Section 4,
\[
N_t = \int_{[0,t]} N^{(\xi_u)}_{t-u}\, \mathbb{1}_{V_\emptyset>u}\, \xi(du) + \mathbb{1}_{V_\emptyset>t},
\]
where $\xi$ is a Poisson process with rate $b$ on the real line, $\left(N^{(i)}\right)_{i\geq 1}$ is a family of independent CMJ processes with the same law as $N$, and $V_\emptyset$ is the lifespan of the root. Moreover, the three objects $\left(N^{(i)}\right)_{i\geq 1}$, $\xi$ and $V_\emptyset$ are independent. It follows that, for $s > t$,

\[
N_t N_s = \int_{[0,t]\times[0,s]} N^{(\xi_u)}_{t-u} N^{(\xi_v)}_{s-v}\, \mathbb{1}_{V_\emptyset>u}\mathbb{1}_{V_\emptyset>v}\, \xi(du)\xi(dv) + \mathbb{1}_{V_\emptyset>s} \int_{[0,t]} N^{(\xi_u)}_{t-u}\,\mathbb{1}_{V_\emptyset>u}\,\xi(du) + \mathbb{1}_{V_\emptyset>t} \int_{[0,s]} N^{(\xi_u)}_{s-u}\,\mathbb{1}_{V_\emptyset>u}\,\xi(du) + \mathbb{1}_{V_\emptyset>t}\mathbb{1}_{V_\emptyset>s},
\]
and, using Lemma 2.6,

\[
E N_t N_s = \int_{[0,t]} b\, E[N_{t-u}N_{s-u}]\, P(V>u)\, du + \int_{[0,t]\times[0,s]} b^2\, E[N_{t-u}]\, E[N_{s-v}]\, P(V>u,\, V>v)\, du\, dv
\]

\[
+\ P(V>s)\int_{[0,t]} b\, E[N_{t-u}]\, du + \int_{[0,s]} b\, E[N_{s-u}]\, P(V>u,\, V>t)\, du + P(V>s).
\]
Then, thanks to the estimate $W(t) = \mathcal{O}\left(e^{\alpha t}\right)$ (see Lemma 2.1 or Proposition 5.1) and the $L^1$ convergence of $N_t\frac{N_s}{W(s)}$ to $N_t\mathcal{E}$ as $s$ goes to infinity (since, by Theorem 2.2, $\frac{N_s}{W(s)}$ converges in $L^2$, and using the Cauchy-Schwarz inequality), we can exchange limits and integrals to obtain

\[
\lim_{s\to\infty} E\left[N_t \frac{N_s}{W(s)}\right] = E[N_t\mathcal{E}] = \underbrace{\int_{[0,t]} E[N_{t-u}\mathcal{E}]\, e^{-\alpha u} P(V>u)\, b\, du}_{=:\, f\star G(t)} + \underbrace{\alpha b \int_{[0,t]\times[0,\infty)} E[N_{t-u}]\, e^{-\alpha v}\, P(V>u,\, V>v)\, du\, dv}_{=:\, \zeta_1(t)} + \underbrace{\int_{[0,\infty]} \alpha e^{-\alpha v}\, P(V>v,\, V>t)\, dv}_{=:\, \zeta_2(t)},
\]
where $f(t) := E[N_t\mathcal{E}]$, and where we used that $\lim_{t\to\infty} W(t)^{-1} E N_t = \frac{\alpha}{b}$.

Now, we need to solve the last equation to obtain the last part of the lemma. To do that, we compute the Laplace transform of each part of the equation. Note that, since $W(t) = \mathcal{O}\left(e^{\alpha t}\right)$, it is easy to see that the Laplace transform of each term of (6.1) is well-defined as soon as $\lambda > \alpha$ (using the Cauchy-Schwarz inequality for the first term). Now, using (2.2),

\[
T_L\!\left(e^{\alpha\cdot}G\right)(\lambda) = b\int_{\mathbb{R}_+} e^{-\lambda t} P(V>t)\, dt
= b\int_{\mathbb{R}_+} e^{-\lambda t}\int_{(t,\infty)} P_V(dv)\, dt
= \frac{1}{\lambda}\int_{\mathbb{R}_+} \left(1-e^{-\lambda v}\right) b\, P_V(dv)
= 1 - \frac{\psi(\lambda)}{\lambda}. \tag{6.2}
\]
So,
\[
T_L G(\lambda) = 1 - \frac{\psi(\lambda+\alpha)}{\lambda+\alpha}.
\]
Then,
\[
T_L\zeta_1(\lambda) = \alpha\, T_L\mathbb{E}[N_\cdot](\lambda)\; T_L\!\left(b\int_{\mathbb{R}_+} e^{-\alpha v} P(V>\cdot,\ V>v)\, dv\right)(\lambda)
= \left(\frac{\lambda}{\psi(\lambda)}-1\right) \underbrace{T_L\!\left(\alpha\int_{\mathbb{R}_+} e^{-\alpha v} P(V>\cdot,\ V>v)\, dv\right)(\lambda)}_{=T_L\zeta_2(\lambda)},
\]
and, using (6.2), we get
\[
T_L\zeta_2(\lambda) = \alpha\int_{\mathbb{R}_+} e^{-\lambda t}\int_{\mathbb{R}_+} e^{-\alpha v} P(V>t,\ V>v)\, dv\, dt
= \frac{1}{b}\left(\frac{\psi(\lambda+\alpha)}{\lambda} - \frac{\psi(\lambda)}{\lambda}\right).
\]
Finally, we obtain
\[
T_L f(\lambda) = T_L f(\lambda)\left(1 - \frac{\psi(\lambda+\alpha)}{\lambda+\alpha}\right)
+ \left(\frac{\lambda}{\psi(\lambda)}-1\right)\frac{1}{b}\left(\frac{\psi(\lambda+\alpha)}{\lambda} - \frac{\psi(\lambda)}{\lambda}\right)
+ \frac{1}{b}\left(\frac{\psi(\lambda+\alpha)}{\lambda} - \frac{\psi(\lambda)}{\lambda}\right).
\]
Hence,
\[
T_L f(\lambda) = \frac{\lambda+\alpha}{b}\left(\frac{1}{\psi(\lambda)} - \frac{1}{\psi(\lambda+\alpha)}\right).
\]
Finally, using (2.5) and
\[
b\, T_L\!\left(W\star P_V\right)(\lambda) = \frac{\psi(\lambda) - \lambda + b}{\psi(\lambda)},
\]
we can invert the Laplace transform of $f$ and get the result.
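The Laplace-transform computations in this proof can be checked numerically in the classical birth–death special case, where the lifetime $V$ is exponential with rate $d$ (an illustrative choice, not part of the general setting of the text): there $\alpha = b-d$, $\psi(\lambda) = \lambda(\lambda-\alpha)/(\lambda+d)$, and the scale function is explicit, $W(t) = (b e^{\alpha t}-d)/(b-d)$. A minimal sketch verifying the defining identity $T_L W(\lambda) = 1/\psi(\lambda)$ for $\lambda>\alpha$:

```python
import math

# Birth-death special case (illustrative): lifetimes are Exp(d), births occur
# at rate b, so alpha = b - d, psi(l) = l*(l - alpha)/(l + d), and the scale
# function is explicit: W(t) = (b*e^{alpha t} - d)/(b - d).
b, d = 2.0, 1.0
alpha = b - d

def psi(l):
    return l * (l - alpha) / (l + d)

def W(t):
    return (b * math.exp(alpha * t) - d) / (b - d)

def laplace_W(l, T=60.0, n=200_000):
    # Trapezoidal quadrature of int_0^T e^{-l*t} W(t) dt, valid for l > alpha,
    # where the truncation error is of order e^{-(l-alpha)T}.
    h = T / n
    s = 0.5 * (W(0.0) + math.exp(-l * T) * W(T))
    for k in range(1, n):
        t = k * h
        s += math.exp(-l * t) * W(t)
    return s * h

for lam in (1.5, 2.0, 3.5):
    print(lam, laplace_W(lam), 1.0 / psi(lam))  # T_L W(lambda) vs 1/psi(lambda)
```

Any $\lambda>\alpha$ can be used; the two printed columns agree up to the quadrature error.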

Lemma 6.1 allows us to compute the expected quadratic error.

Lemma 6.2 (Quadratic error in the convergence of $N_t$). Let $\mathcal{E}$ be the a.s. limit of $\psi'(\alpha)e^{-\alpha t}N_t$. Then,
\[
\lim_{t\to\infty} e^{-\alpha t}\,\mathbb{E}\!\left[\left(\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}\right)^2\right] = \frac{\alpha}{b}\left(2-\psi'(\alpha)\right).
\]

Proof. Let
\[
\mu := \lim_{t\to\infty} e^{\alpha t} F(t),
\]
where $F$ is defined in Proposition 5.1. We have, using Proposition 5.1 and (2.4),
\[
\int_{[0,t]} W(t-u)\, P_V(du)
= \frac{e^{\alpha t}}{\psi'(\alpha)}\left(1-\frac{\alpha}{b}\right)
- \frac{e^{\alpha t}}{\psi'(\alpha)}\int_{(t,\infty)} e^{-\alpha u}\, P_V(du)
- \int_{[0,t]} e^{\alpha(t-u)} F(t-u)\, P_V(du)
= \frac{e^{\alpha t}}{\psi'(\alpha)}\left(1-\frac{\alpha}{b}\right) - \mu + o(1).
\]
Hence, the expression of $\mathbb{E}[\mathcal{E} N_t]$ given by Lemma 6.1 can be rewritten, thanks to Lemma 5.1, as
\[
\mathbb{E}[\mathcal{E} N_t] = \frac{2\alpha e^{\alpha t}}{b\,\psi'(\alpha)} - \frac{\alpha}{b}\left(\mu + \frac{1}{\psi'(\alpha)}\right) + o(1). \tag{6.3}
\]
Using (2.6) and (2.9) in conjunction with Proposition 5.1, we also have
\[
e^{-\alpha t}\,\mathbb{E}\!\left[N_t^2\right] = \frac{2\alpha e^{\alpha t}}{b\,\psi'(\alpha)^2} - \frac{2\alpha\mu}{b\,\psi'(\alpha)} - \frac{\alpha}{b\,\psi'(\alpha)} + o(1). \tag{6.4}
\]
Hence, it finally follows from (6.3) and (6.4) that
\[
e^{-\alpha t}\,\mathbb{E}\!\left[\left(\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}\right)^2\right]
= \psi'(\alpha)^2 e^{-\alpha t}\,\mathbb{E}\!\left[N_t^2\right] - 2\psi'(\alpha)\,\mathbb{E}[\mathcal{E} N_t] + \frac{2\alpha e^{\alpha t}}{b}
= \frac{2\alpha}{b} - \frac{\alpha\,\psi'(\alpha)}{b} + o(1)
= \frac{\alpha}{b}\left(2-\psi'(\alpha)\right) + o(1).
\]

It is worth noting that, using (2.9) and the method above, we have the following result.

Corollary 6.3. We have
\[
\frac{1}{P(N_t>0)} = \frac{b}{\alpha} - \frac{b\,\mu\,\psi'(\alpha)}{\alpha}\, e^{-\alpha t} + o\!\left(e^{-\alpha t}\right), \tag{6.5}
\]
which leads to
\[
\mathbb{E}_t[\mathcal{E} N_t] = \frac{2 e^{\alpha t}}{\psi'(\alpha)} - 3\mu - \frac{1}{\psi'(\alpha)} + o(1). \tag{6.6}
\]
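The leading term $b/\alpha$ in (6.5) can be checked against the illustrative birth–death case, where the survival probability is the classical explicit formula $P(N_t>0) = (b-d)e^{\alpha t}/(b e^{\alpha t}-d)$, so that $1/P(N_t>0) = b/\alpha - (d/\alpha)e^{-\alpha t}$ exactly. A quick numerical sketch:

```python
import math

# Birth-death case (illustrative): the survival probability is the classical
# P(N_t > 0) = (b - d) e^{alpha t} / (b e^{alpha t} - d), with alpha = b - d.
b, d = 2.0, 1.0
alpha = b - d

def survival(t):
    return (b - d) * math.exp(alpha * t) / (b * math.exp(alpha * t) - d)

# 1/P(N_t > 0) = b/alpha - (d/alpha) e^{-alpha t}: the constant b/alpha
# matches the leading term of (6.5).
for t in (1.0, 5.0, 10.0):
    print(t, 1.0 / survival(t), b / alpha - (d / alpha) * math.exp(-alpha * t))
```

The correction term here identifies $d/\alpha$ with the coefficient $b\mu\psi'(\alpha)/\alpha$ of (6.5) in this special case.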

Our last estimate is the boundedness of the third moments.

Lemma 6.4 (Boundedness of the third moment). The third moment of the error is asymptotically bounded, that is,
\[
\mathbb{E}\!\left[\left|e^{-\frac{\alpha}{2}t}\left(\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}\right)\right|^3\right] = O(1).
\]

Proof. We define, for all $t\geq 0$, $N_t^\infty$ as the number of individuals alive at time $t$ which have an infinite descent. According to Proposition 6.1 of [5], $N^\infty$ is a Yule process under $P_\infty$. We have
\[
\mathbb{E}\!\left[\left|\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right|^3\right]
\leq 8\,\mathbb{E}\!\left[\left|\frac{\psi'(\alpha)N_t - N_t^\infty}{e^{\frac{\alpha}{2}t}}\right|^3\right]
+ 8\,\mathbb{E}\!\left[\left|\frac{N_t^\infty - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right|^3\right].
\]

Now, we know according to the proof of Theorem 6.2 of [5] (and this is easy to prove using the decomposition of Figure 4) that $N^\infty$ can be decomposed as
\[
N_t^\infty = \sum_{i=1}^{N_t} B_i^{(t)},
\]
where $\left(B_i^{(t)}\right)_{i\geq 1}$ is a family of independent Bernoulli random variables, which is i.i.d. for $i\geq 2$, under $P_t$. Hence,
\[
\mathbb{E}_t\!\left[\left|\frac{\psi'(\alpha)N_t - N_t^\infty}{e^{\frac{\alpha}{2}t}}\right|^3\right]
\leq e^{-\frac{3}{2}\alpha t}\,\mathbb{E}_t\!\left[\left(\sum_{i=1}^{N_t}\left(\psi'(\alpha) - B_i^{(t)}\right)\right)^4\right]^{3/4}.
\]
Since it is known from the proof of Theorem 6.2 of [5] that
\[
\mathbb{E}\!\left[B_2^{(t)}\right] = \psi'(\alpha) + O\!\left(e^{-\alpha t}\right),
\]
it is straightforward that
\[
\mathbb{E}_t\!\left[\left|\frac{\psi'(\alpha)N_t - N_t^\infty}{e^{\frac{\alpha}{2}t}}\right|^3\right]
\]
is bounded.

On the other hand, we know that a Yule process is a time-changed Poisson process (see for instance [2], Theorem III.11.2); that is, if $P$ is a Poisson process independent of $\mathcal{E}$ under $P_\infty$, then
\[
\mathbb{E}\!\left[\left|\frac{N_t^\infty - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right|^3\right]
= P(\mathrm{NonEx})\,\mathbb{E}_\infty\!\left[\left|\frac{P_{\mathcal{E}\left(e^{\alpha t}-1\right)} - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right|^3\right].
\]
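The time-change representation invoked here can be illustrated by simulation (the parameters below are illustrative): a Yule process with rate $\alpha$ started from one individual has the same law at time $t$ as $1 + P_{E(e^{\alpha t}-1)}$, with $P$ a standard Poisson process and $E\sim\mathrm{Exp}(1)$, where the "+1" accounts for the initial individual in this convention. A Monte Carlo sketch comparing this representation with a direct simulation:

```python
import math, random

random.seed(42)
alpha, t, n_runs = 1.0, 2.0, 100_000

def yule_direct(t):
    # Gillespie simulation: with k individuals alive, the next split occurs
    # after an Exp(alpha * k) waiting time.
    n, s = 1, random.expovariate(alpha)
    while s <= t:
        n += 1
        s += random.expovariate(alpha * n)
    return n

def yule_time_change(t):
    # Time-changed Poisson representation: 1 + Poisson(E * (e^{alpha t} - 1)),
    # with E ~ Exp(1); Poisson(mu) is sampled by counting unit-rate arrivals
    # in [0, mu].
    mu = random.expovariate(1.0) * (math.exp(alpha * t) - 1.0)
    k, acc = 0, random.expovariate(1.0)
    while acc < mu:
        k += 1
        acc += random.expovariate(1.0)
    return 1 + k

m1 = sum(yule_direct(t) for _ in range(n_runs)) / n_runs
m2 = sum(yule_time_change(t) for _ in range(n_runs)) / n_runs
print(m1, m2, math.exp(alpha * t))  # both sample means approximate e^{alpha t}
```

Mixing the Poisson parameter over the exponential variable reproduces the geometric marginal of the Yule process, which is why both empirical means agree with $e^{\alpha t}$.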

Now, using the Hölder inequality, it remains to bound

\[
\mathbb{E}_\infty\!\left[\left(\frac{P_{\mathcal{E}\left(e^{\alpha t}-1\right)} - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)^4\right]
= e^{-2\alpha t}\int_{\mathbb{R}_+} \mathbb{E}\!\left[\left(P_{x\left(e^{\alpha t}-1\right)} - x e^{\alpha t}\right)^4\right] e^{-x}\, dx.
\]
Finally, for a Poissonian random variable $X$ with parameter $\nu$, straightforward computations give that $\mathbb{E}\!\left[(X-\nu)^4\right] = 3\nu^2 + \nu$, which allows us to end the proof.

6.1.2 Case with arbitrary initial distribution $V_\emptyset$

In order to study the behaviour of the sub-splitting trees involved in the decomposition described in Figure 4, we investigate the behaviour of a splitting tree where the ancestor's lifelength is not distributed as $V$, but follows an arbitrary distribution. Let $\Xi$ be a random variable in $(0,\infty]$, giving the life-length of the ancestor, and denote by $N(\Xi)$ the associated population counting process. Using the decomposition of $N(\Xi)$ over the lifespan of the ancestor, as described in Section 4, we have
\[
N_t(\Xi) = \int_{[0,t]} N^{(\xi_u)}_{t-u}\, \mathbf{1}_{\Xi>u}\, \xi(du) + \mathbf{1}_{\Xi>t}, \tag{6.7}
\]
where $\left(N^{(i)}\right)_{i\geq 1}$ is a family of i.i.d. CMJ processes with the same law as $N$, independent of $\Xi$ and $\xi$, as described in Section 4. Let, for all $i\geq 1$,
\[
\mathcal{E}_i := \lim_{t\to\infty} \psi'(\alpha) e^{-\alpha t} N_t^{(i)}, \quad \text{a.s.,} \tag{6.8}
\]
and let $\mathcal{E}(\Xi)$ be the random variable defined by

\[
\mathcal{E}(\Xi) := \int_{[0,\infty]} \mathcal{E}_{(\xi_u)}\, e^{-\alpha u}\, \mathbf{1}_{\Xi>u}\, \xi(du). \tag{6.9}
\]

Lemma 6.5 (First moment). The first moment of the error is asymptotically bounded, that is,
\[
\mathbb{E}\!\left[\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right] = O(1),
\]
uniformly with respect to the random variable $\Xi$.

Proof. Using Lemma 2.6, (6.7) and (6.9), we have
\[
\mathbb{E}\!\left[\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right]
= \int_{[0,t]} \left(\psi'(\alpha)\,\mathbb{E}[N_{t-u}] - e^{\alpha(t-u)}\,\mathbb{E}[\mathcal{E}]\right) e^{-\alpha u}\, P(\Xi>u)\, b\, du,
\]
which leads, using (2.8) and (2.10), to
\[
\mathbb{E}\!\left[\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right]
= \int_{[0,t]} \underbrace{\left(\psi'(\alpha)W(t-u) - \psi'(\alpha)\, W\star P_V(t-u) - \frac{\alpha}{b}\, e^{\alpha(t-u)}\right)}_{=:I_{t-u}} e^{-\alpha u}\, P(\Xi>u)\, b\, du. \tag{6.10}
\]
We get, using Proposition 5.1 and (2.4),
\[
I_s = e^{\alpha s} - \psi'(\alpha)\, e^{\alpha s} F(s) - e^{\alpha s}\left(1-\frac{\alpha}{b}\right)
+ \psi'(\alpha)\int_{[0,s]} e^{\alpha(s-v)} F(s-v)\, P_V(dv)
+ e^{\alpha s}\int_{(s,\infty)} e^{-\alpha v}\, P_V(dv)
- \frac{\alpha}{b}\, e^{\alpha s}
= e^{\alpha s}\int_{(s,\infty)} e^{-\alpha v}\, P_V(dv) + o(1).
\]
Hence, $(I_s)_{s\geq 0}$ is bounded. The result now follows from (6.10).

Lemma 6.6 ($L^2$ convergence in the general case). $\psi'(\alpha)e^{-\alpha t}N_t(\Xi)$ converges a.s. and in $L^2$ to $\mathcal{E}(\Xi)$, and
\[
\lim_{t\to\infty} e^{-\alpha t}\,\mathbb{E}\!\left[\left(\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right)^2\right]
= \frac{\alpha}{b}\left(2-\psi'(\alpha)\right)\int_{\mathbb{R}_+} e^{-\alpha s}\, P(\Xi>s)\, b\, ds,
\]
where the convergence is uniform with respect to $\Xi$ in $(0,\infty]$. In the particular case when $\Xi$ follows the distribution of $O_2^{(\beta t)}$ given by (4.5), we have, for $0<\beta<\frac{1}{2}$,
\[
\lim_{t\to\infty} e^{\alpha t}\,\mathbb{E}_{\beta t}\!\left[\left(e^{-\alpha t}\psi'(\alpha)N_t\!\left(O_2^{(\beta t)}\right) - \mathcal{E}\!\left(O_2^{(\beta t)}\right)\right)^2\right]
= \left(2-\psi'(\alpha)\right)\psi'(\alpha).
\]

Proof. From (6.7) and (6.9), we have

\[
e^{-\alpha t}\psi'(\alpha)N_t(\Xi) - \mathcal{E}(\Xi)
= \int_{\mathbb{R}_+}\left(e^{-\alpha(t-u)}\psi'(\alpha)N^{(\xi_u)}_{t-u} - \mathcal{E}_{(\xi_u)}\right) e^{-\alpha u}\,\mathbf{1}_{\Xi>u}\,\xi(du) + e^{-\alpha t}\mathbf{1}_{\Xi>t} \tag{6.11}
\]
and, using Lemma 2.6,
\[
\mathbb{E}\!\left[\left(\psi'(\alpha)e^{-\alpha t}N_t(\Xi) - \mathcal{E}(\Xi)\right)^2\right]
= \int_{\mathbb{R}_+} \mathbb{E}\!\left[\left(\psi'(\alpha)e^{-\alpha(t-u)}N_{t-u} - \mathcal{E}\right)^2\right] e^{-2\alpha u}\, P(\Xi>u)\, b\, du
\]
\[
+ \int_{\mathbb{R}_+^2} \mathbb{E}\!\left[\psi'(\alpha)e^{-\alpha(t-u)}N_{t-u} - \mathcal{E}\right]\,\mathbb{E}\!\left[\psi'(\alpha)e^{-\alpha(t-v)}N_{t-v} - \mathcal{E}\right] e^{-\alpha(u+v)}\, P(\Xi>u,\ \Xi>v)\, b^2\, du\, dv
\]
\[
+ e^{-2\alpha t}\, P(\Xi>t)
+ 2 e^{-\alpha t}\int_{\mathbb{R}_+} \mathbb{E}\!\left[\psi'(\alpha)e^{-\alpha(t-u)}N_{t-u} - \mathcal{E}\right] e^{-\alpha u}\, P(\Xi>u,\ \Xi>t)\, b\, du.
\]
Moreover, since
\[
\psi'(\alpha)\, e^{-\alpha t}\,\mathbb{E}[N_t] - \mathbb{E}[\mathcal{E}] = O\!\left(e^{-\alpha t}\right),
\]
this leads, using Lemma 6.5, to

\[
\lim_{t\to\infty} e^{\alpha t}\,\mathbb{E}\!\left[\left(e^{-\alpha t}\psi'(\alpha)N_t(\Xi) - \mathcal{E}(\Xi)\right)^2\right]
= \frac{\alpha}{b}\left(2-\psi'(\alpha)\right)\int_{\mathbb{R}_+} e^{-\alpha u}\, P(\Xi>u)\, b\, du.
\]
Now, we have from (4.5) and Lemma 2.1,
\[
\lim_{u\to\infty} P_u(O_2>s)
= \lim_{u\to\infty} \int_{\mathbb{R}_+} \frac{W(u-y)}{W(u)-1}\, P(V>s+y)\, b\, dy
= \int_{\mathbb{R}_+} e^{-\alpha y}\, P(V>s+y)\, b\, dy.
\]
It follows then from the Lebesgue theorem that
\[
\lim_{t\to\infty} \int_{\mathbb{R}_+} e^{-\alpha s}\, P_{\beta t}(O_2>s)\, b\, ds = \frac{b\,\psi'(\alpha)}{\alpha}.
\]
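Combining the two limits above gives a closed-form consistency check: substituting $z=s+y$, the last limit requires $b^2\int_0^\infty z\, e^{-\alpha z}\, P(V>z)\, dz = b\,\psi'(\alpha)/\alpha$. In the illustrative birth–death case ($V\sim\mathrm{Exp}(d)$, $\psi'(\alpha)=(b-d)/b$) both sides equal $1$; a quick numerical sketch:

```python
import math

# Birth-death case (illustrative): P(V > z) = e^{-d z}, alpha = b - d, and
# psi'(alpha) = (b - d)/b.  Check b^2 * int_0^inf z e^{-alpha z} P(V > z) dz
# against b * psi'(alpha) / alpha.
b, d = 3.0, 1.0
alpha = b - d
psi_prime = (b - d) / b

def lhs(T=50.0, n=500_000):
    # Trapezoidal quadrature of the integral, truncated at T.
    h = T / n
    f = lambda z: z * math.exp(-(alpha + d) * z)
    s = 0.5 * (f(0.0) + f(T))
    for k in range(1, n):
        s += f(k * h)
    return b * b * s * h

print(lhs(), b * psi_prime / alpha)  # both sides equal 1 in this case
```

The agreement is exact here because $b^2\int_0^\infty z e^{-(\alpha+d)z}dz = b^2/(\alpha+d)^2 = 1$ for any choice of $b>d$.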

Lemma 6.7 (Boundedness in the general case). The error of order 3 is asymptotically bounded, that is,
\[
e^{-\frac{3}{2}\alpha t}\,\mathbb{E}\!\left[\left|\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right|^3\right] = O(1),
\]
uniformly w.r.t. $\Xi$.

Proof. Rewriting $N(\Xi)$ and $\mathcal{E}(\Xi)$ as in the proof of Lemma 6.6, we see that
\[
e^{-\frac{3}{2}\alpha t}\,\mathbb{E}\!\left[\left|\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right|^3\right]
= e^{-\frac{3}{2}\alpha t}\,\mathbb{E}\!\left[\left|\int_{[0,t]}\left(\psi'(\alpha)N^{(\xi_u)}_{t-u} - e^{\alpha(t-u)}\mathcal{E}_{(\xi_u)}\right)\mathbf{1}_{\Xi>u}\,\xi(du) + \psi'(\alpha)\mathbf{1}_{\Xi>t}\right|^3\right]
\]
\[
\leq 8\,\mathbb{E}\!\left[\left(\int_{[0,t]} e^{-\frac{\alpha}{2}(t-u)}\left|\psi'(\alpha)N^{(\xi_u)}_{t-u} - e^{\alpha(t-u)}\mathcal{E}_{(\xi_u)}\right| e^{-\frac{\alpha}{2}u}\,\mathbf{1}_{\Xi>u}\,\xi(du)\right)^3\right]
+ 8\,\psi'(\alpha)^3\, e^{-\frac{3}{2}\alpha t}\, P(\Xi>t).
\]
We denote by $I$ the first term of the r.h.s. of the last inequality, leading to
\[
I \leq 8\,\mathbb{E}\!\left[\int_{[0,t]^3} \prod_{i=1}^{3} e^{-\frac{\alpha}{2}(t-s_i)}\left|\psi'(\alpha)N^{(\xi_{s_i})}_{t-s_i} - e^{\alpha(t-s_i)}\mathcal{E}_{(\xi_{s_i})}\right| e^{-\frac{\alpha}{2}s_i}\,\mathbf{1}_{\Xi>s_i}\,\xi(ds_1)\,\xi(ds_2)\,\xi(ds_3)\right]
\]
\[
\leq 8\,\mathbb{E}\!\left[\int_{[0,t]^3} \sum_{j=1}^{3} e^{-\frac{3}{2}\alpha(t-s_j)}\left|\psi'(\alpha)N^{(\xi_{s_j})}_{t-s_j} - e^{\alpha(t-s_j)}\mathcal{E}_{(\xi_{s_j})}\right|^3 \prod_{i=1}^{3} e^{-\frac{\alpha}{2}s_i}\,\mathbf{1}_{\Xi>s_i}\,\xi(ds_1)\,\xi(ds_2)\,\xi(ds_3)\right]
\]
\[
\leq 24\,\mathbb{E}\!\left[\int_{[0,t]} e^{-\frac{3}{2}\alpha(t-u)}\left|\psi'(\alpha)N^{(\xi_u)}_{t-u} - e^{\alpha(t-u)}\mathcal{E}_{(\xi_u)}\right|^3 e^{-\frac{\alpha}{2}u}\,\mathbf{1}_{\Xi>u}\,\mu(du)\right],
\]
with
\[
\mu(du) = \left(\int_{[0,t]} e^{-\frac{\alpha}{2}s}\,\xi(ds)\right)^2 \xi(du).
\]
Now, since $\mu$ is independent from the families $\left(N^{(i)}\right)$ and $\left(\mathcal{E}_{(i)}\right)$, an easy adaptation of the proof of Lemma 2.6 leads to
\[
e^{-\frac{3}{2}\alpha t}\,\mathbb{E}\!\left[\left|\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right|^3\right]
\leq 24\,\mathbb{E}\!\left[\int_{[0,t]} \mathbb{E}\!\left[e^{-\frac{3}{2}\alpha(t-u)}\left|\psi'(\alpha)N_{t-u} - e^{\alpha(t-u)}\mathcal{E}\right|^3\right] e^{-\frac{\alpha}{2}u}\,\mathbf{1}_{\Xi>u}\,\mu(du)\right]
+ 8\,\psi'(\alpha)^3\, e^{-\frac{3}{2}\alpha t}\, P(\Xi>t).
\]
Using Lemma 6.4 to bound
\[
\mathbb{E}\!\left[e^{-\frac{3}{2}\alpha(t-u)}\left|\psi'(\alpha)N_{t-u} - e^{\alpha(t-u)}\mathcal{E}\right|^3\right]
\]
in the previous expression finally leads to
\[
e^{-\frac{3}{2}\alpha t}\,\mathbb{E}\!\left[\left|\psi'(\alpha)N_t(\Xi) - e^{\alpha t}\mathcal{E}(\Xi)\right|^3\right]
\leq C\left(\mathbb{E}\!\left[\left(\int_{\mathbb{R}_+} e^{-\frac{\alpha}{2}u}\,\xi(du)\right)^3\right] + 1\right),
\]
for some real positive constant $C$.

6.2 Proof of Theorem 3.2

We fix a positive real number $u$. From this point, we recall the decomposition of the splitting tree as described in Section 4 (see also Figure 4). We also recall that, for all $i$ in $\{1,\dots,N_u\}$, the process $\left(N^i_s(O_i),\ s\in\mathbb{R}_+\right)$ is the population counting process of the (sub-)splitting tree $T(O_i)$.

As explained in Section 4, it follows from the construction of the splitting tree that, for all $i$ in $\{1,\dots,N_u\}$, there exists an i.i.d. family of processes $\left(N^{i,j}\right)_{j\geq 1}$ independent from $N_u$ with the same law as $(N_t,\ t\in\mathbb{R}_+)$, and an i.i.d. family $\left(\xi^{(i)}\right)_{1\leq i\leq N_u}$ of random measures, independent from $N_u$ and from the family $\left(N^{i,j}\right)_{j\geq 1}$, with the same law as $\xi$, such that
\[
N^i_t(O_i) = \int_{[0,t]} N^{i,\xi^{(i)}_u}_{t-u}\,\mathbf{1}_{O_i>u}\,\xi^{(i)}(du) + \mathbf{1}_{O_i>t}, \quad \forall t\in\mathbb{R}_+,\ \forall i\in\{1,\dots,N_u\}. \tag{6.12}
\]

As in (6.9), we define, for all $i$ in $\{1,\dots,N_u\}$,
\[
\mathcal{E}_i(O_i) := \int_{[0,\infty]} \mathcal{E}_{i,\xi^{(i)}_u}\, e^{-\alpha u}\,\mathbf{1}_{O_i>u}\,\xi^{(i)}(du), \tag{6.13}
\]
where $\mathcal{E}_{i,j} := \lim_{t\to\infty} \psi'(\alpha)e^{-\alpha t}N^{i,j}_t$.

Hence, it follows from Lemma 6.6 that $\psi'(\alpha)e^{-\alpha t}N^i_t(O_i)$ converges to $\mathcal{E}_i(O_i)$ in $L^2$. Note also that, from Lemma 4.2, the family $\left(N^i_t(O_i),\ t\in\mathbb{R}_+\right)_{2\leq i\leq N_u}$ is i.i.d. and independent from $N_u$ under $P_u$, as well as the family $\left(\mathcal{E}_i(O_i)\right)_{2\leq i\leq N_u}$ (in the sense of Remark 4.1). Note that the law under $P_u$ of the processes of the family $\left(N^i_t(O_i),\ t\in\mathbb{R}_+\right)_{2\leq i\leq N_u}$ is the law of standard population counting processes where the lifespan of the root is distributed as $O_2$ under $P_u$ (except for the first one).

Lemma 6.8 (Decomposition of $\mathcal{E}$). We have the following decomposition of $\mathcal{E}$:
\[
\mathcal{E} = e^{-\alpha u}\sum_{i=1}^{N_u} \mathcal{E}_i(O_i), \quad \text{a.s.}
\]
Moreover, under $P_u$, the random variables $\left(\mathcal{E}_i(O_i)\right)_{i\geq 1}$ (defined by (6.13)) are independent, independent of $N_u$, and identically distributed for $i\geq 2$.

Proof. Step 1: Decomposition of $\mathcal{E}$.
For all $t$ in $\mathbb{R}_+$, we denote by $N_t^\infty$ the number of individuals alive at time $t$ which have an infinite descent. For all $i$, we define, for all $t\geq 0$, $N^\infty_t(O_i)$ from $T(O_i)$ as $N^\infty_t$ was defined from the whole tree. Now, it is easily seen that
\[
N_t^\infty = \sum_{i=1}^{N_u} N^\infty_{t-u}(O_i).
\]
Hence, if $e^{-\alpha t}N^\infty_t(O_i)$ converges a.s. to $\mathcal{E}_i(O_i)$, then
\[
\lim_{t\to\infty} e^{-\alpha t} N^\infty_t
= \lim_{t\to\infty} e^{-\alpha u}\sum_{i=1}^{N_u} e^{-\alpha(t-u)} N^\infty_{t-u}(O_i)
= e^{-\alpha u}\sum_{i=1}^{N_u} \mathcal{E}_i(O_i).
\]
So, it just remains to prove the a.s. convergence to get the desired result.

Step 2: a.s. convergence of $e^{-\alpha t}N^\infty_t(O_i)$ to $\mathcal{E}_i(O_i)$.
For this step, we fix $i\in\{1,\dots,N_u\}$. In the same spirit as (6.12) (see also Section 4), it follows from the construction of the splitting tree $T(O_i)$ that there exists an i.i.d. (and independent of $N_u$) sequence of processes $\left(N^{j,\infty}_s,\ s\in\mathbb{R}_+\right)_{j\geq 1}$ with the same law as $(N^\infty_t,\ t\in\mathbb{R}_+)$ (under $P$), such that
\[
N^\infty_t(O_i) = \int_{[0,t]} N^{\xi^{(i)}_u,\infty}_{t-u}\,\mathbf{1}_{O_i>u}\,\xi^{(i)}(du) + \mathbf{1}_{O_i=\infty}, \quad \forall t\geq 0.
\]
Now, it follows from Theorem 6.2 of [5] that, for all $j$,
\[
\lim_{t\to\infty} e^{-\alpha t} N^{j,\infty}_t = \mathcal{E}_{i,j}, \quad \text{a.s.,}
\]
where $\mathcal{E}_{i,j}$ was defined in the beginning of this section. Let
\[
\mathcal{C}_j := \sup_{t\in\mathbb{R}_+} e^{-\alpha t} N^{j,\infty}_t, \quad \forall j\geq 1,
\]
and
\[
\mathcal{C} := \sup_{t\in\mathbb{R}_+} e^{-\alpha t} N^\infty_t.
\]
Then, the family $\left(\mathcal{C}_j\right)_{j\geq 1}$ is i.i.d., since the processes $\left(N^{j,\infty}\right)_{j\geq 1}$ are i.i.d., with the same law as $\mathcal{C}$. Hence,
\[
\int_{[0,t]} e^{-\alpha(t-u)} N^{\xi^{(i)}_u,\infty}_{t-u}\, e^{-\alpha u}\,\mathbf{1}_{O_i>u}\,\xi^{(i)}(du)
\leq \int_{[0,t]} \mathcal{C}_{\xi^{(i)}_u}\, e^{-\alpha u}\,\mathbf{1}_{O_i>u}\,\xi^{(i)}(du). \tag{6.14}
\]
It is easily seen that $\mathbb{E}[\mathcal{C}] = P(\mathrm{NonEx})\,\mathbb{E}_\infty[\mathcal{C}]$. Now, since, from Proposition 6.1 of [5], $N^\infty_t$ is a Yule process under $P_\infty$ (and hence $e^{-\alpha t}N^\infty_t$ is a martingale), Doob's inequalities entail that the random variable $\mathcal{C}$ is integrable. Hence, the right hand side of (6.14) is a.s. finite, and we can apply the Lebesgue theorem to get
\[
\lim_{t\to\infty} e^{-\alpha t} N^\infty_t(O_i)
= \int_{[0,\infty]} \mathcal{E}_{i,\xi^{(i)}_u}\, e^{-\alpha u}\,\mathbf{1}_{O_i>u}\,\xi^{(i)}(du)
= \mathcal{E}_i(O_i), \quad \text{a.s.,}
\]
where the right hand side of the last equality is just the definition of $\mathcal{E}_i(O_i)$.

We have now all the tools needed to prove the central limit theorem for $N_t$.

Proof of Theorem 3.2. Let $u<t$. Using the decomposition of Section 4 together with Lemma 6.8, we can write
\[
\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E} = \sum_{i=1}^{N_u}\left(\psi'(\alpha)N^i_{t-u}(O_i) - e^{\alpha(t-u)}\mathcal{E}_i(O_i)\right), \tag{6.15}
\]
where the random variables $\left(\psi'(\alpha)N^i_{t-u}(O_i) - e^{\alpha(t-u)}\mathcal{E}_i(O_i)\right)_{1\leq i\leq N_u}$ are independent, i.i.d. for $i\geq 2$ and independent of $N_u$. Let us denote by $\varphi$ and $\tilde\varphi$ the characteristic functions
\[
\varphi(\lambda) := \mathbb{E}\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N^2_{t-u}(O_2) - e^{\alpha(t-u)}\mathcal{E}_2(O_2)}{e^{\frac{\alpha}{2}(t-u)}}\right)\right], \quad \lambda\in\mathbb{R},
\]
and
\[
\tilde\varphi(\lambda) := \mathbb{E}\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N^1_{t-u}(O_1) - e^{\alpha(t-u)}\mathcal{E}_1(O_1)}{e^{\frac{\alpha}{2}(t-u)}}\right)\right], \quad \lambda\in\mathbb{R}.
\]
It follows from (6.15) and Lemma 4.2 that
\[
\mathbb{E}_u\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)\right]
= \mathbb{E}_u\!\left[\frac{\tilde\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)}{\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)}\,\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)^{N_u}\right].
\]
Since $N_u$ is geometric with parameter $W(u)^{-1}$ under $P_u$,
\[
\mathbb{E}_u\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)\right]
= \frac{\tilde\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)}{\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)}\;
\frac{W(u)^{-1}\,\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)}{1-\left(1-W(u)^{-1}\right)\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)}.
\]
Using the Taylor formula for $\varphi$, we obtain
\[
\mathbb{E}_u\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)\right]
= \tilde\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)\frac{1}{D(\lambda,t,u)},
\]
where
\[
D(\lambda,t,u) = (W(u)-1)\left(1-\varphi\!\left(\frac{\lambda}{e^{\frac{\alpha}{2}u}}\right)\right) + 1
\]
\[
= 1 - i\lambda\,\frac{W(u)-1}{e^{\frac{\alpha}{2}u}}\,\mathbb{E}\!\left[\frac{\psi'(\alpha)N^2_{t-u}(O_2) - e^{\alpha(t-u)}\mathcal{E}_2(O_2)}{e^{\frac{\alpha}{2}(t-u)}}\right]
+ \frac{\lambda^2}{2}\,\frac{W(u)-1}{e^{\alpha u}}\,\mathbb{E}\!\left[\left(\frac{\psi'(\alpha)N^2_{t-u}(O_2) - e^{\alpha(t-u)}\mathcal{E}_2(O_2)}{e^{\frac{\alpha}{2}(t-u)}}\right)^2\right]
- (W(u)-1)\,R(\lambda,t,u),
\]

with, for all $\epsilon>0$ and all $\lambda$ in $(-\epsilon,\epsilon)$,
\[
|R(\lambda,t,u)| \leq \sup_{\lambda\in(-\epsilon,\epsilon)}\left|\frac{\partial^3\varphi}{\partial\lambda^3}(\lambda)\right|\frac{\epsilon^3}{6}\, e^{-\frac{3}{2}\alpha u}
\leq \mathbb{E}\!\left[\left|\frac{\psi'(\alpha)N^2_{t-u}(O_2) - e^{\alpha(t-u)}\mathcal{E}_2(O_2)}{e^{\frac{\alpha}{2}(t-u)}}\right|^3\right]\frac{\epsilon^3}{6}\, e^{-\frac{3}{2}\alpha u}
\leq C\,\epsilon^3\, e^{-\frac{3}{2}\alpha u}, \tag{6.16}
\]
for some real positive constant $C$ obtained using Lemma 6.7.

From this point, we set $u=\beta t$ with $0<\beta<\frac{1}{2}$. It follows then from Lemmas 6.6 and 4.2 that
\[
\lim_{t\to\infty} \mathbb{E}_{\beta t}\!\left[\left(\frac{\psi'(\alpha)N^2_{t-\beta t}(O_2) - e^{\alpha(t-\beta t)}\mathcal{E}_2(O_2)}{e^{\frac{\alpha}{2}(t-\beta t)}}\right)^2\right] = \psi'(\alpha)\left(2-\psi'(\alpha)\right). \tag{6.17}
\]
Moreover, we have from Lemma 6.5, and since $\beta<\frac{1}{2}$,
\[
\lim_{t\to\infty} W(\beta t)\, e^{-\frac{\alpha}{2}t}\,\mathbb{E}\!\left[\psi'(\alpha)N^2_{t-\beta t}(O_2) - e^{\alpha(t-\beta t)}\mathcal{E}_2(O_2)\right] = 0. \tag{6.18}
\]
Finally, the relations (6.16), (6.17) and (6.18) lead to
\[
\lim_{t\to\infty} \mathbb{E}_{\beta t}\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)\right]
= \frac{1}{1+\frac{\lambda^2}{2}\left(2-\psi'(\alpha)\right)}.
\]
To conclude, note that

\[
\left|\mathbb{E}_{\beta t}\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)\right]
- \mathbb{E}_\infty\!\left[\exp\left(i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}\right)\right]\right|
= \left|\mathbb{E}\!\left[e^{i\lambda\,\frac{\psi'(\alpha)N_t - e^{\alpha t}\mathcal{E}}{e^{\frac{\alpha}{2}t}}}\left(\frac{\mathbf{1}_{N_{\beta t}>0}}{P(N_{\beta t}>0)} - \frac{\mathbf{1}_{\mathrm{NonEx}}}{P(\mathrm{NonEx})}\right)\right]\right|
\leq \mathbb{E}\!\left[\left|\frac{\mathbf{1}_{N_{\beta t}>0}}{P(N_{\beta t}>0)} - \frac{\mathbf{1}_{\mathrm{NonEx}}}{P(\mathrm{NonEx})}\right|\right]
\]
goes to 0 as $t$ goes to infinity. This ends the proof of Theorem 3.2.
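The limiting characteristic function $1/\!\left(1+\frac{\lambda^2}{2}(2-\psi'(\alpha))\right)$ identified above is that of a centered Laplace distribution with variance $2-\psi'(\alpha)$: a centered Laplace law with scale $s$ has characteristic function $1/(1+s^2\lambda^2)$ and variance $2s^2$. A short numerical sketch of this standard fact (the numerical value standing in for $2-\psi'(\alpha)$ is an arbitrary placeholder):

```python
import math, random

random.seed(7)
v = 1.5                    # placeholder for 2 - psi'(alpha)
s = math.sqrt(v / 2.0)     # Laplace scale, so that the variance is 2*s^2 = v

# A centered Laplace(0, s) variable is the difference of two independent
# exponential variables with mean s.
n = 200_000
xs = [s * (random.expovariate(1.0) - random.expovariate(1.0)) for _ in range(n)]

for lam in (0.5, 1.0, 2.0):
    emp = sum(math.cos(lam * x) for x in xs) / n  # empirical char. function
    print(lam, emp, 1.0 / (1.0 + 0.5 * lam * lam * v))
```

Since the law is symmetric, the characteristic function is real, and the empirical cosine average matches $1/(1+\frac{\lambda^2}{2}v)$ up to Monte Carlo error.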

References

[1] Søren Asmussen. Convergence rates for branching processes. Ann. Probability, 4(1):139–146, 1976.

[2] K. B. Athreya and P. E. Ney. Branching processes. Dover Publications, Inc., Mineola, NY, 2004. Reprint of the 1972 original [Springer, New York; MR0373040].

[3] Krishna Balasundaram Athreya. Limit theorems for multitype continuous time Markov branch- ing processes. II. The case of an arbitrary linear functional. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete, 13:204–214, 1969.

[4] Frank Ball, Miguel González, Rodrigo Martínez, and Maroussia Slavtchova-Bojkova. Stochastic monotonicity and continuity properties of functions defined on Crump-Mode-Jagers branching processes, with application to vaccination in epidemic modelling. Bernoulli, 20(4):2076–2101, 2014.

[5] Nicolas Champagnat and Benoît Henry. Moments of the frequency spectrum of a splitting tree with neutral Poissonian mutations. Electron. J. Probab., 21:34 pp., 2016.

[6] Nicolas Champagnat and Amaury Lambert. Splitting trees with neutral Poissonian mutations I: Small families. Stochastic Process. Appl., 122(3):1003–1033, 2012.

[7] Nicolas Champagnat and Amaury Lambert. Splitting trees with neutral Poissonian mutations II: Largest and oldest families. Stochastic Process. Appl., 123(4):1368–1414, 2013.

[8] Nicolas Champagnat, Amaury Lambert, and Mathieu Richard. Birth and death processes with neutral mutations. Int. J. Stoch. Anal., pages Art. ID 569081, 20, 2012.

[9] William Feller. An introduction to probability theory and its applications. Vol. II. Second edition. John Wiley & Sons, Inc., New York-London-Sydney, 1971.

[10] J. Geiger and G. Kersting. Depth-first search of random trees, and Poisson point processes. In Classical and modern branching processes (Minneapolis, MN, 1994), volume 84 of IMA Vol. Math. Appl., pages 111–126. Springer, New York, 1997.

[11] Jochen Geiger. Size-biased and conditioned random splitting trees. Stochastic Process. Appl., 65(2):187–207, 1996.

[12] Sergei Grishechkin. On a relationship between processor-sharing queues and Crump-Mode- Jagers branching processes. Adv. in Appl. Probab., 24(3):653–698, 1992.

[13] C. C. Heyde. A rate of convergence result for the super-critical Galton-Watson process. J. Appl. Probability, 7:451–454, 1970.

[14] C. C. Heyde. Some central limit analogues for supercritical Galton-Watson processes. J. Appl. Probability, 8:52–59, 1971.

[15] Alexander Iksanov and Matthias Meiners. Rate of convergence in the law of large numbers for supercritical general multi-type branching processes. Stochastic Process. Appl., 125(2):708–738, 2015.

[16] Andreas E. Kyprianou. Fluctuations of Lévy processes with applications. Universitext. Springer, Heidelberg, second edition, 2014. Introductory lectures.

[17] Amaury Lambert. The contour of splitting trees is a Lévy process. Ann. Probab., 38(1):348–395, 2010.

[18] Amaury Lambert, Florian Simatos, and Bert Zwart. Scaling limits via excursion theory: in- terplay between Crump-Mode-Jagers branching processes and processor-sharing queues. Ann. Appl. Probab., 23(6):2357–2381, 2013.

[19] Olle Nerman. On the convergence of supercritical general (C-M-J) branching processes. Z. Wahrsch. Verw. Gebiete, 57(3):365–395, 1981.

[20] Peter Olofsson and Suzanne S. Sindi. A Crump-Mode-Jagers branching process model of prion loss in yeast. J. Appl. Probab., 51(2):453–465, 2014.

[21] Mathieu Richard. Arbres, processus de branchement non markoviens et processus de Lévy. Thèse de doctorat, Université Pierre et Marie Curie, Paris 6.
