Stationary Gamma Processes
Robert L. Wolpert
July 28, 2016 (Draft 2.6)
1 Introduction
For fixed α, β > 0 these notes present six different stationary time series, each with Gamma X_t ∼ Ga(α, β) univariate marginal distributions and autocorrelation function Corr(X_s, X_t) = ρ^|s−t|. Each will be defined on some time index set T, either T = Z or T = R. Five of the six constructions can be applied to other Infinitely Divisible (ID) distributions as well, both continuous ones (normal, α-stable, etc.) and discrete (Poisson, negative binomial, etc.). For the Poisson and Gaussian distributions specifically, all but one of them (the Markov change-point construction) coincide: essentially, there is just one "AR(1)-like" Gaussian process (namely, the AR(1) process in discrete time, or the Ornstein-Uhlenbeck process in continuous time), and there is just one AR(1)-like Poisson process. For other ID distributions, however, and in particular for the Gamma, each of these constructions yields a process with the same univariate marginal distributions and the same autocorrelation but with different joint distributions at three or more times.

First, by "Ga(α, β)" we mean the Gamma distribution with mean α/β and variance α/β^2, i.e., with shape parameter α and rate parameter β. The pdf and chf are given by:

    f(x | α, β) = (β^α / Γ(α)) x^(α−1) e^(−βx) 1_{x>0}

    χ(ω | α, β) = (1 − iω/β)^(−α)
                = exp( ∫_{R_+} (e^(iωu) − 1) α e^(−βu) u^(−1) du ).          (1)

The sum ξ_+ := Σ ξ_j of independent random variables ξ_j ∼ Ga(α_j, β) with the same rate parameter also has a Gamma distribution, ξ_+ ∼ Ga(α_+, β), if {α_j} ⊂ R_+ are summable with sum α_+ := Σ α_j < ∞. Eqn (1) shows that the "Lévy measure" for this distribution is ν(du) = α e^(−βu) u^(−1) 1_{u>0} du. From the Poisson representation of ID distributions
    X = ∫_{R_+} u N(du)

for N(du) ∼ Po(ν(du)) we can show that the extremal properties of ID random variables depend only on the Lévy measure ν(du): for large u ∈ R_+,

    P[X > u] ≈ P[ N((u, ∞)) > 0 ]
             = 1 − exp( −ν((u, ∞)) )
             ≈ ν((u, ∞)) ≈ (α/βu) e^(−βu),                                   (2)

from which one might mount a study of the multivariate extreme properties of the six processes presented below.
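The summation property stated above (independent Gamma variables with a common rate sum to a Gamma variable) is easy to check by simulation. A minimal sketch in Python, with illustrative parameter values not taken from these notes:

```python
import random

random.seed(7)

# Illustrative check (parameter values assumed): the sum of independent
# Ga(alpha_j, beta) draws with a common rate beta should again be Gamma,
# Ga(alpha_plus, beta) with alpha_plus = sum of the alpha_j.
# random.gammavariate takes (shape, scale), so rate beta -> scale 1/beta.
alphas = [0.5, 1.2, 2.3]     # shape parameters alpha_j
beta = 2.0                   # common rate parameter
alpha_plus = sum(alphas)     # shape of the sum, here 4.0

n = 200_000
sums = [sum(random.gammavariate(a, 1.0 / beta) for a in alphas)
        for _ in range(n)]

mean = sum(sums) / n                          # target: alpha_plus / beta
var = sum((x - mean) ** 2 for x in sums) / n  # target: alpha_plus / beta**2
```

The empirical mean and variance of the sums should match α_+/β and α_+/β^2 up to Monte Carlo error.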
2 Six Stationary Gamma Processes
2.1 The Gamma AR(1) Process

Fix α, β > 0 and 0 ≤ ρ < 1. Let X_0 ∼ Ga(α, β) and for t ∈ N define X_t recursively by
    X_t := ρ X_{t−1} + ζ_t                                                   (3)
for iid {ζ_t} with chf

    E e^(iωζ_t) = (1 − iω/β)^(−α) (1 − iρω/β)^α = ( (β − iω)/(β − iρω) )^(−α)

(easily seen to be positive-definite, with Lévy measure ν_ζ(du) = α (e^(−βu) − e^(−βu/ρ)) u^(−1) 1_{u>0} du). A simple way of generating ζ_t with this distribution is presented in the Appendix. The process {X_t} has Gamma univariate marginal distribution X_t ∼ Ga(α, β) for every t ∈ N and, at consecutive times 0, 1, joint chf
    χ(s, t) = E exp(i s X_0 + i t X_1)
            = E exp( i(s + ρt) X_0 + i t ζ_1 )
            = [ (1 − i(s+ρt)/β)(1 − it/β) / (1 − itρ/β) ]^(−α).              (4)

Since this is asymmetric in s, t, the process is not time-reversible; this is also evident from the observation that
    P[X_t ≥ ρ X_{t−1}] = P[ζ_t ≥ 0] = 1 > P[ζ_t ≤ X_{t−1}(1 − ρ^2)/ρ] = P[X_{t−1} ≥ ρ X_t].
This process has marginal distribution X_t ∼ Ga(α, β) at all times t, and has autocorrelation Corr(X_s, X_t) = ρ^|s−t| (easily found from either (3) or (4)). It is clearly Markov (from (3)), and can be shown to have infinitely-divisible (ID) multivariate marginal distributions of all orders, since the {ζ_t} are ID.
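The Appendix's sampler for ζ_t is not reproduced in this excerpt; the sketch below instead draws ζ_t from a compound-Poisson representation read off the Lévy measure ν_ζ above (jump count N ∼ Po(α log(1/ρ)), jump sizes distributed as ρ^U times an Exp(β) variable with U ∼ Unif(0,1)). That representation, and the parameter values, are assumptions made here for illustration:

```python
import math
import random

random.seed(11)

alpha, beta, rho = 2.0, 1.0, 0.5   # illustrative parameter choices

def draw_poisson(lam):
    """Poisson(lam) sampler via unit-rate exponential inter-arrival times."""
    total, k = random.expovariate(1.0), 0
    while total < lam:
        total += random.expovariate(1.0)
        k += 1
    return k

def innovation():
    """One draw of zeta_t: compound Poisson with N ~ Po(alpha*log(1/rho))
    jumps, each distributed as rho**U * Exp(beta) (assumed representation)."""
    n_jumps = draw_poisson(alpha * math.log(1.0 / rho))
    return sum(rho ** random.random() * random.expovariate(beta)
               for _ in range(n_jumps))

T = 50_000
x = [random.gammavariate(alpha, 1.0 / beta)]   # X_0 ~ Ga(alpha, beta)
for _ in range(T):
    x.append(rho * x[-1] + innovation())       # recursion (3)

m = sum(x) / len(x)                            # target: alpha/beta
num = sum((x[t] - m) * (x[t + 1] - m) for t in range(len(x) - 1))
den = sum((v - m) ** 2 for v in x)
lag1 = num / den                               # target: rho
```

The sample mean and lag-1 autocorrelation should be close to α/β and ρ respectively, consistent with the stationarity claims above.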
2.2 Thinned Gamma Process
If X ∼ Ga(α_1, β) and Y ∼ Ga(α_2, β) are independent, then Z := X + Y ∼ Ga(α_+, β) and U := X/Z ∼ Be(α_1, α_2) are also independent (where α_+ := α_1 + α_2), because under the change of variables x = uz, y = (1 − u)z with Jacobian J(u, z) = z,

    f(u, z) = (β^(α_1)/Γ(α_1)) x^(α_1−1) e^(−βx) · (β^(α_2)/Γ(α_2)) y^(α_2−1) e^(−βy) · J(u, z)
            = (β^(α_+)/(Γ(α_1)Γ(α_2))) u^(α_1−1) (1 − u)^(α_2−1) z^(α_+−2) e^(−βz) z
            = [ (Γ(α_+)/(Γ(α_1)Γ(α_2))) u^(α_1−1) (1 − u)^(α_2−1) ] [ (β^(α_+)/Γ(α_+)) z^(α_+−1) e^(−βz) ].
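This Beta-Gamma factorization can be checked by Monte Carlo; a minimal sketch, with illustrative parameter values:

```python
import random

random.seed(3)

# Check (illustrative parameters, not from the notes): with independent
# X ~ Ga(a1, beta) and Y ~ Ga(a2, beta), the ratio U = X/(X+Y) ~ Be(a1, a2)
# and the sum Z = X+Y ~ Ga(a1+a2, beta) should be independent.
a1, a2, beta = 1.5, 2.5, 3.0
n = 200_000

us, zs = [], []
for _ in range(n):
    x = random.gammavariate(a1, 1.0 / beta)   # rate beta -> scale 1/beta
    y = random.gammavariate(a2, 1.0 / beta)
    z = x + y
    us.append(x / z)
    zs.append(z)

mu_u = sum(us) / n   # target: a1/(a1+a2), the Be(a1, a2) mean
mu_z = sum(zs) / n   # target: (a1+a2)/beta, the Ga(a1+a2, beta) mean
cov_uz = sum((u - mu_u) * (z - mu_z)
             for u, z in zip(us, zs)) / n     # target: 0 (independence)
```

Zero empirical covariance is of course only a necessary condition for independence, but it is the feature of the factorization that the thinning construction below exploits.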
Thus if Z ∼ Ga(α_1 + α_2, β) and U ∼ Be(α_1, α_2) are independent then X = UZ ∼ Ga(α_1, β) and Y = (1 − U)Z ∼ Ga(α_2, β) are independent too. Let

    X_0 ∼ Ga(α, β)

and, for t ∈ N (or t ∈ −N, resp.) set
    X_t := ξ_t + ζ_t

where
    ξ_t := B_t · X_{t−1},   B_t ∼ Be(αρ, αρ̄),   ζ_t ∼ Ga(αρ̄, β)

(or ξ_t = B_t · X_{t+1}, resp.), where ρ̄ := (1 − ρ) and all the {B_t} and {ζ_t} are independent.
Then, by induction, ξ_t ∼ Ga(αρ, β) and ζ_t ∼ Ga(αρ̄, β) are independent, with sum X_t ∼ Ga(α, β). Thus {X_t} is a Markov process with Gamma univariate marginal distribution X_t ∼ Ga(α, β), now with symmetric joint chf
    χ(s, t) = E exp(i s X_0 + i t X_1)
            = E exp{ i s (X_0 − ξ_1) + i (s + t) ξ_1 + i t ζ_1 }
            = (1 − is/β)^(−αρ̄) (1 − i(s+t)/β)^(−αρ) (1 − it/β)^(−αρ̄)         (5)

and autocorrelation Corr(X_s, X_t) = ρ^|s−t|. The process of passing from X_{t−1} to ξ_t = X_{t−1} B_t is called thinning, so {X_t} is called the thinned gamma process. A similar construction is available for any ID marginal distribution.
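The thinning recursion simulates directly with standard library samplers; a minimal sketch, with illustrative parameter values:

```python
import random

random.seed(5)

alpha, beta, rho = 2.0, 1.0, 0.5   # illustrative parameter choices
rbar = 1.0 - rho
T = 50_000

x = [random.gammavariate(alpha, 1.0 / beta)]   # X_0 ~ Ga(alpha, beta)
for _ in range(T):
    b = random.betavariate(alpha * rho, alpha * rbar)     # B_t, thinning fraction
    zeta = random.gammavariate(alpha * rbar, 1.0 / beta)  # fresh Ga(alpha*rbar, beta)
    x.append(b * x[-1] + zeta)                 # X_t = xi_t + zeta_t

m = sum(x) / len(x)                            # target: alpha/beta
num = sum((x[t] - m) * (x[t + 1] - m) for t in range(len(x) - 1))
den = sum((v - m) ** 2 for v in x)
lag1 = num / den                               # target: rho
```

Since E B_t = αρ/α = ρ, the sample lag-1 autocorrelation should again come out near ρ, matching the Gamma AR(1) process even though the two constructions differ at three or more times.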
2.3 Random Measure Gamma Process

Let G(dx dy) be a countably additive random measure that assigns independent random variables G(A_i) ∼ Ga(α|A_i|, β) to disjoint Borel sets A_i ∈ B(R^2) of finite area |A_i| (this is possible by the Kolmogorov consistency conditions, and is illustrated in the Appendix) and, for λ := −log ρ, consider the collection of sets

    G_t := { (x, y) : x ∈ R, 0 ≤ y < λ e^(−2λ|t−x|) }

shown in Figure 1, whose pairwise intersections have area |G_s ∩ G_t| = e^(−λ|s−t|). For t ∈ T = R, set
[Figure 1: Random measure construction of the process X_t = G(G_t).]

    X_t := G(G_t).                                                           (6)

For any n times t_1 < t_2 < ··· < t_n the sets {G_{t_i}} partition R^2 into n(n+1)/2 sets of finite area (and one with infinite area, (∪ G_{t_i})^c), so each X_{t_i} can be written as the sum of some subset of n(n+1)/2 independent Gamma random variables. In particular, any n = 2 variables X_s and X_t can be written as

    X_s = G(G_s \ G_t) + G(G_s ∩ G_t),   X_t = G(G_t \ G_s) + G(G_s ∩ G_t)

just as in the thinning approach of Section 2.2, so both 1-dimensional and 2-dimensional marginal distributions for the random measure process coincide with those for the thinning process. Again the joint chf is
    χ(s, t) = E exp(i s X_0 + i t X_1)
            = (1 − is/β)^(−αρ̄) (1 − i(s+t)/β)^(−αρ) (1 − it/β)^(−αρ̄)         (7)

and the autocorrelation is Corr(X_s, X_t) = exp(−λ|s−t|) or, for integer times, ρ^|s−t| for ρ := exp(−λ). The distribution of consecutive triplets differs from that of the Thinned Gamma Process, however, an illustration that the thinned process is Markov but the random measure process is not. The Random Measure process does feature infinitely-divisible (ID) marginal distributions of all orders, while the thinned process does not.
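The intersection-area identity |G_s ∩ G_t| = e^(−λ|s−t|), which drives the autocorrelation above, can be verified numerically; a minimal sketch, with parameter values chosen for illustration:

```python
import math

# Numerical check of |G_s intersect G_t| = exp(-lam*|s-t|) (values assumed):
lam = math.log(2.0)    # lam = -log(rho) with rho = 1/2
s, t = 0.0, 1.5

def height(c, x):
    """Upper boundary of the set G_c at horizontal position x."""
    return lam * math.exp(-2.0 * lam * abs(c - x))

# midpoint-rule approximation of the intersection area
lo, hi, n = -30.0, 30.0, 200_000
dx = (hi - lo) / n
area = sum(min(height(s, lo + (k + 0.5) * dx),
               height(t, lo + (k + 0.5) * dx)) * dx for k in range(n))

exact = math.exp(-lam * abs(s - t))   # claimed closed form
```

The truncation to [−30, 30] is harmless here since the boundaries decay exponentially away from s and t.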
2.4 The Markov change-point Gamma Process
Let {ζ_n : n ∈ Z} be iid Ga(α, β) Gamma random variables and let N_t be a standard Poisson process indexed by t ∈ R (so N_0 = 0 and (N_t − N_s) ∼ Po(t − s) for all s < t), and set
    X_t := ζ_n,   n = N_{λt}.
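On the integer-time skeleton this construction is easy to simulate: between consecutive integers the Poisson clock N_{λt} makes zero jumps with probability ρ = e^(−λ), in which case the value is held; otherwise X_t is a fresh, independent Gamma draw. A minimal sketch, with parameter values assumed for illustration:

```python
import random

random.seed(13)

alpha, beta, rho = 2.0, 1.0, 0.6   # rho = exp(-lam), illustrative values
T = 50_000

x = [random.gammavariate(alpha, 1.0 / beta)]   # X_0 ~ Ga(alpha, beta)
for _ in range(T):
    if random.random() < rho:      # no Poisson event in (lam*t, lam*(t+1)]
        x.append(x[-1])            # hold the current zeta
    else:                          # at least one event: fresh, independent zeta
        x.append(random.gammavariate(alpha, 1.0 / beta))

mean = sum(x) / len(x)                              # target: alpha/beta
hold = sum(x[t + 1] == x[t] for t in range(T)) / T  # target: rho
```

The empirical hold frequency estimates ρ, and the sample mean estimates α/β; ties between independent continuous draws occur with probability zero, so exact equality reliably flags held values.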
Then each X_t ∼ Ga(α, β) and, for s, t ∈ R, X_s and X_t are either identical (with probability ρ^|s−t|) or independent, reminiscent of a Metropolis MCMC chain. The chf is
    χ(s, t) = E exp(i s X_0 + i t X_1)
            = ρ (1 − i(s+t)/β)^(−α) + ρ̄ (1 − is/β)^(−α) (1 − it/β)^(−α)      (8)
and once again the marginal distribution is X_t ∼ Ga(α, β) and the autocorrelation function is Corr(X_s, X_t) = ρ^|s−t|.

2.5 The Squared O-U Gamma Diffusion
Let {Z_i} ∼ iid OU(λ/2, 1) be independent Ornstein-Uhlenbeck velocity processes, i.e., mean-zero Gaussian processes with covariance Cov(Z_i(s), Z_j(t)) = exp(−(λ/2)|s−t|) δ_ij, and set

    X_t := (1/2β) Σ_{i=1}^n Z_i(t)^2

for n ∈ N and β ∈ R_+. Note Z_i(t) ∼ No(0, 1), so E Z_i(t)^2 = 1 and E Z_i(t)^4 = 3; it follows that E Z_i(s)^2 Z_i(t)^2 = 1 + 2 exp(−λ|s−t|). Then X_t ∼ Ga(α, β) for α = n/2, with E X_t = α/β and

    E X_s X_t = (1/4β^2) [ n(1 + 2 exp(−λ|s−t|)) + n(n − 1) ]
              = ( α^2 + α exp(−λ|s−t|) ) / β^2,

so the autocovariance is Cov(X_s, X_t) = (α/β^2) e^(−λ|s−t|) and the autocorrelation at integer times is Corr(X_s, X_t) = ρ^|s−t| for ρ := exp(−λ). The chf at consecutive integer times is