Probability Theory and Related Fields manuscript No. (will be inserted by the editor)

Exchangeable graph-valued Feller processes

Harry Crane

Received: date / Accepted: date

Abstract The transition law of every exchangeable Feller process on the space of countable graphs is determined by a σ-finite measure on the space of 0, 1 0, 1 -valued arrays. In discrete time, this characterization gives rise to{ a} construction × { } from an independent, identically distributed sequence of exchangeable random functions. In continuous time, the behavior is enriched by a Levy–It´ o–Khintchine-typeˆ decomposition of the jump measure into mu- tually singular components that govern global, vertex-level, and edge-level dynamics. Every such process almost surely projects to a Feller process in the space of graph limits. Keywords exchangeability graph limit Levy–It´ o–ˆ Khintchine decomposition · Aldous–Hoover· theorem Rado· graph Erdos–R˝ enyi´ random graph· · · Mathematics Subject Classification (2000) 05C80 60G09 60J05 60J25 · · ·

1 Introduction

A graph, or network, G = (V, EG) is a set of vertices V and a binary relation of ij edges EG V V. For i, j V, we write G to indicate the status of edge (i, j), ⊆ × ∈ that is, ( ij 1, (i, j) EG, G := 1 (i, j) EG := ∈ { ∈ } 0, otherwise. ij Both the pair G = (V, EG) and the array G = (G )i,j V represent the same object. ∈ We write V to denote the space of graphs with vertex set V. ConsiderG a population of individuals labeled distinctly in V and related to one another by the edges of a graph G = (V, EG). The population size is typi- cally large, but unknown, and so we assume a countably infinite population

Rutgers University, Department of & Biostatistics, 110 Frelinghuysen Road, Piscataway, NJ 08854. [email protected] 2 Harry Crane

and take V = N := 1, 2,... . In practice, the labels N are arbitrarily assigned for the purpose of distinguishing{ } individuals, and data can only be observed for a finite sample S N. Since labels are assigned arbitrarily, we often take S = [n]:= 1,..., n whenever⊂ S N is finite with cardinality n 1. { } ⊆ ≥ These practical matters relate to the operations of ij – relabeling: the relabeling of G = (G )i,j 1 by any permutation σ : N N is ≥ → σ σ(i)σ(j) G := (G )i,j 1, and (1) ≥ ij – restriction: the restriction of G = (G )i,j 1 to a graph with vertex set S N is ≥ ⊆ S ij G = G S := (G )i,j S. (2) | ∈ In many relevant applications, the network structure changes over time, re- sulting in a time-indexed collection (Gt)t T of graphs. We consider exchange- able Markov processes as statistical models∈ in this setting.

1.1 Graph-valued Markov processes

A V-valued Markov process is a collection Γ = Γ : G V for which each G { G ∈ G } ΓG = (Γt)t T is a family of random graphs that has Γ0 = G and ∈ – the , that is, the past (Γs)st are condition- ally independent given the present Γt for all t T. ∈ We also assume Γ is

– exchangeable, that is, all ΓG share a common transition law for which

σ σ P Γt Γt = F = P Γ Γ = F , (3) { 0 ∈ · | } { t0 ∈ · | t }

for all permutations σ : V V, all F V, and all t0 > t 0, and – consistent, that is, for all S →V the transition∈ G law of Γ satisfies≥ ⊆ S S P Γ = F Γt = F0 = P Γ = F Γt = F00 , F S, (4) { t0 | } { t0 | } ∈ G

for all F0, F00 V such that F0 S = F00 S. ∈ G | | S Consistency implies that Γ = (Γt S)t T satisfies the Markov property for every G | ∈ S V, for all G V. We call any Γ satisfying these properties an exchangeable, consistent⊆ Markov∈ G process. In Proposition 1, we prove that consistency and the Feller property are equivalent for exchangeable Markov processes on N, so we sometimes also call Γ an exchangeable Feller process. G Any process Γ = ΓG : G V determines a random collection (Γt)t T with a given transition{ law and∈ initial G } distribution ν by first drawing Γ ∈ν 0 ∼ and then putting Γν = ΓG on the event Γ0 = G. With the underlying process Γ = ΓG : G V understood, we write Γν = (Γt)t T to denote a collection with{ this behavior.∈ G } ∈ Exchangeable graph-valued Feller processes 3

Theorems 1-5 below characterize the behavior of exchangeable N-valued Feller processes in both discrete and continuous time. In discrete time,G every exchangeable Feller process can be constructed by an iterated application of independent, identically distributed (i.i.d.) exchangeable random functions, called rewiring maps; see Section 4.5. In continuous time, Γ admits a con- struction from a Poisson whose intensity measure has a Levy–´ Ito–Khintchine-typeˆ decomposition. The Levy–It´ o–Khintchineˆ representation classifies every discontinuity of Γ as one of three types. When ν is an exchangeable initial distribution on N, both discrete and G continuous time processes Γν project to a Feller process in the space of graph limits. These outcomes invoke connections to previous work on the theory combinatorial stochastic processes [4,13], partially exchangeable arrays [1,2, 8], and limits of dense graph sequences [11,12].

1.2 Outline

Before summarizing our conclusions (Section 3), we first go over key defini- tions and assumptions (Section 2). We then explain essential concepts in more detail (Section 4) and prove our main theorems for discrete time (Section 5) and continuous time (Section 6) processes. We make concluding remarks in Section 7.

2 Definitions and assumptions

2.1 Graphs

For any graph G = (V, EG), we impose the additional axioms of

(i) anti-reflexivity,(i, i) < EG for all i V, and ∈ (ii) symmetry,(i, j) EG implies (j, i) EG for all i, j V, ∈ ∈ ∈ and we specialize V to denote the set of all graphs satisfying (i) and (ii). G By the symmetry axiom (ii), we may write ij = i, j EG to indicate that there is an edge between i and j in G. By definition,{G has} ∈ no multiple edges, condition (i) forbids edges from a vertex to itself, and condition (ii) makes ij G undirected. In terms of the adjacency array G = (G )i,j V, (i) and (ii) above correspond to ∈ (i’) Gii = 0 for all i V and (ii’) Gij = Gji for all∈i, j V, ∈ respectively.As we discuss in Section 7, our main theorems remain valid under some relaxation of these conditions. The above two notions of a graph— as a pair (V, EG) and as an adjacency array—are equivalent. We use them interchangeably and with the same notation. With SV denoting the space of permutations of V N, that is, bijective ⊆ functions σ : V V, relabeling (1) associates every σ SV to a map V V. → ∈ G → G 4 Harry Crane

S For all S S0 V, restriction (2) determines a map S0 S, G G = G S. ⊆ ⊆ ij G → G 7→ | Specifically, for n m 1, G [m] = (G )1 i,j m is the leading m m submatrix of ij ≥ ≥ | ≤ ≤ × G = (G )1 i,j n. By combining (1) and (2), every injection φ : S S0 determines ≤ ≤ → a projection S S by G 0 → G φ φ(i)φ(j) G G := (G )i,j S. (5) 7→ ∈ We call a sequence of finite graphs (Gn)n N compatible if Gn [n] and ∈ ∈ G Gn [m] = Gm for every m n, for all n N. Any compatible sequence of finite| graphs determines a≤ unique countable∈ graph G , the projective limit of ∞ (G1, G2,...) with respect to the restriction operation in(2). We endow N with the product discrete topology induced, for example, by the ultrametricG

d(G, G0):= 1/ sup n N : G n = G0 n , G, G0 N, (6) { ∈ |[ ] |[ ]} ∈ G with the convention that 1/ = 0. From (6), we naturally equip N with the ∞ G Borel σ-field σ [n] n N generated by the restriction maps. Under (6), N is h ·| i ∈ G a compact and, therefore, complete and separable metric space; hence, N is Polish and standard measure theoretic facts apply in our analysis. G

2.2 Graph limits

For n m 1, F m , and G n , we define the density of F in G by ≥ ≥ ∈ G[ ] ∈ G[ ] m δ(F, G):= ind(F, G)/n↓ , (7)

m where ind(F, G) is the number of embeddings of F into G and n↓ := n(n 1) (n m + 1). Specifically, − ··· − X ind(F, G):= 1 Gφ = F { } φ:[m] [n] → is the number of injections φ :[m] [n] for which Gφ = F. → S Given a sequence (Gn)n N in ∗ := n N [n], we define the limiting density ∈ G ∈ G of F in (Gn)n N by ∈

δ(F, (Gn)n N ):= lim δ(F, Gn), if it exists. ∈ n →∞ For G N, we define the limiting density of F in G by ∈ G δ(F, G):= lim δ(F, G [n]), if it exists. (8) n →∞ | Definition 1 (Graph limit) The graph limit of G N is the collection ∈ G S G := (δ(F, G))F m N [m] , | | ∈ ∈ G S provided δ(F, G) exists for every F m N [m]. If δ(F, G) does not exist for some F, ∈ ∈ G then we put G := ∂. We write to denote the closure of G : G N ∂ in S ∗ m N [m]| | D {| | ∈ G }\{ } [0, 1] ∈ G . Exchangeable graph-valued Feller processes 5

If the graph limit of G exists, then G determines a family of exchangeable probability distributions on ( [n])n N by G ∈ P Γn = F G = δ(F, G), F n , n N . (9) { | } ∈ G[ ] ∈

Moreover, the distributions determined by (δ(F, G))F [m] and (δ(F, G))F [n] , m n, are mutually consistent, that is, the distribution in∈G (9) satisfies ∈G ≤ X P Γn m = F0 G = δ(F, G) = δ(F0, G), for every F0 m . { |[ ] | } ∈ G[ ] F n : F m =F ∈G[ ] |[ ] 0

Thus, every D ∗ determines a unique probability measure on N, written ∈ D σ G γD. In addition, γD is exchangeable in that Γ γD satisfies Γ = Γ for all σ SN, where = denotes equality in law. ∼ L ∈ L

Remark 1 In our notation, D ∗ and γD correspond to the same object, but we ∈ D use the former to emphasize that D is the graph limit of some G N and the latter ∈ G to specifically refer to the probability distribution that D determines on N. As an S m N [m] G element of ∗,D = (DF)F resides in [0, 1] ∈ G . The connection between D and D ∈G∗ γD is made explicit by

DF = D(F) = γD( G N : G n = F ), F n , n N . { ∈ G |[ ] } ∈ G[ ] ∈ Definition 2 (Dissociated random graphs) A random graph Γ is dissociated if

Γ S and Γ S are independent for all disjoint subsets S, S0 N . (10) | | 0 ⊆ A probability measure ν on N is dissociated if Γ ν is a dissociated random graph. G ∼ Dissociated measures play a central role in the theory of exchangeable random graphs. The measure γD determined by any D ∗ is dissociated, and conversely the Aldous–Hoover theorem (Theorem∈ 6) D states that every exchangeable probability measure on N is a mixture of exchangeable, disso- ciated probability measures. In particular,G to every exchangeable probability measure ν on N there exists a unique probability measure ∆ on ∗ such that ν = γ , whereG D ∆ Z γ ( ):= γD( )∆(dD) (11) ∆ · · D∗ is the mixture of γD measures with respect to ∆.

2.3 Notation

We use the capital Roman letter G to denote a generic graph, capital Greek letter Γ to denote a random graph, bold Greek letter with subscript Γ to denote a collection of random graphs with initial condition , and bold Greek• letter Γ = Γ to denote a graph-valued process indexed by• the initial conditions . { •} • Superscripts index edges and subscripts index time; therefore, for (Γt)t T, ∈ ij S S we write Γ to indicate the status of edge ij at time t T and Γ = (Γ )t T t ∈ T0 t ∈ 0 6 Harry Crane to denote the trajectory of the edges ij S N over the set of times in ⊆ ⊆ T0 T. We adopt the same notation for processes Γ = ΓG : G N , with S ⊆ S { ∈ G } Γ := Γ : G N denoting the restriction of Γ to edges ij S and times T0 { G,T0 ∈ G } ⊆ T0 T. ⊆ To distinguish between discrete time (T = Z+ := 0, 1, 2,... ) and continu- ous time (T = R := [0, )) processes, we index time{ by m and}t, respectively; + ∞ thus, (Γm)m 0 denotes a discrete time process and (Γt)t 0 denotes a continuous time process.≥ To further distinguish these cases, we≥ call Γ a if its constituents are indexed by discrete time and a Markov process if its con- stituents are indexed by continuous time. Whenever a discussion pertains to both discrete and continuous time, we employ terminology and notation for the continuous time case.

3 Summary of main theorems

We now state our main theorems, saving many technical details for later. Of primary importance are the rewiring maps and rewiring limits, which we introduce briefly in Section 3.1 and develop formally in Section 4.5.

3.1 Discrete time Feller chains

ij ii Let W = (W )i,j V be a symmetric 0, 1 0, 1 -valued array with W = (0, 1) for all i V. For∈ i, j V, we express{ } × the { (i}, j) component of W as a pair ij ∈ij ij ∈ W = (W , W ). For any such W and any G V we define the rewiring of G 0 1 ∈ G by W by G0 = W(G), where

 ij ij ij  W0 , G = 0 G0 := , i, j V. (12)  ij ij W1 , G = 1 ∈ Note that W acts on G by reconfiguring its adjacency array at each entry: if ij ij ij ij ij G = 0, then G0 is taken from W0 ; otherwise, G0 is taken from W1 . Thus, we can regard W as a function V V, called a rewiring map. Lemma 1 records some basic facts about this operation.G → G The acts of relabeling and restriction for rewiring maps are identical to (1) and (2) for graphs, that is, for every σ SV and S V, we define ∈ ⊆ σ σ(i)σ(j) W := (W )i,j V and ∈ S ij W = W S := (W )i,j S. | ∈ We write V to denote the collection of rewiring maps V V. W G → G Given a probability measure ω on N, we construct a discrete time N- W G valued Markov chain Γ∗ = Γ∗ : G N as follows. We first generate ω { G,ω ∈ G } W1, W2,... i.i.d. according to ω. For G N, we define Γ∗ = (Γm∗ )m 0 by ∈ G G,ω ≥ putting Γ0∗ = G and

Γm∗ := Wm(Γm∗ 1) = (Wm W1)(G), m 1, (13) − ◦ · · · ◦ ≥ Exchangeable graph-valued Feller processes 7

where W W0 indicates the composition of W and W0 as maps N N. ◦ G → G From the definition of rewiring map in (12), Γω∗ in (13) is a consistent ij σ(i)σ(j) Markov chain. If ω is also exchangeable, that is, W ω satisfies (W )i,j 1 = (W )i,j 1 ∼ ≥ L ≥ for all permutations σ : N N, then Γω∗ is also exchangeable. Theorem 1 es- tablishes the converse: to every→ discrete time exchangeable, consistent Markov chain Γ = Γ : G N there corresponds a probability measure ω on N { G ∈ G } W such that Γ = Γω∗ , that is, ΓG = ΓG∗ ,ω for all G N. By appealingL to the theoryL of partially exchangeable∈ G arrays, we further characterize ω uniquely in terms of a measure Υ on the space of rewiring limits. We write ω = ΩΥ to denote that ω is directed by Υ. We define all these concepts formally in Section 4.5.

Theorem 1 (Discrete time characterization) Let Γ = Γ : G N be a discrete { G ∈ G } time, exchangeable, consistent Markov chain on N. Then there exists a probability G measure Υ such that Γ = Γ∗ , where Γ∗ is constructed as in (13) from W1, W2,... L Υ Υ independent and identically distributed according to ω = ΩΥ.

Our proof of Theorem 1 relies on an extension (Theorem 7) of the Aldous– Hoover theorem (Theorem 6) to the case of exchangeable and consistent Markov chains on 0, 1 -valued arrays. See Section 5 for the complete ar- gument. { }

3.1.1 Projection into the space of graph limits

Any exchangeable random graph Γ projects almost surely to a graph limit Γ = | | D, which corresponds to a random element γD in the space of exchangeable, dissociated probability measures on N. Likewise, any exchangeable rewiring G map W N projects to a rewiring limit W = υ, which corresponds to an ∈ W | | exchangeable, dissociated probability measure Ω on N. We write ∗ and υ W D ∗ to denote the spaces of graph limits and rewiring limits, respectively. V We delay more formal details about rewiring limits until Section 4.5. For now, we settle for an intuitive understanding by analog to graph limits. Recall the definition of γ∆ in (11) for a probability measure ∆ on ∗, that is, Γ γ∆ is a random graph obtained by first sampling D ∆ and, givenD D, drawing∼ Γ ∼ from γD. Similarly, for a probability measure Υ on ∗, W ΩΥ is a random rewiring map obtained by first sampling υ Υ and,V given∼ υ, drawing W from Ω . In particular, W Ω implies W ∼Υ. Just as for graph limits, we υ ∼ Υ | | ∼ regard υ ∗ and Ω as the same object, but use the former to refer to a ∈ V υ rewiring limit of some W N and the latter to refer to the corresponding ∈ W exchangeable, dissociated probability distribution on N. W From the construction of Γ∗ in (13), every probability measure ω on N ω W determines a transition probability P ( , ) on N by ω · · G P (G, ):= ω( W N : W(G) ), G N, (14) ω · { ∈ W ∈ ·} ∈ G

and thus every υ ∗ determines a transition probability, denoted P ( , ), ∈ V υ · · by taking ω = Ω as in (14). Consequently, every υ ∗ acts on ∗ by υ ∈ V D 8 Harry Crane

composition of probability measures,

Z D Dυ( ):= Pυ(G, )γD(dG). (15) 7→ · N · G

In other words, Dυ is the probability measure of Γ0 obtained by first generating Γ γD and, given Γ = G, taking Γ0 = W(G), where W Ωυ is a random rewiring∼ map. ∼ Alternatively, we can express D ∗ as a countable vector (D(F))F with ∈ D ∈G∗

D(F):= γD( G N : G n = F ), F n , n N . { ∈ G |[ ] } ∈ G[ ] ∈

Moreover, every υ ∗ gives rise to an infinite by infinite array (υ(F, F0))F,F0 ∗ with ∈ V ∈G ( Ωυ( W N : W [n](F) = F0 ), F, F0 [n] for some n N, υ(F, F0):= { ∈ W 0, | } ∈ Gotherwise. ∈

In this way, (υ(F, F0))F,F has finitely many nonzero entries in each row and 0∈G∗ the definition of Dυ = (D0(F))F by ∈G∗ X D0(F):= D(F0)υ(F0, F), F n , n N, (16) ∈ G[ ] ∈ F n 0∈G[ ]

coincides with (15). (Note that the array (υ(F, F0))F,F does not determine υ, 0∈G∗ but its action on ∗ through (16) agrees with the action of υ in (15).) D For a graph-valued process Γ = ΓG : G N , we write Γ = ΓD : D { ∈ G } D∗ { ∈ ∗ to denote the process derived from Γ by defining the law of ΓD = (Γt)t T D } ∈ as that of the collection of random graphs obtained by taking Γ γD and, 0 ∼ given Γ = G, putting Γ = Γ . Provided Γt exists for all t T, we define 0 D G | | ∈ ΓD := ( Γt )t T as the collection of graph limits at all times of ΓD. If ΓD exists | | | | ∈ | | for all D ∗, we write Γ := ΓD : D ∗ to denote the associated ∈ D | D∗ | {| | ∈ D } ∗-valued process. D The operation in (17) below is the iterated application of (15).

Theorem 2 (Graph limit chain) Let Γ = Γ : G N be a discrete time, { G ∈ G } exchangeable Feller chain on N. Then Γ exists almost surely and is a Feller G | D∗ | chain on ∗. Moreover, ΓD = D∗ for every D ∗, where D∗ := (Dm∗ )m 0 D | | L D,Υ ∈ D D,Υ ≥ satisfies D0∗ = D and

Dm∗ := Dm∗ 1Ym = DY1 Ym, m 1, (17) − ··· ≥

with Y1, Y2,... i.i.d. from Υ, the probability measure associated to Γ by Theorem 1. Exchangeable graph-valued Feller processes 9

3.2 Continuous time Feller processes

The continuous time analog to the representative Markov chain ΓΥ∗ in discrete time, by Equation (13) and Theorem 1, is a representative Markov process Γ∗ = Γ∗ : G N constructed from a W = (t, Wt) ω { G,ω ∈ G } { } ⊂ [0, ) N with intensity dt ω. Here dt denotes Lebesgue measure on [0, ) ∞ ×W ⊗ ∞ and ω is a measure on N satisfying W

ω( IdN ) = 0 and ω( W N : W n , Id n ) < for every n N, (18) { } { ∈ W |[ ] [ ]} ∞ ∈

where IdV is the identity V V, V N. G → G ⊆ [n] [n] We construct each Γ∗ through its finite restrictions Γ∗ := (Γ∗ )t 0 to G,ω G,ω t ≥ [n] n for every n 1 by first putting Γ∗ := G n and then defining G[ ] ≥ 0 |[ ] [n] [n] Γt∗ := Wt [n](Γt∗ ), if t > 0 is an atom time for which Wt [n] , Id[n], and • [n] [n|] − | Γt∗ := Γ∗t otherwise. • − (19) [n] [n] [n] Above we have written Γ∗ := lims t Γs∗ to denote the state of Γ∗ in the t ↑ G,ω instant before time t > 0. − [n] ∗ By the righthand side of (18), every ΓG,ω has cadl` ag` paths and is a Markov [n] chain on [n]. Moreover, if ω is exchangeable (Definition 4), then so is Γ∗ω . By G [m] [n] ∗ ∗ construction, ΓG,ω is the restriction to [m] of ΓG,ω for every m n and every [n] [n] G ≤ G N. In particular, Γ∗ = Γ∗ for all G, G0 N such that G [n] = G0 [n]. G,ω G0,ω ∈ G [n] ∈ G | | Thus, (Γ∗ )n N determines a unique collection Γ∗ = (Γ∗)t 0 in N for every G,ω ∈ G,ω t ≥ G G N. We define the process Γω∗ = ΓG∗ ,ω : G N as the family of all such collections∈ G generated from the same Poisson{ point∈ G process} W.

Theorem 3 (Poissonian construction) Let Γ = Γ : G N be an exchange- { G ∈ G } able Feller process on N. Then there exists an exchangeable measure ω satisfying G (18) such that Γ = Γω∗ , for Γω∗ as constructed in (19). L

3.2.1 L´evy–Itˆo–Khintchinerepresentation

Our main theorems reach beyond Theorem 3 by decomposing the character- istic measure ω in terms of the types of discontinuities it permits in Γ. We call this the Levy–It´ o–Khintchineˆ representation of ω because of its close re- semblance to the Levy–It´ o–Khintchineˆ characterization of Levy´ processes; see Remark 3. Theorem 4 below shows that if s > 0 is a discontinuity time for ΓG = (Γt)t 0, then either ≥ ij ij (I) there is a unique edge ij for which Γs , Γs , − ij (II) there is a unique vertex i 1 for which the edge statuses (Γs )j,i are ≥ − N i updated according to an exchangeable transition probability on 0, 1 \{ } and all edges not incident to i remain fixed, or { } 10 Harry Crane

(III) the edges of Γs are updated according to a random rewiring map as in discrete time. −

Qualitatively, the above description separates the jumps of ΓG according to their local and global characteristics. Type (I) discontinuities are local—they involve a change to just a single edge—while Type (III) discontinuities are global—they involve a change to a positive proportion of all edges. Type (II) discontinuities lie in between—they involve a change to infinitely many but a zero limiting proportion of edges. According to this characterization, there can be no discontinuities affecting the status of more than one but a zero limiting fraction of vertices or more than one but a zero limiting fraction of edges. This qualitative behavior has been shown more generally for all exchangeable Markov processes on N with cadl` ag` sample paths, but without the Feller assumption the preciseG characterization of Theorem 4 is unavailable [6]. The decomposition of the discontinuities into Types (I)-(III) prompts the L´evy–Itˆo–Khintchinedecomposition of the intensity measure ω from Theorem 3. More specifically, Theorem 4 below parameterizes any such ω by a unique triple (e, Σ, Υ), where e = (e0, e1) is a pair of nonnegative constants, Σ is a unique measure on the 3-dimensional simplex, and Υ is a unique measure on the space of rewiring limits. The 3-dimensional simplex ∆3 consists of all probability distributions on a set of four elements. Specific to our setting is the 4 element set 0, 1 0, 1 , and so for clarity we index each S ∆ by 0, 1 0, 1 , that is, S = ({S ,}×{S , S }, S ) ∈ 3 { }×{ } 00 01 10 11 with each Skk0 0 and S00 + S01 + S10 + S11 = 1. Every S ∆3 determines a ≥ ∈ j product measure S∞ on infinite 0, 1 0, 1 -valued sequences X = (X )j N by taking { } × { } ∈ j P X = (k, k0) S = Skk , k, k0 = 0, 1, (20) { | } 0 N N (i) independently for all j . For i and S ∆3, we write ΩS to denote the ∈ ∈ ∈ ij probability measure of a random rewiring map W N with (W )j,i S∞, ji ij i0 j0 ∈ W ∼ W = W by symmetry, and otherwise W = (0, 1) for all i0, j0 N with P (i) ∈ i < i0, j0 . We then write ΩS = ∞ Ω and, for any measure Σ on ∆ , we { } i=1 S 3 write Z ΩΣ( ):= ΩS( )Σ(dS) (21) · ∆3 ·

as the mixture of ΩS measure with respect to Σ. These components contribute to the behavior of each ΓG, G N, as follows. ∈ G (I) For unique constants e , e 0, every edge ij, i , j, changes its status 0 1 ≥ independently of all others: each edge in ΓG is removed independently at Exponential rate e 0 and each absent edge in Γ is added indepen- 0 ≥ G dently at Exponential rate e1 0. A jump of this kind results in a Type (I) discontinuity. ≥ (II) A measure Σ on ∆3 determines the measure ΩΣ in (21), which then induces a transition rate measure in the same way that a probability distribution ω determines a transition probability measure in (14). By construction, ΩΣ Exchangeable graph-valued Feller processes 11

assigns positive mass only to W N for which there is a unique vertex ∈ W i N with Wij , (0, 1) for a positive fraction of j , i and Wi0 j0 = (0, 1) ∈ for all edges i0 j0 that do not include i N. Any edge not involving i stays fixed and a jump from this measure results∈ in a Type (II) discontinuity. (III) A unique measure Υ on ∗ determines a transition rate measure ΩΥ, akin to the induced transitionV probability measure (14) discussed in Section 3.1.1. A jump of this kind results in a Type (III) discontinuity.

In the next theorem, I := IdN denotes the rewiring limit of the identity (2) | | IdN : N N and υ is the mass assigned to Id[2] by Ωυ, υ ∗. For W → W ∗ ∈ V k = 0, 1, we define k as the measure that assigns unit mass to each rewiring map with a single off-diagonal entry equal to (k, k) and all other entries (0, 1).

Theorem 4 (L´evy–Itˆo–Khintchinerepresentation) Let Γω∗ = ΓG∗ ,ω : G N be the process constructed in (19) based on exchangeable intensity ω{ satisfying∈(18) G }. Then there exist unique constants e , e 0, a unique measure Σ on ∆ satisfying 0 1 ≥ 3 Z Σ( S01 = 1 ) = 0 and (1 S01)Σ(dS) < , (22) { } ∆3 − ∞

and a unique measure Υ on ∗ satisfying V Z Υ( I ) = 0 and (1 υ(2))Υ(dυ) < (23) { } − ∗ ∞ V∗ such that

ω = ΩΥ + ΩΣ + e00 + e11. (24)

Remark 2 Theorem 3 says that every exchangeable Feller process Γ admits a con- struction by Γω∗ in (19) for some ω satisfying (18). Therefore, Theorem 4 provides a L´evy–Itˆo–Khintchine-typerepresentation for all exchangeable N-valued Feller processes Γ. G

Remark 3 The regularity conditions (22) and (23) and characterization (24) recall a similar description for Rd-valued L´evyprocesses, which decompose as the superpo- sition of independent with drift, a , and a pure jump martingale. In Rd, the Poissonian component is characterized by a Levy´ measure Π satisfying Z Π( (0,..., 0) ) = 0 and (1 x 2)Π(dx) < ; (25) { } ∧ | | ∞ see [3, Theorem 1]. Only a little imagination is needed to appreciate the resemblance of (18), (22), and (23) to (25).

Intuitively, (22) and (23) are the naive conditions needed to ensure that (2) ΩΣ and ΩΥ satisfy (18). By the Aldous–Hoover theorem (Theorem 6), υ ∗ corresponds to the probability that a random rewiring map from Ωυ restricts 12 Harry Crane

to Id . By de Finetti’s theorem, S is the probability that a random [2] ∈ W2 01 rewiring map from ΩS restricts to Id[2]. Thus, Z Z (2) (1 S01)Σ(dS) < and (1 υ )Υ(dυ) < ∆ − ∞ − ∗ ∞ 3 V∗

guarantee that the restriction of Γ to [2] has cadl` ag` sample paths. Under exchangeability, this is enough to guaranteeG that Γ[n] has cadl` ag` paths for all n N. ∈ By Theorem 3, every exchangeable Feller process corresponds to some measure ω satisfying (18) and, by Theorem 4, any such ω uniquely determines a measure Σ satisfying (45). Any such Σ, in turn, determines the transitions of an exchangeable Feller process on 0, 1 N (equipped with its product discrete topology) as follows. { } N Let Y = (t, Yt) [0, ) ( 0, 1 0, 1 ) be a Poisson point process with { } ⊂ ∞i × { i } ×i { } intensity dt Σ, so that Yt = (Yt,0, Yt,1) 0, 1 0, 1 , i 1, for every atom ⊗ ∈ { N} × { } ≥ time t 0. We construct XΣ∗ = Xx∗ : x 0, 1 from Y as follows. For every ≥ i { ∈ { } } [n] [n] 0, 1 -valued sequence x = (x )i N and n N, we define Xx∗ = (X∗ )t 0 by ∈ t ≥ { } [n] i ∈ putting X∗ = (x )1 i n and 0 ≤ ≤ i – if t > 0 is an atom time of Y such that Yt , (0, 1) for some i = 1,..., n, then [n] [n] [n] i Xt∗ = Yt (Xt∗ ) = (x0 )1 i n, where − ≤ ≤ ( i i i Yt,0, x = 0, x0 = i i i 1, and Yt,1, x = 1, ≥

[n] [n] – otherwise Xt∗ = Xt∗ . − [n] The resulting family of processes (Xx∗ )n 1 are compatible and determine [n] N ≥ N uniquely a process X∗ on 0, 1 . The collection X∗ = X∗ : x 0, 1 is an x { } Σ { x ∈ { } } exchangeable pure jump Feller process on 0, 1 N, whose behavior has been characterized previously in [5]. { }

By Theorem 1.3 in [5], the transition behavior of XΣ∗ is determined by a unique measure κ on the space of 2 2 stochastic matrices such that Xx∗ has the same distribution as if it were generated× from a Poisson point process with intensity Σ given by Z Σ(d(y, y0)) = S(1, y)S(2, y0)κ(dS) S2 with κ satisfying Z κ( I ) = 0 and (1 S(1, 1) S(2, 2))κ(dS) < . { 2} − ∧ ∞ S2 Exchangeable graph-valued Feller processes 13

3.2.2 Projection into the space of graph limits

As in discrete time, we write Γ for the process derived from Γ by mixing D∗ with respect to each initial distribution γD, D ∗. Analogous to Theorem 2, every exchangeable Feller process almost surely∈ D projects to a Feller process in the space of graph limits.

Theorem 5 (Graph limit process) Let Γ = Γ : G N be an exchangeable { G ∈ G } Feller process on N. Then Γ = ΓD : D ∗ exists almost surely and is a Feller G | D∗ | { ∈ D } process on ∗. Moreover, each ΓD is continuous at all t > 0 except possibly at the times of TypeD (III) discontinuities.| |

4 Preliminaries: Graph-valued processes

Proofs of the main theorems are spread over Sections 5 and 6. In preparation, we first develop the machinery of exchangeable random graphs, graph-valued processes, rewiring maps, graph limits, and rewiring limits.

4.1 Exchangeable random graphs

Definition 3 (Exchangeable random graphs) A random graph Γ V is ex- σ ∈ G changeable if Γ = Γ for all σ SV. A measure γ on V is exchangeable if γ(S) = σ L ∈ G σ σ γ(S ) for all σ SV and all measurable subsets S V, where S := G : G S . ∈ ⊆ G { ∈ } In the following theorem, is any Polish space. X

ij Theorem 6 (Aldous–Hoover [2], Theorem 14.21) Let X := (X )i,j 1 be a sym- σ(i)σ(j) ≥ metric -valued array for which X = (X )i,j 1 for all σ SN. Then there exists X 4 L ≥ ∈ a measurable function f : [0, 1] such that f ( , b, c, ) = f ( , c, b, ) and X = X∗, ij → X · · · · L where X∗ := (X∗ )i,j 1 is given by ≥

ij X∗ := f (ζ , ζ i , ζ j , ζ i,j ), i, j 1, (26) ∅ { } { } { } ≥ for ζ ;(ζ i )i 1;(ζ i,j )j>i 1 i.i.d. Uniform random variables on [0, 1]. { ∅ { } ≥ { } ≥ } By taking = 0, 1 , the Aldous–Hoover theorem gives a general con- struction for allX exchangeable{ } random graphs with countably many vertices. We make repeated use of the representation in (26), with particular empha- sis on the Aldous–Hoover representation and other special properties of the Erdos–R˝ enyi´ process. 14 Harry Crane

4.2 Erdos–R˝ enyi´ random graphs

For fixed p [0, 1], the Erd˝os–R´enyimeasure εp on N is defined as the dis- ∈ G tribution of Γ N obtained by including each edge independently with ∈ G probability p. We call Γ εp an Erd˝os–R´enyiprocess with parameter p. The ∼ finite-dimensional distributions of Γ εp satisfy ∼ Y (n) Fij 1 Fij P Γ n = F = ε (F):= p (1 p) − > 0, F n , n N . (27) { |[ ] } p − ∈ G[ ] ∈ 1 i

For every p (0, 1), εp is exchangeable and has full support in the sense that ∈ it assigns positive measure to all open subsets of N. G Essential to our proof of Theorems 1, 4, and 7, Γ εp admits a simplified ∼ Aldous–Hoover representation Γ = (g0(ζ , ζ i , ζ j , ζ i,j ))i,j 1, where L ∅ { } { } { } ≥ ( 1, 0 ζ i,j p, g0(ζ , ζ i , ζ j , ζ i,j ) = g0 (ζ i,j ):= ≤ { } ≤ (28) ∅ { } { } { } 0 { } 0, otherwise.

4.3 Feller processes

Let Z := Zz : z be a Markov process in any Polish space . The semigroup { ∈ Z} Z (Pt)t T of Z acts on bounded, measurable functions g : R by ∈ Z → Pt g(z):= E(g(Zt) Z = z), z , t T. | 0 ∈ Z ∈ We say that Z possesses the Feller property if, for all bounded, continuous g : R, Z → (i) z Pt g(z) is continuous for every t T and 7→ ∈ (ii) limt 0 Pt g(z) = g(z) for all z . ↓ ∈ Z Proposition 1 The following are equivalent for any exchangeable Markov process Γ = Γ : G N on N. { G ∈ G } G (i) Γ is consistent. (ii) Γ has the Feller property.

Proof (i) (ii): Compactness of the topology induced by (6) and the Stone– Weierstrass⇒ theorem imply that

g : N R : n N such that G n = G0 n implies g(G) = g(G0) { G → ∃ ∈ |[ ] |[ ] } is dense in the space of continuous functions N R. Therefore, every G → bounded, continuous g : N R can be approximated to arbitrary accuracy G → by a function that depends on G N only through G [n] for some n N. For any such approximation, the conditional∈ G expectation| in the definition∈ of the Feller property depends only on the restriction of Γ[n], which is a Markov chain on n . The first point in the Feller property follows immediately and G[ ] the second follows soon after by noting that n is a finite state space and, G[ ] Exchangeable graph-valued Feller processes 15

therefore, Γ[n] must remain in its initial state for a strictly positive amount of time with probability 1.

(ii) (i): For fixed n N and F n , we define ψF : N 0, 1 by ⇒ ∈ ∈ G[ ] G → { }

ψF(G):= 1 G n = F , G N, { |[ ] } ∈ G

which is bounded and continuous on N. To establish consistency, we must prove that G

P Γt n = F Γ = G = P Γt n = F Γ n = G n { |[ ] | 0 } { |[ ] | 0|[ ] |[ ]}

for every t 0, G N, and F [n], for all n N, which amounts to showing that ≥ ∈ G ∈ G ∈

PtψF(G) = PtψF(G∗) (29)

for all G, G∗ N for which G n = G∗ n . ∈ G |[ ] |[ ] By part (i) of the Feller property, PtψF( ) is continuous and, therefore, · it is enough to establish (29) on a dense subset of G N : G [n] = F . σ { ∈ G | } Exchangeability implies that PtψF(G) = PtψF(G ) for all σ SN that coincide ∈ with the identity [n] [n]. To establish (29), we identify G∗ N such that σ → ∈ G G∗ n = F and G∗ : σ SN , σ coincides with the identity [n] [n] is dense |[ ] { ∈ → } in G N : G n = F . For this, we draw G∗ ε conditional on the { ∈ G |[ ] } ∼ 1/2 event G∗ n = F , where  is the Erdos–R˝ enyi´ measure with parameter 1/2. { |[ ] } 1/2 Now, for every F0 [n0], n0 n, such that F0 [n] = F, Kolmogorov’s 0-1 law ∈ G ≥ | σ guarantees a permutation σ : N N that fixes [n] and for which G∗ [n0] = F0. Therefore, → |

σ G∗ : σ SN, σ coincides with identity [n] [n] { ∈ → } is dense in G N : G n = F . By the invariance of PtψF with respect to { ∈ G |[ ] } permutations that fix [n], PtψF is constant on a dense subset of G N : { ∈ G G n = F . Continuity implies that PtψF must be constant on the closure of this |[ ] } set and, therefore, depends only on the restriction to n . G[ ] By part (ii) of the Feller property,

PtψF(G∗) ψF(G∗) as t 0. → ↓ It follows that each finite restriction Γ[n] has cadl` ag` paths and Γ is consistent.

4.4 More on graph limits

Recall Definition 2 of dissociated random graphs. The space of graph limits ∗ corresponds to the space of exchangeable, dissociated probability measuresD on N, so we often conflate the two notions and write G

D(F):= γD( G N : G n = F ), F n , n N, { ∈ G |[ ] } ∈ G[ ] ∈ 16 Harry Crane

for each D ∗. In this way, the space of graph limits ∗ corresponds to ∈ D D the set of exchangeable, dissociated probability measures on N, which is compact under the metric G X X n D D0 := 2− D(F) D0(F) , D, D0 ∗ . (30) k − k | − | ∈ D n N F n ∈ ∈G[ ] S m N [m] We furnish ∗ with the trace of the Borel σ-field on [0, 1] ∈ G . The Aldous–HooverD theorem (Theorem 6) provides the link between dis- sociated measures and exchangeable random graphs. Dissociated measures are ergodic with respect to the action of SN in the space of exchangeable ran- dom graphs, that is, the law of every exchangeable random graph is a mixture of dissociated random graphs, and to every exchangeable random graph Γ there is a unique probability measure ∆ so that Γ γ as in (11). ∼ ∆ (n) Example 1 (Graph limit of the Erd˝os–R´enyiprocess) For fixed p [0, 1], let (εp )n N be the collection of finite-dimensional Erdos–R˝ enyi´ measures∈ in (27). The∈ (n) graph limit Γ of Γ εp is deterministic and satisfies δ(F,Γ) = ε (F) a.s. for | | ∼ p every F n , n N. ∈ G[ ] ∈ Example 2 (Random graph limit) The graph limit of an exchangeable random graph Γ is in general a random element of ∗. For example, for α, β > 0, let D εα,β be the mixture of Erdos–R˝ enyi´ measures given by Z εα,β( ):= εp( )Bα,β(dp), · [0,1] ·

where Bα,β is the Beta distribution with parameter (α, β). The finite-dimensional distributions of Γ ε are ∼ α,β n n α↑ 1 β↑ 0 P Γ = F = ε(n) (F):= , F , n N, (31) [n] α,β (n) [n] { | } (α + β)↑ 2 ∈ G ∈

P ij P ij j where n1 := 1 i

(1) Lov´aszand Szegedy’s graphon [12, Section 1] is not unique for any sequence of graphs (Gn)n N . In this sense, their construal of a graphon as a limit is a misnomer. In contrast, our∈ interpretation of a graph limit as a probability measure is unique. (2) Lov´aszand Szegedy’s definition of graphon is a recasting of the Aldous–Hoover theorem in the special case of countable graphs; see Theorem 6 above. (3) Our definition of a countable graph G N as the projective limit of a compatible ∈ G sequence of finite graphs (Gn)n N should not be confused with a graph limit. Any ∈ G N corresponds to a unique compatible sequence (G , G ,...) of finite graphs, ∈ G 1 2 whereas a graph limit D ∗ corresponds to the measurable set of countable 1 ∈ D graphs D − := G N : G = D . | | { ∈ G | | }

4.5 Rewiring maps and their limits

As defined in Section 3.1, a rewiring map W is a symmetric 0, 1 0, 1 - ij { } × { } valued array (W )i,j 1 with (0, 1) along the diagonal. We often regard W as a ≥ map N N, a pair (W0, W1) N N, or a 0, 1 0, 1 -valued array withoutG explicit→ G specification. ∈ G × G { } × { } φ Given any W [n] and any injection φ :[m] [n], we define W := φ(i)φ(j) ∈ W → (W )1 i,j m. In particular, W [n] := (W0 [n], W1 [n]) denotes the restriction of ≤ ≤ | | | σ W to a rewiring map n n and, for any σ SN, W is the image of G[ ] → G[ ] ∈ W under relabeling by σ. We equip N with the product discrete topology induced by W

d(W, W0):= 1/ sup n N : W n = W0 n , W, W0 N, { ∈ |[ ] |[ ]} ∈ W with the convention that 1/ = 0, and its associated Borel σ-field. ∞ Definition 4 (Exchangeable rewiring maps) A random rewiring map W N σ ∈ W is exchangeable if W = W for all σ SN. A measure ω on N is exchangeable σ L ∈ W if ω(S) = ω(S ) for all σ SN, for every measurable subset S N, where σ σ ∈ ⊆ W S := W : W S . In particular, a probability measure ω on N is exchangeable if it determines{ the∈ } law of an exchangeable random rewiring map.W

Appealing to the notion of graph limits, we define the density of V m ∈ W[ ] in W [n] by ∈ W m δ(V, W):= ind(V, W)/n↓ , where, as in (7), ind(V, W) is the number of injections φ :[m] [n] for which φ → W = V. We define the limiting density of V m in W N by ∈ W[ ] ∈ W δ(V, W):= lim δ(V, W [n]), if it exists. n →∞ | Definition 5 (Rewiring limits) The rewiring limit of W N is the collection ∈ W S W := (δ(V, W))V m N [m] , | | ∈ ∈ W S provided δ(V, W) exists for every V m N [m]. If δ(V, W) does not exist for some ∈ ∈ W V, then we put W := ∂. We write to denote the closure of W : W N ∂ S ∗ m N [|m] | V {| | ∈ W }\{ } in [0, 1] ∈ W . 18 Harry Crane

Following the program of Section 2.2, every υ ∗ determines an ex- ∈ V changeable probability measure Ω on N. We call Ω the rewiring measure υ W υ directed by υ, which is the essentially unique exchangeable measure on N W such that Ω -almost every W N has W = υ. Given any measure Υ on υ ∈ W | | ∗, we define the mixture of rewiring measures by V Z Ω ( ):= Ω ( )Υ(dυ). (32) Υ · υ · V∗ The space of rewiring limits ∗ corresponds to the space of exchangeable, V dissociated probability measures on N and, therefore, is compact under the analogous metric to (30), W X X n υ υ0 := 2− υ(V) υ0(V) , υ, υ0 ∗ . k − k | − | ∈ V n N V n ∈ ∈W[ ] As for the space of graph limits, we furnish with the trace Borel σ-field of S ∗ m N [m] V [0, 1] ∈ W . Lemma 1 The following hold for rewiring maps.

(i) As a map N N, every W N is Lipschitz continuous with Lipschitz constant 1G with→ respect G to the metric∈ W(6), that is,

d(W(G), W(G0)) d(G, G0) for all G, G0 N, ≤ ∈ G and the rewiring operation is associative in the sense that

(W W0) W00 = W (W0 W00), for all W, W0, W00 N . ◦ ◦ ◦ ◦ ∈ W σ σ σ (ii) (W(G)) = W (G ) for all W N,G N, and σ SN. ∈ W ∈ G ∈ (iii) If W N is an exchangeable rewiring map, then W exists almost surely. ∈ W | | Moreover, for every υ ∗, Ω -almost every W N has W = υ. ∈ V υ ∈ W | | Proof Parts (i) and (ii) are immediate from the definition. Part (iii) follows from a straightforward modification of the almost sure existence of Γ for any exchangeable random graph Γ; see Theorem 6. | |

5 Discrete time chains and the rewiring measure

5.1 Conditional Aldous–Hoover theorem for graphs

Theorem 7 (Conditional Aldous–Hoover theorem for graphs) Let Γ be an exchangeable Feller chain on N whose transitions are governed by transition proba- bility measure P. Then there existsG a measurable function f : [0, 1]4 0, 1 0, 1 × { } → { } satisfying f ( , b, c, , ) = f ( , c, b, , ) such that, for every fixed G N, the distribu- · · · · · · ij ∈ G tion of Γ0 P(G, ) is identical to that of Γ0∗ = (Γ0∗ )i,j 1, where ∼ · ≥ ij ij Γ0∗ = f (ζ , ζ i , ζ j , ζ i,j , G ), i, j 1, (33) ∅ { } { } { } ≥ for ζ ;(ζ i )i 1;(ζ i,j )j>i 1 i.i.d. Uniform random variables on [0, 1]. { ∅ { } ≥ { } ≥ } Exchangeable graph-valued Feller processes 19

Proof Let Γ = Γ : G N be an exchangeable Feller chain on N with { G ∈ G } G transition probability measure P. Let ε1/2 denote the Erdos–R˝ enyi´ measure (27) with p = 1/2. To prove (33), we generate a jointly exchangeable pair (Γ,Γ0) with Γ ε and Γ0 Γ = G P(G, ). ∼ 1/2 | ∼ · By exchangeability of ε1/2 and the transition probability measure P,(Γ,Γ0) is σ σ jointly exchangeable, that is, (Γ ,Γ0 ) = (Γ,Γ0) for all σ SN, and, therefore, L ∈ ij determines a weakly exchangeable 0, 1 0, 1 -valued array ((Γ,Γ0) )i,j 1 = ij ij { } × { } ≥ (Γ ,Γ0 )i,j 1. ≥ By Theorem 6, there exists a measurable function f : [0, 1]4 0, 1 0, 1 → { } × { } with f ( , b, c, ) = f ( , c, b, ) such that (Γ,Γ0) = (Γ∗,Γ0∗), where · · · · L ij (Γ∗,Γ0∗) = f (ζ , ζ i , ζ j , ζ i,j ), i, j 1, ∅ { } { } { } ≥

for ζ ;(ζ i )i 1;(ζ i,j )j>i 1 i.i.d. Uniform[0, 1]. By separating components, we { ∅ { } ≥ { } ≥ } can write f = ( f0, f1) so that

ij ij (Γ∗ ,Γ0∗ ) = ( f0(ζ , ζ i , ζ j , ζ i,j ), f1(ζ , ζ i , ζ j , ζ i,j )), i, j 1. ∅ { } { } { } ∅ { } { } { } ≥

With g0 and g0 defined as in (28), we have

(g0(ζ , ζ i , ζ j , ζ i,j ))i,j 1 ε1/2 ∅ { } { } { } ≥ ∼ and

( f0(ζ , ζ i , ζ j , ζ i,j ))i,j 1 = (g0(ζ , ζ i , ζ j , ζ i,j ))i,j 1 = (g0 (ζ i,j )i,j 1. ∅ { } { } { } ≥ L ∅ { } { } { } ≥ 0 { } ≥

Let ζˆJ = (ζI)I J for subsets J N, so that ζˆ i,j = (ζ , ζ i , ζ j , ζ i,j ), ζˆ i = ⊆ ⊂ { } ∅ { } { } { } { } (ζ , ζ i ), and ζˆ = ζ . By [10, Theorem 7.28], there exists a measurable function ∅ { } 2 ∅ 4 ∅ 8 h : [0, 1] [0, 1] [0, 1] [0, 1] such that, for η ;(η i )i 1;(η i,j )j>i 1 i.i.d. ∪ ∪ → { ∅ { } ≥ { } ≥ } Uniform[0, 1] random variables, h(ζˆJ, ηˆJ) Uniform[0, 1] and is independent ∼ of ζˆJ ζJ and ηˆJ ηJ for every J N with two or fewer elements and \{ } \{ } ⊆

g0 (ζ i,j ) = g0(ζˆ i,j ) = f0(h(ζ , η ), h(ζˆ i , ηˆ i ), h(ζˆ j , ηˆ j ), h(ζˆ i,j , ηˆ i,j )) 0 { } { } ∅ ∅ { } { } { } { } { } { } a.s. for every i, j 1. ≥ Now, put ξ = h(ζ , η ), ξ i = h(ζˆi, ηˆ i ), and ξ i,j = h(ζˆij, ηˆ i,j ) so that ∅ ∅ ∅ { } { } { } { } ξ ;(ξ i )i 1;(ξ i,j )j>i 1 are i.i.d. Uniform[0, 1] random variables. The generic { ∅ { } ≥ { } ≥ } Aldous–Hoover representation allows us to construct (Γ∗,Γ0∗) by

ij ij (Γ∗ ,Γ0∗ ) = ( f0(ξˆ i,j ), f1(ξˆ i,j )), i, j 1. { } { } ≥

From [10, Theorem 7.28], (Γ∗,Γ0∗) also has the representation

ij ij (Γ∗ ,Γ0∗ ) = ( f0(ξˆ i,j ), f1(ξˆ i,j )) = (g0 (ζ i,j ), g1(ζˆ i,j , ηˆ i,j )), a.s., i, j 1, { } { } 0 { } { } { } ≥ 8 where g1 : [0, 1] 0, 1 is the composition of f1 with h and g0 is given in (28). → { } 20 Harry Crane

By the Coding Lemma [1, Lemma 2.1], we can represent any pair (U, V) of i.i.d. Uniform[0, 1] random variables by (U, V) = a(α), where α is a Uniform[0, 1] random variable independent of (U, V) and a :L [0, 1] [0, 1]2 is a measurable → function. The random variables in ζ ;(ζ i )i 1;(ζ i,j )j>i 1; η ;(η i )i 1;(η i,j )j>i 1 { ∅ { } ≥ { } ≥ ∅ { } ≥ { } ≥ } are i.i.d. Uniform[0, 1], and so we can represent (ζˆ i,j , ηˆ i,j )i,j 1 by { } { } ≥

(ζˆ i,j , ηˆ i,j )i,j 1 = (aˆ(αˆ i,j ))i,j 1, { } { } ≥ L { } ≥ where we write aˆ(αˆ i,j ) = (a(α ), a(α i ), a(α j ), a(α i,j )) for α ;(α i )i 1;(α i,j )j>i 1 i.i.d. Uniform[0, 1]{ random} variables.∅ { } { } { } { ∅ { } ≥ { } ≥ } With π : [0, 1]2 [0, 1] denoting the projection onto the first coordinate, → (a, b) a, it follows that (Γ,Γ0) possesses a joint representation by 7→ ij Γ∗ = g00(α i,j ) and (34) 0 { } ij Γ0∗ = g0 (α , α i , α j , α i,j ), 1 ∅ { } { } { }

where g10 is the composition of g1 with aˆ and g000 = g0 π a. Also by the Coding Lemma, we can represent ◦ ◦

(g00(α i,j ), α i,j ) = (g00(α i,j ), u(g00(α i,j ), χ i,j )) 0 { } { } L 0 { } 0 { } { } for (χ i,j )j>i 1 i.i.d. Uniform[0, 1] random variables independent of (g000(α i,j ))j>i 1 and u{ : } 0, 1≥ [0, 1] [0, 1] a measurable function. We define { } ≥ { } × → ij ij g00(α , α i , α j , χ i,j , G ):= g0 (α , α i , α j , u(G , χ i,j )) = g0 (α , α i , α j , α i,j ), 1 ∅ { } { } { } 1 ∅ { } { } { } L 1 ∅ { } { } { } so that (Γ,Γ0) = (g00(α i,j ), g00(α , α i , α j , α i,j , g00(α i,j )))i,j 1. L 0 { } 1 ∅ { } { } { } 0 { } ≥ Conditioning on Γ = G gives { } ij P(G, ) = P (g00(α , α i , α j , α i,j , G ))i,j 1 for ε1/2-almost every G N . · { 1 ∅ { } { } { } ≥ ∈ ·} ∈ G By the Feller property, P(G, ) is continuous in the first argument and, thus, · the above equality holds for all G N and representation (33) follows. ∈ G

5.2 Construction from a single process

The proof of Theorem 7 uses only the fact that ε1/2 has full support on N. We can say a bit more by using a stronger property of the Erdos–R˝ enyi´ process.G

Proposition 2 For every p (0, 1), the Erd˝os–R´enyi process with parameter p produces a countable random graph∈ Γ that is almost surely

– universal, that is, for every m N and F [m] there exists an injection φ :[m] N such that Γφ = F, and∈ ∈ G → – ultrahomogeneous, that is, for every F [n] and injection φ :[m] N for φ ∈ G φ0→ which Γ = F m , φ extends to an injection φ0 :[n] N such that Γ = F. |[ ] → Exchangeable graph-valued Feller processes 21

Proof Let p (0, 1) and Γ εp. Then the graph limit Γ = (δ(F,Γ))F ∗ satisfies (n)∈ ∼ (n)| | ∈G δ(F,Γ) = εp (F) > 0 a.s. for every F [n], where εp is defined in (27). Since δ(F,Γ) > 0 a.s. there must be at least one∈ G copy, in fact infinitely many copies, of F in Γ with probability 1 for every F ∗. Universality follows by countable additivity of probability measures. ∈ G For ultrahomogeneity, let F [n] and φ :[m] N be an injection for φ ∈ G → which Γ = F [m]. For m∗ = max1 j m φ(j), we extend φ to an injection φm+1 : | ≤ ≤ [m+1] N by putting φm+1(j) = φ(j) for 1 j m and defining φm+1(m+1) = → ≤ ≤ φm+1 r, where r > m∗ is the smallest integer such that Γ = F [m+1]. Such an r exists with probability 1 by the Borel–Cantelli lemma since there| is an equal and φm+1 strictly positive probability that φm (m+1) = k yields Γ = F m for every +1 |[ +1] k > m∗. We obtain the desired extension φ0 :[n] N by recursively extending → φm+j 1 to φm+j for every j = 1,..., n m in the same way. The proof is complete. − −

Remark 5 By Proposition 2, εp-almost every G N is isomorphic to the Rado graph, that is, the unique (up to isomorphism) countable∈ G universal and ultrahomo- geneous graph [14].

The universality and ultrahomogeneity properties permit a simultaneous embedding of Γ = Γ : G N into a single, exchangeable process Γ† = { G ∈ G } (Γ†)t T, which we generate from Γ by taking t ∈ – Γ† ε and 0 ∼ 1/2 G – Γ† = ΓG∗ on the event Γ0† = ∗.

Theorem 8 Let Γ = Γ : G N be an exchangeable Feller process on N and let { G ∈ G } G Γ† = (Γ†)t T be the process constructed above. Then for every G N there exists an t ∈ ∈ G φG φG injection φG : N N such that Γ† = (Γ† )t T = ΓG. Moreover, we can choose t ∈ L → φG φ G0 (φG)G N in such a way that Γ† [n] = Γ† [n] whenever G [n] = G0 [n]. ∈G | | | |

Proof For each G N, we let φG : N N be the injection obtained by ∈ G → choosing φG(1) < φG(2) < successively such that each φG(n) is the smallest φG n ··· † |[ ] integer for which Γ0 = G [n], where φG [n] is the domain restriction of φG to φG | | a map [n] N. Thus, Γ† = G for every G N. → 0 ∈ G The existence of such φG for every G N follows by the universality and ∈ G ultrahomogeneity properties of Γ0† ε1/2 from Proposition 2. Exchangeability ∼ φG and consistency properties of Γ guarantee that Γ† = ΓG for every G N. L ∈ G Finally, if G, G0 N satisfy G [n] = G0 [n], then φG(j) = φG0 (j) for all j = 1,..., n ∈ G | φG | φG by the above construction; thus, Γ† n = Γ† 0 n . The proof is complete. |[ ] |[ ]

5.3 Discrete time characterization

Proposition 3 For an exchangeable probability measure ω on N, let Γ∗ = Γ∗ : W ω { G,ω G N be constructed as in (13) from an i.i.d. sequence W , W ,... from ω. Then ∈ G } 1 2 Γ∗ is an exchangeable Feller chain on N. ω G 22 Harry Crane

Proof Each ΓG∗ ,ω has the Markov property by the assumption that W1, W2,... are independent and identically distributed. Exchangeability of ω and Lemma 1(ii) pass exchangeability along to Γω∗ . By Lemma 1(i), every W N is [n]∈ W ∗ Lipschitz continuous with Lipschitz constant 1, implying that ΓG,ω also has the Markov property for every n N. By Proposition 1, Γω∗ is Feller and the proof is complete. ∈

Definition 6 (Rewiring Markov chains) We call Γω∗ constructed as in (13) a rewiring chain with rewiring measure ω. If ω = ΩΥ for some probability measure Υ on ∗, we call Γ∗ a rewiring chain directed by Υ. V Υ

Theorem 9 Let Γ be an exchangeable Feller chain on N. Then there exists an G exchangeable probability measure ω on N so that Γ = Γω∗ , where Γω∗ = ΓG∗ : W L { ,ω G N is a rewiring Markov chain with rewiring measure ω. ∈ G } Proof By Theorem 7, there exists a measurable function f : [0, 1]4 0, 1 0, 1 so that the transitions of Γ can be constructed as in (33). From× { f ,} we → { } 4 define f ∗ : [0, 1] 0, 1 0, 1 by → { } × { }

f ∗(a, b, c, d):= ( f (a, b, c, d, 0), f (a, b, c, d, 1)).

Given i.i.d. Uniform[0, 1] random variables ζ ;(ζ i )i 1;(ζ i,j )j>i 1 , we con- { ∅ { } ≥ { } ≥ ij} struct a weakly exchangeable 0, 1 0, 1 -valued array W∗ := (W∗ )i,j 1 by { } × { } ≥ ij W∗ := f ∗(ζ , ζ i , ζ j , ζ i,j ), i, j 1. ∅ { } { } { } ≥

We write ω to denote the essentially unique distribution of W∗. Treating W∗ as a rewiring map, the image W∗(G) = Γ∗ satisfies

ij ij Γ∗ = f (ζ , ζ i , ζ j , ζ i,j , G ) for all i, j 1, ∅ { } { } { } ≥ for every G N; whence, Γω∗ in (13) has the same transition probabilities as Γ and the proof∈ G is complete.

5.4 Characterizing the rewiring measure

Theorem 9 establishes that any exchangeable Feller chain on N can be con- structed as a rewiring chain. In the discussion surrounding LemmaG 1 above, we identify every exchangeable probability measure ω with a unique prob- ability measure Υ on ∗ through the relation ω = ΩΥ, with ΩΥ defined in (32). V

Proposition 4 Let ω be an exchangeable probability measure on N. Then there W exists an essentially unique probability measure Υ on ∗ such that ω = Ω := R V Υ ΩυΥ(dυ). V∗ Exchangeable graph-valued Feller processes 23

Proof This follows by combining the Aldous–Hoover theorem and Lemma 1(iii). By Lemma 1(iii), ω-almost every W N possesses a rewiring limit, from which the change of variables formula∈ gives W Z Z ω(dW):= Ω V (dW)ω(dV) = Ωυ(dW) ω (dυ) = Ω ω (dW), | | | | N | | W V∗ where ω denotes the image measure of ω by : N ∗ ∂ . | | | · | W → V ∪{ } Proof (Proof of Theorem 1) Follows from Theorem 9 and Proposition 4.

5.5 The induced chain on ∗ D

Any υ ∗ corresponds to a unique transition probability P on N N, as ∈ V υ G × G defined in (14). Moreover, υ ∗ acts on ∗ by composition of probability ∈ V D measures as in (15). For υ, υ0 ∗, we define the product P0 = P P by ∈ V υ υ0 Z

P0(G, dG0):= Pυ0 (G00, dG0)Pυ(G, dG00), G, G0 N . (35) N ∈ G G

Lemma 2 Let W, W0 N be independent exchangeable random rewiring maps ∈ W such that W and W0 exist almost surely. Then W0 W exists almost surely and | | | | | ◦ |

P W W = P W P W a.s. | 0◦ | | | | 0|

Proof Follows by independence of W and W0, the definition of W0 W, and ◦ the definition of P W P W in (35). | | | 0|

Proof (Proof of Theorem 2) By Theorem 1, there exists a measure Υ on ∗ V such that Γ = Γ∗ , the rewiring chain directed by Υ. Thus, for every G N, L Υ ∈ G ΓG = (Γm)m 0 is determined by Γ0 = G and ≥

Γm = (Wm W1)(G), m 1, L ◦ · · · ◦ ≥ where W1, W2,... are i.i.d. with distribution ΩΥ. For any D ∗, ΓD = (Γm)m 0 is generated from Γ = ΓG : G N by ∈ D ≥ { ∈ G } first taking Γ γD and then putting Γ = Γ on the event Γ = G. By the 0 ∼ D G 0 Aldous–Hoover theorem along with exchangeability of γD and ΩΥ, the graph limit Γm exists almost surely for every m 0 and, thus, ΓD := ( Γm )m 0 exists | | ≥ | | | | ≥ almost surely in ∗. D By Lemma 1(iii), W1 , W2 ,... are i.i.d. from Υ, and, by Lemma 2, Wm W exists almost| surely| | and| | ◦ · · · ◦ 1|

P Wm W = P W P Wm a.s. for every m 1. | ◦···◦ 1| | 1| ··· | | ≥ Finally, by Theorem 1,

Γm = (Wm W1)(Γ0) = Γ0 W1 Wm a.s. for every m 1, | | L | ◦ · · · ◦ | | || | · · · | | ≥ 24 Harry Crane

and so each ΓD is a Markov chain on ∗ with initial state D and representation (17). | | D To establish the Feller property, we use the fact that ∗ is compact and, D therefore, any h : ∗ R is uniformly continuous and bounded. The dominated convergenceD theorem→ and Lipschitz continuity of the action of υ ∗ on ∗ immediately give part (i) of the Feller property. ∈ V D Part (ii) of the Feller property is trivial because Γ ∗ is a discrete time Markov chain. The proof is complete. | D |

Remark 6 We could interpret P0 in (35) as the two-step transition probability mea- sure for a time inhomogeneous Markov chain on N with first step governed by P G υ and second step governed by Pυ0 . By this interpretation, Theorem 2 leads to another description of exchangeable N-valued Feller chains as mixtures of time inhomoge- G neous Markov chains. In particular, we can construct Γ by first sampling Y1, Y2,... i.i.d. from the unique directing measure Υ. Each Yi induces an exchangeable transi- tion probability PYi as in (14). At each time m 1, given Γm 1,...,Γ0 and Y1, Y2,..., ≥ − we generate Γm PYm (Γm 1, ). ∼ − ·

5.6 Examples

Example 3 (Product Erd˝os–R´enyi chains) Let (p , p ) (0, 1) (0, 1) and put 0 1 ∈ × ω = εp εp , where εp denotes the Erdos–R˝ enyi´ measure with parameter p. 0 ⊗ 1 This measure generates transitions out of state G N by flipping a pk-coin ij ∈ G with k = G independently for every j > i 1. The limiting trajectory in ∗ is deterministic and settles down to a degenerate≥ stationary distribution atD the graph limit of an εq-random graph with q := p /(1 p + p ). 0 − 1 0 Example 4 (A reversible family of Markov chains) We extend the model from Example 3 by mixing (p , p ) with respect to any measure on [0, 1] [0, 1]. 0 1 × For example, let α, β > 0 and define ω = εP εP for (P , P ) B B , 0 ⊗ 1 0 1 ∼ α,β ⊗ β,α where Bα,β is the Beta distribution with parameter (α, β). For every n 1, the finite-dimensional transition probabilities of this mixture model are ≥

n00 n01 n10 n11 [n] [n] α↑ β↑ β↑ α↑ P Γ = F0 Γm = F = , F, F0 [n], m+1 n0 n1 { | } (α + β)↑ • (α + β)↑ • ∈ G

P ij ij where nrs := 1 i

n n (α + β)↑ 0 (α + β)↑ 1 P Γ[n] = F = , F , (n) [n] { } (2α + 2β)↑ 2 ∈ G

P ij where nr := 1 i

6 Continuous time processes and the L´evy–Itˆo–Khintchinemeasure

We now let Γ := Γ : G N be a continuous time exchangeable Feller { G ∈ G } process on N so that each ΓG = (Γt)t 0 is indexed by [0, ). Any such process can jump infinitelyG often in arbitrarily≥ small time intervals,∞ but the consistency property (4) confines each finite restriction Γ[n] to jump only finitely often in bounded intervals. The interplay between these possibilities restricts the behavior of Γ in a precise way. Let ω be an exchangeable measure on N that satisfies W

ω( IdN ) = 0 and ω( W N : W , Id ) < , (36) { } { ∈ W |[2] [2]} ∞

where IdN stands for the identify map N N as before. We proceed as G → G in (19) and construct a process Γω∗ := ΓG∗ ,ω : G N on N from a Poisson + { ∈ G } G point process W = (t, Wt) R N with intensity dt ω. { } ⊆ × W ⊗

Proposition 5 Let ω be an exchangeable measure satisfying (36). Then Γω∗ con- structed in (19) is an exchangeable Feller process on N. G Proof By exchangeability of ω and (36),    [    ω( W N : W [n] , Id[n] ) = ω  W N : W i,j , Id i,j  { ∈ W | }  { ∈ W |{ } { }} 1 i

Corollary 1 Every exchangeable measure ω satisfying (36) determines the jump rates of an exchangeable Feller process on N. G By the Feller property, the transition law of each finite restriction Γ[n] is determined by the infinitesimal jump rates

1 [n] [n] Qn(F, F0):= lim P Γt = F0 Γ = F , F, F0 [n], F , F0. (37) t 0 t { | 0 } ∈ G ↓ 26 Harry Crane

Exchangeability (3) and consistency (4) of Γ imply

σ σ Qn(F , F0 ) = Qn(F, F0) for all permutations σ :[n] [n] and (38) X→ Qm(F m , F0) = Qn(F, F00 n : F00 m = F0 ) = Qn(F, F00), (39) |[ ] { ∈ G[ ] |[ ] } F n : F m =F 00∈G[ ] 00|[ ] 0 for all F n and F0 m such that F m , F0, m n. Thus, for every ∈ G[ ] ∈ G[ ] |[ ] ≤ G N,(Qn)n N determines an additive pre-measure Q(G, ) on N G by ∈ G ∈ · G \{ } Q(G, G0 N : G0 n = F0 ):= Qn(G n , F0), F0 n G n , n N . { ∈ G |[ ] } |[ ] ∈ G[ ] \{ |[ ]} ∈ By Caratheodory’s´ extension theorem, Q(G, ) has a unique extension to a · measure on N G . G \{ } In Γω∗ constructed above, ω determines the jump rates through

Q(G, ) = ω( W N : W(G) ), G N . · { ∈ W ∈ ·} ∈ G In fact, we can show that the infinitesimal rates of any exchangeable Feller process are determined by an exchangeable measure ω that satisfies (36). Our main theorem (Theorem 4) characterizes this jump measure further in the sense of Levy–It´ o–Khintchine.ˆ

6.1 Existence of exchangeable jump measure

Let $\Gamma$ be an exchangeable Feller process on $\mathcal{G}_{\mathbb{N}}$. For every $t > 0$, Theorems 7 and 9 guarantee the existence of an exchangeable probability measure $\omega_t$ on $\mathcal{W}_{\mathbb{N}}$ such that, for every $G \in \mathcal{G}_{\mathbb{N}}$,
$$\omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W(G) \in \cdot\}) = P\{\Gamma_t \in \cdot \mid \Gamma_0 = G\}. \qquad (40)$$
The Chapman–Kolmogorov theorem implies that $(\omega_t)_{t \geq 0}$ satisfies
$$\omega_{t+s}(\{W \in \mathcal{W}_{\mathbb{N}} : W(G) \in \cdot\}) = \int_{\mathcal{W}_{\mathbb{N}}} \omega_t(\{W' \in \mathcal{W}_{\mathbb{N}} : (W' \circ W)(G) \in \cdot\})\,\omega_s(dW), \qquad (41)$$
for all $s, t \geq 0$ and all $G \in \mathcal{G}_{\mathbb{N}}$. The family $(\omega_t)_{t \geq 0}$ is not determined by these conditions, but nevertheless time homogeneity and (40) imply the existence of an exchangeable rate measure $\omega$ on $\mathcal{W}_{\mathbb{N}}$ that satisfies (36) and determines the jump rates of $\Gamma$.

Theorem 3 (Poissonian construction) Let $\Gamma = \{\Gamma_G : G \in \mathcal{G}_{\mathbb{N}}\}$ be an exchangeable Feller process on $\mathcal{G}_{\mathbb{N}}$. Then there exists an exchangeable measure $\omega$ satisfying (18) such that $\Gamma =_{\mathcal{L}} \Gamma^*_\omega$ as constructed in (19).

Proof We first define a measure $\omega$ from the transition law of $\Gamma$ and then show that this measure satisfies the necessary properties. Equations (37) and (40) imply

$$\lim_{t \downarrow 0} t^{-1}\,\omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]}(F) = F'\}) = Q_n(F, F') < \infty, \qquad F, F' \in \mathcal{G}_{[n]},\ F \neq F'.$$

Since $\mathcal{G}_{[n]}$ is a finite state space for every $n \in \mathbb{N}$, we can define

$$\lambda_n := \max_{F \in \mathcal{G}_{[n]}} Q_n(F, \mathcal{G}_{[n]} \setminus \{F\}) < \infty, \qquad n \in \mathbb{N}.$$

For every $n \in \mathbb{N}$ and $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$ we see that

$$\omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}) \leq \omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}),$$

and for every $V \neq \mathrm{Id}_{[n]}$ there exists some $F^* \in \mathcal{G}_{[n]}$ for which $V(F^*) \neq F^*$. Letting $F$ be any such $F^*$, we see that

$$\begin{aligned}
\omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}) &\leq \omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]}(F) = V(F)\}) \\
&= P\{\Gamma^{[n]}_t = V(F) \mid \Gamma^{[n]}_0 = F\} \\
&\leq P\{\Gamma^{[n]}_t \neq F \mid \Gamma^{[n]}_0 = F\} \\
&\leq 1 - P\{\Gamma^{[n]}_s = F \text{ for all } s \in [0, t] \mid \Gamma^{[n]}_0 = F\} \\
&= 1 - \exp\{-t\,Q_n(F, \mathcal{G}_{[n]} \setminus \{F\})\} \\
&\leq 1 - \exp\{-t\lambda_n\} \\
&\leq \lambda_n t,
\end{aligned}$$
with the final inequality following from $1 - e^{-x} \leq x$. For $n \in \mathbb{N}$ and $t > 0$, we define a measure $\omega^{(n)}_t$ on $\mathcal{W}_{[n]}$ by

$$\omega^{(n)}_t(V) := \begin{cases} t^{-1}\,\omega_t(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}), & V \neq \mathrm{Id}_{[n]},\\ 0, & \text{otherwise}. \end{cases}$$

For every $n \in \mathbb{N}$, each $\omega^{(n)}_t$ is invariant under relabeling $V$ by any permutation, and the collection of measures $(\omega^{(n)}_t)_{t>0}$ is bounded by $\omega^{(n)}_t(\cdot) \leq 4^{\binom{n}{2}}\,t^{-1}(\lambda_n t) = 4^{\binom{n}{2}} \lambda_n < \infty$. Thus, $(\omega^{(n)}_t)_{t \downarrow 0}$ is a bounded sequence of exchangeable measures on $\mathcal{W}_{[n]}$. Finiteness of $\mathcal{W}_{[n]}$ and the Bolzano–Weierstrass theorem allows us to choose a subsequence $t_{1,n} > t_{2,n} > \cdots > 0$ such that $t_{k,n} \downarrow 0$ as $k \to \infty$ and $\omega^{(n)}_{t_{k,n}} \to \omega^{(n)}$, a finite measure on $\mathcal{W}_{[n]}$ with

$$\omega^{(n)}(V) := \lim_{k \to \infty} \omega^{(n)}_{t_{k,n}}(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}), \qquad V \in \mathcal{W}_{[n]}. \qquad (42)$$

(n) (n) Since ωt is exchangeable for every t > 0, the resulting measure ω is ex- (n) changeable for any choice of subsequence. Moreover, ωt ( W N : W [n] = (n) { ∈ W | Id[n] ) = 0 for all t > 0, and so we also have ω (Id[n]) = 0 for any choice of subsequence.} (n) We build up the sequence (ω )n 1 recursively by first choosing (tk,1)k 1 ≥ ≥ and, for every n 1, choosing (tk,n+1)k 1 as a subsequence of (tk,n)k 1 so that ≥ ≥ ≥ 28 Harry Crane

$\omega^{(n+1)}_{t_{k,n+1}}$ converges to a finite measure on $\mathcal{W}_{[n+1]}$. By this construction of $(\omega^{(n)})_{n \geq 1}$ from the refining subsequences $\{(t_{k,n})_{k \geq 1}\}_{n \geq 1}$, we automatically have
$$\begin{aligned}
\omega^{(n+1)}(\{V^* \in \mathcal{W}_{[n+1]} : V^*|_{[n]} = V\}) &= \sum_{V^* \in \mathcal{W}_{[n+1]}:\, V^*|_{[n]} = V} \omega^{(n+1)}(V^*) \\
&= \lim_{k \to \infty} \sum_{V^* \in \mathcal{W}_{[n+1]}:\, V^*|_{[n]} = V} \omega^{(n+1)}_{t_{k,n+1}}(V^*) \\
&= \lim_{k \to \infty} \sum_{V^* \in \mathcal{W}_{[n+1]}:\, V^*|_{[n]} = V} t_{k,n+1}^{-1}\,\omega_{t_{k,n+1}}(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n+1]} = V^*\}) \\
&= \lim_{k \to \infty} t_{k,n+1}^{-1}\,\omega_{t_{k,n+1}}(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}) \\
&= \omega^{(n)}(V),
\end{aligned}$$
for every $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$, where interchange of the sum and limit is justified by the bounded convergence theorem.

Since the Borel $\sigma$-field $\sigma\langle \bigcup_{n \in \mathbb{N}} \mathcal{W}_{[n]} \rangle$ is generated by the $\pi$-system of events
$$\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}, \qquad V \in \mathcal{W}_{[n]},\ n \in \mathbb{N},$$
we can define a set function $\omega$ on $\mathcal{W}_{\mathbb{N}}$ on sets of the form $\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}$ for $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$ by
$$\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} = V\}) = \omega^{(n)}(V), \qquad V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}.$$
By construction, $(\omega^{(n)})_{n \in \mathbb{N}}$ satisfies
$$\omega^{(m)}(V) = \sum_{V^* \in \mathcal{W}_{[n]}:\, V^*|_{[m]} = V} \omega^{(n)}(V^*)$$
for every $m \leq n$ and $V \in \mathcal{W}_{[m]} \setminus \{\mathrm{Id}_{[m]}\}$. Carathéodory's extension theorem guarantees an extension to a measure $\omega$ on $\mathcal{W}_{\mathbb{N}} \setminus \{\mathrm{Id}_{\mathbb{N}}\}$. For each $n \in \mathbb{N}$, $\omega^{(n)}$ determines the jump rates of an exchangeable Markov chain on $\mathcal{G}_{[n]}$, implying
$$\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}) = \omega^{(n)}(\mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}) < \infty$$
for all $n \in \mathbb{N}$. Putting $\omega(\{\mathrm{Id}_{\mathbb{N}}\}) = 0$ gives (36). Exchangeability of every $\omega_t$, $t > 0$, makes each $\omega^{(n)}$, $n \in \mathbb{N}$, exchangeable and, hence, implies $\omega$ is exchangeable.

We must still show that this choice of $\omega$ yields a process $\Gamma^*_\omega =_{\mathcal{L}} \Gamma$. Let $\omega$ be the exchangeable measure from above and let $\mathbf{W} = \{(t, W_t)\}$ be a Poisson point process with intensity $dt \otimes \omega$. Since $\omega$ satisfies (36), Proposition 5 allows us to construct $\Gamma^*_\omega$ from $\mathbf{W}$ as in (19). The jump rates of each $\Gamma^{*[n]}_\omega$ are determined by a thinned version $\mathbf{W}^{[n]}$ of $\mathbf{W}$ that only maintains the atoms $(t, W_t)$ for which $W_t|_{[n]} \neq \mathrm{Id}_{[n]}$. By the thinning property of Poisson random measures, see [9], the intensity of $\mathbf{W}^{[n]}$ is $\omega^{(n)}$ as defined in (42), and it follows immediately that the jump rate from $F$ to $F' \neq F$ in $\Gamma^{*[n]}_\omega$ is
$$\omega^{(n)}(\{V \in \mathcal{W}_{[n]} : V(F) = F'\}) = Q_n(F, F'),$$

for $Q_n(\cdot, \cdot)$ as in (37). Furthermore,
$$Q_n(F, \mathcal{G}_{[n]} \setminus \{F\}) = \omega^{(n)}(\{V \in \mathcal{W}_{[n]} : V(F) \neq F\}) < \infty$$
for all $F \in \mathcal{G}_{[n]}$, $n \in \mathbb{N}$. By construction, $(\Gamma^{*[n]}_\omega)_{n \in \mathbb{N}}$ is a compatible collection of càdlàg exchangeable Markov chains governed by the same finite-dimensional transition law as $\Gamma$. The proof is complete.

6.1.1 Standard ω-process

For $\omega$ satisfying (18), we define the standard $\omega$-process $W^*_\omega := (W^*_t)_{t \geq 0}$ on $\mathcal{W}_{\mathbb{N}}$ as the $\mathcal{W}_{\mathbb{N}}$-valued Markov process with $W^*_0 = \mathrm{Id}_{\mathbb{N}}$ and infinitesimal jump rates

$$Q(W, dW') := \omega(\{W'' \in \mathcal{W}_{\mathbb{N}} : W'' \circ W \in dW'\}), \qquad W \neq W' \in \mathcal{W}_{\mathbb{N}}.$$
Associativity and Lipschitz continuity of the rewiring maps (Lemma 1(i)) make $\mathbf{W} = \{W_V : V \in \mathcal{W}_{\mathbb{N}}\}$ a consistent Markov process, where $W_V := (W^*_t \circ V)_{t \geq 0}$ for each $V \in \mathcal{W}_{\mathbb{N}}$. Thus, an alternative description to the construction of $\Gamma^*_\omega = \{\Gamma^*_{G,\omega} : G \in \mathcal{G}_{\mathbb{N}}\}$ in (19) is to put $\Gamma^*_{G,\omega} = (W^*_t(G))_{t \geq 0}$, where $W^*_\omega = (W^*_t)_{t \geq 0}$ is a standard $\omega$-process.

Now let $\omega$ be any exchangeable measure on $\mathcal{W}_{\mathbb{N}}$ satisfying (36). Any such $\omega$ determines a family of exchangeable probability measures $(\omega_t)_{t \geq 0}$ on $\mathcal{W}_{\mathbb{N}}$, which in turn determines the semigroup of an exchangeable Feller process on $\mathcal{W}_{\mathbb{N}}$, that is,
$$\omega_t(\cdot) = P\{W^*_t \in \cdot \mid W^*_0 = \mathrm{Id}_{\mathbb{N}}\}.$$
These measures satisfy the semigroup property
$$\omega_{t+s}(\cdot) = \int_{\mathcal{W}_{\mathbb{N}}} \omega_t(\{W' \in \mathcal{W}_{\mathbb{N}} : W' \circ W \in \cdot\})\,\omega_s(dW), \qquad s, t \geq 0.$$
We have the following corollary to Theorem 3.

Corollary 2 Let $\Gamma = \{\Gamma_G : G \in \mathcal{G}_{\mathbb{N}}\}$ be an exchangeable Feller process on $\mathcal{G}_{\mathbb{N}}$. Then there exists an exchangeable measure $\omega$ satisfying (36) such that $\Gamma_G =_{\mathcal{L}} (W^*_t(G))_{t \geq 0}$ for every $G \in \mathcal{G}_{\mathbb{N}}$, where $W^*_\omega = (W^*_t)_{t \geq 0}$ is a standard $\omega$-process.

6.2 Lévy–Itô–Khintchine representation

We now refine Theorem 3 with an explicit Lévy–Itô–Khintchine-type characterization of exchangeable measures satisfying (36). The representation entails a few special measures, which we now introduce.

6.2.1 Single edge updates

For $j > i \geq 1$ and $k \in \{0, 1\}$, we define $\rho^{ij}_k : \mathcal{G}_{\mathbb{N}} \to \mathcal{G}_{\mathbb{N}}$ by $G \mapsto G' = \rho^{ij}_k(G)$, where
$$G'^{\,i'j'} = \begin{cases} G^{i'j'}, & i'j' \neq ij,\\ k, & i'j' = ij. \end{cases}$$

In words, $G' = \rho^{ij}_k(G)$ coincides with $G$ at every edge except $\{i, j\}$: irrespective of $G^{ij}$, $\rho^{ij}_k$ puts $G'^{ij} = k$. We call $\rho^{ij}_k$ a single edge update map and define the single edge update measures by
$$\mathbf{k}(\cdot) := \sum_{1 \leq i < j} \delta_{\rho^{ij}_k}(\cdot), \qquad k = 0, 1, \qquad (43)$$

which place unit mass at each single edge update map $\rho^{ij}_k$.
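On a finite adjacency matrix the single edge update map is a one-line operation; the following sketch (Python, an illustration only, assuming a numpy-style symmetric 0/1 matrix) fixes the status of one edge and leaves everything else alone, exactly as $\rho^{ij}_k$ does.

def single_edge_update(G, i, j, k):
    # rho_k^{ij}: set the status of edge {i, j} to k, irrespective of its current
    # value, and keep every other edge as it is.
    H = G.copy()
    H[i, j] = H[j, i] = k
    return H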

6.2.2 Single vertex updates

For any vertex $i \in \mathbb{N}$ and $G \in \mathcal{G}_{\mathbb{N}}$, the sequence of edges adjacent to $i$ is an infinite $\{0,1\}$-valued sequence $(G^{ij})_{j \neq i}$. Let $x_0 = x_0^1 x_0^2 \cdots$ and $x_1 = x_1^1 x_1^2 \cdots$ be infinite sequences in $\{0, 1\}$ and let $i \in \mathbb{N}$ be any distinguished vertex. For $x = (x_0, x_1)$ and $i \in \mathbb{N}$, we define $v^i_x : \mathcal{G}_{\mathbb{N}} \to \mathcal{G}_{\mathbb{N}}$ by $G' = v^i_x(G)$, where
$$G'^{\,i'j'} = \begin{cases}
G^{i'j'}, & i' \neq i \text{ and } j' \neq i,\\
x_0^{j'}, & i' = i \text{ and } G^{i'j'} = 0,\\
x_0^{i'}, & j' = i \text{ and } G^{i'j'} = 0,\\
x_1^{j'}, & i' = i \text{ and } G^{i'j'} = 1,\\
x_1^{i'}, & j' = i \text{ and } G^{i'j'} = 1,
\end{cases} \qquad i' \neq j'. \qquad (44)$$
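A minimal finite-vertex sketch of the map in (44) (Python, assuming an adjacency matrix and two $\{0,1\}$-sequences `x0`, `x1` indexed by the vertices; an illustration only, not part of the paper):

def single_vertex_update(G, i, x0, x1):
    # v_x^i as in (44): an edge {i, j} with current status 0 is reset to x0[j],
    # an edge with current status 1 is reset to x1[j]; edges not incident to the
    # distinguished vertex i are left unchanged.
    H = G.copy()
    n = G.shape[0]
    for j in range(n):
        if j != i:
            H[i, j] = H[j, i] = x0[j] if G[i, j] == 0 else x1[j]
    return H

Drawing the pairs $(x_0^j, x_1^j)$ i.i.d. from a point $S$ of the simplex $\Delta_3$ keeps the updated graph exchangeable, which is the role played by the measures $\Omega_S$ introduced next.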

We call $v^i_x$ a single vertex update map, as it affects only edges incident to the distinguished vertex $i$.

When $\Gamma$ is the realization of an exchangeable random graph, the sequence $(\Gamma^{ij})_{j \neq i}$ is exchangeable for all fixed $i \in \mathbb{N}$. We ensure that the resulting $\Gamma' = v^i_x(\Gamma)$ is exchangeable by choosing $x$ from an exchangeable measure on pairs $X = (X_0, X_1)$ of infinite binary sequences. As above, we let $\Delta_3$ denote the $3$-dimensional simplex equipped with its Borel $\sigma$-field. From any $S \in \Delta_3$, we write $W \sim \Omega^{(i)}_S$ to denote the probability measure of a random rewiring map $W$ constructed by taking $X = (X^i)_{i \geq 1}$ i.i.d. from (20) and putting $W = v^i_X$ as in (44). Given a measure $\Sigma$ on $\Delta_3$, we define $\Omega_\Sigma(\cdot) = \int_{\Delta_3} \Omega_S(\cdot)\,\Sigma(dS)$ as in (21).

For the reader's convenience, we restate Theorem 4, whose proof relies on Lemmas 3, 4, and 5 from Section 6.3 below.

Theorem 4 (Lévy–Itô–Khintchine representation) Let $\Gamma^*_\omega = \{\Gamma^*_{G,\omega} : G \in \mathcal{G}_{\mathbb{N}}\}$ be an exchangeable Feller process constructed in (19) based on exchangeable intensity $\omega$ satisfying (18). Then there exist unique constants $e_0, e_1 \geq 0$, a unique measure $\Sigma$ on $\Delta_3$ satisfying
$$\Sigma(\{S_{01} = 1\}) = 0 \quad\text{and}\quad \int_{\Delta_3} (1 - S_{01})\,\Sigma(dS) < \infty, \qquad (45)$$
and a unique measure $\Upsilon$ on $\mathcal{V}^*$ satisfying
$$\Upsilon(\{I\}) = 0 \quad\text{and}\quad \int_{\mathcal{V}^*} (1 - \upsilon^{(2)})\,\Upsilon(d\upsilon) < \infty \qquad (46)$$
such that
$$\omega = \Omega_\Upsilon + \Omega_\Sigma + e_0\mathbf{0} + e_1\mathbf{1}. \qquad (47)$$

6.3 Key lemmas

In the following three lemmas, we always assume that $\omega$ is an exchangeable measure satisfying (18). For $n \in \mathbb{N}$, we define

$$\omega_n := \omega\,\mathbf{1}_{\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}} \qquad (48)$$
as the restriction of $\omega$ to the event $\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}$. While $\omega$ can have infinite mass, the righthand side of (18) assures that $\omega_n$ is finite for all $n \geq 2$. Exchangeability of $\omega$ implies that $\omega_n$ is invariant with respect to all permutations $\mathbb{N} \to \mathbb{N}$ that fix $[n]$. We recover a finite, exchangeable measure from $\omega_n$ by defining the $n$-shift of $W = (W^{ij})_{i,j \geq 1}$ by $\overleftarrow{W}_n := (W^{n+i,n+j})_{i,j \geq 1}$ and letting $\overleftarrow{\omega}_n$ be the image of $\omega_n$ by the $n$-shift, that is,

$$\overleftarrow{\omega}_n(\cdot) := \omega_n(\{W \in \mathcal{W}_{\mathbb{N}} : \overleftarrow{W}_n \in \cdot\}). \qquad (49)$$

Lemma 3 Suppose $\omega$ is exchangeable and satisfies (18). Then $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ possesses a rewiring limit $|W|$.

Proof Let $\omega_n$ be the finite measure defined in (48). The image measure $\overleftarrow{\omega}_n$ in (49) is finite, exchangeable, and, therefore, proportional to an exchangeable probability measure on $\mathcal{W}_{\mathbb{N}}$. By Lemma 1(iii), $\overleftarrow{\omega}_n$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ possesses a rewiring limit. Since the rewiring limit of $W \in \mathcal{W}_{\mathbb{N}}$ depends only on $\overleftarrow{W}_n$, for every $n \in \mathbb{N}$, we conclude that $\omega_n$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ possesses a rewiring limit. As $n \to \infty$,

$$\omega_n \uparrow \omega_\infty = \omega\,\mathbf{1}_{\{W \in \mathcal{W}_{\mathbb{N}} : W \neq \mathrm{Id}_{\mathbb{N}}\}},$$
the restriction of $\omega$ to $\{W \in \mathcal{W}_{\mathbb{N}} : W \neq \mathrm{Id}_{\mathbb{N}}\}$. By the lefthand side of (18), $\omega$ assigns zero mass to $\{W \in \mathcal{W}_{\mathbb{N}} : W = \mathrm{Id}_{\mathbb{N}}\}$, implying $\omega_\infty = \omega$. The monotone convergence theorem implies that $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ possesses a rewiring limit. The proof is complete.

Lemma 4 Suppose $\omega$ is exchangeable, satisfies (36), and $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ has $|W| \neq I$. Then $\omega = \Omega_\Upsilon$ for some unique measure $\Upsilon$ satisfying (46).

Proof Let $\omega_n$ be as in (48) and let $A_n := \{W \in \mathcal{W}_{\mathbb{N}} : \overleftarrow{W}_n|_{[2]} \neq \mathrm{Id}_{[2]}\}$. By Lemma 3, the image of $\omega_n$ by taking rewiring limits, denoted $|\omega_n|$, is well defined and
$$\omega_n(A_n) = \int_{\mathcal{W}_{\mathbb{N}}} (1 - \upsilon^{(2)})\,\omega_n(\{|W| \in d\upsilon\}) = \int_{\mathcal{V}^*} (1 - \upsilon^{(2)})\,|\omega_n|(d\upsilon),$$
where $\upsilon^{(2)}$ is the component of $\upsilon \in \mathcal{V}^*$ corresponding to $\mathrm{Id}_{[2]}$. By the monotone convergence theorem, $|\omega_n|$ converges to a unique measure $\Upsilon := |\omega|$ on $\mathcal{V}^*$. Since $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ has $|W| \neq I$, this limiting measure satisfies $\Upsilon(\{I\}) = 0$. Moreover,
$$\int_{\mathcal{V}^*} (1 - \upsilon^{(2)})\,|\omega_n|(d\upsilon) \uparrow \int_{\mathcal{V}^*} (1 - \upsilon^{(2)})\,\Upsilon(d\upsilon)$$
and
$$\omega_n(A_n) \leq \omega(A_n) = \omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[2]} \neq \mathrm{Id}_{[2]}\}) < \infty.$$
Therefore, $\Upsilon$ satisfies (46) and $\Omega_\Upsilon$ satisfies (36). We must still show that $\omega = \Omega_\Upsilon$. By Proposition 4, we can write
$$\overleftarrow{\omega}_n(dW) = \int_{\mathcal{V}^*} \Omega_\upsilon(dW)\,|\overleftarrow{\omega}_n|(d\upsilon)$$
for each $n \in \mathbb{N}$. By exchangeability,
$$\omega(\{W : W|_{[n]} = V,\ W|_{[n+m] \setminus [n]} \neq \mathrm{Id}_{[m]}\}) = \overleftarrow{\omega}_m(\{W : W|_{[n]} = V\})$$
for every $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$, $n \in \mathbb{N}$. Also, for every $m \geq 1$,
$$\overleftarrow{\omega}_m(\{W : W|_{[n]} = V\}) = \int_{\mathcal{V}^*} \Omega_\upsilon(\{W : W|_{[n]} = V\})\,|\overleftarrow{\omega}_m|(d\upsilon),$$
which increases to $\Omega_\Upsilon(\{W : W|_{[n]} = V\})$ as $m \to \infty$ since $|\omega_m| = |\overleftarrow{\omega}_m|$ and $|\omega_m|$ converges to $\Upsilon$. The lefthand side of (36) implies

$$\lim_{m \uparrow \infty} \omega(\{W : W|_{[n]} = V,\ W|_{[n+m] \setminus [n]} \neq \mathrm{Id}_{[m]}\}) = \omega(\{W : W|_{[n]} = V\}).$$

By uniqueness of limits, $\Omega_\Upsilon$ and $\omega$ agree on a generating $\pi$-system of the Borel $\sigma$-field and, therefore, they agree on all measurable subsets of $\mathcal{W}_{\mathbb{N}}$. The proof is complete.

Lemma 5 Suppose $\omega$ is exchangeable, satisfies (36), and $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ has $|W| = I$. Then there exist unique constants $e_0, e_1 \geq 0$ and a unique measure $\Sigma$ on $\Delta_3$ satisfying (45) such that

$$\omega = \Omega_\Sigma + e_0\mathbf{0} + e_1\mathbf{1},$$
where $\mathbf{k}$ is defined in (43) and $\Omega_\Sigma$ is defined in (21).

Proof For any $W \in \mathcal{W}_{\mathbb{N}}$, $i \in \mathbb{N}$, and $\varepsilon, \delta > 0$, we define
$$L_W(i) := \limsup_{n \to \infty} n^{-1} \sum_{j=1}^n \mathbf{1}\{W^{ij} \neq (0, 1)\},$$

$$S_W(\varepsilon) := \{i \in \mathbb{N} : L_W(i) \geq \varepsilon\},$$
$$|S_W(\varepsilon)| := \limsup_{n \to \infty} n^{-1} \sum_{i=1}^n \mathbf{1}\{i \in S_W(\varepsilon)\}, \quad\text{and}$$
$$V(\varepsilon, \delta) := \{W \in \mathcal{W}_{\mathbb{N}} : |S_W(\varepsilon)| \geq \delta\}.$$

We can partition the event $\{W \in \mathcal{W}_{\mathbb{N}} : |W| = I\}$ by $V \cup E$, where

$$V := \bigcup_{i=1}^{\infty} \{W \in \mathcal{W}_{\mathbb{N}} : L_W(i) > 0\} \quad\text{and}$$

$$E := \bigcap_{i=1}^{\infty} \{W \in \mathcal{W}_{\mathbb{N}} : L_W(i) = 0\}.$$

We can therefore decompose ω as the sum of singular measures

$$\omega = \omega_V + \omega_E,$$
where $\omega_V$ and $\omega_E$ are the restrictions of $\omega$ to $V$ and $E$, respectively. As in (48), we write

$$\omega_{V,n} := \omega_V\,\mathbf{1}_{\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}} \quad\text{and}$$

$$\omega_{E,n} := \omega_E\,\mathbf{1}_{\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}}$$
to denote the restrictions of $\omega$ to $V \cap \{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}$ and $E \cap \{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}$, respectively. By (36) and exchangeability, both $\omega_{V,n}$ and $\omega_{E,n}$ are finite for all $n \geq 2$ and their images under the $n$-shift, denoted $\overleftarrow{\omega}_{V,n}$ and $\overleftarrow{\omega}_{E,n}$, are exchangeable.

Proof (Case V) Restricting our attention first to $V$, we note that $|W| = I$ implies
$$\limsup_{n \to \infty} n^{-2} \sum_{1 \leq i,j \leq n} \mathbf{1}\{W^{ij} \neq (0, 1)\} = \lim_{n \to \infty} n^{-2} \sum_{1 \leq i,j \leq n} \mathbf{1}\{W^{ij} \neq (0, 1)\} = 0.$$

By the law of large numbers for exchangeable sequences, we can replace the upper limits in our definition of $L_W(i)$ and $|S_W(\varepsilon)|$ by proper limits for $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$. (Note here that the law of large numbers does not apply directly since $\omega$ may be infinite, but it does apply to each of the finite measures $\overleftarrow{\omega}_n$. This gives the desired convergence $\omega_n$-a.s. for every $n \in \mathbb{N}$

and, hence, $\omega$-a.s. by the lefthand side of (36) and the monotone convergence theorem.) For any $\varepsilon, \delta > 0$, $\omega$-almost every $W \in V(\varepsilon, \delta)$ satisfies
$$\begin{aligned}
\limsup_{n \to \infty} n^{-2} \sum_{1 \leq i,j \leq n} \mathbf{1}\{W^{ij} \neq (0, 1)\} &= \lim_{n \to \infty} n^{-2} \sum_{1 \leq i,j \leq n} \mathbf{1}\{W^{ij} \neq (0, 1)\} \\
&= \lim_{n \to \infty} \lim_{m \to \infty} (nm)^{-1} \sum_{i=1}^{n} \sum_{j=1}^{m} \mathbf{1}\{W^{ij} \neq (0, 1)\} \\
&= \lim_{n \to \infty} n^{-1} \sum_{i=1}^{n} \lim_{m \to \infty} m^{-1} \sum_{j=1}^{m} \mathbf{1}\{W^{ij} \neq (0, 1)\} \\
&\geq \lim_{n \to \infty} n^{-1} \sum_{i=1}^{n} \varepsilon\,\mathbf{1}\{i \in S_W(\varepsilon)\} \\
&\geq \varepsilon\delta \\
&> 0,
\end{aligned}$$
contradicting the assumption that $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ has $|W| = I$. (In the above string of inequalities, passage from the second to third line follows from exchangeability and the law of large numbers, and the interchange of sum and limit in the fourth line is permitted because the sum is finite and each of the limits exists by exchangeability and the law of large numbers. Once again we must apply the law of large numbers to the finite measures $\overleftarrow{\omega}_{V,n}$ and deduce the corresponding behavior for the infinite measure $\omega$.) Therefore, $\omega$ assigns zero mass to $V(\varepsilon, \delta)$ for all $\varepsilon, \delta > 0$, and so $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ must have $|S_W(\varepsilon)| = 0$ for all $\varepsilon > 0$. In this case, either $\#S_W(\varepsilon) < \infty$ or $\#S_W(\varepsilon) = \infty$.

Treating the latter case first, we define the $n$-shift $\overleftarrow{W}_n$ of $W \in \mathcal{W}_{\mathbb{N}}$ as above, so that $\overleftarrow{\omega}_{V,n}$ is finite, exchangeable, and assigns positive mass to the event

$$\{W \in \mathcal{W}_{\mathbb{N}} : |S_W(\varepsilon)| = 0 \text{ and } \#S_{\overleftarrow{W}_n}(\varepsilon) = \infty\}.$$
We can regard $\overleftarrow{\omega}_{V,n}$ as a constant multiple of an exchangeable probability measure $\theta_n$ on $\mathcal{W}_{\mathbb{N}}$, so that for fixed $n \in \mathbb{N}$ the sequence $(\mathbf{1}\{L_{\overleftarrow{W}_n}(i) \geq \varepsilon\})_{i \in \mathbb{N}}$ is an exchangeable $\{0, 1\}$-valued sequence under $\theta_n$. By de Finetti's theorem,
$$|S_W(\varepsilon)| = \lim_{m \to \infty} m^{-1} \sum_{i=1}^{m} \mathbf{1}\{L_{\overleftarrow{W}_n}(i) \geq \varepsilon\} = 0 \quad \theta_n\text{-a.s.},$$
which implies $L_{\overleftarrow{W}_n}(i) = 0$ for all $i \in \mathbb{N}$ $\overleftarrow{\omega}_{V,n}$-a.e., contradicting $\#S_W(\varepsilon) = \infty$.

Now consider the case $\#S_W(\varepsilon) < \infty$. We claim that $\#S_W(\varepsilon) = 1$ almost surely. First note that $\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[2]} \neq \mathrm{Id}_{[2]}\} = \{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1)\}$, so that by (36)
$$\infty > \omega(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1)\}) \geq \omega(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1),\ L_W(1) \geq \varepsilon\}).$$

Also,
$$\begin{aligned}
\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1),\ L_W(1) \geq \varepsilon\}) &= \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{W^{12} \neq (0, 1)\}\,\mathbf{1}\{L_W(1) \geq \varepsilon\}\,\omega(dW) \\
&= \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{W^{1j} \neq (0, 1)\}\,\mathbf{1}\{L_W(1) \geq \varepsilon\}\,\omega(dW)
\end{aligned}$$
for all $j \geq 2$ by exchangeability of $\omega$ and, therefore,
$$\begin{aligned}
\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1),\ L_W(1) \geq \varepsilon\}) &= n^{-1} \sum_{j=2}^{n+1} \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{L_W(1) \geq \varepsilon\}\,\mathbf{1}\{W^{1j} \neq (0, 1)\}\,\omega(dW) \\
&= \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{L_W(1) \geq \varepsilon\}\; n^{-1} \sum_{j=2}^{n+1} \mathbf{1}\{W^{1j} \neq (0, 1)\}\,\omega(dW)
\end{aligned}$$
for all $n \geq 1$, where the interchange of finite sum and integral is permitted since all terms in the integrand are nonnegative. The righthand side is constant for all $n \geq 1$, implying
$$\begin{aligned}
\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1),\ L_W(1) \geq \varepsilon\}) &= \liminf_{n \to \infty} \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{L_W(1) \geq \varepsilon\}\; n^{-1} \sum_{j=2}^{n+1} \mathbf{1}\{W^{1j} \neq (0, 1)\}\,\omega(dW) \\
&\geq \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{L_W(1) \geq \varepsilon\}\; \liminf_{n \to \infty} n^{-1} \sum_{j=2}^{n+1} \mathbf{1}\{W^{1j} \neq (0, 1)\}\,\omega(dW) \\
&= \int_{\mathcal{W}_{\mathbb{N}}} \mathbf{1}\{L_W(1) \geq \varepsilon\}\,\mathbf{1}\Big\{\lim_{n \to \infty} n^{-1} \sum_{j=2}^{n+1} \mathbf{1}\{W^{1j} \neq (0, 1)\} \text{ exists}\Big\} \times \\
&\qquad\qquad \times \lim_{n \to \infty} n^{-1} \sum_{j=2}^{n+1} \mathbf{1}\{W^{1j} \neq (0, 1)\}\,\omega(dW)
\end{aligned}$$

$$\geq \varepsilon\,\omega(\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon\}),$$
where the first inequality is a consequence of Fatou's lemma and the final equality follows because the limiting fraction $\lim_{n \to \infty} n^{-1} \sum_{j=2}^{n+1} \mathbf{1}\{W^{1j} \neq (0, 1)\}$ exists for $\omega_m$-almost every $W$, for all $m \geq 2$, by an argument similar to the proof of Lemma 3. Monotone convergence of $\omega_m \uparrow \omega$ implies that this limit exists for $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$. We must have $\omega(\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon\}) < \infty$ and, by exchangeability, $\omega(\{W \in \mathcal{W}_{\mathbb{N}} : L_W(i) \geq \varepsilon\}) < \infty$ for all $i \in \mathbb{N}$, for all $\varepsilon > 0$.

By the law of total probability we have

$$\omega(\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon\}) = \sum_{k=1}^{\infty} \omega(\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon,\ \#S_W(\varepsilon) = k\})$$

and, for $k \geq 2$,

$$\omega(\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon,\ \#S_W(\varepsilon) = k\}) = \sum_{2 \leq i_1 < \cdots < i_{k-1}} \omega(\{W : L_W(1) \geq \varepsilon,\ L_W(i_j) \geq \varepsilon,\ L_W(r) < \varepsilon \text{ for } r \notin \{1, i_1, \ldots, i_{k-1}\}\}).$$
By exchangeability, each event of the form

$$\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon,\ L_W(i_j) \geq \varepsilon,\ L_W(r) < \varepsilon \text{ for } r \notin \{1, i_1, \ldots, i_{k-1}\}\}$$
has the same mass under $\omega$ and, thus, each must have mass $0$ or else $\{W \in \mathcal{W}_{\mathbb{N}} : L_W(1) \geq \varepsilon\}$ would have infinite mass under $\omega$, a contradiction. As this holds for all $\varepsilon > 0$, it follows that $L_W(i) > 0$ for exactly one $i \in \mathbb{N}$ $\omega_V$-a.e. and we can decompose

$$\omega_V = \sum_{i=1}^{\infty} \omega_V\,\mathbf{1}_{\{L_W(i) > 0,\ L_W(j) = 0 \text{ for } j \neq i\}}.$$

For each $i \in \mathbb{N}$, $\omega_V\,\mathbf{1}_{\{L_W(i) > 0,\ L_W(j) = 0 \text{ for } j \neq i\}}$ induces an exchangeable measure $\tau^{(i)}$ on $(\{0,1\} \times \{0,1\})^{\mathbb{N} \setminus \{i\}}$ by the map $W \mapsto (W^{ij})_{j \neq i}$. The lefthand side of (36) implies $\tau^{(i)}(\{((0,1), (0,1), \ldots)\}) = 0$, the righthand side of (36) implies $\tau^{(i)}(\{x \in (\{0,1\} \times \{0,1\})^{\mathbb{N}} : x^1 \neq (0, 1)\}) < \infty$, and exchangeability of $\omega$ requires that $\tau^{(i)} = \tau^{(j)}$ for all $i, j \geq 1$, and so we speak of the exchangeable measure $\tau$ on $(\{0,1\} \times \{0,1\})^{\mathbb{N}}$ for which
$$\tau(\{((0,1), (0,1), \ldots)\}) = 0 \quad\text{and}\quad \tau(\{x \in (\{0,1\} \times \{0,1\})^{\mathbb{N}} : x^1 \neq (0, 1)\}) < \infty. \qquad (50)$$
For $k, k' = 0, 1$, we define the limiting frequency of $(k, k')$ in $x$ by

$$S_{kk'}(x) = \lim_{n \to \infty} n^{-1} \sum_{j=1}^{n} \mathbf{1}\{x^j = (k, k')\}, \quad \text{if it exists.}$$

The frequencies of $x$, if they all exist, determine an element $S(x)$ of the $3$-dimensional simplex $\Delta_3$. By an analogous argument to Lemma 3, the frequencies $S(x)$ exist for $\tau$-almost every $x \in (\{0,1\} \times \{0,1\})^{\mathbb{N}}$, allowing us to write $\tau(\cdot) = \int_{\Delta_3} S^\infty(\cdot)\,\Sigma(dS)$, and therefore $\omega_V = \Omega_\Sigma$ as defined in (21), for some measure $\Sigma$ on $\Delta_3$. To see this, we let

$$\tau_n := \tau\,\mathbf{1}_{\{x : (x^1, \ldots, x^n) \neq ((0,1), \ldots, (0,1))\}}$$
be the restriction of $\tau$ to the event that the first $n$ elements of $x$ are not all $(0, 1)$. By the lefthand side of (50) and the monotone convergence theorem, $\tau_n$ increases to $\tau$ as $n \to \infty$. By exchangeability of $\tau$ and the righthand side of (50), $\tau_n$ is a finite measure for every $n \geq 1$ and the image measure $\overleftarrow{\tau}_n$ by the $n$-shift, $x \mapsto (x^{j+n})_{j \geq 1}$, is exchangeable. By de Finetti's theorem, $S(x)$ exists for $\overleftarrow{\tau}_n$-almost every $x$ and, thus, also exists for $\tau_n$-almost every $x$. The monotone convergence theorem implies that $S(x)$ exists for $\tau$-almost every $x$.

Let $|\tau|$ be the image of $\tau$ by taking asymptotic frequency, that is,
$$|\tau|(A) = \tau(\{x : S(x) \in A\}), \qquad A \subseteq \Delta_3.$$
We proceed as in Lemma 4 and define $A_n := \{x : x^{1+n} \neq (0, 1)\}$ so that the preceding argument implies
$$\tau_n(A_n) = \int_{\Delta_3} (1 - S_{01})\,\tau_n(\{S(x) \in dS\}) = \int_{\Delta_3} (1 - S_{01})\,|\tau_n|(dS).$$

By the monotone convergence theorem, $|\tau_n|$ converges to a unique measure $\Sigma := |\tau|$ on $\Delta_3$. Since $\tau$-almost every $x$ has $x \neq ((0,1), (0,1), \ldots)$, this limiting measure satisfies $|\tau|(\{S_{01} = 1\}) = 0$. Together,
$$\int_{\Delta_3} (1 - S_{01})\,|\tau_n|(dS) \uparrow \int_{\Delta_3} (1 - S_{01})\,\Sigma(dS)$$
and
$$\tau_n(A_n) \leq \tau(A_n) = \tau(\{x : x^1 \neq (0, 1)\}) < \infty$$
imply that $\Sigma$ satisfies (45).

It remains to show that $\tau(\cdot) = \int_{\Delta_3} S^\infty(\cdot)\,\Sigma(dS)$. By de Finetti's theorem, we can write
$$\overleftarrow{\tau}_n(dx) = \int_{\Delta_3} S^\infty(dx)\,|\overleftarrow{\tau}_n|(dS)$$
for each $n \in \mathbb{N}$. Exchangeability implies
$$\tau(\{(x^1, \ldots, x^n) = y,\ (x^{n+1}, \ldots, x^{n+m}) \neq ((0,1), \ldots, (0,1))\}) = \overleftarrow{\tau}_m(\{(x^1, \ldots, x^n) = y\})$$
for every $y = (y^j)_{1 \leq j \leq n} \in (\{0,1\} \times \{0,1\})^{[n]}$, $n \in \mathbb{N}$, and every $m \geq 1$. Also, for every $m \geq 1$,
$$\overleftarrow{\tau}_m(\{x : (x^1, \ldots, x^n) = y\}) = \int_{\Delta_3} \prod_{j=1}^{n} S(y^j)\,|\overleftarrow{\tau}_m|(dS),$$

which increases to $\int_{\Delta_3} \prod_{j=1}^{n} S(y^j)\,\Sigma(dS)$ as $m \to \infty$ since $|\tau_m| = |\overleftarrow{\tau}_m|$ and $|\tau_m|$ converges to $\Sigma$. Since $n$ was chosen arbitrarily and the lefthand side of (45) forbids $(x^{n+1}, x^{n+2}, \ldots) = ((0,1), (0,1), \ldots)$, we have

$$\lim_{m \uparrow \infty} \tau(\{(x^1, \ldots, x^n) = y,\ (x^{n+1}, \ldots, x^{n+m}) \neq ((0,1), \ldots, (0,1))\}) = \tau(\{(x^1, \ldots, x^n) = y\}).$$
By uniqueness of limits, $\tau$ agrees with $\int_{\Delta_3} S^\infty(\cdot)\,\Sigma(dS)$ on a generating $\pi$-system of the Borel $\sigma$-field and, therefore, the two agree on all measurable subsets of $(\{0,1\} \times \{0,1\})^{\mathbb{N}}$. The correspondence between $\omega_V$ and $\tau$ implies $\omega_V = \Omega_\Sigma$.

Proof (Case E) On the event $E$, $W^{ij} \neq (0, 1)$ occurs for a zero proportion of all pairs $ij$ and also a zero proportion of all $j \neq i$ for a fixed $i$. For $W \in \mathcal{W}_{\mathbb{N}}$, we define
$$E_W := \{ij : W^{ij} \neq (0, 1)\},$$
which must satisfy either $\#E_W = \infty$ or $\#E_W < \infty$. Since $\overleftarrow{\omega}_{E,n}$ is finite and exchangeable for every $n \geq 2$, we can write $\overleftarrow{\omega}_{E,n} \propto \theta_n$ for some exchangeable probability measure $\theta_n$ on $\mathcal{W}_{\mathbb{N}}$.

Suppose first that $\#E_W = \infty$. Then $\overleftarrow{\omega}_{E,2}$ finite implies $\overleftarrow{W}_2 \sim \theta_2$ is an exchangeable array with
$$\limsup_{n \to \infty} n^{-2} \sum_{1 \leq i,j \leq n} \mathbf{1}\{\overleftarrow{W}_2^{\,ij} \neq (0, 1)\} = 0.$$

By the Aldous–Hoover theorem, $\#E_{\overleftarrow{W}_2} = 0$ $\overleftarrow{\omega}_{E,2}$-almost everywhere, forcing all edges in $E_W$ to be in the first two rows of $W$. Furthermore, the unshifted measure $\omega_{E,2}$ is finite and invariant with respect to permutations of $\mathbb{N}$ that fix $\{1, 2\}$; whence, both sequences $(\mathbf{1}\{W^{1j} \neq (0, 1)\})_{j \geq 3}$ and $(\mathbf{1}\{W^{2j} \neq (0, 1)\})_{j \geq 3}$ are exchangeable with

$$\lim_{n \to \infty} n^{-1} \sum_{j=3}^{n+2} \mathbf{1}\{W^{1j} \neq (0, 1)\} = 0 \quad\text{and}\quad \lim_{n \to \infty} n^{-1} \sum_{j=3}^{n+2} \mathbf{1}\{W^{2j} \neq (0, 1)\} = 0.$$

It follows that $(\mathbf{1}\{W^{1j} \neq (0, 1)\})_{j \geq 3}$ and $(\mathbf{1}\{W^{2j} \neq (0, 1)\})_{j \geq 3}$ consist of all $0$s for $\omega_{E,2}$-almost every $W$, contradicting $\#E_W = \infty$.

Now suppose that $\#E_W < \infty$. Then
$$\infty > \omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1)\}) \geq \omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1),\ \#E_W = k\})$$
for every $k \geq 1$. Also, for any $k \geq 2$, $r = 0, \ldots, k-1$, and $s = 0, \ldots, (r \wedge (k-1-r))$, $\omega$ assigns equal probability to all events of the form

$$A(I, J) := \{W^{12} \neq (0, 1),\ W^{1i} \neq (0, 1) \text{ and } W^{2j} \neq (0, 1) \text{ for } i \in I,\ j \in J,\ W^{i'j'} = (0, 1) \text{ otherwise}\}$$
for $I, J \subset \mathbb{N} \setminus \{1, 2\}$ such that $\#I = r$, $\#J = k - 1 - r$, and $\#(I \cap J) = s$. Thus,
$$\omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1)\}) \geq \omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1),\ \#E_W = k\}) = \sum_{r=0}^{k-1} \sum_{s=0}^{r \wedge (k-1-r)} \sum_{\substack{I, J \subset \mathbb{N} \setminus \{1,2\}:\\ \#I = r,\ \#J = k-1-r,\ \#(I \cap J) = s}} \omega_E(A(I, J)).$$

By the righthand side of (18), the above expression must be finite, which can hold only if each of the inside sums
$$\sum_{\substack{I, J \subset \mathbb{N} \setminus \{1,2\}:\\ \#I = r,\ \#J = k-1-r,\ \#(I \cap J) = s}} \omega_E(A(I, J))$$
is finite. By exchangeability, each is a sum of infinitely many equal terms, forcing $\omega_E(A(I, J)) = 0$ for all $I, J$ with $\#I + \#J = k - 1$ for $k \geq 2$.

The above argument rules out all cases except $\#E_W = 1$, allowing us to write
$$\omega_E\,\mathbf{1}_{\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1)\}} = c^{12}_{00}\,\delta_{\rho^{12}_0} + c^{12}_{11}\,\delta_{\rho^{12}_1} + c^{12}_{10}\,\delta_{\rho^{12}_{10}},$$
where $c^{12}_{kk'} = \omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} = (k, k')\}) \geq 0$ for $(k, k') \in \{(0,0), (1,0), (1,1)\}$, $\rho^{ij}_0$ and $\rho^{ij}_1$ are the single edge update maps defined before (43), and $\rho^{ij}_{10}$ is the rewiring map $W$ with all $W^{i'j'} = (0, 1)$ except $W^{ij} = (1, 0)$. By exchangeability we must have

$$\omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{ij} \neq (0, 1)\}) = \omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} \neq (0, 1)\})$$

for all $i \neq j$ and, moreover,
$$\omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{ij} = (k, k')\}) = \omega_E(\{W \in \mathcal{W}_{\mathbb{N}} : W^{12} = (k, k')\}), \qquad k, k' = 0, 1,$$
for all $i \neq j$. Thus, we can define $c_0 := c^{12}_{00}$, $c_1 := c^{12}_{11}$, and $c_{10} := c^{12}_{10}$ so that

$$\omega_E\,\mathbf{1}_{\{W \in \mathcal{W}_{\mathbb{N}} : W^{ij} \neq (0, 1)\}} = c_0\,\delta_{\rho^{ij}_0} + c_1\,\delta_{\rho^{ij}_1} + c_{10}\,\delta_{\rho^{ij}_{10}}$$
for all $i \neq j$ and
$$\omega_E = \sum_{j > i \geq 1} \big(c_0\,\delta_{\rho^{ij}_0} + c_1\,\delta_{\rho^{ij}_1} + c_{10}\,\delta_{\rho^{ij}_{10}}\big).$$

This measure is exchangeable, assigns zero mass to $\mathrm{Id}_{\mathbb{N}}$, and satisfies the righthand side of (18). Since $\rho^{ij}_{10}$ gives $\rho^{ij}_{10}(G) = \rho^{ij}_1(G)$ whenever $G^{ij} = 0$ and $\rho^{ij}_{10}(G) = \rho^{ij}_0(G)$ when $G^{ij} = 1$, we define $e_0 = c_0 + c_{10}$ and $e_1 = c_1 + c_{10}$. The proof is complete.

Proof (of Theorem 4) By Lemma 3, $\omega$-almost every $W \in \mathcal{W}_{\mathbb{N}}$ possesses a rewiring limit $|W|$ and we can express $\omega$ as a sum of singular components:

$$\omega = \omega\,\mathbf{1}_{\{|W| \neq I\}} + \omega\,\mathbf{1}_{\{|W| = I\}}.$$

By Lemma 4, the first term corresponds to $\Omega_\Upsilon$ for some unique measure $\Upsilon$ satisfying (46). By Lemma 5, the second term corresponds to $\Omega_\Sigma + e_0\mathbf{0} + e_1\mathbf{1}$ for a unique measure $\Sigma$ on $\Delta_3$ satisfying (45) and unique constants $e_0, e_1 \geq 0$. The proof is complete.

6.4 The projection into $\mathcal{D}^*$

Let $\Gamma = \{\Gamma_G : G \in \mathcal{G}_{\mathbb{N}}\}$ be an exchangeable Feller process and, for $D \in \mathcal{D}^*$, recall the definition of $\Gamma_D = (\Gamma_t)_{t \geq 0}$ as the process obtained by first taking $\Gamma_0 \sim \gamma_D$ and then putting $\Gamma_D = \Gamma_G$ on the event $\Gamma_0 = G$. We now show that the projection $|\Gamma_D| = (|\Gamma_t|)_{t \geq 0}$ into $\mathcal{D}^*$ exists simultaneously for all $t \geq 0$ with probability 1. We also prove that these projections determine a Feller process on $\mathcal{D}^*$ by
$$|\Gamma_{\mathcal{D}^*}| = \{|\Gamma_D| : D \in \mathcal{D}^*\}.$$

Proposition 6 Let $\Gamma = \{\Gamma_G : G \in \mathcal{G}_{\mathbb{N}}\}$ be an exchangeable Feller process on $\mathcal{G}_{\mathbb{N}}$. For any $D \in \mathcal{D}^*$, let $\Gamma_D = (\Gamma_t)_{t \geq 0}$ be obtained by taking $\Gamma_0 \sim \gamma_D$ and putting $\Gamma_D = \Gamma_G$ on the event $\Gamma_0 = G$. Then the graph limit $|\Gamma_t|$ exists almost surely for every $t \geq 0$.

Proof For each $D \in \mathcal{D}^*$, this follows immediately by exchangeability of $\Gamma_0 \sim \gamma_D$, exchangeability of the transition law of $\Gamma$, and the Aldous–Hoover theorem.

Theorem 5 establishes that $|\Gamma_t|$ exists simultaneously for all $t \geq 0$.

Theorem 5 Let $\Gamma = \{\Gamma_G : G \in \mathcal{G}_{\mathbb{N}}\}$ be an exchangeable Feller process on $\mathcal{G}_{\mathbb{N}}$. Then $|\Gamma_{\mathcal{D}^*}|$ exists almost surely and is a Feller process on $\mathcal{D}^*$. Moreover, each $|\Gamma_D|$ is continuous at all $t > 0$ except possibly at the times of Type (III) discontinuities.

Recall that every $D \in \mathcal{D}^*$ corresponds to a probability measure $\gamma_D$ on $\mathcal{G}_{\mathbb{N}}$. We equip $\mathcal{D}^*$ with the metric (30), under which it is a compact space. Furthermore, any $\upsilon \in \mathcal{V}^*$ corresponds to a transition probability $P_\upsilon$ on $\mathcal{G}_{\mathbb{N}} \times \mathcal{G}_{\mathbb{N}}$ and, thus, determines a Lipschitz continuous map $\upsilon : \mathcal{D}^* \to \mathcal{D}^*$ through $D \mapsto D\upsilon$ as in (15). Consequently,

$$\|D\upsilon - D'\upsilon\| \leq \|D - D'\| \quad \text{for all } D, D' \in \mathcal{D}^* \text{ and all } \upsilon \in \mathcal{V}^*.$$

Proof The proof divides into three parts on existence, the Feller property, and discontinuities.

Proof (Existence) Proof of existence follows along the same lines as the proof of Theorem 3.6 in [6]. We include the proof here for completeness.

Fix any $D \in \mathcal{D}^*$ and let $\Gamma_{[0,1]} = (\Gamma^{ij}_{[0,1]})_{i,j \geq 1}$ denote the array of edge trajectories in $\Gamma_D$ on the interval $[0, 1]$. By the consistency assumption for $\Gamma$, each $\Gamma^{ij}_{[0,1]}$ is an alternating collection of finitely many $0$s and $1$s. Formally, each $\Gamma^{ij}_{[0,1]} = y$ is a partition of $[0, 1]$ into finitely many nonoverlapping intervals $J_l$ along with an initial status $y_0 \in \{0, 1\}$. The starting condition $y_0$ determines the entire path $y = (y_t)_{t \in [0,1]}$ because $y$ must alternate between states $0$ and $1$ in the successive subintervals $J_l$. Thus, each $y = (y_t)_{t \in [0,1]}$ lives in a closed subspace $\mathcal{I}^*$ of the Skorokhod space of càdlàg functions $[0, 1] \to \{0, 1\}$ [7, Secs. 3.5–3.6].

We can partition $\mathcal{I}^* := \bigcup_{m \in \mathbb{N}} \mathcal{I}^*_m$, where $\mathcal{I}^*_m$ is the closure of
$$\{y \in \mathcal{I}^* : y \text{ has exactly } m \text{ subintervals}\}.$$

Consequently, $\mathcal{I}^*$ is complete, separable, and Polish. The fact that $\mathcal{I}^*$ is Polish allows us to apply the Aldous–Hoover theorem to study the $\mathcal{I}^*$-valued array $\Gamma_{[0,1]}$.

Since the initial distribution of $\Gamma_D$ is exchangeable, the entire path $(\Gamma_t)_{t \geq 0}$ satisfies
$$(\Gamma^\sigma_t)_{t \geq 0} =_{\mathcal{L}} (\Gamma_t)_{t \geq 0} \quad \text{for all permutations } \sigma : \mathbb{N} \to \mathbb{N}.$$
In particular, the array $\Gamma_{[0,1]}$ of edge trajectories is weakly exchangeable. By the Aldous–Hoover theorem, we can express $\Gamma_{[0,1]}$ as a mixture of exchangeable, dissociated $\mathcal{I}^*$-valued arrays. It suffices to prove existence of $|Y| = (|Y_t|)_{t \in [0,1]}$ for exchangeable, dissociated $\mathcal{I}^*$-valued arrays $Y = (Y^{ij})_{i,j \geq 1}$.

To this end, we treat $Y$ just as the edge trajectories of a graph-valued process on $[0, 1]$, so $Y^{ij}_t$ denotes the status of edge $ij$ at time $t \in [0, 1]$ and $\delta(F, Y_t)$ is the limiting density of any finite subgraph $F \in \mathcal{G}^*$ in $Y_t$. To show that $(|Y_t|)_{t \in [0,1]}$ exists simultaneously at all $t \in [0, 1]$, we show that the upper and lower limits of $(\delta(F, Y^{[n]}_t))_{n \in \mathbb{N}}$ coincide for all $t \in [0, 1]$ simultaneously. In particular, for fixed $F \in \mathcal{G}_{[m]}$, $m \in \mathbb{N}$, and $t \in [0, 1]$ we write
$$\delta^+_t(F) := \limsup_{n \to \infty} \delta(F, Y^{[n]}_t) \quad\text{and}\quad \delta^-_t(F) := \liminf_{n \to \infty} \delta(F, Y^{[n]}_t).$$
We already know from Proposition 6 that $P\{\delta^+_t(F) = \delta^-_t(F)\} = 1$ for all fixed $t \in [0, 1]$, but we wish to show that $\delta^+_t(F) = \delta^-_t(F)$ almost surely for all $t \in [0, 1]$, that is,
$$P\{\sup_{t \in [0,1]} |\delta^+_t(F) - \delta^-_t(F)| = 0\} = 1.$$
Each path $Y^{ij} = (Y^{ij}_t)_{t \in [0,1]}$ has finitely many discontinuities almost surely and so must $Y^{[n]}_{[0,1]} := (Y^{ij}_{[0,1]})_{1 \leq i,j \leq n}$, for every $n \in \mathbb{N}$. For every $\varepsilon > 0$, there is a finite subset $S_\varepsilon \subset [0, 1]$ and an at most countable partition $J_1, J_2, \ldots$ of the open set $[0, 1] \setminus S_\varepsilon$ such that
$$P\{Y^{ij}_{[0,1]} \text{ is discontinuous at } s\} \geq \varepsilon \text{ for every } s \in S_\varepsilon \quad\text{and}\quad P\{Y^{ij}_{[0,1]} \text{ is discontinuous in } J_l\} < \varepsilon, \quad l = 1, 2, \ldots.$$

(Without such a partition, there must be a sequence of intervals $(t - \rho_n, t + \rho_n)$ with $\rho_n \downarrow 0$ that converges to $t \notin S_\varepsilon$ such that
$$P\{Y^{ij}_{[0,1]} \text{ is discontinuous in } (t - \rho_n, t + \rho_n)\} \geq \varepsilon \quad \text{for every } n \geq 1.$$
Continuity from above would then imply

$$P\{Y^{ij}_{[0,1]} \text{ is discontinuous at } t \notin S_\varepsilon\} \geq \varepsilon > 0,$$
contradicting the assumption $t \notin S_\varepsilon$.)

By ergodicity of the action of $S_{\mathbb{N}}$ for dissociated arrays,

$$P\{Y^{12}_{[0,1]} \text{ is discontinuous in } J_l\} = \lim_{n \to \infty} \frac{2}{n(n-1)} \sum_{1 \leq i < j \leq n} \mathbf{1}\{Y^{ij}_{[0,1]} \text{ is discontinuous in } J_l\} < \varepsilon \quad \text{a.s.},$$
and, therefore,

$$P\{\sup_{t \in J_l} |\delta^+_t(F) - \delta^-_t(F)| \leq 2\varepsilon V(F)^2\} = 1$$
for all $l = 1, 2, \ldots$ and all $\varepsilon > 0$. Since $[0, 1]$ is covered by at most countably many subintervals $J_l$, $l = 1, 2, \ldots$, and the nonrandom set $S := \bigcup_{\varepsilon > 0} S_\varepsilon$, it follows that

$$P\{\sup_{t \in [0,1]} |\delta^+_t(F) - \delta^-_t(F)| \leq 2\varepsilon V(F)^2\} = 1 \quad \text{for all } \varepsilon > 0, \text{ for every } F \in \mathcal{G}^*.$$
Continuity from above implies that

$$P\{\sup_{t \in [0,1]} |\delta^+_t(F) - \delta^-_t(F)| = 0\} = \lim_{\varepsilon \downarrow 0} P\{\sup_{t \in [0,1]} |\delta^+_t(F) - \delta^-_t(F)| \leq 2\varepsilon V(F)^2\} = 1;$$
thus, $P\{\delta(F, Y_t) \text{ exists at all } t \in [0, 1]\} = 1$ for every $F \in \bigcup_{m \in \mathbb{N}} \mathcal{G}_{[m]}$. Countable additivity of probability measures implies that this limit exists almost surely for all $F \in \bigcup_{m \in \mathbb{N}} \mathcal{G}_{[m]}$ and, thus, $Y_{[0,1]}$ determines a process on $\mathcal{D}^*$ almost surely. The fact that these limits are deterministic follows from the 0-1 law and our assumption that $Y$ is dissociated. By the Aldous–Hoover theorem, $\Gamma_D$ is a mixture of exchangeable, dissociated processes, so we conclude that $|\Gamma_D|$ exists almost surely for every $D \in \mathcal{D}^*$.

Proof (Feller property) By Theorem 3 and Corollary 2, we can assume $\Gamma = \{\Gamma_G : G \in \mathcal{G}_{\mathbb{N}}\}$ is constructed by $\Gamma_G = (W^*_t(G))_{t \geq 0}$ for each $G \in \mathcal{G}_{\mathbb{N}}$, where $W^*_\omega = (W^*_t)_{t \geq 0}$ is a standard $\omega$-process for some exchangeable measure $\omega$ satisfying (18). The standard $\omega$-process has the Feller property and, by Theorem 2,

$$|\Gamma_t| = |W^*_t(\Gamma_0)| =_{\mathcal{L}} |\Gamma_0|\,|W^*_t| \quad \text{a.s. for every } t \geq 0.$$

Existence of $|W^*_t|$ at fixed times follows by exchangeability and analogy to Proposition 6. In fact, $|W^*_\omega| = (|W^*_t|)_{t \geq 0}$ exists for all $t \geq 0$ simultaneously by analogy to the above argument for existence of $|\Gamma_D|$, because $\mathrm{Id}_{\mathbb{N}}$ is an exchangeable initial state with rewiring limit $I$. The Markov property of $|\Gamma_{\mathcal{D}^*}|$ follows immediately.

Let $(P_t)_{t \geq 0}$ be the semigroup of $|\Gamma_{\mathcal{D}^*}|$, that is, for every continuous function $g : \mathcal{D}^* \to \mathbb{R}$,
$$P_t g(D) = \mathbb{E}\big(g(|\Gamma_t|) \mid |\Gamma_0| = D\big).$$
To establish the Feller property, we must show that for every continuous $g : \mathcal{D}^* \to \mathbb{R}$

(i) $D \mapsto P_t g(D)$ is continuous for all $t > 0$ and
(ii) $\lim_{t \downarrow 0} P_t g(D) = g(D)$ for all $D \in \mathcal{D}^*$.

For (i), we take any continuous function $h : \mathcal{D}^* \to \mathbb{R}$. By compactness of $\mathcal{D}^*$, $h$ is uniformly continuous and, hence, bounded. By the dominated convergence theorem and Lipschitz continuity of $|W_t| : \mathcal{D}^* \to \mathcal{D}^*$, $D \mapsto P_t h(D)$ is continuous for all $t > 0$.

Part (ii) follows from exchangeability of $W^*_\omega$ and the Feller property of $\Gamma$. To see this explicitly, we show the equivalent condition

$$\lim_{t \downarrow 0} P\{\||W^*_t| - I\| > \varepsilon\} = 0 \quad \text{for all } \varepsilon > 0.$$
For contradiction, we assume

$$\limsup_{t \downarrow 0} P\{\||W^*_t| - I\| > \varepsilon\} > 0 \quad \text{for some } \varepsilon > 0.$$

By definition of the metric (30) on $\mathcal{D}^*$,
$$\||W^*_t| - I\| = \sum_{n \in \mathbb{N}} 2^{-n} \sum_{V \in \mathcal{W}_{[n]}} \big||W^*_t|(V) - I(V)\big|,$$

and so there must be some $n \geq 2$, some $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$, and $\varepsilon, \varrho > 0$ such that
$$\limsup_{t \downarrow 0} P\{\delta(V, W^*_t) > \varepsilon\} \geq \varrho. \qquad (51)$$
Given such a $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$, we can choose $F \in \mathcal{G}_{[n]}$ such that $V(F) \neq F$ and we can define
$$\psi_F(\cdot) := \mathbf{1}\{\cdot|_{[n]} \neq F\}$$
so that $P_t\psi_F(G) = P\{\Gamma^{[n]}_t \neq F \mid \Gamma_0 = G\}$. We choose any $G = F^* \in \mathcal{G}_{\mathbb{N}}$ such that $F^*|_{[n]} = F$. In this way,
$$P_t\psi_F(F^*) = P\{\Gamma^{[n]}_t \neq F \mid \Gamma_0 = F^*\} \leq P\{\Gamma^{[n]}_{F^*} \text{ discontinuous on } [0, t]\}$$
and

$$P\{\Gamma^{[n]}_t \neq F \mid \Gamma_0 = F^*\} \geq P\{W^{*[n]}_t = V\} = \mathbb{E}\,P\{W^{*[n]}_t = V \mid |W^*_t|\} = \mathbb{E}\,\delta(V, W^*_t).$$
Now,

$$P\{\Gamma^{[n]}_{F^*} \text{ discontinuous on } [0, t]\} \leq 1 - \exp\{-t\,\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\})\}$$
implies

$$\limsup_{t \downarrow 0} P\{\Gamma^{[n]}_{F^*} \text{ discontinuous on } [0, t]\} \leq \limsup_{t \downarrow 0}\big(1 - \exp\{-t\,\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\})\}\big) = 0,$$

because $\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[n]} \neq \mathrm{Id}_{[n]}\}) < \infty$ for all $n \geq 2$ by the righthand side of (18). On the other hand, Markov's inequality implies

$$\mathbb{E}\,\delta(V, W^*_t) \geq \varepsilon\,P\{\delta(V, W^*_t) > \varepsilon\},$$
and (51) gives

$$\limsup_{t \downarrow 0} \mathbb{E}\,\delta(V, W^*_t) \geq \varepsilon \limsup_{t \downarrow 0} P\{\delta(V, W^*_t) > \varepsilon\} \geq \varepsilon\varrho > 0.$$
Thus, the Feller property of $\Gamma$ and the hypothesis (51) lead to contradicting statements

$$\limsup_{t \downarrow 0} P_t\psi_F(F^*) \geq \varepsilon\varrho > 0 \quad\text{and}\quad \limsup_{t \downarrow 0} P_t\psi_F(F^*) \leq 0.$$
We conclude that

$$\limsup_{t \downarrow 0} P\{\delta(V, W^*_t) > \varepsilon\} < \varrho$$
for all $V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$, $n \in \mathbb{N}$, and all $\varrho, \varepsilon > 0$.

There are $4^{\binom{n}{2}} - 1 < 4^{\binom{n}{2}}$ elements in $\mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}$ for each $n \geq 2$. Thus,
$$P\{\delta(\mathrm{Id}_{[n]}, W^*_t) < 1 - \varepsilon\} \leq P\Big(\bigcup_{V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}} \{\delta(V, W^*_t) > \varepsilon 4^{-\binom{n}{2}}\}\Big),$$
and we have

$$\begin{aligned}
\limsup_{t \downarrow 0} P\{\delta(\mathrm{Id}_{[n]}, W^*_t) < 1 - \varepsilon\} &\leq \limsup_{t \downarrow 0} \sum_{V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}} P\{\delta(V, W^*_t) > \varepsilon 4^{-\binom{n}{2}}\} \\
&\leq \sum_{V \in \mathcal{W}_{[n]} \setminus \{\mathrm{Id}_{[n]}\}} \limsup_{t \downarrow 0} P\{\delta(V, W^*_t) > \varepsilon 4^{-\binom{n}{2}}\} \\
&= 0.
\end{aligned}$$

It follows that

$$\limsup_{t \downarrow 0} P\{|\delta(V, W^*_t) - \delta(V, \mathrm{Id}_{\mathbb{N}})| > \varepsilon\} = 0 \quad \text{for all } \varepsilon > 0,$$

for all $V \in \mathcal{W}_{[n]}$, for all $n \in \mathbb{N}$. Now, for any $\varepsilon > 0$, the event $\{\||W^*_t| - I\| > \varepsilon\}$ implies
$$\||W^*_t| - I\| = \sum_{n \in \mathbb{N}} 2^{-n} \sum_{V \in \mathcal{W}_{[n]}} \big||W^*_t|(V) - I(V)\big| > \varepsilon.$$

Since $\sum_{V \in \mathcal{W}_{[n]}} \big||W^*_t|(V) - I(V)\big| \leq 2$ for every $n \geq 1$, we observe that
$$\sum_{n \geq \lceil 1 - \log_2(\varepsilon) \rceil} 2^{-n} \sum_{V \in \mathcal{W}_{[n]}} \big||W^*_t|(V) - I(V)\big| \leq 2 \sum_{n \geq \lceil 1 - \log_2(\varepsilon) \rceil} 2^{-n} = 2^{\lfloor \log_2(\varepsilon) \rfloor} \leq \varepsilon.$$
Writing $m_\varepsilon = \lceil 2 - \log_2(\varepsilon) \rceil$, we must have
$$|\delta(V, W^*_t) - \delta(V, \mathrm{Id}_{\mathbb{N}})| > (\varepsilon - 2^{-m_\varepsilon})\,4^{-\binom{m_\varepsilon + 1}{2}} > 0$$
for some $V \in \bigcup_{n \leq m_\varepsilon} \mathcal{W}_{[n]}$ on the event $\{\||W^*_t| - I\| > \varepsilon\}$. Writing $\varrho := (\varepsilon - 2^{-m_\varepsilon})\,4^{-\binom{m_\varepsilon + 1}{2}} > 0$, we now conclude that
$$\begin{aligned}
\limsup_{t \downarrow 0} P\{\||W^*_t| - I\| > \varepsilon\} &\leq \limsup_{t \downarrow 0} P\Big(\bigcup_{n \leq m_\varepsilon} \bigcup_{V \in \mathcal{W}_{[n]}} \{|\delta(V, W^*_t) - \delta(V, \mathrm{Id}_{\mathbb{N}})| > \varrho\}\Big) \\
&\leq \limsup_{t \downarrow 0} \sum_{n \leq m_\varepsilon} \sum_{V \in \mathcal{W}_{[n]}} P\{|\delta(V, W^*_t) - \delta(V, \mathrm{Id}_{\mathbb{N}})| > \varrho\} \\
&\leq \sum_{n \leq m_\varepsilon} \sum_{V \in \mathcal{W}_{[n]}} \limsup_{t \downarrow 0} P\{|\delta(V, W^*_t) - \delta(V, \mathrm{Id}_{\mathbb{N}})| > \varrho\} \\
&= 0,
\end{aligned}$$
where the interchange of sum and limit superior is permitted because the sum is finite. We conclude that $|W^*_t| \to_P I$ as $t \downarrow 0$ and $|W^*_t(\Gamma_0)| =_{\mathcal{L}} |\Gamma_0|\,|W^*_t| \to_P |\Gamma_0|$ as $t \downarrow 0$. The Feller property follows.

Proof (Discontinuities) Let $\Gamma_D = (\Gamma_t)_{t \geq 0}$ have exchangeable initial distribution $\gamma_D$. Then $(|\Gamma_t|)_{t \geq 0}$ exists almost surely and has the Feller property. By the Feller property, $|\Gamma_D|$ has a version with càdlàg paths. For the càdlàg version, $\lim_{s \uparrow t} |\Gamma_s|$ exists for all $t > 0$ with probability 1. Although the map $|\cdot| : \mathcal{G}_{\mathbb{N}} \to \mathcal{D}^*$ is not continuous, we also have $\lim_{s \uparrow t} |\Gamma_s| = |\Gamma_{t-}|$, as we now show.

Exchangeability of the initial distribution $\gamma_D$ as well as the transition probability implies $\Gamma^\sigma_D =_{\mathcal{L}} \Gamma_D$ for all $\sigma \in S_{\mathbb{N}}$; therefore, $\Gamma_{t-}$ is exchangeable for all $t > 0$. By the Aldous–Hoover theorem (Theorem 6), $|\Gamma_{t-}|$ exists a.s. for all $t > 0$.

Now, suppose $t > 0$ is a discontinuity time of $\Gamma_D$. If $\lim_{s \uparrow t} |\Gamma_s| \neq |\Gamma_{t-}|$, then there exists $\varepsilon > 0$ such that

$$\Big\|\lim_{s \uparrow t} |\Gamma_s| - |\Gamma_{t-}|\Big\| > \varepsilon.$$

In particular, there exists $m \in \mathbb{N}$ and $F \in \mathcal{G}_{[m]}$ such that

$$\lim_{s \uparrow t} |\delta(F, \Gamma_s) - \delta(F, \Gamma_{t-})| > \varepsilon.$$
By definition,
$$\delta(F, \Gamma_s) = \lim_{n \to \infty} \frac{1}{n^{\downarrow m}} \sum_{\phi : [m] \to [n]} \mathbf{1}\{\Gamma^\phi_s = F\}.$$

Since the above limits exist with probability 1, we can replace limits with limits inferior to obtain
$$\begin{aligned}
0 &\leq \liminf_{s \uparrow t} \liminf_{n \to \infty} \frac{1}{n^{\downarrow m}} \Big| \sum_{\phi : [m] \to [n]} \big(\mathbf{1}\{\Gamma^\phi_s = F\} - \mathbf{1}\{\Gamma^\phi_{t-} = F\}\big) \Big| \\
&\leq \liminf_{s \uparrow t} \liminf_{n \to \infty} \frac{1}{n^{\downarrow m}} \sum_{\phi : [m] \to [n]} \big|\mathbf{1}\{\Gamma^\phi_s = F\} - \mathbf{1}\{\Gamma^\phi_{t-} = F\}\big| \\
&\leq \liminf_{s \uparrow t} \liminf_{n \to \infty} \frac{2}{n^{\downarrow m}} \sum_{\phi : [m] \to [n]} \mathbf{1}\{\Gamma^\phi_D \text{ discontinuous on } [s, t)\}.
\end{aligned}$$
By the bounded convergence theorem, Fatou's lemma, exchangeability and consistency of $\Gamma$, and the construction of $\Gamma$ from a standard $\omega$-process, we have
$$\begin{aligned}
\mathbb{E}\Big(\liminf_{s \uparrow t} \liminf_{n \to \infty} \frac{1}{n^{\downarrow m}} \sum_{\phi : [m] \to [n]} \mathbf{1}\{\Gamma^\phi_D \text{ discontinuous on } [s, t)\}\Big) &\leq \liminf_{s \uparrow t} \liminf_{n \to \infty} \frac{1}{n^{\downarrow m}} \sum_{\phi : [m] \to [n]} \mathbb{E}\big[\mathbf{1}\{\Gamma^\phi_D \text{ discontinuous on } [s, t)\}\big] \\
&\leq \liminf_{s \uparrow t} \liminf_{n \to \infty} \big(1 - \exp\{-(t - s)\,\omega(\{W \in \mathcal{W}_{\mathbb{N}} : W|_{[m]} \neq \mathrm{Id}_{[m]}\})\}\big) \\
&= 0.
\end{aligned}$$

By Markov's inequality, we must have
$$P\Big\{\lim_{s \uparrow t} |\delta(F, \Gamma_s) - \delta(F, \Gamma_{t-})| > \varepsilon\Big\} = 0 \quad \text{for all } \varepsilon > 0 \text{ and all } F \in \mathcal{G}^*.$$
It follows that

$$P\Big\{\Big\|\lim_{s \uparrow t} |\Gamma_s| - |\Gamma_{t-}|\Big\| > \varepsilon\Big\} = 0 \quad \text{for all } \varepsilon > 0.$$
To see that the discontinuities in $|\Gamma|$ occur only at the times of Type (III) discontinuities, suppose $s > 0$ is a discontinuity time for $|\Gamma_D|$. The càdlàg property of $|\Gamma_D|$, along with the fact that $\lim_{u \uparrow s} |\Gamma_u| = |\Gamma_{s-}|$ a.s., implies that $|\delta(F, \Gamma_{s-}) - \delta(F, \Gamma_s)| > \varepsilon$ for some $F \in \bigcup_{m \in \mathbb{N}} \mathcal{G}_{[m]}$ and some $\varepsilon > 0$. By the strong law of large numbers,

$$\lim_{n \to \infty} \frac{2}{n(n-1)} \sum_{1 \leq i < j \leq n} \mathbf{1}\{\Gamma^{ij}_{s-} \neq \Gamma^{ij}_s\} > 0,$$
that is, a positive limiting proportion of edges changes status at time $s$. If $s$ were the time of a Type (I) discontinuity, then only finitely many edges would change status at time $s$ and the limit above would equal $0$, a contradiction. If $s$ were the time of a Type (II) discontinuity, there would be a single vertex $i^*$ such that only edges involving $i^*$ change status at time $s$, while all edges not

involving $i^*$ remain constant. If $s$ were the time of a Type (II) discontinuity, then
$$\lim_{n \to \infty} \frac{2}{n(n-1)} \sum_{1 \leq i < j \leq n} \mathbf{1}\{\Gamma^{ij}_{s-} \neq \Gamma^{ij}_s\} \leq \lim_{n \to \infty} \frac{2}{n(n-1)} \cdot n = 0,$$
again a contradiction. Therefore, $s$ must be the time of a Type (III) discontinuity, and the proof is complete.

Example 5 Theorem 5 establishes that every discontinuity of $|\Gamma_D|$ corresponds to a Type (III) discontinuity in the graph-valued process $\Gamma$. It does not say that all Type (III) discontinuities cause discontinuous behavior in $|\Gamma_D|$. Consider the following process $\Gamma_D$ that evolves purely by Type (III) discontinuities but for which $|\Gamma_D|$ is almost surely continuous.

Fix $p \in [0, 1]$ and let $D \in \mathcal{D}^*$ be the graph limit of an Erdős–Rényi random graph with parameter $p$, that is, $D = |\Gamma|$ for $\Gamma \sim \varepsilon_p$. Let $\omega$ be the probability measure on $\mathcal{W}_{\mathbb{N}}$ for which the entries of $W \sim \omega$ are i.i.d. with
$$P\{W^{ij} = (k, k')\} = p^k(1-p)^{1-k}\,p^{k'}(1-p)^{1-k'}, \qquad k, k' = 0, 1,\ j > i \geq 1.$$

Let $\Gamma =_{\mathcal{L}} \Gamma^*_\omega$, where $\Gamma^*_\omega$ is the process constructed from the Poisson point process with intensity $dt \otimes \omega$. The process $\Gamma_D$ is obtained from $\Gamma$ by first drawing $\Gamma_0 \sim \gamma_D$ and then putting $\Gamma_D = \Gamma_G$ on the event $\Gamma_0 = G$. This process experiences a discontinuity according to the arrival times of a rate $1$ Poisson process. At the time of each arrival, the process jumps to a new state chosen independently according to $\varepsilon_p$; thus, the process $\Gamma_D = (\Gamma_t)_{t \geq 0}$ experiences only jumps of Type (III) but has $|\Gamma_t| = D$ for all $t > 0$ with probability 1.
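A finite-vertex sketch of Example 5 (Python with numpy; an illustration under the obvious restriction to $[n]$, not part of the paper): every jump resamples the whole adjacency matrix i.i.d. Bernoulli($p$), so the trajectory moves by global, Type (III) jumps while the empirical edge density, the finite proxy for the graph limit, stays near $p$.

import numpy as np

def example5_densities(n=300, p=0.3, T=5.0, seed=2):
    rng = np.random.default_rng(seed)
    iu = np.triu_indices(n, k=1)
    def fresh_graph():
        # an Erdos-Renyi(p) graph on [n]
        G = np.zeros((n, n), dtype=int)
        G[iu] = (rng.random(len(iu[0])) < p).astype(int)
        return G + G.T
    t, G, densities = 0.0, fresh_graph(), []
    while True:
        t += rng.exponential(1.0)        # arrival times of a rate-1 Poisson process
        if t > T:
            return densities             # each entry is close to p for large n
        G = fresh_graph()                # a Type (III) jump: every edge is resampled
        densities.append(G[iu].mean())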

7 Concluding remarks

The main theorems characterize the behavior of exchangeable Feller processes on the space of countable undirected graphs. Our arguments rely mostly on the Aldous–Hoover theorem (Theorem 6) and its extension to conditionally exchangeable random graphs (Theorem 7) and, therefore, our main theorems extend to exchangeable Feller processes on countable $d$-dimensional arrays taking values in any finite space. Our main theorems can also be extended to cover processes in the space of directed graphs, multigraphs, and hypergraphs. In these cases, the analogs to Theorems 1–5 can be deduced by similar arguments, so we omit them.

References

1. D. Aldous. Representations for partially exchangeable arrays of random variables. Journal of Multivariate Analysis, 11:581–598, 1981.
2. D. J. Aldous. Exchangeability and related topics. In École d'été de probabilités de Saint-Flour, XIII—1983, volume 1117 of Lecture Notes in Mathematics, pages 1–198. Springer, Berlin, 1985.
3. J. Bertoin. Lévy Processes. Cambridge University Press, Cambridge, 1996.

4. J. Bertoin. Random Fragmentation and Coagulation Processes, volume 102 of Cambridge Studies in Advanced Mathematics. Cambridge University Press, Cambridge, 2006.
5. H. Crane. The cut-and-paste process. The Annals of Probability, 42(5):1952–1979, 2014.
6. H. Crane. Dynamic random networks and their graph limits. Annals of Applied Probability, in press, 2016.
7. S. Ethier and T. Kurtz. Markov Processes: Characterization and Convergence. Wiley, 2005.
8. D. Hoover. Relations on probability spaces and arrays of random variables. Preprint, Institute for Advanced Study, Princeton, NJ, 1979.
9. O. Kallenberg. Random Measures. Academic Press, New York, 1986.
10. O. Kallenberg. Probabilistic Symmetries and Invariance Principles. Probability and Its Applications. Springer, 2005.
11. L. Lovász. Large Networks and Graph Limits, volume 60 of AMS Colloquium Publications. American Mathematical Society, Providence, RI, 2012.
12. L. Lovász and B. Szegedy. Limits of dense graph sequences. Journal of Combinatorial Theory, Series B, 96:933–957, 2006.
13. J. Pitman. Combinatorial Stochastic Processes, volume 1875 of Lecture Notes in Mathematics. Springer-Verlag, Berlin, 2006. Lectures from the 32nd Summer School on Probability Theory held in Saint-Flour, July 7–24, 2002, with a foreword by Jean Picard.
14. R. Rado. Universal graphs and universal functions. Acta Arithmetica, 9:331–340, 1964.