arXiv:1811.02901v1 [math.PR] 7 Nov 2018

SPATIAL AND TEMPORAL WHITE NOISES UNDER SUBLINEAR G-EXPECTATION

JI XIAOJUN AND PENG SHIGE

Abstract. In the framework of sublinear expectation, we have introduced a new type of G-Gaussian random fields, which contain a type of spatial white noise as a special case. Based on this result, we have also introduced a new type of spatial-temporal G-white noise. Different from the case of linear expectation, in which the probability measure needs to be known, under the uncertainty of probability measures the spatial white noises are intrinsically different from the temporal one.

Key words: sublinear expectation, G-Brownian motion, G-Gaussian random field, G-white noise, spatial and temporal white noise.

1. Introduction

The axiomatic formulation of a probability space (Ω, F, P) is a powerful and elegant mathematical framework for quantitative studies of uncertainties. A typical example is to introduce a Wiener probability measure P on the space Ω = C([0, ∞); R^d) of d-dimensional continuous paths, with F = B(Ω). The canonical process B_t(ω) = ω_t, t ≥ 0, then becomes a standard Brownian motion: under the Wiener probability measure P, B is a Gaussian process, any random vector of the form (B_{t_1}, ..., B_{t_n}) is normally distributed, and all increments B_{t+s} − B_t are independent of (B_{t_1}, ..., B_{t_n}) for t_1, ..., t_n ≤ t.

But in most cases of our real world, the probability measure itself is essentially unknown. This type of higher-level uncertainty can be described by a family of probability measures {P_θ}_{θ∈Θ} such that we are unable to know the "true one". In this case the notion of sublinear expectation

E[X] = sup_{θ∈Θ} ∫_Ω X dP_θ

is often used to obtain a robust expectation, in which the probability measure P is replaced by the corresponding sublinear expectation E. This type of formulation induces a new type of coherent risk measures, which has been widely investigated; see, e.g., Huber [9](1981), Walley [18](1991), Artzner et al. [1](1999), Delbaen [4](2002) and Föllmer & Schied [8](2002), as well as the corresponding representation of risk measures.

A very interesting question is whether we can follow the idea of Kolmogorov-Wiener and use the above mentioned idea to construct a basic nonlinear expectation on the space Ω of continuous paths under which the same canonical process B becomes a "nonlinear" Brownian motion. Peng [12](2007) introduced a notion of sublinear expectation, called G-expectation, such that the canonical process B becomes a G-Brownian motion, which is still the same canonical process. Under the G-expectation, B becomes a continuous stochastic process whose increments are stable and independent. It turns out to be a typical model of stochastic processes for which the uncertainty of probabilities is dynamically and intrinsically taken into account.
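The robust representation E[X] = sup_{θ∈Θ} ∫ X dP_θ can be made concrete on a toy finite model. The following sketch is our illustration, not from the paper: the three-point sample space and the family of measures are hypothetical, chosen only to exhibit the sublinearity properties such a worst-case expectation inherits.

```python
# A sublinear expectation as a supremum of linear expectations over a family
# of probability measures, E[X] = sup_theta int X dP_theta.
# The sample space and the measures below are hypothetical illustration data.

OMEGA = [0, 1, 2]                      # a three-point sample space

MEASURES = [                           # a family {P_theta}: each row sums to 1
    [0.2, 0.3, 0.5],
    [0.5, 0.3, 0.2],
    [1/3, 1/3, 1/3],
]

def E(X):
    """Robust (sublinear) expectation: the largest linear expectation."""
    return max(sum(P[w] * X[w] for w in OMEGA) for P in MEASURES)

X = [1.0, -2.0, 3.0]
Y = [0.5, 1.0, -1.0]

# Sub-additivity: E[X + Y] <= E[X] + E[Y].
assert E([x + y for x, y in zip(X, Y)]) <= E(X) + E(Y) + 1e-12
# Positive homogeneity: E[2X] = 2 E[X].
assert abs(E([2 * x for x in X]) - 2 * E(X)) < 1e-12
# Constant preserving: E[c] = c.
assert E([5.0, 5.0, 5.0]) == 5.0
# The induced interval of uncertainty: -E[-X] <= E[X].
assert -E([-x for x in X]) <= E(X)
print(E(X), -E([-x for x in X]))   # upper and lower expectations of X
```

Monotonicity and constant preservation hold automatically because each P_θ is a probability measure; sub-additivity and positive homogeneity come from the supremum.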
We can also use this G-Brownian motion to generate a very rich class of stochastic processes, such as G-OU processes and geometric G-Brownian motions, through the corresponding stochastic differential equations driven by this G-Brownian motion, thanks to a generalized Itô calculus. But it is worth pointing out that, for each t > 0, B_t is not symmetric with respect to B_{2t} − B_t at the level of distributions. We know that B_{2t} − B_t is independent from B_t, but in general B_t is not independent from B_{2t} − B_t. In fact, we can check that, although B_{2t} − B_t and B_t are both G-normally distributed, the 2-dimensional random vector (B_{2t} − B_t, B_t) is not G-normal. In general, that Y is independent from X does not imply that X is independent from Y. In fact, the notion of independence becomes a typically temporal relation. For example, a random sequence {X_i}_{i=1}^∞ being an i.i.d. sequence under a nonlinear expectation is mainly for the case where i ∈ N is a time index, namely {X_i}_{i=1}^∞ is a time series. How to introduce space-indexed random fields becomes an interesting and challenging problem.

In this paper, we introduce a new type of spatial-temporal white noise in the presence of probability model uncertainty, which is a natural generalization of the classical spatial white noise (see, among others, Walsh [19], Dalang [2], Da Prato and Zabczyk [3]). But we emphasize that, in the framework of nonlinear expectation, the space-indexed increments do not satisfy the property of independence. Once the nonlinear G-expectation is reduced to its special situation of linear expectation, the property of independence for the space-indexed part turns out to be true, thanks to the Fubini lemma for the linear expectation. The Gaussian property for the time-indexed part also turns out to be true, thanks to the important property of (linear) normal distributions that independent Gaussian random variables constitute a Gaussian random vector.
The objective of this paper is to construct a white noise of space-indexed, as well as space-time-indexed, type. We will see that such a space-indexed white noise is a special type of Gaussian random field under probability uncertainty, satisfying distributional invariance under rotations and translations (see Proposition 24). An important and interesting point observed in Peng [17] is that a G-Brownian motion (B_t)_{t≥0} is very efficient to quantify time-indexed random fields, but it is no longer suitable to describe space-indexed uncertainties. A new notion of Gaussian process was then introduced in Peng [17]. Our main point of view is that, under a sublinear expectation, a framework of G-Gaussian random fields is suitable for describing spatial random fields. In this paper we will systematically develop a new type of spatial Gaussian random fields under a sublinear expectation space. It is remarkable that a finite dimensional distribution of (B_{t_1}, ..., B_{t_n}) is no longer G-normally distributed.

In much of the literature, the index t in a Brownian motion B_t is also treated as a spatial parameter, i.e., t = x ∈ R. This method is widely used in the study of space-time indexed random fields. In fact, this can be justified by the Fubini theorem, in which time and space are exchangeable. However, the Fubini theorem does not hold in the nonlinear expectation framework. So we need to establish a new theoretical framework in which space and time are treated separately in order to obtain spatial-temporal random fields, and then develop a new space-time indexed white noise theory.

We hope that this new framework of spatial-temporal random fields is useful in quantitative financial risk measures, complex random networks, stochastic partial differential equations, non-equilibrium statistical physics, stochastic control systems and games, large scale robust economics, and spatial and temporal indexed data.
With the quickly increasing complexity of the internet, its uncertainty in probability measures as a space indexed random field is also increasing. There is an urgent need for an exact theoretical framework of nonlinear expectation in this field. We establish the general random field under the nonlinear expectation framework. In particular, we construct spatial random fields and spatial-temporal white noises, and then the related stochastic calculus theory. If the nonlinear expectation degenerates to a linear expectation, the corresponding random field theory and space-time indexed white noise coincide with the classical results.

This paper is organized as follows. In Section 2, we review some basic notions and results of nonlinear expectation theory: the notion of distributions and independence of random vectors, the notion of G-normal distribution, G-Brownian motion and their basic properties. In Section 3, we first provide a generalized Kolmogorov existence theorem based on a consistently defined family of finite dimensional distributions in the framework of nonlinear expectations. The existence and consistency of a type of G-Gaussian random fields is also provided. In Section 4, we construct the spatial white noise and the corresponding stochastic integral. The existence and stochastic integral of space-time-indexed white noise are discussed in Section 5. We are aware of deep research results of Föllmer [6] and Föllmer & Klüppelberg [7](2014) on spatial risk measures and will explore the relations between the two seemingly very different approaches.

2. Preliminaries

In this section, we present some basic notions and properties of nonlinear expectations. More details can be found in [5], [12], and [15].

2.1. Basic notions of nonlinear expectations. Let Ω be a given nonempty set and let H be a linear space of real-valued functions on Ω such that if X_1, ..., X_d ∈ H, then ϕ(X_1, X_2, ..., X_d) ∈ H for each ϕ ∈ C_{l.Lip}(R^d), where C_{l.Lip}(R^d) denotes the linear space of functions satisfying, for each x, y ∈ R^d,

|ϕ(x) − ϕ(y)| ≤ C_ϕ(1 + |x|^m + |y|^m)|x − y|, for some C_ϕ > 0, m ∈ N depending on ϕ.

H is considered as the space of random variables. X = (X_1, ..., X_d), X_i ∈ H, 1 ≤ i ≤ d, is called a d-dimensional random vector, denoted by X ∈ H^d.

Definition 1. A sublinear expectation Ê on H is a functional Ê : H → R satisfying the following properties: for each X, Y ∈ H,

(i) Monotonicity: Ê[X] ≥ Ê[Y] if X ≥ Y;
(ii) Constant preserving: Ê[c] = c for c ∈ R;
(iii) Sub-additivity: Ê[X + Y] ≤ Ê[X] + Ê[Y];

(iv) Positive homogeneity: Ê[λX] = λÊ[X] for λ ≥ 0.

The triplet (Ω, H, Ê) is called a sublinear expectation space. If only (i) and (ii) are satisfied, then Ê is called a nonlinear expectation and the triplet (Ω, H, Ê) is called a nonlinear expectation space.

(v) Regularity: If {X_i}_{i=1}^∞ ⊂ H is such that X_i(ω) ↓ 0 as i → ∞ for each ω ∈ Ω, then lim_{i→∞} Ê[X_i] = 0.

In this paper we are mainly interested in sublinear expectations satisfying the regularity condition.

Theorem 2. Let Ê be a sublinear expectation on (Ω, H) satisfying the regularity condition (v) in Definition 1. Then there exists a family of probability measures {P_θ}_{θ∈Θ} such that

Ê[X] = max_{θ∈Θ} ∫_Ω X(ω) dP_θ, ∀X ∈ H.

Let (Ω, H, Ê) be a nonlinear (resp. sublinear) expectation space. For each given d-dimensional random vector X, we define a functional on C_{l.Lip}(R^d) by

F_X[ϕ] := Ê[ϕ(X)], for each ϕ ∈ C_{l.Lip}(R^d).

It is easy to verify that (R^d, C_{l.Lip}(R^d), F_X) forms a nonlinear (resp. sublinear) expectation space. F_X is called the distribution of X. Two d-dimensional random vectors X_1 and X_2, defined respectively on nonlinear expectation spaces (Ω_1, H_1, Ê_1) and (Ω_2, H_2, Ê_2), are called identically distributed, denoted by X_1 =d X_2, if F_{X_1} = F_{X_2}, i.e.,

Ê_1[ϕ(X_1)] = Ê_2[ϕ(X_2)] for each ϕ ∈ C_{l.Lip}(R^d).

Similar to the classical case, Peng [14](2008) gave the following definition of convergence in distribution.

Definition 3. Let X_n, n ≥ 1, be a sequence of d-dimensional random vectors defined respectively on nonlinear (resp. sublinear) expectation spaces (Ω_n, H_n, Ê_n). {X_n : n ≥ 1} is said to converge in distribution if, for each fixed ϕ ∈ C_{b.Lip}(R^d), {F_{X_n}[ϕ] : n ≥ 1} is a Cauchy sequence, where C_{b.Lip}(R^d) denotes the set of all bounded Lipschitz functions on R^d. Define

F[ϕ] = lim_{n→∞} F_{X_n}[ϕ];

then the triplet (R^d, C_{b.Lip}(R^d), F) forms a nonlinear (resp. sublinear) expectation space.
The following notion of the independence of random variables under a nonlinear expectation is very useful (see Peng [15](2010)).

Definition 4. Let (Ω, H, Ê) be a nonlinear expectation space. An n-dimensional random vector Y is said to be independent from another m-dimensional random vector X under the expectation Ê if, for each test function ϕ ∈ C_{l.Lip}(R^{m+n}), we have

Ê[ϕ(X, Y)] = Ê[Ê[ϕ(x, Y)]_{x=X}].
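Definition 4 is order-sensitive: the inner expectation freezes one variable while the other is integrated out. The following sketch is our own construction, not from the paper; the two discrete uncertain distributions and the test function ϕ(x, y) = x·y² are hypothetical, chosen so that the two orders of nesting give different values.

```python
# Hypothetical finite-uncertainty example showing that, under a sublinear
# expectation, "Y independent from X" and "X independent from Y" lead to
# different evaluations of E[phi(X, Y)] via the nesting of Definition 4.

def EX(psi):
    """Sublinear distribution of X: mean uncertainty p in {1/3, 2/3} on {+1, -1}."""
    return max(p * psi(1.0) + (1 - p) * psi(-1.0) for p in (1/3, 2/3))

def EY(psi):
    """Sublinear distribution of Y: variance uncertainty sigma^2 in {1, 2} on {+s, -s}."""
    return max(0.5 * psi(s) + 0.5 * psi(-s) for s in (1.0, 2 ** 0.5))

phi = lambda x, y: x * y * y

# Y independent from X: integrate Y first with x frozen, then integrate X.
val_Y_from_X = EX(lambda x: EY(lambda y: phi(x, y)))
# X independent from Y: integrate X first with y frozen, then integrate Y.
val_X_from_Y = EY(lambda y: EX(lambda x: phi(x, y)))

print(val_Y_from_X, val_X_from_Y)
assert val_Y_from_X != val_X_from_Y   # independence is not symmetric
```

Here the first order gives 1 and the second gives 2/3, a discrete analogue of the asymmetry between B_t and B_{2t} − B_t discussed in the introduction.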

Let X̄ and X be two m-dimensional random vectors on (Ω, H, Ê). X̄ is called an independent copy of X if X̄ =d X and X̄ is independent from X.

Remark 5. It is important to note that "Y is independent from X" does not imply that "X is independent from Y" (see Peng [15]).

For each p ≥ 1, let L^p(Ω) be the completion of H under the Banach norm ||X||_{L^p} := (Ê[|X|^p])^{1/p}. It is easy to verify that L^p(Ω) ⊂ L^{p′}(Ω) for each 1 ≤ p′ ≤ p. Since |Ê[X] − Ê[Y]| ≤ Ê[|X − Y|], Ê can be continuously extended to a mapping from L^1(Ω) to R, and properties (i)-(iv) of Definition 1 still hold. Moreover, (Ω, L^1(Ω), Ê) also forms a sublinear expectation space, which is called a complete sublinear expectation space. We say that X = Y in L^p(Ω) if Ê[|X − Y|^p] = 0, and write X ≥ Y (or Y ≤ X) if X − Y = (X − Y)^+.

2.2. G-normal distributions under a sublinear expectation space. In the rest of this paper, we restrict ourselves to a sublinear expectation space (Ω, H, Ê). The following simple lemma is quite useful.

Lemma 6. (see Peng [15]) If a random variable ξ ∈ H satisfies Ê[ξ] = Ê[−ξ] = 0, then we have

Ê[ξ + η] = Ê[η], ∀η ∈ H.

Definition 7. A d-dimensional random vector X = (X_1, ..., X_d) on a sublinear expectation space (Ω, H, Ê) is called (centralized) G-normally distributed if

aX + bX̄ =d √(a² + b²) X for a, b ≥ 0,

where X̄ is an independent copy of X.

We denote by S(d) the collection of all d × d symmetric matrices. The distribution of a G-normally distributed random vector X defined on (Ω, H, Ê) is uniquely characterized by a function G = G_X : S(d) → R defined as follows:

(2.1) G_X(Q) := (1/2) Ê[⟨QX, X⟩], Q ∈ S(d).

It is easy to check that G is a sublinear and continuous function, monotone in Q ∈ S(d) in the following sense: for each Q, Q̄ ∈ S(d),

(2.2) G(Q + Q̄) ≤ G(Q) + G(Q̄); G(λQ) = λG(Q), ∀λ ≥ 0; G(Q) ≥ G(Q̄) if Q ≥ Q̄.

Proposition 8.
Let G : S(d) → R be a given sublinear and continuous function, monotone in Q ∈ S(d) in the sense of (2.2). Then there exists a G-normally distributed d-dimensional random vector X on some sublinear expectation space (Ω, H, Ê) satisfying (2.1). Moreover, if X̄ is also a G-normally distributed random vector such that

G_X(Q) = G_{X̄}(Q), for any Q ∈ S(d),

then we have X =d X̄.

The function G_X(·) associated to the distribution of the G-normal random vector X is called the generating function of X.

Proposition 9. Let G be given as in Proposition 8 and let X be a d-dimensional random vector on a sublinear expectation space (Ω, H, Ê) such that X is G-normally distributed. For each ϕ ∈ C_{l.Lip}(R^d), define

(2.3) u(t, x) := Ê[ϕ(x + √t X)], (t, x) ∈ [0, ∞) × R^d.

Then u is the unique viscosity solution of the G-heat equation

(2.4) ∂_t u − G(D²u) = 0, u|_{t=0} = ϕ(x).

The following property of a G-normally distributed random vector ξ is easy to check.

Proposition 10. Let ξ be a d-dimensional G-normally distributed random vector characterized by its generating function

G_ξ(Q) := (1/2) Ê[⟨Qξ, ξ⟩], Q ∈ S(d).

Then for any matrix K ∈ R^{m×d}, Kξ is also an m-dimensional G-normally distributed random vector. Its corresponding generating function is

(2.5) G_{Kξ}(Q) = (1/2) Ê[⟨K^T QKξ, ξ⟩], Q ∈ S(m).

Proof. Let ξ̄ be an independent copy of ξ. It is clear that Kξ̄ is also an independent copy of Kξ. Thus, for each ϕ ∈ C_{l.Lip}(R^m), the function ϕ_K(x) := ϕ(Kx), x ∈ R^d, is also a C_{l.Lip}(R^d) function. It follows from the definition of the G-normal distribution that, for any a, b ≥ 0,

Ê[ϕ(aKξ + bKξ̄)] = Ê[ϕ_K(aξ + bξ̄)] = Ê[ϕ_K(√(a² + b²) ξ)] = Ê[ϕ(√(a² + b²) Kξ)].

Hence Kξ is also G-normally distributed. Relation (2.5) is easy to check. □

By the Daniell-Stone theorem, we have the following robust representation theorem of sublinear expectations, which is quite useful.

Theorem 11. (see Peng [15]) Let (Ω, H, Ê) be a sublinear expectation space satisfying the regularity property. Then there exists a weakly compact set P of probability measures defined on (Ω, σ(H)) such that

Ê[X] = max_{P∈P} E_P[X] for all X ∈ H,

where σ(H) is the σ-algebra generated by all functions in H. P is called the family of probability measures that represents Ê.

Let P be a weakly compact set that represents Ê. For this P, we define the capacity

c(A) := sup_{P∈P} P(A), A ∈ B(Ω).
A set A ∈ B(Ω) is called polar if c(A) = 0. A property holds "quasi-surely" (q.s.) if it holds outside a polar set. In the following, we do not distinguish two random variables X and Y if X = Y q.s.
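Proposition 9 links the G-normal distribution to the G-heat equation (2.4). As a numerical illustration (ours, not from the paper; the variance bounds, grid sizes and test payoffs are hypothetical choices), the sketch below solves the one-dimensional equation ∂_t u = G(∂²_x u), G(a) = ½(σ̄²a⁺ − σ²a⁻), by an explicit finite-difference scheme and reads off Ê[ϕ(X)] = u(1, 0).

```python
import math

# Explicit finite-difference sketch for the 1-d G-heat equation (2.4):
#   du/dt = G(d2u/dx2),  G(a) = (sbar2 * max(a,0) - s2 * max(-a,0)) / 2,
# which yields E[phi(X)] = u(1, 0) for a G-normal X with variance
# uncertainty [s2, sbar2].  All parameters are illustration values.

sbar2, s2 = 1.0, 0.25      # upper and lower variance bounds

def G(a):
    return 0.5 * (sbar2 * max(a, 0.0) - s2 * max(-a, 0.0))

def g_heat_value(phi, T=1.0, L=6.0, nx=240):
    """Approximate u(T, 0) for u_t = G(u_xx), u(0, .) = phi, on [-L, L]."""
    dx = 2.0 * L / nx
    steps = int(T * sbar2 / (0.4 * dx * dx)) + 1   # explicit stability bound
    dt = T / steps
    u = [phi(-L + i * dx) for i in range(nx + 1)]
    for _ in range(steps):
        new = u[:]                                  # boundary values frozen
        for i in range(1, nx):
            a = (u[i + 1] - 2.0 * u[i] + u[i - 1]) / (dx * dx)
            new[i] = u[i] + dt * G(a)
        u = new
    return u[nx // 2]

# For the convex payoff phi(x) = x^+ only the upper variance acts, so
# E[X^+] = sqrt(sbar2 / (2 pi)); for the concave payoff -x^+ the lower one acts.
up = g_heat_value(lambda x: max(x, 0.0))
lo = g_heat_value(lambda x: -max(x, 0.0))
assert abs(up - math.sqrt(sbar2 / (2 * math.pi))) < 0.02
assert abs(lo + math.sqrt(s2 / (2 * math.pi))) < 0.02
```

The scheme is monotone under the stated time-step restriction, so it converges to the viscosity solution of (2.4); refining nx tightens the tolerance.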

Definition 12. Let Γ be a set of indices. A family of R^d-valued random vectors (X_γ)_{γ∈Γ} is called a d-dimensional stochastic process indexed by Γ, or defined on Γ, if for each γ ∈ Γ, X_γ ∈ H^d. Let (X_γ)_{γ∈Γ} and (Y_γ)_{γ∈Γ} be two processes indexed by Γ. Y is called a quasi-modification of X if for all γ ∈ Γ, X_γ = Y_γ q.s.

3. G-Gaussian random fields

3.1. A general setting of random fields defined on a nonlinear expectation space.

Definition 13. Let (Ω, H, Ê) be a nonlinear expectation space and let Γ be a parameter set. An m-dimensional random field on (Ω, H, Ê) is a family of random vectors W = (W_γ)_{γ∈Γ} such that W_γ ∈ H^m for each γ ∈ Γ.

Let us denote the family of all sets of finite indices by

J_Γ := {γ = (γ_1, ..., γ_n) : ∀n ∈ N, γ_1, ..., γ_n ∈ Γ, γ_i ≠ γ_j for i ≠ j, 1 ≤ i, j ≤ n}.

Now we give the notion of finite dimensional distributions of the random field W.

Definition 14. For each γ = (γ_1, ..., γ_n) ∈ J_Γ, let F_γ be a nonlinear expectation on (R^{n×m}, C_{l.Lip}(R^{n×m})). We call (F_γ)_{γ∈J_Γ} a system of finite dimensional distributions on R^m with respect to Γ.

Let (W_γ)_{γ∈Γ} be an m-dimensional random field defined on a nonlinear expectation space (Ω, H, Ê). For each γ = (γ_1, ..., γ_n) ∈ J_Γ and the corresponding random vector W_γ = (W_{γ_1}, ..., W_{γ_n}), we define a functional on C_{l.Lip}(R^{n×m}) by

F_{W_γ}[ϕ] = Ê[ϕ(W_γ)], ϕ ∈ C_{l.Lip}(R^{n×m}).

Then the triplet (R^{n×m}, C_{l.Lip}(R^{n×m}), F_{W_γ}) constitutes a nonlinear expectation space. We call (F_{W_γ})_{γ∈J_Γ} the family of finite dimensional distributions of (W_γ)_{γ∈Γ}.

Let (W^{(1)}_γ)_{γ∈Γ} and (W^{(2)}_γ)_{γ∈Γ} be two m-dimensional random fields defined on nonlinear expectation spaces (Ω_1, H_1, Ê_1) and (Ω_2, H_2, Ê_2), respectively. They are said to be identically distributed, denoted by (W^{(1)}_γ)_{γ∈Γ} =d (W^{(2)}_γ)_{γ∈Γ}, or simply W^{(1)} =d W^{(2)}, if for each γ = (γ_1, ..., γ_n) ∈ J_Γ, W^{(1)}_γ =d W^{(2)}_γ, i.e.,

Ê_1[ϕ(W^{(1)}_γ)] = Ê_2[ϕ(W^{(2)}_γ)] for each ϕ ∈ C_{l.Lip}(R^{n×m}).

For any given m-dimensional random field W = (W_γ)_{γ∈Γ}, the family of its finite dimensional distributions satisfies the following properties of consistency:

(1) Compatibility: For each (γ_1, ..., γ_n, γ_{n+1}) ∈ J_Γ and ϕ ∈ C_{l.Lip}(R^{n×m}),

(3.1) F_{W_{γ_1}, ..., W_{γ_n}, W_{γ_{n+1}}}[ϕ̃] = F_{W_{γ_1}, ..., W_{γ_n}}[ϕ],

where ϕ̃ is the function on R^{(n+1)×m} defined by ϕ̃(y_1, ..., y_n, y_{n+1}) = ϕ(y_1, ..., y_n) for any y_1, ..., y_n, y_{n+1} ∈ R^m;

(2) Symmetry: For each (γ_1, ..., γ_n) ∈ J_Γ, ϕ ∈ C_{l.Lip}(R^{n×m}) and each permutation π of {1, ..., n},

(3.2) F_{W_{γ_{π(1)}}, ..., W_{γ_{π(n)}}}[ϕ] = F_{W_{γ_1}, ..., W_{γ_n}}[ϕ_π],

where we denote ϕ_π(y_1, ..., y_n) = ϕ(y_{π(1)}, ..., y_{π(n)}).

Starting from a given family of finite dimensional distributions, we can generalize the classical Kolmogorov existence theorem to the situation of a sublinear expectation space; the following result is a variant of Theorem 3.8 in Peng [17].

Theorem 15. Let {F_γ, γ ∈ J_Γ} be a family of finite dimensional distributions on R^m satisfying the compatibility condition (3.1) and the symmetry condition (3.2). Then there exists an m-dimensional random field W = (W_γ)_{γ∈Γ} defined on a nonlinear expectation space (Ω, H, Ê) whose family of finite dimensional distributions coincides with {F_γ, γ ∈ J_Γ}. If we assume moreover that each F_γ in {F_γ, γ ∈ J_Γ} is sublinear, then the corresponding expectation Ê on the space of random variables (Ω, H) is also sublinear.

Proof. Denote by Ω = (R^m)^Γ the space of all functions ω = (ω_γ)_{γ∈Γ} from Γ to R^m. For each ω ∈ Ω, we set W = (W_γ(ω) = ω_γ)_{γ∈Γ} and call W the canonical process defined on Ω. The space of random variables is defined by

H = L_{ip}(Ω) = {ϕ(W_{γ_1}, ..., W_{γ_n}) : ∀n ∈ N, γ_1, ..., γ_n ∈ Γ, ϕ ∈ C_{l.Lip}(R^{n×m})}.

Then, for each random variable ξ ∈ H of the form ξ = ϕ(W_{γ_1}, ..., W_{γ_n}), we define the corresponding nonlinear expectation by

Ê[ξ] = F_{γ_1, ..., γ_n}[ϕ].

Since {F_γ, γ ∈ J_Γ} satisfies the consistency conditions (3.1) and (3.2), the functional Ê : H → R is consistently defined. Since, for each γ ∈ J_Γ, the distribution F_γ is monotone and constant preserving, the expectation Ê[·] satisfies the same properties. Namely, Ê[·] is a nonlinear expectation defined on (Ω, H). Obviously, the family of finite dimensional distributions of (W_γ)_{γ∈Γ} is (F_γ)_{γ∈J_Γ}. Moreover, if for each γ = (γ_1, ..., γ_n) ∈ J_Γ, F_γ is a sublinear expectation defined on (R^{n×m}, C_{l.Lip}(R^{n×m})), then Ê is a sublinear expectation. The proof of the sub-additivity of Ê[·] is as follows: let ξ = ϕ(W_{γ_1}, ..., W_{γ_i}) and η = ψ(W_{ι_1}, ..., W_{ι_j}) be two random variables in H, with ϕ ∈ C_{l.Lip}(R^{i×m}) and ψ ∈ C_{l.Lip}(R^{j×m}). Without loss of generality, assume that γ_h ≠ ι_k for h = 1, ..., i, k = 1, ..., j. Then, from the definition of Ê,

Ê[ξ + η] = F_{γ_1, ..., γ_i, ι_1, ..., ι_j}[ϕ̃ + ψ̃] ≤ F_{γ_1, ..., γ_i, ι_1, ..., ι_j}[ϕ̃] + F_{γ_1, ..., γ_i, ι_1, ..., ι_j}[ψ̃]
= F_{γ_1, ..., γ_i}[ϕ] + F_{ι_1, ..., ι_j}[ψ] = Ê[ξ] + Ê[η],

where we set

ϕ̃(y_1, ..., y_i, z_1, ..., z_j) = ϕ(y_1, ..., y_i),
ψ̃(y_1, ..., y_i, z_1, ..., z_j) = ψ(z_1, ..., z_j).

The positive homogeneity of Ê is directly obtained from the corresponding property of F_γ, γ ∈ J_Γ. The proof is complete. □

3.2. Gaussian random fields under a sublinear expectation space.

Definition 16. Let (W_γ)_{γ∈Γ} be an m-dimensional random field on a sublinear expectation space (Ω, H, Ê). (W_γ)_{γ∈Γ} is called an m-dimensional G-Gaussian random field if for each γ = (γ_1, ..., γ_n) ∈ J_Γ, the n × m-dimensional random vector

W_γ = (W_{γ_1}, ..., W_{γ_n}) = (W^{(1)}_{γ_1}, ..., W^{(m)}_{γ_1}, ..., W^{(1)}_{γ_n}, ..., W^{(m)}_{γ_n}), W^{(j)}_{γ_i} ∈ H, i = 1, ..., n, j = 1, ..., m,

is G-normally distributed, namely,

aW_γ + bW̄_γ =d √(a² + b²) W_γ, for any a, b ≥ 0,

where W̄_γ is an independent copy of W_γ.

For each γ = (γ_1, ..., γ_n) ∈ J_Γ, we define

G_{W_γ}(Q) = (1/2) Ê[⟨QW_γ, W_γ⟩], Q ∈ S(n × m),

where S(n × m) denotes the collection of all (n × m) × (n × m) symmetric matrices. Then (G_{W_γ})_{γ∈J_Γ} constitutes a family of functions

G_{W_γ} : S(n × m) → R, γ = (γ_1, ..., γ_n), γ_i ∈ Γ, n ∈ N,

which satisfy the sublinearity and monotonicity properties in the sense of (2.2). According to Proposition 8, the corresponding distribution F_{W_γ} of this G-normally distributed random vector W_γ is uniquely determined by the function G_{W_γ}. Clearly, {G_{W_γ}}_{γ∈J_Γ} is a family of monotone, sublinear and continuous functions satisfying the following properties of consistency:

(1) Compatibility: For any (γ_1, ..., γ_n, γ_{n+1}) ∈ J_Γ and Q = (q_{ij})^{n×m}_{i,j=1} ∈ S(n × m),

(3.3) G_{W_{γ_1}, ..., W_{γ_n}, W_{γ_{n+1}}}(Q̄) = G_{W_{γ_1}, ..., W_{γ_n}}(Q),

where Q̄ = [Q 0; 0 0] ∈ S((n + 1) × m);

(2) Symmetry: For any permutation π of {1, ..., n},

(3.4) G_{W_{γ_{π(1)}}, ..., W_{γ_{π(n)}}}(Q) = G_{W_{γ_1}, ..., W_{γ_n}}(π^{-1}(Q)),

where the mapping π^{-1} : S(n × m) → S(n × m) is defined by

(π^{-1}(Q))_{ij} = q_{π^{-1}(i)π^{-1}(j)}, i, j = 1, ..., n × m, Q = (q_{ij})^{n×m}_{i,j=1} ∈ S(n × m).

A very interesting inverse problem is how to construct a G-Gaussian random field on a sublinear expectation space when a family of sublinear functions (G_γ)_{γ∈J_Γ} of the above type is given.
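The consistency conditions (3.3)-(3.4) can be checked mechanically for concrete families. Here is a small sketch (our illustration) for a hypothetical scalar family (m = 1) of the form G_γ(Q) = G(Σ_{i,j} q_{ij} c(γ_i, γ_j)), built from a symmetric kernel c; the variance bounds, the kernel c(s, t) = min(s, t) and the test data are assumptions made for the example. Families of exactly this shape appear in Section 4.

```python
import itertools

# Check compatibility (3.3) and symmetry (3.4) for a hypothetical family of
# generating functions  G_gamma(Q) = G(sum_ij q_ij * c(gamma_i, gamma_j)),
# with G(a) = (sbar2 * a^+ - s2 * a^-) / 2 and a symmetric kernel c.

sbar2, s2 = 2.0, 1.0
G = lambda a: 0.5 * (sbar2 * max(a, 0.0) - s2 * max(-a, 0.0))
c = lambda s, t: min(s, t)            # hypothetical kernel on Gamma = (0, inf)

def G_gamma(gamma, Q):
    n = len(gamma)
    return G(sum(Q[i][j] * c(gamma[i], gamma[j])
                 for i in range(n) for j in range(n)))

gamma = (0.5, 1.0, 2.0)
Q = [[2.0, -1.0, 0.5], [-1.0, 3.0, 0.0], [0.5, 0.0, 1.0]]

# Compatibility (3.3): enlarging Q by a zero row/column while adding a new
# index leaves the value unchanged.
Qbar = [row + [0.0] for row in Q] + [[0.0] * 4]
assert abs(G_gamma(gamma + (3.0,), Qbar) - G_gamma(gamma, Q)) < 1e-12

# Symmetry (3.4): permuting the indices and conjugating Q by the permutation
# leaves the value unchanged.
for pi in itertools.permutations(range(3)):
    gamma_pi = tuple(gamma[pi[i]] for i in range(3))
    Q_pi = [[Q[pi[i]][pi[j]] for j in range(3)] for i in range(3)]
    assert abs(G_gamma(gamma_pi, Q_pi) - G_gamma(gamma, Q)) < 1e-12
```

Both assertions reduce to reindexing the double sum inside G, which is exactly how (3.3) and (3.4) are verified in Proposition 21 below.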

Theorem 17. Let (G_γ)_{γ∈J_Γ} be a family of real-valued functions such that, for each γ = (γ_1, ..., γ_n) ∈ J_Γ, the function G_γ : S(n × m) → R satisfies the monotonicity and sublinearity properties of type (2.2). Moreover, assume this family (G_γ)_{γ∈J_Γ} satisfies the compatibility condition (3.3) and the symmetry condition (3.4). Then there exists an m-dimensional G-Gaussian random field (W_γ)_{γ∈Γ} on a sublinear expectation space (Ω, H, Ê) such that for each γ = (γ_1, ..., γ_n) ∈ J_Γ, W_γ = (W_{γ_1}, ..., W_{γ_n}) is G-normally distributed, i.e.,

G_{W_γ}(Q) = (1/2) Ê[⟨QW_γ, W_γ⟩] = G_γ(Q), for any Q ∈ S(n × m).

Furthermore, if there exists another Gaussian random field (W̄_γ)_{γ∈Γ}, with the same index set Γ, defined on a sublinear expectation space (Ω̄, H̄, Ē) such that for each γ ∈ J_Γ, W̄_γ is G-normally distributed with the same generating function, namely,

(1/2) Ē[⟨QW̄_γ, W̄_γ⟩] = G_γ(Q) for any Q ∈ S(n × m),

then we have W =d W̄.

Proof. For each γ = (γ_1, ..., γ_n) ∈ J_Γ, we denote by F_{W_γ} the sublinear distribution uniquely determined by G_γ. Since the family of monotone sublinear functions G_{W_γ} satisfies the consistency conditions (3.3) and (3.4), it follows that the family of G-normal distributions F_{W_γ}[·] satisfies the consistency conditions (3.1) and (3.2). Thus we can follow the construction procedure in the proof of Theorem 15 to construct a random field W = (W_γ)_{γ∈Γ} on a complete sublinear expectation space (Ω, H, Ê) under which the family of finite dimensional distributions coincides with {F_{W_γ}}_{γ∈J_Γ}. Obviously, W is a Gaussian random field under Ê. Assume that there exists another G-Gaussian random field W̄ with the same family of generating functions. Then W and W̄ have the same finite dimensional distributions, which implies W =d W̄. The proof is complete. □

For each p ≥ 1, let L^p_G(W) be the completion of H under the Banach norm ||·||_{L^p} := (Ê[|·|^p])^{1/p}. Then the sublinear expectation Ê can be extended continuously to L^p_G(W), and (Ω, L^p_G(W), Ê) forms a complete sublinear expectation space.

Remark 18. If Γ = R^+, W = (W_γ)_{γ∈Γ} becomes a G-Gaussian process, which has been studied in Peng [17].

4. Spatial white noise under sublinear expectations In this section, we formulate a spatial white noise and then develop the corresponding stochastic integral.

Definition 19. Let (Ω, H, Ê) be a sublinear expectation space and let Γ = B_0(R^d) = {A ∈ B(R^d) : λ_A < ∞}, where λ_A denotes the Lebesgue measure of A ∈ B(R^d). A 1-dimensional G-Gaussian random field W = {W_A, A ∈ Γ} is called a 1-dimensional G-white noise if

(i) Ê[W_A²] = σ̄²λ_A, Ê[−W_A²] = −σ²λ_A, for all A ∈ Γ;

(ii) For each A_1, A_2 ∈ Γ with A_1 ∩ A_2 = ∅, we have

(4.1) Ê[W_{A_1}W_{A_2}] = Ê[−W_{A_1}W_{A_2}] = 0,
(4.2) Ê[(W_{A_1∪A_2} − W_{A_1} − W_{A_2})²] = 0,

where 0 ≤ σ² ≤ σ̄² are any given numbers.

Remark 20. The definition implies that the distribution of a G-white noise has only two parameters, σ² and σ̄². In fact we can also set σ̄² = 1, σ² ≤ 1 to make it a 1-parameter field. On the other hand, for any given pair of numbers 0 ≤ σ² ≤ σ̄², one can construct a 1-dimensional random field W = {W_A, A ∈ Γ} with Γ = B_0(R^d), defined on a sublinear expectation space (Ω, H, Ê), satisfying conditions (i)-(ii).

We denote by L²_G(W) the completion of H under the Banach norm ||·||_{L²} := (Ê[|·|²])^{1/2}. Then the sublinear expectation Ê can be extended continuously to L²_G(W), and (Ω, L²_G(W), Ê) forms a complete sublinear expectation space.

4.1. Existence of G-white noise. In the following, we turn to the construction of a spatial G-white noise on some sublinear expectation space. According to Theorem 17, it is sufficient to define an appropriate family of sublinear and monotone functions (G_γ)_{γ∈J_Γ} such that the G-Gaussian random field generated by (G_γ)_{γ∈J_Γ} satisfies conditions (i)-(ii) of Definition 19 for G-white noise.

For each γ = (A_1, ..., A_n), A_j ∈ Γ = B_0(R^d), we define a mapping G_γ(·) : S(n) → R as follows:

(4.3) G_{A_1, ..., A_n}(Q) = G( Σ_{i,j=1}^n q_{ij} λ_{A_i∩A_j} ), Q = (q_{ij})^n_{i,j=1} ∈ S(n),

where G(a) = (1/2)(σ̄²a⁺ − σ²a⁻) for a ∈ R. Here σ̄² ≥ σ² are two given nonnegative parameters.

Proposition 21. For each γ = (A_1, ..., A_n), A_j ∈ Γ = B_0(R^d), the function G_γ(·) defined by (4.3) is a sublinear and monotone real function on S(n), namely it satisfies condition (2.2). Moreover, the family {G_γ}_{γ∈J_Γ} satisfies the consistency conditions given in (3.3) and (3.4).
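The algebra behind (4.3) and Definition 19 can be checked on concrete sets. The sketch below is our illustration: the intervals in R¹ and the variance bounds are hypothetical choices, and the quadratic-form matrices are those used later in the verification of (i), (4.1) and (4.2).

```python
# Evaluate the generating functions (4.3) for intervals in R^1 and verify the
# white-noise identities of Definition 19 numerically.
# sbar2, s2 and the intervals are hypothetical illustration values.

sbar2, s2 = 2.0, 1.0
G = lambda a: 0.5 * (sbar2 * max(a, 0.0) - s2 * max(-a, 0.0))

def lam_cap(A, B):
    """Lebesgue measure of the intersection of two intervals A = (a1, a2)."""
    return max(0.0, min(A[1], B[1]) - max(A[0], B[0]))

def G_gamma(sets, Q):
    n = len(sets)
    return G(sum(Q[i][j] * lam_cap(sets[i], sets[j])
                 for i in range(n) for j in range(n)))

A1, A2 = (0.0, 1.0), (1.0, 3.0)           # disjoint intervals
A3 = (0.0, 3.0)                           # A3 = A1 U A2

# (i): E[W_A^2] = 2 G(lam_A) = sbar2 * lam_A and E[-W_A^2] = -s2 * lam_A.
assert abs(2 * G_gamma([A1], [[1.0]]) - sbar2 * 1.0) < 1e-12
assert abs(2 * G_gamma([A1], [[-1.0]]) + s2 * 1.0) < 1e-12

# (4.1): E[W_A1 W_A2] = G_gamma(Q) = 0, since lam(A1 cap A2) = 0.
Q = [[0.0, 0.5], [0.5, 0.0]]
assert abs(G_gamma([A1, A2], Q)) < 1e-12

# (4.2): E[(W_A1 + W_A2 - W_A3)^2] = 0, using the quadratic-form matrix
# of (W_A1 + W_A2 - W_A3)^2.
Qp = [[1.0, 1.0, -1.0], [1.0, 1.0, -1.0], [-1.0, -1.0, 1.0]]
assert abs(G_gamma([A1, A2, A3], Qp)) < 1e-12
```

The last assertion is the numerical counterpart of λ_{A_1} + λ_{A_2} + λ_{A_3} − 2(λ_{A_1} + λ_{A_2}) = 0 when A_3 = A_1 ∪ A_2 with A_1 ∩ A_2 = ∅.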

Proof. The fact that G_γ satisfies relation (2.2) follows from the monotonicity and sublinearity of the function G = G(a), a ∈ R, together with the property that, for each A_1, ..., A_n ∈ B_0(R^d),

(4.4) Σ_{i,j=1}^n q_{ij} λ_{A_i∩A_j} ≥ 0, if Q ≥ 0.

The compatibility of type (3.3) is checked as follows: let

Q̄ = [Q 0; 0 0] ∈ S(n + 1),

and let A_1, ..., A_n, A_{n+1} ∈ B_0(R^d). Noting that q̄_{i(n+1)} = q̄_{(n+1)i} = 0, i = 1, ..., n + 1, by (4.3) we have

G_{A_1, ..., A_{n+1}}(Q̄) = G_{A_1, ..., A_n}(Q),

which is (3.3). Let π(·) be a permutation of {1, 2, ..., n}. By (4.3), we can check that

G_{A_1, ..., A_n}(Q) = G_{A_{π(1)}, ..., A_{π(n)}}(π(Q)),

where π(Q) = (q_{π(i)π(j)})^n_{i,j=1}. Thus (3.4) holds. The proof is complete. □

Now we present the existence of white noise under the sublinear expectation.

Theorem 22. For each given sublinear and monotone function

G(a) = (1/2)(σ̄²a⁺ − σ²a⁻), a ∈ R,

let the family of generating functions {G_γ(·) : γ = (A_1, ..., A_n) ∈ J_Γ} be defined as in (4.3). Then there exists a 1-dimensional G-Gaussian random field (W_γ)_{γ∈Γ} on a complete sublinear expectation space (Ω, L²_G(W), Ê) such that, for each γ = (A_1, ..., A_n) ∈ J_Γ, W_γ = (W_{A_1}, ..., W_{A_n}) is G-normally distributed, i.e.,

G_{W_γ}(Q) = (1/2) Ê[⟨QW_γ, W_γ⟩] = G( Σ_{i,j=1}^n q_{ij} λ_{A_i∩A_j} ), for any Q = (q_{ij}) ∈ S(n).

Moreover, (W_γ)_{γ∈Γ} is also a spatial G-white noise under (Ω, L²_G(W), Ê), namely, conditions (i) and (ii) of Definition 19 are satisfied. If (W̄_γ)_{γ∈Γ} is another G-white noise with the same sublinear function G, then its family of generating functions {G_{W̄_γ}}_{γ∈J_Γ} coincides with the one defined in (4.3).

Proof. Since the compatibility and symmetry conditions of {G_γ}_{γ∈J_Γ} have already been proved, it follows from Proposition 21 and Theorem 17 that the G-Gaussian random field W with the family of generating functions defined by (4.3) exists and is unique on a complete sublinear expectation space (Ω, L²_G(W), Ê). We only need to prove that W is a G-white noise, i.e., that the G-random field W induced by the family of generating functions defined in (4.3) satisfies conditions (i)-(ii) of Definition 19.

For any A ∈ Γ,

Ê[W_A²] = 2G_A(1) = 2G(λ_A) = σ̄²λ_A.

Similarly, we have Ê[−W_A²] = −σ²λ_A, which verifies (i). Now consider relation (4.1) in (ii). Let A_1, A_2 ∈ Γ be such that A_1 ∩ A_2 = ∅. We can choose a 2 × 2 symmetric matrix Q = (q_{ij})²_{i,j=1} to express

Ê[W_{A_1}W_{A_2}] = Ê[ Σ_{i,j=1}^2 q_{ij} W_{A_i}W_{A_j} ],

where we set

Q = (q_{ij})²_{i,j=1} = [0 1/2; 1/2 0].

It then follows from (4.3) that

Ê[W_{A_1}W_{A_2}] = Ê[ Σ_{i,j=1}^2 q_{ij} W_{A_i}W_{A_j} ] = G( Σ_{i,j=1}^2 q_{ij} λ_{A_i∩A_j} ) = 0.

Similarly, we can get Ê[−W_{A_1}W_{A_2}] = 0; thus (4.1) holds.

To prove (4.2), let A_1, A_2 ∈ Γ be such that A_1 ∩ A_2 = ∅ and denote A_3 = A_1 ∪ A_2. Then

Ê[(W_{A_1} + W_{A_2} − W_{A_3})²] = Ê[ Σ_{i,j=1}^3 q_{ij} W_{A_i}W_{A_j} ],

where we denote

Q = (q_{ij})³_{i,j=1} = [1 1 −1; 1 1 −1; −1 −1 1].

We then apply (4.3) to get

Ê[ Σ_{i,j=1}^3 q_{ij} W_{A_i}W_{A_j} ] = G( Σ_{i,j=1}^3 q_{ij} λ_{A_i∩A_j} ) = G(λ_{A_1} + λ_{A_2} + λ_{A_3} − 2(λ_{A_1} + λ_{A_2})) = 0,

since λ_{A_3} = λ_{A_1} + λ_{A_2}, which yields (4.2).

Now let {W̄_γ}_{γ∈Γ} be a G-white noise, associated with the same function G, on a complete sublinear expectation space (Ω̄, L²_G(W̄), Ē). Then for each γ = (A_1, ..., A_n) ∈ J_Γ and Q = (q_{ij})^n_{i,j=1} ∈ S(n),

G_{W̄_γ}(Q) = (1/2) Ē[⟨QW̄_γ, W̄_γ⟩] = G( Σ_{i,j=1}^n q_{ij} λ_{A_i∩A_j} ) = G_γ(Q).

Thus {W̄_γ}_{γ∈Γ} =d {W_γ}_{γ∈Γ} and the proof is complete. □

We can use a similar approach to prove that, for each $A, B\in\mathcal{B}_0(\mathbb{R}^d)$, we have
\[ \hat{\mathbb{E}}\big[(W_{A\cup B}+W_{A\cap B}-W_A-W_B)^2\big]=0. \]

Remark 23. From the above identity, it follows directly that $W_{A_1\cup A_2}=W_{A_1}+W_{A_2}-W_{A_1\cap A_2}$ for any $A_1, A_2\in\Gamma$. Furthermore, if $\{A_i\}_{i=1}^\infty$ is a mutually disjoint sequence of elements of $\Gamma$ such that $\sum_{i=1}^\infty\lambda_{A_i}<\infty$, then
\[ W_{\bigcup_{i=1}^\infty A_i}=\sum_{i=1}^\infty W_{A_i}\in L^2_G(W). \]

A very important property of the spatial $G$-white noise is that it is invariant under rotations and translations:

Proposition 24. For each $p\in\mathbb{R}^d$ and $O\in\mathbb{O}(d):=\{O\in\mathbb{R}^{d\times d}: O^T=O^{-1}\}$, we set
\[ T_{p,O}(A)=\{Ox+p : x\in A\}. \]
Then, for each $A_1,\cdots,A_n\in\mathcal{B}_0(\mathbb{R}^d)$, we have
\[ (W_{A_1},\cdots,W_{A_n})\overset{d}{=}(W_{T_{p,O}(A_1)},\cdots,W_{T_{p,O}(A_n)}). \]
Namely, the finite dimensional distributions of $W$ are invariant under rotations and translations.

Proof. It is a direct consequence of the well-known invariance of the Lebesgue measure under rotations and translations, i.e., $\lambda_A=\lambda_{T_{p,O}(A)}$ for all $(p,O)\in\mathbb{R}^d\times\mathbb{O}(d)$ and $A\in\mathcal{B}_0(\mathbb{R}^d)$. $\square$

According to Theorem 22, we can calculate and characterize each finite dimensional distribution of the $G$-white noise $W$ by nonlinear heat equations. Let $W$ be a $G$-white noise on a sublinear expectation space $(\Omega,\mathcal{H},\hat{\mathbb{E}})$ with index set $\Gamma=\mathcal{B}_0(\mathbb{R}^d)$. For each $\gamma=(A_1,\cdots,A_n)\in\mathcal{J}_\Gamma$, $W_\gamma=(W_{A_1},\cdots,W_{A_n})$, $n\in\mathbb{N}$, we have
\[ G_{W_\gamma}(Q)=\hat{\mathbb{E}}\Big[\frac{1}{2}\langle QW_\gamma, W_\gamma\rangle\Big]=G_\gamma(Q), \]
where $Q=(q_{ij})\in\mathbb{S}(n)$ and $G(q)=\frac{\overline{\sigma}^2}{2}q^+-\frac{\underline{\sigma}^2}{2}q^-$, $q\in\mathbb{R}$. For any given $\varphi\in C_{l.Lip}(\mathbb{R}^n)$, set
\[ u(t,x_1,\cdots,x_n)=\hat{\mathbb{E}}[\varphi(x_1+\sqrt{t}\,W_{A_1},\cdots,x_n+\sqrt{t}\,W_{A_n})],\quad (t,x)\in[0,\infty)\times\mathbb{R}^n. \]
Then by Proposition 9, $u$ is the unique viscosity solution of the Cauchy problem for the nonlinear heat equation
\[ (4.5)\qquad \partial_t u - G_\gamma(D^2_{xx}u)=0,\qquad u|_{t=0}=\varphi(x_1,\cdots,x_n). \]
We then obtain the nonlinear expected value by $\hat{\mathbb{E}}[\varphi(W_{A_1},\cdots,W_{A_n})]=u|_{t=1,\,x_1=\cdots=x_n=0}$.

4.2. Stochastic integrals with respect to a spatial white noise. Now let us define the stochastic integral with respect to $G$-white noise in three steps. Let $\{W_\gamma\}_{\gamma\in\Gamma}$, $\Gamma=\mathcal{B}_0(\mathbb{R}^d)$, be a 1-dimensional $G$-white noise with parameters $\underline{\sigma}^2, \overline{\sigma}^2$ on a complete sublinear expectation space $(\Omega, L^2_G(W), \hat{\mathbb{E}})$.

Step 1: For each $A\in\Gamma$ with $\lambda_A<\infty$, suppose $f:\mathbb{R}^d\to\mathbb{R}$ is an elementary function of the form $f(x)=\mathbf{1}_A(x)$, $x\in\mathbb{R}^d$. We can define

\[ \int_{\mathbb{R}^d} f(x)W(dx)=\int_{\mathbb{R}^d}\mathbf{1}_A(x)W(dx)=W_A. \]
Step 2: For any simple function $f(x)=\sum_{i=1}^n a_i\mathbf{1}_{A_i}(x)$, $n\in\mathbb{N}$, $a_1,\cdots,a_n\in\mathbb{R}$, $A_1,\cdots,A_n\in\Gamma$, we define
\[ \int_{\mathbb{R}^d}\Big(\sum_{i=1}^n a_i\mathbf{1}_{A_i}(x)\Big)W(dx)=\sum_{i=1}^n a_i\int_{\mathbb{R}^d}\mathbf{1}_{A_i}(x)W(dx)=\sum_{i=1}^n a_i W_{A_i}. \]

Step 3: Let $L^2(\mathbb{R}^d)$ denote the space of all functions $f:\mathbb{R}^d\to\mathbb{R}$ such that $\|f\|^2_{L^2}=\int_{\mathbb{R}^d}|f(x)|^2\,dx<\infty$. By the following lemma, we can extend the stochastic integral to $L^2(\mathbb{R}^d)$.

Lemma 25. If $f:\mathbb{R}^d\to\mathbb{R}$ is a simple function, then
\[ \hat{\mathbb{E}}\Big[\Big|\int_{\mathbb{R}^d} f(x)W(dx)\Big|^2\Big]\le\overline{\sigma}^2\|f\|^2_{L^2}. \]

Proof. Without loss of generality, we suppose $f(x)=\sum_{i=1}^n a_i\mathbf{1}_{A_i}(x)$, where $a_1,\cdots,a_n\in\mathbb{R}$, $A_1,\cdots,A_n\in\Gamma$ and $A_i\cap A_j=\emptyset$ if $i\neq j$. Then
\[ \hat{\mathbb{E}}\Big[\Big|\int_{\mathbb{R}^d} f(x)W(dx)\Big|^2\Big]=\hat{\mathbb{E}}\Big[\Big|\sum_{i=1}^n a_i W_{A_i}\Big|^2\Big]=\hat{\mathbb{E}}\Big[\sum_{i=1}^n a_i^2 W_{A_i}^2\Big]\le\sum_{i=1}^n a_i^2\,\hat{\mathbb{E}}[W_{A_i}^2]=\overline{\sigma}^2\sum_{i=1}^n a_i^2\lambda_{A_i}=\overline{\sigma}^2\|f\|^2_{L^2}.\quad\square \]
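The bound of Lemma 25 can be made concrete under the representation of $\hat{\mathbb{E}}$ as a supremum of linear expectations (see the Appendix): if each disjoint cell $A_i$ carries its own volatility $\sigma_i\in[\underline{\sigma},\overline{\sigma}]$, the second moment of $\sum_i a_iW_{A_i}$ under that scenario is $\sum_i a_i^2\sigma_i^2\lambda_{A_i}$ in closed form, and maximizing over the $\sigma_i$ yields $\overline{\sigma}^2\|f\|^2_{L^2}$. The sketch below uses this scenario sub-family (which in general only gives a lower bound for $\hat{\mathbb{E}}$, but attains the bound here since the functional is quadratic); the coefficients, measures and grid are illustrative assumptions.

```python
import itertools

sig_lo, sig_hi = 0.5, 1.0
a   = [2.0, -1.0, 0.5]          # coefficients a_i of the simple function
lam = [0.3, 1.2, 0.7]           # Lebesgue measures lambda_{A_i} of disjoint cells

def second_moment(sigmas):
    """Second moment of sum_i a_i W_{A_i} under a fixed-volatility scenario."""
    return sum(ai**2 * si**2 * li for ai, si, li in zip(a, sigmas, lam))

# Supremum over a grid of scenarios; the maximum of this monotone functional
# is attained at the extreme point sigma_i = sig_hi for every cell.
grid = [sig_lo + k * (sig_hi - sig_lo) / 10 for k in range(11)]
sup_moment = max(second_moment(s) for s in itertools.product(grid, repeat=3))

bound = sig_hi**2 * sum(ai**2 * li for ai, li in zip(a, lam))  # sig_hi^2 ||f||^2
print(sup_moment, bound)   # the supremum attains the bound of Lemma 25
```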

This lemma implies that the linear mapping
\[ f\mapsto\int_{\mathbb{R}^d} f(x)W(dx):\ L^2(\mathbb{R}^d)\to L^2_G(W) \]
is a contraction. We can therefore extend this mapping continuously to the whole domain $L^2(\mathbb{R}^d)$. We still denote this mapping by $\int_{\mathbb{R}^d} f(x)W(dx)$ and call it the integral of $f\in L^2(\mathbb{R}^d)$ with respect to the spatial white noise $W$.

For each $B\subset\mathbb{R}^d$ and $f\in L^2(\mathbb{R}^d)$, we can also define
\[ \int_B f(x)W(dx)=\int_{\mathbb{R}^d} f(x)\mathbf{1}_B(x)W(dx). \]

Theorem 26. Let $\Gamma=\mathcal{B}_0(\mathbb{R}^d)$ and let $W=\{W_A,\ A\in\Gamma\}$ be a 1-dimensional $G$-white noise on a sublinear expectation space $(\Omega,\mathcal{H},\hat{\mathbb{E}})$. Then $\{\int_{\mathbb{R}^d} f(x)W(dx): f\in L^2(\mathbb{R}^d)\}$ is a $G$-Gaussian random field.

To prove this theorem, we need the following lemma, which shows that the limit of a sequence of $G$-normally distributed random vectors is still $G$-normally distributed.

Lemma 27. Let $\{X_n\}_{n=1}^\infty$ be a sequence of $m$-dimensional $G$-normally distributed random vectors on a sublinear expectation space $(\Omega,\mathcal{H},\hat{\mathbb{E}})$. If for some $p\ge 1$, $\lim_{n\to\infty}\hat{\mathbb{E}}[|X_n-X|^p]=0$, then $X$ is also $G$-normally distributed.

Proof. Since for some $p\ge 1$, $\lim_{n\to\infty}\hat{\mathbb{E}}[|X_n-X|^p]=0$, we can deduce that $\{X_n\}_{n=1}^\infty$ converges in distribution, by
\[ |F_{X_n}[\varphi]-F_X[\varphi]|=|\hat{\mathbb{E}}[\varphi(X_n)]-\hat{\mathbb{E}}[\varphi(X)]|\le C_\varphi\hat{\mathbb{E}}[|X_n-X|]\le C_\varphi\hat{\mathbb{E}}[|X_n-X|^p]^{1/p}\to 0
\]

as $n\to\infty$, where $C_\varphi$ is the Lipschitz constant of $\varphi\in C_{b.Lip}(\mathbb{R}^m)$. Assume that $\bar X_n$ and $\bar X$ are independent copies of $X_n$ and $X$, respectively. Then, for each $\varphi\in C_{b.Lip}(\mathbb{R}^m)$ and $a, b\ge 0$, we have
\[ \hat{\mathbb{E}}[\varphi(aX_n+b\bar X_n)]=\hat{\mathbb{E}}[\varphi(\sqrt{a^2+b^2}\,X_n)]\to\hat{\mathbb{E}}[\varphi(\sqrt{a^2+b^2}\,X)],\quad\text{as } n\to\infty. \]
On the other hand, setting $\varphi_n(y)=\hat{\mathbb{E}}[\varphi(y+b X_n)]$ and $\bar\varphi(y)=\hat{\mathbb{E}}[\varphi(y+b X)]$, it is easy to show that $\varphi_n(y)$ is a uniformly Lipschitz function with the same Lipschitz constant as $\varphi$, and that $|\varphi_n(y)-\bar\varphi(y)|\le bC_\varphi\hat{\mathbb{E}}[|X-X_n|]$ for all $y\in\mathbb{R}^m$. Thus
\begin{align*}
|\hat{\mathbb{E}}[\varphi(aX_n+b\bar X_n)]-\hat{\mathbb{E}}[\varphi(aX+b\bar X)]| &=|\hat{\mathbb{E}}[\varphi_n(aX_n)]-\hat{\mathbb{E}}[\bar\varphi(aX)]| \\
&\le|\hat{\mathbb{E}}[\varphi_n(aX_n)]-\hat{\mathbb{E}}[\varphi_n(aX)]|+\hat{\mathbb{E}}[|\varphi_n(aX)-\bar\varphi(aX)|]\to 0,
\end{align*}
as $n\to\infty$. Hence $\hat{\mathbb{E}}[\varphi(aX+b\bar X)]=\hat{\mathbb{E}}[\varphi(\sqrt{a^2+b^2}\,X)]$ for each $\varphi\in C_{b.Lip}(\mathbb{R}^m)$, which completes the proof. $\square$

Proof of Theorem 26. For any $n\in\mathbb{N}$ and $f_1,\cdots,f_n\in L^2(\mathbb{R}^d)$, there exists a sequence of simple functions $\{(f_1^m,\cdots,f_n^m),\ m\ge 1\}$ such that $\lim_{m\to\infty} f_i^m=f_i$ in $L^2(\mathbb{R}^d)$, $i=1,\cdots,n$, and
\[ \Big(\int_{\mathbb{R}^d} f_1(x)W(dx),\cdots,\int_{\mathbb{R}^d} f_n(x)W(dx)\Big)=\lim_{m\to\infty}\Big(\int_{\mathbb{R}^d} f_1^m(x)W(dx),\cdots,\int_{\mathbb{R}^d} f_n^m(x)W(dx)\Big). \]
By Proposition 1.6 in Chapter II of Peng [15], it is easy to verify that
\[ \Big(\int_{\mathbb{R}^d} f_1^m(x)W(dx),\cdots,\int_{\mathbb{R}^d} f_n^m(x)W(dx)\Big) \]
is $G$-normally distributed. Using Lemma 27, we deduce that
\[ \Big(\int_{\mathbb{R}^d} f_1(x)W(dx),\cdots,\int_{\mathbb{R}^d} f_n(x)W(dx)\Big) \]
is $G$-normally distributed, which completes the proof. $\square$
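As a quick sanity check of formula (4.3), the sublinear "covariance" of two cells is determined by their overlap: with $Q=\begin{pmatrix}0&1/2\\1/2&0\end{pmatrix}$, positive homogeneity of $\hat{\mathbb{E}}$ gives $\hat{\mathbb{E}}[W_AW_B]=2G(\lambda_{A\cap B})=\overline{\sigma}^2\lambda_{A\cap B}$. The sketch below computes this for interval cells on $\mathbb{R}$; the interval endpoints and the values $\underline{\sigma}=0.5$, $\overline{\sigma}=1$ are illustrative assumptions.

```python
sig_lo, sig_hi = 0.5, 1.0

def G(q):
    """G(q) = (sig_hi^2 q^+ - sig_lo^2 q^-)/2."""
    return 0.5 * (sig_hi**2 * max(q, 0.0) - sig_lo**2 * max(-q, 0.0))

def overlap(I, J):
    """Lebesgue measure of the intersection of two intervals (a, b]."""
    return max(0.0, min(I[1], J[1]) - max(I[0], J[0]))

def cov_upper(A, B):
    """E^[W_A W_B] = 2 G(lambda_{A ∩ B}), from (4.3) with Q = [[0,1/2],[1/2,0]]."""
    return 2.0 * G(overlap(A, B))

# With A = (0, t], B = (0, s] this gives sig_hi^2 * (s ∧ t)
t, s = 3.0, 2.0
print(cov_upper((0.0, t), (0.0, s)))   # sig_hi^2 * min(s, t) = 2.0
```

For disjoint cells the overlap is zero and the covariance vanishes, consistent with condition (4.1).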

Example 28. Let $\{W_A,\ A\in\mathcal{B}_0(\mathbb{R})\}$ be a 1-dimensional $G$-white noise. Define $B_t=W([0,t])$, $t\in\mathbb{R}_+$. Then
\[ \hat{\mathbb{E}}[B_tB_s]=\overline{\sigma}^2\lambda_1([0,t]\cap[0,s])=\overline{\sigma}^2(s\wedge t). \]
Different from the classical case, $B_t$ is no longer a $G$-Brownian motion, although $B_t\overset{d}{=}N(\{0\}\times[\underline{\sigma}^2 t,\overline{\sigma}^2 t])$ for each $t\ge 0$.

4.3. The continuity of spatial $G$-white noise. For each $x\in\mathbb{R}^d$, let $\Gamma$ denote the set of hypercubes in $\mathbb{R}^d$ that contain the origin and $x$ as their two extremal vertices, in the sense that
\[ (4.6)\qquad \Gamma=\big\{(0\wedge x, 0\vee x]:=(0\wedge x_1, 0\vee x_1]\times\cdots\times(0\wedge x_d, 0\vee x_d]:\ \forall x=(x_1,\cdots,x_d)\in\mathbb{R}^d\big\}. \]
Suppose $(V_\gamma)_{\gamma\in\Gamma}$ is a random field on a sublinear expectation space $(\Omega,\mathcal{H},\hat{\mathbb{E}})$. Define $V_x:=V_{(0\wedge x,\,0\vee x]}$, $x\in\mathbb{R}^d$. Then $(V_x)_{x\in\mathbb{R}^d}$ is a random field indexed by $x\in\mathbb{R}^d$. In particular, such a field is derived from the spatial random field $W_A$, $A\in\mathcal{B}(\mathbb{R}^d)$.

For the spatial white noise $(W_A)_{A\in\mathcal{B}(\mathbb{R}^d)}$, the corresponding random field $(W_x)_{x\in\mathbb{R}^d}$ is called the $\mathbb{R}^d$-indexed spatial white noise. We will see in this subsection that, just as in the classical situation, this $\mathbb{R}^d$-indexed spatial white noise $(W_x)_{x\in\mathbb{R}^d}$ has a quasi-continuous modification in the sense of Definition 12.

We recall the following generalized Kolmogorov continuity criterion for a random field indexed by $\mathbb{R}^d$ (see Denis et al. [5]).

Theorem 29. Let $p>0$ be given and let $(V_x)_{x\in\mathbb{R}^d}$ be a random field such that $V_x\in L^p(\Omega)$ for each $x\in\mathbb{R}^d$. Assume that there exist positive constants $c,\varepsilon$ such that
\[ \hat{\mathbb{E}}[|V_x-V_y|^p]\le c|x-y|^{d+\varepsilon},\quad\text{for all } x,y\in\mathbb{R}^d. \]
Then $(V_x)_{x\in\mathbb{R}^d}$ has a continuous quasi-modification $(\tilde V_x)_{x\in\mathbb{R}^d}$ such that, for each $\alpha\in[0,\varepsilon/p)$,
\[ \hat{\mathbb{E}}\Big[\Big(\sup_{x\neq y}\frac{|\tilde V_x-\tilde V_y|}{|x-y|^\alpha}\Big)^p\Big]<\infty. \]
Hence the paths of $\tilde V$ are quasi-surely Hölder continuous of order $\alpha$ for every $\alpha\in[0,\varepsilon/p)$, in the sense that there exists a Borel set $N$ with capacity 0 such that, for all $\omega\in N^c$, the mapping $x\mapsto\tilde V_x(\omega)$ is Hölder continuous of order $\alpha$.

We have the following generalized Kolmogorov criterion for the spatial white noise $(W_x)_{x\in\mathbb{R}^d}$.

Theorem 30. For each $x\in\mathbb{R}^d$, let $W_x:=W_{(0,x]}$, where $W$ is a 1-dimensional spatial $G$-white noise on the sublinear expectation space $(\Omega,\mathcal{H},\hat{\mathbb{E}})$. Then there exists a continuous quasi-modification $\tilde W$ of $W$.

Proof. For notational simplicity, we only prove the case $d=2$. For each $x=(x_1,x_2),\ y=(y_1,y_2)\in\mathbb{R}^2$, without loss of generality assume that $0\le x_i\le y_i$, $i=1,2$. Then, by (4.3),

\[ \hat{\mathbb{E}}[|W_y-W_x|^2]=\overline{\sigma}^2\lambda_{(0,y]\setminus(0,x]}=\overline{\sigma}^2(y_1y_2-x_1x_2). \]
Thus $W_y-W_x$ is $G$-normally distributed and $W_y-W_x\overset{d}{=}N(\{0\}\times[\underline{\sigma}^2(y_1y_2-x_1x_2),\overline{\sigma}^2(y_1y_2-x_1x_2)])$. It follows that

\[ \hat{\mathbb{E}}[|W_y-W_x|^6]=15\,\overline{\sigma}^6|y_1y_2-x_1x_2|^3. \]
Since
\[ y_1y_2-x_1x_2\le(y_1\vee y_2)\big(|y_1-x_1|+|y_2-x_2|\big)\le\sqrt{2}\,(y_1\vee y_2)|y-x|, \]
we have
\[ \hat{\mathbb{E}}[|W_y-W_x|^6]\le 30\sqrt{2}\,\overline{\sigma}^6(y_1\vee y_2)^3|y-x|^3. \]
We then apply Theorem 29 and obtain that $W$ has a continuous quasi-modification $\tilde W$. $\square$
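The elementary inequality $y_1y_2-x_1x_2\le\sqrt{2}\,(y_1\vee y_2)|y-x|$ used in the proof (valid for $0\le x_i\le y_i$) follows from the decomposition $y_1y_2-x_1x_2=y_2(y_1-x_1)+x_1(y_2-x_2)$ together with the Cauchy-Schwarz bound $|y_1-x_1|+|y_2-x_2|\le\sqrt{2}\,|y-x|$. A quick randomized check (illustrative, with a fixed seed):

```python
import math
import random

random.seed(0)
for _ in range(10_000):
    # Draw 0 <= x_i <= y_i, the configuration assumed in the proof of Theorem 30
    y1, y2 = random.uniform(0, 10), random.uniform(0, 10)
    x1, x2 = random.uniform(0, y1), random.uniform(0, y2)
    lhs = y1 * y2 - x1 * x2
    rhs = math.sqrt(2) * max(y1, y2) * math.hypot(y1 - x1, y2 - x2)
    assert lhs <= rhs + 1e-12, (x1, x2, y1, y2)
print("inequality verified on 10000 random samples")
```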

5. Spatial and temporal white noise under G-sublinear expectation

5.1. Basic notions and main properties. In this section, we study a new type of spatial-temporal white noise, which is a random field defined on the following index set:
\[ \Gamma=\{[s,t)\times A:\ 0\le s\le t<\infty,\ A\in\mathcal{B}_0(\mathbb{R}^d)\}. \]
Specifically, we have:

Definition 31. A random field $\{W([s,t)\times A)\}_{([s,t)\times A)\in\Gamma}$ on a sublinear expectation space $(\Omega,\mathcal{H},\hat{\mathbb{E}})$ is called a 1-dimensional spatial-temporal $G$-white noise if it satisfies the following conditions:

(i) For each fixed $[s,t)$, the random field $\{W([s,t)\times A)\}_{A\in\mathcal{B}_0(\mathbb{R}^d)}$ is a 1-dimensional spatial white noise with the same finite-dimensional distributions as $(\sqrt{t-s}\,W_A)_{A\in\mathcal{B}_0(\mathbb{R}^d)}$, where $(W_A)_{A\in\mathcal{B}_0(\mathbb{R}^d)}$ is a 1-dimensional $G$-white noise;
(ii) For any $r\le s\le t$ and $A\in\mathcal{B}_0(\mathbb{R}^d)$, $W([r,s)\times A)+W([s,t)\times A)=W([r,t)\times A)$;
(iii) $W([s,t)\times A)$ is independent of $(W([s_1,t_1)\times A_1),\cdots,W([s_n,t_n)\times A_n))$, if $t_i<s$ and $A_i\in\mathcal{B}_0(\mathbb{R}^d)$, for $i=1,\cdots,n$.

Remark 32. It is important to mention that $\{W([s,t)\times A),\ 0\le s\le t<\infty,\ A\in\mathcal{B}_0(\mathbb{R}^d)\}$ is no longer a $G$-Gaussian random field.

We now present the existence of the spatial-temporal $G$-white noise and the construction of the corresponding expectation and conditional expectation. Suppose $\Omega=\mathbb{R}^\Gamma$ and, for each $\omega\in\Omega$, define the canonical process $(W_\gamma)_{\gamma\in\Gamma}$ by
\[ W([s,t)\times A)(\omega)=\omega([s,t)\times A),\quad\forall\, 0\le s\le t<\infty,\ A\in\mathcal{B}_0(\mathbb{R}^d). \]
For each $T\ge 0$, set $\mathcal{F}_T=\sigma\{W([s,t)\times A),\ 0\le s\le t\le T,\ A\in\mathcal{B}_0(\mathbb{R}^d)\}$, $\mathcal{F}=\bigcup_{T\ge 0}\mathcal{F}_T$, and
\begin{align*}
L_{ip}(\mathcal{F}_T)=\{&\varphi(W([s_1,t_1)\times A_1),\cdots,W([s_n,t_n)\times A_n)):\ n\in\mathbb{N},\ s_i\le t_i\le T,\\
&i=1,\cdots,n,\ A_1,\cdots,A_n\in\mathcal{B}_0(\mathbb{R}^d),\ \varphi\in C_{l.Lip}(\mathbb{R}^n)\}.
\end{align*}
We also set $L_{ip}(\mathcal{F})=\bigcup_{n=1}^\infty L_{ip}(\mathcal{F}_n)$. For each $X\in L_{ip}(\mathcal{F})$, without loss of generality, we assume that $X$ has the form
\begin{align*}
X=\varphi(&W([0,t_1)\times A_1),\cdots,W([0,t_1)\times A_m),\cdots,\\
&W([t_{n-1},t_n)\times A_1),\cdots,W([t_{n-1},t_n)\times A_m)),
\end{align*}
where $0=t_0\le t_1<\cdots<t_n<\infty$, $\{A_1,\cdots,A_m\}\subset\mathcal{B}_0(\mathbb{R}^d)$ are mutually disjoint, and $\varphi\in C_{l.Lip}(\mathbb{R}^{m\times n})$. Then we can define the sublinear expectation of $X$ by the following procedure.

Let $G_{A_1,\cdots,A_m}:\mathbb{S}(m)\to\mathbb{R}$, $0\le s<t<\infty$, be defined as in (4.3). By Theorem 22, there exist $m$-dimensional $G$-normally distributed random vectors

$\xi_i=(\xi_i^{(1)},\cdots,\xi_i^{(m)})$, $1\le i\le n$, on a sublinear expectation space $(\tilde\Omega,\tilde{\mathcal{H}},\tilde{\mathbb{E}})$, such that $\xi_{i+1}$ is independent of $(\xi_1,\cdots,\xi_i)$ for each $i=1,2,\cdots,n-1$, and
\[ G_{A_1,\cdots,A_m}(Q)=\tilde{\mathbb{E}}\Big[\frac{1}{2}\langle Q\xi_i,\xi_i\rangle\Big]=G\Big(\sum_{i=1}^m q_{ii}\lambda_{A_i}\Big),\quad\forall Q\in\mathbb{S}(m), \]
where $G(q)=\frac{\overline{\sigma}^2}{2}q^+-\frac{\underline{\sigma}^2}{2}q^-$, $q\in\mathbb{R}$. Define
\begin{align*}
\hat{\mathbb{E}}[X]&=\hat{\mathbb{E}}[\varphi(W([0,t_1)\times A_1),\cdots,W([0,t_1)\times A_m),\cdots,W([t_{n-1},t_n)\times A_1),\cdots,W([t_{n-1},t_n)\times A_m))]\\
&=\tilde{\mathbb{E}}\big[\varphi\big(\sqrt{t_1}\,\xi_1^{(1)},\cdots,\sqrt{t_1}\,\xi_1^{(m)},\cdots,\sqrt{t_n-t_{n-1}}\,\xi_n^{(1)},\cdots,\sqrt{t_n-t_{n-1}}\,\xi_n^{(m)}\big)\big],
\end{align*}
and the related conditional expectation of $X$ under $\mathcal{F}_t$, $t_j\le t<t_{j+1}$, denoted by $\hat{\mathbb{E}}[X|\mathcal{F}_t]$, is defined by
\begin{align*}
&\hat{\mathbb{E}}[\varphi(W([0,t_1)\times A_1),\cdots,W([0,t_1)\times A_m),\cdots,W([t_{n-1},t_n)\times A_1),\cdots,W([t_{n-1},t_n)\times A_m))\,|\,\mathcal{F}_t]\\
&=\psi(W([0,t_1)\times A_1),\cdots,W([0,t_1)\times A_m),\cdots,W([t_{j-1},t_j)\times A_1),\cdots,W([t_{j-1},t_j)\times A_m)),
\end{align*}
where
\[ \psi(x_{11},\cdots,x_{jm})=\tilde{\mathbb{E}}\big[\varphi\big(x_{11},\cdots,x_{jm},\sqrt{t_{j+1}-t_j}\,\xi_{j+1}^{(1)},\cdots,\sqrt{t_{j+1}-t_j}\,\xi_{j+1}^{(m)},\cdots,\sqrt{t_n-t_{n-1}}\,\xi_n^{(1)},\cdots,\sqrt{t_n-t_{n-1}}\,\xi_n^{(m)}\big)\big]. \]
It is easy to check that $\hat{\mathbb{E}}[\cdot]$ defines a sublinear expectation on $L_{ip}(\mathcal{F})$ and that the canonical process $(W_\gamma)_{\gamma\in\Gamma}$ is a 1-dimensional spatial-temporal white noise on $(\Omega, L_{ip}(\mathcal{F}),\hat{\mathbb{E}})$.

For each $p\ge 1$ and $T\ge 0$, we denote by $L^p_G(W_{[0,T]})$ (resp. $L^p_G(W)$) the completion of $L_{ip}(\mathcal{F}_T)$ (resp. $L_{ip}(\mathcal{F})$) under the norm $\|X\|_p:=(\hat{\mathbb{E}}[|X|^p])^{1/p}$. The conditional expectation $\hat{\mathbb{E}}[\cdot|\mathcal{F}_t]: L_{ip}(\mathcal{F})\to L_{ip}(\mathcal{F}_t)$ is a continuous mapping under $\|\cdot\|_p$ and can be extended continuously to a mapping $L^p_G(W)\to L^p_G(W_{[0,t]})$, since
\[ |\hat{\mathbb{E}}[X|\mathcal{F}_t]-\hat{\mathbb{E}}[Y|\mathcal{F}_t]|\le\hat{\mathbb{E}}[\,|X-Y|\,|\,\mathcal{F}_t]. \]
It is easy to verify that the conditional expectation $\hat{\mathbb{E}}[\cdot|\mathcal{F}_t]$ satisfies the following properties; we omit the proof here.

Proposition 33. The conditional expectation $\hat{\mathbb{E}}[\cdot|\mathcal{F}_t]: L^p_G(W)\to L^p_G(W_{[0,t]})$ satisfies the following properties: for any $X, Y\in L^p_G(W)$, $\eta\in L^p_G(W_{[0,t]})$,
(i): $\hat{\mathbb{E}}[X|\mathcal{F}_t]\ge\hat{\mathbb{E}}[Y|\mathcal{F}_t]$ for $X\ge Y$.
(ii): $\hat{\mathbb{E}}[\eta|\mathcal{F}_t]=\eta$.
(iii): $\hat{\mathbb{E}}[X+Y|\mathcal{F}_t]\le\hat{\mathbb{E}}[X|\mathcal{F}_t]+\hat{\mathbb{E}}[Y|\mathcal{F}_t]$.
(iv): $\hat{\mathbb{E}}[\eta X|\mathcal{F}_t]=\eta^+\hat{\mathbb{E}}[X|\mathcal{F}_t]+\eta^-\hat{\mathbb{E}}[-X|\mathcal{F}_t]$.
(v): $\hat{\mathbb{E}}[\hat{\mathbb{E}}[X|\mathcal{F}_t]|\mathcal{F}_s]=\hat{\mathbb{E}}[X|\mathcal{F}_{t\wedge s}]$.

Proposition 34. Let $X, Y\in L^p_G(W_{[0,T]})$, $t\le T$, be such that $\hat{\mathbb{E}}[Y|\mathcal{F}_t]=-\hat{\mathbb{E}}[-Y|\mathcal{F}_t]$. Then
\[ \hat{\mathbb{E}}[X+\alpha Y|\mathcal{F}_t]=\hat{\mathbb{E}}[X|\mathcal{F}_t]+\alpha\hat{\mathbb{E}}[Y|\mathcal{F}_t],\quad\alpha\in\mathbb{R}. \]

5.2. Stochastic integrals with respect to spatial-temporal white noise. In the following, we define the stochastic integral with respect to the spatial-temporal $G$-white noise $W([s,t)\times A)$, $0\le s<t\le T$, $A\in\mathcal{B}_0(\mathbb{R}^d)$.

Firstly, let $M^0([0,T]\times\mathbb{R}^d)$ be the collection of simple processes of the form

\[ f(s,x;\omega)=\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}(\omega)\mathbf{1}_{A_j}(x)\mathbf{1}_{[t_i,t_{i+1})}(s), \]
where $X_{ij}\in L_{ip}(\mathcal{F}_{t_i})$, $i=0,\cdots,n-1$, $j=1,\cdots,m$, $0=t_0<t_1<\cdots<t_n=T$, and $\{A_j\}_{j=1}^m$ is a space partition of $\mathbb{R}^d$, namely,
\[ A_j\in\mathcal{B}_0(\mathbb{R}^d),\quad\bigcup_{j=1}^m A_j=\mathbb{R}^d,\quad\text{and } A_j\cap A_k=\emptyset\ \text{if } j\neq k,\ j,k=1,\cdots,m. \]
The Bochner integral of $f\in M^0([0,T]\times\mathbb{R}^d)$ is defined by

\[ I_B(f)=\int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)\,ds\,dx=\int_0^T\Big(\int_{\mathbb{R}^d} f(s,x)\,dx\Big)ds:=\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}(t_{i+1}-t_i)\lambda_{A_j}. \]
It is clear that $I_B: M^0([0,T]\times\mathbb{R}^d)\to L_{ip}(\mathcal{F}_T)$ is a linear mapping. For a given $p\ge 1$ and $f\in M^0([0,T]\times\mathbb{R}^d)$, let
\[ \|f\|_{M^p}=\Big(\hat{\mathbb{E}}\Big[\int_0^T\!\!\int_{\mathbb{R}^d}|f(s,x)|^p\,ds\,dx\Big]\Big)^{1/p}=\Big(\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m|X_{ij}|^p(t_{i+1}-t_i)\lambda_{A_j}\Big]\Big)^{1/p}. \]
Then $\|\cdot\|_{M^p}$ is a Banach norm on $M^0([0,T]\times\mathbb{R}^d)$, and the completion of $M^0([0,T]\times\mathbb{R}^d)$ under $\|\cdot\|_{M^p}$, denoted by $M^p_G([0,T]\times\mathbb{R}^d)$, is a Banach space. The stochastic integral with respect to the spatial-temporal white noise $W$, from $M^0([0,T]\times\mathbb{R}^d)$ to $L_{ip}(\mathcal{F}_T)$, is defined as follows:
\[ (5.1)\qquad \int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx):=\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}W([t_i,t_{i+1})\times A_j). \]

Lemma 35. For any $0\le s\le t<\infty$ and $\xi\in L^p_G(W_{[0,s]})$, we have
\[ \hat{\mathbb{E}}[\xi W([s,t)\times A)]=0,\quad\forall A\in\mathcal{B}_0(\mathbb{R}^d), \]
and
\[ \hat{\mathbb{E}}[\xi W([s,t)\times A)W([s,t)\times\bar A)]=0,\quad\forall A,\bar A\in\mathcal{B}_0(\mathbb{R}^d)\ \text{such that } A\cap\bar A=\emptyset. \]

Proof. The first relation is obvious, since $W([s,t)\times A)$ is independent of $\xi$ and $\hat{\mathbb{E}}[\pm W([s,t)\times A)]=0$. The second relation follows from condition (i) of Definition 31 and (4.1): for disjoint $A,\bar A$,
\[ \hat{\mathbb{E}}[\pm W([s,t)\times A)W([s,t)\times\bar A)]=(t-s)\,\hat{\mathbb{E}}[\pm W_A W_{\bar A}]=0.\quad\square \]

Using this lemma, we now provide the following result, which allows us to extend the domain of the stochastic integral to $M^2_G([0,T]\times\mathbb{R}^d)$:

Lemma 36. For each $f\in M^0([0,T]\times\mathbb{R}^d)$,
\[ (5.2)\qquad \hat{\mathbb{E}}\Big[\int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx)\Big]=0, \]
\[ (5.3)\qquad \hat{\mathbb{E}}\Big[\Big(\int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx)\Big)^2\Big]\le\overline{\sigma}^2\,\hat{\mathbb{E}}\Big[\int_0^T\!\!\int_{\mathbb{R}^d}|f(s,x)|^2\,ds\,dx\Big]. \]

Proof. Suppose $f(s,x;\omega)=\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}(\omega)\mathbf{1}_{A_j}(x)\mathbf{1}_{[t_i,t_{i+1})}(s)$, where $0\le t_0<t_1<\cdots<t_n\le T$, $\{A_j\}_{j=1}^m\subset\mathcal{B}_0(\mathbb{R}^d)$ is a partition of $\mathbb{R}^d$, and $X_{ij}\in L_{ip}(\mathcal{F}_{t_i})$, $i=0,\cdots,n-1$, $j=1,\cdots,m$. It follows from the first relation of Lemma 35, combined with Lemma 6, that
\[ \hat{\mathbb{E}}\Big[\int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx)\Big]=\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}W([t_i,t_{i+1})\times A_j)\Big]=0. \]
Hence we derive equation (5.2). For the second one,

\begin{align*}
\hat{\mathbb{E}}\Big[\Big(\int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx)\Big)^2\Big]
&=\hat{\mathbb{E}}\Big[\Big(\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}W([t_i,t_{i+1})\times A_j)\Big)^2\Big]\\
&=\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m\sum_{l=0}^{n-1}\sum_{k=1}^m X_{ij}X_{lk}\,W([t_i,t_{i+1})\times A_j)W([t_l,t_{l+1})\times A_k)\Big]\\
&=\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m\sum_{k=1}^m X_{ij}X_{ik}\,W([t_i,t_{i+1})\times A_j)W([t_i,t_{i+1})\times A_k)\Big],
\end{align*}
where the last equality follows from the first relation of Lemma 35, combined with Lemma 6. We then apply the second relation of Lemma 35 (since $A_j\cap A_k=\emptyset$ for $j\neq k$), once more combined with Lemma 6, to obtain

\begin{align*}
\hat{\mathbb{E}}\Big[\Big(\int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx)\Big)^2\Big]
&=\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}^2\big(W([t_i,t_{i+1})\times A_j)\big)^2\Big]\\
&\le\hat{\mathbb{E}}[\eta_n]+\overline{\sigma}^2\,\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}^2(t_{i+1}-t_i)\lambda_{A_j}\Big]\\
&=\overline{\sigma}^2\,\hat{\mathbb{E}}\Big[\sum_{i=0}^{n-1}\sum_{j=1}^m X_{ij}^2(t_{i+1}-t_i)\lambda_{A_j}\Big]\\
&=\overline{\sigma}^2\,\hat{\mathbb{E}}\Big[\int_0^T\!\!\int_{\mathbb{R}^d}|f(t,x)|^2\,dx\,dt\Big],
\end{align*}
where we denote
\[ \eta_k:=\sum_{i=0}^{k-1}\sum_{j=1}^m X_{ij}^2\Big[\big(W([t_i,t_{i+1})\times A_j)\big)^2-\overline{\sigma}^2(t_{i+1}-t_i)\lambda_{A_j}\Big],\quad k=1,\cdots,n. \]
Here, to obtain $\hat{\mathbb{E}}[\eta_n]=0$, we take the following procedure, adopted from Peng [15]:
\begin{align*}
\hat{\mathbb{E}}[\eta_n]&=\hat{\mathbb{E}}\Big[\eta_{n-1}+\sum_{j=1}^m X_{n-1,j}^2\Big[\big(W([t_{n-1},t_n)\times A_j)\big)^2-\overline{\sigma}^2(t_n-t_{n-1})\lambda_{A_j}\Big]\Big]\\
&=\hat{\mathbb{E}}\Big[a+\sum_{j=1}^m\hat{\mathbb{E}}\Big[b_j^2\Big(\big(W([t_{n-1},t_n)\times A_j)\big)^2-\overline{\sigma}^2(t_n-t_{n-1})\lambda_{A_j}\Big)\Big]\Big]_{a=\eta_{n-1},\,b_j=X_{n-1,j}}\\
&=\hat{\mathbb{E}}[\eta_{n-1}]=\cdots=\hat{\mathbb{E}}[\eta_1]=0.
\end{align*}
The proof is complete. $\square$

Therefore, we can continuously extend the domain $M^0([0,T]\times\mathbb{R}^d)$ of the stochastic integral (5.1) to $M^2_G([0,T]\times\mathbb{R}^d)$. Indeed, for any $f\in M^2_G([0,T]\times\mathbb{R}^d)$, there exists a sequence of simple processes $f_n\in M^0([0,T]\times\mathbb{R}^d)$ such that $\lim_{n\to\infty}\|f_n-f\|_{M^2}=0$. By Lemma 36,
\[ \Big\{\int_0^T\!\!\int_{\mathbb{R}^d} f_n(s,x)W(ds,dx)\Big\}_{n=1}^\infty \]
is a Cauchy sequence in $L^2_G(W_{[0,T]})$. As $L^2_G(W_{[0,T]})$ is complete, we can define
\[ \int_0^T\!\!\int_{\mathbb{R}^d} f(s,x)W(ds,dx):=L^2\text{-}\lim_{n\to\infty}\int_0^T\!\!\int_{\mathbb{R}^d} f_n(s,x)W(ds,dx). \]
It is easy to check that the stochastic integral has the following properties.

Proposition 37. For each $f,g\in M^2_G([0,T]\times\mathbb{R}^d)$ and $0\le s\le r\le t\le T$, we have:
(1) $\int_s^t\int_{\mathbb{R}^d} f(u,x)W(du,dx)=\int_s^r\int_{\mathbb{R}^d} f(u,x)W(du,dx)+\int_r^t\int_{\mathbb{R}^d} f(u,x)W(du,dx)$;
(2) $\int_s^t\int_{\mathbb{R}^d}(\alpha f(u,x)+g(u,x))W(du,dx)=\alpha\int_s^t\int_{\mathbb{R}^d} f(u,x)W(du,dx)+\int_s^t\int_{\mathbb{R}^d} g(u,x)W(du,dx)$, if $\alpha\in L^1_G(W_{[0,s]})$ is bounded;
(3) $\hat{\mathbb{E}}\big[\int_r^T\int_{\mathbb{R}^d} f(u,x)W(du,dx)\,\big|\,\mathcal{F}_r\big]=0$.

For each $f\in M^2_G([0,T]\times\mathbb{R}^d)$, by (3) in Proposition 37, we obtain that
\[ \Big\{\int_0^t\!\!\int_{\mathbb{R}^d} f(r,x)W(dr,dx),\ t\in[0,T]\Big\} \]
is a martingale, in the sense that
\[ \hat{\mathbb{E}}\Big[\int_0^t\!\!\int_{\mathbb{R}^d} f(r,x)W(dr,dx)\,\Big|\,\mathcal{F}_s\Big]=\int_0^s\!\!\int_{\mathbb{R}^d} f(r,x)W(dr,dx),\quad\text{if } 0\le s\le t\le T. \]
Furthermore, $\{\int_0^t\int_{\mathbb{R}^d} f(r,x)W(dr,dx),\ \mathcal{F}_t,\ t\in[0,T]\}$ is a martingale measure.
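The key step $\hat{\mathbb{E}}[\eta_n]=0$ above can be made concrete for deterministic coefficients: under a scenario (in the sense of the upper-expectation representation of the Appendix) in which the space-time cell $[t_i,t_{i+1})\times A_j$ carries volatility $\sigma_{ij}\in[\underline{\sigma},\overline{\sigma}]$, one has $E[\eta_n]=\sum_{i,j}X_{ij}^2(\sigma_{ij}^2-\overline{\sigma}^2)(t_{i+1}-t_i)\lambda_{A_j}\le 0$, with maximum $0$ attained at $\sigma_{ij}\equiv\overline{\sigma}$; the same choice turns (5.3) into an equality. A sketch in which the grid, coefficients and measures are illustrative assumptions:

```python
import itertools

sig_lo, sig_hi = 0.5, 1.0
t = [0.0, 0.4, 1.0]                  # temporal grid t_0 < t_1 < t_2
lam = [0.8, 1.5]                     # lambda_{A_j} for two disjoint cells
X = [[1.0, -2.0], [0.5, 3.0]]        # deterministic coefficients X_{ij}

def mean_eta(sig):
    """E[eta_n] under the scenario sigma_{ij}; sig is a flat 4-tuple."""
    it = iter(sig)
    return sum(X[i][j]**2 * (next(it)**2 - sig_hi**2) * (t[i + 1] - t[i]) * lam[j]
               for i in range(2) for j in range(2))

grid = [sig_lo + k * (sig_hi - sig_lo) / 8 for k in range(9)]
sup_eta = max(mean_eta(s) for s in itertools.product(grid, repeat=4))
print(sup_eta)    # supremum 0, attained at sigma_{ij} = sig_hi
```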

6. Concluding remarks

Based on a new type of $G$-Gaussian process introduced in Peng [17], we have developed general Gaussian random fields under the sublinear expectation framework. We first construct a $G$-Gaussian random field whose finite dimensional distributions are $G$-normally distributed. A remarkable point is that, differently from classical probability theory, a $G$-Brownian motion is not a $G$-Gaussian random field: a $G$-Brownian motion is suited to being a temporal random field, whereas a Gaussian random field is rather suited to describing spatial uncertainty. This new framework of $G$-Gaussian random fields can be used to quantitatively and efficiently study such types of spatial uncertainties and risks.

We then define a special family of generating functions $\{G_\gamma\}_{\gamma\in\mathcal{J}_\Gamma}$ as in (4.3) and establish the corresponding white noise of space type. The stochastic integral of functions in $L^2(\mathbb{R}^d)$ with respect to the spatial white noise has been established, and the family of such stochastic integrals is still a $G$-Gaussian random field.

Furthermore, we focus on the spatial-temporal $G$-white noise on the sublinear expectation space. Applying a method similar to that of the $G$-Itô integral, we develop the stochastic calculus with respect to the spatial-temporal white noise on $M^2_G([0,T]\times\mathbb{R}^d)$. $G$-Gaussian random fields and $G$-white noise theory can be widely used to study uncertain models in many physical fields, such as statistical physics and quantum theory.

7. Appendix

7.1. Upper expectation and the corresponding capacity. Let $\Gamma$ be defined as in (4.6). In this section, unless otherwise mentioned, we always denote by $\Omega=C_0(\Gamma,\mathbb{R})$ the space of all $\mathbb{R}$-valued continuous functions $(\omega_\gamma)_{\gamma\in\Gamma}$ with $\omega_0=0$, equipped with the distance
\[ \rho(\omega^{(1)},\omega^{(2)}):=\sum_{i=1}^\infty 2^{-i}\Big[\Big(\max_{\gamma\subset B_d(0;i)}|\omega^{(1)}_\gamma-\omega^{(2)}_\gamma|\Big)\wedge 1\Big],\quad\forall\,\omega^{(1)},\omega^{(2)}\in\Omega, \]
where $B_d(0;i)=\{x\in\mathbb{R}^d: |x|\le i\}$. Let $\bar\Omega=\mathbb{R}^\Gamma$ denote the space of all $\mathbb{R}$-valued functions $(\bar\omega_\gamma)_{\gamma\in\Gamma}$. We also denote by $\mathcal{B}(\Omega)$ the $\sigma$-algebra generated by all open sets, and let $\mathcal{B}(\bar\Omega)$ be the $\sigma$-algebra generated by all finite dimensional cylinder sets. The corresponding canonical process is $W_\gamma(\omega)=\omega_\gamma$ (respectively, $\bar W_\gamma(\bar\omega)=\bar\omega_\gamma$), $\gamma\in\Gamma$, for $\omega\in\Omega$ (respectively, $\bar\omega\in\bar\Omega$).

The spaces of Lipschitz cylinder functions on $\Omega$ and $\bar\Omega$ are denoted respectively by
\begin{align*}
L_{ip}(\Omega)&:=\{\varphi(W_{\gamma_1},W_{\gamma_2},\cdots,W_{\gamma_n}):\ \forall n\ge 1,\ \gamma_1,\cdots,\gamma_n\in\Gamma,\ \forall\varphi\in C_{l.Lip}(\mathbb{R}^n)\},\\
L_{ip}(\bar\Omega)&:=\{\varphi(\bar W_{\gamma_1},\bar W_{\gamma_2},\cdots,\bar W_{\gamma_n}):\ \forall n\ge 1,\ \gamma_1,\cdots,\gamma_n\in\Gamma,\ \forall\varphi\in C_{l.Lip}(\mathbb{R}^n)\}.
\end{align*}
Let $\{G_\gamma(\cdot),\gamma\in\mathcal{J}_\Gamma\}$ be a family of continuous, monotone and sublinear functions given by (4.3). Following Section 4, we can construct the corresponding sublinear expectation $\bar{\mathbb{E}}$ on $(\bar\Omega, L_{ip}(\bar\Omega))$. Due to the natural correspondence between $L_{ip}(\bar\Omega)$ and $L_{ip}(\Omega)$, we also construct a sublinear expectation $\hat{\mathbb{E}}$ on $(\Omega, L_{ip}(\Omega))$ such that $(W_\gamma(\omega))_{\gamma\in\Gamma}$ is a $G$-white noise.

We want to construct a weakly compact family $\mathcal{P}$ of ($\sigma$-additive) probability measures on $(\Omega,\mathcal{B}(\Omega))$ such that the sublinear expectation can be represented as an upper expectation, namely,

\[ \hat{\mathbb{E}}[\cdot]=\max_{P\in\mathcal{P}}E_P[\cdot]. \]
The following lemmas are variations of Lemmas 3.3 and 3.4 in Chapter I of Peng [15]. We denote $\mathcal{J}_\Gamma:=\{\gamma=(\gamma_1,\ldots,\gamma_m):\ \forall m\in\mathbb{N},\ \gamma_i\in\Gamma\}$.

Lemma 38. Let $\gamma=(\gamma_1,\cdots,\gamma_m)\in\mathcal{J}_\Gamma$ and let $\{\varphi_n\}_{n=1}^\infty\subset C_{l.Lip}(\mathbb{R}^m)$ satisfy $\varphi_n\downarrow 0$ as $n\to\infty$. Then $\bar{\mathbb{E}}[\varphi_n(\bar W_{\gamma_1},\bar W_{\gamma_2},\cdots,\bar W_{\gamma_m})]\downarrow 0$.

Lemma 39. Let $E$ be a finitely additive linear expectation dominated by $\bar{\mathbb{E}}$ on $L_{ip}(\bar\Omega)$. Then there exists a unique probability measure $P$ on $(\bar\Omega,\mathcal{B}(\bar\Omega))$ such that $E[X]=E_P[X]$ for each $X\in L_{ip}(\bar\Omega)$.

Proof. For any fixed $\gamma=(\gamma_1,\ldots,\gamma_m)\in\mathcal{J}_\Gamma$, by Lemma 38, for each sequence $\{\varphi_n\}_{n=1}^\infty\subset C_{l.Lip}(\mathbb{R}^m)$ satisfying $\varphi_n\downarrow 0$, we have $E[\varphi_n(\bar W_{\gamma_1},\bar W_{\gamma_2},\cdots,\bar W_{\gamma_m})]\le\bar{\mathbb{E}}[\varphi_n(\bar W_{\gamma_1},\bar W_{\gamma_2},\cdots,\bar W_{\gamma_m})]\downarrow 0$. By the Daniell-Stone theorem (see Appendix B in Peng [15]), there exists a unique probability measure $P_\gamma$ on $(\mathbb{R}^m,\mathcal{B}(\mathbb{R}^m))$ such that $E_{P_\gamma}[\varphi]=E[\varphi(\bar W_{\gamma_1},\bar W_{\gamma_2},\cdots,\bar W_{\gamma_m})]$ for each $\varphi\in C_{l.Lip}(\mathbb{R}^m)$. Thus we get a family of finite dimensional distributions $\{P_\gamma:\gamma\in\mathcal{J}_\Gamma\}$. It is easy to check that $\{P_\gamma:\gamma\in\mathcal{J}_\Gamma\}$ is a consistent family. Then by Kolmogorov's consistency and extension theorem, there exists a probability measure $P$ on $(\bar\Omega,\mathcal{B}(\bar\Omega))$ such that $\{P_\gamma:\gamma\in\mathcal{J}_\Gamma\}$ is the family of finite dimensional distributions of $P$. We now prove the uniqueness. Assume that there exists another probability measure $\bar P$ satisfying the condition. By the Daniell-Stone theorem, $P$ and $\bar P$ have the same finite-dimensional distributions; hence by the monotone class theorem, $P=\bar P$. The proof is complete. $\square$

Lemma 40. There exists a family $\mathcal{P}_e$ of probability measures on $(\bar\Omega,\mathcal{B}(\bar\Omega))$ such that
\[ \bar{\mathbb{E}}[X]=\max_{P\in\mathcal{P}_e}E_P[X],\quad\text{for } X\in L_{ip}(\bar\Omega). \]
Proof. By the representation theorem for sublinear expectations and Lemma 39, it is easy to get the result. $\square$

For this $\mathcal{P}_e$, we define the associated capacity
\[ \tilde c(A):=\sup_{P\in\mathcal{P}_e}P(A),\quad A\in\mathcal{B}(\bar\Omega), \]
and the upper expectation, for each $\mathcal{B}(\bar\Omega)$-measurable real function $X$ for which the following definition is meaningful:

\[ \tilde{\mathbb{E}}[X]:=\sup_{P\in\mathcal{P}_e}E_P[X]. \]
By Theorem 30, there exists a continuous modification $\tilde W$ of $\bar W$ in the sense that $\tilde c(\{\tilde W_\gamma\neq\bar W_\gamma\})=0$, and
\[ \tilde{\mathbb{E}}[|\tilde W_{\gamma_1}-\tilde W_{\gamma_2}|^{2d+2}]=\tilde{\mathbb{E}}[|\bar W_{\gamma_1}-\bar W_{\gamma_2}|^{2d+2}]=\bar{\mathbb{E}}[|\bar W_{\gamma_1}-\bar W_{\gamma_2}|^{2d+2}]=L|\gamma_1-\gamma_2|^{d+1},\quad\forall\gamma_1,\gamma_2\in\Gamma, \]
where $L$ is a constant depending only on $\bar{\mathbb{E}}$.

For any $P\in\mathcal{P}_e$, let $P\circ\tilde W^{-1}$ denote the probability measure on $(\Omega,\mathcal{B}(\Omega))$ induced by $\tilde W$ with respect to $P$. We denote $\mathcal{P}_1=\{P\circ\tilde W^{-1}: P\in\mathcal{P}_e\}$. Applying the well-known criterion for tightness of Kolmogorov-Chentsov type expressed in terms of moments (see Appendix B in Peng [15]), we conclude that $\mathcal{P}_1$ is tight. We denote by $\mathcal{P}=\overline{\mathcal{P}_1}$ the closure of $\mathcal{P}_1$ under the topology of weak convergence; then $\mathcal{P}$ is weakly compact.

Now we give the representation of the sublinear expectation.

Theorem 41. Let $\{G_\gamma(\cdot),\gamma\in\mathcal{J}_\Gamma\}$ be a family of continuous, monotone and sublinear functions given by (4.3), and let $\hat{\mathbb{E}}$ be the corresponding sublinear expectation on $(\Omega, L_{ip}(\Omega))$. Then there exists a weakly compact family $\mathcal{P}$ of probability measures on $(\Omega,\mathcal{B}(\Omega))$ such that

\[ \hat{\mathbb{E}}[X]=\max_{P\in\mathcal{P}}E_P[X],\quad\text{for } X\in L_{ip}(\Omega). \]
Proof. By Lemma 40, we have

\[ \hat{\mathbb{E}}[X]=\max_{P\in\mathcal{P}_1}E_P[X],\quad\text{for } X\in L_{ip}(\Omega). \]
For any $X\in L_{ip}(\Omega)$, by Lemma 38, we get $\hat{\mathbb{E}}[|X-(X\wedge N)\vee(-N)|]\downarrow 0$ as $N\to\infty$. Noting that $\mathcal{P}=\overline{\mathcal{P}_1}$, by the definition of weak convergence we thus obtain the result. $\square$

References

[1] Artzner, Ph., Delbaen, F., Eber, J. M., and Heath, D., Coherent measures of risk. Mathematical Finance 9 (1999), no. 3, 203–228.
[2] Dalang, R. C., Extending the martingale measure stochastic integral with applications to spatially homogeneous s.p.d.e.'s. Electron. J. Probab. 4 (1999), no. 6, 29 pp.
[3] Da Prato, G., and Zabczyk, J., Stochastic Equations in Infinite Dimensions. Cambridge University Press, Cambridge, 1992.
[4] Delbaen, F., Coherent Risk Measures (Lectures given at the Cattedra Galileiana at the Scuola Normale di Pisa, March 2000). Published by the Scuola Normale di Pisa, 2002.
[5] Denis, L., Hu, M., and Peng, S., Function spaces and capacity related to a sublinear expectation: application to G-Brownian motion paths. Potential Analysis 34 (2011), no. 2, 139–161.
[6] Föllmer, H., Spatial risk measures and their local specification: the locally law-invariant case. Stat. Risk Model. 31 (2014), no. 1, 79–101.
[7] Föllmer, H., and Klüppelberg, C., Spatial risk measures: local specification and boundary risk. In: Stochastic Analysis and Applications 2014, 307–326, Springer Proc. Math. Stat., 100, Springer, Cham, 2014.
[8] Föllmer, H., and Schied, A., Convex measures of risk and trading constraints. Finance Stoch. 6 (2002), no. 4, 429–447.
[9] Huber, P. J., Robust Statistics. John Wiley & Sons, Inc., New York, 1981.
[10] Khoshnevisan, D., Analysis of Stochastic Partial Differential Equations. CBMS Regional Conference Series in Mathematics, 119. American Mathematical Society, Providence, RI, 2014.
[11] Khoshnevisan, D., Multiparameter Processes: An Introduction to Random Fields. Springer Monographs in Mathematics. Springer-Verlag, New York, 2002.
[12] Peng, S., G-expectation, G-Brownian motion and related stochastic calculus of Itô's type. In: Stochastic Analysis and Applications, 541–567, Abel Symposia 2, Springer, Berlin, 2007.
[13] Peng, S., Multi-dimensional G-Brownian motion and related stochastic calculus under G-expectation. Stochastic Processes and their Applications 118 (2008), no. 12, 2223–2253.
[14] Peng, S., A new central limit theorem under sublinear expectations. arXiv:0803.2656, 2008.
[15] Peng, S., Nonlinear expectations and stochastic calculus under uncertainty. arXiv:1002.4546v1, 2010.
[16] Peng, S., Tightness, weak compactness of nonlinear expectations and application to CLT. arXiv:1006.2541v1, 2010.
[17] Peng, S., G-Gaussian processes under sublinear expectations and q-Brownian motion in quantum mechanics. arXiv:1105.1055v1, 2011.
[18] Walley, P., Statistical Reasoning with Imprecise Probabilities. Chapman and Hall, London, 1991.
[19] Walsh, J. B., An introduction to stochastic partial differential equations. In: École d'été de Probabilités de Saint-Flour XIV – 1984, 265–439, Lecture Notes in Math., 1180, Springer, Berlin, 1986.

School of Mathematics, Shandong University, 250100, Jinan. This research is supported by NSF Grant No. L1624032.
E-mail address: Ji Xiaojun: [email protected]; Peng Shige: [email protected]