arXiv:1211.1090v2 [math.PR] 19 Jun 2013

Multi-dimensional central limit theorems and laws of large numbers under sublinear expectations

Ze-Chun Hu* and Ling Zhou

Nanjing University

* Corresponding author: Department of Mathematics, Nanjing University, Nanjing 210093, China. E-mail address: [email protected]

Abstract In this paper, we present some multi-dimensional central limit theorems and laws of large numbers under sublinear expectations, which extend some previous results.

Keywords: central limit theorem, laws of large numbers, sublinear expectation

MSC(2000): 60H10, 60H05

1 Introduction

The classic central limit theorems and strong (weak) laws of large numbers play an important role in the development of probability theory and its applications. Recently, motivated by the risk measures, superhedge pricing and modeling uncertainty in finance, Peng [1, 2, 3, 4, 5, 6] initiated the notion of independent and identically distributed (IID) random variables under sublinear expectations, and proved the central limit theorems and the weak law of large numbers among other things.

Now we recall the existing results about the central limit theorems and the laws of large numbers under sublinear expectations. In [7], Hu and Zhang obtained a central limit theorem for capacities induced by sublinear expectations. In [8], Li and Shi proved a central limit theorem without the requirement of identical distribution, which extends Peng's central limit theorems in [2, 4]. In [9], Hu proved that for any continuous function $\psi$ satisfying the growth condition $|\psi(x)| \le C(1+|x|^p)$ for some $C>0$, $p \ge 1$, the $\psi$ central limit theorem under sublinear expectations obtained by Peng [4] still holds. In [10], Zhang obtained a central limit theorem for the weighted sums under sublinear expectations, which extends the results obtained by Peng, Li and Shi. In [11], Chen proved a strong law of large numbers for IID random variables under capacities induced by sublinear expectations. In [12], Hu presented three laws of large numbers for independent random variables without the requirement of identical distribution. In [13], we obtained a strong law of large numbers for IID random variables under one-order type moment condition. The main results in [7, 8, 9, 10, 11, 12, 13] are about one-dimensional random variables in sublinear expectation spaces.
The goal of this paper is to present some multi-dimensional central limit theorems and laws of large numbers under sublinear expectations without the requirement of identical distribution, which extend Peng’s central limit theorems in [2, 4], Theorem 3.1 of [8] and Theorem 3.1 of [12]. The rest of this paper is organized as follows. In Section 2, we recall some basic notions under sublinear expectations. In Section 3, we present a multi-dimensional central limit theorem and several corollaries, and their proofs will be given in Section 4. In Section 5, we give some multi-dimensional laws of large numbers.

2 Basic settings

In this section, we present some basic settings about sublinear expectations. Please refer to Peng [1, 2, 3, 4, 5, 6].

Let $\Omega$ be a given set and let $\mathcal{H}$ be a linear space of real functions defined on $\Omega$ such that: for any constant number $c$, $c \in \mathcal{H}$; if $X \in \mathcal{H}$, then $|X| \in \mathcal{H}$; if $X_1, \ldots, X_n \in \mathcal{H}$, then for any $\varphi \in C_{l,Lip}(\mathbb{R}^n)$, $\varphi(X_1, \ldots, X_n) \in \mathcal{H}$, where $C_{l,Lip}(\mathbb{R}^n)$ denotes the linear space of functions $\varphi$ satisfying
\[
|\varphi(x) - \varphi(y)| \le C(1 + |x|^m + |y|^m)\,|x - y|, \quad \forall x, y \in \mathbb{R}^n,
\]
for some $C > 0$, $m \in \mathbb{N}$ depending on $\varphi$. Denote by $C_{b,Lip}(\mathbb{R}^n)$ the space of bounded Lipschitz functions defined on $\mathbb{R}^n$.

Definition 2.1 A sublinear expectation $\hat{E}$ on $\mathcal{H}$ is a functional $\hat{E}: \mathcal{H} \to \mathbb{R}$ satisfying the following properties:

(a) Monotonicity: $\hat{E}[X] \ge \hat{E}[Y]$, if $X \ge Y$.

(b) Constant preserving: $\hat{E}[c] = c$, $\forall c \in \mathbb{R}$.

(c) Sub-additivity: $\hat{E}[X + Y] \le \hat{E}[X] + \hat{E}[Y]$.

(d) Positive homogeneity: $\hat{E}[\lambda X] = \lambda \hat{E}[X]$, $\forall \lambda \ge 0$.

The triple $(\Omega, \mathcal{H}, \hat{E})$ is called a sublinear expectation space. If only (c) and (d) are satisfied, $\hat{E}$ is called a sublinear functional.

Theorem 2.2 Let $E$ be a sublinear functional defined on a linear space $\mathcal{H}$. Then there exists a family of linear functionals $\{E_\theta : \theta \in \Theta\}$ defined on $\mathcal{H}$ such that
\[
E[X] = \sup_{\theta \in \Theta} E_\theta[X], \quad \forall X \in \mathcal{H},
\]
and, for each $X \in \mathcal{H}$, there exists $\theta_X \in \Theta$ such that $E[X] = E_{\theta_X}[X]$. Furthermore, if $E$ is a sublinear expectation, then the corresponding $E_\theta$ is a linear expectation.
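As a quick numerical illustration of Theorem 2.2 (not part of the paper's proofs; the finite sample space, the probability weights and the test vectors below are arbitrary choices), a supremum of finitely many linear expectations automatically satisfies properties (a)-(d) of Definition 2.1:

```python
# Finite sample space Omega = {0, 1, 2}; a random variable X is the vector
# (X(0), X(1), X(2)).  Theta indexes finitely many probability measures;
# the weights below are arbitrary illustrative choices (each row sums to 1).
measures = [(0.2, 0.3, 0.5), (0.5, 0.25, 0.25), (1/3, 1/3, 1/3)]

def E_hat(X):
    """Sublinear expectation: sup over the linear expectations E_theta[X]."""
    return max(sum(p * x for p, x in zip(theta, X)) for theta in measures)

X = (1.0, -2.0, 4.0)
Y = (0.5, 3.0, -1.0)

# (a) Monotonicity: max(X, Y) >= Y pointwise, so E_hat is larger on it.
mono_ok = E_hat(tuple(max(a, b) for a, b in zip(X, Y))) >= E_hat(Y) - 1e-12
# (b) Constant preserving.
constant_ok = abs(E_hat((7.0, 7.0, 7.0)) - 7.0) < 1e-12
# (c) Sub-additivity.
sub_additive = E_hat(tuple(a + b for a, b in zip(X, Y))) \
    <= E_hat(X) + E_hat(Y) + 1e-12
# (d) Positive homogeneity.
homogeneous = abs(E_hat(tuple(2.5 * a for a in X)) - 2.5 * E_hat(X)) < 1e-12
```

A linear expectation corresponds to the special case of a single measure in the family.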

Definition 2.3 Let $X_1$ and $X_2$ be two $n$-dimensional random vectors defined on sublinear expectation spaces $(\Omega_1, \mathcal{H}_1, E_1)$ and $(\Omega_2, \mathcal{H}_2, E_2)$, respectively. They are called identically distributed, denoted by $X_1 \sim X_2$, if
\[
E_1[\varphi(X_1)] = E_2[\varphi(X_2)], \quad \forall \varphi \in C_{b,Lip}(\mathbb{R}^n).
\]

Definition 2.4 In a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$, a random vector $Y \in \mathcal{H}^n$ is said to be independent from another random vector $X \in \mathcal{H}^m$ under $\hat{E}[\cdot]$ if for each test function $\varphi \in C_{l,Lip}(\mathbb{R}^{n+m})$ we have
\[
\hat{E}[\varphi(X, Y)] = \hat{E}\big[\hat{E}[\varphi(x, Y)]|_{x=X}\big].
\]

Proposition 2.5 Let $(\Omega, \mathcal{H}, \hat{E})$ be a sublinear expectation space and $X, Y$ be two random variables such that $\hat{E}[Y] = -\hat{E}[-Y]$, i.e., $Y$ has no mean-uncertainty. Then we have
\[
\hat{E}[X + \alpha Y] = \hat{E}[X] + \alpha \hat{E}[Y], \quad \forall \alpha \in \mathbb{R}.
\]
In particular, if $\hat{E}[Y] = \hat{E}[-Y] = 0$, then $\hat{E}[X + \alpha Y] = \hat{E}[X]$.

Proposition 2.6 For each $X, Y \in \mathcal{H}$, we have
\[
\hat{E}[|X + Y|^r] \le 2^{r-1}\big(\hat{E}[|X|^r] + \hat{E}[|Y|^r]\big),
\]
\[
\hat{E}[|XY|] \le \big(\hat{E}[|X|^p]\big)^{1/p} \cdot \big(\hat{E}[|Y|^q]\big)^{1/q},
\]
\[
\big(\hat{E}[|X + Y|^p]\big)^{1/p} \le \big(\hat{E}[|X|^p]\big)^{1/p} + \big(\hat{E}[|Y|^p]\big)^{1/p},
\]
where $r \ge 1$ and $1 < p, q < \infty$ with $\frac{1}{p} + \frac{1}{q} = 1$.
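The three inequalities of Proposition 2.6 can be spot-checked numerically on a toy sublinear expectation (a supremum of finitely many linear expectations on a finite sample space, as in Theorem 2.2); the measures, random variables and exponents below are illustrative choices with $1/p + 1/q = 1$:

```python
# Spot-check of Proposition 2.6 on a sup-of-two-measures sublinear
# expectation over a 3-point sample space (all numbers are illustrative).
measures = [(0.1, 0.6, 0.3), (0.4, 0.4, 0.2)]

def E_hat(Z):
    return max(sum(p * z for p, z in zip(theta, Z)) for theta in measures)

X = (1.5, -2.0, 0.5)
Y = (-1.0, 3.0, 2.0)
r, p, q = 2.0, 3.0, 1.5               # 1/p + 1/q = 1/3 + 2/3 = 1

XpY = tuple(a + b for a, b in zip(X, Y))
pw = lambda Z, s: tuple(abs(z) ** s for z in Z)   # pointwise |.|^s

# c_r inequality.
cr_ok = E_hat(pw(XpY, r)) <= 2 ** (r - 1) * (E_hat(pw(X, r))
                                             + E_hat(pw(Y, r))) + 1e-12
# Hoelder inequality.
holder_ok = E_hat(tuple(abs(a * b) for a, b in zip(X, Y))) <= \
    E_hat(pw(X, p)) ** (1 / p) * E_hat(pw(Y, q)) ** (1 / q) + 1e-12
# Minkowski inequality.
minkowski_ok = E_hat(pw(XpY, p)) ** (1 / p) <= \
    E_hat(pw(X, p)) ** (1 / p) + E_hat(pw(Y, p)) ** (1 / p) + 1e-12
```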

We denote by $S(d)$ the collection of all $d \times d$ symmetric matrices. Let $X$ be $G$-normal distributed on $(\Omega, \mathcal{H}, \hat{E})$. The following function is very important to characterize its distribution:
\[
G(A) = \hat{E}\Big[\frac{1}{2}\langle AX, X\rangle\Big], \quad A \in S(d).
\]
It is easy to check that $G$ is a sublinear function monotonic in $A \in S(d)$. By Theorem 2.2, there exists a bounded, convex and closed subset $\Theta \subset S(d)$ such that
\[
G(A) = \sup_{Q \in \Theta} \frac{1}{2}\mathrm{tr}[AQ], \quad A \in S(d).
\]
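The representation $G(A) = \sup_{Q \in \Theta}\frac{1}{2}\mathrm{tr}[AQ]$ can be sketched for $d = 2$ with a finite illustrative family $\Theta$ of positive semidefinite matrices, and its sublinearity and monotonicity in $A$ verified numerically (a toy check, not the general construction):

```python
# G(A) = sup_{Q in Theta} (1/2) tr[A Q] for 2x2 symmetric matrices stored as
# ((a11, a12), (a21, a22)).  Theta is a small illustrative family of
# positive semidefinite matrices.
Theta = [((1.0, 0.0), (0.0, 1.0)),
         ((2.0, 0.5), (0.5, 1.5)),
         ((0.5, -0.2), (-0.2, 2.0))]

def tr_prod(A, Q):
    """tr[A Q] for 2x2 matrices."""
    return sum(A[i][k] * Q[k][i] for i in range(2) for k in range(2))

def G(A):
    return 0.5 * max(tr_prod(A, Q) for Q in Theta)

def scale(c, A):
    return tuple(tuple(c * A[i][j] for j in range(2)) for i in range(2))

A = ((1.0, 0.3), (0.3, 2.0))
B = ((0.5, 0.0), (0.0, 0.5))
ApB = tuple(tuple(A[i][j] + B[i][j] for j in range(2)) for i in range(2))

sublinear_ok = G(ApB) <= G(A) + G(B) + 1e-12
homogeneous_ok = abs(G(scale(3.0, A)) - 3.0 * G(A)) < 1e-12
# A - B = ((0.5, 0.3), (0.3, 1.5)) is positive semidefinite, so A >= B and
# monotonicity of G (since each Q >= 0) gives G(A) >= G(B).
monotone_ok = G(A) >= G(B) - 1e-12
```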

Proposition 2.9 Let $X$ be $G$-normal distributed. For each $\varphi \in C_{b,Lip}(\mathbb{R}^d)$, define a function
\[
u(t, x) := \hat{E}[\varphi(x + \sqrt{t}X)], \quad \forall (t, x) \in [0, \infty) \times \mathbb{R}^d.
\]
Then $u$ is the unique viscosity solution of the following parabolic PDE:
\[
\partial_t u - G(D^2 u) = 0, \quad u|_{t=0} = \varphi, \quad (2.1)
\]
where $G(A) = \hat{E}[\frac{1}{2}\langle AX, X\rangle]$, $A \in S(d)$.

The parabolic PDE (2.1) is called a $G$-heat equation. Since $G(A)$ is monotonic: $G(A_1) \ge G(A_2)$ for $A_1 \ge A_2$, it follows that
\[
\Theta \subset S_+(d) = \{\theta \in S(d) : \theta \ge 0\} = \{BB^T : B \in \mathbb{R}^{d \times d}\}.
\]
If $\Theta = \{Q\}$, then $X$ is classical zero-mean normal distributed with covariance $Q$. In general, $\Theta$ characterizes the covariance uncertainty of $X$. We denote $X \sim N(0; \Theta)$.

When $d = 1$, we have $X \sim N(0; [\underline{\sigma}^2, \overline{\sigma}^2])$, where $\overline{\sigma}^2 = \hat{E}[X^2]$, $\underline{\sigma}^2 = -\hat{E}[-X^2]$. The corresponding $G$-heat equation is

\[
\partial_t u - \frac{1}{2}\big(\overline{\sigma}^2 (\partial^2_{xx} u)^+ - \underline{\sigma}^2 (\partial^2_{xx} u)^-\big) = 0, \quad u|_{t=0} = \varphi. \quad (2.2)
\]
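Equation (2.2) can be solved by a simple explicit finite-difference scheme. The sketch below is only illustrative (grid parameters, the horizon $T = 1$ and the test functions are arbitrary choices; boundary values are frozen at $\varphi$, which is harmless at the centre over this horizon). It recovers the known facts $\hat{E}[X^2] = \overline{\sigma}^2$ and $-\hat{E}[-X^2] = \underline{\sigma}^2$ for $X \sim N(0; [\underline{\sigma}^2, \overline{\sigma}^2])$:

```python
# Explicit finite differences for the 1-d G-heat equation (2.2):
# du/dt = (1/2)(sbar2 * (u_xx)^+  -  slow2 * (u_xx)^-),  u(0, .) = phi.
sbar2, slow2 = 1.0, 0.25               # upper / lower variance bounds
L, dx, T = 5.0, 0.1, 1.0               # illustrative domain [-L, L] and grid
n = round(2 * L / dx) + 1
xs = [-L + i * dx for i in range(n)]
dt = 0.4 * dx * dx / sbar2             # satisfies the explicit stability bound
steps = round(T / dt)

def solve(phi):
    """Return u(T, 0), i.e. the sublinear expectation E_hat[phi(X_T)]."""
    u = [phi(x) for x in xs]
    for _ in range(steps):
        new = u[:]                     # boundary nodes keep phi(+-L)
        for i in range(1, n - 1):
            uxx = (u[i - 1] - 2 * u[i] + u[i + 1]) / (dx * dx)
            new[i] = u[i] + dt * 0.5 * (sbar2 * max(uxx, 0.0)
                                        - slow2 * max(-uxx, 0.0))
        u = new
    return u[n // 2]

e_sq = solve(lambda x: x * x)          # convex phi: the sup-variance acts
e_neg_sq = solve(lambda x: -x * x)     # concave phi: the inf-variance acts
```

Here `e_sq` approximates $\hat{E}[X^2] = \overline{\sigma}^2 T$ and `-e_neg_sq` approximates $-\hat{E}[-X^2] = \underline{\sigma}^2 T$, exhibiting the variance uncertainty.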

3 Multi-dimensional central limit theorems under sublinear expectations

In this section, we present a multi-dimensional central limit theorem under sublinear expectations and several corollaries. Their proofs will be given in the next section.

Theorem 3.1 Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies the following conditions:

(i) each $X_{i+1}$ is independent from $(X_1, \ldots, X_i)$, $\forall i = 1, 2, \ldots$;

(ii) $\hat{E}[X_i] = \hat{E}[-X_i] = 0$;

(iii) there is a constant $M > 0$ such that $\hat{E}[|X_i|^3] \le M$, $\forall i = 1, 2, \ldots$;

(iv) there exist a sequence $\{a_n\}$ of positive numbers and a sublinear function $G: S(d) \to \mathbb{R}$ such that $\lim_{n \to \infty} \frac{a_1 + \cdots + a_n}{n} = 0$ and
\[
|G_n(A) - G(A)| \le a_n \|A\|, \quad \forall n, \ A \in S(d),
\]
where $G_i(A) = \frac{1}{2}\hat{E}[\langle AX_i, X_i\rangle]$, $\|A\| = \sqrt{\sum_{i,j=1}^{d} a_{ij}^2}$, $\forall A = (a_{ij}) \in S(d)$.

Then the sequence $\{\frac{S_n}{\sqrt{n}}\}_{n=1}^{\infty}$ converges in law to the $G$-normal distribution $N(0; \Theta)$, i.e.,
\[
\lim_{n \to \infty} \hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] = \hat{E}[\varphi(X)], \quad \forall \varphi \in C_{b,Lip}(\mathbb{R}^d), \quad (3.1)
\]
where $X \sim N(0; \Theta)$, $S_n = X_1 + \cdots + X_n$.
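Note that condition (iv) only constrains the Cesàro averages of $\{a_n\}$, so the bounds $a_n$ on $|G_n(A) - G(A)|$ may stay large along a sparse subsequence. A small numerical illustration (the sequence is an arbitrary choice): $a_n = 1$ at perfect squares and $a_n = 1/n$ otherwise does not converge to $0$, yet $(a_1 + \cdots + a_n)/n \to 0$:

```python
import math

# A spiky admissible sequence for condition (iv): a_n does not tend to 0,
# but its Cesaro averages do (illustrative choice).
def a(n):
    r = math.isqrt(n)
    return 1.0 if r * r == n else 1.0 / n

def cesaro(n):
    return sum(a(k) for k in range(1, n + 1)) / n

avg_1e3 = cesaro(10 ** 3)
avg_1e5 = cesaro(10 ** 5)
```

Roughly, the average behaves like $(\sqrt{n} + \log n)/n$, so it vanishes even though $a_n = 1$ infinitely often.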

Corollary 3.2 Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies conditions (i)(ii)(iii) in Theorem 3.1 and the following condition (v).

(v) There exists a sublinear function $G: S(d) \to \mathbb{R}$ such that $\{G_i\}$ converges to $G$ pointwise, i.e.,
\[
\lim_{i \to \infty} G_i(A) = G(A), \quad \forall A \in S(d),
\]
where $G_i(A) = \frac{1}{2}\hat{E}[\langle AX_i, X_i\rangle]$, $\forall A \in S(d)$.

Then the result of Theorem 3.1 holds.

In order to state another corollary of Theorem 3.1, we need the following definition.

Definition 3.3 Let $\Delta_1, \Delta_2 \subset S_+(d)$ be two closed subsets. Define their Hausdorff distance as follows:
\[
d_H(\Delta_1, \Delta_2) := \inf\{\varepsilon \mid \Delta_1 \subset B_\varepsilon(\Delta_2), \ \Delta_2 \subset B_\varepsilon(\Delta_1)\},
\]
where $B_\varepsilon(\Delta) := \{x \in S_+(d) \mid d(x, \Delta) < \varepsilon\}$, $d(x, \Delta) := \inf\{d(x, y) \mid y \in \Delta\}$, and
\[
d(x, y) := \|x - y\| = \sqrt{\sum_{i,j=1}^{d} |x_{ij} - y_{ij}|^2} \quad \text{for } x = (x_{ij}),\ y = (y_{ij}).
\]
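For finite (hence compact) sets, the Hausdorff distance of Definition 3.3 reduces to the usual sup-inf formula $\max\{\sup_x \inf_y d(x,y),\ \sup_y \inf_x d(x,y)\}$. A small sketch for $2 \times 2$ symmetric matrices, stored as triples $(x_{11}, x_{12}, x_{22})$ (the example sets are arbitrary illustrative choices):

```python
import math

# Frobenius-type distance of Definition 3.3 for 2x2 symmetric matrices
# (x11, x12, x22); the off-diagonal entry is counted twice in the sum.
def d(x, y):
    return math.sqrt((x[0] - y[0]) ** 2 + 2 * (x[1] - y[1]) ** 2
                     + (x[2] - y[2]) ** 2)

def d_H(S1, S2):
    """Hausdorff distance between two finite sets via the sup-inf formula."""
    e1 = max(min(d(x, y) for y in S2) for x in S1)
    e2 = max(min(d(x, y) for y in S1) for x in S2)
    return max(e1, e2)

S1 = [(1.0, 0.0, 1.0), (2.0, 0.0, 1.0)]   # illustrative sets
S2 = [(1.0, 0.0, 1.0)]
dist = d_H(S1, S2)                        # realized by the pair (2,0,1),(1,0,1)
sym_ok = d_H(S1, S2) == d_H(S2, S1)
self_zero = d_H(S1, S1) == 0.0
```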

Set $\Xi := \{\Delta \subset S_+(d) \mid \Delta \text{ is bounded and closed}\}$. Then by [14], $(\Xi, d_H)$ is a complete metric space.

Corollary 3.4 Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies conditions (i)(ii)(iii) in Theorem 3.1 and the following condition (vi).

(vi) There exists a bounded, convex and closed subset $\Theta$ of $S_+(d)$ such that
\[
\lim_{n \to \infty} \frac{1}{n}\sum_{i=1}^{n} d_H(\Theta, \Theta_i) = 0,
\]
where $\Theta_i$ is a bounded, convex and closed subset of $S_+(d)$ such that
\[
G_i(A) := \frac{1}{2}\hat{E}[\langle AX_i, X_i\rangle] = \frac{1}{2}\sup_{Q \in \Theta_i} \mathrm{tr}[AQ].
\]

Define $G(A) = \frac{1}{2}\sup_{Q \in \Theta} \mathrm{tr}[AQ]$, $\forall A \in S(d)$. Then the result of Theorem 3.1 holds.

Corollary 3.5 Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies conditions (i)(ii)(iii) in Theorem 3.1 and the following condition (vii).

(vii) Let $\Theta_i$ be a bounded, convex and closed subset of $S_+(d)$ such that
\[
G_i(A) := \frac{1}{2}\hat{E}[\langle AX_i, X_i\rangle] = \frac{1}{2}\sup_{Q \in \Theta_i} \mathrm{tr}[AQ],
\]
and $\{\Theta_i\}$ is a Cauchy sequence in $(\Xi, d_H)$, i.e., $d_H(\Theta_i, \Theta_j) \to 0$ as $i, j \to \infty$.

Then there exists a bounded, convex and closed subset $\Theta \subset S_+(d)$ such that the result of Theorem 3.1 holds, where the function $G: S(d) \to \mathbb{R}$ is defined by $G(A) = \frac{1}{2}\sup_{Q \in \Theta} \mathrm{tr}[AQ]$.

By Corollary 3.4, we can obtain the following result.

Theorem 3.6 ([8]) Let $\{X_i\}_{i=1}^{\infty}$ be a sequence of one-dimensional random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ satisfying the following conditions:

(i) each $X_{i+1}$ is independent from $(X_1, \ldots, X_i)$, $\forall i = 1, 2, \ldots$;

(ii) $\hat{E}[X_i] = \hat{E}[-X_i] = 0$, $\hat{E}[X_i^2] = \overline{\sigma}_i^2$, $-\hat{E}[-X_i^2] = \underline{\sigma}_i^2$, where $0 \le \underline{\sigma}_i^2 \le \overline{\sigma}_i^2 < \infty$;

(iii) there are two positive constants $\underline{\sigma}$ and $\overline{\sigma}$ such that
\[
\lim_{n \to \infty} \frac{1}{n}\sum_{i=1}^{n} |\overline{\sigma}_i^2 - \overline{\sigma}^2| = 0, \quad \lim_{n \to \infty} \frac{1}{n}\sum_{i=1}^{n} |\underline{\sigma}_i^2 - \underline{\sigma}^2| = 0;
\]

(iv) there is a constant $M > 0$ such that
\[
\hat{E}[|X_i|^3] \le M, \quad \forall i = 1, 2, \ldots
\]

Then the sequence $\{\frac{S_n}{\sqrt{n}}\}_{n=1}^{\infty}$ of the sums $S_n = X_1 + \cdots + X_n$ converges in law to $N(0; [\underline{\sigma}^2, \overline{\sigma}^2])$:
\[
\lim_{n \to \infty} \hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] = \hat{E}[\varphi(X)], \quad \forall \varphi \in C_{b,Lip}(\mathbb{R}),
\]
where $X \sim N(0; [\underline{\sigma}^2, \overline{\sigma}^2])$.

Proof. For $d = 1$, by Definition 3.3, we have
\[
d_H([a, b], [c, d]) = \max\{|a - c|, |b - d|\} \le |a - c| + |b - d|.
\]
Set $\Theta_i = [\underline{\sigma}_i^2, \overline{\sigma}_i^2]$, $\Theta = [\underline{\sigma}^2, \overline{\sigma}^2]$. Then
\[
d_H(\Theta_i, \Theta) = \max\{|\underline{\sigma}_i^2 - \underline{\sigma}^2|, |\overline{\sigma}_i^2 - \overline{\sigma}^2|\} \le |\underline{\sigma}_i^2 - \underline{\sigma}^2| + |\overline{\sigma}_i^2 - \overline{\sigma}^2|,
\]
and thus by condition (iii), we have
\[
\lim_{n \to \infty} \frac{1}{n}\sum_{i=1}^{n} d_H(\Theta_i, \Theta) \le \lim_{n \to \infty} \frac{1}{n}\sum_{i=1}^{n} |\underline{\sigma}_i^2 - \underline{\sigma}^2| + \lim_{n \to \infty} \frac{1}{n}\sum_{i=1}^{n} |\overline{\sigma}_i^2 - \overline{\sigma}^2| = 0.
\]
Then the result follows from Corollary 3.4.
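The interval formula $d_H([a,b],[c,d]) = \max\{|a-c|, |b-d|\}$ used in the proof above can be checked against a brute-force sup-inf computation on fine grids (the interval endpoints are arbitrary illustrative choices, standing in for $[\underline{\sigma}_i^2, \overline{\sigma}_i^2]$ and $[\underline{\sigma}^2, \overline{\sigma}^2]$):

```python
# Brute-force Hausdorff distance between two intervals on fine grids,
# compared with the closed formula max{|a-c|, |b-d|}.
def grid(a, b, m=401):
    return [a + (b - a) * i / (m - 1) for i in range(m)]

def d_H_grid(I, J):
    g1, g2 = grid(*I), grid(*J)
    e1 = max(min(abs(x - y) for y in g2) for x in g1)
    e2 = max(min(abs(x - y) for y in g1) for x in g2)
    return max(e1, e2)

I, J = (0.25, 1.0), (0.5, 2.0)      # illustrative variance intervals
formula = max(abs(I[0] - J[0]), abs(I[1] - J[1]))
brute = d_H_grid(I, J)
err = abs(formula - brute)           # grid-resolution error only
```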

4 Proofs

In this section, we will give the proofs of Theorem 3.1 and its corollaries.

4.1 Proof of Theorem 3.1

First, we prove a lemma.

Lemma 4.1 We make the same assumptions as in Theorem 3.1. We further assume that there exists a constant $\beta > 0$ such that, for each $A, B \in S(d)$ with $A \ge B$, we have
\[
\hat{E}[\langle AX, X\rangle] - \hat{E}[\langle BX, X\rangle] \ge \beta\,\mathrm{tr}[A - B]. \quad (4.1)
\]
Then the main result (3.1) holds.

Proof. For a small but fixed $h > 0$, let $V(t, x)$ be the unique viscosity solution of
\[
\partial_t V + G(D^2 V) = 0, \quad (t, x) \in [0, 1 + h] \times \mathbb{R}^d, \quad V|_{t=1+h} = \varphi. \quad (4.2)
\]
By Proposition 2.9, we have
\[
V(t, x) = \hat{E}[\varphi(x + \sqrt{1 + h - t}\,X)]. \quad (4.3)
\]
In particular,
\[
V(h, 0) = \hat{E}[\varphi(X)], \quad V(1 + h, x) = \varphi(x).
\]
Since (4.2) is a uniformly parabolic PDE and $G$ is a convex function, by the interior regularity of $V$ (see [15]), we have
\[
\|V\|_{C^{1+\alpha/2,\,2+\alpha}([0,1] \times \mathbb{R}^d)} < \infty, \quad \text{for some } \alpha \in (0, 1). \quad (4.4)
\]

Set $\delta = \frac{1}{n}$, $S_0 = 0$. Then
\[
V(1, \sqrt{\delta}S_n) - V(0, 0) = \sum_{i=0}^{n-1}\big\{V((i+1)\delta, \sqrt{\delta}S_{i+1}) - V(i\delta, \sqrt{\delta}S_i)\big\}
\]
\[
= \sum_{i=0}^{n-1}\big\{[V((i+1)\delta, \sqrt{\delta}S_{i+1}) - V(i\delta, \sqrt{\delta}S_{i+1})] + [V(i\delta, \sqrt{\delta}S_{i+1}) - V(i\delta, \sqrt{\delta}S_i)]\big\} = \sum_{i=0}^{n-1}\{I^i_\delta + J^i_\delta\},
\]
with, by Taylor's expansion,
\[
J^i_\delta = \partial_t V(i\delta, \sqrt{\delta}S_i)\delta + \frac{1}{2}\langle D^2 V(i\delta, \sqrt{\delta}S_i)X_{i+1}, X_{i+1}\rangle\delta + \langle DV(i\delta, \sqrt{\delta}S_i), X_{i+1}\rangle\sqrt{\delta},
\]
\[
I^i_\delta = \int_0^1 \delta\,[\partial_t V((i+\beta)\delta, \sqrt{\delta}S_{i+1}) - \partial_t V(i\delta, \sqrt{\delta}S_{i+1})]\,d\beta + [\partial_t V(i\delta, \sqrt{\delta}S_{i+1}) - \partial_t V(i\delta, \sqrt{\delta}S_i)]\delta
\]
\[
\quad + \int_0^1\!\!\int_0^1 \delta\,\big\langle \big(D^2 V(i\delta, \sqrt{\delta}S_i + \gamma\beta X_{i+1}\sqrt{\delta}) - D^2 V(i\delta, \sqrt{\delta}S_i)\big)X_{i+1}, X_{i+1}\big\rangle\,\gamma\,d\beta\,d\gamma.
\]
Thus
\[
\hat{E}\Big[\sum_{i=0}^{n-1} J^i_\delta\Big] - \hat{E}\Big[-\sum_{i=0}^{n-1} I^i_\delta\Big] \le \hat{E}[V(1, \sqrt{\delta}S_n)] - V(0, 0) \le \hat{E}\Big[\sum_{i=0}^{n-1} J^i_\delta\Big] + \hat{E}\Big[\sum_{i=0}^{n-1} I^i_\delta\Big]. \quad (4.5)
\]

We now prove that $\lim_{n \to \infty} \hat{E}[\sum_{i=0}^{n-1} J^i_\delta] = 0$. Firstly, by conditions (i) and (ii), we have
\[
\hat{E}[\langle DV(i\delta, \sqrt{\delta}S_i), X_{i+1}\rangle\sqrt{\delta}] = \hat{E}[\langle -DV(i\delta, \sqrt{\delta}S_i), X_{i+1}\rangle\sqrt{\delta}] = 0.
\]
Then by (4.2) and the sublinear property of $\hat{E}$, we have

\[
\hat{E}\Big[\sum_{i=0}^{n-1} J^i_\delta\Big] = \hat{E}\Big[\sum_{i=0}^{n-1}\Big(\partial_t V(i\delta, \sqrt{\delta}S_i)\delta + \frac{1}{2}\langle D^2 V(i\delta, \sqrt{\delta}S_i)X_{i+1}, X_{i+1}\rangle\delta\Big)\Big]
\]
\[
= \hat{E}\Big[\sum_{i=0}^{n-1}\big(\partial_t V(i\delta, \sqrt{\delta}S_i) + G(D^2 V(i\delta, \sqrt{\delta}S_i))\big)\delta + \sum_{i=0}^{n-1}\Big(\frac{1}{2}\langle D^2 V(i\delta, \sqrt{\delta}S_i)X_{i+1}, X_{i+1}\rangle - G(D^2 V(i\delta, \sqrt{\delta}S_i))\Big)\delta\Big]
\]
\[
= \hat{E}\Big[\sum_{i=0}^{n-1}\Big(\frac{1}{2}\langle D^2 V(i\delta, \sqrt{\delta}S_i)X_{i+1}, X_{i+1}\rangle - G(D^2 V(i\delta, \sqrt{\delta}S_i))\Big)\delta\Big]
\]
\[
\le \delta\,\hat{E}\Big[\sum_{i=0}^{n-1}\Big(\frac{1}{2}\langle D^2 V(i\delta, \sqrt{\delta}S_i)X_{i+1}, X_{i+1}\rangle - G_{i+1}(D^2 V(i\delta, \sqrt{\delta}S_i))\Big)\Big] + \delta\,\hat{E}\Big[\sum_{i=0}^{n-1}\big(G_{i+1}(D^2 V(i\delta, \sqrt{\delta}S_i)) - G(D^2 V(i\delta, \sqrt{\delta}S_i))\big)\Big] =: L_1 + L_2.
\]
Note that $\partial^2_{x_1 x_2} V = \partial^2_{x_2 x_1} V$, so $D^2 V \in S(d)$, and thus $G(D^2 V)$ is meaningful. Now we discuss $L_1$ and $L_2$ respectively.

For $L_1$, by the definition of the function $G_{i+1}$, we have
\[
L_1 \le \frac{1}{n}\sum_{i=0}^{n-1} \hat{E}\Big[\frac{1}{2}\langle D^2 V(i\delta, \sqrt{\delta}S_i)X_{i+1}, X_{i+1}\rangle - G_{i+1}(D^2 V(i\delta, \sqrt{\delta}S_i))\Big] = 0.
\]
For $L_2$, by condition (iv), we have

\[
L_2 \le \frac{1}{n}\hat{E}\Big[\sum_{i=0}^{n-1} a_{i+1}\|D^2 V(i\delta, \sqrt{\delta}S_i)\|\Big] \le \frac{1}{n}\sum_{i=0}^{n-1} a_{i+1}\hat{E}\big[\|D^2 V(i\delta, \sqrt{\delta}S_i)\|\big].
\]
We claim that there exists a positive constant $C_1$ such that $\hat{E}[\|D^2 V(i\delta, \sqrt{\delta}S_i)\|] \le C_1$, $\forall i = 1, 2, \ldots$ By (4.4), we have
\[
\|D^2 V(i\delta, \sqrt{\delta}S_i) - D^2 V(0, 0)\| \le C\big(|\sqrt{\delta}S_i|^{\alpha} + |i\delta|^{\alpha/2}\big),
\]
where $C$ is a positive constant. By H\"older's inequality and conditions (i)-(iii), we have
\[
\hat{E}[|\sqrt{\delta}S_i|^{\alpha}] \le \big(\hat{E}[|\sqrt{\delta}S_i|^2]\big)^{\alpha/2} = \Big(\frac{1}{n}\sum_{k=1}^{i}\hat{E}[|X_k|^2]\Big)^{\alpha/2} \le \Big(\frac{1}{n}\sum_{k=1}^{i}\big(\hat{E}[|X_k|^3]\big)^{2/3}\Big)^{\alpha/2} \le M^{\frac{2}{3}\cdot\frac{\alpha}{2}} = M^{\alpha/3}.
\]
Let $C_1 := \|D^2 V(0, 0)\| + CM^{\alpha/3} + C$. Then for any $i = 1, 2, \ldots$,
\[
\hat{E}[\|D^2 V(i\delta, \sqrt{\delta}S_i)\|] \le \|D^2 V(0, 0)\| + C\hat{E}[|\sqrt{\delta}S_i|^{\alpha}] + C|i\delta|^{\alpha/2} \le C_1.
\]
Hence $L_2 \le \frac{C_1}{n}\sum_{i=1}^{n} a_i$, which together with condition (iv) implies that $\limsup_{n \to \infty} L_2 \le 0$. Then
\[
\limsup_{n \to \infty} \hat{E}\Big[\sum_{i=0}^{n-1} J^i_\delta\Big] \le 0. \quad (4.6)
\]
Similarly, we have
\[
\limsup_{n \to \infty} \hat{E}\Big[-\sum_{i=0}^{n-1} J^i_\delta\Big] \le 0. \quad (4.7)
\]
As $\hat{E}[\sum_{i=0}^{n-1} J^i_\delta] \ge -\hat{E}[-\sum_{i=0}^{n-1} J^i_\delta]$, by (4.7), we have $\liminf_{n \to \infty} \hat{E}[\sum_{i=0}^{n-1} J^i_\delta] \ge 0$, which together with (4.6) implies that $\lim_{n \to \infty} \hat{E}[\sum_{i=0}^{n-1} J^i_\delta] = 0$.

Now we come to analyze the two terms involving $I^i_\delta$ in (4.5). Since $\partial_t V$ and $D^2 V$ are uniformly $\alpha$-H\"older continuous in $x$ and $\frac{\alpha}{2}$-H\"older continuous in $t$ on $[0, 1] \times \mathbb{R}^d$, we have
\[
|I^i_\delta| \le C'\delta^{1+\alpha/2}\big(1 + |X_{i+1}|^{\alpha} + |X_{i+1}|^{2+\alpha}\big),
\]
where $C'$ is a positive constant. By Young's inequality, we have

\[
|X_{i+1}|^{\alpha} = |X_{i+1}|^{\alpha}\cdot 1 \le \frac{\alpha}{2+\alpha}|X_{i+1}|^{2+\alpha} + \frac{2}{2+\alpha},
\]
and thus there exists a positive constant $C_2$ such that
\[
|I^i_\delta| \le C_2\delta^{1+\alpha/2}\big(1 + |X_{i+1}|^{2+\alpha}\big).
\]
Hence

\[
\hat{E}[|I^i_\delta|] \le C_2\delta^{1+\alpha/2}\big(1 + \hat{E}[|X_{i+1}|^{2+\alpha}]\big) \le C_2\delta^{1+\alpha/2}\Big(1 + \big(\hat{E}[|X_{i+1}|^3]\big)^{\frac{2+\alpha}{3}}\Big) \le C_2\delta^{1+\alpha/2}\big(1 + M^{\frac{2+\alpha}{3}}\big).
\]
It follows that
\[
\hat{E}\Big[\sum_{i=0}^{n-1} I^i_\delta\Big] \le \sum_{i=0}^{n-1}\hat{E}[I^i_\delta] \le \sum_{i=0}^{n-1}\hat{E}[|I^i_\delta|] \le C_2\big(1 + M^{\frac{2+\alpha}{3}}\big)\Big(\frac{1}{n}\Big)^{\alpha/2},
\]
and thus
\[
\limsup_{n \to \infty} \hat{E}\Big[\sum_{i=0}^{n-1} I^i_\delta\Big] \le 0. \quad (4.8)
\]
On the other side,

\[
\hat{E}\Big[\sum_{i=0}^{n-1} I^i_\delta\Big] \ge -\hat{E}\Big[-\sum_{i=0}^{n-1} I^i_\delta\Big] \ge -\sum_{i=0}^{n-1}\hat{E}[|I^i_\delta|] \ge -C_2\big(1 + M^{\frac{2+\alpha}{3}}\big)\Big(\frac{1}{n}\Big)^{\alpha/2},
\]
so we have $\liminf_{n \to \infty} \hat{E}[\sum_{i=0}^{n-1} I^i_\delta] \ge 0$, which together with (4.8) implies that $\lim_{n \to \infty} \hat{E}[\sum_{i=0}^{n-1} I^i_\delta] = 0$. Similarly, we have $\lim_{n \to \infty} \hat{E}[-\sum_{i=0}^{n-1} I^i_\delta] = 0$.

By (4.5), we have

\[
\lim_{n \to \infty} \hat{E}[V(1, \sqrt{\delta}S_n)] = V(0, 0). \quad (4.9)
\]

Finally, $\forall t, t' \in [0, 1+h]$, $x \in \mathbb{R}^d$,
\[
|V(t, x) - V(t', x)| \le \hat{E}\big[|\varphi(x + \sqrt{1+h-t}\,X) - \varphi(x + \sqrt{1+h-t'}\,X)|\big] \le k_\varphi|\sqrt{1+h-t} - \sqrt{1+h-t'}|\,\hat{E}[|X|] \le C_3\sqrt{|t - t'|}, \quad (4.10)
\]
where $k_\varphi$ is a Lipschitz constant depending on $\varphi$, and $C_3$ is a constant depending on $\varphi$. By (4.3) and (4.10), we have
\[
|\hat{E}[\varphi(\sqrt{1+h}\,X)] - \hat{E}[\varphi(X)]| = |V(h, 0) - V(0, 0)| \le C_3\sqrt{h},
\]
and
\[
\Big|\hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] - \hat{E}[V(1, \sqrt{\delta}S_n)]\Big| = |\hat{E}[V(1+h, \sqrt{\delta}S_n)] - \hat{E}[V(1, \sqrt{\delta}S_n)]| \le C_3\sqrt{h}.
\]
It follows that
\[
\Big|\hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] - \hat{E}[\varphi(X)]\Big| \le \Big|\hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] - \hat{E}[V(1, \sqrt{\delta}S_n)]\Big| + |\hat{E}[V(1, \sqrt{\delta}S_n)] - \hat{E}[\varphi(\sqrt{1+h}\,X)]| + |\hat{E}[\varphi(\sqrt{1+h}\,X)] - \hat{E}[\varphi(X)]|
\]
\[
\le 2C_3\sqrt{h} + |\hat{E}[V(1, \sqrt{\delta}S_n)] - V(0, 0)|,
\]
which together with (4.9) implies that
\[
\limsup_{n \to \infty} \Big|\hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] - \hat{E}[\varphi(X)]\Big| \le 2C_3\sqrt{h}.
\]
Since $h$ can be arbitrarily small, we have
\[
\lim_{n \to \infty} \hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] = \hat{E}[\varphi(X)].
\]
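The elementary Young-inequality step used above to absorb $|X_{i+1}|^{\alpha}$, namely $|x|^{\alpha} \le \frac{\alpha}{2+\alpha}|x|^{2+\alpha} + \frac{2}{2+\alpha}$, follows from Young's inequality with exponents $p = \frac{2+\alpha}{\alpha}$, $q = \frac{2+\alpha}{2}$ (so $\frac{1}{p} + \frac{1}{q} = 1$). A grid check (the $\alpha$ values and the grid are illustrative choices):

```python
# Grid check of |x|^alpha <= alpha/(2+alpha)*|x|^(2+alpha) + 2/(2+alpha).
# Equality holds at |x| = 1; the gap (rhs - lhs) must be >= 0 everywhere.
def gap(alpha, x):
    lhs = abs(x) ** alpha
    rhs = alpha / (2 + alpha) * abs(x) ** (2 + alpha) + 2 / (2 + alpha)
    return rhs - lhs

worst = min(gap(alpha, -10 + 20 * i / 4000)
            for alpha in (0.25, 0.5, 0.9)
            for i in range(4001))
```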

Proof of Theorem 3.1. The idea comes from the proof of Theorem 3.5 of [6]. When the uniform elliptic condition (4.1) does not hold, we first introduce a perturbation to prove the above convergence for $\varphi \in C_{b,Lip}(\mathbb{R}^d)$. We can construct a sublinear expectation space $(\bar{\Omega}, \bar{\mathcal{H}}, \bar{E})$ and a sequence of pairs of random vectors $\{(\bar{X}_i, \bar{\kappa}_i)\}_{i=1}^{\infty}$ such that, for each $n = 1, 2, \ldots$, $\{\bar{X}_i\}_{i=1}^{n} \sim \{X_i\}_{i=1}^{n}$ and $(\bar{X}_{n+1}, \bar{\kappa}_{n+1})$ is independent from $\{(\bar{X}_i, \bar{\kappa}_i)\}_{i=1}^{n}$ and, moreover,
\[
\bar{E}[\psi(\bar{X}_i, \bar{\kappa}_i)] = (2\pi)^{-d/2}\int_{\mathbb{R}^d} \hat{E}[\psi(X_i, x)]e^{-\frac{|x|^2}{2}}\,dx, \quad \forall \psi \in C_{b,Lip}(\mathbb{R}^{2d}).
\]
Define $\bar{X}^\varepsilon_i := \bar{X}_i + \varepsilon\bar{\kappa}_i$ for a fixed $\varepsilon > 0$. It is easy to check that the sequence $\{\bar{X}^\varepsilon_i\}_{i=1}^{\infty}$ satisfies all the conditions of Theorem 3.1; in particular,
\[
G^\varepsilon_i(A) := \frac{1}{2}\bar{E}[\langle A\bar{X}^\varepsilon_i, \bar{X}^\varepsilon_i\rangle] = G_i(A) + \frac{\varepsilon^2}{2}\mathrm{tr}[A], \qquad G^\varepsilon(A) := G(A) + \frac{\varepsilon^2}{2}\mathrm{tr}[A].
\]
They are strictly elliptic. Then we can apply Lemma 4.1 to
\[
\bar{S}^\varepsilon_n = \sum_{i=1}^{n}\bar{X}^\varepsilon_i = \sum_{i=1}^{n}\bar{X}_i + \varepsilon J_n, \quad J_n = \sum_{i=1}^{n}\bar{\kappa}_i,
\]
and obtain
\[
\lim_{n \to \infty} \bar{E}\Big[\varphi\Big(\frac{\bar{S}^\varepsilon_n}{\sqrt{n}}\Big)\Big] = \bar{E}[\varphi(\bar{X} + \varepsilon\bar{\kappa})],
\]
where $(\bar{X}, \bar{\kappa})$ is $\bar{G}$-distributed under $\bar{E}[\cdot]$ and
\[
\bar{G}(\bar{A}) := \frac{1}{2}\bar{E}[\langle \bar{A}(\bar{X}, \bar{\kappa}), (\bar{X}, \bar{\kappa})\rangle], \quad \bar{A} \in S(2d).
\]
We have
\[
\Big|\hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] - \bar{E}\Big[\varphi\Big(\frac{\bar{S}^\varepsilon_n}{\sqrt{n}}\Big)\Big]\Big| = \Big|\bar{E}\Big[\varphi\Big(\frac{\bar{S}^\varepsilon_n}{\sqrt{n}} - \varepsilon\frac{J_n}{\sqrt{n}}\Big)\Big] - \bar{E}\Big[\varphi\Big(\frac{\bar{S}^\varepsilon_n}{\sqrt{n}}\Big)\Big]\Big| \le \varepsilon C\bar{E}\Big[\frac{|J_n|}{\sqrt{n}}\Big] \le C'\varepsilon,
\]
where $C$ is a Lipschitz constant of $\varphi$ and $C'$ is a constant depending on $\varphi$. Similarly,
\[
|\hat{E}[\varphi(X)] - \bar{E}[\varphi(\bar{X} + \varepsilon\bar{\kappa})]| = |\bar{E}[\varphi(\bar{X})] - \bar{E}[\varphi(\bar{X} + \varepsilon\bar{\kappa})]| \le C\varepsilon.
\]
Since $\varepsilon$ can be arbitrarily small, it follows that
\[
\lim_{n \to \infty} \hat{E}\Big[\varphi\Big(\frac{S_n}{\sqrt{n}}\Big)\Big] = \hat{E}[\varphi(X)], \quad \forall \varphi \in C_{b,Lip}(\mathbb{R}^d).
\]
The proof of Theorem 3.1 is complete.

Remark 4.2 By the proof of Lemma 4.1, we know that condition (iii) in Theorem 3.1 can be weakened to: there exist two constants $M > 0$ and $\alpha \in (0, 1)$ such that $\hat{E}[|X_i|^{2+\alpha}] \le M$, $\forall i = 1, 2, \ldots$

4.2 Proof of Corollary 3.2

First, we prove two lemmas.

Lemma 4.3 We make the same assumptions as in Corollary 3.2. Then $\{G_i\}$ and $G$ are Lipschitz functions, and moreover they have a common Lipschitz constant $C_0$.

Proof. $\forall A, B \in S(d)$ and $\forall i = 1, 2, \ldots$, we have
\[
|G_i(A) - G_i(B)| = \frac{1}{2}\big|\hat{E}[\langle AX_i, X_i\rangle] - \hat{E}[\langle BX_i, X_i\rangle]\big| \le \frac{1}{2}\hat{E}[|\langle (A-B)X_i, X_i\rangle|] \le \frac{1}{2}\|A-B\|\cdot\hat{E}[|X_i|^2] \le \frac{1}{2}\|A-B\|\cdot\big(\hat{E}[|X_i|^3]\big)^{2/3} \le \frac{1}{2}M^{2/3}\|A-B\|,
\]
and
\[
|G(A) - G(B)| = \lim_{i \to \infty}|G_i(A) - G_i(B)| \le \frac{1}{2}M^{2/3}\|A-B\|.
\]
Set $C_0 = \frac{1}{2}M^{2/3}$. The proof is complete.

Lemma 4.4 We make the same assumptions as in Corollary 3.2. Then there exists a sequence $\{\gamma_i\}$ of positive numbers such that $\lim_{i \to \infty}\gamma_i = 0$ and
\[
|G_i(A) - G(A)| \le \gamma_i\|A\|, \quad \forall A \in S(d).
\]

Proof. By the positive homogeneity of the $G_i$'s and $G$, we only need to prove that $\{G_i\}$ converges to $G$ uniformly on the set $\{A \in S(d) \mid \|A\| \le 1\}$. Suppose that this does not hold. Then there exists an $\varepsilon > 0$ such that $\forall k$, $\exists n_k \ge k$ and $A_{n_k} \in \{A \in S(d) \mid \|A\| \le 1\}$ such that
\[
|G_{n_k}(A_{n_k}) - G(A_{n_k})| > \varepsilon.
\]
Since $\{A \in S(d) \mid \|A\| \le 1\}$ is a compact subset of $S(d)$, it is sequentially compact, so $\{A_{n_k}\}$ has a convergent subsequence $\{A_{n_{k_l}}\}$. Denote by $A_0$ its limit point. By Lemma 4.3, we get
\[
|G_{n_{k_l}}(A_{n_{k_l}}) - G_{n_{k_l}}(A_0)| \le C_0\|A_{n_{k_l}} - A_0\|, \qquad |G(A_{n_{k_l}}) - G(A_0)| \le C_0\|A_{n_{k_l}} - A_0\|,
\]
and thus
\[
|G_{n_{k_l}}(A_{n_{k_l}}) - G(A_{n_{k_l}})| \le |G_{n_{k_l}}(A_{n_{k_l}}) - G_{n_{k_l}}(A_0)| + |G_{n_{k_l}}(A_0) - G(A_0)| + |G(A_0) - G(A_{n_{k_l}})|.
\]
Letting $l \to \infty$, we obtain $\limsup_{l \to \infty}|G_{n_{k_l}}(A_{n_{k_l}}) - G(A_{n_{k_l}})| \le 0$, which contradicts $|G_{n_{k_l}}(A_{n_{k_l}}) - G(A_{n_{k_l}})| > \varepsilon$. Hence the result of this lemma holds.

Then Corollary 3.2 directly follows from Theorem 3.1 and Lemma 4.4.

13 4.3 Proof of Corollary 3.4

Set
\[
G(A) = \frac{1}{2}\hat{E}[\langle AX, X\rangle] = \frac{1}{2}\sup_{Q \in \Theta}\mathrm{tr}[AQ], \quad \forall A \in S(d).
\]
We claim that
\[
|G_i(A) - G(A)| \le d_H(\Theta_i, \Theta)\|A\|. \quad (4.11)
\]
If $d_H(\Theta_i, \Theta) = 0$, then $\Theta_i = \Theta$, so $G_i = G$ and thus (4.11) holds in this case. Now assume that $d_H(\Theta_i, \Theta) \ne 0$. By $\mathrm{tr}[AQ_1] - \mathrm{tr}[AQ_2] = \mathrm{tr}[A(Q_1 - Q_2)]$, we have
\[
|\mathrm{tr}[AQ_1] - \mathrm{tr}[AQ_2]| \le \|A\|\cdot\|Q_1 - Q_2\|. \quad (4.12)
\]
Suppose that (4.11) does not hold. Then there exists an element $A \ne 0 \in S(d)$ such that
\[
|G_i(A) - G(A)| > d_H(\Theta_i, \Theta)\|A\|. \quad (4.13)
\]
Since $\Theta_i$ and $\Theta$ are bounded and closed, there exist $Q_i \in \Theta_i$, $Q \in \Theta$ such that
\[
G_i(A) = \frac{1}{2}\mathrm{tr}[AQ_i], \quad G(A) = \frac{1}{2}\mathrm{tr}[AQ]. \quad (4.14)
\]
Without loss of generality, we assume that $G_i(A) \ge G(A)$. Then by (4.13) and (4.14), we have
\[
\frac{1}{2}\mathrm{tr}[AQ_i] - \frac{1}{2}\mathrm{tr}[AQ] = \frac{1}{2}\mathrm{tr}[A(Q_i - Q)] > d_H(\Theta_i, \Theta)\|A\|. \quad (4.15)
\]
By the definition of $d_H$ and the assumption that $d_H(\Theta_i, \Theta) \ne 0$, there exists $Q_0 \in \Theta$ such that
\[
\|Q_i - Q_0\| < \frac{3}{2}d_H(\Theta_i, \Theta). \quad (4.16)
\]
Then by (4.12), (4.15) and (4.16), we have
\[
\frac{1}{2}\mathrm{tr}[AQ_0] - \frac{1}{2}\mathrm{tr}[AQ] = \frac{1}{2}\mathrm{tr}[AQ_0] - \frac{1}{2}\mathrm{tr}[AQ_i] + \frac{1}{2}\mathrm{tr}[A(Q_i - Q)] > -\frac{1}{2}\|A\|\cdot\|Q_0 - Q_i\| + d_H(\Theta_i, \Theta)\|A\| \ge -\frac{3}{4}d_H(\Theta_i, \Theta)\|A\| + d_H(\Theta_i, \Theta)\|A\| = \frac{1}{4}d_H(\Theta_i, \Theta)\|A\| > 0,
\]
which contradicts the fact that $\frac{1}{2}\mathrm{tr}[AQ] = G(A) = \frac{1}{2}\sup_{Q' \in \Theta}\mathrm{tr}[AQ']$. Hence (4.11) holds. Then by Theorem 3.1, we obtain the result of Corollary 3.4.

14 4.4 Proof of Corollary 3.5

Since $\{\Theta_i\}$ is a Cauchy sequence in $(\Xi, d_H)$ and $(\Xi, d_H)$ is a complete metric space, there exists a bounded closed set $\Theta \in \Xi$ such that $\lim_{i \to \infty} d_H(\Theta, \Theta_i) = 0$, hence $\lim_{n \to \infty}\frac{1}{n}\sum_{i=1}^{n} d_H(\Theta, \Theta_i) = 0$. Thus by Corollary 3.4, we only need to prove that $\Theta$ is a convex set. First, we prove two claims.

Claim 1. $x \in \Theta \iff \lim_{i \to \infty} d(x, \Theta_i) = 0$.

"$\Rightarrow$" If $x \in \Theta$, then by the definition of $d_H$, we have $d(x, \Theta_i) \le d_H(\Theta, \Theta_i)$, and thus $\lim_{i \to \infty} d(x, \Theta_i) = 0$.

"$\Leftarrow$" Since $d_H(\Theta, \Theta_i) \to 0$ as $i \to \infty$, $\forall \varepsilon > 0$ there exists an $N$ such that $d_H(\Theta, \Theta_i) < \varepsilon/2$ when $i > N$. Hence $\Theta_i \subset B_\varepsilon(\Theta)$ for $i > N$, and since $\lim_{i \to \infty} d(x, \Theta_i) = 0$, we get $x \in B_{2\varepsilon}(\Theta)$. Since $\varepsilon$ can be arbitrarily small, we have $x \in \overline{\Theta} = \Theta$.

Claim 2. The function $d(\cdot, \Theta_i): S_+(d) \to \mathbb{R}$ is convex.

Indeed, $\forall x, y \in S_+(d)$ and $\forall a \in [0, 1]$, there exist $x_0, y_0 \in \Theta_i$ such that
\[
d(x, \Theta_i) = d(x, x_0), \quad d(y, \Theta_i) = d(y, y_0).
\]
Since $\Theta_i$ is convex, we have $ax_0 + (1-a)y_0 \in \Theta_i$. Then
\[
d(ax + (1-a)y, \Theta_i) \le d\big(ax + (1-a)y,\ ax_0 + (1-a)y_0\big) = \|(ax + (1-a)y) - (ax_0 + (1-a)y_0)\| \le a\|x - x_0\| + (1-a)\|y - y_0\| = a\,d(x, \Theta_i) + (1-a)\,d(y, \Theta_i),
\]
and thus Claim 2 holds.

Now we prove that $\Theta$ is a convex set. For any $x, y \in \Theta$ and $a \in [0, 1]$, by Claim 1 and Claim 2, we have
\[
\lim_{i \to \infty} d(ax + (1-a)y, \Theta_i) \le \lim_{i \to \infty}\big(a\,d(x, \Theta_i) + (1-a)\,d(y, \Theta_i)\big) = 0.
\]
By Claim 1 again, we get $ax + (1-a)y \in \Theta$, and thus $\Theta$ is a convex set.
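Claim 2 rests only on the convexity of $\Theta_i$; the same argument shows that $x \mapsto d(x, K)$ is convex for any convex set $K$. A numeric spot-check with $K$ the closed unit disc in $\mathbb{R}^2$ as a stand-in for a convex subset of $S_+(d)$ (the sample points are illustrative choices):

```python
import math

# Distance to the closed unit disc in R^2: d(x, K) = max(|x| - 1, 0).
def dist(x):
    return max(math.hypot(x[0], x[1]) - 1.0, 0.0)

pts = [(-3.0, 0.5), (2.0, 2.0), (0.2, -0.1), (4.0, -1.0)]
violations = 0
for x in pts:
    for y in pts:
        for k in range(11):              # check the convexity inequality
            a = k / 10                   # along the segment from y to x
            z = (a * x[0] + (1 - a) * y[0], a * x[1] + (1 - a) * y[1])
            if dist(z) > a * dist(x) + (1 - a) * dist(y) + 1e-12:
                violations += 1
```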

Remark 4.5 Under the conditions of Corollary 3.5, by the above proof and (4.11), we have $\lim_{i \to \infty} G_i(A) = G(A)$, $\forall A \in S(d)$.

5 Multi-dimensional laws of large numbers under sublinear expectations

In this section, we give a multi-dimensional (weak) law of large numbers under sublinear expec- tations and several corollaries. The idea for the proofs is similar to the one used in the above section and we omit the details.

Definition 5.1 ([6]) A $d$-dimensional random vector $\eta = (\eta_1, \eta_2, \ldots, \eta_d)$ on a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ is called maximal distributed if there exists a bounded, convex and closed subset $\Gamma \subset \mathbb{R}^d$ such that
\[
\hat{E}[\varphi(\eta)] = \max_{y \in \Gamma}\varphi(y), \quad \forall \varphi \in C_{b,Lip}(\mathbb{R}^d).
\]

Proposition 5.2 ([6]) Let $\eta$ be maximal distributed. For any $\varphi \in C_{b,Lip}(\mathbb{R}^d)$, define a function
\[
u(t, y) = \hat{E}[\varphi(y + t\eta)], \quad (t, y) \in [0, \infty) \times \mathbb{R}^d.
\]
Then $u$ is the unique viscosity solution of the following parabolic PDE:
\[
\partial_t u - g(Du) = 0, \quad u|_{t=0} = \varphi, \quad (5.1)
\]
where $g = g_\eta: \mathbb{R}^d \to \mathbb{R}$ is defined by $g_\eta(p) = \hat{E}[\langle p, \eta\rangle]$.

Remark 5.3 It is easy to check that $g_\eta$ is a sublinear function on $\mathbb{R}^d$. Then by Theorem 2.2, there exists a bounded, convex and closed subset $\Gamma \subset \mathbb{R}^d$ such that
\[
g(p) = \sup_{q \in \Gamma}\langle p, q\rangle, \quad \forall p \in \mathbb{R}^d.
\]
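For a polytope $\Gamma$, both the support function $g$ of Remark 5.3 and, for linear (or more generally convex) test functions, the maximal-distribution expectation of Definition 5.1 are attained at vertices, so a finite vertex list suffices for a sketch (the square $\Gamma$ and the test directions below are illustrative choices):

```python
# Gamma = convex hull of four vertices (the unit square's corners).
# For linear/convex phi, the maximum over Gamma is attained at a vertex,
# so iterating over the vertex list is enough.
vertices = [(1.0, 1.0), (1.0, -1.0), (-1.0, 1.0), (-1.0, -1.0)]

def g(p):
    """Support function g(p) = sup_{q in Gamma} <p, q> (Remark 5.3)."""
    return max(p[0] * q[0] + p[1] * q[1] for q in vertices)

def E_hat(phi):
    """E_hat[phi(eta)] = max_{y in Gamma} phi(y) for convex phi."""
    return max(phi(y) for y in vertices)

sub_ok = g((1.0, 2.0)) <= g((1.0, 0.0)) + g((0.0, 2.0)) + 1e-12
g_val = g((3.0, -1.0))                     # attained at the vertex (1, -1)
mean_val = E_hat(lambda y: y[0] + y[1])    # attained at the vertex (1, 1)
```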

Theorem 5.4 Let $\{Y_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies the following conditions:

(i) each $Y_{i+1}$ is independent from $(Y_1, \ldots, Y_i)$, $\forall i = 1, 2, \ldots$;

(ii) there exists a positive constant $M$ such that $\hat{E}[|Y_i|^2] \le M$, $\forall i = 1, 2, \ldots$;

(iii) there exist a sequence $\{a_n\}$ of positive numbers and a sub-additive, positively homogeneous function $g: \mathbb{R}^d \to \mathbb{R}$ such that $\lim_{n \to \infty}(a_1 + \cdots + a_n)/n = 0$ and
\[
|g_n(p) - g(p)| \le a_n|p|, \quad \forall n, \ p \in \mathbb{R}^d,
\]
where $g_i: \mathbb{R}^d \to \mathbb{R}$, $g_i(p) = \hat{E}[\langle Y_i, p\rangle]$, $\forall p \in \mathbb{R}^d$.

Then the sequence $\{\frac{S_n}{n}\}_{n=1}^{\infty}$ converges in law to the maximal distribution $N(\Gamma; 0)$, i.e.,
\[
\lim_{n \to \infty} \hat{E}\Big[\varphi\Big(\frac{S_n}{n}\Big)\Big] = \hat{E}[\varphi(\eta)], \quad \forall \varphi \in C_{b,Lip}(\mathbb{R}^d),
\]
where $S_n = Y_1 + \cdots + Y_n$, $\eta \sim N(\Gamma; 0)$ and $g(p) = \sup_{q \in \Gamma}\langle p, q\rangle$, $\forall p \in \mathbb{R}^d$.

Corollary 5.5 Let $\{Y_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies conditions (i)(ii) in Theorem 5.4 and the following condition (iv).

(iv) $\{g_i\}$ converges to $g$ pointwise, i.e., $\lim_{i \to \infty} g_i(p) = g(p)$, $\forall p \in \mathbb{R}^d$, where $g_i: \mathbb{R}^d \to \mathbb{R}$, $g_i(p) = \hat{E}[\langle Y_i, p\rangle]$, $\forall p \in \mathbb{R}^d$.

Then the result of Theorem 5.4 holds.

Define $\Pi := \{\Gamma \subset \mathbb{R}^d \mid \Gamma \text{ is bounded and closed}\}$, and denote by $d_H$ the Hausdorff distance on $\Pi$. Then by [14], $(\Pi, d_H)$ is a complete metric space.

Corollary 5.6 Let $\{Y_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies conditions (i)(ii) in Theorem 5.4 and the following condition (v).

(v) There exists a bounded, convex and closed subset $\Gamma \subset \mathbb{R}^d$ such that
\[
\lim_{n \to \infty}\frac{1}{n}\sum_{i=1}^{n} d_H(\Gamma_i, \Gamma) = 0,
\]
where $\Gamma_i$ is a bounded, convex and closed subset of $\mathbb{R}^d$ such that $\hat{E}[\langle Y_i, p\rangle] = \sup_{q \in \Gamma_i}\langle q, p\rangle$, $\forall p \in \mathbb{R}^d$.

Then the result of Theorem 5.4 holds.

Corollary 5.7 Let $\{Y_i\}_{i=1}^{\infty}$ be a sequence of $\mathbb{R}^d$-valued random variables in a sublinear expectation space $(\Omega, \mathcal{H}, \hat{E})$ which satisfies conditions (i)(ii) in Theorem 5.4 and the following condition (vi).

(vi) Let $\Gamma_i$ be a bounded, convex and closed subset of $\mathbb{R}^d$ such that $\hat{E}[\langle Y_i, p\rangle] = \sup_{q \in \Gamma_i}\langle q, p\rangle$, $\forall p \in \mathbb{R}^d$. Suppose that $\{\Gamma_i\}$ is a Cauchy sequence in $(\Pi, d_H)$.

Then there exists a bounded, convex and closed subset $\Gamma$ of $\mathbb{R}^d$ such that $d_H(\Gamma, \Gamma_i) \to 0$ as $i \to \infty$ and the result of Theorem 5.4 holds.

Remark 5.8 By Corollary 5.6 and the definition of the Hausdorff metric dH , following the proof of Theorem 3.6, we can obtain Theorem 3.1 of [12].

References

[1] Peng S. G.: G-expectation, G-Brownian motion and related stochastic calculus of Itô's type. In: (Benth F. E., et al. eds.) Stochastic Analysis and Applications, Proceedings of the Second Abel Symposium, 2005, Springer-Verlag, 541-567 (2006)

[2] Peng S. G.: Law of large numbers and central limit theorem under nonlinear expectation. arXiv: 0702358v1 (2007)

[3] Peng S. G.: Multi-dimensional G-Brownian motion and related stochastic calculus under G-expectation. Stoch. Proc. Appl., 118, 2223–2253 (2008)

[4] Peng S. G.: A new central limit theorem under sublinear expectation. arXiv: 0803.2656v1 (2008)

[5] Peng S. G.: Survey on normal distributions, central limit theorem, Brownian motion and the related stochastic calculus under sublinear expectations. Sci. China Ser. A, 52, 1391–1411 (2009)

[6] Peng S. G.: Nonlinear expectations and stochastic calculus under uncertainty. arXiv: 1002.4546v1 (2010)

[7] Hu F. and Zhang D. F.: Central limit theorem for capacities. C. R. Acad. Sci. Paris, Ser. I, 348, 1111–1114 (2010)

[8] Li M., Shi Y. F.: A general central limit theorem under sublinear expectations. Sci. China Math., 53, 1989–1994 (2010)

[9] Hu F.: Moment bounds for IID sequences under sublinear expectations. Sci. China Math., 54, 2155–2160 (2011)

[10] Zhang D. F.: A weighted central limit theorem under sublinear expectations. arXiv: 1105.0727v1 (2011)

[11] Chen Z. J.: Strong laws of large numbers for capacities. arXiv: 1006.0749v1 (2010)

[12] Hu F.: General laws of large numbers under sublinear expectations. arXiv: 1104.5296v2 (2012)

[13] Hu Z. C. and Yang Y. Z.: Some inequalities and limit theorems under sublinear expectations. arXiv: 1207.0489 (2012)

[14] Henrikson J.: Completeness and total boundedness of the Hausdorff metric. MIT Undergraduate Journal of Mathematics, 1, 69-80 (1999)

[15] Wang L. H.: On the regularity of fully nonlinear parabolic equations II. Comm. Pure Appl. Math., 45, 141–178 (1992)
