Martingale Theory
Problem set 3, with solutions: Martingales

The solutions of problems 1, 2, 3, 4, 5, 6, and 11 are written down. The rest will come soon.

3.1 Let ξ_j, j = 1, 2, . . . , be i.i.d. random variables with common distribution

P(ξ_i = +1) = p,   P(ξ_i = −1) = q := 1 − p,

and F_n = σ(ξ_j, 0 ≤ j ≤ n), n ≥ 0, their natural filtration. Denote S_n := ∑_{j=1}^{n} ξ_j, n ≥ 0.

(a) Prove that M_n := (q/p)^{S_n} is an (F_n)_{n≥0}-martingale.

(b) For λ > 0 determine C = C(λ) so that

Z_n := C(λ)^n λ^{S_n}

is an (F_n)_{n≥0}-martingale.

SOLUTION: (a)

E[M_{n+1} | F_n] = E[M_n (q/p)^{ξ_{n+1}} | F_n] = M_n E[(q/p)^{ξ_{n+1}} | F_n] = M_n E[(q/p)^{ξ_{n+1}}] = M_n ( p·(q/p) + q·(p/q) ) = M_n.

(b)

C = C(λ) = ( E[λ^{ξ_1}] )^{−1} = ( λp + λ^{−1}q )^{−1}.
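A quick Monte Carlo sanity check of (b) in Python (an illustrative sketch only; the parameter values below are arbitrary): with this choice of C(λ) one should find E[Z_n] = Z_0 = 1 for every n.

    import random

    # Sanity check for 3.1(b): Z_n = C(lambda)^n * lambda^{S_n} should satisfy E[Z_n] = 1.
    # The parameters (p, lam, n_steps, n_runs) are arbitrary choices for illustration.
    p, lam, n_steps, n_runs = 0.3, 1.3, 10, 200_000
    C = 1.0 / (lam * p + (1 - p) / lam)   # C(lambda) = (lambda*p + q/lambda)^{-1}

    total = 0.0
    for _ in range(n_runs):
        S = sum(1 if random.random() < p else -1 for _ in range(n_steps))
        total += C**n_steps * lam**S
    print(total / n_runs)   # should be close to 1, up to Monte Carlo error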

3.2 Gambler's Ruin, 1 A gambler wins or loses one pound in each round of betting, with equal chances and independently of the past events. She starts betting with the firm determination that she will stop when she has either won a pounds or lost b pounds.

(a) What is the probability that she will be winning when she stops playing?

(b) What is the expected number of her betting rounds before she stops playing?

SOLUTION: Model the experiment with a simple symmetric random walk. Let ξ_j, j = 1, 2, . . . , be i.i.d. random variables with common distribution P(ξ_i = +1) = 1/2 = P(ξ_i = −1), and F_n = σ(ξ_j, 0 ≤ j ≤ n), n ≥ 0, their natural filtration. Denote S_0 = 0, S_n := ∑_{j=1}^{n} ξ_j, n ≥ 1. Define the stopping times

T_L := inf{n > 0 : S_n = −b},   T_R := inf{n > 0 : S_n = +a},   T := min{T_L, T_R}. Note that

{the gambler wins a pounds} = {T = T_R},

{the gambler loses b pounds} = {T = T_L}.

(a) By the Optional Stopping Theorem E[S_T] = E[S_0] = 0. Hence

−b P(T = T_L) + a P(T = T_R) = 0.

On the other hand,

P(T = T_L) + P(T = T_R) = 1.

Solving the last two equations we get

P(T = T_L) = a/(a + b),   P(T = T_R) = b/(a + b).

(b) First prove that M_n := S_n^2 − n is yet another martingale:

E[M_{n+1} | F_n] = E[S_{n+1}^2 | F_n] − (n + 1) = E[S_n^2 + 2 S_n ξ_{n+1} + 1 | F_n] − (n + 1) = · · · = M_n.

Now, apply the Optional Stopping Theorem:

0 = E[M_T] = E[S_T^2 − T] = P(T = T_L) b^2 + P(T = T_R) a^2 − E[T]. Hence, using the result from (a),

E[T] = ab.
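A Monte Carlo check of both answers in Python (an illustrative sketch; the values of a, b and the number of runs are arbitrary):

    import random

    # Simulate the symmetric gambler's ruin; compare with P(T = T_R) = b/(a+b) and E[T] = a*b.
    a, b, n_runs = 3, 5, 100_000
    wins, total_T = 0, 0
    for _ in range(n_runs):
        S, T = 0, 0
        while -b < S < a:
            S += 1 if random.random() < 0.5 else -1
            T += 1
        wins += (S == a)
        total_T += T
    print(wins / n_runs, b / (a + b))   # empirical vs. theoretical P(T = T_R)
    print(total_T / n_runs, a * b)      # empirical vs. theoretical E[T]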

3.3HW Gambler's Ruin, 2 Answer the same questions as in problem 2 when the probability of winning or losing one pound in each round is p, respectively, q := 1 − p, with p ∈ (0, 1). Hint: Use the martingales constructed in problem 1.

SOLUTION: Model the experiment with a simple biased random walk. Let ξ_j, j = 1, 2, . . . , be i.i.d. random variables with common distribution P(ξ_i = +1) = p, P(ξ_i = −1) = q,

and F_n = σ(ξ_j, 0 ≤ j ≤ n), n ≥ 0, their natural filtration. Denote

S_0 = 0,   S_n := ∑_{j=1}^{n} ξ_j, n ≥ 1.

Define the stopping times

T_L := inf{n > 0 : S_n = −b},   T_R := inf{n > 0 : S_n = +a},   T := min{T_L, T_R}. Note that

{the gambler wins a pounds} = {T = T_R},

{the gambler loses b pounds} = {T = T_L}.

(a) Use the Optional Stopping Theorem for the martingale (q/p)^{S_n}:

1 = E[(q/p)^{S_T}] = (p/q)^b P(T = T_L) + (q/p)^a P(T = T_R).

On the other hand,

P(T = T_L) + P(T = T_R) = 1.

Solving the last two equations we get

P(T = T_L) = ( 1 − (q/p)^a ) / ( (p/q)^b − (q/p)^a ),   P(T = T_R) = ( 1 − (p/q)^b ) / ( (q/p)^a − (p/q)^b ).

(b) Now, apply the Optional Stopping Theorem to the martingale S_n − (p − q)n. Hence

E[T] = (p − q)^{−1} E[S_T] = (p − q)^{−1} [ a ( 1 − (p/q)^b ) / ( (q/p)^a − (p/q)^b ) − b ( 1 − (q/p)^a ) / ( (p/q)^b − (q/p)^a ) ]
     = (p − q)^{−1} [ a( 1 − (p/q)^b ) + b( 1 − (q/p)^a ) ] / ( (q/p)^a − (p/q)^b ).
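As in problem 2, a quick Monte Carlo check of these formulas (a Python sketch; p, a, b and the number of runs are arbitrary, with p ≠ 1/2):

    import random

    # Simulate the biased gambler's ruin and compare with the formulas above.
    p, a, b, n_runs = 0.6, 3, 5, 100_000
    q = 1 - p
    r = q / p
    P_R = (1 - (1/r)**b) / (r**a - (1/r)**b)                              # P(T = T_R)
    E_T = (a * (1 - (1/r)**b) + b * (1 - r**a)) / ((r**a - (1/r)**b) * (p - q))

    wins, total_T = 0, 0
    for _ in range(n_runs):
        S, T = 0, 0
        while -b < S < a:
            S += 1 if random.random() < p else -1
            T += 1
        wins += (S == a)
        total_T += T
    print(wins / n_runs, P_R)      # empirical vs. theoretical P(T = T_R)
    print(total_T / n_runs, E_T)   # empirical vs. theoretical E[T]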

3.4 Let ξ_j, j = 1, 2, 3, . . . , be independent and identically distributed random variables and

F_n := σ(ξ_j, 0 ≤ j ≤ n), n ≥ 0, the natural filtration generated by them. Assume that for some γ ∈ R the exponential moment m(γ) := E[e^{γ ξ_j}] < ∞ exists. Denote S_0 := 0, S_n := ∑_{j=1}^{n} ξ_j, n ≥ 1. Prove that the process

M_n := m(γ)^{−n} exp{γ S_n},   n ∈ N,

is an (Fn)n≥0-martingale.

SOLUTION: Very much the same as problem 1 (b).
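Spelled out, the one-line computation reads (using the independence of ξ_{n+1} from F_n):

E[M_{n+1} | F_n] = m(γ)^{−(n+1)} e^{γ S_n} E[e^{γ ξ_{n+1}} | F_n] = m(γ)^{−(n+1)} e^{γ S_n} m(γ) = M_n.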

3.5 Let (Ω, F, (F_n)_{n≥0}, P) be a filtered probability space and Y_n, n ≥ 0, a sequence of absolutely integrable random variables adapted to the filtration (F_n)_{n≥0}. Assume that there exist real

numbers un, vn, n ≥ 0, such that

E[Y_{n+1} | F_n] = u_n Y_n + v_n.

Find two real sequences an and bn, n ≥ 0, so that the sequence of random variables

M_n := a_n Y_n + b_n, n ≥ 0, is a martingale w.r.t. the same filtration.

SOLUTION: Write down the martingale condition for M_n:

E[M_{n+1} | F_n] = E[a_{n+1} Y_{n+1} + b_{n+1} | F_n]

= a_{n+1} u_n Y_n + a_{n+1} v_n + b_{n+1} = a_n Y_n + b_n.

We get the recursions

a_{n+1} = a_n u_n^{−1},   b_{n+1} = b_n − a_{n+1} v_n.

The solution is

a_0 = 1,   a_n = ( ∏_{k=0}^{n−1} u_k )^{−1},
b_0 = 0,   b_n = − ∑_{k=1}^{n} a_k v_{k−1}.
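As an illustration, a short Python sketch of this recursion (the function name and the example values are ours, for illustration only; every u_n is assumed to be nonzero):

    # Compute the coefficients (a_n, b_n) from given finite sequences (u_n), (v_n).
    def martingale_coefficients(u, v):
        a, b = [1.0], [0.0]                 # a_0 = 1, b_0 = 0
        for n in range(len(u)):
            a.append(a[-1] / u[n])          # a_{n+1} = a_n / u_n
            b.append(b[-1] - a[-1] * v[n])  # b_{n+1} = b_n - a_{n+1} * v_n
        return a, b

    # Example with hypothetical constant sequences u_n = 0.95, v_n = 0.25:
    # a, b = martingale_coefficients([0.95] * 3, [0.25] * 3)

For instance, with the constant sequences u_n = (N − 1)/N and v_n = 1/K of problem 6 below, this gives a_n = (N/(N − 1))^n.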

3.6HW We place N balls in K urns (in whatever way) and perform the following discrete time process. At each time unit we choose one of the balls uniformly at random (that is: each ball is chosen with probability 1/N) and place it in one of the urns, also uniformly chosen

at random (that is: each urn is chosen with probability 1/K). Denote by Xn the number

of balls in the first urn at time n and let F_n := σ(X_j, 1 ≤ j ≤ n), n ≥ 0, be the natural

filtration generated by the process n ↦ X_n. (a) Compute E[X_{n+1} | F_n].

(b) Using the result from problem 5, find real numbers a_n, b_n, n ≥ 0, such that Z_n :=

a_n X_n + b_n is a martingale with respect to the filtration (F_n)_{n≥0}.

SOLUTION: (a)

E[X_{n+1} | F_n] = (X_n + 1) · (N − X_n)/N · 1/K + (X_n − 1) · X_n/N · (K − 1)/K + X_n · ( (N − X_n)/N · (K − 1)/K + X_n/N · 1/K )
                = (N − 1)/N · X_n + 1/K.

(b) Apply the result from problem 5 with

u_n = (N − 1)/N,   v_n = 1/K.
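A quick one-step Monte Carlo check of (a) in Python (a sketch; N, K, the state x and the number of samples are arbitrary):

    import random

    # Estimate E[X_{n+1} | X_n = x] by simulating a single step many times,
    # and compare with (N-1)/N * x + 1/K.
    N, K, x, n_samples = 20, 4, 7, 200_000
    total = 0
    for _ in range(n_samples):
        ball_in_first_urn = random.random() < x / N   # the chosen ball sits in the first urn
        goes_to_first_urn = random.random() < 1 / K   # the chosen urn is the first urn
        total += x - ball_in_first_urn + goes_to_first_urn
    print(total / n_samples, (N - 1) / N * x + 1 / K)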

3.7 Let X_j, j ≥ 1, be absolutely integrable random variables and F_n := σ(X_j, 1 ≤ j ≤ n), n ≥ 0, their natural filtration. Define the new random variables

Z_0 := 0,   Z_n := ∑_{j=0}^{n−1} ( X_{j+1} − E[X_{j+1} | F_j] ).

Prove that the process n ↦ Z_n is an (F_n)_{n≥0}-martingale.
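(A sketch of the argument: Z_n is F_n-measurable and integrable, and each increment has vanishing conditional expectation,

E[Z_{n+1} − Z_n | F_n] = E[ X_{n+1} − E[X_{n+1} | F_n] | F_n ] = E[X_{n+1} | F_n] − E[X_{n+1} | F_n] = 0,

hence E[Z_{n+1} | F_n] = Z_n.)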

3.8 A biased coin shows HEAD=1 with probability θ ∈ (0, 1), and TAIL=0 with probability 1 − θ. The value θ of the bias is not known.

For t ∈ [0, 1] and n ∈ N we define p_{n,t} : {0, 1}^n → [0, 1] by

p_{n,t}(x_1, . . . , x_n) := t^{∑_{j=1}^{n} x_j} (1 − t)^{n − ∑_{j=1}^{n} x_j}.

We make two hypotheses about the possible value of θ: either θ = a, or θ = b, where a, b ∈ [0, 1] and a ≠ b. We toss the coin repeatedly and form the sequence of random variables

Z_n := p_{n,a}(ξ_1, . . . , ξ_n) / p_{n,b}(ξ_1, . . . , ξ_n),

where ξj, j = 1, 2,... , are the results of the successive trials (HEAD=1, TAIL=0).

Prove that the process n ↦ Z_n is a martingale (with respect to the natural filtration generated by the coin tosses) if and only if the true bias of the coin is θ = b.

SOLUTION:
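A sketch of the key computation (assuming b ∈ (0, 1), so that Z_n is well defined, and writing θ for the true bias): since ξ_{n+1} is independent of the first n tosses,

Z_{n+1} = Z_n (a/b)^{ξ_{n+1}} ( (1 − a)/(1 − b) )^{1 − ξ_{n+1}},   so   E[Z_{n+1} | F_n] = Z_n ( θ · a/b + (1 − θ) · (1 − a)/(1 − b) ).

The factor θ · a/b + (1 − θ) · (1 − a)/(1 − b) equals 1 if and only if θ = b: it is an affine, non-constant function of θ (because a ≠ b) which takes the value 1 at θ = b.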

3.9 Bonus Bellman's Optimality Principle

We model a sequence of gambles as follows. Let ξ_j, j = 1, 2, . . . , be independent random variables with the following identical distribution:

P(ξ_j = +1) = p,   P(ξ_j = −1) = 1 − p =: q,   1/2 < p < 1.

We denote

α := p log_2 p + q log_2 q + 1,

that is, 1 minus the binary entropy of the distribution of ξ_j.

ξ_j is the return of a unit bet in the j-th round. A gambler starts playing with initial fortune

Y0 > 0 and her fortune after round n is

Yn = Yn−1 + Cnξn

where Cn is the amount she bets in this round. Cn may depend on the values of ξ1, . . . , ξn−1,

and 0 ≤ C_n ≤ Y_{n−1}. The expected rate of winnings within n rounds is:

r_n := E[ log_2(Y_n/Y_0) ].

The gambler's goal is to maximize r_n within a fixed number of rounds. (a) Prove that no matter what strategy the gambler chooses (that is: no matter how she

chooses Cn = Cn(ξ1, . . . , ξn−1) ∈ [0,Yn−1])

Xn := log2 Yn − nα

is a supermartingale and hence it follows that rn ≤ nα. This means that she will not be able to make her average winning rate, over any number of rounds, larger than α.

(b) However, there exists a gambling strategy which makes X_n defined above a martingale and thus realizes the maximal average winning rate. Find this strategy. That is: determine

the optimal choice of C_n = C_n(ξ_1, . . . , ξ_{n−1}).

SOLUTION:
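A numerical illustration of the bound in (a), not a proof (a Python sketch; it uses a hypothetical fixed-fraction strategy C_n = f · Y_{n−1} with arbitrary parameter values and compares the empirical rate with α):

    import random
    from math import log2

    # For a fixed betting fraction f, the empirical rate r_n / n should stay below
    # alpha = 1 + p*log2(p) + q*log2(q).
    p, f, n_rounds, n_runs = 0.7, 0.3, 200, 5_000
    q = 1 - p
    alpha = 1 + p * log2(p) + q * log2(q)

    total = 0.0
    for _ in range(n_runs):
        Y = 1.0
        for _ in range(n_rounds):
            Y *= 1 + f if random.random() < p else 1 - f
        total += log2(Y)
    print(total / (n_runs * n_rounds), alpha)   # empirical rate per round vs. the bound alpha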

3.10 Let n ↦ η_n be a homogeneous Markov chain on the countable state space S := {0, 1, 2, . . . }

and F_n := σ(η_j, 0 ≤ j ≤ n), n ≥ 0, its natural filtration. For i ∈ S denote by Q(i) the probability that the Markov chain starting from site i ever reaches the point 0 ∈ S:

Q(i) := P( ∃ n < ∞ : η_n = 0 | η_0 = i ).

Prove that Z_n := Q(η_n) is an (F_n)_{n≥0}-martingale.

SOLUTION:

3.11HW Galton-Watson

Let ξ_{n,k}, n = 1, 2, . . . , k = 1, 2, . . . , be independent and identically distributed random variables which take values in N = {0, 1, 2, . . . }. Assume that they have finite second moment and denote μ := E[ξ_{n,k}], σ^2 := Var(ξ_{n,k}). Define the Galton-Watson branching process

Z_0 := 1,   Z_{n+1} := ∑_{k=1}^{Z_n} ξ_{n+1,k},

and let G_n := σ(Z_j : 0 ≤ j ≤ n), n ≥ 0, be its natural filtration. (a) Prove that

M_n := μ^{−n} Z_n,   n = 0, 1, 2, . . . ,

is a (G_n)_{n≥0}-martingale. (b) Prove that

E[Z_{n+1}^2 | G_n] = μ^2 Z_n^2 + σ^2 Z_n.

(c) Using the result from (b) prove that

 2 n 2 σ µ − 1 M − Mn if µ 6= 1,  n µn+1 µ − 1 Nn :=  2 2 if Mn − nσ Mn µ = 1

is also a (G_n)_{n≥0}-martingale. (d) Using the result from (c) prove that if μ > 1 then sup_{0≤n<∞} E[M_n^2] < ∞ (that is: the martingale M_n is uniformly bounded in L^2), while if μ ≤ 1 then lim_{n→∞} E[M_n^2] = ∞.

SOLUTION: (a)

E[M_{n+1} | G_n] = μ^{−(n+1)} E[ ∑_{k=1}^{Z_n} ξ_{n+1,k} | G_n ] = μ^{−(n+1)} ∑_{k=1}^{Z_n} E[ ξ_{n+1,k} | G_n ] = μ^{−(n+1)} Z_n μ = M_n.

(b)

E[Z_{n+1}^2 | G_n] = E[ ∑_{k=1}^{Z_n} ∑_{l=1}^{Z_n} ξ_{n+1,k} ξ_{n+1,l} | G_n ] = Z_n (μ^2 + σ^2) + (Z_n^2 − Z_n) μ^2 = Z_n^2 μ^2 + Z_n σ^2.

(c) Consider the case μ ≠ 1.

E[ M_{n+1}^2 − σ^2 (μ^{n+1} − 1) / ( μ^{n+2} (μ − 1) ) · M_{n+1} | G_n ]
= μ^{−2(n+1)} E[ Z_{n+1}^2 | G_n ] − σ^2 (μ^{n+1} − 1) / ( μ^{n+2} (μ − 1) ) · E[ M_{n+1} | G_n ]
= μ^{−2(n+1)} ( Z_n^2 μ^2 + Z_n σ^2 ) − σ^2 (μ^{n+1} − 1) / ( μ^{n+2} (μ − 1) ) · M_n
= . . . = M_n^2 − σ^2 (μ^n − 1) / ( μ^{n+1} (μ − 1) ) · M_n.

When μ = 1 the computations are simpler.

(d) E[N_n] = 1, and hence

E[M_n^2] = 1 + σ^2 (μ^n − 1) / ( μ^{n+1} (μ − 1) ).

The right hand side is bounded if μ > 1, and it tends to infinity if μ ≤ 1 (assuming σ^2 > 0; for μ = 1 use E[M_n^2] = 1 + n σ^2 instead).
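A quick Monte Carlo check of (a) in Python (a sketch; the offspring law is an arbitrary choice, P(ξ = 2) = p, P(ξ = 0) = 1 − p, so that μ = 2p):

    import random

    # Simulate a Galton-Watson process and check that E[M_n] = E[mu^{-n} Z_n] = 1.
    p, n_gen, n_runs = 0.7, 10, 50_000
    mu = 2 * p
    total = 0.0
    for _ in range(n_runs):
        Z = 1
        for _ in range(n_gen):
            Z = sum(2 for _ in range(Z) if random.random() < p)
        total += Z / mu**n_gen      # M_n = mu^{-n} Z_n
    print(total / n_runs)           # should be close to 1, up to Monte Carlo error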

3.12 Bonus Pólya Urn, 1

At time n = 0, an urn contains B0 = 1 blue, and R0 = 1 red ball. At each time n = 1, 2, 3,..., a ball is chosen at random from the urn and returned to the urn, together with

a new ball of the same colour. We denote by Bn and Rn the number of blue, respectively,

red balls in the urn after the n-th turn of this procedure. (Note that Bn + Rn = n + 2.)

Denote by F_n := σ(B_j, 0 ≤ j ≤ n) = σ(R_j, 0 ≤ j ≤ n), n ≥ 0, the natural filtration of the process. Let

M_n := B_n / (B_n + R_n)

be the proportion of blue balls in the urn just after time n.

(a) Show that n ↦ M_n is an (F_n)_{n≥0}-martingale. (b) Show that P(B_n = k) = 1/(n + 1) for 1 ≤ k ≤ n + 1. (Hint: Write down the probability of choosing k − 1 blue and n − k + 1 red balls in whatever fixed order.)

(c) We will prove soon that M_∞ := lim_{n→∞} M_n exists almost surely. What is the distribution of

M∞?

(Hint: What is the limit of the distribution of M_n (identified in the previous point) as n → ∞?) (d) (To be done after learning about the Optional Stopping Theorem.) Let T be the number of balls drawn until the first blue ball is chosen. Use the optional stopping theorem to show that E[1/(T + 2)] = 1/4.

SOLUTION:
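(A sketch of (d): the martingale M_n is bounded, 0 ≤ M_n ≤ 1, and T < ∞ almost surely, since the probability that the first n draws are all red is (1/2)(2/3) · · · (n/(n + 1)) = 1/(n + 1) → 0. Hence the Optional Stopping Theorem applies and, since B_T = 2 and B_T + R_T = T + 2,

1/2 = M_0 = E[M_T] = E[2/(T + 2)],   that is,   E[1/(T + 2)] = 1/4.)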

3.13 Pólya Urn, 2

Write a program to simulate the Pólya Urn. Start with B_0 = 1 blue and R_0 = 1 red ball in the urn, perform 1000 steps and record the proportion of blue and red balls after the 1000th step. Repeat this experiment 2000 times and determine the distribution of the final proportion of blue balls by performing elementary statistical analysis.

SOLUTION:
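One possible simulation sketch in Python (the elementary statistical analysis is kept to a simple ten-bin histogram of the final proportions):

    import random
    from collections import Counter

    # Polya urn: start with 1 blue and 1 red ball, run 1000 steps, repeat 2000 times.
    n_steps, n_experiments, n_bins = 1000, 2000, 10
    proportions = []
    for _ in range(n_experiments):
        blue, red = 1, 1
        for _ in range(n_steps):
            if random.random() < blue / (blue + red):   # a blue ball is drawn
                blue += 1
            else:
                red += 1
        proportions.append(blue / (blue + red))

    histogram = Counter(int(x * n_bins) for x in proportions)
    for k in range(n_bins):
        print(f"[{k/n_bins:.1f}, {(k+1)/n_bins:.1f}): {histogram[k]}")
    # The bin counts should be roughly equal, consistent with M_infinity ~ UNI[0, 1]
    # (compare with problem 12(c)).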

3.14 Pólya Urn, 3 We continue the study of the Pólya Urn and use the notation of problem 12. Let θ ∈ [0, 1] be fixed and define

N_n(θ) := (B_n + R_n − 1)! / ( (B_n − 1)! (R_n − 1)! ) · θ^{B_n − 1} (1 − θ)^{R_n − 1}.

Show that N_n(θ) is an (F_n)_{n≥0}-martingale.

SOLUTION:
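A sketch of the computation, writing b = B_n and r = R_n (so b + r = n + 2): a blue ball is drawn with probability b/(b + r) and a red one with probability r/(b + r), hence

E[N_{n+1}(θ) | F_n] = b/(b + r) · (b + r)!/( b! (r − 1)! ) · θ^b (1 − θ)^{r−1} + r/(b + r) · (b + r)!/( (b − 1)! r! ) · θ^{b−1} (1 − θ)^r
= (b + r − 1)!/( (b − 1)! (r − 1)! ) · θ^{b−1} (1 − θ)^{r−1} · ( θ + (1 − θ) ) = N_n(θ).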

3.15 Bonus Bayes Urn Assume we have a randomly biased coin which shows

P(HEAD) = Θ,   P(TAIL) = 1 − Θ,

where Θ ∼ UNI[0, 1] is a random variable which is uniformly distributed in [0, 1]. We toss this coin many times and denote

B_0 = 1,   B_n := 1 + no. of HEADs in the first n trials,

R_0 = 1,   R_n := 1 + no. of TAILs in the first n trials,

and F_n := σ(B_j, 0 ≤ j ≤ n), n ≥ 0, the natural filtration generated by the sequence of coin

tosses. (Note the +1-s and that Bn + Rn = n + 2.)

(a) Prove that for any n ∈ N, the joint distribution of (B0,B1,...,Bn) is the same as that of the sequence denoted the same way in the Pólya Urn, problem 12.

(b) Prove that

M_n := B_n / (B_n + R_n),

(with B_n, R_n defined in this problem) is an (F_n)_{n≥0}-martingale. (c) Prove that

N_n(θ) := (B_n + R_n − 1)! / ( (B_n − 1)! (R_n − 1)! ) · θ^{B_n − 1} (1 − θ)^{R_n − 1}

(with B_n, R_n defined in this problem) is exactly the (regular) conditional density function of the random variable Θ, given F_n.
