
STK4080 Survival and event history analysis, UiO 2019

Slides 5: Counting processes and martingales

SOLUTIONS TO EXERCISES

Bo Lindqvist


EXERCISE 1

Throw a die several times. Let Yi be the result of the ith throw, and let Xi = Y1 + . . . + Yi be the sum of the first i throws.

(a) Find an expression for f(x1, x2).

**********

f(x1, x2) = P(X1 = x1, X2 = x2) = (1/36) · I(1 ≤ x1 ≤ 6, 1 + x1 ≤ x2 ≤ 6 + x1)

(b) Find an expression for f(x2|x1).

**********

f(x2|x1) = f(x1, x2)/f(x1) = (1/6) · I(1 + x1 ≤ x2 ≤ 6 + x1)

(c) Find an expression for E(X2|x1) as a function of x1.

**********

E(X2|x1) = x1 + 7/2


(d) Find the expectation of the random variable E(X2|X1). Find E(X2) from this. Find also E(X2) without using the rule of double expectation.

**********

E(X2|X1) = X1 + 7/2

E(X2) = E(E(X2|X1)) = E(X1) + 7/2 = 7

or

E(X2) = E(Y1 + Y2) = E(Y1) + E(Y2) = 7/2 + 7/2 = 7

(e) Find also E(X3|X2), E(X3|X1, X2) and E(X3|X1).

**********

E(X3|X2) = X2 + 7/2

E(X3|X1, X2) = X2 + 7/2

E(X3|X1) = X1 + 7
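This is not part of the original solution, but the conditional expectations above are easy to check by simulation. A small Monte Carlo sketch in Python (the seed and sample size are arbitrary choices of mine) compares the empirical conditional means with E(X2|X1) = X1 + 7/2 and E(X3|X1) = X1 + 7:

import numpy as np

rng = np.random.default_rng(2019)              # arbitrary seed
n_sim = 200_000                                # assumed simulation size

Y = rng.integers(1, 7, size=(n_sim, 3))        # three die throws per replication
X1 = Y[:, 0]
X2 = Y[:, 0] + Y[:, 1]
X3 = Y.sum(axis=1)

# Average X2 and X3 within each observed value of X1 and compare with the
# theoretical conditional expectations x1 + 7/2 and x1 + 7.
for x1 in range(1, 7):
    sel = X1 == x1
    print(x1, X2[sel].mean(), x1 + 3.5, X3[sel].mean(), x1 + 7)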


EXERCISE 2

Let X1, X2, . . . be independent with E(Xn) = µ for all n.

Let Sn = X1 + . . .+Xn.

Find E(S4|F2), and in general E(Sn|Fm) when m < n.

**********

E(S4|F2) = E(X1 + X2 + X3 + X4 | X1, X2)
         = X1 + X2 + E(X3) + E(X4)
         = X1 + X2 + 2µ

E(Sn|Fm) = Sm + (n−m)µ


EXERCISE 3

Let X1, X2, . . . be independent with

P (Xi = 1) = P (Xi = −1) = 1/2

Think of Xi as the result of a game where one flips a coin and wins 1 unit if “heads” and loses 1 unit if “tails”.

Let Mn = X1 + . . .+Xn be the gain after n games, n = 1,2, . . .

Show that Mn is a (mean zero) martingale.

**********

E(Mn|Fn−1) = E(X1 + . . . + Xn | X1, . . . , Xn−1)
           = X1 + . . . + Xn−1 + E(Xn)
           = Mn−1
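As a numerical illustration (not in the slides; the simulation sizes are assumptions), the martingale property can be checked empirically: the increment Mn − Mn−1 should average to zero within every observed level of Mn−1.

import numpy as np

rng = np.random.default_rng(1)
n_sim, n_games = 100_000, 10                   # assumed sizes
X = rng.choice([-1, 1], size=(n_sim, n_games))
M = X.cumsum(axis=1)                           # M[:, k] is the gain after k+1 games

print(M.mean(axis=0))                          # E(Mn) should be close to 0 for every n

# E(M10 - M9 | M9 = m) should be close to 0 for every observed value m
for m in np.unique(M[:, 8]):
    sel = M[:, 8] == m
    print(m, (M[sel, 9] - M[sel, 8]).mean())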


EXERCISE 4

(a) Let X1, X2, . . . be independent with E(Xn) = 0 for all n.

Let Mn = X1 + . . .+Xn.

Show that Mn is a (mean zero) martingale.

**********

Same proof as for Exercise 3

(b) Let X1, X2, . . . be independent with E(Xn) = µ for all n.

Let Sn = X1 + . . .+Xn.

Compute E(Sn|Fn−1) (see Exercise 2). Is {Sn} a martingale?

Can you find a simple transformation of Sn which is a martingale?

**********

E(Sn|Fn−1) = Sn−1 + µ

This is a martingale only if µ = 0.

Easy to see that Mn = Sn − nµ is a martingale.


(c) Let X1, X2, . . . be independent with E(Xn) = 1 for all n.

Let Mn = X1 ·X2 · . . . ·Xn.

Show that Mn is a martingale. What is E(Mn)?

**********

E(Mn|Fn−1) = E(X1 · · · Xn | X1, . . . , Xn−1)
           = X1 · · · Xn−1 · E(Xn)
           = Mn−1

E(Mn) = 1
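A quick numerical sketch (my addition; Uniform(0, 2) is just one convenient choice of a distribution with mean 1) showing that the running product keeps expectation 1:

import numpy as np

rng = np.random.default_rng(4)
n_sim, n = 200_000, 8                          # assumed sizes
X = rng.uniform(0, 2, size=(n_sim, n))         # E(Xi) = 1
M = X.cumprod(axis=1)                          # Mn = X1 * X2 * ... * Xn

print(M.mean(axis=0))                          # each entry should be close to E(Mn) = 1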


EXERCISE 5

Show that for a martingale {Mn} we have:

(a) E(Mn|Fm) = Mm for all m < n.

This is essentially Exercise 2.1 in the book.

****************

We use induction on k to prove that

(∗) E(Mn|Fn−k) = Mn−k for k = 1,2, . . . , n− 1

The case k = 1 is the definition of a martingale.

Suppose (*) holds for k. Then

E(Mn|Fn−k−1) = E{E(Mn|Fn−k)|Fn−k−1} = E(Mn−k|Fn−k−1) = Mn−k−1

Hence (*) holds for all the required k.

(b) E(Mm|Fn) = Mm for all m < n

**************

This holds trivially since Mm is Fn-measurable (as Fm ⊆ Fn).

(c) Give verbal (intuitive) interpretations of each of (a) and (b).


EXERCISE 6

Define the martingale differences by

∆Mn = Mn −Mn−1

(a) Show that the definition of a martingale, E(Mn|Fn−1) = Mn−1, is equivalent to

E(Mn −Mn−1|Fn−1) = 0, i.e. E(∆Mn|Fn−1) = 0 (1)

(Hint: Why is E(Mn−1|Fn−1) = Mn−1? )

**********************************

The following statements are clearly equivalent, using the ’Hint’ (which is obvious since Mn−1 is Fn−1-measurable):

E(Mn|Fn−1) = Mn−1

E(Mn|Fn−1) − Mn−1 = 0

E(Mn|Fn−1) − E(Mn−1|Fn−1) = 0

E(Mn −Mn−1|Fn−1) = 0


(b) Show that for a martingale we have

Cov(Mn−1 −Mn−2,Mn −Mn−1) = 0 for all n (2)

i.e.

Cov(∆Mn−1,∆Mn) = 0

Explain in words what this means.

****************************

Cov(Mn−1 − Mn−2, Mn − Mn−1) = E{(Mn−1 − Mn−2)(Mn − Mn−1)}
  = E[E{(Mn−1 − Mn−2)(Mn − Mn−1)|Fn−1}]
  = E[(Mn−1 − Mn−2) E{(Mn − Mn−1)|Fn−1}]
  = 0

(c) Show that (1) and (2) automatically hold when Mn = X1 + . . . + Xn for independent X1, X2, . . .

Note that in this case the differences ∆Mn = Mn − Mn−1 = Xn are independent.

Thus (2) shows that martingale differences correspond to a weakening of the independent increments property.


EXERCISE 7

Suppose that you use the martingale betting strategy, but that you in any case must stop after n games.

Calculate the expected gain. Can you “beat the game”?

(Hint: You either win 1 or lose 2^n − 1 units.)

***************************

You lose all n times with probability 1/2^n. The loss is then 1 + 2 + 4 + · · · + 2^{n−1} = 2^n − 1.

You otherwise win 1, which must have probability 1 − 1/2^n. Hence the expected gain is:

−(2^n − 1) · (1/2^n) + 1 · (1 − 1/2^n) = −1 + 1/2^n + 1 − 1/2^n = 0

so you do not “beat the game”.
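The calculation can also be checked by simulating the doubling strategy directly (this sketch is my addition; the simulation size and the cap n = 10 are assumptions): bet 1 unit, double the stake after every loss, and stop at the first win or after n games.

import numpy as np

rng = np.random.default_rng(7)
n_sim, n = 200_000, 10                         # assumed sizes

gains = np.empty(n_sim)
for k in range(n_sim):
    stake, total = 1, 0
    for _ in range(n):
        if rng.random() < 0.5:                 # win: recover all losses plus 1, then stop
            total += stake
            break
        total -= stake                         # loss: stake is lost, next bet is doubled
        stake *= 2
    gains[k] = total

# The gain is +1 with probability 1 - 1/2^n and -(2^n - 1) with probability 1/2^n,
# so the average should be close to 0.
print(gains.mean())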


EXERCISE 8

Do Exercise 2.6 in the book:

Show that the stopped process M^T is a martingale. (Hint: Find a predictable process H such that M^T = H • M.)

********************

We have

M^T_n = M_{n∧T} ≡ { Mn  if n ≤ T
                  { MT  if n > T

Want to write

(∗) M^T_n = H1(M1 − M0) + H2(M2 − M1) + . . . + Hn(Mn − Mn−1)

Let

Hi = { 1  if T ≥ i
     { 0  otherwise

Then Hi is known at time i − 1 (whether T ≥ i is determined by what has happened up to time i − 1, so H is predictable), and we may check that (*) does the job.

************************

This suggests that E(MT) ≡ E(M^T_T) = E(M^T_0) = E(M0) = 0.

But this obviously does not hold for the martingale strategy case (where the process is indeed a martingale, stopped at a stopping time).

What is the clue here? We can conclude from the above that E(M^T_n) = 0 for all n, but not that this n can be replaced by a random T. But see the next slide in the lecture!
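To make the construction concrete, here is a sketch (my addition, with an illustrative stopping time) for the coin-flip martingale of Exercise 3: take T to be the first time the gain reaches +1 (capped at the horizon n so that T is finite), set Hi = I(T ≥ i), and check numerically that the transformation H • M reproduces the stopped process.

import numpy as np

rng = np.random.default_rng(8)
n = 50                                         # assumed horizon
X = rng.choice([-1, 1], size=n)
M = np.concatenate([[0], X.cumsum()])          # M[0] = 0, M[k] = gain after k games

# Stopping time: first time the gain hits +1, capped at n
hits = np.where(M >= 1)[0]
T = int(hits[0]) if hits.size else n

# Predictable weights H_i = I(T >= i); whether T >= i is determined by M_0, ..., M_{i-1}
H = (np.arange(1, n + 1) <= T).astype(float)

# (H . M)_k = sum over i <= k of H_i (M_i - M_{i-1}), versus the stopped process M_{k and T}
transform = np.concatenate([[0], np.cumsum(H * np.diff(M))])
stopped = M[np.minimum(np.arange(n + 1), T)]
print(np.allclose(transform, stopped))         # True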


EXERCISE 9

Suppose Xn = U1 + . . . + Un where U1, . . . , Un are independent with E(Ui) = µ.

Find the Doob decomposition of the process X and identify the predictable part and the innovation part.

******************

Recall the Doob decomposition:

Xn = E(Xn|Fn−1) + (Xn − E(Xn|Fn−1)) = predictable + innovation

Here the predictable part is:

E(Xn|Fn−1) = E(Xn−1 + Un|Fn−1) = Xn−1 + µ

and the innovation is hence

Xn − (Xn−1 + µ) = Xn −Xn−1 − µ = Un − µ
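A minimal numerical sketch of the decomposition (my addition; the exponential distribution is just one convenient choice with known mean µ): the innovations are Un − µ, and their cumulative sums form the mean zero martingale part.

import numpy as np

rng = np.random.default_rng(9)
mu, n_sim, n = 2.0, 100_000, 6                 # assumed values
U = rng.exponential(mu, size=(n_sim, n))       # E(Ui) = mu
X = U.cumsum(axis=1)

predictable = np.concatenate([np.zeros((n_sim, 1)), X[:, :-1]], axis=1) + mu   # X_{n-1} + mu
innovation = X - predictable                                                   # equals U_n - mu
M = innovation.cumsum(axis=1)                  # cumulative innovations = X_n - n*mu

print(np.abs(M.mean(axis=0)).max())            # close to 0 for every n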


EXERCISE 10

Prove the results on the covariances (see Exercise 2.2 in the book).

***********************

Let 0 ≤ s < t < u < v ≤ τ

Cov(M(t) − M(s), M(v) − M(u)) = E{(M(t) − M(s))(M(v) − M(u))}
  = E[E{(M(t) − M(s))(M(v) − M(u))|Fu}]
  = E[(M(t) − M(s)) E{(M(v) − M(u))|Fu}]
  = 0


POISSON PROCESSES

N(t) = # events in [0, t]

Characterizing properties:

• N(t)−N(s) is Poisson-distributed with parameter λ(t− s)

• N(t) has independent increments, i.e. the numbers of events in disjoint intervals are independent.

EXERCISE 11

Show that

M(t) = N(t)− λt

is a martingale. Identify the compensator of N(t).

Hint:

For t > s, E(N(t)|Fs) = N(s) + λ(t − s)

*****************

The compensator must be λt by the Doob-Meyer decomposition (and λt is indeed increasing and predictable).


E(M(t)|Fs) = E{N(t) − λt|Fs}
  = E{N(t)|Fs} − λt
  = E{N(t) − N(s) + N(s)|Fs} − λt
  = E{N(t) − N(s)|Fs} + N(s) − λt
  = E{N(t) − N(s)} + N(s) − λt
  = λt − λs + N(s) − λt
  = N(s) − λs = M(s)
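A simulation sketch (my addition; λ, s, t and the sample size are arbitrary) that illustrates both the hint and the martingale property by conditioning on N(s):

import numpy as np

rng = np.random.default_rng(11)
lam, s, t, n_sim = 1.5, 1.0, 3.0, 200_000      # assumed values

Ns = rng.poisson(lam * s, size=n_sim)                  # N(s)
Nt = Ns + rng.poisson(lam * (t - s), size=n_sim)       # independent increment on (s, t]

M_s, M_t = Ns - lam * s, Nt - lam * t
print(M_t.mean())                              # E(M(t)) should be close to 0

# E(M(t) | N(s) = k) should agree with M(s) = k - lam*s for each k
for k in range(5):
    sel = Ns == k
    print(k, M_t[sel].mean(), k - lam * s)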


EXERCISE 12

This is Exercise 2.3 in the book, plus a new question (d). (a) was done as our Exercise 4(a).

Thus let X1, X2, . . . be independent with E(Xn) = 0 and Var(Xn) = σ^2 for all n. Let Mn = X1 + . . . + Xn.

(a) Show that Mn is a (mean zero) martingale.

(b) Compute 〈M〉n

********************

〈M〉n = ∑_{i=1}^n Var(∆Mi|Fi−1)

Here ∆Mi = Mi − Mi−1 = Xi, so we get

〈M〉n = ∑_{i=1}^n Var(∆Mi|Fi−1) = ∑_{i=1}^n Var(Xi|Fi−1) = ∑_{i=1}^n Var(Xi) = nσ^2

since Xi is independent of Fi−1. (Why?)


(c) Compute [M]n

*******************

[M]n = ∑_{i=1}^n (∆Mi)^2 = ∑_{i=1}^n Xi^2

(d) Compute E 〈M〉n, E [M ]n and Var(Mn).

***************

E〈M〉n = E[M]n = Var(Mn) = nσ^2

(and these are equal in general, see next exercise)
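A numerical check (my addition; the normal distribution, σ and the sizes are arbitrary choices) of 〈M〉n = nσ^2, [M]n = ∑ Xi^2 and E〈M〉n = E[M]n = Var(Mn):

import numpy as np

rng = np.random.default_rng(12)
sigma, n_sim, n = 2.0, 100_000, 10             # assumed values
X = rng.normal(0, sigma, size=(n_sim, n))
M = X.cumsum(axis=1)

pred_var = n * sigma**2                        # <M>_n = n*sigma^2 (deterministic here)
opt_var = (X**2).sum(axis=1)                   # [M]_n = sum of X_i^2 (random)

print(pred_var, opt_var.mean(), M[:, -1].var())    # all close to n*sigma^2 = 40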


EXERCISE 13

Consider a general discrete time martingale M

(a) Mn^2 − 〈M〉n is a mean zero martingale (Exercise 2.4 in the book)

*****************************

Write

Mn^2 = (Mn−1 + Mn − Mn−1)^2
     = Mn−1^2 + 2Mn−1(Mn − Mn−1) + (Mn − Mn−1)^2

Thus

E{Mn^2 − 〈M〉n | Fn−1}
  = E{Mn−1^2 + 2Mn−1(Mn − Mn−1) + (Mn − Mn−1)^2 − 〈M〉n | Fn−1}
  = Mn−1^2 + 2Mn−1 E{(Mn − Mn−1)|Fn−1} + E{(Mn − Mn−1)^2|Fn−1} − ∑_{i=1}^n E{(Mi − Mi−1)^2|Fi−1}
  = Mn−1^2 + 2 · 0 − ∑_{i=1}^{n−1} E{(Mi − Mi−1)^2|Fi−1}
  = Mn−1^2 − 〈M〉n−1


(b) Mn^2 − [M]n is a mean zero martingale (see the book, p. 45)

(c) Use (a) and (b) to prove that

Var(Mn) = E〈M〉n = E[M]n

What is the intuitive essence of this result?


EXERCISE 14

Consider again the Poisson process, where we have shown that

M(t) = N(t)− λt

is a martingale.

Prove now that:

M(t)^2 − λt

is a martingale (Exercise 2.10 in the book)

Then prove that 〈M〉 (t) = λt

What is [M ] (t)?

Hint:

For t > s, E(N(t)|Fs) = N(s) + λ(t − s)

For t > s, E(N(t)^2|Fs) = N(s)^2 + 2N(s)λ(t − s) + λ(t − s) + (λ(t − s))^2

*******************************


The proof given here is by “brute force”.

The first hint is shown as follows, using that N(t) − N(s) is independent of Fs for the Poisson process. Hence

E(N(t)|Fs) = E(N(s) + (N(t) − N(s))|Fs)
  = N(s) + E(N(t) − N(s))
  = N(s) + λ(t − s)

The second hint is shown in a similar way by using that

E(N(t)^2|Fs) = E[(N(s) + N(t) − N(s))^2|Fs]
  = N(s)^2 + 2N(s)E[N(t) − N(s)] + E[(N(t) − N(s))^2]
  = N(s)^2 + 2N(s)λ(t − s) + Var(N(t) − N(s)) + (E(N(t) − N(s)))^2
  = N(s)^2 + 2N(s)λ(t − s) + λ(t − s) + (λ(t − s))^2

It then remains to use these to calculate, again by brute force,

E[M(t)^2 − λt|Fs] ≡ E[(N(t) − λt)^2 − λt|Fs]

and show that it equals M(s)^2 − λs.

Now we prove that 〈M〉(t) = λt. This can be seen from the conclusion obtained above, namely that M(t)^2 = λt + martingale. Then, since λt is predictable (in fact non-stochastic), it follows by the uniqueness of the Doob-Meyer decomposition that λt is the compensator of M(t)^2, and by the slide on page 49, that 〈M〉(t) = λt.


Finally, by the slide on page 47,

[M](t) = ∑_{s≤t} (M(s) − M(s−))^2

But M(t) = N(t) − λt jumps by 1 each time N(t) jumps, so [M](t) = N(t) (which will be seen to be the answer also for general counting processes).
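A short numerical check (my addition; λ, t and the sample size are arbitrary): for the compensated Poisson process, E(M(t)^2) ≈ λt = E〈M〉(t), which also equals E[M](t) = E(N(t)).

import numpy as np

rng = np.random.default_rng(14)
lam, t, n_sim = 2.0, 5.0, 100_000              # assumed values

Nt = rng.poisson(lam * t, size=n_sim)          # N(t); each jump of M has size 1, so [M](t) = N(t)
Mt = Nt - lam * t                              # M(t) = N(t) - lam*t

print((Mt**2).mean(), lam * t, Nt.mean())      # all three close to lam*t = 10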


EXERCISE 15

Do Exercise 2.5:

(a)

M1M2 − 〈M1,M2〉 is a mean zero martingale

M1M2 − [M1,M2] is a mean zero martingale

(b)

Cov(M1n,M2n) = E 〈M1,M2〉n = E [M1,M2]n for all n

**********************************

Hint for (a): We know that M^2 − 〈M〉 is a mean zero martingale when M is. Use that

M1M2 = [(M1 + M2)^2 − (M1 − M2)^2] / 4    (3)

Thus we know that (M1 + M2)2− < M1 + M2 > and (M1 −M2)2− <M1 −M2 > are mean-zero martingales. Thus this is also the case fortheir difference, which is

(M1 + M2)^2 − (M1 − M2)^2 − 〈M1 + M2〉 + 〈M1 − M2〉    (4)

But we have already seen that

〈M1 + M2〉 = 〈M1〉 + 〈M2〉 + 2〈M1,M2〉    (5)


and the same argument will show that

〈M1 − M2〉 = 〈M1〉 + 〈M2〉 − 2〈M1,M2〉    (6)

Substituting (5) and (6) into (4) the result follows from (3).

The statement for [·] follows in exactly the same way.


If you don’t want to use the trick of (3), you may of course use “brute force”:
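The brute-force slides are not reproduced above; the following is a reconstruction sketch of that calculation (my addition, assuming M1 and M2 are mean zero martingales and writing ∆Min = Min − Mi,n−1 for i = 1, 2):

E(M1nM2n|Fn−1) = E{(M1,n−1 + ∆M1n)(M2,n−1 + ∆M2n)|Fn−1}
  = M1,n−1M2,n−1 + M1,n−1E(∆M2n|Fn−1) + M2,n−1E(∆M1n|Fn−1) + E(∆M1n∆M2n|Fn−1)
  = M1,n−1M2,n−1 + E(∆M1n∆M2n|Fn−1)

Since 〈M1,M2〉n − 〈M1,M2〉n−1 = Cov(∆M1n,∆M2n|Fn−1) = E(∆M1n∆M2n|Fn−1), subtracting 〈M1,M2〉n and taking conditional expectations gives

E(M1nM2n − 〈M1,M2〉n|Fn−1) = M1,n−1M2,n−1 − 〈M1,M2〉n−1,

which is the martingale property. For [M1,M2]n = ∑_{i=1}^n ∆M1i∆M2i the same expansion applies, since the increment ∆[M1,M2]n = ∆M1n∆M2n is exactly the last term above.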


We then prove (b):

Cov(M1n,M2n) = E〈M1,M2〉n = E[M1,M2]n for all n

Since M1M2 − 〈M1,M2〉 is a mean zero martingale by (a), E(M1nM2n) = E〈M1,M2〉n. Moreover M1 and M2 are mean zero martingales, so Cov(M1n,M2n) = E(M1nM2n). Using M1M2 − [M1,M2] instead gives E(M1nM2n) = E[M1,M2]n, and the result follows.


EXERCISE 16

Do Exercise 2.10 in the book: Show that

〈M1,M2〉 (t) = 0 for all t

[M1,M2](t) = 0 for all t

i.e. M1 and M2 are orthogonal martingales.

Hint: Argue that N1 +N2 is a counting process with

〈M1 +M2〉 = 〈M1〉+ 〈M2〉

[M1 +M2] = [M1] + [M2]

***************************


(see also bottom lines of pages 69 and 70 in Slides)
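The details are left to the references above; the following is a brief sketch of the standard argument (my addition, assuming M1 = N1 − Λ1 and M2 = N2 − Λ2 are counting process martingales whose counting processes have no common jumps): since N1 + N2 is then itself a counting process, the hint gives

〈M1 + M2〉 = 〈M1〉 + 〈M2〉 and [M1 + M2] = [M1] + [M2].

On the other hand, expanding as in Exercise 15,

〈M1 + M2〉 = 〈M1〉 + 〈M2〉 + 2〈M1,M2〉 and [M1 + M2] = [M1] + [M2] + 2[M1,M2].

Comparing the two expressions gives 〈M1,M2〉(t) = 0 and [M1,M2](t) = 0 for all t.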