Protter Solutions


  • 8/10/2019 Protter Solutions


Solutions to selected problems.

    Chapter 1. Preliminaries

1. Let $A \in \mathcal{F}_S$ and $t \ge 0$. Then $A \cap \{T \le t\} = (A \cap \{S \le t\}) \cap \{T \le t\}$, since $\{T \le t\} \subset \{S \le t\}$. Since $A \cap \{S \le t\} \in \mathcal{F}_t$ and $\{T \le t\} \in \mathcal{F}_t$, we get $A \cap \{T \le t\} \in \mathcal{F}_t$. Thus $\mathcal{F}_S \subset \mathcal{F}_T$.

2. Let $\Omega = \mathbb{N}$ and $\mathcal{F} = \mathcal{P}(\mathbb{N})$ be the power set of the natural numbers. Let $\mathcal{F}_n = \sigma(\{2\}, \{3\}, \dots, \{n+1\})$, $n \ge 1$. Then $(\mathcal{F}_n)_{n\ge1}$ is a filtration. Let $S = 3\cdot 1_{\{3\}} + 4\cdot 1_{\{3\}^c}$ and $T = 4$. Then $S \le T$ and

$\{\omega : S(\omega) = n\} = \{3\}$ if $n = 3$, $\{3\}^c$ if $n = 4$, $\emptyset$ otherwise;

$\{\omega : T(\omega) = n\} = \Omega$ if $n = 4$, $\emptyset$ otherwise.

Hence $\{S = n\} \in \mathcal{F}_n$ and $\{T = n\} \in \mathcal{F}_n$ for all $n$, and $S$, $T$ are both stopping times. However $\{\omega : T(\omega) - S(\omega) = 1\} = \{\omega : 1_{\{3\}}(\omega) = 1\} = \{3\} \notin \mathcal{F}_1$. Therefore $T - S$ is not a stopping time.

3. Observe that $\{T_n \le t\} \in \mathcal{F}_t$ and $\{T_n < t\} \in \mathcal{F}_t$ for all $n \in \mathbb{N}$, $t \in \mathbb{R}_+$, since each $T_n$ is a stopping time and we assume the usual hypotheses. Then (1) $\sup_n T_n$ is a stopping time since for all $t \ge 0$, $\{\sup_n T_n \le t\} = \cap_n\{T_n \le t\} \in \mathcal{F}_t$. (2) $\inf_n T_n$ is a stopping time since $\{\inf_n T_n < t\} = \cup_n\{T_n < t\} \in \mathcal{F}_t$. (3) $\limsup_n T_n$ is a stopping time since $\limsup_n T_n = \inf_m \sup_{n \ge m} T_n$ (use (1) and (2)). (4) $\liminf_n T_n$ is a stopping time since $\liminf_n T_n = \sup_m \inf_{n \ge m} T_n$ (use (1) and (2)).

4. $T$ is clearly a stopping time by exercise 3, since $T = \liminf_n T_n$. Since $T \le T_n$ for all $n$, exercise 1 gives $\mathcal{F}_T \subset \mathcal{F}_{T_n}$ and hence $\mathcal{F}_T \subset \cap_n \mathcal{F}_{T_n}$. Conversely, pick a set $A \in \cap_n \mathcal{F}_{T_n}$. For each $n \ge 1$, $A \in \mathcal{F}_{T_n}$ and $A \cap \{T_n < t\} = \cup_m (A \cap \{T_n \le t - 1/m\}) \in \mathcal{F}_t$. Since $T = \inf_n T_n$, $\{T < t\} = \cup_n \{T_n < t\}$, so $A \cap \{T < t\} = \cup_n (A \cap \{T_n < t\}) \in \mathcal{F}_t$. Then $A \cap \{T \le t\} = \cap_m (A \cap \{T < t + 1/m\}) \in \cap_m \mathcal{F}_{t+1/m} = \mathcal{F}_t$ by right continuity of the filtration. This implies $\cap_n \mathcal{F}_{T_n} \subset \mathcal{F}_T$ and completes the proof.

5. (a) By completeness of the $L^p$ space, $X \in L^p$. By Jensen's inequality, $E|M_t|^p = E|E(X|\mathcal{F}_t)|^p \le E[E(|X|^p|\mathcal{F}_t)] = E|X|^p < \infty$.

(b) By (a), $M_t \in L^p \subset L^1$. For $t \ge s \ge 0$, $E(M_t|\mathcal{F}_s) = E(E(X|\mathcal{F}_t)|\mathcal{F}_s) = E(X|\mathcal{F}_s) = M_s$ a.s., so $\{M_t\}$ is a martingale. Next we show that $\{M_t\}$ is continuous. By Jensen's inequality, for $p > 1$,
$$E|M^n_t - M_t|^p = E|E(M^n - X|\mathcal{F}_t)|^p \le E|M^n - X|^p, \quad \forall t \ge 0. \quad (1)$$
It follows that $\sup_t E|M^n_t - M_t|^p \le E|M^n - X|^p \to 0$ as $n \to \infty$. Fix an arbitrary $\varepsilon > 0$. By Chebyshev's and Doob's inequalities,
$$P\left(\sup_t |M^n_t - M_t| > \varepsilon\right) \le \frac{1}{\varepsilon^p}E\left(\sup_t |M^n_t - M_t|^p\right) \le \left(\frac{p}{p-1}\right)^p \frac{1}{\varepsilon^p}\sup_t E|M^n_t - M_t|^p \to 0. \quad (2)$$
Therefore $M^n$ converges to $M$ uniformly in probability. There exists a subsequence $\{n_k\}$ such that $M^{n_k}$ converges uniformly to $M$ with probability 1. Then $M$ is continuous, since for almost all $\omega$ it is a limit of uniformly convergent continuous paths.


6. Let $p(n)$ denote the probability mass function of the Poisson distribution with parameter $\lambda t$. Assume $\lambda t$ is an integer, as given. Since $E(N_t - \lambda t) = 0$,
$$E|N_t - \lambda t| = E(N_t - \lambda t) + 2E(N_t - \lambda t)^- = 2E(\lambda t - N_t)^+ = 2\sum_{n=0}^{\lambda t}(\lambda t - n)p(n)$$
$$= 2e^{-\lambda t}\lambda t\left[\sum_{n=0}^{\lambda t}\frac{(\lambda t)^n}{n!} - \sum_{n=0}^{\lambda t-1}\frac{(\lambda t)^n}{n!}\right] = 2e^{-\lambda t}\frac{(\lambda t)^{\lambda t+1}}{(\lambda t)!} = 2e^{-\lambda t}\frac{(\lambda t)^{\lambda t}}{(\lambda t - 1)!}. \quad (3)$$
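As a numerical sanity check (a hypothetical script, not part of the original solutions; the choice $\lambda t = 6$ is arbitrary), one can compare a Monte Carlo estimate of $E|N_t - \lambda t|$ against the closed form $2e^{-\lambda t}(\lambda t)^{\lambda t}/(\lambda t - 1)!$:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
m = 6  # lambda * t, assumed to be an integer as in the problem

# Monte Carlo estimate of E|N_t - lambda*t| with N_t ~ Poisson(m)
samples = rng.poisson(m, size=1_000_000)
mc = np.abs(samples - m).mean()

# Closed form 2 e^{-m} m^m / (m-1)!
closed = 2 * math.exp(-m) * m**m / math.factorial(m - 1)

print(mc, closed)  # the two values should agree to about two decimals
```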

7. Since $N$ has stationary increments, for $t \ge s \ge 0$,
$$E(N_t - N_s)^2 = EN^2_{t-s} = \mathrm{Var}(N_{t-s}) + (EN_{t-s})^2 = \lambda(t-s)[1 + \lambda(t-s)]. \quad (4)$$
As $t \downarrow s$ (or $s \uparrow t$), $E(N_t - N_s)^2 \to 0$. Thus $N$ is continuous in $L^2$ and therefore in probability.

8. Suppose $\tau$ is a stopping time. A 3-dimensional Brownian motion is a strong Markov process, and we know that $P(\exists t \ge 0 : W_t = x) = 0$ for each fixed $x \ne 0$. Let $S = \inf\{t > 0 : W_t \in \Lambda\}$ for the relevant set $\Lambda$. Then $S$ is an $\mathcal{F}^W_t$ stopping time and $\{S \le s\} \in \mathcal{F}^W_s$. So $\{S \le s\}$ has to be independent of any set in $\mathcal{F}_\tau$. However $\{\tau \le t\} \cap \{S \le s\} = \emptyset$, which implies that $\{\tau \le t\}$ and $\{S \le s\}$ are dependent. Since $\{\tau \le t\} \in \mathcal{F}_\tau$, this contradicts the fact that $\mathcal{F}_\tau$ and $\mathcal{F}^W_t$ are independent. Hence $\tau$ is not a stopping time.

9. (a) Since the $L^2$ space is complete, it suffices to show that $S^n_t = \sum_{i=1}^n M^i_t$ is a Cauchy sequence with respect to $n$. For $m \ge n$, by independence of $\{M^i\}$,
$$E(S^n_t - S^m_t)^2 = E\left(\sum_{i=n+1}^m M^i_t\right)^2 = \sum_{i=n+1}^m E\left[(M^i_t)^2\right] = t\sum_{i=n+1}^m \frac{1}{i^2}. \quad (5)$$
Since $\sum_{i\ge1} 1/i^2 < \infty$, $(S^n_t)_n$ is a Cauchy sequence in $L^2$ and the series is well defined.


10. (a) Let $N_t = \sum_i \frac{1}{i}(N^i_t - t)$ and $L_t = \sum_i \frac{1}{i}(L^i_t - t)$. As shown in exercise 9(a), $N$ and $L$ are well defined in the $L^2$ sense. Then, by linearity of the $L^2$ space, $M$ is also well defined in the $L^2$ sense, since
$$M_t = \sum_i \frac{1}{i}\left[(N^i_t - t) - (L^i_t - t)\right] = \sum_i \frac{1}{i}(N^i_t - t) - \sum_i \frac{1}{i}(L^i_t - t) = N_t - L_t. \quad (7)$$
Both terms on the right side are martingales that change only by jumps, as shown in exercise 9(b). Hence $M_t$ is a martingale which changes only by jumps.

(b) First we show that, given two independent Poisson processes $N$ and $L$, $\sum_{s>0}\Delta N_s \Delta L_s = 0$ a.s., i.e. $N$ and $L$ almost surely do not jump simultaneously. Let $\{T_n\}_{n\ge1}$ be the sequence of jump times of the process $L$. Then $\sum_{s>0}\Delta N_s \Delta L_s = \sum_n \Delta N_{T_n}$. We want to show that $\sum_n \Delta N_{T_n} = 0$ a.s. Since $\Delta N_{T_n} \ge 0$, it is enough to show that $E\Delta N_{T_n} = 0$ for all $n \ge 1$. Fix $n \ge 1$ and let $\mu_{T_n}$ be the probability measure on $\mathbb{R}_+$ induced by $T_n$. By conditioning,
$$E(\Delta N_{T_n}) = E[E(\Delta N_{T_n}|T_n)] = \int_0^\infty E(\Delta N_{T_n}|T_n = t)\,\mu_{T_n}(dt) = \int_0^\infty E(\Delta N_t)\,\mu_{T_n}(dt), \quad (8)$$
where the last equality is by independence of $N$ and $T_n$. Since $\Delta N_t \in L^1$ and $P(\Delta N_t = 0) = 1$ by problem 25, $E\Delta N_t = 0$, hence $E\Delta N_{T_n} = 0$.

Next we show that the previous claim holds even when there are countably many Poisson processes. Assume that there exist countably many independent Poisson processes $\{N^i\}_{i\ge1}$. Let $A$ be the set on which at least two of the processes $\{N^i\}_{i\ge1}$ jump simultaneously, and let $\Lambda_{ij}$ denote the set on which $N^i$ and $N^j$ do not jump simultaneously. Then $P(\Lambda_{ij}) = 1$ for $i \ne j$ by the previous assertion. Since $A \subset \cup_{i>j}\Lambda^c_{ij}$, $P(A) \le \sum_{i>j}P(\Lambda^c_{ij}) = 0$. Therefore jumps do not happen simultaneously, almost surely.

Going back to the main proof: by (a) and the fact that $N$ and $L$ do not jump simultaneously, for all $t > 0$,
$$\sum_{s\le t}|\Delta M_s| = \sum_{s\le t}|\Delta N_s| + \sum_{s\le t}|\Delta L_s| = \infty \quad \text{a.s.} \quad (9)$$

11. Continuity: We use the notation adopted in Example 2 of section 4 (p. 33). Assume $E|U_1| < \infty$. For $\varepsilon > 0$,
$$\lim_{s\to t}P(|Z_t - Z_s| > \varepsilon) = \lim_{s\to t}\sum_k P(|Z_t - Z_s| > \varepsilon\,|\,N_t - N_s = k)P(N_t - N_s = k)$$
$$\le \lim_{s\to t}\sum_k P\left(\sum_{i=1}^k |U_i| > \varepsilon\right)P(N_t - N_s = k) \le \lim_{s\to t}\sum_k kP\left(|U_1| > \frac{\varepsilon}{k}\right)P(N_t - N_s = k)$$
$$\le \lim_{s\to t}\frac{E|U_1|}{\varepsilon}\sum_k k^2 P(N_t - N_s = k) = \lim_{s\to t}\frac{E|U_1|}{\varepsilon}\left\{\lambda(t-s) + [\lambda(t-s)]^2\right\} = 0. \quad (10)$$


Independent increments: Let $F$ be the distribution function of $U_1$. Using the independence of $\{U_k\}_k$ and the strong Markov property of $N$, for arbitrary $t, s$ with $t \ge s \ge 0$,
$$E\left[e^{iu(Z_t - Z_s) + ivZ_s}\right] = E\left[e^{iu\sum_{k=N_s+1}^{N_t}U_k + iv\sum_{k=1}^{N_s}U_k}\right] = E\left[E\left(e^{iu\sum_{k=N_s+1}^{N_t}U_k + iv\sum_{k=1}^{N_s}U_k}\,\Big|\,\mathcal{F}_s\right)\right]$$
$$= E\left[e^{iv\sum_{k=1}^{N_s}U_k}\,E\left(e^{iu\sum_{k=N_s+1}^{N_t}U_k}\,\Big|\,\mathcal{F}_s\right)\right] = E\left[e^{iv\sum_{k=1}^{N_s}U_k}\right]E\left[e^{iu\sum_{k=N_s+1}^{N_t}U_k}\right]$$
$$= E\left[e^{ivZ_s}\right]E\left[e^{iu(Z_t - Z_s)}\right]. \quad (11)$$

This shows that $Z$ has independent increments.

Stationary increments: Since $\{U_k\}_k$ are i.i.d. and independent of $N_t$,
$$Ee^{iuZ_t} = E\left[E\left(e^{iuZ_t}\,|\,N_t\right)\right] = E\left[\left(\int e^{iux}F(dx)\right)^{N_t}\right] = \sum_{n\ge0}\frac{(\lambda t)^n e^{-\lambda t}}{n!}\left(\int e^{iux}F(dx)\right)^n$$
$$= \exp\left\{-\lambda t\int[1 - e^{iux}]F(dx)\right\}. \quad (12)$$

By (11) and (12), setting $v = -u$,
$$E\left[e^{iu(Z_t - Z_s)}\right] = E\left[e^{iuZ_t}\right]\big/E\left[e^{iuZ_s}\right] = \exp\left\{-\lambda(t-s)\int[1 - e^{iux}]F(dx)\right\} = E\left[e^{iuZ_{t-s}}\right]. \quad (13)$$
Hence $Z$ has stationary increments.

12. By exercise 11, a compound Poisson process is a Lévy process and has independent, stationary increments.
$$E|Z_t - t\lambda EU_1| \le E[E(|Z_t|\,|\,N_t)] + \lambda tE|U_1| = \sum_{n=1}^\infty E\left(\sum_{i=1}^n |U_i|\,\Big|\,N_t = n\right)P(N_t = n) + \lambda tE|U_1|$$
$$= E|U_1|\,EN_t + \lambda tE|U_1| = 2\lambda tE|U_1| < \infty.$$
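The quantities in exercises 11–12 are easy to check by simulation (an illustrative sketch, not part of the original solutions; the Exponential(1) jump law and the parameters $\lambda = 2$, $t = 3$ are arbitrary choices): $EZ_t = \lambda t\,EU_1$, and the $L^1$ bound above holds with room to spare.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, n_paths = 2.0, 3.0, 100_000
mean_u = 1.0  # E U_1 for Exponential(1) jumps

# Z_t = sum of N_t i.i.d. jumps, with N_t ~ Poisson(lam * t)
n_jumps = rng.poisson(lam * t, size=n_paths)
z = np.array([rng.exponential(mean_u, size=k).sum() for k in n_jumps])

l1_dev = np.abs(z - lam * t * mean_u).mean()
print(z.mean(), lam * t * mean_u)          # empirical vs exact mean, both near 6.0
print(l1_dev, 2 * lam * t * mean_u)        # L1 deviation vs the bound 2*lam*t*E|U_1|
```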


13. …where $Z_t = \int_{\mathbb{R}} x\,N_t(\cdot, dx)$ and $\alpha = \int_{|x|\le1} x\,\nu(dx)$. By theorem 43, $E(e^{iuZ_t}) = \exp\{-t\int_{\mathbb{R}}(1 - e^{iux})\nu(dx)\}$, so $Z_t$ is a compound Poisson process (see problem 11). The arrival rate (intensity) is $\lambda = \nu(\mathbb{R})$, since $E(\int_{\mathbb{R}} N_t(\cdot, dx)) = t\int_{\mathbb{R}}\nu(dx) = t\lambda$. Since $EZ_t = E(\int_{\mathbb{R}} x\,N_t(\cdot, dx)) = t\int_{\mathbb{R}} x\,\nu(dx)$, it follows that $\tilde{Z}_t = Z_t - t\int_{\mathbb{R}} x\,\nu(dx)$ is a compensated compound Poisson process. Then problem 12 shows that $\lambda\,EU_1 = \int_{\mathbb{R}} x\,\nu(dx)$. It follows that the distribution of the jumps is $\mu = (1/\lambda)\nu$.

    14. Suppose E Nt


16. Let $\mathcal{F}_t$ be the natural filtration of $B_t$ satisfying the usual hypotheses. By the stationary increments property of Brownian motion and symmetry,
$$W_t = B_{1-t} - B_1 \stackrel{d}{=} B_1 - B_{1-t} \stackrel{d}{=} B_{1-(1-t)} = B_t. \quad (21)$$
This shows $W_t$ is Gaussian. $W$ has stationary increments because $W_t - W_s = B_{1-s} - B_{1-t} \stackrel{d}{=} B_{t-s}$.

Let $\mathcal{G}_t$ be the natural filtration of $W_t$. Then $\mathcal{G}_t = \sigma(B_1 - B_{1-s} : 0 \le s \le t)$. By the independent increments property of $B_t$, $\mathcal{G}_t$ is independent of $\mathcal{F}_{1-t}$. For $s > t$, $W_s - W_t = -(B_{1-t} - B_{1-s}) \in \mathcal{F}_{1-t}$ and hence is independent of $\mathcal{G}_t$.

17. a) Fix $\varepsilon > 0$ and $\omega$ such that $X(\omega)$ has a sample path which is right continuous, with left limits. Suppose there exist infinitely many jumps larger than $\varepsilon$, at times $\{s_n\}_{n\ge1} \subset [0,t]$. (If there are uncountably many such jumps, we can arbitrarily choose countably many of them.) Since $[0,t]$ is compact, there exists a subsequence $\{s_{n_k}\}_{k\ge1}$ converging to a cluster point $s \in [0,t]$. Clearly we can take a further subsequence converging to $s$ monotonically, either from above or from below. To simplify notation, suppose $\{s_n\}_{n\ge1} \uparrow s$. (The other case, $\{s_n\}_{n\ge1} \downarrow s$, is similar.) By the existence of the left limit $X_{s-}$, there exists $\delta > 0$ such that $s' \in (s-\delta, s)$ implies $|X_{s-} - X_{s'}| < \varepsilon/3$ and $|X_{s-} - X_{s'-}| < \varepsilon/3$. However, for $s_n \in (s-\delta, s)$,
$$|\Delta X_{s_n}| = |X_{s_n} - X_{s_n-}| \le |X_{s_n} - X_{s-}| + |X_{s-} - X_{s_n-}| < \frac{2\varepsilon}{3} < \varepsilon. \quad (22)$$
This is a contradiction, and the claim is shown.

b) By a), for each $n$ there are only finitely many jumps of size larger than $1/n$. But $J = \{s \in [0,t] : |\Delta X_s| > 0\} = \cup_{n=1}^\infty\{s \in [0,t] : |\Delta X_s| > 1/n\}$. We see that the cardinality of $J$ is countable.

18. By the corollary to theorem 36 and theorem 37, we can immediately see that $J$ and $Z - J$ are Lévy processes. By the Lévy–Khintchine formula (theorem 43), the characteristic function of $Z$ factors into those of $J$ and $Z - J$; thus $J$ and $Z - J$ are independent. (For an alternative rigorous solution without the Lévy–Khintchine formula, see the proof of theorem 36.)

19. Let $T_n = \inf\{t > 0 : |X_t| > n\}$. Then $T_n$ is a stopping time. Let $S_n = T_n 1_{\{X_0 \le n\}}$. We have $\{S_n \le t\} = (\{T_n \le t\} \cap \{X_0 \le n\}) \cup \{X_0 > n\} \in \mathcal{F}_t$, so $S_n$ is a stopping time. Since $X$ is continuous, $S_n \uparrow \infty$ and $|X^{S_n}1_{\{S_n > 0\}}| \le n$ for all $n \ge 1$. Therefore $X$ is locally bounded.

20. Let $H$ be an arbitrary unbounded random variable (e.g. normal), and let $T$ be a positive random variable, independent of $H$, such that $P(T \le t) > 0$ for all $t > 0$. Define
$$Z_t = \begin{cases} H & t \ge T \\ 0 & \text{otherwise.}\end{cases} \quad (23)$$

Since $Z^{T_n}$ is bounded by $K_n$, $0 = P(Z^{T_n}_t > K_n) = P(H > K_n)P(T_n \ge T, T \le t)$. Since $P(H > K_n) > 0$, it follows that $P(T_n \ge T, T \le t) = 0$ for all $t$ and $n$, and hence $P(T_n < T) = 1$ for all $n$. Moreover $P(\cap_n\{T_n < T\}) = P(T = \infty) = 1$, since $T_n \uparrow \infty$. This is a contradiction.


21. a) Let $a = (1-t)^{-1}\int_t^1 Y(s)\,ds$ and let $M_t = Y(\omega)1_{(0,t)}(\omega) + a1_{[t,1)}(\omega)$. For arbitrary $B \in \mathcal{B}([0,1])$, $\{\omega : M_t \in B\} = ((0,t) \cap \{Y \in B\}) \cup (\{a \in B\} \cap (t,1))$. Now $(0,t) \cap \{Y \in B\} \subset (0,t)$ and hence is in $\mathcal{F}_t$, while $\{a \in B\} \cap (t,1)$ is either $(t,1)$ or $\emptyset$ depending on $B$, and in either case is in $\mathcal{F}_t$. Therefore $M_t$ is adapted. Pick $A \in \mathcal{F}_t$. If $A \subset (0,t)$, then clearly $E(M_t ; A) = E(Y ; A)$. If $A \supset (t,1)$, then
$$E(M_t ; A) = E(Y ; A \cap (0,t)) + E\left(\frac{1}{1-t}\int_t^1 Y(s)\,ds\ ;\ (t,1)\right)$$
$$= E(Y ; A \cap (0,t)) + \int_t^1 Y(s)\,ds = E(Y ; A \cap (0,t)) + E(Y ; (t,1)) = E(Y ; A). \quad (24)$$
Therefore for $A \in \mathcal{F}_t$, $E(M_t ; A) = E(Y ; A)$, and hence $M_t = E(Y|\mathcal{F}_t)$ a.s.

b) By simple calculation, $EY^2 = \int_0^1 \omega^{-2\alpha}\,d\omega = \infty$ for $\alpha = 1/2$. Let $T$ be a stopping time not identically 0. If there were no $\varepsilon_0 > 0$ with $\{T > 0\} \supset (0, \varepsilon_0)$, then $\{T = 0\}$ would have positive measure; since $\mathcal{F}_0$ is trivial up to null sets, it would follow that $T \equiv 0$, a contradiction. Therefore there exists $\varepsilon_0$ such that $\{T > 0\} \supset (0, \varepsilon_0)$. Fix $\varepsilon \in (0, \varepsilon_0)$. Since $T$ is a stopping time, $\{T > \varepsilon\} \in \mathcal{F}_\varepsilon$, so $\{T > \varepsilon\} \subset (0, \varepsilon]$ or $\{T > \varepsilon\} \supset (\varepsilon, 1)$. Combining these observations, $\{T > \varepsilon\} \supset (\varepsilon, 1)$ for all $\varepsilon \in (0, \varepsilon_0)$. For each $\omega \in (0, \varepsilon_0)$ there exists $\delta > 0$ such that $\omega - \delta > 0$ and $\{T > \omega - \delta\} \supset (\omega - \delta, 1)$; in particular $T(\omega) > \omega - \delta$. Taking the limit $\delta \to 0$, we observe $T(\omega) \ge \omega$ on $(0, \varepsilon_0)$.

c) Using the notation in b), $T(\omega) \ge \omega$ on $(0, \varepsilon_0)$ and hence $M_T(\omega) = \omega^{-1/2}$ on $(0, \varepsilon_0)$. Therefore
$$EM_T^2 \ge E\left(\frac{1}{\omega}\ ;\ (0, \varepsilon_0)\right) = \int_0^{\varepsilon_0}\frac{1}{\omega}\,d\omega = \infty.$$
Since this is true for all stopping times not identically equal to 0, $M$ cannot be locally an $L^2$ martingale.

d) By a), $|M_t(\omega)| \le \omega^{-1/2} \vee 2$, so $M$ has a bounded path for each $\omega$. If $M_T1_{\{T>0\}}$ were a bounded random variable, then $M_T$ would be bounded as well, since $M_0 = 2$. However, from c), $M_T \notin L^2$ unless $T \equiv 0$ a.s., and hence $M_T1_{\{T>0\}}$ is unbounded.

23. Let $M$ be a positive local martingale and $\{T_n\}$ its fundamental sequence. Then for $t \ge s \ge 0$, $E(M_{T_n\wedge t}1_{\{T_n>0\}}|\mathcal{F}_s) = M_{T_n\wedge s}1_{\{T_n>0\}}$. Applying Fatou's lemma, $E(M_t|\mathcal{F}_s) \le M_s$ a.s. Therefore


a positive local martingale is a supermartingale. By Doob's supermartingale convergence theorem, a positive supermartingale converges almost surely to $X_\infty \in L^1$ and is closable. Then by Doob's optional stopping theorem, $E(M_T|\mathcal{F}_S) \le M_S$ a.s. for all stopping times $S \le T$.

24. For $t \ge 0$, $\{Z^T_t \in B\} = (\{Z_t \in B\} \cap \{t < T\}) \cup (\{Z_T \in B\} \cap \{t \ge T\} \cap \{T < \infty\}) \cup \dots$


For this, let $t_0 = 0$ and $t_{n+1} = \infty$, and write
$$E\left[\prod_{j=1}^n 1_{\{Z_{t_j}\in B_j\}}\,\Big|\,\mathcal{G}_T\right] = \sum_{k=1}^{n+1}1_{\{T\in[t_{k-1},t_k)\}}\,E\left[\prod_{j=1}^n 1_{\{Z_{t_j}\in B_j\}}\,\Big|\,\mathcal{G}_T\right]\dots$$

25. Observe that $\{|\Delta Z_t| > \varepsilon\} \subset \bigcap_n\bigcup_{m\ge n}\{|Z_t - Z_{t-1/m}| > \varepsilon\}$. Therefore,
$$P(|\Delta Z_t| > \varepsilon) \le \liminf_n P(|Z_t - Z_{t-1/n}| > \varepsilon) = 0. \quad (33)$$
Since this is true for all $\varepsilon > 0$, $P(|\Delta Z_t| > 0) = 0$ for all $t$.

26. To apply the results of exercises 24 and 25, we first show the following almost trivial lemma.

Lemma. Let $T$ be a stopping time and $t \in \mathbb{R}_+$. If $T \equiv t$, then $\mathcal{G}_{T-} = \mathcal{G}_{t-}$ and $\mathcal{G}_T = \mathcal{G}_t$.

Proof. $\mathcal{G}_{T-} = \sigma\{A \cap \{T > s\} : A \in \mathcal{G}_s\} = \sigma\{A \cap \{t > s\} : A \in \mathcal{G}_s\} = \sigma\{A : A \in \mathcal{G}_s,\ s < t\} = \mathcal{G}_{t-}$. Fix $A \in \mathcal{G}_t$. Then $A \cap \{T \le s\}$ equals $A \in \mathcal{G}_t \subset \mathcal{G}_s$ if $t \le s$, and equals $\emptyset \in \mathcal{G}_s$ if $t > s$; hence $\mathcal{G}_t \subset \mathcal{G}_T$. Fix $A \in \mathcal{G}_T$; then $A \cap \{t \le s\} \in \mathcal{G}_s$ for all $s > 0$. In particular $A \cap \{t \le t\} = A \in \mathcal{G}_t$, so $\mathcal{G}_T \subset \mathcal{G}_t$. Therefore $\mathcal{G}_T = \mathcal{G}_t$.

Fix $t > 0$. By exercise 25, $Z_t = Z_{t-}$ a.s. By exercise 24(d), $\mathcal{G}_t = \mathcal{G}_{t-}$, since $t$ is a bounded stopping time.

27. Let $A \in \mathcal{F}_t$. Then $A \cap \{t < S\} = (A \cap \{t < S\}) \cap \{t < T\} \in \mathcal{F}_{T-}$, since $A \cap \{t < S\} \in \mathcal{F}_t$ and $\{t < S\} \subset \{t < T\}$. Hence $\mathcal{F}_{S-} \subset \mathcal{F}_{T-}$.

Since $T_n \le T$, $\mathcal{F}_{T_n-} \subset \mathcal{F}_{T-}$ for all $n$, as shown above. Therefore $\vee_n\mathcal{F}_{T_n-} \subset \mathcal{F}_{T-}$. Let $A \in \mathcal{F}_t$. Then $A \cap \{t < T\} = \cup_n(A \cap \{t < T_n\}) \in \vee_n\mathcal{F}_{T_n-}$, so $\mathcal{F}_{T-} \subset \vee_n\mathcal{F}_{T_n-}$. Therefore $\mathcal{F}_{T-} = \vee_n\mathcal{F}_{T_n-}$.

28. Observe that the first equation in theorem 38 depends only on the existence of a sequence of simple-function approximations to $f \ge 0$ and the convergence of both sides in $E\{\sum_j a_j N^{\Lambda_j}_t\} = t\sum_j a_j\nu(\Lambda_j)$. For this, $f \in L^1$ is enough. (Note that we need $f \in L^2$ to show the second equation.)


29. Let $M_t$ be a Lévy process and a local martingale. $M_t$ has a representation of the form
$$M_t = B_t + \int_{\{|x|\le1\}} x\,[N_t(\cdot, dx) - t\nu(dx)] + \alpha t + \int_{\{|x|>1\}} x\,N_t(\cdot, dx). \quad (34)$$
The first two terms are martingales. Therefore, without loss of generality, we can assume that
$$M_t = \alpha t + \int_{\{|x|>1\}} x\,N_t(\cdot, dx). \quad (35)$$
$M_t$ has only finitely many jumps on each interval $[0,t]$, by exercise 17(a). Let $\{T_n\}_{n\ge1}$ be the sequence of jump times of $M_t$. Then $P(T_n \dots$


So that
$$Y_R \circ \theta_R = \begin{cases} 1 & R \le t,\ Z_t < z - y \\ 0 & \text{otherwise,}\end{cases} \qquad \bar{Y}_R \circ \theta_R = \begin{cases} 1 & R \le t,\ Z_t > z + y \\ 0 & \text{otherwise.}\end{cases} \quad (41)$$
The strong Markov property implies that on $\{R < \infty\}$ …


    Chapter 2. Semimartingales and Stochastic Integrals

1. Let $x_0 \in \mathbb{R}$ be a discontinuity point of $f$. WLOG we can assume that $f$ is right continuous with left limits and that $\Delta f(x_0) = d > 0$. Since $\inf\{t : B_t = x_0\} < \infty$ a.s. and a Brownian path crosses the level $x_0$ infinitely often immediately after first hitting it, $Y_t = f(B_t)$ has infinitely many jumps on a compact interval almost surely. Therefore
$$\sum_{s\le t}(\Delta Y_s)^2 = \infty \quad \text{a.s.}$$

2. By the dominated convergence theorem,
$$\lim_n E_Q[|X_n - X| \wedge 1] = \lim_n E_P\left[(|X_n - X| \wedge 1)\frac{dQ}{dP}\right] = 0.$$
$|X_n - X| \wedge 1 \to 0$ in $L^1(Q)$ implies that $X_n \to X$ in $Q$-probability.

3. $B_t$ is a continuous local martingale by construction, and by independence of $X$ and $Y$,
$$[B, B]_t = \rho^2[X, X]_t + (1 - \rho^2)[Y, Y]_t = t.$$
So by Lévy's theorem, $B$ is a standard 1-dimensional Brownian motion. Moreover,
$$[X, B]_t = \rho t, \qquad [Y, B]_t = \sqrt{1 - \rho^2}\,t.$$
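The bracket identities above can be checked on a fine grid (an illustrative sketch, not part of the original solutions; $\rho = 0.6$ is an arbitrary choice): with $B = \rho X + \sqrt{1-\rho^2}\,Y$ for independent Brownian motions $X$, $Y$, the realized sums of products of increments approximate $[B,B]_t = t$ and $[X,B]_t = \rho t$.

```python
import numpy as np

rng = np.random.default_rng(2)
rho, t, n = 0.6, 1.0, 200_000
dt = t / n

# Increments of two independent Brownian motions on a fine grid
dX = rng.normal(0.0, np.sqrt(dt), size=n)
dY = rng.normal(0.0, np.sqrt(dt), size=n)
dB = rho * dX + np.sqrt(1 - rho**2) * dY

# Realized brackets approximate [B,B]_t = t and [X,B]_t = rho * t
bb = np.sum(dB * dB)
xb = np.sum(dX * dB)
print(bb, xb)  # close to 1.0 and 0.6
```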

4. Assume that $f(0) = 0$. Let $M_t = B_{f(t)}$ and $\mathcal{G}_t = \mathcal{F}_{f(t)}$, where $B$ is a one-dimensional standard Brownian motion and $\mathcal{F}_t$ is the corresponding filtration. Then
$$E[M_t|\mathcal{G}_s] = E\left[B_{f(t)}|\mathcal{F}_{f(s)}\right] = B_{f(s)} = M_s.$$
Therefore $M_t$ is a martingale w.r.t. $\mathcal{G}_t$. Since $f$ is continuous and $B_t$ is a continuous process, $M_t$ is clearly a continuous process. Finally,
$$[M, M]_t = [B, B]_{f(t)} = f(t).$$
If $f(0) > 0$, then we only need to add a constant process $A_t$ to $B_{f(t)}$ such that $2A_tM_0 + A_t^2 = B^2_{f(0)}$ for each $\omega$ to get the desired result.

5. Since $B$ is a continuous process with $B_0 = 0$, $M$ is also continuous. $M$ is a local martingale since $B$ is a locally square integrable local martingale. Moreover,
$$[M, M]_t = \int_0^t H_s^2\,ds = \int_0^t 1\,ds = t.$$
So by Lévy's characterization theorem, $M$ is also a Brownian motion.
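A discrete sketch of this fact (hypothetical; the particular choice $H_s = \mathrm{sign}(B_s)$ with $\mathrm{sign}(0) := 1$ is just one admissible integrand with $H^2 \equiv 1$): the Euler sum for $M_t = \int_0^t H_s\,dB_s$ should again look Gaussian with variance $t$.

```python
import numpy as np

rng = np.random.default_rng(3)
n_steps, n_paths, t = 500, 100_000, 1.0
dt = t / n_steps

b = np.zeros(n_paths)
m = np.zeros(n_paths)
for _ in range(n_steps):
    h = np.where(b >= 0, 1.0, -1.0)                    # H_s = sign(B_s), |H| = 1
    db = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    m += h * db                                        # Euler sum for int H dB
    b += db

print(m.mean(), m.var())  # near 0 and 1: M_1 behaves like B_1
```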


6. Pick an arbitrary $t_0 > 1$. Let $X^n_t = 1_{(t_0 - \frac{1}{n},\infty)}(t)$ for $n \ge 1$, $X_t = 1_{[t_0,\infty)}(t)$, $Y_t = 1_{[t_0,\infty)}(t)$. Then $X^n$, $X$, $Y$ are finite variation processes and semimartingales, and $\lim_n X^n_t = X_t$ almost surely. But
$$\lim_n [X^n, Y]_{t_0} = 0 \ne 1 = [X, Y]_{t_0}.$$

7. Observe that
$$[X^n, Z] = [H^n \cdot Y, Z] = H^n \cdot [Y, Z], \qquad [X, Z] = [H \cdot Y, Z] = H \cdot [Y, Z],$$
and $[Y, Z]$ is a semimartingale. Then by continuity of the stochastic integral, $H^n \to H$ in ucp implies $H^n \cdot [Y, Z] \to H \cdot [Y, Z]$, and hence $[X^n, Z] \to [X, Z]$ in ucp.

8. By applying Itô's formula,
$$[f_n(X) - f(X), Y]_t = (f_n(X_0) - f(X_0))Y_0 + \int_0^t(f_n'(X_s) - f'(X_s))\,d[X, Y]_s + \frac{1}{2}\int_0^t(f_n''(X_s) - f''(X_s))\,d[[X, X], Y]_s$$
$$= (f_n(X_0) - f(X_0))Y_0 + \int_0^t(f_n'(X_s) - f'(X_s))\,d[X, Y]_s.$$
Note that $[[X, X], Y] \equiv 0$, since $X$ and $Y$ are continuous semimartingales and, in particular, $[X, X]$ is a finite variation process. As $n \to \infty$, $(f_n(X_0) - f(X_0))Y_0 \to 0$ for arbitrary $\omega$. Since $[X, Y]$ has paths of finite variation on compacts, we can treat $f_n'(X) \cdot [X, Y]$ and $f'(X) \cdot [X, Y]$ as Lebesgue–Stieltjes integrals computed path by path. So fix $\omega$. Then as $n \to \infty$,
$$\sup_{0\le s\le t}|f_n'(X_s) - f'(X_s)| = \sup_{\inf_{s\le t}X_s \le x \le \sup_{s\le t}X_s}|f_n'(x) - f'(x)| \to 0,$$
since $f_n' \to f'$ uniformly on compacts. Therefore
$$\left|\int_0^t(f_n'(X_s) - f'(X_s))\,d[X, Y]_s\right| \le \sup_{0\le s\le t}|f_n'(X_s) - f'(X_s)|\int_0^t d|[X, Y]|_s \to 0.$$

9. Let $\sigma_n$ be a sequence of random partitions tending to the identity. By theorem 22,
$$[M, A] = M_0A_0 + \lim_n\sum_i\left(M^{T^n_{i+1}} - M^{T^n_i}\right)\left(A^{T^n_{i+1}} - A^{T^n_i}\right)$$
$$= 0 + \lim_n\left(\sup_i\left|M_{T^n_{i+1}} - M_{T^n_i}\right|\right)\sum_i\left|A_{T^n_{i+1}} - A_{T^n_i}\right| = 0,$$
since $M$ is a continuous process (so the first factor tends to 0 by uniform continuity on compacts) and $\sum_i|A_{T^n_{i+1}} - A_{T^n_i}|$ is bounded by the total variation of $A$.


10. $X^2$ is a $P$-semimartingale and hence a $Q$-semimartingale by theorem 2. By the corollary of theorem 15, $(X_-\cdot X)_Q$ is indistinguishable from $(X_-\cdot X)_P$. Then by the definition of quadratic variation,
$$[X, X]_P = X^2 - 2(X_-\cdot X)_P = X^2 - 2(X_-\cdot X)_Q = [X, X]_Q$$
up to an evanescent set.

11. a) $\Lambda = [-2, 1]$ is a closed set and $B$ has continuous paths, so by theorem 4 in chapter 1, $T(\omega) = \inf\{t : B_t \notin \Lambda\}$ is a stopping time.

b) $M_t$ is a uniformly integrable martingale since $M_t = E[B_T|\mathcal{F}_t]$. Clearly $M$ is continuous, and clearly $N$ is a continuous martingale as well. By theorem 23, $[M, M]_t = [B, B]_{T\wedge t} = t\wedge T$ and $[N, N]_t = [B, B]_{T\wedge t} = t\wedge T$. Thus $[M, M] = [N, N]$. However $P(M_t > 1) = 0 \ne P(N_t > 1)$, so $M$ and $N$ do not have the same law.

12. Fix $t \in \mathbb{R}_+$ and $\omega$. WLOG we can assume that $X(\omega)$ has a càdlàg path and $\sum_{s\le t}|\Delta X_s(\omega)| < \infty$. …

17. For $s \ge 1/n$ such that $|t - s| \le 1/n$,
$$|X^n_t - X^n_s| = n\left|\int_s^t Y_u\,du - \int_{s-1/n}^{t-1/n}Y_u\,du\right| \le 2n(t-s)\sup_{s-\frac{1}{n}\le u\le t}|Y_u|.$$
Since $X^n$ is constant on $[0, \cdot]$ outside the relevant range, let $\pi$ be a partition of $[t, 1]$ and $M = \sup_{s-\frac{1}{n}\le u\le t}|Y_u| < \infty$. … At $t = 0$, $\lim_n X^n_0 = 0 = Y_0$. So $X^n \to Y$ for all $t \ge 0$. However, $Y$ need not be a semimartingale: $Y_t = (B_t)^{1/3}$, where $B_t$ is a standard Brownian motion, satisfies all the requirements but is not a semimartingale.

18. $B_t$ has continuous paths almost surely. Fix $\omega$ such that $B_t(\omega)$ is continuous. Then $\lim_n X^n_t(\omega) = B_t(\omega)$ by exercise 17. By theorem 37, the solution of $dZ^n_t = Z^n_{s-}\,dX^n_s$, $Z^n_0 = 1$, is
$$Z^n_t = \exp\left(X^n_t - \frac{1}{2}[X^n, X^n]_t\right) = \exp(X^n_t),$$
since $X^n$ is a continuous finite variation process. On the other hand, $Z_t = \exp(B_t - t/2)$ and
$$\lim_n Z^n_t = \exp(B_t) \ne Z_t.$$

19. $A^n$ is clearly càdlàg and adapted. By the periodicity and symmetry of the sine function,
$$\int_0^{2\pi}|dA^n_s| = \frac{1}{n}\int_0^{2\pi}|d\sin(ns)| = \int_0^{2\pi}|\cos(ns)|\,ds = \int_0^{2\pi}|\cos(s)|\,ds < \infty.$$
Since the total variation is finite, $A^n_t$ is a semimartingale. On the other hand,
$$\lim_n\sup_{0\le t\le 2\pi}|A^n_t| = \lim_n\frac{1}{n} = 0.$$


20. Applying Itô's formula to $u \in C^2(\mathbb{R}^3\setminus\{0\})$ and $B_t \in \mathbb{R}^3\setminus\{0\}$ for all $t$, a.s.,
$$u(B_t) = u(x) + \int_0^t\nabla u(B_s)\cdot dB_s + \frac{1}{2}\int_0^t\Delta u(B_s)\,ds = \frac{1}{\|x\|} + \sum_{i=1}^3\int_0^t\partial_i u(B_s)\,dB^i_s,$$
since $\Delta u = 0$ away from the origin, and $M_t = u(B_t)$ is a local martingale. This solves (a). Fix $1 \le q < 3$; observe that $E(u(B_0)^q) < \infty$ since $3 - q > 0$. Then the first term goes to 0 as $t \to \infty$; in particular it is finite for all $t \ge 0$. The first factor in the second term is finite, while the second factor goes to 0 as $t \to \infty$, since
$$\frac{1}{(2\pi t)^{3/2}}\cdot\frac{1}{(2\pi t)^{3(q-1)/2}}\int e^{-\frac{\|x-y\|^2}{2t/q}}\,dy = \frac{1}{(2\pi t)^{3(q-1)/2}}\cdot\frac{1}{q^{3/2}}\cdot\frac{1}{(2\pi t/q)^{3/2}}\int e^{-\frac{\|x-y\|^2}{2t/q}}\,dy = \frac{1}{(2\pi t)^{3(q-1)/2}}\cdot\frac{1}{q^{3/2}},$$
and the second factor is finite for all $t \ge 0$. (b) is shown with $q = 1$; (c) is shown with $q = 2$. It also shows that
$$\lim_{t\to\infty}E^x\left(u(B_t)^2\right) = 0,$$
which in turn shows (d).

21. Applying integration by parts to $(A_\infty - A)(C_\infty - C)$,
$$(A_\infty - A)(C_\infty - C) = A_\infty C_\infty - \int_0^\cdot(A_\infty - A_{s-})\,dC_s - \int_0^\cdot(C_\infty - C_{s-})\,dA_s + [A, C].$$
Since $A$, $C$ are processes with finite variation,
$$[A, C]_t = \sum_{0<s\le t}\Delta A_s\,\Delta C_s\,\dots$$


22. We claim that, for every integer $k \ge 1$,
$$(A_\infty - A_s)^k = k!\int_s^\infty dA_{s_1}\int_{s_1}^\infty dA_{s_2}\cdots\int_{s_{k-1}}^\infty dA_{s_k}, \quad (48)$$
and prove it by induction. For $k = 1$, equation (48) clearly holds. Assume that it holds for $k = n$. Then
$$(n+1)!\int_s^\infty dA_{s_1}\int_{s_1}^\infty dA_{s_2}\cdots\int_{s_n}^\infty dA_{s_{n+1}} = (n+1)\int_s^\infty(A_\infty - A_{s_1})^n\,dA_{s_1} = (A_\infty - A_s)^{n+1}, \quad (49)$$
since $A$ is a non-decreasing continuous process, so $(A_\infty - A)^n\cdot A$ is indistinguishable from the Lebesgue–Stieltjes integral calculated path by path. Thus equation (48) holds for $k = n + 1$ and hence for all integers $k \ge 1$. Setting $s = 0$ yields the desired result, since $A_0 = 0$.
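The identity (48) can be checked numerically for a concrete increasing path (an illustrative sketch; $A_u = u$ on $[0,1]$ with $A_\infty = 1$ is an arbitrary choice), where it reads $(1-s)^k = k!\int_s^1 dA_{s_1}\cdots\int_{s_{k-1}}^1 dA_{s_k}$:

```python
import math
import numpy as np

def iterated_integral(k, s, n=2000):
    """k-fold iterated integral int_s^1 dA_{s1} ... int_{s_{k-1}}^1 dA_{s_k}
    for A_u = u, computed by a backward recursion on a uniform grid."""
    grid = np.linspace(s, 1.0, n)
    f = np.ones(n)  # innermost integrand is the constant 1
    for _ in range(k):
        inc = 0.5 * (f[1:] + f[:-1]) * np.diff(grid)             # trapezoid pieces
        f = np.concatenate([np.cumsum(inc[::-1])[::-1], [0.0]])  # F(u) = int_u^1 f du
    return f[0]

s = 0.3
for k in (1, 2, 3):
    print(k, (1 - s) ** k, math.factorial(k) * iterated_integral(k, s))
```

Each printed pair should agree closely, confirming $(1-s)^k = k!\,I_k(s)$ path by path for this deterministic $A$.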

24. a)
$$\int_0^t\frac{dX_s}{1-s} = \int_0^t\frac{dB_s}{1-s} - \int_0^t\frac{B_1 - B_s}{(1-s)^2}\,ds = \int_0^t\frac{dB_s}{1-s} - B_1\int_0^t\frac{ds}{(1-s)^2} + \int_0^t B_s\,d\left(\frac{1}{1-s}\right)$$
$$= \int_0^t\frac{dB_s}{1-s} - B_1\left(\frac{1}{1-t} - 1\right) + \frac{B_t}{1-t} - \int_0^t\frac{dB_s}{1-s} = \frac{B_t - B_1}{1-t} + B_1,$$
using integration by parts (note $[1/(1-s), B]_t = 0$ since $1/(1-s)$ is deterministic and of finite variation). Rearranging terms, we have the desired result.

b) Using integration by parts, since $[1-s,\ \int_0^s(1-u)^{-1}dB_u]_t = 0$,
$$X_t = (1-t)\int_0^t\frac{dB_s}{1-s} = \int_0^t\frac{1-s}{1-s}\,dB_s + \int_0^t\left(\int_0^s\frac{dB_u}{1-u}\right)(-1)\,ds = B_t - \int_0^t\frac{X_s}{1-s}\,ds,$$
as required. The last equality uses $X_s = (1-s)\int_0^s(1-u)^{-1}dB_u$.
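A quick simulation check of (b) (a hypothetical sketch, not part of the original solutions): $X_t = (1-t)\int_0^t(1-s)^{-1}dB_s$ is a Brownian bridge, so it should have $EX_t = 0$ and $\mathrm{Var}\,X_t = t(1-t)$; the check below uses $t = 0.5$.

```python
import numpy as np

rng = np.random.default_rng(4)
n_steps, n_paths = 400, 50_000
t_target = 0.5
dt = t_target / n_steps

# Simulate Z_t = int_0^t dB_s / (1 - s) up to t = 0.5, then X_t = (1 - t) Z_t
z = np.zeros(n_paths)
s = 0.0
for _ in range(n_steps):
    db = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    z += db / (1.0 - s)
    s += dt
x = (1.0 - t_target) * z

print(x.mean(), x.var())  # near 0 and t(1-t) = 0.25
```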

27. By Itô's formula and the given SDE (writing $\alpha$ for the mean-reversion parameter),
$$d(e^{\alpha t}X_t) = \alpha e^{\alpha t}X_t\,dt + e^{\alpha t}\,dX_t = \alpha e^{\alpha t}X_t\,dt + e^{\alpha t}(-\alpha X_t\,dt + dB_t) = e^{\alpha t}\,dB_t.$$
Integrating both sides and rearranging terms yields
$$X_t = e^{-\alpha t}\left(X_0 + \int_0^t e^{\alpha s}\,dB_s\right).$$
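The exact solution can be verified against an Euler–Maruyama scheme (an illustrative sketch; $\alpha = 1$, $X_0 = 0$ and the grid are arbitrary choices): for $X_0 = 0$ the exact solution gives $\mathrm{Var}\,X_t = (1 - e^{-2\alpha t})/(2\alpha)$.

```python
import numpy as np

rng = np.random.default_rng(5)
alpha, t, n_steps, n_paths = 1.0, 2.0, 1000, 50_000
dt = t / n_steps

# Euler-Maruyama for dX = -alpha X dt + dB, starting at X_0 = 0
x = np.zeros(n_paths)
for _ in range(n_steps):
    x += -alpha * x * dt + rng.normal(0.0, np.sqrt(dt), size=n_paths)

exact_var = (1 - np.exp(-2 * alpha * t)) / (2 * alpha)
print(x.var(), exact_var)  # both near 0.49
```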


28. By the law of the iterated logarithm, $\lim_{t\to\infty}B_t/t = 0$ a.s. In particular, for almost all $\omega$ and any $\varepsilon \in (0, 1/2)$ there exists $t_0(\omega)$ such that $t > t_0(\omega)$ implies $B_t(\omega)/t < \varepsilon$. Then
$$\lim_{t\to\infty}\mathcal{E}(B)_t = \lim_{t\to\infty}\exp\left[t\left(\frac{B_t}{t} - \frac{1}{2}\right)\right] \le \lim_{t\to\infty}e^{-(\frac{1}{2}-\varepsilon)t} = 0 \quad \text{a.s.}$$
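The pathwise collapse is easy to see by simulation (a hypothetical sketch; $t = 4$ is an arbitrary choice): $\mathcal{E}(B)_t = \exp(B_t - t/2)$ keeps expectation 1 for every $t$, yet its median $e^{-t/2}$ tends to 0, so the typical path dies out.

```python
import numpy as np

rng = np.random.default_rng(6)
t, n_paths = 4.0, 1_000_000

b = rng.normal(0.0, np.sqrt(t), size=n_paths)  # B_t ~ N(0, t)
e = np.exp(b - t / 2)                          # stochastic exponential at time t

print(e.mean())      # stays near 1 (E E(B)_t = 1 for all t)
print(np.median(e))  # near e^{-2}: most paths are already small
```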

29. $\mathcal{E}(X)^{-1} = \mathcal{E}(-X + [X, X])$ by the corollary of theorem 38. This implies that $\mathcal{E}(X)^{-1}$ is the solution of the stochastic differential equation
$$\mathcal{E}(X)^{-1}_t = 1 + \int_0^t\mathcal{E}(X)^{-1}_{s-}\,d(-X_s + [X, X]_s),$$
which is the desired result. Note that the continuity assumed in the corollary is not necessary if we assume $\Delta X_s \ne -1$ instead, so that $\mathcal{E}(X)^{-1}$ is well defined.

30. a) By Itô's formula and continuity of $M$, $M_t = 1 + \int_0^t M_s\,dB_s$. $B_t$ is a locally square integrable local martingale and $M \in \mathbb{L}$. So by theorem 20, $M_t$ is also a locally square integrable local martingale.

b)
$$[M, M]_t = \int_0^t M_s^2\,ds = \int_0^t e^{2B_s - s}\,ds,$$
and for all $t \ge 0$,
$$E([M, M]_t) = \int_0^t E\left[e^{2B_s}\right]e^{-s}\,ds = \int_0^t e^s\,ds = e^t - 1 < \infty.$$


    Chapter 3. Semimartingales and Decomposable Processes

1. Let $\{T_i\}_{i=1}^n$ be a set of predictable stopping times. For each $i$, $T_i$ has an announcing sequence $\{T_{i,j}\}_{j=1}^\infty$. Let $S_k := \max_i T_{i,k}$ and $R_k := \min_i T_{i,k}$. Then $S_k$, $R_k$ are stopping times, and $\{S_k\}$ and $\{R_k\}$ are announcing sequences of the maximum and minimum of $\{T_i\}_i$, respectively.

2. Let $T_n = S + (1 - 1/n)t$ for each $n$. Then $T_n$ is a stopping time and $T_n \uparrow T = S + t$ as $n \to \infty$. Therefore $T$ is a predictable stopping time.

3. Let $S$, $T$ be two predictable stopping times. Then $S \vee T$ and $S \wedge T$ are predictable stopping times, as shown in exercise 1. In addition, $\Lambda = \{S \wedge T = S \vee T\}$. Therefore, without loss of generality, we can assume that $S \le T$. Let $\{T_n\}$ be an announcing sequence of $T$, and let $R_n = T_n + n1_{\{T_n \ge S\}}$. Then $R_n$ is a stopping time, since $\{R_n \le t\} = (\{T_n \le t\} \cap \{T_n < S\}) \cup (\{T_n \le t - n\} \cap \{T_n \ge S\}) \in \mathcal{F}_t$. $R_n$ is increasing, and on $\Lambda$, $\lim R_n = T = S$.

4. For each $X \in \mathbb{L}$, define a new process $X^n$ by $X^n = \sum_{k\in\mathbb{N}}X_{k/2^n}1_{[k/2^n,(k+1)/2^n)}$. Since each summand is an optional process (see exercise 6), $X^n$ is an optional process. As a mapping on the product space, $X$ is a pointwise limit of the $X^n$. Therefore $X$ is optional. Then, by definition, $\mathcal{P} \subset \mathcal{O}$.

5. It suffices to show that all càdlàg processes are progressively measurable (then we can apply a monotone class argument). For a càdlàg process $X$ on $[0,t]$, define a new process $X^n$ by putting $X^n_u = X_{kt/2^n}$ for $u \in [(k-1)t/2^n, kt/2^n)$, $k \in \{1, \dots, 2^n\}$. Then on $[0,t]$,
$$\{X^n \in B\} = \bigcup_k\{\omega : X_{kt/2^n}(\omega) \in B\}\times\left[\frac{(k-1)t}{2^n}, \frac{kt}{2^n}\right) \in \mathcal{F}_t\otimes\mathcal{B}([0,t]), \quad (50)$$
since $X$ is adapted, so $X^n$ is a progressively measurable process. Since $X$ is càdlàg, $\{X^n\}$ converges pointwise to $X$, which therefore is also $\mathcal{F}_t\otimes\mathcal{B}([0,t])$ measurable.

6. (1) $(S, T]$: Since $1_{(S,T]} \in \mathbb{L}$, $(S,T] = \{(s,\omega) : 1_{(S,T]} = 1\} \in \mathcal{P}$ by definition. (2) $[S, T)$: Since $1_{[S,T)} \in \mathbb{D}$, $[S,T) = \{(s,\omega) : 1_{[S,T)} = 1\} \in \mathcal{O}$ by definition. (3) $(S, T)$: Since $[S,T] = \cap_n[S, T + 1/n)$, $[S,T]$ is an optional set. Then $(S,T) = [0,T)\cap[0,S]^c$ and $(S,T)$ is optional as well. (4) $[S, T)$ when $S$, $T$ are predictable: Let $\{S_n\}$ and $\{T_n\}$ be announcing sequences of $S$ and $T$. Then $[S,T) = \cap_m\cup_n(S_m, T_n]$. Since $(S_m, T_n]$ is predictable by (1) for all $m$, $n$, $[S,T)$ is predictable.

7. Pick a set $A \in \mathcal{F}_{S_n}$. Then $A = A \cap \{S_n < T\} \in \mathcal{F}_{T-}$, by theorem 7 and the definition of $\{S_n\}_n$. (Note: the proof of theorem 7 does not require theorem 6.) Since this is true for all $n$, $\vee_n\mathcal{F}_{S_n} \subset \mathcal{F}_{T-}$. To show the converse, let $\Pi = \{B \cap \{t < T\} : B \in \mathcal{F}_t,\ t \ge 0\}$. Then $\Pi$ is closed with respect to finite intersections, and $B \cap \{t < T\} = \cup_n(B \cap \{t < S_n\})$. Since $(B \cap \{t < S_n\}) \cap \{S_n \le s\} \in \mathcal{F}_s$ for all $s$, $B \cap \{t < S_n\} \in \mathcal{F}_{S_n}$. Therefore $B \cap \{t < T\} \in \vee_n\mathcal{F}_{S_n}$ and $\Pi \subset \vee_n\mathcal{F}_{S_n}$. Then by Dynkin's theorem, $\mathcal{F}_{T-} \subset \vee_n\mathcal{F}_{S_n}$.


8. Let $\{S_n\}$ be an increasing sequence of stopping times with limit $T$. Then $\mathcal{F}_{S_n} \subset \mathcal{F}_T$ and $\vee_n\mathcal{F}_{S_n} \subset \mathcal{F}_T$. By the same argument as in exercise 7, $\mathcal{F}_{T-} \subset \vee_n\mathcal{F}_{S_n}$. Since $\mathcal{F}_{T-} = \mathcal{F}_T$ by hypothesis, we have the desired result. (Note: $\{S_n\}$ is not necessarily an announcing sequence, since $S_n = T$ is possible; therefore $\mathcal{F}_{S_n} \ne \mathcal{F}_{T-}$ in general.)

9. Let $X$ be a Lévy process, $\mathcal{G}$ its completed natural filtration, and $T$ a stopping time. Then by exercise 24(c) in chapter 1, $\mathcal{G}_T = \mathcal{G}_{T-}\vee\sigma(X_T)$. Since $X$ jumps only at totally inaccessible times (a consequence of theorem 4), $\Delta X_T = 0$ for every predictable stopping time $T$. Therefore, if $T$ is a predictable time, $\mathcal{G}_T = \mathcal{G}_{T-}\vee\sigma(X_T) = \mathcal{G}_{T-}\vee\sigma(X_{T-}) = \mathcal{G}_{T-}$, since $X_{T-} \in \mathcal{G}_{T-}$. Therefore the completed natural filtration of a Lévy process is quasi left continuous.

10. As given in the hint, $[M, A]$ is a local martingale. Let $\{T_n\}$ be its localizing sequence, that is, $[M, A]^{T_n}$ is a martingale for all $n$. Then $E([X, X]^{T_n}_t) = E([M, M]^{T_n}_t) + E([A, A]^{T_n}_t)$. Since quadratic variation is a non-decreasing non-negative process, first letting $n \to \infty$ and then letting $t \to \infty$, we obtain the desired result by the monotone convergence theorem.

11. By the Kunita–Watanabe inequality for square bracket processes, $([X+Y, X+Y])^{1/2} \le ([X,X])^{1/2} + ([Y,Y])^{1/2}$. It follows that $[X+Y, X+Y] \le 2([X,X] + [Y,Y])$. This implies that $[X+Y, X+Y]$ is locally integrable, so the sharp bracket process $\langle X+Y, X+Y\rangle$ exists. Since the sharp bracket process is the compensator of the square bracket process, we obtain the polarization identity for sharp bracket processes from the one for square bracket processes. Namely,
$$\langle X, Y\rangle = \frac{1}{2}\left(\langle X+Y, X+Y\rangle - \langle X, X\rangle - \langle Y, Y\rangle\right). \quad (51)$$
Then the rest of the proof is exactly the same as that of theorem 25, chapter 2, except that we replace square bracket processes by sharp bracket processes.

12. Since a continuous finite variation process always has locally integrable variation, without loss of generality we can assume that the value of $A$ changes only by jumps. Thus $A$ can be represented as $A_t = \sum_{s\le t}\Delta A_s$. Assume that $C_t = \int_0^t|dA_s|$ is predictable. Then $T_n = \inf\{t > 0 : C_t \ge n\}$ is a predictable time, since it is the debut of a right-closed predictable set. Let $\{T_{n,m}\}_m$ be an announcing sequence of $T_n$ for each $n$, and define $S_n$ by $S_n = \sup_{1\le k\le n}T_{k,n}$. Then $S_n$ is a sequence of stopping times increasing to $\infty$, $S_n < T_n$, and hence $C_{S_n} \le n$. Thus $EC_{S_n} \le n < \infty$ and $C$ is locally integrable. To prove that $C$ is predictable, we introduce two standard results.

Lemma. Suppose that $A$ is the union of the graphs of a sequence of predictable times. Then there exists a sequence of predictable times $\{T_n\}$ such that $A \subset \cup_n[T_n]$ and $[T_n]\cap[T_m] = \emptyset$ for $n \ne m$.

Proof. Let $\{S_n\}_n$ be a sequence of predictable stopping times such that $A \subset \cup_n[S_n]$. Put $T_1 = S_1$ and, for $n \ge 2$, $B_n = \cap_{k=1}^{n-1}\{S_k \ne S_n\}$, $T_n = (S_n)_{B_n}$. Then $B_n \in \mathcal{F}_{S_n-}$, $T_n$ is predictable, $[T_n]\cap[T_m] = \emptyset$ when $n \ne m$, and $A \subset \cup_{n\ge1}[T_n]$. (Note: by the definition of the graph, $[T_n]\cap[T_m] = \emptyset$ even if $P(T_n = T_m = \infty) > 0$, as long as $T_n$ and $T_m$ are disjoint on $\mathbb{R}_+$.)


Lemma. Let $X_t$ be a càdlàg adapted predictable process. Then there exists a sequence of strictly positive predictable times $\{T_n\}$ such that $\{\Delta X \ne 0\} \subset \cup_n[T_n]$.

Proof. Let $S^{1/k}_{n+1} = \inf\{t > S^{1/k}_n(\omega) : |X_t - X_{S^{1/k}_n}| > 1/k \text{ or } |X_{t-} - X_{S^{1/k}_n}| > 1/k\}$. Since $X$ is predictable, we can show by induction that each $S^{1/k}_n$ is a predictable stopping time. In addition, $\{\Delta X \ne 0\} \subset \cup_{n,k\ge1}[S^{1/k}_n]$. Then by the previous lemma, there exists a sequence of predictable stopping times $\{T_n\}$ such that $\cup_{n,k\ge1}[S^{1/k}_n] \subset \cup_n[T_n]$. Then $\{\Delta X \ne 0\} \subset \cup_n[T_n]$.

Proof of the main claim: Combining the two lemmas, we see that $\{\Delta A \ne 0\}$ is contained in the union of a sequence of disjoint graphs of predictable stopping times. Since $A_t = \sum_{s\le t}\Delta A_s$ is absolutely convergent for each $\omega$, it is invariant with respect to changes of the order of summation. Therefore $A_t = \sum_n\Delta A_{S_n}1_{\{S_n\le t\}}$, where each $S_n$ is a predictable time. $\Delta A_{S_n}1_{\{S_n\le t\}}$ is a predictable process, since $S_n$ is predictable and $\Delta A_{S_n} \in \mathcal{F}_{S_n-}$. This clearly implies that $|\Delta A_{S_n}|1_{\{S_n\le t\}}$ is predictable. Then $C$ is predictable as well.

Note: as for the second lemma, the following more general claim holds.

Lemma. Let $X_t$ be a càdlàg adapted process. Then $X$ is predictable if and only if $X$ satisfies the following conditions: (1) there exists a sequence of strictly positive predictable times $\{T_n\}$ such that $\{\Delta X \ne 0\} \subset \cup_n[T_n]$; (2) for each predictable time $T$, $X_T1_{\{T<\infty\}}$ is $\mathcal{F}_{T-}$ measurable.


…where $C$ is a Poisson process,
$$E[N_t - N_s|\mathcal{F}_s] = E\left[\sum_{i=1}^{C_{t-s}}U_i\right] = \sum_{k=1}^\infty E\left[\sum_{i=1}^{C_{t-s}}U_i\,\Big|\,C_{t-s} = k\right]P(C_{t-s} = k)$$
$$= EU_1\sum_{k=1}^\infty kP(C_{t-s} = k) = \lambda\,EU_1\,(t-s). \quad (53)$$
Therefore $N_t - \lambda\,EU_1\,t$ is a martingale and the claim is shown.

16. By direct computation, for $T$ exponential with parameter $\lambda$,
$$\lambda(t) = \lim_{h\downarrow0}\frac{1}{h}P(t \le T < t+h\,|\,T \ge t) = \lim_{h\downarrow0}\frac{1 - e^{-\lambda h}}{h} = \lambda. \quad (54)$$
The case where $T$ is Weibull is similar and omitted.
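The constant hazard rate in (54) is easy to estimate empirically (an illustrative sketch, not part of the original solutions; $\lambda = 1$, $t = 1$, $h = 0.05$ are arbitrary choices), via $\hat\lambda(t) = \frac{1}{h}\cdot\#\{t \le T < t+h\}/\#\{T \ge t\}$:

```python
import numpy as np

rng = np.random.default_rng(7)
lam, t, h = 1.0, 1.0, 0.05
samples = rng.exponential(1 / lam, size=2_000_000)

at_risk = samples >= t                      # survivors at time t
failed = at_risk & (samples < t + h)        # failures in [t, t + h)
hazard = failed.sum() / at_risk.sum() / h

print(hazard)  # near (1 - e^{-lam*h})/h, i.e. close to lam = 1
```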

17. Observe that $P(U > s) = P(T > 0, U > s) = e^{-\lambda s}$ and $P(T \ge t, U > s) = P(U > s) - P(T < t, U > s)$. Then
$$P(t \le T < t+h\,|\,T \ge t, U \ge t) = \frac{P(t \le T < t+h,\ t \le U)}{P(t \le T,\ t \le U)} = 1 - \exp\{-(\lambda + \beta t)h\} + o(h), \quad (55)$$
and
$$\lambda^\#(t) = \lim_{h\downarrow0}\frac{1 - \exp\{-(\lambda + \beta t)h\}}{h} = \lambda + \beta t. \quad (56)$$

18. There exists a disjoint sequence of predictable times $\{T_n\}$ such that $\{t > 0 : \Delta\langle M, M\rangle_t \ne 0\} \subset \cup_n[T_n]$. (See the discussion in the solution of exercise 12 for details.) In particular, every jump time of $\langle M, M\rangle$ can be taken to be a predictable time. Let $T$ be a predictable time with $\Delta\langle M, M\rangle_T \ne 0$, and let $N = [M, M] - \langle M, M\rangle$. Then $N$ is a martingale with finite variation, since $\langle M, M\rangle$ is the compensator of $[M, M]$. Since $T$ is predictable, $E[\Delta N_T|\mathcal{F}_{T-}] = 0$. On the other hand, since $\{\mathcal{F}_t\}$ is a quasi-left-continuous filtration, $\mathcal{F}_{T-} = \mathcal{F}_T$, so $\Delta N_T = E[\Delta N_T|\mathcal{F}_T] = E[\Delta N_T|\mathcal{F}_{T-}] = 0$. This implies that $\Delta\langle M, M\rangle_T = \Delta[M, M]_T = (\Delta M_T)^2$. Recall that $M$ itself is a martingale, so $M_{T-} = E[M_T|\mathcal{F}_{T-}] = E[M_T|\mathcal{F}_T] = M_T$ and $\Delta M_T = 0$. Therefore $\Delta\langle M, M\rangle_T = 0$, a contradiction, and $\langle M, M\rangle$ is continuous.

19. By theorem 36, $X$ is a special semimartingale. Then by theorem 34 it has a unique decomposition $X = M + A$ such that $M$ is a local martingale and $A$ is a predictable finite variation process. Let $X = N + C$ be an arbitrary decomposition of $X$. Then $M - N = C - A$, which implies that $A$ is the compensator of $C$. It suffices to show that a local martingale with finite variation has locally integrable variation. Set $Y = M - N$ and $Z_t = \int_0^t|dY_s|$. Let $\{S_n\}$ be a fundamental sequence of $Y$ and set $T_n = S_n \wedge n \wedge \inf\{t : Z_t > n\}$. Then $Y_{T_n} \in L^1$ (see the proof of theorem 38) and $Z_{T_n} \le n + |\Delta Y_{T_n}| \in L^1$. Thus $Y = M - N$ has locally integrable variation. Then $C$ is the sum of two processes with locally integrable variation, and the claim holds.


20. Let $\{T_n\}$ be an increasing sequence of stopping times such that $X^{T_n}$ is a special semimartingale, as in the statement. Then by theorem 37, $X^{*,T_n}_t = \sup_{s\le t\wedge T_n}|X_s|$ is locally integrable; namely, there exists an increasing sequence of stopping times $\{R_n\}$ such that $(X^{*,T_n}_t)^{R_n}$ is integrable. Let $S_n = T_n \wedge R_n$. Then $S_n$ is an increasing sequence of stopping times such that $(X^*_t)^{S_n}$ is integrable. Then $X^*_t$ is locally integrable, and by theorem 37, $X$ is a special semimartingale.

21. Since $Q \sim P$, $dQ/dP > 0$ and $Z > 0$. Clearly $M \in L^1(Q)$ if and only if $MZ \in L^1(P)$. By the generalized Bayes formula,
$$E_Q[M_t|\mathcal{F}_s] = \frac{E_P[M_tZ_t|\mathcal{F}_s]}{E_P[Z_t|\mathcal{F}_s]} = \frac{E_P[M_tZ_t|\mathcal{F}_s]}{Z_s}, \quad t \ge s. \quad (57)$$
Thus $E_P[M_tZ_t|\mathcal{F}_s] = M_sZ_s$ if and only if $E_Q[M_t|\mathcal{F}_s] = M_s$.
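Formula (57) can be verified exactly on a finite probability space (a toy example I am adding for illustration — two coin tosses, an arbitrary positive density, an arbitrary $\mathcal{F}_t$-measurable $M_t$): compute both sides atom by atom of $\mathcal{F}_s = \sigma(\text{first toss})$.

```python
import numpy as np

# Finite space: two fair coin tosses under P; atoms HH, HT, TH, TT
p = np.array([0.25, 0.25, 0.25, 0.25])
density = np.array([0.4, 0.8, 1.2, 1.6])   # dQ/dP, so E_P[dQ/dP] = 1
m_t = np.array([3.0, 1.0, 4.0, 1.0])       # an F_t-measurable random variable

q = p * density                            # Q-probabilities of the atoms
atoms_s = [np.array([0, 1]), np.array([2, 3])]  # F_s = sigma(first toss)

for idx in atoms_s:
    z_s = np.sum(p[idx] * density[idx]) / np.sum(p[idx])   # Z_s on this atom
    lhs = np.sum(q[idx] * m_t[idx]) / np.sum(q[idx])       # E_Q[M_t | F_s]
    rhs = np.sum(p[idx] * m_t[idx] * density[idx]) / np.sum(p[idx]) / z_s
    print(lhs, rhs)  # identical on each atom
```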

22. By Rao's theorem, $X$ has a unique decomposition $X = M + A$, where $M$ is a $(\mathcal{G}, P)$-local martingale and $A$ is a predictable process with paths of locally integrable variation. Then $E[X_{t_{i+1}} - X_{t_i}|\mathcal{G}_{t_i}] = E[A_{t_{i+1}} - A_{t_i}|\mathcal{G}_{t_i}]$. So
$$\sup_\tau E\left[\sum_{i=0}^n\left|E[A_{t_{i+1}} - A_{t_i}|\mathcal{G}_{t_i}]\right|\right] < \infty\,\dots$$


By hypothesis, $E\left([A, A]^{1/2}_\infty\right) < \infty$. … $\{T \le s\} \in \mathcal{F}_s$. Therefore for all $t$, $T \wedge s$ ($s \le t$) is $\mathcal{G}_t$ measurable. Hence $\mathcal{F}^0_t \subset \mathcal{G}_t$. This shows that $\mathcal{F} \subset \mathcal{G}$. (Note: we assume that $\mathcal{G}$ satisfies the usual hypotheses as well.)

25. Recall that $\{(\omega, t) : \Delta A_t(\omega) \ne 0\}$ is contained in a union of graphs of disjoint predictable times, and in particular we can assume that a jump time of a predictable process is a predictable time. (See the discussion in the solution of exercise 12.) For any predictable time $T$ such that $E[\Delta Z_T|\mathcal{F}_{T-}] = 0$,
$$\Delta A_T = E[\Delta A_T|\mathcal{F}_{T-}] = E[\Delta M_T|\mathcal{F}_{T-}] = 0 \quad \text{a.s.} \quad (62)$$

26. Without loss of generality, we can assume that $A_0 = 0$, since $E[\Delta A_0|\mathcal{F}_{0-}] = E[A_0|\mathcal{F}_0] = A_0 = 0$. For any finite-valued stopping time $S$, $E[A_S] \le E[A_\infty]$, since $A$ is an increasing process. Observe that $A_\infty \in L^1$ because $A$ is a process with integrable variation. Therefore $\{A_S\}_S$ is uniformly integrable and $A$ is of class (D). Applying theorem 11 (the Doob–Meyer decomposition) to $A$, we see that $M = A - \tilde{A}$ is a uniformly integrable martingale, where $\tilde{A}$ denotes the compensator. Then
$$0 = E[\Delta M_T|\mathcal{F}_{T-}] = E[\Delta A_T|\mathcal{F}_{T-}] - E[\Delta\tilde{A}_T|\mathcal{F}_{T-}] = 0 - E[\Delta\tilde{A}_T|\mathcal{F}_{T-}] \quad \text{a.s.} \quad (63)$$
Therefore the compensator $\tilde{A}$ is continuous at time $T$.

27. Assume $A$ is continuous. Consider an arbitrary increasing sequence of stopping times $\{T_n\} \uparrow T$, where $T$ is a finite stopping time. $M$ is a uniformly integrable martingale by theorem 11 and the hypothesis that $Z$ is a supermartingale of class (D). Then $\infty > EZ_T$ and $A_T \in L^1$, with $A_T \ge A_{T_n}$ for each $n$. Therefore Doob's optional sampling theorem, Lebesgue's dominated convergence theorem, and continuity of $A$ yield
$$\lim_n E[Z_T - Z_{T_n}] = \lim_n E[M_T - M_{T_n}] - \lim_n E[A_T - A_{T_n}] = -E\left[\lim_n(A_T - A_{T_n})\right] = 0. \quad (64)$$
Therefore $Z$ is regular.

Conversely, suppose that $Z$ is regular and assume that $A$ is not continuous at time $T$. Since $A$ is predictable, so are $A_-$ and $\Delta A$; in particular, $T$ can be taken to be a predictable time. Then there exists an announcing sequence $\{T_n\} \uparrow T$. Since $Z$ is regular,
$$0 = \lim_n E[Z_T - Z_{T_n}] = \lim_n E[M_T - M_{T_n}] - \lim_n E[A_T - A_{T_n}] = -E[\Delta A_T]. \quad (65)$$
Since $A$ is an increasing process, $\Delta A_T \ge 0$; therefore $\Delta A_T = 0$ a.s. This is a contradiction. Thus $A$ is continuous.


31. Let $T$ be an arbitrary $\mathcal{F}$ stopping time and $\Lambda = \{\omega : X_T(\omega) \ne X_{T-}(\omega)\}$. Then by Meyer's theorem, $T = T_\Lambda \wedge T_{\Lambda^c}$, where $T_\Lambda$ is a totally inaccessible time and $T_{\Lambda^c}$ is a predictable time. By continuity of $X$, $\Lambda = \emptyset$ and $T_\Lambda = \infty$. Therefore $T = T_{\Lambda^c}$. It follows that all stopping times are predictable and there is no totally inaccessible stopping time.

32. By exercise 31, the standard Brownian space supports only predictable times, since Brownian motion is clearly a strong Markov Feller process with continuous paths. Since $\mathcal{O} = \sigma([S, T[\ :\ S, T \text{ stopping times},\ S \le T)$ and $\mathcal{P} = \sigma([S, T[\ :\ S, T \text{ predictable times},\ S \le T)$, if all stopping times are predictable then $\mathcal{O} = \mathcal{P}$.

35. $\mathcal{E}(M)_t \ge 0$ and $\mathcal{E}(M)_t = \exp[B_{t\wedge\tau} - \frac{1}{2}(t\wedge\tau)] \le e$. So it is a bounded local martingale and hence a martingale. If $\mathcal{E}(M)$ were a uniformly integrable martingale, there would exist $\mathcal{E}(M)_\infty$ such that $E[\mathcal{E}(M)_\infty] = 1$. By the law of the iterated logarithm, $\exp(B_\tau - \frac{1}{2}\tau)1_{\{\tau=\infty\}} = 0$ a.s. Then
$$E[\mathcal{E}(M)_\infty] = E\left[\exp\left(1 - \frac{1}{2}\tau\right)1_{\{\tau<\infty\}}\right] < 1,$$
a contradiction.