GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES*

by

Balram S. Rajput¹ and Stamatis Cambanis²

Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 705

August, 1970

*This research was partially supported by the National Science Foundation under Grant GU-2059.
¹Department of Mathematics and Department of Statistics.
²Department of Statistics.
GAUSSIAN STOCHASTIC PROCESSES
AND GAUSSIAN MEASURES*

Balram S. Rajput
Department of Mathematics and Department of Statistics
University of North Carolina
Chapel Hill, North Carolina 27514

and

Stamatis Cambanis
Department of Statistics
University of North Carolina
Chapel Hill, North Carolina 27514

*This research was partially supported by the National Science Foundation under Grant GU-2059.
1. INTRODUCTION
Gaussian stochastic processes are used in connection with problems
such as estimation, detection, mutual information, etc. These problems
are often effectively formulated in terms of Gaussian measures on appropriate
Banach or Hilbert spaces of functions. Even though both concepts, the
Gaussian stochastic process and the Gaussian measure, have been extensively
studied, it seems that the connection between them has not been adequately
explained. Two important questions arising in this context are the follow-
ing:
(Q1) Given a Gaussian stochastic process with sample paths in a Banach
function space, is there a Gaussian measure on the Banach space which is in-
duced by the given stochastic process?
(Q2) Given a Gaussian measure on a Banach function space, is there a
Gaussian stochastic process with sample paths in the Banach space which in-
duces the given measure?
The purpose of this paper is to explore questions Q1 and Q2 in the com-
monly encountered spaces C[0,1] and L2[0,1]. Section 2 provides affirm-
ative answers to both questions Q1 and Q2 for the space C[0,1], with no re-
strictive assumptions. The space L2[0,1] is considered in Section 3. Under
some mild conditions, a stochastic process induces in a natural way a proba-
bility measure on L2[0,1]. If in addition the process is Gaussian, a neces-
sary and sufficient condition for the induced measure to be Gaussian is de-
rived in Theorem 3.1 and Corollary 3.1. The question Q2 is answered in the
affirmative in Theorem 3.3. Section 4 includes some generalizations and ex-
tensions of the results of Sections 2 and 3. The most important result is
Theorem 4.1 which provides an affirmative answer to question Q1 for L2(T), T
any Borel measurable subset of the real line, under conditions considerably
weaker than those of Theorem 3.1.
It should be mentioned that, throughout the literature, whenever
the need arises to have a Gaussian measure induced on an appropriate
function space by a Gaussian process, then either (i) it is assumed that
the Gaussian process is mean square continuous and that the index set is
a compact interval [5, 7, 9], and thus unnecessary assumptions are made on
the process, or (ii) the term "Gaussian process" is used to mean a generalized
Gaussian process, i.e., a Gaussian measure, or a measurable map which induces
a Gaussian measure [1, 2], and thus the problem of inducing the Gaussian
measure from a Gaussian process is not considered.
Throughout this paper real Banach spaces and real stochastic processes
are considered. The basic notation, definitions and properties that are
consistently used in subsequent sections are given in the following.
Let X be a real, separable Banach space of functions on [0,1] and
denote by B(X) the smallest σ-algebra of subsets of X which contains
all the open subsets of X. Let X(t,ω), t ∈ [0,1], be a real stochastic
process defined on the probability space (Ω,F,P) and such that X(·,ω) ∈ X
almost surely (a.s.). If the map T: (Ω,F) → (X,B(X)), defined by

    Tω = X(·,ω),                                                   (1.1)

is measurable, then the probability measure μ_X induced by X(t,ω),
t ∈ [0,1], on (X,B(X)) is defined by

    μ_X(B) = P{T^{-1}(B)} = P{ω ∈ Ω: X(·,ω) ∈ B}                   (1.2)

for all B ∈ B(X).
A stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) is said to be
Gaussian if for every finite n and t_1,...,t_n ∈ [0,1], the random
variables X(t_1,ω),...,X(t_n,ω) are jointly Gaussian.
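As a numerical aside (not part of the paper's argument), the jointly Gaussian finite-dimensional distributions above can be realized concretely. The sketch below assumes, purely for illustration, the Wiener kernel r(s,t) = min(s,t): the covariance matrix of (X(t_1),...,X(t_n)) is built and Cholesky-factored, which is the standard way to exhibit the joint Gaussian law.

```python
import math

def wiener_cov(ts):
    # Covariance matrix [r(t_i, t_j)] with r(s, t) = min(s, t),
    # the kernel of the Wiener process (an illustrative choice only).
    return [[min(s, t) for t in ts] for s in ts]

def cholesky(K):
    # Plain Cholesky factorization K = L L^T; it succeeds exactly when K is
    # positive definite, certifying a valid jointly Gaussian law.
    n = len(K)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(K[i][i] - s) if i == j else (K[i][j] - s) / L[j][j]
    return L

ts = [0.1, 0.25, 0.5, 0.9]
L = cholesky(wiener_cov(ts))
# Mapping i.i.d. standard normals z through z -> L z yields a Gaussian vector
# (X(t_1), ..., X(t_n)) with exactly the prescribed covariance.
```

The factorization, rather than sampling, is shown because it is deterministic: L L^T reproduces the covariance matrix exactly.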
Some definitions and properties related to Gaussian measures are
summarized here. Let X be any separable Banach space and B(X) the
σ-algebra generated by the open subsets of X. It is well known that
B(X) is also the smallest σ-algebra of subsets of X with respect to
which all continuous functionals on X are measurable. A probability
measure μ on (X,B(X)) is said to be Gaussian if every continuous linear
functional F on X is a Gaussian random variable. It is easily seen
that μ is Gaussian if and only if any finite number of continuous linear
functionals {F_1,...,F_n} on X are jointly Gaussian random variables.
Let now X = H, a separable Hilbert space with inner product and norm
denoted respectively by <·,·> and ||·||. If μ is a Gaussian measure
on (H,B(H)), its mean and its covariance operator are defined [8] as the
element u_0 ∈ H and the bounded, linear, nonnegative, self-adjoint and
trace class operator S on H which satisfy

    E[<u,v>] = <u_0,v> ,   for all v ∈ H                           (1.3)

    E[<u - u_0,v><u - u_0,w>] = <Sv,w> ,   for all v,w ∈ H         (1.4)

The support of a Gaussian measure μ on (H,B(H)) is the set

    Sup(μ) = u_0 + sp̄{φ_n},

where sp̄{φ_n} is the closure in H of the linear manifold sp{φ_n}
generated by the eigenfunctions {φ_n} of the covariance operator S which
correspond to its nonzero eigenvalues {λ_n} [4]. Also, if μ is a
Gaussian measure on (H,B(H)) then [8]

    E[||u||^2] = ∫_H ||u||^2 dμ(u) = Σ_n λ_n < +∞                  (1.5)

Note that if a (not necessarily Gaussian) probability measure μ on
(H,B(H)) satisfies (1.5) then its mean u_0 and its covariance operator S
are well defined by (1.3) and (1.4) respectively.
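Relation (1.5) can be checked numerically in a classical special case (this example is an illustration, not part of the text): for the Wiener measure on H = L2[0,1], whose covariance operator has kernel min(s,t), the eigenvalues are known to be λ_n = ((n - 1/2)π)^{-2}, and the trace Σ λ_n should equal ∫_0^1 r(t,t) dt = ∫_0^1 t dt = 1/2.

```python
import math

# Eigenvalues of the covariance operator with kernel min(s,t) on L2[0,1]
# (the Wiener measure); a classical fact used here purely as a check.
lam = [((n - 0.5) * math.pi) ** -2 for n in range(1, 200001)]

trace = sum(lam)
# By (1.5), E||u||^2 = sum_n lambda_n, which for this kernel equals
# the integral of r(t,t) = t over [0,1], i.e. 1/2.
```

The partial sum over 200000 terms already agrees with 1/2 to well within 1e-5, since the tail decays like 1/(π²N).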
2. GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES ON C[0,1]

In this section it is shown that both questions raised in the intro-
duction have an affirmative answer for the space X = C[0,1] of all real
valued continuous functions on [0,1] with the supremum norm.
Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a Gaussian stochastic process with
a.s. continuous sample paths. It is shown in Proposition 2.1 that the map
T defined by (1.1) is measurable, and in Theorem 2.1 that the measure μ_X
induced by X(t,ω), t ∈ [0,1], on (X = C[0,1], B(X)) is Gaussian.
Proposition 2.1. Let (Ω,F,P) be a probability space.
(i) If (Ω,F,P; X(t,ω), t ∈ [0,1]) is a stochastic process with
a.s. continuous sample paths, then the map T defined by (1.1) is measur-
able.
(ii) Conversely, if a map K: (Ω,F) → (X = C[0,1], B(X)) is
measurable, then X(t,ω) := (Kω)(t), t ∈ [0,1], is a stochastic process
defined on the probability space (Ω,F,P).
Proof. (i) It suffices to show that the inverse image under T of
a closed sphere in X with center v ∈ X and radius ε > 0 belongs to
F. Let B = {u ∈ X: ||u - v|| ≤ ε}. Then

    T^{-1}(B) = {ω ∈ Ω: sup_{t∈[0,1]} |X(t,ω) - v(t)| ≤ ε}
              = ∩_r {ω ∈ Ω: v(r) - ε ≤ X(r,ω) ≤ v(r) + ε},

where the intersection is taken over all rationals r in [0,1] (the two
suprema coincide by the a.s. continuity of the sample paths). Since
X(t,ω) is a stochastic process, {ω ∈ Ω: a ≤ X(t,ω) ≤ b} ∈ F for every
t ∈ [0,1] and a ≤ b. Hence T^{-1}(B) ∈ F.
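The countable reduction in part (i) — replacing the supremum over [0,1] by a supremum over the rationals — can be traced numerically. The sketch below uses a hypothetical continuous path (not from the text) and shows the supremum over nested finite grids converging to the true supremum norm, which is what continuity guarantees.

```python
import math

def sup_on_grid(f, n):
    # Supremum of |f| over the grid {k/n: k = 0,...,n}, a countable
    # approximation of the supremum over [0,1], valid when f is continuous.
    return max(abs(f(k / n)) for k in range(n + 1))

# Illustrative continuous path; its sup norm on [0,1] is |sin(3*pi/2) - 0.3|
# = 1.3, attained at t = 3*pi/10 where sin(5t) = -1.
f = lambda t: math.sin(5 * t) - 0.3
approx = [sup_on_grid(f, n) for n in (10, 1000, 100000)]
```

Since the grids are nested (10 divides 1000 divides 100000), the approximations increase toward the supremum norm.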
(ii) For every fixed t ∈ [0,1] we have {ω ∈ Ω: X(t,ω) ≤ a} =
K^{-1}{u ∈ X: u(t) ≤ a}. Since F_t(u) = u(t), u ∈ X, is a continuous linear
functional on X, {u ∈ X: u(t) ≤ a} ∈ B(X) (see Section 1), and by the
measurability of K, {ω ∈ Ω: X(t,ω) ≤ a} ∈ F.  ||

Theorem 2.1. Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a Gaussian stochastic
process with a.s. continuous sample paths. Then the probability measure
μ_X induced by X(t,ω) on (X = C[0,1], B(X)) is Gaussian.
Proof. In order to show that μ_X is Gaussian it suffices to show
that every continuous linear functional on X is a Gaussian random
variable on the probability space (X,B(X),μ_X). Let F be a continuous
linear functional on X. Then there exists a real valued function φ of
bounded variation on [0,1] such that for all u ∈ X,

    F(u) = ∫_0^1 u(t) dφ(t),

where the integral is Riemann-Stieltjes. Then by (1.2),

    μ_X{u ∈ X: F(u) ≤ a} = P{ω ∈ Ω: ∫_0^1 X(t,ω) dφ(t) ≤ a}.

Hence, in order to show that F(u) is a Gaussian random variable on the
probability space (X,B(X),μ_X), it suffices to show that

    ξ(ω) = ∫_0^1 X(t,ω) dφ(t)   a.s.

is a Gaussian random variable on (Ω,F,P). It is known that for continuous
functions the Riemann-Stieltjes integral can be approximated as follows:
Consider the partitions Δ_n, n = 1,2,..., of [0,1] defined by the points
{t_{k,n} = k/n, k = 0,1,...,n}, and define

    ξ_n(ω) = Σ_{k=1}^n X(t_{k,n},ω)[φ(t_{k,n}) - φ(t_{k-1,n})].

Then lim_{n→+∞} ξ_n(ω) = ξ(ω) a.s. and the sequence of random variables {ξ_n}
is a Gaussian family. Since ξ(ω) is finite a.s., it follows by Lemma 5.1
that it is Gaussian.  ||
We now turn to question Q2 raised in the introduction and give an
affirmative answer in the following theorem:
Theorem 2.2. Given any Gaussian measure μ on (X = C[0,1], B(X)),
there exists a Gaussian stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1]) with
a.s. continuous sample paths which induces μ on (X,B(X)).
Proof. Define the probability space (Ω,F,P) by Ω = X, F = B(X),
P = μ and denote the identity map between Ω and X by I: Iω = u, where
u = ω. Clearly I: (Ω,F) → (X,B(X)) is a measurable map. Define X(t,ω)
by

    X(t,ω) = (Iω)(t) = u(t),   t ∈ [0,1], ω ∈ Ω.

Then, by Proposition 2.1(ii), X(t,ω), t ∈ [0,1], is a stochastic process on
(Ω,F,P). It is clear by (1.1), (1.2) and the definition of X(t,ω), t ∈ [0,1],
that μ_X = μ, i.e., X(t,ω) induces μ on (X,B(X)). In order to show that
X(t,ω), t ∈ [0,1], is a Gaussian process it suffices to show that for every
n and t_1,...,t_n ∈ [0,1], the random variables X(t_k,ω), k = 1,...,n, de-
fined on (Ω,F,P) are jointly Gaussian; or equivalently that the random var-
iables F_{t_k}(u) = (Iω)(t_k) = u(t_k), k = 1,...,n, defined on (X,B(X),μ) are
jointly Gaussian. But this follows from the fact that each F_{t_k} is a contin-
uous linear functional on X and μ is a Gaussian measure on (X,B(X)).  ||
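As a numerical aside on the proof of Theorem 2.1: the Gaussian family {ξ_n} of Riemann-Stieltjes sums can be examined concretely. The sketch below is an illustration only — it fixes the Wiener kernel min(s,t) and the particular integrator φ(t) = t, neither of which is prescribed by the theorem — and computes Var(ξ_n) from the joint covariances, which should approach Var(ξ) = ∫_0^1 ∫_0^1 min(s,t) ds dt = 1/3.

```python
def var_xi(n):
    # Var(xi_n) for xi_n = sum_k X(t_k)[phi(t_k) - phi(t_{k-1})], with
    # t_k = k/n, phi(t) = t, and Cov(X(s), X(t)) = min(s, t) (Wiener kernel):
    # Var(xi_n) = sum_{j,k} min(t_j, t_k) * dphi_j * dphi_k.
    ts = [k / n for k in range(n + 1)]
    d = [ts[k] - ts[k - 1] for k in range(1, n + 1)]
    return sum(min(ts[j + 1], ts[k + 1]) * d[j] * d[k]
               for j in range(n) for k in range(n))

v = var_xi(400)
# Each xi_n is Gaussian; the variances converge to 1/3, the variance of the
# a.s. limit xi, which is then Gaussian by Lemma 5.1.
```

In closed form Var(ξ_n) = (n+1)(2n+1)/(6n²) here, so the error shrinks like 1/n, consistent with what the code reports.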
3. GAUSSIAN STOCHASTIC PROCESSES AND GAUSSIAN MEASURES ON L2[0,1]

In this section we consider questions Q1 and Q2 for the Hilbert space
L2[0,1] of real valued, square integrable functions with respect to the
Lebesgue measure.
First, question Q1 is considered. Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a
Gaussian stochastic process. The problem is under what conditions does
X(t,ω), t ∈ [0,1], induce a Gaussian measure on (X = L2[0,1], B(X)). Obvious
minimal conditions are that X(t,ω) be product (B[0,1] × F) measurable,
where B[0,1] is the σ-algebra of Borel measurable subsets of [0,1], and
that almost all sample paths of X(t,ω) belong to L2[0,1]. Denote by
(I1) and (I2) the following sets of conditions:
(I1)   (i) X(t,ω): ([0,1] × Ω, B[0,1] × F) → (R,B(R)) is measurable, where
           R is the real line and B(R) is the class of Borel subsets
           of R.
      (ii) X(t,ω) is Gaussian with mean m(t), autocorrelation r(t,s)
           and covariance R(t,s).
and
(I2)   (i) and (ii) of (I1), and
     (iii) X(·,ω) ∈ L2[0,1]   a.s.
An application of Fubini's theorem proves the following proposition:
Proposition 3.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
satisfies (i) and (iii) of (I2), then the map T defined by (1.1) is
measurable.
Hence, if (i) and (iii) of (I2) are satisfied, X(t,ω), t ∈ [0,1],
induces by (1.2) a probability measure μ_X on (X = L2[0,1], B(X)). Now
the question under consideration is: If, in addition, X(t,ω), t ∈ [0,1],
is Gaussian, under what conditions is μ_X Gaussian? A sufficient condition
is given in Theorem 3.1 and is shown to be also necessary in Corollary 3.1.
For the proof of Theorem 3.1 the following lemma is needed.
Lemma 3.1. Let (Ω,F,P; X(t,ω), t ∈ [0,1]) be a stochastic process
satisfying (I1) and let f(t) be a real valued, Borel measurable function
on [0,1]. If

    E(∫_0^1 |X(t,ω)f(t)| dt)^2 < +∞                                (3.1)

then the random variable ξ(ω) defined by

    ξ(ω) = ∫_0^1 X(t,ω)f(t) dt   a.s.                              (3.2)

is Gaussian with mean ∫_0^1 m(t)f(t) dt and covariance

    ∫_0^1 ∫_0^1 R(t,s)f(t)f(s) dt ds.
Remark 3.1. If in Lemma 3.1 the condition (3.1) is replaced by

    (II)  E(∫_0^1 X^2(t,ω) dt) = ∫_0^1 r(t,t) dt < +∞

or by

    (III) E(∫_0^1 |X(t,ω)| dt)^2 < +∞,

then the conclusions of the lemma are true for every f ∈ L2[0,1] and for
every real, Borel measurable, a.e. uniformly bounded function f on [0,1]
respectively. Note that, as follows by Schwarz's inequality, (II) implies
(III).
Proof. In view of (I1) and (3.1) the random variable ξ(ω) is well
defined by (3.2) as an a.s. sample path integral.
Let H(X) be the closure in L2(Ω,F,P) of the linear manifold
generated by the random variables X(t,ω), t ∈ [0,1]. Then H(X) ⊆ L2(Ω,F,P).
Clearly every family of random variables in H(X) is jointly Gaussian. The
integral

    η(ω) = ∫_0^1 X(t,ω)f(t) dt                                     (3.3)

exists in the quadratic mean sense in H(X) if and only if

    σ^2 = ∫_0^1 ∫_0^1 r(t,s)f(t)f(s) dt ds < +∞                    (3.4)

[6, p. 33]. If the integral exists, η(ω) is Gaussian with E(η^2) = σ^2.
It will be shown that the hypotheses of the lemma imply (3.4). Note that
r(t,s) = R(t,s) + m(t)m(s) implies

    σ^2 ≤ ∫_0^1 ∫_0^1 |R(t,s)f(t)f(s)| dt ds + (∫_0^1 |m(t)f(t)| dt)^2    (3.5)

It follows by Tonelli's theorem and by (3.1) that

    ∫_0^1 ∫_0^1 |R(t,s)f(t)f(s)| dt ds ≤ ∫_0^1 ∫_0^1 E|X(t,ω)X(s,ω)| |f(t)f(s)| dt ds
        = E(∫_0^1 |X(t,ω)f(t)| dt)^2 < +∞                          (3.6)

Also, by Tonelli's theorem and (3.1), we obtain

    ∫_0^1 |m(t)f(t)| dt ≤ E(∫_0^1 |X(t,ω)f(t)| dt) < +∞            (3.7)

Equations (3.5), (3.6) and (3.7) imply (3.4).
It is next shown that ξ(ω) = η(ω) a.s. Tonelli's theorem and (3.1)
imply that

    ∫_Ω ∫_0^1 ∫_0^1 |X(t,ω)X(s,ω)f(t)f(s)| ds dt dP(ω) < +∞        (3.8)

Hence, by Fubini's theorem and (3.4), it follows that

    E(ξ^2) = σ^2                                                   (3.9)

By applying a property of the quadratic mean integral (3.3) [6, p. 30]
and (3.8) we obtain

    E(ξη) = ∫_0^1 E[η(ω)X(t,ω)f(t)] dt
          = ∫_0^1 {∫_Ω (∫_0^1 X(s,ω)f(s) ds) X(t,ω)f(t) dP(ω)} dt = σ^2    (3.10)

By (3.9), (3.10) and the fact that E(η^2) = σ^2, it follows that
E(ξ - η)^2 = 0. Hence ξ = η in L2(Ω,F,P); i.e., ξ(ω) = η(ω) a.s., and since
η is Gaussian, so is ξ.
Since by (3.1), ∫_0^1 |X(t,ω)f(t)| dt ∈ L1(Ω,F,P), we have
X(t,ω)f(t) ∈ L1([0,1] × Ω, B[0,1] × F, dt × P) and therefore

    E(ξ) = ∫_0^1 m(t)f(t) dt.

Since E(ξ^2) = E(η^2) = σ^2, it follows by (3.1) that

    Var(ξ) = E(ξ^2) - E^2(ξ) = ∫_0^1 ∫_0^1 R(t,s)f(t)f(s) dt ds.  ||
Theorem 3.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
satisfies (I1) and (II), then the probability measure μ_X induced on
(X = L2[0,1], B(X)) by X(t,ω) is Gaussian with mean m ∈ X and covariance
operator generated by the kernel R(t,s).
Proof. Note that (II) implies (iii) of (I2) and by Proposition 3.1
μ_X is well defined by (1.2). In order to show that μ_X is Gaussian it
suffices to show that every continuous linear functional on X is a
Gaussian random variable on the probability space (X,B(X),μ_X). Let F
be a continuous linear functional on X. Then there exists an f ∈ X such
that

    F(u) = <u,f> = ∫_0^1 u(t)f(t) dt

for all u ∈ X. It follows by (1.2) that, with

    ξ_f(ω) = ∫_0^1 X(t,ω)f(t) dt,

    μ_X{u ∈ X: F(u) ≤ a} = P{ω ∈ Ω: ∫_0^1 X(t,ω)f(t) dt ≤ a}.

Hence the random variables F(u) on (X,B(X),μ_X) and ξ_f(ω)
on (Ω,F,P) are identically distributed. Since ξ_f(ω) is Gaussian by
Remark 3.1, so is F(u).
If u_0 ∈ X is the mean of μ_X, it follows from (1.3) and Remark 3.1
that

    ∫_0^1 u_0(t)f(t) dt = E[<u,f>] = E[ξ_f(ω)] = ∫_0^1 m(t)f(t) dt

for all f ∈ X. Hence u_0(t) = m(t) a.e. [Leb.] on [0,1]; i.e., m = u_0
in X.
If S is the covariance operator of μ_X and ξ_g(ω) = ∫_0^1 X(t,ω)g(t) dt,
g ∈ X, it follows by (1.4) and the fact that the random variables <u,f>,
<u,g> and ξ_f(ω), ξ_g(ω) are identically distributed that

    <Sf,g> = E[<u - u_0,f><u - u_0,g>]
           = E[ξ_f(ω)ξ_g(ω)] - <u_0,f><u_0,g>

for all f,g ∈ X. As in (3.8) we obtain

    ∫_Ω ∫_0^1 ∫_0^1 |X(t,ω)X(s,ω)f(t)g(s)| ds dt dP(ω) < +∞

and by Fubini's theorem we have

    <Sf,g> = ∫_0^1 ∫_0^1 r(t,s)f(t)g(s) dt ds - <m,f><m,g>
           = ∫_0^1 ∫_0^1 R(t,s)f(t)g(s) dt ds = <Rf,g>

for all f,g ∈ X, where R is the trace class (because of (II)) operator
on X with kernel R(t,s). Hence S = R.  ||
It should be noted that condition (II) is satisfied if the stochastic
process X(t,ω), t ∈ [0,1], has either one of the following properties:
mean square continuity, wide sense stationarity, uniformly bounded auto-
correlation function.
Theorem 3.1 shows that condition (II) is sufficient for the induced
measure μ_X to be Gaussian. It is shown in the following corollary that
it is also necessary.
Corollary 3.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
satisfies (I2), then the probability measure μ_X induced on (X = L2[0,1],
B(X)) by X(t,ω) is Gaussian if and only if (II) is satisfied.
Proof. The "if" part is shown in Theorem 3.1. The "only if" part
follows from (1.5) and the fact that the random variables ||u||^2 on
(X,B(X),μ_X) and ||X(·,ω)||^2 on (Ω,F,P) are identically distributed.  ||
One may wonder whether μ_X can be shown to be a Gaussian measure if
the stochastic process X(t,ω), t ∈ [0,1], satisfies conditions (I2) and
(III), which are in general weaker than (I1) and (II). The answer, in
the affirmative, is provided by the following theorem.
Theorem 3.2. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
satisfies (I2) and (III), then the probability measure μ_X induced on
(X = L2[0,1], B(X)) by X(t,ω) is Gaussian with mean m(t) and covariance
operator S with kernel R(t,s), and (II) is satisfied.
Proof. Let {φ_n(t)} be a complete orthonormal set in L2[0,1]
such that each φ_n(t) is continuous on [0,1]. Then X(·,ω) ∈ L2[0,1]
implies

    X(·,ω) = Σ_n ξ_n(ω)φ_n                                         (3.11)

in L2[0,1] a.s., where

    ξ_n(ω) = ∫_0^1 X(t,ω)φ_n(t) dt   a.s.

It follows by Remark 3.1 and the proof of Theorem 3.1 that the random
variables {ξ_n(ω)} are jointly Gaussian.
In order to show that μ_X is Gaussian it suffices to show that
every continuous linear functional on X is Gaussian; i.e., that the
random variable F(u) = <u,f> on (X,B(X),μ_X) is Gaussian for every
f ∈ X. It is seen in the proof of Theorem 3.1 that the random variable
F(u) is identically distributed with the random variable ξ(ω) on (Ω,F,P)
given by ξ(ω) = ∫_0^1 X(t,ω)f(t) dt a.s. It follows from (3.11) that

    ξ(ω) = Σ_n ξ_n(ω)<φ_n,f>   a.s.

By Lemma 5.1 it follows that ξ(ω) is Gaussian.
The validity of (II) follows from Corollary 3.1, and the claims
about the mean and covariance of μ_X from Theorem 3.1.  ||
Remark 3.2. It follows from Theorem 3.2 that if condition (I2) is
satisfied, then (II) and (III) are equivalent.
The affirmative answer to question Q2 is given in the following
theorem.
Theorem 3.3. Given any Gaussian measure μ on (X = L2[0,1], B(X))
there exists a Gaussian stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
which satisfies (I1) and (II) and induces μ on (X,B(X)).
In order to prove Theorem 3.3, the following lemma is needed.
Lemma 3.2. Let μ be a Gaussian measure on (X = L2[0,1], B(X)),
let u_0 and S be the mean and the covariance operator of μ respectively,
and let {φ_n}_{n=1}^∞ be the eigenfunctions of S corresponding to its nonzero
eigenvalues {λ_n}_{n=1}^∞.
(1) There exists a set A_1 ∈ B[0,1] with Leb(A_1) = 0 such that
    (i) the series Σ_{n=1}^∞ <φ_n, u - u_0>φ_n(t) converges in L2(X,B(X),μ)
        for all t ∈ A_1^c (A_1^c denotes the complement of A_1); and
   (ii) for all t ∈ A_1^c there exists a set N_t ∈ B(X) such that μ(N_t) = 0
        and the series Σ_{n=1}^∞ <φ_n, u - u_0>φ_n(t) converges for all
        u ∈ N_t^c.
Define Y(t,u) equal to the limit of the series on A_1^c × X and equal to
zero on A_1 × X.
(2) The series Σ_{n=1}^∞ <φ_n, u - u_0>φ_n(t) converges in
L2([0,1] × X, B[0,1] × B(X), Leb × μ). Denote the limit by Y'(t,u).
(3) There exists a set A ∈ B[0,1] such that Leb(A) = 0 and
Y(t,u) = Y'(t,u) a.e. [Leb × μ] on A^c × X.
Proof of Lemma 3.2. Proof of (1). Let f(t) = Σ_{n=1}^∞ λ_n φ_n^2(t). Since all
the functions in this series are nonnegative, real valued and measurable,
it follows that f(t) is an extended, real valued, measurable function.
Since, by (1.5),

    ∫_0^1 f(t) dt = Σ_{n=1}^∞ λ_n ∫_0^1 φ_n^2(t) dt = Σ_{n=1}^∞ λ_n < +∞,

we conclude that there exists a set A_1 ∈ B[0,1] such that Leb(A_1) = 0
and f(t) < +∞ on A_1^c. Let t ∈ A_1^c be fixed. Then the random variables
{ψ_n(t,u) = <φ_n, u - u_0>φ_n(t)}_{n=1}^∞ on (X,B(X),μ) are jointly Gaussian
with mean zero and

    E[ψ_n^2(t,u)] = <Sφ_n,φ_n>φ_n^2(t) = λ_n φ_n^2(t)

    E[ψ_n(t,u)ψ_k(t,u)] = <Sφ_n,φ_k>φ_n(t)φ_k(t) = δ_{nk} λ_n φ_n^2(t),

where δ_{nk} is Kronecker's δ. Hence {ψ_n(t,u)}_{n=1}^∞ is a sequence of
independent, zero mean, Gaussian random variables and Σ_{n=1}^∞ E[ψ_n^2(t,u)]
= Σ_{n=1}^∞ λ_n φ_n^2(t) < +∞. This implies that the series Σ_{n=1}^∞ ψ_n(t,u)
converges in L2(X,B(X),μ), which proves (i), and also, by [3, p. 197], that
for all t ∈ A_1^c the series Σ_{n=1}^∞ ψ_n(t,u) converges a.e. [μ], which
proves (ii). Clearly the two limits, in (i) and (ii), are equal a.e. [μ].
Proof of (2). Let B[0,1] × B(X) be the product σ-algebra of
[0,1] × X. Then ψ_n(t,u) is a measurable function. By Tonelli's theorem
we have

    ∫_0^1 ∫_X ψ_n^2(t,u) dμ(u) dt = <Sφ_n,φ_n> = λ_n.

Hence ψ_n ∈ L2([0,1] × X, B[0,1] × B(X), Leb × μ) = L2([0,1] × X) and
ψ_n ψ_k ∈ L1([0,1] × X) for all n and k. It follows by Fubini's theorem
that

    ∫_0^1 ∫_X ψ_n(t,u)ψ_k(t,u) dμ(u) dt = <Sφ_n,φ_k><φ_n,φ_k> = δ_{nk} λ_k.

Hence {ψ_n(t,u)}_{n=1}^∞ are orthogonal in L2([0,1] × X) and, by (1.5),

    Σ_{n=1}^∞ ∫_0^1 ∫_X ψ_n^2(t,u) dμ(u) dt = Σ_{n=1}^∞ λ_n < +∞.

It follows that the series Σ_{n=1}^∞ ψ_n(t,u) converges in L2([0,1] × X).
Proof of (3). Since by (2) Σ_{n=1}^∞ ψ_n(t,u) = Y'(t,u) in L2([0,1] × X),
there exists a subsequence such that lim_{k→+∞} Σ_{n=1}^{N_k} ψ_n(t,u) = Y'(t,u)
a.e. [Leb × μ] on [0,1] × X. Let B = {(t,u) ∈ [0,1] × X: Σ_{n=1}^{N_k} ψ_n(t,u)
does not converge to Y'(t,u)}. Then (Leb × μ)(B) = 0 and by Fubini's theorem

    0 = (Leb × μ)(B) = ∫_0^1 ∫_X 1_B(t,u) dμ(u) dt
      = ∫_0^1 (∫_{B_t} dμ(u)) dt = ∫_0^1 μ(B_t) dt,

where B_t = {u ∈ X: (t,u) ∈ B} ∈ B(X). Hence μ(B_t) = 0 a.e. [Leb.]
and thus there exists a set A_2 ∈ B[0,1] such that Leb(A_2) = 0 and
μ(B_t) = 0 for all t ∈ A_2^c. It follows that for every t ∈ A_2^c we have
μ(B_t) = 0 and

    lim_{k→+∞} Σ_{n=1}^{N_k} ψ_n(t,u) = Y'(t,u)   for all u ∈ B_t^c.

Let A = A_1 ∪ A_2. By part (1)(ii), for every t ∈ A^c there exists
N_t ∈ B(X) such that μ(N_t) = 0 and

    lim_{N→+∞} Σ_{n=1}^N ψ_n(t,u) = Y(t,u)   for all u ∈ N_t^c.

Since {Σ_{n=1}^{N_k} ψ_n(t,u)}_{k=1}^∞ is a subsequence of
{Σ_{n=1}^N ψ_n(t,u)}_{N=1}^∞, it follows that for every t ∈ A^c and all
u ∈ N_t^c ∩ B_t^c

    Y(t,u) = Y'(t,u) = Σ_{n=1}^∞ ψ_n(t,u).

It now follows from [0,1] × X = (A × X) ∪ {(A^c × X) ∩ B} ∪ {(A^c × X) ∩ B^c},
Leb(A) = 0, (Leb × μ)(B) = 0 and μ(N_t) = 0 for t ∈ A^c, that
Y(t,u) = Y'(t,u) a.e. [Leb × μ] on [0,1] × X.  ||
Proof of Theorem 3.3. Define the probability space (Ω,F,P) by
Ω = X, F = B(X), P = μ and denote the identity map between Ω and X
by I: Iω = u, where u = ω. Define X(t,ω) by

    X(t,ω) = { 0                       on A × Ω
             { u_0(t) + Y(t,I^{-1}(ω)) on A^c × Ω,                 (3.12)

where A and Y are the same as in Lemma 3.2. Since u_0(t) and Y(t,u)
are B[0,1] × B(X) measurable, it follows that X(t,ω) is B[0,1] × F
measurable and hence (Ω,F,P; X(t,ω), t ∈ [0,1]) is a product measurable
stochastic process. It also follows by parts (1) and (3) of Lemma 3.2
that for all t ∈ A^c,

    X(t,ω) = u_0(t) + Σ_{n=1}^∞ <φ_n, I^{-1}(ω) - u_0>φ_n(t)       (3.13)

a.e. [P] and also in L2(Ω,F,P). Since the random variables {<φ_n, u - u_0>}_{n=1}^∞
on (X,B(X),μ) are jointly Gaussian, so are the random variables
{<φ_n, I^{-1}(ω) - u_0>}_{n=1}^∞ on (Ω,F,P). Hence the convergence in L2(Ω,F,P)
of the series (3.13), along with (3.12), imply that X(t,ω), t ∈ [0,1], is
a Gaussian stochastic process.
Now u_0(t) ∈ L2[0,1] implies u_0(t) ∈ L2([0,1] × Ω). Also
Y(t,u) ∈ L2([0,1] × X) implies Y(t,I^{-1}(ω)) ∈ L2([0,1] × Ω). It follows
that X(t,ω) ∈ L2([0,1] × Ω) and by Fubini's theorem we obtain

    +∞ > ∫_Ω ∫_0^1 X^2(t,ω) dt dP(ω) = E(∫_0^1 X^2(t,ω) dt);

i.e., X(t,ω) satisfies (II). Thus X(t,ω), t ∈ [0,1], satisfies both
(I1) and (II) and by Theorem 3.1 it induces a Gaussian measure μ_X on
(X,B(X)). It will be shown that μ_X = μ. For this it suffices to show
μ_X(B) = μ(B) for any open sphere B with radius ε > 0 and center any
w ∈ X. Let B = {u ∈ X: ||u - w|| < ε}. Then we have by (1.2), (3.12)
and (3.13)

    μ_X(B) = P{ω ∈ Ω: ||X(·,ω) - w(·)|| < ε}
           = μ{u ∈ X: ||u_0 + Σ_{n=1}^∞ <φ_n, u - u_0>φ_n - w|| < ε}.

Since the support of μ is Sup(μ) = u_0 + sp̄{φ_n}, we have

    μ_X(B) = μ{v ∈ sp̄{φ_n}: ||u_0 + Σ_{n=1}^∞ <φ_n,v>φ_n - w|| < ε}
           = μ{v ∈ sp̄{φ_n}: ||u_0 + v - w|| < ε}
           = μ{u ∈ Sup(μ): ||u - w|| < ε}
           = μ{u ∈ X: ||u - w|| < ε} = μ(B),

which completes the proof.  ||
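The series construction in the proof of Theorem 3.3 is a Karhunen-Loève-type expansion, and it can be checked numerically in a classical special case. The sketch below assumes (a standard fact not stated in the theorem) the eigensystem of the Wiener kernel on L2[0,1]: λ_n = ((n - 1/2)π)^{-2}, φ_n(t) = √2 sin((n - 1/2)πt). The partial sums Σ λ_n φ_n(s)φ_n(t) should then reproduce the covariance kernel min(s,t).

```python
import math

def wiener_kernel_partial(s, t, N):
    # Partial sum of the Mercer expansion min(s,t) = sum_n lam_n phi_n(s) phi_n(t)
    # with lam_n = ((n - 1/2) pi)^-2 and phi_n(t) = sqrt(2) sin((n - 1/2) pi t)
    # (the eigensystem of the covariance operator of Wiener measure on L2[0,1]).
    total = 0.0
    for n in range(1, N + 1):
        a = (n - 0.5) * math.pi
        total += 2.0 * math.sin(a * s) * math.sin(a * t) / (a * a)
    return total

# The truncated series already matches the kernel closely:
approx = wiener_kernel_partial(0.3, 0.7, 50000)
```

The tail of the series is bounded by roughly 2/(π²N), so N = 50000 terms suffice for agreement to about 1e-4.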
4. SOME GENERALIZATIONS AND EXTENSIONS

In this section the following generalizations and extensions of the
results presented in Sections 2 and 3 are considered.

4.1 From [0,1] to any Borel measurable subset T of the real line.

All the results presented in Sections 2 and 3 for the case where the
parameter t takes values on T = [0,1] clearly hold for any compact
subset T of the real line. Moreover it is easily seen that all the results
of Section 3 also hold for every Borel measurable subset T of the real
line. The only use of the compactness of the interval [0,1] is made in
the proof of Theorem 3.2, in concluding that the continuous, orthonormal
and complete functions φ_n(t) in L2[0,1] are uniformly bounded. However,
what is essential in the proof is clearly the uniform boundedness on T
of each function in a complete orthonormal set in L2(T,B(T),Leb), and this
can always be satisfied by an appropriate choice of a complete orthonormal
set even if T is a set of infinite Lebesgue measure.
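For instance (an illustrative choice, not one made in the text), the system φ_n(t) = √2 sin((n - 1/2)πt) on [0,1] is complete and orthonormal in L2[0,1] and is uniformly bounded by √2 — the property the argument needs. Orthonormality can be confirmed by numerical integration:

```python
import math

def phi(n, t):
    # phi_n(t) = sqrt(2) sin((n - 1/2) pi t): an orthonormal system on
    # L2[0,1], uniformly bounded by sqrt(2).
    return math.sqrt(2.0) * math.sin((n - 0.5) * math.pi * t)

def inner(m, n, steps=20000):
    # Midpoint-rule approximation of the L2[0,1] inner product <phi_m, phi_n>.
    h = 1.0 / steps
    return sum(phi(m, (k + 0.5) * h) * phi(n, (k + 0.5) * h)
               for k in range(steps)) * h
```

The midpoint rule is accurate to O(h²) here, so 20000 steps resolve the Kronecker-delta structure of the inner products to far better than 1e-6.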
4.2 On the induction of a Gaussian measure on an appropriate L2 space by
a Gaussian stochastic process.

The two questions raised in the introduction have been answered in
the affirmative for the function spaces C[0,1] and L2[0,1] in Sections 2
and 3 respectively. It should be remarked that the only case where an
affirmative answer is obtained under restrictive assumptions, and by no
means in general, is the important case of inducing a Gaussian measure on
L2[0,1] from a Gaussian process X(t,ω), t ∈ [0,1], the restrictive
assumption being condition (II). We now turn our attention to this case
and ask whether assumption (II) is essential to inducing a Gaussian
measure or not. It turns out that the need for assumption (II) is solely
due to the specific way in which we attempted to provide an answer, and
that an affirmative answer can indeed be given in the general case with
no restrictive assumptions whatever. Specifically, it is presently shown
that every product measurable, Gaussian stochastic process X(t,ω), t ∈ T,
where T is any Borel measurable set on the real line, induces a Gaussian
measure on an appropriate Hilbert space of square integrable functions on T.
Let (Ω,F,P) be a probability space. In this section by (I1), (I2)
and (II) we mean conditions (I1), (I2) and (II) with the interval [0,1]
replaced by a Borel measurable set T of the real line. It follows by
Theorem 3.1 and Section 4.1 that a stochastic process X(t,ω), t ∈ T,
satisfying (I1) induces a Gaussian measure on L2(T,B(T),Leb.) if (II)
is satisfied, i.e., if ∫_T r(t,t) dt < +∞.
Consider a measure ν on (T,B(T)) such that

    (IV)  ∫_T r(t,t) dν(t) < +∞.

That such measures ν exist follows from the following particular choice:
Define ν_0 on (T,B(T)) by [dν_0/dLeb](t) = f(t)g(t), where g(t) ≥ 0,
∫_T g(t) dt < +∞, and

    f(t) = { 1          for 0 ≤ r(t,t) < 1
           { 1/r(t,t)   for 1 ≤ r(t,t).

It is clear that ν_0 satisfies (IV) and is a finite measure. The following
theorem can be proved in the same way as Theorem 3.1.
Theorem 4.1. If the stochastic process (Ω,F,P; X(t,ω), t ∈ T)
satisfies (I1), then for every measure ν on (T,B(T)) satisfying (IV),
X(t,ω), t ∈ T, induces a Gaussian measure μ_X on (X = L2(T,B(T),ν),
B(X)) with mean m ∈ X and covariance operator S generated by the
kernel R(t,s).
Hence, even though a product measurable Gaussian process does not
necessarily induce a Gaussian measure on (H_Leb = L2(T,B(T),Leb.), B(H_Leb)),
the necessary and sufficient condition for the latter being (II), it always
induces a Gaussian measure on every (H_ν = L2(T,B(T),ν), B(H_ν)) with ν
satisfying (IV). For instance, a wide sense stationary process X(t,ω),
t ∈ (-∞,+∞) = R, satisfying (I1) does not induce a measure on L2(R,B(R),Leb.),
but it induces a Gaussian measure on L2(R,B(R),ν) for every finite measure
ν. Also a harmonizable Gaussian process X(t,ω), t ∈ R, does not necessarily
induce a Gaussian measure on L2(R,B(R),Leb.), but it induces a Gaussian
measure on every L2(R,B(R),ν) for ν a finite measure. For a more concrete
example consider the Wiener process W(t,ω), t ∈ [0,+∞) = R^+; then
r(t,s) = min(t,s) and even though W(t,ω) does not induce a measure on
L2(R^+,B(R^+),Leb.) it induces a Gaussian measure for example on
L2(R^+,B(R^+),ν), where ν is defined by [dν/dLeb](t) = e^{-t}.
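This concrete example can be verified numerically: with r(t,t) = t and [dν/dLeb](t) = e^{-t}, condition (IV) asks for ∫_0^∞ t e^{-t} dt, which equals 1. A midpoint-rule sketch, truncating the integral at T = 50 where the tail is negligible:

```python
import math

def integral_r_dnu(T=50.0, steps=200000):
    # Midpoint-rule approximation of the (IV) integral for the Wiener process:
    # integral over [0, T] of r(t,t) * [dv/dLeb](t) dt = t * exp(-t) dt.
    # The tail beyond T = 50 is below 1e-20, so the truncation is harmless.
    h = T / steps
    return sum((k + 0.5) * h * math.exp(-(k + 0.5) * h) for k in range(steps)) * h
```

So ν satisfies (IV) even though ∫_0^∞ r(t,t) dt = ∫_0^∞ t dt diverges, i.e. (II) fails for Lebesgue measure.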
Remark 4.1. Denote by N the set of measures ν on (T,B(T)) which
satisfy (IV). A meaningful choice of ν in N assigns positive measure
to all Lebesgue measurable subsets, with positive Lebesgue measure, of the
set T_0 = {t ∈ T: r(t,t) ≠ 0}. That such measures ν exist is demonstrated
by the measure ν_0. The construction of ν_0 makes also clear that there
exist ν ∈ N which are absolutely continuous with respect to the Lebesgue
measure with, moreover, [dν/dLeb.](t) ≠ 0 a.e. [Leb.] on T_0; i.e., there
exist ν ∈ N which are equivalent to the Lebesgue measure on (T_0,B(T_0)).
20
ReFla.rk 4.2. t.l'heorem l~.] states tha.t for every \l ( N, a. product
measurable, Gaussinn stochastic process X(t,w), t ( T, induces a Ga.ussian-
Sup(lJ)=m+sp{, } =m+'Jf(S)," \l,n n "
of the range ~(S) of the opera.tor"
lithe closure in H\l
on H\l =L2(T,B(T),v). Let S" be the covariance operator
{.p } its eigenfunctions corresponding to its nonzero",n nThe support of ~ is
"
mea.sure lJv
of lJ and\l
eigenva.lues.
where R{S)v
S. Hence the Hilbert spa.ces (H) and the induced Gaussian measures" "(~ ), as well as the supports (m + R(S », depend on the choice of v inv v ~
N. Some interestill@; questions arise in this connection. First, whether the
equivalence or singularity of the Ga.ussian measures induced by two product
measurable, Gaussian stocha.stic processes depend on v in N. This problem
will not be considered here. Secondly, how does sup(lJ,,), or R(Sv)' depend
on v £ N. Note that X(',w) - m(') (f(s) a.s. for all ,,( N, andv
therefore it would be interesting to know whether there exists a minimal
R(S) fof' some \l (N. Even though an affirmative answer to the latter\l
question does not seem in general plaus i ble, the following remarks can be
made.
(i) Let ν_i ∈ N, i = 1,2, be such that ν_i << Leb. and let
[dν_i/dLeb](t) = f_i(t). If f_2(t) ≤ c f_1(t), then H_{ν_1} ⊂ H_{ν_2}.
However, an inclusion relationship between R̄(S_{ν_1}) and R̄(S_{ν_2})
does not seem to hold in general, except if S_{ν_1} and S_{ν_2}
are strictly positive definite operators, in which case
R̄(S_{ν_1}) ⊂ R̄(S_{ν_2}).
(ii) If Leb. ∈ N, then for every ν ∈ N such that ν << Leb. and
f(t) ≤ c < +∞ on T_0 we have H_Leb ⊂ H_ν and R̄(S_ν) ⊂ R̄(S_Leb).
However, there may exist ν ∈ N such that R̄(S_ν) is strictly contained
in R̄(S_Leb). For instance consider the Wiener process W(t,ω), t ∈ [0,1];
then r(t,t) = t, Leb. ∈ N, and if ν is defined by [dν/dLeb](t) = 1/t^p,
1 ≤ p < 2, then ν ∈ N and by (i), R̄(S_ν) = H_ν ⊂ H_Leb = R̄(S_Leb.).
(iii) If Leb. ∉ N then (i) is still applicable. However, a measure
ν ∈ N corresponding to a minimal R̄(S_ν) does not seem in general to exist.
For instance consider the Wiener process W(t,ω), t ∈ [0,+∞), and the
measures ν_k, k = 1,2,..., defined on (R^+,B(R^+)) by [dν_k/dLeb](t)
equal to 1 on [0,k) and to e^{-(t-k)^2/(2k)} on [k,+∞). Then Leb. ∉ N,
ν_k ∈ N, R̄(S_{ν_k}) = H_{ν_k}, and by (i), R̄(S_{ν_{k+1}}) ⊂ R̄(S_{ν_k})
for all k. Note that f_k(t) → 1 for all t ∈ [0,+∞) and hence there is no
ν ∈ N such that H_ν ⊂ H_{ν_k} for all k.
4.3 Gaussian stochastic processes and Gaussian measures on Lp[0,1].

It is shown in Theorem 3.2 that if the stochastic process X(t,ω),
t ∈ [0,1], satisfies (I2) and (III) then it induces a Gaussian measure on
(H_2 = L2[0,1], B(H_2)). Note that (III) implies only integrability and not
square integrability of almost all sample paths of X(t,ω). Hence the
question arises as to whether a Gaussian measure is induced on
(H_1 = L1[0,1], B(H_1)) by X(t,ω) if (I1) and (III) are satisfied. The
answer to this question is shown to be affirmative in Theorem 4.2.
This naturally raises the question of inducing a Gaussian measure
on Lp[0,1], 1 ≤ p < +∞. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
is product (B[0,1] × F) measurable and if ∫_0^1 |X(t,ω)|^p dt < +∞ a.s. for
1 ≤ p < +∞, then the map T defined by (1.1) with X = Lp[0,1] is measurable,
and X(t,ω), t ∈ [0,1], induces on (X,B(X)) a probability measure μ_X
defined by (1.2). The following theorem can be proved as Theorem 3.1.
Theorem 4.2. If the stochastic process (Ω,F,P; X(t,ω), t ∈ [0,1])
satisfies (I1) and (III) for p = 1, or

    (V)  E(∫_0^1 |X(t,ω)|^p dt)^{2/p} < +∞

for 1 < p < +∞, then the probability measure μ_X induced on (H_p = Lp[0,1],
B(H_p)) by X(t,ω) is Gaussian.
Theorem 4.2 answers question Q1 for the spaces Lp[0,1], 1 ≤ p < +∞.
Theorem 3.1 is thus obtained from Theorem 4.2 for p = 2. Theorem 4.2
continues to hold if the set [0,1] is replaced by a Borel measurable
set T on the real line.
The study of question Q2 in the general Lp[0,1] space, 1 ≤ p < +∞,
appears to be considerably more complicated than in the case p = 2. The
results reported recently in [7] and [10] seem to provide the appropriate
structure to approach this question.
5. APPENDIX

We prove the following lemma which is used in the proofs of Theorems
2.1 and 3.2.
Lemma 5.1. If the real random variables {ξ_n, n = 1,2,...} are
jointly Gaussian and lim_{n→+∞} ξ_n = ξ a.s., where ξ is an a.s. finite random
variable, then ξ is Gaussian.
Proof. Since the sequence of random variables ξ_n converges a.s.
to the a.s. finite random variable ξ, it follows that the sequence of
the distribution functions of the ξ_n's converges completely to the
distribution function of ξ. Hence if we denote by f_n(t) the characteristic
function of ξ_n, n = 1,2,..., i.e.,

    f_n(t) = e^{iu_n t - (1/2)σ_n^2 t^2},

and by f(t) the characteristic function of ξ, we have
    lim_{n→+∞} f_n(t) = f(t)   for all t ∈ (-∞,+∞).                (5.1)

Eq. (5.1) implies that

    arg f_n(1) = u_n → u_∞ := arg f(1)                             (5.2)

and that

    |f_n(√2)| = e^{-σ_n^2} → |f(√2)| < +∞.

Note that the limit |f(√2)| is strictly positive; since if |f(√2)| = 0
then σ_n^2 → +∞ and

    |f(t)| = lim_{n→+∞} |f_n(t)| = { 1 for t = 0
                                   { 0 for t ≠ 0 ,

which contradicts the fact that f(t) is continuous in t. Hence

    σ_n^2 → σ_∞^2 < +∞.                                            (5.3)

It follows from (5.1), (5.2) and (5.3) that

    f(t) = e^{iu_∞ t - (1/2)σ_∞^2 t^2}   for all t ∈ (-∞,+∞)

and thus ξ is Gaussian.  ||
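The convergence mechanism in this proof can be traced numerically: once u_n → u_∞ and σ_n² → σ_∞², the characteristic functions f_n(t) = exp(iu_n t - σ_n²t²/2) converge pointwise to the Gaussian characteristic function with the limiting parameters. A sketch with hypothetical sequences u_n = 1 + 1/n and σ_n² = 1/3 + 1/n (chosen only for illustration):

```python
import cmath

def cf_gauss(t, u, s2):
    # Characteristic function exp(i u t - s2 t^2 / 2) of a Gaussian law
    # with mean u and variance s2.
    return cmath.exp(1j * u * t - 0.5 * s2 * t * t)

t = 2.0
errors = [abs(cf_gauss(t, 1 + 1 / n, 1 / 3 + 1 / n) - cf_gauss(t, 1.0, 1 / 3))
          for n in (10, 100, 1000)]
# The errors shrink as n grows: pointwise convergence of the characteristic
# functions pins down the limit law as Gaussian with parameters (u_inf, s_inf^2).
```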
REFERENCES

[1] C. R. Baker, Mutual information for Gaussian processes, SIAM J. Appl. Math., 19 (1970), 451-458.

[2] I. M. Gel'fand and A. M. Yaglom, Calculation of the amount of information about a random function, contained in another such function, Amer. Math. Soc. Transl. (2), 12 (1959), 199-246.

[3] P. R. Halmos, "Measure Theory," Van Nostrand, Princeton, N. J., 1950.

[4] K. Ito, The topological support of Gauss measure on Hilbert space, Nagoya Math. J., 38 (1970), 181-183.

[5] G. Kallianpur and H. Oodaira, The equivalence and singularity of Gaussian measures, in M. Rosenblatt (ed.), "Proc. of Symp. on Time Series Analysis," 279-291, Wiley, New York, 1962.

[6] K. Karhunen, Über lineare Methoden in der Wahrscheinlichkeitsrechnung, Ann. Acad. Scient. Fennicae, Ser. A I, No. 37 (1947), 1-79.

[7] J. Kuelbs, Gaussian measures on Banach space, J. Funct. Anal., 5 (1970), 354-367.

[8] E. Mourier, Éléments aléatoires dans un espace de Banach, Ann. Inst. H. Poincaré, 13 (1953), 161-244.

[9] W. L. Root, Singular Gaussian measures in detection theory, in M. Rosenblatt (ed.), "Proc. of Symp. on Time Series Analysis," 292-315, Wiley, New York, 1962.

[10] H. Sato, Gaussian measure on a Banach space and abstract Wiener measure, Nagoya Math. J., 36 (1969), 65-81.