Almost Sure Convergence of the Series of Gaussian Markov Sequences

This article was downloaded by: [University of Connecticut] on 10 October 2014, at 12:56.
Publisher: Taylor & Francis. Informa Ltd, registered in England and Wales, Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Communications in Statistics - Theory and Methods. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/lsta20

Almost Sure Convergence of the Series of Gaussian Markov Sequences
Valerii V. Buldygin and Marina K. Runovska, Department of Mathematical Analysis and Probability Theory, National Technical University of Ukraine, Kyiv, Ukraine. Published online: 18 Aug 2011.

To cite this article: Valerii V. Buldygin & Marina K. Runovska (2011) Almost Sure Convergence of the Series of Gaussian Markov Sequences, Communications in Statistics - Theory and Methods, 40:19-20, 3407-3424, DOI: 10.1080/03610926.2011.581163
To link to this article: http://dx.doi.org/10.1080/03610926.2011.581163

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to, or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


Communications in Statistics—Theory and Methods, 40: 3407-3424, 2011. Copyright © Taylor & Francis Group, LLC. ISSN: 0361-0926 print/1532-415X online. DOI: 10.1080/03610926.2011.581163

Almost Sure Convergence of the Series of Gaussian Markov Sequences

VALERII V. BULDYGIN AND MARINA K. RUNOVSKA

Department of Mathematical Analysis and Probability Theory, National Technical University of Ukraine, Kyiv, Ukraine

The necessary and sufficient conditions for the almost sure convergence of series of multi-dimensional zero-mean Gaussian Markov sequences are found. In addition, necessary conditions for the almost sure convergence of series of regressive sequences with operator coefficients in Banach spaces are studied. Using the obtained general results, we find a criterion for the almost sure convergence of series of Gaussian m-Markov sequences of random variables.

Keywords Almost sure convergence of random series; Gaussian Markov sequence; Gaussian m-Markov sequence; Multi-dimensional Gaussian Markov sequence; Operator-normed sums of independent random vectors.

Mathematics Subject Classification 60G50; 65B10; 60G15; 40A05.

1. Introduction

The asymptotic properties of regressive sequences, in particular of Gaussian Markov sequences, were studied in Arato (1982), Buldygin (1978, 1980), Buldygin and Solntsev (1987, 1989, 1997), Dorogovtsev (1992), Korostelev (1984), and Koval (1991). For Gaussian Markov sequences of random variables and vectors, the necessary and sufficient conditions for convergence almost surely (a.s.) to zero, convergence a.s., and boundedness a.s. were found in Buldygin (1978) and Buldygin and Solntsev (1987, 1989, 1997).

In this article, we extend these investigations of the asymptotic behavior of Gaussian Markov sequences.

Let $\{X_k\} = \{X_k,\ k \ge 1\}$ be a zero-mean Gaussian Markov sequence in the space $\mathbb{R}^d$, i.e., $\{X_k\}$ obeys the system of stochastic recurrence equations:

$$X_1 = D_1\xi_1, \qquad X_k = C_k X_{k-1} + D_k\xi_k, \quad k \ge 2,$$

Address correspondence to Valerii V. Buldygin, Department of Mathematical Analysis and Probability Theory, National Technical University of Ukraine (KPI), Peremogy Ave., 37, Kyiv 03056, Ukraine; E-mail: [email protected]



where $\{C_k\}$ and $\{D_k\}$ are sequences of nonrandom real $d \times d$ matrices, and $\{\xi_k\}$ is a sequence of independent standard Gaussian random vectors in $\mathbb{R}^d$. The main aim of this article is to find necessary and sufficient conditions for the convergence a.s. of the series $\sum_{k=1}^{\infty} X_k$ for the Gaussian Markov sequence $\{X_k\}$.
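The recurrence above is straightforward to simulate. The sketch below is an illustration, not part of the paper; the names `simulate_partial_sums`, `C`, and `D` are hypothetical. It draws the sequence $X_k$ and its partial sums $S_n$ for a contractive choice of $C_k$ and square-summable $\|D_k\|$, a regime in which the partial sums are expected to settle down:

```python
import numpy as np

def simulate_partial_sums(C, D, n, seed=None):
    """Simulate X_1 = D_1 xi_1, X_k = C_k X_{k-1} + D_k xi_k (k >= 2)
    and return the partial sums S_1, ..., S_n as an (n, d) array."""
    rng = np.random.default_rng(seed)
    d = D(1).shape[0]
    X = D(1) @ rng.standard_normal(d)      # X_1
    S = X.copy()
    sums = [S.copy()]
    for k in range(2, n + 1):
        X = C(k) @ X + D(k) @ rng.standard_normal(d)
        S += X
        sums.append(S.copy())
    return np.array(sums)

# Contractive C_k and square-summable Hilbert-Schmidt norms ||D_k||:
C = lambda k: 0.5 * np.eye(2)
D = lambda k: np.eye(2) / k
sums = simulate_partial_sums(C, D, 2000, seed=0)
```

For this choice the late partial sums fluctuate very little, consistent with a.s. convergence of $\sum_k X_k$.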

The article is organized as follows. In Sec. 2, we discuss the necessary conditions for the convergence a.s. of the series $\sum_{k=1}^{\infty} X_k$ for first-order regressive sequences $\{X_k\}$ in separable Banach spaces. Note that for random series of regressive sequences with scalar coefficients in Banach spaces, the necessary and some sufficient conditions for the convergence a.s. are considered in Buldygin and Runovska (2009, 2010).

In Sec. 3, we study the necessary and sufficient conditions for the convergence a.s. of the series $\sum_{k=1}^{\infty} X_k$ for a zero-mean Gaussian Markov sequence $\{X_k\}$ in the space $\mathbb{R}^d$. Note that the case $d = 1$ is considered in Runovska (2010).

Finally, in Sec. 4, as an application of the general results above, we present the necessary and sufficient conditions for the convergence a.s. of the series of Gaussian $m$-Markov sequences.

2. Necessary Conditions for the Convergence a.s. of the Series of Autoregressive Sequences in Banach Spaces

Let $(\mathbb{X}, \|\cdot\|)$ be a real separable Banach space. Suppose that $\{X_k\}$ is a regressive random sequence in $\mathbb{X}$ which obeys the system of recurrence equations:

$$X_1 = V_1, \qquad X_k = C_k X_{k-1} + V_k, \quad k \ge 2, \tag{1}$$

where $\{C_k\}$ is a sequence of nonrandom continuous linear operators in $\mathbb{X}$, and $\{V_k\}$ is a sequence of independent symmetric random vectors in $\mathbb{X}$. Recall that an $\mathbb{X}$-valued random vector $V$ is called symmetric if $V$ and $-V$ are identically distributed.

For the sequence $\{X_k\}$, consider the random series

$$\sum_{k=1}^{\infty} X_k. \tag{2}$$

For $n \ge 1$, denote

$$V(n,k) = \begin{cases} V_k + \sum_{l=1}^{n-k} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) V_k, & 1 \le k \le n-1, \\ V_k, & k = n, \\ 0, & k > n, \end{cases}$$

where $0$ is a zero-vector in $\mathbb{X}$. For each $k \ge 1$, consider the random series

$$\sum_{l=1}^{\infty} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) V_k, \tag{3}$$

and set

$$V(\infty,k) = V_k + \sum_{l=1}^{\infty} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) V_k,$$


if series (3) converges a.s. in $\mathbb{X}$. Remark that $V(\infty,k)$, $k \ge 1$, are jointly independent symmetric random vectors. The next result follows from Theorem 2.8.1 of Buldygin and Solntsev (1997).

Theorem 2.1. If random series (2) converges a.s. in $\mathbb{X}$, then, for any $k \ge 1$, series (3) converges a.s. in $\mathbb{X}$, and the random series

$$\sum_{k=1}^{\infty} V(\infty,k)$$

converges a.s. in $\mathbb{X}$. Moreover, the equality

$$\sum_{k=1}^{\infty} X_k = \sum_{k=1}^{\infty} V(\infty,k) \quad \text{a.s.} \tag{4}$$

holds true.

3. Necessary and Sufficient Conditions for the Convergence a.s. of the Series of Gaussian Markov Sequences in $\mathbb{R}^d$

In what follows, the space $\mathbb{R}^d$, $d \ge 1$, is interpreted as the Euclidean space of all real column vectors $x = (x_1, \ldots, x_d)^t$ with the coordinate-wise scalar product $(x,y) = \sum_{k=1}^{d} x_k y_k$ and the norm $\|x\| = \sqrt{(x,x)}$. As the norm of a real $d \times d$ matrix $A = (a_{ij})_{i,j=1}^{d}$, we shall consider the Hilbert-Schmidt norm:

$$\|A\| = \Bigl(\sum_{i,j=1}^{d} a_{ij}^2\Bigr)^{1/2}.$$

Suppose that $\{X_k\}$ is a zero-mean Gaussian Markov sequence in $\mathbb{R}^d$, that is to say, $\{X_k\}$ obeys the system of recurrence equations:

$$X_1 = D_1\xi_1, \qquad X_k = C_k X_{k-1} + D_k\xi_k, \quad k \ge 2, \tag{5}$$

where $\{C_k\}$ and $\{D_k\}$ are sequences of nonrandom real $d \times d$ matrices, and $\{\xi_k\}$ is a sequence of jointly independent standard Gaussian random vectors in $\mathbb{R}^d$.

For the sequence $\{X_k\}$, consider the random series

$$\sum_{k=1}^{\infty} X_k. \tag{6}$$

In this section, we will find a criterion for the convergence a.s. of series (6). For $n, k \ge 1$, denote

$$Q(n,k) = \begin{cases} D_k + \sum_{l=1}^{n-k} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) D_k, & 1 \le k \le n-1, \\ D_k, & k = n, \\ \mathbb{O}, & k > n, \end{cases}$$


where $\mathbb{O}$ is a zero-matrix. For each $k \ge 1$, consider the matrix series

$$\sum_{l=1}^{\infty} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) D_k, \tag{7}$$

and put

$$Q(\infty,k) = D_k + \sum_{l=1}^{\infty} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) D_k, \tag{8}$$

if series (7) is convergent in the matrix norm.

Note that for Gaussian Markov sequences $\{X_k\}$ in $\mathbb{R}^d$, Theorem 2.1 above is specialized as follows.
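For numerical experiments it can be convenient to evaluate $Q(n,k)$ directly from its definition by accumulating the matrix products $C_{k+l}\cdots C_{k+1}$. A minimal sketch follows; the helper `Q` and the callables `C`, `D` are hypothetical names, not from the paper:

```python
import numpy as np

def Q(n, k, C, D):
    """Q(n, k) from the definition above: D_k plus the partial sums of the
    products C_{k+l} ... C_{k+1} applied to D_k.  C and D map an index
    j >= 1 to a d x d array."""
    d = D(k).shape[0]
    if k > n:
        return np.zeros((d, d))           # zero matrix for k > n
    out = D(k).copy()                     # covers k == n (loop is empty)
    P = np.eye(d)
    for l in range(1, n - k + 1):
        P = C(k + l) @ P                  # P = C_{k+l} ... C_{k+1}
        out += P @ D(k)
    return out

# With constant C = cI and |c| < 1, Q(n, k) tends to D_k / (1 - c):
C = lambda j: 0.5 * np.eye(2)
D = lambda j: np.eye(2)
```

Here `Q(50, 1, C, D)` is already very close to the limit $Q(\infty,1) = 2 I$ of formula (8) for this choice of coefficients.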

Corollary 3.1. If random series (6) converges a.s., then, for any $k \ge 1$, matrix series (7) converges in the matrix norm, and

$$\sum_{k=1}^{\infty} \|Q(\infty,k)\|^2 < \infty. \tag{9}$$

Moreover, the equality

$$\sum_{k=1}^{\infty} X_k = \sum_{k=1}^{\infty} Q(\infty,k)\xi_k \quad \text{a.s.}$$

holds true.

Further, we discuss the criterion for the convergence a.s. of random series (6) in $\mathbb{R}^d$.

Let $\{H_n\}$ be a sequence of independent symmetric random vectors in $\mathbb{R}^d$, let $\{T_n\}$ be a sequence of linear continuous operators in $\mathbb{R}^d$, and let $\Sigma_n = \sum_{k=1}^{n} H_k$, $n \ge 1$. Denote by $\mathcal{M}$ the class of all monotone sequences of positive integers that increase to infinity.

We will say that a sequence $\{x_n\}$ of vectors of the space $\mathbb{R}^d$ belongs to the set $c(\mathbb{R}^d)$ of convergent sequences if the limit $\lim_{n\to\infty} x_n$ exists.

For the operator-normed sums of independent symmetric random vectors in $\mathbb{R}^d$, the following result is well known (Buldygin and Solntsev, 1997).

Proposition 3.1. In order for

$$\{T_n \Sigma_n\} \in c(\mathbb{R}^d) \quad \text{a.s.},$$

it is necessary and sufficient that the following three conditions be satisfied:

(1) for any $k \ge 1$, $\{T_n H_k\} \in c(\mathbb{R}^d)$ a.s.;
(2) the series $\sum_{k=1}^{\infty} \lim_{n\to\infty} T_n H_k$ converges in $\mathbb{R}^d$ a.s.;


(3) for all sequences $\{m_j\}$ from the class $\mathcal{M}$,

$$\lim_{j\to\infty} \|T_{m_{j+1}}(\Sigma_{m_{j+1}} - \Sigma_{m_j})\| = 0 \quad \text{a.s.}$$

In order to check condition (3) of Proposition 3.1, the following result is used.

Proposition 3.2. Let $\{G_k\}$ be a sequence of zero-mean Gaussian random vectors in $\mathbb{R}^d$, $d \ge 1$. If, for any $\varepsilon > 0$,

$$\sum_{k=1}^{\infty} \exp\Bigl\{-\frac{\varepsilon}{\mathsf{E}\|G_k\|^2}\Bigr\} < \infty, \tag{10}$$

then

$$\lim_{k\to\infty} \|G_k\| = 0 \quad \text{a.s.} \tag{11}$$

If, in addition to the above assumptions, $\{G_k\}$ is a sequence of independent Gaussian random vectors, then conditions (11) and (10) are equivalent.

For series (6), the following criterion for convergence a.s. holds true.

Theorem 3.1. In order for series (6) to converge a.s., it is necessary and sufficient that the following three conditions be satisfied:

(1) for any $k \ge 1$, matrix series (7) is convergent;
(2) condition (9) holds;
(3) for any $\varepsilon > 0$ and all sequences $\{m_j\}$ from the class $\mathcal{M}$, one has

$$\sum_{j=1}^{\infty} \exp\Bigl\{-\frac{\varepsilon}{\sum_{k=m_j+1}^{m_{j+1}} \|Q(m_{j+1},k)\|^2}\Bigr\} < \infty. \tag{12}$$

Proof. Consider the sequence $\{S_n\}$ of partial sums of series (6):

$$S_n = \sum_{k=1}^{n} X_k, \quad n \ge 1.$$

The approach used to prove Theorem 3.1 is based on switching from the sequence $\{S_n\}$ to a sequence of sums of independent zero-mean Gaussian vectors in $\mathbb{R}^{2d}$ (see Buldygin and Solntsev, 1997).

Since $X_n = S_n - S_{n-1}$, $n \ge 1$, recurrence Eq. (5) implies that the sequence $\{S_n\}$ obeys the system of recurrence equations of order 2:

$$S_{-1} = S_0 = 0, \qquad S_n = (I + C_n)S_{n-1} - C_n S_{n-2} + D_n\xi_n, \quad n \ge 1, \tag{13}$$

where $I$ is the identity matrix of order $d$.


From Eq. (13), we proceed to a recurrence equation of order 1 in the space $\mathbb{R}^{2d}$. Namely,

$$\tilde S_1 = \zeta_1, \qquad \tilde S_n = B_n \tilde S_{n-1} + \zeta_n, \quad n \ge 2, \tag{14}$$

where

$$\tilde S_n = \begin{pmatrix} S_n \\ S_{n-1} \end{pmatrix}, \qquad B_n = \begin{pmatrix} I + C_n & -C_n \\ I & \mathbb{O} \end{pmatrix}, \qquad \zeta_n = \begin{pmatrix} D_n\xi_n \\ 0 \end{pmatrix}, \quad n \ge 1.$$

By representation (14), random series (6) converges a.s. if and only if the limit $\lim_{n\to\infty} \tilde S_n$ exists a.s. in $\mathbb{R}^{2d}$.

Formula (14) implies that

$$\tilde S_n = \Bigl(\prod_{j=n}^{2} B_j\Bigr)\zeta_1 + \Bigl(\prod_{j=n}^{3} B_j\Bigr)\zeta_2 + \cdots + B_n\zeta_{n-1} + \zeta_n, \quad n \ge 1, \tag{15}$$

where $\prod_{j=n}^{k} B_j = B_n B_{n-1} \cdots B_k$, $k \le n$. By induction one can obtain that

$$\prod_{j=n}^{k} B_j = \begin{pmatrix} I + \sum_{l=1}^{n-k+1} \prod_{j=k+l-1}^{k} C_j & -\sum_{l=1}^{n-k+1} \prod_{j=k+l-1}^{k} C_j \\[4pt] I + \sum_{l=1}^{n-k} \prod_{j=k+l-1}^{k} C_j & -\sum_{l=1}^{n-k} \prod_{j=k+l-1}^{k} C_j \end{pmatrix}, \quad 2 \le k \le n.$$

Necessity. Suppose that series (6) converges a.s. Notice that conditions (1) and (2) of Theorem 3.1 hold true according to Corollary 3.1.

In order to prove that condition (3) of Theorem 3.1 is necessary, the contraction principle in the space of convergent sequences is used (see Buldygin and Solntsev, 1997).

Let us fix an arbitrary sequence $\{m_j\}$ from the class $\mathcal{M}$, and consider the array of random vectors $\{Y_{n,k},\ n,k \ge 1\}$, where

$$Y_{n,k} = \begin{cases} \Bigl(\prod_{j=n}^{k+1} B_j\Bigr)\zeta_k, & 1 \le k \le n-1, \\ \zeta_k, & k = n, \\ 0, & k > n, \end{cases}$$

and $0$ is a zero-vector in $\mathbb{R}^{2d}$. Emphasize that this array satisfies the following conditions:

(a) for any $n \ge 1$, the series $\sum_{k=1}^{\infty} Y_{n,k}$ converges a.s. in $\mathbb{R}^{2d}$;
(b) the sequences $W_k = (Y_{n,k},\ n \ge 1)$, $k \ge 1$, are independent and symmetric as random elements of the sequence space.

Moreover,

$$\tilde S_n = \sum_{k=1}^{\infty} Y_{n,k}, \quad n \ge 1.$$


Together with the array $\{Y_{n,k},\ n,k \ge 1\}$, consider the contraction array $\{\lambda_{n,k},\ n,k \ge 1\}$, where

$$\lambda_{n,k} = \begin{cases} 1, & n = m_{j+1},\ m_j < k \le m_{j+1},\ j \ge 1, \\ 0, & \text{otherwise}. \end{cases}$$

Since random series (6) converges a.s., the limit $\lim_{n\to\infty} \tilde S_n$ exists a.s. Therefore, the sequence of vectors $\{\sum_{k=1}^{\infty} Y_{n,k}\}$ converges a.s. in $\mathbb{R}^{2d}$ and, moreover,

$$\lambda_{n,k} Y_{n,k} \xrightarrow[n\to\infty]{} 0, \quad k \ge 1, \quad \text{a.s.}$$

According to the contraction principle (see Corollary 2.6.4 of Buldygin and Solntsev, 1997), this implies that

$$\Bigl\|\sum_{k=1}^{\infty} \lambda_{n,k} Y_{n,k}\Bigr\| \xrightarrow[n\to\infty]{} 0 \quad \text{a.s.}$$

In our terms, the last formula takes the form

$$\Bigl\|\sum_{k=m_j+1}^{m_{j+1}} \Bigl(\prod_{i=m_{j+1}}^{k+1} B_i\Bigr)\zeta_k\Bigr\| \xrightarrow[j\to\infty]{} 0 \quad \text{a.s.},$$

and is equivalent to the following:

$$\left\|\begin{pmatrix} \sum_{k=m_j+1}^{m_{j+1}} Q(m_{j+1},k)\xi_k \\[4pt] \sum_{k=m_j+1}^{m_{j+1}-1} Q(m_{j+1}-1,k)\xi_k \end{pmatrix}\right\| \xrightarrow[j\to\infty]{} 0 \quad \text{a.s.}$$

Thus,

$$\lim_{j\to\infty} \Bigl\|\sum_{k=m_j+1}^{m_{j+1}} Q(m_{j+1},k)\xi_k\Bigr\| = 0 \quad \text{a.s.} \tag{16}$$

Since $\{\xi_k\}$ is a sequence of independent standard Gaussian vectors, the random vector $\sum_{k=m_j+1}^{m_{j+1}} Q(m_{j+1},k)\xi_k$ is a zero-mean Gaussian random vector for any $j \ge 1$. Moreover, for any $j_1 \ne j_2$, the random vectors $\sum_{k=m_{j_1}+1}^{m_{j_1+1}} Q(m_{j_1+1},k)\xi_k$ and $\sum_{k=m_{j_2}+1}^{m_{j_2+1}} Q(m_{j_2+1},k)\xi_k$ are independent. Then, by virtue of (16) and Proposition 3.2, one has that for any $\varepsilon > 0$:

$$\sum_{j=1}^{\infty} \exp\Bigl\{-\frac{\varepsilon}{\sum_{k=m_j+1}^{m_{j+1}} \|Q(m_{j+1},k)\|^2}\Bigr\} < \infty.$$

Therefore, condition (3) of Theorem 3.1 holds true.

Sufficiency. Assume that conditions (1)-(3) are satisfied and, for simplicity, that all the matrices $C_n$, $n \ge 1$, are nonsingular. Then all the matrices $B_n$, $n \ge 1$, which appear in formula (15), are also


nonsingular; that is, $\det B_n = \det C_n \ne 0$, $n \ge 1$. Therefore, from recurrence relation (15) one can proceed to the following:

$$\tilde S_n = \Bigl(\prod_{j=n}^{2} B_j\Bigr)\Bigl(\zeta_1 + B_2^{-1}\zeta_2 + (B_2^{-1}B_3^{-1})\zeta_3 + \cdots + (B_2^{-1}B_3^{-1}\cdots B_n^{-1})\zeta_n\Bigr), \quad n \ge 1,$$

where $B_k^{-1}$ is the inverse matrix of $B_k$, $k \ge 1$.

Such a representation enables us to rewrite the sequence $\{\tilde S_n\}$ in the form of a sequence of operator-normed sums of independent Gaussian random vectors. Namely,

$$\tilde S_n = T_n \sum_{k=1}^{n} H_k = T_n \Sigma_n, \quad n \ge 1,$$

where

$$\Sigma_n = \sum_{k=1}^{n} H_k, \quad n \ge 1, \qquad H_1 = \zeta_1, \quad H_k = \Bigl(\prod_{j=2}^{k} B_j^{-1}\Bigr)\zeta_k, \quad k \ge 2.$$

Moreover,

$$T_1 = I, \qquad T_n = \prod_{j=n}^{2} B_j = \begin{pmatrix} I + \sum_{l=1}^{n-1} \prod_{j=l+1}^{2} C_j & -\sum_{l=1}^{n-1} \prod_{j=l+1}^{2} C_j \\[4pt] I + \sum_{l=1}^{n-2} \prod_{j=l+1}^{2} C_j & -\sum_{l=1}^{n-2} \prod_{j=l+1}^{2} C_j \end{pmatrix}, \quad n \ge 2.$$

Note that $\{H_n\}$ is a sequence of independent zero-mean Gaussian random vectors in $\mathbb{R}^{2d}$. Thus, the sequence $\{\tilde S_n\}$ is represented in the form of the sequence $\{T_n \Sigma_n\}$ of operator-normed sums of random vectors in $\mathbb{R}^{2d}$. Therefore, series (6) converges a.s. if $\{T_n \Sigma_n\} \in c(\mathbb{R}^{2d})$ a.s. Further, one should apply Proposition 3.1 to the sequence $\{T_n \Sigma_n\}$.

Assume that matrix series (7) is convergent in the matrix norm for any $k \ge 1$. Then, for any $k \ge 1$, the limit

$$\lim_{n\to\infty} Q(n,k) = Q(\infty,k)$$

exists. Since

$$T_n H_k = B_n B_{n-1}\cdots B_{k+1}\zeta_k = \begin{pmatrix} Q(n,k)\xi_k \\ Q(n-1,k)\xi_k \end{pmatrix}, \quad k, n \ge 1,$$

then, for any $k \ge 1$,

$$\{T_n H_k\} \in c(\mathbb{R}^{2d}) \quad \text{a.s.},$$


that is, condition (1) of Proposition 3.1 holds. Moreover,

$$\lim_{n\to\infty} T_n H_k = \lim_{n\to\infty} \begin{pmatrix} Q(n,k)\xi_k \\ Q(n-1,k)\xi_k \end{pmatrix} = \begin{pmatrix} Q(\infty,k)\xi_k \\ Q(\infty,k)\xi_k \end{pmatrix}, \quad k \ge 1.$$

Since

$$\sum_{k=1}^{\infty} \lim_{n\to\infty} T_n H_k = \sum_{k=1}^{\infty} \begin{pmatrix} Q(\infty,k)\xi_k \\ Q(\infty,k)\xi_k \end{pmatrix},$$

then, by (9), the series $\sum_{k=1}^{\infty} \lim_{n\to\infty} T_n H_k$ converges a.s. in $\mathbb{R}^{2d}$. Thus, condition (2) of Proposition 3.1 holds.

Finally, let us fix an arbitrary sequence $\{m_j\}$ from the class $\mathcal{M}$. Then,

$$T_{m_{j+1}}(\Sigma_{m_{j+1}} - \Sigma_{m_j}) = \Bigl(\prod_{i=m_{j+1}}^{m_j+2} B_i\Bigr)\zeta_{m_j+1} + \cdots + B_{m_{j+1}}\zeta_{m_{j+1}-1} + \zeta_{m_{j+1}} = \begin{pmatrix} \sum_{k=m_j+1}^{m_{j+1}} Q(m_{j+1},k)\xi_k \\[4pt] \sum_{k=m_j+1}^{m_{j+1}-1} Q(m_{j+1}-1,k)\xi_k \end{pmatrix}, \quad j \ge 1. \tag{17}$$

Assume that condition (12) holds. Then, by (17) and Proposition 3.2, one has

$$\lim_{j\to\infty} \Bigl\|\sum_{k=m_j+1}^{m_{j+1}} Q(m_{j+1},k)\xi_k\Bigr\| = 0 \quad \text{a.s.}$$

Therefore, by (17), we obtain that

$$\lim_{j\to\infty} \|T_{m_{j+1}}(\Sigma_{m_{j+1}} - \Sigma_{m_j})\| = 0 \quad \text{a.s.},$$

i.e., condition (3) of Proposition 3.1 holds.

Thus, conditions (1)-(3) of Proposition 3.1 are satisfied, and series (6) converges a.s.

The next two examples illustrate particular cases of Theorem 3.1.

Example 3.1. Let $C_k = \mathbb{O}$, $k \ge 1$, i.e., $\{X_k\}$ is a sequence of independent Gaussian random vectors $X_k = D_k\xi_k$, $k \ge 1$. Then series (6) converges a.s. if and only if

$$\sum_{k=1}^{\infty} \|D_k\|^2 < \infty. \tag{18}$$
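Condition (18) is easy to check numerically for a concrete choice of $\{D_k\}$. A small sketch (the helper name `hs_norm_sq` is illustrative, not from the paper):

```python
import numpy as np

def hs_norm_sq(A):
    """Squared Hilbert-Schmidt norm: the sum of squared entries."""
    return float(np.sum(A ** 2))

# For D_k = I_2 / k the series of squared norms is 2 * sum 1/k^2,
# which converges (to pi^2/3), so series (6) converges a.s. here.
total = sum(hs_norm_sq(np.eye(2) / k) for k in range(1, 100001))
```

Replacing $D_k = I_2/k$ by, say, $D_k = I_2/\sqrt{k}$ makes the sum diverge, and the series of independent terms $X_k = D_k\xi_k$ then fails to converge a.s.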

Example 3.2. Consider a zero-mean Gaussian Markov sequence which obeys the system of recurrence equations

$$X_1 = D_1\xi_1, \qquad X_k = C X_{k-1} + D_k\xi_k, \quad k \ge 2,$$

where $C$ is a constant matrix of order $d$ such that $\|C\| < 1$.


Using Theorem 3.1, it is easy to obtain that the random series $\sum_{k=1}^{\infty} X_k$ converges a.s. if and only if condition (18) holds true.

Now put $d = 1$ and consider a zero-mean Gaussian Markov sequence of random variables $\{\eta_k\}$, i.e., a sequence which obeys the system of recurrence equations

$$\eta_1 = \sigma_1\gamma_1, \qquad \eta_k = \alpha_k\eta_{k-1} + \sigma_k\gamma_k, \quad k \ge 2,$$

where $\{\alpha_k\}$ is a nonrandom real sequence, $\{\sigma_k\}$ is a nonnegative nonrandom real sequence, and $\{\gamma_k\}$ is a standard Gaussian sequence. For the sequence $\{\eta_k\}$, consider the random series

$$\sum_{k=1}^{\infty} \eta_k. \tag{19}$$

For $n \ge 1$, denote

$$A(n,k) = \begin{cases} \sigma_k + \sum_{l=1}^{n-k} \sigma_k \prod_{j=k+1}^{k+l} \alpha_j, & 1 \le k \le n-1, \\ \sigma_k, & k = n, \\ 0, & k > n. \end{cases}$$

For $k \ge 1$, put

$$A(\infty,k) = \sigma_k + \sum_{l=1}^{\infty} \sigma_k \prod_{j=k+1}^{k+l} \alpha_j,$$

if the corresponding series

$$\sum_{l=1}^{\infty} \sigma_k \prod_{j=k+1}^{k+l} \alpha_j \tag{20}$$

is convergent, i.e., the limit $A(\infty,k) = \lim_{n\to\infty} A(n,k)$ exists.

The following corollary gives a criterion for the convergence a.s. of series (19) (see also Runovska, 2010).

Corollary 3.2. In order for series (19) to converge a.s., it is necessary and sufficient that the following three conditions be satisfied:

(1) for any $k \ge 1$, series (20) is convergent;
(2) $\sum_{k=1}^{\infty} (A(\infty,k))^2 < \infty$;


(3) for any $\varepsilon > 0$ and all sequences $\{m_j\}$ from the class $\mathcal{M}$, one has

$$\sum_{j=1}^{\infty} \exp\Bigl\{-\frac{\varepsilon}{\sum_{i=m_j+1}^{m_{j+1}} (A(m_{j+1},i))^2}\Bigr\} < \infty.$$

4. Necessary and Sufficient Conditions for the Convergence a.s. of the Series of Gaussian m-Markov Sequences

Theorem 3.1 enables us to find the necessary and sufficient conditions for the convergence a.s. of a series whose elements form a Gaussian $m$-Markov sequence of random variables.

Thus, let $\{\eta_k\}$ be a zero-mean Gaussian $m$-Markov sequence of random variables, i.e., a sequence which obeys the $m$th order system of recurrence equations:

$$\eta_{1-m} = \cdots = \eta_{-1} = \eta_0 = 0,$$
$$\eta_k = b_{k1}\eta_{k-1} + b_{k2}\eta_{k-2} + \cdots + b_{km}\eta_{k-m} + \sigma_k\gamma_k, \quad k \ge 1, \tag{21}$$

where $\{\sigma_k\}$ is a nonnegative nonrandom real sequence, $\{b_{kj},\ 1 \le j \le m,\ k \ge 1\}$ is a nonrandom real array, and $\{\gamma_k\}$ is a standard Gaussian sequence. For the sequence $\{\eta_k\}$, consider the random series

$$\sum_{k=1}^{\infty} \eta_k. \tag{22}$$

In this section, the criterion for the convergence a.s. of series (22) is obtained. For $k \ge 1$, consider the nonrandom sequence $\{u_n^{(k)},\ n \ge k\}$ which obeys the system of recurrence equations

$$u_n^{(k+1)} = b_{n1}u_{n-1}^{(k+1)} + b_{n2}u_{n-2}^{(k+1)} + \cdots + b_{nm}u_{n-m}^{(k+1)}, \quad n \ge k+1, \tag{23}$$

where

$$u_{k-(m-1)}^{(k+1)} = u_{k-(m-2)}^{(k+1)} = \cdots = u_{k-1}^{(k+1)} = 0, \qquad u_k^{(k+1)} = 1.$$

For $k \ge 1$, set

$$U_k = \sum_{l=0}^{\infty} \sigma_k u_{k+l}^{(k+1)},$$

if the corresponding series

$$\sum_{l=0}^{\infty} \sigma_k u_{k+l}^{(k+1)} \tag{24}$$

is convergent.


The following result gives a criterion for the convergence a.s. of series (22).

Theorem 4.1. Let $\{\eta_k\}$ be a zero-mean Gaussian $m$-Markov sequence. In order for series (22) to converge a.s., it is necessary and sufficient that the following three conditions be satisfied:

(1) for any $k \ge 1$, series (24) is convergent;
(2) the following relation holds:

$$\sum_{k=1}^{\infty} U_k^2 < \infty; \tag{25}$$

(3) for any $\varepsilon > 0$ and all sequences $\{m_j\}$ from the class $\mathcal{M}$, one has

$$\sum_{j=1}^{\infty} \exp\Bigl\{-\frac{\varepsilon}{\sum_{k=m_j+1}^{m_{j+1}} \bigl(\sum_{l=0}^{m_{j+1}-k} u_{k+l}^{(k+1)}\bigr)^2 \sigma_k^2}\Bigr\} < \infty.$$

Proof. To prove the theorem, the standard method may be used for switching from the problem of convergence a.s. of series (22) for the Gaussian $m$-Markov sequence to that of the convergence a.s. of the series for a Gaussian Markov sequence which obeys a first-order system of recurrence equations, this time considered in $\mathbb{R}^m$ (see Buldygin and Solntsev, 1997).

Indeed, set

$$X_k = \begin{pmatrix} \eta_k \\ \eta_{k-1} \\ \vdots \\ \eta_{k-m+1} \end{pmatrix}, \qquad \xi_k = \begin{pmatrix} \gamma_k \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \qquad D_k = \begin{pmatrix} \sigma_k & 0 & \cdots & 0 \\ 0 & 0 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & 0 \end{pmatrix},$$

and

$$C_k = \begin{pmatrix} b_{k1} & b_{k2} & \cdots & b_{k,m-1} & b_{km} \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix}, \quad k \ge 2.$$

Emphasize that the matrix $C_k$ is a Frobenius matrix for any $k \ge 1$. Then, instead of recurrence relation (21) of the $m$th order in $\mathbb{R}$, one can consider the recurrence relation of the first order in $\mathbb{R}^m$. Namely,

$$X_1 = D_1\xi_1, \qquad X_k = C_k X_{k-1} + D_k\xi_k, \quad k \ge 2.$$

Obviously, the sequence $\{X_k\}$ is a zero-mean Gaussian Markov sequence in $\mathbb{R}^m$, and series (22) converges a.s. if and only if the series $\sum_{k=1}^{\infty} X_k$ converges a.s. in $\mathbb{R}^m$.
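The reduction above amounts to building a Frobenius (companion) matrix from the coefficients $b_{k1},\ldots,b_{km}$. A sketch, with `companion` a hypothetical helper name:

```python
import numpy as np

def companion(b):
    """Frobenius matrix with first row b = (b_1, ..., b_m) and a shifted
    identity block below, as in the first-order rewriting of (21)."""
    m = len(b)
    C = np.zeros((m, m))
    C[0, :] = b                         # coefficient row
    C[1:, :m - 1] = np.eye(m - 1)       # shift: eta_{k-j} moves down one slot
    return C

# m = 2: eta_k = 0.3 eta_{k-1} + 0.2 eta_{k-2} + sigma_k gamma_k
C = companion([0.3, 0.2])
```

The eigenvalues of this matrix are exactly the roots of $\lambda^m - (b_1\lambda^{m-1} + \cdots + b_m) = 0$, the characteristic equation that reappears in Corollary 4.2 below.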

Thus, one can apply Theorem 3.1 to the random series $\sum_{k=1}^{\infty} X_k$. Let us show that conditions (1)-(3) of Theorem 4.1 hold if and only if conditions (1)-(3) of Theorem 3.1 hold.


First, find the product of matrices $\prod_{j=n}^{k+1} C_j$ for any $n \ge k+1$. Notice that the matrix $C_{k+1}$ can be represented in the following form:

$$C_{k+1} = \begin{pmatrix} b_{(k+1)1} & b_{(k+1)2} & \cdots & b_{(k+1)m} \\ 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & 0 \end{pmatrix} = \begin{pmatrix} u_{k+1}^{(k+1)} & b_{(k+1)2}u_{k+1}^{(k+2)} & \cdots & b_{(k+1)m}u_{k+1}^{(k+2)} \\ u_k^{(k+1)} & b_{(k+1)2}u_k^{(k+2)} & \cdots & b_{(k+1)m}u_k^{(k+2)} \\ u_{k-1}^{(k+1)} & b_{(k+1)2}u_{k-1}^{(k+2)} + u_{k+2}^{(k+3)} & \cdots & b_{(k+1)m}u_{k-1}^{(k+2)} \\ \vdots & \vdots & & \vdots \\ u_{k-(m-2)}^{(k+1)} & b_{(k+1)2}u_{k-(m-2)}^{(k+2)} & \cdots & b_{(k+1)m}u_{k-(m-2)}^{(k+2)} \end{pmatrix}.$$

Further, note that the matrix $C_{k+2}C_{k+1}$ can be represented in the following form:

$$C_{k+2}C_{k+1} = \begin{pmatrix} u_{k+2}^{(k+1)} & b_{(k+1)2}u_{k+2}^{(k+2)} + b_{(k+2)3}u_{k+2}^{(k+3)} & \cdots & b_{(k+1)m}u_{k+2}^{(k+2)} \\ u_{k+1}^{(k+1)} & b_{(k+1)2}u_{k+1}^{(k+2)} & \cdots & b_{(k+1)m}u_{k+1}^{(k+2)} \\ u_k^{(k+1)} & b_{(k+1)2}u_k^{(k+2)} & \cdots & b_{(k+1)m}u_k^{(k+2)} \\ \vdots & \vdots & & \vdots \\ u_{k-(m-3)}^{(k+1)} & b_{(k+1)2}u_{k-(m-3)}^{(k+2)} & \cdots & b_{(k+1)m}u_{k-(m-3)}^{(k+2)} \end{pmatrix}.$$

Continuing this procedure, one can obtain that all the entries of the matrices $C_{k+1},\ C_{k+2}C_{k+1},\ \ldots,\ C_{k+(m-1)}\cdots C_{k+2}C_{k+1}$ can be represented in terms of elements of the sequences $\{u_n^{(k)},\ n \ge k\}$.

Finally, by induction one can prove that all the entries of any matrix $\prod_{j=n}^{k+1} C_j$, $n \ge k+m$, can be represented in the form of a sum of fewer than $m$ summands of elements of the sequences $\{u_n^{(k)},\ n \ge k\}$ defined in (23).

Further, notice that, for any $k \ge 1$ and $l \ge 1$,

$$\Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) D_k = \begin{pmatrix} u_{k+l}^{(k+1)}\sigma_k & 0 & \cdots & 0 \\ u_{k+l-1}^{(k+1)}\sigma_k & 0 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ u_{k+l-(m-1)}^{(k+1)}\sigma_k & 0 & \cdots & 0 \end{pmatrix}.$$

Therefore, the series $\sum_{l=1}^{\infty} \bigl(\prod_{j=k+l}^{k+1} C_j\bigr) D_k$ converges in the matrix norm for any $k \ge 1$ if and only if series (24) is convergent for any $k \ge 1$. Thus, condition (1) of Theorem 4.1 holds if and only if condition (1) of Theorem 3.1 holds.


Since

$$Q(\infty,k) = D_k + \sum_{l=1}^{\infty} \Bigl(\prod_{j=k+l}^{k+1} C_j\Bigr) D_k = \begin{pmatrix} U_k & 0 & \cdots & 0 \\ U_k & 0 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ U_k & 0 & \cdots & 0 \end{pmatrix}, \quad k \ge 1,$$

then

$$\|Q(\infty,k)\|^2 = m U_k^2, \quad k \ge 1.$$

Thus, (9) holds if and only if (25) holds true, i.e., condition (2) of Theorem 4.1 is equivalent to condition (2) of Theorem 3.1.

Fix some $\varepsilon > 0$ and an arbitrary sequence $\{m_j\}$ from the class $\mathcal{M}$. Since

$$Q(m_{j+1},k)\xi_k = \begin{pmatrix} \bigl(\sum_{l=0}^{m_{j+1}-k} u_{k+l}^{(k+1)}\bigr)\sigma_k\gamma_k \\[4pt] \bigl(\sum_{l=1}^{m_{j+1}-k} u_{k+l-1}^{(k+1)}\bigr)\sigma_k\gamma_k \\ \vdots \\ \bigl(\sum_{l=m-1}^{m_{j+1}-k} u_{k+l-(m-1)}^{(k+1)}\bigr)\sigma_k\gamma_k \end{pmatrix},$$

then the condition

$$\lim_{j\to\infty} \Bigl\|\sum_{k=m_j+1}^{m_{j+1}} Q(m_{j+1},k)\xi_k\Bigr\| = 0 \quad \text{a.s.}$$

holds if and only if

$$\lim_{j\to\infty} \Bigl|\sum_{k=m_j+1}^{m_{j+1}} \Bigl(\sum_{l=0}^{m_{j+1}-k} u_{k+l}^{(k+1)}\Bigr)\sigma_k\gamma_k\Bigr| = 0 \quad \text{a.s.}$$

Finally, since $\{\gamma_k\}$ is a standard Gaussian sequence, according to Proposition 3.2 the last relation holds if and only if, for any $\varepsilon > 0$,

$$\sum_{j=1}^{\infty} \exp\Bigl\{-\frac{\varepsilon}{\sum_{k=m_j+1}^{m_{j+1}} \bigl(\sum_{l=0}^{m_{j+1}-k} u_{k+l}^{(k+1)}\bigr)^2 \sigma_k^2}\Bigr\} < \infty.$$

This completes the proof of Theorem 4.1.

The next corollary yields the necessary and sufficient conditions for convergence a.s. of series (22) for a Gaussian $m$-Markov sequence $\{\eta_k\}$ with nonnegative coefficients.

Corollary 4.1. Let $\{\eta_k\}$ be a zero-mean Gaussian $m$-Markov sequence such that $b_{kj} \ge 0$, $j = 1, 2, \ldots, m$, $k \ge 1$. In order for series (22) to converge a.s., it is necessary and sufficient that the following two conditions be satisfied:

(1) for any $k \ge 1$, series (24) is convergent;
(2) relation (25) holds.


Proof. Note that conditions (1) and (2) of Corollary 4.1 are the same as conditions (1) and (2) of Theorem 4.1, respectively. It remains to show that condition (3) of Theorem 4.1 holds true automatically. Fix an $\varepsilon > 0$ and an arbitrary sequence $\{m_j\}$ from the class $\mathcal{M}$. Since $b_{kj} \ge 0$, $j = 1, 2, \ldots, m$, then, for any $j \ge 1$, one has

$$\sum_{l=0}^{m_{j+1}-k} \sigma_k u_{k+l}^{(k+1)} \le \sum_{l=0}^{\infty} \sigma_k u_{k+l}^{(k+1)} = U_k, \quad k \ge 1.$$

Observe that, for any $j \ge 1$,

$$\exp\Bigl\{-\frac{\varepsilon}{\sum_{k=m_j+1}^{m_{j+1}} \bigl(\sum_{l=0}^{m_{j+1}-k} u_{k+l}^{(k+1)}\bigr)^2 \sigma_k^2}\Bigr\} \le \frac{1}{\varepsilon} \sum_{k=m_j+1}^{m_{j+1}} \Bigl(\sum_{l=0}^{m_{j+1}-k} u_{k+l}^{(k+1)}\Bigr)^2 \sigma_k^2 \le \frac{1}{\varepsilon} \sum_{k=m_j+1}^{m_{j+1}} U_k^2.$$

Moreover, the convergence of the series

$$\sum_{j=1}^{\infty} \Bigl(\sum_{k=m_j+1}^{m_{j+1}} U_k^2\Bigr)$$

follows from condition (25). Thus, condition (3) of Theorem 4.1 holds true.

The next corollary illustrates the case of a Gaussian $m$-Markov sequence $\{\eta_k\}$ with constant coefficients.

Corollary 4.2. Let the sequence $\{\eta_k\}$ obey the system of recurrence equations

$$\eta_{1-m} = \cdots = \eta_{-1} = \eta_0 = 0, \qquad \eta_k = b_1\eta_{k-1} + b_2\eta_{k-2} + \cdots + b_m\eta_{k-m} + \sigma_k\gamma_k, \quad k \ge 1,$$

where $b_j$, $j = 1, 2, \ldots, m$, are real constants, and $\{\sigma_k\}$ is a nonnegative nonrandom real sequence with at least one nonzero element. Then, in order for series (22) to converge a.s., it is necessary and sufficient that the following two conditions hold:

(1) $\max_{1 \le k \le s} |\lambda_k| < 1$, where $\lambda_i$, $i = 1, 2, \ldots, s$, $s \le m$, are the distinct roots of the equation

$$\lambda^m - (b_1\lambda^{m-1} + b_2\lambda^{m-2} + \cdots + b_{m-1}\lambda + b_m) = 0;$$

(2) the following relation holds:

$$\sum_{k=1}^{\infty} \sigma_k^2 < \infty.$$

Proof. Before proving Corollary 4.2, let us consider the Frobenius matrix

$$C = \begin{pmatrix} b_1 & b_2 & \cdots & b_{m-1} & b_m \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix}.$$


The characteristic equation for the matrix $C$ takes the form

$$\lambda^m - (b_1\lambda^{m-1} + b_2\lambda^{m-2} + \cdots + b_{m-1}\lambda + b_m) = 0.$$

Let $\lambda_1, \lambda_2, \ldots, \lambda_s$, $s \le m$, be the distinct roots of this equation, and let $\mu_1, \mu_2, \ldots, \mu_s$ be the corresponding multiplicities of these roots. Emphasize that the roots $\lambda_1, \lambda_2, \ldots, \lambda_s$, $s \le m$, are, generally speaking, complex-valued.

Denote by $r$ the spectral radius of the matrix $C$, i.e.,

$$r = \max_{1 \le k \le s} |\lambda_k|,$$

and by $\mu$ the maximal multiplicity among the roots of modulus $r$, i.e.,

$$\mu = \max\{\mu_k : |\lambda_k| = r\}.$$
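Condition (1) of Corollary 4.2 can be checked numerically from the coefficients alone, via the characteristic equation above. A sketch using NumPy's polynomial root finder (the helper name `spectral_radius` is illustrative):

```python
import numpy as np

def spectral_radius(b):
    """Spectral radius of the Frobenius matrix with first row b:
    the largest modulus among the roots of
    lambda^m - (b_1 lambda^{m-1} + ... + b_m) = 0."""
    poly = np.concatenate(([1.0], -np.asarray(b, dtype=float)))
    return float(np.max(np.abs(np.roots(poly))))

# For b = (0.3, 0.2) the radius is (0.3 + sqrt(0.89)) / 2 < 1,
# so condition (1) of Corollary 4.2 is satisfied.
r = spectral_radius([0.3, 0.2])
```

Together with $\sum_k \sigma_k^2 < \infty$, a value $r < 1$ then yields a.s. convergence of series (22) in the constant-coefficient case.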

In the proof of Corollary 4.2, we again use the approach based on switching from the problem of convergence a.s. of series (22) for the Gaussian $m$-Markov sequence to that of the convergence a.s. of the series for a Gaussian Markov sequence which obeys a first-order system of recurrence equations, this time considered in $\mathbb{R}^m$. Indeed, set

$$X_k = \begin{pmatrix} \xi_k \\ \xi_{k-1} \\ \vdots \\ \xi_{k-m+1} \end{pmatrix}, \qquad \gamma_k = \begin{pmatrix} \gamma_k \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \qquad H = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 0 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & 0 \end{pmatrix}, \qquad D_k = \sigma_k H, \quad k \ge 1.$$

Then, instead of the recurrence relation of the $m$th order in $\mathbb{R}$, one can consider a recurrence relation of the first order in $\mathbb{R}^m$. Namely,
$$X_1 = D_1 \gamma_1, \qquad X_k = C X_{k-1} + D_k \gamma_k, \quad k \ge 2.$$

Obviously, the sequence $(X_k)$ is a zero-mean Gaussian Markov sequence in $\mathbb{R}^m$ with constant matrix coefficients. Therefore, series (22) converges a.s. if and only if the series $\sum_{k=1}^{\infty} X_k$ converges a.s. in $\mathbb{R}^m$. Thus, one should apply Theorem 3.1 to the series $\sum_{k=1}^{\infty} X_k$.

Consider matrix series (7). In our case, for any $k \ge 1$, it takes the form
$$\sum_{l=0}^{\infty} C^l \cdot D_k.$$
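The equivalence between the scalar $m$th-order recursion and its first-order lift can be checked directly. The following standalone sketch (with assumed small parameters of our choosing) runs both recursions and confirms that the first coordinate of $X_k$ reproduces $\xi_k$:

```python
import numpy as np

b = np.array([0.4, 0.25])            # hypothetical b_1, b_2 (m = 2)
m = len(b)
n = 50
rng = np.random.default_rng(1)
gamma = rng.standard_normal(n)
sigma = 0.5 ** np.arange(1, n + 1)   # hypothetical square-summable sigma_k

# Scalar m-th order recursion with zero initial values.
xi = np.zeros(n + m)
for k in range(n):
    xi[m + k] = sum(b[j] * xi[m + k - 1 - j] for j in range(m)) + sigma[k] * gamma[k]

# First-order vector recursion X_k = C X_{k-1} + D_k gamma_k in R^m,
# with D_k = sigma_k H and gamma_k = (gamma_k, 0, ..., 0)^T.
C = np.zeros((m, m)); C[0, :] = b; C[1:, :-1] = np.eye(m - 1)
H = np.zeros((m, m)); H[0, 0] = 1.0
e1 = np.zeros(m); e1[0] = 1.0
X = np.zeros(m)
history = []
for k in range(n):
    X = C @ X + sigma[k] * (H @ (gamma[k] * e1))
    history.append(X[0])             # first coordinate should equal xi_{k+1}

assert np.allclose(history, xi[m:])
```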

Note that, for any $k \ge 1$, one has
$$\|C^l D_k\| = \|C^l \sigma_k H\| = \sigma_k \|C^l H\|, \quad l \ge 1.$$

Since $C$ is a Frobenius matrix, according to Lemma 7.7.3 of Buldygin and Solntsev (1997) one can estimate the norm $\|C^l H\|$ for any $l \ge 1$. Indeed,
$$c_1 \cdot r^l \cdot l^{\nu-1} \le \|C^l H\| \le c_2 \cdot r^l \cdot l^{\nu-1}, \quad l \ge 1, \qquad (26)$$
where $c_1$ and $c_2$ are constants such that $c_2 > c_1 > 0$. Hence,
$$c_1 \cdot r^l \cdot l^{\nu-1} \cdot \sigma_k \le \|C^l D_k\| \le c_2 \cdot r^l \cdot l^{\nu-1} \cdot \sigma_k, \quad l \ge 1,$$


if the corresponding $\sigma_k \ne 0$. Then, for such $k$, the matrix series $\sum_{l=0}^{\infty} C^l \cdot D_k$ converges in the matrix norm if and only if
$$\sum_{l=0}^{\infty} r^l \cdot l^{\nu-1} < \infty.$$

Since $\nu \ge 1$, the last relation holds if and only if $r < 1$, i.e.,
$$\max_{1 \le k \le s} |\lambda_k| < 1.$$

Thus, condition (1) of Theorem 3.1 in our case holds if and only if condition (1) of Corollary 4.2 holds true.
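Estimate (26) can also be illustrated numerically. In the sketch below (hypothetical coefficients chosen so that the roots are distinct, hence $\nu = 1$), the ratio $\|C^l H\| / r^l$ indeed stays bounded between positive constants, as the lemma asserts:

```python
import numpy as np

b = [0.5, -0.06]                     # illustrative b_1, b_2; roots 0.2 and 0.3
m = len(b)
C = np.zeros((m, m)); C[0, :] = b; C[1:, :-1] = np.eye(m - 1)
H = np.zeros((m, m)); H[0, 0] = 1.0

r = np.max(np.abs(np.linalg.eigvals(C)))   # spectral radius, here r = 0.3

# With distinct roots nu = 1, so ||C^l H|| should behave like r^l:
# the ratios below should stay in a fixed positive band [c_1, c_2].
ratios = []
Cl = np.eye(m)
for l in range(1, 40):
    Cl = Cl @ C
    ratios.append(np.linalg.norm(Cl @ H) / r ** l)

assert 0 < min(ratios) and max(ratios) < 100
```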

Further, observe that, for any $k \ge 1$,
$$Q(\infty, k) = \Big( \sum_{l=0}^{\infty} C^l \Big) D_k = \Big( \sum_{l=0}^{\infty} C^l \Big) \sigma_k H,$$
and $Q(\infty, k)$ is well defined for any $k \ge 1$. Then

$$\|Q(\infty, k)\|^2 = \Big\| \Big( \sum_{l=0}^{\infty} C^l \Big) \sigma_k H \Big\|^2 = \sigma_k^2 \, \Big\| \Big( \sum_{l=0}^{\infty} C^l \Big) H \Big\|^2, \quad k \ge 1.$$

Hence, condition (9) holds if and only if
$$\sum_{k=1}^{\infty} \sigma_k^2 < \infty.$$

Finally, fix an $\varepsilon > 0$ and an arbitrary sequence $(m_j)$ from the class $\mathcal{M}_\infty$. Since
$$Q(n, k) = \Big( \sum_{l=0}^{n-k} C^l \Big) \sigma_k H, \quad 1 \le k \le n, \ n \ge 1,$$

then, by (26), one has
$$\exp\left\{ -\frac{\varepsilon}{\sum_{k=m_j+1}^{m_{j+1}} \|Q(m_{j+1}, k)\|^2} \right\} \le \frac{1}{\varepsilon} \sum_{k=m_j+1}^{m_{j+1}} \|Q(m_{j+1}, k)\|^2 = \frac{1}{\varepsilon} \sum_{k=m_j+1}^{m_{j+1}} \Big\| \Big( \sum_{l=0}^{m_{j+1}-k} C^l \Big) \sigma_k H \Big\|^2$$
$$\le \frac{1}{\varepsilon} \sum_{k=m_j+1}^{m_{j+1}} \sigma_k^2 \Big( \sum_{l=0}^{m_{j+1}-k} \|C^l H\| \Big)^2 \le \frac{c_2^2}{\varepsilon} \sum_{k=m_j+1}^{m_{j+1}} \sigma_k^2 \Big( \sum_{l=0}^{m_{j+1}-k} r^l \cdot l^{\nu-1} \Big)^2 \le \frac{c_2^2}{\varepsilon} \Big( \sum_{l=0}^{\infty} r^l \cdot l^{\nu-1} \Big)^2 \cdot \sum_{k=m_j+1}^{m_{j+1}} \sigma_k^2,$$
where the first estimate uses the elementary inequality $e^{-1/x} \le x$, $x > 0$. Therefore, condition (12) follows from the convergence of the series $\sum_{k=1}^{\infty} \sigma_k^2$. $\square$
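The final step can be mirrored numerically. With a square-summable $(\sigma_k)$ and dyadic blocks $m_j = 2^j$ (both illustrative choices of ours, not from the paper), each exponential term is dominated by the corresponding block sum, in line with the elementary inequality $e^{-1/x} \le x$:

```python
import math

eps = 0.5
sigma2 = [1.0 / k ** 2 for k in range(1, 10001)]         # summable sigma_k^2
blocks = [(2 ** j, 2 ** (j + 1)) for j in range(0, 13)]  # dyadic blocks m_j = 2^j

# Block sums S_j = sum_{k=m_j+1}^{m_{j+1}} sigma_k^2 (constants absorbed).
S = [sum(sigma2[a:b]) for a, b in blocks]

# Each term exp(-eps/S_j) is at most S_j/eps, so the whole sum over j is
# dominated by (1/eps) * sum_k sigma_k^2 -- a finite quantity.
lhs = sum(math.exp(-eps / s) for s in S)
rhs = sum(S) / eps
assert lhs <= rhs
```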


References

Arato, M. (1982). Linear Stochastic Systems with Constant Coefficients: A Statistical Approach. Berlin-Heidelberg: Springer-Verlag.

Buldygin, V. V. (1978). The strong laws of large numbers and the convergence to zero of Gaussian sequences. Theory Probab. Math. Statist. 19:35–43.

Buldygin, V. V. (1980). Convergence of Random Elements in Topological Spaces. Kiev: "Naukova Dumka". (in Russian)

Buldygin, V. V., Solntsev, S. A. (1987). The strong law of large numbers for sums of independent random vectors with operator normalizations and the convergence to zero of Gaussian sequences. Probab. Theor. Appl. 32:243–256.

Buldygin, V. V., Solntsev, S. A. (1989). Functional Methods in Problems of the Summation of Random Variables. Kiev: "Naukova Dumka". (in Russian)

Buldygin, V. V., Solntsev, S. A. (1997). Asymptotic Behavior of Linearly Transformed Sums of Random Variables. Dordrecht: Kluwer Academic Publishers.

Buldygin, V. V., Runovska, M. K. (2009). On the convergence of series of autoregressive sequences. Theor. Stochastic Process. No. 1, 15(31):7–14.

Buldygin, V. V., Runovska, M. K. (2010). On the convergence of series of autoregressive sequences in Banach spaces. Theor. Stochastic Process. No. 1, 16(32):29–38.

Dorogovtsev, A. Ya. (1992). Periodic and Stationary Behaviours in Infinite-Dimensional Deterministic and Stochastic Dynamical Systems. Kiev: "Vyshcha Shkola". (in Russian)

Korostelev, A. P. (1984). Stochastic Recurrent Procedures: Local Properties. Moscow: "Nauka". (in Russian)

Koval, V. A. (1991). Asymptotical behavior of solutions of stochastic recurrence equations in the space $\mathbb{R}^d$. Ukr. Math. J. 6(43):776–779.

Runovska, M. K. (2010). Convergence of series of elements of Gaussian Markov sequences. Teor. Imovir. Math. Stat. 83:125–137. (in Ukrainian)
