Vector Autoregressive Moving Average Process
Presented by
Muhammad Iqbal, Amjad Naveed
and Muhammad Nadeem
Road Map
1. Introduction
2. Properties of Finite-Order MA Processes
3. Stationarity of MA Process
4. VARMA (p,q) process
5. VAR (1) Representation of VARMA (p,q) Process
6. Autocovariance and Autocorrelation of VARMA (p,q) Process
1: Introduction
Extension of the standard VAR process:

VAR(p):

$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + v_t$

where the error term $v_t$ is autocorrelated, not white noise, and follows an MA(q):

$v_t = u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$

where $u_t$ is zero-mean white noise with nonsingular covariance matrix $\Sigma_u$.

The combination of the VAR(p) and MA(q) parts is the VARMA(p,q) process:

$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$
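To make the definition concrete, the following is a minimal simulation sketch of a bivariate VARMA(1,1) in Python/NumPy. The coefficient matrices A1, M1 and the covariance Sigma_u are illustrative choices, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

K, T = 2, 500                        # dimension and sample size
A1 = np.array([[0.5, 0.1],           # VAR(1) coefficient matrix (illustrative)
               [0.2, 0.3]])
M1 = np.array([[0.4, 0.0],           # MA(1) coefficient matrix (illustrative)
               [0.1, 0.2]])
Sigma_u = np.array([[1.0, 0.3],      # nonsingular white-noise covariance
                    [0.3, 1.0]])

# y_t = A1 y_{t-1} + u_t + M1 u_{t-1}, taking nu = 0 for simplicity
u = rng.multivariate_normal(np.zeros(K), Sigma_u, size=T + 1)
y = np.zeros((T + 1, K))
for t in range(1, T + 1):
    y[t] = A1 @ y[t - 1] + u[t] + M1 @ u[t - 1]
y = y[1:]                            # drop the initialization value
```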
2: Properties of Finite-Order MA Processes
MA(1)
$y_t = \mu + u_t + M_1 u_{t-1}$

where $y_t = (y_{1t}, \dots, y_{Kt})'$ and $u_t$ is zero-mean white noise with nonsingular covariance matrix $\Sigma_u$, and $E(y_t) = \mu$; further assume $\mu = 0$, so that

$u_t = y_t - M_1 u_{t-1}$

By successive substitution we get (MA(1) to VAR($\infty$)):

$u_t = y_t - M_1 u_{t-1} = y_t - M_1 (y_{t-1} - M_1 u_{t-2}) = y_t - M_1 y_{t-1} + M_1^2 u_{t-2}$

$= \dots = y_t - M_1 y_{t-1} + M_1^2 y_{t-2} - \dots + (-M_1)^n y_{t-n} + (-M_1)^{n+1} u_{t-n-1}$

Hence

$y_t = -\sum_{i=1}^{\infty} (-M_1)^i y_{t-i} + u_t$

is a VAR($\infty$) (Wold-type representation as in Ch. 2), provided $(-M_1)^i \to 0$ as $i \to \infty$.

This requires that the eigenvalues of $M_1$ are all less than 1 in modulus, i.e.,

$\det(I_K + M_1 z) \ne 0$ for $|z| \le 1$

This condition is analogous to the stability condition for a VAR(1) process.
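The condition is easy to check numerically. A sketch with an illustrative M1: the eigenvalue criterion, and the resulting truncated VAR($\infty$) coefficients $\Pi_i = -(-M_1)^i$.

```python
import numpy as np

M1 = np.array([[0.4, 0.0],
               [0.1, 0.2]])

# Invertibility of the MA(1): all eigenvalues of M1 less than 1 in modulus,
# equivalently det(I_K + M1 z) != 0 for |z| <= 1.
print("invertible:", np.all(np.abs(np.linalg.eigvals(M1)) < 1))

# Truncated VAR(inf) coefficients: y_t = sum_{i>=1} Pi_i y_{t-i} + u_t
# with Pi_i = -(-M1)^i, which die out when the condition holds.
Pi = [-np.linalg.matrix_power(-M1, i) for i in range(1, 11)]
print("norm of Pi_10:", np.linalg.norm(Pi[-1]))   # close to zero
```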
Similarly, an MA(q) process can be represented as a VAR($\infty$):

$y_t = u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$, $t = 0, \pm 1, \pm 2, \dots$

$y_t = \sum_{i=1}^{\infty} \Pi_i y_{t-i} + u_t$ (VAR($\infty$))

If

$\det(I_K + M_1 z + \dots + M_q z^q) \ne 0$ for $|z| \le 1$

then the MA(q) process is called invertible.
MA(q) to VAR($\infty$) using the lag operator:

$y_t = u_t + M_1 u_{t-1} + \dots + M_q u_{t-q} = M(L) u_t$

where $M(L) = I_K + M_1 L + \dots + M_q L^q$.

The MA operator is invertible if it satisfies the above condition. Then we can write

$M(L)^{-1} y_t = u_t$, with $M(L)^{-1} = I_K - \sum_{i=1}^{\infty} \Pi_i L^i$

where $\Pi_1 = M_1$ and

$\Pi_i = M_i - \sum_{j=1}^{i-1} M_j \Pi_{i-j}$, $i = 2, 3, \dots$, with $M_j = 0$ for $j > q$

These are the same recursions used to compute the MA coefficients of a pure VAR process (see Ch. 2).
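The recursion can be coded directly. A sketch with a hypothetical helper var_coeffs_from_ma that takes the list [M_1, ..., M_q]:

```python
import numpy as np

def var_coeffs_from_ma(M, n):
    """Recursion for an invertible MA(q): Pi_1 = M_1,
    Pi_i = M_i - sum_{j=1}^{i-1} M_j Pi_{i-j}, with M_j = 0 for j > q.
    M is the list [M_1, ..., M_q] of (K x K) arrays; returns [Pi_1, ..., Pi_n]."""
    K = M[0].shape[0]
    Mj = lambda j: M[j - 1] if j <= len(M) else np.zeros((K, K))
    Pi = []
    for i in range(1, n + 1):
        Pi.append(Mj(i) - sum(Mj(j) @ Pi[i - j - 1] for j in range(1, i)))
    return Pi

M1 = np.array([[0.4, 0.0],
               [0.1, 0.2]])
Pi = var_coeffs_from_ma([M1], 5)
print(np.allclose(Pi[1], -M1 @ M1))   # for an MA(1), Pi_2 = -M1^2: True
```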
3: Stationarity of MA Process

$y_t = \mu + \sum_{i=0}^{q} M_i u_{t-i}$, with $M_0 = I_K$

The autocovariances are

$\Gamma_y(h) = E[(y_t - \mu)(y_{t-h} - \mu)'] = \sum_{i=0}^{q-h} M_{h+i} \Sigma_u M_i'$, $h = 0, 1, \dots, q$

$\Gamma_y(0) = \Sigma_u + M_1 \Sigma_u M_1' + M_2 \Sigma_u M_2' + \dots + M_q \Sigma_u M_q'$

If $h > q$, the vectors $y_t$ and $y_{t-h}$ are uncorrelated: $\Gamma_y(h) = 0$.

The autocovariances are independent of time $t$, so a finite-order MA process is always stationary.
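The autocovariance formula translates into a few lines of code. A sketch with a hypothetical helper ma_autocov and illustrative M1, Sigma_u:

```python
import numpy as np

def ma_autocov(M, Sigma_u, h):
    """Gamma_y(h) = sum_{i=0}^{q-h} M_{h+i} Sigma_u M_i' for an MA(q),
    with M_0 = I_K prepended; Gamma_y(h) = 0 for h > q.
    M is the list [M_1, ..., M_q]."""
    K = Sigma_u.shape[0]
    q = len(M)
    if h > q:
        return np.zeros((K, K))
    Mfull = [np.eye(K)] + list(M)     # prepend M_0 = I_K
    return sum(Mfull[h + i] @ Sigma_u @ Mfull[i].T for i in range(q - h + 1))

M1 = np.array([[0.4, 0.0], [0.1, 0.2]])
Su = np.array([[1.0, 0.3], [0.3, 1.0]])
print(ma_autocov([M1], Su, 1))        # equals M1 @ Su for an MA(1)
print(ma_autocov([M1], Su, 2))        # zero matrix, since h > q
```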
Non-invertible MA(q) process
A process that violates the invertibility condition but has no roots on the complex unit circle, i.e., for which

$\det(I_K + M_1 z + \dots + M_q z^q) \ne 0$ for $|z| = 1$

is still stationary, and an equivalent invertible representation exists (proof in Hannan and Deistler, 1988).
For instance, the two different univariate MA(1) processes below yield the same autocovariance structure.

A) $y_t = u_t + m u_{t-1}$; its autocovariances are

$\gamma_y(h) = \begin{cases} (1 + m^2)\sigma_u^2 & h = 0 \\ m \sigma_u^2 & h = \pm 1 \\ 0 & \text{otherwise} \end{cases}$

where $\sigma_u^2 = \operatorname{Var}(u_t)$.

B) $y_t = v_t + \frac{1}{m} v_{t-1}$, where $v_t$ is a white noise process with variance $\sigma_v^2 = m^2 \sigma_u^2$; it has the same autocovariances.

When $|m| > 1$, we may choose the invertible representation B and write

$v_t = \left(1 + \tfrac{1}{m} L\right)^{-1} y_t = \sum_{i=0}^{\infty} \left(-\tfrac{1}{m}\right)^i y_{t-i}$

If $|m| = 1$, then $1 + mz = 0$ for some $z$ on the unit circle ($z = 1$ or $-1$), and an invertible representation does not exist.
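The equivalence of the two representations is easy to verify numerically. A sketch with the illustrative values m = 2 and σ_u² = 1:

```python
import numpy as np

m, sigma_u2 = 2.0, 1.0                 # |m| > 1: representation A is not invertible

# A) y_t = u_t + m u_{t-1}
gamma0_A = (1 + m**2) * sigma_u2
gamma1_A = m * sigma_u2

# B) y_t = v_t + (1/m) v_{t-1} with Var(v_t) = m^2 sigma_u^2 -- invertible
sigma_v2 = m**2 * sigma_u2
gamma0_B = (1 + 1 / m**2) * sigma_v2
gamma1_B = (1 / m) * sigma_v2

print(np.isclose(gamma0_A, gamma0_B), np.isclose(gamma1_A, gamma1_B))  # True True
```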
4: VARMA (p,q) process as Pure MA and Pure VAR
$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$, $t = 0, \pm 1, \pm 2, \dots$

where $u_t$ is zero-mean white noise with nonsingular covariance matrix $\Sigma_u$.

Suppose the MA part is denoted by $v_t$:

$v_t = u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$

Then the VARMA(p,q) looks like a VAR(p) process:

$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + v_t$

If this process is stable, that is, if

$\det(I_K - A_1 z - \dots - A_p z^p) \ne 0$ for $|z| \le 1$

then it can be represented as an MA($\infty$).
Substituting the MA part for $v_t$ and collecting the coefficients of the $u_{t-i}$ gives the pure MA process

$y_t = \mu + \sum_{i=0}^{\infty} \Phi_i u_{t-i}$

Here $\mu = (I_K - A_1 - \dots - A_p)^{-1} \nu$, and the $\Phi_i$ are $(K \times K)$ matrices satisfying

$\sum_{i=0}^{\infty} \Phi_i L^i = (I_K - A_1 L - \dots - A_p L^p)^{-1} M(L)$
VARMA(p,q) in lag operator notation as an MA process:

$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$

Using the lag operator,

$A(L) y_t = \nu + M(L) u_t$

where $A(L) = I_K - A_1 L - \dots - A_p L^p$ and $M(L) = I_K + M_1 L + \dots + M_q L^q$.

Premultiplying by $A(L)^{-1}$ we get

$y_t = A(L)^{-1} \nu + A(L)^{-1} M(L) u_t = \mu + \sum_{i=0}^{\infty} \Phi_i u_{t-i}$
Hence, multiplying from the left by $A(L)$ gives

$(I_K - A_1 L - \dots - A_p L^p) \sum_{i=0}^{\infty} \Phi_i L^i = I_K + M_1 L + \dots + M_q L^q$

Comparing coefficients,

$M_i = \Phi_i - \sum_{j=1}^{i} A_j \Phi_{i-j}$, $i = 1, 2, \dots$

with $\Phi_0 = I_K$, $A_j = 0$ for $j > p$, and $M_i = 0$ for $i > q$.

Rearranging gives

$\Phi_i = M_i + \sum_{j=1}^{i} A_j \Phi_{i-j}$, $i = 1, 2, \dots$
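This recursion is straightforward to implement. A sketch with a hypothetical helper ma_coeffs_from_varma taking the lists [A_1, ..., A_p] and [M_1, ..., M_q]:

```python
import numpy as np

def ma_coeffs_from_varma(A, M, n):
    """Phi_0 = I_K; Phi_i = M_i + sum_{j=1}^{i} A_j Phi_{i-j},
    with A_j = 0 for j > p and M_i = 0 for i > q.
    Returns [Phi_0, Phi_1, ..., Phi_n]."""
    K = A[0].shape[0]
    Aj = lambda j: A[j - 1] if j <= len(A) else np.zeros((K, K))
    Mi = lambda i: M[i - 1] if i <= len(M) else np.zeros((K, K))
    Phi = [np.eye(K)]
    for i in range(1, n + 1):
        Phi.append(Mi(i) + sum(Aj(j) @ Phi[i - j] for j in range(1, i + 1)))
    return Phi

A1 = np.array([[0.5, 0.1], [0.2, 0.3]])
M1 = np.array([[0.4, 0.0], [0.1, 0.2]])
Phi = ma_coeffs_from_varma([A1], [M1], 5)
# VARMA(1,1) check: Phi_1 = A1 + M1, Phi_2 = A1^2 + A1 M1
print(np.allclose(Phi[1], A1 + M1), np.allclose(Phi[2], A1 @ A1 + A1 @ M1))
```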
If the MA operator $M(L)$ satisfies the invertibility condition

$\det(I_K + M_1 z + \dots + M_q z^q) \ne 0$ for $|z| \le 1$

then the VARMA process is called invertible, and we have the pure VAR representation

$y_t = \nu^* + \sum_{i=1}^{\infty} \Pi_i y_{t-i} + u_t$, with $\nu^* = M(1)^{-1} \nu$

The $\Pi_i$ matrices are obtained by comparing coefficients in

$\Pi(L) = I_K - \sum_{i=1}^{\infty} \Pi_i L^i = M(L)^{-1} A(L)$
Multiplying by $M(L)$ from the left,

$(I_K + M_1 L + \dots + M_q L^q)\left(I_K - \sum_{i=1}^{\infty} \Pi_i L^i\right) = I_K - A_1 L - \dots - A_p L^p$

With $M_0 = I_K$, $M_i = 0$ for $i > q$, and $A_i = 0$ for $i > p$, comparing coefficients gives

$-A_i = M_i - \sum_{j=0}^{i-1} M_j \Pi_{i-j}$

or

$\Pi_i = A_i + M_i - \sum_{j=1}^{i-1} M_j \Pi_{i-j}$, $i = 1, 2, \dots$
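This recursion can be coded in the same style. A sketch with a hypothetical helper var_coeffs_from_varma:

```python
import numpy as np

def var_coeffs_from_varma(A, M, n):
    """Pi_i = A_i + M_i - sum_{j=1}^{i-1} M_j Pi_{i-j},
    with A_i = 0 for i > p and M_j = 0 for j > q.
    Returns [Pi_1, ..., Pi_n]."""
    K = A[0].shape[0]
    Ai = lambda i: A[i - 1] if i <= len(A) else np.zeros((K, K))
    Mj = lambda j: M[j - 1] if j <= len(M) else np.zeros((K, K))
    Pi = []
    for i in range(1, n + 1):
        Pi.append(Ai(i) + Mj(i) - sum(Mj(j) @ Pi[i - j - 1] for j in range(1, i)))
    return Pi

A1 = np.array([[0.5, 0.1], [0.2, 0.3]])
M1 = np.array([[0.4, 0.0], [0.1, 0.2]])
Pi = var_coeffs_from_varma([A1], [M1], 5)
print(np.allclose(Pi[0], A1 + M1))   # VARMA(1,1) check: Pi_1 = A1 + M1
```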
If $y_t$ is a stable and invertible VARMA process, then the pure MA representation

$y_t = \mu + \sum_{i=0}^{\infty} \Phi_i u_{t-i}$

is called the canonical or prediction-error MA representation.
5: VAR(1) representation of a VARMA process
Suppose $y_t$ has the VARMA(p,q) form and, for simplicity, $\nu = 0$. Define

$Y_t := \begin{bmatrix} y_t \\ \vdots \\ y_{t-p+1} \\ u_t \\ \vdots \\ u_{t-q+1} \end{bmatrix}$ $(K(p+q) \times 1)$, $\quad U_t := \begin{bmatrix} u_t \\ 0 \\ \vdots \\ 0 \\ u_t \\ 0 \\ \vdots \\ 0 \end{bmatrix}$ $(K(p+q) \times 1)$,

$\mathbf{A} := \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$ $(K(p+q) \times K(p+q))$

With this notation we get the VAR(1) process for $Y_t$:

$Y_t = \mathbf{A} Y_{t-1} + U_t$

where

$A_{11} := \begin{bmatrix} A_1 & \dots & A_{p-1} & A_p \\ I_K & & 0 & 0 \\ & \ddots & & \vdots \\ 0 & & I_K & 0 \end{bmatrix}$ $(Kp \times Kp)$, $\quad A_{12} := \begin{bmatrix} M_1 & \dots & M_{q-1} & M_q \\ 0 & \dots & 0 & 0 \\ \vdots & & \vdots & \vdots \\ 0 & \dots & 0 & 0 \end{bmatrix}$ $(Kp \times Kq)$,

$A_{21} := 0$ $(Kq \times Kp)$, $\quad A_{22} := \begin{bmatrix} 0 & \dots & 0 & 0 \\ I_K & & 0 & 0 \\ & \ddots & & \vdots \\ 0 & & I_K & 0 \end{bmatrix}$ $(Kq \times Kq)$
If the VAR order is one, we choose $p = 2$ and set $A_2 = 0$.
The VAR(1) process $Y_t$ is stable if and only if $y_t$ is stable, since

$\det(I_{K(p+q)} - \mathbf{A} z) = \det(I_{Kp} - A_{11} z) \det(I_{Kq} - A_{22} z) = \det(I_K - A_1 z - \dots - A_p z^p)$

The determinant can be found by the partitioned-matrix rule (see Appendix A.10).
Wold-type representation

If $y_t$, and hence $Y_t$, is stable, the VARMA(p,q) can be represented as an MA($\infty$):

$Y_t = \sum_{i=0}^{\infty} \mathbf{A}^i U_{t-i}$

Premultiplying by the $(K \times K(p+q))$ matrix $J := [I_K : 0 : \dots : 0]$, so that $y_t = J Y_t$, and writing $U_t = H u_t$, we get

$y_t = \sum_{i=0}^{\infty} J \mathbf{A}^i U_{t-i} = \sum_{i=0}^{\infty} J \mathbf{A}^i H u_{t-i}$

where

$H := \begin{bmatrix} I_K \\ 0 \\ \vdots \\ 0 \\ I_K \\ 0 \\ \vdots \\ 0 \end{bmatrix}$ $(K(p+q) \times K)$

with $I_K$ in the first and $(p+1)$-th blocks. Thus

$y_t = \sum_{i=0}^{\infty} \Phi_i u_{t-i}$, with $\Phi_i = J \mathbf{A}^i H$
Example: VARMA(1,1), $y_t = A_1 y_{t-1} + u_t + M_1 u_{t-1}$. For this process,

$Y_t = \begin{bmatrix} y_t \\ u_t \end{bmatrix}$, $\quad \mathbf{A} = \begin{bmatrix} A_1 & M_1 \\ 0 & 0 \end{bmatrix}$, $\quad H = \begin{bmatrix} I_K \\ I_K \end{bmatrix}$, $\quad J = [I_K : 0]$ $(K \times 2K)$

$\Phi_0 = J H = I_K$, $\quad \Phi_1 = J \mathbf{A} H = [A_1 : M_1] H = A_1 + M_1$,

$\Phi_2 = J \mathbf{A}^2 H = A_1^2 + A_1 M_1$, since $\mathbf{A}^2 = \begin{bmatrix} A_1^2 & A_1 M_1 \\ 0 & 0 \end{bmatrix}$,

and in general

$\Phi_i = J \mathbf{A}^i H = A_1^i + A_1^{i-1} M_1$, $i = 1, 2, \dots$
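The closed form can be checked against $\Phi_i = J \mathbf{A}^i H$ directly. A sketch with illustrative A1 and M1:

```python
import numpy as np

K = 2
A1 = np.array([[0.5, 0.1], [0.2, 0.3]])
M1 = np.array([[0.4, 0.0], [0.1, 0.2]])

# VAR(1) embedding of the VARMA(1,1): Y_t = [y_t', u_t']'
A = np.block([[A1, M1],
              [np.zeros((K, K)), np.zeros((K, K))]])
J = np.hstack([np.eye(K), np.zeros((K, K))])   # picks y_t out of Y_t
H = np.vstack([np.eye(K), np.eye(K)])          # maps u_t into U_t

for i in range(4):
    Phi_i = J @ np.linalg.matrix_power(A, i) @ H
    # closed form: Phi_0 = I_K, Phi_i = A1^i + A1^(i-1) M1 for i >= 1
    closed = np.eye(K) if i == 0 else (
        np.linalg.matrix_power(A1, i)
        + np.linalg.matrix_power(A1, i - 1) @ M1)
    assert np.allclose(Phi_i, closed)
```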
6: Autocovariance and Autocorrelation of VARMA (p,q)
The K-dimensional, zero-mean, stable VARMA(p,q) process is

$y_t = A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t + M_1 u_{t-1} + \dots + M_q u_{t-q}$

The autocovariances can be obtained from the pure MA representation. If $y_t$ has the canonical MA representation

$y_t = \sum_{i=0}^{\infty} \Phi_i u_{t-i}$

the autocovariances are

$\Gamma_y(h) = E(y_t y_{t-h}') = \sum_{i=0}^{\infty} \Phi_{h+i} \Sigma_u \Phi_i'$
Alternative method: It is more convenient to multiply the VARMA process by $y_{t-h}'$ and take expectations, which gives

$\Gamma_y(h) = A_1 \Gamma_y(h-1) + \dots + A_p \Gamma_y(h-p) + E(u_t y_{t-h}') + M_1 E(u_{t-1} y_{t-h}') + \dots + M_q E(u_{t-q} y_{t-h}')$

From the pure MA representation it can be seen that $E(u_t y_s') = 0$ for $s < t$.

Hence, for $h > q$,

$\Gamma_y(h) = A_1 \Gamma_y(h-1) + \dots + A_p \Gamma_y(h-p)$
For $h = 0$ (the initial matrices), the covariance matrix of the VAR(1) representation satisfies

$\Gamma_Y(0) = \mathbf{A} \Gamma_Y(0) \mathbf{A}' + \Sigma_U$

where $\Sigma_U = E(U_t U_t')$ is the covariance matrix of $U_t$. Applying the vec operator,

$\operatorname{vec} \Gamma_Y(0) = (I_{K^2(p+q)^2} - \mathbf{A} \otimes \mathbf{A})^{-1} \operatorname{vec}(\Sigma_U)$

The inverse exists because $Y_t$ is stable, i.e., all eigenvalues of $\mathbf{A}$ have modulus less than 1.
First evaluate $\Gamma_Y(0)$ from the given $\mathbf{A}$ and $\Sigma_U$.

The recursion is only valid for $h > q$, so computing all autocovariances this way requires $p > q$: $\Gamma_Y(0)$ contains only $\Gamma_y(0), \dots, \Gamma_y(p-1)$ as starting values.

If not, we add further lags of $y_t$ with zero coefficient matrices until the VAR order $p$ exceeds the MA order $q$.
Example: Consider a VARMA(1,1) process. Since $p = q$, we add a second lag of $y_t$:

$y_t = A_1 y_{t-1} + A_2 y_{t-2} + u_t + M_1 u_{t-1}$, with $A_2 = 0$. Now

$Y_t = \begin{bmatrix} y_t \\ y_{t-1} \\ u_t \end{bmatrix}$, $\quad \mathbf{A} = \begin{bmatrix} A_1 & 0 & M_1 \\ I_K & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$, $\quad U_t = \begin{bmatrix} u_t \\ 0 \\ u_t \end{bmatrix}$, $\quad \Sigma_U = \begin{bmatrix} \Sigma_u & 0 & \Sigma_u \\ 0 & 0 & 0 \\ \Sigma_u & 0 & \Sigma_u \end{bmatrix}$

$\operatorname{vec}(\Gamma_Y(0)) = (I_{9K^2} - \mathbf{A} \otimes \mathbf{A})^{-1} \operatorname{vec}(\Sigma_U)$

With the starting matrices in hand, the recursion may be applied:

$\Gamma_y(h) = A_1 \Gamma_y(h-1)$ for $h = 2, 3, \dots$
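The whole computation fits in a short script. A sketch for this VARMA(1,1) example with illustrative A1, M1, Sigma_u; note that vec stacks columns, hence the Fortran-order reshapes.

```python
import numpy as np

K = 2
A1 = np.array([[0.5, 0.1], [0.2, 0.3]])
M1 = np.array([[0.4, 0.0], [0.1, 0.2]])
Su = np.array([[1.0, 0.3], [0.3, 1.0]])
Z = np.zeros((K, K))

# p = q = 1, so add a second lag with A2 = 0; Y_t = [y_t', y_{t-1}', u_t']'
A = np.block([[A1,        Z, M1],
              [np.eye(K), Z, Z],
              [Z,         Z, Z]])
SU = np.block([[Su, Z, Su],
               [Z,  Z, Z],
               [Su, Z, Su]])

# vec(Gamma_Y(0)) = (I - A kron A)^{-1} vec(Sigma_U)
n = A.shape[0]
vecG0 = np.linalg.solve(np.eye(n**2) - np.kron(A, A), SU.flatten(order="F"))
Gamma_Y0 = vecG0.reshape(n, n, order="F")
Gamma_y0 = Gamma_Y0[:K, :K]            # Gamma_y(0): top-left block
Gamma_y1 = Gamma_Y0[:K, K:2 * K]       # Gamma_y(1) = E(y_t y_{t-1}')

# recursion Gamma_y(h) = A1 Gamma_y(h-1) for h = 2, 3, ...
gammas = [Gamma_y0, Gamma_y1]
for h in range(2, 6):
    gammas.append(A1 @ gammas[-1])
```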
The autocorrelations of the VARMA(p,q) process are

$R_y(h) = D^{-1} \Gamma_y(h) D^{-1}$

where $D$ is the diagonal matrix with the square roots of the diagonal elements of $\Gamma_y(0)$ on the main diagonal.
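As a final sketch, the standardization step as a small hypothetical helper, which can be applied to the list of autocovariances computed above:

```python
import numpy as np

def autocorrelations(gammas):
    """R_y(h) = D^{-1} Gamma_y(h) D^{-1}, where D holds the square roots of the
    diagonal of Gamma_y(0). `gammas` is the list [Gamma_y(0), Gamma_y(1), ...]."""
    D_inv = np.diag(1.0 / np.sqrt(np.diag(gammas[0])))
    return [D_inv @ G @ D_inv for G in gammas]
```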
References
Hannan, E. J. & Deistler, M. (1988). The Statistical Theory of Linear Systems. John Wiley, New York.

Lütkepohl, H. (2005). New Introduction to Multiple Time Series Analysis. Springer, Berlin.

Wei, W. W. S. (2005). Time Series Analysis: Univariate and Multivariate Methods, 2nd edition. Pearson (Addison Wesley).
Thank You