Vector Auto Regressions


Vector autoregressions

Based on the book ‘New Introduction to Multiple Time Series Analysis’ by Helmut Lütkepohl

Robert M. Kunst, [email protected]

University of Vienna

and Institute for Advanced Studies Vienna

November 23, 2011


Outline

Introduction

Stable VAR Processes
  Basic assumptions and properties
  Forecasting
  Structural VAR analysis

Vector error correction models
  Univariate integrated processes
  Integrated VAR processes
  Cointegrated VAR processes
  Deterministics in the VECM
  Causality and impulse response analysis


Review: univariate integrated AR processes

If a univariate autoregressive process

$$y_t = \nu + a_1 y_{t-1} + \ldots + a_p y_{t-p} + u_t$$

has inverted roots of its characteristic polynomial $\lambda_1, \ldots, \lambda_p$ such that $d$ roots are exactly one and the remaining roots fulfil $|\lambda_j| < 1$, then it is unstable, whereas its $d$-th difference

$$\Delta^d y_t = (1 - L)^d y_t$$

is stable. It is then called integrated of order $d$, or I(d).

$d = 1$ is the most important case. The random walk is I(1).
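As an illustration that is not part of the original slides, the following numpy sketch simulates a random walk (an I(1) process) and its first difference; sample size and seed are arbitrary. The level wanders, while the differenced series behaves like the stationary white-noise input.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500

# Random walk: y_t = y_{t-1} + u_t, an I(1) process.
u = rng.standard_normal(T)
y = np.cumsum(u)

# Its first difference recovers the white-noise input, a stable series.
dy = np.diff(y)

# The level drifts far from zero, the difference does not.
print("sample variance of y :", y.var())
print("sample variance of dy:", dy.var())
```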


Univariate integrated processes

More general definitions of integrated processes do not assume the process to be autoregressive. Then, a process is called I($d$) iff its $d$-th difference has a Wold-type MA representation

$$\Delta^d y_t = \sum_{j=0}^{\infty} \theta_j u_{t-j}$$

with $\theta(1) = \sum_{j=0}^{\infty} \theta_j \neq 0$ and $\sum_{j=0}^{\infty} j\,|\theta_j| < \infty$.

The flanking conditions guarantee uniqueness of $d$ and convergence of the so-called Beveridge-Nelson decomposition. Other authors may require different flanking conditions.


The Beveridge-Nelson decomposition

One can show that every I(1) process has a unique representation of the form

$$y_t = y_0 + \theta(1) \sum_{s=1}^{t} u_s + \sum_{j=0}^{\infty} \theta^*_j u_{t-j} - w^*_0,$$

with $\theta^*_j = -\sum_{i=j+1}^{\infty} \theta_i$ and $w^*_0 = \sum_{j=0}^{\infty} \theta^*_j u_{-j}$ reflecting starting conditions.

Essentially, any I(1) process can be decomposed into a random walk and a stable remainder. A comparable decomposition for multivariate processes was established by the Granger representation theorem.
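As a concrete check that is not part of the slides, the sketch below builds an I(1) series whose first difference is the MA(1) $\Delta y_t = u_t + \theta u_{t-1}$, so that $\theta(1) = 1 + \theta$, $\theta^*_0 = -\theta$ and $\theta^*_j = 0$ for $j \ge 1$, and verifies numerically that the Beveridge-Nelson pieces reproduce $y_t$; the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
T, theta = 400, 0.6

# I(1) process whose first difference is the MA(1):  dy_t = u_t + theta * u_{t-1}.
u = rng.standard_normal(T + 1)                 # u_0, ..., u_T
dy = u[1:] + theta * u[:-1]                    # dy_1, ..., dy_T
y = np.concatenate(([0.0], np.cumsum(dy)))     # y_0 = 0

# Beveridge-Nelson pieces: theta(1) = 1 + theta, theta*_0 = -theta, theta*_j = 0 (j >= 1),
# and w*_0 = theta*_0 * u_0 reflects the starting condition.
permanent = (1.0 + theta) * np.cumsum(u[1:])   # random-walk component  theta(1) * sum u_s
transitory = -theta * u[1:]                    # stable remainder  sum_j theta*_j u_{t-j}
w0 = -theta * u[0]

y_bn = y[0] + permanent + transitory - w0
print(np.allclose(y[1:], y_bn))                # True: the decomposition reproduces y_t
```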


Integrated VAR processes: the concept

Consider a VAR without intercept

$$A(L) y_t = u_t$$

with the characteristic polynomial $A(z) = I_K - A_1 z - \ldots - A_p z^p$. Formally, any square matrix $A$ satisfies $A^{adj} A = |A| I$, where $A^{adj}$ denotes its adjoint (adjugate) matrix. Premultiplying the VAR by $A(L)^{adj}$, the representation

$$|A(L)| y_t = A(L)^{adj} u_t$$

always works. If $m$ roots of $|A(z)|$ equal one and the remaining ones are larger than one in modulus, i.e.

$$|A(L)| = (1 - L)^m \alpha(L),$$

with $\alpha(\cdot)$ invertible, the process $y_t$ is called integrated. However, $m$ need not be the desired $d$.


Two role-model examples for integrated VAR processes

The VAR(1)

$$y_t = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} y_{t-1} + u_t$$

has determinant $(1 - z)^2$. Here, $y_t$ should be I(1).

The VAR(2)

$$y_t = \begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix} y_{t-1} + \begin{pmatrix} -1 & 0 \\ 0 & 0 \end{pmatrix} y_{t-2} + u_t$$

has determinant $(1 - z)^2$. Here, $y_t$ should be I(2).
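The determinants can be checked numerically. The sketch below (illustrative only; `det_poly_2x2` is a hypothetical helper) builds $|A(z)|$ for both role models and confirms a double root at $z = 1$ in each case, even though one process is I(1) and the other I(2) — which is exactly why the number of unit roots of the determinant need not equal $d$.

```python
import numpy as np
from numpy.polynomial import Polynomial as P

def det_poly_2x2(A1, A2=None):
    """Determinant of A(z) = I - A1*z - A2*z^2 for a bivariate VAR, as a polynomial in z."""
    z = P([0.0, 1.0])
    A2 = np.zeros((2, 2)) if A2 is None else A2
    a = [[P([float(i == j)]) - float(A1[i, j]) * z - float(A2[i, j]) * z**2
          for j in range(2)] for i in range(2)]
    return a[0][0] * a[1][1] - a[0][1] * a[1][0]

# Role model 1: VAR(1) with A1 = I  ->  |A(z)| = (1 - z)^2, process is I(1).
d1 = det_poly_2x2(np.eye(2))
# Role model 2: VAR(2) with A1 = diag(2, 0), A2 = diag(-1, 0)  ->  same determinant, but I(2).
d2 = det_poly_2x2(np.diag([2.0, 0.0]), np.diag([-1.0, 0.0]))

print(d1.roots(), d2.roots())   # both show a double root at z = 1 (up to rounding)
```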


Integration order for vector autoregressions

Definition
A vector autoregressive process $(y_t)$ (whose determinant has only roots at one or outside the unit circle) is called integrated of order $d$, or I(d), iff $\Delta^d y_t$ is stable but $\Delta^{d-1} y_t$ is not stable.

Remark: Usually, the differenced processes do not have finite-order VAR representations. Some components often have individually lower integration order than $d$ and may also be stationary.


Cointegrated VAR processes: the concept

Definition
Let $(y_t)$ be a vector autoregressive process of order $p$ that is integrated of order $d$. It is called cointegrated of order $(d, b)$, or CI(d,b), iff there exists a linear combination $z_t = \beta' y_t$ with $\beta = (\beta_1, \ldots, \beta_K)' \neq 0$ such that $z_t$ is I($d - b$).

Remarks:

◮ If a component of $y_t$, say the first, is integrated of lower order than $d$, the conditions are fulfilled for $\beta = (1, 0, \ldots, 0)'$: there is self-cointegration, although it may not correspond to the original idea;

◮ The case CI(1,1) is the most important one;

◮ The linear combination $\beta$ is called a cointegrating vector.


Cointegrated VAR: role-model case

Consider the VAR(1)

$$y_t = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix} y_{t-1} + u_t,$$

which has determinant $(1 - z)$ and is I(1). The combination $(1, -1) y_t = y_{1,t} - y_{2,t}$ is white noise, and $\beta = (1, -1)'$ is a cointegrating vector. Note the equivalent representation

$$\Delta y_t = \begin{pmatrix} 0 & 0 \\ 1 & -1 \end{pmatrix} y_{t-1} + u_t,$$

which shows the vector $\beta$ in the coefficient matrix.
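A short simulation (not from the slides; seed and sample size are arbitrary) makes the role model tangible: both components wander, but the combination $y_{1,t} - y_{2,t}$ is just $u_{1,t} - u_{2,t}$ and stays bounded.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 500
A1 = np.array([[1.0, 0.0],
               [1.0, 0.0]])

# Simulate y_t = A1 y_{t-1} + u_t: both components are driven by the same random walk.
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A1 @ y[t - 1] + rng.standard_normal(2)

# The combination z_t = y_{1,t} - y_{2,t} = u_{1,t} - u_{2,t} is white noise,
# so beta = (1, -1)' is a cointegrating vector.
z = y[:, 0] - y[:, 1]
print("variance of y_1:", y[:, 0].var(), "  variance of z:", z.var())
```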


Properties of cointegrating vectors

◮ Cointegrating vectors are not unique. Even when there is only one cointegrating vector, all multiples will also cointegrate;

◮ If there are several linearly independent cointegrating vectors, all their linear combinations also cointegrate. The cointegrating vectors can be seen as a basis of a linear cointegrating space;

◮ The dimension of the cointegrating space, the cointegrating rank, is unique.


The vector error-correction model

Any VAR($p$) model $y_t = A_1 y_{t-1} + \ldots + A_p y_{t-p} + u_t$ can be re-written in the error-correction form (vector error-correction model, VECM)

$$\Delta y_t = \Pi y_{t-1} + \sum_{j=1}^{p-1} \Gamma_j \Delta y_{t-j} + u_t,$$

with a one-to-one functional relation between the parameters $(A_1, \ldots, A_p)$ and $(\Pi, \Gamma_1, \ldots, \Gamma_{p-1})$. In particular,

$$\Pi = -(I - A_1 - \ldots - A_p), \qquad \Gamma_j = -(A_{j+1} + \ldots + A_p).$$
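A minimal sketch of this one-to-one mapping, assuming only the two formulas above (`var_to_vecm` is a hypothetical helper name):

```python
import numpy as np

def var_to_vecm(A):
    """Map VAR coefficients (A_1, ..., A_p) to VECM parameters (Pi, Gamma_1, ..., Gamma_{p-1})
    via Pi = -(I - A_1 - ... - A_p) and Gamma_j = -(A_{j+1} + ... + A_p)."""
    A = [np.asarray(Ai, dtype=float) for Ai in A]
    K, p = A[0].shape[0], len(A)
    Pi = -(np.eye(K) - sum(A))
    Gammas = [-sum(A[j + 1:], np.zeros((K, K))) for j in range(p - 1)]
    return Pi, Gammas

# The cointegrated role-model VAR(1):  y_t = [[1, 0], [1, 0]] y_{t-1} + u_t
Pi, Gammas = var_to_vecm([np.array([[1.0, 0.0], [1.0, 0.0]])])
print(Pi)      # [[0, 0], [1, -1]] -- the reduced-rank matrix seen in the role-model VECM
```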


Cointegration in the VECM

If $(y_t)$ is integrated, $\Pi$ will have reduced rank $r < K$. Any matrix of rank $r < K$ can be decomposed as

$$\Pi = \alpha \beta',$$

with $\alpha$ and $\beta$ of rank $r$ and dimension $K \times r$. Thus,

$$\Delta y_t = \alpha \beta' y_{t-1} + \sum_{j=1}^{p-1} \Gamma_j \Delta y_{t-j} + u_t,$$

and $\beta$ contains the cointegrating vectors if $(y_t)$ is CI(1,1). Note that all terms $\Delta y_{t-j}$ ($j = 0, \ldots, p-1$), $u_t$, and $\beta' y_{t-1}$ are stationary. $\alpha$ is called the loading matrix.
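Because only the product $\alpha\beta'$ is identified, any rank-$r$ factorization of $\Pi$ will do. The sketch below (illustrative; the SVD is just one convenient choice, and `rank_factorization` is a hypothetical helper) factors the role-model $\Pi$ and recovers a cointegrating vector proportional to $(1, -1)'$.

```python
import numpy as np

def rank_factorization(Pi, r):
    """One of many factorizations Pi = alpha @ beta.T with alpha, beta of dimension K x r.
    Any other choice differs only by a nonsingular r x r transformation."""
    U, s, Vt = np.linalg.svd(Pi)
    alpha = U[:, :r] * s[:r]      # K x r
    beta = Vt[:r].T               # K x r
    return alpha, beta

Pi = np.array([[0.0, 0.0],
               [1.0, -1.0]])      # role-model VECM matrix, rank 1
alpha, beta = rank_factorization(Pi, r=1)
print(np.allclose(alpha @ beta.T, Pi))   # True
print(beta / beta[0])                    # rescaled: proportional to (1, -1)'
```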


A suggested normalization

While $\Pi$ is unique, the decomposition and hence $\alpha$ and $\beta$ are not. A potential normalization is to impose unit length on all columns of $\beta$. Alternatively, normalize $\beta$ to the form

$$\beta = \begin{pmatrix} I_r \\ \beta_{K-r} \end{pmatrix}.$$

With self-cointegration, the corresponding column in $\beta$ is a unit vector. Stationary component variables of $y_t$ should be among the first $r$ positions.


The economic interpretation

◮ Economists see the vectors in $\beta$ as dynamic equilibrium conditions: the stationary islands in a wild integrated sea;

◮ Loading matrices describe how component variables react to deviations from equilibrium: $\alpha_{ij}$ shows how component $i$ reacts to deviations from equilibrium condition $j$;

◮ $\Gamma_j$ represent short-run dynamics in the system (inertia).


Granger’s Representation Theorem

Proposition

Suppose 

$$\Delta y_t = \alpha \beta' y_{t-1} + \sum_{j=1}^{p-1} \Gamma_j \Delta y_{t-j} + u_t, \qquad t = 1, 2, \ldots,$$

with $y_t = u_t = 0$ for $t \le 0$, and $u_t$ white noise for $t > 0$. The corresponding polynomial $C(z)$ is defined by

$$C(z) = (1 - z) I_K - \alpha \beta' z - \sum_{j=1}^{p-1} \Gamma_j (1 - z) z^j.$$

Assume (a) $|C(z)| = 0$ implies $|z| > 1$ or $z = 1$; (b) the number of unit roots is $K - r$; (c) $\alpha$ and $\beta$ have dimension $K \times r$ and rank $r$. Then $y_t$ can be represented as

$$y_t = \Xi \sum_{j=1}^{t} u_j + \Xi^*(L) u_t + y^*_0.$$


Some details for the representation theorem

$$\Xi = \beta_\perp \left[ \alpha_\perp' \left( I_K - \sum_{j=1}^{p-1} \Gamma_j \right) \beta_\perp \right]^{-1} \alpha_\perp'$$

describes the ‘weight’ of the pure $(K - r)$-dimensional random walk in the multivariate Beveridge-Nelson decomposition. $y^*_0$ contains initial values. $\Xi^*(L) u_t$ is a convergent stable process, with the $\Xi^*_j$ some (complicated) functions of the original parameters.
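A numpy sketch of this formula (helper names are hypothetical; the orthogonal complements $\alpha_\perp, \beta_\perp$ are taken from the SVD, and any basis choice yields the same $\Xi$):

```python
import numpy as np

def orth_complement(M):
    """Orthonormal basis of the orthogonal complement of the column space of M (K x r)."""
    U, _, _ = np.linalg.svd(M, full_matrices=True)
    return U[:, M.shape[1]:]

def granger_xi(alpha, beta, Gammas):
    """Xi = beta_perp [alpha_perp' (I_K - sum_j Gamma_j) beta_perp]^{-1} alpha_perp'."""
    K = alpha.shape[0]
    a_perp, b_perp = orth_complement(alpha), orth_complement(beta)
    middle = a_perp.T @ (np.eye(K) - sum(Gammas, np.zeros((K, K)))) @ b_perp
    return b_perp @ np.linalg.inv(middle) @ a_perp.T

# Role-model VECM:  Delta y_t = alpha beta' y_{t-1} + u_t  with alpha = (0, 1)', beta = (1, -1)'.
alpha = np.array([[0.0], [1.0]])
beta = np.array([[1.0], [-1.0]])
print(granger_xi(alpha, beta, []))   # [[1, 0], [1, 0]]: one common stochastic trend
```

For the role model, both rows of $\Xi$ load on the single common random walk, consistent with $K - r = 1$ stochastic trend.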


Deterministic terms in the VECM

Suppose

$$y_t = \mu_t + x_t,$$

with $x_t$ non-deterministic and (1) $\mu_t = \mu_0$ or (2) $\mu_t = \mu_0 + \mu_1 t$. For (1), substitute into the VECM

$$\Delta y_t = \alpha \beta' (y_{t-1} - \mu_0) + \sum_{j=1}^{p-1} \Gamma_j \Delta y_{t-j} + u_t,$$

as $\Delta \mu_0 = 0$. The term $\beta'(y_{t-1} - \mu_0)$ can be re-written as

$$(\beta' : -\beta'\mu_0) \begin{pmatrix} y_{t-1} \\ 1 \end{pmatrix}.$$

Formally, this looks as if $y$ were cointegrated with the constant 1. Note that $y$ is not trending here.


The trending case of the VECM

Suppose case (2), i.e. $\mu_t = \mu_0 + \mu_1 t$. Substitute into the VECM

$$\Delta y_t - \mu_1 = \alpha \beta' \{ y_{t-1} - \mu_0 - \mu_1 (t-1) \} + \sum_{j=1}^{p-1} \Gamma_j (\Delta y_{t-j} - \mu_1) + u_t,$$

as $\Delta(\mu_1 t) = \mu_1$. The term $\beta'\{ y_{t-1} - \mu_0 - \mu_1 (t-1) \}$ can be re-written as

$$(\beta' : -\beta'\mu_1) \begin{pmatrix} y_{t-1} \\ t-1 \end{pmatrix} - \beta'\mu_0.$$

Formally, this looks as if $y$ were cointegrated with the linear time trend. Note that $y$ is I(1) plus linear trend here, while $\beta' y$ is trend-stationary but not stationary.


The trending case of the VECM without explicit t 

If $\beta'\mu_1 = 0$, then the corresponding term disappears and the system can be written as

$$\Delta y_t = \nu + \alpha \beta' y_{t-1} + \sum_{j=1}^{p-1} \Gamma_j \Delta y_{t-j} + u_t,$$

with $\nu = (I_K - \Gamma_1 - \ldots - \Gamma_{p-1}) \mu_1 - \alpha \beta' \mu_0$. This is the VECM that is most often estimated for trending variables. $y$ is I(1) plus linear trend, while $\beta' y$ is stationary.

It is debatable whether the restriction relative to the previous model is natural in applications. A cointegrating vector that yields a trend-stationary variable instead of a stationary one may be of little interest.


Dynamic analysis in the VECM

Forecasting, causality, impulse response analysis: what changes in an integrated VAR relative to the stable VAR?

◮ Forecasting: the same calculus applies; the prediction variance increases without finite bounds as the horizon h increases, while the prediction variance in the cointegrating directions remains bounded;

◮ Causality: there are two sources of causality in a cointegrated VAR, short-run and long-run causality;

◮ IRF: response functions approach a non-zero constant in all integrated directions, while they approach zero for cointegrated directions, including potential stationary components (self-cointegration).


Causality analysis in the VECM

For two sub-vectors $y = (z', x')'$, consider the partitioned representation of the VECM

$$\begin{pmatrix} \Delta z_t \\ \Delta x_t \end{pmatrix} = \begin{pmatrix} \Pi_{11} & \Pi_{12} \\ \Pi_{21} & \Pi_{22} \end{pmatrix} \begin{pmatrix} z_{t-1} \\ x_{t-1} \end{pmatrix} + \sum_{j=1}^{p-1} \begin{pmatrix} \Gamma_{11,j} & \Gamma_{12,j} \\ \Gamma_{21,j} & \Gamma_{22,j} \end{pmatrix} \begin{pmatrix} \Delta z_{t-j} \\ \Delta x_{t-j} \end{pmatrix} + u_t.$$

Absence of causality from $x$ to $z$ requires $\Pi_{12} = 0$ and $\Gamma_{12,j} = 0$, $j = 1, \ldots, p-1$. The former condition means that $z$ does not adjust to the disequilibrium $\beta' y$, which may include $x$; the latter means that $\Delta z$ is unaffected by lagged ‘short-run’ effects in $\Delta x$.
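As a purely mechanical illustration (not a statistical test, and the helper name is hypothetical), the restriction can be read off the relevant blocks of $\Pi$ and the $\Gamma_j$:

```python
import numpy as np

def no_causality_x_to_z(Pi, Gammas, k_z, tol=1e-8):
    """Check the zero restrictions for absence of causality from x to z:
    the upper-right blocks Pi_12 and Gamma_12,j must all vanish (here: be numerically zero).
    k_z is the dimension of the z sub-vector."""
    blocks = [Pi[:k_z, k_z:]] + [G[:k_z, k_z:] for G in Gammas]
    return all(np.max(np.abs(B)) < tol for B in blocks)

# K = 2 with z = y_1 and x = y_2, using the role-model Pi from the earlier slide:
Pi = np.array([[0.0, 0.0],
               [1.0, -1.0]])
print(no_causality_x_to_z(Pi, [], k_z=1))   # True: Pi_12 = 0, so x does not cause z here
```

In an estimated system one would of course test these restrictions formally rather than check exact zeros.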
