Lesson 17: Vector AutoRegressive Models


  • Lesson 17: Vector AutoRegressive Models

    Umberto Triacca

    Dipartimento di Ingegneria e Scienze dell'Informazione e Matematica, Università dell'Aquila

    [email protected]


  • Vector AutoRegressive models

    The extension of ARMA models into a multivariate framework leads to:

    Vector AutoRegressive (VAR) models

    Vector Moving Average (VMA) models

    Vector AutoRegressive Moving Average (VARMA) models


  • Vector AutoRegressive models

    The Vector AutoRegressive (VAR) models,

    made famous by Chris Sims's paper "Macroeconomics and Reality", Econometrica, 1980,

    are among the most widely applied models in empirical economics.


  • Vector AutoRegressive models

    Let $\{y_t = (y_{1t}, \dots, y_{Kt})'; \; t \in \mathbb{Z}\}$ be a $K$-variate random process.


  • Vector AutoRegressive models

    We say that the process $\{y_t; \; t \in \mathbb{Z}\}$ follows a vector autoregressive model of order $p$, denoted VAR(p), if

    $$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t, \qquad t \in \mathbb{Z},$$

    where $p$ is a positive integer, the $A_i$ are fixed $(K \times K)$ coefficient matrices, $\nu = (\nu_1, \dots, \nu_K)'$ is a fixed $(K \times 1)$ vector of intercept terms, and $u_t = (u_{1t}, \dots, u_{Kt})'$ is a $K$-dimensional white noise with covariance matrix $\Sigma_u$.

    The covariance matrix $\Sigma_u$ is assumed to be nonsingular.


  • Example: Bivariate VAR(1)

    $$\begin{bmatrix} y_{1t} \\ y_{2t} \end{bmatrix} =
      \begin{bmatrix} \nu_1 \\ \nu_2 \end{bmatrix} +
      \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}
      \begin{bmatrix} y_{1,t-1} \\ y_{2,t-1} \end{bmatrix} +
      \begin{bmatrix} u_{1t} \\ u_{2t} \end{bmatrix}$$

    or

    $$y_{1t} = \nu_1 + a_{11} y_{1,t-1} + a_{12} y_{2,t-1} + u_{1t}$$
    $$y_{2t} = \nu_2 + a_{21} y_{1,t-1} + a_{22} y_{2,t-1} + u_{2t}$$

    where $\mathrm{cov}(u_{1t}, u_{2s}) = \sigma_{12}$ for $t = s$ and $0$ otherwise.
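    As a concrete illustration, here is a minimal numpy sketch that simulates a bivariate VAR(1) of this form; the intercepts, coefficient matrix, and error covariance below are arbitrary values chosen for the example, not taken from these slides.

```python
import numpy as np

# Hypothetical parameters for the bivariate VAR(1)  y_t = nu + A y_{t-1} + u_t
nu = np.array([0.5, 1.0])                     # intercept vector (nu_1, nu_2)'
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])                    # coefficient matrix (a_ij)
Sigma_u = np.array([[1.0, 0.3],
                    [0.3, 1.0]])              # nonsingular error covariance

rng = np.random.default_rng(0)
T = 200
u = rng.multivariate_normal(np.zeros(2), Sigma_u, size=T)   # white-noise errors

y = np.zeros((T + 1, 2))                      # row 0 holds the presample value y_0
for t in range(1, T + 1):
    y[t] = nu + A @ y[t - 1] + u[t - 1]       # VAR(1) recursion
```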


  • Advantages of VAR models

    1 They are easy to estimate.

    2 They have good forecasting capabilities.

    3 The researcher does not need to specify which variables are endogenous or exogenous: all variables are treated as endogenous.

    4 In a VAR system it is very easy to test for Granger non-causality.


  • Problems with VARs

    1 They involve many parameters. If there are K equations, one for each of the K variables, with p lags of each variable in each equation, then each equation has 1 + pK coefficients and the system has K(1 + pK) = K + pK² parameters to estimate; with K = 5 and p = 4, for instance, that is already 105 parameters.

    2 VARs are a-theoretical, since they use little economic theory. Thus VARs cannot be used to obtain economic policy prescriptions.


  • Estimation

    In this lesson, the estimation of a vector autoregressive model is discussed.


  • Estimation

    Consider the VAR(1)

    $$y_{1t} = \nu_1 + a_{11} y_{1,t-1} + a_{12} y_{2,t-1} + u_{1t}$$
    $$y_{2t} = \nu_2 + a_{21} y_{1,t-1} + a_{22} y_{2,t-1} + u_{2t}$$

    where $\mathrm{cov}(u_{1t}, u_{2s}) = \sigma_{12}$ for $t = s$ and $0$ otherwise.


  • Estimation

    The model corresponds to 2 regressions with different dependent variables and identical explanatory variables.

    We could estimate this model using the ordinary least squares (OLS) estimator applied separately to each equation.


  • Estimation

    It is assumed that a time series

    $$y_1 = [y_{11}, y_{21}]', \; \dots, \; y_T = [y_{1T}, y_{2T}]'$$

    of the $y$ variables is available. In addition, a presample value

    $$y_0 = [y_{10}, y_{20}]'$$

    is assumed to be available.


  • Estimation

    Consider the first equation

    $$y_{1t} = \nu_1 + a_{11} y_{1,t-1} + a_{12} y_{2,t-1} + u_{1t}, \qquad t = 1, \dots, T,$$

    that is,

    $$y_{11} = \nu_1 + a_{11} y_{10} + a_{12} y_{20} + u_{11}$$
    $$y_{12} = \nu_1 + a_{11} y_{11} + a_{12} y_{21} + u_{12}$$
    $$\vdots$$
    $$y_{1T} = \nu_1 + a_{11} y_{1,T-1} + a_{12} y_{2,T-1} + u_{1T}$$


  • Estimation

    We define

    $$y_1 = [y_{11}, \dots, y_{1T}]', \qquad
      X = \begin{bmatrix} 1 & y_{1,0} & y_{2,0} \\ 1 & y_{1,1} & y_{2,1} \\ \vdots & \vdots & \vdots \\ 1 & y_{1,T-1} & y_{2,T-1} \end{bmatrix}, \qquad
      \beta_1 = [\nu_1, a_{11}, a_{12}]', \qquad
      u_1 = [u_{11}, \dots, u_{1T}]'.$$


  • Estimation

    Thus

    $$y_1 = X\beta_1 + u_1.$$

    The OLS estimator $\hat{\beta}_1$ is given by

    $$\hat{\beta}_1 = (X'X)^{-1}X'y_1$$
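    A minimal numpy sketch of this estimator, continuing the simulated series y from the earlier example (the variable names are the example's own, not the slides'); since both equations share the same regressor matrix, the second-equation estimator is obtained in the same way.

```python
import numpy as np

# Regressor matrix X: a column of ones and the lagged values y_{1,t-1}, y_{2,t-1}
# for t = 1, ..., T (the first row uses the presample value y_0).
X = np.column_stack([np.ones(T), y[:-1, 0], y[:-1, 1]])
y1 = y[1:, 0]                                    # observations on the first variable
y2 = y[1:, 1]                                    # observations on the second variable

# OLS estimators; np.linalg.solve is used instead of forming (X'X)^{-1} explicitly.
beta1_hat = np.linalg.solve(X.T @ X, X.T @ y1)   # [nu_1_hat, a11_hat, a12_hat]
beta2_hat = np.linalg.solve(X.T @ X, X.T @ y2)   # [nu_2_hat, a21_hat, a22_hat]
```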


  • Estimation

    Consider the second equation

    $$y_{2t} = \nu_2 + a_{21} y_{1,t-1} + a_{22} y_{2,t-1} + u_{2t}, \qquad t = 1, \dots, T,$$

    that is,

    $$y_{21} = \nu_2 + a_{21} y_{10} + a_{22} y_{20} + u_{21}$$
    $$y_{22} = \nu_2 + a_{21} y_{11} + a_{22} y_{21} + u_{22}$$
    $$\vdots$$
    $$y_{2T} = \nu_2 + a_{21} y_{1,T-1} + a_{22} y_{2,T-1} + u_{2T}$$


  • Estimation

    We define

    $$y_2 = [y_{21}, \dots, y_{2T}]', \qquad
      \beta_2 = [\nu_2, a_{21}, a_{22}]', \qquad
      u_2 = [u_{21}, \dots, u_{2T}]'.$$


  • Estimation

    Thus

    $$y_2 = X\beta_2 + u_2.$$

    The OLS estimator $\hat{\beta}_2$ is given by

    $$\hat{\beta}_2 = (X'X)^{-1}X'y_2$$


  • OLS estimators

    $$\hat{\beta}_1 = (X'X)^{-1}X'y_1, \qquad \hat{\beta}_1 = [\hat{\nu}_1, \hat{a}_{11}, \hat{a}_{12}]'$$

    $$\hat{\beta}_2 = (X'X)^{-1}X'y_2, \qquad \hat{\beta}_2 = [\hat{\nu}_2, \hat{a}_{21}, \hat{a}_{22}]'$$


  • OLS Estimation

    Are the OLS estimators of $\beta_1$ and $\beta_2$ efficient estimators?


  • Seemingly Unrelated Regressions Estimation

    Recall that

    $$\mathrm{cov}(u_{1t}, u_{2t}) = \sigma_{12} \neq 0.$$

    When such contemporaneous correlation exists, it may be more efficient to estimate all equations jointly, rather than to estimate each one separately by least squares.

    The appropriate joint estimation technique is generalized least squares (GLS).


  • Seemingly Unrelated Regressions Equations

    Let us consider the following set of two equations:

    $$y_{1t} = \beta_{10} + \beta_{11} x_{1t} + \beta_{12} z_{1t} + u_{1t}$$
    $$y_{2t} = \beta_{20} + \beta_{21} x_{2t} + \beta_{22} z_{2t} + u_{2t}$$


  • Seemingly Unrelated Regressions Equations

    The two equations can be written in the usual matrix algebra notation as

    $$y_1 = X_1\beta_1 + u_1$$
    $$y_2 = X_2\beta_2 + u_2$$


  • Seemingly Unrelated Regressions Equations

    where

    $$y_i = [y_{i1}, \dots, y_{iT}]', \qquad
      X_i = \begin{bmatrix} 1 & x_{i,1} & z_{i,1} \\ 1 & x_{i,2} & z_{i,2} \\ \vdots & \vdots & \vdots \\ 1 & x_{i,T} & z_{i,T} \end{bmatrix}, \qquad
      \beta_i = [\beta_{i0}, \beta_{i1}, \beta_{i2}]', \qquad
      u_i = [u_{i1}, \dots, u_{iT}]', \qquad i = 1, 2.$$


  • Seemingly Unrelated Regressions Equations

    Further, we assume that

    1 $E[u_{it}] = 0$ for $i = 1, 2$ and $t = 1, 2, \dots, T$;

    2 $\mathrm{var}(u_{it}) = \sigma_i^2$ for $i = 1, 2$ and $t = 1, 2, \dots, T$;

    3 $E[u_{it} u_{jt}] = \sigma_{ij}$ for $i, j = 1, 2$;

    4 $E[u_{it} u_{js}] = 0$ for $t \neq s$ and $i, j = 1, 2$.


  • Seemingly Unrelated Regressions Equations

    Now we write the two equations as the following "super model":

    $$\begin{bmatrix} y_1 \\ y_2 \end{bmatrix} =
      \begin{bmatrix} X_1 & 0 \\ 0 & X_2 \end{bmatrix}
      \begin{bmatrix} \beta_1 \\ \beta_2 \end{bmatrix} +
      \begin{bmatrix} u_1 \\ u_2 \end{bmatrix}$$

    or

    $$y = X\beta + u.$$

    The first point to note is that it can be shown that least squares applied to this system is identical to applying least squares separately to each of the two equations.


  • Seemingly Unrelated Regressions Equations

    Note that the covariance matrix of the joint disturbance vector $u$ is given by

    $$\Omega =
      \begin{bmatrix} \sigma_1^2 I & \sigma_{12} I \\ \sigma_{12} I & \sigma_2^2 I \end{bmatrix} =
      \begin{bmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{12} & \sigma_2^2 \end{bmatrix} \otimes I_T
      = \Sigma \otimes I_T$$


  • Seemingly Unrelated Regressions Equations

    The disturbance covariance matrix $\Omega$ has dimension $(2T \times 2T)$. An important point to note is that $\Omega$ cannot be written as a scalar $\sigma^2$ multiplied by a $2T$-dimensional identity matrix. Thus the best linear unbiased estimator of $\beta$ is given by the generalized least squares estimator

    $$\hat{\beta}_{GLS} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y = [X'(\Sigma^{-1} \otimes I_T)X]^{-1}X'(\Sigma^{-1} \otimes I_T)y.$$

    It has lower variance than the least squares estimator of $\beta$ because it takes into account the contemporaneous correlation between the disturbances in different equations.
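    A small numpy sketch of this formula, written as a hypothetical helper that takes the two regressor matrices, the two dependent-variable vectors, and the contemporaneous covariance $\Sigma$ (treated as known here):

```python
import numpy as np

def sur_gls(X1, X2, y1, y2, Sigma):
    """GLS estimator for a two-equation SUR system with error covariance
    Omega = Sigma (x) I_T (Sigma is treated as known in this sketch)."""
    T = len(y1)
    # Stack the equations into the "super model" y = X beta + u
    X = np.block([[X1, np.zeros_like(X2)],
                  [np.zeros_like(X1), X2]])               # block-diagonal regressors
    y = np.concatenate([y1, y2])
    Omega_inv = np.kron(np.linalg.inv(Sigma), np.eye(T))  # Omega^{-1} = Sigma^{-1} (x) I_T
    # beta_GLS = [X' Omega^{-1} X]^{-1} X' Omega^{-1} y
    return np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
```

    In practice $\Sigma$ is unknown, so a feasible version of this estimator replaces it with an estimate such as the $\hat{\Sigma}$ introduced later in these slides.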


  • Seemingly Unrelated Regressions Equations

    There are two conditions under which least squares is identical to generalized least squares:

    1 The first is when all contemporaneous covariances are zero, $\sigma_{12} = 0$.

    2 The second is when the explanatory variables in each equation are identical, $X_1 = X_2$ (a numerical check is sketched below).
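    The second condition can be checked numerically with the sur_gls helper sketched above: with identical regressor matrices, GLS and equation-by-equation OLS coincide. The data below are arbitrary, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
Tn = 100
Xc = np.column_stack([np.ones(Tn), rng.normal(size=(Tn, 2))])   # common regressors, X1 = X2
ya = Xc @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=Tn)
yb = Xc @ np.array([0.2, 0.1, 0.8]) + rng.normal(size=Tn)
Sigma = np.array([[1.0, 0.4],
                  [0.4, 1.0]])

beta_gls = sur_gls(Xc, Xc, ya, yb, Sigma)
beta_ols = np.concatenate([np.linalg.solve(Xc.T @ Xc, Xc.T @ ya),
                           np.linalg.solve(Xc.T @ Xc, Xc.T @ yb)])
print(np.allclose(beta_gls, beta_ols))   # True: GLS reduces to OLS when X1 = X2
```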


  • Estimation of a VAR model

    The use of the generalized least squares estimator does not lead to a gain in efficiency when each equation contains the same explanatory variables.

    Thus the autoregressive coefficients of our VAR(1) model can be estimated, without loss of estimation efficiency, by ordinary least squares. This, in turn, is equivalent to estimating each equation separately by OLS.


  • Estimation of a VAR model

    The $(2 \times 2)$ unknown covariance matrix $\Sigma$ may be consistently estimated by $\hat{\Sigma}$, whose elements are

    $$\hat{\sigma}_{ij} = \frac{\hat{u}_i'\hat{u}_j}{T - 2p - 1} \qquad \text{for } i, j = 1, 2,$$

    where $\hat{u}_i = y_i - X\hat{\beta}_i$.
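    Continuing the earlier VAR(1) example (with $p = 1$, so each equation has $2p + 1 = 3$ estimated parameters, and using the arrays X, y1, y2, beta1_hat and beta2_hat from the OLS sketch above), this estimate can be computed as follows:

```python
import numpy as np

p = 1
u1_hat = y1 - X @ beta1_hat                    # residuals of the first equation
u2_hat = y2 - X @ beta2_hat                    # residuals of the second equation
denom = T - 2 * p - 1                          # degrees-of-freedom correction

Sigma_hat = np.array([[u1_hat @ u1_hat, u1_hat @ u2_hat],
                      [u2_hat @ u1_hat, u2_hat @ u2_hat]]) / denom
```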


  • Estimation of a VAR model

    In general, the autoregressive coefficients of a $K$-dimensional VAR(p) model

    $$y_t = \nu + A_1 y_{t-1} + \dots + A_p y_{t-p} + u_t$$

    can be estimated, without loss of estimation efficiency, by ordinary least squares. In the first equation, we have to run the regression of

    $$y_{1t} \ \text{on} \ y_{1,t-1}, \dots, y_{K,t-1}, \dots, y_{1,t-p}, \dots, y_{K,t-p};$$

    in the second equation, we regress

    $$y_{2t} \ \text{on} \ y_{1,t-1}, \dots, y_{K,t-1}, \dots$$