All rights reserved by Dr. Bill Wan Sing Hung - HKBU
Lecture #9: Studenmund (2006), Chapter 9
Objectives
• The nature of autocorrelation
• The consequences of autocorrelation
• Testing for the existence of autocorrelation
• Correcting autocorrelation
Time Series Data
Time series process of economic variables
e.g., GDP, M1, interest rate, exchange rate,
imports, exports, inflation rate, etc.
Realization
An observed time series data set generated from a time series process
Remark: Age is not a realization of a time series process. A time trend is not a time series process either.
Decomposition of time series
[Figure: Xt plotted against time, decomposed into trend, cyclical or seasonal, and random components.]
Xt = Trend + Seasonal + Random
Static Models
Ct = β0 + β1 Ydt + εt
The subscript “t” indicates time. The regression is a contemporaneous relationship, i.e., how is current consumption (C) affected by current disposable income (Yd)?
Example: Static Phillips curve model
inflatt = β0 + β1 unemployt + εt
inflat: inflation rate; unemploy: unemployment rate
Finite Distributed Lag Models
Forward distributed lag effect (of order q): an economic action at time t has effects at times t, t+1, …, t+q.
Effect at time t:
Ct = α0 + β0 Ydt + εt
Effect at time t+1:
Ct+1 = α0 + β0 Ydt+1 + β1 Ydt + εt+1, i.e., Ct = α0 + β0 Ydt + β1 Ydt−1 + εt
…
Effect at time t+q:
Ct+q = α0 + β0 Ydt+q + … + βq Ydt + εt+q, i.e., Ct = α0 + β0 Ydt + … + βq Ydt−q + εt
Backward distributed lag effect: the outcome at time t reflects the economic action Z at times t, t−1, t−2, …, t−q.
Yt = α0 + β0 Zt + β1 Zt−1 + β2 Zt−2 + … + βq Zt−q + εt
Initial state: zt = zt−1 = zt−2 = c
Ct = α0 + β0 Ydt + β1 Ydt−1 + β2 Ydt−2 + εt
Long-run propensity (LRP) = β0 + β1 + β2:
the permanent change in C for a one-unit permanent (long-run) change in Yd.
Distributed lag model in general:
Ct = α0 + β0 Ydt + β1 Ydt−1 + … + βq Ydt−q + other factors + εt
LRP (or long-run multiplier) = β0 + β1 + … + βq
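The LRP can be illustrated numerically. The sketch below (not from the slides) simulates a distributed lag consumption model with assumed coefficients β0 = 0.4, β1 = 0.3, β2 = 0.1, fits it by OLS with NumPy, and recovers LRP = β0 + β1 + β2 ≈ 0.8; all data and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
# Hypothetical disposable-income series: a random walk with drift
yd = np.cumsum(rng.normal(1.0, 0.5, T)) + 100

# Assumed lag coefficients for illustration: beta0=0.4, beta1=0.3, beta2=0.1
c = 5 + 0.4 * yd[2:] + 0.3 * yd[1:-1] + 0.1 * yd[:-2] + rng.normal(0, 1, T - 2)

# OLS of C_t on a constant, Yd_t, Yd_{t-1}, Yd_{t-2}
X = np.column_stack([np.ones(T - 2), yd[2:], yd[1:-1], yd[:-2]])
beta = np.linalg.lstsq(X, c, rcond=None)[0]

lrp = beta[1:].sum()  # long-run propensity = beta0 + beta1 + beta2
print(round(lrp, 2))
```

The individual lag coefficients are estimated less precisely than their sum, because the lagged regressors are highly collinear while the common long-run level pins down the LRP.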
Time Trends
Linear time trend:
Yt = β0 + β1 t + εt (constant absolute change)
Exponential time trend:
ln(Yt) = β0 + β1 t + εt (constant growth rate)
Quadratic time trend:
Yt = β0 + β1 t + β2 t² + εt (accelerating change)
For advanced topics in time series analysis and modeling, welcome to take ECON 3670.
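The exponential time trend above can be estimated by regressing ln(Y) on t; the slope is then the per-period growth rate. A minimal sketch, using simulated data with an assumed 3% growth rate (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1, 101)

# Simulated series with ~3% growth per period and small multiplicative noise
y = 50 * np.exp(0.03 * t) * np.exp(rng.normal(0, 0.02, t.size))

# Regress ln(Y) on a constant and t: the slope estimates the growth rate
X = np.column_stack([np.ones_like(t, dtype=float), t])
b0, b1 = np.linalg.lstsq(X, np.log(y), rcond=None)[0]
print(round(b1, 3))
```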
Definition: first-order autocorrelation, AR(1)
In the model Yt = β0 + β1 X1t + εt, t = 1, …, T,
if Cov(εt, εs) = E(εt εs) ≠ 0 for t ≠ s,
and if εt = ρ εt−1 + ut,
where −1 < ρ < 1 (ρ: “rho”)
and ut ~ iid(0, σu²) (white noise),
this scheme is called first-order autocorrelation and is denoted AR(1).
Autoregressive: the regression of εt can be explained by itself lagged one period.
ρ (“rho”): the first-order autocorrelation coefficient, or “coefficient of autocovariance”.
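The AR(1) scheme εt = ρ εt−1 + ut is easy to simulate, and ρ can be estimated by regressing εt on εt−1. A sketch with an assumed ρ = 0.7 (an illustrative value, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
T, rho = 2000, 0.7  # illustrative values; |rho| < 1 is required

u = rng.normal(0, 1, T)  # white noise, iid(0, sigma_u^2)
eps = np.zeros(T)
for i in range(1, T):
    eps[i] = rho * eps[i - 1] + u[i]  # epsilon_t = rho*epsilon_{t-1} + u_t

# Estimate rho by regressing eps_t on eps_{t-1} (no intercept)
rho_hat = (eps[1:] @ eps[:-1]) / (eps[:-1] @ eps[:-1])
print(round(rho_hat, 2))
```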
Example of serial correlation:
Consumptiont = β0 + β1 Incomet + errort
The error term represents other factors that affect consumption, with ut ~ iid(0, σu²).

Year   Consumption   Income   Error
1990       230         320    ε1990
 …          …           …       …
2002       558         714    ε2002
2003       699         822    ε2003
2004       881         907    ε2004
2005       925        1003    ε2005
2006       984        1174    ε2006
2007      1072        1246    ε2007

For example, the current year’s tax payment may be determined by the previous year’s:
TaxPay2007 = ρ TaxPay2006 + u2007, i.e., εt = ρ εt−1 + ut
If εt = ρ1 εt−1 + ut, it is AR(1), first-order autoregressive.
If εt = ρ1 εt−1 + ρ2 εt−2 + ut, it is AR(2), second-order autoregressive.
If εt = ρ1 εt−1 + ρ2 εt−2 + ρ3 εt−3 + ut, it is AR(3), third-order autoregressive.
……
If εt = ρ1 εt−1 + ρ2 εt−2 + … + ρn εt−n + ut, it is AR(n), nth-order autoregressive (higher-order autocorrelation).

For AR(1), with −1 < ρ < 1:
Cov(εt, εt−1) > 0 ==> 0 < ρ < 1, positive AR(1)
Cov(εt, εt−1) < 0 ==> −1 < ρ < 0, negative AR(1)
[Figures: residuals ε̂ plotted against time, showing positive autocorrelation, including a cyclical pattern.]
The current error term tends to have the same sign as the previous one.
[Figure: residuals ε̂ plotted against time, showing negative autocorrelation.]
The current error term tends to have the opposite sign from the previous one.
[Figure: residuals ε̂ plotted against time, showing no autocorrelation.]
The current error term appears random relative to the previous one.
The meaning of ρ: the error term εt at time t is a linear combination of the current and past disturbances.
0 < ρ < 1 or −1 < ρ < 0: the further a period is in the past, the smaller the weight of that error term in determining εt.
ρ = 1: the past is of equal importance to the current.
ρ > 1: the past is more important than the current.
The consequences of serial correlation:
1. The estimated coefficients are still unbiased: E(β̂k) = βk.
2. The variance of β̂k is no longer the smallest.
3. The standard error of the estimated coefficient, se(β̂k), becomes large.
Therefore, when AR(1) exists in the regression, the OLS estimator is no longer “BLUE”.
Example: two-variable regression model: Yt = β0 + β1 Xt + εt
The OLS estimator of β1 is β̂1 = Σ xt yt / Σ xt².
If E(εt εt−1) = 0, then Var(β̂1) = σ² / Σ xt².
If E(εt εt−1) ≠ 0 and εt = ρ εt−1 + ut (−1 < ρ < 1), then
Var(β̂1)AR1 = σ² / Σ xt² + (2σ² / (Σ xt²)²) [ ρ Σ xt xt+1 + ρ² Σ xt xt+2 + … ]
If ρ = 0 (zero autocorrelation), then Var(β̂1)AR1 = Var(β̂1).
If ρ ≠ 0 (autocorrelation), then Var(β̂1)AR1 > Var(β̂1): the AR(1) variance is not the smallest.
Autoregressive scheme:
εt = ρ εt−1 + ut
Substituting εt−1 = ρ εt−2 + ut−1 ==> εt = ρ² εt−2 + ρ ut−1 + ut
Substituting εt−2 = ρ εt−3 + ut−2 ==> εt = ρ³ εt−3 + ρ² ut−2 + ρ ut−1 + ut
The autocovariances are
E(εt εt−1) = ρ σu² / (1 − ρ²)
E(εt εt−2) = ρ² σu² / (1 − ρ²)
E(εt εt−3) = ρ³ σu² / (1 − ρ²)
……
E(εt εt−k) = ρᵏ σu² / (1 − ρ²)
Since |ρ| < 1, ρᵏ becomes smaller and smaller as k grows: the more periods in the past, the less the effect on the current period.
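The geometric decay of the autocovariances can be checked by simulation. A sketch with assumed values ρ = 0.6 and σu = 1 (chosen for illustration), comparing sample autocovariances at lags 1-3 against ρᵏ σu² / (1 − ρ²):

```python
import numpy as np

rng = np.random.default_rng(3)
T, rho, sigma_u = 200_000, 0.6, 1.0  # illustrative values

u = rng.normal(0, sigma_u, T)
eps = np.zeros(T)
for i in range(1, T):
    eps[i] = rho * eps[i - 1] + u[i]

# Theory: E(eps_t * eps_{t-k}) = rho**k * sigma_u**2 / (1 - rho**2)
theory = [rho**k * sigma_u**2 / (1 - rho**2) for k in (1, 2, 3)]
sample = [np.mean(eps[k:] * eps[:-k]) for k in (1, 2, 3)]
print([round(s, 2) for s in sample])
print([round(v, 2) for v in theory])
```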
How to detect autocorrelation? Compute the Durbin-Watson statistic, DW* (or d*).
At the 5% level of significance, with k = 1 and n = 24:
DW* = 0.9107, dL = 1.27, du = 1.45
(k is the number of independent variables, excluding the intercept)
Since DW* < dL, reject H0.
Durbin-Watson autocorrelation test
H0: no autocorrelation, ρ = 0
H1: autocorrelation exists, ρ ≠ 0 (or ρ > 0, positive autocorrelation)
From the OLS regression result: d or DW* = 0.9107.
Check the DW statistic table (at the 5% level of significance, k' = 1, n = 24): dL = 1.27, du = 1.45.
[Figure: number line from 0 to 2 with dL = 1.27 and du = 1.45 marked; DW* = 0.9107 falls below dL, inside the “reject H0” region.]
Durbin-Watson test
OLS: Yt = β0 + β1 X1t + … + βk Xkt + εt; obtain ε̂t and the DW statistic (d).
Assume an AR(1) process: εt = ρ εt−1 + ut, −1 < ρ < 1.
I. H0: ρ ≤ 0, no positive autocorrelation
   H1: ρ > 0, positive autocorrelation
Compare d* (DW*) with the critical values dL and du:
if d* < dL ==> reject H0
if d* > du ==> do not reject H0
if dL ≤ d* ≤ du ==> the test is inconclusive
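The statistic and the decision rule above can be sketched in a few lines of NumPy. This is a hand-rolled illustration, not the slides’ EViews output; the 0.9107 / 1.27 / 1.45 numbers are the slide’s example values.

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum_{t=2..T}(e_t - e_{t-1})^2 / sum_{t=1..T} e_t^2."""
    e = np.asarray(resid, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

def dw_decision(d, dL, du):
    """Decision rule for H0: no positive autocorrelation."""
    if d < dL:
        return "reject H0: positive autocorrelation"
    if d > du:
        return "do not reject H0"
    return "inconclusive"

rng = np.random.default_rng(4)
d_white = durbin_watson(rng.normal(size=1000))  # white noise: DW near 2
print(round(d_white, 2))
print(dw_decision(0.9107, dL=1.27, du=1.45))    # the slide's example: reject H0
```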
Durbin-Watson test (cont.)
DW = Σt=2..T (ε̂t − ε̂t−1)² / Σt=1..T ε̂t²
d ≈ 2 (1 − ρ̂) ==> ρ̂ ≈ 1 − d/2
Since −1 ≤ ρ̂ ≤ 1, this implies 0 ≤ d ≤ 4.
[Number line: 0, dL = 1.27, du = 1.45, 2, 4 − du = 2.55, 4 − dL = 2.73, 4]
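The approximation d ≈ 2(1 − ρ̂) can be verified by simulation: generate AR(1) residuals with a known ρ, compute DW, and back out ρ̂ = 1 − d/2. The value ρ = 0.5 below is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
T, rho = 5000, 0.5  # illustrative values
u = rng.normal(size=T)
e = np.zeros(T)
for i in range(1, T):
    e[i] = rho * e[i - 1] + u[i]  # AR(1) residuals

d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)  # Durbin-Watson statistic
rho_hat = 1 - d / 2                           # from DW ~ 2(1 - rho)
print(round(rho_hat, 2))
```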
Durbin-Watson test (cont.)
II. H0: ρ ≥ 0, no negative autocorrelation
    H1: ρ < 0, negative autocorrelation
We use (4 − d) when d is greater than 2:
if (4 − d) < dL, i.e., 4 − dL < d ≤ 4 ==> reject H0
if (4 − d) > du, i.e., 2 < d < 4 − du ==> do not reject H0
if dL ≤ (4 − d) ≤ du, i.e., 4 − du ≤ d ≤ 4 − dL ==> inconclusive
[Number line: 0, dL = 1.27, du = 1.45, 2, 4 − du = 2.55, 4 − dL = 2.73, 4]
Durbin-Watson test (cont.)
III. H0: ρ = 0, no autocorrelation
     H1: ρ ≠ 0, a two-tailed test for either positive or negative AR(1)
If d < dL or d > 4 − dL ==> reject H0
If du < d < 4 − du ==> do not reject H0
If dL ≤ d ≤ du or 4 − du ≤ d ≤ 4 − dL ==> inconclusive
For example:
ÛMt = 23.1 − 0.078 CAPt − 0.146 CAPt−1 + 0.043 Tt
      (15.6)   (2.0)       (3.7)        (10.3)
R̄² = 0.78, F = 78.9, ρ̂ = 0.677, SSR = 29.3, DW = 0.23, n = 68
(i) k = 3 (number of independent variables, excluding the intercept)
(ii) n = 68, significance level α = 0.05 or 0.01
(iii) at α = 0.05: dL = 1.525, du = 1.703; at α = 0.01: dL = 1.372, du = 1.546
Since the observed DW = 0.23 < dL at either level, reject H0: positive autocorrelation exists.
[Figure: DW decision regions on the number line from 0 to 4. For H0: ρ = 0 vs H1: ρ > 0 (positive autocorrelation): reject H0 for d < dL; inconclusive for dL ≤ d ≤ du; do not reject for du < d < 4 − du. For H0: ρ = 0 vs H1: ρ < 0 (negative autocorrelation): inconclusive for 4 − du ≤ d ≤ 4 − dL; reject H0 for d > 4 − dL. With the 1% and 5% critical values (dL = 1.372 and 1.525, du = 1.546 and 1.703; 4 − du = 2.454 and 2.297, 4 − dL = 2.628 and 2.475), the observed DW = 0.23 falls in the reject region for positive autocorrelation.]
The assumptions underlying the d (DW) statistic:
1. An intercept term must be included.
2. The X’s are nonstochastic.
3. It tests only AR(1): εt = ρ εt−1 + ut, where ut ~ iid(0, σu²).
4. The regression must not include a lagged dependent variable (an autoregressive model):
   Yt = β0 + β1 Xt1 + β2 Xt2 + … + βk Xtk + λ Yt−1 + εt
5. No missing observations.
[Table excerpt: a Y, X series from 1970 onward with N.A. entries in 1981-1982, illustrating missing observations.]
Lagrange Multiplier (LM) test, also called Durbin’s m test or the Breusch-Godfrey (BG) test, for higher-order autocorrelation.
Test procedure:
(1) Run OLS and obtain the residuals ε̂t.
(2) Run ε̂t against all the regressors in the model plus the additional regressors ε̂t−1, ε̂t−2, ε̂t−3, …, ε̂t−p:
    ε̂t = α0 + α1 Xt + ρ1 ε̂t−1 + ρ2 ε̂t−2 + ρ3 ε̂t−3 + … + ρp ε̂t−p + ut
    Obtain the R² value from this regression.
(3) Compute the BG statistic: (n − p)R².
(4) Compare the BG statistic with χ²p (p is the order of autocorrelation tested).
(5) If BG > χ²p, reject H0: there is higher-order autocorrelation.
    If BG < χ²p, do not reject H0: there is no higher-order autocorrelation.
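The five steps above can be sketched with NumPy. This is a hand-rolled illustration on simulated data (AR(1) errors with an assumed ρ = 0.6), not the slides’ software output; 5.991 is the 5% χ² critical value with 2 degrees of freedom.

```python
import numpy as np

def breusch_godfrey(resid, X, p):
    """Regress e_t on X and e_{t-1..t-p}; BG statistic = (n - p) * R^2."""
    e = np.asarray(resid, dtype=float)
    n = e.size
    # Lagged residuals, padded with zeros at the start
    lags = np.column_stack([np.r_[np.zeros(k), e[:-k]] for k in range(1, p + 1)])
    Z = np.column_stack([X, lags])
    coef = np.linalg.lstsq(Z, e, rcond=None)[0]
    ssr = np.sum((e - Z @ coef) ** 2)
    r2 = 1 - ssr / np.sum((e - e.mean()) ** 2)
    return (n - p) * r2

rng = np.random.default_rng(6)
T = 400
x = rng.normal(size=T)
u = rng.normal(size=T)
eps = np.zeros(T)
for i in range(1, T):
    eps[i] = 0.6 * eps[i - 1] + u[i]  # AR(1) errors, rho = 0.6 (assumed)
y = 1 + 2 * x + eps

# Step (1): OLS residuals
X = np.column_stack([np.ones(T), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b

# Steps (2)-(3): auxiliary regression with p = 2 lags
bg = breusch_godfrey(e, X, p=2)
print(bg > 5.991)  # compare with chi-square(2) 5% critical value
```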
Remedy: 1. First-difference transformation
Yt = β0 + β1 Xt + εt
Yt−1 = β0 + β1 Xt−1 + εt−1, assume ρ = 1
==> Yt − Yt−1 = β0 − β0 + β1 (Xt − Xt−1) + (εt − εt−1)
==> ΔYt = β1 ΔXt + Δεt (no intercept)

2. Add a trend (T)
Yt = β0 + β1 Xt + β2 T + εt
Yt−1 = β0 + β1 Xt−1 + β2 (T − 1) + εt−1
==> (Yt − Yt−1) = (β0 − β0) + β1 (Xt − Xt−1) + β2 [T − (T − 1)] + (εt − εt−1)
==> ΔYt = β1 ΔXt + β2·1 + ε′t
==> ΔYt = β2* + β1 ΔXt + ε′t
If β̂2* > 0, there is an upward trend in Y (β2 > 0).
3. Cochrane-Orcutt two-step procedure (CORC), a Generalized Least Squares (GLS) method
(1) Run OLS on Yt = β0 + β1 Xt + εt and obtain the residuals ε̂t.
(2) Run OLS on ε̂t = ρ ε̂t−1 + ut and obtain ρ̂, where ut ~ iid(0, σu²).
(3) Use ρ̂ to transform the variables:
         Yt = β0 + β1 Xt + εt
  −) ρ̂ Yt−1 = ρ̂ β0 + β1 ρ̂ Xt−1 + ρ̂ εt−1
   (Yt − ρ̂ Yt−1) = β0 (1 − ρ̂) + β1 (Xt − ρ̂ Xt−1) + (εt − ρ̂ εt−1)
   i.e., Yt* = Yt − ρ̂ Yt−1 and Xt* = Xt − ρ̂ Xt−1.
(4) Run OLS on Yt* = β0* + β1* Xt* + ut.
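The CORC two-step procedure can be sketched directly in NumPy. All data below are simulated with assumed values β0 = 3, β1 = 0.5, ρ = 0.7; the transformed regression recovers β1, and the transformed intercept estimates β0(1 − ρ̂).

```python
import numpy as np

rng = np.random.default_rng(7)
T = 2000
x = rng.normal(size=T).cumsum()  # a trending regressor (assumed)
u = rng.normal(size=T)
eps = np.zeros(T)
for i in range(1, T):
    eps[i] = 0.7 * eps[i - 1] + u[i]  # AR(1) errors, rho = 0.7 (assumed)
y = 3 + 0.5 * x + eps

# Step (1): OLS, keep the residuals
X = np.column_stack([np.ones(T), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
e = y - X @ b

# Step (2): estimate rho from e_t = rho * e_{t-1} + u_t
rho_hat = (e[1:] @ e[:-1]) / (e[:-1] @ e[:-1])

# Step (3): quasi-difference the data (the first observation is lost)
y_star = y[1:] - rho_hat * y[:-1]
x_star = x[1:] - rho_hat * x[:-1]

# Step (4): OLS on the transformed model; the intercept is beta0*(1 - rho_hat)
Xs = np.column_stack([np.ones(T - 1), x_star])
b_star = np.linalg.lstsq(Xs, y_star, rcond=None)[0]
print(round(b_star[1], 2))
```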
4. Cochrane-Orcutt iterative procedure
(5) If the DW test shows that autocorrelation still exists, iterate the procedure from (4): obtain the residuals ε̂t*.
(6) Run OLS on ε̂t* = ρ ε̂t−1* + ut′ and obtain ρ̂₂, the second-round estimate of ρ (ρ̂₂ ≈ 1 − DW₂/2).
(7) Use ρ̂₂ to transform the variables:
    Yt** = Yt − ρ̂₂ Yt−1 and Xt** = Xt − ρ̂₂ Xt−1
    (from Yt = β0 + β1 Xt + εt and ρ̂₂ Yt−1 = ρ̂₂ β0 + β1 ρ̂₂ Xt−1 + ρ̂₂ εt−1).
Cochrane-Orcutt iterative procedure (cont.)
(8) Run OLS on Yt** = β0** + β1** Xt** + εt**, where
    (Yt − ρ̂₂ Yt−1) = β0 (1 − ρ̂₂) + β1 (Xt − ρ̂₂ Xt−1) + (εt − ρ̂₂ εt−1).
(9) Check the DW₃ statistic; if autocorrelation still exists, go into a third round of the procedure, and so on, until the successive estimates of ρ differ little (|ρ̂new − ρ̂old| < 0.01).
Example: Studenmund (2006), Exercise 14 and Table 9.1, pp. 342-344
(1) [Screenshot: regression output with a low DW statistic.]
Obtain the residuals. (Usually, after you run a regression, the residuals are immediately stored in the resid series.)
(2) Give a new name to the residual series, then run a regression of the current residual on the lagged residual to obtain the estimated ρ (“rho”):
ε̂t = ρ ε̂t−1 + ut
(3) Transform the variables into Y* and X*. New series are created, but the first observation of each is lost.
(4) Run the transformed regression and obtain the estimated result, which is improved.
The Cochrane-Orcutt iterative procedure in EViews: adding the AR(1) term to the equation specification is the EViews command to run the iterative procedure, steps (5)-(9).
The result of the iterative procedure: the DW statistic is improved; the coefficient on the AR(1) term is the estimated ρ; each variable is transformed.
Generalized Least Squares (GLS)
Yt = β0 + β1 Xt + εt, t = 1, …, T   (1)
Assume AR(1): εt = ρ εt−1 + ut, −1 < ρ < 1
Yt−1 = β0 + β1 Xt−1 + εt−1   (2)
(1) − ρ·(2) ==> (Yt − ρ Yt−1) = β0 (1 − ρ) + β1 (Xt − ρ Xt−1) + (εt − ρ εt−1)
GLS ==> Yt* = β0* + β1* Xt* + ut

5. Prais-Winsten transformation
To avoid the loss of the first observation, the first observations of Y* and X* should be transformed as:
Y1* = √(1 − ρ̂²) Y1 ; X1* = √(1 − ρ̂²) X1
but Y2* = Y2 − ρ̂ Y1 ; X2* = X2 − ρ̂ X1
Y3* = Y3 − ρ̂ Y2 ; X3* = X3 − ρ̂ X2
…
Yt* = Yt − ρ̂ Yt−1 ; Xt* = Xt − ρ̂ Xt−1
[Screenshot: edit the series here to restore the first observation.]
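The Prais-Winsten transformation above is a one-liner per series: quasi-difference every observation, but scale the first one by √(1 − ρ̂²) instead of dropping it. A minimal sketch (the series values and ρ̂ = 0.5 are made up for illustration):

```python
import numpy as np

def prais_winsten_transform(z, rho):
    """Quasi-difference z, keeping the first observation scaled by sqrt(1 - rho^2)."""
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    out[0] = np.sqrt(1 - rho**2) * z[0]   # restored first observation
    out[1:] = z[1:] - rho * z[:-1]        # z_t* = z_t - rho * z_{t-1}
    return out

z = np.array([10.0, 12.0, 11.0])
z_star = prais_winsten_transform(z, 0.5)
print(z_star)
```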
6. Durbin’s two-step method
Since (Yt − ρ Yt−1) = β0 (1 − ρ) + β1 (Xt − ρ Xt−1) + ut for Yt = β0 + β1 Xt + εt,
==> Yt = β0* + β1 Xt − ρ β1 Xt−1 + ρ Yt−1 + ut, where β0* = β0 (1 − ρ).
I. Run OLS on this specification:
   Yt = β0* + β1* Xt − β2* Xt−1 + β3* Yt−1 + ut
   and obtain β̂3* as an estimate of ρ (rho).
II. Transform the variables:
   Yt* = Yt − β̂3* Yt−1, i.e., Yt* = Yt − ρ̂ Yt−1,
   and Xt* = Xt − β̂3* Xt−1, i.e., Xt* = Xt − ρ̂ Xt−1.
III. Run OLS on the model Yt* = β̂0 + β̂1 Xt* + ε′t,
   where β̂0 estimates β0 (1 − ρ) and β̂1 estimates β1.
[Screenshot: Durbin’s two-step method in EViews — include the lagged term of Y among the regressors; its coefficient gives the estimated ρ̂ (“rho”).]
Lagged Dependent Variable and Autocorrelation
Limitation of the Durbin-Watson test: if the model includes a lagged dependent variable,
Yt = β0 + β1 X1t + β2 X2t + … + βk Xkt + λ1 Yt−1 + εt,
the DW statistic will often be close to 2, i.e., DW does not converge to 2 (1 − ρ̂), so DW is not reliable.
Durbin-h test: compute h* = ρ̂ √( n / (1 − n·Var(λ̂1)) ).
Compare h* with Zc, where Z ~ N(0, 1), the standard normal distribution.
If |h*| > Zc ==> reject H0: ρ = 0 (no autocorrelation).
Durbin-h test: compute
h* = ρ̂ √( n / (1 − n·Var(λ̂1)) )
   = 0.7772 × √( 24 / (1 − 24 × (0.10617)²) ) = 4.458
Since h* = 4.458 > Zc (1.96 at the 5% level), reject H0: ρ = 0 (no autocorrelation).
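The arithmetic of the Durbin-h example can be reproduced directly, plugging in the slide’s numbers (ρ̂ = 0.7772, n = 24, se(λ̂1) = 0.10617):

```python
import math

def durbin_h(rho_hat, n, se_lag_coef):
    """Durbin's h = rho_hat * sqrt(n / (1 - n * Var(lambda1_hat)))."""
    var_hat = se_lag_coef ** 2
    return rho_hat * math.sqrt(n / (1 - n * var_hat))

# The slide's numbers: rho_hat = 0.7772, n = 24, se(lambda1_hat) = 0.10617
h = durbin_h(0.7772, 24, 0.10617)
print(round(h, 3))  # -> 4.458
```

Note that h is undefined when n·Var(λ̂1) ≥ 1, which is a known practical limitation of the test.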