Arma
Uploaded by casey-smith
-
Modeling Cycles By ARMA
Specification
Identification (Pre-fit)
Testing (Post-fit)
Forecasting
-
Definitions
Data = Trend + Season + Cycle + Irregular
Cycle + Irregular = Data − Trend − Season
(Trend: curves; Season: dummy variables)
For this presentation, let: Y_t = Cycle_t + Irregular_t
-
Stationary Process for Cycles
Cycle + Irregular ≈ Stationary Process ≈ ARMA(p, q)
(≈ : approximation)
-
Stationary Process
Series Y_t is stationary if:
μ_t = μ, constant for all t
σ_t = σ, constant for all t
ρ(Y_t, Y_{t+h}) = ρ_h does not depend on t
WN (white noise) is a special example of a stationary process.
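A quick numerical illustration of the three conditions (a numpy sketch; the seed and sample size are my own choices): white noise has a constant mean, a constant standard deviation, and autocorrelations near zero at every lag.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=0.0, scale=1.0, size=10_000)  # WN(1): iid N(0, 1) draws

# Sample moments: close to the constant population values mu = 0, sigma = 1.
mean, sd = y.mean(), y.std()

# Sample lag-1 autocorrelation: close to 0 for white noise.
r1 = np.corrcoef(y[:-1], y[1:])[0, 1]

print(f"mean={mean:.3f}  sd={sd:.3f}  r1={r1:.3f}")
```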
-
Models For a Stationary Process
Autoregressive Process, AR(p)
Moving Average Process, MA(q)
Autoregressive Moving Average Process, ARMA(p, q)
-
Parameters of ARMA Models
Specification parameters:
φ_k : autoregressive process parameter
θ_k : moving average process parameter
Characterization parameters:
ρ_k : autocorrelation coefficient
φ_kk : partial autocorrelation coefficient
-
AR Process
AR(1): (Y_t − μ) = φ_1 (Y_{t-1} − μ) + ε_t
  −1 < φ_1 < 1 (stationarity condition)
AR(2): (Y_t − μ) = φ_1 (Y_{t-1} − μ) + φ_2 (Y_{t-2} − μ) + ε_t
  φ_2 + φ_1 < 1, φ_2 − φ_1 < 1, −1 < φ_2 < 1 (stationarity conditions)
ε_t is WN(σ)
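The AR(2) stationarity conditions above form a triangle in the (φ_1, φ_2) plane and are easy to check in code (a sketch; the function name and test values are mine). Setting φ_2 = 0 reduces it to the AR(1) condition −1 < φ_1 < 1.

```python
def ar2_is_stationary(phi1: float, phi2: float) -> bool:
    """AR(2) stationarity triangle: phi2 + phi1 < 1, phi2 - phi1 < 1, |phi2| < 1."""
    return phi2 + phi1 < 1 and phi2 - phi1 < 1 and -1 < phi2 < 1

print(ar2_is_stationary(0.5, 0.3))   # → True  (inside the triangle)
print(ar2_is_stationary(0.6, 0.5))   # → False (phi2 + phi1 = 1.1 >= 1)
```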
-
MA Process
MA(1): Y_t − μ = ε_t + θ_1 ε_{t-1}
  −1 < θ_1 < 1 (invertibility condition)
MA(2): Y_t − μ = ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}
  θ_2 + θ_1 > −1, θ_2 − θ_1 > −1, −1 < θ_2 < 1 (invertibility conditions)
ε_t is WN(σ)
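The MA(2) invertibility conditions can be checked the same way (a sketch; names and test values are mine):

```python
def ma2_is_invertible(theta1: float, theta2: float) -> bool:
    """MA(2) invertibility: theta2 + theta1 > -1, theta2 - theta1 > -1, |theta2| < 1."""
    return theta2 + theta1 > -1 and theta2 - theta1 > -1 and -1 < theta2 < 1

print(ma2_is_invertible(0.6, 0.2))   # → True
print(ma2_is_invertible(-1.5, 0.2))  # → False (theta2 + theta1 = -1.3 <= -1)
```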
-
ARMA(p, q) Models
ARMA(1, 1): (Y_t − μ) = φ_1 (Y_{t-1} − μ) + ε_t + θ_1 ε_{t-1}
ARMA(2, 1): (Y_t − μ) = φ_1 (Y_{t-1} − μ) + φ_2 (Y_{t-2} − μ) + ε_t + θ_1 ε_{t-1}
ARMA(1, 2): (Y_t − μ) = φ_1 (Y_{t-1} − μ) + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}
-
Wold Theorem
Any stationary process can be written as a linear combination of a WN series ε_t:
Y_t − μ = Σ_{i=0}^{∞} b_i ε_{t-i}, with b_0 = 1 and Σ b_i² < ∞
-
Lag Operator, L
L Y_t = Y_{t-1}, and L^k Y_t = Y_{t-k}
Then the Wold Theorem can be written as: Y_t − μ = B(L) ε_t, where B(L) = Σ_{i=0}^{∞} b_i L^i
-
Approximation
Approximation of B(L) by a simple rational polynomial of L:
B(L) ≈ Θ(L) / Φ(L), so that Φ(L)(Y_t − μ) = Θ(L) ε_t, which is an ARMA(p, q) model
-
Generating AR(1)
Let: Θ(L) = 1 and Φ(L) = 1 − φ_1 L, so B(L) = 1/(1 − φ_1 L), giving (Y_t − μ) = φ_1 (Y_{t-1} − μ) + ε_t
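Generating an AR(1) series directly from the recursion looks like this (a numpy sketch with μ = 0; φ_1, σ, and the sample size are illustrative):

```python
import numpy as np

def simulate_ar1(phi1: float, n: int, sigma: float = 1.0, seed: int = 0) -> np.ndarray:
    """Generate Y_t = phi1 * Y_{t-1} + e_t with e_t ~ WN(sigma), starting from Y_0 = e_0."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, size=n)
    y = np.empty(n)
    y[0] = e[0]
    for t in range(1, n):
        y[t] = phi1 * y[t - 1] + e[t]
    return y

y = simulate_ar1(phi1=0.7, n=5_000)
r1 = np.corrcoef(y[:-1], y[1:])[0, 1]
print(f"sample lag-1 autocorrelation: {r1:.3f}")  # theory: rho_1 = phi1 = 0.7
```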
-
Generating MA(1)
Let: Φ(L) = 1 and Θ(L) = 1 + θ_1 L, so B(L) = 1 + θ_1 L, giving Y_t − μ = ε_t + θ_1 ε_{t-1}
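An MA(1) series needs no recursion at all, just a shifted copy of the shocks (a numpy sketch with μ = 0; θ_1 and the sample size are illustrative):

```python
import numpy as np

def simulate_ma1(theta1: float, n: int, sigma: float = 1.0, seed: int = 0) -> np.ndarray:
    """Generate Y_t = e_t + theta1 * e_{t-1} with e_t ~ WN(sigma)."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, size=n + 1)  # one extra draw to supply e_0
    return e[1:] + theta1 * e[:-1]

y = simulate_ma1(theta1=0.6, n=20_000)
r1 = np.corrcoef(y[:-1], y[1:])[0, 1]
# Theory for MA(1): rho_1 = theta1 / (1 + theta1**2) = 0.6 / 1.36 ≈ 0.441
print(f"sample lag-1 autocorrelation: {r1:.3f}")
```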
-
Generating ARMA(1,1)
Your exercise.
-
AR, MA, or ARMA?
Pre-Fitting Model Identification
Using ACF and PACF
-
Partial Autocorrelation Function: PACF
Notation: the partial autocorrelation of order k is denoted φ_kk
Interpretation: φ_kk = Correlation(Y_t, Y_{t-k} | Y_{t-1}, ..., Y_{t-k+1}), i.e., the correlation between Y_t and Y_{t-k} after controlling for the intermediate values {Y_{t-1}, ..., Y_{t-k+1}}
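Sample ACF and PACF can be computed with numpy alone; the PACF below uses the Durbin-Levinson recursion (a sketch; in practice statsmodels' `acf`/`pacf` functions do the same job). Run on a simulated AR(1), the PACF should show φ_11 ≈ φ_1 and φ_kk ≈ 0 for k ≥ 2.

```python
import numpy as np

def sample_acf(y: np.ndarray, nlags: int) -> np.ndarray:
    """Sample autocorrelations rho_0 .. rho_nlags."""
    y = y - y.mean()
    denom = np.sum(y * y)
    return np.array([np.sum(y[k:] * y[: len(y) - k]) / denom for k in range(nlags + 1)])

def sample_pacf(y: np.ndarray, nlags: int) -> np.ndarray:
    """Partial autocorrelations phi_11 .. phi_kk via the Durbin-Levinson recursion."""
    rho = sample_acf(y, nlags)
    pacf = np.zeros(nlags + 1)
    phi_prev = np.array([rho[1]])   # coefficients of the order-(k-1) autoregression
    pacf[1] = rho[1]
    for k in range(2, nlags + 1):
        num = rho[k] - np.sum(phi_prev * rho[k - 1 : 0 : -1])
        den = 1.0 - np.sum(phi_prev * rho[1:k])
        phi_kk = num / den
        phi_prev = np.concatenate([phi_prev - phi_kk * phi_prev[::-1], [phi_kk]])
        pacf[k] = phi_kk
    return pacf[1:]

# Simulate AR(1) with phi_1 = 0.7 and inspect the first five PACF values.
rng = np.random.default_rng(1)
e = rng.normal(size=5_000)
y = np.empty_like(e)
y[0] = e[0]
for t in range(1, len(e)):
    y[t] = 0.7 * y[t - 1] + e[t]

pacf = sample_pacf(y, 5)
print(np.round(pacf, 3))  # phi_11 near 0.7, the rest near 0
```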
-
Patterns of ACF and PACF
AR processes: ACF tails off gradually; PACF cuts off after lag p
MA processes: ACF cuts off after lag q; PACF tails off gradually
ARMA processes: both ACF and PACF tail off gradually
-
Model Diagnostics (Post-Fit)
Residual check:
Correlogram of the residuals
Q_LB (Ljung-Box) statistic, df = m − # of parameters
SE of the regression
Test of significance of coefficients
AIC, SIC
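The Ljung-Box Q statistic on the residuals is simple to compute (a sketch; the comparison against the chi-square critical value with df = m − # of parameters is left to a table or scipy):

```python
import numpy as np

def ljung_box_q(resid: np.ndarray, m: int) -> float:
    """Q_LB = n(n+2) * sum_{k=1..m} r_k^2 / (n-k), where r_k is the residual autocorrelation."""
    n = len(resid)
    r = resid - resid.mean()
    denom = np.sum(r * r)
    q = 0.0
    for k in range(1, m + 1):
        rk = np.sum(r[k:] * r[:-k]) / denom
        q += rk * rk / (n - k)
    return n * (n + 2) * q

# For white-noise "residuals", Q behaves like a chi-square(m) draw, i.e. around m.
rng = np.random.default_rng(0)
q = ljung_box_q(rng.normal(size=2_000), m=10)
print(f"Q_LB = {q:.2f}")
```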
-
AIC and SIC
(the likelihood is maximized; AIC and SIC are minimized: pick the model with the smallest value)
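In one common regression-based form (a sketch; the penalty terms 2k and k·ln T are the standard AIC/SIC penalties, and the toy residuals are mine):

```python
import math

def aic_sic(resid, k):
    """AIC = T*ln(SSR/T) + 2k ; SIC = T*ln(SSR/T) + k*ln(T). Smaller is better."""
    T = len(resid)
    ssr = sum(e * e for e in resid)
    base = T * math.log(ssr / T)
    return base + 2 * k, base + k * math.log(T)

# Toy example: SSR/T = 1, so ln(SSR/T) = 0 and only the penalties remain.
aic, sic = aic_sic([1.0, -1.0, 1.0, -1.0], k=1)
print(aic, sic)  # → 2.0 and ln(4) ≈ 1.386
```

Note that SIC penalizes extra parameters more heavily than AIC once T ≥ 8 (ln T > 2), which is why it tends to select the more parsimonious model.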
-
Truth is Simple: Parsimony
Use a minimum number of unknown parameters
-
Importance of Parsimony
A. In-sample RMSE (SE) of model prediction vs. B. Out-of-sample RMSE
The two should not differ much.
-
EViews Commands
AR: ls series_name c ar(1) ar(2) ...
MA: ls series_name c ma(1) ma(2) ...
ARMA: ls series_name c ar(1) ar(2) ... ma(1) ma(2) ...
-
Forecasting Rules
Sample range: 1 to T. Forecast T+h for h = 1, 2, ...
1. Write the model, with all unknown parameters replaced by their estimates.
2. Write the information set Ω_T (only the necessary part).
3. Set the unknown future errors to 0.
4. Use the chain rule.
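For an estimated AR(1), the chain rule means each forecast feeds the previous one back into the model, which collapses to ŷ_{T+h} = μ̂ + φ̂_1^h (y_T − μ̂) (a sketch; the numbers are illustrative):

```python
def ar1_forecasts(y_T: float, mu: float, phi1: float, h_max: int) -> list:
    """Chain-rule forecasts for AR(1): substitute the previous forecast for the
    unobserved value, with the unknown future errors set to 0."""
    forecasts = []
    prev = y_T
    for _ in range(h_max):
        prev = mu + phi1 * (prev - mu)
        forecasts.append(prev)
    return forecasts

print(ar1_forecasts(y_T=2.0, mu=1.0, phi1=0.5, h_max=3))  # → [1.5, 1.25, 1.125]
```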
-
Interval Forecast
h = 1: use the SE of the regression for setting the upper and lower limits
h = 2: a) AR(1)  b) MA(1)  c) ARMA(1,1)
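For AR(1) the h-step forecast-error variance accumulates as σ² Σ_{j=0}^{h-1} φ_1^{2j}, so at h = 1 the SE is just the SE of the regression, and it grows with h toward the unconditional SD (a sketch; 1.96 gives an approximate 95% interval, and the numbers are illustrative):

```python
import math

def ar1_interval(forecast: float, sigma: float, phi1: float, h: int, z: float = 1.96):
    """Approximate 95% interval for an AR(1) h-step forecast:
    SE_h = sigma * sqrt(sum of phi1^(2j) for j = 0..h-1)."""
    se_h = sigma * math.sqrt(sum(phi1 ** (2 * j) for j in range(h)))
    return forecast - z * se_h, forecast + z * se_h

lo, hi = ar1_interval(forecast=1.25, sigma=2.0, phi1=0.5, h=2)
print(f"[{lo:.3f}, {hi:.3f}]")  # SE_2 = 2 * sqrt(1 + 0.25) ≈ 2.236
```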