
1

Time Series Forecasting: The Case for the Single Source of Error State Space Model

J. Keith Ord, Georgetown University
Ralph D. Snyder, Monash University
Anne B. Koehler, Miami University
Rob J. Hyndman, Monash University
Mark Leeds, The Kellogg Group

http://www.buseco.monash.edu.au/depts/ebs/pubs/wpapers/2005

2

Outline of Talk

• Background
• General SSOE model
  – Linear and nonlinear examples
  – Estimation and model selection
• General linear state space model
  – MSOE and SSOE forms
  – Parameter spaces
  – Convergence
  – Equivalent models
  – Explanatory variables
  – ARCH and GARCH models

• Advantages of SSOE

3

Review Paper

A New Look at Models for Exponential Smoothing (2001).

JRSS, Series D [The Statistician], 50, 147-159.

Chris Chatfield, Anne Koehler, Keith Ord & Ralph Snyder

4

Framework Paper

A State Space Framework for Automatic Forecasting Using Exponential Smoothing (2002)

International Journal of Forecasting, 18, 439-454

Rob Hyndman, Anne Koehler, Ralph Snyder & Simone Grose

5

Some background

• The Kalman filter: Kalman (1960), Kalman & Bucy (1961)

• Engineering: Jazwinski (1970), Anderson & Moore (1979)

• Regression approach: Duncan and Horn (JASA, 1972)

• Bayesian Forecasting & Dynamic Linear Model: Harrison & Stevens (1976, JRSS B); West & Harrison (1997)

• Structural models: Harvey (1989)
• State space methods: Durbin & Koopman (2001)

6

Single Source of Error (SSOE) State Space Model

• Developed by Snyder (1985) among others

• Also known as the Innovations Representation

• Any Gaussian time series has an innovations representation [SSOE looks restrictive but it is not!]

7

Why a structural model?

• Structural models enable us to formulate the model in terms of unobserved components and to decompose it in terms of those components

• Structural models enable us to formulate schemes with non-linear error structures yet familiar forecast functions

8

General Framework: Notation

$y_t$: the observable process of interest, and we set $I_t = \{y_t, y_{t-1}, \ldots, y_1\}$

$\mathbf{x}_t$: vector of unobservable state variables

$\varepsilon_t$: the unobservable random errors with means 0 and variance $\sigma^2$

$\mathbf{m}_t$: vector of estimators for the state variables

9

Single Source of Error (SSOE) State Space Model

$y_t = h(\mathbf{x}_{t-1}) + k(\mathbf{x}_{t-1})\varepsilon_t$

$\mathbf{x}_t = f(\mathbf{x}_{t-1}) + g(\mathbf{x}_{t-1}, \boldsymbol{\alpha})\varepsilon_t$

$\varepsilon_t \sim \text{NID}(0, \sigma^2)$

$\mathbf{x}_t$ is a $k \times 1$ state vector and $\boldsymbol{\alpha}$ is a $k \times 1$ vector of parameters
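As a concrete illustration of this recursion, here is a short sketch of my own in Python (not code from the talk) that simulates one path from the general SSOE model for user-supplied component functions; all names are illustrative placeholders.

```python
import numpy as np

def simulate_ssoe(n, x0, h, k, f, g, alpha, sigma=1.0, seed=0):
    """Simulate y_1, ..., y_n from y_t = h(x_{t-1}) + k(x_{t-1})*e_t,
    x_t = f(x_{t-1}) + g(x_{t-1}, alpha)*e_t, with e_t ~ NID(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    y = np.empty(n)
    for t in range(n):
        e = rng.normal(0.0, sigma)      # the single source of error
        y[t] = h(x) + k(x) * e          # measurement equation
        x = f(x) + g(x, alpha) * e      # transition equation uses the same e_t
    return y

# Example: the local-level (SES) special case h(x)=x, k(x)=1, f(x)=x, g(x,a)=a.
y = simulate_ssoe(5, x0=10.0, h=lambda x: x, k=lambda x: 1.0,
                  f=lambda x: x, g=lambda x, a: a, alpha=0.3)
```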

10

Simple Exponential Smoothing (SES)

Measurement Equation: $y_t = \ell_{t-1} + \varepsilon_t$

State Equation: $\ell_t = \ell_{t-1} + \alpha\varepsilon_t$

$\ell_t$ is the level at time $t$

11

Another Form for State Equation

Measurement Equation: $y_t = \ell_{t-1} + \varepsilon_t$

State Equation: $\ell_t = \ell_{t-1} + \alpha(y_t - \ell_{t-1})$

or

$\ell_t = \alpha y_t + (1 - \alpha)\ell_{t-1}$
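The two forms of the state equation are algebraically identical, which a short sketch (mine, with illustrative names and made-up data) makes easy to verify numerically.

```python
import numpy as np

def ses_level(y, level0, alpha):
    level, out = level0, []
    for yt in y:
        level = level + alpha * (yt - level)       # error-correction form
        out.append(level)
    return np.array(out)

def ses_level_weighted(y, level0, alpha):
    level, out = level0, []
    for yt in y:
        level = alpha * yt + (1 - alpha) * level   # weighted-average form
        out.append(level)
    return np.array(out)

# Both recursions produce the same level path:
y = np.array([10.2, 9.8, 10.5, 11.0, 10.7])
assert np.allclose(ses_level(y, 10.0, 0.3), ses_level_weighted(y, 10.0, 0.3))
```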

12

Reduced ARIMA Form

ARIMA(0,1,1):

$y_t = y_{t-1} + \varepsilon_t - (1 - \alpha)\varepsilon_{t-1}$

13

Another SES Model

Measurement Equation: $y_t = \ell_{t-1} + \ell_{t-1}\varepsilon_t$

State Equation: $\ell_t = \ell_{t-1} + \alpha\ell_{t-1}\varepsilon_t$

14

Same State Equation for Second Model

$\varepsilon_t = \dfrac{y_t - \ell_{t-1}}{\ell_{t-1}}$

$\ell_t = \ell_{t-1} + \alpha\ell_{t-1}\,\dfrac{y_t - \ell_{t-1}}{\ell_{t-1}}$

$\ell_t = \ell_{t-1} + \alpha(y_t - \ell_{t-1})$

15

Reduced ARIMA Model for Second SES Model

NONE

16

Point Forecasts for Both Models

$\hat{y}_{t+h} = \hat{\ell}_t$

$\hat{\ell}_t = \hat{\ell}_{t-1} + \hat{\alpha}(y_t - \hat{\ell}_{t-1})$

or

$\hat{\ell}_t = \hat{\alpha} y_t + (1 - \hat{\alpha})\hat{\ell}_{t-1}$

17

SSOE Model for Holt-Winters Method

$y_t = (\ell_{t-1} + b_{t-1})s_{t-m} + (\ell_{t-1} + b_{t-1})s_{t-m}\varepsilon_t$

$\ell_t = (\ell_{t-1} + b_{t-1}) + \alpha(\ell_{t-1} + b_{t-1})\varepsilon_t$

$b_t = b_{t-1} + \beta(\ell_{t-1} + b_{t-1})\varepsilon_t$

$s_t = s_{t-m} + \gamma s_{t-m}\varepsilon_t$
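A sketch of how these four equations are applied in practice (my illustration, not the authors' code): the single relative error recovered from the measurement equation drives every state update. The names alpha, beta, gamma follow the usual Holt-Winters convention.

```python
import numpy as np

def holt_winters_ssoe_filter(y, level0, trend0, seasonals0, alpha, beta, gamma):
    m = len(seasonals0)                      # seasonal period
    level, trend = level0, trend0
    seasonals = list(seasonals0)             # seed seasonal factors
    errors = []
    for t, yt in enumerate(y):
        s_tm = seasonals[t % m]              # s_{t-m}
        mu = (level + trend) * s_tm          # one-step forecast
        e = (yt - mu) / mu                   # relative error epsilon_t
        new_level = (level + trend) * (1 + alpha * e)
        new_trend = trend + beta * (level + trend) * e
        seasonals[t % m] = s_tm * (1 + gamma * e)
        level, trend = new_level, new_trend
        errors.append(e)
    return np.array(errors), level, trend, seasonals
```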

18

Likelihood, Exponential Smoothing, and Estimation

Likelihood with $\mathbf{x}_0$ fixed (concentrated over $\sigma^2$, constants omitted):

$L(\boldsymbol{\alpha}, \mathbf{x}_0) = -\dfrac{n}{2}\log\!\left(\dfrac{1}{n}\sum_{t=1}^{n}\varepsilon_t^2\right) - \sum_{t=1}^{n}\log\left|k(\mathbf{x}_{t-1})\right|$

where

$\varepsilon_t = \dfrac{y_t - h(\mathbf{x}_{t-1})}{k(\mathbf{x}_{t-1})}$

$\mathbf{x}_t = f(\mathbf{x}_{t-1}) + g(\mathbf{x}_{t-1}, \boldsymbol{\alpha})\,\dfrac{y_t - h(\mathbf{x}_{t-1})}{k(\mathbf{x}_{t-1})}$
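A minimal sketch of this estimation scheme (mine, assuming the concentrated form of the likelihood written above): run the error and state recursions for trial values of alpha and x_0, then evaluate the log-likelihood. The resulting function can be handed, negated, to a numerical optimizer such as scipy.optimize.minimize.

```python
import numpy as np

def ssoe_concentrated_loglik(y, x0, h, k, f, g, alpha):
    """Log-likelihood with x0 fixed and sigma^2 concentrated out (constants omitted)."""
    x = np.asarray(x0, dtype=float)
    errors, log_k = [], []
    for yt in y:
        kt = k(x)
        e = (yt - h(x)) / kt                 # epsilon_t
        x = f(x) + g(x, alpha) * e           # state update driven by the same error
        errors.append(e)
        log_k.append(np.log(abs(kt)))
    errors = np.array(errors)
    n = len(errors)
    sigma2_hat = np.mean(errors ** 2)        # sigma^2 concentrated out
    return -0.5 * n * np.log(sigma2_hat) - np.sum(log_k)
```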

19

Model Selection

Akaike Information Criterion:

$\text{AIC} = -2L(\hat{\boldsymbol{\alpha}}, \hat{\mathbf{x}}_0) + 2p$

p is the number of free states plus the number of parameters
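A one-line sketch (my illustration) of the criterion, using the log-likelihood value returned by the previous sketch:

```python
def aic(loglik_at_optimum, n_free_states, n_parameters):
    # AIC = -2*L + 2*p, where p counts free seed states plus smoothing parameters
    p = n_free_states + n_parameters
    return -2.0 * loglik_at_optimum + 2.0 * p
```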

20

General Linear State Space Model

$y_t = \mathbf{h}'\mathbf{x}_{t-1} + \varepsilon_t$

$\mathbf{x}_t = \mathbf{F}\mathbf{x}_{t-1} + \boldsymbol{\eta}_t$

$\begin{bmatrix} \varepsilon_t \\ \boldsymbol{\eta}_t \end{bmatrix} \sim \text{NID}\!\left( \begin{bmatrix} 0 \\ \mathbf{0} \end{bmatrix}, \begin{bmatrix} \sigma^2 & \mathbf{V}_{\varepsilon\eta}' \\ \mathbf{V}_{\varepsilon\eta} & \mathbf{V}_{\eta\eta} \end{bmatrix} \right)$

21

Special Cases

MSOE Model: $\mathbf{V}_{\varepsilon\eta} = \text{Cov}(\varepsilon_t, \boldsymbol{\eta}_t) = \mathbf{0}$ and $\mathbf{V}_{\eta\eta}$ is diagonal, that is, $\text{Cov}(\eta_{it}, \eta_{jt}) = 0$ for $i \neq j$

SSOE Model: $\boldsymbol{\eta}_t = \boldsymbol{\alpha}\varepsilon_t$, so that $\mathbf{V}_{\varepsilon\eta} = \text{Cov}(\varepsilon_t, \boldsymbol{\eta}_t) = \sigma^2\boldsymbol{\alpha}$ and $\mathbf{V}_{\eta\eta} = \text{Cov}(\boldsymbol{\eta}_t) = \sigma^2\boldsymbol{\alpha}\boldsymbol{\alpha}'$

22

Linear SSOE Model

$y_t = \mathbf{h}'\mathbf{x}_{t-1} + \varepsilon_t$

$\mathbf{x}_t = \mathbf{F}\mathbf{x}_{t-1} + \boldsymbol{\alpha}\varepsilon_t$

$\mathbf{h}$ is a $k \times 1$ vector, $\mathbf{F}$ is a $k \times k$ matrix, and $\boldsymbol{\alpha}$ is a $k \times 1$ vector
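In matrix form the filtering recursion is especially compact; the following is an illustrative sketch of mine, not library code.

```python
import numpy as np

def linear_ssoe_filter(y, x0, F, h, alpha):
    """One-step errors and final state for y_t = h'x_{t-1} + e_t, x_t = F x_{t-1} + alpha*e_t."""
    x = np.asarray(x0, dtype=float)
    errors = []
    for yt in y:
        e = yt - h @ x            # one-step error from the measurement equation
        x = F @ x + alpha * e     # transition shares the same error
        errors.append(e)
    return np.array(errors), x
```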

23

SSOE for Holt’s Linear Trend Exponential Smoothing

$y_t = \begin{bmatrix} 1 & 1 \end{bmatrix} \mathbf{x}_{t-1} + \varepsilon_t = \ell_{t-1} + b_{t-1} + \varepsilon_t$

$\mathbf{x}_t = \begin{bmatrix} \ell_t \\ b_t \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix} \mathbf{x}_{t-1} + \begin{bmatrix} \alpha_1 \\ \alpha_2 \end{bmatrix} \varepsilon_t$
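A sketch (mine, with made-up numbers) instantiating these matrices and running a few updates of the linear recursion above; alpha1 and alpha2 are illustrative smoothing parameters.

```python
import numpy as np

h = np.array([1.0, 1.0])
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])
alpha = np.array([0.5, 0.1])      # illustrative smoothing parameters alpha1, alpha2
x = np.array([100.0, 2.0])        # state [level, slope]

for yt in [103.0, 105.5, 108.2]:  # a few made-up observations
    e = yt - h @ x                # y_t minus (level + slope)
    x = F @ x + alpha * e         # level and slope updated by the same error
    print(x)
```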

24

MSOE Model for Holt's Linear Trend Exponential Smoothing

$y_t = \ell_{t-1} + b_{t-1} + \varepsilon_t$

$\ell_t = \ell_{t-1} + b_{t-1} + \eta_{1t}$

$b_t = b_{t-1} + \eta_{2t}$

25

Parameter Space 1

• Both correspond to the same ARIMA model in the steady state, BUT the parameter spaces differ
  – SSOE has the same space as ARIMA
  – MSOE space is a subset of the ARIMA space

• Example: for ARIMA(0,1,1), $\theta = 1 - \alpha$
  – MSOE has $0 < \alpha < 1$
  – SSOE has $0 < \alpha < 2$, equivalent to $-1 < \theta < 1$

26

Parameter space 2

• In general, ρ = 1 (SSOE) yields the same parameter space as ARIMA, while ρ = 0 (MSOE) yields a smaller space (ρ denotes the correlation between the measurement and state errors)

• No other value of ρ yields a larger parameter space than does ρ = 1 [Theorems 5.1 and 5.2]

• Restricted parameter spaces may lead to poor model choices [e.g. Morley et al., 2002]

27

Convergence of the Covariance Matrix for Linear SSOE

where $\mathbf{m}_t = E(\mathbf{x}_t \mid y_1, y_2, \ldots, y_t)$ and

$\mathbf{C}_t = \text{Cov}(\mathbf{x}_t \mid y_1, y_2, \ldots, y_t) = E[(\mathbf{x}_t - \mathbf{m}_t)(\mathbf{x}_t - \mathbf{m}_t)' \mid y_1, y_2, \ldots, y_t]$

In the Kalman filter, $\mathbf{C}_t \to \mathbf{0}$ as $t \to \infty$, with

$\mathbf{m}_t = \mathbf{F}\mathbf{m}_{t-1} + \mathbf{a}_t(y_t - \mathbf{h}'\mathbf{m}_{t-1})$

$\mathbf{a}_t = (\sigma^2\boldsymbol{\alpha} + \mathbf{F}\mathbf{C}_{t-1}\mathbf{h})(\sigma^2 + \mathbf{h}'\mathbf{C}_{t-1}\mathbf{h})^{-1}$ (the Kalman gain)
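A numerical sketch of this convergence (my own illustration, using Holt-type matrices and made-up parameter values): iterating the gain and covariance recursions drives every entry of C_t toward zero.

```python
import numpy as np

sigma2 = 1.0
h = np.array([1.0, 1.0])
F = np.array([[1.0, 1.0], [0.0, 1.0]])
alpha = np.array([0.5, 0.1])             # illustrative parameters

C = np.eye(2) * 10.0                     # vague starting covariance
for t in range(1, 101):
    v = sigma2 + h @ C @ h               # innovation variance
    a = (sigma2 * alpha + F @ C @ h) / v # Kalman gain a_t
    C = F @ C @ F.T + sigma2 * np.outer(alpha, alpha) - v * np.outer(a, a)
    if t in (1, 10, 50, 100):
        print(t, np.round(C, 6))         # entries shrink toward zero
```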

28

Convergence 2

• The practical import of this result is that, provided t is not too small, we can approximate the state variable by its estimate

• That is, heuristic forecasting procedures, such as exponential smoothing, that generate forecast updates in a form like the state equations, are validated.

29

Equivalence

• Equivalent linear state space models (West and Harrison) will give rise to the same forecast distribution.

• For the MSOE model the equivalence transformation H of the state vector typically produces a non-diagonal covariance matrix.

• For the SSOE model the equivalence transformation H preserves the perfect correlation of the state vectors.

30

Explanatory Variables

The SSOE model can be put into a regression framework:

$y_t = \mathbf{h}'\mathbf{x}_{t-1} + \mathbf{z}_t'\boldsymbol{\gamma} + \varepsilon_t$

$\mathbf{x}_t = \mathbf{F}\mathbf{x}_{t-1} + \boldsymbol{\alpha}\varepsilon_t$

which can be rewritten as

$\tilde{y}_t = \tilde{\mathbf{z}}_t' \begin{bmatrix} \boldsymbol{\gamma} \\ \mathbf{x}_0 \end{bmatrix} + \varepsilon_t$

where $\tilde{y}_t$ is a function of $y_t$ and $\boldsymbol{\alpha}$, and $\tilde{\mathbf{z}}_t$ is an augmented function of $\mathbf{z}_t$ and $\boldsymbol{\alpha}$.
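A sketch (mine) of the augmented measurement equation in code: the regression term z_t' gamma is simply subtracted along with the state contribution when forming the one-step error.

```python
import numpy as np

def linear_ssoe_with_regressors(y, Z, x0, F, h, alpha, gamma):
    """y: observations; Z: (n, q) matrix of regressors z_t; gamma: (q,) coefficients."""
    x = np.asarray(x0, dtype=float)
    errors = []
    for yt, zt in zip(y, Z):
        e = yt - h @ x - zt @ gamma     # error after removing state and regression parts
        x = F @ x + alpha * e           # state update driven by the same error
        errors.append(e)
    return np.array(errors), x
```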

31

ARCH Effects

$y_t = \mathbf{h}'\mathbf{x}_{t-1} + \varepsilon_t$

$\mathbf{x}_t = \mathbf{F}\mathbf{x}_{t-1} + \boldsymbol{\alpha}\varepsilon_t$

$\varepsilon_t = h_t^{1/2}\xi_t, \qquad \xi_t \sim N(0, 1)$

$h_t = \alpha_0 + \alpha_1\varepsilon_{t-1}^2$

SSOE version of the ARCH(1) model
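An illustrative simulation sketch (mine, not from the talk) of this scheme; the ARCH coefficients are written a0 and a1 in code to avoid clashing with the state parameter vector alpha.

```python
import numpy as np

def simulate_ssoe_arch1(n, x0, F, h, alpha, a0, a1, seed=0):
    """Simulate a linear SSOE model whose single error has ARCH(1) conditional variance."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    y = np.empty(n)
    e_prev = 0.0
    for t in range(n):
        ht = a0 + a1 * e_prev**2        # ARCH(1) conditional variance h_t
        e = np.sqrt(ht) * rng.normal()  # xi_t ~ N(0, 1)
        y[t] = h @ x + e                # measurement equation
        x = F @ x + alpha * e           # state transition with the same error
        e_prev = e
    return y
```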

32

Advantages of SSOE Models

• Mapping from model to forecasting equations is direct and easy to see

• ML estimation can be applied directly without need for the Kalman updating procedure

• Nonlinear models are readily incorporated into the model framework

33

Further Advantages of SSOE Models

• Akaike and Schwarz information criteria can be used to choose models, including choices among models with different numbers of unit roots in the reduced form

• Largest parameter space among state space models.

• In the Kalman filter, the covariance matrix of the state vector converges to 0.