S349 08.ppt
Transcript of S349 08.ppt
-
8/12/2019 S349 08.ppt
1/81
Model Building for ARIMA Time Series
Consists of three steps:
1. Identification
2. Estimation
3. Diagnostic checking
-
To identify an ARIMA(p,d,q) model we make extensive use of the autocorrelation function
$$\{\rho_h : -\infty < h < \infty\}$$
and the partial autocorrelation function
$$\{\Phi_{kk} : 0 \le k < \infty\}.$$
-
It can be shown that:
$$\operatorname{Cov}(r_h, r_{h+k}) \approx \frac{1}{T}\sum_{t} \rho_t\,\rho_{t+k}$$
Thus, assuming $\rho_k = 0$ for $k > q$,
$$\operatorname{Var}(r_h) \approx \frac{1}{T}\left[1 + 2\left(\rho_1^2 + \rho_2^2 + \cdots + \rho_q^2\right)\right] \quad \text{for } h > q.$$
Let
$$s_h = \sqrt{\frac{1}{T}\left[1 + 2\left(r_1^2 + r_2^2 + \cdots + r_q^2\right)\right]}$$
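This standard-error formula (Bartlett's approximation) can be sketched numerically; `sample_acf` and `bartlett_se` are illustrative names, not from the slides:

```python
def sample_acf(x, max_lag):
    """Sample autocorrelations r_1..r_max_lag of a series x."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n  # lag-0 sample autocovariance
    return [sum((x[t] - m) * (x[t + h] - m) for t in range(n - h)) / n / c0
            for h in range(1, max_lag + 1)]

def bartlett_se(r, q, T):
    """Approximate standard error of r_h for h > q,
    assuming rho_k = 0 for k > q (Bartlett's approximation)."""
    return ((1 + 2 * sum(rk ** 2 for rk in r[:q])) / T) ** 0.5
```

A sample autocorrelation beyond lag q that exceeds about two of these standard errors is then taken as evidence against the MA(q) cutoff.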
-
The sample partial autocorrelation function is defined by:
$$\hat\Phi_{kk} = \frac{\begin{vmatrix} 1 & r_1 & \cdots & r_{k-2} & r_1 \\ r_1 & 1 & \cdots & r_{k-3} & r_2 \\ \vdots & \vdots & & \vdots & \vdots \\ r_{k-1} & r_{k-2} & \cdots & r_1 & r_k \end{vmatrix}}{\begin{vmatrix} 1 & r_1 & \cdots & r_{k-1} \\ r_1 & 1 & \cdots & r_{k-2} \\ \vdots & \vdots & & \vdots \\ r_{k-1} & r_{k-2} & \cdots & 1 \end{vmatrix}}$$
-
It can be shown that:
$$\operatorname{Var}\!\left(\hat\Phi_{kk}\right) \approx \frac{1}{T}$$
Let
$$s_{\hat\Phi_{kk}} = \frac{1}{\sqrt{T}}$$
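In practice the determinant ratio defining $\hat\Phi_{kk}$ is evaluated with the Durbin-Levinson recursion; a minimal sketch (the function name is mine):

```python
def sample_pacf(r, kmax):
    """Partial autocorrelations Phi_11..Phi_{kmax,kmax} from sample
    autocorrelations r = [r_1, r_2, ...], via Durbin-Levinson."""
    phi_prev = [r[0]]          # coefficients of the order-1 autoregressive fit
    pacf = [r[0]]              # Phi_11 = r_1
    for k in range(2, kmax + 1):
        num = r[k - 1] - sum(phi_prev[j] * r[k - 2 - j] for j in range(k - 1))
        den = 1 - sum(phi_prev[j] * r[j] for j in range(k - 1))
        phi_kk = num / den
        phi = [phi_prev[j] - phi_kk * phi_prev[k - 2 - j] for j in range(k - 1)]
        phi.append(phi_kk)
        phi_prev = phi
        pacf.append(phi_kk)
    return pacf
```

For an AR(1) with $\rho_h = 0.5^h$ the recursion returns $\hat\Phi_{11} = 0.5$ and zero at every higher lag, matching the cutoff property used for identification below.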
-
Identification of an ARIMA process
Determining the values of p, d, q
-
Recall that if a process is non-stationary, one of the roots of the autoregressive operator is equal to one.
This causes the limiting value of the autocorrelation function to be non-zero.
Thus a non-stationary process is identified by an autocorrelation function that neither tails away to zero quickly nor cuts off after a finite number of steps.
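As a quick numeric illustration (mine, not from the slides): the lag-1 sample autocorrelation of a deterministic trend stays close to 1, mimicking the slowly decaying ACF of a non-stationary series.

```python
def lag1_acf(x):
    """Lag-1 sample autocorrelation of the series x."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    c1 = sum((x[t] - m) * (x[t + 1] - m) for t in range(n - 1)) / n
    return c1 / c0

trend = list(range(100))   # a series that never reverts to its mean
print(lag1_acf(trend))     # close to 1: the ACF does not die away
```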
-
To determine the value of d

Note: the autocorrelation function of a stationary ARMA time series satisfies the following difference equation:
$$\rho_h = \beta_1 \rho_{h-1} + \beta_2 \rho_{h-2} + \cdots + \beta_p \rho_{h-p}$$
The solution to this equation has general form
$$\rho_h = c_1 \frac{1}{r_1^h} + c_2 \frac{1}{r_2^h} + \cdots + c_p \frac{1}{r_p^h}$$
where $r_1, r_2, \ldots, r_p$ are the roots of the polynomial
$$\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p$$
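The difference equation can be iterated numerically from a few seed correlations; a sketch (the seed values and coefficient below are illustrative):

```python
def extend_acf(beta, rho_seed, H):
    """Iterate rho_h = beta_1*rho_{h-1} + ... + beta_p*rho_{h-p}
    up to lag H, given seed values rho_0, rho_1, ... (at least p of them)."""
    rho = list(rho_seed)
    while len(rho) <= H:
        h = len(rho)
        rho.append(sum(beta[i] * rho[h - 1 - i] for i in range(len(beta))))
    return rho

# AR(1) with beta_1 = 0.5: root r = 2, so rho_h = (1/2)^h
print(extend_acf([0.5], [1.0], 4))   # [1.0, 0.5, 0.25, 0.125, 0.0625]
```

With the root well outside the unit circle the iterates decay geometrically, exactly as the general solution above predicts.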
-
For a stationary ARMA time series the roots $r_1, r_2, \ldots, r_p$ have absolute value greater than 1.
Therefore
$$\rho_h = c_1 \frac{1}{r_1^h} + c_2 \frac{1}{r_2^h} + \cdots + c_p \frac{1}{r_p^h} \to 0 \quad \text{as } h \to \infty$$
If the ARMA time series is non-stationary, some of the roots $r_1, r_2, \ldots, r_p$ have absolute value equal to 1, and
$$\rho_h \to a \neq 0 \quad \text{as } h \to \infty$$
-
[Figure: sample autocorrelation functions, lags 0-30, for a stationary series (ACF tails away to zero) and a non-stationary series (ACF stays near 1).]
-
If the process is non-stationary then first differences of the series are computed to determine if that operation results in a stationary series.
The process is continued until a stationary time series is found.
This then determines the value of d.
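The repeated-differencing step can be sketched as follows (a minimal illustration; the stationarity check itself is done by inspecting the ACF, as described above):

```python
def difference(x, d):
    """Apply the first-difference operator d times:
    (del x)_t = x_t - x_{t-1}; each pass shortens the series by one."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

print(difference([1, 4, 9, 16], 1))   # [3, 5, 7]
print(difference([1, 4, 9, 16], 2))   # [2, 2]
```

Differencing the squares twice yields a constant series, the discrete analogue of removing a quadratic trend.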
-
Identification
Determination of the values of p and q.
-
To determine the values of p and q we use the graphical properties of the autocorrelation function and the partial autocorrelation function. Again recall the following:

Properties of the ACF and PACF of MA, AR and ARMA Series

Process: MA(q)
  Autocorrelation function: cuts off after q.
  Partial autocorrelation function: infinite; tails off, dominated by damped exponentials and/or cosine waves.
Process: AR(p)
  Autocorrelation function: infinite; tails off as damped exponentials and/or cosine waves.
  Partial autocorrelation function: cuts off after p.
Process: ARMA(p,q)
  Autocorrelation function: infinite; tails off, dominated by damped exponentials and/or cosine waves after q - p.
  Partial autocorrelation function: infinite; tails off, dominated by damped exponentials and/or cosine waves after p - q.
-
More specifically, some typical patterns of the autocorrelation
function and the partial autocorrelation function for some
important ARMA series are as follows:
Patterns of the ACF and PACF of AR(2) Time Series
In the shaded region the roots of the AR operator are complex
-
Patterns of the ACF and PACF of MA(2) Time Series
In the shaded region the roots of the MA operator are complex
-
Patterns of the ACF and PACF of ARMA(1,1) Time Series
Note: The patterns exhibited by the ACF and the PACF give
important and useful information relating to the values of the
parameters of the time series.
-
Summary: To determine p and q, use the following table.

       MA(q)              AR(p)              ARMA(p,q)
ACF    Cuts off after q   Tails off          Tails off
PACF   Tails off          Cuts off after p   Tails off

Note: Usually p + q <= 4. There is no harm in over-identifying the time series (allowing more parameters in the model than necessary); we can always test to determine whether the extra parameters are zero.
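The decision rule in the table can be written as a tiny helper (the function name and the None-for-"tails off" convention are mine, not the slides'):

```python
def identify(acf_cutoff, pacf_cutoff):
    """acf_cutoff / pacf_cutoff: the lag after which the sample ACF / PACF
    cuts off, or None if it tails off instead."""
    if acf_cutoff is not None and pacf_cutoff is None:
        return ("MA", acf_cutoff)      # q = ACF cutoff lag
    if pacf_cutoff is not None and acf_cutoff is None:
        return ("AR", pacf_cutoff)     # p = PACF cutoff lag
    return ("ARMA", None)              # both tail off: mixed model

print(identify(1, None))    # ('MA', 1)
print(identify(None, 2))    # ('AR', 2)
```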
-
Examples
-
[Figure. Example A: "Uncontrolled" Concentration, Two-Hourly Readings: Chemical Process. x-axis: observation number (0-200); y-axis: concentration (16-18).]
-
The data
1 17.0   41 17.6   81 16.8   121 16.9   161 17.1
2 16.6   42 17.5   82 16.7   122 17.1   162 17.1
3 16.3   43 16.5   83 16.4   123 16.8   163 17.1
4 16.1   44 17.8   84 16.5   124 17.0   164 17.4
5 17.1   45 17.3   85 16.4   125 17.2   165 17.2
6 16.9   46 17.3   86 16.6   126 17.3   166 16.9
7 16.8   47 17.1   87 16.5   127 17.2   167 16.9
8 17.4   48 17.4   88 16.7   128 17.3   168 17.0
9 17.1   49 16.9   89 16.4   129 17.2   169 16.7
10 17.0  50 17.3   90 16.4   130 17.2   170 16.9
11 16.7  51 17.6   91 16.2   131 17.5   171 17.3
12 17.4  52 16.9   92 16.4   132 16.9   172 17.8
13 17.2  53 16.7   93 16.3   133 16.9   173 17.8
14 17.4  54 16.8   94 16.4   134 16.9   174 17.6
15 17.4  55 16.8   95 17.0   135 17.0   175 17.5
16 17.0  56 17.2   96 16.9   136 16.5   176 17.0
17 17.3  57 16.8   97 17.1   137 16.7   177 16.9
18 17.2  58 17.6   98 17.1   138 16.8   178 17.1
19 17.4  59 17.2   99 16.7   139 16.7   179 17.2
20 16.8  60 16.6   100 16.9  140 16.7   180 17.4
21 17.1  61 17.1   101 16.5  141 16.6   181 17.5
22 17.4  62 16.9   102 17.2  142 16.5   182 17.9
23 17.4  63 16.6   103 16.4  143 17.0   183 17.0
24 17.5  64 18.0   104 17.0  144 16.7   184 17.0
25 17.4  65 17.2   105 17.0  145 16.7   185 17.0
26 17.6  66 17.3   106 16.7  146 16.9   186 17.2
27 17.4  67 17.0   107 16.2  147 17.4   187 17.3
28 17.3  68 16.9   108 16.6  148 17.1   188 17.4
29 17.0  69 17.3   109 16.9  149 17.0   189 17.4
30 17.8  70 16.8   110 16.5  150 16.8   190 17.0
31 17.5  71 17.3   111 16.6  151 17.2   191 18.0
32 18.1  72 17.4   112 16.6  152 17.2   192 18.2
33 17.5  73 17.7   113 17.0  153 17.4   193 17.6
34 17.4  74 16.8   114 17.1  154 17.2   194 17.8
35 17.4  75 16.9   115 17.1  155 16.9   195 17.7
36 17.1  76 17.0   116 16.7  156 16.8   196 17.2
37 17.6  77 16.9   117 16.8  157 17.0   197 17.4
38 17.7  78 17.0   118 16.3  158 17.4
39 17.4  79 16.6   119 16.6  159 17.2
40 17.8  80 16.7   120 16.8  160 17.2
-
[Figure. Example B: Annual Sunspot Numbers, Yearly, 1770-1869.]
-
The data
Example B: Sunspot Numbers: Yearly
1770 101   1795 21   1820 16    1845 40
1771 82    1796 16   1821 7     1846 64
1772 66    1797 6    1822 4     1847 98
1773 35    1798 4    1823 2     1848 124
1774 31    1799 7    1824 8     1849 96
1775 7     1800 14   1825 17    1850 66
1776 20    1801 34   1826 36    1851 64
1777 92    1802 45   1827 50    1852 54
1778 154   1803 43   1828 62    1853 39
1779 125   1804 48   1829 67    1854 21
1780 85    1805 42   1830 71    1855 7
1781 68    1806 28   1831 48    1856 4
1782 38    1807 10   1832 28    1857 23
1783 23    1808 8    1833 8     1858 55
1784 10    1809 2    1834 13    1859 94
1785 24    1810 0    1835 57    1860 96
1786 83    1811 1    1836 122   1861 77
1787 132   1812 5    1837 138   1862 59
1788 131   1813 12   1838 103   1863 44
1789 118   1814 14   1839 86    1864 47
1790 90    1815 35   1840 63    1865 30
1791 67    1816 46   1841 37    1866 16
1792 60    1817 41   1842 24    1867 7
1793 47    1818 30   1843 11    1868 37
1794 41    1819 24   1844 15    1869 74
-
[Figure. Daily IBM Common Stock Closing Prices, May 17 1961 - November 2 1962. x-axis: Day (0-300); y-axis: Price ($) (300-700).]
-
Example C: IBM Common Stock Closing Prices: Daily (May 17 1961- Nov 2 1962)
460 471 527 580 551 523 333 394 330
457 467 540 579 551 516 330 393 340
452 473 542 584 552 511 336 409 339
459 481 538 581 553 518 328 411 331
462 488 541 581 557 517 316 409 345
459 490 541 577 557 520 320 408 352
463 489 547 577 548 519 332 393 346
479 489 553 578 547 519 320 391 352
493 485 559 580 545 519 333 388 357
490 491 557 586 545 518 344 396
492 492 557 583 539 513 339 387
498 494 560 581 539 499 350 383
499 499 571 576 535 485 351 388
497 498 571 571 537 454 350 382
496 500 569 575 535 462 345 384
490 497 575 575 536 473 350 382
489 494 580 573 537 482 359 383
478 495 584 577 543 486 375 383
487 500 585 582 548 475 379 388
491 504 590 584 546 459 376 395
487 513 599 579 547 451 382 392
482 511 603 572 548 453 370 386
487 514 599 577 549 446 365 383
482 510 596 571 553 455 367 377
479 509 585 560 553 452 372 364
478 515 587 549 552 457 373 369
479 519 585 556 551 449 363 355
477 523 581 557 550 450 371 350
479 519 583 563 553 435 369 353
475 523 592 564 554 415 376 340
479 531 592 567 551 398 387 350
476 547 596 561 551 399 387 349
478 551 596 559 545 361 376 358
479 547 595 553 547 383 385 360
477 541 598 553 547 393 385 360
476 545 598 553 537 385 380 366
475 549 595 547 539 360 373 359
473 545 595 550 538 364 382 356
474 549 592 544 533 365 377 355
474 547 588 541 525 370 376 367
474 543 582 532 513 374 379 357
465 540 576 525 510 359 386 361
466 539 578 542 521 335 387 355
467 532 589 555 521 323 386 348
471 517 585 558 521 306 389 343
(Read downwards.)
-
Chemical Concentration data:
[Figure. Example A: "Uncontrolled" Concentration, Two-Hourly Readings: Chemical Process.]

Summary Statistics

d   N     Mean     Std. Dev.
0   197   17.062   0.398
1   196   0.002    0.369
2   195   0.003    0.622
-
ACF and PACF for x_t, ∇x_t and ∇²x_t (Chemical Concentration Data)

[Figure: sample ACF r_h and sample PACF Φ̂_kk of x_t, lags 1-40.]
-
[Figure: sample ACF and sample PACF of ∇x_t (Chemical Concentration Data), lags 1-40.]
-
[Figure: sample ACF and sample PACF of ∇²x_t (Chemical Concentration Data), lags 1-40.]
-
Possible Identifications
1. d = 0, p = 1, q = 1
2. d = 1, p = 0, q = 1
-
Sunspot Data:

[Figure. Example B: Annual Sunspot Numbers, 1770-1869.]

Summary Statistics for the Sunspot Data
-
ACF and PACF for x_t, ∇x_t and ∇²x_t (Sunspot Data)

[Figure: sample ACF r_h and sample PACF Φ̂_kk of x_t, lags 1-40.]
-
[Figure: sample ACF and sample PACF of ∇x_t (Sunspot Data), lags 1-40.]
-
[Figure: sample ACF and sample PACF of ∇²x_t (Sunspot Data), lags 1-40.]
-
Possible Identification
1. d = 0, p = 2, q = 0
-
IBM stock data:

[Figure. Daily IBM Common Stock Closing Prices, May 17 1961 - November 2 1962. x-axis: Day; y-axis: Price ($).]

Summary Statistics
-
ACF and PACF for x_t, ∇x_t and ∇²x_t (IBM Stock Price Data)

[Figure: sample ACF r_h and sample PACF Φ̂_kk of x_t, lags 1-40.]
-
[Figure: sample ACF and sample PACF of ∇x_t (IBM Stock Price Data), lags 1-40.]
-
[Figure: sample ACF and sample PACF of ∇²x_t (IBM Stock Price Data), lags 1-40.]
-
Possible Identification
1. d = 1, p = 0, q = 0
-
Estimation
of ARIMA parameters
-
Preliminary Estimation
Using the Method of Moments
Equate sample statistics to population parameters.
-
Estimation of parameters of an MA(q) series

The theoretical autocorrelation function in terms of the parameters of an MA(q) process is given by:
$$\rho_h = \begin{cases} \dfrac{\alpha_h + \alpha_1\alpha_{h+1} + \alpha_2\alpha_{h+2} + \cdots + \alpha_{q-h}\alpha_q}{1 + \alpha_1^2 + \alpha_2^2 + \cdots + \alpha_q^2} & 1 \le h \le q \\[1ex] 0 & h > q \end{cases}$$
To estimate $\alpha_1, \alpha_2, \ldots, \alpha_q$ we solve the system of equations:
$$r_h = \frac{\hat\alpha_h + \hat\alpha_1\hat\alpha_{h+1} + \cdots + \hat\alpha_{q-h}\hat\alpha_q}{1 + \hat\alpha_1^2 + \hat\alpha_2^2 + \cdots + \hat\alpha_q^2}, \qquad 1 \le h \le q$$
-
This set of equations is non-linear and generally very difficult to solve.

For q = 1 the equation becomes:
$$r_1 = \frac{\hat\alpha_1}{1 + \hat\alpha_1^2}$$
Thus
$$r_1\hat\alpha_1^2 - \hat\alpha_1 + r_1 = 0$$
This equation has the two solutions
$$\hat\alpha_1 = \frac{1 \pm \sqrt{1 - 4r_1^2}}{2r_1}$$
One solution will result in the MA(1) time series being invertible.
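The quadratic can be solved and the invertible root picked automatically; a sketch with an illustrative value r_1 = 0.4 (function names are mine):

```python
import math

def ma1_moment_estimates(r1):
    """Both roots of r1*a^2 - a + r1 = 0; requires |r1| <= 0.5."""
    disc = math.sqrt(1 - 4 * r1 * r1)
    return ((1 - disc) / (2 * r1), (1 + disc) / (2 * r1))

def invertible_root(r1):
    """The smaller-magnitude root (|alpha_1| < 1) gives an invertible MA(1)."""
    return min(ma1_moment_estimates(r1), key=abs)

print(round(invertible_root(0.4), 6))   # 0.5 (the other root, 2.0, is not invertible)
```

The two roots are reciprocals of each other, which is why exactly one of them lies inside the unit interval.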
-
For q = 2 the equations become:
$$r_1 = \frac{\hat\alpha_1(1 + \hat\alpha_2)}{1 + \hat\alpha_1^2 + \hat\alpha_2^2}, \qquad r_2 = \frac{\hat\alpha_2}{1 + \hat\alpha_1^2 + \hat\alpha_2^2}$$
-
Estimation of parameters of an ARMA(p,q) series

We use a similar technique. Namely: obtain an expression for $\rho_h$ in terms of $\beta_1, \ldots, \beta_p$; $\alpha_1, \ldots, \alpha_q$, and set up p + q equations for the estimates of $\beta_1, \ldots, \beta_p$; $\alpha_1, \ldots, \alpha_q$ by replacing $\rho_h$ by $r_h$.
-
Estimation of parameters of an ARMA(p,q) series

Example: The ARMA(1,1) process.
The expressions for $\rho_1$ and $\rho_2$ in terms of $\beta_1$ and $\alpha_1$ are:
$$\rho_1 = \frac{(1 + \alpha_1\beta_1)(\beta_1 + \alpha_1)}{1 + \alpha_1^2 + 2\alpha_1\beta_1}, \qquad \rho_2 = \beta_1\rho_1$$
Further
$$\operatorname{Var}(x_t) = \sigma_x^2 = \left[\frac{1 + \alpha_1^2 + 2\alpha_1\beta_1}{1 - \beta_1^2}\right]\sigma_u^2$$
-
Thus the expressions for the estimates of $\beta_1$, $\alpha_1$ and $\sigma^2$ are:
$$r_1 = \frac{(1 + \hat\alpha_1\hat\beta_1)(\hat\beta_1 + \hat\alpha_1)}{1 + \hat\alpha_1^2 + 2\hat\alpha_1\hat\beta_1}, \qquad r_2 = \hat\beta_1 r_1$$
and
$$C_x(0) = \left[\frac{1 + \hat\alpha_1^2 + 2\hat\alpha_1\hat\beta_1}{1 - \hat\beta_1^2}\right]\hat\sigma_u^2$$
where $C_x(0)$ is the sample variance.
-
Hence
$$\hat\beta_1 = \frac{r_2}{r_1}$$
and
$$r_1 = \frac{(1 + \hat\alpha_1\hat\beta_1)(\hat\beta_1 + \hat\alpha_1)}{1 + \hat\alpha_1^2 + 2\hat\alpha_1\hat\beta_1}$$
or
$$\left(r_1 - \hat\beta_1\right)\hat\alpha_1^2 + \left(2r_1\hat\beta_1 - 1 - \hat\beta_1^2\right)\hat\alpha_1 + \left(r_1 - \hat\beta_1\right) = 0$$
This is a quadratic equation in $\hat\alpha_1$ which can be solved.
-
Example (Chemical Concentration Data)

The time series was identified as either an ARIMA(1,0,1) time series or an ARIMA(0,1,1) series.
If we use the first identification, then the series x_t is an ARMA(1,1) series.
-
Identifying the series x_t as an ARMA(1,1) series:

The autocorrelation at lag 1 is r_1 = 0.570 and the autocorrelation at lag 2 is r_2 = 0.495.
Thus the estimate of $\beta_1$ is 0.495/0.570 = 0.87.
Also the quadratic equation
$$\left(r_1 - \hat\beta_1\right)\hat\alpha_1^2 + \left(2r_1\hat\beta_1 - 1 - \hat\beta_1^2\right)\hat\alpha_1 + \left(r_1 - \hat\beta_1\right) = 0$$
becomes
$$0.2984\,\hat\alpha_1^2 + 0.7642\,\hat\alpha_1 + 0.2984 = 0$$
which has the two solutions -0.48 and -2.08. Again we select as our estimate of $\alpha_1$ the solution -0.48, resulting in an invertible estimated series.
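The calculation on this slide can be reproduced numerically; a sketch (the function name is mine):

```python
import math

def arma11_moment_estimates(r1, r2):
    """Method-of-moments estimates (beta1_hat, alpha1_hat) for an ARMA(1,1),
    picking the invertible (smaller-magnitude) root of the quadratic."""
    b = r2 / r1                            # beta1_hat = r2 / r1
    A = r1 - b                             # quadratic: A*a^2 + B*a + A = 0
    B = 2 * r1 * b - 1 - b * b
    disc = math.sqrt(B * B - 4 * A * A)
    roots = ((-B - disc) / (2 * A), (-B + disc) / (2 * A))
    return b, min(roots, key=abs)

b, a = arma11_moment_estimates(0.570, 0.495)
print(round(b, 2), round(a, 2))    # 0.87 -0.48
```

With the unrounded $\hat\beta_1$, the constant on the next slide works out to $17.062(1 - \hat\beta_1) \approx 2.25$.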
-
Since $\delta = \mu(1 - \beta_1)$, the estimate of $\delta$ can be computed as follows:
$$\hat\delta = \bar{x}\left(1 - \hat\beta_1\right) = 17.062(1 - 0.87) = 2.25$$
Thus the identified model in this case is:
$$x_t = 0.87x_{t-1} + u_t - 0.48u_{t-1} + 2.25$$
-
If we use the second identification, then the series $\nabla x_t = x_t - x_{t-1}$ is an MA(1) series.
Thus the estimate of $\alpha_1$ is:
$$\hat\alpha_1 = \frac{1 \pm \sqrt{1 - 4r_1^2}}{2r_1}$$
The value of $r_1$ = -0.413. Thus:
$$\hat\alpha_1 = \frac{1 \pm \sqrt{1 - 4(0.413)^2}}{2(-0.413)} = -0.53 \text{ or } -1.89$$
The estimate $\hat\alpha_1 = -0.53$ corresponds to an invertible time series. This is the solution that we will choose.
-
The estimate of the parameter $\delta$ is the sample mean. Thus the identified model in this case is:
$$\nabla x_t = u_t - 0.53u_{t-1} + 0.002, \quad \text{or} \quad x_t = x_{t-1} + u_t - 0.53u_{t-1} + 0.002$$
(An ARIMA(0,1,1) model.)

This compares with the other identification:
$$x_t = 0.87x_{t-1} + u_t - 0.48u_{t-1} + 2.25$$
(An ARIMA(1,0,1) model.)
-
Preliminary Estimation
of the Parameters of an AR(p)
Process
-
The regression coefficients $\beta_1, \beta_2, \ldots, \beta_p$ and the autocorrelation function $\rho_h$ satisfy the Yule-Walker equations:
$$\rho_1 = \beta_1 + \beta_2\rho_1 + \cdots + \beta_p\rho_{p-1}$$
$$\rho_2 = \beta_1\rho_1 + \beta_2 + \cdots + \beta_p\rho_{p-2}$$
$$\vdots$$
$$\rho_p = \beta_1\rho_{p-1} + \beta_2\rho_{p-2} + \cdots + \beta_p$$
and
$$\sigma_u^2 = \sigma_x^2\left[1 - \beta_1\rho_1 - \beta_2\rho_2 - \cdots - \beta_p\rho_p\right]$$
-
The Yule-Walker equations can be used to estimate the regression coefficients $\beta_1, \beta_2, \ldots, \beta_p$ using the sample autocorrelation function, by replacing $\rho_h$ with $r_h$:
$$r_1 = \hat\beta_1 + \hat\beta_2 r_1 + \cdots + \hat\beta_p r_{p-1}$$
$$r_2 = \hat\beta_1 r_1 + \hat\beta_2 + \cdots + \hat\beta_p r_{p-2}$$
$$\vdots$$
$$r_p = \hat\beta_1 r_{p-1} + \hat\beta_2 r_{p-2} + \cdots + \hat\beta_p$$
and
$$\hat\sigma_u^2 = C_x(0)\left[1 - \hat\beta_1 r_1 - \cdots - \hat\beta_p r_p\right]$$
-
Example

Considering the data in Example B (Sunspot Data), the time series was identified as an AR(2) time series.
The autocorrelation at lag 1 is r_1 = 0.807 and the autocorrelation at lag 2 is r_2 = 0.429.
The equations for the estimates of the parameters of this series are:
$$0.807 = \hat\beta_1 + 0.807\hat\beta_2$$
$$0.429 = 0.807\hat\beta_1 + \hat\beta_2$$
which have solution
$$\hat\beta_1 = 1.321, \qquad \hat\beta_2 = -0.637$$
Since $\delta = \mu(1 - \beta_1 - \beta_2)$, it can be estimated as follows:
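The two Yule-Walker equations can be solved directly; a small sketch (the function name is mine):

```python
def yule_walker_ar2(r1, r2):
    """Solve r1 = b1 + b2*r1 and r2 = b1*r1 + b2 for (b1, b2)."""
    det = 1 - r1 * r1                 # determinant of the 2x2 correlation matrix
    b1 = r1 * (1 - r2) / det
    b2 = (r2 - r1 * r1) / det
    return b1, b2

b1, b2 = yule_walker_ar2(0.807, 0.429)
print(round(b1, 3), round(b2, 3))    # 1.321 -0.637
```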
-
Thus, with
$$\hat\delta = \bar{x}\left(1 - \hat\beta_1 - \hat\beta_2\right) = \bar{x}(1 - 1.321 + 0.637) = 14.9$$
the identified model in this case is:
$$x_t = 1.321x_{t-1} - 0.637x_{t-2} + u_t + 14.9$$
-
Maximum Likelihood Estimation
of the parameters of an ARMA(p,q) Series
-
It is important to note that:
finding the values $\theta_1, \theta_2, \ldots, \theta_k$ to maximize $L(\theta_1, \theta_2, \ldots, \theta_k)$ is equivalent to finding the values that maximize $l(\theta_1, \theta_2, \ldots, \theta_k) = \ln L(\theta_1, \theta_2, \ldots, \theta_k)$.
$l(\theta_1, \theta_2, \ldots, \theta_k)$ is called the log-likelihood function.
-
Again let $\{u_t : t \in T\}$ be identically distributed and uncorrelated with mean zero. In addition, assume that each is normally distributed.
Consider the time series $\{x_t : t \in T\}$ defined by the equation:
$$(*)\quad x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$
-
Assume that $x_1, x_2, \ldots, x_N$ are observations on the time series up to time t = N.
To estimate the p + q + 2 parameters $\beta_1, \ldots, \beta_p$; $\alpha_1, \ldots, \alpha_q$; $\delta$, $\sigma^2$ by the method of Maximum Likelihood estimation, we need to find the joint density function of $x_1, x_2, \ldots, x_N$:
$$f(x_1, \ldots, x_N \mid \beta_1, \ldots, \beta_p;\ \alpha_1, \ldots, \alpha_q;\ \delta, \sigma^2) = f(\mathbf{x} \mid \boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2)$$
-
We know that $u_1, u_2, \ldots, u_N$ are independent normal with mean zero and variance $\sigma^2$.
Thus the joint density function of $u_1, u_2, \ldots, u_N$, $g(u_1, \ldots, u_N; \sigma^2) = g(\mathbf{u}; \sigma^2)$, is given by:
$$g(\mathbf{u};\sigma^2) = \left[\frac{1}{\sqrt{2\pi}\,\sigma}\right]^{N} \exp\left\{-\frac{1}{2\sigma^2}\sum_{t=1}^{N} u_t^2\right\}$$
-
The system of equations:
$$x_1 = \beta_1 x_0 + \beta_2 x_{-1} + \cdots + \beta_p x_{1-p} + \delta + u_1 + \alpha_1 u_0 + \alpha_2 u_{-1} + \cdots + \alpha_q u_{1-q}$$
$$x_2 = \beta_1 x_1 + \beta_2 x_0 + \cdots + \beta_p x_{2-p} + \delta + u_2 + \alpha_1 u_1 + \alpha_2 u_0 + \cdots + \alpha_q u_{2-q}$$
$$\vdots$$
$$x_N = \beta_1 x_{N-1} + \beta_2 x_{N-2} + \cdots + \beta_p x_{N-p} + \delta + u_N + \alpha_1 u_{N-1} + \alpha_2 u_{N-2} + \cdots + \alpha_q u_{N-q}$$
-
can be solved for:
$$u_t = u_t(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta), \qquad t = 1, \ldots, N$$
(The Jacobian of the transformation is 1.)
-
Then the joint density of $\mathbf{x}$ given $\mathbf{x}^*$ and $\mathbf{u}^*$ is given by:
$$f(\mathbf{x} \mid \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2) = \left[\frac{1}{\sqrt{2\pi}\,\sigma}\right]^{N} \exp\left\{-\frac{1}{2\sigma^2}\sum_{t=1}^{N} u_t^2(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta)\right\}$$
$$= \left[\frac{1}{\sqrt{2\pi}\,\sigma}\right]^{N} \exp\left\{-\frac{1}{2\sigma^2}\,S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta)\right\}$$
where
$$S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta) = \sum_{t=1}^{N} u_t^2(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta)$$
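$S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta)$ can be computed recursively once starting values are chosen; a sketch, assuming pre-sample x's are set to the sample mean and pre-sample u's to zero (the starting-value choices discussed on the later slides; the function name is mine):

```python
def conditional_ss(x, beta, alpha, delta):
    """S*(beta, alpha, delta): sum of squared residuals u_t, computed
    recursively with pre-sample x's = sample mean and pre-sample u's = 0."""
    p, q = len(beta), len(alpha)
    xbar = sum(x) / len(x)
    u = []
    for t in range(len(x)):
        ar = sum(beta[i] * (x[t - 1 - i] if t - 1 - i >= 0 else xbar)
                 for i in range(p))
        ma = sum(alpha[j] * (u[t - 1 - j] if t - 1 - j >= 0 else 0.0)
                 for j in range(q))
        u.append(x[t] - ar - ma - delta)
    return sum(v * v for v in u)
```

A numerical optimizer would minimize this function over $(\boldsymbol\beta, \boldsymbol\alpha, \delta)$ and then set $\hat\sigma^2 = S^*/N$, as derived on the following slides.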
-
Let:
$$L(\boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2 \mid \mathbf{x}, \mathbf{x}^*, \mathbf{u}^*) = \left[\frac{1}{\sqrt{2\pi}\,\sigma}\right]^{N} \exp\left\{-\frac{1}{2\sigma^2}\,S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta)\right\}$$
where again
$$S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta) = \sum_{t=1}^{N} u_t^2(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta)$$
This is the conditional likelihood function.
-
The conditional log-likelihood function is:
$$l(\boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2 \mid \mathbf{x}, \mathbf{x}^*, \mathbf{u}^*) = \ln L(\boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2 \mid \mathbf{x}, \mathbf{x}^*, \mathbf{u}^*)$$
$$= -\frac{N}{2}\ln(2\pi) - \frac{N}{2}\ln\sigma^2 - \frac{1}{2\sigma^2}\,S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta)$$
-
The values $\hat{\boldsymbol\beta}, \hat{\boldsymbol\alpha}, \hat\delta$ that maximize $L(\boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2 \mid \mathbf{x}, \mathbf{x}^*, \mathbf{u}^*)$ and $l(\boldsymbol\beta, \boldsymbol\alpha, \delta, \sigma^2 \mid \mathbf{x}, \mathbf{x}^*, \mathbf{u}^*)$ are the values that minimize $S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta)$, with
$$\hat\sigma^2 = \frac{1}{N}\,S^*\!\left(\hat{\boldsymbol\beta}, \hat{\boldsymbol\alpha}, \hat\delta\right) = \frac{1}{N}\sum_{t=1}^{N} u_t^2\!\left(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \hat{\boldsymbol\beta}, \hat{\boldsymbol\alpha}, \hat\delta\right)$$
-
Comment:
The minimization of
$$S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta) = \sum_{t=1}^{N} u_t^2(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta)$$
requires an iterative numerical minimization procedure to find $\hat{\boldsymbol\beta}, \hat{\boldsymbol\alpha}, \hat\delta$:
- steepest descent
- simulated annealing
- etc.
-
Comment:
The computation of
$$S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta) = \sum_{t=1}^{N} u_t^2(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta)$$
for specific values of $\boldsymbol\beta, \boldsymbol\alpha, \delta$ can be achieved by using the forecast equations:
$$u_t = x_t - \hat{x}_{t-1}(1)$$
where $\hat{x}_{t-1}(1)$ is the one-step-ahead forecast of $x_t$.
-
Comment:
The minimization of
$$S^*(\boldsymbol\beta, \boldsymbol\alpha, \delta) = \sum_{t=1}^{N} u_t^2(\mathbf{x}, \mathbf{x}^*, \mathbf{u}^*; \boldsymbol\beta, \boldsymbol\alpha, \delta)$$
assumes we know the starting values of the time series $\{x_t : t \in T\}$ and $\{u_t : t \in T\}$, namely $\mathbf{x}^*$ and $\mathbf{u}^*$.
-
Approaches:
1. Use estimated values:
$$\bar{x} \text{ for the components of } \mathbf{x}^*, \qquad 0 \text{ for the components of } \mathbf{u}^*$$
2. Use forecasting and backcasting equations to estimate the values.

Backcasting:
-
Backcasting:
If the time series $\{x_t : t \in T\}$ satisfies the equation:
$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$
it can also be shown to satisfy the equation:
$$x_t = \beta_1 x_{t+1} + \beta_2 x_{t+2} + \cdots + \beta_p x_{t+p} + \delta + u_t + \alpha_1 u_{t+1} + \alpha_2 u_{t+2} + \cdots + \alpha_q u_{t+q}$$
Both equations result in a time series with the same mean, variance and autocorrelation function.
In the same way that the first equation can be used to forecast into the future, the second equation can be used to backcast into the past.
-
Approaches to handling starting values of the series $\{x_t : t \in T\}$ and $\{u_t : t \in T\}$:
1. Initially start with the values:
$$\bar{x} \text{ for the components of } \mathbf{x}^*, \qquad 0 \text{ for the components of } \mathbf{u}^*$$
2. Estimate the parameters of the model using Maximum Likelihood estimation and the conditional likelihood function.
3. Use the estimated parameters to backcast the components of $\mathbf{x}^*$. The backcasted components of $\mathbf{u}^*$ will still be zero.
-
4. Repeat steps 2 and 3 until the estimates stabilize.
This algorithm is an application of the E-M algorithm. This general algorithm is frequently used when there are missing values.
The E stands for Expectation (using a model to estimate the missing values).
The M stands for Maximization (Maximum Likelihood estimation, the process used to estimate the parameters of the model).
-
Some Examples using:
Minitab
Statistica
S-Plus
SAS