Time Series Basics Fin250f: Lecture 3.1 Fall 2005 Reading: Taylor, chapter 3.1-3.3.


Time Series Basics

Fin250f: Lecture 3.1

Fall 2005

Reading: Taylor, chapter 3.1-3.3

Outline

Random variables
Distributions
Central limit theorem
Two variables
Independence
Time series definitions

Random Variables: Discrete

p_i = Pr(X = x_i), i = 1, …, N

∑_{i=1}^{N} p_i = 1

E(X) = ∑_{i=1}^{N} p_i x_i

Var(X) = E((X − E(X))²) = ∑_{i=1}^{N} p_i (x_i − E(X))²
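As a quick sketch (not part of the original slides), these discrete formulas can be checked directly in Python; the fair-die example is made up for illustration.

```python
# E(X) and Var(X) computed from a discrete probability table {x_i: p_i}.
def discrete_mean(pmf):
    # E(X) = sum_i p_i * x_i
    return sum(p * x for x, p in pmf.items())

def discrete_var(pmf):
    # Var(X) = sum_i p_i * (x_i - E(X))**2
    mu = discrete_mean(pmf)
    return sum(p * (x - mu) ** 2 for x, p in pmf.items())

# Hypothetical example: a fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}
print(discrete_mean(die))  # close to 3.5
print(discrete_var(die))   # close to 35/12
```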

Random Variables: Continuous

F(x) = Pr(X ≤ x)

F(−∞) = 0, F(∞) = 1

f(x) = dF(x)/dx, f(x) ≥ 0

∫_{−∞}^{∞} f(x) dx = 1

Random Variables: Continuous

E(X) = ∫_{−∞}^{∞} x f(x) dx = μ

E(g(X)) = ∫_{−∞}^{∞} g(x) f(x) dx

Var(X) = ∫_{−∞}^{∞} (x − μ)² f(x) dx = σ²

m_n = E((X − μ)ⁿ) = ∫_{−∞}^{∞} (x − μ)ⁿ f(x) dx
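A minimal numerical sketch (not from the lecture): the moment integrals above can be approximated by a Riemann sum for a normal density; the values of μ, σ and the grid are arbitrary choices.

```python
import math

# Check total mass, E(X), and Var(X) of the N(mu, sigma^2) density
# by brute-force Riemann sums over a wide grid.
mu, sigma = 1.0, 0.5

def f(x):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

dx = 0.001
grid = [-5.0 + i * dx for i in range(12_000)]        # covers [-5, 7), ~12 sigma each side
mass = sum(f(x) * dx for x in grid)                  # integral of f: should be ~1
EX = sum(x * f(x) * dx for x in grid)                # E(X): should be ~mu
VarX = sum((x - mu) ** 2 * f(x) * dx for x in grid)  # Var(X): should be ~sigma^2
print(mass, EX, VarX)
```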

Random Variables: Continuous

skewness(X) = m₃ / m₂^1.5 = m₃ / σ³

kurtosis(X) = m₄ / m₂² = m₄ / σ⁴
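As an illustrative check (not in the original deck), the sample versions of these ratios can be computed on simulated Gaussian data, for which skewness should be near 0 and kurtosis near 3.

```python
import random

# Sample skewness m3 / m2**1.5 and kurtosis m4 / m2**2 on a large
# Gaussian sample.
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(200_000)]
mu = sum(xs) / len(xs)

def central_moment(n):
    # m_n = average of (x - mean)**n
    return sum((x - mu) ** n for x in xs) / len(xs)

m2, m3, m4 = central_moment(2), central_moment(3), central_moment(4)
skew = m3 / m2 ** 1.5
kurt = m4 / m2 ** 2
print(skew, kurt)   # near 0 and 3 for normal data
```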

Important Distributions

Uniform
Normal
Log normal
Student-t
Stable

Normal/Gaussian

f(x; μ, σ) = (1 / (σ√(2π))) exp(−(1/2)((x − μ)/σ)²)

X ~ N(μ, σ²)

Normal Picture: Sample = 2000

Normal Exponential Expectations

X ~ N (μ ,σ 2 )

E(eαX) = exp(αμ + ½α²σ²)
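This exponential-expectation identity is easy to verify by Monte Carlo (a sketch, not from the lecture; the parameter values are arbitrary).

```python
import random, math

# Monte Carlo check of E(exp(alpha*X)) = exp(alpha*mu + 0.5*alpha^2*sigma^2)
# for X ~ N(mu, sigma^2).
random.seed(1)
mu, sigma, alpha = 0.05, 0.2, 1.0
n = 200_000
mc = sum(math.exp(alpha * random.gauss(mu, sigma)) for _ in range(n)) / n
exact = math.exp(alpha * mu + 0.5 * alpha ** 2 * sigma ** 2)
print(mc, exact)   # the two should agree to a few decimal places
```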

Why Important in Finance?

Central limit theorem
Many returns almost normal

Log Normal

WT = log(PT) − log(P0) = log(PT / P0)

WT = ∑_{t=1}^{T} rt′, rt′ = log(1 + rt)

log(X) ~ N(μ, σ²)
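A small sketch (not part of the slides) of why the log of a price relative is the natural object here: the T-period log return telescopes into a sum of one-period log returns. The simulated price path is made up for illustration.

```python
import math, random

# Check log(P_T / P_0) = sum_t log(1 + r_t) on a simulated price path.
random.seed(2)
prices = [100.0]
for _ in range(250):
    prices.append(prices[-1] * (1 + random.gauss(0.0005, 0.01)))

simple_rets = [prices[t] / prices[t - 1] - 1 for t in range(1, len(prices))]
log_rets = [math.log(1 + r) for r in simple_rets]

W_T = sum(log_rets)                        # sum of one-period log returns
direct = math.log(prices[-1] / prices[0])  # T-period log return
print(W_T, direct)                         # identical up to floating-point error
```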

Log Normal

Not symmetric
Long right tail

Log Normal Histogram (Sample = 5000)

Chi-square

Xi ~ N(μ, σ²)

Y = ∑_{i=1}^{n} ((Xi − μ) / σ)²

Y ~ χ²(n)
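As a sketch (not from the lecture), the chi-square construction can be simulated directly; a χ²(n) variate has mean n and variance 2n, and the values of μ, σ, n below are arbitrary.

```python
import random

# Y = sum of n squared standardized normals should be chi-square(n).
random.seed(3)
mu, sigma, n = 1.0, 2.0, 5

def chi2_draw():
    return sum(((random.gauss(mu, sigma) - mu) / sigma) ** 2 for _ in range(n))

draws = [chi2_draw() for _ in range(100_000)]
mean = sum(draws) / len(draws)
var = sum((y - mean) ** 2 for y in draws) / len(draws)
print(mean, var)   # near n = 5 and 2n = 10
```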

Student-t

W ~ N(0, 1)

V ~ χ²(r)

T = W / √(V / r)
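This construction can be simulated as a sketch (not from the slides): build t(r) draws from a standard normal and a chi-square; r = 6 is an arbitrary choice, for which the variance is r/(r−2) = 1.5.

```python
import random, math

# Student-t(r) draws via T = W / sqrt(V / r), W ~ N(0,1), V ~ chi-square(r).
random.seed(4)
r = 6

def t_draw():
    W = random.gauss(0, 1)
    V = sum(random.gauss(0, 1) ** 2 for _ in range(r))   # chi-square(r)
    return W / math.sqrt(V / r)

draws = [t_draw() for _ in range(100_000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(mean, var)   # mean near 0; variance near r/(r-2) = 1.5
```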

Student-t Moments

Moments of order r and above do not exist

E(X^m) = ∞, m ≥ r

Stable Distribution

Similar shape to normal
Infinite variance
Sums of stable RV’s are stable

Central Limit Theorem (casual)

Zi any random variables, independent, Var(Zi) < ∞

U = ∑_{i=1}^{n} Zi

For n large, U ~ N(μ, σ²)
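A casual numerical sketch of the theorem (not from the deck): sums of independent uniforms, a flat and decidedly non-normal distribution, already look Gaussian; the choices of n and the number of replications are arbitrary.

```python
import random

# Sums of n independent U(0,1) draws: mean n/2, and skewness/kurtosis
# close to the Gaussian values 0 and 3.
random.seed(5)
n, reps = 30, 50_000
sums = [sum(random.random() for _ in range(n)) for _ in range(reps)]

mu = sum(sums) / reps
m2 = sum((s - mu) ** 2 for s in sums) / reps
m3 = sum((s - mu) ** 3 for s in sums) / reps
m4 = sum((s - mu) ** 4 for s in sums) / reps
print(mu, m3 / m2 ** 1.5, m4 / m2 ** 2)  # mean near n/2; skew near 0; kurtosis near 3
```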

Consequence of CLT and continuous compounding

rt+1,2 = rt + rt+1

rt+1,k = ∑_{i=0}^{k−1} rt+i

If var(rt) < ∞

rt+1,k ~ N(μ, σ²)

Two Variables

F(x, y) = P(X ≤ x,Y ≤ y)

f(x, y) = ∂²F / ∂x∂y

f(x | y) = f(x, y) / fY(y)

E(Y | x) = ∫_{−∞}^{∞} y f(y | x) dy

E(Y | X)

More on Two Variables

cov(X,Y ) = E[(X − E(X))(Y − E(Y ))]

= E(XY ) − E(X)E(Y )

cor(X, Y) = cov(X, Y) / (σX σY) = ρ

−1 ≤ ρ ≤ 1

More Two Variables

cov(a + bX, c + dY) = bd cov(X, Y)

cor(a + bX, c + dY) = cor(X, Y) (for bd > 0)

E(aX + bY) = aE(X) + bE(Y)

var(aX + bY) = a² var(X) + b² var(Y) + 2ab cov(X, Y)
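The variance-of-a-linear-combination rule can be checked numerically as a sketch (not from the lecture); the coefficients and the way Y depends on X below are made up.

```python
import random

# Check var(aX + bY) = a^2 var(X) + b^2 var(Y) + 2ab cov(X, Y)
# on simulated correlated data.
random.seed(6)
n, a, b = 100_000, 2.0, -1.5
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [0.6 * x + random.gauss(0, 1) for x in xs]   # Y correlated with X

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((u - m) ** 2 for u in v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

lhs = var([a * x + b * y for x, y in zip(xs, ys)])
rhs = a ** 2 * var(xs) + b ** 2 * var(ys) + 2 * a * b * cov(xs, ys)
print(lhs, rhs)   # identical up to floating-point error
```

With the same divisor on every sample moment, the identity holds exactly in the sample, not just in expectation.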

Independent Random Variables

f (x, y) = fX (x) fY (y)

f (y | x) = fY (y)

E(Y | x) = E(Y )

E(YX) = E(X)E(Y )

cov(Y ,X) = cor(Y ,X) = 0

More than Two RV’s

F(y1, y2, y3, …) = Pr(Y1 ≤ y1, Y2 ≤ y2, Y3 ≤ y3, …)

E(a + ∑_{i=1}^{n} bi Yi) = a + ∑_{i=1}^{n} bi E(Yi)

var(a + ∑_{i=1}^{n} bi Yi) = ∑_{i=1}^{n} bi² var(Yi) + 2 ∑_{i=1}^{n−1} ∑_{j=i+1}^{n} bi bj cov(Yi, Yj)

Multivariate Normal

Y ~ N(μ, Ω)

μi = E(Yi)

Ωi,j = cov(Yi, Yj)

Independence

E(Yi Yj) = E(Yi) E(Yj)

f(y1, y2, y3, …) = f(y1) f(y2) f(y3) ⋯

f(y1 | y2, y3, …) = fY1(y1)

Independent Identically Distributed

All random variables drawn from same distribution
All are independent of each other
Common assumption: IID, IID Gaussian

Stochastic Processes

Xt

(x1, x2, x3, …, xT)

E(Xt | xt−1, xt−2, …) = E(Xt | t − 1)

Time Series Definitions

Strictly stationary
Covariance stationary
Uncorrelated
White noise
Random walk
Martingale

Strictly Stationary

All distributional features are independent of time

F(xt, xt−1, …, xt−m)

Covariance Stationary

Variances and covariances independent of time

cov(Xt, Xt−j)

var(Xt)

Uncorrelated

cor(Xt, Xt−j) = cov(Xt, Xt−j) = 0

White Noise

Covariance stationary
Uncorrelated
Mean zero

Random Walk

pt = pt−1 + et, et IID

Geometric Random Walk

log(pt) = log(pt−1) + et, et IID
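Both definitions are easy to simulate as a sketch (not from the slides); the shock parameters and starting price are illustrative, not from the lecture.

```python
import random, math

# Random walk p_t = p_{t-1} + e_t with IID N(0,1) shocks, and a
# geometric random walk where the log price follows a random walk.
random.seed(7)
T = 500

p = [0.0]                          # random walk in levels
for _ in range(T):
    p.append(p[-1] + random.gauss(0, 1))

log_price = [math.log(100.0)]      # geometric: log price is a random walk
for _ in range(T):
    log_price.append(log_price[-1] + random.gauss(0.0002, 0.01))
prices = [math.exp(lp) for lp in log_price]

print(p[-1], prices[-1])
```

Note that the level walk can go negative, while the geometric walk keeps prices strictly positive.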

Martingale

E(Pt+1 | t) = Pt

Autocovariances/correlations

ρj = cor(Xt, Xt−j) = cov(Xt, Xt−j) / σX²

Uncorrelated: ρj = 0 for j > 0
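As a final sketch (not part of the deck), the sample autocorrelations of IID noise illustrate the "uncorrelated" case: every ρj with j > 0 should be near zero, within sampling error of order 1/√n.

```python
import random

# Sample autocorrelations rho_j = cov(X_t, X_{t-j}) / var(X_t)
# for simulated IID Gaussian noise.
random.seed(8)
x = [random.gauss(0, 1) for _ in range(20_000)]
mu = sum(x) / len(x)
var_x = sum((u - mu) ** 2 for u in x) / len(x)

def autocorr(series, j):
    n = len(series)
    c = sum((series[t] - mu) * (series[t - j] - mu) for t in range(j, n)) / n
    return c / var_x

rhos = [autocorr(x, j) for j in (1, 2, 3)]
print(rhos)   # all near 0 for uncorrelated data
```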

Outline

Random variables
Distributions
Central limit theorem
Two variables
Independence
Time series definitions