
Probability theory 2008

Outline of lecture 5

The multivariate normal distribution

Characterizing properties of the univariate normal distribution
Different definitions of normal random vectors
Conditional distributions
Independence
Cochran's theorem

Probability theory 2008

The univariate normal distribution - defining properties

A distribution is normal if and only if it has the probability density

f_X(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

where \mu \in \mathbb{R} and \sigma > 0.

A distribution is normal if and only if the sample mean

\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i

and the sample variance

s^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2

are independent for all n.
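A minimal simulation sketch of the second property, assuming NumPy is available: for normal samples the correlation between the sample mean and the sample variance is close to zero, while for a skewed sample (e.g. exponential) it is not. Zero correlation is of course weaker than independence; the sketch is only an illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_var_correlation(sampler, n=10, reps=100_000):
    """Correlate the sample mean with the sample variance over many repeated samples."""
    x = sampler(size=(reps, n))
    means = x.mean(axis=1)
    variances = x.var(axis=1, ddof=1)
    return np.corrcoef(means, variances)[0, 1]

# Normal samples: correlation close to 0 (mean and variance are independent).
print(mean_var_correlation(rng.standard_normal))
# Exponential samples: clearly positive correlation (no independence).
print(mean_var_correlation(lambda size: rng.exponential(size=size)))
```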

Probability theory 2008

The univariate normal distribution - defining properties

Suppose that X1 and X2 are independent of each other, and that the same is true for the pair

\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} X_1 \\ X_2 \end{pmatrix},

where no coefficient vanishes. Then all four variables are normal.

Special case: rotations other than multiples of 90 degrees

[Figure: the coordinate axes x1 and x2 under such a rotation]
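A simulation sketch of the special case (assumptions: NumPy; a 45-degree rotation; dependence of the rotated pair is probed only through the correlation of the squared components, which is enough to reveal dependence here but is not a full test of independence):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = np.pi / 4  # a rotation that is not a multiple of 90 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def squared_corr_after_rotation(x):
    """Rotate independent pairs and correlate the squares of the new components."""
    y = x @ R.T
    return np.corrcoef(y[:, 0] ** 2, y[:, 1] ** 2)[0, 1]

n = 200_000
normal_pairs = rng.standard_normal((n, 2))
uniform_pairs = rng.uniform(-1, 1, size=(n, 2))

print(squared_corr_after_rotation(normal_pairs))   # ~0: rotated normals stay independent
print(squared_corr_after_rotation(uniform_pairs))  # clearly nonzero: dependence appears
```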

Probability theory 2008

The univariate normal distribution - defining properties

Let F be a class of distributions such that

X ∈ F  ⟹  a + bX ∈ F

Can F be comprised of distributions other than the normal distributions?

cf. Cauchy distributions

Probability theory 2008

The multivariate normal distribution - a first definition

A random vector is normal if and only if every linear combination of its components is normal

Immediate consequences:

Every component is normal

The sum of all components is normal

Every marginal distribution is normal

Vectors in which the components are independent normal random variables are normal

Linear transformations of normal random vectors give rise to new normal vectors
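A small sketch of the first definition in action (assumptions: NumPy/SciPy available; the mean vector, covariance matrix and weight vector below are arbitrary example values): any linear combination t'X of a normal vector should behave like N(t'μ, t'Λt).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

mu = np.array([1.0, -2.0, 0.5])                    # example mean vector
Lam = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.5]])                  # example covariance matrix
t = np.array([1.0, 2.0, -1.0])                     # an arbitrary linear combination

x = rng.multivariate_normal(mu, Lam, size=100_000)
z = x @ t

print(z.mean(), t @ mu)            # sample mean vs t'mu
print(z.var(ddof=1), t @ Lam @ t)  # sample variance vs t' Lambda t
# Goodness-of-fit check against the predicted normal distribution:
print(stats.kstest(z, "norm", args=(t @ mu, np.sqrt(t @ Lam @ t))))
```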

Probability theory 2008

Illustrations of independent and dependent normal distributions

Probability theory 2008

Illustrations of independent and dependent normal distributions

http://stat.sm.u-tokai.ac.jp/~yama/graphics/bnormE.html

Probability theory 2008

Parameterization of the multivariate normal distribution

Is a multivariate normal distribution uniquely determined by the vector of expected values and the covariance matrix?

Is there a multivariate normal distribution for any covariance matrix?

Probability theory 2008

Fundamental results for covariance matrices

Let Λ be a covariance matrix.

Since Λ is symmetric there exists an orthogonal matrix C

(C'C = CC' = I) such that

C'ΛC = D and Λ = CDC'

where D is a diagonal matrix.

Since Λ is also nonnegative-definite, there exists a symmetric matrix B such that

BB = Λ

If X has independent components with variance 1, Y = BX has covariance matrix Λ.
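A sketch of this construction (assuming NumPy; the covariance matrix is an arbitrary example): B is obtained from the eigendecomposition Λ = CDC' as B = C D^{1/2} C', and Y = BX then has covariance matrix Λ.

```python
import numpy as np

rng = np.random.default_rng(0)

Lam = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.5]])          # example covariance matrix

# Eigendecomposition Lambda = C D C' with C orthogonal, D diagonal.
eigvals, C = np.linalg.eigh(Lam)
B = C @ np.diag(np.sqrt(eigvals)) @ C.T    # symmetric square root: B B = Lambda

X = rng.standard_normal((100_000, 3))      # independent components, variance 1
Y = X @ B.T                                # Y = B X for each sample

print(np.allclose(B @ B, Lam))             # True: B is a square root of Lambda
print(np.cov(Y, rowvar=False))             # close to Lambda
```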

Probability theory 2008

The multivariate normal distribution - a second definition

A random vector is normal if and only if it has a characteristic function of the form

\varphi_{\mathbf{X}}(\mathbf{t}) = E\, e^{i\mathbf{t}'\mathbf{X}} = \exp\left( i\mathbf{t}'\boldsymbol{\mu} - \tfrac{1}{2}\mathbf{t}'\Lambda\mathbf{t} \right)

where Λ is a nonnegative-definite, symmetric matrix and μ is a vector of constants.

Proof of the equivalence of definition I and II:

Let X ∈ N(μ, Λ) according to definition I, and set Z = t'X. Then E(Z) = t'μ and Var(Z) = t'Λt, and φ_Z(1) gives the desired expression.

Let X ∈ N(μ, Λ) according to definition II. Then we can derive the characteristic function of any linear combination of its components and show that it is normally distributed.
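A simulation sketch of definition II (assuming NumPy; μ, Λ and t are arbitrary example values): the empirical characteristic function, obtained by averaging exp(it'X) over samples, should be close to exp(it'μ - t'Λt/2).

```python
import numpy as np

rng = np.random.default_rng(0)

mu = np.array([1.0, -2.0, 0.5])             # example mean vector
Lam = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.5]])           # example covariance matrix
t = np.array([0.3, -0.2, 0.4])              # argument of the characteristic function

x = rng.multivariate_normal(mu, Lam, size=200_000)

empirical = np.mean(np.exp(1j * x @ t))                  # average of exp(i t'X)
theoretical = np.exp(1j * t @ mu - 0.5 * t @ Lam @ t)    # exp(i t'mu - t'Lambda t / 2)

print(empirical)
print(theoretical)
```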

Probability theory 2008

The multivariate normal distribution - a third definition

Let Y be normal with independent standard normal components and set

\mathbf{X} = \Lambda^{1/2}\mathbf{Y} + \boldsymbol{\mu} \qquad (\text{so that } \mathbf{Y} = \Lambda^{-1/2}(\mathbf{X} - \boldsymbol{\mu})).

Then

f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\sqrt{\det\Lambda}} \exp\left( -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})'\Lambda^{-1}(\mathbf{x}-\boldsymbol{\mu}) \right),

provided that the determinant is non-zero.
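A quick numerical sketch of this density formula (assumptions: NumPy/SciPy; μ, Λ and the evaluation point are arbitrary example values), comparing it with scipy.stats.multivariate_normal.pdf:

```python
import numpy as np
from scipy import stats

mu = np.array([1.0, -2.0, 0.5])             # example mean vector
Lam = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.5]])           # example (non-singular) covariance matrix
x = np.array([0.7, -1.5, 1.2])              # point at which to evaluate the density

n = len(mu)
diff = x - mu
quad_form = diff @ np.linalg.inv(Lam) @ diff
density = np.exp(-0.5 * quad_form) / np.sqrt((2 * np.pi) ** n * np.linalg.det(Lam))

print(density)
print(stats.multivariate_normal(mean=mu, cov=Lam).pdf(x))   # should agree
```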

Probability theory 2008

The multivariate normal distribution - a fourth definition

Let Y be normal with independent standard normal components and set

\mathbf{X} = A\mathbf{Y} + \boldsymbol{\mu}.

Then X is said to be a normal random vector.

Probability theory 2008

The multivariate normal distribution - conditional distributions

All conditional distributions in a multivariate normal vector are normal.

The conditional distribution of each component is equal to that of a linear combination of the other components plus a random error

Probability theory 2008

The multivariate normal distribution - conditional distributions and optimal predictors

For any random vector X it is known that E(Xn | X1, …, Xn-1) is an optimal predictor of Xn based on X1, …, Xn-1 and that

Xn = E(Xn | X1, …, Xn-1) + ε,

where ε is uncorrelated with the conditional expectation.

For normal random vectors X, the optimal predictor E(Xn | X1, …, Xn-1) is a linear expression in X1, …, Xn-1

Probability theory 2008

The multivariate normal distribution - calculation of conditional distributions

Let X ∈ N(0, Λ), where

\Lambda = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 6 & 0 \\ 1 & 0 & 4 \end{pmatrix}.

Determine the conditional distribution of X3 given X1 and X2.

Set Z = aX1 + bX2 + c.

Minimize the variance of the prediction error Z - X3.
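A sketch of this calculation (assuming NumPy, and using the covariance matrix as reconstructed above): for a zero-mean normal vector the minimizing coefficients are given by the partitioned-covariance formula, with c = 0, and the variance of the prediction error is the corresponding Schur complement.

```python
import numpy as np

# Covariance matrix as reconstructed above.
Lam = np.array([[1.0, 2.0, 1.0],
                [2.0, 6.0, 0.0],
                [1.0, 0.0, 4.0]])

S11 = Lam[:2, :2]        # Cov of (X1, X2)
S31 = Lam[2, :2]         # Cov of X3 with (X1, X2)
S33 = Lam[2, 2]          # Var of X3

coef = S31 @ np.linalg.inv(S11)                         # (a, b) in E(X3 | X1, X2) = a*X1 + b*X2
cond_var = S33 - S31 @ np.linalg.inv(S11) @ S31         # variance of the conditional distribution

print(coef)       # linear predictor coefficients
print(cond_var)   # conditional (normal) variance
```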

Probability theory 2008

The multivariate normal vector - uncorrelated and independent components

The components of a normal random vector are independent if and only if they are uncorrelated

Probability theory 2008

The multivariate normal distribution - orthogonal transformations

Let X be a normal random vector with independent standard normal components, and let C be an orthogonal matrix.

Then

Y = CX

has independent, standard normal components

Probability theory 2008

Quadratic forms of the components of a multivariate normal distribution – one-way analysis of variance

Let X_ij, i = 1, …, k, j = 1, …, n_i, be k samples of observations and let n = n_1 + … + n_k. Then the total variation in the X-values can be decomposed as follows:

\sum_{i=1}^{k}\sum_{j=1}^{n_i} X_{ij}^2 = n\bar{X}_{..}^2 + \sum_{i=1}^{k} n_i(\bar{X}_{i.} - \bar{X}_{..})^2 + \sum_{i=1}^{k}\sum_{j=1}^{n_i} (X_{ij} - \bar{X}_{i.})^2

In matrix form,

\mathbf{X}'I\mathbf{X} = \mathbf{X}'A_1\mathbf{X} + \mathbf{X}'A_2\mathbf{X} + \mathbf{X}'A_3\mathbf{X}

Probability theory 2008

Here A_1 is the n × n matrix with every entry equal to 1/n,

A_1 = \frac{1}{n}\begin{pmatrix} 1 & \cdots & 1 \\ \vdots & & \vdots \\ 1 & \cdots & 1 \end{pmatrix},

A_2 = \operatorname{blockdiag}\left(\frac{1}{n_1}J_{n_1}, \ldots, \frac{1}{n_k}J_{n_k}\right) - A_1,

where J_m denotes the m × m matrix of ones (so the i-th diagonal block of the first term has every entry equal to 1/n_i), and

A_3 = I - A_1 - A_2.
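A sketch that builds these matrices and checks the decomposition numerically (assuming NumPy and, as above, the standard one-way ANOVA projection matrices; the group sizes are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(0)
group_sizes = [4, 3, 5]                    # example values of n_1, ..., n_k
n = sum(group_sizes)

A1 = np.full((n, n), 1.0 / n)              # every entry 1/n
block = np.zeros((n, n))                   # block diagonal of (1/n_i) * ones
start = 0
for ni in group_sizes:
    block[start:start + ni, start:start + ni] = 1.0 / ni
    start += ni
A2 = block - A1
A3 = np.eye(n) - block

x = rng.normal(size=n)
groups = np.repeat(np.arange(len(group_sizes)), group_sizes)
group_means = np.array([x[groups == i].mean() for i in range(len(group_sizes))])

# Quadratic forms vs. the corresponding sums of squares
print(x @ A1 @ x, n * x.mean() ** 2)
print(x @ A2 @ x, sum(ni * (m - x.mean()) ** 2 for ni, m in zip(group_sizes, group_means)))
print(x @ A3 @ x, sum(((x[groups == i] - group_means[i]) ** 2).sum() for i in range(len(group_sizes))))
print(np.allclose(A1 + A2 + A3, np.eye(n)))   # the three matrices sum to the identity
```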

Probability theory 2008

Decomposition theorem for nonnegative-definite quadratic forms

Let

\mathbf{x}'\mathbf{x} = \sum_{i=1}^{n} x_i^2 = Q_1 + Q_2 + \ldots + Q_p

where Q_i = \mathbf{x}'A_i\mathbf{x}, i = 1, …, p, are nonnegative-definite quadratic forms with \operatorname{Rank}(A_i) = r_i and \sum_{i=1}^{p} r_i = n.

Then there exists an orthogonal matrix C such that with x = Cy (y = C'x)

Q_1 = y_1^2 + \ldots + y_{r_1}^2
Q_2 = y_{r_1+1}^2 + \ldots + y_{r_1+r_2}^2
\vdots
Q_p = y_{n-r_p+1}^2 + \ldots + y_n^2

Probability theory 2008

Decomposition theorem for nonnegative-definite quadratic forms (Cochran’s theorem)

Let X_1, …, X_n be independent and N(0, σ²) and suppose that

\mathbf{X}'\mathbf{X} = \sum_{i=1}^{n} X_i^2 = Q_1 + Q_2 + \ldots + Q_p

where Q_i = \mathbf{X}'A_i\mathbf{X}, i = 1, …, p, are nonnegative-definite quadratic forms with \operatorname{Rank}(A_i) = r_i and \sum_{i=1}^{p} r_i = n.

Then there exists an orthogonal matrix C such that with X = CY (Y = C'X)

Q_1 = Y_1^2 + \ldots + Y_{r_1}^2
Q_2 = Y_{r_1+1}^2 + \ldots + Y_{r_1+r_2}^2
\vdots
Q_p = Y_{n-r_p+1}^2 + \ldots + Y_n^2

Furthermore, Q_1, …, Q_p are independent and σ²χ²-distributed with r_1, …, r_p degrees of freedom.

Probability theory 2008

Quadratic forms of the components of a multivariate normal distribution – one-way analysis of variance

Let X_ij, i = 1, …, k, j = 1, …, n_i, be independent and N(0, σ²). Then the total sum of squares can be decomposed into three quadratic forms,

\sum_{i=1}^{k}\sum_{j=1}^{n_i} X_{ij}^2 = n\bar{X}_{..}^2 + \sum_{i=1}^{k} n_i(\bar{X}_{i.} - \bar{X}_{..})^2 + \sum_{i=1}^{k}\sum_{j=1}^{n_i} (X_{ij} - \bar{X}_{i.})^2 = Q_1 + Q_2 + Q_3,

which are independent and σ²χ²-distributed with 1, k-1, and n-k degrees of freedom.
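A simulation sketch of this application (assuming NumPy/SciPy; the group sizes and σ are arbitrary example values): the data are drawn repeatedly, the three quadratic forms are computed, and Q_i/σ² is compared with the corresponding χ² distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_sizes = np.array([4, 3, 5])           # example values of n_1, ..., n_k
k, n = len(group_sizes), group_sizes.sum()
sigma = 2.0                                 # example value of sigma
groups = np.repeat(np.arange(k), group_sizes)

def quadratic_forms():
    x = rng.normal(0.0, sigma, size=n)
    grand_mean = x.mean()
    group_means = np.array([x[groups == i].mean() for i in range(k)])
    q1 = n * grand_mean ** 2
    q2 = (group_sizes * (group_means - grand_mean) ** 2).sum()
    q3 = ((x - group_means[groups]) ** 2).sum()
    return q1, q2, q3

q = np.array([quadratic_forms() for _ in range(20_000)])

# Q1/sigma^2, Q2/sigma^2, Q3/sigma^2 should follow chi^2(1), chi^2(k-1), chi^2(n-k).
for col, df in zip(q.T, (1, k - 1, n - k)):
    print(stats.kstest(col / sigma ** 2, "chi2", args=(df,)))
# The three forms should also be (pairwise) uncorrelated.
print(np.corrcoef(q, rowvar=False))
```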

Probability theory 2008

Exercises: Chapter V

5.1, 5.2, 5.6, 5.8, 5.14, 5.16, 5.17, 5.27