
AMS 570 Professor Wei Zhu

*Review: The derivative and integral of some important functions one should remember.

$$\frac{d}{dx}\left(x^{k}\right)=kx^{k-1},\qquad \frac{d}{dx}\left(e^{x}\right)=e^{x},\qquad \frac{d}{dx}\left(\ln x\right)=\frac{1}{x}$$

$$\int_{a}^{b}x^{k}\,dx=\frac{x^{k+1}}{k+1}\Big|_{x=a}^{x=b}=\frac{b^{k+1}-a^{k+1}}{k+1},\qquad \int_{a}^{b}e^{x}\,dx=e^{x}\Big|_{x=a}^{x=b}=e^{b}-e^{a}$$
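These identities can be spot-checked symbolically, for example with Python's sympy (a quick sketch; the fixed exponent k = 3 is an arbitrary illustrative choice):

```python
import sympy as sp

x, a, b = sp.symbols('x a b')
k = sp.Integer(3)  # any fixed power; swap in other values to spot-check the general rule

# Derivatives: d/dx x^k = k x^(k-1),  d/dx e^x = e^x,  d/dx ln x = 1/x
assert sp.diff(x**k, x) == k * x**(k - 1)
assert sp.diff(sp.exp(x), x) == sp.exp(x)
assert sp.diff(sp.log(x), x) == 1 / x

# Definite integrals: int_a^b x^k dx = (b^(k+1) - a^(k+1))/(k+1),  int_a^b e^x dx = e^b - e^a
assert sp.simplify(sp.integrate(x**k, (x, a, b)) - (b**(k + 1) - a**(k + 1)) / (k + 1)) == 0
assert sp.simplify(sp.integrate(sp.exp(x), (x, a, b)) - (sp.exp(b) - sp.exp(a))) == 0
print("all identities check out")
```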

*The Chain Rule

$$\frac{d}{dx}\,g[f(x)] = g'[f(x)]\,f'(x)$$
For example:
$$\frac{d}{dx}\,e^{x^{2}} = e^{x^{2}}\cdot 2x$$

*The Product Rule

$$\frac{d}{dx}\,[g(x)f(x)] = g'(x)f(x) + g(x)f'(x)$$

*Review: MGF, its second function. The m.g.f. will also generate the moments.

Moments:

1st (population) moment: $E(X)=\int_{-\infty}^{\infty} x\,f(x)\,dx$

2nd (population) moment: $E(X^{2})=\int_{-\infty}^{\infty} x^{2}\,f(x)\,dx$


kth (population) moment: $E(X^{k})=\int_{-\infty}^{\infty} x^{k}\,f(x)\,dx$

Take the kth derivative of $M_X(t)$ with respect to t and then set t = 0; we obtain the kth moment of X as follows:

$$\frac{d}{dt}M_X(t)\Big|_{t=0} = E(X)$$
$$\frac{d^{2}}{dt^{2}}M_X(t)\Big|_{t=0} = E(X^{2})$$
$$\vdots$$
$$\frac{d^{k}}{dt^{k}}M_X(t)\Big|_{t=0} = E(X^{k})$$

Note: The above general rules can be easily proven using calculus.

Example: When $X \sim N(\mu, \sigma^{2})$, we want to verify the above equations for k = 1 & k = 2.
$$\frac{d}{dt}M_X(t) = \frac{d}{dt}\,e^{\mu t + \frac{1}{2}\sigma^{2}t^{2}} = e^{\mu t + \frac{1}{2}\sigma^{2}t^{2}}\left(\mu + \sigma^{2}t\right)$$
(using the Chain Rule). So when t = 0,
$$\frac{d}{dt}M_X(t)\Big|_{t=0} = \mu = E(X)$$
$$\frac{d^{2}}{dt^{2}}M_X(t) = \frac{d}{dt}\left[\frac{d}{dt}M_X(t)\right] = \frac{d}{dt}\left[e^{\mu t + \frac{1}{2}\sigma^{2}t^{2}}\left(\mu + \sigma^{2}t\right)\right] = e^{\mu t + \frac{1}{2}\sigma^{2}t^{2}}\left(\mu + \sigma^{2}t\right)^{2} + \sigma^{2}e^{\mu t + \frac{1}{2}\sigma^{2}t^{2}}$$
(using the result of the Product Rule). And
$$\frac{d^{2}}{dt^{2}}M_X(t)\Big|_{t=0} = \mu^{2} + \sigma^{2} = E(X^{2})$$
Considering $\mathrm{Var}(X) = E(X^{2}) - [E(X)]^{2} = \mu^{2} + \sigma^{2} - \mu^{2} = \sigma^{2}$, both results agree with what we know for $N(\mu, \sigma^{2})$.
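The example above can also be double-checked symbolically, e.g. with Python's sympy:

```python
import sympy as sp

t, mu = sp.symbols('t mu')
sigma = sp.symbols('sigma', positive=True)

# m.g.f. of X ~ N(mu, sigma^2)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

EX = sp.diff(M, t).subs(t, 0)       # first moment
EX2 = sp.diff(M, t, 2).subs(t, 0)   # second moment
var = sp.simplify(EX2 - EX**2)      # Var(X) = E(X^2) - [E(X)]^2

print(EX)              # mu
print(sp.expand(EX2))  # mu**2 + sigma**2
print(var)             # sigma**2
```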

*** Joint distribution and independence

Definition. The joint moment generating function of two random variables X and Y is defined as
$$M_{X,Y}(t_1, t_2) = E\!\left(e^{t_1 X + t_2 Y}\right)$$

Theorem. Two random variables X and Y are independent ⇔ (if and only if)
$$M_{X,Y}(t_1, t_2) = M_X(t_1)\,M_Y(t_2) \ \Leftrightarrow\ f_{X,Y}(x, y) = f_X(x)\,f_Y(y)$$

Definition. The covariance of two random variables X and Y is defined as $\mathrm{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]$.

Theorem. If two random variables X and Y are independent, then we have $\mathrm{Cov}(X, Y) = 0$. (*Note: However, $\mathrm{Cov}(X, Y) = 0$ does not necessarily mean that X and Y are independent.)

Exercise (Q1). Let $X_1$ and $X_2$ be i.i.d. $N(\mu, \sigma^{2})$ random variables, and let $W = X_1 + X_2$ and $Z = X_1 - X_2$. Prove that W and Z are independent, in two approaches: (1) pdf, (2) mgf.


Solution: (1) The pdf approach: Let
$$W = X_1 + X_2, \qquad Z = X_1 - X_2$$
Then we have
$$X_1 = \frac{W + Z}{2}, \qquad X_2 = \frac{W - Z}{2}$$
This is a 1-1 transformation between $(X_1, X_2)$ and $(W, Z)$. Define the Jacobian J of the transformation as:
$$J = \begin{vmatrix} \dfrac{\partial x_1}{\partial w} & \dfrac{\partial x_1}{\partial z} \\[4pt] \dfrac{\partial x_2}{\partial w} & \dfrac{\partial x_2}{\partial z} \end{vmatrix} = \begin{vmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{2} & -\tfrac{1}{2} \end{vmatrix} = -\frac{1}{2}$$
Then the joint distribution of the new random variables is given by:
$$f_{W,Z}(w, z) = f_{X_1, X_2}\!\left(\frac{w+z}{2}, \frac{w-z}{2}\right)|J|$$
Since $X_1$ and $X_2$ are independent, we have:
$$f_{X_1, X_2}(x_1, x_2) = f_{X_1}(x_1)\,f_{X_2}(x_2)$$
Thus we find:
$$f_{W,Z}(w, z) = f_{X_1}\!\left(\frac{w+z}{2}\right) f_{X_2}\!\left(\frac{w-z}{2}\right)|J| = \frac{1}{2\pi\sigma^{2}}\cdot\frac{1}{2}\exp\!\left[-\frac{\left(\frac{w+z}{2}-\mu\right)^{2} + \left(\frac{w-z}{2}-\mu\right)^{2}}{2\sigma^{2}}\right]$$
$$= \frac{1}{\sqrt{2\pi}\,\sqrt{2}\,\sigma}\exp\!\left[-\frac{(w-2\mu)^{2}}{2\cdot 2\sigma^{2}}\right]\cdot \frac{1}{\sqrt{2\pi}\,\sqrt{2}\,\sigma}\exp\!\left[-\frac{z^{2}}{2\cdot 2\sigma^{2}}\right] = f_W(w)\,f_Z(z)$$
So Z and W are independent. Furthermore, we see that $W \sim N(2\mu,\ 2\sigma^{2})$ and $Z \sim N(0,\ 2\sigma^{2})$.

(2) The mgf approach: Now we prove the independence of W and Z using the joint mgf.
$$M_{W,Z}(t_1, t_2) = E\!\left(e^{t_1 W + t_2 Z}\right) = E\!\left(e^{t_1(X_1+X_2) + t_2(X_1-X_2)}\right) = E\!\left(e^{(t_1+t_2)X_1 + (t_1-t_2)X_2}\right)$$
Since $X_1$ and $X_2$ are independent,
$$\Rightarrow\ M_{W,Z}(t_1, t_2) = E\!\left(e^{(t_1+t_2)X_1}\right) E\!\left(e^{(t_1-t_2)X_2}\right) = M_{X_1}(t_1+t_2)\,M_{X_2}(t_1-t_2)$$
$$= \exp\!\left[\mu(t_1+t_2) + \tfrac{1}{2}\sigma^{2}(t_1+t_2)^{2}\right]\exp\!\left[\mu(t_1-t_2) + \tfrac{1}{2}\sigma^{2}(t_1-t_2)^{2}\right]$$
$$= \exp\!\left[2\mu t_1 + \tfrac{1}{2}(2\sigma^{2})t_1^{2}\right]\exp\!\left[\tfrac{1}{2}(2\sigma^{2})t_2^{2}\right] = M_W(t_1)\,M_Z(t_2)$$
where $M_W(t_1)$ is the mgf of $N(2\mu, 2\sigma^{2})$ and $M_Z(t_2)$ is the mgf of $N(0, 2\sigma^{2})$. Since the joint mgf factors into the product of the two marginal mgfs, W and Z are independent, with $W \sim N(2\mu, 2\sigma^{2})$ and $Z \sim N(0, 2\sigma^{2})$.
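A small Monte Carlo sketch, assuming the setup above ($X_1, X_2$ i.i.d. $N(\mu, \sigma^{2})$; the values μ = 1.5, σ = 2 are only illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 1.5, 2.0, 1_000_000   # illustrative parameter values

x1 = rng.normal(mu, sigma, n)
x2 = rng.normal(mu, sigma, n)
w, z = x1 + x2, x1 - x2

# W should be approximately N(2*mu, 2*sigma^2), Z approximately N(0, 2*sigma^2)
print(w.mean(), w.var())   # ~ 3.0, ~ 8.0
print(z.mean(), z.var())   # ~ 0.0, ~ 8.0

# Independence implies zero correlation (the converse is not true in general)
print(np.corrcoef(w, z)[0, 1])   # ~ 0.0
```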

Q2. Let $X \sim N(0, 1)$ and $Y = X^{2}$. (1) Please derive the covariance between X and Y; (2) Are X and Y independent?


Solution: (1)
$$\mathrm{Cov}(X, Y) = \mathrm{Cov}(X, X^{2}) = E[(X - E(X))(X^{2} - E(X^{2}))] = E[X(X^{2} - 1)] = E(X^{3}) - E(X)$$
$$= E(X^{3}) = \frac{d^{3}}{dt^{3}}M_X(t)\Big|_{t=0} = \frac{d^{3}}{dt^{3}}\,e^{t^{2}/2}\Big|_{t=0} = 0$$
(2) X and Y are not independent. You can show this in many ways, using (a) pdf, (b) cdf/probabilities, or (c) mgf. For example,
$$P(X > 1,\ Y \le 1) = 0 \ \ne\ P(X > 1)\,P(Y \le 1) = P(X > 1)\,P(-1 \le X \le 1)$$
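A simulation sketch of Q2 under the $X \sim N(0,1)$, $Y = X^{2}$ setup above: the sample covariance is essentially zero, yet the two variables are clearly dependent.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = x**2

# Sample covariance is close to 0 ...
print(np.cov(x, y)[0, 1])                 # ~ 0.0

# ... yet X and Y are clearly dependent: knowing Y pins down |X|
print(np.mean((x > 1) & (y <= 1)))        # exactly 0 by construction
print(np.mean(x > 1) * np.mean(y <= 1))   # ~ 0.16 * 0.68, clearly nonzero
```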

Q3. Let $X \sim N(3, 4)$, please calculate $P(1 < X < 3)$.
$$P(1 < X < 3) = P\!\left(\frac{1-3}{2} < \frac{X-3}{2} < \frac{3-3}{2}\right) = P(-1 < Z < 0) = P(0 < Z < 1) = \Phi(1) - \Phi(0) = 0.3413$$
where $Z = \frac{X-3}{2} \sim N(0, 1)$.
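A numerical check of Q3 with scipy, reading N(3, 4) as mean 3 and variance 4 (standard deviation 2), consistent with the standardization above:

```python
from scipy.stats import norm

# X ~ N(mean=3, sd=2); scipy's norm is parameterized by (loc, scale) = (mean, sd)
p = norm.cdf(3, loc=3, scale=2) - norm.cdf(1, loc=3, scale=2)
print(round(p, 4))   # 0.3413
```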


Q4. Students A and B plan to meet at the SAC Seawolf market between 12 noon and 1 pm tomorrow. The one who arrives first will wait for the other for 30 minutes and then leave. What is the chance that the two friends will be able to meet at SAC during their appointed time period, assuming that each one will arrive at SAC independently, each at a random time between 12 noon and 1 pm?


Solution: 3/4, as can be illustrated by sketching the unit square. Let X and Y denote the arrival times (in hours, where 0 indicates 12 noon and 1 indicates 1 pm) of A and B, respectively. Our assumption implies that X and Y each follow the Uniform[0, 1] distribution and, furthermore, that they are independent. Thus the joint pdf of X and Y is uniform on the unit square: f(x, y) = 1 if (x, y) is in the square region, and f(x, y) = 0 outside the square region. A and B will be able to meet iff $|X - Y| \le \tfrac{1}{2}$, represented by the region M bounded by the lines $y = x + \tfrac{1}{2}$ and $y = x - \tfrac{1}{2}$.

That is: P(A & B will meet) =
$$P\!\left(|X - Y| \le \tfrac{1}{2}\right) = \iint_M f(x, y)\,dx\,dy = \text{area of } M = 1 - 2\cdot\tfrac{1}{2}\left(\tfrac{1}{2}\right)^{2} = \tfrac{3}{4}$$
(the complement of M within the square consists of two right triangles, each with legs of length 1/2 and hence area 1/8).
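A Monte Carlo check of Q4:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Arrival times of A and B, independent and uniform on [0, 1] hour after noon
x = rng.random(n)
y = rng.random(n)

# They meet iff the first to arrive waits no more than half an hour
print(np.mean(np.abs(x - y) <= 0.5))   # ~ 0.75
```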

*Definitions: population correlation & sample correlation

Definition: The population correlation coefficient ρ is defined as:


$$\rho = \frac{\mathrm{Cov}(X, Y)}{\sqrt{\mathrm{Var}(X)\,\mathrm{Var}(Y)}}$$

Definition: Let $(X_1, Y_1), \ldots, (X_n, Y_n)$ be a random sample from a given bivariate population; then the sample correlation coefficient r is defined as:
$$r = \frac{\sum_{i=1}^{n}(X_i - \bar{X})(Y_i - \bar{Y})}{\sqrt{\left[\sum_{i=1}^{n}(X_i - \bar{X})^{2}\right]\left[\sum_{i=1}^{n}(Y_i - \bar{Y})^{2}\right]}}$$
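For instance, r can be computed directly from this definition and cross-checked against numpy's built-in (the data below are simulated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 0.6 * x + rng.normal(size=200)   # illustrative dependent data

# Sample correlation coefficient r, computed from the definition
xc, yc = x - x.mean(), y - y.mean()
r = np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2))

print(r)
print(np.corrcoef(x, y)[0, 1])   # same value, via numpy's built-in
```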

*Definition: Bivariate Normal Random Variable

$(X, Y) \sim BN(\mu_X, \sigma_X^{2};\ \mu_Y, \sigma_Y^{2};\ \rho)$, where $\rho$ is the correlation between X & Y. The joint p.d.f. of (X, Y) is:
$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^{2}}}\exp\!\left\{-\frac{1}{2(1-\rho^{2})}\left[\left(\frac{x-\mu_X}{\sigma_X}\right)^{2} - 2\rho\left(\frac{x-\mu_X}{\sigma_X}\right)\!\left(\frac{y-\mu_Y}{\sigma_Y}\right) + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^{2}\right]\right\}$$

Exercise: Please derive the mgf of the bivariate normal distribution.

Q5. Let X and Y be random variables with joint pdf

$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^{2}}}\exp\!\left\{-\frac{1}{2(1-\rho^{2})}\left[\left(\frac{x-\mu_X}{\sigma_X}\right)^{2} - 2\rho\left(\frac{x-\mu_X}{\sigma_X}\right)\!\left(\frac{y-\mu_Y}{\sigma_Y}\right) + \left(\frac{y-\mu_Y}{\sigma_Y}\right)^{2}\right]\right\}$$
where $-\infty < x < \infty$, $-\infty < y < \infty$. Then X and Y are said to have the bivariate normal distribution. The joint moment generating function for X and Y is

$$M(t_1, t_2) = \exp\!\left\{\mu_X t_1 + \mu_Y t_2 + \tfrac{1}{2}\left(\sigma_X^{2}t_1^{2} + 2\rho\sigma_X\sigma_Y t_1 t_2 + \sigma_Y^{2}t_2^{2}\right)\right\}.$$

(a) Find the marginal distributions (pdf's) of X and Y;


(b) Prove that X and Y are independent if and only if ρ = 0. (Here ρ is, as the notation suggests, the population correlation coefficient between X and Y.)
(c) Find the distribution of $X + Y$.
(d) Find the conditional pdf's f(x|y) and f(y|x).

Solution: (a) The moment generating function of X can be given by
$$M_X(t) = M(t, 0) = \exp\!\left[\mu_X t + \tfrac{1}{2}\sigma_X^{2}t^{2}\right]$$
Similarly, the moment generating function of Y can be given by
$$M_Y(t) = M(0, t) = \exp\!\left[\mu_Y t + \tfrac{1}{2}\sigma_Y^{2}t^{2}\right]$$

Thus, X and Y are both marginally normally distributed, i.e., $X \sim N(\mu_X, \sigma_X^{2})$ and $Y \sim N(\mu_Y, \sigma_Y^{2})$.

The pdf of X is
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X}\exp\!\left[-\frac{(x-\mu_X)^{2}}{2\sigma_X^{2}}\right]$$
The pdf of Y is
$$f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma_Y}\exp\!\left[-\frac{(y-\mu_Y)^{2}}{2\sigma_Y^{2}}\right]$$

(b) If ρ = 0, then
$$M(t_1, t_2) = \exp\!\left\{\mu_X t_1 + \mu_Y t_2 + \tfrac{1}{2}\left(\sigma_X^{2}t_1^{2} + \sigma_Y^{2}t_2^{2}\right)\right\} = M(t_1, 0)\,M(0, t_2)$$
Therefore, X and Y are independent. Conversely, if X and Y are independent, then
$$M(t_1, t_2) = M(t_1, 0)\,M(0, t_2) = \exp\!\left\{\mu_X t_1 + \mu_Y t_2 + \tfrac{1}{2}\left(\sigma_X^{2}t_1^{2} + \sigma_Y^{2}t_2^{2}\right)\right\}$$
On the other hand, the given joint mgf is
$$M(t_1, t_2) = \exp\!\left\{\mu_X t_1 + \mu_Y t_2 + \tfrac{1}{2}\left(\sigma_X^{2}t_1^{2} + 2\rho\sigma_X\sigma_Y t_1 t_2 + \sigma_Y^{2}t_2^{2}\right)\right\}$$
Matching the two expressions for all $t_1, t_2$ gives ρ = 0.

(c)
$$M_{X+Y}(t) = E\!\left(e^{t(X+Y)}\right) = E\!\left(e^{tX + tY}\right)$$
Recall that $M(t_1, t_2) = E\!\left(e^{t_1 X + t_2 Y}\right)$; therefore we can obtain $M_{X+Y}(t)$ by setting $t_1 = t_2 = t$ in $M(t_1, t_2)$. That is,
$$M_{X+Y}(t) = M(t, t) = \exp\!\left\{(\mu_X + \mu_Y)t + \tfrac{1}{2}\left(\sigma_X^{2} + 2\rho\sigma_X\sigma_Y + \sigma_Y^{2}\right)t^{2}\right\}$$
$$\therefore\ X + Y \sim N\!\left(\mu_X + \mu_Y,\ \sigma_X^{2} + 2\rho\sigma_X\sigma_Y + \sigma_Y^{2}\right)$$
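A numerical sanity check of part (c), with illustrative parameter values and numpy's multivariate normal sampler:

```python
import numpy as np

rng = np.random.default_rng(4)
mu_x, mu_y = 1.0, -2.0
sd_x, sd_y, rho = 2.0, 3.0, 0.5   # illustrative values

cov = [[sd_x**2, rho * sd_x * sd_y],
       [rho * sd_x * sd_y, sd_y**2]]
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=1_000_000)
s = xy[:, 0] + xy[:, 1]

print(s.mean())   # ~ mu_x + mu_y = -1.0
print(s.var())    # ~ sd_x^2 + 2*rho*sd_x*sd_y + sd_y^2 = 4 + 6 + 9 = 19
```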

(d) The conditional distribution of X given Y = y is given by
$$f(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{1}{\sqrt{2\pi}\,\sigma_X\sqrt{1-\rho^{2}}}\exp\!\left\{-\frac{\left[x - \left(\mu_X + \rho\frac{\sigma_X}{\sigma_Y}(y-\mu_Y)\right)\right]^{2}}{2\sigma_X^{2}(1-\rho^{2})}\right\}$$

Similarly, the conditional distribution of Y given X = x is
$$f(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{1}{\sqrt{2\pi}\,\sigma_Y\sqrt{1-\rho^{2}}}\exp\!\left\{-\frac{\left[y - \left(\mu_Y + \rho\frac{\sigma_Y}{\sigma_X}(x-\mu_X)\right)\right]^{2}}{2\sigma_Y^{2}(1-\rho^{2})}\right\}$$

Therefore:
$$X \mid Y = y \ \sim\ N\!\left(\mu_X + \rho\frac{\sigma_X}{\sigma_Y}(y - \mu_Y),\ (1-\rho^{2})\sigma_X^{2}\right)$$
$$Y \mid X = x \ \sim\ N\!\left(\mu_Y + \rho\frac{\sigma_Y}{\sigma_X}(x - \mu_X),\ (1-\rho^{2})\sigma_Y^{2}\right)$$
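Part (d) can likewise be checked by simulation: condition on X falling in a narrow slice around a fixed value $x_0$ and compare the slice's mean and variance of Y with the formulas above (illustrative parameter values):

```python
import numpy as np

rng = np.random.default_rng(5)
mu_x, mu_y, sd_x, sd_y, rho = 0.0, 1.0, 1.0, 2.0, 0.7   # illustrative values

cov = [[sd_x**2, rho * sd_x * sd_y],
       [rho * sd_x * sd_y, sd_y**2]]
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=2_000_000)
x, y = xy[:, 0], xy[:, 1]

x0 = 1.0                              # condition on X close to x0
y_slice = y[np.abs(x - x0) < 0.01]    # narrow slice approximating X = x0

print(y_slice.mean())  # ~ mu_y + rho*(sd_y/sd_x)*(x0 - mu_x) = 1 + 1.4 = 2.4
print(y_slice.var())   # ~ (1 - rho^2)*sd_y^2 = 0.51 * 4 = 2.04
```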

Q6: Let $Z \sim N(0, 1)$, and suppose the value of the random variable W depends on a (fair) coin flip:
$$W = \begin{cases} Z, & \text{if the coin lands heads (probability } 1/2) \\ -Z, & \text{if the coin lands tails (probability } 1/2) \end{cases}$$
Find the distribution of W. Is the joint distribution of Z and W a bivariate normal?


Answer: For any real w,
$$F_W(w) = P(W \le w) = P(W \le w \mid \text{heads})P(\text{heads}) + P(W \le w \mid \text{tails})P(\text{tails})$$
$$= P(Z \le w)\cdot\tfrac{1}{2} + P(-Z \le w)\cdot\tfrac{1}{2} = \Phi(w)\cdot\tfrac{1}{2} + \Phi(w)\cdot\tfrac{1}{2} = \Phi(w)$$
(since $-Z \sim N(0, 1)$ as well). So $W \sim N(0, 1)$.

The joint distribution of Z and W is not normal. This can be shown by deriving the joint mgf of Z and W and comparing it with the joint mgf of a bivariate normal:
$$M_{Z,W}(t_1, t_2) = E\!\left(e^{t_1 Z + t_2 W}\right) = E\!\left[E\!\left(e^{t_1 Z + t_2 W} \mid \text{coin}\right)\right]$$
$$= E\!\left(e^{t_1 Z + t_2 Z}\right)\cdot\tfrac{1}{2} + E\!\left(e^{t_1 Z - t_2 Z}\right)\cdot\tfrac{1}{2} = \tfrac{1}{2}e^{\frac{(t_1+t_2)^{2}}{2}} + \tfrac{1}{2}e^{\frac{(t_1-t_2)^{2}}{2}}$$
This is not the joint mgf of a bivariate normal. Alternatively, you can also derive the following probability:
$$P(Z + W = 0) = P(Z + W = 0 \mid \text{heads})P(\text{heads}) + P(Z + W = 0 \mid \text{tails})P(\text{tails}) = 0\cdot\tfrac{1}{2} + 1\cdot\tfrac{1}{2} = \tfrac{1}{2}$$
This shows that W + Z cannot possibly follow a univariate normal distribution (a normal random variable puts zero probability on any single point), which in turn shows that W and Z cannot possibly follow a bivariate normal distribution (any linear combination of a bivariate normal pair is univariate normal).
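A simulation sketch of Q6, assuming the coin-flip construction W = ±Z above: W looks standard normal marginally, yet Z + W has a point mass at 0, which no bivariate normal pair could produce.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000

z = rng.standard_normal(n)
flip = rng.integers(0, 2, size=n)   # fair coin: 0 = tails, 1 = heads
w = np.where(flip == 1, z, -z)      # W = Z on heads, W = -Z on tails

# Marginally, W behaves like N(0, 1)
print(w.mean(), w.var())            # ~ 0.0, ~ 1.0

# But Z + W equals 0 exactly half the time -- impossible for a bivariate normal (Z, W)
print(np.mean(z + w == 0))          # ~ 0.5
```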