Chapter 4. Multiple Random Variables
Transcript of Chapter 4. Multiple Random Variables
tch-prob 1
Chapter 4. Multiple Random Variables

In some random experiments, a number of different quantities are measured.

Ex. 4.1. Select a student's name ζ from an urn (the sample space S) and measure
H(ζ): height
W(ζ): weight
A(ζ): age
In general, each outcome yields a vector of n measurements, X = (X1, X2, ..., Xn).
4.1 Vector Random Variables

A vector random variable X is a function that assigns a vector of real numbers to each outcome ζ in S, the sample space of the random experiment.

The vector (H(ζ), W(ζ), A(ζ)) is a vector random variable.

Each event involving an n-dimensional random variable X = (X1, X2, ..., Xn) has a corresponding region in an n-dimensional real space.
Event Examples

Consider the two-dimensional random variable X = (X, Y). Find the region of the plane corresponding to the events

A = {X + Y ≤ 10},
B = {min(X, Y) ≤ 5},
C = {X^2 + Y^2 ≤ 100}.
tch-prob 4
Product Form• We are particularly interested in events that have the
product form
1 1 2 2
...
where is a one-dimensional event that involves only.n n
k k
A X in A X in A X in A
A X
1 2 2 2( , ) ( , )x y x y
1 2 2{ } { }x X x Y y 1 2 1 2{ } { }x X x y Y y
x1 x2
y1
y2
Product Form (cont.)

A fundamental problem in modeling a system with a vector random variable is specifying the probability of product-form events:

P[A] = P[{X1 in A1} ∩ {X2 in A2} ∩ ... ∩ {Xn in An}]
     = P[X1 in A1, X2 in A2, ..., Xn in An].

Many events of interest are not of product form. However, non-product-form events can be expressed as (or approximated by) unions of product-form events.

Ex. B = {min(X, Y) ≤ 5} = {X ≤ 5, Y ≤ 5} ∪ {X > 5, Y ≤ 5} ∪ {X ≤ 5, Y > 5}.
4.2 Pairs of Random Variables

A. Pairs of discrete random variables

- Let X = (X, Y) assume values from S = {(xj, yk), j = 1, 2, ..., k = 1, 2, ...}.
- The joint pmf of X is
  pX,Y(xj, yk) = P[{X = xj} ∩ {Y = yk}] = P[X = xj, Y = yk],  j = 1, 2, ..., k = 1, 2, ....
  It gives the probability of the occurrence of the pair (xj, yk).
- The probability of any event A is the sum of the pmf over the outcomes in A:
  P[X in A] = Σ_{(xj, yk) in A} pX,Y(xj, yk).
- When A = S,
  Σ_{j=1}^∞ Σ_{k=1}^∞ pX,Y(xj, yk) = 1.
Marginal pmf

- We are also interested in the probabilities of events involving each of the random variables in isolation. These can be found in terms of the marginal pmf's:
  pX(xj) = P[X = xj] = P[X = xj, Y = anything] = Σ_{k=1}^∞ pX,Y(xj, yk)
  pY(yk) = P[Y = yk] = Σ_{j=1}^∞ pX,Y(xj, yk).
- In general, knowledge of the marginal pmf's is insufficient to specify the joint pmf.
Ex. 4.6. Loaded dice: A random experiment consists of tossing two loaded dice and noting the pair of numbers (X, Y) facing up. The joint pmf pX,Y(j, k) for j, k = 1, ..., 6:

 j\k   1     2     3     4     5     6
 1    2/42  1/42  1/42  1/42  1/42  1/42
 2    1/42  2/42  1/42  1/42  1/42  1/42
 3    1/42  1/42  2/42  1/42  1/42  1/42
 4    1/42  1/42  1/42  2/42  1/42  1/42
 5    1/42  1/42  1/42  1/42  2/42  1/42
 6    1/42  1/42  1/42  1/42  1/42  2/42

The marginal pmf's are P[X = j] = P[Y = k] = 7/42 = 1/6.
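As a quick check of the marginal claim, the row sums of the table can be computed exactly with Python's fractions module (a sketch, not from the text):

```python
from fractions import Fraction

# Joint pmf of Ex. 4.6: p(j, k) = 2/42 on the diagonal, 1/42 elsewhere.
p = {(j, k): Fraction(2 if j == k else 1, 42)
     for j in range(1, 7) for k in range(1, 7)}

# Marginal pmf of X: sum the joint pmf over k.
pX = {j: sum(p[(j, k)] for k in range(1, 7)) for j in range(1, 7)}
print(pX[1])  # 1/6
```

Each row sums to 7/42 = 1/6, even though the joint pmf itself is not uniform.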
Ex. 4.7. Packetization problem: The number of bytes N in a message has a geometric distribution with parameter 1 - p and range SN = {0, 1, 2, ...}. Suppose that messages are broken into packets of maximum length M bytes. Let Q be the number of full packets and let R be the number of bytes left over, so that N = QM + R. Find the joint pmf and marginal pmf's of Q and R.
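A small numeric sketch of this problem, using the relation N = QM + R (the values p = 0.9 and M = 4 are illustrative choices, not from the text):

```python
# N geometric: P[N = n] = (1 - p) p^n, n = 0, 1, 2, ...
# Q full packets and R leftover bytes give {Q = q, R = r} = {N = qM + r}.
p, M = 0.9, 4

def pmf_N(n):
    return (1 - p) * p ** n

def joint(q, r):                 # P[Q = q, R = r]
    return pmf_N(q * M + r)

def pQ(q):                       # marginal of Q
    return sum(joint(q, r) for r in range(M))

def pR(r):                       # marginal of R (tail truncated)
    return sum(joint(q, r) for q in range(300))

# The joint pmf factors into the product of the marginals.
print(abs(joint(2, 3) - pQ(2) * pR(3)) < 1e-9)  # True
```

The factorization observed here is established analytically in Ex. 4.16.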
Joint cdf of X and Y

The joint cdf of X and Y is defined as the probability of the product-form event {X ≤ x1} ∩ {Y ≤ y1}:

FX,Y(x1, y1) = P[X ≤ x1, Y ≤ y1].

Properties:
(i) FX,Y(x1, y1) ≤ FX,Y(x2, y2) if x1 ≤ x2 and y1 ≤ y2 (nondecreasing in each argument)
(ii) FX,Y(x, -∞) = 0 and FX,Y(-∞, y) = 0
(iii) FX,Y(∞, ∞) = 1
(iv) FX(x) = FX,Y(x, ∞) = P[X ≤ x, Y < ∞] = P[X ≤ x] and FY(y) = FX,Y(∞, y) = P[Y ≤ y]  (the marginal cdf's)
Joint cdf of X and Y (cont.)

(v) FX,Y is continuous from the "north" and from the "east":
    lim_{x→a+} FX,Y(x, y) = FX,Y(a, y),  lim_{y→b+} FX,Y(x, y) = FX,Y(x, b)
(vi) P[x1 < X ≤ x2, y1 < Y ≤ y2]
     = FX,Y(x2, y2) - FX,Y(x1, y2) - FX,Y(x2, y1) + FX,Y(x1, y1).
![Page 12: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/12.jpg)
tch-prob 12
),(),(],[ 11,12,121 yxFyxFyYxXxP YXYX
x1 x2
y1
(x1,y1) (x2,y1)
y
x
y
(x2,y2)
(x1,y1)
xx1 x2y2
y1
B B
A
Joint pdf of two jointly continuous random variables

X and Y are jointly continuous if the probabilities of events involving (X, Y) can be expressed as an integral of a joint pdf fX,Y(x, y):

P[X in A] = ∫∫_A fX,Y(x', y') dx' dy'.

(1) ∫_{-∞}^{∞} ∫_{-∞}^{∞} fX,Y(x', y') dx' dy' = 1, and
    FX,Y(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} fX,Y(x', y') dy' dx'.
(2) fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x ∂y, and
    P[a1 < X ≤ b1, a2 < Y ≤ b2] = ∫_{a1}^{b1} ∫_{a2}^{b2} fX,Y(x', y') dy' dx'
    P[x < X ≤ x + dx, y < Y ≤ y + dy] ≈ fX,Y(x, y) dx dy.
![Page 14: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/14.jpg)
tch-prob 14
Marginal pdf: obtained by integrating out the variables that are not of interest.
,
,
( ) ( , ') '
( ) ( ', ) '
X X Y
Y X Y
f x f x y dy
f y f x y dx
,
marginal
X Y
d x f (x', y')dy' dx'dx
cdf
Ex. 4.10. A randomly selected point (X, Y) in the unit square has the uniform joint pdf

fX,Y(x, y) = 1 for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 elsewhere.

Find FX,Y(x, y). (The plane divides into five regions i-v that must be treated separately.)
Ex. 4.11. Find the normalization constant c and the marginal pdf's for the joint pdf

fX,Y(x, y) = c e^{-x} e^{-y} for 0 ≤ y ≤ x < ∞, and 0 elsewhere.
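The value of c can be checked numerically before doing the integral by hand; a midpoint-rule sketch (the step size and cutoff are arbitrary choices):

```python
import math

# Integrate e^(-x) e^(-y) over the wedge 0 <= y <= x < infinity.
# The inner integral over y in (0, x) has the closed form e^(-x) (1 - e^(-x)).
h, cutoff = 0.001, 30.0
total, x = 0.0, h / 2
while x < cutoff:
    total += math.exp(-x) * (1.0 - math.exp(-x)) * h
    x += h

c = 1.0 / total
print(round(c, 2))  # 2.0
```

The exact calculation gives c = 2, the value that appears in Ex. 4.17.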
Ex. 4.12. Find P[X + Y ≤ 1] in Example 4.11.
Ex. 4.13. The joint pdf of X and Y is

fX,Y(x, y) = (1 / (2π sqrt(1 - ρ²))) e^{-(x² - 2ρxy + y²) / (2(1 - ρ²))},  -∞ < x, y < ∞.

We say that X and Y are jointly Gaussian. Find the marginal pdf's.
4.3 Independence of Two Random Variables

X and Y are independent random variables if any event A1 defined in terms of X is independent of any event A2 defined in terms of Y:

P[X in A1, Y in A2] = P[X in A1] P[Y in A2].

Suppose that X and Y are discrete random variables, and let A1 = {X = xj} and A2 = {Y = yk}. If X and Y are independent, then A1 and A2 are independent events, so

pX,Y(xj, yk) = P[X = xj, Y = yk] = P[X = xj] P[Y = yk] = pX(xj) pY(yk)  for all xj and yk.
Conversely, if pX,Y(xj, yk) = pX(xj) pY(yk) for all xj and yk, then for A = A1 ∩ A2, where A1 involves only X and A2 involves only Y,

P[A] = Σ_{xj in A1} Σ_{yk in A2} pX,Y(xj, yk)
     = Σ_{xj in A1} Σ_{yk in A2} pX(xj) pY(yk)
     = (Σ_{xj in A1} pX(xj)) (Σ_{yk in A2} pY(yk))
     = P[A1] P[A2].
In general, X and Y are independent iff

FX,Y(x, y) = FX(x) FY(y),

or fX,Y(x, y) = fX(x) fY(y) if X, Y are jointly continuous.

If X and Y are independent r.v.'s, then g(X) and h(Y) are also independent:

P[g(X) in A, h(Y) in B] = P[X in A', Y in B']
                        = P[X in A'] P[Y in B']
                        = P[g(X) in A] P[h(Y) in B],

where A and A' are equivalent events, and B and B' are equivalent events.
Ex. 4.15. In the loaded dice experiment of Ex. 4.6, the tosses are not independent: pX,Y(j, j) = 2/42 ≠ (1/6)(1/6).

Ex. 4.16. Q and R in Ex. 4.7 are independent:
P[Q = q, R = r] = (1 - p) p^{qM + r}
P[Q = q] = (1 - p^M)(p^M)^q,  q = 0, 1, 2, ...
P[R = r] = (1 - p) p^r / (1 - p^M),  r = 0, 1, 2, ..., M - 1,
and the joint pmf is the product of the marginals.

Ex. 4.17. X and Y in Ex. 4.11 are not independent, even though the joint pdf appears to factor:
fX,Y(x, y) = 2 e^{-x} e^{-y} for 0 ≤ y ≤ x < ∞, and 0 elsewhere, but
fX(x) = 2 e^{-x}(1 - e^{-x}),  fY(y) = 2 e^{-2y},
so fX,Y ≠ fX fY: the support 0 ≤ y ≤ x couples X and Y.
4.4 Conditional Probability and Conditional Expectation

Many random variables of practical interest are not independent. We are often interested in the probability P[Y in A] given that X = x.

Conditional probability:
P[Y in A | X = x] = P[Y in A, X = x] / P[X = x].

A. If X is discrete, we can obtain the conditional cdf of Y given X = xk:

FY(y | xk) = P[Y ≤ y, X = xk] / P[X = xk],  for P[X = xk] > 0.

The conditional pdf, if the derivative exists, is

fY(y | xk) = d/dy FY(y | xk),

and P[Y in A | X = xk] = ∫_{y in A} fY(y | xk) dy.
If X and Y are independent,
P[Y = y | X = x] = P[Y = y],  FY(y | x) = FY(y),  fY(y | x) = fY(y).

- If X and Y are discrete,
pY(yj | xk) = P[X = xk, Y = yj] / P[X = xk] = pX,Y(xk, yj) / pX(xk).

If X and Y are independent, pY(yj | xk) = pY(yj).
B. If X is continuous, P[X = x] = 0, so define the conditional cdf of Y given X = x as a limit:

FY(y | x) = lim_{h→0} FY(y | x < X ≤ x + h)
          = lim_{h→0} P[Y ≤ y, x < X ≤ x + h] / P[x < X ≤ x + h]
          = lim_{h→0} (∫_{-∞}^{y} fX,Y(x, y') dy' · h) / (fX(x) · h)
          = ∫_{-∞}^{y} fX,Y(x, y') dy' / fX(x).

The conditional pdf is

fY(y | x) = fX,Y(x, y) / fX(x).
Theorem on total probability:

X discrete:
pY(yj | xk) = pX,Y(xk, yj) / pX(xk), so pX,Y(xk, yj) = pY(yj | xk) pX(xk), and
P[Y in A] = Σ_{all xk} P[Y in A | X = xk] pX(xk).

X continuous:
fY(y | x) = fX,Y(x, y) / fX(x), so fX,Y(x, y) = fY(y | x) fX(x), and
P[Y in A] = ∫_{-∞}^{∞} P[Y in A | X = x] fX(x) dx.

Here Y may itself be discrete or continuous; for example,
pY(yj) = Σ_{all xk} pY(yj | xk) pX(xk),  fY(y) = ∫_{-∞}^{∞} fY(y | x) fX(x) dx.
Ex. 4.22. The total number of defects X on a chip is a Poisson random variable with mean α. Suppose that each defect has probability p of falling in a specific region R, and that the location of each defect is independent of the locations of all other defects. Find the pmf of the number of defects Y that fall in the region R.

P[X = k] = (α^k / k!) e^{-α},  k = 0, 1, 2, ...

P[Y = j | X = k] = C(k, j) p^j (1 - p)^{k - j} for 0 ≤ j ≤ k, and 0 for j > k.

P[Y = j] = Σ_{k=0}^{∞} P[Y = j | X = k] P[X = k] = ... = ((αp)^j / j!) e^{-αp},

i.e. Y is Poisson with mean αp.
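The Poisson-thinning conclusion can be checked by simulation (a sketch; α = 3, p = 0.4 and the sample size are illustrative choices, not from the text):

```python
import math
import random

random.seed(1)

def poisson(lam):
    # Knuth's method; adequate for small means.
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod < limit:
            return k
        k += 1

alpha, p, n = 3.0, 0.4, 100_000
total = 0
for _ in range(n):
    x = poisson(alpha)                                   # defects on the chip
    total += sum(random.random() < p for _ in range(x))  # defects landing in R

mean_y = total / n
print(abs(mean_y - alpha * p) < 0.02)  # True: Y has mean alpha * p
```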
Ex. 4.23. The number of customers N that arrive at a service station during a time t is a Poisson random variable with parameter βt. The time T required to service each customer is an exponential random variable with parameter α. Find the pmf for the number of customers N that arrive during the service time T of a specific customer. Assume that customer arrivals are independent of the customer service time.

P[N = k | T = t] = ((βt)^k / k!) e^{-βt},  k = 0, 1, 2, ...
fT(t) = α e^{-αt},  t ≥ 0

P[N = k] = ∫_0^∞ P[N = k | T = t] fT(t) dt = α β^k / (α + β)^{k+1},  k = 0, 1, 2, ...
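The result is a geometric pmf, P[N = k] = (α/(α+β)) (β/(α+β))^k. A Monte Carlo sketch of the k = 0 case, using P[N = 0 | T = t] = e^{-βt} (the parameter values are illustrative choices, not from the text):

```python
import math
import random

random.seed(2)

alpha, beta, n = 1.0, 2.0, 100_000
zeros = 0
for _ in range(n):
    t = random.expovariate(alpha)              # service time T
    if random.random() < math.exp(-beta * t):  # no arrivals during T
        zeros += 1

estimate = zeros / n
print(abs(estimate - alpha / (alpha + beta)) < 0.01)  # True: P[N=0] = 1/3
```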
Ex. 4.24. The random variable X is selected at random from the unit interval; the random variable Y is then selected at random from the interval (0, X). Find the cdf of Y.

fX(x) = 1,  0 ≤ x ≤ 1
fY(y | x) = 1/x for 0 ≤ y ≤ x, and 0 otherwise
P[Y ≤ y | X = x] = y/x for 0 ≤ y ≤ x, and 1 for y > x

FY(y) = P[Y ≤ y] = ∫_0^1 P[Y ≤ y | X = x] fX(x) dx
      = ∫_0^y 1 dx' + ∫_y^1 (y/x') dx' = y - y ln y,  0 < y ≤ 1.
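The cdf FY(y) = y - y ln y can be checked by simulating the two-stage experiment, since Y = UX with U and X independent uniform(0, 1) (the test point y = 0.5 and the sample size are arbitrary choices):

```python
import math
import random

random.seed(3)

n, y = 200_000, 0.5
hits = sum(random.random() * random.random() <= y for _ in range(n))

exact = y - y * math.log(y)          # F_Y(y) = y - y ln y, 0 < y <= 1
print(abs(hits / n - exact) < 0.01)  # True
```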
Conditional Expectation

The conditional expectation of Y given X = x is

E[Y | x] = ∫_{-∞}^{∞} y fY(y | x) dy,

or, if X and Y are discrete, E[Y | x] = Σ_j yj pY(yj | x).

The conditional expectation E[Y | x] can be viewed as defining a function of x: g(x) = E[Y | x]. g(x) can in turn be used to define the random variable g(X) = E[Y | X]. What is E[g(X)] = E[E[Y | X]]?

E[E[Y | X]] = Σ_{xk} E[Y | xk] pX(xk)  (X discrete)
E[E[Y | X]] = ∫_{-∞}^{∞} E[Y | x] fX(x) dx  (X continuous)
E[E[Y | X]] = ∫_{-∞}^{∞} E[Y | x] fX(x) dx
            = ∫ (∫ y fY(y | x) dy) fX(x) dx
            = ∫∫ y fX,Y(x, y) dx dy
            = ∫ y fY(y) dy = E[Y].

Thus E[Y] = E[E[Y | X]]. This can be generalized to

E[h(Y)] = E[E[h(Y) | X]].
Example: X and Y have the joint pmf pX,Y(x, y) = 1/10 for integer pairs with 0 ≤ y ≤ x ≤ 3, i.e. the ten points

(0,0), (1,0), (1,1), (2,0), (2,1), (2,2), (3,0), (3,1), (3,2), (3,3),

each with probability 0.1.

Marginals:
pX(x) = 0.1, 0.2, 0.3, 0.4 for x = 0, 1, 2, 3
pY(y) = 0.4, 0.3, 0.2, 0.1 for y = 0, 1, 2, 3

E[X] = 2.0,  E[Y] = 1.
Conditional pmf's and conditional expectations:

pY(y | x = 1) = 1/2 for y = 0, 1
pY(y | x = 2) = 1/3 for y = 0, 1, 2

E[Y | x = 0] = 0
E[Y | x = 1] = (1/2)(0) + (1/2)(1) = 1/2
E[Y | x = 2] = (1/3)(0) + (1/3)(1) + (1/3)(2) = 1
E[Y | x = 3] = (1/4)(0 + 1 + 2 + 3) = 3/2

E[E[Y | X]] = (0)(0.1) + (1/2)(0.2) + (1)(0.3) + (3/2)(0.4)
            = 0 + 0.1 + 0.3 + 0.6 = 1.0 = E[Y].
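The identity E[Y] = E[E[Y | X]] can be verified exactly for this table with Python's fractions module (a sketch):

```python
from fractions import Fraction

# Joint pmf from the slide: p(x, y) = 1/10 for integer 0 <= y <= x <= 3.
p = {(x, y): Fraction(1, 10) for x in range(4) for y in range(x + 1)}

pX = {x: sum(p[(x, y)] for y in range(x + 1)) for x in range(4)}
E_Y_given = {x: sum(y * p[(x, y)] for y in range(x + 1)) / pX[x]
             for x in range(4)}

E_Y = sum(E_Y_given[x] * pX[x] for x in range(4))  # E[E[Y|X]]
print(E_Y)  # 1
```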
Ex. 4.25. Find the mean of Y in Ex. 4.22 using conditional expectation.

E[Y] = Σ_{k=0}^{∞} E[Y | X = k] P[X = k] = Σ_k kp P[X = k] = p E[X] = pα.

Ex. 4.26. Find the mean and variance of the number of customer arrivals N during the service time T of a specific customer in Ex. 4.23.

E[N | T = t] = βt, so
E[N] = ∫_0^∞ E[N | T = t] fT(t) dt = β ∫_0^∞ t fT(t) dt = β E[T] = β/α.

E[N² | T = t] = βt + (βt)², so
E[N²] = ∫_0^∞ (βt + β²t²) fT(t) dt = β E[T] + β² E[T²] = β/α + 2β²/α².

Var[N] = E[N²] - (E[N])² = β/α + β²/α².
4.5 Multiple Random Variables

Extend the methods for specifying probabilities of pairs of random variables to the case of n random variables.

The joint cdf of X1, X2, ..., Xn is defined as

FX1,...,Xn(x1, x2, ..., xn) = P[X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn].

We say that X1, X2, ..., Xn are jointly continuous random variables if

P[X in A] = ∫...∫_A fX1,...,Xn(x1', ..., xn') dx1' ... dxn',

where fX1,...,Xn(x1, ..., xn) is the joint pdf. The joint cdf is then given by

FX1,...,Xn(x1, ..., xn) = ∫_{-∞}^{x1} ... ∫_{-∞}^{xn} fX1,...,Xn(x1', ..., xn') dxn' ... dx1'.
The joint pdf, if it exists, is given by

fX1,...,Xn(x1, ..., xn) = ∂ⁿ FX1,...,Xn(x1, ..., xn) / ∂x1 ... ∂xn.

The marginal pdf of X1 is

fX1(x1) = ∫_{-∞}^{∞} ... ∫_{-∞}^{∞} fX1,...,Xn(x1, x2', ..., xn') dx2' ... dxn'.

The marginal pdf of X1, ..., X_{n-1} is

fX1,...,X_{n-1}(x1, ..., x_{n-1}) = ∫_{-∞}^{∞} fX1,...,Xn(x1, ..., x_{n-1}, xn') dxn'.

The conditional pdf of Xn given the values of X1, ..., X_{n-1} is

fXn(xn | x1, ..., x_{n-1}) = fX1,...,Xn(x1, ..., xn) / fX1,...,X_{n-1}(x1, ..., x_{n-1}),

if fX1,...,X_{n-1}(x1, ..., x_{n-1}) > 0. Repeated application gives the chain rule:

fX1,...,Xn(x1, ..., xn) = fXn(xn | x1, ..., x_{n-1}) fX_{n-1}(x_{n-1} | x1, ..., x_{n-2}) ... fX2(x2 | x1) fX1(x1).
Ex. 4.29. Random variables X1, X2, and X3 have a jointly Gaussian pdf. Find the marginal pdf of X1 and X3.

Completing the square in x2 and integrating x2 out of the joint pdf factors the result:

fX1,X3(x1, x3) = (1/√(2π)) e^{-x1²/2} · (1/√(2π)) e^{-x3²/2}.

X1 and X3 are independent zero-mean, unit-variance Gaussian r.v.s.
Independence

X1, X2, ..., Xn are independent if and only if

FX1,...,Xn(x1, ..., xn) = FX1(x1) FX2(x2) ... FXn(xn),

or fX1,...,Xn(x1, ..., xn) = fX1(x1) ... fXn(xn) if jointly continuous,
or pX1,...,Xn(x1, ..., xn) = pX1(x1) ... pXn(xn) if discrete.
4.6 Functions of Several Random Variables

Quite often we are interested in one or more functions of the random variables involved in some experiment, for example the sum, maximum, or minimum of X1, X2, ..., Xn.

Let the random variable Z be defined as Z = g(X1, X2, ..., Xn). The cdf of Z is

FZ(z) = P[Z ≤ z] = P[{(x1, ..., xn) such that g(x1, ..., xn) ≤ z}]
      = ∫...∫_{equivalent event} fX1,...,Xn(x1', ..., xn') dx1' ... dxn',

and the pdf is fZ(z) = (d/dz) FZ(z).
Example 4.31. Z = X + Y.

FZ(z) = P[Z ≤ z] = P[X + Y ≤ z] = ∫_{-∞}^{∞} ∫_{-∞}^{z - x'} fX,Y(x', y') dy' dx'

fZ(z) = (d/dz) FZ(z) = ∫_{-∞}^{∞} fX,Y(x', z - x') dx'   (superposition integral)

(The region of integration is the half-plane below the line y = -x + z.)

If X and Y are independent r.v.'s,

fZ(z) = ∫_{-∞}^{∞} fX(x') fY(z - x') dx'   (convolution integral)
Example 4.32. Sum of non-independent r.v.'s: Z = X + Y, with X, Y zero-mean, unit-variance, and correlation coefficient ρ:

fX,Y(x, y) = (1 / (2π sqrt(1 - ρ²))) e^{-(x² - 2ρxy + y²)/(2(1 - ρ²))},  -∞ < x, y < ∞.

fZ(z) = ∫_{-∞}^{∞} fX,Y(x', z - x') dx'
      = (1 / (2π sqrt(1 - ρ²))) ∫_{-∞}^{∞} e^{-[x'² - 2ρx'(z - x') + (z - x')²]/(2(1 - ρ²))} dx'.
Completing the square in x',

x'² - 2ρx'(z - x') + (z - x')² = 2(1 + ρ)(x' - z/2)² + z²(1 - ρ)/2,

so

fZ(z) = (e^{-z²/(4(1+ρ))} / (2π sqrt(1 - ρ²))) ∫_{-∞}^{∞} e^{-(x' - z/2)²/(1 - ρ)} dx'
      = e^{-z²/(2 · 2(1+ρ))} / sqrt(2π · 2(1 + ρ)).

The sum of these two non-independent Gaussian r.v.'s is also a Gaussian r.v., with zero mean and variance 2(1 + ρ).
Ex. 4.33. A system with standby redundancy: let T1 and T2 be the lifetimes of the two components. They are independent and exponentially distributed with the same mean 1/α. The system lifetime is T = T1 + T2.

fT1(x) = fT2(x) = α e^{-αx} for x ≥ 0, and 0 for x < 0.

fT(z) = ∫_0^z fT1(x) fT2(z - x) dx = ∫_0^z α e^{-αx} α e^{-α(z - x)} dx
      = α² e^{-αz} ∫_0^z dx = α² z e^{-αz},  z ≥ 0

(an Erlang pdf with m = 2).
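The Erlang result can be checked by simulating T = T1 + T2 (α = 1 and the sample size are illustrative choices):

```python
import math
import random

random.seed(4)

a, n = 1.0, 200_000
t = [random.expovariate(a) + random.expovariate(a) for _ in range(n)]

mean_t = sum(t) / n                  # Erlang (m = 2) mean is 2/a
frac = sum(s <= 1.0 for s in t) / n  # F_T(1) = 1 - 2 e^(-1) when a = 1
print(abs(mean_t - 2.0) < 0.05, abs(frac - (1 - 2 * math.exp(-1))) < 0.01)
```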
The conditional pdf can be used to find the pdf of a function of several random variables. Let Z = g(X, Y). Given Y = y, Z = g(X, y) is a function of the single r.v. X, so we can first find fZ(z | Y = y) from fX(x), and then find

fZ(z) = ∫_{-∞}^{∞} fZ(z | y') fY(y') dy'.
Example 4.34. Z = X/Y, with X and Y independent and exponentially distributed with mean one. Given Y = y (y > 0), Z = X/y is a scaled version of X:

fZ(z | y) = fX(x | y) |dx/dz| = y fX(yz | y) = y fX(yz).

fZ(z) = ∫_0^∞ y' fX(y'z) fY(y') dy' = ∫_0^∞ y' e^{-y'z} e^{-y'} dy'
      = ∫_0^∞ y' e^{-y'(1+z)} dy' = 1/(1 + z)²,  z ≥ 0.
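Integrating this pdf gives the cdf FZ(z) = z/(1 + z); a quick simulation check at z = 1 (the test point and sample size are arbitrary choices):

```python
import random

random.seed(5)

n, z = 200_000, 1.0
hits = sum(random.expovariate(1.0) / random.expovariate(1.0) <= z
           for _ in range(n))

print(abs(hits / n - z / (1 + z)) < 0.01)  # True: F_Z(1) = 1/2
```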
More generally, for Z = X/Y:

FZ(z) = ∫_0^∞ ∫_{-∞}^{yz} fX,Y(x, y) dx dy + ∫_{-∞}^0 ∫_{yz}^{∞} fX,Y(x, y) dx dy

(for y > 0 the event {X/Y ≤ z} is {X ≤ yz}; for y < 0 it is {X ≥ yz}). Differentiating with respect to z,

fZ(z) = ∫_0^∞ y fX,Y(yz, y) dy - ∫_{-∞}^0 y fX,Y(yz, y) dy = ∫_{-∞}^{∞} |y| fX,Y(yz, y) dy.
If fX,Y(x, y) = fX,Y(-x, -y), the two terms are equal, so

FZ(z) = 2 ∫_0^∞ ∫_{-∞}^{yz} fX,Y(x, y) dx dy.
Ex. Z = min(X, Y):

FZ(z) = P[min(X, Y) ≤ z] = FX(z) + FY(z) - FX,Y(z, z).

If X and Y are independent,
FZ(z) = FX(z) + FY(z) - FX(z) FY(z)
fZ(z) = fX(z) + fY(z) - fX(z) FY(z) - FX(z) fY(z).
Ex. Z = max(X, Y):

FZ(z) = P[max(X, Y) ≤ z] = FX,Y(z, z).

If X and Y are independent,
FZ(z) = FX(z) FY(z)
fZ(z) = fX(z) FY(z) + FX(z) fY(z).
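For instance, the maximum of two independent uniform(0, 1) r.v.'s has cdf FZ(z) = z², which a short simulation confirms (the test point z = 0.7 and sample size are arbitrary choices):

```python
import random

random.seed(6)

n, z = 200_000, 0.7
hits = sum(max(random.random(), random.random()) <= z for _ in range(n))

print(abs(hits / n - z * z) < 0.01)  # True: F_Z(z) = F_X(z) F_Y(z) = z^2
```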
Transformations of Random Vectors

Z1 = g1(X1, X2, ..., Xn)
Z2 = g2(X1, X2, ..., Xn)
...
Zn = gn(X1, X2, ..., Xn)

The joint cdf of Z = (Z1, Z2, ..., Zn) is

FZ1,...,Zn(z1, ..., zn) = P[g1(X) ≤ z1, ..., gn(X) ≤ zn].
Example 4.35. W = min(X, Y), Z = max(X, Y).

FW,Z(w, z) = P[{min(X, Y) ≤ w} ∩ {max(X, Y) ≤ z}].

If z > w:
FW,Z(w, z) = FX,Y(z, z) - {FX,Y(z, z) - FX,Y(w, z) - FX,Y(z, w) + FX,Y(w, w)}
           = FX,Y(w, z) + FX,Y(z, w) - FX,Y(w, w).

If z ≤ w: {max(X, Y) ≤ z} implies {min(X, Y) ≤ w}, so

FW,Z(w, z) = FX,Y(z, z).
pdf of a Linear Transformation

V = aX + bY, W = cX + eY, i.e.

(V, W)ᵀ = A (X, Y)ᵀ with A = [[a, b], [c, e]], assuming |A| = ae - bc ≠ 0, so that (x, y)ᵀ = A⁻¹ (v, w)ᵀ.

The infinitesimal rectangle with corners (x, y), (x + dx, y), (x, y + dy), (x + dx, y + dy) maps to the parallelogram with corners (v, w), (v + a dx, w + c dx), (v + b dy, w + e dy), (v + a dx + b dy, w + c dx + e dy); these are equivalent events, with probability dP.
fX,Y(x, y) dx dy = fV,W(v, w) dP, so

fV,W(v, w) = fX,Y(x, y) (dx dy / dP).

What is dP? The sides of the rectangle map to the vectors v1 = (a, c) dx and v2 = (b, e) dy, and the area of the parallelogram they span is the "stretch factor"

|v1 × v2| = |v1| |v2| sin θ = |ae - bc| dx dy.
Computing the area directly (base |(a, c)| dx, height h = the projection of (b, e) dy onto the normal of (a, c)) gives the same result:

dP = |ae - bc| dx dy = |A| dx dy,

so

fV,W(v, w) = fX,Y(x(v, w), y(v, w)) / |A|.

For an n-dimensional vector Z = A X,

fZ(z) = fX(A⁻¹ z) / |A|.
Example 4.36. X, Y jointly Gaussian with pdf

fX,Y(x, y) = (1 / (2π sqrt(1 - ρ²))) e^{-(x² - 2ρxy + y²)/(2(1 - ρ²))}.

Let (V, W)ᵀ = A (X, Y)ᵀ with

A = (1/√2) [[1, 1], [-1, 1]],  A⁻¹ = (1/√2) [[1, -1], [1, 1]],

so that V = (X + Y)/√2, W = (-X + Y)/√2, and X = (V - W)/√2, Y = (V + W)/√2. Note |A| = 1.
Substituting x = (v - w)/√2, y = (v + w)/√2 into the quadratic form gives x² - 2ρxy + y² = v²(1 - ρ) + w²(1 + ρ), so

fV,W(v, w) = fX,Y((v - w)/√2, (v + w)/√2)
           = (1 / (2π sqrt(1 - ρ²))) e^{-[v²(1 - ρ) + w²(1 + ρ)]/(2(1 - ρ²))}
           = (1 / sqrt(2π(1 + ρ))) e^{-v²/(2(1 + ρ))} · (1 / sqrt(2π(1 - ρ))) e^{-w²/(2(1 - ρ))}.

V and W are independent, zero-mean Gaussian r.v.'s with variance 1 + ρ and 1 - ρ, respectively. (See Fig. 4-16: contours of equal value of the joint pdf of X and Y.)
pdf of a General Transformation

V = g1(X, Y), W = g2(X, Y). Assume that v = g1(x, y), w = g2(x, y) is invertible, i.e.,

x = h1(v, w), y = h2(v, w).

(Fig. 4.17a.) To first order,

g1(x + dx, y) ≈ g1(x, y) + (∂g1/∂x) dx = v + (∂g1/∂x) dx
g2(x + dx, y) ≈ w + (∂g2/∂x) dx,

so the constants a, b, c, e of the linear case are replaced by the partial derivatives

a = ∂v/∂x, b = ∂v/∂y, c = ∂w/∂x, e = ∂w/∂y.
The image of the infinitesimal rectangle at (x, y) under (g1, g2) is, to first order, the parallelogram with corners

(v, w) = (g1(x, y), g2(x, y))
(v + (∂g1/∂x) dx, w + (∂g2/∂x) dx)
(v + (∂g1/∂y) dy, w + (∂g2/∂y) dy)
(v + (∂g1/∂x) dx + (∂g1/∂y) dy, w + (∂g2/∂x) dx + (∂g2/∂y) dy).
Therefore

fV,W(v, w) = fX,Y(h1(v, w), h2(v, w)) / |J(x, y)|,

where J(x, y) = det [[∂v/∂x, ∂v/∂y], [∂w/∂x, ∂w/∂y]] is the Jacobian of the transformation.

It can be shown that this equals

fV,W(v, w) = fX,Y(h1(v, w), h2(v, w)) |J(v, w)|,

where J(v, w) = det [[∂x/∂v, ∂x/∂w], [∂y/∂v, ∂y/∂w]] = 1 / J(x, y) is the Jacobian of the inverse transformation.
Example 4.37. X, Y zero-mean, unit-variance, independent Gaussian r.v.'s. Let

V = (X² + Y²)^{1/2}  (radius)
W = ∠(X, Y)  (angle, in (0, 2π)),

so X = V cos W, Y = V sin W. The Jacobian of the inverse transformation is

det [[∂x/∂v, ∂x/∂w], [∂y/∂v, ∂y/∂w]] = det [[cos w, -v sin w], [sin w, v cos w]] = v.
fV,W(v, w) = (1/2π) e^{-(x² + y²)/2} · v = (v/2π) e^{-(v² cos² w + v² sin² w)/2}
           = (v e^{-v²/2}) · (1/2π),  v ≥ 0, 0 ≤ w < 2π.

V and W are independent: fV(v) = v e^{-v²/2} is Rayleigh, and fW(w) = 1/2π is uniform.

The linear-transformation method can be used even if we are interested in only one function of the random variables, by defining an "auxiliary" r.v.
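A simulation sketch of the Rayleigh marginal: for V = sqrt(X² + Y²), the Rayleigh cdf gives P[V ≤ 1] = 1 - e^{-1/2} (the sample size is an arbitrary choice):

```python
import math
import random

random.seed(7)

n = 200_000
hits = sum(math.hypot(random.gauss(0, 1), random.gauss(0, 1)) <= 1.0
           for _ in range(n))

print(abs(hits / n - (1 - math.exp(-0.5))) < 0.01)  # True
```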
Ex. 4.38. X: zero-mean, unit-variance Gaussian; Y: chi-square r.v. with n degrees of freedom; X and Y independent. Find the pdf of

V = X / sqrt(Y/n).

Let W = Y. Then X = V sqrt(W/n), Y = W, and the Jacobian of the inverse transformation is

det [[∂x/∂v, ∂x/∂w], [∂y/∂v, ∂y/∂w]] = det [[sqrt(w/n), v/(2 sqrt(wn))], [0, 1]] = sqrt(w/n).
fX,Y(x, y) = (e^{-x²/2} / sqrt(2π)) · (y^{(n-2)/2} e^{-y/2} / (2^{n/2} Γ(n/2))),

so

fV,W(v, w) = fX,Y(v sqrt(w/n), w) sqrt(w/n)
           = (e^{-v²w/(2n)} / sqrt(2π)) (w^{(n-2)/2} e^{-w/2} / (2^{n/2} Γ(n/2))) sqrt(w/n)
           = w^{(n-1)/2} e^{-(w/2)(1 + v²/n)} / (sqrt(2πn) 2^{n/2} Γ(n/2)).
The marginal pdf of V is

fV(v) = ∫_0^∞ fV,W(v, w) dw
      = (1 / (sqrt(2πn) 2^{n/2} Γ(n/2))) ∫_0^∞ w^{(n-1)/2} e^{-(w/2)(1 + v²/n)} dw.

With the change of variable w' = (w/2)(1 + v²/n), the integral becomes a gamma function, giving

fV(v) = (Γ((n+1)/2) / (sqrt(πn) Γ(n/2))) (1 + v²/n)^{-(n+1)/2}:

Student's t-distribution.
4.7 Expected Value of Functions of Random Variables

Z = g(X, Y):
E[Z] = ∫∫ g(x, y) fX,Y(x, y) dx dy  (X, Y jointly continuous)
E[Z] = Σ_i Σ_n g(xi, yn) pX,Y(xi, yn)  (X, Y discrete)

Ex. 4.39. Z = X + Y:
E[Z] = E[X + Y] = ∫∫ (x' + y') fX,Y(x', y') dx' dy'
     = ∫ x' (∫ fX,Y(x', y') dy') dx' + ∫ y' (∫ fX,Y(x', y') dx') dy'
     = ∫ x' fX(x') dx' + ∫ y' fY(y') dy'
     = E[X] + E[Y].

Note that X and Y need not be independent. In general,

E[X1 + X2 + ... + Xn] = E[X1] + ... + E[Xn].
Ex. 4.40. X, Y independent r.v.'s, and let g(X, Y) = g1(X) g2(Y):

E[g1(X) g2(Y)] = ∫∫ g1(x') g2(y') fX(x') fY(y') dx' dy'
               = (∫ g1(x') fX(x') dx') (∫ g2(y') fY(y') dy')
               = E[g1(X)] E[g2(Y)].

The jk-th joint moment of X and Y is

E[X^j Y^k] = ∫∫ x^j y^k fX,Y(x, y) dx dy  (jointly continuous)
E[X^j Y^k] = Σ_i Σ_n xi^j yn^k pX,Y(xi, yn)  (discrete).

When j = 1 and k = 1, E[XY] is called the correlation of X and Y. If E[XY] = 0, then X and Y are orthogonal.
The jk-th central moment of X and Y is E[(X - E[X])^j (Y - E[Y])^k].

When j = 1 and k = 1, it is called the covariance of X and Y:

COV(X, Y) = E[(X - E[X])(Y - E[Y])]
          = E[XY - X E[Y] - Y E[X] + E[X]E[Y]]
          = E[XY] - 2 E[X]E[Y] + E[X]E[Y]
          = E[XY] - E[X]E[Y].

Ex. 4.41. If X, Y are independent,
COV(X, Y) = E[(X - E[X])(Y - E[Y])] = E[X - E[X]] E[Y - E[Y]] = 0.
![Page 68: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/68.jpg)
The correlation coefficient of X and Y is

\rho_{X,Y} = \frac{COV(X, Y)}{\sigma_X \sigma_Y} = \frac{E[XY] - E[X]E[Y]}{\sigma_X \sigma_Y}

where \sigma_X = \sqrt{VAR(X)}, \sigma_Y = \sqrt{VAR(Y)} are the standard deviations of X and Y.

-1 \le \rho_{X,Y} \le 1.
pf: 0 \le E\left[\left(\frac{X - E[X]}{\sigma_X} \pm \frac{Y - E[Y]}{\sigma_Y}\right)^2\right] = 1 \pm 2\rho_{X,Y} + 1 = 2(1 \pm \rho_{X,Y})

X, Y are uncorrelated if \rho_{X,Y} = 0.
If X, Y are independent, then COV(X, Y) = 0, so \rho_{X,Y} = 0 and X, Y are uncorrelated.
X, Y uncorrelated does not necessarily imply that X, Y are independent.
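These definitions translate directly into a small computation. A sketch over a hypothetical joint pmf, checking the bound -1 ≤ ρ ≤ 1:

```python
from math import sqrt

# Hypothetical joint pmf on pairs (x, y); probabilities sum to 1.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 1): 0.5}

def E(g):
    return sum(p * g(x, y) for (x, y), p in pmf.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - mx * my          # COV(X,Y) = E[XY] - E[X]E[Y]
sx = sqrt(E(lambda x, y: x * x) - mx ** 2)     # standard deviation of X
sy = sqrt(E(lambda x, y: y * y) - my ** 2)     # standard deviation of Y
rho = cov / (sx * sy)

assert -1 <= rho <= 1
```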
![Page 69: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/69.jpg)
X, Y uncorrelated does not necessarily imply X, Y are independent.

Ex. 4.42: \Theta uniformly distributed in (0, 2\pi). Let X = \cos\Theta and Y = \sin\Theta. Then E[X] = E[Y] = 0 and E[XY] = E[\cos\Theta \sin\Theta] = \frac{1}{2}E[\sin 2\Theta] = 0, so X and Y are uncorrelated; yet they are dependent, since X^2 + Y^2 = 1.
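The example can be checked numerically by discretizing Θ into N equally likely angles:

```python
from math import cos, sin, pi, isclose

# Discretize Θ uniform on (0, 2π): N equally likely angles.
N = 1000
thetas = [2 * pi * k / N for k in range(N)]
xs = [cos(t) for t in thetas]
ys = [sin(t) for t in thetas]

E_X = sum(xs) / N
E_Y = sum(ys) / N
E_XY = sum(x * y for x, y in zip(xs, ys)) / N

# COV(X, Y) = E[XY] - E[X]E[Y] = 0: X and Y are uncorrelated...
assert isclose(E_XY - E_X * E_Y, 0, abs_tol=1e-12)
# ...yet functionally dependent: X² + Y² = 1 always.
assert all(isclose(x * x + y * y, 1) for x, y in zip(xs, ys))
```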
![Page 70: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/70.jpg)
Joint Characteristic Function

\Phi_{X_1,X_2,\ldots,X_n}(\omega_1, \omega_2, \ldots, \omega_n) = E\left[e^{j(\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_n X_n)}\right]

Consider the case n = 2:

\Phi_{X,Y}(\omega_1, \omega_2) = E\left[e^{j(\omega_1 X + \omega_2 Y)}\right] = \iint f_{X,Y}(x, y)\, e^{j(\omega_1 x + \omega_2 y)}\,dx\,dy

If X and Y are independent r.v.s:

\Phi_{X,Y}(\omega_1, \omega_2) = E\left[e^{j\omega_1 X}\, e^{j\omega_2 Y}\right] = E\left[e^{j\omega_1 X}\right] E\left[e^{j\omega_2 Y}\right] = \Phi_X(\omega_1)\,\Phi_Y(\omega_2)
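The factorization for independent X and Y can be verified numerically for discrete r.v.s (the pmfs and frequencies below are hypothetical):

```python
import cmath
from math import isclose

# Independent discrete X and Y (hypothetical pmfs for illustration).
px = {0: 0.3, 1: 0.7}
py = {-1: 0.5, 2: 0.5}
w1, w2 = 0.8, -1.3   # arbitrary frequencies

def phi(pmf, w):
    # one-dimensional characteristic function E[e^{jwX}]
    return sum(p * cmath.exp(1j * w * x) for x, p in pmf.items())

# Joint characteristic function from the product pmf p(x)p(y).
phi_joint = sum(px[x] * py[y] * cmath.exp(1j * (w1 * x + w2 * y))
                for x in px for y in py)

# For independent X, Y the joint characteristic function factors.
assert isclose(abs(phi_joint - phi(px, w1) * phi(py, w2)), 0, abs_tol=1e-12)
```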
![Page 71: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/71.jpg)
If Z = aX + bY:

\Phi_Z(\omega) = E\left[e^{j\omega(aX + bY)}\right] = E\left[e^{j(a\omega X + b\omega Y)}\right] = \Phi_{X,Y}(a\omega, b\omega)

If X, Y are independent, \Phi_Z(\omega) = \Phi_X(a\omega)\,\Phi_Y(b\omega).

The ik-th joint moment of X, Y is

E[X^i Y^k] = \frac{1}{j^{i+k}} \left.\frac{\partial^{i+k}}{\partial \omega_1^i\, \partial \omega_2^k} \Phi_{X,Y}(\omega_1, \omega_2)\right|_{\omega_1 = 0,\, \omega_2 = 0}
![Page 72: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/72.jpg)
4.8 Jointly Gaussian Random Variables
X, Y are said to be jointly Gaussian if

f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - \rho_{X,Y}^2}} \exp\left\{ \frac{-1}{2(1 - \rho_{X,Y}^2)} \left[ \left(\frac{x - m_1}{\sigma_1}\right)^2 - 2\rho_{X,Y}\left(\frac{x - m_1}{\sigma_1}\right)\left(\frac{y - m_2}{\sigma_2}\right) + \left(\frac{y - m_2}{\sigma_2}\right)^2 \right] \right\}

Contours of constant pdf are the ellipses

\left(\frac{x - m_1}{\sigma_1}\right)^2 - 2\rho_{X,Y}\left(\frac{x - m_1}{\sigma_1}\right)\left(\frac{y - m_2}{\sigma_2}\right) + \left(\frac{y - m_2}{\sigma_2}\right)^2 = \text{constant}

oriented at the angle

\theta = \frac{1}{2}\arctan\left(\frac{2\rho_{X,Y}\,\sigma_1\sigma_2}{\sigma_1^2 - \sigma_2^2}\right)
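The formula can be exercised directly; in particular, for ρ_{X,Y} = 0 the joint pdf factors into the product of the two Gaussian marginals (the evaluation point and parameters below are arbitrary):

```python
from math import exp, pi, sqrt, isclose

def bivariate_gaussian(x, y, m1, m2, s1, s2, rho):
    # Jointly Gaussian pdf as on the slide.
    q = (((x - m1) / s1) ** 2
         - 2 * rho * ((x - m1) / s1) * ((y - m2) / s2)
         + ((y - m2) / s2) ** 2)
    return exp(-q / (2 * (1 - rho ** 2))) / (2 * pi * s1 * s2 * sqrt(1 - rho ** 2))

def gaussian(x, m, s):
    # One-dimensional Gaussian pdf.
    return exp(-((x - m) ** 2) / (2 * s ** 2)) / (sqrt(2 * pi) * s)

# With rho = 0 the joint pdf factors into the product of the marginals.
x, y = 0.7, -1.2
assert isclose(bivariate_gaussian(x, y, 1, 2, 0.5, 3, 0.0),
               gaussian(x, 1, 0.5) * gaussian(y, 2, 3))
```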
![Page 73: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/73.jpg)
Marginal p.d.f.'s:

f_X(x) = \frac{e^{-(x - m_1)^2 / 2\sigma_1^2}}{\sqrt{2\pi}\,\sigma_1}, \qquad f_Y(y) = \frac{e^{-(y - m_2)^2 / 2\sigma_2^2}}{\sqrt{2\pi}\,\sigma_2}

Conditional pdf:

f_X(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{1}{\sqrt{2\pi(1 - \rho_{X,Y}^2)}\,\sigma_1} \exp\left\{ -\frac{\left[ x - \rho_{X,Y}\frac{\sigma_1}{\sigma_2}(y - m_2) - m_1 \right]^2}{2(1 - \rho_{X,Y}^2)\,\sigma_1^2} \right\}

f_X(x \mid y) is Gaussian with conditional mean m_1 + \rho_{X,Y}\frac{\sigma_1}{\sigma_2}(y - m_2) and conditional variance (1 - \rho_{X,Y}^2)\,\sigma_1^2.

If \rho_{X,Y} = 0, f_X(x \mid y) = f_X(x), and X, Y are independent. For X, Y jointly Gaussian: X, Y uncorrelated \Rightarrow X, Y independent. (*)
![Page 74: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/74.jpg)
We now show that \rho_{X,Y} is indeed the correlation coefficient.

COV(X, Y) = E[(X - m_1)(Y - m_2)]
= E\big[E[(X - m_1)(Y - m_2) \mid Y]\big]
= E\big[(Y - m_2)\, E[(X - m_1) \mid Y]\big]

Since E[X \mid Y = y] = m_1 + \rho_{X,Y}\frac{\sigma_1}{\sigma_2}(y - m_2), we have E[(X - m_1) \mid Y] = \rho_{X,Y}\frac{\sigma_1}{\sigma_2}(Y - m_2), so

COV(X, Y) = E\left[\rho_{X,Y}\frac{\sigma_1}{\sigma_2}(Y - m_2)^2\right] = \rho_{X,Y}\frac{\sigma_1}{\sigma_2}\,\sigma_2^2 = \rho_{X,Y}\,\sigma_1\sigma_2

Therefore \frac{COV(X, Y)}{\sigma_1\sigma_2} = \rho_{X,Y}: the correlation coefficient.
![Page 75: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/75.jpg)
n Jointly Gaussian Random Variables

X_1, X_2, \ldots, X_n are jointly Gaussian if

f_{X_1,X_2,\ldots,X_n}(x_1, x_2, \ldots, x_n) = \frac{\exp\left\{-\frac{1}{2}(\mathbf{x} - \mathbf{m})^T K^{-1} (\mathbf{x} - \mathbf{m})\right\}}{(2\pi)^{n/2}\,|K|^{1/2}}

where \mathbf{x} = (x_1, x_2, \ldots, x_n)^T, \mathbf{m} = (E[X_1], E[X_2], \ldots, E[X_n])^T, and K is the covariance matrix

K = \begin{bmatrix} VAR(X_1) & COV(X_1, X_2) & \cdots & COV(X_1, X_n) \\ COV(X_2, X_1) & VAR(X_2) & \cdots & COV(X_2, X_n) \\ \vdots & & & \vdots \\ COV(X_n, X_1) & COV(X_n, X_2) & \cdots & VAR(X_n) \end{bmatrix}
![Page 76: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/76.jpg)
The pdf of the jointly Gaussian random variables is completely specified by the individual means and variances and the pairwise covariances.
Ex. 4.46 Verify that (4.83) becomes (4.79) when n=2.
Ex. 4.48
X_1, X_2, \ldots, X_n are jointly Gaussian. If COV(X_i, X_j) = 0 for all i \ne j, then X_1, X_2, \ldots, X_n are independent.
![Page 77: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/77.jpg)
Linear Transformation of Gaussian Random Variables
Let \mathbf{X} be jointly Gaussian with mean \mathbf{m} and covariance matrix K, and define \mathbf{Y} by \mathbf{Y} = A\mathbf{X}, where A is an invertible n \times n matrix. Then

f_{\mathbf{Y}}(\mathbf{y}) = \frac{f_{\mathbf{X}}(A^{-1}\mathbf{y})}{|A|} = \frac{\exp\left\{-\frac{1}{2}(A^{-1}\mathbf{y} - \mathbf{m})^T K^{-1} (A^{-1}\mathbf{y} - \mathbf{m})\right\}}{(2\pi)^{n/2}\,|A|\,|K|^{1/2}}

From elementary properties of matrices, with \mathbf{n} = A\mathbf{m}:

A^{-1}\mathbf{y} - \mathbf{m} = A^{-1}(\mathbf{y} - A\mathbf{m}) = A^{-1}(\mathbf{y} - \mathbf{n})

\left(A^{-1}(\mathbf{y} - \mathbf{n})\right)^T K^{-1} A^{-1}(\mathbf{y} - \mathbf{n}) = (\mathbf{y} - \mathbf{n})^T (A^{-1})^T K^{-1} A^{-1} (\mathbf{y} - \mathbf{n})
![Page 78: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/78.jpg)
f_{\mathbf{Y}}(\mathbf{y}) = \frac{\exp\left\{-\frac{1}{2}(\mathbf{y} - \mathbf{n})^T (A^{-1})^T K^{-1} A^{-1} (\mathbf{y} - \mathbf{n})\right\}}{(2\pi)^{n/2}\,|A|\,|K|^{1/2}}

Since (A^{-1})^T K^{-1} A^{-1} = (A K A^T)^{-1}, let C = A K A^T and \mathbf{n} = A\mathbf{m}. Then

\det C = \det A \,\det K \,\det A^T = (\det A)^2 \det K, \quad \text{so} \quad |C|^{1/2} = |A|\,|K|^{1/2}

f_{\mathbf{Y}}(\mathbf{y}) = \frac{\exp\left\{-\frac{1}{2}(\mathbf{y} - \mathbf{n})^T C^{-1} (\mathbf{y} - \mathbf{n})\right\}}{(2\pi)^{n/2}\,|C|^{1/2}}

Thus \mathbf{Y} is jointly Gaussian with mean \mathbf{n} = A\mathbf{m} and covariance matrix C = A K A^T.
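The covariance identity C = AKA^T holds for any random vector with finite second moments, not only Gaussian ones, so it can be checked with a small discrete example (the matrix A and the joint pmf are hypothetical):

```python
# Verify the covariance of Y = AX numerically for a 2x2 case,
# using a discrete X so all moments are exact sums.
A = [[1.0, 2.0],
     [0.0, 1.0]]

pmf = {(0.0, 0.0): 0.25, (1.0, 0.0): 0.25, (0.0, 1.0): 0.25, (1.0, 2.0): 0.25}

def cov_matrix(dist):
    # dist: dict mapping (v1, v2) -> probability
    m = [sum(p * v[i] for v, p in dist.items()) for i in range(2)]
    return [[sum(p * (v[i] - m[i]) * (v[j] - m[j]) for v, p in dist.items())
             for j in range(2)] for i in range(2)]

K = cov_matrix(pmf)

# Push the pmf through Y = AX and recompute the covariance.
pmf_Y = {}
for (x1, x2), p in pmf.items():
    y = (A[0][0] * x1 + A[0][1] * x2, A[1][0] * x1 + A[1][1] * x2)
    pmf_Y[y] = pmf_Y.get(y, 0.0) + p
C = cov_matrix(pmf_Y)

# C should equal A K A^T.
AKAt = [[sum(A[i][k] * K[k][l] * A[j][l] for k in range(2) for l in range(2))
         for j in range(2)] for i in range(2)]
assert all(abs(C[i][j] - AKAt[i][j]) < 1e-12 for i in range(2) for j in range(2))
```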
![Page 79: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/79.jpg)
If we can find a matrix A such that A K A^T = \Lambda, a diagonal matrix, then

f_{\mathbf{Y}}(\mathbf{y}) = \frac{\exp\left\{-\frac{1}{2}(\mathbf{y} - \mathbf{n})^T \Lambda^{-1} (\mathbf{y} - \mathbf{n})\right\}}{(2\pi)^{n/2}\,|\Lambda|^{1/2}} = \prod_{i=1}^{n} \frac{\exp\left\{-(y_i - n_i)^2 / 2\lambda_i\right\}}{\sqrt{2\pi\lambda_i}}

so Y_1, Y_2, \ldots, Y_n are independent.

If we can select a matrix A that diagonalizes K with |A| = 1, then the linear transformation corresponds to a rotation.
![Page 80: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/80.jpg)
Ex. 4.49.

\begin{bmatrix} V \\ W \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} X \\ Y \end{bmatrix}

E[V] = m_1 \cos\theta + m_2 \sin\theta, \qquad E[W] = -m_1 \sin\theta + m_2 \cos\theta

COV(V, W) = E\big[(V - E[V])(W - E[W])\big]
= E\big[\big((X - m_1)\cos\theta + (Y - m_2)\sin\theta\big)\big(-(X - m_1)\sin\theta + (Y - m_2)\cos\theta\big)\big]

which vanishes for

\theta = \frac{1}{2}\arctan\left(\frac{2\rho_{X,Y}\,\sigma_1\sigma_2}{\sigma_1^2 - \sigma_2^2}\right)
![Page 81: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/81.jpg)
Ex. 4.50. X_1, X_2, \ldots, X_n are jointly Gaussian. Let

Z = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n

Let Z_2 = X_2, Z_3 = X_3, \ldots, Z_n = X_n, and define \mathbf{Z} = (Z, Z_2, \ldots, Z_n)^T. Then \mathbf{Z} = A\mathbf{X}, where

A = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \\ 0 & 1 & \cdots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}

\mathbf{Z} is jointly Gaussian with mean A\mathbf{m} and covariance matrix C = A K A^T. In particular, Z is Gaussian with

E[Z] = \sum_{i=1}^{n} a_i\, E[X_i], \qquad VAR(Z) = C_{11} = \sum_{i=1}^{n}\sum_{j=1}^{n} a_i a_j\, COV(X_i, X_j)
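The variance formula VAR(Z) = ΣΣ a_i a_j COV(X_i, X_j) likewise holds for any X with finite second moments; a sketch on a hypothetical three-dimensional pmf:

```python
# Check VAR(Z) = sum_i sum_j a_i a_j COV(X_i, X_j) for Z = a·X.
a = [2.0, -1.0, 0.5]
pmf = {(0, 0, 0): 0.2, (1, 0, 1): 0.3, (0, 1, 1): 0.3, (1, 1, 0): 0.2}

def E(g):
    return sum(p * g(v) for v, p in pmf.items())

m = [E(lambda v, i=i: v[i]) for i in range(3)]
cov = [[E(lambda v, i=i, j=j: (v[i] - m[i]) * (v[j] - m[j]))
        for j in range(3)] for i in range(3)]

# VAR(Z) computed directly from the pmf...
z_mean = E(lambda v: sum(ai * vi for ai, vi in zip(a, v)))
var_z = E(lambda v: (sum(ai * vi for ai, vi in zip(a, v)) - z_mean) ** 2)

# ...matches the quadratic form in the covariances.
quad = sum(a[i] * a[j] * cov[i][j] for i in range(3) for j in range(3))
assert abs(var_z - quad) < 1e-12
```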
![Page 82: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/82.jpg)
Joint Characteristic Function of n jointly Gaussian random variables

The joint characteristic function of X_1, X_2, \ldots, X_n is

\Phi_{X_1,X_2,\ldots,X_n}(\omega_1, \omega_2, \ldots, \omega_n) = \exp\left\{ j\sum_{i=1}^{n} m_i\,\omega_i \;-\; \frac{1}{2}\sum_{i=1}^{n}\sum_{k=1}^{n} \omega_i\,\omega_k\, COV(X_i, X_k) \right\} = e^{\,j\,\boldsymbol{\omega}^T \mathbf{m} \;-\; \frac{1}{2}\,\boldsymbol{\omega}^T K\, \boldsymbol{\omega}}
![Page 83: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/83.jpg)
4.9 Mean Square Estimation
We are interested in estimating the value of an inaccessible random variable Y in terms of the observation of an accessible random variable X. The estimate for Y is given by a function of X, g(X).

1. Estimating a r.v. Y by a constant a so that the mean square error (m.s.e.) is minimized:

\min_a E[(Y - a)^2] = \min_a \left( E[Y^2] - 2aE[Y] + a^2 \right)

\frac{d}{da} E[(Y - a)^2] = -2E[Y] + 2a = 0 \;\Rightarrow\; a^* = E[Y]

m.s.e. = E[(Y - E[Y])^2] = VAR(Y)
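Since E[(Y - a)^2] = VAR(Y) + (a - E[Y])^2, the constant a* = E[Y] is optimal; a quick check on a hypothetical pmf:

```python
# The mean minimizes the mean square error E[(Y - a)^2].
pmf = {0: 0.2, 1: 0.5, 4: 0.3}   # hypothetical pmf for Y

E_Y = sum(p * y for y, p in pmf.items())

def mse(a):
    return sum(p * (y - a) ** 2 for y, p in pmf.items())

var_Y = mse(E_Y)                 # minimum m.s.e. equals VAR(Y)
for a in [-1.0, 0.0, 1.0, 2.0, 3.0]:
    assert mse(a) >= var_Y
# E[(Y - a)^2] = VAR(Y) + (a - E[Y])^2, so the gap is exactly (a - E[Y])^2.
assert abs(mse(0.0) - (var_Y + E_Y ** 2)) < 1e-9
```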
![Page 84: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/84.jpg)
2. Estimating Y by g(X) = a X + b
\min_{a,b} E[(Y - aX - b)^2]

Best b: b^* = E[Y] - a\,E[X], so the problem becomes \min_a E\big[\big((Y - E[Y]) - a(X - E[X])\big)^2\big].

Differentiate w.r.t. a:

E\big[-2\big((Y - E[Y]) - a(X - E[X])\big)(X - E[X])\big] = -2\,COV(X, Y) + 2a\,VAR(X) = 0

a^* = \frac{COV(X, Y)}{VAR(X)} = \rho_{X,Y}\,\frac{\sigma_Y}{\sigma_X}
![Page 85: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/85.jpg)
Minimum mean square error (mmse) linear estimator for Y:

\hat{Y} = a^* X + b^* = \rho_{X,Y}\,\sigma_Y \left(\frac{X - E[X]}{\sigma_X}\right) + E[Y]

where (X - E[X])/\sigma_X is the zero-mean, unit-variance version of X.

In deriving a^*, we obtain the orthogonality condition:

E\big[\big((Y - E[Y]) - a^*(X - E[X])\big)(X - E[X])\big] = 0

i.e., the error of the best linear estimator is orthogonal to the observation X - E[X].
![Page 86: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/86.jpg)
Mean square error of the best linear estimator:

e^* = E\big[\big((Y - E[Y]) - a^*(X - E[X])\big)^2\big]
= E\big[\big((Y - E[Y]) - a^*(X - E[X])\big)(Y - E[Y])\big] - a^*\, E\big[\big((Y - E[Y]) - a^*(X - E[X])\big)(X - E[X])\big]

The second term is zero by the orthogonality condition, so

e^* = VAR(Y) - a^*\,COV(X, Y) = VAR(Y) - \rho_{X,Y}\frac{\sigma_Y}{\sigma_X}\,\rho_{X,Y}\,\sigma_X\sigma_Y = VAR(Y)\big(1 - \rho_{X,Y}^2\big)
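The estimator coefficients and the error formula VAR(Y)(1 - ρ²) can be verified on a small joint pmf (values hypothetical):

```python
from math import sqrt

# Best linear estimator on a discrete joint pmf.
pmf = {(0, 1): 0.25, (1, 2): 0.25, (2, 2): 0.25, (3, 5): 0.25}

def E(g):
    return sum(p * g(x, y) for (x, y), p in pmf.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - mx) ** 2)
var_y = E(lambda x, y: (y - my) ** 2)
cov = E(lambda x, y: (x - mx) * (y - my))
rho = cov / sqrt(var_x * var_y)

a = cov / var_x               # a* = COV(X,Y)/VAR(X)
b = my - a * mx               # b* = E[Y] - a*E[X]
mse = E(lambda x, y: (y - (a * x + b)) ** 2)

# m.s.e. of the best linear estimator is VAR(Y)(1 - rho^2).
assert abs(mse - var_y * (1 - rho ** 2)) < 1e-12
# Orthogonality: the zero-mean error is orthogonal to the observation.
assert abs(E(lambda x, y: (y - (a * x + b)) * x)) < 1e-12
```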
![Page 87: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/87.jpg)
3. The best mmse estimator of Y is in general a non-linear function of X, g(X):

\min_g E\big[(Y - g(X))^2\big]

E\big[(Y - g(X))^2\big] = E\big[E[(Y - g(X))^2 \mid X]\big] = \int E\big[(Y - g(X))^2 \mid X = x\big]\, f_X(x)\,dx

g(X) is a constant when X = x, and the constant that minimizes E[(Y - g(x))^2 \mid X = x] is

g^*(x) = E[Y \mid X = x] \qquad (regression curve)

E[Y \mid X] is the estimator for Y in terms of X that yields the smallest m.s.e.
![Page 88: Chapter 4. Multiple Random Variables](https://reader033.fdocuments.in/reader033/viewer/2022061519/5681593e550346895dc67de8/html5/thumbnails/88.jpg)
Ex. 4.51. Let X be uniformly distributed in (-1, 1) and let Y = X^2. Find the best linear estimator and the best estimator of Y in terms of X.

Ex. 4.52. Find the mmse estimator of Y in terms of X when X and Y are jointly Gaussian random variables.
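For Ex. 4.51, a discretized sketch: since E[X^3] = 0 by symmetry, COV(X, Y) = 0 and the best linear estimator degenerates to the constant E[Y] = 1/3, while the best estimator is E[Y|X] = X^2 itself:

```python
# Ex. 4.51 sketch: X uniform on (-1, 1), Y = X^2, via a symmetric grid.
N = 2000
xs = [-1 + (2 * k + 1) / N for k in range(N)]   # midpoints, equally likely
ys = [x * x for x in xs]

E_X = sum(xs) / N                    # = 0 by symmetry
E_Y = sum(ys) / N                    # ≈ E[X^2] = 1/3
cov = sum(x * y for x, y in zip(xs, ys)) / N - E_X * E_Y   # ≈ E[X^3] = 0

# COV(X, Y) ≈ 0, so the best *linear* estimator of Y is the constant
# E[Y] ≈ 1/3, while the best estimator is E[Y|X] = X^2 itself.
assert abs(cov) < 1e-9
assert abs(E_Y - 1 / 3) < 1e-3
```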