Probability Theory and Mathematical Statistics
Lecture 07: Moment-Generating Functions
Chih-Yuan Hung
School of Economics and Management, Dongguan University of Technology
April 10, 2019
5. Moment-Generating Function
The moments of most distributions can be determined directly by evaluating the necessary integrals or sums.
But some of these integrals and sums are hard to evaluate.
An alternative procedure sometimes provides considerable simplification. This technique utilizes moment-generating functions.
Definition (6, Moment Generating Function)
The moment-generating function of a random variable X, where it exists, is given by

$$M_X(t) = E(e^{tX}) = \sum_x e^{tx} \cdot f(x)$$

when X is discrete, and

$$M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx} \cdot f(x)\,dx$$

when X is continuous.
We evaluate this kind of expectation at t near 0. The Maclaurin's series expansion of $e^{tx}$ is

$$e^{tx} = 1 + tx + \frac{t^2 x^2}{2!} + \frac{t^3 x^3}{3!} + \cdots + \frac{t^r x^r}{r!} + \cdots$$
The MGF in the form of a Maclaurin's series expansion:

$$M_X(t) = E(e^{tX}) = \sum_x e^{tx} \cdot f(x) = \sum_x \left[1 + tx + \frac{t^2 x^2}{2!} + \frac{t^3 x^3}{3!} + \cdots + \frac{t^r x^r}{r!} + \cdots\right] f(x)$$

$$= \sum_x f(x) + t \sum_x x f(x) + \frac{t^2}{2!} \sum_x x^2 f(x) + \cdots + \frac{t^r}{r!} \sum_x x^r f(x) + \cdots$$

$$= 1 + \mu \cdot t + \mu'_2 \cdot \frac{t^2}{2!} + \cdots + \mu'_r \cdot \frac{t^r}{r!} + \cdots$$

Thus the coefficient of $t^r/r!$ in the Maclaurin's series of $M_X(t)$ is the rth moment about the origin, $\mu'_r$.
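This identification of series coefficients with moments can be checked numerically. Below is a minimal sketch (not part of the original slides) using sympy and a fair six-sided die as an illustrative distribution: the coefficient of $t^r/r!$ in the expansion of $M_X(t)$ matches the directly computed moment $E(X^r)$.

```python
# Minimal sketch (illustrative, not from the lecture): read moments off the
# Maclaurin series of an MGF, using a fair six-sided die as the distribution.
import sympy as sp

t = sp.symbols('t')
faces = range(1, 7)
M = sum(sp.Rational(1, 6) * sp.exp(t * x) for x in faces)   # M_X(t) = sum_x e^{tx} f(x)

poly = sp.series(M, t, 0, 4).removeO()                      # expansion up to t^3
for r in range(1, 4):
    from_series = poly.coeff(t, r) * sp.factorial(r)        # coefficient of t^r/r!
    direct = sum(sp.Rational(1, 6) * x**r for x in faces)   # E(X^r) computed directly
    print(r, from_series, direct)          # 7/2, 91/6, 147/2 in both columns
```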
Example (13)
Find the moment-generating function of the random variable whose probability density is given by

$$f(x) = \begin{cases} e^{-x} & \text{for } x > 0 \\ 0 & \text{elsewhere} \end{cases}$$

and use it to find an expression for $\mu'_r$.
Solution
By definition,

$$M_X(t) = E(e^{tX}) = \int_0^{\infty} e^{tx} e^{-x}\,dx = \int_0^{\infty} e^{-x(1-t)}\,dx = \frac{1}{1-t} \quad \text{for } t < 1$$

When $|t| < 1$,

$$M_X(t) = 1 + t + t^2 + \cdots + t^r + \cdots = 1 + 1!\,\frac{t}{1!} + 2!\,\frac{t^2}{2!} + \cdots + r!\,\frac{t^r}{r!} + \cdots$$

and hence $\mu'_r = r!$ for $r = 0, 1, 2, \ldots$
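As a check on Example 13, the integral and the series expansion can be reproduced symbolically; this is my own minimal sketch with sympy, not part of the slides. The substitution $s = 1 - t$ keeps the convergence assumption $t < 1$ explicit.

```python
# Minimal sketch verifying Example 13 with sympy.
import sympy as sp

x, s = sp.symbols('x s', positive=True)   # s stands for 1 - t, assumed positive (t < 1)
t = sp.symbols('t')

M = sp.integrate(sp.exp(-s * x), (x, 0, sp.oo)).subs(s, 1 - t)
print(M)                                  # 1/(1 - t)

print(sp.series(M, t, 0, 5))              # 1 + t + t**2 + t**3 + t**4 + O(t**5)
print([sp.factorial(r) * sp.series(M, t, 0, 5).removeO().coeff(t, r)
       for r in range(5)])                # mu'_r = r!: [1, 1, 2, 6, 24]
```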
The Use of MGF
The main difficulty in using the Maclaurin's series of an MGF is usually not that of finding the MGF, but that of expanding it into a Maclaurin's series. The following theorem helps us find the first few moments easily.
Theorem (9)
$$\left.\frac{d^r M_X(t)}{dt^r}\right|_{t=0} = \mu'_r$$
Example (14)
Given that X has the probability distribution $f(x) = \frac{1}{8}\binom{3}{x}$ for $x = 0, 1, 2,$ and $3$, find the moment-generating function of this random variable and use it to determine $\mu'_1$ and $\mu'_2$.
Solution
In accordance with Definition 6,

$$M_X(t) = E(e^{tX}) = \frac{1}{8}\sum_{x=0}^{3} e^{tx}\binom{3}{x} = \frac{1}{8}\left(1 + 3e^t + 3e^{2t} + e^{3t}\right)$$

$$= \frac{1}{8}\sum_{x=0}^{3}\binom{3}{x}(e^t)^x\, 1^{3-x} = \frac{1}{8}(1 + e^t)^3$$
![Page 9: Probability Theory and Mathematical Statistics Lecture 07 ...The main di culty in using the Maclaurin’s series of a MGF is usually not that of nding MGF, but that of expanding it](https://reader036.fdocuments.in/reader036/viewer/2022063011/5fc45baee260b26dc11c6cea/html5/thumbnails/9.jpg)
Moment-Generating Function Product Moment Moments of Linear Combinations of Random Variables Conditional Expectations ending
Then, by Theorem 9,

$$\mu'_1 = M'_X(0) = \left.\frac{3}{8}e^t(1 + e^t)^2\right|_{t=0} = \frac{3}{2}$$

and

$$\mu'_2 = M''_X(0) = \left.\frac{3}{8}e^t(1 + e^t)(1 + 3e^t)\right|_{t=0} = 3$$
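A quick check of Example 14 with sympy (my own sketch, not part of the slides), applying Theorem 9 directly:

```python
# Minimal sketch: differentiate M_X(t) = (1/8)(1 + e^t)^3 at t = 0 (Theorem 9).
import sympy as sp

t = sp.symbols('t')
M = sp.Rational(1, 8) * (1 + sp.exp(t))**3

print(sp.diff(M, t, 1).subs(t, 0))   # mu'_1 = 3/2
print(sp.diff(M, t, 2).subs(t, 0))   # mu'_2 = 3
```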
Theorem (10)
If a and b are constants, then

1. $M_{X+a}(t) = E\left[e^{(X+a)t}\right] = e^{at} \cdot M_X(t)$;
2. $M_{bX}(t) = E\left[e^{(bX)t}\right] = M_X(bt)$;
3. $M_{\frac{X+a}{b}}(t) = E\left[e^{\left(\frac{X+a}{b}\right)t}\right] = e^{\frac{a}{b}t} \cdot M_X\!\left(\frac{t}{b}\right)$.

An application you may have seen already: letting $a = -\mu$ and $b = \sigma$, we standardize a normal random variable as

$$\frac{X - \mu}{\sigma}$$

The MGF of this standardized r.v. is

$$M_{\frac{X-\mu}{\sigma}}(t) = e^{-\frac{\mu t}{\sigma}} \cdot M_X\!\left(\frac{t}{\sigma}\right)$$
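The three identities of Theorem 10 can also be checked symbolically. Below is a minimal sketch (my own, not from the slides) that verifies them with sympy for the discrete distribution of Example 14, $f(x) = \frac{1}{8}\binom{3}{x}$; the function name `mgf` is just a local helper.

```python
# Minimal sketch: check Theorem 10 for f(x) = (1/8) C(3, x), x = 0, 1, 2, 3.
import sympy as sp

t, a, b = sp.symbols('t a b')
pmf = {x: sp.Rational(1, 8) * sp.binomial(3, x) for x in range(4)}

def mgf(g):
    """E[e^{t g(X)}] for the pmf above."""
    return sum(p * sp.exp(t * g(x)) for x, p in pmf.items())

M = mgf(lambda x: x)

print(sp.simplify(mgf(lambda x: x + a) - sp.exp(a * t) * M))                    # 0
print(sp.simplify(mgf(lambda x: b * x) - M.subs(t, b * t)))                     # 0
print(sp.simplify(mgf(lambda x: (x + a) / b)
                  - sp.exp(a * t / b) * M.subs(t, t / b)))                      # 0
```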
Product Moment
Considering two random variables, product moments are defined as follows:
Definition (7, Product Moments about the Origin)
The rth and sth product moment about the origin of the random variables X and Y, denoted by $\mu'_{r,s}$, is the expected value of $X^r Y^s$; symbolically,

$$\mu'_{r,s} = E(X^r Y^s) = \sum_x \sum_y x^r y^s \cdot f(x, y)$$

for $r = 0, 1, 2, \ldots$ and $s = 0, 1, 2, \ldots$ when X and Y are discrete, and

$$\mu'_{r,s} = E(X^r Y^s) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^r y^s \cdot f(x, y)\,dx\,dy$$

when X and Y are continuous.
Noting that $\mu'_{1,0} = E(X) = \mu_X$ and $\mu'_{0,1} = E(Y) = \mu_Y$, we have:
Definition (8, Product Moments about the Mean)
The rth and sth product moment about the means of the random variables X and Y, denoted by $\mu_{r,s}$, is the expected value of $(X - \mu_X)^r (Y - \mu_Y)^s$; symbolically,

$$\mu_{r,s} = E\left[(X - \mu_X)^r (Y - \mu_Y)^s\right] = \sum_x \sum_y (x - \mu_X)^r (y - \mu_Y)^s \cdot f(x, y)$$

for $r = 0, 1, 2, \ldots$ and $s = 0, 1, 2, \ldots$ when X and Y are discrete, and

$$\mu_{r,s} = E\left[(X - \mu_X)^r (Y - \mu_Y)^s\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \mu_X)^r (y - \mu_Y)^s \cdot f(x, y)\,dx\,dy$$

when X and Y are continuous.
You may know that $\mu_{1,1}$ is of special importance, and it is worth giving it a name.
Definition (9, Covariance)
$\mu_{1,1}$ is called the covariance of X and Y, and it is denoted by $\sigma_{XY}$, $\mathrm{cov}(X, Y)$, or $C(X, Y)$.
It is indicative of the relationship, if any, between the values of X and Y.
If the joint probability has a tendency for large values of X to go with large values of Y, the covariance will be positive;
if there is a high probability that large values of X will go with small values of Y, and vice versa, the covariance will be negative.
Theorem (11)
$$\sigma_{XY} = \mu'_{1,1} - \mu_X \mu_Y$$
Proof.
Using the various theorems about expected values, we can write
$$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = E(XY - X\mu_Y - Y\mu_X + \mu_X\mu_Y)$$
$$= E(XY) - \mu_Y E(X) - \mu_X E(Y) + \mu_X \mu_Y = E(XY) - \mu_Y\mu_X - \mu_X\mu_Y + \mu_X\mu_Y = \mu'_{1,1} - \mu_X\mu_Y$$
Example (15)
The joint and marginal probabilities of X and Y, the number of aspirin and sedative caplets among two caplets drawn at random from a bottle containing three aspirin, two sedative, and four laxative caplets, are recorded as follows:

f(x, y)    x = 0    x = 1    x = 2    h(y)
y = 0      1/6      1/3      1/12     7/12
y = 1      2/9      1/6      0        7/18
y = 2      1/36     0        0        1/36
g(x)       5/12     1/2      1/12

Find the covariance of X and Y.
Solution
$$\mu'_{1,1} = E(XY) = 0\cdot 0\cdot\frac{1}{6} + 0\cdot 1\cdot\frac{2}{9} + 0\cdot 2\cdot\frac{1}{36} + 1\cdot 0\cdot\frac{1}{3} + 1\cdot 1\cdot\frac{1}{6} + 2\cdot 0\cdot\frac{1}{12} = \frac{1}{6}$$

Also,

$$\mu_X = E(X) = 0\cdot\frac{5}{12} + 1\cdot\frac{1}{2} + 2\cdot\frac{1}{12} = \frac{2}{3}$$

and

$$\mu_Y = E(Y) = 0\cdot\frac{7}{12} + 1\cdot\frac{7}{18} + 2\cdot\frac{1}{36} = \frac{4}{9}$$

It follows that

$$\sigma_{XY} = \frac{1}{6} - \frac{2}{3}\cdot\frac{4}{9} = -\frac{7}{54}$$
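The computation in Example 15 is easy to check with exact fractions; a minimal sketch (not part of the slides):

```python
# Minimal sketch: covariance of the caplet example with exact fractions.
from fractions import Fraction as F

f = {(0, 0): F(1, 6), (1, 0): F(1, 3), (2, 0): F(1, 12),
     (0, 1): F(2, 9), (1, 1): F(1, 6),
     (0, 2): F(1, 36)}                       # cells not listed are zero

E_XY = sum(x * y * p for (x, y), p in f.items())
E_X = sum(x * p for (x, _), p in f.items())
E_Y = sum(y * p for (_, y), p in f.items())

print(E_XY, E_X, E_Y)        # 1/6, 2/3, 4/9
print(E_XY - E_X * E_Y)      # -7/54
```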
Example (16)
Find the covariance of the random variables whose joint probability density is given by

$$f(x, y) = \begin{cases} 2 & \text{for } x > 0,\ y > 0,\ x + y < 1 \\ 0 & \text{elsewhere} \end{cases}$$
Solution
Evaluating the necessary integrals, we get

$$\mu_X = \int_0^1\int_0^{1-x} 2x\,dy\,dx = \frac{1}{3}$$

$$\mu_Y = \int_0^1\int_0^{1-y} 2y\,dx\,dy = \frac{1}{3}$$

$$\mu'_{1,1} = \int_0^1\int_0^{1-y} 2xy\,dx\,dy = \frac{1}{12}$$

It follows that

$$\sigma_{XY} = \frac{1}{12} - \frac{1}{3}\cdot\frac{1}{3} = -\frac{1}{36}$$
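Again the integrals can be checked symbolically; a minimal sketch with sympy (not part of the slides):

```python
# Minimal sketch: Example 16 via sympy double integrals over the triangle.
import sympy as sp

x, y = sp.symbols('x y')
f = 2   # joint density on x > 0, y > 0, x + y < 1

mu_X = sp.integrate(x * f, (y, 0, 1 - x), (x, 0, 1))       # 1/3
mu_Y = sp.integrate(y * f, (x, 0, 1 - y), (y, 0, 1))       # 1/3
mu_11 = sp.integrate(x * y * f, (x, 0, 1 - y), (y, 0, 1))  # 1/12

print(mu_X, mu_Y, mu_11, mu_11 - mu_X * mu_Y)              # covariance -1/36
```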
If X and Y are statistically independent, we have:
Theorem (12)
If X and Y are independent, then $E(XY) = E(X) \cdot E(Y)$ and $\sigma_{XY} = 0$.
Proof.
For the discrete case we have, by definition,
$$E(XY) = \sum_x \sum_y xy \cdot f(x, y)$$

Since X and Y are independent, we can write $f(x, y) = g(x) \cdot h(y)$, where $g(x)$ and $h(y)$ are the values of the marginal distributions of X and Y, and we get
$$E(XY) = \sum_x \sum_y xy\cdot g(x)h(y) = \left[\sum_x x\cdot g(x)\right]\left[\sum_y y\cdot h(y)\right] = E(X)\cdot E(Y)$$

Hence,

$$\sigma_{XY} = \mu'_{1,1} - \mu_X\mu_Y = E(X)\cdot E(Y) - E(X)\cdot E(Y) = 0$$
Independence ⇒ zero covariance.
Zero covariance ⇏ independence.
Example (17)
If the joint probability distribution of X and Y is given by
f(x, y)    x = -1   x = 0    x = 1    h(y)
y = -1     1/6      1/3      1/6      2/3
y = 0      0        0        0        0
y = 1      1/6      0        1/6      1/3
g(x)       1/3      1/3      1/3

show that their covariance is zero even though X and Y are not independent.
Solution
$$\mu_X = (-1)\cdot\frac{1}{3} + 0\cdot\frac{1}{3} + 1\cdot\frac{1}{3} = 0$$

$$\mu_Y = (-1)\cdot\frac{2}{3} + 0\cdot 0 + 1\cdot\frac{1}{3} = -\frac{1}{3}$$

and

$$\mu'_{1,1} = (-1)(-1)\cdot\frac{1}{6} + (-1)(1)\cdot\frac{1}{6} + (1)(-1)\cdot\frac{1}{6} + (1)(1)\cdot\frac{1}{6} = 0$$

Thus, $\sigma_{XY} = 0 - 0\cdot\left(-\frac{1}{3}\right) = 0$.

But $f(x, y) \neq g(x) \cdot h(y)$; for example, at $x = -1$, $y = -1$ we have $f(-1, -1) = \frac{1}{6} \neq \frac{1}{3}\cdot\frac{2}{3} = g(-1)\cdot h(-1)$.
Linear Combination of n Random Variables
One form of combining random variables is of special importance in statistical inference: the linear combination.
Theorem (14)
If $X_1, X_2, \ldots, X_n$ are random variables and $Y = \sum_{i=1}^n a_i X_i$, where $a_1, a_2, \ldots, a_n$ are constants, then

$$E(Y) = \sum_{i=1}^n a_i E(X_i)$$

and

$$\mathrm{Var}(Y) = \sum_{i=1}^n a_i^2 \cdot \mathrm{Var}(X_i) + 2\sum\sum_{i<j} a_i a_j\,\mathrm{cov}(X_i, X_j)$$
Corollary (3)
If the random variables $X_1, X_2, \ldots, X_n$ are independent and $Y = \sum_{i=1}^n a_i X_i$, then

$$\mathrm{Var}(Y) = \sum_{i=1}^n a_i^2\,\mathrm{Var}(X_i)$$

Example (18)

If the random variables X, Y, and Z have the means $\mu_X = 2$, $\mu_Y = -3$, and $\mu_Z = 4$, the variances $\sigma_X^2 = 1$, $\sigma_Y^2 = 5$, and $\sigma_Z^2 = 2$, and the covariances $\mathrm{cov}(X, Y) = -2$, $\mathrm{cov}(X, Z) = -1$, and $\mathrm{cov}(Y, Z) = 1$, find the mean and variance of $W = 3X - Y + 2Z$.
Solution
By Theorem 14, we get

$$E(W) = 3E(X) - E(Y) + 2E(Z) = 3\cdot 2 - (-3) + 2\cdot 4 = 17$$

and

$$\mathrm{Var}(W) = 9\,\mathrm{Var}(X) + \mathrm{Var}(Y) + 4\,\mathrm{Var}(Z) - 6\,\mathrm{cov}(X, Y) + 12\,\mathrm{cov}(X, Z) - 4\,\mathrm{cov}(Y, Z)$$
$$= 9\cdot 1 + 5 + 4\cdot 2 - 6\cdot(-2) + 12\cdot(-1) - 4\cdot 1 = 18$$
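Example 18 amounts to the quadratic form $a^\top \Sigma\, a$ with $a = (3, -1, 2)$ and $\Sigma$ the covariance matrix of $(X, Y, Z)$; a minimal numpy sketch (not part of the slides):

```python
# Minimal sketch: E(W) = a.mu and Var(W) = a' Sigma a for W = 3X - Y + 2Z.
import numpy as np

a = np.array([3, -1, 2])
mu = np.array([2, -3, 4])
Sigma = np.array([[ 1, -2, -1],
                  [-2,  5,  1],
                  [-1,  1,  2]])

print(a @ mu)           # E(W) = 17
print(a @ Sigma @ a)    # Var(W) = 18
```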
Theorem (15)
If $X_1, X_2, \ldots, X_n$ are random variables and

$$Y_1 = \sum_{i=1}^n a_i X_i \quad\text{and}\quad Y_2 = \sum_{i=1}^n b_i X_i$$

where $a_1, a_2, \ldots, a_n$ and $b_1, b_2, \ldots, b_n$ are constants, then

$$\mathrm{cov}(Y_1, Y_2) = \sum_{i=1}^n a_i b_i \cdot \mathrm{Var}(X_i) + \sum\sum_{i<j} (a_i b_j + a_j b_i)\cdot\mathrm{cov}(X_i, X_j)$$
Corollary (4)
If the random variables $X_1, X_2, \ldots, X_n$ are independent and

$$Y_1 = \sum_{i=1}^n a_i X_i \quad\text{and}\quad Y_2 = \sum_{i=1}^n b_i X_i,$$

then

$$\mathrm{cov}(Y_1, Y_2) = \sum_{i=1}^n a_i b_i \cdot \mathrm{Var}(X_i)$$
Example (19)
If the random variables X, Y, and Z have the means $\mu_X = 3$, $\mu_Y = 5$, and $\mu_Z = 2$, the variances $\sigma_X^2 = 8$, $\sigma_Y^2 = 12$, and $\sigma_Z^2 = 18$, and the covariances $\mathrm{cov}(X, Y) = 1$, $\mathrm{cov}(X, Z) = -3$, and $\mathrm{cov}(Y, Z) = 2$, find the covariance of

$$U = X + 4Y + 2Z \quad\text{and}\quad V = 3X - Y - Z$$

Solution

By Theorem 15, we get

$$\mathrm{cov}(U, V) = \mathrm{cov}(X + 4Y + 2Z,\ 3X - Y - Z)$$
$$= 3\,\mathrm{Var}(X) - 4\,\mathrm{Var}(Y) - 2\,\mathrm{Var}(Z) + 11\,\mathrm{cov}(X, Y) + 5\,\mathrm{cov}(X, Z) - 6\,\mathrm{cov}(Y, Z)$$
$$= 3\cdot 8 - 4\cdot 12 - 2\cdot 18 + 11\cdot 1 + 5\cdot(-3) - 6\cdot 2 = -76$$
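Theorem 15 is the bilinear form $\mathrm{cov}(Y_1, Y_2) = a^\top \Sigma\, b$; a minimal numpy check of Example 19 (not part of the slides):

```python
# Minimal sketch: cov(U, V) = a' Sigma b with a = (1, 4, 2), b = (3, -1, -1).
import numpy as np

a = np.array([1, 4, 2])
b = np.array([3, -1, -1])
Sigma = np.array([[ 8,  1, -3],
                  [ 1, 12,  2],
                  [-3,  2, 18]])

print(a @ Sigma @ b)    # cov(U, V) = -76
```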
Conditional Expectation
Definition (10)
If X is a discrete random variable, and $f(x|y)$ is the value of the conditional probability distribution of X given $Y = y$ at x, the conditional expectation of $u(X)$ given $Y = y$ is

$$E[u(X)\,|\,y] = \sum_x u(x)\cdot f(x|y)$$

Correspondingly, if X is a continuous random variable, and $f(x|y)$ is the value of the conditional probability density of X given $Y = y$ at x, the conditional expectation of $u(X)$ given $Y = y$ is

$$E[u(X)\,|\,y] = \int_{-\infty}^{\infty} u(x)\cdot f(x|y)\,dx$$
Letting $u(X) = X$, we obtain the conditional mean of the random variable X given $Y = y$, which we denote by

$$\mu_{X|y} = E(X|y)$$

Correspondingly, the conditional variance of the random variable X given $Y = y$ is

$$\sigma^2_{X|y} = E\left[(X - \mu_{X|y})^2 \,\middle|\, y\right] = E(X^2|y) - \mu^2_{X|y}$$

where $E(X^2|y)$ is obtained by taking $u(X) = X^2$ in Definition 10.
Example (20)
If the joint probability density of X and Y is given by

$$f(x, y) = \begin{cases} \frac{2}{3}(x + 2y) & \text{for } 0 < x < 1,\ 0 < y < 1 \\ 0 & \text{elsewhere} \end{cases}$$

find the conditional mean and the conditional variance of X given $Y = \frac{1}{2}$.
Solution
First we find the conditional density of X given $Y = y$,

$$f(x|y) = \begin{cases} \dfrac{2x + 4y}{1 + 4y} & \text{for } 0 < x < 1 \\ 0 & \text{elsewhere} \end{cases}$$
So, when $Y = \frac{1}{2}$,

$$f\!\left(x \,\middle|\, \tfrac{1}{2}\right) = \begin{cases} \frac{2}{3}(x + 1) & \text{for } 0 < x < 1 \\ 0 & \text{elsewhere} \end{cases}$$

Thus, $\mu_{X|\frac{1}{2}}$ is given by

$$E\!\left(X \,\middle|\, \tfrac{1}{2}\right) = \int_0^1 \frac{2}{3}x(x + 1)\,dx = \frac{5}{9}$$

Next we find

$$E\!\left(X^2 \,\middle|\, \tfrac{1}{2}\right) = \int_0^1 \frac{2}{3}x^2(x + 1)\,dx = \frac{7}{18}$$

and it follows that $\sigma^2_{X|\frac{1}{2}} = \frac{7}{18} - \left(\frac{5}{9}\right)^2 = \frac{13}{162}$.
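Example 20 can be reproduced end to end with sympy; a minimal sketch (my own, not part of the slides):

```python
# Minimal sketch: conditional mean and variance of X given Y = 1/2 (Example 20).
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2 * y)              # joint density on the unit square

h = sp.integrate(f, (x, 0, 1))                   # marginal of Y: (1 + 4y)/3
f_cond = sp.simplify(f / h).subs(y, sp.Rational(1, 2))   # f(x | 1/2) = 2(x + 1)/3

EX = sp.integrate(x * f_cond, (x, 0, 1))         # 5/9
EX2 = sp.integrate(x**2 * f_cond, (x, 0, 1))     # 7/18
print(EX, EX2 - EX**2)                           # 5/9, 13/162
```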
Appendix
Example
Consider Example 12 in Chapter 3 again.

f(x, y)    x = 0    x = 1    x = 2    h(y)
y = 0      1/6      1/3      1/12     7/12
y = 1      2/9      1/6      0        7/18
y = 2      1/36     0        0        1/36
g(x)       5/12     1/2      1/12

Find the conditional mean of X given Y = 1.
Solution
First we compute the conditional distribution of X given $Y = y$, $f(x|y) = \frac{f(x, y)}{h(y)}$:

f(x|y)     x = 0    x = 1    x = 2    h(y)
y = 0      2/7      4/7      1/7      7/12
y = 1      4/7      3/7      0        7/18
y = 2      1        0        0        1/36
Therefore,
$$E(X|1) = 0\cdot\frac{4}{7} + 1\cdot\frac{3}{7} + 2\cdot 0 = \frac{3}{7}$$

Also find $E(X|0)$ and $E(X|2)$.
Consider the caplet table above again.
We can find E(X) in two ways:

1. $E(X) = \sum_x x\, g(x) = 0\cdot\frac{5}{12} + 1\cdot\frac{1}{2} + 2\cdot\frac{1}{12} = \frac{2}{3}$
2. $E_Y[E(X|y)] = \sum_y E(X|y)\,h(y) = \frac{6}{7}\cdot\frac{7}{12} + \frac{3}{7}\cdot\frac{7}{18} + 0\cdot\frac{1}{36} = \frac{2}{3}$

The second method is called iterated expectation.
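The iterated-expectation check can be done directly from the joint table; a minimal sketch with exact fractions (not part of the slides):

```python
# Minimal sketch: E(X) = E_Y[E(X | Y)] for the caplet table.
from fractions import Fraction as F

f = {(0, 0): F(1, 6), (1, 0): F(1, 3), (2, 0): F(1, 12),
     (0, 1): F(2, 9), (1, 1): F(1, 6),
     (0, 2): F(1, 36)}                           # cells not listed are zero

h = {y: sum(p for (_, yy), p in f.items() if yy == y) for y in (0, 1, 2)}
E_X_given_y = {y: sum(xx * p for (xx, yy), p in f.items() if yy == y) / h[y]
               for y in (0, 1, 2)}

print(E_X_given_y)                               # {0: 6/7, 1: 3/7, 2: 0}
print(sum(E_X_given_y[y] * h[y] for y in h))     # 2/3 = E(X)
```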
Theorem (Iterated Expectation/Law of Total Expectation)
If X is a random variable whose expected value $E(X)$ is defined, and Y is any random variable on the same probability space, then

$$E(X) = E(E(X|Y))$$

That is, the expected value of the conditional expected value of X given Y is the same as the expected value of X.
Example
Suppose that two factories supply light bulbs to the market. Factory X's bulbs work for an average of 5000 hours, whereas factory Y's bulbs work for an average of 4000 hours. It is known that factory X supplies 60 percent of the total bulbs available. What is the expected length of time that a purchased bulb will work?
Solution
Applying the law of total expectation, we have:
$$E(L) = E(L|X)P(X) + E(L|Y)P(Y) = 5000\cdot 0.6 + 4000\cdot 0.4 = 4600$$

Thus each purchased light bulb has an expected lifetime of 4600 hours.
Homework
Work with your partner (in a group).
Hand in the homework to the editor group on duty before 17:00 on Sunday.
The group editor on duty shall organize the final answers and send the file of final answers to [email protected] before next Tuesday.
HW: please see the attached file.
Questions??