Download - Joint Probability Distributions

Transcript
Page 1: Joint Probability Distributions

Chapter 5 Title and Outline

Chapter 5: Joint Probability Distributions

CHAPTER OUTLINE
5-1 Two or More Random Variables
  5-1.1 Joint Probability Distributions
  5-1.2 Marginal Probability Distributions
  5-1.3 Conditional Probability Distributions
  5-1.4 Independence
  5-1.5 More Than Two Random Variables
5-2 Covariance and Correlation
5-3 Common Joint Distributions
  5-3.1 Multinomial Probability Distribution
  5-3.2 Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables

Page 2: Joint Probability Distributions

Chapter 5 Learning Objectives 2

© John Wiley & Sons, Inc. Applied Statistics and Probability for Engineers, by Montgomery and Runger.

Learning Objectives for Chapter 5
After careful study of this chapter, you should be able to do the following:
1. Use joint probability mass functions and joint probability density functions to calculate probabilities.
2. Calculate marginal and conditional probability distributions from joint probability distributions.
3. Interpret and calculate covariances and correlations between random variables.
4. Use the multinomial distribution to determine probabilities.
5. Understand properties of a bivariate normal distribution and be able to draw contour plots for the probability density function.
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables.
7. Determine the distribution of a general function of a random variable.

Page 3: Joint Probability Distributions

Chapter 5 Introduction 3


Concept of Joint Probabilities
• Some random variables are not independent of each other, i.e., they tend to be related.
  – Urban atmospheric ozone and airborne particulate matter tend to vary together.
  – Urban vehicle speeds and fuel consumption rates tend to vary inversely.
• The length (X) of an injection-molded part might not be independent of the width (Y). Individual parts will vary due to random variation in materials and pressure.
• A joint probability distribution describes the behavior of several random variables, say, X and Y. The graph of the distribution is 3-dimensional: x, y, and f(x, y).

Page 4: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 4


Example 5-1: Signal Bars
You use your cell phone to check your airline reservation. The airline system requires that you speak the name of your departure city to the voice recognition system.
• Let Y denote the number of times that you have to state your departure city.
• Let X denote the number of bars of signal strength on your cell phone.

Figure 5-1 Joint probability distribution of X and Y. The table cells are the probabilities. Observe that more bars are associated with less repeating. [Bar chart of number of repeats vs. cell phone bars omitted; vertical axis: probability, 0.00 to 0.25.]

  y    x = 1  x = 2  x = 3
  1    0.01   0.02   0.25
  2    0.02   0.03   0.20
  3    0.02   0.10   0.05
  4    0.15   0.10   0.05

x = number of bars of signal strength; y = number of times city name is stated
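As a quick sanity check, the table can be stored as a dictionary and verified to be a valid joint distribution. This is an illustrative sketch, not part of the original slides; the variable names are mine.

```python
# Joint PMF of Example 5-1, keyed by (x, y):
# x = bars of signal strength, y = number of times the city name is stated.
f = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

total = sum(f.values())                                # must equal 1
p_once = sum(p for (x, y), p in f.items() if y == 1)   # P(Y = 1)
print(round(total, 6), round(p_once, 6))   # 1.0 0.28
```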

Page 5: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 5


Joint Probability Mass Function Defined

The joint probability mass function of the discrete random variables X and Y, denoted as fXY(x, y), satisfies:

(1) fXY(x, y) ≥ 0          (all probabilities are non-negative)
(2) Σ_x Σ_y fXY(x, y) = 1  (the sum of all probabilities is 1)
(3) fXY(x, y) = P(X = x, Y = y)   (5-1)

Page 6: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 6


Joint Probability Density Function Defined

The joint probability density function for the continuous random variables X and Y, denoted as fXY(x, y), satisfies the following properties:

(1) fXY(x, y) ≥ 0 for all x, y
(2) ∫∫ fXY(x, y) dx dy = 1
(3) P((X, Y) ∈ R) = ∫∫_R fXY(x, y) dx dy   (5-2)

Figure 5-2 Joint probability density function for the random variables X and Y. The probability that (X, Y) is in the region R is determined by the volume of fXY(x, y) over the region R.

Page 7: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 7


Joint Probability Density Function Graph

Figure 5-3 Joint probability density function for the continuous random variables X and Y of different dimensions of an injection-molded part. Note the asymmetric, narrow ridge shape of the PDF – indicating that small values in the X dimension are more likely to occur when small values in the Y dimension occur.

Page 8: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 8


Example 5-2: Server Access Time-1
Let the random variable X denote the time (in milliseconds) until a computer server connects to your machine. Let Y denote the time until the server authorizes you as a valid user. X and Y measure the wait from a common starting point (x < y). The ranges of x and y are shown here.

Figure 5-4 The joint probability density function of X and Y is nonzero over the shaded region where x < y.

Page 9: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 9


Example 5-2: Server Access Time-2
• The joint probability density function is:

  fXY(x, y) = k e^(-0.001x - 0.002y) for 0 < x < y, with k = 6 × 10^-6

• We verify that it integrates to 1 as follows:

  ∫_0^∞ ∫_x^∞ k e^(-0.001x) e^(-0.002y) dy dx = k ∫_0^∞ e^(-0.001x) [e^(-0.002x) / 0.002] dx
    = 0.003 ∫_0^∞ e^(-0.003x) dx = 0.003 (1 / 0.003) = 1
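The normalizing constant can also be checked numerically. The sketch below (my own discretization choices, not from the slides) integrates y out analytically and then integrates over x with a midpoint rule:

```python
import math

k = 6e-6                        # claimed normalizing constant
dx = 1.0
total = 0.0
for i in range(200_000):        # x from 0 to 200,000 msec captures essentially all mass
    x = (i + 0.5) * dx
    inner = math.exp(-0.002 * x) / 0.002           # analytic inner integral over y > x
    total += k * math.exp(-0.001 * x) * inner * dx
print(round(total, 4))   # 1.0
```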

Page 10: Joint Probability Distributions

Sec 5-1.1 Joint Probability Distributions 10


Example 5-2: Server Access Time-3
Now calculate a probability:

  P(X < 1000, Y < 2000) = ∫_0^1000 ∫_x^2000 fXY(x, y) dy dx
    = k ∫_0^1000 e^(-0.001x) [(e^(-0.002x) - e^(-4)) / 0.002] dx
    = 0.003 ∫_0^1000 [e^(-0.003x) - e^(-4) e^(-0.001x)] dx
    = 0.003 [ (1 - e^(-3)) / 0.003 - e^(-4) (1 - e^(-1)) / 0.001 ]
    = 0.003 (316.738 - 11.578) = 0.915

Figure 5-5 Region of integration for the probability that X < 1000 and Y < 2000 is darkly shaded.
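The closed-form value can be verified in a couple of lines (an illustrative check):

```python
import math

# P(X < 1000, Y < 2000) = 0.003 * [ (1 - e^-3)/0.003 - e^-4 (1 - e^-1)/0.001 ]
p = 0.003 * ((1 - math.exp(-3)) / 0.003
             - math.exp(-4) * (1 - math.exp(-1)) / 0.001)
print(round(p, 3))   # 0.915
```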

Page 11: Joint Probability Distributions

Sec 5-1.2 Marginal Probability Distributions 11


Marginal Probability Distributions (discrete)
For a discrete joint distribution, there are marginal distributions for each random variable, formed by summing the joint PMF over the other variable:

  fX(x) = Σ_y fXY(x, y)
  fY(y) = Σ_x fXY(x, y)

  y    x = 1  x = 2  x = 3  f(y) =
  1    0.01   0.02   0.25   0.28
  2    0.02   0.03   0.20   0.25
  3    0.02   0.10   0.05   0.17
  4    0.15   0.10   0.05   0.30
  f(x) =    0.20   0.25   0.55   1.00

x = number of bars of signal strength; y = number of times city name is stated

Figure 5-6 From the prior example, the joint PMF is shown in green while the two marginal PMFs are shown in blue.

Page 12: Joint Probability Distributions

Sec 5-1.2 Marginal Probability Distributions 12


Marginal Probability Distributions (continuous)

• Rather than summing a discrete joint PMF, we integrate a continuous joint PDF.

• The marginal PDFs are used to make probability statements about one variable.

• If the joint probability density function of random variables X and Y is fXY(x,y), the marginal probability density functions of X and Y are:

  fX(x) = ∫ fXY(x, y) dy
  fY(y) = ∫ fXY(x, y) dx   (5-3)

where each integral is over the range of the variable being integrated out.

Page 13: Joint Probability Distributions

Sec 5-1.2 Marginal Probability Distributions 13


Example 5-4: Server Access Time-1
For the random variables that denote times in Example 5-2, find the probability that Y exceeds 2000. Integrate the joint PDF directly, using the picture to determine the limits of integration:

  P(Y > 2000) = ∫_0^2000 [∫_2000^∞ fXY(x, y) dy] dx + ∫_2000^∞ [∫_x^∞ fXY(x, y) dy] dx
                 (left dark region)                    (right dark region)

Page 14: Joint Probability Distributions

Sec 5-1.2 Marginal Probability Distributions 14


Example 5-4: Server Access Time-2

Alternatively, find the marginal PDF and then integrate that to find the desired probability.

  fY(y) = ∫_0^y k e^(-0.001x - 0.002y) dx
    = k e^(-0.002y) [(1 - e^(-0.001y)) / 0.001]
    = 6 × 10^-3 e^(-0.002y) (1 - e^(-0.001y)) for y > 0

  P(Y > 2000) = ∫_2000^∞ fY(y) dy
    = 6 × 10^-3 ∫_2000^∞ [e^(-0.002y) - e^(-0.003y)] dy
    = 6 × 10^-3 [e^(-4) / 0.002 - e^(-6) / 0.003]
    = 6 × 10^-3 (9.158 - 0.826) = 0.05
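The final evaluation can be checked directly (a verification sketch):

```python
import math

# P(Y > 2000) from the marginal f_Y(y) = 6e-3 * e^(-0.002y) * (1 - e^(-0.001y))
p = 6e-3 * (math.exp(-4) / 0.002 - math.exp(-6) / 0.003)
print(round(p, 3))   # 0.05
```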

Page 15: Joint Probability Distributions

Sec 5-1.2 Marginal Probability Distributions 15


Mean & Variance of a Marginal Distribution

Means E(X) and E(Y) are calculated from the discrete and continuous marginal distributions.

  Discrete:
    E(X) = Σ_x x fX(x)       V(X) = Σ_x x² fX(x) - μX²
    E(Y) = Σ_y y fY(y)       V(Y) = Σ_y y² fY(y) - μY²

  Continuous:
    E(X) = ∫ x fX(x) dx      V(X) = ∫ x² fX(x) dx - μX²
    E(Y) = ∫ y fY(y) dy      V(Y) = ∫ y² fY(y) dy - μY²

Page 16: Joint Probability Distributions

Sec 5-1.2 Marginal Probability Distributions 16


Mean & Variance for Example 5-1

  y    x = 1  x = 2  x = 3  f(y) =  y·f(y) =  y²·f(y) =
  1    0.01   0.02   0.25   0.28    0.28      0.28
  2    0.02   0.03   0.20   0.25    0.50      1.00
  3    0.02   0.10   0.05   0.17    0.51      1.53
  4    0.15   0.10   0.05   0.30    1.20      4.80
  f(x) =     0.20   0.25   0.55   1.00    2.49      7.61
  x·f(x) =   0.20   0.50   1.65   2.35
  x²·f(x) =  0.20   1.00   4.95   6.15

x = number of bars of signal strength; y = number of times city name is stated

E(X) = 2.35   V(X) = 6.15 - 2.35² = 6.15 - 5.5225 = 0.6275
E(Y) = 2.49   V(Y) = 7.61 - 2.49² = 7.61 - 6.2001 = 1.4099
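The same means and variances fall out of the marginal PMFs in a few lines (an illustrative sketch):

```python
# Marginal PMFs from Example 5-1.
fX = {1: 0.20, 2: 0.25, 3: 0.55}
fY = {1: 0.28, 2: 0.25, 3: 0.17, 4: 0.30}

EX = sum(x * p for x, p in fX.items())
EY = sum(y * p for y, p in fY.items())
VX = sum(x * x * p for x, p in fX.items()) - EX ** 2   # E(X^2) - E(X)^2
VY = sum(y * y * p for y, p in fY.items()) - EY ** 2
print(round(EX, 2), round(VX, 4), round(EY, 2), round(VY, 4))
# 2.35 0.6275 2.49 1.4099
```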

Page 17: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 17


Conditional Probability Distributions

From Example 5-1:
  P(Y = 1 | X = 3) = 0.25/0.55 = 0.455
  P(Y = 2 | X = 3) = 0.20/0.55 = 0.364
  P(Y = 3 | X = 3) = 0.05/0.55 = 0.091
  P(Y = 4 | X = 3) = 0.05/0.55 = 0.091
  Sum = 1.001 (due to rounding)

Recall that P(B | A) = P(A ∩ B) / P(A).

  y    x = 1  x = 2  x = 3  f(y) =
  1    0.01   0.02   0.25   0.28
  2    0.02   0.03   0.20   0.25
  3    0.02   0.10   0.05   0.17
  4    0.15   0.10   0.05   0.30
  f(x) =    0.20   0.25   0.55   1.00

x = number of bars of signal strength; y = number of times city name is stated

Note that there are 12 probabilities conditional on X, and 12 more probabilities conditional upon Y.
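The conditional PMF of Y given X = 3 can be computed from the joint table (a sketch; variable names are mine):

```python
# Joint PMF of Example 5-1, keyed by (x, y).
f = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}
fx3 = sum(p for (x, y), p in f.items() if x == 3)            # marginal f_X(3) = 0.55
cond = {y: round(f[(3, y)] / fx3, 3) for y in (1, 2, 3, 4)}  # f(y | x=3)
print(cond)   # {1: 0.455, 2: 0.364, 3: 0.091, 4: 0.091}
```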

Page 18: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 18


Conditional Probability Density Function Defined

Given continuous random variables X and Y with joint probability density function fXY(x, y), the conditional probability density function of Y given X = x is

  fY|x(y) = fXY(x, y) / fX(x)  for fX(x) > 0   (5-4)

which satisfies the following properties:
(1) fY|x(y) ≥ 0
(2) ∫ fY|x(y) dy = 1
(3) P(Y ∈ B | X = x) = ∫_B fY|x(y) dy for any set B in the range of Y

Page 19: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 19


Example 5-6: Conditional Probability-1
From Example 5-2, determine the conditional PDF for Y given X = x.

First find the marginal PDF of X:

  fX(x) = ∫_x^∞ k e^(-0.001x - 0.002y) dy = k e^(-0.001x) [e^(-0.002x) / 0.002]
    = 0.003 e^(-0.003x) for x > 0

Then form the ratio:

  fY|x(y) = fXY(x, y) / fX(x) = [k e^(-0.001x - 0.002y)] / [0.003 e^(-0.003x)]
    = 0.002 e^(0.002x - 0.002y) for 0 < x < y

Page 20: Joint Probability Distributions

20


Sec 5-1.3 Conditional Probability Distributions

Example 5-6: Conditional Probability-2
Now find the probability that Y exceeds 2000 given that X = 1500:

  P(Y > 2000 | X = 1500) = ∫_2000^∞ fY|1500(y) dy = ∫_2000^∞ 0.002 e^(0.002(1500) - 0.002y) dy
    = 0.002 e^3 [e^(-0.002(2000)) / 0.002] = e^3 e^(-4) = e^(-1) = 0.368

Figure 5-8 (again) The conditional PDF is nonzero on the solid line in the shaded region.
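Evaluating the antiderivative at the limits gives the value directly (a verification sketch):

```python
import math

# ∫_2000^∞ 0.002 e^(3 - 0.002 y) dy = e^3 * e^(-4) = e^(-1)
p = math.exp(3) * math.exp(-0.002 * 2000)
print(round(p, 3))   # 0.368
```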

Page 21: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 21


Example 5-7: Conditional Discrete PMFs

Conditional discrete PMFs can be shown as tables.

Joint PMF with marginals, and the conditional PMFs f(x|y) (row-wise) and f(y|x) (column-wise):

  y    x = 1  x = 2  x = 3  f(y) =   f(x|y): x = 1  x = 2  x = 3  sum
  1    0.01   0.02   0.25   0.28             0.036  0.071  0.893  1.000
  2    0.02   0.03   0.20   0.25             0.080  0.120  0.800  1.000
  3    0.02   0.10   0.05   0.17             0.118  0.588  0.294  1.000
  4    0.15   0.10   0.05   0.30             0.500  0.333  0.167  1.000
  f(x) =    0.20   0.25   0.55

  f(y|x):  x = 1  x = 2  x = 3
  y = 1    0.050  0.080  0.455
  y = 2    0.100  0.120  0.364
  y = 3    0.100  0.400  0.091
  y = 4    0.750  0.400  0.091
  Sum of f(y|x) = 1.000  1.000  1.000

x = number of bars of signal strength; y = number of times city name is stated

Page 22: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 22


Mean & Variance of Conditional Random Variables

• The conditional mean of Y given X = x, denoted as E(Y|x) or μY|x, is

  E(Y|x) = Σ_y y fY|x(y)  or  ∫ y fY|x(y) dy   (5-6)

• The conditional variance of Y given X = x, denoted as V(Y|x) or σ²Y|x, is

  V(Y|x) = Σ_y (y - μY|x)² fY|x(y) = Σ_y y² fY|x(y) - μY|x²

Page 23: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 23


Example 5-8: Conditional Mean & Variance
From Examples 5-2 & 5-6, what is the conditional mean for Y given that x = 1500? Integrate by parts.

  E(Y | X = 1500) = ∫_1500^∞ y (0.002) e^(0.002(1500) - 0.002y) dy = 0.002 e^3 ∫_1500^∞ y e^(-0.002y) dy

Integrating by parts with u = y and dv = e^(-0.002y) dy:

  ∫_1500^∞ y e^(-0.002y) dy = 1500 e^(-3) / 0.002 + e^(-3) / 0.002²

so

  E(Y | X = 1500) = 0.002 e^3 [1500 e^(-3) / 0.002 + e^(-3) / 0.002²] = 1500 + 1/0.002 = 2000

If the connect time is 1500 ms, then the expected time to be authorized is 2000 ms.

Page 24: Joint Probability Distributions

Sec 5-1.3 Conditional Probability Distributions 24


Example 5-9
For the discrete random variables in Example 5-1, what is the conditional mean of Y given X = 1?

  y    f(y|x=1)   y·f(y|x=1)   y²·f(y|x=1)
  1    0.050      0.05          0.05
  2    0.100      0.20          0.40
  3    0.100      0.30          0.90
  4    0.750      3.00         12.00
  Sum  1.000      3.55         13.35

  E(Y | X = 1) = 3.55
  V(Y | X = 1) = 13.35 - 3.55² = 13.35 - 12.6025 = 0.7475

The mean number of attempts given one bar is 3.55, with variance 0.7475.

Page 25: Joint Probability Distributions

Sec 5-1.4 Independence 25


Joint Random Variable Independence

• Random variable independence means that knowledge of the values of X does not change any of the probabilities associated with the values of Y.
• X and Y vary independently.
• Dependence implies that the values of X are influenced by the values of Y.
• Do you think that a person's height and weight are independent?

Page 26: Joint Probability Distributions

Sec 5-1.4 Independence 26


Exercise 5-10: Independent Random Variables
In a plastic molding operation, each part is classified as to whether it conforms to color and length specifications:

  X = 1 if the part conforms to color specs, 0 otherwise
  Y = 1 if the part conforms to length specs, 0 otherwise

Figure 5-10(a) shows the marginal & joint probabilities: fXY(x, y) = fX(x) · fY(y)
Figure 5-10(b) shows the conditional probabilities: fY|x(y) = fY(y)

Page 27: Joint Probability Distributions

Sec 5-1.4 Independence 27


Properties of Independence
For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent.

(1) fXY(x, y) = fX(x) fY(y) for all x and y
(2) fY|x(y) = fY(y) for all x and y with fX(x) > 0
(3) fX|y(x) = fX(x) for all x and y with fY(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B) for any sets A and B in the range of X and Y, respectively.   (5-7)

Page 28: Joint Probability Distributions

Sec 5-1.4 Independence 28


Rectangular Range for (X, Y)

• A rectangular range for X and Y is a necessary, but not sufficient, condition for the independence of the variables.

• If the range of X and Y is not rectangular, then the range of one variable is limited by the value of the other variable.

• If the range of X and Y is rectangular, then one of the properties of (5-7) must be demonstrated to prove independence.

Page 29: Joint Probability Distributions

Sec 5-1.4 Independence 29


Example 5-11: Independent Random Variables
• Suppose Example 5-2 is modified so that the joint PDF is:

  fXY(x, y) = 2 × 10^-6 e^(-0.001x - 0.002y) for x ≥ 0 and y ≥ 0

• Are X and Y independent? Does the product of the marginal PDFs equal the joint PDF? Yes, by inspection:

  fX(x) = ∫_0^∞ 2 × 10^-6 e^(-0.001x - 0.002y) dy = 0.001 e^(-0.001x) for x > 0
  fY(y) = ∫_0^∞ 2 × 10^-6 e^(-0.001x - 0.002y) dx = 0.002 e^(-0.002y) for y > 0

• Find this probability:

  P(X > 1000, Y < 1000) = P(X > 1000) P(Y < 1000) = e^(-1) (1 - e^(-2)) = 0.318
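Because the variables are independent, the probability factors into marginal exponential probabilities (a verification sketch):

```python
import math

# P(X > 1000) = e^(-1)  and  P(Y < 1000) = 1 - e^(-2), by the marginal PDFs above
p = math.exp(-0.001 * 1000) * (1 - math.exp(-0.002 * 1000))
print(round(p, 3))   # 0.318
```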

Page 30: Joint Probability Distributions

Sec 5-1.4 Independence 30


Example 5-12: Machined Dimensions
Let the random variables X and Y denote the lengths of two dimensions of a machined part. Assume that X and Y are independent, normally distributed random variables:

            X       Y
  Mean      10.5    3.2
  Variance  0.0025  0.0036

Find the desired probability:

  P(10.4 < X < 10.6, 3.15 < Y < 3.25) = P(10.4 < X < 10.6) P(3.15 < Y < 3.25)
    = P[(10.4-10.5)/0.05 < Z < (10.6-10.5)/0.05] × P[(3.15-3.2)/0.06 < Z < (3.25-3.2)/0.06]
    = P(-2 < Z < 2) × P(-0.833 < Z < 0.833)
    = 0.568
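The normal probabilities can be evaluated with the error function from the standard library (a sketch; the helper `phi` is my own):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# X ~ N(10.5, 0.05^2), Y ~ N(3.2, 0.06^2), independent
px = phi((10.6 - 10.5) / 0.05) - phi((10.4 - 10.5) / 0.05)  # P(-2 < Z < 2)
py = phi((3.25 - 3.2) / 0.06) - phi((3.15 - 3.2) / 0.06)    # P(-0.833 < Z < 0.833)
print(round(px * py, 3))   # 0.568
```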

Page 31: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 31


Example 5-13: More Than Two Random Variables

• Many dimensions of a machined part are routinely measured during production. Let the random variables X1, X2, X3 and X4 denote the lengths of four dimensions of a part.

• What we have learned about joint, marginal and conditional PDFs in two variables extends to many (p) random variables.

Page 32: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 32


Joint Probability Density Function Redefined

The joint probability density function for the continuous random variables X1, X2, …, Xp, denoted as fX1X2…Xp(x1, x2, …, xp), satisfies the following properties:

(1) fX1X2…Xp(x1, x2, …, xp) ≥ 0
(2) ∫…∫ fX1X2…Xp(x1, x2, …, xp) dx1 dx2 … dxp = 1
(3) For any region B of p-dimensional space,
  P((X1, X2, …, Xp) ∈ B) = ∫…∫_B fX1X2…Xp(x1, x2, …, xp) dx1 dx2 … dxp   (5-8)

Page 33: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 33


Example 5-14: Component Lifetimes
In an electronic assembly, let X1, X2, X3, X4 denote the lifetimes of 4 components in hours. The joint PDF is a product of exponential PDFs:

  fX1X2X3X4(x1, x2, x3, x4) = 9 × 10^-12 e^(-0.001x1 - 0.002x2 - 0.0015x3 - 0.003x4) for xi ≥ 0

What is the probability that the device operates more than 1000 hours?

  P(X1 > 1000, X2 > 1000, X3 > 1000, X4 > 1000) = e^(-1-2-1.5-3) = e^(-7.5) = 0.00055

Page 34: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 34


Example 5-15: Probability as a Ratio of Volumes

• Suppose the joint PDF of the continuous random variables X and Y is constant over the region x² + y² ≤ 4. The graph is a round cake with radius 2 and height 1/(4π).
• A cake round of radius 1 is cut from the center of the cake, the region x² + y² ≤ 1.
• What is the probability that a randomly selected bit of cake came from the center round?
• The volume of the whole cake is π(2)²(1/(4π)) = 1. The volume of the center round is π(1)²(1/(4π)) = 1/4. The desired probability is 1/4.

Page 35: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 35


Marginal Probability Density Function

If the joint probability density function of continuous random variables X1, X2, …, Xp is fX1X2…Xp(x1, x2, …, xp), the marginal probability density function of Xi is

  fXi(xi) = ∫…∫ fX1X2…Xp(x1, x2, …, xp) dx1 dx2 … dx(i-1) dx(i+1) … dxp   (5-9)

where the integral is over all points in the range of X1, X2, …, Xp for which Xi = xi (don't integrate out xi).

Page 36: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 36


Mean & Variance of a Joint PDF

The mean and variance of Xi can be determined from either the marginal PDF or the joint PDF as follows:

  E(Xi) = ∫…∫ xi fX1X2…Xp(x1, x2, …, xp) dx1 dx2 … dxp
                                                          (5-10)
  V(Xi) = ∫…∫ (xi - μXi)² fX1X2…Xp(x1, x2, …, xp) dx1 dx2 … dxp

Page 37: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 37


Example 5-16
• There are 10 points in this discrete joint PMF.
• Note that x1 + x2 + x3 = 3.
• List the marginal PMF of X2:

P(X2 = 0) = PXXX(0,0,3) + PXXX(1,0,2) + PXXX(2,0,1) + PXXX(3,0,0)

P(X2 = 1) = PXXX(0,1,2) + PXXX(1,1,1) + PXXX(2,1,0)

P(X2 = 2) = PXXX(0,2,1) + PXXX(1,2,0)

P(X2 = 3) = PXXX(0,3,0) Note the index pattern

Page 38: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 38


Reduced Dimensionality

If the joint probability density function of continuous random variables X1, X2, …, Xp is fX1X2…Xp(x1, x2, …, xp), then the joint probability density function of X1, X2, …, Xk, k < p, is

  fX1X2…Xk(x1, x2, …, xk) = ∫…∫ fX1X2…Xp(x1, x2, …, xp) dx(k+1) dx(k+2) … dxp   (5-11)

where the integral is over all points in the range of X1, X2, …, Xp for which Xi = xi for i = 1 through k (integrate out the remaining p - k variables).

Page 39: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 39


Conditional Probability Distributions

• Conditional probability distributions can be developed for multiple random variables by extension of the ideas used for two random variables.

• Suppose p = 5 and we wish to find the distribution conditional on X4 and X5.

  fX1X2X3|x4x5(x1, x2, x3) = fX1X2X3X4X5(x1, x2, x3, x4, x5) / fX4X5(x4, x5)

  for fX4X5(x4, x5) > 0.

Page 40: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 40


Independence with Multiple Variables

The concept of independence can be extended to multiple variables.

Random variables X1, X2, …, Xp are independent if and only if

  fX1X2…Xp(x1, x2, …, xp) = fX1(x1) fX2(x2) … fXp(xp) for all x1, x2, …, xp   (5-12)

(the joint PDF equals the product of all the marginal PDFs)

Page 41: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 41


Example 5-17

• In Chapter 3, we showed that a negative binomial random variable with parameters p and r can be represented as a sum of r geometric random variables X1, X2,…, Xr , each with parameter p.

• Because the binomial trials are independent, X1, X2,…, Xr are independent random variables.

Page 42: Joint Probability Distributions

Sec 5-1.5 More Than Two Random Variables 42


Example 5-18: Layer Thickness
Suppose X1, X2, X3 represent the thicknesses in μm of a substrate, an active layer, and a coating layer of a chemical product. Assume that these variables are independent and normally distributed with parameters and specified limits as tabled.

  Normal random variables:  X1       X2      X3
  Mean (μ)                  10,000   1,000   80
  Std dev (σ)               250      20      4
  Lower limit               9,200    950     75
  Upper limit               10,800   1,050   85
  P(in limits)              0.99863  0.98758 0.78870

  P(all in limits) = 0.99863 × 0.98758 × 0.78870 = 0.77783

What proportion of the product meets all specifications?
Answer: 0.77783, for the three-layer product.

Which one of the three thicknesses has the least probability of meeting specs?
Answer: Layer 3 has the least probability.
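The per-layer and overall probabilities can be reproduced with the error function (a sketch; the helper `phi` is my own):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

layers = [  # (mean, std dev, lower limit, upper limit) for X1, X2, X3
    (10000, 250, 9200, 10800),
    (1000, 20, 950, 1050),
    (80, 4, 75, 85),
]
p_all = 1.0
for mu, sigma, lo, hi in layers:
    p = phi((hi - mu) / sigma) - phi((lo - mu) / sigma)  # P(layer within limits)
    p_all *= p                                           # independence: multiply
print(round(p_all, 4))   # 0.7778
```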

Page 43: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 43


Covariance

• Covariance is a measure of the relationship between two random variables.

• First, we need to describe the expected value of a function of two random variables. Let h(X, Y) denote the function of interest.

  E[h(X, Y)] = Σ_x Σ_y h(x, y) fXY(x, y)    for X, Y discrete
                                                (5-13)
  E[h(X, Y)] = ∫∫ h(x, y) fXY(x, y) dx dy   for X, Y continuous

Page 44: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 44


Example 5-19: E(Function of 2 Random Variables)

Task: Calculate E[(X - μX)(Y - μY)] = covariance

  x  y  f(x, y)  x - μX  y - μY  Product
  1  1  0.1      -1.4    -1.0     0.14
  1  2  0.2      -1.4     0.0     0.00
  3  1  0.2       0.6    -1.0    -0.12
  3  2  0.2       0.6     0.0     0.00
  3  3  0.3       0.6     1.0     0.18
                     covariance =  0.20

  Marginals: fX(1) = 0.3, fX(3) = 0.7 (μX = 2.4); fY(1) = 0.3, fY(2) = 0.4, fY(3) = 0.3 (μY = 2.0)

Figure 5-12 Discrete joint distribution of X and Y.

Page 45: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 45


Covariance Defined

The covariance between the random variables X and Y, denoted as cov(X, Y) or σXY, is

  σXY = E[(X - μX)(Y - μY)] = E(XY) - μX μY   (5-14)

The units of σXY are the units of X times the units of Y. For example, if the units of X are feet and the units of Y are pounds, the units of the covariance are foot-pounds.

Unlike the variance, the covariance can be negative: -∞ < σXY < ∞.

Page 46: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 46


Covariance and Scatter Patterns

Figure 5-13 Joint probability distributions and the sign of cov(X, Y). Note that covariance is a measure of linear relationship. Variables with non-zero covariance are correlated.

Page 47: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 47


Example 5-20: Intuitive Covariance

The probability distribution of Example 5-1 is shown.

By inspection, note that the larger probabilities occur as X and Y move in opposite directions. This indicates a negative covariance.

  y    x = 1  x = 2  x = 3
  1    0.01   0.02   0.25
  2    0.02   0.03   0.20
  3    0.02   0.10   0.05
  4    0.15   0.10   0.05

x = number of bars of signal strength; y = number of times city name is stated

Page 48: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 48


Correlation (ρ = rho)

The correlation between random variables X and Y, denoted as ρXY, is

  ρXY = cov(X, Y) / √[V(X) V(Y)] = σXY / (σX σY)   (5-15)

Since σX > 0 and σY > 0, ρXY and cov(X, Y) have the same sign.

The correlation is normalized, so -1 ≤ ρXY ≤ +1   (5-16)

Note that ρXY is dimensionless. Variables with non-zero correlation are correlated.

Page 49: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 49


Example 5-21: Covariance & Correlation
Determine the covariance and correlation.

  x  y  f(x, y)
  0  0  0.2
  1  1  0.1
  1  2  0.1
  2  1  0.1
  2  2  0.1
  3  3  0.4

  Marginals: fX(0) = fX(1) = fX(2) = 0.2, fX(3) = 0.4 (and the same for Y)
  Means:     μX = 1.8, μY = 1.8
  St devs:   σX = 1.1662, σY = 1.1662
  covariance  = E(XY) - μX μY = 4.5 - 3.24 = 1.26
  correlation = 1.26 / (1.1662 × 1.1662) = 0.926

Note the strong positive correlation.

Figure 5-14 Discrete joint distribution, f(x, y).
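The covariance and correlation follow from Equations 5-14 and 5-15 applied to the table (an illustrative sketch):

```python
# Joint PMF of the example, keyed by (x, y).
f = {(0, 0): 0.2, (1, 1): 0.1, (1, 2): 0.1,
     (2, 1): 0.1, (2, 2): 0.1, (3, 3): 0.4}

EX = sum(x * p for (x, y), p in f.items())
EY = sum(y * p for (x, y), p in f.items())
EXY = sum(x * y * p for (x, y), p in f.items())
cov = EXY - EX * EY                                          # Eq. 5-14
sx = (sum(x * x * p for (x, y), p in f.items()) - EX**2) ** 0.5
sy = (sum(y * y * p for (x, y), p in f.items()) - EY**2) ** 0.5
rho = cov / (sx * sy)                                        # Eq. 5-15
print(round(cov, 2), round(rho, 4))   # 1.26 0.9265
```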

Page 50: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 50


Example 5-22
Calculate the correlation, using Equations 5-14 & 5-15.

  x  y   f(x, y)  x·y·f(x, y)
  1  7   0.2       1.4
  2  9   0.6      10.8
  3  11  0.2       6.6
                  18.8 = E(XY)

  Marginals: fX(1) = 0.2, fX(2) = 0.6, fX(3) = 0.2; fY(7) = 0.2, fY(9) = 0.6, fY(11) = 0.2
  Means:     μX = 2, μY = 9.0
  St devs:   σX = 0.6325, σY = 1.2649
  cov(X, Y) = 18.8 - (2)(9.0) = 0.80
  ρXY = 0.80 / (0.6325 × 1.2649) = 1.00

Figure 5-15 Discrete joint distribution. Steepness of the line connecting the points is immaterial.

Page 51: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 51


Independence Implies ρ = 0

• If X and Y are independent random variables, then σXY = ρXY = 0   (5-17)
• ρXY = 0 is a necessary, but not sufficient, condition for independence.
  – Figure 5-13d (x, y plots as a circle) provides an example: ρ = 0, yet X and Y are dependent.
  – Figure 5-13b (x, y plots as a square) indicates independence, but a non-rectangular pattern would indicate dependence.

Page 52: Joint Probability Distributions

Sec 5-2 Covariance & Correlation 52


Example 5-23: Independence Implies Zero Covariance

Let fXY(x, y) = xy/16 for 0 < x < 2 and 0 < y < 4. Show that σXY = E(XY) - E(X)E(Y) = 0.

  E(XY) = (1/16) ∫_0^4 ∫_0^2 x²y² dx dy = (1/16)(8/3) ∫_0^4 y² dy = (1/6)(64/3) = 32/9

  E(X) = (1/16) ∫_0^4 ∫_0^2 x²y dx dy = (1/16)(8/3) ∫_0^4 y dy = (1/6)(8) = 4/3

  E(Y) = (1/16) ∫_0^4 ∫_0^2 xy² dx dy = (1/16)(2) ∫_0^4 y² dy = (1/8)(64/3) = 8/3

  σXY = E(XY) - E(X)E(Y) = 32/9 - (4/3)(8/3) = 32/9 - 32/9 = 0

Figure 5-15 A planar joint distribution.

Page 53: Joint Probability Distributions

Sec 5-3 Common Joint Distributions 53


Common Joint Distributions

• There are two common joint distributions:
  – Multinomial probability distribution (discrete), an extension of the binomial distribution.
  – Bivariate normal probability distribution (continuous), a two-variable extension of the normal distribution. Although versions with more than two random variables exist, we do not deal with them here.
• There are many lesser-known and custom joint probability distributions, as you have already seen.

Page 54: Joint Probability Distributions

Sec 5-3.1 Multinomial Probability Distribution 54


Multinomial Probability Distribution

• Suppose a random experiment consists of a series of n trials. Assume that:
  1) The outcome of each trial can be classified into one of k classes.
  2) The probability of a trial resulting in each of the k outcomes is constant, denoted as p1, p2, …, pk.
  3) The trials are independent.
• The random variables X1, X2, …, Xk denote the number of outcomes in each class, have a multinomial distribution, and have probability mass function:

P(X1 = x1, X2 = x2, …, Xk = xk) = [n! / (x1! x2! … xk!)] · p1^x1 · p2^x2 · … · pk^xk   (5-18)

for x1 + x2 + … + xk = n and p1 + p2 + … + pk = 1.
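Equation 5-18 translates directly to code. A minimal sketch (the function name `multinomial_pmf` is mine, not from the text), checked against the digital-channel examples that follow:

```python
from math import factorial, prod

# Multinomial PMF, Equation 5-18: xs are the class counts, ps the class
# probabilities. The multinomial coefficient n!/(x1!...xk!) is exact integer
# arithmetic; the probability product is floating point.
def multinomial_pmf(xs, ps):
    n = sum(xs)
    coef = factorial(n)
    for x in xs:
        coef //= factorial(x)
    return coef * prod(p**x for p, x in zip(ps, xs))

# Example 5-24: P(14 E, 3 G, 2 F, 1 P) with the tabled bit-quality probabilities
print(round(multinomial_pmf([14, 3, 2, 1], [0.60, 0.30, 0.08, 0.02]), 4))  # 0.0063
# Example 5-25: P(12 E, 6 G, 2 F, 0 P)
print(round(multinomial_pmf([12, 6, 2, 0], [0.60, 0.30, 0.08, 0.02]), 4))  # 0.0358
```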

Page 55: Joint Probability Distributions

Sec 5-3.1 Multinomial Probability Distribution 55


Example 5-24: Digital Channel

Of the 20 bits received over a digital channel, 14 are of excellent (E) quality, 3 are good (G), 2 are fair (F), and 1 is poor (P). The sequence received was EEEEEEEEEEEEEEGGGFFP.

x   P(x)
E   0.60
G   0.30
F   0.08
P   0.02

The probability of that particular sequence is 0.6^14 · 0.3^3 · 0.08^2 · 0.02^1 = 2.708×10^-9.

However, the number of different orderings of those bits is large:
20! / (14!·3!·2!·1!) = 2,325,600

The combined result is a multinomial probability:
P(x1 = 14, x2 = 3, x3 = 2, x4 = 1) = [20! / (14!·3!·2!·1!)] · 0.6^14 · 0.3^3 · 0.08^2 · 0.02^1 = 0.0063

Page 56: Joint Probability Distributions

Sec 5-3.1 Multinomial Probability Distribution 56


Example 5-25: Digital Channel

Refer again to the prior Example 5-24. What is the probability that 12 bits are E, 6 bits are G, 2 are F, and 0 are P?

P(x1 = 12, x2 = 6, x3 = 2, x4 = 0) = [20! / (12!·6!·2!·0!)] · 0.6^12 · 0.3^6 · 0.08^2 · 0.02^0 = 0.0358

Using Excel:
0.03582 = (FACT(20)/(FACT(12)*FACT(6)*FACT(2))) * 0.6^12 * 0.3^6 * 0.08^2

Page 57: Joint Probability Distributions

Sec 5-3.1 Multinomial Probability Distribution 57

© John Wiley & Sons, Inc. Applied Statistics and Probability for Engineers, by Montgomery and Runger.

Multinomial Means and Variances

The marginal distributions of the multinomial are binomial.

If X1, X2, …, Xk have a multinomial distribution, the marginal probability distribution of each Xi is binomial with:

E(Xi) = npi and V(Xi) = npi(1-pi) (5-19)

Page 58: Joint Probability Distributions

Sec 5-3.1 Multinomial Probability Distribution 58


Example 5-26: Marginal Probability Distributions

Refer again to the prior Example 5-25. The classes are now {G}, {F}, and {E, P}. With p2 = P(G) and p3 = P(F), the multinomial collapses to:

P(X2 = x2, X3 = x3) = [n! / (x2! x3! (n − x2 − x3)!)] · p2^x2 · p3^x3 · (1 − p2 − p3)^(n − x2 − x3)

for x2 = 0 to n − x3 and x3 = 0 to n − x2.

Page 59: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 59


Example 5-27: Bivariate Normal Distribution

Earlier, we discussed the two dimensions of an injection-molded part as two random variables (X and Y). Let each dimension be modeled as a normal random variable. Since the dimensions are from the same part, they are typically not independent and hence correlated.

Now we have five parameters to describe the bivariate normal distribution:μX, σX, μY, σY, ρXY

Page 60: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 60


Bivariate Normal Distribution Defined

fXY(x, y; σX, σY, μX, μY, ρ) = [1 / (2πσXσY√(1 − ρ²))] · e^(−u)

where

u = [1 / (2(1 − ρ²))] · [ (x − μX)²/σX² − 2ρ(x − μX)(y − μY)/(σXσY) + (y − μY)²/σY² ]

for −∞ < x < ∞ and −∞ < y < ∞.

Parameter limits: σX > 0, σY > 0, −∞ < μX < ∞, −∞ < μY < ∞, −1 < ρ < 1.
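The density just defined can be transcribed directly. A sketch, assuming nothing beyond the formula above (the function name `bvn_pdf` and parameter spellings are mine):

```python
import math

# Bivariate normal density; mux, sx, muy, sy, rho mirror the slide's
# parameters muX, sigmaX, muY, sigmaY, rho.
def bvn_pdf(x, y, mux, sx, muy, sy, rho):
    zx, zy = (x - mux) / sx, (y - muy) / sy
    u = (zx**2 - 2 * rho * zx * zy + zy**2) / (2 * (1 - rho**2))
    return math.exp(-u) / (2 * math.pi * sx * sy * math.sqrt(1 - rho**2))

# Standard bivariate normal at the origin: 1/(2*pi)
print(round(bvn_pdf(0, 0, 0, 1, 0, 1, 0), 4))  # 0.1592
```

With ρ = 0 the density factors into the product of the two normal marginals, which is the independence special case discussed later in this section.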

Page 61: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 61


Role of Correlation

Figure 5-17 These illustrations show the shapes and contour lines of two bivariate normal distributions. The left distribution has independent X, Y random variables (ρ = 0). The right distribution has dependent X, Y random variables with positive correlation (ρ > 0, actually 0.9). The center of the contour ellipses is the point (μX, μY).

Page 62: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 62


Example 5-28: Standard Bivariate Normal Distribution

Figure 5-18 This is a standard bivariate normal because its means are zero, its standard deviations are one, and its correlation is zero since X and Y are independent. The density function is:

fXY(x, y) = [1/(2π)] · e^(−0.5(x² + y²))

Page 63: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 63


Marginal Distributions of the Bivariate Normal

If X and Y have a bivariate normal distribution with joint probability density function fXY(x, y; σX, σY, μX, μY, ρ), the marginal probability distributions of X and Y are normal with means μX and μY and standard deviations σX and σY, respectively. (5-21)

Figure 5-19 The marginal probability density functions of a bivariate normal distribution are simply projections of the joint onto each of the axis planes. Note that the correlation (ρ) has no effect on the marginal distributions.

Page 64: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 64


Conditional Distributions of the Joint Normal

If X and Y have a bivariate normal distribution with joint probability density fXY(x,y;σX,σY,μX,μY,ρ), the conditional probability distribution of Y given X = x is normal with mean and variance as follows:

Mean:      μY|x = μY + ρ (σY/σX)(x − μX)
Variance:  σ²Y|x = σY² (1 − ρ²)
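These two formulas are one-liners in code. A sketch using the injection-molded part parameters from Example 5-29 later in this section (the helper name is mine):

```python
# Conditional mean and variance of Y given X = x for a bivariate normal.
def conditional_y_given_x(x, mux, sx, muy, sy, rho):
    mean = muy + rho * (sy / sx) * (x - mux)   # shifts toward the regression line
    var = sy**2 * (1 - rho**2)                 # does not depend on x
    return mean, var

# Example 5-29 parameters: muX = 3, sigmaX = 0.04, muY = 7.7, sigmaY = 0.08,
# rho = 0.8; condition on X at its upper limit, x = 3.05.
mean, var = conditional_y_given_x(3.05, mux=3.0, sx=0.04, muy=7.70, sy=0.08, rho=0.8)
print(round(mean, 3), round(var, 6))  # 7.78 0.002304
```

Note that conditioning shifts the mean of Y but shrinks its variance by the fixed factor (1 − ρ²), regardless of the observed x.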

Page 65: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 65


Correlation of Bivariate Normal Random Variables

If X and Y have a bivariate normal distribution with joint probability density function fXY(x, y; σX, σY, μX, μY, ρ), the correlation between X and Y is ρ. (5-22)

Page 66: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 66


Bivariate Normal Correlation & Independence

• In general, zero correlation does not imply independence.

• But in the special case that X and Y have a bivariate normal distribution, if ρ = 0, then X and Y are independent. (5-23)

Page 67: Joint Probability Distributions

Sec 5-3.2 Bivariate Normal Distribution 67


Example 5-29: Injection-Molded Part

The injection-molded part dimensions have the parameters tabled below and are graphed as shown. The probability that X and Y are within limits is the volume under the PDF between the limit values. This volume is determined by numerical integration, which is beyond the scope of this text.

Parameters of the bivariate normal:
             X      Y
Mean         3      7.7
Std Dev      0.04   0.08
Upper Limit  3.05   7.80
Lower Limit  2.95   7.60
Correlation ρ = 0.8

Figure 5-3

Page 68: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 68


Linear Functions of Random Variables

• A function of random variables is itself a random variable.

• A function of random variables can be formed by either linear or nonlinear relationships. We limit our discussion here to linear functions.

• Given random variables X1, X2, …, Xp and constants c1, c2, …, cp,
  Y = c1X1 + c2X2 + … + cpXp   (5-24)
  is a linear combination of X1, X2, …, Xp.

Page 69: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 69


Mean & Variance of a Linear Function

Let Y = c1X1 + c2X2 + … + cpXp. Using Equation 5-10:

E(Y) = c1E(X1) + c2E(X2) + … + cpE(Xp)   (5-25)

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp) + 2 Σ_{i<j} ci·cj·cov(Xi, Xj)   (5-26)

If X1, X2, …, Xp are independent, then cov(Xi, Xj) = 0, and

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp)   (5-27)
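Equations 5-25 and 5-26 can be sketched as one small function (my own, not from the text; the covariance argument maps index pairs (i, j), i < j, to cov(Xi, Xj), and omitted pairs are treated as zero, which recovers the independent case of Equation 5-27):

```python
# Mean and variance of Y = c1*X1 + ... + cp*Xp (Equations 5-25 and 5-26).
def linear_mean_var(c, means, variances, cov=None):
    cov = cov or {}
    mean = sum(ci * mi for ci, mi in zip(c, means))
    var = sum(ci**2 * vi for ci, vi in zip(c, variances))
    var += 2 * sum(c[i] * c[j] * s for (i, j), s in cov.items())
    return mean, var

# Independent case, e.g. the perimeter Y = 2*X1 + 2*X2 of Example 5-32
# (variances 0.1^2 and 0.2^2):
m, v = linear_mean_var([2, 2], [2, 5], [0.01, 0.04])
print(m, round(v, 4))  # 14 0.2
```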

Page 70: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 70


Example 5-30: Negative Binomial Distribution

Let Xi be a geometric random variable with parameter p, so that μ = 1/p and σ² = (1 − p)/p².

Let Y = X1 + X2 + … + Xr, a linear combination of r independent geometric random variables.

Then Y is a negative binomial random variable with μ = r/p and σ² = r(1 − p)/p² by Equations 5-25 and 5-27.

Thus, a negative binomial random variable is a sum of r identically distributed and independent geometric random variables.
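This sum-of-geometrics identity can be checked exactly by convolving the geometric PMF with itself r − 1 times and comparing against the negative binomial PMF. The sketch below (helper names mine) truncates the support at ymax, which is harmless for y ≤ ymax because each geometric term is at least 1:

```python
from math import comb

def geom_pmf(x, p):
    return p * (1 - p)**(x - 1)  # geometric PMF, x = 1, 2, ...

def sum_pmf(r, p, ymax):
    # PMF of X1 + ... + Xr by repeated discrete convolution.
    pmf = {x: geom_pmf(x, p) for x in range(1, ymax + 1)}
    total = pmf
    for _ in range(r - 1):
        total = {y: sum(total.get(y - x, 0) * fx for x, fx in pmf.items())
                 for y in range(1, ymax + 1)}
    return total

p, r = 0.4, 3
conv = sum_pmf(r, p, 12)
negbin = lambda y: comb(y - 1, r - 1) * p**r * (1 - p)**(y - r)
print(all(abs(conv[y] - negbin(y)) < 1e-12 for y in range(r, 13)))  # True
```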

Page 71: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 71


Example 5-31: Error Propagation

A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm². What is the variance of the thickness of the finished product?

Answer:
X = X1 + X2 + X3
V(X) = V(X1) + V(X2) + V(X3) = 25 + 40 + 30 = 95 nm²
SD(X) = √95 = 9.747 nm

Page 72: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 72


Mean & Variance of an Average

If X̄ = (X1 + X2 + … + Xp)/p and E(Xi) = μ, then

E(X̄) = pμ/p = μ   (5-28a)

If the Xi are independent with V(Xi) = σ², then

V(X̄) = pσ²/p² = σ²/p   (5-28b)

Page 73: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 73


Reproductive Property of the Normal Distribution

If X1, X2, …, Xp are independent, normal random variables with E(Xi) = μi and V(Xi) = σi², for i = 1, 2, …, p, then

Y = c1X1 + c2X2 + … + cpXp

is a normal random variable with

E(Y) = c1μ1 + c2μ2 + … + cpμp

and

V(Y) = c1²σ1² + c2²σ2² + … + cp²σp²   (5-29)

Page 74: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 74


Example 5-32: Linear Function of Independent Normals

Let the random variables X1 and X2 denote the independent length and width of a rectangular manufactured part. Their parameters are shown in the table.

What is the probability that the perimeter exceeds 14.5 cm?

Parameters:
          X1    X2
Mean      2     5
Std Dev   0.1   0.2

Let Y = 2X1 + 2X2 (the perimeter).
E(Y) = 2E(X1) + 2E(X2) = 2(2) + 2(5) = 14 cm
V(Y) = 2²V(X1) + 2²V(X2) = 4(0.1)² + 4(0.2)² = 0.04 + 0.16 = 0.20
SD(Y) = √0.20 = 0.4472 cm
P(Y > 14.5) = P(Z > (14.5 − 14)/0.4472) = P(Z > 1.1180) = 1 − Φ(1.1180) = 0.1318

Using Excel:
0.1318 = 1 - NORMDIST(14.5, 14, SQRT(0.2), TRUE)
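The same number can be reproduced in Python without a statistics library, writing the standard normal CDF in terms of the error function, Φ(z) = 0.5·(1 + erf(z/√2)):

```python
import math

def phi(z):
    # Standard normal CDF via math.erf.
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Example 5-32: Y ~ N(14, 0.20), find P(Y > 14.5).
mean, sd = 14.0, math.sqrt(0.20)
p = 1 - phi((14.5 - mean) / sd)
print(round(p, 4))  # 0.1318
```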

Page 75: Joint Probability Distributions

Sec 5-4 Linear Functions of Random Variables 75


Example 5-33: Beverage Volume

Soft drink cans are filled by an automated filling machine. The mean fill volume is 12.1 fluid ounces, and the standard deviation is 0.1 fl oz. Assume that the fill volumes are independent, normal random variables. What is the probability that the average volume of 10 cans is less than 12 fl oz?

Let Xi denote the fill volume of the i-th can.
Let X̄ = (X1 + X2 + … + X10)/10.
E(X̄) = (10 × 12.1)/10 = 12.1 fl oz
V(X̄) = (10 × 0.1²)/10² = 0.001 (fl oz)²
P(X̄ < 12) = P(Z < (12 − 12.1)/√0.001) = P(Z < −3.16) = 0.00079

Using Excel:
0.000783 = NORMDIST(12, 12.1, SQRT(0.001), TRUE)

Page 76: Joint Probability Distributions

Sec 5-5 General Functions of Random Variables 76


General Functions of Random Variables

• A linear combination is not the only way that we can form a new random variable (Y) from one or more random variables (Xi) that we already know. We focus here on functions of a single variable, i.e., Y = h(X), the transform function.

• The transform function must be monotone:– Each value of x produces exactly one value of y.– Each value of y translates to only one value of x.

• This methodology produces the probability mass or density function of Y from the function of X.

Page 77: Joint Probability Distributions

Sec 5-5 General Functions of Random Variables 77


General Function of a Discrete Random Variable

Suppose that X is a discrete random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability mass function of the random variable Y is

fY(y) = fX[u(y)] (5-30)

Page 78: Joint Probability Distributions

Sec 5-5 General Functions of Random Variables 78


Example 5-34: Function of a Discrete Random Variable

Let X be a geometric random variable with PMF:
fX(x) = p(1 − p)^(x−1) for x = 1, 2, …

Find the probability distribution of Y = X².

Solution:
– Since X > 0, the transformation is one-to-one.
– The inverse transform function is x = u(y) = √y.
– fY(y) = p(1 − p)^(√y − 1) for y = 1, 4, 9, 16, …
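A quick sanity check (a sketch, not from the text): summing fY(y) over the squares y = 1, 4, 9, … is just the geometric series in disguise, so the truncated total should be essentially 1.

```python
# Check that f_Y(y) = p*(1-p)**(sqrt(y)-1) from Example 5-34 is a valid PMF
# over y = 1, 4, 9, ... (truncated at x = 199, i.e. y = 199**2).
p = 0.3
f_y = lambda y: p * (1 - p)**(y**0.5 - 1)
total = sum(f_y(x * x) for x in range(1, 200))
print(round(total, 6))  # 1.0
```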

Page 79: Joint Probability Distributions

Sec 5-5 General Functions of Random Variables 79


General Function of a Continuous Random Variable

Suppose that X is a continuous random variable with probability distribution fX(x). Let Y = h(X) define a one-to-one transformation between the values of X and Y so that the equation y = h(x) can be solved uniquely for x in terms of y. Let this solution be x = u(y), the inverse transform function. Then the probability density function of the random variable Y is

fY(y) = fX[u(y)] · |J|   (5-31)

where J = u′(y) is called the Jacobian of the transformation and the absolute value is used.

Page 80: Joint Probability Distributions

Sec 5-5 General Functions of Random Variables 80


Example 5-35: Function of a Continuous Random Variable

Let X be a continuous random variable with probability distribution:

fX(x) = x/8 for 0 ≤ x ≤ 4

Find the probability distribution of Y = h(X) = 2X + 4.

Note that Y has a one-to-one relationship to X.
x = u(y) = (y − 4)/2, and the Jacobian is J = u′(y) = 1/2.

fY(y) = fX[(y − 4)/2] · (1/2) = [(y − 4)/2]/8 · (1/2) = (y − 4)/32 for 4 ≤ y ≤ 12.
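As a check (a sketch, not from the text), the resulting density (y − 4)/32 should integrate to 1 over 4 ≤ y ≤ 12; a midpoint rule is exact here because the integrand is linear.

```python
# Numerical check of Example 5-35: integrate f_Y(y) = (y - 4)/32 on [4, 12].
f_y = lambda y: (y - 4) / 32
n, a, b = 10_000, 4.0, 12.0
h = (b - a) / n
area = sum(f_y(a + (i + 0.5) * h) for i in range(n)) * h  # midpoint rule
print(round(area, 6))  # 1.0
```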

Page 81: Joint Probability Distributions

Chapter 5 Summary 81


Important Terms & Concepts for Chapter 5

Bivariate distribution
Bivariate normal distribution
Conditional mean
Conditional probability density function
Conditional probability mass function
Conditional variance
Contour plots
Correlation
Covariance
Error propagation
General functions of random variables
Independence
Joint probability density function
Joint probability mass function
Linear functions of random variables
Marginal probability distribution
Multinomial distribution
Reproductive property of the normal distribution