Chapter 5: Joint Probability Distributions


Page 1

Chapter 5: Joint Probability Distributions

5-1 Two or More Random Variables
5-1.1 Joint Probability Distributions
5-1.2 Marginal Probability Distributions
5-1.3 Conditional Probability Distributions
5-1.4 Independence
5-1.5 More Than Two Random Variables

5-2 Covariance and Correlation
5-3 Common Joint Distributions
5-3.1 Multinomial Probability Distribution
5-3.2 Bivariate Normal Distribution
5-4 Linear Functions of Random Variables
5-5 General Functions of Random Variables

1

Chapter Learning Objectives

After careful study of this chapter you should be able to:
1. Use joint probability mass functions and joint probability density functions to calculate probabilities
2. Calculate marginal and conditional probability distributions from joint probability distributions
3. Interpret and calculate covariances and correlations between random variables
4. Use the multinomial distribution to determine probabilities
5. Understand properties of a bivariate normal distribution and be able to draw contour plots for the probability density function
6. Calculate means and variances for linear combinations of random variables, and calculate probabilities for linear combinations of normally distributed random variables
7. Determine the distribution of a general function of a random variable

2

Concept of Joint Probabilities

• Some random variables are not independent of each other, i.e., their values are related to some degree
  – Urban atmospheric ozone and airborne particulate matter tend to vary together
  – Urban vehicle speeds and fuel consumption rates tend to vary inversely
• The length (X) of an injection-molded part might not be independent of the width (Y) (individual parts will vary due to random variation in materials and pressure)
• A joint probability distribution will describe the behavior of several random variables
  – In the case of 2 RVs, say, X and Y, the graph of the joint distribution is 3-dimensional: x, y, and f(x,y).

3

The Joint Probability Distribution for a Pair of Discrete Random Variables
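In the usual textbook form, the joint probability mass function of two discrete random variables X and Y is

    f_{XY}(x,y) = P(X = x,\ Y = y),

and it satisfies f_{XY}(x,y) \ge 0 for all (x, y) and \sum_{x}\sum_{y} f_{XY}(x,y) = 1.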

4

Page 2

A Joint Probability Distribution Example (Example 5-1)

Let X = the number of bars of signal strength and Y = the number of times the city name is stated.

5

The Joint Probability Distribution for a Pair of Continuous Random Variables
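In the usual textbook form, a joint probability density function f_{XY}(x,y) for continuous random variables X and Y satisfies

    f_{XY}(x,y) \ge 0, \qquad
    \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} f_{XY}(x,y)\,dx\,dy = 1, \qquad
    P\big((X,Y) \in R\big) = \iint_{R} f_{XY}(x,y)\,dx\,dy .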

6

A Joint Probability Distribution Example (Example 5-2)

Figure 5-4 The joint probability density function of X and Y is nonzero over the shaded region where x < y.

7

A Joint Probability Distribution Example (Example 5-2)

Figure 5-5 Region of integration for the probability that X < 1000 and Y < 2000 is darkly shaded.

8
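To illustrate integrating a joint pdf over a non-rectangular region like the one in Figure 5-5, here is a minimal Python sketch. It assumes, purely for concreteness, an exponential-type density f(x,y) = 6e-6 * exp(-0.001x - 0.002y) on 0 < x < y; the constants are illustrative and not necessarily those of Example 5-2.

    # Minimal sketch: P(X < 1000, Y < 2000) for an assumed joint pdf
    # supported on 0 < x < y (the region shaded in Figure 5-4).
    from scipy import integrate
    import math

    def f(y, x):
        # dblquad integrates its *first* argument (y) on the inner integral.
        return 6e-6 * math.exp(-0.001 * x - 0.002 * y)

    # Outer: x from 0 to 1000; inner: y from x to 2000 -- the darkly
    # shaded region of Figure 5-5.
    prob, _ = integrate.dblquad(f, 0, 1000, lambda x: x, lambda x: 2000)
    print(prob)   # ~0.915 for this assumed density

    # Sanity check: the assumed density integrates to ~1 over 0 < x < y.
    total, _ = integrate.dblquad(f, 0, 1e4, lambda x: x, lambda x: 1e4)
    print(total)  # ~1.0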

Page 3

The Marginal Probability Distribution

• The individual probability distribution of a random variable is referred to as its marginal probability distribution
• In general, the marginal probability distribution of X can be determined from the joint probability distribution of X and other random variables
• For the case of two discrete random variables:
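In standard notation, the marginal pmf of X is obtained by summing the joint pmf over all values of y (and symmetrically for Y):

    f_X(x) = \sum_{y} f_{XY}(x,y), \qquad f_Y(y) = \sum_{x} f_{XY}(x,y) .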

9

A Marginal Probability Distribution Example (Example 5-3)

Figure 5-6 Marginal probability distribution of X and Y from Fig. 5-1.

10

The Marginal Probability Distribution

• For the case of two continuous random variables:
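In standard notation, the marginals are obtained by integrating out the other variable:

    f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dy, \qquad
    f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x,y)\,dx .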

11

The Conditional Probability Distribution (discrete case)
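In the usual form, the conditional pmf of Y given X = x (defined whenever f_X(x) > 0), and symmetrically of X given Y = y, is

    f_{Y\mid x}(y) = \frac{f_{XY}(x,y)}{f_X(x)}, \qquad
    f_{X\mid y}(x) = \frac{f_{XY}(x,y)}{f_Y(y)} .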

• Given this definition, verify that the joint pmf in Example 5-1 leads to the following conditional pmfs:

fY|x(y)                                 x (# bars of signal strength)
  y (# times city name is stated)          1        2        3
  4                                     0.7500   0.4000   0.0909
  3                                     0.1000   0.4000   0.0909
  2                                     0.1000   0.1200   0.3636
  1                                     0.0500   0.0800   0.4545

fX|y(x)                                 x (# bars of signal strength)
  y (# times city name is stated)          1        2        3
  4                                     0.5000   0.3333   0.1667
  3                                     0.1176   0.5882   0.2941
  2                                     0.0800   0.1200   0.8000
  1                                     0.0357   0.0714   0.8929

12

Page 4

Properties of Conditional pmfs
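In standard form, a conditional pmf is itself a pmf, and the conditional mean and variance of Y given X = x follow from it:

    f_{Y\mid x}(y) \ge 0, \qquad \sum_{y} f_{Y\mid x}(y) = 1,

    \mu_{Y\mid x} = E(Y \mid x) = \sum_{y} y\, f_{Y\mid x}(y), \qquad
    \sigma^2_{Y\mid x} = V(Y \mid x) = \sum_{y} y^2 f_{Y\mid x}(y) - \mu_{Y\mid x}^2 .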

13

An Example of Conditional pmf Mean and Variance

• Given the formulae on the previous slide and the conditional pmfs below, verify that the conditional means and conditional variances below are correct:

fY|x(y)                                 x (# bars of signal strength)
  y (# times city name is stated)          1        2        3
  4                                     0.7500   0.4000   0.0909
  3                                     0.1000   0.4000   0.0909
  2                                     0.1000   0.1200   0.3636
  1                                     0.0500   0.0800   0.4545
  Conditional mean:                     3.5500   3.1200   1.8182
  Conditional variance:                 0.7475   0.8256   0.8760

fX|y(x)                                 x (# bars of signal strength)   Conditional   Conditional
  y (# times city name is stated)          1        2        3             mean:       variance:
  4                                     0.5000   0.3333   0.1667           1.6667       0.5556
  3                                     0.1176   0.5882   0.2941           2.1765       0.3806
  2                                     0.0800   0.1200   0.8000           2.7200       0.3616
  1                                     0.0357   0.0714   0.8929           2.8571       0.1939

14
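As a quick check of the first column of the table above, a minimal Python sketch (with the conditional pmf f_{Y|x=1} copied from that table) computes the conditional mean and variance directly from the definitions:

    # Conditional pmf of Y given X = 1, copied from the table above.
    f_y_given_x1 = {4: 0.75, 3: 0.10, 2: 0.10, 1: 0.05}

    # Conditional mean: sum of y * f(y | x=1) over all y.
    mean = sum(y * p for y, p in f_y_given_x1.items())

    # Conditional variance: E(Y^2 | x=1) - [E(Y | x=1)]^2.
    var = sum(y**2 * p for y, p in f_y_given_x1.items()) - mean**2

    print(mean, var)   # ~3.55 and ~0.7475, matching the table entries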

The Conditional Probability Distribution (continuous case)
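In standard form, for continuous X and Y and any x with f_X(x) > 0,

    f_{Y\mid x}(y) = \frac{f_{XY}(x,y)}{f_X(x)}, \qquad
    P(Y \in B \mid X = x) = \int_{B} f_{Y\mid x}(y)\,dy .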

15

An Example of Conditional Probability (Example 5-6)

16

Page 5

Conditional Mean and Variance (continuous case)
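In standard form, the conditional mean and variance of Y given X = x are

    \mu_{Y\mid x} = E(Y \mid x) = \int_{-\infty}^{\infty} y\, f_{Y\mid x}(y)\,dy, \qquad
    \sigma^2_{Y\mid x} = V(Y \mid x) = \int_{-\infty}^{\infty} (y - \mu_{Y\mid x})^2 f_{Y\mid x}(y)\,dy .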

17

Independence of Two Random Variables
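The independence conditions referred to below as (5-7) are, in their usual equivalent forms (any one holds if and only if the others do, wherever the conditional distributions are defined):

    f_{XY}(x,y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y
    f_{Y\mid x}(y) = f_Y(y) \quad \text{for all } x, y \text{ with } f_X(x) > 0
    f_{X\mid y}(x) = f_X(x) \quad \text{for all } x, y \text{ with } f_Y(y) > 0
    P(X \in A,\ Y \in B) = P(X \in A)\, P(Y \in B) \quad \text{for all sets } A, B .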

18

Rectangular Range for (X, Y)

• A rectangular range for X and Y is a necessary, but not sufficient, condition for the independence of the variables
• If the range of X and Y is not rectangular, then the range of one variable is limited by the value of the other variable
• If the range of X and Y is rectangular, then one of the properties of (5-7) must be demonstrated to prove independence

19

An Example of Independence – discrete case (Example 5-10)

Figure 5-10 (a) Joint and marginal probability distributions of X and Y. (b) Conditional probability distribution of Y given X = x.

20

Page 6

An Example of Independence (Example 5-6)

• The example on the previous page could also be expressed in tabular form as follows:

fXY(x,y)                    x (colour conforms)
  y (length conforms)           0         1       fY(y)
  1                         0.0098    0.9702      0.98
  0                         0.0002    0.0198      0.02
  fX(x)                       0.01      0.99

fY|x(y)                     x (colour conforms)
  y (length conforms)           0         1
  1                         0.9800    0.9800
  0                         0.0200    0.0200
  Conditional mean:         0.9800    0.9800
  Conditional variance:     0.0196    0.0196

fX|y(x)                     x (colour conforms)   Conditional   Conditional
  y (length conforms)           0         1          mean:       variance:
  1                         0.0100    0.9900         0.9900       0.0099
  0                         0.0100    0.9900         0.9900       0.0099

21
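A minimal Python sketch that verifies independence for the joint pmf tabulated above by checking, cell by cell, that f_XY(x,y) = f_X(x)·f_Y(y):

    # Joint pmf from the table above: keys are (x, y) with
    # x = colour conforms (0/1), y = length conforms (0/1).
    f_xy = {(0, 1): 0.0098, (1, 1): 0.9702,
            (0, 0): 0.0002, (1, 0): 0.0198}

    # Marginals, obtained by summing the joint pmf over the other variable.
    f_x = {x: sum(p for (xx, _), p in f_xy.items() if xx == x) for x in (0, 1)}
    f_y = {y: sum(p for (_, yy), p in f_xy.items() if yy == y) for y in (0, 1)}

    # X and Y are independent iff f_XY(x, y) = f_X(x) * f_Y(y) for every cell.
    independent = all(abs(p - f_x[x] * f_y[y]) < 1e-9 for (x, y), p in f_xy.items())
    print(f_x, f_y, independent)   # marginals 0.01/0.99 and 0.02/0.98; True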

An Example of Independence – continuous case (Example 5-12)

22

The Covariance Between Two Random Variables

• Before we define covariance, we need to determine the expected value of a function of two random variables:
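In standard form, for a function h of the two random variables,

    E[h(X,Y)] = \sum_{x}\sum_{y} h(x,y)\, f_{XY}(x,y) \ \ \text{(discrete)}, \qquad
    E[h(X,Y)] = \iint h(x,y)\, f_{XY}(x,y)\,dx\,dy \ \ \text{(continuous)} .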

• Now we are ready to define covariance:
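In the usual notation,

    \operatorname{cov}(X,Y) = \sigma_{XY} = E\big[(X-\mu_X)(Y-\mu_Y)\big] = E(XY) - \mu_X \mu_Y .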

23

An Example (5-19) of Covariance for Discrete Random Variables

fXY(x,y)        x
  y             1        3      fY(y)
  3                    0.30      0.30
  2           0.20     0.20      0.40
  1           0.10     0.20      0.30
  fX(x)       0.30     0.70

Figure 5-12 Discrete joint distribution of X and Y

24
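A minimal Python sketch (with the joint pmf copied from Figure 5-12; cells not listed carry zero probability) of computing the covariance and correlation via E(XY) - mu_X * mu_Y:

    # Joint pmf of Figure 5-12: keys are (x, y); missing cells have probability 0.
    f_xy = {(3, 3): 0.30,
            (1, 2): 0.20, (3, 2): 0.20,
            (1, 1): 0.10, (3, 1): 0.20}

    def expect(g):
        """Expected value of g(x, y) under the joint pmf."""
        return sum(g(x, y) * p for (x, y), p in f_xy.items())

    mu_x = expect(lambda x, y: x)
    mu_y = expect(lambda x, y: y)
    cov = expect(lambda x, y: x * y) - mu_x * mu_y
    var_x = expect(lambda x, y: x**2) - mu_x**2
    var_y = expect(lambda x, y: y**2) - mu_y**2
    rho = cov / (var_x * var_y) ** 0.5

    print(mu_x, mu_y, cov, rho)   # ~2.4, 2.0, 0.2, 0.28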

Page 7

What Does Covariance Signify?

Covariance is a measure of the strength of the linear relationship between two random variables. If the relationship is nonlinear, the covariance may not be useful. E.g., in Fig. 5-13(d) there is definitely a relationship between the variables, but it is undetected by the covariance.

25

The Correlation Between Two Random Variables
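In standard form, the correlation scales the covariance by the two standard deviations, which makes it dimensionless and bounded:

    \rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y} = \frac{\operatorname{cov}(X,Y)}{\sqrt{V(X)\,V(Y)}}, \qquad -1 \le \rho_{XY} \le +1 .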

26

A Correlation (and Covariance) Example (Example 5-21)

fXY(x,y)                x
  y          0       1       2       3      fY(y)
  3                                0.40      0.40
  2                0.10    0.10              0.20
  1                0.10    0.10              0.20
  0        0.20                              0.20
  fX(x)    0.20    0.20    0.20    0.40

Figure 5-14 Discrete joint distribution, f(x, y).

27

A Correlation (and Covariance) Example (Example 5-21)
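Based on the cell placements shown in the table above, working through the definitions gives (a worked check):

    E(X) = E(Y) = 0(0.2) + 1(0.2) + 2(0.2) + 3(0.4) = 1.8, \qquad V(X) = V(Y) = 4.6 - 1.8^2 = 1.36,
    E(XY) = (1)(1)(0.1) + (1)(2)(0.1) + (2)(1)(0.1) + (2)(2)(0.1) + (3)(3)(0.4) = 4.5,
    \sigma_{XY} = 4.5 - (1.8)(1.8) = 1.26, \qquad \rho_{XY} = \frac{1.26}{1.36} \approx 0.926 .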

28

Page 8

The Covariance and Correlation Between Independent Variables
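The standard result behind the title: if X and Y are independent, then

    \sigma_{XY} = 0 \quad \text{and} \quad \rho_{XY} = 0 ,

although the converse is not true in general (zero covariance does not by itself imply independence).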

• If X and Y are exactly linearly related (i.e., Y = aX + b for constants a and b), then the correlation ρ_XY is either +1 or -1 (with the same sign as the constant a)
  – If a is positive, ρ_XY = +1
  – If a is negative, ρ_XY = -1

29

An Example (Covariance Between Independent Variables) (Example 5-23)

Figure 5-16 Random variables with zero covariance from Example 5-23.

30

An Example (Covariance Between Independent Variables) (Example 5-23)

31

Common Joint Distributions

• There are two common joint distributions
  – Multinomial probability distribution (discrete), an extension of the binomial distribution
  – Bivariate normal probability distribution (continuous), a two-variable extension of the normal distribution. Although they exist, we do not deal with more than two random variables.
• There are many lesser-known and custom joint probability distributions, as you have already seen

32

Page 9

The Multinomial Distribution
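In standard form, for n independent trials, each resulting in exactly one of k mutually exclusive outcomes with probabilities p_1, ..., p_k (where p_1 + ... + p_k = 1), the counts X_1, ..., X_k have joint pmf

    P(X_1 = x_1, \ldots, X_k = x_k) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k},
    \qquad x_1 + x_2 + \cdots + x_k = n,

with E(X_i) = n p_i and V(X_i) = n p_i (1 - p_i) for each i.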

33

The Bivariate Normal Distribution
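In standard form, the bivariate normal probability density function with parameters \mu_X, \mu_Y, \sigma_X > 0, \sigma_Y > 0 and correlation -1 < \rho < 1 is

    f_{XY}(x,y) = \frac{1}{2\pi \sigma_X \sigma_Y \sqrt{1-\rho^2}}
      \exp\!\left\{ \frac{-1}{2(1-\rho^2)}
      \left[ \frac{(x-\mu_X)^2}{\sigma_X^2}
      - \frac{2\rho (x-\mu_X)(y-\mu_Y)}{\sigma_X \sigma_Y}
      + \frac{(y-\mu_Y)^2}{\sigma_Y^2} \right] \right\},

whose contours of constant density are ellipses centred at (\mu_X, \mu_Y); the marginal distributions are N(\mu_X, \sigma_X^2) and N(\mu_Y, \sigma_Y^2).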

34

35

Linear Combinations of Random Variables
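In standard form, for Y = c_1 X_1 + c_2 X_2 + \cdots + c_p X_p with constants c_1, ..., c_p:

    E(Y) = c_1 E(X_1) + c_2 E(X_2) + \cdots + c_p E(X_p),
    V(Y) = \sum_{i=1}^{p} c_i^2\, V(X_i) + 2 \sum_{i<j} c_i c_j \operatorname{cov}(X_i, X_j),

and, if the X_i are independent, V(Y) = c_1^2 V(X_1) + \cdots + c_p^2 V(X_p).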

36

Page 10

A Linear Combination of Random Variables Example (Example 5-31)

(see equation 5-27)

37

Mean and Variance of an Average
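In standard form, if \bar{X} = (X_1 + \cdots + X_n)/n and each X_i has mean \mu, then E(\bar{X}) = \mu; if in addition the X_i are independent with common variance \sigma^2,

    V(\bar{X}) = \frac{\sigma^2}{n} .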

38

Reproductive Property of the Normal Distribution
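In standard form: if X_1, ..., X_p are independent with X_i \sim N(\mu_i, \sigma_i^2), then any linear combination is again normally distributed,

    Y = c_1 X_1 + c_2 X_2 + \cdots + c_p X_p \sim
    N\!\left( \sum_{i=1}^{p} c_i \mu_i,\ \sum_{i=1}^{p} c_i^2 \sigma_i^2 \right).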

39
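A minimal Python sketch of learning objective 6, with hypothetical numbers: computing a probability for a linear combination of independent normal random variables by first deriving its distribution from the reproductive property.

    # Hypothetical example: X1 ~ N(2, 0.5^2) and X2 ~ N(5, 1.0^2), independent,
    # and Y = 3*X1 + 2*X2.  By the reproductive property, Y is normal with
    # mean 3*2 + 2*5 = 16 and variance 3^2*0.5^2 + 2^2*1.0^2 = 6.25 (sd = 2.5).
    from scipy.stats import norm

    mean_y = 3 * 2 + 2 * 5
    sd_y = (3**2 * 0.5**2 + 2**2 * 1.0**2) ** 0.5

    # P(Y <= 20), evaluated from the cdf of the derived normal distribution.
    print(norm.cdf(20, loc=mean_y, scale=sd_y))   # ~0.945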