Joint Probability Distributions, Correlations
What we learned so far…
• Events:
– Working with events as sets: union, intersection, etc.
– Some events are simple: Heads vs. Tails, Cancer vs. Healthy
– Some are more complex: 10 < Gene expression < 100
– Some are even more complex: a series of dice rolls: 1, 3, 5, 3, 2
– Conditional probability: P(A|B) = P(A ∩ B)/P(B)
– Independent events: P(A|B) = P(A) or P(A ∩ B) = P(A)·P(B)
– Bayes' theorem: relates P(A|B) to P(B|A)
• Random variables:
– Mean, variance, standard deviation. How to work with E(g(X))
– Discrete (Uniform, Bernoulli, Binomial, Poisson, Geometric, Negative binomial, Hypergeometric, Power law); PMF: f(x) = Prob(X=x); CDF: F(x) = Prob(X≤x)
– Continuous (Uniform, Exponential, Erlang, Gamma, Normal, Log‐normal); PDF: f(x) such that Prob(X inside A) = ∫A f(x)dx; CDF: F(x) = Prob(X≤x)
• Next step: work with multiple random variables
Concept of Joint Probabilities
• Biological systems are usually described not by a single random variable but by many random variables
• Example: The expression state of a human cell: 20,000 random variables Xi for each of its genes
• A joint probability distribution describes the behavior of several random variables
• We will start with just two random variables X and Y and generalize when necessary
Chapter 5 Introduction
Joint Probability Mass Function Defined
Sec 5‐1.1 Joint Probability Distributions
The joint probability mass function of the discrete random variables X and Y, denoted as fXY(x,y), satisfies:

(1) fXY(x,y) ≥ 0 (all probabilities are non-negative)
(2) Σx Σy fXY(x,y) = 1 (the sum of all probabilities is 1)
(3) fXY(x,y) = P(X=x, Y=y) (5-1)
Example 5‐1: # Repeats vs. Signal Bars
You use your cell phone to check your airline reservation. It asks you to speak the name of your departure city to the voice recognition system.
• Let Y denote the number of times you have to state your departure city.
• Let X denote the number of bars of signal strength on your cell phone.
Figure 5‐1 Joint probability distribution of X and Y. The table cells are the probabilities. Observe that more bars relate to less repeating.
[Bar chart of number of repeats vs. cell phone bars: probability (0.00–0.25) vs. cell phone bars (1–3), one bar group per repeat count Once/Twice/3 Times/4 Times]

x = number of bars of signal strength; y = number of times city name is stated

        x=1    x=2    x=3
y=1     0.01   0.02   0.25
y=2     0.02   0.03   0.20
y=3     0.02   0.10   0.05
y=4     0.15   0.10   0.05
Marginal Probability Distributions (discrete)
For a discrete joint PMF, there is a marginal distribution for each random variable, formed by summing the joint PMF over the other variable.
Sec 5‐1.2 Marginal Probability Distributions
fX(x) = Σy fXY(x,y)
fY(y) = Σx fXY(x,y)
        x=1    x=2    x=3    fY(y)
y=1     0.01   0.02   0.25   0.28
y=2     0.02   0.03   0.20   0.25
y=3     0.02   0.10   0.05   0.17
y=4     0.15   0.10   0.05   0.30
fX(x) = 0.20   0.25   0.55   1.00

x = number of bars of signal strength; y = number of times city name is stated
Figure 5‐6 From the prior example, the joint PMF is shown in green while the two marginal PMFs are shown in purple.
Called marginal because they are written in the margins
Mean & Variance of X and Y are calculated using marginal distributions
        x=1    x=2    x=3    f(y)   y·f(y)  y²·f(y)
y=1     0.01   0.02   0.25   0.28   0.28    0.28
y=2     0.02   0.03   0.20   0.25   0.50    1.00
y=3     0.02   0.10   0.05   0.17   0.51    1.53
y=4     0.15   0.10   0.05   0.30   1.20    4.80
f(x) =  0.20   0.25   0.55   1.00   2.49    7.61

x·f(x) =  0.20   0.50   1.65   2.35
x²·f(x) = 0.20   1.00   4.95   6.15

x = number of bars of signal strength; y = number of times city name is stated

μX = E(X) = 2.35; σX² = V(X) = 6.15 − 2.35² = 6.15 − 5.5225 = 0.6275
μY = E(Y) = 2.49; σY² = V(Y) = 7.61 − 2.49² = 7.61 − 6.2001 = 1.4099
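The marginal sums and the mean/variance arithmetic above are easy to verify in a few lines of code. A minimal sketch in Python (the deck's own exercises use Matlab; the variable names here are made up for illustration):

```python
# Joint PMF of Example 5-1: keys are (x, y) = (bars, repeats), values are probabilities
f = {
    (1, 1): 0.01, (2, 1): 0.02, (3, 1): 0.25,
    (1, 2): 0.02, (2, 2): 0.03, (3, 2): 0.20,
    (1, 3): 0.02, (2, 3): 0.10, (3, 3): 0.05,
    (1, 4): 0.15, (2, 4): 0.10, (3, 4): 0.05,
}

# Marginals: sum the joint PMF over the other variable
fX = {x: sum(p for (xx, yy), p in f.items() if xx == x) for x in (1, 2, 3)}
fY = {y: sum(p for (xx, yy), p in f.items() if yy == y) for y in (1, 2, 3, 4)}

# Means and variances from the marginals, using V(X) = E(X^2) - E(X)^2
EX = sum(x * p for x, p in fX.items())                # 2.35
EY = sum(y * p for y, p in fY.items())                # 2.49
VX = sum(x * x * p for x, p in fX.items()) - EX ** 2  # 0.6275
VY = sum(y * y * p for y, p in fY.items()) - EY ** 2  # 1.4099

print(fX, fY, EX, EY, VX, VY)
```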
Conditional Probability Distributions
From Example 5‐1:
P(Y=1|X=3) = 0.25/0.55 = 0.455
P(Y=2|X=3) = 0.20/0.55 = 0.364
P(Y=3|X=3) = 0.05/0.55 = 0.091
P(Y=4|X=3) = 0.05/0.55 = 0.091
Sum = 1.00
Sec 5‐1.3 Conditional Probability Distributions
Recall that P(A|B) = P(A ∩ B)/P(B)

P(Y=y|X=x) = P(X=x, Y=y)/P(X=x) = fXY(x,y)/fX(x)

        x=1    x=2    x=3    fY(y)
y=1     0.01   0.02   0.25   0.28
y=2     0.02   0.03   0.20   0.25
y=3     0.02   0.10   0.05   0.17
y=4     0.15   0.10   0.05   0.30
fX(x) = 0.20   0.25   0.55   1.00

x = number of bars of signal strength; y = number of times city name is stated

Note that there are 12 probabilities conditional on X, and 12 more probabilities conditional on Y.
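As a sanity check, the four conditional probabilities above can be recomputed from the x = 3 column of the table. A short Python sketch (variable names are illustrative):

```python
# Conditional PMF f(y|x) = fXY(x,y)/fX(x) at x = 3, from the Example 5-1 table
col3 = {1: 0.25, 2: 0.20, 3: 0.05, 4: 0.05}   # fXY(3, y) for y = 1..4
fX3 = sum(col3.values())                       # marginal fX(3) = 0.55
cond = {y: p / fX3 for y, p in col3.items()}   # P(Y=y | X=3)

print(cond)                 # ~0.455, 0.364, 0.091, 0.091
print(sum(cond.values()))   # a conditional PMF sums to 1
```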
Joint Random Variable Independence
• Random variable independence means that knowledge of the values of X does not change any of the probabilities associated with the values of Y.
• Opposite: dependence means that the values of X are influenced by the values of Y.
Sec 5‐1.4 Independence
Independence for Discrete Random Variables
• Remember independence of events (slide 21, lecture 3): P(A|B) = P(A ∩ B)/P(B) = P(A), or P(B|A) = P(A ∩ B)/P(A) = P(B), or P(A ∩ B) = P(A)·P(B)
• Random variables X and Y are independent if the events A that Y=y and B that X=x are independent:
P(Y=y|X=x) = P(Y=y) for any x, or
P(X=x|Y=y) = P(X=x) for any y, or
P(X=x, Y=y) = P(X=x)·P(Y=y) for any x and y
X and Y are Bernoulli variables
What is the marginal PY(Y=0)?
A. 1/6
B. 2/6
C. 3/6
D. 4/6
E. I don’t know
Get your i‐clickers

       Y=0   Y=1
X=0    2/6   1/6
X=1    2/6   1/6
X and Y are Bernoulli variables
What is the conditional P(X=0|Y=0)?
A. 2/6
B. 1/2
C. 1/6
D. 4/6
E. I don’t know
Get your i‐clickers

       Y=0   Y=1
X=0    2/6   1/6
X=1    2/6   1/6
X and Y are Bernoulli variables
Are they independent?
A. yes
B. no
C. I don’t know
Get your i‐clickers

       Y=0   Y=1
X=0    2/6   1/6
X=1    2/6   1/6
X and Y are Bernoulli variables
Are they independent?
A. yes
B. no
C. I don’t know
Get your i‐clickers

       Y=0   Y=1
X=0    1/2   0
X=1    0     1/2
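The independence test behind the clicker questions above can be automated: compare every joint probability with the product of its marginals. A Python sketch using exact fractions (the helper function name is made up):

```python
from fractions import Fraction as F

def is_independent(joint):
    """Check P(X=x, Y=y) == P(X=x)*P(Y=y) for every cell of a joint PMF."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    pX = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in xs}
    pY = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in ys}
    return all(joint[(x, y)] == pX[x] * pY[y] for x in xs for y in ys)

# First clicker table: rows X = 0, 1; columns Y = 0, 1
t1 = {(0, 0): F(2, 6), (0, 1): F(1, 6), (1, 0): F(2, 6), (1, 1): F(1, 6)}
# Second clicker table: X always equals Y, a perfectly dependent pair
t2 = {(0, 0): F(1, 2), (0, 1): F(0), (1, 0): F(0), (1, 1): F(1, 2)}

print(is_independent(t1))  # True
print(is_independent(t2))  # False
```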
Joint Probability Density Function Defined
The joint probability density function for the continuous random variables X and Y, denoted as fXY(x,y), satisfies the following properties:

(1) fXY(x,y) ≥ 0 for all x, y
(2) ∫∫ fXY(x,y) dx dy = 1
(3) P((X,Y) ∈ R) = ∫∫R fXY(x,y) dx dy (5-2)

Figure 5‐2 Joint probability density function for the random variables X and Y. Probability that (X, Y) is in the region R is determined by the volume of fXY(x,y) over the region R.
Joint Probability Density Function
Figure 5‐3 Joint probability density function for the continuous random variables X and Y of expression levels of two different genes. Note the asymmetric, narrow ridge shape of the PDF – indicating that small values in the X dimension are more likely to occur when small values in the Y dimension occur.
Marginal Probability Distributions (continuous)
• Rather than summing a discrete joint PMF, we integrate a continuous joint PDF.
• The marginal PDFs are used to make probability statements about one variable.
• If the joint probability density function of random variables X and Y is fXY(x,y), the marginal probability density functions of X and Y are:
fX(x) = ∫ fXY(x,y) dy
fY(y) = ∫ fXY(x,y) dx (5-3)

Compare to discrete:
fX(x) = Σy fXY(x,y)
fY(y) = Σx fXY(x,y)
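To make the continuous marginal concrete, here is a sketch with a hypothetical joint PDF (not one from the slides): fXY(x,y) = e^(−x−y) for x, y > 0, whose marginals are e^(−x) and e^(−y). The integral over y is approximated numerically:

```python
import math

def f_xy(x, y):
    # Hypothetical joint PDF: f(x, y) = exp(-x - y) for x, y > 0
    return math.exp(-x - y)

def marginal_x(x, dy=1e-3, y_max=40.0):
    # fX(x) = integral of fXY(x, y) over y, midpoint rule on [0, y_max]
    n = int(y_max / dy)
    return sum(f_xy(x, (k + 0.5) * dy) for k in range(n)) * dy

for x in (0.5, 1.0, 2.0):
    print(x, marginal_x(x), math.exp(-x))  # numeric marginal vs. exact exp(-x)
```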
Conditional Probability Density Function Defined
Given continuous random variables X and Y with joint probability density function fXY(x,y), the conditional probability density function of Y given X = x is

fY|x(y) = fXY(x,y)/fX(x) if fX(x) > 0, where fX(x) = ∫ fXY(x,y) dy (5-4)

which satisfies the following properties:
(1) fY|x(y) ≥ 0
(2) ∫ fY|x(y) dy = 1
(3) P(Y ∈ B | X = x) = ∫B fY|x(y) dy for any set B in the range of Y
Compare to discrete: P(Y=y|X=x)=fXY(x,y)/fX(x)
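A small worked example of Eq. (5-4), with a hypothetical density not taken from the slides: fXY(x,y) = 2 on the triangle 0 < y < x < 1. Then fX(x) = 2x and fY|x(y) = 1/x, i.e. Y given X = x is uniform on (0, x). The sketch checks property (2) numerically:

```python
def f_xy(x, y):
    # Hypothetical joint PDF: constant 2 on the triangle 0 < y < x < 1
    return 2.0 if 0.0 < y < x < 1.0 else 0.0

def f_x(x, dy=1e-4):
    # Marginal fX(x) by midpoint rule over y in (0, 1); exact answer is 2x
    n = int(1.0 / dy)
    return sum(f_xy(x, (k + 0.5) * dy) for k in range(n)) * dy

x = 0.6
fx = f_x(x)  # ~ 2 * 0.6 = 1.2

# Conditional density fY|x(y) = f_xy(x, y) / fx should integrate to 1 over y
cond_total = sum(f_xy(x, (k + 0.5) * 1e-4) / fx for k in range(10000)) * 1e-4
print(fx, cond_total)
```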
Conditional Probability Distributions
• Conditional probability distributions can be developed for multiple random variables by extension of the ideas used for two random variables.
• Suppose p = 5 and we wish to find the distribution of X1, X2 and X3 conditional on X4=x4 and X5=x5.
Sec 5‐1.5 More Than Two Random Variables
fX1X2X3|x4x5(x1, x2, x3) = fX1X2X3X4X5(x1, x2, x3, x4, x5) / fX4X5(x4, x5)

for fX4X5(x4, x5) > 0.
Independence for Continuous Random Variables
For random variables X and Y, if any one of the following properties is true, the others are also true, and X and Y are independent.
(1) fXY(x,y) = fX(x)·fY(y) for all x and y
(2) fY|x(y) = fY(y) for all x and y with fX(x) > 0
(3) fX|y(x) = fX(x) for all x and y with fY(y) > 0
(4) P(X ∈ A, Y ∈ B) = P(X ∈ A)·P(Y ∈ B) for any sets A and B in the range of X and Y, respectively. (5-7)
Compare to discrete: P(Y=y|X=x) = P(Y=y) for any x, or P(X=x|Y=y) = P(X=x) for any y, or P(X=x, Y=y) = P(X=x)·P(Y=y) for any x and y
Covariation, Correlations
Covariance Defined
Sec 5‐2 Covariance & Correlation
Covariance is a number quantifying the average dependence between two random variables.

The covariance between the random variables X and Y, denoted as cov(X,Y) or σXY, is

σXY = E[(X − μX)(Y − μY)] = E(XY) − μX·μY (5-14)

The units of σXY are units of X times units of Y.
Unlike variance, which is non-negative, covariance can take any value: −∞ < σXY < ∞.
Covariance and PMF tables
The probability distribution of Example 5‐1 is shown.
By inspection, note that the larger probabilities occur as X and Y move in opposite directions. This indicates a negative covariance.
        x=1    x=2    x=3
y=1     0.01   0.02   0.25
y=2     0.02   0.03   0.20
y=3     0.02   0.10   0.05
y=4     0.15   0.10   0.05

x = number of bars of signal strength; y = number of times city name is stated
Covariance and Scatter Patterns
Figure 5‐13 Joint probability distributions and the sign of cov(X, Y). Note that covariance is a measure of linear relationship. Variables with non‐zero covariance are correlated.
Independence Implies σXY = ρXY = 0, but Not Vice Versa
• If X and Y are independent random variables, σXY = ρXY = 0 (5‐17)
• ρXY = 0 is a necessary, but not a sufficient, condition for independence.
[Scatter plots: left, NOT independent yet covariance = 0; right, independent with covariance = 0]
Correlation is “normalized covariance”
• Also called: Pearson correlation coefficient
• ρXY = σXY/(σX·σY) is the covariance normalized to satisfy −1 ≤ ρXY ≤ 1
Karl Pearson (1852–1936), English mathematician and biostatistician
Spearman rank correlation
• Pearson correlation tests for a linear relationship between X and Y
• Unlikely to work well for variables with broad distributions, where non‐linear effects dominate
• Spearman correlation tests for any monotonic relationship between X and Y
• Calculate the ranks (1 to n), rX(i) and rY(i), of the variables in both samples. Calculate the Pearson correlation between the ranks: Spearman(X,Y) = Pearson(rX, rY)
• Ties: convert to fractional ranks, e.g. a tie for 6th and 7th place gives both rank 6.5. This can lead to artefacts.
• If there are lots of ties: use the Kendall rank correlation (Kendall tau)
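The recipe above (rank both samples, give tied values their average rank, then take the Pearson correlation of the ranks) can be sketched in Python; the function names are made up:

```python
def ranks(v):
    """1-based ranks of v; tied values share their average (fractional) rank."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tied group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def spearman(x, y):
    return pearson(ranks(x), ranks(y))

# A monotonic but non-linear relationship gives Spearman correlation 1
x = [1, 2, 3, 4, 5]
y = [1, 8, 27, 64, 125]   # y = x**3
print(spearman(x, y))
```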
Matlab exercise: Correlation/Covariation
• Generate a sample with Stats = 100,000 of two Gaussian random variables r1 and r2, which have mean 0 and standard deviation 2 and are:
– Uncorrelated
– Correlated with correlation coefficient 0.9
– Correlated with correlation coefficient −0.5
– Trick: first make uncorrelated r1 and r2. Then change r1 to: r1m=mix.*r2+(1‐mix.^2)^0.5.*r1; where mix = corr. coeff.
• In each case calculate the covariance and the correlation coefficient
• In each case make a scatter plot: plot(r1,r2,'k.');
Matlab exercise: Correlation/Covariation
1. Stats=100000;
2. r1=2.*randn(Stats,1);
3. r2=2.*randn(Stats,1);
4. disp('Covariance matrix='); disp(cov(r1,r2));
5. disp('Correlation='); disp(corr(r1,r2));
6. figure; plot(r1,r2,'ko');
7. mix=0.9; % Mixes r2 into r1 but keeps the same variance
8. r1m=mix.*r2+sqrt(1-mix.^2).*r1;
9. disp('Covariance matrix='); disp(cov(r1m,r2));
10. disp('Correlation='); disp(corr(r1m,r2));
11. figure; plot(r1m,r2,'ko');
12. mix=-0.5; % REDO LINES 8-11
Linear Functions of Random Variables
• A function of random variables is itself a random variable.
• A function of random variables can be formed by either linear or nonlinear relationships. We start with linear functions.
• Given random variables X1, X2, …, Xp and constants c1, c2, …, cp,
Y = c1X1 + c2X2 + … + cpXp (5‐24)
is a linear combination of X1, X2, …, Xp.
Sec 5‐4 Linear Functions of Random Variables
Mean & Variance of a Linear Function
Y= c1X1 + c2X2 + … + cpXp
E(Y) = c1E(X1) + c2E(X2) + … + cpE(Xp) (5-25)

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp) + 2 Σi<j cicj cov(Xi, Xj) (5-26)

If X1, X2, …, Xp are independent, then cov(Xi, Xj) = 0 and

V(Y) = c1²V(X1) + c2²V(X2) + … + cp²V(Xp) (5-27)
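Equations (5-25) and (5-27) can be checked by simulation for, say, Y = 2·X1 + 3·X2 with independent normal X1 and X2 (the parameters below are hypothetical, chosen for illustration):

```python
import random

random.seed(0)
N = 200_000
# X1 ~ Normal(mean 1, sd 2), X2 ~ Normal(mean -1, sd 3), drawn independently
y = [2 * random.gauss(1, 2) + 3 * random.gauss(-1, 3) for _ in range(N)]

mean_y = sum(y) / N
var_y = sum((v - mean_y) ** 2 for v in y) / N

print(mean_y)  # theory, Eq. (5-25): 2*1 + 3*(-1) = -1
print(var_y)   # theory, Eq. (5-27): 2^2*4 + 3^2*9 = 97
```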
Example 5‐31: Error Propagation
A semiconductor product consists of three layers. The variances of the thicknesses of the layers are 25, 40, and 30 nm². What is the variance of the thickness of the finished product?
Answer:
X = X1 + X2 + X3

V(X) = V(X1) + V(X2) + V(X3) = 25 + 40 + 30 = 95 nm²

SD(X) = √95 ≈ 9.7 nm
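The same arithmetic in code, a trivial sketch mirroring the example (variances add because the layer thicknesses are assumed independent):

```python
import math

layer_variances = [25, 40, 30]       # nm^2, one per layer
var_total = sum(layer_variances)     # 95 nm^2: variances of independent terms add
sd_total = math.sqrt(var_total)      # ~9.7 nm

print(var_total, sd_total)
```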
Mean & Variance of an Average
If X̄ = (X1 + X2 + … + Xp)/p and E(Xi) = μ,
then E(X̄) = μ (5-28a)

If the Xi are independent with V(Xi) = σ²,
then V(X̄) = σ²/p (5-28b)
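Equation (5-28b) by simulation: averaging p independent draws should shrink the variance by a factor of p (the parameters below are illustrative):

```python
import random

random.seed(1)
p, trials = 10, 100_000
# Each Xi ~ Normal(0, sd 2), so sigma^2 = 4 and V(Xbar) should be 4/10 = 0.4
means = [sum(random.gauss(0, 2) for _ in range(p)) / p for _ in range(trials)]

m = sum(means) / trials
v = sum((x - m) ** 2 for x in means) / trials

print(m)  # theory, Eq. (5-28a): 0
print(v)  # theory, Eq. (5-28b): sigma^2 / p = 0.4
```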
Credit: XKCD comics