
ENSC327 Communications Systems
17: Random Variables and Expectation

Jie Liang
School of Engineering Science
Simon Fraser University

Introduction to Random Variables

- Definition: A random variable is a mathematical function that maps the outcomes of a random experiment to numbers.
- Caution: A random variable is actually a function, not a variable.


- Notation: Random variables are written with capital letters (e.g., X, Y, Z); values of random variables are written with lowercase letters (x, y, z, ...).
- Examples:
  - A student's grade in the midterm exam
  - Price of Google's stock
  - Average temperature of each day
  - Number of heads when tossing 10 fair coins

Random Variables (r.v.’s)

- Maps the abstract sample space to the real line.
- Events (collections of outcomes) become intervals.

[Figure: the random variable X(·) maps outcomes ζ1 and ζ2 in the sample space Ω to the points X(ζ1) and X(ζ2) on the real line.]

- Enables a "simple" probability function instead of a set-oriented probability measure.

Random Variables (r.v.'s)

- The probability function applies to the real-valued random variable in the same manner as it applies to the events.
- Probability analysis can be developed in terms of real-valued quantities, regardless of the form or shape of the underlying events of the random experiment.

Cumulative Distribution Function (cdf)

- The cdf completely describes the probability distribution of a real-valued random variable.
- Definition:

  F_X(x) = P[X ≤ x], for all real x.

- Note: The book refers to the CDF as the "probability distribution function", which can easily be confused with the "probability density function" f(x). Therefore CDF is the more commonly used name for F(x), and will be used in this course.

Properties of CDF

1. 0 ≤ F_X(x) ≤ 1, with F_X(−∞) = 0 and F_X(+∞) = 1.
2. F_X(x) is a non-decreasing function of x.
3. F_X(x) is continuous from the right.

(Ziemer)
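As an informal illustration (my own sketch, not from the slides), a minimal Python/NumPy check of these properties using an empirical cdf built from samples of an arbitrarily chosen distribution:

    import numpy as np

    # Minimal sketch: estimate F_X(x) from samples and check the properties above.
    rng = np.random.default_rng(0)
    samples = rng.exponential(scale=1.0, size=10_000)   # any distribution works here

    def empirical_cdf(samples, x):
        # Fraction of samples that are <= x, i.e., an estimate of P[X <= x].
        return np.mean(samples <= x)

    xs = np.linspace(-1.0, 6.0, 50)
    F = np.array([empirical_cdf(samples, x) for x in xs])

    assert F.min() >= 0.0 and F.max() <= 1.0     # Property 1: values stay in [0, 1]
    assert np.all(np.diff(F) >= 0)               # Property 2: non-decreasing in x
    print(F[0], F[-1])                           # ~0 at the far left, ~1 at the far right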

Example

- If F_X(x) = (1 − e^(−x)) u(x), where u(x) is the unit step function, find P(X > 5) and P(X > 5 | X < 7).
- Solution:

  P(X > 5) = 1 − F_X(5) = e^(−5).

  P(X > 5 | X < 7) = P(5 < X < 7) / P(X < 7) = [F_X(7) − F_X(5)] / F_X(7) = (e^(−5) − e^(−7)) / (1 − e^(−7)).
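As a sanity check (my own addition, not part of the slides): the cdf above is that of a unit-rate exponential random variable, so both probabilities can also be estimated by simulation:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.exponential(scale=1.0, size=1_000_000)   # F_X(x) = (1 - exp(-x)) u(x)

    p_gt5 = np.mean(x > 5)
    p_gt5_given_lt7 = np.mean((x > 5) & (x < 7)) / np.mean(x < 7)

    print(p_gt5, np.exp(-5))                                 # both ~ 0.0067
    print(p_gt5_given_lt7,
          (np.exp(-5) - np.exp(-7)) / (1 - np.exp(-7)))      # both ~ 0.0058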

Probability Density Functions (pdf)

- Applicable to continuous random variables.
- Definition:

  f_X(x) = dF_X(x)/dx.

- Properties:
  - f_X(x) ≥ 0.
  - ∫_{−∞}^{∞} f_X(x) dx = 1.
  - F_X(x) = ∫_{−∞}^{x} f_X(t) dt.
  - P(x1 < X ≤ x2) = F_X(x2) − F_X(x1) = ∫_{x1}^{x2} f_X(x) dx.

Example of pdf

- The final angle θ (modulo 2π) when a wheel is spun (measured with respect to the vertical line).
- Assumption: the pointer is equally likely to stop at any position.
- Resulting pdf: f_Θ(θ) = 1/(2π) for 0 ≤ θ < 2π, and 0 otherwise.
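A small NumPy sketch (my own addition) that models the "equally likely" assumption with a uniform angle and checks the histogram against 1/(2π):

    import numpy as np

    rng = np.random.default_rng(2)
    theta = rng.uniform(0.0, 2 * np.pi, size=100_000)   # pointer angle, uniform on [0, 2*pi)

    # Normalized histogram approximates the density f_Theta(theta).
    density, edges = np.histogram(theta, bins=20, density=True)
    print(density.round(3))      # every bin is close to 1/(2*pi) ~ 0.159
    print(1 / (2 * np.pi))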

Probability Mass Function (pmf) for Discrete Random Variable

- Suppose that X is a discrete random variable, taking values on some countable sample space S. Then the probability mass function for X is given by

  f_X(x) = P[X = x],  for x ∈ S,
  f_X(x) = 0,         otherwise.

- Note that f_X(x) is defined for all real numbers, including all values that X could never take; indeed, it assigns such values a probability of zero.

Probability Mass Function (pmf) for Discrete Random Variable

- Example: Let X be the random variable representing the outcome of a fair coin-tossing experiment.
- Let X = 0 if the result is a tail and X = 1 if it is a head.
- The PMF is:

  f_X(0) = 1/2,  f_X(1) = 1/2,  and f_X(x) = 0 otherwise.
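A minimal simulation sketch (my own addition) that estimates this pmf from repeated tosses:

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.integers(0, 2, size=100_000)    # X = 0 for tail, X = 1 for head (fair coin)

    pmf_estimate = {value: np.mean(x == value) for value in (0, 1)}
    print(pmf_estimate)                     # both values close to 0.5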

Several random variables

- Some random experiments must be characterized by two or more random variables.
- Joint cdf of X and Y:

  F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y].

- Properties:
  - 0 ≤ F_{X,Y}(x, y) ≤ 1; F_{X,Y}(−∞, y) = F_{X,Y}(x, −∞) = 0 and F_{X,Y}(∞, ∞) = 1.
  - F_{X,Y}(x, y) is non-decreasing in both x and y.

- Joint pdf of X and Y:

  f_{X,Y}(x, y) = ∂² F_{X,Y}(x, y) / (∂x ∂y).

Example

- How to write P[x1 < X ≤ x2, y1 < Y ≤ y2] in terms of F_{X,Y}?
- Solution:

  P[x1 < X ≤ x2, y1 < Y ≤ y2] = F_{X,Y}(x2, y2) − F_{X,Y}(x1, y2) − F_{X,Y}(x2, y1) + F_{X,Y}(x1, y1).
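A numerical check of this identity (my own sketch): with two independent unit-rate exponential variables the joint cdf factors as F_{X,Y}(x, y) = (1 − e^(−x))(1 − e^(−y)), so both sides can be compared directly:

    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.exponential(1.0, size=1_000_000)
    y = rng.exponential(1.0, size=1_000_000)

    x1, x2, y1, y2 = 0.5, 2.0, 0.2, 1.5
    F = lambda a, b: (1 - np.exp(-a)) * (1 - np.exp(-b))   # joint cdf (independent case)

    direct = np.mean((x > x1) & (x <= x2) & (y > y1) & (y <= y2))
    via_cdf = F(x2, y2) - F(x1, y2) - F(x2, y1) + F(x1, y1)
    print(direct, via_cdf)      # the two numbers agree closely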


Marginal pdf or cdf

- Marginal cdfs:

  F_X(x) = P[X ≤ x, Y ≤ ∞] = F_{X,Y}(x, ∞),
  F_Y(y) = P[X ≤ ∞, Y ≤ y] = F_{X,Y}(∞, y).

- Marginal pdfs:

  f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy,   f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.

Conditional pdf

- Conditional pdfs:

  f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x),
  f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y).

- Bayes' rule:

  f_{X|Y}(x|y) = f_{Y|X}(y|x) f_X(x) / f_Y(y).

Independent Random Variables

- Two random variables X and Y are independent if and only if

  F_{X,Y}(x, y) = F_X(x) F_Y(y)  for all x, y.

- Or equivalently:

  f_{X,Y}(x, y) = f_X(x) f_Y(y)  for all x, y.

Independent Random Variables

- Since f_{X,Y}(x, y) = f_{Y|X}(y|x) f_X(x), if X and Y are independent:

  f_{Y|X}(y|x) = f_Y(y)   and   f_{X|Y}(x|y) = f_X(x).

Statistical Averages

- Statistical averages can reveal many useful properties of random variables, especially when the cdf and pdf are unknown.
- The mean (expectation) of a discrete r.v. X:

  μ_X = E[X] = Σ_x x P[X = x].

- When the probabilities P[X = x] are unknown, use the average of N observations x1, x2, ..., xN:

  μ_X ≈ (1/N) (x1 + x2 + ... + xN).
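As an illustration (my own sketch, not from the slides), estimating the mean of the "number of heads in 10 fair coin tosses" example by averaging N observations:

    import numpy as np

    rng = np.random.default_rng(5)
    N = 100_000
    x = rng.binomial(n=10, p=0.5, size=N)   # number of heads in 10 fair tosses; E[X] = 5

    sample_mean = x.sum() / N               # (1/N) * (x1 + x2 + ... + xN)
    print(sample_mean)                      # close to the true mean 5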

Statistical Averages

- Expectation of a continuous random variable X:

  E[X] = ∫_{−∞}^{∞} x f_X(x) dx.

- Expectation of a linear combination of r.v.'s:

  E[a1 X1 + a2 X2 + ... + an Xn] = a1 E[X1] + a2 E[X2] + ... + an E[Xn].

- Expectation of a function of a random variable: if Y = g(X), then

  E[Y] = E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx.
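A short Monte Carlo sketch (my own addition) checking E[g(X)] for g(x) = x² with X uniform on [0, 1], where the exact value is ∫ x² dx = 1/3, plus the linearity property:

    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.uniform(0.0, 1.0, size=1_000_000)    # f_X(x) = 1 on [0, 1]

    print(np.mean(x ** 2), 1 / 3)                # E[g(X)] by averaging vs. exact 1/3
    print(np.mean(2 * x + 3), 2 * 0.5 + 3)       # linearity: E[2X + 3] = 2 E[X] + 3 = 4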

Variance

- Measure of the spread around the mean value:

  σ_X² = E{(X − μ_X)²}.

- σ_X: standard deviation.
- Property: σ_X² = E(X²) − μ_X².

Variance

- Proof for a continuous r.v.:

  σ_X² = ∫ (x − μ_X)² f_X(x) dx
       = ∫ x² f_X(x) dx − 2 μ_X ∫ x f_X(x) dx + μ_X² ∫ f_X(x) dx
       = E(X²) − 2 μ_X² + μ_X²
       = E(X²) − μ_X².

Covariance

- Covariance of two r.v.'s X and Y:

  Cov(X, Y) = E{(X − μ_X)(Y − μ_Y)} = E(XY) − μ_X μ_Y.

- X and Y are said to be uncorrelated if Cov(X, Y) = 0, i.e., E(XY) = μ_X μ_Y.
- If X and Y are independent, i.e., f(x, y) = f(x) f(y), then E(XY) = μ_X μ_Y.

Covariance

- So independence implies uncorrelatedness: independence is a stronger condition.
- However, the converse is not always true: if X and Y are uncorrelated, they are not necessarily independent.
- But if X and Y are Gaussian (studied in the next lecture), independence and uncorrelatedness are equivalent.
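A concrete counterexample (my own addition, not from the slides): take X uniform on [−1, 1] and Y = X². Y is completely determined by X, so they are dependent, yet Cov(X, Y) = E[X³] − E[X] E[X²] = 0:

    import numpy as np

    rng = np.random.default_rng(7)
    x = rng.uniform(-1.0, 1.0, size=1_000_000)
    y = x ** 2                                   # deterministic function of X (dependent)

    cov = np.mean(x * y) - np.mean(x) * np.mean(y)
    print(cov)                                   # ~ 0: uncorrelated but not independent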

Correlation Coefficient

- The normalized covariance is called the correlation coefficient between two random variables:

  ρ = Cov(X, Y) / (σ_X σ_Y).

- It can be shown that ρ ∈ [−1, 1].
- If X and Y are uncorrelated, then ρ = 0.

Correlation Coefficient

- If Y = aX + b, then

  ρ = 1 if a > 0,   ρ = −1 if a < 0.

- Proof: Let X have mean μ_X and variance σ_X². Then the mean and variance of Y are

  μ_Y = a μ_X + b,   σ_Y² = a² σ_X²,   so σ_Y = |a| σ_X.

  The covariance is

  Cov(X, Y) = E{XY} − μ_X μ_Y = E{aX² + bX} − μ_X (a μ_X + b) = a E(X²) − a μ_X² = a σ_X².

  Therefore

  ρ = Cov(X, Y) / (σ_X σ_Y) = a σ_X² / (|a| σ_X²) = a / |a|,

  which equals 1 for a > 0 and −1 for a < 0.
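A quick numerical confirmation (my own sketch) of ρ = ±1 for Y = aX + b:

    import numpy as np

    rng = np.random.default_rng(8)
    x = rng.normal(0.0, 2.0, size=1_000_000)

    for a, b in [(3.0, 1.0), (-0.5, 4.0)]:
        y = a * x + b
        rho = np.corrcoef(x, y)[0, 1]    # sample correlation coefficient
        print(a, rho)                    # ~ +1 when a > 0, ~ -1 when a < 0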

Additional Properties of Variance

- If the Xi are uncorrelated:

  Var(Σ_i a_i X_i) = Σ_i a_i² Var(X_i).
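A final sketch (my own addition) verifying this property with three independent, hence uncorrelated, random variables:

    import numpy as np

    rng = np.random.default_rng(9)
    N = 1_000_000
    x1 = rng.normal(0.0, 1.0, size=N)      # Var = 1
    x2 = rng.uniform(-1.0, 1.0, size=N)    # Var = 1/3
    x3 = rng.exponential(2.0, size=N)      # Var = 4

    a1, a2, a3 = 2.0, -3.0, 0.5
    s = a1 * x1 + a2 * x2 + a3 * x3

    lhs = s.var()
    rhs = a1**2 * x1.var() + a2**2 * x2.var() + a3**2 * x3.var()
    print(lhs, rhs)                        # the two sides agree closely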