Comm-05-Random Variables and Processes

Chapter 5: Random Variables and Processes
Wireless Information Transmission System Lab.
Institute of Communications Engineering
National Sun Yat-sen University

Transcript of Comm-05-Random Variables and Processes

Page 1: Comm-05-Random Variables and Processes

Chapter 5: Random Variables and Processes

Wireless Information Transmission System Lab.
Institute of Communications Engineering
National Sun Yat-sen University

Page 2: Comm-05-Random Variables and Processes

Table of Contents

◊ 5.1 Introduction
◊ 5.2 Probability
◊ 5.3 Random Variables
◊ 5.4 Statistical Averages
◊ 5.5 Random Processes
◊ 5.6 Mean, Correlation and Covariance Functions
◊ 5.7 Transmission of a Random Process through a Linear Filter
◊ 5.8 Power Spectral Density
◊ 5.9 Gaussian Process
◊ 5.10 Noise
◊ 5.11 Narrowband Noise

Page 3: Comm-05-Random Variables and Processes

5.1 Introduction

◊ The Fourier transform is a mathematical tool for the representation of deterministic signals.

◊ Deterministic signals: the class of signals that may be modeled as completely specified functions of time.

◊ A signal is "random" if it is not possible to predict its precise value in advance.

◊ A random process consists of an ensemble (family) of sample functions, each of which varies randomly with time.

◊ A random variable is obtained by observing a random process at a fixed instant of time.


Page 4: Comm-05-Random Variables and Processes

5.2 Probability

◊ Probability theory is rooted in phenomena that, explicitly or implicitly, can be modeled by an experiment with an outcome that is subject to chance.

◊ Example: the experiment may be the observation of the result of tossing a fair coin. In this experiment, the possible outcomes of a trial are "heads" or "tails".

◊ If an experiment has K possible outcomes, then for the kth possible outcome we have a point called the sample point, which we denote by $s_k$. With this basic framework, we make the following definitions:

◊ The set of all possible outcomes of the experiment is called the sample space, which we denote by S.

◊ An event corresponds to either a single sample point or a set of sample points in the space S.

Page 5: Comm-05-Random Variables and Processes

5.2 Probability

◊ A single sample point is called an elementary event.

◊ The entire sample space S is called the sure event; the null set $\phi$ is called the null or impossible event.

◊ Two events are mutually exclusive if the occurrence of one event precludes the occurrence of the other event.

◊ A probability measure P is a function that assigns a non-negative number to an event A in the sample space S and satisfies the following three properties (axioms):

1. $0 \le \mathbf{P}[A] \le 1$  (5.1)

2. $\mathbf{P}[S] = 1$  (5.2)

3. If $A$ and $B$ are two mutually exclusive events, then $\mathbf{P}[A \cup B] = \mathbf{P}[A] + \mathbf{P}[B]$  (5.3)

Page 6: Comm-05-Random Variables and Processes

5.2 Probability

Page 7: Comm-05-Random Variables and Processes

5.2 Probability

◊ The following properties of probability measure P may be derived from the above axioms:

1. $\mathbf{P}[\bar{A}] = 1 - \mathbf{P}[A]$  (5.4)

2. When events A and B are not mutually exclusive:

$\mathbf{P}[A \cup B] = \mathbf{P}[A] + \mathbf{P}[B] - \mathbf{P}[A \cap B]$  (5.5)

3. If $A_1, A_2, \ldots, A_m$ are mutually exclusive events that include all possible outcomes of the random experiment, then

$\mathbf{P}[A_1] + \mathbf{P}[A_2] + \cdots + \mathbf{P}[A_m] = 1$  (5.6)

Page 8: Comm-05-Random Variables and Processes

5.2 Probability

◊ Let P[B|A] denote the probability of event B, given that event A has occurred. The probability P[B|A] is called the conditional probability of B given A.

◊ P[B|A] is defined by

$\mathbf{P}[B|A] = \frac{\mathbf{P}[A \cap B]}{\mathbf{P}[A]}$  (5.7)

◊ Bayes' rule

◊ We may write Eq. (5.7) as $\mathbf{P}[A \cap B] = \mathbf{P}[B|A]\,\mathbf{P}[A]$  (5.8)

◊ It is apparent that we may also write $\mathbf{P}[A \cap B] = \mathbf{P}[A|B]\,\mathbf{P}[B]$  (5.9)

◊ From Eqs. (5.8) and (5.9), provided P[A] ≠ 0, we may determine P[B|A] by using the relation

$\mathbf{P}[B|A] = \frac{\mathbf{P}[A|B]\,\mathbf{P}[B]}{\mathbf{P}[A]}$  (5.10)

Page 9: Comm-05-Random Variables and Processes

5.2 Conditional Probability

◊ Suppose that the conditional probability P[B|A] is simply equal to the elementary probability of occurrence of event B, that is,

$\mathbf{P}[B|A] = \mathbf{P}[B]$, so that $\mathbf{P}[A \cap B] = \mathbf{P}[A]\,\mathbf{P}[B]$

$\mathbf{P}[A|B] = \frac{\mathbf{P}[A \cap B]}{\mathbf{P}[B]} = \frac{\mathbf{P}[A]\,\mathbf{P}[B]}{\mathbf{P}[B]} = \mathbf{P}[A]$  (5.13)

◊ Events A and B that satisfy this condition are said to be statistically independent.

Page 10: Comm-05-Random Variables and Processes

5.2 Conditional Probability

◊ Example 5.1 Binary Symmetric Channel

◊ This channel is said to be discrete in that it is designed to handle discrete messages.

◊ The channel is memoryless in the sense that the channel output at any time depends only on the channel input at that time.

◊ The channel is symmetric, which means that the probability of receiving symbol 1 when 0 is sent is the same as the probability of receiving symbol 0 when symbol 1 is sent.

Page 11: Comm-05-Random Variables and Processes

5.2 Conditional Probability

◊ Example 5.1 Binary Symmetric Channel (continued)

◊ The a priori probabilities of sending binary symbols 0 and 1:

$\mathbf{P}[A_0] = p_0, \qquad \mathbf{P}[A_1] = p_1$

◊ The conditional probabilities of error:

$\mathbf{P}[B_1|A_0] = \mathbf{P}[B_0|A_1] = p$

◊ The probability of receiving symbol 0 is given by:

$\mathbf{P}[B_0] = \mathbf{P}[B_0|A_0]\,\mathbf{P}[A_0] + \mathbf{P}[B_0|A_1]\,\mathbf{P}[A_1] = (1-p)\,p_0 + p\,p_1$

◊ The probability of receiving symbol 1 is given by:

$\mathbf{P}[B_1] = \mathbf{P}[B_1|A_0]\,\mathbf{P}[A_0] + \mathbf{P}[B_1|A_1]\,\mathbf{P}[A_1] = p\,p_0 + (1-p)\,p_1$

Page 12: Comm-05-Random Variables and Processes

5.2 Conditional Probability

◊ Example 5.1 Binary Symmetric Channel (continued)

◊ The a posteriori probabilities P[A0|B0] and P[A1|B1]:

$\mathbf{P}[A_0|B_0] = \frac{\mathbf{P}[B_0|A_0]\,\mathbf{P}[A_0]}{\mathbf{P}[B_0]} = \frac{(1-p)\,p_0}{(1-p)\,p_0 + p\,p_1}$

$\mathbf{P}[A_1|B_1] = \frac{\mathbf{P}[B_1|A_1]\,\mathbf{P}[A_1]}{\mathbf{P}[B_1]} = \frac{(1-p)\,p_1}{p\,p_0 + (1-p)\,p_1}$
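As a quick numerical sanity check, the sketch below (not part of the original slides; the values of p0, p1, and p are assumed) evaluates the receive probabilities and the a posteriori probabilities derived above.

```python
# Hedged sketch: numerical check of Example 5.1 (assumed parameter values).
p0, p1 = 0.6, 0.4   # a priori probabilities of sending symbols 0 and 1
p = 0.1             # crossover probability P[B1|A0] = P[B0|A1]

# Probabilities of receiving symbols 0 and 1.
P_B0 = (1 - p) * p0 + p * p1
P_B1 = p * p0 + (1 - p) * p1

# A posteriori probabilities from Bayes' rule.
P_A0_given_B0 = (1 - p) * p0 / P_B0
P_A1_given_B1 = (1 - p) * p1 / P_B1

print(P_B0 + P_B1)                    # sanity check: sums to 1.0
print(P_A0_given_B0, P_A1_given_B1)   # both close to 1 for small p
```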

Page 13: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ We denote the random variable as X(s) or just X.
◊ X is a function.
◊ A random variable may be discrete or continuous.
◊ Consider the random variable X and the probability of the event X ≤ x. We denote this probability by P[X ≤ x].
◊ To simplify our notation, we write

$F_X(x) = \mathbf{P}[X \le x]$  (5.15)

◊ The function $F_X(x)$ is called the cumulative distribution function (cdf) or simply the distribution function of the random variable X.

◊ The distribution function $F_X(x)$ has the following properties:

1. $0 \le F_X(x) \le 1$
2. $F_X(x_1) \le F_X(x_2)$ if $x_1 < x_2$

Page 14: Comm-05-Random Variables and Processes

5.3 Random Variables

There may be more than one random variable associated with the same random experiment.

Page 15: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ If the distribution function is continuously differentiable, then

$f_X(x) = \frac{d}{dx} F_X(x)$  (5.17)

◊ $f_X(x)$ is called the probability density function (pdf) of the random variable X.

◊ The probability of the event $x_1 < X \le x_2$ equals

$\mathbf{P}[x_1 < X \le x_2] = \mathbf{P}[X \le x_2] - \mathbf{P}[X \le x_1] = F_X(x_2) - F_X(x_1) = \int_{x_1}^{x_2} f_X(\xi)\,d\xi$  (5.19)

and, equivalently, $F_X(x) = \int_{-\infty}^{x} f_X(\xi)\,d\xi$.

◊ A probability density function must always be a nonnegative function, and with a total area of one: $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.

Page 16: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ Example 5.2 Uniform Distribution

$f_X(x) = \begin{cases} 0, & x \le a \\ \frac{1}{b-a}, & a < x \le b \\ 0, & x > b \end{cases}$

$F_X(x) = \begin{cases} 0, & x \le a \\ \frac{x-a}{b-a}, & a < x \le b \\ 1, & x > b \end{cases}$

Page 17: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ Several Random Variables

◊ Consider two random variables X and Y. We define the joint distribution function $F_{X,Y}(x,y)$ as the probability that the random variable X is less than or equal to a specified value x and that the random variable Y is less than or equal to a specified value y.

$F_{X,Y}(x,y) = \mathbf{P}[X \le x, Y \le y]$  (5.23)

◊ Suppose that the joint distribution function $F_{X,Y}(x,y)$ is continuous everywhere, and that the partial derivative

$f_{X,Y}(x,y) = \frac{\partial^2 F_{X,Y}(x,y)}{\partial x\,\partial y}$  (5.24)

exists and is continuous everywhere. We call the function $f_{X,Y}(x,y)$ the joint probability density function of the random variables X and Y.

Page 18: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ Several Random Variables

◊ The joint distribution function $F_{X,Y}(x,y)$ is a monotone-nondecreasing function of both x and y.

◊ $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(\xi,\eta)\,d\xi\,d\eta = 1$

◊ Marginal density $f_X(x)$:

$F_X(x) = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f_{X,Y}(\xi,\eta)\,d\eta\,d\xi \;\rightarrow\; f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,\eta)\,d\eta$  (5.27)

◊ Suppose that X and Y are two continuous random variables with joint probability density function $f_{X,Y}(x,y)$. The conditional probability density function of Y given that X = x is defined by

$f_Y(y|x) = \frac{f_{X,Y}(x,y)}{f_X(x)}$  (5.28)

Page 19: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ Several Random Variables

◊ If the random variables X and Y are statistically independent, then knowledge of the outcome of X can in no way affect the distribution of Y.

$f_Y(y|x) = f_Y(y) \;\xrightarrow{\text{by (5.28)}}\; f_{X,Y}(x,y) = f_X(x)\,f_Y(y)$  (5.32)

$\mathbf{P}[X \in A, Y \in B] = \mathbf{P}[X \in A]\,\mathbf{P}[Y \in B]$  (5.33)

Page 20: Comm-05-Random Variables and Processes

5.3 Random Variables

◊ Example 5.3 Binomial Random Variable

◊ Consider a sequence of coin-tossing experiments where the probability of a head is p, and let $X_n$ be the Bernoulli random variable representing the outcome of the nth toss.

◊ Let Y be the number of heads that occur on N tosses of the coin:

$Y = \sum_{n=1}^{N} X_n$

$\mathbf{P}[Y = y] = \binom{N}{y} p^y (1-p)^{N-y}, \qquad \binom{N}{y} = \frac{N!}{y!\,(N-y)!}$
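The binomial law above is easy to verify by simulation. The following sketch (assumed parameter values, standard library only; not from the original slides) compares the analytic pmf with a Monte-Carlo estimate of Y.

```python
import math
import random

N, p = 10, 0.3   # number of tosses and head probability (assumed values)

# Analytic pmf: P[Y = y] = C(N, y) * p^y * (1 - p)^(N - y).
pmf = [math.comb(N, y) * p**y * (1 - p) ** (N - y) for y in range(N + 1)]

# Monte-Carlo estimate: Y is the sum of N Bernoulli trials.
trials = 100_000
counts = [0] * (N + 1)
for _ in range(trials):
    y = sum(random.random() < p for _ in range(N))
    counts[y] += 1

for y in range(N + 1):
    print(y, round(pmf[y], 4), counts[y] / trials)
```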

Page 21: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ The expected value or mean of a random variable X is defined by

$\mu_X = \mathbf{E}[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$  (5.36)

◊ Function of a Random Variable

◊ Let X denote a random variable, and let g(X) denote a real-valued function defined on the real line. We denote

$Y = g(X)$  (5.37)

◊ To find the expected value of the random variable Y:

$\mathbf{E}[Y] = \int_{-\infty}^{\infty} y f_Y(y)\,dy \;\rightarrow\; \mathbf{E}[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$  (5.38)

Page 22: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Example 5.4 Cosinusoidal Random Variable

◊ Let Y = g(X) = cos(X)

◊ X is a random variable uniformly distributed in the interval (−π, π):

$f_X(x) = \begin{cases} \frac{1}{2\pi}, & -\pi < x < \pi \\ 0, & \text{otherwise} \end{cases}$

$\mathbf{E}[Y] = \int_{-\pi}^{\pi} \cos(x)\,\frac{1}{2\pi}\,dx = \frac{1}{2\pi}\sin x \Big|_{x=-\pi}^{\pi} = 0$

Page 23: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Moments

◊ For the special case of $g(X) = X^n$, we obtain the nth moment of the probability distribution of the random variable X; that is

$\mathbf{E}[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$  (5.39)

◊ Mean-square value of X:

$\mathbf{E}[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx$  (5.40)

◊ The nth central moment is

$\mathbf{E}[(X - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n f_X(x)\,dx$  (5.41)

Page 24: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ For n = 2 the second central moment is referred to as the variance of the random variable X, written as

$\mathrm{var}[X] = \mathbf{E}[(X - \mu_X)^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\,dx$  (5.42)

◊ The variance of a random variable X is commonly denoted as $\sigma_X^2$.

◊ The square root of the variance is called the standard deviation of the random variable X.

◊ Expanding the square:

$\sigma_X^2 = \mathrm{var}[X] = \mathbf{E}[(X - \mu_X)^2] = \mathbf{E}[X^2 - 2\mu_X X + \mu_X^2] = \mathbf{E}[X^2] - 2\mu_X \mathbf{E}[X] + \mu_X^2 = \mathbf{E}[X^2] - \mu_X^2$  (5.44)

Page 25: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Chebyshev inequality

◊ Suppose X is an arbitrary random variable with finite mean $m_x$ and finite variance $\sigma_x^2$. For any positive number δ:

$\mathbf{P}(|X - m_x| \ge \delta) \le \frac{\sigma_x^2}{\delta^2}$

◊ Proof:

$\sigma_x^2 = \int_{-\infty}^{\infty} (x - m_x)^2 p(x)\,dx \ge \int_{|x - m_x| \ge \delta} (x - m_x)^2 p(x)\,dx \ge \delta^2 \int_{|x - m_x| \ge \delta} p(x)\,dx = \delta^2\,\mathbf{P}(|X - m_x| \ge \delta)$
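To see how loose the bound typically is, the sketch below (an assumed test case using an exponential distribution; not from the slides) compares the empirical tail probability with σ_x²/δ².

```python
import random
import statistics

# Empirical check of P(|X - m_x| >= delta) <= sigma_x^2 / delta^2.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
m_x = statistics.mean(samples)
var_x = statistics.pvariance(samples)

for delta in (1.0, 2.0, 3.0):
    tail = sum(abs(x - m_x) >= delta for x in samples) / len(samples)
    print(delta, tail, var_x / delta**2)   # the tail stays below the bound
```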

Page 26: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Chebyshev inequality

◊ Another way to view the Chebyshev bound is to work with the zero-mean random variable $Y = X - m_x$.

◊ Define a function g(Y) as:

$g(Y) = \begin{cases} 1, & |Y| \ge \delta \\ 0, & |Y| < \delta \end{cases}$ and $\mathbf{E}[g(Y)] = \mathbf{P}(|Y| \ge \delta)$

◊ Upper-bound g(Y) by the quadratic $(Y/\delta)^2$, i.e., $g(Y) \le \left(\frac{Y}{\delta}\right)^2$

◊ The tail probability is then

$\mathbf{E}[g(Y)] \le \mathbf{E}\left[\frac{Y^2}{\delta^2}\right] = \frac{\mathbf{E}[Y^2]}{\delta^2} = \frac{\sigma_y^2}{\delta^2} = \frac{\sigma_x^2}{\delta^2}$

Page 27: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Chebyshev inequality

◊ A quadratic upper bound on g(Y) is used in obtaining the tail probability (Chebyshev bound).

◊ For many practical applications, the Chebyshev bound is extremely loose.

Page 28: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ The characteristic function $\psi_X(\upsilon)$ is defined as the expectation of the complex exponential function exp(jυX), as shown by

$\psi_X(\upsilon) = \mathbf{E}[\exp(j\upsilon X)] = \int_{-\infty}^{\infty} f_X(x) \exp(j\upsilon x)\,dx$  (5.45)

◊ In other words, the characteristic function $\psi_X(\upsilon)$ is the Fourier transform of the probability density function $f_X(x)$.

◊ Analogous with the inverse Fourier transform:

$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \psi_X(\upsilon) \exp(-j\upsilon x)\,d\upsilon$  (5.46)

Page 29: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Characteristic functions

◊ The first moment (mean) can be obtained by:

$\mathbf{E}[X] = m_x = -j \left.\frac{d\psi(jv)}{dv}\right|_{v=0}$

◊ Since the differentiation process can be repeated, the nth moment can be calculated by:

$\mathbf{E}[X^n] = (-j)^n \left.\frac{d^n \psi(jv)}{dv^n}\right|_{v=0}$
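These formulas can be illustrated numerically. The sketch below (assuming NumPy is available, with an assumed exponential test distribution) differentiates the empirical characteristic function at v = 0 by finite differences.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200_000)   # true E[X] = 2, E[X^2] = 8

def psi(v):
    """Empirical characteristic function E[exp(jvX)]."""
    return np.mean(np.exp(1j * v * x))

h = 1e-3
# E[X] = -j * psi'(0), via a central difference.
d1 = (psi(h) - psi(-h)) / (2 * h)
print((-1j * d1).real, x.mean())

# E[X^2] = (-j)^2 * psi''(0) = -psi''(0).
d2 = (psi(h) - 2 * psi(0.0) + psi(-h)) / h**2
print((-d2).real, np.mean(x**2))
```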

Page 30: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Characteristic functions

◊ Determining the PDF of a sum of statistically independent random variables:

$Y = \sum_{i=1}^{n} X_i \;\Rightarrow\; \psi_Y(jv) = \mathbf{E}[e^{jvY}] = \mathbf{E}\left[\exp\left(jv \sum_{i=1}^{n} X_i\right)\right] = \mathbf{E}\left[\prod_{i=1}^{n} e^{jvX_i}\right]$

$= \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \left(\prod_{i=1}^{n} e^{jvx_i}\right) p(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2 \cdots dx_n$

◊ Since the random variables are statistically independent,

$p(x_1, x_2, \ldots, x_n) = p(x_1)\,p(x_2)\cdots p(x_n) \;\Rightarrow\; \psi_Y(jv) = \prod_{i=1}^{n} \psi_{X_i}(jv)$

◊ If the $X_i$ are iid (independent and identically distributed),

$\psi_Y(jv) = \left[\psi_X(jv)\right]^n$

Page 31: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Characteristic functions

◊ The PDF of Y is determined from the inverse Fourier transform of $\psi_Y(jv)$.

◊ Since the characteristic function of the sum of n statistically independent random variables is equal to the product of the characteristic functions of the individual random variables, and multiplication in the transform domain corresponds to convolution, the PDF of Y is the n-fold convolution of the PDFs of the $X_i$.

◊ Usually, the n-fold convolution is more difficult to perform than the characteristic function method in determining the PDF of Y.
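The convolution statement can be illustrated directly. In this sketch (assumed Uniform(0,1) pdfs; NumPy), the pdf of the sum of two independent uniform variables is computed as the convolution of their pdfs, giving the expected triangular shape.

```python
import numpy as np

dx = 0.01
x = np.arange(0.0, 1.0, dx)
p1 = np.ones_like(x)   # pdf of Uniform(0, 1)
p2 = np.ones_like(x)

# pdf of Y = X1 + X2 is the convolution of the individual pdfs.
p_sum = np.convolve(p1, p2) * dx

print(p_sum.max())        # peak ~1 at y = 1 (triangular pdf)
print(p_sum.sum() * dx)   # total area ~1
```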

Page 32: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Example 5.5 Gaussian Random Variable

◊ The probability density function of such a Gaussian random variable is defined by:

$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X} \exp\left(-\frac{(x - \mu_X)^2}{2\sigma_X^2}\right), \quad -\infty < x < \infty$

◊ The characteristic function of a Gaussian random variable with mean $m_x$ and variance $\sigma^2$ is (Problem 5.1):

$\psi(jv) = \int_{-\infty}^{\infty} e^{jvx} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x - m_x)^2 / 2\sigma^2}\,dx = e^{jv m_x - v^2 \sigma^2 / 2}$

◊ It can be shown that the central moments of a Gaussian random variable are given by:

$\mathbf{E}[(X - m_x)^k] = \mu_k = \begin{cases} 1 \cdot 3 \cdots (k-1)\,\sigma^k, & k \text{ even} \\ 0, & k \text{ odd} \end{cases}$

Page 33: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Example 5.5 Gaussian Random Variable (cont.)

◊ The sum of n statistically independent Gaussian random variables is also a Gaussian random variable.

◊ Proof:

$Y = \sum_{i=1}^{n} X_i$

$\psi_Y(jv) = \prod_{i=1}^{n} \psi_{X_i}(jv) = \prod_{i=1}^{n} e^{jv m_i - v^2 \sigma_i^2 / 2} = e^{jv m_y - v^2 \sigma_y^2 / 2}$

where $m_y = \sum_{i=1}^{n} m_i$ and $\sigma_y^2 = \sum_{i=1}^{n} \sigma_i^2$.

Therefore, Y is Gaussian-distributed with mean $m_y$ and variance $\sigma_y^2$.
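A simulation sketch of this result (assumed means and variances; NumPy): the sum of independent Gaussians should have mean Σm_i and variance Σσ_i².

```python
import numpy as np

rng = np.random.default_rng(1)
means = [1.0, -2.0, 0.5]
sigmas = [1.0, 0.5, 2.0]

y = sum(rng.normal(m, s, size=500_000) for m, s in zip(means, sigmas))

print(y.mean(), sum(means))                  # ~ -0.5
print(y.var(), sum(s**2 for s in sigmas))    # ~ 5.25
```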

Page 34: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Joint Moments

◊ Consider next a pair of random variables X and Y. A set of statistical averages of importance in this case are the joint moments, namely, the expected value of $X^i Y^k$, where i and k may assume any positive integer values. We may thus write

$\mathbf{E}[X^i Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^i y^k f_{X,Y}(x,y)\,dx\,dy$  (5.51)

◊ A joint moment of particular importance is the correlation defined by E[XY], which corresponds to i = k = 1.

◊ Covariance of X and Y:

$\mathrm{cov}[XY] = \mathbf{E}[(X - \mathbf{E}[X])(Y - \mathbf{E}[Y])] = \mathbf{E}[XY] - \mu_X \mu_Y$  (5.53)

Page 35: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Correlation coefficient of X and Y:

$\rho = \frac{\mathrm{cov}[XY]}{\sigma_X \sigma_Y}$  (5.54)

◊ $\sigma_X$ and $\sigma_Y$ denote the standard deviations of X and Y.

◊ We say X and Y are uncorrelated if and only if cov[XY] = 0.

◊ Note that if X and Y are statistically independent, then they are uncorrelated.

◊ The converse of the above statement is not necessarily true.

◊ We say X and Y are orthogonal if and only if E[XY] = 0.

Page 36: Comm-05-Random Variables and Processes

5.4 Statistical Averages

◊ Example 5.6 Moments of a Bernoulli Random Variable

◊ Consider the coin-tossing experiment where the probability of a head is p. Let X be a random variable that takes the value 0 if the result is a tail and 1 if it is a head. We say that X is a Bernoulli random variable.

$\mathbf{P}(X = x) = \begin{cases} 1 - p, & x = 0 \\ p, & x = 1 \\ 0, & \text{otherwise} \end{cases}$

$\mathbf{E}[X] = \sum_{k=0}^{1} k\,\mathbf{P}[X = k] = 0 \cdot (1-p) + 1 \cdot p = p$

$\sigma_X^2 = \sum_{k=0}^{1} (k - \mu_X)^2\,\mathbf{P}[X = k] = (0-p)^2 (1-p) + (1-p)^2 p = p(1-p)$

$\mathbf{E}[X_j X_k] = \begin{cases} \mathbf{E}[X_j]\,\mathbf{E}[X_k] = p^2, & j \ne k \\ \mathbf{E}[X_k^2] = p, & j = k \end{cases}$

where $\mathbf{E}[X_k^2] = \sum_{k=0}^{1} k^2\,\mathbf{P}[X = k] = p$.
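These moments can be checked with a short Monte-Carlo sketch (assumed p; standard library only; not from the slides):

```python
import random

random.seed(0)
p, n = 0.3, 200_000
xs = [1 if random.random() < p else 0 for _ in range(n)]
ys = [1 if random.random() < p else 0 for _ in range(n)]  # independent tosses

mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
cross = sum(x * y for x, y in zip(xs, ys)) / n

print(mean, p)            # E[X] ~ p
print(var, p * (1 - p))   # var ~ p(1 - p)
print(cross, p**2)        # E[X_j X_k] ~ p^2 for j != k
```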

Page 37: Comm-05-Random Variables and Processes

5.5 Random Processes

An ensemble of sample functions.

For a fixed time instant $t_k$, $\{x_1(t_k), x_2(t_k), \ldots, x_n(t_k)\} = \{X(t_k, s_1), X(t_k, s_2), \ldots, X(t_k, s_n)\}$ constitutes a random variable.

Page 38: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ At any given time instant, the value of a stochastic process is a random variable indexed by the parameter t. We denote such a process by X(t).

◊ In general, the parameter t is continuous, whereas X may be either continuous or discrete, depending on the characteristics of the source that generates the stochastic process.

◊ The noise voltage generated by a single resistor or a single information source represents a single realization of the stochastic process. It is called a sample function.

Page 39: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ The set of all possible sample functions constitutes an ensemble of sample functions or, equivalently, the stochastic process X(t).

◊ In general, the number of sample functions in the ensemble is assumed to be extremely large; often it is infinite.

◊ Having defined a stochastic process X(t) as an ensemble of sample functions, we may consider the values of the process at any set of time instants $t_1 > t_2 > t_3 > \cdots > t_n$, where n is any positive integer.

◊ In general, the random variables $X_{t_i} \equiv X(t_i)$, $i = 1, 2, \ldots, n$, are characterized statistically by their joint PDF $p(x_{t_1}, x_{t_2}, \ldots, x_{t_n})$.

Page 40: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ Stationary stochastic processes

◊ Consider another set of random variables $X_{t_i + t} \equiv X(t_i + t)$, $i = 1, 2, \ldots, n$, where t is an arbitrary time shift. These random variables are characterized by the joint PDF $p(x_{t_1+t}, x_{t_2+t}, \ldots, x_{t_n+t})$.

◊ The joint PDFs of the random variables $X_{t_i}$ and $X_{t_i+t}$, $i = 1, 2, \ldots, n$, may or may not be identical. When they are identical, i.e., when

$p(x_{t_1}, x_{t_2}, \ldots, x_{t_n}) = p(x_{t_1+t}, x_{t_2+t}, \ldots, x_{t_n+t})$

for all t and all n, the process is said to be stationary in the strict sense (SSS).

◊ When the joint PDFs are different, the stochastic process is non-stationary.

Page 41: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ Averages for a stochastic process are called ensemble averages.

◊ The nth moment of the random variable $X_{t_i}$ is defined as:

$\mathbf{E}[X_{t_i}^n] = \int_{-\infty}^{\infty} x_{t_i}^n\, p(x_{t_i})\,dx_{t_i}$

◊ In general, the value of the nth moment will depend on the time instant $t_i$ if the PDF of $X_{t_i}$ depends on $t_i$.

◊ When the process is stationary, $p(x_{t_i+t}) = p(x_{t_i})$ for all t. Therefore, the PDF is independent of time, and, as a consequence, the nth moment is independent of time.

Page 42: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ Two random variables: $X_{t_i} \equiv X(t_i)$, $i = 1, 2$.

◊ The correlation is measured by the joint moment:

$\mathbf{E}[X_{t_1} X_{t_2}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_{t_1} x_{t_2}\, p(x_{t_1}, x_{t_2})\,dx_{t_1}\,dx_{t_2}$

◊ Since this joint moment depends on the time instants $t_1$ and $t_2$, it is denoted by $R_X(t_1, t_2)$.

◊ $R_X(t_1, t_2)$ is called the autocorrelation function of the stochastic process.

◊ For a stationary stochastic process, the joint moment is:

$\mathbf{E}[X_{t_1} X_{t_2}] = R_X(t_1, t_2) = R_X(t_1 - t_2) = R_X(\tau)$

Moreover, with $t_1' = t_1 + \tau$,

$R_X(-\tau) = \mathbf{E}[X_{t_1} X_{t_1+\tau}] = \mathbf{E}[X_{t_1'-\tau} X_{t_1'}] = R_X(\tau)$

so the autocorrelation is an even function of τ.

◊ Average power in the process X(t): $R_X(0) = \mathbf{E}(X_t^2)$.

Page 43: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ Wide-sense stationary (WSS)

◊ A wide-sense stationary process has the property that the mean value of the process is independent of time (a constant) and that the autocorrelation function satisfies the condition $R_X(t_1, t_2) = R_X(t_1 - t_2)$.

◊ Wide-sense stationarity is a less stringent condition than strict-sense stationarity.

Page 44: Comm-05-Random Variables and Processes

5.5 Random Processes

◊ Auto-covariance function

◊ The auto-covariance function of a stochastic process is defined as:

$\mu(t_1, t_2) = \mathbf{E}\{[X_{t_1} - m(t_1)][X_{t_2} - m(t_2)]\} = R_X(t_1, t_2) - m(t_1)\,m(t_2)$

◊ When the process is stationary, the auto-covariance function simplifies to:

$\mu(t_1, t_2) = \mu(t_1 - t_2) = \mu(\tau) = R_X(\tau) - m^2$

◊ For a Gaussian random process, higher-order moments can be expressed in terms of first and second moments. Consequently, a Gaussian random process is completely characterized by its first two moments.

Page 45: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Consider a random process X(t). We define the mean of the process X(t) as the expectation of the random variable obtained by observing the process at some time t, as shown by

$\mu_X(t) = \mathbf{E}[X(t)] = \int_{-\infty}^{\infty} x f_{X(t)}(x)\,dx$  (5.57)

◊ A random process is said to be stationary to first order if the distribution function (and therefore the density function) of X(t) does not vary with time.

$f_{X(t_1)}(x) = f_{X(t_2)}(x)$ for all $t_1$ and $t_2$ $\;\rightarrow\; \mu_X(t_1) = \mu_X(t_2)$ for all $t_1$ and $t_2$  (5.59)

◊ The mean of the random process is a constant.

◊ The variance of such a process is also constant.

Page 46: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ We define the autocorrelation function of the process X(t) as the expectation of the product of two random variables $X(t_1)$ and $X(t_2)$.

$R_X(t_1, t_2) = \mathbf{E}[X(t_1) X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2 f_{X(t_1),X(t_2)}(x_1, x_2)\,dx_1\,dx_2$  (5.60)

◊ We say a random process X(t) is stationary to second order if the joint distribution $f_{X(t_1),X(t_2)}(x_1, x_2)$ depends only on the difference between the observation times $t_1$ and $t_2$.

$R_X(t_1, t_2) = R_X(t_2 - t_1)$ for all $t_1$ and $t_2$  (5.61)

◊ The autocovariance function of a stationary random process X(t) is written as

$C_X(t_1, t_2) = \mathbf{E}[(X(t_1) - \mu_X)(X(t_2) - \mu_X)] = R_X(t_2 - t_1) - \mu_X^2$  (5.62)

Page 47: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ For convenience of notation, we redefine the autocorrelation function of a stationary process X(t) as

$R_X(\tau) = \mathbf{E}[X(t+\tau)\,X(t)]$ for all t  (5.63)

◊ This autocorrelation function has several important properties:

1. $R_X(0) = \mathbf{E}[X^2(t)]$  (5.64)
2. $R_X(\tau) = R_X(-\tau)$  (5.65)
3. $|R_X(\tau)| \le R_X(0)$  (5.67)

◊ Proof of (5.64) can be obtained from (5.63) by putting τ = 0.

Page 48: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Proof of (5.65):

$R_X(\tau) = \mathbf{E}[X(t+\tau)\,X(t)] = \mathbf{E}[X(t)\,X(t+\tau)] = R_X(-\tau)$

◊ Proof of (5.67):

$\mathbf{E}\left[\left(X(t+\tau) \pm X(t)\right)^2\right] \ge 0$

$\rightarrow \mathbf{E}[X^2(t+\tau)] \pm 2\,\mathbf{E}[X(t+\tau)\,X(t)] + \mathbf{E}[X^2(t)] \ge 0$

$\rightarrow 2 R_X(0) \pm 2 R_X(\tau) \ge 0$

$\rightarrow -R_X(0) \le R_X(\tau) \le R_X(0)$

$\rightarrow |R_X(\tau)| \le R_X(0)$

Page 49: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ The physical significance of the autocorrelation function $R_X(\tau)$ is that it provides a means of describing the "interdependence" of two random variables obtained by observing a random process X(t) at times τ seconds apart.

Page 50: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Example 5.7 Sinusoidal Signal with Random Phase

◊ Consider a sinusoidal signal with random phase:

$X(t) = A\cos(2\pi f_c t + \Theta)$, where $f_\Theta(\theta) = \begin{cases} \frac{1}{2\pi}, & -\pi \le \theta \le \pi \\ 0, & \text{elsewhere} \end{cases}$

$R_X(\tau) = \mathbf{E}[X(t+\tau)\,X(t)]$

$= \frac{A^2}{2}\,\mathbf{E}\left[\cos(4\pi f_c t + 2\pi f_c \tau + 2\Theta)\right] + \frac{A^2}{2}\cos(2\pi f_c \tau)$

$= \frac{A^2}{2} \int_{-\pi}^{\pi} \frac{1}{2\pi} \cos(4\pi f_c t + 2\pi f_c \tau + 2\theta)\,d\theta + \frac{A^2}{2}\cos(2\pi f_c \tau)$

$= \frac{A^2}{2}\cos(2\pi f_c \tau)$
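The ensemble average in Example 5.7 can be estimated numerically. This sketch (assumed values of A, f_c, t, and τ; NumPy) draws many realizations of Θ and compares the sample mean of X(t+τ)X(t) with (A²/2)cos(2πf_cτ); as the derivation shows, the result does not depend on the chosen t.

```python
import numpy as np

rng = np.random.default_rng(2)
A, fc = 2.0, 5.0
t, tau = 0.3, 0.07   # arbitrary observation time and lag
theta = rng.uniform(-np.pi, np.pi, size=500_000)

x_t = A * np.cos(2 * np.pi * fc * t + theta)
x_t_tau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)

print(np.mean(x_t * x_t_tau))                   # ensemble estimate
print(A**2 / 2 * np.cos(2 * np.pi * fc * tau))  # (A^2/2) cos(2 pi fc tau)
```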

Page 51: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Averages for joint stochastic processes

◊ Let X(t) and Y(t) denote two stochastic processes and let $X_{t_i} \equiv X(t_i)$, $i = 1, 2, \ldots, n$, and $Y_{t_j'} \equiv Y(t_j')$, $j = 1, 2, \ldots, m$, represent the random variables at times $t_1 > t_2 > t_3 > \cdots > t_n$ and $t_1' > t_2' > t_3' > \cdots > t_m'$, respectively. The two processes are characterized statistically by their joint PDF:

$p(x_{t_1}, x_{t_2}, \ldots, x_{t_n}, y_{t_1'}, y_{t_2'}, \ldots, y_{t_m'})$

◊ The cross-correlation function of X(t) and Y(t), denoted by $R_{xy}(t_1, t_2)$, is defined as the joint moment:

$R_{xy}(t_1, t_2) = \mathbf{E}[X_{t_1} Y_{t_2}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_{t_1} y_{t_2}\, p(x_{t_1}, y_{t_2})\,dx_{t_1}\,dy_{t_2}$

◊ The cross-covariance is:

$\mu_{xy}(t_1, t_2) = R_{xy}(t_1, t_2) - m_x(t_1)\,m_y(t_2)$

Page 52: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Averages for joint stochastic processes

◊ When the processes are jointly and individually stationary, we have $R_{xy}(t_1, t_2) = R_{xy}(t_1 - t_2)$ and $\mu_{xy}(t_1, t_2) = \mu_{xy}(t_1 - t_2)$:

$R_{xy}(-\tau) = \mathbf{E}[X_{t_1} Y_{t_1+\tau}] = \mathbf{E}[X_{t_1'-\tau} Y_{t_1'}] = R_{yx}(\tau)$

◊ The stochastic processes X(t) and Y(t) are said to be statistically independent if and only if

$p(x_{t_1}, x_{t_2}, \ldots, x_{t_n}, y_{t_1'}, y_{t_2'}, \ldots, y_{t_m'}) = p(x_{t_1}, x_{t_2}, \ldots, x_{t_n})\,p(y_{t_1'}, y_{t_2'}, \ldots, y_{t_m'})$

for all choices of $t_i$ and $t_i'$ and for all positive integers n and m.

◊ The processes are said to be uncorrelated if

$R_{xy}(t_1, t_2) = \mathbf{E}(X_{t_1})\,\mathbf{E}(Y_{t_2}) \;\Rightarrow\; \mu_{xy}(t_1, t_2) = 0$

Page 53: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Example 5.9 Quadrature-Modulated Processes

◊ Consider a pair of quadrature-modulated processes $X_1(t)$ and $X_2(t)$:

$X_1(t) = X(t)\cos(2\pi f_c t + \Theta)$
$X_2(t) = X(t)\sin(2\pi f_c t + \Theta)$

$R_{12}(\tau) = \mathbf{E}[X_1(t)\,X_2(t-\tau)]$
$= \mathbf{E}[X(t)\,X(t-\tau)]\,\mathbf{E}[\cos(2\pi f_c t + \Theta)\sin(2\pi f_c t - 2\pi f_c \tau + \Theta)]$
$= \frac{1}{2} R_X(\tau)\,\mathbf{E}[\sin(4\pi f_c t - 2\pi f_c \tau + 2\Theta) - \sin(2\pi f_c \tau)]$
$= -\frac{1}{2} R_X(\tau) \sin(2\pi f_c \tau)$

At τ = 0: $R_{12}(0) = \mathbf{E}[X_1(t)\,X_2(t)] = 0$

Page 54: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Ergodic Processes

◊ In many instances, it is difficult or impossible to observe all sample functions of a random process at a given time.

◊ It is often more convenient to observe a single sample function for a long period of time.

◊ For a sample function x(t), the time average of the mean value over an observation period 2T is

$\mu_{x,T} = \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$  (5.84)

◊ For many stochastic processes of interest in communications, the time averages and ensemble averages are equal, a property known as ergodicity.

◊ This property implies that whenever an ensemble average is required, we may estimate it by using a time average.

Page 55: Comm-05-Random Variables and Processes

5.6 Mean, Correlation and Covariance Functions

◊ Cyclostationary Processes (in the wide sense)

◊ There is another important class of random processes commonly encountered in practice, the mean and autocorrelation function of which exhibit periodicity:

$\mu_X(t_1 + T) = \mu_X(t_1)$
$R_X(t_1 + T, t_2 + T) = R_X(t_1, t_2)$

for all $t_1$ and $t_2$.

◊ Modeling the process X(t) as cyclostationary adds a new dimension, namely, the period T, to the partial description of the process.

Page 56: Comm-05-Random Variables and Processes

5.7 Transmission of a Random Process Through a Linear Filter

◊ Suppose that a random process X(t) is applied as input to a linear time-invariant filter of impulse response h(t), producing a new random process Y(t) at the filter output.

◊ Assume that X(t) is a wide-sense stationary random process.

◊ The mean of the output random process Y(t) is given by

$\mu_Y(t) = \mathbf{E}[Y(t)] = \mathbf{E}\left[\int_{-\infty}^{\infty} h(\tau_1)\,X(t - \tau_1)\,d\tau_1\right]$
$= \int_{-\infty}^{\infty} h(\tau_1)\,\mathbf{E}[X(t - \tau_1)]\,d\tau_1$
$= \int_{-\infty}^{\infty} h(\tau_1)\,\mu_X(t - \tau_1)\,d\tau_1$  (5.86)

Page 57: Comm-05-Random Variables and Processes

5.7 Transmission of a Random Process Through a Linear Filter

◊ When the input random process X(t) is wide-sense stationary, the mean $\mu_X(t)$ is a constant $\mu_X$; the output mean $\mu_Y(t)$ is then also a constant $\mu_Y$:

$\mu_Y = \mu_X \int_{-\infty}^{\infty} h(\tau_1)\,d\tau_1 = \mu_X H(0)$  (5.87)

where H(0) is the zero-frequency (dc) response of the system.

◊ The autocorrelation function of the output random process Y(t) is given by:

$R_Y(t, u) = \mathbf{E}[Y(t)\,Y(u)] = \mathbf{E}\left[\int_{-\infty}^{\infty} h(\tau_1)\,X(t - \tau_1)\,d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\,X(u - \tau_2)\,d\tau_2\right]$
$= \int_{-\infty}^{\infty} d\tau_1\, h(\tau_1) \int_{-\infty}^{\infty} d\tau_2\, h(\tau_2)\,\mathbf{E}[X(t - \tau_1)\,X(u - \tau_2)]$
$= \int_{-\infty}^{\infty} d\tau_1\, h(\tau_1) \int_{-\infty}^{\infty} d\tau_2\, h(\tau_2)\, R_X(t - \tau_1, u - \tau_2)$

Page 58: Comm-05-Random Variables and Processes

5.7 Transmission of a Random Process Through a Linear Filter

◊ When the input X(t) is a wide-sense stationary random process, the autocorrelation function of X(t) is only a function of the difference between the observation times:

$R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2$  (5.90)

◊ If the input to a stable linear time-invariant filter is a wide-sense stationary random process, then the output of the filter is also a wide-sense stationary random process.

Page 59: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ The Fourier transform of the autocorrelation function $R_X(\tau)$ is called the power spectral density $S_X(f)$ of the random process X(t).

$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\,\exp(-j 2\pi f \tau)\,d\tau$  (5.91)

$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\,\exp(j 2\pi f \tau)\,df$  (5.92)

◊ Equations (5.91) and (5.92) are basic relations in the theory of spectral analysis of random processes, and together they constitute what are usually called the Einstein-Wiener-Khintchine relations.

Page 60: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Properties of the Power Spectral Density

◊ Property 1: $S_X(0) = \int_{-\infty}^{\infty} R_X(\tau)\,d\tau$  (5.93)

◊ Proof: Let f = 0 in Eq. (5.91).

◊ Property 2: $\mathbf{E}[X^2(t)] = \int_{-\infty}^{\infty} S_X(f)\,df$  (5.94)

◊ Proof: Let τ = 0 in Eq. (5.92) and note that $R_X(0) = \mathbf{E}[X^2(t)]$.

◊ Property 3: $S_X(f) \ge 0$ for all f  (5.95)

◊ Property 4: $S_X(-f) = S_X(f)$  (5.96)

◊ Proof: From (5.91), substituting τ → −τ and using $R_X(-\tau) = R_X(\tau)$:

$S_X(-f) = \int_{-\infty}^{\infty} R_X(\tau)\,\exp(j 2\pi f \tau)\,d\tau = \int_{-\infty}^{\infty} R_X(\tau)\,\exp(-j 2\pi f \tau)\,d\tau = S_X(f)$

Page 61: Comm-05-Random Variables and Processes

Proof of Eq. (5.95)

◊ It can be shown that (see Eq. 5.106) $S_Y(f) = S_X(f)\,|H(f)|^2$.

$R_Y(\tau) = \int_{-\infty}^{\infty} S_Y(f)\,\exp(j 2\pi f \tau)\,df = \int_{-\infty}^{\infty} S_X(f)\,|H(f)|^2 \exp(j 2\pi f \tau)\,df$

$R_Y(0) = \mathbf{E}[Y^2(t)] = \int_{-\infty}^{\infty} S_X(f)\,|H(f)|^2\,df \ge 0$ for any H(f)

◊ Suppose we let $|H(f)|^2 = 1$ for an arbitrarily small interval $f_1 \le f \le f_2$, and H(f) = 0 outside this interval. Then we have:

$\int_{f_1}^{f_2} S_X(f)\,df \ge 0$

This is possible if and only if $S_X(f) \ge 0$ for all f.

◊ Conclusion: $S_X(f) \ge 0$ for all f.

Page 62: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Example 5.10 Sinusoidal Signal with Random Phase

◊ Consider the random process $X(t) = A\cos(2\pi f_c t + \Theta)$, where Θ is a uniformly distributed random variable over the interval (−π, π).

◊ The autocorrelation function of this random process is given in Example 5.7:

$R_X(\tau) = \frac{A^2}{2}\cos(2\pi f_c \tau)$  (5.74)

◊ Taking the Fourier transform of both sides of this relation:

$S_X(f) = \frac{A^2}{4}\left[\delta(f - f_c) + \delta(f + f_c)\right]$  (5.97)

Page 63: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Example 5.12 Mixing of a Random Process with a Sinusoidal Process

◊ A situation that often arises in practice is that of mixing (i.e., multiplication) of a WSS random process X(t) with a sinusoidal signal cos(2πf_c t + Θ), where the phase Θ is a random variable that is uniformly distributed over the interval (0, 2π).

◊ We wish to determine the power spectral density of the random process Y(t) defined by:

$Y(t) = X(t)\cos(2\pi f_c t + \Theta)$  (5.101)

◊ We note that the random variable Θ is independent of X(t).

Page 64: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Example 5.12 Mixing of a Random Process with a Sinusoidal Process (continued)

◊ The autocorrelation function of Y(t) is given by:

$R_Y(\tau) = \mathbf{E}[Y(t+\tau)\,Y(t)]$
$= \mathbf{E}[X(t+\tau)\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\,X(t)\cos(2\pi f_c t + \Theta)]$
$= \mathbf{E}[X(t+\tau)\,X(t)]\,\mathbf{E}[\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\cos(2\pi f_c t + \Theta)]$
$= \frac{1}{2} R_X(\tau)\,\mathbf{E}[\cos(2\pi f_c \tau) + \cos(4\pi f_c t + 2\pi f_c \tau + 2\Theta)]$
$= \frac{1}{2} R_X(\tau)\cos(2\pi f_c \tau)$

◊ Taking the Fourier transform:

$S_Y(f) = \frac{1}{4}\left[S_X(f - f_c) + S_X(f + f_c)\right]$  (5.103)

Page 65: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Relation among the Power Spectral Densities of the Input and Output Random Processes

◊ Let $S_Y(f)$ denote the power spectral density of the output random process Y(t) obtained by passing the random process through a linear filter of transfer function H(f).

$S_Y(f) = \int_{-\infty}^{\infty} R_Y(\tau)\,e^{-j 2\pi f \tau}\,d\tau$

Using (5.90), $R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2$:

$S_Y(f) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,e^{-j 2\pi f \tau}\,d\tau_1\,d\tau_2\,d\tau$

Let $\tau_0 = \tau - \tau_1 + \tau_2$:

$S_Y(f) = \int_{-\infty}^{\infty} h(\tau_1)\,e^{-j 2\pi f \tau_1}\,d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\,e^{j 2\pi f \tau_2}\,d\tau_2 \int_{-\infty}^{\infty} R_X(\tau_0)\,e^{-j 2\pi f \tau_0}\,d\tau_0$

$= H(f)\,H^*(f)\,S_X(f) = |H(f)|^2\,S_X(f)$  (5.106)
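Eq. (5.106) can be checked numerically. The sketch below (assuming NumPy and SciPy are available; the FIR filter is an arbitrary choice) passes white noise through a filter and compares the ratio of output to input PSD estimates against |H(f)|².

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 1.0
x = rng.normal(size=2**18)           # white input noise
h = signal.firwin(64, cutoff=0.2)    # arbitrary low-pass FIR filter
y = signal.lfilter(h, 1.0, x)

f, S_x = signal.welch(x, fs=fs, nperseg=4096)
_, S_y = signal.welch(y, fs=fs, nperseg=4096)
_, H = signal.freqz(h, worN=f, fs=fs)

# Compare S_y / S_x with |H(f)|^2 where the response is not negligible.
mask = np.abs(H) ** 2 > 1e-2
print(np.max(np.abs(S_y[mask] / S_x[mask] - np.abs(H[mask]) ** 2)))  # small
```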

Page 66: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Example 5.13 Comb Filter

◊ Consider the filter of Figure (a) consisting of a delay line and a summing device. We wish to evaluate the power spectral density of the filter output Y(t).

Page 67: Comm-05-Random Variables and Processes

5.8 Power Spectral Density

◊ Example 5.13 Comb Filter (continued)

◊ The transfer function of this filter is

$H(f) = 1 - \exp(-j 2\pi f T) = 1 - \cos(2\pi f T) + j\sin(2\pi f T)$

$|H(f)|^2 = \left[1 - \cos(2\pi f T)\right]^2 + \sin^2(2\pi f T) = 2\left[1 - \cos(2\pi f T)\right] = 4\sin^2(\pi f T)$

◊ Because of the periodic form of this frequency response (Fig. (b)), the filter is sometimes referred to as a comb filter.

◊ The power spectral density of the filter output is:

$S_Y(f) = |H(f)|^2 S_X(f) = 4\sin^2(\pi f T)\,S_X(f)$

◊ If fT is very small:

$S_Y(f) \cong 4\pi^2 f^2 T^2 S_X(f)$  (5.107)

Page 68: Comm-05-Random Variables and Processes

5.9 Gaussian Process

◊ A random variable Y is defined by:

$Y = \int_{0}^{T} g(t)\,X(t)\,dt$

◊ We refer to Y as a linear functional of X(t). (By contrast, $Y = \sum_{i=1}^{N} a_i X_i$, where the $a_i$ are constants and the $X_i$ are random variables, is a linear function of the $X_i$.)

◊ If the weighting function g(t) is such that the mean-square value of the random variable Y is finite, and if the random variable Y is a Gaussian-distributed random variable for every g(t) in this class of functions, then the process X(t) is said to be a Gaussian process.

◊ In other words, the process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable.

◊ The Gaussian process has many properties that make analytic results possible.

◊ The random processes produced by physical phenomena are often such that a Gaussian model is appropriate.

Page 69: Comm-05-Random Variables and Processes

5.9 Gaussian Process

◊ The random variable Y has a Gaussian distribution if its probability density function has the form

$f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma_Y} \exp\left(-\frac{(y - \mu_Y)^2}{2\sigma_Y^2}\right)$

where $\mu_Y$ is the mean and $\sigma_Y^2$ is the variance of the random variable Y.

◊ If the Gaussian random variable Y is normalized to have a mean of zero and a variance of one, such a normalized Gaussian distribution is commonly written as N(0, 1):

$f_Y(y) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{y^2}{2}\right)$

Page 70: Comm-05-Random Variables and Processes

5.9 Gaussian Process

◊ Central Limit Theorem

◊ Let $X_i$, $i = 1, 2, \ldots, N$, be a set of random variables that satisfies the following requirements:

◊ The $X_i$ are statistically independent.

◊ The $X_i$ have the same probability distribution with mean $\mu_X$ and variance $\sigma_X^2$.

◊ The $X_i$ so described are said to constitute a set of independent and identically distributed (i.i.d.) random variables.

◊ Define:

$Y_i = \frac{X_i - \mu_X}{\sigma_X}, \quad i = 1, 2, \ldots, N$, so that $\mathbf{E}[Y_i] = 0$ and $\mathrm{var}[Y_i] = 1$

$V_N = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} Y_i$

◊ The central limit theorem states that the probability distribution of $V_N$ approaches a normalized Gaussian distribution N(0, 1) in the limit as N approaches infinity.
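A simulation sketch of the theorem (assumed Uniform(0,1) summands and an assumed N; NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
N, trials = 50, 200_000

x = rng.uniform(0.0, 1.0, size=(trials, N))
mu_x, sigma_x = 0.5, np.sqrt(1.0 / 12.0)   # mean and std of Uniform(0, 1)

y = (x - mu_x) / sigma_x          # normalized: zero mean, unit variance
v = y.sum(axis=1) / np.sqrt(N)    # V_N = (1/sqrt(N)) * sum(Y_i)

print(v.mean(), v.std())          # ~ 0 and ~ 1
print(np.mean(np.abs(v) < 1.96))  # ~ 0.95, as for N(0, 1)
```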

Page 71: Comm-05-Random Variables and Processes

5.9 Gaussian Process

◊ Property 1: If a Gaussian process X(t) is applied to a stable linear filter, then the output Y(t) is also Gaussian.

◊ Property 2: Consider the set of random variables or samples $X(t_1), X(t_2), \ldots, X(t_n)$, obtained by observing a random process X(t) at times $t_1, t_2, \ldots, t_n$. If the process X(t) is Gaussian, then this set of random variables is jointly Gaussian for any n, with their n-fold joint probability density function being completely determined by specifying the set of means

$\mu_X(t_i) = \mathbf{E}[X(t_i)], \quad i = 1, 2, \ldots, n$

and the set of autocovariance functions

$C_X(t_k, t_i) = \mathbf{E}\left[\left(X(t_k) - \mu_X(t_k)\right)\left(X(t_i) - \mu_X(t_i)\right)\right], \quad k, i = 1, 2, \ldots, n$

◊ Consider the composite set of random variables $X(t_1), X(t_2), \ldots, X(t_n), Y(u_1), Y(u_2), \ldots, Y(u_m)$. We say that the processes X(t) and Y(t) are jointly Gaussian if this composite set of random variables is jointly Gaussian for any n and m.

Page 72: Comm-05-Random Variables and Processes

5.9 Gaussian Process

◊ Property 3: If a Gaussian process is wide-sense stationary, then the process is also stationary in the strict sense.

◊ Property 4: If the random variables $X(t_1), X(t_2), \ldots, X(t_n)$ are uncorrelated, that is,

$\mathbf{E}\left[\left(X(t_k) - \mu_{X(t_k)}\right)\left(X(t_i) - \mu_{X(t_i)}\right)\right] = 0, \quad i \ne k$

then these random variables are statistically independent.

◊ The implication of this property is that the joint probability density function of the set of random variables $X(t_1), X(t_2), \ldots, X(t_n)$ can be expressed as the product of the probability density functions of the individual random variables in the set.

Page 73: Comm-05-Random Variables and Processes

5.10 Noise

◊ The sources of noise may be external to the system (e.g., atmospheric noise, galactic noise, man-made noise) or internal to the system.

◊ The second category includes an important type of noise that arises from spontaneous fluctuations of current or voltage in electrical circuits. This type of noise represents a basic limitation on the transmission or detection of signals in communication systems involving the use of electronic devices.

◊ The two most common examples of spontaneous fluctuations in electrical circuits are shot noise and thermal noise.

Page 74: Comm-05-Random Variables and Processes

5.10 Noise

◊ Shot Noise

◊ Shot noise arises in electronic devices such as diodes and transistors because of the discrete nature of current flow in these devices.

◊ For example, in a photodetector circuit a current pulse is generated every time an electron is emitted by the cathode due to incident light from a source of constant intensity. The electrons are naturally emitted at random times denoted by $\tau_k$.

◊ If the random emissions of electrons have been going on for a long time, then the total current flowing through the photodetector may be modeled as an infinite sum of current pulses, as shown by

$X(t) = \sum_{k=-\infty}^{\infty} h(t - \tau_k)$

where $h(t - \tau_k)$ is the current pulse generated at time $\tau_k$.

◊ The process X(t) is a stationary process, called shot noise.

Page 75: Comm-05-Random Variables and Processes

5.10 Noise

◊ Shot Noise

◊ The number of electrons, N(t), emitted in the time interval (0, t) constitutes a discrete stochastic process, the value of which increases by one each time an electron is emitted (Fig. 5.17).

◊ Let the mean value of the number of electrons, ν, emitted between times t and t + t₀ be

$\mathbf{E}[\nu] = \lambda t_0$

where λ is a constant called the rate of the process.

◊ The total number of electrons emitted in the interval (t, t + t₀),

$\nu = N(t + t_0) - N(t)$

follows a Poisson distribution with a mean value equal to $\lambda t_0$.

◊ The probability that k electrons are emitted in the interval (t, t + t₀) is

$\mathbf{P}[\nu = k] = \frac{(\lambda t_0)^k}{k!}\,e^{-\lambda t_0}, \quad k = 0, 1, \ldots$

Fig. 5.17 Sample function of a Poisson counting process.

Page 76: Comm-05-Random Variables and Processes

5.10 Noise

◊ Thermal Noise

◊ Thermal noise is the name given to the electrical noise arising from the random motion of electrons in a conductor.

◊ The mean-square value of the thermal noise voltage $V_{TN}$, appearing across the terminals of a resistor, measured in a bandwidth of Δf Hertz, is given by:

$\mathbf{E}[V_{TN}^2] = 4kTR\,\Delta f \ \text{volts}^2$

where k is Boltzmann's constant = 1.38 × 10⁻²³ joules per degree Kelvin, T is the absolute temperature in degrees Kelvin, and R is the resistance in ohms.

Page 77: Comm-05-Random Variables and Processes

5.10 Noise

◊ White Noise

◊ The noise analysis is customarily based on an idealized form of noise called white noise, the power spectral density of which is independent of the operating frequency.

◊ "White" is used in the sense that white light contains equal amounts of all frequencies within the visible band of electromagnetic radiation.

◊ We express the power spectral density of white noise, with a sample function denoted by w(t), as

$S_W(f) = \frac{N_0}{2}, \qquad N_0 = kT_e$

The dimensions of $N_0$ are in watts per Hertz, k is Boltzmann's constant, and $T_e$ is the equivalent noise temperature of the receiver.

Page 78: Comm-05-Random Variables and Processes

5.10 Noise

◊ White Noise

◊ The equivalent noise temperature of a system is defined as the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system.

◊ The autocorrelation function is the inverse Fourier transform of the power spectral density:

$R_W(\tau) = \frac{N_0}{2}\,\delta(\tau)$

◊ Any two different samples of white noise, no matter how closely together in time they are taken, are uncorrelated.

◊ If the white noise w(t) is also Gaussian, then the two samples are statistically independent.

Page 79: Comm-05-Random Variables and Processes

5.10 Noise

◊ Example 5.14 Ideal Low-Pass Filtered White Noise

◊ Suppose that a white Gaussian noise w(t) of zero mean and power spectral density $N_0/2$ is applied to an ideal low-pass filter of bandwidth B and passband amplitude response of one.

◊ The power spectral density of the noise n(t) is

$S_N(f) = \begin{cases} \frac{N_0}{2}, & -B < f < B \\ 0, & |f| > B \end{cases}$

◊ The autocorrelation function of n(t) is

$R_N(\tau) = \int_{-B}^{B} \frac{N_0}{2}\,\exp(j 2\pi f \tau)\,df = N_0 B\,\mathrm{sinc}(2B\tau)$
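The integral in Example 5.14 can be evaluated numerically and compared with the closed form N₀B sinc(2Bτ). A sketch (assumed N₀ and B; NumPy, whose np.sinc(x) = sin(πx)/(πx) matches the sinc convention used here):

```python
import numpy as np

N0, B = 2.0, 10.0
f = np.linspace(-B, B, 4001)
df = f[1] - f[0]
tau = np.linspace(-0.5, 0.5, 11)

# R_N(tau) = integral of (N0/2) exp(j 2 pi f tau) over -B < f < B.
R = [np.sum((N0 / 2) * np.exp(1j * 2 * np.pi * f * t)).real * df for t in tau]
R_closed = N0 * B * np.sinc(2 * B * tau)

print(np.allclose(R, R_closed, atol=1e-2))
```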

Page 80: Comm-05-Random Variables and Processes

5.11 Narrowband Noise

◊ The receiver of a communication system usually includes some provision for preprocessing the received signal.

◊ The preprocessing may take the form of a narrowband filter whose bandwidth is just large enough to pass the modulated component of the received signal essentially undistorted but not so large as to admit excessive noise through the receiver.

◊ The noise process appearing at the output of such a filter is called narrowband noise.

Fig. 5.24 (a) Power spectral density of narrowband noise. (b) Sample function of narrowband noise, which appears somewhat similar to a sine wave of frequency f_c that undulates slowly in both amplitude and phase.

Page 81: Comm-05-Random Variables and Processes

Representation of Narrowband Noise in Terms of In-phase and Quadrature Components

◊ Consider a narrowband noise n(t) of bandwidth 2B centered on frequency $f_c$; it can be represented as

$n(t) = n_I(t)\cos(2\pi f_c t) - n_Q(t)\sin(2\pi f_c t)$

where $n_I(t)$ is the in-phase component of n(t) and $n_Q(t)$ is the quadrature component of n(t).

◊ Both $n_I(t)$ and $n_Q(t)$ are low-pass signals.

Fig. 5.25 (a) Extraction of in-phase and quadrature components of a narrowband process. (b) Generation of a narrowband process from its in-phase and quadrature components.

Page 82: Comm-05-Random Variables and Processes

Representation of Narrowband Noise in Terms of Representation of Narrowband Noise in Terms of InIn--phase and phase and QuadratureQuadrature Components Components

◊ nI(t) and nQ(t) of a narrowband noise n(t) have some important properties:
1) The nI(t) and nQ(t) of n(t) have zero mean.
2) If n(t) is Gaussian, then nI(t) and nQ(t) are jointly Gaussian.
3) If n(t) is stationary, then nI(t) and nQ(t) are jointly stationary.
4) Both nI(t) and nQ(t) have the same power spectral density, which is related to the power spectral density SN(f) of n(t) as

SNI(f) = SNQ(f) = SN(f − fc) + SN(f + fc),  −B ≤ f ≤ B (and 0 otherwise)

5) nI(t) and nQ(t) have the same variance as the narrowband noise n(t) (a numerical check of properties 1 and 5 is sketched after this list).
6) The cross-spectral density of nI(t) and nQ(t) of n(t) is purely imaginary:

SNINQ(f) = −SNQNI(f) = j[SN(f + fc) − SN(f − fc)],  −B ≤ f ≤ B (and 0 otherwise)

7) If n(t) is Gaussian and its power spectral density SN(f) is symmetric about the mid-band frequency fc, then nI(t) and nQ(t) are statistically independent.
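◊ A rough numerical check of properties 1 and 5 (the construction and all parameter values are my own, not from the text):

```python
# Build a narrowband Gaussian process from low-pass components and compare
# means (property 1) and variances (property 5). Assumed parameters.
import numpy as np

fs, fc, B, N = 8000.0, 1000.0, 100.0, 2**18
t = np.arange(N) / fs
rng = np.random.default_rng(3)

def lowpass(x, cutoff):                   # brick-wall low-pass via the DFT
    X = np.fft.rfft(x)
    X[np.fft.rfftfreq(len(x), d=1 / fs) > cutoff] = 0.0
    return np.fft.irfft(X, n=len(x))

nI = lowpass(rng.normal(size=N), B)
nQ = lowpass(rng.normal(size=N), B)
n = nI * np.cos(2 * np.pi * fc * t) - nQ * np.sin(2 * np.pi * fc * t)

print(nI.mean(), nQ.mean())               # property 1: both close to 0
print(n.var(), nI.var(), nQ.var())        # property 5: three nearly equal values
```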



◊ Example 5.17 Ideal Band-Pass Filtered White Noise
◊ Consider a white Gaussian noise of zero mean and power spectral density N0/2, which is passed through an ideal band-pass filter of passband magnitude response equal to one, mid-band frequency fc, and bandwidth 2B.
◊ The power spectral density characteristic of the filtered noise n(t) is shown in Fig. (a). The power spectral density characteristics of nI(t) and nQ(t) are shown in Fig. (c).




◊ The autocorrelation function of n(t) is the inverse Fourier transform of the power spectral density characteristic:

RN(τ) = ∫_{−fc−B}^{−fc+B} (N0/2) exp(j2πfτ) df + ∫_{fc−B}^{fc+B} (N0/2) exp(j2πfτ) df
      = N0B sinc(2Bτ) [exp(−j2πfcτ) + exp(j2πfcτ)]
      = 2N0B sinc(2Bτ) cos(2πfcτ)

◊ The autocorrelation function of nI(t) and nQ(t) is given by:

RNI(τ) = RNQ(τ) = 2N0B sinc(2Bτ)
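◊ A small sketch evaluating the two closed forms above (N0, B and fc are my own illustrative values): at τ = 0 both reduce to the total filtered noise power 2N0B.

```python
# Evaluate R_N(tau) and R_NI(tau) = R_NQ(tau) from Example 5.17 at a few lags.
import numpy as np

N0, B, fc = 2.0, 100.0, 1000.0
tau = np.array([0.0, 1e-4, 5e-4, 1e-3])

R_n = 2 * N0 * B * np.sinc(2 * B * tau) * np.cos(2 * np.pi * fc * tau)
R_iq = 2 * N0 * B * np.sinc(2 * B * tau)       # in-phase/quadrature components
print(R_n)    # starts at 2*N0*B = 400, oscillates under the sinc envelope
print(R_iq)   # the same sinc envelope without the carrier oscillation
```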



◊ The narrowband noise n(t) can be represented in terms of its envelope and phase components:

n(t) = r(t) cos[2πfct + ψ(t)]

r(t) = [nI²(t) + nQ²(t)]^(1/2)

ψ(t) = tan⁻¹[nQ(t)/nI(t)]

r(t) : envelope of n(t); ψ(t) : phase of n(t)

◊ Both r(t) and ψ(t) are sample functions of low-pass random processes.

◊ The probability distributions of r(t) and ψ(t) may be obtained from those of nI(t) and nQ(t).
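◊ In code, the envelope and phase follow directly from the two components. A small illustrative sketch (all names and parameters are my own; np.arctan2 is the four-quadrant form of tan⁻¹(nQ/nI)):

```python
# Envelope/phase from in-phase/quadrature components, plus a consistency
# check that both representations give the same n(t). Assumed parameters.
import numpy as np

def envelope_phase(nI, nQ):
    r = np.hypot(nI, nQ)          # r(t) = [nI^2(t) + nQ^2(t)]^(1/2)
    psi = np.arctan2(nQ, nI)      # psi(t) = tan^-1[nQ(t) / nI(t)]
    return r, psi

fs, fc = 8000.0, 1000.0
t = np.arange(1024) / fs
rng = np.random.default_rng(4)
nI, nQ = rng.normal(size=(2, t.size))    # stand-ins for low-pass components
r, psi = envelope_phase(nI, nQ)

n_iq = nI * np.cos(2 * np.pi * fc * t) - nQ * np.sin(2 * np.pi * fc * t)
n_env = r * np.cos(2 * np.pi * fc * t + psi)
print(np.allclose(n_iq, n_env))          # True: the two forms agree
```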



◊ Let NI and NQ denote the random variables obtained by observing the random processes represented by the sample functions nI(t) and nQ(t), respectively.

◊ NI and NQ are independent Gaussian random variables of zero mean and variance σ². Their joint probability density function is given by:

fNI,NQ(nI, nQ) = (1/(2πσ²)) exp[−(nI² + nQ²)/(2σ²)]

◊ Define nI = r cos ψ and nQ = r sin ψ. We have dnI dnQ = r dr dψ.
◊ The joint probability density function of R and Ψ is:

fR,Ψ(r, ψ) = (r/(2πσ²)) exp(−r²/(2σ²))


◊ Ψ is uniformly distributed over the range 0 to 2π.



◊ The probability density function of the random variable R is:

fR(r) = (r/σ²) exp(−r²/(2σ²)),  r ≥ 0;  fR(r) = 0 elsewhere   (5.150)

◊ A random variable having the probability density function of (5.150) is said to be Rayleigh distributed.

◊ The Rayleigh distribution in the normalized form (with υ = r/σ) is

fV(υ) = υ exp(−υ²/2),  υ ≥ 0;  fV(υ) = 0 elsewhere

Fig. 5.28 Normalized Rayleigh distribution
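◊ A hedged simulation check of (5.150), with my own parameter choices: the envelope of two i.i.d. zero-mean Gaussians of variance σ² should follow the Rayleigh density.

```python
# Compare a histogram of r = sqrt(nI^2 + nQ^2) against the Rayleigh pdf
# of (5.150). sigma and the bin layout are assumed values.
import numpy as np

rng = np.random.default_rng(5)
sigma = 1.5
nI = rng.normal(0.0, sigma, size=1_000_000)
nQ = rng.normal(0.0, sigma, size=1_000_000)
r = np.hypot(nI, nQ)

def rayleigh_pdf(r, sigma):
    return (r / sigma**2) * np.exp(-r**2 / (2 * sigma**2))

hist, edges = np.histogram(r, bins=60, range=(0.0, 6.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - rayleigh_pdf(centers, sigma))))  # small (~1e-2)
```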



◊ Example 5.18 Sinusoidal Signal Plus Narrowband Noise
◊ A sample function of the sinusoidal signal A cos(2πfct) plus narrowband noise n(t) is given by:

x(t) = A cos(2πfct) + n(t)

◊ Representing n(t) in terms of its in-phase and quadrature components around the carrier frequency fc:

x(t) = n'I(t) cos(2πfct) − nQ(t) sin(2πfct),  where n'I(t) = A + nI(t)

◊ Assume that n(t) is Gaussian with zero mean and variance σ².
◊ Both n'I(t) and nQ(t) are Gaussian and statistically independent.
◊ The mean of n'I(t) is A and that of nQ(t) is zero.
◊ The variance of both n'I(t) and nQ(t) is σ².



◊ The joint probability density function of the random variables N’ and NQ corresponding to n’ (t) and nQ(t) is

pp pp

N I and NQ , corresponding to n I(t) and nQ(t) is

( ) ( )2' 2' 1 exp I Qn A n

f n n⎡ ⎤− +⎢ ⎥=

L t (t) d t th l f (t) d (t) d t it h

( )' 2 2,, exp

2πσ 2σQII QN N

f n n ⎢ ⎥= −⎢ ⎥⎣ ⎦

◊ Let r(t) denote the envelope of x(t) and ψ(t) denote its phase.

r(t) = {[n'I(t)]² + [nQ(t)]²}^(1/2)

ψ(t) = tan⁻¹[nQ(t)/n'I(t)]

◊ The joint probability density function of the random variables R and Ψ is given by

fR,Ψ(r, ψ) = (r/(2πσ²)) exp[−(r² + A² − 2Ar cos ψ)/(2σ²)]



◊ The function fR,Ψ(r, ψ) cannot be expressed as a product fR(r)fΨ(ψ). This is because we now have a term, r cos ψ, involving the values of both random variables multiplied together.

◊ Rician distribution:

fR(r) = ∫_0^{2π} fR,Ψ(r, ψ) dψ
      = (r/(2πσ²)) exp[−(r² + A²)/(2σ²)] ∫_0^{2π} exp[(Ar/σ²) cos ψ] dψ
      = (r/σ²) exp[−(r² + A²)/(2σ²)] I0(Ar/σ²)

where I0(x) = (1/2π) ∫_0^{2π} exp(x cos ψ) dψ is the modified Bessel function of the first kind of zeroth order.

◊ The Rician distribution reduces to the Rayleigh distribution for small a, and reduces to an approximate Gaussian distribution when a is large (here a = A/σ in the normalized form).

Fig. 5.29 Normalized Rician distribution
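◊ A final sketch of the Rician density (my own helper built on SciPy's i0; parameter values are illustrative): setting A = 0 gives I0(0) = 1, and the expression collapses to the Rayleigh density of (5.150), consistent with the remark above.

```python
# Rician pdf via the modified Bessel function I0; reduces to Rayleigh at A = 0.
import numpy as np
from scipy.special import i0     # modified Bessel function, first kind, order 0

def rician_pdf(r, A, sigma):
    return (r / sigma**2) * np.exp(-(r**2 + A**2) / (2 * sigma**2)) \
           * i0(A * r / sigma**2)

r = np.linspace(0.0, 8.0, 5)
print(rician_pdf(r, A=0.0, sigma=1.0))   # equals r*exp(-r^2/2): Rayleigh
print(rician_pdf(r, A=3.0, sigma=1.0))   # concentrates near r ~ A when A >> sigma
```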