
UNIT III

CLASSIFICATION OF RANDOM PROCESSES

PART A

1. State the four types of stochastic processes.

The four types of stochastic processes are

(a) Discrete random sequence

(b) Continuous random sequence

(c) Discrete random process

(d) Continuous random process

2. Give an example for a continuous time random process.

If X(t) represents the maximum temperature at a place in the interval (0, t), then {X(t)} is a continuous random process.

3. Define a stationary process.

If certain probability distributions or averages of a random process do not depend on t, then the random process is called a stationary process.

4. Give an example for a stationary process.

A Bernoulli process is a stationary process.

5. Give an example of stationary process and justify your claim.

A Bernoulli process is a stationary stochastic process as the joint probability distributions are independent of

time.

6. Define strict sense and wide sense stationary process.

A random process is called a strict sense stationary process or strongly stationary process if all its finite dimensional distributions are invariant under translation of the time parameter.

A random process with finite first and second order moments is called a weakly stationary process or covariance stationary process or wide-sense stationary process if its mean is a constant and the autocorrelation depends only on the time difference, i.e., if E[X(t)] = constant and R_XX(t1, t2) = R(t1 − t2), a function of the time difference t1 − t2 alone.

7. Give an example for strict sense stationary process.

Bernoulli’s process is an example for strict sense stationary random process.

8. Prove that a first order stationary random process has a constant mean.

E[X(t)] = ∫ x f(x, t) dx, and f(x, t) = f(x, t + h) for every h, as the process is stationary.

Put h = −t; then f(x, t) = f(x, 0), so that E[X(t)] = ∫ x f(x, 0) dx.

Therefore, E[X(t)] is independent of t, i.e., E[X(t)] is a constant.


9. When is a random process or stochastic process said to be ergodic?

A random process is said to be ergodic, if its ensemble averages are equal to appropriate time

averages.

10. Give an example of an ergodic process.

(a) An irreducible, aperiodic Markov chain with a finite state space.

(b) A stochastic process X(t) is ergodic if its time average tends to the ensemble average as T → ∞.

11. State the properties of an ergodic process

{X(t)} is ergodic if all its statistics can be determined from a single sample function of the process.

12. What is a Markov process?

Markov process is one in which the future value is independent of the past values, given the present value.

13. Give an example of a Markov process.

The Poisson process is a Markov process. Therefore, the number of arrivals in (0, t), being a Poisson process, is an example of a Markov process.

14. Define Markov chain and one – step transition probability.

If P[X_n = a_n | X_{n−1} = a_{n−1}, X_{n−2} = a_{n−2}, …, X_0 = a_0] = P[X_n = a_n | X_{n−1} = a_{n−1}], then

the process {X_n}, n = 0, 1, 2, …, is called a Markov chain.

The conditional probability P[X_n = a_j | X_{n−1} = a_i] = p_ij(n − 1, n) is called the one step transition probability from state a_i to state a_j at the nth step.

15. Describe a random walk process. Is it a Markov process?

Suppose a person tosses a fair coin every T seconds and instantly after each toss, he moves a distance d to

the right if heads show and to the left if tails show. If X_n is the position of the person after n tosses,

then the process {X_n} is a random walk process.

The random walk process is a Markov process.
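
A minimal Python sketch of this random walk (the step size d, the number of tosses and the seed are arbitrary illustrative choices):

import numpy as np

# Simulate the random walk X_n: after each fair-coin toss the walker moves
# a distance d to the right (heads) or to the left (tails).
rng = np.random.default_rng(0)
d = 1.0           # step size (arbitrary choice for illustration)
n_tosses = 10

steps = rng.choice([d, -d], size=n_tosses)   # +d for heads, -d for tails
positions = np.cumsum(steps)                 # X_1, X_2, ..., X_n

print("positions after each toss:", positions)
# The next position depends only on the current one: X_{n+1} = X_n ± d,
# which is why the random walk is a Markov process.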


16. Define binomial process and state its properties.

Let X_i denote a random variable which represents the result of the ith trial. If X_i assumes only the 2 values 0 and 1, the process {X_i} is called a Bernoulli process.

A binomial process is defined as a sequence of partial sums {S_n}, where S_n = X_1 + X_2 + … + X_n.

Properties of binomial process:

(1) Binomial process is a Markov process.

(2) Since S_n is a binomial random variable, P[S_n = m] = nCm p^m (1 − p)^{n−m}.

(3) Expected value of a binomial process is np and its variance is np(1 − p).

17. Show that a binomial process is a Markov process.

Let S_n = X_1 + X_2 + … + X_n be a binomial process. Since S_n = S_{n−1} + X_n,

P[S_n = m | S_{n−1} = m] = 1 − p and P[S_n = m | S_{n−1} = m − 1] = p,

so the distribution of S_n depends only on the value of S_{n−1} and not on the earlier values S_{n−2}, S_{n−3}, …

Hence binomial process is a Markov process.

18. Define Poisson process.

If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete process {X(t), t ≥ 0} is called the Poisson process.

19. What is homogeneous Poisson process?

The probability law for the Poisson process is P[X(t) = n] = e^{−λt} (λt)^n / n!, n = 0, 1, 2, … When the rate λ is a constant, the Poisson process is called a homogeneous Poisson process.
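
A small Python sketch can be used to check this probability law against simulated counts; the rate λ = 2 and time t = 3 below are arbitrary illustrative values:

import math
import numpy as np

lam, t = 2.0, 3.0                      # arbitrary rate and time for illustration
rng = np.random.default_rng(1)

# Theoretical probability law of the homogeneous Poisson process.
def poisson_law(n, lam, t):
    return math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)

# Empirical estimate: counts of occurrences in (0, t] over many realisations.
counts = rng.poisson(lam * t, size=100_000)
for n in range(5):
    print(n, poisson_law(n, lam, t), np.mean(counts == n))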

20. State the postulates of Poisson process.

The postulates of Poisson process are

(i) P[1 occurrence in (t, t + Δt)] = λΔt + o(Δt)

(ii) P[0 occurrences in (t, t + Δt)] = 1 − λΔt + o(Δt)

(iii) P[2 or more occurrences in (t, t + Δt)] = o(Δt)

(iv) X(t) is independent of the number of occurrences of the event in any interval prior to and after the interval (0, t)

(v) The probability that the event occurs a specified number of times in (t0, t0 + t) depends only on t, but not on t0

21. State any two properties of Poisson process.

(i) The Poisson process is a Markov process.

(ii) Sum of two independent Poisson processes is a Poisson process

(iii) Difference of two independent Poisson processes is not a Poisson process.

22. Prove that the sum of two independent Poisson processes is also Poisson.

Let X(t) = X1(t) + X2(t), where {X1(t)} and {X2(t)} are independent Poisson processes with parameters λ1 and λ2 respectively.

P[X(t) = n] = P[X1(t) + X2(t) = n]

= Σ_{r=0}^{n} P[X1(t) = r] P[X2(t) = n − r]

= Σ_{r=0}^{n} [e^{−λ1 t} (λ1 t)^r / r!] [e^{−λ2 t} (λ2 t)^{n−r} / (n − r)!]

= e^{−(λ1+λ2)t} Σ_{r=0}^{n} (λ1 t)^r (λ2 t)^{n−r} / [r! (n − r)!]

= [e^{−(λ1+λ2)t} / n!] Σ_{r=0}^{n} nCr (λ1 t)^r (λ2 t)^{n−r}

= e^{−(λ1+λ2)t} (λ1 t + λ2 t)^n / n!

= e^{−(λ1+λ2)t} [(λ1 + λ2) t]^n / n!, n = 0, 1, 2, …

Therefore, {X1(t) + X2(t)} is a Poisson process with parameter (λ1 + λ2)t.
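
As an illustrative numerical check of this result (with arbitrary rates λ1 = 1.5, λ2 = 2.5 and t = 1), the distribution of the simulated sum can be compared with the Poisson law of parameter (λ1 + λ2)t:

import math
import numpy as np

lam1, lam2, t = 1.5, 2.5, 1.0          # arbitrary rates and time
rng = np.random.default_rng(2)

x1 = rng.poisson(lam1 * t, size=200_000)
x2 = rng.poisson(lam2 * t, size=200_000)
total = x1 + x2                         # samples of X1(t) + X2(t)

mu = (lam1 + lam2) * t
for n in range(6):
    theory = math.exp(-mu) * mu ** n / math.factorial(n)
    print(n, round(theory, 4), round(np.mean(total == n), 4))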

23. Prove that the difference of two independent Poisson processes is not a Poisson process.

Let X(t) = X1(t) − X2(t), where {X1(t)} and {X2(t)} are independent Poisson processes with parameters λ1 and λ2. Then E[X(t)] = (λ1 − λ2)t, while Var[X(t)] = (λ1 + λ2)t. For a Poisson distributed random variable the mean and variance must be equal, which fails here (and X(t) can even take negative values). Hence the difference of two independent Poisson processes is not a Poisson process.

24. Let be a Poisson process with rate . Find .

25. For a Poisson process with parameter and for show that


26. For the sine wave process X(t) = Y cos ωt, ω a constant, the amplitude Y is a random variable with uniform distribution in the interval 0 to 1. Check whether the process is stationary or not.

Given Y is a random variable with uniform distribution in the interval 0 to 1, E[Y] = 1/2.

Also, E[X(t)] = E[Y cos ωt] = cos ωt E[Y] = (1/2) cos ωt, which depends on t.

Since the mean is time dependent, the process is not stationary.

PART B

1. Define random process. Classify it with an example.

Random process:

A random process is a collection of random variables {X(s, t)} that are functions of a real variable, namely time t, where s ∈ S (sample space) and t ∈ T (parameter set or index set).

Classification of Random Process:

Depending on the continuous or discrete nature of the state space S and the parameter set T, a random process

can be classified into four types:


a) If both T and S are discrete, the random process is called a discrete random sequence.

For example, if X_n represents the outcome of the nth toss of a fair die, then {X_n, n ≥ 1} is a discrete random sequence, since T = {1, 2, 3, …} and S = {1, 2, 3, 4, 5, 6}.

b) If T is discrete and S is continuous, the random process is called a continuous random sequence.

For example, if X_n represents the temperature at the end of the nth hour of a day, then {X_n, 1 ≤ n ≤ 24} is a continuous random sequence, since temperature can take any value in an interval and hence is continuous.

c) If T is continuous and S is discrete, the random process is called a discrete random process.

For example, if X(t) represents the number of telephone calls received in the interval (0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, …}.

d) If T and S are continuous, the random process is called a continuous random process. For example, if X(t) represents the maximum temperature at a place in the interval (0, t), then {X(t)} is a continuous random process. A small illustrative sketch of these four types follows.
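
The sketch below uses simple stand-in distributions — a fair die, normally distributed "temperatures" and an arbitrary call rate of 4 per unit time — purely for illustration:

import numpy as np

rng = np.random.default_rng(3)

# (a) Discrete random sequence: outcomes of die tosses (T and S both discrete).
die_tosses = rng.integers(1, 7, size=5)

# (b) Continuous random sequence: hourly temperatures (T discrete, S continuous).
hourly_temp = 25 + 3 * rng.standard_normal(5)

# (c) Discrete random process: number of calls in (0, t] (T continuous, S discrete).
t = 2.7
calls_in_0_t = rng.poisson(4.0 * t)

# (d) Continuous random process: maximum temperature over (0, t] (both continuous).
max_temp_in_0_t = np.max(25 + 3 * rng.standard_normal(100))

print(die_tosses, hourly_temp, calls_in_0_t, max_temp_in_0_t)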

2. Consider the two dimensional process {X(t), Y(t)}. Define mean, correlation, covariance, cross correlation and cross covariance functions.

Mean:

Mean of the process {X(t)} is the expected value of a typical member X(t) of the process: μ_X(t) = E[X(t)].

Auto correlation:

Autocorrelation of the process {X(t)}, denoted by R_XX(t1, t2), is the expected value of the product of any two members X(t1) and X(t2) of the process: R_XX(t1, t2) = E[X(t1) X(t2)].

Auto covariance:

Auto covariance of the process {X(t)}, denoted by C_XX(t1, t2), is defined as C_XX(t1, t2) = R_XX(t1, t2) − E[X(t1)] E[X(t2)].

Cross Correlation:

Cross correlation of two processes {X(t)} and {Y(t)} is defined as R_XY(t1, t2) = E[X(t1) Y(t2)].

Cross Covariance:

Cross covariance of two processes {X(t)} and {Y(t)} is defined as C_XY(t1, t2) = R_XY(t1, t2) − E[X(t1)] E[Y(t2)].

3. The probability distribution of the process {X(t)} is given by

P[X(t) = n] = (at)^{n−1} / (1 + at)^{n+1}, n = 1, 2, 3, …
            = at / (1 + at),              n = 0.

Show that it is not stationary.

The probability distribution of X(t) is

X(t) = n :      0           1            2             3         …
P[X(t) = n] : at/(1+at)  1/(1+at)²   at/(1+at)³   (at)²/(1+at)⁴   …

E[X(t)] = Σ_{n=1}^{∞} n (at)^{n−1} / (1 + at)^{n+1}
        = [1/(1 + at)²] Σ_{n=1}^{∞} n [at/(1 + at)]^{n−1}
        = [1/(1 + at)²] [1 − at/(1 + at)]^{−2}
        = 1

(constant)

E[X²(t)] = Σ_{n=1}^{∞} n² P[X(t) = n] = 2at + 1, a function of t.


If {X(t)} is a stationary process, E[X(t)] and E[X²(t)] must be constants.

Since E[X²(t)] is a function of t, the given process is not stationary.

4. Verify if the sine wave process X(t) = Y cos ωt, where ω is a constant and Y is uniformly distributed in (0, 1), is a strict sense stationary process.

Given Y is uniformly distributed in (0, 1), the density function of Y is f_Y(y) = 1, 0 < y < 1, so E[Y] = 1/2.

E[X(t)] = E[Y cos ωt] = cos ωt E[Y] = (1/2) cos ωt, a function of t.

If {X(t)} is to be a SSS process, its mean must be a constant.

Therefore, {X(t)} is not a strict sense stationary process.

5. Verify whether the random process is wide sense stationary when A and are

constants and is uniformly distributed on the interval .

Given is uniformly distributed on the interval , the density function is given by


E[X(t)] works out to be a function of t,

and the autocorrelation is likewise a function of t and not of the time difference alone.

Therefore, the random process is not a wide sense stationary process.

6. Show that the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a uniform random variable over (−π, π), is wide sense stationary.

Given θ is a uniform random variable over (−π, π), the density function is f(θ) = 1/2π, −π < θ < π.

E[X(t)] = ∫_{−π}^{π} A cos(ωt + θ) (1/2π) dθ = (A/2π) [sin(ωt + θ)]_{−π}^{π}

= 0

Therefore the mean is independent of time.

R_XX(t1, t2) = E[A cos(ωt1 + θ) A cos(ωt2 + θ)]
= (A²/2) E[cos ω(t1 − t2) + cos(ω(t1 + t2) + 2θ)]
= (A²/2) cos ω(t1 − t2) + (A²/4π) ∫_{−π}^{π} cos(ω(t1 + t2) + 2θ) dθ
= (A²/2) cos ω(t1 − t2) [a function of the time difference t1 − t2]

Since the mean is a constant and the autocorrelation is a function of the time difference only, the process is a wide sense stationary process.
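
These two facts can be checked numerically; the sketch below uses arbitrary values A = 2, ω = 3, two arbitrary time instants, and θ sampled uniformly from (−π, π):

import numpy as np

rng = np.random.default_rng(4)
A, w = 2.0, 3.0                              # arbitrary constants
theta = rng.uniform(-np.pi, np.pi, size=200_000)

def X(t):
    return A * np.cos(w * t + theta)         # one value of X(t) per sampled theta

t1, t2 = 0.4, 1.1
tau = t1 - t2
print("mean at t1, t2:", X(t1).mean(), X(t2).mean())          # both near 0
print("R(t1, t2) estimate:", np.mean(X(t1) * X(t2)))
print("(A^2/2) cos(w*tau):", 0.5 * A**2 * np.cos(w * tau))    # theoretical value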

7. Show that the random process X(t) = A sin(ωt + θ) is wide sense stationary, where A and ω are constants and θ is uniformly distributed on the interval (0, 2π).

Given θ is uniformly distributed on the interval (0, 2π), the density function is f(θ) = 1/2π, 0 < θ < 2π.

E[X(t)] = ∫_0^{2π} A sin(ωt + θ) (1/2π) dθ = (A/2π) [−cos(ωt + θ)]_0^{2π}

= 0 = constant

R_XX(t1, t2) = E[A sin(ωt1 + θ) A sin(ωt2 + θ)]
= (A²/2) E[cos ω(t1 − t2) − cos(ω(t1 + t2) + 2θ)]
= (A²/2) cos ω(t1 − t2) [a function of the time difference t1 − t2]

Since the mean is a constant and the autocorrelation is a function of the time difference alone, the process is a wide sense stationary process.

8. Show that the process X(t) = A cos ωt + B sin ωt, where A and B are random variables, is wide sense stationary if E[A] = E[B] = 0, E[A²] = E[B²] and E[AB] = 0.

Given E[A] = E[B] = 0, E[A²] = E[B²] = k (say) and E[AB] = 0.

E[X(t)] = E[A] cos ωt + E[B] sin ωt = 0, a constant.

R_XX(t1, t2) = E[(A cos ωt1 + B sin ωt1)(A cos ωt2 + B sin ωt2)]
= E[A²] cos ωt1 cos ωt2 + E[B²] sin ωt1 sin ωt2 + E[AB] (cos ωt1 sin ωt2 + sin ωt1 cos ωt2)
= k (cos ωt1 cos ωt2 + sin ωt1 sin ωt2)
= k cos ω(t1 − t2), a function of the time difference t1 − t2.

Therefore the process is a WSS process.

9. Two random processes X(t) and Y(t) are defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt. Show that X(t) and Y(t) are jointly wide sense stationary, if A and B are uncorrelated RVs with zero means and the same variances and ω is a constant.

Given E[A] = E[B] = 0 and Var[A] = Var[B] = σ², i.e., E[A²] = E[B²] = σ².

Since A and B are uncorrelated RVs, we have E[AB] = E[A] E[B] = 0.

By the previous problem, X(t) and Y(t) are individually WSS processes.

Now R_XY(t1, t2) = E[X(t1) Y(t2)]
= E[(A cos ωt1 + B sin ωt1)(B cos ωt2 − A sin ωt2)]
= E[AB] cos ωt1 cos ωt2 − E[A²] cos ωt1 sin ωt2 + E[B²] sin ωt1 cos ωt2 − E[AB] sin ωt1 sin ωt2
= σ² (sin ωt1 cos ωt2 − cos ωt1 sin ωt2)
= σ² sin ω(t1 − t2), a function of the time difference t1 − t2.

Hence X(t) and Y(t) are jointly WSS processes.


10. Let X(t) = cos(ωt + Y), where ω is a constant. Show that {X(t)} is stationary in the wide sense iff φ(1) = φ(2) = 0, where φ is the characteristic function of the random variable Y.

Given φ is the characteristic function of the random variable Y, we have φ(ω) = E[e^{iωY}] = E[cos ωY] + i E[sin ωY].

Given φ(1) = 0, we have E[cos Y] = E[sin Y] = 0, so that

E[X(t)] = E[cos(ωt + Y)] = cos ωt E[cos Y] − sin ωt E[sin Y] = 0 (a constant)

Given φ(2) = 0, we have E[cos 2Y] = E[sin 2Y] = 0, so that

R_XX(t1, t2) = E[cos(ωt1 + Y) cos(ωt2 + Y)]
= (1/2) E[cos ω(t1 − t2) + cos(ω(t1 + t2) + 2Y)]
= (1/2) cos ω(t1 − t2) + (1/2)[cos ω(t1 + t2) E[cos 2Y] − sin ω(t1 + t2) E[sin 2Y]]
= (1/2) cos ω(t1 − t2), a function of the time difference alone.

Therefore {X(t)} is a WSS process.

11. Derive the probability law for the Poisson process

Let λ be the average number of occurrences of the event in unit time.

Let P_n(t) = P[X(t) = n] --------------------------------------(1)

P_n(t + Δt) = P_n(t)(1 − λΔt) + P_{n−1}(t) λΔt --------------------------------(2)

(2) − (1) implies

P_n(t + Δt) − P_n(t) = −λΔt [P_n(t) − P_{n−1}(t)]

Dividing by Δt and taking limit as Δt → 0,

dP_n(t)/dt = −λ [P_n(t) − P_{n−1}(t)] ------------------------(3)

Let the solution of equation (3) be

P_n(t) = f(t) (λt)^n / n! -----------------------------------------------------------(4)

Differentiating (4) with respect to t, we have,

dP_n(t)/dt = f'(t) (λt)^n / n! + f(t) λ (λt)^{n−1} / (n − 1)! ------------------(5)

Using (4) and (5) in (3), we have,

f'(t) (λt)^n / n! + f(t) λ (λt)^{n−1} / (n − 1)! = −λ f(t) (λt)^n / n! + λ f(t) (λt)^{n−1} / (n − 1)!

i.e., f'(t) / f(t) = −λ

Integrating with respect to t, we have

log f(t) = −λt + k --------------------------(6)

Substituting n = 0 in (4), we have

P_0(t) = f(t) ---------------------------(7)

Substituting t = 0 in (7), we have

f(0) = P_0(0) = P[X(0) = 0] = 1 ------------------------------------(8)

Substituting t = 0 in (6), we have

log f(0) = k, i.e., k = log 1 = 0

Therefore, f(t) = e^{−λt} -----------------------------------------------(9)

Using (9) in (4), we have

P_n(t) = e^{−λt} (λt)^n / n!, n = 0, 1, 2, …

Thus the probability distribution of X(t) is the Poisson distribution with parameter λt.

12. Prove that the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly distributed on the interval (0, 2π), is correlation ergodic.

Given θ is uniformly distributed on the interval (0, 2π), the ensemble autocorrelation is R_XX(τ) = E[X(t) X(t + τ)] = (A²/2) cos ωτ.

The stationary process {X(t)} is correlation ergodic if the time-averaged autocorrelation Z_T = (1/2T) ∫_{−T}^{T} X(t) X(t + τ) dt tends to R_XX(τ) as T → ∞,

i.e., to prove that lim_{T→∞} Z_T = R_XX(τ).

Z_T = (1/2T) ∫_{−T}^{T} A cos(ωt + θ) A cos(ω(t + τ) + θ) dt
= (A²/4T) ∫_{−T}^{T} [cos ωτ + cos(ω(2t + τ) + 2θ)] dt
= (A²/2) cos ωτ + (A²/8ωT) [sin(ω(2T + τ) + 2θ) − sin(ω(−2T + τ) + 2θ)]

As T → ∞, the second term tends to 0, so Z_T → (A²/2) cos ωτ = R_XX(τ).


Therefore, {X(t)} is correlation ergodic.
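
An illustrative numerical check of this limit, using a single sample function with arbitrary A = 2, ω = 3 and lag τ = 0.7:

import numpy as np

rng = np.random.default_rng(5)
A, w, tau = 2.0, 3.0, 0.7          # arbitrary constants and lag
theta = rng.uniform(0, 2 * np.pi)  # one realisation of the random phase

# Time average Z_T = (1/2T) * integral over (-T, T) of X(t) X(t+tau) dt,
# approximated as the mean of the integrand over a fine uniform grid.
def time_avg_autocorr(T, n=200_000):
    t = np.linspace(-T, T, n)
    integrand = A * np.cos(w * t + theta) * A * np.cos(w * (t + tau) + theta)
    return integrand.mean()

for T in (10, 100, 1000):
    print(T, time_avg_autocorr(T))
print("R(tau) =", 0.5 * A**2 * np.cos(w * tau))   # ensemble autocorrelation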

13. Determine whether or not the process X(t) = A cos ωt + B sin ωt is ergodic, if A and B are normally distributed random variables with zero means and unit variances.

Given A and B are normally distributed RVs with zero means and unit variances, we have E[A] = 0.

Similarly E[B] = 0, so the ensemble average is E[X(t)] = E[A] cos ωt + E[B] sin ωt = 0.

The time average of a sample function over (−T, T) is

X_T = (1/2T) ∫_{−T}^{T} (A cos ωt + B sin ωt) dt = (A/2T) [sin ωt / ω]_{−T}^{T} + (B/2T) [−cos ωt / ω]_{−T}^{T}

= (A sin ωT)/(ωT) + 0

As T → ∞, X_T → 0

= E[X(t)]

Therefore, the process is an ergodic process (ergodic in the mean).


14. Let {Y(t)} be a random process where Y(t) = A X(t), X(t) = (−1)^{N(t)}, and N(t) = total number of points in the interval (0, t) [= k, say], i.e., X(t) = 1 if k is even and X(t) = −1 if k is odd, such that P[A = 1] = P[A = −1] = 1/2 and A is independent of X(t). Find the autocorrelation of {X(t)}, {Y(t)}.

(i) E[A] = 1 · (1/2) + (−1) · (1/2) = 0 and E[A²] = 1, so

E[Y(t)] = E[A] E[X(t)] = 0, a constant.

Before finding R_XX, let us find P[X(t) = 1] and P[X(t) = −1].

P[X(t) = 1] = P[number of points in (0, t) is even]

= e^{−λt} [1 + (λt)²/2! + (λt)⁴/4! + …] = e^{−λt} cosh λt,

since N(t), which follows a Poisson distribution with parameter λt, gives P[N(t) = k] = e^{−λt} (λt)^k / k!.

Similarly, P[X(t) = −1] = P[number of points in (0, t) is odd] = e^{−λt} sinh λt.

(ii) X(t) X(t + τ) = 1 if the number of points in (t, t + τ) is even and −1 if it is odd, so

R_XX(t, t + τ) = 1 · e^{−λτ} cosh λτ + (−1) · e^{−λτ} sinh λτ = e^{−2λτ}, τ > 0.

(iii) R_YY(t, t + τ) = E[A² X(t) X(t + τ)] = E[A²] R_XX(t, t + τ) = e^{−2λτ}.

Note: (a) {Y(t)} is WSS from (i) & (iii).

(b) {X(t)} is evolutionary even though R_XX is a function of the time difference (E[X(t)] = P[X(t) = 1] − P[X(t) = −1] = e^{−2λt} is not a constant.)

15. A random process has sample values

Find the mean and

variance of the process. Also check if the process is ergodic in the mean.

X(t) has six equally likely values: 5, 3, 1, −1, −3, −5, so that E[X(t)] = 0 and Var[X(t)] = (25 + 9 + 1 + 1 + 9 + 25)/6 = 35/3.

We see that the variance of the time average X_T does not depend on T and it does not decrease as we increase T. Hence the process is not ergodic in the mean.

16. A random process Z(t) is given as Z(t) = X(t) cos(ωt + θ), where {X(t)} is a stationary random process with E[X(t)] = 0 and E[X²(t)] = σ². If θ is a RV independent of X(t) and uniformly distributed over the interval (−π, π), show that E[Z(t)] = 0 and E[Z²(t)] = σ²/2.

Given E[X(t)] = 0 and E[X²(t)] = σ².

X(t) and θ are independent RVs.

Given θ is uniformly distributed, the density function of θ is given by f(θ) = 1/2π, −π < θ < π.

E[Z(t)] = E[X(t) cos(ωt + θ)] = E[X(t)] E[cos(ωt + θ)] = 0 (since X(t) and θ are independent RVs)

E[Z²(t)] = E[X²(t) cos²(ωt + θ)] = E[X²(t)] E[cos²(ωt + θ)] = σ² · (1/2) E[1 + cos(2ωt + 2θ)] = σ²/2

Therefore, E[Z(t)] = 0 and E[Z²(t)] = σ²/2.

17. Verify whether the sine wave process X(t), where X(t) = Y cos ωt and Y is uniformly distributed in (0, 1), is a SSS process.

F_X(x; t) = P[X(t) ≤ x] = P[Y cos ωt ≤ x] = P[Y ≤ x sec ωt] = x sec ωt, 0 ≤ x ≤ cos ωt (taking cos ωt > 0),

i.e., the first order density is f_X(x; t) = sec ωt, 0 < x < cos ωt,

= a function of t.

If {X(t)} is to be a SSS process, its first order density must be independent of t. Therefore, {X(t)} is not a SSS process.

18. If X(t) = Y cos ωt + Z sin ωt, where Y and Z are two independent normal RVs with E[Y] = E[Z] = 0, E[Y²] = E[Z²] = σ², and ω is a constant, prove that {X(t)} is a SSS process of order 2.

Since X(t) is a linear combination of Y and Z, that are independent normal RVs, X(t) follows a normal distribution with

E[X(t)] = E[Y] cos ωt + E[Z] sin ωt = 0 and Var[X(t)] = σ² cos² ωt + σ² sin² ωt = σ².

Since X(t1) and X(t2) are each N(0, σ) and are jointly normal, their joint pdf is given by

f(x1, x2) = [1 / (2πσ² √(1 − r²))] exp{ −(x1² − 2r x1 x2 + x2²) / [2σ²(1 − r²)] } ---(1)

In (1), r is the correlation coefficient between X(t1) and X(t2): r = E[X(t1) X(t2)] / σ².

E[X(t1) X(t2)] = E[(Y cos ωt1 + Z sin ωt1)(Y cos ωt2 + Z sin ωt2)]

= E[Y²] cos ωt1 cos ωt2 + E[Z²] sin ωt1 sin ωt2 (since Y and Z are independent, E[YZ] = E[Y] E[Z] = 0)

= σ² (cos ωt1 cos ωt2 + sin ωt1 sin ωt2) = σ² cos ω(t1 − t2), so r = cos ω(t1 − t2).

Now the joint pdf of X(t1 + h) and X(t2 + h) is given by a similar expression as in (1), where the means and variances are the same and the correlation coefficient is cos ω[(t1 + h) − (t2 + h)] = cos ω(t1 − t2).

Thus the joint pdfs of {X(t1), X(t2)} and {X(t1 + h), X(t2 + h)} are the same.

Therefore, {X(t)} is a SSS process of order 2.

19. Show that when events occur as a Poisson process, the time interval between successive events follows an exponential distribution.

Let two consecutive occurrences of the event be E_i and E_{i+1}.

Let E_i take place at the time instant t_i and let T be the interval between the occurrences of E_i and E_{i+1}. T is a continuous RV.

P[T > t] = P[E_{i+1} did not occur in (t_i, t_i + t)]

= P[No event occurs in an interval of length t]

= P[X(t) = 0] = e^{−λt}

Therefore, the cdf of T is given by F(t) = P[T ≤ t] = 1 − e^{−λt}, t ≥ 0.

Therefore, the pdf of T is given by f(t) = λ e^{−λt}, t ≥ 0,

which is an exponential distribution with mean 1/λ.
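
An illustrative Python check: events of a Poisson process are scattered uniformly over a long window (one standard construction of the process), and the gaps between successive events are compared with the exponential law; λ = 2 and the window length are arbitrary choices:

import numpy as np

rng = np.random.default_rng(6)
lam, T = 2.0, 50_000.0                      # arbitrary rate and observation window

# Construct a Poisson process on (0, T): draw the number of events from
# Poisson(lam*T), scatter them uniformly over (0, T) and sort.
n_events = rng.poisson(lam * T)
event_times = np.sort(rng.uniform(0, T, size=n_events))
gaps = np.diff(event_times)                 # intervals between successive events

print("sample mean of gaps:", gaps.mean(), " theory 1/lam =", 1 / lam)
t0 = 1.0
print("P[T > t0] estimate:", np.mean(gaps > t0), " theory exp(-lam*t0) =", np.exp(-lam * t0))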

20. If X1(t) and X2(t) are two independent Poisson processes, show that the conditional distribution of X1(t) given X1(t) + X2(t) is binomial.

Let X1(t) and X2(t) be two independent Poisson processes with parameters λ1 and λ2 respectively.

Hence X1(t) + X2(t) is also a Poisson process with parameter λ1 + λ2 (by additive property).

Hence P[X1(t) = r | X1(t) + X2(t) = n] = P[X1(t) = r, X2(t) = n − r] / P[X1(t) + X2(t) = n].

Since X1(t) and X2(t) are independent, we have

P[X1(t) = r | X1(t) + X2(t) = n]
= [e^{−λ1 t} (λ1 t)^r / r!] [e^{−λ2 t} (λ2 t)^{n−r} / (n − r)!] / [e^{−(λ1+λ2)t} ((λ1 + λ2)t)^n / n!]
= nCr [λ1/(λ1 + λ2)]^r [λ2/(λ1 + λ2)]^{n−r}
= nCr p^r q^{n−r}, where p = λ1/(λ1 + λ2) and q = λ2/(λ1 + λ2).

Hence, the conditional distribution of X1(t) given X1(t) + X2(t) is binomial.

21. Suppose that customers arrive at a bank according to a Poisson process with a mean rate of 3 per

minute. Find the probability that during a time interval of 2 mins

1. exactly 4 customers arrive

2. more than 4 customers arrive

Mean of the Poisson process = λt.

Mean arrival rate = mean number of arrivals per minute = λ = 3.

Given t = 2, so λt = 6.

(a) P[exactly 4 customers arrive in 2 minutes] = P[X(2) = 4] = e^{−6} 6⁴ / 4! = 0.1339

(b) P[more than 4 customers arrive in 2 minutes] = P[X(2) > 4]
= 1 − P[X(2) ≤ 4]
= 1 − e^{−6} [1 + 6 + 6²/2! + 6³/3! + 6⁴/4!]
= 1 − 115 e^{−6}

≈ 0.715
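
The two probabilities can be reproduced with a short Python check (the Poisson pmf helper below is written out only for this illustration):

import math

lam_t = 3 * 2          # lambda * t = 3 per minute * 2 minutes

def poisson_pmf(n, mu):
    return math.exp(-mu) * mu ** n / math.factorial(n)

p_exactly_4 = poisson_pmf(4, lam_t)
p_more_than_4 = 1 - sum(poisson_pmf(n, lam_t) for n in range(5))
print(round(p_exactly_4, 4), round(p_more_than_4, 4))   # about 0.1339 and 0.7149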

22. A machine goes out of order whenever a component fails. The failure of this part follows a Poisson

process with a mean rate of 1 per week. Find the probability that 2 weeks have elapsed since last failure. If

there are 5 spare parts of this component in an inventory and that the next supply is not due in 10 weeks,

find the probability that the machine will not be out of order in the next 10 weeks.

Here the unit of time is 1 week.

Mean failure rate = mean number of failures in a week = λ = 1.

P[2 weeks have elapsed since the last failure] = P[no failure in 2 weeks] = P[X(2) = 0] = e^{−2}

= 0.135

There are only 5 spare parts and the machine should not go out of order in the next 10 weeks.

Required probability = P[not more than 5 failures in 10 weeks] = P[X(10) ≤ 5] = e^{−10} Σ_{k=0}^{5} 10^k / k! ≈ 0.067
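
A short Python check of both numbers (the cdf helper is an illustrative implementation, not a library call):

import math

def poisson_cdf(k, mu):
    # P[X <= k] for a Poisson random variable with mean mu
    return sum(math.exp(-mu) * mu ** n / math.factorial(n) for n in range(k + 1))

print(round(math.exp(-2), 3))        # P[no failure in 2 weeks] ~ 0.135
print(round(poisson_cdf(5, 10), 3))  # P[at most 5 failures in 10 weeks] ~ 0.067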

23. Define a Markov chain. How would you classify the states.

If P[X_n = a_n | X_{n−1} = a_{n−1}, X_{n−2} = a_{n−2}, …, X_0 = a_0] = P[X_n = a_n | X_{n−1} = a_{n−1}] for all n,

then the process {X_n}, n = 0, 1, 2, …, is called a Markov chain.

Classification of states of a Markov Chain:

If p_ij^(n) > 0 for some n and for all i and j, then every state can be reached from every other state.

When this condition is satisfied, the Markov chain is said to be irreducible. The transition probability matrix

of an irreducible chain is an irreducible matrix. Otherwise, the chain is said to be non irreducible or

reducible.

State i of a Markov chain is called a return state if p_ii^(n) > 0 for some n > 1.

The period d_i of a return state i is defined as the greatest common divisor of all m such that p_ii^(m) > 0, i.e., d_i = GCD{m : p_ii^(m) > 0}.

State i is said to be periodic with period d_i if d_i > 1 and

aperiodic if d_i = 1.

Obviously, state i is aperiodic if p_ii ≠ 0. The probability that the chain returns to state i, having started from state i, for the first time at the nth step is denoted by f_ii^(n) and called the first return time probability or the recurrence time probability. {f_ii^(n)}, n = 1, 2, 3, …, is the distribution of recurrence times of the state i.


If F_ii = Σ_{n=1}^{∞} f_ii^(n) = 1, the return to state i is certain.

μ_ii = Σ_{n=1}^{∞} n f_ii^(n) is called the mean recurrence time of the state i.

A state i is said to be persistent or recurrent if the return to state i is certain, i.e., if F_ii = 1. The state i is said to be transient if the return to the state i is uncertain, i.e., if F_ii < 1. The state i is said to be nonnull persistent if its mean recurrence time μ_ii is finite and null persistent if μ_ii = ∞.

A nonnull persistent and aperiodic state is called ergodic.

24. The transition probability matrix of a Markov chain , three states 1,2 and 3 is

and the initial distribution is Find (i)

(ii)

=

(ii)

By the definition of conditional probability and the Markov property, the required joint probability is the product of the initial state probability and the successive one-step transition probabilities.

25. A man either drives a car or catches a train to got to office each day. He never goes 2 days in a row

by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by

train. Now suppose that on the first day of the week, the man tossed a fair die and drove to work iff a 6

appeared. Find


(i) the probability that he takes a train on the third day and

(ii) the probability that he drives to work in the long run

The travel pattern is a Markov chain, with state space = (train, car).

The TPM of the chain is

P = |  0    1  |
    | 1/2  1/2 |

(rows and columns in the order train, car).

The initial state probability distribution is p^(1) = (5/6, 1/6), since

P(traveling by car) = P(getting 6 in the toss of the die) = 1/6 and

P(traveling by train) = 1 − 1/6 = 5/6.

The distributions for the next two days are p^(2) = p^(1) P = (1/12, 11/12) and p^(3) = p^(2) P = (11/24, 13/24).

Therefore, P(the man travels by train on the third day) = 11/24.

Let π = (π1, π2) be the limiting form of the state probability distribution or stationary state distribution of

the Markov chain.

By the property of π, π P = π, i.e.,

(1/2) π2 = π1 -------------------(1)

π1 + (1/2) π2 = π2 -------------(2)

Equations (1) and (2) are one and the same.

Therefore, consider (1) or (2) with π1 + π2 = 1, since π is a probability distribution. Solving, π1 = 1/3 and π2 = 2/3.

Therefore, P[the man travels by car in the long run] = π2 = 2/3.
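
An illustrative numpy check of the third-day probability and the stationary distribution (state order train, car as above):

import numpy as np

# States: 0 = train, 1 = car.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
p1 = np.array([5/6, 1/6])            # distribution on day 1 (die toss)

p3 = p1 @ np.linalg.matrix_power(P, 2)
print("P(train on day 3) =", p3[0])  # 11/24 ~ 0.4583

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalised.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()
print("long-run P(car) =", pi[1])    # 2/3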


26. Three boys A, B and C are throwing a ball to each other. A always throws the ball to B and B always

throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that the process is Markovian.

Find the transition matrix and classify the states.

Let X_n denote the boy who has the ball after the nth throw, with the states 1, 2, 3 standing for A, B, C. The transition probability matrix of the process is given below.

         A    B    C
    A |  0    1    0  |
P = B |  0    0    1  |
    C | 1/2  1/2   0  |

The state of X_n depends only on the state of X_{n−1}, but not on the states of X_{n−2}, X_{n−3}, … or earlier states.

Therefore, {X_n} is a Markov chain.

Now,

p_13^(2) = 1; p_21^(2) = 1/2; p_32^(2) = 1/2; p_11^(3) = 1/2; and so on,

and all the other p_ij^(n) > 0 for some n. Therefore, the chain is irreducible.

We note that p_ii^(2), p_ii^(3), p_ii^(5), p_ii^(6), etc. are > 0 for i = 2, 3 and GCD of 2, 3, 5, 6, … = 1.

Therefore, the states 2 and 3 are periodic with period 1, i.e., aperiodic.

We note that p_11^(3), p_11^(5), p_11^(6), etc. are > 0 and GCD of 3, 5, 6, … = 1. Therefore, the state 1 is periodic with

period 1, i.e., aperiodic.

Since the chain is finite and irreducible all its states are nonnull persistent.

Moreover, all the states are ergodic.
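
An illustrative numpy check of this classification: powers of the TPM show that every state leads to every other state, the return times for state 1 (A) have GCD 1, and the chain settles to a stationary distribution:

import numpy as np

# States 1, 2, 3 = A, B, C (the boy holding the ball).
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])

print(np.linalg.matrix_power(P, 2))
print(np.linalg.matrix_power(P, 3))

# Steps at which a return to state 1 has positive probability: 3, 5, 6, ... (GCD 1).
returns = [n for n in range(1, 10) if np.linalg.matrix_power(P, n)[0, 0] > 0]
print("return times for state 1:", returns)

# Long-run behaviour, consistent with all states being ergodic.
print("stationary distribution ~", np.linalg.matrix_power(P, 50)[0])   # (0.2, 0.4, 0.4)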

27. Write a detailed note on sine wave process.

A sine wave process is represented as X(t) = A sin(ωt + θ), where A and ω are constants and the phase θ is a uniform random variable in the range 0 to 2π. There are an infinite number of sample functions, because there are an infinite number of values of θ in the interval 0 to 2π.

If θ takes only two values in the range 0 to 2π, then the random process contains only two sample functions.

28. Write a note on binomial process.


Binomial process can be defined as a sequence of partial sums {S_n}, where S_n = X_1 + X_2 + … + X_n and X_1, X_2, … are independent Bernoulli random variables that take the value 1 (an arrival) with probability p and 0 with probability 1 − p.

Assumptions:

i. Time is assumed to be slotted into unit intervals. So, the process takes place at discrete

time.

ii. At most one arrival can occur in any slot.

iii. Arrivals occur randomly and independently in each slot with probability p.

Properties:

i. Binomial process is Markovian.

ii. S_n is a binomial random variable. So, P[S_n = m] = nCm p^m (1 − p)^{n−m};

E[S_n] = np; Var[S_n] = np(1 − p).

iii. The distribution of the number of empty slots T between the (i − 1)th and the ith arrival is geometric

with parameter p and starts from 0. The random variables T_1, T_2, … are

mutually independent. The geometric distribution is given by P[T = k] = p(1 − p)^k, k = 0, 1, 2, … (see the sketch following this list).

iv. The binomial distribution of the process approaches Poisson when n is large and p is

small.

v. The number of slots between t and the nearest occurrence of the event to the left of t

is called the age random variable, and the number of slots between t

and the nearest occurrence of the event to the right of t is called the residual life random

variable.

vi. The age and residual life distributions are geometric with parameter p. The value of

ages does not affect the value of residual life. So, these distributions are independent of

each other.
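
A minimal simulation sketch of properties (ii) and (iii) above, with arbitrary p = 0.3 and n = 50 slots:

import numpy as np

rng = np.random.default_rng(7)
p, n_slots, n_runs = 0.3, 50, 100_000

# Bernoulli arrivals in each slot, partial sums S_n = X_1 + ... + X_n.
X = (rng.random((n_runs, n_slots)) < p).astype(int)
S = X.cumsum(axis=1)

Sn = S[:, -1]
print("E[S_n]:", Sn.mean(), " np =", n_slots * p)                 # property (ii)
print("Var[S_n]:", Sn.var(), " np(1-p) =", n_slots * p * (1 - p))

# Property (iii): empty slots between successive arrivals are geometric with parameter p.
long_run = (rng.random(200_000) < p).astype(int)
arrival_slots = np.flatnonzero(long_run)
gaps = np.diff(arrival_slots) - 1
print("mean gap:", gaps.mean(), " theory (1-p)/p =", (1 - p) / p)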

29. Write a detailed note on normal process.

A real valued random process {X(t)} is called a Gaussian process or normal process, if the random variables X(t1), X(t2), …, X(tn) are jointly normal for every n = 1, 2, … and for any set of t1, t2, …, tn. The nth order density of a Gaussian process is given by

f(x1, x2, …, xn; t1, t2, …, tn) = [1 / ((2π)^{n/2} |Λ|^{1/2})] exp{ −(1/2|Λ|) Σ_{i=1}^{n} Σ_{j=1}^{n} |Λ|_ij (x_i − μ_i)(x_j − μ_j) }

where μ_i = E[X(t_i)] and Λ is the nth order square matrix (λ_ij), where λ_ij = C_XX(t_i, t_j) and

|Λ|_ij = cofactor of λ_ij in |Λ|.

Properties:

i. If a Gaussian process is wide-sense stationary, it is also strict sense stationary.

ii. If the member functions of a Gaussian process are uncorrelated, then they are

independent.

iii. If the input of a linear system is a Gaussian process, the output will also

be a Gaussian process.
