PTSP - Probability and Stochastic Processes



http://www.PRsolutions.in JNTU ONLINE EXAMINATIONS [Mid 2 - PTSP]

1. Find the expected value of g(X1, X2, X3, X4) = [expression missing], where n1, n2, n3 and n4 are integers ≥ 0 and f(x1, x2, x3, x4) = [expression missing] for 0 < x1 < a; 0 < x2 < b; 0 < x3 < c; 0 < x4 < d

= 0 elsewhere

a.

b. (n + 1)4

c.

d.

2. Two R.Vs X and Y have means 1 and 2 respectively and variances 4 and 1 respectively. Their correlation coefficient is 0.4. New R.Vs V and W are defined as V = -X + 2Y, W = X + 3Y. The correlation and correlation coefficient of V and W are

a. 22.2 and 0.08

b. 0.222 and 0.8

c. 0.08 and 2.22

d. 2.22 and 0.8
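A quick check for Q.2, using only the bilinearity of covariance (the figures below follow from the standard definitions, not from any answer key):

Cov(X, Y) = \rho\,\sigma_X\sigma_Y = 0.4(2)(1) = 0.8
Cov(V, W) = -\sigma_X^2 + 6\sigma_Y^2 + (2 - 3)\,Cov(X, Y) = -4 + 6 - 0.8 = 1.2
R_{VW} = E[VW] = Cov(V, W) + E[V]E[W] = 1.2 + (3)(7) = 22.2
\sigma_V^2 = 4 + 4 - 4(0.8) = 4.8, \quad \sigma_W^2 = 4 + 9 + 6(0.8) = 17.8, \quad \rho_{VW} = 1.2/\sqrt{(4.8)(17.8)} \approx 0.13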

3. Which of the following Relation is correct

a.

b.

c.

d.

4.

5. Two independent R.V's X, Y are always

a. Correlated

b. Uncorrelated

c. have cov (X, Y) ≠ 0

d. have correlation co-efficient - 1

6. If independent R.V's 'X' and 'Y' have the variances 36 and 16 respectively, the correlation coefficient between (X + Y) and (X - Y) assuming X and Y are possessing Zero mean is

a.

b.

c.

d.
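A worked sketch for Q.6 (assuming only the stated independence and zero means):

Cov(X + Y, X - Y) = \sigma_X^2 - \sigma_Y^2 = 36 - 16 = 20
Var(X + Y) = Var(X - Y) = \sigma_X^2 + \sigma_Y^2 = 52
\rho = 20/52 = 5/13 \approx 0.385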

7. If X, Y and Z are uncorrelated R.V's with the same variance, find the correlation coefficient between (X + Y) and (Y + Z).


a. 0

b.

c.

d. 1
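A worked sketch for Q.7, with X, Y, Z uncorrelated and of common variance \sigma^2:

Cov(X + Y, Y + Z) = Var(Y) = \sigma^2, \quad Var(X + Y) = Var(Y + Z) = 2\sigma^2
\rho = \sigma^2 / (2\sigma^2) = 1/2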

8. If g(x, y) is some function of two R.V's X and Y the expected value of g(x, y) is

a.

b.

c.

d.

9. The (n + K)th order joint moment of two R.V's X and Y is defined as

a.

b.

c.

d.

10. The (n + K)th order joint central moment of the R.V's X and Y is defined as

a.

b.

c.

d.

11. Cov (X, Y) covariance of (X, Y) is

a. E(XY) b. E(XY) - E(X) c. E(XY) - E (X) E (Y)

d. E(XY) + E(X)E(Y)


12. Two R.Vs X and Y have the joint characteristic function

ΦX,Y(w1, w2) = exp[expression missing]. The means of X and Y respectively are

a. 0, 0

b. 0, 1

c. 1, 0

d. 1, 1

13. The joint moments can be found from the joint characteristic function as

a.

b.

c.

d.

14. The joint characteristic function of two r.v's X and Y is given as

φX,Y (w1, w2) = exp Then the means of X and Y are respectively

a. 0, 0

b. 0, 1

c. 1

d. 0

15. For the above problem covariance is

a. 3

b. 2

c. 1

d. 0

16. The joint characteristic function of two R.V's X and Y is defined by X,Y (w1, w2) =

a.

b.

c.

d.

17. The expression for joint characteristic function of two r.V's X and Y in integral form

is X,Y (w1, w2) =

a.

b.


c.

d.

18. The expression for joint characteristic function X,Y (w1, w2) is recognized as the two dimensional

a. Fourier transform of joint density function

b. Fourier transform of joint distribute function

c. Inverse Fourier transform of joint distribute functions

d. Inverse Fourier transform of joint density function

19. For the above problem covariance of X, Y is cov (X, Y) is

a. 3

b. 2

c. 1

d. 0

20. Which of the following is marginal characteristic function

a. X,Y (0, 0)

b. X,Y (w1 0)

c. X,Y (w1, w2)e JwX

d. X,Y(w1, w2)e-JwX

21. To convert correlated R.V's X and Y into two statistically independent Gaussian r.v.'s, the co-ordinate rotation through an angle is θ =

a.

b.

c.

d.

22. The R.V's X and Y are transformed to get new r.v's V and W as V = X cosθ + Ysinθ W = X sinθ - Ycosθ

If f(x, y) = then f(V, W) is

a.

b.

c.

d.

23. The marginal density function of X is given by fX(x) =


a.

b.

c.

d.

24. X and Y are Gaussian r.v's with variances [missing] and [missing]. Then the R.V's V = X + KY and W = X - KY are statistically independent for K equal to

a.

b.

c.

d.

25. X and Y are jointly Gaussian R.Vs with the same variance and ρxy = -1. The angle θ of a co-ordinate rotation that generates new r.v's that are statistically independent is

a.

b.

c.

d. π

26. Two R.V's X and Y are said to be jointly Gaussian if their joint density function is of the form fX,Y (x, y) =

a.

b.

c.

d.


27. Gaussian R.V's are completely defined through only their

a. first order moments

b. Second order moments

c. first order moments & second order moments

d. Covariance

28. Any uncorrelated Gaussian R.V's are always

a. not independent b. independent c. have non-zero covariance

d. has correlation coefficient of unity

29. The independent R.V's with zero mean are

a. orthogonal

b. non-orthogonal c. correlated

d. have RXY ≠ 0

30. Which of the following is correct

a.

b.

c.

d.

31. X and Y are two independent normal r.v's N(m, σ2) = N(0, 4). Consider V = 2X + 3Y is a _ R.V.

a. Rayleigh

b. Gaussian

c. poisson

d. Binomial

32. If X and Y are independent r.v's with density function fX(x) = 1 for 1 ≤ x ≤ 2 and fY(y) = [expression missing] in 2 ≤ y ≤ 4. The density of Z = XY in the region 4 ≤ Z ≤ 8 is

a. Z - 2

b.

c.

d.

33. The joint density of two R.Vs X and Y is f(x, y) = (x + y) for 0 ≤ x ≤ 1; 0 ≤ y ≤ 1; = 0 elsewhere. Then the density of Z = XY in 0 ≤ Z ≤ 1 is

a. 0

b. 2( Z - 1)

c.

d. 2 ( 1 - Z)
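A worked derivation for Q.33, using the standard density of a product of two R.V's:

f_Z(z) = \int_z^1 f_{XY}\!\left(x, \tfrac{z}{x}\right)\frac{1}{x}\,dx = \int_z^1 \left(1 + \frac{z}{x^2}\right)dx = (1 - z) + z\left(\frac{1}{z} - 1\right) = 2(1 - z), \quad 0 \le z \le 1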

34. X and Y are two independent R.V's with density f(x) = 2e^(-2x) for x > 0 and f(y) = 2e^(-2y) for y > 0. The density of the R.V Z = X - Y in the region Z > 0 is

a. e^(-2z)

b. e^(2z)

c. e^(-z)

d. e^(z)
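A worked derivation for Q.34 (and, by symmetry, Q.35):

f_Z(z) = \int_0^{\infty} f_X(z + y)\,f_Y(y)\,dy = \int_0^{\infty} 2e^{-2(z+y)}\,2e^{-2y}\,dy = 4e^{-2z}\int_0^{\infty} e^{-4y}\,dy = e^{-2z}, \quad z > 0

and the same steps for z < 0 give f_Z(z) = e^{2z}.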

35. For the above problem the density of R.V Z = X - Y when Z < 0 is


a. e^(-2z)

b. e^(2z)

c. e^(-z)

d. e^(z)

36. Let Z and W be two R.V's expressed as functions of two R.V's X and Y, i.e., Z = g(X, Y) and W = h(X, Y). Then the joint pdf of Z and W is given as fZW(Z, W) =

a. fXY (x, y)

b.

c. fXY (x, y) ejWX

d. fXY (x, y) . e-jWX

37. The Jacobian of the transformation x = V , y = ? (u - v) is

a.

b.

c. 1

d. 2

38. X and Y are R.V's transformed as and y = V. The jacobian of this transformation is

a. V

b. -V

c.

d.

39. If X and Y are independent R.V's then density of Z = X + Y is ________ of the individual densities of X and Y.

a. Convolution

b. Fourier Transform

c. Laplace transform

d. Z - Transform
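A minimal numerical illustration of Q.39; the uniform densities below are assumptions chosen only to show that the density of X + Y is the convolution of the individual densities.

import numpy as np

dx = 0.001
x = np.arange(0.0, 1.0, dx)
fX = np.ones_like(x)            # assumed: X ~ Uniform(0, 1)
fY = np.ones_like(x)            # assumed: Y ~ Uniform(0, 1)
fZ = np.convolve(fX, fY) * dx   # density of Z = X + Y (numerical convolution)
z = np.arange(len(fZ)) * dx
print(round(fZ.max(), 3), round(z[fZ.argmax()], 3))  # peak value ~1 at z ~1 (triangular density)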

40. The mean and variance of U for the above problem after transformation are respectively

a. 26, 52

b. 52, 26

c. 0, 52

d. 52, 0

41. Let X and Y be jointly Gaussian r.v's where σx² = σy² and ρ = -1. Find the transformation matrix such that the new r.v's Y1 and Y2 are statistically independent.

a.

b.


c.

d.

42. Two Gaussian R.V's X1 and X2 are defined by the mean and covariance matrices

Two new R.V's Y1 and Y2 are formed using the transformation.

Then is

a. 0.628

b. 0.268

c. 0.379

d. 0.826

43. The joint characteristic function of X and Y if fXY(x, y) = [expression missing] is

a.

b.

c.

d.

44. Two Gaussian r.v's X1 and X2 have zero means and variances and .

Their Covariance . If Y1 = X1 - 2X2

Y2 = 3X1 + 4X2 then

are respectively

a. 28, 252

b. 252, 28

c. 66, 142

d. 142, 66
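A numerical sketch for Q.44. The variances and covariance are missing from the transcript; the values σ1² = 4, σ2² = 9 and C12 = 3 used below are assumptions, chosen because they reproduce the listed figures 28 and 252 (and the -66 of Q.48).

import numpy as np

# Assumed second-order statistics of [X1, X2] (missing from the transcript)
CX = np.array([[4.0, 3.0],
               [3.0, 9.0]])
# Y1 = X1 - 2*X2, Y2 = 3*X1 + 4*X2  =>  Y = T X
T = np.array([[1.0, -2.0],
              [3.0,  4.0]])
CY = T @ CX @ T.T   # covariance matrix of Y under a linear transformation
print(CY)           # diagonal gives Var(Y1), Var(Y2); off-diagonal gives Cov(Y1, Y2)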

45. If X, Y and Z are three independent R.V's of same mean and variance, the mean square value of (Y - Z) is

a.

b.

c.

d. 0
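A short check for Q.45, with X, Y, Z independent and of common mean and variance σ²:

E[(Y - Z)^2] = Var(Y - Z) + (E[Y] - E[Z])^2 = \sigma^2 + \sigma^2 + 0 = 2\sigma^2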


46. Zero mean Gaussian R.V's X1, X2 and X3 having a covariance matrix

are transformed to new variables Y1 = 5X1 + 2X2 - X3 Y2 = - X1 + 3X2 + X3 Y3 = 2X1 - X2 + 2X3 The covariance matrix of Y1, Y2 and Y3 is [CY]

a.

b.

c.

d.

47. Let X1 and X2 be two Gaussian R.V's that are also jointly Gaussian, and let Y1 = a11 X1 + a12 X2,

Y2 = a21 X1 + a22 X2. Defining [Y] = [T] [X], the covariance matrix of Y1 and Y2 is

a. [CY] = [T-1] [CX] [Tt]-1

b. [CY] = [T] [CX] [Tt]-1

c. [CY] = [T] [CX]-1 [Tt]-1

d. [CY] = [T] [CX] [T]t

48. For the above problem is

a. 66

b. - 66

c. 41

d. -41

49. If X has the distribution N(m, σ2) = N (0, 1) the MGF of Y = X2 is MY(t)

a.

b.

c.

d.


50. Two R.V's X1 and X2 have variances K and 2 respectively. A random variable Y is defined as Y = 3X2 - X1. If var(Y) = 25, then K =

a. 6

b. 7

c. 8

d. 9
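A one-line check for Q.50, assuming X1 and X2 are uncorrelated:

Var(Y) = Var(3X_2 - X_1) = 9(2) + K = 18 + K = 25 \;\Rightarrow\; K = 7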

51. X(t1) = X1 and X(t2) = X2; the correlation between X1 and X2 is R(t1, t2) =

a.

b.

c.

d.

52. The auto covariance of random process X(t) is CovXX(t1, t2)

a. RXX (t1, t2) - E[X(t1)]
b. RXX (t1, t2) + E[X(t2)]
c. RXX (t1, t2) - E[X(t1)] . E[X(t2)]

d. RXX (t1, t2) + E[X(t1)] + E[X(t2)]

53. If FX(x1, x2 : t1, t2) is referred to as the second order joint distribution function then the

corresponding joint density function is

a.

b.

c.

d.

54. The mean of random process X(t) is the expected value of the R.V X at time t i.e., the mean m(t) =

a.

b.

c.

d.

55. If RXY = 0 then X and Y are

a. independent

b. orthogonal


c. independent & orthogonal
d. Statistically independent

56. The collection of all the sample functions is referred to as

a. Ensemble

b. Assemble

c. Average

d. Set

57. If the future value of a sample function cannot be predicted based on its past values, the process is referred to as

a. deterministic process

b. non-deterministic process

c. independent process

d. Statistical process

58. If the future value of the sample function can be predicted based on its past values, the process is referred to as

a. deterministic process

b. non-deterministic process

c. independent process

d. Statistical process

59. If sample of X(t) is a R.V then the cumulative distribution function FX(x1 : t1) is

a. P(X(t1))
b. P(X(t1) ≤ 0)
c. P(X(t1) ≤ x1)

d. P(X(t1) ≥ x1)

60. The random processes X(t) and Y(t) are said to be independent if fXY(x1, y1 : t1, t2) =

a. fX(x1 : t1)
b. fY(y1 : t2)
c. fX(x1 : t1) fY(y1 : t2)

d. 0

61. A random process is defined as X(t) = cos(wot + θ), where θ is a uniform random variable over (-π, π). The second moment of the process is

a. 0

b.

c.

d. 1
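A short derivation for Q.61:

E[X^2(t)] = E[\cos^2(w_0 t + \theta)] = \tfrac{1}{2} + \tfrac{1}{2}E[\cos(2w_0 t + 2\theta)] = \tfrac{1}{2},

since the second term averages to zero for θ uniform over (-π, π).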

62. For the random process X(t) = A coswt where w is a constant and A is a uniform R.V over (0, 1) the mean square value is

a.

b. cos wt

c.

d.

63. A stationary continuous process X(t) with auto-correlation function RXX(τ) is called autocorrelation-ergodic or ergodic in the autocorrelation if, and only if, for all τ


a.

b.

c.

d.

64. A random process is defined as X(t) = A cos(wt + θ), where w and θ are constants and A is a random variable. Then X(t) is stationary if

a. F(A) = 2

b. F(A) = 0

c. A is Gaussian with non-zero mean

d. A is Rayleigh with non-zero mean

65. For an ergodic process

a. mean is necessarily zero

b. mean square value is infinity

c. all time averages are zero

d. mean square value is independent of time

66. Two processes X(t) and Y(t) are statistically independent if

a.

b.

c.

d.

67. Let X(t) is a random process which is wide sense stationery then

a. E[X(t)] = constant
b. E[X(t) . X(t + τ)] = RXX(τ)
c. E[X(t)] = constant and E[X(t) . X(t + τ)] = RXX(τ)

d. E[X2(t)] = 0

68. A process stationary to all orders N = 1, 2, ..., for Xi = X(ti), i = 1, 2, ..., N, is called

a. Strict-sense stationary

b. wide-sense stationary

c. strictly stationary

d. independent

69. Time average of a quantity x(t) is defined as A[x(t)] =

a.

b.

c.

d.


70. Consider a random process X(t) defined as X(t) = A coswt + B sinwt, where w is a constant and A and B are random variables. Which of the following is a condition for its stationarity?

a. E (A) = 0, E(B) = 0

b. E (AB) ≠ 0

c. E(A) ≠ 0; E(B) ≠ 0

d. A and B should be independent

71. X(t) is a wide-sense stationary process with E[X(t)] = 2 and RXX(τ) = [expression missing]. Find the mean and variance of [quantity missing].

a. 2, 20 [10e-0.1 - 9]

b. 1, 10 [10e-0.1 + 9] c. 0, 5 [20e-0.1 - 9] d. 0, 10 [10e-0.1 + 9]

72. X(t) is a random process with mean 3 and autocorrelation R(t1, t2) = . The covariance of the R.V's Z = X(5) and w = X(8) is

a. e-0.6

b. 4 e-0.6

c. 8 e-0.6

d.

73. Let X(t) and Y(t) be two random processes with respective auto correlation

functions RXX(τ) and RYY(τ). Then is

a.

b.

c.

d.

74. X(t) is a Gaussian process with mean = 2 and auto correlation function . Then the variance of the random variable X(2) is

a. 21

b. 25

c. 4

d. 1

75. A stationary random process X(t) is periodic with period 2T. Its auto correlation function is

a. non periodic

b. periodic with period T

c. periodic with period 2T

d. periodic with period T/2

76. RXX(τ) =

a. E[X2 (t)]

b.

c.

d. E[x(t) . X(t +τ)]

77. Which of the following is correct


a.

b.

c.

d.

78. If X(t) is ergodic, zero mean and has no periodic component then

a.

b.

c. mean is 0

d. constant mean

79. Let X(t) and Y(t) be two jointly stationary random processes. Then RYX(-τ) =

a. RXY(τ)

b. RYX(τ) c. RXY(-τ) d. - RYX(τ)

80. If CXX (τ) and CXY (τ) are auto and cross-covariance functions respectively then

a.

b.

c.

d.

81. A random process is defined as X(t) = A cos(w1 t + θ), where θ is a uniform random variable over (0, 2π). Then RXX(τ) is

a.

b.

c.

d.
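A short derivation for Q.81 (A and w1 constants, θ uniform over (0, 2π)):

R_{XX}(\tau) = E[A^2 \cos(w_1 t + \theta)\cos(w_1 t + w_1\tau + \theta)] = \frac{A^2}{2}\cos(w_1\tau),

the term \cos(2w_1 t + w_1\tau + 2\theta) averaging to zero over θ.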

82. X(t) = A coswt where w is a constant and A is a uniform random variable over (0, 1). The auto covariance of X(t) is

a. Cos wt1, cos wt2

b.

c.

d.

83. The auto covariance C(t1, t2) of a process X(t) is (assume that E[X(t)] = η(t))

a. C(t1, t2) = R(t1, t2)

b. C(t1, t2) = R(t1, t2) -


c. C(t1, t2) = R2(t1, t2) +

d. C(t1, t2) = R2(t1, t2) -

84. The correlation coefficient of the process X(t) is γ(t1, t2) =

a.

b.

c.

d.

85. Which of the following is correct

a.

b.

c.

d.

86. The auto correlation function of a stationary random process X(t) is RXX(τ)

= . The mean & variance is

a. 4, 25

b. 25, 4

c. 21, 2

d. 5, 4

87. The auto covariance C(t1, t2) of a process x(t) is the covariance of the R.V's

a. X1(t) & X2(t)
b. X1(t1) & X2(t1)
c. X(t1) & X(t2)

d. X1(t1) & X2(t2)

88. Two processes X(t) & Y(t) are called (mutually) orthogonal if, for every t1 and t2,

a. Rxy(t1, t2) = 0

b. Rxy (t1, t2) > 0

c. Rxy (t1, t2) < 0

d. Rxy (t1, t2) = 1

89. Two processes X(t) and Y(t) are called uncorrelated if

a. Cxy(t1, t2) = 0

b. Cxy (t1, t2) > 0

c. Cxy (t1, t2) < 0

d. Cxy (t1, t2) = 1

90. Which of the following is correct

a.


b.

c.

d.

91. Auto correlation function of Poisson process is RXX(t1, t2) for t1 > t2 is

a. λt2 (1 + λt1)

b. λt1 (1 + λt2) c. 1

d. 0
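A short note for Q.91, using the standard autocorrelation of a homogeneous Poisson counting process with rate λ:

R_{XX}(t_1, t_2) = \lambda \min(t_1, t_2) + \lambda^2 t_1 t_2 = \lambda t_2 (1 + \lambda t_1) \quad \text{for } t_1 > t_2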

92. X(t) is a Gaussian process with mean 2 and auto correlation function RXX(τ)

= [expression missing]. Then P(X(τ) ≤ 1) is (in terms of the Q function)

a. Q(1)

b. Q(2) c. Q(3) d. Q(4)

93. The interval between two consecutive occurrences of Poisson process is ____r.v.

a. Gaussian

b. Poisson

c. exponential
d. binomial

94. The mean square value for the Poisson process X(t) with parameter λt is

a. λt b. (λt)2

c. λt + (λt)2

d. λt - (λt)2
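For Q.94, since X(t) is Poisson with parameter λt:

E[X^2(t)] = Var[X(t)] + (E[X(t)])^2 = \lambda t + (\lambda t)^2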

95. Difference of two independent Poisson processes is

a. Poisson process

b. not a Poisson process

c. Process with mean = 0 [λ1 t ≠ λ2t] d. Process with variance = 0 [λ1 t ≠ λ2t]

96. The mean of the Poisson random process X(t) with the parameter λt is

a. λ

b. λ t

c.

d. 0

97. For a Poisson random process with b = λt, the probability of exactly K occurrences over the time interval (0, t), P[X(t) = K], is (K = 0, 1, 2, ...)

a.

b.

c.

d.
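The Poisson probability law asked for in Q.97, written out:

P[X(t) = K] = \frac{(\lambda t)^K}{K!}\,e^{-\lambda t}, \quad K = 0, 1, 2, \ldots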

98. For a Gaussian random process (which is known to be WSS), the auto covariance and autocorrelation functions will depend only on

a. Time differences

b. absolute time

c. mean

d. mean & variance


99. If X(t) and Y(t) are independent WSS processes with zero means, the

autocorrelation function of the process Z(t) = K X(t) Y(t) is

a. RXX(τ) RYY (τ)

b. K RXX(τ) RYY (τ) c. K2 RXX(τ) RYY (τ)

d.

100. Sum of two independent Poisson process is _________ process

a. Gaussian

b. Poisson

c. Rayleigh

d. Binomial

101. The mean square value of a random process whose auto correlation function is [expression missing] is

a.

b.

c.

d.

102. The output of a filter is given by Y(t) = X(t + T) + X(t - T), where X(t) is a WSS process with power spectrum SXX(w) and T is a constant. The power spectrum of Y(t) is

a.

b.

c.

d.

103. The PSD of a random process whose auto correlation function is is

a.

b.

c.

d.

104. The PSD of a random process X(t) = A.cos(Bt + Y), where Y is a uniform r.v over (0, 2π)


a.

b.

c.

d.

105. The auto correlation function of a random process whose PSD is is

a.

b.

c.

d.

106. The power density spectrum or power spectral density of xT(t) = x(t) for -T ≤ t ≤ T;

= 0 elsewhere is SXX(w)=

a.

b.

c.

d.

107. The average power of the above described random process is =

a.

b.

c.

d. zero

108. Time average of auto correlation function and the power spectral density form __ pair.

a. Fourier Transform

b. Laplace Transform

c. Z-Transform

d. Convolution


109. The rms bandwidth in terms of the PSD is

a.

b.

c.

d.

110. PSD of a WSS is always

a. negative

b. non-negative

c. positive

d. can be negative or positive

111. The average power of the periodic random process signal whose auto

correlation function is

a. 0

b. 1

c. 2

d. 3

112. If [expressions missing] and X(t) and Y(t) are of zero mean, then let U(t) = X(t) + Y(t). Then SXU(w) is

a.

b.

c.

d.

113. If Y(t) = X(t) - X(t - a) is a random process, X(t) is a WSS process and a > 0 is a constant, the PSD of Y(t) in terms of the corresponding quantities of X(t) is


a.

b.

c.

d.

114. A random process is given by Z(t) = A. X(t) + B.Y(t) where 'A' and 'B' are real

constants and X(t) and Y(t) are jointly WSS processes. The power spectrum SZZ(w) is

a. AB SXY (w) + AB SYX (w) b. A2 + B2, + AB SXY (w) + AB SYX (w) c. A2 SXX(w) + AB SXY (w) + AB SYX (w) + B2 SYY (w)

d. 0

115. A random process is given by Z(t) = A. X(t) + B.Y(t) where 'A' and 'B' are real constants and X(t) and Y(t) are jointly WSS processes. If X(t) and Y(t) are uncorrelated then SZZ (w)

a. A2 + B2

b. A2 SXX (w) + B2 SYY (w)

c. AB SXY(w) + AB SYX (w) d. 0

116. PSD is _________ function of frequency

a. even

b. odd

c. periodic

d. asymmetric

117. For a WSS process, the PSD at zero frequency gives

a. the area under the graph of power spectral density

b. area under the graph auto correlation of the process

c. mean of the process

d. variance of the process

118. The mean square value of WSS process equals

a. the area under the graph of PSD

b. the area under the graph of auto correlation of process

c. zero

d. mean of the process

119. X(T) = A cos (wot + θ), where A and wo are constants and θ is a R.V uniformly distributed over (0, π). The average power of X(t) is

a. θ

b.

c.

d.

120. A WSS process X(t) has an auto correlation function . The PSD is

a.


b.

c.

d.

121. The real part and imaginary part of SYX (w) is ________ and ___________ function of w respectively.

a. odd, odd

b. odd, even

c. even, odd

d. even, even

122. If cross correlation is the cross power spectrum is SXY (w) =

a.

b.

c.

d.

123. If X(t) and Y(t) are uncorrelated and of constant means E(X) and E(Y), respectively then SXY (w) is

a. E(X) E(Y) b. 2E(X) E(Y) δ (w) c. 2πE(X) E(Y) δ (w)

d.

124. The means of two independent, WSS processes X(t) and Y(t) are 2 and 3 respectively. Their cross spectral density is

a. 6 δ(w)
b. 12πδ(w)

c. 5πδ(w)
d. δ(w)
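A quick check for Q.124, using the relation listed as option (c) of Q.123:

S_{XY}(w) = 2\pi E[X] E[Y]\,\delta(w) = 2\pi (2)(3)\,\delta(w) = 12\pi\,\delta(w)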

125. The cross spectral density of two random processes X(t) and Y(t) is SXY (w) is

a.

b.

c.

d.


126. Time average of cross correlation function and the cross spectral density function from ________ pair

a. Laplace Transform

b. Z-Transform

c. Fourier Transform

d. Convolution

127. SYX (w) =

a. SXY (w) b. SXY (-w)

c. SYX (-w) d. - SYX (w)

128. If X(t) and Y(t) are orthogonal then

a. SXY (w) = 0

b. SXY (w) = 1

c. SXY (w) > 1

d. SXY (w) < 1

129. The cross power formula Pxy is given by

a.

b.

c.

d.

130. The average power PXY is

a.

b.

c.

d.

131. The cross spectral density of two random process X(t) and Y(t) is

SXY (w) = [expression missing] for -K < w < K

= 0 elsewhere where K > 0 The cross correlation function between the processes

a.

b.

c.

d.


132. If Then RXY (τ) is

a.

b.

c.

d.

133. X(t) is a random process with mean 'K' and auto correlation function RXX(τ)

= . the variance of the r.v. X(5) is

a. (Y + K)2

b. (Y - K)2

c. 16 - K2

d.

134. The cross spectral density of two random processes X(t) and Y(t)

is , the area enclosed by their auto correlation function is

a.

b. a

c. Ka

d.

135. If then KXY(τ) =

a.

b.

c.

d.

136. RXY (τ) in terms of SXY (w) is

a.

b.

c.


d.

137. SXY (w) in terms of RXY(τ) is

a.

b.

c.

d.

138. A random process is given by Z(t) = A.X(t) + B.Y(t) where 'A' and 'B' are real constants and X(t) and Y(t) are jointly WSS processes. The cross power spectrum SXZ(w) is

a. A SYX(w) + B SYY(w)
b. A SXX(w) + B SYY(w)
c. A SXX(w) + B SXY(w)

d. A SYY(w) + B SXY(w)

139. For the above problem SYZ(w) is

a. A SYX (w) + B SYY(w)

b. A SXX (w) + B SYY (w) c. A SXX (w) + B SXY (w) d. A SYY (w) + B SXX (w)

140. The auto correlation function of a WSS random process X(t)

is . The area enclosed by the PSD curve of X(t) is

a. 6

b. 2

c. 8

d. 4

141. If auto correlation function is R(τ) = then the power spectrum of random process is

a.

b.

c.

d.

142. If X(t) is an ergodic process with auto correlation function of the form

for


= 0 for The spectral density SXX(w) is

a.

b.

c.

d.

143. If for - 2πB ≤ w ≤ 2πB then RXX (τ) is =

a.

b.

c.

d.

144. The auto correlation function of a WSS process is R(τ) = , then its spectral density is S(w) =

a.

b.

c.

d.

145. The auto correlation function of a process with PSD of is

a.

b.

c.


d. 1

146. If Z(t) = X(t) + Y(t), where X(t) and Y(t) are two WSS processes that are uncorrelated and of zero mean, then RZZ(τ) =

a. RXY (τ) + RYX (τ) b. RXX (τ) + RYY(τ)

c. RXY (τ) - RYX (τ) d. RXX (τ) + RYY(τ)

147. For the above problem SZZ(w) is

a. Sxx (w) - Syy(w) b. Sxy (w) - Syx(w)

c. Sxx (w) + Syy(w) d. Sxy(w) -Syx(w)

148. The mean square value for the above problem is

a. 0

b. B

c. η

d.

149. The random process X(t) and Y(t) are having their auto correlation functions

as [expressions missing] respectively. If they are orthogonal processes, then the mean square value of X(t) + Y(t) is

a. 2

b. 3

c. 5

d. 6

150. The average power of a process is given by

a. E [X(t)] b. RXX (0)

c. E[X(t)]2

d. RXX (1)

151. White noise with two-sided PSD [value missing] is passed through an RC low-pass network with time constant τ = RC and thereafter through an ideal amplifier with voltage gain 10. The expression for the mean square value of the output noise is _______

a.

b.

c.

d.
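A sketch for Q.151, assuming the missing two-sided PSD value is N0/2 and writing τ = RC:

E[Y^2] = 10^2 \cdot \frac{1}{2\pi}\int_{-\infty}^{\infty} \frac{N_0/2}{1 + w^2\tau^2}\,dw = 100 \cdot \frac{N_0}{4\tau} = \frac{25 N_0}{\tau}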

152. A random process X(t) has [autocorrelation missing], where A and B are +ve constants. The mean value of the response of a system having an impulse response h(t) = e^(-Kt) for t > 0; = 0 for t < 0,

where K is a real positive constant, when X(t) is its input, is

a.

b. AK

c. AK2


d. A


153. A random process with PSD K w/Hz for is passed through a system

with transfer function H(f) = 2 for - B ≤ f ≤ B and zero elsewhere. The output power of the system is

a. BK

b. 2BK

c.

d. B2K

154. X(t) is a WSS process with zero mean and is the input of an LTI system

with . If , the area enclosed by the auto correlation function of output process is

a. 1

b.

c.

d.

155. If RXX(τ) = 3.δ(τ) and [transfer function missing], then the mean square value of Y(t) is

a. 3

b. 4

c.

d.

156. A system is said to be linear system if the system satisfies

a. principle of superposition

b. principle of homogeneity

c. principle of superposition and principle of homogeneity

d. Reciprocity principle

157. For an LTI system, the response y(t) , for any input x(t), with the known impulse response can be determined using ___________ integral.

a. convolution

b. Fourier c. Laplace

d. Fourier & Laplace

158. For an LTI system, its impulse response and transfer function form a _____ pair.

a. Laplace transform

b. Fourier transform

c. Z-transform
d. convolution

159. The cross correlation between X(t) and Y(t) is the RXY (τ) =

a. h(τ) * RXX (τ)

b. h(-τ) * RXX (τ) c. h(-τ) * RXY(τ) d. h(τ) * RYX(τ)


160. A random process X(t) of mean 3 is applied to a delay element. The mean of the o/p process is

a. 2

b. 3

c. 1.5

d. 9

161. Let X(t) be the i/p process and Y(t) be the o/p process of an LTI system with impulse response h1(t). Let Z(t) be the i/p process and V(t) be the o/p process of another LTI system with impulse response h2(t). Then SYV(w) =

a. H1(w) H2(w) SXZ(w) b. H1*(w) H2(w) SXZ(w)

c. H1(w) H2(w) SYZ(w) d. H1*(w) H2(w) SYZ(w)

162. For the above problem noise bandwidth is BN =

a.

b. 0.58K

c.

d. 0.62K

163. A Gaussian filter has the transfer function [expression missing]

= 0 elsewhere. Its 3 dB bandwidth is

a.

b. K(0.58) c. 0.58

d. K2(0.58)

164. If the input power spectral density of a system is and its output PSD is SYY(w) N0. then the transfer function of the system is

a.

b.

c.

d.

165. If X(t) is a WSS process with auto correlation function RXX(τ) and power

spectrum SXX(w). Let . Then

a.


b.

c.

d.

166. If SXX(w) is the power spectrum of the input process X(t) and |H(w)|² is the power transfer function of the system, then the average power PYY =

a.

b.

c.

d.

167. The power spectral density of Y(t) can be obtained by the formula SYY(w) =

a.

b.

c.

d.

168. SYX (w) =

a.

b.

c.

d.

169. The noise bandwidth BN of a practical system (whose T.F is H(w)) is

a.

b.


c.

d.

170. The noise bandwidth of an RC low pass filter is

a.

b.

c.

d.
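A worked note for Q.170. For the RC low-pass filter, |H(f)|² = 1/(1 + (2πfRC)²), so

B_N = \frac{1}{|H(0)|^2}\int_0^{\infty} |H(f)|^2\,df = \int_0^{\infty} \frac{df}{1 + (2\pi f RC)^2} = \frac{1}{4RC}\ \text{Hz} \quad \left(= \frac{\pi}{2RC}\ \text{rad/s}\right)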

171. Let N(t) be any band-limited wide-sense stationary real random process with

mean value of zero and a power density spectrum that satisfies

SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W

= 0 elsewhere, where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. If LP[.] denotes preserving only the low-pass part of the quantity within the brackets, then SXX(w) =

a. SNN(w - w0) + SNN(w + w0) b. LP[SNN(w - w0) + SNN(w + w0)

c. LP[SNN(w + w0) - SNN(w - w0) d. SNN(w + w0) - SNN(w - w0)

172. Let N(t) be any band-limited wide-sense stationary real random process with mean value of zero and a power density spectrum that satisfies

SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W

= 0 elsewhere

where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. Then SYX(w) is

a. = SXY (w) b. = - SXY (w)

c. = SXY (-w)

d. = -SXY (-w)

173. Let N(t) be any band-limited wide-sense stationary real random process with

mean value of zero and a power density spectrum that satisfies

SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W

= 0 elsewhere, where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. Then E[X²(t)] __________ E[Y²(t)]

a. >

b. ∠

c. ≤

d. =

174. Let N(t) be any band-limited wide-sense stationary real random process with mean value of zero and a power density spectrum that satisfies


SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W; = 0 elsewhere, where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. Then RXX(τ) =

a.

b.

c.

d.

175. Let N(t) be any band-limited wide-sense stationary real random process with mean value of zero and a power density spectrum that satisfies

SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W; = 0 elsewhere, where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. Then RXY(0) =

a. 0

b.

c. π

d. 1

176. If the power density spectrum of a random process has its significant

components clustered in a B.W W(rad/s) that does not include w = 0. Then it is ___________ process.

a. band-limited

b. band pass

c. narrow band

d. stationary

177. If the power spectrum of a band pass random process is zero outside some frequency band of width W(rad/s) that does not include w = 0, the process is called

a. band-limited

b. band pass

c. narrow band

d. stationary

178. A process is said to be narrow band if the frequency band of width W is _________ than the frequency near band-center.

a. equal b. much greater c. much lesser

d. twice

179. Let N(t) be any band-limited wide-sense stationary real random process with mean value of zero and a power density spectrum that satisfies

SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W

= 0 elsewhere, where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. Then X(t) and Y(t) are _______ processes

a. WSS

b. Non-Deterministic


c. Strict-sense stationary

d. Gaussian rending

180. Let N(t) be any band-limited wide-sense stationary real random process with mean value of zero and a power density spectrum that satisfies

SNN(w) ≠ 0 for 0 < w0 - W1 < |w| < w0 - W1 + W

= 0 elsewhere, where W1 and W are real positive constants. Consider that N(t) is represented by the random processes X(t) and Y(t) through the quadrature form N̂(t) = X(t) cos(w0t) - Y(t) sin(w0t), with E{[N(t) - N̂(t)]²} = 0. Then the means of X(t) and Y(t) are

a. 1, 1

b. 1, 0

c. 0, 1

d. 0, 0

181. A TV receiver has a 4 kΩ input resistance and operates in a frequency range of 54-56 MHz. At an ambient temperature of 27 °C, the RMS thermal noise voltage at the i/p of the receiver is

a. 25µV

b. 19.9µV

c. 14.8µV

d. 22µV

182. The RMS noise voltage across a 2 µF capacitor over the entire frequency band, when the capacitor is shunted by a 2 kΩ resistor maintained at 300 K, is

a. 0.454µV

b. 4.54µV

c. 45.4µV

d. 0.0454µV
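A numerical check for Q.182, using the standard result that the total mean-square noise voltage across a capacitor shunted by a resistor is kT/C, independent of R; 300 K is the stated temperature.

import math

k = 1.38e-23    # Boltzmann constant, J/K
T = 300.0       # resistor temperature, K
C = 2e-6        # capacitance, F
v_rms = math.sqrt(k * T / C)   # RMS noise voltage across the capacitor
print(v_rms)                   # ~4.55e-08 V, i.e. about 0.045 microvolt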

183. When noise is mixed with a sinusoid the amplitude and PSD of the resulting noise component becomes & of the original respectively

a. Same as that of original b. Half, half c. half, one-third

d. half, one-fourth

184. The available noise power per unit bandwidth at the input of an antenna with a noise temperature of 150K, feeding into a microwave amplifier with Te = 200K is

a. 483 x 10-23w

b. 4.83 x 10-23w

c. 48.3 x 10-23w

d. 483w

185. Two resistors with resistances R1 and R2 are connected in parallel and have physical temperatures T1 and T2 respectively. The effective noise temperature TS of an equivalent resistor with resistance equal to the parallel combination of R1 and R2 is

a.

b.

c.

d.
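A derivation sketch for Q.185 (it also settles Q.189 and Q.190). Equate the short-circuit noise-current spectral densities of the two resistors and of the single equivalent resistor R1R2/(R1 + R2):

\frac{4kT_1}{R_1} + \frac{4kT_2}{R_2} = \frac{4kT_S(R_1 + R_2)}{R_1 R_2} \;\Rightarrow\; T_S = \frac{T_1 R_2 + T_2 R_1}{R_1 + R_2}

so TS = T when T1 = T2 = T; for a series connection the same argument with voltage sources gives TS = (T1 R1 + T2 R2)/(R1 + R2).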

186. Let n(t) be the narrow band representation of noise, where n(t) = nc(t) cos wct - ns(t) sin wct, and let P1, P2, P3 be the powers of n(t), nc(t) and ns(t) respectively. Then

a. P1 = 2P2 = 3P3


b.

c. P1 = P2 = P3

d. 3P1 = 2P2 = P3

187. The equivalent noise temperature of the parallel combination of two resistors R1 = R2 = R operating at noise temperatures T1 and T2 respectively is

a. T1 + T2

b. (T1 + T2)2

c.

d. T1. T2

188. In defining the noise bandwidth of a real system, it is required that the noise power N1 passed by the ideal filter and the noise power N2 passed by the real filter should be related as

a. N1 = 2N2

b. N2 = 2N1

c. N1 + N2 = 1

d. N1 = N2

189. Two resistors with resistances R1 and R2 are connected in parallel and have physical temperatures T1 and T2 respectively. If T1 = T2 = T, what is TS?

a. T

b.

c.

d.

190. Two resistors with resistances R1 and R2 are connected in parallel and have physical temperatures T1 and T2 respectively. If the resistors are in series, TS =

a.

b.

c.

d.

191. An amplifier has three stages for which Te1 = 150 K (first stage), Te2 = 350 K and Te3 = 600 K (output stage). Available power gain of the first stage is 10 and the overall input effective noise temperature is 190 K. Then the available power gain of the second stage and the cascade's noise figure are respectively

a. 12, 1.655

b. 1.655, 12

c. 14,3.65

d. 3.65, 14
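A worked check for Q.191 using the cascade formula (T0 taken as 290 K):

T_e = T_{e1} + \frac{T_{e2}}{G_1} + \frac{T_{e3}}{G_1 G_2}: \quad 190 = 150 + \frac{350}{10} + \frac{600}{10\,G_2} \;\Rightarrow\; G_2 = 12
\bar{F} = 1 + \frac{T_e}{T_0} = 1 + \frac{190}{290} \approx 1.655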

192. In an amplifier, the first stage in a cascade of 5 stages has Te1 = 75 K and G1 = 0.5. Each succeeding stage has an effective input noise temperature and an available power gain that are each 1.75 times those of the stage preceding it. The cascade's effective input noise temperature is


a. 50.26 K

b. 500.26 K

c. 400.26 K

d. 550.26 K

193. The total available output noise power spectral density for a noisy two port network is Ga0 =

a. ga(f)(T0 + Te)

b.

c.

d.

194. The noise present at the input of a two-port network is 1 µW. The noise figure of the network is 0.5 dB and its gain is 10^10. The available noise power contributed by the two-port is

a. 1.22 KW

b. 12.2KW

c. 122KW

d. 1220KW

195. For the above problem the available output noise power is

a. 12.2 KW

b. 11.2KW

c. 122KW

d. 112 KW
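A numerical sketch covering Q.194 and Q.195, treating the stated 1 µW as the reference input noise level:

N_in = 1e-6               # input noise power, W
G = 1e10                  # available power gain
F = 10 ** (0.5 / 10)      # noise figure of 0.5 dB as a ratio (~1.122)
delta_N = (F - 1) * G * N_in   # noise added by the two-port, referred to its output
N_out = F * G * N_in           # total available output noise power
print(delta_N, N_out)          # ~1.22e3 W and ~1.12e4 W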

196. If ga(f) is the available gain of the network, then the noise figure in terms of Te (effective noise temperature) and T0 is

a.

b.

c.

d.

197. The average noise figure is =

a.

b.

c. always


d.

198. If the available gain ga(f), of the network is constant (=ga) over the range (f1, f2) then the average noise figure in terms of power is

a.

b.

c.

d.

199. The equivalent noise temperature of an amplifier with a noise figure of 0.2 dB at a temperature of 290 K is

a. 13.6 K

b. 15 K

c. 20 K

d. 14.8 K
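A worked check for Q.199, with T0 = 290 K:

T_e = T_0 (F - 1) = 290\,(10^{0.2/10} - 1) = 290\,(1.047 - 1) \approx 13.6\ \text{K}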

200. If cascading of 2 two-port networks is done, then the corresponding equivalent (effective) noise temperature of the cascade is obtained as Te = [expression missing], where Te1 and Te2 are the effective noise temperatures of two-port 1 and two-port 2 respectively, and [symbol missing] is the available gain of two-port 1.

a.

b.

c.

d.
