Transcript of INDR 343 Problem Session 1, 09.10.2014
http://home.ku.edu.tr/~indr343/

Page 1

Page 2

Consider the second version of the stock market model presented as an example. Whether the stock goes up tomorrow depends on whether it increased today and yesterday.

If the stock increased today and yesterday, it will increase tomorrow with probability α1. If the stock increased today and decreased yesterday, it will increase tomorrow with probability α2. If the stock decreased today and increased yesterday, it will increase tomorrow with probability α3. If the stock decreased today and yesterday, it will increase tomorrow with probability α4.

16.2-3

Page 3

(a) Construct the (one-step) transition matrix of the Markov chain.

(b) Explain why the states used for this Markov chain cause the mathematical definition of the Markovian property to hold even though what happens in the future (tomorrow) depends upon what happened in the past (yesterday) as well as the present (today).

16.2-3
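As a sketch of part (a): if the state records the outcomes of (today, yesterday), the one-step transition matrix can be written down directly. The numeric values assigned to α1, ..., α4 below are placeholder assumptions; the problem leaves them symbolic.

```python
# States track the last two days: 0 = (up, up), 1 = (up, down),
# 2 = (down, up), 3 = (down, down).  The alpha values are
# placeholder assumptions; the problem keeps them symbolic.
a1, a2, a3, a4 = 0.9, 0.6, 0.5, 0.3

# From (up, up): the stock rises w.p. a1 -> new state (up, up);
# otherwise it falls -> new state (down, up), and so on for each row.
P = [
    [a1,  0.0, 1 - a1, 0.0],   # from (up, up)
    [a2,  0.0, 1 - a2, 0.0],   # from (up, down)
    [0.0, a3,  0.0, 1 - a3],   # from (down, up)
    [0.0, a4,  0.0, 1 - a4],   # from (down, down)
]

for row in P:
    assert abs(sum(row) - 1.0) < 1e-12  # each row is a distribution
```

The point of part (b) is visible in the construction: by folding yesterday's outcome into the state, tomorrow's distribution depends only on the current state.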

Page 4

Suppose that a communications network transmits binary digits, 0 or 1, where each digit is transmitted 10 times in succession.

During each transmission, the probability is 0.99 that the digit entered will be transmitted accurately. In other words, the probability is 0.01 that the digit being transmitted will be recorded with the opposite value at the end of the transmission.

For each transmission after the first one, the digit entered for transmission is the one that was recorded at the end of the preceding transmission. If X0 denotes the binary digit entering the system, X1 the binary digit recorded after the first transmission, X2 the binary digit recorded after the second transmission, . . . , then {Xn} is a Markov chain.

16.3-2

Page 5

(a) Construct the (one-step) transition matrix.

(b) Use your OR Courseware to find the 10-step transition matrix P(10). Use this result to identify the probability that a digit entering the network will be recorded accurately after the last transmission.

16.3-2
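A sketch of parts (a) and (b) without the OR Courseware: build the two-state matrix and raise it to the 10th power by repeated multiplication in pure Python.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(A, m):
    """Raise A to the m-th power by repeated multiplication."""
    result = A
    for _ in range(m - 1):
        result = matmul(result, A)
    return result

# One-step matrix: each transmission is accurate w.p. 0.99.
P = [[0.99, 0.01],
     [0.01, 0.99]]

P10 = matpow(P, 10)
# P10[0][0] is the probability that a digit entering the network
# is recorded accurately after the tenth (last) transmission.
print(round(P10[0][0], 4))  # 0.9085
```

The answer can be cross-checked in closed form: for this symmetric matrix, the n-step accuracy is 0.5 + 0.5 (0.98)^n, which gives about 0.9085 for n = 10.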

Page 6

An urn always contains 2 balls. Ball colors are red and blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.8 is the same color, and with probability 0.2 is the opposite color, as the ball it replaces. If initially both balls are red, find the probability that the fifth ball selected is red.

Ex: Unconditional Probabilities
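One way to set this up: let the state be the number of red balls in the urn (0, 1, or 2). The fifth ball is selected after four replacement stages, so the answer is the expected fraction of red balls under the distribution after four steps, starting from state 2.

```python
# State = number of red balls in the urn (0, 1, or 2).
# From state 1: the drawn ball is red or blue w.p. 1/2 each, and the
# replacement flips its color w.p. 0.2, so the chain moves to 0 or 2
# w.p. 0.5 * 0.2 = 0.1 each and stays at 1 w.p. 0.8.
P = [[0.8, 0.2, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.2, 0.8]]

# Initially both balls are red: distribution concentrated on state 2.
dist = [0.0, 0.0, 1.0]
for _ in range(4):  # four replacement stages before the fifth draw
    dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

# P(fifth ball selected is red) = E[number of red balls] / 2.
p_red = sum(dist[k] * k for k in range(3)) / 2
print(round(p_red, 4))  # 0.7048
```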

Page 7

Classification of States

Accessible: it is possible to go from state i to state j (a path exists in the network from i to j).

[Diagram: a birth-death-type chain on states 0, 1, 2, 3, 4, ..., 10, with upward transition probabilities a0, a1, a2, a3, ... and downward transition probabilities d1, d2, d3, d4, ...]

Two states communicate if each is accessible from the other. A Markov chain is irreducible if all of its states communicate.

State i is recurrent if, after leaving it, the process is certain to return to it at some time in the future.

If a state is not recurrent, it is transient.
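Accessibility can be checked mechanically from the transition matrix: j is accessible from i exactly when a breadth-first search along positive-probability transitions reaches j. A small sketch (the 3-state matrix here is an illustrative assumption, not one of the slide examples):

```python
from collections import deque

def accessible(P, i, j):
    """True if state j is reachable from state i along transitions
    with positive one-step probability (j = i counts as reachable)."""
    seen, queue = {i}, deque([i])
    while queue:
        s = queue.popleft()
        if s == j:
            return True
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                queue.append(t)
    return False

def communicate(P, i, j):
    """Two states communicate if each is accessible from the other."""
    return accessible(P, i, j) and accessible(P, j, i)

# Illustrative 3-state chain: 0 and 1 communicate; 2 is absorbing,
# so 2 is accessible from 1 but not vice versa.
P = [[0.5, 0.5, 0.0],
     [0.4, 0.4, 0.2],
     [0.0, 0.0, 1.0]]
print(communicate(P, 0, 1), communicate(P, 1, 2))  # True False
```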

Page 8

Classification of States (continued)

A state is periodic if it can return to itself only after a number of transitions that is a multiple of some fixed integer greater than 1; that integer (the greatest common divisor of all possible return times) is the state's period.

A state that is not periodic is aperiodic.

[Diagram a: a three-state cycle 0 -> 1 -> 2 -> 0, each arc with probability 1. Each state is visited every 3 iterations.]

[Diagram b: a four-state chain on states 0, 1, 2, 4 with arc probabilities 1, 1, 0.5, 0.5, and 1. Each state is visited in multiples of 3 iterations.]
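The period of a state is the greatest common divisor of the lengths of all paths returning to it. For small chains it can be computed directly from the set of feasible return lengths; the search-depth cap below is a heuristic assumption that suffices for small examples.

```python
from math import gcd

def period(P, state, max_len=30):
    """gcd of all return-path lengths to `state` up to max_len
    (max_len is a heuristic cap, large enough for small chains)."""
    d = 0
    frontier = {state}  # states reachable in exactly n steps
    for n in range(1, max_len + 1):
        frontier = {t for s in frontier
                    for t, p in enumerate(P[s]) if p > 0}
        if state in frontier:
            d = gcd(d, n)
    return d

# Diagram a: the cycle 0 -> 1 -> 2 -> 0; every return takes 3 steps.
cycle = [[0, 1, 0],
         [0, 0, 1],
         [1, 0, 0]]
print(period(cycle, 0))  # 3

# A self-loop allows a return of length 1: period 1, i.e. aperiodic.
loopy = [[0.5, 0.5],
         [1.0, 0.0]]
print(period(loopy, 0))  # 1
```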

Page 9

Classification of States (continued)

An absorbing state is one that locks in the system once it is entered: the process remains there with probability 1.

[Diagram: a chain on states 0, 1, 2, 3, 4 with winning transitions a1, a2, a3 and losing transitions d1, d2, d3.]

This diagram might represent the wealth of a gambler who begins with $2 and makes a series of wagers for $1 each.

Let ai be the event of winning in state i and di the event of losing in state i.

There are two absorbing states: 0 and 4.
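The gambler's chain can be written out explicitly. The win probability p below is an assumption (the slide leaves the ai and di unspecified); the absorbing states are exactly those with a one-step probability of 1 on the diagonal.

```python
p = 0.5  # assumed probability of winning each $1 wager

# States 0..4 = gambler's wealth in dollars; play stops at $0 or $4.
P = [[1.0,   0.0,   0.0,   0.0,   0.0],   # ruined: absorbing
     [1 - p, 0.0,   p,     0.0,   0.0],
     [0.0,   1 - p, 0.0,   p,     0.0],
     [0.0,   0.0,   1 - p, 0.0,   p],
     [0.0,   0.0,   0.0,   0.0,   1.0]]   # goal reached: absorbing

absorbing = [i for i, row in enumerate(P) if row[i] == 1.0]
print(absorbing)  # [0, 4]
```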

Page 10

Classification of States (continued)

Class: set of states that communicate with each other.

A class is either all recurrent or all transient.

State i is ergodic if it is recurrent and aperiodic.

A Markov chain is ergodic if all of its states are ergodic.

[Diagram: a chain on states 0 through 6.]

Page 11

Illustration of Concepts

[Diagram: four states 0, 1, 2, 3.]

One-step accessibility (X = positive transition probability):

State    0  1  2  3
  0      0  X  0  X
  1      X  0  0  0
  2      X  0  0  0
  3      0  0  X  X

Example 1

Every pair of states communicates, forming a single recurrent class; moreover, the states are not periodic (state 3 can return to itself in one step, so its period is 1).

Thus the stochastic process is aperiodic and irreducible.

Page 12

Example 2

[Diagram: five states 0, 1, 2, 3, 4.]

One-step accessibility (X = positive transition probability):

State    0  1  2  3  4
  0      X  X  0  0  0
  1      X  X  0  0  0
  2      0  0  X  0  0
  3      0  0  X  X  0
  4      X  X  0  0  X

States 0 and 1 communicate and form a recurrent class.

States 3 and 4 form separate transient classes.

State 2 is an absorbing state and forms a recurrent class.

Illustration of Concepts

Page 13

Example 3

[Diagram: four states 0, 1, 2, 3.]

One-step accessibility (X = positive transition probability):

State    0  1  2  3
  0      0  0  0  X
  1      X  0  0  0
  2      X  0  0  0
  3      0  X  X  0

Every state communicates with every other state, so we have an irreducible stochastic process.

Periodic? Yes: every return path (for example, 0 -> 3 -> 1 -> 0) has length a multiple of 3, so each state has period 3. The Markov chain is irreducible and periodic.

Illustration of Concepts

Page 14

Example

Classification of States

P (rows and columns indexed by states 1 through 5):

          1    2    3    4    5
  1      0.4  0.6   0    0    0
  2      0.5  0.5   0    0    0
  3       0    0   0.3  0.7   0
  4       0    0   0.5  0.4  0.1
  5       0    0    0   0.8  0.2

[Diagram: the corresponding transition graph on states 1 through 5.]
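The class structure of an example like this can be verified programmatically: group states into communicating classes by mutual reachability, then call a class recurrent when no transition leaves it (a criterion valid for finite chains). The matrix below is the five-state matrix as reconstructed from this slide.

```python
def reach(P, i):
    """Set of states reachable from i (including i itself)."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def classes(P):
    """Partition states into communicating classes, each tagged
    'recurrent' or 'transient' (finite-chain criterion: a class
    is recurrent iff it is closed)."""
    n = len(P)
    R = [reach(P, i) for i in range(n)]
    out, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in R[i] if i in R[j]}  # mutual reachability
        seen |= cls
        closed = all(R[j] <= cls for j in cls)
        out.append((sorted(cls), 'recurrent' if closed else 'transient'))
    return out

# Transition matrix from the slide; index 0 corresponds to state 1.
P = [[0.4, 0.6, 0.0, 0.0, 0.0],
     [0.5, 0.5, 0.0, 0.0, 0.0],
     [0.0, 0.0, 0.3, 0.7, 0.0],
     [0.0, 0.0, 0.5, 0.4, 0.1],
     [0.0, 0.0, 0.0, 0.8, 0.2]]

for members, kind in classes(P):
    print([m + 1 for m in members], kind)  # {1,2} and {3,4,5}
```

Both classes are closed here, so the chain splits into two recurrent classes and is not irreducible.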

Page 15

Given each of the following (one-step) transition matrices of a Markov chain, determine the classes of the Markov chain and whether they are recurrent.

16.4-2

Page 16

16.4-3

Page 17

Consider the Markov chain that has the following (one-step) transition matrix.

(a) Determine the classes of this Markov chain and, for each class, determine whether it is recurrent or transient.

(b) Determine the period of each state.

16.4-5