Transcript of M A R K O V C H A I N

Page 1: M A R K O V  C H A I N

1

A SEMINAR ON

MARKOV CHAIN AND PROCESS PRESENTED BY

PANKAJ A. DEOKATE SAGAR TIKKAS

GUIDED BY

MR. ASHISH GHORPADE   tqma2z.blogspot.com

SHRI SHIVAJI SHIKSHAN SANSTHA

DR. PANJBRAO DESHMUKH INST. OF MANAGEMENT TECHNOLOGY & RESEARCH, NAGPUR

Page 2: M A R K O V  C H A I N

2

Outline

1 Introduction

2 Brand switching example

3 Markov process
  3.1 Finite states
  3.2 First-order process
  3.3 Stationarity
  3.4 Uniform time period

4 Markov analysis: input and output
  - Problem

5 Application

6 References

Page 3: M A R K O V  C H A I N

3

Introduction

• Often we are interested in how a random variable changes over time.

• For example, we may want to know how the market share of a detergent or of any branded product changes year after year.

• The study of how random variables evolve over time is the study of stochastic processes.

• Here we focus on a particular type of stochastic process called a MARKOV CHAIN.

• Markov chains are used extensively, with applications in marketing, finance, accounting, production, education, and so on.

Page 4: M A R K O V  C H A I N

4

Brand switching example

In the given problem, the aim is to determine the behaviour of the system.

The situation is dynamic in the sense that it involves multiple periods (different points in time), requiring the customer to make a sequence of decisions in a chance environment with two or more possible outcomes at fixed intervals of time. In short, the given process may be described as a stochastic process.

Discrete-time stochastic process: let Xt be the value of the system characteristic at time t; it may be viewed as a random variable. A description of the relation between the random variables X0, X1, X2, and so on is called a discrete-time stochastic process. The probability of switching from one brand to another is called a transition probability.

[Diagram: brand this month (D1, D2, D3) switching to brand next month (D1, D2, D3)]
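As a rough illustration of the brand-switching diagram above, the following Python sketch simulates the discrete-time stochastic process X0, X1, X2, ... for three brands D1, D2, D3. The transition probabilities used here are made-up values, since the slide's diagram does not give numbers.

    import random

    # Hypothetical transition probabilities between brands D1, D2, D3;
    # row i gives P(brand next month | brand this month = i).
    brands = ["D1", "D2", "D3"]
    P = [
        [0.7, 0.2, 0.1],   # from D1
        [0.3, 0.5, 0.2],   # from D2
        [0.2, 0.3, 0.5],   # from D3
    ]

    def simulate(start, months):
        """Simulate the sequence X0, X1, ..., Xmonths of brand choices."""
        state = brands.index(start)
        path = [start]
        for _ in range(months):
            # The next brand depends only on the current brand.
            state = random.choices(range(3), weights=P[state])[0]
            path.append(brands[state])
        return path

    print(simulate("D1", months=6))   # e.g. ['D1', 'D1', 'D2', 'D3', 'D3', 'D1', 'D1']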

Page 5: M A R K O V  C H A I N

5

3. Markov process

It includes:

1. Finite states

- Absorbing states (states that, once entered, cannot be left)

In regard to the classification of states, it may further be noted that:

a. For two states i and j, a sequence of transitions that begins in i and ends in j is called a "path" from i to j.

b. A state j is said to be "reachable" from state i if there is a path from i to j (see the sketch after this list).

c. Communicating: states i and j communicate if each is reachable from the other.

d. Transient: a state is transient if, after leaving it, there is a positive probability of never returning to it.
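A minimal reachability check in Python, assuming a small made-up transition matrix P3 in which state 2 is absorbing:

    from collections import deque

    # Made-up 3-state transition matrix; state 2 only transitions to itself,
    # so it is an absorbing state.
    P3 = [
        [0.5, 0.5, 0.0],
        [0.0, 0.5, 0.5],
        [0.0, 0.0, 1.0],
    ]

    def reachable(P, i, j):
        """Return True if some path of positive-probability transitions leads from i to j."""
        seen, queue = {i}, deque([i])
        while queue:
            s = queue.popleft()
            if s == j:
                return True
            for t, p in enumerate(P[s]):
                if p > 0 and t not in seen:
                    seen.add(t)
                    queue.append(t)
        return False

    print(reachable(P3, 0, 2))   # True:  path 0 -> 1 -> 2
    print(reachable(P3, 2, 0))   # False: state 2 is absorbing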

2. First-order process

In this context, it may be mentioned that when the probability of the next event depends only upon the outcome of the last event (for example, when a customer's choice of brand in a given month depends only on the choice made in the previous month), the Markov process is termed a first-order Markov process. Similarly, a process in which the next event depends on the outcomes of the last two events is a second-order Markov process.
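A small sketch of the distinction, using a hypothetical purchase history: a first-order model conditions this month's choice on last month's brand only, while a second-order model conditions on the last two months.

    from collections import Counter, defaultdict

    # Hypothetical month-by-month purchase history of one customer.
    history = ["A", "A", "B", "A", "B", "B", "A", "A", "B", "A"]

    # First-order: count transitions conditioned on last month's brand.
    first_order = defaultdict(Counter)
    for prev, curr in zip(history, history[1:]):
        first_order[prev][curr] += 1

    # Second-order: count transitions conditioned on the last two months' brands.
    second_order = defaultdict(Counter)
    for p2, p1, curr in zip(history, history[1:], history[2:]):
        second_order[(p2, p1)][curr] += 1

    print(dict(first_order))    # counts for A->A, A->B, B->A, B->B
    print(dict(second_order))   # counts keyed by the last two brands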

Page 6: M A R K O V  C H A I N

6

3. Stationarity: the transition probabilities are constant over time.

4. Uniform time period:

The change from one state to another takes place only once during each time period, and the time periods are of equal duration.

Page 7: M A R K O V  C H A I N

7

MARKOV ANALYSIS: INPUT AND OUTPUT

In Markov analysis, the analysis of a given system is based on the following two sets of input data:

1. The transition matrix (containing the transition probabilities)

2. The initial conditions of the system

Based on these inputs, the analysis yields the state probabilities in future periods and in the steady state.

Problem 01: A market survey is made on two brands of breakfast food, A and B. Every time a customer makes a purchase, he may buy the same brand or switch to the other brand; the transition matrix is given below.

At present it is estimated that 60% of the people buy brand A and 40% buy brand B. Determine the market shares of brands A and B in the steady state.

FROM \ TO      A      B
    A         0.8    0.2
    B         0.6    0.4
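Solving problem 01 algebraically: in the steady state, pA = 0.8 pA + 0.6 pB with pA + pB = 1, which gives pA = 0.75 and pB = 0.25, i.e. brand A holds 75% of the market and brand B holds 25%. The Python sketch below checks this numerically by starting from the given initial shares (60% A, 40% B) and applying the transition matrix until the shares stop changing; the stopping tolerance and iteration cap are arbitrary choices.

    # Transition matrix of problem 01 (rows: current brand, columns: next brand).
    P = [[0.8, 0.2],   # from A: 80% stay with A, 20% switch to B
         [0.6, 0.4]]   # from B: 60% switch to A, 40% stay with B

    shares = [0.6, 0.4]            # initial market shares of A and B
    for _ in range(100):
        new = [shares[0] * P[0][0] + shares[1] * P[1][0],
               shares[0] * P[0][1] + shares[1] * P[1][1]]
        if max(abs(a - b) for a, b in zip(new, shares)) < 1e-12:
            break
        shares = new

    print(shares)   # approximately [0.75, 0.25]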

Page 8: M A R K O V  C H A I N

8

Application

1. The application of Markov chains in chemical engineering

2. The application of Markov chain analysis to oligonucleotide frequency prediction

3. The application of Markov chains in marketing for brand-product prediction

Page 9: M A R K O V  C H A I N

9

Conclusions

In this way, the study of Markov analysis shows that it is effective in applications in marketing, algorithmic, scientific, and other studies.

Whatever the results may be, Markov analysis rests on a predictive (probabilistic) basis.

Page 10: M A R K O V  C H A I N

10

References

[1] Business Statistics, G. C. Beri (TMH)

[2] Quantitative Techniques in Management, N. D. Vohra (TMH)

[3] Quantitative Methods for Business, Anderson (Thomson Learning Books)

[4] Statistical Methods, S. P. Gupta (S Chand)

[5] Statistics for Management, Levin Richard & Rubin David (Prentice Hall of India)

Page 11: M A R K O V  C H A I N

11