Hidden Markov Model - The Most Probable Path


HIDDEN MARKOV MODEL

Presented by: Vo Quang Tuyen, Le Quang Hoa, Truong Hoang Linh

Problem 2

MOST PROBABLE PATH

Markov Models

A Markov Model is specified by

The set of states S = {s_1, s_2, ..., s_Ns}, and characterized by

The prior probabilities:

Probabilities π_i = P(q_1 = s_i) of s_i being the first state of a state sequence. Collected in vector π.

The transition probabilities a_ij = P(q_{n+1} = s_j | q_n = s_i)

The probability of going from state s_i to state s_j. Collected in matrix A.

The Markov model produces

A state sequence Q = {q_1, ..., q_N}, q_n ∈ S, over time
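To make this concrete, here is a minimal Python sketch of a Markov model given as (π, A) and of the way it produces a state sequence. The weather states match the example used later in this deck, but the probability values are assumptions for illustration, not the slides' numbers.

```python
import numpy as np

# Hypothetical three-state weather Markov model (values are assumed).
states = ["sunny", "rainy", "foggy"]
pi = np.array([0.5, 0.25, 0.25])      # prior probabilities P(q_1 = s_i)
A = np.array([[0.8, 0.05, 0.15],      # a_ij = P(q_{n+1} = s_j | q_n = s_i)
              [0.2, 0.60, 0.20],
              [0.2, 0.30, 0.50]])     # each row sums to 1

def sample_state_sequence(pi, A, N, seed=0):
    """Draw a state sequence Q = {q_1, ..., q_N} from the model (pi, A)."""
    rng = np.random.default_rng(seed)
    q = [rng.choice(len(pi), p=pi)]                # q_1 ~ pi
    for _ in range(N - 1):
        q.append(rng.choice(len(pi), p=A[q[-1]]))  # q_{n+1} ~ row of A
    return q

print([states[i] for i in sample_state_sequence(pi, A, 5)])
```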

Hidden Markov Models

Additionally, for a Hidden Markov model we have

Emission probabilities:

For continuous-valued observations: a set of density functions b_i(x) = p(x_n = x | q_n = s_i).

For discrete observations:

b_{i,k} = P(x_n = v_k | q_n = s_i)

The probability of observing x_n = v_k if the system is in state s_i.

Collected in matrix B.

Observation sequence: X = {x_1, x_2, ..., x_N}

The HMM parameters (for a fixed number of states Ns) thus are Θ = (π, A, B).
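Continuing the sketch above, the discrete-observation case adds an emission matrix B, which completes Θ = (π, A, B). The two-symbol observation alphabet below (umbrella seen or not) and its probabilities are assumptions for illustration.

```python
# Hypothetical emission matrix for V = {umbrella, no_umbrella} (values assumed):
# b_{i,k} = P(x_n = v_k | q_n = s_i), one row per state, one column per symbol.
observations = ["umbrella", "no_umbrella"]
B = np.array([[0.1, 0.9],    # sunny
              [0.8, 0.2],    # rainy
              [0.3, 0.7]])   # foggy

theta = (pi, A, B)           # the full HMM parameter set
```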

Trellis Diagram

A trellis diagram can be used to visualize likelihood calculations of HMMs.

Trellis Example

Joint likelihood for an observed sequence X and a state sequence (path) Q:

P(X, Q | Θ) = π_{q_1} b_{q_1}(x_1) ∏_{n=2..N} a_{q_{n-1}, q_n} b_{q_n}(x_n)

Example for Trellis diagram:
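The joint likelihood multiplies one prior, one emission per step, and one transition per step along a single path through the trellis. Below is a direct Python transcription, reusing the assumed π, A, B from the sketches above (the index sequences are illustrative):

```python
def path_likelihood(pi, A, B, X, Q):
    """P(X, Q | Theta) along one fixed path Q, as in the formula above."""
    p = pi[Q[0]] * B[Q[0], X[0]]                 # prior and first emission
    for n in range(1, len(X)):
        p *= A[Q[n - 1], Q[n]] * B[Q[n], X[n]]   # transition, then emission
    return p

X = [0, 0, 1]   # e.g. umbrella, umbrella, no_umbrella
Q = [1, 1, 0]   # e.g. rainy, rainy, sunny
print(path_likelihood(pi, A, B, X, Q))
```

Evaluating this for every possible path and keeping the best one is exactly what the following problem asks for; the Viterbi algorithm does it efficiently.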

Hidden Markov Models Problem

Given the model Θ, what is the hidden state sequence Q that best explains an observation sequence X?

Q* = argmax_Q P(X, Q | Θ)

Viterbi Algorithm

for an HMM with Ns states.

1. Initialization:

δ_1(i) = π_i b_i(x_1), ψ_1(i) = 0,

where π_i is the prior probability of being in state s_i at time n = 1.

2. Recursion:

δ_n(j) = max_i [ δ_{n-1}(i) a_ij ] b_j(x_n), ψ_n(j) = argmax_i [ δ_{n-1}(i) a_ij ],

for n > 1 and all j = 1, ..., Ns.

Viterbi Algorithm

3. Termination:

P* = max_i δ_N(i), q_N* = argmax_i δ_N(i)

Find the best likelihood when the end of the observation sequence n = N is reached.

4. Backtracking of the optimal state sequence:

q_n* = ψ_{n+1}(q_{n+1}*), for n = N − 1, ..., 1

Read the best sequence of states from the ψ vectors.
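The four steps translate almost line for line into code. Here is a minimal NumPy sketch of the Viterbi algorithm for the discrete-emission case; delta and psi mirror the δ and ψ arrays above.

```python
import numpy as np

def viterbi(pi, A, B, X):
    """Most probable state path Q* = argmax_Q P(X, Q | Theta), discrete HMM."""
    Ns, N = len(pi), len(X)
    delta = np.zeros((N, Ns))           # delta[n, j]: best likelihood ending in j
    psi = np.zeros((N, Ns), dtype=int)  # psi[n, j]: best predecessor of j (psi[0] unused)

    # 1. Initialization: delta_1(i) = pi_i * b_i(x_1)
    delta[0] = pi * B[:, X[0]]

    # 2. Recursion: delta_n(j) = max_i [delta_{n-1}(i) a_ij] * b_j(x_n)
    for n in range(1, N):
        scores = delta[n - 1][:, None] * A      # scores[i, j] = delta_{n-1}(i) a_ij
        psi[n] = scores.argmax(axis=0)
        delta[n] = scores.max(axis=0) * B[:, X[n]]

    # 3. Termination: best final state and overall best likelihood
    q = [int(delta[N - 1].argmax())]
    p_star = float(delta[N - 1].max())

    # 4. Backtracking: q_n* = psi_{n+1}(q_{n+1}*)
    for n in range(N - 1, 0, -1):
        q.insert(0, int(psi[n][q[0]]))
    return q, p_star
```

For long sequences these products underflow, so practical implementations run the same recursion on log-probabilities, replacing products with sums and leaving the argmax unchanged.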

Viterbi Algorithm / Example

For our weather HMM, find the most probable hidden weather sequence for the given observation sequence of length N = 3.

Viterbi Algorithm / Example

1. Initialization (n = 1):

δ_1(i) = π_i b_i(x_1) for each of the three weather states.

Viterbi Algorithm / Example

2. Recursion (n=2):

We calculate the likelihood of getting to state "sunny" from all 3 possible predecessor states, and choose the most likely one to go on with.

The likelihood is stored in δ_2(sunny), the most likely predecessor in ψ_2(sunny). The same procedure is executed for the states "rainy" and "foggy"; the "sunny" step is written out below.
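Written out, the maximization for "sunny" at n = 2 instantiates the general recursion (a_{i,sunny} is the transition probability from state i to "sunny", and b_sunny(x_2) the emission probability of the second observation under "sunny"):

δ_2(sunny) = max{ δ_1(sunny) a_{sunny,sunny}, δ_1(rainy) a_{rainy,sunny}, δ_1(foggy) a_{foggy,sunny} } · b_sunny(x_2)

and ψ_2(sunny) records which of the three terms achieved the maximum.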

Viterbi Algorithm / Example

Recursion (n = 3):

Viterbi Algorithm / Example

3. Termination

The globally most likely path is determined by first finding the last state of the most likely sequence: q_3* = argmax_i δ_3(i).

4. Backtracking

The best sequence of states can be read from the ψ vectors:

n = N − 1 = 2: q_2* = ψ_3(q_3*)

n = N − 2 = 1: q_1* = ψ_2(q_2*)

Viterbi Algorithm / Example

The most likely weather sequence is then read off by backtracking through the ψ vectors.
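With the sketch implementation and the assumed parameters from above, the example runs end to end; the observation sequence here is illustrative, not necessarily the one on the slides.

```python
X = [0, 0, 1]   # assumed observations: umbrella, umbrella, no_umbrella
q_star, p_star = viterbi(pi, A, B, X)
print([states[i] for i in q_star], p_star)
# With these assumed numbers: ['rainy', 'rainy', 'sunny'], likelihood ≈ 0.0173
```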

HIDDEN MARKOV MODELS

ANY QUESTIONS?

REFERENCE

- Barbara Resch, "Hidden Markov Models: A Tutorial for the Course Computational Intelligence".

- Erhard Rank and Franz Pernkopf, "Hidden Markov Models", Speech Communication 2, SS 2004.