INC 551 Artificial Intelligence Lecture 8 Models of Uncertainty.

Page 1

INC 551 Artificial Intelligence

Lecture 8

Models of Uncertainty

Page 2

Inference by Enumeration

Page 3

Bayesian Belief Network Model

Causes -> Effect

Graph Structure shows dependency

Page 4

Burglar Alarm Example

My house has a burglar alarm, but sometimes it rings because of an earthquake. My neighbors, John and Mary, promise to call me if they hear the alarm. However, their ears are not perfect.

Page 5

One Way to Create BBN

Page 6

Computing Probability

P(j, m, a, ¬b, ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
= 0.90 × 0.70 × 0.001 × 0.999 × 0.998 ≈ 0.00062
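The chain-rule product above can be checked directly in code. A minimal sketch, assuming the standard burglar-alarm CPT values; the variable names are mine:

```python
# Joint probability of one full assignment in the burglar-alarm network,
# computed with the chain rule over the network's CPT entries.
p_not_b = 0.999               # P(~Burglary)
p_not_e = 0.998               # P(~Earthquake)
p_a_given_not_b_not_e = 0.001 # P(Alarm | ~Burglary, ~Earthquake)
p_j_given_a = 0.90            # P(JohnCalls | Alarm)
p_m_given_a = 0.70            # P(MaryCalls | Alarm)

# P(j, m, a, ~b, ~e) = P(j|a) P(m|a) P(a|~b,~e) P(~b) P(~e)
joint = p_j_given_a * p_m_given_a * p_a_given_not_b_not_e * p_not_b * p_not_e
print(joint)  # ~0.000628 (the slide rounds down to 0.00062)
```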

Page 7

BBN Construction

There are many ways to construct a BBN for a problem because the events depend on each other (are related).

Therefore, the structure depends on the order in which you consider the events.

The simplest structure is the most compact.

Page 8
Page 9
Page 10
Page 11

Not compact

Page 12

Inference Problem

Find P(X_i | E = e)

For example, find

P(Burglary | JohnCalls = true, MaryCalls = true)

Page 13

Inference by Enumeration

P(X | e) = α P(X, e) = α Σ_y P(X, e, y)

Note: let α = 1 / P(e)

α is called the "normalization constant"

y are the other (hidden) events

Page 14

P(Burglary | JohnCalls = true, MaryCalls = true)
= α Σ_e Σ_a P(B) P(e) P(a | B, e) P(j | a) P(m | a)

(summation over all hidden events)

Page 15

Calculation Tree

Inefficient: it recomputes P(j|a) P(m|a) for every value of e

Page 16

P(b | j, m) = α P(b, j, m)
= α × 0.001 × (0.002 × ((0.7 × 0.9 × 0.95) + (0.01 × 0.05 × 0.05))
             + 0.998 × ((0.7 × 0.9 × 0.94) + (0.01 × 0.05 × 0.06)))

Next, we have to find P(¬b | j, m) and find α = 1/P(j, m)
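The full enumeration can be carried out in a few lines of code. A minimal sketch, assuming the standard burglar-alarm CPTs; the dictionary and function names are mine:

```python
# Inference by enumeration: P(Burglary | JohnCalls=true, MaryCalls=true).
P_B = {True: 0.001, False: 0.999}                 # P(Burglary)
P_E = {True: 0.002, False: 0.998}                 # P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,   # P(Alarm=true | B, E)
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.90, False: 0.05}                   # P(JohnCalls=true | Alarm)
P_M = {True: 0.70, False: 0.01}                   # P(MaryCalls=true | Alarm)

def joint_jm(b):
    """Sum P(b, e, a, j=true, m=true) over the hidden variables e and a."""
    total = 0.0
    for e in (True, False):
        for a in (True, False):
            p_a = P_A[(b, e)] if a else 1 - P_A[(b, e)]
            total += P_B[b] * P_E[e] * p_a * P_J[a] * P_M[a]
    return total

num = joint_jm(True)
alpha = 1 / (joint_jm(True) + joint_jm(False))    # alpha = 1 / P(j, m)
print(round(alpha * num, 3))  # 0.284
```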

Page 17

Approximate Inference

Idea: Count from real examples

We call this procedure “Sampling”

Sampling = get real examples from the world model

Page 18

Sampling Example

Cloudy = clouds in the sky, Sprinkler = water spray

Page 19

(Cloudy, Sprinkler, Rain, WetGrass) = (T, ?, ?, ?)

Page 20

(Cloudy, Sprinkler, Rain, WetGrass) = (T, ?, ?, ?)

Page 21

(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, ?, ?)

Page 22

(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, ?)

Page 23

(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, ?)

Page 24

(Cloudy, Sprinkler, Rain, WetGrass) = (T, F, T, T)

1 Sample
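The page-by-page walkthrough above is prior (direct) sampling: sample each variable in topological order from its CPT. A minimal sketch, assuming the standard cloudy/sprinkler/rain/wet-grass CPT values; the function name is mine:

```python
import random

# Prior sampling of the sprinkler network, in the same order as the
# slides: Cloudy -> Sprinkler -> Rain -> WetGrass.
def prior_sample():
    c = random.random() < 0.5                    # P(Cloudy) = 0.5
    s = random.random() < (0.1 if c else 0.5)    # P(Sprinkler | Cloudy)
    r = random.random() < (0.8 if c else 0.2)    # P(Rain | Cloudy)
    p_w = 0.99 if (s and r) else 0.90 if (s or r) else 0.0
    w = random.random() < p_w                    # P(WetGrass | Sprinkler, Rain)
    return (c, s, r, w)

samples = [prior_sample() for _ in range(10000)]
```

Counting how often an event appears in `samples` approximates its probability.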

Page 25

Rejection Sampling

To find P(X_i | e):

Idea: count only the samples that agree with e

Page 26

Rejection Sampling

Drawback: there are not many samples that agree with e.

In the example above, of 100 samples only 27 are usable.
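Rejection sampling can be sketched as follows, for the query P(Rain | Sprinkler = true, WetGrass = true). A minimal sketch, assuming the standard sprinkler-network CPTs; the names are mine:

```python
import random

# Rejection sampling: draw prior samples, keep only those that agree
# with the evidence (Sprinkler=true, WetGrass=true), then count Rain.
def prior_sample():
    c = random.random() < 0.5
    s = random.random() < (0.1 if c else 0.5)
    r = random.random() < (0.8 if c else 0.2)
    p_w = 0.99 if (s and r) else 0.90 if (s or r) else 0.0
    w = random.random() < p_w
    return c, s, r, w

kept = [r for (c, s, r, w) in (prior_sample() for _ in range(100000))
        if s and w]                       # samples that agree with e
estimate = sum(kept) / len(kept)
print(estimate)  # roughly 0.32
```

Note how many samples are thrown away: only those with both evidence variables true survive, which is the drawback described above.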

Page 27

Likelihood Weighting

Idea: generate only samples that are consistent with e.

However, we must use "weighted sampling".

e.g. find P(Rain | Sprinkler = true, WetGrass = true)

Page 28

Fix Sprinkler = true, WetGrass = true

Weighted Sampling

Sample from P(Cloudy) = 0.5; suppose we get "true"

Page 29
Page 30
Page 31

Sprinkler already has the value true, therefore we multiply the weight by 0.1

Page 32

Sample from P(Rain | Cloudy = true) = 0.8; suppose we get "true"

Page 33

WetGrass already has the value true, therefore we multiply the weight by 0.99

Page 34

Finally, we get the sample (t, t, t, t) with weight 0.1 × 0.99 = 0.099
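The walkthrough above generalizes to the following sketch of likelihood weighting, assuming the standard sprinkler-network CPTs; the function name is mine:

```python
import random

# Likelihood weighting for P(Rain | Sprinkler=true, WetGrass=true):
# evidence variables are fixed, non-evidence variables are sampled, and
# each sample's weight is the product of the evidence likelihoods
# (0.1 * 0.99 = 0.099 in the slides' Cloudy=true, Rain=true walkthrough).
def weighted_sample():
    w = 1.0
    c = random.random() < 0.5                  # sample Cloudy
    w *= 0.1 if c else 0.5                     # Sprinkler fixed to true
    r = random.random() < (0.8 if c else 0.2)  # sample Rain
    w *= 0.99 if r else 0.90                   # WetGrass fixed to true
    return r, w

samples = [weighted_sample() for _ in range(100000)]
num = sum(w for r, w in samples if r)
den = sum(w for r, w in samples)
print(num / den)  # roughly 0.32
```

Because every sample is consistent with the evidence, none are rejected; the weights compensate for forcing the evidence values.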

Page 35

Temporal Model (Time)

When the events are tagged with timestamps

Rain(day 1) -> Rain(day 2) -> Rain(day 3)

Each node is considered a "state"

Page 36

Markov Process

Let X_t = state. For Markov processes, X_t depends only on a finite number of previous states.
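A first-order Markov process over the Rain states can be sketched as follows; the transition probabilities here are illustrative assumptions, not values from the lecture:

```python
import random

# First-order Markov process: tomorrow's Rain depends only on today's.
P_RAIN_GIVEN_RAIN = 0.7   # assumed P(Rain_t = true | Rain_{t-1} = true)
P_RAIN_GIVEN_DRY = 0.3    # assumed P(Rain_t = true | Rain_{t-1} = false)

def simulate(days, rain_today=True):
    """Simulate a sequence of Rain states, one per day."""
    states = [rain_today]
    for _ in range(days - 1):
        p = P_RAIN_GIVEN_RAIN if states[-1] else P_RAIN_GIVEN_DRY
        states.append(random.random() < p)
    return states

week = simulate(7)
```

Each new state is drawn using only the previous state, which is exactly the first-order Markov assumption.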

Page 37

Hidden Markov Model (HMM)

Each state has an observation, E_t.

We cannot see the "state", but we can see the "observation".