
Belief Networks

Done by: Amna Omar Abdullah, Fatima Mahmood Al-Hajjat, Najwa Ahmad Al-Zubi

Definition of Belief Networks:

A belief network is a directed acyclic graph (DAG) that gives a graphical representation of probabilistic structure. It consists of vertices (or nodes) connected by causal links, represented by arrows, which point from parent nodes (causes) to child nodes (effects).

Each node represents a random variable, and each directed edge represents an immediate dependence or direct influence.

How to design a Belief Network?

• Explore the causal relations

Example

Holmes and Watson have moved to IA. Holmes wakes up to find his lawn wet. He wonders if it has rained or if he left his sprinkler on. He looks at his neighbor Watson’s lawn and sees that it is wet too, so he concludes it must have rained.

(Network: Rain → Holmes’s lawn wet, Rain → Watson’s lawn wet, Sprinkler → Holmes’s lawn wet.)

Defining the causal components


Define the probabilities


Given W (Watson’s lawn is also wet), P(R) goes up and P(S) goes down: rain explains both wet lawns, so the sprinkler becomes less likely.
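A minimal numeric sketch of this “explaining away” effect, computed by brute-force enumeration of the joint distribution. Every CPT number below is an assumption made up for illustration; the slides give no numbers for this example:

from itertools import product

# Assumed CPTs: Rain (R), Sprinkler (S), Holmes's lawn wet (H), Watson's lawn wet (W).
P_R = 0.2                       # P(R=true): prior probability of rain
P_S = 0.1                       # P(S=true): prior probability the sprinkler was left on
P_W = {True: 0.9, False: 0.1}   # P(W=true | R): rain wets Watson's lawn
P_H = {(True, True): 1.0, (True, False): 0.9,    # P(H=true | R, S): rain or the
       (False, True): 0.9, (False, False): 0.05} # sprinkler wets Holmes's lawn

def joint(r, s, h, w):
    # P(R,S,H,W) = P(R) P(S) P(H|R,S) P(W|R): the belief-network factorization.
    p = (P_R if r else 1 - P_R) * (P_S if s else 1 - P_S)
    p *= P_H[(r, s)] if h else 1 - P_H[(r, s)]
    p *= P_W[r] if w else 1 - P_W[r]
    return p

def posterior(query, evidence):
    # P(query=true | evidence), summing the joint over all consistent worlds.
    num = den = 0.0
    for r, s, h, w in product([True, False], repeat=4):
        world = {"R": r, "S": s, "H": h, "W": w}
        if any(world[k] != v for k, v in evidence.items()):
            continue
        p = joint(r, s, h, w)
        den += p
        if world[query]:
            num += p
    return num / den

print(posterior("R", {"H": True}))             # ~0.63
print(posterior("S", {"H": True}))             # ~0.32
print(posterior("R", {"H": True, "W": True}))  # ~0.94: P(R) goes up
print(posterior("S", {"H": True, "W": True}))  # ~0.14: P(S) goes down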

Joint Probability Distribution:

A belief network is related to the joint probability distribution in that it:

1- represents the information in a more understandable and logical manner;

2- makes construction and interpretation much simpler.

The network does not fully specify the joint distribution; what it gives are the conditional dependencies in the distribution. On top of these direct dependence relationships, we need to give some numbers (or functions) that define how the variables relate to one another.

Joint distribution:

• P(X1, X2, …, Xn)

• A probability assignment to all combinations of values of the random variables; it provides complete information about the probabilities of those variables.

• A JPD table for n random variables, each ranging over k distinct values, has kⁿ entries!

           Toothache   ¬Toothache
Cavity        0.04        0.06
¬Cavity       0.01        0.89
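A short sketch of reading quantities straight off this joint table; the dictionary below just transcribes the four entries above:

# Full joint distribution over (Cavity, Toothache), transcribed from the table.
joint = {
    (True, True): 0.04,  (True, False): 0.06,   # Cavity row
    (False, True): 0.01, (False, False): 0.89,  # ¬Cavity row
}

# Marginalization: sum out the other variable.
p_cavity = sum(p for (c, t), p in joint.items() if c)     # 0.04 + 0.06 = 0.10
p_toothache = sum(p for (c, t), p in joint.items() if t)  # 0.04 + 0.01 = 0.05

# Conditioning: P(Cavity | Toothache) = P(Cavity, Toothache) / P(Toothache).
print(joint[(True, True)] / p_toothache)                  # 0.04 / 0.05 = 0.8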

Using the joint distribution

A joint probability distribution P(A,B,C,D,E) over a number of variables can be written in the following form using the chain rule of probability:

P(A,B,C,D,E) = P(A) P(B|A) P(C|B,A) P(D|C,B,A) P(E|D,C,B,A).

Probability theory

• Conditioning

• P(A) = P(A | B) P(B) + P(A | ¬B) P(¬B)

• A and B are independent iff

• P(A | B) = P(A)

• P(B | A) = P(B)

• A and B are conditionally independent given C iff

• P(A | B, C) = P(A | C)

• P(B | A, C) = P(B | C)

• Bayes’ Rule

• P(A | B) = P(B | A) P(A) / P(B)

• P(A | B, C) = P(B | A, C) P(A | C) / P(B | C)
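A quick numeric check of the conditioning and Bayes’ rule identities above; the values chosen for P(B) and P(A|B) are arbitrary assumptions:

# Arbitrary assumed inputs.
p_b = 0.3
p_a_given_b, p_a_given_not_b = 0.9, 0.2

# Conditioning: P(A) = P(A|B) P(B) + P(A|¬B) P(¬B).
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)   # 0.27 + 0.14 = 0.41

# Bayes' rule: P(B|A) = P(A|B) P(B) / P(A).
print(p_a_given_b * p_b / p_a)                          # 0.27 / 0.41 ≈ 0.66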

Cont’d

• Variables: V1, …, Vn

• Values: v1, …, vn

• P(V1=v1, V2=v2, …, Vn=vn) = Πᵢ P(Vi=vi | parents(Vi))

P(ABCD) = P(A=true, B=true, C=true, D=true)

(Network: A → C, B → C, C → D; CPTs: P(A), P(B), P(C|A,B), P(D|C).)


Cont’d

P(ABCD) = P(D|ABC) P(ABC)
        = P(D|C) P(ABC)              (A and B are independent of D given C)
        = P(D|C) P(C|AB) P(AB)
        = P(D|C) P(C|AB) P(A) P(B)   (A is independent of B)
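A minimal sketch of evaluating this factorization for the network above. The CPT numbers are assumptions chosen only to make the arithmetic concrete:

# Assumed CPTs for the network A → C ← B, C → D (all numbers illustrative).
p_a = 0.6                                        # P(A=true)
p_b = 0.3                                        # P(B=true)
p_c = {(True, True): 0.9, (True, False): 0.7,    # P(C=true | A, B)
       (False, True): 0.4, (False, False): 0.1}
p_d = {True: 0.8, False: 0.2}                    # P(D=true | C)

def joint(a, b, c, d):
    # P(A,B,C,D) = P(A) P(B) P(C|A,B) P(D|C)
    pa = p_a if a else 1 - p_a
    pb = p_b if b else 1 - p_b
    pc = p_c[(a, b)] if c else 1 - p_c[(a, b)]
    pd = p_d[c] if d else 1 - p_d[c]
    return pa * pb * pc * pd

# P(ABCD) = P(A=true, B=true, C=true, D=true)
print(joint(True, True, True, True))   # 0.6 * 0.3 * 0.9 * 0.8 = 0.1296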

Probability that Watson Crashes

(Network: Icy → Holmes Crash, Icy → Watson Crash; prior P(I) = 0.7.)

P(W) = P(W|I) P(I) + P(W|¬I) P(¬I)
     = 0.8*0.7 + 0.1*0.3
     = 0.56 + 0.03
     = 0.59

P(H|I):   I 0.8,   ¬I 0.1

P(W|I):   I 0.8,   ¬I 0.1
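The same computation in a few lines of Python, using exactly the numbers from the slide (only the variable names are mine):

p_i = 0.7                                 # P(I): prior probability the roads are icy
p_w_given_i, p_w_given_not_i = 0.8, 0.1   # P(W|I) and P(W|¬I) from the table

# Total probability: P(W) = P(W|I) P(I) + P(W|¬I) P(¬I).
p_w = p_w_given_i * p_i + p_w_given_not_i * (1 - p_i)
print(p_w)   # 0.56 + 0.03 = 0.59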

Cont’d


P(I|W) = P(W|I) P(I) / P(W)
       = 0.8*0.7 / 0.59
       = 0.56 / 0.59
       ≈ 0.95

We started with P(I) = 0.7; knowing that Watson crashed raised it to 0.95.
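And the corresponding Bayes’ rule update, again with the slide’s numbers and my own variable names:

p_i = 0.7
p_w_given_i, p_w_given_not_i = 0.8, 0.1
p_w = p_w_given_i * p_i + p_w_given_not_i * (1 - p_i)   # 0.59, as before

# Bayes' rule: P(I|W) = P(W|I) P(I) / P(W).
p_i_given_w = p_w_given_i * p_i / p_w
print(round(p_i_given_w, 2))   # 0.95: Watson's crash makes icy roads much more likely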

Thank you for listening. If you have any questions, don’t hesitate to ask Mr. Ashraf.