Bayes’ Nets: Inference · 2018. 2. 21.

CSE 473: Artificial Intelligence — Bayes’ Nets: Inference. Luke Zettlemoyer, University of Washington. [These slides were created by Dan Klein and Pieter Abbeel for CS188 Intro to AI at UC Berkeley. All CS188 materials are available at http://ai.berkeley.edu.]


Bayes’ Net Representation

§  A directed, acyclic graph, one node per random variable
§  A conditional probability table (CPT) for each node
  §  A collection of distributions over X, one for each combination of parents’ values

§  Bayes’ nets implicitly encode joint distributions
  §  As a product of local conditional distributions
  §  To see what probability a BN gives to a full assignment, multiply all the relevant conditionals together:

P(x₁, …, xₙ) = ∏ᵢ P(xᵢ | parents(Xᵢ))

Example: Alarm Network

Graph: Burglary → Alarm ← Earthquake; Alarm → John calls; Alarm → Mary calls

B   P(B)
+b  0.001
-b  0.999

E   P(E)
+e  0.002
-e  0.998

B   E   A   P(A|B,E)
+b  +e  +a  0.95
+b  +e  -a  0.05
+b  -e  +a  0.94
+b  -e  -a  0.06
-b  +e  +a  0.29
-b  +e  -a  0.71
-b  -e  +a  0.001
-b  -e  -a  0.999

A   J   P(J|A)
+a  +j  0.9
+a  -j  0.1
-a  +j  0.05
-a  -j  0.95

A   M   P(M|A)
+a  +m  0.7
+a  -m  0.3
-a  +m  0.01
-a  -m  0.99

[Demo: BN Applet]
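To see the product rule in action on this network, one can multiply the relevant CPT entries for a full assignment. A minimal sketch in Python, with the CPT values taken from the tables above:

```python
# P(+b, -e, +a, +j, +m) = P(+b) P(-e) P(+a | +b, -e) P(+j | +a) P(+m | +a)
P_b = {'+b': 0.001, '-b': 0.999}
P_e = {'+e': 0.002, '-e': 0.998}
P_a = {('+b', '+e'): 0.95, ('+b', '-e'): 0.94,
       ('-b', '+e'): 0.29, ('-b', '-e'): 0.001}  # P(+a | B, E)
P_j = {'+a': 0.9, '-a': 0.05}                    # P(+j | A)
P_m = {'+a': 0.7, '-a': 0.01}                    # P(+m | A)

p = P_b['+b'] * P_e['-e'] * P_a[('+b', '-e')] * P_j['+a'] * P_m['+a']
print(p)  # 0.001 * 0.998 * 0.94 * 0.9 * 0.7 ≈ 0.000591
```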



Bayes’ Nets

§  Representation
§  Conditional Independences
§  Probabilistic Inference
  §  Enumeration (exact, exponential complexity)
  §  Variable elimination (exact, worst-case exponential complexity, often better)
  §  Inference is NP-complete
  §  Sampling (approximate)
§  Learning Bayes’ Nets from Data

Inference

§  Inference: calculating some useful quantity from a joint probability distribution

§  Examples:
  §  Posterior probability: P(Q | E₁ = e₁, …, Eₖ = eₖ)
  §  Most likely explanation: argmax_q P(Q = q | E₁ = e₁, …)

Inference by Enumeration

§  General case:
  §  Evidence variables: E₁ … Eₖ = e₁ … eₖ
  §  Query* variable: Q
  §  Hidden variables: H₁ … Hᵣ
  (together, these are all the variables)

§  We want: P(Q | e₁ … eₖ)

§  Step 1: Select the entries consistent with the evidence
§  Step 2: Sum out H to get the joint of the query and the evidence:
  P(Q, e₁ … eₖ) = Σ_{h₁…hᵣ} P(Q, h₁ … hᵣ, e₁ … eₖ)
§  Step 3: Normalize: multiply by 1/Z, where Z = Σ_q P(q, e₁ … eₖ)

*Works fine with multiple query variables, too
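The three steps can be sketched directly on an explicit joint table. A minimal illustration in Python — the toy joint over Q, H, E below is made up for the example, not from the slides:

```python
# Joint P(Q, H, E) as a dict: assignment tuple -> probability (sums to 1)
joint = {
    ('+q', '+h', '+e'): 0.12, ('+q', '+h', '-e'): 0.18,
    ('+q', '-h', '+e'): 0.08, ('+q', '-h', '-e'): 0.02,
    ('-q', '+h', '+e'): 0.20, ('-q', '+h', '-e'): 0.25,
    ('-q', '-h', '+e'): 0.10, ('-q', '-h', '-e'): 0.05,
}

def enumerate_query(joint, evidence='+e'):
    # Step 1: select the entries consistent with the evidence
    selected = {(q, h): p for (q, h, e), p in joint.items() if e == evidence}
    # Step 2: sum out the hidden variable H to get P(Q, e)
    summed = {}
    for (q, h), p in selected.items():
        summed[q] = summed.get(q, 0.0) + p
    # Step 3: normalize by Z = sum_q P(q, e)
    z = sum(summed.values())
    return {q: p / z for q, p in summed.items()}

print(enumerate_query(joint))  # P(Q | +e) = {'+q': 0.4, '-q': 0.6}
```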

Inference by Enumeration in Bayes’ Net

§  Given unlimited time, inference in BNs is easy

§  Reminder of inference by enumeration by example (alarm network: B → A ← E, A → J, A → M):

P(B | +j, +m) ∝_B P(B, +j, +m)
  = Σ_{e,a} P(B, e, a, +j, +m)
  = Σ_{e,a} P(B) P(e) P(a|B,e) P(+j|a) P(+m|a)
  = P(B)P(+e)P(+a|B,+e)P(+j|+a)P(+m|+a) + P(B)P(+e)P(−a|B,+e)P(+j|−a)P(+m|−a)
  + P(B)P(−e)P(+a|B,−e)P(+j|+a)P(+m|+a) + P(B)P(−e)P(−a|B,−e)P(+j|−a)P(+m|−a)
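The sum above can be evaluated mechanically from the CPTs on the alarm-network slide. A small sketch:

```python
from itertools import product

P_b = {'+b': 0.001, '-b': 0.999}
P_e = {'+e': 0.002, '-e': 0.998}
P_a_given = {('+b', '+e'): 0.95, ('+b', '-e'): 0.94,
             ('-b', '+e'): 0.29, ('-b', '-e'): 0.001}   # P(+a | B, E)
P_j_given = {'+a': 0.9, '-a': 0.05}                     # P(+j | A)
P_m_given = {'+a': 0.7, '-a': 0.01}                     # P(+m | A)

def p_a(a, b, e):
    # P(a | b, e) for either value of a
    p = P_a_given[(b, e)]
    return p if a == '+a' else 1.0 - p

# Unnormalized P(B, +j, +m): sum over e, a of the full product of conditionals
unnorm = {}
for b in ('+b', '-b'):
    total = 0.0
    for e, a in product(('+e', '-e'), ('+a', '-a')):
        total += P_b[b] * P_e[e] * p_a(a, b, e) * P_j_given[a] * P_m_given[a]
    unnorm[b] = total

z = sum(unnorm.values())
posterior = {b: p / z for b, p in unnorm.items()}
print(posterior)  # P(+b | +j, +m) ≈ 0.284
```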

Inference by Enumeration?

P(Antilock | observed variables) = ?

Inference by Enumeration vs. Variable Elimination

§  Why is inference by enumeration so slow?
  §  You join up the whole joint distribution before you sum out the hidden variables

§  Idea: interleave joining and marginalizing!
  §  Called “Variable Elimination”
  §  Still NP-hard, but usually much faster than inference by enumeration

§  First we’ll need some new notation: factors

Factor Zoo

Factor Zoo I

§  Joint distribution: P(X, Y)
  §  Entries P(x, y) for all x, y
  §  Sums to 1

T     W     P
hot   sun   0.4
hot   rain  0.1
cold  sun   0.2
cold  rain  0.3

§  Selected joint: P(x, Y)
  §  A slice of the joint distribution
  §  Entries P(x, y) for fixed x, all y
  §  Sums to P(x)

T     W     P
cold  sun   0.2
cold  rain  0.3

§  Number of capitals = dimensionality of the table

Factor Zoo II

§  Family of conditionals: P(X | Y)
  §  Multiple conditionals
  §  Entries P(x | y) for all x, y
  §  Sums to |Y|

T     W     P
hot   sun   0.8
hot   rain  0.2
cold  sun   0.4
cold  rain  0.6

§  Single conditional: P(Y | x)
  §  Entries P(y | x) for fixed x, all y
  §  Sums to 1

T     W     P
cold  sun   0.4
cold  rain  0.6

Factor Zoo III

§  Specified family: P(y | X)
  §  Entries P(y | x) for fixed y, but for all x
  §  Sums to … who knows!

T     W     P
hot   rain  0.2
cold  rain  0.6

Factor Zoo Summary

§  In general, when we write P(Y₁ … Yₙ | X₁ … Xₘ)
§  It is a “factor,” a multi-dimensional array
§  Its values are P(y₁ … yₙ | x₁ … xₘ)
§  Any assigned (= lower-case) X or Y is a dimension missing (selected) from the array

Example: Traffic Domain

§  Random Variables
  §  R: Raining
  §  T: Traffic
  §  L: Late for class!

Graph: R → T → L

P(R):
+r  0.1
-r  0.9

P(T|R):
+r  +t  0.8
+r  -t  0.2
-r  +t  0.1
-r  -t  0.9

P(L|T):
+t  +l  0.3
+t  -l  0.7
-t  +l  0.1
-t  -l  0.9

P(L) = ?
  = Σ_{r,t} P(r, t, L)
  = Σ_{r,t} P(r) P(t|r) P(L|t)

Variable Elimination (VE)

Inference by Enumeration: Procedural Outline

§  Track objects called factors
§  Initial factors are local CPTs (one per node):

+r  0.1
-r  0.9

+r  +t  0.8
+r  -t  0.2
-r  +t  0.1
-r  -t  0.9

+t  +l  0.3
+t  -l  0.7
-t  +l  0.1
-t  -l  0.9

§  Any known values are selected
  §  E.g. if we know L = +l, the initial factors are P(R), P(T|R), P(+l|T):

+r  0.1
-r  0.9

+r  +t  0.8
+r  -t  0.2
-r  +t  0.1
-r  -t  0.9

+t  +l  0.3
-t  +l  0.1

§  Procedure: Join all factors, then eliminate all hidden variables

Operation 1: Join Factors

§  First basic operation: joining factors
§  Combining factors:
  §  Just like a database join
  §  Get all factors over the joining variable
  §  Build a new factor over the union of the variables involved

§  Example: Join on R — P(R) × P(T|R) → P(R,T)

§  Computation for each entry: pointwise products

+r  0.1
-r  0.9

+r  +t  0.8
+r  -t  0.2
-r  +t  0.1
-r  -t  0.9

→

+r  +t  0.08
+r  -t  0.02
-r  +t  0.09
-r  -t  0.81
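Treating a factor as a table from assignments to numbers, the join is a pointwise product over the union of the variables. A minimal sketch of this representation and operation:

```python
def join(f1, f2):
    """Pointwise product of two factors.
    Each factor is (variables, table), where table maps an
    assignment tuple (one value per variable) to a number."""
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    out = {}
    for a1, p1 in t1.items():
        row1 = dict(zip(vars1, a1))
        for a2, p2 in t2.items():
            row2 = dict(zip(vars2, a2))
            # rows must agree on shared variables (the database-join condition)
            if all(row1[v] == row2[v] for v in vars1 if v in vars2):
                merged = {**row1, **row2}
                out[tuple(merged[v] for v in out_vars)] = p1 * p2
    return out_vars, out

P_R = (['R'], {('+r',): 0.1, ('-r',): 0.9})
P_T_given_R = (['R', 'T'], {('+r', '+t'): 0.8, ('+r', '-t'): 0.2,
                            ('-r', '+t'): 0.1, ('-r', '-t'): 0.9})
vars_RT, P_RT = join(P_R, P_T_given_R)
print(P_RT)  # ('+r','+t'): 0.08, ('+r','-t'): 0.02, ('-r','+t'): 0.09, ('-r','-t'): 0.81
```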

Example: Multiple Joins

Join R: P(R) × P(T|R) → P(R,T), with P(L|T) unchanged:

+r  +t  0.08
+r  -t  0.02
-r  +t  0.09
-r  -t  0.81

Join T: P(R,T) × P(L|T) → P(R,T,L):

+r  +t  +l  0.024
+r  +t  -l  0.056
+r  -t  +l  0.002
+r  -t  -l  0.018
-r  +t  +l  0.027
-r  +t  -l  0.063
-r  -t  +l  0.081
-r  -t  -l  0.729

Operation 2: Eliminate

§  Second basic operation: marginalization
§  Take a factor and sum out a variable
  §  Shrinks a factor to a smaller one
  §  A projection operation

§  Example: sum out R from P(R,T):

+r  +t  0.08
+r  -t  0.02
-r  +t  0.09
-r  -t  0.81

→

+t  0.17
-t  0.83
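Summing out is the complementary operation. A sketch in the same factor representation (variables list plus assignment table):

```python
def eliminate(factor, var):
    """Sum out `var` from a factor (variables, table)."""
    variables, table = factor
    i = variables.index(var)
    out_vars = variables[:i] + variables[i + 1:]
    out = {}
    for assignment, p in table.items():
        key = assignment[:i] + assignment[i + 1:]
        out[key] = out.get(key, 0.0) + p  # accumulate over the dropped variable
    return out_vars, out

P_RT = (['R', 'T'], {('+r', '+t'): 0.08, ('+r', '-t'): 0.02,
                     ('-r', '+t'): 0.09, ('-r', '-t'): 0.81})
vars_T, P_T = eliminate(P_RT, 'R')
print(P_T)  # {('+t',): 0.17, ('-t',): 0.83}
```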

Multiple Elimination

P(R,T,L):
+r  +t  +l  0.024
+r  +t  -l  0.056
+r  -t  +l  0.002
+r  -t  -l  0.018
-r  +t  +l  0.027
-r  +t  -l  0.063
-r  -t  +l  0.081
-r  -t  -l  0.729

Sum out R → P(T,L):
+t  +l  0.051
+t  -l  0.119
-t  +l  0.083
-t  -l  0.747

Sum out T → P(L):
+l  0.134
-l  0.866

Thus Far: Multiple Join, Multiple Eliminate (= Inference by Enumeration)

Marginalizing Early (= Variable Elimination)

Traffic Domain

Graph: R → T → L

§  Inference by Enumeration:
  P(L) = Σ_t Σ_r P(L|t) P(r) P(t|r)
  §  Join on r, join on t, eliminate r, eliminate t

§  Variable Elimination:
  P(L) = Σ_t P(L|t) Σ_r P(r) P(t|r)
  §  Join on r, eliminate r, join on t, eliminate t

Marginalizing Early! (aka VE)

Start with P(R), P(T|R), P(L|T):

+r  0.1
-r  0.9

+r  +t  0.8
+r  -t  0.2
-r  +t  0.1
-r  -t  0.9

+t  +l  0.3
+t  -l  0.7
-t  +l  0.1
-t  -l  0.9

Join R → P(R,T):
+r  +t  0.08
+r  -t  0.02
-r  +t  0.09
-r  -t  0.81

Sum out R → P(T):
+t  0.17
-t  0.83

Join T → P(T,L):
+t  +l  0.051
+t  -l  0.119
-t  +l  0.083
-t  -l  0.747

Sum out T → P(L):
+l  0.134
-l  0.866

Evidence

§  If evidence, start with factors that select that evidence
  §  No evidence uses these initial factors: P(R), P(T|R), P(L|T)

+r  0.1
-r  0.9

+r  +t  0.8
+r  -t  0.2
-r  +t  0.1
-r  -t  0.9

+t  +l  0.3
+t  -l  0.7
-t  +l  0.1
-t  -l  0.9

  §  Computing P(L | +r), the initial factors become P(+r), P(T|+r), P(L|T):

+r  0.1

+r  +t  0.8
+r  -t  0.2

+t  +l  0.3
+t  -l  0.7
-t  +l  0.1
-t  -l  0.9

§  We eliminate all vars other than query + evidence

Evidence II

§  Result will be a selected joint of query and evidence
  §  E.g. for P(L | +r), we would end up with:

P(+r, L):
+r  +l  0.026
+r  -l  0.074

Normalize →

P(L | +r):
+l  0.26
-l  0.74

§  To get our answer, just normalize this!

§  That’s it!

General Variable Elimination

§  Query: P(Q | E₁ = e₁, …, Eₖ = eₖ)

§  Start with initial factors:
  §  Local CPTs (but instantiated by evidence)

§  While there are still hidden variables (not Q or evidence):
  §  Pick a hidden variable H
  §  Join all factors mentioning H
  §  Eliminate (sum out) H

§  Join all remaining factors and normalize
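The loop above translates almost line-for-line into code. A sketch using a factor representation of variables list plus assignment table, run on the traffic network (the elimination ordering is supplied by the caller):

```python
def join(f1, f2):
    # Pointwise product of two factors over the union of their variables
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    out = {}
    for a1, p1 in t1.items():
        r1 = dict(zip(vars1, a1))
        for a2, p2 in t2.items():
            r2 = dict(zip(vars2, a2))
            if all(r1[v] == r2[v] for v in vars1 if v in vars2):
                m = {**r1, **r2}
                out[tuple(m[v] for v in out_vars)] = p1 * p2
    return out_vars, out

def eliminate(factor, var):
    # Sum out one variable from a factor
    variables, table = factor
    i = variables.index(var)
    out = {}
    for a, p in table.items():
        key = a[:i] + a[i + 1:]
        out[key] = out.get(key, 0.0) + p
    return variables[:i] + variables[i + 1:], out

def variable_elimination(factors, hidden_order):
    # While there are hidden variables: join all factors mentioning H, sum H out
    for h in hidden_order:
        mentioning = [f for f in factors if h in f[0]]
        rest = [f for f in factors if h not in f[0]]
        joined = mentioning[0]
        for f in mentioning[1:]:
            joined = join(joined, f)
        factors = rest + [eliminate(joined, h)]
    # Join all remaining factors and normalize
    result = factors[0]
    for f in factors[1:]:
        result = join(result, f)
    variables, table = result
    z = sum(table.values())
    return variables, {a: p / z for a, p in table.items()}

# Traffic network: R -> T -> L; query P(L), hidden variables R then T
factors = [
    (['R'], {('+r',): 0.1, ('-r',): 0.9}),
    (['R', 'T'], {('+r', '+t'): 0.8, ('+r', '-t'): 0.2,
                  ('-r', '+t'): 0.1, ('-r', '-t'): 0.9}),
    (['T', 'L'], {('+t', '+l'): 0.3, ('+t', '-l'): 0.7,
                  ('-t', '+l'): 0.1, ('-t', '-l'): 0.9}),
]
vars_L, P_L = variable_elimination(factors, ['R', 'T'])
print(P_L)  # {('+l',): 0.134, ('-l',): 0.866}
```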

Example

Choose A

Example

Choose E

Finish with B

Normalize

Example 2: P(B | a)

Start / Select:

B   P
+b  0.1
¬b  0.9

B   A   P
+b  +a  0.8
+b  ¬a  0.2
¬b  +a  0.1
¬b  ¬a  0.9

Join on B:

A   B   P
+a  +b  0.08
+a  ¬b  0.09

Normalize:

A   B   P
+a  +b  8/17
+a  ¬b  9/17

Same Example in Equations

P(B | +j, +m) ∝ P(B, +j, +m)
  = Σ_{e,a} P(B, e, a, +j, +m)                     (marginal can be obtained from joint by summing out)
  = Σ_{e,a} P(B) P(e) P(a|B,e) P(+j|a) P(+m|a)     (use Bayes’ net joint distribution expression)
  = Σ_e P(B) P(e) Σ_a P(a|B,e) P(+j|a) P(+m|a)     (use x(y+z) = xy + xz)
  = Σ_e P(B) P(e) f₁(B, e)                         (joining on a, and then summing out, gives f₁)
  = P(B) Σ_e P(e) f₁(B, e)                         (use x(y+z) = xy + xz)
  = P(B) f₂(B)                                     (joining on e, and then summing out, gives f₂)

All we are doing is exploiting uwy + uwz + uxy + uxz + vwy + vwz + vxy + vxz = (u+v)(w+x)(y+z) to improve computational efficiency!

Another Variable Elimination Example

§  Computational complexity critically depends on the largest factor generated in this process. Size of a factor = number of entries in its table. In the example above (assuming binary variables), all factors generated are of size 2, as they each have only one variable (Z, Z, and X₃, respectively).

Variable Elimination Ordering

§  For the query P(Xₙ | y₁, …, yₙ), work through the following two different orderings, as done on the previous slide: Z, X₁, …, Xₙ₋₁ and X₁, …, Xₙ₋₁, Z. What is the size of the maximum factor generated for each of the orderings?

§  Answer: 2ⁿ⁺¹ versus 2² (assuming binary variables)

§  In general: the ordering can greatly affect efficiency.

VE: Computational and Space Complexity

§  The computational and space complexity of variable elimination is determined by the largest factor

§  The elimination ordering can greatly affect the size of the largest factor.
  §  E.g., previous slide’s example: 2ⁿ vs. 2

§  Does there always exist an ordering that only results in small factors?
  §  No!

Worst Case Complexity?

§  CSP: (3-SAT reduction; figure omitted)

§  If we can answer whether P(z) equals zero or not, we have answered whether the 3-SAT problem has a solution.

§  Hence inference in Bayes’ nets is NP-hard. There is no known efficient probabilistic inference in general.

Polytrees

§  A polytree is a directed graph with no undirected cycles

§  For poly-trees you can always find an ordering that is efficient
  §  Try it!!

§  Cut-set conditioning for Bayes’ net inference
  §  Choose a set of variables such that, if they are removed, only a polytree remains
  §  Exercise: Think about how the specifics would work out!

Bayes’ Nets

§  Representation
§  Conditional Independences
§  Probabilistic Inference
  §  Enumeration (exact, exponential complexity)
  §  Variable elimination (exact, worst-case exponential complexity, often better)
  §  Inference is NP-complete
  §  Sampling (approximate)
§  Learning Bayes’ Nets from Data

Approximate Inference: Sampling

Sampling

§  Sampling is a lot like repeated simulation
  §  Predicting the weather, basketball games, …

§  Basic idea
  §  Draw N samples from a sampling distribution S
  §  Compute an approximate posterior probability
  §  Show this converges to the true probability P

§  Why sample?
  §  Learning: get samples from a distribution you don’t know
  §  Inference: getting a sample is faster than computing the right answer (e.g. with variable elimination)

Sampling

§  Sampling from a given distribution
  §  Step 1: Get a sample u from the uniform distribution over [0, 1)
    §  E.g. random() in Python
  §  Step 2: Convert this sample u into an outcome for the given distribution by associating each outcome with a sub-interval of [0, 1), with sub-interval size equal to the probability of the outcome

§  Example:

C      P(C)
red    0.6
green  0.1
blue   0.3

  §  If random() returns u = 0.83, then our sample is C = blue
  §  E.g., after sampling 8 times: (figure omitted)
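Step 1 and Step 2 together are only a few lines. A minimal sketch:

```python
import random

def sample_from(dist):
    """Sample an outcome from a discrete distribution {outcome: probability}
    by mapping each outcome to a sub-interval of [0, 1)."""
    u = random.random()          # Step 1: uniform sample in [0, 1)
    cumulative = 0.0
    for outcome, p in dist.items():
        cumulative += p          # Step 2: find the sub-interval containing u
        if u < cumulative:
            return outcome
    return outcome               # guard against floating-point round-off

P_C = {'red': 0.6, 'green': 0.1, 'blue': 0.3}
random.seed(0)
counts = {c: 0 for c in P_C}
for _ in range(10000):
    counts[sample_from(P_C)] += 1
print(counts)  # roughly proportional to 0.6 / 0.1 / 0.3
```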

Sampling in Bayes’ Nets

§  Prior Sampling
§  Rejection Sampling
§  Likelihood Weighting
§  Gibbs Sampling

Prior Sampling

Prior Sampling

Network: Cloudy → Sprinkler, Cloudy → Rain, {Sprinkler, Rain} → WetGrass

P(C):
+c  0.5
-c  0.5

P(S|C):
+c  +s  0.1
+c  -s  0.9
-c  +s  0.5
-c  -s  0.5

P(R|C):
+c  +r  0.8
+c  -r  0.2
-c  +r  0.2
-c  -r  0.8

P(W|S,R):
+s  +r  +w  0.99
+s  +r  -w  0.01
+s  -r  +w  0.90
+s  -r  -w  0.10
-s  +r  +w  0.90
-s  +r  -w  0.10
-s  -r  +w  0.01
-s  -r  -w  0.99

Samples:
+c, -s, +r, +w
-c, +s, -r, +w

Prior Sampling

§  For i = 1, 2, …, n (in topological order, so parents are sampled first)
  §  Sample xᵢ from P(Xᵢ | Parents(Xᵢ))

§  Return (x₁, x₂, …, xₙ)
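For the sprinkler network two slides back, the loop looks like this. A minimal sketch that also estimates P(+w) from the samples:

```python
import random

def bernoulli(p_true, true_val, false_val):
    return true_val if random.random() < p_true else false_val

def prior_sample():
    # Sample each variable from P(Xi | Parents(Xi)), parents first
    c = bernoulli(0.5, '+c', '-c')
    s = bernoulli(0.1 if c == '+c' else 0.5, '+s', '-s')
    r = bernoulli(0.8 if c == '+c' else 0.2, '+r', '-r')
    p_w = {('+s', '+r'): 0.99, ('+s', '-r'): 0.90,
           ('-s', '+r'): 0.90, ('-s', '-r'): 0.01}[(s, r)]
    w = bernoulli(p_w, '+w', '-w')
    return c, s, r, w

random.seed(1)
samples = [prior_sample() for _ in range(20000)]
p_plus_w = sum(1 for smp in samples if smp[3] == '+w') / len(samples)
print(p_plus_w)  # estimate of P(+w); the exact value works out to 0.65
```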

Prior Sampling

§  This process generates samples with probability:
  S_PS(x₁ … xₙ) = ∏ᵢ P(xᵢ | Parents(Xᵢ)) = P(x₁ … xₙ)
  …i.e. the BN’s joint probability

§  Let the number of samples of an event be N_PS(x₁ … xₙ)

§  Then lim_{N→∞} N_PS(x₁ … xₙ) / N = S_PS(x₁ … xₙ) = P(x₁ … xₙ)

§  I.e., the sampling procedure is consistent

Example

§  We’ll get a bunch of samples from the BN:
  +c, -s, +r, +w
  +c, +s, +r, +w
  -c, +s, +r, -w
  +c, -s, +r, +w
  -c, -s, -r, +w

§  If we want to know P(W)
  §  We have counts <+w: 4, -w: 1>
  §  Normalize to get P(W) = <+w: 0.8, -w: 0.2>
  §  This will get closer to the true distribution with more samples
  §  Can estimate anything else, too
  §  What about P(C | +w)? P(C | +r, +w)? P(C | -r, -w)?
  §  Fast: can use fewer samples if less time (what’s the drawback?)

Rejection Sampling

Rejection Sampling

§  Let’s say we want P(C)
  §  No point keeping all samples around
  §  Just tally counts of C as we go

§  Let’s say we want P(C | +s)
  §  Same thing: tally C outcomes, but ignore (reject) samples which don’t have S = +s
  §  This is called rejection sampling
  §  It is also consistent for conditional probabilities (i.e., correct in the limit)

Rejection Sampling

§  IN: evidence instantiation
§  For i = 1, 2, …, n
  §  Sample xᵢ from P(Xᵢ | Parents(Xᵢ))
  §  If xᵢ not consistent with evidence
    §  Reject: return, and no sample is generated in this cycle
§  Return (x₁, x₂, …, xₙ)
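For the sprinkler network with evidence S = +s, the scheme looks as follows. A sketch estimating P(+c | +s); the exact answer is 0.05 / 0.3 = 1/6:

```python
import random

def prior_sample():
    # Prior sampling of the sprinkler network, parents before children
    c = '+c' if random.random() < 0.5 else '-c'
    s = '+s' if random.random() < (0.1 if c == '+c' else 0.5) else '-s'
    r = '+r' if random.random() < (0.8 if c == '+c' else 0.2) else '-r'
    p_w = {('+s', '+r'): 0.99, ('+s', '-r'): 0.90,
           ('-s', '+r'): 0.90, ('-s', '-r'): 0.01}[(s, r)]
    w = '+w' if random.random() < p_w else '-w'
    return c, s, r, w

def estimate_p_c_given_plus_s(n):
    counts = {'+c': 0, '-c': 0}
    for _ in range(n):
        c, s, r, w = prior_sample()
        if s != '+s':
            continue  # reject: sample inconsistent with evidence S = +s
        counts[c] += 1
    accepted = counts['+c'] + counts['-c']
    return counts['+c'] / accepted, accepted

random.seed(2)
estimate, accepted = estimate_p_c_given_plus_s(30000)
print(estimate, accepted)  # P(+c | +s) ≈ 1/6; most samples are rejected
```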

Likelihood Weighting

Likelihood Weighting

§  Problem with rejection sampling:
  §  If evidence is unlikely, rejects lots of samples
  §  Evidence not exploited as you sample
  §  Consider P(Shape | blue):
    pyramid, green; pyramid, red; sphere, blue; cube, red; sphere, green

§  Idea: fix evidence variables and sample the rest
  §  Problem: sample distribution not consistent!
  §  Solution: weight by probability of evidence given parents
    pyramid, blue; pyramid, blue; sphere, blue; cube, blue; sphere, blue

Likelihood Weighting

(Same network and CPTs as on the Prior Sampling slide: Cloudy → Sprinkler, Cloudy → Rain, {Sprinkler, Rain} → WetGrass.)

Samples:
+c, +s, +r, +w
…

Likelihood Weighting

§  IN: evidence instantiation
§  w = 1.0
§  for i = 1, 2, …, n
  §  if Xᵢ is an evidence variable
    §  Xᵢ = observation xᵢ for Xᵢ
    §  Set w = w · P(xᵢ | Parents(Xᵢ))
  §  else
    §  Sample xᵢ from P(Xᵢ | Parents(Xᵢ))
§  return (x₁, x₂, …, xₙ), w
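For the sprinkler network with evidence S = +s, the pseudocode above becomes the sketch below. It estimates the same P(+c | +s) ≈ 1/6 as the rejection-sampling sketch, but without discarding any samples:

```python
import random

def weighted_sample(evidence_s='+s'):
    w = 1.0
    c = '+c' if random.random() < 0.5 else '-c'
    # S is evidence: fix it and multiply in P(+s | Parents(S)) = P(+s | c)
    s = evidence_s
    w *= 0.1 if c == '+c' else 0.5
    r = '+r' if random.random() < (0.8 if c == '+c' else 0.2) else '-r'
    p_w = {('+s', '+r'): 0.99, ('+s', '-r'): 0.90,
           ('-s', '+r'): 0.90, ('-s', '-r'): 0.01}[(s, r)]
    wet = '+w' if random.random() < p_w else '-w'
    return (c, s, r, wet), w

random.seed(3)
num = 0.0
den = 0.0
for _ in range(20000):
    (c, s, r, wet), w = weighted_sample()
    den += w            # total weight
    if c == '+c':
        num += w        # weight on samples with C = +c
print(num / den)  # weighted estimate of P(+c | +s) ≈ 1/6
```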

Likelihood Weighting

§  Sampling distribution if z sampled and e fixed evidence:
  S_WS(z, e) = ∏ᵢ P(zᵢ | Parents(Zᵢ))

§  Now, samples have weights:
  w(z, e) = ∏ⱼ P(eⱼ | Parents(Eⱼ))

§  Together, weighted sampling distribution is consistent:
  S_WS(z, e) · w(z, e) = ∏ᵢ P(zᵢ | Parents(Zᵢ)) · ∏ⱼ P(eⱼ | Parents(Eⱼ)) = P(z, e)

Likelihood Weighting

§  Likelihood weighting is good
  §  We have taken evidence into account as we generate the sample
  §  E.g. here, W’s value will get picked based on the evidence values of S, R
  §  More of our samples will reflect the state of the world suggested by the evidence

§  Likelihood weighting doesn’t solve all our problems
  §  Evidence influences the choice of downstream variables, but not upstream ones (C isn’t more likely to get a value matching the evidence)

§  We would like to consider evidence when we sample every variable → Gibbs sampling

Gibbs Sampling

Gibbs Sampling

§  Procedure: keep track of a full instantiation x₁, x₂, …, xₙ. Start with an arbitrary instantiation consistent with the evidence. Sample one variable at a time, conditioned on all the rest, but keep the evidence fixed. Keep repeating this for a long time.

§  Property: in the limit of repeating this infinitely many times, the resulting sample comes from the correct distribution

§  Rationale: both upstream and downstream variables condition on evidence.

§  In contrast: likelihood weighting only conditions on upstream evidence, and hence the weights obtained in likelihood weighting can sometimes be very small. The sum of weights over all samples is indicative of how many “effective” samples were obtained, so we want high weight.
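A sketch for the sprinkler network with evidence R = +r, estimating P(+s | +r) (exactly 0.18 for these CPTs). Each non-evidence variable is resampled from its conditional given everything else; here that conditional is computed by scoring both values of the variable against the full joint, which is equivalent since all factors not mentioning the variable cancel:

```python
import random

P_C = {'+c': 0.5, '-c': 0.5}
P_S = {'+c': 0.1, '-c': 0.5}                      # P(+s | C)
P_R = {'+c': 0.8, '-c': 0.2}                      # P(+r | C)
P_W = {('+s', '+r'): 0.99, ('+s', '-r'): 0.90,
       ('-s', '+r'): 0.90, ('-s', '-r'): 0.01}    # P(+w | S, R)

def joint(c, s, r, w):
    p = P_C[c]
    p *= P_S[c] if s == '+s' else 1 - P_S[c]
    p *= P_R[c] if r == '+r' else 1 - P_R[c]
    pw = P_W[(s, r)]
    return p * (pw if w == '+w' else 1 - pw)

def resample(var, state):
    # P(X | all other variables) by scoring both values of X
    vals = {'C': ('+c', '-c'), 'S': ('+s', '-s'), 'W': ('+w', '-w')}[var]
    scores = []
    for v in vals:
        s2 = dict(state)
        s2[var] = v
        scores.append(joint(s2['C'], s2['S'], s2['R'], s2['W']))
    z = sum(scores)
    state[var] = vals[0] if random.random() < scores[0] / z else vals[1]

random.seed(4)
state = {'C': '+c', 'S': '+s', 'R': '+r', 'W': '+w'}  # R = +r is evidence, kept fixed
count_plus_s = 0
iters = 30000
for _ in range(iters):
    resample(random.choice(['C', 'S', 'W']), state)   # never resample the evidence
    count_plus_s += state['S'] == '+s'
print(count_plus_s / iters)  # estimate of P(+s | +r) ≈ 0.18
```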

Gibbs Sampling Example: P(S | +r)

§  Step 1: Fix evidence
  §  R = +r

§  Step 2: Initialize other variables
  §  Randomly

§  Step 3: Repeat
  §  Choose a non-evidence variable X
  §  Resample X from P(X | all other variables)

Efficient Resampling of One Variable

§  Sample from P(S | +c, +r, -w):
  P(S | +c, +r, -w) = P(S, +c, +r, -w) / Σ_s P(s, +c, +r, -w) ∝ P(S | +c) P(-w | S, +r)

§  Many things cancel out – only CPTs with S remain!
§  More generally: only the CPTs that mention the resampled variable need to be considered, and joined together

Bayes’ Net Sampling Summary

§  Prior Sampling, P
§  Rejection Sampling, P(Q | e)
§  Likelihood Weighting, P(Q | e)
§  Gibbs Sampling, P(Q | e)

Further Reading on Gibbs Sampling*

§  Gibbs sampling produces a sample from the query distribution P(Q | e) in the limit of re-sampling infinitely often

§  Gibbs sampling is a special case of more general methods called Markov chain Monte Carlo (MCMC) methods

§  Metropolis-Hastings is one of the more famous MCMC methods (in fact, Gibbs sampling is a special case of Metropolis-Hastings)

§  You may read about Monte Carlo methods – they’re just sampling

How About Particle Filtering?

Elapse → Weight → Resample

Particles: (3,3) (2,3) (3,3) (3,2) (3,3) (3,2) (1,2) (3,3) (3,3) (2,3)

Elapse → Particles: (3,2) (2,3) (3,2) (3,1) (3,3) (3,2) (1,3) (2,3) (3,2) (2,2)

Weight → Particles: (3,2) w=.9, (2,3) w=.2, (3,2) w=.9, (3,1) w=.4, (3,3) w=.4, (3,2) w=.9, (1,3) w=.1, (2,3) w=.2, (3,2) w=.9, (2,2) w=.4

Resample → (New) Particles: (3,2) (2,2) (3,2) (2,3) (3,3) (3,2) (1,3) (2,3) (3,2) (3,2)

The weighting step = likelihood weighting

Particle Filtering

§  Particle filtering operates on an ensemble of samples
  §  Performs likelihood weighting for each individual sample to elapse time and incorporate evidence
  §  Resamples from the weighted ensemble of samples to focus computation for the next time step where most of the probability mass is estimated to be