Learning and Structural Uncertainty in Relational Probability Models
Brian Milch
MIT 9.66
November 29, 2007
Outline
• Learning relational probability models
• Structural uncertainty
  – Uncertainty about relations
  – Uncertainty about object existence and identity
• Applications of BLOG
Review: Relational Probability Models
An abstract probabilistic model for attributes, combined with a relational skeleton (objects & relations), yields a graphical model.
Review: Dependency Statements
Specialty(r) ~ TabularCPD[[0.5, 0.3, 0.2]];
Topic(p) ~ TabularCPD[[0.90, 0.01, 0.09], [0.02, 0.85, 0.13], [0.10, 0.10, 0.80]] (Specialty(FirstAuthor(p)));
WordAt(wp) ~ TabularCPD[[0.03,..., 0.02, 0.001,...], [0.03,..., 0.001, 0.02,...], [0.03,..., 0.003, 0.003,...]] (Topic(Doc(wp)));
[CPD tables with rows and columns labeled by the topics BNs, RL, Theory; the word CPD gives distributions over words such as “the”, “Bayesian”, “reinforcement”]
Learning
• Assume types, functions are given
• Straightforward task: given structure, learn parameters
  – Just like in BNs, but parameters are shared across variables for the same function, e.g., Topic(Smith98a), Topic(Jones00), etc.
• Harder: learn abstract dependency structure
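The parameter-sharing point can be sketched in Python: every ground variable generated by the same dependency statement (e.g., Topic(Smith98a), Topic(Jones00)) contributes to one pooled count table. The data and the Dirichlet-style smoothing below are hypothetical illustrations, not from the lecture.

```python
from collections import Counter

# Hypothetical training pairs (parent value, child value): here the parent is
# Specialty(FirstAuthor(p)) and the child is Topic(p), pooled over all papers.
observations = [
    ("BNs", "BNs"), ("BNs", "BNs"), ("BNs", "RL"),
    ("RL", "RL"), ("RL", "RL"), ("Theory", "Theory"),
]

def learn_shared_cpd(pairs, smoothing=1.0):
    """Estimate one tabular CPD from counts pooled across every ground
    variable that uses the same dependency statement (parameter sharing)."""
    values = sorted({child for _, child in pairs})
    counts = Counter(pairs)
    parent_totals = Counter(parent for parent, _ in pairs)
    cpd = {}
    for parent in parent_totals:
        denom = parent_totals[parent] + smoothing * len(values)
        cpd[parent] = {v: (counts[(parent, v)] + smoothing) / denom
                       for v in values}
    return cpd

cpd = learn_shared_cpd(observations)
print(cpd["BNs"])  # smoothed estimates pooled over all papers with a BNs-specialty first author
```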
Structure Learning for BNs
• Find BN structure M that maximizes

  p(M | data) ∝ p(M) ∫ p(data | M, θ_M) p(θ_M | M) dθ_M

• Greedy local search over structures
  – Operators: add, delete, reverse edges
  – Exclude cyclic structures
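The greedy search described above can be sketched as follows. The scoring function is left abstract (in practice it would be the Bayesian score), and the operator set and acyclicity check follow the bullets; this is a minimal illustration, not code from the lecture.

```python
import itertools

def is_acyclic(nodes, edges):
    """Kahn's algorithm: the edge set is a DAG iff every node can be
    peeled off in topological order."""
    indeg = {n: 0 for n in nodes}
    for _, child in edges:
        indeg[child] += 1
    frontier = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while frontier:
        n = frontier.pop()
        seen += 1
        for parent, child in edges:
            if parent == n:
                indeg[child] -= 1
                if indeg[child] == 0:
                    frontier.append(child)
    return seen == len(nodes)

def greedy_search(nodes, score):
    """Greedy local search over structures with add/delete/reverse edge
    operators, excluding cyclic candidates."""
    edges = frozenset()
    improved = True
    while improved:
        improved = False
        for a, b in itertools.permutations(nodes, 2):
            if (a, b) in edges:
                candidates = [edges - {(a, b)},                 # delete
                              (edges - {(a, b)}) | {(b, a)}]    # reverse
            else:
                candidates = [edges | {(a, b)}]                 # add
            for cand in candidates:
                cand = frozenset(cand)
                if is_acyclic(nodes, cand) and score(cand) > score(edges):
                    edges, improved = cand, True
    return edges

# Toy score that rewards recovering a known target structure.
target = {("A", "B"), ("B", "C")}
best = greedy_search(["A", "B", "C"], lambda e: len(e & target))
print(sorted(best))  # → [('A', 'B'), ('B', 'C')]
```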
Logical Structure Learning
• In an RPM, we want a logical specification of each node’s parent set
• Deterministic analogue: inductive logic programming (ILP) [Dzeroski & Lavrac 2001; Flach & Lavrac 2002]
• Classic work on RPMs by Friedman, Getoor, Koller & Pfeffer [1999]
  – We’ll call their models FGKP models (they call them “probabilistic relational models” (PRMs))
FGKP Models
• Each dependency statement has the form:

  Func(x) ~ TabularCPD[...](s1, ..., sk)

  where s1, ..., sk are slot chains
• Slot chains
  – Basically logical terms: Specialty(FirstAuthor(p))
  – But can also treat predicates as “multi-valued functions”: Specialty(AuthorOf(p))
[Diagram: paper Smith&Jones01 with AuthorOf links to researchers Smith and Jones, whose Specialty values (BNs, RL) feed an aggregate]
Structure Learning for FGKP Models
• Greedy search again
  – But add or remove whole slot chains
  – Start with chains of length 1, then 2, etc.
  – Check for acyclicity using the symbol graph
Outline
• Learning relational probability models
• Structural uncertainty
  – Uncertainty about relations
  – Uncertainty about object existence and identity
• Applications of BLOG
Relational Uncertainty: Example
• Questions: Who will review my paper, and what will its average review score be?
[Diagram: researchers with attributes (Specialty: RL, Generosity: 2.9), (Specialty: Prob. Models, Generosity: 2.2), (Specialty: Theory, Generosity: 1.8), connected by AuthorOf and Reviews edges to papers with attributes (Topic: RL, AvgScore: ?), (Topic: RL, AvgScore: ?), (Topic: Prob Models, AvgScore: ?)]
Simplest Approach to Relational Uncertainty
• Add predicate Reviews(r, p)
• Can model this with existing syntax:

  Reviews(r, p) ~ ReviewCPD(Specialty(r), Topic(p));

• Potential drawback:
  – Reviews(r, p) nodes are independent given specialties and topics
  – Expected number of reviews per paper grows with the number of researchers in the skeleton

[Getoor et al., JMLR 2002]
Another Approach: Reference Uncertainty
• Say each paper gets k reviews
  – Can add Review objects to the skeleton
  – For each paper p, include k review objects rev with PaperReviewed(rev) = p
• Uncertain about values of the function Reviewer(rev)

[Diagram: PaperReviewed edges are known; Reviewer edges are unknown (?)]

[Getoor et al., JMLR 2002]
Models for Reviewer(rev)
• Explicit distribution over researchers?
  – No: won’t generalize across skeletons
• Selection models:
  – Uniform sampling from researchers with certain attribute values [Getoor et al., JMLR 2002]
  – Weighted sampling, with weights determined by attributes [Pasula et al., IJCAI 2001]
Choosing Reviewer Based on Specialty
ReviewerSpecialty(rev) ~ SpecSelectionCPD (Topic(PaperReviewed(rev)));
Reviewer(rev) ~ Uniform({Researcher r : Specialty(r) = ReviewerSpecialty(rev)});
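A forward-sampling sketch of this two-stage selection model, with a hypothetical skeleton and made-up numbers standing in for SpecSelectionCPD:

```python
import random
from collections import Counter

# Hypothetical skeleton: researchers with known specialties.
specialties = {"R1": "RL", "R2": "RL", "R3": "Theory"}

# Stand-in for SpecSelectionCPD: P(ReviewerSpecialty(rev) | Topic(paper)).
spec_selection_cpd = {
    "RL":     {"RL": 0.8, "Theory": 0.2},
    "Theory": {"RL": 0.1, "Theory": 0.9},
}

def sample_reviewer(topic, rng):
    """Two-stage selection: sample ReviewerSpecialty from the CPD, then
    choose uniformly among researchers with that specialty."""
    dist = spec_selection_cpd[topic]
    specs = list(dist)
    spec = rng.choices(specs, weights=[dist[s] for s in specs])[0]
    pool = [r for r, s in specialties.items() if s == spec]
    return rng.choice(pool)

rng = random.Random(0)
counts = Counter(sample_reviewer("RL", rng) for _ in range(1000))
print(counts)  # RL specialists (R1, R2) dominate for an RL paper
```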
Context-Specific Dependencies

RevScore(rev) ~ ScoreCPD(Generosity(Reviewer(rev)));
AvgScore(p) = Mean({RevScore(rev) for Review rev : PaperReviewed(rev) = p});

(Reviewer(rev) is a random object.)

• Consequence of relational uncertainty: dependencies become context-specific
  – RevScore(Rev1) depends on Generosity(R1) only when Reviewer(Rev1) = R1
Semantics: Ground BN
• Can still define a ground BN
• Parents of node X are all basic RVs whose values are potentially relevant in evaluating the right-hand side of X’s dependency statement
• Example: for RevScore(Rev1)…

  RevScore(rev) ~ ScoreCPD(Generosity(Reviewer(rev)));

  – Reviewer(Rev1) is always relevant
  – Generosity(R) might be relevant for any researcher R
Ground BN

[Ground BN diagram with nodes Topic(P1); Specialty(R1), Specialty(R2), Specialty(R3); Generosity(R1), Generosity(R2), Generosity(R3); RevSpecialty(Rev1), RevSpecialty(Rev2); Reviewer(Rev1), Reviewer(Rev2); RevScore(Rev1), RevScore(Rev2)]
Inference
• Can still use the ground BN, but it’s often very highly connected
• Alternative: MCMC over possible worlds [Pasula & Russell, IJCAI 2001]
  – In each world, only certain dependencies are active
Metropolis-Hastings MCMC
• Metropolis-Hastings process: in world ω,
  – sample a new world ω′ from the proposal distribution q(ω′ | ω)
  – accept the proposal with probability

    min{ 1, [p(ω′) q(ω | ω′)] / [p(ω) q(ω′ | ω)] }

    otherwise remain in ω
• Stationary distribution is p(ω)
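A generic Metropolis-Hastings loop matching the slide, demonstrated on a toy three-state target; the target distribution and random-walk proposal here are illustrative, not from the lecture.

```python
import random

def metropolis_hastings(p, propose, q, init, steps, rng):
    """Metropolis-Hastings: from world w, sample w' ~ q(.|w), accept with
    probability min(1, p(w') q(w|w') / (p(w) q(w'|w))), else stay at w."""
    w = init
    samples = []
    for _ in range(steps):
        w_new = propose(w, rng)
        ratio = (p(w_new) * q(w, w_new)) / (p(w) * q(w_new, w))
        if rng.random() < min(1.0, ratio):
            w = w_new
        samples.append(w)
    return samples

# Toy target over three "worlds"; the random-walk proposal is symmetric.
target = {0: 0.2, 1: 0.5, 2: 0.3}
propose = lambda w, rng: (w + rng.choice([-1, 1])) % 3
q = lambda a, b: 0.5  # q(a | b); symmetric, so it cancels in the ratio

samples = metropolis_hastings(lambda w: target[w], propose, q, 0, 20000,
                              random.Random(0))
print(samples.count(1) / len(samples))  # close to the target mass 0.5
```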
Computing Acceptance Ratio Efficiently
• World probability is

  p(ω) = ∏_X P(x | pa_ω(X))

  where pa_ω(X) is the instantiation of X’s active parents in ω
• If a proposal changes only X, then all factors not containing X cancel in p(ω) and p(ω′)

[Pasula et al., IJCAI 2001]

Result: Time to compute the acceptance ratio often doesn’t depend on the number of objects
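The cancellation argument can be made concrete on a tiny two-variable model: when a proposal changes only A, the ratio needs only the factors that mention A (its own CPD and its children’s), and the result matches the full-joint ratio. The CPD numbers are hypothetical.

```python
import math

# Tiny model A -> B with hypothetical tabular CPDs.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}

cpds = {
    "A": lambda w: p_a[w["A"]],
    "B": lambda w: p_b_given_a[w["A"]][w["B"]],
}
children = {"A": ["B"], "B": []}

def joint(w):
    return p_a[w["A"]] * p_b_given_a[w["A"]][w["B"]]

def ratio_local(world, var, new_value):
    """p(w') / p(w) for a move changing only `var`: every factor that does
    not mention `var` cancels, so touch only var's CPD and its children's."""
    touched = [var] + children[var]
    new_world = dict(world, **{var: new_value})
    num = math.prod(cpds[x](new_world) for x in touched)
    den = math.prod(cpds[x](world) for x in touched)
    return num / den

w = {"A": 0, "B": 1}
r = ratio_local(w, "A", 1)
print(r)  # equals the full ratio joint({'A': 1, 'B': 1}) / joint(w)
```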
Learning Models for Relations
• Binary predicate approach:

  Reviews(r, p) ~ ReviewCPD(Specialty(r), Topic(p));

  – Use existing search over slot chains
• Selecting based on attributes:

  ReviewerSpecialty(rev) ~ SpecSelectionCPD(Topic(PaperReviewed(rev)));
  Reviewer(rev) ~ Uniform({Researcher r : Specialty(r) = ReviewerSpecialty(rev)});

  – Search over sets of attributes to look at
  – Search over parent slot chains for choosing attribute values

[Getoor et al., JMLR 2002]
Outline
• Learning relational probability models
• Structural uncertainty
  – Uncertainty about relations
  – Uncertainty about object existence and identity
• Applications of BLOG
Example 1: Bibliographies

S. Russel and P. Norvig (1995). Artificial Intelligence: A Modern Approach. Upper Saddle River, NJ: Prentice Hall.

Russell, Stuart and Norvig, Peter. Articial Intelligence. Prentice-Hall, 1995.

[Diagram: both citation strings link via PubCited to one publication record (Title: …) and via AuthorOf to author records (Name: …)]
Example 2: Aircraft Tracking
[Figure: radar blips over time, with a detection failure]
Example 2: Aircraft Tracking
[Figure: radar blips over time, with a false detection and an unobserved object]
Handling Unknown Objects
• Fundamental task: given observations, make inferences about initially unknown objects
• But most RPM languages assume set of objects is fixed and known (Herbrand models)
• Bayesian logic (BLOG) lifts this assumption
[Milch et al., IJCAI 2005. See also MEBN: Laskey & da Costa, UAI 2005; Dynamical Grammars: Mjolsness & Yosiphon, AMAI to appear]
Possible Worlds
How can we define a distribution over such outcomes?
(not showing attribute values)
Generative Process
• Imagine a process that constructs worlds using two kinds of steps:
  – Add some objects to the world
  – Set the value of a function on a tuple of arguments
BLOG Model for Citations
#Paper ~ NumPapersPrior();   (number statement)

Title(p) ~ TitlePrior();

guaranteed Citation Cit1, Cit2, Cit3, Cit4, Cit5, Cit6, Cit7;   (part of skeleton: exhaustive list of distinct citations)

PubCited(c) ~ Uniform({Paper p});   (familiar syntax for reference uncertainty)

Text(c) ~ NoisyCitationGrammar(Title(PubCited(c)));
Adding Authors

#Researcher ~ NumResearchersPrior();
Name(r) ~ NamePrior();

#Paper ~ NumPapersPrior();
FirstAuthor(p) ~ Uniform({Researcher r});
Title(p) ~ TitlePrior();

PubCited(c) ~ Uniform({Paper p});
Text(c) ~ NoisyCitationGrammar(Name(FirstAuthor(PubCited(c))), Title(PubCited(c)));
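A rough forward-sampling sketch of this generative process in Python, with toy name/title priors and a crude stand-in for NoisyCitationGrammar; all distributions here are hypothetical.

```python
import random

rng = random.Random(0)

# Hypothetical stand-ins for the priors in the model above.
NAMES = ["Russell", "Norvig", "Pearl"]
TITLES = ["AI: A Modern Approach", "Causality", "Probabilistic Reasoning"]

def sample_world(num_citations):
    """Forward-sample one possible world: unknown numbers of researchers
    and papers, then noisy citation text for each guaranteed citation."""
    n_researchers = rng.randint(1, 3)          # NumResearchersPrior
    researchers = [rng.choice(NAMES) for _ in range(n_researchers)]
    n_papers = rng.randint(1, 3)               # NumPapersPrior
    papers = [(rng.choice(researchers), rng.choice(TITLES))
              for _ in range(n_papers)]        # (FirstAuthor name, Title)
    citations = []
    for _ in range(num_citations):
        author, title = rng.choice(papers)     # PubCited(c) ~ Uniform({Paper p})
        text = f"{author}. {title}."
        if rng.random() < 0.1:                 # crude corruption model
            text = text.lower()
        citations.append(text)
    return citations

for c in sample_world(3):
    print(c)
```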
Generative Process for Aircraft Tracking
[Figure: aircraft in the sky; blips on a radar screen]
Existence of radar blips depends on the existence and locations of aircraft
BLOG Model for Aircraft Tracking
#Aircraft ~ NumAircraftDistrib();

State(a, t)
  if t = 0 then ~ InitState()
  else ~ StateTransition(State(a, Pred(t)));

#Blip(Source = a, Time = t) ~ NumDetectionsDistrib(State(a, t));

#Blip(Time = t) ~ NumFalseAlarmsDistrib();

ApparentPos(r)
  if (Source(r) = null) then ~ FalseAlarmDistrib()
  else ~ ObsDistrib(State(Source(r), Time(r)));
Basic Random Variables (RVs)
• For each number statement and tuple of generating objects, have RV for number of objects generated
• For each function symbol and tuple of arguments, have RV for function value
• Lemma: Full instantiation of these RVs uniquely identifies a possible world
Contingent Bayesian Network
• Each BLOG model defines a contingent Bayesian network (CBN) over basic RVs
  – Edges active only under certain conditions
[Milch et al., AI/Stats 2005]
[Diagram: #Pub and Title((Pub, 1)), Title((Pub, 2)), Title((Pub, 3)), … (infinitely many nodes!) feed Text(Cit1) via PubCited(Cit1); each edge Title((Pub, i)) → Text(Cit1) is active only under the condition PubCited(Cit1) = (Pub, i)]
Probability Distribution
• Through its CBN, a BLOG model specifies:
  – Conditional distributions for basic RVs
  – Context-specific independence properties, e.g., Text(Cit1) is independent of Title((Pub, 1)) given PubCited(Cit1) = (Pub, 3)
• Theorem: Under certain “context-specific ordering” conditions, every BLOG model fully defines a distribution over possible worlds

[Milch et al., IJCAI 2005]
Inference with Unknown Objects
• Does the infinite set of basic RVs prevent inference?
• No: sampling algorithms only need to instantiate a finite set of relevant variables
• Generic algorithms:
  – Rejection sampling [Milch et al., IJCAI 2005]
  – Guided likelihood weighting [Milch et al., AI/Stats 2005]
• More practical: MCMC over partial worlds
Toward General-Purpose MCMC with Unknown Objects
• Successful applications of MCMC with domain-specific proposal distributions:
  – Citation matching [Pasula et al., 2003]
  – Multi-target tracking [Oh et al., 2004]
• But each application requires new code for:
  – Proposing moves
  – Representing MCMC states
  – Computing acceptance probabilities
• Goal:
  – User specifies model and proposal distribution
  – General-purpose code does the rest
Proposer for Citations
• Split-merge moves:
  – Propose titles and author names for affected publications based on citation strings
• Other moves change the total number of publications

[Pasula et al., NIPS 2002]
MCMC States
• Not complete instantiations!
  – No titles or author names for uncited publications
• States are partial instantiations of random variables, e.g.:

  #Pub = 100, PubCited(Cit1) = (Pub, 37), Title((Pub, 37)) = “Calculus”

  – Each state corresponds to an event: the set of outcomes satisfying the description
MCMC over Events
• Markov chain over events σ, with stationary distribution proportional to p(σ)
• Theorem: Fraction of visited events in Q converges to p(Q | E) if:
  – each event σ is either a subset of Q or disjoint from Q
  – the events form a partition of the evidence event E
Computing Probabilities of Events
• Engine needs to compute P(σ′) / P(σ_n) efficiently (without summations)
• Use instantiations that include all active parents of the variables they instantiate
• Then the probability is a product of CPDs:

  P(σ) = ∏_{X ∈ vars(σ)} p(σ_X | σ_Pa(X))
States That Are Even More Abstract
• Typical partial instantiation:

  #Pub = 100, PubCited(Cit1) = (Pub, 37), Title((Pub, 37)) = “Calculus”, PubCited(Cit2) = (Pub, 14), Title((Pub, 14)) = “Psych”

  – Specifies particular publications, even though publications are interchangeable
• Let states be abstract partial instantiations:

  ∃ x ∃ y (x ≠ y) [#Pub = 100, PubCited(Cit1) = x, Title(x) = “Calculus”, PubCited(Cit2) = y, Title(y) = “Psych”]

• There are conditions under which we can compute probabilities of such events
Outline
• Learning relational probability models
• Structural uncertainty
  – Uncertainty about relations
  – Uncertainty about object existence and identity
• Applications of BLOG
Citation Matching
• Elaboration of the generative model shown earlier
• Parameter estimation
  – Priors for names, titles, citation formats learned offline from labeled data
  – String corruption parameters learned with Monte Carlo EM
• Inference
  – MCMC with split-merge proposals
  – Guided by “canopies” of similar citations
  – Accuracy stabilizes after ~20 minutes

[Pasula et al., NIPS 2002]
Citation Matching Results
Four data sets of ~300-500 citations, referring to ~150-300 papers
[Bar chart: error (fraction of clusters not recovered correctly), ranging 0–0.25, on the four data sets (Reinforce, Face, Reason, Constraint), comparing Phrase Matching [Lawrence et al. 1999], Generative Model + MCMC [Pasula et al. 2002], and Conditional Random Field [Wellner et al. 2004]]
Cross-Citation Disambiguation
Wauchope, K. Eucalyptus: Integrating Natural Language Input with a Graphical User Interface. NRL Report NRL/FR/5510-94-9711 (1994).
Is "Eucalyptus" part of the title, or is the author named K. Eucalyptus Wauchope?
Kenneth Wauchope (1994). Eucalyptus: Integrating natural language input with a graphical user interface. NRL Report NRL/FR/5510-94-9711, Naval Research Laboratory, Washington, DC, 39pp.
Second citation makes it clear how to parse the first one
Preliminary Experiments: Information Extraction
• P(citation text | title, author names) modeled with a simple HMM
• For each paper: recover title, author surnames and given names
• Fraction whose attributes are recovered perfectly in the last MCMC state:
  – among papers with one citation: 36.1%
  – among papers with multiple citations: 62.6%
Can use inferred knowledge for disambiguation
Multi-Object Tracking
[Figure: tracked objects over video frames, with a false detection and an unobserved object]
State Estimation for “Aircraft”
#Aircraft ~ NumAircraftPrior();

State(a, t)
  if t = 0 then ~ InitState()
  else ~ StateTransition(State(a, Pred(t)));

#Blip(Source = a, Time = t) ~ NumDetectionsCPD(State(a, t));

#Blip(Time = t) ~ NumFalseAlarmsPrior();

ApparentPos(r)
  if (Source(r) = null) then ~ FalseAlarmDistrib()
  else ~ ObsCPD(State(Source(r), Time(r)));
Aircraft Entering and Exiting
#Aircraft(EntryTime = t) ~ NumAircraftPrior();

Exits(a, t)
  if InFlight(a, t) then ~ Bernoulli(0.1);

InFlight(a, t)
  if t < EntryTime(a) then = false
  elseif t = EntryTime(a) then = true
  else = (InFlight(a, Pred(t)) & !Exits(a, Pred(t)));

State(a, t)
  if t = EntryTime(a) then ~ InitState()
  elseif InFlight(a, t) then ~ StateTransition(State(a, Pred(t)));

#Blip(Source = a, Time = t)
  if InFlight(a, t) then ~ NumDetectionsCPD(State(a, t));

…plus the last two statements from the previous slide
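The InFlight dynamics above can be simulated directly; this sketch forward-samples one aircraft’s in-flight indicator given its entry time, using the model’s exit probability of 0.1 (the entry time and horizon are arbitrary).

```python
import random

def simulate_inflight(entry_time, horizon, rng, exit_prob=0.1):
    """Forward-sample InFlight(a, t): false before entry, true at entry,
    then persists until Exits(a, t) ~ Bernoulli(exit_prob) fires."""
    in_flight = []
    for t in range(horizon):
        if t < entry_time:
            flying = False
        elif t == entry_time:
            flying = True
        else:
            prev = in_flight[t - 1]
            exits_prev = prev and (rng.random() < exit_prob)  # Exits(a, t-1)
            flying = prev and not exits_prev
        in_flight.append(flying)
    return in_flight

traj = simulate_inflight(entry_time=2, horizon=10, rng=random.Random(0))
print(traj)  # False before t=2, True at t=2, then stays True until an exit
```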
MCMC for Aircraft Tracking
• Uses generative model from previous slide (although not with BLOG syntax)
• Examples of Metropolis-Hastings proposals:
[Oh et al., CDC 2004][Figures by Songhwai Oh]
Aircraft Tracking Results
[Oh et al., CDC 2004][Figures by Songhwai Oh]
MCMC has the smallest error, and hardly degrades at all as tracks get dense
MCMC is nearly as fast as the greedy algorithm; much faster than MHT

[Plots: Estimation Error and Running Time]
BLOG Software
• Bayesian Logic inference engine available:
http://people.csail.mit.edu/milch/blog
References
• Friedman, N., Getoor, L., Koller, D., and Pfeffer, A. (1999) “Learning probabilistic relational models”. In Proc. 16th Int’l Joint Conf. on AI, pages 1300-1307.
• Taskar, B., Segal, E., and Koller, D. (2001) “Probabilistic classification and clustering in relational data”. In Proc. 17th Int’l Joint Conf. on AI, pages 870-878.
• Getoor, L., Friedman, N., Koller, D., and Taskar, B. (2002) “Learning probabilistic models of link structure”. J. Machine Learning Res. 3:679-707.
• Taskar, B., Abbeel, P., and Koller, D. (2002) “Discriminative probabilistic models for relational data”. In Proc. 18th Conf. on Uncertainty in AI, pages 485-492.
• Dzeroski, S. and Lavrac, N., eds. (2001) Relational Data Mining. Springer.
• Flach, P. and Lavrac, N. (2002) “Learning in Clausal Logic: A Perspective on Inductive Logic Programming”. In Computational Logic: Logic Programming and Beyond (Essays in Honour of Robert A. Kowalski), Springer Lecture Notes in AI volume 2407, pages 437-471.
• Pasula, H. and Russell, S. (2001) “Approximate inference for first-order probabilistic languages”. In Proc. 17th Int’l Joint Conf. on AI, pages 741-748.
• Milch, B., Marthi, B., Russell, S., Sontag, D., Ong, D. L., and Kolobov, A. (2005) “BLOG: Probabilistic Models with Unknown Objects”. In Proc. 19th Int’l Joint Conf. on AI, pages 1352-1359.
References
• Milch, B., Marthi, B., Sontag, D., Russell, S., Ong, D. L., and Kolobov, A. (2005) “Approximate inference for infinite contingent Bayesian networks”. In Proc. 10th Int’l Workshop on AI and Statistics.
• Milch, B. and Russell, S. (2006) “General-purpose MCMC inference over relational structures”. In Proc. 22nd Conf. on Uncertainty in AI, pages 349-358.
• Pasula, H., Marthi, B., Milch, B., Russell, S., and Shpitser, I. (2003) “Identity uncertainty and citation matching”. In Advances in Neural Information Processing Systems 15, MIT Press, pages 1401-1408.
• Lawrence, S., Giles, C. L., and Bollacker, K. D. (1999) “Autonomous citation matching”. In Proc. 3rd Int’l Conf. on Autonomous Agents, pages 392-393.
• Wellner, B., McCallum, A., Feng, P., and Hay, M. (2004) “An integrated, conditional model of information extraction and coreference with application to citation matching”. In Proc. 20th Conf. on Uncertainty in AI, pages 593-601.
References
• Pasula, H., Russell, S. J., Ostland, M., and Ritov, Y. (1999) “Tracking many objects with many sensors”. In Proc. 16th Int’l Joint Conf. on AI, pages 1160-1171.
• Oh, S., Russell, S., and Sastry, S. (2004) “Markov chain Monte Carlo data association for general multi-target tracking problems”. In Proc. 43rd IEEE Conf. on Decision and Control, pages 734-742.