© Fraunhofer FKIE
All living creatures by nature perform sensor data and information fusion. Each in their own way, they “fuse” sensations provided by multiple and mutually
complementary sense organs with knowledge learned from previous experiences and communications from other creatures. As a result, they produce a “mental picture” of their individual environment, the basis of behaving appropriately in
their struggle to avoid harm or reach a particular goal in a given situation.
Prior to its technical realization or the scientific reflection on it:
Information Fusion – an Omnipresent Phenomenon!
Branch of Applied Informatics: "Cognitive Tools"
1. Understanding, Automation, Enhancement
2. Integration of New Sources of Information
• networking, mobile sensors of high sensitivity and range
• new dimensions of apprehension otherwise hidden
• data base systems containing vast context information
• interaction with humans: exploit natural intelligence!
Sensor Data and Information Fusion for Producing Situation Pictures
Information to be fused: imprecise, incomplete, ambiguous, unresolved, false, deceptive, hard to formalize, contradictory, ...
Context Knowledge
Sensor Data & Information Fusion
Mapping of a dynamic overall scenario onto near real-time "situation pictures"
Multiple Sensor Data
Basis for Decision Support
Array-Sensor Signals
Condensed Information on Objects of Interest
Existence, number, geolocation, time, behavior, properties, class, identity, interrelations, history, sources/sinks, traffic analysis, anomaly detection, …
A Generic Tracking and Sensor Data Fusion System
Sensor Systems (delivering sensor data, accepting sensor control):
- Sensing Hardware: received waveforms
- Signal Processing: parameter estimation
- Detection Process: data rate reduction

Tracking & Fusion System:
- Sensor Data to Track Association
- Track Initiation: multiple frame track extraction
- Track Maintenance: prediction, filtering, retrodiction; track file storage
- A Priori Knowledge: sensor performance, object environment, object characteristics
- Track Processing: track cancellation, object classification / ID, track-to-track fusion
- Man-Machine Interface: displaying functions, object representation, interaction facilities
W. Koch, Walking Through the JDL Model
On Characterizing Tracking / Fusion Performance
a well-understood paradigm: air surveillance with multiple radars. Many results can be transferred to other sensors (IR, E/O, sonar, acoustics).
Sensor Data Fusion: 'tracks' represent the available information on the targets associated with them, along with appropriate quality measures, thus providing answers to:
When? Where? How many? To which direction? How fast, accelerating? What?
By sensor data fusion we wish to establish one-to-one associations between:
targets in the field of view ↔ identified tracks in the tracking computer
Strictly speaking, this is only possible under ideal conditions regarding the sensor performance and the underlying target situation. The tracking/fusion performance can thus be measured by its deficiencies when compared with this ideal goal.
1. Let a target first be detected by a sensor at time ta. Usually, a time delay is involved until a confirmed track has finally been established at time te (track extraction). A 'measure of deficiency' is thus:

• extraction delay te − ta.
2. Unavoidably, false tracks will be extracted in the case of a high false-return density (e.g. clutter, jamming/deception), i.e. tracks related to unreal or unwanted targets. Corresponding 'deficiencies' are:

• mean number of falsely extracted targets per time,
• mean life time of a false track before its deletion.
3. A target should be represented by one and the same track until leaving the field of view. Related performance measures/deficiencies:

• mean life time of tracks related to true targets,
• probability of an 'identity switch' between targets,
• probability of a target not being represented by a track.
4. The track inaccuracy (error covariance of a state estimate) should be as small as possible. The deviations between estimated and actual target states should at least correspond with the error covariances produced (consistency). If this is not the case, we speak of a 'track loss'.

• A track must really represent a target!
Challenges:
• low detection probability • high clutter density • low update rate
• agile targets • dense target situations • formations, convoys
• target-split events (formation, weapons) • jamming, deception
Basic Tasks:
• models: sensor, target, environment → physics
• data association problems → combinatorics
• estimation problems → probability, statistics
• process control, realization → computer science
Target Tracking: Basic Idea, Demonstration
Problem-inherent uncertainties and ambiguities!
BAYES: processing scheme for 'soft', 'delayed' decisions
sensor performance: • resolution conflicts • DOPPLER blindness
environment: • dense situations • clutter • jamming/deception
target characteristics: • qualitatively distinct maneuvering phases
background knowledge: • vehicles on road networks • tactics
pdf: tk−1

'Probability density functions' (pdfs) $p(x_{k-1}|\mathcal{Z}^{k-1})$ represent imprecise knowledge on the 'state' $x_{k-1}$ based on imprecise measurements $\mathcal{Z}^{k-1}$.
pdf: tk−1
Prediction: tk

Exploit imprecise knowledge on the dynamical behavior of the object.

$$\underbrace{p(x_k|\mathcal{Z}^{k-1})}_{\text{prediction}} = \int dx_{k-1}\ \underbrace{p(x_k|x_{k-1})}_{\text{dynamics}}\ \underbrace{p(x_{k-1}|\mathcal{Z}^{k-1})}_{\text{old knowledge}}.$$
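For linear-GAUSSian models this prediction integral has a closed form. A minimal NumPy sketch, assuming an illustrative constant-velocity model (the matrices F and D below are demonstration choices, not from the slides):

```python
import numpy as np

def predict(x, P, F, D):
    """Chapman-Kolmogorov prediction for a linear-Gaussian model:
    p(x_k|Z^{k-1}) = N(x_k; F x, F P F^T + D)."""
    return F @ x, F @ P @ F.T + D

# illustrative constant-velocity model (state: position, velocity)
T = 1.0                                     # sampling interval (assumption)
F = np.array([[1.0, T], [0.0, 1.0]])        # deterministic motion
D = 0.1 * np.array([[T**3 / 3, T**2 / 2],   # process noise covariance
                    [T**2 / 2, T]])
x, P = np.array([0.0, 1.0]), np.eye(2)      # old knowledge p(x_{k-1}|Z^{k-1})
x_pred, P_pred = predict(x, P, F, D)        # covariance grows: dissipation
```

The growing covariance is exactly the 'knowledge dissipation' that appears whenever sensor information is missing.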
pdf: tk−1
tk: no plot
missing sensor detection: ‘data processing’ = prediction(not always: exploitation of ‘negative’ sensor evidence)
pdf: tk−1
pdf: tk
Prediction: tk+1
missing sensor information: increasing knowledge dissipation
pdf: tk−1
pdf: tk
tk+1: one plot
sensor information on the kinematical object state
pdf: tk−1
pdf: tk
Prediction: tk+1
likelihood (sensor model)

BAYES' formula:

$$\underbrace{p(x_{k+1}|\mathcal{Z}^{k+1})}_{\text{new knowledge}} = \frac{\overbrace{p(z_{k+1}|x_{k+1})}^{\text{plot}}\ \overbrace{p(x_{k+1}|\mathcal{Z}^{k})}^{\text{prediction}}}{\int dx_{k+1}\, p(z_{k+1}|x_{k+1})\, p(x_{k+1}|\mathcal{Z}^{k})}$$
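In the linear-GAUSSian case, BAYES' formula reduces to the familiar KALMAN update. A minimal sketch, assuming an illustrative position-only sensor model (H, R, and all numbers below are made up for demonstration):

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """BAYES' formula for a Gaussian prediction and Gaussian likelihood:
    p(x|Z^{k+1}) ∝ N(z; H x, R) N(x; x_pred, P_pred)."""
    nu = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    W = P_pred @ H.T @ np.linalg.inv(S)      # gain
    return x_pred + W @ nu, P_pred - W @ S @ W.T

H = np.array([[1.0, 0.0]])                   # position-only sensor (assumption)
R = np.array([[0.5]])                        # measurement noise (assumption)
x_pred = np.array([1.0, 1.0])
P_pred = np.array([[2.0, 1.0], [1.0, 1.0]])
x_upd, P_upd = kalman_update(x_pred, P_pred, np.array([1.4]), H, R)
```

The update pulls the estimate toward the plot and shrinks the covariance: new knowledge from the sensor.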
pdf: tk−1
pdf: tk
pdf: tk+1 (Bayes)
filtering = sensor data processing
pdf: tk−1
pdf: tk
tk+1: three plots

ambiguities by false plots: 1 + 3 data interpretation hypotheses ('detection probability', false alarm statistics)
pdf: tk−1
pdf: tk
pdf: tk+1
Multimodal pdfs reflect ambiguities inherent in the data.
pdf: tk−1
pdf: tk
pdf: tk+1
Prediction: tk+2
temporal propagation: dissipation of the probability densities
pdf: tk−1
pdf: tk
pdf: tk+1
tk+2: one plot

association tasks: sensor data ↔ interpretation hypotheses
pdf: tk−1
pdf: tk
pdf: tk+1
Prediction: tk+2
likelihood

BAYES:

$$p(x_{k+2}|\mathcal{Z}^{k+2}) = \frac{p(z_{k+2}|x_{k+2})\, p(x_{k+2}|\mathcal{Z}^{k+1})}{\int dx_{k+2}\, p(z_{k+2}|x_{k+2})\, p(x_{k+2}|\mathcal{Z}^{k+1})}$$
pdf: tk−1
pdf: tk
pdf: tk+1
pdf: tk+2
in particular: re-calculation of the hypothesis weights
pdf: tk−1
pdf: tk
pdf: tk+1
pdf: tk+2
How does new knowledge affect what is known about a past state?
pdf: tk−1
pdf: tk
Retrodiction: tk+1
pdf: tk+2
‘retrodiction’: a retrospective analysis of the past
pdfs: tk−1, tk, tk+1, tk+2
optimal information processing at present and for the past
Multiple Hypothesis Tracking: Basic Idea
Iterative updating of conditional probability densities!
kinematic target state xk at time tk, accumulated sensor data Zk
a priori knowledge: target dynamics models, sensor model, road maps
• prediction: $p(x_{k-1}|\mathcal{Z}^{k-1}) \xrightarrow[\text{road maps}]{\text{dynamics model}} p(x_k|\mathcal{Z}^{k-1})$

• filtering: $p(x_k|\mathcal{Z}^{k-1}) \xrightarrow[\text{sensor model}]{\text{sensor data } Z_k} p(x_k|\mathcal{Z}^{k})$

• retrodiction: $p(x_{l-1}|\mathcal{Z}^{k}) \xleftarrow[\text{dynamics model}]{\text{filtering output}} p(x_{l}|\mathcal{Z}^{k})$
– finite mixture: inherent ambiguity (data, model, road network)
– optimal estimators: e.g. minimum mean squared error (MMSE)
– track initiation, termination: sequential likelihood ratio testing
Summary: BAYESian (Multi-) Sensor Tracking
• Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.

• Objective: Learn as much as possible about the individual target states at each time by analyzing the 'time series' constituted by the sensor data.

• Problem: imperfect sensor information: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets' temporal evolution is usually not well known.

• Approach: Interpret measurements and target states as random variables (RVs). Describe by probability density functions (pdfs) what is known about them.

• Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all available background information! Derive state estimates from the pdfs along with appropriate quality measures!
How to deal with probability density functions?
• pdf $p(x)$: Extract probability statements about the RV $x$ by integration!

• naïvely: positive and normalized functions ($p(x) \ge 0$, $\int dx\, p(x) = 1$)

• conditional pdf $p(x|y) = \frac{p(x,y)}{p(y)}$: impact of information on $y$ on the RV $x$?

• marginal density $p(x) = \int dy\, p(x,y) = \int dy\, p(x|y)\, p(y)$: enter $y$!

• BAYES: $p(x|y) = \frac{p(y|x)\,p(x)}{p(y)} = \frac{p(y|x)\,p(x)}{\int dx\, p(y|x)\,p(x)}$: obtain $p(x|y)$ from $p(y|x)$ and $p(x)$!

• certain knowledge on $x$: $p(x) = \delta(x-y)$ '$=$' $\lim_{\sigma\to 0} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\frac{(x-y)^2}{\sigma^2}}$

• transformed RV $y = t[x]$: $p(y) = \int dx\, p(y,x) = \int dx\, p(y|x)\, p_x(x) = \int dx\, \delta(y - t[x])\, p_x(x) =: [T p_x](y)$ ($T\colon p_x \mapsto p$, "transfer operator")
The Multivariate GAUSSian Pdf
– wanted: probabilities 'concentrated' around a center $\bar{x}$

– quadratic distance: $q(x) = \frac{1}{2}(x-\bar{x})^\top P^{-1}(x-\bar{x})$

$q(x)$ defines an ellipsoid around $\bar{x}$, its volume and orientation determined by a matrix $P$ (symmetric: $P^\top = P$, positive definite: all eigenvalues $> 0$).

– first attempt: $p(x) = e^{-q(x)} / \int dx\, e^{-q(x)}$ (normalized!)

$$p(x) = \mathcal{N}(x;\, \bar{x}, P) = \frac{1}{\sqrt{|2\pi P|}}\, e^{-\frac{1}{2}(x-\bar{x})^\top P^{-1}(x-\bar{x})}$$

$E[x] = \bar{x}$, $E[(x-\bar{x})(x-\bar{x})^\top] = P$ (covariance matrix)

– GAUSSian mixtures: $p(x) = \sum_i p_i\, \mathcal{N}(x;\, \bar{x}_i, P_i)$ (weighted sums)
Moment Matching: Approximate an arbitrary pdf $p(x)$ with $E[x] = \bar{x}$, $C[x] = P$ by $p(x) \approx \mathcal{N}(x;\, \bar{x}, P)$!

here especially: $p(x) = \sum_i p_i\, \mathcal{N}(x;\, x_i, P_i)$ (GAUSSian mixtures)

$$\bar{x} = \sum_i p_i\, x_i, \qquad P = \sum_i p_i \Big( P_i + \underbrace{(x_i - \bar{x})(x_i - \bar{x})^\top}_{\text{spread term}} \Big)$$
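The two matching sums can be checked numerically. A minimal sketch with an illustrative two-component one-dimensional mixture (weights and moments are arbitrary choices):

```python
import numpy as np

def moment_match(weights, means, covs):
    """Collapse a Gaussian mixture to one Gaussian with the same mean
    and covariance: x̄ = Σ p_i x_i,
    P = Σ p_i (P_i + (x_i − x̄)(x_i − x̄)^T)."""
    x_bar = sum(p * x for p, x in zip(weights, means))
    P = sum(p * (C + np.outer(x - x_bar, x - x_bar))
            for p, x, C in zip(weights, means, covs))
    return x_bar, P

w = [0.5, 0.5]
means = [np.array([-1.0]), np.array([1.0])]
covs = [np.eye(1), np.eye(1)]
x_bar, P = moment_match(w, means, covs)   # spread term inflates P
```

The spread term is what distinguishes the collapsed covariance from a naive average of the component covariances.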
A Useful Product Formula for GAUSSians

$$\mathcal{N}(z;\, Hx, R)\ \mathcal{N}(x;\, y, P) = \underbrace{\mathcal{N}(z;\, Hy, S)}_{\text{independent of } x} \times\ \mathcal{N}\big(x;\, y + W\nu,\ P - WSW^\top\big)$$

with $\mathcal{N}\big(x;\, y + W\nu,\ P - WSW^\top\big) = \mathcal{N}\big(x;\, Q(P^{-1}y + H^\top R^{-1}z),\ Q\big)$ and

$$\nu = z - Hy, \quad S = HPH^\top + R, \quad W = PH^\top S^{-1}, \quad Q^{-1} = P^{-1} + H^\top R^{-1}H.$$

Sketch of the proof:

• Interpret $\mathcal{N}(z;\, Hx, R)\ \mathcal{N}(x;\, y, P)$ as a joint pdf $p(z|x)p(x) = p(z,x)$.

• Show that $p(z,x)$ is a GAUSSian: $p(z,x) = \mathcal{N}\!\left( \begin{pmatrix} z \\ x \end{pmatrix};\, \begin{pmatrix} Hy \\ y \end{pmatrix},\, \begin{pmatrix} S & HP \\ PH^\top & P \end{pmatrix} \right)$.

• Calculate from $p(z,x)$ the marginal and conditional pdfs $p(z)$ and $p(x|z)$.

• From $p(z,x) = p(z|x)p(x) = p(x|z)p(z)$ we obtain the result.
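Since the product formula is an exact identity, it can be verified by evaluating both sides at an arbitrary point. A sketch in plain NumPy (the matrices and evaluation points are illustrative):

```python
import numpy as np

def gauss(x, mean, cov):
    """Evaluate the multivariate Gaussian density N(x; mean, cov)."""
    d = x - mean
    k = len(mean)
    return np.exp(-0.5 * d @ np.linalg.solve(cov, d)) / \
        np.sqrt((2.0 * np.pi) ** k * np.linalg.det(cov))

H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
y, P = np.array([0.0, 1.0]), np.array([[2.0, 0.3], [0.3, 1.0]])
z, x = np.array([0.7]), np.array([0.4, 0.9])   # arbitrary evaluation points

nu = z - H @ y                                 # ν = z − Hy
S = H @ P @ H.T + R                            # S = HPH^T + R
W = P @ H.T @ np.linalg.inv(S)                 # W = PH^T S^-1

lhs = gauss(z, H @ x, R) * gauss(x, y, P)
rhs = gauss(z, H @ y, S) * gauss(x, y + W @ nu, P - W @ S @ W.T)
# lhs and rhs agree to floating-point precision
```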
More Precise Formulation of the BAYESian Approach
Consider a set of measurements $Z_l = \{z_l^j\}_{j=1}^{m_l}$ of a single or multiple target state $x_l$ at time instants $t_l$, $l = 1, \ldots, k$, and the time series

$$\mathcal{Z}^k = \{Z_k, m_k,\ Z_{k-1}, m_{k-1},\ \ldots,\ Z_1, m_1\} = \{Z_k, m_k,\ \mathcal{Z}^{k-1}\}!$$

Based on $\mathcal{Z}^k$, what can be learned about the object states $x_l$ at $t_1, \ldots, t_k, t_{k+1}, \ldots$, i.e. for the past, present, and future?

Evidently, the answer is given by calculating the pdf $p(x_l|\mathcal{Z}^k)$!

multiple sensor measurement fusion: Calculate $p(x|\mathcal{Z}^k_1, \ldots, \mathcal{Z}^k_N)$!

• communication lines • common coordinate system: sensor registration
How to calculate the pdf p(xl|Zk)?
Consider at first the present time: l = k.
an observation:
BAYES' rule:

$$p(x_k|\mathcal{Z}^k) = p(x_k|Z_k, m_k, \mathcal{Z}^{k-1}) = \frac{\overbrace{p(Z_k, m_k|x_k, \mathcal{Z}^{k-1})}^{\text{likelihood function}}\ \overbrace{p(x_k|\mathcal{Z}^{k-1})}^{\text{prediction}}}{\int dx_k\, p(Z_k, m_k|x_k, \mathcal{Z}^{k-1})\, p(x_k|\mathcal{Z}^{k-1})}$$

• $p(x_k|\mathcal{Z}^{k-1})$ is a prediction of the target state at time $t_k$ based on all measurements in the past.

• $p(Z_k, m_k|x_k) \propto \ell(x_k;\, Z_k, m_k)$ describes what the current sensor output $Z_k, m_k$ can say about the current target state $x_k$ and is called the likelihood function.
• $p(x_k|\mathcal{Z}^{k-1})$ is a prediction of the target state at time $t_k$ based on all measurements in the past.

$$p(x_k|\mathcal{Z}^{k-1}) = \int dx_{k-1}\, p(x_k, x_{k-1}|\mathcal{Z}^{k-1}) \quad \text{(marginal pdf)}$$
$$= \int dx_{k-1}\, \underbrace{p(x_k|x_{k-1}, \mathcal{Z}^{k-1})}_{\text{object dynamics!}}\ \underbrace{p(x_{k-1}|\mathcal{Z}^{k-1})}_{\text{idea: iteration!}} \quad \text{(notion of a conditional pdf)}$$

often: $p(x_k|x_{k-1}, \mathcal{Z}^{k-1}) = p(x_k|x_{k-1})$ (MARKOV)

sometimes: $p(x_k, i_k|x_{k-1}, i_{k-1}) = \underbrace{p_{i_k i_{k-1}}}_{\text{phase transition}}\ \mathcal{N}\big(x_k;\, \underbrace{F^{i_k}_{k|k-1}}_{\text{deterministic}} x_{k-1},\ \underbrace{D^{i_k}_{k|k-1}}_{\text{random}}\big)$ (IMM)
• $p(Z_k, m_k|x_k) \propto \ell(x_k;\, Z_k, m_k)$, the likelihood function:

sometimes: $\ell(x_k;\, Z_k, m_k) = (1 - P_D)\rho_c + P_D \sum_{j=1}^{m_k} \mathcal{N}\big(z_k^j;\, H_k^j x_k,\ R_k^j\big)$ (single target in clutter)

iteration formula:

$$p(x_k|\mathcal{Z}^k) = \frac{\ell(x_k;\, Z_k, m_k) \int dx_{k-1}\, p(x_k|x_{k-1})\, p(x_{k-1}|\mathcal{Z}^{k-1})}{\int dx_k\, \ell(x_k;\, Z_k, m_k) \int dx_{k-1}\, p(x_k|x_{k-1})\, p(x_{k-1}|\mathcal{Z}^{k-1})}$$
How to calculate the pdf p(xl|Zk)?
Consider the past: l < k!

an observation:

$$p(x_l|\mathcal{Z}^k) = \int dx_{l+1}\, p(x_l, x_{l+1}|\mathcal{Z}^k) = \int dx_{l+1}\, \frac{\overbrace{p(x_{l+1}|x_l)}^{\text{dynamics model}}\ \overbrace{p(x_l|\mathcal{Z}^l)}^{\text{filtering } t_l}}{\int dx_l\, p(x_{l+1}|x_l)\, p(x_l|\mathcal{Z}^l)}\ \underbrace{p(x_{l+1}|\mathcal{Z}^k)}_{\text{retrodiction: } t_{l+1}}$$

• $p(x_{l+1}|\mathcal{Z}^k)$ retrodiction: known from the last iteration step

• $p(x_{l+1}|x_l)$ dynamic object behavior (possibly multiple models)

• $p(x_l|\mathcal{Z}^l)$ filtering at the time for which retrodiction is considered

• GAUSSians or GAUSSian mixtures: Exploit the product formula!

• linear GAUSSian likelihood/dynamics: Rauch-Tung-Striebel smoothing
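For linear-GAUSSian models the retrodiction integral yields the Rauch-Tung-Striebel recursion. One backward step as a sketch (the model matrices and the 'smoothed' input below are illustrative assumptions):

```python
import numpy as np

def rts_step(x_f, P_f, x_pred, P_pred, x_s_next, P_s_next, F):
    """One Rauch-Tung-Striebel retrodiction step: combine the filtering
    result at t_l with the already-smoothed result at t_{l+1}.
      x_f, P_f           : filtering    p(x_l|Z^l)
      x_pred, P_pred     : prediction   p(x_{l+1}|Z^l)
      x_s_next, P_s_next : retrodiction p(x_{l+1}|Z^k)"""
    G = P_f @ F.T @ np.linalg.inv(P_pred)       # smoother gain
    x_s = x_f + G @ (x_s_next - x_pred)
    P_s = P_f + G @ (P_s_next - P_pred) @ G.T
    return x_s, P_s

F = np.array([[1.0, 1.0], [0.0, 1.0]])          # illustrative dynamics
x_f, P_f = np.array([0.0, 1.0]), np.eye(2)
x_pred, P_pred = F @ x_f, F @ P_f @ F.T + 0.1 * np.eye(2)
# pretend later data pulled the smoothed estimate at t_{l+1} slightly back
x_s_next, P_s_next = x_pred - np.array([0.2, 0.0]), 0.8 * P_pred
x_s, P_s = rts_step(x_f, P_f, x_pred, P_pred, x_s_next, P_s_next, F)
```

Iterating this step backwards from l = k−1 to the start of the track reproduces the retrodiction recursion for the whole past.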
A Side Result: Expected Measurements

innovation statistics, expectation gates, gating

$$p(z_k|\mathcal{Z}^{k-1}) = \int dx_k\, p(z_k|x_k)\, p(x_k|\mathcal{Z}^{k-1}) = \int dx_k\, \underbrace{\mathcal{N}(z_k;\, H_k x_k, R_k)}_{\text{likelihood: sensor model}}\ \underbrace{\mathcal{N}(x_k;\, x_{k|k-1}, P_{k|k-1})}_{\text{prediction at time } t_k} = \mathcal{N}\big(z_k;\, H_k x_{k|k-1},\ S_{k|k-1}\big) \quad \text{(product formula)}$$

innovation: $\nu_{k|k-1} = z_k - H_k x_{k|k-1}$

innovation covariance: $S_{k|k-1} = H_k P_{k|k-1} H_k^\top + R_k$

expectation gate: $\nu_{k|k-1}^\top S_{k|k-1}^{-1} \nu_{k|k-1} \le \lambda(P_c)$

MAHALANOBIS: ellipsoid containing $z_k$ with probability $P_c$

$\lambda(P_c)$: gating parameter ($\to \chi^2$-table)
Sensor data of uncertain origin
• prediction: $x_{k|k-1}$, $P_{k|k-1}$ (dynamics)
• innovation: $\nu_k = z_k - H x_{k|k-1}$, white
• MAHALANOBIS norm: $||\nu_k||^2 = \nu_k^\top S_k^{-1} \nu_k$
• expected plot: $z_k \sim \mathcal{N}(H x_{k|k-1}, S_k)$
• $\nu_k \sim \mathcal{N}(0, S_k)$, $S_k = H P_{k|k-1} H^\top + R$
• gating: $||\nu_k|| < \lambda$, $P_c(\lambda)$ correlation probability
missing/false plots, measurement errors, scan rate, agile targets: large gates
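The gate test amounts to comparing a MAHALANOBIS distance against a χ²-quantile. A sketch in plain NumPy; for a 2-dimensional measurement the quantile has the closed form used below (other dimensions would need a χ²-table or library; all matrices are illustrative):

```python
import numpy as np

def in_gate(z, z_pred, S, lam):
    """Expectation gate: accept plot z if the Mahalanobis distance
    nu^T S^-1 nu of the innovation does not exceed lam."""
    nu = z - z_pred
    return float(nu @ np.linalg.solve(S, nu)) <= lam

# chi^2 quantile for a 2-dim measurement: the cdf is 1 - exp(-x/2),
# hence lam(P_c) = -2 ln(1 - P_c); P_c = 0.99 gives lam ≈ 9.21
lam = -2.0 * np.log(1.0 - 0.99)

z_pred = np.array([0.0, 0.0])
S = np.array([[2.0, 0.0], [0.0, 1.0]])
near_ok = in_gate(np.array([1.0, 1.0]), z_pred, S, lam)   # True
far_ok = in_gate(np.array([6.0, 0.0]), z_pred, S, lam)    # False
```

Large measurement errors, low update rates, or agile targets inflate S and thus widen the gate, as noted above.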
more conventional tracking methods:
• KALMAN filter + gating
• Nearest-Neighbor filter (NN)
• Probabilistic Data Association Filter (PDAF)
• Joint PDAF (JPDAF, multiple targets)

problem: limited applicability

nearly optimal solution (BAYES):

$$\left.\begin{matrix}\text{multiple target}\\ \text{multiple hypothesis}\\ \text{multiple model}\end{matrix}\right\}\ \text{tracking}\ \left\{\begin{matrix}\text{prediction}\\ \text{filtering}\\ \text{retrodiction}\end{matrix}\right.$$

ad-hoc tracking methods as limiting cases!
ambiguous sensor data ($P_D < 1$, $\rho_F > 0$)

$m_k + 1$ possible interpretations of the sensor data $Z_k = \{z_k^j\}_{j=1}^{m_k}$!

• $E_0$: target was not detected; $m_k$ false returns in the field of view (FoV)

• $E_j$, $j = 1, \ldots, m_k$: target detected; $z_k^j$ originates from the target; $m_k - 1$ false returns
Incorporate the interpretations $E_j$ into the likelihood function $p(Z_k, m_k|x_k)$!

$$p(Z_k, m_k|x_k) = \sum_{j=0}^{m_k} p(Z_k, m_k, E_j|x_k) = \sum_{j=0}^{m_k} p(Z_k, m_k|E_j, x_k)\, p(E_j|x_k)$$
The factors are given by the detection and clutter models:

$$p(E_j|x_k) = \begin{cases} 1 - P_D & j = 0 \\ \tfrac{1}{m_k} P_D & j \ne 0 \end{cases}$$

$$p(Z_k, m_k|E_j, x_k) = p(Z_k|m_k, E_j, x_k)\, p(m_k|E_j, x_k) = \begin{cases} p_F(m_k)\, |\mathrm{FoV}|^{-m_k} & j = 0 \\ p_F(m_k - 1)\, |\mathrm{FoV}|^{-(m_k-1)}\, \mathcal{N}\big(z_k^j;\, H x_k, R\big) & j \ne 0 \end{cases}$$
With a POISSON model for the number of false returns,

$$p_F(m_k) = \frac{(\rho_F |\mathrm{FoV}|)^{m_k}}{m_k!}\, e^{-\rho_F |\mathrm{FoV}|} \quad \text{(POISSON)},$$

the likelihood function becomes

$$p(Z_k, m_k|x_k) \propto (1 - P_D)\,\rho_F + P_D \sum_{j=1}^{m_k} \mathcal{N}\big(z_k^j;\, H x_k, R\big) =: \ell(Z_k, m_k;\, x_k)$$
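The single-target-in-clutter likelihood is thus a constant floor plus a GAUSSian sum over the m_k returns. A sketch that evaluates ℓ up to the stated proportionality (scalar position measurement; all parameter values are illustrative):

```python
import numpy as np

def clutter_likelihood(x, plots, H, R, P_D, rho_F):
    """l(Z_k, m_k; x_k) ∝ (1 - P_D) rho_F
    + P_D * sum_j N(z_k^j; H x_k, R), up to a constant factor."""
    Hx = H @ x
    k = len(Hx)
    norm = np.sqrt((2.0 * np.pi) ** k * np.linalg.det(R))
    gauss_sum = sum(
        np.exp(-0.5 * (z - Hx) @ np.linalg.solve(R, z - Hx)) / norm
        for z in plots)
    return (1.0 - P_D) * rho_F + P_D * gauss_sum

H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])
x = np.array([0.0, 1.0])
plots = [np.array([0.1]), np.array([5.0]), np.array([-4.0])]  # 1 near, 2 far
val = clutter_likelihood(x, plots, H, R, P_D=0.9, rho_F=0.01)
```

The near plot dominates the sum; far plots contribute only through the clutter floor, which is what keeps the decision 'soft'.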
Overview: MHT for Well-separated Targets
ambiguous data $Z_k$ → interpretation hypotheses $E_k$:
the target was not detected ($1 - P_D$), or $z_k \in Z_k$ originated from the target ($P_D$): $m_k + 1$ interpretations

hypothetical interpretation histories for $\mathcal{Z}^k$:

• tree structure: $\mathcal{H}_k = (E_{\mathcal{H}_k}, \mathcal{H}_{k-1})$

• current: $E_{\mathcal{H}_k}$, pre-histories: $\mathcal{H}_{k-i}$

normal mixtures ↔ MHT (total probability):

$$p(x_k|\mathcal{Z}^k) = \sum_{\mathcal{H}_k} p_{\mathcal{H}_k}\, \mathcal{N}\big(x_k;\, x_{\mathcal{H}_k},\ P_{\mathcal{H}_k}\big)$$

$x_{\mathcal{H}_k}$, $P_{\mathcal{H}_k}$: update by KALMAN iteration!

$$p_{\mathcal{H}_k} = \frac{p^*_{\mathcal{H}_k}}{\sum_{\mathcal{H}_k} p^*_{\mathcal{H}_k}}, \qquad p^*_{\mathcal{H}_k} = p_{\mathcal{H}_{k-1}} \begin{cases} (1 - P_D)\,\rho_F & \text{missed detection} \\ P_D\, \mathcal{N}(\nu_{\mathcal{H}_k};\, 0, S_{\mathcal{H}_k}) & \text{detection} \end{cases}$$
Problem: growing memory disaster: $m$ data, $N$ hypotheses → $N(m+1)$ continuations

radical solution: mono-hypothesis approximation

• gating: Exclude competing data with $||\nu^i_{k|k-1}|| > \lambda$! → KALMAN filter (KF)
  + very simple, − λ too small: loss of the target measurement

• Force a unique interpretation in case of a conflict! Look for the smallest statistical distance: $\min_i ||\nu^i_{k|k-1}||$ → Nearest-Neighbor filter (NN)
  + one hypothesis, − hard decision, − not adaptive

• global combining: Merge all hypotheses! → PDAF, JPDAF filter
  + all data, + adaptive, − reduced applicability
The qualitative shape of p(x_k | Z^k) is often much simpler
than its correct representation: a few pronounced modes.

adaptive solution: nearly optimal approximation

• individual gating: Exclude irrelevant data!
  before continuing existing track hypotheses H^{k−1}
  → limiting case: Kalman filter (KF)

• pruning: Kill hypotheses of very small weight!
  after calculating the weights p_{H^k}, before filtering
  → limiting case: Nearest-Neighbor filter (NN)

• local combining: Merge similar hypotheses!
  after the complete calculation of the pdfs
  → limiting case: PDAF (global combining)
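The pruning step in the list above reduces to a threshold and a renormalization; this sketch (with an assumed threshold value) is illustrative, not the slides' implementation.

```python
import numpy as np

def prune(weights, threshold=1e-3):
    """Pruning step of the adaptive MHT approximation: drop hypotheses
    whose weight p_{H^k} falls below the threshold, then renormalize
    the surviving weights so they again sum to one."""
    weights = np.asarray(weights, dtype=float)
    keep = weights >= threshold
    kept = weights[keep]
    return keep, kept / kept.sum()

keep, w = prune([0.7, 0.2999, 1e-5, 1e-4])
```

Only the two significant hypotheses survive; in the limiting case where a single hypothesis remains after every scan, the scheme degenerates to the Nearest-Neighbor filter.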
Retrodiction for Gaussian Mixtures

wanted: p(x_l | Z^k) ← p(x_{l+1} | Z^k) for l < k

p(x_l | Z^k) = Σ_{H^k} p(x_l, H^k | Z^k)
             = Σ_{H^k} p(x_l | H^k, Z^k) · p(H^k | Z^k)
                       [no ambiguities!]    [filtering!]

Calculation of p(x_l | H^k, Z^k) as in the case P_D = 1, ρ_F = 0:

p(x_l | H^k, Z^k) = N(x_l; x_{H^k}(l|k), P_{H^k}(l|k))

with parameters given by the Rauch–Tung–Striebel formulae:

x_{H^k}(l|k) = x_{H^k}(l|l) + W_{H^k}(l|k) (x_{H^k}(l+1|k) − x_{H^k}(l+1|l))

P_{H^k}(l|k) = P_{H^k}(l|l) + W_{H^k}(l|k) (P_{H^k}(l+1|k) − P_{H^k}(l+1|l)) W_{H^k}(l|k)^⊤

gain matrix: W_{H^k}(l|k) = P_{H^k}(l|l) F_{l+1|l}^⊤ P_{H^k}(l+1|l)^{−1}
Retrodiction of the Hypotheses' Weights

Consider an approximation: neglect the RTS step!

p(x_l | H^k, Z^k) = N(x_l; x_{H^k}(l|k), P_{H^k}(l|k)) ≈ N(x_l; x_{H^k}(l|l), P_{H^k}(l|l))

p(x_l | Z^k) ≈ Σ_{H^l} p*_{H^l} N(x_l; x_{H^l}(l|l), P_{H^l}(l|l))

with recursively defined weights: p*_{H^k} = p_{H^k},  p*_{H^l} = Σ p*_{H^{l+1}},
summation over all histories H^{l+1} with equal pre-history H^l!

• Strong sons strengthen weak fathers.
• Weak sons weaken even strong fathers.
• If all sons die, the father must die as well.
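The backward weight recursion is a plain sum over sons with the same pre-history; the tree encoding as a dict and the weights below are illustrative assumptions.

```python
def retrodict_weights(sons):
    """Backward weight recursion p*_{H^l} = sum of p*_{H^{l+1}} over all
    sons sharing the pre-history H^l.  The hypothesis tree is encoded as
    a dict mapping parent-history id -> list of son weights (sketch)."""
    return {parent: sum(w) for parent, w in sons.items()}

p_star = retrodict_weights({
    "H_a": [0.4, 0.35],   # strong sons strengthen the father
    "H_b": [0.25],
    "H_c": [],            # all sons died -> the father must die
})
```

The three slogans on the slide are visible directly in the arithmetic: a father's retrodicted weight is exactly the mass its descendants still carry.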
IMM Modeling: Suboptimal Realizations

• Conventional Kalman filtering
  Only one component: worst-case assumption.

• Standard IMM filter (as discussed!)
  Approximate after prediction, before the update, by r components!
  Effort: ~ r Kalman filters.

• GPB: Generalized Pseudo-Bayesian
  Approximate after measurement processing by r components!
  Effort: ~ r² Kalman filters.

• IMM-MHT filter (nearly optimal)
  Accept longer dynamics histories: variable number of components!

Extendable to ambiguity with respect to sensor models!
Track Extraction: Initiation of the PDF Iteration

extraction of target tracks: detection on a higher level of abstraction

start: data sets Z_k = {z_k^j}_{j=1}^{m_k} (sensor performance: P_D, ρ_F, R)

goal: Detect a target trajectory in a time series Z^k = {Z_i}_{i=1}^k !

at first, simplifying assumptions:

• The targets in the sensors' field of view (FoV) are well separated.
• The sensor data in the FoV in scan i are produced simultaneously.

decision between two competing hypotheses:

h_1: Besides false returns, Z^k also contains target measurements.
h_0: No target exists in the FoV; all data in Z^k are false.

statistical decision errors:

P_1 = Prob(accept h_1 | h_1), analogous to the sensors' P_D
P_0 = Prob(accept h_1 | h_0), analogous to the sensors' P_F
Practical Approach: Sequential Likelihood Ratio Test

Goal: Decide as fast as possible for given decision errors P_0, P_1!

Consider the ratio of the conditional probabilities p(h_1 | Z^k), p(h_0 | Z^k) and the
likelihood ratio LR(k) = p(Z^k | h_1) / p(Z^k | h_0) as an intuitive decision function:

p(h_1 | Z^k) / p(h_0 | Z^k) = [ p(Z^k | h_1) / p(Z^k | h_0) ] · [ p(h_1) / p(h_0) ],
a priori: p(h_1) = p(h_0)

Starting from a time window of length k = 1, calculate the test function
LR(k) successively and compare it with two thresholds A, B:

If LR(k) < A, accept hypothesis h_0 (i.e. no target exists)!
If LR(k) > B, accept hypothesis h_1 (i.e. a target exists in the FoV)!
If A < LR(k) < B, wait for new data Z_{k+1} and repeat with LR(k+1)!
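The three-way decision rule is a short loop; the per-scan LR factors and default error probabilities below are illustrative assumptions, and the thresholds use Wald's approximations A ≈ (1 − P_1)/(1 − P_0), B ≈ P_1/P_0.

```python
def sprt(per_scan_factors, P0=1e-3, P1=0.95):
    """Sequential likelihood-ratio test sketch.  per_scan_factors are
    hypothetical factors LR(k)/LR(k-1) delivered by the tracker for each
    new scan.  Returns ('h0' | 'h1' | 'undecided', scans used)."""
    A = (1.0 - P1) / (1.0 - P0)   # lower threshold (accept h0)
    B = P1 / P0                   # upper threshold (accept h1)
    LR = 1.0
    for k, factor in enumerate(per_scan_factors, start=1):
        LR *= factor
        if LR < A:
            return "h0", k        # no target: drop the candidate
        if LR > B:
            return "h1", k        # target exists: extract the track
    return "undecided", len(per_scan_factors)

confirm = sprt([100.0, 100.0])   # strongly target-like evidence
reject = sprt([0.01])            # strongly false-alarm-like evidence
```

Note that the number of scans consumed differs between the two calls, illustrating that the decision length is itself random.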
Sequential LR Test: Some Useful Properties

1. Thresholds and decision errors are approximately related to each other by:
   A ≈ (1 − P_1) / (1 − P_0) and B ≈ P_1 / P_0.

2. The actual decision length (number of scans required) is a random variable.

3. On average, the test has a minimal decision length for given errors P_0, P_1.

4. The quantity P_0 (P_1) affects the mean decision length given that h_1 (h_0) holds.

5. Choose the probability P_1 close to 1 for actually detecting real target tracks.

6. P_0 should be small so as not to overload the tracking system with false tracks.
Iterative Calculation of the Likelihood Ratio

LR(k) = p(Z^k | h_1) / p(Z^k | h_0)

      = ∫ dx_k p(Z_k, m_k, x_k, Z^{k−1} | h_1) / p(Z_k, m_k, Z^{k−1} | h_0)

      = ∫ dx_k p(Z_k, m_k | x_k) p(x_k | Z^{k−1}, h_1) p(Z^{k−1} | h_1)
        / [ |FoV|^{−m_k} p_F(m_k) p(Z^{k−1} | h_0) ]

      = [ ∫ dx_k p(Z_k, m_k | x_k, h_1) p(x_k | Z^{k−1}, h_1)
        / ( |FoV|^{−m_k} p_F(m_k) ) ] · LR(k − 1)

basic idea: iterative calculation!

Let H^k = {E_k, H^{k−1}} be an interpretation history of the time series Z^k = {Z_k, Z^{k−1}}:
E_k = E_k^0: the target was not detected; E_k = E_k^j: z_k^j ∈ Z_k is a target measurement.

p(x_k | Z^{k−1}, h_1) = Σ_{H^{k−1}} p(x_k | H^{k−1}, Z^{k−1}, h_1) p(H^{k−1} | Z^{k−1}, h_1)
— the standard MHT prediction!

p(Z_k, m_k | x_k, h_1) = Σ_{E_k} p(Z_k, E_k | x_k, h_1)
— the standard MHT likelihood function!

The calculation of the likelihood ratio is just a by-product of Bayesian MHT tracking.
Sequential Track Extraction: Discussion

• LR(k) is given by a growing number of summands, each related to a particular
  interpretation history. The tuple {λ_k^j, x_k^j, P_k^j} is called a sub-track.

• To mitigate growing-memory problems, all approximations discussed for
  track maintenance can be used, provided they do not significantly affect LR(k):

  – individual gating: Exclude data not likely to be associated.

  – pruning: Kill sub-tracks contributing only marginally to the test function.

  – local combining: Merge similar sub-tracks:
    {λ_i, x_i, P_i}_i → {λ, x, P} with: λ = Σ_i λ_i,
    x = (1/λ) Σ_i λ_i x_i,  P = (1/λ) Σ_i λ_i [P_i + (x_i − x)(x_i − x)^⊤].

• The LR test ends with a decision in favor of or against the hypotheses h_0
  (no target) or h_1 (target exists). Intuitive interpretation of the thresholds!
track extraction at t_k: Decide in favor of h_1!

initiation of the pdf iteration (track maintenance):

Normalize the coefficients λ_k^j: p_k^j = λ_k^j / Σ_j λ_k^j !

(λ_k^j, x_k^j, P_k^j) → p(x_k | Z^k) = Σ_j p_k^j N(x_k; x_k^j, P_k^j)

Continue track extraction with the remaining sensor data!

sequential LR test for track monitoring:

After deciding in favor of h_1, reset LR(0) = 1! Calculate LR(k) from p(x_k | Z^k)!

track confirmation: LR(k) > P_1 / P_0: reset LR(0) = 1!

track deletion: LR(k) < (1 − P_1) / (1 − P_0); possibly track re-initiation.