Estimation and Detection
1
Lecture 11: Detection Theory – Unknown Parameters
Dr. ir. Richard C. Hendriks – 11/12/2015
2
Previous Lecture

$H_0: T(\mathbf{x}) < \gamma$
$H_1: T(\mathbf{x}) > \gamma$

Using detection theory, rules can be derived on how to choose $\gamma$ and how to find $T(\mathbf{x})$.

• Neyman-Pearson theorem: $L(\mathbf{x}) = \frac{p(\mathbf{x};H_1)}{p(\mathbf{x};H_0)} > \gamma$,
where $\gamma$ is found from $P_{FA} = \int_{\{\mathbf{x}:\, L(\mathbf{x}) > \gamma\}} p(\mathbf{x};H_0)\, d\mathbf{x} = \alpha$
• Minimum probability of error: $\frac{p(\mathbf{x}|H_1)}{p(\mathbf{x}|H_0)} > \frac{P(H_0)}{P(H_1)} = \gamma$
• Bayesian detector: $\frac{p(\mathbf{x}|H_1)}{p(\mathbf{x}|H_0)} > \frac{(C_{10}-C_{00})\,P(H_0)}{(C_{01}-C_{11})\,P(H_1)} = \gamma$.

Similar test statistic, different threshold.
3
Deterministic Signals – Interpretation

Interpretation 1: The resulting $T(\mathbf{x}) = \sum_{n=0}^{N-1} x[n]s[n]$ is a correlator. The received data is correlated with a replica of the signal.

Interpretation 2: The resulting $T(\mathbf{x}) = \sum_{n=0}^{N-1} x[n]s[n]$ is a matched filter.

Fig. 4.1 from Kay-II.
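The equivalence of the two interpretations can be checked numerically. A minimal sketch (the signal $s$ and all parameter values are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
s = np.cos(2 * np.pi * 0.1 * np.arange(N))   # assumed known signal replica
x = s + rng.normal(0, 1, N)                  # received data: signal plus WGN

# Interpretation 1: correlator -- correlate the data with a replica of s
T_corr = np.sum(x * s)

# Interpretation 2: matched filter -- filter with time-reversed s, sample at n = N-1
T_mf = np.convolve(x, s[::-1])[N - 1]

print(np.isclose(T_corr, T_mf))  # True: both give the same statistic
```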
4
Deterministic Signals – Summary

Deterministic signals:
$T(\mathbf{x}) = \mathbf{x}^T C^{-1} \mathbf{s}$

Notice that if $C$ is positive definite, $C^{-1}$ can be written as $C^{-1} = D^T D$, leading to
$T(\mathbf{x}) = \mathbf{x}^T D^T D \mathbf{s}$

Fig. 4.7 Kay-II.
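One valid choice for $D$ is the (transposed) Cholesky factor of $C^{-1}$, which makes $D$ a prewhitener. A sketch with an assumed AR(1)-like covariance (the covariance and dimensions are illustrative):

```python
import numpy as np

N = 6
# assumed positive-definite noise covariance (AR(1)-like correlation)
C = 0.8 ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
Cinv = np.linalg.inv(C)

# Cholesky: Cinv = L L^T with L lower triangular; take D = L^T
L = np.linalg.cholesky(Cinv)
D = L.T
print(np.allclose(D.T @ D, Cinv))   # C^{-1} = D^T D

# both forms of the statistic agree: x^T C^{-1} s == (D x)^T (D s)
rng = np.random.default_rng(6)
x, s = rng.normal(size=N), rng.normal(size=N)
print(np.isclose(x @ Cinv @ s, (D @ x) @ (D @ s)))
```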
5
Random Signals – Estimator Correlator

Hence, we decide for $H_1$ if
$T(\mathbf{x}) = \mathbf{x}^T \hat{\mathbf{s}} > \gamma''$
with
$\hat{\mathbf{s}} = \frac{1}{\sigma^2}\left(\frac{1}{\sigma^2} I + C_s^{-1}\right)^{-1} \mathbf{x} = C_s (C_s + \sigma^2 I)^{-1} \mathbf{x}$

Fig. 5.2 Kay-II.
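The two forms of $\hat{\mathbf{s}}$ above can be verified to agree numerically; the covariance $C_s$ below is an assumed AR(1)-like example, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)
N, sigma2 = 8, 0.5
# assumed positive-definite signal covariance (AR(1)-like)
Cs = 0.9 ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
x = rng.normal(0, 1, N)

# Wiener estimate of s, then correlate: s_hat = Cs (Cs + sigma2 I)^{-1} x
s_hat = Cs @ np.linalg.solve(Cs + sigma2 * np.eye(N), x)
T = x @ s_hat

# equivalent form: (1/sigma2) ((1/sigma2) I + Cs^{-1})^{-1} x
s_hat2 = (1 / sigma2) * np.linalg.solve(np.eye(N) / sigma2 + np.linalg.inv(Cs), x)
print(np.allclose(s_hat, s_hat2))  # True
```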
6
Random Signals – Exercise

Hence, we decide for $H_1$ if
$T(\mathbf{x}) = \mathbf{x}^T \hat{\mathbf{s}} > \gamma''$
with
$\hat{\mathbf{s}} = \frac{1}{\sigma^2}\left(\frac{1}{\sigma^2} I + C_s^{-1}\right)^{-1} \mathbf{x} = C_s (C_s + \sigma^2 I)^{-1} \mathbf{x}$

Fig. 5.2 Kay-II.
Problem Ch. 5 Kay-II.
7
Today

Deterministic signals:
$T(\mathbf{x}) = \mathbf{x}^T C^{-1} \mathbf{s}$

Random signals:
$T(\mathbf{x}) = \mathbf{x}^T \hat{\mathbf{s}} > \gamma''$
with
$\hat{\mathbf{s}} = \frac{1}{\sigma^2}\left(\frac{1}{\sigma^2} I + C_s^{-1}\right)^{-1} \mathbf{x} = C_s (C_s + \sigma^2 I)^{-1} \mathbf{x}$

Detection under NP:
NP requires perfect knowledge of $p(\mathbf{x};H_0)$ and $p(\mathbf{x};H_1)$ (and thus $\mathbf{s}$ and/or $C_s$ and $\sigma^2$). What if this information is unknown?
8
Composite Hypothesis Testing

Remember this example:
$H_0: x[n] = w[n] \quad n = 0, 1, \ldots, N-1$
$H_1: x[n] = A + w[n] \quad n = 0, 1, \ldots, N-1$
where $A > 0$ and $w[n]$ is WGN with variance $\sigma^2$. The NP detector decides $H_1$ if
$L(\mathbf{x}) = \frac{p(\mathbf{x};H_1)}{p(\mathbf{x};H_0)} > \gamma$
9
Composite Hypothesis Testing

We then have
$\frac{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n]-A)^2\right]}{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} x^2[n]\right]} > \gamma$

Taking the logarithm of both sides and simplifying results in
$\frac{1}{N} \sum_{n=0}^{N-1} x[n] > \frac{\sigma^2}{NA} \ln\gamma + \frac{A}{2} = \gamma'$

where the threshold is found from
$P_{FA} = \Pr(T(\mathbf{x}) > \gamma'; H_0) = Q\left(\frac{\gamma'}{\sqrt{\sigma^2/N}}\right) \;\Rightarrow\; \gamma' = \sqrt{\frac{\sigma^2}{N}}\, Q^{-1}(P_{FA})$
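As a sanity check, the threshold formula can be verified by Monte Carlo simulation under $H_0$. A sketch with assumed illustrative values $N=10$, $\sigma^2=1$, $P_{FA}=0.1$ (`norm.isf` plays the role of $Q^{-1}$):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
N, sigma2, PFA = 10, 1.0, 0.1

# threshold gamma' = sqrt(sigma2/N) * Q^{-1}(PFA)
gamma = np.sqrt(sigma2 / N) * norm.isf(PFA)

# Monte Carlo check under H0: x[n] = w[n], statistic is the sample mean
trials = 100_000
xbar = rng.normal(0, np.sqrt(sigma2), (trials, N)).mean(axis=1)
pfa_hat = np.mean(xbar > gamma)
print(abs(pfa_hat - PFA) < 0.01)  # empirical PFA matches the design value
```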
10
Composite Hypothesis Testing

What if the value of $A$ is unknown?
$H_0: x[n] = w[n] \quad n = 0, 1, \ldots, N-1$
$H_1: x[n] = A + w[n] \quad n = 0, 1, \ldots, N-1$
where $A$ is unknown, but we know $A > 0$. Further, we know that $w[n]$ is WGN with variance $\sigma^2$. The NP detector decides $H_1$ if
$L(\mathbf{x}) = \frac{p(\mathbf{x};A,H_1)}{p(\mathbf{x};A,H_0)} > \gamma$

The PDFs are parameterized by $A$! This is called composite hypothesis testing.
11
Composite Hypothesis Testing

We now have
$\frac{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n]-A)^2\right]}{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} x^2[n]\right]} > \gamma$

Taking the logarithm of both sides and simplifying results in
$\frac{1}{N} \sum_{n=0}^{N-1} x[n] > \frac{\sigma^2}{NA} \ln\gamma + \frac{A}{2} = \gamma'$

where the threshold is found from
$P_{FA} = \Pr(T(\mathbf{x}) > \gamma'; H_0) = Q\left(\frac{\gamma'}{\sqrt{\sigma^2/N}}\right) \;\Rightarrow\; \gamma' = \sqrt{\frac{\sigma^2}{N}}\, Q^{-1}(P_{FA})$

In this case, neither the test statistic $T(\mathbf{x})$ nor the threshold $\gamma'$ depends on $A$. Even though $A$ is unknown, we can thus implement this detector. (Although $P_D$ does depend on $A$:)
$P_D = Q\left(Q^{-1}(P_{FA}) - \sqrt{\frac{NA^2}{\sigma^2}}\right)$
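The dependence of $P_D$ on $A$ is easy to evaluate numerically. A small sketch with assumed values $N=10$, $\sigma^2=1$, $P_{FA}=0.1$ (`norm.sf`/`norm.isf` serve as $Q$/$Q^{-1}$):

```python
import numpy as np
from scipy.stats import norm

N, sigma2, PFA = 10, 1.0, 0.1
Q, Qinv = norm.sf, norm.isf

def pd(A):
    # P_D = Q( Q^{-1}(P_FA) - sqrt(N A^2 / sigma^2) ), valid for A > 0
    return Q(Qinv(PFA) - np.sqrt(N * A**2 / sigma2))

print(np.isclose(pd(0.0), PFA))        # True: at A = 0, P_D collapses to P_FA
print(pd(1.0) > pd(0.5) > pd(0.1) > PFA)  # True: P_D grows with A
```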
12
Composite Hypothesis Testing – UMP

The test
$\frac{1}{N} \sum_{n=0}^{N-1} x[n] > \frac{\sigma^2}{NA} \ln\gamma + \frac{A}{2} = \sqrt{\frac{\sigma^2}{N}}\, Q^{-1}(P_{FA})$
leads to the highest $P_D$ (remember that NP maximizes $P_D$) for any value of $A$ (as long as $A > 0$). Such a test is called a Uniformly Most Powerful (UMP) test. Any other test will have poorer performance.

However... often a UMP test does not exist.
13
Composite Hypothesis Testing – UMP

Consider the same example. What if $-\infty < A < \infty$ instead of $A > 0$? In that case we will have a different test for negative and positive $A$:

$A > 0: \quad \frac{1}{N} \sum_{n=0}^{N-1} x[n] > \sqrt{\frac{\sigma^2}{N}}\, Q^{-1}(P_{FA})$
$A < 0: \quad \frac{1}{N} \sum_{n=0}^{N-1} x[n] < -\sqrt{\frac{\sigma^2}{N}}\, Q^{-1}(P_{FA})$

In this case, the UMP test does not exist (as $A$ can be either $> 0$ or $< 0$). Instead of an optimal test we will have to use a sub-optimal test. However, the unrealizable optimal test can be used for performance comparison (such a test is called a clairvoyant detector).
14
Example – DC level in WGN, unknown A

Let us first calculate the clairvoyant detector for the case $-\infty < A < \infty$:

$A > 0: \quad \frac{1}{N} \sum_{n=0}^{N-1} x[n] > \gamma'_+$
$A < 0: \quad \frac{1}{N} \sum_{n=0}^{N-1} x[n] < \gamma'_-$

For $A > 0$:
$P_{FA} = \Pr\{\bar{x} > \gamma'_+; H_0\} = Q\left(\frac{\gamma'_+}{\sqrt{\sigma^2/N}}\right)$

For $A < 0$:
$P_{FA} = \Pr\{\bar{x} < \gamma'_-; H_0\} = 1 - Q\left(\frac{\gamma'_-}{\sqrt{\sigma^2/N}}\right) = Q\left(\frac{-\gamma'_-}{\sqrt{\sigma^2/N}}\right)$
15
Example – DC level in WGN, unknown A

The detection performance of the clairvoyant detector:

For $A > 0$:
$P_D = \Pr\{\bar{x} > \gamma'_+; H_1\} = Q\left(\frac{\gamma'_+ - A}{\sqrt{\sigma^2/N}}\right) = Q\left(Q^{-1}(P_{FA}) - \sqrt{\frac{NA^2}{\sigma^2}}\right)$

For $A < 0$:
$P_D = 1 - Q\left(\frac{\gamma'_- - A}{\sqrt{\sigma^2/N}}\right) = Q\left(\frac{-\gamma'_- + A}{\sqrt{\sigma^2/N}}\right) = Q\left(Q^{-1}(P_{FA}) + \frac{A}{\sqrt{\sigma^2/N}}\right)$

Fig. 6.3 Kay-II.
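A short sketch evaluating the clairvoyant $P_D$ for both signs of $A$ (parameter values $N=10$, $\sigma^2=1$, $P_{FA}=0.1$ assumed for illustration); note that the performance depends only on $|A|$:

```python
import numpy as np
from scipy.stats import norm

N, sigma2, PFA = 10, 1.0, 0.1
Q, Qinv = norm.sf, norm.isf

def pd_clairvoyant(A):
    # the clairvoyant detector knows the sign of A and tests on the correct side
    if A > 0:
        return Q(Qinv(PFA) - np.sqrt(N * A**2 / sigma2))
    return Q(Qinv(PFA) + A / np.sqrt(sigma2 / N))

# symmetric in |A|, and always better than guessing at rate PFA
print(np.isclose(pd_clairvoyant(0.5), pd_clairvoyant(-0.5)))  # True
print(pd_clairvoyant(0.5) > PFA)                              # True
```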
16
Example – DC level in WGN, unknown A

Instead of the clairvoyant detector, let's look at the realisable detector:
$\left|\frac{1}{N} \sum_{n=0}^{N-1} x[n]\right| > \gamma''$,
where $A$ is now unknown and $-\infty < A < \infty$.

$P_{FA} = \Pr\{|\bar{x}| > \gamma''; H_0\} = 2\Pr\{\bar{x} > \gamma''; H_0\} = 2Q\left(\frac{\gamma''}{\sqrt{\sigma^2/N}}\right)$
$\gamma'' = \sqrt{\sigma^2/N}\, Q^{-1}(P_{FA}/2)$

$P_D = \Pr\{|\bar{x}| > \gamma''; H_1\} = Q\left(Q^{-1}(P_{FA}/2) - \sqrt{\frac{NA^2}{\sigma^2}}\right) + Q\left(Q^{-1}(P_{FA}/2) + \frac{A}{\sqrt{\sigma^2/N}}\right)$
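The $P_D$ expression for the two-sided detector can be cross-checked by Monte Carlo simulation under $H_1$; the values $N=10$, $\sigma^2=1$, $P_{FA}=0.1$, $A=0.8$ below are assumed for illustration:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
N, sigma2, PFA, A = 10, 1.0, 0.1, 0.8
Q, Qinv = norm.sf, norm.isf

gamma2 = np.sqrt(sigma2 / N) * Qinv(PFA / 2)          # gamma''
pd_formula = (Q(Qinv(PFA / 2) - np.sqrt(N * A**2 / sigma2))
              + Q(Qinv(PFA / 2) + A / np.sqrt(sigma2 / N)))

# Monte Carlo under H1: x[n] = A + w[n], decide H1 if |xbar| > gamma''
xbar = (A + rng.normal(0, np.sqrt(sigma2), (200_000, N))).mean(axis=1)
pd_mc = np.mean(np.abs(xbar) > gamma2)
print(abs(pd_mc - pd_formula) < 0.01)  # empirical PD matches the formula
```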
17
Example – DC level in WGN, unknown A

$P_D = \Pr\{|\bar{x}| > \gamma''; H_1\} = Q\left(Q^{-1}(P_{FA}/2) - \sqrt{\frac{NA^2}{\sigma^2}}\right) + Q\left(Q^{-1}(P_{FA}/2) + \frac{A}{\sqrt{\sigma^2/N}}\right)$

Fig. 6.4 Kay-II.

The performance of this realisable detector is thus not optimal, but close to that of the optimal clairvoyant detector.
18
Exercise
Problems Ch. 6 Kay-II.
19
Approaches for composite Hyp. testing

Two approaches:
• Bayesian approach: Consider unknown parameters as realizations of random variables and assign a prior pdf.
• Generalized likelihood ratio: Estimate unknown parameters using MLEs.

Bayesian approach:
Assign priors to the unknown parameters $\theta_0$ and $\theta_1$ under hypotheses $H_0$ and $H_1$, respectively:
$p(\mathbf{x};H_0) = \int p(\mathbf{x}|\theta_0;H_0)\, p(\theta_0)\, d\theta_0$
$p(\mathbf{x};H_1) = \int p(\mathbf{x}|\theta_1;H_1)\, p(\theta_1)\, d\theta_1$

NP detector:
$\frac{p(\mathbf{x};H_1)}{p(\mathbf{x};H_0)} = \frac{\int p(\mathbf{x}|\theta_1;H_1)\, p(\theta_1)\, d\theta_1}{\int p(\mathbf{x}|\theta_0;H_0)\, p(\theta_0)\, d\theta_0} > \gamma$

Drawbacks of the Bayesian approach:
• Need to choose a prior pdf.
• Integration can be difficult.
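The marginalization can be carried out numerically. A sketch for the DC-level example, assuming (as an illustrative choice, not from the slides) a uniform prior on $(0, A_{max})$ for $A$ under $H_1$ and a known $A = 0$ under $H_0$:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

N, sigma2, Amax = 10, 1.0, 2.0           # assumed illustrative values
rng = np.random.default_rng(4)
x = 0.7 + rng.normal(0, 1.0, N)          # data generated with a true A = 0.7

def likelihood(x, A):
    # p(x | A): product of N Gaussian densities with mean A, variance sigma2
    return np.prod(norm.pdf(x, loc=A, scale=np.sqrt(sigma2)))

p_H0 = likelihood(x, 0.0)                                      # no unknown under H0
p_H1, _ = quad(lambda A: likelihood(x, A) / Amax, 0.0, Amax)   # integrate out A
L = p_H1 / p_H0                                                # Bayesian likelihood ratio
print(np.isfinite(L) and L > 0)
```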
20
Exercise – Bayesian Composite NP Detector

NP detector:
$\frac{p(\mathbf{x};H_1)}{p(\mathbf{x};H_0)} = \frac{\int p(\mathbf{x}|\theta_1;H_1)\, p(\theta_1)\, d\theta_1}{\int p(\mathbf{x}|\theta_0;H_0)\, p(\theta_0)\, d\theta_0} > \gamma$

Example Bayesian NP detector:
We observe one sample $x[0]$ having an exponential pdf
$p(x[0]) = \lambda \exp(-\lambda x[0])$,
where $\lambda$ is unknown. For the hypothesis testing problem
$H_0: \lambda = \lambda_0$
$H_1: \lambda \neq \lambda_0$,
determine the Bayesian composite NP detector if it is given that $\lambda$ is uniformly distributed between $0$ and $C$.
21
Generalized Likelihood Ratio Test

GLRT:
• Replace unknown parameters by their MLEs.
• GLRT:
$L_G(\mathbf{x}) = \frac{p(\mathbf{x}; \hat{\theta}_1, H_1)}{p(\mathbf{x}; \hat{\theta}_0, H_0)} > \gamma$
with $p(\mathbf{x}; \hat{\theta}_i, H_i)$ given by
$p(\mathbf{x}; \hat{\theta}_i, H_i) = \max_{\theta_i} p(\mathbf{x}; \theta_i, H_i)$
22
Example: DC in WGN with Unknown Amplitude – GLRT

Remember this example:
$H_0: x[n] = w[n] \quad n = 0, 1, \ldots, N-1$
$H_1: x[n] = A + w[n] \quad n = 0, 1, \ldots, N-1$
where $-\infty < A < \infty$ and $w[n]$ is WGN with variance $\sigma^2$. The detector decides $H_1$ if the GLRT
$L_G(\mathbf{x}) = \frac{p(\mathbf{x}; \hat{\theta}_1, H_1)}{p(\mathbf{x}; \hat{\theta}_0, H_0)} > \gamma$
with $p(\mathbf{x}; \hat{\theta}_i, H_i)$ given by
$p(\mathbf{x}; \hat{\theta}_i, H_i) = \max_{\theta_i} p(\mathbf{x}; \theta_i, H_i)$
23
Example: DC in WGN with Unknown Amplitude – GLRT

MLE of $A$:
$p(\mathbf{x}; \hat{A}, H_1) = \max_A p(\mathbf{x}; A, H_1) = \max_A \frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n]-A)^2\right]$.

This will lead to
$\hat{A} = \frac{1}{N} \sum x[n] = \bar{x}$.

Thus, the GLRT is
$L_G(\mathbf{x}) = \frac{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} (x[n]-\bar{x})^2\right]}{\frac{1}{(2\pi\sigma^2)^{N/2}} \exp\left[-\frac{1}{2\sigma^2} \sum_{n=0}^{N-1} x^2[n]\right]} > \gamma$

Taking the logarithm of both sides we have
$\ln L_G(\mathbf{x}) = -\frac{1}{2\sigma^2}(-2N\bar{x}^2 + N\bar{x}^2) = \frac{N\bar{x}^2}{2\sigma^2}$
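The closed-form $\ln L_G(\mathbf{x}) = N\bar{x}^2/(2\sigma^2)$ can be checked against a direct evaluation of the two log-densities; the data values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
N, sigma2 = 10, 1.0
x = 0.5 + rng.normal(0, np.sqrt(sigma2), N)   # H1 data with an assumed A = 0.5

A_hat = x.mean()                              # MLE of A is the sample mean

# direct evaluation of ln L_G(x) from the ratio of the two Gaussian densities
lnL_direct = (-(1 / (2 * sigma2)) * np.sum((x - A_hat) ** 2)
              + (1 / (2 * sigma2)) * np.sum(x ** 2))

# closed form from the slide: N * xbar^2 / (2 sigma^2)
lnL_closed = N * A_hat**2 / (2 * sigma2)
print(np.isclose(lnL_direct, lnL_closed))  # True
```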
24
Example: DC in WGN with Unknown Amplitude – GLRT

$\ln L_G(\mathbf{x}) = -\frac{1}{2\sigma^2}(-2N\bar{x}^2 + N\bar{x}^2) = \frac{N\bar{x}^2}{2\sigma^2}$

We thus decide for $H_1$ if
$|\bar{x}| > \gamma''$.

$P_{FA} = \Pr\{|\bar{x}| > \gamma''; H_0\} = 2\Pr\{\bar{x} > \gamma''; H_0\} = 2Q\left(\frac{\gamma''}{\sqrt{\sigma^2/N}}\right)$
$\gamma'' = \sqrt{\sigma^2/N}\, Q^{-1}(P_{FA}/2)$

$P_D = \Pr\{|\bar{x}| > \gamma''; H_1\} = Q\left(Q^{-1}(P_{FA}/2) - \sqrt{\frac{NA^2}{\sigma^2}}\right) + Q\left(Q^{-1}(P_{FA}/2) + \frac{A}{\sqrt{\sigma^2/N}}\right)$

This detector is identical to the realisable detector we looked at before.