1 Markov random field: A brief introduction Tzu-Cheng Jen Institute of Electronics, NCTU 2007-03-28.
Outline
Neighborhood system and cliques
Markov random field
Optimization-based vision problem
Solver for the optimization problem
Neighborhood system and cliques
Prior knowledge
To explain the concept of the MRF, we first introduce the following definitions:
1. i: a site (pixel)
2. Ni: the set of sites neighboring i
3. S: the set of sites (the image)
4. fi: the value at site i (intensity)
f1 f2 f3
f4 fi f6
f7 f8 f9
A 3×3 example image
Neighborhood system
The sites in S are related to one another via a neighborhood system, defined for S as

\[ N = \{\, N_i \mid \forall i \in S \,\} \]

where Ni is the set of sites neighboring i. The neighboring relationship has the following properties:
(1) A site is not a neighbor of itself: \( i \notin N_i \)
(2) The neighboring relationship is mutual: \( i' \in N_i \iff i \in N_{i'} \)

f1 f2 f3
f4 fi f6
f7 f8 f9
Neighborhood system: Example
First order neighborhood system
Second order neighborhood system
Nth order neighborhood system
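These neighborhood systems on an image grid can be sketched in a few lines; `neighbors` below is a hypothetical helper (not from the slides) for a rows × cols grid:

```python
def neighbors(i, j, rows, cols, order=1):
    """Neighbors of site (i, j) on a rows x cols image grid.

    order=1: the 4 horizontally/vertically adjacent sites (first-order system).
    order=2: the 8 surrounding sites, adding the diagonals (second-order system).
    """
    if order == 1:
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0)]
    return {(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < rows and 0 <= j + dj < cols}

# The center site of a 3x3 image:
print(len(neighbors(1, 1, 3, 3, order=1)))  # 4
print(len(neighbors(1, 1, 3, 3, order=2)))  # 8
```

Both defining properties hold by construction: a site never appears in its own neighbor set, and the relation is mutual because the offset sets are symmetric.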
Neighborhood system: Example
The neighboring sites of site i are m, n, and f.
The neighboring sites of site j are r and x.
Clique
A clique c is a subset of sites in S in which every pair of distinct sites are neighbors (a single site also forms a clique). Some examples follow:
Clique: Example
Take the first-order and second-order neighborhood systems as examples:
Neighborhood system Clique types
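For the first-order system, the clique types are single sites plus horizontally or vertically adjacent pairs. A small helper (hypothetical, not from the slides) enumerating the pairwise cliques of a grid:

```python
def pair_cliques(rows, cols):
    """Enumerate the pairwise cliques of a first-order (4-connected) grid:
    every horizontally or vertically adjacent pair of sites, counted once."""
    cliques = []
    for i in range(rows):
        for j in range(cols):
            if j + 1 < cols:
                cliques.append(((i, j), (i, j + 1)))  # horizontal pair
            if i + 1 < rows:
                cliques.append(((i, j), (i + 1, j)))  # vertical pair
    return cliques

# A 3x3 image has 9 single-site cliques and 12 pairwise ones (6 horizontal + 6 vertical).
print(len(pair_cliques(3, 3)))  # 12
```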
Markov random field
Markov random field (MRF)
View the 2D image f as a collection of random variables (a random field):

Image configuration f:
f1 f2 f3
f4 fi f6
f7 f8 f9

A random field is said to be a Markov random field if it satisfies the following properties:

\[ (1)\ \ P(f) > 0 \quad \text{(positivity)} \]
\[ (2)\ \ P(f_i \mid f_{S \setminus \{i\}}) = P(f_i \mid f_{N_i}) \quad \text{(Markovianity)} \]
Gibbs random field (GRF) and Gibbs distribution
A random field is said to be a Gibbs random field if and only if its configuration f obeys a Gibbs distribution, that is:

\[ P(f) = \frac{1}{Z}\, e^{-\frac{1}{T} U(f)} \]

where the energy U(f) is a sum of clique potentials:

\[ U(f) = \sum_{c \in C} V_c(f) = \sum_{\{i\} \in C_1} V_1(f_i) + \sum_{\{i, i'\} \in C_2} V_2(f_i, f_{i'}) + \cdots = \sum_{i \in S} V_1(f_i) + \sum_{i \in S} \sum_{i' \in N_i} V_2(f_i, f_{i'}) + \cdots \]

U(f): energy function; T: temperature; Vc(f): clique potential.
Design U for different applications.
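As an illustration, the sketch below evaluates the Gibbs distribution \( P(f) = e^{-U(f)/T}/Z \) exactly on a 2×2 binary image, computing Z by brute force over all 16 configurations. The pairwise potential (charging β for each unequal neighboring pair, an Ising-style choice) is an assumption for the example, not specified on the slides:

```python
import itertools
import math

def energy(f, beta=1.0):
    """U(f) as a sum of pairwise clique potentials over the 4-connected pairs
    of a 2x2 image f = (f0, f1, f2, f3): V2(fi, fj) = beta if fi != fj else 0."""
    pairs = [(0, 1), (2, 3), (0, 2), (1, 3)]  # horizontal and vertical cliques
    return sum(beta for a, b in pairs if f[a] != f[b])

def gibbs_prob(f, T=1.0):
    """P(f) = exp(-U(f)/T) / Z, with Z summed over all 2^4 binary configurations."""
    Z = sum(math.exp(-energy(g) / T) for g in itertools.product((0, 1), repeat=4))
    return math.exp(-energy(f) / T) / Z

smooth = gibbs_prob((0, 0, 0, 0))  # uniform image: lowest energy
rough = gibbs_prob((0, 1, 1, 0))   # checkerboard: highest energy
print(smooth > rough)  # True: smoother configurations are more probable
```

With this U, low-energy (smooth) images receive higher probability, which is exactly the behavior the prior models in the later slides exploit.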
Markov-Gibbs equivalence
Hammersley–Clifford theorem: a random field F is an MRF if and only if F is a GRF.

Proof (⇐): Let P(f) be a Gibbs distribution on S with the neighborhood system N. For any site i,

\[ P(f_i \mid f_{S \setminus \{i\}}) = \frac{P(f)}{P(f_{S \setminus \{i\}})} = \frac{e^{-\sum_{c \in C} V_c(f)}}{\sum_{f'_i} e^{-\sum_{c \in C} V_c(f')}} \]

where f' denotes any configuration that agrees with f at all sites except i. We want to show that

\[ P(f_i \mid f_{S \setminus \{i\}}) = P(f_i \mid f_{N_i}) \]
Markov-Gibbs equivalence
Divide C into two sets A and B, with A consisting of the cliques that contain i and B of the cliques that do not:

\[ P(f_i \mid f_{S \setminus \{i\}}) = \frac{\left[ e^{-\sum_{c \in A} V_c(f)} \right] \left[ e^{-\sum_{c \in B} V_c(f)} \right]}{\sum_{f'_i} \left\{ \left[ e^{-\sum_{c \in A} V_c(f')} \right] \left[ e^{-\sum_{c \in B} V_c(f')} \right] \right\}} = \frac{e^{-\sum_{c \in A} V_c(f)}}{\sum_{f'_i} e^{-\sum_{c \in A} V_c(f')}} = P(f_i \mid f_{N_i}) \]

The factors over B cancel because \( V_c(f') = V_c(f) \) for every clique c that does not contain i, and the remaining expression depends only on f_i and its neighbors.
Optimization-based vision problem
Denoising
Noisy signal d → denoised signal f
MAP formulation for denoising problem
The signal denoising problem can be modeled as a MAP estimation problem, that is,

\[ f^* = \arg\max_f \{ p(f \mid d) \} \]

By Bayes' rule:

\[ f^* = \arg\max_f \{ p(d \mid f)\, p(f) \} \]

f: unknown data; d: observed data; p(f): prior model; p(d | f): observation model.
MAP formulation for denoising problem
Assume the observation is the true signal plus independent Gaussian noise, that is,

\[ d_i = f_i + e_i, \qquad e_i \sim N(0, \sigma^2) \]

Under this assumption, the observation model can be expressed as

\[ p(d \mid f) = \prod_{i=1}^{m} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(f_i - d_i)^2 / 2\sigma^2} = \frac{1}{(2\pi\sigma^2)^{m/2}}\, e^{-U(d \mid f)}, \qquad U(d \mid f) = \sum_{i=1}^{m} \frac{(f_i - d_i)^2}{2\sigma^2} \]

U(d | f): likelihood energy
MAP formulation for denoising problem
Assume the unknown data f is an MRF; the prior model is then

\[ P(f) = \frac{1}{Z}\, e^{-\frac{1}{T} U(f)} \]

Based on the above information, the posterior probability becomes

\[ p(f \mid d) \propto P(d \mid f)\, P(f) = \frac{1}{(2\pi\sigma^2)^{m/2}}\, e^{-\sum_{i=1}^{m} (f_i - d_i)^2 / 2\sigma^2} \cdot \frac{1}{Z}\, e^{-\frac{1}{T} U(f)} \]
MAP formulation for denoising problem
The MAP estimator for the problem is:

\[ f^* = \arg\max_f \{ p(f \mid d) \} = \arg\max_f \{ p(d \mid f)\, p(f) \} = \arg\max_f \left\{ \frac{1}{(2\pi\sigma^2)^{m/2}}\, e^{-\sum_{i=1}^{m} (f_i - d_i)^2 / 2\sigma^2} \cdot \frac{1}{Z}\, e^{-\frac{1}{T} U(f)} \right\} \]

Taking the negative logarithm and dropping the constants (with T absorbed into U(f)) turns the maximization into a minimization:

\[ f^* = \arg\min_f \left\{ \sum_{i=1}^{m} \frac{(f_i - d_i)^2}{2\sigma^2} + U(f) \right\} = \arg\min_f \{ U(d \mid f) + U(f) \} \]
MAP formulation for denoising problem
Define the smoothness prior:

\[ U(f) = \sum_i (f_i - f_{i-1})^2 \]

Substituting this into the MAP estimator, we get:

\[ f^* = \arg\max_f \{ p(f \mid d) \} = \arg\min_f \{ U(d \mid f) + U(f) \} = \arg\min_f \left\{ \sum_{i=1}^{m} \frac{(f_i - d_i)^2}{2\sigma^2} + \sum_{i=1}^{m} (f_i - f_{i-1})^2 \right\} \]

The first term is the observation model (similarity measure); the second is the prior model (reconstruction constraint).
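This objective can be minimized by gradient descent (anticipating the solver section). The sketch below is a minimal 1-D version; the smoothness weight `lam` and the step size are assumed knobs added for the example, not values from the slides:

```python
import random

def denoise_map(d, sigma=1.0, lam=1.0, step=0.05, iters=500):
    """Minimize E(f) = sum_i (f_i - d_i)^2 / (2*sigma^2) + lam * sum_i (f_i - f_{i-1})^2
    by gradient descent, starting from f = d."""
    f = list(d)
    n = len(f)
    for _ in range(iters):
        grad = [(f[i] - d[i]) / sigma**2 for i in range(n)]  # data (likelihood) term
        for i in range(1, n):
            diff = f[i] - f[i - 1]          # one smoothness clique (f_i - f_{i-1})^2
            grad[i] += 2 * lam * diff
            grad[i - 1] -= 2 * lam * diff
        f = [f[i] - step * grad[i] for i in range(n)]
    return f

def var(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

rng = random.Random(0)
noisy = [0.5 * rng.gauss(0, 1) for _ in range(50)]   # noise around a flat signal
denoised = denoise_map(noisy, sigma=0.5, lam=2.0)
print(var(noisy) > var(denoised))  # the smoothness prior shrinks the fluctuations
```

The step size must be small enough for the quadratic objective to converge; increasing `lam` trades fidelity to d for smoothness.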
Super-resolution
Super-Resolution (SR): A method to reconstruct high-resolution images/videos from low-resolution images/videos
Super-resolution
Illustration for super-resolution
d(1) d(2) d(3) d(4)
f(1)
Use the low-resolution frames to reconstruct the high resolution frame
MAP formulation for super-resolution problem
The super-resolution problem can be modeled as a MAP estimation problem, that is,

\[ f^* = \arg\max_f \{ p(f \mid d^{(1)}, d^{(2)}, \ldots, d^{(M)}) \} \]

By Bayes' rule:

\[ f^* = \arg\max_f \{ p(d^{(1)}, d^{(2)}, \ldots, d^{(M)} \mid f)\, p(f) \} \]

f: high-resolution image; d^{(i)}: low-resolution images; p(f): prior model; p(d^{(1)}, ..., d^{(M)} | f): observation model.
MAP formulation for super-resolution problem
The conditional PDF can be modeled as a Gaussian distribution if the noise source is Gaussian:

\[ p(d^{(1)}, d^{(2)}, \ldots, d^{(M)} \mid f) \propto \exp\!\big( -H(d^{(1)}, d^{(2)}, \ldots, d^{(M)}, f) \big) \]

We also assume the prior model is a joint Gaussian distribution:

\[ p(f) \propto \exp\!\big( -(f - M)^T \Sigma^{-1} (f - M) \big) \]

where M is the mean of f and Σ is the covariance matrix.
MAP formulation for super-resolution problem
Substituting the above relations into the MAP estimator, we get the following expression:

\[ f^* = \arg\max_f \{ p(d^{(1)}, d^{(2)}, \ldots, d^{(M)} \mid f)\, p(f) \} = \arg\max_f \exp\!\big\{ -\big( H(d^{(1)}, d^{(2)}, \ldots, d^{(M)}, f) + (f - M)^T \Sigma^{-1} (f - M) \big) \big\} \]

\[ = \arg\min_f \big\{ H(d^{(1)}, d^{(2)}, \ldots, d^{(M)}, f) + (f - M)^T \Sigma^{-1} (f - M) \big\} = \arg\min_f E(f) \]

The H term comes from the observation model and the quadratic term from the prior model.
Solver for the optimization problem
The solver of the optimization problem
In this section, we introduce different approaches for solving the optimization problem:
1. Brute-force search (global extremum)
2. Gradient descent search (usually a local extremum)
3. Genetic algorithm (global extremum)
4. Simulated annealing algorithm (global extremum)
Gradient descent algorithm (1)
Gradient descent algorithm (2)
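The details on these two slides appear as figures in the original; the rule behind them is the update \( f_{k+1} = f_k - \alpha \nabla E(f_k) \). A minimal sketch:

```python
def gradient_descent(grad, x0, step=0.1, iters=200):
    """Generic gradient descent: repeatedly move against the gradient,
    x_{k+1} = x_k - step * grad(x_k), for a fixed iteration budget."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize E(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimizer is x = 3.
x_star = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_star, 3))  # 3.0
```

For the denoising and super-resolution energies above, `grad` would be the gradient of E(f) with respect to the image f.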
Simulation: SR by gradient descent algorithm
Use six low-resolution frames (a)–(f) to reconstruct the high-resolution frame (g)
Simulation: SR by gradient descent algorithm
The problem of the gradient descent algorithm
The gradient descent algorithm may get trapped in a local extremum instead of reaching the global extremum.
Genetic algorithm (GA)
The GA includes the following steps:
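The steps themselves are shown as a figure in the original slides; below is a minimal textbook-style sketch of the usual loop (tournament selection, one-point crossover, bit-flip mutation), applied to a toy bit-string energy. All parameter values are illustrative assumptions:

```python
import random

def genetic_minimize(energy, n_bits=16, pop_size=30, gens=60, p_mut=0.02, seed=1):
    """Minimal genetic algorithm over bit-strings: tournament selection,
    one-point crossover, and bit-flip mutation; returns the best individual."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = min(pop, key=energy)
    for _ in range(gens):
        new_pop = []
        for _ in range(pop_size):
            p1 = min(rng.sample(pop, 2), key=energy)   # tournament selection
            p2 = min(rng.sample(pop, 2), key=energy)
            cut = rng.randrange(1, n_bits)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            for k in range(n_bits):                    # bit-flip mutation
                if rng.random() < p_mut:
                    child[k] = 1 - child[k]
            new_pop.append(child)
        pop = new_pop
        best = min(pop + [best], key=energy)
    return best

# Toy energy: the number of 1-bits; the global minimum is the all-zero string.
best = genetic_minimize(lambda f: sum(f))
print(sum(best))  # near 0: selection drives the bit count down
```

For the vision problems above, an individual would encode an image configuration f and `energy` would be E(f).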
Simulated annealing (SA)
The SA includes the following steps:
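As with the GA slide, the steps appear as a figure in the original; a minimal textbook-style sketch of SA (propose a neighbor, always accept downhill moves, accept uphill moves with probability exp(−ΔE/T), then cool the temperature) on a toy multi-modal energy. The schedule and proposal distribution are illustrative assumptions:

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, T0=1.0, cooling=0.95,
                        iters=2000, seed=0):
    """Minimal simulated annealing: accept downhill moves unconditionally,
    uphill moves with probability exp(-dE/T), and cool T geometrically;
    returns the best state visited."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    T = T0
    for _ in range(iters):
        y = neighbor(x, rng)
        de = energy(y) - e
        if de <= 0 or rng.random() < math.exp(-de / T):
            x, e = y, e + de
            if e < best_e:
                best, best_e = x, e
        T *= cooling
    return best

# Toy multi-modal energy; the uphill acceptances let SA hop out of local minima.
energy = lambda x: x * x + 3 * math.sin(5 * x) + 3
neighbor = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_star = simulated_annealing(energy, neighbor, x0=4.0)
print(energy(x_star) < energy(4.0))  # the best state found improves on the start
```

The exponential acceptance rule mirrors the Gibbs distribution introduced earlier: at temperature T, states are visited roughly in proportion to exp(−E/T), and cooling concentrates the search near low-energy states.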