CHAPTER 4
IMAGE FUSION USING APPROXIMATION AND DETAIL
4.1 IMAGE FUSION
The successful fusion of images acquired from different modalities
or instruments is of great importance in many applications, such as medical
imaging, microscopic imaging, remote sensing, computer vision and robotics.
Image fusion can be defined as the process by which several images, or some
of their features, are combined to form a single image. Image fusion can be
performed at different levels of information representation; four such levels
can be distinguished: signal, pixel, feature and symbolic. Today, the results
of image fusion in areas such as remote sensing and medical imaging are
primarily intended for presentation to human observers for easier and
enhanced interpretation. Therefore, the perception of the fused image is of
paramount importance when evaluating different fusion schemes.
Some generic requirements can be imposed on the fusion result. They are:
The fused image should preserve as closely as possible all
relevant information contained in the input images.
The fusion process should not introduce any artifacts or
inconsistencies which can distract or mislead the human
observer or any subsequent image processing steps.
In the fused image, irrelevant features and noise should be
suppressed to a maximum extent.
4.2 PRINCIPLE OF IMAGE FUSION
Images of the same modality taken at the same time but from
different places or under different conditions are fused to generate a single
image containing the information of both. This type of fusion, called
multi-view image fusion, is the one used in this study. In a transverse
abdominal scan, the right side of the maternal abdomen appears on the left
side of the monitor and the maternal left side appears on the right. The aim
of image fusion is to combine relevant information from these source images
of different views into a single image using fusion rules. The image resulting
from the fusion contains most of the information from all the source images.
Applications differ from one another in the fusion rules they use.
Figure 4.1 Wavelet Decomposition and Image Fusion using Fusion Rules
The Figure 4.1 represents how the images are subjected to wavelet
decomposition and fused using the application of fusion rules.
F(x, y) = W^-1{φ[W(I1(x, y)), W(I2(x, y))]} (4.1)
Where,
I1(x, y) = input image 1 (transverse scans)
I2(x, y) = input image 2 (longitudinal scans)
F(x, y) = image fused using the fusion rule φ
W, W^-1 = the forward and inverse wavelet transforms
The Figure 4.2 shows how matching between the multi-view images
results in the generation of a single image.
Figure 4.2 Matching the images obtained from multiview
4.3 IMAGE FUSION TECHNIQUES
There are many approaches to fusion techniques.
1. Fusion Techniques based on Averaging
This is a basic and straightforward technique: fusion is achieved
by simply averaging the corresponding pixels of the input images as:
F(x, y) = (I1(x, y) + I2(x, y))/2 (4.2)
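As a minimal NumPy sketch of equation 4.2 (the function name and sample arrays are illustrative assumptions, not part of the thesis code):

```python
import numpy as np

def fuse_average(img1, img2):
    """Fuse two registered, equal-sized grayscale images by
    pixel-wise averaging (equation 4.2)."""
    a = img1.astype(np.float64)
    b = img2.astype(np.float64)
    return (a + b) / 2.0

# Usage on small synthetic images
img1 = np.array([[100, 200], [50, 150]], dtype=np.uint8)
img2 = np.array([[120, 180], [70, 130]], dtype=np.uint8)
fused = fuse_average(img1, img2)  # [[110., 190.], [60., 140.]]
```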
2. Fusion Techniques based on Principal Component Analysis
The Principal Component Analysis (PCA) involves a mathematical
procedure that transforms a number of correlated variables into a number of
uncorrelated variables called principal components. It computes a compact
and optimal description of the data set. The first principal component
accounts for as much of the variance in the data as possible and each
succeeding component accounts for as much of the remaining variance as
possible. The first principal component is taken to be along the direction of
maximum variance. The second principal component is constrained to lie in
the subspace perpendicular to the first; within this subspace, it points in the
direction of maximum variance. The third principal component is taken along
the maximum-variance direction in the subspace perpendicular to the first
two, and so on. PCA is also called the Karhunen-Loeve transform or the
Hotelling transform. It is a general statistical technique that transforms
multivariate data with correlated variables into one with uncorrelated
variables. A univariate measure uses a single image, whereas a bivariate
measure is a comparison between two images. The new variables are obtained
as linear combinations of the original variables, where the input images to be
fused are arranged as two column vectors. The eigenvector corresponding to
the larger eigenvalue is obtained, and the normalized components are
computed from it.
PCA Algorithm
Let the source images be arranged as two column vectors. The steps
to be followed to project this data into a 2-D subspace are:
1. Organize the data into column vectors. The resulting matrix Z
is of dimension 2Xn.
2. Compute the empirical mean along each column. The
empirical mean vector Me has a dimension of 1X2.
3. Subtract the empirical mean vector Me from each column of
the data matrix Z. The resulting matrix X is of dimension 2Xn.
4. Find the covariance matrix C of X, i.e. C = cov(X).
5. Compute the Eigen vectors V and Eigen value D of C and sort
them by decreasing Eigen value. Both V and D are of
dimension 2X2.
6. Consider the first column of V which corresponds to larger
Eigen value to compute P1 and P2 as:
P1 = V(1) / ΣV (4.3)
P2 = V(2) / ΣV (4.4)
The fused image is represented as:
F(x, y) = P1 I1(x, y) + P2 I2(x, y) (4.5)
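The steps above can be sketched in NumPy (the function name and the equal-sized grayscale inputs are illustrative assumptions):

```python
import numpy as np

def fuse_pca(img1, img2):
    """Fuse two equal-sized grayscale images with PCA-derived weights:
    stack the images as a 2 x n matrix, centre it, take the eigenvector
    of the 2 x 2 covariance matrix with the largest eigenvalue, and
    normalise its components into weights P1 and P2."""
    z = np.stack([img1.ravel(), img2.ravel()]).astype(np.float64)  # 2 x n
    x = z - z.mean(axis=1, keepdims=True)   # subtract the empirical mean
    c = np.cov(x)                           # 2 x 2 covariance matrix
    d, v = np.linalg.eigh(c)                # eigenvalues in ascending order
    vec = v[:, np.argmax(d)]                # eigenvector of largest eigenvalue
    p = vec / vec.sum()                     # normalised components P1, P2
    fused = p[0] * img1 + p[1] * img2
    return fused, p
```

Note that dividing by `vec.sum()` also cancels the arbitrary sign of the eigenvector, so the weights always sum to 1.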
3. Fusion Based on Wavelet Transform Techniques
The two-dimensional wavelet transform has become one of the
standard tools for image fusion. The latter technique is employed in this
research. Wavelet image fusion works by merging the wavelet decompositions
of the two original images, applying fusion methods to the approximation
and detail coefficients. The wavelet transform divides the original image into
low-frequency and high-frequency components. The low-frequency
coefficients reflect the approximate features of the image: they contain its
main outline information and form an approximate version of the original
image at a certain scale. Most of the information and energy of the image is
contained here. The high-frequency coefficients reflect the detail of
luminance changes, which corresponds to the edge information of the image.
It is important to keep both the edge information and the outline information
of the input images in the fused image; the fusion should preserve the
high-frequency detail while giving prominence to the outline information in
the target image. The two images must be of the same size and color map.
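As an illustrative sketch (a hand-rolled stand-in, not the wavelet toolbox used in this research), a one-level 2-D Haar decomposition and its inverse can be written in NumPy; cA is the low-frequency approximation and cH, cV, cD are the detail subbands:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D orthonormal Haar decomposition.
    Returns (cA, (cH, cV, cD)); each subband is half the size per axis."""
    a = img[0::2, 0::2].astype(np.float64)  # top-left pixel of each 2x2 block
    b = img[0::2, 1::2].astype(np.float64)  # top-right
    c = img[1::2, 0::2].astype(np.float64)  # bottom-left
    d = img[1::2, 1::2].astype(np.float64)  # bottom-right
    cA = (a + b + c + d) / 2.0  # low-frequency approximation (outline, energy)
    cH = (a + b - c - d) / 2.0  # horizontal detail
    cV = (a - b + c - d) / 2.0  # vertical detail
    cD = (a - b - c + d) / 2.0  # diagonal detail
    return cA, (cH, cV, cD)

def haar_idwt2(cA, details):
    """Inverse transform: perfect reconstruction from the four subbands."""
    cH, cV, cD = details
    out = np.empty((2 * cA.shape[0], 2 * cA.shape[1]))
    out[0::2, 0::2] = (cA + cH + cV + cD) / 2.0
    out[0::2, 1::2] = (cA + cH - cV - cD) / 2.0
    out[1::2, 0::2] = (cA - cH + cV - cD) / 2.0
    out[1::2, 1::2] = (cA - cH - cV + cD) / 2.0
    return out
```

For a flat (constant) region all three detail subbands are zero, which matches the description above: edges live in the details, the outline and energy in the approximation.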
Figure 4.3 Transverse scans of placenta (10 weeks Gestational Age) image
reconstructed using Wavelet Image Fusion
4.4 WAVELET IMAGE FUSION BY MAX APPROXIMATION
AND MEAN DETAIL
The images decomposed using wavelet techniques are then fused
with the original image using min, max and mean fusion techniques. After the
fused image is generated, it is processed further and some features of interest
are extracted.
In the wavelet image fusion scheme, the source images I1(x, y) and
I2(x, y) are decomposed into approximation and detail coefficients at the
required level using the Haar wavelet. The approximation and detail
coefficients of both images are combined using the fusion rule represented by
equation 4.1. The fused image F(x, y) is obtained by taking the inverse
wavelet transform. The fusion rule used in this research takes the maximum
of the approximation coefficients and the mean of the detail coefficients in
each sub-band. The Figure 4.4
represents the fused ultrasound placenta where the input image is subjected to
wavelet decomposition followed by image fusion. The two images from
wavelet decompositions at same levels are used by taking the max method for
approximations and mean for details. The wavelet based method is available
as the image fusion tool in the wavelet toolbox, which is used for fusing
various images of the same size. Thus, using different combinations of mean,
max and min for the approximation and detail coefficients, fused images are
obtained. The inverse 2-D wavelet transform is used to reconstruct the image
from the sub-images A(x, y), H(x, y), V(x, y) and D(x, y). The Figure 4.5
shows the images fused using the fusion rules (a) Max Max (b) Max Min
(c) Max Mean (d) Min Max (e) Min Min (f) Min Mean (g) Mean Max
(h) Mean Min (i) Mean Mean approximation and detail of a fetus with a
Gestational Age of 15 weeks.
Figure 4.4 Image Fusion of Wavelet Decomposed Ultrasound Placenta
using Max Approximation and Mean Detail
The fused image is then used for further analysis. For the next
level of decomposition, the decomposed ultrasound images of the placenta
are used as input. Each fused image contains corresponding information that
differs from the original images.
Algorithm 1 Wavelet Based Image Fusion Algorithm.
The left image XL and right image XR of the ultrasound placenta
are taken. The images are subjected to Haar wavelet decomposition. The
decomposed left and right images are then fused using max approximation
and mean detail. This generates a synthesized image that retains the essential
characteristics of the source images.
The Algorithm 1 uses the max approximation and the mean detail.
The algorithm can be implemented as follows:
1. Consider the left image XL and the right image XR
(multi- view images of ultrasound placenta).
2. Apply the discrete wavelet transform on XL using the Haar
wavelet. This gives the approximation (Appr) coefficient and
the detail coefficients: the horizontal (Hor), vertical (Ver) and
diagonal (Dia) details.
3. Consider the pair of images (Appr, Hor) of the image XL for
feature extraction.
4. Let C = max(min(Appr(XL), Hor), min(Hor(XL), Appr))
If C = min(Appr(XL), Hor), then
Appr(XL) belongs to histogram H1
If C = min(Hor(XL), Appr), then
Appr(XL) belongs to histogram H2
5. Repeat from step 2 for the right image XR
6. Apply the same step for the pair of images (Appr (XL), Ver),
(Appr (XL), Dia), Appr (XR), Ver), (Appr (XR), Dia) and for
(Ver, Hor, Dia) of XL and (Ver, Hor, Dia) of XR.
7. The synthesized image is generated.
8. The image is subjected to inverse wavelet transform.
9. The fused ultrasound placenta image is generated.
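The core fusion step of Algorithm 1 (max of the approximation coefficients, mean of each detail subband, then the inverse transform) might be sketched as follows; the one-level Haar helpers and function names are illustrative assumptions, not the toolbox implementation used in this research:

```python
import numpy as np

def haar2(img):
    """One-level 2-D orthonormal Haar decomposition (illustrative helper)."""
    a, b = img[0::2, 0::2], img[0::2, 1::2]
    c, d = img[1::2, 0::2], img[1::2, 1::2]
    return ((a + b + c + d) / 2, (a + b - c - d) / 2,
            (a - b + c - d) / 2, (a - b - c + d) / 2)

def ihaar2(cA, cH, cV, cD):
    """Inverse one-level Haar: perfect reconstruction from the subbands."""
    out = np.empty((2 * cA.shape[0], 2 * cA.shape[1]))
    out[0::2, 0::2] = (cA + cH + cV + cD) / 2
    out[0::2, 1::2] = (cA + cH - cV - cD) / 2
    out[1::2, 0::2] = (cA - cH + cV - cD) / 2
    out[1::2, 1::2] = (cA - cH - cV + cD) / 2
    return out

def fuse_max_appr_mean_detail(xl, xr):
    """Fuse left/right views: max approximation, mean details,
    then the inverse wavelet transform."""
    cAl, cHl, cVl, cDl = haar2(xl.astype(np.float64))
    cAr, cHr, cVr, cDr = haar2(xr.astype(np.float64))
    return ihaar2(np.maximum(cAl, cAr),   # max of approximation coefficients
                  (cHl + cHr) / 2,        # mean horizontal detail
                  (cVl + cVr) / 2,        # mean vertical detail
                  (cDl + cDr) / 2)        # mean diagonal detail
```

With two identical inputs this rule reduces to perfect reconstruction, which is a useful sanity check on any implementation.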
Figure 4.5 Image fused using the fusion rule (a) Max Max (b) Max Min
(c) Max Mean (d) Min Max (e) Min Min (f)Min Mean
(g) Mean Max (h) Mean Min (i) Mean Mean approximation
and detail of 15 weeks of Gestational Age
4.5 IMAGE QUALITY ASSESSMENT TECHNIQUES
In the case of medical images, it is important to reproduce the
image close to the original image so that the smallest of the details are
readable. Wavelet transform is an effective method of image fusion.
Decomposition levels affect the image fusion. The two issues in wavelet
decomposition for image fusion are:
Selection of the best wavelet basis function.
Selection of best decomposition levels.
A good quality measure should reflect well the distortion of the
image due to, for example, noise, blurring or geometric distortion. The fused
image is used in medical diagnosis only when it passes the quality-measure
tests that establish its suitability for diagnostic accuracy. The image quality
of the ultrasound image and the fused image is analyzed using fusion
evaluation methods such as Mean, Standard Deviation (SD), Entropy (H),
Root Mean Square Error (RMSE), Peak Signal to Noise Ratio (PSNR),
Fusion Mutual Information (FMI), Mean Square Error (MSE), Normalized
Absolute Error (NAE), Average Difference (AD), Max Difference (MD),
Normalized Cross Correlation (NCC), Structural Content (SC) and Signal to
Noise Ratio (SNR).
4.6 DIAGNOSTIC ACCURACY EVALUATION OF FUSED
ULTRASOUND PLACENTA
Objective indices can overcome the influence of human vision,
mentality and knowledge, and allow machines to automatically select a
superior algorithm.
Objective indices can be divided into three categories.
1. Image features such as entropy, spatial frequency and
gradient.
2. Relation of the fused image to the source image which is
nothing but the mutual information of an image
3. Relation of the fused image to the reference image such as
entropy, correlation coefficient, Root Mean Square Error.
4.6.1 Mean
It measures the mean value of the pixels in the image.
Mean = (1/N) Σ (i = 1 to N) xi (4.6)
where,
N = total number of pixels in the image
xi = value of the ith pixel
4.6.2 Standard Deviation (SD)
The standard deviation of a gray image reflects its clarity and
contrast. An increase in standard deviation indicates that the image is
becoming "noisier"; the smaller the image contrast, the more the image is
affected by noise.
The Standard Deviation (SD) is
σ = sqrt[ (1/(M N)) Σ (i = 1 to M) Σ (j = 1 to N) (f(i, j) − μ)² ] (4.7)
where
f = input image
σ = standard deviation of the image
M = rows of the image
N = columns of the image
f(i, j) = grey value of the pixel at location (i, j)
μ = mean of f
It is represented by,
μ = (1/(M N)) Σ (i = 1 to M) Σ (j = 1 to N) f(i, j) (4.8)
4.6.3 Entropy (H)
Entropy indicates the average amount of information contained in
the image. The greater the entropy, the greater the amount of information
carried by the fused image and the richer the information. It does not
measure the similarity of the fused image to the source images. The larger
the value of entropy, the better the fusion result.
H = − Σ (i) pi log2(pi) (4.9)
Where,
pi = probability of the occurrence of grey level i
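For an 8-bit grayscale image stored as a NumPy array, equation 4.9 might be computed as follows (the 256-bin histogram estimate and function name are assumptions of this sketch):

```python
import numpy as np

def entropy(img):
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image,
    estimated from its normalised 256-bin histogram (equation 4.9)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]                   # skip empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))
```

A constant image has zero entropy; an image with two equally frequent grey levels carries exactly one bit per pixel.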
4.6.4 Root Mean Square Error (RMSE)
This is used to evaluate the fused image. It represents the amount of
deviation in the fused image compared to the reference image. Let F(x, y) be
the fused image and R(x, y) the reference image; then

RMSE = sqrt[ (1/(M N)) Σ (i = 1 to M) Σ (j = 1 to N) (R(i, j) − F(i, j))² ] (4.10)

where i and j denote the spatial position of pixels while M and N are the
dimensions of the image.
4.6.5 Peak Signal to Noise Ratio (PSNR)
The peak signal to noise ratio (PSNR) is the ratio between the
maximum possible value of an image and the magnitude of the background
noise, and is commonly used as a measure of the quality of reconstruction in
image fusion. It indicates the similarity between two images: the higher the
PSNR, the better the fused image.

PSNR = 10 log10(255² / MSE) (4.11)
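Assuming 8-bit images (peak value 255), equations 4.10, 4.11 and 4.15 share the same pixel-wise error and can be sketched together (function names are illustrative):

```python
import numpy as np

def mse(ref, fused):
    """Mean square error between reference and fused image (equation 4.15)."""
    diff = ref.astype(np.float64) - fused.astype(np.float64)
    return np.mean(diff ** 2)

def rmse(ref, fused):
    """Root mean square error (equation 4.10)."""
    return np.sqrt(mse(ref, fused))

def psnr(ref, fused, peak=255.0):
    """Peak signal to noise ratio in dB (equation 4.11), 8-bit images assumed."""
    m = mse(ref, fused)
    return float('inf') if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```

Identical images give an infinite PSNR; the worst possible 8-bit error (0 versus 255 everywhere) gives 0 dB.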
4.6.6 Fusion Mutual Information (FMI)
It measures the degree of dependence of two images; a larger
measure implies better quality. If the joint histogram of I1(x, y) and F(x, y)
is hI1F(i, j) and that of I2(x, y) and F(x, y) is hI2F(i, j), then the fusion
mutual information is given as

FMI = MI(I1, F) + MI(I2, F) (4.12)

where

MI(I1, F) = Σi Σj hI1F(i, j) log2[ hI1F(i, j) / (hI1(i) hF(j)) ] (4.13)

MI(I2, F) = Σi Σj hI2F(i, j) log2[ hI2F(i, j) / (hI2(i) hF(j)) ] (4.14)
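Equations 4.12 to 4.14 can be sketched with a joint histogram (the 256-bin count, base-2 logarithm and function names are assumptions of this sketch):

```python
import numpy as np

def mutual_information(img1, img2, bins=256):
    """Mutual information (bits) between two 8-bit images, estimated
    from their normalised joint histogram (equations 4.13/4.14)."""
    joint, _, _ = np.histogram2d(img1.ravel(), img2.ravel(),
                                 bins=bins, range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()               # joint probability
    px = pxy.sum(axis=1, keepdims=True)     # marginal of img1
    py = pxy.sum(axis=0, keepdims=True)     # marginal of img2
    mask = pxy > 0                          # skip empty cells
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask]))

def fmi(img1, img2, fused):
    """Fusion mutual information (equation 4.12): MI of each source
    image with the fused image."""
    return mutual_information(img1, fused) + mutual_information(img2, fused)
```

As a sanity check, the mutual information of an image with itself equals its entropy, so fusing an image with itself doubles that value.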
4.6.7 Mean Square Error (MSE)
It is one of the frequently used measures to capture the deviations
between the original and the fused image. It is computed by finding the
squared error divided by the total number of pixels in the image. The spatial
distortions introduced by the fusion process are measured by MSE. The
smaller the value of the error metric, the better the fused image represents the
original input image. But it cannot capture the artifacts like blur or blocking
artifacts.
MSE = (1/(M N)) Σ (i = 1 to M) Σ (j = 1 to N) (I(i, j) − F(i, j))² (4.15)
Where,
MSE = Mean Square Error
I(i, j) = the original image
F(i, j) = the fused image
M N = size of the image
4.6.8 Normalized Absolute Error (NAE)
It is a measure of how far the fused image is from the original
image. A large value of NAE indicates poor image quality.
The NAE is computed by:

NAE = [ Σ (i = 1 to M) Σ (j = 1 to N) |I(i, j) − F(i, j)| ] / [ Σ (i = 1 to M) Σ (j = 1 to N) |I(i, j)| ] (4.16)

where,
NAE = Normalized Absolute Error
M N = size of the image
I(i, j) = the original image
F(i, j) = the fused image
4.6.9 Average Difference (AD)
A lower value of Average Difference gives a "cleaner" image, as
more noise is reduced. It is computed using

AD = (1/(M N)) Σ (i = 1 to M) Σ (j = 1 to N) (I(i, j) − F(i, j)) (4.17)

where,
I(i, j) = pixel values of the original image
F(i, j) = pixel values of the fused image
4.6.10 Max Difference (MD)
A large value of MD means that the image is of poor quality. It is
obtained by measuring the maximum distortion between the original and the
fused image.

MD = max |I(i, j) − F(i, j)| (4.18)

where,
I(i, j) = pixel values of the original image
F(i, j) = pixel values of the fused image
4.6.11 Normalized Cross Correlation (NCC)
Normalized Cross-Correlation is one of the methods used for
template matching, a process for finding instances of a pattern or object
within an image. It is computed by

NCC = [ Σ (i = 1 to M) Σ (j = 1 to N) I(i, j) F(i, j) ] / [ Σ (i = 1 to M) Σ (j = 1 to N) I(i, j)² ] (4.19)

where,
NCC = Normalized Cross Correlation
I(i, j) = the original image
F(i, j) = the fused image
M N = size of the image
4.6.12 Structural Content (SC)
This estimates the similarity of the structure of two images. It
effectively compares the total weight of the original image with that of the
fused image to measure the distortion contained in it. The fused image is of
better quality when SC is close to 1; a large value of SC means that the
image is of poor quality.

SC = [ Σ (i = 1 to M) Σ (j = 1 to N) I(i, j)² ] / [ Σ (i = 1 to M) Σ (j = 1 to N) F(i, j)² ] (4.20)

where,
SC = Structural Content
I(i, j) = the original image
F(i, j) = the fused image
M N = size of the image
4.6.13 Signal to Noise Ratio (SNR)
The SNR is a quantitative description of the quality of the
information carried by radiographic images. It is measured between the fused
and the original image and expresses the ratio between the information and
the noise of the fused image; the higher the ratio, the less obtrusive the
background noise. It is calculated using the equation

SNR = 10 log10( [ Σ (i = 1 to M) Σ (j = 1 to N) I(i, j)² ] / [ Σ (i = 1 to M) Σ (j = 1 to N) (I(i, j) − F(i, j))² ] ) (4.21)

where,
I(i, j) = the original image
F(i, j) = the fused image
M N = size of the image
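Equations 4.16 to 4.21 share the same ingredients, so they can be sketched in one helper (the dictionary-returning form and function name are illustrative choices):

```python
import numpy as np

def quality_measures(ref, fused):
    """NAE, AD, MD, NCC, SC and SNR between a reference and a fused
    image (equations 4.16 to 4.21), for grayscale NumPy arrays."""
    r = ref.astype(np.float64)
    f = fused.astype(np.float64)
    diff = r - f
    return {
        'NAE': np.sum(np.abs(diff)) / np.sum(np.abs(r)),           # eq 4.16
        'AD':  np.mean(diff),                                      # eq 4.17
        'MD':  np.max(np.abs(diff)),                               # eq 4.18
        'NCC': np.sum(r * f) / np.sum(r ** 2),                     # eq 4.19
        'SC':  np.sum(r ** 2) / np.sum(f ** 2),                    # eq 4.20
        'SNR': 10 * np.log10(np.sum(r ** 2) / np.sum(diff ** 2)),  # eq 4.21
    }
```

For identical images NAE, AD and MD are zero while NCC and SC are exactly 1, matching the "ideal" values described above.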
4.7 RESULTS
The results obtained using the fusion rules given by equations 4.1,
4.2 and 4.5 are recorded in Tables 4.1 and 4.2. The fusion rule based on
max approximation and mean detail gives the lowest values for MSE and the
highest values for PSNR. The noise and other artifacts in the image are
removed.
Table 4.1 Performance Assessment of Various Fusion Techniques
using Peak signal to noise ratio, mean square error, root
mean square error, normalized absolute error, normalized
cross correlation, structural content and fusion mutual
information of 15 weeks Gestational Age
Fusion Techniques  PSNR     MSE     RMSE    NAE     NCC     SC      FMI
Image Averaging    39.1725  7.8673  2.8049  0.0923  0.9764  1.0271  1.8706
Wavelet            40.9709  5.1999  2.2803  0.0637  1.0066  0.9782  4.625
PCA                40.0335  6.4525  2.5402  0.0768  1.0019  0.9826  4.2678
Table 4.2 Performance Assessment of Various Fusion Techniques
using fusion entropy, mean, standard deviation, average
difference, signal to noise ratio and spatial frequency of 15
weeks Gestational Age.
Fusion Techniques  STDF     Entropy H  Mean F   AD       SNR      SF
Image Averaging    29.6438  6.2384     88.4785  -0.1009  39.1725  -0.006
Wavelet            33.2047  6.6184     89.6649  -0.4963  40.9709  3.4182
PCA                33.0031  6.5281     89.5115  -0.4452  40.0335  -0.0044
Table 4.3 Evaluation of fusion rules based on Image Quality Measures
PSNR, RMSE, NAE, NCC and SC
Fusion Rule (Approximation, Detail)  PSNR     RMSE    NAE     NCC     SC
Max Max                              38.2766  3.1096  0.121   1.008   0.9548
Max Min                              39.5058  2.6993  0.0893  1.0065  0.9702
Max Mean                             40.9709  2.2803  0.0637  1.0066  0.9782
Min Max                              39.3109  2.7605  0.0951  0.9941  0.9923
Min Min                              39.7844  2.6141  0.0819  0.991   1.0023
Min Mean                             40.5563  2.3918  0.071   0.9914  1.0062
Mean Max                             38.9197  2.8877  0.1013  1.0083  0.9616
Mean Min                             39.5153  2.6964  0.089   0.9991  0.9842
Mean Mean                            40.253   2.4768  0.0747  1.0052  0.9774
Table 4.4 Evaluation of fusion rules based on Image Quality Measures
FMI, ENT, MEAN and STD
Fusion Rule (Approximation, Detail)  FMI      ENT     MEAN    STD     AD
Max Max                              38.2766  3.1096  0.121   1.008   -0.5526
Max Min                              39.5058  2.6993  0.0893  1.0065  -0.4963
Max Mean                             40.9709  2.2803  0.0637  1.0066  -0.8875
Min Max                              39.3109  2.7605  0.0951  0.9941  0.0208
Min Min                              39.7844  2.6141  0.0819  0.991   -0.3151
Min Mean                             40.5563  2.3918  0.071   0.9914  0.0022
Mean Max                             38.9197  2.8877  0.1013  1.0083  -0.4466
Mean Min                             39.5153  2.6964  0.089   0.9991  -0.6262
Mean Mean                            40.253   2.4768  0.0747  1.0052  -0.547
[Figure 4.6: bar charts, panels (a) to (k), of the quality measures PSNR,
RMSE, NAE, NCC, SC, FMI, ENT, MEANF, STDF and AD for each combination of
approximation rule (Max, Min, Mean) and detail rule (Max, Min, Mean)]
Figure 4.6 Quality Measures for various fusion rules using entropy,
mean, standard deviation, average difference, signal to noise
ratio, spatial frequency, PSNR, RMSE, NAE, NCC and SC
The performance analysis of the various fusion techniques is recorded in
Table 4.1 and Table 4.2. The finding of this chapter is that the performance
of wavelet image fusion is better than that of image averaging and PCA.
This is clearly shown in Appendix A: Figure A.1 (a), (e), (g), (j) and (l) of
Appendix A show an increase in the values for the wavelet image fusion
technique when compared to Image Averaging or PCA. The PSNR, NCC, FMI,
ENT, SF, RMSE, MSE, NAE, SC and AD prove to be good measures for
evaluating the fusion methods suitable for ultrasound placenta images. Of
these measures, SF, FMI, ENT, PSNR and RMSE seem to be the better
indicators of the performance of the fusion methods.
This research used image quality measures like entropy, mean,
standard deviation, fusion mutual information, normalized cross correlation,
Root mean square error, structural content, Normalized Absolute Error and
Absolute Difference to analyze the fused image. Though the fusion
methods produced a variety of images, only a few fusion rules produced
images suitable for diagnostic accuracy. A useful image is identified on the
execution
of quality measures on these images. The quality measures obtained for the
images fused with different fusion rules is recorded in Table 4.3 and Table 4.4
using the equation 4.1. The values for PSNR, RMSE, NAE, NCC, SC, FMI,
ENT, MEAN, STD and AD recorded in Tables 4.3 and 4.4 are obtained using
the equations 4.11, 4.10, 4.16, 4.19, 4.20, 4.12, 4.9, 4.6, 4.7 and 4.17
respectively. The PSNR value obtained for the Max Mean fusion rule is better
than for the other fusion rules, followed by Min Mean, as given in Figure
4.6(a). The recording in Figure 4.6(b) shows Max Mean with the lowest
RMSE value, indicating the closeness of the fused image to the original
image; the NAE results in Figure 4.6(c) are similar. The quality measure
NCC shows good performance for Mean Max followed by Max Max in
Figure 4.6(e). The structural content ranks Max Max as a good fusion rule,
followed by Max Min, Min Mean and then Max Mean, in Figure 4.6(f). The
values depicted in Figure 4.6(g) and Figure 4.6(h) indicate Max Mean as the
best quality image, with high FMI and entropy; these indicate the richness of
information. The Mean and STD of Figure 4.6(i) and (j) play only a minor
role in the selection of a fusion rule for the fused ultrasound placenta image.
Again, AD shows the Max Mean fused image to be cleaner than the other
rules; it is marked with a red rhombus in Figure 4.6(k). This clearly shows
that subjecting the wavelet-decomposed images to image fusion increases the
quality of information in the image. Thus the essential features that
characterize the placenta can be extracted, while boundary information and
structural details are preserved without introducing any inconsistencies into
the image. This work suggests that the Max Approximation and Mean Detail
fusion rule produces good quality images of ultrasound placenta complicated
by GDM, followed by the Max Approximation and Max Detail fusion rule.
4.8 CONCLUSION
The information entropy is comparatively better when the images
are fused using max approximation and mean detail, and the mutual
information obtained is also good; these indicate the richness of information.
This clearly shows that subjecting wavelet-decomposed images to image
fusion increases the quality of information in the image. Thus the essential
features that characterize the placenta can be extracted, while boundary
information and structural details are preserved without introducing any
inconsistencies into the image.