THEORETICAL ADVANCES
Enhanced Gabor wavelet correlogram feature for image indexing and retrieval
H. Abrishami Moghaddam • M. Nikzad Dehaji
Received: 8 July 2010 / Accepted: 15 June 2011 / Published online: 14 July 2011
© Springer-Verlag London Limited 2011
Abstract In this paper, a new feature scheme called
enhanced Gabor wavelet correlogram (EGWC) is proposed
for image indexing and retrieval. EGWC uses Gabor
wavelets to decompose the image into different scales and
orientations. The Gabor wavelet coefficients are then
quantized using optimized quantization thresholds. In the
next step, the autocorrelogram of the quantized wavelet
coefficients is computed in each wavelet scale and orien-
tation. Finally, the EGWC index vector simply consists of
the autocorrelogram coefficients. Due to non-orthogonality
of Gabor decomposition, the resulting wavelet coefficients
suffer from redundancy, which increases the computational
cost and reduces the effectiveness of EGWC. Here, we
present a solution to handle the redundancy problem using
non-maximum suppression and adjustment of autocorre-
logram distance parameters as a function of the wavelet
scale. The retrieval results obtained by applying EGWC to
index two image databases with 5,000 natural images and
1,792 texture images demonstrated its better performance
in terms of retrieval rates with respect to the state-of-the-art
content-based and multidirectional texture indexing
algorithms.
Keywords Content-based image indexing and retrieval · Wavelet correlogram · Enhanced Gabor wavelet correlogram
1 Introduction
Digital image libraries and other multimedia databases have expanded dramatically in recent years. Storage and retrieval of images in such libraries have become a real demand in industrial, medical and other applications [1, 2].
Content-based image indexing and retrieval (CBIR) is
considered as a solution. In such systems, in the indexing
algorithm, some features are extracted from every picture
and stored as an index vector [3]. Then, in the retrieval
algorithm, the index vector corresponding to the query
image is compared (using a similarity or dissimilarity cri-
terion) with all stored indices to find similar pictures [4].
Various indexing algorithms based on different image
features such as color [5, 6], texture [7] and shape [8] have
been developed. Color is frequently used as a signature for
indexing and retrieval of images and videos in multimedia
databases [9]. The color histogram [5] and its variations [10–12] were the first algorithms introduced in the pixel domain. Despite its efficiency and insensitivity to small
changes of view point, the color histogram is unable to
carry local spatial information of pixels. Therefore, in such
systems, retrieved images may have many inaccuracies,
especially in large image databases (imagebases). For these
reasons, three variations, called image partitioning [13, 14], histogram refinement [15] and color correlogram [6], were proposed to improve the effectiveness of such systems. In
the histogram refinement approaches such as color coher-
ence vectors algorithm (CCV) introduced by Pass et al.
[15], each histogram bar is divided into two or more parts
H. Abrishami Moghaddam (✉)
Biomedical Engineering Group,
Faculty of Electrical Engineering,
K.N. Toosi University of Technology,
16315-1355 Tehran, Iran
e-mail: moghadam@eetd.kntu.ac.ir
M. Nikzad Dehaji
Department of Computer Engineering,
Islamic Azad University, Science and Research Branch,
1415-775 Tehran, Iran
e-mail: m.nikzad@srbiau.ac.ir
Pattern Anal Applic (2013) 16:163–177
DOI 10.1007/s10044-011-0230-1
according to its spatial color distributions. On the other
hand, in the color correlogram technique introduced by
Huang et al. [6], the spatial color correlation of the image pixels is computed. Recently, Teng and Lu [16] used
vector quantization for indexing and retrieval of com-
pressed color images.
Shape description techniques are widely used in shape-
based indexing algorithms [17–24]. The visual part of the
MPEG-7 standard adopted the curvature scale-space rep-
resentation (CSS) and the angular radial transform (ART)
as the contour-based and region-based shape descriptors,
respectively [25]. Apart from CSS [18] and ART [22],
many other valuable and interesting algorithms have been
developed for shape retrieval, which can be classified as
global [17, 20, 26] and local [19, 21, 23, 24] methods
according to whether the shape is represented as a whole or
by local shape features. In Ref. [17], two sets of shape
features were used to describe the global shape informa-
tion: a 72-bin histogram of the shape edge direction and
seven invariant moments. To tackle occluded or partially
visible object retrieval, researchers turned to local shape-based methods. In Ref. [21], shape contexts were used for
finding correspondence between shapes. Huet and Hancock
[23] extracted line segments from an image as primitives.
An N-nearest neighbor graph was used as the shape
representation.
In the above algorithms (except Ref. [22]), the feature
vectors are constructed using spatial domain information.
Another possibility is the use of transformed domain data
to extract some higher-level features [27]. Wavelet-based
methods, which provide space–frequency decomposition of
the image, have been used [28–30]. Daubechies’ wavelets
are the most frequently used in CBIR for their fast com-
putation and regularity. In Ref. [27], Daubechies’ wavelets
in three scales have been used to obtain the transformed
data. Then, histograms of the wavelet coefficients in each
sub-band have been computed and stored to construct the
feature vector. In SIMPLIcity [28, 31], the image is first
classified into different semantic classes using a kind of
texture classification algorithm. Then, Daubechies’ wave-
lets are used to extract feature vectors. Kokare et al. [32] proposed a texture image indexing and retrieval method using two-dimensional rotated wavelet filters, which could
improve characterization of diagonally oriented textures. In
2005, a wavelet-based CBIR system called wavelet corre-
logram was introduced by Moghaddam et al. [29]. This
system will be briefly reviewed in Sect. 2.1. In a later work
[33], the authors presented an enhanced version of the
wavelet correlogram method (EWC) using optimal quan-
tization thresholds obtained by evolutionary group algo-
rithm (EGA).
Although common wavelet-based methods like SIM-
PLIcity [28, 31] or wavelet correlogram [29] allow for a
multiresolution decomposition, they have limited direc-
tional selectivity and are not able to capture arbitrary
directional information. To overcome this shortcoming,
other multiresolution multidirectional image decomposi-
tion techniques such as 2D Gabor transform [34, 35],
discrete contourlet transform [36], steerable pyramid [37],
ridgelet transform [38], curvelet transform [39] and com-
plex directional filter bank (CDFB [40]) have been proposed for forming feature vectors [41–44]. The use
of curvelet transform was also proposed to detect and
characterize non-Gaussian signatures in the image data
[44]. However, due to the non-orthogonality of the decomposition, the multidirectional wavelet coefficients are redundant, which degrades the indexing and retrieval performance. Among
these techniques, the 2D Gabor wavelet has been shown to
provide indexing features with comparable or better aver-
age retrieval performance with respect to other multidi-
rectional wavelet decompositions including contourlet,
steerable pyramid and CDFB [41]. In Ref. [41], the features were generated using only the mean and variance of the wavelet coefficients, and no post-processing was performed before using them in retrieval. Furthermore, a simple distance measure was used there for feature matching. Most of
the wavelet-based CBIR methods use first, second [41, 42]
or higher order [44] statistical parameters of the wavelet
coefficients in different scales and orientations to construct
their index vector. In this paper, we propose the use of the spatial correlation of quantized Gabor wavelet coefficients, via the wavelet correlogram method [29], to construct a new effective index vector. Moreover, we introduce an efficient
method for handling the redundancy problem in computing the
autocorrelogram of 2D Gabor wavelet coefficients. In
addition, current multidirectional wavelet-based indexing
methods do not present any solution for optimizing
indexing parameters. Here, we propose the use of a new
optimization method called evolutionary group algorithm
(EGA) to optimize quantization thresholds of Gabor
wavelet correlogram indexing features.
This paper is organized as follows. The wavelet corre-
logram method is briefly reviewed in Sect. 2. Section 3
presents the theoretical basis of the new image indexing
algorithm and explains some technical points including
redundancy problem handling, quantization threshold opti-
mization, feature construction and retrieval performance
improvement. Experimental results using two frequently
used texture and natural image datasets are given in Sect. 4.
Finally, Sect. 5 is devoted to the concluding remarks.
2 Wavelet correlogram
Wavelet correlogram applies the color correlogram method
[6] to the wavelet coefficients of an image. Therefore, it
inherits the multiscale multiresolution properties of the wavelet transform and the translation invariance of the color correlogram. Color pictures in the database are first transformed to a unified gray-level format. For this purpose, the
RGB values are first converted to NTSC coordinates and,
after setting the hue and saturation components to zero, the
values are converted back to RGB color space.
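The gray-level conversion above can be sketched as follows. Since setting the hue and saturation (chrominance) components of the NTSC/YIQ representation to zero leaves only the luma channel, the RGB round trip reduces to computing Y; the function name is ours:

```python
import numpy as np

def to_unified_gray(rgb):
    """Unified gray-level conversion: RGB -> NTSC (YIQ), zero the
    chrominance (I, Q), convert back.  Zeroing I and Q leaves only
    the luma Y, so we return the Y channel directly."""
    rgb = np.asarray(rgb, dtype=np.float64)
    # Standard NTSC luma weights (first row of the RGB -> YIQ matrix).
    return 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

gray = to_unified_gray(np.ones((4, 4, 3)))  # a white patch
```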
2.1 Wavelet correlogram indexing algorithm
Wavelet correlogram indexing algorithm consists of three
steps [29, 45]. First, the discrete wavelet transform of the
input image is computed using Daubechies’ wavelets for
their regularity, separability and compact support proper-
ties. In practice, a limited number of scales are sufficient
for wavelet decomposition. According to our experiments,
applying wavelet transform in three scales gives a good
compromise between efficiency and effectiveness of the
algorithm [29]. Then, the wavelet coefficients are quan-
tized into a small number of levels. Augmenting the
number of bins may improve the indexing effectiveness;
however, the computational cost and memory requirements
will be increased as well. Finally, horizontal and vertical autocorrelograms of the quantized coefficients are computed for the LH and HL submatrices in each scale, respectively. The
wavelet coefficients corresponding to HH filters have no
significant spatial correlation. Therefore, there will be no
need to obtain the autocorrelogram of these coefficients
[45].
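The three indexing steps can be sketched as follows, with a one-level Haar transform standing in for the Daubechies wavelets of the paper; the sub-band naming convention, thresholds and function names are ours:

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2D Haar transform (a simple stand-in for the
    Daubechies wavelets used in the paper).  Returns the LL, LH, HL
    and HH sub-bands computed from 2x2 blocks."""
    img = np.asarray(img, dtype=np.float64)
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    a = img[:h:2, :w:2]; b = img[:h:2, 1:w:2]
    c = img[1:h:2, :w:2]; d = img[1:h:2, 1:w:2]
    ll = (a + b + c + d) / 2.0
    lh = (a + b - c - d) / 2.0
    hl = (a - b + c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def quantize(coeffs, t1, t2):
    """Quantize into 4 levels using symmetric thresholds -t2,-t1,t1,t2;
    values inside the noise margin (-t1, t1) get level 0 (discarded)."""
    c = np.asarray(coeffs, dtype=np.float64)
    q = np.zeros(c.shape, dtype=int)
    q[c <= -t2] = 1
    q[(c > -t2) & (c <= -t1)] = 2
    q[(c >= t1) & (c < t2)] = 3
    q[c >= t2] = 4
    return q

def wc_subbands(img, scales=3):
    """Decompose over `scales` levels, keeping only LH and HL: the HH
    coefficients carry no significant spatial correlation."""
    bands, ll = [], np.asarray(img, dtype=np.float64)
    for _ in range(scales):
        ll, lh, hl, _hh = haar_dwt2(ll)
        bands.append((lh, hl))
    return bands
```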
Quantization thresholds corresponding to each wavelet
scale are illustrated as values under the axes in Fig. 1. As shown, [−n_{m,1}, n_{m,1}] is considered as the noise margin in the mth scale, and the wavelet coefficients inside it are discarded. The image noise can originate from different
sources such as quantal or electronic noise. In general, it is
considered as zero-mean Gaussian white noise. In this case, the noise margin in each scale can be computed in the corresponding LH and HL matrices based on the BayesShrink
method [46]. This method uses the noise variance in an
image, which can be estimated by Donoho’s method [47,
48]. According to [47–49], the noise standard deviation of
an image (σ_n) can be estimated using the median absolute deviation (MAD) of the diagonal wavelet coefficients at the first level of decomposition as:

$$\sigma_n = \frac{\operatorname{Median}\left(\left|Y_{ij}\right|\right)}{0.6745}, \qquad Y_{ij} \in HH \qquad (1)$$
This value can be also considered as the standard
deviation of noise in LH and HL matrices in the first level.
In the same way, if we consider the LL image in the mth
scale as a new image, the same method can be used to
estimate the variance of the noise in LH and HL images in
the mth scale. To take into account the dynamic range of the wavelet coefficients, different quantization levels (−n_{m,2}, n_{m,2}) are used in each wavelet scale. These quantization
levels were obtained experimentally, by dividing the
dynamic range into a small number of bins, each bin
representing a percentage of the wavelet coefficient
population. In Fig. 1, the values above the axes show the
percentiles of wavelet coefficients’ population
corresponding to each quantized level.
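The estimator of Eq. (1) is a one-liner; a sketch (function name ours), whose output could then feed a BayesShrink-style noise margin:

```python
import numpy as np

def noise_sigma_mad(hh):
    """Donoho's robust noise estimate (Eq. 1): the median absolute
    value of the first-level diagonal (HH) wavelet coefficients,
    divided by 0.6745 (the MAD-to-sigma factor for a Gaussian)."""
    return float(np.median(np.abs(np.asarray(hh, dtype=np.float64))) / 0.6745)
```

Applying the same estimator to the LL image of the mth scale, as described above, yields the per-scale noise margins.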
Horizontal autocorrelogram of the discretized LH coefficient matrix with distances k ∈ {1, 2, 3, 4} is defined by Eq. (2), where the c_i are quantization levels and |{·}| denotes the cardinality of a set. Indeed, α_H(i,k) is the probability of finding two pixels with quantization level c_i in the same row of LH at distance k from each other. Vertical autocorrelogram α_V(i,k) is computed in the same manner using the HL coefficients.
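Equation (2) translates directly into code; the function below counts, for each quantization level, pixels that have a same-level neighbor k columns away in either direction on the same row (a sketch; the vectorization details are ours):

```python
import numpy as np

def autocorrelogram_h(Q, levels=(1, 2, 3, 4), distances=(1, 2, 3, 4)):
    """Horizontal autocorrelogram of a quantized LH sub-band Q, per
    Eq. (2): the fraction of level-c_i pixels having a same-level
    pixel at column offset +k or -k, normalized by 2 * |{Q == c_i}|.
    Level 0 (the noise margin) is ignored."""
    Q = np.asarray(Q)
    out = np.zeros((len(levels), len(distances)))
    for li, c in enumerate(levels):
        mask = (Q == c)
        n = mask.sum()
        if n == 0:
            continue
        for ki, k in enumerate(distances):
            hit = np.zeros_like(mask)               # boolean, all False
            hit[:, :-k] |= mask[:, :-k] & mask[:, k:]   # match at y + k
            hit[:, k:] |= mask[:, k:] & mask[:, :-k]    # match at y - k
            out[li, ki] = hit.sum() / (2.0 * n)
    return out
```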
The wavelet correlogram index vector is simply con-
structed by using autocorrelogram coefficients computed
for LH and HL wavelet matrices in each scale. Each matrix
gives 16 coefficients resulting in a total of U = 96 words
per index [45].
2.2 Wavelet correlogram retrieval algorithm
In the retrieval phase of the wavelet correlogram, after
computing the index vector of the input image, all the
index vectors corresponding to the images in the database
are sorted according to a matching criterion. Then, N
images corresponding to the best-matched index vectors
are retrieved and shown to the user. Several matching
criteria including (a) geometric measures such as L1 or L2
distance, (b) information theoretic distances such as Kull-
back–Leibler or Jeffery divergence and (c) statistic mea-
sures such as Mahalanobis, Chi-squared or Kolmogorov–
Smirnov distance can be used [50]. A comparative study
$$\alpha_H(i,k) = \frac{\left|\left\{(x,y) \,\middle|\, LH(x,y)=c_i,\; LH(x,y+k)=c_i \ \text{or}\ LH(x,y-k)=c_i\right\}\right|}{2\,\left|\left\{(x,y) \,\middle|\, LH(x,y)=c_i\right\}\right|}, \quad i=1,\ldots,4;\; k=1,\ldots,4 \qquad (2)$$
demonstrated that most of the dissimilarity measures from
the geometric category have better performance than the
other two categories [51]. Moreover, since our main objective is to evaluate indexing performance, in this paper we use a modified version of the L1 distance measure, which is commonly used for comparing index vectors [6].
3 Gabor wavelet correlogram
Wavelet correlogram was originally developed using
Daubechies wavelets for their regularity, separability and
compactness [45]. Other commonly used wavelets such as
9/7 biorthogonal could be used particularly for image
indexing in the compressed domain. However, they have demonstrated lower performance with respect to Haar and Daubechies wavelets in retrieving nine classes of JPEG and
JPEG2000 compressed images [52]. In spite of their
advantages, these wavelets suffer from poor directional selectivity, since they are computed only in the horizontal, vertical and diagonal directions. Moreover, as we explained
in the previous section, the wavelet correlogram uses only
horizontal and vertical wavelet coefficients. To alleviate this
problem, multiresolution multidirectional image decompo-
sition techniques such as steerable pyramid [37], 2D Gabor
transform [35], ridgelet [43], curvelet [39], discrete con-
tourlet transform [36] and CDFB [40] can be used. In a
recent study, Vo et al. [41] compared the texture retrieval
performance of feature vectors produced by the Gabor
wavelet, steerable pyramid, contourlet and CDFB. The
results of their study demonstrated that the Gabor wavelet
feature has a higher or comparable retrieval performance
with respect to the features obtained by other wavelets. On
the other hand, the features provided by ridgelet have shown
superior retrieval performance with respect to the features
provided by other multidirectional transforms. However, the
ridgelet features are relatively larger and proportional to the
image size. The use of curvelet transform was recently
proposed to detect and characterize non-Gaussian signatures
in the image data [44].
The use of Gabor filters in extracting textured image
features is further motivated by various factors [53]. The
Gabor representation has been shown to be optimal in the
sense of minimizing the joint two-dimensional uncertainty
in space and frequency [54]. These filters can be considered
as orientation and scale tunable edge and line (bar)
detectors, and the statistics of these micro features in a
given region are useful to characterize the underlying
texture information. Regarding the advantages of Gabor
wavelets in image representation, in this paper we propose
a new algorithm for constructing an enhanced Gabor
wavelet correlogram indexing feature. This new algorithm
handles the data redundancy caused by non-orthogonality
of the Gabor wavelet using non-maximum suppression and
an adaptive distance for computing the Gabor wavelet
correlogram.
3.1 Construction of the Gabor wavelets
A 2D Gabor function is a Gaussian modulated by a complex sinusoid [35]:

$$\psi(x,y) = \frac{1}{2\pi\sigma_x\sigma_y} \exp\!\left[-\frac{1}{2}\left(\frac{x^2}{\sigma_x^2} + \frac{y^2}{\sigma_y^2}\right) + 2\pi j W x\right] \qquad (3)$$
The Gabor wavelets are obtained by dilation and rotation of the generating function ψ(x,y) as follows:

$$\psi_{m,n}(x,y) = a^{-m}\,\psi\!\left(a^{-m}(x\cos\theta + y\sin\theta),\; a^{-m}(-x\sin\theta + y\cos\theta)\right) \qquad (4)$$
Fig. 1 Wavelet correlogram
quantization. Values above and
under the axes are percentiles of
wavelet coefficient population
and quantization thresholds,
respectively
where θ = nπ/K; m ∈ {0, …, S − 1} and n ∈ {0, …, K − 1} represent scale and orientation, respectively; and K and S are the numbers of desired orientations and scales, respectively. Equation (3) can be written in the frequency domain as follows:

$$\Psi(u,v) = \frac{1}{2\pi\sigma_u\sigma_v} \exp\!\left[-\frac{1}{2}\left(\frac{(u-W)^2}{\sigma_u^2} + \frac{v^2}{\sigma_v^2}\right)\right] \qquad (5)$$

where σ_u = 1/(2πσ_x) and σ_v = 1/(2πσ_y). Gabor functions do
not result in an orthogonal decomposition, which means
that a wavelet transform based upon the Gabor wavelet is
redundant [35]. Manjunath and Ma [54] proposed a design strategy for the filters to ensure that the half-peak magnitude supports of the filter responses in the
frequency spectrum touch one another. By doing this, it can
be ensured that the filters will capture the maximum
information with minimum redundancy. Hence, the parameters a, σ_u and σ_v are computed by (6)–(8), respectively:

$$a = \left(U_h/U_l\right)^{1/(S-1)} \qquad (6)$$

$$\sigma_u = \frac{(a-1)\,U_h}{(a+1)\sqrt{2\ln 2}} \qquad (7)$$

$$\sigma_v = \tan\!\left(\frac{\pi}{2K}\right)\left[U_h - 2\ln 2\,\frac{\sigma_u^2}{U_h}\right] \Bigg/ \sqrt{2\ln 2 - \frac{(2\ln 2)^2\,\sigma_u^2}{U_h^2}} \qquad (8)$$

where U_l and U_h are the lower and upper bounds of the design frequency band, respectively. The resultant real-valued even-symmetric Gabor filters, which are oriented over a range of 180°, are more appropriate for image indexing purposes [54]. Examples of such Gabor filters in the frequency domain are shown in Fig. 2.
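Equations (6)–(8) translate directly into code; a sketch of the filter-bank design (names ours), evaluated here for the parameters of Fig. 2b:

```python
import math

def gabor_bank_params(Ul, Uh, S, K):
    """Filter-bank design of Eqs. (6)-(8): choose the scale factor a
    and the Gaussian widths sigma_u, sigma_v so that the half-peak
    supports of adjacent filters touch in the frequency plane."""
    a = (Uh / Ul) ** (1.0 / (S - 1))                          # Eq. (6)
    t = 2.0 * math.log(2.0)
    sigma_u = ((a - 1.0) * Uh) / ((a + 1.0) * math.sqrt(t))   # Eq. (7)
    sigma_v = (math.tan(math.pi / (2.0 * K))                  # Eq. (8)
               * (Uh - t * sigma_u**2 / Uh)
               / math.sqrt(t - t**2 * sigma_u**2 / Uh**2))
    return a, sigma_u, sigma_v

# Parameters of Fig. 2b: Ul = 0.05, Uh = 0.4, S = 3, K = 4.
a, su, sv = gabor_bank_params(Ul=0.05, Uh=0.4, S=3, K=4)
```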
3.2 Proposed indexing algorithm
The block diagram of GWC is shown in Fig. 3. According to this figure, GWC first computes the wavelet coefficients using Gabor wavelets in three different scales. In each scale, the non-maximum suppression block aims to reduce the data redundancy, as will be explained in the following subsection. Then, the coefficients are discretized using three sets of quantization thresholds as indicated in Fig. 4. In Ref. [34], these thresholds were selected experimentally to achieve good retrieval performance.
However, as will be explained in Sect. 3.6, we propose the
use of recently introduced EGA [33] to optimize the
quantization thresholds. The final stage in the block dia-
gram of Fig. 3 is the computation of wavelet autocorrelo-
gram in each scale.
As illustrated in Fig. 4, small coefficients are considered
as noise and discarded. The distribution of noise in Gabor
wavelet coefficients depends on its distribution in the ori-
ginal image. If we consider that the noise distribution in the
Fig. 3 Block diagram of GWC. The input image is decomposed by Gabor wavelets ψ_{m,n} (three scales, four orientations); each scale passes through non-maximum suppression, quantization with its own threshold set, and autocorrelogram computation with its own distance set, and the results are concatenated into the EGWC feature vector
Fig. 2 Examples of Gabor filter
in the frequency domain. Each
ellipse represents the range of
the corresponding filter
response from 0.5 to 1.0 in
squared magnitude. The plots
(a) and (b) illustrate two
different ways for sampling the
frequency spectrum by
changing the U_l, U_h, S and K parameters of the Gabor representation. a U_l = 0.03, U_h = 0.4, S = 3, K = 2; b U_l = 0.05, U_h = 0.4, S = 3 and K = 4
original image is Gaussian, then its distribution in the Gabor wavelet coefficients will also be Gaussian with different parameters, due to the linearity of the transform. It is worthwhile to note that in most denoising algorithms for image databases including objects, (natural or man-made) scenes or objects in scenes, the noise distribution has been considered normal [46, 47, 55–57]. Therefore, the variance of noise in an image can be estimated by Donoho's
algorithm [47] and the noise margin can be estimated using
soft thresholding [47] or BayesShrink method [46].
The negative coefficients are truncated since they have
significant correlation with the positive coefficients. These
negative coefficients are mainly produced by undesirable oscillations resulting from the Gabor wavelets, especially in higher scales, since for large m the Gabor wavelets are not well localized in the spatial domain and do not die out quickly away from their central point (Fig. 7). Finally, the autocorrelogram of the quantized coefficients is computed along the direction normal to the Gabor wavelet orientation:
$$\alpha_{m,n}(i,k) = \frac{\left|\left\{(x,y) \,\middle|\, W_{m,n}(x,y)=c_i,\; W_{m,n}(x_{k_m}, y_{k_m})=c_i \ \text{or}\ W_{m,n}(x_{-k_m}, y_{-k_m})=c_i\right\}\right|}{2\,\left|\left\{(x,y) \,\middle|\, W_{m,n}(x,y)=c_i\right\}\right|}, \quad i=1,\ldots,K_q;\; k=1,\ldots,K_d \qquad (9)$$
where K_q is the number of quantization levels in each wavelet scale, K_d is the number of distances, W_{m,n} is the matrix of the quantized wavelet coefficients computed by ψ_{m,n}, k_m indicates the distance parameter of the autocorrelogram, and x_{k_m} and y_{k_m} are given by:
$$\begin{pmatrix} x_{k_m} \\ y_{k_m} \end{pmatrix} = \begin{pmatrix} x \\ y \end{pmatrix} + \begin{pmatrix} \lfloor k_m \sin\theta \rceil \\ \lfloor k_m \cos\theta \rceil \end{pmatrix} \qquad (10)$$

where ⌊a⌉ rounds a ∈ ℝ to the nearest integer.
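The rounded step of Eq. (10) can be sketched as follows (round-half-up is assumed for ⌊·⌉; the paper's exact tie-breaking convention may differ, and the names are ours):

```python
import math

def nearest_int(v):
    """Round-half-up to the nearest integer, i.e. floor(v + 0.5)."""
    return math.floor(v + 0.5)

def corr_offset(x, y, k_m, theta):
    """Position (x_{k_m}, y_{k_m}) of Eq. (10): step k_m along the
    direction normal to the Gabor orientation theta."""
    return (x + nearest_int(k_m * math.sin(theta)),
            y + nearest_int(k_m * math.cos(theta)))
```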
3.3 Handling the redundancy problem
Gabor wavelet coefficients are redundant because of non-
orthogonal decomposition. From the wavelet correlogram point of view, there are two types of redundancy in Gabor wavelet coefficients: (a) the redundancy in the direction of
high-pass filtering and (b) the redundancy in the direction
of low-pass filtering. These redundancies may abnormally
increase the value of the correlogram coefficients and the
computational cost.
Gabor wavelet coefficients along the radial direction are not expected to be correlated, because the radial direction represents the direction of high-pass filtering. However, due to the redundancy of the transform,
Fig. 5 Suppressing non-maximum Gabor wavelet coefficients along the direction of high-pass filtering: a The query image, b Gabor wavelet
coefficients with m = 2 and n = 2 and c non-maximum suppressed Gabor wavelet coefficients along the vertical direction
Fig. 4 GWC quantization
thresholds. Values under the
axes are quantization thresholds
particularly in large scales, Gabor wavelet coefficients
along the radial direction show some correlation. This
phenomenon is clearly visible in Fig. 5b. In this figure,
Gabor wavelets have been computed in the third scale
(m = 2) and vertical direction (n = 2 or h = p/2). As can
be observed, Gabor wavelet coefficients are correlated in
the vertical direction, while this direction corresponds to
the direction of high-pass filtering. It is worthwhile to
note that correlation of wavelet coefficients in the direc-
tion of high-pass filtering is not informative. This is the
reason why Gabor wavelet correlogram is computed only
in the direction of low-pass filtering. Let us explain this
observation differently. As illustrated in Fig. 5b, the
Gabor wavelet coefficients represent horizontal edges in
the image, and Gabor wavelet correlogram looks for
correlation only along the edges. Therefore, it is reason-
able to suppress non-maximum coefficients along the
radial direction to reduce the redundancy of wave-
let coefficients and improve the efficiency of the
algorithm.
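Non-maximum suppression along the radial (high-pass) direction can be sketched as a comparison with both neighbors along one axis; here axis 0 would serve a vertically oriented filter such as that of Fig. 5 (the implementation details are ours):

```python
import numpy as np

def nms_along_axis(W, axis=0):
    """Zero out coefficients that are not local maxima of |W| along
    `axis` (the direction of high-pass filtering).  Borders compare
    against -inf, so border maxima survive; plateau values are kept."""
    W = np.asarray(W, dtype=np.float64)
    A = np.abs(W)
    pad = [(0, 0)] * A.ndim
    pad[axis] = (1, 1)
    P = np.pad(A, pad, mode='constant', constant_values=-np.inf)
    prev_sl = [slice(None)] * A.ndim
    next_sl = [slice(None)] * A.ndim
    prev_sl[axis] = slice(0, -2)   # neighbor at index - 1
    next_sl[axis] = slice(2, None) # neighbor at index + 1
    keep = (A >= P[tuple(prev_sl)]) & (A >= P[tuple(next_sl)])
    return np.where(keep, W, 0.0)
```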
Furthermore, the redundancy in the direction of low-
pass filtering is avoided by adapting the distance parameter
k_m in (9) according to the wavelet scale as follows:

$$k_m \in \left\{\, s^m \cdot a \mid a = 1, \ldots, K_d \,\right\} \qquad (11)$$
where s indicates the scaling coefficient of the filter bank,
which is set to 2 for the Gabor wavelet filters used in this
article. In more detail, let us consider that the original
image is low-pass filtered in the y direction using a
Gaussian kernel with standard deviation of ry. If the pixels
in the original image are supposed to be uncorrelated, it can
be easily shown that the autocorrelation function of the
filtered image in the y direction in the mth scale will be a Gaussian function with standard deviation equal to √2 σ_y s^m [58]:

$$R(k_m) = \frac{1}{2\sqrt{\pi}\,\sigma_y s^m}\; e^{-\frac{1}{2}\left(\frac{k_m}{\sqrt{2}\,\sigma_y s^m}\right)^2} \qquad (12)$$
where k_m is the distance between coefficients as a function of the scaling coefficient s and wavelet scale m. Therefore, every s^m adjacent wavelet coefficients are significantly correlated in the mth wavelet scale due to the redundancy.
However, we should distinguish between inherent corre-
lation of pixels in the original image and the correlation
due to low-pass filtering by a Gaussian kernel. Obviously,
Gabor wavelet correlogram should not take into account
the correlations caused by low-pass filtering and it should
reflect only the inherent correlation of coefficients. This is
the reason why we propose the use of a scale-dependent
distance for the computation of autocorrelogram in the
direction of low-pass filtering.
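Equation (11) amounts to stretching the distance set by s^m; a sketch:

```python
def scale_distances(m, s=2, Kd=4):
    """Distance set of Eq. (11): k_m = s^m * a for a = 1..Kd, so that
    adjacent coefficients correlated only through low-pass filtering
    at scale m are skipped over."""
    return [s ** m * a for a in range(1, Kd + 1)]
```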
3.4 GWC index vector
The index vector of GWC simply consists of the autocorrelogram coefficients computed for all Gabor wavelets:

$$F = \left[\alpha_{m,0}(i,k_m),\; \alpha_{m,1}(i,k_m),\; \ldots,\; \alpha_{m,K-1}(i,k_m)\right], \quad m = 0, 1, \ldots, S-1;\; i = 1, 2, \ldots, K_q \qquad (13)$$

Therefore, the GWC index vector includes U = S × K × K_q × K_d words. In this paper, we selected S = 3, K = 4, K_q = 4 (Fig. 4), and K_d = 4 (Eq. 9) and hence,
U = 192. It should be noted here that the proposed feature vector contains no color or macro-structural shape information. As will be seen in Sect. 4 (Experimental
Results), this feature vector gives an effective and efficient
representation of the image for CBIR.
3.5 Retrieval algorithm
Different types of distances for comparing color and texture have been proposed and discussed in the literature. SIMPLIcity
[28] uses a complex image similarity measure called
integrated region matching (IRM), which measures the
overall similarity between images by integrating the properties of all the segmented regions. Vo et al. [41] and
Manjunath et al. [42] used a special normalized L1
distance.
In this paper, for the feature vectors F and F′ (obtained from the images I and I′, respectively), the distance measure d_1 is defined as follows:

$$d_1(F, F') = \sum_{j=1}^{C} \frac{\left|F_j - F'_j\right|}{1 + F_j + F'_j} \qquad (14)$$

where F_j indicates the jth component of the feature vector F (the 1 in the denominator prevents division by zero [59]).
It can be easily shown that the use of d1 provides better
performance with respect to L1 [6].
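The d_1 measure of Eq. (14) and the ranking step of the retrieval phase can be sketched as follows (assuming nonnegative correlogram entries; function names are ours):

```python
import numpy as np

def d1(F, Fp):
    """Modified L1 distance of Eq. (14); the 1 in the denominator
    prevents division by zero for nonnegative feature components."""
    F = np.asarray(F, dtype=np.float64)
    Fp = np.asarray(Fp, dtype=np.float64)
    return float(np.sum(np.abs(F - Fp) / (1.0 + F + Fp)))

def retrieve(query, database, N=5):
    """Return the indices of the N best-matched index vectors,
    sorted by increasing d1 distance to the query."""
    scores = [d1(query, f) for f in database]
    return list(np.argsort(scores)[:N])
```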
3.6 Optimizing quantization thresholds
The quantization thresholds illustrated in Fig. 4 should be
optimally determined to achieve a good retrieval perfor-
mance of the resulted Gabor wavelet features. Optimization
of CBIR algorithms is a complicated and time-consuming task, since each time a parameter of the indexing algorithm is changed, all images in the database must be indexed
again. Therefore, classical optimization methods like
genetic algorithms (GA) cannot be used for this purpose,
since in each generation they need to index all the images
in the database using all chromosomes in the population
and compute the retrieval performance of each chromo-
some as fitness value. In this paper, we use EGA [33] that
has been shown to be particularly efficient for optimizing
the parameters of CBIR methods. In EGA, the evolution process is made faster than in GA by partitioning the reference imagebase into several smaller subsets, so that the whole database is traversed only once. Each subset is used by an
updating process as training patterns for each chromosome
during evolution. Therefore, in EGA, the evolution proceeds while the image subsets of the reference imagebase are being indexed. This significantly reduces the computational
cost of EGA compared to GA. Here, each chromosome
includes (a) an age gene that implies the progress of the
updating process, (b) some evolutionary genes that par-
ticipate in evolution, (c) a number of history genes for
saving the previous chromosome updating process states
and (d) finally some evaluation genes that indicate the
evaluation function value. Furthermore, a new fitness
function is defined, which evaluates the fitness of the
chromosomes of the current population with different ages
in each generation. This fitness value will be valid if the
chromosome’s age is larger than a threshold (mature
chromosome); otherwise, it will be set to zero for immature
chromosomes.
The EGA flowchart is shown in Fig. 6. This evolutionary
algorithm initially generates a random population. The
chromosomes of the current population are then updated by
a chromosome updating process (CUP) and the process is
repeated until there are at least two mature chromosomes. In
the next stage, based on the fitness of chromosomes, two
mature individuals are selected from the current population
as parents by a selection operator. Two offspring are then
generated by the parents using the crossover and mutation
operators. Finally, the new population is generated by
replacing two chromosomes that have the smallest fitness by
the new offspring. The above procedure is repeated until a
stop criterion is satisfied. Since the elite chromosomes are kept in the population in each generation, according to Rudolph's theorem [60], the proposed evolutionary group eventually converges to the global optimum after a sufficient number of generations. In EGA, the selection, crossover, and
mutation operators, stop criterion and other parameters can
be determined according to the application requirements.
In [33], six quantization thresholds were optimized
using a population size of M = 150 chromosomes. In this
application of EGA, the number of evolutionary genes
(quantization thresholds) is 12. Therefore, as a compromise
between population diversity and computational cost, the
population size was set to M = 200. In both applications,
the number of mature chromosomes in CUP was set to S_CUP = 0.2 × M. Moreover, we used the tournament
selection operator [61] in which the tournament size was
set to M/7, one-point mathematical crossover operator [62]
and typical mutation operator in which the mutation
probability was set to Pm = 0.01 [63]. For EGA evolution,
we used the databases C5000 and Brodatz texture sepa-
rately (see Sect. 4.1). Table 1 indicates the resultant
quantization thresholds (illustrated in Fig. 4) for enhanced
GWC (EGWC), which correspond to the evolutionary
genes of the best chromosome in the final population after
800 generations for each dataset.
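The generic operators named above (tournament selection, one-point crossover, per-gene mutation with P_m = 0.01) can be sketched as follows; the EGA-specific chromosome updating process and the age/history genes are omitted, the real-valued perturbation used for mutation is an assumption, and all names are ours:

```python
import random

def tournament_select(pop, fitness, t_size):
    """Tournament selection: the fittest of t_size random entrants wins."""
    entrants = random.sample(range(len(pop)), t_size)
    return pop[max(entrants, key=lambda i: fitness[i])]

def one_point_crossover(p1, p2):
    """One-point crossover of two equal-length chromosomes."""
    c = random.randrange(1, len(p1))
    return p1[:c] + p2[c:], p2[:c] + p1[c:]

def mutate(chrom, pm=0.01, jitter=0.1):
    """Per-gene mutation: with probability pm, perturb the gene
    (assumed real-valued, e.g. a quantization threshold)."""
    return [g + random.uniform(-jitter, jitter) if random.random() < pm else g
            for g in chrom]
```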
3.7 Computational complexity
Taking into account all Gabor wavelet scales (S), directions
(K), distances (Kd), and quantization levels (Kq), the
computational complexity of GWC and EGWC is
Fig. 6 EGA flowchart
O(S·K·M²·K_d·K_q) for an M × M image (see Eq. 9). In practice, S and K are fairly small (≤ 4). Our experiments using
small values for S and K demonstrated a good overall performance of the indexing and retrieval algorithm. Because
of multiscale property of GWC and also adjusting distances
as a function of the wavelet scale, Kd is generally smaller
than the maximum number of distances used by the color
correlogram indexing algorithm [6]. The number of quan-
tization levels (Kq) is, in general, considerably less than the
number of image gray levels (L). Obviously, augmenting
L will require increasing Kq. However, the ratio
(Kq/L) remains considerably small, which justifies the
reasonable computational cost of GWC. Our algorithm has
been implemented in MATLAB on a PC with an Intel Core
2 Duo 2.67 GHz CPU and 3.24 GB of RAM. The com-
putation of the index vector for each image takes less than
2 s.
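The O(SKM²KdKq) bound corresponds to a nested loop of the following shape. This is a schematic operation count, not the authors' MATLAB implementation:

```python
def autocorrelogram_cost(S, K, M, distances, levels):
    """Count innermost visits of a naive wavelet-autocorrelogram
    computation: every scale/orientation subband of an M x M image
    is scanned once per correlogram distance, and each coefficient
    is matched against len(levels) quantization bins."""
    ops = 0
    for _ in range(S):              # wavelet scales
        for _ in range(K):          # orientations
            for _ in distances:     # correlogram distances (Kd)
                ops += M * M        # every coefficient visited once
    return ops * len(levels)        # Kq histogram bins

# e.g. S = 3, K = 4, a 256 x 256 image, Kd = 4 distances, Kq = 5 levels
cost = autocorrelogram_cost(3, 4, 256, [1, 3, 5, 7], range(5))
```

Since S, K, Kd and Kq are all small constants in practice, the cost is effectively dominated by the M² image-scan term.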
On the other hand, optimization of quantization thresh-
olds by EGA requires indexing and retrieval of the images
in the local database of each chromosome. Since, the
computational cost of indexing and retrieval of the images
in a database changes linearly with the number of quanti-
zation levels (Kq), it seems that EGA computational cost is
a linear function of Kq. However, it is worthwhile to note
that increasing the number of quantization levels increases
the complexity of the search space for EGA. In other
words, if the number of quantization levels increases, the
number of chromosomes in the initial population (or the
number of generations) will be increased.
4 Experimental results
A number of criteria influence the choice of K and
S. Bianconi and Fernandez [64] conducted an extensive
experimental campaign to investigate the effects of Gabor
filter parameters on texture classification. The outcomes of
the experimental activity demonstrate a poor correlation
between the number of frequencies and orientations used to
define a filter bank and the percentage of correct classifi-
cation. Greenspan et al. [65] showed numerically
that taking complex filters in four directions is enough to
represent most (97%) of the image energy. Still, six
directions are usually used [42]. Using a larger number of
directions (K > 4) may improve the feature effectiveness;
however, it increases the index size (memory requirement)
and computational cost. The number of scales (number of
considered filter frequency bands) is usually three or four
[41–43]. A larger number of scales has been used by some
authors [44]; however, post-processing of the resulting
features was performed to select the most important
components. In this paper, in order to obtain a feature
vector of reasonable dimensionality (without any post-
processing of the resulting features), we used Gabor
wavelets in four directions (K = 4) and three scales
(S = 3) as illustrated in Fig. 7.
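A filter bank with K = 4 orientations and S = 3 scales, in the spirit of Fig. 7, can be generated as below. The real cosine-carrier kernel, the Manjunath–Ma geometric spacing of centre frequencies between Ul and Uh, and the σ = 0.56/f bandwidth setting are assumptions for illustration, not the exact filters used in the paper:

```python
import math

def gabor_kernel(size, f, theta, sigma):
    """Sample a real Gabor filter (cosine carrier times a Gaussian
    envelope) on a size x size grid centred at the origin."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates to the filter orientation
            xr = x * math.cos(theta) + y * math.sin(theta)
            envelope = math.exp(-(x * x + y * y) / (2.0 * sigma ** 2))
            row.append(envelope * math.cos(2.0 * math.pi * f * xr))
        kernel.append(row)
    return kernel

def gabor_bank(S=3, K=4, u_l=0.05, u_h=0.49, size=15):
    """Build S x K filters with geometrically spaced centre
    frequencies between u_l and u_h and K orientations."""
    a = (u_h / u_l) ** (1.0 / (S - 1))      # scale ratio
    bank = []
    for s in range(S):
        f = u_l * a ** s                    # centre frequency of scale s
        sigma = 0.56 / f                    # assumed bandwidth constant
        for k in range(K):
            theta = k * math.pi / K
            bank.append(gabor_kernel(size, f, theta, sigma))
    return bank
```

With S = 3 and K = 4 this yields twelve filters, matching the index-vector dimensionality trade-off discussed above.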
The parameters Ul and Uh are chosen according to the
lowest and highest frequencies of interest in the image
[66]. Uh is usually chosen to maintain the filter response
inside the region delimited by the Nyquist frequency (0.5).
Bianconi and Fernandez [64] tested different values of the
smoothing parameters σx and σy and showed that the
percentage of correct classification decreases as the level of
the two parameters increases. They computed Uh indirectly,
given the value of σx, using

$U_h = \dfrac{\sigma_x}{2\left(\sigma_x + \sqrt{\log 2}/\pi\right)}$  (15)

and showed that increasing Uh from 0.35 ($\sqrt{2}/4$) to 0.45
results in decreasing classification accuracy.
texture images, it can be seen that the directional structures
appear at finer scales, which are governed by higher fre-
quencies. Therefore, the lowest center frequency can be
placed at a fairly high value, Ul = 0.10 [66]. Manjunath
et al. [54] used a wide range of the frequency spectrum for
texture images (Ul = 0.05 and Uh = 0.40). In this work,
the parameters Ul and Uh are chosen experimentally as 0.05
and 0.49, respectively. A statistical strategy similar to that of
[64] may be adopted to evaluate the effect of Ul and Uh and
other Gabor filter parameters on image retrieval accuracy.
However, this requires the design of experiments and
analysis of the results obtained by applying different Gabor
filter banks over different groups of images, and is the
subject of ongoing work.
Table 1 Optimal values obtained for the quantization thresholds illustrated in Fig. 5 for the C5000 and Brodatz texture datasets

Quantization thresholds in scale i

          n_{i,1}             n_{i,2}             n_{i,3}             n_{i,4}
          C5000    Brodatz    C5000    Brodatz    C5000    Brodatz    C5000      Brodatz
Scale 1   132.92   85.3       150      128.2      410      292.4      983.13     579.4
Scale 2   207.82   118.01     274.76   282.2      343.46   338.5      945.29     908.04
Scale 3   316.10   209.3      365.46   342.1      751.14   508.5      1503.86    1042.8
In Fig. 7, it is worthwhile to note that an important
problem appears in the first scale (m = 0) where the fre-
quency response of the filter has significant amplitude
above the Nyquist frequency. Cutting off abruptly the filter
response above the Nyquist frequency strongly distorts the
filter shape in the spatial domain (this causes the appear-
ance of side lobes or ringing). Since the horizontal and
vertical filters are defined by central symmetry, they are
continuous across the periodicities of the Fourier domain;
therefore, they are well localized and without extra side
lobes in the space domain. The oblique filters (i.e., filters
which are neither vertical nor horizontal) are also defined
by central symmetry. But this is not sufficient to maintain
the Fourier domain continuously (across periods) and to
keep a good localization in the space domain. To alleviate
this problem, a solution has been proposed in [67].
The parameters used for the development of EGWC
(such as Ul, Uh, K and S) have already been used for
indexing and retrieval of image databases including objects
(natural or man-made), scenes, objects in natural scenes
and textures. Therefore, we studied the performance of the
proposed algorithm using two different datasets: COREL
and Brodatz texture databases [28, 68].
4.1 Image databases
Fig. 7 Illustrations of the applied Gabor wavelets in the spatial (left images) and frequency (right images) domains

To demonstrate the performance of the new algorithm in a
CBIR system, one subset of the COREL imagebase [28] with
Table 2 The names of 50 categories in C5000
No. Category No. Category No. Category No. Category No. Category
1 Vultures 11 Dinosaurs 21 Card game 31 Snow board 41 Road signs
2 Decoys 12 Clouds 22 Easter egg 32 Fireworks 42 Troops parade
3 Lions 13 Dawn 23 Gun 33 Body building 43 Fashion
4 Elephants 14 Mountains 24 Flag 34 Kong Fu 44 Tran
5 Tigers 15 Caves 25 Dolls 35 Surfing 45 Airplane
6 Horses 16 Trees 26 Tools 36 Sailing ship 46 Food
7 Dolphins 17 Waves 27 Mineral texture 37 Cars 47 Ballet
8 Panthers 18 Flowers 28 Granular texture 38 Doors 48 Marble
9 Cats 19 Japanese trees 29 Precious-stones 39 Interior design 49 Bus
10 Shells 20 Antiques 30 Molecules 40 Historical building 50 Medicine
5,000 (C5000) images in 50 categories and with 100 ima-
ges in each category were utilized. The categories’ names
of C5000 are given in Table 2. In addition, a popular
texture database is used in our experiments: the Brodatz
texture database, consisting of 112 different types of tex-
ture images. Each original image, of size 512 × 512, is
evenly divided into 16 non-overlapping 128 × 128
sub-images, thus creating a database of 112 × 16 = 1,792
Brodatz texture images. If any one of them is used as the
query image, the 16 texture images derived from the same
original image are regarded as belonging to the same class
and constitute the ground truth to be retrieved.
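The 512 × 512 to 16 × (128 × 128) tiling described above is straightforward; a sketch treating the image as a nested list of pixel rows (any image I/O is outside the scope of this illustration):

```python
def tile(image, tile_size=128):
    """Split a square image (list of pixel rows) into non-overlapping
    tile_size x tile_size sub-images, in row-major order."""
    n = len(image) // tile_size
    tiles = []
    for ty in range(n):
        for tx in range(n):
            tiles.append([row[tx * tile_size:(tx + 1) * tile_size]
                          for row in image[ty * tile_size:(ty + 1) * tile_size]])
    return tiles

# a 512 x 512 dummy image yields 16 tiles, as in the Brodatz setup
img = [[(x + y) % 256 for x in range(512)] for y in range(512)]
tiles = tile(img)
```

Applied to all 112 Brodatz originals, this produces the 1,792-image database used in the experiments.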
4.2 Evaluation measures
In this paper, the standard performance measurement pre-
cision-recall pair is used for the evaluation of retrieval
performance. Precision P is defined as the ratio of the
number of retrieved relevant images r to the total number
of retrieved images n, i.e., P = r/n. Precision P measures
the accuracy of the retrieval. Recall R is defined as the ratio
of the number of retrieved relevant images r to the total
number m of relevant images in the whole database, i.e.,
R = r/m. Recall R measures the robustness of the
retrieval.
$P = \dfrac{r}{n} = \dfrac{\text{No. of relevant images retrieved}}{\text{Total no. of images retrieved}}$

$R = \dfrac{r}{m} = \dfrac{\text{No. of relevant images retrieved}}{\text{Total no. of relevant images in DB}}$  (16)
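The two measures of Eq. 16 are a few lines of code; the query numbers below are a hypothetical example, not results from the paper:

```python
def precision_recall(retrieved, relevant):
    """P = r/n and R = r/m as defined in Eq. 16."""
    r = len(set(retrieved) & set(relevant))
    return r / len(retrieved), r / len(relevant)

# hypothetical Brodatz-style query: 11 images returned, 16 relevant
# images in the database, 8 of the returned images are correct
retrieved = [1, 2, 3, 4, 5, 6, 7, 8, 30, 31, 32]
relevant = list(range(1, 17))
prec, rec = precision_recall(retrieved, relevant)
```

Precision and recall trade off against each other as n grows, which is why the experiments below report precision-recall pairs rather than a single number.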
4.3 Performance using 5,000 image database
The first experiment was performed to compare the pro-
posed algorithms with a number of counterpart methods
using the imagebase C5000. Figure 8 compares the per-
formance of EGWC, GWC and SIMPLIcity [28].

Fig. 8 Average retrieval results of 5,000 queries using GWC, SIMPLIcity and EGWC

Fig. 9 Query results on C5000 obtained using GWC with d1. The top left image in each series is the query image and the following 11 images are the retrieval results
As can be seen from the figure, the retrieval perfor-
mance of the EGWC feature is significantly higher than that of
SIMPLIcity and the GWC feature. It is worthwhile to note that
EGWC uses a simple relative distance measure for feature
matching, while SIMPLIcity uses a complex integrated
region matching measure of image similarity, which works
based on region segmentation of images. Obviously, using
more complex distance measures or classifiers with EGWC
features may improve its retrieval performance. In addi-
tion, SIMPLIcity uses both color and shape features in
contrast to EGWC, which uses only micro-structure shape
information. Figures 9 and 10 illustrate the results of GWC
and EGWC, respectively, for two query images from
C5000.
4.4 Performance using Brodatz texture image database
The second experiment aimed to compare the perfor-
mance of EGWC with other multiresolution mul-
tidirectional image decomposition techniques, such as the
conventional 2D Gabor transform [35], discrete contourlet
transform [36], steerable pyramid [37] and CDFB [38] on
texture retrieval [41]. Figure 11 shows the overall perfor-
mances for the case of n (total number of retrieved images)
from 16 to 65. As shown in Fig. 11, EGWC performed
significantly better than the other multiresolution
multidirectional counterpart techniques.

Fig. 10 Query results on C5000 obtained using EGWC with d1. The top left image in each series is the query image and the following 11 images are the retrieval results

Fig. 11 Average retrieval rate on the Brodatz database according to the number of top images considered, using directional filter banks (conventional Gabor, discrete contourlet, steerable pyramid, CDFB [40]) and EGWC

As can be seen,
due to the nature of the Gabor transform, even the
conventional Gabor feature structure [41] has better or
comparable performance relative to its three counterparts
(contourlet, steerable pyramid and CDFB transforms).
Clearly, in EGWC, the correlogram technique has
significantly improved the quality of the Gabor indexing
features. Figure 12 illustrates the results of EGWC for a
query image from the Brodatz dataset.
5 Conclusion
In this paper, a new feature called enhanced Gabor
wavelet correlogram was proposed for image indexing
and retrieval. EGWC uses Gabor decomposition which
has been shown to be optimal in the sense of minimizing
the joint two-dimensional uncertainty in space and fre-
quency. It also takes advantage of the directional
selectivity of Gabor wavelets to extract shape features effectively.
EGWC also proposes an appropriate solution to reduce
the data redundancy caused by Gabor decomposition. For
this purpose, it uses non-maximum suppression in the
direction of high-pass filtering. It also proposes the use
of a scale-dependent distance for the computation of the
wavelet autocorrelogram. In addition, EGWC uses
quantization thresholds optimized by an evolutionary
algorithm particularly designed for optimizing CBIR
parameters.
Comprehensive experiments were performed with two
different datasets including 5,000 natural and 1,792 texture
images, respectively. EGWC provided better performance
with respect to a number of frequently used CBIR algo-
rithms including SIMPLIcity and directional filter bank
methods such as Gabor wavelets, discrete contourlet
transform, steerable pyramid and CDFB.
EGWC uses only micro-structure shape information for
constructing the index vector. Obviously, enriching this feature
vector with color and macro-structure image information
may provide significant improvement in retrieval
performance.
Acknowledgments The authors would like to thank Prof. J.Z. Wang
for his invaluable comments and providing a subset of COREL
database and executable version of SIMPLIcity software with which
we could obtain retrieval results on C5000. We would also like to
thank Dr. M. Saadatmand-Tarzjan and Mrs. N. Nematzadeh for their
contribution and help.
Fig. 12 Query results on the Brodatz texture database obtained using EGWC with d1. The top left image in each series is the query image and the following 11 images are the retrieval results
References
1. Hanbury A, Muller H, Clough P (2010) Special issue on image
and video retrieval evaluation. Comput Vis Image Underst
114:409–410
2. Wang JZ, Geman D, Luo JB, Gray RM (2008) Special issue on
real-world image annotation and retrieval. IEEE Trans Pattern
Anal Mach Intell 30:1873–1876
3. Datta R, Joshi D, Li J, Wang JZ (2008) Image retrieval: ideas,
influences, and trends of the new age. ACM Comput Surv
40:1–60
4. Lew M, Sebe N, Djeraba C, Jain R (2006) Content-based mul-
timedia information retrieval: state of the art and challenges.
ACM Trans on Multimed Comput Commun Appl 2:1–19
5. Gevers T, van de Weijer J, Stokman H (2007) Color feature
detection: an overview. In: Color image processing: methods and
applications. CRC Press, USA
6. Huang J, Ravi Kumar S, Mitra M, Zhu WJ, Zabih R (1997) Image
indexing using color correlograms. Proc IEEE Comput Soc Conf
Comput Vis Pattern Recognit 1:762–768
7. Stricker M, Dimai A (1997) Spectral covariance and fuzzy
regions for image indexing. Mach Vis Appl 10:66–73
8. Mahmoudi F, Shanbehzadeh J, Eftekhari-Moghadam AM, Solt-
anian-Zadeh H (2003) Image retrieval based on shape similarity
by edge orientation autocorrelogram. Pattern Rec 36:1725–1736
9. Schettini R, Ciocca G, Zuffi S (2001) A survey on methods for
color image indexing and retrieval in image databases. In: Luo R,
MacDonald L (eds) Color imaging science: exploiting digital
media. Wiley, New York, pp 183–211
10. Flickner M, Sawhney H, Niblack W, Ashley J, Huang Q, Dom B,
Gorkani M, Hafner J, Lee D, Petkovic D, Steele D, Yanker P
(1995) Query by image and video content: the QBIC system.
IEEE Comput 28:23–32
11. Ogle VE, Stonebraker M (1995) Chabot: retrieval from a rela-
tional database of images. IEEE Comput 28:40–48
12. Pentland A, Picard R, Sclaroff S (1996) Photobook: content-based
manipulation of image databases. Int J Comput Vis 18:233–254
13. Carson C, Thomas M, Blongie S, Hellerstein JM, Malik J (1999)
Blobworld: a system for region-based image indexing and
retrieval. Proc 3rd Intl Conf Vis Inform Sys 1:509–516
14. Del Bimbo A, Mugnaini M, Pala P, Turco F (1998) Visual que-
rying by color perceptive regions. Pattern Recogn 31:1241–1253
15. Pass G, Zabih R, Miller J (1996) Comparing images using color
coherence vectors. Proc 4th ACM Multimed Conf 1:65–73
16. Teng SW, Lu G (2007) Image indexing and retrieval based on
vector quantization. Pattern Recogn 40:3299–3316
17. Jain AK, Vailaya A (1998) Shape-based retrieval: a case study
with trademark image database. Pattern Recogn 31:1369–1390
18. Abbasi S, Mokhtarian F, Kittler J (1999) Curvature scale space in
shape similarity retrieval. Multimed Sys 7:467–476
19. Berretti S, Del Bimbo A, Pala P (2000) Retrieval by shape sim-
ilarity with perceptual distance and effective indexing. IEEE
Trans Multimed 2:225–239
20. Kim HK, Kim JD (2000) Region-based shape descriptor invariant
to rotation, scale and translation. Signal Process Image Commun
16:87–93
21. Belongie S, Malik J, Puzicha J (2002) Shape matching and object
recognition using shape context. IEEE Trans Pattern Anal Mach
Intell 24:509–522
22. Ricard J, Coeurjolly D, Baskurt A (2005) Generalization of
angular radial transform for 2D and 3D shape retrieval. Pattern
Recogn Lett 26:2174–2186
23. Huet B, Hancock ER (1999) Line pattern retrieval using rela-
tional histograms. IEEE Trans Pattern Anal Mach Intell
21:1363–1370
24. Chi Y, Leung MKH (2007) ALSBIR: a local-structure-based
image retrieval. Pattern Recogn 40:244–261
25. Manjunath BS (2002) Introduction to MPEG-7: multimedia
content description interface. Wiley, New York
26. Mokhtarian F, Mackworth AK (2002) The curvature scale space
representation: theory, applications and MPEG-7 standardization.
Kluwer Academic Publishers, The Netherlands
27. Balmelli L, Mojsilovic A (2001) Wavelet domain features for
texture description, classification and replicability analysis.
Wavelets in signal and image analysis: from theory to practice.
Kluwer Academic Publishers, The Netherlands
28. Wang JZ, Li J, Wiederhold G (2001) SIMPLIcity: semantics-
sensitive integrated matching for picture libraries. IEEE Trans
Pattern Anal Mach Intell 23:947–963
29. Abrishami Moghaddam H, Khajoie TT, Rouhi AH, Saadatmand-
Tarzjan M (2005) Wavelet correlogram: a new approach for
image indexing and retrieval. Pattern Recogn 38:2506–2518
30. Mandal MK, Aboulnasr T, Panchanathan S (1999) Fast wavelet
histogram techniques for image indexing. Comput Vis Image
Underst 75:99–110
31. Du Y, Wang JZ (2001) A scalable integrated region-based image
retrieval system. Proc IEEE Int Conf Image Process (ICIP 2001)
1:22–25
32. Kokare M, Biswas PK, Chatterji BN (2007) Texture image retrieval
using rotated wavelet filters. Pattern Recogn Lett 28:1240–1249
33. Saadatmand-Tarzjan M, Abrishami Moghaddam H (2007) A
novel evolutionary approach for optimizing content-based image
indexing algorithms. IEEE Trans Sys Man Cybern Part B Cybern
37:139–153
34. Abrishami Moghaddam H, Saadatmand-Tarzjan M (2006) Gabor
wavelet correlogram algorithm for image indexing and retrieval.
Proc 18th Intl Conf Pattern Recognition (ICPR’06) 1:925–928
35. Lee TS (1996) Image representation using 2D Gabor wavelets.
IEEE Trans Pattern Anal Mach Intell 18:959–971
36. Do MN, Vetterli M (2005) The contourlet transform: an efficient
directional multiresolution image representation. IEEE Trans
Image Proc 14:2091–2106
37. Simoncelli EP, Freeman WT, Heeger DJ (1992) Shiftable mul-
tiscale transforms. IEEE Trans Inform Theory 38:587–607
38. Do MN, Vetterli M (2003) The finite ridgelet transform for image
representation. IEEE Trans Image Proc 12:16–28
39. Joutel G, Eglin V, Bres S, Emptoz H (2007) Curvelets based
queries for CBIR application in handwriting collections. Int Conf
Document Anal Recognit (ICDAR’07) 1:649–653
40. Nguyen T, Oraintara S (2005) The shift-invariant complex
directional pyramid. IEEE Trans Signal Process 56:4651–4672
41. Vo APN, Nguyen TT, Oraintara S (2006) Texture image retrieval
using complex directional filter bank. In: Proceedings of IEEE
International Symposium on Circuits and Systems (ISCAS’06),
vol 1, pp 5495–5498
42. Manjunath BS, Wu P, Newsam S, Shin HD (2000) A texture
descriptor for browsing and similarity retrieval. Signal Process
Image Commun 16:33–43
43. Borgne HL, O’Connor N (2005) Natural scene classification and
retrieval using Ridgelet-based Image Signatures. Proc Adv
Concepts Intell Vis Syst (ACIVS’ 05) 1:116–122
44. Murtagh F, Starck JL (2008) Wavelet and curvelet moments for
image classification: application to aggregate mixture grading.
Pattern Recogn Lett 29:1557–1564
45. Abrishami Moghaddam H, Khajoie TT, Rouhi AH (2003) A new
algorithm for image indexing and retrieval using wavelet corre-
logram. Proc IEEE Intl Conf Image Process (ICIP’03) 2:497–500
46. Vetterli M, Chang SG, Yu B (2000) Adaptive wavelet thres-
holding for image denoising and compression. IEEE Trans Image
Proc 9:1532–1546
47. Donoho DL (1995) De-noising by soft-thresholding. IEEE Trans
Inform Theory 41:613–627
48. Donoho DL, Johnstone IM (1994) Ideal spatial adaptation by
wavelet shrinkage. Biometrika 81:425–455
49. Mallat S (1998) A wavelet tour of signal processing. Academic
Press, San Diego
50. Rahman Md, Bhattacharya P, Desai BC (2007) A framework for
medical image retrieval using machine learning and statistical
similarity matching techniques with relevance feedback. IEEE
Trans Inform Technol Biomed 11:58–69
51. Liu H, Song D, Ruger S, Hu R, Uren V (2008) Comparing dis-
similarity measures for content-based image retrieval. In: Lecture
notes in computer science: information retrieval technology,
Springer, Berlin, pp 44–50
52. Au KM, Law NF, Siu WC (2007) Unified feature analysis in
JPEG and JPEG 2000-compressed domains. Pattern Recognit
40:2049–2062
53. Turner MR (1986) Texture discrimination by Gabor functions.
Biol Cybern 55:71–82
54. Manjunath BS, Ma WY (1996) Texture features for browsing and
retrieval of image data. IEEE Trans Pattern Anal Mach Intell
18:837–842
55. Shang L (2008) Denoising natural images based on a modified
sparse coding algorithm. Appl Math Comput 205:883–889
56. Tan S, Jiao L (2008) A unified iterative denoising algorithm
based on natural image statistical models: derivation and exam-
ples. Optic Express 16:975–992
57. Forouzanfar M, Abrishami Moghaddam H, Ghadimi S (2008)
Locally adaptive multiscale Bayesian method for image denois-
ing based on bivariate normal inverse Gaussian distributions. Int J
Wavelets Multiresolution Inform Process 6:653–664
58. Papoulis A, Pillai SU (2002) Probability, random variables and
stochastic processes. McGraw Hill, New York
59. Haussler D (1992) Decision theoretic generalizations of the PAC
model for neural net and other learning applications. Inform
Comput 100:78–150
60. Rudolph G (1994) Convergence analysis of canonical genetic
algorithms. IEEE Trans Neural Net 5:96–101
61. Poli R (2005) Tournament selection, iterated coupon-collection
problem and backward-chaining evolutionary algorithms. In:
Wright A et al (eds) Foundations of genetic algorithms. Springer,
New York, pp 132–155
62. Engelbrecht AP (2002) Computational intelligence. Wiley, UK
63. Whitley D (1994) A genetic algorithm tutorial. Stat Comput
4:65–85
64. Bianconi F, Fernandez A (2007) Evaluation of the effects of
Gabor filter parameters on texture classification. Pattern Recogn
40:3325–3335
65. Greenspan H, Belongie S, Perona P, Goodman R, Rackshit S,
Anderson C (1994) Overcomplete steerable pyramid filters and
rotation invariance. In: Proceedings of IEEE Conference on
Computer Vision and Pattern Recognition, vol 1, pp 222–228
66. Ferrari RJ, Rangayyan RM, Desautels JL, Frere AF (2001)
Analysis of asymmetry in mammograms via directional filtering
with Gabor wavelets. IEEE Trans Med Imaging 20:953–964
67. Fischer S, Sroubek F, Perrinet L, Redondo R, Cristobal G (2007)
Self-invertible 2D log-Gabor wavelets. Int J Comput Vis
75:231–246
68. Brodatz P (1966) Textures: a photographic album for artists and
designers. Dover, New York