
Edge detection with large depth of focus using differential Haar–Gaussian wavelet transform

Ling Fan a,d, Feijun Song b, Suganda Jutamulia c,*

a Beijing University of Posts and Telecommunications, School of Electronic Engineering, Beijing 100876, China
b China Daheng Group, Inc., P.O. Box 8618, A9 Shangdi Xinxilu, Haidian District, Beijing 100085, China

c University of Northern California, 1304 Southpoint Boulevard, Suite 220, Petaluma, CA 94954, USA
d Beijing Jiaotong University, School of Sciences, Beijing 100044, China

Received 1 December 2005; received in revised form 18 August 2006; accepted 8 September 2006

Abstract

We propose a new differential Haar–Gaussian (DHG) wavelet transform together with the bandwidth matching algorithm to perform edge detection, which can be processed with fast computation in both spatial and frequency domains. The telecentric optics is used to produce high-precision edge detection with large depth of focus.
© 2006 Elsevier B.V. All rights reserved.

Keywords: Wavelet transform; Edge detection; Automatic inspection; Large depth of focus; Telecentric optics

1. Introduction

Images formed naturally or artificially always contain some distinctive regions. Physical parameters such as brightness, illuminance, or chromaticity change slowly inside a region, but they change rapidly between regions, creating edges. An edge is the transition between regions and often carries useful physical information. In image processing, edge detection forms the basis of many applications such as automatic inspection, target recognition, robotic vision, biomedical imaging, and electro-optical measurement [1–5].

In a measurement system based on optical imaging, because objects are three-dimensional, the parts to be measured may lie at different distances from the imaging system. This generates two problems. First, different magnifications for objects at different distances can cause measurement error. Second, an imaging system has a limited depth of focus. Within the depth of focus, the image is clear enough to inspect easily, but beyond the depth of focus the edges in the image become blurred, which is known as the defocus effect. When we measure a large object, the aforementioned two problems become serious. Additionally, superposed noise makes the result worse.

A telecentric objective lens [6] can be used to solve the first problem, since it provides a uniform magnification for objects at different locations. Edge detection is commonly performed using a digital differential method. If the image is clear, this method can produce a good result. However, for a noisy object larger than the depth of focus, the image will be fuzzy and the edge detected by the digital differential method may contain errors. To overcome this second problem, related to the depth of focus, wavelet transforms have recently been applied to noisy edge detection. The results are more accurate and adaptive than those of the regular digital differential method [2,4,5,7–9].

In this paper we propose a new wavelet transform, the differential Haar–Gaussian (DHG) wavelet transform. This new wavelet transform is applied to edge detection and can be processed in both the spatial and frequency domains. Furthermore, the bandwidth matching algorithm and telecentric optics are used to achieve fast computation and to produce high-precision edge detection with large depth of focus.


2. Mathematical model of edge function

The ideal image of a straight edge coinciding with the y-axis is described by

$$\eta(x) = a\left[1 - 2\int_{-\infty}^{x}\delta(\xi)\,\mathrm{d}\xi\right] = -a\,\mathrm{sgn}(x) = \begin{cases} a, & x < 0,\\ -a, & x \geq 0, \end{cases} \qquad (1)$$

where sgn(x) is the sign function. Fig. 1(a) shows the ideal image of a straight edge. For simplicity, the ideal image of a straight edge η(x) is also called the geometric straight edge function.

The real image of an edge is a transient zone, not a geometric straight edge. Because of image aberration and the defocusing effect, the real image of the edge changes gradually. Again, for simplicity, the real image of the edge is also called the edge function f(x), shown in Fig. 1(b). Under incoherent illumination, the edge function f(x) can be described as the convolution of a point spread function g(x) and the geometric straight edge function η(x):

$$f(x) = g(x)\otimes\eta(x). \qquad (2)$$

The point spread function includes all factors that form the real image of the edge. Naturally, the point spread function is Gaussian (see Fig. 1(b)):

$$g(x,\sigma) = \exp\left[-\left(\frac{x}{\sigma}\right)^{2}\right]. \qquad (3)$$

The convolution turns the image of the straight edge into a slowly varying transient zone. For the sake of analysis, we define the equivalent edge width ΔS (see Fig. 1(b)) in the following derivation. Substitution of Eqs. (1) and (3) into Eq. (2) yields

$$f(x) = -\frac{a}{\sigma\sqrt{\pi}}\int_{-\infty}^{\infty} g(\xi,\sigma)\,\mathrm{sgn}(x-\xi)\,\mathrm{d}\xi = -\frac{a}{\sigma\sqrt{\pi}}\int_{-\infty}^{\infty}\exp\left[-\left(\frac{\xi}{\sigma}\right)^{2}\right]\mathrm{sgn}(x-\xi)\,\mathrm{d}\xi$$
$$= -\frac{a}{\sigma\sqrt{\pi}}\int_{-x}^{x}\exp\left[-\left(\frac{\xi}{\sigma}\right)^{2}\right]\mathrm{d}\xi = -a\,\mathrm{erf}\left(\frac{x}{\sigma}\right), \qquad (4)$$

Fig. 1. (a) Ideal image of a straight edge η(x). (b) Edge function f(x) with transient zone, point spread function g(x,σ), and equivalent edge width ΔS.

where erf(x) is the error function. The normalization ensures that f(x) converges to its asymptotic values ±a as x → ±∞. The slope of the edge at x = 0 follows from the derivative of Eq. (4):

$$k_0 = k|_{x=0} = -\left.\frac{\mathrm{d}f(x)}{\mathrm{d}x}\right|_{x=0} = \left.\frac{a}{\sigma\sqrt{\pi}}\int_{-\infty}^{\infty} g(\xi,\sigma)\,\frac{\partial\left[\mathrm{sgn}(x-\xi)\right]}{\partial x}\,\mathrm{d}\xi\right|_{x=0}$$
$$= \left.\frac{2a}{\sigma\sqrt{\pi}}\int_{-\infty}^{\infty} g(\xi,\sigma)\,\delta(x-\xi)\,\mathrm{d}\xi\right|_{x=0} = \frac{2a}{\sigma\sqrt{\pi}}. \qquad (5)$$

The equivalent edge width is

$$\Delta S = \frac{2a}{k_0} = \sigma\sqrt{\pi} \approx 1.77\,\sigma, \qquad (6)$$

which is of course determined by the point spread function g(x,σ). The edge function f(x), the point spread function g(x,σ), and the equivalent edge width ΔS are shown in Fig. 1(b).

From the properties of the Fourier transform [10], the frequency bandwidth of the edge function is

$$\Delta W = \frac{1}{\Delta S} = \frac{1}{\sigma\sqrt{\pi}} = \frac{k_0}{2a}, \qquad (7)$$

which expresses an uncertainty relationship. A sharper edge gives a smaller ΔS and a broader ΔW, which means that more high-frequency components are included in f(x). Conversely, the smoother the edge, the narrower its frequency bandwidth. In practical edge detection, different object structures (i.e., different edge locations) and different imaging optics result in different widths for the same edge. In other words, different point spread functions g(x,σ) produce different widths ΔS. Additionally, the image may be noisy. This further prevents regular digital differentiation methods from providing high-precision measurements.
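As a quick numerical illustration of Eqs. (4)–(7), the minimal sketch below builds the blurred-edge model for assumed values a = 1 and σ = 2 (arbitrary units) and checks the slope k0, the equivalent edge width ΔS, and the bandwidth ΔW; the variable names, grid, and parameter values are illustrative assumptions, not taken from the paper, and SciPy's error function is used for erf.

```python
# Numerical sketch of the blurred-edge model of Section 2 (assumed a = 1, sigma = 2).
import numpy as np
from scipy.special import erf

a, sigma = 1.0, 2.0
x = np.linspace(-20.0, 20.0, 4001)
f = -a * erf(x / sigma)                        # edge function, Eq. (4)

# Edge slope magnitude at x = 0; theory: k0 = 2a/(sigma*sqrt(pi)), Eq. (5)
k0_num = -np.gradient(f, x)[np.argmin(np.abs(x))]
k0_theory = 2.0 * a / (sigma * np.sqrt(np.pi))

dS = 2.0 * a / k0_num                          # equivalent edge width, Eq. (6): sigma*sqrt(pi)
dW = 1.0 / dS                                  # edge bandwidth, Eq. (7): uncertainty product dS*dW = 1

print(f"k0: numeric {k0_num:.4f}, theory {k0_theory:.4f}")
print(f"Delta S = {dS:.3f} (sigma*sqrt(pi) = {sigma*np.sqrt(np.pi):.3f}), Delta W = {dW:.4f}")
```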

3. Differential Haar–Gaussian (DHG) wavelet transform

The proposed differential Haar–Gaussian (DHG) wavelet is defined as

$$h_s(x) = \frac{2}{s}\left(\frac{x}{s}-q\right)\exp\left[-\left(\frac{x}{s}-q\right)^{2}\right] - \frac{2}{s}\left(\frac{x}{s}+q\right)\exp\left[-\left(\frac{x}{s}+q\right)^{2}\right], \qquad (8)$$

where s is a positive scaling parameter and q is a separating parameter. Two examples of the DHG wavelet, for q = 5s and q = 2.5s, are shown schematically in Fig. 2(a) and (b), respectively.
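The following minimal sketch implements Eq. (8) as written above, with the two lobes of the wavelet lying near x = ±sq for well-separated peaks; the function name and sample parameter values are assumptions made for illustration only.

```python
# Minimal sketch of the DHG wavelet of Eq. (8).
import numpy as np

def dhg_wavelet(x, s, q):
    """Differential Haar-Gaussian wavelet h_s(x); for q >> 1 its two lobes sit near x = +/- s*q."""
    u = x / s - q
    v = x / s + q
    return (2.0 / s) * (u * np.exp(-u**2) - v * np.exp(-v**2))

x = np.linspace(-10.0, 10.0, 2001)
h_double = dhg_wavelet(x, s=1.0, q=2.5)   # well-separated double peaks, cf. Fig. 2(b)
h_single = dhg_wavelet(x, s=1.0, q=1.0)   # q = s: single central lobe with two side lobes, cf. Fig. 3(a)
```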

It appears that when q approaches s, the two peaks move closer together. Finally, when q = s, the double-peak curve changes to a single-peak curve, as shown in Fig. 3(a), similar to the Mexican hat wavelet [11]. The double-peak curve is useful for detecting double edges, which will be explained in more detail in Section 5. We will first discuss detection using the single-peak curve (q = s) shown in Fig. 3(a) for detecting a single edge.


Fig. 2. Schematic curves of the differential Haar–Gaussian (DHG) wavelet: (a) q = 5s and (b) q = 2.5s.

Fig. 3. (a) DHG wavelet h_s(x) for q = s. (b) Spectrum of (a), H_h(ν).


Although the DHG wavelet for q = s is not exactly the same as the Mexican hat function [11], both functions share common features (a single peak with two side lobes), as shown in Fig. 3(a).

The wavelet transform of a signal g(x) is defined as follows:

$$W\{g(x)\} = \int_{-\infty}^{\infty} h_s^{*}\!\left(\frac{\xi-x}{s}\right)g(\xi)\,\mathrm{d}\xi. \qquad (9)$$

In the frequency domain, the wavelet transform is described by

$$W\{g(x)\} = \int_{-\infty}^{\infty} H(\nu)\,G(\nu)\,\exp(\mathrm{i}2\pi\nu x)\,\mathrm{d}\nu, \qquad (10)$$

where H(ν) and G(ν) are the Fourier transforms of the wavelet h_s(x) and the signal g(x), respectively. The wavelet has a finite spread in both the spatial and frequency domains (see Fig. 3). Eq. (10) shows that the wavelet transform works like a filter; its nonzero region becomes the "frequency window" of the wavelet.
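Because Eqs. (9) and (10) describe the same operation, the transform can be evaluated either by direct convolution or by multiplying spectra. The sketch below checks this numerically for an assumed blurred edge (a = 1, σ = 2) and an assumed wavelet setting s = q = 1, using zero-padded FFTs; all names and values are illustrative, not the authors' implementation.

```python
# Sketch: the DHG wavelet transform evaluated in the spatial domain (Eq. (9))
# and in the frequency domain (Eq. (10)); the two results agree to round-off.
import numpy as np
from scipy.special import erf

dx = 0.05
x = np.arange(-30.0, 30.0, dx)
f = -erf(x / 2.0)                                 # blurred edge, assumed a = 1, sigma = 2
s, q = 1.0, 1.0
u, v = x / s - q, x / s + q
h = (2.0 / s) * (u * np.exp(-u**2) - v * np.exp(-v**2))   # DHG wavelet, Eq. (8)

# Spatial domain: full discrete convolution
W_spatial = np.convolve(f, h) * dx

# Frequency domain: multiply zero-padded spectra and transform back (linear convolution)
n = 1
while n < f.size + h.size - 1:
    n *= 2
W_freq = np.fft.irfft(np.fft.rfft(f, n) * np.fft.rfft(h, n), n)[: f.size + h.size - 1] * dx

print("max |spatial - frequency| =", np.max(np.abs(W_spatial - W_freq)))
```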

The width of the "spatial window" of the DHG wavelet is

$$\Delta S_h = 2s\left[\frac{\left(h(x),\,x^{2}h(x)\right)}{\left(h(x),\,h(x)\right)}\right]^{1/2} = 2s\left[\frac{\int_{-\infty}^{\infty}x^{2}\left[h(x)\right]^{2}\mathrm{d}x}{\int_{-\infty}^{\infty}\left[h(x)\right]^{2}\mathrm{d}x}\right]^{1/2}, \qquad (11)$$

where (f, g) denotes the inner product of f and g. Substituting Eq. (8) into Eq. (11), we have

$$\Delta S_h = 2sq\left(\frac{\dfrac{3}{4q^{2}}\left(1-\mathrm{e}^{-2q^{2}}\right)+1+\mathrm{e}^{-2q^{2}}}{1-\left(1-4q^{2}\right)\mathrm{e}^{-2q^{2}}}\right)^{1/2}; \qquad (12)$$

when q ≫ 1, we get approximately

$$\Delta S_h = 2sq. \qquad (13)$$

The Fourier transform of the DHG wavelet given by Eq. (8) is

$$H_h(\nu) = -s\nu\,\sin(2\pi sq\nu)\,\mathrm{e}^{-(\pi s\nu)^{2}}. \qquad (14)$$

The DHG wavelet for q = s and its Fourier transform are shown in Fig. 3(a) and (b), respectively. The first maximum of H_h(ν) in Fig. 3(b) is approximately at

$$\nu_c = 1/(4sq). \qquad (15)$$

We can consider ν_c as the central frequency of the DHG Fourier spectrum. The bandwidth of the frequency window, ΔW_h, is

$$\Delta W_h = 1/(2sq), \qquad (16)$$

since

$$\Delta S_h\,\Delta W_h = 1. \qquad (17)$$

Because it is a convolution, the wavelet transform is a smoothing procedure. From the uncertainty relationship, the smaller ΔS_h is, the wider the frequency window ΔW_h is, and correspondingly the more frequency components are collected. On the other hand, because the bandwidth of noise (especially white noise) is very broad, more noise is picked up when the frequency window broadens. This increases the measurement error. Therefore, we must select a proper bandwidth for a specific detection. The ratio Q of the central frequency to the frequency-window bandwidth is a parameter concerning measurement precision:

Fig. 4. DHG wavelet transforms of edge function f(x).


$$Q = \nu_c/\Delta W_h = 1/2. \qquad (18)$$

This ratio is independent of the central frequency: the frequency window broadens as the central frequency increases. Since the measurement precision is proportional to Q, the wavelet transform produces the same precision at different spatial frequencies.
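A short numerical check of Eqs. (13), (16) and (17), under the wavelet form of Eq. (8) as written above with assumed values s = 1 and q = 5 (the q ≫ 1 regime); the second moment is taken of the scaled wavelet, and the grid and parameter choices are illustrative.

```python
# Sketch: second-moment width of the DHG wavelet versus Eqs. (13), (16) and (17).
import numpy as np

s, q = 1.0, 5.0
dx = 1.0e-3 * s
x = np.arange(-20.0 * s, 20.0 * s + dx, dx)
u, v = x / s - q, x / s + q
h = (2.0 / s) * (u * np.exp(-u**2) - v * np.exp(-v**2))   # DHG wavelet, Eq. (8)

# Spatial window from the second moment of the scaled wavelet; roughly 2sq for q >> 1, Eq. (13)
dS_h = 2.0 * np.sqrt(np.sum(x**2 * h**2) / np.sum(h**2))
dW_h = 1.0 / dS_h                                          # frequency window via Eq. (17)

print(f"Delta S_h = {dS_h:.3f}  (2sq = {2*s*q:.1f})")
print(f"Delta W_h = {dW_h:.4f}  (1/(2sq) = {1/(2*s*q):.4f})")
```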

However, the wavelet transform method has a serious drawback. The transform procedure includes both the convolution over the shift parameter ξ in the spatial domain and the filtering over the dilation s in the frequency domain. The digital computation is too large for real-time measurement. From the above analysis, we can consider the wavelet transform a filter in the frequency domain. An edge, as a part of an image, has limited frequency components. The width of the filter can be selected such that it is a little broader than the bandwidth of the edge's spectrum. Thus, the primary frequency components of the signal are fully collected, while the unnecessary components and noise are filtered out. Consequently, the S/N is increased and the error is significantly reduced.

From the uncertainty principle given by Eq. (7), the bandwidth of the edge's spectrum ΔW is directly proportional to the slope of the edge k0. Therefore, we can estimate the value of ΔW by measuring k0. A proper selection of the bandwidth of the wavelet frequency window ΔW_h can both ensure the measurement precision and greatly reduce the computing time. This selection method is called the bandwidth matching algorithm. The ratio of ΔW_h to ΔW is called the matching parameter. From Eqs. (7) and (16), we have

$$\beta = \Delta W_h/\Delta W = \Delta S/\Delta S_h = \frac{a}{sqk_0} = \frac{\sqrt{\pi}\,\sigma}{2sq}. \qquad (19)$$

In other words, according to the bandwidth matching algorithm, the product sq of the wavelet is determined by the bandwidth of the edge's spectrum ΔW. After applying Eq. (19), the DHG wavelet transform of the edge function, computed with Eq. (14), is

$$W\{f(x)\} = \int_{-\infty}^{\infty} H_h(\nu)F(\nu)\,\mathrm{e}^{\mathrm{i}2\pi\nu x}\,\mathrm{d}\nu = \frac{2as}{\sqrt{s^{2}+\sigma^{2}}}\left[\mathrm{e}^{-\left(\frac{x-x_0}{\sigma l}\right)^{2}} - \mathrm{e}^{-\left(\frac{x+x_0}{\sigma l}\right)^{2}}\right],$$
$$l = \sqrt{1+\frac{\pi}{4\beta^{2}q^{2}}}, \qquad x_0 = \frac{\sqrt{\pi}\,\sigma}{2\beta}. \qquad (20)$$


Fig. 4 shows Eq. (20) for four cases: β = 0.5, 1, 2, and 10. It is important to note that the zero point marks the position of the geometric edge (see Fig. 1(a)), i.e., the center point of the edge function f(x) (see Fig. 1(b)).
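The sketch below strings Eqs. (5), (7), (19) and (20) together: from an assumed edge amplitude and width it sets the product sq for a chosen matching parameter β, evaluates W{f(x)}, and recovers the geometric edge from the zero crossing. The values a = 1 and σ = 2, and the split s = σ of the product sq (the paper fixes only the product), are assumptions; β = 3 matches the setting used in Section 4.

```python
# Sketch: bandwidth matching (Eq. (19)) and the analytic DHG transform of an edge (Eq. (20)).
import numpy as np

a, sigma, beta = 1.0, 2.0, 3.0                 # beta = 3 is the matching parameter used in Section 4
k0 = 2.0 * a / (sigma * np.sqrt(np.pi))        # edge slope, Eq. (5)
sq = a / (beta * k0)                           # product s*q from beta = Delta W_h / Delta W, Eq. (19)
s = sigma                                      # one possible split of s*q (assumption)
q = sq / s

l = np.sqrt(1.0 + np.pi / (4.0 * beta**2 * q**2))
x0 = np.sqrt(np.pi) * sigma / (2.0 * beta)
assert np.isclose(sigma * l, np.sqrt(s**2 + sigma**2))   # identity implied by Eqs. (19)-(20)

x = np.linspace(-15.0, 15.0, 3001)
W = (2.0 * a * s / np.sqrt(s**2 + sigma**2)) * (
    np.exp(-((x - x0) / (sigma * l)) ** 2) - np.exp(-((x + x0) / (sigma * l)) ** 2)
)

# The zero crossing of W marks the geometric edge (located at x = 0 in this model).
i = np.where(np.diff(np.sign(W)) != 0)[0][0]
x_edge = x[i] - W[i] * (x[i + 1] - x[i]) / (W[i + 1] - W[i])
print(f"zero crossing at x = {x_edge:.4f}")
```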

4. Experimental results

The experimental system is shown in Fig. 5. Lens L1 and stop S form a telecentric measurement system. Stop S is at the back focal plane of lens L1, so the primary ray through the center of the stop is parallel to the optical axis after passing lens L2. A series of objects (holes) O1, O2, …, On are imaged onto a CCD. Among these objects, O1 is conjugate to the CCD through the optical system, and thus its image on the CCD is sharp. O2, …, On lie at different defocused distances D, and their images on the CCD are blurred. Since the primary ray is parallel to the optical axis, the magnification for defocused objects is constant. This is the principle of a telecentric system [6].

In our experimental system the focal length of the front lens (L1) is 500 mm. A hole placed at various defocus distances (|D| up to 180 mm) was tested. The basic measurement is to identify the edge of the hole from the grey scale curve (which contains only positive values) of every line collected by the computer-imaging system. Fig. 6 shows the detected grey scale curve of an edge (left column, dotted line), which is approximately f(x) + B ≥ 0, where B stands for a bias or background level.

Fig. 5. Schematic diagram of the experimental setup (lens L1, stop S, lens L2, CCD, and computer system). D is the defocused distance.

Table 1
Measured hole diameters at different defocused distances (correct value = 29.98 mm)

Defocus distance D (mm)   Measured hole diameter (mm)   Error (mm)
−180                      29.957                        −0.023
−140                      29.986                         0.006
−120                      29.987                         0.007
−80                       29.992                         0.012
−40                       30.000                         0.020
−20                       29.991                         0.011
0                         29.973                        −0.007
10                        29.953                        −0.027
20                        29.935                        −0.045
50                        29.912                        −0.068


Fig. 6 also shows the corresponding DHG wavelet transform W{f(x) + B} (left column, solid line). Since the DHG wavelet transform is a differential operation, W{f(x) + B} = W{f(x)}. The spectra of the grey scale curves of the edge and of their wavelet transforms are shown in the right column (dotted and solid lines, respectively). The frequency bandwidth of the wavelet is three times the corresponding edge bandwidth (β = 3).

To obtain the data shown in Fig. 6, we first transform the grey scale curve f(x) + B using a DHG wavelet with an arbitrary bandwidth to obtain the approximate center point (where the slope is maximum) of the edge curve (see Fig. 4). After calculating the slope k0 at that point and setting β, we select the bandwidth of the wavelet following Eq. (19). The grey scale curve is then wavelet transformed again with the selected wavelet bandwidth. The zero point now indicates the precise location of the edge.
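To make the two-pass procedure concrete, here is a self-contained sketch that applies it to a synthetic noisy grey scale line. The signal model, noise level, window sizes, and the split of the product sq are all illustrative assumptions; this is not the authors' code.

```python
# Sketch of the two-pass edge location described above, on synthetic data.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)

def dhg(t, s, q):
    u, v = t / s - q, t / s + q
    return (2.0 / s) * (u * np.exp(-u**2) - v * np.exp(-v**2))

def dhg_transform(signal, dx, s, q):
    half = int(np.ceil(8.0 * s * max(q, 1.0) / dx))
    kernel = dhg(np.arange(-half, half + 1) * dx, s, q)
    padded = np.pad(signal, half, mode="edge")        # suppress spurious boundary responses
    return np.convolve(padded, kernel, mode="same")[half:-half] * dx

def zero_crossing(W, x):
    """Zero crossing of W between its two extreme lobes (linear interpolation)."""
    i_lo, i_hi = sorted((int(np.argmax(W)), int(np.argmin(W))))
    j = i_lo + int(np.argmin(np.abs(W[i_lo:i_hi + 1])))
    j = max(i_lo, min(j, i_hi - 1))
    return x[j] - W[j] * (x[j + 1] - x[j]) / (W[j + 1] - W[j])

# Synthetic grey scale line: blurred edge at x_true plus bias B and noise (assumed values)
a, sigma, B, x_true = 1.0, 2.0, 2.0, 37.3
dx = 0.1
x = np.arange(0.0, 100.0, dx)
grey = B - a * erf((x - x_true) / sigma) + 0.02 * rng.standard_normal(x.size)

# Pass 1: arbitrary wavelet bandwidth, rough edge location from the zero crossing
W1 = dhg_transform(grey, dx, s=1.0, q=1.0)
i0 = int(round((zero_crossing(W1, x) - x[0]) / dx))

# Estimate the slope k0 and the amplitude near the rough location, then match the bandwidth
win = 10                                              # +/- 1 unit fitting window (assumption)
k0 = abs(np.polyfit(x[i0 - win:i0 + win + 1], grey[i0 - win:i0 + win + 1], 1)[0])
a_est = 0.5 * abs(grey[:50].mean() - grey[-50:].mean())
beta = 3.0
sq = a_est / (beta * k0)                              # Eq. (19)

# Pass 2: matched wavelet bandwidth; the zero crossing gives the precise edge location
W2 = dhg_transform(grey, dx, s=sq, q=1.0)             # split s*q with q = 1 (assumption)
print(f"edge located at x = {zero_crossing(W2, x):.2f}  (true edge at x = {x_true})")
```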

Fig. 6. Grey scale curves of edge and their DHG wavelet transforms at various defocused distances: (a) and (b) defocused distance D = 0, (c) and (d) D = −80 mm, (e) and (f) D = −180 mm. Left column: grey scale curves of edge (dotted line) and their DHG wavelet transforms (solid line). Right column: spectra of grey scale curves of edge (dotted line) and spectra of the DHG wavelet transforms of the left column (solid line). The vertical broken line in the right column is the central frequency of the DHG wavelet transform.

Table 1 shows the experimental results for the detected diameters of the hole at different defocus distances. Note that the hole diameter is the distance between two edges.


Fig. 7. Schematic diagram of setup for detecting a chamfer containing two edges.


The results indicate that, because the wavelet frequency bandwidth follows the bandwidth of the edge function, we obtain comparable precision under different defocus conditions. The maximum error is 0.22% (0.068 mm relative to 29.98 mm) and the mean error is only 0.075% (mean absolute error of 0.0226 mm). The fact that correct results can be obtained at large defocus distances means that it is possible to detect the edge of a large object with high precision.

5. Discussion

The single-edge detection was performed by applying the proposed differential Haar–Gaussian (DHG) wavelet transform in the spatial and frequency domains. Since the selected DHG wavelet (q = s) is similar to the Mexican hat wavelet [11], it is anticipated that a Mexican hat function can also be applied following the same procedure.

However, for detecting the double edges of a chamfer, as shown in Fig. 7, the proposed DHG wavelet transform with double peaks (see Fig. 2(a)) can detect the chamfer in one step by setting the distance between the two peaks approximately equal to the width of the chamfer.


Fig. 8. Chamfer intensity signal and a DHG wavelet with q = Δ/2 and s = Δ, where Δ is the width of the chamfer (distance between the two edges).

The simulated chamfer intensity and the appropriate DHG wavelet are shown in Fig. 8. The intensity function of the chamfer is bright outside the chamfer (light side), medium at the chamfer, and darker outside the chamfer (shadow side). The study of chamfer detection will be reported in the future.
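For completeness, the sketch below only reproduces the configuration of Fig. 8 on a grid: a simulated chamfer intensity profile (bright / medium / dark, as described above) and a double-peak DHG wavelet whose peak separation is set to the chamfer width Δ. The intensity levels, the value of Δ, and the mapping of the caption's q and s onto the wavelet parameters are assumptions; the detection procedure itself is left to the authors' future work.

```python
# Sketch of the Fig. 8 configuration: simulated chamfer intensity and a matched
# double-peak DHG wavelet (peak separation ~ chamfer width Delta). Illustrative only.
import numpy as np

Delta = 4.0                                   # chamfer width (assumed, arbitrary units)
x = np.linspace(-15.0, 15.0, 3001)

# Chamfer intensity: bright on the light side, medium on the chamfer, dark in the shadow
intensity = np.where(x < -Delta / 2, 1.0, np.where(x <= Delta / 2, 0.6, 0.2))

# Double-peak DHG wavelet with its two lobes near +/- Delta/2 (one reading of the caption)
s = 1.0                                       # assumed scale
q = (Delta / 2.0) / s                         # places the lobes near x = +/- s*q = +/- Delta/2
u, v = x / s - q, x / s + q
wavelet = (2.0 / s) * (u * np.exp(-u**2) - v * np.exp(-v**2))
```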

6. Concluding remarks

A mathematical model of the blurred edge was presented. We proposed a bandwidth matching algorithm such that the wavelet frequency bandwidth follows the edge frequency bandwidth, providing high precision and fast computation with a large depth of focus. Telecentric optics was also used to provide a uniform magnification for objects at different defocused distances. The proposed DHG wavelet transform, the bandwidth matching algorithm, and the telecentric optics prove to be powerful tools for edge detection with a large depth of focus, producing a very small error of 0.22% or less within a 180 mm defocus range.

References

[1] Q. Xu, G. Liao, Solid rocket near-field noise in static experiment: wavelet analysis, in: H.H. Szu, J.R. Buss (Eds.), Wavelet and Independent Component Analysis Applications IX, Proc. SPIE, vol. 4738, 2002, p. 448.
[2] Y. Lee, S. Kozaitis, Opt. Eng. 39 (2000) 2405.
[3] Z. Shi, D. Zhang, H. Wang, K. Haixiang, D.J. Kouri, D.K. Hoffman, Biomedical signal processing using a new class of wavelets, in: H.H. Szu, et al. (Eds.), Wavelet Applications VII, Proc. SPIE, vol. 4056, 2000, p. 450.
[4] Y. Lee, S.P. Kozaitis, Improved wavelet-based multiresolution edge detection in noisy images, in: K.P. Stephen, D.J. Richard (Eds.), Visual Information Processing VIII, Proc. SPIE, vol. 3716, 1999, p. 185.
[5] A. Talukder, D.P. Casasent, Multiscale Gabor wavelet fusion for edge detection in microscopy images, in: H.H. Szu (Ed.), Wavelet Applications V, Proc. SPIE, vol. 3391, 1998, p. 336.
[6] W.B. Whetherell, Afocal system, in: M. Bass, E.W. Van Stryland, D.R. Williams, W.L. Wolfe (Eds.), Handbook of Optics, vol. II, McGraw-Hill, New York, 1995.
[7] F. Song, L. Yu, S. Jutamulia, Opt. Eng. 39 (2000) 1190.
[8] F. Song, S. Jutamulia, Opt. Eng. 41 (2002) 50.
[9] F. Song, Acta Opt. Sin. 6 (1986) 137.
[10] F. Song, Acta Opt. Sin. 12 (2003) 3055.
[11] D. Marr, E. Hildreth, Proc. R. Soc. London B 207 (1980) 187.