
International Journal of Automation and Computing 10(6), December 2013, 545-551

DOI: 10.1007/s11633-013-0752-7

Realization of an Automated Microassembly Task Involving Micro Adhesive Bonding

Fu-Dong Li, De Xu, Zheng-Tao Zhang, Ya-Li Shi
Research Center of Precision Sensing and Control, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China

Abstract: An automated approach is proposed for a microassembly task, which is to insert a 10 µm diameter glass tube into a 12 µm diameter hole in a silicon substrate, and bond them together with ultraviolet (UV) curable adhesive. Two three-degree-of-freedom micromanipulators are used to move the glass tube and the dispensing needle, respectively. Visual feedback is provided by an optical microscope. The angle of the microscope axis is precisely calibrated using an autofocus strategy. A robust image segmentation method and a feature extraction algorithm are developed to obtain the features of the hole, the glass tube and the dispensing needle. Visual servo control is employed to achieve accurate alignment of the tube and the hole. Automated adhesive dispensing is used to bond the glass tube and the silicon substrate together after the insertion. On-line monitoring ensures that the diameter of the adhesive spot is within a desired range. Experimental results demonstrate the effectiveness of the proposed strategy.

Keywords: Micro adhesive bonding, microassembly, visual servo control, image segmentation, feature extraction.

1 Introduction

Conventional micro-electro-mechanical systems (MEMS) fabrication technologies, such as bulk micromachining, surface micromachining, lithographie, galvanoformung, abformung (LIGA), and deep reactive ion etching, are widely used in producing various kinds of simple-function MEMS. However, as MEMS are becoming more and more functional, these technologies are insufficient for producing MEMS with multiple functionalities. Integration of MEMS devices fabricated with different technologies and different materials is required to build a multi-functional hybrid MEMS. Microassembly is the process of manipulating, moving and bonding micro components together to form functional hybrid MEMS, and it is gaining much attention because of its essential role in constructing hybrid MEMS[1-4]. However, manual microassembly is laborious, time consuming, and has a low success rate, so the automation of microassembly is indispensable for the eventual success of hybrid MEMS.

To realize the integration of the micro components into a complete hybrid MEMS, reliable micro bonding technology is of great importance. There are various kinds of micro bonding technologies, such as micro fasteners[5], micro adhesive dispensing[6-8], soldering[9], acoustic bonding[10], etc. The micro fastener has been proven to be an effective way of bonding different micro devices together, yet it has the disadvantages of complicating the micro parts' structure and compromising their strength. Micro soldering and diffusion are mainly used in micro circuit assembly, and they are not suitable for hybrid MEMS assembly involving soft materials. Micro adhesive dispensing is of high potential, and is capable of firmly bonding almost any two kinds of materials with the right type of adhesive.

Setting aside the issues of the adhesive's properties, there are two main challenges facing micro adhesive dispensing. Firstly, the accurate alignment of the parts to be bonded is very critical: unlike the micro fastener approach, which is self-aligned by its mechanical structures after bonding, misalignment of the micro parts may aggravate the internal stress in the bonded structure after curing of the adhesive[11], which may eventually affect the stability of the whole hybrid MEMS. Secondly, accurate positioning of the dispensed adhesive spot and precise control of the dispensed amount are of great importance for the bonding task. Too little adhesive cannot ensure the strength of the bond, while too much adhesive may affect the uniformity of the structure or even damage the surrounding environment.

Manuscript received March 21, 2013; revised July 18, 2013. This work is supported by the National Natural Science Foundation of China (Nos. 61227804 and 61105036).

In this paper, we present an automated approach to perform the insertion of a 10 μm diameter glass tube into a 12 μm diameter hole in a silicon substrate and bond them together with ultraviolet (UV) curable adhesive. The insertion process of the glass tube into the hole on the silicon substrate is split into two steps: 1) The glass tube tip is aligned with the hole using a visual servo control strategy. 2) The glass tube is inserted into the hole, for which an open-loop control method is employed because the glass tube image will be blended with the hole image. Vision-based on-line monitoring is used to make sure that the exact amount of adhesive is dispensed to the juncture of the two micro parts.

The rest of the paper is organized as follows: Section 2 gives a brief introduction to the system setup. In Section 3, the calibration of the microscope axis angle is introduced. Section 4 addresses the details of the automated microassembly process, including the insertion process of the glass tube and the adhesive dispensing process. Section 5 presents the experimental results. Finally, the paper is concluded in Section 6.

2 System setup

The automated microassembly system is composed of an insertion unit, a micro dispensing unit, a vision unit and a host computer to control the whole process. The system setup is shown in Fig. 1.

The insertion unit consists of a six-degrees-of-freedom (DOF) manual platform, on which the silicon substrate with a hole at its center is placed, and a 3-DOF manipulator, on which the glass tube is mounted through a vertically curved metal tube. The 6-DOF manual platform is pre-adjusted such that the silicon substrate placed on it faces upwards.

Fig. 1 System setup

The dispensing unit is formed by a microinjector and a 3-DOF manipulator. The microinjector applies pressure to the dispensing needle to dispense the adhesive contained in the needle. The 3-DOF manipulator controls the motion of the dispensing needle, and it is of the same type as the one used in the insertion unit.

The vision unit consists of a microscope camera and a moving platform with a support mechanism. The microscope camera is formed by a microscope lens and a CCD camera. The moving platform, which has one translational DOF, is employed to move the microscope camera in order to adjust its distance to the object being viewed. It is necessary for the microscope camera to be mounted pointing at the hole in an inclined direction in order to view the hole, the tube tip and the dispensing needle at the same time. The insertion unit and the vision unit are mounted on a vibration isolation platform. The computer is used to capture images via the microscope camera and to control the movements of the insertion unit and the moving platform.

3 Microscope axis calibration

To leave enough space for the microassembly operation, the microscope is installed about 30° off the horizontal plane. The angle of the microscope axis is very important for the insertion of the glass tube into the hole, because the final insertion will be carried out in an open-loop manner. Therefore, it is necessary to calibrate the angle of the microscope axis instead of taking a mechanically constrained theoretical value. A simple but effective way to calibrate the angle of the microscope axis is developed here. Fig. 2 illustrates the calibration method.

At position 1, the glass tube tip is on the focal plane of the microscope. Then it is moved vertically downward to position 2, where the glass tube tip is out of the focal plane of the microscope. The glass tube tip is then refocused by moving it horizontally to position 3, and the horizontal distance between position 1 and position 3 is recorded as hd. The distance from position 1 to position 2 is a known distance vd. The angle of the microscope axis can be calculated as

θ = arctan(hd / vd)        (1)

Fig. 2 Illustration of the angle of the microscope axis and glass tube movement
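The calibration in (1) reduces to a single arctangent of the two measured distances. A minimal Python sketch follows; the function name and the example distances are hypothetical, chosen so the result matches the ~30° mounting described above.

```python
import math

def microscope_axis_angle(hd_um: float, vd_um: float) -> float:
    """Angle of the microscope axis off the horizontal plane, in degrees.

    hd_um: horizontal refocus distance between positions 1 and 3.
    vd_um: known vertical drop between positions 1 and 2.
    Implements Eq. (1): theta = arctan(hd / vd).
    """
    return math.degrees(math.atan2(hd_um, vd_um))

# Hypothetical measurement: a 100 um vertical drop needing a 57.3 um
# horizontal move to refocus yields an angle close to 29.8 degrees.
theta = microscope_axis_angle(57.3, 100.0)
```

Averaging this over repeated trials (as done with the 20 experiments in Section 5.2) would smooth out the autofocus uncertainty.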

4 Automated microassembly

The microassembly task is to insert a 10 μm diameter glass tube into a 12 μm diameter hole in a silicon substrate, and bond them together with adhesive. The flow chart of the general procedure is shown in Fig. 3.

Fig. 3 Flow chart of the general assembly procedure

Before the automated process begins, the glass tube, the hole and the dispensing needle must be manually brought into the field of view of the 120× microscope. This manual preparation is conducted with the help of a stereo microscope and a 12× microscope. Calibration of the 120× microscope is conducted in advance.

4.1 Image segmentation

Image segmentation is very critical in image processing[12,13]. The three-step image segmentation method used here is illustrated in Fig. 4.

In the first step, a manually selected gray threshold is used for binarization of the whole image. Because the hole image is much darker than the rest of the image, the gray threshold can lie within a certain range while still guaranteeing a good binarization result. In addition, with the prior knowledge that the hole sits in the bottom-right area of the image, the right and bottom edges of the hole area can be easily determined by scanning from the bottom and the right. After the right and bottom edges are determined, the scanning area is narrowed, and the top and left edges are determined by scanning in the narrowed area.

Fig. 4 Image segmentation illustration

In the second step, the glass tube is moved 2 μm to the right, and the needle is moved 2 μm downward. Then, background subtraction is applied. Binarization is also applied to the image after background subtraction, with a threshold chosen in advance. To determine the right edge of the needle area and the right and left edges of the glass tube area, the area above the hole is projected vertically to form a curve. From the projection curve, the three edges can be determined.

In the third step, after the needle and glass tube areas are separated, the two areas are projected horizontally to form their curves. The top and bottom edges of the needle area, and the bottom edge of the glass tube, can be determined from the projection curves.
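The projection-based edge finding used in the second and third steps can be sketched with NumPy. The helper below is an illustrative reimplementation, not the authors' code: it projects a binary region along one axis and reads off the first and last indices where the projection is non-trivial.

```python
import numpy as np

def edges_from_projection(binary: np.ndarray, axis: int, min_count: int = 1):
    """Project a 0/1 mask along `axis` and return the first and last index
    whose projection reaches `min_count`, i.e., the two edges of the object
    along the remaining axis.  Returns None if the region is empty."""
    profile = binary.sum(axis=axis)
    idx = np.flatnonzero(profile >= min_count)
    if idx.size == 0:
        return None
    return int(idx[0]), int(idx[-1])

# Toy mask: an object occupying rows 2..7 and columns 3..6.
mask = np.zeros((10, 12), dtype=np.uint8)
mask[2:8, 3:7] = 1
left, right = edges_from_projection(mask, axis=0)   # vertical projection
top, bottom = edges_from_projection(mask, axis=1)   # horizontal projection
```

A `min_count` above 1 would make the edge decision more robust to isolated noise pixels surviving the binarization.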

4.2 Feature extraction

Since the 3-DOF manipulator has a high resolution, the key factor for the success of the automated microassembly lies in the feature extraction of the micro components in the image.

The hole is circular, but because of the tilt angle of the microscope, its image is approximately an ellipse. Therefore, after the hole area is determined, edge points of the hole can be obtained by scanning the binarized hole area. Then an ellipse is fitted to determine the center of the hole, which is the image feature of the hole.

The region of interest (ROI) of the tube is first acquired in the image segmentation stage. In the following procedure, the tube ROI is tracked and updated. The image quality of the glass tube is poor because it is transparent and reflective, so the tube ROI is binarized to improve the robustness of the feature extraction. The binarization is based on the following observation: there are three gray levels in the glass tube area, namely the background, the dark part of the glass tube resulting from blocking some of the light, and the bright part of the glass tube resulting from reflecting the light. Both the dark and the bright parts are images of the glass tube, i.e., the foreground, as given in

g(i, j) = 0, if gh − gt < g(i, j) < gh + gt;  255, otherwise        (2)

where gt is a threshold to separate the foreground image from the background image, and its value is experimentally determined. The background gray value gh is obtained from the grayscale histogram of the tube ROI.

The gray value corresponding to the peak of the grayscale histogram is chosen as the gray value of the background. After binarization of the tube ROI, a median filter is applied to eliminate noise. Then edge points are obtained by scanning the binarized ROI. A line-fit method based on random sample consensus (RANSAC)[14,15] is used to find the two edges of the glass tube tip on the image plane. The angle bisector of the two fitted edge lines is chosen as the center line of the glass tube. The intersection of the center line and the bottom of the tube area is determined, and this is the image feature of the tip center of the glass tube.

Feature extraction for the dispensing needle is similar to that for the glass tube, and is omitted here.
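The RANSAC line fit mentioned above can be sketched with a simple two-point hypothesis model, in the spirit of [14, 15]. The paper does not give its implementation details, so the parameters and structure below are assumptions.

```python
import numpy as np

def ransac_line(points, n_iter=200, tol=1.0, rng=None):
    """Fit a line to 2-D edge points with RANSAC: repeatedly pick two
    points, count inliers within `tol` of the implied line, and keep the
    hypothesis with the most inliers.  Returns (a, b, c) for the line
    a*x + b*y + c = 0 with a unit normal (a, b)."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best, best_count = None, -1
    for _ in range(n_iter):
        i, j = rng.choice(len(pts), size=2, replace=False)
        dx, dy = pts[j] - pts[i]
        norm = np.hypot(dx, dy)
        if norm == 0.0:
            continue
        a, b = -dy / norm, dx / norm             # unit normal of the pair
        c = -(a * pts[i, 0] + b * pts[i, 1])
        count = int(np.sum(np.abs(pts @ np.array([a, b]) + c) < tol))
        if count > best_count:
            best, best_count = (a, b, c), count
    return best

# Toy edge points on the line y = 2x + 1, plus one gross outlier such as a
# stray reflection; RANSAC should ignore it.
pts = [(x, 2 * x + 1) for x in range(10)] + [(0.0, 100.0)]
a, b, c = ransac_line(pts, tol=0.5, rng=0)
```

A least-squares refit over the final inlier set would typically follow the consensus step; it is omitted here for brevity.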

4.3 Aligning and insertion control

The insertion process is split into two steps: the visual servo control aligning step and the open-loop insertion step.

The glass tube tip is first aligned with the position 120 pixels above the hole center using visual servo control:

Δxc = kxΔu,  Δyc = kyΔv        (3)

where (Δu, Δv) are the image errors between the hole center and the glass tube's tip center on the image plane, kx and ky are the pixel lengths calibrated in advance, and Δxc and Δyc are the position errors on the clear imaging plane.

For convenience in transforming the motion increments on the clear imaging plane to the 3D manipulator coordinates, the microscope camera is adjusted such that the x axis of the manipulator coordinates is parallel to the Δxc axis of the camera coordinates. Δxc and Δyc are calculated directly from the feature errors on the image plane, as given in (3). The axis angle of the microscope and the parallel constraint of the x and xc axes are used to convert Δxc and Δyc into the 3D manipulator coordinates, as given in

Δxm = Δxc
Δym = −Δyc sin θ        (4)
Δzm = −Δyc cos θ

where θ is the rotation angle around the x axis, and Δxm, Δym and Δzm are the position errors in the 3D Cartesian space.

A proportional-integral (PI) controller is employed to move the glass tube tip to the aligning position:

Δxa(n) = Kp[Δxm(n) − Δxm(n−1)] + KiΔxm(n)
Δya(n) = Kp[Δym(n) − Δym(n−1)] + KiΔym(n)        (5)
Δza(n) = Kp[Δzm(n) − Δzm(n−1)] + KiΔzm(n)

where Δxa, Δya and Δza are the adjusting position increments in the 3D Cartesian space, and Kp and Ki are the controller parameters.
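Equations (3)-(5) chain together into one servo-loop step. A minimal sketch follows; the pixel lengths are hypothetical placeholders, and the use of the 29.8° calibrated angle from Section 5.2 is an illustrative assumption.

```python
import numpy as np

KX, KY = 0.1, 0.1          # hypothetical pixel lengths (um per pixel)
THETA = np.deg2rad(29.8)   # calibrated microscope axis angle

def image_to_manipulator(du: float, dv: float) -> np.ndarray:
    """Eqs. (3)-(4): image-plane errors (pixels) -> clear-imaging-plane
    errors -> position errors in the 3-D manipulator coordinates."""
    dxc, dyc = KX * du, KY * dv                  # Eq. (3)
    return np.array([dxc,                        # Eq. (4)
                     -dyc * np.sin(THETA),
                     -dyc * np.cos(THETA)])

def pi_step(err: np.ndarray, prev_err: np.ndarray,
            kp: float = 0.1, ki: float = 0.5) -> np.ndarray:
    """Eq. (5): incremental PI law giving the adjusting position increment
    from the current and previous 3-D position errors."""
    return kp * (err - prev_err) + ki * err

# One loop iteration with a hypothetical image error of (10, -20) pixels.
err = image_to_manipulator(10, -20)
step = pi_step(err, np.zeros(3))
```

Each iteration the manipulator is commanded by `step`, a new frame is grabbed, and the loop repeats until the image error falls below a stopping tolerance.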

The first step is finished once the tube tip has been moved to the aligning position. Then the second step begins. The tube tip is focused again to make sure that it is right on the clear imaging plane before the final insertion. Then it is relocated and moved downwards by a given distance to insert it into the hole. The process of aligning the dispensing needle with its designated position is similar to the glass tube aligning process, and the details are omitted here. The aligning process will be shown in the experiment part.


4.4 Adhesive dispensing control

The adhesive will start to leak into the crevice of the hole and spread around the glass tube once the dispensing needle touches the silicon substrate or the glass tube, because of the wetting phenomenon. Before the dispensing needle touches the silicon substrate or the glass tube, the adhesive clings firmly to the tip of the needle due to strong surface tension, as the diameter of the adhesive at the tip of the needle is very small[16,17].

To obtain a desired adhesive spot, an on-line “monitor” is designed as a “trigger”. When the diameter of the adhesive spot reaches 35 μm on the on-line “monitor”, it immediately triggers a motion to withdraw the dispensing needle and end the adhesive dispensing process.

The on-line “monitor” is realized using a background subtraction method. The image captured by the CCD right after the needle contacts the bottom part of the glass tube is used as the background image. After that, the background image is subtracted from the on-line images, and binarization is performed to obtain the growing adhesive spot. To improve the speed of the image processing for the adhesive spot, the ROI is set to be a long rectangle, which contains 50 pixels above and 50 pixels below the hole center.
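The “monitor”/“trigger” pair can be sketched as follows. The pixel length, the difference threshold, and the diameter measure (widest horizontal extent of the subtracted spot within the ROI) are assumptions for illustration, not the paper's exact values.

```python
import numpy as np

SPOT_TARGET_UM = 35.0   # desired adhesive spot diameter
PIXEL_UM = 0.5          # hypothetical pixel length on the image plane
DIFF_THRESHOLD = 15     # binarization threshold after subtraction

def spot_diameter_um(frame: np.ndarray, background: np.ndarray) -> float:
    """On-line 'monitor': subtract the background image grabbed right after
    the needle contacts the tube, binarize the difference, and report the
    widest horizontal extent of the adhesive spot in micrometers."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    spot = diff > DIFF_THRESHOLD
    cols = np.flatnonzero(spot.any(axis=0))
    if cols.size == 0:
        return 0.0
    return float(cols[-1] - cols[0] + 1) * PIXEL_UM

def should_withdraw(frame: np.ndarray, background: np.ndarray) -> bool:
    """'Trigger': end dispensing once the spot reaches the target size."""
    return spot_diameter_um(frame, background) >= SPOT_TARGET_UM
```

In the loop, `should_withdraw` is evaluated on each new frame of the rectangular ROI; the first `True` fires the needle-withdrawal motion.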

5 Experiments and results

5.1 Experiment system

An experiment system was established according to the scheme given in Section 2, as shown in Fig. 5. In the experiment system, there are three microscope cameras: a 120× magnification microscope camera (Navitar 12× combined with Mitutoyo M84010711 10×), a stereo microscope (Zeiss Stemi 2000-C) and a 12× magnification microscope camera (Navitar 12×). Only the 120× magnification microscope camera is used to provide visual feedback in the automated microassembly task. The 12× magnification microscope camera and the stereo microscope are used for the initialization of the automated process, which is to bring the glass tube and the dispensing needle to the vicinity of the hole in the silicon substrate. The two 3-DOF manipulators are both Sutter MP-285; each manipulator has a travel of 2.54 cm with a resolution of 0.04 μm.

Fig. 5 Experiment system

The glass tube tip is about 10 μm in diameter. The hole is 12 μm in diameter. The outer diameter of the dispensing needle is about 1.2 μm.

5.2 Calibration results

Experiments were conducted to complete the calibration of the system. Fig. 6 shows the calibration results of the microscope axis in 20 experiments. The calibrated angle of the microscope axis off the horizontal plane was 29.8°.

Fig. 6 Microscope axis calibration result

5.3 Image segmentation and feature extraction

The image segmentation method described in Section 4.1 was employed to segment the objects. Fig. 7 shows the image area acquired after median filtering and a morphological opening operation on the binarized subtraction image. The threshold used for binarization was 15.

Fig. 7 Binarized subtraction image area

The projection curves used for determining the edges of the glass tube area and the needle area are shown in Fig. 8. From Fig. 8 (a), with the vertical projection of the whole area above the hole area, the right edge of the needle area and the left and right edges of the glass tube area were easily and robustly determined. Fig. 8 (b) is the horizontal projection of the needle area separated from Fig. 8 (a). From this curve, the top and bottom edges of the needle area were determined. Fig. 8 (c) is the horizontal projection curve of the glass tube area, from which the bottom edge of the glass tube area was determined.


(a) Vertical projection of the area above the hole area

(b) Horizontal projection of needle area

(c) Horizontal projection of glass tube area

Fig. 8 Projection curves

The final image segmentation result is shown in Fig. 9. It can be seen that all three objects were well segmented.

Fig. 9 Image segmentation result

5.4 Automated aligning and insertion

In the insertion experiment, the threshold gt for binarization of the glass tube tip ROI was set to 15.

The tracking results and the feature extraction results for the tube tip with the proposed methods are shown in Figs. 10 and 11. It can be seen that the tracking area was adequate and the extracted feature was accurate.

Fig. 10 Tracking areas of the glass tube tip

Fig. 11 Feature extraction results of the glass tube tip

The proportional coefficient Kp and the integral coefficient Ki in the PI control law were determined experimentally; they were 0.1 and 0.5, respectively. In the visual servo control step, the moving trajectory of the glass tube tip on the image plane is shown in Fig. 12. After aligning the glass tube tip with the designated aligning position, which was 120 pixels above the hole area, the glass tube tip was refocused and relocated to eliminate the accumulated error introduced by the visual servo process. The trajectory of the aligning and insertion in the manipulator coordinates is shown in Fig. 13. Some pictures of the process are given in Fig. 14.

Fig. 12 Moving trajectory on the image plane

Fig. 14 (a) shows the initial status of the hole, the glass tube and the dispensing needle on the image plane. Fig. 14 (b) shows the results of autofocus and the feature extraction of the glass tube tip. Fig. 14 (c) is the final state after the visual servo control aligned the glass tube tip center with the designated position above the hole center. Fig. 14 (d) is the final result of the insertion. In the experiments, the tube tip was moved to the hole steadily and inserted into the hole accurately.


Fig. 13 Trajectory of the glass tube tip in the aligning and insertion process

Fig. 14 Pictures in the process of aligning and insertion

5.5 Automated adhesive dispensing process

The needle was aligned with the dispensing position using the same control strategy as for the glass tube. Fig. 15 shows the dispensing needle aligning process and the adhesive dispensing process. The finished microassembly task is shown in the last image of Fig. 15. In the experiments, the adhesive dispensing was successfully conducted and the adhesive quantity was well controlled.

6 Conclusions

In this paper, an effective strategy is proposed to realize an automated microassembly task involving micro adhesive bonding, and some simple but effective methods are presented. The small depth of field is utilized to calibrate the microscope axis angle using the autofocus method. Robust image segmentation is achieved using background subtraction and projection curves. Feature extraction of the glass tube and the dispensing needle is realized using the RANSAC line-fit method. An on-line “monitor” is designed to ensure a precise size of the adhesive spot. Although these methods were put forward to solve the specific problems in our microassembly task, they can easily be adapted to various kinds of microassembly tasks. Experiments have verified the proposed strategy for realizing the automation of the microassembly task.

Fig. 15 Adhesive dispensing process

References

[1] A. N. Das, P. Zhang, W. H. Lee, D. Popa, H. Stephanou. µ3: Multiscale, deterministic micro-nano assembly system for construction of on-wafer microrobots. In Proceedings of 2007 IEEE International Conference on Robotics and Automation, IEEE, Roma, Italy, pp. 461–466, 2007.

[2] N. Dechev, W. L. Cleghorn, J. K. Mills. Microassembly of 3-D microstructures using a compliant, passive microgripper. Journal of Microelectromechanical Systems, vol. 13, no. 2, pp. 176–189, 2004.

[3] T. Kasaya, H. Miyazaki, S. Saito, T. Sato. Micro object handling under SEM by vision-based automatic control. In Proceedings of 1999 IEEE International Conference on Robotics and Automation, IEEE, Detroit, USA, pp. 2189–2196, 1999.

[4] L. G. Chen, L. N. Sun, W. B. Rong, X. Q. Bian. Hybrid control of vision and force for MEMS assembly system. In Proceedings of 2004 IEEE International Conference on Robotics and Biomimetics, IEEE, Shenyang, China, pp. 136–141, 2004.

[5] L. D. Wang, L. Ren, J. K. Mills, W. L. Cleghorn. Automated 3-D micrograsping tasks performed by vision-based control. IEEE Transactions on Automation Science and Engineering, vol. 7, no. 3, pp. 417–426, 2010.

[6] S. Bohm, K. Dilger, J. Hesselbach, J. Wrege, S. Rathmann, W. Ma, E. Stammen, G. Hemken. Micro bonding with non-viscous adhesives. Microsystem Technologies, vol. 12, pp. 676–679, 2010.

[7] D. Andrijasevic, K. Malecki, I. Gioroudi, W. Smetana, W. Brenner. Low temperature non-viscous adhesive bonding in MEMS. In Proceedings of the 29th International Spring Seminar on Electronics Technology, IEEE, St. Marienthal, Germany, pp. 44–48, 2006.

[8] X. J. Wang, X. D. Wang, G. J. Yao, C. Liu, L. D. Wang. Automatic micro-bonding technology of capillaries with adhesives. In Proceedings of the 6th International Conference on Electronic Packaging Technology, IEEE, Shenzhen, China, pp. 1–5, 2005.

[9] K. Tsuchiya, A. Murakami, G. Fortmann, M. Nakao, Y. Hatamura. Micro assembly and micro bonding in nano manufacturing world. SPIE Proceedings, vol. 3834, pp. 132–140, 1999.

[10] Y. B. Sun, Y. Luo, Y. Q. Feng, X. D. Wang. Adaptive pressure ultrasonic precise bonding method for polymer micro joint. In Proceedings of the 5th International Conference on Nano/Micro Engineered and Molecular Systems, IEEE, Xiamen, China, pp. 459–474, 2010.


[11] F. Sarvar, D. A. Hutt, D. C. Whalley. Application of adhesives in MEMS and MOEMS assembly: A review. In Proceedings of 2002 International Conference on Polymers and Adhesives in Microelectronics and Photonics, IEEE, Zalaegerszeg, Hungary, pp. 22–28, 2002.

[12] Y. Gat. A branch-and-bound technique for nano-structure image segmentation. In Proceedings of 2003 Conference on Computer Vision and Pattern Recognition Workshop, IEEE, Wisconsin, USA, p. 19, 2003.

[13] R. Marfil, L. Molina-Tanco, A. Bandera, J. A. Rodríguez, F. Sandoval. Pyramid segmentation algorithms revisited. Pattern Recognition, vol. 39, no. 8, pp. 1430–1451, 2006.

[14] J. Zhang, Z. T. Zhang, D. Xu, W. S. Zhang. Aligning micro-gripper to ring object in high precision with microscope vision. In Proceedings of 2011 IEEE International Conference on Computer Science and Automation Engineering, IEEE, Shanghai, China, pp. 123–127, 2011.

[15] M. A. Fischler, R. C. Bolles. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, vol. 24, no. 6, pp. 381–395, 1981.

[16] X. B. Chen, M. G. Li, N. Cao. Modeling of the fluid volume transferred in contact dispensing processes. IEEE Transactions on Electronics Packaging Manufacturing, vol. 32, no. 3, pp. 133–137, 2009.

[17] X. B. Chen, H. Ke. Effects of fluid properties on dispensing processes for electronics packaging. IEEE Transactions on Electronics Packaging Manufacturing, vol. 29, no. 2, pp. 75–82, 2006.

Fu-Dong Li received his B. Sc. and M. Sc. degrees in mechanical design and automation, and mechatronics engineering from Shandong University of Science and Technology, China in 2008 and 2011, respectively. He is currently working toward his Ph.D. degree in control science and engineering at the Institute of Automation, Chinese Academy of Sciences (IACAS).

His current research interests include computer vision and automation.

E-mail: fudong [email protected] (Corresponding author)

De Xu received his B. Sc. and M. Sc. degrees from Shandong University of Technology, China in 1985 and 1990, respectively, and his Ph.D. degree from Zhejiang University, China in 2001, all in control science and engineering. He joined the Institute of Automation, Chinese Academy of Sciences (IACAS), China in 2001, where he is currently a professor in the Research Center of Precision Sensing and Control.

His research interests include robotics and automation, particularly control of robots, such as visual control and intelligent control.

E-mail: [email protected]

Zheng-Tao Zhang received his B. Sc. degree from China University of Petroleum, China in 2004, and M. Sc. degree in control science and engineering from Beijing Institute of Technology, China in 2007. He is an associate professor at the Institute of Automation, Chinese Academy of Sciences, China.

His research interests include visual measurement, micro-assembly and automation.

E-mail: [email protected]

Ya-Li Shi received her B. Sc. degree from Changchun University of Science and Technology, China in 2004, and Ph.D. degree from Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, China in 2009. She is currently an associate professor at the Research Center of Precision Sensing and Control, Institute of Automation, Chinese Academy of Sciences, China.

Her research interests include fine mechanics and micro assembly.

E-mail: [email protected]