
Journal of Machine Construction and Maintenance | PROBLEMY EKSPLOATACJI
Quarterly, ISSN 1232-9312, 4/2017 (107), pp. 81–89

Kamil STATECZNY, Mirosław PAJOR, Karol MIĄDLICKI, Mateusz SAKÓW
Department of Mechanical Engineering and Mechatronics, West Pomeranian University of Technology, Szczecin, Poland
[email protected], [email protected], [email protected], [email protected]

MEMS-BASED SYSTEM FOR CONTROLLING AND PROGRAMMING THE INDUSTRIAL MANIPULATOR FANUC S-420F USING GESTURES

Key words: manipulator, manual control, operator interface, gesture control, MEMS sensor, FANUC KAREL, FANUC S-420F.

Abstract: The paper presents a mobile system for controlling an industrial manipulator. For this purpose, an interface for intuitive interaction with the robot was designed; it consists of a scanner measuring the orientation of the human upper limb and a glove measuring the deflection of the fingers. Basic gestures for moving the manipulator are proposed. The work builds on previous experience gained in projects on manual programming of CNC machine tools. The system is intended as a starting point for the construction of control interfaces for other mechanical devices.


Introduction

This article is the result of a non-classical approach to manipulator control. So far, the authors have concentrated on the design of a system for the manual control and programming of CNC machine tools [1, 2]. The manual control techniques used with a virtual reality interface were implemented to control a manipulator available at the Institute of Manufacturing Engineering. Operators are not able to communicate with the device using machine code in a relatively simple way, so they have to be equipped with various types of human-machine interfaces to allow communication. Nowadays, there is an expanding trend towards intuitive and easy-to-use control systems for devices. There are many facilities used for controlling and positioning manipulators, so there is no need to control the device from the keyboard by entering the desired position. Manipulators are controlled by different types of levers, control panels, and touchpads. Some robots can be programmed by grabbing the robot arm and moving it in the intended direction. There has also been research on voice [3–5], vision [6, 7], and sensor fusion [8, 9] interfaces used to control industrial manipulators. Although such programming is the most intuitive, it is not suitable for large robots, because the operator is smaller and unable to reach the tip of the manipulator.



The paper presents a method developed for manipulator control by tracking the motion of the operator's upper limbs. The work has been narrowed down to determining only the position of the tool centre point of the manipulator in the base coordinate system of the manipulator, without determining its orientation. All measurements, except the finger bending measurements, are used to determine orientation in a given space, without any distance measurement. The operator can therefore control the manipulator from any distance and can even be outside the work area of the manipulator.

Fig. 1. Block diagram of the operator interface

1. Operator interface

The operator interface (Fig. 1) consists of input elements and a computing platform.

1.1. Input elements

The input elements of the operator interface are MEMS sensors and the 3D Data Glove 4 Ultra. The glove uses bend sensors to measure the bending of the fingers. In this work, we applied a technique using the acceleration (including gravity), the angular velocity, and the magnetic field measured by means of a tri-axis accelerometer, a tri-axis gyroscope, and a tri-axis magnetometer, respectively. These sensors are integrated in the complete ADIS16405 system. Its advantage is that it compensates for the influence of supply voltage and temperature on each of the built-in sensors [10], and its sensitivity, bias, and axial alignment are factory-calibrated.

A gyroscope measures the angular velocity, and, knowing the initial conditions, it is possible to integrate the measured signal over time in order to obtain orientation. However, the measurement error accumulates in the integrated orientation, which makes the determination of absolute orientation impossible. An accelerometer measures the Earth's gravity in an absolute system, but its readings are strongly disturbed while it is moving. The magnetometer measures the Earth's magnetic field, but disturbances from other magnetic sources in the environment can change the readings. In order to calculate orientation on the basis of measurements from the gyroscope, the accelerometer, and the magnetometer, we used an orientation filter [11] for the MARG (Magnetic, Angular Rate, and Gravity) implementation, whose block diagram is presented in Fig. 2.

Fig. 2. Block diagram of the orientation filter for a MARG implementation

This filter is characterized by a low computational cost, which enables its implementation on inexpensive microcontrollers, and it achieves higher accuracy than the Kalman filter applied for calculating orientation [11]. The filter was implemented on an STM32F103CBT6 microcontroller on the ZL30ARM development board. Figure 3 presents the acceleration sensors along with the platform on which the filter was implemented. The ADIS16362 sensor communicates with the microcontroller by means of an SPI (Serial Peripheral Interface), sending the acceleration and angular velocity values necessary for calculating the orientation by means of the filter. Orientation in space is represented by means of a quaternion. Data is transmitted to the computer through a USB interface.
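To make the filter stage concrete, the listing below is a minimal C++ sketch of the gradient-descent orientation filter of [11] in its IMU-only form (gyroscope and accelerometer); the full MARG variant shown in Fig. 2 adds an analogous magnetometer correction step. The gain beta, the explicit Euler integration, and all identifiers are illustrative assumptions, not the authors' firmware.

```cpp
#include <cmath>

struct Quaternion { float w = 1.0f, x = 0.0f, y = 0.0f, z = 0.0f; };

// One step of the gradient-descent orientation filter (IMU variant).
// g* - angular rate [rad/s], a* - acceleration (any unit, normalized below),
// beta - filter gain, dt - sample period [s].
void filterUpdate(Quaternion& q, float gx, float gy, float gz,
                  float ax, float ay, float az, float beta, float dt)
{
    // Quaternion rate of change from the gyroscope: qDot = 0.5 * q (x) (0, g)
    float qDw = 0.5f * (-q.x * gx - q.y * gy - q.z * gz);
    float qDx = 0.5f * ( q.w * gx + q.y * gz - q.z * gy);
    float qDy = 0.5f * ( q.w * gy - q.x * gz + q.z * gx);
    float qDz = 0.5f * ( q.w * gz + q.x * gy - q.y * gx);

    float an = std::sqrt(ax * ax + ay * ay + az * az);
    if (an > 0.0f) {
        ax /= an; ay /= an; az /= an;

        // Objective function f: gravity direction predicted from q minus
        // the normalized accelerometer reading.
        float f1 = 2.0f * (q.x * q.z - q.w * q.y) - ax;
        float f2 = 2.0f * (q.w * q.x + q.y * q.z) - ay;
        float f3 = 2.0f * (0.5f - q.x * q.x - q.y * q.y) - az;

        // Gradient step s = J^T f, with J the Jacobian of f w.r.t. q.
        float s0 = -2.0f * q.y * f1 + 2.0f * q.x * f2;
        float s1 =  2.0f * q.z * f1 + 2.0f * q.w * f2 - 4.0f * q.x * f3;
        float s2 = -2.0f * q.w * f1 + 2.0f * q.z * f2 - 4.0f * q.y * f3;
        float s3 =  2.0f * q.x * f1 + 2.0f * q.y * f2;

        float sn = std::sqrt(s0 * s0 + s1 * s1 + s2 * s2 + s3 * s3);
        if (sn > 0.0f) {
            // Correct the gyroscope integration along the negative gradient.
            qDw -= beta * s0 / sn;
            qDx -= beta * s1 / sn;
            qDy -= beta * s2 / sn;
            qDz -= beta * s3 / sn;
        }
    }

    // Integrate and renormalize.
    q.w += qDw * dt; q.x += qDx * dt; q.y += qDy * dt; q.z += qDz * dt;
    float n = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
    q.w /= n; q.x /= n; q.y /= n; q.z /= n;
}
```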

1.2. Computing unit

Since this is a prototype system, the control procedure was implemented on a PC in C++ for speed of implementation. The main application modules are responsible for receiving and processing the input data and for sending the selected information to the manipulator control system.
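A possible skeleton of this module structure is sketched below. Every type and function name is hypothetical, since the paper does not publish its source code; only the 200 ms refresh period is taken from the paper (Section 1.3).

```cpp
#include <chrono>
#include <thread>

// All types and function names below are hypothetical stubs; the paper
// describes the module responsibilities but not their implementation.
struct SensorFrame { float quat[4]; float fingerBend[5]; };
struct Increment   { float dx, dy, dz; };

SensorFrame readSensorFrame() { return {}; }                    // USB input module
Increment   computeIncrement(const SensorFrame&) { return {}; } // Eqs. (1)-(11)
void        sendToRobot(const Increment&) {}                    // RS-232 output module

int main()
{
    using namespace std::chrono;
    for (;;) {
        const auto t0 = steady_clock::now();
        const SensorFrame f = readSensorFrame();   // receive input data
        const Increment   v = computeIncrement(f); // process it
        sendToRobot(v);                            // forward to the RH controller
        // The controller accepts a new set-point every 200 ms (Sec. 1.3).
        std::this_thread::sleep_until(t0 + milliseconds(200));
    }
}
```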

1.3. Industrial Manipulator

A FANUC S-420F robot with an RH controller was used to implement and test the proposed control system. The robot is driven by six servo motors responsible for motion in the respective axes (Fig. 4).

Fig. 3. Sensors along with the microcontrollers

The RH controller is a modular system dedicated to controlling FANUC industrial manipulators. It consists of the main CPU, a path-tracking processor, divided RAM, bubble memory, and servo motor control systems. Six I/O cards are available with a total of 48 digital inputs, 48 digital outputs, and two RS-232 communication ports. The device can communicate with external systems, e.g., to feed the status of a production line. The controller also contains six servo amplifiers. A comprehensive description of the robot and the controller can be found in the technical specifications [12]. The RH controller can be programmed using the FANUC Karel language. A comprehensive description of the syntax of the programming language, the user manual for the CRT/KB operation control panel, the KCL command prompts, system variables, and error codes can be found in the technical specifications [13]. To communicate with the control system of the manipulator, an RS-232 serial interface was used. This interface was used to send information about the desired position of the end-effector of the manipulator. The data can be refreshed every 200 ms. Using the Karel programming language, an application was written that allowed the system to interpret the received data (desired position – incremental and global). It was also possible to determine the speed and acceleration that the manipulator should reach to achieve the desired position. The control system implements inverse kinematics and a trajectory generator with linear interpolation.
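For the PC side of this link, a minimal POSIX sketch of opening an RS-232 port and sending one incremental set-point could look as follows. The device path, baud rate, and ASCII frame layout are assumptions (the paper specifies none of them), and the Karel application on the RH controller would have to parse whatever format is actually agreed.

```cpp
#include <cstdio>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

// Opens a serial port at 9600 baud, raw 8N1. Device path and baud rate
// are assumptions; the paper only specifies that RS-232 is used.
int openSerial(const char* dev)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) return -1;
    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);
    tio.c_cflag |= (CLOCAL | CREAD);
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

// Sends one incremental TCP set-point. The ASCII frame layout is purely
// illustrative; the Karel application on the controller must parse it.
bool sendIncrement(int fd, float dx, float dy, float dz)
{
    char buf[64];
    int n = std::snprintf(buf, sizeof buf, "I %.2f %.2f %.2f\r\n", dx, dy, dz);
    return write(fd, buf, n) == n;
}

int main()
{
    int fd = openSerial("/dev/ttyS0");
    if (fd < 0) { std::perror("open"); return 1; }
    sendIncrement(fd, 10.0f, 0.0f, 0.0f);  // e.g., move the TCP 10 mm along X
    close(fd);
}
```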

Fig. 4. Design of the FANUC S-420F


2. Proposed methods of control

The research focused on two ways of controlling the manipulator: a "remote control" and a system measuring the position of the hand in the operator coordinate system.

2.1. Control by using a remote device

Figure 5 shows the operator holding the remote control containing an orientation sensor. Figure 5 also shows the applied frames: {E} – earth frame, {R} – robot frame, {S} – sensor frame, and {P} – remote control frame.

The orientation sensor provides the orientation of the earth relative to the sensor frame. In this case, ${}^{E}_{S}\hat{q}$ and ${}^{E}_{P}\hat{q}$ define the orientation of the earth frame {E} relative to the sensor frame {S} and the orientation of the earth frame {E} relative to the remote-control frame {P}, respectively. A unit quaternion ${}^{B}_{A}\hat{q} = [w,\ x,\ y,\ z]$ may be regarded as a representation of a rotation, where the vector $[x,\ y,\ z]$ determines the direction of the rotation axis and $\theta = 2\arccos(w)$ determines the angle by which the rotation is made [14, 15]. A quaternion can be translated unambiguously into a rotation matrix, as presented in Eq. (1):

$$
R =
\begin{bmatrix}
1 - 2(y^{2} + z^{2}) & 2(xy - zw) & 2(xz + yw) \\
2(xy + zw) & 1 - 2(x^{2} + z^{2}) & 2(yz - xw) \\
2(xz - yw) & 2(yz + xw) & 1 - 2(x^{2} + y^{2})
\end{bmatrix}
\qquad (1)
$$

Fig. 5. Coordinate systems for controlling by the remote device

The sensor S is used to determine the orientation of the remote control relative to the orientation of the robot frame. Therefore, it is necessary to set the S sensor orientation to match the robot frame. Thanks to this sensor, the operator does not have to enter the mutual orientations of these frames manually.
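Eq. (1) translates directly into code. A minimal C++ helper is sketched below; the names and the std::array-based matrix type are ours, and the quaternion is assumed to be normalized, as produced by the orientation filter.

```cpp
#include <array>

struct Quaternion { float w, x, y, z; };
using Mat3 = std::array<std::array<float, 3>, 3>;

// Converts a unit quaternion [w, x, y, z] to a rotation matrix, Eq. (1).
Mat3 toRotationMatrix(const Quaternion& q)
{
    const float w = q.w, x = q.x, y = q.y, z = q.z;
    return {{{ 1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)     },
             { 2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)     },
             { 2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y) }}};
}
```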


Quaternions are converted into rotation matrices according to Eqs. (2) and (3):

$$ {}^{E}_{S}\hat{q} \rightarrow {}^{E}_{S}R \qquad (2) $$

$$ {}^{E}_{P}\hat{q} \rightarrow {}^{E}_{P}R \qquad (3) $$

Since the frame {S} has the same orientation as the frame {R}:

$$ {}^{E}_{R}R = {}^{E}_{S}R \qquad (4) $$

To describe the orientation of the remote control in the frame {R}, Eq. (5) was used:

$$ {}^{R}_{P}R = {}^{E}_{R}R^{T}\,{}^{E}_{P}R \qquad (5) $$

Because it was only possible to control the position, not the speed, the incremental position of the tool centre point was sent to the control system of the manipulator. The shift vector of the tool centre point of the manipulator was set as follows:

$$ {}^{R}V = {}^{R}_{P}R\,{}^{P}V \qquad (6) $$

where

$$ {}^{P}V = [d,\ 0,\ 0]^{T} \qquad (7) $$

and d is a fixed movement step.

The direction of the vector ${}^{P}V$ was determined by the orientation of the remote control. Its sense was selected by pressing one of two buttons. After a button was pressed, the incremental position ${}^{R}V$ was sent to the manipulator's control system.
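Eqs. (2)–(7) combine into a single update routine, sketched below in C++. The Eq. (1) helper is restated so the listing compiles on its own; the function names and the sense argument (the sign chosen with one of the two buttons) are our illustration, not the authors' code.

```cpp
#include <array>

struct Quaternion { float w, x, y, z; };
using Mat3 = std::array<std::array<float, 3>, 3>;
using Vec3 = std::array<float, 3>;

// Eq. (1): unit quaternion -> rotation matrix (restated from the previous listing).
Mat3 toRotationMatrix(const Quaternion& q)
{
    const float w = q.w, x = q.x, y = q.y, z = q.z;
    return {{{ 1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)     },
             { 2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)     },
             { 2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y) }}};
}

Vec3 mul(const Mat3& R, const Vec3& v)   // R v
{
    Vec3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) r[i] += R[i][j] * v[j];
    return r;
}

Vec3 mulT(const Mat3& R, const Vec3& v)  // R^T v
{
    Vec3 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) r[i] += R[j][i] * v[j];
    return r;
}

// Eqs. (2)-(7): one incremental TCP shift from the remote-control pose.
// qES - sensor aligned with the robot frame, qEP - remote control,
// d - fixed step, sense - +1 or -1, chosen with one of the two buttons.
Vec3 remoteIncrement(const Quaternion& qES, const Quaternion& qEP,
                     float d, int sense)
{
    const Mat3 RES = toRotationMatrix(qES);      // Eq. (2); E_R R = E_S R by Eq. (4)
    const Mat3 REP = toRotationMatrix(qEP);      // Eq. (3)
    const Vec3 PV  = { sense * d, 0.0f, 0.0f };  // Eq. (7)
    return mulT(RES, mul(REP, PV));              // Eqs. (5)-(6)
}
```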

2.2. Control by using a movement scanner

For a more intuitive control of the manipulator, a scanner of the operator's upper limb movements was designed. The scanner determines the position of the hand in the operator frame. The sensors were deployed on the operator's body in a way that takes the movement apparatus of the human body into consideration. Since it was impossible to place a sensor on each bone and determine its orientation, we constructed a simplified model of human kinematics and deployed the sensors in strategic locations to determine the position of the hand in space. Fig. 6 shows the distribution of the sensor systems.

Fig. 6. Distribution of the orientation sensors on an operator equipped with the movement scanner

The point defining the position of the hand was calculated according to Fig. 7. The coordinate systems for controlling the robot using the operator's hand movements are shown in Fig. 8.

Quaternions are converted into rotation matrices according to Eq. (8):

$$ {}^{E}_{S_i}\hat{q} \rightarrow {}^{E}_{S_i}R \qquad (8) $$

The position of the hand in the frame {O} is determined according to the following equation:

$$ {}^{O}P_{n+1} = \sum_{k=1}^{n} {}^{E}_{S_k}R^{T}\,{}^{S_k}P_{k+1} \qquad (9) $$

where n is the number of segments in the scanner and ${}^{S_k}P_{k+1}$ is the vector of segment k expressed in its sensor frame.


Fig. 7. Determining the position of segments

Fig. 8. Coordinate systems used in movement scanner calculations

The frame {O} corresponds to i = 0; it is therefore the base frame for the scanner of the upper limb movements. The operator's position in the frame {E} is unknown, but his orientation relative to that frame is known. By performing a hand gesture, it is possible to designate a free vector:

$$ {}^{O}V = {}^{O}P'_{n+1} - {}^{O}P_{n+1} \qquad (10) $$

where

${}^{O}P_{n+1}$ – hand position at time t,

${}^{O}P'_{n+1}$ – hand position at time t + Δt, where Δt is the sampling time.


Vector ${}^{O}V$ can be expressed in the frame {R} according to the following relation:

$$ {}^{R}V = {}^{E}_{R}R^{T}\,{}^{E}_{O}R\,{}^{O}V \qquad (11) $$

Movement of the manipulator is possible only after the operator closes his hand. The incremental position ${}^{R}V$ is then sent to the control system.
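Taking Eq. (9) as reconstructed above, the whole scanner pipeline of Eqs. (8)–(11) can be sketched as follows. Rotating a vector by a unit quaternion q is equivalent to multiplying by the matrix R of Eq. (1), and rotating by the conjugate q* to multiplying by R^T, so the sketch works on quaternions directly. The Segment structure, the hand-closed gate, and all identifiers are our reading of the paper, not its published code.

```cpp
#include <array>
#include <vector>

struct Quaternion { float w, x, y, z; };
using Vec3 = std::array<float, 3>;

Vec3 cross(const Vec3& a, const Vec3& b)
{
    return { a[1]*b[2] - a[2]*b[1],
             a[2]*b[0] - a[0]*b[2],
             a[0]*b[1] - a[1]*b[0] };
}

// Rotates v by the unit quaternion q (v' = q v q*), i.e. multiplies by R of Eq. (1).
Vec3 rotate(const Quaternion& q, const Vec3& v)
{
    const Vec3 u = { q.x, q.y, q.z };
    Vec3 t = cross(u, v);
    for (float& c : t) c *= 2.0f;
    const Vec3 t2 = cross(u, t);
    return { v[0] + q.w * t[0] + t2[0],
             v[1] + q.w * t[1] + t2[1],
             v[2] + q.w * t[2] + t2[2] };
}

// Rotating by the conjugate is equivalent to multiplying by R^T.
Quaternion conj(const Quaternion& q) { return { q.w, -q.x, -q.y, -q.z }; }

struct Segment {
    Quaternion qES;  // E_Si q from the orientation filter, Eq. (8)
    Vec3 P;          // Si P_{i+1}: the segment vector in its own sensor frame
};

// Eq. (9): position of the hand accumulated along the kinematic chain.
Vec3 handPosition(const std::vector<Segment>& chain)
{
    Vec3 p{};
    for (const Segment& s : chain) {
        const Vec3 v = rotate(conj(s.qES), s.P);  // E_Sk R^T * Sk P_{k+1}
        for (int i = 0; i < 3; ++i) p[i] += v[i];
    }
    return p;
}

// Eqs. (10)-(11): free vector from two hand positions one sample apart,
// expressed in the robot frame {R}. Motion is gated by the closed hand,
// detected with the bend sensors of the data glove.
Vec3 gestureIncrement(const Vec3& pPrev, const Vec3& pNow,
                      const Quaternion& qEO, const Quaternion& qER,
                      bool handClosed)
{
    if (!handClosed) return { 0.0f, 0.0f, 0.0f };
    const Vec3 OV = { pNow[0] - pPrev[0],
                      pNow[1] - pPrev[1],
                      pNow[2] - pPrev[2] };       // Eq. (10)
    return rotate(conj(qER), rotate(qEO, OV));    // Eq. (11)
}
```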

3. Experimental results

The proposed systems were tested on a test stand in the technology hall of the Institute of Materials Science and Engineering, West Pomeranian University of Technology, Szczecin. The tests focused on the usability of the method and the ease of control rather than on its accuracy. In the tests, the operator moved his hand along designated trajectories. During the movements, the hand and robot TCP positions were recorded. In the first experiment, the desired trajectory was a circle with a radius of 200 mm. The results of the first experiment are presented in Figures 9 and 10.

In the second experiment, the desired trajectory was an ellipse with a minor axis radius of 200 mm and a major axis radius of 250 mm. The results are presented in Figures 11 and 12.

The experiments revealed that the proposed robot control system was much more intuitive than the operation control panel (teach pendant), the command line, or the programming language (KAREL) provided by the manufacturer. In the tested system, the operator did not have to know how to use the operation control panel or the command line, or know specific commands. The operator could move the robot TCP along circles and ellipses without programming; such trajectories would be difficult to perform even with a teach pendant.

Fig. 9. Method A results – moving along a circle with a radius of 200 mm

Fig. 10. Method B results – moving along a circle with a radius of 200 mm


Fig. 11. Method A results – moving along an ellipse with a minor axis radius of 200 mm and a major axis radius of 250 mm

Fig. 12. Method B results – moving along an ellipse with a minor axis radius of 200 mm and a major axis radius of 250 mm

However, the proposed system has drawbacks, mainly caused by hardware imperfections. The main issue is the slow interface between the robot's RH controller and the MEMS controller: when the operator moved quickly, the reaction of the robot was delayed. Another disadvantage affecting the accuracy of the system is the magnetic interference between the sensors. In the first control method, where one sensor was used, the errors were about ±20 mm; in the second method, where more sensors are used, the errors were about ±30 mm. Given the purposes of the system, these values are sufficient for tasks such as pre-positioning or moving objects. It is also possible to apply additional support functions that improve accuracy.

Summary

The paper presents and discusses the concept and the constituent elements of the proposed system, together with the test results. The conducted studies have shown that it is easier to control the robot with a remote control containing an orientation sensor, as well as with a scanner of the upper limb movements based on orientation sensors for determining the location, than with the manufacturer's standard interfaces. Smoother movement with the proposed remote-control systems would be obtained if it were possible to control the speed of the tool centre point of the manipulator in a given direction and sense. It is worth noting that magnetic interference can cause erroneous determination of orientation; therefore, an optical supervisory system might be a sufficient remedy. With smaller, energy-efficient sensors, it will be possible to design a system hidden in clothing with wireless communication. Since the main aim of the research was to test the possibility of using a new control approach to operate industrial manipulators, the precision can be improved in the next stage of the project.

References

1. Stateczny K., Pajor M., "Project of a manipulation system for manual movement of CNC machine tool body units," Advances in Manufacturing Science, vol. 35, pp. 33–41, 2011.

2. Stateczny K., "Construction of a system scanning the movement of human upper limbs," Zeszyty Naukowe / Akademia Morska w Szczecinie, no. 32(104) z. 1, pp. 75–80, 2012.

3. Laureiti C. et al., "Comparative performance analysis of M-IMU/EMG and voice user interfaces for assistive robots," in 2017 International Conference on Rehabilitation Robotics (ICORR), 2017, pp. 1001–1006.

4. Majewski M., Kacalak W., "Conceptual Design of Innovative Speech Interfaces with Augmented Reality and Interactive Systems for Controlling Loader Cranes," in Artificial Intelligence Perspectives in Intelligent Systems, vol. 464 (Advances in Intelligent Systems and Computing), R. Silhavy, R. Senkerik, Z. K. Oplatkova, P. Silhavy, and Z. Prokopova, Eds. Berlin: Springer-Verlag, 2016, pp. 237–247.

5. Zinchenko K., Wu C.Y., Song K.T., "A Study on Speech Recognition Control for a Surgical Robot," IEEE Transactions on Industrial Informatics, vol. 13, no. 2, pp. 607–615, 2017.

6. Shirwalkar S., Singh A., Sharma K., Singh N., "Telemanipulation of an industrial robotic arm using gesture recognition with Kinect," in 2013 International Conference on Control, Automation, Robotics and Embedded Systems (CARE), Jabalpur, 2013, pp. 1–6.

7. Pajor M., Miądlicki K., Saków M., "Kinect sensor implementation in FANUC robot manipulation," Archives of Mechanical Technology and Automation, vol. 34, pp. 35–44, 2014.

8. Yepes J.C., Yepes J.J., Martínez J.R., Pérez V.Z., "Implementation of an Android based teleoperation application for controlling a KUKA-KR6 robot by using sensor fusion," in 2013 Pan American Health Care Exchanges (PAHCE), 2013, pp. 1–5.

9. Bonilla I. et al., "A vision-based, impedance control strategy for industrial robot manipulators," in 2010 IEEE International Conference on Automation Science and Engineering, 2010, pp. 216–221.

10. ADIS16405 Data Sheet, Analog Devices. Available: http://www.analog.com/en/products/sensors/inertial-measurement-units/adis16405.html (accessed 12.08.2017).

11. Madgwick S.O.H., Harrison A.J.L., Vaidyanathan R., "Estimation of IMU and MARG orientation using a gradient descent algorithm," in 2011 IEEE International Conference on Rehabilitation Robotics, 2011, pp. 1–7.

12. Fanuc Robotics, Maintenance Manual, MARMKS42H1174EF / B-67205EG01.

13. Fanuc Robotics, Enhanced KAREL Operations Manual, v. 2.22R, MAROKENHA0885EF.

14. DeLoura M., Perełki programowania gier: vademecum profesjonalisty. Helion, 2002.

15. Matulewski J., Grafika, fizyka, metody numeryczne: symulacje fizyczne z wizualizacją 3D. Wydawnictwo Naukowe PWN, 2010.