Implementation of Reliable Open Source IRIS Recognition System

International Journal on Theoretical and Applied Research in Mechanical Engineering (IJTARME), ISSN: 2319-3182, Volume-2, Issue-4, 2013

Implementation of Reliable Open Source IRIS Recognition System

Dhananjay Ikhar 1, Vishwas Deshpande 2 & Sachin Untawale 3
1&3 Dept. of Mechanical Engineering, Datta Meghe Institute of Engineering, Technology & Research, Wardha
2 Ramdeobaba College of Engineering & Management, Nagpur
E-mail : [email protected], [email protected], [email protected]

Abstract - Reliable automatic recognition of persons has long been an attractive goal. As in all pattern recognition problems, the key issue is the relation between inter-class and intra-class variability: objects can be reliably classified only if the variability among different instances of a given class is less than the variability between different classes. The objective of this paper is to implement an open-source iris recognition system in order to verify the claimed performance of the technology. The development tool used is MATLAB, and the emphasis is only on the software for performing recognition, not on hardware for capturing an eye image. A reliable application development approach is employed in order to produce results quickly; MATLAB, with its Image Processing Toolbox, provides an excellent environment for this. To test the system, a database of 756 grayscale eye images, courtesy of the Chinese Academy of Sciences - Institute of Automation (CASIA), is used. The system is composed of a number of sub-systems, which correspond to the stages of iris recognition: image acquisition, segmentation, normalization and feature encoding. The input to the system is an eye image, and the output is an iris template, which provides a mathematical representation of the iris region. The objectives in designing the recognition system are therefore: to study different biometrics and their features, to study different recognition systems and their steps, to select a simple and efficient recognition algorithm for implementation, to select a fast and efficient processing tool, and to apply the implemented algorithm to different databases and determine the performance factors.

Index Terms - Biometrics, Iris Recognition, Iris Image Quality, Fingerprint, Normalisation

I. INTRODUCTION

The anticipated large-scale applications of biometric technologies such as iris recognition are driving innovations at all levels, ranging from sensors to user interfaces, to algorithms and decision theory. At the same time as these innovations, and possibly even outpacing them, the demands on the technology are growing.

The iris is an externally visible and well protected organ whose unique epigenetic pattern remains stable throughout adult life. These characteristics make it very attractive for use as a biometric for identifying individuals. Image processing techniques can be employed to extract the unique iris pattern from a digitized image of the eye and encode it into a biometric template, which can be stored in a database. This biometric template contains an objective mathematical representation of the unique information stored in the iris, and allows comparisons to be made between templates. When a subject wishes to be identified by an iris recognition system, their eye is first photographed and a template is then created for their iris region. This template is compared with the other templates stored in the database until either a matching template is found and the subject is identified, or no match is found and the subject remains unidentified.
The basis of every biometric trait is thus the same: acquire the input signal or image, apply an algorithm, and extract the prominent features for person identification or verification. In the identification case, the system is trained with the patterns of several persons; for each person, a template is calculated in the training stage. A pattern that is to be identified is then matched against every known template. In the verification case, a person's identity is claimed a priori, and the pattern to be verified is compared only with that person's individual template. Most biometric systems allow two modes of operation: an enrolment mode for adding templates to a database, and an identification mode, where a template is created for an individual and a match is then searched for in the database of pre-enrolled templates. The following figure differentiates the enrolment and identification processes; a minimal sketch of the two modes is given below.
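As a rough illustration of these two modes only, the following MATLAB sketch enrols templates into a simple struct-array database and identifies a probe by exhaustive comparison. The function createTemplate, the field names and the threshold value are hypothetical placeholders for the stages described later in Section II; this is a sketch of the workflow, not the paper's implementation.

```matlab
% Sketch of the two operating modes. createTemplate() is a hypothetical
% stand-in for the segmentation/normalization/encoding pipeline of
% Section II; templates are assumed to be logical bit strings.

function db = enrol(db, personId, eyeImage)
    % Enrolment mode: create a template and add it to the database.
    entry.id = personId;
    entry.template = createTemplate(eyeImage);
    db(end+1) = entry;
end

function matchedId = identify(db, eyeImage, threshold)
    % Identification mode: compare against every pre-enrolled template.
    probe = createTemplate(eyeImage);
    matchedId = '';
    bestHD = Inf;
    for k = 1:numel(db)
        hd = sum(xor(probe, db(k).template)) / numel(probe);  % Hamming distance
        if hd < bestHD
            bestHD = hd;
            matchedId = db(k).id;
        end
    end
    if bestHD > threshold
        matchedId = '';   % no match found: the subject remains unidentified
    end
end
```

Here db would start as an empty struct array, e.g. db = struct('id', {}, 'template', {}), and the Hamming-distance comparison anticipates the matching metric discussed in Section II-D.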

II. IMPLEMENTATION

Implementing a good image acquisition system is difficult, because the system is expected to obtain a noise-free image, yet the result depends on the surrounding intensity, the lighting used, the camera resolution, and the distance between the camera and the user's eye. An IR LED is more commonly used as the light source than a visible LED, because visible light affects the human eye. In this paper, eye images from the CASIA database are used, which include both noise-free and noisy images; they are taken from the Centre of Biometrics and Security Research, CASIA Iris Image Database.

A. Image Segmentation

For pupil detection, the iris image is converted into grayscale to remove the effect of illumination. As the pupil is the largest black area in the intensity image, its edges can be detected easily from the binarized image obtained by applying a suitable threshold to the intensity image. Thus, the first step in separating out the pupil is to compute the histogram of the input image, from which a threshold value for the pupil is obtained; edge detection is then applied. Once the pupil edge is found, the centre coordinates and radius can easily be found by the following steps (a minimal sketch of these steps follows this subsection): (A) find the largest and smallest values on both the x and y axes; (B) add the two x-axis values and divide by two to give the x-centre point; (C) similarly, add the two y-axis values and divide by two to give the y-centre point; (D) the radius of the pupil circle is obtained by subtracting the minimum value from the maximum and dividing by two.

Fig. 1. (a) Canny edge image (b) Only pupil (c) Pupil ring

Eyelashes and eyelids always affect the performance of the system. The eyelashes are treated as belonging to two types: separable eyelashes, which are isolated in the image, and multiple eyelashes, which are bunched together and overlap in the eye image. In this paper the iris circle diameter is assumed to be two times the pupil diameter, and the noise due to eyelashes and eyelids is avoided by considering only the lower 180° portion of the iris circle. Hence, after segmentation, a complete iris part is separated out, as shown below.

Fig. 2. (a) Selected iris circle (b) Sharpened iris (c) Iris part with eyelashes
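To make steps (A)-(D) concrete, here is a minimal MATLAB sketch of the bounding-box pupil localization, assuming the Image Processing Toolbox. The file name and the fixed threshold of 70 are illustrative placeholders (in practice the threshold is read off the image histogram), and eyelash noise is ignored for brevity.

```matlab
% Minimal sketch of pupil localization steps (A)-(D).
% Assumes the Image Processing Toolbox; the threshold is a placeholder
% for the value obtained from the image histogram.

eye = imread('eye.bmp');              % hypothetical CASIA eye image file
if size(eye, 3) == 3
    eye = rgb2gray(eye);              % grayscale removes the effect of illumination
end

bw = eye < 70;                        % binarize: the pupil is the largest dark region
edges = edge(bw, 'canny');            % Canny edge image of the binarized pupil

[rows, cols] = find(edges);           % coordinates of the pupil edge pixels
xMin = min(cols);  xMax = max(cols);  % (A) smallest and largest x and y values
yMin = min(rows);  yMax = max(rows);

xCentre = (xMin + xMax) / 2;          % (B) x-centre point
yCentre = (yMin + yMax) / 2;          % (C) y-centre point
pupilRadius = (xMax - xMin) / 2;      % (D) radius of the pupil circle

fprintf('Pupil centre (%.1f, %.1f), radius %.1f px\n', xCentre, yCentre, pupilRadius);
```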
B. Normalization

The Daugman rubber sheet model remaps each point within the iris region to a pair of polar coordinates (r, θ), where r lies on the interval [0, 1] and θ is the angle in [0, 2π].

Fig. 3. Daugman's rubber sheet model

The remapping of the iris region from (x, y) Cartesian coordinates to the normalized, non-concentric polar representation is modeled as

    I(x(r, θ), y(r, θ)) → I(r, θ)
    with x(r, θ) = (1 − r) x_p(θ) + r x_l(θ)
         y(r, θ) = (1 − r) y_p(θ) + r y_l(θ)          ...(1)

where I(x, y) is the iris region image, (x, y) are the original Cartesian coordinates, (r, θ) are the corresponding normalized polar coordinates, and x_p, y_p and x_l, y_l are the coordinates of the pupil and iris boundaries along the θ direction. The rubber sheet model takes into account pupil dilation and size inconsistencies in order to produce a normalized representation with constant dimensions. In this way the iris region is modeled as a flexible rubber sheet anchored at the iris boundary, with the pupil centre as the reference point.

Even though the homogeneous rubber sheet model accounts for pupil dilation, imaging distance and non-concentric pupil displacement, it does not compensate for rotational inconsistencies. In the Daugman system, rotation is accounted for during matching by shifting the iris templates in the θ direction until two iris templates are aligned. For normalization of the iris region, a technique based on Daugman's rubber sheet model is employed: the centre of the pupil is taken as the reference point, and radial vectors pass through the iris region, as shown in the figure below.

Fig. 4. Normalization

A number of data points are selected along each radial line; this is defined as the radial resolution. The number of radial lines going around the iris region is defined as the angular resolution. Since the pupil can be non-concentric to the iris, a remapping formula is needed to rescale points depending on the angle around the circle. This is given by

    r' = √(αβ) ± √(αβ² − α − r_I²)
    with α = o_x² + o_y²,  β = cos(π − arctan(o_y / o_x) − θ)          ...(2)

where the displacement of the centre of the pupil relative to the centre of the iris is given by o_x, o_y; r' is the distance between the edge of the pupil and the edge of the iris at an angle θ around the region; and r_I is the radius of the iris. The remapping formula first gives the radius of the iris region as a function of the angle θ. A constant number of points is chosen along each radial line, so that a constant number of radial data points is taken, irrespective of how narrow or wide the radius is at a particular angle. A minimal sketch of this remapping is given below.
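As an illustration, the following MATLAB sketch performs the rubber-sheet remapping of Eq. (1), continuing from the pupil-localization sketch above (it reuses eye, xCentre, yCentre and pupilRadius). For brevity the pupil and iris circles are treated as concentric, so the per-angle correction of Eq. (2) is not applied, and the resolution values are arbitrary examples.

```matlab
% Minimal sketch of the rubber-sheet normalization of Eq. (1), assuming
% concentric pupil and iris circles; Eq. (2) would otherwise adjust the
% outer radius at each angle theta.

irisRadius = 2 * pupilRadius;         % paper's assumption: iris diameter = 2 x pupil diameter
radialRes  = 20;                      % data points along each radial line (radial resolution)
angularRes = 240;                     % radial lines around the iris (angular resolution)

theta = linspace(0, 2*pi, angularRes);          % 1 x angularRes
r     = linspace(0, 1, radialRes)';             % radialRes x 1

% Pupil and iris boundary points for every angle.
xp = xCentre + pupilRadius * cos(theta);   yp = yCentre + pupilRadius * sin(theta);
xl = xCentre + irisRadius  * cos(theta);   yl = yCentre + irisRadius  * sin(theta);

% Eq. (1): linear interpolation between the two boundaries.
x = (1 - r) * xp + r * xl;            % radialRes x angularRes sampling grid
y = (1 - r) * yp + r * yl;

% Sample the eye image at the remapped coordinates: the result is the
% normalized representation with constant dimensions (rows = r, cols = theta).
polarIris = interp2(double(eye), x, y, 'linear', 0);
```

The array polarIris is the fixed-size (r, θ) representation that the feature-encoding stage operates on.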

The normalized pattern is created by backtracking to find the Cartesian coordinates of data points from their radial and angular positions in the normalized pattern. From the doughnut-shaped iris region, normalization produces a 2D array whose horizontal dimension is the angular resolution and whose vertical dimension is the radial resolution. Another 2D array is created for marking reflections, eyelashes and eyelids detected in the segmentation stage. In order to prevent non-iris data from corrupting the normalized representation, data points which occur along the pupil border or the iris border are discarded. As in Daugman's rubber sheet model, removal of rotational inconsistencies is performed at the matching stage, discussed below.

Fig. 5. Normalized iris part

C. Feature Encoding

1. Radial and Circular Feature Encoding

This approach is based on edge detection. Edges are detected in the input image using the Canny edge detector. After edge detection, the image is converted to binary format, in which white pixels lie on edges and black pixels elsewhere. The number of white pixels in the radial direction and on circles of different radii gives important information about the iris pattern. The normalized polar iris image contains only white and black pixels, as it is obtained from the edge-detected input image above. Features are extracted from the normalized image in two ways: (a) radially and (b) circularly.

(a) Radial features

Fig. 6. Feature extraction in the radial direction

In the iris image, the value of the radial feature at a particular angle is the number of white pixels along the radial direction. If

    S_{r,θ} = 1 if iris_polar_image[r][θ] = WHITE
    S_{r,θ} = 0 if iris_polar_image[r][θ] = BLACK

then the feature at angle θ is

    F_θ = Σ_{r=1..N} S_{r,θ}          ...(3)

(b) Circular features

Fig. 7. Feature extraction in the circular direction

In the iris image, the value of the circular feature at a particular radius is the sum of the white pixels along the circle of that radius. Keeping the meaning of S_{r,θ} the same, the feature at a particular radius r is given by

    F_r = Σ_{θ=0..2π} S_{r,θ}          ...(4)

The iris code is taken to be the sequence of radial and circular features. In this method, the number of white pixels in the radial and circular directions is measured, which then forms the code for that particular eye image. It is obtained by the following steps (a sketch of these steps follows the figure captions below):

1. The image in polar form is converted into binary form.

Fig. 8. Normalized image converted into binary

2. The number of white pixels in the radial and circular directions is measured. For the example image, white pixels = 1059 and black pixels = 7041.

3. The total number of white pixels is stored.

4. Steps 1 to 3 are followed for both the query image and the database image.

5. For matching, the two images are compared by subtracting the number of white pixels in the database image from the number of white pixels in the query image.

Fig. 9. Circular features
Fig. 10. Radial features
Fig. 11. (a) Query image (b) Retrieved image
Fig. 12. Normalized image
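A minimal MATLAB sketch of this radial and circular encoding follows, assuming polarIris is the normalized iris image from the previous sketch (rows indexed by r, columns by θ). The commented matching line is a placeholder for step 5, with totalWhiteQuery and totalWhiteDatabase as hypothetical stored counts.

```matlab
% Minimal sketch of the radial and circular feature counts of
% Eqs. (3) and (4) on the normalized polar iris image.

polarEdges = edge(mat2gray(polarIris), 'canny');   % step 1: binary polar image, white pixels on edges

radialFeatures   = sum(polarEdges, 1);   % Eq. (3): F_theta, white pixels summed over r for each angle
circularFeatures = sum(polarEdges, 2);   % Eq. (4): F_r, white pixels summed over theta for each radius

irisCode   = [radialFeatures(:); circularFeatures(:)];  % code as the sequence of both feature sets
totalWhite = sum(polarEdges(:));                        % steps 2-3: total white-pixel count, stored

% Step 5 (the paper's second matching method): absolute difference of the
% stored white-pixel counts of the query and database images.
% absDiff = abs(totalWhiteQuery - totalWhiteDatabase);
```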
D. Matching

The most commonly used metric for matching the two bit strings generated from the query image and from the template stored in the database is the Hamming distance. It is a simple XOR operation whose result is zero when the two strings have identical bits. Although, in theory, two iris templates generated from the same iris should have a Hamming distance of 0.0, in practice this does not occur: normalization is not perfect, and some noise goes undetected, so some variation will be present when comparing two intra-class iris templates.

III. PERFORMANCE EVALUATION

The performance of the iris recognition system as a whole is examined. Tests were carried out to find the best separation, so that the false match and false accept rates are minimized, and to confirm that iris recognition can perform accurately as a biometric for the recognition of individuals. As well as confirming that the system provides accurate recognition, experiments were also conducted to confirm the uniqueness of human iris patterns by deducing the number of degrees of freedom present in the iris template representation. The points which decide the performance of the system are: 1. False Acceptance Rate (FAR), 2. False Rejection Rate (FRR), 3. Equal Error Rate (EER), 4. Accuracy, 5. Decidability, 6. FAR and FRR as a function of Hamming distance.

Fig. 13. Database images vs. HD

Database image    Absolute difference
1.bmp             618
10.bmp            880
11.bmp            658
12.bmp            820
13.bmp            890
14.bmp            720
15.bmp            802
16.bmp            806
2.bmp             802
3.bmp             0
4.bmp             852
5.bmp             816
6.bmp             794
7.bmp             862
8.bmp             792
9.bmp             892

Performance evaluation for the circular and radial method:

1. False Acceptance Rate: the fraction of impostor comparisons that are falsely accepted, taken over the total number of impostor comparisons, is called the False Acceptance Rate (FAR):

    FAR = (number of times a different person is matched × 100) / (number of comparisons between different persons)          ...(5)

2. False Rejection Rate: the fraction of client comparisons that are falsely rejected, taken over the total number of client comparisons, is called the False Rejection Rate (FRR):

    FRR = (number of times the same person is rejected × 100) / (number of comparisons between the same person)          ...(6)

3. Decidability: the key objective of a recognition system is to achieve a distinct separation of the intra-class and inter-class Hamming distances. The separation between the inter-class and intra-class Hamming distance distributions can be measured by the decidability metric

    d' = |μ_intra − μ_inter| / √((σ²_intra + σ²_inter) / 2)

where μ and σ² are the means and variances of the two distributions. The higher the value of decidability, the better the performance of the system; it effectively determines the separation of FAR and FRR. Note that if the score distributions overlap, the FAR and FRR curves intersect at a certain point. The value of the FAR and the FRR at this point, which is of course the same for both, is called the Equal Error Rate (EER); in the graphs it is obtained for image 3.bmp, where the HD is the same for FAR and FRR (0.32778).

(Graph: database images vs. absolute difference.)

Fig. 16. Graph of decidability and EER
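To make the use of these metrics concrete, the following MATLAB sketch computes FAR, FRR, EER and decidability from two assumed vectors, genuineHD (intra-class) and impostorHD (inter-class) Hamming distances; the variable names and the threshold sweep are illustrative, not part of the paper's implementation.

```matlab
% Minimal sketch of the Section III metrics, assuming genuineHD and
% impostorHD are vectors of intra-class and inter-class Hamming
% distances collected from template comparisons.

thresholds = 0:0.01:1;
FAR = zeros(size(thresholds));
FRR = zeros(size(thresholds));
for k = 1:numel(thresholds)
    % Eq. (5): percentage of different-person comparisons accepted as matches.
    FAR(k) = 100 * sum(impostorHD <= thresholds(k)) / numel(impostorHD);
    % Eq. (6): percentage of same-person comparisons rejected as non-matches.
    FRR(k) = 100 * sum(genuineHD  >  thresholds(k)) / numel(genuineHD);
end

% Equal Error Rate: the point where the FAR and FRR curves cross.
[~, idx] = min(abs(FAR - FRR));
EER = (FAR(idx) + FRR(idx)) / 2;

% Decidability: separation of the intra- and inter-class HD distributions.
d = abs(mean(genuineHD) - mean(impostorHD)) / ...
    sqrt((var(genuineHD) + var(impostorHD)) / 2);

fprintf('EER = %.2f%% at HD threshold %.2f, decidability = %.2f\n', ...
        EER, thresholds(idx), d);
```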
IV. CONCLUSION AND FUTURE SCOPE

The iris recognition methods proposed in this paper employ iris feature extraction using a cumulative-sum-based change analysis and a radial and circular method. In order to extract iris features using the cumulative-sum-based change analysis, a normalized iris image is divided into basic cells. Iris codes for these cells are generated by the proposed code generation algorithm, which uses the cumulative sums of each cell. The method is relatively simple and efficient compared to other existing methods. Experimental results show that the implemented approach has good recognition performance and speed. In future, to make the system more robust and reliable, experiments on a larger iris database are needed. The performance of the second method is not encouraging, because the absolute difference between two templates is used for matching. This algorithm is also based on the result of edge detection, and edge detection algorithms are not robust to illumination variations in images: some edges cannot be detected if the image is taken under low illumination.

V. REFERENCES

[1] J. G. Daugman, "High Confidence Visual Recognition of Persons by a Test of Statistical Independence," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 15, No. 11, 1993, pp. 1148-1161.

[2] W. W. Boles and B. Boashash, "A Human Identification Technique Using Images of the Iris and Wavelet Transform," IEEE Trans. Signal Processing, Vol. 46, No. 4, 1998, pp. 1185-1188.

[3] Li Ma and T. Tan, "Personal Identification Based on Iris Texture Analysis," IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 25, No. 12, 2003.

[4] J. Matey, K. Hanna, R. Kolcyznski, D. LoIacono, S. Mangru, O. Naroditsky, M. Tinker, T. Zappia, and W-Y. Zhao, "Iris on the Move: Acquisition of Images for Iris Recognition in Less Constrained Environments," Proc. IEEE, Vol. 94, No. 11, pp. 1936-1947, Nov. 2006.

[5] J. G. Daugman, "How Iris Recognition Works," IEEE Trans. Circuits and Systems for Video Technology, Vol. 14, No. 1, pp. 21-30, Jan. 2004.