
2012 SLOVAK UNIVERSITY OF TECHNOLOGY

T. KLIMENT, M. TUCHYŇA, M. KLIMENT

METHODOLOGY FOR CONFORMANCE TESTING OF SPATIAL DATA INFRASTRUCTURE COMPONENTS INCLUDING AN EXAMPLE OF ITS IMPLEMENTATION IN SLOVAKIA

Vol. XX, 2012, No. 1, 10 – 20

KEY WORDS

SDI, INSPIRE, conformance testing, methodology, design, tools, reporting

ABSTRACT

Before any spatial data infrastructure (SDI) is implemented as fully operational, many relevant testing procedures should take place. Such procedures should evaluate the compliance level of particular SDI components against the relevant standards and implementing rules and hence ensure a high level of interoperability. Many testing activities have already been performed within the implementation of the European SDI (INSPIRE – Infrastructure for Spatial Information in the European Community). Nevertheless, a common and versatile testing methodology that could be used at any level of SDI realization is still lacking. This paper proposes a conformance testing methodology for selected SDI components accessible via network services for the discovery, view and download of data. An example of its implementation has taken place within an environmental SDI developed by the Slovak Environmental Agency. A testing report template summarizing the results of the tests is proposed as a common template on the national level to be used within the implementation of the National Infrastructure for Spatial Information in the Slovak Republic.

Tomáš KLIMENT
email: [email protected]
Research field: Discovery of Geospatial Resources on the Web, Spatial Data Infrastructures (SDI) – Metadata, Geospatial Web Services, Infrastructure for Spatial Information in Europe (INSPIRE)
Address: Department of Theoretical Geodesy, Faculty of Civil Engineering, Slovak University of Technology in Bratislava, Radlinského 11, 813 68 Bratislava, Slovakia

Martin TUCHYŇA
email: [email protected]
Research field: Spatial Data Infrastructures (SDI) – Spatial Data Modelling, Data Quality, Infrastructure for Spatial Information in Europe (INSPIRE) – Data Specification Development and Testing
Address: Department of Information Systems, Slovak Environmental Agency in Banská Bystrica, Tajovského 28, 975 90 Banská Bystrica, Slovakia

Marcel KLIMENT
email: [email protected]
Research field: Digital Photogrammetry Applications for Spatial Data Capture, Surface Modelling, Remote Sensing
Address: Department of Landscape Planning and Ground Design, Slovak University of Agriculture in Nitra, Tulipánová 7, 949 76 Nitra, Slovakia





1. INTRODUCTION

Before any component of a spatial data infrastructure (SDI) can be provided for an operational phase via a Geoportal acting as a gateway to the SDI (Giff, et al., 2008), several testing procedures should be performed against predefined rules to ensure the required interoperability level. Interoperability in this context is technically ensured by international standards (ISO-TC211), implementation specifications (OGC) and binding legislative documents on the European (INSPIRE 2007, 2008, 2009a, 2009b, 2010a, 2010b and 2010c) and national (Zákon 2010) levels. The fundamental components of an SDI are naturally spatial data (SD), including their description, known as metadata. These components should be provided to the Geoportal (for instance, the INSPIRE Geoportal on the European level, or EnviroGeoPortál on the Slovak level for the environmental domain) by network services (NS).

The implementation phase of the INSPIRE directive began in 2010, when the first technical components were defined in order to be reported (INSPIRE Roadmap). This milestone launched activities within the European Union (EU) and member state (MS) stakeholder communities. Collaboration between the Slovak Environmental Agency (SEA) and the Slovak University of Technology (SUT) has been established to reflect the implementation process. This collaboration was extended to the Slovak University of Agriculture (SUA) in 2011 and currently also represents Slovakia in the PTB (Persistent SDI Test Bed for Research and Education) European research community initiative. The first results gained from this collaboration were presented at the AGILE 2011 international forum within a programme workshop devoted to testing SDI components (Kliment, et al., 2011b).

The fundamental objective of this paper is to propose a testing methodology for SDI components accessible via network services and to show an example of its implementation within the national SDI in Slovakia. Section 2 introduces a technical overview of SDI components and the relations among them; it then describes the proposed testing methodology and a design of the testing tool, and provides a description of the tools developed and a sample testing of the SEA components. The results in Section 3 present a summary of the findings and propose a template for reporting them. Section 4 concludes the paper and points out future research work.

2. MATERIALS AND METHODS

The issue of testing SDI components has recently been recognized as very important due to the implementation activities launched by the INSPIRE directive milestones. Many testing activities have been realized so far, and various works (Horak, et al., 2009; Ardielli, et al., 2009, 2011; Cibulka, 2011) are related to testing the view or also the discovery (Kliment, et al., 2011a) services according to the INSPIRE requirements. The extent of this testing did not have a complex character and mainly provided information about the quality of the service; the service's interface and content had not been tested, or if so, only at a minimal level. Other work is devoted to the transformation of current data models into the INSPIRE application models (Östman, et al., 2010). The ESDIN project proposed a testing environment that enables testing either NS or SD models (Portele, et al., 2011). The Nature-SDIplus project provides an INSPIRE validation briefcase for nature conservation data, which introduces a methodology focused on data and metadata validation (Ford, et al., 2011). However, none of the above-cited works focus on a testing methodology for an entire SDI framework. Therefore, this paper concentrates on such a methodology in the following sections.

2.1 OVERVIEW OF SDI COMPONENTS FROM A TECHNICAL PERSPECTIVE

Various resources such as books, papers and cookbooks nowadays provide detailed information about SDI (Aalders, 2001; Rajabifard, 2001; Masser, 2005; Nebert, 2009). It is important to mention at the beginning that besides SDI, many other terms are now used within the geospatial information (GI) domain. Some authors use the term "Geospatial Data Infrastructure" (Brox, et al., 2002; Woldai, 2002; Groot, et al., 2000). Others use "Geospatial Information Infrastructure" (Diaz, 2010). In Slovakia, the term "Infrastructure for Spatial Information" has been established following the national legislation devoted to GI, which is mainly represented by the act on the National Infrastructure for Spatial Information (Zákon, 2010). The description of an SDI differs slightly among the above-mentioned works, but they all represent the same conceptual framework; hence we will use this term for the rest of the paper. In principle, an SDI defines a framework to facilitate and coordinate data exchange and sharing among stakeholders from various domains. An SDI is much more than just the data and goes far beyond surveying and mapping. It provides an environment within which organisations and/or nations interact with technologies to foster activities for producing, managing and using geospatial data (Rajabifard, 2001). From a technical point of view, an SDI is an infrastructure that serves spatial data and their metadata (data layer) to clients (client layer) via network services (service layer) (Figure 1).

Fig. 1 SDI components from a technical perspective.

The service layer of an SDI defines the following types of network services:

• Discovery service – serves for data discovery based on metadata. The user specifies the search criteria (keywords, spatial/temporal extent, topic category, etc.); the service searches the records in the catalogue (Kliment, 2010c) and returns the records matching the criteria. The user evaluates the data concerning its quality, lineage, resolution, use constraints, access restrictions, etc., based on the metadata returned. Finally, a dataset is chosen for further access and use.

• View service – serves to portray the data as a raster set of images (maps) and provides such basic functions as zoom, pan, and legend portrayal.

• Download service – serves for downloading data to local storage in the selected data exchange format.

• Transformation service – serves for data transformation between coordinate systems or among different data models.

• Processing services – serve for invoking various geo-processes, such as spatial analysis (buffer, intersect, union, etc.), or for chaining several network services.
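A discovery service of the kind described above is commonly exposed as an OGC CSW 2.0.2 catalogue endpoint. As a minimal sketch of what a discovery client sends, the following builds the XML body of a GetRecords request that searches catalogue metadata by a keyword; the keyword value is illustrative and not taken from the paper.

```python
# Sketch: build an OGC CSW 2.0.2 GetRecords request body, as a
# discovery-service client would POST it to a catalogue endpoint.
# The keyword below is illustrative only.
from xml.etree import ElementTree as ET

CSW = "http://www.opengis.net/cat/csw/2.0.2"
OGC = "http://www.opengis.net/ogc"

def build_getrecords(keyword: str) -> bytes:
    """Return the XML body of a CSW GetRecords POST request."""
    ET.register_namespace("csw", CSW)
    ET.register_namespace("ogc", OGC)
    root = ET.Element(f"{{{CSW}}}GetRecords", {
        "service": "CSW",
        "version": "2.0.2",
        "resultType": "results",
    })
    query = ET.SubElement(root, f"{{{CSW}}}Query", {"typeNames": "csw:Record"})
    ET.SubElement(query, f"{{{CSW}}}ElementSetName").text = "summary"
    constraint = ET.SubElement(query, f"{{{CSW}}}Constraint", {"version": "1.1.0"})
    flt = ET.SubElement(constraint, f"{{{OGC}}}Filter")
    # Full-text search on the AnyText queryable, wrapped in wildcards.
    like = ET.SubElement(flt, f"{{{OGC}}}PropertyIsLike",
                         {"wildCard": "%", "singleChar": "_", "escapeChar": "\\"})
    ET.SubElement(like, f"{{{OGC}}}PropertyName").text = "AnyText"
    ET.SubElement(like, f"{{{OGC}}}Literal").text = f"%{keyword}%"
    return ET.tostring(root, encoding="utf-8", xml_declaration=True)

body = build_getrecords("hydrography")
```

The response, a csw:GetRecordsResponse document, would then carry the metadata records the user evaluates before choosing a dataset.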

The technical components described above are standardized by a series of standards and implementation specifications both within the GI domain (ISO-TC211, OGC) and the Web domain (W3C). Further non-technical SDI components are the data-sharing agreements, which play a very important role and are defined within INSPIRE (INSPIRE 2010c). Continuous monitoring of measurements is a component that implies reporting whether particular components fulfil the requirements or not (INSPIRE 2009b). The proper deployment of SDI components, together with well-established procedures for further maintenance and development, can ensure the achievement of a high level of interoperability and attract other stakeholders to enlarge the SDI beyond the public sector domain (Proposal for INSPIRE Maintenance and Implementation 2011). To ensure this, relevant tests should be performed before the operational phase of the infrastructure is launched.

2.2 TESTING METHODOLOGY FOR SDI COMPONENTS

The proposed methodology is based on a user scenario where the testing tool is a remote SDI client accessing data and metadata via network services. This means that the conformity of the network services is tested first and then the content itself (data and metadata). The following points of the testing methodology were proposed for the purposes of testing SDI components against legislation (INSPIRE) and related standardization (ISO, OGC):

• Testing requirements – analysis of the requirements defined in the related documentation; e.g., within the INSPIRE framework, this comprises a study of the INSPIRE directive, the regulations for individual components of the infrastructure, and the technical guidelines. Obviously, the related ISO standards and the OGC implementation specifications have to be analysed, since they define the base for INSPIRE.

• Testing scope – whether to perform a complex testing model that tests all the requirements defined in the previous point – the service interface (operations and the parameters of the request and response validations), the quality of the service (QoS) (evaluating the defined quality parameters), the service content (data and metadata validation) and other criteria – or to perform a partial testing model based on definitions driven by the tester.

• Temporal extent – defines the time span and repetition frequency of the tests, which might be either long-term testing in order to evaluate the QoS precisely, or short-term testing for a fast initial overview of the service interface and the validity of its content.

• Testing scenarios – conceptual design (a text description, business process models (in Business Process Modelling Notation (BPMN)) or activity diagrams (in Unified Modelling Language (UML) notation)) and implementation of the individual testing scenarios, depending on the testing environment. It would be ideal to have a common tool that would enable execution of all the tests.

• Testing execution and reporting results – includes the execution of the individual scenarios defined in the previous point and the generation of testing reports. It is important to define the structure of the report, because it is the main objective of the testing. For instance, such information as a description of the testing scenarios, information about the tester and the testing environment, and the main testing results (e.g. Passed/Failed, Passed at 54%, etc.) should be included.
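The last point's summary values ("Passed", "Failed", "Passed at 54%") can be derived mechanically from the per-check outcomes of a scenario. A minimal sketch, with illustrative outcome data:

```python
# Sketch: aggregate individual test outcomes into the summary verdicts the
# methodology suggests for a report ("Passed", "Failed", "Passed at 54%").
# The outcome list below is illustrative.

def summarize(outcomes: list[bool]) -> str:
    """Reduce per-check pass/fail flags to a single report verdict."""
    if not outcomes:
        return "Not tested"
    passed = sum(outcomes)
    if passed == len(outcomes):
        return "Passed"
    if passed == 0:
        return "Failed"
    return f"Passed at {round(100 * passed / len(outcomes))}%"

# 13 of 24 illustrative checks succeed.
verdict = summarize([True] * 13 + [False] * 11)
```

Keeping the raw outcome flags alongside the verdict makes the reports comparable across services and test runs.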

2.3 DESIGN OF THE TESTING ENVIRONMENT

The proposed testing environment is designed using UML use case diagram notation (Kanisova, et al., 2004). The individual types of actors and the corresponding use cases are shown in Figure 2.


The use case diagram defines three roles, i.e., the SDI component tester, the SDI component provider, and the SDI component user. The SDI component user role might be anyone who needs to use SDI components (i.e., a ministry, an agency or a citizen). Under certain circumstances, the SDI component tester role and the SDI component provider role may be assigned to one person. The model defines the following use cases:

• setupTestingScenario – the SDI component tester develops particular testing scenarios in order to define parameters for the individual requirements to be tested. An example is the search criteria test for the INSPIRE discovery service, where the testing environment has to send a series of DiscoverMetadata (INSPIRE 2009a) requests with a particular search criterion against the metadata in the catalogue and validate the server response.

• selectTestingScenario – this use case includes the defineTestParameters use case for the SDI component provider and tester. It defines the main parameters necessary for the testing, such as the service URL definition, the temporal coverage of the testing, email notification and others.

• executeTest – the SDI component tester or SDI component provider role launches the test defined in the previous use cases. The notifyByEmail use case may provide notifications about both the beginning and the completion of the tests.

• reportResults – the SDI component tester browses all the results, including the technical details (the content of responses with errors, etc.), and defines the final structure of the report with regard to its receiver (SDI component provider or user).

Fig. 2 UML use case diagram of the testing environment.


• publishReport – the SDI component provider publishes a report in order to promote the results to a third party (SDI component user).

• displayResults – all the roles can display the report within the GUI application and in the structure(s) defined within the reportResults use case.

• downloadReport – all the roles can download the report in a predefined format (e.g. HTML, XML, or PDF).

• printReport – the SDI component provider and user print out the testing report.

• communicate – the SDI component tester and SDI component provider communicate mutually, using the communication interface for consultations related to the configuration, execution, results and reporting of the individual tests.

• searchReport – the SDI component user queries the testing tool database in order to display the particular testing report of the SDI component she wants to use.

2.4 TESTING TOOLS

For the testing purposes based on the above-presented methodology, testing tools currently available on the scene as well as tools developed in-house at the Department of Theoretical Geodesy (DTG) of SUT were used. The external tools used for testing the discovery and view services are described in detail in the publications (Kliment, 2010b; Kliment, et al., 2011a).

This section aims to provide an overview of the tools developed at DTG. The first one, webtest v. 1.0 (Fig. 3), is written in the Java programming language and also uses Java Server Pages (JSP) technology as a fast way to create dynamic web content. The webtest provides the functionality of sending single, multiple and simultaneous Hypertext Transfer Protocol (HTTP) GET and POST requests and takes measurements of two types of response times:

• the time between sending the request and the first downloaded byte of the response – the INSPIRE response time (INSPIRE 2009a)

• the time between sending the request and the last downloaded byte of the response
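The distinction between the two measured times can be sketched as follows; to stay self-contained, the sketch reads from a simulated in-memory byte stream rather than a live HTTP connection, but the timing logic is the same idea the webtest applies.

```python
# Sketch: the two response times the webtest records, measured here over a
# simulated byte stream instead of a live HTTP response (no network needed).
import io
import time

def timed_read(stream, chunk_size: int = 1024):
    """Return (time_to_first_byte, time_to_last_byte, payload)."""
    start = time.perf_counter()
    first_byte_at = None
    chunks = []
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        if first_byte_at is None:
            # "INSPIRE response time": request sent -> first byte received.
            first_byte_at = time.perf_counter() - start
        chunks.append(chunk)
    # Full download time: request sent -> last byte received.
    total = time.perf_counter() - start
    return first_byte_at, total, b"".join(chunks)

ttfb, total, payload = timed_read(io.BytesIO(b"<Capabilities/>" * 100))
```

Against a real service, the gap between the two values reflects how long the payload takes to stream after the server starts answering.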

The tool provides a simple graphical user interface (GUI), where the individual testing scenarios may be imported either by copying and pasting the XML fragment defining the request, or by directly uploading the entire XML configuration file containing the complex testing scenario. The webtest is a web tool; thus, any user may use it remotely in a web browser without installing any additional software on their computer. Another useful function is to check whether a predefined string appears in the response or not; if so, its appearance is reported. This function is used to identify the expected results using, for instance, a character string within the unified resource identifier (URI).

Fig. 3 GUI testing tool webtest – simple HTTP POST request/response.

The testing results are shown in a table with the following information: the request identifier, the response times as described above, and the number of strings discovered in the response. An example of the table can be seen in the results section. The tool provides neither graphic statistics in the form of plots, as they would be used for long-term testing, nor storage for the testing results. Nevertheless, this is part of the on-going development of webtest v. 2.0, which is planned to be finalized during the first half of the year 2012 (Cibulka, 2012).

The second tool is the MDValidator (Fig. 4), which performs batch metadata validation against the INSPIRE requirements (INSPIRE, 2008). The main reason for developing such a tool was its absence among the metadata tools available on the scene. The MDValidator incrementally reads metadata in the XML format from a local directory and sends them to an online validation service (INSPIRE validation service). The result is a report in the HTML/XML format (Fig. 5), which summarizes the validation results – the name of the validated metadata record and its path, and the number and listing of the correct/incorrect elements found.

Fig. 4 GUI validation tool MDValidator – batch validation execution.

The validation service does not perform a validation against the ISO metadata schema (ISO 2003, 2007), which is actually its drawback. This is an important aspect to consider, because if a metadata record is valid against INSPIRE, it does not have to be valid against the ISO schema, and vice versa.
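The batch-processing pattern behind the MDValidator can be sketched as follows. This is not the MDValidator itself: instead of calling the remote INSPIRE validation service, the simplified local check merely looks for a couple of ISO 19139 element names, whereas a real validator checks far more.

```python
# Sketch: batch-reading metadata XML records from a directory, in the spirit
# of the MDValidator. The local "check" here is a deliberately simplified
# stand-in for the remote INSPIRE validation service: it only tests for the
# presence of two ISO 19139 elements.
import pathlib
import tempfile
from xml.etree import ElementTree as ET

GMD = "http://www.isotc211.org/2005/gmd"
REQUIRED = [f"{{{GMD}}}fileIdentifier", f"{{{GMD}}}contact"]

def check_record(path: pathlib.Path) -> tuple[str, list[str]]:
    """Return the file name and the required elements that are missing."""
    root = ET.parse(path).getroot()
    missing = [tag for tag in REQUIRED if root.find(f".//{tag}") is None]
    return path.name, missing

# Illustrative run over a throwaway directory with one incomplete record.
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "record1.xml").write_text(
    f'<gmd:MD_Metadata xmlns:gmd="{GMD}">'
    f"<gmd:fileIdentifier/></gmd:MD_Metadata>"
)
report = [check_record(p) for p in sorted(tmp.glob("*.xml"))]
```

The incremental, per-file loop is what makes batch validation of a whole catalogue export practical.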

2.5 OVERVIEW OF THE SDI COMPONENTS PROVIDED BY SEA

The SEA provides spatial data and metadata via the network services listed in Tab. 1 for INSPIRE and the national infrastructure. These spatial data refer to the themes defined in the annexes of the INSPIRE directive (INSPIRE, 2007) and have been provided by the SEA for testing purposes:

• Annex I – metadata and spatial data for the INSPIRE spatial data themes 8. Hydrography and 9. Protected Sites.

• Annex II – metadata and spatial data for the INSPIRE spatial data theme 2. Land Cover.

• Annex III – spatial data for the INSPIRE themes 17. Biogeographical Regions, 18. Habitats and Biotopes and 19. Species Distribution.

These services will be provided to the national infrastructure and connected via the operational National Geoportal in Slovakia. The user will be able to search metadata and to evaluate and use spatial data via this central point of the national infrastructure under predefined conditions.

2.6 TESTING METHODOLOGY - SAMPLE IMPLEMENTATION

The initial testing of the discovery service provided by the SEA was performed in the year 2010, and the results were presented in (Kliment 2010a; Kliment 2010b). This subsection describes the realization of the testing of the discovery, view and download services introduced in the previous subsection. The main aim is to demonstrate a sample implementation of the proposed methodology under real-world conditions.

Testing the discovery service

• Testing requirements – INSPIRE directive framework – the implementing rules for network (discovery) services (INSPIRE 2009a) define four operations for the discovery service with the related parameters, 27 search criteria and three QoS parameters; the implementing rules for metadata (INSPIRE 2008) define 19 metadata elements (16 mandatory and 3 conditional) for dataset and series resources and 17 elements (13 mandatory and 4 conditional) for services. The technical guidance for the implementation of INSPIRE discovery services (INSPIRE 2011a) might assist in a better understanding of the requirements from a technical perspective.

• Testing scope – three operations have been tested (GetDiscoveryServiceMetadata, DiscoverMetadata and PublishMetadata); 23 search criteria and 2 quality parameters (performance and capacity) have been tested. The metadata has been validated against the INSPIRE (INSPIRE validation service) and ISO (ISO 2007) schemas.

Fig. 5 Fragment of validation report from the MDValidator.

Tab. 1 List of network services provided by SEA to INSPIRE and the national infrastructure.

NETWORK SERVICE TYPE | INSPIRE ANNEX I               | INSPIRE ANNEX II              | INSPIRE ANNEX III
DISCOVERY SERVICE    | YES (terraCatalog CSW 2.0.2)  | YES (terraCatalog CSW 2.0.2)  | NOT YET
VIEW SERVICE         | YES (ArcGIS Server WMS 1.3)   | YES (ArcGIS Server WMS 1.3)   | YES (ArcGIS Server WMS 1.3)
DOWNLOAD SERVICE     | YES (ArcGIS Server WFS 1.1)   | YES (ArcGIS Server WFS 1.1)   | YES (ArcGIS Server WFS 1.1)

• Temporal extent – short-term testing; every testing scenario was executed repeatedly within a short period (3 days).

• Testing scenarios – (1) a scenario configured for the testing of the search criteria together with the DiscoverMetadata operation and the performance QoS parameter; (2) a scenario for testing the combinations of the operations defined in the scope and the capacity QoS parameter; (3) a scenario for the testing of the PublishMetadata operation (Push and Pull mechanisms as defined in INSPIRE, 2009).

• Testing tools – webtest and MDValidator.

Testing the view service

• Testing requirements – INSPIRE directive framework – the implementing rules for the network services (INSPIRE 2009b) define three operations for the view service with the related parameters and three QoS parameters. The technical guidance for the implementation of the INSPIRE view services (INSPIRE 2011b) might assist in a better understanding of the requirements from a technical perspective.

• Testing scope – two operations have been tested (GetViewServiceMetadata and GetMap) and two quality parameters (performance and capacity). The request and response parameters have not been tested.

• Temporal extent – short-term testing; every testing scenario was executed repeatedly within a short period (3 days).

• Testing scenarios – (1) a scenario configured for testing the GetMap operation in combination with all the layers provided by the SEA view services and the performance QoS parameter; (2) a scenario for testing the combinations of the operations defined in the scope and the capacity QoS parameter.

• Testing tools – webtest.
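A GetMap scenario of the kind described above ultimately issues requests such as the one composed below, one per layer under test. The endpoint and layer name are placeholders, not the actual SEA service.

```python
# Sketch: composing a WMS 1.3.0 GetMap KVP request of the kind a
# view-service test scenario sends for each layer. The base URL and
# layer name are placeholders.
from urllib.parse import urlencode

def getmap_url(base: str, layer: str, bbox: tuple, size=(800, 600)) -> str:
    """Build a GetMap request for one layer in EPSG:4326."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": "EPSG:4326",
        # Note: WMS 1.3.0 uses the CRS-defined axis order (lat,lon for 4326).
        "BBOX": ",".join(str(c) for c in bbox),
        "WIDTH": str(size[0]),
        "HEIGHT": str(size[1]),
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

url = getmap_url("https://example.org/wms", "ProtectedSites",
                 (47.7, 16.8, 49.6, 22.6))
```

The tester then times the request and checks the response (an image, or a service exception report on failure).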

Testing the download service

• Testing requirements – INSPIRE directive framework – the implementing rules for network services (INSPIRE 2010b) define four operations for the pre-defined dataset download service and another two direct-access download operations, plus their parameters and three QoS parameters. The service content (data) is subject to the implementing rules for spatial data sets (INSPIRE 2010a) and to the technical guidance and Geography Markup Language (GML) schemas for the individual INSPIRE spatial themes.

• Testing scope – three operations have been tested (GetDownloadServiceMetadata, GetSpatialDataSet and DescribeSpatialDataSet) and two network service quality parameters (performance and capacity). The request and response parameters have not been tested. The data has not been validated against the INSPIRE schemas developed at this phase.

• Temporal extent – short-term testing; every testing scenario was executed repeatedly within a short period (3 days).

• Testing scenarios – (1) a scenario configured for testing the GetSpatialDataSet operation in combinations with the data provided by the SEA download services (Table 1) and the performance and capacity QoS parameters; (2) a scenario for testing the combinations of the operations defined in the scope above and the performance QoS parameter.

• Testing tools – webtest.
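Since the SEA download services are backed by WFS 1.1 (Tab. 1), a data-download test request can be sketched as a standard WFS GetFeature call; the INSPIRE GetSpatialDataSet operation is mapped onto such requests by the technical guidance. The endpoint and feature type name below are placeholders.

```python
# Sketch: a WFS 1.1.0 GetFeature KVP request of the kind used when testing
# data download from a WFS-backed download service. The base URL and
# type name are placeholders, not the actual SEA service.
from urllib.parse import urlencode

def getfeature_url(base: str, type_name: str, max_features: int = 10) -> str:
    """Build a GetFeature request limited to a few features for a quick test."""
    params = {
        "SERVICE": "WFS",
        "VERSION": "1.1.0",
        "REQUEST": "GetFeature",
        "TYPENAME": type_name,
        # Cap the result size so short-term tests stay fast and comparable.
        "MAXFEATURES": str(max_features),
    }
    return base + "?" + urlencode(params)

url = getfeature_url("https://example.org/wfs", "app:Habitats")
```

The returned GML feature collection is what a later phase would validate against the INSPIRE application schemas.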

3. RESULTS

The testing results provided by the webtest and the MDValidator are indeed not very user friendly. An example of the results table created by the webtest is shown in Figure 6. This table displays the results from testing scenario (1) of the discovery service testing.

Fig. 6 webtest testing report fragment – discovery service, scenario (1).

The results table provides the IDs of the individual requests (second column from the left) sent within the scenario, the response times in milliseconds (third and fourth columns from the left), and the number of predefined strings discovered in the responses (the right column). This table provides raw results that would probably not say very much to the SDI component user. Therefore, a report template which aggregates and summarizes all the information in a more humanly readable and understandable way should be proposed and implemented. As a result of this work, the following structure has been proposed (Table 2) and filled in with the results gained within this implementation example. The main aim is to create a template to be used within the national infrastructure.

The proposed template consists of a table header defining the tested network services and four main blocks according to the testing scope defined within the methodology. The first block provides the definition of the interface testing and the results for the services tested. The second block reports the QoS, and the third reports any other criteria defined within the requirements. The last, but not least, block reports on the validity of the service content. The values Supported, Supported/Parameters Tested and Not Tested are used to evaluate the service interface part. The values Satisfied, X% Satisfied and Not Tested are used for the QoS assessment. Finally, the values Supported and Not Tested are used for the other criteria, and Valid, Invalid and Not Tested for the service content validation.

Tab. 2 Testing results – proposed template structure for the report.

TESTED SERVICE            | DISCOVERY SERVICE                | VIEW SERVICE                     | DOWNLOAD SERVICE

INTERFACE (OPERATIONS & PARAMETERS) – RESULTS
GetNetworkServiceMetadata | Supported, parameters not tested | Supported, parameters not tested | Supported, parameters not tested
DiscoverMetadata          | Supported, parameters not tested |                                  |
PublishMetadata           | Supported, parameters not tested |                                  |
LinkService               | Not tested                       | Not tested                       | Not tested
GetMap                    |                                  | Supported, parameters not tested |
GetSpatialDataSet         |                                  |                                  | Supported, parameters not tested
DescribeSpatialDataSet    |                                  |                                  | Supported, parameters not tested
GetSpatialObject          |                                  |                                  | Not tested
DescribeSpatialObject     |                                  |                                  | Not tested

QUALITY OF SERVICE – RESULTS
Performance               | Satisfied (115 requests sent, 115 responses < 3 s) | 90% Satisfied (30 requests sent, 27 responses < 5 s) | Satisfied (30/10 requests sent, 30/10 responses < 10/30 s)
Capacity                  | Satisfied (30 simultaneous requests sent, 30 responses < 3 s) | 70% Satisfied (20 simultaneous requests sent, 14 responses < 5 s) | Satisfied (10 simultaneous requests sent, 10 responses < 30 s)
Availability              | Not tested                       | Not tested                       | Not tested

OTHER CRITERIA – RESULTS
Search criteria           | Supported (tested 23/27)         |                                  |
Search criteria for the GetSpatialObject operation |         |                                  | Not tested

SERVICE CONTENT – RESULTS
Metadata models           | Valid (161 MD records validated against INSPIRE and ISO) |          |
Data models               |                                  |                                  | Not tested
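The QoS verdicts used above ("Satisfied", "90% Satisfied", "Not tested") can be computed directly from the raw response times and the relevant threshold. A minimal sketch, with illustrative sample times matching the view-service row of Tab. 2:

```python
# Sketch: deriving the QoS verdicts used in Tab. 2 ("Satisfied",
# "90% Satisfied", "Not tested") from raw response times and a threshold.
# The sample response times below are illustrative.

def qos_verdict(response_times_s: list[float], threshold_s: float) -> str:
    """Share of responses under the threshold, phrased as in the report."""
    if not response_times_s:
        return "Not tested"
    ok = sum(1 for t in response_times_s if t < threshold_s)
    share = round(100 * ok / len(response_times_s))
    return "Satisfied" if share == 100 else f"{share}% Satisfied"

# 27 of 30 illustrative view-service responses arrive under the 5 s threshold.
times = [1.2] * 27 + [6.5] * 3
verdict = qos_verdict(times, 5.0)
```

Computing the verdict from stored raw measurements, rather than recording only the verdict, keeps the reports reproducible and comparable between test runs.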

4. CONCLUSIONS AND FUTURE WORK

The work presented within this paper provides a proposal for a methodology to verify the conformity of existing SDI components with the requirements defined for the SDIs established on the European level. An example of the implementation of the proposed methodology has been provided as well. Nevertheless, this example was not meant to constitute comprehensive testing of the SEA network services, but rather to check the feasibility of the proposed methodology. The report structure has been proposed accordingly. Moreover, the collaboration between public and academic sector organisations has proved to be very important for reaching a common goal – interoperable access to spatial information among the stakeholders of the entire society. The methodology and the example of its implementation showed the demands for appropriate knowledge and understanding of all the requirements defined by the framework establishing an SDI. The preparation of the testing scenarios is a very important part of the methodology and is directly related to the testing coverage. It has been discovered that preparing them in a proper way consumes a significant amount of time. The testing results should be easily interpreted and aggregated in an understandable way. Moreover, if possible, various appropriate levels of compliance should be introduced. The results presented confirm the merits of an initiative such as the Persistent Test Bed (PTB) and, at the same time, provide motivation for future research work. We would like to address future work through the following:

• Discussions, proposals, and suggestions on the presented testing methodology within the expert groups dealing with testing on both the European and national levels.

• Testing scenario extensions by the following: (1) Covering all the requirements defined by INSPIRE, or by specific national infrastructure requirements in the future. (2) Validating the parameters of the operations against INSPIRE's specific constraints (e.g., an INSPIRE network service shall provide INSPIRE-compliant metadata within the GetNetworkServiceMetadata response, the language parameter, layer metadata for the view service, spatial data set metadata for the download service (INSPIRE 2009a, 2010b), etc.). (3) Preparation and execution of long-term tests for estimating accurate service quality parameters. (4) Documentation of individual scenarios on a conceptual level (UML activity or BPMN diagrams, text descriptions).

• Testing report template - discussions and decisions on the final form and content of the reports with a focus on their target purpose (e.g., a tabular form with information such as the date, test description, test execution, results, pass/fail definitions, comments, and the comparability of reports).

• The extension of the testing tool by the following: (1) Functions for report exports, statistical calculations and plots. (2) Implementation of testing scenario series as complex tests (e.g., an INSPIRE discovery service testing scenario). (3) Results stored in the form of a database to avoid the loss of results in long-term testing and to support querying functionality.

• The testing of the service content: (1) Validating the metadata against the ISO metadata model (ISO 19115, 19139) and the specific constraints of the INSPIRE metadata regulation. (2) Validating spatial data against the GML application schemas defined within particular INSPIRE data specifications.
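As an illustration of point (1), a metadata record can be pre-checked for the presence of mandatory ISO 19115/19139 elements before full schema validation against gmd.xsd [40]. The sketch below uses only Python's standard library; the element subset, XPath expressions and function name are illustrative assumptions, and a real conformance test would validate against the complete schemas and the constraints of the INSPIRE metadata regulation (INSPIRE 2008):

```python
import xml.etree.ElementTree as ET

# Standard ISO 19139 namespaces used by gmd metadata documents.
NS = {
    "gmd": "http://www.isotc211.org/2005/gmd",
    "gco": "http://www.isotc211.org/2005/gco",
}

# Illustrative subset of elements required by the INSPIRE metadata
# regulation; a real test suite would cover the full regulation.
MANDATORY = {
    "title": ".//gmd:title/gco:CharacterString",
    "abstract": ".//gmd:abstract/gco:CharacterString",
    "contact": ".//gmd:contact",
}

def missing_elements(xml_text):
    """Return the names of mandatory elements absent from a metadata record."""
    root = ET.fromstring(xml_text)
    return [name for name, xpath in MANDATORY.items()
            if root.find(xpath, NS) is None]
```

A record lacking an abstract and a contact point would be reported as missing those two elements, allowing a quick Valid/Invalid pre-screening before the heavier schema validation step.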

New legislative requirements as well as user requirements in the field of GI bring about new responsibilities as well as new challenges and space for the exchange of know-how in order to achieve their fulfilment. An important part includes efforts to promote testing (increasing awareness). Moreover, a proposal for the establishment of a common testing platform allowing the sharing of tools, materials, methodologies, experience and expertise related to testing SDI components is very important. Searching for solutions to particular tasks and open issues connected with building information infrastructures provides new opportunities for collaboration among bodies. This opens up new possibilities for establishing more intensive communication among bodies that have previously not been cooperating. The authors of this paper believe that the experience presented here will also provide inspiration for the extension of such collaborations or eventually for the creation of similar initiatives in other domains at several levels, from the national and regional to the international one.


REFERENCES

[1] Aalders, H. J. G. L., Moellering, H. (2001). Spatial Data Infrastructure.

[2] Ardielli, J., Horak, J., Ruzicka, J. (2011). View Service Quality Testing According to INSPIRE Implementing Rules. Elektronika ir elektrotechnika. Kaunas: Technologija, Lithuania, 2011 – 6 pp. In print.

[3] Ardielli, J., Horak, J. (2009). The Testing of the Web Services Accessibility of CENIA Company. Proceedings of GIS Ostrava 2009. VŠB-TU, Ostrava, Czech Republic, 2009 – 8 pp. ISBN 978-80-87294-00-0.

[4] Brox, C., Bishr, Y., Kuhn, W., Senkler, K., Zens, K. (2002). Toward a Geospatial Data Infrastructure for North Rhine-Westphalia. Computers, Environment and Urban Systems, pp. 19-37.

[5] Cibulka, D. (2011). Testing of the view services. Proceedings of Juniorstav 2011 – VUT Brno, Czech Republic, 2011 – 10 pp. ISBN 978-80-214-4232-0 (in Slovak).

[6] Cibulka, D. (2012). Publication of selected geodata in a web environment. Proceedings of Juniorstav 2012 – VUT Brno, Czech Republic, 2012 – 6 pp. ISBN 978-80-214-4393-8 (in Slovak).

[7] Díaz, L. (2010). Improving resource availability for Geospatial Information Infrastructures. Dissertation thesis. University Jaume I of Castellón, Spain, 2010.

[8] Ford, M., Martirano, G., Schleidt, K., Vinci, F. (2011). An INSPIRE validation briefcase for nature conservation and beyond. INSPIRE Conference 2011, Edinburgh, Scotland, 2011.

[9] Giff, G., Van Loenen, B., Crompvoets, J., Zevenbergen, J. (2008). Geoportals in Selected European States: A Non-Technical Comparative Analysis. GSDI-10, St. Augustine, Trinidad, 2008 – 14 pp.

[10] Groot, R., McLaughlin, J. (eds.) (2000). Geospatial Data Infrastructure: Concepts, cases and good practice. Oxford University Press, 2000.

[11] Horak, J., Ardielli, J., Horakova, B. (2009). Testing of Web Map Services. International Journal of Spatial Data Infrastructures Research, Rotterdam, 2009 – 19 pp. ISSN 1725-0463.

[12] INSPIRE (2007). Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE).

[13] INSPIRE (2008). Commission Regulation (EC) No 1205/2008 of 3 December 2008 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards Metadata.

[14] INSPIRE (2009a). Commission Regulation (EC) No 976/2009 of 19 October 2009 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Network Services.

[15] INSPIRE (2009b). Commission Decision regarding INSPIRE monitoring and reporting.

[16] INSPIRE (2010a). Commission Regulation (EU) No 1089/2010 of 23 November 2010 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Interoperability of Spatial Data Sets and Services.

[17] INSPIRE (2010b). Commission Regulation (EU) No 1088/2010 of 23 November 2010 amending Regulation (EC) No 976/2009 as regards Download Services and Transformation Services.

[18] INSPIRE (2010c). Commission Regulation (EU) No 268/2010 of 29 March 2010 implementing Directive 2007/2/EC of the European Parliament and of the Council as regards the Access to Spatial Data Sets and Services of the Member States by Community Institutions and Bodies under Harmonised Conditions.

[19] INSPIRE (2011a). Technical Guidance for the Implementation of INSPIRE Discovery Services.

[20] INSPIRE (2011b). Technical Guidance for the Implementation of INSPIRE View Services.

[21] ISO (2003). ISO 19115:2003 – Geographic information – Metadata. ISO, Switzerland, 2003.

[22] ISO (2007). ISO/TS 19139:2007 – Geographic information – Metadata – XML schema implementation. ISO, Switzerland, 2007.

[23] Kanisová, H., Muller, M. (2004). UML srozumitelně. Computer Press, Brno, 2004. ISBN 80-251-0231-9 (in Czech).

[24] Kliment, T. (2010a). Discovery service testing according to INSPIRE implementing rules. Abstract in proceedings of GI2010 – X-Border SDI/GDI Symposium, Dresden, Germany, 2010. ISSN 1801-6480.

[25] Kliment, T. (2010b). Design and pilot realization of the testing process for the INSPIRE discovery service. ENVIRO-I-FORUM 2010, Zvolen, Slovakia. ISBN 978-80-88850-96-0, pp. 38-42 (in Slovak).


[26] Kliment, T. (2010c). Metainformation infrastructure for geospatial information. Slovak Journal of Civil Engineering, Slovak University of Technology in Bratislava, Slovakia, 2010 – 7 pp. ISSN 1210-3896.

[27] Kliment, T., Cibulka, D. (2011a). Discovery and view service testing according to the INSPIRE requirements. Proceedings of GIS Ostrava 2011 – VŠB-TU Ostrava, Czech Republic, 2011 – 9 pp. ISBN 978-80-248-2366-9 (in Slovak).

[28] Kliment, T., Cibulka, D., Tuchyna, M., Kliment, M., Koska, M., Mozolik, P., Tobik, J. (2011b). Use case of public and academic sector organization cooperation related to SDI component testing. Presentation at Workshop Testbed Research PTB, AGILE 2011, Utrecht, The Netherlands, 2011 – 23 pp.

[29] Kliment, T., Tuchyna, M., Kliment, M. (2011c). Testing of SDI components – A fundamental interoperability element within INSPIRE and National SDIs. Abstract in proceedings of GI2011 – X-Border SDI/GDI Symposium, Dresden, Germany, 2011. ISSN 1801-6480.

[30] Masser, I. (2005). GIS Worlds: Creating Spatial Data Infrastructures. ESRI Press, New York, USA, 2005. ISBN 1-58948-122-4.

[31] Nebert, D. (2009). Developing Spatial Data Infrastructures: The SDI Cookbook. Version 3.0, Global Spatial Data Infrastructure, 2009 – 150 pp.

[32] Östman, A. (2010). Network for testing GI services. Proceedings of GIS Ostrava 2010. VŠB-TU Ostrava, Czech Republic, 2010 – 6 pp. ISBN 978-80-248-2171-9.

[33] Portele, C., Östman, A., Koutroumpas, M., He, X., Kovanen, J., Schneider, M., Skopeliti, A. (2011). Testing - an essential aspect of establishing an SDI. INSPIRE Conference 2011, Edinburgh, Scotland, 2011.

[34] Rajabifard, A., Williamson, I. P. (2001). Spatial Data Infrastructures: Concept, SDI Hierarchy and Future Directions. GEOMATICS'80 Conference, Tehran, Iran.

[35] Woldai, T. (2002). Geospatial Data Infrastructure: The problem of developing metadata for geoinformation in Africa. In: Proceedings of the 4th AARSE Conference on geoinformation for sustainable development in Africa, October 14-18, 2002, Abuja, Nigeria.

[36] Zákon (2010). Law No. 3/2010 on the National Infrastructure for Spatial Information (in Slovak).

Internet resources:

[37] BPMN - Business Process Model and Notation - http://www.omg.org/spec/BPMN
[38] EnviroGeoPortál - http://geo.enviroportal.sk/
[39] ESDIN - European Spatial Data Infrastructure with a Best Practice Network - http://www.esdin.eu/
[40] gmd - Geographic MetaData extensible markup language - http://standards.iso.org/ittf/PubliclyAvailableStandards/ISO_19139_Schemas/gmd/gmd.xsd
[41] INSPIRE Geoportal - http://www.inspire-geoportal.eu/
[42] INSPIRE Roadmap - http://inspire.jrc.ec.europa.eu/index.cfm/pageid/44
[43] INSPIRE validation service - http://www.inspire-geoportal.eu/INSPIREValidatorService/resources/validation/inspire
[44] ISO-TC211 - ISO/TC 211 Geographic information/Geomatics - http://www.isotc211.org/
[45] OGC - Open Geospatial Consortium - http://www.opengeospatial.org/
[46] Persistent SDI Test Bed for Research and Education - http://sdi-testbed.eu/
[47] Proposal for INSPIRE Maintenance and Implementation - http://ec.europa.eu/transparency/regcomitology/index.cfm?do=search.documentdetail&oA3xqr5242jcjc4gaV9U5MlPeKEWYLiu+kATHKRP84QxdbQ+AI/X9VTTMRqv00VG
[48] W3C - World Wide Web Consortium - www.w3.org
[49] webtest - tool for web services testing - http://geo.vm.stuba.sk:8080/webtest