11:45 – 12:30  From IHE Profiles to conformance testing, closing the implementation gap: helping the implementers, testing tools, connectathons
12:30 – 13:30  Lunch Break
13:30 – 15:00  How to use IHE resources: hands-on experience
- Technical Frameworks: navigating, Q&A
- Test tools: finding, using, configuring
- Participating in the testing process
IHE Resources
Eric Poiseau, INRIA, IHE Europe technical manager
Charles Parisot, GE, IHE Europe
Connectathon History
- Started in 1998 in Chicago at the RSNA HQ
- Europe started in 2001
- Japan in 2003
- China and Australia are now also in the process
Formation IHE France, Nantes, Nov 2007

European connectathon growth:
- Charenton-le-Pont 2001: 11 companies, 18 systems, 40 m², 30 participants
- Paris 2002: 33 companies, 57 systems, 130 m², 100 participants
- Aachen 2003: 43 companies, 74 systems, 350 m², 135 participants
- Padova 2004: 46 companies, 78 systems, 600 m², 180 participants
- Noordwijkerhout 2005: 75 companies, 99 systems, 800 m², 250 participants
- Barcelona 2006: 67 companies, 117 systems, 1500 m², 250+ participants
- Berlin 2007: 1500 m², 300+ participants
- Oxford 2008: 83 companies, 112 systems, 1500 m², 300 participants
C.A.T. Participation in Europe
[Chart: companies, systems, and participants per European connectathon, from Paris through Aachen, Padova, Noordwijkerhout, Barcelona, Berlin, and Oxford; participation grows from tens to over 300.]
Purpose
- Test the implementation of the integration profiles within products: verify that the vendors did a good job
- Verify that what the committees invented makes sense! Is the text clear enough? Did the committee miss anything?
- Build a community of computer geeks… who like to enjoy locally brewed beers
From the vendor perspective
- A unique opportunity for vendors to test their implementations of the IHE integration profiles
- A controlled environment: the customer is not present, and it is not a clinical production environment
- Specialists are available, from the SDOs and from peer companies
- Bugs are identified and, most of the time, fixed!
Connectathon Result Matrix http://sumo.irisa.fr/con_result
But…
- Testing is sub-optimal: only a part of all the possible tests is performed
- A system successful at the connectathon is not guaranteed to be error free!
- We do not do certification!
From the IHE perspective
- Feedback from the vendor community: did the committee do a good job? Did the developed integration profile respond to a demand from the vendors?
European C.A.T.
- We have now reached our cruising speed
- The NA and EU C.A.T.s are very alike
- The C.A.T. is also used as an IHE promotion tool, with workshops run in parallel to the C.A.T. (Berlin: ITEG; Oxford; Vienna)
C.A.T. Model
Projet IHE-Dev, Inria Rennes
The IHE testing process (22/05/08)
[Diagram linking: Users; Sponsors (Project Management Team, Exhibits); Vendors; the IHE Technical Framework (profile specifications); Develop Testing Tools / Testing Tools; Implement Profile Actors; In-House Testing; Connectathon; Approved Test Logs; Testing Results; Demonstration; Product + Integration Statement; Deployed Systems.]
Pre-connectathon
- Registration: see what can be tested
- Exchange of configuration parameters: IP addresses, AE titles, assigning authority OIDs, certificates, affinity domain specification
Pre-connectathon: MESA testing
- In-house testing for vendors to get ready
- Vendors return their logs
- Upon log return, participation in the C.A.T. is accepted
At the connectathon
Participant Workshop, 6-7 Feb 2008
Connectathon Testing
Three types of tests are performed:
- No-peer tests
- Peer-to-peer tests
- Workflow tests
No-Peer Tests
- Calibration tests (CPI): screen calibration, printer calibration
- Scrutiny tests: verify that the objects created are "valid", and provide peers with samples
Peer-to-Peer Tests (P2P)
- Test subsections of a workflow between 2 vendors, as preparation for the workflow tests
- Vendors choose when to run them and select their peers
- Not to be run with other systems from the same company
Workflow Tests
- Test an entire workflow that may combine more than one integration profile
- We have a schedule; vendors need to be ready at the time of the test
- We have a list of difficulties to check: some tests can run in 15 minutes, some will require more than an hour
- No second-chance tests
5 Days
- Monday morning till 11 am: set-up time
- Till Friday noon: free peer-to-peer and no-peer testing
- From Wednesday till Friday noon: directed workflow testing
Monitors
- Volunteers, independent from the vendors
- Standards specialists
- Verify the tests
- Act as moderators between vendors
Results
- Failures are not reported
- To be successful, each peer-to-peer test needs to be verified with at least 3 peers (there are some exceptions)
- A vendor may fail for one actor but pass for the others
Connectathon Results
- IHE does not report failures; public results are given only at the company level
- IHE will never tell you which system participated in the connectathon
- Vendors have access to their own test results

Connectathon Results Browser
What does it mean?
- The company was successful at the connectathon for the actor/integration profile combination
- Results do not guarantee product conformity; that is the role of the "IHE Integration Statements"

IHE Integration Statement
Participation Fees
- First system: €2750
- Other systems: €2850 each
- Per domain: €750
- Covers: infrastructure (room, power, monitors, internet…), plus lunch and coffee breaks for 2 engineers during the 5 days
Next Connectathon
- Where: Remise, Vienna, Austria (http://www.koop-kundenweb.at/remise/)
- When: Monday 20th April to Friday 24th April 2009
- Registration: November 1st – January 7th, 2009
- Announcement to be released soon
C.A.T.: Conclusion
- It is not a certification process
- A unique opportunity for vendors to test and discuss
- It seems to be useful, as shown by the increased participation over the years
- Sure, it needs improvement… but we are working on it
Testing
Before we start
- It is impossible to test everything
- What we do not test: design, performance (load)
- What we are looking for: interoperability, conformity
Conformance / Interoperability
[Diagram (22/05/08): Vendor A's Implementation A and Vendor B's Implementation B are each checked against the specifications/standards by conformance testing; interoperability testing is performed between Implementation A and Implementation B.]
Conformance Testing (1/2)
- It is unit testing: it tests a single "part" of a device
- Tests against well-specified requirements, for conformance to the requirements of the specification and the referenced standards; usually limited to one requirement per test
- Tests at a "low" level: the protocol (message/behaviour) level
- Requires a test system (and executable test cases); can be expensive; tests are performed under ideal conditions
Conformance Testing (2/2)
- High control and observability mean we can explicitly test error behaviour, provoke and test non-normal (but legitimate) scenarios, and extend the tests to include robustness tests
- Can be automated, and tests are repeatable
- Conformance testing is DEEP and NARROW: thorough and accurate, but limited in scope
- Gives a high level of confidence that the key components of a device or system work as they were specified and designed to do
Limitations of Conformance Testing
- Does not prove end-to-end functionality (interoperability) between communicating systems: conformance-tested implementations may still not interoperate. This is often a specification problem rather than a testing problem! Minimum requirements or profiles are needed
- Does not test a complete system: it tests individual system components, not the whole. A system is often greater than the sum of its parts!
- Does not test functionality, nor the user's "perception" of the system
- Standardised conformance tests do not include proprietary "aspects", though these may well be covered by a manufacturer with its own conformance tests for proprietary requirements
Interoperability Testing
- It is system testing: it tests a complete device, or a collection of devices
- Shows that (two) devices interoperate within a limited scenario!
- Tests at a "high" level (as perceived by users): it tests the "whole", not the parts, and it tests functionality
- Does not necessarily require a test system; it uses existing interfaces (standard/proprietary)
- Interoperability testing is BROAD and SHALLOW: less thorough, but wide in scope
- Gives a high level of confidence that devices (or components in a system) will interoperate with other devices (components)
Limitations of Interoperability Testing
- Does not prove interoperability with other implementations with which no testing has been done: A may interoperate with B, and B may interoperate with C, but it does not necessarily follow that A will interoperate with C (combinatorial explosion)
- Does not prove that a device is conformant: devices may still interoperate even though they are non-conformant
- Cannot explicitly test error behaviour, unusual scenarios, or other conditions that may need to be forced (lack of controllability)
- Has limited coverage (it does not fully exercise the device)
- Not usually automated, and may not be repeatable
Conformance or Interoperability?
- Both are needed! They are complementary, not competitive
- ETSI: "While it is not absolutely necessary to undertake both types of testing, the combined application of both techniques gives a greatly increased confidence in the tested product and its chances of interoperating with the other similar products."
Conclusion
- Conformance testing needs to be part of the IHE testing process
- It is important to perform conformance testing in advance of the connectathon
- Interoperability testing takes place during the connectathon
- Conformance testing needs to be performed during the connectathon as well
IHE Resources

Technical Frameworks
- One per domain
- They are the reference; the tools are not!
- Written and reviewed by vendors and users
- Freely available at http://www.ihe.net
Organization of the TF
- Volume 1: description of the integration profiles and actors, dependencies between actors and integration profiles, use cases
- Volume 2 and following: description of the transactions, with references to the standards used
TF Life Cycle
- Every year: new integration profiles, change proposals
- Integration profiles are proposed as supplements: public comment, then trial implementation, then final text
- Once in final text, they are integrated into the main document
- There is no concept of version
TF Navigation (Kudu)
- The IHE connectathon management tool (Kudu) needs to know about the IHE concepts
- The concepts are stored in a database
- A PHP script allows navigating among the concepts
- URL: http://sumo.irisa.fr/TF
- Warning: the TF is the reference; this is another view of the official document
Wiki
- http://wiki.ihe.net: a lot of information, committee planning / minutes
- http://ihewiki.wustl.edu/: wiki for connectathon organization and management, code exchange, the XDS implementation page, …
MESA Tools
- http://ihedoc.wustl.edu/mesasoftware/index.htm
- The first generation of tools
- Used for pre-connectathon testing
- More focused on conformance than interoperability
MESA Tools Installation
- Available for Windows and Linux (easier to use on Linux)
- Requires a database installation and Perl
- Contains an HL7 listener and initiator, and DICOM tools
- A set of Perl scripts to run the scenarios
- A set of data to run the scenarios
MESA Tools Output
- A set of tests to run is selected based on the system's pedigree
- Each test gathers the exchanged messages
- A script evaluates the captured messages for specific content
- The output is a text file: "Passed" or "Failed"
MESA Tools Limitations
- The entire tool set must be installed, even for testing a single integration profile
- The scripts more or less require a clean SUT context
- Not easy to use… not easy to maintain
Kudu
Connectathon management:
- Registration process
- Pre-connectathon testing management
- Pre-connectathon configuration exchange
- Connectathon test management
- Connectathon results management
Kudu
- Used in North America, Europe, Japan, China, and Australia
- Helps harmonize the testing process
Kudu Drawbacks
- Designed for the connectathon, not for use by vendors
- PHP scripts: scalability may be a problem
- Designed for "interoperability testing", not "conformance testing"
Gazelle
- Gazelle = MESA + Kudu
- A proposal to combine MESA and Kudu into a second generation of tools
- Avoid the first generation's design errors; target more use cases; allow scalability
- More developers: better software, better coverage, improved support
Gazelle Requirements

Objectives
- Improve the overall quality of testing: conformance and interoperability
- Broaden the use of the application: build a framework for healthcare interoperability testing

5 Use Cases
- Connectathon
- Virtual connectathon
- Company internal testing tool
- Healthcare enterprise testing tool
- Governmental organizations

Requirements
- Synchronous testing of multiple systems
- Multilingual
- Scalable
Gazelle Architecture (22/05/08)
[Diagram: the Gazelle control system, backed by a database, drives the Gazelle test engine from test scenarios; control, configuration info, and feedback flow between Gazelle and the systems under test; a proxy sits on the network between the systems under test; Gazelle actors (simulators) and external validation services can each be instantiated several times, and more than one system under test can take part.]
System Under Test
- More than one system can be tested at the same time: one SUT with many simulators (~MESA), or many SUTs with no simulators (~Kudu)
- SUT management: a web application provides the instructions
Database
- Model of the TF concepts
- Storage of test-related information
- Assertions to be tested, ideally provided by the IHE technical committees
Test Management

External Validation Services (EVS)
- Part of the Gazelle architecture
- Web services following a common definition
- Perform "validation" of HL7, DICOM, and CDA
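The "common definition" shared by the EVS web services can be pictured as one validation interface with interchangeable back ends. The sketch below is hypothetical: the class and field names (`ValidationReport`, `ExternalValidationService`) are invented for illustration, and the real Gazelle contract is defined by the services' WSDLs.

```python
# Hypothetical sketch of a common EVS interface: every back end (HL7,
# DICOM, CDA) accepts a document and returns the same kind of report.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


@dataclass
class ValidationReport:
    valid: bool
    errors: list = field(default_factory=list)


class ExternalValidationService(ABC):
    """One interface, many back ends."""

    @abstractmethod
    def validate(self, document: bytes) -> ValidationReport: ...


class AlwaysValidEVS(ExternalValidationService):
    # Trivial stand-in back end, used only to show the calling pattern.
    def validate(self, document: bytes) -> ValidationReport:
        return ValidationReport(valid=True)


report = AlwaysValidEVS().validate(b"MSH|...")
print(report.valid)  # True
```

A client (Kudu, the proxy, or the EVS GUI tool) would then depend only on the interface, never on a particular validator.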
DICOM EVS
Two services are available:
- A DVTK-based service (http://www.dvtk.org/), hosted by MIR: http://gazelle-blue.wustl.edu:8090/service1?wsdl
- A dicom3tools-based service (http://www.dclunie.com/dicom3tools.html), hosted by MIR: http://gazelle-red.wustl.edu:8080/axis2/services/service1
CDA EVS
- Service provided by NIST
- GUI: http://xreg2.nist.gov/cda-validation/validation.html
- WSDL: http://xreg2.nist.gov:8080/ws/services/ValidationWebService?wsdl
HL7 EVS
- NIST EVS: http://xreg2.nist.gov:8080/HL7Web and http://xreg2.nist.gov:8080/HL7WS
- INRIA EVS: http://sumo.irisa.fr:8080/InriaHL7EVS-ear-InriaHL7EVS-ejb/Hl7GazelleMessageValidation?wsdl
CDA: How does it work?
- Use of Schematron
- NIST wrote Schematron rules for the IHE integration profiles
- The EVS performs Schematron validation of the submitted document
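A Schematron rule asserts that, wherever a context matches, some condition holds. A minimal standard-library sketch of that idea follows; the two rules are illustrative stand-ins, not the actual NIST/IHE rules, and a real EVS uses a full ISO Schematron engine rather than this loop.

```python
# Minimal sketch of Schematron-style checking, assuming a rule means
# "every element matching a context must have a given child present".
import xml.etree.ElementTree as ET

RULES = [
    # (context element, required child, assertion text reported on failure)
    ("ClinicalDocument", "title", "A CDA document must have a <title>."),
    ("ClinicalDocument", "id", "A CDA document must have an <id>."),
]


def validate(xml_text):
    """Return the list of failed assertions (an empty list means valid)."""
    root = ET.fromstring(xml_text)
    failures = []
    for context, child, message in RULES:
        for node in [root] + root.findall(f".//{context}"):
            if node.tag == context and node.find(child) is None:
                failures.append(message)
    return failures


doc = "<ClinicalDocument><title>Discharge Summary</title></ClinicalDocument>"
print(validate(doc))  # the <title> rule passes, the <id> rule is reported
```

The appeal of the Schematron approach is exactly what this sketch shows: the rules are data, so profile authors can add assertions without touching the validator.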
DICOM: How does it work?
- Checks for conformance to the DICOM standard; nothing IHE-specific there
- Uses MTOM for the transport of large DICOM objects
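Part of checking conformance to the DICOM standard is verifying that an object carries the attributes its module tables require. The toy sketch below models a dataset as a plain dict and checks a small, hand-picked subset of Type 1 (mandatory, non-empty) attributes; the real services wrap DVTK and dicom3tools and check the full standard, not three tags.

```python
# Toy illustration of DICOM required-attribute checking. The tag subset
# below is illustrative; real module tables define many more attributes.
TYPE1_SUBSET = {
    (0x0008, 0x0016): "SOP Class UID",
    (0x0008, 0x0018): "SOP Instance UID",
    (0x0008, 0x0060): "Modality",
}


def check_type1(dataset):
    """Report every required attribute that is missing or empty."""
    errors = []
    for tag, name in TYPE1_SUBSET.items():
        if not dataset.get(tag):  # absent or empty both violate Type 1
            errors.append(f"{name} {tag} is missing or empty (Type 1)")
    return errors


ds = {(0x0008, 0x0016): "1.2.840.10008.5.1.4.1.1.2",  # CT Image Storage
      (0x0008, 0x0018): "1.2.3.4.5"}
print(check_type1(ds))  # Modality is absent, so one error is reported
```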
HL7: How does it work?
- HL7 message profiles: INRIA wrote message profiles for the IHE integration profiles
- The EVS uses the HL7 message profile as the reference against which messages are validated
- It validates the "syntax" of the message
HL7 Message Profiles
- Checking HL7 message conformity requires a reference document
- Use of the HL7 message profiling mechanism
- Re-engineering of the TF and production of HL7 message profiles for the existing transactions
- See http://sumo.irisa.fr/TF, or get the sources (HL7 message profiles and HL7 message samples) from the forge
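The profiling idea can be sketched as a segment-level check: the profile says which segments a message must contain, and the validator compares. The profile below is a hypothetical simplification; real HL7 message profiles also constrain fields, data types, and cardinalities, and the IHE profiles live in the TF re-engineering mentioned above.

```python
# Sketch of profile-driven HL7 v2 syntax checking, with the "profile"
# reduced to a required-segment list for an ADT-like message (illustrative,
# not normative).
PROFILE_ADT = ["MSH", "EVN", "PID", "PV1"]


def check_syntax(message, profile):
    """Return the list of problems found against the segment-level profile."""
    # HL7 v2 segments are carriage-return separated; the segment name is
    # the first pipe-delimited token of each line.
    segments = [seg.split("|")[0] for seg in message.strip().split("\r") if seg]
    problems = []
    if not segments or segments[0] != "MSH":
        problems.append("message must start with an MSH segment")
    for required in profile:
        if required not in segments:
            problems.append(f"missing required segment {required}")
    return problems


msg = ("MSH|^~\\&|SND|FAC|RCV|FAC|20080522||ADT^A01|42|P|2.5\r"
       "EVN|A01\rPID|1\rPV1|1")
print(check_syntax(msg, PROFILE_ADT))  # [] -> segment-level profile satisfied
```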
EVS GUI Tool
- A tool for easy EVS interfacing
- Java Web Start: http://sumo.irisa.fr:8080/EVSClientGUI/
- Allows the user to select a file and get it validated
- For HL7, it can work as a proxy
Simulators

Actor Simulators
- IHE actors with a web service interface, for control by Gazelle
- We are currently working on the API: configuration, control, feedback
- Re-use of existing software, adapted to fit the API
Proxy
- Used to capture the messages exchanged between systems, so that they can be validated by the EVS
- Mirth is used during the connectathon: an infrastructure that supports a large number of systems is needed
- Nule HL7 is used for single-SUT testing
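The core of such a capture proxy is de-framing: HL7 v2 over TCP is usually wrapped in MLLP framing bytes (0x0B before the message, 0x1C 0x0D after). The sketch below extracts complete messages from a byte stream and hands each one to a validation callback standing in for an EVS submission; the actual socket forwarding, which Mirth and Nule HL7 handle, is left out.

```python
# Sketch of the capture side of an HL7 proxy: pull complete MLLP-framed
# messages out of a byte stream and pass each to a callback (here a stand-in
# for the EVS call), keeping any unfinished trailing bytes for the next read.
VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"  # MLLP start block / end block / CR


def split_mllp(stream, on_message):
    """Invoke on_message for each complete framed message; return leftover."""
    while True:
        start = stream.find(VT)
        end = stream.find(FS + CR)
        if start == -1 or end == -1 or end < start:
            return stream  # incomplete frame: keep remainder for next read
        on_message(stream[start + 1:end])
        stream = stream[end + 2:]


captured = []
leftover = split_mllp(VT + b"MSH|^~\\&|A|B" + FS + CR + VT + b"MSH|partial",
                      captured.append)
print(captured)   # one complete message captured
print(leftover)   # the partial second frame is kept for the next read
```

In a live proxy the same bytes would also be forwarded unchanged to the receiving system, so capture and validation never interfere with the exchange under test.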
Proxy Experiment in Oxford
- Successful use of Mirth during the Oxford C.A.T., as a proxy for HL7 messages
- Kudu was hacked to generate the channels and report the validation results
- Both the NIST and the INRIA EVS were used by the proxy
Proxy Environment Overview
[Diagram: SUT1 and SUT2 exchange HL7 messages through the proxy; a daemon picks up the captured traffic (mirth_input) and submits it to the EVS for hl7_message_validation; numbered arrows trace the message flow.]
Test Engine
- Controls the simulators and the proxy
- Uses BPEL for test definition and orchestration
- The SUT cannot be controlled directly: the user is guided through a web GUI
Participants
- 3 IHE regions: North America (MIR), Europe (INRIA), Japan (Shizuoka University)
- DVTK, NIST, Tiani-Spirit, David Clunie, Offis
Roadmap
- DB model redesign
- EVS API definition
- Finalize the licence
- EVS at the Chicago connectathon: DICOM
- EVS at the Oxford connectathon: HL7, DICOM, CDA
- Proxy for HL7 messages
- Registration with Gazelle
- API for the simulators
- API for the test engine
- Test cases in Gazelle: PIX, PDQ, SWF, LTW
Project Management
- Testing and Tool Committee: overview of the IHE testing activities, choice of the licenses
- Testing Management Group: project management (Eric and Steve)

Licensing
- Agreement on an open-source license; the final choice of the license is still under discussion
- Licensing does not concern tools developed by third parties, typically the EVS and the simulators