MTEMP - TEWG Edition: 3.0.2
MULTILATERAL INTEROPERABILITY PROGRAMME
MIP TEST AND EVALUATION MASTER PLAN (MTEMP)
14 May 2009, Greding, Germany
This Multilateral Interoperability Programme (MIP) Test and Evaluation Master Plan
has been reviewed and is hereby approved by the Heads of Delegation of
participating nations. Release of this document to nations or agencies that are not
participants in the Multilateral Interoperability Programme, including the media and
general public, requires the approval of the MIP Steering Group (MSG) in accordance
with policy stated in the MIP Communications and Liaison Plan (MCLiP). This
document is the property of the MIP participants and the information contained in this
document shall not be communicated, either directly or indirectly, to any person or
agency not authorised to receive it.
MTEMP - TEWG 20090514
Edition: 3.0.2
i
RECORD OF CHANGES PAGE

CP Number   Date Entered   Responsible Individual   Remarks
            29 May 2007    TEWG                     Edition 3.1
            26 Sep 2007    TEWG                     Edition 3.2
            14 May 2009    TEWG                     New Edition 3.0.3
TABLE OF CONTENTS
___________________________________________________________________
SECTION TITLE
___________________________________________________________________
1 MIP VISION
1.1 MIP SCOPE
1.2 MIP MISSION
1.3 MIP TASKS
2 AIM OF THE TEST AND EVALUATION MASTER PLAN
3 TEST STRATEGY
3.1 APPROACH
3.2 MIP TEST ACTIVITIES
3.3 ENTRANCE AND EXIT CRITERIA DEFINITIONS
3.3.1 ENTRANCE CRITERIA
3.3.2 EXIT CRITERIA
3.4 TEST LEVELS
3.4.1 IMPLEMENTATION LEVEL TESTS
3.4.2 SYSTEM LEVEL TESTS
3.4.3 OPERATIONAL LEVEL TESTS
3.5 TEST TOOLS
3.5.1 MIP TEST REFERENCE SYSTEM (MTRS)
3.5.2 TEST DATA
3.6 REGRESSION TESTING / RE-TESTING
4 TEST ORGANISATION
4.1 ORGANISATION
4.1.1 MIP TEST DIRECTOR
4.1.2 MIP TEST CONTROLLER
4.1.3 DATA COLLECTION & EVALUATION CONTROLLER
4.1.4 TEST DATA MANAGER
4.1.5 OPERATIONAL COORDINATOR
4.1.6 TECHNICAL COORDINATOR
4.1.7 NATIONAL TEST COORDINATORS
4.1.8 NATIONAL DATA COLLECTOR & EVALUATOR
4.1.9 HOST NATION COORDINATOR
4.1.10 OTHER FUNCTIONS WHEN NECESSARY OR DESIRED
4.1.11 MIP TESTING READINESS REVIEW BOARD (MTRRB)
4.1.12 TEST RESPONSIBILITIES
4.1.13 Test Coordination
5 TEST REPORTING AND EVALUATION
5.1 TRACEABILITY
5.2 DATA COLLECTION TOOLS
5.3 MPRs
5.3.1 MPR Severity Code Categories
5.4 TEST AND EVALUATION CONFIGURATIONS
5.5 MIP TEST MANAGEMENT TOOL (MTMT)
5.6 MIP TEST REPORTS
5.7 CAPABILITY MATRIX
ANNEX A GLOSSARY AND DEFINITIONS
ANNEX B: MIP BASIC TERMINOLOGY
MIP TEST AND EVALUATION MASTER PLAN (MTEMP)
1 MIP VISION
The vision for the Multilateral Interoperability Programme (MIP) is to become the
principal operator-led multinational forum to promote international interoperability
of Command and Control Information Systems (C2IS) at all levels of command.
1.1 MIP SCOPE
The MIP scope is to deliver a command and control interoperability solution focused
on the Land operational user in a Joint environment.
1.2 MIP MISSION
MIP is to further develop and improve interface specifications in order to reduce the
interoperability gap between different C2IS.
1.3 MIP TASKS
− Support fielded MIP solutions.
− Further improve the MIP solution by adopting modern development
approaches and standards [1].
− Harmonise with NATO and leverage other appropriate standards [2].
− Improve flexibility in using the MIP solution in ad-hoc coalitions.
− Extend the scope of MIP interoperability.
− Engage Air, Maritime and other Coalitions of Interest to cooperate with
MIP.
− Examine better ways of structuring the MIP programme.

[1] Examples of approaches and standards include the NATO Architectural Framework (NAF),
Model Driven Development, Service Orientation and common standards (XML, UML, RDF, etc.).
[2] Examples include NNEC, APP-11, APP-6, etc.
2 AIM OF THE TEST AND EVALUATION MASTER PLAN
The aim of the MIP Test and Evaluation Master Plan (MTEMP) is to define the overall
test approach to be carried out within MIP. It provides a framework for testing and
evaluation in order to prove that national implementations meet the minimum level of
(C2IS) interoperability as defined by MIP. The MTEMP is a generic document that
identifies and describes the test strategy; it does not contain technical details of test
and evaluation procedures which are available in block specific documents.
3 TEST STRATEGY
3.1 APPROACH
The approach is based on the test engineering discipline defined in the MIP
Integrated Framework (MIF). The objective is that, after conducting testing, nations
will be confident that the MIP Solution is a viable mechanism to achieve
interoperability. The Test and Evaluation Working Group (TEWG) has overall
responsibility for ensuring that the test engineering discipline is performed in a
consistent, comprehensive and coherent manner, with the OWG, DMWG and
SEAWG providing support when needed.
3.2 MIP TEST ACTIVITIES
The activities associated with the test engineering discipline divide into the following
areas:
• Test Development
Test development encompasses the development of technical specifications,
test data, test suites and test cases for the verification and validation of the
MIP Solution. This is the responsibility of the TEWG, with assistance from
other WGs in providing inputs such as test cases and data.
• Test Management and Execution
Test management and execution activities relate to the management of tests after
development, test planning, control and test result analysis. This is the responsibility
of the TEWG with the assistance of other WGs and the nations.
3.3 ENTRANCE AND EXIT CRITERIA DEFINITIONS
The testing process uses entrance and exit criteria to determine the eligibility of
systems to proceed through the testing process.
3.3.1 ENTRANCE CRITERIA
The purpose of these criteria is to ensure all participants are ready for testing. The
entrance criterion for a test (subtype) level is met when each system has
satisfied all test objectives of the previous test (subtype) level.
Systems will proceed to the next level on the basis of the analysis of test results and
a recommendation by the test organisation, the PMG or the MIP Test Readiness
Review Board (MTRRB, see Test Organisation). If a system has open MIP Problem
Reports (MPRs) as a result of the testing, each MPR will be assessed against the
severity categories by the test organisation. If an MPR is judged severe enough to
preclude advancement to the next level, it is incumbent on the system provider to
correct the problem and demonstrate, via regression testing / re-testing, that the
problem has been corrected.
3.3.2 EXIT CRITERIA
The exit criterion for a test level is met when each system has satisfied all test
objectives of that test (subtype) level, based on the analysis of the test results by
the test organisation and on PMG acceptance of the results.
3.4 TEST LEVELS
The TEWG has based its test strategy on testing at three distinct levels:
• Implementation Level Test (ILT)
• System Level Test (SLT)
• Operational Level Test (OLT).
In addition, each level can consist of subtype levels. The Implementation Level Tests
are conducted under national responsibility to confirm a nation’s implementation is
ready for testing with another nation/system, whilst the TEWG is responsible for the
conduct of System- and Operational Level Tests. Nations are required to share the
results of this testing, both successful and unsuccessful, with the other MIP nations in
order to share relevant experience and lessons identified in implementation. As
depicted in the following figure, the MIP Test Reference System (MTRS) results
analysis can be performed by the MIP test controller to provide inputs for SLT
subtype level testing.
[Figure: Testing 'waterfall' per System Under Test. Each system progresses
through ILT1 → MTRS1 → SLT1, ILT2 → MTRS2 → SLT2, ILT3 → MTRS3/Symb
Cat_code Checker → SLT3, followed by OLT TV and OLT CV. The ILTs are a
national responsibility; the MTRS sessions, SLTs and OLTs are a MIP
responsibility, under the Level 1, Level 2, Level 3 and OLT Test Organisations
respectively.]
3.4.1 IMPLEMENTATION LEVEL TESTS
Implementation Level Testing (ILT) is conducted under national responsibility, with
two implementations of the same system connected ‘back to back’. All System Level
Test (SLT) test cases of a subtype level are performed to ensure that the system is
ready to test against other systems. A successful Implementation Level Test of a
subtype level will represent the entrance criterion for the System Level Testing of that
subtype level (ILT1 for SLT1, etc).
3.4.2 SYSTEM LEVEL TESTS
System Level Testing ultimately demonstrates the timely end-to-end transfer of
operational data between national C2ISs. It demonstrates the ability of national
systems to send/transmit, receive/parse and understand data end-to-end. This level
will be used to evaluate the MIP Specifications from a system rather than an
operational perspective.
The System Level Tests comprise:
• SLT1 (Technical Level Testing)
• SLT2 (Data & Procedural Level Testing)
• SLT3 (C2IS Level Testing).
Some specific test cases for MEM (both SLT1 and SLT2) are grouped in one
document. For the System Level Tests the MIP System Level Test Plan (MSLTP)
provides guidance to the MIP community on test preparation, execution and
evaluation. Preparation for SLT will be the responsibility of a SLT WP led by the
TEWG with the assistance of other WGs in providing inputs including test cases and
test data.
3.4.2.1 SLT1
SLT1 focuses on data transmission and communication protocols between different
nations/organisations to provide assurance that each technical implementation of the
MIP Gateway is in accordance with the MIP Specifications for the baseline under
evaluation.
The objectives of SLT1 are to:
• Validate the protocol stack
• Validate the information exchange mechanisms
• Assess the capability of exchanging management events (generate, send,
receive and correctly parse)
• Validate the exchange of MIP messages
3.4.2.2 SLT2
SLT2 tests are primarily focused on the correct information exchange between
JC3IEDM-databases based on the MIP Specifications for the baseline under
evaluation.
SLT2 objectives are to:
• Validate the information exchange mechanism (correctly generate, send,
receive, parse and store)
• Validate the operating procedures
• Validate the exchange of MIP messages
3.4.2.3 SLT3
SLT3 is designed to validate the information exchanges between C2ISs based on
the MIP Specifications for the baseline under evaluation.
The objectives of SLT3 are to:
• Test the semantic integrity of the data between national C2ISs across the MIP
Gateway
• Confirm operational interoperability between C2ISs
3.4.3 OPERATIONAL LEVEL TESTS
This test level comprises the top level evaluation of the national MIP Interface and
C2IS, when deployed in the context of an operational scenario. The operational test
validates and ensures the MIP Solution meets the operational objective of a given
development block. Operational Level Tests are based on a scaled down version of
the appropriate System Level Test in order to minimise the resources needed to
support this activity. Preparation for OLT will be the responsibility of an OLT WP led
by the TEWG with the assistance of other WGs in providing inputs such as an
operational scenario for testing.
OLT objectives include:
• Confirmation of operational interoperability between C2ISs under near-real
operational conditions.
• Evaluation of the reliability, functionality, and performance (application and
system performance) of the MIP Solution under test.
3.5 TEST TOOLS
Various tools can be used to support testing. Examples are given in the following sections.
3.5.1 MIP TEST REFERENCE SYSTEM (MTRS)
A MIP Test Reference System is defined as a system that is fully compliant with the MIP
Specifications and provides features optimised to support conformance testing, but does not
necessarily have operational C2IS functionality. It may be a MIP-developed and -operated
system or a national test system, as agreed by the MIP community. The MTRS may not be
available for use at the start of a testing cycle. After conducting an ILT, each system shall test
with the MTRS on each MTRS-supported (subtype) level as part of the exit criterion for that
testing level. The result of the test session with the MTRS will not influence the entrance
criterion for the next level but will be used by the TEWG to increase confidence in the MTRS.
3.5.2 TEST DATA
The Data Modelling Working Group (DMWG) is the custodian of all data used to test
if a system is compliant with the MIP Specifications. A variety of data will be used in
testing including:
• Symbology test data (to check the inbound mapping of data and display of
a symbology picture).
• Error Checking test data (test data with known errors introduced to test if a
system detects the error and responds to it).
• Boundary test data (test data that is error free but uses maximum
allowable values for certain fields (e.g. 100 characters in a name field); to
test if a system handles these maximum allowed values).
• Large files (for use in performance testing of a system).
3.6 REGRESSION TESTING / RE-TESTING
Regression testing is used to ensure that changes to a system do not introduce unexpected
errors or behaviour. This process confirms that a system remains compliant after previous
tests, and is also of value in re-testing systems to confirm successful changes. It is a
national / organisational responsibility to report modifications that could affect the MIP
Interface and submit a plan for regression testing to the TEWG.
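In tooling terms, regression testing amounts to re-running the test cases a system has already passed and flagging any that now fail after a modification. A minimal sketch, with invented names and a stubbed test runner:

```python
# Sketch of regression re-testing: re-run previously passed test cases after a
# system modification and report any that now fail. Names are illustrative only.

def run_regression(test_cases, run_test):
    """Re-run each previously passed test case; return those that now fail."""
    regressions = []
    for case in test_cases:
        if not run_test(case):          # a pass before the change, a fail now
            regressions.append(case)
    return regressions

# Example with a stubbed runner: 'TC-2' regresses after a change.
outcome = {"TC-1": True, "TC-2": False, "TC-3": True}
failed = run_regression(["TC-1", "TC-2", "TC-3"], lambda c: outcome[c])
assert failed == ["TC-2"]
```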
4 TEST ORGANISATION
4.1 ORGANISATION
The TEWG will manage all levels and subtypes of testing in MIP, although it will
require additional support for specific roles. The roles that may be required,
depending on the level of testing, are identified below. These MIP Test Organisation
roles may also be appropriate for national test organisations and are drawn from the
MIP Working Groups and Nations.
4.1.1 MIP TEST DIRECTOR
The MIP Test Director is responsible for managing and coordinating all levels of
testing. Multiple levels of testing can occur simultaneously; the MIP Test Director
de-conflicts any issues which may arise during testing and provides guidance on the
execution of the tests. The Chair of the TEWG normally fulfils this role.
4.1.2 MIP TEST CONTROLLER
The MIP Test Controller is responsible for managing and co-ordinating the test
activities of a specific (subtype) test level and producing the final Test Report. OWG,
DMWG and SEAWG members will assist in the preparation, execution and
evaluation of the tests.
4.1.3 DATA COLLECTION & EVALUATION CONTROLLER
The Data Collection & Evaluation Controller is responsible for coordinating with the
national data collectors and evaluators, for ensuring the data is in a standard format,
and for the safe storage of test data.
4.1.4 TEST DATA MANAGER
Before a test is executed, test data may have to be compiled by a Test Data
Manager. The DMWG is responsible for nominating a representative as Test Data
Manager. The DMWG is also responsible for the MIP Common Data Fill.
4.1.5 OPERATIONAL COORDINATOR
The Operational Coordinator is responsible for ensuring tests are run in accordance
with mandated operational procedures, and the assessment of results against
operational requirements. The Operational Coordinator also provides the test
organisation with an operational perspective. OWG is responsible for nominating a
representative as Operational Coordinator.
4.1.6 TECHNICAL COORDINATOR
The Technical Coordinator is responsible for ensuring that tests are run in
accordance with the appropriate technical procedures and for the assessment of the
results against the technical specifications. The Technical Coordinator is also
responsible for analysing technical problems that occur during testing and if needed
capturing these problems in the MPRT. SEAWG is responsible for nominating a
representative as Technical Coordinator.
4.1.7 NATIONAL TEST COORDINATORS
All Nations assign a National Test Coordinator to assist the Test Organisation in the
preparation, execution and evaluation of the tests. The National Test Coordinator is
the overall point of contact for all test-related issues in a nation.
4.1.8 NATIONAL DATA COLLECTOR & EVALUATOR
The National Data Collector & Evaluator collects and evaluates all National test
results and reports to the Test Organisation.
4.1.9 HOST NATION COORDINATOR
The Host Nation Coordinator provides information about, and is the point of contact
for, the test site (requirements for access to the site, airport/rental car/hotel
information, network/power issues, shipping of hardware, etc.).
4.1.10 OTHER FUNCTIONS WHEN NECESSARY OR DESIRED
Other roles may be needed for certain test levels. These include (but are not limited
to):
- National Operational Controller (to evaluate the national use of the MIP Solution
and provide feedback on the MIP Specifications, stability, usability, etc.).
- National System Coordinator (a nation can have more than one system under
test; the System Coordinator is a point of contact for that system).
4.1.11 MIP TESTING READINESS REVIEW BOARD (MTRRB)
The MTRRB will convene as required to validate entry to a higher test level for
systems that have not successfully completed all test cases of the previous level.
This decision is normally within the remit of the PMG, but where two test periods
follow each other in a period in which the PMG does not meet, the MTRRB is
mandated to decide on behalf of the PMG.
The MTRRB shall consist of the following (independent) members:
• Test Director as Chairman;
• Test Controller of the level to be exited and of the level to be entered;
• Technical Coordinator;
• Operational Coordinator of the level to be exited and of the level to be
entered (if required);
• PMG member (if available).
The MTRRB shall base its decision to accept or refuse a system that did not achieve
the exit criteria on the following criteria:
• Severity codes of the unsuccessful test case(s);
• Operational or technical significance of the unsuccessful test case(s).
In doing so, the MTRRB shall investigate all options available to enable the system
to proceed into the next level without imposing a risk to other systems under test or
to the test schedule. If voting is needed, every member (except the chairman) will
have one vote. When the votes are equal, the chairman will have the deciding vote.
4.1.12 TEST RESPONSIBILITIES
The responsibilities for testing activities in a block are:
TEWG:
• To elaborate the MIP test and evaluation plan for the block according to the MIP
Specifications and the requirements given by OWG, DMWG and SEAWG.
• To monitor, manage and coordinate the execution of testing.
• To assemble test suites for all test events.
• To collect and report test results.
• To assess each MPR against the severity categories.
• To track test and system performance progress.
• To elaborate a test specification document before the test (objective,
schedule, environment, test suite, test cases, participation of nations,
coordination mechanism, etc).
• To elaborate the Test Report to PMG.
OWG:
• To determine and coordinate with TEWG, during the MIP elaboration phase, those
requirements that must be checked in the Operational Level Test.
• To provide, during the construction phase, the operational input for testing.
• To provide personnel for the SLTs and OLT in order to assess the testing with respect
to the operational requirements.
SEAWG:
• To support the development of test cases for the verification and validation of a MIP
Gateway.
• To provide personnel to give technical assistance and guidance to the Test
Organisation.
DMWG:
• To provide the necessary test data to support testing.
• To provide a Test Data Manager to support testing.
4.1.13 Test Coordination
The TEWG shall draw up a testing schedule under the direction of the PMG. The
overall test schedule will be part of the MIP Integrated Programme Schedule (MIPS).
A detailed schedule shall specify the sequence of tests to be run and the nations
involved in each test. A Test Directive may be published to provide guidance to (the
preparation of) a test period.
The Test Organisation will decide the co-ordination mechanism (telephone, e-mail,
network “chat” tool, website, etc.) that will be used to co-ordinate each testing
session.
During testing only UNCLASSIFIED information will be exchanged.
5 TEST REPORTING AND EVALUATION
5.1 TRACEABILITY
In order to ensure that all selected or determined operational and technical
requirements are tested and reported, traceability will be maintained between
operational requirements, technical requirements, use cases and test cases.
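One way to maintain such traceability is a chain of mappings from each operational requirement through its technical requirements and use cases down to test cases, with a coverage check over the chain. The structure and identifiers below are invented for illustration; MIP does not mandate any particular representation.

```python
# Illustrative traceability chain: operational requirement -> technical
# requirements -> use cases -> test cases. All identifiers are invented.

trace = {
    "OR-01": {"TR-01": {"UC-01": ["TC-001", "TC-002"]}},
    "OR-02": {"TR-02": {"UC-02": []}},  # no test case yet: a coverage gap
}

def untested_requirements(trace):
    """Return operational requirements with no test case anywhere in their chain."""
    gaps = []
    for op_req, tech_reqs in trace.items():
        cases = [tc for use_cases in tech_reqs.values()
                    for tcs in use_cases.values()
                    for tc in tcs]
        if not cases:
            gaps.append(op_req)
    return gaps

assert untested_requirements(trace) == ["OR-02"]
```

A check of this kind makes it straightforward to confirm that every selected requirement is covered by at least one test case before a test period begins.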
5.2 DATA COLLECTION TOOLS
The test organisation will use a variety of tools to support the data collection and
evaluation process, to hold test cases, test results and any other appropriate
information related to a test.
These tools will be used to:
• support test management;
• show timeline, scenario and test cases;
• provide test results;
• provide visibility of test results over a longer period of time to all MIP
members.
Tools can include: chat logs, email-logs, databases, PDU/database message
loggers/decoders as well as manual forms for data collection.
5.3 MPRs
The MIP Problem Report Tracking (MPRT) tool is primarily intended for MIP issues
(relating to the specifications). National issues may be added if they represent a
serious interoperability problem (e.g. so that other systems may be aware of this
issue) or if knowledge of the problem (and its resolution) could benefit the MIP
community. An MPR will be raised for all MIP problems encountered during formal
testing. A problem is any event that is unexpected or that results in test failure. The
Test Organisation will assign an initial severity code and priority to each problem and
will assign it to a WG, a WP or a Nation. All MPRs can be tracked in the MIP
Problem Report Tracking Tool. An overview of the complete MPR process can be
found on the MPRT website.
5.3.1 MPR Severity Code Categories
Level  Description
1      a) Prevents the accomplishment of an essential capability and/or
          mission.
       b) Jeopardises safety, security, or any other requirement designated
          as "critical".
2      a) Adversely affects the accomplishment of an essential capability
          and no work-around solution is known.
       b) Adversely affects technical, cost, or schedule risks to the
          programme or to the life-cycle support of the system, and no
          work-around solution is known.
3      a) Adversely affects accomplishment of an essential capability but a
          work-around solution is known.
       b) Adversely affects technical, cost, or schedule risks to the
          programme or to the life-cycle support of the system, but a
          work-around solution is known.
4      a) Results in inconvenience or annoyance but does not affect
          satisfying a required operational or mission-essential capability.
       b) Results in inconvenience or annoyance for development or
          maintenance personnel but does not prevent the accomplishment
          of the responsibilities of those personnel.
5      Any other effect, minor in nature.
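For tooling such as the MPRT, the five severity levels above map naturally onto an enumeration. Whether an open MPR blocks advancement to the next test level is a judgement for the test organisation (Section 3.3.1); the threshold in the sketch below is an illustrative assumption only.

```python
# Sketch of the MPR severity codes as an enumeration. The blocking threshold
# is an illustrative assumption, not an MTEMP rule; the test organisation
# assesses each MPR individually.
from enum import IntEnum

class Severity(IntEnum):
    CRITICAL = 1        # prevents an essential capability / jeopardises safety
    NO_WORKAROUND = 2   # affects an essential capability, no work-around known
    WORKAROUND = 3      # affects an essential capability, work-around known
    INCONVENIENCE = 4   # annoyance only; required capability still satisfied
    MINOR = 5           # any other, minor effect

def blocks_advancement(mpr_severity: Severity) -> bool:
    """Assumed rule: only severity 1 and 2 MPRs preclude the next test level."""
    return mpr_severity <= Severity.NO_WORKAROUND

assert blocks_advancement(Severity.CRITICAL)
assert not blocks_advancement(Severity.WORKAROUND)
```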
5.4 TEST AND EVALUATION CONFIGURATIONS
The MIP Specification identifies different employment options that require national
systems to be interconnected based on various network topologies. The basic
topology involves all national MIP Gateways interconnecting on a common LAN. This
topology will provide the basis for the majority of the test and evaluation programme.
Other topologies may be used to validate the MIP Solution as required by the MIP
Specification and as agreed by MIP Nations. The Test Plans should reflect the
agreed topologies that the MIP Solution will be tested on.
5.5 MIP TEST MANAGEMENT TOOL (MTMT)
MIP will use a MIP Test Management Tool to drive tests and capture test results. The
tool will be used for all (subtype) levels and can generate reports.
The test results should provide enough management information, in the form of
metrics and statements, to give system developers, managers and project leaders
the status of their system, indicate what needs to be improved to reach the next
level, and support a decision on the employability of their product.
5.6 MIP TEST REPORTS
After each level of testing a high level overall test report will be generated. The report
can contain the following information:
• Executive summary.
• Details of all tests run during the test period. For formal tests, these details shall
include the Nations/systems involved in each test.
• A matrix of all formal tests indicating which tests have been run, their outcome, and
the date of the test period in which they were run. This matrix is the mechanism by
which satisfactory completion of formal testing will be documented.
• Network configuration details.
• Hardware and software configuration baseline information.
• Recommendations.
• Lessons learned.
5.7 CAPABILITY MATRIX
The test results and other information will be used to create a Capabilities Matrix.
The Capabilities Matrix will:
• Include operational requirements, based on OWG’s Operational Capability
Matrix.
• Provide an overview of nations’ systems implementation of operational
requirements.
• Provide information on how successful nations’ systems were in testing with
other systems.
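In essence the Capabilities Matrix is a pivot of test outcomes: operational requirements on one axis, nations' systems on the other. A minimal sketch, in which the nations, requirement identifiers and outcomes are all invented for the example:

```python
# Illustrative Capabilities Matrix: operational requirement x national system,
# recording how each requirement fared in testing. Identifiers are invented.

results = [
    ("OR-01", "Nation-A", "passed"),
    ("OR-01", "Nation-B", "failed"),
    ("OR-02", "Nation-A", "not implemented"),
]

def capability_matrix(results):
    """Pivot (requirement, system, outcome) triples into a nested matrix."""
    matrix = {}
    for req, system, outcome in results:
        matrix.setdefault(req, {})[system] = outcome
    return matrix

matrix = capability_matrix(results)
assert matrix["OR-01"]["Nation-B"] == "failed"
```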
ANNEX A GLOSSARY and DEFINITIONS
See MIP Glossary.
ANNEX B: MIP BASIC TERMINOLOGY
1 INTRODUCTION
This annex explains ‘basic terms’ which are common to more than one MIP document.
2 DEFINITIONS COVERED
− MIP Product Set
− MIP Specification
− MIP Common Interface (MCI)
− MIP Gateway (MIP GW)
− MIP Message Exchange Mechanism (MEM)
− MIP Data Exchange Mechanism (DEM)
− MIP Information Exchange Data Model (MIP IEDM)
− MIP Information Resource Dictionary (MIRD)
− MIP LAN
− MIP Solution
− MIP Scope
− National LAN
− National C2IS
3 RELATIONSHIPS BETWEEN DEFINITIONS
The hierarchical relationship between definitions is given below:
− The full set of MIP documents is known as the MIP Product Set and it includes a
subset of documents known as the MIP Specification.
− The MIP Specification can be implemented to produce a MIP Gateway which
consists of a MIP MEM and a MIP DEM.
− A MIP DEM can exchange data compliant with a specific MIP IEDM and its
associated MIRD.
− A MIP Gateway provides the point of connection between the MIP LAN and the
National LAN which hosts the National C2IS.
− The information which can be exchanged through a MIP Gateway is defined by the
MIP Common Interface.
− Two or more functioning MIP Gateways which can exchange data over a MIP LAN
constitute the MIP Solution, which fulfils the MIP Aim.
− The MIP Scope is restricted to the definition of the data which can be exchanged in
accordance with the MIP Common Interface and the protocols for the exchange of
that data.
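The hierarchy above can be sketched as a small set of types. The class names mirror the MIP terms but the sketch itself is purely illustrative and forms no part of any MIP specification:

```python
# Illustrative sketch of the relationships described above.
from dataclasses import dataclass

@dataclass
class MEM:
    """Message Exchange Mechanism: an ESMTP mailer."""

@dataclass
class DEM:
    """Data Exchange Mechanism: a replication mechanism."""
    iedm_version: str  # the MIP IEDM it exchanges, e.g. "JC3IEDM"

@dataclass
class MIPGateway:
    """A MIP Gateway incorporates both a MEM and a DEM."""
    mem: MEM
    dem: DEM

# Two or more functioning MIP Gateways exchanging data over a MIP LAN
# constitute the MIP Solution.
mip_lan = [MIPGateway(MEM(), DEM("JC3IEDM")),
           MIPGateway(MEM(), DEM("JC3IEDM"))]
is_mip_solution = len(mip_lan) >= 2
```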
3.1 MIP PRODUCT SET
DEFINITION
The MIP Product Set is the total set of documents produced by MIP within a named Baseline
Release or Version.
EXPLANATION
The MIP Product Set includes both the MIP Specification (the technical side) and also
additional supporting MIP documents (such as the MIP Vision and Scope) which provide
amplifying details (the programme management side). Thus there are MIP documents (such
as the MIP Campaign Plan and the MPMP) which, whilst essential to the overall achievement
of MIP’s vision and mission, cannot in any way be considered as ‘specifications’.
3.2 MIP SPECIFICATION
DEFINITION
The MIP Specification is that subset of the total documentation set within a named Baseline
Release or Version, which deals with the technical specification of a system and which may
be implemented to achieve the MIP mission.
EXPLANATION
The MIP Specification is a term which is widely used and should be understood as those MIP
documents which provide the technical specification. It contains specifications for
implementation in the form of Data Structures, Data and Message Formats, Application
Functionality and User Procedures. It also includes a matching set of Test Cases,
Procedures and Specifications. The MIP Specification is a coherent set of documents
designed to be used in conjunction with one another.
3.3 MIP COMMON INTERFACE (MCI)
DEFINITION
The MIP Common Interface (MCI) is that subset of a MIP Specification which defines what
can be exchanged using the MIP MEM and DEM. It is an interface in that it provides the
tangible face of what is otherwise a ‘Black Box’. The MCI consists of an agreed set of MEM
messages and attachment file types, DEM management messages and DEM data PDUs
(Protocol Data Units). The permitted content of these exchange formats is constrained by the
MIP Specification (e.g. the content of the DEM data PDUs must be conformant with a
specified version of the MIRD).
EXPLANATION
The MIP Specification further constrains the MCI by the definition of protocols and procedures
to ensure that the exchanges can take place meaningfully.
3.4 MIP GATEWAY (GW)
DEFINITION
A MIP Gateway is an implementation in both software and hardware which conforms to a
named version or release of the MIP Specification. It incorporates the functionality of both a
MIP MEM (Message Exchange Mechanism) and a MIP DEM (Data Exchange Mechanism).
EXPLANATION
Both MEM and DEM are defined as ‘mechanisms’, which means that they must be functioning
systems. MIP Gateway is a more general term than MEM or DEM.
3.5 MIP MESSAGE EXCHANGE MECHANISM (MEM)
DEFINITION
A MIP Message Exchange Mechanism (MEM) consists of an ESMTP (Extended Simple Mail
Transfer Protocol) mailer, which is a software implementation of the MEM specification
contained in the MTIDP.
EXPLANATION
The ESMTP mailer logically resides within the MIP Gateway. However, this is not technically
necessary; since the MIP Gateway provides access between the MIP and National LANs, the
mailer could be located on the National LAN.
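As a rough illustration of mailer-based exchange (a sketch, not the MTIDP specification itself), a MEM-style message with an attachment could be assembled with Python’s standard email library before hand-off to an (E)SMTP transport. The addresses, subject and attachment name below are all invented:

```python
from email.message import EmailMessage

def build_mem_message(sender, recipient, subject, body, attachment=None):
    """Assemble an RFC 5322 message for hand-off to an ESMTP mailer.

    `attachment` is an optional (filename, bytes) pair. Actual delivery
    would use an SMTP client against the mailer on the MIP or National LAN.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    if attachment is not None:
        filename, data = attachment
        msg.add_attachment(data, maintype="application",
                           subtype="octet-stream", filename=filename)
    return msg

# Invented example addresses on either side of a MIP Gateway.
msg = build_mem_message("c2is@national.lan", "gateway-b@mip.lan",
                        "FREE TEXT MESSAGE", "Situation report follows.",
                        attachment=("overlay.bin", b"\x00\x01"))
```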
3.6 MIP DATA EXCHANGE MECHANISM (DEM)
DEFINITION
A MIP Data Exchange Mechanism (DEM) consists of a replication mechanism which is a
software implementation of the DEM specification contained in the MTIDP. The DEM is
designed to be used in conjunction with a MIP Information Exchange Data Model schema and
its associated MIP Information Resource Dictionary (MIRD).
EXPLANATION
The DEM logically resides within the MIP Gateway.
3.7 MIP INFORMATION EXCHANGE DATA MODEL (MIP IEDM)
DEFINITION
The MIP IEDM is a formally defined data structure designed to encompass the minimal
subset of operational data required for multinational exchange in accordance with the MIP
V&S and MCP.
EXPLANATION
The term MIP IEDM is a generic term which can be applied to unspecified releases or
versions of the MIP Information Exchange Data Model (such as the LC2IEDM, the C2IEDM
and the current JC3IEDM).
3.8 MIP INFORMATION RESOURCE DICTIONARY (MIRD)
DEFINITION
The MIRD is a database schema and an associated data fill which provides a store for
metadata describing a specific version of the MIP Information Exchange Data Model.
EXPLANATION
Part of the MIRD is used by the Replication Mechanism as a means of minimising the data
exchanged and of ensuring integrity. It has recently been extended to include an encoding of
basic business rules designed to ensure semantic integrity which can be used at the
application level (i.e. either before data is submitted to the MIP Gateway for transmission or
after it has been received from a MIP Gateway but before it is submitted to a national C2IS).
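As an illustration of how dictionary metadata can drive application-level checks, a fragment of MIRD-style validation might look as follows. The attribute names, kinds and domain values are invented for the sketch and are not taken from any MIRD release:

```python
# Invented MIRD-style metadata: each attribute is described by type
# information that a validator can apply before data reaches the Gateway.
mird = {
    "unit-category-code": {"kind": "code", "domain": {"ARMOUR", "INFANTRY"}},
    "unit-name":          {"kind": "text", "max_length": 50},
}

def validate(attribute, value):
    """Check one attribute value against its metadata entry."""
    meta = mird[attribute]
    if meta["kind"] == "code":
        return value in meta["domain"]
    return isinstance(value, str) and len(value) <= meta["max_length"]
```

A real MIRD also carries the structural metadata the replication mechanism uses to minimise the data exchanged and to preserve integrity.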
3.9 MIP LAN
DEFINITION
The MIP LAN is a limited network which provides connectivity between MIP Gateways. The
only protocol used for transport is TCP/IP.
EXPLANATION
The term MIP LAN is required to distinguish the network on the multinational side of a MIP
Gateway from that on the national side (which is known as the National LAN).
3.10 MIP SOLUTION
DEFINITION
The MIP Solution is what exists when two or more MIP Gateways are connected across a
MIP LAN. It provides the capability to exchange actual data in order to improve common
understanding of information.
EXPLANATION
The MIP Solution should be seen as the opposite, concluding end of the spectrum of MIP
definitions from the MIP mission: it is the complete capability which enables the MIP mission
to be fulfilled.
3.11 MIP SCOPE
DEFINITION
The MIP Scope defines the constraints, limitations and boundaries within which MIP
operates.
EXPLANATION
The MIP Scope can be broadly understood as the Terms of Reference for the MIP
programme. Importantly, it explains where MIP and national responsibilities begin and end.
3.12 NATIONAL LAN
DEFINITION
A National LAN is a network controlled by a single nation which cannot be accessed by other
MIP nations, but which might be connected to a MIP Gateway for purposes of multinational
data exchange.
EXPLANATION
The term National LAN is required to distinguish the network on the national side of a MIP
Gateway from that on the multinational side (which is known as the MIP LAN).
3.13 NATIONAL C2IS
DEFINITION
The National C2IS is a Command and Control Information System which resides on a
National LAN.
EXPLANATION
The behaviour of the national C2IS is outside the scope of MIP. However, it is recognised
that, in order to achieve the international interoperability desired by the MIP mission, a
national C2IS should provide a degree of functionality which permits the meaningful
interpretation of operational data received from a MIP Gateway.