SCDHHS Master Test Plan
NoSQL - Member Data Initiative (MDI)
Prepared by:
South Carolina Department of Health and Human Services (SCDHHS) Project Management Office (PMO)
MDI Master Test Plan
Contents
1. Revision History...................................................................................................................................5
2. Approvals.............................................................................................................................................5
3. Definitions...........................................................................................................................................6
4. Reference Documents.........................................................................................................................7
5. Project Overview.................................................................................................................................7
5.1 EDS NoSQL Solution.....................................................................................................................9
6. Test Plan Purpose..............................................................................................................................10
6.1 Testing Objectives......................................................................................................................11
7. Scope.................................................................................................................................................12
7.1 In Scope Requirements..............................................................................................................12
7.2 Out of Scope Requirements.......................................................................................................12
8. Integration and Intersystem Interfaces.............................................................................................12
9. Test estimate.....................................................................................................................................13
10. Testing Schedule............................................................................................................................13
10.1 Unit............................................................................................................................................13
10.2 QA/SIT........................................................................................................................................13
10.3 UAT............................................................................................................................................14
11. Test Environments.........................................................................................................................14
11.1 Hardware Configuration............................................................................................................14
11.1.1 Development Environment:...............................................................................................14
11.1.2 QA Environment:...............................................................................................................15
11.1.3 UAT Environment:..............................................................................................................15
11.1.4 Production Environment:...................................................................................................16
11.2 Software....................................................................................................................................16
11.2.1 Software Requirements.....................................................................................................16
12. Test Data........................................................................................................................................16
13. Test Users......................................................................................................................................17
14. Testing Responsibilities..................................................................................................................18
PMO_TCoE_Master Test Plan_Template v1.0 2
14.1 VMODEL with Accountability.....................................................................................................18
14.2 Roles and Responsibilities for Major Test Events.......................................................................18
15. Test Types......................................................................................................................................20
15.1 Unit testing................................................................................................................................20
15.2 Functional Integration Test (QA)................................................................................................20
15.3 System Test................................................................................................................................21
15.4 User Acceptance Test (UAT)......................................................................................................23
16. Performance Testing......................................................................................................................23
17. Test Deliverables...........................................................................................................................23
17.1 Status and Issue Reporting........................................................................................................24
18. Project Testing Related Tools........................................................................................................24
19. Defect Management......................................................................................................................24
20. Objectives of the defect review meetings.....................................................................................25
20.1 Purpose of the defect review meeting.......................................................................................25
20.2 Defect reporting and resolution process...................................................................................25
20.3 Defect escalation procedure......................................................................................................27
20.4 Defect severity definitions.........................................................................................................28
20.5 Defect life cycle stage................................................................................................................28
21. Results and metrics reporting........................................................................................................30
22. Communication and Escalation.....................................................................................................31
23. Assumptions/Constraints/Risks/Issues..........................................................................................32
23.1 Assumptions..............................................................................................................................32
23.2 Dependencies............................................................................................................................32
23.3 Constraints.................................................................................................................................33
23.4 Risks...........................................................................................................................................33
List of Tables
Table 1 : Reference Documents...................................................................................................................7
Table 2 : Included / Excluded Scope...........................................................................................................12
Table 3 : QA/SIT Testing Schedule Table....................................................................................................12
Table 6 : Dev Environment Site Requirements..........................................................................................13
Table 7 : QA Environment Site Requirements...........................................................................................14
Table 8 : UAT Environment Site Requirements..........................................................................................14
Table 9 : Production Environment Site Requirements...............................................................................15
Table 10 : Unit Testing type.......................................................................................................................19
Table 11 : Functional Integration Test (QA)...............................................................................................19
Table 12 : System Test...............................................................................................................................20
Table 13 : User Acceptance Test (UAT)......................................................................................................22
Table 14 : Project Testing Related Tools....................................................................................................23
Table 15 : Defect escalation procedure.....................................................................................................26
Table 16 : Defect severity definitions........................................................................................................27
Table 17 : Defect life cycle stage...............................................................................................................28
Table 18 : Results and metrics reporting...................................................................................................29
Table 19 : Communication and Escalation.................................................................................................30
Table 20 : Escalation hierarchy..................................................................................................................31
List of Figures
Figure 1 : EDS MDI Business Data View........................................................................................................9
Figure 2 : Pattern for EDS process flow........................................................................................................9
Figure 3 : EDS NoSQL MDI Architecture Diagram......................................................................................10
Figure 4 : VModel with Accountability.......................................................................................................17
Figure 5 : Defect life cycle..........................................................................................................................25
1. Revision History
Version No. | Date | Revised By | Description of Change
0.1 | 08/14/2017 | TurningPoint/Sudhakar Vinjamuri | Initial Draft Submitted to SCDHHS
1.2 | 09/18/2017 | TurningPoint/Sudhakar Vinjamuri | Final version submission to SCDHHS after review and feedback corrections.
2. Approvals
The undersigned acknowledge that they have reviewed and agree with the information presented within this document. Changes to this plan will be coordinated with, and approved by, the undersigned, or their designated representatives. The Project Sponsor will be notified when approvals occur.
Signature: Date: 12/21/2017
Print Name: Rajesh Kadarkarai
Title: TCoE Lead
Role: Test Lead

Signature: Date: 12/21/2017
Print Name: Raja Gampa
Title: Performance Manager
Role: Program Director

Signature: Date: 12/21/2017
Print Name: Mark Summers
Title: TCoE Manager
Role: Test Manager

Signature: Date: 12/21/2017
Print Name: Sreelekha Vuppalancha
Title: TCoE Lead
Role: PMO TCoE Auditor
3. Definitions
ACRONYM | TRANSLATION
API Application Program Interface
BR Business Requirement
DEV Development Environment
DR Disaster Recovery Environment
EDS Enterprise Data Services
I&C Installation and Configuration
MAGI Modified Adjusted Gross Income
MDI Member Data Initiative
MES Medicaid Enterprise Services
MMIS Medicaid Management Information System
Non-MAGI Non-Modified Adjusted Gross Income
ODI Oracle Data Integrator
ODS Operational Data Store
PRE-PROD Pre-Production environment
PROD Production environment
RMMIS Replacement Medicaid Management Information System
RTM Requirements Traceability Matrix
RU Business Rule
SCDHHS South Carolina Department of Health and Human Services
SQL Structured Query Language
SSN Social Security Number
TCoE Testing Center of Excellence
TEST Test environment
UAT User Acceptance Testing
XML Extensible Markup Language
4. Reference Documents
The documents listed below were used as a source for the high-level test planning for the EDS NoSQL Solution - MDI portion of the project.
Documents and Repository Path | Definition
Requirement Stories from JIRA | JIRA Stories
MCTRA BRD | MCTRA BRD
FRS | MCTRA-FRS
5. Project Overview
SCDHHS has engaged TurningPoint Global Solutions (TurningPoint) to provide an EDS NoSQL solution to enhance SCDHHS’ strategy to provide Medicaid Enterprise Services (MES) functionality across independent subsystems, while maintaining control over SCDHHS data. The MarkLogic software solution is the platform to deliver the NoSQL Medicaid Enterprise Data Services (EDS).
Currently, SCDHHS Medicaid data is exchanged across multiple systems in various formats. It is difficult and time consuming for agency staff to gather all related existing data relative to a Medicaid member. The data includes the following information: member contact information, family and household data, demographic data, employment data, other insurance data, etc. The EDS MDI will provide a more consistent, consolidated current view of a member’s data in a centralized repository.
As citizens of South Carolina apply for Medicaid, their eligibility for coverage must first be determined. Eligibility applications are submitted in various ways (e.g. paper, phone, on-line, in-person interview). The data from these applications eventually ends up in one or more systems. Each of the current systems listed below will be a source of member data in the Operational Data Store (ODS) database:
Medicaid Eligibility Determination System (MEDS)
MEDS is the Medicaid Eligibility Determination System, which contains member, eligibility, and reference data. Implemented as a COBOL/IDMS solution, it serves the approximately 200,000 citizens in the non-modified adjusted gross income (non-MAGI) Medicaid population.
Curam
Curam is the eligibility determination system, an IBM solution, for the approximately 800,000 citizens in the modified adjusted gross income (MAGI) Medicaid population. Curam contains member, eligibility, and reference data.
Medicaid Management Information System (MMIS)
MMIS is the Medicaid Management Information System, which contains member, eligibility, enrollment, reference, and claims data. It is the Medicaid claims processing and information retrieval system, implemented as a COBOL/IDMS solution.
OnBase
OnBase is the SCDHHS Electronic Document Management System (EDMS) that contains eligibility applications, person data (such as critical supporting documents), workflow data, and reference data.
Eligibility File
The source systems currently maintain the System of Record (SOR) data. The EDS NoSQL Solution will be able to access the source systems, obtain all the relevant data, and store it in a Raw Data Lake (RDL) data repository. During materialization of raw data, records are matched, de-duplicated, and consolidated, to create a ‘golden copy’ of an entity. These new records/entities (“source of truth” data elements) are then stored in the Operational Data Store (ODS) database, where they will be accessed more efficiently. Data from the ODS database can be made available to end users, downstream trading partners, and other systems as needed.
Initially, there will be a one-time data load of all member records from each source system, to establish the member data in the EDS. The process to handle exceptions (errors, fallouts) will be defined as part of the system design. Records may need to be reviewed, reconciled and perhaps reprocessed. After the initial load, as new or changed records are generated in the source systems, incremental loads to the EDS will continue to update the RDL and ODS repositories to maintain the currency of the member record in the EDS. The exception process can be utilized for all future incremental loads.
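The matching, de-duplication, and consolidation step described above can be sketched as follows. This is a minimal, hypothetical illustration of a "golden record" merge, not the actual materialization logic: the match key (SSN), field names, and precedence rule (most recent non-empty value wins) are all assumptions for the example.

```python
from datetime import date

def materialize_golden_records(raw_records):
    """Group raw source records by a match key and consolidate each group
    into a single 'golden' record, preferring the most recently updated
    non-empty value for every field."""
    groups = {}
    for rec in raw_records:
        groups.setdefault(rec["ssn"], []).append(rec)

    golden = []
    for ssn, recs in groups.items():
        # Newer records win field-by-field, so sort oldest-first and
        # let later updates overwrite earlier ones.
        recs.sort(key=lambda r: r["updated"])
        merged = {"ssn": ssn, "sources": [r["source"] for r in recs]}
        for rec in recs:
            for field, value in rec.items():
                if field in ("source", "updated"):
                    continue
                if value:  # skip empty values so they cannot erase data
                    merged[field] = value
        golden.append(merged)
    return golden

# Two raw records for the same member, from two source systems:
raw = [
    {"ssn": "123-45-6789", "name": "J. Doe", "phone": "",
     "source": "MEDS", "updated": date(2017, 6, 1)},
    {"ssn": "123-45-6789", "name": "Jane Doe", "phone": "803-555-0100",
     "source": "Curam", "updated": date(2017, 8, 1)},
]
print(materialize_golden_records(raw))
```

The sketch keeps a `sources` list on each golden record so exceptions (fallouts) can be traced back to the originating system during reconciliation.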
Figure 1 displays the business data view utilizing the MDI to support a 360°-member data view. The diagram shows the data consolidation from the Medicaid application intake, through the Medicaid application process and eligibility determination, as well as the enrollment process, to the consolidated 360° view of a member. Finally, there is an export of eligible member records (eligibility file) to the MMIS system, and an export of enrollment data (834-enrollment file) to the Medicaid participating trading partners or insurers.
Figure 1: EDS MDI Business Data View
For more details on requirements refer to SCDHHS Confluence page http://cuihhsiutl12.clemson.edu/display/NOSQL/Requirements_NoSQL
5.1 EDS NoSQL Solution
The proposed solution follows a pattern that incorporates five major elements:
Figure 2 : Pattern for EDS process flow
• Access – Access data from mainframe, mid-range, API-based, and file-based data sources using a set of processes and supporting tools.
• Ingest – Transport data from its source and ingest in raw, indexed format into the NoSQL RDL.
• Materialize – Transform the RDL data into consolidated member data and store it in the NoSQL ODS.
• Deliver – Expose data via web services API, and file exchange for processing and consumption.
• Consume – Process and present the data through Enterprise Data Store.
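The five elements above can be pictured as a chain of small, composable stages. The sketch below is purely illustrative — the function names, record layout, and in-memory stores are assumptions, not the actual MarkLogic-based implementation:

```python
import json

# Each stage is a plain function so the flow can be exercised in isolation.

def access(source):
    """Access: pull raw records from a (simulated) source system."""
    return source["records"]

def ingest(raw_records):
    """Ingest: store records in raw, indexed form (here, keyed by id)."""
    return {rec["id"]: rec for rec in raw_records}

def materialize(raw_data_lake):
    """Materialize: transform raw documents into consolidated member data."""
    return [{"member_id": rid, "name": rec["name"].title()}
            for rid, rec in sorted(raw_data_lake.items())]

def deliver(ods_records):
    """Deliver: expose consolidated data, e.g. as a JSON payload."""
    return json.dumps(ods_records)

source = {"records": [{"id": "m2", "name": "doe, jane"},
                      {"id": "m1", "name": "smith, john"}]}
payload = deliver(materialize(ingest(access(source))))
print(payload)
```

Keeping the stages separate mirrors the test plan's structure: each of Access, Ingest, Materialize, and Deliver can be verified independently before the end-to-end chain is tested.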
The scope of this test plan document includes Access, Ingest, Materialize and Deliver.
The diagram below indicates the many custom and third-party components and processes employed in achieving the objectives of Access, Ingest, Materialize, and Deliver.
For complete details of the Architecture refer to the EDS NoSQL Architecture and Component details document located in SCDHHS Confluence page http://cuihhsiutl12.clemson.edu/display/NOSQL/SCDHHS+Enterprise+Data+Services+NoSQL-Architecture
Figure 3 : EDS NoSQL MDI Architecture Diagram
6. Test Plan Purpose
This document constitutes the System Integration Test Plan for the South Carolina Department of Health and Human Services (SCDHHS) Enterprise Data Services (EDS) NoSQL Solution project. It provides the high-level test plan and test strategy for testing the business requirements (BR) for the Member Data Initiative (MDI) project. This document conveys the
scope of testing with respect to the business requirements, business rules, functional, non-functional, and system-related requirements, and the design and development testing of the MDI solution.
6.1 Testing Objectives
A series of tests will be executed to verify and validate the software quality for MDI. The testing team will coordinate with the other data sources required for integration with the Medicaid Enterprise data. Testing will address all components of the MDI application, such as business rules, functionality, NoSQL data store query languages, integration between all data sources, files transferred and file formats, workflow process, and usability of the data and application.
The testing objectives for EDS NoSQL MDI solution will include Integration testing, Functional testing, End-to-End testing, Performance testing, Regression testing, and User Acceptance testing.
Below is a summary of the various types of testing that will address the MDI requirements:
• Integration testing will assure that MDI is able to access data from the MEDS, MMIS, OnBase, and Curam data sources.
• Perform integration testing to ensure data can be ingested into the MDI in its raw format.
• Functional testing will be performed to ensure that the materialization of the data is performed correctly and that the Member 360° view is created as expected.
• Perform functionality testing on each step of ACCESS, INGEST, and MATERIALIZE of data from every input data source, and verify and validate that the business rules are developed to DELIVER the member eligibility file and 834 file for each member.
• Perform functional testing to verify and validate that the Member Eligibility file is consumed by the end users successfully.
• End-to-End testing will be performed to validate that the Active Eligibility file is generated and the contents are valid. The testing will also ensure the active eligibility file is ready for delivery. Additionally, End-to-End testing will cover the consumption of data by the Medicaid Enterprise.
• Performance testing will be performed as an internal testing component, to make sure the EDS NoSQL Solution is able to access and ingest various file formats from each data source daily/weekly.
• Regression testing will be performed to ensure all the defects reported during testing are fixed and resolved with no side effects.
• User Acceptance testing will be performed by the SCDHHS end users.
TurningPoint will develop a requirements traceability matrix to ensure sufficient coverage of functional test cases for MDI. A User Acceptance Test (UAT) will be conducted upon successful completion of System Test.
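The end-to-end check that the eligibility file "is generated and the contents are valid" might be scripted along the lines below. This is a sketch only: the pipe-delimited row layout (`member_id|ssn|start|end`) is a hypothetical stand-in, not the actual eligibility or 834 file specification.

```python
import re

# Hypothetical pipe-delimited eligibility-file row, used only to
# illustrate the kind of content validation described above:
#   member_id|ssn|eligibility_start(YYYYMMDD)|eligibility_end(YYYYMMDD)
ROW = re.compile(r"^\d+\|\d{3}-\d{2}-\d{4}\|\d{8}\|\d{8}$")

def validate_eligibility_file(lines):
    """Return (valid_count, errors) for a delivered eligibility file,
    where errors is a list of (line_number, line) pairs."""
    errors = []
    valid = 0
    for lineno, line in enumerate(lines, start=1):
        if ROW.match(line):
            valid += 1
        else:
            errors.append((lineno, line))
    return valid, errors

sample = [
    "1001|123-45-6789|20170901|20180831",
    "1002|bad-record",
]
valid, errors = validate_eligibility_file(sample)
print(valid, errors)
```

A check of this shape gives the tester both a pass count and a pinpointed list of malformed rows to feed into the defect reporting process.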
7. Scope
7.1 In Scope Requirements
The scope of this document is to define the requirements for the MDI as follows:
Ref ID | Functionality
1 | Develop hierarchy for applying reimbursement rates based on approved rules
2 | Establish process for updating and modifying rules as needed, to include ancillary reimbursement, budgets, and actuary as needed
3 | Define agency approval and review process
7.2 Out of Scope Requirements
Ref ID | Functionality
1 | Anything not mentioned in In Scope
8. Integration and Intersystem Interfaces
The table below lists the interfaces/applications involved in the integration testing of the MDI project, along with the individual point of contact for coordinating integration testing.
System ID | Application/Functional Area | Testing Responsibility/SME
MMIS UAT 2 | MMIS | Rajesh Kadarkarai
9. Test estimate
Tasks | Source
Test Planning | Manual
Test Case Development | (auto from "detail_data" sheet)
Existing Test Case Maintenance | (auto from "detail_data" sheet)
Test Data Creation/Envir Prep | (auto from "detail_data" sheet)
Automation Script Development | (auto from "detail_data" sheet)
Performance Script Development | (auto from "detail_data" sheet)
Manual Execution Estimate | (auto from "detail_data" sheet)
Number of Execution Cycles |
Regression Suite Names & Execution Estimate | Webstore Smoke Checkout, eComm End to End
Performance Workload Name and Execution Estimate | Webstore_5000 Concurrent_10 sec think time
Total Hours |
10. Testing Schedule
10.1 Unit
Story/Req. # | Environment | Test Start Date | Test End Date
Test Plan Document preparation | QA/TEST | 08/29 | 09/11
Test Plan Execution | QA/TEST | 09/12 | 10/09
10.2 QA/SIT
Story/Req. # | Environment | Test Start Date | Test End Date
Test Plan Document preparation | QA/TEST | 08/29 | 09/11
Test Plan Execution | QA/TEST | 09/12 | 10/09
Note: Test case execution has a dependency on QA environment access and code deployment.
10.3 UAT
Story/Req. # | Environment | Test Start Date | Test End Date
Test Plan Document preparation | QA/TEST | 08/29 | 09/11
Test Plan Execution | QA/TEST | 09/12 | 10/09
11. Test Environments
The following identifies the environments used for testing.
Site Identification and Implementation
SCDHHS Data Center, Clemson, SC
The Development, QA (Test), UAT and Production Site for the EDS NoSQL Solution will be located at Clemson, SC, managed by Clemson University.
11.1 Hardware Configuration
The hardware and software items needed for the EDS NoSQL implementation are listed in the tables below.
11.1.1 Development Environment:
Component | No. of Nodes/Machines | CPU/Node | Memory | Storage
Load Balancer | 1 | N/A | N/A | N/A
ML Cluster | 3 | 8 vCPUs | 32 GB/node | 1 TB/node
ML Data Hub & Ingest Processing | 1 | 8 vCPUs | 16 GB | 500 GB mounted
Application Server | 1 | 4-8 vCPUs | 16 GB | 100 GB
Build Server | 1 | 4 vCPUs | 16 GB | 100 GB
11.1.2 QA Environment:
Component | No. of Nodes/Machines | CPU/Node | Memory | Storage
Load Balancer | 1 | N/A | N/A | N/A
ML Cluster | 6 | 8 vCPUs | 32 GB/node | 1 TB/node
ML Data Hub Framework | 1 | 8 vCPUs | 16 GB | 500 GB (mounted; shared between Data Hub and Ingest Processing)
Ingest Processing | 1 | 8 vCPUs | 16 GB | Uses the mount above, shared between Data Hub and Ingest Processing
Application Server | 1 | 4-8 vCPUs | 16 GB | 100 GB
Build Server | 1 | 4 vCPUs | 16 GB | 100 GB
11.1.3 UAT Environment:
Component | No. of Nodes/Machines | CPU/Node | Memory | Storage
Load Balancer | 1 | N/A | N/A | N/A
ML Cluster | 6 | 8 vCPUs | 32 GB/node | 1 TB/node
ML Data Hub Framework | 1 | 8 vCPUs | 16 GB | 500 GB (mounted; shared between Data Hub and Ingest Processing)
Ingest Processing | 1 | 8 vCPUs | 16 GB | Uses the mount above, shared between Data Hub and Ingest Processing
Application Server | 1 | 4-8 vCPUs | 16 GB | 100 GB
Build Server | 1 | 4 vCPUs | 16 GB | 100 GB
11.1.4 Production Environment:
Component | No. of Nodes/Machines | CPU/Node | Memory | Storage
Load Balancer | 1 | N/A | N/A | N/A
ML Cluster | 6 | 8 vCPUs | 32 GB/node | 1 TB/node
ML Data Hub Framework | 1 | 8 vCPUs | 16 GB | 500 GB (mounted; shared between Data Hub and Ingest Processing)
Ingest Processing | 1 | 8 vCPUs | 16 GB | Uses the mount above, shared between Data Hub and Ingest Processing
Application Server | 1 | 4-8 vCPUs | 16 GB | 100 GB
Build Server | 1 | 4 vCPUs | 16 GB | 100 GB
11.2 Software
11.2.1 Software Requirements
• MarkLogic 8.0
• Red Hat Enterprise Linux 7.0 (packages: redhat-lsb, glibc, gdb, cyrus-sasl-lib, glibc.i686)
• JRE 8.0
• MLCP 8.0.6.4
• CORB 2.3.2
• Gradle 3.5
12. Test Data
Test Data: Data will be accessed from the MEDS, MMIS, OnBase, and CURAM data sources into the data hub server and into the RDL. The test team will utilize this data to verify and ensure data validity. No mock data will be prepared. The same data will be ingested from the RDL to the ODS and materialized. The test team will select random samples from these data sources for each table for analysis. The test team will develop and execute XQueries to validate the data in the MarkLogic Query Console.
• Test team will identify sample data for each of the data sources in the QA environment
• Test data will be refreshed for each regression testing cycle
• Data will be accessed from the MEDS, MMIS, OnBase, and CURAM data sources
Application/Service | Assigned Resource | Data Requirements
EDS | Rajesh | MMIS Source
EDS | Rajesh | OnBase Source
EDS | Rajesh | Curam Source
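The random-sampling validation described above (the actual checks run as XQueries in the MarkLogic Query Console) can be sketched in outline. The function name, record layout, and fixed seed below are assumptions for illustration only:

```python
import random

def sample_and_compare(source_rows, ods_rows, key, fields, n, seed=0):
    """Pick n random source rows and verify the same field values landed
    in the ODS copy. Returns a list of
    (key, field, source_value, ods_value) mismatches; empty means pass."""
    ods_by_key = {row[key]: row for row in ods_rows}
    rng = random.Random(seed)  # fixed seed: reproducible sample per cycle
    picks = rng.sample(source_rows, min(n, len(source_rows)))
    mismatches = []
    for row in picks:
        ods_row = ods_by_key.get(row[key], {})
        for field in fields:
            if row.get(field) != ods_row.get(field):
                mismatches.append((row[key], field,
                                   row.get(field), ods_row.get(field)))
    return mismatches

source = [{"id": 1, "name": "Doe"}, {"id": 2, "name": "Roe"}]
ods = [{"id": 1, "name": "Doe"}, {"id": 2, "name": "ROE"}]
print(sample_and_compare(source, ods, "id", ["name"], n=2))
```

Fixing the sampling seed makes a regression cycle's sample reproducible, which helps when a reported mismatch must be re-verified after a defect fix.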
13. Test Users
Each test case will require one or more test users. Test users must be created to replicate real business users, allowing defects related to authorization profiles and delegation of duties to be identified. Using unrealistic role assignments for test users will invalidate all functional tests. A catalog of test users should be maintained. Automatic provisioning of test users needs to be established as part of the setup of the test environment.
Name | Project Role | Email | Phone | Location
Rajesh Kadarkarai | Test Lead | [email protected] | 82131 | Jefferson
14. Testing Responsibilities
14.1 VMODEL with Accountability
Figure 4 : VModel with Accountability
14.2 Roles and Responsibilities for Major Test Events
Phase | Activity | Test Lead | Dev Lead | Unix, DBA Lead | PM | BA | Comments
Requirement Analysis | Providing detailed list of requirements in scope for the release | I | I | I | A&R | R | Functional lead is also responsible for any functional requirements developed for the release
Test Case Development | Requirement analysis and test case development | A&R | C | C | C | C | During requirement understanding and test case development, the test lead will require help from other stakeholders to finalize the test plan
Test Case Development | Test requirements review and sign off | A | R | I | R | R | While the test lead has final accountability for finalizing the test plan, it is the other stakeholders' responsibility to review and provide sign off
QA Test Execution | Perform FT, SIT, RT, PT and report results | A&R | R | C | C | C | During test execution, if any clarifications are required, the CSR and Functional leads are consulted for help. The development team takes equal responsibility to ensure test execution is completed
QA Test Execution | Bug fixes and retest | R | A&R | C | C | C | During bug fixes, if any clarifications are required, the CSR and Functional leads are consulted for help. The development team takes accountability for resolving defects and ensuring retest is completed
QA Test Execution | Test Results Review and Sign off | C | C | I | A | R |
Acceptance test execution | UAT execution | C | R | C | A | R |
R – Responsible  A – Accountable  C – Consulted  I – Informed
Responsible: Those who do the work to achieve the task. There is typically one role with a participation type of Responsible, although others can be delegated to assist in the work required.
Accountable: (also Approver or final approving authority) Those who are ultimately accountable for the correct and thorough completion of the deliverable or task, and to whom the Responsible is accountable. In other words, an Accountable must sign off (approve) the work that the Responsible provides. There must be only one Accountable specified for each task or deliverable.
Consulted: Those whose opinions are sought.
Informed: Those who are kept up-to-date on progress, often only on completion of the task or deliverable, and with whom there is just one-way communication.
15. Test Types
15.1 Unit testing
Table 1 : Unit Testing type
Purpose: This preliminary test is performed by the development team to test individual configuration, custom programs, and/or technical services for the EDS NoSQL Solution, to ensure that they function according to the detailed technical specification. Unit test is a white-box test and should test all possible flows. Both positive and negative conditions will be tested.
Development Phase: Development and Testing
Test Scope: All configurations, code validation, memory testing, integration, code complexity, etc.
Test Environment: Development Environment
Test Data: Manual data created by developers
Interface Requirements: NA
Role: Developer
Entry Criteria:
1. Formal reviews for process models, functional specs and technical specifications have been completed
2. All inspection-related defects have been corrected
3. All documentation and design of the architecture must be made available
4. Development of the component is complete and compiles without error
5. All unit test cases are documented
Exit Criteria:
1. All unit test cases completed successfully
2. All source code is unit tested
3. No outstanding critical defects
4. All outstanding defects are entered into the defect tracker
5. All test results have been documented
15.2 Functional Integration Test (QA)
Table 2: Functional Integration Test (QA)
Purpose: Functional testing validates the full operability of interconnected functions, methods, or objects within a functional area. This includes a set of logically related activities or business processes carried out to achieve a defined business outcome.

Functional test cases will typically consist of a series of business processes or stories joined together to achieve a business process. The smaller size of the test cases enables testing of multiple data sets and permutations.

Functional testing happens after, or in parallel with, the development phase as and when all components for a specific flow are complete. It is performed by an independent testing team in the QA environment.
During subsequent integration testing activities, these business process (functional) tests are combined to build end-to-end integration test scenarios.
Development Phase: Development and Testing
Test Scope: All functional tests; requirement/story coverage using test design techniques such as Orthogonal Analysis, Decision Tables, Equivalence Partitioning, etc.
Test Environment: Test Environment
Test Data: Manual data created by the test team
Interface Requirements: Interface connectivity required for impacted systems
Role: QA Team
Entry Criteria:
1. All specifications are frozen and the requirements change control process has begun
2. Proper test data is available
3. Test plans and test cases are reviewed and signed off
4. Unit testing has been completed
5. Specifications for the product have been completed and approved
6. All test hardware platforms have been successfully installed and configured, and are functioning properly
7. All standard software tools, including testing tools, have been successfully installed and are functioning properly
8. All personnel involved in the system test effort have been trained in the tools to be used during the testing process
9. All personnel involved in the system test effort have been trained in the usage of the application and its new features
10. All functional test cases are documented
Exit Criteria:
1. Test case execution completed with 90% passed
2. All defects are recorded in JIRA
3. No outstanding "showstopper" or "severe" defects
4. All test results have been documented
5. All code has been migrated into the QA environment
6. Coverage of code/functionality/requirements is 100% of functional requirements
15.3 System Test
Table 3: System Test
Purpose: SIT validates a set of business processes that define a business scenario in a comprehensive and self-contained manner at a macro level.

This is an end-to-end test of the business process. Typically, a business scenario involves testing multiple SAP module test cases together. The primary objective of this testing is to discover errors in the integration between different modules and to verify that the modules work together correctly as one function. E2E testing validates the integration within SAP and between SAP and legacy (all non-SAP) applications. All testing related to validation of the data interchange between SAP and legacy applications is categorized as interface testing.
Security role-based authorization testing is performed to ensure that all security profiles and roles are implemented as designed. Security profiles are designed and built based on the job roles (i.e., positions) of the end users. Security roles are assigned at the business transaction level.

The objectives of security testing are to:
- Ensure that the user has access to the required transactions to perform their job
- Ensure that the user does not have access to transactions other than those required for the role
- Ensure that access to critical system administration transactions is controlled
- Ensure that only authorized persons have the right to view the information on screens and reports
- Ensure that delegation performed in SAP (wherever the system allows a user to delegate their authority to other users) is tested from the viewpoint of both the delegator and the person to whom authority is delegated
Test Scope:
- Full end-to-end business process
- Performance testing
- Regression
- Interface testing with interfacing systems
- Security role-based authorization testing
- End-to-end scenarios executed with user IDs mapped to actual security roles
- Batch job execution using scheduled runs
- Printers and other devices
Development Phase: Development and Testing
Test Environment: QA Environment or Pre-Prod
Test Data: Data from mock cutover or the Test Data Management tool
Interface Requirements: Interface connectivity required for all interfacing systems
Role: QA Team
Entry Criteria:
1. All specifications are frozen and the requirements change control process has begun
2. Proper test data is available
3. Test plans and test cases are reviewed and signed off
4. Functional integration testing (QA) has been completed
5. All functional test cases are documented
Exit Criteria:
1. Test case execution completed with 100% passed
2. All defects are recorded in JIRA
3. No outstanding "showstopper" or "severe" defects
4. All test results have been documented
5. All code has been migrated into the Pre-Prod environment
6. No new defects have been discovered for a week prior to exiting system testing
7. Coverage of code/functionality/requirements is 100% of functional requirements
15.4 User Acceptance Test (UAT)
Table 4: User Acceptance Test (UAT)
Purpose: User acceptance testing is performed by business users. The users test the complete, end-to-end business processes to verify that the implemented solution performs the intended functions and satisfies the business requirements.
Development Phase: Final Prep or Implementation
Test Scope: UAT; full regression
Test Environment: Pre-Prod or Implementation
Test Data: Mock cutover or Test Data Management tool
Interface Requirements: Interface connectivity required for all interfacing systems
Role: Process Team & Business Users
Entry Criteria:
1. The application works functionally as defined in the specifications
2. No outstanding "showstopper" or "severe" defects
3. All areas have had testing started on them, unless pre-agreed by the UAT stakeholder and the Test and Project managers
4. The entire system is functioning and all new components are available, unless previously agreed between the UAT stakeholder/Test manager and the project managers
5. All test cases are documented and reviewed prior to the commencement of UAT
Exit Criteria:
1. The acceptance tests must be completed with a pass rate of not less than 98%
2. No outstanding "showstopper" or "severe" defects
3. Fewer than 5 significant defects outstanding
4. All test cases have been completed
5. No new defects have been discovered for a week prior to production implementation
6. All test results recorded and approved
7. UAT test summary report documented and approved
8. UAT close-off meeting held
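As a minimal illustration, the measurable UAT exit criteria above (98% pass rate, no showstoppers, fewer than 5 significant defects outstanding, no new defects for a week) can be sketched as a simple gate check. The function name and parameters are assumptions for illustration, not part of the plan:

```python
# Sketch of the UAT exit-criteria gate described above. Thresholds come from
# this plan; the field names and function shape are illustrative assumptions.
def uat_exit_ready(executed, passed, showstoppers, significant_open,
                   days_since_last_new_defect):
    pass_rate = passed / executed if executed else 0.0
    return (pass_rate >= 0.98                      # not less than 98% passed
            and showstoppers == 0                  # no showstopper/severe defects
            and significant_open < 5               # fewer than 5 significant defects
            and days_since_last_new_defect >= 7)   # no new defects for a week

# 196 of 200 passed -> exactly 98%, which meets the "not less than 98%" bar
print(uat_exit_ready(200, 196, 0, 3, 8))   # True
print(uat_exit_ready(200, 190, 0, 3, 8))   # False (95% pass rate)
```

The remaining criteria (documentation, approvals, close-off meeting) are procedural and would be tracked manually.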
16. Performance Testing
Performance testing is not in scope for TurningPoint; it will be performed by the SCDHHS Testing Center of Excellence.
17. Test Deliverables
The testing documents delivered as part of the SCDHHS EDS NoSQL solution will be:
Test plan document, which includes test cases. Location in JIRA:
- Test Plan: http://cuihhsiutl12.clemson.edu/display/NOSQL/EDS+NoSQL+MDI+Test+Plan
- Test Cases: http://cuihhsiutl10.clemson.edu/secure/Tests.jspa#/design?projectId=11801

Requirements Traceability Matrix:
- http://cuihhsiutl12.clemson.edu/display/NOSQL/EDS+NoSQL+MDI+Testing+RTM
17.1 Status and Issue Reporting
A test status report will be sent to project stakeholders at the end of every week throughout the project SDLC phases.
18. Project Testing Related Tools
Table 5: Project Testing Related Tools
Phase/Activity | Test Tool Requirement
Test case documentation (manual & automation) | Adaptavist Test Management (JIRA)
Requirement management | JIRA
Test case automation development and execution | Adaptavist Test Management (JIRA)
Test execution and results reporting | Adaptavist Test Management (JIRA)
Defect reporting and tracking | JIRA
Document storage | Confluence
Business process flow | BizAgi
Test data management | IBM Optim
19. Defect Management
Defect review meetings will be held on a daily basis with TurningPoint development leads, test leads, and all managers. The goal of these meetings is to ensure that defects are being resolved in a timely fashion and that any issues or questions are resolved. It is at these meetings that progress tracking of defect resolution and closure is communicated.
20. Objectives of the Defect Review Meetings

20.1 Purpose of the Defect Review Meeting
- To help prioritize defect fixes for the Implementation, Legacy Support, and Conversion teams
- To discuss and assign priority and severity to defects, and to discuss the expected and planned turnaround times
- To monitor and review the progress of defect fixes that are due or overdue as of the current date
- To determine the extent of retesting required due to a fix/enhancement
- To escalate defects/issues to the PMO when a quick resolution is required, or in case of a deadlock on ownership of defects/issues
- To identify whether a defect is assigned to the right team
- To identify defects that need to be deferred to subsequent releases
20.2 Defect Reporting and Resolution Process
Prerequisite: The development team and business team should have access to the defects section of Jira and be able to update defect details. Jira should be configured to send automatic emails when a new defect is logged, the assignee is changed, or the status is moved to re-test.
The diagram below helps in understanding the defect life cycle process quickly and easily. Defect severity definitions and recommended SLAs are available in the appendix. The various defect stages and their subsequent stages are also listed in detail in the appendix.
Figure 5 : Defect life cycle
20.3 Defect Escalation Procedure
The table below provides information on when to escalate a defect.
Table 6 : Defect escalation procedure
Defect Severity | # Blocking Test Cases | Slipped SLA | Candidate for Escalation
Any level | >10% of total test cases | | Yes
Critical | >5% of total test cases | | Yes
Any level | Any number | Yes, or a Go/No-Go meeting is scheduled within 5 days of the current day | Yes
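For illustration, the escalation triggers in Table 6 can be sketched as a decision function. The thresholds (10%, 5%, SLA slip, Go/No-Go within 5 days) come from this plan; the function shape and parameter names are assumptions:

```python
# Sketch of the escalation triggers in Table 6 above.
def escalation_candidate(severity, blocking, total_cases, sla_slipped,
                         days_to_go_no_go):
    blocked_ratio = blocking / total_cases if total_cases else 0.0
    # Any severity blocking more than 10% of total test cases.
    if blocked_ratio > 0.10:
        return True
    # Critical defects blocking more than 5% of total test cases.
    if severity == "Critical" and blocked_ratio > 0.05:
        return True
    # Any severity, any number blocked, if the SLA slipped or the release
    # Go/No-Go meeting is scheduled within 5 days.
    if sla_slipped or days_to_go_no_go <= 5:
        return True
    return False

print(escalation_candidate("Major", 12, 100, False, 30))  # True (>10% blocked)
print(escalation_candidate("Minor", 2, 100, False, 30))   # False
```

As the note below states, these criteria may be tightened as the go/no-go decision approaches.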
Defect communication and escalation procedure

First level of notification: As soon as the defect is logged in Jira, an auto-generated email is sent to the assigned person. Since the defect will be assigned to the development team alias, all team members subscribed to the alias will receive the email.

Daily status review meeting: Along with the test execution status discussions, all outstanding defects will be discussed in the meeting. The development team, business team, Basis team, QA management, and other stakeholders as appropriate will join the meeting. Defect details and the estimated time of fix will be documented in Jira accordingly.

Defect disposition meeting: This is a twice-weekly meeting in which only the high-impact defects identified as candidates for escalation are discussed in detail. Development team management and QA team management, along with the respective leads, will discuss the finer details and put together an action plan to resolve them.

Escalation email to the development team/SME team manager: The QA Manager will send an email with details of defects that need immediate attention to the development team/SME team manager, and on a need basis a triage call involving senior management will be organized to discuss associated risks, agree on a resolution plan, and review status.
Note: The escalation criteria above can be adjusted during execution based on the number of days left before the release go/no-go decision.
20.4 Defect Severity Definitions
Table 7: Defect severity definitions
Severity | Definition | Expected Time for Closure

Critical | 1 business day
A complete software system, subsystem, or software unit (program or module) within the system has lost its ability to perform its required function (a failure) and no workaround is available; OR testing of a significant number of tests cannot continue without closure; OR the defect is a potential showstopper for the Go/No-Go decision to enter the next stage or cutover without closure.

Major | 2 business days
The software system, subsystem, or software unit (program or module) within the system produces incorrect, incomplete, or inconsistent results; OR the defect impairs usability (the capability of the software to be understood, learned, used, and found attractive by the user when used under specified conditions [ISO 9126]).

Minor | 3 business days
Everything that is not Major or Critical.
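The closure SLAs in Table 7 are stated in business days, so a weekend never counts toward the deadline. A minimal sketch of that calculation (holiday handling omitted; the function name is an assumption):

```python
# Sketch: compute the expected closure date from Table 7's severity SLAs,
# counting business days (Mon-Fri). Holiday calendars are omitted for brevity.
from datetime import date, timedelta

SLA_BUSINESS_DAYS = {"Critical": 1, "Major": 2, "Minor": 3}

def expected_closure(logged_on, severity):
    remaining = SLA_BUSINESS_DAYS[severity]
    day = logged_on
    while remaining > 0:
        day += timedelta(days=1)
        if day.weekday() < 5:        # Monday=0 .. Friday=4
            remaining -= 1
    return day

# A Critical defect logged on Friday 2024-05-03 is due the following Monday.
print(expected_closure(date(2024, 5, 3), "Critical"))  # 2024-05-06
```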
20.5 Defect Life Cycle Stages
As part of the defect life cycle definition and defect management process, the defect stages identified below will be used.
Table 8: Defect life cycle stages

New
- Description: Defect identified and raised by a team; the defect has not yet been reviewed by the assigned team.
- Required Previous Status: NA
- Next Possible Status: Open, Assigned

Open
- Description: The assigned team acknowledges the defect by moving it to Open status; no one has yet been assigned to analyze the defect.
- Required Previous Status: New
- Next Possible Status: Assigned, Rejected, Deferred, Duplicate

Assigned
- Description: The defect is assigned to a user (developer) for analysis.
- Required Previous Status: New, Open
- Next Possible Status: Need More Info, Rejected, Deferred, Duplicate, Fixed, Retest

Need More Info
- Description: The defect is assigned to the tester to obtain additional information about the problem for further analysis.
- Required Previous Status: New, Open, Assigned
- Next Possible Status: Rejected, Deferred, Duplicate, Fixed, Retest

Rejected
- Description: An invalid defect has been logged. The defect can be rejected by the assigned team for various reasons: invalid data used by the tester; an invalid test case executed by the tester; the test steps followed by the tester were incorrect. Note: If the defect is rejected because requirements changed and the testing team was not notified of the updated requirement, the defect should not be rejected; it should be Closed.
- Required Previous Status: Open, Assigned
- Next Possible Status: Assigned

Fixed
- Description: The assigned team moves the defect to Fixed when the defect is fixed and ready to be deployed.
- Required Previous Status: Assigned
- Next Possible Status: Retest

Retest
- Description: The assigned team moves the defect to Retest when the fix has been deployed for testing in the required environment.
- Required Previous Status: Fixed
- Next Possible Status: Closed, Re-Open

Re-Open
- Description: If a defect fails retest, the defect is reopened and assigned back to the team that fixed it. Note: If the retest fails for a reason different from what the defect was logged for, a new defect should be opened for the new issue; the current defect should not be reopened in such cases.
- Required Previous Status: Retest
- Next Possible Status: Assigned, Fixed, Retest

Closed
- Description: The defect passes retest and can be closed.
- Required Previous Status: Fixed, Retest
- Next Possible Status: NA

Deferred
- Description: The defect is acknowledged by the assigned team but cannot be fixed within the release timeline because of constraints; the defect will then be deployed to production as a known risk. Moving a defect to Deferred requires approval from all key stakeholders; a CR needs to be initiated to make the change in the future; the approval email for deferring the defect needs to be attached to the defect.
- Required Previous Status: Assigned, Re-Open
- Next Possible Status: NA

Duplicate
- Description: The defect is a duplicate of an existing open defect and is the same as the previous one. The previous defect ID needs to be recorded on the defect.
- Required Previous Status: Open, Re-Open, Assigned
- Next Possible Status: NA
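The life cycle in Table 8 is effectively a state machine, so the allowed transitions can be expressed as a transition map, for example to back a Jira workflow validator. This is a reconstruction of the table above and should be treated as illustrative, not authoritative:

```python
# Sketch of the defect life cycle in Table 8 as a transition map.
# Reconstructed from the table above; illustrative only.
NEXT_STATUS = {
    "New":            {"Open", "Assigned"},
    "Open":           {"Assigned", "Rejected", "Deferred", "Duplicate"},
    "Assigned":       {"Need More Info", "Rejected", "Deferred", "Duplicate",
                       "Fixed", "Retest"},
    "Need More Info": {"Rejected", "Deferred", "Duplicate", "Fixed", "Retest"},
    "Rejected":       {"Assigned"},
    "Fixed":          {"Retest"},
    "Retest":         {"Closed", "Re-Open"},
    "Re-Open":        {"Assigned", "Fixed", "Retest"},
    "Closed":         set(),   # terminal
    "Deferred":       set(),   # terminal (pending future CR)
    "Duplicate":      set(),   # terminal
}

def can_transition(current, new):
    """True if a workflow hook should allow the status change."""
    return new in NEXT_STATUS.get(current, set())

print(can_transition("Retest", "Closed"))    # True
print(can_transition("Closed", "Re-Open"))   # False
```

Encoding the map this way makes the "Required Previous Status" column checkable: a status's allowed predecessors are exactly the keys whose sets contain it.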
21. Results and Metrics Reporting
The metrics listed below will be published to provide Testing Center of Excellence (TCoE) stakeholders with an update on the status of the release.
Table 9 : Results and metrics reporting
Report Name | Details | Frequency
Weekly status report | PMO status report to be sent every Tuesday by 3:00; participate in the thread leads status meeting every Wednesday | Weekly
Daily status report during test execution phase | Test execution status of all tracks. This report contains: test execution planned vs. completed; test case pass/fail numbers; defects open/closed, age, and severity; risks and issues; milestone achievements | Every working day
Defect churn rate | Need definition from Richa |
QA/UAT velocity rate | Need definition from Richa |
Closure report | Summary of the test execution phase just concluded, sent to all stakeholders for their sign-off | End of each test phase
22. Communication and Escalation
The details below provide a view of how communication and escalation are handled with the IBM QA team.
Table 10 : Communication and Escalation
Category/Type | Participants | Mode | Type of Reporting
Bi-weekly project meeting | Project test lead, IBM test manager, IBM test lead | Telephone conference | High-level project status; key issues and risks; action tracker
Weekly status meeting | PMO test lead, IBM test lead | Telephone conference | Progress against plan; key issues and risks; action tracker
Daily status reporting | Project QA stakeholders, IBM QA team | Email | Daily reporting of tasks and progress against plan
Escalation hierarchy –
Table 11 : Escalation hierarchy
Name | Role | Issue Age | Email Address
 | Track leads / Thread leads | > 3 days |
 | Test Manager | > 5 days |
 | PMO | > 7 days |
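The age thresholds in Table 11 imply a simple routing rule for aging issues. A minimal sketch under that reading (the function name and return labels are assumptions):

```python
# Sketch of the age-based escalation routing in Table 11 above.
# Thresholds come from this plan; the function shape is an assumption.
def escalation_level(issue_age_days):
    if issue_age_days > 7:
        return "PMO"
    if issue_age_days > 5:
        return "Test Manager"
    if issue_age_days > 3:
        return "Track/Thread leads"
    return None  # still within the normal handling window

print(escalation_level(4))   # Track/Thread leads
print(escalation_level(10))  # PMO
```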
23. Assumptions/Constraints/Risks/Issues
Assumptions, constraints, risks, and issues are external factors over which the decision maker has little or no control; as such, they can inhibit the decision-making process.
23.1 Assumptions
1) Business requirements/software system requirement specifications are clear, concise, and able to be translated into test cases.
2) Any approved PCRs that the QA team has not had a chance to estimate will not be included in testing until they have been estimated, planned, and approved.
3) All impacted applications/systems and their respective interfaces will be tested at least once during the testing phase lifecycles.
4) All necessary development will be complete in time to start testing.
5) JIRA/Adaptavist will be used as the test management tool. All test cases, test results, and defects will be available in JIRA under project MCTRA (MCTRA).
6) All team members will have access to JIRA/Adaptavist.
23.2 Dependencies
1) All SDLC artifacts are complete and signed off.
2) Test resource availability syncs with project scheduling.
3) All test scripts are uploaded to Adaptavist prior to the commencement of UAT execution.
4) The test environments are available and connectivity has been established between all interfaces identified on this project.
5) All necessary access is provided for the UAT team.
6) Test cases and the specific test data required by the requirements are available.
7) Changes in scope or redesign will require a project change request to be submitted and approved.
23.3 Constraints
1) Any unidentified or future changes or inclusions that may adversely affect the test schedule.
2) Any technology 'freeze' periods.
3) Resource contention and the availability of business, IT, and external SMEs across all work streams due to current allocation on other projects.
4) Timely resolution of issues and key decisions.
23.4 Risks
This section lists all potential test-related risks known at this time, along with the proposed mitigation and contingency measures to be adopted by the UAT team.

When conducting risk analysis, two components should be considered:
- The probability that the negative event will occur
- The potential impact or loss associated with the event
Refer to the Project Risk Log for the full inventory of project-related risks.

Ref ID | Risk | Risk Probability (H/M/L) | Risk Impact (H/M/L) | Mitigation | Contingency
1 | Availability of data for validation | L | H | N/A | Extend the testing timeline
2 | Availability of SME | L | H | A backup plan needs to be in place for the SME | The project plan needs to be aligned based on SME availability