
Novelis 2.0

Testing Management Plan

Version 1.4

June 24, 2011


Version History

Owner: Oliver Sachse
Status: Draft

Version | Publication Date | Description of Change                              | Author        | Reviewer
1.4     |                  | Version to be used for test planning and execution | Oliver Sachse |


Table of Contents

1 Test objectives
2 Introduction
   2.1 Unit testing
   2.2 Integration testing
   2.3 Security testing
   2.4 Data migration testing
   2.5 User acceptance testing
   2.6 Performance testing
   2.7 Cut-over testing
   2.8 Regression testing
3 Testing type overview
4 Testing phases
   4.1 Unit testing
      4.1.1 Process/configuration testing
      4.1.2 Development testing
      4.1.3 Security testing
   4.2 Integration testing
      4.2.1 Scenario testing
      4.2.2 System infrastructure testing
   4.3 Data migration testing
   4.4 User acceptance testing
   4.5 Performance testing
   4.6 Cut-over testing
   4.7 Regression testing
5 Test organization
   5.1 Test lead
   5.2 Internal process expert (business)
   5.3 External process expert (business & IT)
   5.4 Internal Audit team
   5.5 End user (business)
6 Test management control
   6.1 Test management process
      6.1.1 Test planning
      6.1.2 Test design
      6.1.3 Test environment set-up
      6.1.4 Test execution
      6.1.5 Defect tracking and resolution
      6.1.6 Test evaluation and reporting
   6.2 Test planning and dependencies
   6.3 Test management tools
7 Forms and reports
   7.1 Test scenario
   7.2 Test script
   7.3 Test problem reports
   7.4 Test metrics
   7.5 Test status report

List of Figures

Figure 1: Sequence of testing
Figure 2: Sequence of testing
Figure 3: Testing building blocks
Figure 4: Test organization
Figure 5: Test management process

List of Tables

Table 1: Testing type overview
Table 2: Test types, activities, roles and responsibilities


1 Test objectives

The primary objective of testing is to ensure that all applications within the scope of Novelis 2.0 operate as expected and meet the business requirements as documented during the blueprint phase and configured during the realization phase. Testing will cover the functionality of the associated SAP system, related processes that are part of add-ons to SAP, processes that are interfaced or integrated with current systems and/or business partners, SAP system security (based on identified roles and responsibilities), and the performance of the technical environment.

Business process owners, IT, and end users will be involved during the testing activities and will play a dual role: first, they will augment the resource base required for testing; second, they will provide a conduit for feedback on the interaction with the system to a base of employees beyond the project team.

There will be multiple testing initiatives:

- Unit testing (covering functional and technical testing)
- Integration testing (covering scenario and system testing)
- Security testing (integrated into unit and integration testing)
- Data migration testing
- User acceptance testing
- Cut-over testing
- Regression testing
- Performance testing

Figure 1: Sequence of testing


2 Introduction

This document establishes the strategy for testing that will be followed throughout the project. It defines each testing activity, outlines the objectives of each testing effort, and provides the high level approach to accomplish the goals of testing. The project scope and resources must be considered throughout the execution of this strategy to ensure that system and process integrity is properly achieved.

This document will be reviewed and signed off by the PMO team to ensure understanding of the purpose, approach, ownership, and commitment of resources, as well as the common usage of terminology.

There will be multiple testing initiatives:

- Unit testing (covering functional and technical testing)
- Integration testing (covering scenario and system testing)
- Security testing (integrated into unit and integration testing)
- Data migration testing
- User acceptance testing
- Cut-over testing
- Regression testing
- Performance testing

Figure 2: Sequence of testing


2.1 Unit Testing

Unit testing consists of several different types:

- Configuration testing
- String testing
- Development testing
- Interface testing

Configuration testing: This tests isolated pieces of functionality. For example, if a sales order is created and saved, the test confirms that the order can be saved using the SAP organizational elements (sales organization, company code, credit control area, etc.) along with the customer master data set-up, partner functions, material master data, etc. Unit testing covers system configuration, program development, and all relevant interfaces for the business process under test. The unit test is executed in the development system by a configuration specialist after customizing; it establishes a baseline of SAP functionality.

String testing: String tests use specific business cases. There may be configuration and business process design that is unique to a certain customer set, a given product line, or a set of services. Since tangible products and services are processed very differently from each other, different scenarios may need to be tested. This testing also includes execution of interfaces and other development objects, e.g. reports, with fabricated data.

Development testing: For ABAP development, for example, unit testing shows that a report can be created from developer-generated data. Assistance in data generation may come from a functional consultant.

Interface testing: Interface testing typically occurs at different points in a project, so it is important to understand the timing and scope of each specific interface test. Interface testing usually refers to unit testing activities that confirm the code can read and process a self-created file, or will create a pre-defined file. There might be two development systems (one SAP system and one legacy system) where a portion of the test needs to be executed in order to show that the sender can generate a file and the receiver can read and process it.

2.2 Integration Testing

This testing is similar to unit testing except that it is done in the quality environment and uses production-like data. The data used will be the result of a data extract, conversion, and load process from the data migration team (not necessarily a full conversion), so the data is familiar to a business end user, e.g. recognizable customers, materials, pricing, vendors, contracts, etc.

The testing shows that the business process, as designed and configured in SAP, runs using representative real-world data. In addition, it shows that interface triggers, reports, and workflow are working.

During integration testing a specific business case is tested from start to finish (also known as end-to-end testing), including interfaces, reports, manual inputs, workflow, etc. In short, it attempts to simulate a real-world business process using data that is as close to the production data as possible.

This test will occur in a quality environment. It validates that the individual unit, integration, and interface tests produced results that work together. End users will be involved during all phases of the testing. The goal is to ensure that end users are able to perform their designated job functions within the new system(s).


A crucial part of this testing is referring back to the business requirements and blueprint to ensure that the expected features, functions, and capabilities are available. User involvement throughout the project should have provided feedback to ensure the design met the requirements, so there should be no big surprises.

2.3 Security Testing

Security testing ensures that users are able to execute only the appropriate transactions and access only the appropriate data, which is critical to any project. It verifies that there are no inherent segregation-of-duties violations and no excessive access granted, which is important for Novelis’ SOX compliance. This testing will be done in a quality environment against near-final configuration and data from a full extract, conversion, and load exercise. Test IDs for job roles are created and used to confirm both what a user can do and what a user cannot do. This kind of testing is combined with end user or user acceptance testing (see section 2.5, User acceptance testing).

2.4 Data Migration Testing

It is important to raise confidence levels, and reduce anxiety levels, surrounding this activity. There will be at least three data migration test cycles executed in an isolated data migration client. This allows for:

- Several attempts at exercising and fine-tuning the data migration plan
- Collecting nominal run-time statistics to get a rough estimate of how long the data migration might take
- Identifying and fixing any data migration program object defects
- Identifying and fixing any data mapping and content defects
- Identifying and fixing any functional configuration issues

Each of these tasks is done with the goal of significantly reducing the fallout rate with each data migration test cycle. These test cycles also provide the opportunity to practice the legacy extract skills, the fallout analysis skills, and the manual fallout clean-up skills.

The approach to testing data and content migrations relies upon sampling, where some subset of random data or content is selected and inspected to ensure the migration was completed “as designed”. Those who have tested migrations using this approach are familiar with the typical iterative test, debug, and retest method, where subsequent executions of the testing process reveal different error conditions as new samples are reviewed.

The following lists options for data migration testing, grouped by migration phase.

Pre-migration testing

These tests occur early in the migration process, before any migration, even a migration for testing purposes, is completed. The pre-migration testing options include:

- Verify the scope of source systems and data with all stakeholders. Verification should include data to be excluded as well as included. If applicable, it will also be tied to the specific extraction queries being used for the migration.
- Define the source-to-target high-level mappings for each category of data or content, and verify that the desired type has been defined in the destination system.
- Verify destination system data requirements such as field names, field types, mandatory fields, valid value lists, and other field-level validation checks.
- Using the source-to-destination mappings, test the source data against the requirements of the destination system. For example, if the destination system has a mandatory field, ensure that the appropriate source exists and contains valid data; if a destination field has a list of valid values, ensure that the corresponding source fields contain only those values (illustrated in the sketch following this list).


- Test the fields that uniquely link source and target records, and ensure that there is a unique record identifier.
- Test source and target system connections from the migration platform.
- Test the tool configuration against the migration specification, which can often be done via black-box testing on a field-by-field basis. Testing here can also verify that the migration specification's mappings are complete and accurate.
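To make the field-level checks above concrete, the following is a minimal validation sketch in Python. The field names, valid value lists, and record layout are hypothetical; an actual implementation would be driven by the destination system's metadata and the extraction queries defined for the migration.

```python
# Minimal pre-migration validation sketch (hypothetical field rules).
# Checks mandatory fields, valid value lists, and unique record identifiers
# for extracted source records before any load is attempted.

# Destination system requirements, keyed by destination field name.
FIELD_RULES = {
    "CUSTOMER_ID":   {"mandatory": True,  "valid_values": None},
    "COMPANY_CODE":  {"mandatory": True,  "valid_values": {"1000", "2000"}},
    "PAYMENT_TERMS": {"mandatory": False, "valid_values": {"NT30", "NT60"}},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of rule violations for one source record."""
    errors = []
    for field, rule in FIELD_RULES.items():
        value = record.get(field)
        if rule["mandatory"] and not value:
            errors.append(f"{field}: mandatory field missing or empty")
        if value and rule["valid_values"] and value not in rule["valid_values"]:
            errors.append(f"{field}: value '{value}' not in valid value list")
    return errors

def validate_extract(records: list[dict]) -> dict:
    """Validate all records and check that CUSTOMER_ID is a unique key."""
    report = {"violations": {}, "duplicate_keys": []}
    seen = set()
    for i, record in enumerate(records):
        errors = validate_record(record)
        if errors:
            report["violations"][i] = errors
        key = record.get("CUSTOMER_ID")
        if key in seen:
            report["duplicate_keys"].append(key)
        seen.add(key)
    return report

if __name__ == "__main__":
    extract = [
        {"CUSTOMER_ID": "C001", "COMPANY_CODE": "1000", "PAYMENT_TERMS": "NT30"},
        {"CUSTOMER_ID": "C001", "COMPANY_CODE": "9999"},  # duplicate key, bad code
    ]
    print(validate_extract(extract))
```

A check of this kind can be re-run after each extract so that mapping and content defects surface before a load is attempted rather than as fallout afterwards.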

Formal design review

Conduct a formal design review of the migration specification when pre-migration testing is almost complete, or during the earliest stages of the migration tool configuration. The specification should include:

- A definition of the source systems
- The source systems' data sets and queries
- The mappings between the source system fields and the destination system
- The number of source records
- The number of source system records created per unit time (for defining the migration timing and downtime)
- Identification of supplementary sources
- Data cleansing requirements
- Performance requirements
- Testing requirements

The formal design review should include representatives from the appropriate user communities, IT, and management. The outcome of a formal design review should include a list of open issues and an action plan for resolving each one. The formal design review will also approve the migration specification and a process to keep the specification in sync with the migration tool configuration.

Post-migration testing

Once a migration has been executed, additional end-to-end testing can be performed. Expect a significant number of errors to be identified during the initial test runs, although these will be minimized if sufficient pre-migration testing has been well executed. Post-migration testing is typically performed in a test environment and includes:

- Test the throughput of the migration process (number of records per unit time). This testing is used to verify that the planned downtime is sufficient; for cut-over planning, also consider the time needed to verify that the migration completed successfully.
- Compare migrated records to records generated by the destination system: this ensures that migrated records are complete and of the appropriate context.
- Summary verification: several techniques provide summary information, including record counts and checksums. Here, the number of records compiled from the destination system is compared to the number of records extracted from the source. This approach provides only summary information; if an issue exists, it rarely provides insight into the issue's root cause.
- Compare migrated records to sources: this test verifies that field values were migrated as per the migration specification. In short, the source values and the field-level mappings are used to calculate the expected results at the destination. This testing can be completed using sampling where appropriate; if the migration includes data that poses significant business or compliance risk, 100% of the migrated data can be verified using an automated testing tool (see the sketch following this list).
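The sketch below illustrates two of the checks above: summary verification (record counts plus a key checksum) and a field-level comparison of migrated records against expected values derived from the source data and the mapping specification. The record layout, the mapping, and the checksum choice are assumptions for illustration only.

```python
# Minimal post-migration reconciliation sketch (hypothetical data layout).
# Performs summary verification (record counts and a checksum) and a
# field-level comparison of migrated records against expected values
# derived from the source data and the mapping specification.
import hashlib

# Hypothetical mapping spec: destination field -> (source field, transform).
MAPPING = {
    "NAME1": ("customer_name", str.upper),
    "LAND1": ("country_code", str.strip),
}

def checksum(records, key_field):
    """Order-independent checksum over the record keys."""
    digest = 0
    for r in records:
        digest ^= int(hashlib.md5(r[key_field].encode()).hexdigest(), 16)
    return digest

def reconcile(source, target, key_field="KEY"):
    issues = []
    # Summary verification: counts and checksums.
    if len(source) != len(target):
        issues.append(f"record count mismatch: {len(source)} vs {len(target)}")
    if checksum(source, key_field) != checksum(target, key_field):
        issues.append("key checksum mismatch")
    # Field-level comparison via the mapping specification.
    target_by_key = {r[key_field]: r for r in target}
    for s in source:
        t = target_by_key.get(s[key_field])
        if t is None:
            issues.append(f"{s[key_field]}: missing in destination")
            continue
        for dest_field, (src_field, transform) in MAPPING.items():
            expected = transform(s[src_field])
            if t.get(dest_field) != expected:
                issues.append(f"{s[key_field]}.{dest_field}: "
                              f"expected '{expected}', got '{t.get(dest_field)}'")
    return issues

if __name__ == "__main__":
    src = [{"KEY": "C001", "customer_name": "Acme", "country_code": " US "}]
    tgt = [{"KEY": "C001", "NAME1": "ACME", "LAND1": "US"}]
    print(reconcile(src, tgt))  # an empty list means fully reconciled
```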


The advantages of the automated approach include the ability to identify errors that are less likely to occur. Additionally, as an automated testing tool can be configured in parallel with the configuration of the migration tool, the ability to test 100% of the migrated data is available immediately following the first test migration. When compared to sampling approaches, it is easy to see that automated testing saves significant time and minimizes the typical iterative test, debug and retest found with sampling.

2.5 User Acceptance Testing

User Acceptance Testing (UAT), also called end user testing, is a phase of the project where the software is tested in the "real world" by the intended user (business) community. It is the last step prior to cut-over and putting the system into production. The goal of User Acceptance Testing is to assess whether the system can support the day-to-day business and user scenarios. It is the responsibility of end users to test the applications prior to go-live. When performing UAT, there are seven basic steps to ensure the system is tested thoroughly and meets the business needs.

1. Analyze business requirements: The three requirements steps (eliciting, analyzing, and recording requirements) have already been completed; this step simply revisits the business requirements to ensure a common understanding amongst all testers.

2. Identify UAT scenarios: This step identifies in detail which of the previously written scenarios will be addressed within the UAT considering that each roll-out will have a separate UAT.

3. Define the UAT test plan: The test plan breaks down the test phase into detail. It also lays out the tools to be used, the timeline and the test process.

4. Create UAT test cases: All scenarios that have been selected and defined in step 2 need to be detailed and enriched with the appropriate test data for the specific roll-out currently being addressed.

5. Run the tests: The tests will be executed either manually or with the support of automated test tools.

6. Record the results: The results of the test execution (manual or automated) need to be recorded in order to correct problems at the source and to provide the necessary details for sign-off (see the sketch following this list).

7. Confirm business objectives are met: This is the final step and ensures that the testing loop has successfully come to an end.
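As a minimal illustration of steps 4 through 6, the following sketch records UAT test case executions with timestamps and derives a sign-off readiness check. The case attributes and status values are hypothetical; in practice a test management tool would hold these records.

```python
# Minimal UAT result-recording sketch (hypothetical statuses and cases).
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class UatTestCase:
    case_id: str
    scenario: str
    tester: str
    status: str = "not started"          # not started / passed / failed
    results: list = field(default_factory=list)

    def record(self, status: str, note: str = ""):
        """Record an execution result with a timestamp for the audit trail."""
        self.status = status
        self.results.append((datetime.now().isoformat(), status, note))

def ready_for_signoff(cases):
    """Sign-off requires every selected UAT case to have passed."""
    return all(c.status == "passed" for c in cases)

cases = [UatTestCase("UAT-001", "Order to cash", tester="J. Smith")]
cases[0].record("failed", "pricing condition missing")
cases[0].record("passed", "retest after defect fix")
print(ready_for_signoff(cases))  # True
```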

2.6 Performance Testing

The performance or stress test examines whether the system response time is acceptable, whether periodic processes run quickly enough, and whether the expected concurrent user load can be supported. It also identifies processing bottlenecks and ABAP coding inefficiencies.

The testing is geared towards simulating peak loads of activity from either online users or batch processing, and it identifies the steps needed to improve performance. Determining and recording performance-testing objectives involves communicating with the stakeholders to establish and update these objectives as the project advances through milestones. Such objectives might include providing business-related metrics, obtaining resource utilization data under load, generating specific loads to assist with tuning an application server, or providing a report of the number of objects requested. While it is most valuable to collect performance-testing objectives early in the project life cycle, it is also important to revisit these objectives periodically and ask team members whether any new objectives should be added. The first stress test on a system can be painful, as minor issues may be identified that were not apparent during the isolated testing scenarios. Given that the initial test typically reveals many areas for improvement, expect to run through this exercise a couple of times to ensure the results are good. The following five steps will be used as a guideline for this type of testing:


Step 1: Determine the objectives of the performance-testing effort: define the key performance indicators that need to be met.
Step 2: Capture or estimate resource usage targets and thresholds: define the performance load in terms of resources that will be active concurrently. This estimate will change with each roll-out of Novelis 2.0.
Step 3: Identify metrics: with the input from steps 1 and 2, define a matrix in order to be able to measure the results of the performance testing in detail.
Step 4: Perform the test: execute the performance test, either manually or automatically (see the sketch after this list).
Step 5: Communicate results: document the results and forward them for sign-off.
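As a minimal sketch of step 4, the following simulates a concurrent user load and collects response-time metrics against a key performance indicator from step 1. The transaction under test and the 2-second threshold are placeholders; a real test would drive the SAP system with a dedicated load-testing tool.

```python
# Minimal concurrent-load sketch (placeholder transaction and thresholds).
import time
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

def execute_transaction() -> float:
    """Placeholder for one online transaction; returns response time in s."""
    start = time.perf_counter()
    time.sleep(0.05)            # stand-in for the real system call
    return time.perf_counter() - start

def run_load_test(concurrent_users: int, iterations: int) -> dict:
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        timings = list(pool.map(lambda _: execute_transaction(),
                                range(concurrent_users * iterations)))
    timings.sort()
    return {
        "mean_s": mean(timings),
        "p95_s": timings[int(len(timings) * 0.95) - 1],
        "max_s": timings[-1],
    }

if __name__ == "__main__":
    metrics = run_load_test(concurrent_users=50, iterations=10)
    # Step 1 KPI (assumed): 95% of transactions complete within 2 seconds.
    print(metrics, "PASS" if metrics["p95_s"] <= 2.0 else "FAIL")
```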

2.7 Cutover Testing

This kind of testing simulates and practices all steps that need to be performed during cut-over, i.e. handing the configured and developed system over to production. Cut-over testing is typically called a "dry run". It involves a full-scale execution of all the tasks required to extract data from legacy systems, perform any kind of data conversion, load the results into SAP (and any other systems), and fully validate the results, including a sign-off before handing the system over to production. It also involves, for example, importing transports, manual configuration, unlocking user IDs, and starting up periodic processing for interfaces.

Most projects have several "dry runs", which progress from an exercise in capturing all the steps, checkpoints, and sign-offs at cut-over to a timed exercise that ensures everything can be accomplished within the go-live time window.

2.8 Regression testing

Every time configuration or program code is changed (due to an error or a changed requirement), the related test cases must be repeated. The role of regression testing is therefore to test the existing functionality and ensure that it still works as expected with the newly updated configuration and code base. To ensure that no new issues are introduced into production through system changes, regression testing will take place in the quality environment, which holds data similar to the production system. In some cases automated testing can be effectively deployed as a fast and repeatable method to ensure that core business processes are not adversely affected by new releases of code and configuration; a minimal sketch of this approach follows.
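Where automated regression testing is deployed, core scenarios can be re-verified at the press of a button. Below is a minimal pytest-style sketch; the business process function and the baseline result are placeholders for whatever the chosen automation tool would actually drive.

```python
# Minimal automated regression sketch (placeholder business process).
# Each test re-runs a core scenario after a code/configuration change and
# compares the outcome with the previously accepted (baseline) result.
import pytest

def run_order_to_cash(order_qty: int) -> dict:
    """Placeholder for driving the real end-to-end process."""
    return {"status": "billed", "billed_qty": order_qty}

BASELINE = {"status": "billed", "billed_qty": 10}

def test_order_to_cash_unchanged():
    # Regression check: the core process still matches the baseline.
    assert run_order_to_cash(order_qty=10) == BASELINE

@pytest.mark.parametrize("qty", [1, 10, 100])
def test_billing_quantity_preserved(qty):
    # Quantities must flow through unchanged across releases.
    assert run_order_to_cash(qty)["billed_qty"] == qty
```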


Figure 3: Testing building blocks


3 Testing type overview

For each type of testing, the table lists the work packages or activities, the tools, and the test participants.

Unit testing: Process/configuration testing
- Work package or activity: Baseline configuration; Final scope configuration
- Tools: BPPs (1); Work instructions; Test scenario template
- Test participants: Business process team; Business process owner; Systems development team; Technical infrastructure team

Unit testing: Development testing
- Work package or activity: Conversion programs; Interface programs; Enhancements; Reports; Forms
- Tools: Test section of functional and design specs; Key data elements
- Test participants: Development team; Technical team; Business process team

Unit testing: Security testing
- Work package or activity: Activity groups; Roles; Profiles
- Tools: Activity groups; User profile definitions; Job descriptions; Security design specs
- Test participants: Business process team; Business process owner; Systems development team; Technical infrastructure team

Integration testing: Scenario testing
- Work package or activity: Test scenarios; Multiple test cycles; Includes all technical objects; End user roles and security testing
- Tools: Test scenario template
- Test participants: Business process teams; Systems development team

Integration testing: System infrastructure testing
- Work package or activity: Infrastructure testing; Security testing
- Tools: To be determined
- Test participants: Systems development team; Security team; Infrastructure team

Data migration testing: Pre-migration testing
- Work package or activity: Scope of source system(s) and data; Define mapping; Verify destination system(s) requirements; Source-target connection; Tool configuration
- Tools: To be determined
- Test participants: Systems development team; Infrastructure team

Data migration testing: Formal design review
- Work package or activity: Source system definition; Mapping rules; Quantify source records; Data cleansing requirements; Performance requirements
- Tools: Workshop
- Test participants: User community; IT; Executive sponsors

Data migration testing: Post-migration testing
- Work package or activity: Throughput evaluation; Comparison of migrated records; Summary verification
- Tools: To be determined
- Test participants: User community; IT

User acceptance testing: Production testing
- Work package or activity: "Day in the life of…" testing
- Tools: To be determined
- Test participants: All teams

Performance testing: Performance testing
- Work package or activity: Define objectives; Estimate resource usage targets; Estimate resource budget; Identify metrics; Volume testing; Stress testing; Communicate results
- Tools: To be determined
- Test participants: IT

Cut-over testing: Dry run
- Work package or activity: Data extraction; Data conversion; Data load; Validation; Sign-off
- Tools: To be determined
- Test participants: All teams; Executive sponsors

Cut-over testing: Production run
- Work package or activity: Data extraction; Data conversion; Data load; Validation; Sign-off
- Tools: To be determined
- Test participants: All teams; Executive sponsors

Regression testing: Regression testing
- Work package or activity: Test scenarios; Test cases; Multiple test cycles; Includes all technical objects
- Tools: To be determined
- Test participants: All teams

(1) BPP – Business process procedure

Table 1: Testing type overview

The system environment for testing (system name, system access, and client) will be defined in the system landscape strategy.


4 Testing phases

Each test type contains specific testing tasks during the testing life cycle:

Define the Approach

- Based on the testing strategy, outline the objectives, scope, entry/exit criteria, test tools, and test environment for the test stage.

Plan the Test

- Identify test scenarios and conditions, and define test cycles for the test stage.
- Define the test environment requirements and set-up plans.

Prepare the Test

- Expand the test plan by defining input data and refining the expected results to create the test script deliverable.
- Prepare the test cycle schedule.
- Prepare the test environment (systems, clients, tools, etc.).

Execute the Test

- Execute the test scripts, compare the actual results to the expected results, and identify and resolve any discrepancies.
- Ensure that the test environment is effectively supported.
- Deliverables: documented actual results, fixed defects, and a fully executed test plan.

Close the Test

- Complete the test stage in line with requirements.
- Ensure that all exit criteria are met and all deliverables are accounted for.
- Deliverables: test status report and a sign-off sheet.

4.1 Unit testing

Objective

- Confirm the integrity of each transaction configuration by executing the transaction and confirming that functional requirements have been successfully met.
- Ensure that the SAP configuration satisfies the "to be" processes and the design described in the scripts.
- Ensure custom-developed programs and technical objects function according to design specs.
- Ensure security is set up and activity groups are correct as identified by user roles.
- Provide a basis for a more successful and on-time integration test.
- Engage business process owners as necessary to increase knowledge of the system and validate that business requirements are met.

Approach

- Create BPP work instructions and unit test scripts, which will function as reusable building blocks that continually expand to encompass the full scope of testing and can be leveraged during development of training material.
- Create development object unit tests to confirm the functionality of each development request.


- Establish test data.
- Conduct/initiate (along with business process owners as applicable) the unit test scenarios/scripts to confirm the configured processes.
- Conduct formal play-back sessions with business process owners that validate the achievement of Blueprint business requirements.
- Identify variances from expected results.
- Document test problems for resolution.

Outputs

- BPPs / scripts
- Completed development objects with associated unit test plans
- Security activity groups

There will be multiple types of test during the unit test, with slightly differing objectives:

4.1.1 Process/configuration testing

Process/configuration testing is designed to test the business processes identified and configured during Business Blueprint. The following items are tested during process/configuration testing:

- Process validity
- Configuration
- Intra- and inter-module processing
- Validation and exception processing
- Work instruction validation

4.1.2 Development testing

Development testing is the testing of specific program development. This includes all systems development effort. The following are tested during development testing:

- Workflows
- Reports
- Interfaces: SAP, legacy, and third-party
- Conversions
- Enhancements (user-exits and other code enhancements)
- Forms

4.1.3 Security testing

Security testing is the testing of the process, SAP application, technical infrastructure, and network security against the role responsibility profile. The following are tested during security testing:

- Positive and negative access testing for the activity group
- Testing of segregation of duties
- Business process authorization profiles
- Network and technical access controls

4.2 Integration testing

Objectives

- Ensure that business process and technical requirements have been met
- Ensure that functional integrity has been maintained
- Validate that hardware and network infrastructures are in place and operating at efficient levels
- Ensure successful on-line and batch processing


- Ensure adequately performing conversions, interfaces, enhancements, reports, and forms
- Ensure security is operating as designed and in conjunction with the business process, SAP application, and technical requirements

Approach

- Leverage test scripts/work instructions and development/security unit test plans from unit testing
- Perform a high degree of cross-functional process flow (scenario) tests to validate end-to-end process integrity; focus on high-risk design areas and core process functionality
- Expand on unit test scenarios or create additional test scenarios specific to the development effort, which will function as reusable building blocks that continue to expand to encompass the full scope of testing
- Establish test data
- Conduct/initiate (along with business process team members) the scenario tests
- Conduct formal play-back sessions with business process team members that validate the achievement of Blueprint business requirements
- Conduct multiple cycles of scenario testing; each cycle should build upon the unit tests and the previous cycle's tests
- Document test problems for resolution
- Leverage the test scripts and data to produce performance test transactional scripts for selected, high-priority processes
- Conduct integrated testing of the network, network servers, communication lines (including off-site access), production PCs, printers, other peripherals, laptop access to SAP, remote printing, and any other relevant hardware
- Perform initial and incremental backups
- Generate stress/volume transactions, interfaces, and batch processes

Outputs

- BPPs / scripts
- Completed development objects with associated unit test plans
- Completed security role and activity group tests
- Documented and signed-off test scenarios

There will be multiple types of test during the integration test, with slightly differing objectives:

4.2.1 Scenario testing

Scenario testing is designed to test those transactions that operate together and reflect important business processes and scenarios. This will cover multiple transactions within and across an enterprise area. The scenarios that are developed will also include all system development objects. Multiple cycles of scenario testing will be completed during the course of integration testing. Scenario testing during integration consists of process/configuration testing, systems development testing, and security testing as described above in the unit test section. Scenario testing tests the following:

- Multi-transactional processes within a team's defined area of functionality, by grouping transactions into scenarios
- Multi-transactional processes across functional areas, by grouping transactions into scenarios
- Development objects (including programs, interfaces, conversions, enhancements)
- End user security


4.2.2 System infrastructure testing

System infrastructure testing is designed to verify that the components of the technical infrastructure (e.g. servers, disk storage, workstations, printers, networks) are operational and functioning as designed. It also covers testing server and network performance in relation to peak user, transaction, and batch processing loads. The following are tested during systems infrastructure testing:

- Server, disk storage, workstation, printer, and network operations
- Access: LAN, WAN, and remote
- Volume and stress testing (including user, transaction, interface, and batch processing)
- Performance when executing multiple concurrent transactions and under peak concurrent use
- Interface performance
- Batch processing
- Backup and restore procedures (ensuring that they are accurate, and that both initial and incremental activities can be performed)
- Input/output devices such as printers and faxes


4.3 Data migration testing

Objective

- Ensure that the data load file is built in alignment with the functional specifications and mapping documents and can be loaded successfully
- Ensure that the data loaded into the system conforms with the business processes customized in the system
- Confirm the accuracy of the data conversion

Approach

- The legacy system data has been thoroughly cleansed.
- The providers of the legacy data provide accurate data on time.
- Data migration takes place in a separate box that is not the development box.
- The data migration client is an exact copy of the client where the data migration objects were built and tested.
- The data migration client is not open for configuration.
- The data migration client is not part of a transport path, to prevent the current cycle of conversion testing from being blindsided by any new configuration changes.
- The data migration client is locked down so that only data migration and data validation tasks can be performed.
- The data migration client is configured to run more background and update processes and fewer dialog processes.
- The data migration client is built in a system that has enough disk space to be the repository for the primary load files and any intermediate processing files needed.

Outputs

- BPPs / scripts
- Completed development objects with associated unit test plans
- Security activity groups

4.4 User acceptance testing

Objectives

- Ensure that business process and technical requirements have been met
- Ensure that the objectives for running the day-to-day business activities have been met
- Ensure security is operating as designed and in conjunction with the business process, SAP application, and technical requirements

Approach

- Expand on unit test scenarios or create additional test scenarios specific to the development effort, which will function as reusable building blocks that continue to expand to encompass the full scope of testing
- Establish test data
- Conduct/initiate (along with business process team members) the scenario tests
- Conduct formal play-back sessions with business process team members that validate the achievement of Blueprint business requirements
- Conduct multiple cycles of scenario testing; each cycle should build upon the unit tests and the previous cycle's tests
- Document test problems for resolution
- Leverage the test scripts and data to produce performance test transactional scripts for selected, high-priority processes


- Simulate the production environment and evaluate system production performance capability

Outputs

- BPPs / scripts
- Completed development objects with associated user acceptance test plans
- Completed security role and activity group tests
- Documented and signed-off test scenarios

4.5 Performance testing

Objective

- Ensure acceptable response times
- Identify processing bottlenecks
- Define performance metrics

Approach

- Enhance BPP work instructions and unit test scripts to be used for performance testing (based on reusable building blocks).
- Establish performance test data.
- Conduct/initiate (along with business process owners as applicable) the performance test scenarios/scripts to determine or extrapolate the performance.
- Conduct formal play-back sessions with business process owners that validate the achievement of the performance testing objective.
- Simulate the production environment and evaluate system production performance capability.
- Identify variances from expected results.
- Document test problems for resolution.

Outputs

- Identified gaps and performance problems, as well as proposed resolution steps
- Go/No-Go decision

4.6 Cut-over testing

Objective

- Full-scale execution of all tasks necessary for go-live
- Validation of all executed tasks
- Enable / ensure planning by the minute for the cut-over period

Approach

- List all the activities that need to be done during cut-over (i.e. prepare the cut-over plan)
- Identify systems
- Identify all stakeholders (internal and external)
- Identify responsibilities and roles

Outputs

- Cut-over plan
- Fall-back scenario
- Sign-off


4.7 Regression testing

Objective

- Ensure that business process and technical requirement changes have no adverse effect on the production environment
- Ensure that functional integrity will be maintained
- Validate that hardware and network infrastructures will still be sufficient
- Ensure continuation of all on-line and batch processing
- Ensure security will still operate as designed and in conjunction with the business process, SAP application, and technical requirements

Approach

- Leverage test scripts/work instructions and development/security unit test plans from previous testing
- Focus on high-risk design areas and core process functionality
- Conduct/initiate (along with business process team members) the scenario tests
- Conduct formal play-back sessions with business process team members that validate the achievement of Blueprint business requirements
- Document test problems for resolution
- Leverage the test scripts and data to produce performance test transactional scripts for selected, high-priority processes
- Simulate the production environment and evaluate system production performance capability
- Conduct integrated testing of the network, network servers, communication lines (including off-site access), production PCs, printers, other peripherals, laptop access to SAP, remote printing, and any other relevant hardware
- Perform initial and incremental backups
- Generate stress/volume transactions, interfaces, and batch processes

Outputs

- Updated BPPs / scripts
- Completed development objects with associated unit test plans
- Completed security role and activity group tests
- Documented and signed-off test scenarios


5 Test organization

The following roles need to be appointed:

Figure 4: Test organization

The proposed organizational chart will be useful for all testing. Preparation will start prior to the actual test phase. Staffing/engagement of additional resources might be necessary for certain test types (for example user acceptance test).

5.1 Test lead

The role of Test Lead/Manager is to effectively lead the testing team. To fulfill this role the Lead must understand the discipline of testing and how to effectively implement a testing process while fulfilling the traditional leadership role of a manager.

The Test Lead/Manager is responsible for:

- Defining the role testing plays within the organizational structure
- Defining the scope of testing within the context of each release/delivery
- Deploying and managing the appropriate testing framework to meet the testing mandate
- Implementing and evolving appropriate measurements and metrics
- Planning, deploying, and managing the testing effort for any given release
- Managing and growing the testing assets required for meeting the testing mandate:
  - Team members
  - Testing tools


  - Testing process

5.2 Internal process expert (business)

The role of the internal process expert is to effectively define test cycles and test scripts, based upon the business requirements that the application has been defined for. They will also execute the appropriate test types and are responsible for integrating additional resources as they see fit (for example, engaging SMEs for specific business requirements). The internal process expert will also document the test results and, based upon successful completion of the test, will sign off on their respective part of the test.

5.3 External process expert (business & IT)

The role of the external process expert is to ensure that the application and systems have been prepared according to the business requirements and are ready for the appropriate test type in question. In addition, they will support the definition and execution of the test cycles and test scripts. Based upon the test results they will also be responsible for bug fixing, ensuring that test defects are corrected.

5.4 Internal Audit team

Novelis Internal Audit has dedicated resources to oversee and advise the rollout of Novelis 2.0. The two major components the audit team is looking at are the project governance of the Novelis 2.0 project and the optimization of the business process controls. For project governance, Internal Audit is looking at project delivery on time, on budget, and within scope. On the business process side, processes and controls should be efficient, robust, and aligned with the overall objectives. Further, available technology, e.g. Quality Center (from SAP or HP) for testing or SAP GRC for monitoring business and application controls, should be leveraged as much as possible to strengthen the internal controls framework. Lastly, the team verifies that identity management is implemented efficiently and securely.

As part of the testing strategy the audit team will:

- Confirm that the testing strategy is appropriate and applied correctly
- Confirm that test script execution has been delegated to appropriate personnel and that reviews of testing results have been performed
- Determine that infrastructure performance was assessed and is aligned with organizational requirements
- Assess business control testing as well as the design of IT general controls for operations
- Assess the execution of tests and the management of test issues, confirming that corrective actions are taken in a timely fashion

5.5 End user (business)

The role of the end user is crucial to the acceptance of the production system. Although there will be some end user involvement during unit testing and integration testing, their main role will be during the user acceptance test. As pointed out, the user acceptance test is the final functional testing milestone before go-live. Selecting the right mix of end users is important, since not all end users can be involved during the user acceptance test. Each workstream needs to select its representative end users to be involved during user acceptance testing. The selection criteria and the actual selection will be defined in the testing plan for the user acceptance test (see 6.1.1 Test planning).


End user testers fill the most important testing roles, since they validate that the system meets the business needs. End user testing is usually the final activity before the system goes live, and the role requires multi-faceted skills. The qualities listed below allow the person playing this role to perform this important activity; without them, one may not be able to conduct a proper end user test.

The four core qualities of an end user tester are:

- Background: experience of user operations, not involved in the overall IT project, experience in the use of IT facilities, and respected as an independent thinker
- Skill: a good communicator, avoids politics, expects the system to fail
- Independence: not involved in user specifications, has an independent reporting structure, and is a self-starter
- Attitude: lateral thinker, tenacious, analytical

Table 2: Test types, activities, roles and responsibilities


6 Test management control

6.1 Test management process

6.1.1 Test planning

This step needs to be performed for every test type and project phase/roll-out. It determines which transactions are most critical to the business processes being implemented within Novelis 2.0. A priority level is established for each business scenario. Based on priority and availability, a schedule is created and responsibility for creating and executing tests is assigned for each transaction that supports the identified business processes. Highly customized transactions and potentially high-risk business processes with heavy user volumes are identified and scheduled.

In addition, test planning defines the entry criteria (what is necessary to start the test, i.e. prerequisites for testing), the test scope (which business requirements will be tested), and the acceptance criteria (defect categories which, in conjunction with the priority levels established above, will influence the sign-off).

6.1.2 Test design

During this phase, the individual business requirements and their associated tests are identified, and the master data required for each must be established. The master data established during this phase is vital to ensure that comprehensive test coverage and accurate test results are produced. A plan is established for resetting or regenerating master data for each test cycle. During this step, test scripts and test cases are created and stored using standard tools like SAP/HP Quality Center, QuickTest Professional, and SAP eCATT.

6.1.3 Test environment set-up

This step of the testing process establishes the technical environment in which the test(s) will be executed. The testing team takes complete control over the environment, and it is not shared with other entities during the course of testing.

6.1.4 Test execution

In this step, the test scripts are executed according to the test plan guidelines, either:

- Manually, using tools like SAP/HP Quality Center
- Automated, using tools like SAP/HP QuickTest Professional and SAP eCATT

Automating the test cases rules out subjective factors in the comparison of expected and actual results and significantly reduces, in particular, the effort for regression tests, because all test cases can subsequently be run at the press of a button, be it as part of an update or one-by-one for quality assurance parallel to development. Results of the test execution are recorded using these tools, thus maintaining a complete audit trail.

6.1.5 Defect tracking and resolution

In this iterative step, the defects identified during test execution are logged into a defect tracking and resolution tool like SAP/HP Quality Center. Scenarios for which defect resolutions are provided by the configuration/development team are retested to ensure conformity with the requirements.

Defects will also be categorized in terms of severity. Together with the priorities established during test planning, this will help in establishing the priorities for bug fixing and in evaluating the test results; a small sketch of such an evaluation follows.
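The following is a minimal sketch of how defect severity, combined with the scenario priorities established during test planning, might feed the sign-off decision. The severity categories, priority levels, and blocking combinations are assumptions for illustration, not the project's actual acceptance criteria.

```python
# Minimal defect-evaluation sketch (assumed categories and thresholds).
from collections import Counter

# Assumed acceptance criteria: no open critical/high defects against
# high-priority scenarios; medium/low defects may remain open.
BLOCKING = {("critical", "high"), ("high", "high"), ("critical", "medium")}

def signoff_allowed(defects):
    """defects: list of (severity, scenario_priority, status) tuples."""
    open_blocking = Counter(
        (sev, prio) for sev, prio, status in defects
        if status != "closed" and (sev, prio) in BLOCKING
    )
    return len(open_blocking) == 0, open_blocking

defects = [
    ("critical", "high", "closed"),
    ("medium", "high", "open"),     # open but not blocking
]
print(signoff_allowed(defects))     # (True, Counter())
```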


6.1.6 Test evaluation and reporting

After the tests are executed, the test results are analyzed, and evaluations are made as to the readiness of the configuration. A test summary report is prepared and distributed to all stakeholders, highlighting attention areas / weak spots.


Figure 5: Test management process


6.2 Test planning and dependencies

The execution of testing is dependent upon several factors. The timelines of each of these dependencies must be coordinated.

- Completion of configuration and development work
- Definition of security roles and activity groups
- Generation of work instructions and test scripts
- Availability of test data (SAP, legacy, and third party)
- Availability of SAP and legacy environments
- Availability of bolt-ons

6.3 Test management tools

Documenting test objectives, test cases/scripts, and expected results is an important part of testing. Test scripts and additional unit test plans will be written by the teams and, upon execution, reviewed and signed off by team members, team leads, and, as applicable, process owners. Supporting documentation denoting the test(s) will be attached to the script. Proper documentation provides a record of the test as well as of the system changes resulting from the test. A log of test scripts will be maintained denoting the process, testing phase, status, etc.


7 Forms and Reports

7.1 Test scenario

The form to be used for test scenarios is described in the document management strategy.

7.2 Test script

The form to be used for test scripts is described in the document management strategy.

7.3 Test problem reports

The form to be used for test problem reports (TPR) is described in the document management strategy.

7.4 Test metrics

In order to monitor the progress of the testing effort, several metrics will be implemented. These will serve as the basis for identifying problem areas in the testing effort; a small computation sketch follows the list below.

- Total number of test scripts
- Number of test scripts, categorized by “in progress”, “not started”, “completed”, and “signed off”
- Total number of TPRs
- Number of tests waiting on TPRs
- Number of TPRs, categorized by “in progress”, “in retest”, “completed”, and “signed off”
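A minimal sketch of how these metrics might be computed from a log of test scripts and TPRs follows. The field names and status values are illustrative; in practice the test management tool would report these figures directly.

```python
# Minimal test-metrics sketch (illustrative statuses and records).
from collections import Counter

scripts = [
    {"id": "TS-001", "status": "completed",   "open_tpr": None},
    {"id": "TS-002", "status": "in progress", "open_tpr": "TPR-7"},
    {"id": "TS-003", "status": "not started", "open_tpr": None},
]
tprs = [
    {"id": "TPR-7", "status": "in retest"},
    {"id": "TPR-8", "status": "completed"},
]

metrics = {
    "total scripts": len(scripts),
    "scripts by status": Counter(s["status"] for s in scripts),
    "total TPRs": len(tprs),
    "tests waiting on TPRs": sum(1 for s in scripts if s["open_tpr"]),
    "TPRs by status": Counter(t["status"] for t in tprs),
}
for name, value in metrics.items():
    print(f"{name}: {value}")
```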

7.5 Test status report

The form to be used for test status reports is described in the document management strategy.
