Transcript of "Isn’t an Enterprise Test Strategy just for an Enterprise Project Release?"

Speaker: Susan Schanta
Company: Cognizant Technology Solutions
When: April 22, 2016, 10:00 am - 11:00 am
Where: Renaissance Hotel, 1 West Wacker Drive, Chicago IL 60601
Conference: CHICAGO, April 18th – April 22nd

© 2015 Cognizant

Isn’t an Enterprise Test Strategy just for an Enterprise Project Release?

04/22/16

Susan Schanta, Director


Why do I need an Enterprise Test Strategy?

Because quality can’t be tested into the product… We need a shared vision to achieve quality…

What is the challenge we’re solving for? A lack of shared understanding regarding the role of the Quality organization:

− We pay a lot for QA, why isn’t quality better?
− Why do we need QA so early in the lifecycle?
− QA just tests at the end, right?
− Why does it take QA so long to test?

How do I change our persona from a Testing Department to a Quality Center of Excellence?
− Does the organization understand QA’s role in the software development lifecycle?
− Have I set expectations for how and when project stakeholders should engage QA?
− How can I establish a collaborative relationship where cross-functional teams understand the interdependencies between their work product and the QA work product?


What is an Enterprise Test Strategy?

The Enterprise Test Strategy establishes a framework for how the QA organization operates and interacts with project team members. When collaboratively developed with Project Management, Business Analysts and Development, the Enterprise Test Strategy provides a foundation for how the organization will build quality into product releases while reducing the cost of quality.

− Establishes standards for test analysis, planning and validation
− Helps drive behavioral change in QA and cross-functional teams
− Introduces shared responsibility for best practices
− Drives defect reduction in the Requirements, Design and Construction Phases of the lifecycle
− Institutes a shared and disciplined approach to automation standards to achieve automation sustainability
− Defines performance standards for critical applications to address operational and business continuity goals
− Aligns test data management to corporate security policies
− Creates strategies where gaps exist for specialty testing such as mobility, big data, automation, performance, etc.


The Value Proposition

The Enterprise Test Strategy provides a framework for how QA drives quality throughout the lifecycle and a foundation for cross-team collaboration, quality disciplines and operating guidelines. The ultimate goal is to deliver the best quality while driving down the cost of quality.

− Limits scope creep
− Increases test coverage
− Increases velocity of test case creation
− Increases requirements traceability to test cases
− Reduces defect leakage to production
− Reduces cost of maintenance
− Reduces rework


An Examination of Strategy Types

Enterprise Test Strategy
− Mission Statement
− Standards for test analysis, planning and validation
− Definition of Test Types
− RACI for Test Phases – Unit through UAT
− Tiered approach for test documentation
− Automation Disciplines – ROI driven automation, development disciplines
− Performance Disciplines – load/stress, usability and business continuity
− Defect Management Guidelines
− Test Data Management Guidelines
− Test Environment Management
− Test Tools
− Metrics beyond Defect Rates

Program / Project Test Strategy
− Project scope and objectives
− Risks & Mitigation
− Assumptions, Dependencies & Constraints
− Test Scope – what will be tested
− Limitations to testing based on tools needed
− Limitations to testing based on environment availability
− Test Approach
  − Manual test activities
  − Automated test activities
− Test Environment Requirements
  − Required hardware, software and licenses
− Test Data Requirements
  − Data extraction requirements from production
  − Parameters for manipulation of test data (such as aging)


Establishing the Enterprise Quality Framework


Defining the Mission

"Quality Assurance is dedicated to reducing the cost of quality by improving overall product reliability. Our mission to attain quality lies in defect prevention: establishing a precisely measurable process to ensure conformance to requirements. We believe that quality is the all-important catalyst that makes the difference between success and failure."

"Our mission is to control, manage and drive all testing services; ensure quality in our software solutions; deliver on-time, on-budget, goal-oriented and cost-effective solutions for the business; satisfy customer requirements; strive for continuous process improvement; and contribute to the overall growth of the organization through streamlined, efficient and best-in-class testing practices and a governance model with highly skilled and motivated people."

Create a definitive statement that defines your organizational goals…


Tiered Approach to Test Documentation

After defining the full library of QA artifacts, determine mandatory use based on set criteria:
− Based on project size & complexity
− Based on project duration
− Based on test scope

#   Deliverable                              SDLC Phase     PMO   Maintenance Releases
1   L0 QA Project Estimate                   Initiation     Y     Y
2   QCOE Project Plan (Schedule)             Requirements   Y     N
3   L1 QA Project Estimate                   Requirements   Y     N
4   Requirements Traceability Matrix         Requirements   Y     N
5   Test Scenario                            Requirements   Y     Y
6   Master Test Plan                         Requirements   Y     N
7   L2 QA Project Estimate                   Design         Y     N
8   Project Level Test Plan                  Design         Y     N
9   Regression Test Case Selection           Design         Y     Y
10  Test Case                                Design         Y     Y
11  Test Data Requirements Template          Design         Y     Y
12  Test Environment Readiness Checklist     Construction   Y     N
13  Defect Report Template                   Validation     Y     Y
14  Productivity Loss Log                    Validation     Y     Y
15  Test Summary & Closure Report            Validation     Y     N
16  Lessons Learnt Document                  Validation     Y     N
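The tiered criteria above can be expressed as a simple selection rule. The sketch below uses deliverable names from the table, but the size and duration thresholds are hypothetical examples, not values from the deck:

```python
def mandatory_deliverables(size: str, duration_weeks: int,
                           is_maintenance: bool) -> list[str]:
    """Return the QA deliverables required for a project, per tiered criteria."""
    # Deliverables required for every engagement, including maintenance releases
    # (the "Y" rows in the Maintenance Releases column above).
    always = ["L0 QA Project Estimate", "Test Scenario",
              "Regression Test Case Selection", "Test Case",
              "Test Data Requirements Template", "Defect Report Template",
              "Productivity Loss Log"]
    if is_maintenance:
        return always
    # Full-lifecycle projects add the planning and closure artifacts.
    full = always + ["QCOE Project Plan (Schedule)", "L1 QA Project Estimate",
                     "Requirements Traceability Matrix", "Master Test Plan",
                     "Test Environment Readiness Checklist",
                     "Test Summary & Closure Report", "Lessons Learnt Document"]
    # Hypothetical tier rule: large or long-running projects also require
    # the refined design-phase estimate.
    if size == "large" or duration_weeks > 12:
        full.append("L2 QA Project Estimate")
    return full
```

The point of the rule is that mandatory use is decided once, from set criteria, rather than renegotiated per project.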


Communication Guidelines

Communication Deliverable | Objective | Owner | Frequency
Business/Test Scenario | Test conditions based on requirements, business rules, constraints, etc. | QA Lead | Once during Requirements Phase
Business/Test Scenario Stakeholder Review | Stakeholder feedback collected and incorporated | QA Manager | As scheduled
Code Turnover Notes | Code turnover notes of functions ready to test, workarounds and open defects | Dev Lead | For each code turnover
L0 QCOE Project Estimate | Estimate given based on business case and discussion (no requirements defined) | QA Manager | Once in Proposal Phase
L1 QCOE Project Estimate | Estimate given based on elicitation sessions and requirements | QA Lead | Once in Requirements Phase
Requirements Traceability Matrix | Trace requirements to test cases to defects | QA Lead | Weekly from Test Design forward
Test Case | Steps to validate a test condition based on business, user, functional/nonfunctional requirements | QA Lead | Once; stored in Test Repository
Test Case Peer Review | Peer feedback collected and incorporated | QA Lead | As scheduled
Test Plan | Defines the tactical and operational approach to validation of test conditions | QA Lead | Initiated in Requirements, updated as needed
Test Plan Stakeholder Review | Stakeholder feedback collected and incorporated | QA Manager | As scheduled
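The Requirements Traceability Matrix in the table traces requirements to test cases and test cases to defects. A minimal sketch of that data structure (identifiers like "REQ-1" are illustrative):

```python
from collections import defaultdict

class TraceabilityMatrix:
    """Requirements -> test cases -> defects, as the RTM deliverable traces."""

    def __init__(self):
        self.req_to_tests = defaultdict(set)     # requirement id -> test case ids
        self.test_to_defects = defaultdict(set)  # test case id -> defect ids

    def link_test(self, req_id: str, test_id: str) -> None:
        self.req_to_tests[req_id].add(test_id)

    def link_defect(self, test_id: str, defect_id: str) -> None:
        self.test_to_defects[test_id].add(defect_id)

    def uncovered(self, all_reqs: list[str]) -> list[str]:
        """Requirements with no test case -- a gap in coverage."""
        return [r for r in all_reqs if not self.req_to_tests[r]]

    def defects_for_req(self, req_id: str) -> set[str]:
        """Defects reachable from a requirement via its test cases."""
        return {d for t in self.req_to_tests[req_id]
                  for d in self.test_to_defects[t]}
```

Refreshing such a matrix weekly from Test Design forward, as the table prescribes, makes coverage gaps and requirement-level defect clusters visible early.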


Definition of Test Types

Unit Test – Performed by the developer to validate units of source code, modules and functions using controlled data, to measure that each unit of code operates as designed.

Integrated Unit Test – Integration unit testing is a logical extension of unit testing. In its simplest form, two units that were individually tested are combined into a component and the interface between them is tested. A component, in this sense, refers to an integration of one or more units.

Unit Regression Test – When developers modify code, regression unit testing is required to evaluate code quality. The developer can reuse the original unit test cases, and in some cases may need to modify existing tests or create new ones depending on the extent of the changes made.

Smoke Test – A preliminary evaluation to identify failures severe enough to reject a prospective code turnover; a subset of test cases that covers the most important functionality.


Definition of Test Types (continued)

System Test – Performed to compare a program’s behavior against the functional business and technical specifications. Positive system testing verifies that users can successfully use all paths of functionality with expected results; negative system testing checks that users are not allowed to use improper paths.

System Integration Test – Performed on the complete, fully integrated system with a focus on role-based testing to emulate real-life scenarios; sometimes called end-to-end testing. The purpose is to detect any inconsistencies between the functions that are integrated together.

Regression Test – Any type of software testing that seeks to uncover software errors by partially retesting a modified program. The intent is to provide general assurance that no additional errors were introduced in the process of fixing other problems.

User Acceptance Test – User Acceptance Testing (UAT) is a process to obtain confirmation that a system meets mutually agreed-upon requirements. UAT acts as a final verification of the required business function and proper functioning of the system, emulating real-world usage conditions on behalf of the end user.


Defect Management

• Enforce a single standard for logging and tracking defects
• Traceability of defects through resolution and Root Cause Analysis
• Business Priority vs. Technical Severity
  − Implementation provides a preventative remedy to avoid downgrading defects in order to release to production
• Establish SLAs based on business priority, technical severity and lifecycle phase
  − Expected turnaround time < 12 hrs.
  − Expected turnaround time < 48 hrs.
  − Expected turnaround time < 72 hrs.
  − Expected turnaround time – prior to production release
• Restricted permissions for select defect states
  − Closed, cancelled and deferred states
• Go/No Go Decision
  − Zero uninspected defects at release
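The SLA idea can be sketched as a lookup keyed on the business-priority/technical-severity pair. The turnaround times are from the slide; which tier maps to which pair is a hypothetical illustration, since the deck does not spell out the mapping:

```python
# Hypothetical mapping of (business_priority, technical_severity) pairs to
# the turnaround tiers listed on the slide. A real strategy would enumerate
# every pair; unmatched pairs here fall back to the release-gated tier.
SLA_HOURS = {
    ("critical", "critical"): 12,
    ("high", "high"): 48,
    ("medium", "medium"): 72,
}

def turnaround_sla(priority: str, severity: str) -> str:
    """Return the expected defect turnaround for a priority/severity pair."""
    hours = SLA_HOURS.get((priority, severity))
    return f"< {hours} hrs" if hours is not None else "prior to production release"
```

Driving the SLA from both dimensions, rather than severity alone, is what prevents the quiet downgrading of defects the slide warns about.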


Standards for Test Analysis, Planning & Validation


Enterprise Approach to Requirements Phase

Requirements Elicitation
− Participate in elicitation sessions
− Create business/test scenarios
− Ensure requirements have an expected outcome

Requirements Analysis
− Ambiguity reviews to identify unclear, incomplete and missing requirements
− Feedback loop with BA to address ambiguities
− Unresolved ambiguities documented as requirements defects

Requirements Phase Signoff
− QA certifies requirements are testable
− QA certifies requirements are traceable to test cases
− QA obtains signoff from business that business/test scenarios provide appropriate test coverage


Enterprise Approach to Design Phase

Design Sessions
− Participate in design sessions
− Add/modify business/test scenarios
− Ensure design delivers documented requirements and provides the expected outcome

Design Analysis
− Identify gaps/conflicts between requirements and design
− Feedback loop with BAs/Development to address gaps/conflicts between requirements and design

Design Phase Signoff
− QA certifies design addresses defined requirements
− QA certifies design is traceable to requirements and test cases


Enterprise Approach to Construction Phase

Test Case Creation
− Create test cases based on requirements and business/test scenarios
− Automation script creation
− Performance script creation

Regression Suite Analysis
− Examine regression test case suite and select test cases for execution during regression test

Test Case Review
− Internal test case peer reviews with Test Leads
− Project stakeholder review with BAs and Dev Leads

Test Case Signoff
− QA secures signoff from BAs and Dev Leads
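Regression suite analysis, as described above, amounts to selecting the subset of the suite relevant to what changed. A minimal sketch, assuming each case records which modules it covers (the field names are illustrative, not from the deck):

```python
def select_regression_cases(suite: list[dict], changed_modules: list[str]) -> list[str]:
    """Pick regression test cases whose covered modules intersect the
    modules changed in this release.

    suite: list of dicts with 'id' (case name) and 'covers' (set of modules).
    """
    changed = set(changed_modules)
    return [case["id"] for case in suite if case["covers"] & changed]

example_suite = [
    {"id": "REG-001", "covers": {"billing", "invoicing"}},
    {"id": "REG-002", "covers": {"auth"}},
    {"id": "REG-003", "covers": {"billing", "reporting"}},
]
```

Selection happens in the Design Phase (per the tiered-documentation table), so the regression scope is known before Construction ends.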


Enterprise Approach to Validation Phase

Smoke Test
− Execute as entrance gate to Validation Phase
− Execute as entrance gate for every code turnover

System Testing
− Test execution for system test cases
− Test execution for system integration test cases
− Defect fix and retest cycles

Regression Test
− Execute manual regression test cases
− Execute automated regression test cases

Test Closure
− Final update of test cases where needed
− Test case selection for Regression Suite
− Target test cases for next stage of automation
− Lessons Learned
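The smoke test as an entrance gate reduces to a simple all-or-nothing check: any failure is, by the deck's own definition, severe enough to reject the turnover. A sketch (case names are illustrative):

```python
def smoke_gate(results: dict[str, bool]) -> bool:
    """Entrance gate for a code turnover.

    results maps smoke test case name -> pass/fail. The gate opens only if
    every case passed; a single failure rejects the prospective turnover
    before the Validation Phase begins.
    """
    return all(results.values())
```

Example: `smoke_gate({"login": True, "checkout": False})` rejects the build, so system testing never starts on a broken turnover.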


Introducing the Enterprise Test Strategy


Introducing an Enterprise Test Strategy to Your Organization

− Define the purpose and mission
− Decide what you want to accomplish
− Engage key stakeholders to gain acceptance and support
− Create a template outline in alignment with your goals
− Draft standards and guidelines as needed where gaps exist
− Implement in a phased approach
− Introduce governance councils and set goals based on each council’s purpose, such as:
  − Automation goals based on calculated ROI
  − Development disciplines to achieve automation sustainability
− Perform collaborative reviews with cross-functional stakeholders
− Introduce to senior management as a program
− Determine if the creation of a steering committee will garner greater support
− Highlight the approach as an initiative to reduce the cost of quality
− Create metrics to benchmark present state and measure progress going forward
− Set and keep a schedule for publishing metrics, such as a monthly scorecard
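"Automation goals based on calculated ROI" implies a concrete calculation. The cost model below (one-time build cost vs. manual execution cost saved per cycle) is an assumed, simplified one, not a formula from the deck:

```python
def automation_roi(build_cost: float, manual_cost_per_cycle: float,
                   cycles: int) -> float:
    """ROI of automating a test case over a number of regression cycles.

    ROI = (savings - investment) / investment, where savings is the manual
    execution cost avoided across the cycles the automated test will run.
    Maintenance cost of the script is ignored in this simplified sketch.
    """
    savings = manual_cost_per_cycle * cycles
    return (savings - build_cost) / build_cost
```

For example, a script costing 100 units to build that saves 10 units per cycle breaks even at 10 cycles and returns 2.0 (200%) at 30 cycles; cases with negative projected ROI stay manual.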


Automation & Performance Governance Councils

Automation Governance Council
− Provide oversight and direction to facilitate expansion of automated test coverage and sustainability to achieve long-term ROI
− Develop code disciplines to support automation sustainability
− Collaborate to design an automation coding approach that aligns with EH system architecture
− Govern cross-functional collaboration for tool/process
− Prioritization of test case automation
− ROI review against goals
− Set standards to facilitate use of automation benchmarks in development
− Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management

Performance Governance Council
− Provide oversight and direction to facilitate expansion of performance test coverage and sustainability to achieve long-term ROI
− Code disciplines to support performance sustainability
− Prioritization of applications to performance test
− ROI review against goals
− Evaluate environmental requirements to support performance as a discipline
− Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management


Test Data & Test Environment Management Governance Councils

Test Data Management Governance Council
− Provide oversight and direction to facilitate transfer, use, maintenance and disposal of production test data within the guidelines of IT Governance
− Align IT Governance Standards to the test data management discipline for masking, manipulating and disposing of production data extracts used for testing
− Govern use of production data to perform break/fix testing
− Set standards for capacity management (of data) to ensure storage use is optimized
− Implement guidelines for test coverage to ensure IT Governance Standards for data are observed
− Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management

Test Environment Governance Council
− Provide oversight and governance to normalize test environment configuration, usage and expansion to support E2E testing
− Unify management of test environments under one governance structure
− Establish standards for configuration to align as closely as possible to production
− Govern release standards to ensure code promotion in test environments mirrors production
− Set standards for capacity management to ensure new hardware/tools are not requested until current system usage is maximized
− Prioritize, review and approve CapEx expenditures for tools, hardware and software to support test environment management before presentation to Senior Management


Measurement


Project Level Measurements

− Project Budget: Estimated vs. Actual
− Schedule Variance
− Effort Variance
− Compliance to SDLC Processes
− Maintenance as % of IT Budget
− % Milestone Adherence
− Rate of Change Requests
− Defect Aging
− Production Defect Leakage
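Schedule variance, effort variance, and budget estimated-vs-actual all reduce to the same ratio. The deck names the metrics without formulas, so this sketch uses the common definition:

```python
def variance_pct(estimated: float, actual: float) -> float:
    """Variance of an actual value against its estimate, as a percentage.

    Positive means overrun (late schedule, extra effort, blown budget);
    negative means the project came in under the estimate.
    """
    return (actual - estimated) / estimated * 100
```

So a task estimated at 100 hours that took 120 shows +20% effort variance, and one finished in 30 days against a 40-day plan shows -25% schedule variance.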


Requirements Measurement

− % Requirements Schedule Adherence
− Requirements Volatility Index
− Rate of Change Requests
− # Missing Requirements
− # Incomplete Requirements
− # Ambiguities per Requirement
− # Clarifications due to Unclear Requirements
− Rate of Defects with Root Cause as Requirements
− Defect Leakage to Design & Construction
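The Requirements Volatility Index is named above without a formula; the sketch below uses the commonly cited definition (churn relative to the baselined requirement count), which is an assumption on my part rather than the deck's:

```python
def volatility_index(added: int, deleted: int, modified: int,
                     baseline_total: int) -> float:
    """Requirements Volatility Index: (added + deleted + modified) divided
    by the number of requirements in the signed-off baseline. 0.0 means a
    stable baseline; values near or above 1.0 signal heavy churn."""
    return (added + deleted + modified) / baseline_total
```

A baseline of 50 requirements that later gains 5, loses 2, and has 3 reworded scores 0.2, i.e. one in five requirements changed after signoff.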


Development Measurements

− % Development Schedule Adherence
− Defect Leakage to Validation Phase
− Adherence to Automation Development Disciplines
− # Times Defect Rejected/Reopened
− Defects per Lines of Code
− Defects per Function Points
− Code Turnover Build Failures
− Mean Time to Repair
− Technical Debt due to Poor Design


Test Measurements

− % QA Schedule Adherence
− Requirements Traceability to Test Cases
− Test Effectiveness
− Productivity Loss Log
− % Automated Regression Coverage
− Automation ROI
− Defect Detection Rate
− Defect Removal Efficiency
− Production Defect Leakage
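Two of the metrics above, Defect Removal Efficiency and Production Defect Leakage, are complements of each other. The deck lists them without formulas, so this sketch uses their standard definitions:

```python
def defect_removal_efficiency(found_pre_release: int,
                              found_post_release: int) -> float:
    """Share of all known defects removed before release (DRE)."""
    return found_pre_release / (found_pre_release + found_post_release)

def production_defect_leakage(found_post_release: int,
                              found_pre_release: int) -> float:
    """Share of all known defects that escaped to production."""
    return found_post_release / (found_pre_release + found_post_release)
```

A release where QA caught 90 defects and production surfaced 10 more has a DRE of 0.9 and a leakage of 0.1; benchmarking these before the strategy rollout, then publishing them monthly, is exactly the scorecard discipline proposed earlier in the deck.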


Appendix


Enterprise Test Strategy – Sample Template

1. Executive Summary
   a. QCOE Mission Statement
   b. QCOE Scope of Testing
2. Test Levels
   a. Unit Test
   b. Integrated Unit Test
   c. Unit Regression Test
   d. System Test
   e. System Integration Test
   f. Smoke Test
   g. Regression Test
      1) Manual vs. Automated
   h. Nonfunctional Test
   i. UAT
3. Tiered Approach to Projects
4. Minimum Threshold for Entrance to QCOE
5. Entrance & Exit Criteria
   a. Proposal Phase
   b. Initiation & Planning Phase
   c. Requirements Phase


Enterprise Test Strategy – Sample Template (continued)

   d. Design Phase
   e. Construction Phase
   f. Validation Phase
6. Suspension & Resumption Criteria
   a. Test Halt
      1) Smoke Test Fails
      2) Module/Function Failure or Roadblock
   b. Test Resumption
7. Program Assumptions, Constraints, Risks & Dependencies
   a. Assumptions
   b. Constraints
   c. Risks
   d. Dependencies
   e. Regulatory & Compliance
8. Systems Supported
   a. Operating Systems
   b. Web Browsers
9. QCOE Deliverables
10. QCOE Communication Guidelines


Enterprise Test Strategy – Sample Template (continued)

11. Test Estimation Model
    a. L0 Test Estimation
    b. L1 Test Estimation
    c. L2 Test Estimation
    d. Impact Analysis – Change in Scope
12. Strategic Program Approach to Test Activities
    a. Functional Test
    b. System Test
    c. System Integration Test
    d. Regression Test
13. Functional vs. User Acceptance Test
14. Strategic Program Approach for Automation
    a. Guidelines for Test Automation
    b. Automation Phase & Activities
       1) Requirements Phase
       2) Design Phase
       3) Construction Phase
       4) Validation Phase
    c. IT Development Standards
    d. Automation Coding Standards


Enterprise Test Strategy – Sample Template (continued)

15. Strategic Program Approach for Performance
    a. Types of Performance Testing
    b. Performance Standards
    c. Guidelines for Performance
    d. Performance Phase & Activities
16. Test Data Management
    a. TDM Standards
    b. Guidelines for Test Data Use
    c. TDM Framework
    d. TDM Phase & Activities
17. Test Environment Management
    a. TEMs Standards
    b. Guidelines for Test Environment Use
    c. TEMs Framework
    d. TEMs Phase & Activities
    e. System Environment Limitations & Missing Environments
18. QCOE Tools
    a. Commercial Tools
    b. Proprietary Tools


Enterprise Test Strategy – Sample Template (continued)

19. Defect Management Process
    a. Defect State
    b. Business Priority
    c. Defect Technical Severity
    d. Defect Prioritization Based on Business Priority & Technical Severity
    e. Restricted Permissions – Closing/Cancelling/Deferring Defects
20. Go/No Go Decision
    a. Zero Uninspected Defects at Release
    b. Go/No Go Decision
21. QCOE Test Management Tool Set
22. QCOE Program Reporting
    a. Reports (Traceability, Daily, Weekly, Productivity Loss Log)
    b. KPIs, SLAs & Metrics
23. Reference Documents
24. Appendix
    a. Glossary


References

− Software Testing Stuff: Test Strategy – All Aspects: http://www.softwaretestingstuff.com/2008/11/test-strategy-all-aspects.html
− Test Strategy and Test Plan: http://www.testingexcellence.com/test-strategy-and-test-plan/
− Test Plan Fundamentals: http://softwaretestingfundamentals.com/test-plan/
− Difference between Test Plan and Test Strategy | Do we really need Test Plan documents?: http://www.softwaretestingtimes.com/2010/04/difference-between-test-plan-and-test.html
− Defining Code Quality: http://blog.smartbear.com/peer-review/defining-code-quality/


References (continued)

− Test Metrics & KPIs: http://www.ust-global.com/en/images/stories/pdf/Test_Metrics_and%20KPI_s.pdf
− Project Metrics for Software Development: http://www.infoq.com/articles/project-metrics
− The Cost of Software Quality – A Powerful Tool to Show the Value of Software Quality: http://www.riceconsulting.com/articles/software-cost-of-quality-metric.html
− Using the Cost of Quality Approach for Software: http://sunset.usc.edu/cse/pub/event/archives/pdfs/Herb_pt2.pdf

Susan Schanta
Director, Cognizant Technology Solutions
[email protected]
201-478-0571