How the world’s top organizations test
The state of enterprise application testing 2021

Table of Contents

About this report

Introduction

Key takeaways

Quality metrics

Testing process assessment

Testing approach details

Next steps

Appendix A: Demographics

Appendix B: Metrics details

Appendix C: Element details


ABOUT THIS REPORT

With every business becoming a digital enterprise, the ability to rapidly release reliable applications is now a core strategic advantage. Are Fortune 500 and equivalent organizations prepared for the digital race ahead? The answer may lie in their software testing process, which can be either a curse or a catalyst for speedy innovation.

Continuous Testing enables software to be released 10x faster than traditional testing allows. Yet its adoption is limited, particularly among large enterprise organizations.

This first-of-its-kind report sheds light on how industry leaders test the software that their business (and the world) relies on.


Introduction

The challenge of software testing is immensely underestimated and invariably unappreciated. Even with seemingly basic applications, like a common mobile app, there’s a staggering number of testing approaches you could take, paths and conditions you could exercise, device configurations you could test against, and so on.

With today’s near-continuous release cycles, ensuring that each update adds value without disrupting the user experience is already a daunting task. The difficulty is exacerbated for enterprise organizations. At this level, testing must accommodate:

• Complex application stacks that involve an average of 900 applications. Single transactions touch an average of 82 different technologies, ranging from mainframes and legacy custom apps to microservices and cloud-native apps.

• Deeply entrenched manual testing processes that were designed for waterfall delivery cadences and outsourced testing, not Agile, DevOps, and the drive towards “continuous everything.”

• Demands for extreme reliability. Per IDC, an hour of downtime in enterprise environments can cost from $500K to $1M. “Move fast and break things” is not an option in many industries.

Particularly at the enterprise level, testing is the #1 source of delivery delays, manual testing remains pervasive (only 15% is automated), and testing costs consume an average of 23-35% of overall IT spend.

Yet many top organizations find a way to break through these barriers. They transform testing into a catalyst for their digital transformation initiatives, accelerating delivery and also unlocking budget for innovation.

Enterprise software testing is insanely difficult. Yet, many organizations break through the barriers and make testing a strategic business advantage.

What are they doing differently?


Introducing the first annual enterprise application testing benchmark

To shed light on how industry leaders test the software that their business (and the world) relies on, Tricentis is releasing our findings on how the world’s top organizations test. This data was collected through one-on-one interviews with senior quality managers and IT leaders representing multiple teams. The participants represented teams using a variety of QA-focused functional test automation tools (open source, Tricentis, and other commercial tools). Developer testing and security testing activities were out of scope.

The current report focuses on the data collected from the top 100 organizations we interviewed: Fortune 500 (or global equivalent) companies and major government entities across the Americas, Europe, and Asia-Pacific. All for-profit companies have revenue of $5B USD or greater.

We’re protecting everyone’s privacy here, but just imagine the companies you interact with as you drive, work, shop, eat and drink, manage your finances…and take some well-deserved vacations after all of that.

Given the average team size and number of teams represented, we estimate that this report represents the activities of tens of thousands of individual testers at these leading organizations.

How this report is organized

Following the key findings, benchmark data is presented in 3 core sections:

Quality metrics: Results and analysis for key metrics used to measure testing transformation success.

Testing process assessment: Organizations’ progress on implementing key elements of a successful testing process.

Testing approach details: A deeper dive into how organizations are testing their applications and measuring their success.


Key takeaways

Automation without stabilization
The average test automation rate (39%) is relatively high, but so are false positives (22%). This is common for early-stage test automation efforts that lack stabilizing practices like test data management and service virtualization.

Tests aren’t aligned to risks
Requirements coverage (63%) is high, but risk coverage is low (25%). Likely, teams are dedicating the same level of testing resources to each requirement rather than focusing their efforts on the functionality that’s most critical to the business.

Dev and test cycles are out of sync
The average test cycle time (23 days) is shockingly ill-suited for today’s fast-paced development cycles (87% of which were 2 weeks or less back in 2018). With such lengthy test cycles, testing inevitably lags behind development.

Quality is high (among some)
The reported defect leakage rate (3.75%) is quite impressive.¹ However, only about 10% of respondents tracked defect leakage, so the overall rate is likely higher. The organizations tracking this metric tend to be those with more mature processes.

Great foundation
Organizations have made good strides mastering the foundational elements of testing success (adopting appropriate roles, establishing test environments, fostering a collaborative culture).

“Continuous everything” isn’t happening…yet
Few are achieving >75% test automation rates or adopting stabilizing practices like service virtualization and test data management. Given that, limited CI/CD integration isn’t surprising. All are high on organizations’ priority lists, though.

Greatest gaps
The greatest gaps between leaders and laggards are in the percentage of automated tests executed each day, risk coverage, defect leakage into UAT, and test cycle time.

Top improvement targets
The areas where organizations hope to make the greatest short-term improvements (within 6 months) are risk coverage, defect leakage into UAT, false positive rate, and test cycle time.

¹ Typically, <10% is considered acceptable, <5% is good, and <1% is exceptional.

39% test automation...but high false positives, low risk coverage, and shockingly slow testing cycles.


Quality metrics

Here are the findings for 10 core quality metrics that testing teams were tracking:

METRIC | Mean | Median | Bottom Quartile | Top Quartile
Automation rate | 39 | 40 | 20 | 65
False positive rate | 23 | 20 | 30 | 10
Defect leakage into production | 4 | 3 | 5 | 1
Defect leakage into system integration | 20 | 12 | 25 | 12
Defect leakage into UAT | 18 | 6 | 35 | 4
Execution rate per day* | 41 | 22 | 0 | 89
Execution rate per week* | 52 | 33 | 21 | 100
Risk coverage | 25 | 5 | 0 | 45
Requirements coverage | 63 | 75 | 50 | 75
Test cycle time (in days) | 23 | 10 | 35 | 5

* The percentage of automated tests that are successfully executed each day/week.
All values are percentages, except test cycle time (days). Note that in some cases (defect leakage, false positives, cycle time) a lower number is better.
To learn more about these metrics, see Appendix B.

Leaders vs laggards

The metrics with the greatest gaps between leaders (top quartile) and laggards (bottom quartile) are:

Execution rate per day

Risk coverage

Defect leakage into UAT

Test cycle time


A peek into specific organizations’ priorities

Here’s a look at what 10 organizations reported as their top priorities:

GOVERNMENT TAX AGENCY – reduce their false positive rate from 50% to 15%
GOVERNMENT SOCIAL SERVICES AGENCY – increase risk coverage from 5% to 20%
DEPARTMENT STORE – increase their test automation rate from 4% to 26%
ENERGY COMPANY – increase the daily execution rate from 85% to 100%
HOME IMPROVEMENT RETAILER – reduce defect leakage into production from 2% to <1%
AUTOMOTIVE COMPANY – boost requirements coverage from 75% to 95%
COMPUTER STORAGE COMPANY – go from zero test data automation to 20%
FINANCIAL SERVICES PROVIDER – shrink test cycles from 5 days to 1 day
BEVERAGE COMPANY – advance from a 60% test data automation rate to 80%
FOOD MANUFACTURER – increase test automation from 20% to 50%


Testing process assessment

The following is an overview of organizations’ progress on implementing key elements of a mature, sustainable testing process. These elements are based on the industry’s leading framework for scaling continuous testing in the enterprise. Framework contributors include Accenture, Capgemini, Tasktop, Ken Pugh, TTC, Narwal, Infometis AG, Automators, Quence IT, Getting Digital Done, Expleo Group, and many of the industry’s top quality leaders.

Successful testing transformation begins by working on the foundational elements: delineating and adopting the necessary team roles, fostering collaboration among the various testing roles and development, and ensuring easy access to the hardware and software required to execute tests. We were pleased to see that many organizations have already made great headway on these fundamental elements.

We weren’t surprised to find automation, continuous integration, test data management, and service virtualization at the lower end of the adoption spectrum. This makes sense. Without consistent, reliable access to test data and dependencies, automation is unstable and unable to cover the more advanced use cases that large enterprise organizations need to exercise. Such organizations have an average of 900 applications, and single transactions commonly touch an average of 82 different technologies. Addressing both test data management and test environment access issues can enable much higher levels of test automation and add the stability required for automated testing within a CI/CD pipeline.

To learn more about these elements, see Appendix C.


Testing approach details

As part of the interview process, QA leaders shared many details about how their organizations approach testing. Here’s a window into some rarely exposed aspects of the software testing process:

On what basis do you select an application or application area for automation?
• Business impact 85%
• Effort savings 82%
• Frequency of updates 73%
• Technical feasibility 71%
• Application maturity 55%

What strategy do you follow to select test data for your tests?
• Synthetic test data creation 82%
• Basic (find the right data for a particular test case) 77%
• Production refresh 60%
• Gold copy 18%

Which of the following metrics on the business impact of testing do you track and report on?
• Defects prevented 88%
• Cost savings 45%
• Speed to market 32%

How do you approach test design and creation?
• By intuition 70%
• Provided by dev or product owners 54%
• Methodical approach (orthogonal, pairwise, linear expansion…) 46%

How long does it take you to generate a report on the quality of the application?
• Within an hour 30%
• Within a day 27%
• It’s real-time 26%
• More than a day 9%
• Don’t know 8%

How do you measure the coverage of your test suite?
• Requirements coverage 81%
• Business risk coverage 49%
• Number of test cases 44%

To whom do you present the business benefits and successes achieved from your testing (not just pure metrics)?
• Within the department 93%
• Senior management 74%
• Other departments 58%

Do you have traceability between the following artifacts?
• Defects and tests 83%
• Requirements and tests 80%
• Requirements and defects 60%

What QA roles do you have in your team (or are available to support your team)?
• Automation specialists 88%
• Manual testers 88%
• Test analysts 77%
• Automation engineers 70%
• Test architects 63%


Next steps

Want to see how your current process stacks up across the 12 critical testing process elements outlined above, and learn what specific steps are needed to reach your short-term and long-term objectives? Take the Continuous Testing Maturity Assessment. It can be completed online, at your team’s convenience. After answering some questions about your team’s testing process, you will receive a detailed report with:

• An overview of your team’s strengths and weaknesses
• A customized roadmap of what to focus on in the next 3-6 months
• A set of KPIs to track and optimize
• Best practices for improving your testing practice

We also invite you to browse our Customer Stories site to learn how your peers are achieving amazing results by transforming testing.

Learn how to make Enterprise Continuous Testing a reality

Read now >


Appendix A: Demographics

Location
• AMS
• APAC
• EMEA

Industry
• Financial services
• Insurance
• Retail and online
• Food & beverage
• Banking
• Technology
• Energy & utilities
• Government
• Other


Company size (number of employees)
• Up to 100k
• 100k - 200k
• 200k+


Appendix B: Metrics details

Automation rate
Automation rate measures the percentage of automated tests in your test suite. The higher your percentage, the greater the potential for effort savings. Organizations typically aim to automate 50% or more of their tests to meet speed and coverage expectations.

Defect leakage into production | system integration testing | user acceptance testing (UAT)
Defect leakage measures the number of defects that are not caught until production, system integration testing, or UAT. Defect leakage helps you gauge the overall effectiveness of the testing process as well as the quality of releases. A defect leakage into production rate under 5% is considered good by most industry standards.

Execution rate per day | week
Execution rate measures what percentage of the team’s automated tests are run on a daily/weekly basis. Since the ROI of automation efforts comes from executing automated tests, understanding this metric helps you measure the value of automation. Depending on your release cycle timeframe, your daily/weekly execution rate needs to keep you on track to complete 100% of tests prior to release.

False positive rate
The false positive rate measures the percentage of failures reported by executed tests that are not related to a defect in the product. A false positive rate of <10% will build confidence in test results and prevent avoidable rework.

Requirements coverage
Requirements coverage measures the percentage of the requirements that are tested by your test suite. Some regulated industries expect 100% requirements coverage. Risk coverage is often a more meaningful metric than requirements coverage.

Risk coverage
Risk coverage measures the amount of business risk your test suite covers. It helps you focus testing on the functionality that matters most to the business. Rather than try to test all functionality equally, allocate more resources to areas with the greatest risk (considering their frequency of use and the potential damage from their failure).

Test cycle time
Test cycle time measures the duration of the testing process, which is a good indicator of how efficient your testing process is. Especially with Agile and DevOps, the shorter the test cycle time, the better. Track the breakdown of testing activities to find opportunities to reduce the test cycle time.


Appendix C: Element details

Automation
Test automation is a key enabler of Continuous Testing. Test automation can free up resources, expose defects earlier, and provide faster feedback on application changes. Test automation, however, comes at a cost: the cost of creating the automated tests and, more importantly, the cost of maintaining them. For the greatest ROI, automate the right test sets at the different stages of the delivery pipeline. A green classification means you are optimizing your testing process with at least 75% automation. Yellow indicates a test automation rate from 20%-74%.

Collaboration
Effective collaboration is absolutely key to a successful testing transformation. Establish a process that facilitates communication between different teams so that challenges are dealt with efficiently. Being in the green zone is a great indicator that your collaboration process is working (with testers involved across requirements, design, and operations). In the yellow zone, there is regular dev-tester interaction but you might have some silos.

Continuous Integration
Continuous integration involves automatically building and testing code within your CI/CD pipeline for fast feedback on changes. Test execution should be triggered automatically at the appropriate phases of the pipeline. A green classification indicates that unit tests and smoke tests are executed regularly and trigger the expected actions (e.g., code promotions or stakeholder notifications), as in the sketch below.

Data strategy
Test data management consumes 44% (per Sogeti) to 60% (per IBM) of testing time. Defining a strategy around test data management will not only save time, resources, and frustration; it will also enable higher levels of automation, increase coverage, and reduce false positives. A green classification means you are preparing or generating ≥60% of test data through automation and using data refreshes or synthetic test generation. Yellow indicates 20%-59% automation and a basic test data preparation strategy.

Environment strategy
A test environment is the setup of software and hardware that allows the testing team to execute their tests. Having reliable access to the right test environments can greatly reduce defect leakage as well as increase the speed of testing and root cause analysis. On the other hand, not having the right test environments can cause delays, frustration, and loss of confidence in the test results. In the green zone, you have multiple test environments with high stability. In the yellow zone, you have at least one test environment with moderate stability.


Execution strategy
The test execution strategy helps define what tests to execute when, where, and with what data. The strategy should ensure that all required tests are executed to achieve proper test coverage. It should also ensure that redundant tests are not executed, to avoid wasting time and resources. Having parallel test execution and weekly automated test runs is sufficient for yellow. A green classification is attained if automated tests are run (at least) daily, including regression, progression, and smoke tests.

Resilience
Resilience indicates how well your test automation withstands application changes. An application change should not cause a test failure unless it introduces a defect (the application must be fixed) or changes the intended functionality (the test must be fixed). This measure is influenced by false positive rates, test execution stability, and test automation maintainability. If you’re in the green zone, this indicates a false positive rate ≤5% and test maintenance effort ≤20%. A false positive rate from 6%-20% and test maintenance effort from 21%-40% is sufficient for yellow.

Roles
You should have 5 main roles in your team to succeed on your Continuous Testing journey: Test Architect, Test Analysts, Automation Engineers, Automation Specialists, and Test Managers. A single resource may play multiple roles, but be careful not to overload your resources. If you’re in the green zone, you have all the above roles and are well equipped to succeed. In yellow, you should onboard further members and/or upskill your existing team.

Service Virtualization
Service virtualization mocks a required service (that might otherwise be blocked, offline, or unavailable) so you can test your system in isolation. This enables you to “shift left” integration testing and allows teams to work more independently of one another. To reach the green zone, you must virtualize ≥50% of your application’s interfaces to dependent third-party systems. Virtualizing 25%-49% is sufficient for yellow.

Success stories
Success stories share testing transformation achievements across the organization. Success stories can come in any format (written, video, etc.). The critical component for success stories is that they capture outcomes important to the business and are shared broadly with relevant teams and stakeholders. Yellow indicates you report on risk avoidance metrics and share success stories within the team. For green, you track additional metrics (for cost + speed) and share successes more broadly.


Test coverage
Test coverage assesses how well your tests are mitigating risks. Ideally, testing includes risk-based testing, test case design methods (orthogonal, pairwise, linear expansion…), and exploratory testing. Together, these approaches can help you efficiently identify issues with a high impact on your organization. Green indicates ≥75% test coverage of business risks, methodical test design, and exploratory testing. Yellow indicates 50%-74% coverage of requirements.

Test management
Test management is the process of creating tests, tracking tests, and linking defects for all types of testing. Test management tools have evolved to help oversee these elements, incorporating a dashboard so that all project stakeholders can get an overview. If you’re in the green zone, you have comprehensive traceability across tests, defects, and requirements plus near real-time reporting. Yellow indicates traceability between tests and requirements only.

2570 W El Camino Real, Suite #540

Mountain View, CA 94040

+1 650 383 8329

[email protected]