Software Testing Industry Trends and Standards in Australia
Dr Mark Pedersen
K.J. Ross & Associates
Company Background
• Founded in 1997, emerging from safety-critical systems
• Pure-play software testing
• Focus on standards:
  – Health
  – Transport
  – ISO 29119
Presentation Overview
• Highlights of the 2012 Australian Software Testing Industry Benchmark
• The role of standards and accreditation in software testing in Australia, and implications for the future
Project Success?
• Successful: 41%
• Challenged: 53%
• Cancelled: 6%
Software Testing Industry Benchmark 2012 – Challenged Projects
• Over budget: 37%
• Over time: 45%
• Customer requirements not met at deployment: 18%
"By 2025, there will be a '9/11' magnitude software failure." – Barry Boehm, 2011
Proportions of Budget
[Chart: Distribution – Proportion of Project Budget (Current); planned vs. actual testing spend, bucketed from <5% to 40%+ of project budget]
[Chart: Testing as Proportion of Project Budget, planned vs. actual – 2 years ago, current, and 2 years ahead; reported values cluster between roughly 18% and 22% of project budget]
Perception of Budget
[Chart: Perceptions on Test Expenditure – percent of respondents by role (IT Executives, Business, Project Managers, Test Managers) rating test spend from "way too low" to "way too high"]
Changing Roles
[Chart: Test Activity – proportion of effort spent on management, analysis, execution, environments, and reporting; 2 years ago, now, and 2 years ahead]
Team Structure
[Chart: proportion of effort by test role – 2 years ago, now, and 2 years ahead]
Ratios
• Test Managers to team: 1 : 11
• Test Directors to team: 1 : 70
• Testers to developers: 1 : 5
Team Structure
[Chart: proportion of effort by test role – Test Manager, Test Lead, Test Analyst, Tester, Test Automation, Performance]
Resourcing Mix
Test Resourcing (proportion of effort):
                                 2 years ago   Now     2 years on
In-house                         64.6%         69.2%   66.0%
Contractors                      30.4%         20.3%   14.0%
Outsource / Vendors – Onshore     3.8%          5.7%   11.3%
Outsource / Vendors – Offshore    1.3%          4.8%    3.8%
Preferred Lifecycle
Training Budgets Down
Average annual training allocation per tester (min / avg / max):
         2010                   2012
Budget   $0 / $2,475 / $9,000   $0 / $1,629 / $4,000
Hours    8 / 47 / 120           0 / 27 / 75
Automation
[Chart: Percentage of Tests Automated – 2 years ago, now, and 2 years ahead]
Automation
What proportion of automated test cases are re-executed regularly during regression testing?
• 2 years ago: 12.8%
• Now: 24.4%
• 2 years ahead: 50.3%
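The leverage in those numbers comes from making regression re-execution cheap and routine. As a minimal sketch (an assumed example, not material from the benchmark): with pytest, tests can be tagged with a marker so that a stable regression subset is re-run on every build via `pytest -m regression`. The `apply_discount` function stands in for a real system under test.

```python
# Minimal sketch of a regression-tagged suite. Register the "regression"
# marker in pytest.ini to keep runs warning-free. Run the regression subset
# with: pytest -m regression
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical system under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

@pytest.mark.regression
def test_standard_discount():
    # Stable behaviour worth re-running every cycle.
    assert apply_discount(100.0, 10) == 90.0

@pytest.mark.regression
def test_invalid_discount_rejected():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)

def test_new_feature_rounding():
    # Newer test, not yet promoted into the regression set.
    assert apply_discount(19.99, 5) == 18.99
```

Promoting tests into the marked set as they stabilise is one way the re-executed proportion grows from the 24.4% reported "now" toward the 50.3% respondents expect.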
Automation Investment
No, we will significantly
decrease investment
0%
No, we will decrease
investment 4%
Investment will stay the same, or
undecided 35%
Yes, we will increase
investment 50%
Yes, we will significantly
increase investment
11%
In the next few years, we intend to invest further in test automation tools?
"Do you feel that automation can deliver real value and return on investment?"
• Yes, strongly: 50%
• Yes, marginally: 42%
• No: 8%
Early Error Detection
Reqs Design Code Test Prod
Cost
Injection
Current Detection Ideal Detection
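To make the chart's argument concrete, here is a small worked example. The cost multipliers and both detection profiles are assumed, Boehm-style illustrative figures, not benchmark data; they only show the shape of the calculation.

```python
# Illustrative only: relative repair-cost multipliers and detection profiles
# are assumptions, not survey figures.
PHASES = ["Reqs", "Design", "Code", "Test", "Prod"]
COST_MULTIPLIER = {"Reqs": 1, "Design": 3, "Code": 10, "Test": 30, "Prod": 100}

# Hypothetical detection profiles: fraction of defects found in each phase.
CURRENT = {"Reqs": 0.05, "Design": 0.05, "Code": 0.20, "Test": 0.50, "Prod": 0.20}
IDEAL   = {"Reqs": 0.30, "Design": 0.25, "Code": 0.25, "Test": 0.15, "Prod": 0.05}

def relative_cost(profile: dict) -> float:
    """Expected repair cost per defect, in units of a requirements-phase fix."""
    return sum(profile[p] * COST_MULTIPLIER[p] for p in PHASES)

cur, ideal = relative_cost(CURRENT), relative_cost(IDEAL)
print(f"Current: {cur:.1f}x  Ideal: {ideal:.1f}x  Ratio: {cur / ideal:.1f}x")
# -> Current: 37.2x  Ideal: 13.1x  Ratio: 2.9x
```

Under these assumed figures, the late-detection profile is roughly three times as expensive per defect, which is the point the slide's curves are making.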
Distribution of Test Effort
[Chart: proportion of project effort by phase – concept, requirements, design, coding, unit test, application acceptance test, component integration test, system test, system integration testing, end-to-end functional test (carried out by …), acceptance test (carried out by end users), alpha test, beta test, deployment, post-deployment (in production, within warranty), post-deployment (in production, outside warranty), other]
Defect Injection
Average percent of defects by injection type:
• Requirements: 17.8%
• Design: 14.4%
• Code: 51.4%
• Bad fix (regression): 10.8%
• Other: 1.3%
Defect Detection
0.3%
5.3% 3.9%
7.7% 7.1%
18.2%
34.4%
20.4%
0.3% 0.3% 2.1%
3.4% 0.9%
0.0%
5.0%
10.0%
15.0%
20.0%
25.0%
30.0%
35.0%
40.0%
Co
nce
pt
Req
uir
emen
ts
Des
ign
Co
din
g
Un
it T
est
Inte
grat
ion
Tes
t
Syst
em T
est
Acc
epta
nce
Tes
t
Dep
loym
ent
Alp
ha
Test
or
Bet
a Te
st
Po
st-d
eplo
ymen
t (i
n p
rod
, wit
hin
war
ran
ty)
Po
st-d
eplo
ymen
t (i
n p
rod
, ou
tsid
e w
arra
nty
)
Oth
er
Ave
rage
Pe
rce
nt
Software Development Phase
Defect Detection
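Taking the averages above at face value, a short calculation shows how much of the total defect load is caught before deployment. Since these are survey averages they need not sum to 100, so the sketch normalises rather than assuming they do.

```python
# Pre-release containment computed from the average detection percentages
# reported above (survey averages, so normalised against their own total).
detection = {
    "Concept": 0.3, "Requirements": 5.3, "Design": 3.9, "Coding": 7.7,
    "Unit Test": 7.1, "Integration Test": 18.2, "System Test": 34.4,
    "Acceptance Test": 20.4, "Deployment": 0.3, "Alpha/Beta Test": 0.3,
    "Post-deployment (warranty)": 2.1,
    "Post-deployment (outside warranty)": 3.4, "Other": 0.9,
}
pre_release = {"Concept", "Requirements", "Design", "Coding", "Unit Test",
               "Integration Test", "System Test", "Acceptance Test"}
total = sum(detection.values())
contained = sum(v for k, v in detection.items() if k in pre_release)
print(f"Detected before deployment: {contained / total:.0%}")  # -> 93%
```

Read against the injection slide, the striking point is that over half of defects are injected in code, yet detection peaks much later, in system test.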
Environment Challenges
Complexity & Lost Effort
[Chart: Effort Lost Due to Environment Issues – minimum, average, and maximum, as a percent of respondents]
[Chart: Average Number of Systems Impacted by Project – 9.9 two years ago, 9.7 now, 10.5 expected two years ahead]
[Chart: Average Number of Systems Impacted by a Project – distribution across buckets (1-3, 3-5, 5-10, 10-20, 20-50, 50-100, 100+); 2 years ago, now, and 2 years ahead]
Software Testing Standards and Accreditation
• ISO/IEC 29119
• NATA accreditation for software testing in Australia:
  – NEHTA Secure Message Delivery
  – NEHTA Health Identifiers
• Other standards requiring software testing
ISO/IEC 29119 – Scope & Structure
• Part 1: Concepts & Vocabulary (replaces BS 7925-1)
• Part 2: Processes (replaces BS 7925-2, IEEE 1008)
• Part 3: Documentation (replaces IEEE 829)
• Part 4: Testing Techniques (replaces BS 7925-2)
• Part 5: Keyword-Driven Testing
• Related: ISO/IEC 33063 – Test Process Assessment
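Part 5's subject, keyword-driven testing, separates test cases (data) from the engine that executes them. A minimal sketch of the idea follows; the keyword names and the `FakeApp` stand-in are illustrative assumptions, not content of the standard.

```python
# Keyword-driven testing in miniature: test cases are rows of (keyword, args),
# executed by a small engine that maps each keyword to an action.
from typing import Callable

class FakeApp:
    """Stand-in system under test."""
    def __init__(self):
        self.user, self.screen = None, "login"
    def login(self, user, password):
        self.user, self.screen = user, "home"
    def open_screen(self, name):
        self.screen = name

KEYWORDS: dict[str, Callable] = {}

def keyword(name):
    def register(fn):
        KEYWORDS[name] = fn
        return fn
    return register

@keyword("Login")
def do_login(app, user, password):
    app.login(user, password)

@keyword("Open Screen")
def do_open(app, name):
    app.open_screen(name)

@keyword("Verify Screen")
def do_verify(app, expected):
    assert app.screen == expected, f"expected {expected}, got {app.screen}"

def run(app, table):
    """Execute a keyword table: each row is (keyword, *args)."""
    for kw, *args in table:
        KEYWORDS[kw](app, *args)

# A test case written as data, editable without touching the engine:
run(FakeApp(), [
    ("Login", "alice", "secret"),
    ("Open Screen", "reports"),
    ("Verify Screen", "reports"),
])
```

The design pay-off is that non-programmers can compose and maintain test cases as tables while automation specialists maintain only the keyword library.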
Testing Process Model (Part 2)
Three layers of processes:
• Organizational Test Process – produces the organizational test documentation and receives feedback on it
• Test Management Processes – Test Planning, Test Monitoring & Control, and Test Completion; produce the test plan, test plan updates, and the test completion report; pass the test plan and control directives down to the dynamic test processes and receive test measures back
• Dynamic Test Processes – execute testing under the direction of the test management processes
Documentation Example (Part 3)
• Organizational level: Organizational Test Policy, Organizational Test Strategy
• Project level: Project Test Management Plan & Strategy
• Phase-level plans: Unit Test Plan, Integration Test Plan, System Test Plan, Performance Test Plan, Usability Test Plan, UAT Plan
• Resulting documentation: Unit Test Docs, Integration Test Docs
Techniques (Part 4)
• Dynamic test techniques: black-box, white-box, and non-functional
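As one concrete instance of the black-box techniques Part 4 catalogues, boundary value analysis derives test inputs just below, on, and just above each edge of a valid range. The 18–65 age range and `accepts_age` function below are hypothetical examples, not drawn from the standard.

```python
# Boundary value analysis: for an inclusive valid range, test the values
# immediately below, on, and above each boundary.
def boundary_values(low: int, high: int) -> list[int]:
    """Return the classic boundary test inputs for an inclusive range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

def accepts_age(age: int) -> bool:
    """Hypothetical system under test: is the applicant's age in range?"""
    return 18 <= age <= 65

for age in boundary_values(18, 65):
    print(f"age={age:3d} -> accepted={accepts_age(age)}")
# Expected: 17 rejected; 18, 19, 64, 65 accepted; 66 rejected.
```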
Testing in the Health Domain
• The National E-Health Transition Authority (NEHTA) requires compliance with specific published standards:
  – Secure Message Delivery
  – Health Identifiers
• NATA governs accreditation of approved compliance-testing labs
• Frequent changes to compliance requirements demand a high level of communication and collaboration with NEHTA
• Compliance requirements are expected to increase in the future
Testing in Safety-Critical Domains
• EN 50128: Railway applications – communication, signalling and processing systems – software for railway control and protection systems
• In 2012, KJ Ross acted as the independent safety assessor for a new signalling system in NSW
• Applied software testing quality-control principles across the whole of the client's process to ensure compliance with the requirements of EN 50128
• Focus on evidence of testing
• The state safety authority was extremely satisfied
Conclusion
• There is a significant contrast between typical "commercial" software testing practice and industries where software testing is mandated
• Introducing legislated software testing requirements remains significantly difficult
• High-profile failures of "non-critical" systems (e.g. the Queensland Health payroll) may begin to change the governance culture
• Opportunities for offshore software testing services are increasing in the commercial arena
• Legislative compliance testing is more challenging due to the (changing) levels of detail involved
• Cost imperatives for compliance testing will nonetheless drive offshore opportunities
Questions
Contact:
Mark Pedersen
K. J. Ross & Associates Pty Ltd
www.kjross.com.au