T6 Value-Inspired Testing – an emergent path between “too much chaos” & “too much order”
Transcript of slides
Neil Thompson, Thompson Information Systems Consulting Ltd
Value-Inspired Testing v1.1: Renovating Risk-Based Testing, and Innovating with Emergence
www.eurostarconferences.com @esconfs #esconfs
Neil Thompson © [email protected] @neilttweet neiltskype
Are reports of testing’s death “greatly exaggerated”?
Deming: survival is “not compulsory”
• Tim Rosenblatt (Cloudspace blog, 22 Jun 2011): “Testing Is Dead – A Continuous Integration Story For Business People”
• James Whittaker (STARWest, 05 Oct 2011): “All That Testing Is Getting In The Way Of Quality”
• Alberto Savoia (Google Test Automation Conference, 26 Oct 2011): “Test Is Dead”
• (There *may* be others?)
But those definitions of testing seem too narrow – my Agenda instead...
• To renovate the use of Risk in testing:
– collate current variants, eg “Risk-Based”, “Risk-Driven”
– use a context-driven mix of principles
– grade testing from high to low (not truncate)
– balance risk against benefits, giving net Value
– use risk throughout the testing “process”
– integrate risk into the SDLC using Value Flow ScoreCards
• To innovate in testing:
– consider evolution in Nature – also a value flow?
– appreciate the concept of Memes; evolving “memeplexes” in testing
– an emergent path between “too much chaos” & “too much order”
– creativity: where good ideas come from (Johnson)
So, when holistic & evolving, testing will not die?
(based on http://www.needham.eu/wp-content/uploads/2011/01/ascent-of-man1.jpg)

Start renovation of “Risk” by collating current variants
[Timeline diagram, roughly 1972-3 through 2002: implicit risk principles; “testing is risk-based”; how to do it; “risk, schmisk!”; risks as entities to test, driving techniques (RISK-BASED TEST DESIGN); risk as prioritisation of features etc (RISK-BASED TEST MANAGEMENT)]
Use a context-driven mix of available principles
After: Heuristic Test Strategy Model v4.8, James Bach
• Risk workshops: why, whether, who, where? when, what risks, how handle?
• RISK-BASED TEST MANAGEMENT: project environment, perceived quality, business risks → prioritisation – what to prioritise & focus on: test items? features? data items? test conditions?
• RISK-BASED TEST DESIGN: project environment, quality criteria, product elements, technical risks → test techniques – risk factors to choose, eg usage, newness, complexity
Prioritisation: better than truncating “low-risk” tests, *grade* coverage
After: Chris Comey, Testing Solutions Group
(Charts of Test Coverage & Effort against Riskiness:)
• Even distribution – does this make sense? No!
• Random / spurious priorities – even less sense!
• Risk-truncated – better, but dangerous to omit some areas completely?
• Risk-graded – this is the most responsible way
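The risk-graded idea can be sketched in code: instead of truncating everything below a risk threshold, scale effort in proportion to riskiness so that every area keeps at least some coverage. This is only a minimal illustration of the principle; the scoring scheme, area names and numbers are invented for the example, not taken from the slides.

```python
def grade_effort(areas, total_hours, floor=0.5):
    """Allocate test effort in proportion to risk, but never drop an
    area entirely: each area gets at least `floor` hours (risk-graded),
    rather than truncating all areas below some risk threshold."""
    total_risk = sum(risk for _, risk in areas)
    allocation = {}
    for name, risk in areas:
        hours = total_hours * risk / total_risk
        allocation[name] = max(round(hours, 1), floor)
    return allocation

# Risk score here = likelihood x impact, as agreed in a risk workshop
# (illustrative numbers only)
areas = [("payments", 9), ("search", 4), ("user profile", 2), ("help pages", 1)]
effort = grade_effort(areas, total_hours=80)
```

Note that even the lowest-risk area ("help pages") keeps a non-zero allocation, which is the “grade, don’t truncate” point of the slide.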
Consider not only risks – balance against benefits to give net value
After: Paul Gerrard & Neil Thompson, book Risk-Based E-Business Testing
[Diagram: project objectives, hence business benefits, available for release now. Each product risk is open or closed; tests, graded by priorities + FEATURES etc, close risks against objectives. Flow: Project... Business... .............Value]
Apply risk principles throughout software lifecycle
after SOFTWARE TESTING: A CRAFTSMAN’S APPROACH, Paul Jorgensen
[Diagram, overlapping models:]
• DEVELOPMENT MODEL: REAL WORLD → (simplification) Requirements → Functional Specification → Technical Design → Module Spec → (refinement with risk of distortion; programming with risk of bugs) → SOFTWARE
• TEST MODEL: Acceptance / System / Integration / Component Test Analysis & Design, then CT, IT, ST, AT Execution – verification testing of SOFTWARE (observed) against DEV MODEL (expected); validation testing against TEST MODEL (ver’d / val’d) and REAL WORLD (desired)
So:
• remember overlapping models
• we need both verification & validation
• this is not “the” V-model!
Bear in mind causes and effects of risks
(Mapped onto the same lifecycle diagram:)
• Mistake: a human action that produces an incorrect result (eg in spec-writing, program-coding)
• Defect: incorrect information in specifications
• Fault: an incorrect step, process or data definition in a computer program (ie executable software)
• Anomaly: an unexpected result during testing
• Failure: an incorrect result
• Error: amount by which the result is incorrect
• ...plus Knock-on Effects
Probability side of risk: of making mistakes, of defects causing faults, of faults causing failures, etc – addressed by static verification and validation
Consequence side of risk, if it happens: on the TEST “process”, and on the REAL WORLD after go-live
Risk principles apply throughout testing “process”
• The chain Mistake → Defect → Fault → Anomaly → Failure → Error → Knock-on Effects runs from Specification through the DEVELOPMENT MODEL and TEST MODEL
• Prevention: use “Peopleware” principles; write / model better requirements
• Static validation & static verification: detect omissions, distortions, rogue additions...
• Test analysis → test design → test execution → bug management (with other oracles); may be all or partially exploratory
• Detect further bugs; adjust test coverage; fix, test fixes, regression-test
• Prioritise by both urgency & importance
• Consequences: on DEV & TEST “processes”, and on REAL WORLD after go-live
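The “prioritise by both urgency & importance” point can be made concrete with a small sketch: score each bug on both axes and sort so that important-and-urgent work comes first. The two-axis idea is from the slide; the scoring scale, field names and sample bugs are invented for illustration.

```python
def prioritise_bugs(bugs):
    """Sort bugs by importance then urgency (each scored 1-5).
    Importance: consequence if never fixed; urgency: cost of delay.
    Sorting on the (importance, urgency) pair keeps both axes visible
    rather than collapsing them into a single opaque number."""
    return sorted(bugs, key=lambda b: (b["importance"], b["urgency"]), reverse=True)

bugs = [
    {"id": "B1", "importance": 2, "urgency": 5},  # cosmetic, but blocks a demo
    {"id": "B2", "importance": 5, "urgency": 5},  # data loss, going live next week
    {"id": "B3", "importance": 5, "urgency": 1},  # serious, but feature is dormant
]
order = [b["id"] for b in prioritise_bugs(bugs)]  # B2 first, B1 last
```

Whether importance should always dominate urgency (as this tuple ordering implies) is itself a context-driven choice.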
A framework for managing value through the lifecycle: “Value Flow ScoreCard”
• Seven viewpoints – Financial, Customer, Supplier, Infrastructure, Process, Product, Improvement – covering WHO... WHY; WHAT, WHEN, WHERE; HOW
• “The seven watchwords of highly effective software people!”
• In action, the ScoreCard is a 7x4 table:
– uses include setting / balancing test policy, strategy, coverage, troubleshooting & improvement
– can start with repositionable paper notes, or use a spreadsheet
– NB the measures & targets need not be quantitative; they may be qualitative, eg rubrics
Risk can be integrated into the scorecard
• Now it’s a 7x5 table: columns Objectives | Threats to success | Measures | Targets | Initiatives
– ie WHY we do things | HOW they may fail (Risk) | WHAT will constitute success, WHEN & WHERE | HOW to do things well
• Rows: the SEVEN VIEWPOINTS of what stakeholders want – Financial, Supplier, Customer, Infrastructure, Process, Product, Improvement
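As a data structure, the 7x5 ScoreCard is simply the seven viewpoints crossed with the five columns. A minimal sketch follows: the viewpoint and column names are from the slides, while the sample cell contents are invented for illustration.

```python
VIEWPOINTS = ["Financial", "Customer", "Supplier", "Infrastructure",
              "Process", "Product", "Improvement"]
COLUMNS = ["Objectives", "Threats to success", "Measures", "Targets", "Initiatives"]

def empty_scorecard():
    """Build the 7x5 Value Flow ScoreCard as nested dicts; each cell is a
    list of free-text notes, so entries may be qualitative (eg rubrics)
    rather than quantitative measures."""
    return {v: {c: [] for c in COLUMNS} for v in VIEWPOINTS}

card = empty_scorecard()
card["Product"]["Threats to success"].append("software may contain faults")
card["Product"]["Initiatives"].append("risk-graded test coverage")
```

This mirrors the “repositionable paper notes” usage: cells start empty and accumulate notes as the workshop proceeds.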
Types of risk
• Project risk – eg: supplier may deliver late; key staff may leave
• Process risk – eg: configuration management may install wrong version of product
• Product risk – eg: specifications may contain defects; software may contain faults
Each type may cause risks of the other types.
So: we’ve renovated “risk-based testing” into a whole-lifecycle structure
• The 7x5 table again: Objectives (WHY we do things) | Threats to success | Measures | Targets (WHAT will constitute success, WHEN & WHERE) | Initiatives (HOW to do things well)
• Project, process and product risks all appear as threats to success, across the SEVEN VIEWPOINTS of what stakeholders want: Financial, Supplier, Customer, Infrastructure, Process, Product, Improvement
Now to move on to innovation
• The double feedback loop of the ScoreCard:
– not only is our scorecard, and its cascading, converging on desired targets for the current project...
– but also: how we are planning to improve for next & future projects
• (Process risks feed the Improvement viewpoint of the Objectives | Threats to success | Measures | Targets | Initiatives table)
How does Nature innovate?
Images from wikipedia
• Lamarck: Acquired characteristics, Usage, Inheritance
• Darwin: Mutation, Fitness, Reproduction
• (various authors) Emergence...
A scientific view of emergence
Sources: Daniel Dennett, “Darwin’s Dangerous Idea”; the “cosmic Ouroboros” (Sheldon Glashow, Primack & Abrams, Rees etc)
Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF
• Levels: Physics (quantum end) → Chemistry: Inorganic → Chemistry: Organic → Biology → Social sciences → Physics (gravity end)
• (Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις, "tail-devouring snake”)
20
• Each level of progress generates possibilities, which are tested
• Then, each level is a platform which, when established, is easily built upon by “cranes” (without having to worry about the details below)
• After the science levels...
• humans made tools, talked and co-operated
• printing gave us another level
• now, software is following exponential growth
• So, software testing should surf the wave of evolution (not flounder in the shallows behind it)
• Kurzweil epochs
5: Bio methods integrated into technology?
6: Intelligence into matter/energy patterns?
4: Technology
3: Brains
2: Biology
1: Chemistry & Physics
+0: Maths?!
“SINGULARITY”
The Singularity is Near, 2005
The Darwinian view of evolution – but does this explain all emergence?
Image from www.qwickstep.com

Biological evolution as sophistication rising with diversity
(Chart: Sophistication vs Diversity, rising over Time)

But evolution is not smooth?
“Punctuated equilibria” idea originated by Niles Eldredge & Stephen Jay Gould; images from www.wikipedia.org
• “Gradual” Darwinism: sophistication & diversity rise smoothly
• Punctuated equilibria: long equilibria in the number of species, broken by:
– “explosion” in species, eg Cambrian
– spread into a new niche, eg Mammals
– mass extinction, eg Dinosaurs
So... evolution of sciences overall?
• Arguably other sciences have not evolved smoothly either
• Sudden advances, akin to punctuated equilibria in biological evolution: Physics → Chemistry (Inorganic, Organic) → Biology → Social sciences
Per Bak, “How Nature Works”, 1996
(image: Tracey Saxby, Integration and Application Network, University of Maryland Center for Environmental Science, ian.umces.edu/imagelibrary/)
OK, what’s all this got to do with software testing?
• Social sciences evolution: Tools → Language → Books → Computers – Tipping Points (Malcolm Gladwell)
• We have an important and difficult job to do here!
Testing needs to evolve / emerge / innovate to keep up with complexity
• Computers: 1GL → 2GL → 3GL → 4GL → Object Orientation → Internet, Mobile devices → Artificial Intelligence?!
• For example, are we ready to test AI??
How has testing evolved so far?
Overall periods developed after Gelperin & Hetzel, “The Growth of Software Testing”, 1988 CACM 31 (6), as quoted on Wikipedia

PERIOD     | EXEMPLAR                | OBJECTIVES                                                          | SCOPE                       | “SCHOOL”?
Pre-1957   | Weinberg (1961 & 71)    | Test + Debug                                                        | Programs                    | DEBUGGING (Psychology)
1957-1976  | Hetzel (1972)           | Show meets requirements                                             | Programs                    | DEMONSTRATION (Method)
1976-1983  | Myers (1976 & 79)       | Find bugs                                                           | Programs, System, Accept’ce | DESTRUCTION (Art); Analytic
1984-2000  | Beizer (1984)           | Find bugs, show meets requirements, + prevent bugs; measure quality | + Integration               | EVALUATION (Engineering?), PREVENTION (Craft?); Standard (Control), Quality, Factory
2000-2011  | Kaner et al (1988 & 99) | Find bugs, in service of improving quality, for customer needs      | ?                           | HUMANISATION? (Social Science?); Context Driven, Agile (Test-Driven)
2011-      | ?                       | Experiment & Evolve?                                                | ?                           | Neo-Holistic? “no schools, but...”; AUTOMATION? (Technology?), UNIFICATION?? (Science?)
Another way of thinking about evolution: genes...
Images from www.qwickstep.com and schools.wikipedia.org
• Biological evolution: Mutation, then Replication & Selection, driving sophistication & diversity

...and for humans, “memes”, as an extension of the genes concept
Theme developed from Daniel Dennett, “Darwin’s Dangerous Idea”; image from www.salon.com; taxonomy from www.wikipedia.org
• Mental, social & cultural evolution: Mutation (Lamarckian??), then Replication & Selection
• Memes include: Ideas, Beliefs, Practices, Symbols, Gestures, Speech, Writing, Rituals, “other imitable phenomena”
• Platforms & Cranes again drive sophistication & diversity
Considering memes in testing: here is an example “memeplex”
Source: Neil Thompson, STAREast 2003 (not “best practices” but reference points for variation?)
• Always-consider: Effectiveness, Efficiency; Risk management, Quality management; Insurance, Assurance
• V-model: what testing against; W-model: quality management
• Risks: list & evaluate; define & detect errors (UT, IT, ST); give confidence (AT)
• Prioritise tests based on risks; tailor risks & priorities etc to factors
• Refine test specifications progressively: plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data
• Allow & assess for coverage changes; document execution & management procedures
• Distinguish problems from change requests; prioritise urgency & importance; distinguish retesting from regression testing
• Use handover & acceptance criteria
• Define & measure test coverage; measure progress & problem significance
• Be pragmatic over quality targets; quantify residual risks & confidence
• Decide process targets & improve over time; define & use metrics; assess where errors originally made
• Define & agree roles & responsibilities; use appropriate skills mix; use independent system & acceptance testers
• Use appropriate techniques & patterns; plan early, then rehearse-run, acceptance tests; use appropriate tools; optimise efficiency
Another example memeplex for testing
Source: Neil Thompson, BCS SIGiST 2002 review of Lessons Learned in Software Testing (Kaner, Bach & Pettichord)
• Chapters: Your career in software testing; The role of the tester; Thinking like a tester; Testing techniques; Bug advocacy; Automating testing; Documenting testing; Interacting with programmers; Managing the testing project; Managing the testing group; Planning the testing strategy
• (Grouped here by chapter for illustration, and coloured by theme, eg Management, Thinking)
• 293 individual “lessons” selectable by testers according to context
So, do we have punctuated equilibria in the evolution of testing?
Sources: Gelperin & Hetzel 1988 etc??
• DEBUGGING (Psychology)
• DEMONSTRATION (Method), eg V-model
• DESTRUCTION (Art), eg test techniques
• EVALUATION (Engineering?), eg metrics initiatives
• PREVENTION (Craft?), eg reviews, root cause analysis
• HUMANISATION? (Social Science?), eg Context-Driven school
• AUTOMATION? (Technology?), eg test-driven development
• UNIFICATION?? (Science?)
• Where were the Platforms? – mass-market software, open-source tools, belief in cost-of-failure curves, publication of ANSI/IEEE standards, establishment of textbooks, acknowledg’t of testing as distinct discipline, software analysis
• What were the CRANES? Tipping points?
• But... is there something wrong with this picture?...
One of the existing views of innovations in software testing
After: Lines of innovation in software testing, Stuart Reid 2010/2011, testing-solutions.com
• Testing & Quality; Testing (20th C); testing innovations in specific subjects
• Concepts: hierarchy; products / processes
• Factors: invention / application; individuals / organisations; bottom-up / top-down; synthesis of precursors; adjacent possibilities; role of testing!
• Aids: population size; diversity / interdiscipline; free time / free to fail; psychology & serendipity; recording media
Arguably, emergence is more than just Lamarckian / Darwinian
Extrapolation from various sources, esp. Stuart Kauffman, “The Origins of Order”, “Investigations”
• Emergences at coarser scales (Physics → Chemistry → Biology → Social sciences) not explained by “reductionism” to finer scales
• For best innovation & progress, need neither too much order nor too much chaos
• Examples: galaxy development, phase transitions, Gaia, autocatalysis, amino acids → proteins, political swings, AI & IA?
• Might also apply to testing??
History of testing is intertwined in “ecosystems” with technology, software lifecycles, etc
• Testing & Quality: Psychology → Method → Art → Engineering → Craft → Technology → Social science
• Development: structured methodologies, CASE tools, immature Agile, mature Agile?
And within testing, different contexts have so far evolved in separate streams?
• Testing & Quality: TRADITIONAL “SCHOOLS” (Psychology, Method, Art, Engineering, Craft, Technology) vs CONTEXT-DRIVEN (Social science)
• Limited dialogue, mutual mistrust, “language” differences
• Recent changes regarding “school” & “approach”
An “emergent” view of innovation
Johnson’s ideas overlaid here on Neil Thompson’s graphic
• Eight related ideas from the history of human innovation:
“0” Reef, City, Web; 1. Adjacent possible; 2. Liquid networks; 3. Slow hunch; 4. Serendipity; 5. Error; 6. Exaptation; 7. Platforms
Emergent view: (a) innovation framework
• “0” Reef, City, Web:
– coral reefs are a surprisingly diverse habitat, because a crowded, wave-washed boundary zone
– cities concentrate minority interests where they can communicate
– tech innovations used to take 10 years; on the www, 1 is enough
– “Patterns of innovation are fractal”
• 1. Adjacent possible: things happen wherever they can happen
• 2. Liquid networks: ideas flowing without friction
• 7. Platforms: once a new level is established, can build on it, almost without thinking
Emergent view: (b) innovation “techniques”
• 3. Slow hunch: many innovations are not eureka moments; they take time to evolve & establish
• 4. Serendipity: you may find something different, but it’s important to be seeking something
• 5. Error: noise can make us focus more; OK to fail, but try to fail fast
• 6. Exaptation: modifications can be hi-jacked for unexpected things (and beneficially)
A brief history of human innovation
Source: Steven Johnson, “Where good ideas come from: the natural history of innovation”
(Axes: Individual(s) → Communities; Amateur → Market)
• 1400-1600: most discoveries by “amateur individuals”, eg supernovae (Brahe)
• 1600-1800: rise of amateur communities, eg the Milky Way (Al-Biruni, Galileo, Herschel & his sister)
• 1800-current: rise of market communities, eg radio (Marconi, Tesla, Braun, Hertz etc)
So, what could software testing learn from the history of innovation?
HUMAN HISTORY → SOFTWARE TESTING:
• Slow hunch, Exaptation: keep a notebook – you never know what may come in handy eventually (see also Jerry Weinberg’s Fieldstone method)
• Communities (market & amateur): even competitors in this market seem to collaborate and mutually respect each other – keep it up! Attend conferences etc
• Adjacent possible: try modifying / combining / hybridising techniques – they’re not set in stone (eg 2-D classification trees)
• Reef, City, Web: even if introvert, use LinkedIn, Twitter etc
• Serendipity: if a trail goes cold, turn your nostrils in some other direction
• Platforms: seek new uses of previous achievements, eg test automation in new ways (high-volume random)
An additional thought
Renovated risk, & Science, as UNIFICATION?
• Testing contexts will of course continue to differ, but...
• More mutual dialogue may increase innovation, on both sides:
– traditional, risk-averse sectors
– market-chasing, product-oriented, risk-tolerant / risk-embracing sectors
• ...if we can all share understanding across varied contexts
A brief history of testing innovation?
(Axes as before: Individual(s) → Communities; Amateur → Market; schools plotted: Analytic, Quality, Factory, Context Driven, Agile (Test-Driven))
• 1950s-1999?: guru individuals?
• 2000-2012?: communities in relative isolation?
• 2012 onwards?: communities interacting more?
Key references & acknowledgements (NB this is not a full bibliography)
• Use of Risk in testing (yes, other sources are available!):
– Kaner, Bach & Pettichord: Lessons Learned in Software Testing
– Craig & Jaskiel: Systematic Software Testing
– Gerrard & Thompson: Risk-Based E-Business Testing
• Principles contributing to Value Flow ScoreCard:
– Kaplan & Norton: The Balanced Scorecard – Translating Strategy into Action
– Isabel Evans, Mike Smith, Software Testing Retreat
• History & innovations in testing:
– Gelperin & Hetzel: The Growth of Software Testing
– Meerts: testingreferences.com incl. timeline
– Stuart Reid: Lines of Innovation in Software Testing
• Emergence:
– Dennett: Darwin’s Dangerous Idea
– Eldredge & Gould: Punctuated Equilibria... (in Models in Palaeobiology)
– Kauffman: The Origins of Order; Investigations; etc
– Johnson: Where Good Ideas Come From
– (+ Kurzweil: The Singularity is Near?!)
Takeaway ideas
• All testing is risk-based / value-inspired, whether or not you recognise it yet (so, make a virtue of it)
• Embrace diversity; discuss! Don’t dismiss, disrespect or just “agree to differ”
• Mix with lots of non-testers
• Seek out analogies & metaphors
• Depending on your personality:
– read lots of books (eg “things to read together” = adjacent possible)
– do lots of thinking – deliberate & unintended
– participate in blogs, discussion groups
• Remember: change is accelerating, and innovation is fractal!