Otto Vinter - Analysing Your Defect Data for Improvement Potential


Transcript of Otto Vinter - Analysing Your Defect Data for Improvement Potential

Otto Vinter, Software Engineering Mentor

Tel/Fax: +45 4399 2662, Mobile: +45 4045 0771, [email protected], www.vinter.suite.dk

Why Defect Analysis?

Objective
- identify why defects occur
- help prevent defect insertion

Assumption
- defect detection costs both time and money
- defects cost more when detected later in the life cycle
- fewer defects can be inserted through training/education

Strategy
- move defect detection to earlier parts of the life cycle
- avoid defect insertion by improving the process

Defect Analysis Models

Mathematical models
- quantitative analysis, statistics
- growth curve modelling
- comparison to historical data
- not easily translatable to corrective action

Causal analysis
- qualitative analysis
- investigates details of a sample of defects
- fewer restrictions on what can be analysed
- time consuming, harder to aggregate
- easy to translate to individual improvement actions

Defect Analysis in Practice

Why mathematical models often fail to predict
- low-level maturity processes are the norm
- software development is not a production process
- personal and team issues dominate

Why causal analysis often works
- focuses on prevention
- addresses individual and team issues: feed-back, education, motivation
- bottom-up process improvement

Tools for Performing Defect Analysis

For classifying defects
- Boris Beizer's bug taxonomy

For capturing quantitative information
- When found: distribution over time (before/after release)
- Where found: distribution across system parts (subsystems/modules)
- Who found: distribution across stakeholder groups (project/specialists/customers)

For capturing qualitative information
- interviews with the relevant developers
- What happened: listen and learn (problem/explanation/insight/cause)
- How to prevent: ad-hoc check lists (literature/history/experience)
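To make the capture concrete, here is a minimal sketch (my illustration, not from the talk) of a bug-report record and the three quantitative distributions named above; the record fields are assumptions chosen for illustration:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class BugReport:           # hypothetical record; field names are assumptions
    beizer_code: str       # classification, e.g. "3222.1" from Beizer's taxonomy
    found_day: int         # days relative to release (negative = before release)
    subsystem: str         # where the defect was found
    finder: str            # who found it: "project", "specialists", "customers"

def distributions(bugs):
    """Tally the three quantitative views: when, where, and who."""
    when = Counter("before release" if b.found_day < 0 else "after release"
                   for b in bugs)
    where = Counter(b.subsystem for b in bugs)
    who = Counter(b.finder for b in bugs)
    return when, where, who

bugs = [BugReport("3222.1", -12, "User Interface", "project"),
        BugReport("131x", 45, "Communication", "customers")]
when, where, who = distributions(bugs)
print(when, where, who, sep="\n")
```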

Retrospective Defect Analysis

Retrospective analysis
- combined bug classification and qualitative analysis
- performed on a sample of at least 100 bug reports
- investigates potential prevention techniques
- proposes focused improvement actions

Interview sessions
- 1-2 developers and 1-2 process consultants
- approx. 4 min/bug for classification and qualitative information
- approx. 30 minutes/bug for detailed analysis of selected bugs, to determine potentials for improvement (prevention)
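As a rough budget for such a session (a back-of-the-envelope illustration from the figures above; the 20 bugs selected for detailed analysis is an assumed count, not a number from the talk):

```python
sample = 100        # bug reports in the sample (the minimum above)
classify_min = 4    # approx. minutes per bug for classification
selected = 20       # assumed number of bugs chosen for detailed analysis
detail_min = 30     # approx. minutes per selected bug

hours = (sample * classify_min + selected * detail_min) / 60
print(f"total interview effort: {hours:.1f} hours")  # -> 16.7 hours
```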

When Are Defects Found (example)

[Chart: number of defects found per 30-day interval, from roughly 330 days before release to 630 days after; series: All defects and Customer-found defects; y-axis: 0-50 defects]

Defect Detection Percentage Over Time (example)

DDP(t) = defects found before release / all defects found by time t, in %

[Chart: DDP(t) curve from release (day 0) to day 630; y-axis: 0-100%]
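A minimal sketch (not from the talk) of how DDP(t) could be computed from recorded detection dates, assuming each defect is logged as days relative to release (negative = before release):

```python
from bisect import bisect_right

def ddp(report_days, t):
    """DDP(t): percentage of all defects known by day t (release = day 0)
    that were found before release."""
    days = sorted(report_days)
    known = bisect_right(days, t)    # defects found up to and including day t
    before = bisect_right(days, -1)  # defects found before release (day < 0)
    return 100.0 * before / known if known else 0.0

# toy data: negative = days before release, positive = days after
reports = [-120, -90, -60, -30, -5, 10, 45, 80, 200, 400]
for t in (0, 90, 400):
    print(f"DDP({t}) = {ddp(reports, t):.1f}%")  # 100.0, 62.5, 50.0
```

As field defects trickle in after release, the denominator grows, so the curve falls over time, which is exactly the shape the example chart shows.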

Where / Who Found the Defects (example)

[Chart: percentage of defects (0-30%) by system part (Communication; Other Parts, Unspecified; System Administration; Data Processing; User Interface), broken down by finder (Customers, Sales/support people, Project Group)]

Boris Beizer's Bug Taxonomy

Bug classification
- nine main categories
- detailed in up to four levels
- open and extendable

Statistics
- Boris Beizer: 16,209 bugs in Assembler, FORTRAN, COBOL, Ada, C; primarily US defence, aerospace, and telecommunication; collected before 1989
- Otto Vinter: 982 bugs in Pascal, C, C++; embedded and PC applications at Brüel & Kjær, DK; collected from 1992-1998

More information: www.vinter.suite.dk/engdefana.htm

The Beizer Bug Taxonomy

1. Requirements and Features
2. Functionality as Implemented
3. Structural Bugs
4. Data
5. Implementation
6. Integration
7. System and Software Architecture
8. Test Definition or Execution Bugs
9. Other Bugs, Unspecified

Bug Taxonomy Notation

Each category is detailed to a depth of 4+ levels:

- 3xxx -- structural bugs in the implemented software
- 32xx -- processing bugs
- 322x -- expression evaluation bugs
- 3222 -- arithmetic expression bugs
- 3222.1 -- wrong operator bugs

The "x" is a place holder for possible future filling in of numbers as the taxonomy is (inevitably) expanded.

Categories ending on “9” are used when a finer breakdown is not available, e.g. for bugs that do not fit in the other (sub)categories:

- 9xxx -- other or unspecified bugs- 39xx -- other structural bugs- 329x -- other processing bugs- 3229 -- other expression evaluation bugs- 3222.9 -- other arithmetic bugs
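A small sketch (my illustration, not part of the taxonomy itself) of how such codes can be manipulated for aggregation, using the "x" placeholder and the "ending in 9" conventions above:

```python
def rollup(code, level):
    """Collapse a Beizer code to a coarser level, e.g. "3222.1" -> "32xx".
    Digits beyond the requested level become "x" placeholders."""
    digits = code.split(".")[0]            # drop the optional ".n" suffix
    return digits[:level] + "x" * (4 - level)

def is_unspecified(code):
    """Codes ending in 9 mean "other/unspecified" at that level."""
    return code.rstrip("x").endswith("9")

assert rollup("3222.1", 2) == "32xx"   # processing bugs
assert rollup("3222.1", 1) == "3xxx"   # structural bugs
assert is_unspecified("329x")          # other processing bugs
assert not is_unspecified("3222")      # a fully specified category
```

Rolling detailed codes up to coarser levels like this is what makes it possible to compute category distributions such as the Top 12 list on the next slide.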

B&K Top 12 Beizer Categories (52% of all bugs)B&K Top 12 Beizer Categories (52% of all bugs)

7

6

5

4

3

3

3

3

3

3

3

8

0 1 2 3 4 5 6 7 8 9 10

211x Feature misunderstood, Wrong

231x Missing cases

9xxx Other bugs, unspecified

218x Feature interactions

324x Cleanup

323x Initialization

131x Incomplete requirements

132x Missing, unspecified requirement

1621 Feature Added

413x Data definition initial, default values

241x Domain misunderstood, Wrong

8111 Requirements misunderstood in test design

%

B&K

Top Prevention Techniques (expensive example)

[Bar chart: hit rate and savings (%, 0-12) per prevention technique:
- External Software Stress Test (301)
- Functional Prototype, Daily Tasks (230)
- Product Expert Screen Review (280)
- Functional Prototype, Stress Cases (232)
- Paper Mockup, Daily Tasks (210)
- Orthogonality Check (721)
- Performance Specification (820)]

Top Prevention Categories (simpler/cheaper example)

[Bar chart: percentage of bugs (0-25%) per prevention category:
- Better analysis of consequences
- Better analysis of requirements
- Unknown
- Improved code review
- Enforce coding standards]

1st Improvement Action: The Prevention of Errors through Experience-driven Test Efforts (PET)

Results of the analysis of bug reports
- no special bug class dominates embedded software development
- requirements problems, and requirements-related problems, are the prime bug cause (>36%)
- problems due to lack of systematic unit testing are the second largest bug cause (22%)

Action: Improve Testing Processes
- static analysis: source code complexity, data flow anomalies, coding standards
- dynamic analysis: code coverage by test cases, cross-references of code to test cases (see the sketch below)
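As a toy illustration of the branch-coverage idea (a sketch using Python's coverage.py, not the Pascal/C/C++ tooling actually used in the PET project):

```python
import coverage

def sign(n):
    # toy unit under test with two branches
    return -1 if n < 0 else 1

cov = coverage.Coverage(branch=True)   # enable branch coverage measurement
cov.start()
sign(-5)
sign(5)                                # exercise both branches
cov.stop()

total = cov.report()                   # prints a report, returns total %
print(f"branch coverage: {total:.1f}% (PET target for units: > 85%)")
```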

Funded by CEC. More information: www.vinter.suite.dk/engswtest.htm

Results of the Improved Testing Process

- 46% improvement in testing efficiency: removing static bugs, increasing unit branch coverage to > 85%
- 75% reduction in production-release bugs compared to the trial release
- 70% of the production-release bugs were requirements bugs

Next action: Improve Requirements Engineering

Results of the analysis of requirements bugs
- requirements-related bugs: 51%
- usability issues dominate: 64%
- external software issues: 28%
- functionality issues: 22%

Action: Improve Requirements Elicitation and Validation
- Scenarios: relate demands to use situations; describe tasks for each scenario.
- Usability test: check that the users are able to use the system for daily tasks, based on a navigational prototype of the user interface.

Funded by CEC. More information: www.vinter.suite.dk/engreqeng.htm

2nd Improvement Action: A Methodology for Preventing Requirements Issues from Becoming Defects (PRIDE)

[Pie chart of requirements bugs by cause: Misunderstood 38%, Missing 29%, Changed 24%, Other 9%]

Results of the 2nd Improvement Action

- The product sold more than twice as many copies in the first year, compared to a similar product developed for a similar domain by the same team
- Usability was exceptional in the market: users' interaction with the product was totally changed as a result of the early usability tests
- Almost a threefold increase in productivity for the development of the user interface
- 27% reduction in problem reports

When the benefits of performing a defect analysis are so evident, why don't you start analysing your defect data?

Thank you for listening

Otto Vinter, Software Engineering Mentor
Tel/Fax: +45 4399 2662, Mobile: +45 4045 0771
[email protected] www.vinter.suite.dk