DP and causal analysis guideline


Description

Defect prevention and causal analysis tools and usage with examples

Transcript of Dp and causal analysis guideline

Page 1: Dp and causal analysis guideline

CS577b 3/20/00 1

Cyrus Fakharzadeh

USC Computer Science

CAUSAL ANALYSIS AND RESOLUTION

Page 2

Outline

• Definitions
• Defect analysis review
• Sample causal analysis exercises
• Defect prevention KPA

Page 3

Definitions

• Causal analysis: the analysis of defects to determine their underlying root cause

• Causal analysis meeting: a meeting, conducted after completing a specific task, to analyze defects uncovered during the performance of that task

Page 4

Defect Analysis

• Defect: any flaw in the specification, design, or implementation of a product.

• Facilitate process improvement through defect analysis
  • defect categorization to identify where work must be done and to predict future defects
  • causal analysis to prevent problems from recurring

Page 5

Fault Distributions

                 Requirements  Design  Coding  Func. Test  Sys. Test  Field Use
Fault Origin     10%           40%     50%     --          --         --
Fault Detection  3%            5%      7%      25%         50%        10%
Cost per Fault   ~1 KDM        ~1 KDM  ~1 KDM  ~6 KDM      ~12 KDM    ~20 KDM

KDM = kilo Deutsche Marks (thousands of German marks)
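The detection distribution and cost-per-fault row together imply an expected cost per fault. A minimal Python sketch, using figures transcribed from the table (the calculation itself is not part of the original slides):

```python
# Expected cost per fault = sum over phases of
# (fraction of faults detected in that phase) x (cost to fix in that phase).
# Figures transcribed from the slide; costs in KDM.

detection = {  # fraction of faults detected in each phase
    "Requirements": 0.03, "Design": 0.05, "Coding": 0.07,
    "Functional Test": 0.25, "System Test": 0.50, "Field Use": 0.10,
}
cost_kdm = {  # approximate cost to fix one fault found in each phase
    "Requirements": 1, "Design": 1, "Coding": 1,
    "Functional Test": 6, "System Test": 12, "Field Use": 20,
}

expected_cost = sum(detection[p] * cost_kdm[p] for p in detection)
print(f"Expected cost per fault: ~{expected_cost:.2f} KDM")
```

Since half of all faults are not found until system test, the average fault costs almost as much as a system-test fix, even though most faults are cheap to fix where they originate.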

Page 6

Fault Distributions (cont.)

Fault detection distribution by process maturity level and phase (rows mapped to levels on the assumption that higher-maturity organizations detect faults earlier, which matches the chart layout):

Maturity  Requirements  Design  Coding  Func. Test  Sys. Test  Field Use
Level 5   5%            20%     40%     20%         10%        <5%
Level 4   3%            12%     30%     30%         20%        5%
Level 3   0%            2%      20%     38%         32%        8%
Level 2   0%            0%      3%      30%         50%        17%
Level 1   0%            0%      2%      15%         50%        33%

Fault introduction distribution (all levels): Requirements 10%, Design 40%, Coding 50%

Relative fault cost by phase: 1, 1, 1, 6, 12, 20
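Combining each level's detection distribution with the relative fault costs gives a rough expected fix cost per fault at each maturity level. A sketch in Python; the row-to-level mapping is the inferred one above, and the "<5%" entry is taken as 5%:

```python
# Relative expected fix cost per fault at each maturity level:
# sum over phases of (detection fraction) x (relative cost).
# Distributions transcribed from the slide; "<5%" approximated as 5%.

relative_cost = [1, 1, 1, 6, 12, 20]  # Req, Design, Coding, FT, ST, Field
detect = {
    1: [0.00, 0.00, 0.02, 0.15, 0.50, 0.33],
    2: [0.00, 0.00, 0.03, 0.30, 0.50, 0.17],
    3: [0.00, 0.02, 0.20, 0.38, 0.32, 0.08],
    4: [0.03, 0.12, 0.30, 0.30, 0.20, 0.05],
    5: [0.05, 0.20, 0.40, 0.20, 0.10, 0.05],
}
cost_by_level = {lvl: sum(d * c for d, c in zip(dist, relative_cost))
                 for lvl, dist in detect.items()}
for lvl in sorted(detect, reverse=True):
    print(f"Level {lvl}: relative cost per fault {cost_by_level[lvl]:.2f}")
```

The expected cost falls monotonically as maturity rises (roughly 13.5 at Level 1 down to about 4.1 at Level 5), because faults shift toward the cheap early-detection phases.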

Page 7

Sample Defect Data

• Defect data should be collected by:
  • detection activity
  • when detected
  • introduction phase
  • type
  • mode

• A defect introduction and removal matrix can be generated and used for defect prevention to help answer "what are high-leverage opportunities for defect prevention / cost containment?".

Page 8

Defect Flow Tracking

• A defect introduction and removal matrix can be generated and used as a basis for defect analysis and prevention.

Percentage of defects by phase injected (columns) and phase detected (rows):

Phase detected       Requirements  Prelim. design  Detailed design  Code/unit test  Total
Requirements         37%           --              --               --              8%
Preliminary design   22%           38%             --               --              16%
Detailed design      15%           18%             34%              --              17%
Code/unit test       7%            24%             28%              43%             25%
Integration testing  7%            9%              14%              29%             14%
System testing       11%           12%             24%              29%             19%
Total                100%          100%            100%             100%            100%
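A matrix like this can be derived mechanically from defect records that carry an injection phase and a detection phase. A hypothetical sketch (the records below are illustrative, not the slide's data):

```python
# Sketch: building a defect introduction/removal matrix from raw defect
# records. Sample records are hypothetical.
from collections import Counter

records = [  # (phase_injected, phase_detected)
    ("Requirements", "Requirements"),
    ("Requirements", "Code/unit test"),
    ("Preliminary design", "Detailed design"),
    ("Detailed design", "System testing"),
    ("Code/unit test", "Code/unit test"),
    ("Code/unit test", "Integration testing"),
]

counts = Counter(records)
injected_totals = Counter(inj for inj, _ in records)

# Percentage of each injection phase's defects caught in each detection phase,
# so each injection-phase column sums to 100%.
matrix = {(inj, det): 100 * n / injected_totals[inj]
          for (inj, det), n in counts.items()}
print(matrix[("Code/unit test", "Code/unit test")])
```

High off-diagonal percentages (defects escaping far past their injection phase) point at the high-leverage places to strengthen early reviews.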

Page 9

Causal Analysis

• Data on defects is collected and categorized

• Trace each defect to its underlying cause

• Isolate the vital few causes
  • Pareto principle: 80% of defects are traceable to 20% of all possible causes

• Move to correct the problems that caused the defects
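The "vital few" causes can be isolated programmatically: sort causes by frequency and accumulate until 80% of the defects are covered. A sketch with hypothetical cause names and counts:

```python
# Pareto "vital few": smallest set of causes covering 80% of defects.
# Cause names and counts are hypothetical, for illustration only.

cause_counts = {
    "incomplete requirements": 40,
    "interface misunderstanding": 25,
    "coding slip": 15,
    "missing test case": 10,
    "tool misuse": 6,
    "other": 4,
}

total = sum(cause_counts.values())
vital_few, covered = [], 0
for cause, n in sorted(cause_counts.items(), key=lambda kv: -kv[1]):
    if covered >= 0.8 * total:   # stop once 80% of defects are explained
        break
    vital_few.append(cause)
    covered += n
print(vital_few)
```

Here three of the six causes account for 80% of the defects, which is where corrective action should concentrate.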

Page 10

Causal Analysis Form Fields

• Post-inspection example:

  Moderator, Date
  Subject, Subject type
  Item number
  Assigned to
  Defect category (interface, requirements, design, code, other)
  Item description
  Probable cause
  Suggestions for eliminating probable cause
  Action taken
  Number of hours to take corrective action

Page 11

Causal Analysis Example

[Figure: Pareto column chart of defect frequency (0-1000) by defect category: Correctness, Clarity, Completeness, Consistency, Compliance, Maintainability, Functionality, Interface, Performance, Testability, Reusability, Traceability. X-axis: Defect Category; Y-axis: Frequency.]

Page 12

Typical Analysis Steps

1. Sort data by defect origin. Count the number in each group. Arrange the totals in descending order of total hours.

2. Calculate the average fix times for each of the totals in the first step.

3. For the top two or three totals in step 1, count the defects sorted by defect type and multiply by the appropriate average fix times. Limit the number of types to the largest totals plus a single total for all others.

4. Add up the defects in each module. Get totals for the five most frequently changed modules plus a single total for all others.

5. Review the defect reports for the defects included in the largest totals from steps 3 and 4. Summarize the defect-report suggestions for how the defects might have been prevented or found earlier.
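Steps 1 and 2 amount to a group-by over defect records. A minimal Python sketch with hypothetical records:

```python
# Steps 1-2 of the typical analysis: group defects by origin, total the
# fix hours, sort descending, then compute average fix time per origin.
# Defect records below are hypothetical.
from collections import defaultdict

defects = [  # (origin, fix_hours)
    ("Code", 2.0), ("Code", 3.0), ("Design", 8.0),
    ("Specification", 14.0), ("Design", 4.0), ("Code", 1.0),
]

hours = defaultdict(float)
counts = defaultdict(int)
for origin, h in defects:
    hours[origin] += h
    counts[origin] += 1

by_total = sorted(hours.items(), key=lambda kv: -kv[1])   # step 1
averages = {o: hours[o] / counts[o] for o in hours}       # step 2
print(by_total[0])  # origin with the most rework hours
```

Steps 3-5 then drill into the top origins by defect type and module in the same group-by style.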

Page 13

Causal Analysis Exercise #1

The following defect data is from a completed project; another project with the same generic component types is being planned, with no reuse. Use causal analysis to identify the highest risks and make suggestions for the new project.

Component  Type                Rework hours
C          hardware interface  25
B          communication       3
B          communication       6
B          hardware interface  15
B          hardware interface  18
A          communication       4
A          logic               12
B          logic               5
A          logic               12
A          logic               14
B          user interface      19
C          logic               20
A          user interface      23
C          user interface      42

Page 14

Exercise #1 Answer

• Determine the defect types and components that contribute the most rework:

• TYPE
  • user interface: 84 hours
  • logic: 63 hours
  • hardware interface: 58 hours
  • communication: 13 hours
  → concentrate on the user interface (resolve risk early, allocate resources, user prototyping, inspections, etc.)

• COMPONENT
  • C: 87 hours
  • B: 66 hours
  • A: 65 hours
  → concentrate on component C
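The totals can be cross-checked by summing the exercise table directly (the hardware interface rows sum to 25 + 15 + 18 = 58 hours):

```python
# Recomputing the Exercise #1 rework totals from the defect table.
from collections import defaultdict

data = [  # (component, type, rework_hours) as listed on the slide
    ("C", "hardware interface", 25), ("B", "communication", 3),
    ("B", "communication", 6), ("B", "hardware interface", 15),
    ("B", "hardware interface", 18), ("A", "communication", 4),
    ("A", "logic", 12), ("B", "logic", 5), ("A", "logic", 12),
    ("A", "logic", 14), ("B", "user interface", 19), ("C", "logic", 20),
    ("A", "user interface", 23), ("C", "user interface", 42),
]

by_type, by_component = defaultdict(int), defaultdict(int)
for comp, dtype, hrs in data:
    by_type[dtype] += hrs
    by_component[comp] += hrs

print(sorted(by_type.items(), key=lambda kv: -kv[1]))
print(sorted(by_component.items(), key=lambda kv: -kv[1]))
```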

Page 15

Causal Analysis Exercise #2

Analyze the following defect data. Produce three Pareto column charts (or tables in descending order) showing 1) the distribution of defect origins, 2) an effort-weighted distribution of defect origins showing the normalized hours to fix defects, 3) an effort-weighted distribution of defect types for the top two defect origins from chart #1. Make summary suggestions for the development process.

Defect #  Origin               Type
1         Documentation        Standards
2         Code                 Logic
3         Documentation        Process Comm.
4         Design               S/W Interface
5         Code                 Computation
6         Code                 Logic
7         Specification        User Interface
8         Design               Process Comm.
9         Specification        Functionality
10        Code                 Logic
11        Design               User Interface
12        Code                 Logic
13        Design               H/W Interface
14        Other                Process Comm.
15        Code                 Computation
16        Environment Support  Standards
17        Other                Process Comm.
18        Specification        Functionality
19        Code                 Computation
20        Code                 Logic

Weighting factors - normalized cost to fix defect types if not found until testing:

Specification  14   (e.g., it takes 14 times as much effort to fix a specification defect in the test phase as in the specification phase)
Design         6.2
Code           2.5
Documentation  1
Other          1
Operator       1

Page 16

Exercise #2 Answers

Chart 1 - defect origins:

Defect Origin        # of defects
Code                 8
Design               4
Specification        3
Documentation        2
Other                2
Environment Support  1

Chart 2 - effort-weighted defect origins:

Defect Origin        # of defects  Weight  Total weight
Specification        3             14      42
Design               4             6.2     24.8
Code                 8             2.5     20
Documentation        2             1       2
Other                2             1       2
Environment Support  1             1       1

Chart 3 - defect types for the top two origins from chart 1 (Code, Design):

Code Defect Type  # of defects  Weight  Total wt.
Logic             5             2.5     12.5
Computation       3             2.5     7.5

Design Defect Type  # of defects  Weight  Total wt.
S/W Interface       1             6.2     6.2
Process Comm.       1             6.2     6.2
User Interface      1             6.2     6.2
H/W Interface       1             6.2     6.2
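Chart 2 can be reproduced by multiplying each origin's defect count by its weighting factor, which is why Specification ranks first even though Code has the most raw defects:

```python
# Effort-weighted distribution of defect origins from Exercise #2.
# Counts and weights are taken from the exercise data; Environment
# Support is given weight 1, like the other unlisted categories.

origin_counts = {"Code": 8, "Design": 4, "Specification": 3,
                 "Documentation": 2, "Other": 2, "Environment Support": 1}
weights = {"Specification": 14, "Design": 6.2, "Code": 2.5,
           "Documentation": 1, "Other": 1, "Environment Support": 1}

weighted = {o: n * weights[o] for o, n in origin_counts.items()}
ranked = sorted(weighted.items(), key=lambda kv: -kv[1])
print(ranked)  # Specification first despite having fewer raw defects
```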

Page 17

Level 4 Relationship to Level 5 KPAs

• Data analysis from Level 4 activities enables focusing the performance of Defect Prevention (DP), Technology Change Management (TCM), and Process Change Management (PCM)

Page 18

Defect Prevention

The purpose of Defect Prevention is to identify the cause of defects and prevent them from recurring.

Defect Prevention involves analyzing defects that were encountered in the past and taking specific actions to prevent the occurrence of those types of defects in the future. The defects may have been identified on other projects as well as in earlier stages or tasks of the current project. Defect prevention activities are also one mechanism for spreading lessons learned between projects.

Trends are analyzed to track the types of defects that have been encountered and to identify defects that are likely to recur. Based on an understanding of the project's defined software process and how it is implemented (as described in the Integrated Software Management and Software Product Engineering key process areas), the root causes of the defects and the implications of the defects for future activities are determined.

Both the project and the organization take specific actions to prevent recurrence of the defects.

Page 19

Defect Prevention (DP) ETVX Diagram

ENTRY

1. Policy for organization to perform DP activities (C1)

2. Policy for projects to perform DP activities (C2)

3. Organization-level team exists to coordinate DP activities (Ab1)

4. Project-level team exists to coordinate DP activities (Ab2)

5. Adequate resources/funding (Ab3)

6. Training for members of the S/W engineering group and related groups (Ab4)

7. Procedures for Ac1, Ac3, Ac6, & Ac7

TASK

1. Develop the project's DP plan (Ac1)

2. Team holds kick-off meeting to prepare for DP activities (Ac2)

3. Conduct causal analysis meetings (Ac3)

4. Conduct coordination meetings to review the implementation of action proposals from the causal analysis meetings (Ac4)

5. Document and track DP data (Ac5)

6. Revise the organization's standard process as a result of DP actions (Ac6)

7. Revise the project's defined process as a result of DP actions (Ac7)

8. Provide feedback to developers on the status and results of DP actions (Ac8)

VERIFICATION

1. Reviews with senior management (V1)

2. Reviews with project manager (V2)

3. Reviews/audits by SQA (V3)

4. Measurement of status of DP activities (M1)

EXIT

1. DP activities are planned (G1)

2. Common causes of defects are sought out and identified (G2)

3. Common causes of defects are prioritized and systematically eliminated (G3)

Page 20

Defect Prevention Policies

• Organization defect prevention policy should state:
  • Long-term plans and commitments are established for funding, staffing, and other resources for defect prevention.
  • The resources needed are allocated for the defect prevention activities.
  • Defect prevention activities are implemented across the organization to improve the software processes and products.
  • The results of the defect prevention activities are reviewed to ensure the effectiveness of those activities.
  • Management and technical actions identified as a result of the defect prevention activities are addressed.

• Project defect prevention policy should state:
  • Defect prevention activities are included in each project's software development plan.
  • The resources needed are allocated for the defect prevention activities.
  • Project management and technical actions identified as a result of the defect prevention activities are addressed.

Page 21

DP Tools and Training

• Tools:
  • statistical analysis tools
  • database systems
  • other

• Examples of DP training:
  • defect prevention methods
  • conduct of task kick-off meetings
  • conduct of causal analysis meetings, and
  • statistical methods (e.g., cause/effect diagrams and Pareto analysis).

Page 22

DP Project Activities

• Project plan for defect prevention:
  1. Identifies the defect prevention activities (e.g., task kick-off and causal analysis meetings) that will be held.
  2. Specifies the schedule of defect prevention activities.
  3. Covers the assigned responsibilities and resources required, including staff and tools.
  4. Undergoes peer review.

• Kick-off meetings are held to familiarize the members of the team with the details of the implementation of the process, as well as any recent changes to the process.

• Causal analysis meetings are held.

Page 23

Causal Analysis Procedures

• Causal analysis meeting procedure typically specifies:
  1. Each team that performs a software task conducts causal analysis meetings. A causal analysis meeting is conducted shortly after the task is completed. Meetings are conducted during the software task if and when the number of defects uncovered warrants the additional meetings. Periodic causal analysis meetings are conducted after software products are released to the customer, as appropriate. For software tasks of long duration, periodic in-process defect prevention meetings are conducted, as appropriate. An example of a long-duration task is a level-of-effort customer support task.
  2. The meetings are led by a person trained in conducting causal analysis meetings.
  3. Defects are identified and analyzed to determine their root causes. An example of a method to determine root causes is cause/effect diagrams.

Page 24

Causal Analysis Procedures (cont.)

  4. The defects are assigned to categories of root causes. Examples of defect root cause categories include: inadequate training, breakdown of communications, not accounting for all details of the problem, and making mistakes in manual procedures (e.g., typing).
  5. Proposed actions to prevent the future occurrence of identified defects and similar defects are developed and documented. Examples of proposed actions include modifications to: the process, training, tools, methods, communications, and software work products.
  6. Common causes of defects are identified and documented. Examples of common causes include: frequent errors made in invoking a certain system function, and frequent errors made in a related group of software units.
  7. The results of the meeting are recorded for use by the organization and other projects.

Page 25

DP Team Activities

• Each of the teams assigned to coordinate defect prevention activities meets on a periodic basis to review and coordinate implementation of action proposals from the causal analysis meetings. The teams involved may be at the organization or project level.

• Team activities include:
  1. Review the output from the causal analysis meetings and select action proposals that will be addressed.
  2. Review action proposals that have been assigned to them by other teams coordinating defect prevention activities in the organization and select action proposals that will be addressed.
  3. Review actions taken by the other teams in the organization to assess whether these actions can be applied to their activities and processes.
  4. Perform a preliminary analysis of the action proposals and set their priorities. Priority is usually nonrigorous and is based on an understanding of: the causes of defects, the implications of not addressing the defects, the cost to implement process improvements to prevent the defects, and the expected impact on software quality. An example of a technique used to set priorities for the action proposals is Pareto analysis.

Page 26

DP Team Activities (cont.)

  5. Reassign action proposals to teams at another level in the organization, as appropriate.
  6. Document their rationale for decisions and provide the decision and the rationale to the submitters of the action proposals.
  7. Assign responsibility for implementing the action items resulting from the action proposals. Implementation of the action items includes making immediate changes to the activities that are within the purview of the team and arranging for other changes. Members of the team usually implement the action items, but, in some cases, the team can arrange for someone else to implement an action item.
  8. Review results of defect prevention experiments and take actions to incorporate the results of successful experiments into the rest of the project or organization, as appropriate. Examples of defect prevention experiments include: using a temporarily modified process, and using a new tool.
  9. Track the status of the action proposals and action items.

Page 27

DP Team Activities (cont.)

  10. Document software process improvement proposals for the organization's standard software process and the projects' defined software processes, as appropriate. The submitters of the action proposal are designated as the submitters of the software process improvement proposals.
  11. Review and verify completed action items before they are closed.
  12. Ensure that significant efforts and successes in preventing defects are recognized.

Page 28

DP Documentation and Tracking Activities

Activity 5: Defect prevention data are documented and tracked across the teams coordinating defect prevention activities.

1. Action proposals identified in causal analysis meetings are documented. Examples of data that are in the description of an action proposal include: originator of the action proposal, description of the defect, description of the defect cause, defect cause category, stage when the defect was injected, stage when the defect was identified, description of the action proposal, and action proposal category.

2. Action items resulting from action proposals are documented. Examples of data that are in the description of an action item include: the person responsible for implementing it, a description of the areas affected by it, the individuals who are to be kept informed of its status, the next date its status will be reviewed, the rationale for key decisions, a description of implementation actions, the time and cost for identifying the defect and correcting it, and the estimated cost of not fixing the defect.

Page 29

DP Feedback

• Feedback is needed on the status and results of the organization's and project's defect prevention activities on a periodic basis.

The feedback provides:

1. A summary of the major defect categories.
2. The frequency distribution of defects in the major defect categories.
3. Significant innovations and actions taken to address the major defect categories.
4. A summary status of the action proposals and action items.

Examples of means to provide this feedback include: electronic bulletin boards, newsletters, and information flow meetings.

Page 30

DP Measurements

• Examples:
  • the cumulative costs of defect prevention activities (e.g., holding causal analysis meetings and implementing action items)
  • the time and cost for identifying the defects and correcting them, compared to the estimated cost of not correcting the defects
  • profiles measuring the number of action items proposed, open, and completed
  • the number of defects injected in each stage, cumulatively, and over releases of similar products, and the total number of defects.
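The second measurement (fix cost versus the estimated cost of not correcting) reduces to a simple net-benefit tally. A sketch; all figures below are hypothetical:

```python
# Net benefit of defect prevention: avoided cost minus the cost of
# finding/fixing the defects and running the DP activities themselves.
# All figures are hypothetical, in effort-hours.

dp_activity_cost = 120.0      # causal analysis meetings + action items
found_and_fixed = [           # (hours_to_find_and_fix, est_cost_if_not_fixed)
    (4.0, 40.0), (6.0, 90.0), (2.0, 10.0),
]

fix_cost = sum(f for f, _ in found_and_fixed)
avoided = sum(a for _, a in found_and_fixed)
net_benefit = avoided - fix_cost - dp_activity_cost
print(f"fix cost={fix_cost}, avoided={avoided}, net benefit={net_benefit}")
```

A positive net benefit is one way to demonstrate, in a management review, that the DP activities pay for themselves.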

Page 31

DP Management Reviews

• DP reviews cover:
  1. A summary of the major defect categories and the frequency distribution of defects in these categories.
  2. A summary of the major action categories and the frequency distribution of actions in these categories.
  3. Significant actions taken to address the major defect categories.
  4. A summary status of the proposed, open, and completed action items.
  5. A summary of the effectiveness of and savings attributable to the defect prevention activities.
  6. The actual cost of completed defect prevention activities and the projected cost of planned defect prevention activities.

Page 32

References

• Defect Prevention (DP)

Inderpal Bhandari, Michael Halliday, et al., "A Case Study of Software Process Improvement During Development," IEEE Transactions on Software Engineering, Vol. 19, No. 12, December 1993, pp. 1157-1170.

R. Chillarege and I. Bhandari, "Orthogonal Defect Classification - A Concept for In-Process Measurements," IEEE Transactions on Software Engineering, Vol. 18, No. 11, November 1992, pp. 943-956.

Julia L. Gale, Jesus R. Tirso, and C. Art Burchfield, "Implementing the Defect Prevention Process in the MVS Interactive Programming Organization," IBM Systems Journal, Vol. 29, No. 1, 1990, pp. 33-43.

C.L. Jones, "A Process-Integrated Approach to Defect Prevention," IBM Systems Journal, Vol. 24, No. 2, 1985, pp. 150-167.

Juichirou Kajihara, Goro Amamiya, and Tetsuo Saya, "Learning from Bugs," IEEE Software, Vol. 10, No. 5, September 1993, pp. 46-54.

R.G. Mays, C.L. Jones, G.J. Holloway, and D.P. Studinski, "Experiences with Defect Prevention," IBM Systems Journal, Vol. 29, No. 1, 1990, pp. 4-32.

Norman Bridge and Corinne Miller, "Orthogonal Defect Classification Using Defect Data to Improve Software Development," Proceedings of the 7th International Conference on Software Quality, Montgomery, Alabama, 6-8 October 1997, pp. 197-213.