Pitt spin-sept-10-ev-in-sw-projects-psp


Description: Earned Value Management for Agile

Transcript of Pitt spin-sept-10-ev-in-sw-projects-psp

Page 1: Pitt spin-sept-10-ev-in-sw-projects-psp

© 2010 Carnegie Mellon University

Earned Value for Software Development Is NOT a Myth!

Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213

William Nichols and James McHale, September 2010

Page 2: Pitt spin-sept-10-ev-in-sw-projects-psp

Earned Value for Software Development, SPIN, 8 September 2010

© 2008-09 Carnegie Mellon University

Agenda

Why EV Won't Work (for Software Development Projects)

How to Make EV Work

Pulling It All Together (TSP)

Page 3: Pitt spin-sept-10-ev-in-sw-projects-psp


Why won't EV work?

Page 4: Pitt spin-sept-10-ev-in-sw-projects-psp


Why EV won’t work for software

Software development work is hard to estimate with sufficient accuracy.

Page 5: Pitt spin-sept-10-ev-in-sw-projects-psp


Why EV won’t work for software

Software project work is hard to estimate with sufficient accuracy.

You can’t get an accurate progress report from a software developer.

Page 6: Pitt spin-sept-10-ev-in-sw-projects-psp


Why EV won’t work for software

Software development work is hard to estimate with sufficient accuracy.

You can’t get an accurate progress report from a software developer.

Until the software tests successfully, we don't know that we are done.

Page 7: Pitt spin-sept-10-ev-in-sw-projects-psp


Why EV won’t work for software

Software development work is hard to estimate with sufficient accuracy.

You can’t get an accurate progress report from a software developer.

Until the software tests successfully, we don't know that we are done.

Software is iterative; work is revised a number of times before it is complete.

Page 8: Pitt spin-sept-10-ev-in-sw-projects-psp


Why EV won’t work for software

Software development work is hard to estimate with sufficient accuracy.

You can’t get an accurate progress report from a software developer.

Until the software tests successfully, we don't know that we are done.

Software is iterative; work is revised a number of times before it is complete.

Software projects have imprecise requirements; scope isn't fixed.

Page 9: Pitt spin-sept-10-ev-in-sw-projects-psp


So what do you do?

The 95% rule for ETC?

Developers are always 95% done; just ask them. So how much time remains?

Page 10: Pitt spin-sept-10-ev-in-sw-projects-psp


So what do you do?

The 95% rule for ETC?

It takes “95% of the estimated schedule/cost to complete 95% of the work, and ANOTHER 95% to finish it!”

Page 11: Pitt spin-sept-10-ev-in-sw-projects-psp


So what do you do?

The 90% rule for ETC?

It takes “90% of the estimated schedule/cost to complete 90% of the work, and ANOTHER 90% to finish it!”

DON’T do this!

Page 12: Pitt spin-sept-10-ev-in-sw-projects-psp


How to make EV work?

Page 13: Pitt spin-sept-10-ev-in-sw-projects-psp


The Four Core Requirements For Earned Value†

A credible schedule of the planned work

A time phased budget for the planned work

A means of collecting progress against the plan for the work performed

A means of collecting cost information for the work performed

† The Earned Value Management Maturity Model®, Ray W. Stratton, Management Concepts, 2006.
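These four requirements map directly onto the standard EV quantities. As a minimal sketch (the numbers below are invented, not from the slides), the usual variances and indices are:

```python
# Standard earned-value indices from the three core quantities:
# PV (planned value), EV (earned value), AC (actual cost).
# Input numbers are illustrative only.

def ev_metrics(pv, ev, ac):
    """Schedule/cost variances and performance indices."""
    return {
        "SV": ev - pv,    # schedule variance
        "CV": ev - ac,    # cost variance
        "SPI": ev / pv,   # schedule performance index (<1 means behind)
        "CPI": ev / ac,   # cost performance index (<1 means over cost)
    }

m = ev_metrics(pv=40.0, ev=30.0, ac=35.0)
print(m["SPI"], round(m["CPI"], 3))  # 0.75 0.857
```

SPI below 1.0 signals schedule slip; CPI below 1.0 signals cost overrun.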

Page 14: Pitt spin-sept-10-ev-in-sw-projects-psp


Software development work is hard to estimate with sufficient accuracy.

Page 15: Pitt spin-sept-10-ev-in-sw-projects-psp


Estimate Accurately

Software development work is hard to estimate accurately?

Start by defining “what done looks like”

Decompose the work into work packages.

Learn how to estimate a work package.

Use history as a guide

Page 16: Pitt spin-sept-10-ev-in-sw-projects-psp


Ex: Text Pages versus Writing Time

[Scatter plot: writing time (hours) versus text pages, 0-50 pages on the x-axis and 0-120 hours on the y-axis. Fitted line: y = 2.4366x + 4.1297, R² = 0.6094.]
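The chart above is exactly the kind of history a developer can fit and reuse. A minimal least-squares sketch (the history pairs below are invented; the slide's own fit for text pages was y = 2.4366x + 4.1297):

```python
# "Use history as a guide": fit time = b0 + b1 * size by least squares to
# past (size, time) pairs and project a new job. History values invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b1 = sxy / sxx                 # slope
    return my - b1 * mx, b1        # intercept, slope

pages = [5, 10, 20, 30, 40]        # historical sizes (text pages)
hours = [16, 28, 53, 77, 101]      # historical writing times (hours)
b0, b1 = fit_line(pages, hours)
print(round(b0 + b1 * 25, 1))      # estimated hours for a 25-page job
```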

Page 17: Pitt spin-sept-10-ev-in-sw-projects-psp


Ex: LOC versus Development Time

[Scatter plot: development time (min.) versus C++ LOC, 0-800 LOC on the x-axis. Fitted line: y = 6.4537x - 252.94, R² = 0.9582.]

Page 18: Pitt spin-sept-10-ev-in-sw-projects-psp


PSP Estimating Accuracy

[Histograms: effort estimation accuracy at three PSP levels, error range -200% to +100%.
PSP0: majority are under-estimating.
PSP1: balance of over-estimates and under-estimates.
PSP2: much tighter balance around zero.]

Page 19: Pitt spin-sept-10-ev-in-sw-projects-psp


Improving Estimating Accuracy

[Chart: Effort Estimation Accuracy Trend - mean time misestimation, (estimated minutes - actual minutes) / estimated minutes, by program number (0-11), with the average per PSP level (PSP0, PSP1, PSP2); y-axis roughly 0.2 to 0.7; 298 developers.]
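The chart's y-axis can be sketched as code. Whether the slide averages signed errors or magnitudes is not stated, so this sketch assumes magnitudes; the (estimated, actual) pairs are invented:

```python
# The trend chart's measure: (estimated - actual) / estimated, averaged
# over a group of programs. This sketch averages magnitudes; data invented.

def misestimation(estimated, actual):
    return (estimated - actual) / estimated

def mean_abs_misestimation(pairs):
    return sum(abs(misestimation(e, a)) for e, a in pairs) / len(pairs)

history = [(120, 200), (90, 85), (60, 75)]   # (estimated, actual) minutes
print(round(mean_abs_misestimation(history), 3))
```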

Page 20: Pitt spin-sept-10-ev-in-sw-projects-psp


You can’t get an accurate progress report from a software developer?

Page 21: Pitt spin-sept-10-ev-in-sw-projects-psp


You can’t get an accurate progress report from a software developer?

Make a credible plan, and track to the plan.

Break the work package into smaller steps.

For each step, know what DONE looks like. Have a standard.

Use history to estimate the effort in each step.
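The last two steps can be sketched together: spread a work-package estimate across its steps using historical time-in-phase fractions. The step names echo the PSP phases shown later in the deck; the fractions are assumed, not measured:

```python
# Spread a work-package estimate across process steps using historical
# time-in-phase fractions. Fractions are invented for illustration.

PHASE_FRACTION = {
    "design": 0.25,
    "design review": 0.10,
    "code": 0.30,
    "code review": 0.10,
    "test": 0.25,
}

def step_estimates(total_hours):
    # fractions must account for the whole package
    assert abs(sum(PHASE_FRACTION.values()) - 1.0) < 1e-9
    return {phase: total_hours * f for phase, f in PHASE_FRACTION.items()}

plan = step_estimates(40.0)      # a 40-hour work package
print(plan["code"])              # 12.0 hours planned for coding
```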

Page 22: Pitt spin-sept-10-ev-in-sw-projects-psp


You can’t get an accurate progress report from a software developer?

Make a credible plan, and track to the plan.

Break the work package into smaller steps.

For each step, know what DONE looks like. Have a standard.

Use history to estimate the effort in each step.

Page 23: Pitt spin-sept-10-ev-in-sw-projects-psp


How do you break a work package into steps?

Use a process!

[Diagram: process steps - Plan, Build, Personal review, Team inspection, Test, Field use.]

Page 24: Pitt spin-sept-10-ev-in-sw-projects-psp


What is a Process?

A process is a defined and measured set of steps for doing a job.

A process guides your work.

A process is usually defined for a job that is done multiple times.

A process provides a foundation for planning.

• A process is a template, a generic set of steps.

• A plan is a set of steps for a specific job, plus other information such as effort, costs, and dates.
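The template-versus-plan distinction can be sketched as code (names and hours are illustrative):

```python
# A process is a reusable template of steps; a plan binds those steps to
# a specific job with estimates. Names and numbers are illustrative.

from dataclasses import dataclass

PROCESS = ["plan", "design", "build", "review", "test", "postmortem"]

@dataclass
class PlannedStep:
    name: str
    est_hours: float = 0.0
    done: bool = False

def make_plan(process, est_hours_by_step):
    """Instantiate the generic process for one specific job."""
    return [PlannedStep(s, est_hours_by_step.get(s, 0.0)) for s in process]

job_plan = make_plan(PROCESS, {"design": 4.0, "build": 8.0, "test": 6.0})
print(len(job_plan), job_plan[2].est_hours)   # 6 8.0
```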

Page 25: Pitt spin-sept-10-ev-in-sw-projects-psp


TSP Process Elements

Scripts: Document the process entry criteria, phases/steps, and exit criteria. The purpose is to provide expert-level guidance as you use the process.

Measures: Measure the process and the product. They provide insight into how the process is working and the status of the work.

Forms: Provide a convenient and consistent framework for gathering and retaining data.

Standards: Provide consistent definitions that guide the work and the gathering of data.

[Example script: a module-level PSP script with required inputs (problem description, project plan summary form, time and defect recording logs, defect type standard, optional stop watch) and three phases - 1 Planning (obtain a requirements statement, estimate development time, enter plan data, complete the time log), 2 Development (design, implement, compile and test the program, fixing and logging all defects found), 3 Postmortem (complete the plan summary with actual time, defect, and size data) - exiting with a thoroughly tested program, a completed plan summary, and completed defect and time logs.]

[Example form: a PSP Project Plan Summary with Plan / Actual / To Date columns for summary rates (LOC/hour, CPI, % reuse, test and total defects/KLOC, yield, appraisal and failure cost of quality), program size (base, deleted, modified, added, reused, total new & changed, total LOC, with 70% prediction intervals), and time in phase (planning, design, design review, code, code review, compile, test, postmortem, total, with 70% prediction intervals).]

Page 26: Pitt spin-sept-10-ev-in-sw-projects-psp


Example Process Script - Requirements

Page 27: Pitt spin-sept-10-ev-in-sw-projects-psp


Until the software tests successfully, we don't know that we are done?

Then you are done when the tests complete successfully!

But test is highly variable.

Then make a quality plan. Plan to remove the defects you’ve put in.

Defects can result in

• incorrect functionality

• poor operation

• improper installation

• confusing or incorrect documentation

• error-prone modification and enhancement
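A quality plan in this sense can be sketched as a simple defect balance: estimate what you inject, then plan removal steps whose yields take most of it out before test. The injection rate and yields below are assumptions, not published PSP numbers:

```python
# Estimate defects injected per KLOC, then apply the planned removal
# steps' yields to see how many defects reach test. Values assumed.

INJECTED_PER_KLOC = 100.0
REMOVAL_YIELD = {"personal review": 0.5, "team inspection": 0.6}

def defects_entering_test(kloc):
    remaining = INJECTED_PER_KLOC * kloc
    for step, y in REMOVAL_YIELD.items():
        remaining *= (1.0 - y)       # each step removes fraction y
    return remaining

print(defects_entering_test(2.0))    # 200 injected, 40 reach test
```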

Page 28: Pitt spin-sept-10-ev-in-sw-projects-psp


Why Focus on Quality?

The fastest way to finish is to do it right the first time! Do it right, or do it over.

Completed tasks should be verified with performance measures.

This links to TPM, Technical Performance Measures

Page 29: Pitt spin-sept-10-ev-in-sw-projects-psp


Why Focus on Defects?

In most engineering organizations, a significant share of resources is dedicated to fixing defects, often more than 40% of cost and schedule.

Defects are very costly. It is beneficial to find and remove defects early in the process.

The reasons for managing defects are to

• produce better products

• improve your ability to develop products on time and within budget

Page 30: Pitt spin-sept-10-ev-in-sw-projects-psp


Quality Measures

The TSP uses three quality measures for planning and tracking.

1. Defect injection and removal rates

2. Defect density

3. Review rates

Page 31: Pitt spin-sept-10-ev-in-sw-projects-psp


Defect Injection and Removal Rates

Defect Injection Rate is defined as the number of defects injected per hour while performing activities in a process phase.

Defect Removal Rate is defined as the number of defects removed per hour while performing activities in a process phase.

Typical defect injection phases include requirements and design.

Typical defect removal phases include reviews, inspections, and testing.
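Both rates are straightforward ratios (the defect counts and hours below are invented):

```python
# Injection and removal rates as defined above: defects per hour spent
# in a process phase. Counts and hours are illustrative.

def rate_per_hour(defects, hours):
    return defects / hours

injection_rate = rate_per_hour(defects=8, hours=4.0)   # e.g. in coding
removal_rate = rate_per_hour(defects=6, hours=2.0)     # e.g. in review
print(injection_rate, removal_rate)   # 2.0 3.0
```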

Page 32: Pitt spin-sept-10-ev-in-sw-projects-psp


Defect Density

Defect Density is the ratio of the number of defects removed to the product size.

A common defect density measure in software projects is number of defects found per thousand lines of code (defects/KLOC).

An example of another defect density measure, used by the SEI when developing training slides, is number of defects found per slide.

Defect density is also a good product planning, tracking, and predictive measure.
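The defects/KLOC measure, as a one-liner (counts illustrative):

```python
# Defects per thousand lines of code, as defined above.

def defects_per_kloc(defects_removed, loc):
    return defects_removed / (loc / 1000.0)

print(defects_per_kloc(45, 9000))   # 5.0 defects/KLOC
```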

Page 33: Pitt spin-sept-10-ev-in-sw-projects-psp


Review Rates

Review rate is the ratio of the size of a product to the time spent reviewing or inspecting the product.

A common example in software projects is lines of code reviewed per hour (LOC/hr).

Another example is number of requirements pages reviewed per hour (Req Pages/hr).

Review rate is the control variable for inspections and reviews.

It is used to

• allocate appropriate time during planning

• predict the effectiveness of the review or inspection
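Both uses of the review rate can be sketched briefly. The 200 LOC/hr target is a commonly quoted personal-review guideline, treated here as an assumed parameter:

```python
# Review rate (LOC/hr) and its planning use: given a target rate,
# allocate review time for a product of known size.

def review_rate(loc, hours):
    return loc / hours

def planned_review_hours(loc, target_rate=200.0):
    # target_rate is an assumed guideline, not a value from the slides
    return loc / target_rate

print(review_rate(400, 2.0))        # 200.0 LOC/hr
print(planned_review_hours(1000))   # 5.0 hours for 1000 LOC
```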

Page 34: Pitt spin-sept-10-ev-in-sw-projects-psp


Defect Injection and Removal

In your process, you will have activities that

• inject defects

• remove defects

Defect injection typically occurs when you

• determine the job requirements

• specify the product

• build the product

Defect removal typically occurs when you

• review the products

• test the products

• use the products

[Diagram: process steps - Plan, Build, Personal review, Team inspection, Test, Field use.]

Page 35: Pitt spin-sept-10-ev-in-sw-projects-psp


Defect Removal Techniques

Reviews (inspections, walkthroughs, personal reviews)

• examine the product or interim development artifacts of the product

• find and eliminate defects

Testing

• exercises the product or parts of the product

• proves that the product works correctly

• identifies potential defects or symptoms

In many cases, projects rely on testing to find and fix defects.

When this happens, many defects are found in the field by the customer.

Page 36: Pitt spin-sept-10-ev-in-sw-projects-psp


PSP Quality Results

[Chart: Defects per KLOC Removed in Compile and Test - mean number of defects per KLOC by program number (0-11), with the mean per PSP level (PSP0, PSP1, PSP2); y-axis 0 to 120; 298 developers.]

Page 37: Pitt spin-sept-10-ev-in-sw-projects-psp


Compile and Test Defects - from PSP Training

[Chart: Defects/KLOC by PSP assignment number (Prog1-Prog10), plotted by quartile (1st-4th); 810 developers. Defect reduction from first to last program: 1Q 80.4%, 2Q 79.0%, 3Q 78.5%, 4Q 77.6%.]

The ‘Worst’ as Good as the ‘Best’?

Page 38: Pitt spin-sept-10-ev-in-sw-projects-psp


Software is iterative; work is revised a number of times before it is complete?

Page 39: Pitt spin-sept-10-ev-in-sw-projects-psp


Software is iterative; work is revised a number of times before it is complete?

Incremental and iterative development isn’t a bug, it’s a feature!

In the DoD 5000.02 procurement cycle, incremental and iterative approaches are used.

This is a fact on almost any project. (Ever change a leaky faucet?)

Apply the appropriate method to deal with this.

Page 40: Pitt spin-sept-10-ev-in-sw-projects-psp


TSP Cyclic Development Strategy

TSP favors an iterative or cyclic development strategy.

• develop in increments

• use multiple cycles

• work-ahead

Projects can start on any phase or cycle.

Each phase or cycle starts with a launch or re-launch.

TSP permits whatever process structure makes the most business and technical sense.

[Diagram: a launch leads into a development phase or cycle, followed by a phase or cycle postmortem and a re-launch of the next cycle, ending in a project postmortem.]

Page 41: Pitt spin-sept-10-ev-in-sw-projects-psp


Core Success Criteria for Earned Value and Software Development

Define the outcomes of the work effort in some tangible way.

Define the way progress is going to be measured. 0/100% for each work task.

Either it's done or it's not done; there is no “we’re trying real hard”.

Define the tangible evidence, the date the tangible evidence is expected, and the associated costs.
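The 0/100 rule can be sketched directly (task names and values are invented):

```python
# 0/100 earned-value crediting: a task earns its planned value only when
# done; "95% done" earns nothing. Task data are illustrative.

tasks = [
    {"name": "parser",  "pv": 10.0, "done": True},
    {"name": "reports", "pv": 15.0, "done": False},
    {"name": "install", "pv": 5.0,  "done": True},
]

def earned_value(task_list):
    return sum(t["pv"] for t in task_list if t["done"])

print(earned_value(tasks))   # 15.0 earned of 30.0 planned
```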


Page 42: Pitt spin-sept-10-ev-in-sw-projects-psp


Software projects have imprecise requirements; scope isn't fixed.

Page 43: Pitt spin-sept-10-ev-in-sw-projects-psp


Use a Planning Process

Apply the appropriate planning method to deal with imprecise requirements and changing scope.

Usually most portions of the project can be planned.

A common SW Dev project mistake is staffing up too soon!

Page 44: Pitt spin-sept-10-ev-in-sw-projects-psp


TSP: Pulling it all together

Page 45: Pitt spin-sept-10-ev-in-sw-projects-psp


Plan Execution -1

Tasks in personal task plans are ordered according to team priorities.

Developers adjust the order and work ahead as needed.

Developers select a task planned for current week and begin tracking time.

Page 46: Pitt spin-sept-10-ev-in-sw-projects-psp


Plan Execution -2

While working, they also record

• any defects they find

• any risks they identify

• any process improvement ideas

When they are done, they

• stop time tracking

• mark task completed if done

• record size if the process step calls for it
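The tracking loop described on these two slides can be sketched as a small log object. The class and method names are invented; they are not the TSP tool's API:

```python
# Per-task tracking: start a timer, log defects found while working,
# then close the task out. Names are illustrative, not a real API.

import time

class TaskLog:
    def __init__(self, name):
        self.name = name
        self.defects = []        # defects found while working
        self.minutes = 0.0       # tracked task time
        self.done = False
        self._start = None

    def start(self):
        self._start = time.time()

    def log_defect(self, description):
        self.defects.append(description)

    def stop(self, done=False):
        self.minutes += (time.time() - self._start) / 60.0
        self._start = None
        self.done = done

log = TaskLog("code review of parser module")
log.start()
log.log_defect("off-by-one in loop bound")
log.stop(done=True)
```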

Page 47: Pitt spin-sept-10-ev-in-sw-projects-psp


Monitor and Control the Plan: Script WEEK

Team members meet each week to assess progress.

• Role managers present their evaluation of the team’s plan and data.

• Goal owners present status on product and business objectives.

• Risk owners present status on risk mitigation plans and new risks.

• Team members present status on their plans.

Plan deviations are addressed each week.

Significant deviations like new requirements trigger a replan.

Performance Data Reviewed
• Baseline Plan Value
• Plan Value
• Earned Value
• Predicted Earned Value
• Earned Value Trend
• Plan Task Hours
• Actual Task Hours
• Tasks/Milestones completed
• Tasks/Milestones past due
• Tasks/Milestones next 2 weeks
• Effort against incomplete tasks
• Estimation Accuracy
• Review and Inspection Rates
• Injection Rates
• Removal Rates
• Time in Phase Ratios
• Phase and Process Yield
• Defect Density
• Quality Profile (QP)
• QP Index
• Percent Defect Free
• Defect Removal Profile
• Plan to Actual Defects Injected/Removed

Page 48: Pitt spin-sept-10-ev-in-sw-projects-psp


Form Week -1

Teams use Form Week to review status at their weekly meetings.

TSP Week Summary - Form WEEK (Team: Voyager, consolidation)
Date: 3/1/2007   Status for Week: 11   Week Date: 1/22/2007   Selected Assembly: SYSTEM

Task Hours                Weekly Data                          Plan    Actual  Plan/Actual  Plan-Actual
Baseline   660.8          Schedule hours for this week          51.0    48.1     1.06          2.9
Current    745.5          Schedule hours this cycle to date    344.0   395.0     0.87        -51.0
%Change    12.8%          Earned value for this week             5.6     0.7     8.10          4.9
                          Earned value this cycle to date       43.8    22.0     1.99         21.8
                          To-date hours for tasks completed    163.9   314.5     0.52
                          To-date average hours per week        31.3    35.9     0.87
                          EV per completed task hour to date     0.134   0.070

Project End Dates: Baseline 3/19/2007, Plan 3/26/2007, Predicted 7/30/2007

[Detail section: OPEN MILESTONES and TASKS COMPLETED IN WEEK 11 - one row per assembly/phase/task with resource, plan hours, actual hours, EV or PV, CPI, and the baseline/committed, plan, slip, and predicted dates and week numbers.]

Page 49: Pitt spin-sept-10-ev-in-sw-projects-psp


Form Week -2

The top of form Week displays a summary of current status for any week up to the current week.

[Excerpt: the top of form WEEK as shown on the previous slide - task hours (baseline 660.8, current 745.5, 12.8% change), the weekly plan vs. actual rows, and the project end dates (baseline 3/19/2007, plan 3/26/2007, predicted 7/30/2007).]

Page 50: Pitt spin-sept-10-ev-in-sw-projects-psp


Form Week -3

The bottom half of Form Week displays the status of open milestones, tasks completed in the selected week, and tasks due in the next two weeks.

[Excerpt: the bottom of form WEEK - OPEN MILESTONES and TASKS COMPLETED IN WEEK 11, one row per assembly/phase/task with resource, plan hours, actual hours, EV or PV, CPI, and the baseline/committed, plan, slip, and predicted dates and week numbers.]

Page 51: Pitt spin-sept-10-ev-in-sw-projects-psp


Schedule Management -1

TSP teams routinely meet their schedule commitments.

They use earned value management, task hour management, and quality management at the team and personal level to help manage schedule.

[Excerpt from form WEEK: earned value for this week (plan 5.6, actual 0.7), earned value this cycle to date (plan 43.8, actual 22.0), to-date hours for tasks completed (plan 163.9, actual 314.5), and project end dates - baseline 3/19/2007, plan 3/26/2007, predicted 7/30/2007.]

Teams monitor earned value per week and cumulative earned value for the cycle.

Page 52: Pitt spin-sept-10-ev-in-sw-projects-psp


Schedule Management -2

Intuit’s 2007 release of QuickBooks met every major milestone and delivered 33% more functionality than planned.

First-time TSP projects at Microsoft had one-tenth the mean schedule error of non-TSP projects at Microsoft, as reflected in the following table.

Microsoft Schedule Results    Non-TSP Projects    TSP Projects
Released on Time              42%                 66%
Average Days Late             25                  6
Mean Schedule Error           10%                 1%
Sample Size                   80                  15

Source: Microsoft

Page 53: Pitt spin-sept-10-ev-in-sw-projects-psp


Reporting to Higher Level Management: Script STATUS

The team presents status to management and the customer at specified intervals.

• weekly, bi-weekly, monthly

Project status

• earned value and projection

• task hours

• milestones planned/completed

• product quality indicators

Project risks

• status of existing risks

• newly-identified risks

Page 54: Pitt spin-sept-10-ev-in-sw-projects-psp


Questions?

Page 55: Pitt spin-sept-10-ev-in-sw-projects-psp


Contact Information

Jim McHale – [email protected]

Bill Nichols – [email protected]
