Follow-up Bootstrap Case Study


Page 1: Follow-up Bootstrap Case Study

Hubbard Decision Research

The Applied Information Economics Company

Page 2: The Measurement Choice

The follow-up decision was defined as whether to proceed with the project as planned or to make a significant reduction in scope by removing functions

The value of information analysis (VIA) for this decision indicated that the risk of cancellation was a key variable

Further calibrated estimates and decomposition were uninformative, and insufficient historical data existed for creating an "actuarial" model

Bootstrapping the chance of cancellation was judged to be the most feasible measurement method

Additional investments may use this bootstrap model

Page 3: Bootstrapping Overview

• Historical analysis of IT investments
• First Workshop:
  - Review history
  - Identify success factors
  - Confirm possible ranges
• Design test assessments
• Second Workshop:
  - Calibrate for binary questions
  - Conduct collaborative assessment
• Independent assessments
• Compute regression model
• Confirm model

Page 4: Questions for Initial Planning

• Is bootstrapping necessary? (explain alternatives and when bootstrapping is a good fit)
• Have a kickoff: explain objectives and approach with specific examples and success stories; studies show that bootstrapped models improve on the experts they are built from
• What is the scope of the portfolio?
• What outcome is to be bootstrapped?
• What historical information is obtainable, and where?
• Who are the decision makers?
• Who will be attending the workshops?
• Schedule the workshops, interviews, and the presentation to validate the model

Page 5: Project Planning Estimates

• Historical data gathering: 1-2 people, 1-3 days
• Preparation for 2 workshops: 1-2 people, 2-4 hours each
• Conduct 2 workshops: 1-2 facilitators + participants, 1/2 day (3-4 hours) each
• Construct initial bootstrap list: 1 person, 1-3 hours
• Construct final bootstrap list: 1 person, 1-3 hours
• Build regression model: 1-2 people, 4-8 hours
• Prepare for presentation to confirm model: 1-2 people, 6-8 hours
• Conduct presentation to confirm model: 1-2 presenters + participants, 1 hour

Page 6: Historical Analysis

Determine scope of historical data needed:
• How far back do we need data? Up to 30 examples
• Do we need investment size, duration, status, objective, etc.? (have a standard list)

Identify historical data available on IT investments:
• Budgeting process/accounting data
• IT staff memory
• Any metrics efforts
• Past strategic IT plans

Collect investment data

Consolidate data into a single table for a handout

Page 7: First Workshop Objectives

The first bootstrap workshop is meant to be a free-form brainstorming forum to address the following:
• Introduce concepts/objectives to new participants
• Review the historical data and attempt to spot trends and success factors
• Identify which investments were extreme examples for the variable being bootstrapped
• List potential predictive variables
• Determine realistic values of predictive variables, including combinations of values
• Define criteria for bootstrap output
• Agree on input consolidation rules – shall we just average the group, throw out the highest/lowest, etc.?

Page 8: Results of First Workshop

We identified the scope of the portfolio as any randomly chosen investment from this organization's IT investments

There were 4 participants.

We identified the following variables as pertinent to a follow-up measurement on chance of cancellation:
• Is the investment a documented strategic initiative?
• 90% confidence interval for time remaining (months)
• Is some part of the investment a compliance requirement?
• The number of business units involved
• Is the sponsor business, IT, or corporate?
• % over-budget and % over-schedule
• Test score of staff regarding project plan knowledge
• Project manager and sponsor evaluation of the project
• % deliverables complete

Page 9: Design Test Assessments

Using the identified predictive variables, generate a list of hypothetical investments

The range of individual values should reflect the actual portfolio – i.e., you should not have mostly investments over $50 million if that size is rare for this client

The combination of values in each hypothetical investment should be realistic – i.e., the size and duration should fit each other

Make sure the list represents investments across a range of possible bootstrapped output values

Produce a short table that lists each investment with hypothetical values and blanks for their input (perhaps 10 investments)
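
As a minimal sketch of this generation step (the variable names, ranges, and correlations below are illustrative assumptions, not the client's actual portfolio), a short script could produce the handout rows:

```python
import random

random.seed(42)  # fix the seed so the workshop handout is reproducible

def hypothetical_investment():
    """One hypothetical investment with mutually consistent values."""
    size_musd = round(random.lognormvariate(0, 1) * 2, 1)  # mostly small, rarely very large
    # Duration should fit the size: bigger projects run longer
    months_remaining = max(1, int(size_musd * random.uniform(1.5, 4)))
    return {
        "strategic_initiative": random.random() < 0.3,
        "compliance_requirement": random.random() < 0.2,
        "size_musd": size_musd,
        "months_remaining": months_remaining,
        "business_units": random.randint(1, 5),
        "pct_over_budget": round(random.uniform(-10, 60)),
        "chance_of_cancellation": None,  # left blank for the evaluator's input
    }

handout = [hypothetical_investment() for _ in range(10)]
```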

Page 10: Second Workshop

• Calibrate for binary questions
• Present trial investment list (just 5 investments); explain values shown and inputs needed
• Discuss each investment as a group
• Identify changes to the list
• Obtain calibrated estimates for each
• Explain next steps

Page 11: Prepare Final Bootstrap

Modify constraints based on findings from the second workshop:
• Clarify definitions/units of measure
• Add/drop variables
• Confirm input ranges

Generate a new list of hypothetical investments:
• The list should be large enough to produce at least 100 responses in total, and no fewer than 30 + (number of variables) responses per evaluator (see the arithmetic sketch after this list)
• Randomize list order

Options:
• Make some investments duplicates (for measuring consistency)
• Include a few best/worst case investments
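
A quick arithmetic check of the two sizing rules above, as a sketch assuming the 12 variables and 4 evaluators reported elsewhere in this deck:

```python
n_variables = 12   # predictive variables per investment (page 15's count)
n_evaluators = 4   # participants from the first workshop

per_evaluator_min = 30 + n_variables   # second rule: 42 investments per evaluator
total_floor = -(-100 // n_evaluators)  # first rule: ceil(100/4) = 25 per evaluator
investments_needed = max(per_evaluator_min, total_floor)

print(investments_needed)                  # 42 -- the per-evaluator rule dominates here
print(investments_needed * n_evaluators)   # 168 responses in total
```

The 48 investments each evaluator actually assessed (page 12) is consistent with this minimum once duplicates and best/worst cases are added.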

Page 12: Calibrated Estimation Results

• Each evaluator assessed chance of cancellation for 48 investments
• Variance between evaluators was often very large, but might have been smaller if we had done the trial evaluation or calibration
• Olympic scoring throws out the highest and lowest estimates (see the sketch below)
• Disagreement among evaluators averaged 16% but was as much as 60%
• Difference between Olympic scores of duplicates was 6%
• Nobody stood out as particularly inconsistent or consistent, but Ando and Vinay were clearly more optimistic than Jean-Rene and Cecile
• Clearly, these chances of cancellation are high for any RAVI project

[Chart: chance-of-cancellation estimates by evaluator, with the Olympic score, on a 0%-100% scale]
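
A minimal sketch of the Olympic consolidation rule described above; the sample estimates are made up:

```python
def olympic_score(estimates):
    """Average the estimates after dropping the single highest and single
    lowest value (requires at least 3 evaluators)."""
    trimmed = sorted(estimates)[1:-1]
    return sum(trimmed) / len(trimmed)

# Four evaluators' chance-of-cancellation estimates for one investment
print(olympic_score([0.10, 0.25, 0.30, 0.70]))  # 0.275
```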

Page 13: Compute Regression

• Aggregate inputs of the various estimators
• Convert inputs into quantities
• Pivot tables on un-ordered and discrete but non-binary variables
• Graph continuous variables against the output to look for obvious non-linear relationships
• For each output variable (confidence of success, chance of cancellation, etc.), compute a regression model
• Try combinations of higher-order terms where you think there is a compounding effect; size is always a good candidate for higher-order terms
• Compare model error to evaluator inconsistency (model error should be less)
• Test changes in "controllable" success factors – this may identify sub-zones
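
A minimal sketch of the regression step, using synthetic stand-ins for the workshop responses (every variable name, coefficient, and count below is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 192                                  # e.g., 4 evaluators x 48 investments
size = rng.lognormal(0.0, 1.0, n)        # continuous: investment size
strategic = rng.integers(0, 2, n)        # binary: documented strategic initiative?
over_budget = rng.uniform(0.0, 0.6, n)   # fraction over budget

# Design matrix with an intercept and a higher-order term for size
X = np.column_stack([np.ones(n), size, size**2, strategic, over_budget])
# Synthetic calibrated estimates of chance of cancellation
y = (0.1 + 0.02 * size + 0.005 * size**2
     - 0.15 * strategic + 0.4 * over_budget
     + rng.normal(0.0, 0.05, n))

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
pred = X @ coef
r_squared = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(coef, r_squared)
```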

Page 14: Confirm Results

To confirm results, show each of the following:
• Plot of the original estimates vs. the model
• The test classification chart
• Plot of actual projects on the classification chart, with discussion of discrepancies
• Determine volumes in each zone to check if support is realistic

Present results to the group.
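
As a sketch of the first confirmation plot (the data here are synthetic stand-ins for the consolidated estimates and model outputs):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
olympic = rng.uniform(0, 1, 48)                           # stand-in Olympic scores
model = np.clip(olympic + rng.normal(0, 0.07, 48), 0, 1)  # model tracks them with scatter

plt.scatter(olympic, model)
plt.plot([0, 1], [0, 1], linestyle="--")  # perfect-agreement line
plt.xlabel("Olympic score of calibrated estimates")
plt.ylabel("Model estimate")
plt.title("Comparison of estimates to model")
plt.show()
```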

Page 15: Regression Results

Each investment was described by 12 variables, but the model reduced this to 8.

After a few regression models were tried, one was found with an R-squared of 0.91.

Higher-order variables were added, such as one that considered the level of over-budget only if the investment was neither strategic nor compliance (see the sketch at the end of this page).

Part of the variance from the Olympic score to the model was due to evaluator inconsistency, not actual error in the model.

[Chart: Comparison of estimates to model – Olympic score of calibrated estimates vs. model estimate, both on a 0-1 scale]
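
As a hypothetical sketch of how such a conditional higher-order variable could be constructed (the name and the values are my assumptions, not the actual model term):

```python
import numpy as np

over_budget = np.array([0.10, 0.45, 0.30, 0.60])  # fraction over budget
strategic   = np.array([1, 0, 0, 1])              # documented strategic initiative?
compliance  = np.array([0, 0, 1, 0])              # compliance requirement?

# Over-budget counts only when the investment is neither strategic nor compliance
discretionary_overrun = over_budget * (1 - strategic) * (1 - compliance)
print(discretionary_overrun)  # [0.   0.45 0.   0.  ]
```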