Software in Acquisition Workshop: Software Expert Panel Outbrief, Day 2

DoD Software In Acquisition Workshop, 15-16 April
Institute for Defense Analyses, Alexandria, VA

Page 1

Software in Acquisition Workshop

Software Expert Panel Outbrief Day 2

DoD Software In Acquisition Workshop, 15-16 April

Institute for Defense Analyses, Alexandria, VA

Page 2

Voting for Payoff vs. Ease for “Big Ideas”

Page 3

Payoff vs. Ease for “Big Ideas”

[Chart: the “big ideas” below plotted by payoff (vertical axis, Low to High) against ease (horizontal axis, Difficult to Easy). Max score is 261 = 29 * 9.]

● Define software implications of competitive prototyping policy

● Lack of software requirements and technology maturity are causing most systems to fail

● Recommended guidance for defining architecture requirements and quantitatively identifying, predicting, evaluating, verifying, and validating quality characteristics and attributes

● Body of knowledge for estimation

● Systems and software quality survey report (e.g., resources regarding: selection of characteristics, measurements, evaluation methods, etc.)

Page 4

Knit Together Selected “Big Ideas” into Unified Proposed Initiative

Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality

Page 5

Competitive Prototyping Memo

Page 6

Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality (1/3)

● Task 1: Conduct surveys and interviews of leading software professionals (government, industry, academia) to gather ideas, assess impacts, and sense expectations for Competitive Prototyping
– 14 days: Coordination of feedback from selected participation in DoD Workshop on Competitive Prototyping 4/28/08
– 6 months: Summary report complete

● Task 2: Provide amplification of Competitive Prototyping memo for integrated SE/SW, including where in the lifecycle there are opportunities for Competitive Prototyping and how they can be contractually achieved
– 6 months: Initial sensing of guidance/implementation implications, including investigating the movement of Milestone B after System PDR
– 12 months: Specific guidance/implementation recommendations

Page 7

Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality (2/3)

● Task 3: Identify first adopters of Competitive Prototyping and facilitate and gather insights on effective usage, including collecting and analyzing data
– 6 months: Recommendations for programs, including lessons learned from previous similar programs that used competitive prototyping
– 12 months: Support kickoff of programs
– 18 months: Actively engaged with programs in facilitation and gathering insights

● Task 4: Develop guidance for early selection and application of integrated SE/SW quality systems for Competitive Prototyping [for RFP authors]
– 6 months: 1st draft, ready for limited circulation
– 18 months: 2nd draft, ready for wide circulation

Page 8

Leveraging Competitive Prototyping through Acquisition Initiatives in Integrated Systems and Software Requirements, Risk, Estimation, and Quality (3/3)

● Task 5: Develop Software Engineering Handbook for Competitive Prototyping, including material explicitly targeted to different audiences (acquirer, supplier, etc.). Note: Tasks 5 and 6 are tightly coupled.
– 6 months: Outline
– 12 months: 1st draft, ready for limited circulation
– 18 months: Usage and evaluations on programs
– 30 months: 2nd draft, ready for wide circulation

● Task 6: Develop training assets (materials, competencies, skill sets, etc.) that capture best-of-class ideas/practices for Competitive Prototyping. Note: Tasks 5 and 6 are tightly coupled.
– 6 months: Outline
– 12 months: Scenarios and initial example materials, including drafts coordinated with feedback from programs
– 18 months: Usage and evaluations on programs
– 24 months: 1st draft, ready for limited circulation
– 36 months: 2nd draft, ready for wide circulation

Page 9

Original “Big Ideas” Charts

Page 10

Define Software Implications of Competitive Prototyping Policy

● Summary
– Define the software implications of the competitive prototyping policy and use this opportunity to address the “overall estimation problem”

● Benefits
– Concurrently engineer the systems and software requirements
– Address the overall estimation problem
– Improve how to best manage expectations (software achievability)
– Align incentives for stakeholders: program office, services, industry VPs, proposal business development, program execution

Page 11

Lack of Software Requirements and Technology Maturity are Causing Most Systems to Fail

● Summary
– These are dominant (according to GAO) factors in software system acquisition that need to be addressed
– Recommend pilots of programs that identify software technology development consistent with the Young memo

● Benefits
– Summarizes statements from many sources of software problems and what we can learn from them
– Posits a change to the life cycle
• Better / more mature requirements at initiation
• Substantially more attention to software maturity
• PMs have to know more about their jobs (seeking support in software and systems engineering and cost estimation and tracking)

● Team Elaborations
– Much of what we are trying to develop is cutting edge, so we have to consciously deal with technology maturations (integration of multiple parallel new technologies)
– Need discussion of development of human capital
– Need to understand scope change versus evolution, clarification, and careful decomposition

Page 12

Acquisition Guidance

Task 2: Develop recommended guidance, based on an example Quality Model, for architecture of software-intensive systems; includes guidance for defining architecture requirements and quantitatively identifying, predicting, evaluating, verifying, and validating Quality Characteristics and Attributes

Deliverables:
– Deliverable 1: Systems and software quality survey report (e.g., resources regarding: selection of characteristics, measurements, evaluation methods, etc.)
– Deliverable 2: Recommended guidance for defining architecture requirements and quantitatively identifying, predicting, evaluating, verifying, and validating Quality Characteristics and Attributes

Timeline:
– Survey report: 4 months post 4/08 workshop
– Draft recommended guidance: 12 months post 4/08 workshop
– Updated recommended guidance: 18 months post 4/08 workshop

Page 13

Body of Knowledge for Estimation

● Summary
– Curriculum for training people on estimation; body of knowledge for estimation (pulls from many disciplines: SW, economics, finance, management)

● Benefits
– Grow human capital
– Disseminate best practices, body of knowledge
– Facilitate professional certifications
– Build on MS in SW Engineering curriculum
– Define types of competencies and skill sets
– Synergies with Young’s just-kicked-off Software Acquisition Training and Education Working Group (SATEWG)

Page 14

Whitepapers Available (see SEI website www.sei.cmu.edu)

● Type 1:
– Making Requirements Management Work in an Organization

● Type 2:
– Requirements Engineering

● Type 3:
– Department of Defense SW System Acquisition--What’s Broke and What Can SW Requirement Management Contribute to a Systemic Fix?
– Delivering Common Software Requirements for DoD Mission Capabilities
– A Consideration for Enabling a Systems-of-Systems (SoS) Governance

Page 15

Charter for Day 2 Working Sessions

● React to “unified proposal” on “Enabling Competitive Prototyping through Software Acquisition Initiatives in Requirements, Risk/Estimation, and Quality”

● Define/refine/improve near-term tasks, deliverables, milestones

● Define task leaders

● Define lists of interested participants

● Estimate resources required for near-term tasks

Page 16

Proposed Software in Acquisition Working Group Meeting (Week of July 14, 2008, WashDC area)

● JULY: Outbrief from 4/28/08 and related meetings on CompProto (Blake Ireland)

● JULY: Updated task planning/status for Tasks 1-6 and performing organizations (Bruce Amato plus reps from performing orgs)

● JULY/OCT: Invited speakers to share experiences on previous competitive prototyping (Rick Selby)

● JULY: Update on DAG and related guides (John Forbes)

● JULY/OCT: Systems Engineering Forum coordination (Kristen Baldwin)

● JULY/OCT: Initial results from Task 1 surveys/interviews regarding gathering ideas, assessing impacts, and sensing expectations for Competitive Prototyping (Carl Clavadetscher)

● OCT: Panel discussion on how competitive prototyping has been used in the past, how it is currently being planned as embodied in the Competitive Prototyping memo, and emerging/unanticipated issues (Ken Nidiffer)

● OCT: Update on programs adopting Competitive Prototyping and how they are doing so, including status of their plans and decisions (Bruce Amato)

● JULY/OCT: Action plan going forward, including planning for Fall 2008 meeting (All)

● Invitees: April 2008 attendees plus people working Tasks 1-6

Page 17

TPS Decision Problem

• Available decision rules inadequate

• Need better information

Info. has economic value

Payoffs ($K) by state of nature:

Alternative         Favorable   Unfavorable
BB (Bold)              250          -50
BC (Conservative)       50           50
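To make the table concrete, here is a minimal Python sketch (the variable and key names are our own) that computes each alternative's expected payoff for a given probability of the favorable state:

```python
# Payoff table for the TPS decision problem, in $K (from the slide).
PAYOFFS = {
    "BB (Bold)":         {"favorable": 250, "unfavorable": -50},
    "BC (Conservative)": {"favorable": 50,  "unfavorable": 50},
}

def expected_value(alternative: str, p_favorable: float) -> float:
    """Expected payoff ($K) of an alternative given P(favorable)."""
    row = PAYOFFS[alternative]
    return p_favorable * row["favorable"] + (1 - p_favorable) * row["unfavorable"]

# With the equally likely states assumed on the following slides:
for alt in PAYOFFS:
    print(alt, expected_value(alt, 0.5))
# BB (Bold) 100.0
# BC (Conservative) 50.0
```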

Page 18

Expected Value of Perfect Information (EVPI)

Build a prototype for $10K:
– If prototype succeeds, choose BB. Payoff: $250K – $10K = $240K
– If prototype fails, choose BC. Payoff: $50K – $10K = $40K

If the outcomes are equally likely:
EV = 0.5 ($240K) + 0.5 ($40K) = $140K

Could invest up to $50K and do better than before; thus, EVPI = $50K
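A short Python check of this arithmetic, using the payoffs from the decision table with the slide's $10K prototype cost and equally likely states:

```python
# EVPI check for the TPS problem; all figures in $K, from the slides.
PROTO_COST = 10
P_FAVORABLE = 0.5                      # states of nature equally likely

# Without better information: pick the better fixed alternative.
ev_bold = P_FAVORABLE * 250 + (1 - P_FAVORABLE) * (-50)   # 100
ev_conservative = 50
ev_no_info = max(ev_bold, ev_conservative)                # 100

# With a perfect prototype: choose BB if favorable, BC otherwise.
ev_perfect = P_FAVORABLE * 250 + (1 - P_FAVORABLE) * 50   # 150
ev_with_proto = ev_perfect - PROTO_COST                   # 140, as on the slide

evpi = ev_perfect - ev_no_info   # 50: the most an investigation is worth
print(ev_with_proto, evpi)       # 140.0 50.0
```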

Page 19

However, Prototype Will Give Imperfect Information

Notation:
– IB: investigation (prototype) says “choose Bold”
– IC: investigation (prototype) says “choose Conservative”
– SF: state of nature in which Bold will fail
– SS: state of nature in which Bold will succeed

Perfect information would mean P(IB|SF) = 0.0 and P(IB|SS) = 1.0; a real prototype will fall short of this.

Page 20

Suppose we assess the prototype’s imperfections as:
P(IB|SF) = 0.20, P(IB|SS) = 0.90

And suppose the states of nature are equally likely:
P(SF) = 0.50, P(SS) = 0.50

We would like to compute the expected value of using the prototype:

EV(IB, IC) = P(IB) (payoff if investigation says Bold) + P(IC) (payoff if investigation says Conservative)
           = P(IB) [ P(SS|IB) ($250K) + P(SF|IB) (-$50K) ] + P(IC) ($50K)

But these aren’t the probabilities we know.

Page 21

How to get the probabilities we need:

P(IB) = P(IB|SS) P(SS) + P(IB|SF) P(SF)

P(IC) = 1 – P(IB)

P(SS|IB) = P(IB|SS) P(SS) / P(IB)   (Bayes’ formula)

P(SF|IB) = 1 – P(SS|IB)

Interpretation:
P(SS|IB) = Prob(we will choose Bold in a state of nature where it will succeed) / Prob(we will choose Bold)
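Putting the last two pages together, a minimal Python sketch (variable names are our own) that performs the Bayes step and evaluates the EV formula with the numbers given above:

```python
# Prototype accuracy and priors from the preceding slide.
p_ib_given_ss = 0.90      # P(IB|SS)
p_ib_given_sf = 0.20      # P(IB|SF)
p_ss = p_sf = 0.50        # equally likely states of nature

# Total probability the investigation says "choose Bold".
p_ib = p_ib_given_ss * p_ss + p_ib_given_sf * p_sf   # 0.55
p_ic = 1 - p_ib                                      # 0.45

# Bayes' formula gives the probabilities the EV formula needs.
p_ss_given_ib = p_ib_given_ss * p_ss / p_ib          # ~0.818
p_sf_given_ib = 1 - p_ss_given_ib                    # ~0.182

# Expected value ($K) of deciding with the imperfect prototype,
# using the $250K / -$50K / $50K payoffs from the decision table.
ev = p_ib * (p_ss_given_ib * 250 + p_sf_given_ib * (-50)) + p_ic * 50
print(round(ev, 1))   # 130.0
```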

Page 22

Net Expected Value of Prototype

PROTO COST, $K   P(PS|SF)   P(PS|SS)   EV, $K   NET EV, $K
 0                  –          –        60         0
 5                 0.30       0.80      69.3       4.3
10                 0.20       0.90      78.2       8.2
20                 0.10       0.95      86.8       6.8
30                 0.00       1.00      90         0

[Chart: NET EV, $K plotted against PROTO COST, $K]
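The NET EV column is consistent with reading the cost-0 row as the no-prototype baseline (an expected value of $60K), so that net EV = EV – baseline – cost. A short Python sketch under that assumption, reproducing the column:

```python
# (cost, EV) rows from the table above, in $K.
ROWS = [(0, 60), (5, 69.3), (10, 78.2), (20, 86.8), (30, 90)]

EV_NO_PROTO = 60   # the cost-0 row: expected value with no prototype

for cost, ev in ROWS:
    net = ev - EV_NO_PROTO - cost   # value added over not prototyping
    print(f"cost={cost:>2}  EV={ev:>5}  net={net:.1f}")
# Net EV peaks at the $10K prototype (8.2), as in the chart.
```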

Page 23

Conditions for Successful Prototyping (or Other Info-Buying)

1. There exist alternatives whose payoffs vary greatly depending on some states of nature.

2. The critical states of nature have an appreciable probability of occurring.

3. The prototype has a high probability of accurately identifying the critical states of nature.

4. The required cost and schedule of the prototype do not overly curtail its net value.

5. There exist significant side benefits derived from building the prototype.

Page 24

Pitfalls Associated with Success Conditions

1. Always build a prototype or simulation
– May not satisfy conditions 3, 4

2. Always build the software twice
– May not satisfy conditions 1, 2

3. Build the software purely top-down
– May not satisfy conditions 1, 2

4. Prove every piece of code correct
– May not satisfy conditions 1, 2, 4

5. Nominal-case testing is sufficient
– May need off-nominal testing to satisfy conditions 1, 2, 3

Page 25

Statistical Decision Theory: Other S/W Engineering Applications

How much should we invest in:
– User information gathering
– Make-or-buy information
– Simulation
– Testing
– Program verification

How much should our customers invest in:
– MIS, query systems, traffic models, CAD, automatic test equipment, …

Page 26

The Software Engineering Field Exists Because Processed Information Has Value