The COCOMO II Suite of Software Cost Estimation Models

Barry Boehm, USC
TRW Presentation, March 19, 2001
boehm@sunset.usc.edu   http://sunset.usc.edu/research/cocomosuite
USC-CSE Affiliates (33)

• Commercial Industry (18)
  – Automobile Club of Southern California, C-Bridge, Daimler Chrysler, EDS, Fidelity Group, Galorath, Group Systems.Com, Hughes, IBM, Lucent, Marotz, Microsoft, Motorola, Price Systems, Rational, Sun, Telcordia, Xerox
• Aerospace Industry (9)
  – Boeing, Draper Labs, GDE Systems, Litton, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
• Government (3)
  – FAA, US Army Research Labs, US Army TACOM
• FFRDCs and Consortia (4)
  – Aerospace, JPL, SEI, SPC
• International (1)
  – Chung-Ang U. (Korea)
USC-CSE Affiliates' Calendar

June 22, 2000          Easy WinWin Web Seminar
July 25-26, 2000       Easy WinWin Hands-on Tutorial
July 27, 2000          Tutorial: Transitioning to the CMMI via MBASE
August 24-25, 2000     Software Engineering Internship Workshop
September 13-15, 2000  Workshop: Spiral Development in the DoD (Washington DC; with SEI)
October 24-27, 2000    COCOMO/Software Cost Modeling Forum and Workshop
February 6-9, 2001     Annual Research Review, COTS-Based Systems Workshop (with SEI, CeBASE)
February 21-23, 2001   Ground Systems Architecture Workshop (with Aerospace, SEI)
February 21, 2001      LA SPIN, Ron Kohl, COTS-Based Systems Processes
March 28, 2001         LA SPIN, High Dependability Computing
May 2001               Annual Affiliates' Renewal
June 14, 2001          Rapid Value/RUP/MBASE Seminar (with C-Bridge, Rational)
Outline

• COCOMO II Overview
• Overview of Emerging Extensions
  – COTS Integration (COCOTS)
  – Quality: Delivered Defect Density (COQUALMO)
  – Phase Distributions (COPSEMO)
  – Rapid Application Development Schedule (CORADMO)
  – Productivity Improvement (COPROMO)
  – System Engineering (COSYSMO)
  – Tool Effects
  – Code Count™
• Related USC-CSE Research
  – MBASE, CeBASE and CMMI
• Backup charts
COCOMO II Book Table of Contents
– Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000

1. Introduction
2. Model Definition
3. Application Examples
4. Calibration
5. Emerging Extensions
6. Future Trends
Appendices – Assumptions, Data Forms, User's Manual, CD Content

CD: Video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms
Purpose of COCOMO II

To help people reason about the cost and schedule implications of their software decisions.
Major Decision Situations Helped by COCOMO II

• Software investment decisions
  – When to develop, reuse, or purchase
  – What legacy software to modify or phase out
• Setting project budgets and schedules
• Negotiating cost/schedule/performance tradeoffs
• Making software risk management decisions
• Making software improvement decisions
  – Reuse, tools, process maturity, outsourcing
Need to Reengineer COCOMO 81
• New software processes
• New sizing phenomena
• New reuse phenomena
• Need to make decisions based on incomplete information
COCOMO II Model Stages

[Figure: Cone of uncertainty across phases and milestones (Feasibility; Plans and Rqts. / Concept of Operation; Product Design / Rqts. Spec.; Detail Design / Product Design Spec.; Devel. and Test / Detail Design Spec.; Accepted Software). The relative size range narrows from 4x / 0.25x at Feasibility through 2x / 0.5x and 1.5x / 0.67x to 1.25x / 0.8x as the project proceeds. The three estimation models apply successively: Applications Composition (3 parameters), Early Design (13 parameters), Post-Architecture (23 parameters).]
Relations to MBASE*/Rational Anchor Point Milestones

[Figure: Two life-cycle timelines. Waterfall: Rqts. (SRR), Prod. Des. (PDR), Development, and Sys Devel (SAT), with LCO, LCA, and IOC aligned to SRR, PDR, and SAT respectively. MBASE/Rational: App. Compos. and Inception Phase (ending at LCO), Elaboration (ending at LCA), Construction (ending at IOC), and Transition.]

*MBASE: Model-Based (System) Architecting and Software Engineering
Early Design and Post-Architecture Model

[Diagram: Size and Environment factors determine an effort multiplier; Process scale factors determine the effort and schedule exponents.]

• Environment: Product, Platform, People, Project factors
• Size: Nonlinear reuse and volatility effects
• Process: Constraint, Risk/Architecture, Team, Maturity factors
Nonlinear Reuse Effects

[Figure: Relative cost vs. amount of software modified, from data on 2954 NASA modules [Selby, 1988]. Even at near-zero modification there is a relative cost of about 0.046 for assessment and check-out; relative cost then rises steeply, to roughly 0.55 at 25% modified and 0.70 at 50% modified, compared with the usual linear assumption that reaches 1.0 only at 100% modified.]
COCOMO II.2000 Productivity Ranges

[Bar chart: productivity range (highest rating / lowest rating multiplier, spanning roughly 1.2 to 2.4) for each driver, from largest to smallest:]

Product Complexity (CPLX), Analyst Capability (ACAP), Programmer Capability (PCAP), Time Constraint (TIME), Personnel Continuity (PCON), Required Software Reliability (RELY), Documentation Match to Life Cycle Needs (DOCU), Multi-Site Development (SITE), Applications Experience (AEXP), Platform Volatility (PVOL), Use of Software Tools (TOOL), Storage Constraint (STOR), Process Maturity (PMAT), Language and Tools Experience (LTEX), Required Development Schedule (SCED), Data Base Size (DATA), Platform Experience (PEXP), Architecture and Risk Resolution (RESL), Precedentedness (PREC), Develop for Reuse (RUSE), Team Cohesion (TEAM), Development Flexibility (FLEX)

Scale Factor Ranges: 10, 100, 1000 KSLOC
COCOMO Model Comparisons

Size
• COCOMO 81: Delivered Source Instructions (DSI) or Source Lines of Code (SLOC)
• Ada COCOMO: DSI or SLOC
• COCOMO II Application Composition: Application Points
• COCOMO II Early Design: Function Points (FP) and Language, or SLOC
• COCOMO II Post-Architecture: FP and Language, or SLOC

Reuse
• COCOMO 81: Equivalent SLOC = Linear(DM, CM, IM)
• Ada COCOMO: Equivalent SLOC = Linear(DM, CM, IM)
• Application Composition: Implicit in model
• Early Design: Equivalent SLOC = nonlinear(AA, SU, UNFM, DM, CM, IM)
• Post-Architecture: Equivalent SLOC = nonlinear(AA, SU, UNFM, DM, CM, IM)

Rqts. Change
• COCOMO 81: Requirements Volatility rating (RVOL)
• Ada COCOMO: RVOL rating
• Application Composition: Implicit in model
• Early Design: Change %: RQEV
• Post-Architecture: RQEV

Maintenance
• COCOMO 81: Annual Change Traffic (ACT) = % added + % modified
• Ada COCOMO: ACT
• Application Composition: Object Point ACT
• Early Design: (ACT, SU, UNFM)
• Post-Architecture: (ACT, SU, UNFM)

Scale (b) in MM_NOM = a(Size)^b
• COCOMO 81: Organic: 1.05; Semidetached: 1.12; Embedded: 1.20
• Ada COCOMO: Embedded: 1.04-1.24, depending on degree of early risk elimination, solid architecture, stable requirements, Ada process maturity
• Application Composition: 1.0
• Early Design: 0.91-1.23, depending on degree of precedentedness, conformity, early architecture and risk resolution, team cohesion, process maturity (SEI)
• Post-Architecture: 0.91-1.23, depending on the same factors

Product Cost Drivers
• COCOMO 81: RELY, DATA, CPLX
• Ada COCOMO: RELY*, DATA, CPLX*, RUSE
• Application Composition: None
• Early Design: RCPX*, RUSE*
• Post-Architecture: RELY, DATA, DOCU*, CPLX, RUSE*

Platform Cost Drivers
• COCOMO 81: TIME, STOR, VIRT, TURN
• Ada COCOMO: TIME, STOR, VMVH, VMVT, TURN
• Application Composition: None
• Early Design: Platform difficulty: PDIF*
• Post-Architecture: TIME, STOR, PVOL (=VIRT)

Personnel Cost Drivers
• COCOMO 81: ACAP, AEXP, PCAP, VEXP, LEXP
• Ada COCOMO: ACAP*, AEXP, PCAP*, VEXP, LEXP*
• Application Composition: None
• Early Design: Personnel capability and experience: PERS*, PREX*
• Post-Architecture: ACAP*, AEXP, PCAP*, PEXP*, LTEX*, PCON*

Project Cost Drivers
• COCOMO 81: MODP, TOOL, SCED
• Ada COCOMO: MODP*, TOOL*, SCED, SECU
• Application Composition: None
• Early Design: SCED, FCIL*
• Post-Architecture: TOOL*, SCED, SITE*

* Different multipliers and/or different rating scale
COCOMO II Estimation Accuracy
Percentage of sample projects within 30% of actuals, without / with calibration to data source (a sketch of the metric follows)

                 COCOMO 81   COCOMO II.1997   COCOMO II.2000
# Projects       63          83               161
Effort           81%         52% / 64%        75% / 80%
Schedule         65%         61% / 62%        72% / 81%
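The figures above are PRED(.30) values: the fraction of projects whose estimate falls within 30% of actuals. A minimal sketch of that metric in Python, assuming paired lists of model estimates and project actuals (variable names are illustrative; the project data themselves are not reproduced here):

```python
def pred(estimates, actuals, tolerance=0.30):
    """Fraction of projects whose estimate is within `tolerance` of actuals."""
    hits = sum(abs(est - act) / act <= tolerance
               for est, act in zip(estimates, actuals))
    return hits / len(actuals)

# e.g. pred(cocomo_estimates, project_actuals) -> 0.80 corresponds to the
# COCOMO II.2000 effort figure after calibration to the data source.
```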
COCOMO II Experience Factory: I

[Flow diagram: System objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO 2.0, producing cost, schedule, and risk estimates; if these are not OK, the project is rescoped and re-estimated.]
COCOMO II Experience Factory: II

[Flow diagram: As in I, plus project execution. Once cost, schedule, and risks are OK, the project executes to its next milestone against milestone plans, resources, and expectations. Milestone results are compared with expectations; mismatches lead to revised milestones, plans, resources, and expectations (or to rescoping), and the loop repeats until the project is done.]
COCOMO II Experience Factory: III

[Flow diagram: As in II, plus a calibration loop: milestone results are accumulated as COCOMO 2.0 calibration data, which is used to recalibrate COCOMO 2.0.]
COCOMO II Experience Factory: IV

[Flow diagram: As in III, plus organizational improvement: cost, schedule, and quality drivers feed evaluation of corporate software improvement strategies, yielding improved corporate parameters (tools, processes, reuse) for future estimates.]
COCOMO II Book Table of Contents
– Boehm, Abts, Brown, Chulani, Clark, Horowitz, Madachy, Reifer, Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000

1. Introduction
2. Model Definition
3. Application Examples
4. Calibration
5. Emerging Extensions
6. Future Trends
Appendices – Assumptions, Data Forms, User's Manual, CD Content

CD: Video tutorials, USC COCOMO II.2000, commercial tool demos, manuals, data forms, web site links, Affiliate forms
Outline

• COCOMO II Overview
• Overview of Emerging Extensions
  – COTS Integration (COCOTS)
  – Quality: Delivered Defect Density (COQUALMO)
  – Phase Distributions (COPSEMO)
  – Rapid Application Development Schedule (CORADMO)
  – Productivity Improvement (COPROMO)
  – System Engineering (COSYSMO)
  – Tool Effects
  – Code Count™
• Related USC-CSE Research
  – MBASE, CeBASE and CMMI
• Backup charts
USC-CSE Modeling Methodology

1. Analyze existing literature
2. Perform behavioral analyses
3. Identify relative significance
4. Perform expert-judgment Delphi assessment; formulate a-priori model
5. Gather project data
6. Determine Bayesian a-posteriori model
7. Gather more data; refine model

(Concurrency and feedback among steps implied)
Results of Bayesian Update: Using Prior and Sampling Information (Step 6)

[Figure: Example for Language and Tool Experience (LTEX). Productivity range = highest rating / lowest rating, as estimated from literature and behavioral analysis, the a-priori experts' Delphi, noisy data analysis, and the a-posteriori Bayesian update; the values shown are 1.06, 1.45, 1.51, and 1.41, with the Bayesian update combining the Delphi prior and the noisy sampling information.]
Status of Models

[Table: Progress of each model (COCOMO II; COCOTS; COQUALMO, defects in / defects out; COPSEMO; CORADMO; COSYSMO) through the methodology steps: literature, behavioral analysis / significant variables, Delphi, and data / Bayesian calibration. Calibration data points to date: 161 projects (COCOMO II) and 20 (COCOTS), with the newer extensions at 2, 2, and 10 data points.]
COCOMO vs. COCOTS Cost Sources

[Figure: Staffing vs. time profile contrasting COCOMO development effort with COCOTS COTS-related effort sources.]
COCOTS Effort Distribution: 20 Projects

[Bar chart: Mean % of total COTS effort by activity (+/- 1 SD), in % person-months, for assessment, tailoring, glue code, and system volatility. Means fall roughly between 11% and 31%, with +1 SD bounds reaching about 49-61% and -1 SD bounds extending slightly below zero.]
Integrated COQUALMO

[Diagram: COCOMO II takes the software size estimate and software product, process, computer, and personnel attributes, producing the software development effort, cost, and schedule estimate. COQUALMO's Defect Introduction Model takes the same inputs and feeds the Defect Removal Model, which also takes defect removal capability levels and outputs the number of residual defects and the defect density per unit of size.]
COQUALMO Defect Removal Estimates
– Nominal Defect Introduction Rates

Delivered defects/KSLOC by composite defect removal rating:
VL: 60   Low: 28.5   Nom: 14.3   High: 7.5   VH: 3.5   XH: 1.6
COCOMO II RAD Extension (CORADMO)

[Diagram: COCOMO II cost drivers (except SCED), plus language level and experience, feed COCOMO II, which produces baseline effort and schedule. The Phase Distributions model (COPSEMO) converts these to effort and schedule by stage. The RAD Extension then applies the schedule drivers RVHL, DPRS, CLAB, RESL, PPOS, and RCAP to produce RAD effort and schedule by phase.]
Effect of RCAP on Cost, Schedule

[Chart: Schedule in months (M) vs. effort in person-months (PM, 0-50), showing the 3.7x cube-root, 3x cube-root, and square-root schedule laws, with the RCAP = XL and RCAP = XH cases plotted against them.]
COPROMO (Productivity) Model

• Uses COCOMO II model and extensions as assessment framework
  – Well-calibrated to 161 projects for effort, schedule
  – Subset of 106 1990's projects for current-practice baseline
  – Extensions for Rapid Application Development formulated
• Determines impact of technology investments on model parameter settings
• Uses these in models to assess impact of technology investments on cost and schedule
  – Effort used as a proxy for cost
Strawman COSYSMO

• Sizing model determines nominal COCOMO II SysE effort and schedule
  – Function points/use cases/other for basic effort
  – Tool and document preparation separate (?) as a "source of effort"
  – Factor in volatility and reuse
  – Begin with linear effort scaling with size (?) (see the sketch below)
• Cost & schedule drivers multiplicatively adjust nominal effort and schedule by phase and source of effort (?)
  – Application factors
  – Team factors
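To make the strawman's shape concrete, here is a minimal sketch of "linear effort scaling with size, multiplicatively adjusted by drivers." The driver names, the person-months-per-size-unit rate, and the multiplier values are placeholders, not calibrated COSYSMO factors:

```python
def strawman_syse_effort(size, nominal_rate, effort_multipliers):
    """size: adjusted function points / use cases / other size measure;
    nominal_rate: person-months per size unit (a calibration placeholder);
    effort_multipliers: multiplicative application and team factors."""
    effort = size * nominal_rate            # linear scaling, per the slide
    for multiplier in effort_multipliers.values():
        effort *= multiplier                # multiplicative driver adjustment
    return effort

# Hypothetical example: 200 size units, two drivers rated off-nominal.
print(strawman_syse_effort(200, 0.1,
                           {"rqts_understanding": 0.9, "team_cohesion": 1.1}))
```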
COCOMO II.1998 Productivity Ranges and Current Practice

[Chart: Productivity ranges (axis 0.5 to 2), with the average multiplier for 1990's projects marked for each driver.]
COSYSMO: Factor Importance Rating

Rate each factor H, M, or L depending on its relatively high, medium, or low influence on system engineering effort. Use an equal number of H's, M's, and L's. (Ratings below pooled from N = 6 respondents; mean scores in parentheses.)

Application Factors
  H     Requirements understanding (3.0)
  M-H   Architecture understanding (2.5)
  L-H   Level of service rqts. criticality, difficulty (2.3)
  L-M   Legacy transition complexity (1.5)
  L-M   COTS assessment complexity (1.7)
  L-H   Platform difficulty (1.7)
  L-M   Required business process reengineering (1.5)
  TBD   Ops. concept understanding (N = H)
  TBD

Team Factors
  L-M   Number and diversity of stakeholder communities (1.5)
  M-H   Stakeholder team cohesion (2.7)
  M-H   Personnel capability/continuity (2.7)
  H     Personnel experience (3.0)
  L-H   Process maturity (2.0)
  L-M   Multisite coordination (1.5)
  L-H   Degree of system engineering ceremony (2.0)
  L-M   Tool support (1.3)
  TBD
  TBD
New Tool Rating Scale

• Basis of Tool Rating Scale
  – Breadth of Process Support
    • Specification, Analysis, Design, Programming, Test, CM, QA, Management, etc.
  – CMM Tool maturity and support
  – Degree of Tool Integration

Rating / CASE Tools
• Very Low: Text-based editor; basic 3GL compiler; basic library aids; basic text-based debugger; basic linker
• Low: Graphical interactive editor; simple design language; simple programming support library; simple metrics/analysis tool
• Nominal: Local syntax checking editor; standard template support document generator; simple design tools; simple standalone configuration management tool; standard data transformation tool; standard support metrics aids with repository; simple repository; basic test case analyzer
• High: Local semantics checking editor; automatic document generator; requirement specification aids and analyzer; extended design tools; automatic code generator from detailed design; centralized configuration management tool; process management aids; partially associative repository (simple data model support); test case analyzer with spec. verification aids; basic reengineering & reverse engineering tool
• Very High: Global semantics checking editor; tailorable automatic document generator; requirement specification aids and analyzer with tracking capability; extended design tools with model verifier; code generator with basic round-trip capability; extended static analysis tool; basic associative, active repository (complex data model support); heterogeneous network support; distributed configuration management tool; test case analyzer with testing process manager and oracle support; extended reengineering & reverse engineering tools
• Extra High: Groupware systems; distributed asynchronous requirement negotiation and tradeoff tools; code generator with extended round-trip capability; extended associative, active repository; spec-based static and dynamic analyzers; proactive project decision assistance
Code Count™

• Suite of 9 counting tools; copylefted; full source code available
  – Ada, ASM 1750, C/C++, COBOL, FORTRAN, Java, JOVIAL, Pascal, PL/1
• Counts: "QA" data (tallies), SLOC, statements by type, DSI, comments by type
Outline

• COCOMO II Overview
• Overview of Emerging Extensions
  – COTS Integration (COCOTS)
  – Quality: Delivered Defect Density (COQUALMO)
  – Phase Distributions (COPSEMO)
  – Rapid Application Development Schedule (CORADMO)
  – Productivity Improvement (COPROMO)
  – System Engineering (COSYSMO)
  – Tool Effects
  – Code Count™
• Related USC-CSE Research
  – MBASE, CeBASE and CMMI
• Backup charts
MBASE, CeBASE, and CMMI

• Model-Based (System) Architecting and Software Engineering (MBASE)
  – Extension of WinWin Spiral Model
  – Avoids process/product/property/success model clashes
  – Provides project-oriented guidelines
• Center for Empirically-Based Software Engineering (CeBASE)
  – Led by USC, U. Maryland
  – Sponsored by NSF, others
  – Empirical software data collection and analysis
  – Integrates MBASE, Experience Factory, and GMQM (Goal-Model-Question-Metric method) into CeBASE Method
  – Integrated organization/portfolio/project guidelines
• CeBASE Method implements Integrated Capability Maturity Model (CMMI) and more
  – Parts of People CMM, but light on Acquisition CMM
Spiral Model Refinements

• Where do objectives, constraints, alternatives come from?
  – WinWin extensions
• Lack of intermediate milestones
  – Anchor points: LCO, LCA, IOC
  – Concurrent-engineering spirals between anchor points
• Need to avoid model clashes, provide more specific guidance
  – MBASE

The WinWin Spiral Model
(Steps 1-3 are the WinWin extensions; steps 4-7 are the original spiral.)

1. Identify next-level stakeholders
2. Identify stakeholders' win conditions
3. Reconcile win conditions; establish next-level objectives, constraints, alternatives
4. Evaluate product and process alternatives; resolve risks
5. Define next level of product and process, including partitions
6. Validate product and process definitions
7. Review, commitment
Life Cycle Anchor Points

• Common system/software stakeholder commitment points
  – Defined in concert with Government, industry affiliates
  – Coordinated with Rational's Unified Software Development Process
• Life Cycle Objectives (LCO)
  – Stakeholders' commitment to support system architecting
  – Like getting engaged
• Life Cycle Architecture (LCA)
  – Stakeholders' commitment to support full life cycle
  – Like getting married
• Initial Operational Capability (IOC)
  – Stakeholders' commitment to support operations
  – Like having your first child
Win Win Spiral Anchor Points
(Risk-driven level of detail for each element)

Definition of Operational Concept
• LCO: Top-level system objectives and scope (system boundary; environment parameters and assumptions; evolution parameters). Operational concept (operations and maintenance scenarios and parameters; organizational life-cycle responsibilities of stakeholders).
• LCA: Elaboration of system objectives and scope of increment. Elaboration of operational concept by increment.

Definition of System Requirements
• LCO: Top-level functions, interfaces, quality attribute levels, including growth vectors and priorities; prototypes. Stakeholders' concurrence on essentials.
• LCA: Elaboration of functions, interfaces, quality attributes, and prototypes by increment; identification of TBDs (to-be-determined items). Stakeholders' concurrence on their priority concerns.

Definition of System and Software Architecture
• LCO: Top-level definition of at least one feasible architecture (physical and logical elements and relationships; choices of COTS and reusable software elements). Identification of infeasible architecture options.
• LCA: Choice of architecture and elaboration by increment (physical and logical components, connectors, configurations, constraints; COTS, reuse choices; domain-architecture and architectural style choices). Architecture evolution parameters.

Definition of Life-Cycle Plan
• LCO: Identification of life-cycle stakeholders (users, customers, developers, maintainers, interoperators, general public, others). Identification of life-cycle process model (top-level stages, increments). Top-level WWWWWHH* by stage.
• LCA: Elaboration of WWWWWHH* for Initial Operational Capability (IOC); partial elaboration, identification of key TBDs for later increments.

Feasibility Rationale
• LCO: Assurance of consistency among elements above, via analysis, measurement, prototyping, simulation, etc.; business case analysis for requirements, feasible architectures.
• LCA: Assurance of consistency among elements above. All major risks resolved or covered by risk management plan.

System Prototype(s)
• LCO: Exercise key usage scenarios; resolve critical risks.
• LCA: Exercise range of usage scenarios; resolve major outstanding risks.

*WWWWWHH: Why, What, When, Who, Where, How, How Much
Clashes Among MBASE Models

[Matrix of example model clashes, by the pair of model types involved:]

• Product vs. Product: structure clash; traceability clash; architecture style clash
• Product vs. Process: COTS-driven product vs. waterfall (requirements-driven) process
• Product vs. Property: interdependent multiprocessor product vs. linear performance scalability model
• Product vs. Success: 4GL-based product vs. low development cost and performance scalability
• Process vs. Product: multi-increment development process vs. single-increment support tools
• Process vs. Property: evolutionary development process vs. Rayleigh-curve cost model
• Process vs. Success: waterfall process model vs. "I'll know it when I see it" (IKIWISI) prototyping success model
• Property vs. Property: minimize cost and schedule vs. maximize quality ("Quality is free")
• Property vs. Success: fixed-price contract vs. easy-to-change, volatile requirements
• Success vs. Success: Golden Rule vs. stakeholder win-win
MBASE Electronic Process Guide (1)
MBASE Electronic Process Guide (2)
Center for Empirically-Based Software Engineering (CeBASE) Strategic Vision

Strategic Framework
• Strategic process: Experience Factory
• Tailoring guidelines: Goal-Model-Question-Metric
• Tactical process: Model Integration (MBASE); WinWin Spiral

Empirical Methods (Quantitative / Qualitative)
• Experimental / Ethnographic
• Observational analysis / Surveys, assessments
• Parametric models / Critical success factors
• Dynamic models / Root cause analysis
• Pareto 80-20 relationships

Experience Base (Context; Results)
• Project, context attributes
• Empirical results; references
• Implications and recommended practices
• Experience feedback comments

• Initial foci: COTS-based systems; defect reduction
Integrated GMQM-MBASE Experience Factory
– Applies to organization's and projects' people, processes, and products

[Flow diagram, organization/portfolio level (EF-GMQM):]
• Org-Portfolio Shared Vision: organizational value propositions (VP's) and stakeholder values; current situation w.r.t. VP's; improvement goals and priorities; global scope, results chain; value/business case models
• Org. Strategic Plans: strategy elements; evaluation criteria/questions; improvement plans (progress metrics; experience base)
• Org. Monitoring & Control: monitor environment and update models; implement plans; evaluate progress w.r.t. goals and models; determine and apply corrective actions; update experience base

[Project level (MBASE):]
• Project Shared Vision: project value propositions and stakeholder values; current situation w.r.t. VP's; improvement goals and priorities; project scope, results chain; value/business case models
• Project Plans: LCO/LCA package (ops concept, prototypes, rqts, architecture, life-cycle plan, rationale); IOC/Transition/Support package (design, increment plans, quality plans, T/S plans); evaluation criteria/questions; progress metrics
• Proj. Monitoring & Control: monitor environment and update models; implement plans; evaluate progress w.r.t. goals, models, plans; determine and apply corrective actions; update experience base

[The two levels exchange scoping, planning, and monitoring & control context; shortfalls, opportunities, and risks; initiatives; plan/goal and progress/plan/goal mismatches; and project experience and progress w.r.t. plans and goals.]

LCO: Life Cycle Objectives; LCA: Life Cycle Architecture; IOC: Initial Operational Capability; GMQM: Goal-Model-Question-Metric Paradigm; MBASE: Model-Based (System) Architecting and Software Engineering
CeBASE Method Coverage of CMMI - I

• Process Management
  – Organizational Process Focus: 100+
  – Organizational Process Definition: 100+
  – Organizational Training: 100-
  – Organizational Process Performance: 100-
  – Organizational Innovation and Deployment: 100+
• Project Management
  – Project Planning: 100
  – Project Monitoring and Control: 100+
  – Supplier Agreement Management: 50-
  – Integrated Project Management: 100-
  – Risk Management: 100
  – Integrated Teaming: 100
  – Quantitative Project Management: 70-
CeBASE Method Coverage of CMMI - II

• Engineering
  – Requirements Management: 100
  – Requirements Development: 100
  – Technical Solution: 60+
  – Product Integration: 70-
  – Verification: 70-
  – Validation: 80+
• Support
  – Configuration Management: 70-
  – Process and Product Quality Assurance: 70-
  – Measurement and Analysis: 100-
  – Decision Analysis and Resolution: 100-
  – Organizational Environment for Integration: 80-
  – Causal Analysis and Resolution: 100
Outline

• COCOMO II Overview
• Overview of Emerging Extensions
  – COTS Integration (COCOTS)
  – Quality: Delivered Defect Density (COQUALMO)
  – Phase Distributions (COPSEMO)
  – Rapid Application Development Schedule (CORADMO)
  – Productivity Improvement (COPROMO)
  – System Engineering (COSYSMO)
  – Tool Effects
  – Code Count™
• Related USC-CSE Research
• Backup charts
Backup Charts
• COCOMO II
• COCOTS
• COQUALMO
• CORADMO
• COSYSMO
The Future of the Software Practices Marketplace

[Figure: Layered sectors with estimated numbers of US performers in year 2005 — User programming (55M); Application generators (0.6M); Application composition (0.7M); System integration (0.7M); Infrastructure (0.75M).]
COCOMO II Coverage of Future SW Practices Sectors

• User programming: no need for cost model
• Applications composition: use application points
  – Count (weight) screens, reports, 3GL routines
• System integration; development of applications generators and infrastructure software
  – Prototyping: applications composition model
  – Early design: function points and/or source statements and 7 cost drivers
  – Post-architecture: source statements and/or function points and 17 cost drivers
  – Stronger reuse/reengineering model
Baseline Application Point Estimation Procedure
(a worked sketch follows the steps)

Step 1: Assess element counts: estimate the number of screens, reports, and 3GL components that will comprise this application. Assume the standard definitions of these elements in your ICASE environment.

Step 2: Classify each element instance into simple, medium, and difficult complexity levels depending on values of characteristic dimensions, using the following scheme:

For Screens (rows: number of views contained; columns: number and source of data tables — Total < 4 (<2 srvr, <3 clnt) / Total < 8 (<3 srvr, 3-5 clnt) / Total 8+ (>3 srvr, >5 clnt)):
• < 3 views: simple / simple / medium
• 3-7 views: simple / medium / difficult
• > 8 views: medium / difficult / difficult

For Reports (rows: number of sections contained; same data-table columns):
• 0 or 1 section: simple / simple / medium
• 2 or 3 sections: simple / medium / difficult
• 4+ sections: medium / difficult / difficult

Step 3: Weight the number in each cell using the following scheme. The weights reflect the relative effort required to implement an instance of that complexity level:

Element Type     Simple   Medium   Difficult
Screen           1        2        3
Report           2        5        8
3GL Component    -        -        10

Step 4: Determine Application Points: add all the weighted element instances to get one number, the Application Point count.

Step 5: Estimate the percentage of reuse you expect to be achieved in this project. Compute the New Application Points to be developed: NAP = (Application Points)(100 - %reuse)/100.

Step 6: Determine a productivity rate, PROD = NAP/person-month, from the following scheme:

Developer's experience and capability:   Very Low   Low   Nominal   High   Very High
ICASE maturity and capability:           Very Low   Low   Nominal   High   Very High
PROD:                                    4          7     13        25     50

Step 7: Compute the estimated person-months: PM = NAP/PROD.
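The procedure is mechanical once Steps 1-2 are done by hand, so it translates directly into code. A minimal Python sketch using the slide's weight and productivity tables (the element counts in the example are hypothetical):

```python
WEIGHTS = {                      # Step 3: weight per element instance
    "screen": {"simple": 1, "medium": 2, "difficult": 3},
    "report": {"simple": 2, "medium": 5, "difficult": 8},
    "3gl":    {"difficult": 10},        # 3GL components always weigh 10
}
PROD = {"VL": 4, "L": 7, "N": 13, "H": 25, "VH": 50}   # Step 6: NAP per PM

def application_points(counts):
    """Step 4: sum of weighted element instances."""
    return sum(WEIGHTS[kind][cplx] * n
               for (kind, cplx), n in counts.items())

def estimate_pm(counts, reuse_pct, prod_rating):
    ap = application_points(counts)
    nap = ap * (100 - reuse_pct) / 100      # Step 5: New Application Points
    return nap / PROD[prod_rating]          # Step 7: PM = NAP / PROD

counts = {("screen", "simple"): 10, ("screen", "medium"): 5,
          ("report", "medium"): 4, ("3gl", "difficult"): 2}
print(estimate_pm(counts, reuse_pct=20, prod_rating="N"))   # ~3.7 PM
```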
New Scaling Exponent Approach

• Nominal person-months = A*(size)**B
• B = 0.91 + 0.01 × (sum of exponent driver ratings)
  – B ranges from 0.91 to 1.23
  – 5 drivers; 6 rating levels each
• Exponent drivers (sketch below):
  – Precedentedness
  – Development flexibility
  – Architecture/risk resolution
  – Team cohesion
  – Process maturity (derived from SEI CMM)
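A minimal sketch of this computation. The rating-to-weight mapping here is a simplified uniform assumption (5 for Very Low down to 0 for Extra High, which tops out at B = 1.16, whereas the calibrated per-driver weights reach the 1.23 quoted above); A = 2.94 is the published COCOMO II.2000 constant.

```python
SCALE_DRIVERS = ["PREC", "FLEX", "RESL", "TEAM", "PMAT"]
RATING_WEIGHT = {"VL": 5, "L": 4, "N": 3, "H": 2, "VH": 1, "XH": 0}  # simplified

def scaling_exponent(ratings):
    """B = 0.91 + 0.01 * (sum of exponent driver weights)."""
    return 0.91 + 0.01 * sum(RATING_WEIGHT[ratings[d]] for d in SCALE_DRIVERS)

def nominal_person_months(ksloc, ratings, a=2.94):
    """Nominal PM = A * (size)**B."""
    return a * ksloc ** scaling_exponent(ratings)

best = dict.fromkeys(SCALE_DRIVERS, "XH")    # all Extra High -> B = 0.91
worst = dict.fromkeys(SCALE_DRIVERS, "VL")   # all Very Low  -> B = 1.16 here
print(nominal_person_months(100, best), nominal_person_months(100, worst))
```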
Project Scale Factors

PM_estimated = A × (Size)^SF × Π_i EM_i,   where SF = 0.91 + 0.01 × Σ_i w_i

Scale factor rating levels (w_i):

PREC: Very Low = thoroughly unprecedented; Low = largely unprecedented; Nominal = somewhat unprecedented; High = generally familiar; Very High = largely familiar; Extra High = thoroughly familiar

FLEX: Very Low = rigorous; Low = occasional relaxation; Nominal = some relaxation; High = general conformity; Very High = some conformity; Extra High = general goals

RESL: Very Low = little (20%); Low = some (40%); Nominal = often (60%); High = generally (75%); Very High = mostly (90%); Extra High = full (100%)

TEAM: Very Low = very difficult interactions; Low = some difficult interactions; Nominal = basically cooperative interactions; High = largely cooperative; Very High = highly cooperative; Extra High = seamless interactions

PMAT: weighted sum of 18 KPA achievement levels
Reuse and Reengineering Effects

• Add Assessment & Assimilation increment (AA)
  – Similar to conversion planning increment
• Add Software Understanding increment (SU)
  – To cover nonlinear software understanding effects
  – Coupled with software unfamiliarity level (UNFM)
  – Apply only if reused software is modified
• Results in revised Equivalent Source Lines of Code (ESLOC) (sketch below)
  – AAF = 0.4(DM) + 0.3(CM) + 0.3(IM)
  – ESLOC = ASLOC[AA + AAF(1 + 0.02(SU)(UNFM))], AAF ≤ 0.5
  – ESLOC = ASLOC[AA + AAF + (SU)(UNFM)], AAF > 0.5
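A minimal sketch of this adjustment. The slide writes AAF as a fraction; the version below follows the COCOMO II book's percentage convention (DM, CM, IM, AA, AAF, and SU as percentages, UNFM on a 0-1 scale, with a final division by 100), which is an assumption about the slide's shorthand:

```python
def esloc(asloc, aa, su, unfm, dm, cm, im):
    """Equivalent SLOC for reused/modified code (percentage convention)."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im        # Adaptation Adjustment Factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100
    else:
        aam = (aa + aaf + su * unfm) / 100
    return asloc * aam

# 10 KSLOC of reused code, lightly modified (DM=10%, CM=20%, IM=30%),
# moderate understanding penalty (SU=30), half-unfamiliar staff (UNFM=0.5):
print(esloc(10_000, aa=4, su=30, unfm=0.5, dm=10, cm=20, im=30))   # 2870.0
```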
Software Understanding Rating / Increment

Structure:
• Very Low: Very low cohesion, high coupling, spaghetti code.
• Low: Moderately low cohesion, high coupling.
• Nominal: Reasonably well-structured; some weak areas.
• High: High cohesion, low coupling.
• Very High: Strong modularity, information hiding in data/control structures.

Application Clarity:
• Very Low: No match between program and application world views.
• Low: Some correlation between program and application.
• Nominal: Moderate correlation between program and application.
• High: Good correlation between program and application.
• Very High: Clear match between program and application world views.

Self-Descriptiveness:
• Very Low: Obscure code; documentation missing, obscure or obsolete.
• Low: Some code commentary and headers; some useful documentation.
• Nominal: Moderate level of code commentary, headers, documentation.
• High: Good code commentary and headers; useful documentation; some weak areas.
• Very High: Self-descriptive code; documentation up-to-date, well-organized, with design rationale.

SU Increment to ESLOC: Very Low = 50; Low = 40; Nominal = 30; High = 20; Very High = 10
Other Major COCOMO II Changes
• Range versus point estimates
• Requirements Volatility (Evolution) included in Size
• Multiplicative cost driver changes
- Product CD’s
- Platform CD’s
- Personnel CD’s
- Project CD’s
• Maintenance model includes SU, UNFM factors from reuse model
– Applied to subset of legacy code undergoing change
Process Maturity (PMAT) Effects
– Effort reduction per maturity level, 100 KDSI project
– Normalized for effects of other variables

• Clark Ph.D. dissertation (112 projects)
  – Research model: 12-23% per level
  – COCOMO II subsets: 9-29% per level
• COCOMO II.1999 (161 projects)
  – 4-11% per level
• PMAT positive contribution is statistically significant
Other Model Refinements

• Initial Schedule Estimation (sketch below)

  TDEV = [3.67 × (PM_NS)^(0.28 + 0.2(B - 0.91))] × SCED% / 100

  where PM_NS = estimated person-months excluding schedule multiplier effects

• Output Ranges
  – 80% confidence limits: 10% of time each below Optimistic, above Pessimistic
  – Reflect sources of uncertainty in model inputs

  Stage                     Optimistic Estimate   Pessimistic Estimate
  Application Composition   0.50 E                2.0 E
  Early Design              0.67 E                1.5 E
  Post-Architecture         0.80 E                1.25 E
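A small sketch of the schedule equation and output ranges above; PM_NS and B are as defined on the preceding slides, and the example values are hypothetical:

```python
def tdev(pm_ns, b, sced_pct=100):
    """TDEV = [3.67 * PM_NS**(0.28 + 0.2*(B - 0.91))] * SCED%/100, months.
    pm_ns excludes the SCED effort multiplier; sced_pct=100 means nominal."""
    return 3.67 * pm_ns ** (0.28 + 0.2 * (b - 0.91)) * sced_pct / 100

e = 100                        # hypothetical Early Design effort estimate, PM
print(tdev(e, b=1.10))         # most likely schedule: ~15.9 months
print(0.67 * e, 1.5 * e)       # Early Design optimistic/pessimistic effort
```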
Early Design vs. Post-Architecture Effort Multipliers

Early Design Cost Driver              Combined Post-Architecture Cost Drivers
Product Reliability and Complexity    RELY, DATA, CPLX, DOCU
Required Reuse                        RUSE
Platform Difficulty                   TIME, STOR, PVOL
Personnel Capability                  ACAP, PCAP, PCON
Personnel Experience                  AEXP, PEXP, LTEX
Facilities                            TOOL, SITE
Schedule                              SCED
COCOTS Backup Charts
• Development and Life Cycle Models
• Research Highlights Since ARR 2000
• Data Highlights
• New Glue Code Submodel Results
• Next Steps
• Benefits
COCOTS: Development Model

[Figure: Staffing vs. time from LCO (requirements review) through LCA (preliminary design review) to IOC (system delivery). The COCOMO II effort estimate covers new system development not involving COTS components; the COCOTS effort estimate adds (1) COTS assessment, (2) COTS tailoring, (3) glue code development, and (4) system effort due to COTS volatility.]

LCO: Lifecycle Objectives; LCA: Lifecycle Architecture; IOC: Initial Operational Capability
COCOTS: Draft Life-Cycle Model

[Figure: Development from LCO to IOC (COCOMO II plus COCOTS assessment (A), tailoring (T), glue code (GC), and volatility (V)), followed by transition and then maintenance and operations under the COCOMO maintenance model, with repeating COTS refresh cycles (each with assessment, tailoring, glue code, and refresh (R) activities) until retirement of the system. Question marks flag open issues: where cycles start and end, and what effort accompanies each transition (TR).]
Current Insights into Maintenance Phase Issues
Priority of Activities by Effort Involved and/or Criticality
(S = spikes around refresh cycle anchor points; C = continuous)

• Higher
  – training (S, C)
  – configuration management (C)
  – operations support (C)
  – integration analysis (S)
  – requirements management (S, C)
• Medium
  – certification (S)
  – market watch (C)
  – distribution (S)
  – vendor management (C)
  – business case evaluation (S)
• Lower
  – administering COTS licenses (C)
Data Highlights

Median Detailed Assessment Effort by COTS Class (person-months):
• DBMS: 5.75
• GUI: 0.67
• Network managers: 7.17
• Operating systems: 0.50
Data Highlights

Median Tailoring Effort by COTS Class (person-months):
• DBMS: 38.29
• GUI: 14.00
• Network managers: 12.67
• Operating systems: 2.00
New Glue Code Submodel Results

• Current calibration looking reasonably good
  – Excluding projects with very large or very small amounts of glue code (effort prediction accuracy):
    • [0.5 - 100 KLOC]: Pred(.30) = 9/17 = 53%
    • [2 - 100 KLOC]: Pred(.30) = 8/13 = 62%
  – For comparison, calibration results shown at ARR 2000:
    • [0.1 - 390 KLOC]: Pred(.30) = 4/13 = 31%
• Propose to revisit large, small, anomalous projects
  – A few follow-up questions on categories of code & effort
    • Glue code vs. application code
    • Glue code effort vs. other sources
Benefits

• Existing
  – Independent source of estimates
  – Checklist for effort sources
  – (Fairly) easy-to-use development phase tool
• On the Horizon
  – Empirically supported, tightly calibrated, total lifecycle COTS estimation tool
COQUALMO Backup Charts
• Current COQUALMO system
• Defect removal rating scales
• Defect removal estimates
• Multiplicative defect removal model
• Orthogonal Defect Classification (ODC) extensions
Current COQUALMO System

[Diagram: COCOMO II takes the software size estimate and software platform, project, product, and personnel attributes, producing the software development effort, cost, and schedule estimate. COQUALMO's Defect Introduction Model feeds its Defect Removal Model, which also takes defect removal profile levels (automation, reviews, testing) and outputs the number of residual defects and the defect density per unit of size.]
Defect Removal Rating Scales

Automated Analysis:
• Very Low: Simple compiler syntax checking
• Low: Basic compiler capabilities
• Nominal: Compiler extension; basic req. and design consistency
• High: Intermediate-level module; simple req./design
• Very High: More elaborate req./design; basic dist-processing
• Extra High: Formalized specification, verification; advanced dist-processing

Peer Reviews:
• Very Low: No peer review
• Low: Ad-hoc informal walk-through
• Nominal: Well-defined preparation, review, minimal follow-up
• High: Formal review roles, well-trained people, and basic checklist
• Very High: Root cause analysis, formal follow-up using historical data
• Extra High: Extensive review checklist; statistical control

Execution Testing and Tools:
• Very Low: No testing
• Low: Ad-hoc test and debug
• Nominal: Basic test; test criteria based on checklist
• High: Well-defined test seq. and basic test coverage tool system
• Very High: More advanced test tools, preparation; dist-monitoring
• Extra High: Highly advanced tools, model-based test

(COCOMO II, p. 263)
Defect Removal Estimates
– Nominal Defect Introduction Rates

Delivered defects/KSLOC by composite defect removal rating:
VL: 60   Low: 28.5   Nom: 14.3   High: 7.5   VH: 3.5   XH: 1.6
Multiplicative Defect Removal Model
– Example: Code Defects; High Ratings

• Analysis: 0.7 of defects remaining
• Reviews: 0.4 of defects remaining
• Testing: 0.31 of defects remaining
• Together: (0.7)(0.4)(0.31) = 0.09 of defects remaining

• How valid is this?
  – If all activities catch the same defects: 0.31 of defects remaining
  – If they mostly catch different defects: ~0.01 of defects remaining
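The composition above is simple enough to state in a few lines; a sketch, with the slide's bounding cases noted in comments:

```python
from math import prod

def residual_fraction(remaining_per_activity):
    """Multiplicative assumption: activities catch defects independently,
    so the fractions of defects left behind multiply."""
    return prod(remaining_per_activity)

print(residual_fraction([0.7, 0.4, 0.31]))   # 0.0868 -> ~0.09 as on the slide
# Bounding cases: if all activities catch the same defects, the best single
# activity governs (min = 0.31 remains); if they catch mostly different
# defects, far fewer (~0.01) remain.
```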
Example UMD-USC CeBASE Data Comparisons

• "Under specified conditions, …"
• Peer reviews are more effective than functional testing for faults of omission and incorrect specification (UMD, USC)
• Functional testing is more effective than reviews for faults concerning numerical approximations and control flow (UMD, USC)
• Both are about equally effective for results concerning typos, algorithms, and incorrect logic (UMD, USC)
ODC Data Attractive for Extending COQUALMO
– IBM Results (Chillarege, 1996)

[Bar chart: Percent of defects within each activity (design, code review, function test, system test) broken down by ODC defect type (function, assignment, interface, timing), showing that different activities preferentially remove different defect types.]
COQUALMO/ODC Extension Research Approach
• Extend COQUALMO to cover major ODC categories
• Collaborate with industry ODC users- IBM, Motorola underway
- Two more sources being explored
• Obtain first-hand experience on USC digital library projects- Completed IBM ODC training
- Initial front-end data collection and analysis
CORADMO Backup Charts
• Rapid Application Development (RAD) context
• RAD Opportunity Tree and CORADMO schedule drivers
• RAD Capability (RCAP) schedule driver
• Square-root effort-schedule model and RCAP adjustment
RAD Context

• RAD a critical competitive strategy
  – Market window; pace of change
• Non-RAD COCOMO II overestimates RAD schedules
  – Need opportunity-tree cost-schedule adjustment
  – Cube root model inappropriate for small RAD projects
    • COCOMO II: Months ≈ 3.7 × (PM)^(1/3)
RAD Opportunity Tree

• Eliminating Tasks
  – Development process reengineering - DPRS
  – Reusing assets - RVHL
  – Applications generation - RVHL
  – Design-to-schedule - O
• Reducing Time Per Task
  – Tools and automation - O
  – Work streamlining (80-20) - O
  – Increasing parallelism - RESL
• Reducing Risks of Single-Point Failures
  – Reducing failures - RESL
  – Reducing their effects - RESL
• Reducing Backtracking
  – Early error elimination - RESL
  – Process anchor points - RESL
  – Improving process maturity - O
• Activity Network Streamlining
  – Collaboration technology - CLAB
  – Minimizing task dependencies - DPRS
  – Avoiding high fan-in, fan-out - DPRS
  – Reducing task variance - DPRS
  – Removing tasks from critical path - DPRS
• Increasing Effective Workweek
  – 24x7 development - PPOS
  – Nightly builds, testing - PPOS
  – Weekend warriors - PPOS
• Better People and Incentives
  – RAD capability and experience - RCAP
• Transition to Learning Organization

(O: covered by)
RCAP: RAD Capability of Personnel

Rating:      XL     VL     L      N     H      VH     XH
PERS-R:      10%    25%    40%    55%   70%    85%    95%
PREX-R:      2 mo   4 mo   6 mo   1 yr  3 yrs  6 yrs  10 yrs

I, E, C multipliers:
PM:          1.20   1.13   1.06   1.0   .93    .86    .80
M:           1.40   1.25   1.12   1.0   .82    .68    .56
P = PM/M:    .86    .90    .95    1.0   1.13   1.26   1.43

PERS-R is the Early Design Capability rating, adjusted to reflect the performers' capability to rapidly assimilate new concepts and material, and to rapidly adapt to change.
PREX-R is the Early Design Personnel Experience rating, adjusted to reflect the performers' experience with RAD languages, tools, components, and COTS integration.
RCAP Example

• RCAP = Nominal: PM = 25, M = 5, P = 5
  – The square root law (M = sqrt(PM)): 5 people for 5 months: 25 PM
• RCAP = XH: PM = 25 × 0.80 = 20, M = 5 × 0.56 = 2.8, P = 7.1
  – A very good team can put on 7 people and finish in 2.8 months: 20 PM
• RCAP = XL: PM = 25 × 1.20 = 30, M = 5 × 1.40 = 7, P = 4.3
  – Trying to do RAD with an unqualified team makes them less efficient (30 PM) and gets the schedule closer to the cube root law (but not quite: 3 × (30)^(1/3) = 9.3 months > 7 months)
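A sketch reproducing the example's arithmetic from the RCAP multiplier table above; the cube-root comparison uses the 3x variant plotted on the following chart (which also shows the 3.7x COCOMO II rule):

```python
RCAP = {"XL": (1.20, 1.40), "VL": (1.13, 1.25), "L": (1.06, 1.12),
        "N": (1.00, 1.00), "H": (0.93, 0.82), "VH": (0.86, 0.68),
        "XH": (0.80, 0.56)}   # rating -> (PM multiplier, Months multiplier)

def rad_schedule(pm_nominal, rating):
    pm_mult, m_mult = RCAP[rating]
    pm = pm_nominal * pm_mult
    months = (pm_nominal ** 0.5) * m_mult     # square-root law, RCAP-adjusted
    return pm, months, pm / months            # effort, schedule, peak staff

for r in ("N", "XH", "XL"):
    print(r, rad_schedule(25, r))
# N -> (25, 5.0, 5.0); XH -> (20, 2.8, 7.1); XL -> (30, 7.0, 4.3)
print(3 * 30 ** (1 / 3))                      # cube-root law: ~9.3 months
```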
Effect of RCAP on Cost, Schedule

[Chart: Schedule in months (M) vs. effort in person-months (PM, 0-50), showing the 3.7x cube-root, 3x cube-root, and square-root schedule laws, with the RCAP = XL and RCAP = XH cases plotted against them.]
COSYSMO Backup Charts

• Background
• Scope
• Strawman Model
  – Size & complexity
  – Cost & schedule drivers
  – Outputs
• Issues
Background

• Topic of breakout group at October 2000 COCOMO/SCM Forum
• Decided on incremental approach
  – Increment I: front-end costs of information systems engineering
• Coordinating with development of INCOSE-FAA systems engineering maturity data repository
• Also coordinating with Rational sizing metrics effort
COSYSMO Increment I: Scope

• Expand COCOMO II to information system engineering front-end costs
  – Excluding aircraft, printer, etc. system engineering
    • sensors a gray area
  – Excluding Transition effort for now
  – All of Inception and Elaboration effort
  – Construction: Requirements; Deployment; 50% of Design effort
Proposed System Engineering Scope: COCOMO II MBASE/RUP Phase and Activity Distribution

Development:          Inception  Elaboration  Construction  Transition  Total IECT
Rational schedule     10         30           50            10          100
COCOMO II schedule    12.5       37.5         62.5          12.5        125
Rational effort       5          20           65            10          100
COCOMO II effort      6          24           76            12          118

Activity (% of phase / of IECT):
                 Inception  Elaboration  Construction  Transition  Total IECT  Royce/COCOMO II  Total Maint.
(column totals)  100        100          100           100         118         118              100
Management       14         12           10            14          13          12               11
Environment/CM   10         8            5             5           7           12               6
Requirements     38         18           8             4           13          12               12
Design           19         36           16            4           22          18               17
Implementation   8          13           34            19          32          29               24
Assessment       8          10           24            24          24          29               22
Deployment       3          3            3             30          7           6                8
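Combining the Increment I scope (all of Inception and Elaboration, plus Construction's Requirements, Deployment, and half of its Design activity) with the table's COCOMO II effort-by-phase row gives a rough feel for the system-engineering share; the arithmetic below is illustrative, not a calibrated result:

```python
# COCOMO II effort units by phase (from the table: 6, 24, 76, 12; 118 total).
phase_effort = {"inception": 6, "elaboration": 24, "construction": 76}
# Construction activity percentages relevant to the SysE scope.
construction_pct = {"requirements": 8, "design": 16, "deployment": 3}

syse = (phase_effort["inception"]
        + phase_effort["elaboration"]
        + phase_effort["construction"] * (
            construction_pct["requirements"]
            + 0.5 * construction_pct["design"]
            + construction_pct["deployment"]) / 100)
print(syse)   # ~44.4 of the 118 COCOMO II effort units
```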
Strawman COSYSMO

• Sizing model determines nominal COCOMO II SysE effort and schedule
  – Function points/use cases/other for basic effort
  – Tool and document preparation separate (?) as a "source of effort"
  – Factor in volatility and reuse
  – Begin with linear effort scaling with size (?)
• Cost & schedule drivers multiplicatively adjust nominal effort and schedule by phase and source of effort (?)
  – Application factors
  – Team factors
USC Strawman Sizing Model

• Function points, adjusted for complexity
• Use cases, adjusted for complexity
  – flows of events; complexity of interactions
• Other: # rqts.; # threads; # features; # interfaces
• Rqts. volatility factor similar to COCOMO II
• Reuse factor simpler than COCOMO II (TBD)
• Weighting of FP, use case quantities TBD
• Also use pairwise comparison approach for sizing
  – Compare with known systems
• Use COCOMO II CPLX factors for complexity (?)
  – Control, computational, device-dependent, data management, UI operations scales
Evolving Rational Sizing Model

• Objective: Obtain "software mass" for COCOMO engine
• USC "MVC" approach
  – "Model": number of classes of data
  – "View": number of use cases
  – "Control": distribution and algorithm complexity
• Size new application by MVC comparison to similar applications
• Overall, very similar to USC strawman sizing approach
  – Preparing to collaborate via Philippe Kruchten
COSYSMO: Factor Importance Rating

Rate each factor H, M, or L depending on its relatively high, medium, or low influence on system engineering effort. Use an equal number of H's, M's, and L's. (Ratings below pooled from N = 6 respondents; mean scores in parentheses.)

Application Factors
  H     Requirements understanding (3.0)
  M-H   Architecture understanding (2.5)
  L-H   Level of service rqts. criticality, difficulty (2.3)
  L-M   Legacy transition complexity (1.5)
  L-M   COTS assessment complexity (1.7)
  L-H   Platform difficulty (1.7)
  L-M   Required business process reengineering (1.5)
  TBD   Ops. concept understanding (N = H)
  TBD

Team Factors
  L-M   Number and diversity of stakeholder communities (1.5)
  M-H   Stakeholder team cohesion (2.7)
  M-H   Personnel capability/continuity (2.7)
  H     Personnel experience (3.0)
  L-H   Process maturity (2.0)
  L-M   Multisite coordination (1.5)
  L-H   Degree of system engineering ceremony (2.0)
  L-M   Tool support (1.3)
  TBD
  TBD
Strawman Model: Outputs

• Effort & schedule by phase
  – By activity?
  – By source of effort (analysis, prototypes, tools, documents)?
• Risk assessment?
Issues: Suggestions on Improving

• Scope
• Proposed approach
• Model form
• Model elements
• Outputs
• Over/underlaps with COCOMO II, COCOTS, CORADMO
• Sources of data
• Staffing
Further Information

V. Basili, G. Caldiera, and H. Rombach, "The Experience Factory" and "The Goal Question Metric Approach," in J. Marciniak (ed.), Encyclopedia of Software Engineering, Wiley, 1994.

B. Boehm, C. Abts, A.W. Brown, S. Chulani, B. Clark, E. Horowitz, R. Madachy, D. Reifer, and B. Steece, Software Cost Estimation with COCOMO II, Prentice Hall, 2000.

B. Boehm and D. Port, "Escaping the Software Tar Pit: Model Clashes and How to Avoid Them," ACM Software Engineering Notes, January 1999.

B. Boehm et al., "Using the WinWin Spiral Model: A Case Study," IEEE Computer, July 1998, pp. 33-44.

R. van Solingen and E. Berghout, The Goal/Question/Metric Method, McGraw-Hill, 1999.

COCOMO II, MBASE items: http://sunset.usc.edu
CeBASE items: http://www.cebase.org