University of Southern California, Center for Systems and Software Engineering

Future Challenges for Systems and Software Cost Estimation and Measurement

Barry Boehm, USC-CSSE
13th Annual PSM Conference
June 25, 2009

©USC-CSSE
Summary
• Current and future trends create challenges for DoD systems and software data collection and analysis
  – Mission challenges: emergent requirements, rapid change, net-centric systems of systems, COTS and services, high assurance with agility
  – DoD initiatives: DoDI 5000.02, evolutionary acquisition, competitive prototyping, time-certain milestones
• Updated software data definitions and estimation methods could help DoD systems management
  – Examples: incremental and evolutionary development; COTS and services; net-centric systems of systems
  – Further effort and coordination needed to converge on these
  – Being addressed in Brad Clark workshop this afternoon
Current and Future DoD Challenges
• Emergent requirements
  – Cannot prespecify requirements, cost, schedule, EVMS
  – Need to estimate and track early concurrent engineering
• Rapid change
  – Long acquisition cycles breed obsolescence
  – DoDI 5000.02 emphasis on evolutionary acquisition
• Net-centric systems of systems
  – Incomplete visibility and control of elements
• Model, COTS, service-based, Brownfield systems
  – New phenomenology, counting rules
• Always-on, never-fail systems
  – Need to balance agility and high assurance
The Broadening Early Cone of Uncertainty (CU)
• Need greater investments in narrowing CU
  – Mission, investment, legacy analysis
  – Competitive prototyping
  – Concurrent engineering
  – Associated estimation methods and management metrics
• Larger systems will often have subsystems with narrower CU's
[Figure: cone of uncertainty from ConOps through Specs/Plans to IOC; early uncertainty factors widen from about x2 (Local Interactive) through x4 (Local Interactive with Some Legacy; Batch, Greenfield) to x8 (Global Interactive, Brownfield)]
COSYSMO Operational Concept
• Size drivers: # Requirements, # Interfaces, # Scenarios, # Algorithms, plus a Volatility Factor
• Effort multipliers:
  – Application factors (8 factors)
  – Team factors (6 factors)
  – Schedule driver
• Size drivers and effort multipliers feed COSYSMO, with calibration, to produce an effort estimate
• WBS guided by ISO/IEC 15288
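The operational concept above reduces to a standard parametric form: effort = A × (weighted size)² style power law times a product of effort multipliers. The sketch below is a minimal illustration of that shape only; the driver weights and the calibration constants A and E are placeholders, not the published COSYSMO calibration.

```python
# Minimal sketch of a COSYSMO-style parametric estimate.
# Constants A, E and the driver weights are illustrative placeholders,
# not the published COSYSMO calibration values.

def cosysmo_effort(counts, weights, effort_multipliers, A=1.0, E=1.06):
    """Effort = A * (weighted sum of size drivers)^E * product of EMs."""
    size = sum(weights[driver] * n for driver, n in counts.items())
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return A * size ** E * em_product

counts = {"requirements": 120, "interfaces": 15,
          "algorithms": 10, "scenarios": 20}     # hypothetical system
weights = {"requirements": 1.0, "interfaces": 2.0,
           "algorithms": 2.5, "scenarios": 2.5}  # placeholder weights
print(cosysmo_effort(counts, weights, [1.10, 0.87]))  # two sample EMs
```

Note the diseconomy of scale: with E > 1, doubling the weighted size more than doubles estimated effort.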
4. Rate Cost Drivers - Application
COSYSMO Change Impact Analysis - I
Added SysE Effort for Going to 3 Versions
• Size: number, complexity, volatility, reuse of system requirements, interfaces, algorithms, scenarios (elements)
  – 1→3 Versions: add 3-6% per increment for number of elements; add 2-4% per increment for volatility
  – Exercise Prep.: add 3-6% per increment for number of elements; add 3-6% per increment for volatility
• Most significant cost drivers (effort multipliers)
  – Migration complexity: 1.10 - 1.20 (versions)
  – Multisite coordination: 1.10 - 1.20 (versions, exercise prep.)
  – Tool support: 0.75 - 0.87 (due to exercise prep.)
  – Architecture complexity: 1.05 - 1.10 (multiple baselines)
  – Requirements understanding: 1.05 - 1.10 for increments 1, 2; 1.0 for increment 3; 0.9 - 0.95 for increment 4
COSYSMO Change Impact Analysis - II
Added SysE Effort for Going to 3 Versions

Cost Element     Incr. 1      Incr. 2      Incr. 3      Incr. 4
Size             1.11 - 1.22  1.22 - 1.44  1.33 - 1.66  1.44 - 1.88
Effort Product   1.00 - 1.52  1.00 - 1.52  0.96 - 1.38  0.86 - 1.31
Effort Range     1.11 - 1.85  1.22 - 2.19  1.27 - 2.29  1.23 - 2.46
Arithmetic Mean  1.48         1.70         1.78         1.84
Geometric Mean   1.43         1.63         1.71         1.74
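As a cross-check, the Effort Range row is the product of the Size and Effort Product range endpoints, and the two mean rows are the arithmetic and geometric means of each range. The sketch below reproduces the table; small last-digit differences arise from the slide's truncation of intermediate values.

```python
import math

# Recompute the slide's effort ranges and means from the Size and
# Effort Product rows: range endpoints multiply, then each range is
# summarized by its arithmetic and geometric mean.
size = {1: (1.11, 1.22), 2: (1.22, 1.44), 3: (1.33, 1.66), 4: (1.44, 1.88)}
em   = {1: (1.00, 1.52), 2: (1.00, 1.52), 3: (0.96, 1.38), 4: (0.86, 1.31)}

for incr in sorted(size):
    lo = size[incr][0] * em[incr][0]
    hi = size[incr][1] * em[incr][1]
    print(f"Incr. {incr}: effort range {lo:.2f}-{hi:.2f}, "
          f"AM {(lo + hi) / 2:.2f}, GM {math.sqrt(lo * hi):.2f}")
```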
COSYSMO Requirements Counting Challenge
• Estimates made in early stages
  – Relatively few high-level design-to requirements
• Calibration performed on completed projects
  – Relatively many low-level test-to requirements
• Need to know expansion factors between levels
  – Best model: Cockburn definition levels (cloud, kite, sea level, fish, clam)
• Expansion factors vary by application area, size
  – One large company: Magic Number 7
  – Small e-services projects: more like 3:1, fewer lower levels
• Survey form available to capture your experience
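One simple way to apply such expansion factors is to multiply an early, high-level requirements count down through the definition levels. The factors below are placeholders for illustration; as the slide notes, they vary by application area and size (roughly 7x per level at one large company, closer to 3:1 with fewer levels for small e-services projects).

```python
# Hypothetical sketch: expanding an early design-to requirements count
# to an expected low-level test-to count via per-level expansion
# factors. The factor values are placeholders, not calibrated data.

def expand_requirements(high_level_count, per_level_factors):
    count = high_level_count
    for factor in per_level_factors:
        count *= factor
    return count

# 20 sea-level requirements with two levels of ~3x expansion:
print(expand_requirements(20, [3, 3]))  # -> 180
```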
Next-Generation Systems Challenges
• Emergent requirements
  – Example: Virtual global collaboration support systems
  – Need to manage early concurrent engineering
• Rapid change
  – In competitive threats, technology, organizations, environment
• Net-centric systems of systems
  – Incomplete visibility and control of elements
• Model, COTS, service-based, Brownfield systems
  – New phenomenology, counting rules
• Always-on, never-fail systems
  – Need to balance agility and high assurance
Rapid Change Creates a Late Cone of Uncertainty
– Need evolutionary/incremental vs. one-shot development
[Figure: cone of uncertainty that re-widens late in development, driven by uncertainties in competition, technology, organizations, and mission priorities]
Evolutionary Acquisition per New DoDI 5000.02
No clean boundary between R&D and O&M
[Figure: DoDI 5000.02 evolutionary acquisition life cycle with overlapping increments]
Incremental Development Productivity Decline (IDPD)
• Example: Site Defense BMD Software
  – 5 builds, 7 years, $100M; operational and support software
  – Build 1 productivity over 300 LOC/person-month
  – Build 5 productivity under 150 LOC/PM
    • Including Build 1-4 breakage, integration, rework
    • 318% change in requirements across all builds
• IDPD factor = 20% productivity decrease per build
  – Similar trends in later unprecedented systems
  – Not unique to DoD: key source of Windows Vista delays
• Maintenance of full non-COTS SLOC, not ESLOC
  – Build 1: 200 KSLOC new; 200K reused @ 20% = 240K ESLOC
  – Build 2: 400 KSLOC of Build 1 software to maintain, integrate
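The Build 1/Build 2 arithmetic above can be made explicit: reused code is discounted by a reuse cost factor when estimating development effort, but the full SLOC of a delivered increment becomes the base later increments must maintain and integrate. A minimal sketch:

```python
# Sketch of the slide's ESLOC computation: new code counts fully,
# reused code is weighted by a reuse cost factor.
def esloc(new_ksloc, reused_ksloc, reuse_factor):
    return new_ksloc + reused_ksloc * reuse_factor

build1_esloc = esloc(new_ksloc=200, reused_ksloc=200, reuse_factor=0.20)
print(build1_esloc)  # -> 240.0 KESLOC, as on the slide

# Build 2, however, inherits the full 400 KSLOC of Build 1 to
# maintain and integrate, not the 240 KESLOC effort quantity.
build2_maintained_base = 200 + 200  # full SLOC; the ESLOC identity is lost
```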
IDPD Cost Drivers: Conservative 4-Increment Example
• Some savings: more experienced personnel (5-20%)
  – Depending on personnel turnover rates
• Some increases: code base growth, diseconomies of scale, requirements volatility, user requests
  – Breakage, maintenance of full code base (20-40%)
  – Diseconomies of scale in development, integration (10-25%)
  – Requirements volatility; user requests (10-25%)
• Best case: 20% more effort (IDPD = 6%)
• Worst case: 85% (IDPD = 23%)
Effects of IDPD on Number of Increments
• Model relating productivity decline to number of builds needed to reach 8M SLOC Full Operational Capability
• Assumes Build 1 production of 2M SLOC @ 100 SLOC/PM
  – 20,000 PM / 24 mo. = 833 developers
  – Constant staff size for all builds
• Analysis varies the productivity decline per build
  – Extremely important to determine the incremental development productivity decline (IDPD) factor per build
[Figure: cumulative KSLOC (0-20,000) vs. build number (1-8) for 0%, 10%, 15%, and 20% productivity decline per build]
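The chart's model can be sketched directly: with constant staff and a fixed build schedule, each build's output shrinks by the IDPD factor, so higher IDPD means more builds to reach the 8M-SLOC FOC. The geometric per-build output below is an assumption consistent with the slide's description, not necessarily the exact model behind the chart.

```python
# Constant staff: Build 1 delivers 2M SLOC; build k delivers
# 2M * (1 - idpd)**(k-1). Count builds needed to reach 8M SLOC FOC.
def builds_to_foc(idpd, build1_sloc=2_000_000, target=8_000_000,
                  max_builds=50):
    total, builds = 0.0, 0
    while total < target and builds < max_builds:
        total += build1_sloc * (1 - idpd) ** builds
        builds += 1
    return builds

for decline in (0.0, 0.10, 0.15, 0.20):
    print(f"IDPD {decline:.0%}: {builds_to_foc(decline)} builds to 8M SLOC")
# -> 4, 5, 6, and 8 builds, respectively
```

Even a modest per-build decline roughly doubles the number of builds (and calendar time) to FOC, which is why pinning down the IDPD factor matters so much.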
Incremental Development Data Challenges
• Breakage effects on previous increments
  – Modified, added, deleted SLOC: need Code Count with diff tool
• Accounting for breakage effort
  – Charged to current increment or I&T budget (IDPD)
  – "Breakage ESLOC" added to next increment
• IDPD effects may differ by type of software
• Hard to track phase and activity distributions
  – Hard to spread initial requirements and architecture effort
• Size and effort reporting
  – Often reported cumulatively
  – Subtracting previous increment size may miss deleted code
• Time-certain development
  – Which features completed? (Fully? Partly? Deferred?)
"Equivalent SLOC" Paradoxes
• Not a measure of software size
• Not a measure of software effort
• Not a measure of delivered software capability
• A quantity derived from software component sizes and reuse factors that helps estimate effort
• Once a product or increment is developed, its ESLOC loses its identity
  – Its size expands into full SLOC
  – Can apply reuse factors to this to determine an ESLOC quantity for the next increment
  – But this has no relation to the product's size
Current and Future DoD Challenges
• Emergent requirements
  – Cannot prespecify requirements, cost, schedule, EVMS
  – Need to estimate and track early concurrent engineering
• Rapid change
  – Long acquisition cycles breed obsolescence
  – DoDI 5000.02 emphasis on evolutionary acquisition
• Net-centric systems of systems
  – Incomplete visibility and control of elements
• Model, COTS, service-based, Brownfield systems
  – New phenomenology, counting rules
• Always-on, never-fail systems
  – Need to balance agility and high assurance
Net-Centric Systems of Systems Challenges
• Need for rapid adaptation to change
  – See first, understand first, act first, finish decisively
• Built-in authority-responsibility mismatches
  – Increasing as authority decreases through Directed, Acknowledged, Collaborative, and Virtual SoS classes
• Severe diseconomies of scale
  – Weak early architecture and risk resolution
  – Need thorough flowdown/up of estimates, actuals
  – More complex integration and test preparation, execution
• More software-intensive
  – Best to use parallel software WBS
• Many different classes of system elements
  – One-size-fits-all cost models a poor fit
Added Cost of Weak Architecting
Calibration of COCOMO II Architecture and Risk Resolution factor to 161 project data points
[Figure: added effort vs. degree of architecture and risk resolution, from the COCOMO II calibration]
Model, COTS, Service-Based, Brownfield Systems
New phenomenology, counting rules
• Product generation from model directives
  – Treat as very high level language: count directives
• Sizing COTS and services use needs improvement
  – Unrealistic to use COTS, services SLOC for sizing
  – Alternatives: function point elements, amount of glue code, activity-based assessment costing, tailoring parameters
• Brownfield legacy constraints, re-engineering
  – Re-engineer legacy code to fit new architecture
  – Apply reuse model for re-engineering
• A common framework for reuse, incremental development, maintenance, legacy re-engineering?
  – All involve reusing, modifying, deleting existing software
Achieving Agility and High Assurance - I
Using timeboxed or time-certain development
Precise costing unnecessary; feature set as dependent variable
[Figure: foreseeable change (plan) drives short, stabilized development of Increment N from the Increment N baseline through transition/O&M; rapid change is absorbed by short development increments, high assurance by stable development increments]
Achieving Agility and High Assurance - II
[Figure: unforeseeable change (adapt) drives agile rebaselining for future increments, producing future increment baselines and deferrals; foreseeable change (plan) drives short, stabilized development of Increment N from its baseline through transition/operations and maintenance; continuous verification and validation (V&V) of Increment N passes artifacts and concerns between current and future V&V resources, preserving high assurance under rapid change]
Related Additional Measurement Challenges
• Tracking progress of rebaselining, V&V teams
  – No global plans; individual changes or software drops
  – Earlier test preparation: surrogates, scenarios, testbeds
• Tracking content of time-certain increments
  – Deferred or partial capabilities; effects across system
• Trend analysis of emerging risks
  – INCOSE Leading Indicators; SERC Effectiveness Measures
• Contributions to systems effectiveness
  – Measures of Effectiveness models, parameters
• Systems of systems progress, risk, change tracking
  – Consistent measurement flow-up, flow-down, flow-across
Software data definition topics for discussion
In Brad Clark workshop this afternoon
• Ways to treat data elements
  – COTS, other OTS (open source; services; GOTS; reuse; legacy code)
  – Other size units (function points, object points, use case points, etc.)
  – Generated code: counting generator directives
  – Requirements volatility
  – Rolling up CSCIs into systems and systems of systems
  – Cost model inputs and outputs (e.g., submitting estimate files)
• Scope issues
  – Cost drivers, scale factors
  – Reuse parameters: Software Understanding, Programmer Unfamiliarity
  – Phases included: hardware-software integration; systems of systems integration, transition, maintenance
  – WBS elements and labor categories included
  – Parallel software WBS
• How to involve various stakeholders
  – Government, industry, commercial cost estimation organizations
Summary
• Current and future trends create challenges for DoD systems and software data collection and analysis
  – Mission challenges: emergent requirements, rapid change, net-centric systems of systems, COTS and services, high assurance with agility
  – DoD initiatives: DoDI 5000.02, evolutionary acquisition, competitive prototyping, time-certain milestones
• Updated software data definitions and estimation methods could help DoD systems management
  – Examples: incremental and evolutionary development; COTS and services; net-centric systems of systems
  – Further effort and coordination needed to converge on these
  – Being addressed in Brad Clark workshop this afternoon
References
Boehm, B., "Some Future Trends and Implications for Systems and Software Engineering Processes," Systems Engineering 9(1), pp. 1-19, 2006.
Boehm, B., and Lane, J., "21st Century Processes for Acquiring 21st Century Software-Intensive Systems of Systems," CrossTalk, Vol. 19, No. 5, pp. 4-9, 2006.
Boehm, B., and Lane, J., "Using the ICM to Integrate System Acquisition, Systems Engineering, and Software Engineering," CrossTalk, October 2007, pp. 4-9.
Boehm, B., Brown, A.W., Clark, B., Madachy, R., Reifer, D., et al., Software Cost Estimation with COCOMO II, Prentice Hall, 2000.
Dahmann, J., "Systems of Systems Challenges for Systems Engineering," Systems and Software Technology Conference, June 2007.
Department of Defense (DoD), Defense Acquisition Guidebook, version 1.6, http://akss.dau.mil/dag/, 2006.
Department of Defense (DoD), Instruction 5000.2, Operation of the Defense Acquisition System, May 2003.
Department of Defense (DoD), Systems Engineering Plan Preparation Guide, USD(AT&L), 2004.
Galorath, D., and Evans, M., Software Sizing, Estimation, and Risk Management, Auerbach, 2006.
Lane, J., and Boehm, B., "Modern Tools to Support DoD Software-Intensive System of Systems Cost Estimation," DACS State of the Art Report, also Tech Report USC-CSSE-2007-716.
Lane, J., and Valerdi, R., "Synthesizing System-of-Systems Concepts for Use in Cost Modeling," Systems Engineering, Vol. 10, No. 4, December 2007.
Madachy, R., "Cost Model Comparison," Proceedings, 21st COCOMO/SCM Forum, November 2006, http://csse.usc.edu/events/2006/CIIForum/pages/program.html
Maier, M., "Architecting Principles for Systems-of-Systems," Systems Engineering, Vol. 1, No. 4, pp. 267-284.
Northrop, L., et al., Ultra-Large-Scale Systems: The Software Challenge of the Future, Software Engineering Institute, 2006.
Reifer, D., "Let the Numbers Do the Talking," CrossTalk, March 2002, pp. 4-8.
Valerdi, R., Systems Engineering Cost Estimation with COSYSMO, Wiley, 2009 (to appear).
Backup Charts
How Much Architecting is Enough?
– Larger projects need more
[Figure: percent of time added for architecture and risk resolution (0-60%) vs. percent added to overall schedule, for 10, 100, and 10,000 KSLOC projects; total % added schedule = percent of project schedule devoted to initial architecture and risk resolution plus added schedule devoted to rework (COCOMO II RESL factor); each curve has a sweet spot, furthest right for the largest projects]
• Sweet Spot Drivers:
  – Rapid Change: leftward
  – High Assurance: rightward
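The sweet-spot trade can be illustrated with a toy model: total added schedule is architecting investment plus rework, where rework decays as investment grows and larger projects start with more rework exposure. The decay curve and constants below are illustrative assumptions, not the calibrated COCOMO II RESL relationship.

```python
# Toy sweet-spot model: total added schedule (%) = architecting
# investment plus rework; rework decays exponentially with investment.
# rework_at_zero and decay are illustrative, not COCOMO II values.

def total_added_schedule(arch_pct, rework_at_zero, decay=0.06):
    rework_pct = rework_at_zero * (1 - decay) ** arch_pct
    return arch_pct + rework_pct

def sweet_spot(rework_at_zero, decay=0.06):
    # Minimize total added schedule over 0-60% architecting investment.
    return min(range(61),
               key=lambda x: total_added_schedule(x, rework_at_zero, decay))

# Larger rework exposure (a bigger project) pushes the sweet spot right:
print(sweet_spot(rework_at_zero=30), sweet_spot(rework_at_zero=100))
```

Rapid change effectively raises the cost of up-front investment (moving the sweet spot left), while high assurance raises the cost of rework (moving it right), matching the slide's sweet-spot drivers.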
TRW/COCOMO II Experience Factory: IV
[Figure: closed-loop flowchart: system objectives (functionality, performance, quality) and corporate parameters (tools, processes, reuse) feed COCOMO II, producing milestones, plans, and resources; if cost, schedule, and risks are OK, the project executes to the next milestone, otherwise milestones are revised or the system is rescoped; milestone results are evaluated against expectations, accumulated cost, schedule, and quality-driver data recalibrate COCOMO II, and improved corporate parameters feed corporate software improvement strategies until done]
Choosing and Costing Incremental Development Forms

Type: Evolutionary Sequential
  Examples: Small: Agile; Large: Evolutionary Development
  Pros: Adaptability to change
  Cons: Easiest-first; late, costly breakage
  Cost Estimation: Small: Planning-poker-type; Large: Parametric with IDPD

Type: Prespecified Sequential
  Examples: Platform base plus PPPIs
  Pros: Prespecifiable full-capability requirements
  Cons: Emergent requirements or rapid change
  Cost Estimation: COINCOMO with no increment overlap

Type: Overlapped Evolutionary
  Examples: Product lines with ultrafast change
  Pros: Modular product line
  Cons: Cross-increment breakage
  Cost Estimation: Parametric with IDPD and Requirements Volatility

Type: Rebaselining Evolutionary
  Examples: Mainstream product lines; Systems of systems
  Pros: High assurance with rapid change
  Cons: Highly coupled systems with very rapid change
  Cost Estimation: COINCOMO, IDPD for development; COSYSMO for rebaselining

IDPD: Incremental Development Productivity Decline, due to earlier increments' breakage and increasing code base to integrate
PPPIs: Pre-Planned Product Improvements
COINCOMO: COCOMO Incremental Development Model (COCOMO II book, Appendix B)
COSYSMO: Systems Engineering Cost Model (in-process COSYSMO book)
All Cost Estimation approaches also include expert-judgment cross-check.
Compositional approaches: Directed systems of systems
[Figure: timeline from Inception/Elaboration (source selection, SoS architecting) through Increment 1 and Increments 2..n, with LCO, LCA, and IOC1 anchor points; swim lanes for customer/users, agile LSI and LSI IPTs, agile suppliers, and LSI integrators; source-selection and architecting effort is COSYSMO-like with schedule = effort/staff; rebaselined LCA packages at all levels, proposal evaluations, compatibility/shortfall risks, and slow-performer and completeness risk management with rework are COSOSIMO-like; develop-to-spec plus V&V effort is CORADMO-like; later increments add change traffic from users and rebaselining risks and rework]
Comparison of Cost Model Parameters

Parameter Aspects: Size drivers
  COSYSMO: # of system requirements; # of system interfaces; # operational scenarios; # algorithms
  COSOSIMO: # of SoS requirements; # of SoS interface protocols; # of constituent systems; # of constituent system organizations; # operational scenarios

Parameter Aspects: "Product" characteristics
  COSYSMO: Size/complexity; Requirements understanding; Architecture understanding; Level of service requirements; # of recursive levels in design; Migration complexity; Technology risk; #/diversity of platforms/installations; Level of documentation
  COSOSIMO: Size/complexity; Requirements understanding; Architecture understanding; Level of service requirements; Component system maturity and stability; Component system readiness; Migration complexity

Parameter Aspects: Process characteristics
  COSYSMO: Process capability; Multi-site coordination; Tool support
  COSOSIMO: Maturity of processes; Tool support; Cost/schedule compatibility; SoS risk resolution

Parameter Aspects: People characteristics
  COSYSMO: Stakeholder team cohesion; Personnel/team capability; Personnel experience/continuity
  COSOSIMO: Stakeholder team cohesion; SoS team capability
SoSE Core Element Mapping to COSOSIMO Sub-models
[Figure: SoSE core elements map to three COSOSIMO sub-models: translating capability objectives, understanding systems and relationships (includes plans), developing/evolving and maintaining the SoS design/architecture, and addressing new requirements and options map to Planning, Requirements Management, and Architecting (PRA); orchestrating upgrades to the SoS maps to Source Selection and Supplier Oversight (SO); monitoring and assessing changes and assessing actual performance against capability objectives map to SoS Integration and Testing (I&T)]