WG 1
Product S&T
DoD Software Summit, 9 Aug 01
Bill Mark, Gregor Kiczales, Mike Evangelist, Helen Gigley, Linda Northrop, Kevin Sullivan, LiGuo Huang, Dan Port, Don Reifer, Bill Scherlis
Report outline
• Recommendations
• Defense software: role and need
• Case study: FCS
• Product attributes: challenges and potential
  – Interoperability, Dependability, Evolvability
• Contributing technologies: selected examples
• Technology evaluation issues
• Phased strategy (TBD)
Recommendations
1. Initiate a multi-year S&T program to advance the software science and technology base, particularly for system of systems
  – Address DoD-critical needs, and exploit dual-use opportunities
  – Focus on the tech base, and involve the user/integrator base
  – Problem-driven and strategic; mid- to long-horizon
  – Structure as a program suite at aggregate funding of $50M/yr
2. Create virtual laboratories for research and experimentation at scale, to accelerate transition by providing early validation
  – Multi-stakeholder participation
  – Enable work across the lifecycle
  – Realism and reality transfer: scale, complexity, (sanitized) artifacts
  – Evaluate technologies, artifacts, designs, operational concepts, methods, theories, tools
  – Document value and impact
3. Employ more aggressive models for co-evolving operational concept and engineering design, in order to develop the most aggressive possible solutions
  – Conceptualization / demonstration / evaluation
  – Rapid coupled iteration; cf. JWID, CPOF double-helix
Software wins wars
• The 21st-century defense environment: asymmetry, coalition, rapid response
• Organizational structure is more complex and changing more rapidly
  – Need for rapid, robust adaptation: short time scale, high consequences
  – E.g., Patriot, EW, ECM
  – Need to validate quickly: "Sorry, Dave, I can't do this upgrade"
  – Increasing requirements uncertainty, evolution, and volatility
• Defense computing is at the core
  – Huge diversity of system elements
  – Operates in an adversarial environment
• Software is the key engineering medium
  – Embodies decision capability, algorithmic content, flexibility/malleability, …
  – Focus of capability: a principal competitive differentiator
  – For the soldier: software as force multiplier
• Not realizing its full potential
Defense software challenges
• Will commercial industry do this for DoD?
  – Technology providers are not fully addressing DoD-critical needs
  – DoD does have critical needs
• Will defense integrators solve these problems for DoD?
  – Software interventions occur at all stages of the acquisition process
  – Integrators depend on vendors and other technology providers
• Will the software S&T community deliver the technology?
  – The DoD software S&T community is dispersing
  – Military challenges must once again become a dominant driver
  – Other programs are not effectively addressing these challenges
• DoD is too small a player to drive key elements of the IT food chain purely through demand
• Bottom line:
  – DoD needs to build on the rapidly evolving commercial base
  – DoD needs to stimulate technology providers to innovate in its mission-critical areas, leveraging the market where possible
  – DoD needs to reestablish its software S&T community
FCS Concept: Is this right?
• Operations
  – Accelerate the SA / Plan / Execute / Replan cycle
  – Accelerate power projection
  – Empower the commander
• Performance
  – Reliable, robust, lightweight
  – Enter-once principle
• Acquisition
  – Rapidly and safely incorporate new capability
  – Improve affordability and flexibility
  – Provide a common basis for interoperability
System of systems
• Key technology drivers: Interoperability, Dependability, Evolvability
• Attributes discussed for each: definition, inhibitors, strategy, impact
Interoperability – definition
• Elements of interoperation
  – Composability: safe and predictable interconnection (a small interface sketch follows below)
  – Frameworks: safe, predictable, and scalable interconnection (COM, CORBA, Beans, EJB, HLA, etc.)
  – Content sharing: information linking
  – Common UI elements: cut & paste
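As a small, hypothetical illustration of composability as "safe and predictable interconnection," the sketch below has two independently developed components interoperate through one explicitly shared interface contract; frameworks such as CORBA or EJB enforce such contracts at much larger scale. The component and track names are invented.

```python
"""Minimal sketch: composability via an explicit, shared interface contract."""
from typing import Iterable, Protocol, Tuple


class TrackSource(Protocol):
    """The contract both sides agree on: a track source yields (id, lat, lon) tuples."""
    def tracks(self) -> Iterable[Tuple[str, float, float]]: ...


class RadarFeed:
    """One component, developed independently, that satisfies the contract."""
    def tracks(self):
        yield ("R-001", 36.1, -115.2)


class SituationDisplay:
    """Another component that composes with *any* TrackSource, not one specific feed."""
    def __init__(self, source: TrackSource):
        self.source = source

    def render(self):
        for track_id, lat, lon in self.source.tracks():
            print(f"plotting {track_id} at ({lat}, {lon})")


SituationDisplay(RadarFeed()).render()
```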
Interoperability – inhibitors
• Increasing scale, distribution, etc.
  – Versions, configurations
  – Framework constituents
• Shared domain semantics
• End-to-end requirements (e.g., real-time)
• Use of OTS elements
  – Rate of change
  – Assurance, framework compliance
Interoperability – strategy
• Develop an improved taxonomy of interoperability criteria
• Select areas for developmental investigation
  – E.g., field maneuvers across services
  – Conduct aggressive experiments to drive problem identification
• Develop basic metrics to understand the effect on operations
  – Adopt subjective assessment where measures are lacking
  – Apply current knowledge to achieve several levels of interoperability
  – Analyze what is required to go from the current state to the required state
• Develop theory and technologies to address the criteria
Interoperability – impact
• Value to the mission
  – Common information base: enter once; diff, reconciliation, fusion; increased situation awareness
  – Right decision, faster
  – Removed process costs
  – Assurance about the aggregate
• Value to the PM
  – Cross-cutting issues addressed: interdependence, jointness
  – Limited PM risk exposure
Dependability – definition
• External dependability
  – Availability: five 9s and beyond
  – Security: prevention, detection, tolerance/healing
  – Safety
  – Capability
  – Human usage
• Internal dependability
  – Framework robustness
  – Component compliance for component aggregation
  – Robust service interfaces: pre/post conditions, security, code safety, performance, deadlines, etc.
  – Model compliance: AOSD, UML, code safety and concurrency, …, domain
Dependability – inhibitors
• Scale: environment, distribution, number of components, subsystem complexity, heterogeneity
• Integration into SWE process/tools
  – E.g., ongoing measures, robustness testing, …
  – Assurance integration
  – Aspect consistency management
• Technology
  – Appropriate analysis and formal methods
  – Models, aspects, and support tools
  – Design principles
• Measures and ROI
  – Empirical validation: measures at scale
  – Analytical validation at scale
• Researcher access: at-scale artifacts, dependability failure data
Dependability – strategy
1. Enablers/levers
  – Measures
  – Dependability assurance: co-evolution of code, models, argumentation
  – Assurance technology: analysis / assertion / typing / certification
  – Component certification: robustness testing
  – Human error models
  – Component SWE technologies: composability, compositionality
  – Framework technology
2. Principles
  – Incrementality
  – Adoptability
  – Integration into lifecycle management: development process, practices, tools, evaluation, etc.
3. Actions
  – Technology: self-stability, robustness testing, lightweight formal methods, etc.
  – Testbed experimentation
Dependability – impact
• "Information armor"
  – Reliable, predictable warfighting capability
  – Confident adaptation and change
  – Improved reliability, security
• Less "ility" constraint on capability
• More predictable system costs
Evolvability – definition
• Includes
  – Enhancement: functional, performance, structural, integration/interconnection
  – Repair
  – Dependability improvement: tolerance, robustness
  – Etc.
• Encompassing
  – Design for evolution: ride commercial growth curves, rapid adaptation, rapid exploration, creating tomorrow's legacy (avoid a software rust belt)
  – Legacy management: program understanding, information capture
Evolvability – inhibitors
• Inadequate knowledge early in the life-cycle of innovative systems to make appropriate architectural choices (and inadequate architectures, in any case)
• Imperfect understanding of the current/future environment
• Lack of centralized control over system design
• Intractability of legacy software systems
• Lack of complete knowledge of design and dependencies, and consequent difficulty of assessing impacts of change
• Lack of present rewards for investments in future benefits
• Failure to consider the importance of corresponding "wetware"
• Affordability: costliness of revalidation
Evolvability – strategy
• Technical approaches
  – Theories, tools, and practices of design and evolution without total central control (including COTS)
  – Models and tools for discovering and managing dependencies across "administrative domains" (a change-impact sketch follows below)
  – Generative methods: raising the abstraction level
  – Self-protecting systems: resist bad upgrades
  – Legacy analysis and evolution support systems
  – Qualitative and quantitative evolution models
• Principles of operation
  – Aim to make fundamental advances, but in a way that can be evaluated year to year through short-, mid-, and long-term results
  – Enable "valuation" and "monetization" of evolvability
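As a minimal, hypothetical sketch of the dependency-management approach above, the snippet below computes the transitive set of components that may need revalidation when one component changes; the component names and dependency edges are invented for illustration.

```python
"""Sketch: change-impact analysis over a recorded dependency model."""
from collections import deque
from typing import Set

# component -> components that depend on it (edges may cross administrative domains)
dependents = {
    "crypto_lib":    ["msg_router", "auth_service"],
    "msg_router":    ["c2_display", "logistics_app"],
    "auth_service":  ["c2_display"],
    "c2_display":    [],
    "logistics_app": [],
}


def impact_of_change(changed: str) -> Set[str]:
    """Transitive closure of dependents: everything that may need revalidation."""
    impacted, frontier = set(), deque([changed])
    while frontier:
        current = frontier.popleft()
        for dep in dependents.get(current, []):
            if dep not in impacted:
                impacted.add(dep)
                frontier.append(dep)
    return impacted


print(sorted(impact_of_change("crypto_lib")))
# ['auth_service', 'c2_display', 'logistics_app', 'msg_router']
```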
Evolvability – impact
• Improved ability to meet evolving threats
• Substantially reduced lifecycle costs
• Enable emergence of systems of systems (including legacy/evolving components)
• Improved ability to meet user needs
Technical ideas
• A sample of technical ideas that contribute to these goals
  – Aspect-oriented software design
  – Value-based design approaches
  – Architecture-based performance prediction
  – Chains of evidence for incremental assurance
  – Self-stabilizing systems
• Description attributes
  – Definition
  – Approaches to demonstration and evaluation
  – Maturity of ideas and the community supporting them
  – Impact and value to the mission
  – Points of intervention in the software lifecycle
Technical ideas
• Idea: Aspect-oriented software development (AOSD)
  – Modularize crosscutting concerns (the next modularity step after OO)
  – Manage complexity through localization of aspects (a sketch follows below)
• Evaluation
  – Can large systems be developed and maintained more effectively, with higher quality, etc.?
  – Drive through design, coding, validation, and other practices
• Maturity
  – Xerox, UBC, IBM, NEU, MIT, CMU, VUB (Brussels), …
  – 10 workshops in the last 4 years; conference in 2002
• Impact/outcomes
  – Composability at the next levels of scale/dynamicity, …
• Intervention
  – Architecture, design/implementation, evolution, assurance
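As a minimal illustration of localizing a crosscutting concern, the sketch below approximates AOSD's pointcut/advice idea in plain Python; real AOSD tools such as AspectJ weave aspects into the program rather than wrapping methods at class-definition time. The class, method, and concern names are hypothetical.

```python
"""Sketch: a crosscutting audit-logging concern declared once, not scattered."""
import functools
import re


def audit_advice(fn):
    """Advice: log entry and exit around a join point (a method call)."""
    @functools.wraps(fn)
    def wrapper(self, *args, **kwargs):
        print(f"[audit] entering {type(self).__name__}.{fn.__name__}")
        result = fn(self, *args, **kwargs)
        print(f"[audit] leaving  {type(self).__name__}.{fn.__name__}")
        return result
    return wrapper


def aspect(pointcut_pattern, advice):
    """Class decorator: apply `advice` to every method whose name matches the pointcut."""
    pointcut = re.compile(pointcut_pattern)

    def weave(cls):
        for name, member in list(vars(cls).items()):
            if callable(member) and pointcut.match(name):
                setattr(cls, name, advice(member))
        return cls
    return weave


@aspect(r"(update|fire)_.*", audit_advice)   # crosscutting concern localized here
class TrackManager:
    def update_track(self, track_id):
        print(f"updating track {track_id}")

    def fire_solution(self, track_id):
        print(f"computing fire solution for {track_id}")


if __name__ == "__main__":
    tm = TrackManager()
    tm.update_track(42)
    tm.fire_solution(42)
```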
Technical ideas
• Idea: Value-based design approaches for software
  – Design for value added: products, processes, portfolios (a valuation sketch follows below)
• Evaluation
  – Sensible theory, tools, and practices; empirical evaluation
• Maturity
  – U Va, USC, CMU, U Wash, SEI, NRC Canada, UCSD
  – Workshops (EDSER)
• Impact
  – Significantly increased economic efficiency
  – Value-added construction of much more complex systems
• Intervention
  – Requirements, architecture, evolution, lifecycle
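As a minimal, hypothetical sketch of designing for value added, the snippet below ranks candidate design investments by expected net value; the options, costs, benefits, and success probabilities are invented, and actual value-based methods use considerably richer models.

```python
"""Sketch: rank candidate design investments by expected value added."""
from dataclasses import dataclass


@dataclass
class Option:
    name: str
    cost: float        # development cost ($M)
    benefit: float     # benefit if successful ($M)
    p_success: float   # probability the benefit is realized


def expected_net_value(opt: Option) -> float:
    """Expected value added = p(success) * benefit - cost."""
    return opt.p_success * opt.benefit - opt.cost


options = [
    Option("refactor for evolvability", cost=2.0, benefit=9.0, p_success=0.7),
    Option("add feature X now",         cost=1.5, benefit=3.0, p_success=0.9),
    Option("defer, keep legacy design", cost=0.0, benefit=0.0, p_success=1.0),
]

# Highest expected net value first: the portfolio view a value-based approach argues for.
for opt in sorted(options, key=expected_net_value, reverse=True):
    print(f"{opt.name:30s} expected net value = {expected_net_value(opt):5.2f} $M")
```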
Technical ideas
• Idea: Architecture-based analytical performance modeling
  – Build on queuing theory, etc. (a worked example follows below)
• Evaluation
  – Compare measured performance with predicted performance
• Maturity
  – SEI, U Va, UIUC, MIT, Ohio St, U Texas
• Impact
  – No performance surprises at system integration
  – No performance surprises when the system is overloaded
• Intervention
  – Design, test, upgrade (prediction)
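As a worked illustration of the queuing-theory basis, the sketch below uses the basic M/M/1 result W = 1 / (mu - lambda) to predict mean response time for one hypothetical architectural component as offered load approaches capacity; the service rate and load levels are illustrative, and a real architecture model would compose many such elements.

```python
"""Sketch: predicting response time from an M/M/1 model of one component."""


def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    """Mean time in system for an M/M/1 queue: W = 1 / (mu - lambda), lambda < mu."""
    if arrival_rate >= service_rate:
        raise ValueError("saturated: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)


# Suppose one component can serve 100 requests/s (mu); predict behavior under load.
service_rate = 100.0
for arrival_rate in (50.0, 80.0, 90.0, 95.0, 99.0):
    utilization = arrival_rate / service_rate
    w = mm1_response_time(arrival_rate, service_rate)
    print(f"load {arrival_rate:5.1f} req/s  utilization {utilization:4.0%}  "
          f"predicted mean response {w * 1000:6.1f} ms")
```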
Technical ideas
• Idea: Chains of evidence for incremental assurance
  – Code safety: exceptions, arrays, encapsulation, concurrency, etc.
  – Maintain the assurance argument and the code simultaneously (a sketch follows below)
• Evaluation
  – Case studies at scale
• Maturity
  – Compaq (ESC/Java), MIT/UVa (LCLint), CMU, U Brussels
• Impact
  – Increments of effort yield increments of value in constructing the assurance rationale
• Intervention
  – Design/development, evolution
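As a minimal sketch of keeping a fragment of the assurance argument next to the code it covers, the snippet below approximates the idea with runtime-checked pre/postconditions; tools such as ESC/Java and LCLint check comparable annotations without running the code. The function and data are hypothetical.

```python
"""Sketch: assurance evidence (pre/postconditions) co-located with the code."""
from typing import List


def safe_window_average(samples: List[float], start: int, width: int) -> float:
    # Precondition (part of the assurance chain): the window lies within bounds,
    # so the indexing below cannot raise and no exceptional path is reachable.
    assert width > 0, "window must be non-empty"
    assert 0 <= start and start + width <= len(samples), "window out of bounds"

    total = 0.0
    for i in range(start, start + width):
        total += samples[i]          # safe: in-bounds by the precondition
    result = total / width           # safe: width > 0 by the precondition

    # Postcondition: the average of the window lies between its min and max.
    window = samples[start:start + width]
    assert min(window) <= result <= max(window)
    return result


if __name__ == "__main__":
    print(safe_window_average([1.0, 2.0, 4.0, 8.0], start=1, width=2))
```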
Technical ideas
• Idea: Self-stabilizing systems (a sketch follows below)
• Evaluation
  – Empirical/analytic comparison with standard ad hoc approaches
• Maturity
  – U Texas, Ohio St, IBM, Technion
  – Workshops
• Impact
  – Improvement over standard fault-tolerance approaches
  – Coherent extension of SSS to load balancing, security, etc.
• Intervention
  – Design
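One classic construction behind this idea is Dijkstra's K-state token ring; the sketch below simulates it from an arbitrary, possibly corrupted starting configuration and shows the number of privileges (tokens) converging to one. The ring size, K, and the random central-daemon scheduler are illustrative, not tied to any specific system named here.

```python
"""Sketch: Dijkstra's K-state self-stabilizing token ring."""
import random


def privileges(states):
    """Indices of processes currently holding a privilege (token)."""
    n = len(states)
    privs = [0] if states[0] == states[n - 1] else []
    privs += [i for i in range(1, n) if states[i] != states[i - 1]]
    return privs


def fire(states, i, k):
    """Move of a privileged process under Dijkstra's K-state rule."""
    if i == 0:
        states[0] = (states[0] + 1) % k      # bottom machine increments
    else:
        states[i] = states[i - 1]            # others copy their predecessor


def simulate(n=5, k=6, steps=30, seed=7):
    rng = random.Random(seed)
    states = [rng.randrange(k) for _ in range(n)]   # arbitrary / faulty start state
    for t in range(steps):
        privs = privileges(states)
        print(f"t={t:2d}  states={states}  tokens={len(privs)}")
        fire(states, rng.choice(privs), k)          # central-daemon scheduler


simulate()
```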
Additional technical ideas
• Appropriate formal methods
  – Proof-carrying code
• Frameworks and patterns
  – Theory and practice
  – Evaluation of components for compliance
  – Evaluation of frameworks
• Product architecture, human organization, evolution, and dynamics
  – Complex adaptive systems
  – Understanding/predicting emergent behaviors
• Model-based systems architecting and software engineering
  – Avoiding model clashes through explicit model integration
  – Engineering through model views
The role of scientific approaches to technology evaluation
• Benefits
  – Early validation: do we understand the requirements? Does the operational concept match the technology?
  – Prototype early: "10k users before publishing"; the IETF model
  – Rapid iteration in technology development
• Caveats
  – Diversity of stakeholders and value metrics
  – Light under the lamppost: not all needs have measures; indeed, few needs have useful measures (dependability? evolvability? interoperability?)
  – Capability to measure/evaluate lags capability to produce
  – Internal measures are often lacking
  – Incorporate strategic/subjective value in program management
Technology insertion
• Points of technology intervention occur throughout the lifecycle
  – Major product "-ilities" (interoperability, evolvability, dependability) must be addressed through combinations of process, tools, information management, models, etc.
• Technology intervention is being facilitated through incremental approaches to evaluation and adoption
  – These are now emerging for some SWE capabilities
  – Reduce costs/risks of adoption
• Improving evaluation is important
  – Relatively few evaluation criteria have useful measures
  – But keep the caveats in mind; they are significant
• Major changes in the tech base and in the marketplace are enabling new approaches to long-standing challenges
  – Early prototyping and rapid iteration
  – Collaboration and information sharing