Adapting Operational Test to Rapid Acquisition Programs
Mr. Dave Beyrodt, Mr. Dave Bisaillon, Mr. Jacob Warren, Mr. Dan Telford, Lt Stu Corbett
11 Apr 2019
Overview
• Rapid Acquisition Terminology (5 min) − Mr. Dan Telford (AFOTEC)
• AFOTEC perspective on rapid acquisition (15 min) − Mr. Dan Telford, Lt Stu Corbett
• MCOTEA perspective on rapid acquisition (15 min) − Mr. Jacob Warren
• COTF perspective on rapid acquisition (15 min) − Mr. Dave Beyrodt, Mr. Dave Bisaillon
• ATEC perspective on rapid acquisition (15 min)
• Panel Q&A session (25 min) − Mr. Beyrodt, Mr. Bisaillon, Mr. Warren, Mr. Telford, Lt Corbett
Rapid Acquisition Terminology
Daniel Telford
AFOTEC/A-2/9
11 Apr 2019
Purpose
• Clarify terminology
• Differentiate between acquisition (traditional, 804) programs and Agile (software) development

Overview
• DoDI 5000.02 traditional and rapid acquisitions
• NDAA Section 804 Middle Tier rapid prototyping/fielding acquisitions
• Agile (software) development
DoDI 5000.02 Traditional Acquisition (aka Waterfall)

• Follows rules defined by DoDI 5000.02, with six basic models
• May be accelerated when schedule is dominant (Model 4)
• May have competitive prototypes to support down-select decisions
• May include an agile software development approach

Model: Traditional (ACAT I MDAPs, ACAT II/III); timeline: > 5 yrs
Guidance: DoDI 5000.02; USC Title 10 (139, 2399, 2366)
Decisions: Traditional MS A, B, C; LRIP/FRP/FDD/IOC
Note: Model 4 is concurrent production/development, with early fielding of LRIP articles prior to IOT&E

Flow: Requirements and CONOPS → Development and DT → Operational Test → Field and Sustain
Middle Tier Acquisitions (aka 804s)
• NDAA 16, Section 804 provides authority
  − Rapidly prototype and/or rapidly field capabilities under a new pathway
  − Distinct from the traditional acquisition system
• USD(A&S), the DoD Comptroller, and the VCJCS shall establish guidance for a middle tier of acquisition programs
• Interim authority from USD(A&S) to implement Section 804, 16 Apr 18
  − Start policy development NLT 1 Jan 19
  − Direction to the services to establish processes/procedures by 30 Sep 19
Section 804 Rapid Prototyping and Fielding

Middle Tier Acquisition: "Rapid Acquisition" of "Proven Technologies" (Rapid Fielding)
• Timeline: 2 to 5 yrs; can be any ACAT equivalent
• Guidance: NDAA 16 Section 804 (PL 114-92); USD(A&S) memo, 16 Apr 18
• Decisions:
  − Requirement definition (non-JCIDS)
  − Production decision w/in 6 months of requirement definition
  − Operational performance demo/evaluation (timing TBD)
  − Production/fielding complete w/in 5 yrs of requirement definition

Middle Tier Acquisition: "Rapid Prototyping and Fielding" of "Innovative Technologies" (Rapid Prototyping)
• Timeline: 2 to 5 yrs; can be any ACAT equivalent
• Decisions:
  − Requirement definition (non-JCIDS)
  − Prototype production/quantity decision, ASAP
  − Fielding decision following "demo of fieldable prototypes in an ops environment" w/in 5 yrs of the defined requirement
  − Follow-on transition decision to proceed to the rapid acquisition pathway or the traditional acquisition process

Flow: Approved Requirements → Rapid Prototyping → (804 Rapid Fielding, or 5000.02 Acquisition) → Sustain
Agile is an Approach

• FY18 NDAA (H.R. 2810) defines the term "Agile Acquisition"
  − "acquisition using agile or iterative development"
• Agile or iterative development, with respect to software:
  − …acquisition pursuant to a methodology for delivering multiple, rapid, incremental capabilities to the user for operational use, evaluation, and feedback; and
  − …incremental development and fielding of capabilities, commonly called spirals, spins, or sprints, which can be measured in a few weeks or months; and
  − …continuous participation and collaboration by users, testers, and requirements authorities
• Taken from commercial practices for software-based product development
  − Continuous collaboration between operations and development
  − Continuous integration, delivery, and deployment
• Programs titled "Agile" are part of other acquisition models
• Numerous programs identify as agile acquisitions; they may be:
  − DoDI 5000.02 acquisitions, such as Model 3, Incrementally Deployed Software Intensive Systems (e.g., DCGS)
  − NDAA Section 804 (e.g., PTES)
Agile Terminology

• Agile: a set of principles for software development under which requirements and solutions evolve through the collaborative effort of self-organizing, cross-functional teams
• Common terms: Agile Release Train (ART), backlog, capability, continuous development, DevOps, epic, iteration, Minimum Viable Product (MVP), Minimum Operational Product (MOP), Program Increment (PI), release, retrospective, scrum, spiral, sprint, Sustainment Readiness Gate (SRG), use cases, user stories, value stream, …
• Start with Scaled Agile Framework (SAFe) definitions
• Program offices are modifying and adding new terms, so review within program context
Summary
• Acquisition terms can have different meanings and are often misused
• Agile terms are varied and often misused
• Different rules apply depending on the actual authority
• We still have limited experience; we're all learning as we go
Questions
Adapting Operational Test to Rapid Acquisition Programs
AFOTEC Examples
Daniel Telford
1st Lt Stuart Corbett
AFOTEC/A-2/9
11 Apr 2019
AFOTEC Adaptive Relevant Testing (AART)
Key Principles
1. Early OT involvement
2. Tailor to the situation
3. Continuous feedback
4. Streamline processes & products
5. Integrate & synchronize test
6. Adaptive

[Diagram: AART ties Section 804 and agile SW development to the SAF/AQ response, "speed of relevance" and "faster and smarter," through an enhanced/integrated partnership among the acquisition, contractor, DT, OT, and user communities]
Steps for Designing a Test
0. Define the product and processes to be tested (the SUT and the COIs/operations)
1. State the purpose or goals of the test
2. Select the response variables
3. Choose the factors, levels, and ranges (conditions)
4. Choose the test design (test event matrix, or TEM)
5. Perform the test (test execution)
6. Analyze the data
7. Draw conclusions and recommendations (test report)
Current AFOTEC Efforts
• Advanced Pilot Trainer (APT)
• F-35 Continuous Capability Development & Delivery (C2D2)
• B-2 Defensive Management System Modernization (DMS)
• B-21 Raider
• Protected Tactical Enterprise Service (PTES)
• Nuclear Planning and Execution System (NPES)
• Kessel Run (KR) Air Operations Center (AOC)
• Integrated Strategic Planning and Analysis Network (ISPAN)
• Distributed Common Ground System (DCGS)
• …
APT
• OT strategy (original): "How well can it train?"
  − Focus areas: transfer of skills, A/C & GBTS mission tasks, reliability, syllabus availability, supportability
• OT strategy (changed): "How well can it accomplish training (mission) tasks?"
  − Focus areas: transfer of skills, A/C mission tasks, GBTS mission tasks, reliability, availability, supportability
APT AART
• Scope of test: changed focus to capabilities
• Test design
  − Individual capability (operations) designs; characterize
  − Targets IOT&E; complete the set early, but keep it adaptable
• Measures: established early, completed as the test progresses
• Reporting: periodic/event-driven reports, a synchronized test phase report, and the IOT&E report

Abbreviations: OM, observation memo; STU, synchronized test update; OTRR, OT readiness review; OA, operational assessment
NPES
[Timeline chart, FY19 Q1 through FY22 Q4, showing acquisition phases, milestones, and deliveries: the Alpha Phase (continuous prototyping) delivers Releases 1–8, each with a combined DT/OT period, building through the Minimum Viable Product (MVP) to the Minimum Operational Product (MOP); the Beta Phase (continuous fielding) delivers Releases 9–16, each with combined DT/OT, closing with a PIG and dedicated OT&E]

• Anticipate first capabilities accepted and fielded by USSTRATCOM at MOP
Scope-Measure Pyramid

• Mission scope → desired effects
• Capability scope → system attributes
• Feature/function scope → sub-system / component / module performance
NPES AART
[Timeline, FY19 through FY22: Releases R1–R16, each built from sprints (Sprint 1 through Sprint 4 shown for R1), with key OT touchpoints running from a Master OT Concept and Master OT Plan through the Minimum Operational Product (MOP) to the Sustainment Readiness Gate (SRG)]

Key OT touchpoints:
• Sprint involvement: assessment during planning meetings at the start of each sprint
• Sprint review at the end of each sprint (written feedback if relevant)
• Observation plan, and an observation report (memo) at the end of each release
• MOP Report at the MOP; Final SRG Report at the SRG
DCGS Program Predictability Required

• Must synchronize planning effectiveness, predictability, and resource economies of scale by:
  − Synchronizing and fixing deployment (release/fielding) schedules
  − Synchronizing deployment locations on a fixed schedule
  − Slipping a deployment into the next release when slips occur
• Value streams (notional alignment on a 14-week cycle): Infrastructure, GEOINT, SIGINT, MultiINT
• When a slip is required, the deployment moves to the next release cycle
• Schedule-driven synchronized releases enable planning effectiveness, predictability, and resource economies
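The slip rule can be sketched in Python: releases fall on a fixed 14-week cadence, and a deployment that misses its release waits for the next cycle rather than moving the train. All dates and function names below are illustrative, not part of the DCGS process.

```python
from datetime import date, timedelta

CYCLE_WEEKS = 14  # fixed release cadence from the briefing

def release_calendar(first_release, count):
    """Fixed, synchronized release dates on a 14-week cycle."""
    return [first_release + timedelta(weeks=CYCLE_WEEKS * i) for i in range(count)]

def deployment_release(ready, releases):
    """A deployment rides the first release on or after its ready date;
    missing a release means slipping one full cycle, never delaying the train."""
    for r in releases:
        if ready <= r:
            return r
    raise ValueError("ready date falls after the last planned release")

# Illustrative dates only
releases = release_calendar(date(2019, 1, 7), 4)
on_time = deployment_release(date(2019, 1, 2), releases)
slipped = deployment_release(date(2019, 1, 10), releases)  # slips one full cycle
```

The design choice the slide describes is exactly this: the calendar is fixed and content moves, rather than content being fixed and the calendar moving.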
AFOTEC DCGS Agile Response
• AFOTEC executed a DCGS process improvement effort:
  − Eliminated numerous products (documents)
  − Modified document formats (plans/reports, RALOT analysis)
  − Significantly cut internal and external coordination timelines
  − Delegated some decision making to lower levels
  − Prioritized resources to support the DCGS fielding tempo
  − Projected a rigorous 2-week OUE for each release to schedule assets
  − Proved much of the AFOTEC approach on the first DCGS HA OUE
• A rigorous 2-week OUE for each release manages warfighter risk
Fixed OT&E Dates and Duration: Challenges

• The DCGS Agile Pathfinder offers four OT&E periods a year
• OT&E duration from test start to report is 5 weeks
  − 14 calendar days for execution
  − 10 calendar days for analysis & reporting
  − 9 calendar days for reviews and approval
• Platforms, sensors, and ranges are difficult to secure every quarter; mitigations (when aligned):
  − Open-air exercises and unit training events
  − Sensor modeling, simulation, and emulation
  − Mission playbacks; real-world missions and shadowing of operational missions
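The duration breakdown can be checked with a small Python sketch. The phase durations come from the briefing; the start date is an illustrative assumption. Note the three phases sum to 33 calendar days, just under the quoted 5 weeks.

```python
from datetime import date, timedelta

# Phase durations in calendar days, as listed in the briefing
PHASES = [("execution", 14), ("analysis & reporting", 10), ("reviews & approval", 9)]

def otae_milestones(test_start):
    """Return (phase name, end date) for each OT&E phase after a test start."""
    milestones, current = [], test_start
    for name, days in PHASES:
        current += timedelta(days=days)
        milestones.append((name, current))
    return milestones

total_days = sum(days for _, days in PHASES)  # 33 days, just under 5 weeks

# Illustrative start date only
report_date = otae_milestones(date(2019, 4, 1))[-1][1]
```

With four OT&E periods a year, a fixed 33-day pipeline leaves roughly eight weeks of each quarter for pre-test planning and asset scheduling.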
Summary
• How we test (meet the challenge) depends on:
  − How the capability is developed
  − The level of conflict (test)
  − The resources and schedule to execute
• Implications for test design and measurement:
  − What are the test design goal(s)?
  − What test designs are "adequate" for the goal?
  − When are test designs developed?
  − What measures are developed, and when?
Marine Corps Operational Test and Evaluation Activity
Adapting Operational Test to Rapid-Acquisition Programs
Agenda
• MCOTEA's Mission
• Rapid Background
• Marine Corps Rapid Capabilities Office
• What it means for MCOTEA
MCOTEA’s Mission
MCOTEA provides operational testing and evaluation for the Marine Corps and conducts additional testing and evaluation as required to support the Marine Corps mission to man, train, equip, and sustain a force in readiness.
• We exist because Marine Corps decision makers need information that is independent, objective, and most importantly . . . defensible for critical resource and acquisition decisions.
• Don’t Guess
Rapid Background
• The Marine Corps has not designated any Middle Tier Acquisition (MTA) programs
  − MTA is authorized by Section 804 of the 2016 NDAA
  − MTA is not subject to JCIDS or DoDI 5000.02
• SECNAVINST 5000.42 established the Rapid Prototyping, Experimentation and Demonstration (RPED) process
  − The preferred path to find solutions for fleet or force priority needs for which a suitable materiel solution cannot be readily identified
  − RPED rapidly trials solutions and assesses performance and operational utility
  − RPED conducts a demonstration and user evaluation as quickly as feasible
Marine Corps Rapid Capability Office
• In October 2016, the Commandant of the Marine Corps (CMC) directed the formation of the Marine Corps Rapid Capabilities Office (MCRCO) to:
  − Accelerate the identification, development, and assessment of emergent and disruptive technology
    • Emergent: newer, cheaper, evolutionary
    • Disruptive: replaces existing technology, revolutionary
  − Provide operational assessments that inform requirement development and investment planning
• The MCRCO has no program-of-record responsibility, but rather limited authority to provide capability to equip forces for operational assessments
Marine Corps Rapid Capability Office
• Identifies select prototypes demonstrating military utility and affordability
• Matches operational needs to emergent technologies
• Equips Marine Corps units for operational assessment
• MCRCO work culminates in a Capability Assessment Report (CAR), which informs the decision to:
  − Pursue the technology as a Program of Record (POR), accelerated or deliberate
  − Return the technology to the Marine Corps Warfighting Lab (MCWL) Science and Technology for further development
  − Stop pursuit altogether
MCRCO Process Flow
What this means for MCOTEA
• Nirvana
  − Gets OTAs involved early and upfront
  − Helps establish requirements and drive initial testing toward the metrics that really mean something to the warfighter
  − We become part of the team, where before we weren't, so we add value: whatever we're doing, we're helping make it better than it was
What this means for MCOTEA
• Customer-based approach
  − What's our level of involvement?
  − What do we deliver?
• Full test vs. demonstration (minimum viable product)
  − Provide a product with just enough features to learn how the system aligns with user needs, and provide feedback for future product development
  − Manage expectations
MCOTEA Six-step Process
1. Evaluation Planning
2. Concept Planning
3. Detailed Planning
4. Event Execution
5. Data Reporting
6. Evaluation Reporting
What this means for MCOTEA
• Rapid is a good way to build experience
• Move fast, but don't sacrifice too much quality
  − Experienced, well-trained people
  − Conflict resolution
  − Minimize rework
• Make our organization more flexible, more agile
  − "MCRCO has the necessary resources and processes to enable it to effectively execute 12 innovative projects per year by the end of FY 2020"
Programs
• TEMSOS – Tactical Electro-Magnetic Spectrum Operations and Support
  − RCO-sponsored project to enhance small-unit situational awareness, the capability to communicate, provide EA, etc.
  − Full large-scale test
• TFT – Total Force Translator
  − RCO-sponsored project to provide automatic two-way translation capabilities in an operational setting
  − Full small-scale test
• LEON – Littoral Explosive Ordnance Neutralizer
  − RCO-sponsored project to improve Marines' ability to search for, detect, locate, and confirm explosive hazards and obstacles in the littoral zone
  − Small-scale demonstration of various platforms
• SPS – Shooting Performance System
  − Special-interest project, non-RCO: a WFTBn Parris Island effort to decrease the number of Marines failing to qualify with their combat rifle, close a training "gap," aid coaches in training their shooters, and create more lethal Marines
  − < 2 months from contact to execution
• LMMG – Lightweight Medium Machine Gun
• ITIW – Integrated Tactical Information Warfare
Questions
Jacob Warren
jacob.j.warren@usmc.mil
Visit our website at https://hqmc.usmc.afpims.mil/Agencies/MCOTEA.aspx
BACKUP SLIDES
Recent Guidance
• Per SECNAVINST 5000.2F, 26 Mar 2019:
  − DoN T&E/OPNAV N94 will publish a SECNAVINST 5000.2TE instruction that will apply to all USN/USMC ACAT programs, the various accelerated and rapid acquisition programs, Non-Developmental Items (NDI), and Commercial Off-the-Shelf (COTS) items
  − QRAs are abbreviated OT&E events in support of the DoN accelerated/rapid acquisition process; there is no assessment of effectiveness or suitability
Backup
Rapid Prototyping, Experimentation and Demonstration (RPED): An approach to fast-track the development, fielding, and assessment of prototypes to: demonstrate solutions to capability needs; inform concepts of operations and requirements development; and inform acquisition and resource planning, and, if necessary, provide limited fielding until a formal program can be established and a full system acquired. RPED initiatives are not formal acquisition programs, but may be used to support urgent needs processes in cases where a suitable materiel solution cannot be identified through the Service's urgent need process.
Mission Based Test Design
11 April 2019
Mr. Dave Beyrodt
COTF Director for Test Design
UNCLASSIFIED
Mission Based Test Design
Questions