Software Acquisition

Execution Phase

How to use this site

Each page in this pathway presents curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources, and provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

Directly quoted material is preceded by a link to its Reference Source.

Reference Source: DODI 5000.87 Section 3.3

 

The purpose of this phase is to rapidly and iteratively design, develop, integrate, test, deliver, and operate resilient and reliable software capabilities that meet the users’ priority needs.

 

Programs will assemble software architecture, infrastructure, services, pipelines, development and test platforms, and related resources from enterprise services and development contracts. Leveraging existing services will be preferred over acquiring new ones, to the extent consistent with the program's acquisition strategy and IP strategy.

 

Programs will maximize use of automated software testing and security accreditation, continuous integration and continuous delivery of software capabilities, and frequent user feedback and engagement. Programs will consider the program’s lifecycle objectives and actively manage technical debt. Programs will use modern, iterative software practices to continuously improve software quality (e.g., iteratively refactor design and code, reduce cybersecurity vulnerabilities, and create effective modular open systems approaches to support future capabilities). Programs using the embedded software path will align test and integration with the overarching system testing and delivery schedules.
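
As a hedged illustration of what an automated gate in such a continuous-integration pipeline might look like, the sketch below runs a sequence of placeholder stages and blocks promotion on the first failure. The stage commands (pytest, bandit) and the src/ path are assumptions for illustration, not tools or layouts mandated by policy.

    """A minimal sketch of an automated delivery gate, assuming a hypothetical
    pipeline where each stage must pass before software is promoted."""
    import subprocess
    import sys

    # Ordered pipeline stages: name -> command. Each command is a placeholder;
    # a real program would substitute its own test, scan, and packaging steps.
    STAGES = [
        ("unit tests", [sys.executable, "-m", "pytest", "-q"]),
        ("static security scan", [sys.executable, "-m", "bandit", "-r", "src/"]),
    ]

    def run_pipeline() -> bool:
        """Run each stage in order; stop at the first failure so defects
        are caught before the build is promoted toward delivery."""
        for name, cmd in STAGES:
            result = subprocess.run(cmd)
            if result.returncode != 0:
                print(f"FAILED: {name} -- blocking promotion")
                return False
            print(f"passed: {name}")
        return True

    if __name__ == "__main__":
        sys.exit(0 if run_pipeline() else 1)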

 

The sponsor and program office will develop and maintain a product roadmap to plan regular and iterative deliveries of software capabilities. The product owner and program office will also develop and maintain program backlogs that identify detailed user needs in prioritized lists. The backlogs allow for dynamic reallocation of current and planned software releases. Issues, errors, threats, and defects identified during development and operations, including software updates from third parties or suppliers, should be captured in the program's backlogs to address in future iterations and releases. Regular stakeholder feedback and inputs will shape the product roadmap and program backlogs.
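
A minimal sketch of how such a program backlog with dynamic reprioritization might be modeled; the item types, field names, and priority scheme are illustrative assumptions, not structures prescribed by DODI 5000.87.

    """A sketch of an in-memory program backlog with dynamic reprioritization."""
    from dataclasses import dataclass
    from enum import Enum

    class ItemType(Enum):
        FEATURE = "feature"              # new user need
        DEFECT = "defect"                # error found in development or operations
        VULNERABILITY = "vulnerability"  # cybersecurity finding
        THIRD_PARTY = "third_party"      # supplier or third-party software update

    @dataclass
    class BacklogItem:
        title: str
        item_type: ItemType
        priority: int  # lower number = higher priority

    class Backlog:
        def __init__(self) -> None:
            self._items: list[BacklogItem] = []

        def add(self, item: BacklogItem) -> None:
            self._items.append(item)

        def reprioritize(self, title: str, new_priority: int) -> None:
            """Dynamic reallocation: stakeholder feedback can move items
            between planned releases by changing their priority."""
            for item in self._items:
                if item.title == title:
                    item.priority = new_priority

        def next_release_candidates(self, count: int) -> list[BacklogItem]:
            """The top-priority items feed the next iteration or release."""
            return sorted(self._items, key=lambda i: i.priority)[:count]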

 

The PM and the sponsor will use an iterative, human-centered design process to define the minimum viable product (MVP), recognizing that an MVP's definition may evolve as user needs become better understood. Insights from MVPs help shape scope, requirements, and design.

 

The PM and the sponsor will use an iterative, human-centered design process to define a minimum viable capability release (MVCR) if the MVP does not have sufficient capability or performance to deploy into operations. The MVCR delivers initial warfighting capabilities to enhance mission outcomes. The MVCR for applications programs must be deployed to an operational environment within 1 year after the date on which funds are first obligated to acquire or develop new software capability, including appropriate operational test. If the MVP version of the software is determined sufficient to be fielded for operational use, the MVP will become the MVCR.
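
A minimal sketch of tracking the 1-year MVCR clock described above; the 365-day approximation of "within 1 year" and the sample obligation date are assumptions for illustration only.

    """Sketch of the statutory MVCR deployment window for applications programs."""
    from datetime import date, timedelta

    def mvcr_deadline(funds_first_obligated: date) -> date:
        # Applications programs must deploy the MVCR to an operational
        # environment within 1 year of first obligation of funds;
        # 365 days is used here as a simple approximation of that year.
        return funds_first_obligated + timedelta(days=365)

    if __name__ == "__main__":
        obligated = date(2024, 10, 1)  # illustrative date only
        print("MVCR must deploy by:", mvcr_deadline(obligated))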

 

  • Subsequent capability releases will be delivered at least annually. Software updates to address cybersecurity vulnerabilities will be released in a timely manner, potentially outside the regular release cycle as needed, per the program's risk-based lifecycle management approach.
  • Programs should deploy embedded software upgrades at least annually to an environment (e.g., development, staging, or operations) consistent with the overarching weapon system testing delivery strategy.

 

Programs will continuously improve or refine software development processes, practices, and tools, and update program strategies to reflect them. They should employ small empowered teams and scale larger efforts across multiple teams. This includes integrating and aligning efforts across government and software development organizations. Continuous user feedback and self-assessments help balance investments between short-term capability deliveries and longer-term enduring solutions.

 

Software development testing, government developmental testing, and operational testing will be integrated, streamlined, and automated to the maximum extent possible to accelerate delivery timelines based on risk strategies. Automated test scripts and test results will be made available to the test community so that critical verification functions (e.g., performance, reliability) and validation functions (e.g., effectiveness, suitability, and survivability) can be assessed iteratively and incrementally.
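
One hedged way to make automated results available to the test community is to publish a machine-readable summary alongside each run, as sketched below; the JSON fields and output file name are illustrative assumptions, not an official reporting format.

    """Sketch of exporting automated test results for the test community."""
    import json
    import unittest

    class SmokeTests(unittest.TestCase):
        def test_arithmetic(self):
            self.assertEqual(2 + 2, 4)

    if __name__ == "__main__":
        suite = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTests)
        result = unittest.TextTestRunner(verbosity=0).run(suite)
        # Publish a summary so verification (performance, reliability) and
        # validation reviews can consume results iteratively.
        summary = {
            "tests_run": result.testsRun,
            "failures": len(result.failures),
            "errors": len(result.errors),
            "passed": result.wasSuccessful(),
        }
        with open("test_results.json", "w") as fh:
            json.dump(summary, fh, indent=2)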

 

Automated cyber testing and continuous monitoring of operational software will be designed and implemented to support a continuous authorization to operate (cATO) or an accelerated accreditation process to the maximum extent practicable, and will be augmented with additional testing where appropriate, in accordance with cybersecurity policies and in coordination with the assigned authorizing official. All safety-critical software standards and guidance apply to programs using the software acquisition pathway. Programs will implement recurring cybersecurity assessments of the development environment, processes, and tools.
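
A minimal sketch of one recurring development-environment check consistent with this guidance: verifying that build tools still match known-good SHA-256 digests recorded at accreditation time. The file path and pinned digest below are hypothetical placeholders.

    """Sketch of a recurring integrity check on development-environment tools."""
    import hashlib
    from pathlib import Path

    # Known-good digests recorded at accreditation time (illustrative values;
    # the path below is a placeholder and must exist for verify() to run).
    PINNED = {
        "tools/builder.bin":
            "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def verify(path: str, expected: str) -> bool:
        digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
        return digest == expected

    def recurring_assessment() -> list[str]:
        """Return findings for any artifact whose digest has drifted,
        feeding the program's continuous-monitoring evidence."""
        return [p for p, h in PINNED.items() if not verify(p, h)]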

 

Cybersecurity and software assurance will be integral to strategies, designs, development environment, processes, supply chain, architectures, enterprise services, tests, and operations. Continuous and automated cybersecurity and cyber threat testing will identify vulnerabilities to help ensure software resilience throughout the lifecycle. PMs will work with stakeholders to provide sufficient controls to enable a cATO where appropriate. Ensuring software security includes:

  • Secure development (e.g., development environment, vetted personnel, coding, test, identity and access management, and supply chain risk management).
  • Cybersecurity and assurance capabilities (e.g., software updates and patching, encryption, runtime monitoring, and logging).
  • Secure lifecycle management (e.g., vulnerability management, rigorous and persistent cybersecurity testing, and configuration control).

IP considerations will be tracked and managed, and the IP strategy continuously updated accordingly, throughout the execution phase. For example, any changes to the planned use of government-funded and privately funded modules or components should be reflected in the required listings of asserted restrictions, and the inspection and acceptance of deliverables should include verification that any markings are consistent with the anticipated restrictive markings (e.g., both conforming and justified).

 

Each program will develop and track a set of metrics to assess and manage the performance, progress, speed, cybersecurity, and quality of the software development, its development teams, and its ability to meet users' needs. Metrics collection will leverage automated tools to the maximum extent practicable. The program will continue to update its cost estimates and cost and software data reporting from the planning phase throughout the execution phase.
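
A minimal sketch of automated metrics collection, assuming release records that pair commit and deployment timestamps; the chosen metrics (average lead time, deployment frequency) and the sample data are illustrative, not a mandated set.

    """Sketch of computing delivery-speed metrics from release records."""
    from datetime import datetime, timedelta

    releases = [  # hypothetical release records
        {"committed": datetime(2025, 1, 2), "deployed": datetime(2025, 1, 9)},
        {"committed": datetime(2025, 2, 1), "deployed": datetime(2025, 2, 6)},
    ]

    # Lead time: elapsed time from commit to deployment for each release.
    lead_times = [r["deployed"] - r["committed"] for r in releases]
    avg_lead = sum(lead_times, timedelta()) / len(lead_times)

    # Deployment frequency: releases per month over the observed span.
    span_days = (releases[-1]["deployed"] - releases[0]["deployed"]).days
    deploys_per_month = len(releases) / max(span_days / 30, 1)

    print(f"average lead time: {avg_lead.days} days")
    print(f"deployment frequency: {deploys_per_month:.1f}/month")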

 

The sponsor and user community will perform a value assessment at least annually on the software delivered. The sponsor will provide feedback on whether the mission improvements or efficiencies realized from the delivered software capabilities are timely and worth the investment. The feedback should be informed by test and evaluation results. The DA, sponsor, and PM will use the value assessments to assess program progress, update strategies and designs, and inform resourcing decisions.

 

The PM will iteratively develop and verify technical training materials that are synchronized with software deliveries throughout the software development lifecycle. The PM will deliver training materials that ensure receiving users and military units can be trained to the appropriate level of proficiency and readiness to successfully execute the individual and collective tasks necessary to accomplish the mission supported by the software. The PM will deliver the technical operator and maintainer manuals required to operate and maintain the system. Digital delivery of software manuals and automated training will be allowed and preferred. Software manuals and automated training should be iteratively improved and delivered with each new release of software capabilities.

Information Required During the Execution Phase

Reference Source: USD(A&S) Guidance

Information | Statutory/Regulatory | Reference
System Architecture | Regulatory | DODI 5000.87
Product Roadmap or equivalent | Regulatory | DODI 5000.87
Program Backlog or equivalent | Regulatory | DODI 5000.87
Periodic updates to strategies (acquisition, contracting, test, cybersecurity, IP, product support) | Statutory/Regulatory | DODI 5000.87
Periodic updates to cost estimate and CARD | Regulatory | DODI 5000.87
Value Assessment (at least annually) | Regulatory | DODI 5000.87
Clinger-Cohen Act (CCA) Compliance (see CCA table) | Statutory | DODI 5000.82; Subtitle III of Title 40
Core Logistics Determination (CLD)/Core Logistics and Sustaining Workloads Estimate (CLSWE) | Statutory for programs with "software maintenance" | 10 USC 2464
Post Implementation Review | Statutory | 40 USC 11313; DODI 5000.82
Program Metrics | Regulatory | DODI 5000.87
Semi-Annual Data Reporting to OUSD(A&S) | Regulatory | DODI 5000.87
Contractor Cost Data Report ($100M+ contracts) | Regulatory | DODI 5000.73
Software Resources Data Report ($100M+ programs) | Regulatory | DODI 5000.73
Technical Data Report ($100M+ programs) | Regulatory | DODI 5000.73
Contractor Business Data Report ($250M+ contractors) | Regulatory | DODI 5000.73
Maintenance and Repair Parts Data Report ($100M+ programs) | Regulatory | DODI 5000.73

Under certain scenarios, some statutory information may also be required, for example:

  • Benefit Analysis and Determination (Part of ACQ Strat)
  • Consideration of Technology Issues (Part of ACQ Strat)
  • Contract-type Determination (Part of ACQ Strat)
  • Cooperative Opportunities (Part of ACQ Strat)
  • Cybersecurity Strategy (required by 5000.87 regardless)
  • DOT&E Report on Initial Operational Test and Evaluation (IOT&E)
  • Independent Cost Estimate (required by 5000.87 regardless)
  • IP Strategy (Part of ACQ Strat)
  • Market Research (Part of ACQ Strat)
  • Operational Test Plan (OTP)
