Software Acquisition


Execution Phase

How To Use This Site

Each page in this pathway presents curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources. It also provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

DoD and Service policy is indicated by a BLUE vertical line.

Directly quoted material is preceded with a link to the Reference Source.

Reference Source: DODI 5000.87 Section 3.3


The purpose of this phase is to rapidly and iteratively design, develop, integrate, test, deliver, and operate resilient and reliable software capabilities that meet the users’ priority needs.


Programs will assemble software architecture, infrastructure, services, pipelines, development and test platforms, and related resources from enterprise services and development contracts. Leveraging existing enterprise services will be preferred over acquiring new services, to the extent consistent with the program acquisition strategy and IP strategy.


Programs will maximize use of automated software testing and security accreditation, continuous integration and continuous delivery of software capabilities, and frequent user feedback and engagement. Programs will consider the program’s lifecycle objectives and actively manage technical debt. Programs will use modern, iterative software practices to continuously improve software quality (e.g., iteratively refactor design and code, reduce cybersecurity vulnerabilities, and create effective modular open systems approaches to support future capabilities). Programs using the embedded software path will align test and integration with the overarching system testing and delivery schedules.


The sponsor and program office will develop and maintain a product roadmap to plan regular and iterative deliveries of software capabilities. The product owner and program office will also develop and maintain program backlogs that identify detailed user needs in prioritized lists. The backlogs allow for dynamic reallocation of current and planned software releases. Issues, errors, threats, and defects identified during development and operations, including software updates from third parties or suppliers, should be captured in the program’s backlogs to address in future iterations and releases. Regular stakeholder feedback and inputs will shape the product roadmap and program backlogs.
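The backlog mechanics described above can be sketched in code. This is an illustrative sketch only; the `BacklogItem` fields and the sample entries are hypothetical, not a prescribed data model.

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class BacklogItem:
    """One prioritized backlog entry (illustrative fields, not a DoD schema)."""
    priority: int                       # lower number = higher priority
    title: str = field(compare=False)
    source: str = field(compare=False)  # e.g. "user need", "defect", "third-party update"

# Issues and defects found in development and operations join user needs
# in the same prioritized backlog.
backlog = [
    BacklogItem(2, "Refactor map-rendering module", "technical debt"),
    BacklogItem(1, "Patch CVE in third-party library", "third-party update"),
    BacklogItem(3, "Add offline mode requested by users", "user need"),
]

# Stakeholder feedback can reprioritize at any time; re-sorting reflects the
# dynamic reallocation of work across current and planned releases.
backlog.sort()
print([item.title for item in backlog])
```

Because only `priority` participates in ordering, reprioritizing an item and re-sorting is all it takes to reshuffle the planned releases.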


The PM and the sponsor will use an iterative, human-centered design process to define the minimum viable product (MVP) recognizing that an MVP’s definition may evolve as user needs become better understood. Insights from MVPs help shape scope, requirements, and design.


The PM and the sponsor will use an iterative, human-centered design process to define a minimum viable capability release (MVCR) if the MVP does not have sufficient capability or performance to deploy into operations. The MVCR delivers initial warfighting capabilities to enhance mission outcomes. The MVCR for applications programs must be deployed to an operational environment within 1 year after the date on which funds are first obligated to acquire or develop new software capability, including appropriate operational test. If the MVP version of the software is determined sufficient to be fielded for operational use, the MVP will become the MVCR.


  • Subsequent capability releases will be delivered at least annually. Software updates to address cybersecurity vulnerabilities will be released in a timely manner, potentially including out of release cycle as needed, per the program’s risk based lifecycle management approach.
  • Programs should deploy embedded software upgrades at least annually to an environment (e.g., development, staging, or operations) consistent with the overarching weapon system testing delivery strategy.


Programs will continuously improve or refine software development processes, practices, and tools, and update program strategies to reflect them. Programs should employ small empowered teams and scale larger efforts across multiple teams. This includes integrating and aligning efforts across government and software development organizations. Continuous user feedback and self-assessments help balance investments between short-term capability deliveries and longer-term enduring solutions.


Software development testing, government developmental testing, and operational testing will be integrated, streamlined, and automated to the maximum extent possible to accelerate delivery timelines based on risk strategies. Automated test scripts and test results will be made available to the test community so that critical verification functions (e.g., performance, reliability) and validation functions (e.g., effectiveness, suitability, and survivability) can be assessed iteratively and incrementally.


Automated cyber testing and continuous monitoring of operational software will be designed and implemented to support a cATO or an accelerated accreditation process to the maximum extent practicable, and will be augmented with additional testing where appropriate in accordance with cybersecurity policies and in coordination with the assigned authorizing official. All safety-critical software standards and guidance apply for programs using the software acquisition pathway. Programs will implement recurring cybersecurity assessments of the development environment, processes, and tools.


Cybersecurity and software assurance will be integral to strategies, designs, development environment, processes, supply chain, architectures, enterprise services, tests, and operations. Continuous and automated cybersecurity and cyber threat testing will identify vulnerabilities to help ensure software resilience throughout the lifecycle. PMs will work with stakeholders to provide sufficient controls to enable a cATO where appropriate. Ensuring software security includes:

  • Secure development (e.g., development environment, vetted personnel, coding, test, identity and access management, and supply chain risk management).
  • Cybersecurity and assurance capabilities (e.g., software updates and patching, encryption, runtime monitoring, and logging).
  • Secure lifecycle management (e.g., vulnerability management, rigorous and persistent cybersecurity testing, and configuration control).
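One small, concrete element of supply chain risk management is verifying that a delivered software update matches its published digest before acceptance. The sketch below illustrates only digest checking; real programs would also rely on signed updates and broader controls, and the payload and digest here are invented.

```python
import hashlib

def verify_update(payload: bytes, expected_sha256: str) -> bool:
    """Accept a software update only if its SHA-256 digest matches the
    value published through a trusted channel (illustrative check only)."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

update = b"example update payload"
published = hashlib.sha256(update).hexdigest()   # digest the supplier publishes

assert verify_update(update, published)          # untampered update accepted
assert not verify_update(update + b"!", published)  # modified payload rejected
```

A digest only detects accidental or in-transit modification; detecting a compromised supplier additionally requires digital signatures and provenance controls.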

IP considerations will be tracked and managed, and the IP strategy continuously updated accordingly, throughout the execution phase. For example, any changes to the planned use of government-funded and privately-funded modules or components should be reflected in the required listings of asserted restrictions, and the inspection and acceptance of deliverables should include a verification that any markings are consistent (e.g., both conforming and justified) with the anticipated restrictive markings.


Each program will develop and track a set of metrics to assess and manage the performance, progress, speed, cybersecurity, and quality of the software development, its development teams, and ability to meet users’ needs. Metrics collection will leverage automated tools to the maximum extent practicable. The program will continue to update its cost estimates and cost and software data reporting from the planning phase throughout the execution phase.
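Two commonly automated speed metrics, lead time and deployment frequency, can be derived directly from a release log. The sketch below uses an invented log; the dates and the 30-day month approximation are assumptions for illustration, not a mandated metrics set.

```python
from datetime import date

# Hypothetical release log: (work started, deployed) pairs for one team.
releases = [
    (date(2024, 1, 8), date(2024, 2, 2)),
    (date(2024, 2, 5), date(2024, 3, 1)),
    (date(2024, 3, 4), date(2024, 3, 29)),
]

# Lead time: days from start of work to deployment, per release.
lead_times = [(deployed - started).days for started, deployed in releases]
avg_lead_time = sum(lead_times) / len(lead_times)

# Deployment frequency over the observed span, normalized to ~30-day months.
span_days = (releases[-1][1] - releases[0][1]).days
deploys_per_month = len(releases) / (span_days / 30)

print(f"avg lead time: {avg_lead_time:.1f} days, "
      f"~{deploys_per_month:.1f} deploys/month")
```

Pulling these pairs automatically from a pipeline or ticketing system, rather than by hand, is what "leverage automated tools to the maximum extent practicable" looks like in practice.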


The sponsor and user community will perform a value assessment at least annually on the software delivered. The sponsor will provide feedback on whether the mission improvements or efficiencies realized from the delivered software capabilities are timely and worth the investment. The feedback should be informed by test and evaluation results. The DA, sponsor, and PM will use the value assessments to assess program progress, update strategies and designs, and inform resourcing decisions.


The PM will iteratively develop and verify technical training materials that are synchronized with software deliveries throughout the software development lifecycle. The PM will deliver training materials that ensure that receiving users and military units can be trained to the appropriate level of proficiency and readiness to successfully execute the individual and collective tasks necessary to accomplish the mission supported by the software. The PM will deliver technical operator and maintainer manuals required to operate and maintain the system. Digital delivery of software manuals and automated training will be allowed and preferred. Every effort should be made to include all updated software manuals and automated training that are iteratively improved with each new release of software capabilities.


Entering the Execution Phase

Reference Source: USD(A&S) Guidance

The following documents should represent an initial program position but be regularly updated as the program matures.

Documentation Required to ENTER the Execution Phase

Reference Source: USD(A&S) Guidance

Document | Applicability | Source

  • Capability Needs Statement or Other Approved Requirements Document | Regulatory | DODI 5000.87
  • User Agreement | Regulatory | DODI 5000.87
  • Acquisition Strategy (includes market research findings, acquisition approach, business strategy, contract strategy, intellectual property strategy, product support strategy, metrics plan, risk management, etc.) | Statutory for major programs (> ACAT II); Regulatory for others | 10 USC 2431a; DODI 5000.87
  • Market Research (part of Acquisition Strategy) | Statutory | 10 USC 2377; 41 USC 3306(a)(1); 41 USC 3307(d)
  • Cybersecurity Plan (may be part of Acquisition Strategy or standalone document; should suffice for purposes of obtaining an Authority to Operate (ATO) and meeting the requirements for Clinger-Cohen compliance) | Statutory for Mission Critical and Mission Essential IT programs; Regulatory for others | 40 USC 11313; DODI 5000.87; DODI 8500.01; DODI 5000.82
  • Test Strategy (may be part of Acquisition Strategy; should ensure that the general approach for DT/OT is understood among the test community; programs on the DOT&E Oversight List may require a TEMP) | DODI 5000.87
  • Intellectual Property Strategy (may be part of Acquisition Strategy) | Regulatory | DODI 5000.87
  • Product Support Strategy, including Business Case Analysis (may be part of Acquisition Strategy) | Regulatory | DODI 5000.87
  • Information Support Plan (includes system architecture) | Regulatory | DODI 8330.01
  • Bandwidth Requirements Review (part of Information Support Plan or related document) | Statutory for programs > ACAT II; Regulatory for others | §1047, P.L. 110-417
  • Program Cost Estimate (for applicable programs, an Independent Cost Estimate (ICE) is required in accordance with DODI 5000.73; this requires coordination with CAPE, which may or may not delegate) | Regulatory; CAPE ICE for programs > ACAT II unless delegated | DODI 5000.87; DODI 5000.73
  • Cost Analysis Requirements Document | Regulatory | DODI 5000.73
  • Clinger-Cohen Act (CCA) Compliance (see CCA considerations below) | Statutory | DODI 5000.82; Subtitle III of Title 40
  • Initial Product Roadmap (to convey the initial near-term plan to the Decision Authority) | Regulatory | DODI 5000.87
  • Acquisition Decision Memorandum (for the Execution Phase) | Regulatory | DODI 5000.87


Clinger Cohen Act Compliance Considerations

Reference Source: USD(A&S) Guidance

While the Clinger-Cohen Act is statutory for all IT programs (in the broadest use of that term), there may be creative ways to achieve compliance using alternative SWP artifacts.  Consult with your CCA Approver early so you can understand the options you have to tailor compliance in this area.

Clinger-Cohen Act Requirement (40 USC 1401) | Applicable SWP Documentation

  • Determination that the acquisition supports core, priority functions of the DoD | Capability Needs Statement (Capabilities section)
  • Establish outcome-based performance measures linked to strategic goals | Capability Needs Statement (Performance Attributes section)
  • Redesign the processes that the system supports to reduce costs, improve effectiveness, and maximize the use of commercial off-the-shelf technology | Capability Needs Statement (Program Summary section)
  • Determine that no private sector or government source can better support the function | Acquisition Strategy
  • Conduct an analysis of alternatives | Acquisition Strategy (Business Strategy section)
  • Conduct an economic analysis that includes a calculation of the return on investment; or, for non-AIS programs, conduct a life-cycle cost estimate | Component Cost Estimate, Component Cost Position
  • Develop clearly established measures and accountability for program progress | Acquisition Strategy (Program Metrics, Value Assessment)
  • Ensure that the acquisition is consistent with the DoD Information Enterprise policies and architecture, to include relevant standards | System Architecture
  • Ensure that the program has a Cybersecurity Strategy that is consistent with DoD policies, standards, and architectures, to include relevant standards | Cybersecurity Strategy
  • Ensure, to the maximum extent practicable, that (1) modular contracting has been used, and (2) the program is being implemented in phased, successive increments, each of which meets part of the mission need and delivers measurable benefit, independent of future increments | Acquisition Strategy (Contracting Strategy section)
  • Register Mission-Critical and Mission-Essential systems with the DoD CIO | DoD Information Technology Portfolio Repository

Reference Source: DODI 5000.87 Section 3.1

The DA will document the decision and rationale for a program to use the software acquisition pathway in an acquisition decision memorandum.

Acquisition Decision Memorandum (ADM)

Reference Source: OUSD(A&S) Guidance

An Acquisition Decision Memorandum (ADM) captures the decision authority’s key decisions, program direction, and action items. While there are no milestone reviews in the SWP, there are two key points where an ADM will commonly be used.

  1. The decision for an acquisition program to use the SWP requires an ADM (and draft CNS) per DODI 5000.87.
  2. Proceeding to the Execution Phase within the SWP requires that the decision authority validate that a program has done the appropriate planning, met statutory requirements, and is resourced to effectively begin (or continue) software development. This is done in an ADM.

Embracing the key tenets of the Adaptive Acquisition Framework, a decision authority is encouraged to simplify and tailor acquisition approaches while empowering program managers. The ADM may capture unique direction for the program to address high-risk areas or provide additional details to its strategies and analysis. The strategic intent of the SWP is to balance speed with rigor: to proceed once strategies are sufficient and risks are understood, without waiting for an exhaustive list of documentation. Therefore, a decision authority may authorize proceeding with a draft strategy and set expectations for a final version within a certain timeframe.

A decision authority may issue an ADM at any other point within a program’s lifecycle as conditions warrant. These may include, but are not limited to:

  • Major pivot in program strategies due to change in requirements, contractors, performance, technologies, platforms, development approaches, budget/costs, or other factors.
  • Additional direction or expectations following an Independent Program Review, audit, or related review that identified critical issues.
  • Integration or merging with other programs, initiating a major new scope of work (with validated needs), or program termination.

The ADM Template provides two examples for SWP programs and decision authorities to tailor to capture the key decisions, direction, and actions.

During the Execution Phase

Reference Source: USD(A&S) Guidance

The following documents should also represent an initial program position but be regularly updated as the program matures.

Information/Documentation to be Compiled and/or Approved DURING the Execution Phase

Reference Source: USD(A&S) Guidance

Document | Applicability | Source

  • System Architecture | Regulatory | DODI 5000.87
  • Product Roadmap (approved by the user community in terms of near-term priorities to pursue) | Regulatory | DODI 5000.87
  • Program Backlog | Regulatory | DODI 5000.87
  • Periodic updates to strategies (acquisition, contracting, test, cybersecurity, IP, product support) as needed | Statutory/Regulatory | DODI 5000.87
  • Periodic updates to cost estimate and CARD, if applicable | Regulatory | DODI 5000.87
  • Value Assessment (at least annually) to meet the intent of the Post-Implementation Review | Regulatory | DODI 5000.87
  • Clinger-Cohen Act (CCA) Compliance (see CCA table above) | Statutory | DODI 5000.82; Subtitle III of Title 40
  • Core Logistics Determination (CLD)/Core Logistics and Sustaining Workloads Estimate (CLSWE), if applicable | Statutory for programs with “software maintenance” | 10 USC 2464
  • DOT&E Report on Initial Operational Test and Evaluation (IOT&E), if applicable | Statutory for programs on the DOT&E Oversight List | 10 USC 2399; 10 USC 139
  • DOT&E Operational Test Plan, if applicable | Statutory for programs on the DOT&E Oversight List | 10 USC 2399
  • Post-Implementation Review (may be satisfied by Value Assessments) | Statutory | 40 USC 11313; DODI 5000.82
  • Program Metrics | Regulatory | DODI 5000.87
  • Semi-Annual Data Reporting to OUSD(A&S) | Regulatory | DODI 5000.87
  • Contractor Cost Data Report ($100M+ contracts) | Regulatory | DODI 5000.73
  • Software Resources Data Report ($100M+ programs) | Regulatory | DODI 5000.73
  • Technical Data Report ($100M+ programs) | Regulatory | DODI 5000.73
  • Contractor Business Data Report (contractors with $250M) | Regulatory | DODI 5000.73
  • Maintenance and Repair Parts Data Report ($100M+ programs) | Regulatory | DODI 5000.73


The following graphic visualizes the interplay of the key SWP artifacts and elements during the Execution Phase:

[Graphic: key SWP artifacts and elements during the Execution Phase]

Procuring Hardware on the SWP

Reference Source: OUSD (A&S) Guidance

While the primary intent of the SWP is to provide acquisition programs with a streamlined means of delivering software capability faster, there is broad recognition that hardware will need to be procured to support the development, testing and fielding of that software.

Commonly expected hardware procurement items to support and enable software delivery include servers, racks, workstations, computer equipment, peripherals, and hardware-in-the-loop test suites (for embedded path programs). There is no threshold for procurement of hardware items in direct support of development, testing, training, and fielding activities.

Less common hardware procurement items include sensors, non-commercial user equipment, and other electronic equipment items that are intended to be employed by operational users. While there is no clear threshold for the purchase of these types of hardware items, the expectation is that the funding allocated to hardware procurement is at a reasonable level and does not disproportionately outweigh the funding dedicated to software development, testing, and fielding activities.

Custom design and development of hardware, even if intended to enable software, should not be conducted on the SWP.  There are other pathways, such as the Middle Tier of Acquisition, that can be seamlessly combined with the SWP to enable that type of work. (For an example of combined pathways, see Vignette: MTA and SWP Hybrid Acquisition Approach)

[Graphic: Procuring Hardware on the SWP]