Software Acquisition


Execution Phase

How to use this site

Each page in this pathway presents curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources, and provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.


Directly quoted material is preceded by a link to the Reference Source.



Reference Source: Software Acquisition Pathway Interim Policy and Procedures, 3 Jan 2020


Design decisions made during the Planning Phase are critical (e.g., systems and software architecture, software integration strategy, software development factory and pipelines) and will have a major impact on future program cost and schedule during the Execution Phase.  The decision authority should consider developing planning completion criteria based on the business decision artifacts to assess program readiness to execute this software acquisition pathway.  The decision authority can assess program achievement of planning completion criteria at the end of the Planning Phase and approve program transition to Execution Phase 1.


The Execution Phase focuses on scoping, developing, and deploying an MVP and MVCR to the Warfighter/end user as quickly as possible. MVPs provide users with working software to demonstrate initial capabilities, test external interfaces as needed, accelerate learning, and shape needs/requirements, designs, and future iterations. While time to MVP is of high importance, the utility of the delivered functionality is diminished if quality control, testing, and user engagement are not integrated into the software development process. For example, security and testing should be integrated into the process to deliver the most value. Sponsors, users, acquirers, and developers shall maintain active, continual engagement throughout the software development process. Annually, PMs and Sponsors will use holistic program health data from metrics, value assessments, test results, etc. to determine if the effort should continue as planned.


The software development team further refines capabilities and features with the users to decompose capabilities into a prioritized backlog of functional and performance requirements, features, mission threads, and/or user stories, use cases, etc. The development efforts leverage enterprise software development services to the maximum extent possible to accelerate deliveries, reduce costs, and improve security and interoperability. The team works with key stakeholders to achieve a continuous authority to operate or an aggressively streamlined approval process for each software delivery. The PM tracks metrics to manage the software teams’ progress and convey insights to key stakeholders.
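The decomposition of capabilities into a prioritized backlog can be illustrated with a minimal sketch. The `BacklogItem` structure, the numeric priority scheme, and the sample items below are illustrative assumptions, not anything prescribed by the pathway:

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class BacklogItem:
    # Lower number = higher priority; order=True sorts on comparable fields only.
    priority: int
    title: str = field(compare=False)
    kind: str = field(compare=False)  # "feature", "user story", "mission thread", ...

# A capability decomposed into backlog items, then ordered for the team.
backlog = [
    BacklogItem(2, "Display track data on common operating picture", "feature"),
    BacklogItem(1, "Ingest sensor feed via external interface", "user story"),
    BacklogItem(3, "Export mission summary report", "user story"),
]
backlog.sort()  # highest-priority work first

for item in backlog:
    print(item.priority, item.title)
```

Marking `title` and `kind` with `compare=False` keeps the ordering driven purely by priority, which is the property the team and users actually negotiate.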


Following delivery of the MVP, the software development team will iteratively design, develop, test, and deliver capabilities that meet the users’ and/or systems’ highest priorities. The team will establish a cadence for iterating on the broader design, architecture, and infrastructure; allow adequate time for software refactoring; and plan for the MVCR. Where possible, users should be involved in planning each iteration and in evaluating the software at its end. The MVCR focuses on delivery of a minimum set of features that represent a deployable release to mission operations. Subsequent releases are intended to add further sets of capabilities and align with system development schedules. The team uses product roadmaps to convey the integration and synchronization of activities to align with system development activities and planned operational releases. Program and development teams should assess themselves at every iteration to ensure the strategic goals have not changed and are being met, and to remain responsive to emergent and changing requirements by delivering more frequent and timely releases to end users.


Depending on the software practices used by the team, subsequent deliveries could be continuously integrated and delivered when ready, or iteratively queued in a staging/testing environment and transitioned to a production/operational environment on an established, consistent time frame agreed to by the users and other stakeholders. Once a delivery cadence is agreed upon, adherence to it becomes a key indicator of program health and effectiveness, and programs must remain rigorously focused on preserving it. The amount of capability delivered in each fixed, time-boxed release can and should be allowed to change, but quality must remain fixed. Test and evaluation should be automated to the maximum extent possible by defining tests up front, with tight alignment of government and vendor testers. Active user engagement is critical throughout development, delivery, and capability release: it shapes priorities and provides insight into operations, feedback on early capability iterations, design mock-ups, and previous developments, ensuring rapid delivery of capabilities that have an impact on the mission.
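The principle that scope flexes while the release date and quality bar stay fixed can be sketched as a simple release gate: only items whose automated tests pass ship in the time-boxed release, and unfinished work rolls to the next iteration. The function and sample item names below are illustrative assumptions, not part of the policy:

```python
def plan_release(candidate_items, test_results):
    """Split candidate items into this release vs. deferred work.

    candidate_items: names of items completed this iteration.
    test_results: dict mapping item name -> True if its automated tests passed.
    The release date never slips; items that miss the quality bar simply wait.
    """
    shipped = [i for i in candidate_items if test_results.get(i, False)]
    deferred = [i for i in candidate_items if i not in shipped]
    return shipped, deferred

items = ["map overlay", "chat gateway", "offline sync"]
results = {"map overlay": True, "chat gateway": True, "offline sync": False}
shipped, deferred = plan_release(items, results)
# "offline sync" misses this release and returns to the backlog.
```

The design choice here mirrors the text: the gate varies *what* ships, never *when* it ships or whether it meets the quality bar.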


Government and vendor personnel should continuously improve their processes, practices, and roles via small empowered teams and should scale efforts across teams, tracking improvements via established metrics. Issues, errors, and defects identified during operations should be captured in the program’s backlog to be addressed in future development iterations. The software architecture should be continuously monitored and updated to reflect as-built design using automated tool scans as the preferred means. Continuous monitoring may feed value assessments to balance investments between short-term capabilities delivery and the need for longer term enduring solutions.
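Capturing operationally discovered defects in the backlog is, in essence, an append-and-reprioritize step. A minimal sketch, with the severity scale, field layout, and sample entries assumed purely for illustration:

```python
def capture_defect(backlog, description, severity):
    """Record a defect found during operations for a future iteration.

    backlog: list of (priority, description) tuples; lower number = sooner.
    severity: 1 (critical) .. 5 (cosmetic); used directly as the priority here.
    """
    backlog.append((severity, description))
    backlog.sort(key=lambda entry: entry[0])  # keep highest-priority work first
    return backlog

backlog = [(2, "Add map overlay"), (4, "Polish login screen")]
capture_defect(backlog, "Track data drops under load", 1)
# The critical operational defect now outranks planned feature work.
```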


Business decision artifacts that the decision authority should consider during the Execution Phase include, but are not limited to, the following:

  • Enterprise Services and DevSecOps Pipeline
  • MVP, MVCR, and Product Roadmap
  • Test Strategy
  • Secure Software and Cyber Security Plan
  • Metrics Plan
  • Value Assessment