Software Acquisition

Program Management

How to use this site

Each page in this pathway presents curated knowledge drawn from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources, and provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

Directly quoted material is preceded by a link to the Reference Source.

Acquisition Strategy

Reference Source: Software Acquisition Pathway Interim Policy and Procedures, 3 Jan 2020

 

A description of the overall approach to the acquisition, including key decisions such as:

  • the contract type(s) and contracting strategy to be used
  • the approach to cost-effectively obtaining appropriate intellectual property rights and maintaining an open system architecture
  • development and test platforms, test resources and infrastructure
  • identification of dependencies with other programs, either in development or in operation
  • cadence for operational delivery of the software being acquired
  • a roadmap that describes short-term plans with fidelity while outlining longer term objectives, among others
  • identification of sustainment factors such as upgrades, security, and performance

 

The acquisition strategy documents the program approach to obtain the required capabilities with acceptable risk, aiming to field a software capability within one year of initiation of the execution phase, and to collect data on fielded software to facilitate continuous engineering. Follow-on software development efforts will be established in increments of one year or less and evaluated based on software performance in the field. The acquisition strategy should also align to modern software development principles, emphasizing iterative deliveries of quality software early and often, and incorporating frequent feedback from users and other stakeholders.

 

Reference Source: Software Acquisition Pathway Guide v1.0

The acquisition strategy for a software program must address traditional functional areas such as Test and Evaluation, Risk Management, Funding, Contracting, Acquisition Approach, etc. The strategy must describe a plan for acquiring software at a level of detail that is suitable for justifying the investment decision. Constructing the acquisition strategy for an iterative software development program also requires the following activities:

  • Develop the roadmap: The roadmap must identify how often to deliver working software to operational users, based on a cadence that is appropriate to meet user needs. The roadmap must also describe the overall approach for managing iterative software development, showing how the software iterations fit with any constraints from interfacing systems or hardware dependencies. The program must describe how it will allow detailed requirements to be prioritized and developed with user involvement while ensuring progress toward implementing the high-level features needed (a minimal data sketch of such a roadmap follows this list).
  • Select a software development framework to guide the work: Based on technical constraints, prior experience and other considerations, the program should state whether it has already selected an Agile framework (e.g., SAFe, Scrum, XP) as a way to organize the work, or describe the considerations that will influence the choice once implementation starts.
  • Develop a plan for obtaining competencies in modern software development practices on the government team: The program should describe what gaps exist, in terms of bringing sufficient expertise in modern software practices into the team, and have a plan for filling those gaps in a way that scales up realistically (e.g., leveraging expertise available from the enterprise, or contracting for consulting, coaching, or training).
  • Develop a pathway to an MVP: Working with the users, the Program Manager must either explicitly state the major capabilities that would be part of a minimum viable product (MVP) that can be used to collect relevant user feedback or describe the process by which that MVP will be defined.
  • Establish initial design solution constraints: Based on the technical vision, the acquisition program must describe the approach it will take to software design, highlighting any known constraints and identifying how the program will deal with emergent design issues as the system evolves across iterations. Design decisions should be informed by the current threat environment as assessed by the acquisition intelligence community.
  • Identify applicable enterprise services and infrastructure decisions: Programs must describe fundamental technical decisions related to cloud usage, networking, and the establishment of environments for software development, integration, and deployment (DevSecOps). Programs should identify likely commercial or government-provided solutions to these needs or be prepared to justify why existing enterprise services will not meet the needs and how the program will bear the recurring costs of keeping these solutions up to date.
  • Describe the necessary team size and budget for the work: Based on an estimate of the level of effort required over time to produce an MVP and continue working on the capability, programs must establish the team size, team organization, and budgetary needs.
  • Develop a contracting strategy: The program must describe the strategy for contracting for the needed software team, ideally in a way that lowers barriers to entry for organizations that can contribute innovative technical solutions. The discussion should cover topics such as incentivizing innovation and performance in an Agile context and protecting intellectual property while giving the government the access to source code needed to scan for issues in a DevSecOps environment.
  • Develop Software Transition Strategy: Prior to software transition to a different organization (e.g., a DoD Lifecycle Software Center), the PM should require delivery of the complete software technical baseline, including all software capability descriptions (e.g., features, story points, use cases, etc.) and all as-built architecture and design products, traceability products, interface definitions including interfaces to proprietary software elements, and any other requisite documentation. The baseline facilitates managing program risk, understanding intellectual property rights, and supports the software transition to another organization for sustainment.
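To make the roadmap activity above more concrete, the sketch below shows one way a roadmap could be held as structured data, pairing a delivery cadence with user-prioritized features. This is a hypothetical illustration, not a form prescribed by the pathway; the Feature and Roadmap names, their fields, and the release_plan helper are all assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Feature:
    """One roadmap entry; every field here is illustrative, not prescribed."""
    name: str
    user_priority: int                 # set and re-prioritized with user involvement (1 = highest)
    target_release: str                # near-term releases planned with fidelity
    hardware_dependency: Optional[str] = None   # constraint from interfacing systems

@dataclass
class Roadmap:
    delivery_cadence_days: int         # cadence for operational delivery of software
    features: List[Feature] = field(default_factory=list)

    def release_plan(self, release: str) -> List[Feature]:
        """Return features slated for a release, highest user priority first."""
        slated = [f for f in self.features if f.target_release == release]
        return sorted(slated, key=lambda f: f.user_priority)
```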

Contracting Strategy

Reference Source: Software Acquisition Pathway Guide v1.0

Program Managers and Contracting Officers should consider the full range of vehicles to secure an agreement to develop software for the program and should use the most flexible vehicle to accomplish the mission.

Contract Value: In general, it is wise to start small and select the most flexible format that meets program needs early in the program development.

Contract Types
  • Simplified Acquisition Procedures: Simplified procedures may be used for small contracts to provide familiarity with the contractor’s work and enable the program to quickly evaluate contractor proposals.
  • Other Transaction Authorities (OTAs): OTAs provide the ideal flexibility for the often-unpredictable world of software development programs. OTAs can, and should, include a small number of acquisition clauses, but the key is to keep such language to a minimum. The Agile team and contracting officer can mitigate risk through contracts for multiple software development projects that start very small and gradually grow as the team gains confidence. OTAs may be used during any stage of software development, even for full production.
  • Commercial Acquisition: Because the software industry is largely commercial, commercial acquisition is often the most streamlined method to acquire software.
  • Acquiring a Combination of Commercial and Proprietary Software: Usually the most streamlined contract type for software is a combination of commercial acquisition and a component that must be customized for national security reasons. The key is to use modular development to the extent possible and tightly circumscribe the amount of proprietary or customized software development.
  • IDIQ Contracts for Software as a Service: For SaaS, contracting officers might consider Indefinite Delivery, Indefinite Quantity (IDIQ) contracts, which provide for an indefinite quantity of services for a fixed time. IDIQ contracts are most often used for service contracts. Awards are usually for base years and option years. The government places delivery orders (for supplies) or task orders (for services) against a basic contract for individual requirements. The basic contract specifies minimum and maximum quantity limits as either a specified number of units (for supplies) or as dollar values (for services). IDIQ contracts help streamline the contract process and speed service delivery.
Legal Requirements

Competition: Government organizations must comply with the Competition in Contracting Act (CICA), but competition in general consumes more time due to the many legal requirements. Competing pilot programs are a means to introduce a form of competition without delaying software development. Use of IDIQ contracts is another means to introduce competition while maintaining streamlined contracting for software development.

Considerations for minimum requirements of the Agile Software Development Contract include:

  • The intended result of the overall project
  • Use of multiple small, short-duration milestones (i.e., “iterations”), with plans for the labor effort each would involve
  • A clear definition of the requirements and features of the final product for each successive contract within the software development
  • Pricing method or contract type, such as Fixed Price Incentive Fee (FPIF), Cost-Plus-Incentive Fee (CPIF), or performance-based contracts
  • The dollar value of the contract
  • Contract financing
  • Software development schedule and associated penalties or incentives
  • Duration of the contract
  • Compatibility requirements with other modules
  • Designation of a full-time contractor project manager (scrum master) dedicated to the contract
  • Skill, experience levels, and (if necessary) security clearances of the proposed contractor development team; requirement for team stability over the duration of the contract
  • Requirement for government acceptance of contractor team members
  • Intellectual property rights or government purpose rights
  • Security requirements for the software
  • Warranties, indemnities, and contractor liability
  • Metrics for the product
  • Contract termination.

Intellectual Property (IP) Strategy

Reference Source: Software Acquisition Pathway Interim Policy and Procedures, 3 Jan 2020

The program office will develop strategies for acquisition, funding, contracting, Intellectual Property (IP), test and evaluation, systems engineering, software security, and sustainment in a single or minimum set of tailored documents. The team will estimate costs, identify funding, and develop metrics and value assessment plans. These planning efforts should be tightly aligned but can occur independently to support individual business decisions.

The Intellectual Property (IP) strategy should be tailored to meet the needs of the program throughout the lifecycle, with specific emphasis on acquiring the computer software and license rights required to support the acquisition strategy, and with consideration of the IP rights of both the Government and industry. To the maximum extent practicable, the strategy should include the requirement for periodic delivery of all source code developed in whole or in part at Government expense (e.g., after each iteration or each release), along with any other software or documentation necessary to compile, test, debug, deploy, and successfully operate the software. The IP strategy should also identify technological areas where IP may result from Government investment, and treat those appropriately. Prior to software transition to a different organization (e.g., a DoD Lifecycle Software Center), the PM should require delivery of all the software developed at government expense, including all software capability descriptions (e.g., features, story points, use cases) and all as-built architecture and design products, traceability products, interface definitions including interfaces to proprietary software elements, and any other requisite documentation. This facilitates managing program risk, understanding IP rights, and supports the software transition to another organization for sustainment. The PM shall define the software transition plan in a lifecycle software support plan and shall capture the final software package at the point of transition in the product roadmap.

Reporting

Reference Source: Software Acquisition Pathway Guide v1.0

The decision authority for a program may elect to use this pathway for the entire program or for one or more software sub-systems within the program. To initiate this pathway, the decision authority will submit an Acquisition Decision Memorandum (ADM) to OUSD(A&S) to provide notification of pathway use. The ADM should contain the rationale for using the pathway. Once initiated, the decision authority will provide program data to OUSD(A&S) quarterly to give insight into the operation of the pathway and to support decisions about pathway improvements. Programs should provide the following data on a quarterly basis:

  • Program Name
  • Contract vehicle type
  • Current year budget
  • Current year costs incurred

Metrics

Reference Source: Software Acquisition Pathway Interim Policy and Procedures, 3 Jan 2020

 

The metrics plan identifies key metrics that allow the PM and other stakeholders to manage cost, schedule, and performance. It also organizes metrics by common types (or classes) and provides guidance on how to read and interpret each metric. Each program shall tailor the set of metrics for the unique considerations of the program. All software acquisition programs must have a set of core metrics in addition to the existing requirements outlined in the Software Resources Data Report (SRDR).

 

Reference Source: Software Acquisition Pathway Guide v1.0

A Metrics Plan identifies metrics to be collected in order to manage the software program. The purpose of metrics is to provide data to PMs and other stakeholders to inform decisions and provide insight into the development effort. Every metric produced on a program should target a specific stakeholder or set of stakeholders, have a defined purpose, and support decision making at some level. Programs should establish and maintain metrics to measure progress in the following areas:

  • Process Efficiency Metrics: These metrics identify where inefficiencies may exist in the software development process. Maintaining process efficiency metrics supports decisions related to how/when/where to change the process, if needed, and enables continual process improvement.
  • Software Quality Metrics: These metrics identify where in the overall system software quality may be degraded and support identification of specific software components or software teams that contribute to degraded quality. Maintaining software quality metrics supports decisions related to software architecture, software team performance, etc.
  • Software Development Progress Metrics: These metrics illustrate the capability developed to date as compared to the overall capability planned, and the speed at which capability is delivered. Maintaining progress metrics allows internal and external stakeholders to maintain visibility into the capability planned vs capability delivered and supports senior leader resourcing decisions or resourcing justification. Progress metrics also support cost estimation and decisions related to number and size of teams.
  • DevSecOps Metrics: These metrics identify where inefficiencies may exist in the DevSecOps pipeline. Maintaining these metrics supports identifying tool or configuration changes that may be necessary to improve the performance of the pipeline.
  • Cost Metrics: Cost metrics provide insight into the program budget and expenditure rate. Maintaining cost metrics supports resource decisions like number of teams required, or technical decisions like how much capability to plan for a given time span.
  • Value Metrics: Value metrics identify the level of significance for each capability and feature from the users’ perspective. Capabilities and features should all have a priority and a value assignment designated by the user to support prioritization and to provide a cursory view of the value (or significance) of the capability developed to date. This metric can be used in concert with more comprehensive value assessments that must be conducted periodically.

The minimum set of metrics to be collected should include at least one metric from each of the above categories. The program should establish a specific minimum set of metrics to provide insight into the status of the project and support technical and programmatic decisions. The program should have the ability to expand on the minimum set of metrics as needed so that the metrics remain appropriate for the size of the project, while also considering the level of effort and cost associated with collecting each metric. The program should automate collection of metrics as much as possible. For those metrics that cannot be automated initially, the program should develop a plan for moving toward automation. Programs should consider migrating from a quarterly software metrics push to providing access to their set of software metrics via an automated, read-only, self-service metrics portal for OUSD(A&S), OUSD(R&E), and other approved stakeholders. The following subsections list example metrics for each category.

Process Efficiency Metrics
  • Feature Points – The project team uses feature points (e.g., story points, use cases, etc.) to perform relative sizing of features. The developer assigned to a feature is responsible for identifying how much effort is required to complete the work in each iteration. Based on the duration of each iteration, minus overhead and time off, the team builds an understanding of the number of points the team can complete in each iteration. Over time the team develops efficiencies and estimation tends to improve.
  • Velocity – Velocity measures the amount of work, in feature points, that the team completes in each iteration. It is derived by summing the total points of all the features completed in each iteration (a computation sketch follows this list).
  • Feature Completion Rate – Feature completion rate describes the number of features completed in each iteration or release.
  • Feature Burndown Chart – Teams use a feature burndown chart to estimate the pace of work accomplished daily. The pace is usually measured in hours of work, although no specific rule prevents the team from measuring in feature points.
  • Release Burnup – Release burnup charts measure the amount of work completed for a given release based on the total amount of work planned for the release. Usually feature points are used as the unit of measure to show planned and completed work.
  • Number of Blockers – A blocker is an issue that cannot be resolved by the individual assigned to complete the activity and requires assistance to overcome. Number of blockers describes the number of events that prohibit the completion of an activity.
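As a rough illustration of the velocity and release burnup metrics above (a sketch, not material from the guide), the following derives both from made-up iteration data; the dictionary shapes and point values are assumptions.

```python
# Illustrative iteration data; shapes and numbers are made up.
iterations = [
    {"id": 1, "completed_points": [3, 5, 2]},   # feature points finished in iteration 1
    {"id": 2, "completed_points": [8, 3]},
    {"id": 3, "completed_points": [5, 5, 3]},
]
release_planned_points = 60   # total feature points planned for the release

def velocity(iteration):
    # Velocity: sum of feature points completed in one iteration.
    return sum(iteration["completed_points"])

completed = 0
for it in iterations:
    completed += velocity(it)
    print(f"Iteration {it['id']}: velocity={velocity(it)}, "
          f"release burnup={completed}/{release_planned_points}")
```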
Software Quality Metrics
  • Recidivism Rate – Recidivism describes stories that are returned to the team for rework rather than being accepted.
  • Defect Count – Defect count measures the number of defects per iteration or release.
  • Change Fail Rate – The percentage of changes to the production system that fail.
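A minimal sketch of the change fail rate computation, assuming a simple, fabricated change log:

```python
# Illustrative change log; entries are made up.
changes = [
    {"id": "c1", "failed": False},
    {"id": "c2", "failed": True},
    {"id": "c3", "failed": False},
    {"id": "c4", "failed": False},
]

# Change fail rate: percentage of production changes that failed.
fail_rate = 100 * sum(c["failed"] for c in changes) / len(changes)
print(f"Change fail rate: {fail_rate:.1f}%")   # 25.0%
```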
Software Development Progress
  • Deployment Frequency – Deployment frequency provides information on the cadence of deployments in terms of time elapsed between deployments (a computation sketch follows this list).
  • Progress against Roadmap – Progress measures major capabilities planned versus delivered.
  • Achievement date of the MVP / Minimum Viable Capability Release (MVCR).
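A minimal sketch of deployment frequency measured as time elapsed between successive deployments; the dates are fabricated for illustration:

```python
from datetime import date

# Illustrative deployment dates (made up).
deployments = [date(2024, 1, 8), date(2024, 1, 22), date(2024, 2, 5)]

# Deployment frequency: days elapsed between successive deployments.
gaps = [(later - earlier).days for earlier, later in zip(deployments, deployments[1:])]
print(f"Days between deployments: {gaps}")                      # [14, 14]
print(f"Average deployment cadence: {sum(gaps) / len(gaps):.0f} days")
```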
Cost Metrics
  • Total Cost Estimate – This metric provides the total estimated cost for the product being developed or the service being acquired. The cost estimation approach can depend on whether the program is seeking services over time (e.g., DevSecOps expert, Full Stack Developer, tester) or product delivery based on a clear set of Agile user requirements (user stories) contained in a product backlog baseline.
  • Burn Rate – Burn rate measures incurred cost over time (e.g., monthly burn rate, iteration burn rate, release burn rate).
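A minimal sketch of a monthly burn rate computation; the cost figures are illustrative only:

```python
# Illustrative monthly incurred costs (made-up figures).
monthly_costs = {"Jan": 210_000, "Feb": 195_000, "Mar": 240_000}

# Burn rate: incurred cost over time, here averaged per month.
total = sum(monthly_costs.values())
print(f"Total incurred to date: ${total:,}")
print(f"Average monthly burn rate: ${total / len(monthly_costs):,.0f}")
```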
Capability / Value Delivery Metrics
  • Delivered Features – Count of delivered features measures the business-defined features accepted and delivered.
  • Delivered Value Points – This metric represents the count of value points delivered to users for a given release. Value points are usually defined by the users (or user representatives) to indicate the business value assigned to a given feature or story (a counting sketch follows this list).
  • Level of User Satisfaction – This metric represents the degree of user satisfaction based on the value delivered by the product or solution.
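A minimal sketch of counting delivered features and delivered value points for a release; the feature names, point values, and acceptance states are made up:

```python
# Illustrative release contents; names and value points are fabricated.
release_features = [
    {"name": "map-overlay",   "value_points": 13, "accepted": True},
    {"name": "offline-mode",  "value_points": 8,  "accepted": True},
    {"name": "export-report", "value_points": 5,  "accepted": False},  # not yet accepted
]

# Delivered features: count of features accepted and delivered.
delivered = [f for f in release_features if f["accepted"]]
print(f"Delivered features: {len(delivered)}")
# Delivered value points: sum of user-assigned value on accepted features.
print(f"Delivered value points: {sum(f['value_points'] for f in delivered)}")
```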
Metrics Considerations for Programs Implementing Agile Methods

Programs implementing Agile methods should consider taking the following actions to improve implementation success and pace of adoption.

  • Align on metrics to be collected.
  • Identify tools to enable automation of metrics and supporting data to reduce the level of effort required to collect and report on metrics.
  • Document the metrics to be collected in a Metrics Plan (a structured-data sketch follows this list). The plan should include:
    • The list of metrics to be collected and reported
    • Information on which metrics are automated
    • A plan for automating metrics not yet automated, or justification for why the metric should not be automated
    • Frequency of reporting for each metric
    • Tools used to collect and report metrics.
  • Ensure leadership support of the plan for Agile metrics.
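One way to keep the Metrics Plan actionable is to capture its contents as structured data mirroring the elements listed above. The sketch below is illustrative only; the metric entries, tools, and frequencies are placeholders, not a prescribed schema.

```python
# Illustrative Metrics Plan entries mirroring the plan elements listed above.
# Metric names, tools, and frequencies are placeholders.
metrics_plan = [
    {
        "metric": "Velocity",
        "category": "Process Efficiency",
        "automated": True,
        "reporting_frequency": "per iteration",
        "tool": "issue tracker",                 # stand-in for the program's tool
    },
    {
        "metric": "Change Fail Rate",
        "category": "Software Quality",
        "automated": False,
        "automation_plan": "derive from pipeline deployment logs",
        "reporting_frequency": "monthly",
        "tool": "manual tally",
    },
]
```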