Developmental Testing and Evaluation
How To Use This Site
Each page in this pathway presents a wealth of curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources. It also provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base. This site aggregates official DoD policies, guides, references, and more.
DoD and Service policy is indicated by a BLUE vertical line.
Directly quoted material is preceded by a link to the Reference Source.
Reference Source: DODI 5000.85 Section 3.11.b.(2)
Developmental testing and evaluation provides hardware and software feedback to the PM on the progress of the design process and on the product’s compliance with contractual requirements, effective combat capability, and the ability to achieve key performance parameters (KPPs) and key system attributes (KSAs). The DoD Component’s operational test organization will conduct independent evaluations, operational assessments, or limited user tests to provide initial assessments of operational effectiveness, suitability, survivability, and the ability to satisfy KPPs and KSAs. Opportunities to combine contractor and Government developmental testing should be maximized, and integrated developmental and operational testing will be conducted when feasible.
Reference Source: DODI 5000.85 Section 3D.4.c.(4)
DoD Components will ensure reliability and maintainability data from operational and developmental testing and evaluation, fielding, all levels of repair and their associated manpower, and real property informs estimates of O&S costs for major weapon systems.
Reference Source: DODI 5000.89 Section 4.4.b
The USD(R&E) will prepare MS B and MS C DT&E sufficiency assessments on those MDAPs where the DAE is the MDA, in accordance with Section 838 of Public Law 115-91. For programs where the Service or the Component acquisition executive is the MDA, see Paragraph 5.3.b.(2) for additional details.
Reference Source: DODI 5000.89 Section 5.1
DT&E activities support data generation for independent evaluations. They also provide program engineers and decision-makers with information to measure progress, identify problems, characterize system capabilities and limitations, and manage technical and programmatic risks. PMs use DT&E activities to manage and reduce risks during development, verify that products are compliant with contractual and technical requirements, prepare for OT, and inform decision-makers throughout the program life cycle. DT&E results verify exit criteria to ensure adequate progress before investment commitments or initiation of phases of the program, and as the basis for contract incentives.
- DT&E starts with capability requirements and continues through product development, delivery, and acceptance; transition to OT&E; production; and operations and support. Consideration of DT&E in the requirements and systems engineering processes ensures that capability requirements are measurable, testable, and achievable. Identifying and correcting deficiencies early is less costly than discovering system deficiencies late in the acquisition process.
- The PM will take full advantage of DoD ranges, labs, and other resources. Programs will use government T&E capabilities unless an exception can be justified as cost-effective to the government. PMs will conduct a cost-benefit analysis for exceptions to this policy and obtain approval through the TEMP approval process before acquiring or using non-government, program-unique test facilities or resources.
- Systems have become more complex, and resource constraints often force tradeoffs in the type and scope of testing that can be performed. The DT&E budget and schedule must allow testing that adequately verifies performance to contractual requirements in a controlled environment and to operational requirements.
DT&E Activities
Reference Source: DODI 5000.89, Section 5.2
DT&E activities will start when requirements are being developed to ensure key technical requirements are measurable, testable, and achievable; as well as provide feedback that the system engineering process is performing adequately. A robust DT&E program will provide the data and assessments for independent evaluations and decision-making. The DT&E program will:
- Verify achievement of critical technical parameters and the ability to achieve KPPs. OT will use relevant DT data to assess progress toward achievement of critical operational issues.
- Assess the system’s ability to achieve the thresholds prescribed in the capabilities documents.
- Assess system specification compliance.
- Provide data to the PM to enable root cause determination of failures arising from tests and to identify corrective actions.
- Validate system functionality in a mission context to assess readiness for OT.
- Provide information for cost, performance, and schedule tradeoffs.
- Report on the program’s progress to plan for reliability growth and assess reliability and maintainability performance for use during milestone decisions.
- Identify system capabilities, limitations, and deficiencies.
- Assess system safety.
- Assess compatibility with legacy systems.
- Stress the system within the intended operationally relevant mission environment.
- Support all appropriate certification processes.
- Document achievement of contractual technical performance, and verify incremental improvements and system corrective actions.
- Assess entry criteria for IOT&E and follow-on OT&E.
- Provide DT&E data to validate parameters in models and simulations.
- Assess the maturity of the chosen integrated technologies.
- Include T&E activities to detect cyber vulnerabilities within custom and commodity hardware and software.
- Support cybersecurity assessments and authorization, including RMF security controls.
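In practice, many of the verification activities listed above are managed as structured traceability data that links each critical technical parameter (CTP) or KPP-derived measure to the test events that generate evidence for it. The sketch below is purely illustrative; the parameter names, threshold values, and test event labels are assumptions for demonstration and are not drawn from DoDI 5000.89 or any program TEMP.

```python
from dataclasses import dataclass, field

@dataclass
class TechnicalParameter:
    """One hypothetical critical technical parameter (CTP) or KPP-derived measure.

    Thresholds and objectives would come from the program's capability
    documents and contract specifications; the values here are illustrative.
    """
    name: str
    threshold: float          # minimum acceptable value (assumes higher is better)
    objective: float          # desired value
    measurements: list[float] = field(default_factory=list)

    def record(self, value: float, test_event: str) -> None:
        """Record a developmental test result for this parameter."""
        print(f"{test_event}: {self.name} = {value}")
        self.measurements.append(value)

    def meets_threshold(self) -> bool:
        """True if the best demonstrated result meets or exceeds the threshold."""
        return bool(self.measurements) and max(self.measurements) >= self.threshold


# Hypothetical usage: track one parameter across two DT events.
detection_range = TechnicalParameter(name="Detection range (km)",
                                     threshold=40.0, objective=55.0)
detection_range.record(36.5, test_event="DT-1 lab characterization")
detection_range.record(42.1, test_event="DT-2 open-air test")

if not detection_range.meets_threshold():
    print(f"Flag {detection_range.name} for root cause determination")
```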
DT&E Execution, Evaluation, and Reporting
Reference Source: DODI 5000.89, Section 5.3
DT&E Execution.
The PM and test team will develop detailed test plans for each DT event identified in the TEMP. The PM, in concert with the user and T&E community, will provide relevant safety documentation (e.g., occupational health risk acceptance) and required documentation (e.g., the National Environmental Policy Act and Executive Order 12114 documentation for the DT event, safety, and occupational health risk assessment) to testers before any test that may affect safety of personnel. The PM will conduct test readiness reviews for those events identified in the TEMP, or other test strategy documentation.
DT&E Evaluation.
DT&E Program Assessments.
For ACAT 1B/1C programs on the T&E oversight list for which USD(R&E) did not conduct a DT&E sufficiency assessment, the USD(R&E) will provide the MDA with a program assessment at the development RFP release decision point and MS B and C. This will be updated to support the operational test readiness review or as requested by the MDA or PM. The program assessment will be based on the completed DT&E and any operational T&E activities completed to date, and will address the adequacy of the program planning, the implications of testing results to date, and the risks to successfully meeting the goals of the remaining T&E events in the program.
DT&E Sufficiency Assessments.
In accordance with Sections 2366b(c)(1) and 2366c(a)(4) of Title 10, U.S.C., when the USD(A&S) is the MDA, the USD(R&E) will conduct DT&E sufficiency assessments for MDAPs to be included in MS B and MS C brief summary reports provided to the congressional defense committees. When the Service or the Component acquisition executive is the MDA, the senior official within the Military Department, Defense Agency, or DoD Field Activity with responsibility for DT will conduct DT&E sufficiency assessments for MDAPs to be included in MS B and MS C brief summary reports provided to the congressional defense committees.
DT&E Reports and Data.
- The USD(R&E) and the acquisition chain of command and their designated representatives will have full and prompt access to all ongoing developmental testing and integrated testing, and all DT and integrated test records and reports, including but not limited to: data from all tests, recurring test site status and execution reports, system logs, execution logs, test director notes, certifications, user and operator assessments, and surveys. This applies to all government-accessible data including classified, unclassified, and competition sensitive or proprietary data. Data may be preliminary and identified as such, when applicable.
- The PM and test agencies for all T&E oversight programs will provide DTIC with all reports and the supporting data for the test events in those reports.
- The DoD Components will collect and retain data from DT&E, integrated testing, and OT&E on the reliability and maintainability of ACAT I and II programs.
Developmental Test and Evaluation (DT&E) Guidance
Reference Source: DAG CH 8-3.1 Developmental T&E
Developmental Test and Evaluation (DT&E) is the disciplined process of generating substantiated knowledge on the capabilities and limitations of systems, subsystems, components, software, and materiel. This knowledge is used to inform decision-makers on risks in acquisition, programmatic, technical, and operational decisions throughout the acquisition life cycle. DT&E assesses maturity of technologies, system design, readiness for production, acceptance of government ownership of systems, readiness to participate in distributed and operational T&E, and sustainment.
Both test and evaluation are necessary to gain value from a DT&E effort. In the context of DT&E, an entity can be a technology, process, materiel, software module, component, subsystem, system, or system-of-systems. Identified conditions refer to test conditions that are controlled, uncontrolled, measured, or not measured. Developmental evaluations are accomplished using criteria derived from various sources. The most common sources are the mission sets from the Concept of Operations/Operational Mode Summary/Mission Profile (CONOPS/OMS/MP); the capability gaps; the user requirements specified in the capabilities documents (Initial Capabilities Document (ICD), Capability Development Document (CDD), Capability Production Document (CPD)); the Critical Operational Issues (COIs) and Critical Operational Issue Criteria (COIC); the design measures contained in the technical requirements documents (TRD); and contractual performance specifications. One set of tests can result in multiple developmental evaluations.
A DT&E program will:
- Verify achievement of critical technical parameters and the ability to achieve key performance parameters, and assess progress toward achievement of critical operational issues.
- Assess the system’s ability to achieve the thresholds prescribed in the capabilities documents.
- Provide data to the program manager to enable root cause determination and to identify corrective actions.
- Validate system functionality.
- Provide information for cost, performance, and schedule tradeoffs.
- Assess system specification compliance.
- Report on program progress to plan for reliability growth and to assess reliability and maintainability performance for use during key reviews.
- Identify system capabilities, limitations, and deficiencies.
- Include T&E activities to detect cyber vulnerabilities within custom and commodity hardware and software.
- Assess system safety.
- Assess compatibility with legacy systems.
- Stress the system within the intended operationally relevant mission environment.
- Support cybersecurity assessments and authorization, including Risk Management Framework security controls.
- Support the interoperability certification process.
- Document achievement of contractual technical performance, and verify incremental improvements and system corrective actions.
- Assess entry criteria for Initial Operational Test and Evaluation (IOT&E) and Follow-On Operational Test and Evaluation.
- Provide DT&E data to validate parameters in models and simulations.
- Assess the maturity of the chosen integrated technologies.
Other areas DT&E contributes to include:
- Data collection, migration, management, and archiving.
- Software functionality validation.
- Cybersecurity.
- Interoperability.
- Interface design and management.
- Integration.
- Modeling and simulation verification, validation, and accreditation.
- Environmental compliance and impact.
- Reliability.
- Logistics Demonstration.
Program Planning
Reference Source: DAG CH 8-3.1 Developmental T&E
The Test and Evaluation Master Plan (TEMP) is the primary planning and management tool for the integrated test program. At a minimum, the following documents are used to support development of the TEMP (unless an MDA waiver is obtained):
- JCIDS documents (ICD, CDD, CPD)
- Critical Operational Issues (COIs) and Critical Operational Issue Criteria (COIC)
- Analysis of Alternatives (AoA)
- System Threat Assessment Report (STAR) (Note: The Validated Online Life-cycle Threat (VOLT) is being developed to replace the STAR.)
- Acquisition Strategy (AS)
- Systems Engineering Plan (SEP)
- Program Protection Plan (PPP)
- Cybersecurity Strategy
- Security Plan
- Security Assessment Plan
- Information Support Plan (ISP)
- Acquisition Program Baseline (APB)
- Cost Analysis Requirements Description (CARD)
- Concept of Operations/Operational Mode Summary/Mission Profile (CONOPS/OMS/MP)
Evaluation of Developmental Test Adequacy
Reference Source: DAG CH 8-3.1 Developmental T&E
DT&E provides feedback to the PMs and decision-makers to inform decision-making throughout the acquisition cycle. The PM uses the TEMP as the primary planning and management tool for the integrated test program. The TEMP should describe a logical DT&E strategy, including:
(1) decisions to be informed by the DT&E information,
(2) evaluations to inform those decisions,
(3) test and modeling and simulation events to be conducted to generate the data for the evaluation, and
(4) resources to be used and schedules to be followed to execute T&E events.
A comprehensive DT&E program generates the key data used to evaluate technologies, components, subsystems, interoperability, cybersecurity, and reliability capabilities. The TEMP includes a developmental evaluation framework that shows the correlation/mapping between decisions, capabilities to be evaluated, measures to be used to quantify the capabilities, and test and modeling and simulation events.
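The developmental evaluation framework described above is, in essence, a traceability matrix from decisions to capabilities, measures, and data-generating events. As a minimal illustrative sketch (the decision, capability, measure, and event names below are hypothetical and not taken from any TEMP), the mapping could be captured as simple structured data and checked for completeness:

```python
# A minimal, hypothetical sketch of a developmental evaluation framework:
# each acquisition decision maps to the capabilities it depends on, the
# measures that quantify those capabilities, and the test or M&S events
# that generate the supporting data. All names are illustrative only.
evaluation_framework = {
    "Milestone C production decision": [
        {
            "capability": "Target detection",
            "measures": ["Probability of detection", "False alarm rate"],
            "events": ["DT-2 open-air test", "Sensor performance M&S run"],
        },
        {
            "capability": "Operational availability",
            "measures": ["Mean time between failure", "Mean time to repair"],
            "events": ["Reliability growth test", "Maintainability demonstration"],
        },
    ],
}

# Simple traceability check: every decision should map to at least one
# capability, and every capability row should have measures and events.
for decision, rows in evaluation_framework.items():
    for row in rows:
        assert row["measures"] and row["events"], (
            f"{decision}: '{row['capability']}' lacks measures or events"
        )
    print(f"{decision}: {len(rows)} capabilities mapped")
```

In an actual program this mapping lives in the TEMP's developmental evaluation framework table; the sketch only illustrates the point that every decision should trace to capabilities, every capability to measures, and every measure to at least one test or modeling and simulation event.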