Middle Tier of Acquisition (MTA)


Test & Demonstrate

How To Use This Site

Each page in this pathway presents curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources, and it provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

DoD and Service policy is indicated by a BLUE vertical line.

Directly quoted material is preceded by a link to the Reference Source.

Operational environment: A set of operational conditions, selected by the users in coordination with the appropriate independent operational testing agency, that are representative of the desired spectrum of operational employments.

DODI 5000.80

Reference Source: DODI 5000.80, Paragraph 3.2.b

 

Demonstrating and Evaluating Performance. DoD Components will develop a process for demonstrating performance and evaluating for current operational purposes the proposed products and technologies. This process will result in a test strategy or an assessment of test results, included in the acquisition strategy, documenting the evaluation of the demonstrated operational performance, to include validation of required cybersecurity and interoperability as applicable. The operational demonstration assessment will support the initial production decision by the DA. Programs on the DOT&E oversight list will follow applicable procedures.

Check out the MTA FAQs and MTA Tips for related questions from the field and helpful tips!

Rapid Fielding Test and Evaluation (T&E)

Overview of T&E Procedures

Reference Source: DoDI 5000.89, Section 3.1.a-c

 

The fundamental purpose of T&E is to enable the DoD to acquire systems that support the warfighter in accomplishing their mission. To that end, T&E provides engineers and decision-makers with knowledge to assist in managing risks; to measure technical progress; and to characterize operational effectiveness, operational suitability, interoperability, survivability (including cybersecurity), and lethality. This is done by planning and executing a robust and rigorous T&E program.

 

Integrated testing and independent evaluation are part of a larger continuum of T&E that includes DT&E (both contractor and government), OT&E, and LFT&E. Integrated testing requires the collaborative planning and execution of test phases and events to provide shared data in support of independent analysis, evaluation, and reporting by all stakeholders. Whenever feasible, the programs will conduct testing in an integrated fashion to permit all stakeholders to use data in support of their respective functions.

 

Programs will incorporate integrated testing at the earliest opportunity when developing program strategies, plans with program protection, documentation, and T&E strategies or the TEMPs. Developing and adopting integrated testing early in the process increases the effectiveness and efficiency of the overall T&E program.

 

MTA Ops Demo

Reference Source: DoDI 5000.89, Section 4.3.d

 

For rapid fielding, a test plan is developed. The lead OTA will plan and conduct the ops demo as an operational assessment (OA), with representative units, missions, and environments. Ops demos may consist of a series of incremental test events or separate “capstone” demonstration events based on program requirements. All events should be conducted in an integrated fashion, supported by collaborative developer, program office, DT, and OT planning.

 

Ops demos should consider all aspects of system performance, including survivability and lethality if deemed critical to mission effectiveness or force protection. During the demo, operational personnel will operate the system, with the minimum necessary level of contractor support. Mission demonstrations should be designed as end-to-end missions to the maximum extent possible, to include planning, mission task execution, and post-mission activities, based on user-provided employment concepts and tactics.

 

The OTA must submit the ops demo plan leading to a fielding decision or transition to another pathway to the DOT&E for approval before testing begins. The plan will adequately detail: system configuration; capabilities to be demonstrated; the operational units, users, mission, and environment; and the primary T&E data that will demonstrate the required capabilities.

DoD Component Guidance

Note that DoD Component MTA implementation policies and guidance are currently being updated for consistency with DoDI 5000.80 (30 December 2019; Change 1 effective 25 November 2024).

Air Force

Army

Reference Source: ASA(ALT) Middle Tier of Acquisition Policy, 20 March 2020, Enclosure 1
[Note: CAC required for access]

 

The PM will develop a process for demonstrating performance and evaluating for current operational purposes the proposed products and technologies. This process will result in a test strategy or an assessment of test results, included in the acquisition strategy, documenting the evaluation of the demonstrated operational performance, to include validation of required cybersecurity and interoperability as applicable. An independent assessment from the U.S. Army Test and Evaluation Command is recommended. Programs on the Director, Operational Test and Evaluation oversight list will follow applicable procedures.

 

Reference Source: Policy Directive for Test and Evaluation of MTA Programs, 28 Feb 2019

 

Section 3. Background.

 

a.  An MTA program that is a “covered system,” “major munitions program,” “missile program,” or “covered product improvement program” as defined in Title 10 U.S.C. Section 2366 is subject to the requirements of that section pertaining to survivability and lethality testing.

 

b.  An MTA program that is an MDAP pursuant to Director of Operational Test and Evaluation designation under Title 10 U.S.C. Section 139(a)(2)(B) is subject to the requirements of Title 10 U.S.C. Section 2399 pertaining to Operational Test and Evaluation (OT&E).

 

c.  An MTA program that is a “major system” for the purposes of Title 10 U.S.C. Section 2400 and DoDI 5000.02 is subject to the requirements of those two provisions pertaining to LRIP.

 

d. Test and Evaluation (T&E) characterizes risks in order to inform decision makers. For MTA programs, decision makers may accept risk in order to expedite the prototyping and fielding of advanced technologies and capabilities to the Warfighter.

 

 

Section 4. Directive.

 

a.  A T&E strategy is required for all MTA programs. A formal T&E Master Plan is not required for MTA programs.

 

b.  In coordination with the T&E organization, the materiel developer will develop a strategy that defines the appropriate scope of T&E. The T&E strategy informs the MTA initiation request and is included in the MTA program strategy (reference 1.b) or Single Acquisition Management Plan (SAMP). The T&E strategy must be approved by the materiel developer, the responsible T&E organization, and the capability proponent at the equivalent rank or grade as the MTA Decision Authority. If the AAE is the MTA Decision Authority, the T&E strategy shall be coordinated with the Army T&E Executive.

 

c.  The T&E strategy will provide sufficient detail to allow for resourcing of all data collection events supporting assessment (progress towards demonstrating operational effectiveness, operational suitability and survivability) and/or evaluation (determination of operational effectiveness, operational suitability and survivability) to include modeling and simulation (M&S), experimentation, demonstrations, contractor testing and government testing. It will describe how the planned T&E will be used to inform and/or validate program requirements. The T&E strategy shall include an integrated test program schedule; any technical, developmental, operational, or integrated test events and objectives; a technical and operational evaluation framework; and a test resource summary.

 

(1) The T&E strategy will provide for sufficient data to determine the operational effectiveness, operational suitability, survivability and safety of an MTA program, while at the same time allowing Army decision makers to understand program risks.

 

(2) The T&E strategy will be based on program requirements and consideration of factors such as existing data, technology maturity, operator complexity, integration and interoperability characteristics, mission impacts, training, and sustainment/logistical needs. It will employ representative threats or validated and accredited models to assess the MTA program. The Independent Evaluator will provide input to the MTA Decision Authority on the risks, T&E limitations, and program requirements not addressed.

 

(3) Since requirements will likely evolve over the course of MTA program development, it is prudent that the T&E strategy set the conditions for periodic assessments of performance and validation/revalidation of requirements as execution proceeds. Modeling, experimentation (reference 1.l) and T&E results will be used to validate and refine MTA program requirements and determine the military utility of the program. The capability proponent shall use T&E results to modify the applicable requirement document, engaging in continuous learning as the T&E strategy unfolds in a test-learn-fix-test approach.

 

(4) The T&E strategy will maximize use of flexible and innovative T&E approaches (e.g., hierarchical Bayesian models for reliability, M&S, and risk-based T&E). Supporting MTA programs with responsive T&E requires an innovative environment of continuous learning that mitigates or informs risk. This can be enabled through:

 

(a) Use of small events that are focused on particular aspects of performance. These can be credible contractor or government-led events focused on particular aspects of a system. These events can be used to inform requirements that may have otherwise been established without qualitative analysis.

 

(b) Increased use of M&S. When M&S is going to be used for the MTA program, the plan for verification, validation, and accreditation (VV&A) shall be incorporated into the T&E strategy (reference 1.m.). When the M&S is well rationalized, lack of VV&A should not impede its use. VV&A of all M&S shall be considered during the course of the program and appropriate weight given to its findings. Expanded use of M&S will become increasingly important as the Army expands its development and application of artificial intelligence and machine learning systems where physical verification of state conditions and responses will become prohibitive.

 

(c) Greater reliance on contractor expertise and M&S for MTA system reliability. Incorporate use of reliability design activities as a primary source of data. Rapid improvement events may be conducted at a component or subsystem level to work out design flaws or to improve design margins. The materiel developer, in coordination with the T&E organization, is responsible for determining the scope of the reliability design activities and ensuring the MTA program contract is structured to obtain the required data and analyses. Concepts such as Bayesian statistics, highly accelerated life testing, highly accelerated stress screening and rapid design improvement events are to be considered. Traditional sequential reliability growth testing is discouraged.
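
To illustrate the Bayesian reliability concepts named in paragraph c.(4) and subparagraph (c) above, the following minimal sketch (an editorial illustration, not part of the Army directive) shows a conjugate Beta-Binomial update that combines prior evidence, such as contractor reliability design activities or accredited M&S, with a small government demonstration. The prior parameters, mission counts, and results are hypothetical assumptions chosen only for the example.

# Minimal, illustrative Beta-Binomial reliability update (hypothetical numbers).
# Prior evidence (e.g., contractor reliability design activities or accredited M&S)
# is encoded as a Beta(a, b) prior on mission reliability; a small government
# demonstration then updates it to a posterior.
from scipy import stats

prior_a, prior_b = 18.0, 2.0    # assumed prior: roughly 18 successes in 20 equivalent trials
successes, trials = 14, 15      # assumed ops demo result: 14 of 15 missions successful

# Conjugate update: posterior is Beta(prior_a + successes, prior_b + failures)
posterior = stats.beta(prior_a + successes, prior_b + (trials - successes))

print(f"Posterior mean reliability: {posterior.mean():.3f}")
low, high = posterior.interval(0.80)    # 80% credible interval
print(f"80% credible interval: [{low:.3f}, {high:.3f}]")

Read this way, limited government test data are weighed against earlier evidence rather than driving a long sequential reliability growth program; a full hierarchical Bayesian model (pooling data across subsystems or test phases) would typically require dedicated probabilistic modeling tools, but the update logic is the same.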

 

d.  Increased prototyping and experimentation (reference 1.l) allows for early learning of system capabilities, limitations and military utility. Soldiers will be involved and leveraged throughout prototyping, experimentation and developmental testing to characterize operational risks in order to inform decision makers. The use of reserve component units for experimentation and testing is encouraged, provided that Active Duty Operational Support funding is available. National Guard and Army Reserve personnel may provide recent and relevant industry experience that is valuable for assessing or evaluating the systems under development. Operational testing will focus on data gaps that remain after reviewing all available data to complete an operational effectiveness, operational suitability and survivability evaluation.

 

e.  The T&E organization will provide continuous and cumulative assessments during program execution to support MTA program knowledge points and decision reviews. Evaluations and assessments of MTA programs will consider all credible data. This may require the materiel developer to establish standard language, metrics, a data dictionary, data methods, and database structures across M&S, experimentation, demonstrations, contractor testing and government testing. The responsible T&E organization will advise on the adequacy of existing data to satisfy requirements verification.

 

f.  In accordance with reference 1.f, US Army Test and Evaluation Command (ATEC) must complete a safety release before any hands-on testing, use or maintenance by Soldiers. In accordance with reference 1.n, ATEC must complete a safety confirmation to support Materiel Release.

 

g.  If the MTA program satisfies the criteria described in paragraph 3.a of this directive, it must complete survivability and lethality testing (reference 1.h) and OT&E (reference 1.j) before production exceeds LRIP-equivalent quantities (reference 1.k).

 

h.  The delivery of contractor test data will be included in the materiel developer’s contracting actions. The materiel developer will deliver the data to the responsible T&E organization. The responsible T&E organization will be afforded the opportunity to review test plans for contractor testing; witness testing at the contractor facility; and review test reports (to include all derived test data), in conjunction with the materiel developer, to assess the adequacy of the test data for use in evaluation. The materiel developer will include the use of contractor test data in their MTA program strategy or SAMP. In addition, the materiel developer’s Request for Proposal will address program T&E needs, to include test articles, T&E data rights and government access to test data.

 

i.  All T&E-related approvals external to the Army require coordination through the Army T&E Executive.

Navy

SOCOM

Reference Source: USSOCOM Middle Tier Acquisition Authorities and Guidance, 1 Aug 2018

Test Plan: An MTA Strategy must account for the capability receiving an F&DR or CF&DR before fielding. The Test Officer will play a critical role in ensuring that appropriately tailored test events are planned, documented in the SAMP, resourced, and executed so that the capability is safe, suitable, and effective on schedule.

Additional Resources