Middle Tier of Acquisition (MTA)

Test & Demonstrate

How to use this site

Each page in this pathway presents curated knowledge drawn from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources, and it provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

DoD and Service policy is indicated by a BLUE vertical line.

Directly quoted material is preceded by a link to the Reference Source.

Operational environment: A set of operational conditions, selected by the users in coordination with the appropriate independent operational testing agency, that are representative of the desired spectrum of operational employments.

DODI 5000.80

Reference Source: DODI 5000.80, Paragraph 3.2.b

 

Demonstrating and Evaluating Performance. DoD Components will develop a process for demonstrating performance and evaluating for current operational purposes the proposed products and technologies. This process will result in a test strategy or an assessment of test results, included in the acquisition strategy, documenting the evaluation of the demonstrated operational performance, to include validation of required cybersecurity and interoperability as applicable. The operational demonstration assessment will support the initial production decision by the DA. Programs on the DOT&E oversight list will follow applicable procedures.

Check out the MTA FAQs and MTA Tips for potentially related questions from the field and helpful tips!

Purpose and Applicability of Test and Evaluation (T&E)

Reference Source: Guidance from DOT&E, 2019

MTA programs are required by Pub. L. 114-92 to demonstrate and evaluate operational performance by either “be[ing] demonstrated in an operational environment” or going through a “process for demonstrating performance and evaluating for current operational purposes.”  As required by DoDI 5000.80 and 10 U.S.C. 139, the Director, Operational Test and Evaluation (DOT&E) monitors and reviews both these demonstrations and evaluations and any other operational test and evaluation (OT&E) that the MTA program may be required to conduct.  For example, a program that meets the definition of a covered system in 10 U.S.C. 2366 must complete live fire test and evaluation (LFT&E), and a program that meets the requirements described in 10 U.S.C. 2399(a)(2) is required to complete initial operational test and evaluation (IOT&E) before proceeding beyond low-rate initial production.

 

To fulfill this statutory obligation, DOT&E monitors MTA programs using the criteria for operational test and evaluation (OT&E) and live fire test and evaluation (LFT&E) outlined in 10 U.S.C. 139, 2366, and 2399, and uses the considerations in DOT&E Memorandum, “Designation of Programs for Director, Operational Test and Evaluation Oversight,” July 13, 2018, to determine the need for DOT&E oversight of particular programs.

General T&E Approach

Reference Source: Guidance from DOT&E, 2019

Rapid Prototyping or Rapid Fielding programs should design demonstrations and/or tests consistent with how the prototype, capability, product, or technology (PCPT) will be fielded, in order to evaluate the ability of warfighters to complete relevant missions using the system’s capabilities.  In particular, the scope of these events should consider whether a PCPT will be fielded to troops in combat, to non-combat areas, or into a training environment, and should capture the benefits and risks to the ability of warfighters to complete relevant missions.

The program manager is responsible for resourcing and executing the test and evaluation (T&E) program, which includes the demonstrations and evaluations required by Pub. L. 114-92 and any other OT&E required by statute.  Because of the accelerated nature of MTA programs, the program manager should assemble a team, including the appropriate Service/Agency Operational Test Agency (OTA), early in program planning to build a foundation for a successful strategy and events.  The program manager documents the T&E program planning in a test strategy.  As part of the required documentation for an MTA program, the DA approves the test strategy.  Test strategies for programs under DOT&E oversight also require DOT&E approval.

 

Rapid Fielding Test Strategy

Reference Source: Guidance from DOT&E, 2019

A system using the Rapid Fielding pathway is intended to begin production within six months of program start; this timeline depends on using proven technologies that require minimal development.  The purpose of the test strategy for a Rapid Fielding pathway is to inform the decision to begin production.  As such, the test strategy must provide the needed decision information before production begins.  The timelines are such that a report might use results from activities accomplished before the program started.  Because testing for a Rapid Fielding program occurs before production, and may include activities accomplished before designation as a program, involving the Service or Agency OTA early in the planning process is critical.

Results from previous activities used to support a Rapid Fielding program will be approved by the DA as part of the required program documentation.  Test results supporting MTA Rapid Fielding programs under DOT&E oversight will also require DOT&E approval.  Test reports from previous testing should identify the operational conditions under which the test was conducted, how those conditions represent the actual operational environment, and any limitations to evaluating operational performance across the actual spectrum of operational employments.

The program manager will document in the test strategy any events, in addition to previous activity, required to support a production decision.  To support resourcing and planning, the test strategy should describe the set of operational conditions chosen to represent the operational environment.  If not described in the program schedule, the test strategy should identify the needed resources (ranges, teams, operational forces, targets, etc.) with associated timelines.

 

Guidance from the Director, Operational Test and Evaluation (DOT&E)

The Director, Operational Test and Evaluation (DOT&E) published Live-Fire Test and Evaluation planning guidelines for MTA programs on 24 Oct 2019. This document is For Official Use Only and cannot be accessed from this site.

 

DoD Component Guidance

Note that DoD Component MTA Implementation policies and guidance are currently being updated to be consistent with the newly published DODI 5000.80 (effective 30 Dec 2019). 

Air Force

Army

Reference Source: ASA(ALT) Middle Tier of Acquisition Policy, 20 March 2020, Enclosure 1

 

The PM will develop a process for demonstrating performance and evaluating for current operational purposes the proposed products and technologies. This process will result in a test strategy or an assessment of test results, included in the acquisition strategy, documenting the evaluation of the demonstrated operational performance, to include validation of required cybersecurity and interoperability as applicable. An independent assessment from the U.S. Army Test and Evaluation Command is recommended. Programs on the Director, Operational Test and Evaluation oversight list will follow applicable procedures.

 

Reference Source: Policy Directive for Test and Evaluation of MTA Programs, 28 Feb 2019

 

Section 3. Background.

 

a.  An MTA program that is a “covered system,” “major munitions program,” “missile program,” or “covered product improvement program” as defined in Title 10 U.S.C. Section 2366 is subject to the requirements of that section pertaining to survivability and lethality testing.

 

b.  An MTA program that is an MDAP pursuant to Director of Operational Test and Evaluation designation under Title 10 U.S.C. Section 139(a)(2)(B) is subject to the requirements of Title 10 U.S.C. Section 2399 pertaining to Operational Test and Evaluation (OT&E).

 

c.  An MTA program that is a “major system” for the purposes of Title 10 U.S.C. Section 2400 and DoDI 5000.02 is subject to the requirements of those two provisions pertaining to low-rate initial production (LRIP).

 

d. Test and Evaluation (T&E) characterizes risks in order to inform decision makers. For MTA programs, decision makers may accept risk in order to expedite the prototyping and fielding of advanced technologies and capabilities to the Warfighter.

 

 

Section 4. Directive.

 

a.  A T&E strategy is required for all MTA programs. A formal T&E Master Plan is not required for MTA programs.

 

b.  In coordination with the T&E organization, the materiel developer will develop a strategy that defines the appropriate scope of T&E. The T&E strategy informs the MTA initiation request and is included in the MTA program strategy (reference 1.b) or Single Acquisition Management Plan (SAMP). The T&E strategy must be approved by the materiel developer, the responsible T&E organization, and the capability proponent at the equivalent rank or grade as the MTA Decision Authority. If the AAE is the MTA Decision Authority, the T&E strategy shall be coordinated with the Army T&E Executive.

 

c.  The T&E strategy will provide sufficient detail to allow for resourcing of all data collection events supporting assessment (progress towards demonstrating operational effectiveness, operational suitability and survivability) and/or evaluation (determination of operational effectiveness, operational suitability and survivability) to include modeling and simulation (M&S), experimentation, demonstrations, contractor testing and government testing. It will describe how the planned T&E will be used to inform and/or validate program requirements. The T&E strategy shall include an integrated test program schedule; any technical, developmental, operational, or integrated test events and objectives; a technical and operational evaluation framework; and a test resource summary.

 

(1) The T&E strategy will provide for sufficient data to determine the operational effectiveness, operational suitability, survivability and safety of an MTA program, while at the same time allowing Army decision makers to understand program risks.

 

(2) The T&E strategy will be based on program requirements and consideration of factors such as existing data, technology maturity, operator complexity, integration and interoperability characteristics, mission impacts, training, and sustainment/logistical needs. It will employ representative threats or validated and accredited models to assess the MTA program. The Independent Evaluator will provide input to the MTA Decision Authority on the risks, T&E limitations, and program requirements not addressed.

 

(3) Since requirements will likely evolve over the course of MTA program development, it is prudent that the T&E strategy sets the conditions for periodic assessments of performance and validation/revalidation of requirements as execution proceeds. Modeling, experimentation (reference 1.l), and T&E results will be used to validate and refine MTA program requirements and determine the military utility of the program. The capability proponent shall use T&E results to modify the applicable requirement document, engaging in continuous learning as the T&E strategy unfolds in a test-learn-fix-test approach.

 

(4) The T&E strategy will maximize use of flexible and innovative T&E approaches (e.g., hierarchical Bayesian models for reliability, M&S, and risk-based T&E). Supporting MTA programs with responsive T&E requires an innovative environment of continuous learning that mitigates or informs risk. This can be enabled through:

 

(a) Use of small events that are focused on particular aspects of performance. These can be credible contractor or government-led events focused on particular aspects of a system. These events can be used to inform requirements that may have otherwise been established without qualitative analysis.

 

(b) Increased use of M&S. When M&S is going to be used for the MTA program, the plan for verification, validation, and accreditation (VV&A) shall be incorporated into the T&E strategy (reference 1.m). When the M&S is well rationalized, lack of VV&A should not impede its use. VV&A of all M&S shall be considered during the course of the program and appropriate weight given to its findings. Expanded use of M&S will become increasingly important as the Army expands its development and application of artificial intelligence and machine learning systems, where physical verification of state conditions and responses will become prohibitive.

 

(c) Greater reliance on contractor expertise and M&S for MTA system reliability. Incorporate use of reliability design activities as a primary source of data. Rapid improvement events may be conducted at a component or subsystem level to work out design flaws or to improve design margins. The materiel developer, in coordination with the T&E organization, is responsible for determining the scope of the reliability design activities and ensuring the MTA program contract is structured to obtain the required data and analyses. Concepts such as Bayesian statistics, highly accelerated life testing, highly accelerated stress screening, and rapid design improvement events are to be considered. Traditional sequential reliability growth testing is discouraged.

 

d.  Increased prototyping and experimentation (reference 1.l) allows for early learning of system capabilities, limitations and military utility. Soldiers will be involved and leveraged throughout prototyping, experimentation and developmental testing to characterize operational risks in order to inform decision makers. The use of reserve component units for experimentation and testing is encouraged, provided that Active Duty Operational Support funding is available. National Guard and Army Reserve personnel may provide recent and relevant industry experience that is valuable for assessing or evaluating the systems under development. Operational testing will focus on data gaps that remain after reviewing all available data to complete an operational effectiveness, operational suitability and survivability evaluation.

 

e.  The T&E organization will provide continuous and cumulative assessments during program execution to support MTA program knowledge points and decision reviews. Evaluations and assessments of MTA programs will consider all credible data. This may require the materiel developer to establish standard language, metrics, data dictionaries, data methods, and database structures across M&S, experimentation, demonstrations, contractor testing, and government testing. The responsible T&E organization will advise on the adequacy of existing data to satisfy requirements verification.

 

f.  In accordance with reference 1.f, US Army Test and Evaluation Command (ATEC) must complete a safety release before any hands-on testing, use or maintenance by Soldiers. In accordance with reference 1.n, ATEC must complete a safety confirmation to support Materiel Release.

 

g.  If the MTA program satisfies the criteria described in paragraph 3.a of this directive it must complete survivability and lethality testing (reference 1.h) and OT&E (reference 1.j) before production exceeds LRIP equivalent quantities (reference 1.k).

 

h.  The delivery of contractor test data will be included in the materiel developer’s contracting actions. The materiel developer will deliver the data to the responsible T&E organization. The responsible T&E organization will be afforded the opportunity to review test plans for contractor testing; witness testing at the contractor facility; and review test reports (to include all derived test data), in conjunction with the materiel developer, to assess the adequacy of the test data for use in evaluation. The materiel developer will include the use of contractor test data in their MTA program strategy or SAMP. In addition, the materiel developer’s Request for Proposal will address program T&E needs, to include test articles, T&E data rights and government access to test data.

 

i.  All T&E-related approvals external to the Army require coordination through the Army T&E Executive.
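
Paragraph 4.c(4) of the Army directive above encourages flexible, innovative T&E approaches such as hierarchical Bayesian models for reliability and greater reliance on contractor reliability data. As a minimal, hedged illustration of that idea, the sketch below performs a simple conjugate beta-binomial reliability update in Python; the prior parameters, trial counts, and the SciPy dependency are illustrative assumptions, not values or tools prescribed by the directive.

# Illustrative sketch only: a minimal beta-binomial Bayesian reliability
# update of the kind the directive's "Bayesian statistics" and
# "hierarchical Bayesian models for reliability" language points toward.
# All numbers below are assumed for the example.
from scipy.stats import beta

# Assumed prior on mission reliability (e.g., informed by contractor
# design analysis or earlier testing): Beta(8, 2) has mean 0.80.
alpha_prior, beta_prior = 8.0, 2.0

# Assumed new demonstration data: 27 successful mission trials out of 30.
successes, trials = 27, 30
failures = trials - successes

# Conjugate update: posterior is Beta(alpha + successes, beta + failures).
alpha_post = alpha_prior + successes
beta_post = beta_prior + failures

posterior_mean = alpha_post / (alpha_post + beta_post)
lower_80 = beta.ppf(0.10, alpha_post, beta_post)  # bounds of an 80% credible interval
upper_80 = beta.ppf(0.90, alpha_post, beta_post)

print(f"Posterior mean reliability: {posterior_mean:.3f}")
print(f"80% credible interval: ({lower_80:.3f}, {upper_80:.3f})")

A fully hierarchical version would pool data across components or subsystems, but this single-level update is enough to show how prior engineering knowledge and new test data can be combined into a reliability estimate with quantified uncertainty.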

Navy

SOCOM

Reference Source: USSOCOM Middle Tier Acquisition Authorities and Guidance, 1 Aug 2018

Test Plan: An MTA Strategy must account for the capability receiving an F&DR or CF&DR before fielding. The Test Officer will play a critical role in ensuring that appropriately tailored test events are planned, documented in the SAMP, resourced, and executed so that the capability is safe, suitable, and effective on schedule.

Additional Resources