Middle Tier of Acquisition (MTA)


Test & Demonstrate

How To Use This Site

Each page in this pathway presents curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources. It also provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

DoD and Service policy is indicated by a BLUE vertical line.

Directly quoted material is preceded with a link to the Reference Source.

Reference Source: DoDI 5000.80, Paragraph 3.1.c

 

Demonstrating and Evaluating Performance. DoD Components will develop a process for demonstrating performance and evaluating for current operational purposes the proposed products and technologies. This process will result in a test strategy or an assessment of test results, included in the acquisition strategy, documenting the evaluation of the demonstrated operational performance, to include validation of required cybersecurity and interoperability as applicable. Programs on the DOT&E oversight list will follow applicable procedures.

Operational Environment: A set of operational conditions, selected by the users in coordination with the appropriate independent operational testing agency, that are representative of the desired spectrum of operational employments.

DoDI 5000.80

Check out the MTA FAQs and MTA Tips for related questions from the field and helpful pointers!

Rapid Prototyping Test and Evaluation (T&E)

Overview of T&E Procedures

Reference Source: DoDI 5000.89, Section 3.1.a-c

 

The fundamental purpose of T&E is to enable the DoD to acquire systems that support the warfighter in accomplishing their mission. To that end, T&E provides engineers and decision-makers with knowledge to assist in managing risks; to measure technical progress; and to characterize operational effectiveness, operational suitability, interoperability, survivability (including cybersecurity), and lethality. This is done by planning and executing a robust and rigorous T&E program.

 

Integrated testing and independent evaluation are part of a larger continuum of T&E that includes DT&E (both contractor and government), OT&E, and LFT&E. Integrated testing requires the collaborative planning and execution of test phases and events to provide shared data in support of independent analysis, evaluation, and reporting by all stakeholders. Whenever feasible, the programs will conduct testing in an integrated fashion to permit all stakeholders to use data in support of their respective functions.

 

Programs will incorporate integrated testing at the earliest opportunity when developing program strategies, plans with program protection, documentation, and T&E strategies or the TEMPs. Developing and adopting integrated testing early in the process increases the effectiveness and efficiency of the overall T&E program.

 

MTA Ops Demo

Reference Source: DoDI 5000.89, Section 4.3.d

 

In rapid prototyping, the OTA provides input to the DT plan for execution of the ops demo. The lead OTA will plan and conduct the ops demo as an OA, with representative units, missions, and environments. Ops demos may consist of a series of incremental test events or separate “capstone” demonstration events based on program requirements. All events should be conducted in an integrated fashion, supported by collaborative developer, program office, DT, and OT planning.

 

Ops demos should consider all aspects of system performance, including survivability and lethality if deemed critical to mission effectiveness or force protection. During the demo, operational personnel will operate the system, with the minimum necessary level of contractor support. Mission demonstrations should be designed as end-to-end missions to the maximum extent possible, to include planning, mission task execution, and post-mission activities, based on user-provided employment concepts and tactics.

 

The OTA must submit the ops demo plan leading to a fielding decision or transition to another pathway to the DOT&E for approval before testing begins. The plan will adequately detail: system configuration; capabilities to be demonstrated; the operational units, users, mission, and environment; and the primary T&E data that will demonstrate the required capabilities.

Types of Prototype Evaluations

Reference Source: DoD Prototyping Handbook, Oct 2022

The most important step in the prototyping process is often the evaluation of the prototype itself. Prototypes are frequently built specifically for the evaluation activity and discarded once it concludes. Evaluations should be designed and conducted in a way that addresses the purpose of the prototyping project. Evaluations typically come in three forms:

  • demonstrations,
  • experimentation, and
  • red teaming.

For prototypes that transition or integrate into MDAPs, FAR requirements for developmental testing and operational testing still apply.

Prototype Demonstrations

Reference Source: DoD Prototyping Handbook, Oct 2022

Demonstrations are evaluations specifically designed to determine if a prototype can do what it was developed to do. The desired outcome of a demonstration may be a “ready to fight” capability for the warfighter.

Prototype Experimentation

Reference Source: DoD Prototyping Handbook, Oct 2022

Experimentation is an evaluation method that uses prototypes to test hypotheses. Rather than simply demonstrating that a capability meets the need it was built to meet, experimentation stresses the technology to identify its full capability and limitations. In addition to evaluating the technical feasibility of a prototype, experimentation can also verify military utility and help in the development of preliminary CONOPS and Tactics, Techniques, and Procedures (TTPs) for emerging technological capabilities. Modeling and simulation is an excellent tool for experimentation.
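To make the idea concrete, here is a minimal sketch (not from the handbook) of how M&S-based experimentation can stress a prototype across conditions well beyond its nominal use case to map where performance degrades. The sensor model, the 40 km design limit, and the sea-state penalty are all invented assumptions for illustration.

```python
import random

def detect_probability(range_km: float, sea_state: int) -> float:
    """Hypothetical sensor model: detection odds fall with range and sea state."""
    base = max(0.0, 1.0 - range_km / 40.0)    # assumed 40 km design limit
    return max(0.0, base - 0.08 * sea_state)  # assumed environmental penalty

def run_trial(range_km: float, sea_state: int) -> bool:
    """One simulated engagement: success if the target is detected."""
    return random.random() < detect_probability(range_km, sea_state)

# Stress the prototype across a grid of conditions, not just the design point.
for sea_state in (0, 3, 5):
    for range_km in (10, 20, 30, 38):
        successes = sum(run_trial(range_km, sea_state) for _ in range(10_000))
        print(f"sea state {sea_state}, {range_km:2d} km: "
              f"P(detect) ~ {successes / 10_000:.2f}")
```

Sweeping the full condition grid exposes where capability degrades gracefully and where it collapses, exactly the kind of limitation experimentation is meant to surface before CONOPS and TTPs are written around the technology.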

Prototype Red Teaming

Reference Source: DoD Prototyping Handbook, Oct 2022

Red teaming is an effort, often performed by an independent organization, to identify potential vulnerabilities to the U.S. warfighter and to seek ways to make emerging technologies more secure. Red teaming is typically used for the following purposes:

  • To identify an adversary’s vulnerabilities and develop approaches to exploit them.
  • To investigate how an adversary might use emerging and trending technologies. This approach can help identify vulnerabilities of U.S. CONOPS and technology, but can also be used to identify ways of defeating emerging technologies.
  • To inform design choices early in a technology’s development cycle in an attempt to minimize vulnerabilities of U.S. systems or take advantage of the vulnerabilities of adversary systems.
  • To discover unconventional approaches that an adversary may use to counter DoD technologies and CONOPS. This includes not only technological adaptation, but also potential changes to their TTPs that could have cascading effects on our TTPs.

Prototype Test Environment

Reference Source: DoD Prototyping Handbook, Oct 2022

All evaluations should be conducted in a manner and at a venue that provides an environment relevant to the future decision that the prototyping project is intended to inform. Not all relevant environments are operational environments. Depending on the decision, the relevant environment could be a laboratory bench, a wind tunnel, a test and evaluation facility, a VR environment, a commercial environment, or an operational field exercise—to name just a few.

The key is to ensure the environment is relevant to the decision being informed. However, if the ultimate plan is to transition the prototype to operational use by the warfighter, the relevant environment must include putting the prototype in the hands of a warfighter in an operationally representative environment. This ensures that the prototype can be successfully used by a warfighter in an operational context to meet their capability need. In fact, Congress mandated that fieldable prototypes developed under the authorities in Section 804 of the FY16 NDAA (the “Middle Tier Acquisition” authorities) be demonstrated in an operational environment.

Examples of DoD Evaluation Venues

Reference Source: DoD Prototyping Handbook, Oct 2022

Participants in these events are typically responsible for covering their own costs.

Advanced Naval Technology Exercise (ANTX)

ANTX provides a demonstration and experimentation environment focused on specific technology areas or emerging warfighting concepts. ANTXs are loosely scripted experimentation events in which technologists and warfighters are encouraged to explore alternate tactics and technology pairings. Participants receive feedback from government technologists and operational SMEs. ANTXs are hosted by labs and warfare centers from across the naval R&D establishment.

Army Expeditionary Warrior Experiment (AEWE)

The Army Maneuver Center of Excellence conducts an annual AEWE campaign of experimentation to identify concepts and capabilities that enhance the effectiveness of the current and future forces by putting new technology in the hands of Soldiers. AEWE provides the opportunity to examine emerging technologies of promise; to experiment with small unit concepts and capabilities; and to help determine DOTMLPF implications of new capabilities.

Chemical Biological Operational Assessment (CBOA)

CBOAs are scenario-based events that support vulnerability and system limitation analysis of emerging capabilities in contested environments in an operationally relevant venue. These events provide an opportunity for technology developers to interact with operational personnel and determine how their efforts might support military capability gaps and high-priority mission deficiencies. CBOAs are sponsored by the Defense Threat Reduction Agency Research and Development-Chemical and Biological Warfighter Integration Division.

Hanscom Collaboration and Innovation Center (HCIC) PlugTests

The Hanscom Air Force Base HCIC provides a custom-built Hanscom milCloud environment that enables the Air Force to evaluate cutting-edge advances in defense applications, cybersecurity, public safety, and information technology. PlugTests conducted at the HCIC enable vendors to demonstrate their systems on accessible military networks and make modifications in real time. PlugTests also place prototypes directly in the hands of the operators, facilitating operator feedback prior to a contract award.

Joint Interagency Field Experimentation (JIFX)

The JIFX program conducts quarterly collaborative experimentation using established infrastructure at Camp Roberts and San Clemente Island to help the DoD and other organizations conduct concept experimentation using surrogate systems, demonstrate and evaluate new technologies, and incorporate emerging technologies into their operations. JIFX is run by the Naval Postgraduate School.

Joint Warfighting Assessment (JWA)

JWA is an annual exercise that seeks warfighter feedback on concepts and emergent capabilities required for the Joint Force. JWA focuses on experimentation with technologies and operational concepts in the Joint warfighting environment.

Simulation Experiment (SIMEX)

SIMEX is a virtual environment for early-stage experimentation that uses real Command and Control systems, simulated weapons and sensors, and real military and civilian operators executing various crisis action scenarios to explore technology, system interoperability, CONOPS, and TTPs. SIMEX is run by an FFRDC.

Stiletto

Stiletto is a flexible and responsive maritime platform that provides short-notice demonstration and experimentation capability for innovators to use in evaluating operational utility and performance of new concepts and prototypes in coastal and riverine warfare. It is sponsored by the Rapid Reaction Technology Office (RRTO) and operated by the Naval Surface Warfare Center, Combatant Craft Division, at Little Creek, VA.

Thunderstorm

Thunderstorm is a demonstration and experimentation venue that enables innovators to demonstrate their technologies in a realistic scripted or unscripted environment and facilitates interaction between innovators and warfighters around a specific technology or warfighter use case. Thunderstorm is sponsored by RRTO.

 

Best Practices for Evaluating Prototypes

Reference Source: DoD Prototyping Handbook, Oct 2022

  • Evaluation planning should begin as early as possible to ensure that the type of evaluation and the environment will provide the data and information needed to satisfy the prototyping project’s purpose.
  • The end user of the prototype and testing professionals should be included in developing the scope, objectives, approach, and schedule of the prototype project evaluation.
  • Independent assessors should be considered to help plan and conduct evaluations and/or analyze the data generated.
  • For prototypes intended for transition to operational use, product quality and risk factors should be assessed.

 

DoD Component Guidance

Note that DoD Component MTA implementation policies and guidance are currently being updated to be consistent with the newly published DoDI 5000.80 (effective 30 Dec 2019).

Air Force

Reference Source: Air Force Guidance Memorandum for Rapid Acquisition Activities, 27 June 2019

 

7. Rapid Prototype Operational Test & Evaluation. In order to field a rapid prototype, the system must be demonstrated/tested in an operationally-relevant environment (T-0).

 

7.1. Wherever possible, scope and methodology for these tests should be co-developed with end users. User inputs should be documented as part of test planning.

 

7.2. Certification that sufficient user input supports classification of testing as “operationally-relevant” should be approved by the MDA prior to final testing.

7.2.1. It is understood that operational conditions cannot be fully recreated in controlled tests. The PM should demonstrate that major risks (i.e., ones without technical or operational “work-arounds”) will be retired by the end of testing, with only moderate/minor risks (i.e., ones with available work-arounds) remaining.

Army

Reference Source: ASA(ALT) Middle Tier of Acquisition Policy, 20 March 2020, Enclosure 1
[Note: CAC required for access]

 

The PM will develop a process for demonstrating performance and evaluating for current operational purposes the proposed products and technologies. This process will result in a test strategy or an assessment of test results, included in the acquisition strategy, documenting the evaluation of the demonstrated operational performance, to include validation of required cybersecurity and interoperability as applicable. An independent assessment from the U.S. Army Test and Evaluation Command is recommended. Programs on the Director, Operational Test and Evaluation oversight list will follow applicable procedures.

 

Reference Source: Policy Directive for Test and Evaluation of MTA Programs, 28 Feb 2019

 

Section 3. Background.

 

a.  An MTA program that is a “covered system,” “major munitions program,” “missile program,” or “covered product improvement program” as defined in Title 10 U.S.C. Section 2366 is subject to the requirements of that section pertaining to survivability and lethality testing.

 

b.  An MTA program that is an MDAP pursuant to Director of Operational Test and Evaluation designation under Title 10 U.S.C. Section 139(a)(2)(B) is subject to the requirements of Title 10 U.S.C. Section 2399 pertaining to Operational Test and Evaluation (OT&E).

 

c.  An MTA program that is a “major system” for the purposes of Title 10 U.S.C. Section 2400 and DoDI 5000.02 is subject to the requirements of those two provisions pertaining to LRIP.

 

d. Test and Evaluation (T&E) characterizes risks in order to inform decision makers. For MTA programs, decision makers may accept risk in order to expedite the prototyping and fielding of advanced technologies and capabilities to the Warfighter.

 

 

Section 4. Directive.

 

a.  A T&E strategy is required for all MTA programs. A formal T&E Master Plan is not required for MTA programs.

 

b.  In coordination with the T&E organization, the materiel developer will develop a strategy that defines the appropriate scope of T&E. The T&E strategy informs the MTA initiation request and is included in the MTA program strategy (reference 1.b) or Single Acquisition Management Plan (SAMP). The T&E strategy must be approved by the materiel developer, the responsible T&E organization, and the capability proponent at the equivalent rank or grade as the MTA Decision Authority. If the AAE is the MTA Decision Authority, the T&E strategy shall be coordinated with the Army T&E Executive.

 

c.  The T&E strategy will provide sufficient detail to allow for resourcing of all data collection events supporting assessment (progress towards demonstrating operational effectiveness, operational suitability and survivability) and/or evaluation (determination of operational effectiveness, operational suitability and survivability) to include modeling and simulation (M&S), experimentation, demonstrations, contractor testing and government testing. It will describe how the planned T&E will be used to inform and/or validate program requirements. The T&E strategy shall include an integrated test program schedule; any technical, developmental, operational, or integrated test events and objectives; a technical and operational evaluation framework; and a test resource summary.

 

(1) The T&E strategy will provide for sufficient data to determine the operational effectiveness, operational suitability, survivability and safety of an MTA program, while at the same time allowing Army decision makers to understand program risks.

 

(2) The T&E strategy will be based on program requirements and consideration of factors such as existing data, technology maturity, operator complexity, integration and interoperability characteristics, mission impacts, training, and sustainment/logistical needs. It will employ representative threats or validated and accredited models to assess the MTA program. The Independent Evaluator will provide input to the MTA Decision Authority on the risks, T&E limitations, and program requirements not addressed.

 

(3) Since requirements will likely evolve over the course of an MTA program’s development, it is prudent that the T&E strategy sets the conditions for periodic assessments of performance and validation/revalidation of requirements as execution proceeds. Modeling, experimentation (reference 1.l) and T&E results will be used to validate and refine MTA program requirements and determine the military utility of the program. The capability proponent shall use T&E results to modify the applicable requirement document, engaging in continuous learning as the T&E strategy unfolds in a test-learn-fix-test approach.

 

(4) The T&E strategy will maximize use of flexible and innovative T&E approaches (e.g., hierarchical Bayesian models for reliability, M&S, and risk-based T&E). Supporting MTA programs with responsive T&E requires an innovative environment of continuous learning that mitigates or informs risk. This can be enabled through:

 

(a) Use of small events that are focused on particular aspects of performance. These can be credible contractor or government-led events focused on particular aspects of a system. These events can be used to inform requirements that may have otherwise been established without qualitative analysis.

 

(b) Increased use of M&S. When M&S is going to be used for the MTA program, the plan for verification, validation and accreditation (VV&A) shall be incorporated into the T&E strategy (reference 1.m). When the M&S is well rationalized, lack of VV&A should not impede its use. VV&A of all M&S shall be considered during the course of the program and appropriate weight given to its findings. Expanded use of M&S will become increasingly important as the Army expands its development and application of artificial intelligence and machine learning systems, where physical verification of state conditions and responses will become prohibitive.

 

(c) Greater reliance on contractor expertise and M&S for MTA system reliability. Incorporate use of reliability design activities as a primary source of data. Rapid improvement events may be conducted at a component or subsystem level to work out design flaws or to improve design margins. The materiel developer, in coordination with the T&E organization, is responsible for determining the scope of the reliability design activities and ensuring the MTA program contract is structured to obtain the required data and analyses. Concepts such as Bayesian statistics, highly accelerated life testing, highly accelerated stress screening and rapid design improvement events are to be considered. Traditional sequential reliability growth testing is discouraged.
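As an illustrative sketch only (the directive names the techniques but prescribes no implementation): a conjugate beta-binomial update showing how Bayesian statistics can pool contractor bench data, HALT/stress-screening results, and government test events into a single reliability estimate, rather than running a sequential reliability growth test. The prior and all counts below are invented assumptions.

```python
from scipy import stats

# Assumed prior from contractor reliability design analysis: ~90% reliable,
# carrying roughly 20 trials' worth of information -> Beta(18, 2).
alpha, beta = 18.0, 2.0

# Each source contributes (successes, failures); all counts are hypothetical.
evidence = [
    ("contractor bench tests", 46, 4),
    ("HALT / stress screening", 27, 3),
    ("government field events", 19, 1),
]

for source, succ, fail in evidence:
    alpha += succ  # conjugacy: the posterior Beta simply adds the counts
    beta += fail
    posterior = stats.beta(alpha, beta)
    lcb = posterior.ppf(0.10)  # 10th percentile: 90% lower credible bound
    print(f"after {source:<24} mean={posterior.mean():.3f} "
          f"90% lower bound={lcb:.3f}")
```

Because the posterior accumulates whatever credible data already exists, each small focused event or M&S run tightens the estimate incrementally, which is the continuous-learning behavior the directive asks the T&E strategy to enable.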

 

d.  Increased prototyping and experimentation (reference 1.l) allows for early learning of system capabilities, limitations and military utility. Soldiers will be involved and leveraged throughout prototyping, experimentation and developmental testing to characterize operational risks in order to inform decision makers. The use of reserve component units for experimentation and testing is encouraged, provided that Active Duty Operational Support funding is available. National Guard and Army Reserve personnel may provide recent and relevant industry experience that is valuable for assessing or evaluating the systems under development. Operational testing will focus on data gaps that remain after reviewing all available data to complete an operational effectiveness, operational suitability and survivability evaluation.

 

e.  The T&E organization will provide continuous and cumulative assessments during program execution to support MTA program knowledge points and decision reviews. Evaluations and assessments of MTA programs will consider all credible data. This may require the materiel developer to establish standard language, metrics, data dictionary, data methods and database structures across M&S, experimentation, demonstrations, contractor testing and government testing. The responsible T&E organization will advise on the adequacy of existing data to satisfy requirements verification.

 

f.  In accordance with reference 1.f, US Army Test and Evaluation Command (ATEC) must complete a safety release before any hands-on testing, use or maintenance by Soldiers. In accordance with reference 1.n, ATEC must complete a safety confirmation to support Materiel Release.

 

g.  If the MTA program satisfies the criteria described in paragraph 3.a of this directive it must complete survivability and lethality testing (reference 1.h) and OT&E (reference 1.j) before production exceeds LRIP equivalent quantities (reference 1.k).

 

h.  The delivery of contractor test data will be included in the materiel developer’s contracting actions. The materiel developer will deliver the data to the responsible T&E organization. The responsible T&E organization will be afforded the opportunity to review test plans for contractor testing; witness testing at the contractor facility; and review test reports (to include all derived test data), in conjunction with the materiel developer, to assess the adequacy of the test data for use in evaluation. The materiel developer will include the use of contractor test data in their MTA program strategy or SAMP. In addition, the materiel developer’s Request for Proposal will address program T&E needs, to include test articles, T&E data rights and government access to test data.

 

i.  All T&E-related approvals external to the Army require coordination through the Army T&E Executive.

Navy

SOCOM

Reference Source: USSOCOM Middle Tier Acquisition Authorities and Guidance, 1 Aug 2018

Test Plan: An MTA Strategy must account for the capability receiving an F&DR [Fielding and Deployment Release] or CF&DR [Conditional F&DR] before fielding. The Test Officer will play a critical role in ensuring that appropriately tailored test events are planned, documented in the SAMP, resourced, and executed to ensure the capability is safe, suitable and effective on schedule.

Additional Resources