Major Capability Acquisition (MCA)

Test and Evaluation Master Plan (TEMP)

How to use this site

Each page in this pathway presents curated knowledge from acquisition policies, guides, templates, training, reports, websites, case studies, and other resources, and provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

DoD and Service policy is indicated by a BLUE vertical line.

Directly quoted material is preceded by a link to the Reference Source.

Reference Source: DODI 5000.89 Section 3

 

Before the start of testing for any acquisition path, the T&E WIPT will develop and document a TEMP or similar strategic document to capture DT, OT, and LFT&E requirements; the rationale for those requirements (e.g., Joint Capabilities Integration and Development System and concept of operations (CONOPS)); and resources, to be approved by the DOT&E and USD(R&E), or their designee, as appropriate. The TEMP, or similar strategic document for programs not under T&E oversight, is approved at the Service level. At a minimum, the document details:

  • The resources and test support requirements needed for all test phases.
  • Developmental, operational, and live fire test objectives and test metrics.
  • Program schedule with T&E events and reporting requirements that incorporate report generation timelines.
  • Test phase objectives, including entrance and exit criteria and cybersecurity test objectives.
  • Program decisions and data requirements to support those decisions.
  • Data collection requirements.
  • Funding sources for all test resources.

The PM will use the TEMP, test strategy, or other pathway-appropriate test strategy documentation as the planning and management tool for the integrated T&E program. The test strategy documentation requires DoD Component approval. Documentation for programs under USD(R&E) or DOT&E oversight will require USD(R&E), or their designee, and DOT&E approval respectively. Documentation for programs not under T&E oversight is approved at the Service level.

TEMP Guidance

Reference Source: DAG CH 8-3.6 Test & Evaluation Master Plan

The Test and Evaluation Master Plan (TEMP) is a document that describes the overall structure and objectives of the T&E program and articulates the necessary resources to accomplish each phase. It provides a framework within which to generate detailed T&E plans and documents schedule and resource implications associated with the T&E program. The TEMP serves as the overarching document for managing a T&E program.

The TEMP identifies the necessary DT&E, OT&E, and LFT&E activities. It relates program schedule, test management strategy and structure, and required resources to: KPPs and KSAs, as identified within the Capability Development Document (CDD); Critical Operational Issues (COIs); and Critical Technical Parameters (CTPs) developed by the Chief Developmental Tester, in collaboration with the Chief Engineer/Lead System Engineer, and coordinated with the T&E WIPT.

The TEMP includes objectives and thresholds documented in the CDD, CPD, evaluation criteria, and milestone decision points. For multi-Service or Joint programs, a single integrated TEMP is required. Component-unique content requirements, particularly evaluation criteria associated with COIs, can be addressed in a Component-prepared annex to the basic TEMP.

The PM uses the TEMP as the primary planning and management tool for all test activities starting at Milestone A. The PM will prepare and update the TEMP at Milestone B and to support the Development RFP Release Decision and FRP/FD decision points. Additionally, the TEMP must be updated prior to Milestone C to reflect the CPD, any remaining DT&E prior to IOT&E, and updates to IOT&E.

Program Management Offices (PMOs) develop a TEMP (and subsequent updates) to document the following:

 

  • Roles and responsibilities, including Chief Developmental Tester and Lead DT&E Organization.
  • Certification requirements necessary for the conduct of T&E.
  • An event-driven T&E schedule.
  • The T&E strategy aligned with and supporting the approved acquisition strategy to provide early identification of design and integration issues and adequate, risk-reducing T&E information to support decisions.
  • The integration of developmental and operational tests into an efficient test continuum.
  • The strategy for T&E.
  • Starting at Milestone A, a developmental evaluation methodology.
  • Starting at Milestone B, a developmental evaluation framework.
  • The T&E resources, which should be in alignment with the CARD and T&E budget exhibits (ACAT I programs).
  • The test and evaluation strategies to efficiently identify technology and functionality limitations and capabilities of alternative concepts to support early cost performance trade-off decisions.
  • Adequate measures to support the program’s reliability growth plan and requirements for a Reliability, Availability, Maintainability, and Cost (RAM-C) Rationale Report, as defined in the DoD RAM-C Rationale Manual, for Milestones B and C.
  • The modeling and simulation approach and where it is used in the test events, including the resources required and methodology for their verification, validation, and accreditation (VV&A); and how the PM and OTA plan to accredit M&S for OT use.
  • A T&E approach that stresses the system under test to at least the limits of the Operational Mode Summary/Mission Profile, and for some systems, beyond the normal operating limits to ensure the robustness of the design.
  • The plan for demonstration of maturity of the production process through production qualification testing (PQT) of low-rate initial production (LRIP) assets prior to full-rate production (FRP).
  • The plan for using the System Threat Assessment (STA) or System Threat Assessment Report (STAR) as a basis for scoping a realistic test environment.
  • The approach for demonstrating performance against threats and their countermeasures as identified in the Defense Intelligence Agency (DIA), DoD Component intelligence agency, or Service intelligence organization validated threat document.
  • The cybersecurity test and evaluation approach. Additionally, the approach should coordinate development of the Security Assessment Plan with the development of the TEMP in support of the Risk Management Framework (RMF) process. (The RMF process and certification can be a useful entrance criterion for cybersecurity T&E, but it does not obviate the need for T&E.)
  • The plan for Joint interoperability assessments required to certify system-of-systems interoperability.
  • For business systems, the identification of the certification requirements needed to support the compliance factors established by the Office of the Under Secretary of Defense (Comptroller) (USD(C)) for financial management, enterprise resource planning, and mixed financial management systems.
  • A system-of-systems network architecture diagram, including removable media and laptops, etc., for cybersecurity assessment.

TEMP Outline

Reference Source: DAG CH 8-3.6 Test & Evaluation Master Plan

The following is a basic TEMP outline highlighting the key topics that need to be addressed. Go to T&E Policy & Guidance for an editable TEMP format and additional TEMP information.

Refer to the TEMP Guidebook for more detail regarding TEMP content.

 

TEMP FORMAT

PART I – Introduction

1.1. Purpose
1.2. Mission Description
1.2.1. Mission Overview
1.2.2. Concept of Operations
1.2.3. Operational Users
1.3. System Description
1.3.1. Program Background
1.3.2. Key Interfaces
1.3.3. Key Capabilities
1.3.4. System Threat Assessment
1.3.5. Systems Engineering (SE) Requirements
1.3.6. Special Test or Certification Requirements
1.3.7. Previous Testing
 

PART II – TEST PROGRAM MANAGEMENT AND SCHEDULE

2.1. T&E Management
2.1.1. T&E Organizational Construct
2.2. Common T&E Database Requirements
2.3. Deficiency Reporting
2.4. TEMP Updates
2.5. Integrated Test Program Schedule
Figure 2.1. Integrated Test Program Schedule
 

PART III – TEST AND EVALUATION STRATEGY AND IMPLEMENTATION

3.1. T&E Strategy
3.1.1. Decision Support Key
3.2. Developmental Evaluation Approach
3.2.1. Developmental Evaluation Framework
3.2.2. Test Methodology
3.2.3. Modeling and Simulation (M&S)
3.2.4. Test Limitations and Risks
3.3. Developmental Test Approach
3.3.1. Mission-Oriented Approach
3.3.2. Developmental Test Events (Description, Scope, and Scenario) and Objectives
3.4. Certification for Initial Operational Test and Evaluation (IOT&E)
3.5. Operational Evaluation Approach
3.5.1. Operational Test Events and Objectives
3.5.2. Operational Evaluation Framework
3.5.3. Modeling and Simulation
3.5.4. Test Limitations
3.6. Live Fire Test & Evaluation Approach
3.6.1. Live Fire Test Objectives
3.6.2. Modeling and Simulation
3.6.3. Test Limitations
3.7. Other Certifications
3.8. Future Test & Evaluation
 

PART IV – RESOURCE SUMMARY

4.1. Introduction
4.2. Test Resource Summary
4.2.1. Test Articles
4.2.2. Test Sites
4.2.3. Test Instrumentation
4.2.4. Test Support Equipment
4.2.5. Threat Representation
4.2.6. Test Targets and Expendables
4.2.7. Operational Force Test Support
4.2.8. Models, Simulations, and Test Beds
4.2.9. Joint Operational Test Environment
4.2.10. Special Requirements
4.3. Federal, State, and Local Requirements
4.4. Manpower / Personnel and Training
4.5. Test Funding Summary
 

APPENDICES

Appendix A Bibliography
Appendix B Acronyms
Appendix C Points of Contact
 

The following appendices provide a location for additional information, as necessary:

Appendix D Scientific Test and Analysis Techniques
Appendix E Cybersecurity
Appendix F Reliability Growth Plan
Appendix G Requirements Rationale
 
Additional Appendices, as needed

Key Considerations in T&E Strategy Development

Reference Source: DODI 5000.89 Section 3.1

 

  • Programs will incorporate integrated testing at the earliest opportunity when developing program strategies, plans with program protection, documentation, and T&E strategies or the TEMPs. Developing and adopting integrated testing early in the process increases the effectiveness and efficiency of the overall T&E program.
    • If done correctly, integrated testing provides greater opportunity for early identification of concerns to improve the system design, and guides the system development during the engineering and manufacturing development phase. Conducting critical test activities earlier will enable the discovery of problems that the program can fix while the system is still in development and avoid costly redesigns late in the acquisition life cycle.
    • Integrated testing and independent evaluation also encourage the sharing of all developmental test (DT), OT, and live fire test resources to accomplish the test program. For programs informing decisions that are not addressed in Title 10, U.S.C., such as fielding, deployment, and low-rate production decisions, well planned and executed integrated testing may provide necessary data for an OTA to determine the system’s operational effectiveness, suitability, and overall mission capability.
    • Integrated testing does not replace or eliminate the requirement for IOT&E, as a condition to proceed beyond LRIP for programs with beyond low-rate (i.e., full-rate) production decisions as required by Section 2399 of Title 10, U.S.C.
  • To ensure T&E focuses on informing the program’s decision-making process throughout the acquisition life cycle, the TEMP will include the program’s key decision points and the T&E information needed to support them. These decisions may be made by leaders ranging from the program manager (PM) to the MDA, and should represent major turning or decision points in the acquisition life cycle that need T&E information in order to make an informed decision. Examples include milestone decisions, key integration points, and technical readiness decisions.
  • This information is captured in a table known as the Integrated Decision Support Key (IDSK). This table is developed by the PM by analyzing what is already known about the capability, what still needs to be known about the capability, and when it needs to be known.

The PM:

  • Resources and executes the system’s integrated test and independent evaluation program.
  • Identifies DT, OT, and LF data requirements necessary to support decisions, in consultation with the chief developmental tester (CDT), the chief engineer, and the OTA representative, and combines them into an IDSK.
  • Charters an integrated test planning group (i.e., the T&E Working-level Integrated Product Team (WIPT), also known as an integrated test team) early in the program. It will consist of empowered representatives of test data producers and consumers (including all applicable stakeholders) to ensure collaboration and to develop a strategy for robust, efficient testing to support systems engineering, evaluations, and certifications throughout the acquisition life cycle.

The T&E WIPT, chaired by the CDT:

  • Provides a forum for involvement by all key organizations in the T&E effort.
  • Develops the TEMP for the PM. Requires all key stakeholders to be afforded an opportunity to contribute to TEMP development.
  • Includes representatives of test data stakeholders such as systems engineering, DT&E, OT&E, LFT&E, the user, product support, the intelligence community, and applicable certification authorities.
  • Supports the development and tracking of an integrated test program for DT, OT, LFT&E, and modeling and simulation to support evaluations.
  • Supports the development and maintenance of the integrated test schedule.
  • Identifies and provides a recommended corrective action or risk assessment.
  • Explores and facilitates opportunities to conduct integrated testing to meet DT, OT, and LFT&E objectives.

The T&E WIPT requires test objectives, and the rationale behind the requirements, to be understood; the testing to be conducted in an operational context to the maximum extent possible; and the resultant data to be relevant for use in independent evaluations. While using the T&E framework, as shown in Figure 1, it is critical that all stakeholders:

  • Understand the scope of the evaluations required.
  • Define, up front, the end state for evaluations.
  • Develop an integrated testing approach that generates the data required to conduct independent evaluations.

Figure 1. Integrated T&E Framework

 

The T&E WIPT will identify DT, OT, and LFT&E data requirements needed to inform critical acquisition and engineering decisions. Once the T&E WIPT identifies the data requirements, the developmental and operational testers together will determine which data requirements can be satisfied through integrated testing and develop an integrated test matrix.

  • All stakeholders will use the IDSK to independently develop evaluation frameworks or strategies that will show the correlation and mapping between evaluation focus areas, critical decision points, and specific data requirements.
  • The CDT will develop the developmental evaluation framework (DEF) that focuses on the correlation between technical requirements, decision points, and data requirements.
  • The OTA representative will develop the operational evaluation framework (OEF) that focuses on the correlation between operational issues, decision points, and data requirements. The linkage between the OEF and the DEF shows that technical requirements support operational capabilities.

As part of the digital engineering strategy, models and data will be used to digitally represent the system in a mission context to conduct integrated T&E activities. To the largest extent possible, programs will use an accessible digital ecosystem (e.g., high bandwidth network, computational architectures, multi-classification environment, enterprise resources, tools, and advanced technologies). This environment must provide authoritative sources of models, data, and test artifacts (e.g., test cases, plans, deficiencies, and results) and provide digital technologies to automate, reuse, and auto-generate test artifacts to gain greater accuracy, precision, and efficiencies across integrated test resources.

T&E Oversight List

Reference Source: DODI 5000.89 Section 3.2

 


  • The DOT&E will manage the T&E oversight list used jointly by the USD(R&E) and DOT&E. Programs on OT and LFT&E oversight include those programs that meet the statutory definition of MDAPs in Section 2430, Title 10, U.S.C., and those that are designated by the DOT&E for oversight pursuant to Paragraph (a)(2)(B) of Section 139, Title 10, U.S.C. The DOT&E treats the latter programs as MDAPs for the purpose of OT and LFT&E oversight requirements, but not for any other purpose.
  • The DOT&E may place any program or system on the T&E oversight list at any time by using the following criteria:
    • Program exceeds or has the potential to exceed the dollar value threshold for a major program, to include MDAPs, designated major subprograms, as well as highly classified programs and pre-MDAPs.
    • Program has a high level of congressional or DoD interest.
    • Weapons, equipment, or munitions that provide or enable a critical mission warfighting capability, or are a militarily significant change to a weapon system.
  • The DOT&E will provide formal notification to a Military Service when a program is being added to the T&E oversight list.
  • The DOT&E will monitor acquisition programs and consider the following to determine when programs should be removed from the T&E oversight list:
    • T&E (initial and follow-on OT&E or LFT&E) is complete and associated reporting to inform fielding and full-rate production (FRP) decisions is complete.
    • Program development has stabilized, and there are no significant upgrade activities.
  • The DOT&E is the approval authority for the respective OT&E and LFT&E planned activities in TEMPs, test strategies, or other overarching program test planning documents for programs on the T&E oversight list.
  • The USD(R&E) is the approval authority for the DT&E plan in the TEMP, test strategy, or other overarching program test planning documents for all acquisition category (ACAT) ID programs. The USD(R&E) reviews and advises the MDA on the DT&E plan in the TEMP, test strategy, or other overarching program test planning documents for ACAT IB and IC programs.
  • If an Under Secretary or Service Secretary has a significant objection to a fundamental aspect of the DT&E plan, he or she may raise this objection to the Deputy Secretary of Defense in the form of a briefing. The briefing serves to notify the Deputy of a dissenting view, not to preemptively halt the relevant decision or the program office’s activities. If warranted, the Deputy will intercede. Briefing requests should be made well in advance of the approval of the TEMP.
  • The T&E oversight list is unclassified. The DOT&E maintains the T&E oversight list continuously at https://osd.deps.mil/org/dote-extranet/SitePages/Home.aspx (requires login with a common access card). Classified and sensitive programs placed on T&E oversight will be identified directly to their MDAs.
  • Force protection equipment (including non-lethal weapons) will be subject to oversight, as determined by the DOT&E.

T&E Management

Reference Source: DODI 5000.89 Section 3.3

 


  • As soon as practicable after the program office is established, the PM will designate a CDT. The CDT will be responsible for coordinating the planning, management, and oversight of all DT&E (contractor and government) activities; overseeing the T&E activities of other participating government activities; and helping the PM make technically informed, objective judgments about contractor and government T&E planning and results.
  • PMs will designate, as soon as practicable after the program office is established, a government test agency to serve as the lead DT&E organization. For non-T&E oversight programs, a lead DT&E organization should be used, when feasible, and identified in the TEMP. The lead DT&E organization will be responsible for:
    • Providing technical expertise on T&E concerns to the CDT.
    • Conducting DT&E activities to support independent evaluations.
    • Conducting DT&E activities as directed by the CDT or his or her designee.
    • Supporting certification and accreditation activities when feasible.
    • Assisting the CDT in providing oversight of contractors.
    • Assisting the CDT in reaching technically informed, objective judgments about contractor and government T&E planning and results.
  • For each program, a lead OTA, lead DT organization, and lead test organization (LTO) will be designated to plan and conduct OTs, DTs, and LFT&E; report results; and provide an independent and objective evaluation of operational effectiveness, operational suitability, survivability (including cybersecurity), or lethality. They also conduct additional testing and evaluation, as required.
  • A program may use several different acquisition pathways, such as the major capability acquisition pathway that has a component or subprogram being developed through the MTA pathway and a software capability developed using the software acquisition pathway. As required in the particular pathway guidance, individual program planning documents will include a transition or integration plan that describes the T&E scope and resources following the transition.
  • T&E program documentation that already exists in other acquisition documents may be referenced as appropriate in the DOT&E- or USD(R&E)-approved T&E document. Once referenced, there is no requirement to repeat the language in the T&E program document.
  • The PM and test agencies for T&E oversight programs will provide the Defense Technical Information Center (DTIC) with all reports, and the supporting data and metadata for the test events in those reports. If there are limitations in the data or metadata that can be provided to DTIC, those limitations will be documented in the TEMP starting at MS B.
  • Test agencies will provide the DoD Modeling and Simulation Coordination Office with a descriptive summary and metadata for all accredited unclassified models or simulations that can potentially be reused by other programs.
  • The Secretaries of the Military Departments, in coordination with the DAE, the DOT&E, and the Under Secretary of Defense for Personnel and Readiness, will establish a common set of data for each major weapon system type to be collected on damage incurred during combat operations. These data will be stored in a single dedicated and accessible repository at the DTIC. The lessons learned from analyzing these data will be included, as appropriate, in both the capability requirements process and the acquisition process for new acquisitions, modifications, and upgrades.

T&E Program Planning

Reference Source: DODI 5000.89 Section 3.4

 

 


 

  • The following are key considerations in developing the TEMP or other test planning documentation:
    • The PM and the T&E WIPT will use the TEMP or other planning documentation starting at Milestone A or the decision point to enter the applicable acquisition pathway. The PM and the T&E WIPT will prepare and update the planning documentation as needed to support acquisition milestones or decision points. For FRP decision review, full deployment decision review, and thereafter, the MDA, the senior DoD Component leadership, or DOT&E (for programs on T&E oversight), may require planning documentation updates or addendums to address changes to planned or additional testing.
    • Draft TEMPs will be available to program stakeholders as early and as frequently as possible. For oversight programs, TEMPs approved by the DoD Components will be submitted to the OSD for approval not later than 45 calendar days before the supported decision point. The PMs will ensure programs containing Information Technology (IT) are properly deconflicted with those programs’ post implementation review described in DoD Instruction (DoDI) 5000.82. To support agile acquisition, the timeline for TEMP delivery may be tailored with mutual consent between the DOT&E, OTA, and program office.
    • A TEMP may be waived or other tailored test strategy documentation be specified for certain acquisition pathways. In cases where a TEMP is not needed, early briefings to Service stakeholders (as well as the USD(R&E) and DOT&E for oversight programs) are required to facilitate cross-organizational alignment and subsequent approval of test planning documentation.
  • The TEMP or other test strategy documentation will:
    • Contain an integrated test program summary and master schedule of all major test events or test phases to evaluate. The schedule should include the key programmatic decision points supported by the planned testing.
    • Describe DT test events designed to evaluate performance, interoperability, reliability, and cybersecurity.
    • Describe OT test events designed to evaluate operational effectiveness, operational suitability, survivability, and cybersecurity.
    • Include an event-driven testing schedule that will allow adequate time to support pre-test predictions; testing; post-test analysis, evaluation, and reporting; reconciliation of predictive models; and adequate time to support execution of corrective actions in response to discovered deficiencies. The schedule should allow sufficient time between DT&E and IOT&E for rework, reports, and analysis, and developmental testing of critical design changes.
    • Be a source document for the request for proposal (RFP).
    • Guide how contractor proposals will address program T&E needs (e.g., test articles; T&E data rights; government access to failure reporting; built-in test and embedded instrumentation data; government use of contractor-conducted T&E; government review and approval of contractor T&E plans; and government review of contractor evaluations).
    • Include a DEF, live fire strategy, and an OT concept or OEF. The DEF, live fire strategy, and the OT concept identify the key data that will contribute to assessing whether the DoD is acquiring a system that supports the warfighter in accomplishing the mission.
    • Examples of DT measures of program progress include key performance parameters (KPPs), critical technical parameters, intelligence data requirements, key system attributes, interoperability requirements, cybersecurity requirements, reliability growth, maintainability attributes, and DT objectives. In addition, the DEF will show the correlation and mapping between test events, key resources, and the decision supported.
    • The PM and T&E WIPT should use an IDSK to ensure that the critical operational issues are not focused on evaluating the technical specifications of the system, but are unit focused and tied to unit mission accomplishment.
    • Identify how scientific test and analysis tools will be used to design an effective and efficient test program that will produce the required data to characterize system behavior and combat mission capability across an appropriately selected set of factors and conditions.
    • Require all test infrastructure and tools (e.g., models, simulations, automated tools, synthetic environments) supporting acquisition decisions to be verified, validated, and accredited (VV&A) by the intended user or appropriate agency. Test infrastructure, tools, and the VV&A strategy and schedule, including the VV&A authority for each tool or test infrastructure asset, will be documented in the TEMP, or other test strategy documentation. PMs will plan for the application and accreditation of any modeling and simulation tools supporting T&E.
    • Require complete resource estimates for T&E to include: test articles, test sites and instrumentation, test support equipment, threat representations and simulations, intelligence mission data, test targets and expendables, support for friendly and threat operational forces used in test, models and simulations, testbeds, joint mission environment, distributed test networks, funding, manpower and personnel, training, federal/State/local requirements, range requirements, and any special requirements (e.g., explosive ordnance disposal requirements or corrosion prevention and control). Resources will be mapped against the IDSK and schedule to ensure adequacy and availability.
    • For MDAPs, pursuant to Section 839(b) of Public Law 115-91, the PM will develop a resource table listing the initial estimates for government T&E costs in three specific categories: DT&E, OT&E, and LFT&E. This requirement also applies at each TEMP or other test strategy documentation update.
  • Pursuant to Section 139, Title 10, U.S.C., the DOT&E will have prompt access to all data regarding modeling and simulation activity proposed to be used by Military Departments and Defense Agencies in support of operational or LFT&E of military capabilities. This access will include data associated with VV&A activities. The PM will allow the USD(R&E) and DOT&E prompt access, after a test event, to all records and data (including classified and proprietary information, and periodic and preliminary reports of test events). Timelines for delivery of records, reports, and data will be coordinated among the stakeholders and documented in appropriate test documentation.

 

Cybersecurity T&E

Reference Source: DODI 5000.89 Section 3.5

 


  • Cybersecurity planning and execution occurs throughout the entire life cycle. All DoD acquisition programs and systems (e.g., DBS, national security systems, weapon systems, non-developmental items), regardless of acquisition pathway, will execute the cybersecurity DT and OT iterative T&E process detailed in the DoD Cybersecurity T&E Guidebook throughout the program’s life cycle, including new increments of capability. The DoD Cybersecurity T&E Guidebook provides the latest in data-driven, mission-impact-based analysis and assessment methods for cybersecurity T&E, supports assessment of cybersecurity, survivability, and resilience within a mission context, and encourages planning for tighter integration with traditional system T&E.
  • The PMs will:
    • Develop a cybersecurity strategy as part of the program protection plan based on the Joint Capabilities Integration and Development System or other system cybersecurity, survivability, and resilience requirements; known and postulated threats; derived system requirements; draft system performance specifications; and the intended operational use and environment. The cybersecurity strategy will also incorporate the appropriate aspects of the risk management framework (RMF) process (governed by DoDI 8500.01 and DoDI 8510.01) that supports obtaining an authority to operate and other items as addressed in DoD cybersecurity policies.
    • The cybersecurity strategy should describe how the authority to operate decision will be informed by the cybersecurity testing specified in the DoD Cybersecurity T&E Guidebook. The cybersecurity strategy should leverage integrated contractor and government testing to evaluate the security of contractor and government development capabilities of the program’s sub-components, components, and integrated components; and describe the dedicated government system vulnerability and threat-based cybersecurity testing to be conducted before program product acceptance.
    • Use the cybersecurity strategy as a source document to develop the TEMP, or other test strategy documentation. The TEMP DEF and OEF will identify specific cybersecurity data required to address the various cybersecurity stakeholder needs (PM, engineers, RMF, DT testers, OTA), crosswalk the data to develop an integrated cybersecurity T&E strategy that efficiently obtains these data, and describe how key program decisions, including the authority to operate decision, will be informed by cybersecurity testing.
    • Determine the avenues and means by which the system and supporting infrastructure may be exploited for cyber-attack and use this information to design T&E activities and scenarios. Conduct a mission-based cyber risk assessment (such as a cyber table top) to identify those elements and interfaces of the system that, based on criticality and vulnerability analysis, need specific attention in T&E events.
    • Plan to conduct contractor and government integrated tailored cooperative vulnerability identification T&E activities to identify vulnerabilities and plan the means to mitigate or resolve them, including system scans, analysis, and architectural reviews. These activities begin with prototypes.
    • Plan to conduct integrated tailored cybersecurity DT&E events using realistic threat exploitation techniques in representative operating environments and scenarios to exercise critical missions within a cyber-contested environment to identify any vulnerabilities and assess system cyber resilience. Whenever possible, plan threat-based testing as part of integrated contractor and government T&E.
  • The April 3, 2018 DOT&E Memorandum directs OTAs to perform a cybersecurity cooperative vulnerability and penetration assessment (CVPA) and an adversarial assessment (AA) of all acquisition programs. The January 21, 2015 DOT&E Memorandum directs OTAs to modify their cybersecurity T&E processes as appropriate for DoD systems whose functions include financial or fiscal/business activities or the management of funds. The January 21, 2015 DOT&E Memorandum also directs the OTAs to add cyber economic threat analysis, cyber economic scenario testing, and financial transaction analysis to their cybersecurity test planning for DBS.
  • The DOT&E requires testing of cybersecurity during OT&E to include the representative users and an operationally representative environment. This may include hardware; software (including embedded software and firmware); operators; maintainers; operational cyber and network defense; end users; network and system administrators; help desk; training; support documentation; tactics, techniques, and procedures; cyber threats; and other systems that input or exchange information with the system under test, as applicable.
  • The OTA, with DOT&E review and approval, should integrate developmental and operational testing where possible to ensure sufficient data are obtained to meet OT&E objectives and measures. The OTAs should review and consider data from DT events (such as the cooperative vulnerability identification and adversarial cybersecurity DT&E) and any integrated tests previously conducted. CVPA and AA results used in conjunction with the other OT&E and LFT&E results will inform the overall evaluation of operational effectiveness, suitability, and survivability.
  • All programs should plan for periodic integrated government cybersecurity test events before beginning operational testing or initial production, with the goal of increasing efficiency and effectiveness of cybersecurity T&E.
    (1) Understanding that the objectives and knowledge requirements of DT&E and OT&E must be met, it is critical that the conditions of the test event and the maturity of the system under test are acceptable to both stakeholders.
    (2) The system under test must be mature enough to represent the production version. The test conditions should be realistic enough to adequately represent the operational environment, while still being flexible enough to allow a wide range of penetration and adversarial activities. The goal is to maximize assessment of vulnerabilities, evaluate adversarial exploitability of those vulnerabilities, and evaluate recovery and restoral processes.
    (3) Testing must include evaluating appropriate defensive cyberspace operations in accordance with DoDI 8530.01. The result of cybersecurity testing should be an understanding of mission-critical cybersecurity vulnerabilities, each of which should then be eliminated before fielding the system.

Interoperability T&E

Reference Source: DODI 5000.89 Section 3.6

 

Interoperability T&E

  • Interoperability testing is governed by DoDI 8330.01. All programs or acquisition paths that exchange data with an organization or site external to their Service require an interoperability certification from the Joint Interoperability Test Command and must incorporate interoperability into DT and OT.
  • IT interoperability should be evaluated early and with sufficient frequency throughout a system’s life cycle to capture and assess changes affecting interoperability in a platform, joint, multinational, and interagency environment. Interoperability T&E can be tailored for the characteristics of the capability being acquired in accordance with applicable acquisition pathway policy. Interoperability certification must be granted before fielding of a new IT capability or upgrade to existing IT.
  • Working with the DoD business, warfighting, intelligence, and enterprise information environment mission area owners (Chief Management Officer of the Department of Defense, Chairman of the Joint Chiefs of Staff, Under Secretary of Defense for Intelligence and Security, and DoD Chief Information Officer) and the other DoD Component heads, the T&E WIPTs should require that capability-focused, architecture-based measures of performance and associated metrics are developed to support evaluations of IT interoperability throughout a system’s life cycle and to ensure logistics assets are planned for within the T&E management plan.

Navigation Warfare Compliance T&E

Reference Source: DODI 5000.89 Section 3.7

 

Navigation Warfare (NAVWAR) Compliance T&E

  • In accordance with the national defense strategy and DoDD 4650.05, resilient positioning, navigation, and timing (PNT) information is essential to the execution and command and control of military missions and to the efficient operation of information networks necessary for continuous situational awareness by Combatant Commanders. The DoD will employ NAVWAR capabilities to ensure a PNT advantage in support of military operations, and programs producing or using PNT information must be NAVWAR compliant. NAVWAR compliance testing is governed by DoDI 4650.08.
  • Each program or system producing or using PNT information must incorporate the system survivability KPP as defined in Paragraph 3.2.a. of DoDI 4650.08.
  • For each program or system producing or using PNT information, the PM must conduct system T&E (e.g., real-world test; modeling and simulation; empirical analysis) sufficient to validate that all systems or platforms producing or using PNT information meet the system survivability KPP referred to in Paragraph 3.7.b.
  • Pursuant to Section 1610 of Public Law 115-232, also known as “the National Defense Authorization Act for Fiscal Year 2019,” the PM will systematically collect PNT T&E data, lessons learned, and design solutions. In accordance with DoDD 4650.05, the USD(R&E) and the DOT&E will share insights gained from such information with the DoD PNT Enterprise Oversight Council, as appropriate.