Defense Business Systems (DBS)


Test & Evaluation

How to use this site

Each page in this pathway aggregates curated knowledge from official DoD acquisition policies, guides, templates, training, reports, websites, case studies, and other resources. It also provides a framework for functional experts and practitioners across DoD to contribute to the collective knowledge base.

DoD and Service policy is indicated by a BLUE vertical line.

Directly quoted material is preceded by a link to the Reference Source.

Integrated Testing

Reference Source: DoDI 5000.75, Section 4.1j

 

The MDA will oversee an effective yet efficient testing approach that incorporates:

 

  • Integrated testing, in which a single test activity can provide data to satisfy multiple objectives, as supported by an integrated testing strategy documented in the capability implementation plan defined in Appendix 4B. Integrated testing may include combined contractor and government developmental testing, as well as integrated government developmental and operational testing.
  • The use of test automation, to the greatest extent practical.
  • Involvement of users and testers throughout the entire life cycle.
  • When supported by the appropriate risk analysis, assessments will primarily use data from integrated test events rather than a dedicated independent operational test event. For programs on the Director, Operational Test and Evaluation (DOT&E) Oversight List, the level of test and use of integrated test data, test strategies, as well as dedicated operational test events should be approved by DOT&E based upon Guidelines for Operational Test and Evaluation of Information and Business Systems.
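As a notional illustration of the integrated-testing idea in the first bullet above, where a single test activity provides data toward several evaluation objectives, the sketch below tracks which planned events contribute to which objectives. The event names and objectives are invented for this example and are not drawn from DoDI 5000.75.

```python
# Notional mapping of test events to the evaluation objectives each one supports;
# a single integrated event can feed multiple objectives (illustrative names only).
EVENT_OBJECTIVES = {
    "Contractor/government integrated DT event": {
        "functional performance", "interoperability", "reliability growth",
    },
    "Integrated DT/OT event": {
        "functional performance", "usability", "operational suitability",
    },
    "Cybersecurity assessment": {"survivability/security"},
}

def coverage(required_objectives):
    """Return, for each required objective, the events that supply data for it."""
    return {
        objective: [event for event, objs in EVENT_OBJECTIVES.items() if objective in objs]
        for objective in required_objectives
    }

# Example: check which planned events contribute data to each objective.
for objective, events in coverage({"functional performance", "survivability/security"}).items():
    print(objective, "->", events)
```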

 

Test & Evaluation (T&E) for the DBS Pathway

Reference Source: DoDI 5000.89, Section 4.6

 

  • DBS are governed by DoDI 5000.75 and supplemented by this issuance relative to T&E.
  • DBS PMs will develop a Test and Evaluation Master Plan (TEMP) or other test strategy documentation. The PM will describe the test strategy and essential elements of the TEMP in the DBS implementation plan. Specific T&E management content requirements in the implementation plan include:

1. Test events to collect data must be defined, scheduled, and resourced in the implementation plan, including a Developmental Evaluation Framework* (DEF) matrix for DT events.

 

2. Cybersecurity operational T&E must also include a cyber economic vulnerability analysis as outlined in the September 14, 2010 and January 21, 2015 DOT&E Memoranda. The MDA will not tailor cybersecurity T&E solely to meet authority to operate requirements.

 

3. T&E planning will include mission-oriented developmental T&E with actual operators performing end-to-end scenarios in a controlled environment to collect human-system interface data and reduce risk during operational testing.

 

  • Business operations testing ensures the system is working properly before the go-live decision to support Operational Test (OT) on the live environment. Business operations testing employs actual users on the test environment performing end-to-end business transactions.
  • The Chief Developmental Tester (CDT) should plan for interoperability Developmental Test (DT) early to ensure availability of other interfacing business system test environments.
  • For programs on the T&E oversight list, the level of test and use of test data as well as dedicated OT events should be approved by the DOT&E using guidance provided in the September 14, 2010 DOT&E Memorandum. Developmental Test & Evaluation (DT&E) will include interoperability testing with realistic simulations or test environments of interfacing systems with operationally representative data exchanges in a controlled environment.

*DEF: Identifies key data that will contribute to assessing system performance, interoperability, cybersecurity, reliability, and maintainability; the DEF shows the correlation and mapping between technical requirements, decision points, and data requirements.
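The DEF is essentially a traceability matrix. Purely as a notional sketch (the evaluation area, requirement, decision point, data, and event names below are invented for illustration and are not taken from the instruction or any program's DEF), one row of such a matrix could be represented as:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DEFRow:
    """One notional row of a Developmental Evaluation Framework matrix.

    Each row ties an evaluation area and technical requirement to the decision
    point it informs, the data needed, and the DT events expected to produce
    that data.
    """
    evaluation_area: str          # e.g., performance, interoperability, cybersecurity
    technical_requirement: str    # requirement being assessed
    decision_supported: str       # decision point the data informs
    data_required: str            # measures/data needed for the assessment
    dt_events: List[str] = field(default_factory=list)  # DT events supplying the data

# Invented example content for illustration only.
example_row = DEFRow(
    evaluation_area="Interoperability",
    technical_requirement="Exchange payment transactions with an interfacing ERP system",
    decision_supported="Go-live decision",
    data_required="Transaction success rate and latency across the interface",
    dt_events=["Interface DT event", "Integrated DT/OT event"],
)
```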

Cyber Economic Vulnerability Assessment (CEVA)

Reference Source: OT&E Memo, CEVA, January 2015 

Cyber threats present a risk of economic exploitation of information systems whose functions include financial management, payments, allotments, and fiscal transfers. Many of these systems connect to non-Department of Defense (DOD) networks and environments. An adversary may exploit such systems to disrupt mission-essential logistics or steal funds. Business-focused systems in the Department need to be secure and resilient in a potentially hostile information environment.

Operational Test Agencies (OTAs) should modify their cybersecurity test and evaluation processes as appropriate for DOD systems whose functions include financial or fiscal/business activities or the management of funds, to include the following activities:

  • Cyber Economic Threat Analysis – Development of a set of economic exploitation scenarios derived from threat analysis. The intelligence should come from a variety of sources (e.g., open source intelligence, intelligence agencies, commercial partners). This analysis should consider the known or potential vulnerabilities of the system and its associated control processes, and establish test cases by which the financial security of the systems under test may be evaluated.
  • Cyber Economic Scenario Testing – Tests threat vectors against the production system under realistic operating conditions, and with the participation of personnel who sufficiently understand the system and associated control processes and how they can be exploited. Testing should encompass scenarios ranging from small-scale fraud to attacks that might result in significant economic degradation to DOD or the U.S. government.
  • Financial Transaction Analysis – Review a representative set of past and current financial transactions for evidence of fraudulent activity (e.g., fraud indicators that identify exceptions or transactions that fall outside normal activity).
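As a minimal, hypothetical sketch of the kind of rule-based screening the Financial Transaction Analysis activity describes, the code below flags transactions that fall outside normal activity. The indicators, field names, and threshold are assumptions chosen for illustration, not requirements from the memo.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Transaction:
    tx_id: str
    payee: str
    amount: float
    approved_by: str
    entered_by: str

def flag_suspicious(transactions: List[Transaction],
                    amount_threshold: float = 250_000.0) -> List[Tuple[str, str]]:
    """Apply simple, illustrative fraud indicators to a transaction set.

    Indicators used in this sketch (assumptions, not prescribed by the memo):
      * unusually large payment amounts,
      * the same individual entering and approving a payment,
      * possible duplicate payments (same payee and amount).
    """
    flagged = []
    seen = set()
    for tx in transactions:
        if tx.amount >= amount_threshold:
            flagged.append((tx.tx_id, "amount exceeds threshold"))
        if tx.entered_by == tx.approved_by:
            flagged.append((tx.tx_id, "entered and approved by same user"))
        key = (tx.payee, tx.amount)
        if key in seen:
            flagged.append((tx.tx_id, "possible duplicate payment"))
        seen.add(key)
    return flagged

# Example usage with invented data.
txs = [
    Transaction("T1", "Vendor A", 300_000.0, approved_by="user1", entered_by="user1"),
    Transaction("T2", "Vendor B", 12_500.0, approved_by="user2", entered_by="user3"),
    Transaction("T3", "Vendor B", 12_500.0, approved_by="user4", entered_by="user5"),
]
print(flag_suspicious(txs))
```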

 

To adequately assess cyber economic vulnerabilities, all cyber adversarial activities must be conducted with certified and accredited “red team” personnel and should include system and cyber economic subject matter experts to ensure the key operational capabilities and business processes are evaluated (roles, responsibilities, and business processes within the system, as well as dependencies between the host system and other enterprise systems).

———————————-

CEVA Process Overview

A Cyber Economic Vulnerability Assessment should be conducted in a series of phases across two workstreams:

Workstream One consists of three separate activities: Scenario Development, Tabletop Exercise, and Adversarial Testing. The output of Workstream One is a set of findings on cyber economic threats with respect to the system under test (SUT).

Workstream Two is the analysis of SUT data for fraudulent transactions. The output of this analysis is a set of initial findings and recommendations for further analysis.

Figure 1. Assessment Process


 

The CEVA should leverage, as available, threat intelligence from cyber intrusions into commercial industries to develop an initial set of cyber economic threat vectors (OTAs should use these types of reports, which are produced by many commercial vendors, e.g., Mandiant, Verizon, Kaspersky). These threat vectors should be the foundation of stakeholder discussions to create cyber economic scenarios applicable to the functions of the SUT.

The attack scenarios will serve as a basis for a Tabletop Exercise used to assess the probability of success for attackers and SUT defenders, and to refine scenarios. Upon conclusion of the Tabletop Exercise, the red team, acting as part of a Cyber Opposing Force (OPFOR), will execute a series of technical penetration tests and economic exploitation of the SUT. The Cyber OPFOR should be augmented with subject matter expertise (SME) from the SUT and Department of Defense (DOD) business processes.

Detailed workstream processes, as well as example timelines, scenarios, and lessons learned, are provided in the attachment to the OT&E Memo, Cyber Economic Vulnerability Assessments (CEVA).

OT&E of Information and Business Systems

Reference Source: OT&E Memo, Guidelines for OT&E for Business Systems, September 2010

A risk analysis will be conducted by the lead OTA documenting the degree of risk and potential impact on mission accomplishment for each capability. The results of this analysis are expected to be part of the program’s test and evaluation (T&E) documentation (the T&E Strategy (TES) or T&E Master Plan (TEMP), depending on program maturity) and will be used to determine the appropriate level of OT&E to assess operational effectiveness, suitability, and survivability/security. The risk assessment combines the distinct concepts of risk likelihood and risk impact of a capability failing to be operationally effective, suitable, and survivable/secure.

There are three levels of possible OT&E for Applicable Programs. Programs should always plan an integrated test and evaluation strategy to fully assess all capabilities in a given deliverable. The degree of additional independent operational testing is determined by the OTA’s risk analysis.

———————————–

Level I OT&E

An assessment primarily using data from integrated test events other than a dedicated independent operational test event, e.g., developmental tests, certification events, and independent observations of the capability being used in operationally realistic or representative conditions. Even for programs under DOT&E oversight, the assessment plan is approved by the lead Service or agency OTA.

Features of Level I OT&E are:

  • The OTA influences and monitors selected test activities including recommending inclusion of test cases for examining specific operational issues, and collecting data for the evaluation.
  • Contractor participation is in accordance with the nature of the test events, with consideration given to fielding plans of the system.
  • For acquisition and fielding decisions, the OTA must confirm that the program has plans in place that address recovery from failures and resolution of shortfalls discovered in test events.
  • The OTA prepares and provides an appropriate independent evaluation or assessment to support the acquisition and fielding processes and, for Applicable Programs, provides a copy to DOT&E.

Level I OT&E is appropriate for capabilities having low risks. Typical deliverables with low risk capabilities where Level I OT&E is anticipated are maintenance upgrades, hardware upgrades, and software patches containing only minor capabilities or enhancements.

———————————–

Level II OT&E

An evaluation that includes an independent operational event, which is carried out by typical users in an operationally realistic or representative environment to assess risk-specific factors of operational effectiveness, operational suitability, and survivability/security. The evaluation primarily uses data independently collected during the independent operational event, but also includes data as appropriate from other integrated test program events. The lead Service or agency OTA approves the test plan.

Features of Level II OT&E are:

  • Typical users in their operational environment performing tasks. One or more operational sites might participate and the OTA might prescribe scripted events in addition to normal user activity.
  • Contractor participation is limited to that prescribed in the program’s support plan.
  • For acquisition and fielding decisions, the OTA must confirm that the program has plans in place that address recovery from failures and resolution of shortfalls discovered in test events.
  • The OTA prepares an appropriate independent evaluation of operational effectiveness, operational suitability, and survivability/security to support the acquisition and fielding processes and provides a copy to DOT&E.

Level II OT&E is appropriate for capabilities having a moderate level of risk with limited potential for mission disruption. Level II OT&E is typically suitable for modest, self-contained operational capabilities.

———————————–

Level III OT&E

An evaluation of the operational effectiveness, operational suitability, and survivability/security of the operational capability using the critical operational issues (COIs) and an independent dedicated operational test. This is the highest and most comprehensive level of OT&E. DOT&E will approve the operational test plan.

Features of Level III OT&E are:

  • Level III OT&E must comply with statutes and all provisions of the DoD 5000 series regulations.
  • The OTA carries out test events in an operational environment.
  • The OTA independently evaluates and reports on the operational effectiveness, operational suitability, and survivability/security using all available data, especially independently collected operational test data, to support the acquisition and fielding processes with a copy provided to DOT&E.
  • All test data will be provided to DOT&E for independent analysis and reporting.

Level III OT&E is appropriate for Applicable Programs that have capabilities with high risks. Level III OT&E is typically appropriate for significant or new operational capabilities with high potential for mission disruption.


Risk Assessment Implementation

 

Assess Risk. The OTA, with support from the program office, user representative, and threat community, assesses and documents the risks. Risk assessments are developed using the OTA’s preferred procedures. Assessments must distinguish between the likelihood of occurrence and the severity of the mission impact if the risk is realized (no matter how unlikely). The OTAs may have or develop their own risk rating scales, and no specific rating scale is required.

1. Mission Risk Categories. DOT&E expects the OTAs to evaluate risk categories, questions, and considerations that best reflect the deliverable and operational capabilities being assessed. In all cases, the OTA will perform the risk assessment with support of the program management office, user representatives, and threat community. The four risk categories are:

      • Technology and Software Development (including software reliability). This risk category represents the well-known concern that software can have “bugs” and/or be developed with incorrect understanding of user needs.
      • Integration and Deployment. This risk category relates to the signal and data environment; program interfaces to the operating system and user input; interfaces to legacy databases, messaging, communications protocols, and local configuration files; published and actual software service specifications; interoperability; real-time processing issues; competency and accurate record-keeping of system administrators tasked with software installation; and other aspects of distributed computing.
      • Training, Utilization, and Management. This risk category relates to user training and organizational buy-in; tactics and procedures for sustainment; and usability issues.
      • Information Assurance. This risk category relates specifically to the survivability/security assessment.

2. Likelihood of Risk Occurrence. Once risks have been identified, the risk assessment will distinguish the likelihood that a risk will occur versus the consequence if the risk does occur. The OTAs may use any point system they commonly use for rating risk likelihood. Table 1 below is an example of a three-point scale. Every risk identified in step (1) for a given capability should be rated for its likelihood. When in doubt, the likelihood should be rated high.

Table 1. Likelihood of Occurrence at IOT&E/Follow-on Test and Evaluation

3. Operational Impact of Risk Occurrence. The assessment of operational impact, which is the operational consequence of the risk occurring, is somewhat different in the context of OT&E activity from the assessment of impact in a standard risk assessment. First, operational impacts relate only to performance impacts, not impacts on cost and schedule. Second, some risks can have performance impacts that do not greatly affect the mission, and therefore have low operational impact. For example, a risk could cause a complete loss of a specific operational capability but still have low mission impact because of redundancy within the system of interest or the presence of another system able to support completing the mission. So, operational impact involves an understanding of the performance impacts of risks plus an assessment of the operational/mission relevance of those impacts. The risk categories above are organized by similar performance outcomes:

      • Technology and Software Development Risks: The performance impacts of these risks will tend to be computer crashes, data errors, and excess resource usage on desktops, servers, and local networks.
      • Integration and Deployment: The performance impacts of these risks tend to be problems in data transmission and interoperability. Deployment risks also include one-way data or computer transformations that cannot be rolled back. Irreversible changes must be tested at OT&E Level II or III.
      • Training, Utilization, and Management: The performance impacts of these risks tend to be problems in user acceptance, excessive training requirements, software administration, and data maintenance.
      • Information Assurance: These are risks related to security exposure, intrusion detection, and ability to respond.

Table 2. Operational/Mission Impact Classification

 

Determination of Level of Operational Test Required. On completion of the risk analysis effort, the levels of OT&E for each risk can be determined from the matrix below. The level of OT&E for each capability is the maximum of the OT&E levels determined for each of the risks to the capability.

Likelihood of Risk Occurrence vs Operational/Mission Impact of Risk Matrix
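The matrix itself is not reproduced here. Purely as an illustrative sketch of the determination logic described above, the code below encodes the rule that a capability's OT&E level is the maximum of the levels determined for each of its risks. The specific likelihood/impact labels and the level assigned to each cell are assumptions for this sketch, not the memo's actual matrix; the real scales and cell values come from the OTA's rating scheme and the memo.

```python
# Illustrative only: the real likelihood/impact scales and the level assigned
# to each cell are taken from the OTA's rating scheme and the memo's matrix.
LIKELIHOOD = ("low", "moderate", "high")
IMPACT = ("low", "moderate", "high")

# Hypothetical matrix: OT&E level indexed by (likelihood, impact).
OTE_LEVEL = {
    ("low", "low"): 1,      ("low", "moderate"): 1,      ("low", "high"): 2,
    ("moderate", "low"): 1, ("moderate", "moderate"): 2, ("moderate", "high"): 3,
    ("high", "low"): 2,     ("high", "moderate"): 3,     ("high", "high"): 3,
}

def level_for_capability(risks):
    """Return the OT&E level for a capability: the maximum of the levels
    determined for each of its risks (per the guideline above).

    `risks` is an iterable of (likelihood, impact) ratings.
    """
    return max(OTE_LEVEL[(likelihood, impact)] for likelihood, impact in risks)

# Example: a capability with one moderate-likelihood/high-impact risk and one
# low-likelihood/low-impact risk would require Level III OT&E under this sketch.
print(level_for_capability([("moderate", "high"), ("low", "low")]))  # -> 3
```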

 

Obtain Approvals. Once the risk assessment is complete, if an Applicable Program, the OTA will provide DOT&E with the risk assessment (likelihoods of occurrence and mission impacts) and the corresponding proposed level of OT&E for approval.