Critical Design Review


Reference Source: DAG CH 3-3.3.5 Critical Design Review

The Critical Design Review (CDR), which occurs during the Engineering and Manufacturing Development (EMD) phase, confirms that the system design is stable and can be expected to meet system performance requirements, confirms that the system is on track to achieve affordability and should-cost goals as evidenced by the detailed design documentation, and establishes the initial product baseline.

The CDR provides the acquisition community with evidence that the system, down to the lowest system element level, has a reasonable expectation of satisfying the requirements of the system performance specification as derived from the Capability Development Document (CDD) within current cost and schedule constraints. At this point in the program, system performance expectations are based on analysis and any prototype testing/demonstration efforts conducted at the system element and/or system level. Demonstration of a complete system is not expected to be accomplished by this point.

The CDR establishes the initial product baseline for the system and its constituent system elements. It also establishes requirements and system interfaces for enabling system elements such as support equipment, training system, maintenance and data systems. The CDR should establish an accurate basis to assess remaining risk and identify new opportunities. At this point the system has reached the necessary level of maturity to start fabricating, integrating, and testing pre-production articles with acceptable risk.

The product baseline describes the detailed design for production, fielding/deployment and operations and support. The product baseline prescribes all necessary physical (form, fit and function) characteristics and selected functional characteristics designated for production acceptance testing and production test requirements. It is traceable to the system performance requirements contained in the CDD. The initial system element product baseline is established and placed under configuration control at the system element CDR and verified later at the Physical Configuration Audit (PCA). The Program Manager (PM) assumes control of the initial product baseline at the completion of the system level CDR to the extent that the competitive environment permits. This does not necessarily mean that the PM takes delivery and acceptance of the Technical Data Package (TDP).

Roles and Responsibilities

Reference Source: DAG CH 3-3.3.5 Critical Design Review

The Systems Engineer documents the approach for the CDR in the Systems Engineering Plan (SEP). This includes identification of criteria and artifacts defining the product baseline.

The PM reviews and approves the approach, ensures the required resources are available and recommends review participants.

The PM and Systems Engineer may hold incremental CDRs for lower-level system elements, culminating with a system-level CDR. The system CDR assesses the final design as captured in system performance specifications for the lower-level system elements; it further ensures that documentation for the detailed design correctly and completely captures each such specification. The PM and Systems Engineer evaluate the detailed designs and associated logistics elements to determine whether they correctly and completely implement all allocated system requirements, and whether they have maintained traceability to the CDD.

The PM’s responsibilities include:

  • Approving, funding and staffing the system CDR as planned in the SEP developed by the Systems Engineer.
  • Establishing the plan leading to the System Verification Review (SVR) in applicable contract documents, including the SE Management Plan (SEMP), Integrated Master Schedule (IMS) and Integrated Master Plan (IMP).
  • Ensuring the plan includes subject matter experts to participate in each review.
  • Controlling the configuration of the Government-controlled subset of the functional, allocated and product baselines, and convening Configuration Steering Boards (CSBs) when changes are warranted.

The Systems Engineer’s responsibilities include:

  • Developing and executing the system CDR plans with established quantifiable review criteria, carefully tailored to satisfy program objectives.
  • Ensuring the pre-established review criteria have been met, confirming that the design has been captured in the allocated baseline and initial product baseline.
  • Ensuring assessments and risks associated with all design constraints and considerations are conducted, documented and provided (e.g., reliability and maintainability, corrosion, and Environment, Safety and Occupational Health (ESOH) considerations).
  • Updating risk, issue and opportunity plans: identifying, analyzing, mitigating and monitoring risks and issues; and identifying, analyzing, managing and monitoring opportunities (see the DoD Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs). Monitoring and controlling the execution of the CDR closure plans.
  • Documenting the plan leading to SVR in the SEP and elsewhere as appropriate.

The CDR is mandatory for Major Defense Acquisition Programs (MDAPs). A CDR assessment, evaluating both the conduct of the review and the program's technical risk, is provided to the Milestone Decision Authority (MDA). For ACAT ID programs, DASD(SE) conducts the CDR assessment; for ACAT IC programs, the Component Acquisition Executive conducts it.

Inputs and Review Criteria

Reference Source: DAG CH 3-3.3.5 Critical Design Review

Figure 25 provides the end-to-end perspective and the integration of SE technical reviews and audits across the acquisition life cycle.

Figure 25: Weapon System Development Life Cycle

The March 2012 Government Accountability Office (GAO) report, “Assessments of Selected Weapon Programs,” suggests a best practice is to achieve design stability at the system-level CDR. A general rule is that 75 to 90 percent of manufacturing-quality product drawings, software design specifications and associated instructions (100 percent for all Critical Safety Items (CSIs) and Critical Application Items (CAIs)) should be complete in order to provide tangible evidence of a stable product design. A prototype demonstration shows that the design is capable of meeting performance requirements.
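
As an illustration only, this rule of thumb can be expressed as a simple completeness roll-up. The Python sketch below is hypothetical and not part of any DoD toolset; the artifact fields, the 0.75 floor and the CSI/CAI flag are assumptions drawn from the GAO rule quoted above.

    # Hypothetical design-stability roll-up at CDR. Field names and the 0.75
    # floor reflect the GAO rule of thumb above; this is not an official DoD
    # tool or data standard.
    from dataclasses import dataclass

    @dataclass
    class Artifact:
        name: str
        complete: bool       # released at manufacturing quality
        csi_or_cai: bool     # Critical Safety Item / Critical Application Item

    def design_stability(artifacts: list[Artifact], floor: float = 0.75) -> dict:
        total = len(artifacts)
        done = sum(a.complete for a in artifacts)
        pct = done / total if total else 0.0
        csi_cai_ok = all(a.complete for a in artifacts if a.csi_or_cai)
        return {
            "percent_complete": round(100 * pct, 1),
            "meets_75_90_rule": pct >= floor,     # 75 to 90 percent complete
            "csi_cai_100_percent": csi_cai_ok,    # 100 percent for CSIs/CAIs
        }

For example, a program with 80 of 100 drawings complete, including all CSI and CAI items, would report 80.0 percent and satisfy both checks.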

The CDR review criteria are developed to best support the program’s technical scope and risk and are documented in the program’s SEP no later than Milestone B.

CDR Products and Criteria

Reference Source: DAG CH 3-3.3.5 Critical Design Review

The following products, with their associated CDR criteria, are assessed at the review:

Cost Estimate
  • Updated Cost Analysis Requirements Description (CARD) is consistent with the approved initial product baseline
  • System production cost model has been updated, allocated to system-element level and tracked against targets
Technical Baseline Documentation (Initial Product)
  • Detailed design (hardware and software), including interface descriptions, are complete and satisfy all requirements in the functional baseline
  • Requirements trace among functional, allocated and initial product baselines is complete and consistent (an illustrative trace check follows this criteria list)
  • Key product characteristics having the most impact on system performance, assembly, cost, reliability and sustainment or ESOH have been identified to support production decisions
  • Initial product baseline documentation is sufficiently complete and correct to enable hardware fabrication and software coding to proceed with proper configuration management
  • Assessment of the technical effort and design indicates potential for operational test and evaluation success (operationally effective and operationally suitable) (See CH 8–4.3.)
  • 100% of Critical Safety Items and Critical Application Items have completed drawings, specifications and instructions
  • Failure mode, effects and criticality analysis (FMECA) is complete
  • Estimate of system reliability and maintainability based on engineering analyses, initial test results or other sources of demonstrated reliability and maintainability
  • Detailed design satisfies sustainment and Human Systems Integration (HSI) requirements (See CH 5–4.)
  • Software functionality in the approved initial product baseline is consistent with the updated software metrics and resource-loaded schedule
  • Software and interface documents are sufficiently complete to support the review
  • Detailed design is producible and assessed to be within the production budget
  • Process control plans have been developed for critical manufacturing processes
  • Critical manufacturing processes that affect the key product characteristics have been identified, and the capability to meet design tolerances has been determined
  • Verification (developmental test and evaluation (DT&E)) assessment to date is consistent with the initial product baseline and indicates the potential for test and evaluation success (See Test and Evaluation Master Plan (TEMP) and Chief Developmental Tester in CH 8–4.3.)
Risk Assessment
  • All risk assessments and risk mitigation plans have been updated, documented, formally addressed and implemented
  • Approach/Strategy for test and evaluation defined in the TEMP accounts for risks with a mitigation plan; necessary integration and test resources are documented in the TEMP and current availabilities align with the Program’s IMS (Systems Engineer coordinates with Chief Developmental Tester in this area; see CH 8–4.3.)
  • ESOH risks are known and being mitigated
  • Risks associated with intelligence mission data (IMD) dependencies have been identified and addressed; refer to CH 3–4.3.12. Intelligence (Life-Cycle Mission Data Plan)
Technical Plans
  • Preliminary Design Review (PDR) is successfully completed; all PDR actions are closed
  • Integrating activities of any lower-level CDRs have occurred; identified issues are documented in action plans
  • All entry criteria stated in the contract (e.g., SOW, SEP, approved SEMP and system performance specification) have been satisfied
  • Adequate processes and metrics are in place for the program to succeed
  • Program schedule as depicted in the updated IMS (see CH 3–4.1.1.2. Integrated Master Plan and CH 3-4.1.1.3. Integrated Master Schedule) is executable (within acceptable technical/cost risks)
  • Program is properly staffed
  • Program is executable with the existing budget and the approved initial product baseline
  • Detailed trade studies and system producibility assessments are under way
  • Issues cited in the Information Support Plan (ISP) are being satisfactorily addressed
  • Materials and tooling are available to meet the pilot line schedule
  • Logistics (sustainment) and training systems planning and documentation are sufficiently complete to support the review
  • Life-Cycle Sustainment Plan (LCSP), including updates on program sustainment development efforts and schedules based on current budgets, test and evaluation results and firm supportability design features, is approved
  • Long-lead procurement plans are in place; supply chain assessments are complete
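
The baseline-trace criterion above is essentially a completeness check over parent-child links among the three baselines. The following Python sketch is purely illustrative; the requirement IDs and the parent-link representation are assumptions, not a DoD data standard.

    # Illustrative trace check among functional, allocated and initial product
    # baselines. 'allocated' and 'product' map each requirement ID to its
    # parent ID in the next-higher baseline; all names are hypothetical.
    def trace_gaps(functional: set[str],
                   allocated: dict[str, str],
                   product: dict[str, str]) -> list[str]:
        gaps = []
        # Upward trace: every requirement must have a parent one level up.
        for req, parent in allocated.items():
            if parent not in functional:
                gaps.append(f"{req}: no functional parent")
        for req, parent in product.items():
            if parent not in allocated:
                gaps.append(f"{req}: no allocated parent")
        # Downward completeness: every allocated requirement should appear as
        # the parent of at least one product-baseline requirement.
        unimplemented = set(allocated) - set(product.values())
        gaps.extend(f"{req}: not realized in product baseline"
                    for req in sorted(unimplemented))
        return gaps

    # Example: an empty result means the trace is complete and consistent.
    assert trace_gaps({"F-1"}, {"A-1": "F-1"}, {"P-1": "A-1"}) == []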

Outputs and Products

Reference Source: DAG CH 3-3.3.5 Critical Design Review

The Technical Review Chair determines when the review is complete. Completion of the CDR should provide the following:

  • An established initial product baseline.
  • Acceptable risks with mitigation plans approved and documented in the IMS.
  • Updated CARD (or CARD-like document) based on the initial product baseline.
  • Updated program development schedule including fabrication, test and evaluation, software coding and critical path drivers.
  • Corrective action plans for issues identified in the CDR.
  • Updated LCSP, including program sustainment development efforts and schedules based on current budgets, test and evaluation results, and firm supportability design features.

Note that baselines for some supporting items might not be at the detailed level and may lag the system-level CDR. Enabling systems may be on different life-cycle timelines. The CDR agenda should include a review of all this information, but any statement that all of the detailed design activity on these systems is complete may lead to misunderstandings. As an example, development of simulators and other training systems tends to lag behind weapon system development.

Critical Design Review (CDR) Assessment

Reference Source: DAG CH 3-3.3.5 Critical Design Review

A system-level CDR assessment is required for MDAP programs. This assessment informs the MDA of the technical risks and the program’s readiness to proceed. The Deputy Assistant Secretary of Defense for Systems Engineering (DASD(SE)) is directed to conduct CDR assessments on ACAT ID programs; and the Component Acquisition Executive (CAE) is to conduct CDR assessments on ACAT IC programs. In support of this policy direction, MDAP PMs are required to invite DASD(SE) and CAE to their CDRs and make the CDR artifacts available.

DASD(SE) reviews the conduct of the program’s CDR, to include system-element level reviews as appropriate, and provides the MDA with an assessment of the following:

  • The conduct and adequacy of the CDR, including the participation of stakeholders, technical authorities and subject matter experts; status of the CDR entrance and exit criteria; open Requests for Action/Information; and closure of the system elements and system-level reviews.
  • The program technical schedule and schedule risk assessments.
  • The program’s risks, issues and opportunities.
  • The establishment and configuration control of the initial product baseline as demonstrated by the completion of build-to documentation for hardware and software configuration items, including production models, drawings, software design specifications, materials lists, manufacturing processes and qualification plans/procedures.
  • The design’s ability to meet Key Performance Parameter (KPP), Key System Attribute (KSA) and Technical Performance Measure (TPM) thresholds, and the proposed corrective actions to address any performance gaps, as appropriate (a toy threshold check follows this list).
  • Key Systems Engineering design considerations.
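
As a toy illustration of the threshold comparison in the performance bullet above, the snippet below flags metrics whose current estimate falls short of its threshold. The metric names and values are invented, and it assumes higher is better for every measure.

    # Toy gap check for KPP/KSA/TPM thresholds; names and values are invented
    # and every measure is assumed to be "higher is better".
    thresholds = {"range_km": 500.0, "mtbf_hours": 200.0}
    estimates = {"range_km": 540.0, "mtbf_hours": 185.0}

    for metric, floor in thresholds.items():
        est = estimates.get(metric)
        if est is None or est < floor:
            print(f"Gap on {metric}: estimate {est} below threshold {floor}")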

T&E Considerations

Reference Source: DAG CH 8-3.8.3 Critical Design Review

The Critical Design Review (CDR) assesses design maturity, design build-to or code-to documentation, and remaining risks, and establishes the initial product baseline. The CDR serves as the decision point confirming that the system design is ready to begin developmental prototype hardware fabrication and/or software coding with acceptable risk. The system CDR occurs during the EMD phase.

Besides establishing the initial product baseline for the system and its constituent system elements, the CDR also establishes requirements and system interfaces for enabling system elements such as support equipment, training system, maintenance, and data systems. The CDR should establish an accurate basis to assess remaining risk and identify new opportunities.

The Chief Developmental Tester and the Lead DT&E Organization participate in the CDR and provide any analysis and assessments to date.