DoD Digital Engineering in Practice: What Defense Primes Are Actually Doing

The Department of Defense’s Digital Engineering Strategy was published in 2018. The mandate to transition major defense acquisition programs to authoritative digital models was codified through policy updates and OUSD(R&E) guidance, and eventually baked into contract requirements on programs including Next Generation Air Dominance, the Army’s FUTURA vehicle family, and several classified space programs. Nearly eight years later, what has actually changed on the program floor?

The honest answer is: some things substantially, many things marginally, and one critical area — requirements management — almost not at all on most programs.

What the Mandate Actually Requires

The DoD Digital Engineering Strategy centers on five goals: formalizing the development, integration, and use of models to inform decision-making; providing an enduring, authoritative source of truth; incorporating technological innovation to improve engineering practice; establishing the infrastructure and environments to perform digital engineering activities; and transforming the culture and workforce to adopt digital engineering across the lifecycle.

“Authoritative source of truth” is the operative phrase. The mandate is not about using model-based systems engineering (MBSE) tools. It is about ensuring that design decisions, requirements, interfaces, and performance parameters live in connected, version-controlled digital artifacts — not in PDFs circulated over email.

That distinction matters enormously when you look at what programs are actually delivering.

What Programs Are Doing Well

On the infrastructure side, the primes have made genuine progress. Lockheed Martin, Northrop Grumman, Raytheon, and Boeing have all stood up secure cloud enclaves or on-premises environments that host SysML models, simulation data, and product lifecycle management (PLM) data with appropriate access controls. For classified programs, that alone represented years of information system security officer (ISSO) negotiation and infrastructure investment.

MBSE adoption is also real, if uneven. Programs that started from clean-sheet designs after 2020 — particularly in space, autonomy, and directed energy — have system architects working natively in tools like Cameo, MagicDraw, and Rhapsody from the beginning. The SysML models are connected to simulation environments. Interfaces are defined in the model, not in interface control documents (ICDs) that drift from implementation within a release cycle.

For those programs, digital engineering is working roughly as intended at the architecture level. Requirements are allocated to subsystems in the model. Trade studies reference the same parameter set that feeds performance analysis. When a threshold changes, the delta propagates through a connected artifact chain rather than requiring a manual search through a document hierarchy.
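To make "the same parameter set" concrete, here is a minimal Python sketch with invented parameter names and a toy relationship in place of any real performance model: both the trade study and the performance analysis read from one authoritative store, so a threshold change made there is the only change anyone has to make.

```python
# Minimal sketch (hypothetical names): one authoritative parameter set that
# both a trade study and a performance analysis read from, so a threshold
# change propagates instead of drifting across copied documents.

authoritative_params = {
    "max_takeoff_weight_kg": 12_000,   # hypothetical threshold
    "required_range_km": 1_500,
}

BASELINE_MTOW_KG = 12_000  # reference point for the toy model below

def range_performance(params: dict) -> float:
    """Toy performance analysis reading the shared parameter set."""
    # Placeholder relationship, not a real flight-performance model.
    return params["required_range_km"] * (BASELINE_MTOW_KG / params["max_takeoff_weight_kg"])

def trade_study(params: dict) -> str:
    """Toy trade study referencing the same parameters, not a copy of them."""
    margin = range_performance(params) - params["required_range_km"]
    return "feasible" if margin >= 0 else "infeasible"

# A threshold change is made once, in the authoritative set, and every
# consumer sees it on the next evaluation -- no manual document sweep.
authoritative_params["max_takeoff_weight_kg"] = 13_500
print(trade_study(authoritative_params))  # -> infeasible
```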

That is genuinely different from how it worked before, and the programs doing this well report earlier identification of interface conflicts and more disciplined management of requirements growth.

Where Implementation Is Lagging

The gap between the mandate and reality shows up in three specific places.

Requirements management is still mostly document-based. Even on programs with mature MBSE implementations, the official requirements baseline often lives in IBM DOORS, DOORS Next, or Jama Connect — disconnected from the SysML model. Traceability from system requirements to subsystem requirements to verification events is maintained through manually updated traceability matrices, which means it is maintained poorly. Engineers working in the model don’t have real-time visibility into which requirements their design decisions affect. Requirements engineers maintaining DOORS records don’t have visibility into the current state of the model.

This is not a criticism of DOORS or its successors as standalone tools. DOORS Next is a capable requirements repository. Jama Connect has strong review and approval workflow support. But neither was built to natively represent a system as a graph of interconnected constraints, parameters, and design decisions. They were built to manage documents and hierarchical requirement sets. When you ask them to serve as the connective tissue between a system model and a verification program, the seams show.

Verification and validation traceability is largely aspirational. The mandate requires that the authoritative model support V&V planning and execution. In practice, most programs still generate test plans from document templates, then manually link test results back to requirements in spreadsheets or a standalone tool. The digital thread from requirement to design artifact to test result to as-built configuration exists in concept but not in executable form on most programs.

Supply chain compliance is the largest gap. The DoD mandate applies to prime contractors and, through contract flow-down clauses, to major subcontractors. In practice, enforcement has been inconsistent. A Tier 1 subcontractor delivering a major subsystem may have nominal MBSE capability but deliver interface data in PDF ICDs with a SysML model attached as a formality. Tier 2 suppliers — the companies building components, subassemblies, and manufacturing processes that determine whether the design is producible — are working in Microsoft Office in almost every program we’ve spoken with.

This is not purely a compliance failure. It reflects a real capability gap. Small and mid-size defense suppliers typically don’t have dedicated systems engineering staff. They have program managers, engineers, and manufacturing leads. Asking them to adopt MBSE tooling requires investment in licenses, training, and process change that a 50-person supplier can’t absorb on a fixed-price contract.

The Requirements Tooling Decision

Requirements management tooling is where digital engineering strategy and program reality collide most directly, and it’s where we’re seeing the most active evaluation activity in 2025 and 2026.

The entrenched baseline is IBM DOORS or DOORS Next. On legacy programs, DOORS databases contain decades of requirements history, change records, and verification links. Migration costs are not trivial — not just data migration, but the process re-engineering required when your change control process is built around DOORS workflows.

Jama Connect and Polarion have both made inroads on newer programs, particularly where web-based access and modern UX matter for distributed teams. Jama has strong adoption in space programs. Polarion’s integration with Siemens PLM tools makes it a natural choice for programs already in the Siemens ecosystem. Codebeamer has similar traction in defense programs that came through the automotive supply chain.

The question all of these tools face is the same: they are fundamentally document and record management systems with traceability bolted on. They store requirements as structured text with attributes and links. They don’t represent a system as a network of interconnected constraints and decisions that can be queried, analyzed, and reasoned over.

That limitation is why graph-native requirements management tools are entering serious evaluation cycles. Flow Engineering, which was built specifically for hardware and systems engineering teams, uses a graph-based data model where requirements, design parameters, interfaces, and verification events are nodes connected by typed relationships. Engineers can trace impact across the model — “if this performance threshold changes, what subsystem requirements are affected, and which of those have open verification events?” — without building a manual RTM and hoping it’s current.
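As an illustration of the underlying idea rather than any vendor’s actual data model or API, here is a small Python sketch using a generic graph library: requirements, parameters, and verification events become typed nodes, traceability becomes typed edges, and the impact question above becomes a query instead of a matrix audit. The node identifiers and relationship names are invented for the example.

```python
# Sketch of the graph-native idea (illustrative only): typed nodes for
# parameters, requirements, and verification events, with traceability
# captured as typed edges that can be queried directly.
import networkx as nx

g = nx.DiGraph()

# Hypothetical nodes: a performance parameter, the requirements it constrains,
# and the verification events planned against those requirements.
g.add_node("PARAM-RANGE", kind="parameter", name="required_range_km")
g.add_node("SYS-042", kind="requirement", level="system")
g.add_node("SUB-117", kind="requirement", level="subsystem")
g.add_node("SUB-118", kind="requirement", level="subsystem")
g.add_node("VER-301", kind="verification", status="open")
g.add_node("VER-302", kind="verification", status="closed")

# Typed relationships instead of rows in a manually maintained matrix.
g.add_edge("PARAM-RANGE", "SYS-042", relation="constrains")
g.add_edge("SYS-042", "SUB-117", relation="derives")
g.add_edge("SYS-042", "SUB-118", relation="derives")
g.add_edge("SUB-117", "VER-301", relation="verified_by")
g.add_edge("SUB-118", "VER-302", relation="verified_by")

def open_verification_impact(graph: nx.DiGraph, changed_node: str) -> list[str]:
    """Everything downstream of a changed parameter that still has an open
    verification event -- the question an impact query should answer."""
    downstream = nx.descendants(graph, changed_node)
    return [n for n in downstream
            if graph.nodes[n].get("kind") == "verification"
            and graph.nodes[n].get("status") == "open"]

print(open_verification_impact(g, "PARAM-RANGE"))  # -> ['VER-301']
```

The design point is that the traceability relationships are first-class data, so the answer is computed from the current model state rather than read from a matrix that may or may not have been updated.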

Several primes are running structured evaluations of Flow Engineering alongside their incumbent tools, specifically for new program starts where the legacy migration problem doesn’t apply. The evaluations are focused on two questions: whether the graph model genuinely reduces the labor burden of maintaining traceability, and whether the tool can integrate with existing MBSE and PLM environments well enough to avoid creating another data island.

The honest assessment is that graph-native approaches are better suited to what digital engineering actually requires — a connected, queryable representation of system knowledge — than the document management tools that currently dominate. The practical question is whether the integration ecosystem and organizational readiness exist to support adoption on a live program. On new starts with modern infrastructure, that case is increasingly clear. On legacy programs with deep DOORS investments, replacement is still a multi-year, high-risk proposition.

Supply Chain: The Unsolved Problem

No prime has solved the supply chain compliance problem at scale. The approaches being tried fall into three categories.

The first is data extraction at delivery: require suppliers to deliver data in specified formats at contract milestones, then ingest it into the prime’s authoritative environment. This works for interface definitions and formal requirements but doesn’t give the prime visibility into supplier design decisions in real time.
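In code terms, this pattern amounts to a validation gate at the ingest boundary. The sketch below assumes a hypothetical JSON delivery format and a made-up traceability naming convention; the point is that the delivery is machine-checked before it enters the authoritative environment, not that this is any prime’s actual schema.

```python
# Sketch of the "extract at delivery" pattern (hypothetical delivery format):
# suppliers deliver interface data in an agreed structure at a milestone, and
# the prime validates it before ingesting it into the authoritative environment.
import json

REQUIRED_FIELDS = {"interface_id", "name", "signal_type", "allocated_requirement"}

def validate_delivery(path: str) -> list[str]:
    """Return a list of problems; an empty list means the package can be ingested."""
    problems = []
    with open(path) as f:
        records = json.load(f)          # expected: a list of interface records
    for i, record in enumerate(records):
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            problems.append(f"record {i}: missing fields {sorted(missing)}")
        elif not record["allocated_requirement"].startswith("SUB-"):
            # Hypothetical convention: every interface traces to a subsystem requirement.
            problems.append(f"record {i}: untraceable requirement id "
                            f"{record['allocated_requirement']!r}")
    return problems

# Usage: hold the delivery (and the milestone) until the problem list is empty.
# problems = validate_delivery("supplier_icd_delivery.json")
```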

The second is tool mandates: require suppliers to use specific tools that can federate with the prime’s environment. This works for large Tier 1 suppliers with existing digital engineering capability. It doesn’t work for Tier 2 and below.

The third, and most promising, is tiered compliance with lightweight web-based tooling for lower-tier suppliers. If a Tier 2 supplier can participate in a requirements review and record verification events through a browser-based interface without a full MBSE implementation, compliance becomes tractable. This is an area where modern SaaS-based tools have a genuine structural advantage over legacy client-server architectures — participation doesn’t require a multi-thousand-dollar seat license and a week of training.
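A rough sketch of what that lower-friction participation could look like, using a generic Python web framework and invented field names rather than any existing product’s API: the supplier records a verification event over plain HTTP, and the heavy tooling stays on the prime’s side.

```python
# Minimal sketch (hypothetical endpoint and fields): a Tier 2 supplier records
# a verification event through a browser form or a simple HTTP call, with no
# MBSE tooling on their side of the interface.
from datetime import datetime, timezone
from flask import Flask, request, jsonify

app = Flask(__name__)
events = []  # stand-in for the prime's authoritative store

@app.route("/verification-events", methods=["POST"])
def record_event():
    payload = request.get_json(force=True)
    event = {
        "requirement_id": payload["requirement_id"],   # e.g. "SUB-117"
        "result": payload["result"],                   # "pass" / "fail"
        "evidence_uri": payload.get("evidence_uri"),   # link to a test report
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    events.append(event)
    return jsonify(event), 201

# Run with: flask --app verification_events run
```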

What Programs Getting This Right Have in Common

Across the programs where digital engineering is delivering genuine capability improvement rather than compliance theater, a few patterns are consistent.

They treated the transition as a data architecture decision, not a tooling decision. The first question was “what data needs to exist, in what form, connected in what ways” — not “which MBSE tool should we buy.”

They defined the authoritative source of truth explicitly and enforced it. When the model said one thing and a document said another, the model was right by policy. This sounds obvious. It is extraordinarily difficult to enforce on a mature program.

They invested in integration between tools rather than expecting any single tool to do everything. The SysML model, the requirements management tool, the simulation environment, and the configuration management system are connected by defined interfaces, not by engineers copying data between them.

And they started with new program content rather than migrating legacy baselines. The programs making the most visible progress are either new starts or mature programs that carved out a forward-looking portion of the work to implement the target architecture, deferring migration of legacy content.

Honest Assessment

The DoD Digital Engineering mandate is directionally correct. The programs that have implemented it faithfully are building and testing better systems with fewer late-stage surprises. The programs that implemented it as a compliance exercise — checking boxes while maintaining document-based baselines as the actual working environment — got the costs without the benefits.

The requirements management tooling question is not settled. Legacy tools are entrenched for reasons that are partly inertia and partly real switching cost. Graph-native approaches are technically better suited to what the mandate actually requires, and they’re entering programs at the evaluation stage. Whether they achieve broad adoption will depend on integration maturity, organizational readiness, and whether program offices start specifying data architecture requirements rather than just tool certifications in contracts.

The supply chain problem will not be solved by mandate alone. It requires either significant investment in supplier capability development or a generation of simpler, lower-friction tooling that makes lightweight digital engineering participation accessible to companies without dedicated systems engineering staff. Both are happening. Neither is happening fast enough.