Fusion Energy’s Engineering Challenge: Building Systems With No Heritage

When a new commercial aircraft program launches, engineers inherit decades of heritage: type-certified components, operational failure data, regulatory precedent, and requirements documents refined through thousands of hours of fleet experience. When Commonwealth Fusion Systems finalized the design requirements for SPARC’s high-temperature superconducting (HTS) magnet system, its engineers were working from a much thinner base. There were no commercial fusion reactors to reference. There was no fleet. There was, in many respects, no playbook.

This is the defining systems engineering condition of commercial fusion in 2026: an industry writing its own requirements, largely from first principles, while simultaneously conducting the experiments that would normally have validated those requirements decades earlier. It is an extraordinary technical freedom. It is also a liability that could determine which companies survive the engineering gauntlet to first power.

The Heritage Problem, Precisely Defined

Heritage requirements in complex engineering programs serve a specific function: they encode hard-won empirical knowledge into formal constraints. A fission reactor vendor specifying coolant flow rates through a primary loop draws on 70 years of operating data across hundreds of reactors. That data has been distilled into codes and standards — 10 CFR 50, ASME Section III, IEEE 603 — that carry embedded conservatism informed by real failures.

Commercial fusion has none of this for its most critical subsystems. Plasma confinement systems at fusion-relevant parameters (plasma temperatures above 100 million degrees Celsius, sustained fusion gain) have not operated commercially. First wall and blanket materials have not experienced commercial-scale neutron flux over operational lifetimes. Tritium breeding blankets exist at laboratory scale but have never operated in an integrated power plant. The power conversion backend is better understood, but the interface requirements between a fusion neutron source and a thermal power system are novel.

This means fusion systems engineers face a specific and unusual problem: they must write requirements that are simultaneously technically grounded enough to design hardware against, and flexible enough to be revised as the physics experiments catch up. That balance is structurally different from writing requirements for a new variant of a known system class.

Freedom: The Case for the Blank Slate

The absence of heritage is not purely a disadvantage. Legacy nuclear requirements carry enormous embedded conservatism that fusion programs can legitimately bypass where the underlying physics differs.

Fission reactors must manage decay heat — the thermal energy released by fission products after shutdown — which drives major safety system requirements including emergency core cooling, redundant heat removal, and containment design. Fusion reactors do not have this problem in the same form. A tokamak or inertial confinement system that loses plasma control simply stops producing fusion power. This fundamental physics difference means fusion programs can, in principle, write simpler safety cases for certain failure modes than equivalent fission programs.

Similarly, the absence of a long-lived fission product inventory changes the waste management requirements picture significantly. Fusion companies are not inheriting requirements shaped by the political and regulatory legacy of plutonium management.

Companies like Helion Energy, pursuing a field-reversed configuration (FRC) approach quite different from the mainline tokamak design, have architectural freedom that would be unavailable if they were constrained to requirements inherited from a prior generation of similar machines. The blank slate allows genuine architectural innovation at the system level — not just incremental improvement.

TAE Technologies, pursuing a beam-driven FRC approach, has similarly written plasma system requirements that would not exist in a heritage document set because their confinement concept has no commercial predecessor. Their requirements define a solution space, not a refinement of prior art.

This freedom is real. But it creates a specific failure mode that is equally real.

Liability: When First Principles Fail

Requirements derived from first principles carry a risk that heritage requirements do not: they may be wrong in ways that do not become apparent until late in the development cycle, when design changes are expensive.

In heritage-rich programs, requirements are conservatively bounded by empirical data. The requirement is wrong only if the data was wrong or misinterpreted. In first-principles derivation, the requirement is only as good as the physics model used to derive it. If the model is incomplete — if it fails to capture an interaction between subsystems that wasn’t anticipated — the requirement can be precisely stated and rigorously traced while still being fundamentally inadequate.

The tritium breeding blanket is a concrete example. Tritium self-sufficiency is a hard requirement for any commercial D-T fusion reactor: the plant must breed at least as much tritium as it burns, plus losses. The breeding ratio requirement can be derived from neutron physics models. But those models depend on assumptions about neutron multiplication, geometry, material activation, and tritium extraction efficiency that have not been validated at commercial scale. A breeding ratio requirement of 1.05 (5% surplus) looks precise on paper. Whether it is achievable — or whether the actual achievable ratio under integrated operating conditions is 0.95 — is an empirical question that the requirements document cannot answer.
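The sensitivity described above can be made concrete with a toy calculation. The function below is an illustrative sketch, not a real blanket model: the loss fraction and extraction efficiency values are hypothetical placeholders, chosen only to show how quickly plausible inefficiencies consume a nominal 5% breeding surplus.

```python
# Illustrative only: hypothetical numbers, not from any real plant design.
def required_tbr(loss_fraction: float, extraction_efficiency: float) -> float:
    """Minimum tritium breeding ratio for self-sufficiency.

    Each tritium atom burned must be replaced, plus an allowance for
    losses (decay, retention in materials), divided by the fraction
    actually recovered from the breeder blanket.
    """
    return (1.0 + loss_fraction) / extraction_efficiency

# A 2% loss allowance with 97% extraction efficiency already pushes the
# requirement slightly above the nominal 1.05 surplus cited in the text.
print(round(required_tbr(loss_fraction=0.02, extraction_efficiency=0.97), 3))
```

The point is not the specific numbers but the structure: the requirement is a ratio of quantities that have never been measured under integrated operating conditions, so small revisions to either input can flip the requirement from comfortably met to unachievable.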

This physics-to-requirements uncertainty propagates through the entire system. First wall material requirements depend on neutron flux models. Neutron flux models depend on plasma performance requirements. Plasma performance requirements depend on confinement physics that is still being refined experimentally. The requirements chain has no firmly anchored empirical foundation at the plasma physics end — it rests on validated models of increasing fidelity, not on operational heritage.

This is not a reason to abandon rigorous requirements management. It is precisely the reason rigorous requirements management becomes more important, not less.

Standards Adaptation: Using the Wrong Map

In the absence of fusion-specific standards, commercial fusion companies are adapting from several source domains, none of which fits cleanly.

Nuclear fission standards (10 CFR 50/52, ASME BPVC Section III, IEEE nuclear standards) provide the most obviously relevant framework. The NRC’s regulatory structure for advanced reactors — including the Part 53 framework under development — is being adapted to fusion contexts. But fission standards encode assumptions about decay heat, coolant chemistry, criticality, and waste that do not translate directly. Using them uncritically would impose unnecessary constraints. Using them selectively requires justifying every deviation from established regulatory practice, which itself creates significant regulatory engagement overhead.

Aerospace and defense standards (MIL-STD-882 for system safety, AS9100 for quality management systems, MIL-STD-1521 for technical reviews) provide process frameworks that fusion companies are adopting more readily. The system safety approach from MIL-STD-882 — hazard identification, risk assessment, hazard controls — is structurally applicable to fusion even though the specific hazards differ. Several fusion companies have adopted AS9100-aligned quality management systems, borrowing aerospace discipline for documentation and configuration control.

Industrial pressure vessel and materials standards (ASME BPVC, EN 13445, ASTM material specifications) apply directly to the mechanical systems that are well understood — vacuum vessels, heat exchangers, structural supports — and are being applied where applicable without major adaptation.

IEC 61508 for functional safety of electrical and electronic systems is being selectively applied to control and protection systems, with fusion-specific failure mode analysis layered on top.

The practical result is a patchwork: fusion companies are operating under multiple partially applicable standards frameworks simultaneously, with significant internal judgment calls about where each standard’s logic applies and where it needs to be adapted or supplemented. This creates compliance risk — not because the companies are cutting corners, but because the regulatory baseline for commercial fusion is still being established, and the standards being applied were not designed for this system class.

Requirements Traceability in a Moving Physics Baseline

The standard expectation for requirements traceability — every requirement traceable to a parent requirement or stakeholder need, every design element traceable to a requirement — becomes genuinely difficult to satisfy when the physics baseline is itself under revision.

Consider a simplified requirements chain for a tokamak first wall:

Plasma performance target → neutron wall loading → thermal flux → first wall heat removal requirement → coolant flow rate → pump specification

Each link in this chain is derivable from physics models. But if an experimental campaign revises the plasma performance target — which is a normal occurrence in fusion development — the revision propagates through the entire chain. Every downstream requirement and design element is potentially affected. In a document-based requirements management approach, tracking this propagation manually across hundreds of interconnected requirements is where change management breaks down.
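To see how a single upstream revision moves a hardware specification, consider a toy version of the last few links in that chain. Every number below is a hypothetical placeholder, not a figure from any real tokamak program; the heat-removal relation m_dot = Q / (cp · ΔT) is standard thermal engineering, but the wall loading, panel area, and temperature rise are invented for illustration.

```python
# Toy version of the requirements chain in the text. All values are
# hypothetical placeholders, not figures from any real fusion program.

def coolant_mass_flow(thermal_power_w: float, cp_j_per_kg_k: float,
                      delta_t_k: float) -> float:
    """Coolant mass flow required to remove a given thermal power:
    m_dot = Q / (cp * dT)."""
    return thermal_power_w / (cp_j_per_kg_k * delta_t_k)

neutron_wall_loading = 2.0e6   # W/m^2 -- assumed plasma performance target
first_wall_area = 5.0          # m^2   -- assumed panel area
thermal_power = neutron_wall_loading * first_wall_area  # 10 MW on this panel

# Water coolant (cp ~ 4186 J/kg-K) with an assumed 40 K temperature rise.
flow = coolant_mass_flow(thermal_power, cp_j_per_kg_k=4186.0, delta_t_k=40.0)
print(f"{flow:.1f} kg/s")
```

Revise the wall-loading assumption after an experimental campaign and the pump-sizing number at the bottom of the chain changes with it. The arithmetic is trivial; the management problem is that a real program has hundreds of such chains, cross-linked, and the revision must reach all of them.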

This is precisely the problem that graph-based requirements management addresses. When requirements and their relationships are represented as a connected graph rather than a document hierarchy, a change to a parent requirement automatically surfaces all downstream requirements and design elements that need review. The traceability isn’t manual — it’s structural.

Tools like Flow Engineering, built specifically for hardware and systems engineering teams working with complex, interconnected requirement sets, implement this graph-based approach. For fusion programs where the physics-to-requirements boundary is shifting throughout development, the ability to propagate change impact analysis automatically through a requirements graph is not a convenience feature — it’s a prerequisite for maintaining a coherent and current requirements baseline. The alternative is requirements documents that are formally consistent at the time of issue and progressively diverge from each other as individual sections are updated without systematic propagation.
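The structural traceability described above reduces, at its simplest, to a graph traversal. The sketch below is a minimal illustration: the requirement names and edges are invented for this example, and real tools model far richer relationship types (verification links, interface dependencies, rationale), but the core mechanism — revise a node, surface everything downstream — is just breadth-first search over the derivation graph.

```python
# Minimal sketch of graph-based change-impact analysis. Requirement
# names and edges are invented for illustration only.
from collections import deque

# parent requirement -> requirements directly derived from it
derives = {
    "plasma_performance_target": ["neutron_wall_loading"],
    "neutron_wall_loading": ["thermal_flux"],
    "thermal_flux": ["first_wall_heat_removal"],
    "first_wall_heat_removal": ["coolant_flow_rate"],
    "coolant_flow_rate": ["pump_specification"],
}

def downstream_impact(graph: dict, changed: str) -> list:
    """Breadth-first traversal: every requirement needing review
    when `changed` is revised."""
    seen, queue, order = {changed}, deque([changed]), []
    while queue:
        node = queue.popleft()
        for child in graph.get(node, []):
            if child not in seen:
                seen.add(child)
                order.append(child)
                queue.append(child)
    return order

print(downstream_impact(derives, "plasma_performance_target"))
```

In a document hierarchy, this traversal is a human reading sections and hoping nothing was missed; in a graph representation, it is a query that returns the complete review set every time.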

Competitive Differentiation Through Requirements Discipline

In a field where multiple well-capitalized companies are pursuing fundamentally different confinement approaches, requirements management quality is becoming a differentiator in ways that are not always visible from the outside.

The differentiator operates on two axes.

Speed of iteration. Fusion development requires rapid integration of experimental results into design. Programs that can propagate experimental findings into updated requirements, identify affected subsystems, and issue revised specifications quickly can iterate faster. Programs that manage requirements in disconnected documents — a physics model in one system, requirements in another, design specifications in a third — introduce delays at every handoff. Those delays compound.

Regulatory readiness. Regulatory review of first-of-a-kind nuclear facilities is requirements-intensive. Regulators need to follow the logic from safety requirements to design implementation. Programs with well-structured, traceable requirements can construct and present that argument efficiently. Programs without it must reconstruct traceability retroactively — an expensive and error-prone process that slows licensing.

The companies that reach first power will not do so through physics alone. The gap between “we have shown fusion conditions in a laboratory device” and “we have an operating commercial power plant” is primarily a systems engineering gap. Requirements management quality is a direct determinant of how efficiently that gap is closed.

Honest Assessment

Commercial fusion’s no-heritage situation is not a problem that better tools or better processes can fully resolve. The physics uncertainty is real, and it will only be resolved by experiment. No requirements management system converts an incomplete physics model into a reliable requirement.

What rigorous requirements management does is prevent the compounding of physics uncertainty with process failures. It ensures that when the physics baseline updates — and it will — the downstream impact on requirements and design is visible, traceable, and manageable. It keeps the requirements database coherent as a living document rather than allowing it to fragment into a collection of individually current but mutually inconsistent artifacts.

Fusion programs that treat requirements management as overhead — something to be formalized after the physics is settled — are making a category error. The physics will not be fully settled before the hardware must be built. The discipline to manage requirements rigorously under uncertainty is not a downstream concern. It is a present-tense competitive variable in the race to first power.

The blank slate is real. So is the liability. How well fusion companies manage the space between them will be one of the defining engineering stories of this decade.