Can You Manage eVTOL Certification Requirements in DOORS?
Yes. DOORS can manage eVTOL certification requirements. The question that actually matters for your program is whether you can afford the overhead.
That distinction is not semantic. IBM DOORS has been the aerospace requirements management standard for decades. It handles Part 25 transport category programs. It handles DO-178C and DO-254 avionics certification. FAA DERs know it. Suppliers know it. Regulatory agencies accept artifacts produced by it. If someone asks whether DOORS is capable of storing and linking eVTOL requirements, the honest answer is yes.
The harder question is what happens when you apply a document-centric tool to a certification program operating at eVTOL scale, at the pace of novel aircraft development, with requirements that are still being negotiated with the FAA during Stage 4 review. At that scale and pace, the tool’s architecture stops being a background concern and starts generating direct program risk.
What DOORS Actually Does Well
To evaluate DOORS fairly, you have to credit what it genuinely does. DOORS Next Generation (DOORS Next) provides a structured repository for requirements, supports typed links between requirements and downstream artifacts, and has a mature ecosystem of integrations with verification management tools, change control systems, and PLM platforms. For programs where the requirement set is largely stable, where the team is already trained, and where supplier exchange formats matter, DOORS is a rational choice.
The IBM ecosystem around DOORS Next — the Engineering Lifecycle Management (ELM) suite, Rhapsody integration, the ETM test management tooling — gives large organizations a connected, if complex, infrastructure. If you are a Tier 1 aerospace contractor with an existing DOORS infrastructure and a 737-scale program, that ecosystem has real value.
For organizations that have already paid the DOORS learning curve and built their processes around it, switching costs are real. That deserves honest acknowledgment.
Where the Architecture Starts Fighting You
eVTOL certification programs have structural characteristics that expose the specific limits of document-centric requirements management. These are not edge cases. They are central features of what it means to certify a novel aircraft type under a Special Federal Aviation Regulation or Issue Paper-driven means of compliance.
Verification test point volume
Joby Aviation has publicly described a certification scope involving hundreds of thousands of test points. That number is not unusual for a novel propulsion and flight control architecture where conventional means of compliance do not exist and new ones must be negotiated and demonstrated.
DOORS stores requirements as objects inside modules. Traceability is expressed as links between objects across modules. At scale, that link graph becomes unwieldy inside a document-centric structure. Queries slow. Reports require scripting. Coverage analysis requires exporting data to external tools or writing DXL (DOORS eXtension Language) scripts to traverse link chains. None of that is impossible — DOORS engineers do it regularly — but every one of those operations is manual work that adds latency between the state of the program and the state of the model.
At 50,000 test points, that overhead is manageable with a disciplined team. At 300,000 test points tied to evolving compliance artifacts across propulsion, structures, flight controls, and electrical systems, the overhead is a permanent tax on program velocity.
Requirements still evolving in Stage 4
The standard aerospace program assumption — that requirements are substantially frozen before verification planning begins — does not hold for novel aircraft certification. Stage 4 is supposed to be about demonstrating compliance. For an eVTOL program operating under a novel flight profile with distributed electric propulsion, Stage 4 frequently surfaces requirement gaps, ambiguous compliance criteria, or FAA feedback that requires requirement revision.
When a parent requirement changes in DOORS, the downstream impact — to derived requirements, to verification cases, to test procedures, to analysis reports — must be assessed and propagated manually. DOORS can mark links as suspect when a linked object changes, but it does not tell you which verification artifacts actually require rework. That assessment requires a human to traverse the link structure, identify affected objects, change their status, and notify downstream owners.
In a program where requirement changes are rare, that process is acceptable. In a program where requirements are being refined through active regulatory dialogue during verification execution, that process generates constant, compounding rework. Engineers who should be closing verification events are instead auditing link trees to determine what a requirement change touched.
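The audit task described above, determining what a parent-requirement change touched, is at bottom a downstream walk of the link graph. Here is a minimal sketch in Python over a hypothetical link table (in DOORS itself this would live in a DXL script or an export); all IDs are invented for illustration:

```python
from collections import deque

# Hypothetical link table: object ID -> IDs of downstream objects
# (derived requirements, verification cases, procedures, reports).
# All identifiers are invented for illustration.
links = {
    "SYS-12": ["SYS-12.1", "SYS-12.2"],
    "SYS-12.1": ["VER-301"],
    "SYS-12.2": ["VER-302", "VER-303"],
    "VER-302": ["RPT-77"],
}

def downstream_suspects(changed_id):
    """Breadth-first walk of outgoing links: every artifact whose
    parent chain includes the changed requirement is suspect."""
    suspects, queue = set(), deque([changed_id])
    while queue:
        node = queue.popleft()
        for child in links.get(node, []):
            if child not in suspects:
                suspects.add(child)
                queue.append(child)
    return suspects

print(sorted(downstream_suspects("SYS-12")))
# ['RPT-77', 'SYS-12.1', 'SYS-12.2', 'VER-301', 'VER-302', 'VER-303']
```

The walk itself is mechanical. What the tool cannot do for you is everything that follows from the list: the status changes, the triage, and the notifications to downstream owners.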
Manual traceability burden
The Requirements Traceability Matrix is the artifact that demonstrates to a regulatory authority that every requirement has a corresponding verification event and that verification event has a result. In DOORS, the RTM is typically generated as a report — a snapshot of the link state at a point in time.
That snapshot model creates a structural lag. The RTM you show a DER during an audit reflects the state of the database as of the last report generation, which may not reflect the state of the program as of today. More critically, maintaining RTM completeness requires every engineer adding a requirement, modifying a link, or closing a verification event to follow the linking protocol correctly, every time.
At eVTOL scale, across a team that includes software, hardware, structures, propulsion, flight controls, and human factors — with external suppliers contributing requirements objects — protocol compliance is not a given. Gaps accumulate. Finding them requires running coverage queries, which requires someone to own that function, which is overhead that grows with program size.
Coverage gap detection
A coverage gap — a requirement with no linked verification event, or a verification event with no result, or a derived requirement that has drifted out of alignment with its parent — is a certification finding waiting to happen. In a document-centric system, detecting those gaps is an active query process. Someone has to run the query. Someone has to interpret the results. Someone has to triage whether a gap is a real gap or a linking artifact.
The problem is not that DOORS cannot surface gaps. It can, with the right scripts and report configurations. The problem is that gap detection is not continuous and automatic — it is periodic and manual. In an active program, the interval between gap detection runs is an interval during which new gaps can open without anyone knowing.
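Stripped to essentials, each detection run is a batch query over a snapshot of exported link data. A hypothetical sketch, with invented identifiers:

```python
# Hypothetical snapshot of exported traceability data: requirements,
# each requirement's linked verification events, and recorded results.
# All identifiers are invented for illustration.
requirements = {"REQ-1", "REQ-2", "REQ-3"}
verification_links = {"REQ-1": ["VE-10"], "REQ-2": ["VE-11", "VE-12"]}
results = {"VE-10": "pass", "VE-12": "fail"}

def coverage_gaps():
    """One run of the periodic gap query: requirements with no
    verification event, and events with no recorded result."""
    uncovered = [r for r in sorted(requirements)
                 if not verification_links.get(r)]
    unresolved = [ve for linked in verification_links.values()
                  for ve in linked if ve not in results]
    return uncovered, unresolved

print(coverage_gaps())  # (['REQ-3'], ['VE-11'])
```

The query itself is trivial. The structural problem is that it only reflects the export it ran against, and someone has to own the job of running it.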
For a Part 25-equivalent novel aircraft certification, a coverage gap discovered late in Stage 4 is not a database maintenance problem. It is a schedule risk.
The Architecture Mismatch Explained Directly
DOORS was designed when aerospace certification programs were document-centric because the regulatory framework was document-centric. You submitted documents. Documents had sections. Sections had requirements. Requirements had verification references. That model made sense.
eVTOL certification under an Issue Paper-driven means of compliance is not a document problem — it is a graph problem. The relationships between a top-level airworthiness requirement, its derived requirements across multiple systems, the applicable means of compliance, the test procedures, the analysis methods, the test results, and the open findings are a network of interconnected objects. The correctness of the certification case depends on the integrity of that network as a whole.
Representing a network inside a document structure does not make the network wrong — it makes it harder to traverse, harder to query, harder to update, and harder to validate. Every operation that is natural in a graph model requires a workaround in a document model. Over a multi-year, multi-hundred-thousand test point certification program, those workarounds accumulate into structural overhead.
What Graph-Native Tooling Changes
Tools that model requirements, verification events, compliance artifacts, and their relationships as a live graph — rather than as linked documents — change the operational profile of the traceability problem in ways that matter at eVTOL scale.
Gap detection becomes continuous rather than periodic. The graph always knows which nodes have no downstream links, which links cross a changed requirement boundary, and which verification events have no attached results. Engineers do not run reports to find gaps — the gaps are visible in the model at all times.
Requirement churn propagates automatically. When a parent requirement changes, the graph flags all downstream nodes that are affected. Engineers receive a scoped work list rather than an open-ended audit task.
Coverage can be queried at any level of abstraction — by system, by phase, by means of compliance — without exporting data or writing custom scripts, because the query language is native to the model.
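To make the contrast concrete, here is a toy live-graph model (a hypothetical sketch, not any particular product's API or data model) in which suspect flags and gap sets are updated the moment an edit happens, rather than computed by a periodic report:

```python
from collections import defaultdict

class TraceGraph:
    """Toy live traceability graph: edits update suspect flags and
    gap sets immediately, instead of waiting for a report run.
    Identifiers in the example below are invented for illustration."""
    def __init__(self):
        self.children = defaultdict(set)   # requirement -> downstream nodes
        self.covered = set()               # nodes with a verification result
        self.suspect = set()               # nodes downstream of a changed parent

    def link(self, parent, child):
        self.children[parent].add(child)

    def record_result(self, node):
        self.covered.add(node)

    def change(self, node):
        # A requirement edit propagates suspect status as it happens.
        stack = list(self.children.get(node, ()))
        while stack:
            n = stack.pop()
            if n not in self.suspect:
                self.suspect.add(n)
                stack.extend(self.children.get(n, ()))

    def gaps(self):
        # Always current: leaf nodes with no result, plus anything suspect.
        leaves = {n for kids in self.children.values() for n in kids
                  if not self.children.get(n)}
        return (leaves - self.covered) | self.suspect

g = TraceGraph()
g.link("SYS-12", "VER-301")
g.link("SYS-12", "VER-302")
g.record_result("VER-301")
print(sorted(g.gaps()))   # ['VER-302']: visible with no query run
g.change("SYS-12")        # parent edit flags both events as suspect
print(sorted(g.gaps()))   # ['VER-301', 'VER-302']
```

The design point is that gap state and suspect state are properties the model maintains, not answers a report computes, so there is no interval during which a gap exists and nobody can see it.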
Flow Engineering was built specifically for this problem domain. It implements a graph-native requirements and traceability model designed for hardware and systems engineering teams running complex certification programs. Its architecture is designed to absorb requirement churn, surface coverage gaps automatically, and provide live RTM state rather than periodic snapshots — the specific capabilities that matter when your requirements are evolving during Stage 4 and your test point count is in the hundreds of thousands.
That design focus means it does not replicate the full breadth of the IBM ELM ecosystem or the extensive supplier exchange formats that DOORS has accumulated over decades. For programs where supplier DOORS compatibility is a hard constraint, that tradeoff deserves evaluation. For programs where the primary risk is internal traceability overhead rather than supplier exchange format, the tradeoff runs the other direction.
The Honest Assessment
DOORS is not the wrong tool because it is old or because it lacks features. It is the wrong tool for high-churn, high-volume eVTOL certification programs because its document-centric architecture generates compounding manual overhead precisely in the conditions those programs operate under — large requirement sets, evolving compliance criteria, continuous verification execution, and late-breaking regulatory feedback.
The overhead is real. It falls on systems engineers, verification engineers, and configuration managers. It creates latency between program state and tool state. It makes coverage gaps harder to find and slower to close.
For a program like Joby’s — or any eVTOL program operating at that scale under novel means of compliance — that overhead is not a tooling annoyance. It is a program risk with a direct path to schedule impact and certification findings.
DOORS can do the job. The question is whether you want your engineers spending their cycles on the job or on the tool.