What the Next Generation of Systems Engineers Expects from Their Tools

The tools an engineering team uses have always said something about the organization. But for a long time in hardware and systems engineering, that signal ran in one direction: heavy, expensive, complex tooling was a proxy for serious, mature programs. IBM DOORS was a mark of enterprise credibility. The friction was considered part of the discipline.

That logic is breaking down. Not because the tools got worse, but because the engineers being hired into these programs grew up in a completely different software environment — and they are now making career decisions based partly on what tools they’ll be asked to use.

The Baseline Has Shifted

Engineers entering the workforce over the last five to eight years have been shaped by a specific kind of software experience: Notion, Linear, Figma, GitHub, Slack. These tools share a set of properties that are now being taken for granted: real-time collaboration, browser-based access, immediate feedback, searchable everything, and integration through APIs rather than manual exports.

This isn’t a preference for “nice UI.” It’s a set of functional expectations built through daily use. When a new systems engineer encounters a tool that requires a dedicated client installation, has a data model expressed through folder hierarchies and module types, and exports to Word for review cycles, it doesn’t read to them as rigorous. It reads as broken.

The cognitive load isn’t just about learning curves. It’s about the gap between how the tool works and how modern engineering collaboration actually happens. A junior systems engineer who spends three months learning DOORS or Polarion is learning a tool, not the domain. That’s time that isn’t spent understanding requirements quality, traceability rationale, or system architecture. The tool becomes the subject rather than the instrument.

What Legacy Tools Actually Cost in Practice

Legacy requirements management tools — DOORS, DOORS Next, Polarion, Codebeamer, Jama Connect — are not bad tools in the sense of being wrong about requirements engineering. Their data models encode real knowledge about how requirements need to be structured, traced, and baselined for regulated programs. That’s worth acknowledging directly.

But several of their operational characteristics create specific friction that newer engineers identify as dealbreakers:

Installation and environment management. Tools that require local clients, server configurations, or VPN-gated access create onboarding friction that engineers from web-first backgrounds find genuinely confusing. “Why can’t I just log in?” is not a naive question.

Change management for simple edits. In many legacy tools, modifying a requirement in a way that preserves traceability integrity involves a workflow with five to eight steps. Engineers who are accustomed to inline editing with automatic version history find this workflow disproportionate to the task.

Review cycles through document exports. The practice of exporting requirements to Word or PDF for stakeholder review — and then reconciling comments back into the tool — is invisible overhead that compounds across a program’s life. Engineers who have used tools with native commenting and resolution workflows see this immediately as process debt.

Search and navigation. Hierarchical module structures in legacy tools force navigation patterns that don’t match how engineers think about requirements when they’re working. Cross-cutting concerns, tags, and semantic relationships are difficult to express and traverse.

None of these are fatal in isolation. Together, they produce a working environment that actively conflicts with the expectations of engineers who’ve spent their formative years in high-quality SaaS products.

When Tool Friction Shows Up in Hiring Decisions

This has moved beyond abstract frustration. Program managers and engineering directors at several mid-size defense and aerospace primes have noted — informally, in conversations at industry events — that tool environment is appearing in candidate conversations more often than it did five years ago.

The pattern is usually not “I won’t take this job because of your requirements tool.” It’s more subtle: candidates asking what the collaboration workflow looks like, whether engineers can work remotely without VPN overhead, whether the team uses any AI assistance for requirements work. When the answers reveal a legacy toolchain, some candidates disengage. The strongest candidates — the ones who have options — disengage more often.

The effect shows up in retention too. Exit surveys at hardware-focused companies increasingly capture comments about the tooling environment, and frustration with requirements processes is usually bundled with frustration about tool quality. When a requirements process is painful and the tool that runs it is also painful, engineers don't distinguish between them. They experience both as a system that doesn't respect their time.

This is a recruiting problem that most hiring managers are not fully accounting for in their tool selection calculus. The license cost and the switching cost of moving from DOORS to anything else are visible. The marginal offer-acceptance rate and the early-tenure attrition partially attributable to the tool environment are not tracked in the same spreadsheet.

AI Assistance Is No Longer a Differentiator — It’s an Expectation

Among engineers under 35, AI assistance in development workflows is essentially universal. GitHub Copilot, Cursor, and various LLM integrations into IDEs have normalized the idea that writing — whether code or requirements text — should be assisted by intelligent suggestions, completeness checks, and automated review.

When engineers move into requirements work and find no AI assistance available, the absence is jarring. It’s not that they expect AI to write requirements for them. They expect it to flag when a requirement is ambiguous, suggest derived requirements they might have missed, check for consistency across the specification, and surface related requirements they should be aware of. These are the kinds of tasks that consume significant cognitive overhead in requirements work, and they’re exactly the tasks where pattern-matching at scale is valuable.
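To make the first of those checks concrete, here is a minimal sketch of the kind of weak-word ambiguity flagging that requirements-quality guides describe. The term list, function name, and heuristics are illustrative assumptions, not any particular tool's API, and a production check would go well beyond pattern matching:

```python
import re

# Terms that requirements-quality guides (e.g. the INCOSE guide to
# writing requirements) commonly flag as ambiguous, because they leave
# acceptance criteria open to interpretation. Illustrative, not exhaustive.
WEAK_TERMS = [
    "appropriate", "adequate", "as needed", "user-friendly",
    "fast", "efficient", "minimize", "maximize",
    "should",  # many style guides reserve "shall" for binding requirements
]

def flag_ambiguity(requirement: str) -> list[str]:
    """Return ambiguity findings for a single requirement statement."""
    findings = []
    lowered = requirement.lower()
    for term in WEAK_TERMS:
        # Whole-word match, so "fast" doesn't fire on "fastener".
        if re.search(rf"\b{re.escape(term)}\b", lowered):
            findings.append(f"ambiguous term: {term!r}")
    # A requirement with no number in it is often untestable as written.
    if not re.search(r"\d", requirement):
        findings.append("no measurable quantity found")
    return findings

print(flag_ambiguity("The pump should respond quickly under appropriate load."))
# ["ambiguous term: 'appropriate'", "ambiguous term: 'should'",
#  "no measurable quantity found"]
```

Checks like this are the shallow end. The more valuable tasks in the list above, suggesting derived requirements or checking consistency across a specification, need far more context than a single string, and that context is exactly where the architectural question comes in.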

Legacy tools have been adding AI capabilities as bolt-on features, but the architecture shows. When a tool was designed for structured document management and AI is added as a module or integration, it tends to be disconnected from the core data model. The suggestions don’t understand the traceability graph. The checks don’t know what requirements are upstream. The value is limited.

Tools built from the ground up with AI as a core design assumption work differently. Flow Engineering, for instance, was designed around a graph-based requirements model where AI assistance has access to the full structure of relationships — between requirements, system elements, verifications, and stakeholders. When the model is the interface rather than a document being managed by the interface, AI can actually reason about requirements quality in context rather than in isolation.
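To see why the graph matters, here is a deliberately simplified sketch of a graph-based requirements model. The types, relationship names, and gap check are hypothetical illustrations, not Flow Engineering's actual data model or API, but they show the structural difference: a traceability gap becomes a one-pass traversal rather than a manual audit of exported documents.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    id: str
    kind: str   # e.g. "requirement", "verification", "system_element"
    text: str = ""

@dataclass
class RequirementsGraph:
    nodes: dict[str, Node] = field(default_factory=dict)
    # Directed, labeled edges: (source id, relationship, target id),
    # e.g. ("REQ-001", "verified_by", "VER-001").
    edges: list[tuple[str, str, str]] = field(default_factory=list)

    def add(self, node: Node) -> None:
        self.nodes[node.id] = node

    def link(self, src: str, rel: str, dst: str) -> None:
        self.edges.append((src, rel, dst))

    def related(self, node_id: str, rel: str) -> list[Node]:
        return [self.nodes[dst] for src, r, dst in self.edges
                if src == node_id and r == rel]

    def traceability_gaps(self) -> list[str]:
        """Requirements with no verification link. In a document-centric
        tool this is a manual audit; here it is a traversal."""
        return [n.id for n in self.nodes.values()
                if n.kind == "requirement"
                and not self.related(n.id, "verified_by")]

g = RequirementsGraph()
g.add(Node("REQ-001", "requirement", "The vehicle shall survive a 9 g axial load."))
g.add(Node("REQ-002", "requirement", "The avionics bay shall remain below 45 C."))
g.add(Node("VER-001", "verification", "Static load test, 9 g axial."))
g.link("REQ-001", "verified_by", "VER-001")
print(g.traceability_gaps())  # ['REQ-002']
```

An assistant whose suggestions run against a structure like this can answer "what breaks downstream if this requirement changes" by walking edges. An assistant bolted onto exported text cannot, no matter how good the underlying language model is.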

What Modern Teams Are Actually Looking For

In conversations with systems engineers at companies that have recently evaluated or switched tooling, a few consistent themes emerge:

Collaboration without ceremony. The ability for a systems engineer, a test engineer, and a customer stakeholder to comment on the same requirement in real time — without an export-review-import cycle — is not a luxury. It’s how fast-moving programs need to work.

Traceability that engineers actually maintain. In many legacy tool deployments, traceability is technically captured but practically stale. Engineers don't maintain it because keeping it current takes more effort than their immediate schedule pressure allows. Modern teams want tools where traceability maintenance has low friction, not tools where it's correct in theory and ignored in practice.

Visibility into the whole system, not just the module. Requirements exist in relationship to architecture, tests, and operational concepts. Engineers want a tool that makes those relationships visible as part of normal work, not as a separate activity.

AI that understands the domain, not just the text. Autocomplete and grammar checking are not what engineers mean when they say they want AI assistance. They mean tools that understand what a well-formed derived requirement looks like, what a testability problem looks like, and what a traceability gap looks like in context.

An Honest Assessment

The generational shift in engineering tool expectations is real, but it’s not happening uniformly or quickly. Large defense programs with contractual requirements around specific tools, or with decades of data captured in DOORS, are not migrating on the basis of engineer preference. The switching cost is real, the contractual inertia is real, and the risk tolerance for toolchain changes in safety-critical programs is appropriately low.

What is changing is the calculus at the margin: which programs choose which tools for new program starts, which companies use modern tooling as a talent differentiator in recruiting, and which teams are able to onboard new engineers into requirements work in weeks rather than months.

The companies that are moving fastest on this are generally not the largest primes. They’re mid-size contractors, commercial space and defense technology companies, and dual-use hardware startups where the engineering team skews younger and the program management overhead is lower. These are also, not coincidentally, the companies most likely to win engineering talent away from the larger primes in the current market.

For hiring managers and program managers evaluating their requirements toolchain: the cost of modern tooling is visible. The cost of legacy tooling is mostly invisible — distributed across recruiting friction, onboarding time, early-tenure attrition, and the slow degradation of requirements quality that happens when engineers find workarounds rather than using the tool correctly. That math is worth doing before the next program start.