What Is ASPICE and Why Does It Matter for Automotive Suppliers
Automotive SPICE — formally Automotive Software Process Improvement and Capability dEtermination, abbreviated ASPICE — is a process assessment framework used across the global automotive supply chain to evaluate how well an organization’s engineering processes are defined, managed, and capable of producing quality software and systems. If your organization develops embedded software, systems, or integrated hardware-software products for automotive OEMs, ASPICE is almost certainly a contractual requirement, not a voluntary improvement initiative.
The framework was developed in the early 2000s by a working group within the European automotive industry under the umbrella of the Automotive Special Interest Group (Automotive SIG). Its technical foundation is ISO/IEC 15504, the international standard for process assessment, which has since been superseded by the ISO/IEC 330xx family of standards. ASPICE was purpose-built because the generic process models in ISO/IEC 15504 were too abstract for the specific demands of automotive software development — cycle times, functional criticality, supply chain depth, and OEM audit requirements all needed explicit handling.
The current version, ASPICE 4.0, was released in 2023 and introduced clearer alignment with systems engineering practice and explicit support for machine learning components. Version 3.1 remains in active use across many programs. Both versions share the same core architecture: a two-dimensional model combining process areas with capability levels.
How ASPICE Fits into OEM Supplier Qualification
OEMs — Volkswagen Group, BMW, Stellantis, Toyota, and others — do not simply ship requirements to suppliers and wait for deliverables. They actively assess whether their suppliers’ engineering processes are capable enough to produce reliable software under automotive constraints. ASPICE is the common language for that assessment.
In practice, an OEM will specify a target capability level, typically Level 2 or Level 3, in their supplier quality requirements or General Purchasing Conditions. At program launch or during source selection, suppliers submit self-assessments. At defined milestones — often at Engineering Release, software freeze, or SOP minus 12 months — OEM-appointed assessors or third-party assessment organizations conduct formal assessments against the ASPICE PAM (Process Assessment Model).
Tier 1 suppliers face this scrutiny directly from OEMs. Tier 2 suppliers — those delivering software components, sensors, or modules to Tier 1s — increasingly face equivalent requirements from their Tier 1 customers, who are accountable to the OEM for process capability throughout their upstream supply chain.
Failing an ASPICE assessment doesn’t necessarily kill a program, but it triggers a formal improvement plan with retargeted milestones and increased oversight. Repeated failures can affect future source selection decisions. This is why ASPICE compliance is an engineering operations issue, not just a quality department concern.
The Capability Levels: What Each One Requires
ASPICE defines six capability levels, numbered 0 through 5. Understanding what distinguishes each level is essential because assessors are not looking for documentation theater — they are looking for objective evidence that processes actually operate the way the model describes.
Level 0 — Incomplete. The process is not implemented or fails to achieve its purpose. No evidence of outputs.
Level 1 — Performed. The process achieves its purpose. Work products exist. This level is often called “hero engineering” — things get done, but how and by whom is informal and person-dependent.
Level 2 — Managed. The process is planned, monitored, and adjusted. Work products are controlled. Resources are identified. This is where most OEM minimum requirements sit. Reaching Level 2 means you have documented plans, tracked status, and managed work products with version control and review records.
Level 3 — Established. The process is defined at the organizational level, not just the project level. A standard process exists, is tailored appropriately for each project, and the organization learns from process performance across programs. Level 3 is the target for suppliers competing for complex, safety-critical development contracts.
Level 4 — Predictable. Process performance is measured quantitatively. Statistical methods are applied to understand variation. Few automotive suppliers operate consistently at Level 4 across their ASPICE process areas.
Level 5 — Optimizing. Continuous quantitative improvement. This level exists more as an aspirational benchmark than a practical audit target for most programs.
The jump from Level 1 to Level 2 requires engineering teams to change how they work day-to-day, not just what they document. The jump from Level 2 to Level 3 requires organizational infrastructure — standard process definitions, process asset libraries, tailoring guidelines — that extends beyond any single project team.
The Process Areas That Matter Most for Systems and Software Teams
ASPICE organizes its process areas into groups. For systems and software engineering teams, the following are the ones that receive the most intensive scrutiny in OEM assessments.
SYS.1 — Requirements Elicitation. How stakeholder needs and constraints are gathered, understood, and transformed into system-level requirements. Assessors look for evidence that requirements sources are identified, that requirements are traced back to stakeholder needs, and that changes are managed.
SYS.2 — System Requirements Analysis. How system requirements are derived, specified, and analyzed. Bidirectional traceability to stakeholder requirements is a core expectation. Requirements must be verifiable and testable.
SYS.3 — System Architectural Design. How the system architecture is defined, and how architectural elements trace to system requirements. Interface specifications are a key work product.
SYS.4 and SYS.5 — System Integration and Testing. Evidence of test strategies, test cases linked to requirements, test execution records, and coverage analysis.
SWE.1 — Software Requirements Analysis. Derivation of software requirements from system requirements. This process area explicitly requires traceability from software requirements to system requirements, and assessors will look for that link in both directions.
SWE.2 and SWE.3 — Software Architectural and Detailed Design. Design documents, component specifications, and evidence that design decisions trace to requirements.
SWE.4, SWE.5, SWE.6 — Software Construction and Testing. Unit testing, integration testing, and qualification testing with coverage metrics tied to requirements.
SUP.8 — Configuration Management. Version control, baseline management, and change control for all work products.
SUP.9 — Problem Resolution Management. Defect tracking, root cause analysis, and closure evidence.
SUP.10 — Change Request Management. How requirements changes are analyzed for impact, approved, and implemented.
Across all of these, the single most visible evidence gap in assessments is broken or missing traceability. An organization may have good requirements and good tests, but if assessors cannot follow the chain from stakeholder need through system requirement, software requirement, design element, and test case — with documented links and change history — they cannot award capability credit.
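That traceability chain can be made concrete with a small sketch. In this simplified model, each artifact records its upstream links, so an assessor's two core questions — can you walk from a test case back to a stakeholder need, and can you walk forward again? — become simple traversals. All IDs and link data here are invented for illustration; they are not drawn from any real ASPICE work product.

```python
# Illustrative traceability model: artifact IDs mapped to upstream links.
# All identifiers here are hypothetical examples.
trace_up = {
    "TC-001": ["SWR-010"],      # test case -> software requirement
    "SWR-010": ["SYSR-004"],    # software requirement -> system requirement
    "SYSR-004": ["STK-001"],    # system requirement -> stakeholder need
    "STK-001": [],              # stakeholder need: root of the chain
}

def chain_to_root(artifact, links):
    """Follow upstream links from an artifact back to its root."""
    chain = [artifact]
    parents = links.get(artifact, [])
    while parents:
        parent = parents[0]  # simplified: each artifact has one upstream link
        chain.append(parent)
        parents = links.get(parent, [])
    return chain

def downstream_index(links):
    """Invert the upstream links to get the downstream direction,
    which assessors also expect to be able to follow."""
    down = {}
    for child, parent_list in links.items():
        for parent in parent_list:
            down.setdefault(parent, []).append(child)
    return down
```

A broken link anywhere in `trace_up` — a test case whose requirement was deleted, a requirement with no upstream parent — truncates the chain, which is exactly the gap an assessor finds by pulling a thread.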
ASPICE and ISO 26262: Complementary, Not Redundant
ISO 26262 is the international standard for functional safety of road vehicles. Because both ASPICE and ISO 26262 apply to automotive software development, they are frequently conflated or treated as alternatives. They are neither.
ISO 26262 specifies what must be achieved — safety goals, hazard analysis, ASIL classifications, safety requirements, verification objectives. It is outcome-oriented. ASPICE specifies how well the organization’s processes are managed and capable of achieving outcomes reliably. ASPICE is process-maturity-oriented.
The practical relationship is that ISO 26262 compliance requires process discipline that ASPICE assesses. A team operating at ASPICE Level 1 with informal, person-dependent processes will struggle to produce the consistent, auditable ISO 26262 work products that a TÜV SÜD or Bureau Veritas assessment requires. ASPICE Level 2 and Level 3 process infrastructure — defined roles, managed work products, version control, formal reviews — is the foundation on which ISO 26262 compliance is built.
Many OEMs require both simultaneously for ASIL C and ASIL D components. A supplier presenting a strong ISO 26262 safety case built on informal processes typically fails both assessments.
How Modern Traceability Tools Support ASPICE Compliance
Here is where engineering teams often misunderstand the compliance burden. ASPICE does not require a specific tool or a specific document format. What it requires is objective evidence: work products that demonstrate process activities occurred, were managed, were reviewed, and were linked. Assessors do not take a supplier’s word that traceability exists — they pull threads and follow them.
For most of the last two decades, automotive suppliers maintained this evidence in spreadsheet-based RTMs (Requirements Traceability Matrices), Word documents, and fragmented tool chains that connected IBM DOORS to Excel to test management systems via exports and manual imports. This approach produces evidence that is perpetually out of date, expensive to maintain, and brittle under change.
Tools like IBM DOORS and DOORS Next have genuine strengths in this context — DOORS in particular has decades of automotive deployment history, mature module-based architecture, and established import/export workflows for AUTOSAR toolchains. For large Tier 1 organizations with existing DOORS infrastructure and dedicated RM administrators, the transition cost away from DOORS remains a real constraint. These are not tools to dismiss lightly.
However, the structural limitation of document-centric RM tools for ASPICE compliance is the same limitation that affects them in every other context: traceability is a manual overlay on top of documents, not a native property of the data model. Links break silently during imports, change impact analysis requires human intervention to propagate, and the RTM snapshot prepared for an assessor may not reflect the current state of requirements or test results.
This is where graph-based, AI-native tools address a structural gap. Flow Engineering, for example, represents requirements, design elements, test cases, and their relationships as nodes and edges in a live graph. Every change to a requirement automatically surfaces downstream impact across connected work products. An ASPICE assessor asking for the traceability chain from a specific stakeholder requirement through to test execution evidence receives a live query result, not a document last exported three weeks before the assessment.
For SWE.1 and SYS.2 specifically — the process areas where bidirectional traceability is most intensively scrutinized — Flow Engineering’s model means that links are never maintained separately from the artifacts they connect. The link is part of the data model, which means it cannot go stale in the way that an RTM column can.
Flow Engineering also supports the change management evidence required for SUP.9 and SUP.10. Because every requirement version, every link modification, and every review action is recorded with timestamps and user attribution, the audit trail that Level 2 and Level 3 capability requires is a byproduct of normal engineering work rather than a retroactive documentation exercise. This is the distinction that matters operationally: compliance evidence generated continuously as engineering happens is more accurate and less expensive to produce than evidence assembled before an assessment.
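The idea that the audit trail is a byproduct of normal work, rather than a pre-assessment exercise, can be sketched as an append-only event log: every change is recorded as an immutable event at the moment it happens, and the evidence an assessor asks for is just a query over that log. The event fields and class names below are illustrative assumptions, not Flow Engineering's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ChangeEvent:
    """One immutable audit record (fields are illustrative, not any tool's real schema)."""
    artifact_id: str
    action: str          # e.g. "edited", "link-added", "reviewed"
    user: str
    timestamp: datetime

class AuditLog:
    """Append-only log: recording events during normal work *is* the audit trail."""
    def __init__(self):
        self._events = []

    def record(self, artifact_id, action, user):
        event = ChangeEvent(artifact_id, action, user,
                            datetime.now(timezone.utc))
        self._events.append(event)
        return event

    def history(self, artifact_id):
        """The evidence an assessor would ask for: who did what, and when."""
        return [e for e in self._events if e.artifact_id == artifact_id]

# Events accumulate as engineers work; no separate documentation step.
log = AuditLog()
log.record("SWR-010", "edited", "alice")
log.record("SWR-010", "reviewed", "bob")
log.record("TC-001", "link-added", "alice")
```

Because events are frozen and only ever appended, `history("SWR-010")` answers a SUP.10 question directly, with timestamps and user attribution intact, instead of being reconstructed from documents before an assessment.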
The trade-off worth naming is that Flow Engineering is focused on requirements and traceability. It does not replace a dedicated test management platform or a configuration management system, and suppliers with complex legacy DOORS module hierarchies will need a migration strategy before realizing the benefits. For organizations building new programs from scratch or modernizing their RM infrastructure ahead of a major platform launch, the ASPICE alignment is strong.
Practical Starting Points for ASPICE Compliance
If your organization is targeting Level 2 for an upcoming OEM program, the highest-leverage actions are:
Establish bidirectional traceability before you write the first test. Traceability retrofitted late in the program is always incomplete and always costs more than traceability built in from the start. Your requirements tooling should make creating links a first-class operation, not an afterthought.
Treat work product version control as non-negotiable from day one. Assessors at Level 2 will ask for baseline histories. If your answer is a file server with inconsistent naming conventions, you do not have Level 2.
Document review records at the time of review. Review minutes reconstructed from memory the week before an assessment are the evidence gap an experienced assessor identifies most easily.
Understand which process areas your OEM weighs most heavily. Different OEMs and assessment organizations have different emphasis patterns. SWE.1 and SYS.2 are consistently high-priority. Ask your OEM liaison which process areas will receive the most scrutiny in your specific program context.
ASPICE is an engineering discipline, not a documentation exercise. The suppliers who consistently reach and maintain Level 3 are not the ones with the most elaborate document templates — they are the ones whose engineering processes are actually managed the way the model describes, and whose tooling makes that evidence visible without extra effort.