What Is ASPICE? Automotive SPICE Explained for Hardware and Systems Engineers
Automotive SPICE, commonly abbreviated ASPICE, is one of the more misunderstood frameworks in embedded systems engineering. Engineers often encounter it as a compliance checkbox imposed by an OEM customer, without a clear picture of what it actually measures, why it exists, or what passing an assessment requires in practice.
This article cuts through that confusion. It explains what ASPICE is, where it came from, what the six capability levels mean operationally, and which process areas matter most to hardware and systems engineers. It also covers what assessors inspect as evidence and how modern tooling can reduce the overhead of producing that evidence.
What ASPICE Actually Is
ASPICE stands for Automotive Software Process Improvement and Capability dEtermination. It is a process assessment framework — not a product standard, not a safety standard, and not a testing protocol. It measures whether your engineering processes are defined, followed, and improving over time.
The framework was developed by a consortium of European automotive OEMs in the early 2000s, working from ISO/IEC 15504 (since superseded by the ISO/IEC 33000 series), which defined a generic two-dimensional model for assessing software processes. The automotive industry adapted that model for supplier development relationships: an OEM needs confidence that a Tier 1 or Tier 2 supplier’s development process is disciplined enough to produce reliable software and systems at scale. ASPICE provides the assessment methodology to establish that confidence.
The framework is maintained by the Automotive Special Interest Group (Automotive SIG), a working group operating under the VDA QMC in Germany. Automotive SPICE v3.1 remains the release most OEM contracts reference, with v4.0 (published in 2023) in active adoption. Most major European OEMs, including Volkswagen Group, BMW, Mercedes-Benz, and Stellantis, require ASPICE assessments as part of their supplier qualification processes.
The key insight: ASPICE does not tell you what to build. It tells you how well-controlled your process of building things is.
The Two Dimensions of ASPICE
Every ASPICE assessment operates across two dimensions: process areas and capability levels.
Process areas describe categories of engineering work: requirements analysis, architecture design, integration, testing, project management, and so on. Each process area has defined purposes, outcomes, and associated work products. The assessment examines whether those outcomes are being achieved.
Capability levels describe how mature and controlled the execution of a process is, regardless of which process area is being assessed. The six levels (0 through 5) derive directly from the ISO/IEC 15504 measurement framework.
Most OEM contracts target Level 2 or Level 3 for development processes. Level 1 is often the minimum entry requirement for any supplier relationship.
Capability Levels 0 Through 5: What They Mean in Practice
Level 0 — Incomplete. The process is not performed or fails to achieve its purpose. No work products. No outcomes. This is the baseline from which everything else is measured.
Level 1 — Performed. The process achieves its purpose. Work products exist. Someone is doing requirements analysis, someone is doing design. There may be no documented process, no defined approach, no consistency across teams or projects, but the activity is happening and producing outputs. At Level 1, assessors look for evidence that outcomes are real: a requirements document exists, design artifacts exist, testing happened.
Level 2 — Managed. The process is planned, tracked, and adjusted. The key word is managed: someone is responsible for the process, inputs and outputs are controlled, work products are reviewed and version-controlled, and the process is repeated consistently. At Level 2, assessors expect to see review records, version histories, defined roles, and evidence that deviations from the plan were tracked and resolved.
Level 3 — Established. The process is defined — meaning there is a documented, organization-wide standard process that projects are required to follow. Tailoring is permitted but must be justified. Level 3 is where process consistency across projects and teams becomes the standard of evidence. Assessors look for a process asset library, tailoring guidelines, and evidence that the defined process was actually used on the assessed project — not just that a process document exists somewhere.
Level 4 — Predictable. The process is measured quantitatively. Organizations at Level 4 collect defined metrics on process performance, set targets, and can demonstrate that the process operates within predictable limits. This level is genuinely difficult to achieve and maintain. Few suppliers operate at Level 4 across their full development portfolio.
Level 5 — Optimizing. Quantitative data is used to drive continuous improvement. Root cause analysis feeds back into process changes. Level 5 represents a closed-loop improvement system. Almost no Tier 1 or Tier 2 supplier operates here at scale; it is the theoretical ceiling of the framework.
For practical purposes, the gap most suppliers are trying to close is Level 1 to Level 2 or Level 2 to Level 3. Those transitions have the most direct impact on OEM qualification and the most tractable engineering actions.
Process Areas Most Relevant to Hardware and Systems Engineers
ASPICE organizes its process areas into several groups. The two groups hardware and systems engineers interact with most directly are the System Engineering (SYS) processes and the Software Engineering (SWE) processes. Here are the ones that generate the most assessment activity.
SYS.2 — System Requirements Analysis
Purpose: Define and maintain system requirements that are unambiguous, verifiable, consistent, and traceable to stakeholder needs.
SYS.2 is where the engineering decomposition chain begins. The process requires that system-level requirements be derived from stakeholder requirements (SYS.1), allocated to system elements, and maintained in a controlled state with version history. Every system requirement must be traceable upward to a stakeholder need and downward to software or hardware requirements that implement it.
At Level 1, assessors look for evidence that system requirements exist and were used. At Level 2, they expect to see requirements baselined in a version-controlled repository, with review records showing that requirements were inspected before use. At Level 3, the requirements process must follow an organization-defined standard, not just be performed ad hoc.
The most common deficiency in SYS.2 assessments: requirements exist in documents but lack formal traceability. A spreadsheet of requirements with no upstream or downstream links will not satisfy Level 2 traceability obligations.
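To make the distinction concrete, here is a minimal sketch of the record shape that supports formal traceability. The identifiers and field names are invented for illustration; ASPICE prescribes no schema, only that the links exist and are controlled.

```python
from dataclasses import dataclass, field

# Illustrative only: identifiers and field names are invented, not
# ASPICE-defined attributes. The point is that each requirement carries
# explicit, machine-checkable links up and down the decomposition chain.

@dataclass
class Requirement:
    req_id: str
    text: str
    version: str
    traces_up: list[str] = field(default_factory=list)    # e.g. stakeholder needs
    traces_down: list[str] = field(default_factory=list)  # e.g. SW/HW requirements

sys_req = Requirement(
    req_id="SYS-042",
    text="The ECU shall enter a safe state within 50 ms of a detected sensor fault.",
    version="2.1",
    traces_up=["STK-007"],               # the stakeholder need it satisfies
    traces_down=["SWR-118", "HWR-031"],  # the SW and HW requirements that implement it
)

# A flat spreadsheet row has neither of these lists, which is exactly why it
# cannot answer an assessor's "show me the chain" question.
print(sys_req.traces_up, sys_req.traces_down)
```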
SWE.1 — Software Requirements Analysis
Purpose: Transform system requirements allocated to software into a set of software requirements that are consistent, verifiable, and traceable.
SWE.1 is the software equivalent of SYS.2, operating one level down in the decomposition hierarchy. Software requirements must trace to system requirements. The SWE.1 process also requires that software requirements be reviewed, allocated to software components where appropriate, and maintained through change.
The traceability obligation here is bidirectional: every software requirement should point to the system requirement that generates it, and every system requirement allocated to software should have at least one software requirement that implements it. Gaps in either direction are findings in an ASPICE assessment.
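A minimal sketch of that two-direction check, using invented IDs (in practice the link data would come from an export out of your requirements tool):

```python
# Hypothetical link data; all identifiers are invented for illustration.

# System requirements allocated to software -> implementing software requirements
allocated_to_sw = {"SYS-010": ["SWR-101"], "SYS-011": [], "SYS-012": ["SWR-102"]}
# Software requirements -> originating system requirements
sw_traces_up = {"SWR-101": ["SYS-010"], "SWR-102": ["SYS-012"], "SWR-103": []}

# Direction 1: every allocated system requirement needs at least one
# software requirement that implements it.
orphan_sys = [r for r, impls in allocated_to_sw.items() if not impls]

# Direction 2: every software requirement needs an upstream system requirement.
orphan_sw = [r for r, ups in sw_traces_up.items() if not ups]

print("System requirements with no implementing SW requirement:", orphan_sys)
print("Software requirements with no upstream system requirement:", orphan_sw)
# Both lists would surface as findings in an ASPICE assessment.
```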
SYS.4 and SYS.5 — System Integration and Testing
SYS.4 (System Integration and Integration Test) and SYS.5 (System Qualification Test) require that integration and qualification test cases trace to system requirements, and that test results demonstrate coverage. Assessors are not just looking for test results; they are looking for evidence that the tests were designed to cover requirements and that gaps in coverage were identified and dispositioned.
The Cross-Cutting Traceability Obligation
Across all of these process areas, the underlying theme is bidirectional requirements traceability: the ability to trace from any work product up to the stakeholder need that motivates it, and down to the implementation artifact that satisfies it. ASPICE makes this explicit in its Level 2 work product management practices, notably generic practice GP 2.2.3, which requires that work products be identified, documented, and controlled, and that their dependencies on other work products be established.
In practice, this means that by the time an assessor arrives, you need to be able to show: this stakeholder need generated these system requirements; these system requirements were allocated to these software and hardware elements; these software requirements trace to those system requirements; these test cases cover those software requirements; and these test results prove the tests were executed and passed.
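As an illustration, the sketch below stores those links as plain parent-to-child data and prints the chain the way an assessor would walk it. Every identifier is hypothetical.

```python
# Invented identifiers throughout; the structure, not the data, is the point.
links = {
    "STK-007": ["SYS-042"],       # stakeholder need -> system requirements
    "SYS-042": ["SWR-118"],       # system requirement -> software requirements
    "SWR-118": ["TC-220"],        # software requirement -> test cases
    "TC-220":  ["RUN-220-pass"],  # test case -> executed test result
}

def evidence_chain(node: str, depth: int = 0) -> None:
    """Print the downstream chain an assessor would follow link by link."""
    print("  " * depth + node)
    for child in links.get(node, []):
        evidence_chain(child, depth + 1)

evidence_chain("STK-007")
# STK-007
#   SYS-042
#     SWR-118
#       TC-220
#         RUN-220-pass
```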
What ASPICE Assessors Actually Inspect
ASPICE assessments are conducted by certified assessors who follow a defined interview, review, and sampling protocol. Understanding what they inspect removes most of the ambiguity about what you need to produce.
Work products. Each process area has associated work products — documents, models, records — that serve as objective evidence. For SYS.2, the primary work product is the System Requirements Specification. For SWE.1, it is the Software Requirements Specification. Assessors will review these directly, not just ask whether they exist.
Version control evidence. At Level 2 and above, work products must be under change control. Assessors look for version histories, change logs, and evidence that the current version is the authoritative one. A requirements document with no version number or history is a Level 1 artifact at best.
Review records. Requirements must be reviewed before use. Assessors ask for review meeting minutes, comment logs, or tool-generated review records. Verbal statements that “we reviewed it” are not objective evidence.
Traceability reports. Assessors will request traceability matrices or equivalent reports showing coverage in both directions. They will spot-check individual requirements to verify that links are real and not placeholder entries.
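A sketch of what such a report can look like, with invented link data: it emits a simple coverage matrix and flags the placeholder entries assessors spot-check for.

```python
import csv
import sys

# Hypothetical export of (software requirement -> linked test cases).
coverage = {
    "SWR-118": ["TC-220", "TC-221"],
    "SWR-119": ["TBD"],  # placeholder link: a finding, not coverage
    "SWR-120": [],       # no link at all: a coverage gap
}

PLACEHOLDERS = {"TBD", "N/A", ""}

writer = csv.writer(sys.stdout)
writer.writerow(["requirement", "test_cases", "status"])
for req, tests in sorted(coverage.items()):
    real = [t for t in tests if t not in PLACEHOLDERS]
    status = "covered" if real else ("placeholder" if tests else "uncovered")
    writer.writerow([req, ";".join(tests), status])
```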
Consistency and testability. Requirements are inspected for quality attributes: are they unambiguous, verifiable, and consistent with each other? Requirements written as narrative prose with no measurable acceptance criteria are a common finding.
How Modern Tooling Changes the ASPICE Preparation Equation
The traditional approach to ASPICE preparation involves extracting requirements from documents, manually building traceability matrices in spreadsheets, and assembling review records from email threads before an assessment. Teams typically spend two to four weeks on this activity alone — not to improve their process, but to reconstruct evidence that should have been produced continuously.
This is where the structure of the tooling matters as much as the content.
Tools designed around graph-based requirements models — where requirements are nodes, and relationships between requirements are first-class edges — produce traceability natively. Every time an engineer links a software requirement to a system requirement, that link is stored, versioned, and queryable. When an assessor requests a traceability report, it is generated from live data, not assembled from scratch.
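The sketch below illustrates the general shape of that idea, with links modeled as first-class, timestamped edges that a report queries directly. It is a toy model under assumed names, not any particular tool's data model or API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Toy illustration of links as first-class, versioned edges. All names are
# assumptions made for this sketch.

@dataclass(frozen=True)
class TraceLink:
    source: str           # e.g. "SWR-118"
    target: str           # e.g. "SYS-042"
    link_type: str        # e.g. "implements"
    created_at: datetime  # edges carry their own history

edges: list[TraceLink] = []

def link(source: str, target: str, link_type: str) -> None:
    """Record a trace link as a queryable edge, stamped at creation time."""
    edges.append(TraceLink(source, target, link_type, datetime.now(timezone.utc)))

link("SWR-118", "SYS-042", "implements")
link("TC-220", "SWR-118", "verifies")

# A traceability report is a query over live edges, not a hand-built document.
for e in edges:
    print(f"{e.source} --{e.link_type}--> {e.target} (created {e.created_at:%Y-%m-%d})")
```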
Flow Engineering (flowengineering.com) is built on exactly this model. It treats requirements, system elements, and their relationships as a connected graph rather than a hierarchy of documents. A system requirement in Flow Engineering has explicit upstream links to stakeholder needs and explicit downstream links to the software and hardware requirements that implement it. Those links are maintained continuously as requirements change, not patched together before an audit.
For ASPICE purposes, this architecture directly addresses the most common Level 2 and Level 3 findings. The traceability chain that SYS.2 and SWE.1 require — and that assessors will follow link by link — exists in the tool as a structural property, not as a document that was hand-assembled. Review records and version histories are part of the platform’s native workflow, not add-ons.
Flow Engineering’s focus is on the requirements and systems engineering layer — the part of ASPICE where most supplier findings occur. It is not a full ASPICE lifecycle tool covering project management, quality assurance, and configuration management processes. Teams using it still need process discipline in those adjacent areas. That focus is deliberate: the requirements layer is where AI-assisted analysis, automated consistency checking, and graph-based traceability provide the highest leverage for engineering teams.
Where ASPICE assessors spend the most time — reviewing requirements quality, tracing links across levels, and checking that change history is intact — Flow Engineering produces evidence continuously rather than requiring a pre-assessment sprint to reconstruct it.
Practical Starting Points
If your team is approaching an ASPICE assessment for the first time, or trying to move from Level 1 to Level 2, three actions have the highest return:
Establish a single authoritative requirements repository. Documents in shared drives do not support Level 2 traceability. Move requirements into a tool that can version them, link them, and report on coverage. The tool does not have to be sophisticated; it has to be consistent.
Build traceability incrementally, not retroactively. The most expensive ASPICE preparation mistake is treating traceability as a documentation activity done before the assessment. Traceability built into the engineering workflow — created when requirements are authored, updated when they change — is both cheaper and more defensible.
Instrument your review process. Assessors look for objective evidence of review. Whether you use a formal tool, a structured comment log, or meeting minutes, the record needs to exist. Define your review process before the project starts and follow it consistently.
ASPICE is a framework for building better-controlled engineering processes. The capability levels are not arbitrary bureaucratic hurdles — they describe a coherent progression from ad hoc activity to measurable, improving discipline. Teams that understand what each level actually requires, and build tooling and process around continuous evidence production, find that ASPICE assessments confirm good practice rather than interrupt engineering work.