Automotive SPICE® (ASPICE) is a process assessment and improvement model for automotive software and systems engineering. It is maintained within the VDA QMC framework and is based on the ISO/IEC 330xx series for process assessment. ASPICE defines the outcomes that processes are expected to achieve and the indicators and evidence used to assess whether those outcomes have actually been achieved.
ASPICE is lifecycle-agnostic. Agile, V-model, and hybrid organisations can all apply it, provided they can consistently demonstrate the required outcomes and evidence. ASPICE does not replace functional safety standards such as ISO 26262. Instead, it supports them by improving the discipline, consistency, and auditability of the processes used to produce safety-relevant work products.
This overview is relevant to two main audiences. For leaders and programme owners, ASPICE provides a clear link to business outcomes such as predictability, audit readiness, and supplier alignment. For delivery leads and engineering teams, it provides practical guidance on reviews, approvals, baselines, gates, traceability, and evidence management within an existing lifecycle.
What ASPICE Covers
ASPICE addresses the core engineering and management activities needed in modern automotive development and connects them into a coherent whole.
The work starts with requirements management. Requirements should be necessary, testable, and understood by all relevant parties. In mature practice, they have stable identifiers, clear sources, defined rationales, and unambiguous acceptance criteria. When requirements change, those changes are controlled and traceable.
System and software architecture translate intent into technical structure. Interfaces are defined, responsibilities are allocated, and key design decisions are documented together with their rationale. Architecture is not just a static model. It acts as a working reference that allows teams to collaborate consistently while taking account of safety, cybersecurity, and performance concerns.
Implementation converts design into working code and configuration. ASPICE focuses less on coding style itself and more on engineering control. Coding standards are defined, reviews are carried out, builds are reproducible, and verification evidence such as static analysis, unit tests, and coverage is linked to the items being verified.
Integration is expected to be planned and controlled. Teams integrate against named baselines, ensure that environments and test assets are ready when needed, and record results in a way that makes regressions explainable rather than speculative.
Verification and testing provide the factual basis for release decisions. Requirements should be verified at the appropriate level, test results should be linked to the exact build under test, and deviations should enter a formal problem-resolution process rather than being informally accepted.
Configuration, Change, and Problem Resolution
ASPICE places strong emphasis on control and reproducibility.
Configuration management ensures that work products are uniquely identified, baselines are established at the right points, and approved deliveries can be recreated exactly. This is essential for auditability and for stable engineering collaboration.
Change management ensures that proposed changes are analysed, their impact is understood, and approvals are obtained before implementation. This reduces uncontrolled variation and improves confidence in release content.
Problem resolution management takes issues from detection through triage, root-cause analysis, corrective action, and closure. The objective is not only to fix individual defects, but also to prevent recurrence.
Quality assurance provides an independent view of whether agreed processes are being followed and whether the underlying evidence is reliable.
Project management supports planning, monitoring, risk handling, and go/no-go decisions based on defined criteria rather than opinion. In many organisations, this extends to supplier monitoring and release governance.
Why Evidence Matters
A central strength of ASPICE is that it defines not only expected outcomes, but also the evidence used to demonstrate them. This makes improvement tangible.
A requirement with no test link, a build that cannot be reproduced from a baseline, an approval that refers only to “latest” rather than a specific revision: these are not abstract weaknesses. They are concrete gaps that can be identified and corrected.
Addressing such gaps usually improves predictability quickly. Teams spend less time firefighting, integration becomes more stable, and planning improves because uncertainty is reduced earlier.
Capability Levels
ASPICE capability is assessed per process, not as a single organisation-wide label. The capability dimension uses six levels, from Level 0 to Level 5, in line with the ISO/IEC 330xx assessment framework.
- Capability Level 0 – Incomplete: the process does not achieve its intended purpose.
- Capability Level 1 – Performed: the process achieves its intended outcomes, and there is evidence that the work has been carried out.
- Capability Level 2 – Managed: the process is planned, monitored, and controlled, and work products are managed appropriately.
- Capability Level 3 – Defined: a defined process is deployed consistently across the project or organisation.
- Capability Level 4 – Predictable: the process operates within defined limits and is brought under quantitative control.
- Capability Level 5 – Innovating: the process is continually improved using quantitative understanding and innovation.
Assessors rate the achievement of process attributes using the standard scale of Not Achieved, Partly Achieved, Largely Achieved, and Fully Achieved. To claim a capability level, the attributes at that level must be at least largely achieved, and the attributes at all lower levels must be fully achieved. Higher levels represent repeatability and control, not perfection. In practice, many organisations achieve the highest return by progressing reliably through Levels 1, 2, and 3 before pursuing quantitative control more broadly.
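The rating rule above is mechanical enough to sketch in code. The following is an illustrative sketch only, assuming the ISO/IEC 33020 convention that a level is claimed when its attributes are at least Largely achieved and all lower-level attributes are Fully achieved; the attribute ratings passed in are simplified placeholders, not official PA identifiers.

```python
# Illustrative sketch of the capability-level claiming rule.
# Ratings use the N < P < L < F ordinal scale.
RANK = {"N": 0, "P": 1, "L": 2, "F": 3}

def claimed_level(ratings: dict[int, list[str]]) -> int:
    """ratings maps capability level -> list of attribute ratings.

    A level is claimed when its own attributes are at least Largely
    achieved and every lower-level attribute is Fully achieved.
    """
    achieved = 0
    for level in sorted(ratings):
        current_ok = all(RANK[r] >= RANK["L"] for r in ratings[level])
        lower_ok = all(
            r == "F"
            for lv in ratings if lv < level
            for r in ratings[lv]
        )
        if current_ok and lower_ok:
            achieved = level
        else:
            break
    return achieved

# Level-1 attribute Fully achieved, both level-2 attributes Largely
# achieved, level-3 attributes falling short -> Level 2 is claimed.
print(claimed_level({1: ["F"], 2: ["L", "L"], 3: ["P", "N"]}))  # 2
```

Note the asymmetry the sketch makes visible: Largely achieved is enough at the level being claimed, but not at the levels beneath it.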
Traceability and Consistency
Two concepts are especially important in ASPICE: traceability and consistency.
Traceability connects the lifecycle end to end. A requirement enters the lifecycle with a defined identity, source, and rationale. It is then realised in architecture and detailed design, implemented in code or configuration, and verified through test evidence. Related changes, issues, and decisions remain linked to the same thread.
This makes coverage demonstrable and impact analysis practical. When something changes, teams can identify which design elements, code modules, tests, and documents are affected and verify those effects before approval.
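In tooling terms, a trace thread is a small directed graph, and impact analysis is a traversal of it. A minimal sketch, with invented identifiers (REQ-12, ARCH-3, and so on) standing in for whatever a real requirements or ALM tool would hold:

```python
# Illustrative sketch: trace links as a directed graph, where each
# item points at the downstream items that realise or verify it.
# All identifiers are invented for the example.
links = {
    "REQ-12": ["ARCH-3"],           # requirement -> architecture element
    "ARCH-3": ["MOD-7"],            # architecture -> code module
    "MOD-7":  ["TC-41", "TC-42"],   # code module -> test cases
}

def impacted(item: str) -> set[str]:
    """All items reachable downstream of a changed item."""
    result: set[str] = set()
    stack = [item]
    while stack:
        for nxt in links.get(stack.pop(), []):
            if nxt not in result:
                result.add(nxt)
                stack.append(nxt)
    return result

# Changing REQ-12 touches the whole thread beneath it.
print(sorted(impacted("REQ-12")))  # ['ARCH-3', 'MOD-7', 'TC-41', 'TC-42']
```

The same traversal run in reverse (tests back to requirements) is what makes coverage demonstrable: any requirement from which no test case is reachable is an identifiable gap.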
Consistency means that all work products describe the same product state. Requirements, design, code, interfaces, tests, and release information should align with each other. When they do not, contradictions appear later as integration problems, rework, delays, or safety and reliability concerns.
Supplier Alignment
ASPICE is also valuable in supplier management when expectations are defined operationally from the start.
If the required evidence, reporting rhythm, review points, and acceptance criteria are agreed early, supplier integration becomes planned work rather than late discovery. Reviews can then be conducted against explicit expectations, and deviations can be addressed in a constructive but controlled way.
At minimum, supplier deliveries should provide coherent trace links to the tested build, test results that match the declared content, a controlled change history with approvals, and documentation that accurately reflects the delivered product.
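Several of these minimum expectations can be checked automatically at intake. The sketch below assumes a simple delivery manifest with `build_id`, `scope`, `trace`, and `results` fields; this structure is an assumption for illustration, not a standard exchange format.

```python
# Illustrative sketch: minimal automated checks on a supplier delivery
# manifest. The manifest structure is an assumption for the example.
def check_delivery(manifest: dict) -> list[str]:
    findings = []
    build = manifest["build_id"]
    # Test results must reference the exact build under test,
    # never an earlier or "latest" build.
    for result in manifest["results"]:
        if result["build_id"] != build:
            findings.append(
                f"result {result['id']} ran against "
                f"{result['build_id']}, not {build}"
            )
    # Every requirement in the declared scope needs a trace link
    # to test evidence.
    for req in manifest["scope"]:
        if req not in manifest["trace"]:
            findings.append(f"{req} has no trace link to test evidence")
    return findings

delivery = {
    "build_id": "B-104",
    "scope": ["REQ-1", "REQ-2"],
    "trace": {"REQ-1": ["TC-9"]},
    "results": [{"id": "TC-9", "build_id": "B-103"}],
}
for finding in check_delivery(delivery):
    print(finding)
```

Running the example surfaces both gaps named in the text: a result tied to the wrong build, and a requirement with no trace link. Checks like these turn acceptance criteria from prose into repeatable intake gates.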
Useful Metrics
ASPICE does not require large numbers of metrics, but a small and focused set can improve predictability.
Examples include:
- Traceability coverage, to show whether evidence is keeping pace with change
- Gate first-pass rate, to indicate whether entry and exit criteria are realistic and whether teams are preparing in time
- Change turnaround time, to reveal delays in analysis or approvals
- Defect leakage, to highlight weak gates or rushed integration between lifecycle stages
These metrics are useful when they support process improvement, not when they are collected for their own sake.
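Two of the metrics above reduce to simple ratios over data most lifecycle tools already hold. A minimal sketch, using invented sample counts:

```python
# Illustrative sketch: computing two of the metrics above from raw
# counts. The figures are invented sample data.
def traceability_coverage(linked: int, total: int) -> float:
    """Share of requirements with a complete trace to test evidence."""
    return linked / total if total else 1.0

def defect_leakage(found_downstream: int, found_total: int) -> float:
    """Share of defects found later than the stage meant to catch them."""
    return found_downstream / found_total if found_total else 0.0

print(f"coverage: {traceability_coverage(178, 200):.0%}")  # coverage: 89%
print(f"leakage:  {defect_leakage(12, 96):.1%}")           # leakage:  12.5%
```

The value lies in the trend, not the snapshot: coverage that lags behind change volume, or leakage that rises after a gate is relaxed, points at a specific process to improve.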
A Practical Adoption Approach
A practical way to adopt ASPICE is to start by mapping current practices against the core processes. From there, select a small number of high-impact gaps, define clear entry and exit criteria, establish baseline points, and ensure that lifecycle tools support stable identifiers and trace links.
These controls can then be trialled on the next delivery. The results should be reviewed, the main impediments addressed, and the approach extended gradually to other processes.
Used in this way, ASPICE becomes a set of disciplined engineering habits adapted to the organisation’s context, rather than a parallel system that replaces how teams already work.
Conclusion
ASPICE is an evidence-driven framework for improving the development of automotive software and systems. It defines what processes are expected to achieve and how those outcomes are demonstrated. In doing so, it supports more predictable delivery, stronger compliance evidence, and better coordination across internal teams and suppliers.
Its practical value lies in reducing avoidable surprises: fewer late changes, smoother integrations, clearer approvals, and releases that can withstand technical and audit scrutiny. For organisations looking for a structured starting point, a focused readiness scan and lightweight self-assessment can help identify the first meaningful gaps and the evidence needed to close them.