DO-178C is the standard that governs software development for certified aircraft systems in most of the world. If you’re building avionics software — flight management, flight control, navigation, communication systems, or any other software whose failure could affect aircraft safety — DO-178C is the regulatory framework your software development process must satisfy.

This guide covers what DO-178C actually requires, how it drives requirements and verification practices, and what the emergence of AI in avionics means for a standard built on deterministic assumptions.

What DO-178C Is

DO-178C — officially “Software Considerations in Airborne Systems and Equipment Certification” — is a document published by RTCA that defines software development and verification objectives for airborne software. It’s accepted by the FAA as a means of compliance for software aspects of airworthiness certification, and by EASA (and most other civil aviation authorities) via its identical EUROCAE counterpart, ED-12C.

It’s not a standard that certifies your software. It defines objectives your development process must meet, with evidence, as part of the overall aircraft type certification process. Demonstrating compliance with DO-178C is how you show the certifying authority that your software was developed and verified with appropriate rigor for its safety implications.

Software Levels (DAL)

The most important concept in DO-178C is the Development Assurance Level (DAL), also called the Software Level. It ranges from A through E and is determined by failure condition analysis — specifically, what’s the worst-case effect of a software failure on aircraft safety and occupants?

DAL A — Catastrophic. Software failure could cause loss of aircraft or multiple fatalities. Full suite of DO-178C objectives, including Modified Condition/Decision Coverage (MC/DC) testing. Applies to flight control computers, engine controllers, and similar systems.

DAL B — Hazardous/Severe. Software failure could cause serious injury or significant capability reduction. Extensive objectives including decision coverage testing. Applies to many navigation and communication systems.

DAL C — Major. Software failure would reduce aircraft capability or increase crew workload significantly. Statement coverage testing required. Applies to many avionics display and management systems.

DAL D — Minor. Software failure has minor impact. Basic objectives only. Applies to passenger entertainment systems and similar.

DAL E — No safety effect. No DO-178C objectives. Very few avionics systems qualify.

The level determines which objectives apply and what evidence must be produced. DAL A software has 71 objectives, many of them requiring independent verification activities and specific test coverage metrics.

Requirements and Traceability in DO-178C

DO-178C makes requirements quality and traceability explicit objectives, not just good practice:

High-Level Requirements (HLR) are derived from the system requirements allocated to software. DO-178C requires them to be verifiable, consistent, accurate, and unambiguous. Compliance requires producing and reviewing HLRs and demonstrating they satisfy the system requirements.

Low-Level Requirements (LLR) are derived from HLRs and are the specifications from which code is written. They must be complete enough that code can be produced and reviewed against them without additional design information.

Bidirectional traceability is an explicit DO-178C objective. You must demonstrate:

  • Every system requirement allocated to software traces to at least one HLR
  • Every HLR traces to at least one LLR
  • Every LLR traces to source code
  • Every HLR traces to at least one test case
  • Every LLR traces to at least one test case

Traceability gaps — requirements with no downstream trace, or test cases with no upstream requirement — are findings in a DER (Designated Engineering Representative) review. They indicate either requirements that haven’t been implemented or tests that don’t correspond to requirements.

This is why requirements management tooling matters in avionics. Manual traceability maintenance in spreadsheets at DAL A rigor is extremely difficult to sustain and audit. Tools that make traceability structural — where coverage gaps are surfaced automatically — substantially reduce certification audit risk.
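The gap detection described above is mechanical once the trace links are structured data. The sketch below is illustrative only — the requirement IDs and the dictionary-based link model are invented for this example, not any particular tool’s format:

```python
# Illustrative bidirectional traceability gap check.
# The IDs and link structure are hypothetical, not a real tool's data model.

sys_reqs = {"SYS-10", "SYS-11", "SYS-12"}
hlrs = {"HLR-1", "HLR-2"}
llrs = {"LLR-1", "LLR-2"}

# Trace links: downstream artifact -> set of upstream items it traces to
hlr_to_sys = {"HLR-1": {"SYS-10"}, "HLR-2": {"SYS-11"}}
llr_to_hlr = {"LLR-1": {"HLR-1"}, "LLR-2": {"HLR-1"}}
test_to_req = {"TC-1": {"HLR-1", "LLR-1"}, "TC-2": {"LLR-2"}}

def untraced(upstream, links):
    """Return upstream items with no downstream artifact tracing to them."""
    covered = set().union(*links.values()) if links else set()
    return sorted(upstream - covered)

print(untraced(sys_reqs, hlr_to_sys))      # ['SYS-12']: system req with no HLR
print(untraced(hlrs, llr_to_hlr))          # ['HLR-2']: HLR with no LLR
print(untraced(hlrs | llrs, test_to_req))  # ['HLR-2']: requirement with no test
```

Each of the three printed lists is a finding in the sense of the DER review above: a requirement nothing downstream traces to. The same function run in the other direction (tests as upstream, trace links as downstream) surfaces orphan tests.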

Verification Objectives by DAL

DO-178C’s verification objectives scale with DAL:

Reviews and analysis — required at all levels. Requirements reviews, code reviews, traceability analysis, accuracy and consistency checks.

Testing — required at all levels but coverage criteria scale with DAL:

  • DAL C: Statement coverage (every statement executed at least once)
  • DAL B: Decision coverage (every branch in control flow exercised)
  • DAL A: MC/DC (Modified Condition/Decision Coverage — each condition in a decision independently affects the outcome)
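The difference between the criteria is easiest to see on a concrete decision. The guard below is hypothetical; what matters is the test-vector arithmetic. For a decision with N conditions, MC/DC typically needs N+1 vectors, each pair differing in exactly one condition:

```python
# Illustrative MC/DC demonstration for the decision (a and b) or c.
# The decision itself is invented; the coverage reasoning is the point.

def decision(a, b, c):
    return (a and b) or c

# Statement coverage: any single call executes the statement.
# Decision coverage: needs both outcomes, e.g. (T,T,F) -> True and
# (F,F,F) -> False.
# MC/DC: each condition must independently flip the outcome while the
# others are held fixed. One minimal set of N+1 = 4 vectors:
mcdc_vectors = [
    (True,  True,  False),  # outcome True
    (False, True,  False),  # differs from vector 0 only in 'a' -> False
    (True,  False, False),  # differs from vector 0 only in 'b' -> False
    (True,  False, True),   # differs from vector 2 only in 'c' -> True
]

# Verify the independence pairs mechanically:
assert decision(*mcdc_vectors[0]) != decision(*mcdc_vectors[1])  # 'a' flips it
assert decision(*mcdc_vectors[0]) != decision(*mcdc_vectors[2])  # 'b' flips it
assert decision(*mcdc_vectors[2]) != decision(*mcdc_vectors[3])  # 'c' flips it
print("MC/DC independence demonstrated for all 3 conditions")
```

Exhaustively testing all 2^N combinations also achieves MC/DC, but the cost grows exponentially; the N+1 construction is why MC/DC is tractable even for decisions with many conditions.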

Independence — at DAL A and B, many verification objectives must be satisfied with independence, meaning the verification activity is conducted by people or organizations separate from those who developed the software being verified.

Tool qualification — tools used to produce or verify software that could contribute to airborne software errors must be qualified under DO-330.

DO-278A: The Ground Systems Equivalent

DO-278A applies the same framework to ground-based systems — air traffic management software, CNS/ATM infrastructure, navigation databases. If you’re building software that goes into ATC systems, airport navigation infrastructure, or flight operations systems, DO-278A is the relevant standard.

The structure is similar to DO-178C: assurance levels determined by failure condition analysis, requirements and traceability objectives, verification coverage requirements. The assurance levels are labeled AL1 through AL6, with AL1 the most stringent — the numeric counterpart of DAL A.

AI and Machine Learning in Certified Avionics

DO-178C was designed for deterministic software. Its verification assumptions — that you can enumerate requirements, write tests that verify them, and achieve coverage metrics that give you confidence in correctness — don’t apply to machine learning components that learn behaviors from data and can produce outputs outside their training distribution.

This is a recognized gap. EASA’s AI roadmap, published in phases since 2020, addresses machine learning in certified avionics with a conceptual framework for:

  • Learning assurance — assurance that the ML development process produced a model with appropriate performance
  • Operational domain — formal specification of conditions within which the system is designed to operate
  • Accuracy and robustness — quantitative performance objectives and robustness to distributional shift
  • Explainability — ability to understand why the system behaved as it did
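One of these concepts, the operational domain, lends itself to a runtime sketch: a monitor that gates the ML output on whether the input lies inside the envelope the system was assured for. The monitored variables and bounds below are invented for illustration; a real operational domain comes from the system’s assured design envelope:

```python
# Illustrative operational-domain monitor for an ML component.
# Variable names and bounds are hypothetical examples.

OPERATIONAL_DOMAIN = {
    "airspeed_kts":  (80.0, 350.0),
    "altitude_ft":   (0.0, 41000.0),
    "temperature_c": (-60.0, 50.0),
}

def in_domain(sample):
    """True if every monitored input lies inside its assured range."""
    return all(
        lo <= sample[name] <= hi
        for name, (lo, hi) in OPERATIONAL_DOMAIN.items()
    )

nominal = {"airspeed_kts": 250.0, "altitude_ft": 35000.0, "temperature_c": -40.0}
excursion = {"airspeed_kts": 400.0, "altitude_ft": 35000.0, "temperature_c": -40.0}

# The ML output is used only when the input is inside the domain;
# outside it, the system falls back to a conventionally assured path.
print(in_domain(nominal))    # True
print(in_domain(excursion))  # False
```

Range checks like this are the simplest form of domain monitoring; distributional-shift detection is a harder, statistical version of the same idea.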

Supplements to DO-178C for ML-based components are in development through RTCA and EUROCAE working groups. The framework is converging on requirements that parallel classical DO-178C structure but with probabilistic performance specifications, dataset assurance objectives, and operational monitoring requirements substituting for the deterministic test coverage requirements.

For programs currently developing avionics products that include ML components, the practical approach is to engage with the certifying authority early — the FAA (typically through a DER or ODA) or EASA — to agree on a means of compliance that addresses the ML-specific assurance gaps. Waiting for standards to finalize before beginning that conversation extends timelines.

Practical DO-178C Tooling

The administrative burden of DO-178C compliance is substantial. Producing and maintaining the evidence artifacts — requirements documents, traceability matrices, review records, test coverage reports, configuration management records — at DAL A rigor requires disciplined tooling.

Requirements tools used in DO-178C programs need to support:

  • Bidirectional traceability with automatic gap detection
  • Requirement version control with change history
  • Review and approval workflows with records
  • Export of traceability matrices for DER review

Test management tools need to support coverage reporting at the applicable coverage criterion and linkage from test results back to requirements.

Configuration management must ensure that the software delivered to the aircraft and the artifacts that demonstrate its compliance are from the same baseline — a significant discipline requirement that many teams underestimate.
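That baseline-consistency discipline can be checked mechanically: hash every delivered artifact and compare it against the manifest recorded alongside the compliance evidence. The manifest format and file names below are invented for illustration; real programs use their CM tool’s records:

```python
# Illustrative baseline check: verify that delivered artifacts match the
# hashes recorded in the certification baseline manifest.
# File names and manifest format are hypothetical.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Manifest recorded when the compliance evidence was produced:
baseline = {
    "fms_app.bin": sha256_of(b"release-build-1.4.2"),
    "config.tbl":  sha256_of(b"nav-db-2024-07"),
}

# Artifacts actually being delivered to the aircraft:
delivered = {
    "fms_app.bin": b"release-build-1.4.2",
    "config.tbl":  b"nav-db-2024-08",   # drifted from the baseline
}

mismatches = [
    name for name, data in delivered.items()
    if sha256_of(data) != baseline.get(name)
]
print(mismatches)  # ['config.tbl']
```

A non-empty mismatch list means the delivered software and the evidence describing it are no longer the same baseline — exactly the failure mode the paragraph above warns about.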

DO-178C compliance isn’t primarily a software problem — it’s an engineering process problem. Teams that treat it as only a documentation exercise at the end of development discover the hard way that certification evidence needs to be built throughout the development process, not assembled at the end.