Formal program verification in avionics certification
Five years after the official adoption of the new DO-178C/ED-12C standard and its supplements, including the DO-333/ED-216 supplement on formal methods, no avionics-certification project has yet acknowledged using this new supplement. However, formal-method technologies do exist that would ease the development of avionics software.
A major roadblock preventing the adoption of formal program verification for avionics certification is the absence of a general consensus on how to apply DO-333/ED-216, despite significant dissemination efforts from the committee that developed it. There now exists a detailed process for the use of SPARK to satisfy objectives of DO-333/ED-216 as a replacement for certain forms of testing, with a focus on checking that the source code is consistent, accurate, and complies with low-level requirements.
The process addresses the alternative objectives for coverage when using formal methods and the objective of property preservation between source code and executable object code. The former is required when some testing is replaced by the use of formal methods. The latter is required in order to benefit from source code verification of executable object code. This process has been discussed with both the Federal Aviation Administration (FAA) and European Aviation Safety Agency (EASA) for future applicants using SPARK in DO-178C/ED-12C.
Formal methods in avionics
Although the addition of a formal methods supplement in version C of DO-178 is relatively recent (2012), the use of formal methods to develop avionics software dates back at least to the 1990s, when John Rushby wrote a thorough guidance document about their use for the FAA. [“Formal Methods and the Certification of Critical Systems,” Rushby, FAA, 1993.] While Rushby focused on deductive methods, increases in automation and computing power since then have made two other kinds of formal methods attractive for developing avionics software: model checking and abstract interpretation. DO-333 specifically addresses the use of these three categories of formal methods for developing avionics software. Examples of the use of all three categories are presented in a 2014 NASA report. [“DO-333 Certification Case Studies,” Cofer and Miller, Rockwell Collins, 2014.]
While abstract interpretation and model checking are well suited to checking simple program properties across a codebase with minimal human intervention, they suffer from the so-called state explosion problem, which occurs when the model being analyzed (whether supplied explicitly in model checking or constructed by the tool in abstract interpretation) is too large for the analysis to complete. Deductive methods do not suffer from this drawback, but at the cost of requiring users to write function contracts. These contracts are (partial) specifications of a function's behavior that serve both as the objective for verification and as a summary of the function's behavior when analyzing calls to it. Because function contracts allow verification to focus on individual functions, one at a time, deductive methods can apply powerful verification techniques that prove non-trivial properties of software.
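As a minimal sketch of what such a contract looks like in SPARK (the function name and bounds here are purely illustrative, not taken from any certified codebase):

```ada
--  Illustrative sketch: a SPARK function contract. The precondition
--  is the obligation imposed on callers; the postcondition is the
--  summary used when analyzing calls to Saturate.
function Saturate (X, Low, High : Integer) return Integer
with
  Pre  => Low <= High,
  Post => Saturate'Result in Low .. High
    and then (if X in Low .. High then Saturate'Result = X);
```

When the SPARK tools analyze a caller of Saturate, they need only the contract, not the body, which is what keeps the analysis modular.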
Two toolsets provide formal program verification based on deductive methods for industrial users of C and Ada: the Frama-C toolset for C programs and the SPARK toolset for Ada programs. Both have been used in the context of DO-178 avionics certification. For example, Lockheed Martin initially used SPARK in 1997 for the control software of the C-130J U.S. military and U.K. Royal Air Force aircraft. BAE Systems has since used SPARK to prove critical properties of the C-130J control software during maintenance. As another example, one which is documented in DO-333, Airbus used Caveat (the predecessor to Frama-C) in 2002 to prove low-level requirements on the Airbus A380 civilian airplane, as a replacement for unit testing.
Verification objectives addressed
SPARK can be used as the primary source of evidence for many verification objectives in DO-333, from low-level requirements (LLR) to source code and executable object code (EOC). Formal verification is a particular case of analysis, hence the guidance needed for applying formal analysis to LLR and source code consists simply of the criteria and conditions for the use of formal analysis. [“Guidance for Using Formal Methods in a Certification Context,” Brown et al., SC-205/WG-71, ERTS 2010.] Applying formal analysis to EOC requires more justification, particularly when it replaces unit tests.
When LLR are expressed as contracts in SPARK, the formal notation guarantees that LLR are precise and unambiguous, so accuracy is guaranteed. Consistency is also guaranteed because contracts on distinct functions cannot conflict. Contracts are also verifiable and conform to a (programming language) standard by design. These cover objectives 2, 4, and 5 of table FM.A-4 from DO-333. (op. cit. Cofer and Miller.)
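For instance, an informal LLR such as “the commanded speed shall never exceed the maximum speed” could be captured unambiguously as a postcondition (all names and values here are hypothetical):

```ada
Max_Speed : constant Integer := 250;

--  Hypothetical LLR rendered as a contract: the speed returned by
--  Limit_Speed never exceeds Max_Speed. The formal notation leaves
--  no room for the ambiguity a textual requirement might carry.
function Limit_Speed (Requested : Integer) return Integer
with
  Post => Limit_Speed'Result <= Max_Speed;
```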
One of the main strengths of SPARK is that it automatically shows that the source code complies with LLR expressed as function contracts. Function contracts can also express data dependencies, and the SPARK toolset can show automatically that the source code complies with this part of the software architecture. SPARK code is verifiable and conforms to a (programming language) standard by design. The source code of a function is implicitly traced to the LLR expressed in the function contract. Finally, SPARK code is unambiguous, so the consistency of the source code can be analyzed automatically to show that it is free from reads of uninitialized data, arithmetic overflows, other runtime errors, and unused computations (variables, statements, etc.). These cover objectives 1 to 6 of table FM.A-5 from DO-333.
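Data dependencies are stated with the SPARK Global and Depends aspects. A minimal sketch, with hypothetical names:

```ada
package Altimeter with SPARK_Mode is

   Altitude : Integer := 0;

   --  Hypothetical example: Global states which global variables the
   --  procedure reads and writes; Depends states how each output is
   --  derived from the inputs. The SPARK tools check the body against
   --  both aspects automatically.
   procedure Update_Altitude (Offset : Integer)
   with
     Global  => (In_Out => Altitude),
     Depends => (Altitude => (Altitude, Offset));

end Altimeter;
```

A body that read or wrote some other global variable, or computed Altitude without using Offset, would be rejected by the analysis.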
The objectives of compliance and robustness of EOC with respect to LLR (objectives 3 and 4 of table FM.A-6 from DO-333) can be addressed by relying on the corresponding objectives for source code, provided a demonstration of property preservation between source code and EOC is also given. One way to show property preservation would be to demonstrate with reasonable confidence that, in all possible cases, the compiler preserves the semantics of programs from source code to EOC. Unfortunately, no practical approach seems able to provide that confidence. Instead, by running integration tests in a mode where SPARK contracts are executed, the user gains confidence that the compiler preserved the semantics of the source code in the EOC: had it not, the contracts proved on individual functions would, with very high probability, have failed during the integration tests. By running the integration tests both with and without contracts being executed and checking that the outputs are identical, the user also gains confidence that compiling the contracts does not alter the compilation of the code, since otherwise the outputs would most probably differ on some tests.
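With the GNAT toolchain, for example, contracts written as Ada 2012 assertions can be compiled into runtime checks, either with the -gnata switch or with an Assertion_Policy pragma, so the same integration-test suite can be built and run in both modes:

```ada
--  Sketch of the dual-build approach: compile the integration-test
--  build with checks enabled, so any Pre/Post violation raises
--  Assertion_Error during the test run; compile a second build with
--  policy Ignore and compare the outputs of the two runs.
pragma Assertion_Policy (Pre => Check, Post => Check);
```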
Of course, a major benefit of using SPARK is the ability to replace unit tests with SPARK analysis. In such a case, DO-333 also defines additional objectives 5 to 8 of table FM.A-7. A combination of formal verification and reviews can address these objectives, as demonstrated by past experiences at Airbus and Dassault Aviation. [“Testing or Formal Verification: DO-178C Alternatives and Industrial Experience,” Moy et al., IEEE Software 2013.] Several features of SPARK are in use here, such as the ability to state data dependencies in function contracts and the possibility of expressing function contracts by disjoint cases.
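Disjoint cases are expressed with the SPARK Contract_Cases aspect, as in this illustrative sketch (the function itself is hypothetical):

```ada
--  Illustrative sketch: a contract stated as disjoint cases. The
--  SPARK tools verify that the guards on the left are disjoint and
--  complete, and that each consequence on the right holds when its
--  guard holds on entry.
function Sign (X : Integer) return Integer
with
  Contract_Cases =>
    (X < 0 => Sign'Result = -1,
     X = 0 => Sign'Result = 0,
     X > 0 => Sign'Result = 1);
```

The disjointness and completeness checks mirror the case-by-case coverage that a reviewer would otherwise have to establish for a table of unit tests.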
Formal program verification toolsets have been used by a few pioneers since the 1990s. Progress in automation of formal program verification in the certification of avionics software now makes these techniques accessible to more companies. SPARK enables users to address many of the verification objectives defined in the Formal Methods Supplement DO-333 of DO-178C. Certification authorities in the United States and Europe are now looking favorably at applicants who use such methods in avionics certification.
AdaCore • www.adacore.com