Military Embedded Systems

When code is born of old: Architectural obsolescence within the development life cycle


April 11, 2011

Mark Pitchford

LDRA Technology

Portability is not just relevant to software migration during the lifetime of a product. In complex military projects, where the development cycle can easily take up to 10 years, software might have to be ported to new architectures two or three times. Unless specific steps are taken to ensure platform portability, each migration drives project overruns, extending schedules right through testing and verification. The ideal is to anticipate such an occurrence and plan to minimize its impact, but the right software test tools can smooth the way even when porting legacy code written without such foresight.

Choosing appropriate software test tools is always an essential step for safety-critical applications, and that is particularly true when portability is vital. It is well worth a little additional lateral thinking about how those tools might be applied to this specific circumstance.

The purpose of DO-178B is to provide guidelines for the production of software for airborne systems and equipment that performs its intended function with a level of confidence in safety that complies with airworthiness requirements. Although DO-178B demands the use of appropriate coding rules, it does not dictate what those rules should be.

Other standards, such as the Motor Industry Software Reliability Association C coding standard (MISRA C), MISRA C++, and the Joint Strike Fighter Air Vehicle C++ Coding Standards (JSF++ AV), specify detailed rules about how such safety-critical code should be written, and they are found in an increasingly broad spectrum of applications. For instance, MISRA C was initially developed for use in automotive applications but has since been adopted across a wide variety of industries, including the medical, rail, aerospace, and military sectors. Projects developed to standards such as DO-178B will therefore typically adopt a rule set such as MISRA or JSF++ AV. The problem is that although these rule sets all acknowledge that portability is desirable and include rules relating to it, portability is rarely their primary concern.

Adopting an appropriate rule set

Choosing a static analysis tool that encompasses many such standards provides access to the portability rules of all of them. Considering additional portability rules in tandem with those from a chosen standard ensures more focus on that aspect of the development work and hence minimizes the impact of any subsequent change of target.
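
As a hypothetical illustration of the kind of construct such portability rules address (the function names here are invented for the example), consider code that assumes the width of a plain int. Rule sets such as MISRA C steer developers toward fixed-width types precisely because the sizes of the basic types are implementation-defined:

#include <stdint.h>

/* Non-portable: the width of "int" is implementation-defined. On a
   32-bit target this multiplication is safe, but on a 16-bit target
   a reading of 1000 overflows, and the behavior is undefined. */
int scale_reading_nonportable(int reading)
{
    return reading * 100;
}

/* Portable: int32_t is exactly 32 bits on any conforming target, so
   the arithmetic behaves identically after a change of architecture. */
int32_t scale_reading(int32_t reading)
{
    return reading * 100;
}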

There is no consequential compromise in the adherence to the standard of choice. And by choosing a rule superset of the nominated standard, it is possible to ensure that a change in architecture will have the minimum possible impact. With the correct tools, it is also perfectly possible to cross-reference Code Review Reports to a nominated standard and still include the additional portability rules.

Adapting to new target hardware

When the time comes to port code to a new target, it is useful to focus on only those parts of the rule set that have implications for portability, whether or not the change of target stems from obsolescence during a lengthy development.

If the portability-focused rule set has been in place from the start, this exercise should largely be one of confirming that there are no problems. If not, the ability to filter static code review reports to focus on portability becomes invaluable.
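
To make that concrete, the following hypothetical fragment shows the kind of finding such a filtered report might surface: code whose result depends on the byte order of the original target.

#include <stdint.h>
#include <string.h>

/* Endianness-dependent: the first byte in memory holds the least
   significant byte on a little-endian target but the most significant
   byte on a big-endian one, so a port between the two silently
   changes the value returned. */
uint8_t low_byte_nonportable(uint32_t word)
{
    uint8_t bytes[sizeof word];
    memcpy(bytes, &word, sizeof bytes);
    return bytes[0];
}

/* Portable: operating on the value itself is independent of how the
   target lays the bytes out in memory. */
uint8_t low_byte(uint32_t word)
{
    return (uint8_t)(word & 0xFFu);
}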

In this case, there are likely to be changes spread through the code base. So once the changes have been made, how can it be proven that the code functionality has not been compromised?

Ensuring uncompromised functionality

Unit test tools often encourage the collation of a suite of regression tests as development progresses. If such tests exist, rerunning them for a new target environment is a trivial exercise. If there is no such existing portfolio of unit test cases, an alternative approach needs to be considered.

For example, it is possible to automatically generate a set of unit test cases based on code analysis. These test cases can then be used to prove that functionality is identical before and after modification. While unit test cases not based on requirements are generally inferior because of a lack of independence, this approach bears scrutiny here because the primary requirement is to confirm that functionality has not changed.
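
A minimal sketch of that approach, using a hypothetical unit under test and hand-picked boundary inputs (in practice, a tool would generate these from code analysis), captures the unit's outputs on the original target as a baseline and compares them after the port:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical unit under test, standing in for the real code. */
static int32_t scale_reading(int32_t reading)
{
    return reading * 100;
}

/* Boundary-value inputs: the cases most likely to expose word-size
   or sign-handling differences between the old and new targets. */
static const int32_t inputs[]   = { -32768, -1, 0, 1, 32767 };

/* Outputs recorded when the suite was run on the original target. */
static const int32_t baseline[] = { -3276800, -100, 0, 100, 3276700 };

int main(void)
{
    unsigned failures = 0u;
    for (size_t i = 0u; i < sizeof inputs / sizeof inputs[0]; ++i) {
        int32_t actual = scale_reading(inputs[i]);
        if (actual != baseline[i]) {
            printf("FAIL: input %ld expected %ld got %ld\n",
                   (long)inputs[i], (long)baseline[i], (long)actual);
            ++failures;
        }
    }
    printf("%u failure(s)\n", failures);
    return (failures == 0u) ? 0 : 1;
}

On the new target, any divergence from the baseline pinpoints exactly which inputs are affected by the port.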

Another approach involves comparing “baselines” from test tools, which can show how code has been changed. These can be used in conjunction with the tools’ graphical representations of code architecture to ensure that even unchanged code is not affected by changes made elsewhere.

Such possibilities exist because modern test tool suites comprise a collection of tools that, although frequently used in a prescribed sequence, are often far more adaptable than such a rigid definition implies. Considered not only as part of the development process but also as a tool kit, they have the flexibility to help in a plethora of situations … including this one.

Mark Pitchford has more than 25 years of experience in software development for engineering applications. Since 2001, he has specialized in software testing and works as a Field Applications Engineer with LDRA. He can be contacted at [email protected].

 
