Military Embedded Systems

Code Quality: Dynamic & static analysis combined makes engineers & auditors happy

Blog

March 28, 2013

Mark Pitchford

LDRA Technology

In the good old days, before writing software became “software engineering,” code development was a black art practiced by weird nerdy kids straight out of college. For them, coding was by no means a structured discipline. If you managed to get them to communicate, they might tell you that they were hacking code together and using ad hoc test data to see whether it did what it was supposed to do when they executed it.

Whether they knew it or not, they were practicing dynamic analysis through system functional test. Unlike static analysis, dynamic analysis involves code execution by definition.

But what did this do beyond showing the basic functionality as roughly correct for whatever rudimentary test data was dreamed up? While better than nothing, such testing likely exercised no more than half of the code. Jack Ganssle, industry software guru, chief consultant for The Ganssle Group, and industry editor, concurs: “Studies confirm that, without the use of code coverage analysis, testing typically exercises only 50% of the code. Given typical bug rates, that means 100K lines of code in a program will ship with 2500 to 5000 bugs. These bugs lead to many systems failures.”

Why? Because no matter how imaginative the tests, chances are that real life will throw some curve balls to try out the untested paths. And if something executes that wasn’t tested, you’re likely in for some surprises and potentially catastrophic failures.

Fast forward 30 or 40 years. Although such a homespun approach doesn’t cut it with complex military embedded applications, functional test remains at the core of what dynamic test can do. Carefully selected test data show that the branches and statements in the source code are exercised in accordance with the specification, allowing us to show not only that the system is functionally correct, but also that we have exercised all of it. When used in tandem with static analysis, dynamic analysis provides the supporting evidence needed to prove that all of our other good works and best practices yield a safe, secure, and high-quality end product.

Unlike the hacking of years ago, today’s automated test tools precisely trace the execution route via techniques such as instrumentation probes. These probes are essentially additional function calls that generate “I’ve been here” messages from strategic points in the source code, allowing coverage data to be collated. They give dynamic tests feedback on how comprehensive they are, so that we can successively build on each set of results until the desired level of coverage is reached.
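To make the probe idea concrete, here is a minimal sketch in C of what instrumented code might look like. The coverage_probe function, the probe IDs, and the clamp_speed example are illustrative assumptions rather than the output of any particular tool; a real instrumenter would typically log compact records to a buffer or trace port rather than printing text.

    #include <stdio.h>

    /* Illustrative coverage probe: reports an "I've been here" message
       for a given point in the source code. */
    static void coverage_probe(int probe_id)
    {
        printf("I've been here: probe %d\n", probe_id);
    }

    /* Example function with probes inserted at the entry point and on
       each branch, so running a test reveals which paths were taken. */
    int clamp_speed(int requested, int limit)
    {
        coverage_probe(1);              /* function entry */
        if (requested > limit) {
            coverage_probe(2);          /* "over limit" branch exercised */
            return limit;
        }
        coverage_probe(3);              /* "within limit" branch exercised */
        return requested;
    }

Collating which probe IDs are reported while the tests run shows immediately which branches remain unexercised, which is exactly the feedback loop described above.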

In turn, this gives us flexibility in how much code is exercised at a time. Dynamic analysis of an entire system is possible, but there are always routes through the code that cannot be exercised by the system during normal operation: defensive code, for instance, such as a check for division by zero.

At times like this, it’s good to use unit tests as well. Unit tests encapsulate a subset of the system and allow parameters to be passed such that the defense mechanism code in our example is exercised. We even have the option of basing application dynamic analysis entirely on unit tests, collating code coverage data as each module is developed and removing any requirement to wait for a complete system.
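As a sketch of that idea, the following unit test passes parameters chosen to drive the defensive branch directly; the safe_divide function and the test harness are invented here for illustration, since the article does not show actual code.

    #include <assert.h>

    /* Function under test: the defensive check may never be triggered
       during normal system operation. */
    static int safe_divide(int numerator, int denominator, int fallback)
    {
        if (denominator == 0) {        /* defensive path */
            return fallback;
        }
        return numerator / denominator;
    }

    /* Unit test exercising both the defensive branch and the normal
       branch, so the module's coverage is complete without waiting for
       a full system build. */
    static void test_safe_divide(void)
    {
        assert(safe_divide(10, 0, -1) == -1);   /* defensive branch */
        assert(safe_divide(10, 2, -1) == 5);    /* normal branch */
    }

    int main(void)
    {
        test_safe_divide();
        return 0;
    }

Coverage data collected from tests like this can be merged with the system-level results, so the defensive paths no longer show up as gaps.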

Today’s military applications require support for architectural standards such as ARINC 653 or FACE to improve code portability and reusability. Comprehensive coverage data gathered through dynamic analysis provides evidence that code remains maintainable, safe, and secure even when ported to a different application, particularly when used in tandem with an effective static analysis regime.

Our nerdy friends of yesteryear weren’t easily impressed, but they would surely recognize what we are doing as an extension of their own test efforts. Who knows?! We might even get a grunt of approval for our system failure rates or maybe even high fives over the mounds of Coke cans and pizza boxes stacked around their computers!
