Raising the bar for security needs: What does "secure boot" really mean?

Embedded boot code security is an important area of vulnerability analysis being investigated by technology providers. After adding a digital signature or authentication step, however, marketing immediately labels the solution "secure boot." It is time to examine what "secure boot" really means, and how to grade secure boot technologies against the security risks in end systems. Key considerations include what is being protected and how, the secure boot capabilities available, matching requirements to capabilities and claims, and open standards efforts.

As “Cyber Security Awareness” becomes a top White House priority in 2010, it is becoming apparent that most of today’s awareness efforts are focused on Internet systems. Very little outreach focuses on the security of embedded systems, unless a security breach makes the news.

The number one vulnerability source in embedded processors is the initial boot phase of the device. This is an assertion made by the author, but the truth of this statement can be independently assessed through a search of academic papers on embedded security, discussions at security conferences, and feedback from equipment manufacturers who can attest to the ways in which their products are tampered with and cloned in the marketplace.

The volume of these published and unpublished security incidents in the past few years has driven the software industry to focus on the security of microprocessors’ and embedded processors’ boot process. The reason for the analysis presented herein is straightforward: By assessing how much security is really designed into the system with any solution, designers can make a more informed decision on how much actual end-system security they are getting for their R&D and procurement dollars.

What are you protecting – and how?

Two key questions should be posed by the architect responsible for system security (Figure 1). The first question in assessing the level of security in secure boot technology is whether the capability is actually preventing the exposure of the boot code and software IP of the embedded device or system. Can the most sophisticated antagonists who have access to a boot device or process access the code, then use it for reverse-engineering purposes?

Figure 1: Two key questions for secure boot requirements

If so, the hacker will then have all the information needed to clone the system, insert malware, develop countermeasures, or utilize other methods of disabling the system based on a vulnerability or flaw. Secure boot technology that does not protect the code itself offers no real security against the most determined attackers, and might reveal critical intellectual property.

The next question to ask about secure boot technology is: Which kinds of threats does it address? This question is a little bit more complicated. Some secure boot technologies aim to prevent the substitution of altered boot code (“unauthorized replacement”), which might introduce malware or security backdoors into a processor once it is initialized. Other capabilities attempt to limit changeable boot parameters such as multi-stage boot code sourcing in the device as it loads, so that an antagonist cannot interrupt the boot process and substitute false commands or security backdoors into the device setup.
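A multi-stage boot process of the kind described above is typically protected by a chain of trust: each stage holds the expected digest of the next stage and refuses to hand off control on a mismatch. The following is a minimal sketch of that idea; the stage names, contents, and return strings are hypothetical illustrations, not any vendor's implementation.

```python
import hashlib

def digest(blob: bytes) -> str:
    """SHA-256 digest of a boot-stage image."""
    return hashlib.sha256(blob).hexdigest()

# Hypothetical two-stage boot: a first-stage loader, then an OS loader.
stage1 = b"first-stage loader"
stage2 = b"OS loader image"

# The digest of stage 1 is anchored in immutable ROM; each chain entry
# pairs an image with the expected digest of the NEXT image.
rom_anchor = digest(stage1)
chain = [(stage1, digest(stage2)), (stage2, None)]

def boot(chain, rom_anchor):
    """Walk the chain, verifying every stage before 'executing' it."""
    expected = rom_anchor
    for image, next_expected in chain:
        if digest(image) != expected:
            return "halt: tampered stage detected"
        expected = next_expected
    return "boot complete"

print(boot(chain, rom_anchor))  # boot complete
# Substituting any stage breaks the chain:
tampered = [(stage1, digest(stage2)), (b"evil", None)]
print(boot(tampered, rom_anchor))  # halt: tampered stage detected
```

Note that this sketch addresses only the "unauthorized replacement" threat; it does nothing to hide the code itself, which is the first question posed above.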

Secure boot capabilities

A variety of methodologies are implemented today as secure boot technologies. This includes digitally signed binaries, secure and trusted boot loaders, boot file encryption, and security microprocessors.

Digitally signed boot files provide an important first step in preventing some of the most widespread boot-loading attacks tracked on the Internet. While some manufacturers call this capability “secure boot” because of its increased resistance to repetitive attacks, it is still susceptible to attacks in the verification procedure if the verifying module is not integrated into the embedded processor. In addition, this kind of secure boot does not address the protection of proprietary and sensitive information embedded in the boot file itself.
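The verification step described above can be sketched in a few lines. Real secure boot uses an asymmetric signature checked against a public key fused into the processor; in this illustration an HMAC tag stands in for the signature so the example stays within the Python standard library, and the key material is hypothetical.

```python
import hashlib
import hmac

# Hypothetical key provisioned at manufacture (a stand-in for the
# signing key pair a real scheme would use).
DEVICE_KEY = b"key provisioned at manufacture"

def sign_image(image: bytes) -> bytes:
    """Produce an authentication tag over the boot image."""
    return hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()

def verify_image(image: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time.
    compare_digest avoids the timing side channel a naive == would leak."""
    return hmac.compare_digest(sign_image(image), tag)

image = b"bootloader v1.2"
tag = sign_image(image)
assert verify_image(image, tag)            # untampered image accepted
assert not verify_image(image + b"\x00", tag)  # any modification rejected
```

Observe that the image itself travels in the clear: the check proves integrity and origin, but, as the paragraph above notes, it does not protect the proprietary information inside the boot file.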

Improving the trust and security levels of boot loaders is also an important step in industry security and awareness. Proving the security or trust level of the boot loader is itself a multi-variable problem that depends a great deal on the complexity of the boot process. When loading boot code over a network, for example, the security and trust of the boot loader are likely only as secure as the network itself. Securing a large multi-party network is a far larger problem than the secure boot of an embedded processor. This is a security factor in systems that are enabled to receive remote firmware updates.

Boot file encryption/decryption and dedicated security microprocessors are relatively recent design features of secure embedded processors. They offer silicon circuit embedded capabilities to protect operating code and manage secure boot.

Matching requirements to capabilities, claims, and standards

The primary mechanism today leading to the component feature “secure boot” is a digitally signed software boot image verified by the embedded processor during the boot process. This process can coexist with virtually any software image encryption scheme.

Without question, adding a digital signature verification step adds security compared to systems with no digital verification. But to call this system “secure boot” prior to examining the verification process, and the methodology of employing the digital signature and verifying it, is not necessarily a genuine claim. It is also a nomenclature that does not allow for comparing one “secure boot” scheme to another to determine which is more secure, or which offers more layered secure boot features.

Only one major industry group is currently focused on setting commercially available standards for the boot integrity of processors and embedded processors: the Trusted Computing Group (TCG). This international industry consortium includes many of the largest providers of operating systems and processors; the TCG consortium is dedicated to creating a common non-proprietary standard for improving the “trust” level of a user computing environment. This group has significantly advanced the cause of improving the integrity of runtime code execution.

The TCG has developed a common standard called the “Trusted Platform Module” (TPM)[1], which serves to monitor the operating kernel within a system, both in the boot phase and in operation. A Mobile Trusted Module (MTM) is being specified as well. The TPM is designed to digitally sign loaded operating system images and determine whether there has been any tampering, in the boot phase or otherwise. There has been some sporadic adoption and specification of more recent versions of the TPM standard, particularly in government systems[2].
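The TPM's core boot-integrity mechanism is the "measure then extend" pattern: each boot component is hashed and folded into a Platform Configuration Register (PCR), so the final register value commits to the entire boot sequence and its order. A minimal sketch, using SHA-256 and hypothetical component names:

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """TCG-style PCR extend: PCR = SHA-256(PCR || SHA-256(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

pcr = bytes(32)  # PCRs start zeroed at reset
for component in (b"firmware", b"bootloader", b"kernel"):
    pcr = extend(pcr, component)

# The same components measured in the same order always reproduce the
# same PCR value; changing any component, or the order, changes it.
# A verifier compares the reported PCR against a known-good value.
```

Because the PCR can only be extended, never written directly, malicious boot code cannot erase the evidence of its own measurement.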

Using a commercially available open standard is usually the most cost-effective approach to implementing new features and capabilities, though not always the most secure or tamper-proof. Military customers favor strong defense-in-depth approaches to layered security and redundancy, which often run counter to the cost-effectiveness goals of open standards.

What “secure” really means for products

When engineers at CPU Tech talk about “secure boot” in embedded systems, they define it as a fully encrypted boot file with multiple levels of encryption, implemented by a security engineer within a controlled environment. This involves configuring the device so that it boots only from an encrypted file that can be matched to the secure processor through a unique hardware ID. This process protects the boot code and any proprietary information within it, manages the boot process through a dedicated internal security processor to prevent code tampering, and follows boot loader guidelines for secure operating systems to constrain unknown configuration states. These elements of secure boot can be seen in Figure 2.
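The "matched to the processor through a unique hardware ID" idea can be illustrated by deriving a per-device key from that ID, so an encrypted image copied to a cloned board fails to decrypt. This is a toy sketch under stated assumptions: the names are hypothetical, the XOR keystream is for illustration only (not a real cipher), and an actual device does this in a hardware key-derivation block, not software.

```python
import hashlib
import hmac

# Hypothetical factory provisioning secret.
MASTER = b"factory provisioning secret"

def device_key(hardware_id: bytes, master_secret: bytes) -> bytes:
    """Derive a per-device key from the unique hardware ID."""
    return hmac.new(master_secret, hardware_id, hashlib.sha256).digest()

def xor_stream(data: bytes, key: bytes) -> bytes:
    """Toy symmetric transform: XOR with a hash-based keystream.
    Illustration only -- a real design uses a vetted cipher such as AES."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key_a = device_key(b"device-id-A", MASTER)
key_b = device_key(b"device-id-B", MASTER)

boot_code = b"proprietary boot code"
encrypted = xor_stream(boot_code, key_a)
assert xor_stream(encrypted, key_a) == boot_code   # intended device boots
assert xor_stream(encrypted, key_b) != boot_code   # cloned board cannot
```

The design point is that the ciphertext, not just its signature, is bound to one physical device, which addresses the code-exposure question raised at the start of the article.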

Figure 2: Separated elements of secure boot systems

End goal: More secure systems

As systems architects are well aware, secure boot is not a goal in and of itself. The purpose of a secure boot capability is to design a more secure overall system, one in which the initialization process and the operating code itself are protected from tampering and reverse-engineering. No system is foolproof, so we should be cautious and judicious in applying the term “secure boot” to an embedded system design.

The author advocates that embedded technology providers and engineers designate the descriptors “digitally signed boot files,” “encrypted boot,” “boot code authentication,” and “trusted boot loader” as separate features in both requirements and system specifications. Additionally, buyers of extremely high-security systems might want to specify boot features in even greater detail. This will avoid the current confusion surrounding the term “secure boot,” which is now an abstract term referring to a variety of point solutions and features.

References:

1. Trusted Computing Group. “Trusted Platform Module Main Specification,” July 2007. www.trustedcomputinggroup.org/solutions/authentication.

2. Rolf von Roessing, Computer Weekly. “ISACA: Users reject trusted computing because of privacy and security concerns,” 17 November 2009. www.computerweekly.com/Articles/2009/11/17/239169/isaca-users-reject-trusted-computing-because-of-privacy-and-security.htm

J. Ryan Kenny is a product manager at CPU Tech. He is responsible for developing security requirements and certification roadmaps for the Acalis line of secure embedded processors. He joined CPU Tech in February 2009 and has more than 10 years of experience in space and defense electronics in the U.S. Air Force and defense systems engineering. He graduated from the U.S. Air Force Academy and completed an MSEE and MBA from California State University and Santa Clara University respectively. He can be contacted at rkenny@cputech.com.

CPU Tech 925-224-9920 www.cputech.com