Wednesday, October 13, 2010

The supply chain and re-use of software and hardware

by Courtney Cavness

Many prominent IT security standards offer no specific guidelines defining what constitutes an acceptance procedure when an organization incorporates third-party code into its product.

Perhaps because there are no standard specifications, many IT companies undergoing security evaluation do not perform a manual review of this third-party code, especially of large open-source components such as kernels and libraries. And if even companies seeking security certification for their products skip such reviews, the practice is likely rarer still across the IT industry as a whole. Re-using code saves money. Unfortunately, code developed or modified by someone outside your organization is inherently untrusted, and the practice of including this untrusted code has introduced many vulnerabilities, as discussed in a recent report issued by Veracode.

The following report outlines some statistics on the types of vulnerabilities that can be traced back to the inclusion of third-party code in a product, and highlights the fact that use of third-party code is so commonplace that some organizations don't recognize that components originated outside their own development environment. In fact, third-party code was found to be a major source of application vulnerabilities - and to be ubiquitous. Between 30% and 70% of applications submitted to Veracode as "internally developed" were found to contain code from third-party suppliers, according to Chris Eng, Senior Director of Security Research at Veracode. Report: Reused, Third Party Code Major Sources of Insecurity

The following white paper discusses what could constitute acceptance procedures under the Common Criteria security standard at evaluation assurance levels EAL3 and EAL4. Untrusted Developers - Code Integrity in a Distributed Development Environment.
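Whatever form a full acceptance procedure takes, one common first step is verifying that a third-party artifact actually matches the digest its supplier published, before any review or integration begins. Here is a minimal sketch in Python; the function names and the idea of a supplier-published SHA-256 digest are illustrative assumptions, not part of any particular standard.

```python
# Sketch of one acceptance-procedure step: confirm that a third-party
# archive matches the digest published by its supplier before accepting
# it into the build. Function names here are illustrative.
import hashlib


def sha256_of(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, streaming it in chunks
    so large archives don't have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def accept_artifact(path, expected_sha256):
    """Return True only if the artifact's digest matches the one the
    supplier published; otherwise the artifact should be rejected."""
    return sha256_of(path) == expected_sha256.lower()
```

A check like this only proves the bits arrived intact from the supplier; it says nothing about whether the code itself is trustworthy, which is exactly why the manual review discussed above still matters.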

And according to the following article featured in Scientific American, this problem can also affect hardware, especially integrated circuits, because the prevailing business model of chip design incorporates circuits designed by various firms in locations around the world. As the article points out, the very nature of physical hardware attacks makes them difficult to test for and to correct once identified (unlike software, which can be wiped clean from a system). The Hacker in Your Hardware.

In the end, there are no shortcuts to security. Re-using code may save money, time, and development effort, but an IT organization that incorporates malicious or vulnerable code into its products saves little in the end, because it passes that code along to its end users.
