Showing posts with label hardware.

Tuesday, September 27, 2011

Steve Weingart to Speak at the Non-Invasive Attack Testing Workshop in Nara, Japan

atsec's principal consultant Steve Weingart will be a panelist at the Non-Invasive Attack Testing Workshop (September 25th – 27th, 2011) in Nara, Japan. Weingart was asked to join the panel as a laboratory representative to discuss the practicality of non-invasive testing and how it fits into the conformance testing and business requirements of the laboratories. He will give a short introduction to the subject matter before joining the panel discussion.

Read more about this on our website.

Monday, December 6, 2010

What is Side Channel Analysis and why should I worry about it?

by Steve Weingart

Side Channel Analysis (SCA) has its roots in the TEMPEST work that goes back to the World War II era. TEMPEST was the study of electrical, mechanical, and/or acoustical emissions from devices. These emissions could contain information that was supposed to remain hidden, and TEMPEST methods were used in attempts to recover those secrets. Side Channel Analysis is more recent work, started by Paul Kocher and others in the mid-1990s, that examines emissions from electronic devices to learn secrets that are not supposed to be leaked.

The best known methods of SCA are Simple Power Analysis and Differential Power Analysis (SPA and DPA). SPA and DPA are extremely effective ways to extract information from small computing devices, such as smart cards and tokens. SPA and DPA work by sampling and examining the power supply current (Icc) of these devices. By simple inspection, in the case of SPA, or by mathematical processing in the case of DPA, it is often possible to determine the data, and the secrets, that were processed by the device.
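To make the DPA idea more concrete, here is a minimal difference-of-means sketch in Python run against simulated traces. It is an illustration only: the noise model, the single leaking sample, and the random substitution table standing in for a real cipher S-box are all assumptions, not a description of any particular device or of the analyses discussed here.

```python
# A difference-of-means DPA sketch on simulated power traces (illustration only).
# The leakage model, noise level, and random substitution table are assumptions
# standing in for a real device and cipher S-box.
import numpy as np

rng = np.random.default_rng(0)

# A fixed random 8-bit substitution table stands in for a cipher S-box.
SBOX = rng.permutation(256)

def hamming_weight(x):
    return bin(int(x)).count("1")

def simulate_trace(plaintext, key, n_samples=50, leak_index=25, noise=2.0):
    """One simulated 'power trace': Gaussian noise everywhere, plus Hamming-weight
    leakage of SBOX[plaintext ^ key] at a single time sample."""
    trace = rng.normal(0.0, noise, n_samples)
    trace[leak_index] += hamming_weight(SBOX[plaintext ^ key])
    return trace

# Collect traces for random plaintexts processed with a fixed secret key byte.
SECRET_KEY = 0x3C
plaintexts = rng.integers(0, 256, 2000)
traces = np.array([simulate_trace(p, SECRET_KEY) for p in plaintexts])

# DPA: for each key guess, split the traces by one predicted bit of SBOX[p ^ guess]
# and take the difference of the two group means; the correct guess produces the
# largest peak because only its predictions line up with the actual leakage.
best_guess, best_peak = None, -1.0
for guess in range(256):
    predicted_bit = np.array([SBOX[p ^ guess] & 1 for p in plaintexts])
    diff = traces[predicted_bit == 1].mean(axis=0) - traces[predicted_bit == 0].mean(axis=0)
    peak = np.max(np.abs(diff))
    if peak > best_peak:
        best_guess, best_peak = guess, peak

print(f"recovered key byte: {best_guess:#04x} (secret was {SECRET_KEY:#04x})")
```

Against real hardware the same partition-and-average idea is applied to measured Icc traces with many more samples and traces, but the principle is the same: averaging makes a small data-dependent difference stand out above the noise.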

When SPA/DPA was first put into use, it was often possible to take a single oscilloscope trace of the Icc as a Smart Card performed an encryption operation and then, with a little practice, read the encryption key from the screen directly. It was pretty scary! Especially since it used no special or exotic equipment and the command that invoked the cryptographic operation was a normal identification command.

Once people understood the risk, the race was on. Developers would create mechanisms to make it harder and harder to find any information in the available signals, and the attackers would create more and more sophisticated methods of extracting that data.

In the 1990s and 2000s, several important things happened in SCA. The attack and defense mechanisms have both become so exotic that it now takes very specialized equipment to mount an attack that is likely to be effective. But it is certain that both sides are working harder than ever, and the risk is still there, bigger than ever.

In addition, other avenues of SCA have been explored, such as Electro Magnetic Analysis (EMA). EMA examines the radio frequency emanations from these same electronic devices and can be significantly more effective than SPA/DPA at extracting secrets — despite prevention mechanisms.

What this means to developers of cryptographic devices and tokens is that SCA is an important risk to assess. SPA/DPA risk analysis is becoming a requirement for Smart Cards and tokens used in the credit card and Personal Identity Verification (PIV) industries, and that list of industries is growing. Some Common Criteria Protection Profiles now require SPA/DPA analysis, and FIPS 140-3 is very likely to require it as well. In fact, it might be added to FIPS 140-2 or to a companion standard, becoming a requirement even sooner rather than waiting for the release of FIPS 140-3.

Wednesday, October 13, 2010

The supply chain and re-use of software and hardware

by Courtney Cavness

In many prominent IT security standards, there are no specific guidelines defining what constitutes an acceptance procedure when an organization incorporates 3rd-party code into its product.

Perhaps because there are no standard specifications, many IT companies undergoing security evaluation do not perform a manual review of this 3rd-party code, especially for large open source components such as kernels and libraries. If companies seeking security certification for their products are not performing such reviews, it is likely a reflection of real-world practice across the industry: re-using code saves money. Unfortunately, code developed or modified by someone outside your organization is inherently untrusted, and the practice of including this untrusted code has led to the introduction of many vulnerabilities, as discussed in a recent report issued by Veracode.

The following report outlines some statistics on the types of vulnerabilities that can be traced back to the inclusion of 3rd-party code in a product, and highlights the fact that use of 3rd-party code is so commonplace that some organizations don't even recognize which components originated outside their development environment. In fact, third party code was found to be a major source of application vulnerabilities, and to be ubiquitous: between 30% and 70% of applications submitted to Veracode as "internally developed" were found to contain code from third party suppliers, according to Chris Eng, Senior Director of Security Research at Veracode. Report: Reused, Third Party Code Major Sources of Insecurity

The following white paper discusses specific features of what could constitute acceptance procedures for the Common Criteria security standard at evaluation assurance levels EAL3 and EAL4: Untrusted Developers - Code Integrity in a Distributed Development Environment.
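As a concrete illustration of one small piece of such an acceptance procedure, the sketch below (not drawn from the white paper itself) checks a third-party source archive against a published SHA-256 digest before it is accepted into the build; the file name and expected digest are hypothetical placeholders.

```python
# A minimal sketch of one acceptance-procedure step: verify that a 3rd-party
# source archive matches a digest published by its upstream project before
# accepting it into the build. ARCHIVE and EXPECTED_SHA256 are hypothetical
# placeholders, not values from the referenced white paper.
import hashlib
import sys

ARCHIVE = "third_party/vendor-lib-1.2.3.tar.gz"            # hypothetical path
EXPECTED_SHA256 = "<digest published by the upstream project>"

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

digest = sha256_of(ARCHIVE)
if digest != EXPECTED_SHA256:
    sys.exit(f"REJECT: {ARCHIVE} digest {digest} does not match the published value")
print(f"ACCEPT: {ARCHIVE} matches the published checksum")
```

A real acceptance procedure would go further, for example verifying a digital signature on the archive and recording the result in configuration management, but even this minimal check prevents silently pulling a tampered archive into the build.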

And according to the following article featured in Scientific American, this problem can also affect hardware, especially integrated circuits, because the prevailing business model for chip design incorporates circuits designed by various firms in locations around the world. As the article points out, the very nature of physical hardware attacks makes them difficult to test for and to correct once identified (unlike software, which can be wiped clean from a system). The Hacker in Your Hardware.

In the end, there are no shortcuts to security. Although it may save money, time, and development effort to re-use code, an IT organization that incorporates malicious or vulnerable code into its products doesn't really save much at all when it passes that code along to its end users.

Wednesday, May 26, 2010

Automobile Security

One of the best things about working for atsec is that I never know what I will be working on or investigating next. That is, so long as it's on the topic of information security.

A topic that sometimes gets discussed with my atsec colleagues is that of embedded systems: how pervasive they are, and how they are often found in very hostile environments where it is difficult to rely on the safeguards typically found in data centers, safeguards that we can "assume" in an analysis of operating systems and applications. These include restricting attackers' access and having a trusted administrator. In the world of embedded systems, such assumptions hold far less readily. Our office in Germany has had projects with some major European car manufacturers, and the topic of information security for the on-board systems is, for some reason, a topic of much interest to the guys...

I was reminded of this when I found this paper: "Experimental Security Analysis of a Modern Automobile". I guess we'll be talking more on this topic.

Fiona Pattinson