Complexity and Security
by Robert Hoffmann
The feature-richness of today's information systems and applications has not only led to architectural complexity and difficulties in maintenance, but has also created a large attack surface.
In a risk analysis, the actual threat is the product of the probability of a successful attack and its severity. The traditional approach of improving the security of individual components is therefore offset if, at the same time, the number of accessible interfaces increases at a much higher rate.
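To make this concrete, here is a minimal sketch with purely hypothetical numbers: even if each individual interface is made twice as hard to attack, tripling the number of exposed interfaces still increases the overall risk.

    # Minimal sketch with hypothetical numbers: total risk grows with the
    # number of exposed interfaces even when each individual interface
    # becomes harder to attack.

    def total_risk(num_interfaces, prob_per_interface, severity):
        # Expected risk: probability times severity, summed over all interfaces.
        return num_interfaces * prob_per_interface * severity

    # The vendor hardens each interface (attack probability halved) ...
    before = total_risk(num_interfaces=10, prob_per_interface=0.02, severity=100000)
    # ... but triples the number of accessible interfaces at the same time.
    after = total_risk(num_interfaces=30, prob_per_interface=0.01, severity=100000)

    print(before, after)  # 20000.0 vs. 30000.0 -- the hardening is offset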
Corporate-driven philosophies of providing ever more product features and functionality have led to a steadily worsening situation in today's technology. Instead of relying on trusted components, new components, product versions, or even entirely new technologies are often deployed in production systems as soon as they become available, resulting in complex architectures and implementations. Of course, the vendors are only partly to blame. It is also the customers who demand ever more advanced products. Only recently have they started to realize that this usually comes at the cost of decreased stability and security.
In September 2010, the German computer magazine c't reported on severe security issues in 17 banking websites. [1] The article described a 16-year-old student, Armin Razmdjou, who found several cross-site scripting vulnerabilities. Even after the banks patched them, he kept finding more. In their desire to provide the best customer experience, the banks were using web technologies that were never designed to provide a secure transaction environment. Combined with the complex layout of their web pages, this led to applications that can no longer be considered "secure by design."
In 2011, Sony became a target for hackers for political reasons [2]. Within two months, their web servers were breached 20 times, including the disclosure of personal data. The hackers attacked various local servers in multiple countries to gain access. Instead of "putting all their eggs in one basket," Sony had opted for a complex distributed design. This meant that their information systems exposed a huge attack surface, and an attacker with only basic equipment who could exploit just one of the vulnerabilities caused a very costly incident.
This problem is not limited to websites; it can be found in a variety of applications and services as a consequence of increasing system complexity combined with public access to those systems.
Another example was Vodafone in 2009. Their femtocell product was hacked, allowing an attacker to impersonate and perform a man-in-the-middle attack on any Vodafone UK customer [3]. This was possible because the femtocell needed access to internal cryptographic information in order to function. While larger base stations are located in secured areas and connected via dedicated lines, the femtocell is set up in the more hostile environment of the customer premises and communicates over the Internet. This meant that many interfaces that had previously been internal-only were now accessible to the customer. There were, of course, security measures in place, but because the hackers could manipulate the hardware, they could also circumvent the software protection.
In this case, a complex device that was never designed to operate outside a protected environment was suddenly handed to the customer, thereby multiplying the attack surface of the whole system.
Traditional approaches, such as penetration testing, can only remind us of the problem. Paul Karger already showed in his 1974 Multics vulnerability analysis [4] that testing and patching can only demonstrate the presence of vulnerabilities, not their absence.
The security community has recognized this and developed the concept of attack surfaces [5]. By analyzing the attack opportunities a system offers, a statement can be made about its security level. This should not be mistaken for "security by obscurity": any interface, whether intended to be public or not, is considered available to the attacker.
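As a rough illustration (not the formal metric from [5]), an attack surface inventory might look like the following sketch, where every reachable interface contributes to the score regardless of whether it was meant to be public; the interface names, flags, and weights are invented.

    # Toy inventory sketch (not the formal attack surface metric from [5]):
    # every reachable interface counts, whether or not it was meant to be public.
    # Names, flags, and weights below are invented for illustration.

    INTERFACES = [
        # (name,                  reachable_by_attacker, privilege_if_compromised)
        ("web login form",        True,  "user"),
        ("admin REST API",        True,  "admin"),  # "internal only", but still reachable
        ("femtocell debug port",  True,  "root"),
        ("offline batch job",     False, "admin"),
    ]

    WEIGHT = {"user": 1, "admin": 5, "root": 10}

    score = sum(WEIGHT[priv] for _name, reachable, priv in INTERFACES if reachable)
    print("attack surface score:", score)  # 16 for this hypothetical system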
But the problem needs to be approached more fundamentally: systems need to be designed with security in mind from the beginning. This includes the requirement to minimize both the probability of an attack and its possible severity.
The UNIX principle of small, well-proven components that do one thing and do it well should be used to create the building blocks of systems.
By combining such components, it becomes possible to create interfaces that expose only the minimum required functionality and to ensure its security. This also keeps the overall design complexity of the system low, because only the required components are included and active.
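The following sketch, with invented class names, illustrates the idea: small single-purpose components are combined behind a facade that exposes only the one operation callers actually need, so everything else stays unreachable.

    import hashlib

    class Hasher:
        # Small component: does one thing (hashing) and nothing else.
        def digest(self, data: bytes) -> str:
            return hashlib.sha256(data).hexdigest()

    class Store:
        # Small component: simple key/value persistence only.
        def __init__(self):
            self._data = {}

        def put(self, key: str, value: str) -> None:
            self._data[key] = value

    class IntegrityService:
        # Facade: the only interface exposed to callers; administrative or
        # debugging functions of the inner components remain unreachable.
        def __init__(self):
            self._hasher = Hasher()
            self._store = Store()

        def record(self, name: str, payload: bytes) -> None:
            self._store.put(name, self._hasher.digest(payload))

    service = IntegrityService()
    service.record("report.pdf", b"...file contents...")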
Security analysis, of course, also has to reflect this. A reactive approach using only penetration testing and subsequent patching is insufficient. Instead, the complexity of a system needs to be included in the analysis, as well as the size of the attack surface it exposes. Both need to be treated as the weaknesses they are and be made subject to improvement. Security analysis and testing has to move from a component-oriented task to a holistic system evaluation.
Some ideas in this direction with regard to the Common Criteria were presented by Helmut Kurth at the 10th ICCC in Tromsø:
http://www.atsec.com/downloads/presentations/An_Attack_Surface_Driven.pdf
[1] http://www.heise.de/newsticker/meldung/Banken-Seiten-weiterhin-unsicher-1179476.html
[2] http://blogs.forbes.com/andygreenberg/2011/06/20/in-sonys-20th-breach-in-two-months-hacker-claims-177000-sony-emails-compromised/
[3] http://seclists.org/fulldisclosure/2011/Jul/187
[4] http://csrc.nist.gov/publications/history/karg74.pdf
[5] http://www.cs.cmu.edu/%7Ewing/publications/Howard-Wing03.pdf