
Wednesday, October 30, 2013

Collaboration and Openness to the Rescue of Entropy



This past September was my conference month. I first went to the 14th International Common Criteria Conference (ICCC) in Orlando, Florida and then a week later I was at the 1st International Cryptographic Module Conference (ICMC) in Gaithersburg, Maryland.

The theme of the ICCC this year was a collaborative approach. The conference directed the CC community to adopt collaborative Protection Profiles (cPPs). The task of protection profile development and maintenance is shifting from the CC Development Board (CCDB) to various Technical Communities (TCs). TCs are expected to gather experts from industry, government, and academia to ensure the cPPs stay current in a world of fast-evolving technologies. The CC User Forum (CCUF) will play an increasingly important role in facilitating communication between cPP consumers and cPP developers.

The opening speech of the ICMC was delivered by Charles H. Romine, Director of the Information Technology Laboratory at NIST. NIST's openness was the core of the speech. NIST has held open competitions for AES and SHA-3, and it even reopened the public comment period for the SP 800-90 series of crypto standards to give the public a second opportunity to review and comment on the standards. NIST acknowledged the challenges the Cryptographic Module Validation Program (CMVP) is facing (e.g., a review queue that is several months long) and invited ideas for improvement from the audience and the public.

One feature common to both conferences was the heated discussion of RNGs and entropy. The ICMC had three presentations devoted to this topic, and the ICCC featured several presentations on general cryptography, as well as on RNGs and entropy in particular.

The number of presentations at both conferences was not surprising, since more and more Targets of Evaluation (TOEs) rely on cryptographic support (i.e., class FCS functionality) for user authentication, data protection, secure communication, and trusted path/channels. Assessing the security strength of cryptographic algorithms and keys has become indispensable for the vulnerability assessment of a TOE. The Random Number Generator (RNG) that provides the random source for a key determines the quality of the generated keys, and a predictable RNG can lead to the downfall of the entire system. To avoid this Achilles' heel, it is crucial to have a well-designed and properly tested RNG and entropy source.


COLLABORATION?

Both CC evaluations and FIPS 140-2 validations require scrutiny of the entropy source. For example, Annex D of the U.S. Network Device Protection Profile (NDPP) v1.1 provides requirements for entropy documentation and assessment. The documentation of the entropy source is expected to be detailed enough to include the following:
  • Design Description
    Documentation shall include the design of the entropy source as a whole, including the interaction of all entropy source components.
  • Entropy Justification
    There should be a technical argument for where the unpredictability in the source comes from and why there is confidence in the entropy source exhibiting probabilistic behavior (an explanation of the probability distribution and justification for that distribution given the particular source is one way to describe this).
  • Operating Conditions
    Documentation will also include the range of operating conditions under which the entropy source is expected to generate random data.
  • Health Tests
    All entropy source health tests and their rationale will be documented (see the sketch after this list for a typical example).
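
To make the health-test bullet concrete, the sort of test typically documented here is something like the repetition count test of NIST SP 800-90B, which flags a source that gets "stuck" on one value. Below is a minimal Python sketch of such a test. The function names are my own, and the cutoff follows the SP 800-90B-style formula C = 1 + ceil(-log2(alpha)/H), assuming a false-positive rate alpha = 2^-20 and a claimed min-entropy of H bits per sample; treat it as an illustration, not as text from the NDPP or any IG.

    import math

    def rct_cutoff(h_min, alpha_exp=20):
        # SP 800-90B-style cutoff: C = 1 + ceil(-log2(alpha) / H),
        # with alpha = 2**-alpha_exp and H the claimed min-entropy
        # per sample in bits.
        return 1 + math.ceil(alpha_exp / h_min)

    def repetition_count_test(samples, h_min):
        # Fail (return False) if any value repeats `cutoff` or more
        # times in a row -- the classic symptom of a stuck source.
        cutoff = rct_cutoff(h_min)
        last, run = None, 0
        for s in samples:
            run = run + 1 if s == last else 1
            last = s
            if run >= cutoff:
                return False
        return True

    # A source claiming 4 bits of min-entropy per byte sample:
    print(repetition_count_test([3, 7, 7, 1, 9, 2], h_min=4.0))  # True
    print(repetition_count_test([5] * 10, h_min=4.0))            # False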

The CMVP is working on an Implementation Guidance (IG) document on entropy analysis. The draft version has already been circulated among accredited labs for review and comment. Vendors can also provide feedback through the labs representing them. Although the CMVP is still incorporating the feedback received from labs and vendors, the following requirements stated in the draft IG will likely remain unchanged in the final version:
  • The documentation shall contain a detailed logical diagram which illustrates all of the components, sources, and mechanisms that constitute the entropy source.
  • The statistical analysis has to be performed on the raw, non-conditioned data. The testing lab is responsible for choosing the statistical tests and justifying their selection. Further, the lab shall explain what the result of each test means and how this result can be justified (a minimal example of one such test appears after this list).
  • A heuristic analysis of the entropy source, along with justification of the entropy claims based on this analysis, shall always be included in the test report.
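
As one minimal example of a test a lab might select and then have to justify, here is a chi-square goodness-of-fit check of raw byte samples against the uniform distribution. This is my own illustrative sketch, not a test prescribed by the draft IG:

    import os
    from collections import Counter

    def chi_square_byte_statistic(raw):
        # Chi-square goodness-of-fit statistic of byte frequencies
        # against a uniform distribution over 256 values.
        expected = len(raw) / 256
        counts = Counter(raw)
        return sum((counts.get(b, 0) - expected) ** 2 / expected
                   for b in range(256))

    # With 255 degrees of freedom, a statistic far above ~255 flags
    # heavily biased raw data; uniform-looking data lands near 255.
    print(chi_square_byte_statistic(os.urandom(65536)))

Note the asymmetry, which the next paragraph develops: a large statistic is evidence of a problem, but a small one is not evidence of sufficient entropy.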

In theory, a thorough analysis of the entropy source, coupled with statistical tests on the raw data, is absolutely necessary to gain some assurance about the entropy that plays such a vital role in supporting the security functionality of IT products. While statistical tests are useful for detecting patterns in the input data, and hence for flagging low-entropy cases, passing them does not at all prove that there is sufficient entropy in the input data. For example, a sequence of 20-byte strings obtained by consecutively applying the SHA-1 function as a pseudo-randomizer to an initial all-zero 20-byte string may well pass all sorts of statistical tests, but it is obvious that there is no entropy in the all-zero starting value. Therefore, statistical tests alone cannot justify the seeming randomness of the output strings. Statistical tests performed on conditioned data are even further removed from reflecting the entropy of the initial random value. If one is seriously investigating the entropy source to gain a certain level of assurance regarding its quality, then the requirements set forth for CC evaluation or FIPS validation are appropriate.
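
The zero-entropy example above is easy to reproduce. The following sketch (the function names are mine) iterates SHA-1 over an all-zero 20-byte seed to produce a completely deterministic stream, then shows that it passes a simple monobit check; it would fare similarly on many other statistical tests:

    import hashlib

    def sha1_chain(n_blocks):
        # Iterate SHA-1 starting from 20 zero bytes: random-looking
        # output, but zero entropy by construction.
        state = bytes(20)
        out = bytearray()
        for _ in range(n_blocks):
            state = hashlib.sha1(state).digest()
            out += state
        return bytes(out)

    def monobit_fraction(data):
        # Fraction of 1-bits; near 0.5 for "random-looking" data.
        ones = sum(bin(b).count("1") for b in data)
        return ones / (8 * len(data))

    stream = sha1_chain(5000)          # 100,000 bytes, zero entropy
    print(monobit_fraction(stream))    # ~0.5: the monobit check passes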

However, the entropy analysis requirements stated above impose an enormous burden on vendors and labs, to the extent that they are out of balance (in terms of effort expended) compared to other requirements; in some cases, it may not even be possible to meet them.

The TOE Security Assurance Requirements specified in Table 2 of the NDPP are (roughly) equivalent to Evaluation Assurance Level 1 (EAL1) per CC Part 3. This leads to a rather bizarre phenomenon: the NDPP does not require design documentation of the TOE itself, yet its Annex D does require design documentation of the entropy source, which is often provided by the underlying Operating System (OS). Suppose that a TOE runs on Windows, Linux, AIX, Solaris, and so on, some of which may utilize cryptographic acceleration hardware (e.g., Intel processors supporting the RDRAND instruction). In order to claim NDPP compliance and succeed in the CC evaluation, the vendor is obligated to provide the design documentation of the entropy sources from all those various operating systems and/or hardware accelerators. This is not only a daunting task but a mission impossible, because the design of some entropy sources is proprietary to the OS or hardware vendors.

Vendors pursuing cryptographic module validation under FIPS 140-2 face the same challenge. While software modules often rely on the operational environment to provide an entropy source, hardware modules may use a third-party entropy source in an Integrated Circuit (IC). But regardless of whether the module is hardware or software based, the design documentation of the third-party entropy source is often not available. In addition, there is often no externally accessible interface to the third-party entropy source through which the raw, non-conditioned random data could be extracted for statistical testing. These interfaces are deliberately kept inaccessible by the security architecture: if they were exposed, they could become an attack surface susceptible to malicious manipulation of the entropy source.

In cases where the entropy source comes from an open source OS such as Linux, or is perhaps even designed by the vendors themselves, the vendors may be able to provide the design documentation and raw, non-conditioned data for testing. However, this places a heavy burden on the testing labs, which must justify their methodology (e.g., the selection of statistical tools) and then provide an analysis based on that methodology. Many labs raised concerns at the ICMC that a task of this nature requires mathematicians with doctoral degrees and goes beyond the scope of the conformance testing to which the cryptographic module validation program is bound.

As we can see, there is a giant leap from stating the entropy source requirements to fulfilling them. Asking each vendor and each lab, for each CC evaluation or FIPS validation, to meet the entropy source requirements as stated in Annex D of the NDPP or in the draft FIPS 140-2 IG demands a monumental effort, and even then the intended result may remain beyond reach.

Instead of requiring vendors and labs to find solutions for the entropy issue on their own, NIST should play a leading role not only in setting the requirements but also in establishing a path to meet them. Vendors and labs can join this NIST-led effort to put the necessary infrastructure in place before they are expected to evaluate the quality of the entropy. Here are some thoughts on how to establish such an infrastructure:
  • NIST may hold open competitions for acceptable entropy sources and entropy collection designs, with reference implementations, in commonly seen categories such as linear feedback shift registers (LFSRs), noisy diodes, thermal sampling, ring oscillator jitter, CPU clock readings, and various human-induced measurements (e.g., the time intervals between keystrokes); a toy sketch of one such collection-and-estimation step follows this list. The end result would be a list of NIST-recommended entropy sources and their corresponding entropy collection mechanisms. Just like the NIST-approved algorithm standards, a NIST-approved entropy source standard would regulate entropy source design and implementation.
  • NIST should set up test criteria (e.g., operational conditions, prerequisites, variable lengths) and provide test tools to the accredited testing labs for validating entropy sources. Just like the Cryptographic Algorithm Validation Program (CAVP), this could be an Entropy Source Validation Program (ESVP).
  • NIST would maintain a validation list of all validated entropy sources.
  • CMVP and NIAP (or perhaps even CCRA members) would reference the validated entropy source list.
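
To illustrate the kind of building block such competitions might standardize, here is a toy sketch of one of the categories mentioned above (CPU clock readings): collect raw timing-jitter samples and apply a crude most-common-value min-entropy estimate in the spirit of SP 800-90B (without the confidence-interval adjustment the standard applies). The function names and parameters are my own illustrative assumptions; a real design would characterize the platform and condition the output.

    import math
    import time
    from collections import Counter

    def sample_clock_jitter(n):
        # Raw samples: low byte of the delta between consecutive
        # high-resolution clock reads (platform-dependent jitter).
        samples = []
        prev = time.perf_counter_ns()
        for _ in range(n):
            now = time.perf_counter_ns()
            samples.append((now - prev) & 0xFF)
            prev = now
        return samples

    def mcv_min_entropy(samples):
        # Crude most-common-value estimate of min-entropy per sample,
        # in bits: -log2(proportion of the most frequent value).
        p_max = Counter(samples).most_common(1)[0][1] / len(samples)
        return -math.log2(p_max)

    raw = sample_clock_jitter(10000)
    print("crude estimate: %.2f bits/sample" % mcv_min_entropy(raw))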

With this infrastructure in place, the steep entropy requirements are broken down into several manageable steps. OS vendors and IC vendors that provide an entropy source in their products would be motivated to undertake entropy source validation and get their products onto the NIST validation list. Vendors who need to use a third-party entropy source could consult the NIST validation list and make an appropriate selection. Labs would apply the provided test methodology and tools when testing an entropy source producer, and check the validation list when evaluating an entropy source consumer. With this infrastructure established, testing entropy sources and maintaining the validation list become manageable steps for labs and vendors to follow.

One may say that this sounds like a plan, but that it is easier said than done. I hope that, with NIST's openness and NIAP's collaborative approach, it is possible to rescue vendors and labs from the currently impossible entropy requirements.

By: Dr. Yi Mao

Sunday, September 15, 2013

Riding the tiger


The 14th ICCC is now over. 

As you know, we were hoping to see a new CCRA announced, but it seems that was an over-optimistic expectation. No new version of the CCRA has been signed, and it seems that there are still open issues and matters of interpretation that need to be resolved, and of course the long and winding road of ratification by each of the nations.




The good news is that there is an agreement in principle that a new CCRA is needed.

In fact, during the conference we heard several estimates of tasks and milestones on this topic from the CCDB and CCMC chairs. Since we suggested in a previous blog post that a simple Gantt chart might be useful for visualizing this, we present a simple chart here covering both the optimistic and the pessimistic scenarios. The new CCRA will probably be with us somewhere between the best and worst case, i.e., sometime between the beginning of 2016 and mid-2018.

 
We have previously raised some of our concerns relating to the CCRA vision in our blog. 
But, and this is important, the impression we gained from this conference is that some of these issues are at long last being discussed with the CCUF much more actively and openly than ever before, both in the formal presentations to conference attendees and on a one-to-one basis with participants. We can hope that with this openness also comes a little more understanding of the concerns and positions of all CCRA nations, both new and old, as well as of the other stakeholders represented through the CCUF. The opening and closing presentations of the CCMC and CCDB chairs surprised us.

This time we were positively surprised. Thank you for listening.

We also heard, at last, the confusion caused by mixing up the CCMC vision and national policies being addressed through clearer communication about what falls under the purview of the CCMC and what is a national approach. One contribution to this clarity is that the CCDB and the national schemes are using recently coined terminology more precisely, instead of reusing terms for similar but nevertheless different concepts. It is vital that similar yet different terms are clearly distinguished if we are to avoid wasting time on fear, uncertainty, and doubt.

Some highlights of the conference:

•    The collaboration between the CCRA community and the CCUF was very evident. We also heard from both the CCMC and the CCUF a clearly stated intent to involve the end-user community (the assurance consumers) in the future.

•    The CCUF is growing, leading the community, becoming a force for change, gaining momentum, able to move relatively quickly, and proving to be a much-respected organization.

•    Although we are not allowed to see any of the 17 drafts of the proposed CCRA agreement, we were made aware that, with so many revisions, the discussions have involved a lot of hard work...

•    The transition period for the new CCRA is 36 months once it is fully signed. Alas, it is estimated that it will take at least a year, and maybe two, before all the nations complete their bureaucratic dance.

•    As of this moment we have no available cPPs. 

We believe that the key cPPs need to be in place as soon as the CCRA is signed, and a great many more must be in place before the 36-month transition period ends.

The CCDB chair estimated that 10, 20, or more cPPs might be possible by next year. Let's see if that happens.


•    Several schemes have expressed that they will continue to do medium- and high-assurance certifications while at the same time participating in cPP development. This means that there will not be separate high- and low-assurance schemes, but schemes that are committed to both and will allow end users to select the assurance they need.

•    India is a new CCRA certificate authorizing member. This is important, not only because India is a large nation, but because they have been the only nation so far to ask for certification of telecom products.

...and some lowlights:

•    The CCRA has 26 member nations all over the world. So how is it that we had a marketing panel consisting only of U.S. vendors, discussing primarily how to market the CC to the U.S. DoD? The CCUF, although dominated by the U.S., must take great care to make sure that all the CCRA nations are properly represented. An important working group like this, composed of members from only one nation, is not credible.

•   One thing to note is that the CCUF has no way to move things forward without the CCDB authorizing it. The CCUF can suggest a cPP, but only the CCDB can say "yes" or "no."


Summary

Not only did we receive positive comments on our "activism," but interestingly, references to the atsec blog were made in plenary and formal presentations, and by many individuals, at least once a day during the conference. We heard little dissent about our blog, although we recognize that an approach relying on informal public discussion is a difficult one for government folks to contribute to.

It will be a long ride with the tiger. The discussions, development, and implementation of the new CCRA framework will not be over soon. During this process it is important to understand not only the technical underpinnings of the Common Criteria, but also the technical and political issues involved in the standardization processes.

Finally, it would be appreciated if the CCDB could agree on the location of the next ICCC, at least “in principle."

By Staffan Persson

P.S. If anyone is interested in receiving the atsec material we showed at our conference booth (pictures and clips), as well as our presentations and other material, please send an email request to info@atsec.com.


Monday, August 19, 2013

What I would like to hear (and not) at the next ICCC...

After 13 years, the International Common Criteria Conference (ICCC) returns to the U.S. The first three-day conference took place at the Baltimore Convention Center on May 23-25, 2000, so this almost looks like the end of a cycle. In the year 2000, information assurance and the use of the Common Criteria standards were firmly on the agenda of all stakeholders: government agencies, vendors, and users.

The need for mutual recognition was strongly voiced and recognized, serving as a key driver for the development of the standards and as one of the core principles of the Common Criteria.

Thirteen years later, the community includes 16 certificate-issuing nations and a further 10 consuming nations that have subscribed to the principles of the CCRA. Close to 80 vendor companies have registered with the CCUF, and nearly 2,000 certificates have been issued under the auspices of the CCRA.

The establishment and management of something like the CCRA is ambitious. It is a difficult task to lead the management of such an arrangement between nations, which requires supervision of proposed improvements and redefined procedures.
 
During the years following 2000, we have seen opinions on information assurance changing, but throughout this change, all of the stakeholders agreed that it was important to keep the structure intact.

"Change your opinions, keep to your principles; change your leaves, keep intact your roots."

-Victor Hugo

Although the Common Criteria standard has evolved over the years, the last real change happened more than five years ago. Today, what is at stake are the basic principles, the roots of the Common Criteria standard.

Did the CCRA fail?


What I would NOT like to hear at the ICCC this year:

I would not like to see and hear just a "Dog and Pony Show," as David Martin (the CCDB chair) called his duet with Dag Stroman (the CCMC chair) during the certificate handover ceremony at a circus in Paris. This year in Orlando, the ICCC should avoid a "Goofy and Donald Duck Show," and we certainly do not want or need a Mickey Mouse standard.


An old chair
What I want is a presentation on the progress of the standard and the CCRA framework accompanying it, instead of another content-free presentation from David Martin in his role as the CCDB chair.

For example, I recall a presentation from David Martin summarizing all the problems that CC Version 4 was going to address. Those problems have been brought up by labs and vendors for many years, and yet the CCDB has not addressed a single one of them in the vision statement. Have those problems magically disappeared because of the "vision?" Is there no longer a need to address the problems identified over the years, just because some people have had a "vision?"

Last year, during the ICCC presentation titled "Common Criteria in 5 years," the only message I remember is David Martin's opinion that he and Dag Stroman wouldn't be there in 5 years! I am sure this can be achieved without waiting so long!



What I would like to hear about at ICCC: 

I would like the CCMC chair, Dag Stroman, to say something representative of all of the CCRA members and not just the same few schemes, and I would like the CCDB chair, David Martin, to be open and transparent about how the standards are being developed.

Where are we heading with the standard and the future of mutual recognition through the CCRA?

We want to know, and we are paying to know. I think, after 13 years, the CCDB owes us a correct interpretation of the true direction of mutual recognition through the CCRA.

Although I speak for a lab, it is not just the business of the labs that is at stake; the vendors need to know what they have to do: whether they need to evaluate once or more than once, and if so, where and against which standard.
This is a key point for the various signatory nations too, as they refine their national policies and strategies.

What is the status of the CCRA? 
We need to know. Please be honest and transparent.

The UK is doing CPA; is this recognized by the CCRA? Is CPA a UK-only thing, or is it recognized in the U.S., Canada, and Australia? The labs should know if they need to set up shop in the UK, and the vendors need to know if they have to pay for a special evaluation in the UK, and why.


India is asking for telco equipment to be evaluated only by Indian labs using ISO/IEC 15408. This is technically outside the CCRA scope, but certainly against the spirit of the CCRA. Their policies go beyond telecommunication equipment: even an operating system or database used for accounting must also be evaluated by an Indian lab, just because it is used in a telco product, even if it has been evaluated previously.
(correction 2013-09-02)



These are just two examples of how "old" and "new" schemes are trying to change the roots of the Common Criteria standard and the CCRA, in direct opposition to the principles that the users, vendors, and countries originally subscribed to.


How do scheme-specific technical communities, as opposed to international technical communities, affect the development of cPPs?

What about SOGIS? 

How about a summary of real-world policies on mutual recognition from each of the schemes?


Where are we with the cPPs?

Dag Stroman said last year that "cPPs are like French wine: the older they get, the better they become."

The problem is that we are still waiting for the grape harvest to take place! Of course, with wine there are good years and bad years, and much depends on the quality of the grapes. Not every wine is good, even after several years of waiting. Extending this metaphor to cPPs, I would also observe that a good vintage French wine is expensive, sometimes very expensive.

 
Vendors, Labs, and People
The vendors too should come out and be heard! Continuing this dance with the different schemes is not helping. Complying with sixteen different cheap, low-assurance schemes is unlikely to be cheaper or quicker than complying with one good one.

If the main objective of the vision is to put the labs out of business and/or put a number of information assurance professionals out of a job, the vision is succeeding! 

The vision does not take into account the people who have devoted their careers to information assurance. They are not in this field to get rich; they are in it because they believe in it, and they will continue to stay around and come together. We now have a chance to direct all our efforts in the right direction under one umbrella. Once the umbrella is gone, anything can and will happen.

Conclusion

A last message to the "Visioners," from the business side: the CC validation business is down heavily, and the trend continues downward. What is more disappointing, however, is being unable to guide our customers on what to do, how to do it, and particularly how to avoid being "blackmailed" by various schemes into doing something more or something local. The vision promised cPPs developed by Technical Communities that would solve the problems. But so far the CCDB is still struggling over the criteria just for a Technical Community to be accepted by the CCDB. Being unable to resolve this fundamental prerequisite within a year does not give me much hope that the "vision" is anything more than an illusion.

If you want the vision to succeed, then you need to plan on how to do this without killing the golden goose!

With this in mind, I will end with a quote from the inventor of Mickey Mouse: "All the adversity I've had in my life, all my troubles and obstacles, have strengthened me... You may not realize it when it happens, but a kick in the teeth may be the best thing in the world for you." -- Walt Disney
 
Cheers.

...sal