Collaboration and Openness to the Rescue of Entropy
This past September was my conference month. I first went to
the 14th International Common Criteria Conference (ICCC)
in Orlando, Florida, and then a week later I was at the 1st
International Cryptographic Module Conference (ICMC) in Gaithersburg,
Maryland.
The theme of the ICCC this year was a collaborative
approach. The conference directed the CC community to adopt the Collaborative
Protection Profile (cPP) approach, shifting the task of protection profile
development and maintenance from the CC Development Board (CCDB) to various
Technical Communities (TCs). TCs are expected to gather experts from
industry, government, and academia to ensure that the cPPs stay current in a world
of fast-evolving technologies. The CC User Forum (CCUF) will play an
increasingly important role in facilitating communication between cPP consumers
and cPP developers.
The opening speech of the ICMC was delivered by Charles H.
Romine, Director of the Information Technology Laboratory at NIST. Openness at
NIST was the core of the speech. NIST has held open competitions for AES and
SHA-3. In the interest of openness, NIST even reopened the public comment period
for the SP 800-90 series of crypto standards to give the public a second
opportunity to review and comment on them. NIST acknowledges the
challenges that the Cryptographic Module Validation Program (CMVP) is facing
(e.g., a review queue several months long) and invited ideas for improvement
from the audience and the public.
One common feature of both conferences was the
heated discussion of RNG and entropy. The ICMC had three presentations devoted
to this topic:
- How Random Is Random?, Helmut Kurth, Chief Scientist, atsec information security
- Entropy: Order from Disorder, Tim Hall and Apostol Vassilev, NIST
- SP 800-90 – Reviewing The Standard, Stephan Mueller, Principal Consultant and Evaluator, atsec information security
The ICCC had the following presentations on general cryptography,
as well as RNG and entropy in particular:
- Cryptography and Common Criteria, Chris Brych (Safenet Inc.) and Ashit Vora (Cisco Systems Inc.)
- Modeling of Cryptographically Secured Channels (IPsec, SSH, TLS, DTLS) in ST/PP, Fritz Bollmann, German Federal Office for Information Security (BSI)
- A New Evaluation Method for RNG, Ülkühan Güler, National Research Institute of Electronics and Cryptology, Turkey
- Entropy Sources – Industry Realities and Evaluation Challenges, Sonu Shankar, Cisco Systems, Inc.
The number of presentations at both conferences was not
surprising, since more and more Targets of Evaluation (TOEs) rely on
cryptographic support (i.e., class FCS functionality) for user authentication,
data protection, secure communication, and trusted path/channels. Assessing the
security strength of cryptographic algorithms and keys has become indispensable
for the vulnerability assessment of a TOE.
The Random Number Generator (RNG) that provides the random source for a
key determines the quality of the generated keys. A predictable RNG could lead to the downfall
of the entire system. To avoid this Achilles’ heel, it is crucial to have a
well-designed and properly tested RNG and entropy source.
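As a toy illustration of how a predictable RNG dooms key generation (my own sketch, not from any standard; the seed value and key size are arbitrary), compare a key derived from a deterministically seeded PRNG with one drawn from the OS entropy source:

```python
import os
import random

# BAD: a deterministic PRNG seeded with a guessable value (a timestamp, a
# process ID, ...). An attacker who can enumerate the seed space can simply
# regenerate every "secret" key this produces.
weak_rng = random.Random(1234)       # illustrative guessable seed
weak_key = weak_rng.randbytes(32)    # 256-bit key, fully determined by the seed

# BETTER: draw key material from the operating system's entropy source.
strong_key = os.urandom(32)
```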
COLLABORATION?
Both CC evaluations and FIPS 140-2 validations require scrutiny of the entropy source. For example, Annex D of the U.S. Network Device Protection Profile (NDPP) v1.1 provides requirements for entropy documentation and assessment. The documentation of the entropy source is expected to be detailed enough to include the following:
- Design Description: Documentation shall include the design of the entropy source as a whole, including the interaction of all entropy source components.
- Entropy Justification: There should be a technical argument for where the unpredictability in the source comes from and why there is confidence in the entropy source exhibiting probabilistic behavior (an explanation of the probability distribution and justification for that distribution given the particular source is one way to describe this).
- Operating Conditions: Documentation will also include the range of operating conditions under which the entropy source is expected to generate random data.
- Health Tests: All entropy source health tests and their rationale will be documented (a minimal sketch of one such test follows this list).
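To make the Health Tests item concrete, the sketch below shows a repetition count test in the spirit of the SP 800-90B draft: it flags a source that gets stuck emitting the same value. The sample format, the claimed per-sample min-entropy, and the false-positive exponent are illustrative assumptions, not requirements from the NDPP.

```python
import math

def repetition_count_test(samples, min_entropy_per_sample, alpha_exp=20):
    """Fail when one value repeats consecutively more often than plausible."""
    # Cutoff C = 1 + ceil(alpha_exp / H): for a source honestly delivering H
    # bits of min-entropy per sample, a run of C identical samples occurs
    # with probability below 2**-alpha_exp, so such a run signals a defect.
    cutoff = 1 + math.ceil(alpha_exp / min_entropy_per_sample)
    run_value, run_length = None, 0
    for s in samples:
        run_length = run_length + 1 if s == run_value else 1
        run_value = s
        if run_length >= cutoff:
            return False  # health test failure: the source looks stuck
    return True
```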
The CMVP is working on an Implementation Guidance (IG) document on
entropy analysis. The draft version has already been circulated among the accredited
labs for review and comments. Vendors can also provide feedback through their
representing labs. Although the CMVP is still incorporating the
feedback received from labs and vendors, the following requirements stated in
the draft IG will likely remain unchanged in the final version:
- The documentation shall contain a detailed logical diagram which illustrates all of the components, sources, and mechanisms that constitute the entropy source.
- The statistical analysis has to be performed on the raw, non-conditioned data. The testing lab is responsible for choosing the statistical tests and justifying their selection. Further, the lab shall explain what the result of each test means and how this result can be justified (a sketch of one such analysis follows this list).
- A heuristic analysis of the entropy source, along with justifications of the entropy claims based on that analysis, shall always be included in the test report.
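As an example of the kind of statistical analysis a lab might perform on raw, non-conditioned data, here is a minimal sketch of a most-common-value min-entropy estimate along the lines of the SP 800-90B draft. Treating each raw sample as a single symbol, and the 99% confidence bound, are assumptions chosen for illustration.

```python
import math
from collections import Counter

def mcv_min_entropy(samples):
    """Conservative per-sample min-entropy estimate from the most common value."""
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # Upper-bound the true probability of the most common value with a
    # 99% confidence interval, then convert to min-entropy bits.
    p_upper = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_upper)
```

Run over a few thousand raw samples, this yields a conservative per-sample entropy estimate that a lab could relate to the module's entropy claims.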
In theory, a thorough analysis of the entropy source, coupled
with statistical tests on the raw data, is absolutely necessary to gain
assurance of the entropy that plays such a vital role in supporting the
security functionality of IT products. While statistical tests are useful
for detecting patterns in the input data, and hence for flagging low-entropy
cases, passing results do not at all prove that there is sufficient
entropy in the input data. For example, a sequence of 20-byte strings
obtained by consecutively applying the SHA-1 function, as a pseudo-randomizer, to an
initial all-zero 20-byte string may well pass all sorts of statistical tests,
but there is obviously no entropy in the all-zero string to
start with. Therefore, statistical tests alone cannot justify the seeming
randomness of the output strings. Statistical tests performed on
conditioned data are even further removed from reflecting the actual entropy of
the initial random value. If one is seriously investigating an entropy source
to gain a certain level of assurance regarding its quality, then the
requirements set forth for CC evaluation or FIPS validation are appropriate.
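The all-zero SHA-1 example is easy to reproduce. The sketch below (my own illustration; the iteration count and the simple monobit check are arbitrary choices) generates such a sequence and shows that its bit balance looks exactly as a random stream should, even though anyone can regenerate the stream from the known all-zero seed.

```python
import hashlib

state = bytes(20)    # the all-zero 20-byte seed: zero entropy by construction
stream = bytearray()
for _ in range(1000):
    state = hashlib.sha1(state).digest()  # iterate SHA-1 as a pseudo-randomizer
    stream.extend(state)

# Monobit frequency check: truly random data has ~50% 1-bits. So does this.
ones = sum(bin(b).count("1") for b in stream)
print(f"fraction of 1-bits: {ones / (8 * len(stream)):.4f}")
```

Statistical balance, in other words, is not evidence of entropy.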
However, the entropy analysis requirements stated above
impose an enormous burden on vendors as well as labs, to an extent that is
out of balance (in terms of effort expended) with the other
requirements; in some cases, it may not even be possible to meet them.
The TOE Security Assurance Requirements specified in Table 2
of the NDPP are (roughly) equivalent to Evaluation Assurance Level 1 (EAL1) per CC
Part 3. This leads to a rather bizarre phenomenon: the NDPP does not require design
documentation of the TOE itself, yet its Annex D does require design
documentation of the entropy source, which is often provided by the underlying
Operating System (OS). Suppose that a TOE runs on Windows, Linux, AIX,
Solaris, and so on, some of which may utilize cryptographic acceleration
hardware (e.g., Intel processors supporting the RDRAND instruction). In order to
claim NDPP compliance and succeed in the CC evaluation, the vendor is obligated
to provide the design documentation of the entropy source for all those
operating systems and/or hardware accelerators. This is not only a daunting
task, but also mission impossible, because the design of some entropy sources
is proprietary to the OS or hardware vendors.
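The vendor's limited visibility is easy to see at the code level: an application-layer TOE typically obtains random bytes through a single generic OS call, and that same call is backed by a different, often undocumented, entropy source design on every platform. A minimal illustration:

```python
import os

# Behind this one call may sit the Windows CNG RNG, the Linux kernel random
# pool (possibly fed by hardware such as RDRAND), or the AIX/Solaris kernel
# sources. The caller sees a uniform interface, never the underlying design.
seed_material = os.urandom(48)
```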
Vendors pursuing cryptographic module validation under
FIPS 140-2 face the same challenge. While software modules often rely on the
operational environment to provide an entropy source, hardware modules may use
a third-party entropy source provided in an Integrated Circuit (IC). But regardless
of whether the module is hardware or software based, the design documentation
of the third-party entropy source is often not available. In addition,
there is usually no externally-accessible interface to the third-party entropy
source that would allow collecting the raw, non-conditioned random data for
statistical testing. Such interfaces are deliberately omitted from the security
architecture: if they existed, they could become an attack surface susceptible
to malicious manipulation of the entropy source.
In cases where the entropy source comes from an open source
OS such as Linux, or is perhaps even designed by the vendors themselves, the vendor may
be able to provide the design documentation and raw, non-conditioned data for
testing. However, this places a heavy burden on the testing labs, which must
justify their methodology (e.g., the selection of statistical tools) and then provide
an analysis based on that justified methodology. Many labs raised concerns at
the ICMC that a task of this nature requires mathematicians with doctoral
degrees and goes beyond the scope of the conformance testing to which the
cryptographic module validation program is bound.
As we can see, there is a giant leap from the requirements for the
entropy source to their fulfillment. Asking each
vendor and each lab, for each CC evaluation or each FIPS validation, to meet the
entropy source requirements stated in Annex D of the NDPP or in the draft
FIPS 140-2 IG is a monumental task whose cost is not commensurate with the
expected benefit, and even then the intended result may remain beyond reach.
Instead of requiring vendors and labs to find solutions
for the entropy issue on their own, NIST should play a leading role not only in
setting the requirements but also in establishing a path to meeting them.
Vendors and labs can join this NIST-led effort to put the
necessary infrastructure in place before being expected to
evaluate the quality of entropy sources. Here are some thoughts on how to establish
such an infrastructure:
- NIST may hold open competitions for acceptable entropy sources and entropy collection designs, with reference implementations, in commonly-seen categories such as linear feedback shift registers (LFSRs), noisy diodes, thermal sampling, ring oscillator jitter, CPU clock readings, and various human-induced measurements (e.g., the time intervals between keystrokes). The end result would be a list of NIST-recommended entropy sources and their corresponding entropy collection mechanisms. Just like the NIST-approved algorithm standards, a NIST-approved entropy source standard would regulate entropy source design and implementation.
- NIST should set up test criteria (e.g., operational conditions, prerequisites, variable lengths) and provide test tools to the accredited testing labs for validating entropy sources. Modeled on the Cryptographic Algorithm Validation Program (CAVP), this could be an Entropy Source Validation Program (ESVP).
- NIST would maintain a validation list of all validated entropy sources.
- CMVP and NIAP (or perhaps even CCRA members) would reference the entropy source validation list.
With this infrastructure in place, the steep entropy
requirements are broken down into several steps. OS vendors and IC vendors
that provide an entropy source in their products will be motivated to
undertake entropy source validation and make their products available on the
NIST validation list. Vendors who need to use a third-party entropy
source can look up the NIST validation list and make an appropriate selection.
Labs testing an entropy source producer would use the provided test
methodology and tools, while labs evaluating an entropy source consumer would
check the validation list. With such an infrastructure established, testing
entropy sources and maintaining the validation list become manageable steps
for labs and vendors to follow.
One may say that this sounds like a plan, but is easier
said than done. I hope that, with NIST’s openness and NIAP’s collaborative approach,
it will be possible to rescue vendors and labs from the currently impossible
entropy requirements.
By: Dr. Yi Mao