Friday, February 28, 2014

Call for Papers for the Second International Cryptographic Module Conference

Mark Your Calendar: ICMC 2014, November 19-21, Hilton Washington D.C., Rockville, MD

ICMC brings together experts from around the world to confer on the topic of cryptographic modules, with emphasis on their secure design, implementation, assurance, and use, referencing both new and established standards such as FIPS 140-2 and ISO/IEC 19790.
We are focused on attracting participants from the engineering and research communities, test laboratories, government organizations, the procurers, deployers, and administrators of cryptographic modules, and academia. Our program consists of one day of workshops and tutorials, followed by two days of 30-minute presentations (plus 15 minutes for questions). We solicit proposals for high-quality papers and relevant workshops that will be of interest to the community involved with cryptographic modules, on the topics below. Visit www.ICMConference.org for complete information.

Topics

  • Management of Cryptographic Modules in the Field
  • Standards: Including FIPS 140-2, ISO/IEC 19790, FIPS 140-3
  • Physical Security and Hardware Design
  • Key Management
  • Random Number Generation
  • Side Channel Analysis, Non-invasive Attacks
  • Choice of and Implementing Cryptographic Algorithms
  • Cryptographic Modules Implemented in Open Source
  • Hybrid Systems, Embedded Systems
  • Tools and Methodologies
  • Other Cryptographic Standards
The committee favors vendor-neutral presentations that focus on the practical design, testing and use of cryptographic modules. Product vendors are encouraged to recruit clients and partners who are front-line implementers as presenters.

Dates
  • Abstracts: April 10, 2014
    All prospective authors must submit their abstracts and workshop proposals using this link: http://icmconference.org/?page_id=24
  • Review and comments: May 18, 2014
  • Acceptance notifications: July 17, 2014
  • Final versions due: October 23, 2014
Workshops/Tutorials: November 19, 2014
Presentation of papers: November 20-21, 2014

For any questions regarding submissions or the conference in general, please contact us at info@icmconference.org.

Presented with the cooperation of:
CMUF
Cryptographic Module User Forum

Friday, February 14, 2014

Some Efficiencies in the O-TTPS Accreditation Program


One of the nice things about working with the OTTF and The Open Group™ in developing their O-TTPS Accreditation Program has been the emphasis that the forum members placed on efficiency within the program.

One of the major gripes in the IT assurance community is that we seem to do the same things over and over again.

That costs us precious resources: people, money, and time, as many members of the OTTF pointed out.

Mary Ann Davidson of Oracle, as usual, expressed the problem very well: "Doing the same thing twice or more unintentionally usually ends up with worse security as we use scarce resources on duplicative measures."

For developers, how many times do they really have to get their processes checked to confirm, for example, that they use an automated configuration management system and that they have implemented access control for it? Such checks are made over and over again if they need to certify products under a variety of assessment programs.

For some of our customers, we have checked this close to a hundred times over the last decade.

Using the above example, the checks are made if a product from the developer needs FIPS 140-2, Common Criteria, O-TTPS assessment, etc., etc. It's both inefficient and expensive for the developers.
These overlaps are prevalent and extend to many of the organizations' processes, not just to configuration management.

Similarly, for mature IA assessment companies like atsec, we find that all of the various programs that we are involved with demand that we have certified management systems. The problem is that there are a variety of standards and a variety of certification programs to choose from.

In atsec's case, this has meant that, across our small company of fewer than 100 people, we have to manage, endure, and pay for many management system audits. It is a cost of doing business, but a real pain in the... .

We have:
  • ISO/IEC 17025: twice for NVLAP (don't ask!), and by BSI, FMV, and ISCCC;
  • Technical competence audits: CMVP, NIAP, GSA, BSI, CSEC, O-TTPS;
  • PCI SSC: (not based on international standard);
  • ISO 9001 (Customer demand);
  • ISO/IEC 27001  (Customer demand);
  • and soon, ISO/IEC 17020.
None of our auditors are allowed to accept the results from the others. Granted, each scheme requires a different technical competence, but our document and record control, HR, training, resource management, corrective and preventive actions, internal audit, calibration, etc. are the same for all of the programs to which we belong.

atsec customers can be assured that our management system is well audited!


...back to the O-TTPS Accreditation Program:

The forum saw the chance to try and address this issue: The notion of carefully reusing existing assurance was one of the factors we considered during each stage of developing the new O-TTPS Accreditation Program.

Here are some of the efficiencies that we identified and that the Accreditation Program has implemented:
  • The ability for O-TTPS assessors to reuse existing audit reports presented by the developer. These might include, where relevant, ISO 9001, security audits, Common Criteria site visits, etc.
    Note that there are some careful provisions, which are detailed in the Assessment Procedures.
  • Currently under development: The provision of mappings, allowing the work done for existing product certifications, such as Common Criteria, to be more easily mapped to the O-TTPS requirements.
  • For prospective Recognized Assessor companies and assessors, the ability to accept a variety of existing certifications of management systems and assessor qualifications that address the core skills needed as information assurance (IA) professionals. This allows The Open Group to concentrate only on the additional specialised skills needed for the O-TTPS accreditation program.
Although these are simple policies, they make the program more efficient, reducing the overhead of unnecessary duplication so that accreditation is not overburdened with cost. They reduce the costs of assessment for all parties involved: the Accreditation Authority, the Recognized Assessors, and the organizations undergoing accreditation.

Not only that, but we can concentrate more on the important issue of integrity in the COTS ICT Supply Chains.

Thank you to the OTTF for living in the real world! :)

~ Fiona Pattinson

- "The Open Group" is a trademark of The Open Group

Monday, February 3, 2014

atsec and supply chain security

Supply chain assurance: mitigate the threats of counterfeit and tainted products, globally. Quite a challenge!

To do that we'd need some of the best subject matter experts in the world.

Build a new industry-led open standard and an assessment program that can demonstrate to integrators and acquirers of COTS ICT products that their providers are Trusted Technology Providers™: providers who can demonstrate that they follow industry best practices in mitigating the threats of counterfeit and maliciously tainted products.

Yes! We need some experts...

atsec has been working closely with The Open Group Trusted Technology Forum (OTTF) since it began in late 2010. For the last three years we've been dedicating a great many hours, alongside our OTTF colleagues (representatives from providers, acquirers, and assessors), all working together to progress an industry vision for an open voluntary standard and an associated accreditation program for trusted technology in relation to supply chain security.


Together, the OTTF have worked very hard.

We developed and published a white paper, a snapshot of the standard, and the O-TTPS standard itself. We have collaborated, presented and harmonized.

We have looked for and found consensus amongst many key organizations. It took blood, sweat and tears!
  • We worked with some of the major names in the IT industry for security assurance, including many of the major global players.
  • We worked with governments around the world. 
  • We worked through a time of change within (U.S.) government agencies.

Believe me, finding consensus in such a diverse and knowledgeable set of people was never going to be an easy task!


Finally, we have been working hard on an Accreditation Program and piloting the model in real world assessments. This way we had the chance to iron out some of the wrinkles before the accreditation program became public.

Well, the launch of the assessment program is finally here: February 3rd, 2014; three years after we began work. A short eternity perhaps, but an era that has demonstrated a great achievement.

As atsec worked on the pilots for the accreditation program, we gained experience in training and qualifying our assessor teams. We drew from those with experience in testing, evaluating, and of course assessing. Here is a quote from Courtney Cavness, one of our process assessment experts:
"Working as an Assessor on this pilot was a tremendous learning experience. I enjoyed applying my previous audit/evaluation experience to this new standard to help refine and enhance both the standard and the accreditation program."

Another of our senior assessors, King Ables, said  that: 

"This important new standard fills a gap in the spectrum of assurance measures available to product developers and end users. As an early Recognized Assessor, atsec participated in a pilot assessment to "test drive" the standard. This pilot allowed us to gain practical experience with O-TTPS as well as to provide suggestions for improvements based on our other security assessment experience."

The new Accreditation Program, which has been running in pilot mode since last year, already has  one completed accreditation under its belt. An achievement not to be sneezed at!

atsec thanks IBM for the opportunity to participate in the pilot phase and for assigning some of their senior experts in the field to the project, which helped it go very smoothly.

For a list of completed and approved accreditations and their scopes, see The Open Group's accreditation register.

Well-known global Trusted Technology Providers have placed enough credence in the program to actively participate. Two respected assessor companies in the field of information assurance also invested in helping to develop and test the program; they are the first "Recognized Assessors" working with The Open Group's independent O-TTPS Accreditation Authority to help providers "Build with Integrity," which in turn allows their acquirer and integrator customers to "Buy with Confidence."


With wide across-the-board participation in the OTTF, we heard a very important message: resources available for providing assurance are scarce and nobody wants to do or pay more than once for the same thing. I believe that the Accreditation Program has incorporated this requirement. It allows for appropriate re-use of assurance already gained.

The OTTF are continuing to work hard on this topic, forging appropriate alliances in the field of information security and information assurance.

Of course it remains to be seen if the program will engender enough trust and confidence to be useful assurance for those eager to have it, but so far all the indications are that it will.

Fiona Pattinson


Tuesday, November 12, 2013

ISO/IEC 27001


 Here at atsec we are fans of internationally recognized security standards that are closely scrutinized by security experts. There are "a ton" of national security standards or self-contrived private “seals of approval” for information security out there. However, national security standards usually do not work well for companies working internationally, because nobody outside of the country of origin cares about a local security standard. With “seals of approval” invented by security companies, the benefit depends completely on the trust your customers have in the company which created the seal.

Unfortunately, trust nowadays is in short supply…

The only available internationally recognized standard for Information Security Management is ISO/IEC 27001. The standard describes how information security can be implemented systematically. Please bear in mind, this is about INFORMATION security, not just technical IT security. So, ISO/IEC 27001 is not the typical “IT standard” which can be completely covered by your IT department. It requires direct involvement and supervision from business management to succeed.

ISO/IEC 27001 uses a risk-based approach, which helps to custom-tailor information security measures to the size and the risk situation of a company. Smaller companies or companies in a low risk market are not required to implement the same measures as companies facing high risks. This makes the standard achievable both for small companies and worldwide enterprises.

The goal of ISO/IEC 27001 is to implement an Information Security Management System (ISMS), which helps to organize security management in a consistent and structured way. This ISMS is not a technical system, but the sum of all security processes and documentation in a company.
   
ISO/IEC 27001 plays well with other security standards. It integrates without problems with other ISO standards. Common integrations are with ISO 9001 (Quality Management), ISO 14001 (Environmental Management), ISO/IEC 20000 (IT Service Management) and ISO 22301, formerly BS 25999 (Business Continuity Management). Other security requirements can also be seamlessly integrated into an ISMS based on ISO/IEC 27001, e.g., PCI-DSS or local data protection requirements.

Let’s have a look at the numbers for ISO/IEC 27001 certificates. The ISO survey 2012  shows a steadily increasing number of certificates for all regions. At the end of 2012, nearly 20,000 ISO/IEC 27001 certificates were issued. East Asia and Pacific (EAP) and Europe together are covering 85.6% of all ISO 27001 certificates. Europe leads in annual growth in absolute numbers in 2012, with 1,095 new certificates being issued.

Diagram from the 2012 ISO survey of certificates

North America, with 2.8% of all ISO/IEC 27001 certificates, is currently lagging behind the rest of the world; this is the result of a strong focus on national standards. Note that the preliminary version of the NIST-led Cybersecurity Framework has included ISO/IEC 27001 in its core of security management standards, alongside the national FISMA suite of standards (including SP 800-53) and COBIT, which is also well adopted in the U.S.
  
In September 2013, ISO released a new version of the standard, eight years after the release of ISO/IEC 27001:2005. This new standard is called ISO/IEC 27001:2013. The release of a new version of the standard is relevant for existing certificate holders, since after a transition period of 2 years all existing certificates must be migrated to ISO/IEC 27001:2013. Companies which are “nearly finished” with their preparation for ISO/IEC 27001 certification can (and should) perform the certification based on ISO/IEC 27001:2005.

If you are just planning to start your preparations, it will very likely be best to directly go for ISO/IEC 27001:2013 and save the time for a later migration.
 
The update of the standard will probably not directly lead to an even higher number of ISO/IEC 27001 certificates. The generally simpler requirements for Risk Management might entice some undecided companies, but most of the changes helped to tidy several legacy issues in the standard without dramatically changing the requirements. This shows that ISO/IEC 27001 is a standard which is actively developed and which is steadily aligned with the other ISO standards.

The events of this year emphasize the requirements for strong information security. Companies that offer IT services will be forced by their customers to prove a high level of security. Otherwise the customers will just switch to another service provider with a higher level of security. ISO/IEC 27001 helps companies to prove their commitment to information security and helps management to perform due diligence regarding information security.

Matthias Hofherr - atsec Munich

Wednesday, October 30, 2013

Collaboration and Openness to the Rescue of Entropy



This past September was my conference month. I first went to the 14th International Common Criteria Conference (ICCC) in Orlando, Florida and then a week later I was at the 1st International Cryptographic Module Conference (ICMC) in Gaithersburg, Maryland.

The theme of the ICCC this year was a collaborative approach. The conference directed the CC community to adopt the Collaborative Protection Profile approach (cPPs). The task of the protection profile development and maintenance is shifted from the CC Development Board (CCDB) to various Technical Communities (TCs). TCs are expected to gather experts from the Industry, Government, and Academia to ensure the cPPs stay current in the world of fast-evolving technologies. The CC User Forum (CCUF) will be playing an increasingly important role facilitating the communication between the cPP consumers and cPP developers.

The opening speech of the ICMC was delivered by Charles H. Romine, Director of the Information Technology Laboratory at NIST. The openness of NIST was the core of the speech. NIST has held open competitions for AES and SHA-3. NIST even reopened the public comment period for the SP 800-90 series of crypto standards, in the interest of openness, to give the public a second opportunity to review and comment on the standards. NIST acknowledges the challenges that the Cryptographic Module Validation Program (CMVP) is facing (e.g., a review queue several months long) and invited ideas for improvement from the audience and the public.

One of the common features of both conferences was the heated discussion of RNGs and entropy. The ICMC had three presentations devoted to this topic, and the ICCC featured several presentations on general cryptography, as well as RNG and entropy in particular.

The number of presentations at both conferences was not surprising, since more and more Targets of Evaluation (TOEs) rely on cryptographic support (i.e., class FCS functionality) for user authentication, data protection, secure communication, and trusted path/channels. Assessing the security strength of cryptographic algorithms and keys has become indispensable for the vulnerability assessment of a TOE. The Random Number Generator (RNG) that provides the random source for a key determines the quality of the generated keys. A predictable RNG could lead to the downfall of the entire system. To avoid this Achilles' heel, it's crucial to have a well-designed and properly-tested RNG and entropy source.
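To illustrate the Achilles' heel, here is a deliberately contrived sketch (not the design of any real product): if a "key" is derived from a non-cryptographic PRNG seeded with only 16 bits of entropy, an attacker can recover it by exhaustively enumerating the seed space.

```python
import hashlib
import random

def derive_key(seed: int) -> bytes:
    # Hypothetical key derivation: a non-cryptographic PRNG seeded with
    # a low-entropy value, then hashed into a 256-bit "key".
    rng = random.Random(seed)
    return hashlib.sha256(bytes(rng.getrandbits(8) for _ in range(32))).digest()

victim_seed = 40321                  # unknown to the attacker...
victim_key = derive_key(victim_seed)

# ...but drawn from a space of only 2**16 possibilities, so just enumerate it.
recovered = next(s for s in range(2**16) if derive_key(s) == victim_key)
print(recovered == victim_seed)      # True: the key is fully compromised
```

No strength of the hash function helps here; the effective key space is only as large as the entropy behind the seed.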


COLLABORATION?

Both CC evaluations and FIPS 140-2 validations require scrutiny of the entropy source.  For example, Annex D of the U.S. Network Device Protection Profile (NDPP) v1.1 provides requirements for entropy documentation and assessment. The documentation of the entropy source is expected to be detailed enough to include the following:
  • Design Description
    Documentation shall include the design of the entropy source as a whole, including the interaction of all entropy source components.
  • Entropy Justification
    There should be a technical argument for where the unpredictability in the source comes from and why there is confidence in the entropy source exhibiting probabilistic behavior (an explanation of the probability distribution and justification for that distribution given the particular source is one way to describe this).
  • Operating Conditions
    Documentation will also include the range of operating conditions under which the entropy source is expected to generate random data.
  •  Health Tests
    All entropy source health tests and their rationale will be documented.

The CMVP is working on an Implementation Guidance (IG) document on entropy analysis. The draft version has already been circulated among accredited labs for review and comments. Vendors can also provide feedback through their representing labs. Although the CMVP is still incorporating the feedback received from labs and vendors, the following requirements stated in the draft IG will likely remain unchanged in the final version:
  • The documentation shall contain a detailed logical diagram which illustrates all of the components, sources, and mechanisms that constitute the entropy source.
  • The statistical analysis has to be performed on the raw, non-conditioned data. The testing lab is responsible for choosing the statistical tests and justifying their selection. Further, the lab shall explain what the result of each test means and how this result can be justified.
  • A heuristic analysis of the entropy source, along with justifications of the entropy claims based on this analysis, shall always be included in the test report.
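To make the statistical side of these requirements concrete, here is a minimal sketch (my own simplification, not the CMVP's mandated method) of a most-common-value min-entropy estimator in the style of NIST SP 800-90B, applied to raw, non-conditioned samples:

```python
import math
from collections import Counter

def mcv_min_entropy(samples):
    # Most-common-value estimate: min-entropy per sample is -log2 of the
    # probability of the most frequent value. (SP 800-90B additionally
    # applies an upper confidence bound, omitted here for brevity.)
    p_max = max(Counter(samples).values()) / len(samples)
    return -math.log2(p_max)

uniform = list(range(16)) * 10            # ideal 4-bit source
biased = [0] * 900 + list(range(16)) * 6  # raw data stuck mostly at 0
print(mcv_min_entropy(uniform))           # 4.0 bits per sample
print(round(mcv_min_entropy(biased), 2))  # far below 4 bits
```

An estimator like this can flag a badly biased source, but as the next paragraph argues, it can never prove that a source has enough entropy.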

In theory, a thorough analysis of the entropy source, coupled with some statistical tests on the raw data, is absolutely necessary to gain some assurance of the entropy that plays such a vital role in supporting the security functionality of IT products. While the statistical tests are useful for detecting patterns in the input data, and hence for flagging low-entropy cases, passing results do not at all prove that there is sufficient entropy in the input data. For example, a sequence of 20-byte bit-strings obtained by consecutively applying the SHA-1 function as a pseudo-randomizer to an initial 20-byte all-zero bit-string may well pass all sorts of statistical tests, but it is obvious that there is no entropy in the all-zero starting value. Therefore, the statistical tests alone cannot justify the seeming randomness of the output strings. Statistical tests performed on the conditioned data are even further removed from reflecting the actual entropy of the initial random value. If one is seriously investigating the entropy source to gain a certain level of assurance regarding its quality, then the requirements set forth for CC evaluation or FIPS validation are appropriate.
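The all-zero SHA-1 chain described above is easy to demonstrate. This sketch (plain Python, standard library only) builds such a stream and applies a naive monobit check; the bit stream looks balanced even though the generator started with zero entropy:

```python
import hashlib

# Start from 20 zero bytes -- a seed with no entropy at all -- and
# iterate SHA-1 to produce a "random-looking" stream.
state = bytes(20)
stream = b""
for _ in range(500):
    state = hashlib.sha1(state).digest()
    stream += state

# Naive monobit statistic: the fraction of 1-bits in the stream.
ones = sum(bin(b).count("1") for b in stream)
total = len(stream) * 8
print(round(ones / total, 3))   # very close to 0.5, yet the entropy is zero
```

No battery of output-only tests can distinguish this stream from true random data; only a design analysis of where the unpredictability originates can.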

However, the requirements of entropy analysis stated above impose an enormous burden on the vendors as well as the labs to an extent that they are out of balance (in regard to effort expended) compared to other requirements; or in some cases, it may not be possible to meet the requirements.

The TOE Security Assurance Requirements specified in Table 2 of the NDPP are (roughly) equivalent to Evaluation Assurance Level 1 (EAL1) per CC Part 3. This is a rather bizarre phenomenon! The NDPP does not require design documentation of the TOE itself; nevertheless, its Annex D does require design documentation of the entropy source, which is often provided by the underlying Operating System (OS). Suppose that a TOE runs on Windows, Linux, AIX, Solaris, and so on, some of which may utilize cryptographic acceleration hardware (e.g., Intel processors supporting the RDRAND instruction). In order to claim NDPP compliance and succeed in the CC evaluation, the vendor is obligated to provide the design documentation of the entropy sources from all those various Operating Systems and/or hardware accelerators. This is not only a daunting task, but also mission impossible, because the design of some entropy sources is proprietary to the OS or hardware vendors.

The vendors pursuing cryptographic module validation under FIPS 140-2 face the same challenge. While software modules often rely on the operational environment to provide an entropy source, hardware modules may use a third-party entropy source in an Integrated Circuit (IC). But regardless of whether the module is hardware or software based, the design documentation of the third-party entropy source is often not available. In addition, there is usually no externally-accessible interface to the third-party entropy source that would allow collecting the raw, non-conditioned random data for statistical tests. These interfaces are deliberately absent as part of the security architecture; if they existed, they could become an attack surface susceptible to malicious manipulation of the entropy source.

In cases where the entropy source is from some open source OS such as Linux or perhaps even designed by the vendor themselves, the vendor may be able to provide the design documentation and raw non-conditioned data for test. However, this places a heavy burden on the testing labs to provide justifications for their methodology (e.g., selection of statistical tools) and then provide the analysis based on the justified methodology. Many labs raised their concerns at the ICMC that a task of this nature requires mathematicians with doctoral degrees and goes beyond the scope of the conformance testing that the cryptographic module validation program is bound to.

As we can see, from the requirements for entropy source to the fulfillment of these requirements, there is a giant leap. Asking each vendor and each lab for each CC evaluation or each FIPS validation to meet the requirements of entropy source as stated in the Annex D of the NDPP or in the draft FIPS 140-2 IG is a monumental task not commensurate with the expected effort, and even then the proposed result would still be beyond reach.

Instead of requiring the vendors and labs to find solutions for the entropy issue on their own, NIST should play a leading role not only in setting up the requirements but also in establishing a path to meet the requirements. Vendors and labs can join this effort led by NIST to put the necessary infrastructure in place, before expecting vendors and labs to evaluate the quality of the entropy. Here are some thoughts on how to establish such an infrastructure:
  • NIST may hold open competitions for acceptable entropy sources and entropy collection designs with reference implementations in commonly-seen categories such as linear feedback shift registers (LFSRs), noisy diodes, thermal sampling, ring oscillator jitter, CPU clock readings, and various human-induced measurements (e.g., the time intervals between keystrokes). The end result would be a list of NIST-recommended entropy sources and their corresponding entropy collection mechanisms. Just like the NIST-approved algorithm standards, a NIST-approved entropy source standard would regulate entropy source design and implementation.
  • NIST should set up test criteria (e.g., operational conditions, pre-requisites, variable lengths) and provide test tools to the accredited testing labs for validating entropy sources. Just like the Cryptographic Algorithm Validation Program (CAVP), this could be an Entropy Source Validation Program (ESVP).
  • NIST would maintain a validation list of all validated entropy sources.
  • CMVP and NIAP (or perhaps even CCRA members) would reference the entropy source validated list.
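As a toy illustration of one raw source in the categories above (CPU clock readings), the sketch below collects non-conditioned 4-bit samples from timer jitter. It shows only the kind of raw data an ESVP-style test would consume; it is in no way a vetted or recommended entropy source.

```python
import time

def sample_timing_jitter(n):
    # Raw, non-conditioned samples: the low 4 bits of the delta between
    # back-to-back high-resolution clock readings. Illustrative only --
    # a real source still needs the design and heuristic analysis above.
    samples = []
    for _ in range(n):
        t0 = time.perf_counter_ns()
        t1 = time.perf_counter_ns()
        samples.append((t1 - t0) & 0x0F)
    return samples

raw = sample_timing_jitter(1000)
print(len(raw), max(raw) <= 15)
```

Such raw samples would then be fed to the statistical estimators, accompanied by a physical argument for why the jitter is unpredictable on the target platform.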

With this infrastructure in place, the steep entropy requirements are broken down into several steps. OS vendors and IC vendors that provide an entropy source in their products will be motivated to undertake entropy source validation and get their products onto the NIST validation list. Vendors who need to use a third-party entropy source can consult the NIST validation list and make an appropriate selection. Labs would use the provided test methodology and tools when testing an entropy source producer, and would check the validation list when assessing an entropy source consumer. With the above-described infrastructure established, testing entropy sources and maintaining the validation list become manageable steps for labs and vendors to follow.

One may say that it sounds like a plan, but that’s easier said than done. I hope with NIST’s openness and NIAP’s collaborative approach, it is possible to rescue the vendors and labs from the currently impossible entropy requirements.

By: Dr. Yi Mao

Monday, October 14, 2013

Vendor Viewpoints (from a Lab)

What might we hear if we sat at a round table with software vendors currently pursuing Common Criteria evaluations? Would we hear the same thoughts that currently drive the CC community at large?

The following discussion is based on comments from vendors in response to the questions posed below regarding their selection of a CC scheme:
  1. Were the entrance requirements for CC evaluation clear? Is the fee structure clear? Did you have to collect a group of documents to get a complete understanding of the differences from one scheme to the next?
  2. Does your government sales team have a clear view of what is required to enter into CC evaluation? Did you receive guidance regarding the approach to take for CC evaluations?
  3. What change or improvement would you like to see in the certification process overall?

General Response 

One vendor replied that they have a collection of scheme policies, statements, and guidance. Obtaining guidance from the schemes was not an issue; the problem was getting a definitive picture of the life cycle of an evaluation project.

Another vendor responded that PPs highlight their product’s features, which they found very beneficial. They noted that internally, their development organization needs to remember that, for the company, the end goal is sales and it is the product management and marketing teams who provide input to the company’s CC requirements. The sales team is generally unaware of the effort an evaluation will take.

Time and Cost

High on the list of issues is the amount of time an evaluation will take to complete. Some vendors have experience with multiple schemes and so have a rationale for choosing between them. And remember that time is a factor just as important as choosing the right PP: it affects sales.

Several vendors commented that the overall change or improvement they would like to see is a reduction in time or cost of performing evaluations.

Need for unified requirements

Another consideration in the choice of where to go is the assurance level that can be used in evaluation. One vendor responded that since many of its customers are asking for a specific assurance level, the vendor has had to choose a certain route for an evaluation, even though it was a more difficult path to take, because the vendor needed to satisfy their customers. Vendors are trying to balance the requirements of both US and non-US customers. The schemes are not helping solve this business problem, since the Common Criteria recognition arrangement is not so much an arrangement any longer.

Another vendor responded that they look forward to stability and cooperation. Problems have included not being able to predict requirements and the resulting difficulty in being able to tailor their development plans. Elements in the CC world are changing but importantly, the vendor notes, no one is agreeing on these changes.

Duplication

Another respondent commented that a way must be found to avoid running evaluations in more than one place; duplication raises the price, not lowers it.

Another vendor responded with a somewhat different approach: they are evaluating their product in one place but getting two certificates. The reason is that one scheme demands exact PP compliance while another allows the vendor to exceed it. So, with some extra work, that vendor can get what both international markets want.

Confusion factors

One point of confusion (and thus delay) is the current misunderstanding surrounding some new protection profiles. There is a learning curve for the creation of specific profiles. But there is also a learning curve for customers, who have yet to understand what the current US plan even is. At atsec, we work to educate them, but more information and a clearer understanding provided by the schemes themselves is in order.

One vendor’s sales team does not fully understand what it takes to fulfill security requirements, so it is unlikely that they can explain to their prospective DoD customers the correct path to take. Again, this information needs to be clearly outlined by the scheme.

The same observation was made about another vendor’s sales team, which doesn’t understand what the CC evaluation process takes and so isn’t able to offer guidance to its own development team. The sales team needs to understand the requirements of the ultimate customer: the user.

Another respondent indicated that they have to reassess the entrance requirements for each product. Getting into evaluation is no longer a consistent process: some products, for example, need their crypto validated through a FIPS validation, while others are subject to entropy scrutiny. It differs by product, by which policy is in place at a particular time, and by scheme.

Conclusion


What do our vendor customers want? The same thing that we want: they want faster evaluations. Who doesn’t want this? And this is in line with what the US scheme has also heard and is working toward.

They want “a stable, common worldwide evaluation standard” as one of them succinctly put it.

Does this mean the same level of security assurance, or less? Lower cost comes from less work, which comes from less scrutiny and thus less assurance.

Vendors want things to be equal among the international schemes, as they were told they were supposed to be. But there is a decided expectation that only the US rules are acceptable, and that colors their decisions not only about whether to undergo a CC evaluation, but also about where and when.

One respondent suggested a novel idea: some sort of service level agreement (SLA) from the schemes, much as atsec provides for our internal and external customers. The SLA could outline what the vendor (and labs) can expect from the scheme given a specific input. This would make the evaluation process more of a business-focused proposition.

Note: atsec would like to point out that one scheme has instituted a plan to address this last point. CSEC, the national scheme in Sweden, schedules its feedback to the lab based on each report or set of reports sent to the certifier, for the entire length of the project. Once the lab delivers a given report to the scheme, there is an established date for certifier feedback to the lab’s evaluation team. This enables the customer to plan any needed responses in a more predictable manner. It also makes for a more transparent practice that the lab can share with the sponsor. This is not the total solution, but it may make for faster turnaround and thus faster evaluations.

-- Ken Hake
Lab manager (US)

Sunday, October 6, 2013

Dual_EC_DRBG Usage in Evaluations

This information is intended to be of use to those working under all the CCRA national evaluation schemes, some of which are updating policies relevant to this topic. This blog post is not intended as an atsec opinion about the underlying issues.

On September 9th this year, NIST posted the following announcement:

In light of recent reports, NIST is reopening the public comment period for Special Publication 800-90A and draft Special Publications 800-90B and 800-90C.
NIST is interested in public review and comment to ensure that the recommendations are accurate and provide the strongest cryptographic recommendations possible. The public comments will close on November 6, 2013. Comments should be sent to RBG_Comments@nist.gov.

In addition, the Computer Security Division has released a supplemental ITL Security Bulletin titled "NIST Opens Draft Special Publication 800-90A, Recommendation for Random Number Generation Using Deterministic Random Bit Generators, For Review and Comment (Supplemental ITL Bulletin for September 2013)" to support the draft revision effort.
The above-mentioned ITL Security Bulletin makes the following recommendation:
NIST strongly recommends that, pending the resolution of the security concerns and the re-issuance of SP 800-90A, the Dual_EC_DRBG, as specified in the January 2012 version of SP 800-90A, no longer be used.
We have found that the following Protection Profiles allow for the optional claims of the Dual_EC_DRBG random number generator in FCS_RBG_EXT.1:
  • Protection Profile for Software Full Disk Encryption
  • Protection Profile for USB Flash Drives
  • Security Requirements for Mobile Operating Systems
  • Security Requirements for Voice Over IP Application
  • Network Device Protection Profile (NDPP) Extended Package VPN Gateway
  • Protection Profile for Network Devices
  • Standard Protection Profile for Enterprise Security Management Policy Management
  • Standard Protection Profile for Enterprise Security Management Identity and Credential Management
  • Standard Protection Profile for Enterprise Security Management Access Control
  • Protection Profile for IPsec Virtual Private Network (VPN) Clients
  • Protection Profile for Wireless Local Area Network (WLAN) Access Systems
  • Protection Profile for Wireless Local Area Network (WLAN) Clients
Since some schemes have different policies in regard to this topic, we recommend that vendors and labs check with their scheme before including this algorithm in their security claims.
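As a practical aid, a lab or vendor could scan their draft documentation for such claims before submission. The following is a minimal sketch, not part of any scheme's guidance; the directory name, file extensions, and function name are assumptions for illustration:

```python
import pathlib

def find_dual_ec_claims(root="st_drafts", exts=(".txt", ".xml", ".html")):
    """Scan text-based draft documents under `root` and return
    (path, line_number, line) tuples wherever Dual_EC_DRBG is mentioned,
    so the claim can be reviewed against the relevant scheme's policy."""
    hits = []
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix.lower() not in exts:
            continue
        for n, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if "Dual_EC_DRBG" in line:
                hits.append((path, n, line.strip()))
    return hits
```

Such a check only flags mentions of the algorithm; whether a claim must be removed remains a question for the scheme in question.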

~the atsec team

Wednesday, October 2, 2013

US partial Government shutdown

The US government shutdown means that the following are affected:
 
  • NIAP - closed until further notice.
    Update, October 7, 2013 - NIAP Returns from Furlough:
    "As of 7 October 2013, NIAP operations are resumed. Please feel free to contact us with inquiries. We thank you for your continued support."
  • CMVP (NIST) - closed until further notice. CSEC (Canada) will continue CMVP operations; however, no validations will be completed without a NIST signatory.
  • CAVP (NIST) - closed until further notice.
  • SCAP - closed until further notice.
  • GSA and TWIC are also probably affected, but we don't have official word from those programs yet.
 
atsec continues to work on IUT and evaluation projects in the atsec labs as normal, but obviously progression through scheme/program milestones may be affected.
atsec customers in evaluation under other schemes, such as BSI or CSEC (Sweden), are not affected.
atsec customers with questions about their projects should contact their atsec representative.

Monday, September 30, 2013

A summary of the first ICMC:

The first ICMC is over.

It was a wonderful event and thanks are due to all of the 171 participants for making it so.
Participant Quote: "This conference is Win Win Win!"

These attendees represented developers, governments, laboratories, consultants, and academics from the cryptographic module community.

ICMC 2013 sponsors
It turned out to be a truly international affair with people from organizations based in eighteen countries: Australia, Belgium, Brazil, Canada, China, Finland, France, Germany, Japan, Netherlands, Singapore, South Korea, Spain, Sweden, Switzerland, Taiwan, the U.K., and the United States of America.

Thanks are also due to the sponsors. A first conference is always a risk, and it would have been hard to make the conference a success without them.  

Thanks to Bill and Nikki from CNXTD, who planned and supported us: arranging the hotel, handling registrations and communications, patching the schedule at the last minute, and managing all those things that security experts are often bad at. CNXTD focuses on planning conferences in the security field and has immense experience, which showed!


About the conference

We already posted short summaries of each of the days including some of the photographs that we took.

Monday was the CMVP and accredited laboratories meeting. Many of us heard how much hard work that was, especially since the meeting usually takes at least three days to cover all its business. This is a private meeting, and as you might expect there are no public slides or links.

Tuesday was for workshops. These were very well attended, and my big surprise was how popular Steve Weingart's "An Introduction to FIPS 140-2" was. I had my doubts, since almost everyone who signed up for the conference was an expert in the field. Silly me! Almost everyone I spoke to who had attended said they still learned something new! The other workshops, on more specialised areas (physical security, side channel analysis and testing, and mobile security), were of a very high standard and provided excellent opportunities to learn more about these topics.

Wednesday and Thursday included the keynotes, policy- and program-related and technical presentations, and plenty of coffee breaks.

As the conference progressed we began to see some themes and memes emerge:

Credibility and Trust
Charles H. Romine

Although cryptographic modules are very important to security in government, critical infrastructure, and commercial sectors, it is rare that anything to do with cryptographic modules, or the validation of their conformance to standards, becomes the subject of public and political attention. Usually this topic is reserved for the "boffins"; i.e., it is a very technical topic only well understood and discussed by the policy and technical experts. For many end users, the assurance they rely upon is based on the trust and credibility they place in those specifying the assurance case on their behalf. In the U.S., that is NIST and the CMVP. For many years this has not been doubted in the slightest.


Surprise!!! 

In the weeks leading up to the conference this topic was headline news, not just within our small community but in the popular media around the world. Not just fame, but infamy. Oh my....

This truly vital topic was addressed by Charles H. Romine, Director of the Information Technology Laboratory at NIST, who was a keynote speaker for ICMC. It is clear that NIST take this subject very seriously indeed and are taking appropriate action. Dr. Bertrand du Castel also talked about trust from a non-governmental perspective.

The delays to the update of FIPS 140-2, and the length of time that developers must wait to be listed as conformant with the standard ("The Queue" is currently measured in months), are also affecting the credibility of NIST. The conference participants asked that NIST listen to the community and take appropriate action on these topics too.

The future of the FIPS specifications, ISO

FIPS 140-3 was bound to be a discussion topic. We heard, as we expected, that FIPS 140-3 is moribund. This has brought problems which are getting worse with time, not better. As technology moves on and the pace of change increases, with no real update to the specification for a decade, FIPS 140-2 is creaking badly. To deal with this, the CMVP must issue Implementation Guidance (I.G.), which is now so complex that it is virtually impossible to understand all the nuances. We saw several presentations on notorious items of Implementation Guidance, and even some more formal logical analysis of the I.G. themselves. My goodness, what have we created?...

We heard a lot of discussion and grievances on this topic, and we realised that at least part of the reason for the length of the queue is related to the I.G.: its (sometimes) retroactive applicability; its complexity; and incomplete understanding and inconsistent application of its policies by validators, testers, and developers.

There is some light at the end of the tunnel. We heard a lot about the ISO standards and supporting documents for cryptographic modules: how they have been developed by experts and are now entering their second revision, representing more recent technical improvements that have "leap-frogged" the FIPS specifications. We heard from several programs that are already using the standards formally, including Japan, South Korea, Spain, and Turkey. We heard of the iCMVP, a memorandum of understanding between Japan and NIST regarding the framework for accepting work done under each program. This may be a simplistic start compared with the CC Recognition Arrangement, but it could, in time, grow to include other nations...

All we have to do is convince NIST and CSEC to adopt the ISO standards as national standards, and manage the transition.

The Queue

The length of "The Queue" was discussed during several sessions. This subject was at the front of everyone's minds as the CMVP, with limited resources, struggle to keep up with validations, laboratory assessments, policy writing, and other assigned duties.

The current length of The Queue means that it can take developers many months to get their modules validated and hence available for procurement by federal agencies. In an increasing number of cases, products are obsolete or unsupported by the time the validation is finally documented. We heard how the unpredictability of The Queue is a problem too, since it greatly affects developers' marketing, sales, and project planning.

We heard a lot more detail about the resource constraints under which the CMVP must operate, and by the end of the conference I believe everyone had a better understanding of why; we even had some ideas on how to address the problem. These ranged from increased fees for service, which would give NIST more resources; to sub-contracting validators to the NIST team; to allowing labs and developers to work on appropriate topics that would make validation of the test reports easier and more efficient; to discussion of the internal CMVP review process.

Entropy 

Captain Entropy:
conceived at the first ICMC

Another "hot" topic was entropy. We heard several papers related to entropy: the philosophical, the mathematical, and the practical. Earnest discussion of the subject continued throughout the evening among representatives of several of the accredited laboratories, finally resulting in the conception of "Captain Entropy." We share the vision with you and hope that you can forgive both the poor artistic skills and our making light of what really is a serious subject.




What's next?

During the final wrap-up session we had a clear mandate from the participants to continue the ICMC conference next year, and also to champion the establishment of a "user group." The user group will seek active participation from all the stakeholders in the development, testing, and validation of cryptographic modules.

So please do not think it is all over for a year. This was the beginning and we have work to do together during the remainder of this year and in 2014.

atsec will continue to facilitate this, but we will be seeking active participation from all the stakeholder groups. We envisage something similar to the CC Users Forum and will initially use the ICMC 2013 mailing list and the ICMC 2013 and FIPS 140-2 related LinkedIn groups to communicate how this will happen. Please help us spread the word.

Fiona Pattinson