Monday, June 2, 2014

The Spirit of the CCRA has not changed. Neither have the Disagreements

The proposed new CCRA (draft version 16.5.1) was recently published on the Common Criteria Portal along with a covering message from the chair of the CCRA Management Committee.
This blog describes the major changes that we have identified.
The draft CCRA document builds upon the current CCRA, first published in 2000, which is supplemented by four procedures listed on the CC portal.

General

There are many grammar corrections throughout the new Agreement. These are not reported here unless we believe that they change, or may change, the meaning of the Agreement.
  • The title has no changes.
  • 26 Participants are listed, although Hungary is listed as "to be completed".
  • In Annex "L", the list of Compliant CBs, two entries from the current list of "Authorizing Participants" published on the portal are missing: New Zealand and India.
  • Throughout the Arrangement, the status of "Supporting Documents" has been formalized.
  • Throughout the Arrangement, the standards with which CBs and ITSEFs must comply have been updated: ISO/IEC 17065 and ISO/IEC 17025 are now the norm.

Purpose

1.    The paragraph:
"The purpose of this Arrangement is to advance those objectives by bringing about a situation in which IT Products and Protection Profiles which earn a Common Criteria Certificate, as per the requirements of the CC standard, can be procured or used without the need for further Evaluation. It seeks to provide grounds for confidence in the reliability of the judgements on which the original certificate was based by requiring that a Certification/Validation Body (CB) issuing Common Criteria Certificates should meet high and consistent standards."
was changed to add the text "as per the requirements of the CC standard".

2.    The paragraph:
"It is likely that some sensitive government IT Systems will be procured, certified and Recognised according to specific user’s requirements for their strategic need or separate bilateral or multilateral agreements. This Arrangement does not constrain such agreements. In particular, the exceptions described in Article 3 do not apply to any such separately negotiated agreements."
was changed to add the text "specific user’s requirements for their strategic need".

3.    Text was added that essentially incorporates the 2006 supplement to the CCRA, "Multiple or commercial CBs" (MC policy procedure 2006-09-001 v1.0).

Spirit

The spirit of the Agreement has not changed.

Article 1: Membership

No changes.

Article 2: Scope

The scope has, of course, changed, with the removal of mutual recognition up to EAL 4 and the addition of the new text:
"This Arrangement covers certificates with claims of compliance against Common Criteria assurance components of either:
1) a collaborative Protection Profile (cPP), developed and maintained in accordance with Annex K, with assurance activities selected from Evaluation Assurance Levels up to and including level 4 and ALC_FLR, developed through an International Technical Community endorsed by the Management Committee; or
2) Evaluation Assurance Levels 1 through 2 and ALC_FLR2.
The scope may be modified with the consent of the Participants in this Arrangement at any time, in accordance with the provisions of Article 14, by addition or removal of assurance levels or components."

Discussion

In the past, the CCRA did not set any de jure requirements for PPs specified by evaluations formally recognized under the Agreement, beyond requiring that they be validated in accordance with the CC standards. Of course, this meant that the Arrangement did not formally specify an organization for the development of PPs, and for many years such an organization was not in place.
In both the 1998 and 2000 versions of the CCRA there was an implicit acceptance that a validated PP approved by any one CCRA Participant should be recognized by all.
Hence, the Common Criteria Management Board (CCMB) did not sponsor development of PPs. (Another reason they have cited is that they had no resources to do so.)
However, a list of de facto CC-validated PPs was, and still is, maintained on the CC portal. It was left to the various national schemes to make de jure policy about which PPs could be specified in an evaluation performed within their scheme. The original 1998 CCRA included the concept of national PPs, but this was removed in the 2000 version.

The 2014 draft CCRA changes this, allowing for mutual recognition of evaluations specifying an agreed collaborative PP (cPP), and sets the rules for their development: to be a cPP, a PP must have been developed under the auspices of an International Technical Community (iTC). The new CCRA defines an iTC as:
"A group of technical experts including Participants, Certification/Validation Bodies, ITSEFs, developers and users which are:
 a) working in manners that promote fair competition;
 b) working in some specific technical area in order to define cPPs;
 c) endorsed for this purpose by the Management Committee; and
 d) establishing Interpretations of the application of the CC and CEM necessary for cPPs through Supporting Documents which are subject to the CCRA approval process"

Article 3: Exceptions

The following text was added:
"Annexes F.3 and G.2 of this Arrangement should be followed by Participants who call for exceptions in accordance with this article."

Annex F.3, "General Information Affecting the Terms of this Arrangement", requires Participants to inform the CCMC about changes in national laws, regulations, etc., and about any changes in the operation or procedures of its national scheme if they affect the ability of that Participant to comply with the Arrangement.
Annex G.2, "Documentation to be Provided", is the list of items such as the point of contact, quality manual, procedures, etc.

Article 4: Definitions

No substantive changes here, but there were plenty in the Glossary (see the changes in the Annex "A" section below).

Article 5: Conditions for Recognition

No substantive changes.

Article 6: Voluntary Periodic Assessments

No substantive changes.
See also Annex "D."

Article 7: Publications and the Use of the Service and Certification Mark

Certificates that in the previous version of the CCRA "should" have displayed the mark of the Recognition Arrangement now "shall" bear the device.
Note that Annex "E" has been extensively changed. The two logos look the same, but they are now much more extensively defined in terms of color, spacing, etc., and the policy about their usage has been substantially updated.

1. The added text in E.1:
"The Certificate Authorising Participants shall make necessary legal arrangements with their ITSEFs and their Client vendors, to the effect that the vendors are also required to:"…
may be problematic, since the Participant does not have a relationship with the ITSEF. According to the Arrangement it is the CB that has the relationship with the ITSEF. I guess it's just through the chain...

2. The added text in E.2 (in green) makes a sentence that cannot be easily understood.
"After termination of participation in this Arrangement, the terminating Participant shall immediately cease to use the Service Mark and distribute any certificates bearing the Certification Mark of making reference to this Arrangement."
This text needs some clarification.

Article 8: Sharing of Information

No substantive changes.

Article 9: New Participants

No substantive changes.
See also Annex "G."

Article 10: Administration of this Arrangement

The text "All Participants should be represented on the Management Committee" was changed to:
"All Participants are represented on the Management Committee"

Article 11: Disagreements

The disagreements have not changed.

Article 12: Use of Contractors

No substantive changes.

Article 13: Costs of this Arrangement

No substantive changes.

Article 14: Revision

No substantive changes.

Article 15: Duration

No substantive changes.

Article 16: Voluntary Termination of Participation

No substantive changes.

Article 17: Commencement

The following text was added:
"In terms of continuation, the qualifying status of all members remains valid for a period of five years from the date of the latest Voluntary Periodic Assessment / Shadow Certification/Validation under the previous version of the Arrangement, unless otherwise approved by the Management Committee.
Certificate Consuming Participants, which have applied to become a Certificate Authorising Participant under the previous version of this Arrangement, and for which the Shadow Certification/Validation has not yet been completed, may choose to complete the Shadow Certification/Validation under the conditions of the previous version of this Arrangement.
Furthermore, all Participants agree:
a) to Recognise conformant certificates issued under the previous version of this Arrangement;
b) to Recognise certificates resulting from products accepted into the certification process prior to approval of this Arrangement according to the previous version of the Arrangement; and
c) for a period of 36 months from the date on which this Arrangement has been signed by all its Participants, to Recognise re-certifications and maintenance addenda issued according to the previous version of this Arrangement. Thereafter, within the scope of this arrangement all Participants shall limit recognition of certifications issued in accordance with Article 2."
The 2006 supplement to the Arrangement on "time criteria required to transfer from a Certificate Consuming Participant to a Certificate Authorising Participant" does not seem to have been incorporated into this draft of the CCRA.

Article 18: Effect of this Arrangement

No substantive changes.

Annex "A" the Glossary

  • Added "Achievable Common Level of Security Assurance"
  • Changed "Approval/Licensing Policy"  was "Licensing policy" but the text changed too
  • Changed "CC" removed the reference of equivalence to ISO/IEC 15408
  • Added "Certification/Validation Body (CB)"
  • Added "International Technical Community (iTC)"
  • Changed "IT Security Evaluation Criteria"
  • Changed "System" which became an "IT System"
  • Changed: "Originator" which was "Originating Party"
  • Not changed: "Certificate Authorising participant: A Participant representing one or more Compliant CBs", this is pointed out because the definition in the "draft CCRA is inconsistent with text in the "Purpose" section of the Arrangement, which precludes having more than one Compliant CB
  • Added " Collaborative Protection Profile (cPP)"
  • Changed "Protective Marking"
  • Added "RA in Confidence"
  • Added "Security Target (ST)" The definition is consistent with that in CC part 1.
  • Added "Supporting Document"

Annex "B" Evaluation and Certification/Validation Scheme

In B.2, The Role and Principal Characteristics of the CB, the following text was added:
"An Evaluation Facility which has been Licensed or Approved to carry out Evaluations within a particular Scheme is known as an IT Security Evaluation Facility"
It is not obvious why that text was added into B.2 and removed from B.3, Accreditation and Licensing of Evaluation Facilities, especially since the term "ITSEF" is already defined differently in the glossary thus:
"ITSEF:
IT Security Evaluation Facility, an Accredited Evaluation Facility, Licensed or Approved to perform Evaluations within the context of a particular IT Security Evaluation and Certification/Validation Scheme."

Annex "C" Requirements for Certification/Validation Body

No substantive changes other than C.7, which now requires Certifications/Validations of IT Products and Protection Profiles to be correctly carried out:
"The CB is to have the required facilities and documented procedures to enable the IT Product or Protection Profile Certification/Validation to be correctly carried out in accordance with the Common Criteria and related Evaluation Methods"

Annex "D" Voluntary Periodic Assessments

In the existing CCRA a shadow certification/validation was performed on a "suitable IT product at Common Criteria Evaluation Assurance Level 3 or 4 as agreed upon by the Participants directly involved." This has now been changed to:
"A Voluntary Periodic Assessment should be performed on at least two IT Products that are within the scope of this Arrangement as decided by the Participants directly involved."
Additionally, the documentation to be provided to the assessment team in support of the Voluntary Periodic Assessment was extended with the addition of:
"Other Evaluation reports should be provided on request in accordance with guidance issued by the Management Committee."
Note that the supplemental procedure, Voluntary Periodic Assessment 2005-06, currently applies and gives much detail on the conduct of a voluntary periodic assessment.
It is not clear from the published documents whether this detailed procedure will remain in effect or be withdrawn.

Annex "E" Certificate and Service Marks

Please see the discussion above under Article 7.
Additionally, it is not clear whether an ITSEF is allowed to use the mark to promote related ITSEF services. I guess that may be up to the relevant CB?

Annex "F" Information to be Provided to Participants

Many changes have been made, but they can be characterised as modernization on the topic of information security. It seems that the field has moved forward over the last 15 years ;)

Annex "G" New Compliant Certification/Validation Bodies

The text requiring that the initial shadow certification/validation be for two products at EAL3 or EAL4 was changed to:
".. the applicant will be asked to nominate as candidates for Shadow Certification/Validation at least two products evaluated against a collaborative Protection Profile, or a Security Target claiming at least Evaluation Assurance Level 2 and, if appropriate, ALC_FLR where no cPP is used."
Personal comment: two EAL 2 validations seem like a very low barrier for a potential CB to demonstrate confidence that a more complex cPP evaluation could be competently performed. Given the changes to the scope of the Agreement (EAL 2 plus cPPs), I see that this is a little moot given the number of cPPs currently available, but I worry a little about future cPPs, with as yet undefined functional and assurance requirements, that may require mutual confidence in a deeper competence on the part of CBs.

Note that the supplemental procedure, Conducting Shadow Certification 2004-07, currently applies and gives much detail on the conduct of a shadow certification/validation.
It is not clear from the published documents whether this detailed procedure will remain in effect or be withdrawn.

Annex H: Administration of the Arrangement

In H.3, Decisions, there has been an addition:
"The Management Committee should always attempt to achieve a unanimous vote,"
but decisions are still, as before, to be reached by simple majority in most cases.

In H.7, Executive Subcommittee, representation has been simplified, from the old:
"The Executive Subcommittee should consist of Qualified Participants and additional discretionary Participants up to a numerical limit determined by the Management Committee"
to the rather simpler:
"All Participants may be represented on the Executive Subcommittee."
Three responsibilities were removed from the Executive Subcommittee:
"e) resolving technical disagreements about the terms and application of this Arrangement;
f) managing the development of IT security evaluation criteria and IT security evaluation methods;
g) managing the maintenance of historical databases as to the background to interpretations and any resultant decisions that could affect future versions of either the criteria or methodology."
(NB, These functions have not gone. They were added into the responsibilities of the CCDB. See H.8, below.)

And one was added:
"e) managing the promotion of the Common Criteria."

H.8 Development Board

This whole section was added. It seems that until this version of the CCRA is ratified there is no formal CCDB. I wonder who will chair it? Anyway, here is what the "new" CCDB will do:
H.8 Development Board
The Management Committee should establish a Development Board to manage the technical aspects of the Arrangement, foster and oversee the development and maintenance of the criteria, associated methodology and the development of collaborative Protection Profiles by suitable International Technical Communities, and to provide technical advice and recommendations to the Management Committee.
All Participants may be represented on the Development Board.

The business of the Development Board includes:
a) resolving technical disagreements about the terms and application of this Arrangement;
b) managing the development of IT Security Evaluation Criteria and IT Security Evaluation Methods;
c) managing the maintenance of historical databases as to the background to Interpretations and any resultant decisions that could affect future versions of either the criteria or methodology;
d) technical approval of updated criteria, methodology and CC Supporting Documents, to ensure technical consistency;
e) ensuring the effective development of collaborative Protection Profiles by means of suitable technical communities.
Perhaps this last one should really say "suitable iTCs"?

Annex I: Contents of Certification/Validation Reports

There are some additions and changes in regard to cPPs and a few changes that might have relevance to the contents of the reports.

Annex J: Common Criteria Certificates

The old section "Common Criteria Certificates Associated with IT Product Evaluations"  was replaced with two:
"Common Criteria Certificates Associated with IT Product Evaluations with a cPP claimed."
and
"Common Criteria Certificates Associated with IT Product Evaluations without a cPP claimed"
The section:
"Common Criteria Certificates Associated with Protection Profile Evaluations"
remains.
Some new text is included about Supporting Documents and the use of the logo/mark, which has been discussed elsewhere.

Annex K: Collaborative Protection Profiles

This is a whole new Annex.
The introductory paragraph seems to suggest that security can be analyzed into an IT product. I wish it were true.
"A collaborative Protection Profile (cPP) and related Supporting Documents define the minimum set of common security functional requirements and the Achievable Common Level of Security Assurance. It includes vulnerability analysis requirements to ensure certified products achieve an expected level of security."
In several places a reference to a "CCRA approval process" is made. For example in K.1 which says:
"The use of extended assurance components should be avoided unless such a rationale can be provided and is subject to the CCRA approval process."
This CCRA approval process is not defined or explained. Could it mean the long process used to approve the CCRA itself? Surely not!

Friday, February 28, 2014

Call for Papers for the Second International Cryptographic Module Conference

Mark Your Calendar: ICMC 2014, November 19-21, Hilton Washington D.C., Rockville, MD

ICMC brings together experts from around the world to confer on the topic of cryptographic modules, with emphasis on their secure design, implementation, assurance, and use, referencing both new and established standards such as FIPS 140-2 and ISO/IEC 19790.
We are focused on attracting participants from the engineering and research community, test laboratories, government organizations, procurers, deployers, and administrators of cryptographic modules, and academia. Our program consists of one day of workshops and tutorials, followed by two days of 30-minute presentations (plus 15 minutes for questions). We solicit proposals for high-quality papers and relevant workshops that will be of interest to the community involved with cryptographic modules on the topics below. Visit www.ICMConference.org for complete information.

Topics

  • Management of Cryptographic Modules in the Field
  • Standards: Including FIPS 140-2, ISO/IEC 19790, FIPS 140-3
  • Physical Security and Hardware Design
  • Key Management
  • Random Number Generation
  • Side Channel Analysis, Non-invasive Attacks
  • Choice of and Implementing Cryptographic Algorithms
  • Cryptographic Modules Implemented in Open Source
  • Hybrid Systems, Embedded Systems
  • Tools and Methodologies
  • Other Cryptographic Standards
The committee favors vendor-neutral presentations that focus on the practical design, testing and use of cryptographic modules. Product vendors are encouraged to recruit clients and partners who are front-line implementers as presenters.

Dates
  • Abstracts: April 10, 2014
    All prospective authors must submit their abstracts and workshop proposals using this link: http://icmconference.org/?page_id=24
  • Review and comments: May 18, 2014
  • Acceptance notifications: July 17, 2014
  • Final versions due: October 23, 2014
Workshops/Tutorials: November 19, 2014
Presentation of papers: November 20-21, 2014

For any questions regarding submissions or the conference in general, please contact us at info@icmconference.org.

Presented with the cooperation of:
CMUF
Cryptographic Module User Forum

Friday, February 14, 2014

Some Efficiencies in the O-TTPS Accreditation Program


One of the nice things about working with the OTTF and The Open Group™ in developing their O-TTPS Accreditation Program has been the emphasis that the forum members placed on efficiency within the program.

One of the major gripes in the IT assurance community is that we seem to do the same things over and over again.

That costs us precious resources (people, money, and time), as many of the members of the OTTF pointed out.

Mary Ann Davidson of Oracle, as usual, expressed the problem very well: "Doing the same thing twice or more unintentionally usually ends up with worse security as we use scarce resources on duplicative measures."

For developers, how many times do they really have to get their processes checked to see if, for example, they use an automated configuration management system and have implemented access control for it? Such checks are made over and over again if they need to certify products under a variety of assessment programs.

For some of our customers, we have checked this close to a hundred times over the last decade.

Using the above example, the checks are made if a product from the developer needs FIPS 140-2, Common Criteria, O-TTPS assessment, etc., etc. It's both inefficient and expensive for the developers.
These overlaps are prevalent and extend to many of the organizations' processes, not just to configuration management.

Similarly, for mature IA assessment companies like atsec, we find that all of the various programs that we are involved with demand that we have certified management systems. The problem is that there are a variety of standards and a variety of certification programs to choose from.

In atsec's case, this has meant that we have to manage, endure, and pay for many management system audits across our small company of fewer than 100 people. It is a cost of doing business, but a real pain in the... .

We have:
  • ISO/IEC 17025: two for NVLAP (don't ask!), plus audits by BSI, FMV, and ISCCC;
  • Technical competence audits: CMVP, NIAP, GSA, BSI, CSEC, O-TTPS;
  • PCI SSC (not based on an international standard);
  • ISO 9001 (customer demand);
  • ISO/IEC 27001 (customer demand);
  • and soon ISO/IEC 17020.
None of our auditors are allowed to accept the results from the others. Granted, each scheme requires a different technical competence, but our document and record control, HR, training, resource management, corrective and preventive actions, internal audit, calibration, etc. are the same for all of the programs to which we belong.

atsec customers can be assured that our management system is well audited!


.. back to the O-TTPS Accreditation Program: 

The forum saw the chance to try and address this issue: The notion of carefully reusing existing assurance was one of the factors we considered during each stage of developing the new O-TTPS Accreditation Program.

Here are some of the efficiencies that we identified and that the Accreditation Program has implemented:
  • The ability for O-TTPS assessors to reuse existing audit reports presented by the developer. These might include, where relevant, ISO 9001, security audits, Common Criteria site visits, etc.
    Note that there are some careful provisions, which are detailed in the Assessment Procedures.
  • Currently under development: The provision of mappings, allowing the work done for existing product certifications, such as Common Criteria, to be more easily mapped to the O-TTPS requirements.
  • For prospective Recognized Assessor companies and assessors, the ability to accept a variety of existing certifications of management systems and assessor qualifications that address the core skills needed as information assurance (IA) professionals. This allows The Open Group to concentrate only on the additional specialised skills needed for the O-TTPS accreditation program.
Although these are simple policies, they make the program more efficient by reducing the overhead of unnecessary duplication, so that the accreditation process is not overburdened. They reduce the costs of assessment for all parties involved: the Accreditation Authority, the Recognized Assessors, and the Organizations undergoing accreditation.

Not only that, but we can concentrate more on the important issue of integrity in the COTS ICT Supply Chains.

Thank you to the OTTF for living in the real world! :)

~ Fiona Pattinson

- "The Open Group" is a trademark of The Open Group

Monday, February 3, 2014

atsec and supply chain security

Supply chain assurance: mitigate the threats of counterfeit and tainted products, globally. Quite a challenge!

To do that we'd need some of the best subject matter experts in the world.

Build a new industry-led open standard and an assessment program that can demonstrate to integrators and acquirers of COTS ICT products that their providers are Trusted Technology Providers™ who follow industry best practices in mitigating the threats of counterfeit and maliciously tainted products.

Yes! We need some experts...

atsec has been working closely with The Open Group Trusted Technology Forum (OTTF) since it began in late 2010. For the last three years we've been dedicating a great many hours, alongside our OTTF colleagues (representatives from providers, acquirers, and assessors), all working together to progress an industry vision for an open voluntary standard and an associated accreditation program for trusted technology in relation to supply chain security.


Together, the OTTF have worked very hard.

We developed and published a white paper, a snapshot of the standard, and the O-TTPS standard itself. We have collaborated, presented and harmonized.

We have looked for and found consensus amongst many key organizations. It took blood, sweat and tears!
  • We worked with some of the major names in IT security assurance, including many of the major global players in the industry.
  • We worked with governments around the world.
  • We worked through a time of change within (U.S.) government agencies.

Believe me, finding consensus in such a diverse and knowledgeable set of people was never going to be an easy task!


Finally, we have been working hard on an Accreditation Program and piloting the model in real world assessments. This way we had the chance to iron out some of the wrinkles before the accreditation program became public.

Well, the launch of the assessment program is finally here: February 3rd, 2014; three years after we began work. A short eternity perhaps, but an era that has demonstrated a great achievement.

As atsec worked with the pilots for the accreditation program, we gained the experience of training and qualifying our assessor teams. We drew from those with experience in testing, evaluating, and of course assessing. Here is a quote from Courtney Cavness, one of our process assessment experts:
"Working as an Assessor on this pilot was a tremendous learning experience. I enjoyed applying my previous audit/evaluation experience to this new standard to help refine and enhance both the standard and the accreditation program."

Another of our senior assessors, King Ables, said that:

"This important new standard fills a gap in the spectrum of assurance measures available to product developers and end users. As an early Recognized Assessor, atsec participated in a pilot assessment to "test drive" the standard. This pilot allowed us to gain practical experience with O-TTPS as well as to provide suggestions for improvements based on our other security assessment experience."

The new Accreditation Program, which has been running in pilot mode since last year, already has one completed accreditation under its belt. An achievement not to be sneezed at!

atsec thanks IBM for the opportunity to participate in the pilot phase and for assigning some of their senior experts in the field to the project which helped the project go very smoothly. 

For a list of completed and approved accreditations, and their scopes, see The Open Group's accreditation register.

Well-known, global Trusted Technology Providers have placed enough credence in the program to actively participate. Two respected assessor companies in the field of information assurance also invested in helping to develop and test the program; they are the first "Recognized Assessors" working with The Open Group's independent O-TTPS Accreditation Authority to help providers "Build with Integrity", which in turn allows their acquirer and integrator customers to "Buy with Confidence."


With wide across-the-board participation in the OTTF, we heard a very important message: resources available for providing assurance are scarce and nobody wants to do or pay more than once for the same thing. I believe that the Accreditation Program has incorporated this requirement. It allows for appropriate re-use of assurance already gained.

The OTTF are continuing to work hard on this topic, forging appropriate alliances in the field of information security and information assurance.

Of course it remains to be seen if the program will engender enough trust and confidence to be useful assurance for those eager to have it, but so far all the indications are that it will.

Fiona Pattinson


Tuesday, November 12, 2013

ISO/IEC 27001


 Here at atsec we are fans of internationally recognized security standards that are closely scrutinized by security experts. There are "a ton" of national security standards or self-contrived private “seals of approval” for information security out there. However, national security standards usually do not work well for companies working internationally, because nobody outside of the country of origin cares about a local security standard. With “seals of approval” invented by security companies, the benefit depends completely on the trust your customers have in the company which created the seal.

Unfortunately, trust nowadays is in short supply…

The only available internationally recognized standard for Information Security Management is ISO/IEC 27001. The standard describes how information security can be implemented systematically. Please bear in mind, this is about INFORMATION security, not just technical IT security. So, ISO/IEC 27001 is not the typical “IT standard” which can be completely covered by your IT department. It requires direct involvement and supervision from business management to succeed.

ISO/IEC 27001 uses a risk-based approach, which helps to custom-tailor information security measures to the size and the risk situation of a company. Smaller companies or companies in a low risk market are not required to implement the same measures as companies facing high risks. This makes the standard achievable both for small companies and worldwide enterprises.

The goal of ISO/IEC 27001 is to implement an Information Security Management System (ISMS), which helps to organize security management in a consistent and structured way. This ISMS is not a technical system, but the sum of all security processes and documentation in a company.
   
ISO/IEC 27001 plays well with other security standards. It integrates without problems with other ISO standards. Common integrations are with ISO 9001 (Quality Management), ISO 14001 (Environmental Management), ISO/IEC 20000 (IT Service Management) and ISO 22301, formerly BS 25999 (Business Continuity Management). Other security requirements can also be seamlessly integrated into an ISMS based on ISO/IEC 27001, e.g., PCI-DSS or local data protection requirements.

Let’s have a look at the numbers for ISO/IEC 27001 certificates. The ISO survey 2012 shows a steadily increasing number of certificates in all regions. At the end of 2012, nearly 20,000 ISO/IEC 27001 certificates had been issued. East Asia and Pacific (EAP) and Europe together account for 85.6% of all ISO/IEC 27001 certificates. Europe led annual growth in absolute numbers in 2012, with 1,095 new certificates issued.

Diagram from the 2012 ISO survey of certificates

North America, with 2.8% of all ISO/IEC 27001 certificates, is currently lagging behind the rest of the world, a result of its strong focus on national standards. Note that the preliminary version of the NIST-led Cybersecurity Framework has included ISO/IEC 27001 in its core of security management standards, alongside the national FISMA suite of standards (including SP 800-53) and the COBIT standard, which is also well adopted in the U.S.
  
In September 2013, ISO released a new version of the standard, eight years after the release of ISO/IEC 27001:2005. This new standard is called ISO/IEC 27001:2013. The release of a new version of the standard is relevant for existing certificate holders, since after a transition period of 2 years all existing certificates must be migrated to ISO/IEC 27001:2013. Companies which are “nearly finished” with their preparation for ISO/IEC 27001 certification can (and should) perform the certification based on ISO/IEC 27001:2005.

If you are just planning to start your preparations, it will very likely be best to directly go for ISO/IEC 27001:2013 and save the time for a later migration.
 
The update of the standard will probably not directly lead to an even higher number of ISO/IEC 27001 certificates. The generally simpler requirements for risk management might entice some undecided companies, but most of the changes tidy up legacy issues in the standard without dramatically changing the requirements. This shows that ISO/IEC 27001 is a standard that is actively developed and steadily aligned with the other ISO standards.

The events of this year emphasize the requirements for strong information security. Companies that offer IT services will be forced by their customers to prove a high level of security. Otherwise the customers will just switch to another service provider with a higher level of security. ISO/IEC 27001 helps companies to prove their commitment to information security and helps management to perform due diligence regarding information security.

Matthias Hofherr - atsec Munich

Wednesday, October 30, 2013

Collaboration and Openness to the Rescue of Entropy



This past September was my conference month. I first went to the 14th International Common Criteria Conference (ICCC) in Orlando, Florida and then a week later I was at the 1st International Cryptographic Module Conference (ICMC) in Gaithersburg, Maryland.

The theme of the ICCC this year was a collaborative approach. The conference directed the CC community to adopt the collaborative Protection Profile (cPP) approach. The task of protection profile development and maintenance is shifting from the CC Development Board (CCDB) to various Technical Communities (TCs). TCs are expected to gather experts from industry, government, and academia to ensure the cPPs stay current in a world of fast-evolving technologies. The CC User Forum (CCUF) will be playing an increasingly important role facilitating communication between cPP consumers and cPP developers.

The opening speech of the ICMC was delivered by Charles H. Romine, Director of the Information Technology Laboratory at NIST. The openness of NIST was the core of the speech. NIST has held open competitions for AES and SHA-3. NIST even reopened the public comment period for the SP 800-90 series of crypto standards in the interest of openness, to give the public a second opportunity to view and comment on the standards. NIST acknowledges the challenges that the Cryptographic Module Validation Program (CMVP) is facing (e.g., the several-month-long review queue) and invited ideas for improvement from the audience and the public.

One of the common features of both conferences was the heated discussion of RNGs and entropy. The ICMC had three presentations devoted to this topic, and the ICCC had several presentations on general cryptography, as well as RNG and entropy in particular.

The number of presentations at both conferences was not surprising, since more and more Targets of Evaluation (TOEs) rely on cryptographic support (i.e., class FCS functionality) for user authentication, data protection, secure communication, and trusted path/channels. Assessing the security strength of cryptographic algorithms and keys has become indispensable for the vulnerability assessment of a TOE. The Random Number Generator (RNG) that provides the random source for a key determines the quality of the generated keys. A predictable RNG could lead to the downfall of the entire system. To avoid this Achilles’ heel, it’s crucial to have a well-designed and properly-tested RNG and entropy source.


COLLABORATION?
Both CC evaluations and FIPS 140-2 validations require scrutiny of the entropy source.  For example, Annex D of the U.S. Network Device Protection Profile (NDPP) v1.1 provides requirements for entropy documentation and assessment. The documentation of the entropy source is expected to be detailed enough to include the following:
  • Design Description
    Documentation shall include the design of the entropy source as a whole, including the interaction of all entropy source components.
  • Entropy Justification
    There should be a technical argument for where the unpredictability in the source comes from and why there is confidence in the entropy source exhibiting probabilistic behavior (an explanation of the probability distribution and justification for that distribution given the particular source is one way to describe this).
  • Operating Conditions
    Documentation will also include the range of operating conditions under which the entropy source is expected to generate random data.
  •  Health Tests
    All entropy source health tests and their rationale will be documented.

The CMVP is working on an Implementation Guidance (IG) on entropy analysis. The draft version has already been circulated among accredited labs for review and comments. Vendors can also provide feedback through their representing labs. Although the CMVP is still working on incorporating the feedback received from labs and vendors, the following requirements stated in the draft IG will likely remain unchanged in the final version (a sketch of one kind of statistical estimate a lab might use follows the list):
  • The documentation shall contain a detailed logical diagram which illustrates all of the components, sources, and mechanisms that constitute the entropy source.
  • The statistical analysis has to be performed on the raw, non-conditioned data. The testing lab is responsible for choosing the statistical tests and justifying their selection. Further, the lab shall explain what the result of each test means and how this result can be justified.
  • A heuristic analysis of an entropy source along with the justifications of the entropy claims based on this analysis. This analysis shall always be included in the test report.
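
To make the statistical-analysis bullet concrete, here is a minimal sketch (in Python; the function name and the placeholder sample data are ours, not from the IG) of one estimate a lab might choose and justify: a most-common-value min-entropy estimate in the spirit of the draft SP 800-90B, which bounds the probability of the most frequent raw sample and converts that bound into bits of min-entropy.

import math
from collections import Counter

def mcv_min_entropy(samples):
    """Most-common-value min-entropy estimate, in bits per sample."""
    n = len(samples)
    p_hat = Counter(samples).most_common(1)[0][1] / n
    # Upper 99% confidence bound on the most common value's probability,
    # in the spirit of the draft SP 800-90B most-common-value estimate.
    p_upper = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1 - p_hat) / (n - 1)))
    return -math.log2(p_upper)

# Placeholder raw (non-conditioned) one-byte samples; a real analysis would
# use data captured directly from the entropy source.
raw = [7, 3, 7, 1, 250, 7, 42, 3] * 1000
print("estimated min-entropy: %.3f bits/sample" % mcv_min_entropy(raw))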

In theory, a thorough analysis of the entropy source, coupled with some statistical tests on the raw data, is absolutely necessary to gain some assurance of the entropy that plays such a vital role in supporting the security functionality of IT products. While the statistical tests are useful to detect patterns in the input data and hence to alert for low-entropy cases, passing results do not at all prove that there is sufficient entropy in the input data. For example, a sequence of 20-byte bit-strings obtained by consecutively applying the SHA-1 function as a pseudo-randomizer to an initial 20-byte all-zero bit-string may well pass all sorts of statistical tests, but it is obvious that there is no entropy in the all-zero bit-string to start with. Therefore, statistical tests alone cannot justify the seeming randomness in the output strings. Statistical tests performed on the conditioned data are even further removed from reflecting the adequate entropy of the initial random value. If one is seriously investigating the entropy source to gain a certain level of assurance regarding its quality, then the requirements set forth for CC evaluation or FIPS validation are appropriate.
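
The SHA-1 example above is easy to reproduce. Here is a small sketch (Python standard library only; entirely our own illustration) that hash-chains 20-byte blocks from an all-zero seed and applies a naive monobit frequency check; the output looks statistically balanced even though the seed contains no entropy at all.

import hashlib

def sha1_chain(seed, blocks):
    """Produce `blocks` 20-byte outputs by repeatedly applying SHA-1."""
    out = []
    state = seed
    for _ in range(blocks):
        state = hashlib.sha1(state).digest()
        out.append(state)
    return b"".join(out)

data = sha1_chain(b"\x00" * 20, 1000)        # 20,000 bytes from a zero-entropy seed
ones = sum(bin(b).count("1") for b in data)  # naive monobit frequency check
print("fraction of 1 bits: %.4f" % (ones / (len(data) * 8)))  # very close to 0.5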

However, the requirements of entropy analysis stated above impose an enormous burden on the vendors as well as the labs, to an extent that is out of balance (in regard to effort expended) with other requirements; in some cases, it may not be possible to meet the requirements at all.

The TOE Security Assurance Requirements specified in Table 2 of the NDPP are (roughly) equivalent to Evaluation Assurance Level 1 (EAL1) per CC Part 3. This is a rather bizarre phenomenon! The NDPP does not require design documentation of the TOE itself; nevertheless, its Annex D does require design documentation of the entropy source, which is often provided by the underlying Operating System (OS). Suppose that a TOE runs on Windows, Linux, AIX, Solaris, and so on, some of which may utilize cryptographic acceleration hardware (e.g., Intel processors supporting the RDRAND instruction). In order to claim NDPP compliance and succeed in the CC evaluation, the vendor is obligated to provide the design documentation of the entropy source from all those various Operating Systems and/or hardware accelerators. This is not only a daunting task, but also mission impossible, because the designs of some entropy sources are proprietary to the OS or hardware vendors.

The vendors pursuing cryptographic module validation under FIPS 140-2 are facing the same challenge. While software modules often rely on the operational environment to provide an entropy source, hardware modules may use a third-party entropy source in an Integrated Circuit (IC). But regardless of whether the module is hardware or software based, the design documentation of the third-party entropy source is often not available. In addition, there is no externally-accessible interface to the third-party entropy source that would enable accessing the raw, non-conditioned random data for statistical tests. These interfaces are inaccessible by design: if they were accessible, they could become an attack surface susceptible to malicious manipulation of the entropy source.

In cases where the entropy source is from some open source OS such as Linux or perhaps even designed by the vendor themselves, the vendor may be able to provide the design documentation and raw non-conditioned data for test. However, this places a heavy burden on the testing labs to provide justifications for their methodology (e.g., selection of statistical tools) and then provide the analysis based on the justified methodology. Many labs raised their concerns at the ICMC that a task of this nature requires mathematicians with doctoral degrees and goes beyond the scope of the conformance testing that the cryptographic module validation program is bound to.

As we can see, there is a giant leap from stating the entropy source requirements to fulfilling them. Asking each vendor and each lab, for each CC evaluation or FIPS validation, to meet the entropy source requirements as stated in Annex D of the NDPP or in the draft FIPS 140-2 IG is a monumental task not commensurate with the effort available, and even then the proposed result would still be beyond reach.

Instead of requiring the vendors and labs to find solutions for the entropy issue on their own, NIST should play a leading role not only in setting up the requirements but also in establishing a path to meet them. Vendors and labs can join this effort, led by NIST, to put the necessary infrastructure in place before vendors and labs are expected to evaluate the quality of the entropy. Here are some thoughts on how to establish such an infrastructure (a toy sketch of the raw-sample collection step follows the list):
  • NIST may hold open competitions for acceptable entropy sources and entropy collection designs with reference implementations in commonly-seen categories such as linear feedback shift registers (LFSRs), noisy diodes, thermal sampling, ring oscillator jitter, CPU clock readings, and various human-induced measurements (e.g., the time intervals between keystrokes). The end result would be a list of NIST-recommended entropy sources and their corresponding entropy collection mechanisms. Just like the NIST-approved algorithm standards, a NIST-approved entropy source standard would regulate entropy source design and implementation.
  • NIST should set up test criteria (e.g., operational conditions, prerequisites, variable lengths) and provide test tools to the accredited testing labs for validating entropy sources. Just like the Cryptographic Algorithm Validation Program (CAVP), this could be an Entropy Source Validation Program (ESVP).
  • NIST would maintain a validation list of all validated entropy sources.
  • CMVP and NIAP (or perhaps even CCRA members) would reference the entropy source validation list.
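
As a toy illustration of one collection mechanism named in the list above (CPU clock readings), here is a hypothetical sketch of the raw-sample collection step that an ESVP-style program would then analyze. It makes no entropy claim; gathering samples is the easy part, and validating them is the point of the proposed program.

import time

def collect_clock_samples(n):
    """Collect n raw samples from the low byte of successive clock deltas."""
    samples = []
    prev = time.perf_counter_ns()
    for _ in range(n):
        now = time.perf_counter_ns()
        samples.append((now - prev) & 0xFF)  # keep only the noisy low byte
        prev = now
    return samples

raw = collect_clock_samples(10000)
# These raw, non-conditioned samples are what an entropy source validation
# would analyze; collection alone proves nothing about their quality.
print("collected %d raw samples; first ten: %s" % (len(raw), raw[:10]))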

With this infrastructure in place, the steep entropy requirements are broken down into several steps. The OS vendors and IC vendors, if they provide an entropy source in their products, will be motivated to undertake entropy source validation and get their products onto the NIST validation list. Vendors who need to make use of a third-party entropy source can look up the NIST validation list and make an appropriate selection. Labs would use the provided test methodology and tools when testing for an entropy source producer, and check the validation list for an entropy source consumer. Once this infrastructure is established, the testing of entropy sources and the maintenance of the validation list become manageable steps for the labs and vendors to follow.

One may say that it sounds like a plan, but that’s easier said than done. I hope with NIST’s openness and NIAP’s collaborative approach, it is possible to rescue the vendors and labs from the currently impossible entropy requirements.

By: Dr. Yi Mao

Monday, October 14, 2013

Vendor Viewpoints (from a Lab)

What might we hear if we sat at a round table with software vendors currently pursuing Common Criteria evaluations? Would we hear the same thoughts that currently drive the CC community at large?

The following discussion is based on comments from vendors in response to the questions posed below regarding their selection of a CC scheme:
  1. Were the entrance requirements for CC evaluation clear? Is the fee structure clear? Did you have to collect a group of documents to get a complete understanding of the differences from one scheme to the next?
  2. Does your government sales team have a clear view of what is required to enter into CC evaluation? Did you receive guidance regarding the approach to take for CC evaluations?
  3. What change or improvement would you like to see in the certification process overall?

General Response 

One vendor replied that they have a collection of scheme policies, statements, and guidance. Obtaining guidance from the schemes was not an issue; the problem was getting a definitive picture of the life cycle of an evaluation project.

Another vendor responded that PPs highlight their product’s features, which they found very beneficial. They noted that internally, their development organization needs to remember that, for the company, the end goal is sales and it is the product management and marketing teams who provide input to the company’s CC requirements. The sales team is generally unaware of the effort an evaluation will take.

Time and Cost

High on the list of issues is the amount of time an evaluation will take to complete. Some vendors may have experience with multiple schemes and so have a rationale for making a choice between schemes. And remember that time is a factor equally as important as choosing the right PP: it affects sales.

Several vendors commented that the overall change or improvement they would like to see is a reduction in time or cost of performing evaluations.

Need for unified requirements

Another consideration in the choice of where to go is the assurance level that can be used in evaluation. One vendor responded that since many of its customers are asking for a specific assurance level, the vendor has had to choose a certain route for an evaluation, even though it was a more difficult path to take, because the vendor needed to satisfy their customers. The vendors are trying to balance the requirements of both US and non-US customers. The schemes are not helping solve this business problem, since the Common Recognition arrangement is not so much an arrangement any longer.

Another vendor responded that they look forward to stability and cooperation. Problems have included not being able to predict requirements and the resulting difficulty in being able to tailor their development plans. Elements in the CC world are changing but importantly, the vendor notes, no one is agreeing on these changes.

Duplication

Another respondent commented that a way must be found to avoid running evaluations in more than one place; duplication raises the price, not lowers it.

Another vendor responded with a somewhat different approach in that they are evaluating their product in one place but getting two certificates. The reason for this was that one scheme demands exact PP compliance while another allows the vendor to exceed it. So with some extra work that vendor can get what both international markets want.

Confusion factors

One point of confusion (and thus, delay) is the current misunderstanding of some new protection profiles. There is a learning curve for the creation of specific profiles. But there is also a learning curve for customers, who have yet to understand what the current US plan even is. At atsec, we work to educate them, but more information and a clearer understanding provided by the schemes themselves is in order.

One vendor’s sales team does not fully understand just what it takes to fulfill security requirements, and so it is unlikely that this vendor sales team can explain to their prospective DoD customers the correct path to take. Again, this information needs to be clearly outlined by the scheme.

And the same observation was made about another vendor’s sales team, which didn't understand what the CC evaluation process takes and so isn't able to offer guidance to its own development team. The sales team needs to understand the requirements of the ultimate customer: the user.

Another respondent indicated that they have to assess the entrance requirements anew for each product. Getting into evaluation is no longer a consistent process because some products, for example, need to have their crypto validated through a FIPS validation while others are subject to entropy scrutiny. It differs by product and by which policy is in place at a particular time, and it varies by scheme.

Conclusion


What do our vendor customers want? The same thing that we want: they want faster evaluations. Who doesn’t want this? And this is in line with what the US scheme has also heard and is working toward.

They want “a stable, common worldwide evaluation standard” as one of them succinctly put it.

Does this mean the same level of security assurance or more? Less cost is derived from less work, which is derived from less scrutiny and less assurance.

Vendors want things to be equal among the international schemes – as they were told they were supposed to be. But there is a decided expectation that only the US rules are acceptable, and so that colors their decisions regarding not only whether to undergo a CC evaluation, but also where and when.

One respondent suggested a novel idea: some sort of SLA (service level agreement) from the schemes, much as atsec provides for our internal and external customers. The SLA could outline what the vendor (and labs) can expect from the scheme given a specific input. This would make the evaluation process more of a business-focused proposition.

Note: atsec would like to point out that one scheme has instituted a plan to address this last point. CSEC, the national scheme in Sweden, commits to feedback dates for each report or set of reports sent to the certifier, for the entire length of the project. Once the lab delivers a given report to the scheme, there is an established date for certifier feedback to the lab’s evaluation team. This enables the customer to plan for any needed responses in a more predictable manner. It also makes for a more transparent practice that the lab can share with the sponsor. This is not the total solution, but it may make for faster turnaround and thus faster evaluation.

-- Ken Hake
Lab manager (US)

Sunday, October 6, 2013

Dual_EC_DRBG Usage in Evaluations

This information is intended to be of use to those working under all the CCRA national evaluation schemes, some of which are updating policies relevant to this topic. This blog is not intended as an atsec opinion about the underlying issues.

On September 9th this year, NIST posted the following announcement:

In light of recent reports, NIST is reopening the public comment period for Special Publication 800-90A and draft Special Publications 800-90B and 800-90C.
NIST is interested in public review and comment to ensure that the recommendations are accurate and provide the strongest cryptographic recommendations possible. The public comments will close on November 6, 2013. Comments should be sent to RBG_Comments@nist.gov.

In addition, the Computer Security Division has released a supplemental ITL Security Bulletin titled "NIST Opens Draft Special Publication 800-90A, Recommendation for Random Number Generation Using Deterministic Random Bit Generators, For Review and Comment (Supplemental ITL Bulletin for September 2013)" to support the draft revision effort.
The above-mentioned ITL Security Bulletin makes the following recommendation:
NIST strongly recommends that, pending the resolution of the security concerns and the re-issuance of SP 800-90A, the Dual_EC_DRBG, as specified in the January 2012 version of SP 800-90A, no longer be used.
We have found that the following Protection Profiles allow for optional claims of the Dual_EC_DRBG random number generator in FCS_RBG_EXT.1:
  • Protection Profile for Software Full Disk Encryption
  • Protection Profile for USB Flash Drives
  • Security Requirements for Mobile Operating Systems
  • Security Requirements for Voice Over IP Application
  • Network Device Protection Profile (NDPP) Extended Package VPN Gateway
  • Protection Profile for Network Devices
  • Standard Protection Profile for Enterprise Security Management Policy Management
  • Standard Protection Profile for Enterprise Security Management Identity and Credential Management
  • Standard Protection Profile for Enterprise Security Management Access Control
  • Protection Profile for IPsec Virtual Private Network (VPN) Clients
  • Protection Profile for Wireless Local Area Network (WLAN) Access Systems
  • Protection Profile for Wireless Local Area Network (WLAN) Clients
Since some schemes have different policies in regard to this topic, we recommend that vendors and labs check with their scheme before including this algorithm in their security claims.

~the atsec team

Wednesday, October 2, 2013

US partial Government shutdown

The US government shutdown means that the following are affected:
 
  • NIAP - closed until further notice. (Update, October 7, 2013: NIAP returns from furlough. "As of 7 October 2013, NIAP operations are resumed. Please feel free to contact us with inquiries. We thank you for your continued support.")
  • CMVP (NIST) - closed until further notice; CSEC (Canada) CMVP will continue operations, however no validations will be completed without a NIST signatory.
  • CAVP (NIST) - closed until further notice.
  • SCAP - closed until further notice.
  • GSA and TWIC are also probably affected, but we don't have official word from those programs yet.
 
atsec continues to work on IUT and evaluation projects in the atsec labs as normal, but obviously progression through scheme/program milestones may be affected.
atsec customers in evaluation under other schemes such as BSI or CSEC (Sweden) are not being affected. 
atsec customers with questions about their projects should contact their atsec representative.