Monday, October 14, 2013

Vendor Viewpoints (from a Lab)

What might we hear if we sat at a round table with software vendors currently pursuing Common Criteria evaluations? Would we hear the same thoughts that currently drive the CC community at large?

The following discussion is based on comments from vendors in response to the questions posed below regarding their selection of a CC scheme:
  1. Were the entrance requirements for CC evaluation clear? Is the fee structure clear? Did you have to collect a group of documents to get a complete understanding of the differences from one scheme to the next?
  2. Does your government sales team have a clear view of what is required to enter into CC evaluation? Did you receive guidance regarding the approach to take for CC evaluations?
  3. What change or improvement would you like to see in the certification process overall?

General Response 

One vendor replied that they have a collection of scheme policies, statements, and guidance. Obtaining guidance from the schemes was not an issue; the problem was getting a definitive picture of the life cycle of an evaluation project.

Another vendor responded that PPs highlight their product’s features, which they found very beneficial. They noted that, internally, their development organization needs to remember that the company’s end goal is sales, and that it is the product management and marketing teams who provide input to the company’s CC requirements. The sales team is generally unaware of the effort an evaluation will take.

Time and Cost

High on the list of issues is the amount of time an evaluation will take to complete. Some vendors have experience with multiple schemes and so have a rationale for choosing between them. And remember that time is just as important a factor as choosing the right PP – it affects sales.

Several vendors commented that the overall change or improvement they would like to see is a reduction in time or cost of performing evaluations.

Need for unified requirements

Another consideration for the choice of where to go is the assurance level that can be used in evaluation. One vendor responded that since many of its customers are asking for a specific assurance level, the vendor has had to choose a certain route for an evaluation, even though it was a more difficult path to take, because the vendor needed to satisfy their customers. Vendors are trying to balance the requirements of both US and non-US customers. The schemes are not helping solve this business problem, since the Common Criteria Recognition Arrangement is not so much an arrangement any longer.

Another vendor responded that they look forward to stability and cooperation. Problems have included not being able to predict requirements and the resulting difficulty in being able to tailor their development plans. Elements in the CC world are changing but importantly, the vendor notes, no one is agreeing on these changes.

Duplication

Another respondent commented that a way must be found to avoid running evaluations in more than one place. Duplication raises the price, not lowers it.

Another vendor responded with a somewhat different approach: they are evaluating their product in one place but getting two certificates. The reason for this is that one scheme demands exact PP compliance while another allows the vendor to exceed it. So with some extra work, that vendor can get what both international markets want.

Confusion factors

One point of confusion (and thus delay) is the current misunderstanding of some new protection profiles. There is a learning curve for the creation of specific profiles. But there is also a learning curve for customers, who have yet to understand what the current US plan is. At atsec, we work to educate them, but more information and a clearer understanding provided by the schemes themselves is in order.

One vendor’s sales team does not fully understand just what it takes to fulfill security requirements, and so it is unlikely that this sales team can explain to their prospective DoD customers the correct path to take. Again, this information needs to be clearly outlined by the scheme.

The same observation was made about another vendor’s sales team, which doesn’t understand what the CC evaluation process takes and so isn’t able to offer guidance to its own development team. The sales team needs to understand the requirements of the ultimate customer – the user.

Another respondent indicated that they have to re-examine the entrance requirements for each product. Getting into evaluation is no longer a consistent process because some products, for example, need to have their crypto validated through a FIPS validation, while others are subject to entropy scrutiny. It differs by product and by which policy is in place at a particular time, and it varies by scheme.

Conclusion


What do our vendor customers want? The same thing that we want: they want faster evaluations. Who doesn’t want this? And this is in line with what the US scheme has also heard and is working toward.

They want “a stable, common worldwide evaluation standard” as one of them succinctly put it.

Does this mean the same level of security assurance, or more? Lower cost comes from less work, which in turn comes from less scrutiny and less assurance.

Vendors want things to be equal among the international schemes – as they were told they were supposed to be. But there is a decided expectation that only the US rules are acceptable, and so that colors their decisions regarding not only whether to undergo a CC evaluation, but also where and when.

One respondent suggested a novel idea: having some sort of SLA (service level agreement) from the schemes, much as atsec provides for our internal and external customers. The SLA could outline what the vendor (and labs) can expect from the scheme given a specific input. This would make the evaluation process more of a business-focused proposition.

Note: atsec would like to point out that one scheme has instituted a plan to address this last point. CSEC, the national scheme in Sweden, commits to feedback dates for each report or set of reports sent to the certifier over the entire length of the project. Once the lab delivers a given report to the scheme, there is an established date for certifier feedback to the lab’s evaluation team. This enables the customer to plan for any needed responses in a more predictable manner. It also makes for a more transparent practice that the lab can share with the sponsor. This is not the total solution, but it may make for faster turnaround and thus faster evaluation.

-- Ken Hake
Lab manager (US)
