The Evolution Of Common Criteria

Friday, May 29, 2009

Hi, my name is Adam O’Brien. I help guide Oracle products through Common Criteria evaluations. Common Criteria is a worldwide, government-backed scheme for testing the security of a product or system. Essentially, you state what security functions your product should perform, and an independent lab then evaluates whether the product implements these functions reliably and robustly.

Historically, Oracle has been very proactive in putting its products through Common Criteria evaluations. To date, the company has obtained 26 certifications, and 13 evaluations are currently in progress. This commitment to Common Criteria makes a lot of sense: we perform a rigorous security assessment of our products and gain a certificate that helps with federal and other government sales. These evaluations also benefit non-government customers, because they constitute a third-party validation of the effectiveness of the security controls provided by the evaluated product.

A Common Criteria evaluation is often mandated for a government sale, and it enjoys (near) global acceptance. These same two aspects also lead to the main inconvenience of Common Criteria: it is bureaucratic and very slow to change.

Common Criteria has been in use for 10 years, and version 3 was only recently introduced. Working groups are already in place to examine prospective changes for version 4 of the standard (Common Criteria is also published as the international standard ISO/IEC 15408). Previously, vendors were excluded from these working groups, and participation was limited to government agencies, such as the US National Security Agency (NSA). These government agencies alone decided how the standard should change; vendors were presented with a revised version and expected to use it.

Recently, a number of vendors, including Oracle, have been allowed to provide input to the working groups, in part through the lobbying activities of an umbrella group of companies called the Common Criteria Vendors Forum. Note, however, that none of these companies has been given a seat in the working groups. We hope that vendor participation will help ensure that the changes in version 4 make the process more effective at finding vulnerabilities and providing assurance of security. Even more importantly, vendors can use their (minor) influence to press these working groups to keep to their schedules.

There are six working groups, some with rather odd sounding names:
Evidence-based approach – looking at ways of making more use of the design documentation produced as part of product development, rather than (wastefully, in my opinion) producing evidence specifically for the Common Criteria process.
Predictive assurance – looking at ways to examine vendor development processes to determine some level of predictability for future assurance. For example, if version 5.0 of a product is evaluated as being secure and the product lifecycle process and bug fixing systems are reliable, can we have some level of assurance that version 5.2 will be secure?
Skills and interaction – ensuring the independent testers have a staff development process to stay up to date with attack methodologies and tools.
Meaningful reports – trying to reform the reports produced at the conclusion of a Common Criteria evaluation so that they offer meaningful information to vendors without publicly disclosing confidential information.
Lower assurance evaluations – finding ways to perform quicker and cheaper Common Criteria evaluations that still give some assurance of security.
Tools – looking at ways to integrate the use of tools that check source code for vulnerabilities into Common Criteria.

I’m the liaison between the vendors and the Common Criteria working group on tools. I suspect this could be a very lively topic. Automated detection tools can flag large numbers of potential vulnerabilities, but in my experience, many of the flaws detected are false positives or are unexploitable and can therefore be treated as low-priority issues. Configuring the tools is also a critical and difficult process. Over the last few years, Oracle has developed considerable experience with automated tools. It is my hope that, through this working group, we can share what we learned on the very steep learning curve we climbed with this technology; we have many insights to offer.

At the moment, the “tools” working group seems to be making slow progress and is not yet in a position to accept suggestions or feedback. In the meantime, I am gathering views within Oracle and the rest of the vendor community on what we want to see in this area, so that we are ready when the working group opens up. Early input might help set the working group on a reasonable course from the start, because I suspect that significant changes will be hard to make once the working group is well into its process.

