The body controlling payment card security rules has issued a report aimed at companies intending to use tokenisation as a way of restricting the scope of their compliance with the Payment Card Industry Data Security Standard (PCI DSS).
The Payment Card Industry Security Standards Council (PCI SSC) outlined the basic principles governing the use of tokenisation in its guidance document, PCI Tokenisation Guidelines, issued Friday. Tokenisation technology substitutes all or part of a card number, known as the Primary Account Number (PAN), with an alternative identifier, called a token. The token can then be processed by a merchant’s internal transaction systems, limiting access to and use of PAN data.
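To illustrate the substitution described above, here is a minimal sketch of a vault-based tokenisation service in Python. The class name and interface are hypothetical, and a real system would encrypt the mapping store, control access to it and log its use; this only shows the PAN-to-token substitution itself:

```python
import secrets

class TokenVault:
    """Illustrative token vault mapping PANs to random tokens.

    Hypothetical sketch only: a production vault would sit behind
    strict access controls, since anything that can recover the PAN
    remains in PCI DSS scope.
    """

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenise(self, pan: str) -> str:
        # Reuse an existing token so each PAN maps to one identifier
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        token = secrets.token_hex(8)  # random value, carries no card data
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenise(self, token: str) -> str:
        # This lookup restores the PAN, so access to it must be protected
        return self._token_to_pan[token]
```

Downstream merchant systems then handle only the token; only the vault (and whoever is authorised to call `detokenise`) can get back to the card number.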
PCI tokenisation best practices
The tokenisation best practices guidance aims to help merchants make the right choices by addressing four main areas:
Outlining scoping elements
This looks at the types of token that can be used and the considerations that apply to each of them. Jeremy King, European director of the PCI SSC, said the process is challenging because not all cards have a 16-digit PAN, so some tokenisation methods are more suitable than others depending on the card in question. Some tokens try to preserve the format of the original PAN in order to maintain compatibility with internal processing applications, while other approaches may generate a new truncated or randomised number, King said.
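A format-preserving approach of the kind King describes can be sketched as follows. This is illustrative only: which digits are preserved, and whether the result must be guaranteed not to collide with a real PAN, varies from product to product.

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Replace the middle digits of a PAN with random digits.

    Keeps the overall length, the first six digits (the issuer BIN)
    and the last four, a common format-preserving pattern, so that
    internal applications expecting a card-shaped number still work.
    Illustrative sketch, not a standardised scheme.
    """
    head, tail = pan[:6], pan[-4:]
    middle = "".join(secrets.choice("0123456789")
                     for _ in range(len(pan) - 10))
    return head + middle + tail
```

Because the token has the same length and digit layout as the original PAN, it can flow through existing transaction systems unchanged, while the sensitive middle digits never leave the tokenisation service.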
While the document offers helpful recommendations, the PCI SSC is not validating individual tokenisation systems, said Dan Konisky, director of product management for tokenisation specialists Liaison Technologies. Konisky said he hoped to find more specific guidance on choosing a type of token.
“While the published tokenisation guidelines are a great start, the PCI community is thirsting for the actual validation criteria to determine scoping,” he said in a written statement. “We're hopeful that the Council will eventually take this next important step in providing guidance to the community.”
Recommendations: Scope reduction, tokenisation process, deployment and operation factors
The purpose of tokenisation is to minimise exposure of the credit card details in any transaction; once the details are tokenised, the systems handling only tokens can be deemed out of scope for compliance purposes. But King said there also has to be a mechanism that allows the token to be traced back to the card it represents. For instance, if an acquirer needs to check a transaction, someone will need to identify the card in question, not just an anonymous token.
“Systems that allow you to get back to the PAN need to be properly protected, and are in scope,” King said.
Companies may well choose to outsource tokenisation, and the PCI tokenisation guidance also outlines the responsibilities and factors to consider, and lays down good security practices for accessing the services.
Detailing best practices for selecting a tokenisation solution
While not recommending any one product, the paper does provide merchants with some guidance on choosing a solution that suits their own needs, whether it be an in-house product or one supplied through a service provider.
Defining the domains where tokenisation could potentially minimise the card data environment
As King points out, tokenisation was once seen as an alternative to point-to-point encryption in helping companies limit the scope of their compliance, but the two techniques are now being combined by some merchants. “Point-to-point encryption works very well at the initial point of sale, but once the data gets into your systems, tokenisation works better,” he said. “It seems we can use the best of both technologies throughout the transaction process. The guidance explains how to introduce it at various stages of the transaction.”
Companies should not regard tokenisation as a “silver bullet” to de-scope their systems, said Mathieu Gorge, CEO of Dublin-based PCI specialists VigiTrust.
“Tokenisation can reduce your scope, but the guidance is open to interpretation, and there is a lack of industry standards for tokens,” he said. “You still need to have all the policies, procedures and technology to protect your account data, and you need to understand the design of systems so network components can be properly isolated to reduce the potential attack surface.” The final decision on whether any solution is sufficient will rest with the Qualified Security Assessor, he said.
Jeremy King agreed: “It’s not just a question of buying a solution,” he said. “It needs control and management, and you need to know what you are doing. If you are going to tokenise, you need to have a good relationship with the vendor to ensure they explain how it is going to work, and what the particular challenges are in your environment.”