Tokenization: The PCI Guidance

Supplement Outlines Best Practices for Merchants
The PCI Security Standards Council's new guidance for tokenization, PCI DSS Tokenization Guidelines Information Supplement, issued Aug. 12, offers clarification about tokenization and recommends steps merchants can take in selecting the provider that will best help them comply with the PCI Data Security Standard version 2.0, which was released in October.

"We recommend that the merchant perform a risk assessment as part of their due diligence when they're selecting a tokenization service provider," says Bob Russo, general manager of the PCI Security Standards Council.

"How are they storing tokens? What are they doing with them? How do you get the token? Is it encrypted? There are all kinds of issues that you really need to consider," Russo says in an interview with BankInfoSecurity.com's Tracy Kitten [transcript below].

While the guidance isn't meant to replace compliance demands for existing data security standards, Russo says it does offer steps for stakeholders, including best practices that can foster layered security.

During this interview, Russo discusses:

  • Key areas the new guidance addresses;
  • The need for risk analysis before, during and after tokenization is implemented;
  • Why a lack of standardization and industry understanding of tokenization was a catalyst for the council's decision to issue the guidance.

Russo has more than 25 years of high-tech business management, operations and security experience. In his role as general manager, Russo guides the PCI Council through its charter, which is focused on improving data security standards for merchants, banks and other key stakeholders involved in the global payment card transaction process. To fulfill this role, Russo works with representatives from American Express, Discover Financial, JCB, MasterCard Worldwide and Visa International to drive awareness and adoption of the PCI Data Security Standard.

New Tokenization Guidance

TRACY KITTEN: Last fall, the PCI Security Standards Council said it was taking a hard look at tokenization, evaluating how merchants and others should view it under the purview of PCI. Today the council has issued new guidance specific to tokenization and its connection to the PCI Data Security Standard. The council has talked about tokenization for some time now. Today's new guidance highlights how tokenization can impact card security, as well as affect compliance with the PCI Data Security Standard. Can you give us some background about the guidance as it relates to card security specifically?

BOB RUSSO: As part of our ongoing commitment to put out guidance on whatever might be an advancement in the payments area, the council has been working with members of the PCI community, specifically our special interest groups, and specifically on tokenization our scoping special interest group, to put together this guidance on the use of tokenization as it relates to the payment environment. It's primarily focused on helping the merchant community decide what they need to do when they're considering a tokenization acquisition and then some implementation decisions that they may want to make as well.

Just to clarify for your listeners once again, this is supplemental guidance. These are guidelines to use for tokenization, and they do not replace or supersede any PCI DSS requirement. You're still required to comply with PCI DSS. So instead, these guidelines basically aid these merchants in understanding how this different technology can help remove some of the cardholder data environment for them in terms of what they need to do in order to become compliant. And it's a very, very good starting document for them.

KITTEN: For the purpose of these new guidelines, can you tell us how the PCI Council defines tokenization?

RUSSO: Tokenization technology is generally thought of as replacing a primary account number, or the PAN, with a value basically called the token. When it comes to PCI DSS, this involves basically substituting sensitive PAN values with non-sensitive token values, and this can reduce or remove the need for the merchant to retain the PAN in their environment once the initial transaction has been processed.
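The substitution Russo describes can be illustrated with a minimal sketch. This is a hypothetical, toy in-memory vault written for illustration only, not a design from the guidance; a production solution would use a hardened, segmented, access-controlled vault and a vetted token-generation scheme.

```python
import secrets

class TokenVault:
    """Toy token vault: swaps a sensitive PAN for a non-sensitive token."""

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Replace the PAN with a random token value that has no
        # mathematical relationship to the original number.
        token = secrets.token_hex(8)
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original PAN.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"                  # merchant stores only the token
assert vault.detokenize(token) == "4111111111111111"
```

The point of the model is scope reduction: systems that hold only tokens never see the PAN, so properly segmented, they can fall outside the cardholder data environment.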

Tokenization & Card Security

KITTEN: Now the PCI Council views tokenization as an emerging technology, yet tokens have been around for decades. In fact, some argue the technology is relatively dated. How does the council view tokens and the role they play in card security today?

RUSSO: Well, you're absolutely right. Tokenization is a technology that's been around in one form or another for quite some time, and really it's not an emerging technology, but as with many of these technologies, it's really still evolving. We see just the sheer number and variety of token solutions that are out there on the market right now for merchants to choose from, and this guidance pretty much helps them in that process. What we're doing with this paper is officially acknowledging for the first time the potential of the use of tokenization to reduce the scope of what the cardholder data environment is, and therefore reduce the pain that someone has to go through to get PCI compliant as a merchant. In general what we're saying is that tokenization can provide a model to centralize the cardholder data storage and basically minimize the number of cardholder data occurrences that we see in a particular environment. But again, like with any technology, this isn't a magic bullet, and implementation is really key. A properly implemented tokenization solution can reduce or remove the need for a merchant to retain the PAN in their environment once the initial transaction has been processed.

Concerns in the Wake of RSA

KITTEN: The recent breach of RSA tokens, of course, has raised a number of concerns about token security. How do you view those criticisms, and do you deem tokenization to remain a viable way to protect sensitive data such as cardholder data?

RUSSO: This paper is focused obviously on specific card data and the impact of tokenization technology in the card payment environment. So it's specific to that. As with any technology, security issues are still going to come up, and proper implementation and then, more importantly, ongoing maintenance and monitoring is always going to be important. A couple of the key areas that we look at in the paper are areas that merchants need to make sure are addressed with the specific solutions that they're choosing. Ongoing monitoring, as I just mentioned, ensures that the solution is using logging, monitoring and alerting as appropriate to identify any kind of suspicious activity that might be going on and enable you to initiate any kind of response procedures that you've got. Another item that we recommend is implementing a risk analysis process. If you do a risk analysis process, you should do it not just before but also during and after you implement a tokenization solution. One other big one is making sure that you have strong authentication and access controls in place for access to the tokenization system. Whether it's tokenizing or de-tokenizing the data, the authentication credentials really must be kept secure and away from unauthorized access.

Again, all of these considerations are part of the existing PCI DSS, and this guidance pretty much clarifies the impact of what a tokenization solution would be on your cardholder data environment.
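The controls Russo lists here, strong authentication before de-tokenization plus logging and alerting on every access attempt, can be sketched as follows. The allow-list, user names, and token value are hypothetical illustrations, not anything specified by the supplement.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("token-vault")

# Assumed allow-list of identities permitted to de-tokenize (illustrative only).
AUTHORIZED_USERS = {"settlement-service"}

# Toy token store for the sketch.
_token_to_pan = {"a1b2c3": "4111111111111111"}

def detokenize(user: str, token: str) -> str:
    """Return the PAN for a token, enforcing access control and audit logging."""
    if user not in AUTHORIZED_USERS:
        # Alerting on suspicious activity, as the guidance recommends.
        log.warning("ALERT: unauthorized de-tokenization attempt by %s", user)
        raise PermissionError(f"{user} is not authorized to de-tokenize")
    log.info("de-tokenization by %s", user)
    return _token_to_pan[token]
```

Reviewing the resulting logs on a regular basis, to confirm that only authorized users and system components ever reach the tokenization/de-tokenization process, is one of the best practices the supplement calls out.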

KITTEN: That's a great point that you make and I'm going to build on that discussion just a bit here, to talk about some of these different layers of security and how tokenization falls into the greater scope of security. These new guidelines for tokenization are just one piece. They're just one piece of guidance in a series of security guidelines issued by the council. How does this new tokenization supplement actually complement the other supplements that the council has recently issued?

RUSSO: That's a great question. We're looking at each one of these guidance documents to be used as a layer on top of PCI DSS, certainly to make you more secure. You may recall that we just released guidance specific to PCI DSS in virtualized environments not too long ago. These guidance documents are pretty much products of ongoing evaluation of various technologies that are out there, and more importantly what the impact is on PCI DSS by using one of these technologies. They're produced, very importantly, in conjunction with the PCI community, namely our special interest groups. The members of these special interest groups, and specifically the scoping group on tokenization, played a huge role in this particular piece of guidance, and we rely on industry input to make sure that we're giving our stakeholders everything that they need. These are all complementary. We're seeing tokenized environments being used with point-to-point encryption, being used in virtual environments, so they all sort of mesh together. And as a reminder, this certainly doesn't take the place of PCI DSS requirements. It's really meant to clarify what the DSS is looking for and give you the ability to add these specific technologies in that context.

Unifying Tokenization Standards

KITTEN: Lack of industry standards for tokenization has been a concern for the council, and you noted this earlier in our discussion. How does the council hope to unify common industry standards and definitions surrounding tokenization?

RUSSO: You're absolutely right. There is no standard right now for tokenization, and as I mentioned a little bit earlier, this document is really a starting point for merchants, as there are no standards. We sort of recognize that people are looking for guidance and what they need to be aware of when they're trying to decide on which kind of tokenization solution they're going to pick, because, as you know, there are many tokenization solutions out there, all of which do things in a little bit different manner than the next. They're all good. They all have merit, but they're all different. This is something that the stakeholders asked for, and with these guidelines and their collaboration, we're providing what we think is a really great first step in best practices to get them eased into this tokenization technology and get it into their layered security.

KITTEN: The supplement also mentions best practices that merchants should consider when it comes to tokenization. Can you tell us what some of those best practices are?

RUSSO: Sure. There is actually a section within the document that talks about some of the best practices, as you said, and some recommendations for merchants. What much of it comes down to is just helping the merchant understand and ask the right questions to determine what kind of solution best fits their needs when it comes to securing their cardholder data. For example, we recommend that the merchant perform a risk assessment as part of their due diligence when they're selecting a tokenization service provider, because there are different providers out there. How are they storing tokens? What are they doing with them? How do you get the token? Is it encrypted? There are all kinds of issues that you really need to consider.

We also suggest that you verify the adequacy of any segmentation controls that you've got if these controls aren't part of the supplied solutions that you're getting from the tokenization vendor. Another thing we suggest is that you review logs of the merchant's interaction with the tokenization systems. Process what's going on there on a regular basis just to ensure that only users and system components that are authorized to go out there and access this tokenization/de-tokenization process, if you will, are actually getting there. These are just a few of the examples that are included in the guidance itself, but the bottom line is, again, no silver bullet here. If you think that you're going to be done with PCI DSS compliance by buying one of these solutions, you're in for a rude awakening.

KITTEN: That's a good point. And before we close, what final thoughts about the guidance would you like to share, generally?

RUSSO: In closing, I just encourage your listeners to be sure to check out the guidance. We're really pleased with this particular resource and the amount of work that's gone into it. As I said, I think it's a great starting point, considering the use of tokenization within your environment. Many people across the PCI community have been involved and collaborated on this. It's so great to have all of these, if you will, chefs in the kitchen helping us out, and you're talking about a mix here of not only vendors but merchants and certainly people from the card brands, as well as PCI people managing the process through. A great collaborative effort and a great document came out of it.


About the Author

Jeffrey Roman


News Writer, ISMG

Roman is the former News Writer for Information Security Media Group. Having worked for multiple publications at The College of New Jersey, including the College's newspaper "The Signal" and alumni magazine, Roman has experience in journalism, copy editing and communications.



