Merchants and their banks are mandated by the credit card companies to adhere to the Payment Card Industry Data Security Standard (PCI DSS), a set of security best practices defined by the card industry and focused on protecting cardholder data. Compliance is verified by recurring assessments, and failure to comply can bring penalties ranging from fines to the revocation of card processing privileges. Meeting the terms of PCI DSS has cost many companies dearly: the National Retail Federation estimates that, as of June 2009, its members had spent more than $1 billion on PCI DSS compliance.
As a result, there are now hundreds of solutions across dozens of categories, all claiming to address one aspect or another of PCI DSS. While much of this activity is simply an attempt to give old solutions new life, some of it breaks genuinely new ground. Perhaps the most interesting development is a data security approach called tokenization.
How tokenization works
Tokenization substitutes a credit card number, which can be monetized and thus has value, with a random string of characters (a token) that can't be monetized and is thus valueless. Security improves because systems that store tokens instead of card numbers no longer hold anything worth stealing: even if the tokens are compromised, the bad guys gain nothing and customers lose nothing. PCI DSS scope is reduced because tokens are not cardholder data and thus aren't subject to the standard.
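To make the mechanics concrete, here is a minimal sketch of the idea in Python. All names are hypothetical, and the in-memory dictionary stands in for what would, in practice, be a hardened, access-controlled vault service:

```python
import secrets

# Hypothetical in-memory token vault, for illustration only. In practice
# the vault is a hardened, access-controlled service, and it is the only
# system that ever holds the real card number.
_vault = {}  # token -> card number

def tokenize(card_number: str) -> str:
    """Swap a card number for a random, valueless token."""
    token = secrets.token_hex(8)  # 16 random hex characters
    _vault[token] = card_number
    return token

def detokenize(token: str) -> str:
    """Recover the card number; only the payment system should call this."""
    return _vault[token]

token = tokenize("4111111111111111")
print(token)              # e.g. '9f2c4e1ab07d5633' -- safe to store anywhere
print(detokenize(token))  # '4111111111111111'
```

Everything outside the vault sees only the random token, which is exactly why those systems fall out of PCI DSS scope.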
Merchants of any size can benefit from tokenization as it provides two types of value:
• It protects cardholder data by minimizing its use.
• It reduces PCI DSS scope and thus lowers compliance costs.
Merchants that outsource payment processing, however, can get out of PCI DSS entirely—they never touch cardholder data, nor do they store it anywhere.
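As a rough illustration of that outsourced flow (hypothetical names throughout), the sketch below shows a merchant-side callback that receives only a token and an authorization status from the processor's hosted payment page; no card number ever reaches the merchant's code:

```python
from dataclasses import dataclass

# Merchant-side order store; in practice, a database.
orders: dict = {}

@dataclass
class PaymentCallback:
    order_id: str
    token: str    # opaque reference to the card, held in the processor's vault
    status: str   # 'approved' or 'declined'

def handle_payment_callback(cb: PaymentCallback) -> None:
    # The merchant stores only the token (for refunds, recurring billing,
    # analytics) and the result. No cardholder data enters its systems.
    orders[cb.order_id] = {"token": cb.token, "status": cb.status}

handle_payment_callback(PaymentCallback("order-42", "tok_9f2c4e", "approved"))
print(orders)  # {'order-42': {'token': 'tok_9f2c4e', 'status': 'approved'}}
```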
Tokenization vs. encryption
The main alternative to tokenization, at least from a security perspective, is encryption. With encryption in place, merchants store only encrypted versions of credit card data, and only authorized accounts and processes get the keys to recover actual card numbers. Encryption is, however, notoriously expensive to implement and manage; in particular, putting the appropriate key management systems and processes in place drives significant infrastructure and staffing costs. Encryption also typically produces stored values larger than the card numbers they replace, which can wreak havoc with other systems: an application that used to receive a 16-digit credit card number now gets a longer encrypted value, potentially containing unexpected characters. Ensuring that related applications, such as those for marketing, charge-backs and account history, continue to function after encryption is implemented drives additional expense. In the future, format-preserving encryption may help address this issue by returning ciphertext with the same number of characters.
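A quick illustration of the format problem, using Python and the third-party cryptography package (the key handling here is a toy assumption for brevity, not a key-management recommendation):

```python
import base64, os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # toy key handling, for brevity
aes = AESGCM(key)
nonce = os.urandom(12)

pan = b"4111111111111111"                   # 16-digit card number
ciphertext = aes.encrypt(nonce, pan, None)  # 16 bytes + 16-byte auth tag
stored = base64.b64encode(nonce + ciphertext).decode()  # the database value

print(len(pan), len(stored))  # 16 vs. 60 characters
print(stored)                 # base64 text, not a 16-digit number
```

Any downstream system expecting a 16-digit numeric field chokes on that 60-character string, which is precisely the integration cost described above.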
In contrast, tokens can be implemented to preserve data format. If related applications expect a 16-digit card number, merchants can implement tokens of that length. Tokens can also retain partial values of the card numbers they replace, such as the first six or last four digits, to further streamline integration. As a result, tokens can more easily support the business processes that previously relied on credit card numbers: settlement, refunds/returns, charge-backs, recurring billing, frequency programs and marketing/sales analytics. Tokens are also less expensive to implement and manage, since they don't require the key management infrastructure associated with encryption.
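A sketch of what such a format-preserving token generator might look like; the function name and digit layout are assumptions for illustration:

```python
import secrets

def format_preserving_token(card_number: str) -> str:
    """Hypothetical format-preserving token: 16 digits, real last four.

    Keeping the last four digits (which PCI DSS truncation rules allow
    to be displayed) lets apps that show 'card ending 1111' keep working.
    Real implementations also ensure the token fails the Luhn check, so
    it can never be mistaken for a live card number.
    """
    random_digits = "".join(secrets.choice("0123456789") for _ in range(12))
    return random_digits + card_number[-4:]

print(format_preserving_token("4111111111111111"))  # e.g. '8302657194421111'
```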
Tokenization for internal operations
Tokenization isn’t for everyone, but for the right organization, its ability to both bolster security and slash PCI DSS expense is uniquely valuable and thus worth consideration. Those with the potential to gain include:
• Online merchants: Tokenization delivers near-perfect risk transference for online merchants, especially those that already outsource payment processing. In this case, merchants never possess credit card data, and thus can make a strong case for declaring themselves completely outside of PCI DSS scope. Payment processors/token providers bear all the risk. Merchants retain a proxy (the token) to support internal business process requirements.
• Traditional merchants: More diverse merchants, like those with extensive internal systems and distributed brick-and-mortar stores, aren’t the greatest prospects for tokenization. They are more likely to want to do it themselves rather than outsource. So while they can remove systems from PCI DSS scope, they can’t benefit from the same degree of risk transference as those that completely outsource payment processing. That said, tokenization could offer traditional merchants a way to support time-sensitive new initiatives with minimum fuss.
• Emerging providers: Service providers that don’t already have extensive internal payment-related infrastructure are good candidates to use tokenization for their internal operations. If you are building (or re-building) your app/on-demand content store, why do payments yourself if a tokenization partner can improve your time to market, bolster security and reduce compliance expense?
• Established providers: Established providers with extensive internal infrastructure may not be the best candidates for using tokenization as part of their internal operations. It could work nicely, however, for short time frame/departmental/one-off projects as a way to meet tight deadlines and support business requirements while providing needed protection. Need to accept pre-orders for a hot new phone that launches next week? A micro-site whose shopping cart is outsourced to a tokenized payment processor could make it happen.
• Large enterprises: Since tokenization by itself is unlikely to justify large-scale infrastructure changes, enterprises may find that, as with established providers, the best fit is short time frame/departmental/one-off projects that must meet tight deadlines while providing needed protection.
• Cloud applications: The distributed nature of cloud applications makes them a strong potential fit for tokenization via partnership with a third-party payment provider.
Tokenization as a service
Tokenization can also drive revenue for service providers as a basis for new service offerings. Because it is a disruptive technology, relatively lightweight to implement and driven by compliance urgency, tokenization offers an attractive wedge for service providers to incrementally grow their business. Successful providers will add additional payment-, compliance- and security-related offerings to this base over time.
For service providers, tokenization’s potential as a revenue generator relies on synergy with existing offerings. Those with the most potential are already in the payment stream or are doing business with retailers in a way that’s not far removed from the checkout line. For example, if your security services organization already helps customers secure cardholder data or achieve PCI DSS compliance, a tokenization offering may be a great fit. The same is true if your hosting business already has dedicated e-commerce hosting options, SSL certificates and so on. The largest providers may have multiple synergies and could offer tokenization across multiple lines of business.
Lack of standards leads to challenges
Of course, tokenization has its share of issues as well. Most notable is a complete lack of standards: from how tokens are formatted, to how they are generated, to how they are stored and accessed, there is no consistent approach.
Not only does this make it tricky to move from an internal to an outsourced solution, it also increases the risk of vendor lock-in. Once you've processed lots of transactions and have thoroughly integrated internal systems with your provider's token format, switching could be very painful. What's to stop your provider from raising rates?
The lack of standards may cause large organizations either to pass on tokenization entirely or to roll out their own solutions to retain control. Large firms can, however, use their leverage to get providers to support their chosen token format, so outsourcing could remain a viable option for time-sensitive or departmental needs in a way that doesn't disrupt broader internal processes. Sony Pictures could still roll out a last-minute commerce micro-site for its summer blockbuster, for example, because it can pressure First Data to support its token format; Sony's internal marketing and sales processes would keep functioning because the outsourced tokens look the same as the ones generated internally. Smaller organizations are generally less exposed to lock-in, as they have fewer internal processes that would be disrupted by a change in provider.
Tokenization today and tomorrow
Tokenization solutions are just now rolling out. Services have recently become available from providers like Element Payment Services and Shift4; software is available from vendors like nuBridges and Voltage Security. Akamai, First Data/RSA and Heartland are currently in trial/beta and should have generally available offerings by the end of the year.
In the near term, tokenization’s relative ease of adoption and ability to reduce PCI DSS scope will drive adoption, particularly by firms that already outsource components of the payment process and whose business process requirements (for example, to support marketing analytics) are modest. So, if you already leverage a provider like First Data for payment processing, you will soon get tokenization as an option. Organizations that prefer to roll out their own solutions will get tokenization as another feature of their data/payment security solution from software vendors.
Longer term, tokenization will simply become a feature of broader payment/security offerings. So while today's customers may seek out tokenization as a discrete offering, tomorrow's customers will base their purchasing decisions on other factors, like ease of internal integration, assuming tokenization is on the menu from any provider. This is particularly true as the lines between technologies like tokenization and encryption blur. Perhaps the distinction will disappear entirely at some point, but for the foreseeable future, tokenization and encryption will let customers balance security against availability, and protection against intrusiveness, as their business requirements demand.
Though far from a silver bullet, tokenization's ability to improve security and slash compliance expense makes it uniquely attractive to enterprises and service providers alike.
Julian is a principal analyst in Yankee Group’s Anywhere Network research group. He leads the company’s research in the area of network intelligence — the emerging solutions and technologies that help service providers design and deploy flexible, scalable and secure services over the intelligent IP network.
Holland is a senior analyst with Yankee Group's Anywhere Network team. His research examines the development of technologies enabling mobile transactions in both the digital and physical domains.