StrongKey - May 16, 2019

A Token for Your Thoughts on Vaultless Tokenization

In our efforts to meet our clients’ growing data storage and security demands, we regularly research new methods and carefully weigh whether they serve our clients’ best interests. Encryption schemes come in many forms, with strength and key size varying from one method to another, and the race to prevent malicious attacks and data breaches is balanced against a desire for speed and scalability. One recent model, called “vaultless tokenization” (VT), has generated a lot of industry buzz, so we at StrongKey thought it wise to address some of the questions we hear about it.

Nearly ten years ago, data security vendors offered their potential clients a variety of software-only encryption solutions, treating dedicated hardware as merely an option. StrongKey decided then to rely exclusively on a standardized hardware cryptoprocessor, called a Trusted Platform Module, or TPM. TPMs were conceived by an industry consortium called the Trusted Computing Group. The common theme here is trust: a trusted method, a trusted standard, a trusted solution for data protection and encryption. Standards offer a baseline from which to work, to address issues, and to interoperate with the widest range of interdependent systems.

Our data protection processes must conform to the Payment Card Industry Data Security Standard (PCI DSS), which defines requirements for network security, data protection, access control, and policy maintenance. When a card is swiped, cardholder data is encrypted and stored in a secure database, called a vault. The encryption method we use is also a standard: it was researched by mathematicians and cryptographers for years before NIST accepted it as a baseline, and it is among the algorithms mandated by the NSA’s Suite B guidance for protecting classified U.S. government content.
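To make the vault model concrete, here is a toy sketch in Python. It is not StrongKey’s implementation and omits the encryption-at-rest and access controls a real vault requires; the `Vault` class, its methods, and the sample PAN are all hypothetical, used only to show the shape of the flow: a random token stands in for the card number, and the mapping lives only inside the vault.

```python
import secrets

class Vault:
    """Toy vault-based tokenization (illustration only, not production code)."""

    def __init__(self):
        # token -> PAN; in a real system this is an encrypted, access-controlled database
        self._store = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, so it has no mathematical relationship to the PAN
        token = secrets.token_hex(8)
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original data
        return self._store[token]

vault = Vault()
token = vault.tokenize("4111111111111111")  # sample test PAN
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

Because the token is drawn at random, possessing it (or any number of tokens) tells an attacker nothing about the PAN; recovering the original data requires access to the vault itself.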

One feature that makes this and similar encryption methods secure is the ability to generate truly random tokens to substitute for sensitive data. The “tokenization” process in VT is also encryption, but the method is proprietary: the encryption algorithm has neither been vetted by the industry nor shown to conform to an industry best practice. Each security company can use its own programmer’s algorithm design, which is essentially the opposite of a standard. The tokens themselves are derived, having a mathematical relationship to the original data, and in theory can be reverse engineered (such a token is known as a “reversible token”). The PCI DSS guidelines put it this way: “The intent is to ensure that it is computationally infeasible to recover the original PAN [Primary Account Number] knowing only the token, a number of tokens, or token-to-PAN pairs that don’t include the original PAN. If it is feasible, the tokenization system is not secure.”¹
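The danger of a derived token can be demonstrated with a toy Python example, under deliberately simplified assumptions: a tiny candidate PAN space and an unkeyed hash standing in for a vendor’s proprietary derivation (no real VT product is being modeled here). Because the token is a deterministic function of the PAN, an attacker who knows the derivation can recover the PAN by simply re-deriving tokens for candidate values, with no vault access at all.

```python
import hashlib
import secrets

# Hypothetical, deliberately small PAN space for demonstration purposes
PANS = [f"41111111111111{i:02d}" for i in range(100)]

def derived_token(pan: str) -> str:
    # A "derived" (reversible) token: a deterministic function of the PAN itself
    return hashlib.sha256(pan.encode()).hexdigest()[:16]

def random_token() -> str:
    # A truly random token: no function of the PAN at all, so nothing to reverse
    return secrets.token_hex(8)

# An attacker who learns a derived token can brute-force it over candidate PANs
target = derived_token(PANS[42])
recovered = next(p for p in PANS if derived_token(p) == target)
assert recovered == PANS[42]  # original PAN recovered without any vault
```

The same attack is meaningless against the random token: since it carries no information about the PAN, re-deriving candidates gets an attacker nowhere, which is exactly the property the PCI DSS language above demands.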

Additionally, the "tokens" that VT companies return to the application for storage in its database are nothing more than encrypted data (ciphertext), simply because the "token" has to go back into the VT company's implementation to be substituted with the original cardholder data, and the VT companies claim they do not maintain a vault of encrypted cardholder data. This effectively brings the application back into scope for PCI DSS audits under sections 3.4, 3.5, and 3.6, because the application's database is now the vault of encrypted cardholder data. The VT companies are, essentially, transferring the risk back to the customer by making the “map” of ciphertext-to-token in the application’s lookup table the secret, rather than the token itself.

VT can offer a customer some financial advantages: reduced database licensing, for example, and less need for powerful hardware with large storage capacity and redundancy. It can also increase speed, since there is no vault transaction to look up and check for consistency and duplicates. In a well-managed IT environment, however, speed and scalability are not problems to be avoided but routine components of knowledgeable, efficient engineering.

Businesses always need to watch their bottom line; that’s a given. The reduced cost of VT may be attractive to your budget, but we think it is a risk that could prove costlier in the long run. We aim to earn your trust by using open-source code, complying with industry-vetted standards and practices, communicating clearly, and putting your interests at the center of our decisions. We believe these qualities are the hallmarks of a great partner. We will always think through the implications of new technologies and techniques with our deep understanding of security, and we will always put our customers’ well-being at the forefront of our decision-making.


¹ https://www.pcisecuritystandards.org/documents/Tokenization_Product_Security_Guidelines.pdf?agreement=true&time=1551196562887

