Origins of Tokenisation

Reducing risks of handling high value financial instruments


The concept of tokenisation, as adopted by the industry today, has existed since the first currency systems emerged centuries ago as a way of reducing the risk of handling high-value financial instruments by replacing them with substitutes. In the physical world, coin tokens have long been used in place of minted coins and banknotes; in recent history, subway tokens and casino chips have replaced cash, mitigating risks such as theft within their respective systems. Exonumia and scrip are related terms for such substitutes. Since the 1970s, similar substitution techniques have been used in the digital world to isolate real data elements from other data systems. In databases, for example, surrogate key values have been used since 1976 to separate data tied to internal database mechanisms from their external equivalents for a variety of data-processing uses. More recently, this isolation tactic has been extended into a security mechanism for data protection. In the payment card industry, tokenisation is one way to protect sensitive cardholder data and comply with industry standards and government regulations.

In 2001, TrustCommerce created the tokenisation concept to protect sensitive payment data for its client Classmates.com, which engaged TrustCommerce because the risk of storing cardholder data was too great if its systems were ever hacked. TrustCommerce developed TC Citadel®, with which customers could reference a token in place of cardholder data while TrustCommerce processed payments on the merchant's behalf. This secure billing application allowed customers to process recurring payments without storing cardholder payment information. Tokenisation replaces the primary account number (PAN) with randomly generated secure tokens. If intercepted, the data contains no cardholder information, making it useless to hackers: the token cannot be reversed to recover the PAN, even if the token and the systems on which it resides are compromised. Shift4 Corporation applied tokenisation to payment card data and released the technique to the public at an industry security summit in Las Vegas, Nevada, in 2005. The technology is designed to prevent the theft of credit card information in storage. Shift4 defines tokenisation in the following manner: “The concept of using a non-decryptable piece of data to represent, by reference, sensitive or secret data. In payment card industry (PCI) context, tokens are used to reference cardholder data that is managed in a tokenization system, application or off-site secure facility.”
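The core mechanism described above can be sketched in a few lines: a random token is generated, the token-to-PAN mapping is kept in a vault that only the tokenisation system can query, and the token itself carries no information about the PAN. This is a minimal illustration, not any vendor's actual design; the `TokenVault` class and its methods are hypothetical names, and a real vault would live in a hardened, access-controlled data store rather than an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Illustrative token vault mapping random tokens to PANs.

    Sketch only: a production tokenisation system keeps this mapping
    in a hardened, audited data store with strict access controls.
    """

    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        # Generate a random 16-digit token. Because it is random, the
        # token cannot be "reversed" to recover the PAN; the only link
        # is the mapping held inside the vault.
        while True:
            token = "".join(secrets.choice("0123456789") for _ in range(16))
            if token not in self._token_to_pan:
                break
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can resolve a token back to the original PAN.
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

A merchant's systems would store and transmit only the token; a stolen database of tokens is worthless without access to the vault.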

To protect data throughout its life cycle, tokenisation is often combined with end-to-end encryption, which secures the data in transit to the tokenisation system or service; a token then replaces the original data on return. For example, to prevent malware from stealing data on low-trust systems such as point-of-sale (POS) systems, as in the 2013 Target breach, cardholder data must be encrypted before, not after, it enters the POS. Encryption takes place within the confines of a security-hardened and validated card-reading device, and the data remains encrypted until received by the processing host, an approach pioneered by Heartland Payment Systems to secure payment data against advanced threats and now widely adopted by payment processing and technology companies. The PCI Council has also specified end-to-end encryption (certified point-to-point encryption, or P2PE) for various service implementations in its point-to-point encryption documents.
