What is Tokenisation?

Will tokens replace money as we know it?

When used for data security, tokenisation is the process of replacing a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference (i.e. an identifier) that maps back to the sensitive data through the tokenisation system. The mapping from original data to token uses methods that render tokens infeasible to reverse in the absence of the tokenisation system, for example tokens created from random numbers. The tokenisation system must be secured and validated using security best practices for sensitive data protection, secure storage, audit, authentication and authorisation. It provides data-processing applications with the authority and interfaces to request tokens, or to detokenise back to sensitive data.

The security and risk-reduction benefits of tokenisation require that the tokenisation system is logically isolated and segmented from the data-processing systems and applications that formerly processed or stored the sensitive data now replaced by tokens. Only the tokenisation system can tokenise data to create tokens, or detokenise to recover sensitive data, and then only under strict security controls. The token generation method must be shown to have the property that there is no feasible means of attacking it directly to recover the original data.
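To make the mechanics concrete, here is a minimal sketch in Python of such a tokenisation system. It is illustrative only: the `TokenVault` class and its method names are hypothetical rather than a real library API, and the in-memory dictionaries stand in for the secured, audited store a production system would require.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens to sensitive values."""

    def __init__(self):
        self._token_to_value = {}   # token -> sensitive value
        self._value_to_token = {}   # sensitive value -> token (for reuse)

    def tokenize(self, value: str) -> str:
        # Return the existing token so each value maps to exactly one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is pure randomness: it has no mathematical relationship
        # to the value, so it is infeasible to reverse without this mapping.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # In a real deployment this call sits behind strict authentication,
        # authorisation and audit controls.
        return self._token_to_value[token]
```

Because the tokens come from a cryptographic random number generator, an attacker holding only tokens learns nothing about the underlying data; everything depends on protecting the vault's mapping.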

When tokens replace live data in systems, the footprint of sensitive data across applications, stores, people and processes is minimised, reducing the risk of compromise, accidental exposure and unauthorised access. Applications can operate using tokens instead of live data, with the exception of a small number of trusted applications that are explicitly permitted to detokenise when strictly necessary for an approved business purpose. Tokenisation systems may be operated in-house within a securely isolated segment of the data centre, or as a service from a secure service provider.

Tokenisation may be used to protect sensitive data such as bank accounts, financial statements, medical records, criminal records, driver's licences, loan applications, stock trades, voter registrations and other types of personally identifiable information (PII). Tokenisation is often used in credit-card processing. The PCI Council defines tokenisation as "…a process that replaces the primary account number (PAN) with a substitute value called a token. De-tokenization is the reverse process of restoring a token to its PAN value. The security of an individual token depends predominantly on the infeasibility of determining the original PAN knowing only the surrogate value." The choice of tokenisation as an alternative to other techniques such as encryption depends on varying regulatory requirements, and on interpretation and acceptance by the relevant auditing or assessment bodies, in addition to any technical, architectural or operational constraints that tokenisation imposes in practice.
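As an illustration of tokenisation in the payment-card sense, the sketch below replaces a PAN with a surrogate that keeps the PAN's length and last four digits, a common format-preserving convention, while the remaining digits are random. The function names and in-memory vault are hypothetical stand-ins for a secured tokenisation service, not a real API.

```python
import secrets

_pan_vault: dict[str, str] = {}  # token -> PAN; stands in for the secured vault

def tokenize_pan(pan: str) -> str:
    # Preserve length and the last four digits, so downstream systems that
    # only display or route on "last four" can operate on tokens unchanged.
    body = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
    token = body + pan[-4:]
    _pan_vault[token] = pan
    return token

def detokenize_pan(token: str) -> str:
    # Restricted operation: only trusted applications may redeem a token.
    return _pan_vault[token]

token = tokenize_pan("4111111111111111")
print(token)                                  # random digits ending in 1111
assert detokenize_pan(token) == "4111111111111111"
```

Production tokenisation services also take care that a token cannot be mistaken for a live card number, for example by ensuring the surrogate fails the Luhn check; the sketch above omits such safeguards for brevity.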
