Effective, In-House Distributed Data Tokenisation

Protegrity : 22 July, 2010  (Special Report)
Ulf Mattsson of Protegrity provides insight into the challenges of data tokenisation and how his company has developed a scalable and distributed approach to tokenisation which overcomes the challenges associated with the more usual centralised methods
A new approach to tokenizing data eliminates the challenges associated with standard centralized tokenization. Particularly in high-volume operations, the usual way of generating tokens is prone to issues that impact the availability and performance of the data. From a security standpoint, it is critical to address the issue of collisions, which occur when a tokenization solution assigns the same token to two separate pieces of data. This next-generation tokenization solution addresses all of these issues: system performance, availability and scaling are enhanced, numeric and alpha tokens are generated to protect a wide range of high-risk data, key management is greatly simplified, and collisions are eliminated. This new approach has the potential to change where tokenization can be used.

Different ways to render data unreadable

There are three different ways to render data unreadable:

1) Two-way cryptography with associated key management processes
2) One-way transformations including truncation and one-way cryptographic hash functions
3) Index tokens and pads

Two-way encryption of sensitive data is one of the most effective means of preventing information disclosure and the resulting potential for fraud. Cryptographic technology is mature and well proven. The choice of encryption scheme and topology is critical in deploying a secure, effective and reasonable control.

Hash algorithms are one-way functions that turn a message into a fixed-length fingerprint, usually no more than a few dozen bytes long. Truncation discards part of the input field. These approaches reduce the cost of securing a data field in cases where the business does not need the original value and will never need to recover it.
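The one-way options above can be sketched in a few lines of Python (the field name and salt are illustrative):

```python
import hashlib

def truncate_pan(pan: str, keep_last: int = 4) -> str:
    """Truncation: discard all but the last few digits of the field."""
    return "*" * (len(pan) - keep_last) + pan[-keep_last:]

def hash_pan(pan: str, salt: bytes) -> str:
    """One-way hash: a fixed-length fingerprint from which the
    original value cannot be recovered."""
    return hashlib.sha256(salt + pan.encode()).hexdigest()

pan = "4111111111111111"
print(truncate_pan(pan))                    # ************1111
print(hash_pan(pan, b"per-system-salt"))    # 64 hex characters, irreversible
```

Both outputs are deterministic, so they still support matching and de-duplication, but neither can be turned back into the original number.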

Tokenization substitutes sensitive data with replacement values that retain the essential characteristics of the original without exposing it. A token can be thought of as a claim check that an authorized user or system can use to obtain sensitive data such as a credit card number. Using tokenization, all credit card numbers stored in business applications and databases are removed, and the originals are placed in a highly secure, centralized management server that can be protected and monitored using robust encryption technology.
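The claim-check model can be sketched as a toy in-memory vault (purely illustrative: a real token server would encrypt its store, persist it, and enforce access control):

```python
import secrets

class TokenVault:
    """Minimal claim-check model: tokens map back to sensitive values
    only inside the vault; the tokens themselves carry no usable data."""

    def __init__(self):
        self._by_value = {}   # original value -> token
        self._by_token = {}   # token -> original value

    def tokenize(self, pan: str) -> str:
        if pan in self._by_value:            # same input, same token
            return self._by_value[pan]
        while True:
            # random numeric token with the same length as the input
            token = "".join(secrets.choice("0123456789") for _ in pan)
            if token not in self._by_token:  # naive collision check
                break
        self._by_value[pan] = token
        self._by_token[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._by_token[token]         # authorized callers only

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert vault.detokenize(token) == "4111111111111111"
```

Note the `while` loop: a naive random-token generator must check for collisions on every request, which is exactly the centralized bottleneck the rest of this article discusses.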


A Central Token Solution

* Minimize risk of exposing data with intrinsic value
* Reduce number of potential attacker targets
* Reduce PCI assessment costs, which can run upwards of $225K per year

All industries can benefit from centralizing and tokenizing data. Tokenization is about understanding how to design systems and processes to minimize the risk of exposing data elements with intrinsic (or market) value. An enterprise tokenization strategy reduces the overall risk to the enterprise by limiting the number of people with access to confidential data. When tokenization is applied strategically to enterprise applications, confidential data management costs fall and the risk of a security breach is sharply reduced. Security is immediately strengthened by reducing the number of potential targets for would-be attackers. Studies have shown annual audits average $225K per year for the world's largest credit card acceptors.

Any business that collects, processes or stores payment card data is likely to gain measurable benefits from central tokenization. Most of the tokenization packages available today focus on the point of sale (POS): card data is removed from the process at the earliest point and replaced with a token that has no value to an attacker. These approaches are offered by third-party gateway vendors and other service providers.


Common Myths about Tokenization

Myth: Data is gone forever if you lose access to the encryption keys
Myth: Do not encrypt data that will be tokenized
Myth: Tokens transparently solve everything

There has been erroneous information published about tokenization. For example, that tokenization is better than encryption because "if you lose access to the encryption keys the data is gone forever." This issue exists with both tokenization and encryption, and in both cases it can be managed through proper key management and a secure key-recovery process. Both a key server and a token server can crash, and therefore must have a backup. The token server often uses encryption to protect the data stored in it, so the token server can also lose a key. A solid key management solution and process is a critical part of any enterprise data protection plan.

Some articles encourage businesses not to encrypt data that they plan to tokenize. They claim that encrypted data takes more storage space than clear text, and that many forms of sensitive data contain more characters than a 16-digit credit card number, causing storage and manageability problems. This is untrue if you are using Format-Controlling Encryption (FCE), which is available from Protegrity. Telling companies not to encrypt because of an issue that is easily addressed denies them a critical layer of security that is the best line of defense for sensitive data.
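Protegrity's FCE itself is proprietary, but the property it provides, ciphertext with the same length and character set as the input, can be illustrated with a deliberately simplified toy cipher. This is not secure format-preserving encryption; production systems should use a vetted construction such as NIST FF1:

```python
import hashlib

def toy_format_preserving(pan: str, key: bytes, decrypt: bool = False) -> str:
    """Toy digit-wise stream cipher: the output has the same length and
    character class (digits) as the input, so it fits in the same column.
    Illustrative only -- NOT cryptographically sound FPE."""
    # derive a keystream from the key and the field length
    stream = hashlib.sha256(key + len(pan).to_bytes(2, "big")).digest()
    out = []
    for i, ch in enumerate(pan):
        k = stream[i % len(stream)] % 10
        d = (int(ch) - k) % 10 if decrypt else (int(ch) + k) % 10
        out.append(str(d))
    return "".join(out)

cipher = toy_format_preserving("4111111111111111", b"demo-key")
plain = toy_format_preserving(cipher, b"demo-key", decrypt=True)
```

Because `cipher` is still a 16-digit string, it needs no extra storage space and no schema changes, which is the point being made against the "don't encrypt before tokenizing" claim.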

Tokenization can be complicated in larger retail environments or enterprises because the data resides in many places, different applications, and service providers. Applications that process the real value of the data would need to be reworked to support tokenization. The cost of changing the application code can be hard to justify against the level of risk reduction. Regardless of industry, if the data resides in many different places, switching to tokenization will probably require some programming changes, and legacy applications that cannot be rebuilt may be unable to adopt it at all.

Tokenizing and the Data Lifecycle

The combined approaches of tokenization and encryption can be used to protect the whole data lifecycle in an enterprise. The combination also provides high-quality, production-level data in test environments, virtualized servers and outsourced environments.

In the development lifecycle, there is a need to perform high-quality test scenarios on production-quality test data by reversing the data-hiding process. Key data fields that can be used to identify an individual or corporation need to be cleansed to depersonalize the information. In the early stages of implementation, cleansed data needs to be easily restored (for downstream and feeding systems), which requires two-way processing. The restoration process should be limited to situations for which there is no alternative to using production data.

Authorization to use this process must be limited and controlled. In some situations, business rules must be maintained during any cleansing operation (addresses for processing, dates of birth for age processing, names for gender distinction). There should also be the ability to set parameters, or to select or identify fields to be scrambled, based on a combination of business rules.
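A minimal sketch of rule-preserving cleansing, with illustrative field names and rules (birth year kept for age processing, postcode district kept for regional processing, names replaced outright):

```python
import random

def cleanse_record(rec: dict, seed: int = 0) -> dict:
    """Depersonalize a test record while keeping the parts that business
    rules depend on. Field names and rules here are assumptions for
    illustration, not a real cleansing policy."""
    rng = random.Random(seed)
    year, _, _ = rec["date_of_birth"].split("-")
    return {
        # replace the name entirely with a synthetic one
        "name": rng.choice(["Alex Smith", "Sam Jones", "Chris Lee"]),
        # keep only the birth year so age-based processing still works
        "date_of_birth": f"{year}-01-01",
        # keep the postcode district so regional processing still works
        "postcode": rec["postcode"].split()[0] + " 0XX",
    }

rec = {"name": "Jane Doe", "date_of_birth": "1960-05-17", "postcode": "SW1A 1AA"}
print(cleanse_record(rec))
```

Selecting which fields are scrambled, and which parts of them survive, is exactly the parameter-driven, rule-based configuration described above.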

Should a company build their own tokenizing solution?

Developing all the capabilities to build a solution in-house can present significant challenges. To be implemented effectively, every application that currently houses payment data must be integrated with the centralized tokenization server. Developing these integrations requires a great deal of expertise to ensure performance and availability. Writing an application that can issue and manage tokens in heterogeneous environments, and support multiple field-length requirements, is complex and challenging.

Furthermore, ongoing support of this application could be time-consuming and difficult. Allocating a dedicated resource to such a large undertaking, and covering that person's existing responsibilities, could present logistical, tactical, and budgetary challenges.

For many organizations, locating the in-house expertise to develop such complex capabilities as key management, token management, policy controls, and heterogeneous application integration can be very difficult. Writing code that interfaces with multiple applications, while minimizing the performance impact on those applications, presents an array of challenges. The overhead of maintaining and enhancing a security product of this complexity can ultimately represent a huge resource investment and a distraction from an organization's core focus and expertise.

Security administrators looking to gain the benefits of centralization and tokenization, without having to develop and support their own tokenization server, should look at vendors that offer off-the-shelf solutions.

Reasons to keep the token server in-house

* Liability and risk
* Many applications use or store data
* Multi-channel commerce
* Security of outsourcing
* Recurring cost of tokenization as data volume grows, since an outsourcer may charge based on transaction volume
* Issues of transparency, availability, performance and scalability

Typically, companies do not want to outsource secure handling of data, since they cannot outsource risk and liability. Organizations are not willing to move the risk from their environment into a potentially less secure hosted environment. Further, enterprises need to maintain certain information about transactions at the point of sale (POS), as well as at higher levels. In most retail systems, a number of applications use or store card data, from the POS to the data warehouse, as well as sales audit, loss prevention, and finance. At the same time, the system needs to be adequately protected from attacks by data thieves.

Merchants who gather card data via web commerce, call centers and other channels should ensure that the product or service they use can tokenize data through all channels. Not all offerings in the market work well or cost-effectively in a multi-channel environment, particularly if the token service is outsourced. Merchants need to ensure that their requirements reflect current and near-future channel needs. Another concern is that tokenization, being newer and less proven, can pose additional risk relative to mature encryption solutions.

A risk management analysis will reveal whether the cost of deploying tokenization in-house is worth the benefits. An outsourcing environment must be carefully reviewed from a security standpoint and must provide a reliable service to each globally connected endpoint. Many merchants continue to object to having anyone other than themselves keep their card data. Often, these are leading merchants that have made significant investments in data security and simply do not believe that any other company has more motivation (or better technology) than they do to protect their data.

Along with separation of duties and auditing, a tokenization solution requires a solid encryption and key management system to protect the information in the tokenization server. By combining encryption with tokenization, organizations can have security, efficiency, and cost savings for application areas within an enterprise.

Holistic solutions can support end-to-end field encryption, which is an important part of the next generation protection of the sensitive data flow. In some data flows the best combination is end-to-end field encryption utilizing format controlling encryption from the point of acquisition and into the central systems. At that point the data field will be converted to a token for permanent use within the central systems. A mature solution should provide this integration between encryption/tokenization processes.

Security is addressed by running the tokenization solution in-house on a high-security network segment isolated from all other data and applications. If a segmented approach is used, most tokenization requests will need to be authorized to access this highly sensitive server. Access to the token server must be based on authentication, authorization, an encrypted channel, and monitoring or blocking of suspicious transaction volumes and requests.
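The gatekeeping described above might be sketched as follows (the session structure, role name and rate limit are illustrative assumptions, not a real token server API):

```python
def handle_request(session: dict, token: str, vault: dict,
                   request_count: int, rate_limit: int = 100) -> str:
    """Gatekeeping sketch for a token server: authenticate, authorize,
    and block suspicious request volumes before releasing data."""
    if not session.get("authenticated"):
        raise PermissionError("authentication required")
    if "detokenize" not in session.get("roles", ()):
        raise PermissionError("caller not authorized to detokenize")
    if request_count > rate_limit:
        # a sudden burst of detokenization requests is a classic sign
        # of a compromised application -- block rather than serve
        raise RuntimeError("request volume exceeds limit; blocking")
    return vault[token]

vault = {"9278346529873465": "4111111111111111"}
session = {"authenticated": True, "roles": ["detokenize"]}
print(handle_request(session, "9278346529873465", vault, request_count=1))
```

In a real deployment these checks sit behind the encrypted channel the article mentions, and every decision is logged for monitoring.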

Transparency, availability, performance, scalability and security are common concerns with tokenization, particularly if the service is outsourced. Transparency can be enhanced by selecting a solution that is well integrated into enterprise systems such as databases. Availability can be addressed by running the solution in-house on a high-availability platform; performance by running it locally on high-transaction-volume servers; and scalability by running it in-house on a high-performance corporate backbone network.

The Solution: Distributed Tokenization

Distributed tokenization is a method of storing sensitive strings of characters on a local server. This new approach changes where tokenization can be used. After years of research and development, Protegrity has developed a solution by intelligently altering the traditional backend processes used in tokenization.

This patent-pending way of tokenizing data eliminates the challenges associated with standard centralized tokenization and solves the outsourcing issues described above. In high-volume operations especially, the usual way of generating tokens is prone to issues that impact the availability and performance of the data, and collisions, where the same token is assigned to two separate pieces of data, remain a critical security concern.

This next-generation tokenization solution addresses all of these issues: system performance, availability and scaling are enhanced, numeric and alpha tokens are generated, key management is greatly simplified, and collisions are eliminated.

The benefits include:

* Scalability, with multiple parallel instances
* Dramatically higher performance
* High availability
* Centralized or distributed deployment
* No token collisions
* Support for PCI and PII data
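Protegrity has not published the details of its patent-pending method, but the general idea of static, pre-generated token tables can be illustrated with a small sketch. The chunk size, table construction and function names here are assumptions for illustration only, and a real deployment would also protect the tables themselves:

```python
import random

def build_token_table(seed: int, chunk_len: int = 4) -> dict:
    """Pre-generate a static token table: a random permutation of every
    fixed-length digit chunk. A permutation is bijective, so two distinct
    chunks can never receive the same token chunk -- no collisions, and
    no collision check is needed at tokenization time."""
    rng = random.Random(seed)
    chunks = [f"{i:0{chunk_len}d}" for i in range(10 ** chunk_len)]
    shuffled = chunks[:]
    rng.shuffle(shuffled)
    return dict(zip(chunks, shuffled))

def tokenize_pan(pan: str, table: dict, chunk_len: int = 4) -> str:
    """Any server holding a copy of the static table produces the same
    token, with no round-trip to a central token server."""
    return "".join(table[pan[i:i + chunk_len]]
                   for i in range(0, len(pan), chunk_len))

table = build_token_table(seed=42)
token = tokenize_pan("4111111111111111", table)
```

Because the table is generated once and then distributed, every instance tokenizes independently yet identically, which is what removes the central server as a performance and availability bottleneck.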

A solution can provide easy export of the static token tables to remote token servers to support a distributed tokenization operation. The static token tables can be distributed by a simple file export to each token server. Each token table should remain encrypted throughout the export and import operation.
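A sketch of such an export, assuming a key shared between token servers. For brevity only an integrity check is shown; as noted above, a real export would also be encrypted end to end:

```python
import hashlib
import hmac
import json

def export_table(table: dict, key: bytes) -> bytes:
    """Serialize a static token table for shipment to a remote token
    server. An HMAC tag detects tampering in transit; a production
    export would additionally encrypt the payload (e.g. AES-GCM via a
    vetted library)."""
    payload = json.dumps(table, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return tag + payload

def import_table(blob: bytes, key: bytes) -> dict:
    """Verify and load an exported table on the receiving server."""
    tag, payload = blob[:32], blob[32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("token table export failed integrity check")
    return json.loads(payload)

table = {"0000": "4821", "0001": "9307"}
blob = export_table(table, b"shared-export-key")
assert import_table(blob, b"shared-export-key") == table
```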


Conclusion

This new way of tokenizing data eliminates the challenges associated with standard centralized tokenization and has the potential to change where tokenization can be used.

It is important to understand that data is stored to enable follow-up checks, audits, and analysis. At the same time, the information stored on the servers is a security risk and needs to be protected. Even though the examples discussed in this article are mostly concerned with credit card numbers, similar issues are encountered when handling social security numbers, driving license numbers or bank account numbers. Companies need to deploy an enterprise tokenization and key management solution to lock down data of all kinds across the enterprise.

A holistic solution for data security should be based on centralized data security management that protects sensitive information from acquisition to deletion across the enterprise. Protegrity customers maintain complete protection over their data and business by employing software and solutions specifically designed to secure data, manage it via a centralized policy, and generate detailed security reports.

Third-party data security vendors develop solutions that protect data in the most cost-effective manner. External security technology specialists with deep expertise in data security techniques, encryption key management, and security policy in distributed environments are needed to find the most cost-effective approach for each organization. To maximize security with minimal business impact, a high-performance, transparent solution optimized for the dynamic enterprise requires a risk-adjusted approach to data security, one that optimizes the protection techniques deployed on each system in the enterprise.
   © 2012 ProSecurityZone.com