
Dan Simmons | Sep 10, 2020 | PCI DSS, Data Protection, Compliance, Cloud Computing

PCI DSS 4.0 and the Changing Approach to Compliance

In the past, the focus of PCI compliance was to store as little sensitive data as possible and keep it secured. In today's data-driven world, that focus has changed.

Architectural and system changes, new privacy regulations, and the need to maximize the use and consumption of data are all driving the major shifts we have seen lately. All of these factors behind this industry-wide evolution can be traced to one simple notion: data is the new gold, and organizations want to monetize it.

If you look at the world of PCI DSS over the past 10 years, tokenization emerged as a technology to reduce risk and compliance scope. The prevailing philosophy was to remove any data that was not needed and protect what little was kept. Fast forward to today, however, and organizations are thinking about digital transformation and the assets that will help them compete, with data at the heart of nearly every business decision. Organizations must therefore answer two questions:

  • How do we balance data collection and the associated risks?
  • How do we make sure we are getting the maximum utility from data if it is sensitive in nature?

PCI DSS 4.0

PCI DSS 4.0 is emerging, and organizations have to prepare for its arrival. If that preparation is not taken seriously, it will present a challenge, especially with regard to legacy PCI DSS technologies. PCI DSS 4.0 demands a more continuous assessment and risk management strategy, which is very different from the point-in-time assessments on which PCI was originally built. In practice, you could be compliant one day and potentially be penalized the next if there is no continuous risk strategy in place.

Another issue is that legacy PCI DSS controls and systems have agility limitations that will eventually impede organizations wanting to migrate to more agile designs, such as modern cloud-native architectures. Recently, we have seen container-based application architectures emerge, along with new types of stacks and new ways to orchestrate and build applications. As a result, many organizations are undertaking digital transformation at a fundamental level so they can be more agile, compete with emerging start-ups, and adapt to changing market conditions.

As an aside, COVID-19 is a prime example of a difficult, unpredictable market change that impacted the world and forced organizations to further accelerate their ability to respond to such conditions with regard to risk and data; agile methods are absolutely critical to that.

On top of all that, we have privacy regulations in addition to classic regulatory regimes like PCI DSS, and these apply new pressure on organizations that want to stay compliant while utilizing sensitive data. Furthermore, with the emergence of Big Data, machine learning, and AI, organizations are extracting more power from their data, but these technologies bring new privacy and security risks as well.

So there are many factors contributing to the heaps of sensitive data that organizations are amassing, and at the same time more and more standards and regulations are emerging that govern how that data has to be handled. Remember, there are two parts to the data protection formula: keep only the sensitive data you need, and protect what you keep. If you are going to keep more sensitive data for analytics and other business cases, then, as before, you have to protect it. Tokenization can be the key to this balancing act.

Tokenization is the key (with no need for key management!)

Out of all the data security solutions on the market, the best way to protect sensitive data while still enabling analytics is tokenization. Tokenization substitutes a sensitive data element (e.g. a name, address, or date of birth) with a non-sensitive equivalent known as a token. By tokenizing critical data, those who wish to carry out data analytics can extract insights without the risk of exposing personal, confidential data.
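As a rough illustration of the idea (a minimal sketch, not any vendor's actual implementation), the hypothetical `TokenVault` below swaps sensitive fields for random tokens and keeps the mapping only inside a secured vault:

```python
import secrets

# Minimal sketch of vault-based tokenization (hypothetical, not a product API).
# Each sensitive value is replaced by a random token; the token carries no
# information about the original value, and the mapping lives only in the vault.

class TokenVault:
    def __init__(self):
        self._token_to_value = {}   # held only inside the protected vault
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        """Return a non-sensitive token that stands in for `value`."""
        if value in self._value_to_token:
            return self._value_to_token[value]   # deterministic: same input, same token
        token = "tok_" + secrets.token_hex(8)    # random, reveals nothing about the value
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the original value; only authorized systems should call this."""
        return self._token_to_value[token]

vault = TokenVault()
record = {"name": "Jane Doe", "pan": "4111111111111111", "amount": 42.50}
protected = {k: vault.tokenize(v) if k in ("name", "pan") else v
             for k, v in record.items()}
print(protected)   # e.g. {'name': 'tok_9f2c...', 'pan': 'tok_5ab1...', 'amount': 42.5}
```

Downstream systems only ever see the `protected` record; the clear values stay confined to the vault.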

This eliminates one of the prime issues with classic security solutions, which are unable to protect sensitive data throughout its entire lifecycle. For instance, data that is protected with classic encryption has to be decrypted, and thereby exposed, before analytics can be performed. Not so with tokenization: analytics can be performed on tokenized data while it is still in a protected state. Should a third party stumble across this tokenized information, it will be worthless, as any identifiable information has been replaced.
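To make that concrete, here is a small hedged sketch (using hypothetical sample data already protected by a tokenization step like the one above): because deterministic tokens preserve equality, counts, group-bys, and joins work directly on the tokenized column, without ever detokenizing.

```python
from collections import Counter

# Hypothetical transactions whose customer identifiers are already tokens.
tokenized_transactions = [
    {"customer": "tok_9f2c11aa0b3d44ee", "amount": 42.50},
    {"customer": "tok_5ab1c2d3e4f50607", "amount": 19.99},
    {"customer": "tok_9f2c11aa0b3d44ee", "amount": 7.25},
]

# Transactions per customer, computed entirely on tokens:
per_customer = Counter(t["customer"] for t in tokenized_transactions)

# Total spend per (tokenized) customer:
totals = {}
for t in tokenized_transactions:
    totals[t["customer"]] = totals.get(t["customer"], 0.0) + t["amount"]

print(per_customer.most_common(1))  # busiest customer, identified only by its token
print(totals)
```

The analyst gets the insight (who buys most, how much) while the real identities never leave the protected vault.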

An additional advantage of tokenization is that it eliminates the need for the key management that classic encryption requires. This removes a large vulnerability associated with encryption and saves the time and resources that key management demands.

Tokenization has long been used to fulfill key requirements of PCI DSS by taking sensitive data out of scope. It is the preferred data protection method of many banks, financial institutions, and retailers across the globe, both big and small. Its superior level of protection coupled with low impact on IT systems makes it the ideal form of data protection for organizations that need to both protect and process large volumes of data.



Want to know how to perform analytics on protected data?

Valuable data is often kept locked away because it may contain sensitive elements, which can stop data analytics in its tracks. Download our white paper and find out how to perform analytics on protected data, allowing you to gain valuable insights while avoiding data exposure:

Download White Paper
