Today we see a third CCPA class action in short order, one that makes very specific claims of security failure under the CCPA. This time it affects 73 million records – a significant number. While CCPA compliance focuses on citizens' data privacy as a whole, including processes that allow individuals to request their data, delete it, and so on, the most critical impact of the regulation for any in-scope business is the mass litigation risk it permits and encourages through its per-violation damages framework. As noted in this recent case commentary, this is "bet the company" damages territory for many businesses that collect large amounts of customer and analytic data. Large companies will certainly reel from the cost impact, but for a medium-sized business this kind of breach could be a crushing blow.

When a breached entity is taken to court, the definition of reasonable security – referenced in most data security regulations, including CCPA – becomes central to the defense. "Reasonable security" is about the full data lifecycle: where and why data is collected and stored, how it is managed, and how it is governed by accountable stakeholders through security operations processes. At the heart of this, however, is how the data is protected once it is legitimately captured, stored, and processed. Ultimately, even with good governance, weak protection renders the whole process essentially meaningless.
An organization with no data protection at all will have no chance of defending that posture as "reasonable." A badly configured COTS or cloud application that is then compromised will likewise be problematic to defend, as a systemic failure to govern. Similarly, encryption that is weak, poorly implemented, or inappropriate to the threat at hand – such as data-at-rest-only encryption that lets data be read in the clear by a compromised system or application, or by an uncontained, unmonitored insider – will also likely fail a 'reasonableness' test. Likewise, a method of de-identifying or pseudonymizing data – a control specifically defined in CCPA and in regulations like GDPR – will come undone under scrutiny if applied without an appropriate foundation or standards acceptance for the specific data in question. It's not uncommon, for example, to find in-house 'scrambling' methods or home-grown encryption tools even in large enterprises, and these won't be held "reasonable" under scrutiny.
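To make the point concrete, a home-grown 'scrambling' scheme of the kind described above might look like the hypothetical sketch below: a fixed digit shift that preserves format and digit frequencies and is trivially reversible by anyone who observes a few plaintext/scrambled pairs. This is an illustrative example, not any specific vendor's method.

```python
# Hypothetical home-grown 'scrambler': a fixed-shift substitution on digits.
# It preserves format and frequency, so it offers no real protection.
SHIFT = 3

def scramble(value: str) -> str:
    # Shift each digit by a fixed amount -- effectively a Caesar cipher.
    return "".join(str((int(c) + SHIFT) % 10) if c.isdigit() else c
                   for c in value)

def unscramble(scrambled: str) -> str:
    # The rule is symmetric and easily inferred, so reversal is trivial.
    return "".join(str((int(c) - SHIFT) % 10) if c.isdigit() else c
                   for c in scrambled)

s = scramble("078-05-1120")
assert s == "301-38-4453"          # structure and separators fully preserved
assert unscramble(s) == "078-05-1120"  # trivially recovered by an attacker
```

A scheme like this would almost certainly fail a reasonableness test, precisely because its reversibility requires no key compromise at all.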
The challenge then becomes how to secure, govern, and control access to personal and sensitive data so as to meet both compliance and reasonableness. A contemporary, simple, yet effective technology called tokenization can help. Tokenization works by replacing live information in place – in databases or files, for example – with a secure but unrelated substitute value that retains the data's utility, but not its risk. This reduces risk in production and non-production systems while keeping processing intact – for example, analytics or AI processing, or live production applications. Effective tokenization is largely transparent, reducing complexity, which itself introduces risk. The ability to recover the live data under tight controls, via the isolated tokenization/detokenization system and the audit data that accompanies it, means data is never lost and access to it is strictly managed and evidenced. Data that was formerly a risk is stored and used as tokens in application, analytics, cloud, COTS, or SaaS environments. Strong tokenization is now well defined in modern standards such as ANSI X9.119-2, published by a widely accepted and well-established standards body.
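The mechanics described above – consistent replacement values, an isolated vault, and audited detokenization – can be sketched in miniature as follows. This is a minimal illustrative model, not a production design or the scheme defined in ANSI X9.119-2; the class and method names are hypothetical.

```python
import secrets

class TokenVault:
    """Illustrative vault-style tokenization service (sketch only)."""

    def __init__(self):
        self._token_to_value = {}  # isolated mapping, never exposed outward
        self._value_to_token = {}
        self.audit_log = []        # every detokenization is evidenced

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so joins and analytics still work.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, with no mathematical link to the live value.
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str, requester: str) -> str:
        # Recovery happens only through the vault, under an audit trail.
        self.audit_log.append((requester, token))
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111 1111 1111 1111")
assert t != "4111 1111 1111 1111"           # downstream systems hold only tokens
assert vault.tokenize("4111 1111 1111 1111") == t  # consistency preserves utility
assert vault.detokenize(t, requester="billing-service") == "4111 1111 1111 1111"
assert len(vault.audit_log) == 1            # controlled access is evidenced
```

The design point is that a breach of the application or analytics tier yields only tokens; the live data exists solely behind the isolated, audited vault.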
As the ‘Minted’ case shows, if reasonableness cannot be sustained in legal proceedings, the depth and scale of the breach could be financially crushing. This will be the case to watch as CCPA’s enforcement template. There’s no doubt that many businesses operate with the best intentions regarding security, but given the complexity of the hybrid cloud ecosystems we operate in and the pervasiveness of attacks, errors, and vulnerabilities, relying on IT-layer security controls like access management, configuration, and traditional encryption isn’t enough. Tokenization of data levels the playing field of risk while enabling, rather than locking down, the business. Best-in-class enterprises offer tokenization as an enterprise service for easy consumption, making it a natural part of application development and pipeline processes within a complete cyber security framework. Had the data been effectively tokenized at Minted, attackers would have obtained no sensitive data, and the impact of the breach would have been neutralized – a very different likely outcome, with low to zero impact, since deriving live data from tokenized data is a practical impossibility.
So while the privacy of individuals – their rights to individual data elements and their right to be forgotten – is very important and often the focus of popular media, approaches to privacy compliance need to start with the biggest risk impact: data security at scale. Making sure that breaches don’t yield anything meaningful to attackers, and that stolen data does not trigger subsequent litigation, are top priorities. It’s important to ensure both that leaked or stolen data is useless AND that the approach to protection that makes it useless is well governed and can reliably withstand reasonableness tests should a claim reach court.