One of the big challenges with implementing narrow point products for enterprise data security is that the IT ecosystem in which your data resides is so expansive. Just think about it. Data does a lot of traveling within an enterprise, regardless of whether it originates inside the organization or outside it. Data takes on a life of its own, at various times being created, accessed, edited, saved, copied, backed up, or archived for longer-term storage. It also travels among many resources and users until its final destruction. Like humans in an unfamiliar city, sometimes data even gets lost, at least until it’s found (and sometimes found by the wrong person).
Trying to protect data from all threats, both internal and external, throughout this “action-packed” lifecycle means addressing controls over data stores, users and data access, defined perimeters, networking components, cloud services, and various other IT resources, not to mention protection for the data itself. Given the finite resources, time, and personnel with which many software companies operate, most solution vendors tackle a narrower part of this larger ecosystem with a point product or products that solve a particular piece of the data security puzzle. Unfortunately, that approach creates interoperability issues and can ultimately lead to isolated data siloes.
Siloed data comes with unwanted side effects, including the inability of users to leverage that data fully across the enterprise—what’s the purpose of data if its inaccessibility keeps it from contributing to the company’s productivity and success? Siloes also introduce risk due to the lack of centralized control over data and clear visibility into who’s using data and for what purposes. Even in the most innocuous scenario, data siloes make it easier for data to go unused and get “lost,” lingering in out-of-the-way corners of the ecosystem, overlooked or forgotten until something less innocuous occurs.
For example, a siloed data set might hide sensitive data that the organization has lost track of and doesn’t even know exists—so what happens when someone, innocently and with all good intentions, archives that data set to a public cloud resource and then, due to misconfiguration or other human error, a breach occurs? Potential disaster. Much less innocuous indeed!
If you’ve started down the road of cobbling together cybersecurity point products to address specific protection needs, you’re now implementing and administering them in piecemeal fashion instead of with a unified approach across the larger data security workflow. One solution might protect the data through encryption or tokenization, another might perform automated, AI-driven data discovery, while yet another might provide intelligent data lineage tracing and classification.
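To make the tokenization idea concrete, here is a minimal, purely illustrative Python sketch of vault-based tokenization; the TokenVault class and its methods are hypothetical placeholders for whatever a given point product actually provides, not any vendor’s API.

```python
import secrets

# Hypothetical sketch of vault-based tokenization: each sensitive value is
# replaced by a random surrogate token, and the mapping is held in a protected
# store so the original can be recovered only through controlled detokenization.
class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        token = secrets.token_hex(8)  # opaque surrogate that reveals nothing
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # e.g., a payment card number
print(token)                     # safe to store or pass downstream
print(vault.detokenize(token))   # original retrieved only via the vault
```

The point of the sketch is simply that the protected form carries no exploitable information on its own, which is exactly the kind of control a single point product can do well in isolation.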
However, if these point products all come from different vendors, you face increasingly complicated and disjointed operations while also having to worry about interoperability between products going forward. Does a product at one point in the data lifecycle and protection workflow work well with another product at a different point in the workflow? Are their capabilities complementary? That has to be a consideration, especially if you’re the decision maker approving or sponsoring these technology acquisitions, or the business leader ultimately responsible for enterprise data and overall data security.
The road to better and safer utilization of enterprise data, which undoubtedly increases its value to the business, consists of migrating to a data security platform (DSP) that is easier and quicker to implement, simpler to operate, and end-to-end in its visibility and functional coverage across the enterprise data ecosystem. The ultimate goal of a DSP is to eliminate data security siloes and apply consistent, effective controls to all data in the enterprise based on the type of data being protected.
Some DSPs also help organizations understand (and in some cases even discover and classify) data before applying any policies and controls. Remember, not all data is equal, and not all data is equally sensitive or risk-inducing. Knowing as much as possible about the data—to the point of finding data you didn’t know existed in your environment—before you apply protections is critical. More centralized DSPs attempt to do this by tying together all of these functions and capabilities into a seamless, unified solution so that everything works well together. Each function complements the others across a broader spectrum of data security.
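As a rough illustration of the classify-then-protect pattern described above, the short Python sketch below first classifies values and then keys the control to the classification. The pattern names, labels, and controls are assumptions made for the example, not a description of any particular DSP.

```python
import re

# Illustrative only: discover/classify values first, then apply a control
# appropriate to each class of data.
PATTERNS = {
    "payment_card": re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"),
    "email":        re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(value: str) -> str:
    for label, pattern in PATTERNS.items():
        if pattern.search(value):
            return label
    return "unclassified"

def protect(value: str, label: str) -> str:
    # Policy keyed to the classification: stricter controls for riskier data.
    if label == "payment_card":
        return "<tokenized>"   # e.g., tokenize before it leaves its source
    if label == "email":
        return "<masked>"      # e.g., mask for analytics use
    return value               # low-risk data passes through unchanged

for record in ["4111-1111-1111-1111", "jane@example.com", "order #42 shipped"]:
    print(protect(record, classify(record)))
```

The design point is the ordering: classification happens before protection, so the control applied is always proportionate to what the data actually is.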
At comforte AG, we bring to market some of the most powerful and flexible data security capabilities for enterprises both large and small. For example, we protect an enormous number of the financial transactions taking place in the world every day, so we know a thing or two about effective data protection and everything happening upstream and downstream from it. If you’re trying to solve particular problems at very specific points in the data protection workflow, such as tokenizing transactional or PII-laden data, or protecting sensitive information within SaaS applications, we have individual modules to address those pain points. More importantly, we work with organizations that have recognized the dangers of siloed data and are moving beyond point-product treatment of sensitive information. Increasingly, we are expanding our platform approach to address data discovery, lineage tracking, classification, protection, and simplified integration and operation. The comforte SecurDPS data security platform is the culmination of these efforts to provide seamless coverage across a wider swath of data security.
We use a single, simple image to capture all of these notions. Notice that the workflow of data protection (with the colorful waves going around the discrete steps) isn’t necessarily represented in linear fashion—no arrows on the “waves” indicate direction or dependent steps. Depending on the lifecycle stage of an individual piece of data or data element, one or more of these steps and measures apply. The whole thing can be as linear or non-linear as your organization needs it to be. The point is, we at comforte view data as an organic, dynamic, evolving asset that’s always in flux. You can apply focused capabilities at one point of this workflow with one of our modules, if that’s what you want to do, or you can apply our full range of capabilities across the entire spectrum. However, ongoing trends indicate that enterprises now require more integrated data security platforms that tie together many functions and capabilities so they work effortlessly together. That’s why we’ve positioned ourselves ahead of that trend by engineering a data security platform that is holistic and end-to-end. And maybe we’re even influencing that trend a bit, too!