In a world beset by geopolitical and economic uncertainty, one thing remains constant: the onward march of technological innovation. It is the driver of sustainable growth and better business decision-making. It promises to lift worker productivity, generate cost savings, improve process efficiency and transform user experience. But finding the winners in a market of “maybes” is always a challenge. Expensive bets on the wrong technology can erode competitive advantage and corporate finances.
This is where the Gartner Emerging Tech Impact Radar comes in. It’s designed to cut through the noise and marketing hype, to highlight the emerging technologies expected to have a “significant impact on the market.” And it estimates how sizeable that impact is likely to be. Once again, tokenization is highlighted as a potential game-changer.
The potential to disrupt
The Gartner report highlights the 30 hottest emerging technologies, split into four themes.
Smart world is all about the way users interact with “people, places, content and things,” as online and offline experiences converge. This year it includes things like digital twins, AI avatars and multimodal UI.
Productivity revolution lists the AI-powered technologies that are driving change across the enterprise. It includes model compression, virtual assistants and autonomous unmanned aerial vehicles (UAVs).
Privacy and transparency technologies do exactly that – enhance privacy and transparency at a time when personal and corporate data is being collected at an unprecedented rate. It includes things like human-centered AI, behavioral analytics and privacy-enhancing technologies.
Critical enablers support new use cases and enhance existing experiences. They range from the familiar (blockchain, private 5G) to the more outlandish (neuromorphic computing). Also included in this sphere is tokenization.
Why tokenization matters
Like all of the technologies listed above, tokenization has “the potential to disrupt a broad cross-section of markets” and is “crucial for product leaders to evaluate as part of their competitive strategy,” according to Gartner Director Analyst Tong Nguyen. Gartner assesses that the technology will start to hit home in the next 3-6 years, and that its “mass” will be “very high” – meaning it will have a substantial impact on existing products and markets.
This makes sense in the context of today’s threat and risk landscape. Organizations are increasingly realizing that existing perimeter defenses are no match for determined threat actors, who can often waltz straight through their digital front door using stolen credentials. An increasingly complex and burdensome compliance environment raises the stakes further.
That’s why many are turning to tokenization, which mitigates the risk of breaches by replacing sensitive data with non-sensitive “tokens” – in a reversible, format-preserving way. Because tokens preserve the length and format of the underlying data, existing applications and message structures don’t have to change when migrating to tokenization – saving cost and effort.
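To make the idea concrete, here is a minimal, purely illustrative sketch of reversible, format-preserving tokenization in Python. It uses a toy in-memory vault (a lookup table) for simplicity; the class name and logic are invented for this example and are not comforte’s algorithm – production systems typically use vaultless schemes such as format-preserving encryption (e.g., NIST FF1) instead of a stored mapping.

```python
import secrets
import string

class ToyTokenizer:
    """Toy, vault-based tokenizer for illustration only.
    Real vaultless designs derive tokens cryptographically
    rather than storing a lookup table."""

    def __init__(self):
        self._vault = {}    # token -> original value
        self._reverse = {}  # original value -> token

    def tokenize(self, value: str) -> str:
        """Return a token with the same length and character
        classes as the input (digits stay digits, letters stay
        letters, punctuation is kept in place)."""
        if value in self._reverse:
            return self._reverse[value]
        while True:
            token = "".join(
                secrets.choice(string.digits) if c.isdigit()
                else secrets.choice(string.ascii_letters) if c.isalpha()
                else c
                for c in value
            )
            # Avoid collisions and the (unlikely) identity token.
            if token not in self._vault and token != value:
                break
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        """Reverse the mapping to recover the original value."""
        return self._vault[token]

# A 16-digit card number tokenizes to another 16-digit string,
# so downstream systems that expect that format keep working.
t = ToyTokenizer()
pan = "4111111111111111"
tok = t.tokenize(pan)
print(len(tok), tok.isdigit(), t.detokenize(tok) == pan)
```

Because the token has the same shape as the original value, it can flow through existing databases, message formats and validation logic untouched, which is exactly why format preservation lowers migration cost.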
comforte’s SecureDPS platform delivers a range of tokenization algorithms to suit any data element and use case, and a linearly scalable, high-performance system that is stateless, vaultless and collision-free. Because tokenization happens purely in memory and on the CPU, with no disk I/O, it’s also more secure.
Putting theory into practice
Gartner suggests that organizations can use emerging technologies like this to:
- Enhance competitive advantage in the smart world
- Balance growth with mitigating risk
- Support strategic roadmaps
Tokenization certainly has a part to play in all three scenarios. In a world of escalating cyber risk, it can enhance security, and reduce compliance costs and scope.