Implementing Tokenization Shouldn’t Be So Damn Hard
What surprised us in a recent inquiry with a leading global analyst firm was not that they had seen tokenization inquiries increase more than 700% year over year; we had been seeing the same spikes on our end. Rather, it was what happened after those initial inquiries that caught our attention. The analysts reported that most customers who had requested guidance on picking the right tokenization vendor scheduled follow-up inquiries a few months later, asking for advice on how to structure long-term, often multi-year, professional services engagements for help integrating their data security platform!
Now, we understand that protecting data, especially at the application layer where data is most at risk of exposure, can be incredibly complex. But apparently, the analyst explained, the industry has come to accept that enterprises will settle for long, difficult, costly implementations that require tremendous customization in order to protect data well. Are we the only ones who find this alarming? Not just because it is an unnecessarily painful and costly effort to live through, but because changes to the security posture over time will likely lead to piles of re-work, with more professional services fees, no doubt.
Now, I’m not sure if the competition is saying “if you can’t fix it, feature it” or if they are just in the business of selling professional services over products, but there’s really no reason your enterprise should have to sign up for years of professional services just to integrate a tokenization solution…
When the analyst finally offered, "if there is something about your solution that makes it easier to implement with less professional services, that could be a big deal…," we were almost embarrassed to answer. You see, not only do we not charge expensive, long-term professional services fees; for the vast majority of our implementations, we charge nothing at all related to implementing our data protection solution, because of how quickly and easily EncryptRIGHT deploys.
EncryptRIGHT sports an elegant architecture that abstracts data protection away from applications, rather than interweaving cryptographic and data-protection functionality into application code, delivering application-native data protection without complex customization. Centralizing the core data protection functionality makes implementation easier and more accurate, and it creates a natural separation of duties, making the solution more secure. It also allows changes to how data is protected without requiring any re-work in the applications themselves, making it more flexible and less costly over time.
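To make the abstraction idea concrete, here is a minimal sketch of the general pattern, not EncryptRIGHT's actual API: the application calls a centralized tokenization service by policy name, never touching token-generation or cryptographic logic itself, so the policy can change centrally without re-working application code. All class, method, and policy names below are hypothetical.

```python
import secrets
import string

class TokenizationService:
    """Hypothetical centralized service mapping sensitive values to tokens.

    Applications only call tokenize()/detokenize() against a named policy;
    how tokens are generated (and how the vault would be secured in a real
    deployment) is decided centrally, not in application code.
    """

    def __init__(self):
        self._policies = {}   # policy name -> (token length, alphabet)
        self._vault = {}      # (policy, token) -> original value
        self._reverse = {}    # (policy, value) -> token, for consistent tokens

    def define_policy(self, name, length=16, alphabet=string.digits):
        # Changing a policy here changes protection behavior everywhere,
        # with no edits to the calling applications.
        self._policies[name] = (length, alphabet)

    def tokenize(self, policy, value):
        if (policy, value) in self._reverse:
            return self._reverse[(policy, value)]
        length, alphabet = self._policies[policy]
        token = "".join(secrets.choice(alphabet) for _ in range(length))
        self._vault[(policy, token)] = value
        self._reverse[(policy, value)] = token
        return token

    def detokenize(self, policy, token):
        # In practice this lookup would sit behind access controls,
        # giving the separation of duties described above.
        return self._vault[(policy, token)]

# Application code stays trivial: one call in, one call out.
svc = TokenizationService()
svc.define_policy("pan", length=16)
token = svc.tokenize("pan", "4111111111111111")
original = svc.detokenize("pan", token)
```

Because the application only ever sees `tokenize` and `detokenize`, swapping the token format, vault backend, or cryptographic protections is a central configuration change rather than an integration project, which is the flexibility the architecture above is aiming at.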
Consider asking your tokenization vendor for a proof-of-concept to demonstrate how their solutions integrate. If they tell you that it can’t be done without cumbersome, long-term integration efforts, request a POC for EncryptRIGHT – we’d be happy to prove that implementing tokenization doesn’t have to be so damn hard.