The Advent of Data Hyper-Protection
Data hyper-protection ensures the privacy of data from its creation and presence on the enterprise mainframe to its final destination on applications and cloud services.
Critical system-of-record data must be compartmentalized and accessed by the right people and applications, at the right time.
Since the turn of the millennium, the art of cryptography has continuously evolved to meet the data security and privacy needs of doing business at Internet speed, by taking advantage of the ready processing horsepower of mainframe platforms for data encryption and decryption workloads.
Having enterprise data processing, encryption and business logic colocated on the same mainframe offered an ideal way to reduce the latency of additional network hops for data, but there still remained a need to protect that data as it moved to and from the mainframe, as well as when it was at rest.
For the purposes of maintaining the privacy of high-speed transactional data, early encryption could not live in a separate networked 'gateway' or appliance, which would have added an unacceptable amount of latency to massive transaction volumes.
To resolve such lags, IBM kept it local by putting a cryptographic coprocessor right next to the primary processor in its early-generation IBM Z platforms. These evolved into the next generation of Crypto Express cards, which offered a considerable jump in onboard performance, especially as processor speeds improved and costs came down.
Since the encryption activity happened right on the hardware, these cards were virtually impossible to disrupt through typical software hacking means.
But if secrets are safe inside the box, maybe we can still shake the outside of the box and figure out what’s inside?
Maintaining Pervasive Encryption
As business applications became ever more distributed over the last two decades, mainframes started becoming bona fide data hubs, handling an increasing volume of requests from business partners and external services.
To adapt and reduce points of failure, leading platform vendors explored more pervasive encryption approaches.
In addition to accelerating the encryption of large data stores at bulk scale, vendors further reduced computing overhead by positioning a 'traffic cop' at the network edges of the core system.
For instance, IBM introduced z Encryption Readiness Technology (zERT), which gives administrators a way to monitor incoming connections and track the cryptographic protection of each session, so that traffic failing to meet encryption policy on a given port can be flagged or blocked.
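The idea of auditing sessions against a per-port encryption policy can be sketched in miniature. This is a toy illustration only; the policy rules, class names, and verdicts below are hypothetical and bear no relation to IBM's actual zERT interfaces, which report on real TCP/IP stack telemetry.

```python
# Toy sketch of per-port session auditing, in the spirit of zERT
# policy-based enforcement. All names and rules here are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Session:
    source_ip: str
    port: int
    tls_version: Optional[str]  # None means the session is unencrypted

# Hypothetical policy: which protection levels each port may carry.
POLICY = {
    1414: {"TLSv1.2", "TLSv1.3"},  # e.g. messaging traffic needs modern TLS
    8080: set(),                   # no protection level is acceptable here
}

def audit(session: Session) -> str:
    allowed = POLICY.get(session.port)
    if allowed is None:
        return "untracked"   # no policy for this port: observe and log only
    if session.tls_version in allowed:
        return "permitted"
    return "flagged"         # candidate for reporting or connection reset

print(audit(Session("10.0.0.5", 1414, "TLSv1.3")))  # permitted
print(audit(Session("10.0.0.9", 8080, None)))       # flagged
```

The key design point mirrored here is that enforcement is a policy lookup at the network edge, not a change to the applications generating the traffic.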
Many companies host their mainframes in colocated data centers shared with other tenants, each supporting its own regional branch locations. IBM Fibre Channel Endpoint Security authenticates each core system and encrypts data in flight between storage devices, preventing the company's extended SAN from becoming another attack vector and preserving the integrity of data flows even against 'packet sniffer' network tools.
From Secure Data Silos to Protecting Data Everywhere
Hybrid IT applications are forcing companies to take a less enterprise-centric view of data protection and privacy. What happens to critical data when it leaves the data center, making the round trip to multiple partners, ephemeral cloud services, or remote edge computing devices in a broader business workflow?
Data from the core system must remain shielded from unauthorized access while in motion, while still allowing each participant to see data appropriate for their needs.
The introduction of IBM Hyper Protect Data Controller solves this data privacy dilemma for IBM, not just among networked Z series platforms, but everywhere else the data travels.
When data is accepted or sent from Z systems, it is encapsulated as Trusted Data Objects (or TDOs), each with an encryption wrapper as opaque as a mystery compartment in an Advent calendar.
Like a virtual passport, this self-contained TDO is then validated by the data controller system when opened, based on the user who is trying to open it. Three items are on the manifest — the policies, keys, and business logic that govern its behavior.
Here’s How It Works:
- Policies allow a high degree of rule-based control over which groups, application services, or individuals can see what part of the TDO. A data scientist may get one view of certain rows for forecasting, with sensitive information withheld; a transaction system may see only an order number and confirmation; an authorized end customer should be able to see their own full account status and history.
- Keys are the essential artifacts for ‘unlocking’ encrypted data in the TDO. All keys are maintained inside the Hyper Protect Data Controller solution.
- Business logic is metadata that defines how the data fulfills its purpose within a business workflow.
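The manifest described above can be sketched as a small model: a central key service stands in for the controller, and per-role policies decide which fields of the payload a consumer may see. This is a simplified illustration under assumed names; the real TDO format, key handling, and APIs are internal to Hyper Protect Data Controller, and the "masking" here stands in for actual cryptographic protection.

```python
# Illustrative model of a TDO manifest: policies, a key reference,
# and a central key service. All names here are hypothetical.
MASKED = "***"

class KeyService:
    """Stands in for the central controller that holds every key."""
    def __init__(self):
        self._keys = {}
    def register(self, key_id, key):
        self._keys[key_id] = key
    def fetch(self, key_id):
        if key_id not in self._keys:
            raise PermissionError("key revoked or unknown")
        return self._keys[key_id]

class TrustedDataObject:
    def __init__(self, payload, key_id, policies):
        self._payload = payload   # would be ciphertext in a real system
        self.key_id = key_id      # keys never travel with the object
        self.policies = policies  # role -> set of visible fields

    def open(self, role, keys):
        keys.fetch(self.key_id)   # validation happens against the controller
        visible = self.policies.get(role, set())
        return {field: (value if field in visible else MASKED)
                for field, value in self._payload.items()}

keys = KeyService()
keys.register("k-42", b"key-material")
tdo = TrustedDataObject(
    {"order_id": "A100", "card_number": "4111...", "history": ["x", "y"]},
    key_id="k-42",
    policies={"transaction_system": {"order_id"},
              "customer": {"order_id", "card_number", "history"}},
)
print(tdo.open("transaction_system", keys))
# card_number and history come back masked for the transaction system,
# while the "customer" role sees the full record.
```

Note the design choice the sketch reflects: the object carries its policies with it, but opening it always requires a round trip to the controller that holds the key.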
The Intellyx Take
The encryption of data still utilizes the power of the core system, while cryptographic key checks and the secure enforcement of Hyper Protect Data Controller objects are now happening wherever the data moves on the internet or within remote systems — right at the point of consumption.
Even more impressive, centralized management tools can allow admins to revoke access to any TDO out there in the wild.
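Because every key stays inside the controller, revocation can be modeled as simply deleting the key: copies of the object "in the wild" become unopenable wherever they have traveled. The sketch below is hypothetical and only illustrates that principle, not IBM's actual management tooling.

```python
# Hypothetical sketch: central key custody makes remote revocation possible.
class Controller:
    def __init__(self):
        self._keys = {"k-42": b"key-material"}
    def unwrap(self, key_id):
        try:
            return self._keys[key_id]
        except KeyError:
            raise PermissionError(f"access to {key_id} has been revoked")
    def revoke(self, key_id):
        self._keys.pop(key_id, None)

c = Controller()
assert c.unwrap("k-42")   # the TDO can still be opened anywhere
c.revoke("k-42")          # one central action...
try:
    c.unwrap("k-42")      # ...and every later open attempt fails
except PermissionError as e:
    print(e)
```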
The advent of cryptographically secure data controllers proves that there are plenty of innovative, out-of-the-box approaches for protecting the data flowing through today's systems of record.
© 2021, Intellyx, LLC. Intellyx retains sole editorial control over the content of this article. At the time of publishing, IBM LinuxONE is an Intellyx customer. Image credit: Rich Bowen, Advent Calendar, flickr, modified by JE.
Published at DZone with permission of Jason English. See the original article here.