How to Have Your OT Cake and Eat it Too With Cloud and On-Prem
When it comes to linking your OT data, the choice is usually cloud or on-prem. The correct answer? Both. And don't forget the edge.
Since the dawn of the Internet, technical people have deliberated over what should be processed server-side and what should be handled locally at the client. Today, that debate is still happening, but now the buzzwords are cloud, on-premises, and edge. The basic question remains: should processing occur near the device, where latency is low, or in the cloud, where compute power is abundant? For most software customers and end users, the answer is the same as it's always been. It appears in the form of a question: "Why is this my problem?"
It’s a valid question. Among the world’s most successful consumer platforms, from iOS to Android, arguably the most significant commonality is the near-total abstraction of back-end infrastructure. The interface between server and client is so seamless that users typically don’t know whether their data and apps live on their device or on a server somewhere, or both. But while consumers suffer no dearth of smartly architected digital platforms, enterprise users have not fared quite so well.
When it comes to industrial software, why is the focus so often on deployment models rather than business outcomes? This misplaced emphasis is one of the major obstacles to widespread adoption of Industrial Internet applications.
Any company that's shopped around for Operational Technology (OT) industrial software has at some point confronted the cloud vs. on-prem conundrum. Choose cloud, the pitch goes, and enjoy the benefits of new business models, business agility, better use of resources, and fewer operational issues. Best of all, a daunting capital expense becomes a far more manageable operating cost, billed pay-as-you-go. On the other hand, choose edge computing and bring bandwidth-intensive content closer to the equipment, leverage cost efficiencies, and meet critically important low-latency use cases. What's left unsaid, of course, is that any company hoping to innovate in the current era of digital disruption and transformation requires the benefits of both approaches.
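As a toy illustration of why a platform needs both tiers, consider how a hybrid system might route each telemetry reading by its latency budget: time-critical readings are handled at the edge, while everything else flows to the cloud for heavyweight analytics. This is a minimal sketch, not any vendor's actual architecture; the names, thresholds, and `route` function below are invented for the example.

```python
# Hypothetical edge/cloud dispatcher for OT telemetry readings.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Reading:
    sensor_id: str
    value: float
    latency_budget_ms: int  # how quickly this reading must be acted on


def route(reading: Reading, edge_threshold_ms: int = 100) -> str:
    """Send latency-critical readings to an edge node; batch the rest to the cloud."""
    if reading.latency_budget_ms <= edge_threshold_ms:
        return "edge"   # e.g. a vibration trip alarm on a gas compressor
    return "cloud"      # e.g. fleet-wide analytics or historian storage


print(route(Reading("compressor-7", 9.8, latency_budget_ms=20)))     # edge
print(route(Reading("compressor-7", 9.8, latency_budget_ms=60000)))  # cloud
```

The point of the sketch is that the routing decision belongs inside the platform, invisible to the end user, exactly as the article argues.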
Consider the evolution of Netflix. When the company first launched its streaming service in 2007, the delivery model was fairly basic: content streamed from a Netflix data center directly to the end user. That worked for a while, but as terabytes became petabytes, the company built out its own content delivery network to better distribute traffic, bringing data closer to the edge. Finally, in 2016, Netflix rolled out direct video downloads, making data fully available at the endpoint. Over the course of those ten years, Netflix never prompted consumers to choose where its content ought to be stored — cloud, on-premises, or in between. Consumers, after all, don't much care about IT architecture. What they care about are outcomes. So, like any good platform, Netflix sought simply to make the user experience as seamless as possible. Everything else happened quietly in the background.
That's the sort of seamlessness industrial users should demand from their software vendors. Should industrial applications like asset performance management (APM) live in the cloud, at the edge, or on-premises? Make no mistake: there is a correct answer. Winning industrial applications, and the platforms that power them, will need to live across all three to leverage the benefits of each. The question of where data lives should be as abstract for a customer monitoring gas compressors in a power plant or aircraft engines in a commercial airline as it is for the masses streaming House of Cards on Netflix.
Published at DZone with permission of Venkat Eswara, DZone MVB. See the original article here.