“The edge will eat the cloud. And this is perhaps as important as the cloud computing trend ever was.” — Tom Bittman, Vice President and Distinguished Analyst at Gartner Research
If you’re one of the many IT pros still finding your footing with cloud computing, this statement might sting a little. It sounds as if all the time you’ve spent worrying about the cloud has been for nothing.
But don’t worry—the time and money you’re investing in the cloud won’t go to waste. In fact, to think the rise of edge computing spells the end of cloud computing would be a mistake.
Even if you haven’t quite perfected cloud computing, you have to keep your eye on edge computing. At the very least, it’s time to understand why edge computing is gaining ground and what it is in the first place.
What’s Driving the Need for Edge Computing?
Let’s be honest — cloud computing isn’t perfect. However, it provides many benefits that today’s agile businesses can’t ignore. Cloud computing vendors are helping IT teams achieve mass centralization, greater self-service, full automation of data processing, and economies of scale.
And yet, end-user experience challenges still exist. The problem is that for all of cloud computing’s benefits, universal centralization isn’t always best. Cloud computing vendors have made great strides in real-time data access, but latency still exists.
When you’re waiting for data to travel many miles from a cloud data center to the end-user, you can’t avoid latency completely. You can minimize the damage with local processing where appropriate, but that’s more of a Band-Aid approach than a true solution. With businesses launching more and more remote locations, latency and distance become ever more important.
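To see why distance matters, consider a quick back-of-the-envelope sketch. Light in optical fiber travels at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), so physical distance sets a hard floor on round-trip time before any processing even begins. The distances below are illustrative, not drawn from any particular deployment:

```python
# Rough floor on network round-trip time imposed by distance alone.
# 200,000 km/s is the usual approximation for signal speed in fiber.
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over a fiber path."""
    return (2 * distance_km / FIBER_SPEED_KM_PER_S) * 1000

# A user 2,000 km from a cloud data center pays at least 20 ms per
# round trip; an edge node 10 km away pays about 0.1 ms.
print(round_trip_ms(2000))  # 20.0
print(round_trip_ms(10))    # 0.1
```

Real-world latency is higher still once routing, queuing, and processing are added, which is exactly why moving computation closer to the user pays off.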
The latency problem will only get worse as the second wave of digital transformation hits. Augmented and virtual reality, the Internet of Things, and artificial intelligence are coming. Edge computing promises to solve your real-time access problem as these new technologies go mainstream.
Edge Computing: A Basic Overview
Cloud computing displaced traditional on-premises data centers by offering massively scalable, centralized data processing, and it has allowed many businesses to grow quickly. Edge computing takes a different approach, pushing data processing out to the edge devices that request the data.
When data is processed on the edge device itself, applications don’t have to wait for cloud services to handle workloads and send responses. Even if the savings amount to mere milliseconds, edge computing can provide the reaction time necessary to get the most out of the Internet of Things.
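The edge pattern described above can be sketched in a few lines: react to time-sensitive readings locally on the device, and forward only a summary to the cloud for later analysis. The sensor values and anomaly threshold below are invented for illustration:

```python
# Minimal sketch of edge-first processing: handle each reading locally,
# and batch only the anomalies for a later upload to the cloud.
def process_at_edge(readings, threshold=90.0):
    """React locally to each reading; collect anomalies for the cloud."""
    to_cloud = []
    for value in readings:
        if value > threshold:
            # Immediate local reaction -- no network round trip needed.
            to_cloud.append(value)
    return to_cloud  # shipped to the cloud later for analytics

anomalies = process_at_edge([72.0, 95.5, 88.1, 91.2])
print(anomalies)  # [95.5, 91.2]
```

The design choice is the point: the latency-sensitive decision happens on the device, while the cloud still receives the aggregated data it needs for centralized analytics.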
The problem is that even though you’re used to managing modern applications at the edge of the network, you’ve never needed an entire networking strategy based on edge computing. And while you’re still getting cloud computing situated, it might seem overwhelming to start from square one on another new strategy.
There’s good news, though. Contrary to what you might have heard, edge computing won’t replace cloud computing—they’ll exist alongside one another.
You Have More Time Than You Think for Edge Computing
It’s easy to get caught up in technology hype. You see so many articles pop up about a specific topic and you start to think that if you don’t implement the new technology immediately, you’ll be left behind.
The Internet of Things, AR/VR, machine learning, and more will go mainstream in your business eventually (if they haven’t started to already). But you still have plenty of room to grow with cloud computing before edge computing becomes a necessity.
Even when edge computing becomes the norm, cloud computing won’t go away. You just don’t need to decentralize data processing for something like an inventory management system. Adopting edge computing for that kind of use case would be unnecessarily complicated.
The next time you read about the need for edge computing, remember that you shouldn’t abandon your quest for cloud computing expertise. You have to walk before you can run!