Where Public Cloud Goes Wrong
Despite the convenience of the public cloud, you might want to think twice before using it for application migration, or before relying on it without a backup plan for outages.
Given the rapid growth of major public cloud services, one might think that the public cloud is eating the world. But according to a recent IDC report, over 82 percent of businesses have moved workloads or applications out of the public cloud in the past 12 months. So, what gives?
While the public cloud is extremely compatible with some applications and offers multiple benefits, it can often clash with other workloads. That’s why three out of four organizations that migrated out of the public cloud have moved some or all of those workloads back on-premises.
In our work with enterprises in the areas of financial services, healthcare, and retail, for example, we tend to hear three reasons why the public cloud is sometimes not suitable for certain workloads.
Application Migration Headaches
Some applications are born in the cloud: typically mobile and customer-facing web applications built specifically for cloud infrastructure.
But most established organizations also carry a multitude of legacy applications running on traditional servers and databases. And while modernizing applications is one of the keys to digital transformation, moving core application systems to the cloud takes time. Few organizations can move everything at once because of the cost and effort involved.
Legacy applications need to be recoded, reconfigured, refactored, retested, and reintegrated before they can move to the cloud. So in addition to adding complexity, time, and cost, recoding applications for a cloud migration can degrade application performance.
Organizations hold IT teams accountable for the highest performance standards — particularly for customer-facing, business-critical, and revenue-generating applications. These applications are expected to always be up and running, so the IT team needs to be very selective about which applications can and cannot thrive in the public cloud.
Unpredictable Resource Consumption and Cost
Dropbox Vice President of Engineering Aditya Agarwal recently described his company’s departure from the public cloud by stating, “Nobody is running a cloud business as a charity. There is some margin somewhere.”
While a public cloud can scale applications with fluctuating demand, the unexpected costs from unpredictable data growth and usage can quickly get out of control.
Many applications draw on ‘free’ resources in the data center — like network bandwidth. When applications move to a public cloud, those resources are no longer free and can exceed the expense of on-premises resources. Many companies that leap to public cloud encounter billing horror stories, like surprise monthly bills of $50,000 for network bandwidth usage.
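To see how quickly bandwidth charges compound, here is a minimal back-of-the-envelope sketch. The tier boundaries and per-GB rates are hypothetical placeholders, not any provider's actual pricing:

```python
# Hypothetical egress-cost estimator. Tier sizes and per-GB rates below
# are illustrative assumptions only, not a real provider's price list.
TIERS = [
    (10_000, 0.09),        # first 10 TB at $0.09/GB (assumed)
    (40_000, 0.085),       # next 40 TB at $0.085/GB (assumed)
    (float("inf"), 0.07),  # everything beyond that (assumed)
]

def monthly_egress_cost(gb_out: float) -> float:
    """Estimate a month's outbound-bandwidth bill under tiered pricing."""
    cost, remaining = 0.0, gb_out
    for tier_size, rate in TIERS:
        used = min(remaining, tier_size)
        cost += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return cost

# Around 700 TB of monthly egress lands in surprise-bill territory:
print(f"${monthly_egress_cost(700_000):,.2f}")  # → $49,800.00
```

The point is not the specific rates but the shape of the math: egress scales linearly with usage you may not control, so a traffic spike translates directly into a bill spike.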
It’s important to establish a strong cloud governance and cloud metrics program to monitor cloud usage and costs. The right private cloud offers more predictable costs. Analytics can forecast precise needs for additional capacity and performance to avoid over-provisioning. And when scaling your footprint, automation can optimize the location of every application.
The Risk of Outages and Security Vulnerabilities
As recently as this year, Amazon and Microsoft, two of the largest public cloud service providers, experienced major outages. Both companies had to move swiftly to appease users who couldn’t connect to data and applications. Many technology companies couldn’t access SaaS-based tools such as invoicing systems, HR software, and A/B testing services. Mashable half-jokingly threatened to produce its first printed newspaper. The outages were felt globally and were swiftly accompanied by the realization that it may not always be wise to put all your eggs in a single cloud basket.
In addition to the risk of outages, public cloud services have always been large targets for breaches and attacks. To make matters worse, user error can expose data. In July 2017, UpGuard’s cyber risk team reported that 14 million Verizon customer records, including account details and PINs, were exposed via a third-party vendor that had misconfigured cloud storage. A month earlier, Deep Root Analytics, a data management platform provider, suffered a leak that exposed data on 198 million American voters due to the same kind of misconfiguration.
While proper configuration is the customer’s responsibility, humans make errors, and the public cloud may introduce risks that simply aren’t as high with private or enterprise clouds. That said, the public cloud is suitable for some workloads. When evaluating which of your company’s workloads have the right chemistry with the public cloud, it’s important to quantify the impact of downtime on revenue and reputation. Analyze each workload’s purpose, importance, and requirements, and determine how much downtime it can afford before it costs you money, damages your image, exposes sensitive information, or simply drives customers to complain.
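The downtime analysis above can be sketched as a simple per-workload calculation. The workload names, revenue figures, and tolerance thresholds here are hypothetical examples, not benchmarks:

```python
# Minimal sketch for quantifying downtime impact per workload.
# All figures are illustrative assumptions, not real data.
def downtime_cost(revenue_per_hour: float, hours_down: float,
                  recovery_overhead: float = 0.0) -> float:
    """Direct revenue lost during an outage, plus any fixed recovery cost."""
    return revenue_per_hour * hours_down + recovery_overhead

# Hypothetical workloads: a revenue-generating API and an internal tool.
workloads = {
    "checkout-api":  {"revenue_per_hour": 12_000, "max_hours_down": 0.25},
    "internal-wiki": {"revenue_per_hour": 0,      "max_hours_down": 24},
}

for name, w in workloads.items():
    worst_case = downtime_cost(w["revenue_per_hour"], w["max_hours_down"])
    print(f"{name}: tolerates {w['max_hours_down']}h, "
          f"worst-case loss ${worst_case:,.0f}")
```

Even a rough model like this makes the placement decision concrete: a workload whose hourly outage cost dwarfs any hosting savings is a poor fit for a single public cloud basket, while a zero-revenue internal tool can tolerate it easily.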
Opinions expressed by DZone contributors are their own.