
The Evolving Data Requirements of Multi-Cloud Environments


As projects set up cross-cloud environments, the requirements of disaster recovery and data protection mean considering newer means of keeping data safe.


By design or by accident, multi-cloud is a reality that most enterprise IT teams have to cope with today. Multi-cloud (or cross-cloud, as Dell EMC calls it) refers to an IT model in which an organization uses the services of one or more public cloud providers, sometimes in addition to its own data centers. In a multi-cloud model, different clouds may be used at different stages of the application lifecycle, such as test/dev and production, and for different types of applications, such as traditional structured applications and next-gen cloud applications. IT leaders, however, face challenges in identifying the data management tools they need to embrace a multi-cloud environment while improving application agility, reducing operational hassles, and enhancing application resilience. In this blog, I will discuss how enterprises are using the multi-cloud model and the requirements it places on next-generation data protection products. The genesis of this blog is my recent conversations and brainstorming sessions with newly onboarded customers.

One common reason for multi-cloud is to support geo-replicated applications that give e-commerce organizations global reach. In such a scenario, a production database may reside on a private cloud behind an enterprise firewall while being replicated to multiple regions of a public cloud, so that reads (i.e., consumers shopping on these e-commerce sites) are serviced from different regions at low latency. Such a configuration also protects against data-center-level disasters because the database and application are replicated to a public cloud (with multiple availability zones).
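As a rough sketch of how such a geo-replicated setup routes traffic, a client library might send writes to the on-premises primary and direct reads to the nearest public-cloud replica. The endpoint names and regions below are illustrative assumptions, not any specific product's configuration:

```python
# Hypothetical read/write routing for a geo-replicated database.
# Writes go to the primary behind the enterprise firewall; reads go to
# the public-cloud replica closest to the client. Endpoints are made up.

PRIMARY = "primary.dc.example.internal"

REPLICAS = {
    "us-east": "replica-us-east.example.internal",
    "eu-west": "replica-eu-west.example.internal",
    "ap-south": "replica-ap-south.example.internal",
}

def endpoint_for(operation: str, client_region: str) -> str:
    """Route writes to the primary and reads to the nearest regional replica."""
    if operation == "write":
        return PRIMARY
    # If no replica serves the client's region, fall back to the primary.
    return REPLICAS.get(client_region, PRIMARY)
```

Because all replicas are read-only copies, a regional outage only degrades read latency for that region; writes and DR failover still depend on the primary and its replication stream.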

Another use case is servicing production and test/development environments from different clouds. While security, control, and performance optimization may require the production environment to stay within a private data center, agility, access to new capabilities such as Google BigQuery, and developer preferences may drive application development teams to a public cloud service provider.

Finally, a multi-cloud model may also be driven by an enterprise IT strategy of preventing overdependence on (or vendor lock-in to) any one public cloud service provider. To hedge risk and gain leverage, an enterprise may balance its application workloads across multiple cloud providers, for example, deploying Tier-1 applications on one cloud and Tier-2 applications on another. This is a real phenomenon that we are seeing with enterprise customers. However, a multi-cloud model presents the following unique challenges to enterprises.

  • Data protection in multi-cloud environments: Legacy data protection (backup and recovery, archiving, replication) products were architected for on-premises environments and optimized for legacy applications. Using legacy backup and recovery tools for next-generation applications in a public cloud is like fitting a square peg into a round hole. For example, legacy products don't leverage the elastic, scalable compute and storage available in public clouds. As enterprises migrate applications to the cloud, they need truly infrastructure-independent (software-only, elastic-compute-based) data protection products that can be deployed in heterogeneous environments.
  • DevOps agility: Moving data across clouds for DR, or to refresh test and development environments with production data, is another key enterprise requirement. Doing so in a multi-cloud environment is not easy given the challenges of moving data over the WAN. There are also configuration differences between test/dev and production environments. Further, test/dev environments may not be as large as production, so sub-sampling is required. Finally, personally identifiable data must be masked before it leaves the production firewall.
  • Ease of management and governance: Finally, federated management is required for applications and workloads running across multiple clouds. It is inefficient for operations teams to learn several different management tools and their intricacies.
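The sub-sampling and masking steps of a test/dev refresh can be sketched as follows. This is a minimal illustration, not any vendor's implementation; the field names, sampling fraction, and hash-based masking scheme are all assumptions:

```python
import hashlib
import random

# Fields treated as personally identifiable (illustrative list).
PII_FIELDS = {"name", "email", "ssn"}

def mask(record: dict) -> dict:
    """Replace PII values with a truncated one-way hash so the data can
    leave the production firewall without exposing real identities."""
    return {
        key: hashlib.sha256(str(val).encode()).hexdigest()[:12]
        if key in PII_FIELDS else val
        for key, val in record.items()
    }

def refresh_sample(prod_records: list, fraction: float, seed: int = 42) -> list:
    """Sub-sample production data (test/dev is smaller than production)
    and mask PII in every record of the sample."""
    rng = random.Random(seed)  # seeded for a reproducible refresh
    n = max(1, int(len(prod_records) * fraction))
    return [mask(r) for r in rng.sample(prod_records, n)]
```

A real pipeline would also need to reconcile the configuration differences between environments and move the sampled data efficiently over the WAN, which is where purpose-built tooling earns its keep.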

Nor is this a complete list. There are other requirements, such as support for cloud storage with consistent performance and handling of identity and access management (IAM) processes. The point is that data protection needs to be rethought in the context of multi-cloud environments. In my next blog, I will show how Datos IO is solving this problem for next-generation applications.


Topics:
cloud, disaster recovery, multi-cloud environments, cloud storage

Opinions expressed by DZone contributors are their own.
