Digital Without Transformation


Avoid public cloud lock-in and reduce costs with a cloud data platform.


I had the opportunity to meet virtually with Dani Golan, CEO and Co-founder, and Derek Swanson, CTO of Kaminario as part of the 34th IT Press Tour. 

Hybrid multi-cloud infrastructure continues to win the day for enterprises. This is driven by the exponential growth of data, limited budgets, digitalization of business, and changing demand for skills and roles. The globalization of business requires access to data everywhere with everything being connected and data being collected at millions of endpoints while remaining compliant with geographically-specific data privacy laws.

Dispersed data requires a cloud data platform that runs seamlessly in any cloud, with the agility to move from one cloud to another, using AI/ML to orchestrate workloads worldwide.

More than 70% of organizations are now using cloud platforms; it is the preferred model. Multi-cloud environments are growing because people understand they can't be locked into a single vendor. You need to be able to leverage platforms in different geographies, address different needs, and take advantage of what each provider does differently around the globe.

Private clouds are still used for certain traditional workloads: applications that don't migrate easily, or that have specific security or performance requirements. Cloud growth is being driven by a wave of new cloud-native applications and by legacy migrations, whether lifting and shifting, modernizing applications with containers, or refactoring. People want to refactor to get the full digital transformation, but it's challenging.

Most companies today have either a digital transformation officer or somebody tasked with that role. It's not necessarily someone from IT; it could be someone from marketing, sales, etc. Skills and roles have changed dramatically. All the legacy roles are still there, but they're shrinking and changing, and there are lots of new roles. With business globalization and the cloud, everything has to be interconnected and on 24/7/365, across geographic boundaries.

The drivers of digital transformation are cloud savings, with application scaling to manage peaks and troughs; improved decision-making, which results in speed to market and intelligence-driven CX and planning; and major functionality improvements, with automation improving access and simplicity.

Edge computing is on the rise with 5G and Wi-Fi 6, and it is going to become more important to deliver data-rich, low-latency applications right to the edge (i.e., people's phones). B2B and B2C consumers expect intelligent, low-latency, highly responsive applications right on their phones.

Today, cost governance is very poor. Enterprises are seeing their costs explode, and people are having a hard time controlling those costs across all of these platforms. To control costs, you must be interconnected. People don't want 10 different sets of tools to manage everything across different platforms. The tools need to be interconnected, speak the same languages, share similar management interfaces, and allow governance across all of these things in a unified package. You can't treat on-premises, cloud, and edge as silos.

People have to change. Gone are the days when you had a bunch of high-priced specialists dedicated to a niche, working on nothing but a mainframe project or a specific application in an obscure networking language. Everyone has to be versatile and agile. We need to simplify the management and governance of the infrastructure, with more intelligence in applications, orchestration, and automation to keep costs down and allow people to work across the entire environment. Everything has to be driven by business outcomes: if a project doesn't impact the business dramatically right now, mostly around cost-cutting, it's getting shelved.

People are still trying to pull their applications into the cloud and figure out how to lift and shift effectively. One of the biggest challenges for legacy enterprises is to take legacy applications that are heavy, proprietary, and monolithic and get them into the cloud with the same feature functionality, performance profile, and user experience they had on-prem. The killer of cloud migrations is trying to get legacy applications into the cloud without doing a full refactor; yet if you do refactor, it often takes forever, costs a huge amount of money, and you still don't get the same user experience you had on-prem.

Cost savings are driving the move. Companies want to save money on their own data centers, their own people, and running on-prem. They also gain greater agility by being able to spin up resources in the cloud, which drives faster innovation and monetization of application development.

A typical public cloud footprint with >250TB of data for high-performance, business-critical applications and a small number of snapshots costs $80,000 per month. Rich data services like inline variable deduplication, inline compression, pattern removal, zero detect, and thin provisioning deliver major reductions in resource utilization (2X to 4X) while increasing performance and reducing latency, bringing the cost down to $20,000 per month.
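To see where those figures come from, here is a minimal back-of-envelope sketch. The per-TB price ($320/TB-month) is an assumption reverse-engineered from the article's $80,000 figure for 250TB, not a quoted rate; the point is simply that a 4X data reduction shrinks the provisioned footprint, and therefore the bill, proportionally.

```python
def monthly_storage_cost(logical_tb: float, price_per_tb: float,
                         reduction_ratio: float = 1.0) -> float:
    """Monthly cost of provisioning logical_tb after data reduction.

    reduction_ratio models combined savings from deduplication,
    compression, pattern removal, and thin provisioning: a ratio
    of 4.0 means only a quarter of the logical capacity is
    actually provisioned.
    """
    provisioned_tb = logical_tb / reduction_ratio
    return provisioned_tb * price_per_tb


# Assumed price: $320/TB-month, derived from the article's example.
baseline = monthly_storage_cost(250, 320)                       # no reduction
reduced = monthly_storage_cost(250, 320, reduction_ratio=4.0)   # 4X reduction

print(f"baseline: ${baseline:,.0f}/month")   # $80,000/month
print(f"with 4X:  ${reduced:,.0f}/month")    # $20,000/month
```

At a 2X reduction the same workload would land at $40,000 per month, so the 2X-to-4X range the article cites spans a 50-75% cut in the storage bill.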

Public cloud providers attempt to get enterprises locked in while they are refactoring cloud-native applications. A cloud data platform provides the ability to move applications between clouds to optimize processing and analytics while significantly reducing costs.


Opinions expressed by DZone contributors are their own.
