The Evolution of Cloud Development and Deployment
Gone are the days when only big enterprises used the cloud. Now, tools and services that facilitate the automated and efficient delivery of quality code and apps are the norm.
To gather insights on the state of cloud development and deployment today, we spoke with 15 executives from 13 companies that develop tools and services for companies to develop in, and deploy to, the cloud.
We spoke to:
Nishant Patel, CTO, and Gaurav Purandare, Senior DevOps Engineer, Built.io.
Sacha Labourey, CEO and Founder, CloudBees.
Jeff Williams, co-founder and CTO, Contrast Security.
Samer Fallouh, V.P. Engineering, and Andrew Turner, Senior Engineer, Dialexa.
Anders Wallgren, CTO, Electric Cloud.
Jack Norris, S.V.P. Data and Applications, MapR.
Michael Elliott, Cloud Evangelist, NetApp.
Faisal Memon, Technical Product Marketing, NGINX.
Seth Proctor, CTO, NuoDB.
Pedro Verruma, CEO, rethumb.
Pete Chadwick, Director of Cloud Product Management, SUSE.
Nick Kephart, Senior Director Product Marketing, Thousand Eyes.
Dmitry Sotnikov, V.P. of Cloud, WSO2.
Here's what they told us when we asked, "How has development and delivery in the cloud evolved?"
Development and delivery in the cloud continue to accelerate with better tools and more advanced services. The long-term outcome is that developers can focus exclusively on their business logic, and won’t have to worry about all the other aspects of software development that can slow down the process.
Evolution in, and of, the cloud has brought an increase in operational data. Oracle, SQL Server, and DB2 are moving to the cloud. Teams have a wealth of options on AWS, from general-purpose to highly specific services. We used to bring our own technology to the cloud; now the cloud provides it, with integrated paths and environments. Six years ago, the only people in the public cloud were start-ups. Today, companies with highly important data on-premise are increasingly comfortable with the flexibility to use the public cloud. What does hybrid mean? Control of data versus lock-in. AWS and Azure have certifications and security. Weigh pride of ownership against security, and ownership against cost, and understand what's most cost-effective. It's not really about security anymore; it's about indemnification before releasing everything to the public cloud.
Everything is much more automated. The process for deployment and delivery is much more streamlined. Two to four years ago, you would deploy from a machine to an online environment with a lot of special scripts and shipped code. That's no longer needed: with automation, the services are already available from the cloud provider, or you can write your own scripts.
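The shift this speaker describes, from hand-run deployment scripts to an automated delivery process, can be sketched as a pipeline definition. The following is a minimal GitLab CI-style YAML example; the stage names, build commands, and `deploy.sh` script are hypothetical placeholders, not taken from the article.

```yaml
# Hypothetical pipeline definition: build, test, and deploy run
# automatically on every push instead of via hand-run scripts.
stages:
  - build
  - test
  - deploy

build:
  stage: build
  script:
    - ./gradlew assemble      # placeholder build command

test:
  stage: test
  script:
    - ./gradlew test          # placeholder test command

deploy:
  stage: deploy
  script:
    - ./deploy.sh production  # placeholder; replaces ad hoc SSH/scp scripts
  only:
    - main                    # deploy only from the main branch
```

The point is not the specific tool: once the steps live in a versioned pipeline file, the "special scripts" the speaker mentions become part of the repository and run the same way every time.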
It used to be that large enterprises would only use the cloud for testing and trying different scenarios. Today, it's just part of the infrastructure that allows you to run your business more efficiently by focusing on bigger business issues rather than hardware. People ask better questions: What do we want to accomplish? What is the best technology that will enable us to achieve our goals?
From our point of view, it evolved to include easier-to-use tools and services. Today, setups are very dynamic and we feel that in recent years cloud providers have kept up with the demand, making better and more flexible services available.
Adoption of more sophisticated processes like A/B testing is easy to achieve and is no longer seen as a time-consuming challenge. Cloud and containers have evolved alongside the adoption of automation tools like Chef and Puppet. A strict definition of what an environment should be is specified and locked down once, ensuring consistent, carbon-copy deployments across production, QA, testing, and development. The work of the QA team is more interesting because they're testing the real thing: they don't have to fake a process, and there's less friction for everyone involved.
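As a concrete illustration of the "strict definition, locked down once" idea, here is a minimal hypothetical Puppet manifest; the package, pinned version, file paths, and module name are invented for the example. Applying the same manifest to development, QA, and production nodes is what yields the carbon-copy environments described above.

```puppet
# Hypothetical manifest: one locked-down definition applied identically
# to dev, QA, and production nodes.
package { 'nginx':
  ensure => '1.24.0',   # pin an exact version rather than 'latest'
}

file { '/etc/nginx/nginx.conf':
  ensure  => file,
  source  => 'puppet:///modules/webapp/nginx.conf',
  require => Package['nginx'],
  notify  => Service['nginx'],  # restart the service if the config changes
}

service { 'nginx':
  ensure => running,
  enable => true,
}
```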
The first step was all about elastic, ephemeral computing; now the focus is moving into the data to ensure the reliability, scalability, and speed necessary to provide real-time analysis, operations, and analytics that impact the business as it happens.
The cloud is very elastic, which provides the additional benefit of resources on demand with rapid, agile deployment: get up and running quickly, and deploy changes to existing applications through an automated pipeline.
1) As the cloud has evolved, we've seen the integration of many services into a cohesive set to help with development and delivery. Three to four years ago, the systems didn't talk to each other. Now, APIs allow billing, CRM, and issue tracking to all talk to one another.
2) Companies continue to move to cloud-based deployment with IaaS on AWS and Azure. AWS is the default for smaller companies, while larger companies invest in both AWS and Azure. We see some smaller tech companies using Google.
We’re in the early stages of the adoption of containers in the cloud. AWS, Google, and Azure all offer containers. You need to re-architect your application to be able to deploy it into a container, and the use of containers depends on the type of application. What benefit does the application get from running in a container? You need an architecture, like microservices, that takes advantage of the container and aligns with its goal of low overhead.
It used to be a virtual machine (VM) world; now it's containerized applications. There used to be no PaaS, just raw processing; now PaaS is a way to roll out cloud-native applications. There's an interesting dynamic between customers embracing self-service and customers who want to know what's going on and how their services are going to be used.
First, see if a service or tool is already available; 99% of the time, it is. Microservices and API-first thinking prevail, and APIs provide flexibility. Chef, Puppet, and Ansible let us set up services the way we want and deploy them. Dockerfiles abstract one more layer on top.
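To illustrate that final layer of abstraction, here is a minimal hypothetical Dockerfile; the base image, port, and application entry point are assumptions for the example, not details from the article. Where Chef, Puppet, and Ansible configure the host, the Dockerfile bakes the runtime and dependencies into a single portable artifact.

```dockerfile
# Hypothetical Dockerfile: image, port, and entry point are placeholders.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and define how the container runs.
COPY . .
EXPOSE 8000
CMD ["python", "app.py"]
```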
In the past, when developers wanted to implement their application within an existing enterprise framework, they'd often find that the tools, platforms, and APIs that worked in one cloud didn't work as well in other environments. This is what I call the "Hotel California" effect. Today, the tools seen in existing enterprise frameworks can be mirrored in the public cloud, ensuring that what is developed in one cloud is easily transportable to other environments. Containers are getting a lot of interest right now because of their flexibility to allow applications to be deployed across clouds.
How has development and deployment in the cloud evolved from your perspective?
Opinions expressed by DZone contributors are their own.