Containers and Clusters and Swarms, Oh My!
This article aims to explain the hype around Docker, Kubernetes, containerization, and microservices architectures in general.
It may sound like hyperbole, but put simply, there is a revolution afoot: cloud resources and smarter tools have enabled systems that deliver enormous leaps in computing efficiency and directly impact the bottom line. The result? New business models that exploit speed, ease, cost savings, and constant improvement. And to accompany these new models, consumer expectations have likewise rapidly evolved.
For example, driven by the software world’s continuous delivery model, cloud vendors will create and deploy multiple versions of an application per day. Constant enhancement of existing products and services and the uninterrupted flow of new product offerings are now normal. There is innovation in the market, but it comes with pressure. People have become accustomed to instantaneous improvement and expect companies to move fast. If you press a button on a web form, you expect an immediate result.
Witness companies like Netflix, Spotify, PayPal, and even The New York Times using cloud technology, containerization, and microservices architectures in combination to stay streaming, on demand, and ahead of the curve, 24/7. You'll hear reference to platforms like Docker and Kubernetes, which play a critical part in this hypercompetitive ecosystem, and businesses are naturally drawn to new processes and architectures that promise to let them move at lightning speed.
Containerization and Microservices
By breaking applications into small, discrete units (or containers), developers can fulfill their “write once, run anywhere” dreams. Consider Docker, the hottest open-source platform for software containerization, whose motto is “Build, Ship, Run.” As their website says, “Docker containers wrap a piece of software in a complete filesystem” that houses everything needed — code, runtime, system libraries — all in one little unit. Containers are self-sufficient and will reliably run the same regardless of environment, making them ideal for development and testing. They share the host operating system’s kernel, so they are very fast and very lightweight. They can be easily replicated, bundled, and ported, and are perfectly suited to automation.
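To make that “everything in one little unit” idea concrete, here is a minimal, hypothetical Dockerfile that bundles a small Python application with its runtime. The file name, base image, and application are illustrative, not taken from any real project:

```dockerfile
# Hypothetical Dockerfile: bundles code, runtime, and system libraries
# into one portable unit. Base image and file names are illustrative.

# Start from a small image that already provides the Python runtime.
FROM python:3.11-slim

# Work inside a dedicated directory in the image.
WORKDIR /app

# Copy the application code into the image.
COPY app.py .

# Define how the container runs when started.
CMD ["python", "app.py"]
```

Built once with `docker build`, the resulting image runs identically on a laptop, a test server, or a production cluster, which is exactly the “Build, Ship, Run” promise.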
Mirroring the principles behind containerization, microservices architectures provide an efficient way to design and deploy systems in units. A few years ago, for example, a typical banking application might initially prompt a customer to log in, then perform a security check on that user. These two steps would be handled by the software as one “macro” process with a direct dependency between the two actions. With microservices, that process is broken down into even smaller tasks. Log-in becomes its own microservice. Authentication becomes its own separate microservice. Once someone has completed log-in and progressed to authentication, the log-in process is done and those resources are released. If you make a change to the log-in process, you aren’t forced to also make changes to authentication (or any of the other myriad microservices that may follow), which makes everything move more quickly and efficiently.
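The log-in example above can be sketched in a few lines. This is a deliberately simplified, hypothetical sketch (the service names, fields, and token format are invented for illustration), but it shows the key property: each service has its own small interface and knows nothing about the other's internals.

```python
# Hypothetical sketch: log-in and authentication as two independent
# microservices. Names, fields, and token format are illustrative.

def login_service(username: str, password: str) -> dict:
    """Handles only the log-in step; knows nothing about what follows."""
    # A real service would check credentials against its own data store.
    session_token = f"session-{username}"
    return {"status": "logged_in", "token": session_token}

def auth_service(token: str) -> dict:
    """A separate service: verifies a token it is handed, nothing more."""
    if token.startswith("session-"):
        return {"status": "authorized"}
    return {"status": "denied"}

# The two services interact only through their inputs and outputs, so
# either one can be changed or redeployed without touching the other.
result = login_service("alice", "s3cret")
print(auth_service(result["token"]))  # {'status': 'authorized'}
```

Because the only contract between the two is the token passed from one to the other, rewriting the log-in logic (say, adding two-factor prompts) leaves the authentication service untouched.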
These processes all work together, but autonomously. They are discrete and their interactions are clean and concise. Individual developers can work on their own particular tasks in tandem with others without worrying about breaking the “whole” system. Maintenance on one service doesn’t interfere with the others. Each service is independently deployable, so the same service can be reused for a web app or a mobile app, selected and composed along with other services to your liking. They’re also independently scalable. You don’t have to test interactions with all the other services whenever you want to switch out or update any particular microservice — its function is isolated from the rest of the system.
Thus, containerization and microservices architectures enable a more flexible, more dynamic environment, providing an infrastructure (and by extension, a company) with the ability to quickly scale, innovate, and adapt to change.
And of course, all of this flexibility and scalability is fueled by cloud and/or hybrid deployments. In a microservices world, the grouping and orchestration of all these small units of software so that they work seamlessly and responsively together becomes paramount. Docker has Swarm, a clustering and scheduling tool for Docker containers that lets you manage a cluster of Docker nodes as a single virtual system. There’s also Kubernetes, a Google-designed container cluster manager that serves as a platform for automating the deployment of application containers across clusters of hosts.
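For a flavor of what “managing a cluster as a single system” looks like in practice, here is a minimal Kubernetes Deployment. The service and image names are hypothetical; the point is that you declare a desired state (three running replicas) and the cluster manager schedules containers across hosts to keep that state true:

```yaml
# Hypothetical Kubernetes Deployment: request three replicas of a
# containerized service; Kubernetes places and maintains them.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: login-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: login
  template:
    metadata:
      labels:
        app: login
    spec:
      containers:
        - name: login
          image: example/login-service:1.0  # illustrative image name
          ports:
            - containerPort: 8080
```

If a node fails or a container crashes, the cluster manager notices the divergence from the declared state and starts a replacement automatically.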
But the truth is that there is no tool to just let a company “flip the switch” and get everything they already have and everything they are currently building to be containerized, cloud-native, and integrated.
And the integration between the microservices is key. When a banking application has 150 microservices spun up in a host of containers that make up its basic customer-facing banking app, the question becomes: how do all these units communicate with each other to work together reliably? Technically, this is accomplished through application programming interfaces (APIs). Each service exposes an API built to interact with other applications, distributing and sharing appropriate information programmatically in all directions. Interconnecting and managing all these APIs to best leverage everything you have (your state-of-the-art microservices as well as legacy infrastructure, databases, third-party resources and tools, etc.) is not trivial.
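A stripped-down sketch of that API-based communication, using only the Python standard library: one tiny “accounts” service exposes a JSON endpoint over HTTP, and a consumer calls it exactly as another microservice would. The endpoint path and payload are invented for illustration, not from any real banking system.

```python
# Hypothetical sketch of two microservices talking over an HTTP API.
# Endpoint and payload are illustrative.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class AccountAPI(BaseHTTPRequestHandler):
    """A tiny 'accounts' microservice exposing one JSON endpoint."""
    def do_GET(self):
        if self.path == "/balance":
            body = json.dumps({"account": "demo", "balance": 100}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep demo output quiet

# Run the service in a background thread (port 0 = any free port),
# then call its API the way a separate consumer service would.
server = HTTPServer(("127.0.0.1", 0), AccountAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/balance"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()

print(data)  # {'account': 'demo', 'balance': 100}
```

The consumer knows nothing about how the accounts service is implemented or where it runs; it depends only on the API contract, which is what lets the service behind it be rewritten, scaled, or redeployed independently.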
Modern platform-as-a-service tools (also cloud-powered, naturally) usually have a visualization component or a complexity-reducing interface to ease the creation and management of functional microservices and integrations, but it all still requires considerable work.
Containerized methodologies have helped enterprises realize the true potential of cloud computing and the continuous delivery process, and certainly speed the production lifecycles as well as the pace of change. Containerization and microservices architectures are transforming entire industries, allowing enterprises to focus on and better manage customer experience, innovate and adapt very quickly, and scale on demand (all while controlling costs).
But any organization planning to join the revolution must address the proverbial “man behind the curtain.” Integrating these transformative processes and technologies into existing enterprise operations doesn’t occur by magic, and that reality should never be overlooked.