Top 4 Innovations in Containers and Cloud Computing
Cloud and its related technologies continue to grow. Check out the fields that can be expected to advance quickly with cloud computing.
The container ecosystem has been maturing rapidly since Solomon Hykes deployed the first Docker container at PyCon in 2013, unveiling a tool that brought simplicity and portability to the software development community. Five years later, containers and container platforms remain at the heart of innovation in the cloud computing industry. From the announcement of Docker Engine 1.0 to breakthroughs in security, networking, and orchestration, the container industry continues to drive the emergence of new technologies that create efficiencies and enhance productivity for developers and IT professionals alike.
So what does that container innovation look like in 2018 and beyond? Here is an overview of the top 4 innovations in container technologies emerging in the cloud computing industry, which will be discussed in even greater detail at this year’s DockerCon 2018.
As with containers, serverless lets developers focus on application development without worrying about underlying infrastructure considerations such as the number of servers or the amount of storage. Although we’re still in the early days of serverless, with a limited number of apps in production, it’s becoming more and more apparent that containers and functions are interrelated. Now seems like a good time to take a closer look at Functions-as-a-Service (FaaS) options beyond proprietary cloud offerings such as AWS Lambda, Azure Functions, or Google Cloud Functions, which come with lock-in concerns for enterprises. Docker has enabled the creation of modern serverless frameworks such as Apache OpenWhisk, Fn, Gestalt, Nuclio, and OpenFaaS, which make it easy to build and deploy portable serverless applications.
These frameworks package functions as Docker images and run functions as Docker containers, and can be deployed on a container platform such as Docker Enterprise Edition. They let you structure your application as a set of functions that are triggered either by an event coming from an event bus, or by a call through an API gateway. This space is maturing with standardization: the CNCF Serverless working group recently unveiled an initial version of the OpenEvents specification for a common, vendor-neutral format for event data.
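To make the packaging model concrete, here is a minimal sketch of a function in the style of OpenFaaS’s Python template, which expects a `handle(req)` entry point in a `handler.py` file; the framework builds that file into a Docker image and runs it as a container behind its gateway. The event payload shape below is hypothetical.

```python
# handler.py -- a minimal function body in the style of a FaaS
# framework's Python template. The framework's build step wraps this
# file in a Docker image; its watchdog process invokes handle() per
# request and returns the result as the HTTP response body.
import json

def handle(req):
    """Parse the incoming event payload and return a greeting.

    `req` is the raw request body passed in by the framework;
    the payload keys here are purely illustrative.
    """
    try:
        event = json.loads(req)
        name = event.get("name", "world")
    except (json.JSONDecodeError, TypeError):
        name = "world"  # malformed or empty events fall back to a default
    return json.dumps({"message": f"Hello, {name}"})
```

A framework CLI (such as OpenFaaS’s `faas-cli`) typically pairs a handler like this with a small stack definition and a base image to produce the deployable container.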
Microservices architectures are becoming more popular as enterprises modernize their legacy applications, migrate workloads to the cloud, and build greenfield applications. Modern languages and products such as the Docker container platform have played a significant role in removing some of the complexity associated with both developing and deploying microservices. However, some challenges remain, the most important one being observability. The concept of a “service mesh” has recently emerged as a solution that manages the complexity of inter-microservice communication and provides observability and tracing in a seamless way.
Open source projects such as Envoy, Istio, and Linkerd provide a large set of features, including resiliency, service discovery, routing, observability, security, and inter-service communication protocols.
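To see what a mesh takes off the application’s shoulders, consider the resiliency logic services otherwise carry themselves. The sketch below is a hypothetical, framework-free example of client-side retries with exponential backoff; with a service mesh, the equivalent policy is declared once in the sidecar proxy’s configuration rather than reimplemented in every service.

```python
import time

def call_with_retries(request_fn, max_attempts=3, base_delay=0.1):
    """Client-side resiliency logic of the kind a mesh sidecar proxy
    (e.g. Envoy) can handle transparently: retry a failing
    inter-service call with exponential backoff.

    `request_fn` is any zero-argument callable that performs the call
    and raises on failure -- a hypothetical stand-in for an HTTP client.
    """
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # retries exhausted; surface the failure
            time.sleep(base_delay * (2 ** attempt))  # backoff: 0.1s, 0.2s, ...
```

Moving this policy into the proxy layer means it can be tuned per route, and changed, without touching or redeploying any service’s code.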
In addition to developers and IT pros, Docker products have become extremely popular with data scientists. From sharing reproducible data research and analysis to rapidly prototyping deep learning models, Docker containers bring many benefits to data science work. Additionally, with the portability of the Docker platform, data scientists have the flexibility to change their compute environment and leverage different compute resources as their data requirements change.
With the development of projects such as Kubeflow, it’s becoming easier to run machine learning workflows that leverage the TensorFlow open source machine learning framework on Kubernetes with Docker, improving both the portability and the scalability of running models. There have been many advances in running containerized machine learning workloads in production this year, from leveraging GPUs for containerized workloads, to using RDMA sockets to accelerate network transfer via a custom CNI plugin, to avoiding parameter servers with the Horovod project.
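The reproducibility benefit is easiest to see with a fully deterministic training script. The toy below uses plain Python gradient descent in place of TensorFlow so it stays self-contained; the point is that a seeded script baked into a pinned Docker image yields the same model weights on any host it runs on.

```python
import random

def train_linear_model(seed=42, epochs=200, lr=0.05):
    """Toy gradient-descent fit of y = 2x + 1 on synthetic data.

    A stand-in (plain Python rather than TensorFlow) for the kind of
    seeded training script one would containerize: with the same image
    and the same seed, every run produces identical weights.
    """
    rng = random.Random(seed)  # fixed seed => deterministic data and result
    data = [(x, 2.0 * x + 1.0) for x in (rng.uniform(-1, 1) for _ in range(50))]
    w, b = 0.0, 0.0
    for _ in range(epochs):
        # gradients of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
        grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

In practice the same property comes from pinning framework versions and seeds inside the image, so a colleague can `docker run` the training and reproduce the published numbers.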
Revenue opportunities in the financial and cryptocurrency markets have made blockchain technology one of the hottest trends of the past year. Blockchain frameworks such as Ethereum and Hyperledger make it possible to build modular applications in which multiple parties can record immutable, verifiable transactions without the need for an independent third party. Blockchain frameworks usually leverage the Docker platform both for developing the framework and as part of running it; Hyperledger Fabric, for example, “leverages containers to host smart contracts that comprise the application logic of the system.”
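The “immutable and verifiable” property rests on a simple mechanism: each block commits to a cryptographic hash of its predecessor, so tampering with any past record invalidates everything after it. A toy illustration, not modeled on any specific framework’s data structures:

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """Build a block whose hash commits to both its own transactions
    and the previous block's hash, chaining the history together."""
    body = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Re-derive every hash and check each block's back-pointer;
    any edit to an earlier block breaks the verification."""
    for i, block in enumerate(chain):
        body = json.dumps({"tx": block["tx"], "prev": block["prev"]}, sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

Real frameworks add consensus, signatures, and smart-contract execution on top, but this hash chaining is what makes the recorded history verifiable by every party.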
As new developments continue to emerge in cloud computing, containers will remain the baseline for innovation. By implementing a container platform like Docker Enterprise Edition, users and organizations alike gain a strong foundation that provides the security, operational agility, and choice of cloud or infrastructure needed to drive that innovation.