
Docker: A Simple, Powerful Approach to APIs

Docker and APIs have both become incredibly popular. Here's a look at why Docker is a powerful, simple approach to APIs, covering Docker basics, the API environment Docker provides, and the big question: why Docker?


In the world of APIs, few names are as big as Docker right now. The open-source platform, which delivers an entire development ecosystem to API users in a single application, is one of the hottest technologies being used by developers today.

It’s not hard to understand why Docker is so popular. It’s an easy-to-use system that integrates code, runtime, system tools, and libraries – basically anything you could install on a server. As a result, it becomes easy for developers to create, deploy, and run applications.


Docker 101: Packing it All into Containers

So what exactly is Docker? Quite simply, it's an implementation of lightweight Linux containers: isolated environments that each have their own process and network space. Docker automates the deployment of applications inside these software containers, adding a layer of abstraction and automation on top of operating-system-level virtualization. Rather than creating a full virtual machine, these containers sit on top of a single Linux instance, providing a small capsule with an application inside.
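For instance (a minimal sketch, assuming Docker is installed and the official alpine image can be pulled from Docker Hub), you can drop into an isolated shell inside a container with a single command, with no guest operating system to boot:

$ docker run -it --rm alpine /bin/sh

The --rm flag simply removes the container again once you exit the shell.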

One of the best features of Docker containers is that they allow developers to package an application with everything it needs, including libraries and additional dependencies, and ship it as a single package. By packaging up applications this way, you don’t need to run a virtual machine, meaning you can put as many apps as you want to onto a Linux host. Without needing to spin up a VM for each application, you actually have more processing power to use – either for more containers or for additional apps you want to run.
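As a rough sketch of what such a package looks like, a hypothetical Python API could be described by a Dockerfile like this (api.py and requirements.txt are assumed example files, not part of the original article):

# Start from an official Python base image
FROM python:3.9-slim
# Install the application's dependencies inside the image
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy the application code and define how to start it
COPY . .
CMD ["python", "api.py"]

Everything the application needs now travels inside the image, so the same container behaves the same way on a laptop, a CI server, or a production host.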

Simplifying API Development

As we mentioned at the top of this post, Docker delivers an entire development ecosystem to API users in one application, which greatly simplifies the API system at runtime. A Docker container includes an application and all its dependencies, while sharing the host system's kernel with the other applications running on it.

This means the container is free to run on any system, and it does away with the full guest operating system that a virtual machine needs, including its own binaries and libraries. As a result, the API ships with only what it absolutely needs.
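You can verify the shared kernel yourself (a quick check, assuming the alpine image is available): the kernel version reported from inside a container is the host's own:

$ uname -r                         # kernel version on the host
$ docker run --rm alpine uname -r  # the same version, reported from inside a container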

Why Docker? The Big Benefits

So with that background, what really makes Docker a good choice? Here are the main reasons:

It’s open. Because Docker builds on a wide range of open standards across both the Linux and Windows ecosystems, it can support most infrastructure configurations, and its open code base keeps the platform transparent.

It’s secure. With Docker containers, every application you’re working on is isolated from the others. This runs counter to the traditional model, in which APIs are interdependent and a breach of one API can easily expose the entire system. If you host a web API in a Docker container, you can also enforce HTTPS for encryption in transit. And because Docker is an open system, it’s regularly checked for security vulnerabilities by its user community; the Docker security center offers tools and best practices for hardening your implementation.

It cuts development time. Docker containers are extremely simple to build and launch, and images are easy to store. It’s also easy to extend an existing image when building a new container (a sketch of this follows this list). And, of course, because of the packaged development ecosystem approach, you can spend more of your time writing code and less on the system your application will be running on.

It uses common file systems and imaging. Docker employs common file systems and layered images, which means containers share a base image and kernel. As a result, fewer system resources are spent on redundant dependencies, which greatly reduces the footprint of APIs with many dependencies and makes API containers easier to use and understand.
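As a concrete (if hypothetical) illustration of the last two points, extending an existing image takes only a couple of Dockerfile lines, and the result shares every base layer with anything else built on the same image; the html/ directory and the tag below are assumptions for the example:

# Reuse an existing image and add only what changes
FROM nginx:alpine
COPY ./html /usr/share/nginx/html

$ docker build -t myteam/static-api .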

Trying it Out

For all its benefits, Docker is an extremely simple technology to use. Docker builds images automatically by reading the instructions from a Dockerfile: a text document containing all the commands you could call on the command line to assemble an image. Using docker build, you can create an automated build that executes several command-line instructions.

The docker build command builds an image from a Dockerfile and a context – the set of files at a specified location, either a PATH or a URL. The PATH is a directory on your local file system, and the URL is the location of a Git repository. A simple build command using the current directory looks like this:

$ docker build . 
Sending build context to Docker daemon  6.51 MB
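If the context lives in a Git repository instead, you can pass its URL straight to the build (the repository below is a made-up placeholder):

$ docker build https://github.com/example/myapp.git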

If you want to use a file in the build context, the Dockerfile refers to the file with an instruction, such as a COPY instruction. And if you want to increase the build’s performance, you can exclude files and directories by adding a .dockerignore file to the context directory. 
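A small sketch of both, with assumed file names: the Dockerfile copies a single file out of the context, while the .dockerignore keeps bulky paths from being sent to the daemon at all:

# In the Dockerfile: pull config.json from the build context into the image
COPY config.json /app/config.json

# In .dockerignore: paths excluded from the build context
.git
node_modules
*.log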

The Dockerfile is located in the root of the context. You can point to a Dockerfile anywhere on your file system by using the -f flag with docker build.
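For example, assuming your Dockerfiles are kept in a dockerfiles/ subdirectory (a made-up layout for illustration):

$ docker build -f dockerfiles/Dockerfile.debug .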

You can also specify a repository and tag where you want to save the new image, providing the build succeeds:

$ docker build -t shykes/myapp .

The Docker daemon runs your steps and commits the results to a new image. It then outputs the ID of the new image, automatically cleaning up the context you sent. 
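From there you can confirm the image exists and start a container from it; the tag reuses the one from the build command above, and the port mapping is an assumption for illustration:

$ docker images shykes/myapp
$ docker run -d -p 8080:8080 shykes/myapp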

For more in-depth Docker discussion and many more sample commands, you can work through a Docker web tutorial.



Published at DZone with permission of Sheena Chandok, DZone MVB.
