Docker, the Last Tool I’ll Ever Install?
Let's take a quick look at Docker and explore whether it might be the last tool you'll ever need to install, and how that could be possible.
For the last few years, I’ve been maintaining a personal list that grows longer and longer as I move between new companies, roles, and peers who recommend some new must-have technology. This isn’t a bad thing; in fact, I love installing a new tool on my laptop for the first time. It almost feels like unwrapping a new flagship gadget straight from the store and peeling back the plastic to reveal the bright, shiny toy you just purchased. Keep in mind, I enjoy doing this ONCE, but when once becomes twice and twice becomes three times, I’m no longer amused.

Some tools are easier to set up than others; a new JDK or a Maven installation is not a big deal. Tools that require a more complex setup can be annoying, and running all of the startup checks to make sure everything is configured correctly is time-consuming. I have personally switched between 5 different laptops in just the past 2 years, across a mixture of Linux, Mac OS X, and Windows 7/10. Every new machine requires me to dust off my list, download and collect all of the installers or correct binaries I need, and then install and tune each tool to my specifications. This costs time, clutters up my disk and registry, and when it’s time to update a particular tool, there may be additional steps beyond accepting a few license agreements and letting the tool update itself.

What if Docker could solve all of that overhead for me? I think it can, and I think I can explain why.
Docker can be installed on any developer OS, giving me the ability to run almost every modern tool I can think of (most are found on DockerHub). When it’s time to update a tool, I can do so with minimal effort and make a clean break from the old version on my host OS. Lastly, I can take any of these Docker images and run the tool they contain on my host OS as though it were installed locally. Thus ends the theory; now to the facts.
As I stated before, I have migrated between multiple OSes and hardware over the years, so I need a solution that is OS agnostic. Docker is perfect for this, as it provides installers and instructions for Mac, Windows, and Linux (simply navigate to the installation page, https://docs.docker.com/install/, and find the instructions for your host OS). Once I have installed Docker on my host, I have the world at my fingertips. My first trip is over to DockerHub, where I can begin assembling my weaponry for development. Maybe I’ll need the AWS CLI (governmentpaas/awscli:latest), Terraform (hashicorp/terraform:latest), Chef (chef/chefdk:latest), JDK 8 (openjdk:8-jdk-alpine), a Maven installation (maven:latest), and possibly a dozen other tools. No problem: I can find each of these on DockerHub, or, if I have access to a private registry, all the better. Keep in mind that you’ll likely find a dozen images for each tool, so just decide which one contains the right version for your needs. Installing any of the tools mentioned above is as easy as running "docker pull IMAGENAME:TAG" on the command line. For example, if I wanted to pull the latest Maven image, I would simply run "docker pull maven:latest". A simple series of pull commands, and I have now set up my machine with all of the tools I need to make my laptop a useful instrument of war in the software arena.
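As a sketch, pulling the images named above looks like this (the tags are illustrative; on a real machine you may want to pin specific versions rather than rely on latest):

```shell
# Pull each tool's image once; each pull is the whole "install".
docker pull maven:latest
docker pull openjdk:8-jdk-alpine
docker pull hashicorp/terraform:latest
docker pull governmentpaas/awscli:latest
docker pull chef/chefdk:latest

# List what is now available locally
docker images
```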
Now, keep in mind that all of these Docker images do come with a disk-space cost, and I’m not going to lie: the cost may be even more significant than the bare tools themselves. But if it’s any consolation, this upfront cost is essentially all you’ll need to pay. As you start containers based on these images, you should remove them when finished to keep your disk pristine. Think of the images as a way to run stateless containers on your local machine; when you no longer need access to a tool, you can collapse its container and have it removed from disk, leaving behind only the image from which it came, with nothing cluttering up your disk, your registry, or your PATH.

OK, now that we have come to terms with the disk-space cost of a tool install, let’s think about upgrading that tool when a new Docker image is released. Once I notice that a new version of Maven is out and I’m ready to consume it, upgrading is as simple as running "docker pull maven:latest" again. This pulls down only the new layers the author added to the image; you are not pulling an entirely new image from scratch, so both the extra disk space and the update time are limited to the size of those new layers, which is nice. Best of all, the command is the same for every other tool when it comes time to update them as well. Simple!
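As a rough sketch of that update cycle (assuming the previously tagged image becomes untagged, i.e. dangling, after the new pull):

```shell
# Pull the new version; only new or changed layers are downloaded
docker pull maven:latest

# The old image is now dangling (untagged); prune it to
# reclaim the disk space of its obsolete layers
docker image prune -f
```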
OK, at this point I probably haven’t shared anything that a person with practical Docker experience didn’t already know, which is why I’ve saved the best for last: how do I actually go about setting up my machine to use these tools now that I have assembled my repository of images? Let’s start by stating this: in order to use a tool pre-installed in a Docker image, I need a running container. Simply running a container and then tunneling inside it to run my commands might work, but that would be less than ideal. What I really want is to stay on my host OS and have an easy way to execute commands inside the container through my host OS terminal shell.

Let’s look at a use case involving a Java project that builds with Maven. Say I’m on my host OS in a directory containing my Java project, with the project’s pom.xml file in my current directory. I haven’t locally installed Maven, or Java for that matter, but I want to quickly create a jar from this project. First, I need to launch a container so that I can use the Maven command. Running "docker run -dit --rm --name maven --entrypoint=/bin/sh -v $(pwd):/tmp -w /tmp maven:latest" starts up a new container in the background and allocates an interactive terminal to it. It also sets the container to be removed from disk when it stops, gives it the easy-to-remember name "maven", mounts my current directory as a volume inside the container at /tmp, and sets the container’s working directory to /tmp. Now, to make life a little easier, I can create an alias so that my Maven commands are transparently run inside the new container: alias maven="docker exec -it maven mvn".
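Put together, the start-up and alias steps above look like this (the /tmp mount point and the name "maven" are just the choices from this walkthrough):

```shell
# Start a long-lived Maven container in the background:
#   -dit : detached, interactive, with a TTY allocated
#   --rm : remove the container from disk when it stops
#   -v   : mount the current project directory at /tmp inside the container
#   -w   : make /tmp the container's working directory
docker run -dit --rm --name maven \
  --entrypoint=/bin/sh \
  -v "$(pwd)":/tmp -w /tmp \
  maven:latest

# Route the host command "maven" through the running container
alias maven="docker exec -it maven mvn"
```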
What this alias does is interpret my command "maven" and run a docker exec against the container named "maven", which runs the "mvn" command from within the container. That’s it; I can now create a jar from my Java project by running "maven clean package". I see the output in my host terminal as though the command were running outside the container, and when it finishes, I have a newly created target/ directory inside my current directory containing the jar file built by the package command. Cleanup just involves stopping the running container: "docker stop maven" will stop it, and the "--rm" flag we applied at startup will remove it.
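One caveat worth noting: aliases only expand in interactive shells, so if you want the same shortcut inside shell scripts, a small function wrapper does the same job (a sketch, assuming the container is named "maven" as above):

```shell
# Behaves like the alias, but also works in non-interactive scripts;
# "$@" forwards every argument (e.g. clean package) to mvn in the container
maven() { docker exec -it maven mvn "$@"; }
```

After sourcing this, "maven clean package" works exactly as described above, from a script or a prompt.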
Granted, I only provided a very simple use case, but I think you can now see the power of Docker installed on a blank machine. I can use tools like Terraform, Chef, or even the AWS CLI; I can mount host volumes into running containers; and I can even create aliases that allow me to run my commands without ever setting foot inside a running container. And what I’ve mentioned in terms of tools only scratches the surface; I can also easily run my own Jenkins server and set up personal, custom builds.
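The same pattern generalizes to the other tools mentioned; for short-lived commands, a one-shot container may be simpler than a long-lived one. A sketch (the entrypoint behavior of these particular images is an assumption, so check each image's documentation):

```shell
# One-shot containers: mount the working directory, run the command,
# and let --rm discard the container afterwards. Single quotes keep
# $(pwd) unevaluated until the alias is actually used.
alias terraform='docker run --rm -v "$(pwd)":/work -w /work hashicorp/terraform:latest'
alias aws='docker run --rm -v "$(pwd)":/work -w /work governmentpaas/awscli:latest aws'
```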
Opinions expressed by DZone contributors are their own.