Setting Up Your Java Pipeline With Azure DevOps and Docker
This tutorial demonstrates how to set up an Azure pipeline for automated building and deployment of a Java application, using Git for version control.
Understanding how to organize a pipeline from development to operation has, in my experience, proven to be quite the endeavor. This article seeks to guide you through the required tools necessary to deploy your code as Docker containers by going through the steps involved in creating a simple "Hello World" application (although preexisting projects are also easily applicable for this approach).
If you have come far enough to consider a pipeline for your project, I expect you to be familiar with some of the simpler tools (e.g. Git, Java, Maven) involved within this process and will not cover these in-depth.
You may also enjoy: Building CI/CD Pipelines for Java Using Azure DevOps (Formerly VSTS)
To go about making a pipeline for our "Hello World" application, the following subjects will briefly be covered:
To make things clear: our goal is to be able to run
docker run <dockerid>/<image>:<tag> on any machine, having previously done nothing more than a
git push on master. This is an attempt to create a foundation for future CI/CD implementations, ultimately leading to a DevOps environment.
One of the prerequisites for this walk-through is the Azure DevOps platform. I highly encourage using the full package, but only the Repos and Pipelines modules are required. So, if you have not already, you should sign yourself up and create a project. After doing so, we can proceed onto the Repos module.
This module provides some simple tools for maintaining a repository for your code. While a repository could just as easily be hosted on something like GitHub, this module offers solid synergy between repositories and pipelines.
After you click on the module, you will be met with the usual Git preface for setting up a repository. I highly recommend using the SSH methods for long term usage (if this is unknown to you, see Connect to your Git repos with SSH). Now, after setting it up, you will be able to clone the repository onto your computer.
Continuing, we will create a Maven project within the repository folder using IntelliJ IDEA (other IDEs can be used, but I will only cover IntelliJ) that ultimately prints the famous sentence, "Hello World!" (for setting up a project with Maven, see Creating a new Maven project - IntelliJ). This should leave you with the standard Maven project tree: a pom.xml at the root and a src/main/java source folder.
Finish off by creating a main class in src/main/java:
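A minimal version of such a class could look like this (the class name Main is an assumption; any name works as long as it matches the main class configured in the pom later on):

```java
// Main.java — the whole application: print the famous sentence.
public class Main {

    // Kept in a separate method so the message is easy to reference and test.
    static String greeting() {
        return "Hello World!";
    }

    public static void main(String[] args) {
        System.out.println(greeting());
    }
}
```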
But before pushing these changes to master, a few things need to be addressed.
Maven provides developers with a powerful software management tool configurable from one location, the pom.xml file. Looking at the generated pom file in our project we will see the following:
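A freshly generated pom is typically as bare as this (the groupId is whatever you chose during project creation; testing-helloworld is assumed here so the name matches the .jar files used later):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>testing-helloworld</artifactId>
    <version>1.0.0</version>
</project>
```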
In our case, the only really interesting part of the pom file is the version tag. The reason being that upon each push of our source code to master, the pipeline will expect a new version, enforcing good practice.
As an extension, we need to make Maven create an executable .jar file with a manifest of where the main class is to be located. Luckily, we can just use their own Maven plugin:
The only thing you might want to change is the name of the main class. Remember the package prefix if the class is not located directly in src/main/java (I prefer referencing the class through a property, but you can insert the name directly into the plugin configuration if you like).
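A sketch of this, assuming the maven-jar-plugin and a property named mainClass (the plugin version and property name are illustrative):

```xml
<properties>
    <!-- Fully qualified name of the class holding the main method -->
    <mainClass>Main</mainClass>
</properties>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-jar-plugin</artifactId>
            <version>3.2.0</version>
            <configuration>
                <archive>
                    <manifest>
                        <!-- Written into META-INF/MANIFEST.MF so the jar is executable -->
                        <mainClass>${mainClass}</mainClass>
                    </manifest>
                </archive>
            </configuration>
        </plugin>
    </plugins>
</build>
```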
Lastly, before committing our additions to master, we need to build the target folder, which includes our .jar file. This can be done either directly through IntelliJ or in the terminal (if you have Maven installed): simply press the "package" lifecycle phase in the UI, or run
mvn package in the terminal. Upon completion, a .jar file will have appeared in the target folder.
This concludes the initial setup necessary for our pipeline and we can now finally push our changes to master.
Most of you are probably quite familiar with Git, but I will go ahead and cover what needs to be done anyway.
The Git tool provides us with a distributed version control system easily accessible from anywhere. Now, provided we correctly configured our repository in Azure Repos, cloned it to our local computer and initialized the IntelliJ project within that folder, it should be straightforward.
As all of our added files have yet to be staged, run
git add . to stage every changed or added file. Then run
git commit -m "initial commit" to commit the staged files. Lastly, run
git push to push the committed files to master.
You might now be wondering, "Has all the magic happened?" And the answer would be no. In fact, not much has happened. We have created a repository and filled it with a Maven project that prints "Hello World" when invoked, which in all honesty, is not much of an achievement. But, more importantly, we have established a foundation for our pipeline.
Pipelines, the star of the show, provides us with build and deployment automation. It enables us to customize what should happen whenever a build is triggered (in our case by pushing to master).
Let me take you through the process of setting up a simple pipeline.
- First, go to the Azure DevOps Pipeline module. This will present you with a single button "Create Pipeline," press it.
- We will now be prompted for the location of our code, and since we used Azure Repos, press "Azure Repos Git."
- It will now look through your repositories. Press the one you pushed the Maven project onto.
- Since it is a Maven project, select "Maven."
- You should now be presented with the following azure-pipelines.yml file:
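The generated file varies slightly between template versions, but a typical result of the Maven template looks like this:

```yaml
# Maven
# Build your Java project and run tests with Apache Maven.

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.8'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    goals: 'package'
```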
Do not think too much about the semantics of the file. The important thing to know now is that the trigger is set to master and the steps include a task for Maven. For more information about the Maven inputs, see Maven task.
If everything looks in order, press "Save and run" in the top-right corner to add the azure-pipelines.yml file to the repository. The pipeline will then be activated and run its first job.
Docker, the final piece of the puzzle, provides us with OS-level virtualization in the shape of containers, with lots of versatility and opportunity. We need this tool to deploy our builds on machines, and luckily, it is greatly integrated into the Azure DevOps platform. To fully utilize its many capabilities, you will need to register on DockerHub.
- After registration, create a repository with the name of your application. Then choose whether or not to make it public (you can only have one private repository with the free plan).
- Next we need to authorize DockerHub into our Azure DevOps project. To do this go back to Azure DevOps and click on 'Project Settings' in the bottom-left corner.
- Choose "Pipelines," then "Service connections."
- Now click the top-right button "New service connection," search for "Docker registry," mark it, and hit next.
- Choose "Docker Hub" as the registry type.
- Fill in the remaining fields (the service connection name is up to you). You should now be able to see your entry below "Service Connections."
The connection will make itself relevant later, but for now, we need to go back to the project and add a few things. Since the azure-pipelines.yml file was added to the repository, a
git pull needs to be called to fetch the newest changes. Furthermore, we need to define our Docker image using a Dockerfile. Create a new file in the root of the project, next to the pom.xml, and name it "Dockerfile."
The Dockerfile should be considered a template for containers much like classes are for objects. What needs to be defined in this template is as follows:
- We need to set a basis for the virtual environment (FROM openjdk:8).
- We need to copy our .jar file into the image (COPY target/testing-helloworld-?.?*.jar .).
- We need to run the .jar file upon initialization (CMD java -jar testing-helloworld-?.?*.jar).
You should now have a file looking similar to this:
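Putting the three steps together (the jar name assumes the artifactId used earlier):

```dockerfile
# Base image providing a Java 8 runtime
FROM openjdk:8

# Copy the packaged jar from Maven's target folder into the image
COPY target/testing-helloworld-?.?*.jar .

# Run the jar when a container is started; the shell form lets the wildcard expand
CMD java -jar testing-helloworld-?.?*.jar
```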
The wildcards simply account for different versions being deployed; the rest of the name has to match the .jar file from the target folder.
To sum up our current progress: we have now made a Maven project, linked it to a pipeline, and created a template for the virtual environment. The only thing missing is to connect everything via the azure-pipelines.yml file.
Firstly, we will need to add some variables for the DockerHub connection, as well as the ever-changing version number, to the azure-pipelines.yml file (insert your own service connection and Docker repository):
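A sketch of such a variables block (the variable names and both values are placeholders for your own):

```yaml
variables:
  # Name of the Docker Registry service connection created earlier
  dockerConnection: 'my-dockerhub-connection'
  # Your Docker Hub repository, i.e. <dockerid>/<image>
  imageRepository: '<dockerid>/testing-helloworld'
  # Keep this in sync with the version tag in pom.xml
  version: '1.0.0'
```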
These variables are not strictly necessary but it never hurts to follow the DRY principle. Secondly, we need to add more tasks to our pipeline steps. What needs to happen is: log in to Docker, build the Dockerfile previously defined, and push the image to our DockerHub repository.
One at a time, we add the wanted behavior, starting with the Docker login:
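Using the Docker@2 task, a login step could look like this (dockerConnection is assumed to be a variable holding your service connection name):

```yaml
- task: Docker@2
  displayName: Login to Docker Hub
  inputs:
    command: login
    containerRegistry: $(dockerConnection)
```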
Then the Docker build:
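A build step along the same lines (imageRepository and version are assumed variables):

```yaml
- task: Docker@2
  displayName: Build Docker image
  inputs:
    command: build
    repository: $(imageRepository)
    tags: $(version)
    Dockerfile: '**/Dockerfile'
```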
And lastly, the Docker push:
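And a matching push step, reusing the same repository and tag:

```yaml
- task: Docker@2
  displayName: Push Docker image
  inputs:
    command: push
    repository: $(imageRepository)
    tags: $(version)
```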
Your azure-pipelines.yml file should now combine the variables and all of these tasks (with the addition of
mavenAuthenticateFeed: true in the Maven@3 inputs).
Understandably, this might be a little overwhelming, but fear not: it looks more complicated than it really is. For more information about these inputs, see Docker task.
Finally, we get to see the magic happen. Before doing so, however, let me walk you through the routine procedure for pushing to the pipeline:
- Go into the pom.xml and the azure-pipelines.yml file and increment the version number.
- Run the Maven lifecycle phase
clean to remove earlier .jar files from the target folder.
- Run the Maven lifecycle phase
package to build and package your code (creating the new .jar file).
- Provided you are on the master branch, run the git commands:
git add .
git commit -m "commit message"
git push
- Check whether or not the job passes in the pipeline.
If everything went as it should, you have now uploaded an image with your .jar file to the associated DockerHub repository. Running this image now only requires the host to have Docker installed. Let us try it!
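Assuming the repository and tag used earlier (both are placeholders for your own values), that is a single command:

```
# Pull the image from Docker Hub and run it in a fresh container
docker run <dockerid>/testing-helloworld:1.0.0
```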
The command initiates a container from the requested repository: the image is retrieved, instantiated, and run, with the final output displaying "Hello World!"
This concludes the guide for setting up your Java Pipeline with Azure DevOps and Docker.
By now it should hopefully be clear why this approach has its benefits. It enables the developer to form a run-time environment (the Dockerfile) and ship it to operations with little to no effort (a
git push ). While it has not been covered, this approach also creates artifacts in Azure DevOps, which is very useful when using something like Maven, as it makes dependencies surprisingly easy to manage.
Since this approach only recently made it into our team, it is still under development and a lot of additions are still to be made. I highly encourage you to further expand upon your pipeline by making it fit your exact needs.
I hope this guide has proven to be useful as well as practical, and should you have any further questions feel free to comment below.
Thank you for reading.