
Setting Up Your Java Pipeline With Azure DevOps and Docker


This tutorial demonstrates how to set up an Azure pipeline for automated building and deployment of a Java application, using Git for version control.


Introduction

Understanding how to organize a pipeline from development to operation has, in my experience, proven to be quite the endeavor. This article seeks to guide you through the tools required to deploy your code as Docker containers by going through the steps involved in creating a simple "Hello World" application (though the approach applies just as easily to existing projects).

If you have come far enough to consider a pipeline for your project, I expect you to be familiar with some of the simpler tools involved in this process (e.g., Git, Java, Maven), so I will not cover them in depth.

You may also enjoy: Building CI/CD Pipelines for Java Using Azure DevOps (Formerly VSTS)

To go about making a pipeline for our "Hello World" application, the following subjects will briefly be covered:

  1. Azure DevOps
  2. Azure Repos
  3. Maven
  4. Git
  5. Azure Pipelines
  6. Docker  

To make things clear: our goal is to be able to run  docker run <dockerid>/<image>:<tag>  on any machine, having previously only run  git push  on master. This is an attempt to create a foundation for future CI/CD implementations, ultimately leading to a DevOps environment.
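In concrete terms, the finished loop should look roughly like this (the Docker ID, image name, and tag below are placeholders for your own values):

Shell

# on the development machine: push to master to trigger the pipeline
git push

# on any machine with Docker installed: run the published image
docker run <dockerid>/<image>:<tag>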

Azure DevOps 

One of the prerequisites for this walk-through is the Azure DevOps platform. I highly recommend the full package, but only the Repos and Pipelines modules are required. So, if you have not already, sign yourself up and create a project. After doing so, we can proceed to the Repos module.

Azure Repos 

This module provides some simple tools for maintaining a repository for your code. While a repository could just as easily be hosted on GitHub, this module integrates tightly with Azure Pipelines.

After you click on the module, you will be met with the usual Git preface for setting up a repository. I highly recommend using the SSH method for long-term use (if this is unknown to you, see Connect to your Git repos with SSH). After setting it up, you will be able to clone the repository onto your computer.
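For reference, cloning over SSH looks something like the following; the organization, project, and repository names are placeholders for your own:

Shell

# clone the Azure Repos repository over SSH (placeholder names)
git clone git@ssh.dev.azure.com:v3/<organization>/<project>/<repository>
cd <repository>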

Continuing, we will create a Maven project within the repository folder using IntelliJ IDEA (other IDEs can be used, but I will only cover IntelliJ) that ultimately prints the famous sentence, "Hello World!" (for setting up a project with Maven, see Creating a new Maven project - IntelliJ). This should leave you with a project tree like so:

Hello World project tree
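In case the image does not render, the layout of a freshly created Maven project with the coordinates used below looks roughly like this (give or take IDE metadata files):

Shell

$ tree testing-helloworld
testing-helloworld
├── pom.xml
└── src
    ├── main
    │   └── java
    └── test
        └── java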

Finish off by creating a main class in src/main/java:


Java

public class Main {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}


But before pushing these changes to master, a few things need to be addressed.

Maven 

Maven provides developers with a powerful software management tool configurable from one location: the pom.xml file. Looking at the generated pom file in our project, we will see the following:

XML

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>surgo.testing</groupId>
    <artifactId>testing-helloworld</artifactId>
    <version>1.0</version>
</project>


In our case, the only really interesting part of the pom file is the version tag. The reason: upon pushing our source code to master, Maven will require a new version each time, enforcing good practice.
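If you would rather not edit the version by hand on every push, the versions-maven-plugin can bump it for you; the new version below is just an example:

Shell

# bump the project version before pushing (example value)
mvn versions:set -DnewVersion=1.1
mvn versions:commit   # removes the pom.xml.versionsBackup left by versions:set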

As an extension, we need to make Maven create an executable .jar file with a manifest stating where the main class is located. Luckily, we can just use the official maven-jar-plugin:

XML

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>surgo.testing</groupId>
    <artifactId>testing-helloworld</artifactId>
    <version>1.0</version>

    <properties>
        <main.class>Main</main.class>
    </properties>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.1.2</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <classpathPrefix>lib/</classpathPrefix>
                            <mainClass>${main.class}</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>


The only thing you might want to change is the name of the main class (the main.class property). Remember to include the package name if the class is not located directly in src/main/java (I prefer using a property, but you can insert the name directly into the mainClass tag if you like).

Lastly, before committing our additions to master, we need to build the target folder, which includes our .jar file. This can be done either directly through IntelliJ or in the terminal (if you have Maven installed). Simply press the lifecycle "package" in the UI, or run  mvn package  in the terminal. Once it finishes, a .jar file will have appeared in the target folder:
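From the terminal, the same step looks like this; the .jar name follows the artifactId and version from the pom:

Shell

# build the executable .jar into the target folder
mvn package

# the artifact is named <artifactId>-<version>.jar
ls target/testing-helloworld-1.0.jar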

This concludes the initial setup necessary for our pipeline and we can now finally push our changes to master.

Git 

Most of you are probably quite familiar with Git, but I will go ahead and cover what needs to be done anyway.

The Git tool provides us with a distributed version control system easily accessible from anywhere. Now, provided we correctly configured our repository in Azure Repos, cloned it to our local computer and initialized the IntelliJ project within that folder, it should be straightforward. 

As all of our added files have yet to be staged, run  git add .  to stage every changed or added file. Then run  git commit -m "initial commit"  to commit the staged files. Lastly, run  git push  to push the commit to master.

You might now be wondering, "Has all the magic happened?" And the answer would be no. In fact, not much has happened. We have created a repository and filled it with a Maven project that prints "Hello World" when invoked, which in all honesty, is not much of an achievement. But, more importantly, we have established a foundation for our pipeline. 

Azure Pipelines 

Pipelines, the star of the show, provides us with build and deployment automation. It enables us to customize what should happen whenever a build is triggered (in our case by pushing to master).

Let me take you through the process of setting up a simple pipeline.

  1. First, go to the Azure DevOps Pipeline module. This will present you with a single button "Create Pipeline," press it. 
  2. We will now be prompted for the location of our code, and since we used Azure Repos, press "Azure Repos Git." 
  3. It will now look through your repositories. Press the one you pushed the Maven project onto.
  4. Since it is a Maven project, select "Maven."
  5. You should now be presented with the following azure-pipelines.yml file:
YAML

# Maven
# Build your Java project and run tests with Apache Maven.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/java

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    mavenOptions: '-Xmx3072m'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    goals: 'package'

Do not think too much about the semantics of the file. The important thing to know now is that the trigger is set to master and the steps include a task for Maven. For more information about the Maven inputs, see Maven task.

If everything looks in order, press "Save and run" in the top-right corner to add the azure-pipelines.yml file to the repository. The pipeline will then be activated and run its first job.


Docker 

Docker, the final piece of the puzzle, provides us with OS-level virtualization in the shape of containers, offering lots of versatility and opportunity. We need this tool to deploy our builds onto machines, and luckily, it is well integrated into the Azure DevOps platform. To fully utilize its many capabilities, you will need to register on Docker Hub.

  1. After registration, create a repository with the name of your application. Then choose whether or not to make it public (you can only have one private repository with the free plan).
  2. Next, we need to connect Docker Hub to our Azure DevOps project. To do this, go back to Azure DevOps and click on "Project Settings" in the bottom-left corner.
  3. Choose "Pipelines" > "Service connections."
  4. Now click on the top-right button "New service connection," search for "Docker registry," mark it, and hit "Next."
  5. Choose "Docker Hub" as the registry type.
  6. Fill in the remaining fields (the service connection name is up to you). You should now be able to see your entry below "Service Connections." 
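If you want to sanity-check the Docker Hub credentials locally before wiring them into Azure DevOps, a plain Docker CLI login works; the username here is a placeholder:

Shell

# optional: verify the Docker Hub credentials from your own machine
docker login -u <your-dockerhub-username>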

The connection will make itself relevant later, but for now, we need to go back to the project and add a few things. Since the azure-pipelines.yml file was added to the repository, a  git pull  needs to be run locally to fetch the newest changes. Furthermore, we need to define our Docker image using a Dockerfile. Create a new file in the root of the project and name it "Dockerfile." Your project tree should now look something like this:

Project tree with the added Dockerfile

The Dockerfile should be considered a template for containers much like classes are for objects. What needs to be defined in this template is as follows:

  1. We need to set a basis for the virtual environment (FROM openjdk:8).
  2. We need to copy our .jar file onto the virtual environment (COPY /target/testing-helloworld-?.?*.jar .).
  3. We need to run the .jar file upon initialization (CMD java -jar testing-helloworld-?.?*.jar).

You should now have a file looking similar to this:

Dockerfile

FROM openjdk:8
COPY /target/testing-helloworld-?.?*.jar .
CMD java -jar testing-helloworld-?.?*.jar


The wildcard pattern simply accounts for different versions being deployed, but the resulting name still has to match the .jar file in the target folder.

To sum up our current progress, we have now made a Maven project, linked it to a pipeline, and created a template for the virtual environment. The only thing missing is to connect everything via the azure-pipelines.yml file.

Firstly, we will need to add some variables for the Docker Hub connection as well as the ever-changing version number to the azure-pipelines.yml file (insert your own service connection and Docker repository):

YAML

...
variables:
  containerRegistryServiceConnection: saban17-testing
  imageRepository: saban17/testing-helloworld
  tag: 1.0.0
...

These variables are not strictly necessary, but it never hurts to follow the DRY principle. Secondly, we need to add more tasks to our pipeline steps. What needs to happen is: log in to Docker, build the Dockerfile previously defined, and push the image to our Docker Hub repository.

One at a time, we add the wanted behavior, starting with the Docker login:

YAML

  - task: Docker@2
    displayName: dockerLogin
    inputs:
      command: login
      containerRegistry: $(containerRegistryServiceConnection)

Then the Docker build:

YAML

  - task: Docker@2
    displayName: dockerBuild
    inputs:
      repository: $(imageRepository)
      command: build
      Dockerfile: Dockerfile
      tags: |
        $(tag)

And lastly, the Docker push:

YAML

  - task: Docker@2
    displayName: dockerPush
    inputs:
      command: push
      containerRegistry: $(containerRegistryServiceConnection)
      repository: $(imageRepository)
      tags: |
        $(tag)

You should now have an azure-pipelines.yml file looking similar to this (with the addition of  mavenAuthenticateFeed: true  in the Maven@3 inputs):

YAML

trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

variables:
  containerRegistryServiceConnection: saban17-testing
  imageRepository: saban17/testing-helloworld
  tag: 1.0.0

steps:
  - task: Maven@3
    inputs:
      mavenPomFile: 'pom.xml'
      mavenOptions: '-Xmx3072m'
      javaHomeOption: 'JDKVersion'
      jdkVersionOption: '1.8'
      jdkArchitectureOption: 'x64'
      publishJUnitResults: true
      mavenAuthenticateFeed: true
      testResultsFiles: '**/surefire-reports/TEST-*.xml'
      goals: 'package'

  - task: Docker@2
    displayName: dockerLogin
    inputs:
      command: login
      containerRegistry: $(containerRegistryServiceConnection)

  - task: Docker@2
    displayName: dockerBuild
    inputs:
      repository: $(imageRepository)
      command: build
      Dockerfile: Dockerfile
      tags: |
        $(tag)

  - task: Docker@2
    displayName: dockerPush
    inputs:
      command: push
      containerRegistry: $(containerRegistryServiceConnection)
      repository: $(imageRepository)
      tags: |
        $(tag)

Understandably, this might be a little overwhelming, but fear not: it looks more complicated than it really is. For more information about these inputs, see Docker task.

Finally, we get to see the magic happen. However, before doing so, I need to tell you the routine procedure for pushing to the pipeline (summarized as terminal commands after the list):

  1. Go into the pom.xml and the azure-pipelines.yml file and increment the version number.
  2. Run the Maven lifecycle  clean  to remove earlier .jar files from the target folder.
  3. Run the Maven lifecycle  package  to build and package your code (creating the new .jar file).
  4. Provided you are on the master branch, run the Git commands:
    1.  git add . 
    2.  git commit -m "commit message" 
    3.  git push 
  5. Check whether or not the job passes in the pipeline.
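Summarized as terminal commands, the routine looks roughly like this (the version number and commit message are examples):

Shell

# 1) bump <version> in pom.xml and the tag variable in azure-pipelines.yml, then:
mvn clean package        # rebuilds target/testing-helloworld-<version>.jar

# 2) push the changes to master, which triggers the pipeline
git add .
git commit -m "release 1.1"
git push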

If everything went as it should, you have now uploaded an image with your .jar file to the associated Docker Hub repository. Running this image only requires the host to have Docker installed. Let us try it!

Running the image from the Docker Hub repository: the input (a) initiates a container from the requested repository; the image is then retrieved, instantiated, and processed, with the final result (b) displaying "Hello World!"
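On any host with Docker installed, the same test can be reproduced from the command line; the repository name and tag are the ones defined in the pipeline variables above:

Shell

# pull and run the published image (a); the container prints the greeting (b)
docker run saban17/testing-helloworld:1.0.0
# -> Hello World!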

This concludes the guide for setting up your Java Pipeline with Azure DevOps and Docker. 


Conclusion

By now, it should hopefully be clear why this approach has its benefits. It enables the developer to define a run-time environment (the Dockerfile) and ship it to operations with little to no effort (git push). While it has not been covered here, this approach also creates artifacts in Azure DevOps, which is very useful when using something like Maven, as it makes dependencies surprisingly easy to manage.

Since this approach only recently made it into our team, it is still under development and a lot of additions are still to be made. I highly encourage you to further expand upon your pipeline by making it fit your exact needs. 

I hope this guide has proven to be useful as well as practical, and should you have any further questions feel free to comment below.

Thank you for reading. 

Further Reading

Introduction To Azure DevOps

Setup Azure CI/CD Pipelines Using Visual Studio

