
How to Build True Pipelines With Jenkins and Maven


The essence of creating a pipeline is breaking up a single build process into smaller steps, each having its own responsibility. In this way, faster and more specific feedback can be returned. Let's define a true pipeline as a pipeline that is strictly associated with a single revision within a version control system. This makes sense, because ideally we want the build server to provide full and accurate feedback for every single revision.

As new revisions can be committed at any time, it is natural that multiple pipelines actually get executed next to each other. If needed, it is even possible to allow concurrent executions of the same build step for different pipelines. However, some measures need to be taken in order to guarantee that all steps executed within one pipeline are actually based on the same revision.

Within a Jenkins build server instance, it is possible to create a true pipeline for a Maven-based project quite easily and efficiently. But, in order to establish one, some extra Jenkins plugins need to be provisioned and configured properly, as we will see below.

Now, let's say that we have a continuous build for a multi-module, top-level Maven project that we want to break up into the following steps, each step being executed by a separate Jenkins job.

  1. create — check out the head revision, compile and unit-test the code, and build and archive the artifacts
  2. integration test — run integration tests for these artifacts
  3. live deploy — deploy artifacts to a live server
  4. smoke test — run smoke tests for these deployed artifacts

For efficiency, it is recommended to avoid doing the same work multiple times within the same pipeline, such as performing a full checkout, compiling code, testing code, building artifacts, and archiving artifacts.

The different steps in a pipeline can typically be executed by activating different Maven profiles, which reuse artifacts that have been created and archived in an upstream build within the same pipeline. The built-in automatic artifact archiving feature is enabled by default in Jenkins. This feature can usually be disabled for downstream jobs, as these jobs typically do not produce any artifacts that need to be reused.
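As a sketch, the downstream steps might map onto profiles in the top-level pom.xml along these lines; the profile ids and comments are illustrative assumptions, not taken from the article:

```xml
<!-- Hypothetical profiles in the top-level pom.xml, one per pipeline step.
     Each profile would bind the appropriate plugins to reuse the artifacts
     archived by the first job instead of rebuilding them. -->
<profiles>
  <profile>
    <id>integration-test</id>
    <!-- run integration tests against the archived artifacts -->
  </profile>
  <profile>
    <id>live-deploy</id>
    <!-- deploy the archived artifacts to the live server -->
  </profile>
  <profile>
    <id>smoke-test</id>
    <!-- run smoke tests against the deployed artifacts -->
  </profile>
</profiles>
```

A downstream job would then invoke something like mvn verify -P integration-test, resolving the artifacts produced upstream rather than compiling them again.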

The Maven 2 Project Plugin sets the local Maven repository by default to ~/.m2/repository. Especially when implementing pipelines, it is necessary to change this setting to Local to the executor in order to prevent interference between concurrent pipeline builds. Although this behavior can be overruled in every job-specific configuration, if Jenkins nodes are running with multiple executors, it is recommended to change the generic local Maven repository setting anyway, as local Maven repositories are not safe for concurrent access by different executors.

With the executors each having their own private local Maven repository, it is no longer necessary to have a Maven build install the generated artifacts into the local repository. This is because there is no guarantee that consecutive steps of the same pipeline are executed by the same executor. Furthermore, as we will see below, the artifacts that are needed in downstream builds will be downloaded into the local repository of the assigned executor anyway.

As every pipeline creates unique artifact versions, the size of the executor-local Maven repositories can grow very quickly. Because every pipeline build only needs one specific version of the generated artifacts, there is no point in keeping the older versions.

So, it is a good idea to clean up the local Maven repositories on all nodes regularly, at least for the artifacts that are generated by the pipelines. This can be done by creating a clean-up job for each node executing a simple shell script.

For the master node, the following example script will clean up the local repositories for all executors:

rm -rv ${JENKINS_HOME}/maven-repositories/*

For the slaves, the variable JENKINS_HOME in this expression needs to be replaced by the applicable Jenkins home directory.

The NodeLabel Parameter Plugin can be used to assign the clean-up jobs to the specific nodes and the Heavy Job Plugin can be used to allocate all available executors on that node to ensure exclusive access to all the local repositories.

Now, let's create the actual pipeline by using the Parameterized Trigger Plugin. This enables us to pass predefined parameters to downstream jobs. In this way, we can propagate a single pipeline revision number from the first job throughout the pipeline. Let's say Subversion is used as our version control system. In that case, we can predefine the pipeline revision number parameter PL_SVN_REVISION as being the built-in variable SVN_REVISION:


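With the Parameterized Trigger Plugin, this amounts to a predefined parameter along these lines in the post-build trigger configuration of the first job:

```
PL_SVN_REVISION=${SVN_REVISION}
```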
From all downstream jobs, the pipeline parameter can simply be propagated by enabling the Current build parameters option.

The PL_SVN_REVISION parameter can be used in downstream jobs to check out revision-specific code by adding @${PL_SVN_REVISION} to the Repository URL.
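For instance, with a hypothetical Subversion URL, the Repository URL field of a downstream job would look like:

```
https://svn.example.com/repos/myproject/trunk@${PL_SVN_REVISION}
```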

This could be code that is needed to bootstrap an integration test, smoke test, or to perform a live deployment. This code is preferably just a single POM file or a small module. It does not even need to contain actual test scripts or fixtures. These can simply be packed as an artifact in the first job and, just like other artifacts, be reused in downstream jobs.

The Jenkins Maven Repository Server Plugin is a great tool. It exposes a single job build as a Maven repository containing all the artifacts that are archived as part of that build. This makes it very easy and efficient to reuse these specific artifacts in downstream jobs via the usual Maven dependency mechanism. There is no longer a need to let Maven deploy artifacts to a separate Maven repository manager, like Nexus, as Jenkins has become fully self-sufficient. Furthermore, only Jenkins itself is capable of providing the specific artifacts that belong to a specific pipeline.

The Jenkins Maven Repository Server Plugin lets us define an upstream job build as a Maven repository. Although only the last successful build is supported out of the box, with a little trick, it is still possible to select a different upstream job build, more specifically, the build that created the artifacts that need to be reused within the pipeline.

For that purpose, let us predefine another pipeline parameter PL_CREATE_BUILD as being a combination of the built-in variable JOB_NAME and the built-in variable BUILD_NUMBER:


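In the predefined parameters field, this boils down to something like the following; the slash separator is chosen so that the value can be used directly as a path prefix:

```
PL_CREATE_BUILD=${JOB_NAME}/${BUILD_NUMBER}
```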
This pipeline parameter can be propagated from the first job throughout the pipeline together with the PL_SVN_REVISION parameter.

Now, the path ${PL_CREATE_BUILD}/repository actually denotes the correct upstream Maven repository. This expression can be filled in as the specified path to the repository when defining an upstream Maven repository. But, because it is an expression, the Jenkins parse POMs phase will fail.

The solution is to use the Environment Injector Plugin, which makes it possible to inject any environment variable into the build process. The Jenkins Maven Repository Server Plugin defines an environment variable Jenkins.Repository under the hood when defining an upstream Maven repository. Let's instead inject this environment variable explicitly as property content with the Environment Injector Plugin:

Jenkins.Repository = ${JENKINS_URL}plugin/repository/project/${PL_CREATE_BUILD}/repository

As documented by the Jenkins Maven Repository Server Plugin, this environment variable can be used to specify ${env.Jenkins.Repository} as a Maven repository URL in a jenkins profile in the Maven settings.xml file.

To ensure that the executor specific local Maven repository is updated with the snapshot versions of the artifacts created in the first job of the pipeline, it is necessary to set the snapshots updatePolicy to always. After all, the different steps of a pipeline are not tied to a specific node or executor, which is the main reason to use the Maven Repository Server Plugin in the first place.
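Putting the two previous points together, the jenkins profile in settings.xml might look roughly like this; the repository id is an arbitrary choice, and the profile would be activated for pipeline builds:

```xml
<!-- Sketch of a settings.xml profile resolving artifacts from the
     upstream build exposed by the Maven Repository Server Plugin. -->
<profile>
  <id>jenkins</id>
  <repositories>
    <repository>
      <id>jenkins-pipeline</id>
      <!-- injected by the Environment Injector Plugin -->
      <url>${env.Jenkins.Repository}</url>
      <snapshots>
        <enabled>true</enabled>
        <!-- always re-check snapshots, so the executor-local repository
             picks up the artifacts of the current pipeline -->
        <updatePolicy>always</updatePolicy>
      </snapshots>
    </repository>
  </repositories>
</profile>
```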

Note that it can occur that a build executed by an executor is actually based on an older pipeline than the preceding build executed on the same executor. In that case, the snapshot artifacts in the local Maven repository actually need to be replaced by older versions, meaning artifacts with older timestamps. Fortunately, this appears to be the default behavior of Maven. So, no pre-build step is required to clean up the local repository in order to guarantee a true pipeline.

Now, sometimes build steps from different pipelines cannot be executed concurrently, because exclusive access to the same resources is required. In case it concerns only a single job, it is sufficient to make sure that the Jenkins built-in feature Execute concurrent builds if necessary is disabled for this job, which is the default behavior anyway. In cases where different jobs are involved, they need to be throttled explicitly. For instance, deploying artifacts to a live server and running a smoke test for artifacts already deployed on the same server can obviously not be executed concurrently. The Throttle Concurrent Builds Plugin can be used to define throttle categories and restrict concurrent execution of jobs by assigning them to the same throttle category.

In addition, sometimes build steps of different pipelines cannot be executed in a random order. For instance, deploying the artifacts of one pipeline to a live server must not be followed by a deployment from another pipeline before a smoke test has run for the already deployed artifacts. This can be guaranteed by assigning a higher priority to the smoke test job using the Priority Sorter Plugin.

Sometimes, a step in the pipeline might fail because of some technical error that is not related to the associated revision. In order to trigger a rebuild of the failed downstream job, the pipeline parameters PL_SVN_REVISION and PL_CREATE_BUILD need to be specified manually, which is a bit awkward. Here, the Rebuilder Plugin comes in handy, which facilitates rebuilding a job with the same parameters as the failed build.

Alternatively, the Build Pipeline Plugin, which provides a nice visualization of the most recent pipelines, can also be used to manually retrigger a failed build. Unfortunately, the current version does not visualize the revision numbers of the pipelines.

In order to visualize the actual revision numbers within Jenkins, the Build Name Setter Plugin can be used instead. This makes it easier to identify builds by revision number instead of by build number.

For the first job, one can set the build name as:


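For example, a build name expression of the following form shows the revision next to the build number; the exact format is a matter of taste:

```
#${BUILD_NUMBER} - rev ${SVN_REVISION}
```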
And, for downstream jobs, the variable SVN_REVISION in this expression needs to be replaced by the pipeline parameter PL_SVN_REVISION, as these can have different values.



Published at DZone with permission of

