Jenkins Declarative Pipeline and Awesome GitHub Integration

Jenkins has introduced a declarative syntax for pipeline creation! Let's take a quick look at using it to integrate Jenkins with GitHub to help build your CI/CD process.

By Baptiste Wicht · Jun. 07, 17 · Tutorial


This post is about some Jenkins news and how I've updated my Jenkins usage. This may be a bit of an enthusiastic post.

In the early days of Jenkins, the usual way to define the commands for your builds was simply to type them into the Jenkins interface. This worked quite well. Later on, Jenkins introduced the notion of a pipeline: instead of a single set of commands, the build is defined as a multi-stage pipeline of commands, written as a Groovy script. One big advantage of this is that all the code for creating the build lives inside the repository, which makes each build reproducible. It also lets you define complex pipelines and gives you a clean view of which stages are failing and how much time each stage takes. For instance, here's a view of the pipeline stages for my DLL project:

[Image: Jenkins stage view for the DLL pipeline]

I think that's pretty cool!
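
For comparison, a classic scripted pipeline is plain Groovy. Here is a minimal sketch (not taken from the DLL project, just to show the shape of the older style):

node {
    // Allocate an executor and run the stages in sequence
    stage('checkout') {
        checkout scm                  // check out the repository that triggered the build
    }
    stage('build') {
        sh 'make -j6 release'         // compile in release mode
    }
    stage('test') {
        sh './scripts/test_runner.sh' // run the test suite
    }
}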

They recently added a new feature: declarative pipelines. Instead of scripting the pipeline in Groovy, the new system uses its own, completely declarative syntax to put blocks together, run actions at specific points, set up environments, and so on. I think the new syntax is much nicer than the Groovy-scripted way, so I started converting my scripts. I'll give an example in a few paragraphs.

First, I'd like to talk about GitHub integration. Before, every time I created a new project, I had to add it to Jenkins by hand: create a new job, set the link to the GitHub repository, and so on. That's not so bad, but what if you want to build several branches, keep track of their status, and maybe handle pull requests as well? All of this is now very simple. You can declare the GitHub organizations (and users) you build projects from, and the repositories inside the organization are detected automatically as long as they contain a Jenkinsfile. That means you never have to create a project yourself or manage branches: every detected project handles multiple branches out of the box. For instance, here is the status of the two current branches of my DLL project:

[Image: Jenkins branches view for the DLL GitHub project]

It's maybe not the best example, since one branch is failing and the other is unstable, but you can see that the builds for each branch are tracked in a nice way.
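
As mentioned, a repository only needs a Jenkinsfile at its root to be picked up by the organization scan. A minimal declarative one might look like this (a hypothetical sketch for a simple make-based project):

pipeline {
    agent any              // run on any available agent
    stages {
        stage('build') {
            steps {
                sh 'make'  // placeholder build command
            }
        }
    }
}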

A very good feature of this integration is that Jenkins now automatically marks the commits on GitHub with the status of your builds, at no extra cost! For instance, here is the status of my ETL project after I configured it on Jenkins and ran the first builds:

[Image: Jenkins build statuses marked on commits in GitHub]

Another nice thing in Jenkins is the Blue Ocean interface. This is an alternative interface, especially well-suited for multi-branch projects and pipelines. It looks much more modern and I think it's quite good. Here are a few views of it:

Here is the Activity view for the last events of the project:

[Image: Jenkins Blue Ocean Activity view for the DLL project]

And the Branches view for the status of each branch:

[Image: Jenkins Blue Ocean Branches view for the DLL project]

The view of the status of a build:

[Image: Jenkins Blue Ocean view of a build for the DLL project]

The status of the tests for a given build:

[Image: Jenkins Blue Ocean view of a build's tests for the DLL project]

It's likely that this won't appeal to everyone, but I think it's pretty nice.

If we get back to the declarative pipeline, here is the declarative pipeline for my Expression Templates Library (ETL) project:

pipeline {
    agent any

    // Compiler configuration and MKL support for the build
    environment {
        CXX = "g++-4.9.4"
        LD = "g++-4.9.4"
        ETL_MKL = 'true'
    }

    stages {
        // Check out the repository, including its Git submodules
        stage ('git'){
            steps {
                checkout([
                    $class: 'GitSCM',
                    branches: scm.branches,
                    doGenerateSubmoduleConfigurations: false,
                    extensions: scm.extensions + [[$class: 'SubmoduleOption', disableSubmodules: false, recursiveSubmodules: true, reference: '', trackingSubmodules: false]],
                    submoduleCfg: [],
                    userRemoteConfigs: scm.userRemoteConfigs])
            }
        }

        // Basic static analysis and code metrics
        stage ('pre-analysis') {
            steps {
                sh 'cppcheck --xml-version=2 -j3 --enable=all --std=c++11 `git ls-files "*.hpp" "*.cpp"` 2> cppcheck_report.xml'
                sh 'sloccount --duplicates --wide --details include/etl test workbench > sloccount.sc'
                sh 'cccc include/etl/*.hpp test/*.cpp workbench/*.cpp || true'
            }
        }

        stage ('build'){
            steps {
                sh 'make clean'
                sh 'make -j6 release'
            }
        }

        // Run the test suite and publish the JUnit-format results
        stage ('test'){
            steps {
                sh 'ETL_THREADS=-j6 ETL_GPP=g++-4.9.4 LD_LIBRARY_PATH=\"${LD_LIBRARY_PATH}:/opt/intel/mkl/lib/intel64:/opt/intel/lib/intel64\" ./scripts/test_runner.sh'
                archive 'catch_report.xml'
                junit 'catch_report.xml'
            }
        }

        // Sonar analysis for the master branch
        stage ('sonar-master'){
            when {
                branch 'master'
            }
            steps {
                sh "/opt/sonar-runner/bin/sonar-runner"
            }
        }

        // Sonar analysis for every other branch, passing the branch name
        stage ('sonar-branch'){
            when {
                not {
                    branch 'master'
                }
            }
            steps {
                sh "/opt/sonar-runner/bin/sonar-runner -Dsonar.branch=${env.BRANCH_NAME}"
            }
        }

        // Trigger the benchmark job without waiting for it to finish
        stage ('bench'){
            steps {
                build job: 'etl - benchmark', wait: false
            }
        }
    }

    post {
        // Send email notifications, including for unstable builds
        always {
            step([$class: 'Mailer',
                notifyEveryUnstableBuild: true,
                recipients: "baptiste.wicht@gmail.com",
                sendToIndividuals: true])
        }
    }
}


There is nothing really fancy about it. Since I'm not an expert on pipelines and have only just discovered the declarative syntax, it may not be optimal, but it works. As you'll see, there are a couple of problems I haven't been able to fix.

The first part declares the environment variables for the build. Then, the build stages are listed. The first stage checks out the code from the SCM. The ugly piece of code there is needed to check out the submodules; it is the only solution I have found so far. It's very ugly, but it works. The second stage simply runs some basic static analysis. The next stage is the classical build. Then, the tests are run. In this case, I'm using a script because the tests are compiled with several different sets of options, and it was much easier to put that in a script than in the pipeline. Moreover, it also means I can run the tests standalone.

The environment variables on the line that runs the script are another problem I haven't been able to fix so far. If I declare them in an environment block, they are not passed to the script for some reason, so I had to resort to this ugly one-liner.
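
A workaround that might be worth trying, though I haven't verified it in this exact setup, is wrapping the step in a withEnv block, which sets the variables explicitly for everything inside it (the values below are the ones from the pipeline above):

// Hypothetical alternative to the inline variable assignments
withEnv(['ETL_THREADS=-j6',
         'ETL_GPP=g++-4.9.4',
         "LD_LIBRARY_PATH=${env.LD_LIBRARY_PATH}:/opt/intel/mkl/lib/intel64:/opt/intel/lib/intel64"]) {
    sh './scripts/test_runner.sh'  // the variables are visible to the script here
}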

The next two blocks are for Sonar analysis. If you are starting fresh with Sonar, you can simply use the second block, which passes the branch information to Sonar. Unfortunately, Sonar is very limited when it comes to Git branches: each branch is considered a totally different project. That means the false positives marked in the master branch are not carried over to the other branches. Therefore, I kept a clean master project and several different projects for the other branches. Once Sonar improves its branch handling, if it ever does, I'll be able to get rid of one of these conditional stages.

The last stage is simple: it triggers the benchmark job. Finally, the post block uses the Mailer plugin to send information about failed builds. Again, there is a problem here, since it no longer sends the "back to normal" notifications it used to. I've asked a question about this on Stack Overflow, but haven't received an answer so far. I'll post a better solution once I have one. If any of you have solutions to these problems, don't hesitate to post in the comments below or to contact me on GitHub.
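
One direction that might help, though I haven't confirmed it restores the "back to normal" mails, is the changed condition of the post block, which fires whenever the build result differs from the previous run. A sketch, assuming the Email Extension plugin is installed:

post {
    changed {
        // Runs on any result change, which should include
        // the failed -> success ("back to normal") transition
        emailext(
            subject: "${currentBuild.fullDisplayName}: ${currentBuild.currentResult}",
            body: "The build result changed. See ${env.BUILD_URL}",
            to: 'baptiste.wicht@gmail.com'
        )
    }
}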

I really think Jenkins is getting even better with all this cool stuff, and I advise you to try it out!


Published at DZone with permission of Baptiste Wicht, DZone MVB.
