
Building Jenkins Infrastructure Pipelines

Learn how to create and configure Jenkins infrastructure pipelines to automate building, testing, and deploying applications.

By Fedir Kompaniiets · Apr. 22, 24 · Tutorial

Jenkins lets you automate everything from building and testing code to deploying to production. It works on the principle of pipelines, which can be customized to fit the needs of any project.

After installing Jenkins, launch it and navigate to the web interface, usually available at http://localhost:8080. On the first launch, Jenkins asks for an initial administrator password, which is printed in the startup log and stored in a file on the server. After entering the password, you are redirected to the plugin setup page.
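On a typical package-based Linux installation, that password file can be read directly from disk (the paths below assume default install locations and may differ on your system):

```shell
# Default location on a package-based Linux install
sudo cat /var/lib/jenkins/secrets/initialAdminPassword

# For the official Docker image, the file lives in the Jenkins home volume:
# docker exec <container> cat /var/jenkins_home/secrets/initialAdminPassword
```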

To work with infrastructure pipelines, you will need the following plugins:

  • Pipeline: The main plugin for creating and managing pipelines in Jenkins.
  • Git plugin: Necessary for integration with Git and working with repositories.
  • Docker Pipeline: Allows you to use Docker within Jenkins pipelines.
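If you prefer to script the setup, these plugins can also be installed non-interactively. A sketch using jenkins-plugin-cli, which ships with the official Jenkins Docker image (the plugin IDs below are the standard short names for the three plugins above):

```shell
# Install the Pipeline suite, Git integration, and Docker Pipeline plugins
jenkins-plugin-cli --plugins workflow-aggregator git docker-workflow
```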

Also, in the Jenkins settings there is a section for configuring version control systems, where you need to add a repository. For Git, this means specifying the repository URL and account credentials.

Now you can create an infrastructure pipeline: a series of automated steps that transform your code into production-ready software. The main goal is to make the software delivery process as fast and repeatable as possible.

Creating a Basic Pipeline

A pipeline consists of a series of steps, each of which performs a specific task. Typically, the steps look like this:

  1. Checkout — extracting the source code from the version control system
  2. Build — building the project using build tools, such as Maven
  3. Test — running automated tests to check the code quality
  4. Deploy — deploying the built application to the target server or cloud

Conditions determine the circumstances under which each pipeline step should or should not be executed. Jenkins Pipeline has a "when" directive that allows you to restrict the execution of steps based on specific conditions.
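For example, a deploy stage can be limited to a single branch with a "when" block (the branch name below is an assumption; adjust it to your repository):

```groovy
stage('Deploy') {
    // only run this stage for builds of the main branch
    when {
        branch 'main'
    }
    steps {
        echo 'Deploying from main'
    }
}
```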

Triggers determine what exactly triggers the execution of the pipeline:

  • Push to repository — the pipeline is triggered every time new commits are pushed to the repository.
  • Schedule — the pipeline can be configured to run on a schedule, for example, every night for nightly builds.
  • External events — the pipeline can also be configured to run in response to external events.
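The first two of these map onto the "triggers" directive in a declarative pipeline (the schedules below are illustrative; H spreads the load across an hour):

```groovy
pipeline {
    agent any
    triggers {
        // poll the repository every 5 minutes for new commits
        pollSCM('H/5 * * * *')
        // nightly build at some minute within the 2 AM hour
        cron('H 2 * * *')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Triggered build'
            }
        }
    }
}
```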

To make all this work, you need to create a Jenkinsfile — a file that describes the pipeline. Here's an example of a simple Jenkinsfile:

Groovy
 
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git 'https://your-repository-url.git'
            }
        }
        stage('Build') {
            steps {
                sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Deploy') {
            steps {
                // deployment steps
            }
        }
    }
    post {
        success {
            echo 'The pipeline has completed successfully.'
        }
    }
}


This Jenkinsfile describes a basic pipeline with four stages: checkout, build, test, and deploy.

Parameterized Builds

Parameterized builds let you pass values into a pipeline at run time, so a single pipeline definition can serve multiple configurations.

To start, you need to define the parameters in the Jenkinsfile used to configure the pipeline. This is done using the "parameters" directive, where you can specify various parameter types (string, choice, booleanParam, etc.).

Groovy
 
pipeline {
    agent any
    parameters {
        string(name: 'DEPLOY_ENV', defaultValue: 'staging', description: 'Target environment')
        choice(name: 'VERSION', choices: ['1.0', '1.1', '2.0'], description: 'App version to deploy')
        booleanParam(name: 'RUN_TESTS', defaultValue: true, description: 'Run tests?')
    }
    stages {
        stage('Initialization') {
            steps {
                echo "Deploying version ${params.VERSION} to ${params.DEPLOY_ENV}"
                script {
                    if (params.RUN_TESTS) {
                        echo "Tests will be run"
                    } else {
                        echo "Skipping tests"
                    }
                }
            }
        }
        // other stages
    }
}


When the pipeline is executed, the system will prompt the user to fill in the parameters according to their definitions.

You can use parameters to conditionally execute certain pipeline stages. For example, only run the testing stages if the RUN_TESTS parameter is set to true.

The DEPLOY_ENV parameter can be used to dynamically select the target environment for deployment, allowing you to use the same pipeline to deploy to different environments, such as staging and production.
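A sketch of both ideas, using the parameters defined above (the stage contents are illustrative):

```groovy
stage('Test') {
    // runs only when the RUN_TESTS checkbox was selected
    when {
        expression { params.RUN_TESTS }
    }
    steps {
        sh 'mvn test'
    }
}
stage('Deploy') {
    steps {
        // the target environment comes straight from the build parameter
        echo "Deploying to ${params.DEPLOY_ENV}"
    }
}
```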

Dynamic Environment Creation

Dynamic environment creation allows you to automate the process of provisioning and removing temporary test or staging environments for each new build, branch, or pull request. In Jenkins, this can be achieved using pipelines, Groovy scripts, and integration with tools like Docker, Kubernetes, Terraform, etc.

Let's say you want to create a temporary test environment for each branch in a Git repository, using Docker. In the Jenkinsfile, you can define stages for building a Docker image, running a container for testing, and removing the container after the tests are complete:

Groovy
 
pipeline {
    agent any
    stages {
        stage('Build Docker Image') {
            steps {
                script {
                    // For example, the Dockerfile is located at the root of the project 
                    sh 'docker build -t my-app:${GIT_COMMIT} .'
                }
            }
        }
        stage('Deploy to Test Environment') {
            steps {
                script {
                    // run the container from the built image; use a host port
                    // other than 8080 so it does not clash with Jenkins itself
                    sh 'docker run -d --name test-my-app-${GIT_COMMIT} -p 8081:80 my-app:${GIT_COMMIT}'
                }
            }
        }
        stage('Run Tests') {
            steps {
                script {
                    // steps to run tests
                    echo 'Running tests against the test environment'
                }
            }
        }
        stage('Cleanup') {
            steps {
                script {
                    // stop and remove the container after testing
                    sh 'docker stop test-my-app-${GIT_COMMIT}'
                    sh 'docker rm test-my-app-${GIT_COMMIT}'
                }
            }
        }
    }
}


If Kubernetes is used to manage the containers, you can dynamically create and delete namespaces to isolate the test environments. In this case, the Jenkinsfile might look like this:

Groovy
 
pipeline {
    agent any
    environment {
        KUBE_NAMESPACE = "test-${GIT_COMMIT}"
    }
    stages {
        stage('Create Namespace') {
            steps {
                script {
                    // create a new namespace in Kubernetes
                    sh "kubectl create namespace ${KUBE_NAMESPACE}"
                }
            }
        }
        stage('Deploy to Kubernetes') {
            steps {
                script {
                    // deploy the application to the created namespace
                    sh "kubectl apply -f k8s/deployment.yaml -n ${KUBE_NAMESPACE}"
                    sh "kubectl apply -f k8s/service.yaml -n ${KUBE_NAMESPACE}"
                }
            }
        }
        stage('Run Tests') {
            steps {
                script {
                    // test the application
                    echo 'Running tests against the Kubernetes environment'
                }
            }
        }
        stage('Cleanup') {
            steps {
                script {
                    // delete the namespace and all associated resources
                    sh "kubectl delete namespace ${KUBE_NAMESPACE}"
                }
            }
        }
    }
}


Integrating Prometheus

Prometheus metrics can be enabled in Jenkins by installing the Prometheus Metrics plugin through "Manage Jenkins" -> "Manage Plugins."

After installation, go to the Jenkins settings and, in the Prometheus Metrics section, enable the exposure of metrics.

The plugin will be accessible by default at the URL http://<JENKINS_URL>/prometheus/, where <JENKINS_URL> is the address of the Jenkins server.
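A quick way to verify the endpoint is exposed (replace the host and port with your Jenkins address):

```shell
# Should return plain-text metrics in the Prometheus exposition format
curl -s http://localhost:8080/prometheus/ | head
```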

In the Prometheus configuration file prometheus.yml, we add a new job to collect metrics from Jenkins:

YAML
 
scrape_configs:
  - job_name: 'jenkins'
    metrics_path: '/prometheus/'
    static_configs:
      - targets: ['<JENKINS_IP>:<PORT>']


Then, through Grafana, we can point to the Prometheus source and visualize the data.

The Prometheus integration allows you to monitor various Jenkins metrics, such as the number of builds, job durations, and resource utilization. This can be particularly useful for identifying performance bottlenecks, tracking trends, and optimizing your Jenkins infrastructure.

By leveraging the power of Prometheus and Grafana, you can gain valuable insights into your Jenkins environment and make data-driven decisions to improve your continuous integration and deployment processes.

Conclusion

Jenkins is a powerful automation tool that can help streamline your software delivery process. By leveraging infrastructure pipelines, you can easily define and manage the steps required to transform your code into production-ready software.


Opinions expressed by DZone contributors are their own.
