
Continuous Integration in AWS Using Jenkins Pipelines: Best Practices and Strategies

Learn about implementing CI using Jenkins, a popular automation tool, and how this approach can optimize and streamline your software development process.

By lidor ettinger · May. 03, 23 · Tutorial

Developing and releasing new software versions is an ongoing process that demands careful attention to detail. The ability to monitor and analyze the entire process is critical for identifying any potential issues and implementing effective corrective measures.

This is where continuous integration comes in.

By adopting a continuous integration approach, software development teams can carefully monitor each stage of the development process and conduct an in-depth analysis of the outcomes. This facilitates the early detection and diagnosis of potential issues, enabling developers to make necessary adjustments and improve the overall development process. In other words, continuous integration provides a systematic way of identifying problems and continuously enhancing software quality, ultimately leading to a better end product.

The focus of this post is on exploring the benefits of continuous integration in software development. Specifically, we will delve into the practical aspects of implementing continuous integration using Jenkins, a popular automation tool, and share valuable insights on how this approach can help optimize and streamline your software development process. By the end of this post, you will have a better understanding of how continuous integration can improve your workflow and help you build better software more efficiently.

From the Development Team’s Perspective, What Initiates Continuous Integration?

With continuous integration, the development team initiates the process by pushing code changes to the repository, which triggers an automated pipeline to build, test, and deploy the updated software version. This streamlines the development cycle, leading to faster feedback and higher-quality software.

A structured workflow aims to establish a standardized order of operations for developers, ensuring that subsequent versions of the software are built according to the software development life cycle defined by management. Here are some primary benefits of continuous integration:

  1. Version control - With continuous integration, developers can easily track production versions and compare the performance of different versions during development. In addition, the ability to roll back to a previous version is also available, should any production issues arise.
  2. Quality assurance - Developers can test their versions in a staging environment, which shows how the new version behaves under conditions similar to production. Instead of running the version on a local machine that may not resemble the real environment, developers can define a set of tests (unit tests, integration tests, and others) that take the new version through a predefined workflow. Passing this test suite serves as a sign-off that the new version is safe to deploy to production.
  3. Scheduled triggering - Developers no longer need to manually trigger a pipeline or define a new one for each project. As a DevOps team, it is our responsibility to build a robust system that attaches a pipeline to each project, whether a shared pipeline with slight per-project adjustments or the same pipeline everywhere. Developers can focus on writing code while continuous integration takes care of the rest, and scheduling an automatic trigger (for example, every morning or evening) ensures that the current code in GitHub is always ready for release.

Jenkins in the Era of Continuous Integration

To establish the desired pipeline workflow, we will deploy Jenkins and design a comprehensive pipeline that emphasizes version control, automated testing, and triggers.

Prerequisite

  • A virtual machine with a Docker engine

Containerizing Jenkins

To simplify the deployment of our CI/CD pipelines, we will deploy Jenkins in a Docker container.

Deployment of Jenkins:

docker run -d \
        --name jenkins -p 8080:8080 -u root -p 50000:50000 \
        -v /var/run/docker.sock:/var/run/docker.sock \
        naturalett/jenkins:2.387-jdk11-hello-world


Validate the Jenkins container:

docker ps | grep -i jenkins


Retrieve the Jenkins initial password:

docker exec jenkins bash -c -- 'cat /var/jenkins_home/secrets/initialAdminPassword'


Connect to Jenkins at http://localhost:8080/.

Building a Continuous Integration Pipeline

I chose to utilize Groovy in Jenkins pipelines due to its numerous benefits:

  1. Groovy is a scripting language that is straightforward to learn and utilize.
  2. Groovy offers features that enable developers to write code that is concise, readable, and maintainable.
  3. Groovy's syntax is similar to Java, making it easier for Java developers to adopt.
  4. Groovy has excellent built-in support for data formats commonly used in software development, such as JSON and XML.
  5. Groovy provides an efficient and effective way to build robust and flexible CI/CD pipelines in Jenkins.

The Four Phases of Our Pipeline

Phase 1: The Agent

To ensure that our code is built with no incompatible dependencies, each pipeline requires a virtual environment. In the following phase, we create an agent (virtual environment) in a Docker container. As Jenkins is also running in a Docker container, we'll mount the Docker socket to enable agent execution.

pipeline {
    agent {
        docker {
            image 'docker:19.03.12'
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
...
...
...
}


Phase 2: The History of Versions

We recognize the importance of versioning in software development, which allows developers to monitor code changes and evaluate software performance to make informed decisions about rolling back to a previous version or releasing a new one. In the subsequent phase, we generate a Docker image from our code and assign it a tag based on our predetermined set of definitions.

For example: Date — Jenkins Build Number — Commit Hash

pipeline {
    agent {
...
    }
    stages {
        stage('Build') {
            steps {
                script {
                  def currentDate = new java.text.SimpleDateFormat("MM-dd-yyyy").format(new Date())
                  def shortCommit = sh(returnStdout: true, script: "git log -n 1 --pretty=format:'%h'").trim()
                  customImage = docker.build("naturalett/hello-world:${currentDate}-${env.BUILD_ID}-${shortCommit}")
                }
            }
        }
    }
}


Upon completion of the previous phase, a Docker image of our code has been successfully created and is now available for use in our local environment.

docker images | grep -i hello-world
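The tag scheme used in the Build stage can be sketched outside Jenkins. This is a minimal illustration only: build_id and short_commit are placeholders here for Jenkins's env.BUILD_ID and the output of git log -n 1 --pretty=format:'%h'.

```python
from datetime import datetime

# Mirror the tag scheme from the Build stage: Date - Build Number - Commit Hash
current_date = datetime.now().strftime("%m-%d-%Y")
build_id = "42"           # placeholder for env.BUILD_ID
short_commit = "abc1234"  # placeholder for the git short commit hash

tag = f"naturalett/hello-world:{current_date}-{build_id}-{short_commit}"
print(tag)
```

A tag built this way lets you map any image in the registry back to the exact build and commit that produced it.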


Phase 3: The Test

Testing is a critical step in ensuring that a new release meets all of its functional requirements. In the following stage, we run tests against the Docker image generated in the previous stage, which contains the potential next release.

pipeline {
    agent {
...
    }
    stages {
        stage('Test') {
            steps {
                script {
                    customImage.inside {
                        sh """#!/bin/bash
                        cd /app
                        pytest test_*.py -v --junitxml='test-results.xml'"""
                    }
                }
            }
        }
    }
}
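For context, the stage above collects any test_*.py files with pytest. A minimal, hypothetical example of such a file (the real tests live in the naturalett/hello-world repository):

```python
# test_app.py (hypothetical): a file that `pytest test_*.py` would collect
def add(a, b):
    return a + b

def test_add():
    # pytest automatically discovers functions prefixed with test_
    assert add(2, 3) == 5
```

The --junitxml flag writes the results in a format Jenkins can archive and render as a test report.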


Phase 4: The Scheduling Trigger

Automating the pipeline trigger is crucial in allowing developers to concentrate on writing code while ensuring the stability and readiness of the next release. We accomplish this by setting up a morning schedule that automatically triggers the pipeline as the development team begins their workday.

pipeline {
    agent {
...
    }
    triggers {
        // https://crontab.guru
        cron '00 7 * * *'
    }
    stages {
...
    }
}
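The cron expression follows the standard five-field layout: minute, hour, day of month, month, day of week. A quick way to read '00 7 * * *', sketched in Python:

```python
# Split the Jenkins cron spec into its five standard fields
minute, hour, day_of_month, month, day_of_week = "00 7 * * *".split()
print(f"fires at {hour}:{minute} every day")  # the three '*' fields mean "any"
```

Jenkins also accepts H in place of a fixed value (for example, 'H 7 * * *') to spread scheduled jobs across the hour and avoid load spikes.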


An End-to-End Pipeline of the Process

The pipeline execution process has been made simple by incorporating a pre-defined pipeline into Jenkins. You can get started by initiating the "my-first-pipeline" Jenkins job.

  1. The agent directive creates the virtual environment used for the pipeline.
  2. The triggers directive schedules automatic runs of the pipeline.
  3. The Clone stage is responsible for cloning the project repository.
  4. The Build stage involves creating a Docker image for the project. (To access the latest commit and other Git features, we install the Git package.)
  5. The Test stage involves performing tests on our Docker image.
pipeline {
    agent {
        docker {
            image 'docker:19.03.12'
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    triggers {
        // https://crontab.guru
        cron '00 7 * * *'
    }
    stages {
        stage('Clone') {
            steps {
                git branch: 'main', url: 'https://github.com/naturalett/hello-world.git'
            }
        }
        stage('Build') {
            steps {
                script {
                  sh 'apk add git'
                  def currentDate = new java.text.SimpleDateFormat("MM-dd-yyyy").format(new Date())
                  def shortCommit = sh(returnStdout: true, script: "git log -n 1 --pretty=format:'%h'").trim()
                  customImage = docker.build("naturalett/hello-world:${currentDate}-${env.BUILD_ID}-${shortCommit}")
                }
            }
        }
        stage('Test') {
            steps {
                script {
                    customImage.inside {
                        sh """#!/bin/bash
                        cd /app
                        pytest test_*.py -v --junitxml='test-results.xml'"""
                    }
                }
            }
        }
    }
}


Summary

We have gained a deeper understanding of how Continuous Integration (CI) fits into our daily work and have obtained practical experience with essential pipeline workflows.


Published at DZone with permission of lidor ettinger. See the original article here.

Opinions expressed by DZone contributors are their own.
