Putting Jenkins Build Logs Into Dockerized ELK Stack

In this tutorial, you will learn how to dockerize Filebeat, Elasticsearch, Logstash, and Kibana and utilize them to manage Jenkins logs.

By Kayan Azimov · Oct. 19, 17 · Tutorial · 28.69K Views


Today we are going to look at managing Jenkins build logs in a dockerized environment.

Normally, in order to view the build logs in Jenkins, all you have to do is go to a particular job and check its logs. Depending on the log rotation configuration, the logs could be kept for N builds, days, etc., meaning older jobs' logs will be lost.

Our aim in this article is to persist the logs in a centralized fashion, just like any other application logs, so that they can be searched, viewed, and monitored from a single location.

We will also be running Jenkins in Docker, meaning that if the container is removed and no other measures are in place, such as mounting a volume for the logs from the host and backing it up, the logs will be lost.

As you may have already heard, one of the best solutions when it comes to logging is the ELK stack.

[diagram: the ELK stack pipeline — Filebeat -> Logstash -> Elasticsearch -> Kibana]

The idea with the ELK stack is that you collect logs with Filebeat (or any other *beat), parse and filter them with Logstash, send them to Elasticsearch for persistence, and then view them in Kibana.

On top of that, because Logstash is a heavyweight JRuby app on the JVM, you either skip it or use a much smaller application called Filebeat, a Logstash log forwarder: all it does is collect the logs and send them to Logstash for further processing.

In fact, if you don't have any filtering and parsing requirements, you can skip Logstash altogether and use Filebeat's Elasticsearch output to send the logs directly to Elasticsearch.

In our example, we will use all of them. Also, we won't be running Filebeat in a separate container; instead, we will use a custom Jenkins image with Filebeat preinstalled. If you're interested in how to install Filebeat or any other application into your Jenkins container, you can read about it here.

So, a summary of what we are going to look at today:

  1. Configure and run Logstash in a Docker container.
  2. Configure and run Elasticsearch in a Docker container.
  3. Configure and run Kibana in a Docker container.
  4. Run Jenkins with preinstalled Filebeat to send the logs into ELK.

The command to clone and run the stack will be available at the end of the article.

1. Configure and Run Logstash in a Docker Container

Let’s create a docker-compose file called docker-compose-elk.yml and add containers related to ELK there:

[screenshot: docker-compose-elk.yml defining the logstash service]
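The compose snippet itself was shown as an image; as a rough sketch (the image tag, port, and paths are assumptions based on a typical Logstash 5.x setup, not the author's exact file), the service might look like:

```yaml
logstash:
  image: logstash:5.5.2            # assumed version; match it to your Elastic stack
  command: logstash -f /etc/logstash/conf.d/logstash.conf
  volumes:
    - ./logstash.conf:/etc/logstash/conf.d/logstash.conf
  ports:
    - "5044:5044"                  # Beats input, used by Filebeat later
```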

As you can see, we created a new file and added Logstash to it. It is a fairly old image that I took from a stack I set up a long time ago (I have since updated it, so the source code you download from my reference implementation may use a newer version). If you want to use the latest images, you will need to make sure the version compatibility matrices match:

[screenshots: Elastic product version compatibility matrices]

Now we need to configure Logstash in logstash.conf:

[screenshot: logstash.conf]
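The file itself was shown as an image; a minimal logstash.conf matching the description below (listen for Beats, print everything to stdout; the port is an assumption) would be:

```
input {
  beats {
    port => 5044
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
```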

With this config, all Logstash does is print incoming logs to its output so we can check that it is actually working. Another baby step; let's run the new stack:

[screenshots: starting the stack and the Logstash startup output]

As you can see, Logstash is up and running.

2. Configure and Run Elasticsearch in a Docker Container

The next step is adding Elasticsearch to the stack:

[screenshot: docker-compose-elk.yml with the elasticsearch service added]
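As a sketch of what this step typically involves (versions and settings are assumptions, not the author's exact file), the compose file would gain an elasticsearch service, and the logstash service would reference it:

```yaml
elasticsearch:
  image: elasticsearch:5.5.2       # assumed version; keep it in line with Logstash
  ports:
    - "9200:9200"

logstash:
  # ...as before, plus:
  links:
    - elasticsearch
  depends_on:
    - elasticsearch
```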

We also added a link and a dependency on Elasticsearch to the Logstash service, so that Logstash can resolve it and wait for it to start. Now we must not forget to configure Logstash to send messages to Elasticsearch in addition to the standard output:

[screenshot: logstash.conf with the elasticsearch output added]
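A sketch of the updated output section (the host name matches the compose service name; the port is the Elasticsearch default):

```
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```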

Now you can stop the ELK stack and start it again: just hit Ctrl + C or run:

[screenshots: restarting the stack and verifying Logstash and Elasticsearch are up]

3. Configure and Run Kibana in a Docker Container


Let's add Kibana into the stack now:

[screenshot: docker-compose-elk.yml with the kibana service added]
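A sketch of what the kibana service might look like (the version and settings are assumptions):

```yaml
kibana:
  image: kibana:5.5.2              # assumed version; match it to Elasticsearch
  ports:
    - "5601:5601"
  links:
    - elasticsearch
  depends_on:
    - elasticsearch
```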

Now go to http://localhost:5601. That is where you find Kibana; you should see this screen:

[screenshot: Kibana welcome screen]

Time to send some logs into the stack.

4. Run Jenkins With Preinstalled Filebeat to Send the Logs Into ELK

Create docker-compose.yml with the content as below for Jenkins:

[screenshot: docker-compose.yml for Jenkins]
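The file was shown as an image; a rough sketch (the image name is hypothetical, standing in for the custom Jenkins image with Filebeat preinstalled described above, and the mount point is an assumption) could be:

```yaml
jenkins:
  image: jenkins-with-filebeat         # hypothetical name for the custom image
  ports:
    - "8080:8080"
  volumes:
    - ./jobs:/var/jenkins_home/jobs/   # assumed mount point for the jobs folder
```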

As you can see, it requires the jobs folder to be mounted from the host; this is how we configure the jobs that Jenkins will run.

Let's create the folder structure:

[screenshot: commands creating the jobs folder structure]
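The commands were shown as an image; assuming the job is named maze-explorer (after the repository it builds), the structure can be created like this:

```shell
# Jenkins expects one folder per job under the jobs directory,
# with the job definition in config.xml.
mkdir -p jobs/maze-explorer
touch jobs/maze-explorer/config.xml
```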

Add this config to config.xml:

<?xml version='1.0' encoding='UTF-8'?>
<flow-definition plugin="workflow-job@2.12.2">
  <actions>
    <org.jenkinsci.plugins.pipeline.modeldefinition.actions.DeclarativeJobPropertyTrackerAction plugin="pipeline-model-definition@1.1.9">
      <jobProperties/>
      <triggers/>
      <parameters/>
    </org.jenkinsci.plugins.pipeline.modeldefinition.actions.DeclarativeJobPropertyTrackerAction>
  </actions>
  <description></description>
  <keepDependencies>false</keepDependencies>
  <properties/>
  <definition class="org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition" plugin="workflow-cps@2.40">
    <scm class="hudson.plugins.git.GitSCM" plugin="git@3.5.1">
      <configVersion>2</configVersion>
      <userRemoteConfigs>
        <hudson.plugins.git.UserRemoteConfig>
          <url>https://github.com/kenych/maze-explorer</url>
        </hudson.plugins.git.UserRemoteConfig>
      </userRemoteConfigs>
      <branches>
        <hudson.plugins.git.BranchSpec>
          <name>*/jenkins-elk</name>
        </hudson.plugins.git.BranchSpec>
      </branches>
      <doGenerateSubmoduleConfigurations>false</doGenerateSubmoduleConfigurations>
      <submoduleCfg class="list"/>
      <extensions/>
    </scm>
    <scriptPath>Jenkinsfile</scriptPath>
    <lightweight>true</lightweight>
  </definition>
  <triggers/>
  <disabled>false</disabled>
</flow-definition>

This is needed to run Jenkins with predefined jobs. Ideally, this would be done with jobDsl, but for the sake of this article, I just used a simple job config with a single job.

Let's run the stack now:

[screenshots: starting the Jenkins stack and its startup output]

As you can see, everything is up and running. We can also check the Filebeat configuration inside the container:

[screenshot: the Filebeat configuration inside the Jenkins container]
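The configuration itself was shown as an image; a sketch of a Filebeat 5.x config that watches Jenkins build logs and forwards them to Logstash (the paths and document type are assumptions, not the author's exact file) might be:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/jenkins_home/jobs/*/builds/*/log   # assumed location of build logs
    document_type: jenkins_build_logs

output.logstash:
  hosts: ["logstash:5044"]
```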

As soon as you run the job, Filebeat should start sending logs found by scanning the configured path, so please start the job:

[screenshots: the job running in Jenkins]

Now let's search for something in the logs:

[screenshot: the build log output]

I picked the phrase "cloning"; let's use Elastic's REST API first:

[screenshot: Elasticsearch REST API search results for "cloning"]
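The request itself was shown as an image; a match query along these lines (the index pattern and field are assumptions based on Filebeat defaults) would find the phrase:

```
GET /filebeat-*/_search
{
  "query": {
    "match": { "message": "cloning" }
  }
}
```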

And then Kibana:

[screenshot: Kibana search results for "cloning"]

Finally, if you didn’t follow the instructions but still want to have it all up and running with a magic command, just run this to clone and run the stack:

git clone https://github.com/kenych/dockerizing-jenkins && \
   cd dockerizing-jenkins && \
   git checkout dockerizing_jenkins_part_4_elk_stack_simplified && \
   ./runall.sh

