Logging Docker Containers With AWS Cloudwatch

This post describes how to set up the integration between Docker and AWS and then establish a pipeline of logs from CloudWatch into the ELK Stack.

By Daniel Berman · Nov. 28, '16 · Tutorial


One of the ways to log Docker containers is to use the logging drivers added by Docker last year. These drivers log the stdout and stderr output of a Docker container to a destination of your choice, depending on which driver you are using, and enable you to build a centralized log management system. (The default behavior is to use the json-file driver, which saves container logs to a JSON file.)

The awslogs driver allows you to log your containers to AWS CloudWatch, which is useful if you are already using other AWS services and would like to store and access the log data in the cloud. Once in CloudWatch, you can hook the logs up with an external logging system for future monitoring and analysis.
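
If you are not sure which driver is currently in play, Docker can report it directly. A quick check with the standard Docker CLI (the container name is a placeholder):

$ docker info --format '{{.LoggingDriver}}'                                   # daemon-wide default
$ docker inspect --format '{{.HostConfig.LogConfig.Type}}' <container-name>   # a specific container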

This post describes how to set up the integration between Docker and AWS and then establish a pipeline of logs from CloudWatch into the ELK Stack (Elasticsearch, Logstash, and Kibana) offered by Logz.io.

Setting Up AWS

Before you even touch Docker, you need to make sure that AWS is configured correctly. This means creating a user with an attached policy that allows for the writing of events to CloudWatch, and creating a new log group and stream in CloudWatch.

Creating an AWS User

First, create a new user in the IAM console (or select an existing one). Make note of the user’s security credentials (access key ID and access key secret); they are needed to configure the Docker daemon in the next step. If this is a new user, simply add a new access key.
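
If you prefer the command line over the console, the same user and access key can be created with the AWS CLI (a minimal sketch; the user name docker-logs is just an example):

$ aws iam create-user --user-name docker-logs
$ aws iam create-access-key --user-name docker-logs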

Second, you will need to define a new policy and attach it to the user that you have just created. In the same IAM console, select Policies from the menu on the left, and create the following policy, which enables writing logs to CloudWatch:

{
 "Version": "2012-10-17",
 "Statement": [
    {
       "Action": [
          "logs:CreateLogStream",
          "logs:PutLogEvents"
       ],
       "Effect": "Allow",
       "Resource": "*"
    }
 ]
}

Save the policy, and assign it to the user.
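
This step can also be scripted. Assuming the policy above is saved as cloudwatch-logs-policy.json (a hypothetical file name) and the example user from before, it can be attached as an inline user policy via the AWS CLI:

$ aws iam put-user-policy --user-name docker-logs --policy-name cloudwatch-logs-write --policy-document file://cloudwatch-logs-policy.json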

Preparing CloudWatch

Open the CloudWatch console, select Logs from the menu on the left, and then open the Actions menu to create a new log group:

[Screenshot: creating a new log group in AWS CloudWatch]

Within this new log group, create a new log stream. Make note of both the log group and log stream names; you will use them when running the container.
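
Both resources can also be created from the command line (using log-group and log-stream, the example names that appear in the commands below):

$ aws logs create-log-group --log-group-name log-group --region us-east-1
$ aws logs create-log-stream --log-group-name log-group --log-stream-name log-stream --region us-east-1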

Configuring Docker

The next step is to configure the Docker daemon (and not the Docker Engine) to use your AWS user credentials.

As specified in the Docker documentation, there are a number of ways to do this, such as shared credentials in ~/.aws/credentials or using EC2 instance policies (if Docker is running on an AWS EC2 instance). But this example will opt for a third option: using an Upstart job.

Create a new override file for the Docker service in the /etc/init folder:

$ sudo vim /etc/init/docker.override


Define your AWS user credentials as environment variables:

env AWS_ACCESS_KEY_ID=<aws_access_key_id>
env AWS_SECRET_ACCESS_KEY=<aws_secret_access_key>


Save the file, and restart the Docker daemon:

$ sudo service docker restart


Note that in this case, Docker is installed on an Ubuntu 14.04 machine. If you are using a later version of Ubuntu, you will need to use systemd.
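
Under systemd, the equivalent of the Upstart override is a drop-in file (a minimal sketch; the file name aws-credentials.conf is arbitrary):

# /etc/systemd/system/docker.service.d/aws-credentials.conf
[Service]
Environment="AWS_ACCESS_KEY_ID=<aws_access_key_id>"
Environment="AWS_SECRET_ACCESS_KEY=<aws_secret_access_key>"

Then reload systemd and restart the daemon:

$ sudo systemctl daemon-reload && sudo systemctl restart docker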

Using the awslogs Driver

Now that Docker has the correct permissions to write to CloudWatch, it’s time to test the first leg of the logging pipeline.

Use the run command with the awslogs driver parameters:

$ docker run -it --log-driver="awslogs" --log-opt awslogs-region="us-east-1" --log-opt awslogs-group="log-group" --log-opt awslogs-stream="log-stream" ubuntu:14.04 bash


If you’d rather use Docker Compose, create a new docker-compose.yml file with this logging configuration:

version: "2"
services:
  web:
    image: ubuntu:14.04
    logging:
      driver: "awslogs"
      options:
        awslogs-region: "us-east-1"
        awslogs-group: "log-group"
        awslogs-stream: "log-stream"
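
Bring the service up in the background, and its stdout and stderr will flow to the same CloudWatch stream:

$ docker-compose up -d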


Open up the log stream in CloudWatch. You should see container logs:

[Screenshot: container logs in a CloudWatch log stream]
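
You can also verify the stream without opening the console. A quick check with the AWS CLI:

$ aws logs get-log-events --log-group-name log-group --log-stream-name log-stream --region us-east-1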

Shipping to ELK for Analysis

So, you’ve got your container logs in CloudWatch. What now? If you need monitoring and analysis, the next obvious step would be to ship the data into a centralized logging system.

The next section will describe how to set up a pipeline of logs into the Logz.io AI-powered ELK Stack using S3 batch export. Two AWS requirements need to be noted here:

  • The Amazon S3 bucket must reside in the same region as the log data that you want to export.
  • You have to make sure that your AWS user has permissions to access the S3 bucket (a bucket policy sketch follows this list).
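
For the export itself, CloudWatch Logs also needs permission to write into the bucket. The AWS documentation describes a bucket policy along these lines (a sketch; replace dockertest and the region with your own values):

{
 "Version": "2012-10-17",
 "Statement": [
    {
       "Action": "s3:GetBucketAcl",
       "Effect": "Allow",
       "Resource": "arn:aws:s3:::dockertest",
       "Principal": { "Service": "logs.us-east-1.amazonaws.com" }
    },
    {
       "Action": "s3:PutObject",
       "Effect": "Allow",
       "Resource": "arn:aws:s3:::dockertest/*",
       "Condition": { "StringEquals": { "s3:x-amz-acl": "bucket-owner-full-control" } },
       "Principal": { "Service": "logs.us-east-1.amazonaws.com" }
    }
 ]
}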

Exporting to S3

CloudWatch supports batch export to S3, which in this context means that you can export batches of archived Docker logs to an S3 bucket for further ingestion and analysis in other systems.

To export the Docker logs to S3, open the Logs page in CloudWatch. Then, select the log group you wish to export, click the Actions menu, and select Export data to Amazon S3:

[Screenshot: exporting CloudWatch data to Amazon S3]

In the dialog that is displayed, configure the export by selecting a time frame and an S3 bucket to which to export. Click Export data when you’re done, and the logs will be exported to S3.
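
The same export can be triggered programmatically (a sketch; --from and --to are Unix timestamps in milliseconds, and the names reuse the examples above):

$ aws logs create-export-task --task-name docker-logs-export --log-group-name log-group --from 1480291200000 --to 1480377600000 --destination dockertest --destination-prefix docker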

Importing Into Logz.io

Setting up the integration with Logz.io is easy. In the user interface, go to Log Shipping → AWS → S3 Bucket.

Click “Add S3 Bucket,” and configure the settings for the S3 bucket containing the Docker logs with the name of the bucket, the prefix path to the bucket (excluding the name of the bucket), the AWS access key details (ID + secret), the AWS region, and the type of logs:

[Screenshot: adding an S3 bucket in Logz.io]

Hit the “Save” button, and your S3 bucket is configured. Head on over to the Visualize tab in the user interface. After a moment or two, the container logs should be displayed:

[Screenshot: container logs displayed in Kibana]

Using Logstash

If you are using your own ELK Stack, you can configure Logstash to import and parse the S3 logs using the S3 input plugin.

An example configuration would look something like this:

input {
   s3 {
      bucket => "dockertest"
      access_key_id => "my-aws-key"
      secret_access_key => "my-aws-secret"
      region => "us-east-1"
      # Keep track of the last processed file
      sincedb_path => "./last-s3-file"
      codec => "json"
      type => "cloudwatch"
   }
}

filter {}

output {
   elasticsearch {
      hosts => ["server-ip:9200"]
   }
}
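
To test the pipeline end to end, run Logstash with this configuration file (assuming it is saved as s3-cloudwatch.conf, a hypothetical name):

$ bin/logstash -f s3-cloudwatch.conf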

A Final Note

There are other methods of pulling the data from CloudWatch into S3, such as using Kinesis and Lambda. If you’re looking for an automated process, this might be the better option to explore; I will cover it in the next post on the subject.

Also, if you’re looking for a more comprehensive solution for logging Docker environments using ELK, I recommend reading about the Logz.io Docker log collector.


Published at DZone with permission of Daniel Berman, DZone MVB. See the original article here.
