
Logging Docker Containers With AWS Cloudwatch

This post describes how to set up the integration between Docker and AWS and then establish a pipeline of logs from CloudWatch into the ELK Stack.

By Daniel Berman · Nov. 28, 16 · Tutorial


One of the ways to log Docker containers is to use the logging drivers added by Docker last year. These drivers log the stdout and stderr output of a Docker container to a destination of your choice, depending on which driver you are using, and enable you to build a centralized log management system. The default behavior is to use the json-file driver, which saves container logs to a JSON file on the host.
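
If you want to confirm which driver is in play before changing anything, the Docker CLI can report both the daemon default and the driver a given container ended up with. A quick sketch (the <container-name> placeholder is whatever container you are inspecting):

# Default logging driver configured on the daemon (json-file unless overridden)
$ docker info --format '{{.LoggingDriver}}'

# Driver actually used by a specific container
$ docker inspect --format '{{.HostConfig.LogConfig.Type}}' <container-name>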

The awslogs driver allows you to log your containers to AWS CloudWatch, which is useful if you are already using other AWS services and would like to store and access the log data in the cloud. Once in CloudWatch, you can hook the logs up to an external logging system for further monitoring and analysis.

This post describes how to set up the integration between Docker and AWS and then establish a pipeline of logs from CloudWatch into the ELK Stack (Elasticsearch, Logstash, and Kibana) offered by Logz.io.

Setting Up AWS

Before you even touch Docker, you need to make sure that AWS is configured correctly. This means creating a user with an attached policy that allows writing events to CloudWatch, as well as a new logging group and stream in CloudWatch.

Creating an AWS User

First, create a new user in the IAM console (or select an existing one). Make note of the user's security credentials (access key ID and secret access key); they are needed to configure the Docker daemon in the next step. If this is a new user, simply add a new access key.
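
If you prefer the AWS CLI over the console, the same user and access key can be created with two commands. The user name docker-logger below is only an example; use whatever naming fits your account:

# Create the IAM user (example name) and generate an access key;
# the AccessKeyId and SecretAccessKey in the output are the credentials used below
$ aws iam create-user --user-name docker-logger
$ aws iam create-access-key --user-name docker-logger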

Second, you will need to define a new policy and attach it to the user that you have just created. In the same IAM console, select Policies from the menu on the left and create the following policy, which enables writing logs to CloudWatch:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}

Save the policy, and assign it to the user.
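
As a rough CLI equivalent, assuming the JSON above is saved to a local file and you reuse the example docker-logger user from before, an inline policy can be attached like this (the file name and policy name are arbitrary examples):

# Attach the CloudWatch Logs policy to the user as an inline policy
# (cloudwatch-policy.json and cloudwatch-logs-write are example names)
$ aws iam put-user-policy \
    --user-name docker-logger \
    --policy-name cloudwatch-logs-write \
    --policy-document file://cloudwatch-policy.json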

Preparing CloudWatch

Open the CloudWatch console, select Logs from the menu on the left, and then open the Actions menu to create a new log group:

[Image: Creating a new log group in AWS CloudWatch]

Within this new log group, create a new log stream. Make note of both the log group and log stream names; you will use them when running the container.
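
Both can also be created from the CLI; the names log-group and log-stream below simply match the ones used in the docker run example later on:

# Create the log group and a stream inside it (same names as used later)
$ aws logs create-log-group --log-group-name log-group --region us-east-1
$ aws logs create-log-stream --log-group-name log-group --log-stream-name log-stream --region us-east-1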

Configuring Docker

The next step is to configure the Docker daemon (and not the Docker Engine) to use your AWS user credentials.

As specified in the Docker documentation, there are a number of ways to do this, such as using shared credentials in ~/.aws/credentials or EC2 instance policies (if Docker is running on an AWS EC2 instance). This example, however, opts for a third option: using an Upstart job.

Create a new override file for the Docker service in the /etc/init folder:

$ sudo vim /etc/init/docker.override


Define your AWS user credentials as environment variables:

env AWS_ACCESS_KEY_ID=<AWS_ACCESS_KEY_ID>
env AWS_SECRET_ACCESS_KEY=<AWS_SECRET_ACCESS_KEY>


Save the file, and restart the Docker daemon:

$ sudo service docker restart


Note that in this case, Docker is installed on an Ubuntu 14.04 machine. If you are using a later version of Ubuntu, you will need to use systemd.
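
For reference, a minimal systemd sketch of the same idea is a drop-in file that injects the two variables into the Docker service; the file name aws-credentials.conf is an arbitrary example:

# Create a drop-in override for the Docker unit with the AWS credentials
# (the .conf file name is just an example)
$ sudo mkdir -p /etc/systemd/system/docker.service.d
$ sudo tee /etc/systemd/system/docker.service.d/aws-credentials.conf <<'EOF'
[Service]
Environment="AWS_ACCESS_KEY_ID=<AWS_ACCESS_KEY_ID>"
Environment="AWS_SECRET_ACCESS_KEY=<AWS_SECRET_ACCESS_KEY>"
EOF

# Reload systemd and restart the daemon so the variables take effect
$ sudo systemctl daemon-reload
$ sudo systemctl restart docker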

Using the awslogs Driver

Now that Docker has the correct permissions to write to CloudWatch, it's time to test the first leg of the logging pipeline.

Use the run command with the awslogs log-driver parameters:

$ docker run -it --log-driver="awslogs" --log-opt awslogs-region="us-east-1" --log-opt awslogs-group="log-group" --log-opt awslogs-stream="log-stream" ubuntu:14.04 bash


If you'd rather use Docker Compose, create a new docker-compose.yml file with this logging configuration:

version: "2"
services:
   web:
      image: ubuntu:14.04
      logging:
         driver: "awslogs"
         options:
            awslogs-region: "us-east-1"
            awslogs-group: "log-group"
            awslogs-stream: "log-stream"
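
Starting the service is then the usual Compose invocation; note that with the awslogs driver the container output goes to CloudWatch rather than to the local json-file store:

# Start the service in the background; its stdout/stderr will stream to CloudWatch
$ docker-compose up -d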


Open up the log stream in CloudWatch. You should see container logs:

[Image: Container logs in the CloudWatch log stream]
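
You can also verify the pipeline without opening the console at all; a quick CLI check against the same group and stream should return the most recent events:

# Fetch the latest events from the stream used in the docker run command
$ aws logs get-log-events \
    --log-group-name log-group \
    --log-stream-name log-stream \
    --limit 10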

Shipping to ELK for Analysis

So, you've got your container logs in CloudWatch. What now? If you need monitoring and analysis, the next obvious step would be to ship the data into a centralized logging system.

The next section describes how to set up a pipeline of logs into the Logz.io AI-powered ELK Stack using S3 batch export. Two AWS requirements need to be noted here:

  • The Amazon S3 bucket must reside in the same region as the log data that you want to export.
  • You have to make sure that your AWS user has permissions to access the S3 bucket.

Exporting to S3

CloudWatch supports batch export to S3, which in this context means that you can export batches of archived Docker logs to an S3 bucket for further ingestion and analysis in other systems.

To export the Docker logs to S3, open the Logs page in CloudWatch. Then, select the log group you wish to export, click the Actions menu, and select Export data to Amazon S3:

[Image: Exporting CloudWatch data to Amazon S3]

In the dialog that is displayed, configure the export by selecting a time frame and an S3 bucket to export to. Click Export data when you're done, and the logs will be exported to S3.
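
The same export can be triggered from the CLI with create-export-task; the bucket name, destination prefix, and the epoch-millisecond timestamps below are placeholder values for illustration:

# Export a time range of the log group to S3
# (my-export-bucket, docker-logs, and the millisecond timestamps are example values)
$ aws logs create-export-task \
    --log-group-name log-group \
    --from 1480291200000 \
    --to 1480377600000 \
    --destination my-export-bucket \
    --destination-prefix docker-logs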

Importing Into Logz.io

Setting up the integration with Logz.io is easy. In the user interface, go to Log Shipping → AWS → S3 Bucket.

Click "Add S3 Bucket," and configure the settings for the S3 bucket containing the Docker logs with the name of the bucket, the prefix path within the bucket (excluding the bucket name), the AWS access key details (ID + secret), the AWS region, and the type of logs:

[Image: The Add S3 Bucket dialog in Logz.io]

Hit the "Save" button, and your S3 bucket is configured. Head over to the Visualize tab in the user interface. After a moment or two, the container logs should be displayed:

[Image: Container logs displayed in Kibana]

Using Logstash

If you are using your own ELK Stack, you can configure Logstash to import and parse the S3 logs using the S3 input plugin.

An example configuration would look something like this:

input {
   s3 {
      bucket => "dockertest"
      access_key_id => "my-aws-key"
      secret_access_key => "my-aws-token"
      region => "us-east-1"
      # keep track of the last processed file
      sincedb_path => "./last-s3-file"
      codec => "json"
      type => "cloudwatch"
   }
}

filter {}

output {
   elasticsearch {
      hosts => ["server-ip:9200"]
   }
}
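
Assuming the configuration is saved locally (the file name s3-docker.conf below is just an example), running the pipeline is simply:

# Start Logstash with the S3-to-Elasticsearch pipeline defined above
$ bin/logstash -f s3-docker.conf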

A Final Note

There are other methods of pulling the data from CloudWatch into S3, such as using Kinesis and Lambda. If you're looking for an automated process, this might be the better option to explore; I will cover it in the next post on the subject.

Also, if you're looking for a more comprehensive solution for logging Docker environments using ELK, I recommend reading about the Logz.io Docker log collector.

Docker (software) AWS

Published at DZone with permission of Daniel Berman, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
