
Design Centralized Logging Architecture Using Filebeat, ElasticSearch, and Kibana


Look at a tutorial on centralized logging architecture using Filebeat, Elasticsearch, and Kibana.


Well, I am NOT talking about these logs :)

I am talking about digital logs, generated across many servers in multiple environments.

As I work on APIs and microservices, logs are a very important part of our programs.

Our problem was that we were getting millions of API calls every day. Normally, the servers and network devices work fine, but when a response is slow, it is hard for my team to find the exact issue and where the API response process is stuck. Also, with modern cloud infrastructure, production logs grow every minute, so it's crucial to get an end-to-end view of what's going on inside the servers and network devices. Of course, we store logs for each API request and process.

But the problem was that the logs sat in log files on each server. It is very time-consuming to go through the log files on every server to find errors. Also, you cannot monitor a live stream of API calls, which makes analysis impossible. You cannot search massive amounts of data quickly or pinpoint issues in real time with custom views.

API logs generate a massive amount of data, and this data may come from multiple environments across many servers. Our objective was to ensure that this data isn't lost and can be used effectively, so it should be consolidated and centralized in a single storage location.

That’s why I started to think about centralized logging architecture.

Here is how my logging architecture was before being centralized:

I researched a few logging architectures. There are many, and each has its own pros and cons; you need to decide what fits your requirements. I selected Filebeat, Elasticsearch, and Kibana. This is an open-source stack, and we are at an early stage, so we just wanted to see how it works at a large scale. It works like a charm :)

Elastic

EFK (Elasticsearch, Filebeat, Kibana) is an open-source stack. Beyond log aggregation, it includes Elasticsearch for indexing and searching through data and Kibana for charting and visualizing data. Filebeat is a lightweight shipper for forwarding and centralizing log data. Installed as an agent on your servers, Filebeat monitors the log files or locations that you specify, collects log events, and forwards them to either Elasticsearch or Logstash for indexing.

Pros:

  • Open source tools come with a lot of control

  • Quick and easy setup for an open source solution

Cons:

  • Elasticsearch has its own REST API as well as JSON templates

Here is a suggested architecture using ELK stack:


But I was trying to keep fewer components between the log source and the storage, so I removed the buffer system and Logstash. Filebeat can send logs directly to Elasticsearch, so in my case Logstash is not necessary.
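As a sketch, a minimal filebeat.yml for this simplified setup might look like the following. The log path and Elasticsearch host below are placeholders for illustration; adjust them to your environment (this uses the 5.x `filebeat.prospectors` syntax, matching the Filebeat version used later in this article):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - C:\logs\api\*.log        # hypothetical path to your API log files

# Ship events straight to Elasticsearch; no Logstash in between
output.elasticsearch:
  hosts: ["localhost:9200"]      # assumed Elasticsearch address
```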



Here are the steps for setting up Filebeat to send logs to Elasticsearch. It is very simple.

Prerequisites: Elasticsearch needs to be installed and the Kibana URL should be ready.

Try walking through the full Getting Started guide for Filebeat. There are instructions for Windows. Basically, the instructions are:

  1. Extract the downloaded file anywhere.
  2. Open PowerShell with administrator rights.
  3. Move the extracted directory into Program Files.
PS > mv filebeat-5.1.2-windows-x86_64 "C:\Program Files\Filebeat"

4. Install the Filebeat service.

PS > cd "C:\Program Files\Filebeat"
PS C:\Program Files\Filebeat> powershell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1

5. Edit the filebeat.yml config file and test your config.

The filebeat.yml path would be C:\Program Files\Filebeat\

PS C:\Program Files\Filebeat> .\filebeat.exe -e -configtest

6. (Optional) Run Filebeat in the foreground to make sure everything is working correctly. Ctrl+C to exit.

PS C:\Program Files\Filebeat> .\filebeat.exe -c filebeat.yml -e -d "*"

7. Start the service.

PS > Start-Service filebeat

If you need to stop it, use Stop-Service filebeat. You will need to stop and start the service whenever you make changes to the config.
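For example, after editing filebeat.yml, restart the service so the changes take effect:

```powershell
PS > Stop-Service filebeat
PS > Start-Service filebeat
```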

Now you can check log entries and live streams into Kibana.
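Before opening Kibana, you can also confirm that events are arriving by asking Elasticsearch to list the Filebeat indices. This is a sketch that assumes Elasticsearch is reachable at localhost:9200:

```powershell
PS > Invoke-RestMethod -Uri "http://localhost:9200/_cat/indices/filebeat-*?v"
```

If the index list is empty, double-check the paths in filebeat.yml and the Filebeat service status.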

Let us know your thoughts in the comments.

