Log Management With the ELK Stack on Windows Server — Part 1
Learn how to use the ELK Stack log management platform, including Elasticsearch, to process data and manage logs.
As the volume of digital content grows, reading, finding, or analyzing information becomes difficult and time-consuming. To solve this problem, developers built search engines: tools designed to reduce both the time required to find information and the amount of information that must be consulted.
In this post, we will try to understand the ELK Stack (also known as the Elastic Stack), a log management platform built around the most popular search engine, Elasticsearch. I have broken the explanation into three parts:
- Introduction — What is the ELK Stack?
- Installation — Install ELK Stack on Windows Server 2012 R2
- Customization — Customize and visualize custom logs
This article is Part 1, the introduction.
What Is the ELK Stack?
The ELK Stack is a collection of three open-source products: Elasticsearch, Logstash, and Kibana. Together they form a complete log-processing pipeline: Logstash collects and transforms the data, Elasticsearch indexes and stores it, and Kibana visualizes it.
Elasticsearch is a distributed, RESTful search and analytics engine based on Apache Lucene, capable of addressing a growing number of use cases. It is developed in Java, released as open source under the terms of the Apache License, and categorized as a NoSQL database, which means it stores data as schema-free JSON documents rather than in rigid relational tables.
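To make the RESTful, document-oriented nature of Elasticsearch concrete, here is a minimal sketch of indexing and then searching a document over its JSON HTTP API. The index name `app-logs` and all field names are hypothetical examples, not from this article:

```
# Index a JSON document (no schema required up front)
PUT /app-logs/_doc/1
{
  "timestamp": "2018-03-01T10:15:00Z",
  "level": "ERROR",
  "message": "Connection timed out"
}

# Search the index for all ERROR-level events
GET /app-logs/_search
{
  "query": {
    "match": { "level": "ERROR" }
  }
}
```

Requests like these can be sent with any HTTP client or from the Kibana Dev Tools console.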
According to the DB-Engines ranking, Elasticsearch is the most popular enterprise search engine, followed by Apache Solr (also based on Apache Lucene).
In the ELK Stack, Elasticsearch works together with the other components, Logstash and Kibana, and handles data indexing and storage.
Logstash is a log pipeline tool that processes events in three stages:
- Input Plugins - Accepts inputs from various sources
- Filter Plugins - Executes different transformations
- Output Plugins - Exports the data to various targets
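The three stages above map directly onto the three sections of a Logstash pipeline configuration file. The following is a minimal sketch; the log file path, grok pattern, and index name are placeholders chosen for illustration:

```
input {
  file {
    # hypothetical application log to tail
    path => "C:/logs/app.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    # parse lines like "2018-03-01 10:15:00 ERROR Connection timed out"
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}
```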
Input Plugins: Logstash supports a variety of inputs that pull in events from a multitude of common sources, all at the same time. Examples include file, HTTP, beats, and log4j. Check the documentation for the full list of input plugins.
Filter Plugins: As data travels from source to store, Logstash filters parse each event, identify named fields to build a structure, and transform them to converge on a common format for easier, accelerated analysis and business value.
A filter plugin performs intermediary processing on an event. Filters are often applied conditionally, depending on the characteristics of the event. The filter plugins I used most in my own work are the following (check the documentation for all available filter plugins):
- grok - Parses unstructured event data into fields
- mutate - Performs mutations on fields
- kv - Parses key-value pairs
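As a sketch of how these three filters can work together, consider a filter section for lines such as `2018-03-01 10:15:00 INFO user=jsmith status=200`. The pattern and field names here are assumptions for illustration, not from the article:

```
filter {
  # grok: pull structured fields out of the free-text line
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:payload}" }
  }
  # kv: split the "user=jsmith status=200" payload into individual fields
  kv {
    source => "payload"
  }
  # mutate: tidy up the resulting event
  mutate {
    lowercase    => [ "level" ]
    remove_field => [ "payload" ]
  }
}
```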
Output Plugins: Logstash has a variety of outputs that let you route data where you want. In our case it will be Elasticsearch, but it can also be a file, email, HTTP, etc. Check the documentation for the full list of output plugins.
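A single pipeline can also fan events out to several destinations at once. In the sketch below, events go to Elasticsearch and are additionally written to a local file; the host, index name, and path are placeholders:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
  # keep a local copy as well, e.g. for debugging
  file {
    path => "C:/logs/processed/app-%{+YYYY-MM-dd}.log"
  }
}
```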
Kibana is the visualization layer of the stack. Its simple, browser-based interface enables you to quickly create and share dynamic dashboards that display changes to Elasticsearch queries in real time. Kibana's visualization features allow users to explore data in a variety of ways, using charts, tables, geographical maps, and other visualization types.
I hope this helped you understand what the ELK Stack is. Please feel free to share your comments and feedback.
Part 2 covers the installation of the ELK Stack on the Windows Server 2012 R2 platform.
Published at DZone with permission of Shamil Mehdiyev. See the original article here.
Opinions expressed by DZone contributors are their own.