End-to-End Distributed Logging Traceability With Custom Context
This article walks through simple end-to-end logging traceability using open-source technologies such as Spring Boot, Sleuth, and more!
We will build simple end-to-end logging traceability using open-source technologies: Spring Boot, Sleuth, MDC, Logstash, Elasticsearch, and Kibana.
In the world of distributed applications, many applications participate in a single customer business process. When a problem occurs, it is difficult to trace log details across these distributed applications. Each development team uses its own logging pattern, method, or API (Apache Commons Logging, SLF4J, Log4j or Log4j2, Java Util Logging, Logback, and so on).
In a real production or integration environment, module developers struggle to debug a problem because there is no unique trace ID shared across the applications. Developers may also struggle to identify the functional context information, since it is not shared across the applications.
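That missing functional context can be carried the way SLF4J's MDC carries it: a per-thread key/value map that every log statement can read. Below is a minimal, stdlib-only sketch of the idea; the class name and keys are illustrative, not Sleuth's or SLF4J's actual API.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal illustration of how an MDC (Mapped Diagnostic Context) works:
// each thread carries its own map of context values (trace ID, order ID,
// customer ID, ...) that a logging layout can include in every log line.
public class SimpleMdc {
    private static final ThreadLocal<Map<String, String>> CONTEXT =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) {
        CONTEXT.get().put(key, value);
    }

    public static String get(String key) {
        return CONTEXT.get().get(key);
    }

    public static void clear() {
        CONTEXT.get().clear(); // avoid leaking context on pooled threads
    }

    public static void main(String[] args) {
        put("traceId", "3f2a-91bc");   // normally set once per incoming request
        put("orderId", "ORD-1001");    // custom functional context
        System.out.println("[" + get("traceId") + "][" + get("orderId") + "] order received");
        clear();
        // prints "[3f2a-91bc][ORD-1001] order received"
    }
}
```

In the real setup, Sleuth and SLF4J's `org.slf4j.MDC` do this for you; the point is only that the context lives per thread and must be cleared when the request ends.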
Option 1: You can use the built-in Spring Cloud Sleuth, which is a simple and efficient way to create a traceId and spanId. The traceId is propagated to all the microservices/REST components/servlet applications.
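To try this, each of the services needs the Sleuth starter on its classpath. A Maven sketch (assuming the version is managed by the `spring-cloud-dependencies` BOM, as is usual in Spring Boot projects):

```xml
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-sleuth</artifactId>
</dependency>
```

With the starter present, Sleuth places `traceId` and `spanId` into the SLF4J MDC, so a logback pattern such as `%X{traceId:-}` prints them in every log line without code changes.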
To view the distributed logs in a centralized dashboard, you can use Zipkin, Kibana, or any other visualization tool.
I have chosen Kibana for its ease of use, better performance with Elasticsearch, a reliable Logstash pipeline, and overall better usability.
Let's start with our solution:
Java 8, 11, or 13
Any Java IDE (IntelliJ / Eclipse / ..)
Install Elastic Search: https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html
Install Kibana: https://www.elastic.co/guide/en/kibana/current/install.html
Create three Spring Boot projects using https://start.spring.io:
Project 1: CoreAppOneService
Project 2: CoreAppTwoService
Project 3: CoreAppThreeService
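Conceptually, each service forwards the trace ID to the next one it calls, so all three log under the same trace. Sleuth does this automatically by adding B3 headers to outgoing requests; the sketch below shows the idea by hand with the JDK `HttpClient` (Java 11+; the URL and port are assumptions for this setup):

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Builds the downstream request that CoreAppOneService would send to
// CoreAppTwoService, forwarding the incoming trace ID. "X-B3-TraceId"
// is the B3 propagation header name that Sleuth uses.
public class TraceForwarding {
    public static HttpRequest downstreamRequest(String url, String traceId) {
        return HttpRequest.newBuilder(URI.create(url))
                .header("X-B3-TraceId", traceId)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = downstreamRequest("http://localhost:8082/two", "3f2a91bc");
        System.out.println(req.headers().firstValue("X-B3-TraceId").orElse("missing"));
        // prints "3f2a91bc"
    }
}
```

With Sleuth on the classpath you never write this yourself: its instrumentation of `RestTemplate`, `WebClient`, and servlet filters injects and extracts these headers automatically.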
Configure the Logstash pipeline:
./logstash -f ../config/logstash.conf
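A minimal `logstash.conf` of the kind the command above expects could look like the following; the file path, Elasticsearch host, and index name are assumptions for this setup:

```
input {
  file {
    path => "/your/workspace/build/*.json"   # JSON log files written by the services
    codec => "json"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]       # default local Elasticsearch
    index => "distributed-logs-%{+YYYY.MM.dd}"
  }
}
```

The `file` input tails the JSON logs as the services write them, and the daily index pattern keeps the Elasticsearch indices small enough to manage retention easily.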
Sample Logstash pipeline execution log:
Once all the servers are started, hit the CoreAppOneService endpoint in a browser.
CoreAppOneService calls CoreAppTwoService, which in turn calls CoreAppThreeService.
Now you can see both the console logs and the JSON logs under your workspace/build folder.
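The JSON files come from a logback encoder. A `logback-spring.xml` along these lines produces both outputs (this assumes the `logstash-logback-encoder` dependency is on the classpath; the file name and path are illustrative):

```xml
<configuration>
    <!-- Plain console output for local debugging; traceId/spanId come from the MDC -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d [%X{traceId:-},%X{spanId:-}] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- JSON output for the Logstash file input -->
    <appender name="JSON" class="ch.qos.logback.core.FileAppender">
        <file>build/coreappone.json</file>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="JSON"/>
    </root>
</configuration>
```

The `LogstashEncoder` writes one JSON object per log event, including all MDC entries, so the traceId and any custom context arrive in Elasticsearch as searchable fields.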
Let's connect to Kibana to configure the Logstash index.