
End to End Distributed Logging Traceability With Custom Context

Introduction

This article demonstrates simple end-to-end logging traceability using open-source technologies: Spring Boot, Sleuth, MDC, Logstash, Elasticsearch, and Kibana.

Problem Statement

In the distributed-application world, many applications participate in a single customer business process. When a problem occurs, it is difficult to trace the log details across these distributed applications, because each development team uses its own logging pattern, method, or API (Apache Commons Logging, SLF4J, Log4j or Log4j2, Java Util Logging, Logback, and so on).

In a real production or integration environment, each module's developers struggle to debug a problem, since there is no unique trace ID shared across the applications. Developers may also struggle to identify the functional context, because that information is not shared across applications either.
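To make the problem concrete: without a shared identifier, correlating one request across services means every service must generate and forward a correlation ID by hand on each outbound call. A minimal sketch using only the JDK's `HttpRequest` (the URL and header value here are illustrative; Sleuth automates exactly this bookkeeping):

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.util.UUID;

public class ManualTracePropagation {

    // Build an outbound request carrying the correlation id as a B3-style header,
    // so the downstream service can log the same id.
    static HttpRequest withTrace(String url, String traceId) {
        return HttpRequest.newBuilder(URI.create(url))
                .header("X-B3-TraceId", traceId)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        // Without Sleuth, every service must create and pass this id manually.
        String traceId = UUID.randomUUID().toString().replace("-", "");
        HttpRequest request = withTrace("http://localhost:8081/coreapp2", traceId);
        System.out.println("propagating " + request.headers().firstValue("X-B3-TraceId").orElseThrow());
    }
}
```

Doing this by hand in every service, for every call, is exactly the burden the solution below removes.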

Solution Approach

Option 1: You can use Spring Boot's built-in Spring Cloud Sleuth support, which is a simple and efficient way to create a traceId and spanId. The traceId is automatically propagated to all participating microservices, REST components, and servlet applications.
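For reference, Sleuth is pulled into each project with a single starter dependency (a sketch; version management is assumed to come from the Spring Cloud BOM):

```xml
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-starter-sleuth</artifactId>
</dependency>
```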


To view the distributed logs in a centralized dashboard, you can use Zipkin, Kibana, or any other visualization tool.

I have chosen Kibana: it is easy to use, performs well on top of Elasticsearch, pairs with a reliable Logstash pipeline, and offers good overall usability.

Let's start with our solution

Prerequisites

Java 8, 11, or 13

Any Java IDE (IntelliJ / Eclipse / ..)

Install LogStash: https://www.elastic.co/guide/en/logstash/current/installing-logstash.html

Install Elastic Search: https://www.elastic.co/guide/en/elasticsearch/reference/current/install-elasticsearch.html

Install Kibana: https://www.elastic.co/guide/en/kibana/current/install.html

Create three Spring Boot projects using https://start.spring.io

Project 1: CoreAppOneService

src/main/resources/application.properties

server.port=8080

src/main/resources/bootstrap.yml

spring:
  application:
    name: coreappone



src/main/resources/logback-spring.xml

(This file is identical for all three services, since the service name is injected via the spring.application.name property; it is listed in full under Project 3.)


src/main/java/com.dzone.sample.logging


src/main/java/com.dzone.sample.logging


pom.xml


Project 2: CoreAppTwoService

src/main/resources/application.properties

server.port=8081

src/main/resources/bootstrap.yml

spring:
  application:
    name: coreapptwo



src/main/resources/logback-spring.xml


src/main/java/com.dzone.sample.logging


src/main/java/com.dzone.sample.logging


Project 3: CoreAppThreeService

src/main/resources/application.properties

server.port=8082

src/main/resources/bootstrap.yml

spring:
  application:
    name: coreappthree



src/main/resources/logback-spring.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

  <springProperty scope="context" name="springAppName" source="spring.application.name"/>

  <!-- Example for logging into the build folder of your project -->
  <property name="LOG_FILE" value="${BUILD_FOLDER:-build}/${springAppName}"/>

  <!-- You can override this to have a custom pattern -->
  <property name="CONSOLE_LOG_PATTERN"
    value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %X{context.userId} %X{context.moduleId} %X{context.caseId} %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}"/>

  <!-- Appender to log to console -->
  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
      <!-- Minimum logging level to be presented in the console logs -->
      <level>DEBUG</level>
    </filter>
    <encoder>
      <pattern>${CONSOLE_LOG_PATTERN}</pattern>
      <charset>utf8</charset>
    </encoder>
  </appender>

  <!-- Appender to log to file -->
  <appender name="flatfile" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_FILE}</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>${LOG_FILE}.%d{yyyy-MM-dd}.gz</fileNamePattern>
      <maxHistory>7</maxHistory>
    </rollingPolicy>
    <encoder>
      <pattern>${CONSOLE_LOG_PATTERN}</pattern>
      <charset>utf8</charset>
    </encoder>
  </appender>

  <!-- Appender to log to file in a JSON format -->
  <appender name="logstash" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>${LOG_FILE}.json</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>${LOG_FILE}.json.%d{yyyy-MM-dd}.gz</fileNamePattern>
      <maxHistory>7</maxHistory>
    </rollingPolicy>
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
      <providers>
        <timestamp>
          <timeZone>UTC</timeZone>
        </timestamp>
        <pattern>
          <pattern>
            {
            "severity": "%level",
            "service": "${springAppName:-}",
            "trace": "%X{X-B3-TraceId:-}",
            "span": "%X{X-B3-SpanId:-}",
            "parent": "%X{X-B3-ParentSpanId:-}",
            "exportable": "%X{X-Span-Export:-}",
            "pid": "${PID:-}",
            "thread": "%thread",
            "pvai.userId": "%X{context.userId:-}",
            "pvai.moduleId": "%X{context.moduleId:-}",
            "pvai.caseId": "%X{context.caseId:-}",
            "class": "%logger{40}",
            "rest": "%message"
            }
          </pattern>
        </pattern>
      </providers>
    </encoder>
  </appender>

  <root level="INFO">
    <appender-ref ref="console"/>
    <!-- uncomment this to have also JSON logs -->
    <appender-ref ref="logstash"/>
    <appender-ref ref="flatfile"/>
  </root>
</configuration>
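The console pattern and the JSON encoder above both read %X{context.userId}, %X{context.moduleId}, and %X{context.caseId} from the logging MDC. In the services this is done with org.slf4j.MDC; the thread-local mechanism it relies on can be sketched in plain Java (a simplified stand-in to show the semantics, not the real SLF4J class):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for org.slf4j.MDC: a per-thread map of context keys
// that logback's %X{...} conversion word reads at log time.
public class MiniMdc {
    private static final ThreadLocal<Map<String, String>> CTX =
            ThreadLocal.withInitial(HashMap::new);

    public static void put(String key, String value) { CTX.get().put(key, value); }
    public static String get(String key) { return CTX.get().get(key); }
    public static void clear() { CTX.get().clear(); }

    public static void main(String[] args) {
        // A servlet filter would typically set these on request entry...
        put("context.userId", "user-42");
        put("context.moduleId", "billing");
        put("context.caseId", "case-7");
        System.out.println("userId in MDC: " + get("context.userId"));
        // ...and clear them on request exit so the pooled thread can be reused.
        clear();
    }
}
```

Because the map is per-thread, every log statement on that request thread sees the same context values without them being passed as method arguments.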



src/main/java/com.dzone.sample.logging


src/main/java/com.dzone.sample.logging


Configure LogStash PipeLine

/usr/local/logstash-7.5.1/config/logstash.conf


Start LogStash:

/usr/local/logstash-7.5.1/bin

./logstash -f ../config/logstash.conf
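The pipeline file itself is not reproduced above; a minimal sketch of what logstash.conf could contain for this setup (the log-file path and index name are assumptions; point the path at the build folder of your workspace):

```conf
input {
  file {
    # JSON log files written by the logstash appender of each service
    path => "/your-workspace/build/*.json"
    codec => json
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # daily index so Kibana can build a time-based index pattern
    index => "coreapplogs-%{+YYYY.MM.dd}"
  }
}
```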

Sample Logstash Pipeline execution log



Start Kibana

/usr/local/var/homebrew/linked/kibana-full/bin

./kibana


Start ElasticSearch

/usr/local/var/homebrew/linked/elasticsearch-full/bin

./elasticsearch


Once all the servers are started, hit the CoreAppOne service in a browser:

http://localhost:8080/coreapp1

This CoreAppOneService calls CoreAppTwoService, which in turn calls CoreAppThreeService.

Now you should be able to see both the console logs and the JSON logs under your workspace's build folder.
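With the logstash appender active, each JSON log line follows the encoder pattern from logback-spring.xml. An illustrative entry (all values here are invented) looks like:

```json
{"severity":"INFO","service":"coreappone","trace":"8e7f0d2a9c1b4e5f","span":"9c1b4e5f8e7f0d2a","parent":"","exportable":"true","pid":"12345","thread":"http-nio-8080-exec-1","pvai.userId":"user-42","pvai.moduleId":"billing","pvai.caseId":"case-7","class":"c.d.s.logging.CoreAppOne","rest":"calling coreapptwo"}
```

The trace value is the same across all three services for one request, which is what makes the Kibana search below possible.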

Let's connect to Kibana to configure the Logstash index:

http://localhost:5601

kibana


index patterns


create index pattern step one


create index pattern step two


kibana discovery


CoreAppLogStash


CoreAppLogTrace

Topics:
big data, elasticsearch, kibana, logstash, mdc, sleuth, spring boot, springbootapplication

Opinions expressed by DZone contributors are their own.
