
Structured Logging in Java With Elastic Stack

Logging is an important pillar of observability in a microservices architecture. Structured logging can be leveraged to enable many business-critical functions.

By Randhir Singh · Jul. 28, 21 · Tutorial

Logging is an important pillar of observability in a microservices architecture. Logs are most commonly used to diagnose issues in software functionality. However, they can also be leveraged to enable business-critical functions such as real-time analytics.

In this article, we will use logging libraries in Java to produce structured logs and the Elastic Stack to collect and aggregate them. We will then query Elasticsearch to derive insights from the indexed logs.

Scenario

Let us say we have deployed an API for customers to use. We would like some statistics on its usage: in a given time window, who invoked it, how many times, how many calls succeeded, and what the failures were.
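To make the goal concrete, here is a minimal sketch of the kind of per-customer statistics we are after, using a hypothetical ApiCall event record and plain Java streams (the record and sample data are illustrative, not part of the API):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ApiUsageStats {
    // Hypothetical API call event; the field names mirror the log fields used later.
    record ApiCall(String customerId, String productId, boolean success) {}

    // Count calls per customer, split into successes and failures.
    static Map<String, Map<Boolean, Long>> callsByCustomer(List<ApiCall> calls) {
        return calls.stream().collect(Collectors.groupingBy(
                ApiCall::customerId,
                Collectors.groupingBy(ApiCall::success, Collectors.counting())));
    }

    public static void main(String[] args) {
        List<ApiCall> calls = List.of(
                new ApiCall("client-1", "product-1", true),
                new ApiCall("client-1", "product-1", false),
                new ApiCall("client-2", "product-2", true));
        System.out.println(callsByCustomer(calls));
    }
}
```

Deriving these numbers is trivial once each call is available as a structured event; the rest of the article is about getting the logs into that shape.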

The Java developer who wrote the API may log the incoming request as follows:

Java
 
log.info("API called by customer {}, product {} for {}",
        getCustomerId(),
        getProductId(),
        getAction());


This prints a log in an unstructured format.

Plain Text
 
10:03:35.331 [main] INFO none.rks.myproduct - API called by customer client-1, product product-1 for purpose-1


Doing any kind of analytics on unstructured logs is difficult. If we want to search by customer or product, a log aggregator cannot help us, as those fields are not available in its index.
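To see why, consider what extracting the customer from the line above takes without structure: a regular expression coupled to the exact message wording. The sketch below (a hypothetical class and helper) matches today's message but silently returns nothing after a harmless rewording:

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class UnstructuredParsing {
    // Regex tied to the exact wording of the log message; any change breaks it.
    private static final Pattern CALL = Pattern.compile(
            "API called by customer (\\S+), product (\\S+) for (\\S+)");

    static Optional<String> customerOf(String logLine) {
        Matcher m = CALL.matcher(logLine);
        return m.find() ? Optional.of(m.group(1)) : Optional.empty();
    }

    public static void main(String[] args) {
        String line = "10:03:35.331 [main] INFO none.rks.myproduct - "
                + "API called by customer client-1, product product-1 for purpose-1";
        System.out.println(customerOf(line));                       // prints Optional[client-1]
        // A harmless rewording of the message yields nothing:
        System.out.println(customerOf("API invoked by customer client-1")); // prints Optional.empty
    }
}
```

Structured logs avoid this fragility by carrying each value as a named field.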

Structured Logging in Java

To create logs in a structured format, we can start by using Java ECS logging in our Java application. The logs generated in this manner will be structured as JSON objects and will use Elastic Common Schema (ECS)-compliant field names.

In our application, we'll configure Log4j2 to use Java ECS logging.

Add the following dependencies to pom.xml.

XML
 
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
<dependency>
    <groupId>co.elastic.logging</groupId>
    <artifactId>log4j2-ecs-layout</artifactId>
    <version>${ecs-logging-java.version}</version>
</dependency>


As Spring Boot uses Logback as its default logger, let us exclude it from the spring-boot-starter dependency.

XML
 
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Next, add a Log4j2 configuration file, log4j2.xml, in src/main/resources.

XML
 
<Configuration status="WARN">
    <Appenders>
        <Console name="LogToConsole" target="SYSTEM_OUT">
            <EcsLayout serviceName="my-app"/>
        </Console>
        <File name="FileAppender" fileName="logs/app.log.json">
            <EcsLayout serviceName="my-app"/>
        </File>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="LogToConsole"/>
            <AppenderRef ref="FileAppender"/>
        </Root>
        <Logger name="none.rks.Main" level="debug" additivity="false">
            <AppenderRef ref="LogToConsole"/>
            <AppenderRef ref="FileAppender"/>
        </Logger>
    </Loggers>
</Configuration>

Let us use Log4j2's StringMapMessage in our application to log structured key-value pairs.

Java
 
package none.rks;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.message.StringMapMessage;

public class Foo {
  private static final Logger log = LogManager.getLogger(Foo.class);

  public void bar() {
    log.info(new StringMapMessage()
        .with("message", "API called")
        .with("customer.id", getCustomerId())
        .with("product.id", getProductId())
        .with("customer.action", getAction()));
  }
}

This will produce logs in a structured JSON format with our custom fields.

JSON
 
{
  "@timestamp": "2021-07-01T11:20:12.176Z",
  "log.level": "INFO",
  "customer.id": "client-1",
  "product.id": "25a76f91-41dd-49da-8020-3553c9100267",
  "customer.action": "CREATE",
  "message": "API called",
  "process.thread.name": "reactor-http-nio-2",
  "log.logger": "none.rks.Foo"
}
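Since every call site must repeat these keys, a small helper can keep them consistent. Below is a hypothetical sketch using only the JDK; assuming StringMapMessage's map-accepting constructor, the resulting map can be passed straight to the logger:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper that gathers the custom fields in one place so every
// call site logs the same ECS-style keys.
public class ApiCallFields {
    static Map<String, String> of(String customerId, String productId, String action) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("message", "API called");
        fields.put("customer.id", customerId);
        fields.put("product.id", productId);
        fields.put("customer.action", action);
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(of("client-1", "product-1", "CREATE"));
        // prints {message=API called, customer.id=client-1, product.id=product-1, customer.action=CREATE}
    }
}
```

A call site would then be `log.info(new StringMapMessage(ApiCallFields.of(...)))`, and a typo in a field name can no longer silently create a second, unqueryable field.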

These custom fields will be indexed, enabling us to query the log aggregator on them. Let us set up and configure the log aggregator next.

Set Up the Elastic Stack

The Elastic Stack comes in both a self-managed, on-premises version and a managed cloud version. For illustration purposes, let us install the required components on a Windows laptop.

First, download and install Elasticsearch, which will index our logs. Navigate to http://localhost:9200 to verify that Elasticsearch is running; it should respond with something like:

JSON
 
{
  "name" : "LAPTOP-4S09L4LN",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "iPjGMFBuQE-v-8UwwC6JFA",
  "version" : {
    "number" : "7.13.1",
    "build_flavor" : "default",
    "build_type" : "zip",
    "build_hash" : "9a7758028e4ea59bcab41c12004603c5a7dd84a9",
    "build_date" : "2021-05-28T17:40:59.346932922Z",
    "build_snapshot" : false,
    "lucene_version" : "8.8.2",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}


Next, install a compatible version of Kibana to visualize those logs. Navigate to http://localhost:5601 to make sure Kibana is installed correctly. By default, it connects to Elasticsearch running at http://localhost:9200.

Finally, install Filebeat to collect the logs and ship them to Elasticsearch. We will configure Filebeat to collect logs from the desired location and send them to Elasticsearch running at http://localhost:9200.

Locate filebeat.yml in the directory where Filebeat was installed. Only the relevant properties are shown here.

YAML
 
filebeat.inputs:
- type: log
  enabled: true
  paths:
    # Point this at the JSON log file written by the File appender above.
    - c:\path\to\app\logs\app.log.json
  json:
    message_key: message
    keys_under_root: true
    overwrite_keys: true

setup.kibana:
  host: "localhost:5601"

output.elasticsearch:
  hosts: ["localhost:9200"]


Run Analytics

To get statistics on API usage, for example, how many times the API was called, go to Dev Tools in Kibana and run queries against the indexed logs.

JSON
 
GET filebeat-*/_search
{
  "query": {
    "exists": {
      "field": "customer.id"
    }
  }
}


As the custom field customer.id was indexed, we can query and aggregate on it directly. The same data can also be explored visually in Kibana's Discover view.
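The same kind of query can also be issued programmatically. Below is a minimal sketch using the JDK's built-in HttpClient, assuming the local Elasticsearch instance from the setup above and Filebeat's default filebeat-* indices; it uses an `exists` query to match every document carrying the custom field:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LogQuery {
    // Elasticsearch "exists" query: matches every document that has the given field.
    static String existsQuery(String field) {
        return "{ \"query\": { \"exists\": { \"field\": \"" + field + "\" } } }";
    }

    public static void main(String[] args) throws Exception {
        // Assumes Elasticsearch is running locally and Filebeat has shipped
        // the application logs into its default filebeat-* indices.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/filebeat-*/_search"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(existsQuery("customer.id")))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```

From here, the response hits could be fed into an aggregation like the per-customer counting sketched at the start of the article.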

Conclusion

Structured logging enables us to go beyond the traditional troubleshooting role of application logs and support business-critical operations. In this article, we covered how to create structured logs, used Elasticsearch as an example of a log aggregator, showed how to install and configure the components of the Elastic Stack, and finally ran queries on the custom fields we added to our structured logs.

Java (programming language)

Opinions expressed by DZone contributors are their own.
