
How to Feed Spring Boot Metrics to Elasticsearch

In this article, Nicolas Frankel describes how to send JMX metrics to an Elasticsearch instance from your Spring Boot application.


This week’s post aims to describe how to send JMX metrics taken from the JVM to an Elasticsearch instance.

Business App Requirements

The business app has only minor requirements.

The easiest use-case is to start from a Spring Boot application. In order for metrics to be available, just add the Actuator dependency to it.


Note that when inheriting from spring-boot-starter-parent, setting the version is not necessary, as it's inherited from the parent POM.
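Assuming a Maven build, the dependency might look like the following sketch (these are the standard Actuator starter coordinates; add a version element only if you don't inherit from spring-boot-starter-parent):

```xml
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
```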

To send data to JMX, configure a brand-new @Bean in the context:

@Bean
@ExportMetricWriter
MetricWriter metricWriter(MBeanExporter exporter) {
    return new JmxMetricWriter(exporter);
}

To-be Architectural Design

There are several options to put JMX data into Elasticsearch.

Possible Options

  1. The most straightforward way is to use Logstash with the JMX plugin.
  2. Alternatively, you can hack your own micro-service architecture: 
    • Let the application send metrics to JMX (there’s the Spring Boot Actuator for that; the overhead is pretty limited).
    • Have a feature expose the JMX data on an HTTP endpoint using Jolokia.
    • Have a dedicated app poll the endpoint and send the data to Elasticsearch.
  This way, every component has a single responsibility, there’s not much performance overhead, and the metric-handling part can fail while the main app is still available.
  3. An alternative would be to directly poll the JMX data from the JVM.

Unfortunate Setback

Any architect worth his salt (read: lazy) should always consider the out-of-the-box option first. The Logstash JMX plugin looks promising. After installing the plugin, the jmx input can be configured in the Logstash configuration file:

input {
  jmx {
    path => "/var/logstash/jmxconf"
    polling_frequency => 5
    type => "jmx"
  }
}

output {
  stdout { codec => rubydebug }
}

The plugin is designed to read the connection parameters (such as host and port), as well as the metrics to handle, from JSON configuration files. In the above example, the plugin watches for them in the /var/logstash/jmxconf folder. Moreover, they can be added, removed, and updated on the fly.

Here’s an example of such a configuration file:
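A minimal sketch follows; the host, port, and ObjectName are assumptions to adapt to your setup, and the field names follow the logstash-input-jmx documentation:

```json
{
  "host": "localhost",
  "port": 1616,
  "alias": "spring.boot",
  "queries": [
    {
      "object_name": "org.springframework.boot:type=Endpoint,name=metricsEndpoint",
      "object_alias": "${type}.${name}"
    }
  ]
}
```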


An MBean’s ObjectName can be determined from inside the jconsole:

[Screenshot: jconsole showing the MBean’s ObjectName]

The plugin allows wildcards in the metric’s name and the use of captured values in the alias. Also, by default, all attributes are read (this can be restricted if necessary).

Note: when starting the business app, it’s highly recommended to set the JMX port through the com.sun.management.jmxremote.port system property.
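For example, assuming port 1616 and no security (suitable for local testing only — business-app.jar is a placeholder name), the app could be launched with flags along these lines:

```shell
java -Dcom.sun.management.jmxremote \
     -Dcom.sun.management.jmxremote.port=1616 \
     -Dcom.sun.management.jmxremote.authenticate=false \
     -Dcom.sun.management.jmxremote.ssl=false \
     -jar business-app.jar
```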

Unfortunately, at the time of this writing, running the above configuration fails with messages of this kind:

[WARN][logstash.inputs.jmx] Failed retrieving metrics for attribute Value on object blah blah blah
[WARN][logstash.inputs.jmx] undefined method `event' for #<LogStash::Inputs::Jmx:0x70836e5d>

For reference purposes, the GitHub issue can be found here.

The Do-It-Yourself Alternative

Considering that it’s easier to poll HTTP endpoints than JMX, and that implementations already exist, let’s go for Option 2 above. Libraries will include:

  • Spring Boot for the business app: 
    • with the Actuator starter to provide metrics,
    • configured with the JMX exporter to send data,
    • and with the Jolokia dependency to expose JMX beans on an HTTP endpoint.
  • Another Spring Boot app for the “poller,” configured with a scheduled service to regularly poll the endpoint and send the data to Elasticsearch.

[Diagram: the business app exposes its JMX metrics through Jolokia’s HTTP endpoint; the poller app reads them at regular intervals and indexes them into Elasticsearch]

Additional Business App Requirement

To expose the JMX data over HTTP, simply add the Jolokia dependency to the business app:
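With Maven, that might translate to the following (Spring Boot 1.x auto-configures Jolokia when jolokia-core is on the classpath, and the parent POM manages its version):

```xml
<dependency>
  <groupId>org.jolokia</groupId>
  <artifactId>jolokia-core</artifactId>
</dependency>
```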


From this point on, one can query for any JMX metric via the HTTP endpoint exposed by Jolokia. By default, the full URL looks like this: /jolokia/read/<JMX_ObjectName>.

Custom-Made Broker

The broker app responsibilities include:

  • reading JMX metrics from the business app through the HTTP endpoint at regular intervals
  • sending them to Elasticsearch for indexing

My initial move was to use Spring Data, but it seems the current release is not compatible with the latest Elasticsearch version (5.x), as I got the following exception:

  Received message from unsupported version: [2.0.0] minimal compatible version is: [5.0.0]

Besides, Spring Data is based on entities, which implies deserializing from HTTP and serializing back again to Elasticsearch: that has a negative impact on performance for no real added value.

The code itself is quite straightforward:

@SpringBootApplication
@EnableScheduling
open class JolokiaElasticApplication {

  @Autowired lateinit var client: JestClient

  @Bean open fun template() = RestTemplate()

  @Scheduled(fixedRate = 5000)
  open fun transfer() {
    val result = template().getForObject(
      "http://localhost:8080/jolokia/read/org.springframework.boot:type=Endpoint,name=metricsEndpoint",
      String::class.java)
    val index = Index.Builder(result).index("metrics").type("metric").id(UUID.randomUUID().toString()).build()
    client.execute(index)
  }
}

fun main(args: Array<String>) {
  SpringApplication.run(JolokiaElasticApplication::class.java, *args)
}

Of course, it’s a Spring Boot application (line 1). To poll at regular intervals, it must be annotated with @EnableScheduling (line 2), and the polling method must be annotated with @Scheduled and parameterized with the interval in milliseconds (line 9).

In a Spring Boot application, calling HTTP endpoints is achieved through the RestTemplate. Once created (line 7), it’s a singleton and can be (re)used throughout the application. The call result is deserialized into a String (line 11).

The client to use is Jest. Jest offers a dedicated indexing API: it just requires the JSON string to be sent, along with the index name, the type name, and the document’s id (line 14). With the Spring Boot Elastic starter on the classpath, a JestClient instance is automatically registered in the bean factory. Just autowire it in the configuration (line 5) to use it (line 15).

At this point, launching the Spring Boot application will poll the business app at regular intervals for the specified metrics and send them to Elasticsearch. It’s of course quite crude (everything is hard-coded), but it gets the job done.


Despite the failing plugin, we managed to get the JMX data from the business application to Elasticsearch by using a dedicated Spring Boot app.

Published at DZone with permission of Nicolas Frankel.
