
Natively Compiling Micronaut Microservices Using GraalVM for Insanely Faster Startups


Learn the steps to set up Micronaut with GraalVM and libraries for faster microservice application startup.


The Micronaut framework is a microservice framework that will feel immediately familiar to developers who know Spring Boot or MicroProfile. It certainly felt that way to me, and this is by design: it makes it easier for developers to consider moving over to this new framework. But why should you? Micronaut takes a different approach to enabling everything we developers take for granted in Spring Boot and MicroProfile. Rather than performing annotation processing at runtime, as Spring Boot and MicroProfile do, Micronaut uses annotation processors at compile time to generate additional classes that are compiled alongside your code. This reduces startup time, because far less work is needed to scan the classpath of your project at runtime. In fact, Micronaut avoids reflection as much as possible, using it only where absolutely necessary.

The benefit of this is obvious. Where Spring Boot and MicroProfile applications can take tens of seconds to start (depending on the complexity of the classpath that must be scanned), Micronaut starts on my machine in less than a second — normally around 650ms, in fact.

Despite this, Micronaut offers everything you've come to expect from a microservices framework — dependency injection, convention over configuration, service discovery, routing, etc.
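To give a feel for what that looks like in practice, here is a minimal controller of the kind you might add to a Micronaut project. The hello.world package matches the app we create below; the HelloController itself is my own illustration, not something the CLI generates for you.

// src/main/java/hello/world/HelloController.java
package hello.world;

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

@Controller("/hello")
public class HelloController {

    // The route is resolved from these annotations at compile time,
    // so no classpath scanning or reflection is needed at startup.
    @Get
    public String index() {
        return "Hello World";
    }
}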

This is cool enough on its own, and it is great for testing: starting a server from a clean build is much less painful when you're only waiting a second or so. But I wanted to push further and use GraalVM to compile the Java code down to a native image. This should give us an even faster startup, making it even more appealing for serverless use cases where you pay only for execution time.

So, what does it take to compile a Micronaut application down to native code with GraalVM? Here's a quick tutorial on what I had to do:

Firstly, you need to install GraalVM itself. This is essentially JDK 8 with additional tools (such as the one we will use later to create a native image). You can download GraalVM from the website, or you can use a tool like SDKman to download it onto your system. Here are the instructions for installing GraalVM with SDKman:

sdk install java 1.0.0-rc6-graal
sdk use java 1.0.0-rc6-graal

With GraalVM installed, we need to install the SubstrateVM library into our local Maven cache. SubstrateVM is a small virtual machine written in Java that GraalVM compiles together with our application code to provide garbage collection, memory management, and so on.

mvn install:install-file -Dfile=${JAVA_HOME}/jre/lib/svm/builder/svm.jar \
                           -DgroupId=com.oracle.substratevm \
                           -DartifactId=svm \
                           -Dversion=GraalVM-1.0.0-rc6 \
                           -Dpackaging=jar

Assuming that we've already installed the Micronaut CLI, we can then create a Graal native microservice using the following command:

mn create-app hello-world --features graal-native-image
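The generated project contains a standard Micronaut bootstrap class as its entry point; it looks roughly like this:

// src/main/java/hello/world/Application.java
package hello.world;

import io.micronaut.runtime.Micronaut;

public class Application {

    public static void main(String[] args) {
        // Starts the embedded server; beans are wired from classes
        // generated at compile time rather than via runtime reflection.
        Micronaut.run(Application.class);
    }
}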

Once the project has been created, we can change into its directory, build the code, and run a Micronaut tool that generates a report in build/reflect.json describing the reflection that occurs within the application. This report is fed to the GraalVM compiler so that it knows how to handle that reflection when producing the native image.

./gradlew assemble
java -cp build/libs/hello-world-0.1-all.jar io.micronaut.graal.reflect.GraalClassLoadingAnalyzer
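The exact contents of build/reflect.json depend on your application, but the entries follow GraalVM's standard reflection configuration format; the class name below is just a placeholder:

[
  {
    "name": "some.reflectively.AccessedClass",
    "allDeclaredConstructors": true,
    "allPublicMethods": true
  }
]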

With this, we can then use the GraalVM native-image tool to generate a native version of our code. The following command is what ended up working for me:

native-image --class-path build/libs/hello-world-0.1-all.jar \
  -H:ReflectionConfigurationFiles=build/reflect.json \
  -H:EnableURLProtocols=http \
  -H:IncludeResources="logback.xml|application.yml|META-INF/services/*.*" \
  -H:Name=hello-world \
  -H:Class=hello.world.Application \
  -H:+ReportUnsupportedElementsAtRuntime \
  -H:+AllowVMInspection \
  --delay-class-initialization-to-runtime=io.netty.handler.codec.http.HttpObjectEncoder,io.netty.handler.codec.http.websocketx.WebSocket00FrameEncoder

If that completes successfully, you can now run the natively compiled version of the application as usual. On my machine, this is what I see:

~/Code/projects/micronaut/hello-world $ ./hello-world
15:16:14.827 [main] INFO  io.micronaut.runtime.Micronaut - Startup completed in 22ms. Server Running: http://localhost:8080

I can access my microservice at the specified URL as usual, but the startup time has dropped to 22ms! That's incredibly fast!
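If you added a controller like the illustrative HelloController sketched earlier, you can hit it just as you would with the JVM version:

curl http://localhost:8080/hello
Hello World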

I've got a bunch more experiments and cool things underway. I'll talk about those on this blog, but the best way to keep informed is to follow me on Twitter.
