Testing Spring Cloud Stream Kafka Bindings Using EmbeddedKafkaRule
Spring Cloud Stream is a framework built on top of Spring Boot and Spring Integration that helps in creating event-driven or message-driven microservices.
In this article, we'll introduce the main concepts and constructs of Spring Cloud Stream with some simple test examples based on EmbeddedKafkaRule and MessageCollector.
Dependencies & Configuration
To get started, we'll need to add the Spring Cloud Starter Stream with the Kafka broker Gradle dependency to our build.gradle:
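The dependency block itself is not reproduced above; a minimal sketch is shown below (artifact coordinates are the standard ones, with versions managed by the Spring Cloud BOM; the test-support dependencies are included because they are used later in the article):

```groovy
// build.gradle — Spring Cloud Stream with the Kafka binder.
// Versions are resolved via the Spring Cloud dependency-management BOM.
dependencies {
    implementation 'org.springframework.cloud:spring-cloud-starter-stream-kafka'

    // Test binder (provides MessageCollector) and embedded Kafka broker
    testImplementation 'org.springframework.cloud:spring-cloud-stream-test-support'
    testImplementation 'org.springframework.kafka:spring-kafka-test'
}
```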
The Spring Cloud Stream project needs to be configured with the Kafka broker URL, topic, and other binder configurations. Below is an example of a configuration for the application.yaml:
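The configuration file is not shown in the text; a minimal sketch might look like this (the broker address, topic, and group names are illustrative):

```yaml
# application.yaml — binder and binding configuration (values illustrative)
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost:9092
      bindings:
        input:                      # inbound channel name
          destination: test-topic   # Kafka topic to consume from
          group: test-group         # consumer group
        output:                     # outbound channel name
          destination: test-topic   # Kafka topic to publish to
```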
This is a simple Spring Cloud Stream-based service that listens to input binding (SpringCloudStreamBindingKafkaApp.kt):
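The listing itself is not reproduced above; a minimal sketch of such an application class, assuming the framework's ready-made Sink binding for the input channel, could be:

```kotlin
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.cloud.stream.annotation.EnableBinding
import org.springframework.cloud.stream.messaging.Sink

// SpringCloudStreamBindingKafkaApp.kt — binds the framework's Sink
// interface, whose "input" channel is connected to a Kafka topic.
@EnableBinding(Sink::class)
@SpringBootApplication
class SpringCloudStreamBindingKafkaApp

fun main(args: Array<String>) {
    runApplication<SpringCloudStreamBindingKafkaApp>(*args)
}
```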
The @EnableBinding annotation configures the service to bind the input and output channels.
Now let's see the main concepts:
Bindings: a collection of interfaces that identify the input and output channels declaratively
Binder: the messaging-middleware implementation, such as Kafka
Channel: represents the communication pipe between messaging middleware and the application
StreamListeners: message-handling methods on beans that are automatically invoked for each message from the channel, after the MessageConverter performs serialization or deserialization between middleware-specific events and domain object types (POJOs)
Message Schemas: used for serialization and deserialization of messages, these schemas can be statically read from a location or loaded dynamically
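To make the Bindings concept concrete, a custom binding interface might be sketched as follows (the channel names are illustrative; the framework also ships ready-made Source, Sink, and Processor interfaces):

```kotlin
import org.springframework.cloud.stream.annotation.Input
import org.springframework.cloud.stream.annotation.Output
import org.springframework.messaging.MessageChannel
import org.springframework.messaging.SubscribableChannel

// A declarative bindings interface: each method identifies one channel,
// which the binder connects to messaging middleware such as Kafka.
interface MyBindings {

    @Input("input")              // inbound channel, bound to a Kafka topic
    fun input(): SubscribableChannel

    @Output("output")            // outbound channel, bound to a Kafka topic
    fun output(): MessageChannel
}
```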
We will need at least one producer and one consumer to test the message send and receive operations. Below is sample code for a producer and a consumer in their simplest form, developed using Spring Cloud Stream.
There is a producer bean that will send messages to a Kafka topic (ProducerBinding.kt):
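The listing is not shown above; a minimal sketch of such a producer bean, assuming the framework's Source binding and a hypothetical sendMessage method, could be:

```kotlin
import org.springframework.cloud.stream.annotation.EnableBinding
import org.springframework.cloud.stream.messaging.Source
import org.springframework.messaging.support.MessageBuilder

// ProducerBinding.kt — publishes payloads to the bound output channel,
// which the Kafka binder connects to a topic.
@EnableBinding(Source::class)
class ProducerBinding(private val source: Source) {

    fun sendMessage(payload: String) {
        // Wrap the payload in a Spring Message and send it through the
        // output channel; the binder delivers it to Kafka.
        source.output().send(MessageBuilder.withPayload(payload).build())
    }
}
```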
A consumer bean will listen to a Kafka topic and receive messages (ConsumerBinding.kt):
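The consumer listing is likewise not shown; a minimal sketch, assuming the framework's Sink binding, could be:

```kotlin
import org.springframework.cloud.stream.annotation.EnableBinding
import org.springframework.cloud.stream.annotation.StreamListener
import org.springframework.cloud.stream.messaging.Sink

// ConsumerBinding.kt — receives messages arriving on the bound input channel.
@EnableBinding(Sink::class)
class ConsumerBinding {

    @StreamListener(Sink.INPUT)
    fun handle(payload: String) {
        // Invoked automatically for each message from the Kafka topic,
        // after the MessageConverter has deserialized the payload.
        println("Received: $payload")
    }
}
```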
For the consumer test, a Kafka broker with a topic is needed. We will use an embedded Kafka server provided by spring-kafka-test.
Functional Testing Using MessageCollector
MessageCollector is part of the test binder implementation, which allows interaction with channels and inspection of any messages sent to them. We send a message to the producer binding's output channel and then receive it as a payload (ProducerTest.kt):
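The test listing is not reproduced above; a minimal sketch using the test binder's MessageCollector (no real broker is required; the payload value is illustrative) could be:

```kotlin
import org.junit.Test
import org.junit.runner.RunWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.cloud.stream.messaging.Source
import org.springframework.cloud.stream.test.binder.MessageCollector
import org.springframework.messaging.support.MessageBuilder
import org.springframework.test.context.junit4.SpringRunner

// ProducerTest.kt — the test binder's MessageCollector captures every
// message sent to a channel, so the send can be verified in-memory.
@RunWith(SpringRunner::class)
@SpringBootTest
class ProducerTest {

    @Autowired
    lateinit var source: Source

    @Autowired
    lateinit var collector: MessageCollector

    @Test
    fun testMessageIsCollected() {
        // Send a message through the output channel of the Source binding.
        source.output().send(MessageBuilder.withPayload("test-payload").build())

        // The collector exposes a BlockingQueue per channel; poll the
        // captured message. (Depending on the configured content type,
        // the payload may arrive in serialized form.)
        val message = collector.forChannel(source.output()).poll()
        assert(message != null)
    }
}
```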
Embedded Kafka broker testing
We use the @ClassRule annotation to create this Kafka broker. The rule starts the Kafka and ZooKeeper servers on a random port before the tests run and shuts them down after they complete. The embedded Kafka broker eliminates the need for a real Kafka and ZooKeeper instance while the test is running (ConsumerTest.kt):
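The full test listing is not shown; a minimal sketch of an EmbeddedKafkaRule-based test (the topic name is illustrative, and the final assertion against the consumer binding is left as a comment since it depends on the application context wiring) could be:

```kotlin
import org.junit.ClassRule
import org.junit.Test
import org.springframework.kafka.core.DefaultKafkaProducerFactory
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.kafka.test.rule.EmbeddedKafkaRule
import org.springframework.kafka.test.utils.KafkaTestUtils

// ConsumerTest.kt — the embedded broker is started once per test class.
class ConsumerTest {

    companion object {
        // Starts one embedded Kafka broker (with ZooKeeper) on a random
        // port and creates the topic before any test runs; shuts both
        // down after the tests complete.
        @ClassRule
        @JvmField
        val embeddedKafka = EmbeddedKafkaRule(1, false, "test-topic")
    }

    @Test
    fun testMessageReachesBroker() {
        // Build a KafkaTemplate pointed at the embedded broker.
        // KafkaTestUtils.producerProps configures Integer keys and
        // String values by default.
        val props = KafkaTestUtils.producerProps(embeddedKafka.embeddedKafka)
        val template = KafkaTemplate(DefaultKafkaProducerFactory<Int, String>(props))
        template.defaultTopic = "test-topic"
        template.sendDefault("test-payload")

        // In the complete test, the application context would be started
        // with spring.cloud.stream.kafka.binder.brokers set to
        // embeddedKafka.embeddedKafka.brokersAsString, and the consumer
        // binding asserted to have received the payload.
    }
}
```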
In this tutorial, we introduced the main concepts of Spring Cloud Stream, showed how to use it with Kafka, and demonstrated complete JUnit testing based on EmbeddedKafkaRule and MessageCollector.
You can find the complete source code here.
Opinions expressed by DZone contributors are their own.