TopicRecordNameStrategy in Kafka
Here, learn about TopicRecordNameStrategy and how to use it in Kafka to publish different event types belonging to the same entity on a single topic.
Apache Kafka is a widely used streaming platform. Kafka works on the pub-sub model: an application that publishes a message is called a Producer, and an application that consumes it is called a Consumer.
Topics are used to exchange messages between systems. In a distributed environment, many different kinds of messages may need to be exchanged, and the number of topics required can grow quickly.
To normalize how topics are created, one can take advantage of TopicRecordNameStrategy, where a single topic can be used for a given entity; for example, a users topic can carry both UserCreated and UserUpdated records. Events such as create, update, and delete of a user can all be published on the same topic.
Each record in a Kafka topic consists of a key, a value, and a timestamp. Confluent Schema Registry provides several strategies for naming the subjects under which record schemas are registered, including TopicRecordNameStrategy. In this article, we will explore how to use TopicRecordNameStrategy to publish messages of two different event types to the same Kafka topic.
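To make the naming concrete, here is a small stand-alone sketch (an illustrative re-implementation, not Confluent's actual class) of how TopicRecordNameStrategy composes a Schema Registry subject from the topic name and the record's fully qualified name:

```java
public class SubjectNameSketch {
    // TopicRecordNameStrategy names the subject "<topic>-<fully qualified record name>".
    // Illustration only; the real logic lives in Confluent's serializer library.
    static String subjectFor(String topic, String recordFullName) {
        return topic + "-" + recordFullName;
    }

    public static void main(String[] args) {
        System.out.println(subjectFor("users", "com.messages.events.user.UserCreated"));
        // -> users-com.messages.events.user.UserCreated
        System.out.println(subjectFor("users", "com.messages.events.user.UserUpdated"));
        // -> users-com.messages.events.user.UserUpdated
    }
}
```

Because the record name is part of the subject, each event type keeps its own schema and compatibility checks even though all records share one topic.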
1. Create a topic "users" by running the following command (Kafka 3.x uses --bootstrap-server; the older --zookeeper flag has been removed):
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic users
2. Configure the Kafka producer to use TopicRecordNameStrategy by setting the value.subject.name.strategy property. Because this is a Schema Registry subject naming strategy, the producer must also use the Avro serializer and point at a Schema Registry instance:
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("schema.registry.url", "http://localhost:8081");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
props.put("value.subject.name.strategy", "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy");
KafkaProducer<String, Object> producer = new KafkaProducer<>(props);
3. Define a class for each event type and serialize both with the same producer instance. TopicRecordNameStrategy derives the Schema Registry subject from the topic name plus the fully qualified record name, so each event type gets its own subject (and schema) while sharing the topic.
public class UserCreated {
private String id;
private String firstName;
private String lastName;
private String email;
//getters and setters
}
public class UserUpdated {
private String id;
private String firstName;
private String lastName;
private String email;
private Long updatedAt;
//getters and setters
}
4. Define an Avro schema for each of the two events, UserCreated and UserUpdated.
{
"type": "record",
"name": "UserCreated",
"namespace": "com.messages.events.user",
"fields": [
{"name": "id", "type": "string"},
{"name": "firstName", "type": "string"},
{"name": "lastName", "type": "string"},
{"name": "email", "type": "string"}
]
}
{
"type": "record",
"name": "UserUpdated",
"namespace": "com.messages.events.user",
"fields": [
{"name": "id", "type": "string"},
{"name": "firstName", "type": "string"},
{"name": "lastName", "type": "string"},
{"name": "email", "type": "string"},
{"name": "updatedAt", "type": "long"}
]
}
5. Create a ProducerRecord for each event and publish both to Kafka. Notice that both records use the same topic, users:
UserCreated userCreated = new UserCreated("1000", "userFirstName", "userLastName", "userOne@email.com");
ProducerRecord<String, Object> userCreatedRecord = new ProducerRecord<>("users", userCreated.getId(), userCreated);
producer.send(userCreatedRecord);
UserUpdated userUpdated = new UserUpdated("1000", "updatedUserFirstName", "updatedUserLastName", "userOne@email.com", Instant.now().toEpochMilli());
ProducerRecord<String, Object> userUpdatedRecord = new ProducerRecord<>("users", userUpdated.getId(), userUpdated);
producer.send(userUpdatedRecord);
In this example, TopicRecordNameStrategy registers a separate schema subject for each event type while both records are published to the same users topic.
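On the consuming side, records of both types arrive interleaved on the users topic, so the consumer typically dispatches on the concrete type of each deserialized value. A minimal, dependency-free sketch of that dispatch (the UserCreated/UserUpdated records below are stand-ins for the deserialized Avro types, not the article's actual classes):

```java
public class EventDispatchSketch {
    // Stand-in event types; in a real consumer these would be the
    // Avro-generated UserCreated/UserUpdated classes.
    record UserCreated(String id, String email) {}
    record UserUpdated(String id, long updatedAt) {}

    // Dispatch a deserialized record value based on its concrete type.
    static String handle(Object event) {
        if (event instanceof UserCreated c) {
            return "created user " + c.id();
        } else if (event instanceof UserUpdated u) {
            return "updated user " + u.id() + " at " + u.updatedAt();
        }
        return "ignored unknown event " + event.getClass().getSimpleName();
    }

    public static void main(String[] args) {
        System.out.println(handle(new UserCreated("1000", "userOne@email.com")));
        System.out.println(handle(new UserUpdated("1000", 1700000000000L)));
    }
}
```

In a real consumer, the same branching would run inside the poll loop over ConsumerRecords, with an else branch guarding against event types added later.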
Conclusion
In summary, messages of different event types can be published to the same Kafka topic, with each type registered under its own subject name. This lets you avoid creating multiple Kafka topics for different event types of the same entity.
Opinions expressed by DZone contributors are their own.