
Kafka Tutorial: Generate Multiple Consumer Groups Dynamically With Spring-Kafka


Generate multiple consumer groups in minutes with Spring-Kafka.


Hey all, today I will show one way to generate multiple consumer groups dynamically with Spring-Kafka. Before taking that approach, let's do it with annotations: we create a configuration class consisting of a Spring @Bean that generates our KafkaListenerContainerFactory.

You may also like: Spring for Apache Kafka Part 1: Error Handling, Message Conversion, and Transaction Support.

Example consumer config file:

bootstrap.servers=127.0.0.1:9093
key.deserializer=org.apache.kafka.common.serialization.LongDeserializer
value.deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.json.trusted.packages=com.sample.pojo,java.util, java.lang
group.id=CONSUMER-1-GROUP
auto.offset.reset=earliest
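Before wiring this file into Spring-Kafka, it may help to see how a standard .properties file turns into the Map&lt;String, Object&gt; that DefaultKafkaConsumerFactory accepts. This is a minimal sketch with a hypothetical class and method name (not from the article's code), using only java.util.Properties:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

// Hypothetical helper mirroring what the article's GeneralUtils does with a
// file on disk: parse java.util.Properties text into the Map<String, Object>
// shape that DefaultKafkaConsumerFactory expects.
public class ConsumerProps {

    public static Map<String, Object> toConfigMap(String propertiesText) throws IOException {
        Properties prop = new Properties();
        prop.load(new StringReader(propertiesText));
        Map<String, Object> configProps = new HashMap<>();
        for (String name : prop.stringPropertyNames()) {
            configProps.put(name, prop.getProperty(name));
        }
        return configProps;
    }

    public static void main(String[] args) throws IOException {
        String raw = String.join("\n",
                "bootstrap.servers=127.0.0.1:9093",
                "group.id=CONSUMER-1-GROUP",
                "auto.offset.reset=earliest");
        Map<String, Object> config = toConfigMap(raw);
        System.out.println(config.get("group.id")); // prints CONSUMER-1-GROUP
    }
}
```

In the real code below, the same parsing is done against a file path supplied as a JVM system property.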

 

public class GeneralUtils {

    // Reads the given .properties file into the Map<String, Object> shape
    // that DefaultKafkaConsumerFactory expects. Note: the original null check
    // on the stream was dead code (FileInputStream throws FileNotFoundException
    // rather than returning null), so a try-with-resources is all we need.
    public static Map<String, Object> provideKafkaConfig(String property) {
        Map<String, Object> configProps = new HashMap<>();
        try (InputStream input = new FileInputStream(property)) {
            Properties prop = new Properties();
            prop.load(input);
            for (String key : prop.stringPropertyNames()) {
                configProps.put(key, prop.getProperty(key));
            }
        } catch (IOException e) {
            log.error(GeneralUtils.getExcStackTrace(e));
        }
        return configProps;
    }
}


@ComponentScan({"com.sample.config.*"})
@EnableKafka
@Slf4j
@Configuration
public class KafkaConsumerConfig {


    public Map<String, Object> configMap(boolean isOnline) {
        Map<String, Object> configProps = new HashMap<>();
        // path of the consumer .properties file, supplied as a JVM system property
        String property = System.getProperty("kafkaConsumerPropFilePath");
        if (null != property)
            configProps = GeneralUtils.provideKafkaConfig(property);
        return configProps;
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<Long, LogDay>> onlineKafkaListenerContainerFactory() {

        Map<String, Object> propMap = configMap(true);
        ConcurrentKafkaListenerContainerFactory<Long, LogDay> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConcurrency(5);
        factory.getContainerProperties().setPollTimeout(1000L);
        factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(propMap));
        return factory;
    }

}


Then, we define our listener method, which uses our KafkaListenerContainerFactory.

@KafkaListener(topics = "TEST-TOPIC", groupId = "CONSUMER-1-GROUP",
        containerFactory = "onlineKafkaListenerContainerFactory")
public void messageListener(ConsumerRecord<Long, LogDay> consumerRecord) {
//....// 
}


So, how about reading our consumer groups from a file or a database and generating them dynamically?

Just check the example below. We don't use any Spring-Kafka annotations, and we don't define a KafkaListenerContainerFactory as a @Bean.

We generate an object from the DefaultKafkaConsumerFactory class with a Kafka consumer property map. Then, we generate a ContainerProperties object with the related topic and set our message listener inside it.

Finally, we generate our ConcurrentMessageListenerContainer and start it.

@Slf4j
@Component
public class ListenerBean {

    public Map<String, Object> configMap(String propName) {
        Map<String, Object> configProps = new HashMap<>();
        // path of the consumer .properties file, supplied as a JVM system property
        String property = System.getProperty(propName);
        if (null != property)
            configProps = GeneralUtils.provideKafkaConfig(property);
        return configProps;
    }
    @EventListener
    public void handleEvent(ContextRefreshedEvent event) {


        List<TargetSystemDto> targetSystemDtoList = new ArrayList<>();

        TargetSystemDto targetSystemDto = new TargetSystemDto();
        targetSystemDto.setSystemName("TARGET-SYSTEM-1");
        targetSystemDto.setConsumerPropName("targetSystem1PropFilePath");
        targetSystemDto.setTopicName("TEST-TOPIC");
        targetSystemDto.setWsUrl("https://sample.com.tr:443/sampleRestApi");
        targetSystemDtoList.add(targetSystemDto);

        targetSystemDto = new TargetSystemDto();
        targetSystemDto.setSystemName("TARGET-SYSTEM-2");
        targetSystemDto.setConsumerPropName("targetSystem2PropFilePath");
        targetSystemDto.setTopicName("TEST-TOPIC");
        targetSystemDto.setWsUrl("https://sample.com.tr:443/sampleRestApi2");
        targetSystemDtoList.add(targetSystemDto);

        targetSystemDtoList.forEach(this::generateAndStartConsumerGroup);

    }

    private void generateAndStartConsumerGroup(TargetSystemDto dto) {

        Map<String, Object> propMap = configMap(dto.getConsumerPropName());
        DefaultKafkaConsumerFactory<Long, PaymentPojo> factory = new DefaultKafkaConsumerFactory<>(propMap);
        ContainerProperties containerProperties = new ContainerProperties(dto.getTopicName());
        containerProperties.setMessageListener(
                (MessageListener<Long, PaymentPojo>) messageObject -> {
                    PaymentPojo paymentPojo = messageObject.value();
                    /* for instance, do some condition check for the related
                       consumer group and call a REST API which has standard
                       parameters, or transfer it to a different topic
                       ...... */
                });
        ConcurrentMessageListenerContainer<Long, PaymentPojo> container =
                new ConcurrentMessageListenerContainer<>(factory, containerProperties);
        container.start();
        log.info("{} consumer is configured and started", dto.getSystemName());
    }
}
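One caveat with containers started by hand like this: Spring does not manage their lifecycle, so nothing stops them on shutdown. A minimal sketch of keeping the started containers in a registry so they can all be stopped together (all names here are hypothetical; the `Lifecycle` interface stands in for the Spring interface that `ConcurrentMessageListenerContainer` actually implements):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Stand-in for Spring's Lifecycle, which ConcurrentMessageListenerContainer implements.
interface Lifecycle {
    void start();
    void stop();
    boolean isRunning();
}

// Hypothetical registry: keep each started container keyed by target system
// name so they can be stopped together (e.g. from a @PreDestroy method).
public class ConsumerRegistry {
    private final Map<String, Lifecycle> containers = new ConcurrentHashMap<>();

    public void register(String systemName, Lifecycle container) {
        containers.put(systemName, container);
        container.start();
    }

    public void stopAll() {
        containers.values().forEach(Lifecycle::stop);
    }
}

// Minimal fake used to exercise the registry without a Kafka broker.
class FakeContainer implements Lifecycle {
    private boolean running;
    public void start() { running = true; }
    public void stop() { running = false; }
    public boolean isRunning() { return running; }
}
```

In generateAndStartConsumerGroup, you would register the container instead of only calling container.start(), and call stopAll() from a bean destroy callback so every dynamic consumer group leaves the cluster cleanly.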



Topics:
big data, kafka tutorial, spring-kafka, java, consumer group, tutorial

Opinions expressed by DZone contributors are their own.
