
Apache Camel Integration with Kafka

This article covers the integration of Apache Camel with Kafka, from setup to testing, with code blocks and plenty of pictures!

By Vignesh Prabhu · Feb. 28, 2021 · Tutorial
16.0K Views


This article walks through integrating Apache Camel with Kafka, from setting up a Kafka cluster to wiring Camel routes and testing them.

Setup:

Kafka Setup

We will launch Kafka as a Docker container.

docker-compose.yml

YAML

version: '2'

services:
  # this is our Kafka cluster
  kafka-cluster:
    image: landoop/fast-data-dev:cp3.3.0
    environment:
      ADV_HOST: 127.0.0.1         # Change to 192.168.99.100 if using Docker Toolbox
      RUNTESTS: 0                 # Disable running tests so the cluster starts faster
      FORWARDLOGS: 0              # Disable the file source connectors that bring application logs into Kafka topics
      SAMPLEDATA: 0               # Do not create sea_vessel_position_reports, nyc_yellow_taxi_trip_data, reddit_posts topics with sample Avro records
    ports:
      - 2181:2181                 # ZooKeeper
      - 3030:3030                 # Landoop UI
      - 8081-8083:8081-8083       # REST Proxy, Schema Registry, Kafka Connect
      - 9581-9585:9581-9585       # JMX
      - 9092:9092                 # Kafka broker



From the directory containing the docker-compose.yml file, run the command below and observe that the Kafka cluster starts successfully.

PowerShell

docker-compose up

Once the cluster has started successfully, open the Kafka console at http://localhost:3030; the Landoop Kafka Development Environment UI opens.
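Before wiring any Camel routes, it can help to sanity-check that the broker port published by the compose file is actually reachable from the host. A minimal JDK-only sketch (BrokerCheck is an illustrative helper, not part of the article's projects):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class BrokerCheck {
    // Attempts a plain TCP connection to host:port within timeoutMs milliseconds
    public static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Port 9092 is the Kafka broker mapping from docker-compose.yml
        System.out.println(reachable("127.0.0.1", 9092, 2000)
                ? "Kafka port is open"
                : "Kafka port is not reachable");
    }
}
```

This only confirms the port is open, not that the broker is healthy; the Landoop console on port 3030 remains the better health check.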

Let us create two Spring Boot Camel microservices, camel-demo-a and camel-demo-b.

camel-demo-a will publish data to a Kafka topic, which camel-demo-b will consume.

In the pom.xml of both microservices, add the dependency below.

XML

<dependency>
    <groupId>org.apache.camel.springboot</groupId>
    <artifactId>camel-kafka-starter</artifactId>
    <version>3.8.0</version>
</dependency>



Configure the Kafka broker URL in application.properties:

Properties files

camel.component.kafka.brokers=localhost:9092



Configuring the KafkaSenderRoute in camel-demo-a

The route is configured to read files from an input folder and publish their contents to a Kafka topic:

Java

package com.vignesh.cameldemoa.routes.a;

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class KafkaSenderRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Poll files from files/input and publish each file's contents to the mytopic Kafka topic
        from("file:files/input")
                .to("kafka:mytopic");
    }
}



Configuring the KafkaReceiverRoute in camel-demo-b

Let us assume that the sender route publishes a JSON message, which we will unmarshal and process.

To the pom.xml of the camel-demo-b application, add the dependency below:

XML

<dependency>
    <groupId>org.apache.camel.springboot</groupId>
    <artifactId>camel-jackson-starter</artifactId>
    <version>3.8.0</version>
</dependency>



Creating the Model class:

Java

package com.vignesh.cameldemob.model;

public class Employee {
    private int id;
    private String name;

    public Employee() {
    }

    public Employee(int id, String name) {
        this.id = id;
        this.name = name;
    }

    public int getId() {
        return id;
    }

    public String getName() {
        return name;
    }

    @Override
    public String toString() {
        return "Employee{" +
                "id=" + id +
                ", name='" + name + '\'' +
                '}';
    }
}
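For reference, the JSON shape that maps onto this model is a flat object with id and name fields. A small sketch that builds such a payload (EmployeeJson and the sample values are illustrative, not part of the article's code):

```java
public class EmployeeJson {
    // Builds the flat JSON object that Jackson will unmarshal into Employee
    static String toJson(int id, String name) {
        return String.format("{\"id\":%d,\"name\":\"%s\"}", id, name);
    }

    public static void main(String[] args) {
        System.out.println(toJson(1, "John")); // {"id":1,"name":"John"}
    }
}
```

Note that Employee keeps a no-args constructor; Jackson needs it to instantiate the object during unmarshalling.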



The route is configured to consume messages from the Kafka topic, unmarshal the JSON body with Jackson, and hand the resulting Employee to a bean for processing.

Java

package com.vignesh.cameldemob.route.b;

import com.vignesh.cameldemob.model.Employee;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.JsonLibrary;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class KafkaReceiverRoute extends RouteBuilder {
    @Autowired
    GetEmployee getEmployee;

    @Override
    public void configure() throws Exception {
        // Consume from mytopic, unmarshal the JSON body into an Employee, then hand it to the bean
        from("kafka:mytopic")
                .unmarshal().json(JsonLibrary.Jackson, Employee.class)
                .bean(getEmployee)
                .to("log:myloggingqueue");
    }
}

@Component
class GetEmployee {
    Logger logger = LoggerFactory.getLogger(GetEmployee.class);

    public void getData(Employee employee) {
        logger.info("Emp data: " + employee.getId());
    }
}
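Camel's bean binding selects GetEmployee's single public method, getData, and passes it the unmarshalled message body. A JDK-only sketch of that hand-off (the nested Employee class and the returned string stand in for the article's model and its SLF4J log line; both are illustrative):

```java
public class BeanBindingSketch {
    // Trimmed stand-in for the article's Employee model
    static class Employee {
        final int id;
        final String name;

        Employee(int id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    // Mirrors GetEmployee.getData, returning the log line instead of writing it via SLF4J
    static String getData(Employee employee) {
        return "Emp data: " + employee.id;
    }

    public static void main(String[] args) {
        System.out.println(getData(new Employee(1, "John"))); // Emp data: 1
    }
}
```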



Testing:

Start the camel-demo-a application and place a JSON file, sample.json, in the files/input folder.

The file will be read and its contents published to the Kafka topic; you can verify this on the Kafka console's Topics view.
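To script this step, a small JDK-only helper can drop a sample payload into the folder the file endpoint polls (the file name and field values are illustrative; the path matches the from("file:files/input") URI):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class DropSampleFile {
    public static void main(String[] args) throws Exception {
        Path input = Paths.get("files", "input");
        Files.createDirectories(input);                 // the folder camel-demo-a's file endpoint polls
        String json = "{\"id\":1,\"name\":\"John\"}";   // payload matching the Employee model
        Path file = input.resolve("sample.json");
        Files.write(file, json.getBytes());
        System.out.println("Wrote " + file);
    }
}
```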

Start the camel-demo-b application. Observe that the route consumes the message from the Kafka topic, unmarshals the JSON, and performs the further processing.

