
Data Stream Using Apache Kafka and Camel Application

This article reviews how to create a Kafka data stream using Camel, providing code and an overview of Kafka and its benefits.

By BABU P · Mar. 29, 23 · Analysis


Apache Kafka is an event streaming platform that was developed by LinkedIn and later made open-source under the Apache Software Foundation. Its primary function is to handle high-volume real-time data streams and provide a scalable and fault-tolerant architecture for creating data pipelines, streaming applications, and microservices.

Kafka employs a publish-subscribe messaging model, in which data is sorted into topics, and publishers send messages to those topics. Subscribers can then receive those messages in real time. The platform offers a scalable and fault-tolerant architecture by spreading data across multiple nodes and replicating data across multiple brokers. This guarantees that data is consistently available, even if a node fails.
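Central to this model is that a record's key determines which partition of a topic it lands on, which is what preserves per-key ordering. As a rough illustration only (Kafka's default partitioner actually applies a murmur2 hash to the serialized key bytes; this sketch substitutes Java's `String.hashCode` for simplicity), the routing logic looks like:

```java
public class PartitionRouter {

    // Simplified stand-in for Kafka's default partitioner: records with the
    // same key always map to the same partition, so per-key ordering holds.
    // (Kafka itself murmur2-hashes the serialized key bytes.)
    static int partitionFor(String key, int numPartitions) {
        // Mask off the sign bit so the result is always non-negative
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int partitions = 3;
        System.out.println(partitionFor("order-42", partitions));
        System.out.println(partitionFor("order-42", partitions)); // same key, same partition
        System.out.println(partitionFor("order-43", partitions)); // may differ
    }
}
```

Because the mapping is deterministic, all messages for one key are consumed in the order they were produced, even though different keys may be processed in parallel on different partitions.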

Kafka's architecture is based on several essential components, including brokers, producers, consumers, and topics. Brokers manage the message queues and handle message persistence, while producers and consumers are responsible for publishing and subscribing to Kafka topics, respectively. Topics function as the communication channels through which messages are sent and received.

Kafka also provides an extensive range of APIs and tools to manage data streams and build real-time applications. Kafka Connect, one of its most popular tools and APIs, enables the creation of data pipelines that integrate with other systems. Kafka Streams, on the other hand, allows developers to build streaming applications using a high-level API.

In summary, Kafka is a robust and adaptable platform that can be used to construct real-time data pipelines and streaming applications. It has been widely adopted in various sectors, including finance, healthcare, e-commerce, and more.

To create a Kafka data stream using Camel, you can use the Camel-Kafka component, which is already included in Apache Camel. Below are the steps to follow for creating a Kafka data stream using Camel:

  1. Prepare a Kafka broker and create a topic for the data stream.
  2. Set up a new Camel project on your IDE and include the required Camel dependencies, including the Camel-Kafka component.
  3. Create a new Camel route within your project that defines the data stream. The route should use the Kafka component and specify the topic to which the data should be sent or received.
  4. Select the appropriate data format for the data stream. For instance, if you want to send JSON data, use the Jackson data format to serialize and deserialize the data.
  5. Launch the Camel context and the Kafka producer or consumer to start sending or receiving data.
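For step 2, a Maven build would typically declare dependencies along these lines (artifact IDs from the Camel Spring Boot starters; the `camel.version` property is a placeholder you would pin to your Camel release):

```xml
<dependencies>
    <!-- Core Camel plus Spring Boot integration -->
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-spring-boot-starter</artifactId>
        <version>${camel.version}</version>
    </dependency>
    <!-- Kafka component -->
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-kafka-starter</artifactId>
        <version>${camel.version}</version>
    </dependency>
    <!-- JDBC component for the database endpoints -->
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-jdbc-starter</artifactId>
        <version>${camel.version}</version>
    </dependency>
</dependencies>
```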

Overall, using the Camel-Kafka component with Apache Camel is a simple way to create data streams between applications and a Kafka cluster.

Here is the code for reading a table from an Oracle database and writing the rows to a Kafka cluster (Apache Camel producer application):

Java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.kafka.KafkaConstants;
import org.springframework.stereotype.Component;

@Component
public class OracleDBToKafkaRouteBuilder extends RouteBuilder {

    @Override
    public void configure() throws Exception {

        // The jdbc component looks up a javax.sql.DataSource in the registry;
        // "oracleDS" must be registered as a Spring bean pointing at
        // jdbc:oracle:thin:@localhost:1521:orcl with the right credentials.
        String oracleDBTable = "mytable";
        String selectQuery = "SELECT * FROM " + oracleDBTable;

        // Configure Kafka endpoint; the value serializer is an endpoint option
        String kafkaEndpoint = "kafka:my-topic?brokers=localhost:9092"
                + "&valueSerializer=org.apache.kafka.common.serialization.StringSerializer";

        from("timer:oracleDBPoller?period=5000")

            // The jdbc component executes the SQL it finds in the message body,
            // so the body must be set before calling the endpoint
            .setBody(constant(selectQuery))
            .to("jdbc:oracleDS")

            // The query result is a List<Map<String, Object>>; split it into rows
            .split(body())

            // Use the row's ID column as the Kafka message key
            .setHeader(KafkaConstants.KEY, simple("${body[ID]}"))
            .convertBodyTo(String.class)
            .to(kafkaEndpoint);
    }
}


Here is the code for reading from the Kafka topic and writing to the Oracle DB table (Apache Camel consumer application):

Java
import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class KafkaToOracleDBRouteBuilder extends RouteBuilder {

    @Override
    public void configure() throws Exception {

        // Configure Kafka endpoint; the value deserializer is an endpoint option
        String kafkaEndpoint = "kafka:my-topic?brokers=localhost:9092"
                + "&valueDeserializer=org.apache.kafka.common.serialization.StringDeserializer";

        // The jdbc component looks up a javax.sql.DataSource in the registry;
        // "oracleDS" must be registered as a Spring bean pointing at
        // jdbc:oracle:thin:@localhost:1521:orcl with the right credentials.
        String oracleDBTable = "mytable";

        from(kafkaEndpoint)

            // Each Kafka record arrives as a String; split multi-line payloads
            // into individual rows
            .split(body().tokenize("\n"))

            // Build the INSERT statement, then let the jdbc component execute
            // the SQL it finds in the message body
            .setBody(simple("INSERT INTO " + oracleDBTable + " VALUES(${body})"))
            .to("jdbc:oracleDS");
    }
}



Opinions expressed by DZone contributors are their own.
