Camel Kafka Connector: No Code, No Hassle Integrations
In this article, we discuss how the combination of Camel components and Kafka makes integrations with Kafka easier, more stable, and more versatile.
In this article, we are going to discuss Camel Kafka Connectors. Apache Camel has more than 300 components for integrating different endpoints and protocols. This combination of Camel components and Kafka makes integrations with Kafka even easier, more stable, and more versatile. Also, not a single line of code is required.
We can find more details about Apache Camel Kafka Connectors in the community documentation. There are two types of connectors: source and sink. This is what I want to highlight, as per the documentation:
Camel-Kafka Source Connector is a pre-configured Camel consumer which will perform the same action at a fixed rate and send the exchanges to Kafka, while a Camel-Kafka Sink Connector is a pre-configured Camel producer which will perform the same operation on each message exported from Kafka.
In this article, we will implement a Kafka sink connector based on the Camel SSH component. This example is based on camel-kafka-connector-examples.
I have tested this on Fedora 33 with Apache Kafka 2.7.0 and Podman. We will use Podman to run the SSH server and the KafkaCat utility for sending Kafka messages.
So let us get started.
1. Let us first download camel-ssh-kafka-connector. At the time of writing this article, the version I downloaded was camel-ssh-kafka-connector-0.7.0-package.zip.
2. Extract it to a directory on your local disk.
3. Download Apache Kafka. At the time of writing this article, the latest version I downloaded was Kafka 2.7.
4. Start Kafka and create a topic for the test.
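A minimal sketch of this step, run from the extracted Kafka directory; the topic name testtopic is an assumption of this walkthrough:

```shell
# Start ZooKeeper, then the Kafka broker
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties &

# Create the topic the sink connector will consume from
# ("testtopic" is an assumed name for this walkthrough)
bin/kafka-topics.sh --create --topic testtopic \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
```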
5. Start SSH Server using Podman.
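A possible way to start the server; the image name docker.io/example/sshd and its ROOT_PASSWORD variable are placeholders — substitute whichever sshd image you actually use, as long as it permits root password login:

```shell
# Start an SSH server container listening on host port 2222
# (image name and env var below are placeholders, not a real image)
podman run -d --name ssh-server -p 2222:22 \
  -e ROOT_PASSWORD=root docker.io/example/sshd
```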
6. Set up the plugin path of the extracted connector in the Kafka Connect configuration.
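For example, in config/connect-standalone.properties; the extraction path below is a placeholder for wherever you unzipped the connector package:

```properties
# config/connect-standalone.properties
plugin.path=/home/user/connectors/camel-ssh-kafka-connector
```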
7. Set up the connector with a CamelSshSinkConnector.properties file, which holds the SSH sink configuration.
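A sketch of that file, assuming the SSH server from step 5 listens on localhost:2222 with root/root credentials and the topic is named testtopic (both are assumptions of this walkthrough):

```properties
name=CamelSshSinkConnector
connector.class=org.apache.camel.kafkaconnector.ssh.CamelSshSinkConnector
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
topics=testtopic
camel.sink.path.host=localhost
camel.sink.path.port=2222
camel.sink.endpoint.username=root
camel.sink.endpoint.password=root
```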
8. Run the connector in standalone mode. Being a POC on a one-node Kafka setup, we will run it with the connect-standalone script.
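A sketch of the command, run from the Kafka directory; it assumes the connector properties file from step 7 sits alongside it:

```shell
# Run Kafka Connect in standalone mode with the SSH sink connector
bin/connect-standalone.sh config/connect-standalone.properties \
  CamelSshSinkConnector.properties
```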
9. Create a file with Linux commands that create a file on the SSH server and then append some records to it.
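A minimal sketch of such a file; the file name sshCommands.txt comes from the next step, while the target path /tmp/fromKafka.txt is an assumption:

```shell
# Each line becomes one Kafka record, executed in turn by the SSH sink
cat > sshCommands.txt <<'EOF'
touch /tmp/fromKafka.txt
echo "record-1" >> /tmp/fromKafka.txt
echo "record-2" >> /tmp/fromKafka.txt
EOF
```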
10. Send the records within sshCommands.txt to Kafka using the KafkaCat utility. Here we use Podman to run a KafkaCat docker image for sending the messages.
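One way to do this with the edenhill/kafkacat image; the topic name testtopic is the assumption carried over from the earlier steps:

```shell
# Produce each line of sshCommands.txt as a record to the topic
podman run --rm -i --network host docker.io/edenhill/kafkacat:1.6.0 \
  -b localhost:9092 -t testtopic -P < sshCommands.txt
```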
11. Now, after sending messages to Kafka with the camel-ssh-kafka-connector sink already running, we expect that the SSH server we started earlier has received these commands from Kafka via the camel-ssh sink connector. The username and password of this SSH server are both root.
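We can verify by logging into the container ourselves; the port and file path below match the assumptions made in the earlier sketches:

```shell
# Log in to the SSH server (password: root) and check the result
ssh -p 2222 root@localhost 'cat /tmp/fromKafka.txt'
```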
12. Once tested, we can stop the Podman container.
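Assuming the container name ssh-server from the earlier sketch:

```shell
# Stop and remove the SSH server container
podman stop ssh-server
podman rm ssh-server
```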
13. We can finally stop Kafka and ZooKeeper. The connector instance can be stopped with Ctrl + C in its terminal, or by simply closing that terminal.
14. Another important point is to check the consumer group and offset details associated with the connector.
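Kafka Connect sink tasks consume under a group named connect-&lt;connector name&gt;, so with the connector name assumed in step 7 this can be inspected as:

```shell
# List all consumer groups, then describe the connector's group
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --list
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group connect-CamelSshSinkConnector
```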
That's it! I hope you found this article interesting and informative.
Opinions expressed by DZone contributors are their own.