How to Stream Data Between HiveMQ Cloud and Apache Kafka for Free

Looking to stream IoT data from an MQTT broker to Apache Kafka for free? The HiveMQ Cloud Kafka integration can help!

By Shashank Sharma · Feb. 26, 23 · Tutorial

According to a study published by Statista, IoT devices will produce 79 zettabytes of data in 2025, a 483% increase over 2019. To put this number into perspective: if we stored this information on smartphones with 128 GB of storage each, we would need about 617 billion of them. Yet, without further processing, this data is worth almost nothing. Only by transforming and analyzing it do you unlock the immense added value promised by the Internet of Things (IoT).
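
As a quick sanity check on that comparison (assuming decimal units, i.e., 1 ZB = 10^21 bytes and 128 GB = 128 × 10^9 bytes), the arithmetic works out as follows:

# Rough sanity check of the smartphone comparison above (decimal units assumed).
total_bytes = 79 * 10**21        # 79 zettabytes
phone_bytes = 128 * 10**9        # one 128 GB smartphone
print(total_bytes / phone_bytes) # ~6.17e11, i.e., roughly 617 billion smartphones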

A common question is how you actually process the data collected from IoT devices. There are several ways to do this, but one of the most compelling is to use the MQTT protocol to send IoT data into Apache Kafka for further processing in a system of your choice.

MQTT and Apache Kafka are often used together to enhance the functionality of IoT and machine-to-machine communication. You commonly see them paired in the following use cases:

  • Data collection: MQTT is used to collect data from IoT devices and publish it to a Kafka broker, where it is processed, analyzed, and stored for future use (a minimal publisher sketch follows this list).
  • Real-time processing: Using MQTT and Kafka, organizations build real-time data processing pipelines that handle large amounts of incoming data from IoT devices.
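
To make the data-collection side concrete, here is a minimal sketch of a device publishing a reading over MQTT with the Eclipse Paho Python client. The host, credentials, and topic names are placeholders for your own HiveMQ Cloud values, and the snippet targets the paho-mqtt 1.x API (version 2.x additionally requires a callback API version argument when creating the client).

import json
import paho.mqtt.client as mqtt

# Placeholder connection details -- replace with your own HiveMQ Cloud host and credentials.
BROKER_HOST = "your-cluster.hivemq.cloud"
BROKER_PORT = 8883  # TLS port used by HiveMQ Cloud
USERNAME = "your-mqtt-username"
PASSWORD = "your-mqtt-password"

client = mqtt.Client(client_id="sensor-demo")  # paho-mqtt 1.x style constructor
client.username_pw_set(USERNAME, PASSWORD)
client.tls_set()  # HiveMQ Cloud requires TLS; the system's default CA certificates are used
client.connect(BROKER_HOST, BROKER_PORT)
client.loop_start()

# Publish one hypothetical temperature reading to an example topic.
payload = json.dumps({"device": "sensor-42", "temperature": 21.7})
info = client.publish("sensors/sensor-42/temperature", payload, qos=1)
info.wait_for_publish()  # block until the message has actually gone out

client.loop_stop()
client.disconnect()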

The easiest way to get data from your IoT devices into your Kafka service is our newly introduced Kafka integration for HiveMQ Cloud.

In this blog, we will show you the key capabilities of the Kafka-HiveMQ Cloud integration, explain how you can use it to stream your data, and walk you through setting it up.

The HiveMQ Cloud Kafka Integration

Before we jump into the step-by-step instructions, let's look at the benefits of the Kafka-HiveMQ Cloud integration. This simple configuration enables you to stream data efficiently between your HiveMQ Cloud broker and your Kafka cluster, in both directions, without any ongoing operational burden.

There are five easy steps to ingest data from your IoT devices into the Apache Kafka service of your choice. The settings involved fall into two groups:

  • Connection configuration parameters
  • Topic mapping parameters

The connection configuration parameters help establish a secure connection between HiveMQ Cloud and your Apache Kafka cluster. The topic mappings let you set up the bidirectional data flow between your MQTT cluster and Apache Kafka.
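
Conceptually, each topic mapping is just a source/destination pair. The values below are hypothetical and only illustrate the idea; in practice you enter them in the HiveMQ Cloud UI rather than in code.

# Purely illustrative -- these values are configured in the HiveMQ Cloud UI, not in code.
mqtt_to_kafka = {
    "source": "sensors/#",          # MQTT topic filter on the HiveMQ Cloud broker
    "destination": "iot-readings",  # Kafka topic that receives the forwarded messages
}

kafka_to_mqtt = {
    "source": "device-commands",    # Kafka topic the integration reads from
    "destination": "commands/all",  # MQTT topic the messages are published to
}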

But first, you must find the Kafka extension in the “Integrations” tab inside your HiveMQ Cloud cluster.

Note: If you are following these instructions on the free version of HiveMQ Cloud for the first time, you can start without adding any payment information.

Integrations tab in HiveMQ Cloud

Now you are ready to dive into the five steps:

  1. Connect HiveMQ Cloud with the Kafka service of your choice: To connect, you need the list of bootstrap servers for your Kafka cluster so the integration can fetch the initial metadata about it.

  2. Secure the connection: Next, add your Kafka credentials to ensure a secure connection between HiveMQ Cloud and Kafka. We offer two different SASL mechanisms for connection security. (A client-side sketch for checking these connection settings follows step 5.)

  3. Send data from HiveMQ to Kafka: Once you have set up and secured the connection, you can choose which data to forward from your IoT devices. This requires mapping topics from HiveMQ Cloud to your Kafka cluster. The source topic is the MQTT topic you want to send from your HiveMQ cluster. The destination topic is the Kafka topic that receives the messages your HiveMQ cluster sends.

  4. Establish bidirectional communication: For bidirectional communication between Kafka and HiveMQ, you configure the mapping from your Kafka cluster to HiveMQ Cloud in the same way you define the topic mapping from HiveMQ Cloud to your Kafka cluster. In this case, the source topic is the Kafka topic from which the integration reads messages. These messages are then published to the defined destination topic on your HiveMQ Cloud MQTT broker cluster.

  5. Enable the configuration: You can start the data flow between your HiveMQ Cloud cluster and your Kafka cluster by selecting the “enable” button.
    Kafka Streaming Configuration
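
If you want to double-check the bootstrap servers and credentials from steps 1 and 2 independently of the integration, a small client-side check is one option. The sketch below uses the confluent-kafka Python package; the server list, SASL mechanism, and credentials are placeholders, and your cluster may require SCRAM rather than PLAIN.

from confluent_kafka.admin import AdminClient

# Placeholder values -- use the bootstrap servers and SASL credentials from steps 1 and 2.
conf = {
    "bootstrap.servers": "broker-1.example.com:9092,broker-2.example.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",  # or, e.g., SCRAM-SHA-256, depending on your cluster
    "sasl.username": "your-kafka-username",
    "sasl.password": "your-kafka-password",
}

admin = AdminClient(conf)
metadata = admin.list_topics(timeout=10)  # fails if the servers or credentials are wrong
print(sorted(metadata.topics))            # topic names visible to these credentials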

If you've followed these five steps, you can now use Apache Kafka with HiveMQ Cloud to exchange data from your IoT devices in both directions.
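
One way to verify the HiveMQ-Cloud-to-Kafka direction end to end is to publish a test message on the mapped MQTT source topic (for example, with the publisher sketch earlier in this article) and then consume from the Kafka destination topic. The topic name, group id, and connection settings below are placeholders and reuse the assumptions from the previous snippet.

from confluent_kafka import Consumer

# Placeholder connection settings -- same assumptions as in the connectivity check above.
conf = {
    "bootstrap.servers": "broker-1.example.com:9092,broker-2.example.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "your-kafka-username",
    "sasl.password": "your-kafka-password",
    "group.id": "hivemq-integration-check",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["iot-readings"])  # the Kafka destination topic from your topic mapping

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        # The forwarded record carries the original MQTT payload as its value.
        print(msg.topic(), msg.value().decode("utf-8", errors="replace"))
except KeyboardInterrupt:
    pass
finally:
    consumer.close()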

Get Started

To access the Kafka-HiveMQ Cloud integration, all you need to do is sign up for a free HiveMQ Cloud account.

The integration is a lightweight version of our HiveMQ Enterprise Extension for Kafka and covers frequently requested use cases. If you are still missing functionality, don’t hesitate to reach out to us. We are always keen on direct user feedback.


Published at DZone with permission of Shashank Sharma. See the original article here.

