
IoT Edge Use Cases With Apache Kafka


Learn more about how you can incorporate IoT edge use cases with Apache Kafka and MiniFi.


MiniFi Java Agent 0.5

First, copy over the necessary NARs from the Apache NiFi 1.7.0 lib directory:

  • nifi-ssl-context-service-nar-1.7.0.nar
  • nifi-standard-services-api-nar-1.7.0.nar
  • nifi-kafka-1-0-nar-1.7.0.nar

This will support PublishKafka_1_0 and ConsumeKafka_1_0.
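As a sketch, the copy step could be scripted like this. The lib paths in the demo are assumptions; adjust them to your NiFi and MiniFi installations.

```python
# Sketch: copy the three Kafka-related NARs from the NiFi 1.7.0 lib
# directory into the MiniFi lib directory. Paths are assumptions --
# adjust for your installation.
import shutil
import tempfile
from pathlib import Path

NARS = [
    "nifi-ssl-context-service-nar-1.7.0.nar",
    "nifi-standard-services-api-nar-1.7.0.nar",
    "nifi-kafka-1-0-nar-1.7.0.nar",
]

def copy_nars(nifi_lib, minifi_lib):
    """Copy each required NAR and return the destination paths."""
    nifi_lib, minifi_lib = Path(nifi_lib), Path(minifi_lib)
    copied = []
    for nar in NARS:
        dst = minifi_lib / nar
        shutil.copy2(nifi_lib / nar, dst)
        copied.append(dst)
    return copied

# Self-contained demo: temporary directories stand in for the real
# NiFi and MiniFi lib directories.
nifi_lib = Path(tempfile.mkdtemp())
minifi_lib = Path(tempfile.mkdtemp())
for nar in NARS:
    (nifi_lib / nar).write_bytes(b"")  # stand-in NAR files
copied = copy_nars(nifi_lib, minifi_lib)
```

In a real deployment you would point the two arguments at something like `/opt/nifi-1.7.0/lib` and the MiniFi agent's `lib` directory.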

Then, create a consume and/or publish flow; you can combine the two based on your needs. In my simple example, I consume the Kafka messages in MiniFi and write them to a file. I also write the metadata to a JSON file.

What is nice is that we can design and test a regular Kafka flow on NiFi 1.7+, export it as an XML template, and run the MiniFi Toolkit to convert it to YAML for installation on MiniFi. This process can be automated, and MiniFi agents can watch for new configuration files and reload when they arrive.
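The conversion step can be scripted around the MiniFi Toolkit's `config.sh transform` subcommand; here is a minimal sketch in which the toolkit path and file names are assumptions for your own setup.

```python
# Sketch: drive the MiniFi Toolkit's "config.sh transform" subcommand from
# Python to turn an exported NiFi XML template into a MiniFi YAML config.
# The toolkit path and file names below are assumptions.
import subprocess

def build_transform_cmd(toolkit_bin, template_xml, config_yml):
    """Build the toolkit command line: config.sh transform <in.xml> <out.yml>."""
    return [toolkit_bin, "transform", template_xml, config_yml]

def transform(toolkit_bin, template_xml, config_yml):
    """Run the conversion, raising if the toolkit reports an error."""
    subprocess.run(
        build_transform_cmd(toolkit_bin, template_xml, config_yml),
        check=True,
    )

# Command an automated deploy script would run (not executed here).
cmd = build_transform_cmd(
    "/opt/minifi-toolkit-0.5.0/bin/config.sh", "kafka_flow.xml", "config.yml"
)
```

Dropping the resulting `config.yml` into the agent's `conf` directory is what lets the agents pick up new flows.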

Consume Kafka

In the consuming flow, we start with a ConsumeKafka processor that reads from the specified topics at the given broker. Then I simply store the file so that any local program can process it. I could have done more processing in MiniFi, or pushed the data to something else, say JMS or MQTT. My thought is to have a local program running on the machine pick up this data; that way, the legacy program does not need to know about Kafka. I also pull off all of the metadata and convert it into a separately named JSON file for storage as well.
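Outside of MiniFi, the file-writing end of this flow might look roughly like the following sketch; the attribute names, file naming convention, and output directory are assumptions for illustration.

```python
# Sketch: mimic what the consuming flow leaves on disk -- the Kafka message
# body in one file and its metadata (attributes) in a separately named JSON
# file -- so a legacy local program can pick both up without knowing
# anything about Kafka. Names and paths here are made up.
import json
import tempfile
from pathlib import Path

def store_message(out_dir, name, payload, metadata):
    """Write <name> with the payload and <name>.metadata.json with metadata."""
    out_dir = Path(out_dir)
    payload_path = out_dir / name
    meta_path = out_dir / (name + ".metadata.json")
    payload_path.write_bytes(payload)
    meta_path.write_text(json.dumps(metadata, indent=2))
    return payload_path, meta_path

# Demo with a temporary directory and made-up Kafka attributes.
out = tempfile.mkdtemp()
paths = store_message(
    out,
    "msg-0001.txt",
    b'{"current": 0.42}',
    {"kafka.topic": "smartPlug", "kafka.partition": 0, "kafka.offset": 12345},
)
```

The local program only needs to watch the directory for new payload files and read the matching `.metadata.json` when it cares about offsets or partitions.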


Publish Electric Monitoring Data to Kafka

To publish events to Kafka, I am collecting some sensor data and formatting it as JSON with a Python script that MiniFi calls. I can then push that data to a topic, smartPlug, on my available broker. Any errors are logged to a local file (they could also have been pushed to syslog, Slack, email, JMS, Kafka, and so on).
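The script MiniFi calls could be as simple as this sketch. The field names and reading values are made up for illustration; a real script would query the smart plug for live measurements.

```python
# Sketch: format one sensor reading as a single JSON record on stdout,
# ready for MiniFi to pick up and publish to the smartPlug topic.
# The field names and values here are invented for illustration.
import json
from datetime import datetime, timezone

def format_reading(current_amps, voltage, power_watts):
    """Return one JSON-encoded reading with a UTC timestamp."""
    return json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "current": current_amps,
        "voltage": voltage,
        "power": power_watts,
    })

record = format_reading(0.42, 119.8, 50.3)
print(record)
```

Emitting one JSON object per invocation keeps the MiniFi side trivial: each script run becomes one flow file and one Kafka message.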

Let's monitor the messages going through our topic, smartPlug.

For more advanced monitoring, you can use a tool like Hortonworks Streams Messaging Manager (SMM) — https://hortonworks.com/blog/introducing-hortonworks-streams-messaging-manager-smm/. It lets you track individual topics, messages, consumers, and producers for a full and complete picture of what is going on in your Kafka system.

Publish Messages to Kafka

Consume Any Messages From the smartPlug Topic



