
IoT Edge Use Cases With Apache Kafka

Learn how you can incorporate Apache Kafka into IoT edge use cases.

Tim Spann · Dec. 03, 18 · IoT Zone · Tutorial


MiniFi Java Agent 0.5

First, copy over the necessary NARs from the Apache NiFi 1.7 lib directory:

  • nifi-ssl-context-service-nar-1.7.0.nar
  • nifi-standard-services-api-nar-1.7.0.nar
  • nifi-kafka-1-0-nar-1.7.0.nar

This will support PublishKafka_1_0 and ConsumeKafka_1_0.
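If you want to script the copy step, here is a minimal Python sketch; the NiFi and MiniFi install paths are assumptions, so adjust them to your environment:

```python
import shutil
from pathlib import Path

# Hypothetical install locations; adjust to your environment.
NIFI_LIB = Path("/opt/nifi-1.7.0/lib")
MINIFI_LIB = Path("/opt/minifi-0.5.0/lib")

NARS = [
    "nifi-ssl-context-service-nar-1.7.0.nar",
    "nifi-standard-services-api-nar-1.7.0.nar",
    "nifi-kafka-1-0-nar-1.7.0.nar",
]

for nar in NARS:
    src = NIFI_LIB / nar
    dest = MINIFI_LIB / nar
    shutil.copy2(src, dest)  # copy the NAR, preserving file metadata
    print(f"copied {src} -> {dest}")
```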

Then, create a consume and/or publish flow. You can combine the two based on your needs. In my simple example, I consume the Kafka messages in MiniFi and write to a file. I also write the metadata to a JSON file.

What is nice is that we can design and test a regular Kafka flow on NiFi 1.7+, export it as an XML template, and run the MiniFi Toolkit to convert it to the YAML configuration that MiniFi uses. This process can be automated, and MiniFi agents can watch for new config files and reload themselves when one appears.
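As a sketch of automating that conversion, the following invokes the MiniFi Toolkit's config.sh transform command from Python; the toolkit path and file names are assumptions:

```python
import subprocess
from pathlib import Path

# Hypothetical paths; point these at your toolkit install and exported template.
TOOLKIT = Path("/opt/minifi-toolkit-0.5.0/bin/config.sh")
TEMPLATE_XML = Path("kafka-flow.xml")   # template exported from NiFi
CONFIG_YML = Path("config.yml")         # YAML config consumed by MiniFi

# The toolkit's "transform" command converts a NiFi template to a MiniFi config.
subprocess.run(
    [str(TOOLKIT), "transform", str(TEMPLATE_XML), str(CONFIG_YML)],
    check=True,
)
print(f"wrote {CONFIG_YML}; copy it to the agent's conf/ directory and restart MiniFi")
```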

Consume Kafka

In the consuming flow, we start with the ConsumeKafka processor, which reads from the specified topics on the configured broker. I then simply store each message as a file so that any local program can process it. I could have done more processing in MiniFi or pushed the data elsewhere, say to JMS or MQTT, but my intent is to have a local program running on the machine pick up this data; that way, the legacy program does not need to know anything about Kafka. I also pull off all of the Kafka metadata and convert it into a separately named JSON file for storage.
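As a sketch of the kind of local program I have in mind, the following polls the directory that MiniFi writes to; the landing path and the convention that the metadata file shares the data file's base name plus a .json extension are assumptions:

```python
import json
import time
from pathlib import Path

# Hypothetical landing directory; match it to the PutFile directory in the MiniFi flow.
LANDING_DIR = Path("/var/minifi/kafka-out")

def process(payload: bytes, metadata: dict) -> None:
    # Stand-in for the legacy program's real work.
    print(f"got {len(payload)} bytes from topic {metadata.get('kafka.topic')}")

while True:
    for meta_file in LANDING_DIR.glob("*.json"):
        data_file = meta_file.with_suffix("")  # assumes data file shares the base name
        if not data_file.exists():
            continue
        metadata = json.loads(meta_file.read_text())
        process(data_file.read_bytes(), metadata)
        data_file.unlink()   # remove processed files so they are not re-read
        meta_file.unlink()
    time.sleep(5)  # poll every few seconds
```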


Publish Electric Monitoring Data to Kafka

To publish events to Kafka, I collect some sensor data with a Python script that MiniFi calls, format it as JSON, and push it to a topic named smartPlug on my available broker. Any errors are logged to a local file (they could just as easily have been pushed to syslog, Slack, email, JMS, Kafka, etc.).
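The sensor script itself is not shown here, so this is a minimal stand-in under assumed details: read_power() and the smartPlug-1 device name are hypothetical. It emits one JSON record to stdout, which a MiniFi processor such as ExecuteProcess can capture as a flow file and route to PublishKafka_1_0:

```python
#!/usr/bin/env python
import json
from datetime import datetime

# Hypothetical reading; replace with your smart plug's actual API call.
def read_power() -> float:
    return 42.0  # watts

record = {
    "ts": datetime.utcnow().isoformat() + "Z",
    "power_watts": read_power(),
    "device": "smartPlug-1",
}

# Print one JSON record to stdout for MiniFi to pick up.
print(json.dumps(record))
```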

Let's monitor the messages going through our topic, smartPlug.
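A quick way to watch the topic is a small consumer; here is a sketch using the kafka-python client (my choice for illustration, not part of the original flow), assuming a broker at localhost:9092:

```python
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical broker address; use your own bootstrap servers.
consumer = KafkaConsumer(
    "smartPlug",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: v.decode("utf-8"),
)

# Tail the topic and print each record as it arrives.
for message in consumer:
    print(f"{message.topic}[{message.partition}] @ {message.offset}: {message.value}")
```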

To improve our monitoring, you can use an advanced tool like Hortonworks Streams Messaging Manager (SMM): https://hortonworks.com/blog/introducing-hortonworks-streams-messaging-manager-smm/. SMM lets you track individual topics, messages, consumers, and producers for a complete analysis of what is going on in your Kafka system.

Publish Messages to Kafka

Consume Any Messages From the smartPlug Topic



