
Deep Learning in Real-Time With TensorFlow, H2O.ai, and Kafka Streams


Learn about a talk I gave that introduces use cases and concepts behind deep learning and shows how to deploy the built analytic models to real-time applications.


Intelligent real-time applications are a game-changer in any industry. Deep learning is one of the hottest buzzwords in this area. New technologies like GPUs combined with elastic cloud infrastructure enable the sophisticated use of artificial neural networks to add business value in real-world scenarios. Tech giants use it, for example, for image recognition and speech translation. This session discusses some real-world scenarios from different industries to explain when and how traditional companies can leverage deep learning in real-time applications.

This session shows how to deploy deep learning models into real-time applications to make predictions on new events. Apache Kafka will be used to integrate the analytic models into event streams in a highly scalable and performant way.

The first part of my talk introduces use cases and concepts behind deep learning. It discusses how to build convolutional neural networks (CNNs), recurrent neural networks (RNNs), and autoencoders leveraging open-source frameworks like TensorFlow, DeepLearning4J, or H2O.
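To make the autoencoder concept above concrete, here is a minimal sketch in plain Python. It shows only the forward pass of a tiny linear autoencoder (2 inputs compressed to a 1-value code and expanded back); the hand-picked weights are an illustrative assumption, since a real framework like TensorFlow or H2O would learn them from data.

```python
# Minimal linear autoencoder sketch: 2 inputs -> 1 code -> 2 outputs.
# Weights are hand-picked for illustration; real frameworks learn them.

def encode(x, w_enc):
    """Project the input vector down to a single code value."""
    return sum(xi * wi for xi, wi in zip(x, w_enc))

def decode(code, w_dec):
    """Expand the code value back to the input dimension."""
    return [code * wi for wi in w_dec]

# Hand-picked weights that perfectly reconstruct symmetric inputs.
w_enc = [0.5, 0.5]
w_dec = [1.0, 1.0]

x = [3.0, 3.0]
code = encode(x, w_enc)               # compressed representation
reconstruction = decode(code, w_dec)  # back to the input dimension
```

The same encode/compress/decode shape underlies real autoencoders; the frameworks add nonlinearities, many layers, and gradient-based training.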

The second part shows how to deploy the built analytic models to real-time applications, leveraging Apache Kafka as a streaming platform and Apache Kafka's Streams API to embed the intelligent business logic into any application or microservice.
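The embedding pattern described above can be sketched conceptually in plain Python: a pre-loaded model is applied to every event as it streams past, and the enriched event flows downstream. The `score` function is a hypothetical stand-in for a model loaded from TensorFlow or H2O; an actual deployment would express this same per-event map step with Kafka Streams' Java DSL.

```python
# Conceptual analogue of embedding a model in a stream processor:
# each incoming event is mapped to an enriched event with a prediction.

def score(event):
    """Hypothetical pre-trained model: flag large transactions."""
    return "fraud" if event["amount"] > 1000 else "ok"

def process_stream(events):
    """Apply the model to every event as it streams by (a 'map' step)."""
    for event in events:
        yield {**event, "prediction": score(event)}

events = [{"amount": 50}, {"amount": 5000}]
results = list(process_stream(events))
```

The key point the talk makes is that the model is called inline inside the stream processor, so predictions happen with no remote scoring service in the event path.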

Key Takeaways for the Audience: Kafka Streams + Deep Learning

Here are the takeaways from this talk:

  • The focus of this talk is to discuss and show how to productionize analytic models built by data scientists — the key challenge for most companies.
  • Deep learning allows building different neural networks to solve complex classification and regression scenarios and can add business value in any industry.
  • Deep learning is used to build analytics models using open-source frameworks like TensorFlow, DeepLearning4J, or H2O.ai.
  • Apache Kafka's Streams API allows embedding intelligent business logic into any application or microservice.
  • Apache Kafka's Streams API leverages these deep learning models (without redeveloping) to act on new events in real-time.

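The "without redeveloping" takeaway above can be illustrated with a small sketch: the data scientist trains and serializes a model, and the streaming application only deserializes and calls it. The `ThresholdModel` class and pickle serialization are illustrative assumptions; real projects would export a TensorFlow SavedModel or an H2O-generated artifact instead.

```python
# Sketch of the 'no redevelopment' handover between data science and
# engineering: train and persist on one side, load and score on the other.
import pickle

class ThresholdModel:
    """Hypothetical placeholder for a trained TensorFlow/H2O model."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, amount):
        return "fraud" if amount > self.threshold else "ok"

# Data-science side: train (here, just configure) and persist the model.
blob = pickle.dumps(ThresholdModel(threshold=1000))

# Streaming side: load the artifact and score events, no retraining.
model = pickle.loads(blob)
```

The streaming application never touches training code; it treats the serialized model as an opaque scoring function, which is exactly what lets the same artifact move from the data scientist's environment into production.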
Slides and Further Material on Apache Kafka and Machine Learning

Here are the slides from my talk:

Some further material around Apache Kafka, Kafka Streams, and machine learning:

I will post more examples and use cases around Apache Kafka and machine learning in the upcoming months. Stay tuned!



Published at DZone with permission of Kai Wähner, DZone MVB.

