
Moving Streaming Analytics Out of the Data Center

Why moving streaming analytics out of the traditional data center benefits cybersecurity, fraud detection, predictive maintenance, and customer experience.


This blog focuses on moving streaming analytics outside the confines of the traditional data center. Moving streaming analytics closer to where data originates can be accomplished by pairing an enterprise-grade data movement application with an extremely lightweight streaming engine. Forward-looking organizations are using this combination to solve use cases in a number of areas, including:

  • Cyber Security – Identify a malicious intrusion before or as it occurs.
  • Fraud – Analyze streaming transactions to determine which need immediate attention.
  • Predictive Maintenance – Predict outlier conditions from streaming machine and sensor data.
  • Customer Experience and Marketing – Use streaming data insights to personalize interactions.
  • Stream Data Management – Transform and clean data in motion, storing only what you need.

IoAT (Internet of AnyThing) data includes any new data source generated by sensors and machines, server logs, clickstream web application servers, social media, as well as files and email. By applying a high degree of analytic intelligence to IoAT data at its point of origin and reacting rapidly, companies can gain fast insights that outpace their competition. Whether it’s personalizing a “next best offer” or providing an immediate response to an early warning alert, IoAT data is only beginning to impact bottom-line revenues. The time to act is now, but what components do we need to add to our modern data applications?

Hortonworks DataFlow

Hortonworks DataFlow (HDF), powered by Apache NiFi, is a data application platform designed to solve data acquisition and delivery challenges, inside and outside the data center. It provides a fast, easy, and secure way to move data from anywhere it originates to anywhere it needs to go. HDF has a simple GUI for command and control of “data workflows,” so there is no need to write custom data movement scripts. HDF also provides Simple Event Processing (SEP) out of the box to curate data payloads as they move from one place to another.
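To make the SEP idea concrete, here is a minimal sketch of the kind of in-flight payload curation a flow might perform, written as a Jython (Python) script body for NiFi's ExecuteScript processor. The sensor JSON shape and the 90-degree threshold are illustrative assumptions, not details from the article.

```python
# Jython script body for a NiFi ExecuteScript processor: read a JSON sensor
# payload, tag readings that cross an assumed threshold, and pass them on.
import json
from org.apache.commons.io import IOUtils
from java.nio.charset import StandardCharsets
from org.apache.nifi.processor.io import StreamCallback

class CurateReading(StreamCallback):
    def process(self, inputStream, outputStream):
        # Read the incoming flowfile content and parse it as JSON.
        text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        reading = json.loads(text)
        # Tag readings above an assumed 90-degree threshold for downstream routing.
        reading['alert'] = reading.get('temperature', 0) > 90
        outputStream.write(bytearray(json.dumps(reading).encode('utf-8')))

flowFile = session.get()
if flowFile is not None:
    flowFile = session.write(flowFile, CurateReading())
    session.transfer(flowFile, REL_SUCCESS)
```

In a real flow, the same curation could also be done with built-in processors such as RouteOnAttribute or UpdateAttribute; the script form is shown only to keep the example self-contained.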

Figure 1: Act Fast on Streaming Data with HDF and SAS ESP

SAS Event Stream Processing

SAS Event Stream Processing (ESP) analyzes and understands streaming data as it is generated, detects patterns of interest as they occur, and provides the instructions needed to take the correct actions, such as which alerts to issue and which portions of the data to retain for further investigation.
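The sketch below illustrates, in plain Python, the kind of continuous pattern-of-interest detection a streaming engine like ESP performs. It is not the SAS ESP API; the event shape, window size, and threshold are all assumptions made for the example.

```python
# Conceptual sketch of streaming pattern detection: keep a small rolling
# window per device and raise an alert when the rolling average crosses a
# threshold. Event shape, window size, and threshold are assumed values.
from collections import deque, defaultdict

WINDOW = 5          # events per device to keep (assumed window size)
THRESHOLD = 90.0    # alert when the rolling average exceeds this (assumed)

windows = defaultdict(lambda: deque(maxlen=WINDOW))

def on_event(event):
    """Process one streaming event; return an alert dict or None."""
    w = windows[event["device"]]
    w.append(event["temperature"])
    if len(w) == WINDOW and sum(w) / WINDOW > THRESHOLD:
        return {"device": event["device"],
                "action": "dispatch_maintenance",
                "rolling_avg": sum(w) / WINDOW}
    return None

# Feed a short stream of events and act on alerts as they occur.
stream = [{"device": "pump-7", "temperature": t} for t in (88, 91, 93, 95, 97)]
for evt in stream:
    alert = on_event(evt)
    if alert:
        print(alert)
```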

HDF and SAS ESP

By incorporating SAS ESP into our edge-originating, HDF-based data workflows, we are able to inject Complex Event Processing (CEP) into any IoAT application. It’s important to note that both HDF and SAS ESP are ideal candidates to run on edge node gateways, which typically have very small memory footprints and run lightweight Linux kernels. Since HDF and SAS ESP are architected to run on everything from the smallest of devices to large clustered configurations, they are a perfect complement both inside and outside the data center.
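As one illustration of that edge pattern, the sketch below shows how a small gateway-side script could forward curated events from an HDF/NiFi flow to a co-located streaming engine over HTTP. The endpoint URL and payload shape are hypothetical placeholders, not the actual SAS ESP publish/subscribe interface; in practice the hand-off is done by the SAS ESP NiFi processors described in the summary.

```python
# Hypothetical edge hand-off: POST one curated event to a local streaming
# engine endpoint. The URL and JSON fields are placeholders for illustration.
import json
from urllib import request

ESP_ENDPOINT = "http://localhost:9900/events"  # hypothetical edge-local endpoint

def forward_event(event: dict) -> int:
    """POST one curated event to the local streaming engine; return HTTP status."""
    req = request.Request(
        ESP_ENDPOINT,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    print(forward_event({"device": "pump-7", "temperature": 95.2, "alert": True}))
```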

Summary

In summary, HDF, powered by Apache NiFi, is a secure, reliable enterprise data movement platform that provides Simple Event Processing (SEP) processors out of the box. HDF is architected to run from the data center to the edge of an IoAT framework and back. SAS ESP is an extremely fast streaming analytics engine, also architected to run in the data center as well as out on edge node devices. SAS ESP and HDF are seamlessly integrated via the SAS ESP NiFi processors. Together, they can provide immediate, actionable intelligence on streaming data for improved customer experience and lower-cost operations.


Topics:
hortonworks, hadoop, big data, cyber security

Published at DZone with permission of Mark Lochbihler, DZone MVB. See the original article here.
