Evaluating Streaming Analytics Solutions
What criteria should you consider when looking at streaming Big Data analytics?
Streaming data analytics is an approach to Big Data analysis that shifts the focus from systems of record (e.g., "what were last quarter's sales of product X?") to real-time insight and action (e.g., "what is an individual customer likely to buy, and what form of engagement will best influence their behavior?").
The field is evolving rapidly, with the Apache Foundation, Google, Amazon AWS, and Microsoft Azure launching new projects and services for real-time streaming data. In this article we'll review the architecture of streaming analytics solutions, outline a framework for evaluation, and suggest resources for further research.
Streaming Data and Real-Time Event Processing Architecture
Streaming analytics systems are designed to process events and deliver corresponding actions within 50 to 100 milliseconds. Systems run in-memory and avoid the queries typical of Hadoop. Streaming analytics is widespread in financial services for fraud management, and use is growing for personalized online offers, customer engagement, and automated network services or Internet of Things (IoT) devices.
Events can begin with a social media post, a web browsing session, a wireless call or text, or a database update. Events are formatted by "event listeners" and routed to an ingestion service such as Amazon's AWS Kinesis, or directly to the event processing system. Real-time events are then combined and processed according to pre-defined scenarios.
Scenarios combine multiple events to trigger a corresponding action. Financial transactions attempted in distant locations trigger a fraud scenario, with notices going to the account owner and the transactions blocked. A series of dropped calls can be handled according to customer segmentation and order history. Subscribers approaching contract renewal are another common use case. In all cases, scenarios combine to form a strategy for real-time customer engagement.
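As a concrete illustration, the "distant locations" fraud scenario can be sketched as a rule over a pair of transaction events. This is a minimal, hypothetical Python sketch; the Transaction fields, the 900 km/h speed threshold, and the haversine helper are my assumptions, not any vendor's API:

```python
import math
from dataclasses import dataclass

@dataclass
class Transaction:
    account: str
    lat: float
    lon: float
    ts: float  # seconds since epoch

def distance_km(a: Transaction, b: Transaction) -> float:
    # Rough great-circle distance via the haversine formula.
    r = 6371.0
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dp = p2 - p1
    dl = math.radians(b.lon - a.lon)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def fraud_scenario(prev: Transaction, curr: Transaction, max_kmh: float = 900.0) -> bool:
    """Flag the pair if the implied travel speed between them is impossible."""
    hours = (curr.ts - prev.ts) / 3600.0
    if hours <= 0:
        return True  # simultaneous or out-of-order events in distant places
    return distance_km(prev, curr) / hours > max_kmh

# Two card swipes 20 minutes apart on the same account, thousands of km apart.
t1 = Transaction("acct-1", 40.71, -74.00, 0.0)    # New York
t2 = Transaction("acct-1", 48.85, 2.35, 1200.0)   # Paris, 20 minutes later
print(fraud_scenario(t1, t2))  # True
```

In a real system this rule would run inside the event processing engine and emit the "notify owner, block transaction" actions rather than a boolean.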
Streaming Analytics Applied
Streaming analytics adds value through customer engagement, improving revenue, renewals, and brand loyalty. With a branded application, we can look forward to a travel experience that includes:
- Being informed in real time of flight schedule changes. On the day of the flight, traffic patterns are used to recommend departure times and routes to the airport.
- Parking options presented with directions to the selected parking lot.
- Travelers who miss the flight placed on an outbound call queue for rebooking.
- Maintenance notified when check-in kiosks are not functioning.
- Updates when baggage is loaded. In the event that baggage misses a connection, the traveler is notified and delivery instructions are solicited without requiring a visit to "customer service."
- On-ground maintenance expedited, based on in-flight maintenance alerts.
- On arrival, connecting flight information and directions to Uber pick-up locations and other services.
Evaluating Streaming Analytics Solutions
Streaming analytics systems are complex and encompass a broad set of critical capabilities. The Gartner Group recognizes the following vendors as leaders in streaming analytics: Apache Foundation, EVAM, Microsoft, Oracle, SAP, SAS, Software AG, and TIBCO Software (source: Gartner Hype Cycle for Data Science, July 25, 2016).
In this short article we'll summarize the key capabilities. All of them matter, but in my experience the approach to scenarios is the most important, so we'll focus on scenario design and management accordingly.
Business considerations include the time and cost to implement a solution. Vendors should have proven deployments with customer references, and a demonstrated ability to add value to your business through domain expertise.
Architecture and performance includes support for public cloud or on-premises deployment, plug-and-play support for open source projects, and a resilient clustered architecture. The system should be capable of scaling down to departmental use or up to enterprise-wide use.
Event capture and data integration is achieved with a library of event listeners, off-the-shelf integration with legacy systems and relational data stores, and support for data enrichment (e.g., customer profile data). Integration should be available for Flume, Logstash, Kafka, and RabbitMQ.
Event processing includes sub-100-millisecond response times on a scalable, in-memory distributed engine. Events should be backed by a persistent event queue, with support for third-party query and analytics systems and for dynamic configuration updates without system interruption. The system should support flexible time windows with counts, sums, and averages, and both asynchronous and synchronous event models. Throughput should be easily monitored with a system-wide dashboard.
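The flexible time windows with counts, sums, and averages mentioned above can be sketched as a small sliding-window aggregator. A minimal Python sketch, assuming an inclusive window over event timestamps (the class and method names are illustrative, not a product API):

```python
from collections import deque

class SlidingWindow:
    """Maintain count, sum, and average of event values over the last `window_s` seconds."""

    def __init__(self, window_s: float):
        self.window_s = window_s
        self.events = deque()   # (timestamp, value) pairs in arrival order
        self.total = 0.0

    def add(self, ts: float, value: float) -> None:
        self.events.append((ts, value))
        self.total += value
        self._evict(ts)

    def _evict(self, now: float) -> None:
        # Drop events older than the window [now - window_s, now].
        while self.events and self.events[0][0] < now - self.window_s:
            _, v = self.events.popleft()
            self.total -= v

    def stats(self, now: float) -> dict:
        self._evict(now)
        n = len(self.events)
        return {"count": n, "sum": self.total, "avg": self.total / n if n else 0.0}

w = SlidingWindow(60)
w.add(0, 10)
w.add(30, 20)
w.add(90, 30)
print(w.stats(90))  # the event at t=0 has aged out of the 60-second window
```

A production engine would shard such windows across a cluster and persist the queue; the eviction logic is the essential idea.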
Actions should include a library covering email, SMS, push notifications, calls to a third-party event engine, RESTful APIs, and web services, and should be easily extensible with a documented SDK.
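One common way to make an action library extensible via an SDK is a handler registry. A hypothetical Python sketch (the register/dispatch names are invented; real handlers would call an SMS gateway or SMTP server rather than return strings):

```python
ACTIONS = {}

def register(name: str):
    """Decorator that registers an action handler under a name."""
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@register("sms")
def send_sms(target: str, message: str) -> str:
    # Stand-in for a call to an SMS gateway.
    return f"SMS to {target}: {message}"

@register("email")
def send_email(target: str, message: str) -> str:
    # Stand-in for a call to an SMTP service.
    return f"Email to {target}: {message}"

def dispatch(action: str, target: str, message: str) -> str:
    """Look up and invoke the named action handler."""
    return ACTIONS[action](target, message)

print(dispatch("sms", "+15550100", "Your flight is delayed"))
```

Third parties extend the system by registering new handlers; the engine only ever calls `dispatch`.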
Security and audit is required for deployments, with authentication and logging of all use.
Logging and monitoring should include the logging of all events by event ID and timestamp. Scenario changes and updates should also be logged by user.
Operations and testing include a change and release management process, with changes to system configuration and scenarios applied without system interruption. New scenarios should also be launchable in a "test" mode, or tested with simulated events, where events are logged but no actions are taken.
Analytics: the system should include persistent stores for analytics, with predefined and customizable views, and be open to integration with third-party analytics systems such as R, MOA, or H2O. The system should also be capable of generating real-time alerts to users. The platform should support extensibility with optional modules, such as frequent pattern analysis, enhanced real-time clustering, and other analytical methods.
Business or Technical Events: Scenario Design and Management
The ability to implement and update scenarios quickly, and to scale management to scores of scenarios, is determined by how events are designed. Most systems build on technical events (i.e., an update to a database). Scenarios are logical combinations of technical events, assembled as code by a programmer, so adding a new scenario or updating an existing one depends on the speed and availability of a programmer intimately familiar with the system design. This is, unfortunately, the common denominator today. As we'll discuss below, this approach is difficult to scale as scenarios grow in number.
Alternatively, events can be qualified and exposed as a catalog of business events (i.e., a "new customer" or a "dropped call"). Business events are combined into scenarios by marketing professionals and other non-programmers, using visual designers or simple languages. Scenarios built on business events are easily reviewed by management for release management, can be implemented in minutes, and existing scenarios can be quickly updated.
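To make the contrast concrete, scenarios over a business-event catalog can be expressed as data rather than code, so non-programmers can review and edit them. A minimal, hypothetical Python sketch (the event and action names are invented for illustration):

```python
# Each scenario is plain data: if ALL of its business events are observed
# for a customer, the named action fires. No programmer is needed to add one.
SCENARIOS = [
    {"name": "win_back",
     "events": {"dropped_call", "contract_expiring"},
     "action": "offer_retention_discount"},
    {"name": "welcome",
     "events": {"new_customer"},
     "action": "send_welcome_sms"},
]

def match_scenarios(observed_events) -> list:
    """Return the actions of every scenario whose business events all occurred."""
    observed = set(observed_events)
    return [s["action"] for s in SCENARIOS if s["events"] <= observed]

print(match_scenarios(["dropped_call", "contract_expiring", "web_visit"]))
# ['offer_retention_discount']
```

Because scenarios are data, management can diff and review them before release, which is exactly the governance advantage described above.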
As the number of scenarios grows they tend to overlap, and several scenarios can be triggered within a short time frame. This "cascade" of events is common and can inundate customers with a deluge of uncoordinated actions. To avoid this, it's important for systems to support scenario prioritization and constraints. Scenarios built on business events can be easily compared to detect overlap, and scenarios can be prioritized. Finally, systems should also support user-action constraints, where users are not subjected to more than X actions in a given period.
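The user-action constraint above amounts to a per-customer frequency cap. A minimal Python sketch, assuming a cap of N actions per rolling time window (the class name and parameters are my assumptions):

```python
from collections import defaultdict, deque

class ActionCap:
    """Suppress actions once a customer has received `max_actions` within `window_s` seconds."""

    def __init__(self, max_actions: int, window_s: float):
        self.max_actions = max_actions
        self.window_s = window_s
        self.sent = defaultdict(deque)  # customer -> timestamps of delivered actions

    def allow(self, customer: str, now: float) -> bool:
        q = self.sent[customer]
        # Forget actions that have aged out of the rolling window.
        while q and q[0] <= now - self.window_s:
            q.popleft()
        if len(q) >= self.max_actions:
            return False  # cap reached: suppress this action
        q.append(now)
        return True

cap = ActionCap(max_actions=2, window_s=3600)
results = [cap.allow("c1", t) for t in (0, 10, 20)]
print(results)  # the third action within the hour is suppressed
```

Placing this check between scenario matching and the action library prevents the cascade from reaching the customer.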
There are a lot of moving parts in a streaming analytics system, but these systems are among the most practical Big Data solutions. I've been part of implementing a pilot for a global wireless carrier involving millions of customers; it was implemented in one month and hosted on a single AWS machine. Unlike most other Big Data strategies, a pilot of streaming analytics delivers immediate and easily recognized business value. The pilot drove improved customer renewal rates, and the system has been rapidly expanded to a global footprint.
Many of the vendors named above offer good resources. I recently co-authored a comprehensive guide to evaluating streaming analytics solutions, available for download from EVAM, who co-sponsored the guide: http://www.evam.com/evaluation-guide-to-streaming-analytics