An Overview of IoT Analytics Maturity
This post gives a high-level view (with examples) of how IoT analytics have matured over time, where those stages are useful, and how they impact the world.
In the world of connected devices, where the IoT ecosystem is moving toward maturity, the maturity of IoT analytics will play a key role in the coming years. The value of the investment being made in IoT will be unlocked largely through the adoption of IoT analytics.
We can understand the application of IoT analytics in two ways:
- By the layer at which analytics are applied: At a broad level, there are various physical layers in an IoT ecosystem, and analytics can be applied at any of them (for example, on the device itself or at the cloud level).
- By the complexity of the use case and implementation: IoT analytics ranges from rule-based implementations to complex event processing using advanced analytics techniques. This is the area we will focus on in this article.
Analytics in the IoT space can be any one of the following, or a combination of them:
Complex rules: This extends simple, single-parameter rules; it requires an understanding of the various parameters of a device and the correlations between them. An example of this type of event would be combining the effect of the air filter's status with the humidity of the environment for a connected car. This, again, happens in near real time.
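A complex rule of this kind can be sketched as a function over the correlated parameters. The names and thresholds below are illustrative assumptions, not values from any real vehicle API:

```python
# Hypothetical complex rule combining two correlated parameters
# (air-filter wear and ambient humidity) for a connected car.
# All thresholds are made up for the sketch.

def filter_change_needed(filter_wear_pct: float, humidity_pct: float) -> bool:
    """Flag the air filter for service when wear alone is high, or when
    moderate wear is combined with high humidity (which is assumed here
    to degrade the filter faster)."""
    if filter_wear_pct >= 80:
        return True
    if filter_wear_pct >= 60 and humidity_pct >= 85:
        return True
    return False
```

Note how the second condition is what makes the rule "complex": neither 65% wear nor 90% humidity would fire an alert on its own, but their combination does.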
Combining asynchronous events: Things become more complex from here. Say that, at the cloud level, a device sends data every 10 seconds, and another device sends weather information (ambient temperature) every minute. From a business perspective, we need to calculate a metric that involves both the device data and the ambient temperature. At a very basic level, this can be handled using RESTful services: the relevant data is written when event1 happens, and when event2 happens, event1 is retrieved from the RESTful API and combined with the event2 data. When this concept is extended to a larger number of events and advanced analytic techniques are applied to them, it is also termed complex event processing (CEP; see the last item below).
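The store-and-combine pattern above can be sketched in a few lines. Here a plain dict stands in for the RESTful store; in a real deployment the slow event would be written to and read from a REST service, and the event names are assumptions for the sketch:

```python
# Minimal sketch of combining two asynchronous event streams.
# A module-level dict stands in for the RESTful store described above.

latest = {}  # event name -> most recent payload


def on_weather(ambient_temp_c: float) -> None:
    """Slow stream (e.g. once a minute): persist the latest reading."""
    latest["weather"] = ambient_temp_c


def on_device_reading(internal_temp_c: float):
    """Fast stream (e.g. every 10 s): retrieve the last known ambient
    temperature and combine it with the device reading."""
    ambient = latest.get("weather")
    if ambient is None:
        return None  # no weather event has arrived yet
    return internal_temp_c - ambient  # derived metric: rise over ambient
```

The fast stream never blocks waiting for the slow one; it simply uses whatever the slow stream last wrote, which is the essential trade-off of this approach.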
Windowed operations: This sits between near-real-time and batch processing. Suppose a business problem requires monitoring a parameter over a defined interval of time. For example, a device expecting a voltage in a certain range can handle a brief spike, but if the excursion persists for a sustained period, the device is at risk. In this scenario, data over an interval of n seconds needs to be analyzed. This can be achieved by implementing a variation of stream analytics, and many providers offer solutions in this area.
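The voltage example can be sketched with a fixed-size sliding window: a single out-of-range reading is tolerated, but a full window of out-of-range readings raises the alarm. The window size and voltage limits are illustrative assumptions:

```python
from collections import deque

# Sliding-window check for the sustained-voltage example above.
# Window size and safe range are made-up values for the sketch.

class VoltageWindow:
    def __init__(self, size: int = 5, low: float = 210.0, high: float = 240.0):
        self.readings = deque(maxlen=size)  # keeps only the last `size` readings
        self.low, self.high = low, high

    def add(self, volts: float) -> bool:
        """Record a reading; return True only when a full window of
        consecutive readings is out of the safe range."""
        self.readings.append(volts)
        full = len(self.readings) == self.readings.maxlen
        return full and all(not (self.low <= v <= self.high) for v in self.readings)
```

A lone spike slides out of the window before it can trigger an alert, which is exactly the "spike vs. sustained excursion" distinction described above.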
Batch analytics: As the name suggests, the data is persisted and analyzed offline, in cold mode. This can also be done at the cloud level. Either way, a number of tools or solutions are required. The key business parameters are analyzed through batch processing. Often, this persisted data reaches Big Data scale, and specialized Big Data solutions (Hadoop MapReduce, Spark, etc.) may be required to process it. Here, various types of analytics can be used:
- Rule-based: The rules come from business and domain expertise.
- Descriptive analytics: Uses data mining and aggregation techniques to understand the current situation of the devices.
- Predictive analytics: Builds algorithms from the available data and uses them to predict future outcomes. Predicting an impending device failure in the next n days is a good example. This information can be used to plan maintenance so that it minimizes the downtime impact (during a weekend, if possible).
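A toy version of the predictive case can be sketched by fitting a linear trend to a daily degradation metric and projecting when it crosses a failure threshold. A real pipeline would train a proper model over many features; this only illustrates the idea, and the metric, threshold, and linearity assumption are all made up:

```python
# Toy predictive sketch: extrapolate a roughly linear degradation
# metric to decide whether failure is expected within n days.

def fails_within(history: list[float], threshold: float, n_days: int) -> bool:
    """history[i] is the degradation metric on day i (higher = worse)."""
    if len(history) < 2:
        return False  # not enough data to fit a trend
    m = len(history)
    xs = range(m)
    x_mean = sum(xs) / m
    y_mean = sum(history) / m
    # least-squares slope of metric vs. day index
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    projected = history[-1] + slope * n_days
    return projected >= threshold
```

If the answer is True, maintenance can be scheduled inside the n-day horizon, e.g. on the nearest weekend, which is the downtime-minimizing use described above.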
Complex event processing (using multiple sources and advanced techniques): This is by far the most complex and involved category. It requires combining data from various sources and/or applying predictive modeling in near real time to predict the outcome. For example, detecting a potentially fraudulent credit card transaction falls into this category. Past and present spending data needs to be combined, probably with the location of the usage, and this combined data needs to be passed to a published and validated machine learning model that can respond quickly enough to alert the cardholder before the transaction completes.
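The fraud example can be sketched as an enrich-then-score pipeline. Everything here is an assumption for illustration: the feature names, the thresholds, and the scoring function, which merely stands in for a published, validated ML model:

```python
from dataclasses import dataclass

# Hedged CEP sketch for the fraud example: enrich a transaction with
# historical and location context, then score the combined features.

@dataclass
class Txn:
    amount: float
    country: str


def score(features: dict) -> float:
    """Placeholder for a validated ML model; returns a fraud risk in [0, 1]."""
    risk = 0.0
    if features["amount_ratio"] > 5:   # far above the cardholder's typical spend
        risk += 0.5
    if features["new_country"]:        # unusual location for this card
        risk += 0.4
    return min(risk, 1.0)


def is_suspicious(txn: Txn, avg_spend: float, home_country: str) -> bool:
    """Combine past spending and location context, then score in near real time."""
    features = {
        "amount_ratio": txn.amount / avg_spend,
        "new_country": txn.country != home_country,
    }
    return score(features) >= 0.7  # alert before the transaction completes
```

The point of the sketch is the shape of the pipeline, not the scoring logic: several asynchronous sources (transaction stream, spend history, location) are joined into one feature set and scored within the latency budget of a card authorization.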
The following figure will make it easier to understand: