Across many of today’s integration use cases, batch processing, or some form of discrete transaction, forms the basis of application interaction. IoT projects, however, increasingly require event-driven architecture, which demands new integration patterns and new technology.
Large, complex environments (aircraft and buildings) or large numbers of remote devices (meters and sensors) can produce a perpetual, enormous stream of data, straining traditional data integration skills and demanding new ones, such as data ingestion and endpoint management across existing backend IT. Many IoT projects (connected cars, healthcare, industrial automation) involve millions of devices, far exceeding the number of endpoints companies are used to integrating into their enterprise architecture.
Additionally, IoT projects most often use highly distributed architectures that combine edge, gateway, and business system processing. This creates a complex integration chain in which the sheer volume of data can overwhelm not only the actors along the chain but also the network supporting the data flow.
There’s an API for That…
Following the mantra of the app economy, most vendors and developers will simply claim: "there's an API for that!" Unfortunately, this simplistic view of IoT integration can leave us fatally exposed to the limitations and assumptions baked into other use cases.
We have become comfortable, even complacent, in assuming that transactional request/response integration patterns are adequate for any problem, but that assumption is a path to failure when dealing with the problems described above. The need for an event-driven pattern then becomes obvious… but even this best practice doesn't solve the most difficult problem: data volume.
Inefficiency of Things!
Delivering content efficiently has become critically important to mobile and web developers as applications grow more complex and performance becomes priority one. HTTP/2 offers significant benefits to developers who have for years hacked their way around the inefficiency of HTTP's previous versions. But this hasn't delivered much to the integration chain for IoT. The inefficiency of things remains!
Luckily, this needn't be the case. While I certainly won't recommend ditching HTTP entirely, IoT developers need to take a critical look at how and where they are using the protocol. A publish/subscribe pattern will significantly reduce the load on backend systems, and we have at our disposal streaming APIs that offer far greater efficiency than transactional APIs; a subset of these tackle data volume, too. Take a look at our recent data integration eBook to see how data-efficient real-time streaming can benefit your IoT architecture compared with protocols like MQTT.
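To make the contrast with request/response concrete, here is a minimal in-memory sketch of the publish/subscribe pattern in Python. The `Broker` class and topic names are illustrative assumptions, not any particular product's API; in a real IoT deployment this role is played by a message broker or gateway, but the decoupling it shows is the point: publishers emit events once, and any number of backend consumers receive them without the device ever addressing a backend directly.

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory publish/subscribe broker (illustrative only)."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # Register a consumer for a topic; the publisher never sees it.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Fan one event out to every subscriber of the topic.
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("sensors/temperature", received.append)
broker.publish("sensors/temperature", {"device": "meter-1", "celsius": 21.5})
# received now holds the event, delivered without any request from the consumer
```

Because the device publishes to a topic rather than calling each backend system, adding a new consumer is a subscription change, not a device-side change, which is what keeps the load on backends manageable at IoT scale.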
In almost any integration scenario, reducing bandwidth consumption and improving connection performance on unreliable networks (especially the mobile internet) can have a profound effect on application usability, reliability, and performance. Nowhere is this truer than in IoT. With delta streaming, it's possible to remove unnecessary, redundant, and out-of-date data from the stream between endpoints.
As we've said, within complex things or across large numbers of connected devices, the volume of data can be debilitating. Using delta streaming, the amount of data exchanged between endpoints can be reduced by more than 80%, which not only lightens the load on your IoT infrastructure (edge, gateway, and business systems) but also increases the reliability and scalability of the network connecting these nodes.
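The core idea behind delta streaming can be sketched in a few lines of Python. This is a simplified illustration under the assumption that device state is a flat key/value snapshot; the `delta` helper is hypothetical, and production implementations also handle nested structures, compression, and periodic full-state resyncs.

```python
def delta(previous, current):
    """Return only the fields whose values changed since the last snapshot."""
    return {key: value for key, value in current.items()
            if previous.get(key) != value}

# A sensor reporting three fields, of which only one changed:
prev = {"temp": 21.5, "humidity": 40, "status": "ok"}
curr = {"temp": 21.6, "humidity": 40, "status": "ok"}

update = delta(prev, curr)  # only the changed field crosses the network
# update == {"temp": 21.6}
```

When most fields are stable between readings, as is typical for meters and sensors, only a small fraction of each snapshot needs to traverse the edge-to-backend chain, which is where the large bandwidth reductions come from.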
IoT integration architecture needs careful and critical thought for success.
- The scale of device data requires an event-driven, not transactional integration pattern.
- Existing IT systems are unprepared for the number of endpoints involved; a publish/subscribe gateway is needed.
- Devices, systems, and networks must be protected from data saturation, and delta streaming technologies can significantly reduce this burden.