Enough Already With ‘Event Streaming’
The next time you receive multiple emails in the space of one minute, ask yourself whether those emails were actually "streamed" to you.
Anyone watching the current (streaming?) video series of the 2022 EDA Summit is likely to once again be shocked by the number of EDA thought leaders making interminable references to event streaming. Why shocked? Because events are NEVER "streamed," and anyone claiming to understand EDA probably ought to understand that fairly fundamental point.
The Oxford Dictionary defines "streaming" as "a method of transmitting or receiving data (especially video and audio material) … as a steady, continuous flow, allowing playback to start while the rest of the data is still being received." Events, you are probably aware, are never split across separate payloads. Indeed, even on major platforms such as AWS, where event payloads are limited to a maximum of 256 KB, I have never seen an individual event payload that came close to exceeding this limit.
Some might argue that if an event payload ever did exceed the 256 KB limit set by AWS, then that event would indeed need to be "streamed," that is, split across numerous payloads. In fact, at least in the case of AWS (and my experience here is somewhat limited), even those payloads aren't split; they are in no way "streamed." The full event payload is instead delivered to an S3 bucket, and the trimmed event is sent as normal, except that it now contains a link to the full payload in S3. So even in the case of extremely large event payloads on AWS, nothing is ever "streamed" (because no one sends videos as events).
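This offloading is commonly known as the "claim-check" pattern. Below is a minimal sketch of it in Java, assuming the AWS SDK v2 for S3; the bucket name and the JSON shape of the trimmed event are illustrative assumptions, though the Amazon SQS Extended Client Library does something along these lines under the hood.

```java
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

import java.nio.charset.StandardCharsets;
import java.util.UUID;

public class ClaimCheckPublisher {
    private static final int MAX_EVENT_BYTES = 256 * 1024; // the AWS event size ceiling

    private final S3Client s3 = S3Client.create();
    private final String bucket = "oversized-event-payloads"; // hypothetical bucket name

    /**
     * Returns the event to publish: either the original payload, or a trimmed
     * event whose body is just a pointer to the full payload stored in S3.
     */
    public String prepareEvent(String payload) {
        if (payload.getBytes(StandardCharsets.UTF_8).length <= MAX_EVENT_BYTES) {
            return payload; // delivered whole, as a single message -- never "streamed"
        }
        String key = UUID.randomUUID().toString();
        s3.putObject(
                PutObjectRequest.builder().bucket(bucket).key(key).build(),
                RequestBody.fromString(payload));
        // The trimmed event still travels as one ordinary message, containing only the link.
        return "{\"payloadRef\":\"s3://" + bucket + "/" + key + "\"}";
    }
}
```

The consumer simply follows the link to fetch the full payload; at no point is the event itself transmitted as a continuous flow.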
So does the term "event streaming" even make sense? In fact, yes, but only when followed by the keyword "analytics," as in event streaming analytics. In the case of event streaming analytics, events are analyzed as a logical stream within the context of a rolling window of time. (Event) streaming analytics almost always relates to IoT, where, for example, the last rolling 30 minutes of events are compared with each other to detect discrepancies (e.g., in temperature). But once again, each individual payload delivered within the rolling window under analysis was received all at once, in its entirety: fairly obvious given the micro-size of IoT (MQTT) payloads.
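To make the distinction concrete, here is a minimal sketch in plain Java of such rolling-window analysis; the 30-minute window matches the example above, while the 5-degree deviation threshold is a hypothetical choice. Note that every reading arrives whole, as one tiny payload; only the analysis spans a window of time.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;

public class RollingWindowDetector {
    record Reading(Instant at, double temperature) {}

    private static final Duration WINDOW = Duration.ofMinutes(30);
    private static final double MAX_DEVIATION = 5.0; // hypothetical discrepancy threshold

    private final Deque<Reading> window = new ArrayDeque<>();

    /** Returns true if the new reading deviates from the rolling-window average. */
    public boolean onEvent(Reading r) {
        // Evict readings older than the rolling 30-minute window.
        while (!window.isEmpty() && window.peekFirst().at().isBefore(r.at().minus(WINDOW))) {
            window.pollFirst();
        }
        // Average over the prior readings in the window (the new reading excluded).
        double avg = window.stream()
                           .mapToDouble(Reading::temperature)
                           .average()
                           .orElse(r.temperature());
        window.addLast(r);
        return Math.abs(r.temperature() - avg) > MAX_DEVIATION;
    }
}
```

The "stream" here is purely logical: a sequence of complete, self-contained events held together only by the analysis window.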
This might all seem quite trivial, like the usual marketing blah blah, but I believe that those pundits within the EDA community making endless references to "event streaming" are in fact doing the community a disservice by adding unnecessary and factually incorrect confusion. Here I think especially of those pushing Kafka, which, as far as I am aware, has no pre-packaged "streaming analytics" capabilities, only Java/Scala libraries inviting you to code your own (see the sketch below). The next time you receive multiple spam emails in the space of a minute, ask yourself: were those emails not in fact "streamed" to you? Isn't MS Outlook, in the end, an "email streaming" client?
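To illustrate what "coding your own" means in practice, here is a rough sketch using the Kafka Streams Java library; the topic name, application ID, and 30-minute window are illustrative assumptions, not anything prescribed by Kafka.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.TimeWindows;

import java.time.Duration;
import java.util.Properties;

public class WindowedCountApp {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Each record on the topic is consumed as one complete payload;
        // only the aggregation below spans a window of time.
        builder.stream("sensor-readings", Consumed.with(Serdes.String(), Serdes.String()))
               .groupByKey()
               .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(30)))
               .count()
               .toStream()
               .foreach((windowedKey, count) ->
                       System.out.printf("%s -> %d readings in window%n", windowedKey, count));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "windowed-count-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        new KafkaStreams(builder.build(), props).start();
    }
}
```

Even here, each record arrives whole; only the windowed aggregation is "streaming" in any meaningful sense of the word.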
Published at DZone with permission of Cameron HUNT.