
Spark Streaming: Windowing

A tutorial on how data scientists can implement windowing functions in their Spark Streaming data sets using the Python language.

By Neha Priya · Sep. 03, 2018 · Tutorial

In our previous article, we talked about real-time streaming data.

Now, let's consider the idea of windows. In Spark Streaming, data arrives in small batches: each batch interval produces a new RDD, then another, and so on.

Spark batches the incoming data according to your batch interval, but sometimes you want to remember things from the past. Maybe you want to retain a rolling thirty-second average for some of your streaming data, but you want results every five seconds. In this case, you'd want a batch interval of five seconds, but a window length of thirty seconds. Spark provides several methods for making these kinds of calculations.
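To make the arithmetic concrete: with a five-second batch interval, a thirty-second window always covers the last six batches, and a new result is emitted every batch. The following plain-Python sketch (not Spark code; `sliding_window_averages` is a hypothetical helper) simulates that rolling average:

```python
# Sketch of sliding-window semantics in plain Python (not Spark).
# With a 5-second batch interval and a 30-second window, each window
# spans the last 6 batches, and a result is produced every interval.

def sliding_window_averages(batches, window_len=6):
    """Return, per batch interval, the average over the last window_len batches."""
    results = []
    for i in range(len(batches)):
        # The window covers at most the last `window_len` batches, fewer at startup.
        window = batches[max(0, i - window_len + 1): i + 1]
        flat = [v for batch in window for v in batch]
        results.append(sum(flat) / len(flat))
    return results

batches = [[10, 20], [30], [40, 50], [60], [70], [80], [90]]
print(sliding_window_averages(batches))
```

Spark's `window()` does the equivalent bookkeeping for you on DStreams, without you tracking old batches by hand.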

What if you want to see the highest value over every thirty minutes, but also update that value every five seconds?

That's a real problem: every five seconds we get a brand new RDD, yet we still need to remember the data from the previous RDDs.

The solution here is to use window functions.

Windowing lets us take the first batch, then the second, then the third, and build a window over all of them based on a specified time interval. That way we always have the newest RDD along with the history of the RDDs that fall within the window.

Window

The simplest windowing function is window(), which creates a new DStream by applying the windowing parameters to the old DStream. You can use any of the DStream operations on the new stream, so you've got all the flexibility you could ever want.

For example, suppose you want to POST all the active users from the last five seconds to a web service, but update the results every second.

from time import sleep

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="ActiveUsers")
ssc = StreamingContext(sc, 1)  # 1-second batch interval

activeUsers = [
    ["Alice", "Bob"],
    ["Bob"],
    ["Carlos", "Dan"],
    ["Carlos", "Dan", "Erin"],
    ["Carlos", "Frank"],
]

# Turn each batch of usernames into an RDD and queue them up,
# so the stream delivers one batch per interval.
rddQueue = []
for datum in activeUsers:
    rddQueue += [ssc.sparkContext.parallelize(datum)]

inputStream = ssc.queueStream(rddQueue)

# Window length of 5 seconds, sliding every 1 second: each result
# covers the users seen in the last five batches.
inputStream.window(5, 1)\
    .map(lambda x: set([x]))\
    .reduce(lambda x, y: x.union(y))\
    .pprint()

ssc.start()
sleep(5)
ssc.stop(stopSparkContext=True, stopGraceFully=True)

The goal is to keep a list of active users and print everyone who is online. When a new user checks in, the list should be updated to include them. Some users may have been inactive for the past couple of minutes, but that doesn't mean they are offline, so we keep every user seen within the window in the list, active or not, and refresh it every second. The code above implements this use case.

This example prints all active users from the last five seconds, but it prints them every second. We don't have to track state manually, because the window function keeps old data around for another four intervals. window() lets you specify the window length and the slide duration, that is, how often a new window is calculated.
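The behavior of that window(5, 1) pipeline can be simulated in plain Python (a sketch of the semantics, not Spark itself; `windowed_unions` is a hypothetical helper): each output is the union of the users seen in the last five batch intervals.

```python
# Plain-Python simulation of window(5, 1) over the activeUsers batches:
# at each interval, emit the union of users seen in the last 5 batches.

activeUsers = [
    ["Alice", "Bob"],
    ["Bob"],
    ["Carlos", "Dan"],
    ["Carlos", "Dan", "Erin"],
    ["Carlos", "Frank"],
]

def windowed_unions(batches, window_len=5):
    """Return, per interval, the set union over the last window_len batches."""
    out = []
    for i in range(len(batches)):
        window = batches[max(0, i - window_len + 1): i + 1]
        out.append(set().union(*window))
    return out

for users in windowed_unions(activeUsers):
    print(sorted(users))
```

Note how "Alice" still appears in later outputs even though she was only active in the first batch: the window remembers her until her batch slides out.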


Stay tuned for the reduceByKeyAndWindow() function!


Opinions expressed by DZone contributors are their own.
