
Amazon Redshift Workload Management (WLM): A Step-by-Step Guide

Amazon Redshift's WLM feature lets you define how queries are prioritized and resources are allocated. This guide walks through setting up WLM step by step.

By Arvind Toorpu · Oct. 14, 24 · Tutorial

As a database administrator or data engineer working with Amazon Redshift, it's crucial to manage resources effectively across different workloads. Amazon Redshift's Workload Management (WLM) feature lets you define how queries are prioritized and how resources like CPU and memory are allocated. This guide walks through setting up WLM step by step, making it straightforward for newcomers to get started.

What Is Workload Management (WLM)?

WLM allows Amazon Redshift to handle multiple concurrent queries by allocating resources to query queues. You can create custom queues, allocate memory, and set concurrency limits for specific workloads, ensuring that critical queries run efficiently even under heavy loads.
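Under manual WLM, the queue definitions are stored in the cluster's parameter group as a JSON document (the `wlm_json_configuration` parameter). As an illustrative sketch only (the queue values here are examples, not the settings used later in this article, and the field names should be verified against the current Redshift WLM configuration reference), a minimal two-queue manual configuration can be built and sanity-checked with Python's standard library:

```python
import json

# A minimal manual WLM configuration: one user-group queue plus a
# default queue. Memory is expressed as a percentage of cluster memory,
# and max_execution_time is in milliseconds.
wlm_config = [
    {
        "user_group": ["reporting_users"],  # queries from this group land here
        "query_concurrency": 3,             # at most 3 queries run at once
        "memory_percent_to_use": 40,        # 40% of cluster memory
        "max_execution_time": 600000,       # 10-minute timeout (ms)
    },
    {
        # A queue with no user/query group acts as the catch-all queue.
        "query_concurrency": 5,
        "memory_percent_to_use": 60,
    },
]

# Memory percentages across all queues must not exceed 100.
total_memory = sum(q.get("memory_percent_to_use", 0) for q in wlm_config)
assert total_memory <= 100, f"memory over-allocated: {total_memory}%"

wlm_json = json.dumps(wlm_config, indent=2)
print(wlm_json)
```

The resulting JSON is the kind of document you would apply to the cluster's parameter group; the steps below achieve the same thing through the console.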


Step-by-Step Setup Guide to Implement WLM

Step 1: Access Amazon Redshift Console

  1. Log in to your AWS Management Console.
  2. In the search bar, type "Redshift" and click on Amazon Redshift.
  3. Navigate to your Redshift cluster. Choose the cluster for which you want to set up WLM.

Step 2: Navigate to Workload Management (WLM) Settings

  1. Once inside your cluster, go to the Properties tab.
  2. Scroll down to Workload Management (WLM) settings.
  3. You'll see the current queues and configurations here.

Redshift WLM Setup

Step 3: Switch to Manual WLM (If Necessary)

By default, Redshift uses Automatic WLM, in which Amazon Redshift manages queue concurrency and memory allocation automatically. For finer-grained control, switch to Manual WLM:

  1. Click Modify.
  2. Disable Auto WLM and switch to Manual WLM to create custom queues.

Step 4: Define WLM Queues

Now let’s set up custom queues.

Create a New Queue

Click Add Queue. For example, let’s create a queue for Arvind’s Critical Reports. Give it a name, e.g., arvind_critical_queue.

Set the Concurrency level (the number of queries that can run simultaneously in this queue). For critical queries, a lower concurrency means each running query gets a larger share of the queue's resources. Set Concurrency to 3 for Arvind's queue.

Memory Allocation

Allocate memory for your queue. Amazon Redshift allocates memory as a percentage of the total available memory.

Let’s allocate 30% of the memory to arvind_critical_queue, ensuring that the queries in this queue have enough resources for optimal performance.

Timeout Settings (Optional)

To ensure that queries don’t hang, you can set a Query Timeout. Let’s give queries in arvind_critical_queue a 10-minute timeout (600 seconds).
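The console takes the timeout in human units, but in the underlying `wlm_json_configuration` JSON the `max_execution_time` field is expressed in milliseconds, so a 10-minute timeout becomes 600000. A hedged sketch of how Step 4's queue settings would appear in that JSON (field names per the WLM configuration reference; double-check against current AWS docs):

```python
import json

TEN_MINUTES_MS = 10 * 60 * 1000  # 600 seconds = 600,000 ms

# Step 4's settings for arvind_critical_queue as WLM JSON properties.
arvind_critical_queue = {
    "query_concurrency": 3,                # 3 queries at a time
    "memory_percent_to_use": 30,           # 30% of cluster memory
    "max_execution_time": TEN_MINUTES_MS,  # 10-minute query timeout
}

print(json.dumps(arvind_critical_queue, indent=2))
```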

Step 5: Create Additional Queues (Optional)

You may want to create multiple queues for different workloads. For example:

  • Arvind’s Regular Reports Queue:
      Concurrency: 5
      Memory Allocation: 50%
      Timeout: 15 minutes
  • Arvind’s Ad Hoc Queries Queue:
      Concurrency: 10
      Memory Allocation: 20%
      No Timeout

Step 6: Assign Query Groups to Queues

You can assign specific queries or user groups to specific WLM queues:

  1. Go to the WLM Query Monitoring Rules tab.
  2. Click Add Rule.

For example, to assign all queries run by arvind_user to the arvind_critical_queue:

  • Set user_group = arvind_user
  • Set queue_name = arvind_critical_queue

This ensures that all critical queries by Arvind are run through the appropriate queue, prioritizing them over less important workloads.
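Conceptually, manual WLM routes each incoming query to the first queue whose user group (or query group) matches, falling through to the default queue otherwise. The toy simulation below paraphrases that first-match rule for illustration; it is not Redshift's actual implementation, and `queue_name`/`default_queue` are labels invented here:

```python
def route_query(user_group, queues, default="default_queue"):
    """Return the first queue whose user_group list matches, else default."""
    for q in queues:
        if user_group in q.get("user_group", []):
            return q["queue_name"]
    return default

queues = [
    {"queue_name": "arvind_critical_queue", "user_group": ["arvind_user"]},
    {"queue_name": "arvind_regular_queue", "user_group": ["reporting"]},
]

print(route_query("arvind_user", queues))  # arvind_critical_queue
print(route_query("intern_user", queues))  # default_queue
```

Because matching is first-win, queue order matters: put the most specific, highest-priority queues first.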

User groups, concurrency scaling mode, and query priority selection boxes

Step 7: Save Changes and Monitor

Once you've configured the queues, click Save. Amazon Redshift will apply the new WLM configuration.

You can monitor the performance of these queues using CloudWatch to track query execution times, memory usage, and concurrency limits.

Example Scenario: How Arvind Manages Daily Reporting Workloads

Let’s say Arvind Toorpu runs several types of queries daily:

  • Critical reports: These queries are run early in the morning to generate business-critical reports. They need top priority and should not be delayed by other workloads. Arvind allocates 30% of memory and limits concurrency to 3 for these.
  • Regular reports: These queries generate regular, less urgent reports, so Arvind allocates 50% of memory and sets concurrency to 5 to allow multiple reports to run simultaneously.
  • Ad hoc queries: These are run sporadically during the day and aren’t time-sensitive. Arvind allocates only 20% of the resources, allowing more queries to run concurrently but with lower priority.

By configuring WLM in this way, Arvind ensures that critical tasks receive the resources they need while allowing flexibility for other workloads.

Final Thoughts

Workload Management (WLM) is a powerful tool in Amazon Redshift that manages multiple types of workloads. By creating queues with appropriate resource allocations and concurrency limits, you can ensure that your system runs efficiently, even under heavy load. For beginners, it's important to start with basic queues and gradually refine them as you better understand your workloads.

With the right setup, you can give priority to the queries that matter most and keep everything running smoothly, just like Arvind does with his daily reports!


Published at DZone with permission of Arvind Toorpu. See the original article here.

Opinions expressed by DZone contributors are their own.
