Managing Encrypted Aurora DAS Over Kinesis With AWS SDK

Amazon Aurora DAS streams encrypted activity data via Kinesis for secure monitoring. This guide shows how to decrypt it using the AWS Encryption SDK.

By Shubham Kaushik · Neeraj Kaushik · Jun. 03, 25 · Tutorial

When it comes to auditing and monitoring database activity, Amazon Aurora's Database Activity Stream (DAS) provides a secure, near real-time stream of activity events. By default, DAS encrypts all data in transit using AWS Key Management Service (KMS) with a customer-managed key (CMK) and streams the encrypted data into Amazon Kinesis Data Streams, a serverless streaming data service.

While this is great for compliance and security, reading and interpreting the encrypted data stream requires additional effort — particularly if you're building custom analytics, alerting, or logging solutions. This article walks you through how to read the encrypted Aurora DAS records from Kinesis using the AWS Encryption SDK. 

Security and compliance are top priorities when working with sensitive data in the cloud — especially in regulated industries such as finance, healthcare, and government. Amazon Aurora's DAS is designed to help customers monitor database activity in real time, providing deep visibility into queries, connections, and data access patterns. However, this stream of data is encrypted in transit by default using a customer-managed AWS KMS (Key Management Service) key and routed through Amazon Kinesis Data Streams for consumption. 

While this encryption model enhances data security, it introduces a technical challenge: how do you access and process the encrypted DAS data? The payload cannot be directly interpreted, as it's wrapped in envelope encryption and protected by your KMS CMK. 

Understanding the Challenge 

Before discussing the solution, it's important to understand how Aurora DAS encryption works: 

  • Envelope Encryption Model: Aurora DAS uses envelope encryption, where the data is encrypted with a data key, and that data key is itself encrypted using your KMS key. 
  • Two Encrypted Components: Each record in the Kinesis stream contains: 
    • The database activity events encrypted with a data key 
    • The data key encrypted with your KMS CMK 
  • Kinesis Data Stream Format: The records follow this structure: 
JSON
 
{
  "type": "DatabaseActivityMonitoringRecords",
  "version": "1.1",
  "databaseActivityEvents": "[encrypted audit records]",
  "key": "[encrypted data key]"
}
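
To confirm what these records look like before wiring up a consumer, you can read a few of them directly from the stream with boto3. This is an illustrative sketch, not part of the original walkthrough; the region and stream name are placeholders, and both payload fields are still encrypted at this point:

Python
 
import json
import boto3

kinesis = boto3.client('kinesis', region_name='your-region')
STREAM_NAME = 'aws-rds-das-your-cluster-resource-id'  # placeholder stream name

# Read a handful of records from the first shard
shard_id = kinesis.describe_stream(StreamName=STREAM_NAME)['StreamDescription']['Shards'][0]['ShardId']
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM_NAME,
    ShardId=shard_id,
    ShardIteratorType='TRIM_HORIZON'
)['ShardIterator']

for record in kinesis.get_records(ShardIterator=iterator, Limit=5)['Records']:
    envelope = json.loads(record['Data'])
    # 'databaseActivityEvents' and 'key' are base64-encoded ciphertext at this point
    print(envelope['type'], envelope['version'], len(envelope['databaseActivityEvents']))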


Solution Overview: AWS Encryption SDK Approach 

Aurora DAS applies envelope encryption to every record, and the AWS Encryption SDK is the tool that unwraps those layers so the activity data becomes readable. Here's why this specific approach is required: 

  • Handles Envelope Encryption: The SDK is designed to work with the envelope encryption pattern used by Aurora DAS. 
  • Integrates with KMS: It seamlessly integrates with your KMS keys for the initial decryption of the data key. 
  • Manages Cryptographic Operations: The SDK handles the complex cryptographic operations required for secure decryption. 

The decryption process follows these key steps: 

  • First, decrypt the encrypted data key using your KMS CMK. 
  • Then, use that decrypted key to decrypt the database activity events.
  • Finally, decompress the decrypted data to get the readable JSON output.

Implementation 

Step 1: Set Up Aurora With Database Activity Streams 

Before implementing the decryption solution, ensure you have: 

  1. An Aurora PostgreSQL or MySQL cluster with sufficient permissions 
  2. A customer-managed KMS key for encryption 
  3. Database Activity Streams enabled on your Aurora cluster 

When you turn on DAS, AWS sets up a Kinesis stream called aws-rds-das-[cluster-resource-id] that receives the encrypted data.
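
If you prefer to enable the stream programmatically rather than through the console, a minimal boto3 sketch looks like the following. It is an assumption-laden example rather than part of the original setup steps: the cluster identifier, ARN, KMS key alias, and region are placeholders to replace with your own values.

Python
 
import boto3

rds = boto3.client('rds', region_name='your-region')  # placeholder region

# Enable DAS on the cluster (placeholder ARN and KMS key alias)
response = rds.start_activity_stream(
    ResourceArn='arn:aws:rds:your-region:123456789012:cluster:your-aurora-cluster',
    Mode='async',                  # 'async' favors performance, 'sync' favors durability
    KmsKeyId='alias/your-das-cmk',
    ApplyImmediately=True
)
print(response['KinesisStreamName'])  # e.g. aws-rds-das-<cluster-resource-id>

# The cluster resource ID is needed later for the KMS encryption context
cluster = rds.describe_db_clusters(DBClusterIdentifier='your-aurora-cluster')['DBClusters'][0]
print(cluster['DbClusterResourceId'])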

Step 2: Prepare the AWS Encryption SDK Environment 

For decrypting DAS events, your processing application (typically a Lambda function) needs the AWS Encryption SDK. This SDK is not included in standard AWS runtimes and must be added separately. 

Why this matters: The AWS Encryption SDK provides specialized cryptographic algorithms and protocols designed specifically for envelope encryption patterns used by AWS services like DAS. 

The most efficient approach is to create a Lambda Layer containing the following packages (a short publishing sketch follows the list): 

  • aws_encryption_sdk: Required for the envelope decryption process 
  • boto3: Needed for AWS service interactions, particularly with KMS 
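
Assuming you have already built a zip file containing these two packages (for example, by installing them with pip into a local directory), one way to publish it as a layer and attach it to your function is sketched below. The file name, layer name, and function name are hypothetical:

Python
 
import boto3

lambda_client = boto3.client('lambda', region_name='your-region')  # placeholder region

# Publish the pre-built zip (containing aws_encryption_sdk and boto3) as a layer
with open('encryption-sdk-layer.zip', 'rb') as f:  # hypothetical file name
    layer = lambda_client.publish_layer_version(
        LayerName='aws-encryption-sdk',
        Content={'ZipFile': f.read()},
        CompatibleRuntimes=['python3.12']
    )

# Attach the layer to the DAS-processing function (hypothetical function name)
lambda_client.update_function_configuration(
    FunctionName='das-decryptor',
    Layers=[layer['LayerVersionArn']]
)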

Step 3: Implement the Decryption Logic 

Here’s a Lambda function example that handles decrypting DAS events. Each part of the decryption process is thoroughly documented with comments in the code:

Python
 
import base64
import json
import zlib
import boto3
import aws_encryption_sdk
from aws_encryption_sdk import CommitmentPolicy
from aws_encryption_sdk.internal.crypto import WrappingKey
from aws_encryption_sdk.key_providers.raw import RawMasterKeyProvider
from aws_encryption_sdk.identifiers import WrappingAlgorithm, EncryptionKeyType

# Configuration - update these values
REGION_NAME = 'your-region'  # Change to your region
RESOURCE_ID = 'your cluster resource ID'  # Change to your RDS resource ID

# Initialize encryption client with appropriate commitment policy
# This is required for proper operation with the AWS Encryption SDK
enc_client = aws_encryption_sdk.EncryptionSDKClient(
    commitment_policy=CommitmentPolicy.FORBID_ENCRYPT_ALLOW_DECRYPT
)

# Custom key provider class for decryption
# This class is necessary to use the raw data key from KMS with the Encryption SDK
class MyRawMasterKeyProvider(RawMasterKeyProvider):
    provider_id = "BC"

    def __new__(cls, *args, **kwargs):
        obj = super(RawMasterKeyProvider, cls).__new__(cls)
        return obj

    def __init__(self, plain_key):
        RawMasterKeyProvider.__init__(self)
        # Configure the wrapping key with the proper algorithm for DAS decryption
        self.wrapping_key = WrappingKey(
            wrapping_algorithm=WrappingAlgorithm.AES_256_GCM_IV12_TAG16_NO_PADDING,
            wrapping_key=plain_key,
            wrapping_key_type=EncryptionKeyType.SYMMETRIC
        )

    def _get_raw_key(self, key_id):
        # Return the wrapping key when the Encryption SDK requests it
        return self.wrapping_key

# First decryption step: use the data key to decrypt the payload
def decrypt_payload(payload, data_key):
    # Create a key provider using our decrypted data key
    my_key_provider = MyRawMasterKeyProvider(data_key)
    my_key_provider.add_master_key("DataKey")

    # Decrypt the payload using the AWS Encryption SDK
    decrypted_plaintext, header = enc_client.decrypt(
        source=payload,
        materials_manager=aws_encryption_sdk.materials_managers.default.DefaultCryptoMaterialsManager(
            master_key_provider=my_key_provider)
    )
    return decrypted_plaintext

# Second step: decompress the decrypted data
# DAS events are compressed before encryption to save bandwidth
def decrypt_decompress(payload, key):
    decrypted = decrypt_payload(payload, key)
    # Use zlib with specific window bits for proper (gzip) decompression
    return zlib.decompress(decrypted, zlib.MAX_WBITS + 16)

# Main Lambda handler function that processes events from Kinesis
def lambda_handler(event, context):
    session = boto3.session.Session()
    kms = session.client('kms', region_name=REGION_NAME)

    for record in event['Records']:
        # Step 1: Get the base64-encoded data from Kinesis
        payload = base64.b64decode(record['kinesis']['data'])
        record_data = json.loads(payload)

        # Step 2: Extract the two encrypted components
        payload_decoded = base64.b64decode(record_data['databaseActivityEvents'])
        data_key_decoded = base64.b64decode(record_data['key'])

        # Step 3: Decrypt the data key using KMS
        # This is the first level of decryption in the envelope model
        data_key_decrypt_result = kms.decrypt(
            CiphertextBlob=data_key_decoded,
            EncryptionContext={'aws:rds:dbc-id': RESOURCE_ID}
        )
        decrypted_data_key = data_key_decrypt_result['Plaintext']

        # Step 4: Use the decrypted data key to decrypt and decompress the events
        # This is the second level of decryption in the envelope model
        decrypted_event = decrypt_decompress(payload_decoded, decrypted_data_key)

        # Step 5: Process the decrypted event
        # At this point, decrypted_event contains the plaintext JSON of database activity
        print(decrypted_event)

        # Additional processing logic would go here
        # For example, you might:
        # - Parse the JSON and extract specific fields
        # - Store events in a database for analysis
        # - Trigger alerts based on suspicious activities

    return {
        'statusCode': 200,
        'body': json.dumps('Processing Complete')
    }
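
Once decrypted and decompressed, each payload is a JSON document whose databaseActivityEventList array holds the individual activity events, including periodic heartbeat entries that carry no user activity. The helper below is a sketch of what the "additional processing logic" could look like; it is not part of the original example, and the field names follow the documented DAS activity record format, so verify them against your engine version:

Python
 
import json

def extract_activity_events(decrypted_event):
    """Filter out heartbeats and keep the fields most useful for auditing."""
    activity = json.loads(decrypted_event)
    events = []
    for evt in activity.get('databaseActivityEventList', []):
        if evt.get('type') == 'heartbeat':  # heartbeats carry no user activity
            continue
        events.append({
            'logTime': evt.get('logTime'),
            'dbUserName': evt.get('dbUserName'),
            'databaseName': evt.get('databaseName'),
            'command': evt.get('command'),
            'commandText': evt.get('commandText'),
        })
    return events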


Step 4: Error Handling and Performance Considerations 

As you implement this solution in production, keep these key factors in mind: 

Error Handling: 

  • KMS permissions: Ensure the Lambda execution role has kms:Decrypt permission on the CMK used by DAS; without it, the data key cannot be decrypted.
  • Encryption context: The encryption context must match exactly; DAS uses aws:rds:dbc-id set to the cluster resource ID.
  • Resource ID: Make sure you're using the correct Aurora cluster resource ID; if it's wrong, the KMS decryption step will fail. A defensive sketch of this call follows the list.
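
To make these failure modes easier to diagnose, the KMS call from the handler can be wrapped in a small helper like the sketch below. This is an illustrative addition rather than part of the original code, and the helper name is hypothetical:

Python
 
from botocore.exceptions import ClientError

def decrypt_data_key(kms, data_key_decoded, resource_id):
    """Decrypt the DAS data key, logging the most common misconfigurations."""
    try:
        result = kms.decrypt(
            CiphertextBlob=data_key_decoded,
            EncryptionContext={'aws:rds:dbc-id': resource_id}
        )
        return result['Plaintext']
    except ClientError as err:
        code = err.response['Error']['Code']
        if code == 'InvalidCiphertextException':
            # Typically a wrong resource ID, i.e. a mismatched encryption context
            print(f'Encryption context mismatch for {resource_id}: {err}')
        elif code == 'AccessDeniedException':
            # The Lambda execution role is missing kms:Decrypt on the CMK
            print(f'Missing KMS permissions: {err}')
        else:
            print(f'KMS decryption failed: {err}')
        raise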

Performance Considerations: 

  • Batch size: Configure an appropriate Kinesis batch size for your Lambda trigger; a configuration sketch follows this list.
  • Timeout settings: Decryption operations may require longer function timeouts.
  • Memory allocation: Processing encrypted streams needs more memory than plain record processing, since each batch is decrypted and decompressed inside the function.
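
As one way of making the batch size explicit, the Kinesis trigger for the Lambda can be created with boto3 as sketched below. The stream ARN and function name are placeholders, and the specific values are starting points to tune rather than recommendations:

Python
 
import boto3

lambda_client = boto3.client('lambda', region_name='your-region')  # placeholder region

# Wire the DAS Kinesis stream to the decryption function with explicit batching
lambda_client.create_event_source_mapping(
    EventSourceArn='arn:aws:kinesis:your-region:123456789012:stream/aws-rds-das-your-cluster-resource-id',
    FunctionName='das-decryptor',           # hypothetical function name
    StartingPosition='LATEST',
    BatchSize=100,                          # balance latency against per-invocation cost
    MaximumBatchingWindowInSeconds=5,       # wait briefly to fill batches
    BisectBatchOnFunctionError=True,        # isolate poison records on repeated failures
    MaximumRetryAttempts=3
)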

Conclusion 

Aurora's Database Activity Streams provide powerful auditing capabilities, but the default encryption presents a technical challenge for utilizing this data. By leveraging the AWS Encryption SDK and understanding the envelope encryption model, you can successfully decrypt and process these encrypted streams. 

The key takeaways from this article are: 

  1. Aurora DAS uses a two-layer envelope encryption model that requires specialized decryption 
  2. The AWS Encryption SDK is essential for properly handling this encryption pattern 
  3. The decryption process involves first decrypting the data key with KMS, then using that key to decrypt the actual events 
  4. Proper implementation enables you to unlock valuable database activity data for security monitoring and compliance 

By following this approach, you can build robust solutions that leverage the security benefits of encrypted Database Activity Streams while still gaining access to the valuable insights they contain. 
