Continuously Encrypt Amazon Redshift Loads with S3, KMS, and Lambda

When building a new system, our urge is to do the magic, make it work, and gain user appreciation as fast as we can. However, your data loads still need to be continuously encrypted.

By Alon Brody · Mar. 10, 17 · Tutorial


One of the main goals of this blog is to help developers and data architects, just like us, with their Amazon Redshift operations. We have covered everything from a full comparison with Google BigQuery, to loading Google Analytics data into Redshift and analyzing it with Tableau, to tools such as pgproxy, which can help you load balance queries between multiple Redshift clusters. This article should extend your Redshift know-how further and help you do your job better.

When building a new system, our urge is to do the magic, make it work, and gain user appreciation for it as fast as we can. Security is not the first consideration. However, as the system grows, and especially as the amount of data we store grows, the realization that there's an asset that needs protecting becomes abundantly clear.

This guide will help you continuously encrypt your data loads to Redshift and make sure your information is safe.

About Amazon KMS

Before getting into the step-by-step details, note that we are going to use one of Amazon's main encryption tools, the AWS Key Management Service (KMS). This service allows for the creation and management of encryption keys. It is based on the envelope encryption concept, so it requires two keys for encrypting and decrypting operations: a data key to encrypt the data and a master key to encrypt the data key.
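To make the envelope concept concrete, here is a minimal sketch using boto3 (not part of the utility itself); the key alias assumes the master key we create in step 3:

import boto3

# Ask KMS for a data key generated under the master key (alias created in step 3).
kms = boto3.client("kms", region_name="us-west-2")
data_key = kms.generate_data_key(KeyId="alias/unloadcopyutility", KeySpec="AES_256")

plaintext_key = data_key["Plaintext"]       # used locally to encrypt the data, then discarded
encrypted_key = data_key["CiphertextBlob"]  # stored alongside the data; only KMS can turn it back into the plaintext key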

Using the Amazon Unload/Copy Utility

This Amazon Redshift utility helps to migrate data between Redshift clusters or databases. Using it, we can also move the data from Redshift to S3; we will then use KMS to encrypt the data and return it to Redshift.

Image source: GitHub.

This is planned for use as a scheduled activity for instances running an AWS Data Pipeline ShellCommandActivity.
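If you do schedule it through Data Pipeline, the activity definition is roughly the following fragment (a sketch only; the Ec2Instance and HourlySchedule objects are assumed to be defined elsewhere in the pipeline, and the script path is a placeholder):

{
  "id": "RedshiftUnloadCopyActivity",
  "name": "RedshiftUnloadCopyActivity",
  "type": "ShellCommandActivity",
  "command": "python /home/ec2-user/redshift-unload-copy.py s3://redshift-copy-cluster/config.json us-west-2",
  "runsOn": { "ref": "Ec2Instance" },
  "schedule": { "ref": "HourlySchedule" }
}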

1. Connect to Redshift

We use PyGreSQL (a Python module) to connect to our Redshift cluster. To install PyGreSQL on an EC2 Linux instance, run these commands:

sudo easy_install pip
sudo yum install postgresql postgresql-devel gcc python-devel
sudo pip install pygresql
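Once PyGreSQL is installed, connecting to the cluster looks roughly like this (a minimal sketch with placeholder credentials; the unload/copy script handles the connection for you):

from pgdb import connect  # PyGreSQL's DB-API 2.0 module

# Placeholder endpoint and credentials; Redshift listens on port 5439 by default.
conn = connect(
    database="studentdb",
    host="source-cluster.c6jhdm5beccty.us-west-2.redshift.amazonaws.com:5439",
    user="redshift",
    password="<password>",
)
cursor = conn.cursor()
cursor.execute("select current_database(), current_user")
print(cursor.fetchone())
conn.close()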

2. Create Tables

Next, we need to create two identical tables, one in the source Redshift cluster and another in the destination Redshift cluster. In source Redshift, we will create a table named "source_student_data," and in destination Redshift, we will create a table named "destination_student_data."

create table source_student_data
(
	adm_date timestamp,
	student_id integer,
	student_country varchar(10),
	student_division varchar(20),
	student_state_name char(15),
	student_semester char(8),
	student_ap char(3)
)
diststyle even
sortkey
(
	adm_date
);

create table destination_student_data
(
	adm_date timestamp,
	student_id integer,
	student_country varchar(10),
	student_division varchar(20),
	student_state_name char(15),
	student_semester char(8),
	student_ap char(3)
)
diststyle even
sortkey
(
	adm_date
);

3. Create a KMS Master Key

Because we are going to encrypt our Redshift data with KMS, we need to create a KMS master key. In this step, we will create it using the following script, named createkmskey.sh:

#!/bin/bash
keyarn=$(aws kms create-key --description unloadcopyutility --key-usage "ENCRYPT_DECRYPT" --region us-west-2 --query KeyMetadata.Arn)

keyid=$(echo $keyarn | cut -d/ -f2 | tr -d '"')

aws kms create-alias --target-key-id $keyid --alias-name "alias/unloadcopyutility" --region us-west-2

echo "Created new KMS master key $keyarn with alias/unloadcopyutility"

4. Encrypt Your Password

After creating a KMS master key, it's time to encrypt our password. We will write a bash script named encryptvalue.sh, which generates a base64-encoded encrypted value that we will use in our configuration file in the next step. The encryptvalue.sh script looks like this:

#!/bin/bash

key=alias/unloadcopyutility

aws kms encrypt --key-id $key --plaintext $1 --region us-west-2 --query CiphertextBlob --output text

After creating the script, execute the following command with a password:

$ ./encryptvalue.sh password123

The output will resemble the following:

dqplc+63kj29llps+izexfl5ce47qrg+ptqcnahqfhy0fbkraqebagb4lgpvebyoeoyb7figrmrzeqnuo0k4pqbagpwb0br2nhwaaabomgygcsqgsib3dqehbqbzmfccaqawugyjkozihvcnaqcbmb4gcwcgsaflawqbljarbawcor73wpqthnkyshmcarcajbci0vusbm9izm8s8fhkxhtk9vgco5slp+odimgbnvyce5qod6k=

Now, we need to copy this value and insert it into our configuration file.
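If you want to sanity-check the round trip first, the encrypted value can be decrypted back with the same key (ciphertext.bin is just a temporary file name used for this example):

$ echo "<encrypted value>" | base64 --decode > ciphertext.bin
$ aws kms decrypt --ciphertext-blob fileb://ciphertext.bin --region us-west-2 --query Plaintext --output text | base64 --decode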

5. Create Configuration File for Redshift Info

We are going to use a Python script to automatically unload and copy our Redshift data, so we need to provide some information beforehand. In this step, we will create a configuration file named config.json, where we will provide the Redshift cluster endpoints, database names, AWS access keys, and S3 bucket, as well as the base64-encoded password created in the previous step.

{
  // The source database from which we'll export data
  "unloadSource": {
    "clusterEndpoint": "source-cluster.c6jhdm5beccty.us-west-2.redshift.amazonaws.com",
    "clusterPort": 5439,
    // Base64-encoded password for the user to unload data as. Use the encryptvalue.sh utility to generate this string
    "connectPwd": "dqplc+63kj29llps+izexfl5ce47qrg+ptqcnahqfhy0fbkraqebagb4lgpvebyoeoyb7figrmrzeqnuo0k4pqbagpwb0br2nhwaaabomgygcsqgsib3dqehbqbzmfccaqawugyjkozihvcnaqcbmb4gcwcgsaflawqbljarbawcor73wpqthnkyshmcarcajbci0vusbm9izm8s8fhkxhtk9vgco5slp+odimgbnvyce5qod6k=",
    "connectUser": "redshift",
    "db": "studentdb",
    "schemaName": "public",
    "tableName": "source_student_data"
  },
  // Location and credentials for S3, which are used to store migrated data while in flight
  "s3Staging": {
    "aws_access_key_id": "awzxxxxxxxxx",
    "aws_secret_access_key": "cjnwqxxxxxxxxxxx",
    // Path on S3 to use for storing in-flight data. The current date and time is appended to the prefix
    "path": "s3://redshift-copy-cluster/prefix/",
    "deleteOnSuccess": "true",
    // Region to use for the S3 export
    "region": "us-west-2"
  },
  // The destination database into which we will copy data
  "copyTarget": {
    "clusterEndpoint": "destination-cluster.c6jhdm5beccty.us-west-2.redshift.amazonaws.com",
    "clusterPort": 5439,
    // Base64-encoded password for the user to copy data as. Use the encryptvalue.sh utility to generate this string
    "connectPwd": "dqplc+63kj29llps+izexfl5ce47qrg+ptqcnahqfhy0fbkraqebagb4lgpvebyoeoyb7figrmrzeqnuo0k4pqbagpwb0br2nhwaaabomgygcsqgsib3dqehbqbzmfccaqawugyjkozihvcnaqcbmb4gcwcgsaflawqbljarbawcor73wpqthnkyshmcarcajbci0vusbm9izm8s8fhkxhtk9vgco5slp+odimgbnvyce5qod6k=",
    "connectUser": "redshift",
    "db": "studentdb",
    "schemaName": "public",
    "tableName": "destination_student_data"
  }
}

6. Encrypt the Data

We are using a Python script named redshift-unload-copy.py, which will unload the source data from Redshift, encrypt the data with the KMS master key and upload it to S3, and finally copy the encrypted data from S3 to the destination Redshift cluster.

$ python redshift-unload-copy.py s3://redshift-copy-cluster/config.json us-west-2

To access the full script, click here.
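Under the hood, the script issues statements roughly like the following (a simplified sketch; the staging path, the credentials, and the symmetric data key that KMS generates from the master key are all placeholders):

-- unload the source table to S3, client-side encrypted with the KMS-generated data key
unload ('select * from source_student_data')
to 's3://redshift-copy-cluster/prefix/2017-03-10_10-00-00/'
credentials 'aws_access_key_id=<key>;aws_secret_access_key=<secret>;master_symmetric_key=<data key>'
encrypted gzip;

-- copy the encrypted files into the destination table using the same data key
copy destination_student_data
from 's3://redshift-copy-cluster/prefix/2017-03-10_10-00-00/'
credentials 'aws_access_key_id=<key>;aws_secret_access_key=<secret>;master_symmetric_key=<data key>'
encrypted gzip;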

Using AWS Lambda

This solution builds an automatic pipeline that creates a KMS master key, uploads encrypted data to S3, and copies the encrypted data back to Redshift. Next, we'll use Lambda to continuously encrypt newly incoming data.

At the initial stage, Lambda receives an S3 notification. Based on the file prefix, Lambda gets the bucket and the key, then builds the COPY command that will run in the destination Redshift cluster. To make the Lambda function idempotent, it verifies that the file has not already been copied before executing the COPY command.

To implement this solution, we need to do the following things:

First of all, download the source file from this location.

After downloading the ZIP file, unzip it and edit the parameters of the copy.py script according to these requirements:

iam_role = "arn:aws:redshift:us-west-2:805124xxxxxx:cluster:destination-cluster"
db_database = "studentdb"
db_user = "redshift"
db_password = "xxxxxxx"
db_port = "5439"
db_host = "destination-cluster.c6jhdm5beccty.us-west-2.redshift.amazonaws.com"
query_bucket = "s3://redshift-copy-cluster"
query_prefix = "s3://redshift-copy-cluster/prefix"
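For orientation, the handler inside copy.py follows roughly this pattern, using the parameters defined above. This is a simplified, hypothetical sketch rather than the actual script; in particular, the duplicate check against stl_load_commits is only illustrative:

from pgdb import connect

def lambda_handler(event, context):
    # The S3 event notification carries the bucket and key of the newly arrived file.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    s3_path = "s3://{}/{}".format(bucket, key)

    conn = connect(database=db_database, host="{}:{}".format(db_host, db_port),
                   user=db_user, password=db_password)
    cur = conn.cursor()

    # Idempotency: skip files Redshift has already committed a load for.
    cur.execute("select count(*) from stl_load_commits where filename like '%{}%'".format(key))
    already_loaded = cur.fetchone()[0] > 0

    if not already_loaded:
        cur.execute("copy destination_student_data from '{}' iam_role '{}'".format(s3_path, iam_role))
        conn.commit()

    conn.close()
    return "OK"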

Once the copy.py file is edited, zip the whole folder again.

Now, in the AWS console, go to AWS Lambda and create a function.

Image: Creating a Lambda function in the AWS console.

Click Create a Lambda function to display the next page.

Image: Selecting a blueprint for the Lambda function.

Select Blank Function to configure the triggers.

Image: Configuring triggers.

Click Next.

Image: Configuring the function.

On this page, specify the name, description, and runtime environment. For our use case, the runtime environment will be Python 2.7. From the Code entry type drop-down list, select Upload a ZIP file and select the ZIP file we created earlier.

Image: Advanced settings.

Provide the handler, role, and existing role information, and click Next.

Image: Providing the handler and role.

Click Create function to create the function in Lambda.

The function runs the COPY query to load the data into the destination Redshift cluster upon receiving any encrypted data in the S3 bucket.

This solution can be used to replicate real-time data from one Redshift cluster to another. In that case, we need to configure an event trigger on the S3 bucket. For details on configuring an event trigger with an S3 bucket, we can follow this link.
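As a sketch, the trigger can also be wired up from the CLI: first allow S3 to invoke the function, then register the notification (the function name redshift-copy-loader and the account ID are placeholders):

$ aws lambda add-permission --function-name redshift-copy-loader \
    --statement-id s3-invoke --action lambda:InvokeFunction \
    --principal s3.amazonaws.com --source-arn arn:aws:s3:::redshift-copy-cluster

$ aws s3api put-bucket-notification-configuration --bucket redshift-copy-cluster \
    --notification-configuration '{
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": "arn:aws:lambda:us-west-2:805124xxxxxx:function:redshift-copy-loader",
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": "prefix/"}]}}
        }]
    }'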

Final Note

Using our own KMS customer-managed keys allows us to protect the Amazon Redshift data and gives us full control over who can use these keys to access the cluster data. It is worth mentioning that the AWS services S3 and KMS provide an easy way to encrypt data loads without any additional charge, though KMS has a limited free tier of up to 20,000 requests per month.

In the past, we had to meet with our security professionals and ask them to integrate security into our data repositories, and in large and complex environments we still do. However, we can also follow guidelines like these, which enable us, the data architects, to ensure the bare minimum when it comes to running our data solutions in the cloud. I hope this solution will help you do your work in a safe and secure data environment.


Published at DZone with permission of Alon Brody. See the original article here.

Opinions expressed by DZone contributors are their own.
