
AWS Cross Account S3 Access Through Lambda Functions

In this article, you will learn how to set up AWS Cross Account S3 access through Lambda functions, covering the configuration process in detail.

By Kondala Rao Patibandla · May. 09, 23 · Tutorial

This article describes the steps involved in setting up Lambda functions to access AWS S3 in a cross-account scenario. The configuration process is explained in detail, giving readers a clear picture of what cross-account access to S3 through Lambda requires. It is aimed at AWS developers who want to deepen their knowledge of cross-account S3 access and learn how to configure Lambda functions for this purpose.

A high-level illustration of AWS Cross Account S3 access through Lambda.

Prerequisites

To create this solution, you must have the following prerequisites:

  • An AWS account.
  • Python 3, preferably the latest version.
  • Basic knowledge of AWS SDK for Python (boto3).
  • Basic knowledge of S3 and KMS Services.
  • Basic knowledge of Lambda and IAM Roles.

Setup in Account A

In AWS Account A, create the following resources step by step.

  • Create S3 Bucket
  • Setup S3 bucket policy
  • Setup KMS key

Create S3 Bucket

Let's begin by creating an S3 bucket in AWS Account A. Firstly, log in to the AWS console and then navigate to the S3 dashboard. 

S3 Dashboard

Please enter the bucket name and region, and then click on the "Create" button.

Successfully created bucket.
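
If you prefer to script this step, the following is a minimal boto3 sketch of the same bucket creation. The bucket name matches the one used throughout this article; the region is an assumption, so adjust it to your own.

Python

import boto3

# Assumption: us-west-2 is only an example region; change it to yours.
s3 = boto3.client("s3", region_name="us-west-2")
s3.create_bucket(
    Bucket="demo-s3-cross-account-bucket",
    # For us-east-1, omit CreateBucketConfiguration entirely.
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)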

Setup S3 Bucket Policy

It is important to set up a bucket policy that allows the cross-account Lambda role (created later in Account B) to access the S3 bucket and perform S3 operations.

JSON
 
{
	"Version": "2012-10-17",
	"Statement": [
		{
			"Sid": "AllowCrossAccountLambdaRoleAccess",
			"Effect": "Allow",
			"Principal": {
				"AWS": "arn:aws:iam:::role/lambda-s3-access-role"
			},
			"Action": "s3:*",
			"Resource": [
				"arn:aws:s3:::demo-s3-cross-account-bucket",
				"arn:aws:s3:::demo-s3-cross-account-bucket/*"
			]
		}
	]
}


Please navigate to the Permissions tab and edit the bucket policy, replacing the existing policy with the one provided above.
Edit the bucket policy
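
For those who prefer the AWS SDK over the console, here is a minimal boto3 sketch that applies the same bucket policy. It assumes you run it with credentials in Account A and substitute the full ARN (including the account ID) of the Lambda role in Account B.

Python

import json
import boto3

# Assumption: replace the role ARN below with the full ARN of the
# Account B role, including its account ID.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCrossAccountLambdaRoleAccess",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam:::role/lambda-s3-access-role"},
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::demo-s3-cross-account-bucket",
                "arn:aws:s3:::demo-s3-cross-account-bucket/*",
            ],
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(
    Bucket="demo-s3-cross-account-bucket",
    Policy=json.dumps(bucket_policy),
)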

Setup KMS Key

This step is crucial for enabling encryption and decryption of S3 bucket data using the KMS key. It's important to remember that if the bucket is encrypted with a KMS key, the cross-account Lambda role must also be authorized on that key, or requests will fail with an access denied error. To achieve this, we need to modify the existing KMS key policy to permit the cross-account Lambda role to perform encryption and decryption operations.

JSON
 
{
  "Version": "2012-10-17",
  "Statement": [
      {
          "Sid": "AllowCrossAccountLambdaRoleUseOfKey",
          "Effect": "Allow",
          "Principal": {
              "AWS": "arn:aws:iam:::role/lambda-s3-access-role"
          },
          "Action": [
              "kms:Encrypt",
              "kms:Decrypt",
              "kms:DescribeKey",
              "kms:GenerateDataKey*",
              "kms:ReEncrypt*"
          ],
          "Resource": "*"
      }
  ]
}


To enable the cross-account Lambda role, go to the KMS dashboard and choose the key that's linked to the S3 bucket. Then, select the "Key Policy" tab and add the above statement to the existing key policy.

Key Policy Tab
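
A KMS key has exactly one key policy, so the new statement has to be merged into the existing policy rather than replace it. The sketch below shows one way to do this with boto3, assuming a placeholder key ID and the default policy name; run it with credentials in Account A.

Python

import json
import boto3

kms = boto3.client("kms")
key_id = "<KMS_KEY_ID_IN_ACCOUNT_A>"  # placeholder: the key linked to the bucket

# Fetch the current key policy and append the cross-account statement to it.
policy = json.loads(kms.get_key_policy(KeyId=key_id, PolicyName="default")["Policy"])
policy["Statement"].append({
    "Sid": "AllowCrossAccountLambdaRoleUseOfKey",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam:::role/lambda-s3-access-role"},
    "Action": [
        "kms:Encrypt",
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:GenerateDataKey*",
        "kms:ReEncrypt*",
    ],
    "Resource": "*",
})
kms.put_key_policy(KeyId=key_id, PolicyName="default", Policy=json.dumps(policy))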

Setup in Account B

In AWS Account B, create the following resources step by step.

  • Create IAM Policy and Role
  • Create Lambda function
  • Test Lambda and Verify

Create IAM Policy and Role

Let's move on and begin creating an IAM role in AWS Account B. Log in to the AWS Account B console and navigate to the IAM dashboard. 

IAM Dashboard

Choose "AWS service" as the trusted entity type and "Lambda" as the use case, then click on the "Next" button.

Click on the "Create policy" button to add the role policy.

This role policy enables Lambda to access a cross-account S3 bucket, which requires specifying the S3 bucket name and KMS key.

JSON
 
{
  "Version": "2012-10-17",
  "Statement": [
      {
          "Effect": "Allow",
          "Action": [
              "s3:GetObject",
              "s3:PutObject",
              "s3:PutObjectAcl",
              "s3:List*"
          ],
          "Resource": [
              "arn:aws:s3:::demo-s3-cross-account-bucket",
              "arn:aws:s3:::demo-s3-cross-account-bucket/*"
          ]
      },
      {
          "Effect": "Allow",
          "Action": [
              "kms:Encrypt",
              "kms:Decrypt",
              "kms:DescribeKey",
              "kms:GenerateDataKey*"
          ],
          "Resource": [
              "arn:aws:kms:::key/demo-s3-cross-account-kms-key"
          ]
      }
  ]
}


Add the above policy on the JSON tab and then click on the "Next" button.

Please provide a policy name and click the "Create policy" button.

Once the role policy is created successfully, map it to the role: select the policy and click the "Next" button.

Provide the role name and click the "Create role" button.

The role is created successfully; we will use it for the Lambda function in the next step.
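
If you want to script the role creation instead of using the console, the sketch below is one way to do it with boto3. It uses an inline policy for brevity (the console flow above creates a customer managed policy); the role and policy names are placeholders matching the ARNs used earlier.

Python

import json
import boto3

iam = boto3.client("iam")

# Trust policy so that the Lambda service can assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# The same permissions as the role policy JSON shown above.
role_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl", "s3:List*"],
         "Resource": ["arn:aws:s3:::demo-s3-cross-account-bucket",
                      "arn:aws:s3:::demo-s3-cross-account-bucket/*"]},
        {"Effect": "Allow",
         "Action": ["kms:Encrypt", "kms:Decrypt", "kms:DescribeKey", "kms:GenerateDataKey*"],
         "Resource": ["arn:aws:kms:::key/demo-s3-cross-account-kms-key"]},
    ],
}

iam.create_role(
    RoleName="lambda-s3-access-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.put_role_policy(
    RoleName="lambda-s3-access-role",
    PolicyName="lambda-s3-cross-account-policy",  # placeholder name
    PolicyDocument=json.dumps(role_policy),
)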

Create Lambda Function

Now, we'll start creating a Lambda function in AWS Account B. Please log in to the AWS Account B console and go to the Lambda dashboard.

Lambda Dashboard

Python
 
import logging
import boto3
from botocore.exceptions import ClientError
import os

def lambda_handler(event, context):
    # Upload the file named in the event to the cross-account bucket
    return upload_file(event['file_name'], event['bucket'], event['object_name'])

def upload_file(file_name, bucket, object_name=None):
    """Upload a file to an S3 bucket

    :param file_name: File to upload
    :param bucket: Bucket to upload to
    :param object_name: S3 object name. If not specified then file_name is used
    :return: True if file was uploaded, else False
    """

    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = os.path.basename(file_name)

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
    except ClientError as e:
        logging.error(e)
        return False
    return True


Create a Lambda function with the Python 3.10 runtime, select the previously created role, add the Python code above, and create the function.

Create Function
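
As an alternative to the console, the sketch below creates the function with boto3. It assumes the handler code above is saved as lambda_function.py and zipped into function.zip, and that the role is the one created earlier (the function name and account ID are placeholders).

Python

import boto3

lambda_client = boto3.client("lambda")

with open("function.zip", "rb") as f:
    lambda_client.create_function(
        FunctionName="demo-s3-cross-account-upload",  # placeholder name
        Runtime="python3.10",
        Role="arn:aws:iam::<ACCOUNT_B_ID>:role/lambda-s3-access-role",
        Handler="lambda_function.lambda_handler",
        Code={"ZipFile": f.read()},
    )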

Test Lambda Function and Validate

Finally, let's test and validate that the setup we provisioned works properly. To execute the Lambda function, provide the necessary input in the event JSON, as shown in the illustration below, and click on the "Test" button.

Test Lambda Function and Validate
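
The keys in the test event must match what the handler above reads. A sample event might look like the following; the values are placeholders, and file_name must point to a file that actually exists in the Lambda runtime (for example, a file bundled with the deployment package or written to /tmp).

JSON

{
  "file_name": "/tmp/sample.txt",
  "bucket": "demo-s3-cross-account-bucket",
  "object_name": "sample.txt"
}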

After successful execution of the Lambda function in AWS account B, verify the S3 bucket in AWS account A. You will see the uploaded file as illustrated below.

demo-s3-cross-account-bucket
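
You can also verify the upload programmatically. The following is a minimal sketch that lists the bucket contents, assuming it runs with credentials that can read the bucket (for example, in Account A, or under the Lambda role in Account B).

Python

import boto3

s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket="demo-s3-cross-account-bucket")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])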

Conclusion

In conclusion, this article has provided a detailed, step-by-step guide to setting up Lambda functions for cross-account access to AWS S3: an S3 bucket, bucket policy, and KMS key policy in Account A, and an IAM role, policy, and Lambda function in Account B. With this information, AWS developers can understand the requirements for cross-account S3 access through Lambda, confidently complete the configuration, and take advantage of the benefits this integration offers.


Opinions expressed by DZone contributors are their own.
