Upload an Entire S3 Bucket/Local Directory to Another S3 Bucket From Node.js
Check out this convenient method for working with S3 buckets.
I have come across several requests wherein a team needs to upload an entire local folder or source S3 bucket (which may contain subfolders) to another S3 bucket (the destination).
We are all aware of the nice little AWS CLI commands that complete the above task:
Bucket-to-bucket sync:
aws s3 sync s3://srcBucket s3://destBucket
Local folder to S3 bucket sync:
aws s3 sync . s3://destBucket
The problem arises when we want to achieve the same result from a script rather than the command line: none of the SDKs provided by AWS offers an equivalent sync operation.
Some suggest using AWS Data Pipeline, though I believe this is unnecessary when there is no transformation requirement.
So, how do we go about this?
There is a Node package for this: aws-cli-js. You can read more about it on npm.
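The package can be installed from npm; note that aws-cli-js is a wrapper around the AWS CLI, so the CLI itself still needs to be installed and available on the PATH of the machine running the script:

npm install aws-cli-js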
Use the following code:
var awsCli = require('aws-cli-js');
var Options = awsCli.Options;
var Aws = awsCli.Aws;

// CRED is assumed to hold your AWS credentials (e.g. loaded from a
// config file or environment variables).
var options = new Options(
  /* accessKey */ CRED.accesskey,
  /* secretKey */ CRED.secretkey
);

var aws = new Aws(options);

// Run the same sync command we would run on the command line.
aws.command('s3 sync s3://srcBkt s3://destBkt', function (err, data) {
  if (err) {
    console.error('sync failed: ', err);
    return;
  }
  console.log('data = ', JSON.stringify(data.raw));
});
Note: Make sure to pass in the right credentials. We can use the callback to process the results as well. I hope this helps. Let me know your thoughts and anything that can be improved!
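The same approach covers the local-folder case too. Here is a minimal sketch, assuming the same options object from above and a hypothetical local ./dist directory to upload; when no callback is passed, aws-cli-js returns a promise, which avoids nesting callbacks:

var aws = new Aws(options);

// Sync a hypothetical local ./dist directory to the destination bucket.
// Without a callback, command() returns a promise.
aws.command('s3 sync ./dist s3://destBkt')
  .then(function (data) {
    console.log('sync output: ', data.raw);
  })
  .catch(function (err) {
    console.error('sync failed: ', err);
  });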