How to Upload and Serve Data Using Amazon CloudFront and Amazon S3 in Node.js

Upload and serve data faster and more efficiently.

By Swathi Prasad · Jul. 15, 19 · Tutorial

Most applications today serve users across the globe and need a way to deliver their content fast. To accomplish this, developers often rely on a Content Delivery Network (CDN): a network of geographically distributed servers designed to serve content to users as quickly as possible.

Amazon CloudFront is one such CDN. In this article, I will describe how to upload files to an S3 bucket and serve those files through CloudFront in Node.js.

Prerequisites

Create a bucket in S3 and a CloudFront distribution in AWS. Navigate to IAM, open Security Credentials for your user, create an access key, and download the CSV file. We will need this access key later.

Then, click on My Security Credentials in My Account.

(Screenshot: the My Security Credentials page)

Under CloudFront key pairs, create a key pair and download the private key. Make sure to keep track of your access key ID; we will need it later for integration.
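
The code in this article reads all of its AWS settings from environment variables. The variable names below are exactly the ones the code references; the values shown are placeholders to replace with your own (set as shell exports here, though a .env file would work just as well):

export S3_ACCESS_KEY_ID=AKIA...
export S3_SECRET_ACCESS_KEY=...
export S3_BUCKET_NAME=my-example-bucket
export CLOUDFRONT_ACCESS_KEY_ID=APKA...
export CLOUDFRONT_PRIVATE_KEY_PATH=./cloudfront-private-key.pem
export CLOUDFRONT_URL=https://XYZ.cloudfront.net/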

Creating the Node.js Application

Let’s create a simple Node.js Express server and add two REST API endpoints: one for file upload and one for download. Here is the sample project structure, sketched below.
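
Roughly, the layout looks like this (the file and folder names follow from the code shown in this article):

.
├── app.ts            // Express server and route setup
├── fileController.ts // upload and download handlers
├── files/            // temporary storage for multer uploads
├── package.json
└── tsconfig.json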

I am using the typescript and ts-node-dev npm modules in this sample project, so the project also contains a tsconfig.json.
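
A minimal tsconfig.json for this setup might look something like this (esModuleInterop enables the default-style imports used below):

{
  "compilerOptions": {
    "target": "es2017",
    "module": "commonjs",
    "esModuleInterop": true,
    "outDir": "dist"
  },
  "include": ["*.ts"]
}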

Here is the entire app.ts file. It contains the logic to initialize the Express server and the REST endpoints. I am also using the multer npm module to handle multipart file uploads.

import express from 'express';
import * as fileCtrl from './fileController';
import multer from 'multer';
import crypto from 'crypto';
import path from 'path';

const app = express();
const port = 3000;

// Multer middleware for handling multipart file uploads. Each uploaded file
// is written to ./files under a random hex name with its original extension.
const storage = multer.diskStorage({
  destination: './files',
  filename: function (req, file, cb) {
    // crypto.pseudoRandomBytes is deprecated; randomBytes does the same job here.
    crypto.randomBytes(16, function (err, raw) {
      if (err) return cb(err, '');
      cb(null, raw.toString('hex') + path.extname(file.originalname));
    });
  }
});

app.use(multer({ storage: storage }).single('file'));

app.get('/api/download', asyncHandler(fileCtrl.download));
app.post('/api/upload', asyncHandler(fileCtrl.upload));

app.listen(port, () => {
  console.log('Server listening on port %s.', port);
});

// Wraps an async route handler so that a rejected promise is forwarded to
// Express's error-handling middleware instead of being silently swallowed.
export function asyncHandler(handler) {
  return function (req, res, next) {
    if (!handler) {
      next(new Error(`Invalid handler ${handler}, it must be a function.`));
    } else {
      handler(req, res, next).catch(next);
    }
  };
}
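
During development, the server can be started with ts-node-dev (assuming the module is installed in the project, so npx can resolve it):

npx ts-node-dev app.ts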


Uploading Files to an Amazon S3 Bucket

Let us look at how to upload files to the S3 bucket. We will need to install the aws-sdk node module to access S3 from our Node.js application.
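
Install it from npm:

npm install aws-sdk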

With the module installed, the handler for the upload endpoint is defined as follows:

// Multer has already written the uploaded file to disk; hand its original
// name and temporary path to the S3 upload logic.
export async function upload(req, res) {
  let response = await uploadFile(req.file.originalname, req.file.path);
  res.send(response); // send() also ends the response, so no res.end() is needed
}


In fileController.ts, we need to import the aws-sdk module (along with fs, which we will use to read the uploaded file from disk) as follows.

import awsSDK from 'aws-sdk';
import fs from 'fs';


At the beginning of this article, we downloaded a CSV file containing an access key ID and a secret access key. We will use them to upload files to the S3 bucket. Using the aws-sdk module, we configure the access key ID and secret access key as follows:

export function uploadFile(filename, fileDirectoryPath) {
  awsSDK.config.update({
    accessKeyId: process.env.S3_ACCESS_KEY_ID,
    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY
  });
  const s3 = new awsSDK.S3();

  return new Promise(function (resolve, reject) {
    // Read the temporary file multer stored on disk, then push its contents to S3.
    fs.readFile(fileDirectoryPath.toString(), function (err, data) {
      if (err) return reject(err);
      s3.putObject({
        Bucket: '' + process.env.S3_BUCKET_NAME,
        Key: filename,
        Body: data,
        ACL: 'public-read'
      }, function (err, data) {
        if (err) return reject(err);
        resolve('successfully uploaded');
      });
    });
  });
}


Using the putObject() method, we upload files to the S3 bucket. In putObject(), we need to pass the name of the bucket we are uploading to. Note that, depending on your bucket policies, you can pass additional parameters to putObject(). In this example, I have set the canned ACL policy to public-read.
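
As a sketch of what else can be passed, ContentType and Metadata are two commonly used putObject parameters (they are not part of the original example, and the values here are illustrative):

s3.putObject({
  Bucket: '' + process.env.S3_BUCKET_NAME,
  Key: filename,
  Body: data,
  ACL: 'public-read',
  ContentType: 'image/png',            // lets browsers render the object directly
  Metadata: { 'uploaded-by': 'demo' }  // arbitrary user-defined key-value metadata
}, function (err, data) { /* ... */ });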

Now, we can start the server and test our POST endpoint. Here is an example from Postman.


(Screenshot: the POST /api/upload request in Postman)
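
If you prefer the command line, an equivalent request can be made with curl; the field name file is what the multer middleware expects, and the file path is just an example:

curl -F "file=@./test.png" http://localhost:3000/api/upload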


Once the request is successful, we can see the file in the S3 bucket.


(Screenshot: the uploaded file in the S3 bucket)


Serving Files via Amazon CloudFront

Earlier, we downloaded the private key from CloudFront key pairs. We will use that private key and the access key ID to access CloudFront in Node.js.

The handler for the download API endpoint is as follows.

// The name of the file to serve is passed as a query parameter.
export async function download(req, res) {
  let response = await getFileLink(req.query.filename);
  res.send(response);
}


This handler expects the name of the file that is to be downloaded via CloudFront.

Let us look at how to access CloudFront in our Node.js app. First, we will install the aws-cloudfront-sign npm module (npm install aws-cloudfront-sign). Using this module, we can generate signed Amazon CloudFront URLs, which let us give users access to our private content. Signed URLs also carry additional metadata, such as an expiration time, which gives us more control over access to our content.

import awsCloudFront from 'aws-cloudfront-sign';

export function getFileLink(filename) {
  return new Promise(function (resolve, reject) {
    // keypairId is the CloudFront access key ID; privateKeyPath points to the
    // private key file downloaded from the CloudFront key pairs page.
    var options = {
      keypairId: process.env.CLOUDFRONT_ACCESS_KEY_ID,
      privateKeyPath: process.env.CLOUDFRONT_PRIVATE_KEY_PATH
    };
    var signedUrl = awsCloudFront.getSignedUrl(process.env.CLOUDFRONT_URL + filename, options);
    resolve(signedUrl);
  });
}


Here, we need to pass the access key ID, the path to the private key file, and the CloudFront URL to getSignedUrl(). The CloudFront URL should look something like this: https://XYZ.cloudfront.net. Since the file name is appended directly to it in the code above, make sure the configured URL ends with a trailing slash.
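
The module also accepts an expireTime option controlling how long the signed URL stays valid (by default it expires after 30 minutes); the value below is just an example:

var options = {
  keypairId: process.env.CLOUDFRONT_ACCESS_KEY_ID,
  privateKeyPath: process.env.CLOUDFRONT_PRIVATE_KEY_PATH,
  expireTime: Date.now() + 30 * 60 * 1000 // this URL stops working after 30 minutes
};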

Start the server and test the GET endpoint as follows:


(Screenshot: the GET /api/download request in Postman)
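
Again, an equivalent command-line request might look like this, where the filename value matches an object uploaded earlier:

curl "http://localhost:3000/api/download?filename=test.png"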


Conclusion

In this article, we saw how to upload files to Amazon S3 and serve those files via Amazon CloudFront. I hope you enjoyed this article. Let me know if you have any comments or suggestions.

The example for this article can be found in this GitHub repository.
