How to Upload and Serve Data Using Amazon CloudFront and Amazon S3 in Node.js


In this post, we'll learn how to upload files to an S3 bucket and serve those files through CloudFront in Node.js.


Most applications today serve users across the globe and need a way to deliver their content fast. The content can be images, videos, PDFs, and so on. A content delivery network (CDN) is a network of geographically distributed servers built to serve content to users as quickly as possible.

Amazon CloudFront is a content delivery network service that delivers content quickly and securely. In this article, I will describe how to upload files to an S3 bucket and serve those files through CloudFront in Node.js. Throughout the article, CloudFront will use the S3 bucket as its origin.


First, create an S3 bucket and a CloudFront distribution in AWS; I will not go into detail about those steps in this article. Then navigate to IAM and open 'Security Credentials' for your user. If you don't already have an access key, create one and download the CSV file. We will need this access key later.


Then, click on 'My Security Credentials' under your account menu.


Under 'CloudFront key pairs', create a key pair, download the private key, and note down the access key ID. We will need both later for the integration.

Creating a Node.js Application

Let's create a simple Node.js Express server and add two REST API endpoints for file upload and download. Here is the sample project structure.

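The project-structure screenshot does not survive in this text version. Based on the files referenced later in the article, the layout is roughly as follows (the files/ directory comes from the multer configuration; package.json is an assumption):

```
.
├── app.ts             // Express server, multer setup, route registration
├── fileController.ts  // upload/download request handlers
├── fileComponent.ts   // S3 upload and CloudFront signed-URL logic
├── files/             // temporary storage for multer uploads
├── package.json
└── tsconfig.json
```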

I am using the typescript and ts-node-dev npm modules in this sample project. Therefore, the project contains a tsconfig.json file.
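The tsconfig.json itself is not reproduced in the article. As a rough sketch (my assumption, not the article's actual file), a minimal configuration for this setup might look like this; esModuleInterop is needed for default-style imports such as import express from 'express':

```json
{
  "compilerOptions": {
    "target": "es2017",
    "module": "commonjs",
    "esModuleInterop": true,
    "strict": false,
    "outDir": "dist"
  },
  "include": ["./*.ts"]
}
```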

Here is the entire app.ts file. It contains the logic to initialize an Express server and register the REST endpoints. I am also using the multer npm module to handle multi-part file uploads.

import express from 'express';
import * as fileCtrl from './fileController';
import multer from 'multer';
import crypto from 'crypto';
import path from 'path';

const app = express();
const port = 3000;

// Multer storage engine for handling multi-part file uploads: files are
// written to ./files under a random hex name with the original extension.
var storage = multer.diskStorage({
  destination: './files',
  filename: function (req, file, cb) {
    crypto.pseudoRandomBytes(16, function (err, raw) {
      if (err) return cb(err);
      cb(null, raw.toString('hex') + path.extname(file.originalname));
    });
  }
});

app.use(multer({ storage: storage }).single('file'));

app.get('/api/download', asyncHandler(fileCtrl.download));
app.post('/api/upload', asyncHandler(fileCtrl.upload));

app.listen(port, () => {
  console.log('Server listening on port %s.', port);
});

// Wraps an async route handler so that a rejected promise is forwarded to
// Express' error-handling middleware instead of being silently swallowed.
export function asyncHandler(handler) {
  return function (req, res, next) {
    if (!handler) {
      next(new Error(`Invalid handler ${handler}, it must be a function.`));
    } else {
      handler(req, res, next).catch(next);
    }
  };
}

Uploading Files to an Amazon S3 Bucket

Let us look at how to upload files to an S3 bucket. We will need to install the aws-sdk npm module to access S3 from a Node.js application.

Once we have it installed, the handler for the upload endpoint is defined as follows:

export async function upload(req, res) {
  let response = await uploadFile(req.file.originalname, req.file.path);
  res.send(response);
}

In fileComponent.ts, we need to import the aws-sdk module (along with fs, which we will use below to read the uploaded file from disk):

import awsSDK from 'aws-sdk';
import fs from 'fs';

At the beginning of this article, we downloaded a CSV file containing an access key ID and secret access key. We will use them to upload files to the S3 bucket. Using the aws-sdk module, we configure the access key ID and secret access key as follows:

export function uploadFile(filename, fileDirectoryPath) {
  awsSDK.config.update({
    accessKeyId: process.env.S3_ACCESS_KEY_ID,
    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY
  });
  const s3 = new awsSDK.S3();

  return new Promise(function (resolve, reject) {
    // Read the file multer stored on disk, then push it to S3.
    fs.readFile(fileDirectoryPath.toString(), function (err, data) {
      if (err) { return reject(err); }
      s3.putObject({
        Bucket: '' + process.env.S3_BUCKET_NAME,
        Key: filename,
        Body: data,
        ACL: 'public-read'
      }, function (err, data) {
        if (err) return reject(err);
        resolve('successfully uploaded');
      });
    });
  });
}

Using the putObject() method, we upload the file to the S3 bucket, passing the name of the destination bucket as the Bucket parameter. Depending on your requirements, you can pass additional parameters to putObject(). In this example, I have set the canned ACL to public-read, which makes the uploaded object publicly readable. Make sure this is compatible with your S3 bucket policies.

Note that I have used environment variables for sensitive information, such as access key IDs and secret keys. If the variables are named as AWS recommends (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY), they don't need to be explicitly passed to the S3 client as above; the SDK detects them automatically.
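Since several pieces of configuration come from the environment, it can help to fail fast when a variable is missing instead of sending undefined values to AWS. The helper below is my own illustration, not part of the article's project:

```typescript
// Hypothetical helper (not from the article): read a required environment
// variable, or throw a descriptive error if it is not set.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Demonstration only: set a value, then read it back.
process.env.S3_BUCKET_NAME = 'my-example-bucket';
console.log(requireEnv('S3_BUCKET_NAME')); // → my-example-bucket
```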

Now, we can start the server and test our POST endpoint. Here is an example from Postman.


Once the request is successful, we can see the file in an S3 bucket.


Serving Files via Amazon CloudFront

Earlier, we downloaded a private key from the CloudFront key pairs section. We will use that private key and its access key ID to access CloudFront from Node.js.

The handler for the download API endpoint is as follows.

export async function download(req, res) {
  let response = await getFileLink(req.query.filename);
  res.send(response);
}

In this handler, we expect the name of the file that is to be downloaded via CloudFront.

Let us look at how to access CloudFront from our Node.js app. First, we will install the aws-cloudfront-sign npm module. Using this module, we can generate signed Amazon CloudFront URLs, which let us grant users access to our private content. A signed URL also carries additional metadata, such as an expiration time, giving us more control over access to our content.
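To see what such a signed URL looks like, here is a fabricated example (the query parameter values are made up; real signatures are long encoded strings), inspected with Node's built-in URL class:

```typescript
// A fabricated CloudFront signed URL, for illustration only.
const signedUrl = 'https://XYZ.cloudfront.net/photo.png'
  + '?Expires=1700000000'
  + '&Signature=abc123'
  + '&Key-Pair-Id=APKAEXAMPLE';

// The policy travels in the query string, so it can be inspected like any URL.
const parsed = new URL(signedUrl);
console.log(parsed.searchParams.get('Expires'));     // → 1700000000
console.log(parsed.searchParams.get('Key-Pair-Id')); // → APKAEXAMPLE
```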

export function getFileLink(filename) {
  return new Promise(function (resolve, reject) {
    var options = { keypairId: process.env.CLOUDFRONT_ACCESS_KEY_ID, privateKeyPath: process.env.CLOUDFRONT_PRIVATE_KEY_PATH };
    // Generate a signed URL for the requested file on the distribution.
    var signedUrl = awsCloudFront.getSignedUrl(process.env.CLOUDFRONT_URL + filename, options);
    resolve(signedUrl);
  });
}

Here, we pass the access key ID, the path to the private key file, and the full CloudFront URL of the file to getSignedUrl(). The CloudFront URL looks something like https://XYZ.cloudfront.net, where XYZ is your CloudFront distribution's domain prefix.
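One small detail worth handling when concatenating CLOUDFRONT_URL and the file name is the slash between them. A hypothetical helper (not from the article) that tolerates stray slashes on either side:

```typescript
// Hypothetical helper: join a distribution URL and an object key, stripping
// trailing slashes from the base and leading slashes from the file name.
function buildCloudFrontUrl(base: string, filename: string): string {
  return `${base.replace(/\/+$/, '')}/${filename.replace(/^\/+/, '')}`;
}

console.log(buildCloudFrontUrl('https://XYZ.cloudfront.net/', '/photo.png'));
// → https://XYZ.cloudfront.net/photo.png
```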

Start the server and test the GET endpoint from Postman in the same way; the response contains the signed CloudFront URL for the requested file.


In this article, we saw how to upload files to Amazon S3 and serve those files via Amazon CloudFront. I hope you enjoyed this article. Let me know if you have any comments or suggestions in the comments section below.

The example code for this article can be found in a GitHub repository.


Published at DZone with permission of Swathi Prasad, DZone MVB. See the original article here.
