Uploading Files to AWS S3 with NodeJS

Introduction

Amazon S3 (Simple Storage Service) is an object storage service offered by AWS. It's designed to be highly scalable, available, and secure, making it a popular choice for a wide range of use cases, from hosting static websites to storing backups and big data.

Basic Concepts

  • Object
    • A file stored in S3.
    • Can carry metadata that describes the file.
  • Bucket
    • The container where objects are stored.
    • You can create one or many buckets in any region that Amazon supports.
    • The bucket name must be globally unique.
    • Permissions can be configured on the bucket to control who can access and modify the files inside.
    • Amazon S3 stores data as objects in buckets.

AWS CLI

First, follow this link and install the AWS CLI for your operating system:

https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html


Use the following command to check the version after successfully installing the AWS CLI:

aws --version


Next, use the following command to configure AWS information:

aws configure

  • You will be prompted for your AWS access key ID and AWS secret access key; enter them correctly to be able to call AWS services.
  • What you can access depends on the permissions attached to that key, so make sure the key you enter is allowed to use the AWS S3 service before proceeding with the next steps.
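
On success, aws configure writes the values to two plain-text files in your home directory. A typical result looks like this (values shortened here):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIA...
aws_secret_access_key = wJal...

# ~/.aws/config
[default]
region = ap-southeast-1
output = json
```

You can edit these files directly instead of re-running aws configure.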


Use the following command to create a bucket:

aws s3api create-bucket --bucket {bucket name} --region {region}

# ex
aws s3api create-bucket --bucket unique-bucket-name --region us-west-2

  • Note that the bucket name must be globally unique.
  • For the region, choose the location closest to you for fast and stable access.
  • Note that for any region other than us-east-1, s3api also requires --create-bucket-configuration LocationConstraint={region}; otherwise the call fails with an IllegalLocationConstraintException.


Some other commands when working with S3

List objects in a bucket:

aws s3 ls s3://{bucket name}

# ex
aws s3 ls s3://unique-bucket-name
                           PRE folder/
2025-09-05 13:43:01         67 example-key
2025-09-05 14:11:35     278763 example.jpg

  • Here, PRE marks a prefix (a "folder"); the remaining lines show the last-modified time, size in bytes, and key of each object at the top level of the bucket.


To list every object in the bucket, including those under prefixes, add the --recursive flag:

aws s3 ls s3://unique-bucket-name --recursive
2025-09-05 13:43:01         67 example-key
2025-08-28 11:20:49        637 folder/in/s3/uploaded_file.txt
2025-09-05 14:11:35     278763 example.jpg


Delete a file from a bucket:

aws s3 rm s3://{bucket name}/{path to file}

# ex
aws s3 rm s3://unique-bucket-name/text.txt
delete: s3://unique-bucket-name/text.txt


Using with NodeJS

To integrate AWS S3 with NodeJS, you first need to install the @aws-sdk/client-s3 package, along with @aws-sdk/lib-storage for the Upload helper used below:

yarn add @aws-sdk/client-s3 @aws-sdk/lib-storage


Below is the code to upload a file to AWS S3:

import {S3Client, ListObjectsV2Command} from '@aws-sdk/client-s3'
import {Upload} from '@aws-sdk/lib-storage'
import * as fs from 'fs'

const bucketName = 'unique-bucket-name'
const client = new S3Client({region: 'ap-southeast-1'})

const upload = async () => {
  const s3Key = 'folder/in/s3/uploaded_file.txt'
  const fileStream = fs.createReadStream('path/to/file/text.txt')
  try {
    const uploader = new Upload({
      client,
      params: {
        Bucket: bucketName,
        Key: s3Key,
        Body: fileStream,
        // ACL: 'public-read', // the bucket must allow public ACLs for this option to work
      },
      tags: [
        {
          Key: 'project',
          Value: 'my-app',
        },
      ],
    })
    uploader.on('httpUploadProgress', progress => {
      console.log(progress)
    })
    const res = await uploader.done()
    console.log('Upload Success!', res.Location)
  } catch (err) {
    console.error('Upload Error:', err)
  }
}
upload()


The result is as follows:

{
  loaded: 748,
  total: 748,
  part: 1,
  Key: 'folder/in/s3/uploaded_file.txt',
  Bucket: 'unique-bucket-name'
}
Upload Success! https://unique-bucket-name.s3.ap-southeast-1.amazonaws.com/folder/in/s3/uploaded_file.txt


To list the "folders" in a bucket, use ListObjectsV2Command with a Delimiter:

const getFolders = async () => {
  try {
    const params = {
      Bucket: bucketName,
      Delimiter: '/',
    }

    const command = new ListObjectsV2Command(params)
    const data = await client.send(command)

    if (data.CommonPrefixes) {
      console.log('Folders in S3 bucket:')
      data.CommonPrefixes.forEach(folder => {
        console.log(`- ${folder.Prefix}`)
      })
    } else {
      console.log('No folders found.')
    }

    if (data.Contents) {
      console.log('\nFiles in the root of the S3 bucket:')
      data.Contents.forEach(file => {
        console.log(`- ${file.Key}`)
      })
    }
  } catch (err) {
    console.error('Error listing folders:', err)
  }
}
getFolders()


Result:

Folders in S3 bucket:
- folder

Files in the root of the S3 bucket:
- example-key
- example.jpg

  • Note that S3 has no real concept of folders: a bucket stores objects, and each object has a key in the form path/inside/abc.txt. The PRE entries (prefixes such as path/inside/) can be thought of as folders for easier understanding.
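
To make the prefix idea concrete, here is a small hypothetical helper (not part of the AWS SDK) that derives the "folders" from a list of object keys, mimicking what S3 returns in CommonPrefixes when you pass Delimiter: '/':

```javascript
// Collect the distinct top-level prefixes ("folders") from a list of keys,
// the way S3 groups them under CommonPrefixes when Delimiter is '/'
const commonPrefixes = (keys, delimiter = '/') => {
  const prefixes = new Set()
  for (const key of keys) {
    const idx = key.indexOf(delimiter)
    if (idx !== -1) prefixes.add(key.slice(0, idx + 1))
  }
  return [...prefixes]
}

const keys = ['example-key', 'example.jpg', 'folder/in/s3/uploaded_file.txt']
console.log(commonPrefixes(keys)) // [ 'folder/' ]
```

Keys without the delimiter ('example-key', 'example.jpg') are what S3 reports in Contents, matching the output above.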


To list the files under a given prefix in a bucket:

const getFiles = async () => {
  try {
    const params = {
      Bucket: bucketName,
      Prefix: 'folder/in/s3/', // specify a particular prefix to retrieve the files within it
    }
    const command = new ListObjectsV2Command(params)
    const data = await client.send(command)

    if (data.Contents) {
      console.log('Files in S3 bucket:')
      data.Contents.forEach(file => {
        console.log(`- ${file.Key}`)
        console.log(`  Size: ${file.Size} bytes`)
        console.log(`  Last Modified: ${file.LastModified}`)
      })
    } else {
      console.log('No files found in the specified bucket or folder.')
    }
  } catch (err) {
    console.error('Error listing files:', err)
  }
}
getFiles()


Result:

Files in S3 bucket:
- folder/in/s3/uploaded_file.txt
  Size: 748 bytes
  Last Modified: 2025-08-28 11:20:49
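
One caveat: ListObjectsV2 returns at most 1,000 keys per call. When the response has IsTruncated set, you must pass NextContinuationToken back as ContinuationToken to fetch the next page. The sketch below shows that loop; the client here is a stub so it runs without AWS credentials, but with the real SDK you would call client.send(new ListObjectsV2Command(params)) instead:

```javascript
// Page through all keys in a bucket using the continuation-token loop.
// `client` only needs a send() method, so a stub stands in for S3Client here.
const listAllKeys = async (client, bucket) => {
  const keys = []
  let token
  do {
    const data = await client.send({Bucket: bucket, ContinuationToken: token})
    for (const obj of data.Contents ?? []) keys.push(obj.Key)
    token = data.IsTruncated ? data.NextContinuationToken : undefined
  } while (token)
  return keys
}

// Stubbed client that returns two pages of results
const pages = [
  {Contents: [{Key: 'a.txt'}, {Key: 'b.txt'}], IsTruncated: true, NextContinuationToken: 't1'},
  {Contents: [{Key: 'c.txt'}], IsTruncated: false},
]
let call = 0
const stubClient = {send: async () => pages[call++]}

listAllKeys(stubClient, 'unique-bucket-name').then(keys => console.log(keys))
// [ 'a.txt', 'b.txt', 'c.txt' ]
```

The same loop works for getFolders and getFiles above; only the params change.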
