Unleashing the Power of AWS: A Journey with S3: Using AWS S3 in Node.js

Abhishek Mishra
3 min read · Dec 31, 2023


In the evolution of my MERN stack project, MongoDB served as a reliable database, but a specific challenge arose concerning image storage. This is where Amazon Web Services (AWS) S3 proved to be a game-changer.

Motivated to enhance image storage, I found AWS S3 to be a seamless solution. Every object uploaded to S3 is addressable by a unique URL, which eliminates the need to store image data in MongoDB itself. Storing just the S3 URLs in the database significantly improved efficiency.

The impact on my project was transformative. AWS S3 not only solved the immediate image storage challenge but also improved overall performance. Now the web app simply references the corresponding S3 URL in an HTML image tag to fetch and display each image.

First, create a Node.js project for the backend using the command below:

npm init 

Create a .env file for your environment variables, like this:

PORT=
accessKeyId=
secretAccessKey=
bucket_name=

Now install all dependencies:

npm install express dotenv express-fileupload aws-sdk

I’m using express to create the server, dotenv to manage environment variables, express-fileupload to handle file uploads to the backend, and of course aws-sdk to store files in AWS S3. Note that aws-sdk is v2 of the AWS SDK for JavaScript, which is now in maintenance mode; new projects may prefer the modular v3 packages such as @aws-sdk/client-s3.

Now create a folder named ‘S3’ to hold a reusable upload function that we can use anywhere in the project.

Inside it, create a file index.js with the following code:

var AWS = require("aws-sdk"); // import the AWS SDK (v2)
require("dotenv").config(); // load credentials from .env before reading process.env

const accessKeyId = process.env.accessKeyId; // access key from .env
const secretAccessKey = process.env.secretAccessKey; // secret key from .env

const credentials = new AWS.Credentials({ accessKeyId, secretAccessKey });
AWS.config.update({
  region: "YOUR_REGION", // e.g., us-east-1
  credentials,
});

module.exports = new AWS.S3(); // export a configured S3 client

Create one more file, uploads3.js, with the following code:

const s3 = require("./index"); // import the configured S3 client

async function uploadToS3(files) {
  if (!Array.isArray(files)) {
    files = [files]; // normalize a single file into an array
  }

  if (files.length > 0) { // only build upload requests when files were passed
    const fileUploadRequests = [];
    for (const file of files) {
      if (file) {
        fileUploadRequests.push(
          s3
            .upload({
              Bucket: process.env.bucket_name, // target S3 bucket
              Key: file.name, // object key; consider adding a unique prefix to avoid overwrites
              Body: file.data, // file buffer from express-fileupload
            })
            .promise()
        );
      }
    }
    const s3Objects = await Promise.all(fileUploadRequests); // wait for every upload to finish
    return s3Objects.map((obj) => obj.Location); // Location is the uploaded object's URL
  }
  return []; // no files were provided
}

module.exports = { uploadToS3 }; // export uploadToS3 for use anywhere in the project

That completes the AWS S3 setup. Import uploadToS3, a function that takes a file (or an array of files) and returns an array of unique image URLs. We can store these URLs in the database, making the images accessible on the frontend.
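To see the data flow without real AWS credentials, here is a self-contained sketch of the same logic with s3.upload stubbed out. The bucket name, file name, and the stub itself are illustrative; the real function uses the aws-sdk client from index.js, whose .promise() resolves with an object carrying the uploaded file's Location:

```javascript
// Stand-in for the aws-sdk v2 S3 client so the flow runs locally.
// Bucket name and URL shape here are illustrative, not real AWS resources.
const stubS3 = {
  upload({ Bucket, Key }) {
    return {
      // aws-sdk v2's upload().promise() resolves with { Location, ... }
      promise: async () => ({
        Location: `https://${Bucket}.s3.amazonaws.com/${encodeURIComponent(Key)}`,
      }),
    };
  },
};

// Same logic as uploads3.js, with the S3 client injectable for testing
async function uploadToS3(files, s3 = stubS3) {
  if (!Array.isArray(files)) files = [files]; // normalize single file to array
  const requests = files
    .filter(Boolean) // skip empty entries
    .map((file) =>
      s3.upload({ Bucket: "my-bucket", Key: file.name, Body: file.data }).promise()
    );
  const results = await Promise.all(requests); // resolve all uploads
  return results.map((obj) => obj.Location); // collect the image URLs
}

// Example call with a "file" shaped like an express-fileupload req.files entry
uploadToS3({ name: "cat.png", data: Buffer.from("fake-bytes") }).then((urls) => {
  console.log(urls); // [ 'https://my-bucket.s3.amazonaws.com/cat.png' ]
});
```

Swapping the stub for the real client from index.js gives exactly the behavior described above: one URL per uploaded file, ready to store in MongoDB.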

Happy Coding …
