
How to use AWS S3 Bucket with NodeJS Application?

What is an AWS S3 bucket?

Think of it as Google Drive with an API that lets you upload and download files programmatically.

Most websites need somewhere to host images, videos, and other media. One obvious option is to save them to your own hard disk. That works at first, but what happens when the storage you need outgrows the drive's capacity? You would have to scale the storage yourself, which is a time-consuming process.

This is where a storage service like AWS S3 comes in: it can store a large number of media files and scale with demand.

Amazon Simple Storage Service (S3) is an online storage service designed to make web-scale computing more accessible to developers, with a straightforward web services interface.

An Amazon S3 bucket is a public cloud storage resource available in AWS's Simple Storage Service (S3), an object storage offering. Buckets are comparable to file folders: they store objects, which consist of data and descriptive metadata.


Implementing S3 in a Node.js Application

Step 1: Get your credential keys

If you don’t already have an AWS account, create one. Log in to your Amazon Web Services account.

You’ll find your Access Key Id and Secret Access Key under “My security credentials.”

These keys will be used later.
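Hardcoding these keys in source code is risky, since they can leak through version control. A common alternative is to read them from environment variables. Here is a minimal sketch: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are the variable names the AWS SDK itself checks by default, while the helper name loadAwsCredentials is hypothetical.

```javascript
// Read AWS keys from environment variables instead of hardcoding them.
// The helper name loadAwsCredentials is hypothetical.
function loadAwsCredentials(env = process.env) {
  const accessKeyId = env.AWS_ACCESS_KEY_ID;
  const secretAccessKey = env.AWS_SECRET_ACCESS_KEY;
  if (!accessKeyId || !secretAccessKey) {
    throw new Error('Missing AWS credentials in environment');
  }
  return { accessKeyId, secretAccessKey };
}
```

The object returned here can be passed straight into the AWS.S3 constructor shown below.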

Step 2: Create a Bucket

Click on “Create Bucket” to make a new one.

Then, in the form, fill in the necessary information. The bucket name must be globally unique. Review all the properties and permissions and apply them as needed.

Click through the remaining steps and confirm, and your bucket is ready.

Step 3: Implement in the Node.js project

Let’s start with the basics before we write any code. Create a blank project with the npm init command and fill in the relevant information.

Then install the required npm package:

npm i aws-sdk

The first step is to import the aws-sdk package.

const AWS = require('aws-sdk');

Now we need the “Access Key Id” and “Secret Access Key” to connect to AWS S3, along with the bucket name.

const s3 = new AWS.S3({
  accessKeyId: "ENTER YOUR accessKeyId",
  secretAccessKey: "ENTER YOUR secretAccessKey",
});

const BUCKET = '<YOUR BUCKET NAME>';

Upload object/file to the bucket

You can upload files or data to a bucket with the upload() method or the putObject() method, or by generating a pre-signed URL for the upload.

→ s3.upload

The upload() method is handled by the S3 Transfer Manager, which means it will transparently perform multipart uploads for you behind the scenes when necessary.

Example:

const fs = require('fs');

const uploadFile = (filePath, keyName) => {
  return new Promise((resolve, reject) => {
    try {
      const file = fs.readFileSync(filePath);

      const uploadParams = {
        Bucket: BUCKET,
        Key: keyName,
        Body: file,
      };

      s3.upload(uploadParams, function (err, data) {
        if (err) {
          return reject(err);
        }
        return resolve(data);
      });
    } catch (err) {
      return reject(err);
    }
  });
};

uploadFile('<FILE PATH>','<FILE NAME>')

→ s3.putObject

The putObject() method corresponds to the lowest-level S3 API request. It does not handle multipart uploads for you; it attempts to send the whole body in a single request.

Example:

const putObject = (key, fileBuffer) => {
  return new Promise((resolve, reject) => {
    try {
      const params = {
        Bucket: BUCKET,
        Key: key,
        Body: fileBuffer,
      };

      s3.putObject(params, function (err, data) {
        if (err) {
          return reject(err);
        }
        // putObject does not return the object URL, so build it manually.
        data.url = `https://${BUCKET}.s3.amazonaws.com/${key}`;
        data.key = key;
        return resolve(data);
      });
    } catch (err) {
      return reject(err);
    }
  });
};

putObject('<FILE NAME>', '<FILE BUFFER>');

→ s3.getSignedUrl

You can use a pre-signed URL to grant temporary access to someone who has no AWS credentials or access permissions. An AWS user who has access to the object generates the pre-signed URL; anyone who receives it can then upload a file to the bucket, because the URL is signed for the putObject operation via getSignedUrl.

Example:

const path = require('path');
const mime = require('mime-types'); // npm i mime-types

const getSignUrl = (key) => {
  return new Promise((resolve, reject) => {
    try {
      const params = {
        Bucket: BUCKET,
        Key: key,
        Expires: 30 * 60, // URL is valid for 30 minutes
        ContentType: mime.lookup(path.basename(key)) || 'application/octet-stream',
      };

      const signedUrl = s3.getSignedUrl('putObject', params);

      if (signedUrl) {
        return resolve(signedUrl);
      }
      return reject("Cannot create signed URL");
    } catch (err) {
      return reject("Cannot create signed URL!");
    }
  });
};

getSignUrl('<FILE PATH>');

Access or download object/file from the bucket

Similarly, a user without AWS credentials can read bucket files or objects through a URL signed for the getObject operation.

Example:

const getSignUrlForFile = (key) => {
  return new Promise((resolve, reject) => {
    try {
      const path = require('path');
      const fileName = path.basename(key);

      const params = {
        Bucket: BUCKET,
        Key: key,
        Expires: 30 * 60, // URL is valid for 30 minutes
      };

      const signedUrl = s3.getSignedUrl('getObject', params);

      if (signedUrl) {
        return resolve({
          signedUrl,
          fileName,
        });
      }
      return reject("Cannot create signed URL");
    } catch (err) {
      return reject("Cannot create signed URL!");
    }
  });
};

getSignUrlForFile('<FILE PATH>');

Delete object/file from the bucket

Using the Amazon S3 console, AWS SDKs, AWS Command Line Interface (AWS CLI), or REST API, you can delete one or more objects directly from Amazon S3. You should delete objects you no longer need, because every object stored in your S3 bucket incurs storage costs. If you are collecting log files, for example, it is a good idea to delete them when you are done with them. You can also use a lifecycle rule to have objects like log files deleted automatically.

Example:

const deleteObject = (key) => {
  return new Promise((resolve, reject) => {
    try {
      const params = {
        Bucket: BUCKET,
        Key: key,
      };

      s3.deleteObject(params, function (err, data) {
        if (err) {
          return reject(err);
        }
        return resolve(data);
      });
    } catch (err) {
      return reject(err);
    }
  });
};

deleteObject('<FILE PATH>');
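When you have many keys to remove, S3 also exposes a deleteObjects() call that deletes up to 1,000 keys in a single request, which is cheaper than looping over deleteObject(). A sketch of how the request parameters are shaped, where buildDeleteParams is a hypothetical helper name:

```javascript
// Hypothetical helper: build the params for s3.deleteObjects(), which
// removes up to 1000 keys in one request.
function buildDeleteParams(bucket, keys) {
  return {
    Bucket: bucket,
    Delete: {
      Objects: keys.map((Key) => ({ Key })),
      Quiet: true, // only report errors in the response, not every deletion
    },
  };
}

// Usage: s3.deleteObjects(buildDeleteParams(BUCKET, ['a.png', 'b.png']), callback);
```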

Conclusion

You can manage your AWS S3 bucket from a Node.js application using any of the techniques shown above. Additional configuration options for S3 buckets are documented on the official AWS website.
