
Uploading files to AWS S3 using Nodejs

Mukul Jain
Blog
06th Mar, 2018

AWS S3. A place where you can store files. That’s what most of you already know about it. S3 is one of the older services provided by Amazon, from before the days of revolutionary Lambda functions and game-changing Alexa Skills. You can store almost any type of file, from doc to pdf, with sizes ranging from 0 bytes to 5 TB.

According to the official docs:

“S3 provides comprehensive security and compliance capabilities that meet even the most stringent regulatory requirements. It gives customers flexibility in the way they manage data for cost optimization, access control, and compliance.” - AWS Docs

To put it in simple terms, AWS S3 is an object-based storage system: every file you store is saved as an object, not a file. Many big tech companies use S3, and Dropbox is one of them. Dropbox has recently started saving the metadata of its files in its own service, but it still keeps the main data in S3. Why? Well, S3 is not that expensive and it offers 99.9% availability. Plus, you get the chance to use services like Glacier, which stores data for roughly $0.01 per GB.

So far, if I have your attention, you’re probably wondering how to use S3 in your own Node.js application. Well, you don’t have to wait long.

AWS has an official package that exposes the S3 APIs to Node.js apps and makes it easy for developers to access S3 from their code.

Source: https://youtu.be/FLolHgKRTKg

In the next few steps, I will guide you through building a Node.js app that can write any file to AWS S3.

 

1. Set up node app

A basic Node app usually has 2 files: package.json (for dependencies) and a starter file (like app.js, index.js, or server.js).

You can use your OS’s file manager or your favourite IDE to create the project, but I usually prefer the CLI. So, let’s go into our shell and run these commands:

mkdir s3-contacts-upload-demo
cd s3-contacts-upload-demo
touch index.js .gitignore
npm init
 

If you didn’t get any error from the above commands, you will have a folder named s3-contacts-upload-demo with 3 files in it: package.json, .gitignore, and index.js. You can list files in .gitignore so that they won’t get committed to GitHub or some other version control system.
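For a Node project like this one, the .gitignore typically lists at least the node_modules folder; the entries below are common defaults, not something this project strictly requires:

```
node_modules/
.env
```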

npm init creates a package.json that holds the project’s details; you can just hit enter in your shell to accept the default values if you wish.


 

2. Install dependencies

Let’s start by installing the NPM package:

npm install --save aws-sdk

After this package installs successfully, check your package.json file: it will have aws-sdk listed under the “dependencies” field.
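The resulting entry looks something like this (the version number is simply whatever was current when you ran the install, not a requirement):

```json
{
  "dependencies": {
    "aws-sdk": "^2.205.0"
  }
}
```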

This npm package can be used to access any AWS service from your Node.js app; here, we will use it for S3.

 

3. Import packages

Once installed, import the package in your code:

const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

As you can see, we have also imported the fs package, which will be used to read the file data in this app. We are also using environment variables for the AWS access key and secret access key, as it is bad practice to commit them to version control like GitHub or SVN.

Now you have your S3 instance, which can access all the buckets in your AWS account.

 

4. Pass bucket information and write business logic

Below is a simple prototype of how to upload a file to S3. Here, Bucket is the name of your bucket and Key is the path of the object inside it. So, if your bucket name is “test-bucket” and you want to save the file at “test-bucket/folder/subfolder/file.csv”, then the value of Key should be “folder/subfolder/file.csv”.
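To make the Bucket/Key split concrete, here is a small helper (the function name is my own for illustration, not part of the SDK) that turns a full path like “test-bucket/folder/subfolder/file.csv” into upload parameters:

```javascript
// Split "bucket/rest/of/path" into S3 upload parameters.
// The bucket is the first path segment; everything after it is the Key.
function toS3Params(fullPath) {
  const [bucket, ...keyParts] = fullPath.split('/');
  return { Bucket: bucket, Key: keyParts.join('/') };
}

console.log(toS3Params('test-bucket/folder/subfolder/file.csv'));
// { Bucket: 'test-bucket', Key: 'folder/subfolder/file.csv' }
```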

Note: S3 is object-based storage, not file-based. So, even though in the AWS console you can see nested folders, behind the scenes they never get saved like that. Every object has two fields: Key and Value. The Key is simply the name of the object and the Value is the data being stored.

So, if a bucket “bucket1” has key “key1/key2/file.mp3”, you can visualize it like this:

{
  "bucket1": {
      "key1/key2/file.mp3": "<mp3-data>"
  }
}
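The “folders” you see in the AWS console are derived purely from shared key prefixes. A quick sketch of that idea in plain JavaScript (no SDK involved; the keys and function are made up for illustration):

```javascript
// Keys in a bucket are flat strings; "folders" are just shared prefixes.
const keys = ['key1/key2/file.mp3', 'key1/other.txt', 'readme.md'];

// List the top-level "folder" names the console would show.
function topLevelFolders(allKeys) {
  const folders = allKeys
    .filter(k => k.includes('/'))
    .map(k => k.split('/')[0]);
  return [...new Set(folders)];
}

console.log(topLevelFolders(keys)); // [ 'key1' ]
```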

Below is a simple snippet to upload a file, using Key and Bucket.

const params = {
  Bucket: 'bucket',
  Key: 'key',
  Body: stream // a readable stream, e.g. from fs.createReadStream()
};

s3.upload(params, function(err, data) {
  console.log(err, data);
});

 

5. File to upload to S3

First, create a file, let’s say contacts.csv and write some data in it.
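For example, contacts.csv could contain something like this (the names and numbers here are made up):

```
name,phone
Alice,555-0100
Bob,555-0101
```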


With some dummy data in contacts.csv, let’s read it using the fs module and save it to S3.

The S3 upload method returns the error and data in a callback, where the data field contains the Location, Bucket, and Key of the uploaded file. For the complete API reference, refer to the official docs.
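Putting the pieces together, index.js might look like the sketch below. The bucket name “test-bucket” and the key are placeholder assumptions; replace them with your own values before running.

```javascript
const fs = require('fs');
const AWS = require('aws-sdk');

// Credentials are read from environment variables (see step 3).
const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const params = {
  Bucket: 'test-bucket',                 // your bucket name
  Key: 'folder/subfolder/contacts.csv',  // where to put it inside the bucket
  Body: fs.createReadStream('contacts.csv') // stream instead of buffering
};

s3.upload(params, function(err, data) {
  if (err) {
    return console.error('Upload failed:', err);
  }
  // data contains the Location, Bucket, Key, and ETag of the uploaded object.
  console.log('File uploaded at', data.Location);
});
```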

Now run the app with the following command:

AWS_ACCESS_KEY=<your_access_key> AWS_SECRET_ACCESS_KEY=<your_secret_key> node index.js

We have passed our AWS keys as environment variables.

If you don’t get any error from the above snippet, your bucket should have the file.

And that’s it: in under 30 lines of code, you uploaded a file to AWS S3.

You can find the full project here.

There are a lot more things you can do with this package, like:

  • List buckets and objects

  • Set permissions on a bucket

  • Create, get, or delete a bucket, and much more

I would highly recommend you go through this doc for the APIs; they are very well explained, with the parameters you can pass to each API and the response format they return.

I hope you found this post useful. Please do let us know if you have any questions or queries.


Mukul Jain

Blog Author
Software Engineer at Edfora. Passionate about nodejs, serverless, AI & building futuristic stuff. Also a spontaneous blogger & witty tech speaker.
