S3 file upload stream using Node.js

Solution 1

You can now upload streams with the official AWS SDK for Node.js: see the section "Uploading a File to an Amazon S3 Bucket" in the docs, or their example on GitHub.

What's even more awesome, you can finally do so without knowing the file size in advance. Simply pass the stream as the Body:

var AWS = require('aws-sdk');
var fs = require('fs');
var zlib = require('zlib');

// Pipe the file through gzip; the total upload size is unknown in advance.
var body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
var s3obj = new AWS.S3({params: {Bucket: 'myBucket', Key: 'myKey'}});
s3obj.upload({Body: body})
  .on('httpUploadProgress', function(evt) { console.log(evt); })
  .send(function(err, data) { console.log(err, data); });

Solution 2

For your information, the v3 SDK was published with a dedicated module to handle exactly this use case: https://www.npmjs.com/package/@aws-sdk/lib-storage

Took me a while to find it.
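
A minimal sketch of how that module is used (bucket and key are placeholders; the gzip pipe mirrors Solution 1):

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const fs = require('fs');
const zlib = require('zlib');

async function uploadStream() {
  // As in Solution 1, the total size need not be known in advance.
  const body = fs.createReadStream('bigfile').pipe(zlib.createGzip());
  const upload = new Upload({
    client: new S3Client({}),
    params: { Bucket: 'myBucket', Key: 'myKey', Body: body },
  });
  upload.on('httpUploadProgress', (progress) => console.log(progress));
  return upload.done();
}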

Solution 3

Give https://www.npmjs.org/package/streaming-s3 a try.

I used it to upload several big files (>500 MB) in parallel, and it worked very well. It's very configurable and also lets you track upload statistics. You don't need to know the total size of the object, and nothing is written to disk.
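
A sketch of typical usage, based on the package's README; treat the exact constructor signature as an assumption and verify it against the package docs:

var StreamingS3 = require('streaming-s3');
var fs = require('fs');

// Any readable stream works; a file stream is used here for illustration.
var fStream = fs.createReadStream('video.mp4');
var uploader = new StreamingS3(
  fStream,
  'accessKeyId', 'secretAccessKey',
  { Bucket: 'myBucket', Key: 'video.mp4', ContentType: 'video/mp4' },
  function (err, resp, stats) {
    if (err) return console.log('Upload error:', err);
    console.log('Upload stats:', stats);
    console.log('Upload successful:', resp);
  }
);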

Comments

  • Janak Kansal
    Janak Kansal over 3 years

    I am trying to find a solution to stream files to Amazon S3 from a Node.js server, with these requirements:

    • Don't store a temp file on the server or hold the complete file in memory; buffering up to some limit (not the complete file) is fine for the upload.
    • No restriction on uploaded file size.
    • Don't block the server until the file upload completes, because during a heavy upload the waiting time of other requests would increase unexpectedly.

    I don't want to upload directly from the browser, because the S3 credentials would have to be shared in that case. Another reason to upload from the Node.js server is that some authentication may also need to be applied before uploading the file.

    I tried to achieve this using node-multiparty, but it was not working as expected. You can see my solution and the issue at https://github.com/andrewrk/node-multiparty/issues/49. It works fine for small files but fails for a file of 15 MB.

    Any solution or alternative?
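
    A minimal sketch of the kind of pipeline being asked for, assuming Express and the busboy package for parsing the multipart request, together with @aws-sdk/lib-storage from Solution 2 (all names are placeholders, not the asker's code):

    const express = require('express');
    const busboy = require('busboy');
    const { S3Client } = require('@aws-sdk/client-s3');
    const { Upload } = require('@aws-sdk/lib-storage');

    const app = express();

    app.post('/upload', (req, res) => {
      // Authentication middleware could run before this handler.
      const bb = busboy({ headers: req.headers });
      bb.on('file', (name, file, info) => {
        // The incoming file stream is piped straight to S3; only the
        // SDK's internal multipart buffers are held in memory.
        new Upload({
          client: new S3Client({}),
          params: { Bucket: 'myBucket', Key: info.filename, Body: file },
        })
          .done()
          .then(() => res.sendStatus(200))
          .catch((err) => res.status(500).send(err.message));
      });
      req.pipe(bb);
    });

    app.listen(3000);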

  • securecurve
    securecurve over 8 years
    Can this library stream-upload a file from an uploading user, instead of me having to buffer it on my server first (whether in memory or on disk)?
  • Harshavardhana
    Harshavardhana over 8 years
    It takes an input stream, which can be a file stream or any other stream. It will keep uploading to the server until the stream closes.
  • Daniel Kobe
    Daniel Kobe almost 8 years
    This isn't working with the output stream from my yazl zip object?
  • Juan
    Juan over 5 years
    Brilliant! You can also pipe Buffers to zlib.createGzip() by transforming them into a Stream: const { Duplex } = require('stream');
  • Zaheer
    Zaheer over 4 years
    Does anyone know how this works? If each part is a fixed size, how do they fill in the last part if it doesn't exactly match the full size?
  • fIwJlxSzApHEZIl
    fIwJlxSzApHEZIl almost 4 years
    Can you update the link, Johann? It appears to have changed.
  • Johann Philipp Strathausen
    Johann Philipp Strathausen almost 4 years
    @anon58192932 thanks for catching that, the link is now updated!
  • knownasilya
    knownasilya over 2 years
    Ran into issues with this where the stream passed in gets transformed into a GeoJSON feature collection.