Uploading base64 encoded Image to Amazon S3 via Node.js


Solution 1

For people who are still struggling with this issue, here is the approach I used with the native aws-sdk:

var AWS = require('aws-sdk');
AWS.config.loadFromPath('./s3_config.json');
var s3Bucket = new AWS.S3( { params: {Bucket: 'myBucket'} } );

Inside your router method (ContentType should be set to the content type of the image file):

  var buf = Buffer.from(req.body.imageBinary.replace(/^data:image\/\w+;base64,/, ""), 'base64');
  var data = {
    Key: req.body.userId,
    Body: buf,
    ContentEncoding: 'base64', // see the comments below: not actually needed, since Body is already decoded binary
    ContentType: 'image/jpeg'
  };
  s3Bucket.putObject(data, function(err, data) {
    if (err) {
      console.log('Error uploading data:', err);
    } else {
      console.log('Successfully uploaded the image!');
    }
  });
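Rather than hardcoding ContentType: 'image/jpeg', the type can be read from the data URI itself. A minimal sketch, assuming the incoming string is a well-formed base64 image data URI (parseDataUri is a hypothetical helper, not part of the answer above):

```javascript
// Hypothetical helper: extract the MIME type and the decoded bytes
// from a base64 image data URI, so ContentType can be set dynamically.
function parseDataUri(dataUri) {
  var match = dataUri.match(/^data:(image\/\w+);base64,(.+)$/);
  if (!match) throw new Error('Not a base64-encoded image data URI');
  return {
    contentType: match[1],                   // e.g. 'image/png'
    buffer: Buffer.from(match[2], 'base64')  // decoded binary for Body
  };
}
```

The parsed contentType and buffer can then be passed straight into the putObject params above.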

s3_config.json file:

{
  "accessKeyId":"xxxxxxxxxxxxxxxx",
  "secretAccessKey":"xxxxxxxxxxxxxx",
  "region":"us-east-1"
}

Solution 2

OK, this one is the answer for how to save canvas data to a file.

Basically it looks like this in my code (CoffeeScript):

buf = Buffer.from(data.dataurl.replace(/^data:image\/\w+;base64,/, ""), 'base64')


req = knoxClient.put('/images/'+filename, {
             'Content-Length': buf.length,
             'Content-Type':'image/png'
  })

req.on('response', (res) ->
  if res.statusCode is 200
      console.log('saved to %s', req.url)
      socket.emit('upload success', imgurl: req.url)
  else
      console.log('error %d', req.statusCode)
  )

req.end(buf)

Solution 3

Here's the code from an article I came across, posted below:

const imageUpload = async (base64) => {

  const AWS = require('aws-sdk');

  const { ACCESS_KEY_ID, SECRET_ACCESS_KEY, AWS_REGION, S3_BUCKET } = process.env;

  AWS.config.setPromisesDependency(require('bluebird'));
  AWS.config.update({ accessKeyId: ACCESS_KEY_ID, secretAccessKey: SECRET_ACCESS_KEY, region: AWS_REGION });

  const s3 = new AWS.S3();

  const base64Data = Buffer.from(base64.replace(/^data:image\/\w+;base64,/, ""), 'base64');

  const type = base64.split(';')[0].split('/')[1];

  const userId = 1;

  const params = {
    Bucket: S3_BUCKET,
    Key: `${userId}.${type}`, // type is not required
    Body: base64Data,
    ACL: 'public-read',
    ContentEncoding: 'base64', // see the comments below: not actually required, since Body is already decoded
    ContentType: `image/${type}` // required. Notice the back ticks
  }

  let location = '';
  let key = '';
  try {
    const { Location, Key } = await s3.upload(params).promise();
    location = Location;
    key = Key;
  } catch (error) {
    console.log(error);
  }

  console.log(location, key);

  return location;

}

module.exports = imageUpload;

Read more: http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property

Credits: https://medium.com/@mayneweb/upload-a-base64-image-data-from-nodejs-to-aws-s3-bucket-6c1bd945420f

Solution 4

The accepted answer works great, but if someone needs to accept any file instead of just images, this regexp does the job:

/^data:.+;base64,/
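As a sketch of how that generalized pattern can be used, keeping the MIME type so ContentType can still be set on upload (decodeDataUri is a hypothetical name, not from the answer):

```javascript
// Hypothetical helper: decode any base64 data URI (PDFs, videos, ...),
// returning the MIME type and the decoded bytes.
function decodeDataUri(dataUri) {
  const mimeType = dataUri.slice(5, dataUri.indexOf(';')); // skip "data:"
  const body = dataUri.replace(/^data:.+;base64,/, '');
  return { mimeType, buffer: Buffer.from(body, 'base64') };
}
```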

Author: Franz Enzenhofer

Updated on July 08, 2022

Comments

  • Franz Enzenhofer
    Franz Enzenhofer almost 2 years

    Yesterday I did a deep-night coding session and created a small node.js/JS (well, actually CoffeeScript, but CoffeeScript is just JavaScript, so let's say JS) app.

    what's the goal:

    1. client sends a canvas datauri (png) to server (via socket.io)
    2. server uploads image to amazon s3

    step 1 is done.

    the server now has a string like

    data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMgAAADICAYAAACt...
    

    my question is: what are my next steps to "stream"/upload this data to Amazon S3 and create an actual image there?

    knox https://github.com/LearnBoost/knox seems like an awesome lib to PUT something to S3, but what I'm missing is the glue between the base64-encoded image string and the actual upload action.

    Any ideas, pointers and feedback welcome.

  • Nichole A. Miler
    Nichole A. Miler over 8 years
    [MissingRequiredParameter: Missing required key 'Key' in params]
  • Divyanshu Das
    Divyanshu Das over 8 years
    Key: req.body.userId I used userId as key in post data... it was long back... but you can declare any string as key. To make sure already present files are not overwritten keep the key unique.
  • alexventuraio
    alexventuraio about 8 years
    @Divyanshu Thanks for such a useful example. I have two doubts: how do I make S3 generate a unique KEY to prevent overwriting files? And if I don't set the ContentType, will I get a corrupted file when I download it? Thanks in advance!
  • Divyanshu Das
    Divyanshu Das about 8 years
    @Lexynux unique key is something you have to generate on your system. something like:- aws_key : req.user.username + uuid(Math.round((new Date()).getTime() / 1000)) you can add more randomness to it to make sure that it does not clash. There is no specific rule here.
  • Divyanshu Das
    Divyanshu Das about 8 years
    @Lexynux about content type, try to set content type beforehand, you can find code snippets in node to determine content-type of a file. not setting content-type might lead to corrupted files, I haven't verified this since I always set content-type before posting to s3
  • alexventuraio
    alexventuraio about 8 years
    Alright @Divyanshu I will try to do so and if I need some more help I will write you back! Thanks a lot from México City!
  • NaveenG
    NaveenG almost 8 years
    The Buffer object will throw a "Buffer not defined" error; can you give me a solution for that?
  • NaveenG
    NaveenG almost 8 years
    @Divyanshu The Buffer object will throw a "Buffer not defined" error; can you give me a solution for that?
  • Divyanshu Das
    Divyanshu Das almost 8 years
    @NaveenG You should not be getting that since Buffer class is part of javascript language. You might wanna check your javascript and node installation.
  • Krishna
    Krishna almost 8 years
    I am also getting the same error. Did you find a solution?
  • Krishna
    Krishna over 7 years
    @NaveenG Did you solve your error? I also face the same error.
  • Pushkar Kathuria
    Pushkar Kathuria over 7 years
    @Divyanshu Could you please help with the video upload as well?
  • Divyanshu Das
    Divyanshu Das over 7 years
    @PushkarKathuria, The above approach should work for video as well. You just need to set the correct mime_type and base64 for video to read in buffer. For video base64 should take format like this - data:video/mp4;base64
  • Marklar
    Marklar almost 7 years
    From the docs it appears that the upload method returns the location in data but the putObject method does not. Do you know how to get the new S3 location path after a successful putObject?
  • Divyanshu Das
    Divyanshu Das almost 7 years
    @Marklar location path is basically key - e.g. if your bucket name is - bucketone and key name is xyz.png, then file path will be bucketone.s3.amazonaws.com/xyz.png
  • Manish
    Manish almost 7 years
    Thanks a lot... I was confused about the file name. Here the key is the filename and can contain a path as well. For example, "/img/profile/userid.jpg" works like a charm for me.
  • Shuhei Kagawa
    Shuhei Kagawa over 6 years
    @Divyanshu Thanks for this great answer! It helped me a lot. However, I think ContentEncoding: 'base64' is not correct because new Buffer(..., 'base64') decodes base64-encoded string into its binary representation.
  • Meet Zaveri
    Meet Zaveri over 6 years
    Hey @Divyanshu it's working, but I have a problem. Image is missing pixels in half of the portion. So image is not completely viewable. My decoded string is starting from - /9j/4QO... after stripping those "data:image" and all.
  • Meet Zaveri
    Meet Zaveri over 6 years
    @Divyanshu Also replace(/^data:image\/\w+;base64,/, "") is not correctly finding data:image
  • Divyanshu Das
    Divyanshu Das over 6 years
    @MeetZaveri, that's weird, this code still works for me. I used it in a new project recently. I am not sure how could I help more. Are you sure there is no issue with image or other settings ?
  • Meet Zaveri
    Meet Zaveri over 6 years
    @Divyanshu it was done days before. No issues now. Your sol. provided an assist
  • Pointi
    Pointi about 6 years
    @NaveenG This is a node example, maybe you are using plain JS?
  • Pistos
    Pistos about 6 years
    Using the Buffer like that also works with the upload() function, too.
  • Ka Tech
    Ka Tech about 6 years
    Thank you thank you thank you!!!!!! I've been spending weeks on this and finally a straightforward solution.
  • Adam Florin
    Adam Florin over 5 years
    Yes to what @ShuheiKagawa said; ContentEncoding is not necessary. Also, according to the latest Node.js docs, that form of the Buffer constructor is deprecated, replaced by Buffer.from(..., 'base64').
  • delux247
    delux247 over 5 years
    Yes, @ShuheiKagawa is correct, ContentEncoding is unnecessary and actually caused my image urls served from S3 to be invalid for the Facebook API. You can test your urls here.. developers.facebook.com/tools/debug/sharing
  • thehme
    thehme about 5 years
    Just needed the ContentType; not sure why this took so long to find. AWS docs aren't the easiest to navigate.
  • Raj Thakar
    Raj Thakar about 5 years
    @DivyanshuDas Can I use .split(/base64,/)[1] instead of .replace(/^data:image\/\w+;base64,/, "")? Because I want to upload any file. I am using AWS Lambda and can't find any other method to upload a file via a Lambda function.
  • Lead Developer
    Lead Developer almost 4 years
    Yes! Body should be a Buffer object. not base64 string!
  • Janen R
    Janen R about 3 years
    Working!! This fixed the issue: .replace(/^data:image\/\w+;base64,/, "")
  • Abhishek Singh
    Abhishek Singh almost 3 years
    "new" keyword should not come before Buffer.from
  • Sushil Thapa
    Sushil Thapa over 2 years
    For loading credentials in node.js app: docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/…
  • 高欣平
    高欣平 over 2 years
    works for me, thanks a lot