Save a JSON from AWS Lambda to AWS S3 with node.js

As @dashmug said, your example is not a Lambda function.

Your file must export a handler — exports.handler by default — unless a different handler name is specified in the function configuration.

Lambda invokes exports.handler with ( event, context, callback ): the data of the event that triggered the invocation, some additional runtime context, and a success/failure callback.

Here is what you are looking for:

Update: changed the Promise-wrapped S3.putObject call to S3.putObject().promise() per @dashmug's recommendation.

Requires the AWS SDK for JavaScript v2.3.0 (March 31, 2016) or later.

'use strict';

const
    AWS = require( 'aws-sdk' ),
    S3  = new AWS.S3();

exports.handler = ( event, context, callback ) => {
    console.log( `FUNCTION STARTED: ${new Date()}` );

    S3.putObject( {
         Bucket: 'gpiocontroll-XYZ',
         Key: 'test.txt',
         Body: 'stuff'
    } )
         .promise()
         .then( () => console.log( 'UPLOAD SUCCESS' ) )
         .then( () => callback( null, 'MISSION SUCCESS' ) )
         .catch( e => {
            console.error( 'ERROR', e );
            callback( e );
         } );
};
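The example above writes a plain string; to save actual JSON, as the question asks, serialize the data first. A sketch using the same bucket and the question's supertest.json key (the sample payload is taken from the question's log):

```javascript
// Sample payload, as seen in the question's log output.
const data = { table: [ { pin: '1', state: 'aus' } ] };

const params = {
    Bucket: 'gpiocontroll-XYZ',
    Key: 'supertest.json',
    Body: JSON.stringify( data ),       // S3 stores bytes; serialize the object
    ContentType: 'application/json'     // so S3 serves the file as JSON
};

// Then upload exactly as above:
// S3.putObject( params ).promise().then( ... ).catch( ... );
```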

Note: you must give the Lambda function IAM permission to access the S3 bucket. For the case above, the policy attached to the function's execution role should include a statement like this:

{
    "Effect": "Allow",
    "Action": [ "s3:PutObject" ],
    "Resource": [
        "arn:aws:s3:::gpiocontroll-XYZ/*"
    ]
}
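If you are writing the execution role's policy from scratch, that statement slots into a full policy document like this (a minimal sketch for this bucket only):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [ "s3:PutObject" ],
            "Resource": [ "arn:aws:s3:::gpiocontroll-XYZ/*" ]
        }
    ]
}
```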
Author: luckybusted

I am a frontend developer and project manager at a small ad agency in Munich, Germany.

Updated on July 02, 2022

Comments

  • luckybusted, almost 2 years ago

    I am trying to save a JSON file from AWS Lambda to S3 (to be more precise: I want to create a new file 'supertest.json' containing the 'data' inside the S3 bucket 'gpiocontroll-XYZ').

    The Lambda function looks like this:

    'use strict'
    
    const aws = require('aws-sdk');
    const s3 = new aws.S3();
    
    //const fs = require('fs');
    
    function saveJSONtoS3(data){
        console.log('SAVEJSON', data);
    
        var params = {
            Bucket: 'gpiocontroll-XYZ', // your bucket name,
            Key: 'test.txt', // path to the object you're looking for
            Body: data
        }
    
        s3.putObject(params, function(err, data) {
            // Handle any error and exit
            if (err) {
                console.log('ERROR', err);
            } else {
                console.log('UPLOADED SUCCESS');
            }
            console.log('INSIDE FUNCTION');
        });
    
        console.log('END')
    }
    
    module.exports = {
        saveJSONtoS3 : saveJSONtoS3
    }
    

    The log on Lambda looks like:

    2017-12-27T20:04:29.382Z    255d436d-eb41-11e7-b237-1190c4f33d2d    SAVEJSON {"table":[{"pin":"1","state":"aus"}]}
    2017-12-27T20:04:29.402Z    255d436d-eb41-11e7-b237-1190c4f33d2d    END
    END RequestId: 255d436d-eb41-11e7-b237-1190c4f33d2d
    REPORT RequestId: 255d436d-eb41-11e7-b237-1190c4f33d2d  Duration: 362.29 ms Billed Duration: 400 ms     Memory Size: 128 MB Max Memory Used: 43 MB  
    

    So it seems like everything is fine, but the s3.putObject callback never seems to fire. Lambda and S3 are both in the same region. The S3 bucket is public, with an IAM user. Do I need to log in somehow inside the Lambda function?

    Thanks a lot!