How could I use AWS Lambda to write a file to S3 (Python)?
Solution 1
I've had success streaming data to S3; the data has to be encoded to do this:
```python
import boto3

def lambda_handler(event, context):
    string = "dfghj"
    encoded_string = string.encode("utf-8")

    bucket_name = "s3bucket"
    file_name = "hello.txt"
    s3_path = "100001/20180223/" + file_name

    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)
```
If the data is in a file, you can read this file and send it up:
```python
with open(filename) as f:
    string = f.read()

encoded_string = string.encode("utf-8")
```
Solution 2
My response is very similar to Tim B's, but the most important part is:

1. Go to S3 and create the bucket you want to write to.
2. Follow the steps below; otherwise your Lambda will fail due to permission/access errors. I've copied the linked content here as well, in case the URL changes or the page moves.
a. Open the roles page in the IAM console.
b. Choose Create role.
c. Create a role with the following properties.
- Trusted entity – AWS Lambda.
- Permissions – AWSLambdaExecute.
- Role name – lambda-s3-role.
The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs.
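For reference, the trust policy behind the "Trusted entity – AWS Lambda" choice in step c is a small JSON document. This is a sketch of what the console generates, with the boto3 calls that would create the same role shown as comments (role name and policy name as in the steps above):

```python
import json

# Trust policy that lets the Lambda service assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

# With AWS credentials configured, the same role could be created with boto3:
# iam = boto3.client("iam")
# iam.create_role(RoleName="lambda-s3-role",
#                 AssumeRolePolicyDocument=json.dumps(trust_policy))
# iam.attach_role_policy(RoleName="lambda-s3-role",
#                        PolicyArn="arn:aws:iam::aws:policy/AWSLambdaExecute")

print(json.dumps(trust_policy))
```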
Copy and paste this into your Lambda Python function:
```python
import json
import os
import boto3

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    some_text = "test"
    # put the bucket name you created in step 1
    bucket_name = "my_buck_name"
    file_name = "my_test_file.csv"
    lambda_path = "/tmp/" + file_name
    s3_path = "output/" + file_name

    os.system('echo testing... >' + lambda_path)

    s3 = boto3.resource("s3")
    # upload under s3_path so the returned message matches the object key
    s3.meta.client.upload_file(lambda_path, bucket_name, s3_path)
    return {
        'statusCode': 200,
        'body': json.dumps('file is created in: ' + s3_path)
    }
```
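As an aside, the `os.system('echo ...')` shell-out above can be replaced with a plain Python write to the same `/tmp` path, which avoids spawning a shell and is easier to debug (a sketch, reusing the file name from the function above):

```python
def write_temp_file(lambda_path, text):
    # /tmp is the only writable directory in the Lambda filesystem
    with open(lambda_path, "w") as f:
        f.write(text)

write_temp_file("/tmp/my_test_file.csv", "testing...\n")
```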
Rick.Wang
Updated on July 09, 2022

Comments
-
Rick.Wang almost 2 years
I have tried to use a Lambda function to write a file to S3; the test shows "succeeded", but nothing appeared in my S3 bucket. What happened? Can anyone give me some advice or solutions? Thanks a lot. Here's my code.

```python
import json
import boto3

def lambda_handler(event, context):
    string = "dfghj"
    file_name = "hello.txt"
    lambda_path = "/tmp/" + file_name
    s3_path = "/100001/20180223/" + file_name

    with open(lambda_path, 'w+') as file:
        file.write(string)

    s3 = boto3.resource('s3')
    s3.meta.client.upload_file(lambda_path, 's3bucket', s3_path)
```
-
Rick.Wang about 6 years
Thanks a lot. My code is also valid; I forgot to reload the S3 bucket. I also tried your method and it works too. Thank you so much.
-
Robert Swift over 5 years
I found that the S3 path shouldn't have a leading `/`, otherwise an empty folder is created (the key essentially becomes `//100001`), so I think the line should read: `s3_path = "100001/20180223/" + file_name`
-
Matt Klein almost 5 years
Note that this isn't streaming, but buffering to disk and then sending.
-
acaruci over 4 years
This solution worked perfectly. I needed to write a CSV file, so I actually wrote to an io.StringIO, then encoded the buffer content to UTF-8 and saved it to an S3 file.
-
user3821178 over 3 years
Why do you need a lambda path variable?
-
vesperknight about 3 years
That variable isn't used, so he doesn't need it; he just forgot to remove it.