How to read a CSV file from an S3 bucket in AWS Lambda?
Solution 1
csvfile = s3.get_object(Bucket=bucket, Key=file_key)
csvcontent = csvfile['Body'].read().decode('utf-8').splitlines()
At this point you have already retrieved the file contents and split them into lines (note the added decode: csv.DictReader expects text lines, not bytes, so splitting on b'\n' alone is not enough). There is no need to call open() again; just pass csvcontent straight into your reader:
csv_data = csv.DictReader(csvcontent)
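As a minimal end-to-end sketch of this approach (using an inline bytes literal in place of the real csvfile['Body'].read() result, so it runs outside Lambda):

```python
import csv

# Stand-in for csvfile['Body'].read(); in Lambda this comes from
# s3.get_object(Bucket=bucket, Key=file_key)['Body'].read()
raw = b"name,age\nalice,30\nbob,25\n"

# Decode to text first: csv.DictReader expects strings, not bytes
csvcontent = raw.decode('utf-8').splitlines()
data = list(csv.DictReader(csvcontent))
print(data)  # [{'name': 'alice', 'age': '30'}, {'name': 'bob', 'age': '25'}]
```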
Solution 2
The code below helped me a lot to get the CSV file data from an S3 bucket in a clean, index-addressable format:
import csv
import boto3

key = 'key-name'
bucket = 'bucket-name'
s3_resource = boto3.resource('s3')
s3_object = s3_resource.Object(bucket, key)
data = s3_object.get()['Body'].read().decode('utf-8').splitlines()
lines = csv.reader(data)
headers = next(lines)
print('headers: %s' % (headers))
for line in lines:
    # print the complete line
    print(line)
    # print by index
    print(line[0], line[1])
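If you also want each row addressable by column name rather than position, the headers/lines pair above can be zipped into dicts. A sketch, with inline sample data standing in for the decoded S3 object body:

```python
import csv

# Inline sample standing in for the decoded S3 object body
data = "id,color\n1,red\n2,blue\n".splitlines()
lines = csv.reader(data)
headers = next(lines)

# Pair each row with the header row, so values can be fetched by name
rows = [dict(zip(headers, line)) for line in lines]
print(rows[0]['color'])  # red
```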
Solution 3
csvfile['Body'] is a StreamingBody, so you can't use it with open(). This line has already read all of the data from the stream:
csvcontent = csvfile['Body'].read().split(b'\n')
so just parse those lines (decoding the bytes first) to get the content you need.
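For large objects, calling .read() pulls the whole file into memory (as one commenter notes below the question). The stream can instead be consumed lazily. A sketch, using io.BytesIO as a stand-in for the StreamingBody so it runs outside Lambda; recent botocore StreamingBody objects are file-like, but whether TextIOWrapper accepts yours depends on the botocore version, so treat this as an assumption to verify:

```python
import csv
import io

# io.BytesIO stands in here for csvfile['Body'] (a botocore StreamingBody)
body = io.BytesIO(b"id,value\n1,a\n2,b\n")

# Wrap the byte stream in a text layer so csv.reader can pull rows
# one at a time instead of loading everything with .read()
rows = list(csv.reader(io.TextIOWrapper(body, encoding='utf-8')))
for row in rows:
    print(row)
```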
Angara kilkiri
Updated on April 11, 2020
Comments
-
Angara kilkiri about 4 years
I am trying to read the content of a csv file which was uploaded on an s3 bucket. To do so, I get the bucket name and the file key from the event that triggered the lambda function and read it line by line. Here is my code:
import json
import os
import boto3
import csv

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        file_key = record['s3']['object']['key']
        s3 = boto3.client('s3')
        csvfile = s3.get_object(Bucket=bucket, Key=file_key)
        csvcontent = csvfile['Body'].read().split(b'\n')
        data = []
        with open(csvfile['Body'], 'r') as csv_file:
            csv_file = csv.DictReader(csv_file)
            data = list(csv_file)
The exact error I’m getting on the CloudWatch is:
[ERROR] TypeError: expected str, bytes or os.PathLike object, not list
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 19, in lambda_handler
    with open(csvcontent, 'r') as csv_file:
Could someone help me fix this? I appreciate any help you can provide as I am new to lambda
-
Kalenji over 3 years
Do you know how to then get that csv file in lambda as a data frame?
-
sheetal over 3 years
For small CSVs, yes. For large files this code will eat up all the memory and get stuck.
-
deesolie almost 3 years
Note: if you are still getting weird characters using utf-8, try utf-8-sig, as it reads the byte order mark as metadata instead of string content. See stackoverflow.com/questions/57152985/…
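The utf-8-sig point above can be seen directly (the sample bytes here are illustrative, mimicking a file exported from Excel):

```python
import csv

# Files exported from Excel often begin with a UTF-8 byte order mark
raw = b"\xef\xbb\xbfname,age\nalice,30\n"

# Plain utf-8 leaves the BOM glued to the first header name...
first_header = raw.decode('utf-8').splitlines()[0]  # '\ufeffname,age'

# ...while utf-8-sig strips it, so DictReader sees a clean header
rows = list(csv.DictReader(raw.decode('utf-8-sig').splitlines()))
print(rows[0]['name'])  # alice
```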
-
Janzaib M Baloch over 2 years
Is s3 in your case boto3.client("s3") or boto3.resource("s3")?