Python/AWS Lambda Function: How to view /tmp storage?
Solution 1
You can't 'view' the /tmp directory after the Lambda execution has ended.
Lambda runs on a distributed architecture, and once the execution environment is recycled, all resources it used (including any files stored in /tmp) are disposed of.
So if you want to inspect your files afterwards, you might want to consider using EC2 or S3.
If you just want to check, during execution, whether the S3 download was successful, you can try:
import os
os.path.isfile('/tmp/' + filename)
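Building on that check, a minimal handler sketch (the event shape and the `filename` field are assumptions for illustration) is to list /tmp inside the function and print what it finds; the output lands in CloudWatch Logs, which is the closest thing to "viewing" /tmp during execution:

```python
import os

def lambda_handler(event, context):
    # Hypothetical event shape: {"filename": "..."} is assumed here.
    filename = event.get("filename", "example.txt")
    local_path = os.path.join("/tmp", filename)

    # ... download from S3 into local_path here ...

    exists = os.path.isfile(local_path)
    print(f"{local_path} exists: {exists}")

    # Log everything currently sitting in /tmp, with sizes.
    for name in os.listdir("/tmp"):
        path = os.path.join("/tmp", name)
        try:
            if os.path.isfile(path):
                print(f"{name}\t{os.path.getsize(path)} bytes")
        except OSError:
            pass  # file vanished or is unreadable; skip it

    return {"exists": exists}
```

Anything printed here shows up in the function's CloudWatch log stream, so you can confirm the files are where you expect without shell access to the container.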
Solution 2
As previous answers suggested, you might want to create a tmp/ prefix in your S3 bucket and download/upload your temporary processing files under that prefix before the final cleanup.
You can do the following (I'm not showing the detailed process here):
import boto3
s3 = boto3.client("s3")
s3.put_object(Bucket=Your_bucket_name, Key="tmp/" + Your_file_name)
You download a file from the bucket into Lambda's local /tmp with:
s3.download_file(Your_bucket_name, Your_key_name, '/tmp/' + Your_file_name)
After you download and process the files, you upload the result back to the bucket's tmp/ prefix with:
s3.upload_file('/tmp/' + Your_file_name, Your_bucket_name, Your_key_name)
where you include the tmp/ prefix in Your_key_name.
Then you should be able to list the objects easily, for example:
response = s3.list_objects_v2(Bucket=Your_bucket_name, Prefix="tmp/")
for obj in response.get("Contents", []):
    print("{name}\t{size}\t{modified}".format(
        name=obj["Key"],
        size=obj["Size"],
        modified=obj["LastModified"],
    ))
You may also want to keep your downloads and uploads asynchronous, for example with an async wrapper around boto.
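Putting the pieces above together, here is a hedged round-trip sketch. `Your_bucket_name`-style values are placeholders, `tmp_key` is a hypothetical helper for the bucket's tmp/ prefix, and the boto3 import is deferred into the function so the helper stays usable outside AWS:

```python
import os

def tmp_key(filename):
    """Mirror a local file name under the bucket's tmp/ prefix."""
    return "tmp/" + os.path.basename(filename)

def roundtrip(bucket_name, key_name):
    # bucket_name and key_name are placeholders for your own values.
    import boto3  # available by default in the Lambda runtime
    s3 = boto3.client("s3")

    local_path = os.path.join("/tmp", os.path.basename(key_name))

    # Download the object into Lambda's local /tmp ...
    s3.download_file(bucket_name, key_name, local_path)

    # ... process local_path here ...

    # ... then upload the result under tmp/ in the bucket, so it
    # survives after the execution environment is recycled.
    s3.upload_file(local_path, bucket_name, tmp_key(key_name))
    return local_path
```

The S3 tmp/ prefix is what you can actually browse in the console afterwards; the Lambda /tmp directory itself is gone once the environment is recycled.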
Solution 3
Try using an S3 bucket to store the file and read it from the AWS Lambda function; you should ensure the AWS Lambda execution role has access to the S3 bucket.
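For reference, a minimal sketch of an IAM policy for the Lambda execution role that allows reading and writing objects in the bucket (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```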
Admin
Updated on June 13, 2020

Comments
-
Admin almost 4 years
Lambda functions have access to disk space in their own /tmp directories. My question is, where can I visually view the /tmp directory? I'm attempting to download the files into the /tmp directory to read them, and write a new file to it as well. I actually want to see that the files I'm working with are getting stored properly in /tmp during execution. Thank you
-
RandomEli about 7 years
EC2 is more than just checking files; if he only wants a /tmp directory he can access, S3 is enough.
-
Admin about 7 years
No, I am actually downloading files from an S3 bucket, and want to temporarily store the downloaded files in /tmp. I just want to make sure I'm actually using the /tmp directory. From the site:
Each Lambda function receives 500MB of non-persistent disk space in its own /tmp directory.
-
Admin about 7 years
@joarleymoraes I am actually downloading files from an S3 bucket, and want to temporarily store the downloaded files in /tmp. I just want to make sure I'm actually using the /tmp directory. From the site:
Each Lambda function receives 500MB of non-persistent disk space in its own /tmp directory.
Is there a way to view the /tmp directory while the Lambda is executing?
-
joarleymoraes about 7 years
Ok, got it. How are you downloading the file? The function that downloads the file should return the tmp filename.
-
Admin about 7 years
@joarleymoraes Downloading like so: s3client.download_file(bucket_name, obj.key, '/tmp/' + filename). I want to check whether the files being downloaded are in fact being stored in that /tmp directory. I have no way of checking, and the logs do not even show any errors.
-
joarleymoraes about 7 years
You can simply check with os.path.isfile('/tmp/' + filename). You might want to actually open the file and check the content as well; use the with open('/tmp/' + filename) as fp: syntax.
-
Admin about 7 years
@joarleymoraes So would it be print(os.path.isfile('/tmp/' + filename))? And does that return True or False?
-
joarleymoraes about 7 years
Edited the answer to include this.
-
Admin about 7 years
Let us continue this discussion in chat.
-
Dev_Man about 4 years
This answer is a gem!
-
UncleBob over 2 years
Danger: the files in /tmp are (unfortunately) not disposed of between executions. They are disposed of when the Lambda is redeployed, but they will persist between executions, and the space is shared between all executions. So unless you delete the files yourself after every execution, you will at some point run into disk space problems (limited to 512 MB)!
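Given that warm execution environments reuse /tmp, a defensive sketch is to clear it at the start of the handler. `clean_tmp` and its default directory are assumptions for illustration, not part of any AWS API:

```python
import os
import shutil

def clean_tmp(tmp_dir="/tmp"):
    """Remove leftover files so reused (warm) execution environments
    don't accumulate data toward the 512 MB /tmp limit.

    Returns the names of the entries that were removed."""
    removed = []
    for name in os.listdir(tmp_dir):
        path = os.path.join(tmp_dir, name)
        if os.path.isfile(path) or os.path.islink(path):
            os.remove(path)
            removed.append(name)
        elif os.path.isdir(path):
            shutil.rmtree(path, ignore_errors=True)
            removed.append(name)
    return removed
```

Calling clean_tmp() as the first line of the handler keeps each invocation's view of /tmp fresh, at the cost of losing any caching benefit the warm container would otherwise provide.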