How to zip files in an Amazon S3 bucket and get their URL
S3 is not a file server, nor does it offer operating system file services, such as data manipulation.
If there are many huge files, your best bet is to:
- start a simple EC2 instance
- download those files to the EC2 instance, compress them, and re-upload the archive to the S3 bucket under a new object name
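The compress step in the workflow above can be sketched in Python with only the standard library; the S3 download and re-upload around it would typically use the AWS CLI (`aws s3 sync` / `aws s3 cp`). The bucket and path names in the comments are placeholders:

```python
import zipfile
from pathlib import Path

def zip_directory(src_dir: str, zip_path: str) -> str:
    """Compress every file under src_dir into a single zip archive."""
    src = Path(src_dir)
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(src.rglob("*")):
            if f.is_file():
                # store paths relative to src_dir inside the archive
                zf.write(f, f.relative_to(src))
    return zip_path

# On the EC2 instance, the surrounding workflow would be roughly
# (bucket and prefix are placeholders):
#   aws s3 sync s3://my-bucket/input/ ./downloads/        # download
#   zip_directory("./downloads", "./archive.zip")         # compress
#   aws s3 cp ./archive.zip s3://my-bucket/archive.zip    # re-upload
```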
Yes, you can use AWS Lambda to do the same thing, but Lambda is bound by a 900-second (15-minute) execution timeout (so it is recommended to allocate more RAM to boost Lambda execution performance).
Traffic from S3 to EC2 instances and other services in the same region is free.
If your main purpose is just to read those files from EC2 or other services within the same AWS region, you don't need this extra step. Just access the files directly.
(Update) : As mentioned by @Robert Reiz, now you can also use AWS Fargate to do the job.
Note :
It is recommended to access and share files using the AWS API. If you intend to share files publicly, you must take security seriously and impose download restrictions. AWS traffic out to the internet is never cheap.
jeff ayan, updated on November 25, 2020

Comments
- jeff ayan over 3 years: I have a bunch of files inside an Amazon S3 bucket. I want to zip those files and download the contents via an S3 URL using Java Spring.
- John Rotenstein about 7 years: Could you please clarify your requirements? What do you mean by "zip those amazon URLs into Zip"? Do you mean you wish to create a new object in an Amazon S3 bucket that consists of a list of URLs? Or do you wish to create a Zip file from several existing files? Please edit your question to provide more information so that we can assist you.
- jeff ayan about 7 years: Sir, I have huge files in an Amazon S3 bucket. I just want to create a Zip file from those files and get it as a single file directly from the bucket.
- pankaj about 5 years: The Lambda execution timeout can be set up to 15 minutes, not 300 seconds, as I can see on the dashboard.
- digitaldavenyc over 4 years: This won't work on files larger than 10 MB. Is there any other automated way to serve compressed files on AWS?
- Andrew over 4 years: Just spitballing here, but you could create an API Gateway, send a request to a Lambda function that could process the files (I think you're granted 5 GB of tmp space to do file processing), copy the archive back to the S3 bucket via Lambda, determine that path, and return the download URL of that path as the response to the client (via the gateway).
- Andrew over 4 years: Sorry, that should be 500 MB of tmp space, not 5 GB, although one training I did said 5 GB... Never tested, so I don't know what would happen.
- Robert Reiz over 3 years: EC2 is one of the most expensive services on AWS. I would recommend ECS Fargate, because it has all the advantages of EC2 but costs much less. If you need to run these kinds of tasks regularly, you can even create a scheduled task on ECS Fargate, which will trigger a Docker container every X hours or days.