Can we use AWS Glue just to copy a file from one S3 folder to another S3 folder?


Solution 1

You can do this, and there may be a reason to use AWS Glue for it: if you have chained Glue jobs, glue_job_#2 can be triggered on the successful completion of glue_job_#1.
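
For the chaining part, a conditional Glue trigger can start the second job when the first one succeeds. Here is a minimal boto3 sketch; the job and trigger names (glue_job_1, glue_job_2, run-job-2-after-job-1) are hypothetical placeholders:

import boto3

glue = boto3.client('glue')

# Start glue_job_2 whenever glue_job_1 finishes with state SUCCEEDED.
glue.create_trigger(
    Name='run-job-2-after-job-1',
    Type='CONDITIONAL',
    StartOnCreation=True,
    Predicate={
        'Conditions': [{
            'LogicalOperator': 'EQUALS',
            'JobName': 'glue_job_1',
            'State': 'SUCCEEDED'
        }]
    },
    Actions=[{'JobName': 'glue_job_2'}]
)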

The simple Python script below copies a file from one S3 folder (source) to another folder (target) using the boto3 library, and optionally deletes the original copy in the source directory.

import boto3

bucketname = "my-unique-bucket-name"
s3 = boto3.resource('s3')
my_bucket = s3.Bucket(bucketname)
source = "path/to/folder1"
target = "path/to/folder2"

for obj in my_bucket.objects.filter(Prefix=source):
    # Skip zero-byte "folder" placeholder keys that end in a slash
    if obj.key.endswith('/'):
        continue
    source_filename = obj.key.split('/')[-1]
    copy_source = {
        'Bucket': bucketname,
        'Key': obj.key
    }
    target_filename = "{}/{}".format(target, source_filename)
    # Server-side copy; the object data never leaves S3
    s3.meta.client.copy(copy_source, bucketname, target_filename)
    # Uncomment the line below if you wish to delete the original source file
    # s3.Object(bucketname, obj.key).delete()

Reference: Boto3 Docs on S3 Client Copy

Note: I would use f-strings to build target_filename, but f-strings are only supported in Python >= 3.6, and I believe the default AWS Glue Python interpreter is still 2.7.

Reference: PEP on f-strings

Solution 2

I think you can do it with Glue, but wouldn't it be easier to use the AWS CLI?

You can do the following:

aws s3 sync s3://bucket_1 s3://bucket_2
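
The same command also works at the folder (prefix) level, which is closer to what the question asks; bucket and folder names here are placeholders:

aws s3 sync s3://my-bucket/path/to/folder1 s3://my-bucket/path/to/folder2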

Solution 3

You could do this with Glue, but it's not the right tool for the job.

Far simpler would be to have a Lambda function triggered by an S3 object-created event. There's even a tutorial in the AWS docs on doing (almost) this exact thing.

http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
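
As a rough sketch of what such a handler could look like (the target bucket and prefix are assumed placeholders; the event parsing follows the standard S3 notification format):

import urllib.parse
import boto3

s3 = boto3.client('s3')

# Hypothetical destination; replace with your own bucket and prefix.
TARGET_BUCKET = 'my-target-bucket'
TARGET_PREFIX = 'path/to/folder2'

def lambda_handler(event, context):
    for record in event['Records']:
        source_bucket = record['s3']['bucket']['name']
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        filename = key.split('/')[-1]
        # Server-side copy into the target folder.
        s3.copy_object(
            Bucket=TARGET_BUCKET,
            Key='{}/{}'.format(TARGET_PREFIX, filename),
            CopySource={'Bucket': source_bucket, 'Key': key}
        )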

Solution 4

We ended up using Databricks to do everything.

Glue is not ready. It returns error messages that make no sense. We created support tickets and waited five days, with still no reply.

Solution 5

The S3 API lets you issue a COPY command (really a PUT with a header indicating the source URL) to copy objects within or between buckets. It is regularly used to fake rename() operations, but you can initiate the call yourself from anything.

There is no need to download any data; within the same S3 region the copy runs at roughly 6-10 MB/s.
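
In boto3, for instance, that server-side COPY is exposed as copy_object; bucket and key names below are placeholders:

import boto3

s3 = boto3.client('s3')

# S3 performs the copy internally; no data is downloaded to the caller.
# (For objects over 5 GB you would need a multipart copy instead.)
s3.copy_object(
    Bucket='destination-bucket',
    Key='path/to/folder2/file.zip',
    CopySource={'Bucket': 'source-bucket', 'Key': 'path/to/folder1/file.zip'}
)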

The AWS CLI cp command can do this as well.
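
For example, to copy a whole folder (placeholder paths):

aws s3 cp s3://my-bucket/path/to/folder1/ s3://my-bucket/path/to/folder2/ --recursive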


Comments

  • Jie
    Jie over 1 year

    I need to copy a zipped file from one AWS S3 folder to another and would like to make that a scheduled AWS Glue job. I cannot find an example for such a simple task. Please help if you know the answer. Maybe the answer is in AWS Lambda, or another AWS tool.

    Thank you very much!

  • Jie
    Jie almost 6 years
    The reason for using Glue is that it can run as a job, and on the job's completion other jobs can be triggered.
  • Dave Whittingham
    Dave Whittingham almost 6 years
    I am not 100% sure Glue is the tool for that. Glue is more of an ETL tool that crawls data sources for extraction into AWS. Have you had a look at Data Pipeline? docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/…
  • Nathan Griffiths
    Nathan Griffiths almost 6 years
    An AWS rep told me recently that Data Pipeline is likely to be phased out in favour of Glue ETL over time. Not sure how official that is, but I would probably go with Glue ETL if I had to choose between them; it seems more likely AWS will be investing in that long term.
  • Mathews Sunny
    Mathews Sunny over 5 years
    Can you format your answer? It will look much better.
  • Ravmcgav
    Ravmcgav over 2 years
    Is it possible to modify this code to copy to the Glue tmp folder that is accessible from within a job? stackoverflow.com/questions/66376252/…