Boto: uploading multiple files to S3

import logging
import os

import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)


def sync_to_s3(target_dir, aws_region=AWS_REGION, bucket_name=BUCKET_NAME):
    # AWS_REGION and BUCKET_NAME are module-level constants defined elsewhere.
    if not os.path.isdir(target_dir):
        raise ValueError('target_dir %r not found.' % target_dir)

    s3 = boto3.resource('s3', region_name=aws_region)
    try:
        s3.create_bucket(Bucket=bucket_name,
                         CreateBucketConfiguration={'LocationConstraint': aws_region})
    except ClientError:
        # Most likely the bucket already exists; carry on.
        pass

    for filename in os.listdir(target_dir):
        path = os.path.join(target_dir, filename)
        if not os.path.isfile(path):
            continue  # os.listdir does not recurse, so skip subdirectories

        logger.warning('Uploading %s to Amazon S3 bucket %s', filename, bucket_name)
        with open(path, 'rb') as body:  # context manager closes the file handle
            s3.Object(bucket_name, filename).put(Body=body)

        logger.info('File uploaded to https://s3.%s.amazonaws.com/%s/%s',
                    aws_region, bucket_name, filename)

You can pass a folder as an argument and iterate over its files.
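
For instance, a minimal driver for the function above (the region value here is an assumption; the bucket name is taken from the question):

AWS_REGION = 'eu-west-2'  # assumed region; substitute your own
BUCKET_NAME = 'io-master.mycompany.co.uk'  # bucket name from the question

if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    sync_to_s3('dist')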

Author: serlingpa · Updated on June 08, 2022

Comments

  • serlingpa, almost 2 years ago

    I am a JavaScript/Angular 2 developer who is now getting involved with deployment using Bitbucket Pipelines, Python, and Boto for S3 integration. I was only introduced to these three technologies yesterday!

    My webpack build produces a folder, dist, that contains all of the files I would like to upload to S3. I would like these files to appear in the root of the S3 bucket.

    I have the following in my bitbucket-pipelines.yml:

    image: node:5.6.0
    
    pipelines:
      default:
        - step:
            script:
              # other stuff...
              - python s3_upload.py io-master.mycompany.co.uk dist io-dist 
    

    Here is the entire Python s3_upload.py

    As you can see, the script uses put_object:

    client.put_object(
        Body=open(artefact, 'rb'),
        Bucket=bucket,
        Key=bucket_key
    )
    

    What I would like to be able to do is upload the contents of the dist folder to S3. Do I have to learn Python in order to do this, or is there a method in Boto that does this already?
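
    For reference, a minimal sketch of such a loop, adapting the put_object call above to walk dist (upload_dir and its parameters are illustrative names, not part of the original script):

    import os

    import boto3

    client = boto3.client('s3')

    def upload_dir(local_dir, bucket):
        # os.walk recurses, so nested build output (e.g. dist/assets) is included.
        for root, _dirs, files in os.walk(local_dir):
            for name in files:
                path = os.path.join(root, name)
                # Key the object relative to local_dir so files land at the bucket root.
                key = os.path.relpath(path, local_dir).replace(os.sep, '/')
                with open(path, 'rb') as body:
                    client.put_object(Body=body, Bucket=bucket, Key=key)

    upload_dir('dist', 'io-master.mycompany.co.uk')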