How to download files from s3 service to local folder


Solution 1

If you have installed the AWS PowerShell module, it may not be correctly loaded into your current session. We're identifying this as the issue because the error you specified (CommandNotFoundException) means that the given cmdlet can't be found, which usually indicates the module isn't loaded.

First, verify that the module is installed and loaded, using any of the options below:

Load the module into an existing session (PowerShell v3 and v4):

From the documentation:

In PowerShell 4.0 and later releases, Import-Module also searches the Program Files folder for installed modules, so it is not necessary to provide the full path to the module. You can run the following command to import the AWSPowerShell module. In PowerShell 3.0 and later, running a cmdlet in the module also automatically imports a module into your session.

To verify correct installation, add the following command to the beginning of your script:

PS C:\> Import-Module AWSPowerShell

Load the module into an existing session (PowerShell v2):

To verify correct installation, add the following command to the beginning of your script:

PS C:\> Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"

Open a new session with Windows PowerShell for AWS Desktop Shortcut:

A shortcut is added to your desktop that starts PowerShell with the correct module loaded into the session. If your installation was successful, this shortcut should be present and should also correctly load the AWS PowerShell module without additional effort from you.

From the documentation:

The installer creates a Start Menu group called Amazon Web Services, which contains a shortcut called Windows PowerShell for AWS. For PowerShell 2.0, this shortcut automatically imports the AWSPowerShell module and then runs the Initialize-AWSDefaults cmdlet. For PowerShell 3.0, the AWSPowerShell module is loaded automatically whenever you run an AWS cmdlet. So, for PowerShell 3.0, the shortcut created by the installer only runs the Initialize-AWSDefaults cmdlet. For more information about Initialize-AWSDefaults, see Using AWS Credentials.


Solution 2

Since this question is one of the top Google results for "powershell download s3 files" I'm going to answer the question in the title (even though the actual question text is different):

Read-S3Object -BucketName "my-s3-bucket" -KeyPrefix "path/to/directory" -Folder .

You might need to call Set-AWSCredentials if it's not a public bucket.
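For a private bucket, a minimal sketch might look like the following (the keys, bucket name, and prefix here are placeholders, not real values):

```powershell
# Provide credentials for this session (placeholder values - substitute your own)
Set-AWSCredentials -AccessKey "YOUR-ACCESS-KEY" -SecretKey "YOUR-SECRET-KEY"

# Then download every object under the prefix into the current directory
Read-S3Object -BucketName "my-s3-bucket" -KeyPrefix "path/to/directory" -Folder .
```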

Solution 3

Similar to Will's example, if you want to download the whole content of a "folder" while keeping the directory structure, try:

Get-S3Object -BucketName "my-bucket" -KeyPrefix "path/to/directory" | Read-S3Object -Folder .

The AWS documentation at https://docs.aws.amazon.com/powershell/latest/reference/items/Read-S3Object.html provides examples with fancier filtering.
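For example, one way to narrow the download to matching keys is to filter the object metadata before piping it on to Read-S3Object (the bucket name and the .csv extension below are hypothetical):

```powershell
# Download only the .csv objects under the prefix, keeping the directory structure
Get-S3Object -BucketName "my-bucket" -KeyPrefix "path/to/directory" |
    Where-Object { $_.Key -like "*.csv" } |
    Read-S3Object -Folder .
```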

Author: komali

Updated on May 11, 2020

Comments

  • komali about 4 years

    I have a requirement to download files from Simple Storage Service (S3) to a local folder, count the number of files in the local folder, check that count against S3, and then send an email with the file count.

    I tried to download files from S3, but I am getting an error: Get-S3Object CommandNotFoundException. How do I resolve this issue?

    The code I'm referencing:

    # Your account access key - must have read access to your S3 Bucket
    $accessKey = "YOUR-ACCESS-KEY"
    # Your account secret access key
    $secretKey = "YOUR-SECRET-KEY"
    # The region associated with your bucket e.g. eu-west-1, us-east-1 etc. (see http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-regions)
    $region = "eu-west-1"
    # The name of your S3 Bucket
    $bucket = "my-test-bucket"
    # The folder in your bucket to copy, including trailing slash. Leave blank to copy the entire bucket
    $keyPrefix = "my-folder/"
    # The local file path where files should be copied
    $localPath = "C:\s3-downloads\"    
    
    $objects = Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix -AccessKey $accessKey -SecretKey $secretKey -Region $region
    
    foreach($object in $objects) {
        $localFileName = $object.Key -replace $keyPrefix, ''
        if ($localFileName -ne '') {
            $localFilePath = Join-Path $localPath $localFileName
            Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -AccessKey $accessKey -SecretKey $secretKey -Region $region
        }
    }
    
  • komali over 8 years
    I need to count the number of files available in the S3 bucket. Can you please help me out with this?
  • Anthony Neace over 8 years
    You should ask a new question for this to be answered in better detail, but very generally you would invoke Get-S3Object for the given bucket and count the resulting metadata objects. This cmdlet will return batches of up to 1000 objects, so you may have to iterate it to get everything.
  • dragon788 over 3 years
    Note this probably requires more API calls than just allowing the Read-S3Object to handle the recursion and filtering.
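The counting approach suggested in the comments above could be sketched as follows (the bucket name, prefix, and local path are hypothetical, and credentials are assumed to be configured already):

```powershell
# Count the objects under a prefix in S3
$s3Count = (Get-S3Object -BucketName "my-test-bucket" -KeyPrefix "my-folder/" |
    Measure-Object).Count

# Count the files already downloaded to the local folder
$localCount = (Get-ChildItem -Path "C:\s3-downloads" -File -Recurse |
    Measure-Object).Count

Write-Host "S3 objects: $s3Count, local files: $localCount"
```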