Error when trying to use aws cli and buckets with periods

Solution 1

Had this same problem today. The solution is specifying your region.

What wasn't working:

aws s3 sync --acl public-read dist/ s3://some.bucket.name/

What works now:

aws s3 --region us-east-1 sync --acl public-read dist/ s3://some.bucket.name/

EDIT: If all your buckets are going to exist in the same region, run the following from the command line (on OS X):

aws configure

If you haven't supplied your AWS keys yet, do so when prompted. The third question asks for the default region name; in my case I entered:

us-east-1

Now I no longer have to specify region unless the bucket is outside my default region.
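
If you prefer not to re-run the interactive prompt, the same default can be set another way; this is standard AWS CLI configuration rather than something from the original answer, so treat it as a sketch. You can put the region in ~/.aws/config:

[default]
region = us-east-1

or export it as an environment variable for the current shell session:

export AWS_DEFAULT_REGION=us-east-1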

Solution 2

I encountered a similar problem with my account.

I found (through trial and error) that when I used the wrong region identifier, I got the Errno 8 error.

Here's what worked:

  • us-west-2

Here's what didn't work:

  • us-west
  • us-west-2a

I couldn't find any definitive help about what would work (or not work) when specifying a region identifier, so this answer is completely empirical and may not help in all cases.
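
One way to check which identifier a given bucket actually expects (assuming your CLI build includes the s3api subcommands) is to ask S3 directly, reusing the bucket name from the question:

aws s3api get-bucket-location --bucket some.bucket.name

The response contains a LocationConstraint value such as us-west-2; note that buckets created in us-east-1 report a null LocationConstraint.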

Comments

  • Kissenger
    Kissenger over 1 year

    I am trying to deploy a web app I have written, but I am stuck on one element. The bulk of it is just an Angular application that interacts with a MongoDB database; that's all fine. Where I am stuck is that I need local read access to around 10 GB of files (geoTIFF digital elevation models). These don't change and are broken down into 500 or so files. Each time my app needs geographic elevations, it needs to find the right file, read the right part of it, and return the data, the quicker the better. To reiterate, I am not serving these files, just reading data from them.

    In development these files are on my machine and I have no problems, but the files seem to be too large to bundle in the Angular app (it runs out of memory), and too large to include in any backend assets folder. I've looked at two serverless cloud hosting platforms (GCP and Heroku), both of which limit the size of the deployed files to around 1 GB (if I remember right). I have considered using cloud storage for the files, but I'm worried about poor performance, as each time I need a file it would have to be downloaded from the cloud to the application. The only solution I can think of is to use a VM-based service like Google Compute and use an API service to receive requests from the app and deliver back the required data, but I had hoped it could be more co-located (not least because that solution costs more $$)...

    I'm new to deployment so any advice welcome.

    • EEAA
      EEAA over 9 years
      Where is the documentation you cite that says periods are valid?
    • vjones
      vjones over 9 years
      I don't know if this helps, but I just tried to duplicate this problem and I had no issues doing an ls or otherwise operating on a bucket with a period. The CLI version reports as aws-cli/1.3.21 Python/2.6.9 Linux/3.10.42-52.145.amzn1.x86_64. The installed boto packages are python-boto-2.30.0-1.0.amzn1.noarch and python-botocore-0.55.0-1.1.amzn1.noarch. I'm running on Amazon Linux and did a yum update before and after doing this test.
    • Roger Gilbrat
      Roger Gilbrat over 9 years
      I'm running this from OS X. I've downloaded the aws cli from the AWS site and it seems to come with botocore-0.57.0. How do I tell what version of boto I'm running? How do I upgrade it? I also use Google's gsutil, which also seems to use boto, so maybe there is a conflict? Is there a better place to get the aws cli?
    • Roger Gilbrat
      Roger Gilbrat over 9 years
      Update: I installed pip and installed boto, then reinstalled the aws-cli and still get the same issue.
    • dialt0ne
      dialt0ne over 9 years
      Is the location for the bucket and the S3 API endpoint you are using in the same region?
  • Christoffer
    Christoffer over 9 years
    If you use Ireland as your region you need to set this to eu-west-1 (and not ireland, even though that is what the bucket says its region is).
  • Kissenger
    Kissenger almost 4 years
    Thanks for the answer. What are the advantages of a GIS-specific DB as opposed to something like Mongo? I ask because I am already using MongoDB in the same scope, so it would seem reasonable to leverage my knowledge of that rather than have to learn something new...
  • Dave
    Dave almost 4 years
    You could use MongoDB's GIS support (see intro here), but you'll have to convert your geoTIFFs to GeoJSON, and at only 10 GB of data it's unlikely you'll see a performance advantage. PostGIS will be easier to load and query, but will require you to support two DBs.
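
For the geoTIFF-to-GeoJSON conversion Dave mentions, one possible route is GDAL's contour tool; this is only a sketch, and the file names and contour interval below are placeholders rather than anything from this thread:

gdal_contour -a elev -i 10 -f GeoJSON dem.tif contours.geojson

Whether contours (or some other vectorisation) make sense depends on how the elevation data will be queried, so this is illustrative only.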