Access Denied using boto3 through AWS Lambda


Solution 1

It's possible that the specific S3 object you are looking for has limited permissions. Check the following (a quick way to verify case 2 programmatically is sketched after this list):

  1. Read permission is denied at the S3 object level
  2. The role attached to the Lambda does not have permission to get/read S3 objects
  3. If access is granted using an S3 bucket policy, verify that read permissions are provided
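If you want to check case 2 from a script, the IAM policy simulator API can evaluate the role's identity-based policies. This is only a minimal sketch: the role ARN, bucket, and key below are placeholders for your own values, the caller needs permission to call iam:SimulatePrincipalPolicy, and bucket policies are not evaluated by this call.

import boto3

iam = boto3.client("iam")

# Placeholders - replace with your Lambda's execution role ARN and target object ARN
role_arn = "arn:aws:iam::123456789012:role/my-lambda-role"
object_arn = "arn:aws:s3:::test/some.text"

# Ask IAM whether the role's own (identity-based) policies allow s3:GetObject.
# Note: this does not take S3 bucket policies into account.
response = iam.simulate_principal_policy(
    PolicySourceArn=role_arn,
    ActionNames=["s3:GetObject"],
    ResourceArns=[object_arn],
)

for result in response["EvaluationResults"]:
    print(result["EvalActionName"], "->", result["EvalDecision"])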

Solution 2

Omuthu's answer correctly identified my problem, but it didn't provide a solution, so I thought I'd do that.

It's possible that when you set up your permissions in IAM you made something like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        }
    ]
}

Unfortunately, that's not correct. You need to apply the object permissions to the objects in the bucket, so it has to look like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::test/*"
            ]
        }
    ]
}

Note the second ARN with the /* at the end of it.
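If you would rather apply the corrected policy from code than from the console, here is a minimal sketch using boto3. The role name and policy name are placeholders, and the caller needs iam:PutRolePolicy:

import json
import boto3

iam = boto3.client("iam")

# Placeholder - your Lambda's execution role name
role_name = "my-lambda-role"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": ["s3:ListBucket"],
         "Resource": ["arn:aws:s3:::test"]},
        {"Effect": "Allow", "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
         "Resource": ["arn:aws:s3:::test/*"]},
    ],
}

# Attach the statement above as an inline policy on the role
iam.put_role_policy(
    RoleName=role_name,
    PolicyName="s3-test-bucket-access",
    PolicyDocument=json.dumps(policy),
)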

Solution 3

I had a similar problem; I solved it by attaching the appropriate policy to my user.

IAM -> Users -> Username -> Permissions -> Attach policy.

Also make sure you configure the correct access key and secret access key; you can do so using the AWS CLI.
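The attach-policy step can also be scripted with boto3 if that is more convenient. A minimal sketch, assuming a placeholder user name and using the AWS-managed AmazonS3ReadOnlyAccess policy as an example:

import boto3

iam = boto3.client("iam")

# Placeholder user name; the ARN below is the AWS-managed read-only S3 policy
iam.attach_user_policy(
    UserName="my-user",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)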

Solution 4

I had a similar problem; the difference was that the bucket was encrypted with a KMS key.

Fixed with: IAM -> Encryption keys -> YOUR_AWS_KMS_KEY -> add your policy or account.
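For objects encrypted with SSE-KMS, the reader needs kms:Decrypt on the key in addition to s3:GetObject (and, depending on the key policy, may also need to be added as a key user, which is what the console path above does). A minimal sketch, assuming a placeholder role name and key ARN:

import json
import boto3

iam = boto3.client("iam")

# Placeholders - your Lambda's execution role and the KMS key that encrypts the bucket
role_name = "my-lambda-role"
kms_key_arn = "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab"

# Allow the role to decrypt objects encrypted with this key
iam.put_role_policy(
    RoleName=role_name,
    PolicyName="kms-decrypt-for-s3",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow", "Action": "kms:Decrypt", "Resource": kms_key_arn}],
    }),
)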

Solution 5

Adding to Amri's answer, if your bucket is private and you have the credentials to access it, you can use boto3.client:

import boto3
s3 = boto3.client('s3', aws_access_key_id='ACCESS_KEY', aws_secret_access_key='SECRET_KEY')
response = s3.get_object(Bucket='BUCKET', Key='KEY')

*For this file: s3://bucket/a/b/c/some.text, Bucket is 'bucket' and Key is 'a/b/c/some.text'

---EDIT---

You can easily change the script to read the keys from environment variables, for instance, so they are not hardcoded. I left it like this for simplicity.
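For example, a minimal sketch of that approach; if you use the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variable names, boto3 will actually pick them up automatically even without passing them explicitly:

import os
import boto3

# Read the credentials from the environment instead of hardcoding them
s3 = boto3.client(
    's3',
    aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
)
response = s3.get_object(Bucket='BUCKET', Key='KEY')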



Author: Hello lad

Updated on July 09, 2022

Comments

  • Hello lad (almost 2 years ago):

    I use a data processing pipeline built from

    S3 + SNS + Lambda

    Because S3 cannot send notifications outside of its storage region, I made use of SNS to send the S3 notification to a Lambda in another region.

    The Lambda function is coded as:

    from __future__ import print_function
    import boto3


    def lambda_handler(event, context):
        # Pull the bucket name and object key out of the S3 event record
        input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
        input_file_key = event["Records"][0]["s3"]["object"]["key"]

        input_file_name = input_file_bucket + "/" + input_file_key

        # Read the object; this is the call that raises AccessDenied
        s3 = boto3.resource("s3")
        obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
        response = obj.get()

        return event  # echo first key values


    When I ran Save and Test, I got the following error:

    {
      "stackTrace": [
        [
          "/var/task/lambda_function.py",
          20,
          "lambda_handler",
          "response = obj.get()"
        ],
        [
          "/var/runtime/boto3/resources/factory.py",
          394,
          "do_action",
          "response = action(self, *args, **kwargs)"
        ],
        [
          "/var/runtime/boto3/resources/action.py",
          77,
          "__call__",
          "response = getattr(parent.meta.client, operation_name)(**params)"
        ],
        [
          "/var/runtime/botocore/client.py",
          310,
          "_api_call",
          "return self._make_api_call(operation_name, kwargs)"
        ],
        [
          "/var/runtime/botocore/client.py",
          395,
          "_make_api_call",
          "raise ClientError(parsed_response, operation_name)"
        ]
      ],
      "errorType": "ClientError",
      "errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied"
    }
    

    I configured the lambda Role with

    full S3 access
    

    and set a bucket policy on my target bucket:

    everyone can do anything (list, delete, etc.)
    

    It seems that I haven't set the policy up correctly.

  • Patrick Perini (almost 8 years ago):
    This is pretty vague. Could you possibly point out how one might address this issue?
  • omuthu (almost 8 years ago):
    Two possibilities 1. S3 object level permission for read is denied 2. The role attached to lambda does not have permission to get/read S3 objects
  • Vitaly Zdanevich (over 6 years ago):
    For me, adding s3:GetObject to the policy helped.
  • tedder42 (almost 6 years ago):
    This is a bad idea in a Lambda function. There's no reason to hardcode keys.
  • Tal Joffe (almost 6 years ago):
    @tedder42 Sometimes it makes sense; let's say you need to copy between buckets with different permissions.
  • Pedro (over 5 years ago):
    I had to write a bucket policy to grant access. I had to add to the bucket policy:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Allow All",
                "Effect": "Allow",
                "Principal": {
                    "AWS": ["arn:aws:iam::<userid>:user/<username>"]
                },
                "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
                "Resource": "arn:aws:s3:::<resource-name>/*"
            }
        ]
    }
  • chaikov (over 4 years ago):
    Hey, where can I see these settings? I cannot find them anywhere...
  • Anatolii Stepaniuk (almost 4 years ago):
    s3:ListBucket is needed to create bucket
  • DrCord (over 3 years ago):
    Also worth pointing out that AWS S3 returns Access Denied for objects that don't exist so as to not reveal whether an object exists or not...
  • DJ_Stuffy_K (about 3 years ago):
    I have s3* permissions but I'm still getting the error.
  • Moo (over 2 years ago):
    The Amplify CLI set it up the incorrect way when I added a storage trigger. This was a frustrating bug...
  • george (about 2 years ago):
    You saved me! :) I need to buy you a coffee!