Permission denied on AWS Transfer for SFTP server

Solution 1

User Role should be:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowListingOfUserFolder",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::BUCKET_NAME"
            ]
        },
        {
            "Sid": "HomeDirObjectAccess",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObjectVersion",
                "s3:DeleteObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::BUCKET_NAME/*"
        }
    ]
}

Trust relationship of User:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "transfer.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

The home directory for your user should be /BUCKET_NAME.
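
If you manage the role itself with Terraform (as in Solution 2 below), a minimal sketch of the same policy and trust relationship might look like this; the resource names and BUCKET_NAME are placeholders:

# IAM role that AWS Transfer assumes on behalf of the SFTP user.
resource "aws_iam_role" "sftp_user_role" {
  name = "sftp-user-role"

  # Trust relationship: only transfer.amazonaws.com may assume this role.
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "transfer.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

# Inline policy carrying the bucket-listing and object permissions shown above.
resource "aws_iam_role_policy" "sftp_user_policy" {
  name = "sftp-user-policy"
  role = aws_iam_role.sftp_user_role.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Sid      = "AllowListingOfUserFolder"
        Effect   = "Allow"
        Action   = ["s3:ListBucket", "s3:GetBucketLocation"]
        Resource = "arn:aws:s3:::BUCKET_NAME"
      },
      {
        Sid    = "HomeDirObjectAccess"
        Effect = "Allow"
        Action = [
          "s3:PutObject",
          "s3:GetObject",
          "s3:DeleteObjectVersion",
          "s3:DeleteObject",
          "s3:GetObjectVersion"
        ]
        Resource = "arn:aws:s3:::BUCKET_NAME/*"
      }
    ]
  })
}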

Solution 2

I had issues with this until I specifically added the s3:GetObject permission to the aws_transfer_user policy. I expected s3:ListBucket to be enough, but it was not: sftp> ls would fail until I had GetObject.

Here's the Terraform for it:

resource "aws_transfer_user" "example-ftp-user" {
  count                     = length(var.uploader_users)
  user_name                 = var.uploader_users[count.index].username

  server_id                 = aws_transfer_server.example-transfer.id
  role                      = aws_iam_role.sftp_content_incoming.arn
  home_directory_type       = "LOGICAL"

  home_directory_mappings {
      entry = "/"
      target = "/my-bucket/$${Transfer:UserName}"
    }

    policy = <<POLICY
{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Sid": "AllowSftpUserAccessToS3",
        "Effect": "Allow",
        "Action": [
          "s3:ListBucket",
          "s3:PutObject",
          "s3:GetObject",
          "s3:DeleteObjectVersion",
          "s3:DeleteObject",
          "s3:GetObjectVersion",
          "s3:GetBucketLocation"
        ],
        "Resource": [
          "${aws_s3_bucket.bucket.arn}/${var.uploader_users[count.index].username}",
          "${aws_s3_bucket.bucket.arn}/${var.uploader_users[count.index].username}/*"
        ]
      }
    ]
}
POLICY
}

I define the users in a .tfvars file, e.g.:

uploader_users = [
  {
    username = "firstuser"
    public_key = "ssh-rsa ...."
  },
  {
    username = "seconduser"
    public_key = "ssh-rsa ..."
  },
  {
    username = "thirduser"
    public_key = "ssh-rsa ..."
  }
]
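
Note that the public_key values above are not consumed by aws_transfer_user itself; each key has to be registered with a separate resource. A sketch under the same variable layout (untested against the rest of this configuration):

resource "aws_transfer_ssh_key" "example-ftp-user-key" {
  count     = length(var.uploader_users)
  server_id = aws_transfer_server.example-transfer.id
  user_name = aws_transfer_user.example-ftp-user[count.index].user_name
  body      = var.uploader_users[count.index].public_key
}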

I hope this helps someone. It took me a lot of tinkering before I finally got this working, and I'm not 100% sure what interactions with other policies might ultimately be in play. But after applying this, I could finally connect and list bucket contents without getting "Permission denied".

Comments

  • user11020868
    user11020868 almost 2 years

    I can log into my server with Cyberduck or FileZilla, but I cannot read my home directory. The S3 bucket "mybucket" exists. In Cyberduck I see

    "Cannot readdir on root. Please contact your web hosting service provider for assistance." and in FileZilla "Error: Reading directory .: permission denied"

    even though I can connect to the server.

    Am I missing some user permission in the policies below?

    These are my permissions

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "VisualEditor0",
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:GetBucketLocation"
                ],
                "Resource": "arn:aws:s3:::MYBUCKET"
            },
            {
                "Sid": "VisualEditor1",
                "Effect": "Allow",
                "Action": [
                    "s3:PutObject",
                    "s3:GetObject",
                    "s3:DeleteObject"
                ],
                "Resource": "arn:aws:s3:::MYBUCKET/*"
            },
            {
                "Sid": "VisualEditor2",
                "Effect": "Allow",
                "Action": "transfer:*",
                "Resource": "*"
            }
        ]
    }
    

    These are my trust relationships:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "Service": "s3.amazonaws.com"
                },
                "Action": "sts:AssumeRole"
            },
            {
                "Effect": "Allow",
                "Principal": {
                    "Service": "transfer.amazonaws.com"
                },
                "Action": "sts:AssumeRole"
            }
        ]
    }
    
    • Admin
      Admin over 5 years
      Do you use AWS SFTP? You haven't mentioned it in the question
    • Admin
      Admin over 5 years
      Yes, I am using AWS SFTP.
  • Jude Niroshan
    Jude Niroshan over 5 years
    This should be the accepted answer!
  • user11020868
    user11020868 over 5 years
    Thanks, this resolves my issue.
  • Warren Krewenki
    Warren Krewenki about 5 years
    This answer just saved me a lot of heartache. I was setting up SFTP and my default role/policy had a trust relationship with s3.amazonaws.com. Connecting would give me an error stating "Unable to AssumeRole". The real problem was that I needed a trust relationship with transfer.amazonaws.com instead of s3.amazonaws.com.
  • Algeriassic
    Algeriassic about 5 years
    Please mark it as the accepted answer.
  • user1393608
    user1393608 over 4 years
    I want to allow the user only to put objects, i.e. remove "s3:GetObject", "s3:DeleteObjectVersion", "s3:DeleteObject", "s3:GetObjectVersion". But with that I cannot list objects in the home directory. Any solution to resolve this would be greatly appreciated. (See the sketch below.)
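
    A sketch of such an upload-only policy (MYBUCKET is a placeholder). In a role policy, listing is governed by s3:ListBucket on the bucket ARN, not by the object-level Get actions, so listing can be kept while everything except PutObject is dropped:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowListingOfHomeFolder",
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:GetBucketLocation"
                ],
                "Resource": "arn:aws:s3:::MYBUCKET"
            },
            {
                "Sid": "AllowUploadOnly",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": "arn:aws:s3:::MYBUCKET/*"
            }
        ]
    }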
  • CSR
    CSR about 4 years
    I am able to work with a user-specific home folder in the same bucket with this approach. But when working with the logical directories approach, as mentioned here (github.com/aws-samples/transfer-for-sftp-logical-directories), I am getting an Access denied error in FileZilla and the user is unable to log in. Please help.
  • Tommy
    Tommy about 3 years
    oh my goodness thank you so much; this answer combined with @WarrenKrewenki's finally fixed this for me after hours of pain and suffering! I was using this via pysftp and just getting PermissionError: [Errno 13] Unable to assume role, which is obviously not a lot of info!!
  • Algeriassic
    Algeriassic about 3 years
    Happy that it helped you fix the issue!
  • sylr
    sylr over 2 years
    If your S3 bucket is encrypted with a KMS key, you also need to allow the role to use that key in the policy, see: stackoverflow.com/a/54241647/4091202
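
    A sketch of the extra statement for an SSE-KMS bucket (the key ARN is a placeholder):

    {
        "Sid": "AllowUseOfBucketKmsKey",
        "Effect": "Allow",
        "Action": [
            "kms:Decrypt",
            "kms:Encrypt",
            "kms:GenerateDataKey"
        ],
        "Resource": "arn:aws:kms:REGION:ACCOUNT_ID:key/KEY_ID"
    }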