Access denied (SA doesn't have storage.objects.create access) when trying to upload using a presigned URL to Google Cloud Storage

The error you're getting means that your Cloud Function's service account lacks the storage.objects.create permission.

In order to fix it, you can either give your service account a predefined role like Storage Object Creator or create a custom role with that permission.
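
For example, granting the predefined role on the bucket could look like this (a sketch reusing the service account and bucket placeholders from your question):

    gsutil iam ch serviceAccount:urlsigner@<project-id>.iam.gserviceaccount.com:roles/storage.objectCreator gs://<bucket-name>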


EDIT

I see that I missed that your service account has the Storage Admin role, which includes storage.objects.* permissions and thus should be more than enough.

I would recommend trying the following options:

  1. Remove the Storage Admin role from your function's service account and then add it again. Make sure to wait a couple of minutes so that the change can propagate properly.
  2. If the first option doesn't work for you, try giving the function's service account the Storage Object Creator and Service Account Token Creator roles separately (see the sketch after this list).
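
For option 2, the grants could look something like this with gcloud (project-level bindings as a sketch; adjust the project ID and service account to yours):

    gcloud projects add-iam-policy-binding <project-id> \
        --member="serviceAccount:urlsigner@<project-id>.iam.gserviceaccount.com" \
        --role="roles/storage.objectCreator"

    gcloud projects add-iam-policy-binding <project-id> \
        --member="serviceAccount:urlsigner@<project-id>.iam.gserviceaccount.com" \
        --role="roles/iam.serviceAccountTokenCreator"

The Token Creator role matters here because, when no private key is available locally, the client library signs the URL through the IAM signBlob API.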

Let me know if it helps.

Comments

  • James over 1 year

    Having issues trying to allow a client to upload a file via a presigned URL.

    Error received

    <?xml version='1.0' encoding='UTF-8'?>
    <Error>
    <Code>AccessDenied</Code>
    <Message>Access denied.</Message>
    <Details>
    urlsigner@<project>.iam.gserviceaccount.com does not have storage.objects.create access to <bucket-name>/<filename>.pdf.
    </Details>
    </Error>
    

    The code executes in a GCP Cloud Function and is used for generating the signed URL. The function's service account was changed from the default to urlsigner.
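    Roughly how the function was deployed (the runtime and trigger below are placeholders, not the exact command used):

    gcloud functions deploy createSignedUrl \
        --runtime=nodejs16 \
        --trigger-http \
        --service-account=urlsigner@<project-id>.iam.gserviceaccount.com

    The signing code itself: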

    async createSignedUrl({
        contentType,
        fileName,
        userId,
      }: SignedUrlRequest): Promise<string> {
        // bucket() is synchronous, so no await is needed here
        const bucket = this.cloudStorage.bucket(BUCKET_NAME);
        // Debug logging: compare the moment-based and plain-ms expiry values
        console.log(
          moment.utc().add(15, 'minutes').toDate(),
          Date.now() + 15 * 60 * 1000
        );
        // GetSignedUrlConfig is exported by @google-cloud/storage
        const options: GetSignedUrlConfig = {
          version: 'v4',
          action: 'write',
          expires: Date.now() + 15 * 60 * 1000, // 15 minutes from now
          contentType,
        };
        const [signedUrl] = await bucket
          .file(`${userId}-${fileName}`)
          .getSignedUrl(options);

        return signedUrl;
      }
    

    Temporary CORS JSON file

    [
      {
        "maxAgeSeconds": 3600,
        "method": ["GET", "POST", "PUT", "DELETE", "OPTIONS"],
        "origin": ["*"],
        "responseHeader": [
          "Content-Type",
          "Authorization",
          "Content-Length",
          "User-Agent",
          "x-goog-resumable"
        ]
      }
    ]
    

    Steps used to create the GCP resources

    gcloud iam service-accounts create urlsigner --display-name="GCS URL Signer" --project=<project-id>

    gcloud iam service-accounts keys create service_account.json --iam-account=urlsigner@<project-id>.iam.gserviceaccount.com

    gsutil mb gs://<bucket-name>

    gsutil iam ch serviceAccount:urlsigner@<project-id>.iam.gserviceaccount.com:roles/storage.admin gs://<bucket-name>

    gsutil cors set cors-json-file.json gs://<bucket-name>
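
    To confirm the binding actually landed on the bucket, the policy can be read back (this just prints what was set above):

    gsutil iam get gs://<bucket-name>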

    The upload code

    export async function uploadDoc(preSignedUrl: string, file: File) {
      return await new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        xhr.open('PUT', preSignedUrl, true);
        xhr.onload = () => {
          if (xhr.status === 200) {
            console.log('uploaded');
            resolve('good to go');
          } else {
            console.log('failed to upload');
            reject('not good to go');
          }
        };

        xhr.onabort = () => {
          reject('upload aborted');
        };

        xhr.onerror = () => {
          reject('upload failed');
        };

        // Must match the contentType the URL was signed with
        xhr.setRequestHeader('Content-Type', file.type);
        // Send the raw file; wrapping it in FormData would upload a
        // multipart-encoded body rather than the file itself
        xhr.send(file);
      });
    }
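
    For reference, a usage sketch from the browser side (the /signed-url endpoint is hypothetical; it stands in for whatever calls the Cloud Function):

    async function handleUpload(file: File) {
      // Hypothetical backend endpoint that invokes createSignedUrl
      const res = await fetch('/signed-url', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ fileName: file.name, contentType: file.type }),
      });
      const { signedUrl } = await res.json();
      await uploadDoc(signedUrl, file);
    }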
    

    Lastly, I verified that the Cloud Function's service account has the Service Account Token Creator role needed for signing the blob that gets sent back to the front end, and that this service account is listed under the bucket's permissions with the Storage Admin role. So at this point I'm just scratching my head as to why I'm getting an access-denied error for the service account lacking permissions.
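
    One way to check what the service account can actually do at runtime is Bucket#iam.testPermissions from @google-cloud/storage (a debugging sketch; it tests the credentials making the call, so it has to run inside the function):

    import { Storage } from '@google-cloud/storage';

    const storage = new Storage();

    async function checkCreatePermission(bucketName: string) {
      // Returns which of the requested permissions the caller holds
      const [permissions] = await storage
        .bucket(bucketName)
        .iam.testPermissions(['storage.objects.create']);
      console.log(permissions); // e.g. { 'storage.objects.create': true }
    }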

    Thanks in advance to whoever can assist.

    P.S. I referenced this article as well: https://medium.com/google-cloud/upload-download-files-from-a-browser-with-gcs-signed-urls-and-signed-policy-documents-f66fff8e425

    • James over 3 years
      Issue resolved itself without any changes by me. My guess is something on the server side got off and my updates to the service account weren't being honored.
  • James over 3 years
    But doesn't Storage Admin include that anyway?
  • Deniss T. over 3 years
    Thanks for pointing that out; I've updated the answer. Please let me know if it helps.