upload large files (> 1 GB) to azure blob storage through web api


According to your code, you want to upload a large file to Azure Blob Storage as a block blob. Please note that this approach has a size limitation. For more details, please refer to the documentation:

The maximum size for a block blob created via Put Blob is 256 MB for version 2016-05-31 and later, and 64 MB for older versions. If your blob is larger than 256 MB for version 2016-05-31 and later, or 64 MB for older versions, you must upload it as a set of blocks.

So if you want to upload large files to an Azure block blob, please use the following steps:

1. Read the whole file into bytes and divide it into smaller pieces in your code.

  • For example, 8 MB for each piece.

2. Upload each piece with the Put Block API.

  • Each request carries a block ID.

3. Commit the blob with the Put Block List API.

  • In this request, you need to put all the block IDs in the body, in order.
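
If you prefer to see what those two operations look like at the REST level (for instance when a client uploads straight to storage with a SAS URL), the rough shape of the calls is sketched below. This is only a sketch: the UploadInBlocksAsync helper, the sasUrl parameter, and the 8 MB piece size are illustrative assumptions, not part of the original answer.

using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class BlockUploadSketch
{
    // "sasUrl" is a hypothetical, pre-generated blob SAS URL with write permission.
    public static async Task UploadInBlocksAsync(string sasUrl, Stream source)
    {
        using var http = new HttpClient();
        var blockIds = new List<string>();        // Put Block List needs them in order
        var buffer = new byte[8 * 1024 * 1024];   // 8 MB pieces
        int read, index = 0;

        while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            // Block IDs must be Base64-encoded and the same length within one blob.
            string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index++.ToString("d6")));

            // Step 2: Put Block - PUT <blob>?comp=block&blockid=<id> with the piece as the body.
            using var putBlock = new HttpRequestMessage(HttpMethod.Put,
                $"{sasUrl}&comp=block&blockid={Uri.EscapeDataString(blockId)}")
            {
                Content = new ByteArrayContent(buffer, 0, read)
            };
            (await http.SendAsync(putBlock)).EnsureSuccessStatusCode();

            blockIds.Add(blockId);
        }

        // Step 3: Put Block List - PUT <blob>?comp=blocklist with the IDs, in order, in the XML body.
        var body = new StringBuilder("<?xml version=\"1.0\" encoding=\"utf-8\"?><BlockList>");
        foreach (var id in blockIds)
            body.Append("<Latest>").Append(id).Append("</Latest>");
        body.Append("</BlockList>");

        using var putList = new HttpRequestMessage(HttpMethod.Put, $"{sasUrl}&comp=blocklist")
        {
            Content = new StringContent(body.ToString(), Encoding.UTF8)
        };
        (await http.SendAsync(putList)).EnsureSuccessStatusCode();
    }
}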

For example, using CloudBlockBlob from the storage client library:

[HttpPost]
[Consumes("multipart/form-data")]
[RequestFormLimits(MultipartBodyLengthLimit = 2147483648)]
public async Task<ActionResult> PostAsync([FromForm] FileRequestObject fileRequestObject)
{
    // Use your own connection string here; do not publish the real account key.
    string storageAccountConnectionString = "<your-storage-account-connection-string>";
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageAccountConnectionString);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("test");
    await container.CreateIfNotExistsAsync();

    var file = fileRequestObject.File;
    CloudBlockBlob blob = container.GetBlockBlobReference(file.FileName);

    // Put Block List commits the blocks in the order they appear here,
    // so use an ordered list (a HashSet does not guarantee order).
    List<string> blockList = new List<string>();

    const int blockSizeInBytes = 10485760; // 10 MB per piece
    long prevLastByte = 0;
    long bytesRemain = file.Length;

    byte[] bytes;

    // Read the whole upload into memory so it can be sliced into blocks.
    using (MemoryStream ms = new MemoryStream())
    {
        var fileStream = file.OpenReadStream();
        await fileStream.CopyToAsync(ms);
        bytes = ms.ToArray();
    }

    // Upload each piece with Put Block.
    do
    {
        long bytesToCopy = Math.Min(bytesRemain, blockSizeInBytes);
        byte[] bytesToSend = new byte[bytesToCopy];

        Array.Copy(bytes, prevLastByte, bytesToSend, 0, bytesToCopy);
        prevLastByte += bytesToCopy;
        bytesRemain -= bytesToCopy;

        // Create a Base64-encoded block ID (all IDs of a blob must have the same length).
        string blockId = Guid.NewGuid().ToString();
        string base64BlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId));

        await blob.PutBlockAsync(
            base64BlockId,
            new MemoryStream(bytesToSend, true),
            null);

        blockList.Add(base64BlockId);

    } while (bytesRemain > 0);

    // Commit the blob with Put Block List.
    await blob.PutBlockListAsync(blockList);

    return Ok();
}

public class FileRequestObject
{
    public IFormFile File { get; set; }
}
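
Note that the sample above buffers the whole upload in a byte array before slicing it, which is expensive for files over 1 GB. If you can use the newer Azure.Storage.Blobs (v12) package, the same Put Block / Put Block List pattern can be streamed piece by piece with StageBlockAsync and CommitBlockListAsync. The sketch below is only illustrative; the UploadLargeFileAsync helper, the "test" container name, and the 8 MB block size are assumptions, not from the original answer.

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using Microsoft.AspNetCore.Http;

public static class BlockBlobSketch
{
    public static async Task UploadLargeFileAsync(string connectionString, IFormFile file)
    {
        var container = new BlobContainerClient(connectionString, "test");
        await container.CreateIfNotExistsAsync();

        BlockBlobClient blob = container.GetBlockBlobClient(file.FileName);

        var blockIds = new List<string>();        // commit order matters
        var buffer = new byte[8 * 1024 * 1024];   // 8 MB blocks
        int read, index = 0;

        using Stream source = file.OpenReadStream();
        while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index++.ToString("d6")));

            // Stage one block (the v12 counterpart of Put Block).
            using var chunk = new MemoryStream(buffer, 0, read, writable: false);
            await blob.StageBlockAsync(blockId, chunk);

            blockIds.Add(blockId);
        }

        // Commit the block list (the v12 counterpart of Put Block List).
        await blob.CommitBlockListAsync(blockIds);
    }
}

Because each block is staged from a fixed-size buffer, the request never needs to hold more than one block in memory at a time.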

For more details, please refer to https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-4-uploading-large-blobs/


Comments

  • Ramakrishna Reddy almost 2 years

    We have a .NET Core application hosted in Azure App Service, and we are trying to upload large files to Azure Blob Storage through a Web API using form data from the UI. We have increased the request length and the API request timeout, but we still get connection timeout errors even while uploading 200 MB files.

    Below is the sample code I am using:

    [HttpPost]
    [Route("upload")]
    [Consumes("multipart/form-data")]
    [RequestFormLimits(MultipartBodyLengthLimit = 2147483648)]
    public async Task<IActionResult> Upload([FromForm] FileRequestObject fileRequestObject)
    {
        var url = "upload_url_to_blob_storage";
        var file = fileRequestObject.Files[0];

        var blob = new CloudBlockBlob(new Uri(url));
        blob.Properties.ContentType = file.ContentType;

        // IFormFile exposes the uploaded content via OpenReadStream()
        await blob.UploadFromStreamAsync(file.OpenReadStream());

        //some other operations based on file upload
        return Ok();
    }

    public class FileRequestObject
    {
        public List<IFormFile> Files { get; set; }
        public string JSON { get; set; }
        public string BlobUrls { get; set; }
    }
    
    • kgalic over 4 years
      Is it an option to use the API to generate the SAS token and then upload the file directly to the blob store? (A sketch of this approach is included after these comments.)
    • Jim Xu over 4 years
      Are you sure you're not hitting the block blob size limits for PUT operations? "The maximum size for a block blob created via Put Blob is 256 MB for version 2016-05-31 and later, and 64 MB for older versions. If your blob is larger than 256 MB for version 2016-05-31 and later, or 64 MB for older versions, you must upload it as a set of blocks." For more details, please refer to docs.microsoft.com/en-us/rest/api/storageservices/put-blob
    • Ramakrishna Reddy over 4 years
      @kgalic no, the complete upload operation is in the API; we pass stream data to the API endpoint and upload to the blob from there
    • Ramakrishna Reddy over 4 years
      Do we get timeout errors if the block blob size limit is exceeded?
    • Ivan Yang over 4 years
      @RamakrishnaReddy, please share the code you're using, and remove the personal data / secret.
    • Ramakrishna Reddy over 4 years
      @IvanYang added the code, please have a look
    • Jim Xu over 4 years
      @RamakrishnaReddy Could you please tell me how you define the FileRequestObject class?
    • Ramakrishna Reddy over 4 years
      @JimXu updated the FileRequestObject
    • Jim Xu over 4 years
      @RamakrishnaReddy Ok. I will check it
  • Kiran Ramchandra Parab almost 4 years
    It worked for me without "writable", i.e. MemoryStream(bytesToSend)
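
For completeness, here is a rough sketch of the approach kgalic suggested: let the Web API hand out a short-lived write SAS URL and have the client upload directly to Blob Storage, so the large payload never flows through the App Service. The GetUploadUrl action, the "test" container, and the one-hour expiry are assumptions for illustration, using the same CloudBlockBlob client library as the answer.

// Sketch only: issue a short-lived write SAS so the client can upload
// directly to the blob. Names and the one-hour expiry are assumptions.
[HttpGet("upload-url/{fileName}")]
public async Task<ActionResult<string>> GetUploadUrl(string fileName)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<your-storage-account-connection-string>");
    CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference("test");
    await container.CreateIfNotExistsAsync();

    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

    var policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Write,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
    };

    // The client PUTs the file (or its blocks) straight to this URL,
    // so the large payload never passes through the Web API.
    string sasToken = blob.GetSharedAccessSignature(policy);
    return Ok(blob.Uri + sasToken);
}

The client can then run the Put Block / Put Block List sequence (or any storage SDK upload) against that URL, which sidesteps the request-size and timeout limits described in the question because the data goes directly to storage.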