Uploading large files to Controller in chunks using HttpClient, IFormFile always empty

Solution 1

The issue was that I was using a StreamContent instead of a ByteArrayContent to represent my file chunks. Here's what I ended up with:

public async Task<bool> UploadFileAsync(Guid id, string name, Stream file)
{
    int chunkSize = 2097152; // 2MB
    int totalChunks = (int)(file.Length / chunkSize);
    if (file.Length % chunkSize != 0)
    {
        totalChunks++;
    }

    for (int i = 0; i < totalChunks; i++)
    {
        long position = i * (long)chunkSize;
        int toRead = (int)Math.Min(file.Length - position, chunkSize);
        byte[] buffer = new byte[toRead];

        // Stream.ReadAsync may return fewer bytes than requested, so read
        // until the buffer is full.
        int read = 0;
        while (read < toRead)
        {
            int bytesRead = await file.ReadAsync(buffer, read, toRead - read);
            if (bytesRead == 0) break; // end of stream
            read += bytesRead;
        }

        using (MultipartFormDataContent form = new MultipartFormDataContent())
        {
            // The third argument (the file name) is what lets the server
            // bind this part as an IFormFile.
            form.Add(new ByteArrayContent(buffer), "files", name);
            form.Add(new StringContent(id.ToString()), "id");
            var meta = JsonConvert.SerializeObject(new ChunkMetaData
            {
                UploadUid = id.ToString(),
                FileName = name,
                ChunkIndex = i,
                TotalChunks = totalChunks,
                TotalFileSize = file.Length,
                ContentType = "application/unknown"
            });
            form.Add(new StringContent(meta), "metaData");
            var response = await Client.PostAsync("/api/Upload", form).ConfigureAwait(false);

            // Bail out on failure rather than returning after the first chunk.
            if (!response.IsSuccessStatusCode)
            {
                return false;
            }
        }
    }
    return true;
}
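For what it's worth, the decisive difference is probably not ByteArrayContent versus StreamContent as such, but the third argument to form.Add: ASP.NET Core's IFormFile binder only materializes a file when the part's Content-Disposition header carries a filename. A hedged sketch, reusing the same Client, name, and buffer from above, showing that StreamContent also binds once a file name is supplied:

```csharp
using (var form = new MultipartFormDataContent())
using (var ms = new MemoryStream(buffer))
{
    var part = new StreamContent(ms);
    part.Headers.ContentType =
        new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");

    // Without the file-name argument the Content-Disposition header has no
    // filename="..." token, and the server-side IFormFile stays null/empty.
    form.Add(part, "files", name);

    var response = await Client.PostAsync("/api/Upload", form);
}
```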

Solution 2

Your parameter is empty because you're not sending an array of files, just a single file, so the binding fails and you get null. The act of chunking (which you aren't actually even doing) does not produce an IEnumerable<IFormFile>; it's still just an IFormFile.
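For reference, the binder fills IEnumerable<IFormFile> from however many multipart parts share the field name, one part per file; as the comments below note, a single part still binds as a one-element collection. A sketch (field name "files" matching the action parameter; client, bytesA, and bytesB are illustrative):

```csharp
using (var form = new MultipartFormDataContent())
{
    // Two parts under the same name "files" arrive as a two-element
    // IEnumerable<IFormFile>; one part arrives as a one-element collection.
    form.Add(new ByteArrayContent(bytesA), "files", "a.bin");
    form.Add(new ByteArrayContent(bytesB), "files", "b.bin");
    var response = await client.PostAsync("/api/Upload", form);
}
```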

While you do need to send multipart/form-data because you're posting both a file and other form data, I think you're misunderstanding what it actually does. It simply means the request body contains multiple parts with different mime types; it does not mean the file is uploaded in multiple pieces, which seems to be what you think it does.
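For illustration, a multipart/form-data request body looks roughly like this on the wire (boundary value made up); "multiple parts" refers to the sections between the boundary lines, not to multiple uploads:

```
POST /api/Upload HTTP/1.1
Content-Type: multipart/form-data; boundary=----XYZ

------XYZ
Content-Disposition: form-data; name="id"

3f2504e0-4f89-11d3-9a0c-0305e82c3301
------XYZ
Content-Disposition: form-data; name="files"; filename="big.bin"
Content-Type: application/octet-stream

<binary chunk bytes>
------XYZ--
```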

The actual act of streaming the upload happens on the server side. It's about how the server chooses to handle the file being uploaded, not about how the client is uploading it. More specifically, any sort of model binding, particularly to an IFormFile, will cause the file to be spooled to disk first and then passed into your action. In other words, if you're accepting an IFormFile, you've already lost the battle: the file has already been fully transferred from the client to your server.

The ASP.NET Core docs show how to actually stream the upload, and unsurprisingly there's a fair bit of code involved, none of which you currently have. You basically have to turn model binding off entirely for the action and parse the request body yourself, being careful to read from the stream in chunks and not do anything that forces the entire thing into memory at once.
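A compressed sketch of what that streaming approach looks like; DisableFormValueModelBinding is a small custom filter from the docs sample (not built in), and the temp-file destination is illustrative:

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

[HttpPost]
[Route("api/UploadStream")]
[DisableFormValueModelBinding] // custom filter that removes the form-binding value providers
public async Task<IActionResult> UploadStream()
{
    var boundary = HeaderUtilities.RemoveQuotes(
        MediaTypeHeaderValue.Parse(Request.ContentType).Boundary).Value;
    var reader = new MultipartReader(boundary, Request.Body);

    MultipartSection section;
    while ((section = await reader.ReadNextSectionAsync()) != null)
    {
        if (ContentDispositionHeaderValue.TryParse(section.ContentDisposition, out var disposition)
            && disposition.FileName.HasValue)
        {
            // Copy the section body straight to disk in buffered reads;
            // the whole file is never held in memory at once.
            using (var target = System.IO.File.Create(System.IO.Path.GetTempFileName()))
            {
                await section.Body.CopyToAsync(target);
            }
        }
    }
    return Ok();
}
```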


Author: NuclearProgrammer

Updated on June 04, 2022

Comments

  • NuclearProgrammer
    NuclearProgrammer almost 2 years

    I am trying to create a .Net Standard "Client" class for uploading (sometimes very large) files to a Controller. I want to do this by breaking the file into chunks and uploading them one at a time. The intent is for other applications to use this instead of communicating directly to the Web Api.

    I already have the Controller working. I've verified that it works using a Kendo-ui control which supports chunk-saving.

The issue I am having is that the IEnumerable<IFormFile> files parameter for my controller is always empty when posted from my client class.

    Controller

    [Route("api/Upload")]
    public ActionResult ChunkSave(IEnumerable<IFormFile> files, string metaData, Guid id)
    {
        MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(metaData));
        var serializer = new DataContractJsonSerializer(typeof(ChunkMetaData));
        ChunkMetaData somemetaData = serializer.ReadObject(ms) as ChunkMetaData;
    
        // The Name of the Upload component is "files"
        if (files != null)
        {
            // If this is the first chunk, try to delete the file so that we don't accidentally
            // end up appending new bytes to the old file.
            if (somemetaData.ChunkIndex == 0)
            {
                _io.DeleteFile(id, Path.GetFileName(somemetaData.FileName));
            }
    
            foreach (var file in files)
            {
                // Some browsers send file names with full path. This needs to be stripped.
                 _io.AppendToFile(id, Path.GetFileName(somemetaData.FileName), file.OpenReadStream());
            }
        }
    
        FileResult fileBlob = new FileResult();
        fileBlob.uploaded = somemetaData.TotalChunks - 1 <= somemetaData.ChunkIndex;
        fileBlob.fileUid = somemetaData.UploadUid;
        return new JsonResult(fileBlob);
    }
    

    Client:

    public class FileTransferClient
    {
        HttpClient Client { get; set; } 
    
        public FileTransferClient(Uri apiUrl)
        {
            this.Client = new HttpClient(new HttpClientHandler() { UseDefaultCredentials = true })
            {
                BaseAddress = apiUrl
            };
            this.Client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
        }
    
        public async Task<bool> UploadFile(Guid id, Stream file, string name, string contentType)
        {
            bool ret = true;
            int chunckSize = 2097152; //2MB
            int totalChunks = (int)(file.Length / chunckSize);
            if (file.Length % chunckSize != 0)
            {
                totalChunks++;
            }
    
            for (int i = 0; i < totalChunks; i++)
            {
                long position = (i * (long)chunckSize);
                int toRead = (int)Math.Min(file.Length - position + 1, chunckSize);
                byte[] buffer = new byte[toRead];
                await file.ReadAsync(buffer, 0, toRead);
    
                MultipartFormDataContent content = new MultipartFormDataContent();
                content.Add(new StringContent(id.ToString()), "id");
                var meta = JsonConvert.SerializeObject(new ChunkMetaData
                {
                    UploadUid = id.ToString(),
                    FileName = name,
                    ChunkIndex = i,
                    TotalChunks = totalChunks,
                    TotalFileSize = file.Length,
                    ContentType = contentType
                });
                content.Add(new StringContent(meta), "metaData");
                using (var ms = new MemoryStream(buffer))
                {
                    content.Add(new StreamContent(ms),"files");
                    var response = await Client.PostAsync("/api/Upload", content).ConfigureAwait(false);
                    if (!response.IsSuccessStatusCode)
                    {
                        ret = false;
                        break;
                    }
                }
            }
            return ret;
        }
    }
    
  • NuclearProgrammer
    NuclearProgrammer over 5 years
    I think you may have missed the part in the client where the file was being broken into 2MB chunks and sent in multiple Client.PostAsync() calls. The IEnumerable<IFormFile> allows for multiple files to be uploaded in the same HTTP request. It's used elsewhere in my application with <input type='file' name='files' multiple />. When uploading a large file the expected behavior is for the client to submit multiple requests (one for each chunk) and the controller to service those requests one at a time (with the files collection only containing a single file/chunk at a time).
  • Chris Pratt
    Chris Pratt over 5 years
    If there's multiple requests, then there will be multiple calls to this single action, each with just one IFormFile.
  • NuclearProgrammer
    NuclearProgrammer over 5 years
    The IEnumerable isn't there to handle the multiple chunks. It's just one of the accepted signatures for a Kendo UI control I am using elsewhere. It works whether it is a single IFormFile named "files" or an IEnumerable<IFormFile> named "files" (the model binding treats multiple form values with the same name as multiple entries in the collection) when posting multiple files, or a collection with a single item when posting a single file.
  • NuclearProgrammer
    NuclearProgrammer over 5 years
    The issue happens when I try to post to the method from my client class, whether the controller accepts an IEnumerable or a single instance. If it takes a collection, the collection is empty; if it takes an instance, the instance is null. I know the controller works because I use it with a Kendo UI control elsewhere and it works great (even when uploading multiple large files at once). demos.telerik.com/kendo-ui/upload/index
  • NuclearProgrammer
    NuclearProgrammer over 5 years
    The controller is written to expect multiple requests for a single file - that’s why it appends the IFormFile’s content rather than creating a new file each time.
  • Ringo
    Ringo about 5 years
    Thanks, this is going to be useful for me. I'm using react-xhr-uploader and the HTML5 File API to upload large files to an S3-local server -- hopefully using your code here. npmjs.com/package/react-xhr-uploader