How to deliver big files in ASP.NET Response?

Solution 1

The correct way to deliver big files in IIS is to combine the following options:

  1. Set MinBytesPerSecond to zero in WebLimits (this will certainly help performance, because IIS otherwise chooses to close KeepAlive connections with clients that transfer at low rates); see the config sketch after this list.
  2. Allocate more worker processes to the application pool; I have set it to 8. Do this only if your server mainly distributes large files: it will make other sites on the same server slower, but it ensures better delivery. We set it to 8 because this server hosts only one website and it just delivers huge files.
  3. Turn off App Pool Recycling
  4. Turn off Sessions
  5. Leave Buffering On
  6. Before each of the following steps, check that Response.IsClientConnected is true; otherwise give up and don't send anything.
  7. Set Content-Length before sending the file
  8. Flush the Response
  9. Write to the output stream, and flush at regular intervals (see the write-loop sketch after this list)
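
Here is a sketch of what items 1-3 look like in applicationHost.config. The sections and attributes are the standard IIS 7 ones, but the pool name is illustrative, and other recycle triggers (schedule, memory limits) have their own attributes:

    <system.applicationHost>
      <!-- item 1: don't let IIS drop clients with slow transfer rates -->
      <webLimits minBytesPerSecond="0" />
      <applicationPools>
        <add name="BigFilesPool">
          <!-- item 2: a web garden of 8 worker processes that never idles out -->
          <processModel maxProcesses="8" idleTimeout="00:00:00" />
          <!-- item 3: disable time-based recycling -->
          <recycling>
            <periodicRestart time="00:00:00" />
          </recycling>
        </add>
      </applicationPools>
    </system.applicationHost>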
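
And a minimal sketch of items 6-9 inside a handler, assuming a seekable sourceStream whose Length is known; SendStream and its arguments are illustrative names, not an existing API:

    using System.Globalization;
    using System.IO;
    using System.Web;

    static void SendStream(HttpContext context, Stream sourceStream)
    {
        HttpResponse response = context.Response;

        // item 6: check the client is still connected before doing anything
        if (!response.IsClientConnected) return;

        // item 7: set Content-Length before sending the file
        response.AddHeader("Content-Length",
            sourceStream.Length.ToString(CultureInfo.InvariantCulture));

        // item 8: flush so the headers go out immediately
        response.Flush();

        // item 9: write to the output stream, flushing at regular intervals,
        // and re-check IsClientConnected before every write
        byte[] buffer = new byte[64 * 1024];
        int read;
        while ((read = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            if (!response.IsClientConnected) return;
            response.OutputStream.Write(buffer, 0, read);
            response.Flush();
        }
    }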

Solution 2

When you set the Content-Length with BufferOutput set to false, a possible reason for the failures is that IIS tries to gzip the file you send; because you have already set Content-Length, IIS cannot replace it with the length of the compressed output, and the errors start (*).

So keep BufferOutput set to false, and second, disable gzip in IIS for the files you send; or disable IIS gzip for all files and handle the compression programmatically, keeping the files you send out of gzip. A web.config sketch follows.
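
For example, disabling IIS compression for dynamic responses (which is what a file handler produces) can be done with the standard urlCompression section, assuming the section is not locked; a sketch:

    <system.webServer>
      <!-- keep compressing static content, but not dynamic responses -->
      <urlCompression doStaticCompression="true" doDynamicCompression="false" />
    </system.webServer>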

Some similar questions with the same cause: ASP.NET site sometimes freezing up and/or showing odd text at top of the page while loading, on load balanced servers

HTTP Compression: Some external scripts/CSS not decompressing properly some of the time

(*) Why not change it again? Because once you set a header you cannot take it back, unless you have enabled that option in IIS and the headers have not already been sent to the browser.

Follow up

If gzip is not the cause, the next thing that comes to mind is that the file is being sent, the connection is delayed for some reason, times out, and is closed. That is why you get the "Remote Host Closed The Connection".

This can be solved depending on the cause:

  1. Client really closed the connection
  2. The timeout comes from the page itself, if you use a handler (though then the message would probably be "Page Timed Out").
  3. The timeout comes from idle waiting: the page takes longer than the execution timeout, times out, and the connection is closed (again, the message would probably be "Page Timed Out").
  4. The pool recycles the moment you send the file. Disable all pool recycling! Of the causes I can think of right now, this is the most likely.

If it is coming from IIS, go to the website properties, set the largest possible "Connection Timeout", and check "Enable HTTP Keep-Alives".
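
The same connection timeout can also be set per site in applicationHost.config; a sketch, with an illustrative site name:

    <sites>
      <site name="BigFilesSite" id="1">
        <limits connectionTimeout="01:00:00" />
      </site>
    </sites>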

Set the page timeout by changing web.config (you can also change it programmatically, for one specific page only, as sketched below):

<httpRuntime executionTimeout="43200" />
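
Programmatically, for one specific page or handler, something like this should work (Server.ScriptTimeout takes seconds; note that execution timeouts are ignored while debug="true"):

    // inside the page or handler that sends the file; 43200 s = 12 hours
    context.Server.ScriptTimeout = 43200;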

Also have a look at: http://weblogs.asp.net/aghausman/archive/2009/02/20/prevent-request-timeout-in-asp-net.aspx

Session lock

One more thing to examine: do not use session state in the handler that sends the file. Session state locks the request until it finishes, so if one user takes a long time to download a file, a second request may time out.
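
In an ASP.NET handler, session state is opt-in, so the fix is simply not to opt in; a sketch, with an illustrative class name:

    using System.Web;

    // Because this handler implements neither IRequiresSessionState nor
    // IReadOnlySessionState, ASP.NET gives it no session at all, so long
    // downloads don't serialize behind the exclusive session lock.
    // Implement IReadOnlySessionState (System.Web.SessionState) instead
    // if you need to read session data without taking the write lock.
    public class FileDownloadHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // ... stream the file to context.Response here ...
        }
    }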

Some related questions:

call aspx page to return an image randomly slow

Replacing ASP.Net's session entirely

Response.WriteFile function fails and gives 504 gateway time-out

Solution 3

What I would do is use the not-so-well-known ASP.NET Response.TransmitFile method, as it's very fast (and possibly uses the IIS kernel cache) and takes care of all the header stuff. It is based on the unmanaged Windows TransmitFile API.

But to be able to use this API, you need a physical file to transfer. So here is pseudo C# code that explains how to do this with a fictional myCacheFilePath physical file path. It also supports client caching. Of course, if you already have a file at hand, you don't need to create the cache:

    if (!File.Exists(myCacheFilePath))
    {
        LoadMyCache(...); // saves the file to disk. don't do this if your source is already a physical file (not stored in a db for example).
    }

    // we suppose user-agent (browser) cache is enabled
    // check appropriate If-Modified-Since header
    DateTime ifModifiedSince = DateTime.MaxValue;
    string ifm = context.Request.Headers["If-Modified-Since"];
    if (!string.IsNullOrEmpty(ifm))
    {
        try
        {
            ifModifiedSince = DateTime.Parse(ifm, DateTimeFormatInfo.InvariantInfo);
        }
        catch
        {
            // do nothing
        }

        // file has not changed, just send this information but truncate milliseconds
        if (ifModifiedSince == TruncateMilliseconds(File.GetLastWriteTimeUtc(myCacheFilePath)))
        {
            ResponseWriteNotModified(...); // HTTP 304
            return;
        }
    }

    Response.ContentType = contentType; // set your file content type here
    Response.AddHeader("Last-Modified", File.GetLastWriteTimeUtc(myCacheFilePath).ToString("r", DateTimeFormatInfo.InvariantInfo)); // tell the client to cache that file

    // this API uses windows lower levels directly and is not memory/cpu intensive on Windows platform to send one file. It also caches files in the kernel.
    Response.TransmitFile(myCacheFilePath);

Solution 4

This piece of code works for me. It starts streaming the data to the client immediately, it shows progress during the download, and it doesn't violate HTTP: the Content-Length header is specified and chunked transfer encoding is not used.

protected void PrepareResponseStream(string clientFileName, HttpContext context, long sourceStreamLength)
{
    context.Response.ClearHeaders();
    context.Response.Clear();

    context.Response.ContentType = "application/pdf";
    context.Response.AddHeader("Content-Disposition", string.Format("filename=\"{0}\"", clientFileName));

    //set cachebility to private to allow IE to download it via HTTPS. Otherwise it might refuse it
    //see reason for HttpCacheability.Private at http://support.microsoft.com/kb/812935
    context.Response.Cache.SetCacheability(HttpCacheability.Private);
    context.Response.Buffer = false;
    context.Response.BufferOutput = false;
    context.Response.AddHeader("Content-Length", sourceStreamLength.ToString    (System.Globalization.CultureInfo.InvariantCulture));
}

protected void WriteDataToOutputStream(Stream sourceStream, long sourceStreamLength, string clientFileName, HttpContext context)
{
    PrepareResponseStream(clientFileName, context, sourceStreamLength);
    const int BlockSize = 4 * 1024 * 1024;
    byte[] buffer = new byte[BlockSize];
    int bytesRead;
    Stream outStream = context.Response.OutputStream;
    while ((bytesRead = sourceStream.Read(buffer, 0, BlockSize)) > 0)
    {
        outStream.Write(buffer, 0, bytesRead);
    }
    outStream.Flush();
}
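
A hypothetical call site, assuming the source happens to be a file on disk (pathToFile is illustrative; a database blob stream works just as well):

using (Stream sourceStream = File.OpenRead(pathToFile))
{
    WriteDataToOutputStream(sourceStream, sourceStream.Length, "report.pdf", context);
}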

Comments

  • Akash Kava
    Akash Kava almost 2 years

    I am not looking for an alternative to streaming file contents from the database; I am looking for the root of the problem. This was running fine until IIS 6, where we ran our app in classic mode; we have now upgraded to IIS 7, run the app pool in pipeline mode, and this problem started.

    I have a handler in which I have to deliver big files in response to client requests, and I face the following problems.

    Files average 4 to 100 MB in size, so let's consider the case of an 80 MB file download.

    Buffering On, Slow Start

    Response.BufferOutput = true;
    

    This results in a very slow start to the download: even the progress bar does not appear for several seconds, typically 3 to 20. The reason is that IIS reads the entire file first, determines the Content-Length, and only then begins the transfer. When the file is played in a video player it runs very, very slowly; the iPad, however, downloads only a fraction of the file first, so it works fast.

    Buffering Off, No Content-Length, Fast Start, No Progress

    Response.BufferOutput = false;
    

    This results in an immediate start; however, the client (a typical browser like Chrome) does not know the Content-Length, since IIS does not know it either, so it displays no progress and instead just says "X KB downloaded".

    Buffering Off, Manual Content-Length, Fast Start, Progress and Protocol Violation

    Response.BufferOutput = false;
    Response.AddHeader("Content-Length", file.Length.ToString());
    

    This results in a correct, immediate file download in Chrome etc.; however, in some cases the IIS handler logs a "Remote Client Closed Connection" error (this is very frequent), and WebClient clients get a protocol violation. This happens on 5 to 10% of requests, not every request.

    My guess is that IIS does not send anything like "100 Continue" when we don't buffer, and the client, not expecting any output, may disconnect. Reading the file from its source can take a long time; I have increased the timeout on the client side, but IIS seems to time out anyway, and I have no control over that.

    Is there any way I can force the Response to send "100 Continue" and not let anyone close the connection?

    UPDATE

    I found the following headers in Firefox/Chrome; nothing here seems unusual enough to explain a protocol violation or a bad header.

    Access-Control-Allow-Headers:*
    Access-Control-Allow-Methods:POST, GET, OPTIONS
    Access-Control-Allow-Origin:*
    Access-Control-Max-Age:1728000
    Cache-Control:private
    Content-Disposition:attachment; filename="24.jpg"
    Content-Length:22355
    Content-Type:image/pjpeg
    Date:Wed, 07 Mar 2012 13:40:26 GMT
    Server:Microsoft-IIS/7.5
    X-AspNet-Version:4.0.30319
    X-Powered-By:ASP.NET
    

    UPDATE 2

    Turning off recycling still did not help much, but I have increased MaxWorkerProcess to 8 and now get fewer errors than before.

    But on average, out of 200 requests in a second, 2 to 10 requests fail, and this happens almost every other second.

    UPDATE 3

    With 5% of requests still failing with "The server committed a protocol violation. Section=ResponseStatusLine": I have another program that downloads content from the web server using WebClient; it gives this error 4-5 times a second, and on average 5% of its requests fail. Is there any way to trace the WebClient failure?

    Problems Redefined

    Zero Byte File Received

    IIS closes the connection for some reason. On the client side, in WebClient, I receive 0 bytes for a file that is not zero bytes; we do a SHA1 hash check, which is how we know. On the IIS web server, no error is recorded.

    This was my mistake, and it is resolved: we are using Entity Framework, and it was reading dirty (uncommitted) rows because the read was not in a transaction scope; putting it in a transaction scope resolved the issue.

    Protocol Violation Exception Raised

    WebClient throws a WebException saying "The server committed a protocol violation. Section=ResponseStatusLine".

    I know I can enable unsafe header parsing, but that is not the point; it is my HTTP handler that is sending proper headers, and I don't know why IIS sends anything extra (checked in Firefox and Chrome, nothing unusual). This happens only 2% of the time.

    UPDATE 4

    Found sc-win32 error 64, and I read somewhere that MinBytesPerSecond in WebLimits must be changed from 240 to 0; still, everything is the same. However, I have noticed that whenever IIS logs the sc-win32 64 error, it records the HTTP status as 200 even though there was an error. I can't turn on Failed Request Tracing for status 200 because it would produce massive files.

    Both of the above problems were solved by changing MinBytesPerSecond as well as disabling sessions; I have added a detailed answer summarizing every point.

    • Caspar Kleijne
      Caspar Kleijne over 12 years
      That is why they invented the File Transfer Protocol ;)
    • Akash Kava
      Akash Kava over 12 years
      FTP for 100 MB isn't a replacement; Linux download servers work correctly. It's IIS that is acting funny; the funnier part is that IIS closes the client connection and then blames it as "Remote Client Closed Connection"!!
    • Zhaph - Ben Duguid
      Zhaph - Ben Duguid over 12 years
      @CasparKleijne More to the point, this is why IIS 7 has a "Smooth Streaming Module" to deliver video to Silverlight and other clients
    • Akash Kava
      Akash Kava over 12 years
      @Zhaph-BenDuguid Well, it's MS's proprietary thing. Our data lives in a blob store in MS SQL, with blobs broken down into 1 MB pieces; they don't exist as files on disk. We know about FILESTREAM, and it crashes with more than 100K files; we have around a few million files, so we have to stick with what we have. Smooth Streaming and the static file modules work on individual files existing on disk, so they will not work in my case.
    • Stilgar
      Stilgar over 12 years
      It may be interesting to see what the static file handler that ships with IIS 7 does for larger files, and copy their solution, whatever it is.
    • Aristos
      Aristos over 12 years
      @AkashKava I am also thinking of the session case; I have updated my answer. Check whether you use session when you send the file, and disable it.
  • Akash Kava
    Akash Kava over 12 years
    All I am doing is sending byte[] buffers; I am not sending files. However, it looks like a good hint, I will check it.
  • Akash Kava
    Akash Kava over 12 years
    The IIS Dynamic Content Compression module is not installed on any of my servers, so I assume IIS is not compressing content at all, right? Or is it something I have to explicitly turn off somewhere else? By the way, this is my MVC handler, and I assume the content we send counts as dynamic. IIS static compression is installed; should I uninstall that too?
  • Aristos
    Aristos over 12 years
    @AkashKava You can easily see whether the pages are gzipped; just look at the headers with the browser's tools. Don't uninstall anything for now, just disable it; depending on the IIS version, look on the internet for how to do that.
  • Akash Kava
    Akash Kava over 12 years
    You can check my question; I have added the headers, and I don't see compression. Also, this happens on 5 to 10% of requests, not all.
  • Akash Kava
    Akash Kava over 12 years
    Thanks for the update. I have set all known timeouts to 1 hour (3600 seconds). I think recycling is what is causing this problem; I dug a little and found that my old server running IIS in classic mode doesn't have such problems, while app pools running in pipeline mode do. I was under the impression that a pool only recycles when it's idle. How do I turn off recycling and improve performance? I have only one website on this server, and I want only one pool with the best possible performance; even if it takes 500 MB, no problem, 50% CPU usage, no problem.
  • Aristos
    Aristos over 12 years
    @AkashKava Recycling has many options: it can recycle on idle, but also on a memory limit, on a timeout being reached, and more... I have disabled all recycling because I do not actually need it, and I do not have any issues. Also, now that you mention pipeline mode, maybe one of your loaded modules causes this issue; check them out.
  • Akash Kava
    Akash Kava over 12 years
    Nope, we have about a million files, and we want to stream blocks from the database directly; extracting blocks and creating a temp file would add the overhead of managing and deleting it. I want to find the cause of the problem, not an alternative.
  • Simon Mourier
    Simon Mourier over 12 years
    @AkashKava These are not "temp" files, but cached files that will be reused on subsequent requests. Extracting files from a database on each request is not very good, hence the cache story, which leverages Windows' high-performance lower IO layers. And storing millions of files on disk is no more of an issue than storing millions of files in a database. This is a solution to the problems you describe.
  • Akash Kava
    Akash Kava over 12 years
    We have been keeping our millions of files in the database for the last 7 years; until IIS 6.0 everything was alright, and this problem started recently with IIS 7.0 when we upgraded our servers. "Extracting files from a database at each request is not very good": do you have any performance benchmarks? How do you make your data access ACID compliant with just files on disk? These questions are out of our scope; I want to get to the bottom of why IIS is behaving strangely.
  • Akash Kava
    Akash Kava over 12 years
    Thank you for your time. I have been able to improve IIS performance, and there was also a little bug in our code that was reading dirty rows from the database, causing zero-byte files. I have awarded the bounty to you, but I have created a new answer summarizing everything. Also, thanks for the session point; I missed it too, and it creates lots of cookies unnecessarily.