php, file download

Solution 1

One issue I have with the code in the question is that you have no control over the output stream; you're letting PHP handle it without knowing exactly what is going on in the background.

What you should do is set up an output system that you can control and replicate across servers.

For example:

if (file_exists($file))
{
    if (false !== ($handler = fopen($file, 'r')))
    {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="' . basename($file) . '"');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        //header('Content-Length: ' . filesize($file)); // removed: length is not needed when streaming
        // Note: no Content-Transfer-Encoding header here; "chunked" is a
        // Transfer-Encoding the web server applies itself when no length is given.

        // Send the content in chunks. fread() returns '' (not false) at
        // end-of-file, so test feof() to avoid an endless loop.
        while (!feof($handler))
        {
            echo fread($handler, 4096);
            flush(); // push each chunk to the client instead of buffering it
        }
        fclose($handler);
    }
    exit;
}
echo "<h1>Content error</h1><p>The file does not exist!</p>";

This is only a basic example, but give it a go!

Also read my reply here: file_get_contents => PHP Fatal error: Allowed memory exhausted
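Before streaming, it can also help to lift the time limit and drop PHP's own output buffers so each `echo`/`flush()` pair really reaches the client. A minimal sketch (adjust to your server setup; these are common settings, not a complete recipe):

```php
<?php
// Preparation before streaming a large download (sketch).
set_time_limit(0);        // the transfer may exceed max_execution_time
ignore_user_abort(false); // let PHP stop the loop when the client disconnects

// Drop any active PHP output buffers so chunks are not held in memory.
while (ob_get_level() > 0) {
    ob_end_clean();
}
```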

Solution 2

It seems readfile can have issues with long files. As @Khez asked, it could be that the script is running for too long. A quick Google search turned up a couple of examples of chunking the file:

  • http://teddy.fr/blog/how-serve-big-files-through-php
  • http://www.php.net/manual/en/function.readfile.php#99406
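A chunked helper along the lines of those examples might look like the sketch below (the function name and signature are illustrative, not taken from the linked posts):

```php
<?php
// Stream a file to the client in fixed-size chunks so PHP never holds
// the whole file in memory (a sketch of the "chunked readfile" idea).
function send_file_chunked($path, $chunkSize = 4096)
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    $bytesSent = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        echo $chunk;
        $bytesSent += strlen($chunk);
        flush(); // push the chunk out instead of accumulating it
    }
    fclose($handle);
    return $bytesSent;
}
```

Returning the byte count lets the caller compare it against `filesize()` for logging or error detection.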

Solution 3

One solution for certain scenarios is to use a PHP script to decide intelligently which file to serve and from where, but instead of sending the file directly from PHP, return a redirect to the client; the client then follows a direct link that the web server handles on its own.

This can be done in at least two ways: either the PHP script copies the file into a "download zone", which might, for example, be cleaned of "old" files regularly by some background/service script, or you expose the real permanent location to the clients.

Each approach has drawbacks, of course. With the redirect, some clients requesting the file (curl, wget, GUI browsers) may not follow the redirection you issue; with the exposed permanent location, the files are very visible to the outside world and can be read at any time without the (access) control of the PHP script.
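A minimal sketch of the first variant, assuming a hypothetical `/downloads/` zone served directly by the web server (the paths, the function name, and the authorization flag are placeholders):

```php
<?php
// Decide in PHP whether the client may have the file, then hand the
// actual transfer off to the web server via a redirect (sketch).
function redirect_to_download($file, $isAuthorized)
{
    if (!$isAuthorized) {
        header('HTTP/1.1 403 Forbidden');
        return;
    }
    // Copy the file into a web-served "download zone"; a background job
    // is assumed to clean old copies out of this directory regularly.
    $publicPath = '/downloads/' . basename($file);
    copy($file, $_SERVER['DOCUMENT_ROOT'] . $publicPath);
    header('Location: ' . $publicPath); // the server streams it alone from here
}
```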


Author: jsonx

Mathematician/Programmer/RC Pilot :)

Updated on December 10, 2020

Comments

  • jsonx
    jsonx over 3 years

    I am using the simple file downloading script:

    if (file_exists($file)) {
        header('Content-Description: File Transfer');
        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename='.basename($file));
        header('Content-Transfer-Encoding: binary');
        header('Expires: 0');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Pragma: public');
        header('Content-Length: ' . filesize($file));
        ob_clean();
        flush();
        readfile($file);
        exit;
    }
    

    It works on my local server for files up to 200 MB.

    When I try this code on my website, it downloads 173 KB instead of the 200 MB file.

    I checked everything and wrote some custom code (using ob functions and fread instead of readfile), but I still can't download big files.

    Thank you for your answers.

    • I am using Apache 2.2, PHP 5.3
    • All PHP settings to deal with big files are OK (execution times, memory limits, ...)
  • jsonx
    jsonx about 13 years
    All settings are fine for working with big files (execution times, memory limits, ...)
  • jsonx
    jsonx about 13 years
    The read_chunked() function from the first link solved the problem.
  • jsonx
    jsonx about 13 years
    Thank you Zofrex; yes, some authentication and other operations must be done before sending files. mod_xsendfile looks good, I will try it. It may be the cleaner solution.
  • jsonx
    jsonx about 13 years
    Thanks. Chunked sending works on all servers without problems. I am also sending the file size; sending the complete file size up front and then the file chunk by chunk may seem trivial, but it works well for now. Otherwise, for big downloads the user won't see the complete file size, which is not good for usability.
  • RobertPitt
    RobertPitt about 13 years
    It totally depends on how the data is transferred; for instance, streaming and downloading require two separate methods of content delivery. Glad to help.
  • 735Tesla
    735Tesla over 10 years
    Shouldn't basename($file) be encoded? What if it contains characters that interfere with HTTP?
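Regarding that last comment: one common approach is to send an ASCII-safe fallback in `filename` plus an RFC 5987-style `filename*` parameter for the real name. A sketch (one way to do it, not the only one):

```php
<?php
// Build a Content-Disposition value that survives spaces, quotes and
// non-ASCII characters in the file name (RFC 6266 / RFC 5987 style).
function content_disposition($filename)
{
    // ASCII-only fallback: replace non-printable/non-ASCII bytes...
    $fallback = preg_replace('/[^\x20-\x7E]/', '_', $filename);
    // ...and characters that would break the quoted-string syntax.
    $fallback = str_replace(array('"', '\\'), '_', $fallback);
    return 'attachment; filename="' . $fallback . '"'
         . "; filename*=UTF-8''" . rawurlencode($filename);
}
```

Usage: `header('Content-Disposition: ' . content_disposition($name));` — modern browsers prefer the `filename*` form, older ones fall back to `filename`.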
