Download File to server from URL


Solution 1

Since PHP 5.1.0, file_put_contents() supports writing piece-by-piece by passing a stream handle as the $data parameter:

file_put_contents("Tmpfile.zip", fopen("http://someurl/file.zip", 'r'));

From the manual:

If data [that is the second argument] is a stream resource, the remaining buffer of that stream will be copied to the specified file. This is similar with using stream_copy_to_stream().

(Thanks Hakre.)
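The manual's mention of stream_copy_to_stream() suggests a slightly more explicit variant of Solution 1. A minimal sketch (the helper name downloadViaStreamCopy is made up for illustration; fopen() accepts http:// URLs when allow_url_fopen is On, as well as local paths):

```php
<?php
// Illustrative sketch only. stream_copy_to_stream() copies the stream
// chunk by chunk, so the whole file is never held in memory.
function downloadViaStreamCopy($url, $dest)
{
    $src = fopen($url, 'rb');
    if ($src === false) {
        return false;
    }
    $dst = fopen($dest, 'wb');
    if ($dst === false) {
        fclose($src);
        return false;
    }
    $bytes = stream_copy_to_stream($src, $dst); // number of bytes copied
    fclose($src);
    fclose($dst);
    return $bytes;
}
```

This behaves like Solution 1 but also returns the byte count, which you can compare against an expected Content-Length.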

Solution 2

function downloadFile($url, $path)
{
    $file = fopen($url, 'rb');
    $newf = false; // initialize so the later check never hits an undefined variable
    if ($file) {
        $newf = fopen($path, 'wb');
        if ($newf) {
            // read and write in 8KB chunks so large files never fill memory
            while (!feof($file)) {
                fwrite($newf, fread($file, 1024 * 8));
            }
        }
    }
    if ($file) {
        fclose($file);
    }
    if ($newf) {
        fclose($newf);
    }
}

Solution 3

Try using cURL

set_time_limit(0); // unlimited max execution time

// CURLOPT_FILE expects an open file handle, not a path string
$fp = fopen('/path/to/download/the/file/to.zip', 'w');

$options = array(
  CURLOPT_FILE    => $fp,
  CURLOPT_TIMEOUT => 28800, // set this to 8 hours so we don't time out on big files
  CURLOPT_URL     => 'http://remoteserver.com/path/to/big/file.zip',
);

$ch = curl_init();
curl_setopt_array($ch, $options);
curl_exec($ch);
curl_close($ch);
fclose($fp);

With the CURLOPT_FILE option, cURL writes the data to the file as it is received, i.e. the download is not buffered in memory.

Solution 4

prodigitalson's answer didn't work for me: I got a "missing fopen in CURLOPT_FILE" error, because CURLOPT_FILE expects an open file handle rather than a path string.

This worked for me, including with local paths:

function downloadUrlToFile($url, $outFileName)
{
    if (is_file($url)) {
        copy($url, $outFileName);
        return 200; // treat a successful local copy like an HTTP 200
    }

    $fp = fopen($outFileName, 'w');
    $options = array(
      CURLOPT_FILE    => $fp,
      CURLOPT_TIMEOUT => 28800, // set this to 8 hours so we don't time out on big files
      CURLOPT_URL     => $url,
    );

    $ch = curl_init();
    curl_setopt_array($ch, $options);
    curl_exec($ch);
    $httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    fclose($fp); // flush and release the handle so the file is fully written
    return $httpcode;
}


Solution 5

  1. Create a folder called "downloads" on the destination server
  2. Save the code below into a .php file and run it on the destination server

Downloader :

<html>
<form method="post">
<input name="url" size="50" />
<input name="submit" type="submit" />
</form>
<?php
    // maximum execution time in seconds
    set_time_limit(24 * 60 * 60);

    if (!isset($_POST['submit'])) die();

    // folder to save downloaded files to. must end with slash
    $destination_folder = 'downloads/';

    $url = $_POST['url'];
    $newfname = $destination_folder . basename($url);

    $newf = false; // initialize so the later check never hits an undefined variable
    $file = fopen($url, 'rb');
    if ($file) {
      $newf = fopen($newfname, 'wb');

      if ($newf) {
        while (!feof($file)) {
          fwrite($newf, fread($file, 1024 * 8));
        }
      }
    }

    if ($file) {
      fclose($file);
    }

    if ($newf) {
      fclose($newf);
    }
?>
</html> 
Author by xaav

Updated on July 08, 2022

Comments

  • xaav
    xaav almost 2 years

    Well, this one seems quite simple, and it is. All you have to do to download a file to your server is:

    file_put_contents("Tmpfile.zip", file_get_contents("http://someurl/file.zip"));
    

Only there is one problem. What if you have a large file, like 100MB? Then you will run out of memory and be unable to download the file.

    What I want is a way to write the file to the disk as I am downloading it. That way, I can download bigger files, without running into memory problems.

  • xaav
    xaav over 13 years
Normally, this would be fine, but I have this code in a web app, so I can't be sure users will have cURL installed. However, I did give this a vote up.
  • PleaseStand
    PleaseStand over 13 years
That wouldn't be my first choice. If allow_url_fopen is set to Off in php.ini (a good idea for security), your script would break.
  • user3167101
    user3167101 over 13 years
    @idealmachine I think file_get_contents() would not work either if that were the case (see OP).
  • user3167101
    user3167101 over 13 years
    @Geoff is it a distributed web app? Because if you control the hosting, then it doesn't matter about your users (cURL is a library on your server).
  • xaav
    xaav over 13 years
    No. I do not control hosting. It is a distributed web app that anyone could have.
  • user3167101
    user3167101 over 13 years
    @geoff I was specific, I mentioned the function you wanted. What you may have wanted was someone to write the code for you - but I'm sure you learned something doing it yourself. Also, if we are going to comment on each other's SO interactions - please accept some more answers :)
  • vvMINOvv
    vvMINOvv over 12 years
thanks for your snippet, but would you be able to explain your code @xaav? I'm not exactly brilliant at PHP. What is 1024*8 for? Thanks again.
  • David Bélanger
    David Bélanger almost 12 years
    @wMINOw The length of the line.
  • Mangirdas Skripka
    Mangirdas Skripka almost 12 years
    Curl might be missing. But almost all shared hosting companies have CURL installed by default. I mean, I haven't seen one that doesn't.
  • Doktor J
    Doktor J over 11 years
    Specifically, it means to read up to 8KB at a time (1024 bytes per KB * 8) since the parameter is in bytes. As long as the line is <= 8KB, it will read the entire line at once.
  • hakre
    hakre about 11 years
    @alex: Please see the edit, feel free to incorporate. let me know when I can remove this comment here then.
  • user3167101
    user3167101 about 11 years
    @hakre Thanks, much appreciated. I'll make it flow a bit easier. :)
  • Wayne Weibel
    Wayne Weibel over 10 years
The 'b' flag should also be used in most cases with fopen; it prevents adverse effects on images and other non-plain-text files.
  • GunJack
    GunJack over 10 years
Why isn't this the best answer?
  • Tommix
    Tommix about 10 years
your answer is very simple and works well; it helped me where cURL failed to get the file. Thanks :)
  • Akhilesh
    Akhilesh over 9 years
    This should be the best answer.
  • Adam Swinden
    Adam Swinden over 9 years
    How do you handle errors with this approach? What if a 404 is returned or the connection is interrupted or times out?
  • The Bumpaster
    The Bumpaster over 8 years
That would be useful, but if a 404 goes through you would download whatever was returned; you can always run a check for that file (some logic), e.g. if (!myfile) -> break;
  • user3167101
    user3167101 over 8 years
    You might want to explain what this actually does.
  • Gustavo
    Gustavo about 8 years
    As from my tests, you can't assign to CURLOPT_FILE a file path directly. It has to be a file handler. First, open the file with $fh = fopen('/path/to/download/the/file/to.zip', 'w'); and close with fclose($fh); after curl_close($ch);. And set CURLOPT_FILE => $fh
  • Sean the Bean
    Sean the Bean about 8 years
    This assumes the user wants a standalone script rather than a solution that will work within an existing PHP application, and I believe the latter is what the OP and most others are looking for. An explanation would also be helpful for people who want to understand the approach.
  • Dheeraj Thedijje
    Dheeraj Thedijje almost 8 years
thanks, this saved 1GB of my mobile data. I was planning on transferring a file from one server to another; I could move the zip file directly instead of downloading and uploading it to the new server. :)
  • user9645
    user9645 almost 8 years
    This does not address the OP's problem of exceeding the PHP memory limit.
  • Tofeeq
    Tofeeq almost 8 years
    Best way to download large files
  • SergeDirect
    SergeDirect over 7 years
Magnificently fast - you're a true maestro!
  • NomanJaved
    NomanJaved over 7 years
sometimes fopen shows "failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden" because of web-crawling prevention. How do I resolve it? @xaav @Ibrahim Azhar Armar
  • Riffaz Starr
    Riffaz Starr about 7 years
whenever I try this, my transferred file size is always 50816 bytes, but my file is bigger than that (120MB). Any idea why this is?
  • Oleg Abrazhaev
    Oleg Abrazhaev almost 7 years
@NomanJaved you need to do an auth request on the login form first and reuse the same cURL client (or at least its headers and cookies) for all further requests.
  • Lorenz Meyer
    Lorenz Meyer over 6 years
    This does not answer the question, because the question is about writing on disk not to the output buffer.
  • Naveen DA
    Naveen DA over 6 years
How does it download without cURL?
  • Viktor Joras
    Viktor Joras about 6 years
    set_time_limit (24 * 60 * 60); has to be put inside a loop. It has no effect at the beginning of the script.
  • Kaspar L. Palgi
    Kaspar L. Palgi over 5 years
Downloads a 0KB empty file although the file (Skype's main.db) is more than 17MB. On PHP 5.4 I also had to remove "private" from the beginning of the function.
  • Marco
    Marco about 5 years
This solution was very nice for me, because I had to download a huge XML file which exceeds my PHP memory_limit of 1024M.
  • Valentine Shi
    Valentine Shi about 5 years
This is pretty simple and straightforward. Quite useful for simpler cases where the files are small or the environment is local development.
  • BendaThierry.com
    BendaThierry.com almost 5 years
I will never never never set 777 as perms on a webserver, and I will kick off any web developer who has the bad idea to do that. Every time, everywhere. Be careful! You cannot do that! Think about security. Following OWASP rules is not enough. Good thinking about simple things matters.
  • Pradeep Kumar
    Pradeep Kumar almost 5 years
@ThierryB. Note: I've given a local path, and this can be used in internal applications. Good reading and understanding of the question and answer matters. Think of different scenarios. And this is not the accepted/best answer. Every question has different answers, with pros and cons. An example for you: even Fibonacci has multiple solutions, where only one will be best; the others suit different scenarios.
  • BendaThierry.com
    BendaThierry.com almost 5 years
Ok, but taking time to think about best practices and implement them in secured places will give you a better understanding of the concepts you must implement. Maybe if an intruder is inside your ($)home, setting some traps or building things the best way you can will give him some headaches ;)
  • PlayHardGoPro
    PlayHardGoPro over 4 years
    Would this approach be better than using curl?
  • Dhruv Thakkar
    Dhruv Thakkar almost 4 years
any idea for .xlsx files? It's storing an empty file with 0 bytes.
  • Pacerier
    Pacerier almost 4 years
    How is this different/better from the answer with fopen?
  • webHasan
    webHasan over 3 years
@Netwons make sure wget is available on your server.
  • Netwons
    Netwons over 3 years
wget is available, but I get the error: errorCode=1 SSL/TLS handshake failure: The TLS connection was non-properly terminated.
  • Netwons
    Netwons over 3 years
or the error: Connecting to www.you.com (www.you.com)|178.79.180.188|:443... connected.
  • NomanJaved
    NomanJaved about 3 years
    Note: The $path should include the name of the file you want to create. Example: $path = $_SERVER['DOCUMENT_ROOT'].'/uploads/' . "test.mp3";