Downloading all the files in a directory with cURL

Solution 1

OK, considering that you are using Windows, the simplest way to do this is to use the standard ftp tool bundled with it. The following solution is based on Windows XP, but it should work as well (or with minor modifications) on other versions.

First of all, you need to create a batch (script) file for the ftp program, containing instructions for it. Name it whatever you like, and put the following into it:

open ftp.myftpsite.com
login
pass
mget *
quit

The first line opens a connection to the FTP server at ftp.myftpsite.com. The two following lines supply the login and the password which ftp will ask for (replace login and pass with the actual login and password, without any keywords). Then, you use mget * to get all files. Instead of the *, you can use any wildcard. Finally, quit closes the ftp program without an interactive prompt.

If you need to enter some directory first, add a cd command before mget, as in the sketch below. It should be pretty straightforward.
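
For example, a minimal sketch assuming the files sit in a subdirectory named iiumlabs (the directory name is illustrative):

open ftp.myftpsite.com
login
pass
cd iiumlabs
mget iiumlabs*
quit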

Finally, save that file and run ftp like this:

ftp -i -s:yourscript

where -i disables interactivity (asking before downloading files), and -s specifies path to the script you created.


Sadly, file transfer over SSH is not natively supported in Windows. But for that case, you'd probably want to use the PuTTY tools anyway. The one of particular interest here is pscp, which is practically the PuTTY counterpart of the OpenSSH scp command.

The syntax is similar to the copy command, and it supports wildcards:

pscp -batch login@ftp.myftpsite.com:iiumlabs* .

If you authenticate using a key file, pass it using -i path-to-key-file. If you use a password, use -pw pass. It can also reuse sessions saved in PuTTY, via the -load your-session-name argument.
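
For instance, hedged sketches of both variants (the key path, user, and host are placeholders):

pscp -batch -i C:\keys\mykey.ppk login@ftp.myftpsite.com:iiumlabs* .
pscp -batch -pw pass login@ftp.myftpsite.com:iiumlabs* .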

Solution 2

If you're not bound to curl, you might want to use wget in recursive mode, but restricted to one level of recursion. Try the following:

wget --no-verbose --no-parent --recursive --level=1 \
    --no-directories --user=login --password=pass ftp://ftp.myftpsite.com/
  • --no-parent : Do not ever ascend to the parent directory when retrieving recursively.
  • --level=depth : Specify the maximum recursion depth. The default maximum depth is five levels.
  • --no-directories : Do not create a hierarchy of directories when retrieving recursively.
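
If you only want the files matching the pattern from the question, wget can also filter by filename; a sketch assuming the iiumlabs prefix (the --accept pattern is the only addition):

wget --no-verbose --no-parent --recursive --level=1 \
    --no-directories --accept 'iiumlabs*' \
    --user=login --password=pass ftp://ftp.myftpsite.com/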

Solution 3

What about something like this, in a Windows batch file:

for /f %%f in ('curl -s -l -u user:pass ftp://ftp.myftpsite.com/') do curl -O -u user:pass ftp://ftp.myftpsite.com/%%f
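
Note that the doubled %%f form is for use inside a .bat file; at an interactive command prompt you would write %f instead. A sketch of a complete batch file, using the same placeholder credentials and host:

@echo off
for /f %%f in ('curl -s -l -u user:pass ftp://ftp.myftpsite.com/') do (
    curl -O -u user:pass ftp://ftp.myftpsite.com/%%f
)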

Solution 4

On a Mac (or any system with a POSIX shell), you can use a script like this:

for f in $(curl -s -l -u user:pass ftp://your_ftp_server_ip/folder/); do
    curl -O -u user:pass "ftp://your_ftp_server_ip/folder/$f"
done
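
Filenames containing spaces will break the word splitting in the for loop; a slightly more robust sketch pipes the listing into a while read loop instead (same placeholder credentials and host):

curl -s -l -u user:pass ftp://your_ftp_server_ip/folder/ | while read -r f; do
    f=${f%$'\r'}    # strip a trailing CR if the server sends CRLF line endings
    curl -O -u user:pass "ftp://your_ftp_server_ip/folder/$f"
done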

Solution 5

Oh, I have just the thing you need!

$host = "ftp://example.com/dir/";
$savePath = "downloadedFiles";
if($check = isFtpUp($host)){

    echo $ip." -is alive<br />";

    $check = trim($check);
    $files = explode("\n",$check);

    foreach($files as $n=>$file){
        $file = trim($file);
        if($file !== "." || $file !== ".."){
            if(!saveFtpFile($file, $host.$file, $savePath)){
                // downloading failed. possible reason: $file is a folder name.
                // echo "Error downloading file.<br />";
            }else{
                echo "File: ".$file." - saved!<br />";
            }
        }else{
            // do nothing
        }
    }
}else{
    echo $ip." - is down.<br />";
}

and functions isFtpUp and saveFtpFile are as follows:

function isFtpUp($host){
    $ch = curl_init();

    // request a name-only listing of the directory; anonymous FTP
    // conventionally takes an email address as the password
    curl_setopt($ch, CURLOPT_URL, $host);
    curl_setopt($ch, CURLOPT_USERPWD, "anonymous:user@example.com");
    curl_setopt($ch, CURLOPT_FTPLISTONLY, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 3);

    $result = curl_exec($ch);
    curl_close($ch);

    return $result;
}

function saveFtpFile($targetFile, $sourceFile, $savePath){

    // function settings
    set_time_limit(60);
    $timeout = 60;
    $ftpuser = "anonymous";
    $ftppassword = "user@example.com";
    // the $savePath directory should already exist!
    $curl = curl_init();
    $file = @fopen($savePath.'/'.$targetFile, 'w');

    if(!$file){
        return false;
    }

    curl_setopt($curl, CURLOPT_URL, $sourceFile);
    curl_setopt($curl, CURLOPT_USERPWD, $ftpuser.':'.$ftppassword);

    // curl settings

    // curl_setopt($curl, CURLOPT_FAILONERROR, 1);
    // curl_setopt($curl, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($curl, CURLOPT_TIMEOUT, $timeout);
    curl_setopt($curl, CURLOPT_FILE, $file);

    // write the remote file into the local handle; close both before returning
    $result = curl_exec($curl);

    curl_close($curl);
    fclose($file);

    return $result;
}

EDIT:

It's a PHP script. Save it as a .php file, put it on your webserver, change $host to the address (it need not be an IP) of the FTP server you want to download files from, and create a directory named downloadedFiles in the same directory as the script.
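
To try it from the command line instead of a webserver, a quick sketch (assuming the script was saved as download.php):

mkdir downloadedFiles
php download.php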

Comments

  • Alex Gordon

    I am using cURL to try to download all files in a certain directory.

    Here's what my list of files looks like:

    [screenshot of the directory listing]

    I have tried, in a bash script, iiumlabs.[].csv.pgp and iiumlabs*, and I guess curl is not big on wildcards.

    curl -u login:pass ftp.myftpsite.com/iiumlabs* -O
    

    Question: how do I download this directory of files using cURL?