Why don't large files download easily in Laravel?
Solution 1
This happens because Response::download() loads the entire file into memory before serving it to the user. Admittedly this is a flaw in the framework, but most people do not try to serve large files through it.
The simplest fix: put the files you want to download in the public folder, on a static domain, or on a CDN, and bypass Laravel completely.
Understandably, you might be restricting access to your downloads by login, in which case you'll need to craft your own download method. Something like this should work:
function sendFile($path, $name = null, array $headers = array())
{
    if (is_null($name)) $name = basename($path);

    // Prepare the headers
    $headers = array_merge(array(
        'Content-Description'       => 'File Transfer',
        'Content-Type'              => File::mime(File::extension($path)),
        'Content-Transfer-Encoding' => 'binary',
        'Expires'                   => 0,
        'Cache-Control'             => 'must-revalidate, post-check=0, pre-check=0',
        'Pragma'                    => 'public',
        'Content-Length'            => File::size($path),
    ), $headers);

    $response = new Response('', 200, $headers);
    $response->header('Content-Disposition', $response->disposition($name));

    // If there's a session we should save it now
    if (Config::get('session.driver') !== '')
    {
        Session::save();
    }

    // Send the headers, then stream the file in 8 KB chunks
    ob_end_clean();
    $response->send_headers();
    if ($fp = fopen($path, 'rb')) {
        while (!feof($fp) and (connection_status() == 0)) {
            print(fread($fp, 8192));
            flush();
        }
        fclose($fp);
    }

    // Finish off, like Laravel would
    Event::fire('laravel.done', array($response));
    $response->foundation->finish();
    exit;
}
This function is a combination of Response::download() and Laravel's shutdown process. I haven't had a chance to test it myself, as I don't have Laravel 3 installed at work. Please let me know if it does the job for you.
PS: The only thing this script does not take care of is cookies. Unfortunately the Response::cookies() method is protected; if this becomes a problem you can lift the code from that method into your sendFile method.
PPS: There might be an issue with output buffering; if it is a problem, have a look at the readfile() examples in the PHP manual - there's an approach there that should work.
PPPS: Since you're working with binary files, you might want to consider replacing readfile() with fpassthru().
EDIT: Disregard the PPS and PPPS; I've updated the code to use fread + print instead, as this seems more stable.
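To show how the function above might be used, here is a minimal sketch of a Laravel 3 route wired to it. The route pattern, storage path, and auth filter are hypothetical; adapt them to your application.

// Hypothetical Laravel 3 route: restrict the download to logged-in users,
// then stream the file in chunks instead of loading it into memory.
Route::get('download/(:any)', array('before' => 'auth', function ($filename)
{
    $path = path('storage') . 'uploads/' . $filename; // hypothetical storage location

    if (!File::exists($path)) return Response::error('404');

    sendFile($path); // never returns; it streams the file and exits
}));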
Solution 2
You can use the Symfony\Component\HttpFoundation\StreamedResponse like this:
$response = new StreamedResponse(
    function () use ($filePath, $fileName) {
        // Stream the file in 8 KB chunks
        if ($file = fopen($filePath, 'rb')) {
            while (!feof($file) and (connection_status() == 0)) {
                print(fread($file, 1024 * 8));
                flush();
            }
            fclose($file);
        }
    },
    200,
    [
        'Content-Type'        => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="' . $fileName . '"',
    ]
);

return $response;
For more information, see the Symfony documentation on StreamedResponse.
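In context, the snippet above would typically live in a controller action, with `$filePath` and `$fileName` resolved from your own data. A minimal sketch (the model, column names, and path are hypothetical):

use Symfony\Component\HttpFoundation\StreamedResponse;

// Hypothetical controller action: look up the file record,
// then hand the streaming closure to StreamedResponse.
public function download($id)
{
    $record   = Upload::findOrFail($id);                  // hypothetical model
    $filePath = storage_path('uploads/' . $record->file); // hypothetical column
    $fileName = $record->title;

    return new StreamedResponse(function () use ($filePath) {
        if ($file = fopen($filePath, 'rb')) {
            while (!feof($file)) {
                print(fread($file, 1024 * 8));
                flush();
            }
            fclose($file);
        }
    }, 200, [
        'Content-Type'        => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="' . $fileName . '"',
    ]);
}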
Solution 3
I'm using the readfile_chunked() custom method described in the comments on the readfile() page of the PHP manual. For Laravel 3, I've extended the Response class like this:
Add this file as application/libraries/response.php:
<?php
class Response extends Laravel\Response {

    // http://www.php.net/manual/en/function.readfile.php#54295
    public static function readfile_chunked($filename, $retbytes = true)
    {
        $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
        $cnt = 0;

        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }

        while (!feof($handle)) {
            $buffer = fread($handle, $chunksize);
            echo $buffer;
            ob_flush();
            flush();
            if ($retbytes) {
                $cnt += strlen($buffer);
            }
        }

        $status = fclose($handle);
        if ($retbytes && $status) {
            return $cnt; // return num. bytes delivered, like readfile() does
        }
        return $status;
    }
}
Then comment out this line in application/config/application.php:
'Response' => 'Laravel\\Response',
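For clarity, the change to the aliases array would look something like this (surrounding entries elided); with the alias commented out, the autoloader picks up the extended class in application/libraries/response.php instead:

'aliases' => array(
    // ...
    // 'Response' => 'Laravel\\Response', // commented out so our extended class is used
    // ...
),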
Example code:
//return Response::download(Config::get('myconfig.files_folder').$file->upload, $file->title);
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename='.$file->title);
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . File::size(Config::get('myconfig.files_folder').$file->upload));
ob_clean();
flush();
Response::readfile_chunked(Config::get('myconfig.files_folder').$file->upload);
exit;
Works great so far.
Solution 4
As of 2020 (Laravel 7), there is a better way:
return response()->download($pathToFile);
I've used this with a 398 MB file with no issues, when the same file was causing problems with the previous solutions.
From the Laravel docs: "The download method may be used to generate a response that forces the user's browser to download the file at the given path. The download method accepts a file name as the second argument to the method, which will determine the file name that is seen by the user downloading the file. Finally, you may pass an array of HTTP headers as the third argument to the method:"
return response()->download($pathToFile);
return response()->download($pathToFile, $name, $headers);
return response()->download($pathToFile)->deleteFileAfterSend();
We also have streamed downloads, which may suit even better: see the Laravel docs.
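A streamed download writes the response body directly to the output stream rather than buffering it. A minimal sketch using Laravel's streamDownload helper (the file path and download name are placeholders):

return response()->streamDownload(function () {
    // Hypothetical path; stream the file in 8 KB chunks
    $handle = fopen(storage_path('app/big-file.zip'), 'rb');
    while (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
    }
    fclose($handle);
}, 'big-file.zip');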
Lango
Updated on July 28, 2022
Comments
-
Lango almost 2 years
My file (126 MB size, .exe) is giving me issues.
I'm using the standard laravel download method.
I tried increasing the memory but it still either says I have run out of memory, or I download a 0 KB size file.
The documentation doesn't mention anything about large file sizes.
My code is
ini_set("memory_limit", "-1"); // Trying to see if this works
return Response::download($full_path);
Anything I am doing wrong?
-- Edit --
Going on Phill Sparks' comment, this is what I have and it works. It's a combination of Phill's code plus some from php.net. Not sure if there is anything missing from it?
public static function big_download($path, $name = null, array $headers = array())
{
    if (is_null($name)) $name = basename($path);

    // Prepare the headers
    $headers = array_merge(array(
        'Content-Description'       => 'File Transfer',
        'Content-Type'              => File::mime(File::extension($path)),
        'Content-Transfer-Encoding' => 'binary',
        'Expires'                   => 0,
        'Cache-Control'             => 'must-revalidate, post-check=0, pre-check=0',
        'Pragma'                    => 'public',
        'Content-Length'            => File::size($path),
    ), $headers);

    $response = new Response('', 200, $headers);
    $response->header('Content-Disposition', $response->disposition($name));

    // If there's a session we should save it now
    if (Config::get('session.driver') !== '')
    {
        Session::save();
    }

    // Below is from http://uk1.php.net/manual/en/function.fpassthru.php comments
    session_write_close();
    ob_end_clean();
    $response->send_headers();
    if ($file = fopen($path, 'rb')) {
        while (!feof($file) and (connection_status() == 0)) {
            print(fread($file, 1024 * 8));
            flush();
        }
        fclose($file);
    }

    // Finish off, like Laravel would
    Event::fire('laravel.done', array($response));
    $response->foundation->finish();
    exit;
}
-
Lango about 11 yearsDoesn't ini_set("memory_limit","-1"); do that?
-
Lango about 11 yearsDoes it need to be the size of the file?
-
Kees Sonnema about 11 yearsNo, it does not have to be the size of the file, and -1 does not work.
-
Kees Sonnema about 11 yearsBut it has to be bigger than the file size, obviously.
-
Lango about 11 yearsThe memory here refers to ram? What if it is a 1 gig download and two people download at once. Does that mean it needs 2 gig memory?
-
Kees Sonnema about 11 yearsIf it's 1 gig, it's 1 gig for both; the two downloads have nothing to do with each other.
-
Kees Sonnema about 11 yearsI'm not familiar with Laravel, but I knew it worked like this, and I found it on SO itself.
-
Lango about 11 yearsIt almost worked, but I kept getting file-not-found errors with both readfile and fpassthru(). I looked at the links and combined them with yours to make something that works, though I'm not sure how correct it is. I edited my question to show the latest.
-
Phill Sparks about 11 yearsHi @Lango, I can't see any problem with your solution. I've updated my answer to include the fread+print approach and an ob_get_clean() too. I can't see why fpassthru wouldn't work when fread+print does, since they use the same file pointer - but if it works for you then roll with it!