
Download from Laravel storage without loading whole file in memory


It seems that output buffering is still accumulating a lot of data in memory.

Try disabling output buffering before doing the fpassthru:

function () use ($stream) {
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    fpassthru($stream);
},

There could be multiple output buffers active, which is why the while loop is needed.
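For context, here is a minimal sketch of how that callback might fit into a full Laravel route. The disk name and file name are assumptions for illustration; adapt them to your setup:

<?php

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('/download', function () {
    /** @var \League\Flysystem\Filesystem $fs */
    $fs = Storage::disk('local')->getDriver();
    $stream = $fs->readStream('bigfile'); // assumed file name

    return response()->stream(function () use ($stream) {
        // Flush and disable every active output buffer so each chunk
        // goes straight to the client instead of piling up in memory.
        while (ob_get_level() > 0) {
            ob_end_flush();
        }
        fpassthru($stream);
    }, 200, [
        'Content-Type'        => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="bigfile"',
    ]);
});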


Instead of loading the whole file into memory at once, try to use fread to read and send it chunk by chunk.

Here is a very good article: http://zinoui.com/blog/download-large-files-with-php

<?php
// Disable the execution time limit when downloading a big file.
set_time_limit(0);

/** @var \League\Flysystem\Filesystem $fs */
$fs = Storage::disk('local')->getDriver();

$fileName = 'bigfile';
$metaData = $fs->getMetadata($fileName);
$handle = $fs->readStream($fileName);

header('Pragma: public');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Cache-Control: private', false);
header('Content-Transfer-Encoding: binary');
header('Content-Disposition: attachment; filename="' . $metaData['path'] . '";');
header('Content-Type: ' . $metaData['type']);

/*
    I've commented the following line out,
    because \League\Flysystem\Filesystem uses int for the file size.
    For a file larger than PHP_INT_MAX (2147483647) bytes
    it may return 0, which results in:
        Content-Length: 0
    and that stops the browser from downloading the file.
    Try to figure out a way to get the file size represented as a string
    (e.g. using a shell command / 3rd-party plugin?).
*/
//header('Content-Length: ' . $metaData['size']);

$chunkSize = 1024 * 1024;
while (!feof($handle)) {
    $buffer = fread($handle, $chunkSize);
    echo $buffer;
    ob_flush();
    flush();
}
fclose($handle);
exit;

Update

A simpler way to do this: just call

if (ob_get_level()) ob_end_clean();

before returning a response.

Credit to @Christiaan

// Disable the execution time limit when downloading a big file.
set_time_limit(0);

/** @var \League\Flysystem\Filesystem $fs */
$fs = Storage::disk('local')->getDriver();

$fileName = 'bigfile';
$metaData = $fs->getMetadata($fileName);
$stream = $fs->readStream($fileName);

if (ob_get_level()) ob_end_clean();

return response()->stream(
    function () use ($stream) {
        fpassthru($stream);
    },
    200,
    [
        'Content-Type' => $metaData['type'],
        'Content-disposition' => 'attachment; filename="' . $metaData['path'] . '"',
    ]
);


X-Send-File.

X-Send-File is an internal directive with variants for Apache, nginx, and lighttpd. It lets you skip serving the file through PHP entirely: your application returns only a header, and the web server reads it as an instruction to send the named file itself instead of the actual FastCGI response body.
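As a rough illustration (not the full code linked below), a Laravel response might set the nginx variant, X-Accel-Redirect, pointing at an internal URI. The URI and file name here are assumptions:

<?php

// Minimal sketch: tell nginx to serve the file itself via its
// X-Accel-Redirect variant of X-Send-File. The '/protected/' URI
// must map to a location marked `internal` in the nginx config;
// the URI and filename are assumptions for illustration.
return response('', 200, [
    'X-Accel-Redirect'    => '/protected/bigfile',
    'Content-Type'        => 'application/octet-stream',
    'Content-Disposition' => 'attachment; filename="bigfile"',
]);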

I've dealt with this before on a personal project and if you want to see the sum of my work, you can access it here:
https://github.com/infinity-next/infinity-next/blob/master/app/Http/Controllers/Content/ImageController.php#L250-L450

This deals not only with distributing files, but also with handling seeking in streaming media. You are free to use that code.

Here is the official nginx documentation on X-Send-File.
https://www.nginx.com/resources/wiki/start/topics/examples/xsendfile/

You do have to edit your web server configuration and mark specific directories as internal for nginx to comply with X-Send-File directives.
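For instance, a minimal nginx location of the kind meant here (the paths are assumptions) might look like:

# Files under this location can only be reached via an
# X-Accel-Redirect header from the application, never by
# a direct client request. Paths are assumptions.
location /protected/ {
    internal;
    alias /var/www/storage/app/; # assumed storage directory
}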

I have example configuration for both Apache and nginx for my above code here.
https://github.com/infinity-next/infinity-next/wiki/Installation

This has been tested on high-traffic websites. Do not buffer media through a PHP daemon unless your site has next to no traffic or you are bleeding resources.