What is the best way to handle this: large download via PHP + slow connection from client = script timeout before file is completely downloaded



Use X-Sendfile. Most web servers support it either natively or through a plugin (for Apache, mod_xsendfile).

Using this header you simply specify a local file path and exit the PHP script. The web server sees the header and serves that file itself.
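A minimal sketch of that approach, assuming Apache with mod_xsendfile enabled; the file path and the authorization check are placeholders, not part of the original answer:

```php
<?php
$file = '/var/files/archive.zip';      // hypothetical local path

if (!user_may_download($file)) {       // hypothetical auth check
    header('HTTP/1.1 403 Forbidden');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="archive.zip"');
header('X-Sendfile: ' . $file);        // Apache + mod_xsendfile
// On nginx, the equivalent is X-Accel-Redirect pointing at an internal location:
// header('X-Accel-Redirect: /protected/archive.zip');
exit; // PHP is done here; the web server streams the file to the client
```

Since the server process, not PHP, handles the transfer, the PHP script finishes immediately and the script timeout never applies to the download itself.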


The easy solution would be to disable the timeout. You can do this on a per-request basis with:

set_time_limit(0);

If your script is not buggy, this shouldn't be a problem – unless your server can't handle that many concurrent connections tied up by slow clients.

In that case, #1, #2 and #3 are three good solutions, and I would go with whichever is cheaper. Your concerns about #1 could be mitigated by generating download tokens that can only be used once, or only for a short period of time.
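One way to sketch such one-time, expiring tokens, assuming a `tokens` table with columns `token`, `file`, and `expires_at` (all names here are illustrative):

```php
<?php
// Issue a token that maps to a file and expires in 10 minutes.
function create_token(PDO $db, string $file): string {
    $token = bin2hex(random_bytes(16));
    $stmt  = $db->prepare(
        'INSERT INTO tokens (token, file, expires_at) VALUES (?, ?, ?)');
    $stmt->execute([$token, $file, time() + 600]);
    return $token;
}

// Redeem a token: return the file path if valid, null otherwise.
// The row is deleted either way, so a token works at most once.
function redeem_token(PDO $db, string $token): ?string {
    $stmt = $db->prepare(
        'SELECT file FROM tokens WHERE token = ? AND expires_at > ?');
    $stmt->execute([$token, time()]);
    $file = $stmt->fetchColumn();

    $db->prepare('DELETE FROM tokens WHERE token = ?')->execute([$token]);

    return $file !== false ? $file : null;
}
```

The download URL then carries the token instead of the real file path, so a leaked URL stops working after first use or expiry.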

Option #4, in my opinion, is not a great option. Transfer speed can vary greatly during a download, so any estimate made up front would very likely turn out wrong.


I am a bit reserved about #4. An attacker could forge a fake AJAX request to set your timeout to a very high value, effectively locking your script into an infinite loop. (If you were worried about that in the first place.)

I would suggest a solution similar to @prodigitalson's. You can create hash-named directories, e.g. /downloads/389a002392ag02/myfile.zip, containing symlinks to the real files. Your PHP script redirects to that path, which gets served by the HTTP server, and the symlink directories are deleted periodically.

The added benefit of creating a directory instead of a renamed file is that the end user doesn't see a mangled file name.
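A sketch of that symlink scheme; all paths are examples, and the cleanup is shown as a commented cron-style command:

```php
<?php
$realFile = '/data/files/myfile.zip';         // protected, outside the web root
$hash     = bin2hex(random_bytes(8));         // random directory name
$linkDir  = '/var/www/downloads/' . $hash;    // web-accessible location

mkdir($linkDir, 0755);
symlink($realFile, $linkDir . '/' . basename($realFile));

// Redirect: the HTTP server streams the file, PHP exits immediately.
header('Location: /downloads/' . $hash . '/' . basename($realFile));
exit;

// A periodic job can remove link directories older than, say, an hour:
// find /var/www/downloads -mindepth 1 -maxdepth 1 -mmin +60 -exec rm -rf {} +
```

Because the user's last path component is still myfile.zip, the browser saves the file under its real name.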