Flask large file download

If you serve binary files, don't iterate line by line: a binary file may contain just one enormous "line", so you would still load the whole file into RAM at once.

The only proper way to read large files is via chunks:

CHUNK_SIZE = 8192

def read_file_chunks(path):
    with open(path, 'rb') as fd:
        while True:
            buf = fd.read(CHUNK_SIZE)
            if not buf:
                break
            yield buf

Then it's safe to call stream_with_context on this chunk reader, e.g. if you serve video files:

from flask import Response, stream_with_context
from werkzeug import exceptions as exc

@app.route('/videos/<name>')
def serve_video(name):
    fp = resource_path_for(name)
    if fp.exists():
        return Response(
            stream_with_context(read_file_chunks(fp)),
            headers={
                'Content-Disposition': f'attachment; filename={name}'
            }
        )
    else:
        raise exc.NotFound()
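For video specifically, it can also be worth knowing about Flask's built-in send_file(), which streams the file for you and, with conditional=True, answers HTTP Range requests so players can seek without downloading the whole file. A minimal sketch (VIDEO_DIR and the temp-file setup are illustrative stand-ins, not part of the original answer):

```python
import os
import tempfile

from flask import Flask, send_file

app = Flask(__name__)

# Hypothetical storage location; point this at your real media directory.
VIDEO_DIR = tempfile.mkdtemp()

@app.route('/videos/<name>')
def serve_video(name):
    path = os.path.join(VIDEO_DIR, name)
    if not os.path.isfile(path):
        return ('Not found', 404)
    # conditional=True enables HTTP Range handling, so a request
    # with a Range header gets a 206 partial response.
    return send_file(path, conditional=True)
```

A player requesting `Range: bytes=0-3` would then receive only the first four bytes with status 206 instead of the entire file.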

Under the hood, Flask's response machinery takes each chunk from the generator read_file_chunks(fp) and flushes it to the connection before pulling the next one. Once flushed, a chunk is no longer referenced and is reclaimed by the garbage collector, so only one chunk at a time needs to sit in RAM.
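You can observe the chunking behavior in isolation, without Flask at all: write a file a bit larger than two chunks and confirm the generator yields two full chunks plus a small remainder (the temp file here is just for the demonstration):

```python
import os
import tempfile

CHUNK_SIZE = 8192

def read_file_chunks(path):
    with open(path, 'rb') as fd:
        while True:
            buf = fd.read(CHUNK_SIZE)
            if not buf:
                break
            yield buf

# Create a file of 2 full chunks plus 100 extra bytes.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, 'wb') as f:
    f.write(b'x' * (2 * CHUNK_SIZE + 100))

# Each iteration holds only one chunk in memory.
sizes = [len(chunk) for chunk in read_file_chunks(path)]
os.remove(path)
```

`sizes` comes out as `[8192, 8192, 100]`: at no point does the whole 16 KiB file live in memory at once.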


Since your file is large and dynamically generated, I'd suggest not using send_from_directory() to send it.

Check out the Flask streaming documentation for how to stream files (sending small chunks of data instead of the full file): http://flask.pocoo.org/docs/1.0/patterns/streaming/

from flask import Response

@app.route('/large.csv')
def generate_large_csv():
    def generate():
        for row in iter_all_rows():
            yield ','.join(row) + '\n'
    return Response(generate(), mimetype='text/csv')

The snippet above streams a CSV file from Flask, emitting one row per chunk.
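You can sanity-check such an endpoint with Flask's test client. Here iter_all_rows() is a hypothetical stand-in for your real row source (the original snippet doesn't define it):

```python
from flask import Flask, Response

app = Flask(__name__)

def iter_all_rows():
    # Stand-in data source; in practice this would pull rows
    # from a database cursor or similar lazy iterator.
    yield ['a', 'b']
    yield ['1', '2']

@app.route('/large.csv')
def generate_large_csv():
    def generate():
        for row in iter_all_rows():
            yield ','.join(row) + '\n'
    return Response(generate(), mimetype='text/csv')

client = app.test_client()
resp = client.get('/large.csv')
```

Reading `resp.data` consumes the generator, yielding `b'a,b\n1,2\n'` with the `text/csv` mimetype.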

However, if your file is static, the Flask docs recommend serving it with a dedicated web server such as nginx in production.