CloudFoundry: nginx for serving static content on top of Gunicorn (Docker)

If you're using Cloud Foundry, it's easy to spin up an extra instance or two to scale your app up and handle more load. I would recommend doing this first, as it's dead simple and works for just about any app type. Also look at client-side caching: you can reduce the load that requests for static files put on your server simply by having those files cached on the client.
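As a minimal sketch of what client-side caching looks like in Flask, you can raise the max-age Flask sends for files in its static folder (the one-year value below is just an illustrative assumption; tune it to how often your assets change):

```python
from flask import Flask

app = Flask(__name__)

# Send long-lived Cache-Control headers for files served from the app's
# static folder, so browsers reuse them instead of re-requesting them on
# every page load. One year is an illustrative value, not a recommendation.
app.config["SEND_FILE_MAX_AGE_DEFAULT"] = 60 * 60 * 24 * 365


@app.route("/")
def index():
    return "Hello from Flask behind Gunicorn"
```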

If you really are serving a lot of static files and it's not cost effective to scale up additional instances of your app, you can do the following:

1.) Push your Flask app using the Python buildpack. This app will be given the main route for your site.

2.) Push your static files using the Static File buildpack, mapped to a separate hostname or context path, for example static.example.com or www.example.com/static. (A sketch of how the Flask app can point at that route follows this list.)
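On the Flask side, the main app's templates then need to generate asset URLs against that separate route. A minimal sketch, assuming a hypothetical STATIC_HOST setting and static_url helper (both names are illustrative, not part of Flask or the buildpack):

```python
from flask import Flask

app = Flask(__name__)

# Hypothetical setting: the route served by the Static File buildpack app.
app.config["STATIC_HOST"] = "https://static.example.com"


@app.template_global()
def static_url(filename):
    """Build an asset URL that points at the separately pushed static app."""
    return f"{app.config['STATIC_HOST']}/{filename.lstrip('/')}"
```

In a template you would then write `{{ static_url('css/site.css') }}` instead of `{{ url_for('static', filename='css/site.css') }}`, so requests for static assets never reach your Gunicorn workers.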

By doing this, any request to the static.example.com route, or to the www.example.com/static route and path, goes to your static files, which are hosted by Nginx (courtesy of the Static File buildpack). Requests to your main route, or to anything outside the static path, end up going to your Python app. The platform handles this and makes sure each request reaches the correct app based on the routes you define for each one.

The only downside is that this relies on your static content being separated out, so that you can map a custom route, or a custom route and path, to it. That said, I don't think this should be an issue, since Flask already keeps static files in their own folder. If it is an issue, you can always map multiple routes + paths, though depending on how your files are structured that may require mapping quite a few of them.

As I mentioned above, this has the advantage of relying on the platform to route static requests to one app and all other requests to another app. If you were to set up Nginx as a proxy in front of Gunicorn yourself, you'd be adding more layers of proxying and more latency to your requests.