
How can I manage a queue of requests in my Flask service?


I've never found a good way of using the built-in queue for sharing a work queue between processes. By that I mean gunicorn/uwsgi worker processes, not manager/worker processes spawned by your own code.

My suggestion would be to use something like Redis or RabbitMQ for distributing the workload. There are several frameworks for this, such as Celery, or you can just use pika directly. This makes your application scalable: you can run the API, the work queue, and the workers on separate instances at whatever scale you require.
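As a minimal sketch of that setup, here is a Flask app that hands requests off to Celery with Redis as the broker. The Redis URL, the task body, and the endpoint names are assumptions for illustration, not something from your service:

```python
from celery import Celery
from flask import Flask, jsonify, request

app = Flask(__name__)

# Assumed Redis instance on localhost; both broker and result backend.
celery = Celery(__name__,
                broker="redis://localhost:6379/0",
                backend="redis://localhost:6379/0")


@celery.task
def process_request(payload):
    # Hypothetical long-running work; replace with your actual processing.
    return {"processed": payload}


@app.route("/enqueue", methods=["POST"])
def enqueue():
    # Push the job onto the queue instead of doing the work in-process,
    # so any gunicorn/uwsgi worker can accept the request and any Celery
    # worker (possibly on another machine) can execute it.
    task = process_request.delay(request.get_json())
    return jsonify({"task_id": task.id}), 202


@app.route("/status/<task_id>")
def status(task_id):
    # Poll the result backend for the task's current state.
    result = celery.AsyncResult(task_id)
    return jsonify({"state": result.state})
```

You'd run the Flask app under gunicorn as usual and start the Celery workers as separate processes (e.g. `celery -A yourmodule.celery worker`, assuming the module name), which is what lets you scale the API and the workers independently.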