Python Flask API with stomp listener multiprocessing
It sounds like Celery is exactly what you need. Celery is a task queue that can distribute work across worker processes and even across machines. Miguel Grinberg wrote a great post about this, showing how to accept tasks via Flask and hand them off to Celery workers. Good luck!
To resolve this issue I decided to run the Flask API and the message queue listener as two entirely separate applications in the same Docker container. I installed and configured supervisord to start and manage the two processes individually.
```ini
[supervisord]
nodaemon=true
logfile=/home/appuser/logs/supervisord.log

[program:gunicorn]
command=gunicorn -w 1 -c gunicorn.conf.py "app:create_app()" -b 0.0.0.0:8081 --timeout 10000
directory=/home/appuser/app
user=appuser
autostart=true
autorestart=true
stdout_logfile=/home/appuser/logs/supervisord_worker_stdout.log
stderr_logfile=/home/appuser/logs/supervisord_worker_stderr.log

[program:mqlistener]
command=python3 start_listener.py
directory=/home/appuser/mqlistener
user=appuser
autostart=true
autorestart=true
stdout_logfile=/home/appuser/logs/supervisord_mqlistener_stdout.log
stderr_logfile=/home/appuser/logs/supervisord_mqlistener_stderr.log
```