Flask: passing around background worker job (rq, redis)
I've not used rq before, but I see that a job has a `.key` property (the raw Redis key) and an `.id` property (a plain string); the id is the easier one to store in your session. Then you can use the `Job` class's `.fetch` class method, which looks the job up by id (it calls `.refresh()` itself) and returns the job to you. Reading the `.result` property at that point (it's a property, not a method) gives you the job's return value once it has finished; it is `None` while the job is still queued or running, and `job.get_status()` reports the state explicitly.
Maybe like this (untested):
```python
from rq.job import Job

@app.route('/make/')
def make():
    job = q.enqueue(do_something, 'argument')
    session['job'] = job.id  # store the plain string id, not the Job object
    return 'Done'

@app.route('/get/')
def get():
    try:
        # Job.fetch is a class method: pass the id and your Redis connection
        job = Job.fetch(session['job'], connection=redis_conn)
        out = str(job.result)
    except Exception:
        out = 'No result yet'
    return out
```
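Since the snippet above is untested and needs a live Redis, here is a redis-free sketch of the same hand-off pattern: keep only the job's id in the session, and look the job up again when the result is requested. A plain dict stands in for rq's job store, and `enqueue`, `run`, and `fetch_result` are hypothetical stand-ins, not rq APIs:

```python
# A dict standing in for Redis/rq's job store.
jobs = {}

def enqueue(func, *args):
    """Stand-in for q.enqueue(): record the call, return a string job id."""
    job_id = str(len(jobs))
    jobs[job_id] = {'result': None, 'func': func, 'args': args}
    return job_id

def run(job_id):
    """Stand-in for the rq worker actually executing the job."""
    job = jobs[job_id]
    job['result'] = job['func'](*job['args'])

def fetch_result(job_id):
    """Stand-in for Job.fetch(...).result: None until the worker has run."""
    return jobs[job_id]['result']

job_id = enqueue(str.upper, 'argument')  # what /make/ would put in the session
assert fetch_result(job_id) is None      # /get/ before the worker has run
run(job_id)
assert fetch_result(job_id) == 'ARGUMENT'  # /get/ afterwards
```

The point is that only the small, serializable id crosses the request boundary; everything else stays in the shared store.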
The problem is in serializing the arguments: you are actually trying to serialize a function object, which is impossible with pickle.
Try
```python
@app.route('/make/')
def make():
    job = q.enqueue(func=do_something, args=('argument',))
    session['job'] = job.id  # the Job object itself can't be serialized into the session
    return 'Done'
```
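The pickle limitation mentioned above can be seen without rq at all: pickle stores a module-level function only as a reference to its importable name, so a lambda (or any nested function) with no such name cannot be pickled. A minimal demonstration (`do_something` here is just an illustrative function, not from the question):

```python
import pickle

def do_something(arg):
    # Module-level function: picklable, because pickle records its qualified name.
    return arg.upper()

# Round-tripping the function object works...
restored = pickle.loads(pickle.dumps(do_something))
assert restored('x') == 'X'

# ...but a lambda has no importable name, so pickle refuses it.
failed = False
try:
    pickle.dumps(lambda arg: arg.upper())
except Exception:
    failed = True
assert failed
```

This is why rq asks for the function and its arguments separately (`func=`, `args=`): the worker re-imports the function by name and only the arguments need to be serialized.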