Celery and transaction.atomic
As @dotz mentioned, it is hardly useful to spawn an asynchronous task and then immediately block, waiting for it to finish.

Moreover, if you block on the task this way (the .get() at the end), you can be sure that the mymodel changes just made won't be seen by your worker, because they haven't been committed yet: remember, you're still inside the atomic block.

What you could do instead (from Django 1.9) is delay the task until after the current active transaction is committed, using the django.db.transaction.on_commit hook:

from django.db import transaction

with transaction.atomic():
    mymodel.save()
    transaction.on_commit(lambda: mytask.delay(mymodel.id))

I use this pattern quite often in my post_save signal handlers that trigger some processing of new model instances. For example:

from django.db import transaction
from django.db.models.signals import post_save
from django.dispatch import receiver

from . import models   # Your models defining some Order model
from . import tasks    # Your tasks defining a routine to process new instances


@receiver(post_save, sender=models.Order)
def new_order_callback(sender, instance, created, **kwargs):
    """Automatically triggers processing of a new Order."""
    if created:
        transaction.on_commit(lambda: tasks.process_new_order.delay(instance.pk))

This way, however, your task won't be executed if the database transaction fails and is rolled back. That is usually the desired behavior, but keep it in mind.
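To make that behavior concrete, here is a simplified, hypothetical stand-in for what on_commit does (the FakeTransaction class is an illustration, not Django's implementation): callbacks are queued while the transaction is open, run only on commit, and discarded on rollback.

```python
# Simplified model of transaction.on_commit semantics (illustrative only).
class FakeTransaction:
    def __init__(self):
        self._callbacks = []

    def on_commit(self, func):
        # Don't run yet; just remember the callback.
        self._callbacks.append(func)

    def commit(self):
        # Run every queued callback, then clear the queue.
        callbacks, self._callbacks = self._callbacks, []
        for func in callbacks:
            func()

    def rollback(self):
        # Queued callbacks are discarded; the task is never sent.
        self._callbacks = []


fired = []
tx = FakeTransaction()

tx.on_commit(lambda: fired.append("task sent"))
tx.rollback()
assert fired == []                # nothing runs after a rollback

tx.on_commit(lambda: fired.append("task sent"))
tx.commit()
assert fired == ["task sent"]     # the callback runs only on commit
```

This is why the pattern is safe: the worker can only ever see a committed row, and a rolled-back save never enqueues a task.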

Edit: It's actually nicer to register the on_commit Celery task this way (without a lambda):

transaction.on_commit(tasks.process_new_order.s(instance.pk).delay)
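One reason the signature form is nicer: .s(instance.pk) binds the argument at registration time, whereas a lambda captures the variable by reference and reads it only when the callback finally runs. The pitfall is easy to reproduce in plain Python (functools.partial plays the role of the pre-bound signature here; send_task is a stand-in for .delay):

```python
from functools import partial

calls = []

def send_task(pk):
    # Stand-in for mytask.delay(pk).
    calls.append(pk)

# Lambdas queued in a loop all close over the same variable, so every
# callback sees the loop variable's final value when it eventually runs.
pending = [lambda: send_task(pk) for pk in [1, 2, 3]]
for cb in pending:
    cb()
assert calls == [3, 3, 3]

calls.clear()

# Pre-binding the argument (like task.s(pk).delay) freezes it immediately.
pending = [partial(send_task, pk) for pk in [1, 2, 3]]
for cb in pending:
    cb()
assert calls == [1, 2, 3]
```

In a single signal handler each call gets its own instance, so the lambda is harmless there, but the signature form avoids the trap entirely and is shorter.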


"Separate task" = something that is run by a worker.

"Celery worker" = another process.

I am not aware of any method that would let you share a single database transaction between two or more processes. What you want is to run the task synchronously, inside that transaction, and wait for the result... but if that's what you want, why do you need a task queue at all?


There are some race conditions in Celery tasks. I think this explains your problem.

Take a look at those docs. There are also packages like django-celery-transactions that can help with this.