Storing task state between multiple django processes
Posted by user366148 on Stack Overflow, published 2010-06-14.
I am building a logging bridge between RabbitMQ messages and a Django application, to store background task state in the database for later investigation/review, and also to make it possible to re-publish tasks via the Django admin interface. I guess it's nothing fancy, just a standard Producer-Consumer pattern.
- Web application publishes to message queue and inserts initial task state into the database
- Consumer, which is a separate python process, handles the message and updates the task state depending on task output
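To make the flow above concrete, here is a minimal sketch of the pattern. All names are hypothetical, and two stand-ins are used so it runs anywhere: a `queue.Queue` in place of RabbitMQ and a plain dict in place of the task table.

```python
import queue
import threading

# Hypothetical stand-ins: queue.Queue for RabbitMQ, a dict for the task table.
message_queue = queue.Queue()
task_db = {}

def publish_task(task_id):
    """Web side: insert the initial task state, then publish the message."""
    task_db[task_id] = "pending"       # like Model.save()
    message_queue.put(task_id)         # like basic_publish()

def consume():
    """Consumer side: read messages and update task state in the 'db'."""
    while True:
        task_id = message_queue.get()
        if task_id is None:            # sentinel to stop the worker
            break
        if task_id in task_db:
            task_db[task_id] = "done"  # update state from task output

worker = threading.Thread(target=consume)
worker.start()
for i in range(3):
    publish_task(i)
message_queue.put(None)                # shut the worker down
worker.join()
print(task_db)                         # every task reaches "done"
```

With an in-memory dict the insert is visible immediately, so the race from the question never shows up here; it only appears once the "insert" is a real database transaction that commits after the publish.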
The problem is that some tasks are missing from the database and are therefore never executed. I suspect the consumer receives the message before the database commit has completed. So basically, returning from Model.save() doesn't mean the transaction has ended, and the whole communication breaks down.
Is there any way I could fix this? Maybe some kind of post_transaction signal I could use?
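For what it's worth, modern Django (1.9+) ships exactly this hook as `django.db.transaction.on_commit()`: you register the publish as a callback and it fires only after the surrounding transaction commits (and is dropped on rollback). As a self-contained sketch of the idea, here is a minimal commit-hook mechanism; `FakeTransaction` and all other names are hypothetical, not Django's implementation.

```python
class FakeTransaction:
    """Toy transaction that defers callbacks until 'commit' (block exit)."""
    def __init__(self):
        self.callbacks = []

    def on_commit(self, func):
        # Register work to run only after a successful commit.
        self.callbacks.append(func)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:           # commit: run the deferred publishes
            for func in self.callbacks:
                func()
        return False                   # rollback: callbacks are dropped

published = []

with FakeTransaction() as txn:
    # Save the task row here, then defer the publish until after commit.
    txn.on_commit(lambda: published.append("task-1"))
    assert published == []             # nothing is sent mid-transaction

print(published)                       # the publish runs after the "commit"
```

The effect is that the consumer can never see a message for a row that isn't committed yet. In 2010-era Django, the equivalent workaround was to publish only after explicitly committing (or to use a third-party post-commit signal package).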
Thank you in advance.