Celery / Django Single Tasks being run multiple times
- by felix001
I'm facing an issue where a task I place on the queue is being run multiple times.
From the celery logs I can see that the same worker is running the task ...
[2014-06-06 15:12:20,731: INFO/MainProcess] Received task: input.tasks.add_queue
[2014-06-06 15:12:20,750: INFO/Worker-2] starting runner..
[2014-06-06 15:12:20,759: INFO/Worker-2] collection started
[2014-06-06 15:13:32,828: INFO/Worker-2] collection complete
[2014-06-06 15:13:32,836: INFO/Worker-2] generation of steps complete
[2014-06-06 15:13:32,836: INFO/Worker-2] update created
[2014-06-06 15:13:33,655: INFO/Worker-2] email sent
[2014-06-06 15:13:33,656: INFO/Worker-2] update created
[2014-06-06 15:13:34,420: INFO/Worker-2] email sent
[2014-06-06 15:13:34,421: INFO/Worker-2] FINISH - Success
However, when I view the application's own logs, each step shows up 5-6 times (??).
I'm using Django 1.6 with RabbitMQ. Tasks are placed on the queue by calling .delay() on a function.
That function (which has the task decorator applied) then instantiates a class and runs it.
Does anyone have any idea of the best way to troubleshoot this?
Edit: As requested, here's the code.
views.py
In my view I'm sending my data to the queue via ...
from input.tasks import add_queue_project
add_queue_project.delay(data)
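To check whether the view is actually enqueuing the task more than once, one thing I could do is log the id of the AsyncResult that delay() returns. The view name and the data-building line below are just placeholders for my real code:

import logging

from input.tasks import add_queue_project

logger = logging.getLogger(__name__)

def my_view(request):
    data = build_data(request)  # placeholder for however the payload is built
    # delay() returns an AsyncResult; if the task were queued twice,
    # two different ids would show up in this log
    result = add_queue_project.delay(data)
    logger.info("queued add_queue_project with task id %s", result.id)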
tasks.py
from celery.decorators import task
@task()
def add_queue_project(data):
    """ run project """
    logger = logging_setup(app="project")
    logger.info("starting project runner..")
    f = project_runner(data)
    f.main()


class project_runner():
    """ main project runner """

    def __init__(self, data):
        self.data = data
        self.logger = logging_setup(app="project")

    def main(self):
        .... Code
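One thing I want to rule out is whether the task body really runs multiple times, or whether logging_setup() attaches a new handler on every call, so each message gets written once per accumulated handler. A rough sketch of what I mean (my real logging_setup isn't shown above, so this version and the _debug task name are only an illustration; current_task.request.id is the Celery task id):

import logging

from celery import current_task
from celery.decorators import task

def logging_setup(app):
    # illustration only - guard against adding a handler on every call;
    # without this, each logger.info() line is emitted once per handler
    logger = logging.getLogger(app)
    if not logger.handlers:
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

@task()
def add_queue_project_debug(data):
    logger = logging_setup(app="project")
    # if the "duplicate" log lines all carry the same task id, it points at
    # logging, not at the task being delivered/executed more than once
    logger.info("starting project runner.. task id %s", current_task.request.id)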
settings.py
THIRD_PARTY_APPS = (
    'south',           # Database migration helpers
    'crispy_forms',    # Form layouts
    'rest_framework',
    'djcelery',
)
import djcelery
djcelery.setup_loader()
BROKER_HOST = "127.0.0.1"
BROKER_PORT = 5672 # default RabbitMQ listening port
BROKER_USER = "test"
BROKER_PASSWORD = "test"
BROKER_VHOST = "test"
CELERY_BACKEND = "amqp" # telling Celery to report the results back to RabbitMQ
CELERY_RESULT_DBURI = ""
CELERY_IMPORTS = ("input.tasks", )
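For reference, my understanding is that the individual BROKER_* settings above are equivalent to a single BROKER_URL, which might be easier to sanity-check against the RabbitMQ vhost:

BROKER_URL = "amqp://test:test@127.0.0.1:5672/test"  # user:password@host:port/vhost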
celeryd
The line I'm running to start celery is:
python2.7 manage.py celeryd -l info
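To rule out more than one worker process consuming from the same queue, I could also pin it to a single process while testing (celeryd accepts a concurrency option, if I've read the docs right):

python2.7 manage.py celeryd -l info --concurrency=1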
Thanks,