Sunday, 4 May 2014

python - Celery does not return values or fetch new tasks until restarted - Stack Overflow


All of a sudden Celery has started acting up on us and has stopped fetching tasks.


Info:



  • We're using Celery 3.0.10.

  • We're using django-celery 3.0.10.

  • We're using RabbitMQ.


The problem:


Celery is not logging the output from a task until it is restarted (loglevel INFO). That would be acceptable, but it is not fetching a new task until it is restarted either. And no, the tasks are not deadlocked: I have a print statement right before the return of a boolean, and it shows up in the log.


Example:


from celery import task

@task()
def my_task(username):
    print "Start"
    result = api_call(username)
    print "Finished", result
    return result

In the logs we see:


[2014-02-25 20:45:28,300: INFO/MainProcess] Got task from broker: my_project.my_app.my_task[475ff845-6a63-4b7b-9e02-4ce198043707]
[2014-02-25 20:45:29,667: WARNING/Worker-X] Start
[2014-02-25 20:45:29,667: WARNING/Worker-X] Finished, True

Then nothing. Until I restart:


[2014-02-25 21:08:15,081: INFO/MainProcess] Task my_project.my_app.my_task[475ff845-6a63-4b7b-9e02-4ce198043707] succeeded in 1392992732.81s: True
...

Here the three dots stand for the next "Got task from broker" iteration, and so on. The behaviour is the same no matter how many workers we run; they all do it, just concurrently.


Any ideas why?


Edit:


After investigating further I've realised that each worker takes exactly four tasks before locking up, and I get the print before the return for each of them, so none of the tasks are actually stuck. All four task returns are logged after the restart, and looking at the RabbitMQ queue there are still tasks waiting to be processed. It is as if each worker can only hold four task results in memory and never flushes them until Celery is restarted; then they are flushed and it fetches a new task. But all tasks are processed properly.
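As an aside, the "exactly four tasks" figure happens to match Celery 3.x's default prefetch multiplier of 4 (each worker process reserves up to four messages at a time). Whether that is the cause here is an assumption, not something established above, but while diagnosing one can reduce prefetching so each worker reserves one message at a time. A minimal Django settings sketch for django-celery / Celery 3.x:

```python
# settings.py -- diagnostic settings for django-celery / Celery 3.x.
# CELERYD_PREFETCH_MULTIPLIER defaults to 4: each worker process
# reserves up to four unacknowledged messages from RabbitMQ.
CELERYD_PREFETCH_MULTIPLIER = 1  # reserve one message at a time

# Acknowledge a task only after it finishes, so a stalled worker
# does not sit on reserved messages (trade-off: redelivery on crash).
CELERY_ACKS_LATE = True
```

With these settings a stalled worker holds at most one unprocessed message instead of four, which makes the "takes N tasks then stops" pattern easier to observe and reason about.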




It turned out to be an incompatibility between my version of Celery and the version of Kombu available to it. It didn't work with any Kombu version that was in the repo at the time (an older one was needed), so upgrading both Celery and Kombu fixed it.
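As a sketch of the fix (the commands are generic, not the exact ones from the original setup; check your Celery version's release notes for the Kombu range it requires):

```shell
# Upgrade Celery and Kombu together so their versions stay compatible.
pip install --upgrade celery kombu

# Then confirm which versions are actually installed:
pip show celery kombu | grep -E '^(Name|Version)'
```

Upgrading the two packages in one command avoids ending up with a Celery/Kombu pair that pip resolved independently and that do not work together.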



