Celery with Redis backend
Solution 1
On Ubuntu 10.10, when you install Redis it automatically starts at boot, and that instance was conflicting with Celery: apparently, even if you tell Celery to use some other port, if it can connect to Redis on port 6379 it will use that as a backend. I haven't read the code, but that is my conclusion. So I shut down all the Redis instances, ran my own with my own redis.conf on whatever port I wanted, and then put these in my settings:
BROKER_URL = 'redis://localhost:8889/0'
REDIS_DB = 0
REDIS_CONNECT_RETRY = True
CELERY_RESULT_BACKEND = 'redis'
CELERY_REDIS_PORT = 8889
BROKER_PORT = 8889
CELERY_RESULT_PORT = 8889
CELERY_TASK_RESULT_EXPIRES = 10
CELERYBEAT_SCHEDULER= 'djcelery.schedulers.DatabaseScheduler'
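A quick way to confirm that the boot-started Redis is really gone and only your own instance on 8889 is reachable is to probe the two ports directly. A minimal sketch using only the standard library (the host and ports match the settings above; the expected results are assumptions about your machine, not guaranteed output):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After killing the boot-started Redis and launching your own on 8889,
# you would expect:
#   port_open("localhost", 6379)  -> False (no stray instance for Celery to find)
#   port_open("localhost", 8889)  -> True  (your redis.conf instance)
```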
Solution 2
Do you have import djcelery; djcelery.setup_loader() in your settings.py?
You can find out where it is trying to connect by running the following:
$ python manage.py celeryctl shell
>>> celery.broker_connection().as_uri()
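Whatever URI that prints should agree with the BROKER_URL in your settings. A small standard-library sketch, assuming the redis://localhost:8889/0 value from above, that pulls the pieces apart so you can compare them against CELERY_REDIS_PORT and friends:

```python
from urllib.parse import urlparse  # Python 3; on the Python 2 of this era, use the urlparse module

BROKER_URL = "redis://localhost:8889/0"  # value from the settings above

parts = urlparse(BROKER_URL)
scheme = parts.scheme                  # 'redis'
host = parts.hostname                  # 'localhost'
port = parts.port                      # 8889
db = int(parts.path.lstrip("/") or 0)  # 0

print(scheme, host, port, db)  # redis localhost 8889 0
```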
The setup_loader call is very important, so make sure you include it.
(The good news is that Celery will support Django out of the box as of version 2.7.)
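For context on what setup_loader does: in the django-celery of this era it essentially just tells Celery to use the Django loader via an environment variable, so configuration is read from settings.py. A rough sketch of the idea (reconstructed from memory of the django-celery source, so verify against your installed version):

```python
import os

def setup_loader():
    # Roughly what djcelery.setup_loader() does: point Celery at the
    # Django loader so it reads its configuration from settings.py.
    # setdefault means an explicit CELERY_LOADER env var still wins.
    os.environ.setdefault("CELERY_LOADER", "django")

setup_loader()
print(os.environ["CELERY_LOADER"])
```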
Guillermo Siliceo Trueba
Updated on June 04, 2022

Comments
Guillermo Siliceo Trueba, almost 2 years ago
OK, I have everything installed properly to get Celery + Redis working in Django.
I have

INSTALLED_APPS = (
    'djcelery',
    'kombu.transport.django',
)
These are in my settings
CELERY_REDIS_HOST = 'localhost'
CELERY_REDIS_PORT = 8889
CELERY_REDIS_DB = 0
CELERY_RESULT_BACKEND = 'redis'
BROKER_URL = "redis://localhost:8889/0"
REDIS_CONNECT_RETRY = True
CELERY_IGNORE_RESULT = True
CELERY_SEND_EVENTS = True
CELERY_TASK_RESULT_EXPIRES = 60
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
I got redis working with
./redis-server
It outputs to the terminal every second or so. I can also run
./manage.py celeryd -E -B --loglevel=INFO -n w1.d820
without errors, and I can even see my tasks being added and finishing successfully with
./manage celeryev
So even though it gets logged by celeryev, this code doesn't work:
from celery.task import task

@task
def add(x, y):
    return x + y

res = add.apply_async(args=[1, 5])
print res.wait()
It just hangs without ever returning the result. I can see it's actually trying to get it from Redis, because if I run
./redis-cli MONITOR
I get lots of GETs trying to fetch a nonexistent key. So my conclusion is that Django is not saving to the backend. What's wrong with my settings? I think I am missing something obvious and can't see it because I've been fighting this for too long. Help!
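One thing worth noting about the settings above: CELERY_IGNORE_RESULT = True tells the worker not to store task results at all, and with nothing ever written to the result backend, res.wait() polls forever. A broker-free sketch of that interaction (the dict stands in for the Redis result backend; the names are illustrative, not Celery's internals):

```python
# A dict standing in for the Redis result backend (illustrative only).
backend = {}

def run_task(task_id, func, args, ignore_result):
    value = func(*args)
    if not ignore_result:
        backend[task_id] = value  # with CELERY_IGNORE_RESULT = True,
    return value                  # this store never happens

def get_result(task_id):
    # What a client polling the backend sees: None until a result lands.
    return backend.get(task_id)

run_task("t1", lambda x, y: x + y, (1, 5), ignore_result=True)
print(get_result("t1"))  # None -> the missing key that wait() keeps polling for
```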
Btw I can't use the standard port 6379 because I am on a shared server.
UPDATE: I am using these versions:

celery==2.5.3
django-celery==2.5.5
django-celery-with-redis==2.5