Celery task always PENDING
Solution 1
According to the Celery issue "Getting Started: not able to retrieve results; always pending" (https://github.com/celery/celery/issues/2146), this is a Windows-specific problem. Starting the worker with the --pool=solo
option solves the issue.
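For example, assuming a project module named mysite (the module name here is illustrative), the worker would be started like this:

```shell
# Solo pool: tasks execute inline in the worker's main process,
# sidestepping the Windows prefork pool problem.
celery -A mysite worker --pool=solo -l info
```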
Solution 2
Instead of the --pool=solo
option, try -P threads
on Windows.
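A sketch of the equivalent command, again assuming a project module named mysite:

```shell
# Thread pool: tasks run in worker threads instead of forked
# processes, which works on Windows.
celery -A mysite worker -P threads -l info
```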
Solution 3
Setting CELERY_TASK_TRACK_STARTED = True
(or track_started=True on individual tasks) can also help; it enables the STARTED state, so a task that is actually running is no longer reported as PENDING.
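A minimal sketch of both variants (the app instance, broker URLs, and task are illustrative; the lowercase key task_track_started is the plain-Celery equivalent of the Django-style CELERY_TASK_TRACK_STARTED setting):

```python
from celery import Celery

app = Celery('mysite',
             broker='redis://localhost:6379',
             backend='redis://localhost:6379/1')

# Global setting: report the STARTED state for all tasks.
app.conf.task_track_started = True

# Or enable it per task (task body is illustrative):
@app.task(track_started=True)
def long_running(x):
    return x * 2
```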
Solution 4
Remove the ignore_result=False
argument. From the Celery docs:
Task.ignore_result
Don't store task state. Note that this means you can't
use AsyncResult to check if the task is ready, or get its return value.
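For illustration, here is what the two cases look like (hypothetical tasks, assuming an already configured app):

```python
from celery import Celery

app = Celery('mysite',
             broker='redis://localhost:6379',
             backend='redis://localhost:6379/1')

# With ignore_result=True, the task's state and return value are
# never written to the result backend, so AsyncResult stays PENDING:
@app.task(ignore_result=True)
def fire_and_forget(x):
    return x

# With the default (no ignore_result argument), the backend stores
# the state and result, and AsyncResult can report SUCCESS:
@app.task
def add(x, y):
    return x + y
```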
Solution 5
Thanks, everyone.
My worker startup banner:
-------------- celery@DESKTOP-FD38GOO v4.4.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2020-04-17 06:58:18
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: mysite:0x25cfd40d208
- ** ---------- .> transport: redis://localhost:6379//
- ** ---------- .> results: redis://localhost:6379/1
- *** --- * --- .> concurrency: 8 (thread)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. mysite.celery.debug_task
. supplier.tasks.add
. supplier.tasks.count_widgets
. supplier.tasks.count_widgets2
. supplier.tasks.mul
. supplier.tasks.xsum
I have fixed this issue.
I spent about a day on it, and even tried uninstalling and reinstalling Redis on Windows 10 a few times.
At last I found the problem was the missing pool/concurrency configuration.
First solution:
celery -A mysite worker -l info -P threads
Second solution:
celery -A mysite worker -l info --pool=solo
My Celery config:
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_IGNORE_RESULT = False
CELERY_TIMEZONE = TIME_ZONE
CELERY_TRACK_STARTED = True
CELERYD_LOG_FILE = os.path.join(
BASE_DIR, 'celery', 'logs')
CELERYD_LOG_LEVEL = "INFO"
Ivan Gromov
Updated on June 10, 2022
Comments
-
Ivan Gromov almost 2 years
I'm trying to run the Celery example on Windows with a Redis backend. The code looks like this:
from celery import Celery

app = Celery('risktools.distributed.celery_tasks',
             backend='redis://localhost',
             broker='redis://localhost')

@app.task(ignore_result=False)
def add(x, y):
    return x + y

@app.task(ignore_result=False)
def add_2(x, y):
    return x + y
I start the tasks using the IPython console:
>>> result_1 = add.delay(1, 2)
>>> result_1.state
'PENDING'
>>> result_2 = add_2.delay(2, 3)
>>> result_2.state
'PENDING'
It seems that both tasks were not executed, but Celery worker output shows that they succeeded:
[2014-12-08 15:00:09,262: INFO/MainProcess] Received task: risktools.distributed.celery_tasks.add[01dedca1-2db2-48df-a4d6-2f06fe285e45]
[2014-12-08 15:00:09,267: INFO/MainProcess] Task celery_tasks.add[01dedca1-2db2-48df-a4d6-2f06fe285e45] succeeded in 0.0019998550415s: 3
[2014-12-08 15:00:24,219: INFO/MainProcess] Received task: risktools.distributed.celery_tasks.add[cb5505ce-cf93-4f5e-aebb-9b2d98a11320]
[2014-12-08 15:00:24,230: INFO/MainProcess] Task celery_tasks.add[cb5505ce-cf93-4f5e-aebb-9b2d98a11320] succeeded in 0.010999917984s: 5
I've tried to troubleshoot this issue according to the Celery documentation, but none of the advice helped. What am I doing wrong, and how can I receive results from a Celery task?
UPD: I've added a task without the
ignore_result
parameter, but nothing has changed:
@app.task
def add_3(x, y):
    return x + y

>>> r = add_3.delay(2, 2)
>>> r.state
'PENDING'
-
Ivan Gromov over 9 years
I still get the
PENDING
state (see the add_3
method in the UPD) -
xxmajia over 8 years
This also happens on Mac; thanks for the tips
-
Yan Sklyarenko about 4 yearsThis sounds like a comment, not an answer.