Running multiple workers using Celery
Solution 1
I have now updated my answer following the comment from MartinP about the worker spawning child processes, not threads:
A Celery worker and its worker processes are different things (read this for reference).
When a worker is started, it spawns a certain number of child processes.
By default, the number of child processes equals the number of CPU cores on the machine.
On Linux you can check the number of cores via:
$ nproc --all
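Since the default concurrency is derived from the core count, you can also check it from Python's standard library (a quick sketch; `os.cpu_count` reports the same machine-wide count that the default is based on):

```python
import os

# Number of CPU cores visible to the OS — Celery's default
# worker concurrency is based on this value.
cores = os.cpu_count()
print(cores)
```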
Otherwise you can specify it yourself, e.g.:
$ celery -A proj worker --loglevel=INFO --concurrency=2
In the above example there is one worker that will spawn 2 child processes. It is normally advised to run a single worker per machine, with the concurrency value defining how many processes run in parallel. However, if you need to run multiple workers, you can start them as shown below:
$ celery -A proj worker -l info --concurrency=4 -n wkr1@hostname
$ celery -A proj worker -l info --concurrency=2 -n wkr2@hostname
$ celery -A proj worker -l info --concurrency=2 -n wkr3@hostname
Refer to the Celery docs for more info.
Solution 2
It looks like your worker is running only a single process/thread. You probably just need to add the --concurrency (or -c) argument when starting the worker to spawn multiple (parallel) worker processes.
celery -A proj worker -c 4
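For completeness, a minimal sketch of the module that the `-A proj` flag assumes; the broker URL is a placeholder for your RabbitMQ instance, so treat this as configuration to adapt rather than a drop-in file:

```python
# proj.py — hypothetical module matching `celery -A proj worker -c 4`
from celery import Celery

# Placeholder broker URL; point this at your own RabbitMQ server.
app = Celery('proj', broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y
```

With the worker started as above, calling `add.delay(2, 3)` from another process enqueues the task on RabbitMQ, and any of the 4 child processes can pick it up.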
Updated on September 18, 2022
Comments
-
Admin almost 2 years
I need to read from RabbitMQ and execute tasks in parallel using Celery on a single system.
[2014-12-30 15:54:22,374: INFO/Worker-1] ...
[2014-12-30 15:54:23,401: INFO/Worker-1] ...
[2014-12-30 15:54:30,878: INFO/Worker-1] ...
[2014-12-30 15:54:32,209: INFO/Worker-1] ...
[2014-12-30 15:54:33,255: INFO/Worker-1] ...
[2014-12-30 15:54:48,445: INFO/Worker-1] ...
[2014-12-30 15:54:49,811: INFO/Worker-1] ...
[2014-12-30 15:54:50,903: INFO/Worker-1] ...
[2014-12-30 15:55:39,674: INFO/Worker-1] ...
[2014-12-30 15:55:41,024: INFO/Worker-1] ...
[2014-12-30 15:55:42,147: INFO/Worker-1] ...
It seems only one worker process is running all the time, i.e. tasks execute one after another in sequential order. How can I configure Celery to run multiple worker processes in parallel?
-
MartinP almost 6 years: The concurrency parameter doesn't run threads. It runs child processes by default, so it processes tasks in parallel - docs.celeryproject.org/en/latest/reference/…
-
Thomas John over 5 years: How do concurrency and threads relate to each other? If the default concurrency is the number of cores of the machine, what will be the number of threads? Is it configurable?
-
FragLegs almost 5 years: @ThomasJohn I think the default number of threads is still the number of cores of the machine. Generally, you'll want to set the --concurrency flag higher for thread-based workers.