Celery: decrease the number of processes

Solution 1

I tried setting concurrency to 1 and max_tasks_per_child to 1 in my settings.py file and ran 3 tasks at the same time. It spawns one process as my user and the other two as celery. It should run just one task and wait for it to finish before starting the next.

I am using django-celery.
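For reference, the settings.py attempt looks roughly like this, using the old django-celery setting names (a sketch based on the question; as the EDIT below shows, the daemonized worker never picked these values up):

# settings.py (django-celery style settings)
CELERYD_CONCURRENCY = 1          # size of the worker's process pool
CELERYD_MAX_TASKS_PER_CHILD = 1  # replace each pool process after it runs one task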

EDIT {

I was assigning concurrency by writing CELERYD_CONCURRENCY = 1 in my settings.py file. But when I looked at the celery log file with "tail -f /var/log/celery/w1.log", I saw a concurrency value of 8. This told me that settings.py does not change the concurrency. To fix the issue I added the following lines to the "/etc/default/celeryd" file.

# Extra arguments to celeryd
CELERYD_OPTS="--concurrency=1"

Now the second task in the queue waits until the first is finished.
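To apply and verify the change, restart the worker daemon and watch for the concurrency line in the startup banner (assuming the stock celeryd init script and the log path from above):

$ sudo /etc/init.d/celeryd restart
$ tail -f /var/log/celery/w1.log | grep -i concurrency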

}

Solution 2

The celery worker --concurrency option lets you specify the number of child pool processes that consume tasks from the queue.
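For example, a minimal invocation (a sketch; proj is a placeholder app name, and with django-celery the equivalent is python manage.py celeryd --concurrency=1):

$ celery worker --app=proj --concurrency=1 --loglevel=INFO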

Solution 3

I have this in my celeryd config file:

CELERYD_NODES=2

which results in

$ ps -ef | grep "celery" | grep -v "grep"
www-data  1783     1  0 17:50 ?        00:00:46 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery1.xxx-31-39-06-74-75 --logfile=/var/log/celery/1.log --pidfile=/var/run/celery/1.pid 
www-data  1791  1783  0 17:50 ?        00:00:01 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery1.xxx-31-39-06-74-75 --logfile=/var/log/celery/1.log --pidfile=/var/run/celery/1.pid
www-data  1802     1  0 17:50 ?        00:00:52 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery2.xxx-31-39-06-74-75 --logfile=/var/log/celery/2.log --pidfile=/var/run/celery/2.pid 
www-data  1858  1802  0 17:50 ?        00:00:01 /usr/bin/python /opt/webapps/repo/manage.py celeryd --loglevel=INFO -n celery2.xxx-31-39-06-74-75 --logfile=/var/log/celery/2.log --pidfile=/var/run/celery/2.pid

There are FOUR processes, not two, but there are two worker nodes. Each node consists of a parent (master) process and its pool child process, as the parent PIDs in the ps output show. So presumably if you set CELERYD_NODES to 3, you will get 3 workers but 6 processes.
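If the goal is to keep the total process count down, the node count and the per-node pool size can be combined in the same config (a sketch, assuming the /etc/default/celeryd layout from Solution 1):

CELERYD_NODES=1
CELERYD_OPTS="--concurrency=1"

With one node and a pool of one, ps should show just two processes: the master and its single pool child.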

Comments

  • Nikolay Fominyh about 2 years

    Is there any way to limit the number of worker processes in Celery? I have a small server, and Celery always creates 10 processes on a single-core machine. I want to limit this to 3 processes.

  • Nikolay Fominyh almost 12 years
    CELERYD_CONCURRENCY = 1 is not working; it still spawns 10 processes. And --concurrency=1 has the same effect.
  • mher almost 12 years
    Have you tried killing all Python processes before launching? Maybe they are old or unrelated processes.
  • Nikolay Fominyh almost 12 years
    --autoscale=1,1 doesn't help... very strange. (A usage sketch for --autoscale follows these comments.)
  • Nikolay Fominyh almost 12 years
    I agree on 3 processes, but for me it always spawns 10.
  • fatrock92 almost 12 years
    I only tried three processes; I don't know the limit, though. The main thing is that if concurrency is 1, why are the subsequent tasks even running? They should be on hold.
  • Nikolay Fominyh over 11 years
    This option is ignored for me. CELERYD_NODES = 1 results in 10 processes, and CELERYD_NODES = 2 also results in 10 processes.
  • Nikolay Fominyh over 11 years
    Wow! Editing /etc/default/celeryd helped me! Thank you very much! =)
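For completeness, the --autoscale flag mentioned in the comments takes a max,min pair of pool sizes, so capping the pool at 3 processes would look something like this (a sketch; proj is a placeholder app name):

$ celery worker --app=proj --autoscale=3,1 --loglevel=INFO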