Run a celery worker in the background

Solution 1

supervisor is really simple and requires very little work to set up, and the same applies to running celery under supervisor.

It should not take more than 10 minutes to set it up :)

  1. install supervisor with apt-get

  2. create /etc/supervisor/conf.d/celery.conf config file

  3. paste something like this into the celery.conf file

    [program:celery]
    directory = /my_project/
    command = /usr/bin/python manage.py celery worker
    
  4. plus, if you need them, some optional and useful settings (with dummy values; a complete example follows after this list)

    user = celery_user
    group = celery_group
    stdout_logfile = /var/log/celeryd.log
    stderr_logfile = /var/log/celeryd.err
    autostart = true
    environment=PATH="/some/path/",FOO="bar"
    
  5. restart supervisor (or do supervisorctl reread; supervisorctl add celery)
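
Putting steps 3 and 4 together, a complete celery.conf might look like this (the user, group, paths, and environment values are the dummy placeholders from the steps above, to be adapted to your setup):

    [program:celery]
    ; run from the project root, the directory that contains manage.py
    directory = /my_project/
    command = /usr/bin/python manage.py celery worker
    user = celery_user
    group = celery_group
    stdout_logfile = /var/log/celeryd.log
    stderr_logfile = /var/log/celeryd.err
    autostart = true
    environment = PATH="/some/path/",FOO="bar"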

After that you get the nice supervisorctl commands to manage the celery process:

supervisorctl start/restart/stop celery

supervisorctl tail [-f] celery [stderr]
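
For example, a typical session after editing the config could look like this (you may or may not need sudo depending on your setup; `update` is a one-step alternative to `reread; add`):

    supervisorctl reread          # pick up new/changed config files
    supervisorctl update          # (re)start programs whose config changed
    supervisorctl status celery   # check that the worker is RUNNING
    supervisorctl restart celery  # bounce the worker
    supervisorctl tail -f celery stderr   # follow the worker's stderr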

Solution 2

celery worker -A app.celery --loglevel=info --detach
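
On its own, --detach sends the worker to the background without leaving you a handle on it, so in practice it is worth adding --pidfile and --logfile (both standard celery worker options) so you can find the process and its output later. A sketch, reusing the app module from above with placeholder paths:

    # start detached, keeping track of the pid and the log
    celery worker -A app.celery --loglevel=info --detach \
        --pidfile=/tmp/celery.pid --logfile=/tmp/celery.log

    # later: ask for a warm shutdown (finish running tasks, then exit)
    kill -TERM "$(cat /tmp/celery.pid)"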

Solution 3

I have faced the same problem. A lazy solution is to append & to the end of the command. For example:

celery worker -A <app>.celery --loglevel=info &  
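
Keep in mind that a job backgrounded with a bare & still belongs to your shell and may be killed when you log out (SIGHUP). A common variant that survives logout, assuming a bash-like shell and a placeholder log path:

    # nohup ignores SIGHUP; redirect output so it isn't lost
    nohup celery worker -A <app>.celery --loglevel=info > /tmp/celery.log 2>&1 &

    # or, for an already-running background job (here job %1), detach it from the shell
    disown %1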

Solution 4

This one worked for me; I was using celery with django:

celery -A proj_name worker -l INFO --detach

Comments

  • blueFast about 3 years

    I am running a celery worker like this:

    celery worker --app=portalmq --logfile=/tmp/portalmq.log --loglevel=INFO -E --pidfile=/tmp/portalmq.pid
    

    Now I want to run this worker in the background. I have tried several things, including:

    nohup celery worker --app=portalmq --logfile=/tmp/portal_mq.log --loglevel=INFO -E --pidfile=/tmp/portal_mq.pid >> /tmp/portal_mq.log 2>&1 </dev/null &
    

    But it is not working. I have checked the celery documentation, and I found this:

    Especially this comment is relevant:

    In production you will want to run the worker in the background as a daemon.
    To do this you need to use the tools provided by your platform, or something
    like supervisord (see Running the worker as a daemon for more information).
    

    This is too much overhead just to run a process in the background. I would need to install supervisord on my servers, and get familiar with it. No go at the moment. Is there a simple way of running a celery worker in the background?

  • blueFast over 11 years
    Thanks. I'll try that as soon as possible.
  • Always_a_learner over 7 years
    @Tommaso What are we supposed to write in "directory = /my_project/"?
  • Tommaso Barbugli over 7 years
    @simer the path of your Django project (the command expects manage.py to be in that directory)
  • atb00ker almost 4 years
    Not the best way, see other answers here, but still works as advertised, so I'll up vote it. :-)
  • Berk over 2 years
    I'm wondering if there's any way to restart celery.
  • Kaiss B. about 2 years
    Restarting celery in this case requires you to run "ps aux | grep celery" to find its PID, then kill it with "kill -9 <PID>"