Django Celery get task count
Solution 1
If your broker is configured as redis://localhost:6379/1, and your tasks are submitted to the default celery queue, then you can get the queue length as follows:
import redis
queue_name = "celery"
client = redis.Redis(host="localhost", port=6379, db=1)
length = client.llen(queue_name)
Or, from a shell script (good for monitors and such):
$ redis-cli -n 1 -h localhost -p 6379 llen celery
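The question (asked in the comments at the bottom) is about letting users back off when the server is overloaded; that check is just a threshold on the queue length. A minimal sketch under assumptions: the queue name "celery" (Celery's default), the threshold of 100, and the helper names are all illustrative, and `client` is any object with an `llen` method, e.g. the `redis.Redis` instance from the snippet above:

```python
def queue_length(client, queue_name="celery"):
    """Number of pending messages in a Redis-backed Celery queue.

    ``client`` is e.g. a redis.Redis instance connected to the same
    host/db as the broker URL.
    """
    return client.llen(queue_name)

def is_overloaded(client, threshold=100):
    """True when more tasks are waiting than we are willing to accept."""
    return queue_length(client) > threshold
```

A request handler could then refuse (or warn about) new task submissions whenever `is_overloaded` returns True.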
Solution 2
Here is a broker-agnostic way to get the number of messages in a queue using Celery. By using connection_or_acquire, you can minimize the number of open connections to your broker by utilizing Celery's internal connection pooling.
celery = Celery(app)
with celery.connection_or_acquire() as conn:
    message_count = conn.default_channel.queue_declare(
        queue='my-queue', passive=True).message_count
You can also extend Celery to provide this functionality:
from celery import Celery as _Celery

class Celery(_Celery):

    def get_message_count(self, queue):
        '''
        Raises: amqp.exceptions.NotFound: if queue does not exist
        '''
        with self.connection_or_acquire() as conn:
            return conn.default_channel.queue_declare(
                queue=queue, passive=True).message_count
celery = Celery(app)
num_messages = celery.get_message_count('my-queue')
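Since `passive=True` makes `queue_declare` raise when the queue does not exist (and on the Redis broker an emptied queue's key is deleted), callers may prefer to treat a missing queue as empty. A hedged sketch: `safe_message_count` is a hypothetical wrapper around the `get_message_count` extension above, and it keys off the AMQP 404 reply code instead of importing a specific exception class:

```python
def safe_message_count(app, queue):
    """Queue depth via get_message_count(), treating a missing queue
    (AMQP reply code 404) as empty."""
    try:
        return app.get_message_count(queue)
    except Exception as exc:
        # amqp's NotFound carries reply_code == 404; a queue that was never
        # used, or whose Redis key vanished when it emptied, simply has no
        # pending tasks.
        if getattr(exc, "reply_code", None) == 404:
            return 0
        raise
```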
Solution 3
If you have already configured redis in your app, you can try this:
from celery import Celery
QUEUE_NAME = 'celery'
celery = Celery(app)
client = celery.connection().channel().client
length = client.llen(QUEUE_NAME)
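A commenter below notes that this snippet opens a new Redis connection on every call and never releases it. A sketch that borrows (and returns) a pooled connection instead, assuming `connection_or_acquire` is available on the app as in Solution 2; the function name is illustrative:

```python
def redis_queue_length(app, queue_name="celery"):
    """Queue length using a pooled broker connection.

    The context manager returns the connection to Celery's pool on exit,
    avoiding the leaked connection of the snippet above.
    """
    with app.connection_or_acquire() as conn:
        return conn.default_channel.client.llen(queue_name)
```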
Solution 4
Get the Redis client instance used by Celery, then check the queue length. Don't forget to release the connection every time you use it (use .acquire):
# Get a configured instance of celery:
from project.celery import app as celery_app
def get_celery_queue_len(queue_name):
    with celery_app.pool.acquire(block=True) as conn:
        return conn.default_channel.client.llen(queue_name)
Always acquire a connection from the pool, don't create it manually. Otherwise, your redis server will run out of connection slots and this will kill your other clients.
Solution 5
I'll expand on @StephenFuhry's answer around the not-found error, because a more or less broker-agnostic way of retrieving the queue length is beneficial even though Celery suggests messing with the broker directly. In Celery 4 (with the Redis broker) this error looks like:
ChannelError: Channel.queue_declare: (404) NOT_FOUND - no queue 'NAME' in vhost '/'
Observations:
- ChannelError is a kombu exception (in fact, it's amqp's, and kombu "re-exports" it).
- On the Redis broker, Celery/Kombu represent queues as Redis lists.
- Redis collection-type keys are removed whenever the collection becomes empty.
- If we look at what queue_declare does, it has these lines:

  if passive and not self._has_queue(queue, **kwargs):
      raise ChannelError(...)

- Kombu's Redis virtual transport's _has_queue is this:

  def _has_queue(self, queue, **kwargs):
      with self.conn_or_acquire() as client:
          with client.pipeline() as pipe:
              for pri in self.priority_steps:
                  pipe = pipe.exists(self._q_for_pri(queue, pri))
              return any(pipe.execute())

The conclusion is that on a Redis broker a ChannelError raised from queue_declare is okay (for an existing queue, of course), and just means that the queue is empty.
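One subtlety visible in _has_queue above: the Redis transport shards each logical queue across priority sub-lists, so a single llen on the bare queue name can undercount when task priorities are in use. A sketch of summing across the candidate keys; the separator '\x06\x16' and the steps [0, 3, 6, 9] are assumptions based on kombu's defaults, and the helper names are illustrative:

```python
PRIORITY_STEPS = [0, 3, 6, 9]   # assumed kombu default priority steps
PRIORITY_SEP = '\x06\x16'       # assumed kombu default key separator

def priority_keys(queue):
    """Candidate Redis keys kombu may use for one logical queue.
    Priority 0 uses the bare queue name; other priorities get a suffix."""
    return [queue if pri == 0 else f"{queue}{PRIORITY_SEP}{pri}"
            for pri in PRIORITY_STEPS]

def total_queue_length(client, queue):
    """Sum llen over every priority sub-list (mirrors _has_queue's loop)."""
    return sum(client.llen(key) for key in priority_keys(queue))
```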
Here's an example of how to output the lengths of all active Celery queues (normally these should be 0, unless your worker can't cope with the tasks).
from kombu.exceptions import ChannelError
def get_queue_length(name):
    with celery_app.connection_or_acquire() as conn:
        try:
            ok_nt = conn.default_channel.queue_declare(queue=name, passive=True)
        except ChannelError:
            return 0
        else:
            return ok_nt.message_count

for queue_info in celery_app.control.inspect().active_queues().values():
    print(queue_info[0]['name'], get_queue_length(queue_info[0]['name']))
Comments
-
maazza almost 2 years: I am currently using Django with Celery and everything works fine. However, I want to give users the opportunity to cancel a task if the server is overloaded, by checking how many tasks are currently scheduled. How can I achieve this? I am using Redis as the broker. I just found this: Retrieve list of tasks in a queue in Celery. It is somewhat related to my issue, but I don't need to list the tasks, just count them :)
-
Lal over 8 years: Please provide some kind of explanation too to support your answer.
-
Stephen Fuhry over 8 years: @Lal Added some explanation of the approach - hope that helps!
-
Simanas almost 8 years: amqp.exceptions.NotFound: Queue.declare: (404) NOT_FOUND - no queue 'default' in vhost '/' - because my queue is not in vhost '/', it's in vhost '/apples'. How do I get to that vhost?
-
Jiang Jun almost 7 years: For Redis, client = app.broker_connection().channel().client
-
Bojan Jovanovic about 6 years: Even though this is a correct solution for the Redis broker, please mark @Stephen Fuhry's answer as the correct solution, as it is broker-agnostic.
-
Mario about 6 years: Setting passive to 'False' also works and circumvents the 404 NOT FOUND issue.
-
Max Malysh about 6 years: This will create a new hanging Redis connection every time you run this code. You have to release the opened connection and channel.
-
mccc about 4 years: @Mario that would end up creating the non-existing exchange, though, which is most likely not desired.