I’m using Celery 5.4.0 with a RabbitMQ broker.
I have a list of long-running tasks, and three worker processes available to run them.
These are the relevant settings in my Celery app configuration:
worker_pool="prefork",
worker_concurrency=3,
worker_max_tasks_per_child=1,
worker_prefetch_multiplier=1,
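For context, this is roughly how I apply those settings (the app name and broker URL here are placeholders, not my real values):

```python
from celery import Celery

# Placeholder app name and broker URL for illustration.
app = Celery("tasks", broker="amqp://guest:guest@localhost:5672//")

app.conf.update(
    worker_pool="prefork",
    worker_concurrency=3,
    worker_max_tasks_per_child=1,
    worker_prefetch_multiplier=1,
)
```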
When I start the tasks, Celery picks up three of them and processes them in parallel. However, once one of the pool processes finishes its task, Celery doesn’t fetch another task from the broker queue for it. It only does so once the third process has finished its task.
This is suboptimal for me: I’d prefer a new task to start as soon as a process becomes free.
Could this be a bug, or am I misunderstanding something? I thought worker_prefetch_multiplier=1 would address this, but the behavior persists.
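My understanding (which may be wrong, and is part of what I’m asking about) is that the worker negotiates its AMQP prefetch limit as concurrency times the multiplier, so even with a multiplier of 1 the worker reserves several messages up front:

```python
# Sketch of how I understand the worker's basic.qos prefetch count to be
# derived (assuming early acks, the default), using the values from my config.
worker_concurrency = 3
worker_prefetch_multiplier = 1

# The worker reserves up to concurrency * multiplier unacked messages at once,
# so all three in-flight tasks are claimed by this one worker from the start.
prefetch_count = worker_concurrency * worker_prefetch_multiplier
print(prefetch_count)  # 3
```

If that's right, I don't see how lowering the multiplier further would help, since 1 is the minimum.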