(disclaimer: I am new to Django and Celery)
I am using Celery as part of a scraping tool that ends up generating a task for every endpoint it needs to pull from. When looking at the logs, or at Flower/Celery Insights, it's difficult to tell which endpoint a given task is processing. To my knowledge, my options are to A) set the task name, B) set the task ID, C) attach metadata, or D) use some combination of A, B, and C.
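For option B, for example, my understanding is that I could pass a custom task_id when calling the task and read it back inside a bound task via self.request.id. A rough sketch of what I mean (the scrape_endpoint task and the readable-id scheme are just made up for illustration):

from celery import Celery

celery_app = Celery('celery_test', broker='redis://default:redis@redis:6379/0')

@celery_app.task(bind=True)
def scrape_endpoint(self, url):
    # self.request.id is whatever id the task was submitted with
    print(f"[{self.request.id}] pulling from {url}")

# Pass a human-readable id instead of letting Celery generate a UUID
scrape_endpoint.apply_async(args=('https://example.com/api/items',), task_id='scrape_endpoint.items')

That helps with the ID column, but as far as I can tell it does not change the task name shown in Flower.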
What I have been struggling with is how to generate tasks with unique names from a generator/iteration.
To keep things simple, let's say I have the following:
from celery import Celery, group
import logging
from time import sleep

logger = logging.getLogger(__name__)

CELERY_BROKER_URL = 'redis://default:redis@redis:6379/0'
celery_app = Celery('celery_test', broker=CELERY_BROKER_URL, backend=CELERY_BROKER_URL)

@celery_app.task(bind=True)
def print_num(self, num):
    logger.info(f"Processing number: {num}")
    # Simulate some processing
    sleep(20)
    logger.info(f"Number Processed: {num}")

category_tasks = group(print_num.s(num) for num in range(10))
result = category_tasks.apply_async()
This generates 10 tasks, all named celery_test.print_num, each with a different UUID as its task ID. What I am trying to accomplish is to have the task names relate to the number they are processing, i.e. celery_test.print_num.1, celery_test.print_num.2, … celery_test.print_num.10.
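The closest I can come up with, sticking with option B, is giving each signature in the group a readable task ID rather than a new name, roughly like this (untested sketch; I am not certain that .set(task_id=...) is honoured inside a group):

category_tasks = group(
    print_num.s(num).set(task_id=f'celery_test.print_num.{num}')
    for num in range(10)
)
result = category_tasks.apply_async()

But even if that works, it only changes the ID; every task would still show up under the single celery_test.print_num name.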
If anyone has any insight into how this could be accomplished, or knows that it simply cannot be done, please let me know.
Thank you