Consider a task defined as follows:
<code>from celery import Celery
from django.conf import settings

app = Celery(
    'api',
    broker=settings.CELERY_BROKER_URL,
    backend=settings.CELERY_BROKER_URL,
    timezone=settings.CELERY_TIMEZONE,
    task_track_started=settings.CELERY_TASK_TRACK_STARTED,
    task_time_limit=settings.CELERY_TASK_TIME_LIMIT,
    broker_connection_retry_on_startup=settings.CELERY_BROKER_CONNECTION_RETRY_ON_STARTUP,
)

# This would fail to connect because I couldn't get TLS to work this way
# app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()


@app.task(queue='periodic')
def some_task(corpus_id):
    return 123
</code>
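For context, the CELERY_* names above come straight from my Django settings. The values below are placeholders for illustration; only the setting names and the rediss:// (TLS) scheme match my actual setup, with the broker URL corresponding to the transport shown in the worker banner further down:
<code># settings.py -- placeholder values, for illustration only
CELERY_BROKER_URL = "rediss://redis:6379/1"  # TLS Redis, same URL used for broker and result backend
CELERY_TIMEZONE = "UTC"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 60 * 60
CELERY_BROKER_CONNECTION_RETRY_ON_STARTUP = True
</code>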
A submitted task is (and stays) PENDING.
<code>r = some_task.apply_async([4], queue='gpu')
r.state == 'PENDING'
</code>
Nothing is logged in the worker:
<code>-------------- celery@gpuworker v5.4.0 (opalescent)
--- ***** -----
-- ******* ---- Linux-5.15.0-118-generic-x86_64-with-glibc2.39 2024-09-10 21:55:59
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app: api:0x7ff754494700
- ** ---------- .> transport: rediss://redis:6379/1
- ** ---------- .> results: rediss://redis:6379/1
- *** --- * --- .> concurrency: 1 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> gpu exchange=gpu(direct) key=gpu
[tasks]
. api.celery.some_task
[2024-09-10 21:55:59,900: INFO/MainProcess] Connected to rediss://redis:6379/1
[2024-09-10 21:55:59,974: INFO/MainProcess] mingle: searching for neighbors
[2024-09-10 21:56:01,098: INFO/MainProcess] mingle: all alone
[2024-09-10 21:56:01,406: INFO/MainProcess] celery@gpuworker ready.
</code>
I checked whether the job shows up anywhere in app.control.inspect():
<code>i = app.control.inspect()
i.reserved(), i.active(), i.registered(), i.scheduled()
# (None, None, None, None)
</code>
Because there is no exception, no error, and no log entry in the worker, I’m clueless about where to begin debugging this. Is there some way to debug it? Is it fair to assume the Redis connection works, since if there were connection issues there would have been an exception on apply_async, or the state would at least have become FAILED at some point? Can I check things on the AsyncResult r (r.failed() == False and r.ready() == False)? And why isn’t the task listed in any of the app’s inspect() output?
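For reference, these are the checks I mean, run against the AsyncResult returned by the apply_async call above. Nothing in this sketch is new information; it just spells out the values I observe:
<code># r is the AsyncResult returned by apply_async above
r.state     # 'PENDING' -- and it stays that way
r.failed()  # False
r.ready()   # False
</code>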
I also checked that app and some_task.app are the same, to ensure I’m not comparing apples and oranges:
<code>>>> app
<Celery api at 0x7f26e52b71c0>
>>> some_task.app
<Celery api at 0x7f26e52b71c0>
</code>
A solution would of course be great, but I would already be very happy with some sanity checks or inspection suggestions.