Celery picks up new tasks, but all old ones are stuck in PENDING

In a nutshell: I'm able to run new Celery tasks, but the old ones are stuck in the PENDING state.

Celery run command:

celery -A config.celery_app worker --pool=solo --loglevel=info

Environment:

Celery 5.4.0, redis-server 7.4.1 on WSL (Windows Subsystem for Linux)
The Celery worker and the Django app run on Windows; the redis-server runs in WSL.

I ran Celery a couple of times and everything was fine, but now some old tasks are constantly reported as "PENDING" when I read their state in Django with AsyncResult(task_id).
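For reference, the state check on the Django side looks roughly like this (the function and variable names are just placeholders):

    from celery.result import AsyncResult

    def get_task_state(task_id):
        result = AsyncResult(task_id)  # looks the task up in the configured result backend
        return result.state            # for the old task ids this is now always "PENDING"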

Purging the queue with celery -A config.celery_app purge didn't change anything either.

I've looked at a couple of threads on this, but none seem to deal with my exact problem (I'm still able to execute new tasks).

I wonder if it's a problem with running on WSL or something similar, because I'm also unable to see the task queues from the app or the CLI:

celery inspect active_queues

gives kombu.exceptions.OperationalError: [WinError 10061] No connection could be made because the target machine actively refused it

and doing:

    from config.celery_app import app  # the project's Celery app instance

    i = app.control.inspect()
    scheduled = i.scheduled()

shows no tasks
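For context, the broker and result backend settings are essentially the project defaults; the actual URL comes from an environment variable, so treat these values as an approximation:

    # config/settings/base.py (values are approximate)
    CELERY_BROKER_URL = "redis://localhost:6379/0"  # Redis running inside WSL
    CELERY_RESULT_BACKEND = CELERY_BROKER_URL       # results stored in the same Redis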

The project is built with django-cookiecutter, and the Celery task is basically defined as:

    @app.task
    def my_task(**kwargs):
        return my_function(**kwargs)
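The tasks in question were queued in the usual way, with the returned id stored for the later state check (names are illustrative):

    result = my_task.delay(some_kwarg="value")
    task_id = result.id  # saved and later passed to AsyncResult, now always "PENDING"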