No module named 'project_name' with Django + Celery?

Day 2 of debugging this and I have to turn to Stack Overflow; I'm on the edge.

I used cookiecutter-django to generate my project months ago.

Project structure:

    project_name/
        config/
            settings/
                ...
        src/
            app_name/
                __init__.py
        manage.py

When I create celery.py inside config/ I get an error because the file shadows the celery package itself. After some googling I tried absolute_import, but in the end I just named the file celery_app.py, the way the cookiecutter-django folks do.

celery_app.py inside config/:

import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.base")

app = Celery("my_awesome_project")  # I tried this with 10 different names; it doesn't make any difference

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
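
For reference, the CELERY_-prefixed keys that config_from_object picks up live in the Django settings. Roughly this (a sketch, not my exact file, assuming Redis on the default local port):

# config/settings/base.py (sketch)
# Keys with the CELERY_ prefix are read by
# app.config_from_object("django.conf:settings", namespace="CELERY")
CELERY_BROKER_URL = "redis://localhost:6379/0"      # assumes a local Redis on the default port
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"  # optional: keep task results in Redis too
CELERY_TASK_SERIALIZER = "json"
CELERY_RESULT_SERIALIZER = "json"
CELERY_ACCEPT_CONTENT = ["json"]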

__init__.py inside config/:

#from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery_app import app as celery_app

__all__ = ('celery_app',)
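
And a minimal tasks.py, just to show the shape of what autodiscover_tasks() is supposed to pick up (app_name and the task are placeholders, not my real code):

# src/app_name/tasks.py (placeholder sketch)
from celery import shared_task

@shared_task
def add(x, y):
    # shared_task binds to the celery_app imported in config/__init__.py
    return x + y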

Last lines of the traceback:

  File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
  File "<frozen importlib._bootstrap>", line 984, in _find_and_load_unlocked
ModuleNotFoundError: No module named '<project_name>'

There was a bug with Python 3.7 and Celery where you had to downgrade importlib-metadata with pip3 install importlib-metadata==4.13.0. I am on Python 3.9.6 and have tried both downgrading and installing the latest importlib-metadata.

I tried running all variations of celery -A config.celery_app worker -l info and celery -A config.celery_app:app worker -l DEBUG from the root folder.

From cookiecutter-django README.md:

To run a celery worker:

cd my_awesome_project
celery -A config.celery_app worker -l info

Please note: For Celery's import magic to work, it is important where the celery commands are run. If you are in the same folder with manage.py, you should be right.

I am also using Redis: redis-server is running on localhost, and inside redis-cli, ping returns PONG, so that part works.

Where I started: https://realpython.com/asynchronous-tasks-with-django-and-celery/

Posts with similar problems:

(Django-Celery Error) ImportError: No module named myproject
Celery ImportError: No module named proj
KeyError / frozen importlib._bootstrap error on second library import in spyder
Error while running celery worker : ModuleNotFoundError: No module named 'mysite'
https://www.reddit.com/r/django/comments/vcqr5e/celery_does_not_discovers_tasks_inside_django/

"Working" example: https://github.com/stuartmaxwell/django-celery-example

If anybody has had a similar error, please leave some feedback; I would really appreciate it.

Update: I added the following to celery_app.py, just like in the regular settings, and now it works:

import sys
from pathlib import Path
ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent  # config/ -> project root
sys.path.append(str(ROOT_DIR / "src"))  # make the apps under src/ importable

It seems that when the worker is started this way, src/ never ends up on sys.path, so the project apps can't be imported and you get the error above.
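
For context, the top of config/celery_app.py now presumably looks something like this (a sketch; the key point is that the path tweak runs before Celery/Django load the settings and the apps that live under src/):

# config/celery_app.py (sketch)
import os
import sys
from pathlib import Path

from celery import Celery

# Make the apps under src/ importable before Django settings/apps are loaded.
ROOT_DIR = Path(__file__).resolve(strict=True).parent.parent
sys.path.append(str(ROOT_DIR / "src"))

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.base")

app = Celery("my_awesome_project")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()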
