Celery on Docker to generate files in the application directory

I have a Django application that uses Celery as a delayed-task mechanism. There are some file operations I'd like to hand off to Celery to relieve Django of time-consuming work. Here's what my docker-compose looks like:

...
services:
  django: &django
    build:
      context: .
      dockerfile: ./compose/local/django/Dockerfile
    image: my_project_local_django
    container_name: my_project_local_django
    depends_on:
      - postgres
      - redis
    volumes:
      - .:/app:z
      - ./media:/app/media  # To make sure /media is shared with the Celery containers.
    env_file:
      - ./.envs/.local/.django
      - ./.envs/.local/.postgres
    ports:
      - "8000:8000"
    command: /start

  postgres:
    ...

  redis:
    image: redis:6
    container_name: my_project_local_redis
    volumes:
      - my_project_local_local_redis_data:/data

  celeryworker:
    <<: *django
    image: my_project_local_celeryworker
    container_name: my_project_local_celeryworker
    volumes:
      - ./media:/app/media  # To ensure Celery has access to the local /media directory.
    depends_on:
      - redis
      - postgres
    ports: []
    command: /start-celeryworker

  celerybeat:
    <<: *django
    image: my_project_local_celerybeat
    container_name: my_project_local_celerybeat
    volumes:
      - ./media:/app/media  # To ensure Celery has access to the local /media directory.
    depends_on:
      - redis
      - postgres
      - django
    restart: always
    ports: []
    command: /start-celerybeat

  flower:
    <<: *django
    image: my_project_local_flower
    container_name: my_project_local_flower
    ports:
      - "5555:5555"
    command: /start-flower

And here's the task. So far I'm just testing a simple scenario of generating a blank file.

...
@celery_app.task()
def generate_file(object_id):
    lock_key = f"generate_file_task_lock:{object_id}"

    # Try to acquire the lock
    if redis_client.set(lock_key, "locked", ex=60, nx=True):  # `nx=True` ensures the lock is only set if it doesn't exist
        try:
            obj = Model.objects.get(pk=object_id)

            dir_path = os.path.join("media/user_files", obj.user.username)
            file_path = os.path.join(dir_path, str(obj.pk))

            # Create a blank file
            os.makedirs(dir_path, exist_ok=True)
            with open(file_path, "w+") as file:
                file.write("")

            if os.path.exists(file_path):
                logger.info(f"File created successfully: {file_path}")
            else:
                logger.error(f"File not found after writing: {file_path}")

            return f"Object {object_id} processed."
        except Exception as e:
            logger.error(f"Error creating file: {str(e)}", exc_info=True)
            raise  # Re-raise the exception so Celery logs it
        finally:
            redis_client.delete(lock_key)  # Release the lock when the task is finished
    else:
        return f"Object {object_id} already has a running task."

I've simplified this code a bit so there's less fluff.
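
For context, the task gets queued with Celery's standard delay call, roughly like this (simplified; some_object here just stands in for whatever model instance triggers it):

generate_file.delay(some_object.pk)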

When I run this task, the logs indicate that the file is generated successfully, but it doesn't show up in my project files. I'm assuming it might be created inside the Celery container. I tried to make sure Celery has access to the /media directory, but I must be missing something, and I'm out of ideas on how to solve it.
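
If it helps narrow things down, this is the kind of extra logging I can drop into the task right after the file is written, to see where it actually lands (just a debugging sketch; logger and file_path are the ones already in the task, the rest is standard library):

import os
import socket

# Report the absolute path, the container hostname, and the worker's working
# directory, so it's clear which filesystem the file was written to.
logger.info(
    "Wrote %s on host %s (cwd=%s)",
    os.path.abspath(file_path),
    socket.gethostname(),
    os.getcwd(),
)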

On a side note, if I generate files in a directory other than /media (the project root, for example), they do appear in my project. The only problem is that they aren't exposed for downloading, so it makes sense to use the /media folder if I want them to be downloadable.
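
For reference, this is roughly how I'd build the path from Django settings instead of the hard-coded "media/..." prefix, assuming settings.MEDIA_ROOT points at /app/media in every container (a sketch only; build_user_file_path is a hypothetical helper, not something in my code yet):

import os

from django.conf import settings


def build_user_file_path(username, pk):
    # Resolving under MEDIA_ROOT keeps the result independent of the
    # worker's current working directory.
    dir_path = os.path.join(settings.MEDIA_ROOT, "user_files", username)
    os.makedirs(dir_path, exist_ok=True)
    return os.path.join(dir_path, str(pk))

That still wouldn't explain why files written under media/ don't show up on the host, though.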
