Django tests in GitLab CI always use PostgreSQL instead of SQLite despite APP_ENV override

I am running Django tests inside GitLab CI and trying to switch between SQLite (for local/test) and PostgreSQL (for production).
My settings structure looks like this:

settings/__init__.py:


import os

APP_ENV = os.getenv("APP_ENV", "local")

if APP_ENV == "local":
    from .local import DATABASES 
elif APP_ENV == "production":
    from .production import DATABASES 
else:
    raise ValueError("Incorrect configuration")

production.py:

import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.getenv("POSTGRES_DB"),
        "USER": os.getenv("POSTGRES_USER"),
        "PASSWORD": os.getenv("POSTGRES_PASSWORD"),
        "HOST": os.getenv("POSTGRES_HOST"),
        "PORT": os.getenv("POSTGRES_PORT"),
    }
}

local.py:

from .base import BASE_DIR


DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "db.sqlite3",
    }
}

In GitLab CI (.gitlab-ci.yml), my test job looks like this:

backend:test:
    stage: test
    image: python:3.13-slim
    variables:
        APP_ENV: local
    services:
        - name: postgres:15-alpine
          alias: postgres
    script:
        - uv run python manage.py test

What I expected

Tests should use SQLite when APP_ENV=local.

What actually happens

Tests always use PostgreSQL and fail with an error.

Traceback:

django.db.utils.OperationalError: connection to server on socket "/var/run/[MASKED]ql/.s.PGSQL.5432" failed: No such file or directory
Is the server running locally and accepting connections on that socket?

Even though you set APP_ENV=local, Django is likely not using the settings entry point where this variable is checked. This usually happens when DJANGO_SETTINGS_MODULE is pointing directly to a specific settings file (like production), or when environment variables are being overridden elsewhere (for example via a .env file or base settings logic).

Because of that, your conditional import in settings/__init__.py is never applied, and Django continues to use the PostgreSQL configuration.
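One way to confirm this diagnosis before changing anything is to add a debug step to the CI `script:`. This is just a sketch: `diffsettings` is Django's built-in management command that prints settings differing from the defaults, and the `grep` filter is only there for readability:

```yaml
script:
    # Show what the job actually sees before the test run
    - echo "APP_ENV=$APP_ENV DJANGO_SETTINGS_MODULE=$DJANGO_SETTINGS_MODULE"
    # The DATABASES line reveals which engine Django really loaded
    - uv run python manage.py diffsettings | grep DATABASES
    - uv run python manage.py test
```

If the output shows the PostgreSQL engine while `APP_ENV=local` is printed, you know the conditional import in `settings/__init__.py` is being bypassed.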

The fix is to explicitly control which settings module Django loads in CI. You should set DJANGO_SETTINGS_MODULE to the main settings package (the one containing your APP_ENV logic), not to a specific environment file. This ensures that the APP_ENV switch actually runs.
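Under that approach, the job would look roughly like this (`myproject.settings` is a placeholder; substitute the dotted path of the settings package that contains your APP_ENV switch):

```yaml
backend:test:
    stage: test
    image: python:3.13-slim
    variables:
        APP_ENV: local
        # Point at the package whose __init__.py contains the APP_ENV switch,
        # not at settings/local.py or settings/production.py directly.
        DJANGO_SETTINGS_MODULE: myproject.settings  # placeholder module path
    script:
        - uv run python manage.py test
```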

A more robust and recommended approach is to stop relying on APP_ENV for tests and instead create a dedicated test settings file. In that file, you define SQLite as the database. Then, in GitLab CI, you directly point DJANGO_SETTINGS_MODULE to this test settings module. This avoids any ambiguity and guarantees that tests always use SQLite.
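A minimal sketch of such a dedicated test settings file follows. The file name `settings/test.py` is an assumption about your layout; in a real project you would usually start it with `from .base import *` to inherit the shared settings, which is omitted here to keep the sketch self-contained:

```python
# settings/test.py -- hypothetical dedicated settings module for test runs.
# (In practice, add `from .base import *` at the top to reuse base settings.)

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        # An in-memory database keeps CI runs fast and leaves nothing behind.
        "NAME": ":memory:",
    }
}
```

Then point CI at it with `DJANGO_SETTINGS_MODULE: settings.test` (adjusted to your package path) in the job's `variables:` section.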

Additionally, if you are not using PostgreSQL in tests, you should remove the Postgres service from your CI configuration to prevent unintended environment variable interference.

I finally solved this problem. I checked the GitLab docs on PostgreSQL (https://docs.gitlab.com/ci/services/postgres/) to see how to run an instance in a pipeline. The PostgreSQL service needed to be added to the `default` section, not inside the job:

default:
    interruptible: true
    retry: 1
    services:
        - name: postgres
          alias: db,postgres,pg

And the variables needed to be defined globally instead of inside the `services` section:

variables:
    POSTGRES_DB: postgres
    POSTGRES_USER: postgres
    POSTGRES_HOST: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust

Here is the backend:test job:

backend:test:
    stage: test
    image: python:3.13-slim
    before_script:
        - apt-get update && apt-get install -y python3-dev gcc libpq-dev postgresql-client
        - python -m pip install --upgrade pip
        - pip install uv
        - cd backend/api
        - uv sync --frozen
    script:
        - uv run python manage.py check
        - uv run python manage.py test

Thanks for your answers and comments. I hope this helps someone who encounters this issue in the future.
