How to keep data after switching to Docker for deployment (Wagtail in an existing Django project)
Let me explain what is going on. I am working on a Wagtail project that was set up by following these steps: How to add Wagtail into an existing Django project (https://docs.wagtail.org/en/stable/advanced_topics/add_to_django_project.html)
I am now trying to switch the project to Docker for deployment. Note that I import an SQL dump into Postgres, and I aim to run the app on uWSGI. Everything builds and starts with no errors, but the pages show nothing, and when I open the admin there are no existing pages (which means I would have to create all the pages from scratch).
Here are some of the commands I used:
- docker-compose -f docker-compose-deploy.yml build
- docker-compose -f docker-compose-deploy.yml up
- docker-compose -f docker-compose-deploy.yml run --rm app sh -c "python manage.py createsuperuser"
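Before anything else, it is worth checking whether the imported data actually reached the Postgres instance the app container talks to. A hedged sketch, assuming the service is named `db` as in the compose file and a standard Wagtail schema (the `wagtailcore_page` table exists in any Wagtail install):

```shell
# Open a psql shell inside the running "db" service
docker-compose -f docker-compose-deploy.yml exec db psql -U "$DB_USER" -d "$DB_NAME"

# Then, inside psql, count the Wagtail pages. A result of 0 rows (or a
# "relation does not exist" error) means the dump never landed in this
# database, which would explain the empty site and empty admin:
#   SELECT COUNT(*) FROM wagtailcore_page;
```

If the count is zero, the dump was imported into some other database (for example, the host's Postgres rather than the container's).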
In case it is needed, here is the Dockerfile:
FROM python:3.9-slim-bookworm
LABEL maintainer="londonappdeveloper.com"

ENV PYTHONUNBUFFERED=1

# Install dependencies
RUN apt-get update && apt-get install --no-install-recommends -y \
        build-essential \
        libpq-dev \
        gcc \
        exiftool \
        imagemagick \
        libmagickwand-dev \
        libmagic1 \
        redis-tools \
    && apt-get purge -y --auto-remove -o APT::AutoRemove::RecommendsImportant=false \
    && rm -rf /var/lib/apt/lists/*

# Set up virtual environment
RUN python -m venv /py && \
    /py/bin/pip install --upgrade pip

# Copy application files
COPY ./requirements.txt /requirements.txt
COPY . /app
COPY ./scripts /scripts

WORKDIR /app
EXPOSE 8000

# Install Python dependencies
RUN /py/bin/pip install -r /requirements.txt

# Set up directories for static and media files
RUN adduser --disabled-password --no-create-home app && \
    mkdir -p /vol/web/static && \
    mkdir -p /vol/web/media && \
    chown -R app:app /vol && \
    chmod -R 755 /vol && \
    chmod -R +x /scripts

ENV PATH="/scripts:/py/bin:$PATH"

USER app

CMD ["run.sh"]
docker-compose-deploy.yml:
version: '3.9'

services:
  app:
    build:
      context: .
    restart: always
    volumes:
      - static-data:/vol/web
    environment:
      - DB_HOST=db
      - DB_NAME=${DB_NAME}
      - DB_USER=${DB_USER}
      - DB_PASS=${DB_PASS}
      - SECRET_KEY=${SECRET_KEY}
      - ALLOWED_HOSTS=${ALLOWED_HOSTS}
      - REDIS_URL=${REDIS_URL}
    depends_on:
      - db
      - redis
    links:
      - db:db
      - redis:redis

  db:
    image: postgres:13-alpine
    restart: always
    volumes:
      - postgres-data:/var/lib/postgresql/data
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}

  proxy:
    build:
      context: ./proxy
    restart: always
    depends_on:
      - app
    ports:
      - 80:8000
    volumes:
      - static-data:/vol/static

  redis:
    image: redis:alpine

volumes:
  postgres-data:
  static-data:
I expected to see the project working as it did before applying Docker, but now it is empty: nothing shows on the site, and the admin page lists no pages at all.
Answering based on yesterday's comment about import conflicts when importing into a database that already has migrations applied:
Option 1: connect to the Postgres database, drop your $DB_NAME database, recreate it, then import the dump before running the migrations. Reference
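A sketch of Option 1 using the compose file from the question (service name `db`, env vars `DB_NAME`/`DB_USER`); the dump file name `dump.sql` is an assumption, so adjust it to your actual file:

```shell
# Drop and recreate the database. Connect to the maintenance database
# "postgres" so we are not dropping the database we are connected to.
docker-compose -f docker-compose-deploy.yml exec db \
    psql -U "$DB_USER" -d postgres -c "DROP DATABASE \"$DB_NAME\";"
docker-compose -f docker-compose-deploy.yml exec db \
    psql -U "$DB_USER" -d postgres -c "CREATE DATABASE \"$DB_NAME\" OWNER \"$DB_USER\";"

# Stream the dump (assumed to be plain SQL at ./dump.sql) into the fresh
# database. -T disables the pseudo-TTY so stdin redirection works.
docker-compose -f docker-compose-deploy.yml exec -T db \
    psql -U "$DB_USER" -d "$DB_NAME" < dump.sql
```

Since the dump already contains the schema and the `django_migrations` table, importing it first means a later `migrate` sees everything as applied instead of conflicting with freshly created tables.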
Option 2: configure your Postgres container to import your data dump on initialization. To do this, you will first have to destroy the data volume you have for Postgres, since this import only happens the first time the database is initialized. For details, see the "Initialization scripts" section of the official image docs.
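A sketch of Option 2 as a change to the `db` service in the compose file above, assuming the dump lives at `./db/dump.sql` on the host (a hypothetical path). The official postgres image runs any `*.sql`/`*.sh` files found in `/docker-entrypoint-initdb.d`, but only when the data directory is empty, which is why the old volume must be removed first:

```yaml
  db:
    image: postgres:13-alpine
    restart: always
    volumes:
      - postgres-data:/var/lib/postgresql/data
      # Files mounted here run once, on first initialization only
      - ./db/dump.sql:/docker-entrypoint-initdb.d/dump.sql
    environment:
      - POSTGRES_DB=${DB_NAME}
      - POSTGRES_USER=${DB_USER}
      - POSTGRES_PASSWORD=${DB_PASS}
```

To force re-initialization, remove just the database volume (for example `docker volume rm <project>_postgres-data` after `docker-compose -f docker-compose-deploy.yml down`) rather than `down -v`, which would also delete the `static-data` volume.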