Deploying a Dockerized (React + Django + PostgreSQL) app with a custom license to a client without exposing source code
I am running a test simulation on a virtual server in VirtualBox to see how installing the web application with Docker would go on a client's server. My stack includes:
- Frontend: React.js, built into a Docker image
- Backend: Django (Python) in Docker
- Database: PostgreSQL 16 in Docker
- Orchestration: Docker Compose managing all services
- Environment variables: managed via `.env.docker` for the backend (database credentials, email settings, etc.) and via build-time variables for the frontend (API URL)
- License: a custom license mechanism I implemented myself, which must be included and validated on the client server using a `license.json` file as the key sold to clients
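For context, the compose layout on the client machine looks roughly like the sketch below. Service names, image tags, port mappings, and the `license.json` mount path are illustrative placeholders, not my exact files:

```yaml
# docker-compose.yml (illustrative sketch; image names and paths are placeholders)
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: appdb
      POSTGRES_USER: appuser
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - pgdata:/var/lib/postgresql/data

  backend:
    image: myapp-backend:1.0          # pre-built image loaded from a .tar on the client
    env_file: .env.docker             # DB credentials, email settings, etc.
    volumes:
      - ./license.json:/app/license.json:ro   # license key sold to the client
    depends_on:
      - db
    ports:
      - "8000:8000"

  frontend:
    image: myapp-frontend:1.0         # React build; API URL currently baked in at build time
    depends_on:
      - backend
    ports:
      - "80:80"

volumes:
  pgdata:
```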
In my test:
- I built the backend and frontend Docker images locally on my development machine.
- For the frontend, I rebuilt the image with `REACT_APP_API_URL=http://localhost:8000` so that it points to the local backend.
- I exported the backend and frontend images as `.tar` files to simulate distribution to a client server.
- On the client server (the virtual machine), I loaded the images and tried running them with Docker Compose.
- I observed that if the frontend API URL is not baked in at build time, React requests go to `undefined/users/...`.
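Roughly, the steps I followed look like this (image names, tags, and file names are placeholders, and it assumes the frontend Dockerfile declares `ARG REACT_APP_API_URL`):

```bash
# On my development machine: build the images, baking the API URL into the React bundle
docker build -t myapp-backend:1.0 ./backend
docker build -t myapp-frontend:1.0 \
  --build-arg REACT_APP_API_URL=http://localhost:8000 ./frontend

# Export the images to .tar files for distribution (no source code shipped)
docker save -o myapp-backend-1.0.tar myapp-backend:1.0
docker save -o myapp-frontend-1.0.tar myapp-frontend:1.0

# On the client server (VirtualBox VM): load the images and start the stack
docker load -i myapp-backend-1.0.tar
docker load -i myapp-frontend-1.0.tar
docker compose up -d
```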
Question:
For a real client deployment of this stack, what is the best approach to installing and deploying the application with Docker, without exposing the frontend and backend source code, while ensuring that:
- The frontend correctly points to the backend API
- The backend can be configured via environment variables at runtime
- The custom license mechanism is preserved and validated properly
- The client can easily run the application using Docker Compose
Additionally, what should I be aware of in terms of:
- Environment variable management
- Networking between containers on the client server
- Rebuilding or updating images if the backend URL changes