Is it correct to modify `django.db.connections.databases` dynamically to handle multiple databases? Or is it better to deploy a separate API per client?
This is my first time developing a multi-tenant SaaS application in Django. In this SaaS, each company has its own PostgreSQL database, and these databases are created dynamically when a company registers. I cannot predefine all of them in settings.DATABASES, as companies can register at any time and should not require a server restart.
My current solution uses a middleware that detects the company from the subdomain or a JWT token and then modifies connections.databases at runtime to configure the connection to the company's database:
import redis
from django.core.exceptions import ImproperlyConfigured
from django.db import connections
from django.utils.connection import ConnectionDoesNotExist
from rest_framework_simplejwt.authentication import JWTAuthentication

from myapp.models import Company  # Company model stored in the global ('default') database


class CompanyDBMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        self.jwt_authenticator = JWTAuthentication()
        self.cache = redis.Redis(host='localhost', port=6379, db=0)

    def __call__(self, request):
        company_db = self.get_database_for_company(request)
        if not company_db:
            raise ImproperlyConfigured("Could not determine the company's database.")

        # Register the connection only if it does not already exist in `connections.databases`
        if company_db not in connections.databases:
            connections.databases[company_db] = {
                'ENGINE': 'django.db.backends.postgresql',
                'NAME': company_db,
                'USER': 'postgres',
                'PASSWORD': 'your_password',
                'HOST': 'localhost',
                'PORT': '5432',
                'CONN_MAX_AGE': 60,  # Avoid opening and closing connections on every request
            }

        request.company_db = company_db
        response = self.get_response(request)

        # Close the tenant connection after the response
        try:
            connections[company_db].close()
        except ConnectionDoesNotExist:
            pass

        return response

    def get_database_for_company(self, request):
        subdomain = request.get_host().split('.')[0]
        cache_key = f"company_db_{subdomain}"

        # Try Redis first to avoid hitting the global database on every request
        company_db = self.cache.get(cache_key)
        if company_db:
            return company_db.decode("utf-8")

        try:
            company = Company.objects.using('default').get(subdomain=subdomain, active=True)
            company_db = company.db_name
            self.cache.setex(cache_key, 300, company_db)  # Cache the database name for 5 minutes
            return company_db
        except Company.DoesNotExist:
            return None
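For context, the views then pick the tenant database explicitly via the alias the middleware registered. A minimal sketch, assuming .using() is called with request.company_db (Invoice is an illustrative model name, not one from the code above):

from django.http import JsonResponse
from myapp.models import Invoice

def invoice_list(request):
    # request.company_db was set by CompanyDBMiddleware
    invoices = Invoice.objects.using(request.company_db).values('id', 'total')
    return JsonResponse(list(invoices), safe=False)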
My questions are:
✅ Is it correct to modify connections.databases dynamically on each request to handle multiple databases?
✅ Is there a better way to do this in Django without restarting the application when registering new databases?
✅ How does this practice affect performance in environments with load balancing and multiple Django instances?
✅ Would it be better to deploy a separate API per client in its own Django container?
✅ I am considering giving each client their own portal on a separate domain and only deploying their frontend in a container while keeping a centralized API. Is this approach more efficient?
I appreciate any recommendations on best practices or potential issues with this approach. Thanks! 😊
The Django documentation explicitly states that you shouldn't be doing this:
You shouldn’t alter settings in your applications at runtime. The only place you should assign to settings is in a settings file.
Consider using a database router, or writing your own database backend by extending django.db.backends.postgresql. A minimal sketch of the router approach follows.
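The sketch below assumes the middleware stores the chosen alias in a thread-local that the router reads; TenantRouter and _local are illustrative names, not part of Django:

import threading

_local = threading.local()

def set_current_tenant(alias):
    # Call this from the middleware instead of (or in addition to) request.company_db
    _local.tenant = alias

class TenantRouter:
    """Route reads and writes to the tenant database chosen for the current request."""

    def db_for_read(self, model, **hints):
        # Returning None lets Django fall back to the 'default' alias
        return getattr(_local, 'tenant', None)

    def db_for_write(self, model, **hints):
        return getattr(_local, 'tenant', None)

    def allow_migrate(self, db, app_label, **hints):
        return True

The router is then activated in settings, e.g. DATABASE_ROUTERS = ["myapp.routers.TenantRouter"]. This keeps per-request database selection out of your views, but it still requires the tenant aliases to be present in the connection settings.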
You can use the django-tenants library. It's designed for exactly this purpose.
https://django-tenants.readthedocs.io/en/latest/
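Note that django-tenants isolates each tenant in its own PostgreSQL schema within a single database rather than creating a separate database per company. A minimal settings sketch, assuming an illustrative "customers" app that holds the tenant and domain models (see the docs linked above for the full setup):

DATABASES = {
    'default': {
        'ENGINE': 'django_tenants.postgresql_backend',
        'NAME': 'saas',
        'USER': 'postgres',
        'PASSWORD': 'your_password',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}

DATABASE_ROUTERS = ('django_tenants.routers.TenantSyncRouter',)

MIDDLEWARE = [
    'django_tenants.middleware.main.TenantMainMiddleware',
    # ... the rest of your middleware
]

SHARED_APPS = ['django_tenants', 'customers']  # models kept in the public schema
TENANT_APPS = ['myapp']                        # models created in each tenant schema
INSTALLED_APPS = list(SHARED_APPS) + [app for app in TENANT_APPS if app not in SHARED_APPS]

TENANT_MODEL = 'customers.Company'        # subclass of TenantMixin
TENANT_DOMAIN_MODEL = 'customers.Domain'  # subclass of DomainMixin

New tenants are then created by saving a tenant model instance, which provisions the schema on the fly, so no server restart is needed when a company registers.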