What is the process to ensure local Django database structure updates get applied on production?

Currently I am running Django 5.1.1, hosted on the Google App Engine standard environment.

When I am working on the site, I have the Cloud SQL Auth Proxy running. This means that any database changes happen directly on the cloud database.

Going forward, I want to have two databases: a test database that is local, and the production one in the cloud that I am using now. I assume this would be accomplished by having settings.py check for environment variables and select the proper database. Then, when I deploy changes to Google, I will use the Google Cloud console to run the migrations that update the production database structure.

Is this the correct way to handle having separate test and production databases? Are there any gotchas or other things to worry about with this setup?

I assume this would be accomplished by having settings.py check for environment variables, and setting the proper database to use.

Yes, use the settings to set the database you want to use: https://docs.djangoproject.com/en/5.1/ref/settings/#databases

Using a .env file is a good fit for that. You would set the username, password, etc. in it.
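A minimal sketch of how such a settings.py fragment could look. The variable names (`USE_CLOUD_SQL`, `DB_NAME`, and so on) are illustrative, not anything Django defines:

```python
# Sketch of a settings.py fragment that picks a database from environment
# variables. All variable names here are illustrative conventions.
import os


def build_databases(env):
    """Return a Django DATABASES setting based on environment variables."""
    if env.get("USE_CLOUD_SQL") == "1":
        # Production: connect through the Cloud SQL Auth Proxy.
        return {
            "default": {
                "ENGINE": "django.db.backends.postgresql",
                "NAME": env["DB_NAME"],
                "USER": env["DB_USER"],
                "PASSWORD": env["DB_PASSWORD"],
                "HOST": env.get("DB_HOST", "127.0.0.1"),
                "PORT": env.get("DB_PORT", "5432"),
            }
        }
    # Local development: a database running on your own machine.
    return {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "myapp_dev",
            "USER": "dev",
            "PASSWORD": "dev",
            "HOST": "localhost",
            "PORT": "5432",
        }
    }


DATABASES = build_databases(os.environ)
```

A library such as python-dotenv can load the .env file into `os.environ` before this code runs; the key point is that the deployed environment (app.yaml's `env_variables`, for example) and your local .env simply set different values.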


Most of the answers on StackOverflow are many years old, so I wanted to confirm that this is a good way to handle this situation.

Yes, this still works the same way. There is no real reason to change it.


Are there any gotchas or other things to worry about with this setup that I have described?

Make sure to use the same engine (Postgres, for example) locally and in production. You can use a different one locally, but there are sometimes small behavioral differences between engines.

For the automated tests, you can run them against SQLite to go fast, but you should run them at least once with your production engine before deploying.


Question in title:

What is the process to ensure local Django database stucture updates get updated on production?

Migrations do that for you: https://docs.djangoproject.com/en/5.1/topics/migrations/#workflow
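The workflow from those docs boils down to a few commands; the Cloud Shell step at the end assumes you run `migrate` there with the Cloud SQL Auth Proxy connected to the production database:

```shell
# Generate migration files from your local model changes
python manage.py makemigrations

# Apply them locally against your dev database
python manage.py migrate

# Commit the migration files alongside your code before deploying
git add myapp/migrations/

# After deploying, run the same command against production
# (e.g. from Cloud Shell with the Auth Proxy running):
python manage.py migrate
```

The migration files are part of your codebase, so the production database is updated by running the exact migrations you already tested locally.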

You can consider using Cloud SQL for MySQL or Cloud SQL for PostgreSQL, which is managed and scaled by Google and supported by Django. Both support Database Migration Service (DMS) for migrating data from a source database (for example, a local MySQL or PostgreSQL instance used for testing) to a Cloud SQL destination database (Cloud SQL for MySQL or PostgreSQL as the production database). DMS makes it easier for you to migrate your data to Google Cloud.

You mentioned you are using the Cloud SQL Auth Proxy, which works by running a local client in your environment that creates a connection to the Cloud SQL instance. The diagram below, from Google Cloud, shows how the Cloud SQL Auth Proxy connects to Cloud SQL:

[Diagram: the Cloud SQL Auth Proxy running alongside a local client, connecting it to a Cloud SQL instance]
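For reference, starting the v2 proxy is a single command; the instance connection name below is a placeholder for your own `project:region:instance` value:

```shell
# Listen on localhost:5432 and forward connections to the Cloud SQL instance
./cloud-sql-proxy --port 5432 my-project:us-central1:my-instance
```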

Additionally, as per this documentation:

Django apps that run on Google Cloud are running on the same infrastructure that powers all of Google's products, which generally improves the application's ability to adapt to a variable workload.

I hope this helps.
