How to support multiple Google Cloud Storage buckets in a Django FileField without breaking .url resolution?
I'm using Django with django-storages and the GoogleCloudStorage backend. My model has a FileField like this:
raw_file_gcp = models.FileField(storage=GoogleCloudStorage(bucket_name='videos-raw'))
At runtime, I copy the file to a different bucket (e.g. 'videos-retention') using the GCS Python SDK:
# copy the blob into the retention bucket, keeping the same object name
source_blob = default_bucket.blob(name)
default_bucket.copy_blob(source_blob, retention_bucket, name)
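For context, default_bucket and retention_bucket are ordinary google-cloud-storage Bucket objects, obtained roughly like this (simplified; assumes application default credentials):

from google.cloud import storage

client = storage.Client()  # uses application default credentials
default_bucket = client.bucket('videos-raw')
retention_bucket = client.bucket('videos-retention')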
After copying, I update the .name field on the model instance:
recording.raw_file_gcp.name = f'videos-retention/{name}'
recording.save()
Everything works until I restart the app. Then, when I access .url, Django builds the URL using the original bucket from the storage=... argument, even though the file now lives in another bucket.
It ends up generating invalid URLs like:
https://storage.googleapis.com/videos-raw/https%3A/storage.cloud.google.com/videos-retention/filename.mp4
I expected Django to keep working even if the file was moved to a different bucket.
After debugging, I realized that the FileField is tightly coupled to the bucket its storage instance was configured with when the model class is loaded. If the file moves to another bucket, .url breaks.
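As far as I can tell, this is simply how the URL is resolved: the FieldFile delegates to the field's storage, which only knows about the bucket it was constructed with. Rough equivalent of what happens (my understanding, not the actual django-storages source):

# recording.raw_file_gcp.url effectively does this:
# FieldFile.url -> self.storage.url(self.name)
# GoogleCloudStorage.url() then resolves the name against its own
# configured bucket ('videos-raw'), no matter where the file lives now.
url = recording.raw_file_gcp.storage.url(recording.raw_file_gcp.name)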
ChatGPT suggested removing the FileField altogether, using a CharField instead, and dynamically resolving the GCS bucket + path in a custom @property using google.cloud.storage.Blob.generate_signed_url(), roughly as sketched below.
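For reference, that approach would look roughly like this (the '<bucket>/<object>' path format and the one-hour expiration are my own choices; untested sketch):

from datetime import timedelta

from django.db import models
from google.cloud import storage


class Recording(models.Model):
    # store '<bucket>/<object-name>' instead of relying on FileField's storage
    raw_file_gcp = models.CharField(max_length=500)

    @property
    def raw_file_url(self):
        # resolve the bucket per instance, then sign a temporary URL
        bucket_name, _, blob_name = self.raw_file_gcp.partition('/')
        blob = storage.Client().bucket(bucket_name).blob(blob_name)
        # requires credentials that can sign (e.g. a service account key)
        return blob.generate_signed_url(expiration=timedelta(hours=1), version='v4')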
Is there a better solution for this?
How can I support multiple GCS buckets dynamically with a Django FileField?