Django: Best way to create logs when updating a model in bulk

I am developing a Django system where I have a main model and a log/history model to record changes. My main model:

from django.contrib.auth.models import User
from django.db import models


class Modelo(models.Model):
    user_a = models.ForeignKey(User, on_delete=models.CASCADE, related_name="delegated_signatures")
    user_b = models.ForeignKey(User, on_delete=models.CASCADE, related_name="received_delegations")
    is_active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)

And the history model:

class ModeloLogHistory(models.Model):
    modelo = models.ForeignKey(Modelo, on_delete=models.CASCADE)
    changed_by = models.ForeignKey(User, on_delete=models.DO_NOTHING)
    change_type = models.CharField(max_length=50, choices=ChangeType.CHOICES)
    previous_data = models.JSONField(null=True, blank=True)
    new_data = models.JSONField(null=True, blank=True)
    created_at = models.DateTimeField(auto_now_add=True)

The logging logic works well for individual changes, but there is a scenario where an administrator can deactivate all records at once (is_active = False). This could generate a large volume of log entries in the database, so I am wondering about the best approach to handle these bulk changes efficiently.
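
For reference, this is roughly what the bulk deactivation would look like with the current per-record logging (the function name and the change_type value are illustrative, not my exact code):

from django.db import transaction


def deactivate_all(changed_by):
    # Naive approach: one log INSERT and one UPDATE per record.
    with transaction.atomic():
        for modelo in Modelo.objects.filter(is_active=True):
            ModeloLogHistory.objects.create(
                modelo=modelo,
                changed_by=changed_by,
                change_type="deactivated",  # illustrative value from ChangeType.CHOICES
                previous_data={"is_active": True},
                new_data={"is_active": False},
            )
            modelo.is_active = False
            modelo.save(update_fields=["is_active"])

With thousands of records this means thousands of individual INSERTs into the history table, which is exactly the volume I am worried about.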
