Is there any benefit to using `URLValidator` on a django `URLField`

Is there any benefit to using URLValidator on a django URLField in models.py or does the URLField already do all the necessary validation?

Also is it recommended to use it to enforce https? For example:

from django.core.validators import URLValidator
from django.db import models

class Processor(models.Model):
    website = models.URLField(
        max_length=250, 
        blank=True, 
        null=True,
        validators=[URLValidator(schemes=['https'])]  # Enforce HTTPS
    )

The URLField is actually a CharField for a URL, and it's validated by URLValidator (a RegexValidator subclass that ensures a value looks like a URL and raises a ValidationError with code 'invalid' if it doesn't). The length check against max_length is enforced separately, as on any CharField.

The URLField takes the optional max_length argument. If you don’t specify max_length, a default of 200 is used.

So yes, the URLField already ensures that the value entered looks like a URL and is consistent with the max_length argument of the field. You wouldn't want to use a TextField for URLs, would you?

As for your other question: yes, you can provide a list of URL schemes to validate against and restrict it to https. Note that if schemes is not provided, the default list is ['http', 'https', 'ftp', 'ftps'].
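As a rough standalone sketch of what that scheme restriction amounts to (using only urllib.parse; this is an illustration, not Django's actual regex-based validator, which checks much more than the scheme):

```python
from urllib.parse import urlsplit

def check_scheme(url, allowed_schemes=("http", "https", "ftp", "ftps")):
    """Mimic URLValidator's scheme check: the scheme must be on the allow-list.

    The default allow-list here mirrors URLValidator's default schemes.
    """
    return urlsplit(url).scheme.lower() in allowed_schemes

# With the defaults, http passes; restricted to https only, it does not.
check_scheme("http://example.com")                               # True
check_scheme("http://example.com", allowed_schemes=("https",))   # False
check_scheme("https://example.com", allowed_schemes=("https",))  # True
```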

See URLValidator.

Django's URLField [GitHub] is defined as:

class URLField(CharField):
    default_validators = [validators.URLValidator()]
    # …

So it is essentially a CharField, by default with a maximum of 200 characters, that already has a URLValidator() among its default validators. Adding a plain URLValidator() to validators is therefore redundant: explicitly passed validators run in addition to the default ones, so you would just run the same check twice. (A URLValidator(schemes=['https']), on the other hand, does add a stricter check.)
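A minimal sketch of how that merging behaves (the names default_validators and validators follow Django's Field API, but the classes here are simplified stand-ins, not Django's implementation):

```python
class FakeURLValidator:
    """Stand-in for django.core.validators.URLValidator."""
    def __init__(self, schemes=None):
        self.schemes = schemes or ["http", "https", "ftp", "ftps"]

class FakeURLField:
    # Mirrors URLField.default_validators = [validators.URLValidator()]
    default_validators = [FakeURLValidator()]

    def __init__(self, validators=()):
        self._validators = list(validators)

    @property
    def validators(self):
        # Like Django's Field.validators: defaults first, then explicit ones
        return [*self.default_validators, *self._validators]

# Passing a plain URLValidator just duplicates the default check:
plain = FakeURLField(validators=[FakeURLValidator()])
len(plain.validators)  # 2
```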

Also is it recommended to use it to enforce https?

Not when you just store URLs. A model field only stores data in the database, and that's it; the question is what you are doing with that data. If you, for example, allow people to enter a website URL for their profile, then you could allow any scheme.

If you however want to start scraping the site, then it might be better to enforce a secure connection, but that is not per se something the model should be doing: the function that scrapes the data could first verify that the URL uses https or ftps before doing so.
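A sketch of that check at the point of use (the helper name and the exact policy are assumptions for illustration; only urllib.parse is real here):

```python
from urllib.parse import urlsplit

SECURE_SCHEMES = {"https", "ftps"}

def ensure_secure(url):
    """Raise if the URL does not use a secure scheme; return it unchanged otherwise."""
    if urlsplit(url).scheme.lower() not in SECURE_SCHEMES:
        raise ValueError(f"refusing insecure URL: {url}")
    return url
```

The scraping function would call ensure_secure(url) before making the request, keeping the https policy where it matters instead of baking it into the model.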
