Regularization penalizes flexibility

Because high-dimensional mappings are prone to overfitting, a regularization term is added to the training loss to penalize the model's flexibility. Rather than directly limiting how many parameters the model has, the penalty typically discourages large parameter values (L2, as in ridge regression or weight decay) or many nonzero parameters (L1, as in the lasso). The objective becomes the data loss plus λ times the penalty, where λ controls how strongly flexibility is discouraged.
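
As a minimal sketch of the L2 case (the data, λ values, and the `ridge_fit` helper are illustrative, not from any particular library): fitting a linear model with more parameters than data points, with and without the penalty, shows how regularization shrinks the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: 5 samples, 20 features -- more parameters than
# data points, so an unpenalized fit can be arbitrarily flexible.
X = rng.normal(size=(5, 20))
y = rng.normal(size=5)

def ridge_fit(X, y, lam):
    """Minimize ||Xw - y||^2 + lam * ||w||^2 (closed form)."""
    n_features = X.shape[1]
    # The lam * I term is the regularizer: it penalizes large weights
    # and keeps the system well-posed even when X has more columns than rows.
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

w_unreg = ridge_fit(X, y, lam=1e-8)  # effectively no penalty
w_reg = ridge_fit(X, y, lam=1.0)     # L2-regularized

print("||w|| without regularization:", np.linalg.norm(w_unreg))
print("||w|| with regularization:   ", np.linalg.norm(w_reg))
```

Raising λ trades fit against flexibility: the regularized weights are much smaller in norm, at the cost of a worse fit on the training data.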
