Cross-validation

A technique used in machine learning to assess how well a model will perform on unseen data.

Explanation

Benefits of cross-validation:

  • Reduces overfitting: By training and testing on different data splits, cross-validation helps prevent the model from memorizing specific patterns in the training data and encourages it to learn more generalizable features.
  • Provides a more reliable performance estimate: Averaging scores over several validation splits reduces the variance that comes from evaluating on a single train/test split, where the result can depend heavily on which samples happen to land in the test set.

Examples

k-fold cross-validation: The data is randomly split into k folds of roughly equal size. In each of k iterations, one fold is held out for validation and the remaining k-1 folds are used for training, so every fold serves as the validation set exactly once. The k validation scores are then averaged to produce the final performance estimate.
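
The k-fold procedure above can be sketched in plain Python. This is a minimal illustration of how the index splitting works, not a production implementation (libraries such as scikit-learn provide a tested `KFold` class); the function name `k_fold_indices` and the seed handling are choices made for this example.

```python
import random

def k_fold_indices(n_samples, k, seed=0):
    """Split indices 0..n_samples-1 into k random folds.

    Yields (train_idx, val_idx) pairs; each fold serves as the
    validation set exactly once.
    """
    indices = list(range(n_samples))
    random.Random(seed).shuffle(indices)  # random folds, reproducible via seed
    fold_size = n_samples // k
    for i in range(k):
        start = i * fold_size
        # the last fold absorbs the remainder when n_samples % k != 0
        end = start + fold_size if i < k - 1 else n_samples
        val_idx = indices[start:end]           # held-out fold for validation
        train_idx = indices[:start] + indices[end:]  # remaining k-1 folds
        yield train_idx, val_idx

# Example: 5-fold split of 10 samples; train on train_idx,
# evaluate on val_idx, then average the 5 scores.
folds = list(k_fold_indices(10, k=5))
```

In a full run, the model would be retrained from scratch on `train_idx` in each iteration before scoring it on `val_idx`, so that no validation sample ever influences the weights it is used to evaluate.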