
How to use cross-validation and early stopping together?


I have to train my model using K-fold cross-validation, but at the same time I want to use early stopping to prevent overfitting. How can this be done? Since early stopping will return a different model in each fold, does the average of the fold accuracies mean anything?


Solution

  • Even when you do not use early stopping, cross-validation already gives you a different model in each fold: each fold's model ends up with different parameters and different results, and that is exactly the point of CV. You can therefore add early stopping without any special precautions, and the average of the fold accuracies remains a meaningful estimate of generalization performance. One way to combine the two is sketched below.
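
A minimal sketch of one common pattern, assuming a Keras binary classifier with scikit-learn's `KFold`; the dataset, architecture, and hyperparameters here are placeholders, not a prescribed setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold
from tensorflow import keras

# Toy dataset as a stand-in for your own data (assumption: binary labels).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

def build_model():
    # Build a fresh, identically configured model for every fold.
    model = keras.Sequential([
        keras.Input(shape=(X.shape[1],)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

kf = KFold(n_splits=5, shuffle=True, random_state=42)
fold_accuracies = []

for train_idx, val_idx in kf.split(X):
    model = build_model()
    # Early stopping watches the fold's validation loss; each fold may
    # stop at a different epoch, which is fine for CV.
    early_stop = keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=10, restore_best_weights=True)
    model.fit(X[train_idx], y[train_idx],
              validation_data=(X[val_idx], y[val_idx]),
              epochs=200, callbacks=[early_stop], verbose=0)
    _, acc = model.evaluate(X[val_idx], y[val_idx], verbose=0)
    fold_accuracies.append(acc)

print(f"Mean CV accuracy: {np.mean(fold_accuracies):.3f} "
      f"+/- {np.std(fold_accuracies):.3f}")
```

One caveat with this sketch: using the same held-out fold both to trigger early stopping and to score the model introduces a mild optimistic bias. If that matters for your use case, carve a separate early-stopping split out of the training portion of each fold and keep the CV fold purely for evaluation.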