I'm comparing the performance of CatBoost, XGBoost, and LinearRegression in PyCaret. CatBoost and XGBoost are untuned.
So far I see that CatBoost and XGBoost are overfitting.
For linear regression, the scores are train R2: 0.72, test R2: 0.65.
Is there a way to set early stopping for XGBoost and CatBoost to avoid this overfitting? Or are there other parameters I can tune in PyCaret to reduce overfitting?
There are several ways to reduce overfitting besides early stopping: lower the tree depth (`max_depth`), reduce the learning rate while capping `n_estimators`, or use subsampling (`subsample`, `colsample_bytree`).