python, machine-learning, scikit-learn, cross-validation, imblearn

How to perform SMOTE with cross validation in sklearn in python


I have a highly imbalanced dataset and would like to perform SMOTE to balance it, then use cross-validation to measure accuracy. However, most existing tutorials apply SMOTE to only a single train/test split.

Therefore, I would like to know the correct procedure to perform SMOTE with cross-validation.

My current code is as follows. However, as mentioned above, it only uses a single split.

from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
sm = SMOTE(random_state=2)
X_train_res, y_train_res = sm.fit_resample(X_train, y_train.ravel())
clf_rf = RandomForestClassifier(n_estimators=25, random_state=12)
clf_rf.fit(X_train_res, y_train_res)

I am happy to provide more details if needed.


Solution

  • You need to perform SMOTE within each fold, so avoid train_test_split in favour of KFold:

    from sklearn.model_selection import KFold
    from imblearn.over_sampling import SMOTE
    from sklearn.metrics import f1_score
    
    kf = KFold(n_splits=5)
    
    for fold, (train_index, test_index) in enumerate(kf.split(X), 1):
        X_train = X[train_index]
        y_train = y[train_index]  # Based on your code, you might need a ravel call here, but I would look into how you're generating your y
        X_test = X[test_index]
        y_test = y[test_index]  # See comment on ravel and y_train
        sm = SMOTE()
        X_train_oversampled, y_train_oversampled = sm.fit_resample(X_train, y_train)
        model = ...  # Choose a model here
        model.fit(X_train_oversampled, y_train_oversampled)
        y_pred = model.predict(X_test)
        print(f'For fold {fold}:')
        print(f'Accuracy: {model.score(X_test, y_test)}')
        print(f'f-score: {f1_score(y_test, y_pred)}')
    

    You can also, for example, append the per-fold scores to lists defined outside the loop and average them once all folds have run, as in the sketch below.
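
    Here is a minimal sketch of that, assuming the same kf, X and y as above; the RandomForestClassifier settings are just reused from the question as a placeholder model:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from imblearn.over_sampling import SMOTE

    accuracies = []  # one entry per fold
    f1_scores = []

    for train_index, test_index in kf.split(X):
        X_train, y_train = X[train_index], y[train_index]
        X_test, y_test = X[test_index], y[test_index]
        # Oversample the training fold only; the test fold stays untouched
        X_res, y_res = SMOTE().fit_resample(X_train, y_train)
        model = RandomForestClassifier(n_estimators=25, random_state=12)
        model.fit(X_res, y_res)
        accuracies.append(model.score(X_test, y_test))
        f1_scores.append(f1_score(y_test, model.predict(X_test)))

    print(f'Mean accuracy: {np.mean(accuracies):.3f}')
    print(f'Mean f-score: {np.mean(f1_scores):.3f}')

    If you would rather not manage the loop yourself, an imblearn Pipeline (a SMOTE step followed by the classifier) passed to sklearn's cross_val_score behaves the same way: the resampling is applied only to the training portion of each fold.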