Tags: python, scikit-learn, normalization, scaling, grid-search

scikit-learn: StandardScaler() freezes in combination with Pipeline and GridSearchCV


I am trying to fit a model to a dataset with the following setup:

# Import stuff and generate dataset.
import sklearn as skl
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn import preprocessing
from sklearn import svm
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn import metrics
from tempfile import mkdtemp
from shutil import rmtree
from sklearn.externals.joblib import Memory
X, y = skl.datasets.make_classification(n_samples=1400, n_features=11,  n_informative=5, n_classes=2, weights=[0.94, 0.06], flip_y=0.05, random_state=42)
X_train, X_test, y_train, y_test = skl.model_selection.train_test_split(X, y, test_size=0.3, random_state=42)


# 1. Instantiate a scaler. 
#normer = preprocessing.Normalizer()
normer = preprocessing.StandardScaler()

# 2. Instantiate a Linear Support Vector Classifier.
svm1 = svm.SVC(probability=True, class_weight={1: 10})

# 3. Forge normalizer and classifier into a pipeline. Make sure the pipeline steps can be cached (memoized) during the grid search.
cached = mkdtemp()
memory = Memory(cachedir=cached, verbose=1)
pipe_1 = Pipeline(steps=[('normalization', normer), ('svm', svm1)], memory=memory)

# 4. Instantiate Cross Validation
cv = skl.model_selection.KFold(n_splits=5, shuffle=True, random_state=42)

# 5. Instantiate the Grid Search for Hyperparameter Tuning
params = [ {"svm__kernel": ["linear"], "svm__C": [1, 10, 100, 1000]}, 
           {"svm__kernel": ["rbf"], "svm__C": [1, 10, 100, 1000], "svm__gamma": [0.001, 0.0001]} ]
grd = GridSearchCV(pipe_1, params, scoring='roc_auc', cv=cv)

The program freezes in my Jupyter notebook when calling

y_pred = grd3.fit(X_train, y_train).predict_proba(X_test)[:, 1]

I aborted it after 20 minutes. When I use preprocessing.Normalizer() instead of StandardScaler(), the .fit() finishes after two or three minutes.
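
A minimal sketch of how the two scalers can be compared in isolation (one plain pipeline fit per scaler, no grid search; C=1000, kernel and gamma are just example values, and the objects defined above are reused):

# Time a single pipeline fit with each scaler to compare SVC training time.
# Note: the slower combination may itself take a while to finish.
import time

for scaler in (preprocessing.Normalizer(), preprocessing.StandardScaler()):
    pipe = Pipeline(steps=[('normalization', scaler),
                           ('svm', svm.SVC(C=1000, kernel='rbf', gamma=0.001,
                                           probability=True, class_weight={1: 10}))])
    start = time.time()
    pipe.fit(X_train, y_train)
    print(type(scaler).__name__, 'fit took %.1f seconds' % (time.time() - start))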

What could be the problem here?

Edit: here is the repr of the GridSearchCV object:

GridSearchCV(cv=KFold(n_splits=5, random_state=2, shuffle=True), error_score='raise',
       estimator=Pipeline(memory=None,
                          steps=[('normalization', StandardScaler(copy=True, with_mean=True, with_std=True)),
                                 ('svm', SVC(C=1.0, cache_size=200, class_weight={1: 10}, coef0=0.0,
                                             decision_function_shape='ovr', degree=3, gamma='auto',
                                             kernel='rbf', max_iter=-1, probability=True,
                                             random_state=None, shrinking=True, tol=0.001,
                                             verbose=False))]),
       fit_params=None, iid=True, n_jobs=1,
       param_grid=[{'svm__kernel': ['linear'], 'svm__C': [1, 10, 100, 1000]},
                   {'svm__kernel': ['rbf'], 'svm__C': [1, 10, 100, 1000], 'svm__gamma': [0.001, 0.0001]}],
       pre_dispatch='2*n_jobs', refit=True, return_train_score=True, scoring='roc_auc', verbose=0)

Solution

  • Thanks for answering my comments (and I didn't see your data gen code, my bad).

    You had a typo in your code; it should be:

    y_pred = grd.fit(X_train, y_train).predict_proba(X_test)[:, 1]

    Not:

    y_pred = grd3.fit(X_train, y_train).predict_proba(X_test)[:, 1]

    But from the logs it doesn't appear to be freezing; it just gets very slow when testing C = 1000 in your grid search.

    Does it need to be this high?

    Testing on my computer (for the linear kernel; RBF will likely take even longer):

    svm__C = [10, 100, 1000] takes [1.8 s, 16 s, 127 s]

    So I'd recommend just testing up to C = 200/500 unless you're planning to run this overnight in a multi-fold CV grid search; a trimmed grid along those lines is sketched below.
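
    Something like this (a sketch, not tested against your data; it just rebuilds grd with C capped at 500 and turns on verbose output so progress is printed for each candidate instead of a silent hang):

    # Same pipeline and CV as above, smaller C range, verbose progress messages.
    params_small = [{"svm__kernel": ["linear"], "svm__C": [1, 10, 100, 500]},
                    {"svm__kernel": ["rbf"], "svm__C": [1, 10, 100, 500],
                     "svm__gamma": [0.001, 0.0001]}]
    grd = GridSearchCV(pipe_1, params_small, scoring='roc_auc', cv=cv, verbose=2)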

    More generally

    Both the fit method of a grid search and the predict_proba method can take a large amount of time.

    I'd recommend splitting them into two steps, so you can tell which one is responsible when it appears to freeze.

    grd.fit(X_train, y_train)

    y_pred = grd.predict_proba(X_test)[:, 1]
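
    Once the fit has finished, the fitted grid search also stores the mean fit time per parameter combination in cv_results_, which is a quick way to confirm that the large-C candidates are the slow ones (a small sketch; the print format is arbitrary):

    # Print mean fit time for each candidate in the grid.
    for p, t in zip(grd.cv_results_['params'], grd.cv_results_['mean_fit_time']):
        print(p, 'mean fit time: %.1f s' % t)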