I am performing some machine learning tasks using SVM. I suspect the data is non-linear, so I also tried the RBF kernel. I found that the SVM with the RBF kernel performs MUCH worse than linear SVM. I wonder if I did something wrong with my classifier parameter specifications.
My code is as follows:
from sklearn.svm import LinearSVC
from sklearn.svm import SVC
svm1 = LinearSVC()  # performs the best, similar to logistic regression, which is expected
svm2 = LinearSVC(class_weight="balanced")  # note: "auto" is deprecated; performs somewhat worse than svm1
svm3 = SVC(kernel='rbf', random_state=0, C=1.0, cache_size=4000, class_weight='balanced')  # performs far worse than svm1 and takes the longest to train
svm4 = SVC(kernel='rbf', random_state=0, C=1.0, cache_size=4000)  # the WORST of all: it simply predicts the majority class
With RBF, try tuning your C and gamma parameters. Scikit-learn's grid search will help you.
Here is an example to get you started:
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

svc = SVC(kernel='rbf', random_state=0)
params = {"C": [0.1, 1, 10], "gamma": [0.1, 0.01, 0.001]}
grid_search = GridSearchCV(svc, params)
grid_search.fit(X, y)
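Once the search has run, you can read the winning parameter combination and its cross-validated score off the fitted object. Here is a minimal self-contained sketch; since your X and y aren't shown, make_classification is only a stand-in for your real data, and the "scale" gamma option is just an extra candidate worth including:

```python
# A minimal sketch of reading results off a fitted GridSearchCV.
# make_classification is only a stand-in for your real X and y.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

svc = SVC(kernel="rbf", random_state=0)
params = {"C": [0.1, 1, 10], "gamma": [0.1, 0.01, 0.001, "scale"]}
grid = GridSearchCV(svc, params, cv=5)
grid.fit(X, y)

print(grid.best_params_)     # the winning C/gamma combination
print(grid.best_score_)      # its mean cross-validated accuracy
print(grid.best_estimator_)  # an SVC refit on all of X, y with those parameters
```

best_estimator_ is what you would then use for prediction, since GridSearchCV refits it on the full training set with the best parameters.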