I would like to get these plots: http://scikit-learn.org/stable/auto_examples/linear_model/plot_lasso_coordinate_descent_path.html
from an elastic net I have already trained. The example does
from sklearn.linear_model import lasso_path, enet_path
from sklearn import datasets

diabetes = datasets.load_diabetes()
X = diabetes.data
y = diabetes.target  # target vector, as in the linked example

eps = 5e-3  # the smaller eps, the longer the path
print("Computing regularization path using the elastic net...")
alphas_enet, coefs_enet, _ = enet_path(
    X, y, eps=eps, l1_ratio=0.8, fit_intercept=False)
which essentially recomputes the whole model from X and y. Unfortunately, I no longer have X and y.
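For context, here is roughly how those paths are turned into the plot in the linked example (a sketch with stand-in data; in my real case X and y are exactly what I am missing):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import enet_path
from sklearn import datasets

# Stand-in data -- in my real case I do NOT have X, y available.
diabetes = datasets.load_diabetes()
X = diabetes.data
y = diabetes.target
X /= X.std(axis=0)  # standardize, as in the linked example

# alphas_enet: the alpha grid (100 values by default),
# coefs_enet: shape (n_features, n_alphas), one row per feature.
alphas_enet, coefs_enet, _ = enet_path(X, y, eps=5e-3, l1_ratio=0.8)

# One curve per feature: coefficient value versus -log10(alpha).
for coef in coefs_enet:
    plt.plot(-np.log10(alphas_enet), coef)
plt.xlabel("-log10(alpha)")
plt.ylabel("coefficients")
plt.title("Elastic-Net path")
```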
For training I used sklearn.linear_model.ElasticNetCV, whose fitted attributes include:
coef_ : array, shape (n_features,) | (n_targets, n_features)
parameter vector (w in the cost function formula)
mse_path_ : array, shape (n_l1_ratio, n_alpha, n_folds)
Mean square error for the test set on each fold, varying l1_ratio and alpha.
whereas I would need the parameter vector varying over l1_ratio and alpha. Can this be done without recomputation? Refitting would be a tremendous waste of time, since those coefficient paths have in fact already been calculated.
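The closest thing I can plot from the fitted object alone is the cross-validation error curve, since alphas_ and mse_path_ are stored. A sketch with stand-in data, assuming a single (scalar) l1_ratio so that mse_path_ has shape (n_alphas, n_folds):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNetCV

# Stand-in data; in my real case only the fitted model survives.
X, y = load_diabetes(return_X_y=True)
model = ElasticNetCV(l1_ratio=0.8, cv=5).fit(X, y)

# With a scalar l1_ratio, mse_path_ has shape (n_alphas, n_folds);
# average over folds to get one MSE curve along the alpha grid.
mean_mse = model.mse_path_.mean(axis=-1)
plt.plot(-np.log10(model.alphas_), mean_mse)
plt.axvline(-np.log10(model.alpha_), linestyle="--")  # chosen alpha
plt.xlabel("-log10(alpha)")
plt.ylabel("mean CV MSE")
```

This gives the error path, but not the coefficient path I am after.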
No, not once it is fit.

If you look through the source code for ElasticNetCV, you will see that within the fit method the class calls enet_path, but with alphas set to the value of alpha initialized in ElasticNet (default 1.0), which in turn comes from the value of alphas in ElasticNetCV and ends up being a single value. So instead of calculating the coefficients for the default 100 values of alpha, which is what lets you create the path graphs, you only get one set of coefficients per value of alpha you set in your CV.

That said, you could initialize the alphas in your CV to mimic the 100-value default of enet_path and then combine the coefficients from each fold, but this would be rather long-running, and since, as you mention, you have already fit the CV, it is not an option here.
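If refitting were on the table, one way to keep the two consistent would be to fit the CV estimator and then reuse its alphas_ grid in a single enet_path call on the same data. A sketch with stand-in data, not a fix for the already-fitted case:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNetCV, enet_path

# Stand-in data; this approach needs X, y, which the question lacks.
X, y = load_diabetes(return_X_y=True)

# Fit the CV estimator; it builds a 100-point alpha grid internally
# (n_alphas=100 by default) and stores it as alphas_.
model = ElasticNetCV(l1_ratio=0.8, cv=5).fit(X, y)

# Reuse that exact grid to recover the coefficient path in one call,
# so the path's alpha axis lines up with the CV curve in mse_path_.
alphas, coefs, _ = enet_path(X, y, l1_ratio=0.8, alphas=model.alphas_)

# coefs has shape (n_features, n_alphas): one row per feature.
```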