Tags: scikit-learn, alpha, lasso-regression

Why does the root mean squared error only increase when increasing the Lasso alpha values?


Hi, I am fitting a Lasso model using different values in the range of 2*10^-5 to 500 for the alpha parameter, like:

import numpy as np
alphas = np.linspace(0.00002, 500, 20)

When I plot the negative root mean squared error and the negative mean absolute error from cross-validation, I get a graph like this:

[plot: cross-validated negative RMSE and negative MAE against alpha]

So the error increases in absolute value and then stays constant instead of decreasing. Why am I getting this result?
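For reference, here is a minimal sketch of the kind of cross-validation sweep described above. The dataset, the estimator settings, and the CV scheme are assumptions (the question does not show them); make_regression only stands in for the real data.

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

# stand-in data; replace with the real X and y
X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

alphas = np.linspace(0.00002, 500, 20)
neg_rmse, neg_mae = [], []
for alpha in alphas:
    model = Lasso(alpha=alpha, max_iter=10_000)
    # default 5-fold CV; scores are negative, so values closer to 0 are better
    neg_rmse.append(cross_val_score(model, X, y, scoring="neg_root_mean_squared_error").mean())
    neg_mae.append(cross_val_score(model, X, y, scoring="neg_mean_absolute_error").mean())

plt.plot(alphas, neg_rmse, label="negative RMSE")
plt.plot(alphas, neg_mae, label="negative MAE")
plt.xlabel("alpha")
plt.ylabel("cross-validated score")
plt.legend()
plt.show()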

Choosing really small values of alpha like:

alphas = np.linspace(0.00001, 0.00007, 20)

I get this result for the RMSE:

[plot: cross-validated RMSE against alpha for the smaller range]

Do you have any idea why it only seems to work for such small values of alpha? Thank you.


Solution

  • Lasso regression aims to increase bias and decrease variance. By increasing the penalty term you are moving away from the predictor that has the lowest bias, which increases the RMSE. The estimator with the lowest RMSE is not always the best one, due to potential overfitting. Search for the bias-variance tradeoff (see the sketch below for how the coefficients and the error change with alpha).
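As a rough illustration of the flat part of the curve: once alpha is large enough, the L1 penalty shrinks every coefficient to exactly zero, so the model only predicts the intercept and further increases in alpha can no longer change the error. A minimal sketch of this behaviour, using synthetic data and an arbitrary alpha grid (both are assumptions, not taken from the question):

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import cross_val_score

# stand-in data; replace with the real X and y
X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

for alpha in [0.0001, 0.01, 1.0, 100.0, 500.0]:
    model = Lasso(alpha=alpha, max_iter=10_000).fit(X, y)
    n_nonzero = int(np.sum(model.coef_ != 0))  # how many features survive the penalty
    rmse = -cross_val_score(model, X, y, scoring="neg_root_mean_squared_error").mean()
    print(f"alpha={alpha:>8}: non-zero coefficients={n_nonzero:2d}, CV RMSE={rmse:.2f}")

Typically the printout shows the coefficient count dropping and the cross-validated RMSE rising as alpha grows, which matches the increasing-then-flat shape in the first plot.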