Tags: python, gradient-descent

Where does gradient descent appear in machine learning libraries (e.g. scikit-learn)?


I understand how gradient descent works, and that a user can manually define a gradient descent function to minimize some cost function. My question is more general: where does GD appear in scikit-learn when we train and test machine learning models such as linear regression or random forest? Is GD simply embedded in the .fit() function? Or do we need to pass it as a parameter to the model?


Solution

  • Short answer: the optimization is embedded in .fit() — you never write or pass a gradient descent function yourself. Which optimizer runs depends on the estimator: SGDRegressor and SGDClassifier explicitly use stochastic gradient descent; LinearRegression solves least squares in closed form (no GD at all); random forests are built by greedy tree splitting, not gradient descent. Some estimators do let you choose the solver via a parameter, e.g. LogisticRegression(solver='lbfgs') or MLPClassifier(solver='adam'), but even then the chosen optimizer still runs inside .fit().
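A minimal sketch of this point (the data here is synthetic, made up for illustration): both estimators are fit with the same .fit() call, but SGDRegressor runs stochastic gradient descent internally while LinearRegression uses a closed-form least-squares solve — in neither case do you supply a GD function.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, SGDRegressor

# Synthetic regression data with known coefficients [1.5, -2.0, 0.5].
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=200)

# SGD-based estimator: stochastic gradient descent runs inside fit();
# you tune it via hyperparameters (max_iter, eta0, ...), not by passing a GD function.
sgd = SGDRegressor(max_iter=1000, tol=1e-3, random_state=0).fit(X, y)

# Non-GD estimator: LinearRegression solves ordinary least squares directly.
ols = LinearRegression().fit(X, y)

print("SGD coefficients:", sgd.coef_)
print("OLS coefficients:", ols.coef_)
```

Both sets of coefficients land near the true values; the difference is only in the optimizer hidden behind .fit().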