Tags: tensorflow, skflow

What is the default learning rate for TensorFlowDNNRegressor with SGD or Adagrad?


This is probably an easy question, but I just can't find the answer. I'm also pretty new to all this, so maybe I'm just overlooking it.

What is the default learning rate when using TensorFlowDNNRegressor with SGD or Adagrad? The default when using Adam or Adadelta seems to be 0.001, but I cannot find a default for Adagrad, which is the default optimizer for TensorFlowDNNRegressor, or for classic SGD.

Thanks!


Solution

  • https://github.com/tensorflow/skflow/blob/master/g3doc/api_docs/python/estimators.md#class-skflowtensorflowdnnregressor-tensorflowdnnregressor

    https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/learn/python/learn/estimators/base.py

    The default learning rate for TensorFlowDNNRegressor is 0.1, as mentioned in the doc and code above (see the first sketch below for passing it explicitly).

    I also checked the Adagrad optimizer itself, and there is no default value for its learning rate; it has to be supplied explicitly (see the second sketch below): https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/adagrad.py
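
    If the exact default matters to you, one option is simply to pass the learning rate explicitly when constructing the estimator. A minimal sketch, assuming the skflow-era constructor from the docs linked above; the layer sizes and step count are placeholders, and argument names may differ slightly between versions:

    ```python
    # Sketch, assuming the skflow-era TensorFlowDNNRegressor API linked above.
    import skflow

    regressor = skflow.TensorFlowDNNRegressor(
        hidden_units=[10, 10],   # placeholder network architecture
        steps=500,               # placeholder number of training steps
        optimizer="Adagrad",     # the estimator's default optimizer
        learning_rate=0.1,       # the documented default, passed explicitly
    )
    ```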
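
    For the low-level optimizer, the absence of a default is visible in its constructor: the learning rate is a required argument. A minimal sketch, assuming the TensorFlow 1.x-era `tf.train` API:

    ```python
    # Sketch: tf.train.AdagradOptimizer takes the learning rate as a required
    # argument, so there is no default value to look up; omitting it raises
    # a TypeError.
    import tensorflow as tf

    optimizer = tf.train.AdagradOptimizer(learning_rate=0.1)
    ```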