
Cannot find apply_gradients on the Adam optimizer in Keras or TensorFlow


apply_gradients seems to have been removed in recent versions of the optimizer in TensorFlow or Keras. I don't know why, but I am getting this:

AttributeError: 'Adam' object has no attribute 'apply_gradients'

Is there another way to achieve the same thing?


Solution

  • apply_gradients is only available in tensorflow.keras, because there you can write manual training loops with eager execution enabled (see the sketch below).

    Pure keras works on a symbolic graph and can only apply gradients through fit or train_on_batch.
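    For example, here is a minimal sketch of such a manual training loop in TensorFlow 2.x, assuming eager execution (the default) and using a toy model and random data invented purely for illustration:

        import tensorflow as tf

        # Hypothetical toy model and data, just to demonstrate the loop.
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
            tf.keras.layers.Dense(1),
        ])
        optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
        loss_fn = tf.keras.losses.MeanSquaredError()

        x = tf.random.normal((32, 4))
        y = tf.random.normal((32, 1))

        for step in range(5):
            with tf.GradientTape() as tape:
                predictions = model(x, training=True)
                loss = loss_fn(y, predictions)
            # Differentiate the loss w.r.t. the trainable weights,
            # then hand the gradient/variable pairs to the optimizer.
            gradients = tape.gradient(loss, model.trainable_variables)
            optimizer.apply_gradients(zip(gradients, model.trainable_variables))
            print(f"step {step}: loss = {loss.numpy():.4f}")

    In standalone keras the closest equivalent is to let the framework apply the gradients for you, e.g. compiling the model with the Adam optimizer and calling model.train_on_batch(x, y).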