Tags: tensorflow, gradient-descent

TensorFlow: How many gradient steps are made per session.run() call?


A gradient descent algorithm makes several steps toward the minimum. My question is: how many of these steps are performed for every call to sess.run? To elaborate by example:

I am using a gradient descent algorithm (tf.train.AdamOptimizer) in my network. I have a loop such as this:

for epoch in range(100):
    sess.run(ops['optimizer'], feed_dict=train_feed_dict)

This epoch loop runs 100 times. My question is whether a single call to sess.run makes a single small step toward the minimum, or whether more than one gradient descent step is made per epoch.


Solution

  • If ops['optimizer'] is the training op returned by tf.train.AdamOptimizer(some_learning_rate).minimize(some_loss), then each run of it performs exactly one descent step. So there will be 100 steps in your loop.
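
You can verify this yourself by passing a global_step variable to minimize(); TensorFlow increments it once per applied gradient step. Here is a minimal sketch using the TF 1.x API (the toy variable w and the loss are made up for illustration):

import tensorflow as tf  # TF 1.x API; under TF 2.x use tf.compat.v1 and disable eager execution

# Toy problem (illustrative only): minimize (w - 3)^2 with Adam.
w = tf.Variable(0.0)
loss = tf.square(w - 3.0)

# global_step is incremented once each time the train op runs,
# so it counts gradient steps directly.
global_step = tf.Variable(0, trainable=False)
train_op = tf.train.AdamOptimizer(learning_rate=0.01).minimize(loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(100):
        sess.run(train_op)
    print(sess.run(global_step))  # prints 100: one step per sess.run call

Note that each iteration here performs a single update on whatever data you feed in. If you want several gradient steps per epoch, you would loop over mini-batches inside the epoch loop and call sess.run once per batch.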