Tags: tensorflow, batch-normalization

In tf.slim, do I need to add the dependency to the loss?


I am using batch_norm in tf.slim.

My question is: do I need to explicitly add the update-op dependency to the loss?

Since slim knows that I used batch_norm, I thought it might add the dependency automatically. I am confused.


Solution

  • Yes, you do.

    Follow the instructions from the batch_norm documentation:

    Note: when training, the moving_mean and moving_variance need to be updated. By default the update ops are placed in tf.GraphKeys.UPDATE_OPS, so they need to be added as a dependency to the train_op. For example:

    # Run the moving_mean/moving_variance updates before each training step.
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    with tf.control_dependencies(update_ops):
      train_op = optimizer.minimize(loss)
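
    For context, here is a minimal end-to-end sketch of how that dependency fits into a slim training setup (TF 1.x; the network shape, the placeholder names such as `images` and `labels`, and the learning rate are all hypothetical, not from the question):

    import tensorflow as tf
    import tensorflow.contrib.slim as slim

    # Hypothetical inputs: MNIST-sized images and integer class labels.
    images = tf.placeholder(tf.float32, [None, 28, 28, 1])
    labels = tf.placeholder(tf.int64, [None])
    is_training = tf.placeholder(tf.bool, [])

    # Attach batch_norm to the layers. Its moving_mean/moving_variance
    # update ops go into tf.GraphKeys.UPDATE_OPS; they are NOT run
    # automatically as part of the training step.
    with slim.arg_scope([slim.conv2d, slim.fully_connected],
                        normalizer_fn=slim.batch_norm,
                        normalizer_params={'is_training': is_training}):
        net = slim.conv2d(images, 32, [3, 3], scope='conv1')
        net = slim.flatten(net)
        logits = slim.fully_connected(net, 10, activation_fn=None,
                                      normalizer_fn=None, scope='logits')

    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)

    # The key step from the note: make train_op depend on the update ops.
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
    with tf.control_dependencies(update_ops):
        train_op = optimizer.minimize(loss)

    As far as I know, slim's own helper, slim.learning.create_train_op(loss, optimizer), collects tf.GraphKeys.UPDATE_OPS and wires in this dependency for you, so it is an alternative to writing the control dependency by hand.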