How can I add validation to TensorBoard? I have written wrappers for layers, like:
import tensorflow as tf

def convolution(input_data, kernel_shape, strides, activation, stddev=0.1, name=None):
    with tf.name_scope(name):
        kernel = tf.Variable(tf.truncated_normal(kernel_shape, stddev=stddev), name="weights")
        bias = tf.Variable(tf.zeros([kernel_shape[-1]]), name="biases")
        conv = tf.nn.conv2d(input=input_data, filter=kernel, strides=strides, padding="SAME", name="convolutions")
        result = activation(tf.nn.bias_add(conv, bias), name="activations")
        tf.scalar_summary(name + "/mean", tf.reduce_mean(kernel))
        return result
and use summary_op = tf.merge_all_summaries() in main. I have also implemented train_op and valid_op, which both call the inference function. However, I get an error about duplicate tags for scalar_summary: since inference is used in both train_op and valid_op, a summary such as conv1/mean ends up being registered twice.

How can I make this work? What I need is to run training and validation through the same inference function.
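Roughly, the setup that triggers the error looks like the following sketch (the inference body, the layer shapes, and the images_train / images_valid placeholders are simplified stand-ins, not my actual code):

def inference(input_data):
    conv1 = convolution(input_data, [5, 5, 3, 32], [1, 1, 1, 1],
                        tf.nn.relu, name="conv1")
    # ... further layers built with the wrappers above ...
    return conv1

images_train = tf.placeholder(tf.float32, [None, 28, 28, 3])
images_valid = tf.placeholder(tf.float32, [None, 28, 28, 3])

# Each call registers a scalar summary tagged "conv1/mean".
train_output = inference(images_train)
valid_output = inference(images_valid)

# Evaluating the merged summary op then fails with the duplicate-tag error.
summary_op = tf.merge_all_summaries()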
As the error suggests, you cannot have two summaries with the same tag. This happens in your case because you are calling tf.scalar_summary twice with the same tag, once when constructing the train_op and once when constructing the valid_op. Here is a possible solution:
You can add a flag to your inference function, say is_training, to indicate that the code is being called to construct part of the training graph. You would have to thread that flag through all your layer functions. In convolution, for instance, you would do the following:
    if is_training:
        tf.scalar_summary(name + "/mean", tf.reduce_mean(kernel))
    return result
When constructing the train_op, you pass is_training=True, and when constructing the valid_op, you pass is_training=False. There is an example of such a programming pattern here in the Inception model.
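Putting it together, a minimal sketch of the threaded flag might look like this (the inference body, layer shapes, and input placeholders are assumptions kept close to the question's code, not a definitive implementation):

import tensorflow as tf

def convolution(input_data, kernel_shape, strides, activation,
                stddev=0.1, name=None, is_training=True):
    with tf.name_scope(name):
        kernel = tf.Variable(tf.truncated_normal(kernel_shape, stddev=stddev), name="weights")
        bias = tf.Variable(tf.zeros([kernel_shape[-1]]), name="biases")
        conv = tf.nn.conv2d(input=input_data, filter=kernel, strides=strides,
                            padding="SAME", name="convolutions")
        result = activation(tf.nn.bias_add(conv, bias), name="activations")
        # Summaries are only registered for the training graph, so each tag
        # (e.g. "conv1/mean") appears exactly once.
        if is_training:
            tf.scalar_summary(name + "/mean", tf.reduce_mean(kernel))
        return result

def inference(input_data, is_training=True):
    conv1 = convolution(input_data, [5, 5, 3, 32], [1, 1, 1, 1],
                        tf.nn.relu, name="conv1", is_training=is_training)
    # ... remaining layers, each forwarding is_training ...
    return conv1

images_train = tf.placeholder(tf.float32, [None, 28, 28, 3])
images_valid = tf.placeholder(tf.float32, [None, 28, 28, 3])

train_output = inference(images_train, is_training=True)   # summaries created
valid_output = inference(images_valid, is_training=False)  # no summaries
summary_op = tf.merge_all_summaries()                       # no duplicate tags

The validation graph is built without any summary ops, so tf.merge_all_summaries() only picks up the training-side tags and the duplication disappears.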