Tags: python, tensorflow, tensorboard

Why are all my variables duplicated in TensorBoard?


I'm new to TensorFlow and am running a basic CNN. As a way of visualising the training process, I build a summary with loss and accuracy to view later in TensorBoard, like this:

tf.summary.scalar("loss", cost)
tf.summary.scalar("accuracy", accuracy)

I initialise the summaries as below. (get_logdir_string() returns a unique string composed of the given parameter and the current datetime)

merged_summary_op = tf.summary.merge_all()
summary_writer = tf.summary.FileWriter(get_logdir_string('CIFAR10'),
                                       graph=tf.get_default_graph())
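
For reference, a minimal sketch of what such a helper might look like (the real implementation isn't shown here, so the 'logs' base directory and the timestamp format below are assumptions):

import os
from datetime import datetime

def get_logdir_string(run_name):
    # Combine the run name with the current datetime so every run writes to
    # its own directory and shows up as a separate curve in TensorBoard.
    timestamp = datetime.now().strftime('%Y%m%d-%H%M%S')
    return os.path.join('logs', '{}_{}'.format(run_name, timestamp))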

Then for each mini-batch iteration I do this:

_, summary = sess.run([optimizer, merged_summary_op], feed_dict={x: batch_x, 
                                                                 y_true: batch_y, 
                                                                 keep_prob: dropout})
summary_writer.add_summary(summary, step * batch_size)

Then I run TensorBoard and am presented with something like the following, where only the first copy of each variable (the one without the suffix) contains data:

[Screenshot: TensorBoard scalars page showing duplicated tags for the same summaries]

Has anyone encountered this before? Thanks!


Solution

  • I found the culprit and thought I'd post it here for future reference.

    It turns out that I needed to call tf.reset_default_graph() before each run. Without the reset, re-running the graph-construction code adds a second copy of every summary op to the same default graph, so TensorBoard ends up with suffixed duplicates that never receive any data.
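
    For example, a minimal sketch (assuming TensorFlow 1.x, with build_graph() as a hypothetical stand-in for the actual CNN construction):

    import tensorflow as tf

    def build_graph():
        # Hypothetical stand-in for the CNN: a single scalar keeps the example short.
        loss = tf.constant(0.5, name="loss_value")
        tf.summary.scalar("loss", loss)
        return tf.summary.merge_all()

    # Without this reset, calling build_graph() a second time (e.g. re-running a
    # notebook cell) adds a second "loss" summary op to the same default graph,
    # and TensorBoard shows it as "loss_1" with no data ever written to it.
    tf.reset_default_graph()
    merged_summary_op = build_graph()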