I would like to do something similar to the train_writer and test_writer from the TensorBoard tutorial, but using tf.train.Supervisor. I am, however, not sure how best to go about this.
Pseudo code:

train_op = # ...
train_summaries = # ...
test_summaries = # ...

config = tf.ConfigProto(allow_soft_placement=True)
sv = tf.train.Supervisor(
    logdir=????,
    summary_op=????,
    summary_writer=????,
)
with sv.managed_session(config=config) as sess:
    while not sv.should_stop():
        sess.run(train_op)
So my question is: how do I save the train_summaries and test_summaries to different directories, e.g. ./logdir/train and ./logdir/test/?
You are looking for summary_computed. The docstring shows how to create custom summary writers. You cannot get the Supervisor to manage them automatically, but it is quite simple. From https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/supervisor.py:
# Create a Supervisor with no automatic summaries.
sv = Supervisor(logdir='/tmp/mydir', is_chief=is_chief, summary_op=None)
# As summary_op was None, managed_session() does not start the
# summary thread.
with sv.managed_session(FLAGS.master) as sess:
    for step in xrange(1000000):
        if sv.should_stop():
            break
        if is_chief and step % 100 == 0:
            # Create the summary every 100 chief steps.
            sv.summary_computed(sess, sess.run(my_summary_op))
        else:
            # Train normally
            sess.run(my_train_op)
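To get the ./logdir/train and ./logdir/test split from the question, you can bypass sv.summary_computed entirely and manage two tf.summary.FileWriter instances yourself, one per subdirectory. TensorBoard then shows them as separate runs. The sketch below uses a toy one-variable "model" as a stand-in for your real graph, and the tf.compat.v1 module so it also runs under TF 2 in graph mode; the details of the model and summary names are assumptions, not from the original.

```python
# Sketch: separate train/test summary writers with a Supervisor.
# Assumes the TF 1.x-style graph API (tf.compat.v1 under TF 2).
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Toy stand-in for a real model: minimize a single variable.
x = tf.get_variable('x', initializer=5.0)
loss = tf.square(x)
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# Separate summary ops for train-time and test-time metrics.
train_summary = tf.summary.scalar('train_loss', loss)
test_summary = tf.summary.scalar('test_loss', loss)

logdir = '/tmp/logdir'
# summary_op=None stops the Supervisor's own summary thread, so
# nothing is written to logdir except checkpoints.
sv = tf.train.Supervisor(logdir=logdir, summary_op=None)

# One writer per subdirectory; these do not touch the graph.
train_writer = tf.summary.FileWriter(logdir + '/train')
test_writer = tf.summary.FileWriter(logdir + '/test')

with sv.managed_session() as sess:
    for step in range(100):
        if sv.should_stop():
            break
        # Write train summaries every step.
        _, s = sess.run([train_op, train_summary])
        train_writer.add_summary(s, step)
        # Write test summaries less often.
        if step % 10 == 0:
            test_writer.add_summary(sess.run(test_summary), step)

train_writer.close()
test_writer.close()
```

The key point is the same as in the docstring excerpt above: pass summary_op=None so the Supervisor does not start its own summary thread, then call add_summary on whichever writer matches the phase you are in.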