I'm testing summaries before going any deeper, and I have the following snippet of code:
import tensorflow as tf
import numpy as np
def test_placeholders():
    "Simply dump a placeholder to TensorBoard"
    x = tf.placeholder(tf.float32, [])
    sess = tf.Session()
    summary = tf.summary.scalar("x", x)
    train_writer = tf.summary.FileWriter('/tmp/tf/placeholder',
                                         sess.graph, flush_secs=1)
    sess.run(tf.global_variables_initializer())
    s = sess.run(summary, feed_dict={x: 1.57})
    train_writer.add_summary(s)
    train_writer.close()
def test_merge():
    "A simple function that runs a loop computation and writes it to TB"
    x = tf.placeholder(tf.float32)
    k = np.random.random() + 0.1
    # Create a session
    sess = tf.Session()
    sess.run(tf.global_variables_initializer())
    # define a single summary
    summary_x = tf.summary.scalar("x", x)
    train_writer = tf.summary.FileWriter('/tmp/tf/foo',
                                         sess.graph, flush_secs=1)
    # write some summaries
    for i in range(0, 5):
        # WORKS!
        summary = sess.run(summary_x, feed_dict={x: k * i * i})
        train_writer.add_summary(summary, i)
    # write some summaries using merge_all
    # (we have only one summary defined)
    merged = tf.summary.merge_all()
    for i in range(5, 10):
        # FAILS: You must feed a value for placeholder ...
        summary = sess.run(merged, feed_dict={x: k * i * i})
        train_writer.add_summary(summary, i)
    train_writer.close()

if __name__ == '__main__':
    test_placeholders()  # if I comment this line ...
    test_merge()         # test_merge() works!?
So basically there are two functions that make some loops and write some logs for TensorBoard.
The Problem:
Each function works fine in isolation; however, when I run both sequentially, the second fails here:
# FAILS: You must feed a value for placeholder ...
summary = sess.run(merged, feed_dict={x: k * i * i})
because it seems that merged contains something from the previous function that is not fed.
tensorflow.python.framework.errors_impl.InvalidArgumentError: You must feed a value for placeholder tensor 'Placeholder' with dtype float
[[Node: Placeholder = Placeholder[dtype=DT_FLOAT, shape=[], _device="/job:localhost/replica:0/task:0/cpu:0"]()]]
Caused by op u'Placeholder', defined at:
Digging into the code, I found that, for convenience, TF stores state from previous runs in default containers (e.g. the default graph and its _collections), so calling
tf.reset_default_graph()
resets everything left over from the previous execution.
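As a minimal sketch of that workaround (assuming the TF 1.x-style API; on TF 2.x the same calls are available under tf.compat.v1), resetting the default graph at the start of each function keeps each run's placeholders and SUMMARIES collection separate:

```python
import tensorflow.compat.v1 as tf  # on TF 1.x, plain `import tensorflow as tf` works too
tf.disable_eager_execution()

def isolated_run(value):
    # Drop everything registered in the default graph by previous runs,
    # including stale placeholders and the SUMMARIES collection.
    tf.reset_default_graph()
    x = tf.placeholder(tf.float32, [], name="x")
    tf.summary.scalar("x", x)
    merged = tf.summary.merge_all()  # now only sees this run's summaries
    with tf.Session() as sess:
        # returns the serialized Summary protobuf as bytes
        return sess.run(merged, feed_dict={x: value})
```

With the reset in place, calling isolated_run twice in the same process no longer trips over the first call's placeholder.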
The Question:
What is the TensorFlow style for isolating multiple TF executions in the same process so that they don't interfere with each other?
The problem you are having is that all of your tensors are loaded into the same graph.
Note that test_merge
contains merged = tf.summary.merge_all()
which merges all summaries collected in the default graph. Since everything is loaded into the default graph, when you try to evaluate summary = sess.run(merged, feed_dict={x: k * i * i})
it requires the input from the first function as well (the placeholder created in test_placeholders). If you change the order of the calls, you will see that your code executes. Working with separate graphs can be problematic, so try to have everything loaded into one graph; but if you really need separate graphs, then this answer might be of use: Working with multiple graphs in TensorFlow.
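If you do go the multiple-graphs route, a sketch of the pattern (again assuming the TF 1.x-style API, available via tf.compat.v1 on TF 2.x) is to build each function's ops inside its own tf.Graph, so merge_all only collects the summaries defined in that graph:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API
tf.disable_eager_execution()

def run_in_own_graph(value):
    # Build everything inside a fresh Graph; ops created elsewhere
    # (other graphs, other calls) are invisible to merge_all here.
    graph = tf.Graph()
    with graph.as_default():
        x = tf.placeholder(tf.float32, [], name="x")
        tf.summary.scalar("x", x)
        merged = tf.summary.merge_all()  # scoped to `graph` only
        with tf.Session(graph=graph) as sess:
            return sess.run(merged, feed_dict={x: value})
```

Each call constructs and runs an independent graph, so repeated calls in the same process cannot leak placeholders or summaries into one another.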