I'm new to TensorFlow/TensorBoard, and I'm trying to figure out how to use them. I created a simple graph that adds up a few constants, something like
a = tf.constant(10.0)
b = tf.constant(50.0)
sum = a + b
and visualized it in TensorBoard with
writer = tf.summary.FileWriter('test_graph/')
writer.add_graph(tf.get_default_graph())
writer.flush()
Everything is OK so far; the graph is visualized correctly.
If I call tf.reset_default_graph() and open TensorBoard again, as expected I get the message:
Graph visualization failed: the graph is empty...
At this point, I tried simply to run:
sum_new = a + b
so, basically, I'm defining sum_new using the tensors a and b that I created before resetting the graph. I don't get any errors, because a and b still exist in memory. However, when I open TensorBoard, I still get the error message about the graph being empty.
Why is this happening? If I create a graph and then delete it, is there any way to keep using the variables defined before the reset in a new graph?
When you call tf.reset_default_graph(), it simply, well, resets the default graph to a new tf.Graph instance. TensorFlow keeps a singleton graph object that is the default one, and resetting it means that any new operations created in the default graph will be on a different graph than before.
However, that does not mean that the previous default graph is deleted. In fact, graphs cannot be deleted other than by losing every reference to them (that is, when you can no longer obtain a reference to the object by any means). In your example, you have a and b, which are two tensors produced by operations in the first default graph; let's call it g1. After calling tf.reset_default_graph(), the default graph is now g2, and it is empty. However, g1 still exists, and you can actually get a reference to it through a.graph or b.graph. If you do writer.add_graph(a.graph), you will be able to see it in TensorBoard.
Now, what happens when you do sum_new = a + b after resetting the default graph? You might expect a new addition operation to be created in the default graph, which is now g2. However, it doesn't work like that: if TensorFlow tried to do that, it would complain, since a and b do not belong to g2 but to g1. When TensorFlow builds a new operation, it looks at the graphs of its arguments and creates the operation in that same graph. So with sum_new, a new addition operation is created in g1 again. As said above, you will be able to see it if you write a.graph to TensorBoard.
Interestingly, if you do sum_new = a + 20.0, it still works, creating a new constant operation (for the 20.0) and an addition, both in g1. However, if you do sum_new = a + tf.constant(20.0), it finally fails. The tf.constant(20.0) is created in the new default graph g2, and when you try to combine it with a, the operation fails because the two tensors belong to different graphs.
A potentially relevant takeaway from this is that tf.reset_default_graph() does not necessarily delete or free the memory of the previous default graph. So if you have very heavy graphs that you want to get rid of (which is not too common, I think, since memory is usually consumed by sessions), make sure you drop every reference to them (in this case, for example, you could do del a, b).