When I run my Python code from a console, everything works fine: I get one new TensorBoard event file (626 bytes) every time, and I can view it with the TensorBoard service.
But when I run the same code from the Spyder IDE, every run produces a new file that contains data from all the runs made in Spyder since it was started. After 10 runs executed in Spyder, even if I shut down the TensorBoard server and delete the log directory, the 11th run will produce a new file of about 6 KB that contains all of the previous runs.
import tensorflow as tf

a = tf.constant(2, name='a')
b = tf.constant(3, name='b')
x = tf.add(a, b)

with tf.Session() as sess:
    # add this line to use TensorBoard
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    print(sess.run(x))
    writer.close()  # close the writer when you're done using it
After a Spyder restart the whole story starts again: the first run produces the correct result, while the second one already contains its predecessor's data.
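One way to see the accumulation directly (a small check, assuming TensorFlow 1.x as in the snippet above) is to print the node names in the default graph at the end of the script; when the same console is reused, the list keeps growing and repeated names get suffixes like a_1:

# Diagnostic sketch (assumes TensorFlow 1.x): list the nodes currently in the
# default graph. In a reused Python console the nodes from previous runs are
# still there, which is why each new event file also contains the older runs.
print([node.name for node in tf.get_default_graph().as_graph_def().node])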
Does the Spyder IDE do some caching, or what?
OK, I've found an answer to my own question. There is a setting in Spyder: Tools -> Preferences -> Run -> 'Clear all variables before execution'
(it is also available in Run -> Configuration per file... -> 'Clear all variables before execution').
With this option enabled, the runs no longer accumulate data in the event files.
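A programmatic alternative (a minimal sketch, assuming TensorFlow 1.x as in the code above) is to reset the default graph at the top of the script, so each run builds its graph from scratch even in a reused console:

import tensorflow as tf

# Start from an empty default graph, so a persistent console (like Spyder's)
# does not carry over nodes and summaries from previous runs.
tf.reset_default_graph()

a = tf.constant(2, name='a')
b = tf.constant(3, name='b')
x = tf.add(a, b)

with tf.Session() as sess:
    writer = tf.summary.FileWriter('./graphs', sess.graph)
    print(sess.run(x))
    writer.close()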