As in this picture, I want to add scalars to an existing events.out.tfevents file rather than create a new one.
How can I set the parameters in this code:
SummaryWriter(self, log_dir=None, comment='', purge_step=None, max_queue=10,
flush_secs=120, filename_suffix='')
You should be able to run it the same way (e.g. log_dir has to be the same, tensorboard in your case).
You have to remember to use the next global step when adding scalars, though.
First run; assume it crashed at the 9th step:
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("my_dir")
x = range(10)

for i in x:
    writer.add_scalar("y=x", i, i)

writer.close()
If you want to continue writing to this event file, you have to shift the last parameter (the global step) by 10:
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter("my_dir")
x = range(10)

for i in x:
    writer.add_scalar("y=x", i, i + 10)  # start from step 10

writer.close()
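If you would rather not hard-code the offset, one option is to persist the next global step between runs. Below is a minimal sketch using a hypothetical StepCounter helper (not part of PyTorch) that stores the step in a small text file next to your logs:

```python
import os


class StepCounter:
    """Persist the next global step across runs (hypothetical helper,
    not part of PyTorch). Stores a single integer in a text file."""

    def __init__(self, path):
        self.path = path
        if os.path.exists(path):
            # Resume from the step saved by the previous run
            with open(path) as f:
                self.step = int(f.read().strip())
        else:
            self.step = 0

    def next(self):
        """Return the current step, then advance and persist it."""
        step = self.step
        self.step += 1
        with open(self.path, "w") as f:
            f.write(str(self.step))
        return step
```

With this, each run can call writer.add_scalar("y=x", value, counter.next()) and the global steps stay contiguous across restarts, at the cost of one small file write per step.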
Running the first file, followed by the second one, and opening TensorBoard via tensorboard --logdir my_dir would give you: