
Why are my TensorBoard plots weirdly curved?


Recently I started using TensorBoard to monitor the learning progress of my models, but I noticed that when consecutive scalars lie too far from each other, the graph becomes weirdly curved (you can see it on the second graph: sudden jumps draw strange lines back toward the Y axis and out again). What I expect is exactly what the first graph shows - the single-fold categorization accuracy looks perfectly fine to me.

Is there any way I can deal with this and make the graph eye-friendly?

TensorBoard output

Output to the second graph is performed as in any deep learning training loop (note: the original condition `if step % 50 and step != 0` fired on every step *not* divisible by 50; `step % 50 == 0` is what logs once every 50 steps):

from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()
for epoch in range(1, epochs + 1):
    for step, (images, labels) in enumerate(train_dataloader):
        # *do deep learning stuff*
        if step % 50 == 0 and step != 0:
            writer.add_scalar("Plot_name", loss, step)

P.S. By "too far from each other" I mean a sequence of (scalar, step) pairs like this: [1.2, 50], [1.1, 100], [1.05, 150], [0.8, 200].

That is quite a big jump between the penultimate and last scalars.


Solution

  • Thanks everyone for participating. I just found out that I had also been plotting Validation Loss/Epoch to the same "Single-fold loss" tag. That series used step values near 0 (pairs like [1.01, 1], [0.95, 2], [0.8, 3]), so its points were interleaved with the much larger training steps, and TensorBoard drew those weirdly curved lines back toward the Y axis while connecting them all.

    Now, with the two series separated, it looks as expected.

    Fixed TensorBoard
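The issue generalizes: when two scalar series share one tag, TensorBoard connects all of their points in step order, so mixing per-batch steps with per-epoch indices makes the line snap back toward the Y axis. A minimal sketch of the fix - one tag per series - using a plain list to stand in for what `add_scalar` records (the tag names, `log_scalar` helper, and loss values below are assumptions for illustration, not from the original post):

```python
# Simulated event log: (tag, step, value) triples, as add_scalar would record them.
events = []

def log_scalar(tag, value, step):
    """Stand-in for writer.add_scalar(tag, value, step)."""
    events.append((tag, step, value))

global_step = 0
for epoch in range(1, 3):
    for step in range(1, 101):
        global_step += 1
        if global_step % 50 == 0:
            # Training loss: per-batch steps, its own tag.
            log_scalar("Loss/train", 1.0 / global_step, global_step)
    # Validation loss: per-epoch index, a SEPARATE tag, so its small
    # step values (1, 2, ...) never interleave with the training steps.
    log_scalar("Loss/val", 1.0 / epoch, epoch)

# Within each tag the steps are strictly increasing - no lines jump back.
for tag in ("Loss/train", "Loss/val"):
    steps = [s for t, s, _ in events if t == tag]
    assert steps == sorted(steps)
```

With a real `SummaryWriter` the same idea is simply `writer.add_scalar("Loss/train", loss, global_step)` for training and `writer.add_scalar("Loss/val", val_loss, epoch)` for validation; TensorBoard then renders them as two clean curves.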