Tags: python, tensorflow, keras, tensorboard

Why is the Keras TensorBoard scalar graph not linear (it loops back on itself)?


I'm using TensorBoard via Keras, but the scalar graph is messed up: it is not linear and loops back on itself. Is there any way to correct this?

[Screenshot: TensorBoard scalar plot where the curve doubles back and loops over itself]

from keras.callbacks import TensorBoard
from keras import backend as K

class LRTensorBoard(TensorBoard):
    def __init__(self, log_dir):
        super().__init__(log_dir=log_dir)

    def on_epoch_end(self, epoch, logs=None):
        # Log the optimizer's current learning rate alongside the other scalars.
        logs = logs or {}
        logs.update({'lr': K.eval(self.model.optimizer.lr)})
        super().on_epoch_end(epoch, logs)

from keras.models import Sequential
from keras.layers import GRU, Dense
from keras.metrics import categorical_accuracy

model = Sequential()
model.add(GRU(16, input_shape=(TimeStep.TIME_STEP + 1, TimeStep.FEATURES), activation='relu', return_sequences=True))
model.add(GRU(16, activation='relu', return_sequences=True))
model.add(GRU(16, activation='relu'))
model.add(Dense(3, activation='softmax'))

tensorboard = TensorBoard(log_dir=logDir, histogram_freq=0, write_graph=True)
tensorboard.set_model(model)

model.compile(loss='categorical_crossentropy', optimizer=optimize, metrics=[categorical_accuracy])
history = model.fit(TimeStep.fodder, TimeStep.target, epochs=100, shuffle=True, batch_size=4064, validation_split=0.3, callbacks=[tensorboard, LRTensorBoard(log_dir=logDir)])

Solution

  • This happens because TensorBoard expects the logs of each run to live in its own directory. For example, if you have two models named CNN1 and CNN2, you should have the following structure:

    logs/
        CNN1/
        CNN2/
    

    If you do not have this structure, TensorBoard assumes that both sets of logs belong to the same training run, hence the weird, looping curves.

    [EDIT] Reading your code, I see one easy fix: when you specify the log directory, append a timestamped subdirectory as a suffix so that every run writes to its own folder.
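
    A minimal sketch of that fix, assuming the standard Keras TensorBoard callback and a base directory named logs (substitute your own logDir and keep your other callbacks as they are):

    import os
    from datetime import datetime
    from keras.callbacks import TensorBoard

    # Hypothetical base directory; replace with your own logDir.
    base_log_dir = 'logs'

    # One unique, timestamped subdirectory per training run.
    run_log_dir = os.path.join(base_log_dir, datetime.now().strftime('%Y%m%d-%H%M%S'))

    tensorboard = TensorBoard(log_dir=run_log_dir, histogram_freq=0, write_graph=True)
    # ...then pass it to model.fit(..., callbacks=[tensorboard]) as before.

    This way each call to fit() produces a fresh directory under logs/, and TensorBoard plots each run as its own curve instead of stitching them into one looping line.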