Tags: python, keras, tensorboard, keras-2

How to add variables to progress bar in Keras?


I'd like to monitor, e.g., the learning rate during training in Keras, both in the progress bar and in TensorBoard. I figure there must be a way to specify which variables are logged, but there's no immediate clarification on this issue on the Keras website.

I guess it has something to do with creating a custom Callback; however, it should also be possible to modify the already existing progress bar callback, shouldn't it?


Solution

  • It can be achieved via a custom metric. Take the learning rate as an example:

    import numpy as np
    from keras.layers import Input, Dense
    from keras.models import Model
    from keras.optimizers import Adam
    from keras.callbacks import LearningRateScheduler, TensorBoard
    
    def get_lr_metric(optimizer):
        # expose the optimizer's current learning rate as a metric
        def lr(y_true, y_pred):
            return optimizer.lr
        return lr
    
    x = Input((50,))
    out = Dense(1, activation='sigmoid')(x)
    model = Model(x, out)
    
    optimizer = Adam(lr=0.001)
    lr_metric = get_lr_metric(optimizer)
    model.compile(loss='binary_crossentropy', optimizer=optimizer,
                  metrics=['acc', lr_metric])
    
    # halve the learning rate every 2 epochs
    cbks = [LearningRateScheduler(lambda epoch: 0.001 * 0.5 ** (epoch // 2)),
            TensorBoard(write_graph=False)]
    
    # dummy data for demonstration
    X = np.random.rand(1000, 50)
    Y = np.random.randint(2, size=1000)
    model.fit(X, Y, epochs=10, callbacks=cbks)
    

    The LR will be printed in the progress bar:

    Epoch 1/10
    1000/1000 [==============================] - 0s 103us/step - loss: 0.8228 - acc: 0.4960 - lr: 0.0010
    Epoch 2/10
    1000/1000 [==============================] - 0s 61us/step - loss: 0.7305 - acc: 0.4970 - lr: 0.0010
    Epoch 3/10
    1000/1000 [==============================] - 0s 62us/step - loss: 0.7145 - acc: 0.4730 - lr: 5.0000e-04
    Epoch 4/10
    1000/1000 [==============================] - 0s 58us/step - loss: 0.7129 - acc: 0.4800 - lr: 5.0000e-04
    Epoch 5/10
    1000/1000 [==============================] - 0s 58us/step - loss: 0.7124 - acc: 0.4810 - lr: 2.5000e-04
    Epoch 6/10
    1000/1000 [==============================] - 0s 63us/step - loss: 0.7123 - acc: 0.4790 - lr: 2.5000e-04
    Epoch 7/10
    1000/1000 [==============================] - 0s 61us/step - loss: 0.7119 - acc: 0.4840 - lr: 1.2500e-04
    Epoch 8/10
    1000/1000 [==============================] - 0s 61us/step - loss: 0.7117 - acc: 0.4880 - lr: 1.2500e-04
    Epoch 9/10
    1000/1000 [==============================] - 0s 59us/step - loss: 0.7116 - acc: 0.4880 - lr: 6.2500e-05
    Epoch 10/10
    1000/1000 [==============================] - 0s 63us/step - loss: 0.7115 - acc: 0.4880 - lr: 6.2500e-05
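
    Since lr is compiled as an ordinary metric, it is also recorded per epoch by the History callback. A small sketch, assuming the model.fit call above is changed to capture its return value (the name history is just for illustration):

    history = model.fit(X, Y, epochs=10, callbacks=cbks)
    # history.history now holds per-epoch lists for 'loss', 'acc' and 'lr'
    print(history.history['lr'])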
    

    Then, you can visualize the LR curve in TensorBoard.

    (screenshot: TensorBoard scalar plot of the lr metric)
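
    As for the custom Callback route suggested in the question: a callback can inject the current learning rate into the logs dict so that History and TensorBoard pick it up. A minimal sketch, assuming Keras 2 and the same setup as above (the LRLogger name is made up for illustration); whether the injected value also reaches the progress bar depends on the Keras version's callback ordering, so the metric approach above is the more dependable way to get it into the bar:

    from keras import backend as K
    from keras.callbacks import Callback
    
    class LRLogger(Callback):
        def on_epoch_end(self, epoch, logs=None):
            # write the current LR into the logs dict shared by all callbacks
            logs = logs or {}
            logs['lr'] = K.get_value(self.model.optimizer.lr)
    
    # place it before TensorBoard so 'lr' is already in logs when
    # TensorBoard's on_epoch_end runs
    cbks = [LearningRateScheduler(lambda epoch: 0.001 * 0.5 ** (epoch // 2)),
            LRLogger(),
            TensorBoard(write_graph=False)]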