Tags: python-3.x, deep-learning, tensorflow-2.0, learning-rate

TensorFlow 2.0 - Learning Rate Scheduler


I am using Python 3.7 and TensorFlow 2.0, and I need to train a neural network for 160 epochs with the following learning rate schedule:

Decrease the learning rate by a factor of 10 at epochs 80 and 120, with an initial learning rate of 0.01.

I wrote the following function to implement this learning rate schedule:

def scheduler(epoch):
    if epoch < 80:
        return 0.01
    elif epoch >= 80 and epoch < 120:
        return 0.01 / 10
    elif epoch >= 120:
        return 0.01 / 100

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)

model.fit(
    x = data, y = labels,
    epochs=160, callbacks=[callback],
    validation_data=(val_data, val_labels))

Is this a correct implementation?

Thanks!


Solution

  • The tf.keras.callbacks.LearningRateScheduler expects a schedule function that takes the epoch index (an integer, indexed from 0) and returns a new learning rate (a float), so your one-argument scheduler is valid. The function may also accept the current learning rate as a second argument, which lets you derive the new rate from the current one instead of hard-coding each value:

    def scheduler(epoch, current_learning_rate):
        # Epochs are 0-indexed, so this fires at the start of the
        # epochs numbered 80 and 120, matching the schedule above.
        if epoch == 80 or epoch == 120:
            return current_learning_rate / 10
        else:
            return current_learning_rate
    

    This reduces the learning rate by a factor of 10 at epochs 80 and 120 and leaves it unchanged in all other epochs.
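    Since the schedule itself involves no TensorFlow, you can sanity-check the per-epoch values with plain Python before training. The sketch below mirrors (rather than calls) the callback's per-epoch update, in which the value returned by the schedule function is written back to the optimizer at the start of each epoch:

    ```python
    # Trace the learning rate the two-argument scheduler would produce
    # over 160 epochs. LearningRateScheduler calls schedule(epoch, lr)
    # on each epoch begin and assigns the result to the optimizer.

    def scheduler(epoch, current_learning_rate):
        # Drop the rate by 10x at the start of 0-indexed epochs 80 and 120.
        if epoch == 80 or epoch == 120:
            return current_learning_rate / 10
        return current_learning_rate

    lr = 0.01  # initial learning rate
    history = []
    for epoch in range(160):
        lr = scheduler(epoch, lr)
        history.append(lr)

    # Epochs 0-79 train at 0.01, epochs 80-119 at 0.001,
    # and epochs 120-159 at 0.0001.
    print(history[0], history[80], history[159])
    ```

    The same piecewise schedule can also be expressed declaratively with tf.keras.optimizers.schedules.PiecewiseConstantDecay and passed directly to the optimizer, which avoids the callback entirely.
    
    
    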