python, tensorflow, keras, deep-learning, tensorflow2.0

Is there a way to reset the learning rate on each fold while employing the ReduceLROnPlateau callback of Keras?


As the title says, I'm looking for a way to reset the learning rate (lr) on each fold; the lr is managed by the ReduceLROnPlateau callback of Keras.


Solution

  • With no reproducible example, I can only make a suggestion. If you take a look at the source code of ReduceLROnPlateau, you can get some inspiration and create a custom callback that resets the learning rate at the beginning of training:

    import tensorflow as tf

    class ResetLR(tf.keras.callbacks.Callback):
      def on_train_begin(self, logs=None):
        # Restore the optimizer's learning rate at the start of every fit() call
        default_lr = 0.1
        previous_lr = self.model.optimizer.lr.read_value()
        if previous_lr != default_lr:
          print("Resetting learning rate from {} to {}".format(previous_lr, default_lr))
          self.model.optimizer.lr.assign(default_lr)
    

    So with this callback, you train using a for loop:

    custom_callback = ResetLR()
    for fold in folds:
      model.fit(..., callbacks=[custom_callback])
    
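    For a more complete picture, here is a minimal, self-contained sketch of that fold loop combining ResetLR with ReduceLROnPlateau. It assumes NumPy arrays X and y and a hypothetical build_model() helper; all names and hyperparameters are illustrative, not from the question:

    import numpy as np
    import tensorflow as tf
    from sklearn.model_selection import KFold

    def build_model():
      # Toy model; replace with your own architecture
      model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(1),
      ])
      # Initial learning rate matches the default_lr used by ResetLR
      model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1), loss="mse")
      return model

    X = np.random.rand(100, 10)
    y = np.random.rand(100, 1)

    model = build_model()
    reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5, patience=2)
    reset_lr = ResetLR()  # resets the lr to 0.1 before each fold's training

    for train_idx, val_idx in KFold(n_splits=5).split(X):
      model.fit(X[train_idx], y[train_idx],
                validation_data=(X[val_idx], y[val_idx]),
                epochs=10,
                callbacks=[reduce_lr, reset_lr])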

    If this does not work (due to TensorFlow version differences), you can try assigning the default learning rate through tf.keras.backend, like so:

    import tensorflow as tf

    class ResetLR(tf.keras.callbacks.Callback):
      def on_train_begin(self, logs=None):
        default_lr = 0.1
        # Read/write the learning rate through the backend for broader version support
        previous_lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))
        if previous_lr != default_lr:
          print("Resetting learning rate from {} to {}".format(previous_lr, default_lr))
          tf.keras.backend.set_value(self.model.optimizer.lr, default_lr)
    
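    One caveat: in newer releases (Keras 3 and recent TensorFlow versions) the attribute is named optimizer.learning_rate, and the old lr alias may no longer be available. If neither snippet works, try substituting self.model.optimizer.learning_rate for self.model.optimizer.lr.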

    Also, I would suggest taking a look at this post for more references.