Tags: python, tensorflow, keras, optimization

Keras custom Optimizer ValueError: Missing learning rate


I'm writing a custom optimizer from scratch with the Keras API in TensorFlow, and it raises the following error:

ValueError: Missing learning rate, please set self.learning_rate at optimizer creation time.

I am setting self.learning_rate in the optimizer's constructor, but the problem persists. Here is the __init__ method of my optimizer:

def __init__(self, lr=0.001, beta_1=0.9, beta_2=0.999,
             momentum=0.0, epsilon=None, decay=0.0, name='AdamSGD', **kwargs):
    super(AdamSGD, self).__init__(name=name, **kwargs)
    self.iterations = K.variable(0, dtype='int64', name='iterations')
    self.learning_rate = K.variable(lr, name='lr')
    self.beta_1 = K.variable(beta_1, name='beta_1')
    self.beta_2 = K.variable(beta_2, name='beta_2')
    self.momentum = K.variable(momentum, name='momentum')
    self.epsilon = epsilon or K.epsilon()
    self.decay = K.variable(decay, name='decay')

This is how I compile my model, which is where the error is triggered:

model.compile(
    optimizer=AdamSGD(lr=0.01),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)

The entire error:

     24 # Compile the model
     25 model.compile(
---> 26     optimizer=AdamSGD(lr=0.01),
     27     loss='categorical_crossentropy',
     28     metrics=['accuracy']

     24         self.iterations = K.variable(0, dtype='int64', name='iterations')
---> 25         self.learning_rate = K.variable(lr, name='lr')
     26         self.beta_1 = K.variable(beta_1, name='beta_1')
     27         self.beta_2 = K.variable(beta_2, name='beta_2')

     60     try:
---> 61       if getattr(self, name) is value:
     62         # Short circuit for `self.$x = self.$x`.
     63         return

    330     def learning_rate(self):
    331         if not hasattr(self, "_learning_rate") or self._learning_rate is None:
--> 332             raise ValueError(
    333                 "Missing learning rate, please set self.learning_rate at"
    334                 " optimizer creation time."

ValueError: Missing learning rate, please set self.learning_rate at optimizer creation time.

If anyone has suggestions or can see where I'm going wrong, please let me know; I've been stuck on this error for quite some time.


Solution

  • The error message is counterintuitive, but you "simply" need to set the attribute self._learning_rate (note the leading underscore) in your constructor. You can do so with the helper method self._build_learning_rate(learning_rate).

    from tensorflow import keras
    from tensorflow.keras import backend as K

    class AdamSGD(keras.optimizers.Optimizer):
        def __init__(self, lr=0.001, beta_1=0.9, beta_2=0.999,
                     momentum=0.0, epsilon=None, decay=0.0, name='AdamSGD', **kwargs):
            super().__init__(name=name, **kwargs)
            self.iterations = K.variable(0, dtype='int64', name='iterations')
            # The fix: set self._learning_rate via the base class helper.
            self._learning_rate = self._build_learning_rate(lr)
            self.beta_1 = K.variable(beta_1, name='beta_1')
            self.beta_2 = K.variable(beta_2, name='beta_2')
            self.momentum = K.variable(momentum, name='momentum')
            self.epsilon = epsilon or K.epsilon()
            self.decay = K.variable(decay, name='decay')

    This helper allows the optimizer to accept either a plain float or a keras.optimizers.schedules.LearningRateSchedule as the learning rate.
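    For example, both of these should now work (a quick usage sketch, assuming the fixed AdamSGD above; the ExponentialDecay parameters are purely illustrative):

    opt = AdamSGD(lr=0.01)  # a plain float

    schedule = keras.optimizers.schedules.ExponentialDecay(
        initial_learning_rate=0.01, decay_steps=1000, decay_rate=0.9)
    opt = AdamSGD(lr=schedule)  # a learning rate schedule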

    You might also need to implement some extra methods; see this docstring in the Keras source code (a concrete sketch follows after the list):

    Creating a custom optimizer

    If you intend to create your own optimization algorithm, please inherit from this class and override the following methods:

    • build: Create your optimizer-related variables, such as momentums in SGD optimizer.
    • update_step: Implement your optimizer's updating logic.
    • get_config: serialization of the optimizer; include all hyperparameters.
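
    To make this concrete, here is a minimal sketch of a custom optimizer that overrides all three methods. It follows the pattern of the built-in SGD implementation and assumes TF 2.11+, where this Optimizer base class applies; the name MySGD and the plain momentum update are purely illustrative:

    import tensorflow as tf
    from tensorflow import keras

    class MySGD(keras.optimizers.Optimizer):
        def __init__(self, learning_rate=0.01, momentum=0.9, name='MySGD', **kwargs):
            super().__init__(name=name, **kwargs)
            self._learning_rate = self._build_learning_rate(learning_rate)
            self.momentum = momentum

        def build(self, var_list):
            # Create one momentum slot variable per trainable variable.
            super().build(var_list)
            if hasattr(self, '_built') and self._built:
                return
            self.momentums = [
                self.add_variable_from_reference(var, 'momentum')
                for var in var_list
            ]
            self._built = True

        def update_step(self, gradient, variable):
            # v <- momentum * v - lr * grad;  variable <- variable + v
            # (handles dense gradients only, for brevity)
            lr = tf.cast(self.learning_rate, variable.dtype)
            m = self.momentums[self._index_dict[self._var_key(variable)]]
            m.assign(self.momentum * m - lr * gradient)
            variable.assign_add(m)

        def get_config(self):
            # Serialize every hyperparameter so the optimizer can be re-created.
            config = super().get_config()
            config.update({
                'learning_rate': self._serialize_hyperparameter(self._learning_rate),
                'momentum': self.momentum,
            })
            return config

    With that in place, model.compile(optimizer=MySGD(learning_rate=0.01), ...) should then work, and get_config lets the optimizer survive model saving and reloading.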