python tensorflow conv-neural-network transfer-learning

TypeError: '<' not supported between instances of 'LearningRateScheduler' and 'int'


I've been trying to train VGG16 from scratch, but when I compile the model I get this error.
Here's the reference I've been following: Kaggle VGG16

TypeError                                 Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_16112\3102448775.py in <module>
----> 1 model.compile(optimizer=keras.optimizers.Adam(lr=lr_callback),
      2              loss="categorical_crossentropy",
      3               metrics=['accuracy']
      4              )

D:\Users\moon\anaconda3\lib\site-packages\keras\optimizers\optimizer_v2\adam.py in __init__(self, learning_rate, beta_1, beta_2, epsilon, amsgrad, name, **kwargs)
    108                name='Adam',
    109                **kwargs):
--> 110     super(Adam, self).__init__(name, **kwargs)
    111     self._set_hyper('learning_rate', kwargs.get('lr', learning_rate))
    112     self._set_hyper('decay', self._initial_decay)

D:\Users\moon\anaconda3\lib\site-packages\keras\optimizers\optimizer_v2\optimizer_v2.py in __init__(self, name, gradient_aggregator, gradient_transformers, **kwargs)
    353                         f"{allowed_kwargs}.")
    354       # checks that all keyword arguments are non-negative.
--> 355       if kwargs[k] is not None and kwargs[k] < 0:
    356         raise ValueError("Expected {} >= 0, received: {}".format(k, kwargs[k]))
    357       if k == "lr":

TypeError: '<' not supported between instances of 'LearningRateScheduler' and 'int'

Here's my code:

initial_learning_rate = 0.1
decay_steps = 1000
decay_rate = 0.5
staircase = True

# Exponential decay schedule: intended to multiply the learning rate by
# decay_rate once every decay_steps epochs (in discrete jumps when
# staircase is True, continuously otherwise).
def lr_schedule(epoch, lr):
    if staircase:
        return lr * decay_rate ** (epoch // decay_steps)
    else:
        return lr * decay_rate ** (epoch / decay_steps)


from keras.callbacks import LearningRateScheduler

lr_callback = LearningRateScheduler(lr_schedule)
# early_stopping and learning_rate_reduce are defined earlier in the notebook
callback = [early_stopping, learning_rate_reduce, lr_callback]


model.compile(optimizer=keras.optimizers.Adam(lr=lr_callback),
              loss="categorical_crossentropy",
              metrics=['accuracy'])

I tried changing lr to learning_rate. Then it compiles just fine, but when I start training the model, I get this error:

ValueError: Attempt to convert a value (<keras.callbacks.LearningRateScheduler object at 0x00000208E3A16640>) with an unsupported type (<class 'keras.callbacks.LearningRateScheduler'>) to a Tensor.
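
Why both attempts fail: Adam's lr / learning_rate argument accepts a plain float or a tf.keras.optimizers.schedules.LearningRateSchedule object, while LearningRateScheduler is a training callback that belongs only in the callbacks argument of model.fit. As a minimal sketch, assuming TF 2.x and the same decay constants as above, the optimizer-side way to express this decay would be the built-in ExponentialDecay schedule:

from tensorflow import keras

# A LearningRateSchedule object is a valid value for learning_rate
# (a LearningRateScheduler callback is not).
decayed_lr = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1,
    decay_steps=1000,   # note: counted in optimizer steps (batches), not epochs
    decay_rate=0.5,
    staircase=True,
)

model.compile(optimizer=keras.optimizers.Adam(learning_rate=decayed_lr),
              loss="categorical_crossentropy",
              metrics=['accuracy'])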

Solution

  • I think I've solved it. I removed the lr / learning rate argument from Adam, because the learning rate is already controlled by the LearningRateScheduler, which is applied through the callbacks. Another problem I ran into: I have 2 classes, but the output layer only had 1 unit, same as in the Kaggle notebook I linked. That's it, thanks, folks!
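
    For anyone hitting the same thing, here is a minimal sketch of the working setup described above, assuming the VGG16 model built earlier; train_data, val_data, and the epoch count are placeholders:

    from tensorflow import keras

    # No lr/learning_rate argument here: the LearningRateScheduler callback
    # sets the optimizer's learning rate at the start of each epoch.
    model.compile(optimizer=keras.optimizers.Adam(),
                  loss="categorical_crossentropy",
                  metrics=['accuracy'])

    # With categorical_crossentropy and 2 classes, the output layer needs
    # 2 units, e.g. layers.Dense(2, activation="softmax"), not 1.

    history = model.fit(train_data,                # placeholder dataset
                        validation_data=val_data,  # placeholder dataset
                        epochs=20,                 # placeholder
                        callbacks=callback)        # [early_stopping, learning_rate_reduce, lr_callback]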