I am trying to load a saved model that was compiled with Adagrad optimizer.
import tensorflow as tf
from tensorflow.python import keras
from keras.optimizers import Adagrad
from Mod import MyLossPokus
modelName = "/some/path"
model = keras.models.load_model(modelName, custom_objects={'MyLossPokus': MyLossPokus, 'Custom>Adagrad': Adagrad})
But I get a weird error:
TypeError: Unexpected keyword argument passed to optimizer: weight_decay. Allowed kwargs are {'global_clipnorm', 'decay', 'clipvalue', 'lr', 'clipnorm'}.
(I am not passing the weight_decay argument at all!)
Is this a bug, or am I calling load_model incorrectly?
Full traceback:
  File "/aux/MakePredictions.py", line 105, in <module>
    LiveGames()
  File "/aux/MakePredictions.py", line 38, in LiveGames
    PredictionSmall (game)
  File "/aux/MakePredictions.py", line 56, in PredictionSmall
    model = keras.models.load_model(modelName, custom_objects={'MyLossPokus': MyLossPokus, "Custom>Adagrad": Adagrad } )
  File "/home/au/.local/lib/python3.10/site-packages/tensorflow/python/keras/saving/save.py", line 205, in load_model
    return saved_model_load.load(filepath, compile, options)
  File "/home/au/.local/lib/python3.10/site-packages/tensorflow/python/keras/saving/saved_model/load.py", line 168, in load
    model.compile(**saving_utils.compile_args_from_training_config(
  File "/home/au/.local/lib/python3.10/site-packages/tensorflow/python/keras/saving/saving_utils.py", line 207, in compile_args_from_training_config
    optimizer = optimizers.deserialize(optimizer_config)
  File "/home/au/.local/lib/python3.10/site-packages/tensorflow/python/keras/optimizers.py", line 94, in deserialize
    return deserialize_keras_object(
  File "/home/au/.local/lib/python3.10/site-packages/tensorflow/python/keras/utils/generic_utils.py", line 674, in deserialize_keras_object
    deserialized_obj = cls.from_config(
  File "/home/au/.local/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/adagrad.py", line 138, in from_config
    return cls(**config)
  File "/home/au/.local/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/adagrad.py", line 84, in __init__
    super().__init__(name, **kwargs)
  File "/home/au/.local/lib/python3.10/site-packages/keras/optimizers/optimizer_v2/optimizer_v2.py", line 379, in __init__
    raise TypeError(
TypeError: Unexpected keyword argument passed to optimizer: weight_decay. Allowed kwargs are {'global_clipnorm', 'decay', 'clipvalue', 'lr', 'clipnorm'}.
Edit: I figured out what works for me: keeping the import from keras.optimizers import Adagrad and then not providing Custom>Adagrad in custom_objects at all :)
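For reference, a minimal sketch of the load call that works for me, registering only the custom loss and letting Keras deserialize its built-in Adagrad on its own (the path and MyLossPokus are placeholders from my project, so adjust them for yours):

```python
import keras
from Mod import MyLossPokus  # my custom loss class

modelName = "/some/path"

# Only the custom loss needs to be in custom_objects; Adagrad is a
# built-in optimizer, so Keras reconstructs it from the saved config.
model = keras.models.load_model(
    modelName,
    custom_objects={"MyLossPokus": MyLossPokus},
)
```

Note this imports keras directly rather than mixing tensorflow.python.keras with the standalone keras package, which is what my original code did.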