Tags: python, tensorflow, keras

NotImplementedError: Layers with arguments in `__init__` must override `get_config`


I'm trying to save my TensorFlow model using model.save(), but I am getting this error:

The model summary is provided here: Model Summary

The code for the transformer model:

import tensorflow as tf

# create_padding_mask, create_look_ahead_mask, encoder and decoder are defined elsewhere in the project.
def transformer(vocab_size, num_layers, units, d_model, num_heads, dropout, name="transformer"):
    inputs = tf.keras.Input(shape=(None,), name="inputs")
    dec_inputs = tf.keras.Input(shape=(None,), name="dec_inputs")

    enc_padding_mask = tf.keras.layers.Lambda(
        create_padding_mask, output_shape=(1, 1, None),
        name='enc_padding_mask')(inputs)
    # mask the future tokens for decoder inputs at the 1st attention block
    look_ahead_mask = tf.keras.layers.Lambda(
        create_look_ahead_mask,
        output_shape=(1, None, None),
        name='look_ahead_mask')(dec_inputs)
    # mask the encoder outputs for the 2nd attention block
    dec_padding_mask = tf.keras.layers.Lambda(
        create_padding_mask, output_shape=(1, 1, None),
        name='dec_padding_mask')(inputs)

    enc_outputs = encoder(
        vocab_size=vocab_size,
        num_layers=num_layers,
        units=units,
        d_model=d_model,
        num_heads=num_heads,
        dropout=dropout,
    )(inputs=[inputs, enc_padding_mask])

    dec_outputs = decoder(
        vocab_size=vocab_size,
        num_layers=num_layers,
        units=units,
        d_model=d_model,
        num_heads=num_heads,
        dropout=dropout,
    )(inputs=[dec_inputs, enc_outputs, look_ahead_mask, dec_padding_mask])

    outputs = tf.keras.layers.Dense(units=vocab_size, name="outputs")(dec_outputs)

    return tf.keras.Model(inputs=[inputs, dec_inputs], outputs=outputs, name=name)

I don't understand why it's giving this error since the model trains perfectly fine. Any help would be appreciated.

My saving code for reference:

print("Saving the model.")
saveloc = "C:/tmp/solar.h5"
model.save(saveloc)
print("Model saved to: " + saveloc + " succesfully.")

Solution

  • It's not a bug, it's a feature.

    This error lets you know that TF can't save your model, because it won't be able to load it.
    Specifically, it won't be able to reinstantiate your custom Layer classes: encoder and decoder.

    To solve this, just override their get_config method according to the new arguments you've added.

    A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.


    For example, if your encoder class looks something like this:

    class encoder(tf.keras.layers.Layer):
    
        def __init__(
            self,
            vocab_size, num_layers, units, d_model, num_heads, dropout,
            **kwargs,
        ):
            super().__init__(**kwargs)
            self.vocab_size = vocab_size
            self.num_layers = num_layers
            self.units = units
            self.d_model = d_model
            self.num_heads = num_heads
            self.dropout = dropout
    
        # Other methods etc.
    

    then you only need to override this method:

        def get_config(self):
    
            config = super().get_config().copy()
            config.update({
                'vocab_size': self.vocab_size,
                'num_layers': self.num_layers,
                'units': self.units,
                'd_model': self.d_model,
                'num_heads': self.num_heads,
                'dropout': self.dropout,
            })
            return config
    

    Once both classes override get_config this way, TF will be able to save the model, because when the model is loaded it can reinstantiate the same layers from their configs.
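    To see the round trip concretely, here is a minimal sketch (the hyperparameter values are made up purely for illustration and assume the encoder class sketched above):

    layer = encoder(vocab_size=8192, num_layers=2, units=512,
                    d_model=256, num_heads=8, dropout=0.1)
    config = layer.get_config()              # plain, serializable dict
    restored = encoder.from_config(config)   # equivalent layer, fresh (untrained) weights
    assert restored.d_model == layer.d_model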


    Layer.from_config's source code may give a better sense of how it works:

    @classmethod
    def from_config(cls, config):
      return cls(**config)
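
    As a usage note, once both layers expose get_config, loading the saved file back works with tf.keras.models.load_model; a short sketch (using the .h5 path from the question and assuming encoder and decoder are importable in the loading script):

    import tensorflow as tf

    # Map the custom class names to the classes so Keras can rebuild them.
    model = tf.keras.models.load_model(
        "C:/tmp/solar.h5",
        custom_objects={"encoder": encoder, "decoder": decoder},
    )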