I am getting the following error when I try to load my model with tf.keras.models.load_model():
ValueError: The mask that was passed in was tf.RaggedTensor(values=Tensor("Placeholder_2:0", shape=(None,), dtype=bool),
row_splits=Tensor("Placeholder_3:0", shape=(None,), dtype=int64))
and cannot be applied to RaggedTensor inputs.
Please make sure that there is no mask passed in by upstream layers.
Here is my model architecture:
model = tf.keras.Sequential([
    encoder,
    tf.keras.layers.Embedding(input_dim=len(encoder.get_vocabulary()), output_dim=64, mask_zero=True),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.GlobalMaxPool1D(),
    tf.keras.layers.Dense(7)
])
The encoding layer:
encoder = tf.keras.layers.experimental.preprocessing.TextVectorization(
    max_tokens=VOCAB_SIZE)
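For completeness, the encoder would have been adapted on the training text before building the model, so that encoder.get_vocabulary() reflects the training data. A minimal sketch, where train_text is a hypothetical dataset (or array) of raw strings:

# train_text is a hypothetical tf.data.Dataset or array of raw strings;
# adapt() builds the vocabulary used by encoder.get_vocabulary()
encoder.adapt(train_text)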
The model was saved with: model.save(PATH)
I am loading the model from a different notebook. Can I get some help?
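For reference, this is the load step that fails; a minimal sketch, assuming PATH is the same path that was passed to model.save() in the training notebook:

import tensorflow as tf

# In the second notebook: this call raises the ValueError shown above
# when the Embedding layer was built with mask_zero=True
model = tf.keras.models.load_model(PATH)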
Okay, I solved this by removing the mask_zero=True argument from the Embedding layer. However, I'm not sure why this works, or why loading fails when mask_zero=True is set. It would be helpful if someone could explain the reason.
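For reference, this is the Embedding layer after the change described above (just the workaround, not an explanation of the underlying masking behaviour): mask_zero is left at its default of False, so no mask is propagated to the LSTM.

tf.keras.layers.Embedding(
    input_dim=len(encoder.get_vocabulary()),
    output_dim=64)  # mask_zero defaults to False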