python · tensorflow · machine-learning · keras · autoencoder

NotImplementedError: Layer ModuleWrapper has arguments in `__init__` and therefore must override `get_config`


I'm trying to save my autoencoder model (for classification) to disk but the following error appears when doing: model.save(model_name)

NotImplementedError: Layer ModuleWrapper has arguments in __init__ and therefore must override get_config.

This is part of my code:

import numpy as np
import tensorflow as tf  # TF 2.5.0
from tensorflow import keras
from tensorflow.keras.models import Sequential
from keras.layers import Dense
from tensorflow.keras import layers
from tensorflow.keras import regularizers
from tensorflow.keras.utils import to_categorical
import matplotlib.pyplot as plt
import pandas as pd
import sklearn
from sklearn.model_selection import train_test_split
from callbacks import all_callbacks
import os, time

print(train_data.shape, train_labels.shape, test_data.shape, test_labels.shape )
# Shape -> (3680, 1024, 1) (3680, 10) (920, 1024, 1) (920, 10)

act_func = 'relu'
out_func = 'softmax'
k_inic = 'glorot_uniform'

def create_model():
    model = Sequential()
    model.add(Dense(512,activation=act_func, kernel_initializer=k_inic))
    model.add(Dense(100,activation=act_func, kernel_initializer=k_inic))  
    model.add(Dense(10, activation=out_func, kernel_initializer=k_inic))

    opt = keras.optimizers.Adam()        
    model.compile(loss='mse', optimizer=opt, metrics=["accuracy"])
    model.build(input_shape=(None, 1024))
    return model    

model = create_model()
history = model.fit(train_data, train_labels, epochs=EPOCHS, batch_size=BATCH_SIZE, validation_split=VALIDATION_SPLIT, verbose=0)
res = model.evaluate(test_data, test_labels, batch_size=BATCH_SIZE, verbose=0)[1]

model_name = "autoencoder_crwu"
model.save(model_name)

Model summary:

Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
module_wrapper_24 (ModuleWra (None, 512)               524800    
_________________________________________________________________
module_wrapper_25 (ModuleWra (None, 100)               51300     
_________________________________________________________________
module_wrapper_26 (ModuleWra (None, 10)                1010      
=================================================================
Total params: 577,110
Trainable params: 577,110

The model trains fine and the best accuracy I got was 93.8%, but I can't save it (I can save the weights, though).

I've read here that I need to implement get_config, but I don't know how to do it for my code, since the other examples use classes and other things I don't understand. Is there an easy way to implement it, or any resource showing how?

Also, why are the layers called ModuleWrapper instead of Dense?

Thanks


Solution

  • The ModuleWrapper layer name appears because you are mixing the standalone keras library with tensorflow.keras. Use just one of them; then your Dense layers will get their usual dense names, and you won't need to implement get_config at all.

    Change this line:

    #from keras.layers import Dense             #comment this
    from tensorflow.keras.layers import Dense   #add this
    
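    As a quick sanity check, here is a minimal sketch (assuming TF 2.x) showing that once every layer comes from tensorflow.keras, the layers report their normal dense names and model.save() round-trips without the NotImplementedError. The .h5 file path is illustrative:

    ```python
    import os, tempfile
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense  # note: tensorflow.keras, not keras

    model = Sequential([
        Dense(512, activation='relu', kernel_initializer='glorot_uniform'),
        Dense(100, activation='relu', kernel_initializer='glorot_uniform'),
        Dense(10, activation='softmax', kernel_initializer='glorot_uniform'),
    ])
    model.compile(loss='mse', optimizer='adam', metrics=['accuracy'])
    model.build(input_shape=(None, 1024))

    # Layers are now plain Dense, not ModuleWrapper
    print([layer.name for layer in model.layers])

    # Saving and reloading works without overriding get_config
    path = os.path.join(tempfile.mkdtemp(), 'autoencoder_crwu.h5')
    model.save(path)
    restored = tf.keras.models.load_model(path)

    # The restored model produces the same predictions
    x = np.random.rand(4, 1024).astype('float32')
    assert np.allclose(model.predict(x, verbose=0),
                       restored.predict(x, verbose=0), rtol=1e-5)
    ```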

    Also, note that the shapes of your dataset will cause an error, since they are incompatible with the model you have defined; you should remove the last axis from your data. Add these 2 lines before model.fit():

    train_data = tf.squeeze(train_data)
    test_data = tf.squeeze(test_data) 
    

    These lines change the shapes from (None, 1024, 1) to (None, 1024), after which you can feed the data to your model without any error.
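    A small sketch of that shape fix, using dummy data with the shapes from the question (passing axis=-1 makes squeeze drop only the trailing singleton dimension):

    ```python
    import tensorflow as tf

    # Dummy tensors with the shapes reported in the question
    train_data = tf.zeros((3680, 1024, 1))
    test_data = tf.zeros((920, 1024, 1))

    # Drop the trailing size-1 axis so the data matches a Dense input of width 1024
    train_data = tf.squeeze(train_data, axis=-1)  # (3680, 1024)
    test_data = tf.squeeze(test_data, axis=-1)    # (920, 1024)

    print(train_data.shape, test_data.shape)
    ```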