python · keras · neural-network · lstm

l1 regularizer in Keras gives error "'Tensor' object is not callable"


I'm trying to apply l1 regularization to a neural network. When it is applied to the first layer there is no problem, but when it is applied to the second layer it raises a TypeError. If I then compile the NN again, the error is raised in the first layer instead.

The code is as follows:

from keras.regularizers import l1
from keras.layers import LSTM,Input
from keras.models import Model

inputs = Input(shape=(window+1,features))

l1 = LSTM(512,return_sequences=True,kernel_regularizer=l1(0.001),recurrent_regularizer=l1(0.001),
        bias_regularizer=l1(0.001),recurrent_activation="sigmoid",activation="relu")(inputs)

l3 = LSTM(512,return_sequences=False,kernel_regularizer=l1(0.001),recurrent_regularizer=l1(0.001), 
        bias_regularizer=l1(0.001),recurrent_activation="sigmoid",activation="relu")(l1)

outputs = LSTM(1,return_sequences=False,kernel_regularizer=l1(0.001),recurrent_regularizer=l1(0.001),
        bias_regularizer=l1(0.001),recurrent_activation="sigmoid",activation="relu")(l3)

model = Model(inputs=inputs,outputs=outputs)
model.compile(loss="mse",optimizer="adam")

TypeError: 'Tensor' object is not callable

Right after the l1 regularizer is imported, type(l1) tells me it is a function, but after the error is raised type(l1) tells me it is tensorflow.python.framework.ops.Tensor.

Why is this, and how can I solve it? Or did I miss something when applying the l1 regularizer? With the l2 regularizer I can do this without any problem.


Solution

  • You assigned the output of the first LSTM layer to a variable named l1, which shadows the l1 regularizer function you imported (also named l1). The second time l1(0.001) is evaluated, l1 is now a Tensor rather than a callable, which is why the TypeError appears, and why it moves to the first layer once the shadowed name persists across a recompile.
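
One way to avoid the clash is shown in the sketch below; the window and features values and the l1_reg alias are illustrative placeholders, not from the original post. The key point is that no layer output reuses the regularizer's name.

from keras.regularizers import l1 as l1_reg   # alias so no layer variable can shadow it
from keras.layers import LSTM, Input
from keras.models import Model

window, features = 10, 4   # placeholder shapes for illustration only

inputs = Input(shape=(window + 1, features))

# Layer outputs get their own names (x1, x2), so the regularizer stays callable.
x1 = LSTM(512, return_sequences=True,
          kernel_regularizer=l1_reg(0.001), recurrent_regularizer=l1_reg(0.001),
          bias_regularizer=l1_reg(0.001),
          recurrent_activation="sigmoid", activation="relu")(inputs)

# return_sequences is kept True here only so the final LSTM receives 3-D input;
# that is a separate issue from the name clash.
x2 = LSTM(512, return_sequences=True,
          kernel_regularizer=l1_reg(0.001), recurrent_regularizer=l1_reg(0.001),
          bias_regularizer=l1_reg(0.001),
          recurrent_activation="sigmoid", activation="relu")(x1)

outputs = LSTM(1, return_sequences=False,
               kernel_regularizer=l1_reg(0.001), recurrent_regularizer=l1_reg(0.001),
               bias_regularizer=l1_reg(0.001),
               recurrent_activation="sigmoid", activation="relu")(x2)

model = Model(inputs=inputs, outputs=outputs)
model.compile(loss="mse", optimizer="adam")

Alternatively, keep the import as l1 and simply rename the layer outputs (e.g. x1, x3); either way the imported regularizer is never overwritten by a layer output tensor.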