ValueError: Layer lambda_47 was called with an input that isn't a symbolic tensor. Received type: <class 'tuple'>. Full input: [(<tf.Tensor 'lambda_45/Slice:0' shape=(110000, 1, 128) dtype=float32>, <tf.Tensor 'lambda_46/Slice:0' shape=(110000, 1, 128) dtype=float32>)]. All inputs to the layer should be tensors
I have been trying to use TensorFlow operations inside a Keras model definition. I am having a problem creating a transformation layer that still allows for weight updates. I have read that Keras' Lambda layer is the key to doing this, but I ran into the error above.
Here is my code:
import tensorflow as tf
from keras.layers import Input, Lambda, Concatenate, Conv2D, Activation, Dropout, Flatten, Dense
from keras.models import Model
from keras.optimizers import Adam

### CONTROL VARIABLES (i.e. user input parameters)
dropout_rate = 0.5
batch_size = 128
nb_epochs = 40
#with tf.device('/gpu:0'):
### MODEL CREATION
X_input = Input(shape=input_shape, name='input_1')
# Input
X_i = Lambda(lambda x: tf.slice(x, [0,0,0], [110000,1,128]))(X_input) # Slicing out inphase column
X_q = Lambda(lambda x: tf.slice(x, [0,1,0], [110000,1,128]))(X_input) # Slicing out quadrature column
X_mag = Lambda(lambda x_i, x_q: tf.math.sqrt(tf.math.add(tf.math.square(x_i), tf.math.square(x_q))))((X_i, X_q)) # Acquiring magnitude of IQ
## THE SOURCE OF THE ERROR IS THE LINE ABOVE ^
## IT'S USING TENSORFLOW OPERATORS TO FIND THE ABSOLUTE VALUE (MAGNITUDE)
X_phase = Lambda(lambda x_i, x_q: tf.math.atan2(x_i, x_q))((X_i, X_q)) # Acquiring phase of IQ
X = Concatenate(axis=1)([X_mag, X_phase]) # Combining into two column (magnitude,phase) tensor
X = Conv2D(128, kernel_size=(2,8), padding='same',data_format='channels_last')(X)
X = Activation('relu')(X)
X = Dropout(dropout_rate)(X)
X = Conv2D(64, kernel_size=(1,8), padding='same',data_format='channels_last')(X)
X = Activation('relu')(X)
X = Dropout(dropout_rate)(X)
X = Flatten()(X)
X = Dense(128, kernel_initializer='he_normal', activation='relu')(X)
X = Dropout(dropout_rate)(X)
X = Dense(len(classes), kernel_initializer='he_normal')(X)
X = Activation('softmax', name = 'labels')(X)
model = Model(inputs = X_input, outputs = X)
model.summary()
model.compile(optimizer=Adam(learning_rate), loss='categorical_crossentropy', metrics =['accuracy'])
Full stack trace:
The shape of x is (220000, 2, 128)
(110000, 2, 128) [2, 128]
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py in assert_input_compatibility(self, inputs)
278 try:
--> 279 K.is_keras_tensor(x)
280 except ValueError:
/usr/local/lib/python3.6/dist-packages/keras/backend/tensorflow_backend.py in is_keras_tensor(x)
473 raise ValueError('Unexpectedly found an instance of type `' +
--> 474 str(type(x)) + '`. '
475 'Expected a symbolic tensor instance.')
ValueError: Unexpectedly found an instance of type `<class 'tuple'>`. Expected a symbolic tensor instance.
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
<ipython-input-22-dba00eef4193> in <module>()
108 X_i = Lambda(lambda x: tf.slice(x, [0,0,0], [110000,1,128]))(X_input) # Slicing out inphase column
109 X_q = Lambda(lambda x: tf.slice(x, [0,1,0], [110000,1,128]))(X_input) # Slicing out quadrature column
--> 110 X_mag = Lambda(lambda x_i, x_q: tf.math.sqrt(tf.math.add(tf.math.square(x_i), tf.math.square(x_q))))((X_i, X_q)) # Acquiring magnitude of IQ
111 X_phase = Lambda(lambda x_i, x_q: tf.math.atan2(x_i, x_q))((X_i, X_q)) # Acquiring phase of IQ
112 X = Concatenate(axis=1)([X_mag, X_phase]) # Combining into two column (magnitude,phase) tensor
/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py in __call__(self, inputs, **kwargs)
412 # Raise exceptions in case the input is not compatible
413 # with the input_spec specified in the layer constructor.
--> 414 self.assert_input_compatibility(inputs)
415
416 # Collect input shapes to build layer.
/usr/local/lib/python3.6/dist-packages/keras/engine/base_layer.py in assert_input_compatibility(self, inputs)
283 'Received type: ' +
284 str(type(x)) + '. Full input: ' +
--> 285 str(inputs) + '. All inputs to the layer '
286 'should be tensors.')
287
ValueError: Layer lambda_47 was called with an input that isn't a symbolic tensor. Received type: <class 'tuple'>. Full input: [(<tf.Tensor 'lambda_45/Slice:0' shape=(110000, 1, 128) dtype=float32>, <tf.Tensor 'lambda_46/Slice:0' shape=(110000, 1, 128) dtype=float32>)]. All inputs to the layer should be tensors.
So the error occurs on the "X_mag = Lambda" line. I have searched all the related Stack Overflow posts, and none of them seem to account for the embedded use of TensorFlow operations here. Please help me resolve this issue!
It's really stumped me over the past two days.
You can't pass tuples to layers as their input; you should use lists instead. As a result, the lambda function in a Lambda layer accepts only one input argument, i.e. the list itself, and you access its elements by index:
X_mag = Lambda(lambda x: tf.math.sqrt(
    tf.math.add(tf.math.square(x[0]), tf.math.square(x[1]))))([X_i, X_q])  # Acquiring magnitude of IQ
X_phase = Lambda(lambda x: tf.math.atan2(x[0], x[1]))([X_i, X_q])  # Acquiring phase of IQ
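To see the corrected Lambda calls in context, here is a minimal, self-contained sketch (the (2, 128) input shape and the index-based slicing are illustrative assumptions, not taken from your model). Slicing with x[:, 0:1, :] instead of tf.slice with a hard-coded batch size of 110000 keeps the layers batch-size agnostic, and each merging Lambda receives the list [X_i, X_q] as its single argument:
import tensorflow as tf
from keras.layers import Input, Lambda, Concatenate
from keras.models import Model

# Illustrative input: 2 rows (in-phase and quadrature), 128 samples each
X_input = Input(shape=(2, 128))

# Slice by index rather than tf.slice with a hard-coded batch size,
# so the graph works for any batch size
X_i = Lambda(lambda x: x[:, 0:1, :])(X_input)  # in-phase row,   shape (None, 1, 128)
X_q = Lambda(lambda x: x[:, 1:2, :])(X_input)  # quadrature row, shape (None, 1, 128)

# Each Lambda receives the list [X_i, X_q] as a single argument x
X_mag = Lambda(lambda x: tf.math.sqrt(tf.math.square(x[0]) + tf.math.square(x[1])))([X_i, X_q])
X_phase = Lambda(lambda x: tf.math.atan2(x[0], x[1]))([X_i, X_q])

X = Concatenate(axis=1)([X_mag, X_phase])  # (None, 2, 128) magnitude/phase tensor
model = Model(inputs=X_input, outputs=X)
model.summary()
The same list-and-index pattern applies to any Lambda layer that needs more than one input tensor.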