I am attempting to get a slice of the input data, `source`. The input has shape `[?, 512, 6]` as it is fed into an LSTM layer; the batch size is unspecified. My neural network is non-sequential, and I need another layer that uses a subset of the original input data. Specifically, I want to obtain `source[:, :, feature]`, where `feature` is an integer variable.
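For reference, here is what that slice means on a plain NumPy array of the same shape (the batch size of 4 and the feature index of 2 are arbitrary values for illustration):

```python
import numpy as np

feature = 2                   # arbitrary feature index for illustration
data = np.zeros((4, 512, 6))  # a concrete batch standing in for [?, 512, 6]

# keep every sample and every timestep, but only one feature
sliced = data[:, :, feature]

print(sliced.shape)           # (4, 512): the feature axis is dropped
```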
I can use `raw = Reshape((512*6,))(source)` and pass `raw` into a dense layer, and the network will assemble, compile, and train. However, when compiling with a subset of the `source` array, I get the following error:

AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
I have tried a few different things. The original data type of `source` is a NumPy array. One attempt was to use:

raw = source[:,:,feature]

I believe that `tf.slice` should be able to handle this, but I get the same error when using:

raw = tf.slice(source, [0,0,feature], [-1,512,1])

I have attempted to use `type()` to determine which object is the `NoneType`, with no success. How do I obtain a working slice?
Edit: I have been able to obtain the slice and successfully add it to the model using:

raw = Lambda(lambda x: x[:,:,feature])(source)

and

raw = Lambda(lambda x: tf.slice(x, [0,0,feature], [-1,512,1]))(source)
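A minimal end-to-end sketch of the first form (using the `tensorflow.keras` namespace rather than standalone `keras`; the `feature` index and the `Dense(1)` head are arbitrary placeholders for illustration):

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense, Input, Lambda
from tensorflow.keras.models import Model

feature = 2  # arbitrary feature index for illustration

source = Input(shape=(512, 6))
# Wrapping the slice in a Lambda layer makes it a tracked Keras layer,
# which avoids the '_inbound_nodes' AttributeError
raw = Lambda(lambda x: x[:, :, feature])(source)  # shape (None, 512)
out = Dense(1)(raw)

model = Model(inputs=source, outputs=out)
model.compile(optimizer="adam", loss="mse")
```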
While this works and the model compiles, I get the following error upon attempting to save:

TypeError: Not JSON Serializable
I have found some links discussing this issue; they state that the problem is that one of the layers is not a Keras layer, and their solutions involve using a Lambda layer to resolve it. I believe I have isolated the problem to the Lambda layer itself, which as far as I can tell is a Keras layer. It is imported using:
from keras.layers import Lambda
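One thing worth checking, as an assumption on my part: the serialization error may come not from the Lambda layer itself but from a non-native value captured in its closure. Python's `json` module cannot encode NumPy integers, so if `feature` was pulled out of a NumPy array as `np.int64` rather than a plain `int`, casting it before building the model may help:

```python
import json

import numpy as np

feature = np.int64(2)  # e.g. an index extracted from a NumPy array

# A NumPy integer is rejected by the json module...
try:
    json.dumps({"feature": feature})
    serializable = True
except TypeError:
    serializable = False

# ...but the same value as a plain Python int serializes fine
encoded = json.dumps({"feature": int(feature)})
```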
I have found a solution through trial and error, though I do not currently know why it works. Let `source` be the input array to the network, with shape `[?, width, height]`, and let the individual `feature` index of interest lie between `0` and `height - 1`. First we need to flatten `source` so that each sample has one dimension. I believe the key to solving the problem is the `Flatten` layer: it must somehow make the nature of `flat` different from `source`, even though `source` and `flat` have the same type (`<class 'tensorflow.python.framework.ops.Tensor'>`). Now we can use a `Lambda` layer to select the feature we want based on its new position within `flat`.
flat = Flatten()(source)
# after the row-major flatten, one feature's values sit `height` apart
subsection = Lambda(lambda x: x[:, feature::height])(flat)
The new tensor `subsection` can now be fed into a subsequent layer, and the network will both compile and save correctly.
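The index arithmetic can be checked with plain NumPy, since `Flatten` on a `[?, width, height]` tensor amounts to a row-major reshape: the values of one last-axis feature end up spaced `height` apart in the flattened vector (the concrete sizes below are arbitrary):

```python
import numpy as np

width, height, feature = 512, 6, 2  # arbitrary sizes for illustration
x = np.arange(4 * width * height, dtype=float).reshape(4, width, height)

flat = x.reshape(4, width * height)      # row-major, like Keras's Flatten
subsection = flat[:, feature::height]    # every height-th value, from offset feature

print(np.array_equal(subsection, x[:, :, feature]))  # True
```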