I'm trying to concatenate a position index onto the last dimension of a (None, 10, 3) tensor to make it a (None, 10, 4) tensor using a custom layer. It seems impossible, because for concatenation all dimensions except the one being merged on must be equal, and we can't initialize a tensor with None as the first dimension.
For example, the code below gives me this error:
ValueError: Shape must be rank 3 but is rank 2 for '{{node position_embedding_concat_37/concat}} = ConcatV2[N=2, T=DT_FLOAT, Tidx=DT_INT32](Placeholder, position_embedding_concat_37/concat/values_1, position_embedding_concat_37/concat/axis)' with input shapes: [?,10,3], [10,1], []
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Flatten

class PositionEmbeddingConcat(tf.keras.layers.Layer):
    def __init__(self, sequence_length, **kwargs):
        super(PositionEmbeddingConcat, self).__init__(**kwargs)
        self.positional_embeddings_array = np.arange(sequence_length).reshape(sequence_length, 1)

    def call(self, inputs):
        outp = tf.concat([inputs, self.positional_embeddings_array], axis=2)
        return outp

seq_len = 10
input_layer = Input(shape=(seq_len, 3))
embedding_layer = PositionEmbeddingConcat(sequence_length=seq_len)
embeddings = embedding_layer(input_layer)
dense_layer = Dense(units=1)
output = dense_layer(Flatten()(embeddings))
modelT = tf.keras.Model(input_layer, output)
Is there another way to do this?
You will have to make sure you respect the batch dimension. Maybe something like this:
outp = tf.concat([
    inputs,
    tf.cast(
        tf.repeat(self.positional_embeddings_array[None, ...],
                  repeats=tf.shape(inputs)[0], axis=0),
        dtype=tf.float32)
], axis=2)
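Put into a complete layer, that fix might look like this (a minimal sketch, assuming the same `(sequence_length, 1)` position column as in the question; the cast matches the positions to the input dtype rather than hard-coding float32):

```python
import numpy as np
import tensorflow as tf

class PositionEmbeddingConcat(tf.keras.layers.Layer):
    def __init__(self, sequence_length, **kwargs):
        super().__init__(**kwargs)
        # (sequence_length, 1) column of positions 0..sequence_length-1
        self.positional_embeddings_array = np.arange(sequence_length).reshape(sequence_length, 1)

    def call(self, inputs):
        # Add a leading axis, then tile to (batch, sequence_length, 1)
        # using the dynamic batch size from tf.shape.
        pos = tf.repeat(self.positional_embeddings_array[None, ...],
                        repeats=tf.shape(inputs)[0], axis=0)
        # Now both operands are rank 3 and agree on dims 0 and 1.
        return tf.concat([inputs, tf.cast(pos, inputs.dtype)], axis=2)

x = tf.random.normal((4, 10, 3))
y = PositionEmbeddingConcat(sequence_length=10)(x)
print(y.shape)  # (4, 10, 4)
```

Because the batch size is read from `tf.shape(inputs)` at call time, this works both eagerly and with a symbolic `(None, 10, 3)` Keras input.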
Also, tf.shape gives you the dynamic shape of a tensor at runtime, whereas tensor.shape only gives the static shape, where the batch dimension is still None.
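To illustrate the static/dynamic distinction (a small sketch; the function name `f` is just for demonstration):

```python
import tensorflow as tf

@tf.function(input_signature=[tf.TensorSpec([None, 10, 3])])
def f(t):
    # t.shape is (None, 10, 3) here: the batch size is statically unknown.
    # tf.shape(t) is a tensor whose values are filled in at call time.
    return tf.shape(t)[0]

batch = f(tf.zeros((5, 10, 3)))
print(int(batch))  # 5
```

This is why the answer uses `tf.shape(inputs)[0]` as the repeat count: `inputs.shape[0]` would be None inside a model built with a `(None, 10, 3)` input.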