
How to squeeze or reduce dimensions of a MapDataset in TensorFlow


I have a MapDataset with the following element spec:

MapDataset element_spec=(TensorSpec(shape=(None, 1, 24), dtype=tf.float32, name=None), TensorSpec(shape=(None, 1, 24), dtype=tf.float32, name=None))

I want to build a model with the Keras functional API as below:

from tensorflow import keras

input_ = keras.layers.Input(shape=(24,))
hidden1 = keras.layers.Dense(30, activation="relu")(input_)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_, hidden2])
output = keras.layers.Dense(1)(concat)
model = keras.models.Model(inputs=[input_], outputs=[output])

I get the following error:

ValueError: Input 0 of layer "model_3" is incompatible with the layer: expected shape=(None, 24), found shape=(None, 1, 24)

How can I reduce the dimensions of the MapDataset? I tried slicing with [:, -1:, :] and calling tf.squeeze(), but neither worked.


Solution

  • You can just apply another map function to your dataset to reduce the dimensions before feeding it to your model:

    import tensorflow as tf

    samples = 50

    # Dummy data: each element is a pair of (samples, 1, 24) tensors,
    # matching the element spec in the question.
    def prepare_data(x):
      return tf.random.normal((samples, 1, 24)), tf.random.normal((samples, 1, 24))

    # Remove the singleton axis from both tensors of each element.
    def reduce_dimension(x, y):
      return tf.squeeze(x, axis=1), tf.squeeze(y, axis=1)

    dataset = tf.data.Dataset.range(samples)

    dataset = dataset.map(prepare_data)
    print('Before reducing dimension: ', dataset.element_spec)

    dataset = dataset.map(reduce_dimension)
    print('After reducing dimension: ', dataset.element_spec)
    
    Before reducing dimension:  (TensorSpec(shape=(50, 1, 24), dtype=tf.float32, name=None), TensorSpec(shape=(50, 1, 24), dtype=tf.float32, name=None))
    After reducing dimension:  (TensorSpec(shape=(50, 24), dtype=tf.float32, name=None), TensorSpec(shape=(50, 24), dtype=tf.float32, name=None))
    

    Depending on your use case, you could also simply reduce the dimensions in the first map function. Here, I am assuming that your MapDataset already exists.
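    As a sketch of that alternative (assuming, as above, a hypothetical prepare_data function that generates the data), the squeeze can be folded directly into the first map call so a second pass over the dataset is never needed:

    ```python
    import tensorflow as tf

    samples = 50

    # Hypothetical data-generating function: squeeze the singleton
    # axis immediately instead of in a separate map call.
    def prepare_data(x):
      features = tf.squeeze(tf.random.normal((samples, 1, 24)), axis=1)
      labels = tf.squeeze(tf.random.normal((samples, 1, 24)), axis=1)
      return features, labels

    dataset = tf.data.Dataset.range(samples).map(prepare_data)
    print(dataset.element_spec)
    ```

    Both variants produce the same (None, 24)-compatible element spec; the one-pass version just avoids a second dataset.map.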