
Keras - passing different parameter for different data point onto Lambda Layer


I am working on a CNN model in Keras with a TensorFlow backend. At the end of the final convolutional layer, I need to pool the output maps from the filters. Instead of using GlobalAveragePooling or any other standard pooling layer, I have to pool according to time frames, which lie along the width of the output map.

So suppose a sample output from one filter is n x m, with n time frames and m features. I only need to pool the output from frames n1 to n2, where n1 and n2 <= n. My output slice is therefore (n2 - n1) x m, on which I will apply pooling. I came across Keras's Lambda layer to do this, but I am stuck at the point where n1 and n2 differ for each data point. So my question is: how can I pass a custom argument for each data point to a Lambda layer? Or am I approaching this the wrong way?

A sample snippet:

import tensorflow as tf

# for slicing a tensor along the time (width) axis
def time_based_slicing(x, crop_at):
    dim = x.get_shape()
    len_ = crop_at[1] - crop_at[0]
    # keep frames crop_at[0]..crop_at[1], all features and channels
    return tf.slice(x, [0, crop_at[0], 0, 0], [1, len_, dim[2], dim[3]])

# for output shape
def return_out_shape(input_shape):
    return tuple([input_shape[0], None, input_shape[2], input_shape[3]])

# lambda layer addition
model.add(Lambda(time_based_slicing, output_shape=return_out_shape, arguments={'crop_at': (2, 5)}))

The argument crop_at above needs to be custom for each data point when fitting in a loop. Any pointers/clues on this would be helpful.


Solution

  • Since you know beforehand which time-frame indices belong to each data point, you can store them (for example in a text file) and pass them as an additional Input to your model:

    slice_input = Input((2,))
    

    Then use those two values inside your time_based_slicing function instead of the fixed crop_at argument.
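
    A minimal sketch of that idea, assuming tf.keras (TF 2.x); the time_based_pooling helper and the (10, 8, 4) feature-map shape are illustrative, not from the answer. Because the slice bounds differ per sample, the slicing and the pooling are fused into one function that is mapped over the batch with tf.map_fn, so each sample yields a fixed-shape pooled output:

    ```python
    import tensorflow as tf
    from tensorflow.keras.layers import Input, Lambda
    from tensorflow.keras.models import Model

    def time_based_pooling(inputs):
        # feature_maps: (batch, time, features, channels)
        # crop_at:      (batch, 2) holding (n1, n2) per sample
        feature_maps, crop_at = inputs

        def pool_one(args):
            fmap, bounds = args
            start = tf.cast(bounds[0], tf.int32)
            end = tf.cast(bounds[1], tf.int32)
            # slice the requested time frames, then average-pool over time
            return tf.reduce_mean(fmap[start:end], axis=0)

        return tf.map_fn(pool_one, (feature_maps, crop_at), dtype=tf.float32)

    # illustrative shapes: 10 time frames, 8 features, 4 channels
    feature_input = Input((10, 8, 4))
    slice_input = Input((2,))  # (n1, n2) per data point
    pooled = Lambda(time_based_pooling)([feature_input, slice_input])
    model = Model(inputs=[feature_input, slice_input], outputs=pooled)
    ```

    Fusing the pooling into the same Lambda also sidesteps the output_shape problem in the question: the pooled result has a fixed shape regardless of n2 - n1, so no custom output_shape function is needed.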