Tags: python, tensorflow, machine-learning, keras

How to rescale every sample individually in a pre-processing layer?


I want to add a pre-processing layer to a Keras model that applies both during training and when the model is used for inference, so that raw data can be fed to it directly. This layer should rescale every sample individually to a [-1, 1] range. This is possible with sklearn, but I would like to keep my model dependent only on TensorFlow.

In the Keras documentation I found a normalization layer, tf.keras.layers.Normalization, but as far as I can tell it standardizes features using statistics computed over the entire dataset (via adapt()), not per sample.

In sklearn, you could use the MinMaxScaler, which scales each feature individually (see its documentation). How can I achieve similar behavior inside my Keras model?
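To make the distinction concrete, here is a small NumPy sketch (not from the original post; the array is illustrative) contrasting MinMaxScaler-style per-feature scaling with the per-sample scaling this question asks for:

```python
import numpy as np

X = np.array([[0.0, 10.0],
              [5.0, 20.0]])

# Per-feature scaling to [-1, 1] (what sklearn's MinMaxScaler does):
# min/max are taken down each column, across all samples.
col_min, col_max = X.min(axis=0), X.max(axis=0)
per_feature = (X - col_min) / (col_max - col_min) * 2 - 1

# Per-sample scaling to [-1, 1] (what the question asks for):
# min/max are taken along each row, independently of the other samples.
row_min = X.min(axis=1, keepdims=True)
row_max = X.max(axis=1, keepdims=True)
per_sample = (X - row_min) / (row_max - row_min) * 2 - 1
```

The two results differ: per-feature scaling maps each column's extremes to -1 and 1, while per-sample scaling maps each row's extremes to -1 and 1.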


Solution

  • I could not find a built-in TensorFlow function that rescales every sample individually, so in the end I resorted to creating a custom layer that takes care of it:

    import tensorflow as tf
    from tensorflow.keras import layers
    
    class MinMaxScalingLayer(layers.Layer):
        def __init__(self, feature_range=(-1, 1), **kwargs):
            super().__init__(**kwargs)
            self.feature_range = feature_range
    
        def call(self, inputs):
            # Per-sample min and max; axis=1 assumes (batch, features) input
            min_val = tf.reduce_min(inputs, axis=1, keepdims=True)
            max_val = tf.reduce_max(inputs, axis=1, keepdims=True)
            # Guard against division by zero for constant samples
            span = tf.maximum(max_val - min_val, tf.keras.backend.epsilon())
            lo, hi = self.feature_range
            return (inputs - min_val) / span * (hi - lo) + lo
    
        def get_config(self):
            # Needed so the layer survives model saving and loading
            config = super().get_config()
            config.update({"feature_range": self.feature_range})
            return config
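A minimal usage sketch, placing the layer at the front of a model so raw data can be fed in at both training and inference time. The class definition is repeated in simplified form so the snippet runs standalone; the model architecture and input data are illustrative assumptions:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

class MinMaxScalingLayer(layers.Layer):
    def __init__(self, feature_range=(-1, 1), **kwargs):
        super().__init__(**kwargs)
        self.feature_range = feature_range

    def call(self, inputs):
        # Per-sample min/max over the feature axis
        min_val = tf.reduce_min(inputs, axis=1, keepdims=True)
        max_val = tf.reduce_max(inputs, axis=1, keepdims=True)
        lo, hi = self.feature_range
        return (inputs - min_val) / (max_val - min_val) * (hi - lo) + lo

# The scaling layer sits at the front, so unscaled raw data
# can be passed straight to the model.
model = tf.keras.Sequential([
    layers.Input(shape=(3,)),
    MinMaxScalingLayer(),
    layers.Dense(1),
])

raw = np.array([[0.0, 5.0, 10.0]], dtype=np.float32)
scaled = MinMaxScalingLayer()(raw)  # each row mapped onto [-1, 1]
```

Each sample's own minimum maps to -1 and its maximum to 1, so [0, 5, 10] becomes [-1, 0, 1] regardless of the other rows in the batch.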