Tags: machine-learning, keras, deep-learning, neural-network, conv-neural-network

How do I define/change the accuracy for a non-classification convolutional neural network?


I'm using Keras to build a prediction model. It takes two time series as input and outputs a number between 0 and 1. Currently I'm getting very low accuracy, because a prediction only counts as "correct" if it matches the true value exactly: if the correct number is 0.34, a prediction of 0.35 is considered incorrect. I want all predictions within some range of the true value, for example within 0.05, to count as correct. Rounding might be another option, but the model outputs numbers with 6 decimal places.

  1. How can I consider all numbers within a range to be "correct" for the accuracy?
  2. How can I round the output of the CNN?

Here is my CNN code:

import tensorflow as tf

def networkModel():
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(filters=16, kernel_size=(2, 2), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(filters=9, kernel_size=(2, 2), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D(pool_size=(2, 2)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation='relu'),
        tf.keras.layers.Dense(1, activation='sigmoid')
    ])

    model.compile(optimizer='adam',
                  loss=tf.keras.losses.BinaryCrossentropy(),
                  metrics=['accuracy'])

    return model

Solution

  • For this specific case, you can define a custom accuracy function and pass it to the model as a metric, or compute the same quantity in a Keras Callback at the end of each epoch.

    Custom Accuracy Metric:

    import keras.backend as K
    
    def custom_accuracy(y_true, y_pred, tolerance=0.05):
        absolute_difference = K.abs(y_true - y_pred)
        correct_predictions = K.cast(absolute_difference <= tolerance, dtype='float32')
        return K.mean(correct_predictions)
    
    model.compile(optimizer='adam', loss='mse', metrics=[custom_accuracy])
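    To sanity-check the tolerance logic outside of training, the same computation can be expressed in plain NumPy. The values below are illustrative, chosen so that two of the three predictions fall within the 0.05 tolerance:

    ```python
    import numpy as np

    # Hypothetical true values and model predictions.
    y_true = np.array([0.34, 0.40, 0.90])
    y_pred = np.array([0.35, 0.50, 0.88])

    # A prediction counts as correct if it lies within `tolerance` of the truth.
    tolerance = 0.05
    accuracy = np.mean(np.abs(y_true - y_pred) <= tolerance)
    print(accuracy)  # 2 of 3 predictions within tolerance -> ~0.6667
    ```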
    

    Custom Callback:

    from keras.callbacks import Callback
    import numpy as np
    
    class CustomAccuracyCallback(Callback):
        def __init__(self, validation_data, tolerance=0.05):
            super(CustomAccuracyCallback, self).__init__()
            self.validation_data = validation_data
            self.tolerance = tolerance
    
        def on_epoch_end(self, epoch, logs=None):
            logs = logs if logs is not None else {}
            x_val, y_val = self.validation_data
            # Flatten both arrays so a (n, 1) prediction doesn't broadcast
            # against a (n,) target into an (n, n) matrix.
            y_pred = self.model.predict(x_val, verbose=0).reshape(-1)
            accuracy = np.mean(np.abs(y_val.reshape(-1) - y_pred) <= self.tolerance)
            print(f"\nEpoch {epoch + 1}: Custom Accuracy: {accuracy:.4f}")
            logs['custom_accuracy'] = accuracy
    
    custom_callback = CustomAccuracyCallback((x_val, y_val))
    model.fit(x_train, y_train, validation_data=(x_val, y_val), callbacks=[custom_callback])
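  • For the second question, rounding is best done after inference rather than inside the model, since a rounding layer would make the loss non-differentiable. A minimal sketch with NumPy, using hypothetical raw outputs:

    ```python
    import numpy as np

    # Hypothetical raw model outputs with many decimal places,
    # e.g. preds = model.predict(x_test).
    preds = np.array([[0.348912], [0.702345]])

    # Round to 2 decimal places post hoc.
    rounded = np.round(preds, 2)
    print(rounded)  # [[0.35], [0.7]]
    ```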