Tags: python, keras, keras-layer

How do you create a custom activation function with Keras?


Sometimes the standard activations like ReLU, tanh, and softmax, and the advanced activations like LeakyReLU, aren't enough, and the activation you need may not be available in keras-contrib either.

How do you create your own activation function?


Solution

  • Credit to this GitHub issue comment by Ritchie Ng.

    # Creating a model
    from keras.models import Sequential
    from keras.layers import Dense
    
    # Custom activation function
    from keras.layers import Activation
    from keras import backend as K
    from keras.utils.generic_utils import get_custom_objects
    
    
    def custom_activation(x):
        # Scaled sigmoid: maps inputs into the range (-1, 4)
        return (K.sigmoid(x) * 5) - 1
    
    # Register the function under a string name so Keras can look it up,
    # e.g. when deserializing a saved model
    get_custom_objects().update({'custom_activation': Activation(custom_activation)})
    
    # Usage
    model = Sequential()
    model.add(Dense(32, input_dim=784))
    model.add(Activation(custom_activation, name='SpecialActivation'))
    print(model.summary())
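
    Once defined, the same function can also be passed directly as a layer's activation argument instead of going through a separate Activation layer. A minimal sketch building on the definitions above (the second model and its layer sizes are just illustrative):

    # Alternative usage: pass the callable straight to the layer
    model2 = Sequential()
    model2.add(Dense(32, input_dim=784, activation=custom_activation))
    model2.add(Dense(10, activation='softmax'))
    print(model2.summary())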
    

    Please keep in mind that you have to import this function when you save and restore the model; see the note in keras-contrib.
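
    For the restore step, the usual Keras pattern is to pass the function via custom_objects when loading. A minimal sketch, assuming the model above was saved to a hypothetical file my_model.h5:

    from keras.models import load_model
    
    # 'my_model.h5' is a placeholder path; custom_objects maps the name stored
    # in the saved config back to the Python function
    model = load_model('my_model.h5',
                       custom_objects={'custom_activation': custom_activation})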