Tags: tensorflow, tensorflow2.0, keras-layer, tf.keras, tensorflow2.x

Is it possible to initialize the layers in a neural network first and add the activations later?


For example, I have a sequential model with three layers.

import tensorflow as tf
from tensorflow.keras.layers import Dense

model = tf.keras.Sequential()

The snippet below is the usual way I add layers to the model and apply activations:

model.add(Dense(10, input_dim=3, activation=tf.nn.tanh))
model.add(Dense(10, activation=tf.nn.tanh))
model.add(Dense(4))

Is it possible to apply the activation function after each layer is added? Something like this:

model.add(Dense(10, input_dim=3))
model.add(activation=tf.nn.tanh)

model.add(Dense(10))
model.add(activation=tf.nn.sigmoid)

model.add(Dense(4))

Any help would be appreciated!


Solution

  • This is exactly why Keras provides the Activation layer:

    from tensorflow.keras.layers import Activation, Dense

    model.add(Dense(10, input_dim=3))
    model.add(Activation("tanh"))

    model.add(Dense(10))
    model.add(Activation("sigmoid"))

    model.add(Dense(4))
    

    EDIT


    If you want to use a custom activation, you can use one of three methods.

    Assume you are redefining sigmoid:

    def my_sigmoid(x):
        return 1 / (1 + tf.math.exp(-x))
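
    As a quick sanity check (a minimal sketch, assuming tf is TensorFlow 2.x as imported above), you can verify the redefinition against the built-in sigmoid:

    x = tf.constant([-2.0, 0.0, 3.0])
    # should agree element-wise with the built-in implementation
    tf.debugging.assert_near(my_sigmoid(x), tf.math.sigmoid(x))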
    
    1. Use an Activation layer:

      model.add(Activation(my_sigmoid))
      
    2. Use a Lambda layer:

      model.add(Lambda(lambda x: 1 / (1 + tf.math.exp(-x))))
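
      Equivalently, Lambda can wrap the named function defined above, which keeps the model definition readable:

      model.add(Lambda(my_sigmoid))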
      
    3. Define a custom Layer:

      from tensorflow.keras.layers import Layer

      class MySigmoid(Layer):

          def __init__(self, *args, **kwargs):
              super().__init__(*args, **kwargs)

          def call(self, inputs, **kwargs):
              # element-wise sigmoid applied to the layer input
              return 1 / (1 + tf.math.exp(-inputs))

      model.add(MySigmoid())
      

    Method 3 is especially useful for parametric activations, like PReLU (a sketch follows below).

    Method 2 is a quick fix for testing, but personally, I like to avoid it.

    Method 1 is the way to go for simple functions.
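
    To illustrate method 3, here is a minimal sketch of a PReLU-style layer with one trainable slope per feature. The class name MyPReLU, the zeros initializer, and the per-feature weight shape are assumptions for illustration, not the exact tf.keras.layers.PReLU implementation; it assumes Layer and Dense are imported as above:

    class MyPReLU(Layer):

        def build(self, input_shape):
            # one trainable slope per input feature (assumed zeros init)
            self.alpha = self.add_weight(
                name="alpha",
                shape=input_shape[-1:],
                initializer="zeros",
                trainable=True,
            )

        def call(self, inputs, **kwargs):
            # PReLU: max(0, x) + alpha * min(0, x)
            return tf.nn.relu(inputs) + self.alpha * tf.minimum(inputs, 0.0)

    model.add(Dense(10))
    model.add(MyPReLU())

    Because alpha is a layer weight, it is updated by backpropagation along with the rest of the model, which a plain Activation or Lambda layer cannot do.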