
PyTorch: how to use a linear activation function


In Keras, I can create any network layer with a linear activation function as follows (a fully-connected layer, for example):

model.add(keras.layers.Dense(outs, input_shape=(160,), activation='linear'))

But I can't find a linear activation function in the PyTorch documentation. ReLU is not suitable because there are negative values in my samples. How do I create a layer with a linear activation function in PyTorch?


Solution

  • If you take a look at the Keras documentation, you will see that tf.keras.layers.Dense's activation='linear' corresponds to the identity function a(x) = x, which means no non-linearity is applied.

    So in PyTorch, you simply define the linear layer without adding any activation on top:

    torch.nn.Linear(160, outs)
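
    For completeness, here is a minimal, self-contained sketch (the output size outs, the batch size, and the random input are placeholder assumptions) showing that torch.nn.Linear applies only the affine transform y = xWᵀ + b, with no activation on top:

    import torch

    outs = 10                           # placeholder output size
    layer = torch.nn.Linear(160, outs)  # affine layer, no activation

    x = torch.randn(8, 160)             # a batch of 8 samples with 160 features
    y = layer(x)                        # y = x @ W.T + b; negative values pass through unchanged
    print(y.shape)                      # torch.Size([8, 10])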