Tags: deep-learning, neural-network, non-linear

Can a neural network with a non-linear activation function (say, ReLU) be used for a linear classification task?


I think the answer is yes, but I'm unable to come up with a good explanation for this.


Solution

  • Technically, yes.

    The reason you can use a non-linear activation function for this task is that you can threshold the results. Say the activation function outputs values in the range 0.0 to 1.0; you can then round up or down to get a binary 0/1 label. To be clear, rounding isn't a linear activation either, but for this specific question the network's purpose is classification, where some kind of thresholding has to be applied anyway.
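    As a minimal sketch of that idea, here is a tiny ReLU network whose weights are chosen by hand (not trained) so that it separates points by the hypothetical linear boundary x1 + x2 > 1, with a final threshold turning the continuous output into a 0/1 label:

    ```python
    import numpy as np

    # Hand-picked weights (an assumption for illustration, not a trained model):
    # hidden layer computes h = relu(x1 + x2 - 1), output just passes h through.
    W1 = np.array([[1.0, 1.0]])
    b1 = np.array([-1.0])
    W2 = np.array([[1.0]])
    b2 = np.array([0.0])

    def relu(z):
        return np.maximum(0.0, z)

    def predict(x):
        h = relu(W1 @ x + b1)       # non-linear hidden layer
        y = W2 @ h + b2             # continuous output
        # The "rounding" step: threshold the output to get a binary class.
        return int(y[0] > 0.0)

    print(predict(np.array([0.2, 0.3])))  # below the line x1 + x2 = 1 -> 0
    print(predict(np.array([0.9, 0.8])))  # above the line -> 1
    ```

    Despite the ReLU, the resulting decision boundary here is exactly the line x1 + x2 = 1, which is what makes the network usable for a linear task.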

    The reason you shouldn't is the same reason you shouldn't attach an industrial heater to a fan and call it a hair-drier: it's unnecessarily powerful, and it can waste resources and time.
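    To make the "unnecessarily powerful" point concrete, a single linear unit, with no hidden layer and no activation function at all, already solves the same hypothetical boundary from above:

    ```python
    import numpy as np

    # A plain linear classifier: one weight vector, one bias, one threshold.
    # Weights again chosen by hand for the assumed boundary x1 + x2 > 1.
    w = np.array([1.0, 1.0])
    b = -1.0

    def linear_predict(x):
        # No hidden layer, no ReLU; the threshold alone gives the class.
        return int(w @ x + b > 0.0)

    print(linear_predict(np.array([0.2, 0.3])))  # 0
    print(linear_predict(np.array([0.9, 0.8])))  # 1
    ```

    For a linearly separable task this simpler model reaches the same decision boundary with fewer parameters and no risk of the extra capacity overfitting.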

    I hope this answer helped, have a good day!