
Neural network: fit a function


Can you approximate a function (other than a line, but still in the x-y plane: cos, sin, exp, etc.) using a neural network with just an input, an output, and a single layer of hidden neurons?


Solution

  • Yes, you can! That is essentially what the Universal Approximation Theorem says: a feed-forward network with a single hidden layer can approximate any continuous function (on a compact subset of its domain) to arbitrary accuracy. However, it says nothing about the number of neurons required in that layer (which can be very large) or about our ability to algorithmically optimize the weights of such a network. All it says is that such a network exists.

    Here is the link to the original publication by Cybenko, who used the sigmoid activation function in the proof: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.441.7873&rep=rep1&type=pdf

    And here is a more accessible derivation: http://mcneela.github.io/machine_learning/2017/03/21/Universal-Approximation-Theorem.html
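
    To make this concrete, here is a minimal sketch (my own illustration, not from Cybenko's paper) that fits cos(x) on [-π, π] with a single hidden layer of tanh units, trained by plain full-batch gradient descent on mean-squared error. The width (30 units), learning rate, and epoch count are arbitrary choices for this toy problem:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)  # inputs
    y = np.cos(X)                                       # target function

    H = 30                          # hidden-layer width (chosen arbitrarily)
    W1 = rng.normal(0, 1.0, (1, H)) # input -> hidden weights
    b1 = np.zeros(H)
    W2 = rng.normal(0, 0.1, (H, 1)) # hidden -> output weights
    b2 = np.zeros(1)
    lr = 0.05
    n = len(X)

    for epoch in range(5000):
        # forward pass
        h = np.tanh(X @ W1 + b1)    # hidden activations, shape (n, H)
        pred = h @ W2 + b2          # network output, shape (n, 1)
        err = pred - y

        # backward pass: gradients of mean((pred - y)^2)
        gW2 = h.T @ err * (2 / n)
        gb2 = err.sum(0) * (2 / n)
        gh = err @ W2.T * (2 / n)   # gradient w.r.t. hidden activations
        gz = gh * (1 - h ** 2)      # tanh'(z) = 1 - tanh(z)^2
        gW1 = X.T @ gz
        gb1 = gz.sum(0)

        # gradient-descent update
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

    mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
    print(f"final MSE: {mse:.4f}")
    ```

    Exactly as the theorem suggests, the single hidden layer is enough to fit the curve; widening `H` or training longer tightens the fit, but the theorem gives no recipe for choosing either.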