Tags: neural-network, deep-learning, azure-machine-learning-service

Can't approximate simple multiplication function in neural network with 1 hidden layer


I just wanted to test how well a neural network can approximate a multiplication function (a regression task). I am using Azure Machine Learning Studio. I have 6500 samples, 1 hidden layer (I tested 5 / 30 / 100 neurons per hidden layer), and no normalization, with the default parameters: learning rate 0.005, number of learning iterations 200, initial learning weight 0.1, momentum 0 [description]. I got extremely bad accuracy, close to 0. At the same time, boosted decision forest regression shows a very good approximation.
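A rough stand-alone reproduction of this setup (sketched with scikit-learn's MLPRegressor, since Azure ML Studio itself is configured through the UI rather than code, so this is only an approximation of the same settings) would look something like this:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# 6500 samples of two positive factors; the target is their product, no normalization
X = rng.uniform(1, 100, size=(6500, 2))
y = X[:, 0] * X[:, 1]

# Roughly mirror the settings from the question: one hidden layer,
# SGD with learning rate 0.005, 200 iterations, zero momentum
net = MLPRegressor(hidden_layer_sizes=(100,), solver="sgd",
                   learning_rate_init=0.005, max_iter=200,
                   momentum=0.0, random_state=0)
net.fit(X, y)

# With raw, unnormalized targets in the thousands, the training loss
# typically stays enormous or blows up entirely
print("final training loss:", net.loss_)
```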

What am I doing wrong? This task should be very easy for a NN.


Solution

  • The large gradients produced by the unnormalized multiplication targets most likely force the net, almost immediately, into a state where all of its hidden nodes have zero gradient. We can use two approaches:

    1) Divide by a constant. We simply divide everything by a constant before training and multiply the predictions back afterwards (see the first sketch after this list).

    2) Use log-normalization. It turns the multiplication into an addition (see the second sketch after this list):

    m = x*y => ln(m) = ln(x) + ln(y).
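For illustration, here is a minimal sketch of the first approach, again using scikit-learn's MLPRegressor as a stand-in for the Azure ML Studio module: divide the inputs and the target by constants so everything lands in roughly [0, 1] before training, then multiply the predictions back.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(1, 100, size=(6500, 2))
y = X[:, 0] * X[:, 1]

# Approach 1: divide by constants before training ...
X_SCALE, Y_SCALE = 100.0, 10_000.0
net = MLPRegressor(hidden_layer_sizes=(100,), max_iter=2000, random_state=0)
net.fit(X / X_SCALE, y / Y_SCALE)

# ... and multiply the predictions back afterwards
pred = net.predict(X / X_SCALE) * Y_SCALE
print("R^2 with divide-by-constant scaling:", r2_score(y, pred))
```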
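And a sketch of the second approach: train on the logarithms of the inputs and the target, so the network only has to learn ln(m) = ln(x) + ln(y), then exponentiate its predictions. This requires strictly positive factors, and the same scikit-learn assumption applies.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(1, 100, size=(6500, 2))
y = X[:, 0] * X[:, 1]

# Approach 2: log-normalization turns the product into a sum,
# which even a small hidden layer can fit easily
net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
net.fit(np.log(X), np.log(y))

# Undo the log transform on the predictions
pred = np.exp(net.predict(np.log(X)))
print("R^2 with log-normalization:", r2_score(y, pred))
```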