Tags: neural-network, regression, pybrain

Bad regression output of neural network - an unwanted upper bound?


I am having a problem in a project that uses pybrain (a Python library for neural networks) to build an ANN and perform regression as prediction. I am using a 3-layer ANN with 14 inputs, 10 hidden neurons in the hidden layer, and 2 outputs. A typical training or test example looks like this:

Inputs (space-separated): 1534334.489 1554790.856 1566060.675 20 20 20 50 45000 -11.399025 13 1.05E-03 1.775475116 20 0

Outputs (space-separated): 1571172.296 20

I am using pybrain's BackpropTrainer, so the network is trained with backpropagation, and I trained until convergence. The odd thing about the result is that the prediction for the first output (i.e., the first output of the trained ANN on the test inputs) tracks the real value well in the lower parts of the curve, but seems to hit an unwanted upper bound as the real value rises.
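For reference, here is a minimal sketch of the setup described above using pybrain's standard APIs (buildNetwork, SupervisedDataSet, BackpropTrainer); the exact layer classes and training parameters of the project are not given in the question, so defaults are assumed, and only the one sample shown above is added:

```python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

# 14 inputs -> 10 hidden neurons -> 2 outputs (pybrain defaults: sigmoid hidden, linear output)
net = buildNetwork(14, 10, 2, bias=True)

# One sample: 14 space-separated inputs, 2 space-separated outputs
ds = SupervisedDataSet(14, 2)
ds.addSample(
    [1534334.489, 1554790.856, 1566060.675, 20, 20, 20, 50,
     45000, -11.399025, 13, 1.05e-03, 1.775475116, 20, 0],
    [1571172.296, 20],
)

trainer = BackpropTrainer(net, ds)
trainer.trainUntilConvergence()            # train until the validation error stops improving

prediction = net.activate(ds['input'][0])  # run the trained network on a sample
print(prediction)
```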

I changed the number of hidden neurons to 10, but it still behaves the same way. Even when I test the trained ANN on the original training samples, the predictions still show this upper bound.

Does anyone have an intuition or advice on what's wrong here? Thanks!


Solution

  • Try normalizing the values (both inputs and outputs) to the range (-1, +1).
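A minimal sketch of one way to do this, a per-column min-max scaling into (-1, +1) with numpy; the helper names and the commented usage are illustrative, not part of pybrain:

```python
import numpy as np

def scale_to_range(data, lo=-1.0, hi=1.0):
    """Min-max scale each column of `data` into [lo, hi].

    Also returns the per-column minima and ranges, which are needed to
    scale test data consistently and to map predictions back to the
    original units.
    """
    data = np.asarray(data, dtype=float)
    col_min = data.min(axis=0)
    col_max = data.max(axis=0)
    col_range = np.where(col_max - col_min == 0, 1.0, col_max - col_min)
    scaled = lo + (hi - lo) * (data - col_min) / col_range
    return scaled, col_min, col_range

def unscale(scaled, col_min, col_range, lo=-1.0, hi=1.0):
    """Invert scale_to_range, e.g. to turn network outputs back into real values."""
    return col_min + (np.asarray(scaled) - lo) * col_range / (hi - lo)

# Illustrative usage (X_train, Y_train, net are assumed to exist):
# X_scaled, x_min, x_rng = scale_to_range(X_train)
# Y_scaled, y_min, y_rng = scale_to_range(Y_train)
# ... build the dataset from the scaled values and train as before ...
# y_pred = unscale(net.activate(X_scaled[0]), y_min, y_rng)
```

Remember to apply the scaling factors computed on the training data to the test data as well, and to unscale the network's outputs before comparing them with the real values.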