
Why does one input have much more weight than another


I am using the simple getting-started network in PyBrain. I trained it with my own data (about 200 data points). The data has two inputs and one output.

I did a test where I kept one input the same and iterated over a range of conceivable values for the other input. The output was the same throughout the possible values of the second input. I had to go 234.17 times higher than the highest possible value to get it to change.

I also did a test where I kept the second input the same and only changed the first one. The output only changed over the first 0.12% of the possible values for the first input.
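The sweep described above can be sketched generically (a minimal sketch; `sweep_input` and the toy model are hypothetical stand-ins for `net.activate`, and the value ranges are assumptions):

```python
def sweep_input(model, fixed, index, values):
    """Vary one input over `values` while holding the others fixed;
    return the model outputs (rounded to suppress float noise)."""
    outputs = []
    for v in values:
        inputs = list(fixed)
        inputs[index] = v
        outputs.append(round(model(inputs)[0], 6))
    return outputs

# Toy linear model standing in for net.activate, just to show the shape of the test:
def toy(x):
    return [0.001 * x[0] + 10.0 * x[1]]

# Sweep the time-of-day input (seconds since midnight), temperature held fixed:
time_sweep = sweep_input(toy, [43200, 51.74], 0, range(0, 86400, 3600))
# Sweep the temperature input, time held fixed:
temp_sweep = sweep_input(toy, [43200, 51.74], 1, [t / 10.0 for t in range(-200, 400, 50)])
# With this toy model, the temperature sweep moves the output far more than the time sweep.
```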

Do I simply need to train the network with more data, or do I need to do something different with the way the network is set up?

Edit:

Here is the code that sets up the NN and returns the prediction for one value:

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(2, 3, 1)
ds = SupervisedDataSet(2, 1)
for each in train_data:
    ds.addSample((each[0], each[1]), (each[2],))  # add training data to the data set
print len(ds)
trainer = BackpropTrainer(net, ds)  # set up the trainer
print trainer.trainUntilConvergence(maxEpochs=2000, verbose=True)  # train
print net.activate([(1420884000 % 86400), 51.74])  # print prediction

One input is the time of day (in seconds since midnight). The other input is the current temperature. The output should be the predicted temperature at a given time, based on the training data.


Solution

  • I get better results with neural networks if the inputs are all scaled to be between 0 and 1, with the maximum possible value (or guaranteed largest value in that dataset) as 1 and the minimum as 0.

    This is particularly important when there are multiple inputs, as there usually are, because it lets the network start with roughly equal weight given to the different inputs. The numerically smaller input might be the more important one for the system in question, and scaling both inputs to the range 0 to 1 makes that easier for the network to learn.
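A minimal min-max scaling sketch for the two inputs in the question (the time range 0 to 86399 seconds is known in advance; the temperature range is taken from the training data; `train_data` below is a hypothetical sample in the same format as the question's):

```python
def minmax_scale(value, lo, hi):
    """Scale value from the range [lo, hi] to [0, 1]."""
    return (value - lo) / float(hi - lo)

# Hypothetical training rows: (seconds since midnight, temperature, target temperature)
train_data = [(32400, 48.2, 50.1), (43200, 55.0, 54.3), (61200, 51.74, 49.0)]

# Time of day: the range is fixed and known (0 .. 86399 seconds).
t = minmax_scale(1420884000 % 86400, 0, 86399)

# Temperature: use the min/max observed in the training data.
temps = [each[1] for each in train_data]
lo, hi = min(temps), max(temps)
temp = minmax_scale(51.74, lo, hi)

# The same scaling must be applied to every sample before ds.addSample(...)
# and to the inputs of net.activate([t, temp]).
```

Remember to apply the identical scaling (same `lo`/`hi`) to both the training samples and any inputs passed to `activate`, otherwise the network sees values outside the range it was trained on.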