Currently I'm attempting to create a three-layer neural network. When I began training it on XOR, this thought crossed my mind:
double NewWeight(double oldWeight) {
    // MeanSquaredError, input and learningRate are defined elsewhere in my class
    return oldWeight + (MeanSquaredError * input * learningRate);
}
This is the formula for a new weight according to http://natureofcode.com/book/chapter-10-neural-networks/
First, if I have an input of zero, then regardless of the error the weight will remain the same. Is this solved by using a bias?
Secondly, neurons often have more than one input (such as the two inputs in XOR). In that case, would you need to add the inputs together? Or perhaps average the new weights you would get from the separate inputs?
If you suggest I use a different weight-update function, please don't post an equation without explaining the symbols in it. Thanks!
First, the bias does not change anything here. A bias is usually realised as an additional input with a constant value of 1, whose weight acts as the bias. See https://en.wikipedia.org/wiki/Perceptron#Definitions.
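For example, a minimal sketch with made-up numbers (not your actual code): the bias is simply one more entry in the input and weight arrays, with that input fixed at 1.

double[] inputs  = {0.0, 0.0, 1.0};   // two data inputs plus a constant 1 for the bias
double[] weights = {0.3, -0.2, 0.5};  // weights[2] effectively acts as the bias

double sum = 0.0;
for (int i = 0; i < inputs.length; i++) {
    sum += inputs[i] * weights[i];
}
int output = (sum > 0) ? 1 : -1;      // step activation, as in the Wikipedia article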
Second, you calculate a new weight for each edge in your network. So, if a neuron has two inputs, you calculate a separate new weight for each of them, each one using its own input value.
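A rough sketch of what that looks like (assuming error, inputs and learningRate exist as in your snippet; the names are only illustrative):

for (int i = 0; i < weights.length; i++) {
    // each edge gets its own update, using its own input value
    weights[i] = weights[i] + error * inputs[i] * learningRate;
}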
I would say that if you have 0 as input, you have no information, and with no information you cannot tell how to change the weight. Your function is absolutely correct for back-propagation.
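Putting both points together, here is one hypothetical training step on the input (0, 0), with made-up numbers: the data weights stay put because their inputs are 0, but the bias weight still moves.

double[] inputs  = {0.0, 0.0, 1.0};          // data inputs plus the constant bias input
double[] weights = {0.4, -0.1, 0.2};
double learningRate = 0.1;
double desired = 1.0;                        // made-up target, just for illustration

double guess = 0.0;
for (int i = 0; i < inputs.length; i++) {
    guess += inputs[i] * weights[i];
}
double error = desired - guess;

for (int i = 0; i < weights.length; i++) {
    // weights[0] and weights[1] are unchanged because their inputs are 0;
    // weights[2], the bias weight, still gets adjusted.
    weights[i] += error * inputs[i] * learningRate;
}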