Initially I implemented a backpropagation network in MATLAB and used it on XOR. Now, however, I am applying the same network to the following input/target combination.
Inputs  = [0 0; 0 1; 1000 0; 1 1]
Targets = [0; 1000; 1; 0]
The output I get is [1; 1; 1; 1], so the network was not able to learn the mapping at all. Could anyone please explain why that is? And what can I do to build a network that can learn such small input/output patterns?
Any explanation is highly appreciated.
Regards Max
It looks like a scaling problem. In your original XOR problem, the inputs and outputs were all on a comparable scale, namely [0, 1]. In your revised problem, some inputs are in [0, 1] while others are as large as 1000.
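One plausible mechanism, assuming your network uses sigmoid/tanh units (the usual default): an input of 1000 drives the activation deep into saturation for almost any weight, so the gradient there is essentially zero and backpropagation cannot make progress. A quick check with the toolbox's tansig function:

w = 0.1;            % a typical small initial weight
tansig(1000 * w)    % ans = 1.0000  -> saturated, gradient ~ 0
tansig(1 * w)       % ans ~ 0.0997  -> unsaturated, gradient ~ 1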
The solution is to normalize the inputs to a common scale; [0, 1] or [-1, 1] are commonly used. In your case it should be sufficient to divide the inputs (and the targets, which also reach 1000) by 1000 to bring everything into [0, 1]. Don't forget to denormalize the network's outputs (i.e. multiply by 1000 in your case) to return to the original scale.
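Here is a minimal sketch of that workflow, assuming the Neural Network Toolbox's feedforwardnet and train; if you are using your own backpropagation code, the same divide-by-1000 scaling applies around your training and simulation calls.

% Normalize inputs and targets into [0,1], train, then denormalize.
inputs  = [0 0; 0 1; 1000 0; 1 1];   % 4 samples x 2 features
targets = [0; 1000; 1; 0];           % 4 samples x 1 target

scale = 1000;                        % largest magnitude in the data
inN   = inputs  / scale;             % inputs now in [0,1]
tgN   = targets / scale;             % targets now in [0,1]

net = feedforwardnet(2);             % one hidden layer with 2 neurons
net.divideFcn = 'dividetrain';       % use all four samples for training
net = train(net, inN', tgN');        % toolbox expects one column per sample

outN = net(inN');                    % outputs on the normalized [0,1] scale
out  = outN' * scale;                % multiply by 1000 to denormalize

With all values in [0, 1] the units stay out of saturation, and the network can learn this mapping just as it learned XOR.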