Tags: neural-network, xor, backpropagation

Neural Network XOR with 8+ Input nodes


Using standard backpropagation, I can train a network with up to 8 binary inputs to learn XOR. That is 256 input patterns in total, and the output correctly identifies the 8 patterns in which exactly one of the 8 inputs is 1 and the rest are 0.

Layout:

• 8 input nodes;

• 1 hidden layer with 2 or more nodes;

• 1 output node

It will train in approximately 500 epochs, fewer if I use more hidden nodes.
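For reference, the setup described above can be sketched roughly as follows. This is my own minimal NumPy implementation (not the asker's code), assuming full-batch gradient descent with sigmoid activations and mean-squared-error loss; the layer sizes, learning rate, and epoch count are illustrative choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# All 256 binary patterns of 8 bits; target is 1 when exactly one bit is set.
n = 8
X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)], dtype=float)
y = (X.sum(axis=1) == 1).astype(float).reshape(-1, 1)

hidden = 4
W1 = rng.normal(0.0, 1.0, (n, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 1.0, (hidden, 1)); b2 = np.zeros(1)

lr = 0.5
losses = []
for epoch in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: gradients of MSE through the sigmoid output and hidden layer
    d_out = 2.0 * (out - y) * out * (1.0 - out) / len(X)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)
```

With a sensible learning rate the loss falls steadily; how quickly it reaches a usable solution depends on the initialization and the number of hidden nodes, which matches the asker's observation.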

However, I cannot get it to train on 9 input nodes at all, no matter how many hidden nodes I use.

Is there an intrinsic limit of 8 preventing this? I suspect I might need another hidden layer, but wanted to get an insight into whether it is fundamentally impossible as it stands.

Thanks for any clues.


Solution

  • Yes, a 9:2:1 network can unequivocally solve this. If you are unable to find a solution, the cause is either unsuitable training settings or a bug in the algorithm, not an intrinsic limit.
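To show that 9:2:1 is sufficient in principle, here is a hand-constructed (not learned) set of weights for that topology, under the "exactly one input is 1" reading of the task: one hidden unit detects "at least one input on", the other detects "at least two on", and the output fires when the first is on and the second is off. The gain of 20 is an arbitrary choice to saturate the sigmoids:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n = 9
# Hidden unit 1: fires when the input sum >= 1 (pre-activation 20*s - 10).
# Hidden unit 2: fires when the input sum >= 2 (pre-activation 20*s - 30).
W1 = np.full((n, 2), 20.0)
b1 = np.array([-10.0, -30.0])
# Output fires when hidden 1 is on AND hidden 2 is off, i.e. sum == 1.
W2 = np.array([[20.0], [-20.0]])
b2 = np.array([-10.0])

# Check the construction over all 512 input patterns.
X = np.array([[(i >> b) & 1 for b in range(n)] for i in range(2 ** n)], dtype=float)
out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
correct = (out.ravel() > 0.5) == (X.sum(axis=1) == 1)
```

Since an exact solution exists, a failure to converge points at the training setup (learning rate, initialization, epoch budget) rather than at the 9:2:1 architecture.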