
XOR neural network seems to converge around 0.5


I've been trying to code an XOR neural network for a few weeks, but I keep running into the same problem. First of all, you should know that I've spent hours and hours trying everything I found on the net, but nothing has worked.

After trying to build it from 3Blue1Brown's videos on the subject without success, I am now following this chapter: http://neuralnetworksanddeeplearning.com/chap2.html. I coded a Matrix library with all the necessary functions.

My network has 3 layers: 2 input neurons, 2 hidden neurons, and 1 output neuron. I also have 2 biases feeding the hidden neurons and one feeding the output neuron. I use the sigmoid function to keep values between 0 and 1, and the quadratic cost function. Every time I train the network (i.e. every time I run backpropagation), I pick a random input with its corresponding output.
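
For reference, here is a minimal NumPy sketch of that setup as I understand it: a 2-2-1 sigmoid network, quadratic cost, and one backprop update per randomly chosen example, with no learning-rate scaling (effectively eta = 1, since the question predates adding one). The names (`train_step`, `W1`, etc.) are illustrative; the original uses a custom Matrix library, so this only mirrors the structure:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# The four XOR cases and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((2, 2))  # input -> hidden weights
b1 = rng.standard_normal((2, 1))  # hidden biases (one per hidden neuron)
W2 = rng.standard_normal((1, 2))  # hidden -> output weights
b2 = rng.standard_normal((1, 1))  # output bias

def train_step(x, y):
    """One backpropagation update on a single (x, y) pair, quadratic cost."""
    global W1, b1, W2, b2
    x = x.reshape(2, 1)
    # Forward pass.
    z1 = W1 @ x + b1
    a1 = sigmoid(z1)
    z2 = W2 @ a1 + b2
    a2 = sigmoid(z2)
    # Backward pass: output error, then hidden error (BP1/BP2 in the book).
    delta2 = (a2 - y) * sigmoid_prime(z2)
    delta1 = (W2.T @ delta2) * sigmoid_prime(z1)
    # Raw gradient-descent step (no learning rate, i.e. eta = 1).
    W2 -= delta2 @ a1.T
    b2 -= delta2
    W1 -= delta1 @ x.T
    b1 -= delta1

for step in range(20000):
    i = rng.integers(4)  # a random example each step, as described above
    train_step(X[i], Y[i])
```

With this topology and full-size gradient steps, an unlucky initialization can indeed leave every output stalled near 0.5, which matches the symptom below.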

The problem is that no matter how many times I train it, the output is never even close to 0 or 1; it always hovers around 0.5, and my cost function is stuck around 0.14.
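
(That 0.14 is itself a clue: assuming the book's quadratic cost, a network that outputs a near-constant $a \approx 0.5$ on all four XOR cases gives

$$C = \frac{1}{2n}\sum_x \lVert y(x) - a \rVert^2 = \frac{1}{2 \cdot 4} \cdot 4 \cdot (0.5)^2 = 0.125,$$

very close to the observed value; in other words, the network has collapsed to a constant output rather than learned anything.)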

Any hint or help is appreciated -- I really don't see where the problem is; I feel like I've tried everything. PS: I haven't posted any code here; if it's needed, don't hesitate to ask.


Solution

  • I managed to resolve my problem by adding layers to my network. Moreover, when I later improved it to code an OCR, I added a learning rate to escape the local minima that were partly the problem every time my network got stuck. A sketch of that kind of fix is below.
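
For illustration, here is a hedged sketch of that fix under the same assumptions as the sketch above (Python/NumPy; the layer sizes and the learning rate `eta = 0.5` are illustrative choices, not the answerer's exact values): a deeper sigmoid network trained with explicit learning-rate scaling.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)

rng = np.random.default_rng(1)
sizes = [2, 4, 4, 1]  # more layers/capacity than 2-2-1; sizes are illustrative
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal((m, 1)) for m in sizes[1:]]
eta = 0.5  # explicit learning rate instead of full-size gradient steps

for step in range(50000):
    i = rng.integers(4)
    a = X[i].reshape(2, 1)
    activations = [a]  # forward pass, remembering each layer's activation
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
        activations.append(a)
    # Backward pass, quadratic cost; sigmoid'(z) = a * (1 - a).
    delta = (activations[-1] - Y[i]) * activations[-1] * (1 - activations[-1])
    for l in range(len(weights) - 1, -1, -1):
        grad_W = delta @ activations[l].T
        grad_b = delta
        if l > 0:  # propagate the error back using the pre-update weights
            delta = (weights[l].T @ delta) * activations[l] * (1 - activations[l])
        weights[l] -= eta * grad_W
        biases[l] -= eta * grad_b

# After training, the four cases should separate cleanly (near 0 or near 1).
for x in X:
    a = x.reshape(2, 1)
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    print(x, a.item())
```

Convergence still depends on the random initialization, but the extra capacity and the smaller steps make getting stuck near 0.5 far less likely than with the bare 2-2-1 setup.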