Tags: neural-network, backpropagation

Testing ANN backpropagation implementation


When implementing an artificial neural network with backpropagation, the algorithm is complex and there is no stable reference point for testing.

What is the best approach to debugging the correctness of the implemented algorithm?

I am not talking about the train/validation/test phases. I would like some sort of gauge or, perhaps, step-by-step training results/weight values to check against the inner workings of the network, for a given configuration and training data.


Solution

  • What I think might help is:

    1. Using an automatic differentiation framework (like Theano or TensorFlow) to define a network with the same configuration as yours, and checking your results against the gradients it computes.
    2. Using numerical gradient computation (gradient checking).
    3. Deriving the appropriate gradients by hand and comparing the results.
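Point 2 above (gradient checking) is the most self-contained of the three, so here is a minimal sketch of it. The tiny one-layer sigmoid network and squared-error loss are illustrative assumptions, not from the original question; the idea is to compare your backprop gradient against a central finite-difference estimate of the same loss.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(W, x, y):
    # Squared error of a single sigmoid layer (assumed example network).
    return 0.5 * np.sum((sigmoid(W @ x) - y) ** 2)

def analytic_grad(W, x, y):
    # Backprop-derived gradient of the loss w.r.t. W.
    a = sigmoid(W @ x)
    delta = (a - y) * a * (1.0 - a)  # dL/dz via the chain rule
    return np.outer(delta, x)        # dL/dW

def numerical_grad(W, x, y, eps=1e-5):
    # Central differences: (L(w + eps) - L(w - eps)) / (2 * eps),
    # perturbing one weight at a time.
    grad = np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            W[i, j] += eps
            plus = loss(W, x, y)
            W[i, j] -= 2 * eps
            minus = loss(W, x, y)
            W[i, j] += eps  # restore the original weight
            grad[i, j] = (plus - minus) / (2 * eps)
    return grad

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))
x = rng.standard_normal(4)
y = rng.standard_normal(3)

g_a = analytic_grad(W, x, y)
g_n = numerical_grad(W, x, y)

# A correct backprop implementation should give a tiny relative error
# (on the order of 1e-8 or smaller with eps=1e-5).
rel_err = np.linalg.norm(g_a - g_n) / (np.linalg.norm(g_a) + np.linalg.norm(g_n))
print(rel_err)
```

The same loop works for any parameter of your own network: run your backprop, then perturb each weight and check that the two gradients agree. Disable this check during real training, since it is far too slow for anything but small test configurations.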