Tags: neural-network, backpropagation, q-learning

Large values of weights in neural network


I use Q-learning with a neural network as the function approximator. After several training iterations, the weights take values in the range from 0 to 10. Can the weights take such values, or does this indicate bad network parameters?


Solution

  • Weights can take those values. Especially after a large number of training iterations, the connections that need to be 'heavy' get 'heavier'.

    There are plenty of examples showing neural networks with weights larger than 1. Example.

    Also, as the following image shows, there is no such thing as a weight limit:

    [image legend]
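To see this concretely, here is a minimal sketch (not the asker's network): a single linear neuron trained by gradient descent on a target whose slope is 8. The learned weight settles near 8, well outside [0, 1], without anything being wrong with the training setup.

```python
import numpy as np

# Generate data for the target function y = 8 * x.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 8.0 * X

# One linear neuron: prediction = w * x, trained with plain gradient descent.
w = 0.0
lr = 0.1
for _ in range(200):
    grad = np.mean((w * X - y) * X)  # gradient of MSE w.r.t. w (up to a constant factor)
    w -= lr * grad

# w converges to roughly 8 -- a perfectly healthy weight larger than 1.
print(w)
```

The same logic applies inside a larger network: if the function being approximated (here, a Q-function) has steep regions, some weights must grow large to represent it.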