I was reading some literature about ANNs and got a bit confused about how the biases are updated. I understand that the process is done through backpropagation, but I am confused about which part of the bias is actually adjusted, since I read that its value is always one.
So my question is whether the biases are adjusted indirectly, because the weights on their connection channels are updated, or whether it is the actual value of one that gets updated.
Thanks in advance!
Bias is just another parameter that is trained by computing derivatives, like every other parameter of the neural network. One can simulate a bias by concatenating an extra 1 to the activations of the previous layer, since
$$w^\top x + b = \langle [w \;\; b],\ [x \;\; 1] \rangle,$$
where $[\cdot\;\cdot]$ denotes concatenation. Consequently, it is not the bias that is 1; the bias is just a trainable parameter. But one can think of a bias as if it were a regular neuron-to-neuron connection in which the input neuron is always equal to 1.
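A minimal NumPy sketch (toy data, squared loss, plain gradient descent; all names here are my own) may make this concrete: the bias gets its own derivative and is updated exactly like the weights, and appending a constant 1 to the input reproduces the same output with the bias folded into the weight vector.

```python
import numpy as np

# Single linear neuron y = w.x + b, trained with squared loss.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))               # 100 samples, 3 features
y = x @ np.array([2.0, -1.0, 0.5]) + 4.0    # true weights, true bias = 4

w = np.zeros(3)
b = 0.0
lr = 0.1
for _ in range(200):
    y_hat = x @ w + b
    err = y_hat - y
    grad_w = x.T @ err / len(x)   # dL/dw
    grad_b = err.mean()           # dL/db (the "bias neuron's" input is 1)
    w -= lr * grad_w
    b -= lr * grad_b              # the bias itself is adjusted, not fixed at 1

# The concatenation trick: append a constant 1 to each input
# and fold b into the weight vector.
x_aug = np.hstack([x, np.ones((len(x), 1))])
w_aug = np.append(w, b)
assert np.allclose(x_aug @ w_aug, x @ w + b)  # <[w b], [x 1]> == w.x + b
```

Note that `grad_b` is just the gradient flowing to a weight whose input is the constant 1, which is exactly the equivalence the equation above expresses.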