In the multilayer perceptron example at
http://www.deeplearning.net/tutorial/mlp.html#mlp
why are the params (weights & biases) added? See the fifth code section on that page.
They are not really "added". Each layer's parameters are stored in a Python list, and applying the + operator to two lists concatenates them into a single list. So the operation is list concatenation, not a mathematical summation.
Here is a short snippet to illustrate the idea:
param1 = list(range(2, 6))    # stands in for one layer's parameter list
param2 = list(range(12, 15))  # stands in for another layer's parameter list
print(param1)
print(param2)
print("param1 + param2: %s" % (param1 + param2))  # + concatenates the lists
This will print:
[2, 3, 4, 5]
[12, 13, 14]
param1 + param2: [2, 3, 4, 5, 12, 13, 14]
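The MLP class in the tutorial uses this same pattern to collect every trainable parameter of the model into one flat list, so that the gradient step can later iterate over all of them at once. Below is a minimal, simplified sketch of that idea; the attribute names (hiddenLayer, logRegressionLayer, params) mirror the tutorial, but the classes here are hypothetical stand-ins with plain strings instead of Theano shared variables:
# Hypothetical stand-ins for the tutorial's layer classes:
# each layer exposes its trainable parameters as a list.
class HiddenLayer(object):
    def __init__(self):
        self.params = ["W_hidden", "b_hidden"]    # placeholders, not real weights

class LogisticRegression(object):
    def __init__(self):
        self.params = ["W_logreg", "b_logreg"]    # placeholders, not real weights

class MLP(object):
    def __init__(self):
        self.hiddenLayer = HiddenLayer()
        self.logRegressionLayer = LogisticRegression()
        # List concatenation: one flat list holding every parameter of the model.
        self.params = self.hiddenLayer.params + self.logRegressionLayer.params

mlp = MLP()
print(mlp.params)   # ['W_hidden', 'b_hidden', 'W_logreg', 'b_logreg']
Having a single flat list is convenient because the training code can then compute one gradient per entry and build the update rules in a single loop, instead of handling each layer's weights and biases separately.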