Tags: machine-learning, neural-network, backpropagation

How to modify backpropagation for a standard multilayer network including a scalar gain at each layer?


Consider a standard multilayer network that includes a scalar gain at each layer, so that the net input at layer $m$ is computed as

$$n^m = \beta^m \left[ W^m a^{m-1} + b^m \right]$$

where $\beta^m$ is the scalar gain at layer $m$. This gain would be trained like the weights and biases of the network.
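For concreteness, the modified forward pass might look like the following sketch (the `forward` helper, the tanh transfer functions, and the layer sizes are illustrative assumptions, not part of the exercise):

```python
import numpy as np

def forward(a0, weights, biases, gains):
    """Propagate a0 through the layers using n^m = beta^m (W^m a^{m-1} + b^m)."""
    a = a0
    activations, net_inputs = [a0], []
    for W, b, beta in zip(weights, biases, gains):
        n = beta * (W @ a + b)  # scalar gain scales the entire net input
        a = np.tanh(n)          # transfer function f^m (assumed tanh here)
        net_inputs.append(n)
        activations.append(a)
    return activations, net_inputs

# Example: a 2-3-1 network; with unit gains this reduces to the standard net input.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(1)]
acts, nets = forward(rng.standard_normal(2), weights, biases, gains=[1.0, 1.0])
```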

How can I modify the backpropagation algorithm for this new network?

What new equation would be added to update $\beta^m$?

This is exercise E11.13 from *Neural Network Design* (2nd Edition) by Martin T. Hagan, Howard B. Demuth, Mark H. Beale, and Orlando De Jesus.


Solution

  • I have written the answer in LaTeX; a sketch of the derivation follows below.

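A sketch of the derivation, using the book's sensitivity notation $s^m \equiv \partial \hat{F} / \partial n^m$ and assuming the mean-squared-error performance index $\hat{F} = (t - a)^\top (t - a)$:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

With $n^m = \beta^m\left[W^m a^{m-1} + b^m\right]$, the chain rule gives
the gradient with respect to the gain:
\[
  \frac{\partial \hat{F}}{\partial \beta^m}
  = (s^m)^\top \left[W^m a^{m-1} + b^m\right]
  = \frac{(s^m)^\top n^m}{\beta^m}.
\]
The sensitivity recurrence picks up a factor of $\beta^{m+1}$, while the
final-layer sensitivity is unchanged:
\[
  s^m = \dot{F}^m(n^m)\,\beta^{m+1}\,(W^{m+1})^\top s^{m+1},
  \qquad
  s^M = -2\,\dot{F}^M(n^M)\,(t - a).
\]
The weight and bias gradients pick up a factor of $\beta^m$, so the
steepest-descent updates become
\begin{align*}
  W^m     &\leftarrow W^m - \alpha\,\beta^m\,s^m (a^{m-1})^\top, \\
  b^m     &\leftarrow b^m - \alpha\,\beta^m\,s^m, \\
  \beta^m &\leftarrow \beta^m - \alpha\,(s^m)^\top\left[W^m a^{m-1} + b^m\right].
\end{align*}

\end{document}
```

The last line is the new update equation for $\beta^m$; the only other structural changes to standard backpropagation are the extra $\beta^{m+1}$ factor in the sensitivity recurrence and the $\beta^m$ factor in the weight and bias updates.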