machine-learning, neural-network, backpropagation

In neural network backpropagation, how do we get the differential equations?


I am confused about why dz = da * g'(z). As we all know, in forward propagation a = g(z). Taking the derivative with respect to z, I get da/dz = g'(z), so shouldn't it be dz = da * 1/g'(z)? Thanks!


Solution

  • From what I remember, in many courses, notations like dZ are shorthand for dJ/dZ, and so on. All the derivatives are of the cost J with respect to the various parameters, activations, weighted sums, etc. So dz = da * g'(z) is just the chain rule written in that shorthand: dJ/dz = (dJ/da) * (da/dz) = da * g'(z). Your da/dz = g'(z) is correct; the confusion is that dz does not mean a change in z, but the derivative of the cost with respect to z.
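
The shorthand can be checked numerically. Below is a minimal sketch that assumes a sigmoid activation and a squared-error cost J = 0.5 * (a - y)² (both are illustrative choices, not stated in the question), and verifies that dJ/dz = (dJ/da) * g'(z) matches a finite-difference estimate:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Illustrative scalar example: a = g(z), J = 0.5 * (a - y)^2
z, y = 0.7, 1.0
a = sigmoid(z)

dJ_da = a - y                      # "da" in course shorthand: dJ/da
dJ_dz = dJ_da * sigmoid_prime(z)   # "dz" = da * g'(z), i.e. dJ/dz by the chain rule

# Compare against a centered finite-difference estimate of dJ/dz
eps = 1e-6
J = lambda zz: 0.5 * (sigmoid(zz) - y) ** 2
numeric = (J(z + eps) - J(z - eps)) / (2 * eps)

print(abs(dJ_dz - numeric) < 1e-8)
```

If dz really meant da * 1/g'(z), this check would fail; it passes because dz stands for dJ/dz, not for an inverted derivative of the activation.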