I am trying to understand the formulas behind backward propagation.
If our last layer is a sigmoid, we can calculate dZ = A − Y.
But if we have a layer in the middle of the network, it is calculated as dZ = dA ⋅ A ⋅ (1 − A).
Our general formula is dZ_i = dA_i ⋅ A_i ⋅ (1 − A_i).
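As I understand it, the A ⋅ (1 − A) factor is just the derivative of the sigmoid itself, since σ′(z) = σ(z)(1 − σ(z)). A quick numerical check I wrote to convince myself (my own sketch, values made up):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.0, 1.5])
a = sigmoid(z)

# Analytic derivative: sigma'(z) = sigma(z) * (1 - sigma(z)) = A * (1 - A)
analytic = a * (1 - a)

# Numerical derivative via central differences, for comparison
eps = 1e-6
numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)

print(np.allclose(analytic, numeric))  # True
```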
Then, if dA = A − Y, the general formula gives dZ = dA ⋅ A ⋅ (1 − A) = (A − Y) ⋅ A ⋅ (1 − A). But for the last layer we also said dZ = A − Y. These two expressions don't match, and that is what confuses me.
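To make the mismatch concrete, here is a small numeric sketch I tried (my own code, with made-up values for A and Y):

```python
import numpy as np

# Hypothetical sigmoid outputs A and labels Y (made-up values)
A = np.array([0.9, 0.2, 0.7])
Y = np.array([1.0, 0.0, 1.0])

# Shortcut formula for the last (sigmoid) layer:
dZ_shortcut = A - Y

# General formula dZ = dA * A * (1 - A), plugging in dA = A - Y:
dA = A - Y
dZ_general = dA * A * (1 - A)

print(dZ_shortcut)
print(dZ_general)
print(np.allclose(dZ_shortcut, dZ_general))  # False: the two disagree
```

So with the same numbers, the two formulas give different values, which is exactly what I don't understand.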
Can anyone help, please? I need an explanation of how both formulas can be right.
PS: I obtained these formulas from ChatGPT, and when I asked it what they mean I did not get an understandable answer.