I’m trying to implement backpropagation from scratch and update the weights of a network with multiple inputs and multiple outputs but no hidden layers. The logic I have implemented looks fine to me, but the weights are not converging; they keep growing instead. I can’t figure out why this is happening.
def update():
    # Assumes numpy is imported as np and that x_random, y, p1,
    # calculate_errors, learning_rate and weights_n are defined elsewhere.
    global weights_n
    for i in range(10000):
        y_mult = p1(x_random)                        # forward pass with the current weights
        print("Varying Output", y_mult)
        print(weights_n)
        error = calculate_errors(y_mult, y)
        z = y_mult - y                               # residual (prediction minus target)
        print("Error", error)
        grad_a1 = np.dot(x_random.T, z)              # gradient of the loss w.r.t. the weights
        print(learning_rate * grad_a1)
        weights_n = weights_n - learning_rate * grad_a1
        print(weights_n)
        if error.all() <= 0.5:
            break
    return weights_n
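For reference, this is the kind of minimal, self-contained loop I expected my code to behave like (a rough sketch only: it assumes a linear model y_hat = X @ W trained with a mean-squared-error loss, and the data, shapes and learning rate below are made up for illustration):

import numpy as np

# Made-up data: 100 samples, 3 inputs, 2 outputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_W = np.array([[1.0, -2.0], [0.5, 0.0], [-1.5, 3.0]])
y = X @ true_W

W = rng.normal(size=(3, 2))           # weights to learn, shape (n_inputs, n_outputs)
learning_rate = 0.01

for i in range(10000):
    y_hat = X @ W                     # forward pass
    z = y_hat - y                     # residual, shape (100, 2)
    grad = X.T @ z / len(X)           # gradient of 0.5 * mean squared error
    W -= learning_rate * grad
    if np.max(np.abs(z)) <= 1e-6:     # stop once every residual is tiny
        break

print(W)                              # should end up close to true_W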
When I compute the update with numpy's sum and matmul it works, but with the dot product something is off.
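For what it's worth, np.dot and matmul (@) should give the same result for 2-D arrays, so my guess is that the operand shapes differ between my two versions. A quick sanity check I ran (with made-up shapes matching my setup) illustrates this:

import numpy as np

x = np.random.rand(100, 3)   # made-up inputs, shape (n_samples, n_inputs)
z = np.random.rand(100, 2)   # made-up residuals, shape (n_samples, n_outputs)

# For 2-D arrays, np.dot and @ are the same operation.
print(np.allclose(np.dot(x.T, z), x.T @ z))   # True

# But if the residual accidentally broadcasts, e.g. shape (100, 1) minus (100,),
# the result becomes (100, 100) and the "gradient" is far too large:
a = np.random.rand(100, 1)
b = np.random.rand(100)
print((a - b).shape)                          # (100, 100)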