I’m new to machine learning and have been learning the gradient descent algorithm. I believe the code below performs a simultaneous update, even though it looks like a sequential one: both partial derivatives are computed before either `w` or `b` is updated, so each parameter is updated using the original values of `w` and `b`. Am I wrong?
```python
# Both gradients are computed from the current (w, b) ...
dj_dw = ((w * x[i] + b - y[i]) * x[i]) / m
dj_db = (w * x[i] + b - y[i]) / m
# ... and only then are the parameters updated.
w = w - a * dj_dw
b = b - a * dj_db
```
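For reference, here is a minimal runnable sketch (with made-up toy data and a hypothetical `grads` helper) of what I understand the difference to be: in a simultaneous update both gradients come from the same `(w, b)`, while in a sequential update the second gradient is recomputed after the first parameter has already changed.

```python
import numpy as np

# Toy data, invented for illustration: y = 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0
m = len(x)
a = 0.01  # learning rate

def grads(w, b):
    # Partial derivatives of the mean squared error cost,
    # averaged over all m examples.
    err = w * x + b - y
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

# Simultaneous update: both gradients use the SAME (w, b).
w, b = 0.0, 0.0
dj_dw, dj_db = grads(w, b)
w_sim = w - a * dj_dw
b_sim = b - a * dj_db

# Sequential update: b's gradient is recomputed AFTER w changed.
w, b = 0.0, 0.0
dj_dw, _ = grads(w, b)
w_seq = w - a * dj_dw
_, dj_db = grads(w_seq, b)  # uses the already-updated w
b_seq = b - a * dj_db

print(w_sim == w_seq)  # True: w is updated first either way
print(b_sim == b_seq)  # False: b's gradient differs
```

The two variants give the same `w` but a different `b`, which is why I think the order of the gradient computations, not the order of the assignment statements, is what decides simultaneous vs. sequential.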
Apologies for any errors; I’m new to this. I tried cross-checking with Gemini and ChatGPT, and both said the code is a sequential update, hence my confusion.