Why is it that in logistic regression the cost function looks like this:

J(theta) = -(1/m) * sum_{i=1}^{m} [ y^(i) * log(h_theta(x^(i))) + (1 - y^(i)) * log(1 - h_theta(x^(i))) ]
but then the update rule used while implementing gradient descent is basically simplified down to the same form as linear regression's, where no logarithms are included:

theta_j := theta_j - alpha * (1/m) * sum_{i=1}^{m} (h_theta(x^(i)) - y^(i)) * x_j^(i)
I fully understand why logs are necessary in logistic regression: without them the cost would be non-convex, with multiple local minima, so gradient descent might not reach the global minimum. So why are the logs not present in the gradient descent update?
I was expecting the cost function to be the same in gradient descent.
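For what it's worth, the logs don't disappear; they are consumed by differentiation. The update rule is the *derivative* of the log cost, and the derivative of log(h) combined with the sigmoid's derivative cancels down to (h - y) * x. A minimal NumPy sketch (function and variable names are my own, not from any particular course) showing the cost with logs next to its log-free gradient:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # Log-loss: the logarithms live here, in the cost itself.
    m = len(y)
    h = sigmoid(X @ theta)
    return -(1.0 / m) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient(theta, X, y):
    # Derivative of the log-loss: the logs have cancelled out,
    # leaving the same (h - y) * x form as linear regression.
    m = len(y)
    h = sigmoid(X @ theta)
    return (1.0 / m) * (X.T @ (h - y))

# One gradient descent step would then be:
#   theta = theta - alpha * gradient(theta, X, y)
```

The key point: the update rule only *looks* like linear regression's; h_theta is the sigmoid here, not a linear function, so the two algorithms still behave differently.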