I’m implementing a decision tree classifier using sklearn and trying out the different criteria, but I can’t find what the difference is between the ‘entropy’ and ‘log_loss’ criteria. The underlying _classes.py in sklearn’s tree module maps both log_loss and entropy to the same class in its classifier criteria dict, presumably making them the same operation?
CRITERIA_CLF = {
    "gini": _criterion.Gini,
    "log_loss": _criterion.Entropy,
    "entropy": _criterion.Entropy,
}
What is the underlying difference? It seems that running with either ‘log_loss’ or ‘entropy’ does the same thing under the hood.