I am dealing with a very imbalanced multiclass classification problem and am trying to use sklearn's GradientBoostingClassifier as my model. From the documentation here, I can see that the only available loss functions are 'log_loss' and 'exponential'. From my research, I have seen that exponential loss grows exponentially for negative margins, which makes it more sensitive to outliers. This makes me think it would be better at classifying my minority classes than log loss, which grows only linearly for negative margins. However, when I try to use the exponential loss I get the error
ValueError: ExponentialLoss requires 2 classes; got 55 class(es)
which shows that I can't use exponential loss for a multiclass classification problem. From this paper, it seems entirely possible to use exponential loss for multiclass classification, so I am curious why it is unavailable for this model.
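For context, here is a minimal sketch of what I'm running. The toy dataset below (generated with make_classification) is just a stand-in for my real data, which has 55 heavily imbalanced classes:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy imbalanced multiclass data standing in for my real dataset
# (the real one has 55 classes; 5 are used here just to reproduce the error).
X, y = make_classification(
    n_samples=500,
    n_features=20,
    n_informative=10,
    n_classes=5,
    weights=[0.70, 0.20, 0.05, 0.03, 0.02],  # rough class imbalance
    random_state=0,
)

clf = GradientBoostingClassifier(loss="exponential")
clf.fit(X, y)  # raises the same ValueError (requires 2 classes; got 5)
```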
Should I just stick with log loss? Should I change the model? Why can't I use exponential loss for more than two classes? Any answers would be appreciated! I'm trying to better understand this model.