I am trying to make a classifier as parsimonious as possible. To do this, I am recursively eliminating features using cross-validation.
In the context of my model, precision is the most important metric. However, I would also like to see how other metrics evolve as fewer features are fed to the model.
In particular, I would like to evaluate the model's recall and F1 scores.
from sklearn.feature_selection import RFECV

rfe = RFECV(
    estimator=clf,  # an XGBClassifier instance
    step=1,
    min_features_to_select=1,
    cv=cv,  # a StratifiedKFold instance
    scoring='precision',
    # scoring=['f1', 'precision', 'recall'],  # throws an error
    verbose=1,
    n_jobs=1,
)
I commented out the line that passes a list of metrics to the scoring
argument because it throws an InvalidParameterError
(that is, it's not happy that I passed it a list).
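To clarify what I'm after, here is a sketch of the kind of workaround I've been considering: RFECV does accept a single callable with the signature (estimator, X, y) -> float, so a custom scorer could return precision (the metric RFECV optimizes) while recording recall and F1 as a side effect. The names here are my own, and the LogisticRegression is just a toy stand-in for the XGBClassifier:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score, f1_score

# Side channel for the metrics RFECV itself won't report.
extra_scores = []

def precision_with_logging(estimator, X, y):
    """Callable scorer (estimator, X, y) -> float, as RFECV accepts.

    Returns precision, and stashes recall and F1 in extra_scores
    as a side effect. Sketch only; names are hypothetical.
    """
    y_pred = estimator.predict(X)
    extra_scores.append({
        "recall": recall_score(y, y_pred),
        "f1": f1_score(y, y_pred),
    })
    return precision_score(y, y_pred)

# Quick sanity check on a toy problem.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)
score = precision_with_logging(clf, X, y)
```

This feels fragile, though (the logged scores aren't tied back to which fold or feature count produced them), so I'd prefer a supported way of doing it.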
Is there a way to pass multiple metrics to an RFECV instance?