Quoting from Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems by Aurélien Géron.
“Bootstrapping introduces a bit more diversity in the subsets that each predictor is trained on, so bagging ends up with a slightly higher bias than pasting, but this also means that predictors end up being less correlated so the ensemble’s variance is reduced. Overall, bagging often results in better models, which explains why it is generally preferred.”
Can someone please explain this intuitively?
I am unable to understand why bagging is a better option than pasting.
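For concreteness, here is a minimal sketch of the two setups in scikit-learn. The only difference is the `bootstrap` flag of `BaggingClassifier` (sampling with vs. without replacement); the dataset and parameter values below are illustrative assumptions, not taken from the book.

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy binary-classification dataset (illustrative choice)
X, y = make_moons(n_samples=500, noise=0.30, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bagging: each tree sees a bootstrap sample (sampling WITH replacement)
bag_clf = BaggingClassifier(
    DecisionTreeClassifier(), n_estimators=500,
    max_samples=100, bootstrap=True, random_state=42)

# Pasting: each tree sees a subset drawn WITHOUT replacement
paste_clf = BaggingClassifier(
    DecisionTreeClassifier(), n_estimators=500,
    max_samples=100, bootstrap=False, random_state=42)

for name, clf in [("bagging", bag_clf), ("pasting", paste_clf)]:
    clf.fit(X_train, y_train)
    print(name, accuracy_score(y_test, clf.predict(X_test)))
```

My reading of the quote is that with replacement (bagging) the subsets overlap less in effective content, so the individual trees are more different from each other, but I would like an intuitive explanation of why that trade-off usually wins.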