Machine Learning: AdaBoost with CART outperforms AdaBoost with the best standalone algorithm (kNN)
On a given problem, I evaluated a few standalone algorithms using k-fold cross-validation (k=10), among them CART and kNN.
kNN had the upper hand over CART, and I further tuned it using the best k found (k=3).
When I went on to check ensemble methods, I tried AdaBoost with the tuned kNN and with a default CART as base estimators; surprisingly, the CART version performed better.
I'm not deep into the math behind the scenes, so I don't understand how this makes sense.
So, two questions:
- How come AdaBoost with CART performed better than AdaBoost with the tuned kNN?
- If standalone performance does not predict how well an algorithm works as an ensemble base estimator, how does one wisely choose base estimators for ensemble methods?
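For context, here is a minimal sketch of the comparison described above, assuming a scikit-learn setup (the post doesn't name a library) and a synthetic dataset standing in for the real one. Note that scikit-learn's `AdaBoostClassifier` requires a base estimator whose `fit` accepts `sample_weight`, which `KNeighborsClassifier` does not, so boosting kNN directly is not supported there and would need a workaround (e.g. weighted resampling):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import AdaBoostClassifier

# Synthetic stand-in for the poster's dataset (hypothetical shape/seed).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)  # k=10 as in the post

models = {
    "CART (default)": DecisionTreeClassifier(random_state=0),
    "kNN (k=3)": KNeighborsClassifier(n_neighbors=3),  # the tuned k from the post
    # AdaBoost over a shallow tree (a common weak learner choice);
    # the estimator is passed positionally for compatibility across
    # scikit-learn versions (base_estimator vs estimator renaming).
    "AdaBoost + CART": AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=1, random_state=0),
        n_estimators=100, random_state=0),
}

results = {}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    results[name] = scores.mean()
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

This also hints at part of the answer: AdaBoost is designed around *weak* learners such as depth-limited trees, whereas a well-tuned kNN is already a strong, low-bias model that boosting has little room to improve.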