decision tree - Machine Learning - AdaBoost with CART beats the best standalone algorithm -


On a given problem, I evaluated a few standalone algorithms using k-fold cross-validation (k=10), among them CART and KNN.

KNN came out ahead of CART, so I tuned it further using the best k I found (k=3).
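A minimal sketch of that comparison, assuming scikit-learn; the dataset here is a synthetic stand-in since the original data isn't shown:

```python
# Hypothetical reproduction of the k-fold comparison described above.
# make_classification stands in for the (unknown) original dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
cv = KFold(n_splits=10, shuffle=True, random_state=0)  # k=10 as in the question

knn = KNeighborsClassifier(n_neighbors=3)      # tuned k=3
cart = DecisionTreeClassifier(random_state=0)  # CART-style tree

knn_acc = cross_val_score(knn, X, y, cv=cv).mean()
cart_acc = cross_val_score(cart, X, y, cv=cv).mean()
print(f"KNN (k=3): {knn_acc:.3f}  CART: {cart_acc:.3f}")
```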

When I went on to check ensemble methods, I tried AdaBoost with both the tuned KNN and a default CART as base estimators, and surprisingly CART performed better.

I'm not deep into the math behind the scenes, so I don't understand how this makes sense.

So, two questions:

  1. How come AdaBoost with CART performed better than AdaBoost with the tuned KNN?
  2. If standalone performance doesn't predict performance as an ensemble base estimator, how do I wisely choose base estimators for ensemble methods?

