Variable Selection with AdaBoost

Prof. Yongdai Kim

Hankuk Univ. of Foreign Studies

2000. 12. 21


AdaBoost is widely regarded as one of the best classification methods available, but no variable selection procedure has yet been developed within the AdaBoost framework. In this talk, I propose a variable selection procedure based on AdaBoost. Through simulation, I argue that the proposed method is preferable to the standard forward/backward sequential selection method. I also discuss why decision trees are not an appropriate tool for variable selection, and present a brief comparison of bagging and AdaBoost.
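The abstract does not spell out the proposed procedure, so the sketch below is only a generic illustration of how variable importance can be read off an AdaBoost fit with decision stumps and thresholded to select variables; it is an assumed, simplified approach, not necessarily the method presented in the talk. The synthetic dataset, the threshold value, and all parameter settings are illustrative assumptions.

    # Illustrative sketch only: rank variables by AdaBoost's aggregated
    # importance scores and keep those above a (hypothetical) threshold.
    # This is NOT necessarily the procedure proposed in the talk.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier

    # Synthetic data: 10 variables, only 3 of which carry signal (assumed setup).
    X, y = make_classification(n_samples=500, n_features=10, n_informative=3,
                               n_redundant=0, random_state=0)

    # AdaBoost with its default base learner (depth-1 decision stumps).
    booster = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X, y)

    # feature_importances_ aggregates each stump's split importance,
    # weighted by its boosting weight, so variables never used by any
    # stump receive a score of 0.
    scores = booster.feature_importances_
    threshold = 0.05  # illustrative cut-off, not from the source
    selected = np.where(scores > threshold)[0]
    print("importance scores:", np.round(scores, 3))
    print("selected variables:", selected)

In this toy run, variables that the boosted stumps never split on drop out automatically, which is the intuition behind importance-based selection as opposed to refitting the model for every candidate subset in a forward/backward search.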



