Integrated Classification Comparison: Bagging NB and Boosting NB
Abstract
Bagging and Boosting are two important voting classification algorithms. Bagging generates multiple classifiers in parallel by resampling the training set, while Boosting generates multiple classifiers serially by adjusting the sample weights. This paper integrates the Bagging and AdaBoost algorithms with Naive Bayes to construct Bagging NB and AdaBoost NB. Experimental results on UCI data sets show that Bagging NB is more stable and produces more accurate classifiers than NB alone, whereas Boosting is sensitive to the distribution of the data set and is sometimes less effective.