LI Xiao-bo. Integrated Classification Comparison: Bagging NB & Boosting NB[J]. Microelectronics & Computer, 2010, 27(8): 136-139.

Integrated Classification Comparison: Bagging NB & Boosting NB

  • Bagging and Boosting are two important voting-based ensemble classification algorithms. Bagging generates multiple classifiers in parallel by resampling the training data, while Boosting generates multiple classifiers serially by adjusting the sample weights. This paper integrates the Bagging and Boosting algorithms with the Naive Bayesian classifier to construct Bagging NB and AdaBoosting NB. Experimental results on UCI data sets show that Bagging NB is more stable and produces more accurate classifiers than NB alone, whereas Boosting is sensitive to the distribution of the data set and is sometimes less effective.
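The two ensembles described in the abstract can be illustrated with a minimal Python sketch using scikit-learn. The data set (the bundled breast-cancer set, which is UCI-derived), the number of base classifiers, and the cross-validation setup below are illustrative assumptions, not the paper's actual experimental protocol.

# Minimal sketch of NB, Bagging NB, and AdaBoosting NB, assuming scikit-learn.
# Data set and hyperparameters are placeholders, not the paper's setup.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)

models = {
    # Plain Naive Bayesian classifier as the baseline.
    "NB": GaussianNB(),
    # Bagging NB: NB classifiers trained in parallel on bootstrap resamples,
    # combined by voting.
    "Bagging NB": BaggingClassifier(GaussianNB(), n_estimators=25, random_state=0),
    # AdaBoosting NB: NB classifiers trained serially, with sample weights
    # increased on the examples the previous classifier misclassified.
    "AdaBoosting NB": AdaBoostClassifier(GaussianNB(), n_estimators=25, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.3f} (+/- {scores.std():.3f})")

Comparing the cross-validated accuracies of the three models in this way mirrors the kind of comparison the paper reports, with Bagging NB expected to vary less across folds than NB alone.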
