The Naive Bayesian Classifier Learning Algorithm Based on Adaboost and Parameter Expectations
The naive Bayesian classifier is a simple classification method based on Bayesian statistics; it is one of the most popular classifiers and has been successfully applied in many fields. To improve the generalization ability of the naive Bayesian classifier, this paper studies discriminative learning of its parameters and proposes a parameter learning algorithm, AENB. The algorithm adopts the Adaboost classifier ensemble framework to sequentially generate a series of individual classifiers with their own parameters, and obtains parameter expectations as the weighted sum of the parameters of these individual classifiers. Finally, a naive Bayesian classifier with the parameter expectations is constructed. Experimental results show that AENB improves the classification accuracy of the naive Bayesian classifier in most cases. Furthermore, compared with a naive Bayesian classifier ensemble, AENB requires less space because the parameters of the individual classifiers do not need to be stored.
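The core idea described in the abstract, boosting naive Bayes learners and then keeping only the vote-weighted average of their parameters, can be sketched roughly as follows. This is a minimal illustration for discrete features; the function names (train_nb, predict_nb, aenb_sketch) and the AdaBoost.M1-style weighting are assumptions for the sketch, not the paper's exact AENB procedure.

```python
import numpy as np

def train_nb(X, y, w, n_values, n_classes, alpha=1.0):
    """Fit naive Bayes parameters (class priors and conditional tables)
    from instance-weighted counts with Laplace smoothing."""
    n_features = X.shape[1]
    prior = np.zeros(n_classes)
    cond = [np.zeros((n_classes, k)) for k in n_values]
    for c in range(n_classes):
        wc, Xc = w[y == c], X[y == c]
        prior[c] = wc.sum()
        for j in range(n_features):
            for v in range(n_values[j]):
                cond[j][c, v] = wc[Xc[:, j] == v].sum()
    prior = (prior + alpha) / (prior.sum() + alpha * n_classes)
    cond = [(t + alpha) / (t.sum(axis=1, keepdims=True) + alpha * t.shape[1])
            for t in cond]
    return prior, cond

def predict_nb(params, X):
    """Predict with given naive Bayes parameters (log-space for stability)."""
    prior, cond = params
    log_post = np.tile(np.log(prior), (X.shape[0], 1))
    for j, table in enumerate(cond):
        log_post += np.log(table[:, X[:, j]]).T      # shape (n_samples, n_classes)
    return log_post.argmax(axis=1)

def aenb_sketch(X, y, n_values, n_classes, n_rounds=10):
    """Boost naive Bayes learners and return the vote-weighted average of their
    parameters, so only one set of expected parameters needs to be kept."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                          # instance weights
    sum_prior, sum_cond, total_vote = None, None, 0.0
    for _ in range(n_rounds):
        params = train_nb(X, y, w, n_values, n_classes)
        pred = predict_nb(params, X)
        err = w[pred != y].sum()
        if err >= 0.5:                               # no better than chance: stop
            break
        beta = max(err, 1e-10) / (1.0 - err)         # AdaBoost.M1 update factor
        vote = np.log(1.0 / beta)                    # this round's classifier weight
        prior, cond = params
        if sum_prior is None:
            sum_prior, sum_cond = vote * prior, [vote * t for t in cond]
        else:
            sum_prior += vote * prior
            sum_cond = [s + vote * t for s, t in zip(sum_cond, cond)]
        total_vote += vote
        if err == 0:
            break
        w[pred == y] *= beta                         # downweight correctly classified instances
        w /= w.sum()
    if sum_prior is None:                            # fallback: plain (unboosted) NB
        return train_nb(X, y, np.full(n, 1.0 / n), n_values, n_classes)
    return sum_prior / total_vote, [t / total_vote for t in sum_cond]
```

Because the final parameters are a convex combination of each round's priors and conditional tables, they remain valid probability distributions, and only this single averaged parameter set needs to be stored rather than every individual classifier.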
classification; Adaboost; naive Bayes; parameter learning
Hongbo Shi, Xiaoyong Lv
School of Information Management, Shanxi University of Finance and Economics, Taiyuan 030031, China
International conference
Huangshan
English
377-381
2010-05-28 (date the paper first went online on the Wanfang platform; not necessarily the paper's publication date)