Conference paper

ON COMBINING DISTRIBUTED SVMS BY SIMPLE BAYESIAN FORMALISM RULES

Support vector machines (SVMs) have become a popular method in the machine learning community. However, they cannot easily scale to large problems because their time and space complexity is roughly quadratic in the number of training samples. This paper proposes to combine distributed SVMs by simple Bayesian formalism rules (B-SVMs). In the training phase, B-SVMs randomly decomposes a large-scale task into many smaller and simpler sub-tasks; in the test phase, it uses simple Bayesian formalism rules to make the final classification decision. B-SVMs was compared with a single SVM trained on the entire training set, parallel SVMs combined by majority voting (MV-SVMs), and a kind of fast modular SVMs (FM-SVMs). Experimental results on four problems show that B-SVMs achieves higher accuracy than MV-SVMs and FM-SVMs do, and that the proposed algorithm significantly reduces training and test time. More importantly, it achieves almost the same test accuracy as a single SVM trained on all the data.
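The abstract does not spell out the exact Bayesian combination rule, so the following is only a minimal sketch of the general scheme it describes, assuming a naive-Bayes-style product rule over the class posteriors of SVMs trained on random partitions; the scikit-learn names (SVC, predict_proba) and details such as the number of partitions are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Toy stand-in for a large-scale classification problem.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Training phase: randomly decompose the task into smaller sub-tasks
    # by partitioning the training set and fitting one SVM per partition.
    # (Assumes each random partition still contains every class.)
    n_parts = 5
    rng = np.random.default_rng(0)
    parts = np.array_split(rng.permutation(len(X_train)), n_parts)
    experts = [SVC(kernel="rbf", probability=True).fit(X_train[idx], y_train[idx])
               for idx in parts]

    # Test phase: combine the experts with a naive-Bayes-style product rule,
    # i.e. multiply per-expert class posteriors (sum their logs) and pick
    # the class with the largest combined score.
    log_post = sum(np.log(clf.predict_proba(X_test) + 1e-12) for clf in experts)
    y_pred = experts[0].classes_[np.argmax(log_post, axis=1)]

    print("combined accuracy:", np.mean(y_pred == y_test))

Summing log-posteriors rather than multiplying raw probabilities is a standard numerical-stability choice; it leaves the arg-max of the product rule unchanged.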

XIAO-MING JIN, YI-MIN WEN

Business School of Central South University, 25 Lu Shan Rd., Changsha 410083, China; Hunan Industry Polytechnic, Changsha 410208, China

International conference

2007 International Conference on Machine Learning and Cybernetics (the Sixth IEEE International Conference on Machine Learning and Cybernetics)

Hong Kong

English

3630-3635

2007-08-19 (date the record first went online on the Wanfang platform; not necessarily the paper's publication date)