Dual Weight Learning Vector Quantization
A new Learning Vector Quantization (LVQ) approach, called Dual Weight Learning Vector Quantization (DWLVQ), is presented in this paper. The basic idea is to introduce an additional weight, namely an importance vector, for each feature of the reference vectors to indicate how much that feature contributes to classification. The importance vectors are adapted according to the fitness of the respective reference vector over the training iterations. As training progresses, the dual weights (reference vector and importance vector) are adjusted simultaneously and mutually, which ultimately improves the recognition rate. Machine learning databases from the UCI repository are selected to verify the performance of the proposed approach. The experimental results show that DWLVQ yields superior performance in terms of recognition rate, computational complexity, and stability compared with existing methods, including LVQ, Generalized LVQ (GLVQ), Relevance LVQ (RLVQ), and Generalized Relevance LVQ (GRLVQ).
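The abstract does not give the exact DWLVQ update rule, but the idea of pairing each reference (prototype) vector with a per-feature importance vector can be illustrated with a hedged sketch. The following is a minimal, hypothetical LVQ1-style step in which each prototype weighs feature distances by its own importance vector; the learning rates `lr_w`, `lr_l` and the particular importance update are assumptions for illustration, not the paper's actual rule.

```python
import numpy as np

def dwlvq_step(x, label, prototypes, proto_labels, importances,
               lr_w=0.1, lr_l=0.01):
    """One hypothetical training step with dual weights: each prototype
    has both a position (reference vector) and an importance vector.
    This is an illustrative sketch, not the exact DWLVQ rule."""
    # Weighted squared distance: each prototype scales feature-wise
    # differences by its own importance vector.
    d = np.sum(importances * (x - prototypes) ** 2, axis=1)
    winner = int(np.argmin(d))
    diff = x - prototypes[winner]
    if proto_labels[winner] == label:
        # Correct winner: attract the prototype and shift importance
        # toward features that already match well.
        prototypes[winner] += lr_w * diff
        importances[winner] -= lr_l * diff ** 2
    else:
        # Wrong winner: repel the prototype and emphasize the
        # mismatching features.
        prototypes[winner] -= lr_w * diff
        importances[winner] += lr_l * diff ** 2
    # Keep the importance vector non-negative and normalized to sum 1.
    importances[winner] = np.clip(importances[winner], 1e-12, None)
    importances[winner] /= importances[winner].sum()
    return winner

# Usage: one step on a toy 2-D problem.
prototypes = np.array([[0.0, 0.0], [2.0, 2.0]])
proto_labels = np.array([0, 1])
importances = np.full((2, 2), 0.5)
winner = dwlvq_step(np.array([1.0, 0.0]), 0,
                    prototypes, proto_labels, importances)
```

In this toy run the first prototype wins (its importance-weighted distance is smaller) and, since its label matches, it is pulled toward the sample while its importance vector is renormalized.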
Chuanfeng Lv, Qiangfu Zhao, Xing An, Zhiwen Liu
Department of Electronic Engineering, Beijing Institute of Technology, China, 100081; School of Computer Science and Engineering, The University of Aizu, Japan, 965-8580
International conference
9th International Conference on Signal Processing (ICSP08)
Beijing
English
2008-10-26 (date first posted on the Wanfang platform; not necessarily the publication date of the paper)