Conference Proceedings

Mutual Information Based on Rényi's Entropy Feature Selection

The feature selection problem has become a focus of pattern classification research, and mutual information plays an increasingly important role in feature selection algorithms. We propose a normalized mutual information feature selection (NMIFS) method based on Rényi's quadratic entropy, which reduces computational complexity by relying on an efficient estimate of mutual information. We then combine NMIFS with a wrapper into a two-stage feature selection algorithm, which helps identify a more discriminative feature subset. Experiments comparing efficiency and classification accuracy against other MI-based feature selection algorithms show that our method yields a promising reduction in computational complexity.
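The greedy filter stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a simple histogram-based mutual information estimator stands in for the paper's Rényi quadratic-entropy estimator, and the redundancy term is normalized by the smaller marginal entropy, a common NMIFS-style normalization assumed here.

```python
import numpy as np

def entropy(x, bins=8):
    """Shannon entropy (nats) of a histogram-discretized variable."""
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def mutual_info(x, y, bins=8):
    """Histogram-based MI estimate (a stand-in for the paper's
    Rényi quadratic-entropy estimator, which is not reproduced here)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def nmifs(X, y, k, bins=8):
    """Greedy NMIFS-style selection: at each step pick the feature f
    maximizing I(f; y) minus the mean normalized MI between f and the
    already-selected features (redundancy penalty)."""
    n_features = X.shape[1]
    relevance = [mutual_info(X[:, j], y, bins) for j in range(n_features)]
    selected, remaining = [], list(range(n_features))
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in remaining:
            redundancy = 0.0
            if selected:
                redundancy = np.mean([
                    mutual_info(X[:, j], X[:, s], bins)
                    / min(entropy(X[:, j], bins), entropy(X[:, s], bins))
                    for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

On synthetic data containing a pair of near-duplicate informative features, the redundancy penalty steers the second pick toward a complementary feature rather than the duplicate, which is the behavior that distinguishes NMIFS-style filters from pure relevance ranking.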

feature selection; mutual information; Rényi entropy; entropy estimation; NMIFS

LIU Can-Tao, HU Bao-Gang

National Laboratory of Pattern Recognition / Sino-French Laboratory in Computer Science, Automation and Applied Mathematics, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, P.R. China; Graduate University of Chinese Academy of Sciences, Beijing

International Conference

2009 IEEE International Conference on Intelligent Computing and Intelligent Systems

Shanghai

English

816-820

2009-11-20 (date the paper first appeared on the Wanfang platform; not necessarily its publication date)