Conference paper

LOCALIZED GENERALIZATION ERROR MODEL FOR MULTILAYER PERCEPTRON NEURAL NETWORKS

In this work, the localized generalization error model (L-GEM) for the Multilayer Perceptron Neural Network (MLPNN) is derived. The L-GEM is inspired by the observation that a classifier should not be required to recognize unseen samples that are very different from the training samples; evaluating a classifier on such samples may therefore be counter-productive. In the L-GEM, locality is defined by requiring the difference between the feature values of unseen samples and those of the training samples to be less than a given real value (Q). The L-GEM provides an upper bound on the Mean Square Error of unseen samples local to the training dataset. As the generalization capability of an MLPNN is the key criterion of successful training, we select the number of hidden neurons of an MLPNN using the L-GEM. Experimental results on four UCI datasets show that the proposed L-GEM yields better MLPNNs, with higher generalization capability (testing accuracy) and fewer hidden neurons.
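The architecture-selection procedure described in the abstract can be sketched as follows. The record does not give the closed-form L-GEM bound itself, so this sketch substitutes a Monte-Carlo proxy: training MSE plus the mean squared output change under input perturbations drawn uniformly from the Q-neighbourhood (an empirical stand-in for the stochastic sensitivity term). The toy data, the training routine, and all parameter values are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, n_hidden, epochs=200, lr=0.1):
    """Train a one-hidden-layer MLP (tanh hidden, linear output) by batch gradient descent."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1));    b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        err = (H @ W2 + b2) - y
        dH = (err @ W2.T) * (1 - H**2)          # backprop through tanh
        W2 -= lr * H.T @ err / len(X); b2 -= lr * err.mean(0)
        W1 -= lr * X.T @ dH / len(X);  b1 -= lr * dH.mean(0)
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2

def lgem_proxy(params, X, y, Q, n_perturb=50):
    """Empirical proxy for the L-GEM bound: training MSE plus a Monte-Carlo
    estimate of output sensitivity to perturbations within the Q-neighbourhood."""
    base = predict(params, X)
    mse = float(np.mean((base - y) ** 2))
    sens = 0.0
    for _ in range(n_perturb):
        dX = rng.uniform(-Q, Q, X.shape)        # unseen sample in the Q-neighbourhood
        sens += float(np.mean((predict(params, X + dX) - base) ** 2))
    return mse + sens / n_perturb

# Toy regression data (placeholder for a UCI dataset).
X = rng.uniform(-1, 1, (80, 2))
y = np.sin(X[:, :1] * 3) + 0.1 * rng.normal(size=(80, 1))

# Pick the hidden-layer size with the smallest proxy bound.
candidates = (2, 4, 8, 16, 32)
best = min(((h, lgem_proxy(train_mlp(X, y, h), X, y, Q=0.1))
            for h in candidates), key=lambda t: t[1])
print("selected hidden neurons:", best[0])
```

The key design point mirrored from the abstract is that the selection criterion penalizes both training error and sensitivity to local input perturbations, which tends to favour smaller hidden layers than selection by training error alone.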

Multilayer Perceptron Neural Network; Localized Generalization Error Bound; Stochastic Sensitivity Measure; Architecture Selection

FEI YANG, WING W.Y. NG, ERIC C.C. TSANG, XIAO-QIN ZENG, DANIEL S. YEUNG

Media and Life Science Computing Lab, Shenzhen Graduate School, Harbin Institute of Technology, China; Department of Computing, Hong Kong Polytechnic University, Hong Kong, China; Department of Computer Science and Engineering, Hohai University, Nanjing 210098, China

International conference

2008 International Conference on Machine Learning and Cybernetics

Kunming

English

794-799

2008-07-12 (date the paper first appeared on the Wanfang platform; this does not indicate the paper's publication date)