Better Digit Recognition with a Committee of Simple Neural Nets
We present a new method for training the members of a committee of one-hidden-layer neural nets. Instead of training the individual nets on different subsets of the training data, we preprocess the training data differently for each individual model, so that the errors of the resulting models are decorrelated. On the MNIST digit recognition benchmark we obtain a recognition error rate of 0.39% using a committee of 25 one-hidden-layer neural nets, which is on par with the state-of-the-art recognition rates of more complicated systems.
Ueli Meier, Dan Claudiu Ciresan, Luca Maria Gambardella, Jürgen Schmidhuber
IDSIA, USI and SUPSI, 6928 Manno-Lugano, Switzerland
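The committee scheme described in the abstract (train each one-hidden-layer net on its own preprocessed copy of the training data, then combine the members' outputs) can be illustrated with a short sketch. The code below is not the paper's implementation: it uses scikit-learn's MLPClassifier as a stand-in for the nets, a hypothetical preprocess function in place of the paper's unspecified per-member preprocessing, and simple probability averaging as an assumed combination rule; the hidden-layer size and committee size are likewise illustrative.

```python
# Minimal sketch of a committee of one-hidden-layer nets with per-member
# preprocessing.  Assumptions: scikit-learn MLPs stand in for the paper's
# nets, `preprocess` is a hypothetical placeholder for the per-member
# preprocessing, and member outputs are combined by averaging class
# probabilities.
import numpy as np
from sklearn.neural_network import MLPClassifier


def preprocess(X, member_id):
    """Hypothetical per-member preprocessing meant to decorrelate the
    members' errors.  Here: a trivial per-member rescaling, purely for
    illustration (not the paper's scheme)."""
    return X * (1.0 + 0.05 * member_id)


def train_committee(X_train, y_train, n_members=25, hidden_units=800, seed=0):
    """Train one one-hidden-layer net per committee member, each on its own
    preprocessed copy of the training data."""
    members = []
    for m in range(n_members):
        net = MLPClassifier(hidden_layer_sizes=(hidden_units,),
                            max_iter=200, random_state=seed + m)
        net.fit(preprocess(X_train, m), y_train)
        members.append(net)
    return members


def committee_predict(members, X_test):
    """Average the members' class-probability outputs (each member sees the
    test data preprocessed its own way) and return the argmax class."""
    probs = np.mean([net.predict_proba(preprocess(X_test, m))
                     for m, net in enumerate(members)], axis=0)
    return probs.argmax(axis=1)
```

Averaging the members' probability outputs only helps if their errors are not strongly correlated, which is exactly what the per-member preprocessing in the abstract is intended to achieve.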
International conference
Beijing
English
pp. 1250-1254
2011-09-01 (date the record first went online on the Wanfang platform; not the paper's publication date)