Strategies for Constructive Neural Networks and Their Application to Regression Models
Regression is an important application area for neural networks (NNs). Among the many existing NN architectures, the feedforward NN (FNN) paradigm is one of the most widely used. Although one-hidden-layer feedforward neural networks (OHL-FNNs) have simple structures, they possess interesting representational and learning capabilities. In this paper, we are particularly interested in incremental constructive training of OHL-FNNs. In the proposed incremental constructive training schemes for an OHL-FNN, input-side training and output-side training may be separated in order to reduce the training time. A new technique is proposed to scale the error signal during the constructive learning process, improving the efficiency of input-side training and yielding better generalization performance. Two pruning methods for removing redundant input-side connections have also been applied. Numerical simulations demonstrate the potential and advantages of the proposed strategies compared with other existing techniques in the literature.
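The abstract does not give the exact training rules, so the following is only a minimal NumPy sketch of the general incremental constructive scheme it builds on: hidden units are added one at a time, the new unit's input-side weights are trained against the current residual error (a cascade-correlation-style objective), and the output-side weights are then refit by linear least squares. The function name, step sizes, and the tanh activation are illustrative assumptions, not the paper's method; the error-scaling and pruning strategies proposed in the paper are not shown.

```python
import numpy as np

def train_constructive_ohlfnn(X, y, max_hidden=10, inner_steps=200, lr=0.01, seed=0):
    """Grow a one-hidden-layer FNN for regression, one hidden unit at a time.

    For each new unit (illustrative sketch, not the paper's exact algorithm):
      1. Input-side training: adjust the unit's weights to increase the
         correlation between its output and the current residual error.
      2. Output-side training: refit all output weights by linear least
         squares on the hidden-layer outputs, keeping input-side weights frozen.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])       # inputs augmented with a bias column
    W = []                                      # frozen input-side weights, one row per unit
    H = np.ones((n, 1))                         # hidden-layer outputs (starts with bias only)
    beta = np.linalg.lstsq(H, y, rcond=None)[0]
    residual = y - H @ beta

    for _ in range(max_hidden):
        w = rng.normal(scale=0.1, size=d + 1)   # candidate unit's input-side weights
        for _ in range(inner_steps):
            h = np.tanh(Xb @ w)
            rc = residual - residual.mean()
            corr = (h - h.mean()) @ rc
            # gradient ascent on the correlation objective w.r.t. the candidate's weights
            grad = np.sign(corr) * (Xb.T @ ((1.0 - h**2) * rc))
            w += lr * grad / n
        W.append(w)                             # freeze the trained unit
        H = np.hstack([H, np.tanh(Xb @ w)[:, None]])
        beta = np.linalg.lstsq(H, y, rcond=None)[0]   # output-side least-squares refit
        residual = y - H @ beta

    return np.array(W), beta
```

Predictions for new inputs would be obtained by recomputing the hidden outputs with the returned weights, e.g. `np.hstack([np.ones((m, 1)), np.tanh(Xnew_b @ W.T)]) @ beta`. Separating the two training stages in this way is what allows the output-side problem to stay linear, which is the source of the training-time savings the abstract refers to.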
Constructive neural networks; Network pruning; Training strategy; Regression models
Jifu Nong
College of Mathematics and Computer Science, Guangxi University for Nationalities, Nanning, China; Guangxi Key Laboratory of Hybrid Computation and IC Design Analysis, Nanning, China
International conference
Xi'an
English
1766-1770
2011-12-23 (date first posted on the Wanfang platform; not the paper's publication date)