Some Classical Constructive Neural Networks and their New Developments
Reviewing classical models helps us better understand new ones and supports further innovation. The mapping capability of artificial neural networks depends on their structure, i.e., the number of layers and the number of hidden units. At present there is no formal way to compute network topology as a function of problem complexity; it is usually selected by trial and error, which can be rather time consuming. Two basic mechanisms can modify the topology of a network: growth and pruning. This paper first discusses the learning algorithms and topologies of some classical constructive neural networks. Only incremental (growing) algorithms that employ supervised learning are outlined here, including the Tiling algorithm, the Tower algorithm, the Upstart algorithm, the Cascade-Correlation algorithm, the Restricted Coulomb Energy network, and the Resource-Allocating Network. For each neural network model, we review its topology and learning features. New developments in constructive neural networks are presented at the end of the paper.
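The growth mechanism mentioned above can be illustrated with a minimal sketch. This is not one of the algorithms reviewed in the paper (Tiling, Tower, Upstart, Cascade-Correlation, RCE, RAN); it is a generic constructive loop that adds one sigmoid hidden unit at a time and retrains until the training error is acceptable. The function name, thresholds, and training procedure are illustrative assumptions.

```python
import numpy as np

def train_growing_network(X, y, max_hidden=20, target_mse=0.05,
                          epochs_per_unit=200, lr=0.1, seed=0):
    """Grow a one-hidden-layer regression network by adding sigmoid units
    until the training MSE drops below target_mse (all thresholds here are
    illustrative, not taken from the reviewed algorithms)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_in = np.empty((0, d))        # hidden-unit input weights, grown row by row
    b_in = np.empty((0,))          # hidden-unit biases
    w_out = np.empty((0,))         # output weights, one per hidden unit
    b_out = 0.0
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(max_hidden):
        # Growth step: append one randomly initialised hidden unit.
        W_in = np.vstack([W_in, rng.normal(scale=0.5, size=(1, d))])
        b_in = np.append(b_in, 0.0)
        w_out = np.append(w_out, rng.normal(scale=0.5))

        # Retrain the whole (now larger) network by plain gradient descent on MSE.
        for _ in range(epochs_per_unit):
            H = sigmoid(X @ W_in.T + b_in)       # n x h hidden activations
            err = (H @ w_out + b_out) - y        # output error
            dH = np.outer(err, w_out) * H * (1 - H) / n
            W_in -= lr * (dH.T @ X)
            b_in -= lr * dH.sum(axis=0)
            w_out -= lr * (H.T @ err) / n
            b_out -= lr * err.mean()

        # Stop growing once the fit is good enough.
        H = sigmoid(X @ W_in.T + b_in)
        if np.mean((H @ w_out + b_out - y) ** 2) < target_mse:
            break
    return W_in, b_in, w_out, b_out

# Toy usage: fit a noisy 1-D target with as few hidden units as needed.
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = np.sin(3 * X[:, 0]) + 0.05 * np.random.default_rng(0).normal(size=100)
params = train_growing_network(X, y)
```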
constructive neural networks; incremental learning; discrete neural networks; continuous neural networks
Zhen Li, Guojian Cheng, Xinjian Qiang
School of Computer Science, Xi'an Shiyou University, Xi'an, Shaanxi, P.R. China
International Conference
2010 International Conference on Educational and Network Technology (ICENT 2010)
Qinhuangdao
English
174-178
2010-06-25 (date first posted on the Wanfang platform; not necessarily the paper's publication date)