Conference Paper

CHARACTERIZATION OF DEGREE OF APPROXIMATION FOR NEURAL NETWORKS WITH ONE HIDDEN LAYER

There have been various studies on the approximation ability of feedforward neural networks (FNNs). Most existing studies, however, are concerned only with density results or upper bound estimates on how well a function can be approximated by an FNN; consequently, the essential approximation ability of an FNN has not been revealed. In this paper, by establishing both upper and lower bound estimates on the degree of approximation, the essential approximation ability of a class of FNNs is characterized in terms of the modulus of smoothness of the functions to be approximated. The FNNs involved can not only approximate any continuous function arbitrarily well, but also provide an explicit lower bound on the number of hidden units required. Using approximation-theoretic tools, it is shown that when the functions to be approximated are Lipschitz, the approximation speed of the FNNs is determined by the modulus of smoothness of those functions.
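The abstract's setting can be illustrated numerically. The sketch below is not the paper's construction; it is a minimal assumed example: a one-hidden-layer network sum_k c_k * sigma(w_k x + b_k) with fixed ReLU units at evenly spaced knots, fitting the outer coefficients by least squares, to show the sup-norm error for a Lipschitz target such as f(x) = |x| shrinking as the number of hidden units grows.

```python
import numpy as np

def hidden_layer(x, n):
    """ReLU features sigma(x - t_k) with knots t_k evenly spaced in [-1, 1].

    The knot placement is an assumption for illustration, not the paper's choice.
    """
    knots = np.linspace(-1.0, 1.0, n)
    return np.maximum(x[:, None] - knots[None, :], 0.0)

def fit_and_error(f, n, m=400):
    """Fit outer weights by least squares on a grid; return the grid sup-norm error."""
    x = np.linspace(-1.0, 1.0, m)
    H = np.hstack([hidden_layer(x, n), np.ones((m, 1))])  # ReLU features plus a bias column
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)
    return np.max(np.abs(H @ c - f(x)))

if __name__ == "__main__":
    # f(x) = |x| is Lipschitz with constant 1; the error should decay as n grows.
    for n in (4, 8, 16, 32):
        print(n, fit_and_error(np.abs, n))
```

The decay rate observed here only illustrates the qualitative claim; the paper's bounds are stated via the modulus of smoothness rather than this particular basis.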

Neural networks; approximation error; approximation order

FEI-LONG CAO, ZONG-BEN XU, MAN-XI HE

Department of Information and Mathematics Sciences, College of Science, China Jiliang University, Hangzhou, China; Institute for Information and System Sciences, Faculty of Science, Xi'an Jiaotong University, Xi'an, China; Department of Information and Mathematics Sciences, College of Science, China Jiliang University, Hangzhou, China

International Conference

2006 International Conference on Machine Learning and Cybernetics (the 5th IEEE forum on machine learning and cybernetics)

Dalian

English

2944-2947

2006-08-13 (date the paper first appeared on the Wanfang platform; not necessarily the publication date)