Conference Paper

COMPUTATION OF TWO-LAYER PERCEPTRON NETWORKS SENSITIVITY TO INPUT PERTURBATION

The sensitivity of a neural network's output to perturbations of its input is an important measure for evaluating the network's performance. In this paper we propose a novel method to quantify the sensitivity of a Two-Layer Perceptron Network (TLPN). The sensitivity is defined as the mathematical expectation of the absolute output deviation caused by input perturbations, taken over all possible inputs. Our method follows a bottom-up approach: the sensitivity of a single neuron is derived first, and then that of the entire network. The main contribution of the method is that it requires only a weak assumption on the input, namely that its elements be independent and identically distributed, which makes it more practical for real applications. Experiments have been conducted, and the results demonstrate the high accuracy and efficiency of the method.
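The paper's analytical derivation is not reproduced in this record, but the sensitivity definition itself (the expectation of the absolute output deviation under input perturbation) can be illustrated with a brute-force Monte Carlo estimate. The sketch below uses a hypothetical TLPN with randomly initialized weights and assumes i.i.d. inputs uniform on [-1, 1] and i.i.d. Gaussian perturbations; all weights, distributions, and parameter choices are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer perceptron: one sigmoid hidden layer, one
# linear output unit. Weights are random and purely illustrative.
n_in, n_hidden = 5, 8
W1 = rng.normal(size=(n_hidden, n_in))
b1 = rng.normal(size=n_hidden)
W2 = rng.normal(size=n_hidden)
b2 = 0.1

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tlpn(X):
    """Batched TLPN output: X has shape (n_samples, n_in)."""
    return sigmoid(X @ W1.T + b1) @ W2 + b2

def sensitivity_mc(sigma, n_samples=100_000):
    """Monte Carlo estimate of E[|f(x + dx) - f(x)|], where the input x
    has i.i.d. elements (here uniform on [-1, 1]) and the perturbation
    dx has i.i.d. N(0, sigma^2) elements (an assumed setup)."""
    X = rng.uniform(-1.0, 1.0, size=(n_samples, n_in))
    dX = rng.normal(0.0, sigma, size=(n_samples, n_in))
    return float(np.mean(np.abs(tlpn(X + dX) - tlpn(X))))

# Expected deviation grows with the perturbation magnitude.
print(sensitivity_mc(0.01))
print(sensitivity_mc(0.1))
```

The point of the paper's method is to avoid exactly this kind of costly sampling by computing the expectation analytically (via the Central Limit Theorem, per the keywords); the Monte Carlo estimate above serves only as a reference for what is being quantified.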

Keywords: Sensitivity; Two-Layer Perceptron Network; Central Limit Theorem

JING YANG, XIAO-QIN ZENG, WING W. Y. NG, DANIEL S. YEUNG

Department of Computer Science and Engineering, Hohai University, Nanjing 210098, China; Media and Life Science Computing Laboratory, Shenzhen Graduate School, Harbin Institute of Technology

International Conference

2008 International Conference on Machine Learning and Cybernetics

Kunming

English

762-767

2008-07-12 (date the record first appeared on the Wanfang platform; not necessarily the paper's publication date)