Conference Paper

Risk-Averting Criteria for Training Neural Networks

This paper shows that when a risk-averting error criterion is used to train a neural network or estimate a nonlinear regression model, as the risk-sensitivity index of the criterion increases, the domain on which the criterion is convex expands monotonically to the entire weight or parameter vector space RN, except for the union of a finite number of manifolds whose dimensions are less than N. This paper also shows that increasing the risk-sensitivity index reduces the maximum deviation of the outputs of the trained neural network or estimated regression model from the corresponding output measurements.
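The behavior described in the abstract can be illustrated with a minimal sketch. In Lo's line of work, a risk-averting error criterion commonly takes the form J_λ(w) = (1/K) Σ_k exp(λ ‖y_k − f(x_k, w)‖²), where λ is the risk-sensitivity index; the exact formulation here is an assumption for illustration. The toy example below (fitting a single constant to three data points by grid search, with hypothetical variable names) shows the claimed effect: a larger λ drives the minimizer toward the minimax center and reduces the maximum deviation, relative to the near-least-squares solution obtained with a small λ.

```python
import numpy as np

def risk_averting_error(c, y, lam):
    # J_lambda(c) = mean over k of exp(lam * (y_k - c)^2)
    # For lam -> 0 this reduces (to first order) to the ordinary
    # sum-of-squares criterion; for large lam the largest deviation
    # dominates, so the minimizer approaches the minimax estimate.
    return np.mean(np.exp(lam * (y - c) ** 2))

# Toy data: least-squares fit is the mean (10/3); the minimax
# (maximum-deviation-minimizing) fit is the midrange, 5.0.
y = np.array([0.0, 0.0, 10.0])
grid = np.linspace(-2.0, 12.0, 14001)

def minimizer(lam):
    # Brute-force grid search keeps the sketch dependency-free.
    vals = [risk_averting_error(c, y, lam) for c in grid]
    return grid[int(np.argmin(vals))]

c_small = minimizer(1e-4)  # near the least-squares solution (~3.33)
c_large = minimizer(1.0)   # pulled toward the minimax center (~5.0)

max_dev_small = np.max(np.abs(y - c_small))
max_dev_large = np.max(np.abs(y - c_large))
```

Here `max_dev_large` comes out smaller than `max_dev_small`, matching the paper's claim that raising the risk-sensitivity index shrinks the worst-case output deviation.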

James Ting-Ho Lo

Department of Mathematics and Statistics, University of Maryland Baltimore County, Baltimore, MD 21228, U.S.A.

International Conference

8th International Conference on Neural Information Processing (ICONIP 2001)

Shanghai

English

514-519

2001-11-14 (date first posted on the Wanfang platform; not necessarily the paper's publication date)