
Performance of Activation Functions Used in Recurrent Neural Networks

Based on the conventional gradient algorithm, a type of gradient-based neural network (GNN) is developed and presented for the online solution of the constant Lyapunov matrix equation. To achieve superior convergence, such GNN models are improved and investigated by adding different types of activation functions. Both theoretical and simulative results substantiate the efficacy of the improved neural networks for solving the Lyapunov matrix equation.
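The abstract describes a gradient-based neural network whose state matrix evolves online toward the solution of a constant Lyapunov equation A^T X + X A + C = 0, with convergence shaped by the choice of activation function. The paper's exact model is not given here, so the following is only a minimal sketch: an Euler-discretized gradient flow on the squared Frobenius norm of the residual, with `tanh` standing in as one illustrative activation and the step size, gain `gamma`, and test matrices all chosen hypothetically.

```python
import numpy as np

def gnn_lyapunov(A, C, activation=np.tanh, gamma=1.0, h=0.01, steps=2000):
    """Sketch of a gradient neural network for A^T X + X A + C = 0.

    The residual E = A^T X + X A + C drives the discretized flow
        X <- X - h * gamma * (A @ act(E) + act(E) @ A.T),
    which is gradient descent on (1/2)||E||_F^2 when act is the identity;
    a monotone activation such as tanh reshapes the convergence behavior.
    """
    n = A.shape[0]
    X = np.zeros((n, n))  # start from the zero matrix
    for _ in range(steps):
        E = A.T @ X + X @ A + C
        X = X - h * gamma * (A @ activation(E) + activation(E) @ A.T)
    return X

# Hypothetical example: A is stable (eigenvalues -2, -3), so the
# Lyapunov equation with C = I has a unique symmetric solution.
A = np.array([[-2.0, 1.0], [0.0, -3.0]])
C = np.eye(2)
X = gnn_lyapunov(A, C)
residual = np.linalg.norm(A.T @ X + X @ A + C)
```

Starting from a symmetric initial state, every iterate stays symmetric (the update term is symmetric whenever E is), and the residual norm decays toward zero for a stable A.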

Recurrent neural networks; Lyapunov matrix equation; Activation function; Global exponential convergence

Chenfu Yi; Yuhuan Chen

School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou 341000, China; Center for Educational Technology, Gannan Normal University, Ganzhou 341000, China

International Conference

2011 3rd International Conference on Computer and Network Technology (ICCNT 2011)

Taiyuan

English

pp. 40-43

2011-02-26 (date first posted on the Wanfang platform; this does not represent the paper's publication date)