Conference paper

Local Self-Adaptation Mechanisms for Large-Scale Neural System Building

When integrating neural networks into large systems, dynamical stability and parameter settings are key issues, especially for popular recurrent network models such as dynamic neural fields. In neural circuits, homeostatic plasticity seems to counter these problems. Here we present a set of gradient adaptation rules that autonomously regulate the strength of synaptic input and the parameters of the transfer function for each neuron individually. In this way, we actively maintain the average membrane potentials and firing rates, as well as the variances of the firing rate, at specified levels. A focus of this contribution lies on clarifying at which time scales these mechanisms should operate. The benefits of such self-adaptation are a significant reduction in free parameters and the possibility of connecting a neural field to almost arbitrary inputs, since dynamical stability is actively maintained. We consider these two properties crucial, since they will significantly facilitate the construction of large neural systems.
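The regulation scheme described in the abstract can be illustrated for a single neuron. The following sketch is hypothetical and not taken from the paper: the set points, learning rates, and sigmoid transfer function are all assumed. It shows the general idea of slow gradient-style updates in which the input gain tracks a target mean membrane potential, the transfer-function threshold tracks a target mean firing rate, and the transfer-function slope tracks a target firing-rate variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical homeostatic set points (illustrative values, not from the paper).
U_TARGET = 0.5      # desired mean membrane potential
R_TARGET = 0.2      # desired mean firing rate
VAR_TARGET = 0.01   # desired firing-rate variance

# Adaptation rates: deliberately slow compared to the input dynamics.
ETA_GAIN, ETA_THETA, ETA_SLOPE = 1e-3, 5e-3, 5e-2

gain, theta, slope = 1.0, 0.0, 1.0            # per-neuron parameters being regulated
u_mean, r_mean, r_var = 0.0, R_TARGET, VAR_TARGET
TAU = 0.99                                    # running-average time constant

def transfer(u, theta, slope):
    """Sigmoid transfer function with adaptable threshold and slope."""
    return 1.0 / (1.0 + np.exp(-slope * (u - theta)))

for t in range(20000):
    x = rng.normal(0.5, 0.3)       # one sample of external input
    u = gain * x                   # membrane potential after input scaling
    r = transfer(u, theta, slope)  # firing rate

    # Slow running estimates of the activity statistics.
    u_mean = TAU * u_mean + (1 - TAU) * u
    r_mean = TAU * r_mean + (1 - TAU) * r
    r_var = TAU * r_var + (1 - TAU) * (r - r_mean) ** 2

    # Gradient-style updates pulling each statistic to its set point:
    # the input gain regulates the mean potential, the threshold the
    # mean rate, and the slope the rate variance.
    gain -= ETA_GAIN * (u_mean - U_TARGET) * x
    theta += ETA_THETA * (r_mean - R_TARGET)
    slope -= ETA_SLOPE * (r_var - VAR_TARGET)
```

Because the adaptation rates are small relative to the input fluctuations, the statistics settle near their set points regardless of the input's scale, which is the sense in which such a unit can be connected to almost arbitrary inputs.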

Neural fields; Self-adaptation; Dynamic stability

M. Garcia Ortiz, A. Gepperth

Honda Research Institute Europe GmbH, 63073 Offenbach, Germany

International conference

The Second International Conference on Cognitive Neurodynamics, 2009

Hangzhou

English

543-552

2009-11-15 (date the record first appeared on the Wanfang platform; not necessarily the paper's publication date)