Conference Topic

Learning Rates comparison in Backpropagation method for Interval Type-2 Fuzzy Neural Networks

The Fuzzy Neural Network (FNN) is a hybrid intelligent system that performs well on complicated nonlinear systems involving both linguistic information and numerical data simultaneously. The Type-2 Fuzzy Neural Network (T2FNN) has the same structure as the FNN and is appropriate for systems with a high level of uncertainty. To reduce the computational complexity of the type-reduction process, the Interval T2FNN (IT2FNN) was developed; however, it still carries a large computational load. In this paper, a dynamical optimal learning rate is compared with a constant learning rate in the backpropagation (BP) method. Simulation results show that BP convergence when a constant learning rate is used to learn all parameters of the IT2FNN is 10 times faster than convergence when a dynamical optimal learning rate is used to learn only the weighting factors. Both approaches have the same computational complexity. Moreover, when all parameters are learned, there is no need to know the initial information of the system.
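The abstract contrasts a constant learning rate with a dynamical optimal (per-iteration) learning rate in BP training. The following Python sketch is a hypothetical illustration of that distinction only, using a one-parameter least-squares toy model rather than the paper's IT2FNN update equations; all function names and the choice of the per-step optimal rate (exact line search for a quadratic loss) are assumptions for illustration.

```python
import numpy as np

# Toy linear model y = w * x fitted by gradient descent on squared error.
# Hypothetical illustration only: it contrasts a constant learning rate
# with a per-step "optimal" learning rate (exact line search for this
# quadratic loss); it is NOT the paper's IT2FNN update rule.

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 3.0 * x + 0.05 * rng.normal(size=200)   # ground-truth weight = 3.0

def loss(w):
    return 0.5 * np.mean((w * x - y) ** 2)

def grad(w):
    return np.mean((w * x - y) * x)

def train_constant(eta, steps=50, w0=0.0):
    """BP-style update with a fixed learning rate eta."""
    w = w0
    for _ in range(steps):
        w -= eta * grad(w)
    return w, loss(w)

def train_dynamic(steps=50, w0=0.0):
    """Recompute the step size every iteration; for this 1-D quadratic
    loss the exact line-search step is 1 / mean(x**2)."""
    w = w0
    for _ in range(steps):
        eta_t = 1.0 / np.mean(x ** 2)    # per-step 'optimal' rate for this loss
        w -= eta_t * grad(w)
    return w, loss(w)

print("constant eta=0.5 :", train_constant(0.5))
print("dynamic  eta_t   :", train_dynamic())
```

In this toy setting the dynamic rate converges in essentially one step, but it must be recomputed each iteration; a fixed rate avoids that extra per-step work, which mirrors the trade-off the abstract discusses.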

Interval Type-2 Fuzzy Neural Network; Backpropagation; learning rate; computational load; convergence speed

Shahrzad Attarzadeh, Maryam Mahmoodi

Department of Computer Engineering, Islamic Azad University of Meymeh, Meymeh, Iran

International Conference

2011 3rd International Conference on Computer and Automation Engineering (ICCAE 2011)

Chongqing, China

English

34-38

2011-01-21 (date first posted on the Wanfang platform; this does not represent the paper's publication date)