[Journal Article]


Fast learning algorithm to improve performance of Quickprop

Authors:
Chi-Chung Cheung; Sin-Chun Ng

Year of publication: 2012

Pages: 678 - 678
Publisher: Institution of Engineering and Technology (IET)


Abstract:

Quickprop is one of the most popular fast learning algorithms for training feed-forward neural networks. Its learning rate is fast; however, it is still limited by the gradient of the backpropagation algorithm and it is easily trapped in a local minimum. A new fast learning algorithm is proposed to overcome these two drawbacks. Performance investigations on different learning problems (applications) show that the new algorithm always converges at a faster learning rate than Quickprop and other fast learning algorithms. The improvement in global convergence capability is especially large: in one learning problem the global convergence rate increased from 4% to 100%.
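For context, the baseline Quickprop rule referenced in the abstract derives each weight's step from a secant (parabolic) approximation of the error surface built from the current and previous gradients, with a growth limit on the step size. The sketch below illustrates that classic rule only, not the authors' proposed algorithm; the function name, the growth factor mu = 1.75, and the fallback learning rate are illustrative assumptions.

    import numpy as np

    def quickprop_step(grad, prev_grad, prev_step, lr=0.01, mu=1.75):
        """One elementwise Quickprop weight update (classic secant rule).

        grad      : current gradient dE/dw
        prev_grad : gradient from the previous epoch
        prev_step : weight change applied in the previous epoch
        lr        : fallback learning rate for plain gradient-descent steps
        mu        : growth factor capping a step at mu * |previous step|
        """
        # Secant step: fit a parabola through the last two gradient
        # measurements along each weight and jump to its estimated minimum.
        denom = prev_grad - grad
        safe_denom = np.where(np.abs(denom) > 1e-12, denom, 1.0)
        quad_step = np.where(np.abs(denom) > 1e-12,
                             grad / safe_denom * prev_step, 0.0)

        # Growth limit: never let a step exceed mu times the previous step.
        limit = mu * np.abs(prev_step)
        quad_step = np.clip(quad_step, -limit, limit)

        # Where the previous step was (near) zero, fall back to gradient
        # descent so learning can restart from a stalled weight.
        return np.where(np.abs(prev_step) > 1e-12, quad_step, -lr * grad)

In use, the step returned for each epoch is added to the weights, and the current gradient and step are kept for the next epoch's secant estimate; the proposed algorithm in the article modifies this scheme to improve convergence speed and escape local minima.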



Keywords:

backpropagation; convergence; feedforward neural nets; gradient methods; minimisation; backpropagation algorithm gradient; fast learning algorithm; feedforward neural network training; global convergence; learning rate; local minimum; quickprop performance improvement


Journal:
Electronics Letters
ISSN: 0013-5194
Source: Institution of Engineering and Technology (IET)