TY - GEN
T1 - Generalized back-propagation algorithm for faster convergence
AU - Ng, S. C.
AU - Leung, S. H.
AU - Luk, A.
PY - 1996
Y1 - 1996
N2 - The conventional back-propagation algorithm is basically a gradient-descent method; it suffers from the problems of local minima and slow convergence. This paper introduces a new generalized back-propagation algorithm that can effectively speed up the convergence rate and reduce the chance of being trapped in local minima. The new algorithm changes the derivative of the activation function so as to magnify the backward-propagated error signal; thus the convergence rate can be accelerated and local minima can be escaped.
UR - http://www.scopus.com/inward/record.url?scp=0029749567&partnerID=8YFLogxK
M3 - RGC 32 - Refereed conference paper (with host publication)
VL - 1
SP - 409
EP - 413
BT - IEEE International Conference on Neural Networks - Conference Proceedings
PB - IEEE
T2 - Proceedings of the 1996 IEEE International Conference on Neural Networks, ICNN. Part 1 (of 4)
Y2 - 3 June 1996 through 6 June 1996
ER -