TY - GEN
T1 - Convergence of the generalized back-propagation algorithm with constant learning rates
AU - Ng, S. C.
AU - Leung, S. H.
AU - Luk, A.
PY - 1998/5
Y1 - 1998/5
N2 - A new generalized back-propagation algorithm, which can effectively speed up the convergence rate and reduce the chance of being trapped in local minima, was recently introduced in [1]. In this paper, we analyze the convergence of the generalized back-propagation algorithm. The weight sequences in the generalized back-propagation algorithm can be approximated by a certain ordinary differential equation (ODE). When the learning rate tends to zero, the interpolated weight sequences of the generalized back-propagation algorithm converge weakly to the solution of the associated ODE.
AB - A new generalized back-propagation algorithm, which can effectively speed up the convergence rate and reduce the chance of being trapped in local minima, was recently introduced in [1]. In this paper, we analyze the convergence of the generalized back-propagation algorithm. The weight sequences in the generalized back-propagation algorithm can be approximated by a certain ordinary differential equation (ODE). When the learning rate tends to zero, the interpolated weight sequences of the generalized back-propagation algorithm converge weakly to the solution of the associated ODE.
UR - http://www.scopus.com/inward/record.url?scp=0031619046&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.1998.685924
DO - 10.1109/IJCNN.1998.685924
M3 - RGC 32 - Refereed conference paper (with host publication)
SN - 0-7803-4859-1
VL - 2
SP - 1090
EP - 1094
BT - The 1998 IEEE International Joint Conference on Neural Networks Proceedings
PB - IEEE
T2 - 1998 IEEE International Joint Conference on Neural Networks (IJCNN 1998)
Y2 - 4 May 1998 through 9 May 1998
ER -