Extended least squares based algorithm for training feedforward networks

Research output: Journal Publications and Reviews

Detail(s)

Original language: English
Pages (from-to): 806-810
Journal / Publication: IEEE Transactions on Neural Networks
Volume: 8
Issue number: 3
Publication status: Published - 1997

Abstract

An extended least squares-based algorithm for training feedforward networks is proposed. The weights connecting the last hidden layer and the output layer are first evaluated by a least squares algorithm. The weights between the input and hidden layers are then evaluated using a modified gradient descent algorithm. This arrangement eliminates the stalling problem experienced by pure least squares type algorithms while maintaining their characteristic fast convergence. In the investigated problems, the total number of flops required for the networks to converge using the proposed training algorithm is only 0.221%-16.0% of that required by the Levenberg-Marquardt algorithm. The number of floating-point operations per iteration of the proposed algorithm is only 1.517-3.521 times that of the standard backpropagation algorithm. © 1997 IEEE.
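The abstract does not give implementation details, but the two-stage scheme it describes can be sketched in code. The NumPy snippet below is a minimal illustration, assuming a single hidden layer with tanh activations, a linear output layer, and a squared-error cost; the paper's "modified" gradient descent step is replaced here by plain gradient descent, and all names (W1, W2, lr, the toy data) are illustrative assumptions rather than the authors' code.

```python
# Minimal sketch of a hybrid least-squares / gradient-descent training loop
# for a one-hidden-layer network (assumed architecture, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: X is (n_samples, n_in), Y is (n_samples, n_out).
X = rng.normal(size=(200, 4))
Y = np.sin(X @ rng.normal(size=(4, 1)))

n_hidden, lr = 10, 0.05
W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))  # input -> hidden
b1 = np.zeros(n_hidden)

for epoch in range(100):
    H = np.tanh(X @ W1 + b1)                    # hidden activations
    Hb = np.hstack([H, np.ones((len(H), 1))])   # append bias column

    # Stage 1: hidden-to-output weights solved exactly by linear least squares.
    W2b, *_ = np.linalg.lstsq(Hb, Y, rcond=None)
    W2, b2 = W2b[:-1], W2b[-1]

    # Stage 2: input-to-hidden weights updated by gradient descent on the
    # squared error, with the output weights held at their least-squares values.
    E = Hb @ W2b - Y                  # residual, shape (n_samples, n_out)
    dH = (E @ W2.T) * (1.0 - H**2)    # backpropagate through tanh
    W1 -= lr * (X.T @ dH) / len(X)
    b1 -= lr * dH.mean(axis=0)

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2)
print(f"final training MSE: {mse:.4f}")
```

Because the output-layer weights are optimal in the least-squares sense at every iteration, gradient descent only has to shape the hidden representation, which is consistent with the fast convergence the abstract reports.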