Fast and global convergent weight evolution algorithm based on modified back-propagation

S. C. Ng, S. H. Leung*, A. Luk

*Corresponding author for this work

Research output: RGC 32 - Refereed conference paper (with host publication), peer-reviewed

9 Citations (Scopus)

Abstract

The conventional back-propagation algorithm is basically a gradient-descent method; it suffers from the problems of local minima and slow convergence. A new weight evolution algorithm based on modified back-propagation can effectively avoid local minima and speed up the convergence rate. The idea is to perturb the weights linked with the output neurons whose squared errors are above average, so that local minima can be escaped. The modified back-propagation changes the partial derivative of the activation function so as to magnify the backward-propagated error signal, thereby accelerating the convergence rate.
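The paper itself gives no pseudocode here; the following is a minimal sketch of the two ideas in the abstract, assuming a single-layer set of output weights with sigmoid activations. The derivative offset value, the perturbation scale, and the function names are illustrative assumptions, not figures or identifiers from the paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def modified_sigmoid_derivative(y, offset=0.1):
    # Standard sigmoid derivative y*(1 - y) plus a small offset so the
    # backward-propagated error signal is magnified and does not vanish
    # when an output saturates near 0 or 1.  The value 0.1 is an
    # illustrative assumption, not taken from the paper.
    return y * (1.0 - y) + offset

def perturb_above_average_weights(w_out, output_errors, scale=0.05, rng=None):
    # Perturb only the weights feeding output neurons whose squared error
    # is above the average squared error, so the search can jump out of a
    # local minimum.  `scale` is an assumed perturbation magnitude.
    # w_out has shape (n_hidden, n_outputs); output_errors has shape (n_outputs,).
    rng = np.random.default_rng() if rng is None else rng
    sq_err = output_errors ** 2
    mask = sq_err > sq_err.mean()
    noise = rng.uniform(-scale, scale, size=w_out.shape)
    w_out[:, mask] += noise[:, mask]
    return w_out

In a full training loop, the modified derivative would replace the standard one in the delta computation, and the perturbation routine would be invoked when the total error stalls.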
Original language: English
Title of host publication: Proceedings of ICNN'95 - International Conference on Neural Networks
Publisher: IEEE
Pages: 3004-3008
Volume: 6
ISBN (Print): 0-7803-2768-3
DOIs
Publication status: Published - Nov 1995
Event: 1995 IEEE International Conference on Neural Networks (ICNN'95) - Perth, Australia
Duration: 27 Nov 1995 - 1 Dec 1995

Conference

Conference: 1995 IEEE International Conference on Neural Networks (ICNN'95)
Place: Australia
City: Perth
Period: 27/11/95 - 1/12/95
