Abstract
The conventional back-propagation algorithm is essentially a gradient-descent method and therefore suffers from local minima and slow convergence. A new weight evolution algorithm based on modified back-propagation can effectively avoid local minima and speed up convergence. The idea is to perturb the weights linked to output neurons whose squared errors are above average, allowing the network to escape local minima. The modified back-propagation changes the partial derivative of the activation function so as to magnify the backward-propagated error signal, thereby accelerating the convergence rate.
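For concreteness, here is a minimal NumPy sketch of the two ideas the abstract describes: perturbing the weights of output neurons whose squared error is above average, and magnifying the back-propagated error signal by altering the activation-function derivative. All names, shapes, and the additive-constant derivative modification are illustrative assumptions; the paper's exact formulas are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def modified_deriv(y, offset=0.1):
    """Sigmoid derivative y*(1 - y) plus a constant offset.

    The offset keeps the backward-propagated error from vanishing on
    the flat regions of the sigmoid. The exact modification used in
    the paper is not stated in the abstract, so this additive constant
    is an assumption.
    """
    return y * (1.0 - y) + offset

def evolve_output_weights(W_out, err, scale=0.01):
    """Weight-evolution step: perturb only the weights feeding output
    neurons whose squared error is above the average squared error,
    so the network can jump out of a local minimum."""
    sq = err ** 2                        # per-output squared errors
    mask = sq > sq.mean()                # above-average outputs
    W_out = W_out.copy()
    W_out[mask] += rng.normal(0.0, scale, W_out.shape)[mask]
    return W_out

# Toy forward pass: 3 hidden units, 2 output neurons (hypothetical sizes).
W_out = rng.normal(0.0, 0.5, (2, 3))     # output-layer weights
h = np.array([0.2, 0.7, 0.5])            # hidden-layer activations
y = sigmoid(W_out @ h)                   # network outputs
t = np.array([1.0, 0.0])                 # targets
err = t - y                              # per-output errors

# Backward error signal with the magnified derivative ...
delta = err * modified_deriv(y)
# ... and the weight-evolution perturbation applied when training stalls.
W_out = evolve_output_weights(W_out, err)
```

In practice the perturbation step would be invoked only when the error stops decreasing, while the magnified derivative is used on every backward pass.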
| Original language | English |
|---|---|
| Title of host publication | Proceedings of ICNN'95 - International Conference on Neural Networks |
| Publisher | IEEE |
| Pages | 3004-3008 |
| Volume | 6 |
| ISBN (Print) | 0-7803-2768-3 |
| Publication status | Published - Nov 1995 |
| Event | 1995 IEEE International Conference on Neural Networks (ICNN'95), Perth, Australia, 27 Nov 1995 → 1 Dec 1995 |
Conference
| Conference | 1995 IEEE International Conference on Neural Networks (ICNN'95) |
|---|---|
| Country | Australia |
| City | Perth |
| Period | 27/11/95 → 1/12/95 |