Decouple implementation of weight decay for recursive least square

Andrew Chi-Sing Leung, Yi Xiao, Yong Xu, Kwok-Wo Wong

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

1 Citation (Scopus)

Abstract

In the conventional recursive least square (RLS) algorithm for multilayer feedforward neural networks, controlling the initial error covariance matrix can limit the magnitude of the weights. However, this weight decay effect decreases linearly as the number of learning epochs increases. Although we can modify the original RLS algorithm to maintain a constant weight decay effect, the computational and space complexities of the modified algorithm are very high. This paper first presents a set of more compact RLS equations for the modified algorithm. Afterwards, to reduce the computational and space complexities, we propose a decoupled version of the algorithm. The effectiveness of this decoupled algorithm is demonstrated by computer simulations. © 2012 Springer-Verlag London Limited.
Original language: English
Pages (from-to): 1709-1716
Journal: Neural Computing and Applications
Volume: 21
Issue number: 7
DOIs
Publication status: Published - Oct 2012

Research Keywords

  • Recursive least square
  • Regularization
  • Weight decay

