Noise/fault aware regularization for incremental learning in extreme learning machines
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review
Author(s)
- Wong, Hiu-Tung
- Leung, Ho-Chun
- Leung, Chi-Sing
- Wong, Eric
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 200-214 |
| Number of pages | 15 |
| Journal / Publication | Neurocomputing |
| Volume | 486 |
| Online published | 26 Nov 2021 |
| Publication status | Published - 14 May 2022 |
Abstract
This paper investigates noise/fault-tolerant incremental algorithms for the extreme learning machine (ELM) concept. Existing incremental ELM algorithms fall into two approaches: non-recomputation and recomputation. This paper first formulates a noise/fault-aware objective function for nonlinear regression problems. Instead of developing noise/fault-aware algorithms for the two computational approaches one by one, this paper uses two representative incremental algorithms, namely the incremental ELM (I-ELM) and the error-minimized ELM (EM-ELM), to develop two noise/fault-aware incremental algorithms, called the generalized I-ELM (GI-ELM) and the generalized EM-ELM (GEM-ELM). The GI-ELM adds k hidden nodes to the existing network at each incremental step without recomputing the existing weights. For a fair comparison, we consider a modified version of I-ELM as a comparison algorithm. The simulations demonstrate that the noise/fault tolerance of the proposed GI-ELM is better than that of the modified I-ELM. In the GEM-ELM, k hidden nodes are added to the existing network at each incremental step, and all output weights are recomputed based on a recursive formula. We also consider a modified version of EM-ELM as a comparison algorithm. The simulations demonstrate that the noise/fault tolerance of the proposed GEM-ELM is better than that of the modified EM-ELM. Moreover, we demonstrate that the multiple-set concept can further enhance the performance of the two proposed algorithms. Following our results, other non-noise/fault-tolerant incremental algorithms can be made noise/fault tolerant.
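To make the two growth schemes concrete, below is a minimal sketch in Python/NumPy of the plain (non-noise/fault-aware) baselines the abstract builds on: an I-ELM-style step that adds k random nodes and fits only their output weights to the current residual, and an EM-ELM-style step that appends nodes and recomputes all output weights through the recursive block-pseudoinverse update. The function names, the sigmoid activation, and the toy data are illustrative assumptions; the proposed GI-ELM and GEM-ELM replace the plain least-squares objective with the paper's noise/fault-aware one, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden_output(X, W, b):
    """Random hidden layer with sigmoid activation: g(X W + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def ielm_step(X, residual, k=1):
    """I-ELM-style growth (non-recomputation): draw k random nodes and
    fit only their output weights to the current residual by least
    squares; existing output weights are left untouched."""
    W = rng.standard_normal((X.shape[1], k))
    b = rng.standard_normal(k)
    H = hidden_output(X, W, b)                      # N x k
    beta, *_ = np.linalg.lstsq(H, residual, rcond=None)
    return (W, b, beta), residual - H @ beta        # updated residual

def em_elm_grow(H, H_pinv, dH, y):
    """EM-ELM-style growth (recomputation): append columns dH, update
    the pseudoinverse with the block formula, then recompute ALL
    output weights as beta = pinv([H, dH]) y."""
    D = np.linalg.pinv(dH - H @ (H_pinv @ dH))      # pinv((I - H H^+) dH)
    U = H_pinv - (H_pinv @ dH) @ D
    H_new = np.hstack([H, dH])
    H_pinv_new = np.vstack([U, D])
    return H_new, H_pinv_new, H_pinv_new @ y

# Toy usage on a 1-D regression target (illustrative data, not the
# paper's benchmarks).
X = rng.standard_normal((200, 3))
y = np.sin(X[:, 0])

residual = y.copy()
for _ in range(10):                                 # ten steps, 2 nodes each
    _, residual = ielm_step(X, residual, k=2)
print("I-ELM residual norm:", np.linalg.norm(residual))

H = hidden_output(X, rng.standard_normal((3, 2)), rng.standard_normal(2))
H_pinv = np.linalg.pinv(H)
for _ in range(10):
    dH = hidden_output(X, rng.standard_normal((3, 2)), rng.standard_normal(2))
    H, H_pinv, beta = em_elm_grow(H, H_pinv, dH, y)
print("EM-ELM training error:", np.linalg.norm(H @ beta - y))
```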
Research Area(s)
- Fault tolerance, Flat structural neural networks, Noise resistance, Random node
Citation Format(s)
Noise/fault aware regularization for incremental learning in extreme learning machines. / Wong, Hiu-Tung; Leung, Ho-Chun; Leung, Chi-Sing; Wong, Eric.
In: Neurocomputing, Vol. 486, 14.05.2022, p. 200-214.