TY - GEN
T1 - Noise resistant training for extreme learning machine
AU - Lui, Yik Lam
AU - Wong, Hiu Tung
AU - Leung, Chi-Sing
AU - Kwong, Sam
N1 - Full text of this publication does not contain sufficient affiliation information. With consent from the author(s) concerned, the Research Unit(s) information for this record is based on the existing academic department affiliation of the author(s).
PY - 2017
Y1 - 2017
N2 - The extreme learning machine (ELM) concept provides some effective training algorithms to construct single hidden layer feedforward networks (SHLFNs). However, the conventional ELM algorithms were designed for the noiseless situation only, in which the outputs of the hidden nodes are not contaminated by noise. This paper presents two noise-resistant training algorithms, namely noise-resistant incremental ELM (NRI-ELM) and noise-resistant convex incremental ELM (NRCI-ELM). The noise-resistant ability of NRI-ELM is better than that of the conventional incremental ELM algorithms. To further enhance the noise-resistant ability, the NRCI-ELM algorithm is proposed. The convergence properties of the two proposed noise-resistant algorithms are also presented.
AB - The extreme learning machine (ELM) concept provides some effective training algorithms to construct single hidden layer feedforward networks (SHLFNs). However, the conventional ELM algorithms were designed for the noiseless situation only, in which the outputs of the hidden nodes are not contaminated by noise. This paper presents two noise-resistant training algorithms, namely noise-resistant incremental ELM (NRI-ELM) and noise-resistant convex incremental ELM (NRCI-ELM). The noise-resistant ability of NRI-ELM is better than that of the conventional incremental ELM algorithms. To further enhance the noise-resistant ability, the NRCI-ELM algorithm is proposed. The convergence properties of the two proposed noise-resistant algorithms are also presented.
KW - Node noise
KW - Extreme learning machines
KW - Incremental algorithm
UR - http://www.scopus.com/inward/record.url?scp=85021733548&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85021733548&origin=recordpage
U2 - 10.1007/978-3-319-59081-3_31
DO - 10.1007/978-3-319-59081-3_31
M3 - RGC 32 - Refereed conference paper (with host publication)
SN - 9783319590806
T3 - Lecture Notes in Computer Science
SP - 257
EP - 265
BT - Advances in Neural Networks - ISNN 2017
PB - Springer Nature
T2 - 14th International Symposium on Neural Networks (ISNN 2017)
Y2 - 21 June 2017 through 26 June 2017
ER -