Fault-tolerant incremental learning for extreme learning machines

Research output: RGC 12 - Chapter in an edited book (Author), peer-reviewed

2 Citations (Scopus)

Abstract

The extreme learning machine (ELM) framework provides an efficient way to construct single-hidden-layer feedforward networks (SLFNs). Its key idea is that the input weights and bias terms of the hidden nodes are chosen at random, so that only the output weights of the hidden nodes need to be adjusted during training. The existing incremental learning algorithms for ELMs, namely the incremental ELM (I-ELM) and the convex I-ELM (CI-ELM), cannot handle fault situations such as weight noise. This paper proposes two fault-tolerant incremental ELM algorithms: the fault-tolerant I-ELM (FTI-ELM) and the fault-tolerant CI-ELM (FTCI-ELM). The FTI-ELM tunes only the output weight of the newly added node to minimize the training set error of faulty networks, keeping all previously learned weights unchanged; its fault-tolerant performance is better than that of I-ELM and CI-ELM. To improve performance further, the FTCI-ELM tunes the output weight of the newly added node and, in addition, uses a simple scheme to modify the existing output weights so as to maximize the reduction in the training set error of faulty networks.
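
The paper's exact objective and update rules are in the full text; the sketch below is only a rough illustration of the FTI-ELM idea. It runs an I-ELM-style incremental loop in which each new node's output weight minimizes the expected training error under an assumed multiplicative weight-noise model of variance sigma2 on the output weights, which turns the usual I-ELM step beta = <e, h> / ||h||^2 into beta = <e, h> / ((1 + sigma2) ||h||^2). The function names, the noise model, and this closed form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fti_elm(X, y, n_nodes=50, sigma2=0.01, rng=None):
    """Minimal sketch of a fault-tolerant incremental ELM.

    Adds sigmoid hidden nodes one at a time with random input weights
    and biases. Each new node's output weight minimizes the expected
    residual error E||e - beta*(1 + noise)*h||^2 under an assumed
    zero-mean multiplicative weight noise of variance sigma2 on the
    output weight; previously learned weights are left unchanged.
    """
    rng = np.random.default_rng(rng)
    n_samples, n_features = X.shape
    W, b, beta = [], [], []
    residual = y.astype(float).copy()              # e_0 = y
    for _ in range(n_nodes):
        w_k = rng.standard_normal(n_features)      # random input weights
        b_k = rng.standard_normal()                # random input bias
        h = 1.0 / (1.0 + np.exp(-(X @ w_k + b_k)))  # node activations
        # Minimizer of ||e - beta*h||^2 + beta^2 * sigma2 * ||h||^2:
        beta_k = (residual @ h) / ((1.0 + sigma2) * (h @ h))
        residual -= beta_k * h                     # update training residual
        W.append(w_k); b.append(b_k); beta.append(beta_k)
    return np.array(W), np.array(b), np.array(beta)

def predict(X, W, b, beta):
    """Evaluate the trained SLFN on new inputs."""
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))       # hidden-layer outputs
    return H @ beta

# Toy usage: fit the sinc function with weight-noise-aware training.
X = np.linspace(-5, 5, 200).reshape(-1, 1)
y = np.sinc(X).ravel()
W, b, beta = fti_elm(X, y, n_nodes=100, sigma2=0.01, rng=0)
print(np.mean((predict(X, W, b, beta) - y) ** 2))
```

Setting sigma2 = 0 recovers the plain I-ELM step, which makes the trade-off explicit: a larger sigma2 shrinks each new output weight, sacrificing some fault-free accuracy for robustness to weight noise.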
Original language: English
Title of host publication: Neural Information Processing
Subtitle of host publication: 23rd International Conference, ICONIP 2016, Proceedings
Editors: Seiichi Ozawa, Kazushi Ikeda, Derong Liu, Akira Hirose, Kenji Doya, Minho Lee
Publisher: Springer Verlag
Pages: 168-176
Volume: 9948 LNCS
ISBN (Print): 9783319466712
DOIs
Publication status: Published - Oct 2016
Event: 23rd International Conference on Neural Information Processing, ICONIP 2016 - Kyoto, Japan
Duration: 16 Oct 2016 - 21 Oct 2016

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9948 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 23rd International Conference on Neural Information Processing, ICONIP 2016
Place: Japan
City: Kyoto
Period: 16/10/16 - 21/10/16

Research Keywords

  • Extreme learning machines
  • Fault tolerance
  • Single hidden layer network
  • Weight noise
