An ℓ0-Norm-Based Centers Selection for Failure Tolerant RBF Networks

Research output: Journal Publications and Reviews › Publication in refereed journal › peer-review



Detail(s)

Original language: English
Pages (from-to): 151902-151914
Journal / Publication: IEEE Access
Volume: 7
Online published: 7 Oct 2019
Publication status: Published - 2019


Abstract

There are two important issues in the construction of a radial basis function (RBF) neural network. The first is selecting suitable RBF centers. The second is ensuring that the resultant RBF network has good fault tolerance. This paper proposes an algorithm that selects RBF centers and trains fault tolerant RBF networks simultaneously. The proposed algorithm borrows the concept of sparse approximation. In our formulation, we first define a fault tolerant objective function based on all input vectors from the training samples. We then introduce the minimax concave penalty (MCP) function, an approximation of the ℓ0-norm, into the objective function. The MCP term forces some unimportant RBF weights to zero, so the RBF node selection process is achieved during training. Because the MCP function is nondifferentiable and nonconvex, traditional gradient descent based algorithms are unable to minimize the modified objective function. Based on the alternating direction method of multipliers (ADMM) framework, we develop an algorithm, called ADMM-MCP, to minimize the modified objective function. A convergence proof of the proposed ADMM-MCP algorithm is also presented. Simulation results show that the proposed ADMM-MCP algorithm is superior to many existing center selection algorithms under the concurrent fault situation.
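The paper's full objective function and ADMM updates are not reproduced in the abstract. As a minimal sketch, assuming the standard two-parameter form of the MCP with regularization weight λ (`lam`) and concavity parameter γ > 1 (`gamma`), the penalty and its proximal operator (the kind of building block an ADMM weight update would apply elementwise) can be written as follows; the function names are illustrative, not the paper's:

```python
import numpy as np

def mcp_penalty(w, lam=1.0, gamma=3.0):
    """Minimax concave penalty (MCP), a piecewise approximation of the l0-norm.

    For |w| <= gamma*lam the penalty grows as lam*|w| - w^2/(2*gamma);
    beyond that it saturates at the constant gamma*lam^2/2, which is what
    makes it behave like a scaled l0 penalty for large weights.
    """
    a = np.abs(np.asarray(w, dtype=float))
    return np.where(a <= gamma * lam,
                    lam * a - a**2 / (2.0 * gamma),
                    0.5 * gamma * lam**2)

def mcp_prox(v, lam=1.0, gamma=3.0):
    """Proximal operator of the MCP (unit step size, requires gamma > 1).

    Small inputs are soft-thresholded and rescaled, driving unimportant
    weights exactly to zero; large inputs pass through unchanged.
    """
    v = np.asarray(v, dtype=float)
    small = np.abs(v) <= gamma * lam
    soft = np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
    return np.where(small, soft / (1.0 - 1.0 / gamma), v)
```

In an ADMM iteration of the kind the abstract describes, applying such a prox to the RBF weight vector maps small weights exactly to zero, and the corresponding RBF centers are pruned, which is how center selection can happen during training.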

Research Area(s)

  • Training, Linear programming, Radial basis function networks, Approximation algorithms, Fault tolerance, Fault tolerant systems, Convex functions, Failure tolerant, RBF, center selection, ADMM, ℓ0-norm, global convergence, FAULT-TOLERANCE, NEURAL-NETWORKS, DESIGN, ALGORITHMS, REGRESSION, CONVERGENCE, REGULARIZER
