TY - JOUR
T1 - Global stability of a class of continuous-time recurrent neural networks
AU - Hu, Sanqing
AU - Wang, Jun
PY - 2002/9
Y1 - 2002/9
N2 - This paper investigates global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for existence and uniqueness of equilibrium of the neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions to ascertain the GAS of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. We then give two GES conditions for the neural networks whose activation functions may not be monotone nondecreasing. We also provide a Lyapunov diagonal stability condition, without the nonsingularity requirement for the connection weight matrices, to ascertain the GES of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. This Lyapunov diagonal stability condition generalizes and unifies many of the existing GAS and GES results. Moreover, two higher exponential convergence rates are estimated.
AB - This paper investigates global asymptotic stability (GAS) and global exponential stability (GES) of a class of continuous-time recurrent neural networks. First, we introduce a necessary and sufficient condition for existence and uniqueness of equilibrium of the neural networks with Lipschitz continuous activation functions. Next, we present two sufficient conditions to ascertain the GAS of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. We then give two GES conditions for the neural networks whose activation functions may not be monotone nondecreasing. We also provide a Lyapunov diagonal stability condition, without the nonsingularity requirement for the connection weight matrices, to ascertain the GES of the neural networks with globally Lipschitz continuous and monotone nondecreasing activation functions. This Lyapunov diagonal stability condition generalizes and unifies many of the existing GAS and GES results. Moreover, two higher exponential convergence rates are estimated.
KW - Continuous-time
KW - Global asymptotic stability
KW - Global exponential stability
KW - Global Lipschitz
KW - Recurrent neural networks
UR - http://www.scopus.com/inward/record.url?scp=0036745062&partnerID=8YFLogxK
U2 - 10.1109/TCSI.2002.802360
DO - 10.1109/TCSI.2002.802360
M3 - RGC 22 - Publication in policy or professional journal
SN - 1057-7122
VL - 49
SP - 1334
EP - 1341
JO - IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
JF - IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
IS - 9
ER -