Global Exponential Stability of a General Class of Recurrent Neural Networks with Time-Varying Delays

Research output: Journal Publications and Reviews (RGC: 21, 22, 62): 22_Publication in policy or professional journal

273 Scopus Citations

Author(s)

Zeng, Zhigang; Wang, Jun; Liao, Xiaoxin

Detail(s)

Original language: English
Pages (from-to): 1353-1358
Journal / Publication: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume: 50
Issue number: 10
Publication status: Published - Oct 2003
Externally published: Yes

Abstract

This brief presents new theoretical results on the global exponential stability of neural networks with time-varying delays and Lipschitz continuous activation functions. These results include several sufficient conditions for the global exponential stability of general neural networks with time-varying delays, without requiring the activation functions to be monotone, bounded, or continuously differentiable. In addition to providing new criteria for neural networks with time-varying delays, these stability conditions also improve upon existing ones for networks with constant time delays and without time delays. Furthermore, the results make it convenient to estimate the exponential convergence rates of the neural networks.
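The record does not reproduce the model equations. As a hedged sketch, the class of delayed recurrent networks commonly studied in this line of work, and the notion of global exponential stability with its convergence rate, can be written as follows (the symbols $d_i$, $a_{ij}$, $b_{ij}$, $\tau_{ij}(t)$, $L_j$ are illustrative; the paper's exact formulation may differ):

```latex
% Illustrative delayed recurrent neural network model (assumed standard
% form, not copied from the paper):
\begin{equation}
\dot{x}_i(t) = -d_i x_i(t)
  + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t)\bigr)
  + \sum_{j=1}^{n} b_{ij} f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) + u_i ,
\end{equation}
% with Lipschitz continuous activations and bounded time-varying delays:
%   |f_j(u) - f_j(v)| <= L_j |u - v|,   0 <= tau_ij(t) <= tau.
% Global exponential stability of an equilibrium x* means there exist
% constants eps > 0 and M >= 1 such that, for every solution x(t),
\begin{equation}
\|x(t)-x^{*}\| \le M
  \sup_{-\tau \le s \le 0}\|x(s)-x^{*}\|\, e^{-\varepsilon t},
  \qquad t \ge 0 .
\end{equation}
```

Here $\varepsilon$ is the exponential convergence rate that, per the abstract, the paper's sufficient conditions allow one to estimate.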

Research Area(s)

  • Global exponential stability, Neural networks, Rate of exponential convergence, Time-varying delays

Citation Format(s)

Global Exponential Stability of a General Class of Recurrent Neural Networks with Time-Varying Delays. / Zeng, Zhigang; Wang, Jun; Liao, Xiaoxin.

In: IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, Vol. 50, No. 10, 10.2003, p. 1353-1358.
