Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks

Research output: Journal Publications and Reviews › Publication in refereed journal › peer-review

59 Scopus Citations

Author(s)

Detail(s)

Original language: English
Pages (from-to): 802-807
Journal / Publication: IEEE Transactions on Automatic Control
Volume: 47
Issue number: 5
Publication status: Published - May 2002
Externally published: Yes

Abstract

This note presents new results on the global asymptotic stability (GAS) and global exponential stability (GES) of a general class of continuous-time recurrent neural networks with Lipschitz continuous and monotone nondecreasing activation functions. We first give three sufficient conditions for the GAS of such neural networks. These testable sufficient conditions differ from and improve upon existing ones. We then extend an existing GAS result to a GES one, and also extend existing GES results to more general cases with less restrictive connection weight matrices and/or partially Lipschitz activation functions.
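To make the setting concrete, the sketch below simulates the standard continuous-time recurrent network model dx/dt = -Dx + Wg(x) + u with a Lipschitz, monotone nondecreasing activation (tanh, Lipschitz constant L = 1), and checks a classical contraction-type sufficient condition, L·‖W‖₂ < min_i d_i, under which a unique globally exponentially stable equilibrium exists. The matrices, input, and the particular test used here are illustrative assumptions, not the conditions derived in this note.

```python
import numpy as np

# Standard continuous-time recurrent network model (an assumed, common form):
#   dx/dt = -D x + W g(x) + u
# with g Lipschitz continuous and monotone nondecreasing (tanh, L = 1).
D = np.diag([1.0, 1.2, 0.9, 1.1])            # positive self-decay rates
W = np.array([[ 0.1, -0.2,  0.0,  0.1],      # illustrative connection weights
              [ 0.0,  0.1,  0.2, -0.1],
              [-0.1,  0.0,  0.1,  0.2],
              [ 0.2,  0.1, -0.1,  0.0]])
u = np.array([0.3, -0.5, 0.2, 0.1])          # constant external input
g = np.tanh

# Classical sufficient condition (contraction-style, not this note's result):
# L * ||W||_2 < min_i d_i implies a unique, globally exponentially
# stable equilibrium for the system above.
assert np.linalg.norm(W, 2) < D.diagonal().min()

def simulate(x0, dt=0.01, steps=20000):
    """Forward-Euler integration of dx/dt = -D x + W g(x) + u."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-D @ x + W @ g(x) + u)
    return x

# Trajectories from very different initial states converge to the
# same equilibrium, consistent with global stability.
xa = simulate([ 5.0, -4.0,  3.0, -2.0])
xb = simulate([-6.0,  7.0, -5.0,  4.0])
print(np.allclose(xa, xb, atol=1e-6))
```

Any Lipschitz, monotone nondecreasing activation (e.g., a saturating piecewise-linear function) could replace tanh here; only the Lipschitz constant L enters the contraction test.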

Research Area(s)

  • Global asymptotic (exponential) stability, Lipschitz continuous, Recurrent neural networks