Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review
Author(s)
- Hu, Sanqing
- Wang, Jun
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 802-807 |
| Journal / Publication | IEEE Transactions on Automatic Control |
| Volume | 47 |
| Issue number | 5 |
| Publication status | Published - May 2002 |
| Externally published | Yes |
Abstract
This note presents new results on global asymptotic stability (GAS) and global exponential stability (GES) of a general class of continuous-time recurrent neural networks with Lipschitz continuous and monotone nondecreasing activation functions. We first give three sufficient conditions for the GAS of neural networks. These testable sufficient conditions differ from and improve upon existing ones. We then extend an existing GAS result to a GES one, and also extend existing GES results to more general cases with less restrictive connection weight matrices and/or partially Lipschitz activation functions.
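For context, a minimal sketch of the standard continuous-time recurrent neural network model commonly studied in this literature is given below; the symbols $D$, $W$, $g$, and $u$ are assumed notation for illustration and may differ from the precise system considered in the note.

```latex
% Minimal sketch (assumed notation, not necessarily the note's):
% x(t) is the state vector, D a positive diagonal matrix of decay rates,
% W the connection weight matrix, g the activation vector, u a constant input.
\begin{align*}
  \dot{x}(t) &= -D\,x(t) + W\,g\bigl(x(t)\bigr) + u, \\
  g(x)       &= \bigl(g_1(x_1), \dots, g_n(x_n)\bigr)^{\top}.
\end{align*}
% Each activation g_i is Lipschitz continuous and monotone nondecreasing:
\[
  0 \;\le\; \frac{g_i(a) - g_i(b)}{a - b} \;\le\; L_i
  \qquad \text{for all } a \neq b .
\]
```

In this setting, GAS means every trajectory converges to the unique equilibrium $x^*$ from any initial state, while GES additionally requires an exponential convergence rate, e.g. $\|x(t) - x^*\| \le \beta\,\|x(0) - x^*\|\,e^{-\alpha t}$ for some constants $\alpha, \beta > 0$.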
Research Area(s)
- Global asymptotic (exponential) stability, Lipschitz continuous, Recurrent neural networks
Citation Format(s)
Global asymptotic stability and global exponential stability of continuous-time recurrent neural networks. / Hu, Sanqing; Wang, Jun.
In: IEEE Transactions on Automatic Control, Vol. 47, No. 5, 05.2002, p. 802-807.