Two-Timescale Multilayer Recurrent Neural Networks for Nonlinear Programming
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Wang, Jiasen; Wang, Jun
Detail(s)
Original language | English |
---|---|
Pages (from-to) | 37-47 |
Journal / Publication | IEEE Transactions on Neural Networks and Learning Systems |
Volume | 33 |
Issue number | 1 |
Online published | 27 Oct 2020 |
Publication status | Published - Jan 2022 |
Abstract
This article presents a neurodynamic approach to nonlinear programming. Motivated by the idea of sequential quadratic programming, a class of two-timescale multilayer recurrent neural networks is presented, with the neuronal dynamics in the output layer operating on a larger timescale than those in the hidden layers; that is, the transient states in the hidden layer(s) undergo faster dynamics than those in the output layer. Sufficient conditions are derived for the convergence of the two-timescale multilayer recurrent neural networks to local optima of nonlinear programming problems. Simulation results of collaborative neurodynamic optimization based on the two-timescale approach, applied to global optimization problems with nonconvex objective functions or constraints, are discussed to substantiate its efficacy.
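As a rough illustration of the two-timescale structure described in the abstract (and not the network proposed in the article), the sketch below simulates a generic singular-perturbation-style system: a fast "hidden layer" runs primal-dual gradient dynamics on an SQP-like quadratic subproblem, while a slow "output layer" integrates the decision variable along the resulting direction. The toy problem data, the unit-Hessian subproblem, and the forward-Euler integration are all illustrative assumptions.

```python
# Minimal two-timescale sketch (illustrative only; not the authors' model).
# Fast hidden-layer states: search direction d and multiplier estimate lam.
# Slow output-layer state: decision variable x.
import numpy as np

# Assumed toy problem: minimize f(x) subject to A x = b.
def grad_f(x):
    # f(x) = (x1 - 2)^2 + (x2 + 1)^2 + 0.1 * x1^4
    return np.array([2.0 * (x[0] - 2.0) + 0.4 * x[0] ** 3,
                     2.0 * (x[1] + 1.0)])

A = np.array([[1.0, 1.0]])
b = np.array([1.0])

eps = 1e-2       # timescale ratio: hidden-layer states evolve ~1/eps times faster
h = 2e-4         # forward-Euler step size
steps = 50_000   # total simulated time = steps * h

x = np.array([0.0, 0.0])   # slow output-layer state
d = np.zeros(2)            # fast hidden-layer state (direction)
lam = np.zeros(1)          # fast hidden-layer state (multiplier)

for _ in range(steps):
    g = grad_f(x)
    c = A @ x - b
    # Fast dynamics: primal-dual gradient flow on the quadratic subproblem
    #   minimize g^T d + 0.5 ||d||^2  subject to  A d + c = 0
    d_dot = -(g + d + A.T @ lam) / eps
    lam_dot = (A @ d + c) / eps
    # Slow dynamics: the output layer follows the subproblem solution
    x_dot = d
    d += h * d_dot
    lam += h * lam_dot
    x += h * x_dot

print("approximate solution:", x, "constraint residual:", A @ x - b)
```

At equilibrium the fast subsystem yields d = -(grad_f(x) + A^T lam) with A d = -(A x - b), so the slow state settles only at a KKT point of the toy problem; this mirrors, in a simplified form, the separation of timescales between hidden and output layers described above.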
Research Area(s)
- Neurodynamic optimization, nonlinear programming, two-timescale system
Citation Format(s)
Two-Timescale Multilayer Recurrent Neural Networks for Nonlinear Programming. / Wang, Jiasen; Wang, Jun.
In: IEEE Transactions on Neural Networks and Learning Systems, Vol. 33, No. 1, 01.2022, p. 37-47.