TY - JOUR
T1 - Design of general projection neural networks for solving monotone linear variational inequalities and linear and quadratic optimization problems
AU - Hu, Xiaolin
AU - Wang, Jun
PY - 2007/10
Y1 - 2007/10
N2 - Most existing neural networks for solving linear variational inequalities (LVIs) with the mapping Mx + p require positive definiteness (or positive semidefiniteness) of M. In this correspondence, it is revealed that this condition is sufficient but not necessary for an LVI to be strictly monotone (or monotone) on its constrained set when equality constraints are present. It is then proposed to reformulate monotone LVIs with equality constraints into LVIs with inequality constraints only, which can then be solved by some existing neural networks. General projection neural networks are designed in this correspondence for solving the transformed LVIs. Compared with existing neural networks, the designed neural networks feature lower model complexity. Moreover, the neural networks are guaranteed to be globally convergent to solutions of the LVI under the condition that the linear mapping Mx + p is monotone on the constrained set. Because quadratic and linear programming problems are special cases of the LVI in terms of solutions, the designed neural networks can solve them efficiently as well. In addition, it is found that, in a specific case, the designed neural network coincides with the primal-dual network for solving quadratic or linear programming problems. The effectiveness of the neural networks is illustrated by several numerical examples. © 2007 IEEE.
AB - Most existing neural networks for solving linear variational inequalities (LVIs) with the mapping Mx + p require positive definiteness (or positive semidefiniteness) of M. In this correspondence, it is revealed that this condition is sufficient but not necessary for an LVI to be strictly monotone (or monotone) on its constrained set when equality constraints are present. It is then proposed to reformulate monotone LVIs with equality constraints into LVIs with inequality constraints only, which can then be solved by some existing neural networks. General projection neural networks are designed in this correspondence for solving the transformed LVIs. Compared with existing neural networks, the designed neural networks feature lower model complexity. Moreover, the neural networks are guaranteed to be globally convergent to solutions of the LVI under the condition that the linear mapping Mx + p is monotone on the constrained set. Because quadratic and linear programming problems are special cases of the LVI in terms of solutions, the designed neural networks can solve them efficiently as well. In addition, it is found that, in a specific case, the designed neural network coincides with the primal-dual network for solving quadratic or linear programming problems. The effectiveness of the neural networks is illustrated by several numerical examples. © 2007 IEEE.
KW - Global convergence
KW - Linear programming
KW - Linear variational inequality (LVI)
KW - Quadratic programming
KW - Recurrent neural network
UR - http://www.scopus.com/inward/record.url?scp=35148879483&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-35148879483&origin=recordpage
U2 - 10.1109/TSMCB.2007.903706
DO - 10.1109/TSMCB.2007.903706
M3 - RGC 22 - Publication in policy or professional journal
C2 - 17926722
SN - 1083-4419
VL - 37
SP - 1414
EP - 1421
JO - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
JF - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
IS - 5
ER -