TY - JOUR
T1 - An analog neural network approach for the least absolute shrinkage and selection operator problem
AU - Wang, Hao
AU - Lee, Ching Man
AU - Feng, Ruibin
AU - Leung, Chi Sing
PY - 2018/5
Y1 - 2018/5
N2 - This paper addresses analog optimization for non-differentiable functions. The Lagrange programming neural network (LPNN) approach provides a systematic way to build analog neural networks for handling constrained optimization problems. However, its drawback is that it cannot handle non-differentiable functions. In compressive sampling, one of the optimization problems is the least absolute shrinkage and selection operator (LASSO), where the constraint is non-differentiable. This paper adopts the hidden state concept from the local competition algorithm to formulate an analog model for the LASSO problem, thereby overcoming the non-differentiability limitation of LPNN. Under some conditions, at equilibrium, the network leads to the optimal solution of the LASSO. We also prove that these equilibrium points are stable. Simulations illustrate that the proposed analog model and the traditional digital method have similar mean squared error performance.
KW - Analog neural network
KW - Local competition algorithm
KW - LPNN
KW - Neural dynamics
UR - http://www.scopus.com/inward/record.url?scp=85013178140&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85013178140&origin=recordpage
DO - 10.1007/s00521-017-2863-5
M3 - Publication in refereed journal
VL - 29
SP - 389
EP - 400
JO - Neural Computing and Applications
JF - Neural Computing and Applications
SN - 0941-0643
IS - 9
ER -