TY - JOUR
T1 - Recurrent networks for compressive sampling
AU - Leung, Chi-Sing
AU - Sum, John
AU - Constantinides, A. G.
PY - 2014/4/10
Y1 - 2014/4/10
AB - This paper develops two neural network models, based on Lagrange programming neural networks (LPNNs), for recovering sparse signals in compressive sampling. The first model is for the standard recovery of sparse signals. The second one is for the recovery of sparse signals from noisy observations. Their properties, including the optimality of the solutions and the convergence behavior of the networks, are analyzed. We show that for the first case, the network converges to the global minimum of the objective function. For the second case, the convergence is locally stable.
KW - Neural circuit
KW - Stability
UR - http://www.scopus.com/inward/record.url?scp=84893715357&partnerID=8YFLogxK
U2 - 10.1016/j.neucom.2013.09.028
DO - 10.1016/j.neucom.2013.09.028
M3 - RGC 21 - Publication in refereed journal
SN - 0925-2312
VL - 129
SP - 298
EP - 305
JO - Neurocomputing
JF - Neurocomputing
ER -