Recurrent networks for compressive sampling

Chi-Sing Leung, John Sum, A. G. Constantinides

Research output: Journal Publications and Reviews (RGC 21 - Publication in refereed journal); peer-reviewed

16 Citations (Scopus)

Abstract

This paper develops two neural network models, based on Lagrange programming neural networks (LPNNs), for recovering sparse signals in compressive sampling. The first model is for the standard recovery of sparse signals. The second one is for the recovery of sparse signals from noisy observations. Their properties, including the optimality of the solutions and the convergence behavior of the networks, are analyzed. We show that for the first case, the network converges to the global minimum of the objective function. For the second case, the convergence is locally stable. © 2013 Elsevier B.V.
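The paper's exact LPNN formulations are not reproduced in this abstract, but the idea behind the first model can be illustrated with a small numerical sketch: a Lagrange-type continuous-time network for the standard (noiseless) recovery problem, minimize ||x||_1 subject to Ax = b, simulated by Euler integration. The smoothed l1 term, the augmented penalty, and all parameter values below are illustrative assumptions, not the paper's actual dynamics.

```python
import numpy as np

# Hedged sketch of a Lagrange-type network for basis pursuit:
#   minimize ||x||_1  subject to  Ax = b,
# simulated with explicit Euler steps. Smoothing |x_i| as
# sqrt(x_i^2 + eps) and the augmented penalty C are illustrative
# choices, not the formulation analyzed in the paper.

rng = np.random.default_rng(0)
m, n, k = 25, 50, 3                        # measurements, dimension, sparsity

# Ground-truth sparse signal and its compressive measurements
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = np.array([1.5, -2.0, 1.0])
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

eps, C, dt = 1e-4, 10.0, 5e-3              # smoothing, penalty weight, step size
x = np.zeros(n)                            # signal-estimate neurons
lam = np.zeros(m)                          # Lagrange-multiplier neurons

for _ in range(40_000):
    g = x / np.sqrt(x**2 + eps)            # gradient of the smoothed l1 norm
    r = A @ x - b                          # constraint residual
    x += dt * -(g + A.T @ lam + C * (A.T @ r))   # gradient descent in x
    lam += dt * r                          # gradient ascent in lambda

print(np.linalg.norm(A @ x - b))           # constraint nearly satisfied
print(np.linalg.norm(x - x_true))          # estimate close to the sparse truth
```

With enough measurements relative to the sparsity, the stationary point of this saddle flow coincides (up to the smoothing) with the sparse signal, matching the abstract's claim that the noiseless-case network converges to the global minimum of the objective.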
Original language: English
Pages (from-to): 298-305
Journal: Neurocomputing
Volume: 129
DOIs
Publication status: Published - 10 Apr 2014

Research Keywords

  • Neural circuit
  • Stability
