Abstract
Sparse recovery has attracted considerable attention in the signal processing community in recent years because of its widespread use in many applications. Although many convex and nonconvex methods have been proposed for this problem, most of them are computationally expensive. Recently, the learned iterative shrinkage-thresholding algorithm (LISTA) and its variants have been applied to sparse recovery; these are specific deep neural networks designed by unfolding ISTA. However, the problems they can solve are regularized by the ℓ1 norm, which is less effective at promoting sparseness than some nonconvex sparsity-inducing penalties (SIPs). Motivated by the fact that problems regularized by these SIPs can be solved by proximal operator methods, we devise a network named learning proximal operator method (LePOM) for sparse recovery. For LePOM on a general class of SIPs, a necessary condition for its convergence is established. Based on this condition, a simplified network with fewer parameters is proposed. Theoretical analysis of this network inspires us to further reduce the number of parameters, leading to the proposed Analytical LePOM (ALePOM). ALePOM determines most of its parameters by solving an optimization problem, thereby significantly reducing the parameter count. Theoretical analysis shows that if the signal is sufficiently sparse, ALePOM converges linearly. Simulations confirm our analyses and demonstrate that the proposed solutions outperform state-of-the-art sparse recovery algorithms and neural network-based methods.
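The article itself is not reproduced on this page, so as a rough illustration of the proximal operator methods the abstract refers to, below is a minimal NumPy sketch of a classical (unlearned) proximal gradient iteration. It contrasts the ℓ1 soft-thresholding operator, which underlies ISTA/LISTA, with a simple nonconvex SIP, the ℓ0 penalty, whose proximal operator is hard thresholding. The matrix `A`, vector `y`, and penalty weight `lam` are hypothetical toy choices; LePOM/ALePOM as described in the paper replace such fixed iterations with learned network layers, which this sketch does not attempt to model.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||x||_1 (the shrinkage step of ISTA/LISTA).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def hard_threshold(z, t):
    # Proximal operator of t * ||x||_0, a simple nonconvex SIP:
    # keeps only the entries whose magnitude exceeds sqrt(2 t).
    return np.where(np.abs(z) > np.sqrt(2.0 * t), z, 0.0)

def proximal_gradient(A, y, lam, prox, n_iter=200):
    # Generic proximal gradient iteration for min 0.5||Ax - y||^2 + lam * g(x):
    #   x_{k+1} = prox(x_k - (1/L) A^T (A x_k - y), lam / L),
    # where L = lambda_max(A^T A) is the Lipschitz constant of the data-term gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = prox(x - grad / L, lam / L)
    return x

# Hypothetical toy sparse-recovery instance.
rng = np.random.default_rng(0)
m, n, k = 50, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

x_l1 = proximal_gradient(A, y, lam=0.01, prox=soft_threshold)
x_l0 = proximal_gradient(A, y, lam=0.01, prox=hard_threshold)
print("l1 recovery error:", np.linalg.norm(x_l1 - x_true))
print("l0 recovery error:", np.linalg.norm(x_l0 - x_true))
```

Only the `prox` argument changes between the convex and nonconvex penalties; this interchangeability is what the abstract exploits when it moves from unfolding ISTA (ℓ1) to unfolding proximal operator methods for a general class of SIPs.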
| Original language | English |
|---|---|
| Article number | 9025003 |
| Pages (from-to) | 5244-5259 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 68 |
| Online published | 5 Mar 2020 |
| DOIs | |
| Publication status | Published - 2020 |
Research Keywords
- neural network
- nonconvex sparsity-inducing penalty
- proximal operator method
- sparse recovery
- unfolding iterative algorithm
Projects
- NSFC: Non-convex Optimization for Robust Sparse Recovery: Fast Algorithms and Theoretical Analysis
  SO, H. C. (Principal Investigator / Project Coordinator) & Gu, Y. (Co-Investigator)
  1/01/16 → 26/02/20
  Project: Research (Finished)