2-D Learned Proximal Gradient Algorithm for Fast Sparse Matrix Recovery

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › Publication in refereed journal › peer-review

1 Scopus citation

Author(s)

Yang, Chengzhu; Gu, Yuantao; Chen, Badong; Ma, Hongbing; So, Hing Cheung

Detail(s)

Original language: English
Article number: 9200730
Pages (from-to): 1492-1496
Journal / Publication: IEEE Transactions on Circuits and Systems II: Express Briefs
Volume: 68
Issue number: 4
Online published: 18 Sep 2020
Publication status: Published - Apr 2021

Abstract

Many real-world problems can be modeled as the recovery of a sparse matrix from two-dimensional (2D) measurements, one of the most important topics in the signal processing community. Benefiting from the success of compressed sensing, many classical iterative algorithms can be directly applied or reinvented for matrix recovery, but they are computationally expensive. To alleviate this, we propose a neural network named the 2D learned proximal gradient algorithm (2D-LPGA), which aims to reconstruct the target matrix quickly. Theoretical analysis reveals that if the parameters of the network satisfy certain conditions, it reconstructs the sparse signal at a linear convergence rate. Moreover, numerical experiments demonstrate the superiority of the proposed method over other classical schemes.
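To make the setting concrete, the following sketch implements the classical proximal gradient (ISTA) iteration that unrolled networks of this kind turn into layers, for the 2-D measurement model Y = A X Bᵀ with a sparse X. The problem sizes, fixed step size, and threshold below are illustrative assumptions, not the paper's learned parameters.

```python
import numpy as np

def soft_threshold(Z, tau):
    """Proximal operator of the elementwise l1 norm."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def ista_2d(Y, A, B, lam=0.01, n_iter=500):
    """Recover sparse X from Y = A X B^T by proximal gradient descent.

    Each loop iteration corresponds to one 'layer' of an unrolled network;
    a learned variant would replace mu and lam with trainable parameters.
    """
    # Step size from the Lipschitz constant of the gradient of
    # 0.5 * ||A X B^T - Y||_F^2, namely ||A||_2^2 * ||B||_2^2.
    L = np.linalg.norm(A, 2) ** 2 * np.linalg.norm(B, 2) ** 2
    mu = 1.0 / L
    X = np.zeros((A.shape[1], B.shape[1]))
    for _ in range(n_iter):
        grad = A.T @ (A @ X @ B.T - Y) @ B   # gradient of the 2-D data fit
        X = soft_threshold(X - mu * grad, mu * lam)
    return X

# Toy example (hypothetical sizes): 5-sparse 10x10 matrix, 8x8 measurements.
rng = np.random.default_rng(0)
m, n = 8, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
B = rng.standard_normal((m, n)) / np.sqrt(m)
X_true = np.zeros((n, n))
X_true.flat[rng.choice(n * n, size=5, replace=False)] = rng.standard_normal(5)
Y = A @ X_true @ B.T

X_hat = ista_2d(Y, A, B)
rel_err = np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true)
```

The iteration count needed here for a good reconstruction is exactly the cost the paper targets: the learned network replaces hundreds of such fixed-parameter iterations with a few trained layers.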

Research Area(s)

  • Neural network, Proximal gradient, Sparse matrix recovery, Unfolding

Citation Format(s)

2-D Learned Proximal Gradient Algorithm for Fast Sparse Matrix Recovery. / Yang, Chengzhu; Gu, Yuantao; Chen, Badong; Ma, Hongbing; So, Hing Cheung.

In: IEEE Transactions on Circuits and Systems II: Express Briefs, Vol. 68, No. 4, 9200730, 04.2021, p. 1492-1496.
