Square-Root Lasso with Nonconvex Regularization: An ADMM Approach

Xinyue Shen, Laming Chen, Yuantao Gu*, H. C. So

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

31 Citations (Scopus)

Abstract

Square-root least absolute shrinkage and selection operator (Lasso), a variant of Lasso, has recently been proposed with a key advantage: the optimal regularization parameter is independent of the noise level in the measurements. In this letter, we introduce a class of nonconvex sparsity-inducing penalties to the square-root Lasso to achieve better sparse recovery performance than the convex counterpart. The resulting formulation is converted to a nonconvex but multiconvex optimization problem, i.e., one that is convex in each block of variables. The alternating direction method of multipliers (ADMM) is applied as the solver, and two efficient algorithms are devised, one for row-orthonormal sensing matrices and one for general sensing matrices. Numerical experiments are conducted to evaluate the performance of the proposed methods.
Original language: English
Article number: 7469302
Pages (from-to): 934-938
Journal: IEEE Signal Processing Letters
Volume: 23
Issue number: 7
DOIs
Publication status: Published - 1 Jul 2016

Research Keywords

  • alternating direction method of multipliers (ADMM)
  • linearized ADMM
  • non-convex regularization
  • sparse recovery
  • square-root penalty

