Parametric and semiparametric reduced-rank regression with flexible sparsity
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
- Heng Lian
- Sanying Feng
- Kaifeng Zhao
Detail(s)
| Original language | English |
|---|---|
| Pages (from-to) | 163-174 |
| Journal / Publication | Journal of Multivariate Analysis |
| Volume | 136 |
| Publication status | Published - 1 Apr 2015 |
| Externally published | Yes |
Abstract
We consider joint rank and variable selection in multivariate regression. Previously proposed joint rank and variable selection approaches assume that different responses are related to the same set of variables, which suggests using a group penalty on the rows of the coefficient matrix. However, this assumption may not hold in practice, which motivates the usual lasso (l1) penalty on the entries of the coefficient matrix. We propose to solve this problem using the gradient-proximal algorithm, a recent development in optimization. We also present some theoretical results for the proposed estimator with the l1 penalty. We then consider several extensions, including the adaptive lasso penalty, the sparse group penalty, and additive models. The proposed methodology thus offers a much more complete set of tools for high-dimensional multivariate regression. Finally, we present numerical illustrations based on simulated and real data sets.
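As a rough illustration of the type of estimator described in the abstract (a minimal sketch, not the authors' gradient-proximal algorithm), the following Python snippet alternates a gradient step on the least-squares loss, an entrywise soft-thresholding step for the l1 penalty, and a truncated-SVD step enforcing a low rank. The penalty level `lam`, the target rank, the step size, and the iteration count are illustrative placeholders.

```python
# Sketch: sparse reduced-rank regression via a proximal-gradient-style loop.
# This is a simplified illustration, not the algorithm analyzed in the paper.
import numpy as np

def soft_threshold(B, tau):
    """Entrywise soft-thresholding: proximal operator of tau * ||B||_1."""
    return np.sign(B) * np.maximum(np.abs(B) - tau, 0.0)

def sparse_reduced_rank_fit(X, Y, lam=0.1, rank=2, step=None, n_iter=500):
    """Approximately minimize (1/2n)||Y - XB||_F^2 + lam * ||B||_1
    while keeping rank(B) <= rank via truncated SVD."""
    n, p = X.shape
    q = Y.shape[1]
    if step is None:
        # 1/L, with L the Lipschitz constant of the gradient of the quadratic loss
        step = n / np.linalg.norm(X.T @ X, 2)
    B = np.zeros((p, q))
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y) / n                      # gradient of squared loss
        B = soft_threshold(B - step * grad, step * lam)   # l1 proximal step
        U, s, Vt = np.linalg.svd(B, full_matrices=False)
        s[rank:] = 0.0                                    # keep leading singular values
        B = U @ np.diag(s) @ Vt                           # rank truncation
    return B

# Toy usage on simulated data
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
B_true = np.outer(rng.standard_normal(10), rng.standard_normal(4))  # rank-1 signal
Y = X @ B_true + 0.1 * rng.standard_normal((100, 4))
B_hat = sparse_reduced_rank_fit(X, Y, lam=0.05, rank=1)
```

Because the l1 penalty here acts entrywise rather than row-wise, different responses may depend on different predictors, which is the flexibility the abstract contrasts with row-wise group penalties.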
Research Area(s)
- Additive models, Oracle inequality, Reduced-rank regression, Sparse group lasso
Citation Format(s)
Parametric and semiparametric reduced-rank regression with flexible sparsity. / Lian, Heng; Feng, Sanying; Zhao, Kaifeng.
In: Journal of Multivariate Analysis, Vol. 136, 01.04.2015, p. 163-174.