Rank reduction for high-dimensional generalized additive models
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Detail(s)
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 672-684 |
| Journal / Publication | Journal of Multivariate Analysis |
| Volume | 173 |
| Online published | 3 Jun 2019 |
| Publication status | Published - September 2019 |
Abstract
When a regression problem contains multiple predictors, additive models avoid the difficulty of fitting multivariate functions while retaining some of the model's nonlinearity. When the dimension is high, the need to estimate a large number of functions, even univariate ones, raises concerns about statistical efficiency. We propose a rank reduction approach that assumes all the functions share a small common set of latent functions, which allows information to be borrowed across a large number of functions. The idea is general and could be used in any model with a large number of functions to estimate, but here we restrict our attention to generalized additive models, especially logistic models, which handle discrete responses and are useful for classification. Numerical results are reported to illustrate the finite-sample performance of the estimator. We also establish an improved convergence rate for the rank reduction approach compared with the standard estimator, and extend it to sparse modeling to handle an even larger number of predictors.
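The core idea above can be illustrated numerically: if each additive component is expanded in a common B-spline basis, the matrix of spline coefficients (one row per predictor) has low rank when all components are combinations of a few latent functions, so truncating the rank of an unconstrained estimate borrows strength across components. The sketch below is a minimal, hypothetical illustration of that low-rank projection step (via SVD truncation), not the authors' actual estimation procedure; all dimensions and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p, d, r = 20, 8, 2  # predictors, spline-basis size, latent rank (assumed)

# Low-rank truth: each component f_j is a combination of r latent
# functions, so the p x d coefficient matrix C = A @ G has rank r.
A = rng.normal(size=(p, r))   # loadings of components on latent functions
G = rng.normal(size=(r, d))   # spline coefficients of the latent functions
C = A @ G

# Noisy unconstrained estimate of C (stand-in for componentwise fits).
C_hat = C + 0.5 * rng.normal(size=(p, d))

# Rank reduction: project onto the best rank-r approximation via SVD.
U, s, Vt = np.linalg.svd(C_hat, full_matrices=False)
C_rr = (U[:, :r] * s[:r]) @ Vt[:r]

err_full = np.linalg.norm(C_hat - C)  # error of the unconstrained estimate
err_rr = np.linalg.norm(C_rr - C)     # error after rank reduction
print(err_full, err_rr)
```

Because the noise outside the r-dimensional latent subspace is discarded, the rank-reduced estimate typically has smaller estimation error than the unconstrained one, mirroring the improved convergence rate established in the paper.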
Research Area(s)
- Asymptotic normality, B-splines, Latent functions, Logistic regression
Citation Format(s)
Rank reduction for high-dimensional generalized additive models. / Lin, Hongmei; Lian, Heng; Liang, Hua.
In: Journal of Multivariate Analysis, Vol. 173, 09.2019, p. 672-684.