Rank reduction for high-dimensional generalized additive models

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review


Author(s)

Hongmei Lin, Heng Lian, Hua Liang

Detail(s)

Original language: English
Pages (from-to): 672-684
Journal / Publication: Journal of Multivariate Analysis
Volume: 173
Online published: 3 Jun 2019
Publication status: Published - Sept 2019

Abstract

When a regression problem contains multiple predictors, additive models avoid the difficulty of fitting multivariate functions while retaining some of the model's nonlinearity. When the dimension is high, however, the need to estimate a large number of functions, even though each is univariate, raises concerns about statistical efficiency. We propose a rank reduction approach that assumes all functions share a small common set of latent functions, which allows borrowing information across a large number of functions. The idea is general and could be used in any model with a large number of functions to estimate, but here we restrict our attention to generalized additive models, especially logistic models, which can handle discrete responses and are useful for classification. Numerical results are reported to illustrate the finite-sample performance of the estimator. We also establish an improved convergence rate of the rank reduction approach compared to the standard estimator and extend it to sparse modeling to deal with an even larger number of predictors.
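The core structural assumption above — that the p additive component functions are linear combinations of a small set of r latent functions — can be illustrated with a minimal numerical sketch. This is not the paper's actual estimator (which is built on B-spline bases within a generalized additive model); it simply shows, on grid evaluations of the functions, how a truncated SVD pools information across many noisy function estimates. The latent functions, dimensions, and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_grid, p, r = 200, 20, 3      # grid points, number of components, latent rank (illustrative)
x = np.linspace(0.0, 1.0, n_grid)

# r latent functions shared by all p components (hypothetical choices)
G = np.vstack([np.sin(2 * np.pi * x), np.cos(2 * np.pi * x), x ** 2])  # shape (r, n_grid)

A = rng.normal(size=(p, r))    # loading matrix: each row mixes the latent functions
F = A @ G                      # true component functions, one per row, shape (p, n_grid)

# Pretend each function was estimated separately, with independent noise
F_noisy = F + 0.05 * rng.normal(size=F.shape)

# Rank reduction: project the p noisy estimates onto their best rank-r
# approximation, so the components borrow strength from one another
U, s, Vt = np.linalg.svd(F_noisy, full_matrices=False)
F_hat = (U[:, :r] * s[:r]) @ Vt[:r]

err_raw = np.linalg.norm(F_noisy - F) / np.linalg.norm(F)  # error of separate estimates
err_rr = np.linalg.norm(F_hat - F) / np.linalg.norm(F)     # error after rank reduction
```

Because the noise spreads over all p directions while the signal lives in only r of them, the rank-reduced estimate `F_hat` is typically closer to the truth than the unconstrained estimates, mirroring the efficiency gain the abstract describes.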

Research Area(s)

  • Asymptotic normality, B-splines, Latent functions, Logistic regression

Citation Format(s)

Rank reduction for high-dimensional generalized additive models. / Lin, Hongmei; Lian, Heng; Liang, Hua.
In: Journal of Multivariate Analysis, Vol. 173, 09.2019, p. 672-684.
