EFFICIENT KERNEL-BASED VARIABLE SELECTION WITH SPARSISTENCY

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

Author(s)

He, Xin; Wang, Junhui; Lv, Shaogao

Detail(s)

Original language: English
Pages (from-to): 2123-2151
Journal / Publication: Statistica Sinica
Volume: 31
Issue number: 4
Publication status: Published - Oct 2021

Abstract

Sparse learning is central to high-dimensional data analysis, and various methods have been developed. Ideally, a sparse learning method should be methodologically flexible, computationally efficient, and backed by a theoretical guarantee; however, most existing methods must compromise some of these properties to attain the others. We develop a three-step sparse learning method involving kernel-based estimation of the regression function and its gradient functions, followed by a hard-thresholding step. Its key advantages are that it requires no explicit model assumption, admits general predictor effects, allows efficient computation, and attains desirable asymptotic sparsistency. The proposed method can be adapted to any reproducing kernel Hilbert space (RKHS) with different kernel functions, and its computational cost is only linear in the data dimension. The asymptotic sparsistency of the proposed method is established for general RKHS under mild conditions. Numerical experiments show that the proposed method compares favorably with its competitors on both simulated and real examples.
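To make the three steps concrete, below is a minimal Python/NumPy sketch of one way such a procedure could look: a kernel ridge regression fit with a Gaussian kernel, empirical norms of the estimated partial-derivative (gradient) functions, and a hard threshold on those norms. The function names, the choice of Gaussian kernel, and the tuning values `sigma`, `lam`, and `tau` are illustrative assumptions, not the paper's actual implementation or recommended settings.

```python
import numpy as np


def rbf_kernel(X, Z, sigma):
    """Gaussian kernel matrix K[i, k] = exp(-||X_i - Z_k||^2 / (2 sigma^2))."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))


def kernel_gradient_selection(X, y, sigma=1.0, lam=1e-2, tau=0.1):
    """Three-step sketch: (1) kernel ridge fit, (2) empirical gradient norms,
    (3) hard thresholding. All tuning values are illustrative placeholders."""
    n, p = X.shape
    K = rbf_kernel(X, X, sigma)

    # Step 1: kernel ridge regression, f(x) = sum_i alpha_i K(x_i, x).
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)

    # Step 2: partial derivatives of the fitted function at the sample points.
    # For the Gaussian kernel, d/dx_j K(x_i, x) = K(x_i, x) * (x_ij - x_j) / sigma^2,
    # so each variable reuses the same n-by-n kernel matrix.
    grad_norms = np.empty(p)
    for j in range(p):
        D = (X[:, None, j] - X[None, :, j]) / sigma ** 2   # D[i, k] = (x_ij - x_kj) / sigma^2
        G = (alpha[:, None] * K * D).sum(axis=0)           # df/dx_j evaluated at each sample x_k
        grad_norms[j] = np.sqrt((G ** 2).mean())           # empirical L2 norm of df/dx_j

    # Step 3: hard thresholding keeps predictors whose gradient norm exceeds tau.
    return np.flatnonzero(grad_norms > tau), grad_norms


# Toy usage on synthetic data where only the first two predictors matter.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)
selected, norms = kernel_gradient_selection(X, y, sigma=2.0, tau=0.2)
# `selected` should typically recover the informative predictors {0, 1}.
```

Note that the loop over variables reuses one n × n kernel matrix and costs O(n²) per variable, so the total cost grows linearly in the dimension p, consistent with the complexity claim in the abstract.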

Research Area(s)

  • Gradient learning, hard thresholding, nonparametric sparse learning, ridge regression, RKHS, regression, likelihood, reduction, shrinkage

Citation Format(s)

EFFICIENT KERNEL-BASED VARIABLE SELECTION WITH SPARSISTENCY. / He, Xin; Wang, Junhui; Lv, Shaogao.
In: Statistica Sinica, Vol. 31, No. 4, 10.2021, p. 2123-2151.
