A unified penalized method for sparse additive quantile models: an RKHS approach

Shaogao Lv, Xin He, Junhui Wang*

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

6 Citations (Scopus)

Abstract

This paper focuses on the high-dimensional additive quantile model, allowing both dimension and sparsity to increase with the sample size. We propose a new sparsity-smoothness penalty over a reproducing kernel Hilbert space (RKHS), which includes linear functions and spline-based nonlinear functions as special cases. The combination of sparsity and smoothness is crucial both for the asymptotic theory and for computational efficiency. Oracle inequalities on the excess risk of the proposed method are established under weaker conditions than most existing results. Furthermore, we develop a majorize-minimization forward splitting iterative algorithm (MMFIA) for efficient computation and investigate its numerical convergence properties. Numerical experiments on simulated and real data examples support the effectiveness of the proposed method.
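The two ingredients the abstract names, the quantile (pinball) loss and a combined sparsity-smoothness penalty over an RKHS, can be illustrated concretely. The sketch below is not the paper's MMFIA implementation; it is a minimal, hedged rendering of the penalized objective under common assumptions: each additive component f_j is represented in a Gaussian-kernel RKHS as f_j = K_j α_j, the empirical norm term induces componentwise sparsity, and the RKHS norm term enforces smoothness. All function names, the bandwidth, and the penalty form lam * (||f_j||_n + mu * ||f_j||_K) are illustrative choices, not taken from the paper.

```python
import numpy as np


def pinball_loss(residuals, tau):
    # Quantile (check) loss: tau * r for r >= 0, (tau - 1) * r otherwise.
    # Its minimizer over a constant is the tau-th conditional quantile.
    r = np.asarray(residuals, dtype=float)
    return float(np.mean(np.where(r >= 0, tau * r, (tau - 1.0) * r)))


def gaussian_kernel(x, bandwidth=1.0):
    # Gram matrix of the Gaussian RKHS kernel for a single covariate.
    d = x[:, None] - x[None, :]
    return np.exp(-(d ** 2) / (2.0 * bandwidth ** 2))


def sparsity_smoothness_penalty(alphas, kernels, lam, mu):
    # Illustrative penalty: sum_j lam * (||f_j||_n + mu * ||f_j||_K),
    # where f_j = K_j @ alpha_j. The empirical L2 norm ||f_j||_n drives
    # whole components to zero (sparsity); the RKHS norm ||f_j||_K,
    # with ||f_j||_K^2 = alpha_j' K_j alpha_j, controls smoothness.
    total = 0.0
    for a, K in zip(alphas, kernels):
        f = K @ a
        emp_norm = np.sqrt(np.mean(f ** 2))
        rkhs_norm = np.sqrt(max(float(a @ K @ a), 0.0))
        total += lam * (emp_norm + mu * rkhs_norm)
    return total


def penalized_objective(y, alphas, kernels, tau, lam, mu):
    # Penalized empirical risk for the additive quantile model:
    # pinball loss of y - sum_j f_j plus the sparsity-smoothness penalty.
    fit = sum(K @ a for a, K in zip(alphas, kernels))
    return pinball_loss(y - fit, tau) + sparsity_smoothness_penalty(
        alphas, kernels, lam, mu
    )
```

For example, with all coefficient vectors set to zero the penalty vanishes and the objective reduces to the pinball loss of y itself; at tau = 0.5 that is half the mean absolute value of y.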
Original language: English
Pages (from-to): 897-923
Journal: Annals of the Institute of Statistical Mathematics
Volume: 69
Issue number: 4
Online published: 6 Jun 2016
Publication status: Published - Aug 2017

Research Keywords

  • Additive models
  • Large p small n
  • Oracle inequality
  • Quantile regression
  • Reproducing kernel Hilbert space
  • Variable selection

RGC Funding Information

  • RGC-funded
