Abstract
This paper focuses on the high-dimensional additive quantile model, allowing both the dimension and the sparsity level to increase with the sample size. We propose a new sparsity-smoothness penalty over a reproducing kernel Hilbert space (RKHS), which includes linear functions and spline-based nonlinear functions as special cases. The combination of sparsity and smoothness is crucial both for the asymptotic theory and for computational efficiency. Oracle inequalities on the excess risk of the proposed method are established under weaker conditions than most existing results require. Furthermore, we develop a majorize-minimization forward splitting iterative algorithm (MMFIA) for efficient computation and investigate its numerical convergence properties. Numerical experiments on simulated and real data support the effectiveness of the proposed method.
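The MMFIA described in the abstract combines a majorization step (replacing the nonsmooth quantile check loss with a smooth quadratic surrogate) with a forward (gradient) / backward (proximal) split that handles the sparsity penalty. A minimal sketch of this general pattern, for the linear special case with a plain soft-thresholding proximal map: the surrogate weights, step size, and function names here are generic illustrative choices, not the paper's exact MMFIA updates.

```python
import numpy as np

def check_loss(u, tau):
    # Quantile check loss: rho_tau(u) = u * (tau - 1{u < 0}), averaged over samples.
    return np.mean(u * (tau - (u < 0)))

def mm_forward_splitting(X, y, tau=0.5, lam=0.1, eps=1e-3, n_iter=200, step=None):
    """Illustrative MM / forward-splitting iteration for a sparsity-penalized
    quantile model (linear special case). NOT the authors' exact algorithm:
    the quadratic surrogate and l1 proximal map are standard stand-ins."""
    n, p = X.shape
    beta = np.zeros(p)
    if step is None:
        # 1/L for the surrogate gradient, whose Lipschitz constant is
        # at most ||X||_2^2 / (2 * eps * n) since the weights are <= 1/eps.
        step = 2.0 * eps * n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        r = y - X @ beta
        # MM step: majorize |r_i| by r_i^2 / (2 * (eps + |r_i^0|)) + const,
        # a standard device for the check loss rho_tau(u) = |u|/2 + (tau - 1/2) u.
        w = 1.0 / (eps + np.abs(r))
        grad = -X.T @ (w * r + (2 * tau - 1)) / (2 * n)
        # Forward (gradient) step on the smooth surrogate ...
        z = beta - step * grad
        # ... then backward (proximal) step: soft-thresholding induces sparsity.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta
```

With a large penalty level `lam`, the soft-thresholding step dominates and the estimate collapses to the zero vector, which is the expected sparsity behavior of this class of penalized estimators.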
| Original language | English |
|---|---|
| Pages (from-to) | 897-923 |
| Journal | Annals of the Institute of Statistical Mathematics |
| Volume | 69 |
| Issue number | 4 |
| Online published | 6 Jun 2016 |
| DOIs | |
| Publication status | Published - Aug 2017 |
Research Keywords
- Additive models
- Large p small n
- Oracle inequality
- Quantile regression
- Reproducing kernel Hilbert space
- Variable selection
RGC Funding Information
- RGC-funded
Fingerprint
Dive into the research topics of 'A unified penalized method for sparse additive quantile models: an RKHS approach'. Together they form a unique fingerprint.
Projects
- 1 Finished
GRF: Model-free Variable Selection via Learning Gradients
WANG, J. (Principal Investigator / Project Coordinator)
1/08/15 → 24/06/19
Project: Research