Abstract
We establish minimax optimal rates of convergence for estimation in a high dimensional additive model assuming that it is approximately sparse. Our results reveal a behavior universal to this class of high dimensional problems. In the sparse regime when the components are sufficiently smooth or the dimensionality is sufficiently large, the optimal rates are identical to those for high dimensional linear regression and, therefore, there is no additional cost to entertain a nonparametric model. Otherwise, in the so-called smooth regime, the rates coincide with the optimal rates for estimating a univariate function and, therefore, they are immune to the "curse of dimensionality."
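The abstract's dichotomy between the sparse and smooth regimes can be sketched in the form such rates typically take in this literature. The notation below (sample size \(n\), ambient dimension \(d\), sparsity level \(s\), smoothness index \(\alpha\)) is our own illustration and is not drawn from this record; the paper itself should be consulted for the precise statement under approximate sparsity.

```latex
% Illustrative form of a minimax rate for an s-sparse additive model
% with components of smoothness \alpha (hypothetical notation):
\min_{\hat f}\;\max_{f}\;
  \mathbb{E}\,\|\hat f - f\|^2
  \;\asymp\;
  \underbrace{\frac{s \log(d/s)}{n}}_{\text{sparse regime}}
  \;+\;
  \underbrace{s\, n^{-\frac{2\alpha}{2\alpha+1}}}_{\text{smooth regime}}
```

When the first term dominates (large \(d\) or large \(\alpha\)), the rate matches high dimensional linear regression; when the second dominates, it matches the univariate nonparametric rate, which is the sense in which the abstract says the rates are "immune to the curse of dimensionality."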
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 2564-2593 |
| Journal | Annals of Statistics |
| Volume | 44 |
| Issue number | 6 |
| Online published | 23 Nov 2016 |
| DOIs | https://doi.org/10.1214/15-AOS1422 |
| Publication status | Published - Dec 2016 |
Research Keywords
- Convergence rate
- Method of regularization
- Minimax optimality
- Reproducing kernel Hilbert space
- Sobolev space
Publisher's Copyright Statement
- Copyright terms of the deposited final published version file: © Institute of Mathematical Statistics, 2016. Yuan, M., & Zhou, D.-X. (2016). Minimax optimal rates of estimation in high dimensional additive models. Annals of Statistics, 44(6), 2564-2593. https://doi.org/10.1214/15-AOS1422