Minimax optimal rates of estimation in high dimensional additive models

Ming Yuan*, Ding-Xuan Zhou

*Corresponding author for this work

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

40 Citations (Scopus)
36 Downloads (CityUHK Scholars)

Abstract

We establish minimax optimal rates of convergence for estimation in a high dimensional additive model assuming that it is approximately sparse. Our results reveal a behavior universal to this class of high dimensional problems. In the sparse regime when the components are sufficiently smooth or the dimensionality is sufficiently large, the optimal rates are identical to those for high dimensional linear regression and, therefore, there is no additional cost to entertain a nonparametric model. Otherwise, in the so-called smooth regime, the rates coincide with the optimal rates for estimating a univariate function and, therefore, they are immune to the "curse of dimensionality."
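
For orientation, the dichotomy can be sketched as follows (a schematic under illustrative assumptions, not a statement quoted from the paper): with n observations, p additive components of Sobolev smoothness α, and an ℓ1-type bound on the component norms capturing approximate sparsity, the minimax squared error behaves, in the spirit of the abstract, like the larger of the standard ℓ1-constrained linear regression rate and the univariate Sobolev rate,

\[
\inf_{\hat f}\,\sup_{f}\; \mathbb{E}\,\big\|\hat f - f\big\|_{L_2}^{2} \;\asymp\; \max\left\{ \sqrt{\frac{\log p}{n}},\; n^{-\frac{2\alpha}{2\alpha+1}} \right\},
\]

where the first term dominates in the sparse regime (sufficiently smooth components or sufficiently large p) and the second in the smooth regime; the precise form of the dimensionality term and the constants are those established in the paper itself.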
Original language: English
Pages (from-to): 2564-2593
Journal: Annals of Statistics
Volume: 44
Issue number: 6
Online published: 23 Nov 2016
DOIs: https://doi.org/10.1214/15-AOS1422
Publication status: Published - Dec 2016

Research Keywords

  • Convergence rate
  • Method of regularization
  • Minimax optimality
  • Reproducing kernel Hilbert space
  • Sobolev space

Publisher's Copyright Statement

  • COPYRIGHT TERMS OF DEPOSITED FINAL PUBLISHED VERSION FILE: © Institute of Mathematical Statistics, 2016. Yuan, M., & Zhou, D-X. (2016). Minimax optimal rates of estimation in high dimensional additive models. Annals of Statistics, 44(6), 2564-2593. https://doi.org/10.1214/15-AOS1422.
