Learning rates for the risk of kernel-based quantile regression estimators in additive models

Research output: Journal Publications and Reviews (RGC: 21, 22, 62), 21_Publication in refereed journal, peer-review

23 Scopus Citations

Author(s)

  • Andreas Christmann
  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 449-477
Journal / Publication: Analysis and Applications
Volume: 14
Issue number: 3
Online published: 5 Mar 2015
Publication status: Published - May 2016

Abstract

Additive models play an important role in semiparametric statistics. This paper gives learning rates for regularized kernel-based methods for additive models. Provided the assumption of an additive model is valid, these learning rates compare favorably, particularly in high dimensions, to recent results on optimal learning rates for purely nonparametric regularized kernel-based quantile regression using the Gaussian radial basis function kernel. Additionally, a concrete example is presented to show that a Gaussian function depending only on one variable lies in a reproducing kernel Hilbert space generated by an additive Gaussian kernel, but does not belong to the reproducing kernel Hilbert space generated by the multivariate Gaussian kernel of the same variance.
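The distinction the abstract draws between an additive Gaussian kernel (a sum of univariate Gaussian kernels, one per coordinate) and the usual multivariate Gaussian kernel can be illustrated with a minimal sketch. The function names and the σ parameterization below are illustrative choices, not notation from the paper:

```python
import numpy as np

def gaussian_1d(u, v, sigma=1.0):
    # One-dimensional Gaussian (RBF) kernel on a single coordinate.
    return np.exp(-(u - v) ** 2 / (2 * sigma ** 2))

def additive_gaussian_kernel(x, y, sigma=1.0):
    # Additive kernel: sum of univariate Gaussian kernels over coordinates.
    # Its RKHS consists of sums of functions of one variable each.
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(sum(gaussian_1d(x[j], y[j], sigma) for j in range(len(x))))

def multivariate_gaussian_kernel(x, y, sigma=1.0):
    # Standard multivariate Gaussian kernel on the whole input vector.
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2)))

x = [0.0, 1.0, 2.0]
y = [0.5, 1.0, 1.5]
print(additive_gaussian_kernel(x, y))      # sum of d univariate kernel values
print(multivariate_gaussian_kernel(x, y))  # single product-form value
```

Note that on the diagonal the additive kernel evaluates to d (the input dimension, since each univariate term equals 1), while the multivariate kernel evaluates to 1; the two kernels generate genuinely different reproducing kernel Hilbert spaces, which is the point of the paper's concrete example.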

Research Area(s)

  • Additive model, quantile regression, rate of convergence, support vector machine

Citation Format(s)

Learning rates for the risk of kernel-based quantile regression estimators in additive models. / Christmann, Andreas; Zhou, Ding-Xuan.

In: Analysis and Applications, Vol. 14, No. 3, 05.2016, p. 449-477.
