Optimal learning rates for least squares regularized regression with unbounded sampling

Research output: Journal Publications and Reviews › Publication in refereed journal › peer-review

45 Scopus Citations

Author(s)

  • Cheng Wang
  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 55-67
Journal / Publication: Journal of Complexity
Volume: 27
Issue number: 1
Publication status: Published - Feb 2011

Abstract

A standard assumption in the theoretical study of learning algorithms for regression is the uniform boundedness of output sample values. This excludes the common case of Gaussian noise. In this paper we investigate the learning algorithm for regression generated by the least squares regularization scheme in reproducing kernel Hilbert spaces, without assuming uniform boundedness of the sampling. By imposing some incremental conditions on moments of the output variable, we derive learning rates in terms of the regularity of the regression function and the capacity of the hypothesis space. The novelty of our analysis is a new covering number argument for bounding the sample error. © 2010 Elsevier Inc. All rights reserved.
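For context, the scheme analyzed is least squares regularization (kernel ridge regression) over a reproducing kernel Hilbert space. Below is a minimal sketch in standard learning-theory notation; the sample z = {(x_i, y_i)}_{i=1}^m, the space \mathcal{H}_K with norm \|\cdot\|_K, and the regularization parameter \lambda are generic symbols and may differ from the paper's exact notation:

$$ f_{\mathbf{z},\lambda} \;=\; \arg\min_{f \in \mathcal{H}_K} \Bigl\{ \frac{1}{m} \sum_{i=1}^{m} \bigl( f(x_i) - y_i \bigr)^2 \;+\; \lambda \, \| f \|_K^2 \Bigr\} $$

Instead of requiring a uniform bound |y| ≤ M almost surely, boundedness is relaxed to a condition on the moments of y. A typical incremental moment hypothesis in this literature takes the form

$$ \int_Z |y|^{\ell} \, d\rho \;\le\; c \, \ell! \, M^{\ell} \qquad \text{for every } \ell \in \mathbb{N}, $$

which Gaussian noise satisfies; the exact constants and formulation used in the paper may differ from this sketch.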

Research Area(s)

  • Covering number
  • Learning theory
  • Least squares regression
  • Regularization in reproducing kernel Hilbert spaces

Citation Format(s)

Optimal learning rates for least squares regularized regression with unbounded sampling. / Wang, Cheng; Zhou, Ding-Xuan.

In: Journal of Complexity, Vol. 27, No. 1, 02.2011, p. 55-67.
