Learning rates of least-square regularized regression

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

216 Scopus Citations

Author(s)

  • Qiang Wu
  • Yiming Ying
  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 171-192
Journal / Publication: Foundations of Computational Mathematics
Volume: 6
Issue number: 2
Publication status: Published - Jun 2006

Abstract

This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The target is error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. When the kernel is C^∞ and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is m^{-ζ} with ζ arbitrarily close to 1, regardless of the variance of the bounded probability distribution. © 2005 SFoCM.
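To make the scheme concrete: the algorithm the abstract refers to minimizes the empirical least-square error plus a regularization penalty over an RKHS H_K, i.e. f_z = argmin_{f ∈ H_K} (1/m) Σ_i (f(x_i) − y_i)² + λ‖f‖_K². By the representer theorem the minimizer has the form f_z(x) = Σ_i a_i K(x, x_i), with coefficients solving the linear system (K + mλI) a = y. The sketch below is an illustrative NumPy implementation under these standard facts; the Gaussian kernel (an example of a C^∞ kernel) and all parameter values are assumptions for demonstration, not choices made in the paper.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularized_ls(X, y, lam, sigma=1.0):
    """Solve (K + m*lam*I) a = y, the normal equations of the
    regularized least-square scheme over the RKHS of the kernel."""
    m = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def predict(X_train, a, X_new, sigma=1.0):
    """Evaluate f_z(x) = sum_i a_i K(x, x_i) at new points."""
    return gaussian_kernel(X_new, X_train, sigma) @ a

# Toy usage (hypothetical data): recover a smooth regression
# function from noisy samples of sin(pi * x) on [-1, 1].
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
a = fit_regularized_ls(X, y, lam=1e-3)
print(predict(X, a, np.array([[0.5]])))  # close to sin(pi/2) = 1
```

In this setting the paper's result says that, for a C^∞ kernel such as the Gaussian and a regression function in H_K, a suitable choice of λ as a function of the sample size m drives the error down at rate m^{-ζ} with ζ arbitrarily close to 1.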

Research Area(s)

  • Covering number, Learning theory, Regularization error, Regularization scheme, Reproducing kernel Hilbert space

Citation Format(s)

Learning rates of least-square regularized regression. / Wu, Qiang; Ying, Yiming; Zhou, Ding-Xuan.
In: Foundations of Computational Mathematics, Vol. 6, No. 2, 06.2006, p. 171-192.
