Optimal learning rates for least squares regularized regression with unbounded sampling
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review
Author(s)
Wang, Cheng; Zhou, Ding-Xuan
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 55-67 |
| Journal / Publication | Journal of Complexity |
| Volume | 27 |
| Issue number | 1 |
| Publication status | Published - Feb 2011 |
Abstract
A standard assumption in the theoretical study of learning algorithms for regression is the uniform boundedness of the output sample values. This excludes the common case of Gaussian noise. In this paper we investigate the learning algorithm for regression generated by the least squares regularization scheme in reproducing kernel Hilbert spaces, without assuming uniform boundedness of the sampling. By imposing incremental conditions on the moments of the output variable, we derive learning rates in terms of the regularity of the regression function and the capacity of the hypothesis space. The novelty of our analysis is a new covering number argument for bounding the sample error. © 2010 Elsevier Inc. All rights reserved.
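The least squares regularization scheme the abstract refers to is the standard kernel ridge regression problem, whose minimizer over the RKHS has a closed form via the representer theorem. Below is a minimal NumPy sketch of that scheme under illustrative assumptions (a Gaussian kernel, synthetic data with unbounded Gaussian output noise, and an arbitrary regularization parameter `lam`); the function names and values are not from the paper, and the paper's contribution is the error analysis, not this algorithm.

```python
import numpy as np

def least_squares_rkhs(X, y, lam, kernel):
    """Least squares regularization scheme in an RKHS H_K:
        f_z = argmin_{f in H_K} (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    By the representer theorem, f_z(x) = sum_i c_i K(x, x_i), where
    c = (K + m * lam * I)^{-1} y and K is the m x m kernel Gram matrix.
    """
    m = X.shape[0]
    K = kernel(X, X)
    c = np.linalg.solve(K + m * lam * np.eye(m), y)
    return lambda X_new: kernel(X_new, X) @ c

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian kernel K(a, b) = exp(-||a - b||^2 / (2 * sigma^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * sigma ** 2))

# Synthetic regression data with Gaussian output noise: the noise is
# unbounded, which is exactly the sampling setting the paper analyzes.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + rng.normal(scale=0.3, size=200)

f_z = least_squares_rkhs(X, y, lam=1e-3, kernel=gaussian_kernel)
X_test = np.linspace(-1, 1, 5).reshape(-1, 1)
print(f_z(X_test))
```

In the paper's analysis, the learning rate is obtained by choosing `lam` as a function of the sample size m, depending on the regularity of the regression function and the capacity of the hypothesis space; the fixed `lam=1e-3` above is purely for illustration.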
Research Area(s)
- Covering number, Learning theory, Least squares regression, Regularization in reproducing kernel Hilbert spaces
Citation Format(s)
Optimal learning rates for least squares regularized regression with unbounded sampling. / Wang, Cheng; Zhou, Ding-Xuan.
In: Journal of Complexity, Vol. 27, No. 1, 02.2011, p. 55-67.