Optimal Learning Rates for Kernel Partial Least Squares
Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › 21_Publication in refereed journal › peer-review
Author(s): Lin, Shao-Bo; Zhou, Ding-Xuan
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 908-933 |
| Journal / Publication | Journal of Fourier Analysis and Applications |
| Volume | 24 |
| Issue number | 3 |
| Online published | 7 Apr 2017 |
| Publication status | Published - Jun 2018 |
Abstract
We study two learning algorithms generated by kernel partial least squares (KPLS) and kernel minimal residual (KMR) methods. In these algorithms, regularization against overfitting is achieved by early stopping, which makes the stopping rule crucial to their learning capabilities. We propose a stopping rule for determining the number of iterations based on cross-validation, without assuming a priori knowledge of the underlying probability measure, and show that optimal learning rates can be achieved. Our analysis rests on two ingredients: a bound on the number of iterations selected by an a priori knowledge-based stopping rule for KMR, and a stepping-stone argument that transfers the result from KMR to KPLS. Technical tools include a recently developed integral operator approach based on a second-order decomposition of inverse operators and an orthogonal polynomial argument.
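To illustrate the early-stopping idea described in the abstract, the sketch below uses the fact that single-output kernel PLS is closely related to conjugate gradients applied to the kernel system Kc = y, and chooses the number of iterations by hold-out validation. This is a minimal illustrative sketch under those assumptions, not the paper's algorithm; the Gaussian kernel, bandwidth, and the toy data are all choices made here for demonstration.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def cg_iterates(K, y, max_iter):
    # Conjugate gradients on K c = y; returns the coefficient vector after
    # each step, so "early stopping" means truncating this sequence.
    c = np.zeros_like(y)
    r = y - K @ c
    p = r.copy()
    iterates = []
    for _ in range(max_iter):
        Kp = K @ p
        denom = p @ Kp
        if denom <= 1e-12:           # numerical breakdown: stop iterating
            break
        alpha = (r @ r) / denom
        c = c + alpha * p
        r_new = r - alpha * Kp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
        iterates.append(c.copy())
    return iterates

# Toy regression problem: learn sin(x) on [0, 2*pi] from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

# Hold-out split playing the role of the cross-validation in the paper.
Xtr, ytr, Xva, yva = X[:60], y[:60], X[60:], y[60:]
Ktr = gaussian_kernel(Xtr, Xtr)
Kva = gaussian_kernel(Xva, Xtr)

iterates = cg_iterates(Ktr, ytr, max_iter=15)
errs = [np.mean((Kva @ c - yva) ** 2) for c in iterates]
t_star = int(np.argmin(errs)) + 1    # data-driven stopping time
```

Running more iterations keeps shrinking the training residual, but the validation error typically turns upward once the iterates begin to fit the noise; `t_star` picks the iteration just before that happens, which is the role the cross-validation stopping rule plays in the paper.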
Research Area(s)
- Learning theory, Kernel partial least squares, Kernel minimal residual, Cross-validation, Regression, Approximations, Algorithms, Operators
Citation Format(s)
Optimal Learning Rates for Kernel Partial Least Squares. / Lin, Shao-Bo; Zhou, Ding-Xuan.
In: Journal of Fourier Analysis and Applications, Vol. 24, No. 3, 06.2018, p. 908-933.