Optimal Learning Rates for Kernel Partial Least Squares

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) › Publication in refereed journal › peer-review


Author(s)

  • Shao-Bo Lin
  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 908-933
Journal / Publication: Journal of Fourier Analysis and Applications
Volume: 24
Issue number: 3
Online published: 7 Apr 2017
Publication status: Published - Jun 2018

Abstract

We study two learning algorithms generated by the kernel partial least squares (KPLS) and kernel minimal residual (KMR) methods. In these algorithms, regularization against overfitting is achieved by early stopping, which makes the stopping rule crucial to their learning capabilities. We propose a stopping rule for determining the number of iterations based on cross-validation, without assuming a priori knowledge of the underlying probability measure, and show that it achieves optimal learning rates. Our analysis has two main ingredients: a bound on the number of iterations selected by an a-priori-knowledge-based stopping rule for KMR, and a stepping-stone argument that transfers this result from KMR to KPLS. The technical tools include a recently developed integral-operator approach based on a second-order decomposition of inverse operators and an orthogonal-polynomial argument.
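To make the early-stopping idea in the abstract concrete, the sketch below shows a KPLS-type iteration on a kernel matrix with the number of iterations chosen by validation error. This is an illustration only, not the authors' construction: it uses the common characterization of kernel PLS as conjugate-gradient-style iterations over Krylov subspaces of the kernel matrix, the function names (gaussian_kernel, kpls_path) are hypothetical, and a single hold-out split stands in for the cross-validation rule analysed in the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, width=0.3):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def kpls_path(K, y, max_iter):
    # Conjugate-gradient iterations on K a = y, recording the coefficient
    # vector after each step.  CG on the kernel system is one standard
    # characterisation of KPLS-type iterations over Krylov subspaces of K;
    # the exact scheme analysed in the paper may differ.
    a = np.zeros_like(y)
    r = y.copy()            # residual of the linear system
    p = r.copy()            # search direction
    path = []
    for _ in range(max_iter):
        Kp = K @ p
        step = (r @ r) / (p @ Kp)
        a = a + step * p
        r_new = r - step * Kp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
        path.append(a.copy())
    return path

# Toy data: a noisy sinusoid.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)

# A single hold-out split as a simple stand-in for cross-validation.
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]
K_tr = gaussian_kernel(X_tr, X_tr)
K_va = gaussian_kernel(X_va, X_tr)

# Early stopping: keep the iteration count with the smallest validation error.
path = kpls_path(K_tr, y_tr, max_iter=20)
val_err = [np.mean((K_va @ a - y_va) ** 2) for a in path]
m_star = int(np.argmin(val_err)) + 1
print("selected number of iterations:", m_star)
```

In this sketch the iteration count itself is the regularization parameter: the validation error typically decreases and then rises as the iterates begin to fit noise, and the selected m_star plays the role of the data-driven stopping rule whose learning rates the paper studies.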

Research Area(s)

  • Learning theory, Kernel partial least squares, Kernel minimal residual, Cross validation, Regression, Approximations, Algorithms, Operators

Citation Format(s)

Optimal Learning Rates for Kernel Partial Least Squares. / Lin, Shao-Bo; Zhou, Ding-Xuan.
In: Journal of Fourier Analysis and Applications, Vol. 24, No. 3, 06.2018, p. 908-933.
