Model selection for regularized least-squares algorithm in learning theory

Research output: Journal Publications and Reviews – Publication in refereed journal

109 Scopus Citations

Author(s)

Ernesto De Vito, Andrea Caponnetto, Lorenzo Rosasco

Detail(s)

Original language: English
Pages (from-to): 59-85
Journal / Publication: Foundations of Computational Mathematics
Volume: 5
Issue number: 1
Publication status: Published - Feb 2005
Externally published: Yes

Abstract

We investigate the problem of model selection for learning algorithms that depend on a continuous parameter. We propose a model selection procedure based on a worst-case analysis and on a data-independent choice of the parameter. For the regularized least-squares algorithm we bound the generalization error of the solution by a quantity that depends on a few known constants, and we show that the corresponding model selection procedure reduces to solving a bias-variance problem. Under suitable smoothness conditions on the regression function, we estimate the optimal parameter as a function of the number of data points and prove that this choice ensures consistency of the algorithm. © 2004 SFoCM.
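For orientation, the regularized least-squares estimator referred to in the abstract can be written in standard notation as below; this is a sketch for readers, not a formula quoted from the paper, and the exponent b in the parameter choice is a placeholder for the smoothness-dependent rate the paper derives.

```latex
% Regularized least-squares over a hypothesis space H (e.g., a reproducing
% kernel Hilbert space), given training data z = {(x_i, y_i)}_{i=1}^n:
f_{\lambda}^{z} \;=\; \operatorname*{arg\,min}_{f \in \mathcal{H}}
  \left\{ \frac{1}{n} \sum_{i=1}^{n} \bigl( f(x_i) - y_i \bigr)^{2}
          \;+\; \lambda \, \lVert f \rVert_{\mathcal{H}}^{2} \right\}.
% A data-independent model selection rule ties the regularization parameter
% to the sample size n only, e.g. \lambda_n = n^{-b} for some b \in (0,1)
% fixed by the smoothness assumptions on the regression function
% (illustrative form only; see the paper for the exact choice and rates).
```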

Research Area(s)

  • Model selection, Optimal choice of parameters, Regularized least-squares algorithm