Error bounds for learning the kernel

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

11 Scopus Citations

Author(s)

  • Charles A. Micchelli
  • Massimiliano Pontil
  • Qiang Wu
  • Ding-Xuan Zhou

Detail(s)

Original language: English
Pages (from-to): 849-868
Journal / Publication: Analysis and Applications
Volume: 14
Issue number: 6
Online published: 6 Oct 2016
Publication status: Published - Nov 2016

Abstract

The problem of learning the kernel function has received considerable attention in machine learning. Much of the work has focused on kernel selection criteria, particularly on minimizing a regularized error functional over a prescribed set of kernels. Empirical studies indicate that this approach can enhance statistical performance and is computationally feasible. In this paper, we present a theoretical analysis of its generalization error. We establish, for a wide variety of classes of kernels such as the set of all multivariate Gaussian kernels, that this learning method generalizes well and, when the regularization parameter is chosen appropriately, is consistent. A central role in our analysis is played by the interaction between the sample error and the approximation error.
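For orientation, the regularized error functional referred to above typically takes the following form in the kernel-learning literature; the notation here is a standard rendering of that setup, not quoted from the paper. Given samples (x_i, y_i), i = 1, ..., m, a prescribed set of kernels K, and a regularization parameter λ > 0, one solves

```latex
\[
  \min_{K \in \mathcal{K}} \; \min_{f \in \mathcal{H}_K}
  \left\{ \frac{1}{m} \sum_{i=1}^{m} \bigl( f(x_i) - y_i \bigr)^{2}
          + \lambda \, \lVert f \rVert_{\mathcal{H}_K}^{2} \right\},
\]
```

where H_K is the reproducing kernel Hilbert space of K. Below is a minimal numerical sketch of this idea, assuming a least-squares loss and a finite grid of Gaussian kernel widths standing in for the set K; the grid, the value of λ, the validation split, and the toy data are illustrative choices, not the paper's.

```python
import numpy as np

def gaussian_gram(X1, X2, sigma):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def krr_fit_predict(K_train, y_train, K_test, lam):
    """Kernel ridge regression: alpha = (K + lam*m*I)^{-1} y, then f(x) = sum_i alpha_i k(x_i, x)."""
    m = K_train.shape[0]
    alpha = np.linalg.solve(K_train + lam * m * np.eye(m), y_train)
    return K_test @ alpha

# Toy data (illustrative): y = sin(x) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
X_tr, y_tr, X_va, y_va = X[:60], y[:60], X[60:], y[60:]

lam = 1e-2                       # regularization parameter (assumed value)
sigmas = [0.1, 0.3, 1.0, 3.0]    # prescribed set of Gaussian kernels (assumed grid)

# "Learn the kernel": pick the width whose regularized least-squares
# solution has the smallest held-out error.
best_sigma = min(
    sigmas,
    key=lambda s: np.mean(
        (krr_fit_predict(gaussian_gram(X_tr, X_tr, s), y_tr,
                         gaussian_gram(X_va, X_tr, s), lam) - y_va) ** 2
    ),
)
print("selected kernel width:", best_sigma)
```

The paper's analysis concerns the generalization behavior of this kind of minimization over a kernel class; the discrete grid above is only a stand-in for the (possibly continuous) sets of kernels, such as all multivariate Gaussians, treated in the paper.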

Research Area(s)

  • error decomposition, Gaussians, kernel, least squares, regularization, support vector machine

Citation Format(s)

Error bounds for learning the kernel. / Micchelli, Charles A.; Pontil, Massimiliano; Wu, Qiang et al.
In: Analysis and Applications, Vol. 14, No. 6, Nov 2016, pp. 849-868.