Error bounds for learning the kernel
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Micchelli, Charles A.; Pontil, Massimiliano; Wu, Qiang et al.
Detail(s)
| Original language | English |
| --- | --- |
| Pages (from-to) | 849-868 |
| Journal / Publication | Analysis and Applications |
| Volume | 14 |
| Issue number | 6 |
| Online published | 6 Oct 2016 |
| Publication status | Published - Nov 2016 |
Abstract
The problem of learning the kernel function has received considerable attention in machine learning. Much of the work has focused on kernel selection criteria, particularly on minimizing a regularized error functional over a prescribed set of kernels. Empirical studies indicate that this approach can enhance statistical performance and is computationally feasible. In this paper, we present a theoretical analysis of its generalization error. We establish, for a wide variety of kernel classes such as the set of all multivariate Gaussian kernels, that this learning method generalizes well and, when the regularization parameter is appropriately chosen, is consistent. A central role in our analysis is played by the interaction between the sample error and the approximation error.
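As a concrete illustration of the criterion analysed in the paper, the sketch below runs kernel ridge regression (regularized least squares) over a one-parameter family of multivariate Gaussian kernels and selects the bandwidth that minimizes the regularized error functional. This is a minimal sketch under stated assumptions: the data, the bandwidth grid `sigmas`, and the regularization parameter `lam` are illustrative choices, not values from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def regularized_error(K, y, lam):
    """Value of the regularized least-squares functional
    (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
    at its minimizer f = K @ alpha, where alpha = (K + m*lam*I)^{-1} y."""
    m = len(y)
    alpha = np.linalg.solve(K + m * lam * np.eye(m), y)
    f = K @ alpha
    return np.mean((f - y) ** 2) + lam * alpha @ K @ alpha

# Illustrative data: a smooth target observed with noise (an assumption, not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(50)

lam = 1e-2                        # regularization parameter (hypothetical choice)
sigmas = np.logspace(-1, 1, 20)   # prescribed one-parameter family of Gaussian kernels
errors = [regularized_error(gaussian_kernel(X, X, s), y, lam) for s in sigmas]
print(f"selected bandwidth: {sigmas[int(np.argmin(errors))]:.3f}")
```

On a finite grid the minimization over kernels reduces to a search; the paper's analysis covers richer classes, such as the set of all multivariate Gaussian kernels, where the joint minimization over the function and the kernel is carried out over a continuous family.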
Research Area(s)
- error decomposition, Gaussians, kernel, least squares, regularization, support vector machine
Citation Format(s)
Error bounds for learning the kernel. / Micchelli, Charles A.; Pontil, Massimiliano; Wu, Qiang et al.
In: Analysis and Applications, Vol. 14, No. 6, 11.2016, p. 849-868.