Abstract
The problem of learning the kernel function has received considerable attention in machine learning. Much of the work has focused on kernel selection criteria, particularly on minimizing a regularized error functional over a prescribed set of kernels. Empirical studies indicate that this approach can enhance statistical performance and is computationally feasible. In this paper, we present a theoretical analysis of its generalization error. We establish for a wide variety of classes of kernels, such as the set of all multivariate Gaussian kernels, that this learning method generalizes well and, when the regularization parameter is appropriately chosen, it is consistent. A central role in our analysis is played by the interaction between the sample error and the approximation error.
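The selection criterion described above — minimizing a regularized error functional over a prescribed set of kernels — can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes regularized least squares (kernel ridge regression) as the base learner and a grid of Gaussian bandwidths as the kernel class, with all names illustrative.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Gram matrix of the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def regularized_error(K, y, lam):
    # Minimum over f in the RKHS of the regularized least-squares functional
    #   (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2,
    # computed in closed form via the representer theorem:
    # f = sum_i alpha_i K(x_i, .), with alpha = (K + lam*m*I)^{-1} y.
    m = len(y)
    alpha = np.linalg.solve(K + lam * m * np.eye(m), y)
    residual = y - K @ alpha
    return residual @ residual / m + lam * (alpha @ K @ alpha)

# Toy regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(40)

# Learn the kernel: pick the Gaussian bandwidth whose minimized
# regularized functional is smallest over a prescribed grid.
lam = 1e-3
sigmas = [0.05, 0.1, 0.3, 1.0, 3.0]
best = min(sigmas,
           key=lambda s: regularized_error(gaussian_kernel(X, X, s), y, lam))
print("selected bandwidth:", best)
```

In practice the candidate set may be a continuum (e.g. all multivariate Gaussians, as in the paper's analysis) rather than a finite grid, and the regularization parameter `lam` must shrink at an appropriate rate with the sample size for consistency.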
| Original language | English |
|---|---|
| Pages (from-to) | 849-868 |
| Journal | Analysis and Applications |
| Volume | 14 |
| Issue number | 6 |
| Online published | 6 Oct 2016 |
| DOIs | |
| Publication status | Published - Nov 2016 |
Research Keywords
- error decomposition
- Gaussians
- Kernel
- least square
- regularization
- support vector machine
Projects
- 1 Finished
- GRF: Approximation Analysis of Kaczmarz Type Online Schemes and Fourier Analysis of Some Learning Algorithms Involving Sample Pair-based Loss Functions
  ZHOU, D. (Principal Investigator / Project Coordinator)
  1/01/14 → 30/11/17
  Project: Research