Abstract
In this article, we are concerned with the generalization performance of nonparametric estimation for pairwise learning. Most existing work requires the hypothesis space to be convex or a VC-class, and the loss to be convex. However, these restrictive assumptions limit the applicability of the results to many popular methods, especially kernel methods and neural networks. We significantly relax these restrictive assumptions and establish a sharp oracle inequality for the empirical minimizer over a general hypothesis space with Lipschitz continuous pairwise losses. As an example, we apply our general results to pairwise least squares regression and derive an excess population risk bound that matches the minimax lower bound for pointwise least squares regression. The key novelty lies in constructing a structured deep ReLU neural network to approximate the true predictor, and in designing a targeted hypothesis space composed of networks with this structure and controllable complexity. Experiments validate the effectiveness of the proposed method. This example demonstrates that the general results obtained indeed help us explore the generalization performance on a variety of problems that cannot be handled by existing approaches.
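To make the setting concrete, the sketch below illustrates what an empirical pairwise least squares risk looks like for a small one-hidden-layer ReLU network. This is only an illustrative toy (the function name, network shape, and the simple shallow architecture are assumptions for exposition); the paper's construction uses structured deep ReLU networks with carefully controlled complexity, which this sketch does not reproduce.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def pairwise_ls_risk(W1, b1, w2, X, y):
    """Empirical pairwise least squares risk of a one-hidden-layer
    ReLU network f(x, x') = w2 . relu(W1 @ [x; x'] + b1),
    averaged over all ordered pairs i != j.

    Illustrative sketch only; not the structured deep network
    construction from the paper."""
    n = len(y)
    total = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # The pairwise predictor acts on the concatenated pair (x_i, x_j)
            inp = np.concatenate([X[i], X[j]])
            pred = w2 @ relu(W1 @ inp + b1)
            # Pairwise least squares loss: (y_i - y_j - f(x_i, x_j))^2
            total += (y[i] - y[j] - pred) ** 2
    return total / (n * (n - 1))
```

For example, with all network weights set to zero the predictor is identically zero, so the risk reduces to the average of (y_i - y_j)^2 over all ordered pairs.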
| Field | Value |
|---|---|
| Original language | English |
| Number of pages | 12 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Online published | 19 Feb 2026 |
| DOIs | |
| Publication status | Online published - 19 Feb 2026 |
Funding
The work of Puyu Wang was supported by the Alexander von Humboldt Foundation. The work of Ding-Xuan Zhou was supported in part by the Australian Research Council through Discovery Project under Grant DP240101919.
Research Keywords
- Deep neural networks
- generalization analysis
- learning theory
- nonparametric estimation
- pairwise learning
Fingerprint
Dive into the research topics of 'Fine-Grained Analysis of Nonparametric Estimation for Pairwise Learning'. Together they form a unique fingerprint.